Although I’m more of a casual gamer, I’m starting to realize that “indie game” gets thrown around so casually that I couldn’t accurately define one if someone asked me to. I’ve also heard many experienced gamers gripe about the haphazard use of the term.
In its truest sense, a game is defined as “indie” when it’s created by individuals or small development teams, independent from a publisher that would’ve otherwise provided technical or financial support.
The first indie games date back to the 80s, when the industry struggled. Several consoles saturated the market, pumping out too many games of minimal quality. The perfect example was the infamous E.T. on the Atari. Sadly, this snowballed into multiple bankruptcies. From the ashes emerged an incredible phenomenon.
Amateurs learned how to code and took it upon themselves to develop their own games on PC. While it was a great DIY movement, it would take decades before anyone could hope to make money as an “indie developer”.
Over time, it became less expensive to create and distribute games. Non-traditional gameplay gained more visibility in the early 2000s, especially after the birth of Steam. That’s when indie games became far more accessible.
The rest is history. The indie game market exploded, from early hits such as Braid, Super Meat Boy, and Fez through Dear Esther and Gone Home.
In conducting research for this editorial, I also discovered the term “indieapocalypse”. Murmurs materialized in 2015, when the ease of creating and distributing games led to an outpouring of poorly made titles. It reminded many of the gaming industry crash of 1983. Thankfully, the indieapocalypse has yet to arrive, thanks in part to the ebb and flow of supply and demand as well as healthy competition.
It was through examining this history that I understood when and how the definition of indie game began to blur. Instead of meeting strict criteria, games nowadays qualify as indie if they share certain qualities.
One such characteristic, and the most obvious, is demonstrating a level of independence, whether creative or financial. Another is a small development team putting out a title generally “smaller” in scope than a triple-A release. Without a publisher breathing down their necks, small devs have much more creative freedom over their content. Finally, some view indie from a cultural standpoint, favoring innovation and artistic experimentation over corporate red tape.
Given some of the above traits, many devs adopt a lower-budget 8- or 16-bit “retro” aesthetic but overlay it with more intricate and unique mechanics that hadn’t been done in the 80s or 90s.
However, as I alluded to earlier, the meaning of indie has blurred, and the term gets tossed around and applied to games that don’t fit the true definition.
The Witness is a popular example. Although Thekla, Inc. funded and published the game on its own, its financials are what thrust it somewhere between indie and triple-A. The Witness cost about $6 million to develop and retails for around $40, while a typical indie game is priced at $20 at most.
Hello Games considers its title No Man’s Sky indie. Yes, Sony published the game, and at the same price range as a triple-A title. However, Sony didn’t provide financial support, and it’s on the unusual gameplay and small team of developers that Hello Games rests its argument.
A third case study surrounds Journey, a title that did receive financial and publishing support from Sony. That certainly doesn’t sound like an indie title, but thatgamecompany believes Journey still qualifies as indie because Sony didn’t involve itself in the creative process.
Where exactly do we draw the line? On one hand, I do feel it’s unfortunate that the special meaning behind “indie game” has become diluted over the years; if a title fits any of the definitions above in the slightest, people can call it “indie”. On the other hand, who is it hurting? The overwhelming number of games released online by individuals and small developers offers options to cater to every consumer out there, and they provide much-needed competition against blockbuster titles like The Last of Us, Red Dead Redemption II, and the Super Mario franchise.
I believe many die-hards in the industry balk at the widespread use of the term indie game; however, in my very humble opinion, it’s all part of the evolution of gaming. Much like other technological advancements, gaming has become so much more accessible, from both a creation and a consumer standpoint.
Naturally, it makes sense to see more games identifying as indie, and as long as the hard work and quality are there, I’m okay with that. The very heart and soul of “indie” is its fiery desire to stand up to its behemoth brethren. A small band of developers has a much steeper hill to climb, forcing them to think outside the box, and that should be recognized.
As an indie author, I celebrate the rise of indie games, flawed definition and all.
Speaking of indie games, we need to continue to celebrate and support them! There was an Indie World Showcase in April; if you missed it, check out our coverage here by our own Eddie V.
How do you define an indie game? Do you feel that people are just slapping “indie” on anything that isn’t a triple-A title? Let us know in the comments or on our Discord. We’d love to hear from you!