DLC as Inflation

Years ago, I pondered reducing quantity as a means of avoiding raising price. I will avoid repeating the examples, since the link is right there.

In our present games, quantity is reduced via moving content to expansion packs and DLC. Players notice when you increase the average cost of a game from $40 to $50 to $60, and at some point it becomes hard to make that next jump, especially if you need to be the first company making that jump. So put less in the box. Heck, that’s even good publicity because you are now releasing an expansion pack sooner, and more expansion packs. You’re putting out so many updates, you’re selling something called a “season pass” for all that DLC. So what if the amount of non-procedural content in sequel+DLC is less than the original game? How many reviewers rate the game based on that?

This is not necessarily a bad thing. It does cost money to make games, and costs do increase over time. At some point, either the base game costs $80, or you are buying that same amount of content in a $40 game with two $20 expansions. It doesn’t matter that you used to be able to get that much game for $40, any more than it matters that you used to go watch a double-feature at the cinema for a quarter. Costs rise, and this is one way that customers have chosen to absorb them. I say “customers have chosen” because the company would be perfectly happy to take your $80 up front with one release date, but it turns out that more players will give them more money if they sell it in smaller pieces. You get the game business models you are willing to pay for.

There are some obvious ways we benefit from that as players. If you have exit points at $40 and $60, you can decide that you don’t like the direction the game is going; if you pay $80 up front, you’ve already paid your $80. You are also getting that first $40 worth of game sooner, and given the popularity of playing beta and early release games, that seems to be an in-demand option. Each part of the game needs to justify itself as being worthwhile, rather than just getting one score for the whole game and hoping the reviewers forgive some problems in the third act.

There are some obvious drawbacks in terms of game design as well. Insert your favorite twenty stories about perverse game monetization strategies. Having that spread of DLC and expansions can fracture the playerbase and promote “pay to win” via power creep. Hey, if you need a way to sell that third expansion, how about “you’ll be more powerful if you buy it”?

Like most design and business decisions, this can be done well or badly. I would just like us to be more conscious of it and buy games for value and quality design, rather than letting our primate brains react to big numbers on the screen. But we’re gamers, and we react to big numbers on the screen.

: Zubon

11 thoughts on “DLC as Inflation”

  1. The problem I have is that it’s not the same game. The old $40 now $80 game was complete, and more often epic, in the poetic sense. The core of the new $40 game is a lesser thing, with content to be tacked on later that is often non-contiguous to the primary campaign. By necessity the expansions can’t be necessary to the plot, and the ending is closed by the time they’re released and experienced.

    I’d rather pay more for an extended story than be repeatedly shown, “At an unspecified point in time, in a nearby location, our heroes…” – the novel rather than the pulp magazine.

    1. On the other hand – since you’ve essentially bought *part* of the game, if it’s not very good you can forgo spending on the DLC. In essence it helps you hedge against poor quality.

      And even if the game *is* good and ‘complete’ there’s often room for expansion (even in the older ‘complete’ games) that doesn’t justify a full-sized sequel – that’s very game dependent of course.

      Plus DLC covers a lot of ground – from simple item packs/new maps to skins to mini-stories added into and alongside the main story (such as Dawnguard/Dragonborn for Skyrim). Stuff that might have been intended to be included in the main release but was cut for time.

  2. Yes – personally I can’t stand DLC – I will happily pay more for the collector edition – especially if it has neat stuff in it.

    Heck I’d pay *more* for a decent physical package – I used to collect my game boxes and manuals – and I used to find value in a well published manual (and possibly map) as much as the game itself (Icewind Dale for instance).

    I’ll buy an *expansion* – if it’s decent. I hate DLC though. Because it’s either too overpowering, not really related to the story, doesn’t actually feel like it has anything to do with the game at all, or feels like content that should have been included in the game.

    Too overpowering includes things or items that basically break the game and let you auto win (Shin Megami Tensei IV DLC for example).

    Not really related to the story – pretty much every piece of Tomb Raider DLC (the newest version).

    Doesn’t feel like it actually has anything to do with the game at all – Red Dead Redemption Zombies!

    Content that should have been included with the game – too many to count – Day 1 DLC that requires a 5k unlock code to let you play what is already on the disc.

    I played Mass Effect 2 (on the PC) – and the ‘verification’ to make sure you had the rights to the DLC required a *service* to run on my machine – as well as breaking frequently and requiring new holes in my firewall – just to play the game. Well the effect was – I didn’t buy Mass Effect 3, nor will I.

    I bought Arena, Daggerfall, Morrowind, Redguard – seems like I’m a heck of a Bethesda fan right – except I sat on buying Oblivion until they put out a ‘Game of the year edition with all DLC included’. And yes – I waited about 2 years to play Skyrim over the same issue.

    I still have not watched the movie ‘The Hobbit’ – because I’m waiting for the extended version to go on fire sale price – not because I think it’s overpriced – but because after buying 3 versions of the Lord of the Rings I felt burned enough to wait – mostly because I’m happy to pay to watch the full movie in the theater – I’m happy to pay to watch the full movie on disc – I refuse to play the game where the ‘full’ movie is only released a year after the fact to get more money from you. So instead of me happily paying 40-50 bucks for the extended version had it come out at the same time as the theatrical – I’ll wait until it’s on sale – because *they* (marketing people – whoever they are that have tried to extend a single sale across multiple versions) have broken my response to this tactic.

    I do not want ‘part’ of your game – I do not want an incomplete mess – I want to pay you and enjoy the thing for what it is. If you want more money from me after – then make whatever it is a real expansion that *continues* the story with a new one, but is essentially a complete epilogue in its own right (see Diablo II and expansion – or Icewind Dale, or Baldur’s Gate) – if you want to charge more – do it.

    1. Your comments about ME2 are spot on – I enjoyed the game and bought one of the DLC, never again.

  3. The problem is you’re forgetting 2 important factors, Zubon: inflation and development costs.

    Game prices rose in absolute terms, but game prices have not kept up with overall inflation. Here’s an article with a great example from the console era: http://arstechnica.com/gaming/2010/10/an-inconvenient-truth-game-prices-have-come-down-with-time/ That $31.99 game in 1992 would cost $48.33 in 2010 dollars. (Or $51.83 today given the 7.3% inflation from 2010 to 2014 dollars according to http://www.usinflationcalculator.com/). So, even if we allow for inflation, game prices have gone down for the base game.
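    The adjustment above is just multiplying a price by cumulative inflation factors. A quick sketch, using only the figures quoted in this comment (not freshly looked-up CPI data):

```python
# Inflation adjustment by cumulative factor. The factors below are implied
# by the numbers cited in the comment, not pulled from live CPI tables.

def adjust_for_inflation(price, factor):
    """Scale a price by a cumulative inflation factor, rounded to cents."""
    return round(price * factor, 2)

# $31.99 in 1992 is quoted as $48.33 in 2010 dollars, so the implied
# 1992 -> 2010 factor is:
factor_1992_to_2010 = 48.33 / 31.99   # ~1.51, i.e. ~51% cumulative inflation

# The comment cites 7.3% inflation from 2010 to 2014:
factor_2010_to_2014 = 1.073

price_2010 = adjust_for_inflation(31.99, factor_1992_to_2010)       # 48.33
price_2014 = adjust_for_inflation(price_2010, factor_2010_to_2014)  # 51.86
```

    Chaining the rounded 2010 figure through the 7.3% step gives about $51.86, a few cents off the $51.83 quoted above only because of where the rounding happens.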

    But, the other thing is that you’re not getting the same game. Let’s compare a screenshot of System Shock (http://upload.wikimedia.org/wikipedia/en/d/db/SHOCK001.GIF) to any of the BioShock: Infinite screenshots out there. The games have a much higher graphical fidelity, and that costs a lot more money. Take a look at the credits list for System Shock (http://www.mobygames.com/game/dos/system-shock/credits) vs. BioShock: Infinite (http://www.mobygames.com/game/windows/bioshock-infinite/credits), where the core art team for B:I is six times the size of SS. And, SS wasn’t a budget title for the time.

    Yeah, let’s ignore the sad state of game design right now. But, we can also point out the same thing about programmers, especially given the rise of huge tech companies like Google or Facebook in the last few decades. Game programming is very demanding, and you either have to pay them well or accept the people unable or unwilling to get hired at some tech company that could get acquired by Facebook for $19 billion.

    The gaming audience is essentially demanding more graphically complex games at the same price. This cannot happen. You either have to accept games with less graphical fidelity (those are called retro indie games and they can be really fun), accept that you’re going to pay A LOT more for a core game to cover the expenses, or accept that you’re going to have to pay extra for some of the little niceties that used to “come standard” when we paid more for games due to inflation.

    Or, you know, start your own game company and prove to those greedy fuckers that you can somehow keep a business running paying more people and making less inflation-adjusted money than companies did a decade ago. It could be fun. :)

    1. I’m not sure how much we’re really disagreeing there. I’ll endorse everything you’ve said there and expand:

      Programmers would be subject to cost disease even if they were not becoming more in demand in other industries, more valuable to other tech companies, etc. That factor alone should drive up the cost of games. I remember as an adolescent being told that computer programming was a hot career now but the job market would be saturated by the time we were adults. That’s kind of like the old quotes of no one needing a full meg of memory.

      Our other commenters are passionately addressing the countering force of all that value you are getting for your gaming dollar: siphoning off value elsewhere.

      1. Sorry, that intro paragraph was a bit stronger than I intended. I meant this more as a clarification of your points with some concrete examples, not a rebuttal.

        This tends to be one of those issues where people assume that developers are somehow becoming wealthy due to the increase in costs. I know from personal experience that for every Milo wanting performance upgrades for his Porche, there are many people just trying to make mortgage payments, and still others living off credit card debt because they’re chasing their passion rather than the biggest paycheck.

    2. > The gaming audience is essentially demanding more graphically complex games at the same price. This cannot happen.

      It’s not obvious why this can’t happen. Don’t the tools available to devs get more powerful and cheaper with time? And doesn’t the potential market size get bigger over time, allowing fixed costs to be spread over a larger customer base?

      You’d think there would be economies of scale and so-called experience curve effects etc, which all tend to drive down unit costs in industries as they grow. It’s hard to think of any industry where prices for mainstream products don’t come down a lot over time in real terms, while the products also get better at the same time. Certainly neither hardware nor software in general seems to be defying these trends.

      FWIW there’s probably also a considerable segment, myself included, that would prefer something that is better value for money and be willing to forgo state of the art graphics as a trade-off, if that’s really what it would take to keep costs down.

      And the gaming industry is probably missing a trick by not offering different graphical quality versions of a game at different price points. Those for whom money is plentiful have high-spec rigs and would probably pay even higher prices than are asked now. Those with tighter budgets often don’t have the horsepower to run on max settings anyway, and are simultaneously put off from buying at all by prices they find too high.

      If the gaming industry thinks the way you suggest, maybe they are like the car industry would be if it only listened to car enthusiasts that talked all the time about top speeds of car models, with the result that they ended up only making expensive sports cars and complaining about how costly they were to make.

      1. > FWIW there’s probably also a considerable segment, myself included, that would prefer something that is better value for money and be willing to forgo state of the art graphics as a trade-off, if that’s really what it would take to keep costs down.

        You’re in luck! There has been a big trend lately of indie games with retro graphics at bargain bin prices.

      2. There are a number of reasons why the constant increase in graphical quality cannot be sustained. They all work together to keep costs increasing. Get ready, this is long because I’m going to assume you know very little about game development.

        The main issue is that the biggest game development expense is employees. Tools may improve, but usually the improvements in tools are for graphical fidelity not speed in creating graphical assets. And, this increased fidelity from the tools requires specialized training, which means your employees are more expensive because the specialized training is rarer and that drives up prices.

        But, how much more complex have games gotten? In the Atari 2600 days, you had one person doing both the programming and the art. However, the resolution was 160×192, so the art requirements were pretty low. Even increases in 2D resolution required more art talent. The jump to 3D required a whole new skill set dealing with polygons. Then we added textures to those polygons. Then you have normal maps, parallax maps, ambient occlusion maps, etc. You need artists who understand each of these things for a modern game. Now add in animation, something that people are quick to criticize in a game if it’s not quite to their liking. Using mo-cap just increases these costs as well because mo-cap studios don’t come cheap, and off-the-shelf data has its own problems.

        There’s a joke in the game industry about specialization in art: some artist’s entire job is modelling noses on football players. Only, that’s kinda not a joke.

        Of course, you need programmers who know how to build systems that use these new art techniques. That means more specialized skill, and more programmers are required for the project. Programmers smart enough to handle these new technologies are probably smart enough to go work for Google or Facebook, companies that are known for paying software engineers very, very well.

        The other problem is that there is a diminishing return on graphical fidelity. This image shows it in great detail: http://imgur.com/aFKEttJ (Yes, there are some problems with that image, but it shows the nature of the problem and why technologies like normal mapping, etc. have become more important.) This is why the jump from the PS1 to PS2 was amazing, from PS2 to PS3 noticeable, but the jump from PS3 to PS4 has been a bit harder to see, precisely because of those diminishing returns. So, in order to get a noticeable increase in fidelity, it requires A LOT more effort, even with great tools and outsourced artists.

        > FWIW there’s probably also a considerable segment, myself included, that would prefer something that is better value for money and be willing to forgo state of the art graphics as a trade-off, if that’s really what it would take to keep costs down.

        You say that, but you probably don’t actually spend money that way. Let’s pretend that Square-Enix decides to “go retro” with Tomb Raider and focus less on graphical fidelity while Naughty Dog continues to push the envelope with Uncharted. Which is going to sell better, assuming both are sold at the same price? Number of sales is important when you consider that you want a larger market to increase the people who are interested in the next entry in the series. Especially if you aren’t going to be able to sell DLC to monetize a smaller audience.

        And, as Zubon points out, there’s no shortage of games with lower graphical fidelity. Personally, I was particularly taken with Rogue Legacy, which was a really fun game. There’s also the well-known example of Minecraft.

        In addition to indie titles, “Budget titles” have existed for as long as I’ve been buying games, but somehow no game developer has grown to the size of a BioWare or Naughty Dog from budget software. So, it’s not a case where some developer can just not care about graphics and still do well enough to grow their company.

        Yes, there are some notable exceptions here, such as Minecraft. But, these are exceptions for a reason. As awesome as Minecraft is for Mojang, it’s not something you can just duplicate. There are plenty of games with low graphical fidelity that never reach the same levels of success. Plus, I think more people are going to be excited by a game like EverQuest Next Landmark because it has better graphical fidelity.

        > And the gaming industry is probably missing a trick by not offering different graphical quality versions of a game at different price points.

        They already do different levels of graphical fidelity; it’s called “level of detail” (LoD), and it requires even more effort to do right. LoD is often used to make models far away from the player simpler and therefore easier to draw, but it can also be used to keep the frame rate up in a modern game.
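        The core idea of distance-based LoD is just a lookup: the further away an object is, the cheaper the mesh you draw for it. A minimal sketch of the selection step (mesh names and distance thresholds here are invented for illustration):

```python
# Hypothetical LoD table: (minimum distance, mesh) pairs, ordered near to far.
# Names and thresholds are made up; real engines tune these per model.
LOD_MESHES = [
    (0.0, "hero_50k_tris"),   # full-detail model up close
    (25.0, "mid_10k_tris"),   # simplified mid-range model
    (75.0, "far_1k_tris"),    # very cheap model for distant objects
]

def select_lod(distance):
    """Return the mesh for the largest threshold not exceeding distance."""
    chosen = LOD_MESHES[0][1]
    for threshold, mesh in LOD_MESHES:
        if distance >= threshold:
            chosen = mesh
    return chosen
```

        The expensive part isn’t this lookup; it’s that an artist has to author (or supervise the automatic decimation of) every one of those simplified meshes, which is exactly the extra effort described above.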

        There’s no way a game company could charge less for lower graphical fidelity, because the hard-core would just buy the cheapest version and pirate the graphics from the more expensive game. Adding layers to prevent this would just increase the employees you need, and would not result in profits.

        So, there’s a fairly in-depth reason why I stated that demands for increased graphical fidelity cannot continue at the same price point. The costs to meet those requirements are growing faster than tools and economies of scale can compensate. As nice as it might be to say that graphical fidelity doesn’t matter, the reality is that it does matter for the majority of the game-playing audience.

      3. > I still have not watched the movie ‘The Hobbit’ – because I’m waiting for the extended version to go on fire sale price

        For the love of all that is holy, don’t get the Extended version of The Hobbit. The regular film is unbearably long, I can’t imagine how awful the Extended edition would be. (I loved the Hobbit book and the Lord of the Rings books and movies.)

        > Don’t the tools available to devs get more powerful and cheaper with time?

        The tools devs use to make games are shockingly similar to the ones used 20 years ago.

        > And doesn’t the potential market size get bigger over time, allowing fixed costs to be spread over a larger customer base?

        With iPhone, Android, Windows, Mac, Linux, Xbox, PS3, Wii, etc., the software market, while bigger, is so fragmented and has hardware with such varying performance that much of the work of making software has to be duplicated for every potential platform.

        I feel really bad for the phone game people, some of them have to create, test and manage hundreds of variations of the exact same game depending on what handset it is running on.
