Now, obviously, it would be preferable for most of us if microtransactions weren’t a thing at all. In reality though, the economics of modern game development have made them a go-to method of monetisation and, from the looks of things, a lot of gamers are only too happy to oblige.
The battle lines over microtransactions are also constantly being redrawn. At first, we turned our noses up at the mere mention of them; they were something for F2P mobile games. Now most people are fine with them as long as they’re cosmetic. Just don’t put in loot boxes, whatever you do. Over time the resistance to microtransactions has been eroded, and now there’s rarely a AAA game without at least some form of in-game purchases.
It’s easy to take a stand against them, but when you see the figures for what the likes of Activision and EA are earning from these things, it doesn’t half feel like a losing battle.
As ever with something like this though, there are many shades of complexity to what’s going on here. This isn’t necessarily just a case of milking the fans for every last penny they’ve got. There’s a generally accepted right way and wrong way to go about introducing in-game purchases, as EA and DICE found out to their detriment with Star Wars Battlefront 2.
The first thing to consider is that we crave lasting experiences. In 1990, Super Mario World arrived to rave reviews and a runtime of around six hours. That just wouldn’t cut it these days. If a game isn’t playable for hundreds of hours, there are likely to be complaints, forcing developers and publishers to adapt to changing demand. We want games that we can buy and potentially play for years to come. We also only want to spend $60, the same amount we spent on Super Mario World in 1990.
At some point, the money has to come from somewhere if you want monthly content updates for years. Rare aren’t going to keep making Sea of Thieves quests out of the goodness of their hearts; they’re going to want a regular revenue stream that can keep paying for it all. In essence, the profit from ongoing support for a game has to at least match what the studio would make from dropping support and developing a new game instead, or it wouldn’t be worthwhile. The way this content is funded these days is usually microtransactions, often of the cosmetic variety. A few willing customers are prepared to drop 10 bucks on the occasional weapon skin, and it means the rest of us all get free maps and new weapons. It’s a marked shift from just five years ago, when it was all about the season passes. Every multiplayer game had a season pass, which fragmented the player base and separated the haves from the have-nots. Now only a few stuck-in-the-past series like Battlefield and Call of Duty are holding out.
If you ask me though, season passes are far worse for the average game than microtransactions, provided the microtransactions are implemented well. I’ve sunk hundreds and hundreds of hours into Rainbow Six Siege, been given nine new maps, dozens of patches, and 16 new Operators, and I haven’t had to spend a penny. To me that’s a far better deal than picking up this year’s Call of Duty, dropping £40 on the map packs, and then doing it all again a year later. The difference in value is just insane.
So what do you think, then? Do microtransactions flat-out have no place in a $60 game? Would you prefer a return to season passes, or perhaps you think there’s no need for additional content after launch at all? Let us know your thoughts below!