Dictator's 5 Modern Gaming Gripes

Technology is an inexorable forward march, and in the case of video game technology in particular it is a march at double-time. This progress has brought about a lot of improvements, like wireless controllers that allow us to lounge in relative comfort at a distance rather than being tethered to the TV. Or autosave. Or standardized controls. And have you seen modern video game graphics?! But there are always downsides to progress, ranging from minor irritations to rage-inducing offenses. So follow along with the 5 Gripes Of Modern Gaming (also known as Dictator Outs Himself As A Grumpy Old Man).

Microtransactions

Ahhh, microtransactions, the lowest-hanging of fruit, the whipping boy of modern gaming. The idea of spending real money on in-game items does not have to be a terrible thing. It can help out players who don't have hundreds of hours to devote to a game by letting them purchase gear rather than being left behind or forced to grind. Or, in the case of Overwatch, microtransactions provide no gameplay advantage and simply let you customize your character's appearance to your personal taste. Similarly, in Heroes Of The Storm 2.0, if you want to unlock a certain character or skin, you can earn the in-game currency fairly quickly by doing your daily quests. But if you don't have time, you can simply buy the character you want to play. These examples are fairly unobtrusive and innocuous.

But not all developers are as well-behaved as Blizzard. In Ubisoft's For Honor, the in-game currency required to buy and upgrade your character's equipment is paid out through multiplayer matches at an abysmal rate. So if you want to stay competitive with other players, you either have to grind an obscene amount or shell out real cash, and even then, the microtransactions offer poor value for the money. Another egregious offender was Konami's Metal Gear Solid V: The Phantom Pain, which added some soul-sucking microtransactions post-launch. As you built your base and researched weapons and gear, the overhead of maintaining the base and staff soared to the point of diminishing returns. The answer was to establish Forward Operating Bases (FOBs), but to get the special credits to found an FOB you either had to play the online mode (and risk raids by enemy players who could steal your resources and money) or shell out real money. Do neither and you were simply unable to research the best equipment.
You should never be able to pay to win, be forced to pay to stay competitive, or pay to achieve further progress.

The Death of Local Multiplayer

This is a huge complaint of mine. Lately more and more games don't allow splitscreen play, with even some long-standing stalwarts finally dropping the practice. This is a shame, because some of my finest memories of gaming involve local multiplayer. Whether it was loading up and ripping through Locust with a friend in Gears Of War 2, getting pummeled by friends for annihilating them with double Needlers in Halo 2, or just the relentless smack talk no matter the game, it was always a riot. Sitting in a different location communicating through headsets just isn't the same experience. From a moneymaking perspective, getting rid of local multiplayer makes a lot of sense. With splitscreen, a publisher sells one console, one copy of the game and 2-4 controllers. Get rid of splitscreen and it sells 2-4 consoles, 2-4 copies of the game, 2-4 controllers and 2-4 internet service subscriptions. Some developers have cited technical reasons for tossing it as well. For example, Bungie/343 Industries, a long-time proponent of splitscreen, finally tossed the feature because they wanted the game to run at a constant 60fps and didn't feel that was possible with splitscreen. At least they made the conciliatory gesture of allowing friends to play the campaign cooperatively without an active Xbox Live account. But I would rather be able to play in the same room as my friends than have super-amazing graphics.

Constant Updates/Ludicrous File Sizes

There is nothing worse than having a little bit of time to play a game, powering up your console, and finding that it needs a 12 gigabyte update. Except when it just did a 12 gigabyte update the day before. I have heard of people leaving their console off for a good while and returning to 12+ hours of updating. And then there are the Day 1 updates. You wait 5 years for a game, get home from the midnight release wanting to play a few minutes before going to sleep, and you can't, because it requires a 42 gigabyte Day 1 patch (seriously, this happened to me with Mass Effect: Andromeda). How in the world does a game need 42 GB of updating on the day it releases? And not only are these updates a time-killer, they, along with huge initial installs, burn up storage space. If you'd have told me back in my Xbox 360 days that its successor would have a 1 terabyte drive, I would have been skeptical. If you proceeded to tell me that drive would constantly sit at 90% capacity, I would have called you an outright liar. Now every time I buy a new game, it turns into a juggling session of figuring out which of the 12 installed games I'm actually going to play again in the near future.

Always On/Requires Internet

I have a real bone to pick with this one. Lately the number of games requiring a constant internet connection to play has increased exponentially. There are two issues with this, one short-term and one long-term. In the short term, it is immensely frustrating to sit down with a game, with no intention of playing with or against other players, and find that a weak connection or a server-wide issue prevents you from playing even by yourself. Once again picking on Ubisoft's For Honor: playing the single-player story mode still strangely required you to use their servers, which were horribly flawed from day one. Unable to play the multiplayer mode because the servers were acting up? Well, you can't go play single-player by yourself either, because, well, I don't actually know why. The long-term issue is that nothing is immortal, and that includes game servers (except for World Of Warcraft's, which appear to be the closest thing). Eventually the player count drops to where it is no longer sensible to keep a server active, and the company shuts it down. What that means is that eventually a lot of these always-online games become paperweights. Future gamers won't be able to go back and replay possible future classics, such as perhaps the original Destiny. Reliving that experience will be relegated to fond memories and watching videos on YouTube (or whatever replaces YouTube. Remember that "nothing is immortal" line?). And I find that concerning.

Lazy Open-World Games

I have nothing against a well-made open-world game, and the industry has cranked out a number of excellent ones in the past few years. Look no further than The Witcher 3: Wild Hunt, Horizon: Zero Dawn and The Legend Of Zelda: Breath Of The Wild. But lately there have been a number of games where, because A) open-world games are in demand and all the rage, and B) new consoles have the processing power to make them much easier to build, developers have hastily thrown together open-world games in which the open-world aspect adds nothing to the experience. In some cases it even detracts from the game. A recent example is Mass Effect: Andromeda. If BioWare had left the Mass Effect formula alone, it might have been a decent game, but they caved to the pressure to go open-world, and the result was lots of bland, unchanging environments, filler missions to pad the length and justify the format, and muddled story pacing. Keiji Inafune's ReCore found itself in a similar situation: while an interesting concept, why was the decision made to set an open-world game in an inhospitable desert? The game was subsequently panned for its bland, repetitive environments, meaningless fetch quests and poorly-paced narrative. It's a case of good technology being used poorly, and I believe a lot of companies need to take a step back early in the development process and ask themselves whether making the game open-world really brings anything to the table.

So, what changes have come about in modern gaming that really grind your gears? Anything that I missed or anything that I was wrong about? Let me know. In the meantime, I have to go yell at some kids to get off my lawn.