I pretty much agree 100% with TFA, especially about mouse support and making games easier to configure.
I recently reinstalled Mass Effect 1 and Mass Effect 2, as I wanted to do two full playthroughs of each, one as a good guy and one as a dick. I have all the DLC for both games, and am currently enjoying the bad guy playthrough on ME2.
After reading this article, I realized I had been thinking many of the same things it points out. Especially because of ME2.
For example, in ME2 there are areas where you cannot save. These are usually missions, and it's frustrating when you suddenly realize you have to leave for work or some other engagement, but you're less than halfway through getting Jack out of Purgatory station. You either have to lose your progress or be late for work.
The worst offense of ME2 (in my opinion) is how ridiculously difficult it is to customize the graphics and quality settings. With ME1, you had several .ini files you could scan through and tweak to suit your needs, and easy tutorials and guides exist to explain what all the settings do and how they affect the game. With ME2, all those .ini files are boxed into a single file called Coalesced.ini, and it's not a standard text format. There is an editor you can download called coalescededitor, and it does a good job of separating everything out so you can make sense of it; it even includes jump lists to get you to the most commonly changed settings, such as disabling mouse acceleration. But why should the community have to be the ones to implement a user-friendly way to make the game run better on your system? Would it really have been so difficult for the company that HAS the code and the devs to put an extra menu in the config program that lets you make these changes simply and efficiently?
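And the tweak itself is trivial once you can actually see it. If I remember right (this is from memory, so double-check the section and key names in your own Coalesced.ini - it's the standard Unreal Engine 3 input config), the mouse smoothing/acceleration fix boils down to flipping one flag:

    [Engine.PlayerInput]
    bEnableMouseSmoothing=false

Two lines. That's the kind of thing that should be a checkbox in the options menu, not a community reverse-engineering project.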
This leads into what is probably my biggest gripe about modern games (modern, to me, being roughly 2006-ish and up), and that is texture quality. This is a simplification of the issue, but it's still valid and indicative of inherent problems with dumbed-down configurability. Let's take Mass Effect 2 for example, since it's still fresh on my mind. This is a beautiful game: great color variety, awesome level designs, cool architecture throughout, and some of the best-looking character models of any game ever made. The animations (mostly suit-captured), the decent AI, and the storyline are just awesome. But this game suffers from a very, VERY common problem that virtually every game exhibits - garbage textures.
Now, don't get me wrong. The majority of the textures in ME2 are very good, even surprisingly realistic and superb. Take Zaeed, for instance. His head and face texture is probably the best-looking character texture I've ever seen in a game, better than even the Crysis or Crysis 2 characters. Every time I see his face in the game, I seriously think, "Man, that is friggin awesome work they did there."
But then I see Jack's body texture (normal costume, not the alternate) and think, "Why the hell does her tattoo texture look so damn muddy and blurry? Why wouldn't they make a damned TATTOOED body texture look super detailed and crisp?" And I look at the floor beneath my character. Muddled and blurry. I walk up to a wall. Muddled and blurry. I watch a cutscene with a close-up of a mech. Muddled and blurry.
I understand the technical side of this issue. Higher-res textures use more memory and take longer to render and all that jazz. Blah blah, cry me a river. This isn't 1998, and I don't have an 8MB 3dfx card. I have a relatively cutting-edge 560 Ti with 1GB of VRAM. It can handle textures above 512x512. So stop hardcoding texture limits, and instead let us gamers use 4096 or 8192 textures if we wish. If you're still using a GeForce 5600, well, time to upgrade anyway.
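To put rough numbers on it (my own back-of-the-envelope math, not anything from BioWare): an uncompressed RGBA texture costs 4 bytes per texel, DXT5 compression cuts that to about 1 byte, and a full mipmap chain adds roughly a third on top.

    # Rough VRAM cost per texture: RGBA8 is 4 bytes/texel, DXT5 is
    # ~1 byte/texel, and a full mip chain adds about a third on top.
    for size in (512, 1024, 2048, 4096, 8192):
        texels = size * size
        raw_mb = texels * 4 / 2**20
        dxt5_mips_mb = texels * 1 * (4 / 3) / 2**20
        print(f"{size}x{size}: {raw_mb:6.1f} MB raw, "
              f"{dxt5_mips_mb:5.1f} MB as DXT5 with mips")

By that math a 4096x4096 DXT5 texture with mips is around 21 MB, so a 1GB card can clearly afford a handful of them on hero assets, even if 8192 everywhere is unrealistic.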
I also realize that 8192x8192 textures won't always fix a muddy-looking wall when your character jams their nose against it, and that using them on every part of a complex character model is just impractical, but that kind of brings the point full circle: find a way to map smaller textures onto the same piece with higher overall detail. I don't know exactly how you would do it, as I'm not a programmer or game dev. I just think it would be a really good thing for the gaming industry if they figured out how to permanently get rid of muddy, garbage-looking textures.
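From what I've read, engines already have a trick along these lines called detail mapping: tile a small, high-frequency detail texture over the base texture in the shader so close-up surfaces stay crisp without a giant base texture. Here's a toy sketch of the idea (my own illustration in Python, nothing to do with ME2's actual engine):

    # Toy "detail mapping": modulate a low-res base texture with a small,
    # tiled high-frequency detail texture. Textures are grayscale 2D lists.

    def sample(tex, u, v):
        # Nearest-neighbor sample with wrapping UVs.
        h, w = len(tex), len(tex[0])
        return tex[int(v * h) % h][int(u * w) % w]

    def shade(base, detail, u, v, tile=8.0):
        # Detail values hover around 1.0, so the tiled layer adds surface
        # variation up close without changing the overall brightness.
        return sample(base, u, v) * sample(detail, u * tile, v * tile)

The point is the base texture can stay at 512x512 while the detail layer repeats every few inches of wall, which is why a tiled 256x256 detail map can make a surface look sharper up close than a single huge texture would.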
Well, that's my two cents. I'm out.