Everybody who's been gaming for a while is well aware of the Videogame Crash of 1983, which saw the collapse of the American console market and a strange period when many people thought the videogame was dead. The causes are numerous and hotly contested, but it's likely just an unexciting story of a bubble that popped. One strain of the story I've always found as interesting as it is improbable is that two games are primarily responsible for the crash: Howard Scott Warshaw's E.T. and Tod Frye's Pac-Man, both for the 2600. In both cases, we're talking about massively hyped games that sold tremendously well, but then got returned to stores in droves. My thought for today is whether something like this could happen again: could a rapid-fire succession of massively disappointing games topple the industry like it did in the 80s?
We've recently seen five games that by all rights "should" have been great: expectations were high, fanboys were numerous, and, for the most part, very talented people were in control. Yet in each case, the major critics either dismissed them as mediocre or blasted them as if personally offended by their lack of quality:
Duke Nukem Forever. Metacritic: 55.
Alpha Protocol. Metacritic: 72 (GameSpot: 60, IGN: 63).
Hunted: The Demon's Forge. Metacritic: 63.
Alice: Madness Returns. Metacritic: 75 (IGN: 65).
Dungeon Siege III. Metacritic: 73 (IGN: 65, GameSpot: 60).
Even Nintendo seems to be having problems. Despite the waves of hype the 3DS is currently receiving over the re-release (yawn) of Ocarina of Time, I still see the whole thing as another Virtual Boy with a much better marketing campaign. I foresee a backlash, though, as more purchasers find that they aren't getting full refunds when they try to return devices that give them headaches. That's the kind of episode, and the kind of bad publicity, that can make anyone think twice about buying a game. As for Nintendo's new console, it sure looks like that "U" stands for "Useless." Sony, of course, is unlikely to ever recover from the PSN nightmare, and Microsoft doesn't seem far behind. Even if the new console is great, who can justify it in this economy?
Let me start by saying I haven't played Alice: Madness Returns and will probably never play it. After all, avoiding games like this is why I read reviews, such as this one, this one, and this one. If you don't want to read all those, let me sum it up for you: The baby has turned into a pig.
It's funny how so many quotes from Lewis Carroll's work seem appropriate here. Consider:
March Hare: Have some wine.
(Alice looked all round the table, but there was nothing on it but tea.)
Alice: I don't see any wine.
March Hare: There isn't any.
Alice: Then it wasn't very civil of you to offer it.
March Hare: It wasn't very civil of you to sit down without being invited.
It's been a long time since I've been excited about a forthcoming CRPG. I usually just end up disappointed, then bitter, when I discover that the latest "CRPG" is just another mindless twitch-fest with bigger boobs than ever before. Sigh.
So, what would I like to see in a CRPG? I thought I'd provide a wishlist.
#5. Quality packaging. Yes, I know that games are data and are best distributed over the internet. But that doesn't mean there can't also be a tangible component, such as nicely printed manuals, maps, and reference cards. The goal should be to make those "extras" not only a pleasure to hold but truly useful in the game (i.e., no collectors' edition bullshit of interest only to fanboys). The game should periodically refer you to them as well, since there is nothing more boring than being asked to read a lot of text on a screen. Why not do like the old games did for copy protection, and ask you to read entry #43 in your lovely printed journal? Hellz yeah! That sure beats trying to read a bunch of stupid text on a screen, or, worse, hearing it read by some voice actor without a clue of its context. As for nitwits who can't be bothered to actually read a book, those idiots wouldn't be interested in my kind of game anyway, so to hell with them.
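Just to make the mechanic concrete, here's a minimal sketch of how a game might lean on its printed journal; the PrintedJournal class, the story beats, and the entry numbers are all hypothetical, invented for illustration rather than taken from any real game:

```python
# A toy version of the printed-journal mechanic. Everything here
# (class name, story beats, entry numbers) is invented for illustration.

class PrintedJournal:
    """Points the player at numbered entries in the boxed, printed journal."""

    def __init__(self, entries_by_beat):
        # Maps story beats to entry numbers in the physical book.
        self.entries_by_beat = entries_by_beat

    def prompt(self, beat):
        entry = self.entries_by_beat.get(beat)
        if entry is None:
            return None  # No journal entry for this beat.
        # Instead of dumping lore on screen, send the player to the book.
        return f"Read entry #{entry} in your printed journal, then continue."

journal = PrintedJournal({"meet_the_oracle": 43, "enter_the_ruins": 17})
print(journal.prompt("meet_the_oracle"))
# -> Read entry #43 in your printed journal, then continue.
```

The same lookup table could double as copy protection, exactly like the old days: quiz the player on a word from the entry, and anyone without the book is stuck.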
I recently wrote about what I perceive to be the false notion that console gaming is holding PC gaming back (and, frankly, with a recent release like L.A. Noire and future releases like Skyrim, it's hard to make that argument on anything but superficial audio/visual grounds, as opposed to content). Perhaps, as this new article puts forth, it's not consoles but tablets that the traditional PC industry has more to worry about?
Of course, as far as I'm concerned, we're still at least a few years away from that happening, at least until Apple breaks the required link between their iOS devices and a computer equipped with iTunes (and that's a question of "when," not "if"). Android devices are close to breaking free of the computer tether completely, but those devices have other issues to overcome first. Other tablet OSes, present and future, probably fall somewhere in between.
Interestingly, a girl here at my day job bought an iPad 2 about a month back and then recently got an iPhone 4, but was frustrated that there was no way to copy her purchases from the iPad 2 over to the iPhone 4. You see, she considers her computer horribly outdated and really doesn't want to go through iTunes on her rickety old PC! Flawed thinking, obviously, but it's fascinating how non-techies approach these things (in this case, she basically wants to do all her computing outside of work on the iPad 2 and iPhone 4). Definitely a paradigm shift of some type!

In any case, it's the old argument: it's not so much computers that are being challenged as the limited, generalized definition of what a computer is. Does "computer" really mean that desktop or laptop many of us use a good portion of the day? Sure, but that's not all it means. As an iPad 2 user (outside of the tethering restriction for the occasional iTunes sync), I can argue that my tablet is as much a computer as most desktops and laptops, with strikingly similar functionality (and in some cases, then some).
Ultimately, I think it's clear we're all headed toward a connected ecosystem of devices, where much of our stuff lives in the cloud and there's minimal need for local storage. You'll simply use whatever device is handy or best suited to a particular task (say, a touch screen or a keyboard). We already have brilliantly functional cloud gaming services (and, of course, VOD like Netflix), so, outside of artificial bandwidth restrictions by ISPs, there's little reason to think the future has anything to do with ever more powerful traditional computers. For those of us who have been in love with technology since our earliest memories, this is a tough sell, but it's hard to argue that's not where we're headed, and perhaps just as hard to argue that it's even a bad thing. I'm sure even the most hardcore among us have tired of the upgrade/incompatibility/instability cycle at some point, if only briefly.
Here's my take on the hottest news for Wednesday, May 18th.
PSN logins exploited again, Sony takes pages offline. Sony in consultations with the TSA to improve security; the new system utilizes the PlayStation Eye and requires taking off your shoes and submitting to a nude-ray scan.
If you were a pirate back in the 80s, or enjoy abandonware titles today, do you ever wish you could somehow repay the designers and developers who made your favorite games?
I've been reading some good books lately on the subject of fun and videogames, such as Koster's A Theory of Fun and McGonigal's Reality Is Broken, plus whatever I see cropping up on Google Reader. Anyway, I've been studying their definitions and trying to come up with a synthesis, plus adding a few things of my own from my studies of Ancient Greece. Needless to say, almost everything these authors feel is new or original is just the latest incarnation of things taught by Aristotle and Plato.
These are some thoughts I'm trying to work up into a book project; here are a few of the core concepts.
I'll be the first to say that computers are only as smart as the people who design and write their software. That said, the New York Times recently posted an article about the ESRB's ratings system and how it will move from human-based grading to computer-based grading. It isn't that the computers have some sort of A.I. that plays the entire game through and assigns a rating (wouldn't that be grand?), but rather that the games will move toward a questionnaire-based rating system.
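To give a rough idea of what a questionnaire-based system might look like under the hood, here's a minimal sketch; the questions, weights, and thresholds below are invented for illustration and are not the ESRB's actual criteria:

```python
# A toy questionnaire-based rater. The questions, weights, and
# thresholds are invented; they are not the ESRB's actual criteria.

QUESTION_WEIGHTS = {
    "realistic_violence": 3,
    "blood_and_gore": 3,
    "strong_language": 2,
    "gambling_themes": 2,
    "cartoon_violence": 1,
}

# Checked in order; the first threshold the score meets wins.
RATING_THRESHOLDS = [(7, "M"), (4, "T"), (1, "E10+"), (0, "E")]

def rate(answers):
    """Turn a publisher's yes/no questionnaire answers into a rating."""
    score = sum(weight for question, weight in QUESTION_WEIGHTS.items()
                if answers.get(question, False))
    for minimum, rating in RATING_THRESHOLDS:
        if score >= minimum:
            return rating

print(rate({"realistic_violence": True, "strong_language": True}))  # -> T
```

Which brings us back to my opening point: a system like this is only as honest as the publisher filling out the form.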
In my upcoming Matt Chat with Scratches designer Agustín Cordes, we talk a lot about our perspectives on classic games such as Myst and King's Quest, and how those have changed over time. Agustín says he doesn't consider Myst (1993) a vintage game, simply because it feels too modern compared to the earlier King's Quest (1984). That got me thinking about how we perceive time when talking about individual games. There are nine years between Myst and King's Quest, but 18 years have passed since Myst first graced the Mac (16 for PC). Each year that goes by compresses that nine-year gap, so that Myst seems to have followed closely on the heels of King's Quest (and thus I feel comfortable grouping both under the category "classic"). For Agustín, on the other hand, the gap seems much wider, perhaps because the Myst style is still "modern" in the sense that most adventure games still follow its model.
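One quick way to put numbers on that intuition: the gap between the two games is fixed at nine years, but its share of the total time elapsed keeps shrinking. A few lines of Python (just my back-of-the-envelope illustration, not anything from the interview) make the point:

```python
# The nine-year gap between King's Quest (1984) and Myst (1993) keeps
# shrinking as a fraction of the total time elapsed since King's Quest.

KINGS_QUEST, MYST = 1984, 1993

for year in (1995, 2003, 2011):
    share = (MYST - KINGS_QUEST) / (year - KINGS_QUEST)
    print(f"{year}: the gap is {share:.0%} of the era since King's Quest")

# 1995: the gap is 82% of the era since King's Quest
# 2003: the gap is 47% of the era since King's Quest
# 2011: the gap is 33% of the era since King's Quest
```

From today's vantage point the gap is only a third of the whole span, which may be why the two games feel like neighbors to me, while to Agustín, working in the still-current Myst tradition, the distance looks as wide as ever.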