Game Over. Start Again? – Part 2

No, Really – It Can Be A Computer

With the debut of broadband internet multiplayer gaming on consoles such as the PS2, Xbox and GameCube, and the introduction of keyboard and mouse peripherals edging these game machines closer to being full-fledged computers, is it possible that the video game industry has finally made good on the long-standing pledge of convergence?

Don’t count on it.

The add-on keyboard and internet service may have caught the public eye when Sega brought them to the Dreamcast in 2000, but it wasn’t the first time, and almost certainly won’t be the last. The first add-on keyboard/game-to-computer upgrades actually date back to the underpowered Atari 2600. No fewer than three such peripherals were put on the market toward the end of that console’s life span, and other manufacturers tried to follow suit. Almost from the beginning, Mattel Electronics made “forward-looking statements” about the Intellivision becoming the core of a full-fledged personal computer system, but kept delaying the bulky keyboard module that would literally engulf the company’s game console. Before long, consumer complaints and truth-in-advertising advocates caught up with Mattel, and questions were asked in Congress and in the halls of the Federal Trade Commission. To avoid hefty fines, Mattel rushed the Intellivision Computer Module to a somewhat limited market – and it did almost nothing that had been promised. But with a product at last delivered, Mattel was off the hook.

Another attempt by Mattel to make good on these promises was its purchase of a low-end computer/game console called Aquarius from a Taiwanese manufacturer. Aquarius burst onto the scene in 1983 with bright blue rubber keys, a casing that resembled a game console more than anything else, a licensed-but-lobotomized version of Microsoft BASIC, and hardware so underpowered that one of its own software designers openly called it “a computer for the 70s!” – not an auspicious start. Mattel added a peripheral of its own, the Mini-Expander, allowing Aquarius to play specially programmed versions of such Intellivision games as Utopia and Tron Deadly Discs, though the graphics and gameplay of those titles paled in comparison even to their Intellivision originals. Aquarius tanked, and tanked hard. Mattel wound up backing out of its contract and selling its stock of the machine, along with the related peripherals and software, back to the Taiwanese manufacturer.

Coleco, which had won nothing but favorable press for its high-end Colecovision console (bundled with a near-perfect version of the arcade hit Donkey Kong), made a similar misstep with the introduction of its Adam computer. Sold both as a stand-alone product and as an expansion module that turned a Colecovision into a full-fledged computer, Adam was, like Aquarius, initially pitched on the promise that it could play bigger, faster, better-looking games. A large software library was promised, but there was one problem – a large user base never quite materialized. Worse yet, many of the Adam computers that were sold turned out to be defective, and like Aquarius, the machine hit the stores just in time to be engulfed in the video game crash of 1983-84. Coleco hastily exited the game and computer business, finding great success in its next major product: Cabbage Patch Kids dolls.

Some companies did successfully launch companion lines of computer products, most notably the Atari 400 and 800 computers introduced in 1979, though they were incompatible with the 2600. But those computers came only after Nolan Bushnell sold Atari to Warner Communications for millions. Before the sale, Bushnell had listened to a pitch from one of his young employees, who, together with a brilliant engineer friend, had designed a new computer; the pair thought Bushnell should buy the design from them and launch a line of Atari computers. Bushnell passed on the idea, instead pointing the two youthful inventors toward a source of venture capital – and together, Steve Jobs and Steve Wozniak launched the meteoric, personal-computer-industry-sparking success story of Apple Computer. Barely three years after turning down their pitch, Nolan Bushnell was forced off the Atari board under the new Warner-owned corporate regime, and new CEO Ray Kassar ordered the creation of Atari’s first computers in an unsuccessful attempt to catch up with Apple.

Online gaming isn’t new either. Sega marketed modems for the Genesis console, but even before that there was talk of Nintendo turning the NES into the hub of its own network: imagine trading stocks, sending e-mail and ordering products online with the same “joypad” controller you just used to play a round of Super Mario Bros. 2! Those plans failed to materialize.

But even before that, an early precursor to online gaming hit the market for the Atari 2600. A memory cartridge and modem all in one, the Gameline peripheral connected Atari owners to an online database of downloadable software – all for a monthly fee, of course. The only drawback was that the downloaded game held in the Gameline cartridge’s memory was wiped the moment the console was powered down. Gameline met with minimal success, and its creators later turned toward the burgeoning home computer market, an audience they felt would be more familiar with modems and downloads. Gameline later underwent a series of rather drastic changes in both name and structure, eventually becoming an upstart telecommunications service called America Online.

Sex and Violence

It seems like video games only make the news these days if there’s a major hardware innovation or if someone’s decided they’re corrupting the youth of the world. Blame for the Columbine High School massacre in Littleton, Colorado, was all but laid at the feet of The Matrix and Id Software’s highly customizable first-person shooter for the PC, Doom. Not long before that incident, games such as Mortal Kombat had also been singled out for violent content, and Night Trap, a full-motion video “interactive movie” released by Sega, had received bad press for its violence and for featuring women in skimpy clothing or lingerie. More recently, Grand Theft Auto: Vice City has drawn a great deal of negative press for similar themes.

And once again, neither argument is new to the industry. The first protests against violent video games came during the infancy of the art form, thanks to 1976’s Death Race, an arcade driving game by Exidy which required one or two players to guide a pair of cars around a closed arena, running down as many stick figures as possible. Each figure “killed” would be marked afterward by an indestructible grave marker. According to the game, the stick figures were zombies urgently in need of squishing, but it didn’t take long for Exidy to begin receiving complaints, and Death Race even earned coverage in print news outlets and a report on 60 Minutes. Exidy wound up cutting the manufacturing run of Death Race short, and only 1,000 machines were produced. Atari founder Nolan Bushnell promptly made public appearances declaring that Atari would never produce games promoting violence against human beings; in some ways, Bushnell’s timely play for press attention presaged similar maneuvers that Nintendo and Sega would use against one another during the vicious, backbiting Congressional hearings that eventually led to the modern video game rating system.

It took longer for the issue of sex in video games to surface, but it finally did in 1983. American Multiple Industries – actually an adult video vendor operating under a somewhat more respectable-sounding name – released a series of adult-themed cartridges for the Atari 2600, starting with the one that raised the most controversy of all: Custer’s Revenge. Players guided a naked American soldier across the screen, dodging a stream of arrows, to reach an equally naked Native American woman tied to a pole on the other side; once she was reached, players hit the action button on the joystick to – as the documentation so euphemistically put it – “score.” The graphics were primitive enough to be more laughable than erotic, but in this case it was the thought that counted – and both American Multiple Industries and Atari received numerous protests from the National Organization for Women, several Native American groups, and rape victims’ advocacy groups. Atari did its best at damage control, pointing out that it had no say over its licensees’ game content. American Multiple Industries released a few more games, though none carried quite the one-two punch of controversy that Custer’s Revenge did, and it shuttered its video game operations a year later when the home video game industry suffered a massive shake-out.

Crash

If the video game industry is cyclical in nature, with warm periods of public acceptance and cold gaps of public scorn, why did the crash only happen once? From a business standpoint, it only needed to happen once.

In 1983 and 1984, the home video game industry slid into a steady decline. There were too many competing hardware platforms on the market, some of them from the same manufacturer: Atari’s software designers were frustrated that the company had refused to retire its biggest cash cow, the Atari 2600, on schedule. The 5200 was meant to be the next-generation machine, but two major problems stifled it. First, months passed between the console’s initial release and the arrival of an adapter that would let new 5200 owners play their existing library of 2600 games; Colecovision hit the stores with just such a peripheral already available, making it the obvious choice for Atari 2600 owners wishing to trade up. Second, the 5200 was undercut by the still-profitable market for Atari 2600 games – very few third-party manufacturers bothered to create 5200 titles when the 2600 had a far larger user base. Other consoles such as Intellivision and Odyssey 2 hung on for dear life, but they were niche systems whose owners were primarily early adopters; few customers looking for a new game system chose machines with such limited market penetration, and fewer third-party software makers bothered to create games for them.

But the third-party game makers were more than happy to milk the Atari 2600 for all it was worth, hoping to ride the venerable workhorse of the video game industry straight to the bank. And this is where it all started to go horribly, horribly wrong.

Atari lost a landmark legal battle against the world’s first third-party video game software company, Activision, founded in 1979 by four ex-Atari employees. Initially, Atari sued Activision’s founders for misappropriation of trade secrets, but that case was lost, and in the end Atari failed to prove in court that Activision’s very existence was detrimental to Atari’s sales. At most, Atari was granted a license provision that sent small royalties its way whenever a third-party game company mentioned in its packaging or sales materials that its games would run on Atari’s hardware – but that was all. The floodgates were open, and anyone could make games for the Atari 2600 so long as the royalty was paid. Even if the games, to put it bluntly, sucked. Even if they were Custer’s Revenge.

To make a long story short, dozens of companies that had never shown any previous interest in the home video game business piled onto the 2600 bandwagon: 20th Century Fox, CBS, board game gurus Avalon Hill, Children’s Television Workshop (the makers of Sesame Street), and even companies hoping to corner niche markets like Christian-themed gaming. Sega’s first home video game products were Atari 2600 cartridges. These dozens of companies produced hundreds of games, and to be charitable, many of them were substandard. (It should also be noted, however, that some of Atari’s in-house titles weren’t much better, as demonstrated by miserable adaptations of arcade games like Pac-Man and of movies – namely E.T., five million unsold copies of which were reportedly buried in a New Mexico landfill.) As the consumer base shifted its weight toward home computers and high-end systems like Colecovision, Atari’s hangers-on began dumping games at bargain-basement prices. Atari itself swore not to do the same, but eventually it had to – and when it did, the bottom dropped out of the stock value of virtually every video game company. Even arcade game manufacturers were caught in the undertow and went out of business.

Activision barely survived the wash-out of third-party software houses; its next closest competitor, Imagic, abandoned plans for an IPO, sold its intellectual property rights, and quietly folded. Many talented game designers found themselves out of work in a fledgling IT job market that thumbed its nose at “mere” game designers. As former Odyssey 2 programmer Bob Harris said, “I made a conscious decision to get out of video games because I would have to move to California to stay with it, which I couldn’t afford, and I was scared by the negative view game work was given when I interviewed outside the game industry. I started looking for other jobs, I remember that the general attitude I seemed to run into was ‘so, you’ve been programming games, and now you want a real job.’ It came as a little slap in the face, because the real-time aspects of game programming were more challenging than most other programming jobs I’ve had since.”

In a collapse that eerily prefigured the dot-com boom and bust, the video game market was brought to its knees. Programmers were out of jobs en masse. Speculators hoping to hop aboard a sure thing were left in bankruptcy proceedings. The Atari 2600 and other systems’ hardware and software were dumped onto the market at fire-sale prices, if stores could be convinced to keep selling them at all. For all intents and purposes, the American video game industry was dead.

Coming up in part three: the video game industry resurrects itself with a different set of rules, and comes full circle. Thought the classics were old hat? Game over. Start again.