All Natural Is Not Always All Good

Posted January 1, 2004 By Pattie Gillett

Though some might not agree, it cannot be a coincidence that less than a year after the death of Orioles pitcher Steve Bechler, the Food and Drug Administration finally succeeded in doing what it had been trying to do for ten years – getting the herbal supplement ephedra off the market. Bechler’s death from heatstroke in 2003 was linked to his use of diet pills containing ephedra. Though doctors had known for years that the active ingredient in ephedra supplements could cause dangerous increases in both heart rate and blood pressure, putting users at increased risk of heart attack, stroke, and related ailments, Bechler’s death gave the media a high-profile casualty to link with the drug.

Some might say that it is too cynical to presume that without a martyr, the FDA’s case against ephedra might never have made it this far, but history tells a different story. In fact, given the double standard in how supplements and drugs are regulated, it might actually be surprising that the ban happened at all. Read the remainder of this entry »

        

Profit and Loss in the Marketplace of Ideas

Posted January 1, 2004 By Dave Thomer

When Dr. Laura Schlessinger made comments that many claimed denigrated and disrespected homosexuals, offended individuals quickly boycotted the show and its advertisers.

When Dixie Chicks singer Natalie Maines made comments that many claimed denigrated and disrespected President George W. Bush, offended individuals quickly boycotted the band, its albums and its concerts.

These and other boycotts just as quickly spawned a backlash of their own. In the backlash’s view, by threatening media figures with a loss of income, boycotters attempt to stifle the expression of unpopular ideas, and thus deprive those figures of their right to free speech. Author Peter David, for example, has repeatedly criticized such efforts on his blog, including one entry where he writes: “I’m talking about pure, simple, appropriate, proportional response. If you disagree with someone, say it with words, because saying it with punitive, retaliatory measures proves nothing except that you are petty and intolerant.”

I applaud the sentiment behind David’s position, but as I have thought about it, I can’t help but feel he’s not quite on the right track here. In certain situations in a market-based economy, boycotts and economic pressure are a wholly legitimate method of political and social discourse. Read the remainder of this entry »

        

Better Supplement Controls No Great Loss

Posted January 1, 2004 By Pattie Gillett

The public response to the recent ephedra ban (which is also the subject of this month’s Public Policy article) puzzles me to no end. Reminiscent of the Today sponge episode of Seinfeld, people have responded to the FDA’s banning of the potentially dangerous supplement by hoarding the stuff.

ABC News reports that health food stores were cleaned out of products containing the supplement within hours of the announcement of the impending ban. Though GNC, the nation’s largest retailer of supplements, stopped selling products containing ephedra in June 2003, there are still hundreds of thousands of pharmacies, health food stores, and gyms that are more than willing to sell Metabolife, Speed Stack, Ripped Force, and others, by the case if necessary.

Longtime users credit ephedra supplements with helping them to stay in shape, fight fatigue, perform better in sports, and, of course, lose weight. More than anything else, ephedra supplements are marketed as weight-loss aids. On its web site, Metabolife International claims its Metabolife 356, one of the most popular ephedra products on the market, increases the body’s metabolism so users burn fat faster. A well-placed asterisk warns the reader that these claims have not been evaluated by the FDA and that the product is not intended to cure, prevent, treat, or diagnose disease. It’s a catchall disclaimer that supplement manufacturers use to remind users which side of the Dietary Supplement Health and Education Act (DSHEA) of 1994 they are on. Read the remainder of this entry »

        

Game Over. Start Again? – Part 3

Posted December 1, 2003 By Earl Green

…and Burn

Late 1984 and 1985 belonged to the home computer market, and the handful of software makers that had survived the crash. But the industry wasn’t dormant. In 1983, Nintendo had made a splash in its native Japan with the Famicom game console – short for Family Computer. With processing power vastly ahead of anything on the American market, the Famicom seemed like a shoo-in for U.S. success until the crash happened. Nintendo initially approached Atari to market the Famicom in the Western Hemisphere, and at first, it seemed like a done deal – both sides were eager to join forces.

Then one of the biggest decisions in the entire history of the business of video games took place, signaling the rise of one company and the fall of the other – and all because of a misunderstanding.

At a 1984 Consumer Electronics Show, Atari and Nintendo were close to inking the Famicom distribution pact. Very close. As a show of good faith, Atari had received rights to translate Nintendo’s games for the U.S. home computer market, and the future looked bright – until Atari executives noticed a new version of Donkey Kong running on an Adam computer at Coleco’s booth. Infuriated, they confronted their counterparts at Nintendo, who threatened to yank the home console rights out from under Coleco. By the time the misunderstanding was settled, there had been a changing of the corporate guard at Atari, and the deal was off. Nintendo was on its own, and by the time it made it to market with the American version of Famicom – now called the Nintendo Entertainment System – U.S. investors and consumers had turned a cold shoulder toward the video game business.

Through a series of brilliant marketing maneuvers, such as including a remote-controlled robot called R.O.B. with the system and selling the resulting package as a toy, Nintendo broke into the market and kick-started a renaissance of the game industry. But it wasn’t about to let another crash begin: Nintendo clamped down on licensing and manufacturing rights, forcing anyone wishing to make games for the NES to come to Nintendo to have the cartridges manufactured. Anyone circumventing the built-in security features that allowed only licensed software to run was swiftly and aggressively sued, including an Atari subsidiary, Tengen, which battled with Nintendo over the cartridge rights to a simple Russian puzzle game called Tetris. Nintendo won the battle, and the rest – including Tengen – was history. Read the remainder of this entry »

        

Beware of Posture Commandos

Posted December 1, 2003 By Dave Thomer

I am trying to sit very, very straight as I type this. If I do not, trained posture commandos might rappel in through the window and shove a plank down the back of my shirt, which I am heartily opposed to, not simply because I fear splinters in my back, but because I like this shirt. You may ask why I am in such a state of heightened posture awareness. For one thing, I am genuinely trying to take better care of myself. More importantly, I am now convinced that my mother has posture control agents stationed throughout the city, and I don’t want to run afoul of any of them.

You may find it unreasonable that I, a 28-year-old husband, father, homeowner, and otherwise responsible adult, would fear my mother and her hawkish pro-posture stance. The problem is, as much as our parents might be looking out for us and proud of our accomplishments and whatever else they stick on those ‘For You, Son, On Your First Gray Hair’ greeting cards, some crucial sector of brain cells fails to let go of the fact that we’re the same individuals who would once choose Crayola over Sherwin-Williams as a wall covering any day of the week. And no matter how hard we try to play that responsible adult, eventually, we will slip up and give them ammunition. Read the remainder of this entry »

        

Behind the Brands

Posted December 1, 2003 By Pattie Gillett

My dusty college marketing textbooks define brand loyalty as the “degree to which a consumer intentionally and repeatedly chooses one brand over another.” My mother just spent a week with me, helping Dave and me prepare for my first Thanksgiving at my new house. I now define brand loyalty as “the length of time one person will fight with a blood relative in the middle of a crowded grocery store before making a purchase.” I’ll spare you the bloody details but let’s just say that my mother and I have agreed to disagree on most of the decisions that one makes when one wheels one’s cart into a supermarket, namely brand decisions. (Come to think of it, we even had a brief spat over the choice of shopping cart but that’s neither here nor there.)

For many people venturing out on their own, the thrill of buying your very own Cocoa Puffs and dish detergent wears off pretty fast. Grocery shopping, whether you do it daily, weekly, or monthly, becomes a relatively irksome, repetitive, and expensive task. According to the Bureau of Labor Statistics, the average consumer unit (new government-speak for “household”) spends about 14.5% of its annual income on food and grocery items. (However, this figure does include meals eaten away from home.) That’s quite a slice of pie. Read the remainder of this entry »

        

Game Over. Start Again? – Part 2

Posted November 2, 2003 By Earl Green

No, Really – It Can Be A Computer

With the debut of broadband internet multiplayer gaming via consoles such as PS2, Xbox and GameCube, and the introduction of keyboard and mouse peripherals edging these game machines closer to being full-fledged computers, is it possible that the video game industry has finally made good on the long-standing pledge of convergence?

Don’t count on it.

The add-on keyboard and internet service may have caught the public eye when Sega brought them to the Dreamcast in 2000, but it wasn’t the first time, and almost certainly won’t be the last. The first add-on keyboards/game-to-computer upgrades actually date back to the underpowered Atari 2600. No fewer than three such peripherals were put on the market toward the end of that console’s life span, and other manufacturers tried to follow suit. Almost from the beginning, Mattel Electronics made “forward-looking statements” about the Intellivision becoming the core of a full-fledged personal computer system, but kept delaying the bulky keyboard module which would literally engulf the company’s game console. Before long, consumer complaints and truth-in-advertising advocates caught up with Mattel, and questions were asked in Congress and the halls of the Federal Trade Commission. To avoid hefty fines, Mattel rushed the Intellivision Computer Module to a somewhat limited market – and it did almost nothing that had been promised. But with a product at last delivered, Mattel was off the hook.

Another attempt by Mattel to make good on these promises was its purchase of a low-end computer/game console called Aquarius from a Taiwanese manufacturer. Aquarius burst onto the scene in 1983 with bright blue rubber keys, a casing that resembled a game console more than anything, a licensed-but-lobotomized version of Microsoft BASIC and a processor that had one of its software designers openly calling it “a computer for the 70s!” – not an auspicious start. Mattel added a peripheral of its own, the Mini-Expander, allowing Aquarius to play specially-programmed versions of such Intellivision games as Utopia and Tron Deadly Discs, though the graphics and game play of those titles paled in comparison even to their Intellivision versions. Aquarius tanked, and tanked hard. Mattel wound up begging out of its contract and selling its stock of the machine and related peripherals and software back to the Taiwanese manufacturer.

Coleco, which had won nothing but favorable press for its high-end Colecovision console (which included a near-perfect version of the arcade hit Donkey Kong with every machine), made a similar misstep with the introduction of its Adam computer. Arriving as both a stand-alone product and an expansion module so one could turn one’s Colecovision into a full-fledged computer, Adam was, like Aquarius, initially sold on the basis that it could play bigger, faster, better-looking games. A large software library was promised, but there was one problem – a large user base never quite materialized. Worse yet, many of the Adam computers that were sold turned out to be defective, and like Aquarius, the machine hit the stores just in time to be engulfed in the video game crash of 1983-84. Coleco hastily exited the game and computer business, finding great success in its next major product: Cabbage Patch Kids dolls.

Some companies did successfully launch companion lines of computer products, most notably the Atari 400 and 800 computers introduced in 1979, though they were incompatible with the 2600. But these computers only came after Nolan Bushnell sold Atari to Warner Communications for millions. Earlier than that, Bushnell had listened to a pitch from one of his young employees; with his brilliant-minded engineer friend, this Atari worker had invented a new computer, and they thought Bushnell should buy the design from them and launch a line of Atari computers. Bushnell passed on the idea, instead pointing the two youthful inventors toward a source of venture capital – and together, Steve Jobs and Steve Wozniak launched the meteoric, personal-computer-industry-sparking success story of Apple Computer. Barely three years after turning down their pitch, Nolan Bushnell was kicked off the Atari board under the new Warner-owned corporate regime, and new CEO Ray Kassar ordered the creation of Atari’s first computers in an unsuccessful attempt to catch up with Apple.

Online gaming isn’t new either. Sega marketed modems for the Genesis console, but even before that there was talk of Nintendo turning the NES into the hub of its own network: imagine trading stocks, sending e-mail and ordering products online with the same “joypad” controller you just used to play a round of Super Mario Bros. 2! Those plans failed to materialize.

But even before that, an early precursor to online gaming hit the market for the Atari 2600. A memory cartridge and modem all in one, the Gameline peripheral would connect Atari owners to an online database of downloadable software – all for a monthly fee, of course. The only drawback was that the downloaded game held in the Gameline cartridge’s memory would be wiped the moment the console was powered down. Gameline met with minimal success, and its creators later turned toward the burgeoning home computer market, a demographic they felt would be more familiar with modems and downloads. Gameline later underwent a series of rather drastic changes in both name and structure, eventually becoming an upstart telecommunications service called America Online.

Sex and Violence

It seems like video games only make the news these days if there’s a major hardware innovation or if someone’s decided they’re corrupting the youth of the world. Blame for the Columbine High School massacre in Littleton, Colorado was all but laid at the feet of The Matrix and Id Software’s highly customizable first-person PC gun battle, Doom. Not long before that incident, games such as Mortal Kombat had also been singled out for violent content, and Night Trap, a full-motion video “interactive movie” released by Sega, had received bad press for violence and featuring women in skimpy clothing or lingerie. More recently, Grand Theft Auto: Vice City has been given a great deal of negative press for similar themes.

And once again, neither argument is new to the industry. The first protests against violent video games came during the infancy of the art form, thanks to 1976’s Death Race, an arcade driving game by Exidy which required one or two players to guide a pair of cars around a closed arena, running down as many stick figures as possible. Each figure “killed” would be marked by an indestructible grave marker afterward. Now, according to the game, these stick figures were zombies urgently in need of squishing. But it didn’t take long for Exidy to begin receiving complaints about the game, and Death Race even earned coverage in print news outlets and a report on 60 Minutes. Exidy wound up cutting the manufacturing run of Death Race short, and only 1,000 machines were produced. Atari founder Nolan Bushnell promptly made appearances, declaring that no such games promoting violence against human beings would ever be produced by Atari; in some ways, Bushnell’s timely play for press attention presaged similar maneuvers that Nintendo and Sega would use against one another during the vicious, backbiting Congressional hearings that eventually led to the modern video game rating system.

It took longer for the issue of sex in video games to appear, but it finally did in 1983. American Multiple Industries – actually an adult video vendor operating under a somewhat more respectable sounding name – released a series of adult-themed cartridges for the Atari 2600, starting with one which raised the most controversy of all: Custer’s Revenge. Players guided a naked American soldier across the screen, trying to avoid a stream of arrows, to reach an equally naked Native American woman tied to a pole on the other side of the screen; when the woman was reached, players hit the action button on the joystick to – as the documentation so euphemistically put it – “score.” The graphics were primitive enough that they were more laughable than erotic, but in this case it was the thought that counted – and both American Multiple Industries and Atari received numerous protests from the National Organization for Women, several Native American groups, and rape victims’ advocacy groups. Atari did its best at damage control, pointing out that it had no control over its licensees’ game content. American Multiple Industries released a few more games, though none of them had quite the one-two punch of controversy that Custer’s Revenge carried, and shuttered its video game operations a year later when the home video game industry suffered a massive shake-out.

Crash

If the video game industry is cyclical in nature, with warm periods of public acceptance and cold gaps of public scorn, why did the crash only happen once? From a business standpoint, it only needed to happen once.

In 1983 and 1984, the home video game industry entered a steady but gradual decline. There were too many competing hardware platforms on the market, some of them by the same manufacturer: Atari software designers were frustrated that the company had refused to retire its biggest cash cow, the Atari 2600, on time. The 5200 was to be the next-generation machine, but two major problems stifled it. First, months passed between the initial release of the console and an adapter that would allow new 5200 owners to play their old library of 2600 games; Colecovision hit the stores with such a peripheral already available, making it a no-brainer for Atari 2600 owners wishing to trade up. Second, the 5200 was undercut by the still-profitable market for Atari 2600 games. Very few third-party manufacturers bothered to create 5200 games – why should they when the 2600 had a far larger user base? Other consoles such as Intellivision and Odyssey 2 hung on for dear life, but they were niche systems whose user base consisted primarily of early adopters – i.e., few customers looking for a new game system bothered with machines with lesser market penetration, and fewer third-party software makers bothered to create games for them.

But the third-party game makers were more than happy to milk the Atari 2600 for all it was worth, hoping to ride the venerable workhorse of the video game industry straight to the bank. And this is where it all started to go horribly, horribly wrong.

In 1979, Atari lost a landmark case against the world’s first third-party video game software company, Activision. Initially, Atari had sued Activision’s four founding members – all ex-Atari employees – for misappropriation of trade secrets, but that case had been lost, and finally Atari failed to prove in court that Activision’s very existence was detrimental to Atari’s sales. At most, Atari was granted a license provision that sent small royalties its way anytime a third-party game company mentioned in packaging or sales materials that its games would run on Atari’s hardware – but that was all. The floodgates were open and anyone could make games for the Atari 2600 so long as the royalty was paid. Even if the games, to put it bluntly, sucked. Even if they were Custer’s Revenge.

To make a long story short, dozens of companies that had never shown any previous interest in the home video game business piled onto the 2600 bandwagon: 20th Century Fox, CBS, board game gurus Avalon Hill, Children’s Television Workshop (the makers of Sesame Street), and even companies hoping to tie up niche markets like Christian-themed gaming. Sega’s first home video game products were Atari 2600 cartridges. These dozens of companies produced hundreds of games, and to be charitable, many of them were substandard. (It should also be noted, however, that some of Atari’s in-house titles weren’t much better, as demonstrated by miserable adaptations of arcade games like Pac-Man and of movies – namely E.T., five million unsold copies of which are buried in a New Mexico landfill.) As the consumer base shifted its weight toward home computers and high-end systems like Colecovision, Atari’s hangers-on began dumping games at bargain basement prices. Atari itself swore not to do the same, but eventually it had to – and the bottom dropped out of the stock value of virtually every video game company when it happened. Even arcade game manufacturers were caught in the undertow and went out of business.

Activision barely survived the wash-out of third-party software houses; its next closest competitor, Imagic, abandoned plans for an IPO, sold its intellectual property rights, and quietly folded. Many talented game designers found themselves out of work in a fledgling IT job market that thumbed its nose at “mere” game designers. As former Odyssey 2 programmer Bob Harris said, “I made a conscious decision to get out of video games because I would have to move to California to stay with it, which I couldn’t afford, and I was scared by the negative view game work was given when I interviewed outside the game industry. I started looking for other jobs, I remember that the general attitude I seemed to run into was ‘so, you’ve been programming games, and now you want a real job.’ It came as a little slap in the face, because the real-time aspects of game programming were more challenging than most other programming jobs I’ve had since.”

Somewhat eerily predicting the dot-com boom and bust, the video game market was brought to its knees. Programmers were out of jobs en masse. Speculators hoping to hop aboard for a sure thing were left in bankruptcy proceedings. The Atari 2600 and other system hardware and software were dumped onto the market at fire-sale prices, if stores could be convinced to keep selling them at all. For all intents and purposes, the American video game industry was dead.

Coming up in part three: the video game industry resurrects itself with a different set of rules, and comes full circle. Thought the classics were old hat? Game over. Start again.

        

Citizenship: A Call to Service?

Posted November 2, 2003 By Dave Thomer

In 1910, Harvard philosopher William James tried to justify his pacifism with an essay called “The Moral Equivalent of War.” Horrified by the destructiveness of war, James nonetheless recognized that there were strengths to be found in a military environment. Dedication, strength of purpose, the feeling of being bound together into a common effort greater than one’s individual needs . . . all of these have very tangible benefits, which unfortunately are often overwhelmed by the death and destruction of actual war. James argued that while he firmly believed in pacifism, a peaceful society would not be built easily, and would probably not be built at all if those virtues could not be harnessed in a non-violent way. Thus, he said:

If now . . . there were, instead of military conscription a conscription of the whole youthful population to form for a certain number of years a part of the army enlisted against Nature, the injustice [that some struggle in life while others have lives of leisure] would tend to be evened out, and numerous other goods to the commonwealth would follow. The military ideals of hardihood and discipline would be wrought into the growing fibre of the people; no one would remain blind as the luxurious classes now are blind, to man’s relations to the globe he lives on, and to the permanently sour and hard foundations of his higher life.

America has recognized the value of community service for years, and has tried to encourage it through a number of government initiatives such as the Peace Corps and AmeriCorps, which President George W. Bush recently consolidated under the umbrella of the USA Freedom Corps. Retired General Wesley Clark, now running for the Democratic presidential nomination, has proposed the creation of a Civilian Reserve which would seek to emulate the structure of the military reserves, in which volunteers with special skills could be called upon to utilize those skills in emergency situations in exchange for a stipend, health benefits, and a guarantee that they could return to their former jobs when their service was complete. In his speech to introduce the proposal, Clark explicitly placed his own military service in the broader context of the service performed by police, fire and emergency workers and by community activists and volunteers: Read the remainder of this entry »

        

Be Reasonable – Part 1

Posted November 1, 2003 By Dave Thomer

Much of this site’s content centers around the attempt to put together reasonable arguments in support of one position or another. We haven’t really spent much time exploring what a ‘reasonable argument’ is, however, and one of the quickest ways to end a potentially constructive conversation is to let basic terms go unexamined. It might seem like an understanding of logic and reasoning should be common sense, but within the philosophical arena, there are fundamental differences about the very nature of logic and reasoning that aren’t just academic hand-wringing. Those differences often spill out into people’s everyday discourse – as do the errors that drive logic professors crazy. So what I’d like to do is start a sort of primer to basic structures of logic, and touch on some of the related issues.

What I’m discussing here is very basic, formal deductive logic. It’s formal not in the sense that it wears a three-piece suit, but in that it’s concerned with the form, or structure, that an argument takes – how its parts fit together, and what does and does not follow from given pieces of information. A logician will often work with symbols rather than actual arguments to keep this emphasis clear. It’s deductive because it works from given information to determine what other facts absolutely must be true – there are no shades of gray or degrees of probability. Of course, not every argument that one encounters will fit neatly into a particular formal structure or be amenable to a strict yes-or-no evaluation. Much of the reasoning we do in everyday life is of the inductive variety, which factors degrees of probability into the mix. But many of the underlying principles are the same, which makes the study of formal deductive logic worthwhile.
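As a concrete illustration of an argument form (the notation and the rain example here are mine, not part of the original primer), modus ponens is one of the simplest valid deductive forms, sketched in standard logical notation:

```latex
% Modus ponens: from "if P then Q" and "P", we may conclude "Q".
% P and Q are placeholder propositions -- the argument is valid
% purely in virtue of its form, whatever P and Q happen to say.
\begin{array}{ll}
  P \rightarrow Q & \text{(premise: if it is raining, the street is wet)} \\
  P               & \text{(premise: it is raining)} \\
  \hline
  \therefore\ Q   & \text{(conclusion: the street is wet)}
\end{array}
```

No matter what sentences replace P and Q, any argument of this shape is deductively valid: if both premises are true, the conclusion cannot be false.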
Read the remainder of this entry »

        

That Dream Within a Dream

Posted October 1, 2003 By Pattie Gillett

A few weeks ago, Dave and I celebrated our fourth wedding anniversary. Fourth. Not fifth or tenth. Fourth. I’m not even sure what one receives for a fourth wedding anniversary. Wood? Tupperware? Nevertheless, most people I mentioned it to were politely unimpressed. Some even went so far as to ask if we had started planning how we were going to celebrate next year’s “big one.”

To some degree, I can understand their feelings. To people who have been married for twenty or twenty-five years or longer, four years of marriage seems like no big deal. To me, however, four successful years with the same person is something to raise a glass to – particularly in an age where society’s collective attitudes about marriage seem a bit schizophrenic.

I woke up this blessed date, October 17 (yes, women do earn points for remembering the date, too), and stumbled downstairs to begin my day, which went something like this: Read the remainder of this entry »