In Search of Stupidity: Over 20 Years of High-Tech Marketing Disasters


Welcome to the Korean edition of In Search of Stupidity: Over 20 Years of High-Tech Marketing Disasters! Korea is a country that knows something about stupidity, having suffered since the 1950s from the ongoing effects of what may be the 19th and 20th centuries’ stupidest idea, Communism. But we in the US are well aware that Korea is also a first-world country with a first-class, high-tech infrastructure. This accounts for the fact that Koreans are famously addicted to massively multiplayer online games, the descendants of the old multiuser dungeons (MUDs), such as World of Warcraft, Starcraft, and MU, and to dominating the rest of the world in them. In the US, if your son or daughter looks up from their PC sometime and mutters something about how their shiny World of Warcraft castle has been sacked by rampaging Orcs, evil knights, or pitiless demons, the odds are quite good it’s a Korean doing the sacking and rampaging!

This means the time is right for a Korean edition of Stupidity.  As Korea takes its place amongst the world’s high-tech elite, it faces an important choice!  Will Korean companies  A) repeat the mistakes made by others and suffer repeated financial losses, layoffs, and much angst and personal woe as its software and hardware industries attempt to grow?  Or will it be B) Korea gracefully avoids repeating past disasters made by idiots in other countries while it grows to unparalleled high-tech greatness?

Sorry, I’m voting for “A.” Human nature is universal, and human stupidity is an incredibly powerful force capable of ignoring history, common sense, and practical experience (if you doubt this, just spend a minute looking north). But you, lucky reader, are equipped with a valuable antidote to high-tech stupidity. You hold in your hands not just a book, but an institutional memory that will pour foresight and advanced knowledge into your brain. Soon, you will be equipped to avoid the mistakes of the past and will be able to march forward into the future secure in the knowledge that if you do indeed screw up, your mistakes will probably be original ones.

And this, at the very least, will put you one up on the industry’s biggest software company, which, with its latest release, managed to repeat positioning mistakes made almost a quarter century ago by MicroPro, a company that has earned its own inglorious chapter in this tome. I’m speaking, of course, of the launch of Windows Vista. Several months after the release of Vista to businesses (the product was released to consumers in January 2007), the consensus of the market is that Vista is a flop. It’s a peculiar type of flop. Financially, Vista is making a great deal of money for Microsoft. No surprise there; after all, the company has an almost complete monopoly in the desktop OS and main business applications markets and the dominant position in the server OS segment. OEMs are in complete thrall to Microsoft; if you don’t offer Windows on your laptop, you’ve got an unsellable silicon brick in your warehouse.

But that said, Vista has failed to meet any of the lofty goals originally set for it. It has failed to excite any segment of the market, including key influencer groups such as the press and the gamers. It is not driving sales of new computers. At retail, the pull figures for Windows out of the channel are dreary and show no signs of changing (we’re researching these numbers and will report on them in an upcoming issue of Softletter). The blogs are condescending, and even many Microsoft devotees are dismayed by what they see and hear. Legitimate copies of Windows XP (and even 2000!) just became more valuable, and knuckles will have to be severely cracked before the hands clutching those precious old boxes relax and accept a fresh copy of Vista. The fact is, few people see any need or have any desire to buy Vista.

In all fairness, some of the problems that accompanied the Vista launch are beyond Microsoft’s control. As the Internet has matured as a development and infrastructure platform, the growth of SaaS and the advent of hybrid applications have led to an inevitable challenge to Microsoft’s monopolies in OSs and desktop applications. Over the next several years, Microsoft will need to execute the painful chore of chewing off its own two strong revenue arms (but not too quickly) and hope they regenerate into new revenue and profit builders. It’s not a tasty task, and you can’t really blame the company for avoiding it, necessary though it is.

But paradigm shifts aside, the biggest problem crippling the Vista rollout was Microsoft’s complete bungling of the critical task of properly positioning the product. Vista’s marketing identity is a dirty smear in the mind’s eye of the public; today, it’s almost impossible to find anyone (particularly anyone from Microsoft) who can cleanly and quickly describe why Vista is superior to Windows XP. And a confused buyer is a buyer who does not buy (or one who buys something they can understand).

Microsoft’s Positioning Sins

During the Vista rollout, Microsoft committed several positioning sins. Redmond’s mistakes begin with the deadly transgression of creating a positioning conflict within its own product lines. It’s a surprising mistake. During the history of the Windows 9.X vs. NT product lines, Microsoft was frequently tormented by customers confused about which product to buy, a confusion highlighted by the firm’s creation of one of high-tech’s dumbest ads, the “two nags racing” piece, which you can see preserved on www.insearchofstupidity.com in the Museum of Stupid High-Tech Marketing. While 9.X and NT both existed, Microsoft frequently had to explain why a customer should buy one product over the other when both were called Windows, looked very much the same, did very much the same thing, and cost pretty much the same. But Microsoft was lucky in that during this period its chief jousting opponent was IBM and its OS/2.

But with Vista, Microsoft pointed the lance at its own foot, kicked its marketing war horse into action, and firmly pinned its toes to the ground. There are no fewer than six (actually seven, counting the OEM version. Wait, isn’t that eight if you count academic packages? Are we missing some other variants? Probably. Does Windows CE count?) versions of Vista currently available for sale:

  • Vista Starter (which you can’t buy unless you live in the Third World, apparently.)
  • Vista Home Basic (which, amazingly, does not include the new UI.)
  • Vista Home Premium
  • Vista Business
  • Vista Enterprise
  • Vista Ultimate

This plethora of choices leads customers to naturally ask a deadly question: “Which one do I buy?” Before, a consumer only had to choose between Windows XP Home and Professional (Windows XP Media Center Edition came along too late in the life cycle of the product to become well-known enough to confuse anyone). To help customers, Microsoft has published a blizzard of collateral, comparison sheets, pricing matrices, etc., etc. Thinking about whether to buy Vista Home Premium vs. Vista Business? What’s “SuperFetch” worth to you? How about “Volume Shadow Copy”? But it’s good to know the Business version includes “Premium Games.” Just what a businessperson is looking for in their business version of Vista. Why not include applications that have some applicability to business concerns? Maybe stock analysis and reporting? Specialized business calculators? Something? Anything?

And making things even more interesting, the EULAs accompanying each version are different. Want to create a virtual machine on your PC and run Vista Home in it? You can’t! How about Vista Business? You can! Why one and not the other? Who knows?

Moving along down the path of positioning damnation is Microsoft’s failure to attach any powerful or desirable image to Windows Vista as a product line. Try to picture what memorable image or capability is associated with Vista. None comes to mind. The product does have a new interface, but Microsoft made no attempt to build a compelling image of usefulness around the Aero Glass UI. Yes, the icons are bigger and the screen savers are prettier, but so what? Microsoft might have discussed how the new desktop gave users “X-ray vision” like Superman, increasing their day-to-day productivity while working with Windows, but it didn’t. Vista is supposed to be far more secure than XP, and perhaps Microsoft could have discussed “Fort Knox,” an integrated group of technologies that allowed you to lock up your PC with bank vault tightness, but it didn’t. (Given Microsoft’s past security woes, it may have lacked the stomach for this gambit.)

By contrast, when Apple released Tiger (OS X 10.4), the market and the press were bombarded with information spotlighting “Spotlight,” an integrated search utility baked into the new release. Desktop search was by no means new on either Macs or PCs, but the Mac campaign succeeded in making people aware of its usefulness and, more importantly, gave them a mental picture of why they might want to give Tiger a look. With Leopard (OS X 10.5), the emphasis was on “Time Machine” (integrated backup).

Another key task in launching Vista was to match features to pricing expectations, and here Microsoft also failed, particularly in respect to Windows Vista Ultimate. Ultimate is the kitchen sink of Windows, the one with all the toys and whistles, and it’s expensive at $450 for the retail version (and a pricey $270 for the upgrade version). But not to worry! With your purchase of Ultimate you’re promised all sorts of goodies only you, our ultimate customer, can download. And what are these exciting ultimate premiums? Well, to date, they include fun things like Windows Hold ’Em (a poker game), extra language packs for the Windows multilanguage user interface (those squeals of delight are coming from buyers who just unpacked Finnish), Secure Online Key Backup, and the BitLocker Drive Preparation Tool. (The latter two products are included in other, cheaper versions of Windows.) And oh, let’s not forget the new screen saver that lets you use videos. Alas, it’s in beta and not due to be finished for a while yet. Ultimate customers are, of course, delighted with all of this.

In its long and storied history, Microsoft has distinguished itself from its competition by its ability to avoid the self-inflicted wound of stupid marketing. With the release of Windows Vista, this changed. Microsoft repeated mistakes made by MicroPro (positioning conflict), Borland (positioning conflict, pricing/feature disparity), Novell (positioning conflict), Ashton-Tate (pricing/feature disparity coupled with inept PR), and itself (Windows 9X vs. NT), proving that the company now suffers from the same institutional memory hole that afflicts much of high tech. The Vista release now serves as a valuable and contemporary object lesson in how not to position and launch a new software product.

Best of luck!

In the first edition of In Search of Stupidity: Over 20 Years of High-Tech Marketing Disasters, I made a deliberate decision to avoid giving specific advice about how companies could avoid being stupid. At the time, I thought the process was fairly obvious: study the mistakes of the past, apply self-observation to your current behavior, and if you see yourself repeating a previous example of idiocy, stop and do something else. As I point out in the preface to the second edition, the claim that high-tech companies are constantly running into “new” and “unique” situations that they cannot possibly be expected to anticipate and intelligently resolve is demonstrably false (particularly if you read In Search of Stupidity). The truth is that technology companies are constantly repeating the same mistakes with wearying consistency (as this second edition makes even clearer), and many of the stupid things these companies do are completely avoidable.

But despite my fond expectations, many who read the first edition claimed they needed more guidance on avoiding stupid behavior and more detailed instructions on how to pump up the frontal lobes of the collective corporate brain. Thus, I’ve added helpful analyses and, where appropriate, checklists of specific actions you can take both to avoid acting stupidly and to transform yourself into a marketing Einstein after suffering a brain hiccup similar to the one that afflicted Microsoft in 2012, when Steve Ballmer and company decided to repeat MicroPro’s devastating WordStar vs. WordStar 2000 positioning catastrophe of 1984. Although sometimes written with tongue in cheek, the analyses and the fundamental items in the lists will assist you in your quest to raise your marketing and sales IQ. Follow their sage advice, and you will find they offer you both redemption (good) and foresight (much, much better).

Another critique leveled at the first edition of In Search of Stupidity was its love of hindsight (also sometimes known as “history”). In the opinion of a fairly vocal minority, applying hindsight to the situations I wrote about was unfair; they believe I was picking on a band of dewy-eyed naifs wandering about a primordial high-tech Garden of Eden where original sin was unknown until introduced into paradise by Lucifer. (Prime candidates for the role of “Father of Lies” over the years have included Steve Jobs, Bill Gates, Larry Ellison, and a bevy of other industry movers and shakers from the period covered. The winner of the part depends on historical context and your personal opinion.)

For an overview of this viewpoint, I urge you to go to Amazon.com and read the In Search of Stupidity reviews, both good and bad, to see how people expressed themselves on the topic of hindsight. In my humble opinion, the words of Robert Hangsterfer “bob_hangsterfer” from Glendale, Wisconsin (two stars, “Rehashed stories, no guidance,” May 10, 2005) best sum up the disdain of some for learning from the mistakes of the past: “The author berates the ‘losers’ of the PC software wars and laughs at their ‘stupid decisions.’ Yet, how were the executives supposed to know what decisions would lead to success or failure?” Bob calls plaintively from the virtual pages of Amazon.

Now, in all honesty, I don’t regard this criticism as trenchant but rather somewhat tautological: “Nothing do I know; therefore I know nothing. So how can you expect me to act like I know?” But, the question deserves an answer. So, let’s take “Brands for the Burning,” which deals with Intel’s $500 million plus meltdown over the Pentium’s inability to properly handle floating-point math past four digits. How could Intel possibly have known the consequences of its actions? How could Intel have possibly predicted what would happen when a major brand is besmirched by a major (or at least a perceived major) flaw in a high-profile product? What clues existed that would have possibly informed poor, confused, lost little Intel that its course of attempting to cover up a flaw in its flagship microprocessor, refusing to acknowledge the impact of the problem, and not offering to make customers whole to the extent possible was a stupid path to take?

Well, Intel could have studied the 1982 example of Johnson & Johnson, when some cockroach slipped cyanide into capsules of Extra Strength Tylenol and murdered seven people. The poisoning immediately destroyed sales of the leading brand of acetaminophen, and most observers predicted that Tylenol was doomed. In the first days of the disaster, advertising guru Jerry Della Femina, author of the classic From Those Wonderful Folks Who Gave You Pearl Harbor (Simon & Schuster, 1970) and other tomes about the world of ads and admen, was quoted by the New York Times as saying, “I don’t think they can ever sell another product under that name. There may be an advertising person who thinks he can solve this, and if they find him, I want to hire him, because then I want him to turn our water cooler into a wine cooler.”

I assume that Jerry has drunk a lot of vino in the intervening 24 years because his prediction was dead wrong. Instead of shriveling away, Johnson & Johnson launched a PR campaign that by 1994, the year in which the Intel debacle occurred, had already become a model of what to do when circumstances damage a company’s reputation or brand. The campaign included the following elements:

  • An immediate press campaign by Johnson & Johnson informing the public about the poisoned capsules and warning them not to use any Tylenol product. Company executives were instructed to not obfuscate or deny the scope of the problem but instead cooperate with the media in getting the story out so as to ensure everyone heard about the poisoning.
  • An immediate recall of all Tylenol capsule products on store shelves (at a cost of more than $100 million to Johnson & Johnson).
  • An offer to immediately swap out all Tylenol capsules with Tylenol tablets.
  • A series of forthright statements by Tylenol upper management expressing their shock and pain over the deaths.
  • After the completion of the recall, an extensive PR announcement of the introduction of new Tylenol products in tamper-proof packaging, coupled with an extensive series of promotional programs offering the new products at reduced prices via price discounts, coupons, and so on.

When you read Chapter 8 of this book, contrast these actions with the ones Intel actually took.

Johnson & Johnson’s classic (and much-studied) campaign rescued the product from the marketing grave. During the crisis, Tylenol saw its share of the market drop from 37 percent to 0 percent. A few months after the poisonings, Tylenol was back up to a 24 percent market share, and today it still reigns as the leading brand of this popular painkiller.

So, there’s your answer, Bob. That’s how Intel could have known what to do. With a little study, a little history (hindsight), and a healthy dollop of common sense, we know how Intel could have saved itself a world of embarrassment and derision as well as a cool $500 million+.

(Oh, how do we know this? Because, Bob, amazingly enough, the second generation of Pentiums also suffered from math problems! But, Intel had learned its $500 million lesson. The company promptly offered to recall the “defective” processors and make dissatisfied customers whole. Since most customers didn’t really know what a floating-point unit [FPU] chip was and even more probably no longer knew how to do long division, the public—offered the security of Intel’s “guarantee blankie”—decided not to bother fixing a problem that wasn’t bothering them, and no one paid any attention to the whole imbroglio except for a small cadre of picky math people and hardware-obsessed geeks who took their new chips home and went away happy.)
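
(If you want to see just how small the original flaw looked on paper, the classic published test case fits in a few lines. Here’s a minimal sketch in Python; the division below is the widely reported example, and the famous “256” result is what flawed first-generation Pentiums returned, not anything a modern machine will reproduce.)

```python
# The widely publicized division that exposed the original Pentium FDIV flaw.
# Mathematically, x - (x / y) * y is exactly zero. A correct FPU returns
# (almost exactly) zero; flawed first-generation Pentiums famously returned
# 256, because the quotient went wrong from about the fifth significant digit.
x, y = 4195835.0, 3145727.0

quotient = x / y
residue = x - quotient * y

print(f"{x:.0f} / {y:.0f} = {quotient:.10f}")  # correct value: 1.3338204491...
print(f"residue = {residue}")                  # correct FPU: ~0.0; flawed Pentium: 256.0
```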

Now, in all fairness, it’s not just high tech that suffers from a reluctance to learn from the mistakes of the past. For just a moment, let’s step outside high technology and take a look at what is perhaps America’s most seminal business, the automotive industry. By the 1970s, the U.S. car industry had raised the practice of building shoddy buggies to an art form. I particularly remember, toward the end of the decade, an abomination produced by Chrysler called the Cordoba, which was, we were assured by pitchman Ricardo Montalban, clad in “fine Corinthian leather.” It is a virtual certainty that all that survives of these cars are the leather seats; the bodies long ago turned to rust. Chryslers of this era were matched in this respect only by the Chevy Vega, a car that began to disappear in a cloud of iron oxide particles from the moment it was driven out of the showroom. If that wasn’t exciting enough, for more thrills one could always buy the “Immolator,” the Ford Pinto with that amazing, mounted-behind-the-rear-axle-so-it-was-guaranteed-to-explode-when-smacked-hard gas tank.

But by the second half of the 1980s, a turnaround seemed to have taken place. Ford, General Motors, and Chrysler all appeared to turn important quality corners. Although no American cars have ever reached the benchmark standards set by Japanese carmakers, the situation definitely improved. Instead of dissolving into a heap of nanoparticles by 60,000 miles, the fate suffered by the hapless Cordoba, American cars started to be put together so well that people began to expect their homegrown tin to hit the 100,000-mile mark. U.S. carmakers latched on to the Japanese discovery that people like to “live” in their cars, and soon American cars had caught up with their overseas rivals in respect to the number of cubbies and cup holders festooning their buggies; Chrysler in particular was so diligent in this regard that some people began referring to its minivans and sedans as Slurpeemobiles. Even more telling, while K cars such as the Dodge Aries and Plymouth Reliant were never state-of-the-art automobiles, 20 years after their introduction versions of each could be seen still limping up the frozen, rust-inducing streets of New York, New England, and eastern Canada (where they were sometimes known as Toronto Taxis).

The cult of quality continued to spread across the American auto landscape during these years; at Ford, quality was “job one”; at Chrysler, the sorta-legendary Lee Iacocca, when he wasn’t driving the development of such abortions as the revived Chrysler Imperial, a 1970s K car with a 1960s design that appealed to people in their 80s, proclaimed that “if you could find a better built car, buy it.” Not to be outdone, General Motors started a new division, Saturn, designed to prove that if you stuck a group of Americans in a remote location in the backwoods of Tennessee with nothing better to do, they’d build a small, underpowered, but pretty reliable car just as good as the ones the Japanese were building in the 1980s.

All this attention to quality and reliability paid off; during the late 1980s and through much of the 1990s, American cars held their own against the Japanese and seriously dented the Europeans. But by the mid-to-late 1990s, American car companies had begun to backslide. Today, Toyota Camrys and Honda Accords routinely reach 150,000 and even 200,000 miles of reliable, trouble-free use while sporting ever more sophisticated designs, increased fuel economy, and more powerful engines. Doors and body panels on Japanese cars align with geometric precision; by contrast, the body panels of Chevys and Pontiacs often look like they’ve been attached to the car by assemblers suffering from problems with both depth perception and basic geometry. On European cars, interior plastic trim usually has a plush feel and pleasing patinas; on American cars, the plastic frequently appears as if it were made from recycled polyester double-knit leisure suits left over from the 1970s. I’ve owned both a Pontiac Grand Am and a Bonneville over the past ten years; both suffered from serious electrical problems before they hit 65,000 miles. (To add insult to injury, my 1999 Pontiac Bonneville underwent, in the course of two weeks, a transformation from transportation to tin at 69,000 miles when in quick succession the car’s AC system ceased working and the engine’s plastic(!) intake manifold cracked, drowning the buggy’s innards in antifreeze.) By contrast, every electrical and mechanical component in my dispose-a-car Hyundai Elantra wagon still worked properly when I gave the thing away at 130,000 miles.

A Honda Accord’s high-beam stalk flicks over with a satisfying “snick”; by contrast, the action on a Pontiac Bonneville’s light control stalk is equivalent to yanking on a stale stick of licorice. The Ford Focus, Lincoln LS, Pontiac Aztek, Chrysler 300M, and so on, and so on, have all been plagued with extensive quality complaints upon their initial introduction. Consumer Reports, the gold standard for objective auto ratings (yes, the magazine is not much fun to read, and it has that annoying left-wing, tree-hugger-life-was-better-in-the-19th-and-early-20th-centuries-when-choo-choo-trains-belched-smoke-into-the-air attitude, but it does buy its test vehicles and thus has no need to suck up to Detroit or Tokyo in the manner of publications such as Car and Driver and Road and Track) consistently awards Japanese cars seas of little red bull’s-eyes (top ranked) while American cars are awash in black dots (bottom of the barrel). This after 30 years of multiple opportunities to catch up and adjust to the new reality the Japanese had introduced to the market, namely, that well-engineered and highly reliable cars will be favored by buyers over cars that aren’t.

Quality, reliability, and design issues had become such a problem in the U.S. auto industry that by 2006 General Motors and Ford bonds had been reduced to junk status; both companies were shedding plants, employees, and market share by the bucket load, and Ford scion William Ford had resorted to making remarkably-just-like-the-ones-made-20-years-ago ads featuring smiling (presumably because they’d not been fired for making astoundingly unreliable first-year Ford Focuses) workers promising that Ford was (again) going to make reliable and well-built cars.

In defense of the indefensible, several auto-industry observers offered the feeble excuse that the reason for the recurrence of poor quality in American cars was that Detroit had decided to focus on building big SUVs in its quest for profits and market domination. The problem with this theory is that while the Americans were pouring out tons of GMC Jimmies with front ends that wobbled like tops at about 60,000 miles and Eddie Bauer–version Ford Expeditions with body parts that tended to fly off the car’s exterior at moderate velocities, the Japanese were turning out giant, global-warming-contributing and ice-cap-melting monstrosities that were also highly reliable and well made (as adjudged by Consumer Reports).

What possible justification can American companies (and I blame both the bosses and the workers for their failure to get it) offer to excuse their failure to at least match the Japanese in quality and reliability? Answer: there is no excuse. What we’re dealing with is sheer idiocy and a failure to study the mistakes of the past (hindsight) so as to avoid doing the same stupid thing all over again.

I’m not quite sure why hindsight in general has developed a bad reputation amongst the high-tech cognoscenti, but I have several theories. One revolves around culture, specifically the culture of Silicon Valley, high technology’s principal engine and driver since the late 1970s. Silicon Valley is located in California, a land of self-actualization and narcissism and home to some of the silliest cults ever to plague mankind. Take est, for example. Developed by former used-car salesman (of course!) “Werner Erhard” (not his actual name, but who cares), est was built around the platitude of “What is, is.” This was translated into a very profitable seminar program on how to arrive at “is” by getting “it.”[1] The seminars were highlighted by a series of exercises that took the attendees on a trip through a tomato, allowed est trainers to yell rude things at the attendees, encouraged acolytes to ignore most social niceties, and didn’t allow them to go to the bathroom (very often) during the est seminars. The core of the est belief system revolved around a mantra that “your beliefs, feelings, and experiences were the only things that were truly valid.”[2]

est had its competitors, as you’d expect. At MicroPro, for example, many in upper management’s inner circle were “graduates” of something called “The Laughing Men.” This was an offshoot of est that taught, according to what I was told at the time, pretty much the same thing. (I assumed the reason the men were laughing was that they were allowed to go to the bathroom.) At Ashton-Tate, Scientology was very popular, with the company’s founder, George Tate, being a practitioner.

est and its imitators were quite the thing in the 1970s and early 1980s and created large cadres of sociopaths[3] who felt they were immune from such interpersonal obligations as saying they were sorry when they misbehaved and who were totally focused on gratifying their every last whim and desire (which, come to think of it, characterizes much of upper management at many high-tech companies today). Some est graduates finally snapped out of it (often after receiving the divorce papers or a punch in the nose), and one day Werner Erhard bailed out of the business and moved to Europe. But an examination of the current zeitgeist indicates est’s solipsistic message of relying only on your experiences to “create your own reality” has taken hold in much of high tech (as well as in other industries), and there’s probably not much of a market for teaching something everyone already believes in. And while you’re busy tending to your own reality, you tend not to have much time for worrying about others’ realities, particularly unsuccessful ones, since your reality will obviously not include their failures.

Another theory focuses on the underlying nature of engineers and programmers, many of whom continue to create new and innovative companies that they often then destroy by repeating past stupidity. The best programmers and engineers are usually “world creators,” people who like to “live” in their work and are happiest when they have complete control over every aspect of the tools, techniques, and technologies they use to create products. The frequently written-about “not invented here” (NIH) syndrome is a direct result of this ethos, and the damage it can wreak on a company is illustrated in Chapter 4, which discusses how a key programming group at MicroPro finally destroyed the company over the issue of product control. A corollary to NIH is DTMNBICTCAYD, or “Don’t tell me nothing, because I created this company and you didn’t,” an affliction that also frequently leads to history repeating itself.

Again, it’s not just high tech that suffers from these syndromes. In 1908, Henry Ford created the Model T, the car that allowed Ford to transform the face of America and become for more than 20 years the largest automotive company in the world. By the standards of the day, the T was high tech, well built, easy to maintain, reliable, and cheap. But car technology was rapidly changing, and in 1912, while Ford was away on a trip, several of his engineers built a prototype of a new Model T, an “upgrade” that incorporated improvements such as a smoother ride and more stable wheelbase. Ford’s response to this attempt to improve “his” car without his exercising direct control over the process was to smash and vandalize the prototype while the shocked engineers watched.[4]

Now, in all fairness, some businesses insist on using hindsight to study past failures: the airline industry is a good example of this. After a crash or major flight faux pas, NTSB investigations do not normally follow these lines:

NTSB investigator: “Uh, captain, I note you’ve just flown your airplane straight into that mountain and killed all your passengers.”

Airplane captain: “By Jove, with the benefit of 20/20 hindsight, we can all see you’re right! But, you know, I just had to experience catastrophic aerial failure for myself to truly comprehend it. Having lived through the disaster, I’ve absorbed on a deeply personal level just how bad crashing my plane and killing all my passengers can be and will in the future understand intuitively why it’s an experience to be avoided in the first place!”

Instead, after a crash or serious operating mistake by a flight crew, the circumstances are analyzed, broken into their constituent parts, and then programmed into a flight simulator, which can be thought of as an electronic box stuffed full of hindsight. After this, flight crews from all over the world are periodically summoned to attend simulator classes so they can directly learn from all this hindsight until their instructors are satisfied they are unlikely to repeat another’s mistake.

But, enough preaching. If you’re one of those sturdy types who march to their own drummer, who seek to squeeze the juice of life from sources pure of the carping calls of second-guessing, and who desire to personally experience every emotion directly so as to live a life unadulterated by the pallid personas of those cowards who shrink from the whip of disaster and scourge of financial failure, let me salute you and bid you good luck and Godspeed!

Just do me one favor. At some point, send me your e-mail address and a description of what you’re up to. I’ll need some good material for the third edition of In Search of Stupidity.


Preface to the First Edition

In 1982, Harper & Row published In Search of Excellence: Lessons from America’s Best-Run Companies by Thomas J. Peters and Robert H. Waterman, Jr. In Search of Excellence quickly became a seminal work in the category of business management books and made its authors millionaires. Although it’s no longer the literary obsession of freshly minted MBAs that it was in the 1980s, the book’s distribution and influence have proved long lasting and pervasive. After its introduction, the book stayed on best-seller lists for almost 4 years and sold more than 3 million copies. A survey by WorldCat, an electronic catalog of materials from libraries in the United States and other countries, ranks In Search of Excellence as being on more library shelves than any other book in the world; with 3,971 libraries listing it in their collections, it tops WorldCat’s list of the 100 books most widely held by libraries, a number-one position it has occupied since 1989.

In Search of Excellence, when it first came out, applied soothing balm to the raw nerves of the American psyche, and this helps account for its tremendous success. The 1970s had been a gloomy time for U.S. businesses. The Japanese had run American companies out of consumer electronics; Japanese cars lasted 100,000 miles, while American cars started breaking down at 20,000; and as the 1980s began, Japanese companies had just started making memory chips more cheaply than their American counterparts. The Japanese even announced they were starting a “Fifth Generation” project to build software that would make computers very, very smart indeed, leaving the poor old United States with software systems that would be the technological equivalent of Studebakers. (The project was a complete bust, like all the others emanating from the artificial intelligence hype machine of the 1980s, and it never developed much more than software capable of storing some nice recipes for sushi.) Yes, the United States was doing OK in this new market for little machines called “microcomputers,” but the pundits universally agreed that eventually the Japanese were going to move into that industry as well and that would be it for the Americans.[5] Maybe IBM would survive; after all, it did business like the Japanese anyway. For the ambitious young MBA, a start-up position in agribusiness, such as sheepherding, began to look like the fast track to the top.

In Search of Excellence helped buck everyone up. All the companies it profiled were American firms competing successfully in world markets. It seemed obvious that if you studied the organizations closely, learned the fundamental practices and techniques they used to achieve excellence, and then applied those practices and techniques to your business, it would become excellent too!

The basic thesis of In Search of Excellence isn’t complex and can be summed up succinctly: Excellent companies create corporate cultures in which success flourishes. (Yes, this is something of a tautology, but it’s a nice one and people always like reading it.) An excellent corporate culture is one that loves the customer, loves its employees, loves the company’s products, and loves loving the company. Once enough love is flowing through the corporate veins, a company will organically become excellent and in turn create excellent products and services. This will lead to more customer, employee, product, and corporate love, lifting all concerned to even greater heights of selling and purchasing ecstasy. The cycle becomes self-sustaining, and a universe of almost sybaritic business success awaits those who master the Zen of Excellence.

Most of In Search of Excellence thus functions as the corporate equivalent of the Kama Sutra, profiling different companies as they bend and twist themselves into different postures and techniques designed to build customer desire for the company, increase customer love for the company’s products, and provide lasting satisfaction with the company’s service. The positions and techniques discussed vary widely and include being reliable, shooting for 100 percent, communicating intensely, being creative, talking about it, talking about it a lot, listening a lot, getting on with it, and so on. High-tech firms are particularly well represented in the book, with IBM, Xerox, DEC, and many others serving as exemplars of how to seize the business world by the tail via the practice of excellence.

For the next several years, copies of In Search of Excellence flew off bookstore shelves. Thousands of companies, including most in the high-tech sectors, took its maxims to heart. People walked, talked, and communicated with incredible intensity. Peters became a widely sought-after speaker and business consultant (Waterman dropped out of public sight). He wrote more books, including A Passion for Excellence and The Pursuit of WOW!, all of which continued the earlier book’s quest for that ineffable corporate phlogiston that when ignited leads inexorably to success. America’s affair with excellence appeared to be endless.

Unfortunately, while U.S. businesses were vigorously applying excellence to every nook and cranny of their corporate bodies, a few people began to note that many of the firms listed in Peters and Waterman’s tome seemed to be, well, less than excellent. As early as 1984, Business Week published a cover story entitled “Oops!” that debunked some of the book’s claims. Most people dismissed these early criticisms as journalistic carping, but over time it became more difficult to ignore that something was very wrong with the book’s concept of business excellence.

Take, for example, its examination of Lanier, a major competitor in what is now a vanished world—that of dedicated word processors. The market for these single-purpose computers had been built and defined by Wang. As the market grew, companies such as Lanier, Xerox, IBM, and almost a hundred others competed fiercely for the privilege of selling $20,000.00 boxes that did what a $99.95 piece of software does today (actually, the software does much more). These dedicated devices were often the only experience many people had with computers throughout much of the 1970s, and to many people word-processing stations epitomized “high tech.”

In Search of Excellence thought Lanier was really excellent, a company that “lives, sleeps, eats, and breathes customers.” The book described how the company’s top executives went on sales calls once a month, how the president of the company personally handled service calls (and if you believed that, you probably also went out and bought a famous bridge in New York City), how its service was even better than IBM’s, and so forth, and so on.

And Lanier was a sharp marketing bunch, too! The company knew that the term “word processor” put everybody “off.” That’s why Lanier called its word processors “No Problem Typewriters.” Sheer advertising genius.

The only problem with all of this was that Lanier wasn’t an excellent company; it was a dead company, a shot-through-the-head dinosaur whose sluggish nervous system hadn’t yet gotten round to telling the rest of its body to lie down and die. In 1981, an Apple II+ running AppleWriter or ScreenWriter[6] did everything a Lanier word processor did, never mind an IBM PC with WordStar. By 1985, the market for dedicated word processing was as extinct as the Tyrannosaurus Rex, but Peters and Waterman seemed not to have noticed they were profiling a walking corpse.

Now, you can argue that market shifts can catch companies unaware and that Lanier was a victim of the unexpected. This, however, can’t be true. In Search of Excellence was written in 1981 and published in 1982. By 1981, thousands of Apples, Radio Shack TRS-80s,[7] Commodore PETs, and a wide variety of CP/M systems were selling monthly. The IBM PC was also launched that year. WordStar, AppleWriter, and Scripsit (popular on the Radio Shack systems) had been available for years. Hundreds of ComputerLand stores, part of one of the first national franchises dedicated to selling desktop computer systems, were doing business nationwide, and dozens more were opening on a monthly basis. Yet somehow Lanier, the company that apparently did everything but have sexual relations with its customers, never found out from a single one of them that they were interested in buying an IBM PC or an Apple with a good word-processing program that did everything a Lanier word processor did at a fraction of the cost and did other things as well, such as run a nifty new type of program called a “spreadsheet.” You would think an excellent company would have caught on much sooner.

It only became worse as time passed and people kept track of the book’s list of “excellent performers,” particularly the high-tech ones. For instance, Data General: gone into oblivion.[8] Wang: moribund by 1987. DEC: PC roadkill. NCR: a mediocre performer bought up by AT&T that passed into extinction without leaving a trace. Texas Instruments: the company that coinvented the microprocessor saw its TI-99/4A tossed out of the computer market by 1984. IBM: in 10 years it went from an American icon to an American tragedy.

Xerox, on the ropes by the late 1990s, was on the book’s list of hero companies. By the mid-1980s, industry mavens were already puzzling over how a company could develop the graphical user interface (GUI), the mouse, object-oriented programming, and Ethernet and fail to make a single successful product from any of these groundbreaking innovations. Instead, Xerox made its debut in the PC market with an obsolete-before-its-release clunker of an 8-bit CP/M machine with the appetizing name of “Worm” that sold just about as well as you would expect.

Atari, for God’s sake, even made it to the book’s Hall of Excellence. In 1983, the year after In Search of Excellence’s publication, the company was close to death after releasing the worst computer game of all time, E.T. (based on the movie). Before its product hit the store shelves, an “excellent” company would have used the plastic cartridges that contained this all-time turkey to club to death the parties responsible for producing the game that ruined the Christmas of 1982 for thousands of fresh-faced video game junkies.[9]

It wasn’t simply the companies profiled in In Search of Excellence that proved to be disappointments. During the 1980s, it was impossible, especially in high tech, to escape the training seminars, book extracts, and corporate programs that sprang up dedicated to ensuring everyone was excellent all the time and every day. Yet, despite all the talking, walking, and communicating, high-tech firms kept doing stupid things. Again and again and again. And every time they did they paid a price. Again and again and again.

One key to the problem may be that in 2002, Peters announced that the data used to “objectively” measure the performance of the companies profiled in the book had been faked. Oops. Well, remember, excellence means never having to say you’re sorry.

But despite this little faux pas, a more important answer lies in the types of companies analyzed in In Search of Excellence. With only a few exceptions, they were large firms with dominant positions in markets that were senescent or static. IBM ruled the world of mainframe computers. DEC and Data General had carved out comfortable fiefdoms in minicomputers. Xerox reigned over copiers. Wang and Lanier both possessed principalities in dedicated word processing.

In these types of business environments, affairs proceed at a measured pace and plenty of time is available for navel gazing. Their vision clouded by all that lint, companies such as IBM and DEC decided it was their natural goodness that made them successful, and therefore they were successful because they were naturally good. By the time Peters and Waterman got around to interviewing them, most of these firms were ossifying, their internal cultures attempting to cement employee mind-sets and processes in place in a futile attempt to freeze the past so as to guarantee the future. These firms weren’t excellent; they were arthritic.

For high-tech companies, navel gazing is a particularly inappropriate strategy because markets tend not to stay stable very long. In 1981, for example, distinct markets for spreadsheets, word processors, databases, and business presentation products existed in the software industry. By the late 1980s, word processing alone was a $1 billion category. By 1995, all of these categories had been subsumed by the office suite (particularly Microsoft’s).

What, therefore, accounted for the success of companies such as Microsoft, Oracle, and Symantec and the failure of other firms such as Novell, MicroPro, and Ashton-Tate? Was it Microsoft’s “respect for the individual,” something In Search of Excellence told us IBM had in abundance? Well, Bill Gates once stood up at the start of a presentation being given by a new product manager, fixed the unfortunate fellow with a cold stare, and asked, “Where did we hire you from?” before leaving the room.

Hmm. Perhaps not.

Perhaps it was a “seemingly unjustifiable overcommitment to some form of quality, reliability, or service”? IBM had that in abundance also. Well, Dell Computer is currently the reigning king of PC hardware, not IBM. Although Dell’s service is OK, the company isn’t “unjustifiable” about it. Oh, Dell pays lip service to the concept of great customer service, and within the constraints of its business model, it does the best it can. If you don’t like your PC, Dell will probably take it back if you’re within the warranty period and you scream loudly enough and pay for the shipping and maybe fork over a restocking fee if you’re a small business. If your PC breaks, the company will do its best to get you to fix the thing. But Michael Dell, unlike the excellent CEO of Lanier, won’t be calling your house to handle affairs personally.

That’s because Dell has figured out that what people really care about these days in a computer is high performance at a low price. Dell has learned over the years to build such machines. IBM didn’t, and it ended up exiting the PC business muttering about “commoditization” and “focusing on core competencies” while Dell grew to 2005 revenues of almost $50 billion on sales of servers, notebooks, desktop systems, printers, and related items. Computers are very reliable and on a statistical basis don’t break down often. Even if the ones made by your company do, it is possible to sell a great many of them if you price them cheaply enough, as in the case of Packard Bell, a company that briefly became a powerhouse in PC retailing. Alas, the machines were of poor quality, they broke often, and few people ever bought a second Packard Bell computer.

On the other hand, Dell computers rarely break down (though they have been known to erupt in flames).[10] You, the customer, know that. You’re willing to buy a Dell PC because you’ve made a bet in your mind that the risk that the computer you buy won’t work isn’t worth the extra money it would cost to have your fanny kissed in the event of a breakdown. People who buy desktop PCs aren’t a high-roller audience, and it makes no sense to treat them like one.

Let’s move on.

Or perhaps it was “autonomy and entrepreneurship”? Motorola, a company with a history of allowing different autonomous groups within its phone division to tear at each other’s throats while firms like Nokia tore away its market share, surely has that in abundance. In the entrepreneurial spirit of “up and at ’em,” these groups managed to build what is perhaps the coolest-looking cell phone of its time, the StarTAC. The only problem with the StarTAC was that when it was first introduced it was a very cool analog system when everyone wanted digital phones.

And it was certainly entrepreneurship that led Motorola to launch its Iridium project. Motorola spent $5 billion plus to put 66 satellites into low Earth orbit so that anyone could phone anytime from anywhere with a Motorola phone. Unfortunately, the satellites spend 70 percent of their time over our planet’s oceans and aren’t usable for much of their lives (unless perhaps you’re adrift in the middle of the Atlantic); the phones, though they may have worked from the top of Mount Everest, didn’t work indoors, in the shadows of buildings, or under trees (early demos of the system enjoined purchasers to “make sure the phone is pointed at the satellite”[11]); the service’s monthly cost was high; the phones were huge; and every major metropolitan area already had cheap and reliable cellular systems. In other words, Iridium had no market. After the last satellite was launched, the system quickly went bankrupt. Despondent Motorola stockholders, watching the value of their shares plummet as Iridium crashed and burned, suggested sending up the project’s marketing and engineering teams in rockets without spacesuits to join their orbiting financial debacle, but current law forbids this. You would think an excellent company with entrepreneurial instincts would notice that 70 percent of Earth’s surface is water.

Uh huh. Maybe that isn’t it.

In fact, if you examine high-tech companies, only one factor seems to constantly distinguish the failures from the successes. This factor is stupidity. More successful companies are less stupid than the opposition more of the time. As Forrest Gump astutely noted, “Stupid is as stupid does.”

One of stupidity’s most endearing traits is its egalitarian nature. Its eternal dull lamp beckons endlessly to those dim bulbs who seek to rip open the hulls of successful companies and ideas on the sharp rocks of bad judgment and ignorance. With stupidity, your reach never exceeds your grasp; any company, no matter how large or small, can aspire to commit acts of skull-numbing idiocy and have a hope of success.

Take, for example, the creation of the worst piece of high-tech marketing collateral ever developed, the brainchild of the founder of a small company, Street Technologies. The front page of Street Technologies’ expensive, four-color, 8 1/2 × 11 corporate opus posed the following challenge:

“How to eliminate half your work force.”

The inside of the brochure provided the means to rise to the task:

“Get the other half to use your software!”

When it was pointed out to the president of Street Technologies that a marketing campaign designed to create mass unemployment and spark a brutal Darwinian struggle for personal survival in its target audience might not be the most effective of all possible approaches, he airily dismissed the issue with the observation that “the piece was not aimed at the employees but their bosses.” He’d apparently not considered the issue of who was going to be opening the mail.

Creating silly collaterals isn’t a task reserved only for high tech’s small fry. The second worst piece of marketing collateral ever created was a noble effort by software giant Computer Associates. This was a brochure designed to be included in a direct marketing campaign for a bundle of OS/2 business software. The piece trumpeted the presence of a free goodie that buyers of the bundle would receive upon purchase—a package of canned sounds you could use to liven up your OS/2 desktop. Sounds highlighted in this amazing bit of literature included “farting,” “pissing,” and “orgasm.” One can only mourn that the package didn’t include the noise made when a marketing manager is summarily decapitated for committing an act of boneheaded silliness, such as developing and printing thousands of patently tasteless and offensive four-color brochures.

The reasons for the absence of stupidity can vary. In some cases, firms avoid stupidity because the company’s culture encourages more intelligent behavior. In other cases, it’s because a company’s people are smarter than the competition’s and thus avoid making stupid mistakes. In yet others, it’s because a business’s leadership is smarter than the competition’s and thus tends not to behave stupidly. Usually, it’s a varying mix of all three. In a sense, the reason for not acting stupidly doesn’t matter—the avoidance of it does. By reducing the number of stupid actions you take vis-à-vis your competition, you’re more likely to outcompete them over time.

Some may object that stupidity isn’t quantifiable, but in point of fact, the opposite is true. Stupid behavior is both quantifiable and identifiable. For example, it’s stupid to create two products with the same name, price point, functionality, and target audience and attempt to sell them at the same time. This may seem stunningly obvious, but somehow one of the world’s largest software companies, MicroPro, publisher of WordStar, a product that once ruled the word-processing market, did precisely that. A few years later, Borland repeated very much the same mistake with very much the same results. Then Novell. After you read Chapter 3 and learn precisely why this is a stupid thing to do and what the likely outcome is, you’ll be less likely to make this mistake in your own marketing and sales efforts. That puts you one up on your competition who, unless they’ve also read this book, are far more likely to repeat MicroPro’s fatal blunder.

Nitpickers like to claim that context often changes the nature of what is stupid behavior, but this principle is vastly overstated. For instance, if you spend many millions of dollars successfully creating a consumer brand, and then, when your most important product is revealed to be defective, stupidly attempt to blow off the public (as I describe Intel attempting to do in Chapter 5), you’ll suffer. It really doesn’t matter what industry you’re in or what product you’re selling. Expect to be immolated.

Or take the example of Syncronys, publisher of the immortal, never-to-be-forgotten SoftRAM “memory doubling” utility for Windows. Introduced in May 1995 with a list price of $29.95, SoftRAM was designed to “compress” your computer’s memory using your computer’s memory to give you, effectively, twice the memory you had physically installed (the problem with this concept should be apparent once you think about it). SoftRAM was quite the best-seller upon its release, with the Windows 3.x version selling more than 100,000 copies and the Windows 95 version more than 600,000. The company’s president, Rainer Poertner, was dubbed Entrepreneur of the Year by the Software Council of Southern California. Syncronys stock jumped from $0.03 per share in March 1995 to a high of $32.00 per share in August 1995.

SoftRAM was a handsome-looking piece of software that after installation presented buyers with a snazzy dashboard that supposedly let them increase their PC’s RAM with the touch of a button. Unfortunately for both purchasers of SoftRAM and Syncronys, the software didn’t actually do that. In fact, it didn’t really do anything except change a configuration setting in Windows 3.x that increased the amount of memory that could be swapped to disk, an operation a Windows user could perform manually in less than a minute for free.
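
For those curious about just how little magic was involved, here’s a minimal sketch, in Python rather than anything Syncronys shipped, of the kind of one-line SYSTEM.INI tweak in question. The Windows 3.x swap file was governed by settings in the [386Enh] section of SYSTEM.INI; the file path and the size value below are illustrative assumptions, and on a real period machine you’d simply edit the file in Notepad (after making a backup).

```python
# A sketch of the sort of one-line change SoftRAM amounted to: raising the
# swap-file ceiling in the [386Enh] section of SYSTEM.INI. MaxPagingFileSize
# is a real Windows 3.x setting (in kilobytes); the path and the 16384KB
# value are illustrative, and this is not Syncronys's actual code.
from pathlib import Path

SYSTEM_INI = Path(r"C:\WINDOWS\SYSTEM.INI")  # typical location on a Windows 3.x box

def raise_swap_ceiling(text: str, new_kb: int = 16384) -> str:
    """Rewrite MaxPagingFileSize= in the [386Enh] section, leaving the rest alone."""
    out, in_386enh = [], False
    for line in text.splitlines():
        stripped = line.strip().lower()
        if stripped.startswith("["):                  # entering a new INI section
            in_386enh = (stripped == "[386enh]")
        elif in_386enh and stripped.startswith("maxpagingfilesize="):
            line = f"MaxPagingFileSize={new_kb}"      # the whole "memory doubling" trick
        out.append(line)
    return "\n".join(out) + "\n"

# Usage (back up SYSTEM.INI first!):
# SYSTEM_INI.write_text(raise_swap_ceiling(SYSTEM_INI.read_text()))
```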

It turned out that SoftRAM was an example of what Syncronys coyly called “placeboware,” the software equivalent of a deed to the Brooklyn Bridge. The concept greatly annoyed the spoilsports at the Federal Trade Commission (FTC), who forced the company to stop selling the package and promise to give everyone their money back. (Interestingly enough, no one was prosecuted for fraud in the case, the FTC apparently having bought the argument that the difference between computer sales reps and car salesmen is that car salesmen know when they’re lying.) It would seem obvious to anyone with even half an uncompressed brain that no one would ever buy a product from Syncronys again, but in an act of supreme idiocy the company actually tried to sell other software packages[12] after the SoftRAM debacle. Sheer imbecility, as Syncronys promptly went out of business.

However, more than just a few trenchant examples of stupidity are needed to support a substantive examination of the subject, which brings me to the point of this book. In Search of Stupidity was written to provide you with a more comprehensive look at the topic. Within these pages are documented many of high tech’s worst marketing and development programs and strategies, as brought to you by some of its most clueless executives. In my quest to bring you the best of the worst, I selected from a wide range of companies, from arrogant smaller hotshots on the path to meltdown to sluggish giants too muscle-bound to get out of their own way.

In the interest of fairness, I haven’t included hard-luck stories. No natural disasters, plane crashes, or tragic deaths played a part in any of the disasters discussed. All of the blunders, snafus, and screwups described in this book’s pages were avoidable by the individuals and companies that made them and are avoidable by you and your company. After reading this book, you’ll know what they are and you’ll be in a position to act less unintelligently. For you, history won’t repeat itself.

Of course, it is possible you’ll make other stupid mistakes, ones not chronicled in these pages, but not to worry. If your competition is making the mistakes I describe in these pages, as well as all the others, you’ll still probably prevail. Remember, the race goes not to the strong, the swift, or the more intelligent, but to the less stupid.

Besides, I’m planning a sequel.

Best of luck!

 

[1] For a ribald but also highly informative look at est, I suggest you rent a copy of the 1977 film Semi-Tough. Possibly the best movie ever made by Burt Reynolds, its depiction of est (only thinly disguised in the movie) is both accurate and very funny.

[2] Outrageous Betrayal by Steven Pressman (St. Martin’s Press, 1993)

[3] I speak about this from personal experience, having had close acquaintances who went through the training and who remained unbearable for years.

[4] The Reckoning by David Halberstam (William Morrow, 1986)

[5] In fact, the Japanese did introduce a plethora of CP/M and MS-DOS “clones.” Like many other companies, the Japanese firms failed to understand the impact of the IBM standard on the industry, and none of the machines made a significant impact on the market. In Japan, NEC and Fujitsu attempted to establish independent hardware standards, but their efforts were eventually overwhelmed by IBM’s PC standard. The most important long-term impact the Japanese had on computing technology was Sony’s successful introduction of a standard for 3.5-inch floppies.

[6] An early attempt at a true What You See Is What You Get (WYSIWYG) word processor. The product displayed your text on a bitmapped screen and could show italicized and underlined text. On a 1MHz Apple II it also ran veery slooowly.

[7] The first computer I ever owned was a used RadioShack TRS-80 Model I, semi-affectionately known by its owners as “Trash One.” The reliability of early models was less than stellar, and the paint tended to rub off their keyboards, leading older systems to develop a rather decrepit appearance.

[8] Data General made its own contribution to stupidity with the introduction of the Data General–One in 1985. This was the first “clamshell” portable and, in terms of weight and functionality, a breakthrough. A fully loaded system cost about $3,000.00, weighed about 12 pounds, supported up to 512KB of RAM, could hold two 3.5-inch double-sided 700KB floppies, and featured an LCD screen capable of displaying a full 80×25 display of text, an unusual feature for a portable in that era. It also had enough battery life to allow you to get some work done from your airplane seat. Unfortunately, the LCD screen also sported a surface so shiny and reflective you could literally comb your hair in it, making it almost impossible to view the screen for everyday computing chores. No one could ever quite figure out what had possessed Data General to release a system that basically functioned as a $3,000.00 personal grooming system. I still own one of these systems and once tried to sell it at a garage sale for $25.00. I am happy to discover they’re currently worth about $500.00 in the collectibles market.

[9] It has been my privilege to meet the person who holds the world record for getting the highest score ever achieved on this game, a young man who worked for me in the late 1990s. (The E.T. game and original Atari 2600 game system are somewhat collectible and still used by those interested in retro gaming. If you want to experience the horror that was E.T., you can download the game and a 2600 emulator for your PC from various Internet sites.) I won’t reveal the name of this stalwart gamer because my revelation might permanently damage his career. When I knew him, he suffered from insomnia, and after playing many hours of E.T., I can understand why.

[10] In June 2006 at a seminar in Osaka, Japan, a Dell laptop was photographed erupting into flames because of a defective cell in its lithium-ion battery. The story was particularly embarrassing because Dell had over the last year cut back its not-very-world-class support to levels that invoked the ghost of Packard Bell, proving once again that today’s high-tech hero is only one stupid decision away from becoming tomorrow’s computing clown.

[11] I was present at such a demo. I interrupted the demonstrator to inquire “Which one?”

[12] For instance, it tried to sell a utility called “Big Disk.”

In every high-tech company I’ve known, there’s a war going on between the geeks and the suits. Before you start reading a book full of propaganda from software marketing wizard and über-suit Rick Chapman, let me take a moment to tell you what the geeks think.

Play along with me for a minute, will you? Please imagine the most stereotypically pale, Jolt-drinking, Chinese-food-eating, video-game-playing, Slashdot-reading, Linux-command-line-dwelling dork. Because this is just a stereotype, you should be free to imagine either a runt or a kind of chubby fellow, but in either case this isn’t the kind of person who plays football with his high-school pals when he visits mom for Thanksgiving. Also, because he’s a stereotype, I shouldn’t have to make complicated excuses for making him a him.

This is what our stereotypical programmer thinks: “Microsoft makes inferior products, but it has superior marketing, so everybody buys its stuff.”

Ask him what he thinks about the marketing people in his own company. “They’re really stupid. Yesterday I got into a big argument with this stupid sales drone in the break room, and after ten minutes it was totally clear that she had no clue what the difference between 802.11a and 802.11b is. Duh!”

What do marketing people do, young geek? “I don’t know. They play golf with customers or something, when they’re not making me correct their idiot spec sheets. If it was up to me I’d fire ’em all.”

A nice fellow named Jeffrey Tarter used to publish an annual list, called the Soft*letter 100, of the 100 largest personal computer software publishers. Table 1 shows what the top ten looked like in 1984.

Table 1. Top Software Publishers in 1984

Rank   Company                  Annual Revenue
1      MicroPro International   $60,000,000
2      Microsoft Corp.          $55,000,000
3      Lotus                    $53,000,000
4      Digital Research         $45,000,000
5      VisiCorp                 $43,000,000
6      Ashton-Tate              $35,000,000
7      Peachtree                $21,700,000
8      MicroFocus               $15,000,000
9      Software Publishing      $14,000,000
10     Broderbund               $13,000,000

OK, Microsoft is number 2, but it’s one of a handful of companies with roughly similar annual revenues. Now let’s look at the same list for 2001 (see Table 2).

Table 2. Top Software Publishers in 2001

Rank   Company              Annual Revenue
1      Microsoft Corp.      $23,845,000,000
2      Adobe                $1,266,378,000
3      Novell               $1,103,592,000
4      Intuit               $1,076,000,000
5      Autodesk             $926,324,000
6      Symantec             $790,153,000
7      Network Associates   $745,692,000
8      Citrix               $479,446,000
9      Macromedia           $295,997,000
10     Great Plains         $250,231,000

Whoa. Notice, if you will, that every single company except Microsoft has disappeared from the top ten. Also notice, please, that Microsoft is so much larger than the next largest player that it’s not even funny. Adobe would double its revenue if it could just get Microsoft’s soda pop budget.

The personal computer software market is Microsoft. Microsoft’s revenue, it turns out, makes up 69 percent of the total revenue of the top 100 companies combined. This is what we’re talking about here.

Is this just superior marketing, as our imaginary geek claims? Or is it the result of an illegal monopoly? (Which raises the question, How did Microsoft get that monopoly? You can’t have it both ways.)

According to Rick Chapman (he’s formally known as Merrill, but everyone calls him Rick), the answer is simpler: Microsoft was the only company on the list that never made a fatal stupid mistake. Whether that was by dint of superior brainpower or just dumb luck, the biggest mistake Microsoft ever made, in my opinion, was the talking paperclip. And how bad was that, really? We ridiculed the company, shut off the feature, and went back to using Microsoft Word, Excel, Outlook, and Internet Explorer every minute of every day.

But for every other software company that once had market leadership and saw it go down the drain, you can point to one or two giant blunders that steered the boat into an iceberg. MicroPro fiddled around rewriting printer architecture instead of upgrading its flagship product, WordStar. Lotus wasted a year and a half shoehorning 1-2-3 to run on 640KB machines, and by the time it was done, Excel was shipping and 640KB machines were a dim memory. Digital Research wildly overcharged for CP/M-86 and lost a chance to be the de facto standard for PC operating systems. VisiCorp sued itself out of existence. Ashton-Tate never missed an opportunity to piss off dBASE developers, poisoning the fragile ecology that’s so vital to a platform vendor’s success.

I’m a programmer, of course, so I tend to blame the marketing people for these stupid mistakes. Almost all of them revolve around a failure of nontechnical business people to understand basic technology facts. When Pepsi-pusher John Sculley was developing the Apple Newton, he didn’t know something that every computer science major in the country knows: Handwriting recognition isn’t possible. This was at the same time that Bill Gates was hauling programmers into meetings, begging them to create a single rich-text edit control that could be reused in all their products. Put Jim Manzi (the suit who let the MBAs take over Lotus) in that meeting, and he would be staring blankly and thinking, “What’s a rich-text edit control?” It never would have occurred to him to take technological leadership because he didn’t grok the technology. In fact, the very use of the word “grok” in that sentence would probably throw him off.

If you ask me, and I’m biased, no software company can succeed unless there’s a programmer at the helm. So far the evidence backs me up. But many of these boneheaded mistakes come from the programmers themselves. Netscape’s monumental decision to rewrite its browser instead of improving the old code base cost the company several years of internet time, during which its market share went from around 90 percent to about 4 percent, and this was the programmers’ idea. Of course, the nontechnical and inexperienced management of that company had no idea why this was a bad idea. There are still scads of programmers who defend Netscape’s ground-up rewrite: “The old code really sucked, Joel!” Yeah, uh-huh. Such programmers should be admired for their love of clean code, but they shouldn’t be allowed within 100 feet of any business decisions, because it’s obvious that clean code is more important to them than shipping, uh, software.

So I’ll concede to Rick a bit and say that if you want to be successful in the software business, you have to have a management team that thoroughly understands and loves programming, but they have to understand and love business, too. Finding a leader with strong aptitude in both dimensions is difficult, but it’s the only way to avoid making one of those fatal mistakes that Rick catalogs lovingly in this book. So read the book, chuckle a bit, and if there’s a stupid head running your company, get your resume in shape, and start looking for a house in Redmond.

Joel Spolsky

http://www.joelonsoftware.com

http://www.fogcreek.com

I love this book. When telling stories about some of the finest fiascos in our industry, the author offers unique insight and humor. The result is a book that is both readable and worth reading. That’s a powerful combination that I find increasingly uncommon. I was a fan of the first edition of In Search of Stupidity, and I am honored to be writing this foreword for the second edition.

I am particularly fond of the title of this book. Taken completely out of context, it suggests that if you want to find stupidity in our industry, you have to search for it. I envision a typical person who wanders accidentally into the Software and Computers section of his local bookstore. He sees this book on the shelf and believes that stupidity in high tech is difficult to find.

Aw, never mind that. People are not so easily fooled. Anybody who reads the newspaper can easily look at our industry and see that stupidity is like beer at an NFL game: Half the people have got plenty of it, and they keep spilling it on the other half.

As of August 2006, here is what the average person knows about the world of high-tech products:

  • The FBI just spent $170 million on a software project that completely failed and delivered nothing useful. Most of us would have been willing to deliver them nothing useful for a mere $85 million or so.
  • We each get 50 e-mails a week from eBay, none of which actually came from eBay. So we find somebody who knows about computers and ask why, and he starts spewing stuff that sounds like Star Trek technobabble.
  • The movie industry wants us to buy all our DVDs again so we can see them in “high definition,” but it can’t decide which new format it wants to support. Either way, this comes in the nick of time, because as we all know, the central problem with DVD technology is the atrocious picture quality.
  • The time between the initial release of Windows XP and Windows Vista is roughly the life span of a dog, and apparently the main new feature is that it will be harder to use digital music and video content. Oh yeah, and it looks prettier.

The world of high tech is fouled up beyond all recognition, and everybody knows it.

But everybody loves reading about it. When it comes to failed software projects or dumb marketing mistakes, the mainstream news media are eager to print anything they can get their hands on. Nobody writes stories about software projects or marketing efforts that succeed.

The funny part is that most of the stupidity never makes it into print. Those of us in the industry know that things are actually even stupider than the press makes them look. For example, most people know that whenever Microsoft announces a new product, it gives it a really boring name that nobody can remember. But those of us in the industry know that the boring name was immediately preceded by a “code name” that was memorable or even clever. It’s almost like Microsoft has a department whose mission is to make sure its public image always looks lame and pedestrian compared to Apple’s.

And let’s not forget that stupidity can show up in success as well as failure. Do you know the inside story of the Motorola RAZR? In the original plan, the powers-that-be at Motorola were convinced that the RAZR would be a “boutique phone,” a niche product that would appeal to only a small segment of the market. The company ordered enough components to make 50,000 of them. In the first quarter of production, the wireless companies placed orders for more than a million units. Motorola had the most popular cell phone on the market, and it was completely unprepared for it. It took the company a year to get production capacity up to meet the demand. Today, Motorola is shipping RAZR phones at a pace that is equivalent to selling 50,000 of them every day before lunch.

In the news media, on the message boards, and here in this book, stories about product disasters in our industry are a lot of fun to read.

That’s why the first edition of this book was great, and this one is even better. I applaud the author for the changes he has made in the second edition, giving more specific attention to the matter of learning from the marketing mistakes made by others. I imagine lots of people will enjoy that kind of thing.

But truth be told, not all of us aspire to such a high and noble station.

If you are like me, you probably lied to yourself about why you wanted to read this book. You told yourself how great it would be to learn from the mistakes of others. In reality, we don’t want to learn—we want to gloat. We like to watch things crash and burn. This book is the marketing equivalent of the car chase scene from Terminator 3.

Wielders of clichés would say that misery loves company. Call it what you will, but let’s just admit it together: We like to read about products and marketing efforts that exploded in balls of flame. It helps us feel better about our own stupidity.

And in my opinion, that’s OK. In the vast constellation of unhealthy vices and guilty pleasures, this book isn’t really all that harmful.

Eric Sink

SourceGear

http://software.ericsink.com/

As noted in Chapter 2 of this book, the release of the Altair microcomputer in 1975 heralded the beginning of the modern high-tech industry. But observers of the period also believe there was more to the Altair than just chips; the unit seemed to emit a mysterious elixir that entered the bodies of computer aficionados worldwide and sparked a strange war of the soul that has raged within computer geekdom for more than three decades. The war is between those who advocate free software and open, patentless technology available to all and those who believe in making substantial sums of money from selling proprietary software and the vigorous protection of intellectual property. It’s the Kumbayahs vs. the Capitalists.

Other influences may be responsible for the ongoing struggle. Perhaps Star Trek bears some of the blame. Few in microcomputing hadn’t watched the series, and as Captain Kirk, Mr. Spock, Bones, Scotty, and their innumerable successors went gallivanting through the galaxy, they seemed to have no visible means of financial support. No one in the Star Trek universe wearing green eyeshades ever appeared to worry about the propensity of the various casts to blow up what you’d think were undoubtedly very expensive spaceships, given their capabilities of violating the laws of physics, transporting the crew to numerous planets inhabited by women who spent most of their time wearing lingerie and dodging ray gun fire from angry races of aliens who kept screaming “kaplok!” (and who also seemed to have no monetary worries). Perhaps the reason for Captain Kirk’s insouciance lay in the fact that everyone in Star Trek had access to what were called “transporters,” magical devices that could be used to whisk you from the starship Enterprise to a planet without having to pay a toll. Later in the series’ development, transporters could be used to create chocolate milk shakes, drinks, and even the occasional boyfriend or girlfriend via simple voice commands. And all for free!

Of course, no computer has a Star Trek–like transporter system built into it, but from the standpoint of people interested in obtaining software without forking over monetary compensation, it has something almost as good. That good thing is the “copy” command. And since software, unlike milk shakes, drinks, and boyfriends, is already digitized, just about anyone can execute this wondrous command and enjoy a cornucopia of software in an environment free of the distasteful economic friction of “paying.”

Technology’s interest in the concept of free software was demonstrated almost simultaneously with the release of the Altair in the events surrounding the “liberation” of the first BASIC for this pioneering machine. When first available, the Altair had no useful software, and the market was eagerly awaiting the release of Altair BASIC (waiting was something Altairians were very good at doing because Altair maker MITS was legendary for announcing new products it couldn’t deliver, a habit the rest of the industry soon learned to emulate). The product had been developed by a small software firm, Micro-Soft, run by two people no one had ever heard of, Paul Allen and Bill Gates. Micro-Soft had cut a deal with MITS to receive a royalty on every sale of Altair BASIC and was waiting for a stream of revenue to flow into the tiny firm’s coffers upon the official release of the new product to a market eager to buy it.

Unfortunately for Gates’s and Allen’s short-term plans, someone had appropriated an early version of Micro-Soft’s BASIC, stored on paper tape, at a small MITS trade show held in Palo Alto in 1975. The tape was promptly reproduced and then handed out at such venues as the Homebrew Computer Club, a semilegendary group of computer hackers and enthusiasts who met regularly in Silicon Valley to share information, gossip, advice, and other things, such as “liberated” chips and especially liberated Altair software. Soon, paper tapes containing an early, buggy version of Altair BASIC were in wide use and oddly enough, no one offered to pay Micro-Soft a dime for the product.

In 1975 there was very little that was kumbayah about Bill Gates, and he responded to the purloining of Microsoft BASIC by writing an open letter to the software liberators, published in the Homebrew Computer Club’s newsletter (and in similar publications), chiding them for their thieving ways and asking them to voluntarily pay for the privilege of using his BASIC. His letter made the logical point that if people weren’t recompensed for all their time and hard work spent creating new and better software products, they would have no incentive to do so, and the software industry would wither and die.

Gates’s pleas for financial remuneration went widely unheeded. The very act of releasing the letter generated generous amounts of sneers and opprobrium from software’s kumbayahs, three hundred or four hundred letters addressed to Gates chastising him for his greed, and about three or four voluntary payments for Altair BASIC. Ruined by the premature widespread release of Altair BASIC and the financial loss this entailed, Micro-Soft went out of business, and Gates and Allen were never heard from…aga…errr…no. That’s not what happened.

What actually happened was that the widespread release of Altair BASIC established the product as the de facto standard for microcomputers. Despite some idiosyncrasies, Micro-Soft’s BASIC was regarded as an engineering triumph—lean, loaded with features, and, in comparison with the mainframe and minicomputer BASICs most programmers worked with, incredibly fast. Although not everyone wanted to pay for Altair BASIC, which later became Microsoft (with no hyphen) BASIC, everyone wanted to use it. Since Microsoft’s deal allowed the company to license the product to other firms, Microsoft was soon enjoying a tidy business licensing its BASIC to a plethora of other computer companies. In point of fact, it was the industry’s high regard for Microsoft’s BASIC that led IBM to Bill Gates’s door and enabled him to take advantage of the biggest business opportunity of the 20th century.

Nonetheless, as the industry began its rapid development, resentment on the part of software entrepreneurs grew as software piracy spread. And make no mistake, spread it did. Copying a software program worth hundreds, or even thousands, of dollars was as easy as inserting a blank floppy disk into a disk drive and typing in your system’s version of the “copy” command. Games in particular were the target of frequent liberation efforts, with user groups for systems such as the Amiga and Atari ST sponsoring “swap nights” where members were encouraged to bring in their software collections for communal sharing. Many businesses entered into the kumbayah spirit of things as well; it was a common occurrence for a company to buy one copy of a business software package such as WordStar and distribute it to every member of the company.

To counter the practice of software liberation, now usually called “piracy,” a whole host of what were eventually called “copy protection” systems and techniques were developed. Most of these focused on protecting Apple software because this computer system attracted the bulk of new software development until the release of the IBM PC. Some of the techniques employed included things such as forcing a disk drive to write to locations on a floppy nominally off limits to the hardware; “Spiradisk,” a system that wrote data to the disk surface in a big spiral; hardware “dongles,” plastic keys that contained a chip with a software key embedded into it; and so on.

In response to the efforts of one part of the software industry to prevent software piracy, another part promptly launched an effort to thwart the protectors (this had the happy effect of employing more programmers). Anticopy protection systems included software products such as Locksmith, copy-cracking boards that sucked an entire software product into memory and spit it out to disk, products that were capable of reading dongle keys, and so on, and so on, and so on. As soon as one copy protection scheme was introduced, it was immediately under attack by resourceful folks following in the glorious tradition of Altair BASIC and the Homebrew Computer Club.

By the early 1980s, IBM had entered the market with its own microcomputer, and the focus of the endless cat-and-mouse game between the Capitalists and Kumbayahs shifted to the PC. The software industry’s reaction to rampant software piracy was the general introduction of copy protection for many of the major software packages. WordStar 2000, Lotus 1-2-3, dBASE, and other packages incorporated elaborate schemes meant to halt, or at least slow, the piracy tide. For a brief period in the 1980s, almost a dozen software companies were pitching other software companies on the effectiveness of their respective protection systems.

I initially had a great deal of sympathy for the effort. As a field software engineer for MicroPro, I had become quite accustomed to walking into a customer’s location and seeing multiple copies of WordStar (which was not copy protected) installed on every computer in the place but being able to spot only one set of manuals available to the “user” base. Some simple math seemed to indicate a lot of bread was being snatched from my mouth, or at least from the mouth of the company paying my salary.

It was also annoying to find myself spending time providing technical support to people who were clearly flying the software Jolly Roger. One of my responsibilities was to take local technical support calls while in the office from people who were having difficulty with our word processor. A disturbingly high number of my calls went something like this:

Me: Hi! This is MicroPro technical support. How can I help you?

The “customer”: I need help installing my NEC 3550 printer.

Me: No problem! Please pull out your installation manual, and turn to page 256. (This was an age when users were a manly bunch, with thumbs thickly muscled from paging through software documentation similar in size and comprehensiveness to small encyclopedias. Not like the effete perusers of PDFs and HTML you find today.) I’ll be glad to walk you through the process.

The “customer”: Uh, I don’t have a manual in front of me.

Me: No problem. I’ll hold on the phone until you can get it.

The “customer”: Uh, I don’t have a manual.

Me: Can I ask what happened to it?

The “customer”: Uh, the dog ate it. (Other popular claims focused on thieving kids, roaring fires, and torrential flooding.)

The computing press (the members of which were used to obtaining all the free software they wanted) was, as you might imagine, generally unsympathetic to the plight of the software firms. Despite giving perfunctory lip service to the idea that software companies had a right to protect their property from theft, the companies were (and are) constantly being lectured on “not treating their customers like thieves,” despite the indisputable fact that large numbers of them were (and are). In 1984, MicroPro estimated that eight pirated copies of WordStar were in use for every one sold. In 2005, estimates put software piracy rates in China at more than 90 percent.

And yet, by the end of the 1980s, practically every software publisher that had implemented copy protection had dropped it. Several factors were driving this trend. One was that many companies resisted buying copy-protected software because it added complexity and instability to desktop computing systems and strained the resources of IT departments. Another was that copy protection added considerably to the software industry’s support burden because users called up to complain about systems that wouldn’t install because of hardware peculiarities, lost or damaged “key” disks, arguments about the number of “valid” installs, and so on. And, although our feelings undoubtedly weren’t the strongest factor driving corporate decisions, most software firms were hearing whines and groans from their field sales and support personnel about the difficulty of dealing with protected products. WordStar 2000, for example, at one time used a copy protection system that limited users to three installations of the software on different systems. This meant that whenever I or another person had to install WordStar 2000 on a demo system at a remote location, we had to go through a wearying install/deinstall routine while listening to outraged disk drives go AAAHHHHKKKK SKRRRIIIKKK WAAKA WAAKA WAAKA in order to keep our quiver full of demo installs for future use. (Field personnel weren’t initially given non-copy-protected products. When we were, the practical facts we created “on the ground” provided another reason to drop copy protection.)

And finally, despite the theoretical losses software companies were suffering from piracy, it was hard to see in reality how piracy was hurting the companies. As the decade progressed, many software companies did indeed stumble and fall, but in no case was it possible to pin the blame on piracy. Also, it started to become apparent to software firms that piracy had a definite upside, as Microsoft had discovered years earlier with the Altair. As the number of people using your software increased, so did your standing as the market leader. And pirated software functioned as a sort of marketing kudzu, tending to choke out the competition as use of your product spread throughout the computing populace. Once you had displaced the competition, it was possible to convert X percent of the pirates to paid users via various inducements and offers. Corporations, worried about legal liability, were also usually not reluctant to pay for previously purloined software if the price was right.

Becoming the market leader also opened up opportunities for bundling and original equipment manufacturing (OEM) deals. At MicroPro, WordStar’s early ubiquity made it the favored word processing product to include with such systems as the Osborne, Kaypro, and many others. While OEM products were sold at a considerable discount from the software’s retail price, in most cases all the software publisher had to do was provide licenses and serial numbers to its customers; the OEM customer usually was responsible for manufacturing and supporting the product. One MicroPro OEM salesman referred to the firm’s OEM business as a “money-printing operation.” This model worked in the case of such products as WordStar, dBASE, WordPerfect, and most notably, Microsoft Windows. Today, Microsoft’s Windows OEM business is the most profitable component in the company’s bottom line.

In the meantime, while the proprietary software companies were garnering all the attention (and making all the money) from the market, the kumbayah forces, led by an interesting fellow by the name of Richard M. Stallman, were keeping the dream of free software alive. Stallman had entered computing by way of MIT in 1971, where he worked as a systems programmer in the university’s AI lab, at that time a hotbed of innovation in such areas as LISP and related languages. Stallman developed a reputation as an ace programmer, and while at MIT developed the legendary program Emacs, a text editor backed up by a powerful and extensible macro system. Stallman was a militant believer in what was then called the “Hacker Ethic,” a belief system that preached that software and the information it represented should be open and available to all users to change and modify as they saw fit. Stallman was fervent in his belief about the evils of charging for software, at one time proclaiming that “the prospect of charging money for software was a crime against humanity.”[1]

Unfortunately for RMS, as his friends called him, by the 1980s the MIT lab was becoming corrupted by the sirens of commerce, who asked why geeks couldn’t also have fancy cars, big homes, and gorgeous girlfriends. Two AI companies (both ultimately unsuccessful) dedicated to building LISP interpreters and dedicated LISP machines spun out of the MIT lab, taking with them many of the lab’s best programmers and, in the opinion of RMS, all of the lab’s kumbayah mojo.

After a period of mourning, Stallman left the lab with a vision fixed firmly in his imagination. He would create a powerful, free, and open software environment that would allow programmers to create new and wondrous products. This environment would be based on the popular (but proprietary) UNIX operating system and, in a display of geek wit, would be called GNU (GNU’s Not UNIX; we’re sure you appreciate the recursion). And to ensure that what had happened at MIT could never happen again, he’d protect this environment with a new and innovative concept, a “copyleft” agreement that required programmers who used his software to build new software to make the original GNU software, and any changes or improvements they made to it, available for free to anyone who wanted it under the GNU General Public License (GPL). When the GPL was introduced, Stallman became software’s Dr. Open, the civilized, reasonable, humanitarian advocate of all that was good and pure in the world. (Bill Gates has traditionally played the role of Mr. Proprietary, but since he’s supposed to be leaving Microsoft to cure diseases worldwide, Steve Ballmer will be appearing in the part moving forward.)

This was a sharp and revolutionary contrast with the typical end-user license agreement (EULA) that accompanied most proprietary software. Most EULAs allowed “licensees” of software only the right to copy “their” software onto a limited number of computers. In fact, by 2006 the Microsoft retail EULA for Windows allowed you to copy your $100+ copy of Windows XP onto only one computer, regardless of how many computers you owned. And boy oh boy, you’d better make sure you never, ever put a four-core processor in your computer, because that seemed to violate the Microsoft EULA. And if you read the rest of the EULA, it warned of all kinds of other things you couldn’t do, and all the warnings were written in the Scary Lawyer dialect of the English language. In fact, most EULAs are full of scary language and all kinds of implied legal threats. Interestingly enough, despite the fact that software companies have been using EULAs for decades, it is unclear whether they have any legal validity.[2] Fortunately for the industry, no one actually ever reads a EULA; if they did, everyone would probably use only free software.

Given the current excitement over open source software and technology, it would be easy to think that Stallman’s GPL took the industry by storm, but this was not the case. The first GPL was released in 1989, and the second version, the one in current use in high technology, in 1991. At the time of their issuance, few people paid them the least bit of attention. One reason for this may be that while Stallman thought charging for software was wrong, almost no one else did, especially the many programmers who were making good money selling software and didn’t want to give up their new cars, houses, and girlfriends. Another was that Stallman’s rantings about the evils of for-sale software and his rationale for giving it away sounded a bit too close to Karl Marx’s formulation of “from each according to his abilities; to each according to his needs.” In an era when the Soviet dinosaur was noisily clanking and shaking its way to extinction, Stallman’s zeitgeist seemed off to many.

It’s Finally GNU for You

But perhaps the biggest obstacle to the widespread acceptance of Stallman’s credo was that although he was preaching about the glories of free software created with GNU, he hadn’t actually sat down and finished the project. Stallman had built a series of software utilities that could be used to create software (an activity beloved of many coders) but had neglected, years after the proclamation of GNU, to provide the system with its key component, an operating system. Instead, it was left to a 21-year-old Finnish student at the University of Helsinki by the name of Linus Torvalds to create a working implementation of Stallman’s dream. UNIX, Linux’s distinguished father, had slowly been withdrawn from the programming community and had become increasingly proprietary and fragmented. Dozens of companies took their version of UNIX and built custom extensions and walls around the software. This had the effect of raising UNIX prices (and allowing these companies to do a nice business selling their specialized UNIX versions). Dissatisfied with the UNIX clone he was currently using and unable to afford a proprietary version, Torvalds decided to take a stab at writing his own operating system using the GNU tools.

Linux 0.01 was released in September of 1991. Shortly after its introduction, Torvalds invited anyone interested in the OS to contribute to the development of the next release. Many people did, and the most significant open source project in the industry’s history was born.

Driven by the enthusiasm of what would become known as “the open source community,” Linux made great strides over the next few years, its progress assisted by Torvalds’s decision to release Linux under the GPL. Its growth driven by open source aficionados, by the late 1990s Linux began to do serious financial damage to companies such as SGI, Sun, SCO, and others, all of whom soon saw their business models being ravaged by the new upstart.

But while Linux was steadily eating away at the profits of the UNIX firms, the Windows world safely ignored Torvalds and his OS, for the most part. A few hobbyists played with the system,[3] and Microsoft’s behavior toward Netscape and the government’s antitrust case raised the blood pressure of free software advocates worldwide; however, that was about it. After all, Windows was very, very cheap. Most people received the product for “free” with their hardware and ignored the fact that their purchase price reflected the cost of Windows, something that was easy to do when computers cost $2,000 to $3,000. And even if you bought it, once you factored in the cost of inflation and the ability to install it on every machine you owned (and a few you didn’t), the cost per computer seemed very reasonable for an operating system that ran a huge amount of software and seemed to support just about every peripheral you owned.

Also, what many have called “the open source paradox” began to rear its ugly economic head (and still does). The paradox was that while GNU, Linux, and other open source software had been written ostensibly to liberate programmers from a world of evil capitalists, ultimately the evil capitalists seemed the most likely to benefit from the whole movement. After all, while it was nice that car companies, oil companies, lawyers, grocery stores, Burlington Coat Factory, and lots of businesses of all types were saving money on purchases of software, there was no proof that programmers were sharing in the bounty from all these expenditure reductions. And if you looked at some of the companies that espoused the use of Linux the loudest, such as IBM, you couldn’t help but wonder. After all, IBM had become America’s most prominent business colossus by building the most proprietary of proprietary software and hardware. IBM had been driven from its perch of preeminence by tiny start-up Microsoft, which had then gone on to enrich more geeks than any other company in history. Microsoft had created thousands of millionaire programmers; how many millionaire programmers had IBM ever created? For that matter, if Linux was so great, where were all the Linux millionaires?

Some Hot Tunes

In the meantime, while everyone was focusing on software, no one was paying any attention to the music business. There didn’t seem to be any reason to do so. After all, we all knew how the music business basically worked. Every few years the youth of the world generated yet another raft of disaffected grungesters, cute girls, cute boys, some performers of indeterminate sex, ghetto rappers, hip hop blasters, soul throbbers, chanteuses, lounge acts, and so on, and so on, all of whom were signed to contracts by large, institutionally corrupt music companies. These in turn distributed cash, girls (or boys), and cocaine (or the drug of your choice) to the band while paying off music stations to play the songs of the performers under contract to the company. When the current crop of crooners aged and lost their appeal or overdosed, they were promptly replaced by a new generation of cute girls, cute boys, and so on, and the cycle continued.

The distribution model was also well understood. Music was sold to the public via albums of records, cassette tapes, and later, almost exclusively, CDs. Most of the music on the album was filler, designed to surround the one or two good songs with enough extra musical noise to justify charging $20 per CD, a price that annoyed people who remembered that before the switch to the new technology in the early 1990s, a record had cost about eight bucks. The companies raised prices because they could but justified the new price tags to the public by talking about the expense of producing CDs (despite the fact that CDs cost less to mass-produce than vinyl records) and to industry insiders by noting that the price of drugs had skyrocketed over the years.[4]

The music industry had known for years that public dissatisfaction with the current state of affairs was high and that people were keenly interested in mixing and matching songs to create custom listening sets that matched their interests and moods (I cover this point in greater detail in Chapter 14), but no one in the business cared. The music companies had the entire distribution system, the artists, and the technology under control. In fact, in the early 1990s, the industry was able to strangle a potential threat to its domination, consumer digital audio tape players, by loading them with so many integrated copy restrictions that no one was interested in buying the units. Although some music executives were dimly aware of the problems software companies had with piracy, none thought they had any lessons to learn from high tech’s digital travails.

While the music industry was ignoring both the desires of its customers and the advance of technology, software geeks worldwide were busily working on making the life of the jingle moguls miserable. First came the development of MP3 compression, a technology that allowed software to take any music recording and compress it to about a 12th of its original size with very little loss in sound quality. Work on the MP3 format had begun in 1987, and final specifications for the technology were released to the public in 1994. Once a song had been “MP3’d,” it was small enough to be easily and quickly transmitted electronically. The next step was taken with the spread of cheap read/write optical disk systems in the mid-1990s. This in turn drove the development of software that could “rip” (copy) music from CDs to the new MP3 format. The fourth and final piece of the puzzle dropped into place with the adoption of the Internet by the public. A complete solution to bypassing the music industry’s lock on the distribution system had come into existence.
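
Incidentally, the “12th of its original size” figure is easy to check with back-of-the-envelope arithmetic. A quick sketch in Python, assuming CD-quality audio (44,100 16-bit samples per second, in stereo) and the 128 kbps bit rate typical of early MP3s:

    # Uncompressed CD audio vs. a 128 kbps MP3, for a four-minute song.
    cd_bytes_per_sec = 44_100 * 2 * 2        # 44.1 kHz, 2 bytes/sample, stereo
    mp3_bytes_per_sec = 128_000 / 8          # 128 kilobits per second

    song_seconds = 4 * 60
    cd_size = cd_bytes_per_sec * song_seconds      # about 42 MB
    mp3_size = mp3_bytes_per_sec * song_seconds    # about 3.8 MB

    print(cd_size / mp3_size)                # roughly 11x, "about a 12th"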

The first major company to explore the possibilities the Internet opened up for music distribution was MP3.com. The service was founded in 1998 and offered downloadable music for free (the artists were compensated via a system that gave them a small royalty payment based on the number of times their songs were downloaded). MP3.com was not a music piracy site; a trained staff winnowed through the uploads and stripped out copyrighted material. Everyone thought the site was wonderful, it grew rapidly, and in 1999 MP3.com launched an IPO that netted the company $370 million.

The good times ceased to roll at MP3.com when in January 2000 it launched the My.MP3.com service. This enabled customers to securely register their personal CDs (you had to actually stick the CD in your PC so that MP3.com could scan it) and then stream digital copies of their music from an online “locker” hosted by the My.MP3.com service. At this point, the intelligent thing for the music industry to have done was to have studied MP3.com, partnered with it, and “trained” the public to interact with the site and ones similar to it for the benefit of all concerned. Instead, the music moguls, in an act of classic and far-reaching stupidity worthy of such famous moments in rock star history as Alice Cooper tossing a hapless chicken to its death to a crowd in Toronto or Ozzy Osbourne masticating an innocent bat,[5] sued poor MP3.com for copyright infringement and found a judge dim-witted enough to agree with them. Rather than appeal the case, MP3.com handed over the bulk of its IPO money to the recording industry. Fatally weakened, the service gave up the ghost during the dot-com meltdown, to the music industry’s immense satisfaction.

The smirking and high-fiving came to an abrupt end with the appearance of a new service, Napster. Based on a peer-to-peer network system that allowed computers to directly transfer MP3 files across the Internet, Napster made little effort to prevent software piracy, and the site soon became one of the most popular on the planet. The music industry, having learned absolutely nothing from the MP3.com incident, sued Napster as well and eventually was able to shut it down. As already noted in Chapter 11, Napster’s great vulnerability lay in its use of centralized servers to store the names of the files being offered to other Napster users. With Napster out of business, smart programmers quickly developed new software that didn’t require the use of centralized servers but instead relied on individual computer systems located worldwide to manage the task of file coordination. The recording industry’s intelligent response to this development was to sue 19,000 parents, children, dead Vietnam vets,[6] and others for copyright infringement, an act that had absolutely no impact on the widespread practice of downloading free MP3-compressed music. The industry also began suing the individual peer-to-peer networks such as LimeWire and Kazaa, but as soon as one network disappeared, another one promptly appeared. The music industry now existed in a Greek hell of its own creation, doomed, like Sisyphus, to push the rock of copyright litigation up an endless hill of peer-to-peer networks.
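
The architectural difference that made Napster easy to kill and its successors nearly impossible to kill is simple enough to sketch. In the hypothetical Python below (the classes and method names are mine, not any real protocol), a Napster-style network keeps one central dictionary a court can order unplugged, while a post-Napster network answers searches by having every peer relay the query to its neighbors:

    # Napster-style: one central index. Unplug the server, kill the network.
    class CentralIndex:
        def __init__(self):
            self.index = {}                  # song title -> peers offering it

        def register(self, peer, titles):
            for title in titles:
                self.index.setdefault(title, []).append(peer)

        def search(self, title):
            return self.index.get(title, [])

    # Gnutella-style: no server. Each peer forwards the query to its
    # neighbors, so there is no single machine to shut down. (Real networks
    # also remember query IDs so that loops don't echo forever.)
    class Peer:
        def __init__(self, titles):
            self.titles = set(titles)
            self.neighbors = []

        def search(self, title, ttl=3):
            hits = [self] if title in self.titles else []
            if ttl > 0:
                for neighbor in self.neighbors:
                    hits += neighbor.search(title, ttl - 1)
            return hits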

Getting to the Root of the Problem

The industry’s stupidity reached a dizzying crescendo with Sony BMG Music Entertainment’s 2005 release to its customers of something that proved to be far more exciting than any music video ever produced: a “rootkit.” A rootkit is perhaps the most dangerous of all malware, a vicious piece of Borgware that absorbs your computer’s operating system into a vast, evil collective over which you have no control. Rootkits integrate themselves so deeply into a computer’s innards that even high-quality antivirus and antispyware products often cannot detect them. The Sony rootkit, targeted primarily at Windows (though it also infected Macs, to a lesser extent), was loaded onto 52 of the company’s music CDs, and when someone put a rootkit-infected CD into their computer, Sony’s malware was surreptitiously installed onto the system. Once there, if it was detected, any attempt to remove the rootkit resulted in severe damage to Windows and a nonworking computer. Once hidden on your PC, the rootkit prevented you from copying songs from the CD to another CD or to the MP3 format (though this protection was almost instantly circumvented).
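
Russinovich’s write-ups later showed that the hiding trick itself is conceptually simple: the rootkit intercepted the operating system’s file, process, and registry listings and silently dropped any entry whose name began with a magic prefix (reportedly “$sys$”). Here is a deliberately harmless caricature of the idea in Python; a real rootkit patches kernel call tables, not a scripting runtime:

    import os

    MAGIC_PREFIX = "$sys$"        # the prefix the Sony rootkit reportedly hid

    _real_listdir = os.listdir

    def hooked_listdir(path="."):
        # Filter every directory listing, so any tool that asks the (hooked)
        # system "what files are here?" simply never sees the malware.
        return [name for name in _real_listdir(path)
                if not name.startswith(MAGIC_PREFIX)]

    os.listdir = hooked_listdir   # the "hook": replace the real call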

The Sony rootkit spread to more than half a million machines and networks, including those in the Department of Defense and other government agencies, before writer and Windows expert Mark Russinovich discovered its existence in October of 2005. He posted his discovery online, and news of the rootkit spread worldwide in a matter of hours. (Companies such as Symantec and McAfee were heavily criticized for failing to develop software that detected Sony’s malware until Russinovich’s disclosure of its existence.)

Sony’s handling of this self-inflicted PR nightmare showed that the company’s collective intelligence was on a par with that of the wretched bat publicly decapitated by Ozzy Osbourne. As outrage about the rootkit grew, Sony embarked on a damage control effort that included the following:

    *    Claiming the rootkit didn’t surreptitiously “phone home,” that is, use your Internet connection to contact Sony, when it did just that every time you played a song.

    *    Not realizing that the installation of the rootkit left every computer on which it had been installed with a giant security hole that any hacker with knowledge of the rootkit’s behavior could exploit.

    *    Releasing an update that supposedly fixed the security hole created by the rootkit but that required you to provide your name, e-mail address, and other personal information to Sony. After installation, it continued to send information about your choice of music to Sony, but now the company had a name to match up with your playlist.

    *    Allowing Sony’s president of global digital business, Thomas Hesse, to go on National Public Radio and conduct an interview in which he told the listening audience that “Most people don’t even know what a rootkit is, so why should they care about it?” The hapless Hesse was apparently too stupid to realize that Sony was in the process of educating most of humanity on the dangers of rootkits.

    *    Not knowing that the company supplying its rootkits, software firm First4Internet, was using an open source encoder in the rootkit.[8]

Class action lawsuits against Sony were launched in California, New York, Texas, Italy, and lots of other places. Twelve days after the discovery of the rootkit, Sony announced it would no longer sell its self-infected CDs. Then it announced it was recalling all of the infected CDs and replacing them with non-copy-protected disks. Estimates of the eventual financial damage to Sony ran from $50 million to $500 million. (One of the reasons for the uncertainty was that thousands of Sony-infected PCs remained in use and vulnerable; as late as June of 2006, three virus creators were arrested for exploiting the security vulnerability created by the rootkit.[9])

More to the point, the entire fiasco helped convince millions of potential buyers of online music that the easiest, cheapest, and safest thing you could do was log onto one of those nice peer-to-peer networks where the music selection was wide, the price was zero, and the number of rootkits you could expect to encounter was low.

Back to the Future with WGA

The year 2000, a date that saw most of the world looking forward, saw Microsoft looking back to the 1980s and copy protection. That year Microsoft announced its new “product activation” program. The new copy protection system worked, in theory, by tethering your copy of Microsoft Office 2000 to the Internet via a key found on Microsoft servers. You first installed Office and then allowed the product activator to snoop through your computer, send a profile of your hardware to the Microsoft server, and receive a downloaded product key from Microsoft that would allow you to actually use the software you had bought. After initial trials, the scheme was extended to Windows XP when it was released in 2001. Soon, the entire copy protection system became known as Windows Product Activation (WPA).
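
Stripped of the hold music, the activation handshake just described reduces to a short exchange. The Python sketch below is illustrative only; the real WPA algorithm hashed roughly ten hardware components with its own weighting scheme, and every name, value, and threshold here is my assumption:

    import hashlib

    def hardware_profile_hash(components):
        # Boil the machine's hardware list down to an opaque fingerprint.
        blob = "|".join(sorted(components)).encode()
        return hashlib.sha1(blob).hexdigest()[:16]

    # Client side: send the CD key plus the hardware fingerprint to the server.
    request = {
        "cd_key": "XXXXX-XXXXX-XXXXX-XXXXX-XXXXX",  # the key in the retail box
        "hw_hash": hardware_profile_hash(
            ["motherboard:abc", "cpu:p3-800", "gpu:tnt2", "nic:00:11:22"]),
    }

    # Server side (hypothetical): if this key hasn't shown up on too many
    # different machines, mint an activation code tied to the fingerprint.
    ACTIVATIONS = {}                          # cd_key -> set of hw hashes seen

    def activate(req):
        machines = ACTIVATIONS.setdefault(req["cd_key"], set())
        machines.add(req["hw_hash"])
        if len(machines) > 3:                 # swap enough parts, WPA wakes up
            return None                       # ...time to call the 800 number
        return hashlib.sha1(
            (req["cd_key"] + req["hw_hash"]).encode()).hexdigest()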

There were, as you can imagine, some delightful aspects to WPA. If, for instance, you decided to change the motherboard, processor, graphics card, or similar hardware on your system, you ran the risk of waking up WPA and having it nag you to reinstall Windows and your other WPA-protected programs, despite the fact that the copy you were using was perfectly legal. Reinstalling Windows sometimes meant calling up a special 800 number and sitting through a long and wearying session that required you to speak every last number of the CD key that came with your copy of Windows in the hope that the phone god with whom you were communing would deign to give you a new key. If that didn’t work, you could look forward to spending some time with someone named “Ramesh” or “Gupta” who was normally sitting in a call center in India or a similar exotic location and explaining why you needed a new key that allowed you to actually use the software you’d bought…errr…“licensed.”

Freedom from Choice Is What You Want

Most people looked at WPA with the same affection shown a turd dropped in a punch bowl at a wedding, but in the main, Microsoft was able to finesse its introduction. There were several reasons for this. One was that many people received Windows bundled in with their computer and, as already noted, didn’t really think about what they had paid for the product. Another was that, as had happened before, the WPA copy scheme was quickly cracked, and many people simply bypassed WPA. A third was that Microsoft had given “universal keys” to many of its corporate customers; these allowed them to do mass installs of Windows at their business locations without having to waste time going through hundreds or thousands of activations. These keys had quickly leaked to the general public and were employed by many people to use Windows in pretty much the same way they had for more than a decade. All in all, it turned out that most people could ignore WPA most of the time.

This seemed, to most people, fair. Microsoft now had legally sanctioned monopolies in desktop operating systems and office suites (but no mauling of the competition allowed)! The company seemed on its way to establishing a similar monopoly in network operating systems, had a strong position in the enterprise database market with its SQL Server product, was selling a great deal of Microsoft Exchange, had a nice business in mice, and by 2002 enjoyed the luxury of having approximately $49 billion in cash sitting in the company’s piggy bank. Why would any company in its right mind disturb such a wonderful status quo?

Of course, the open source and free software folks took a great deal of enjoyment in pointing out that Linux, which had steadily increased in functionality and ease of use, was free and never required you to talk to Ramesh when changing a motherboard. And in the meantime, an interesting product called first StarOffice, then OpenOffice, had appeared on the scene. StarOffice began its life as an OS/2 office suite developed by a German company in the early 1990s. After the collapse of OS/2, the software morphed into a Windows product that was bought by Sun, ostensibly because it was cheaper for the company to buy its own office software than to license Microsoft’s. The real reason was the desire of Sun CEO Scott McNealy to give Bill Gates and his company a case of heartburn, which he attempted to do by open sourcing most of StarOffice’s code; the code was then transformed into OpenOffice by a series of programmers dedicated to open source ideals (they didn’t become millionaires, though). Sun still sells a version of StarOffice, though there’s little compelling reason to buy it considering that OpenOffice is free.

On the other hand, although Linux was free, installing it was a royal pain that the vast majority of people had no desire to experience. The price of freedom included the privilege of choosing which Linux you would pick from dozens of different packages, called “distros,” and then attempting to install your choice on your hardware. This was made more interesting by the fact that although the core Linux operating system was usually (though not always) the same from distro to distro, the various Linux bundles often used different install procedures, had different user interfaces, looked for key files in different places, included different utilities, and so on, and so on. And, although it was nice that OpenOffice was free and that StarOffice was cheap, once one had copied Microsoft Office to all the computers it needed to be on, the price wasn’t really that bad after all.

All this changed in 2004 when Microsoft introduced, with an Orwellian fanfare of misleading language, its new Windows Genuine Advantage (WGA) program. Windows users were prompted (under threat of losing access to updates other than ones deemed critical to security) to download a program that checked their product key for authenticity. If Microsoft determined you were indeed “Genuine,” you could continue to receive all Windows XP updates. If you weren’t, well, no updates for you, at least until WGA was cracked by hackers (it took about a week). Everything seemed to continue on much as it had before, though the I-told-you-so cackling from the free software crowd grew louder, and people started becoming a little annoyed with Microsoft. It bordered on terminal chutzpah to threaten people with losing access, via Microsoft’s update system, to such things as the latest version of Internet Explorer, a product that had been allowed to rot for five years after Microsoft dispatched Netscape. It was nice that Internet Explorer 7 would have tabbed browsing and all, but Firefox and Opera had been offering those features for years.

The rootkit hit the fan in July 2006 when Microsoft unleashed part deux of WGA, called “WGA notifications.” WGA notifications was a nifty bit of code that reminded everyone very much of a recent music company’s malware. Making utterly sure that WGA notifications would be instantly loathed by humanity, Microsoft misled the world by tucking the program onto its servers and transmitting it across the wires in the company of security patches with the appellation of a “critical update.” (WGA had nothing to do with security.) Once installed, the WGA program revealed the following charming characteristics:

    *    It phoned Microsoft every time you logged into Windows to tattle on you if it thought your install of Windows wasn’t valid (proving that Microsoft had learned absolutely, positively nothing from the Sony rootkit disaster of 2005). A toy sketch of this phone-home pattern appears after this list.

    *    WGA now forced Windows to display an unending series of nagware messages urging you to get “Genuine,” that is, fork over more money into Microsoft’s giant cash hoard.

    *    The EULA that came with WGA notifications was misleading and didn’t properly request the user’s consent to install the software.

    *    If you wanted to “Get Genuine,” WGA didn’t make it easy for you to see options other than giving $149 to Microsoft. And there were other options. For example, if a repair shop had loaded an invalid copy of Windows onto your system during an overhaul but you had bought a legal copy that was sitting on your bookshelf somewhere, you could restore your legitimate key to your system in a process that appeased WGA. But it was a genuine pain to find information about this process through all the “Genuine” nag screens.

    *    WGA was misidentifying hundreds of thousands, maybe millions, of legitimate installs as “nongenuine.” Exactly how many was somewhat mysterious, since Microsoft was not very forthcoming on the issue. The company did say that of the 60 million checks it had run, 80 percent of the machines tattled on by WGA were using invalid keys. That left about 12 million “others.” High levels of complaints were coming from a wide spectrum of users, particularly people who’d had Windows preinstalled on their laptops. As one blogger asked, “Is Dell a pirate?”

    *    If you read the EULA that came with WGA notifications, you realized you were being asked to download a beta product that had the potential to cripple your copy of Windows.

    *    WGA provided no advantages at all to the user (but plenty to Microsoft). The program was simply a copy protection/antipiracy scheme, and people weren’t stupid.
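Stripped of the editorializing, what this list describes is a plain phone-home check: at every login the client hashes its product key, asks a server whether that key is acceptable, and nags accordingly. Below is a minimal, purely hypothetical sketch of the pattern in Python; the endpoint, key format, and server responses are all invented for illustration and bear no relation to Microsoft’s actual WGA protocol.

import hashlib
import urllib.request

VALIDATION_URL = "https://example.com/validate"  # hypothetical endpoint

def key_fingerprint(product_key):
    # Hash the key so the raw key itself never travels over the wire.
    return hashlib.sha256(product_key.encode("utf-8")).hexdigest()

def show_nag(message):
    # The real notifications were considerably harder to dismiss.
    print(message)

def check_on_login(product_key):
    # Called at every login: ask the server whether this key is "genuine."
    url = VALIDATION_URL + "?fp=" + key_fingerprint(product_key)
    try:
        with urllib.request.urlopen(url, timeout=5) as response:
            genuine = response.read().strip() == b"genuine"
    except OSError:
        genuine = True  # fail open: no network, no tattling, no nagging
    if not genuine:
        show_nag("This copy of Windows may not be genuine. Get Genuine today!")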

Reaction to the whole WGA mess was exactly what you would expect. Several class action lawsuits were launched against Microsoft claiming the company had violated laws against spyware in several states. Microsoft promptly replaced the big tattler in WGA with a littler tattler, one that would only “periodically” call home to tell on you. Microsoft also changed the EULA to inform you more clearly about its informant. A French company quickly released a program called RemoveWGA that kicked the Jewish mother (WGA notifications) out of your computer, though the basic WGA system remained intact. Several Windows pundits, such as Brian Livingston, began to recommend that people not use Windows Update but instead rely on third-party services.[10]

Fresh from its initial success, Microsoft announced that the joys of WGA would soon be extended to all the products in its line. And to ensure that there were no embarrassing ambiguities in the future, WGA in all its glory would be directly integrated into Vista, the designated heir to XP whose father may have been Bill Gates but whose mother was clearly Steve Jobs. In the meantime, the chortles and snickers from the open sourcers turned to guffaws and screams of laughter as they fell to the floor holding their ribs from an excess of merriment.

Rumors then began to quickly spread that part three of Microsoft’s spyware system would introduce a new friend to WGA’s tattler and Jewish mother: an executioner. This would come in the form of a “kill switch” that would allow Microsoft to remotely disable your nongenuine Windows at the behest and whim of Redmond. (Industry wits noted that given the number of security attacks and virus infections afflicting Windows, most people might not notice any difference in operations.) In response to a query from Ziff-Davis columnist Ed Bott, a Microsoft PR representative, speaking in Modern Flack, provided the following chunk of verbiage:

No, Microsoft anti-piracy technologies cannot and will not turn off your computer. In our ongoing fight against piracy, we are constantly finding and closing loopholes pirates use to circumvent established policies. The game is changing for counterfeiters. In Windows Vista we are making it notably harder and less appealing to use counterfeit software, and we will work to make that a consistent experience with older versions of Windows as well. In alignment with our anti-piracy policies we have been continually improving the experience for our genuine customers, while restricting more and more access to ongoing Windows capabilities for those who choose not to pay for their software. Our genuine customers deserve the best experience, and so over time we have made the following services and benefits available only to them: Windows Update service, Download Center, Internet Explorer 7, Windows Defender, and Windows Media Player 11, as well as access to a full range of updates including non-security related benefits. We expect this list to expand considerably as we continue to add value for our genuine customers and deny value to pirates. Microsoft is fully committed to helping any genuine customers who have been victims of counterfeit software, and offer free replacement copies of Windows to those who’ve been duped by high quality counterfeiters. There is more information at our website http://www.microsoft.com/resources/howtotell.

A careful reading of this statement revealed plenty of ambiguities (we didn’t ask whether WGA was going to shut down the computer, but Windows), but Microsoft’s PR people clammed up and refused to talk further. Not making people feel any better was an online article by respected security analyst Bruce Schneier in which he reported that a Microsoft representative had told him that:

In the fall, having the latest WGA will become mandatory and if it’s not installed, Windows will give a 30 day warning and when the 30 days is up and WGA isn’t installed, Windows will stop working, so you might as well install WGA now.[11]

At this point, the open source people were snorting liquids through their noses as they rolled around the floor laughing hysterically, but Windows people were depressed. Forums and blogs exploded with comments from users that now was the time to finally take a look at Linux, OpenOffice, and other open source alternatives to Windows.[12] It made sense. While Microsoft was spending time and energy figuring out ways to torture many of its customers, new versions of Linux had just about caught up to Windows in terms of ease of install, functionality, and peripheral support. There were still problems, but at least you could be sure that if anyone in the open source community attempted to put something like WGA into Linux, Richard Stallman would personally throttle them. No one was enthusiastic about the prospect of allowing Bill Gates and Steve Ballmer to hold a loaded pistol at their PCs on a 24/7 basis. Given past experiences with WGA, just how could you be sure that some idiot at Microsoft wouldn’t inadvertently do something that crippled your system at just the wrong time? Certainly some people thought the possibility existed. Before finishing this book, I spoke to an acquaintance at Microsoft who told me this:

I recommend to my friends that they always keep a copy of OpenOffice on their systems in the event that MS Office’s activation system locks up the software when they’re not expecting it and they can’t reach a phone or the Internet to reactivate it. Interoperability is excellent and you can usually get something done. It’s good protection against our copy protection.

It appeared that open source had a friend in Redmond, after all!

[1] Free as in Freedom: Richard Stallman’s Crusade for Free Software by Sam Williams (O’Reilly Media, 2002)

[2] http://en.wikipedia.org/wiki/EULA

[3] I purchased a retail copy of Red Hat Linux in the 1990s and attempted to install it on my PC. The install promptly failed when Linux didn’t know what to do with my then state-of-the-art Adaptec SCSI interface card. A plaintive inquiry sent to the famed Linux community was answered by a condescending message that since Adaptec wasn’t releasing its drivers under the GPL, I shouldn’t expect Linux to work. I promptly gave up on Red Hat and Linux and continued using and buying Windows.

[4] This sounds like a facetious statement. It’s not. The field sales office I worked in was located in Secaucus, New Jersey. The MicroPro offices were down the hall from the studios of one of the region’s most popular Top 40 radio stations at the time, Z-100, and I became used to seeing a limo periodically drive up to our forsaken location and drop off such music stars as Cyndi Lauper, Bob Geldof, Madonna, and so on, for on-the-air PR appearances. I struck up an acquaintance with one of the DJs who worked there, and he explained in loving detail how the industry worked.

[5] Rock Stars Do the Dumbest Things by Margaret Moser (Renaissance Press, 1998). A long-buried classic worth your time!

[6] “The Shameful Destination of your Music Purchase Dollars” by David Berlind (http://blogs.zdnet.com/BTL/?p=3486), August 14, 2006

[7] The Borg are Star Trek’s baddest bad guys, a race of cyborgs ruled by queens who run around the galaxy in large cube-style ships assimilating other races while announcing “resistance is futile.” In high-tech, Bill Gates is usually assumed to be the chief Borg queen. However, given Steve Jobs’s recent penchant for suing everyone, Apple’s increasing monopoly in the music world, and the suspicious design of the Apple Cube and the NeXT computer, many people think Apple’s CEO may be auditioning for the role.

[8] LAME, licensed under the Lesser GPL

[9] “Virus Suspects Arrested in UK and Finland” by Quentin Reade (Webuser, http://www.webuser.co.uk/news/87558.html?aff=rss), June 27, 2006

[10] Windows Secret Newsletter, issue 78 (http://windowssecrets.com/comp/060629/)

[11] http://www.schneier.com/blog/archives/2006/06/microsoft_windo_1.html

[12] I have. I’m tired of talking to Ramesh every time I swap a motherboard, something I do fairly frequently.

It’s a hard fact of life for the hardware guys and gals of high tech that it’s usually the software geeks who get most of the glory. When software people produce a failure, they usually look like their reach exceeded their grasp; when hardware types build a flop, they look like dorks. With software, a timely patch can often erase the ugliest blemish; with hardware, mistakes are set in silicon, so to speak.

The Loneliness of Being Hardware

A fairly recent example of this principle in action occurred with the release of Palm Inc.’s m130 handheld computer. Before it released its latest personal digital assistant (PDA) in March 2002, Palm bragged that the device’s 16-bit screen could display more than 64,000 different colors, but it turned out the m130 could actually show far fewer. Exactly how many fewer was a matter of some dispute. A spokesperson for the company was quoted as saying that by “blending techniques,” such as combining nearby pixels, the m130 could display 58,000 “color combinations,” which isn’t quite the same thing as 64,000 colors. Palm profusely apologized for its mistake but made no offer to take its drabber-than-expected PDAs back despite the screams of some annoyed buyers. It did tell everyone it was busy thinking about some way to make it up to its disappointed customers. Industry wits immediately suggested that every m130 be shipped with a big box of Crayola crayons.

No, it’s not fair, but that’s the way it is.

Oh, there are a couple of exceptions. A few people know who Michael Dell is, though most people think he’s that young guy who says “Dude!” in all those TV commercials. But Dell is really a boring company once you get to know it. Its main business is selling large numbers of square beige computers shipped in square white boxes. It’s a great business, and Dell is a very, very successful company, but there’s not much glamour there. Dell isn’t cool, and it isn’t glorious.

Then there’s the guy (Ted Waitt of Gateway) who talks to the cow, but cows aren’t very cool (though the cow is kind of funny). And his company is losing a ton of money. That’s not very glorious.

And maybe Scott McNealy of Sun Microsystems? Well, that’s a tough one. He spends most of his time talking about Java and the Internet, though the company actually makes its money selling expensive computers running some incomprehensible OS called UNIX. Isn’t Java software?

There is Steve Jobs of Apple. Jobs has a genius for hiring people who can design wonderfully colored and shaped computers that about 4 percent of the market wants to buy. He’s the guy who brought us the movie Finding Nemo and Buzz Lightyear, and he also looks pretty sharp in Nehru shirts. Some guy from the television show ER even played him in that interesting but completely inaccurate movie, Pirates of Silicon Valley. Yeah, Steve Jobs is pretty cool. Too bad more people don’t use his computers.

But after that it all becomes kind of fuzzy. Who’s the father (or mother) of the PalmPilot? Who’s the Disk Drive King? The God of Monitors? The Queen of Keyboards? The Prince of Uninterruptible Power Supplies? The Master of Removable Media?

No one knows. No one cares. It’s tough to be in hardware.

On the software side, however, superstars abound. There’s Bill Gates. Paul Allen. Steve Ballmer. Larry Ellison. Marc Andreessen. Steve Case. Peter Norton. Dan Bricklin. Ray Noorda. That Linux guy from Sweden—or is it Finland?—Linus Torvalds. Some incomprehensible Englishman named Tim Berners-Lee whom everyone calls “the Father of the Web.” Heck, even Gary Kildall is famous just for failing big time. Michael Cowpland of Corel used to be pretty well known too (though most people remember him for that wedding-day picture of his trophy wife draped across a Lamborghini[1]).

There is, however, one hardware company that has some major media mojo attached to it. After years of dancing Bunny People, the Blue Man Group, and hyperactive space aliens, that company is Intel. Microprocessors are the hardware heart of the technology revolution, and Intel makes them. Most people aren’t exactly sure how a microprocessor works, but they do know Intel produces a lot of them and many know they have an Intel in their computer. Intel is the semiconductor industry’s ultimate glamour boy, hardware’s Ken doll.

But as we all know, envy exists in this world. Our Ken has a jealous rival, someone who looks at our clean-cut builder of CPUs from the periphery of the admiring throng and grinds his teeth in frustration. “Why is everyone so crazy about him?” our hardware Iago wonders. “I make CPUs too. I’m a multibillion-dollar company. My technology helps drive commerce and industry worldwide. Why doesn’t anyone care about me?”

Too frustrated to watch anymore, the observer turns away and strides by us. A quick glance at his countenance confirms his identity. Who else possesses that peculiar combination of dull stare, pockmarked skin, sandy hair, prominent dental gap, and eternally vacant expression?

Yes, that’s him all right. Alfred E. Motorola.

Memories of a Crushing Blow

Motorola has envied Intel’s marketing prowess since the companies first clashed in the early 1980s during the rollout of their respective 16-bit microprocessors. Motorola had the better chip, but Intel had “Crush,” a prototypical kill-the-competition campaign put together by William H. Davidow. Described in Davidow’s book, Marketing High Technology (Free Press, 1986), Crush integrated PR, marketing communications, and advertising in a comprehensive effort to convince customers that Intel’s ability to outdevelop, outsupport, and outsell the competition made an investment in Motorola’s technology a bad bet regardless of technical merit. Motorola was caught flat-footed by Crush and could never develop a credible response. The company ended up ceding the bulk of the glamorous and profitable market for general-purpose microprocessors to Intel.

Motorola has never forgotten Crush, and the success of Intel Inside only rubbed salt in the wound over the years. In 1999, the company decided it couldn’t stand it anymore and that it too needed to have a big corporate branding program. Thus was born Motorola’s “Digital DNA” program, a waste of $65 million that demonstrated the company had learned little from the body slam Intel dealt it years before.

Bad, Bad Genes

The first problem with Digital DNA was that Motorola never deigned to pay anyone to stick the Digital DNA logo, a sticker that read “Digital DNA from Motorola,” on their hardware. This alone was enough to doom the program. Motorola didn’t want to pay out MDF (market development funds) because of the expense but was missing the point. Intel’s MDF campaign allowed it to sell more chips, and charge more for them, than rivals such as Motorola and AMD. The calculation was simple: For every dollar spent on MDF, Intel saw two dollars back via chip sales and profitability. The lack of an MDF component also robbed Motorola of the ability to direct the marketing and advertising efforts of Digital DNA participants à la Intel Inside.

The second problem was the program’s target audience—Motorola’s customers, not the customers of their customers. Motorola’s advertising for the program was thus aimed at phone makers, car manufacturers (big buyers of embedded computer systems), and electronics makers, not the buyers of phones, cars, and electronics. This strategy ensured that no consumer demand would be generated for products with Digital DNA inside them. It also put Motorola in direct competition with the companies to which it supplied chips, such as cellular phone manufacturers. Companies such as Nokia and QUALCOMM regarded the prospect of putting a Motorola logo on their phones with little enthusiasm. Again, Motorola had completely missed the point of the Intel approach, which was to make consumers demand computers with Intel inside, thus pressuring manufacturers to buy more Intel chips.

The third problem was schizoid execution. Having made the decision to target its customers, the company nonetheless diverted precious advertising budget dollars to running print-based consumer advertising. This wasn’t money intelligently spent; an effective corporate branding effort requires a massive and extensive media blitz carried out over an extended period of time. A few million dollars spent on newspaper and magazine ads wasn’t going to create any significant consumer interest in Digital DNA.

After a couple of years of wasted time and money, it became clear that Digital DNA was genetically defective. The program generated no end-user demand for Motorola products, no increased awareness of Motorola, and no increase in demand for Motorola products among its customers beyond that dictated by normal business necessity. Digital DNA was allowed to quietly wither away into obscurity.

The last time Ken passed Alfred on the beach, he kicked sand in his face.

[1] She looked marvelous.

Appendix A: Stupid Development Tricks

The complete title of In Search of Stupidity includes the phrase “high-tech marketing disasters,” and from these words one might conclude that it is a firm’s marketers who usually bear the chief responsibility for major corporate catastrophes. This is not true. To be worthy of mention in this book, it took the combined efforts of personnel in upper management, development, sales, and marketing, all fiercely dedicated to ignoring common sense, the blatantly obvious, and the lessons of the past. Major failure doesn’t just happen. To achieve it, everyone must pull together as a team.

Chapter three of Stupidity helps drive this point home. For MicroPro to plummet from the software industry’s pinnacle to permanent oblivion took A) upper management’s mishandling of development and market timing, B) marketing’s idiotic decision to create a fatal product positioning conflict, and C) development’s dim-witted decision to rewrite perfectly good code at a critical time because they wanted to write even better code that no one really needed. A magnificent example of different groups within a company all cooperating to ensure disaster.

In this spirit, we have decided to include selected portions of an interview with Joel Spolsky that ran on www.softwaremarketsolution.com, a web site sponsored by the author of this book that provides resources and information on products and services of interest to high tech marketers. (By the way, this interview was “picked up” by www.Slashdot.org, a web site dedicated to all things Open Source. It generated a considerable amount of comment and controversy. You can search their archives to read what other people thought and gain further insight into Joel’s opinions.)

We regard Joel, president and one of the founders of Fog Creek Software (www.fogcreek.com), as one of the industry’s most fascinating personalities. He worked at Microsoft from 1991 to 1994 and has over ten years of experience managing the software development process. As a program manager on the Microsoft Excel team, Joel designed Excel Basic and drove Microsoft’s Visual Basic for Applications strategy. His web site, Joel on Software (www.JoelonSoftware.com), is visited by thousands of developers worldwide every day. His first book, User Interface Design for Programmers, was reviewed on www.softwaremarketsolution.com, and we regard it as a must-read for anyone involved in developing and marketing software.

Why this interview? If you’ve ever worked in the software side of high technology you’ve probably experienced the following. After a careful analysis of your product’s capabilities, the competition and the current state of the market, a development and marketing plan is created. Release timeframes are discussed and agreed to. Elaborate project management templates are built and milestones set. You post the ship date up on a wall where everyone in your group can see it and begin to work like crazed beavers to meet your target.

Then, as the magic day looms nearer, ominous sounds are heard from development. Whispers of “crufty code” and “bad architecture” are overheard. Talk of “hard decisions” that “need to be made” starts to wend its way through the company grapevine. People, especially the programmers, walk by the wall on which you’ve mounted the ship date, pause, shake their heads, and keep walking.

Finally, the grim truth is disgorged. At a solemn meeting, development tells everyone the bad news. The code base of the current product is a mess. Despite the best and heroic efforts of the programmers, they have been unable to fix the ancient, bug-ridden, fly-bespeckled piece of trash foisted on them by an unfeeling management. No other option remains. The bullet must be bitten. The gut must be sucked up. The Rubicon must be crossed. And as that sinking feeling gathers in your stomach and gains momentum as it plunges towards your bowels, you realize that you already know what you’re about to hear. And you already know that, after hearing it, you’ll be groping blindly back to your cubicle, your vision impeded by the flow of tears coursing down your face, your eyes reddened by the sharp sting of saline. And you’ve already accepted that it’s time to get your resume out and polished, because the next few financial quarters are going to be very, very ugly.

And then they say it. The product requires a ground-up rewrite. No other option exists.

Oh, you haven’t been through this yet? Well, just wait. You will. However, as you’ll learn, what you’re going to be told may very well not be true. After reading this interview, you’ll be in a better position to protect your vision and your career in the wonderful world of high tech.

And now:

An Interview with Joel Spolsky

SMS: Joel, what, in your opinion, is the single greatest development sin a software company can commit?

Joel: Deciding to completely rewrite your product from scratch, on the theory that all your code is messy and bug prone and is bloated and needs to be completely rethought and rebuilt from ground zero.

SMS: Uh, what’s wrong with that?

Joel: Because it’s almost never true. It’s not like code rusts if it’s not used. The idea that new code is better than old is patently absurd. Old code has been used. It has been tested. Lots of bugs have been found, and they’ve been fixed. There’s nothing wrong with it.

SMS: Well, why do programmers constantly go charging into management’s offices claiming the existing code base is junk and has to be replaced?

Joel: My theory is that this happens because it’s harder to read code than to write it. A programmer will whine about a function that he thinks is messy. It’s supposed to be a simple function to display a window or something, but for some reason it takes up two pages and has all these ugly little hairs and stuff on it and nobody knows why. OK. I’ll tell you why. Those are bug fixes. One of them fixes that bug that Jill had when she tried to install the thing on a computer that didn’t have Internet Explorer. Another one fixes a bug that occurs in low memory conditions. Another one fixes some bug that occurred when the file is on a floppy disk and the user yanks out the diskette in the middle. That LoadLibrary call is sure ugly but it makes the code work on old versions of Windows 95. When you throw that function away and start from scratch, you are throwing away all that knowledge. All those collected bug fixes. Years of programming work.

SMS: Well, let’s assume some of your top programmers walked in the door and said “we absolutely have to rewrite this thing from scratch, top to bottom.” What’s the right response?

Joel: What I learned from Charles Ferguson’s great book (High St@kes, No Prisoners) is that you need to hire programmers who can understand the business goals. People who can answer questions like: “What does it really cost the company if we rewrite?” “How many months will it delay shipping the product?” “Will we sell enough marginal copies to justify the lost time and market share?” If your programmer insists on a rewrite, they probably don’t understand the financials of the company, or the competitive situation. Explain this to them. Then get an honest estimate for the rewrite effort and insist on a financial spreadsheet showing a detailed cost/benefit analysis for the rewrite.
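(To make Joel’s “financial spreadsheet” concrete, here is a back-of-the-envelope version in Python. Every figure is an invented placeholder for illustration, not a number from the interview; substitute your own.)

# Back-of-the-envelope rewrite economics, in the spirit of Joel's advice.
# All figures below are invented placeholders.
rewrite_months         = 18        # engineering's honest estimate
monthly_burn           = 250_000   # fully loaded cost of the team, in dollars
monthly_lost_sales     = 100_000   # sales ceded to rivals while you ship nothing new
rewrite_gain_per_month = 30_000    # extra profit the shiny new code base earns

total_cost     = rewrite_months * (monthly_burn + monthly_lost_sales)
payback_months = total_cost / rewrite_gain_per_month

print(f"Rewrite costs ${total_cost:,} and pays back in {payback_months:.0f} months")
# -> Rewrite costs $6,300,000 and pays back in 210 months (over 17 years)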

SMS: Yeah, great, but, believe it or not, programmers have been known to, uh, “shave the truth” when it comes to such matters.

Joel: What you’re seeing is the famous programmer tactic: all features that I want take one hour, all features that I don’t want take 99 years. If you suspect you are being lied to, just drill down. Get a schedule with granularity measured in hours, not months. Insist that each task have an estimate that is two days or less. If it’s longer than that, you need to break it down into sub-tasks or the schedule can’t be realistic.
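(Joel’s drill-down is mechanical enough to sketch in a few lines of Python; the task list and estimates below are hypothetical.)

# Enforce the granularity rule: no task estimate over two working days (16 hours).
MAX_HOURS = 16

tasks = [
    ("port the file parser", 12),       # fine: under two days
    ("new rendering engine", 320),      # suspiciously round; not a real estimate
    ("fix printing on Windows 95", 6),  # fine
]

for name, hours in tasks:
    if hours > MAX_HOURS:
        print(f"'{name}' ({hours}h): break into sub-tasks before trusting the schedule")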

SMS: Are there any circumstances where a complete code rewrite is justified?

Joel: Probably not. The most extreme circumstance I can think of would be if you are simultaneously moving to a new platform and changing the architecture of the code dramatically. Even in this case you are probably better off looking at the old code as you develop the new code.

SMS: Hmmm. Let’s take a look at your theory and compare it to some real world software meltdowns. For instance, what happened at Netscape?

Joel: Way back in April 2000, I wrote on my web site that Netscape made the “single worst strategic mistake that any software company can make” by deciding to rewrite their code from scratch. Lou Montulli, one of the five programming superstars who did the original version of Navigator, e-mailed me to say, “I agree completely, it’s one of the major reasons I resigned from Netscape.” This one decision cost Netscape three years. That’s three years they spent with their prize aircraft carrier in 200,000 pieces in dry-dock. They couldn’t add new features, couldn’t respond to the competitive threats from IE, and had to sit on their hands while Microsoft completely ate their lunch.

SMS: OK, how about Borland? Another famous meltdown. Any ideas?

Joel: Borland also got into the habit of throwing away perfectly good code and starting from scratch. Even after the purchase of Ashton-Tate, Borland bought Arago and tried to make that into dBase for Windows, a doomed project that took so long that Microsoft Access ate their lunch. With Paradox, they jumped into a huge rewrite effort with C++ and took forever to release the Windows version of the product. And it was buggy and slow where Paradox for DOS was solid and fast. Then they did it all over again with Quattro Pro, rewriting it from scratch and astonishing the world with how little new functionality it had.

SMS: Yeah, and their pricing strategy didn’t help.

Joel: While I was on the Excel team, Borland cut the MSRP on Quattro Pro from around $500 to around $100. Clueless newbie that I was, I thought this was the beginning of a bloody price war. Lewis Levin,[i] the Excel BUM (“Business Unit Manager”), was ecstatic. “Don’t you see, Joel, once they have to cut prices, they’ve lost.” He had no plan to respond to the lower price. And he didn’t need to.

SMS: Having worked at Ashton-Tate, we have to tell you the dBase IV code base was no thing of beauty. But, we take your point. Actually, we saw this syndrome at work in Ashton-Tate’s word processing division. After they bought MultiMate, they spent about two years planning a complete rewrite of the product and wasted months evaluating new “engines” for the next version. Nothing ever happened. When a new version of the product was released, it was based on the same “clunky” engine everyone had been moaning about. Of course, in those two years WordPerfect and Microsoft ate Ashton-Tate’s word processing lunch.

Joel: Ashton-Tate had a word processor?

SMS: Yes, but nothing as good as WordStar, mind you!

Joel: Hmm. That reminds me that Microsoft learned the “no rewrite” lesson the hard way. They tried to rewrite Word for Windows from scratch in a doomed project called Pyramid which was shut down, thrown away and swept under the rug. Fortunately for Microsoft, they did this with parallel teams and had never stopped working on the old code base, so they had something to ship, making it merely a financial disaster, not a strategic one.

SMS: OK, Lotus?

Joel: Too many MBAs at all levels and not enough people with a technical understanding of what could and needed to be built.

SMS: And I suppose building a brand new product called “Jazz[ii]” instead of getting 123 over to the Mac as quickly as possible, thus staking Microsoft to a two-year lead with Excel, is an example of the same thing?

Joel: Actually they made a worse mistake: they spent something like 18 months trying to squeeze 123/3.0 into 640KB. By the time the 18 months were up, they hadn’t succeeded, and in the meantime, everybody bought 386s with 4 megs of RAM. Microsoft always figured that it’s better to let the hardware catch up with the software rather than spending time writing code for old computers owned by people who aren’t buying much software anymore.

SMS: WordPerfect?

Joel: That’s an interesting case, and it leads to another development sin software companies often make: using the wrong level of tools for the job. At WordPerfect, everything, and I mean everything, had to be written in Assembler. Company policy. If a programmer needed a little one-off utility, it had to be hand-coded and hand-optimized in Assembler. They were the only people on earth writing all-Assembler apps for Windows. Insane. It’s like making your ballerinas wear balls and chains and taping their arms to their sides.

SMS: What should they have been coding in?

Joel: In those days? C. Or maybe Pascal. Programmers should only use lower-level tools for those parts of the product where they are adding the most value. For example, if you’re writing a game where the 3D effects are your major selling point, you can’t use an off-the-shelf 3D engine; you have to roll your own. But if the major selling point of your game is the story, don’t waste time getting great 3D graphics—just use a library. WordPerfect, though, was writing UI code that operates in “user time” and doesn’t need to be particularly fast. Hand-coded Assembler there is insane and adds no value.

SMS: Yes, but isn’t such code tight and small? Don’t products built this way avoid the dreaded “bloatware” label?

Joel: Don’t get me started! If you’re a software company, there are lots of great business reasons to love bloatware. For one, if programmers don’t have to worry about how large their code is, they can ship it sooner. And that means you get more features, and features make users’ lives better (if they use them) and don’t usually hurt (if they don’t). As a user, if your software vendor stops, before shipping, and spends two months squeezing the code down to make it 50% smaller, the net benefit to you is going to be imperceptible, but you went for two months without new features that you needed, and that hurt.

SMS: Could this possibly account for the fact that no one uses WordStar version 3.3 anymore despite the fact it can fit on one 1.4 meg floppy?

Joel: That and “Control-K.” But seriously, Moore’s law makes much of the whining about bloatware ridiculous. In 1993, Microsoft Excel 5.0 took up about $36 worth of hard drive space. In 2000, Microsoft Excel 2000 takes up about $1.03 in hard drive space. All adjusted for inflation. So stop whining about how bloated it is.
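(Joel’s dollar figures are easy to reconstruct with rough, assumed disk prices for the two eras. The install sizes and per-megabyte prices below are estimates chosen to be consistent with his $36 and $1.03, not audited data.)

# Reconstructing the arithmetic with assumed prices and install sizes.
price_per_mb_1993 = 1.00    # roughly $1 per megabyte of hard drive in 1993
price_per_mb_2000 = 0.01    # roughly $10 per gigabyte in 2000
excel5_mb         = 36      # approximate Excel 5.0 install footprint
excel2000_mb      = 103     # approximate Excel 2000 install footprint

print(f"Excel 5.0:  ${excel5_mb * price_per_mb_1993:.2f} of 1993 disk space")
print(f"Excel 2000: ${excel2000_mb * price_per_mb_2000:.2f} of 2000 disk space")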

SMS: Well, we’ve had much personal experience with the press slamming a product we were managing. For example, for years reviewers gave MicroPro hell over the fact that it didn’t support columns and tables. Somehow the fact that the product would fit on a 360K floppy just didn’t seem to mean as much as the idea that the reviewer couldn’t use our product to write his or her resume.

Joel: There’s a famous fallacy that people learn in business school called the 80/20 rule. It’s false, but it seduces a lot of dumb software startups. It seems to make sense. 80% of the people use 20% of the features. So you convince yourself that you only need to implement 20% of the features, and you can still sell 80% as many copies. The trouble here, of course, is that it’s never the same 20%. Everybody uses a different set of features. When you start marketing your “lite” product and you tell people, “hey, it’s lite, only 1MB,” they tend to be very happy, then they ask you if it has word counts, or spell checking, or little rubber feet, or whatever obscure thing they can’t live without, and it doesn’t, so they don’t buy your product.
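(The fallacy is easy to demonstrate with a quick Monte Carlo sketch; all the numbers are invented. Even shipping the 20 most popular of 100 features leaves almost no user with everything he needs.)

# Why "ship 20% of the features, keep 80% of the buyers" fails:
# everybody needs a different 20%.
import random

random.seed(1)
N_FEATURES, N_USERS, NEEDS = 100, 10_000, 10

# Each simulated user needs 10 features drawn at random from the 100.
users = [random.sample(range(N_FEATURES), NEEDS) for _ in range(N_USERS)]

# Rank features by popularity and ship only the top 20.
popularity = [0] * N_FEATURES
for needs in users:
    for f in needs:
        popularity[f] += 1
shipped = set(sorted(range(N_FEATURES), key=lambda f: -popularity[f])[:20])

satisfied = sum(all(f in shipped for f in needs) for needs in users)
print(f"{satisfied / N_USERS:.1%} of users find every feature they need")
# With uniform demand this comes out to essentially 0%, not 80%.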

SMS: Let’s talk about product marketing and development at Microsoft. How did these two groups work together?

Joel: Well, in theory, the marketing group (called Product Management) was supposed to give the development team feedback on what customers wanted. Feature requests from the field. That kind of stuff. In reality, they never did.

SMS: Really?

Joel: Really. Yes, we listened to customers, but not through Product Management—they were never very good at channeling this information. So we on the Program Management (design) teams just went out and talked to customers ourselves. One thing I noticed pretty quickly is that you don’t actually learn all that much from asking customers what features they want. Sure, they’ll tell you, but it’s all stuff you knew anyway.

SMS: You paint a picture of the programmer almost as a semi-deity. But in our experience, we’ve seen powerful technical personalities take down major companies. For instance, in The Product Marketing Handbook for Software, we describe how the MicroPro development staff’s refusal to add the aforementioned columns and table features to WordStar badly hurt the product’s sales.[iii] How do you manage situations like these?

Joel: This is a hard problem. I’ve seen plenty of companies with prima donna programmers who literally drive their companies into the ground. If the management of the company is technical (think Bill Gates), management isn’t afraid to argue with them and win—or fire the programmer and get someone new in. If the management of the company is not technical enough (think John Sculley), they act like scared rabbits, strangely believing that this one person is the only person on the planet who can write code, and it’s not a long way from there to the failure of the company.

If you’re a non-technical CEO with programmers who aren’t getting with the program, you have to bite the bullet and fire them. This is your only hope. And it means you’re going to have to find new technical talent, so your chances aren’t great. That’s why I don’t think technology companies that don’t have engineers at the very top have much of a chance.

SMS: Joel, thank you very much.

[i] Lewis Levin got his start in the industry as the product manager for MicroPro’s PlanStar.

[ii] Jazz was intended to be the Macintosh equivalent of Symphony for the PC. Like most integrated products, it managed to do too much while not doing anything particularly well.

[iii] Over time, the programming staff noted that requests from users for this feature were dropping. This was absolutely true, as people who wanted this capability in a word processor bought other products.
