In 2011, Steve Jobs passed away from pancreatic cancer. Like most “solid” organ cancers, it’s a tough customer. Your internal organs have few pain receptors, so the cancer normally metastasizes and spreads before it’s detected. Victims typically die months after the first diagnosis.
Jobs was lucky in that he was diagnosed with a rarer form of the disease, neuroendocrine islet cell carcinoma. Unlike the more common and lethal adenocarcinoma, islet cell cancer is often detected earlier and is more responsive to treatment. It can be cured if found early and treated promptly.
As befits the career of one of high-tech’s most paradoxical personalities, when Jobs was first diagnosed, instead of seeking the assistance of modern medicine and immediately undergoing surgery, he took a nine-month detour into various alternative treatments, including fad diets and “spiritual healing,” which science has demonstrated are useless. Jobs finally underwent the surgery, which prolonged his life, but he later regretted not acting immediately when there was perhaps a chance for a cure. During an interview about his biography of the Apple titan, Steve Jobs, author Walter Isaacson said he felt Jobs had delayed the operation because:
“I think that he kind of felt that if you ignore something, if you don’t want something to exist, you can have magical thinking.”
While his behavior was foolish and brought sorrow to his friends and family, you almost can’t blame Jobs for believing he possessed some sort of magic. When he returned to Apple in 1996 and resumed full control of the company in 1997, Apple was a fading power in computing and seemed destined for irrelevance, if not erasure. From 1997 to the year of his death, he executed the most remarkable turnaround in business history, releasing three massive hits in a row—the iPod, iPhone, and iPad, not to mention iTunes and the Apple App Store. By the time he passed away, Jobs had achieved more than a business transformation and turnaround. He’d transformed into a totem, a mystical figure with a special mana that, if you could somehow tap into it, would grant you the power to bend marketing space and time to your own personal reality.
Inspired by Jobs and all that luscious iPhone revenue, Jeff Bezos began planning for Amazon to enter the smartphone market in 2013. On the face of it, it was a logical decision. In 2007, Amazon’s first hardware product, the Kindle ereader, had captured the ebook market. In 2011, Amazon introduced its first tablet, the Kindle Fire, and it too had been a solid success.
The function of both devices in Amazon’s sales strategy was to act as low-cost, loss-leader purchases designed to whisk you into Amazon’s virtual bookstore and warehouses, where you would hopefully buy enough merchandise and services to cover the costs incurred in building them. As such, the design of both the ereader and the tablet was minimalist and as cheap to build as possible.
For his smartphone, Jeff Bezos had a different plan entirely. By 2013, he no longer thought of Amazon as just a store, but also as a technology powerhouse, a unique amalgam of merchant and high-tech empire. Bezos felt he competed not only against Walmart, but also against Google, Microsoft, and, of course, Apple, the coolest brand on the planet, which until only recently had been led by the coolest guy in high-tech. Bezos thought he could be cool too, and like many other Silicon Valley CEOs, fired up a virtual Ouija board in search of some of that powerful Steve Jobs mana. Soon, he’d loaded up with spiritual power and the Fire Phone development effort began.
The project, code named “Tyto,” was set up in Amazon’s Lab126 under Bezos’ watchful eye. The facility is Amazon’s design studio and a conscious homage to Jobs and his immortal “pirate” building at Apple HQ where the original Macintosh was developed. Channeling Jobs further, Bezos appointed himself the device’s “super product manager” and made every major design and functionality decision for Tyto during its development cycle.
As the device took shape, Bezos became fixated on providing a unique new feature for the phone, a 3-D effect for the screen called “Dynamic Perspective.” Implementing it required four power-sucking cameras, one installed in each corner of the phone. Dynamic Perspective’s only practical use was to make your lock screen look 3-D. Sort of. Which wasn’t practical. But in Bezos’ mind, this one feature branded the Fire Phone a high-end, luxury device.
From a design standpoint, the rest of the Fire Phone was an undistinguished black polycarbonate slab with a soft black edge, a design cue taken from entry-level Chevrolets. During the product’s live rollout, much was made of the “injection molded steel connectors,” but the slides also included the term “rubber frame.” The latex fetishists in the audience may have been excited, but the rest of the audience was unimpressed.
Yes, the Fire Phone had a 13 MP front camera while an iPhone’s had only eight, a tangle-proof headphone cord, and a software bundle that sounded nice, but it looked like something you bought at the bargain end of the smartphone counter at Best Buy. And while Amazon wanted you to watch lots of videos on your Fire Phone, it lacked an HD screen.
After the tremendous success of Windows 3.0 in 1990, the market was clamoring for Microsoft’s next act, an OS that would enable PC owners to pull even with the Macintosh. Windows NT was announced and for the next three years Microsoft spent considerable time proclaiming that this new version of the product, once known as OS/2 3.0, would be the 32-bit successor to the Windows 3.x product line. But as NT neared completion, complaints began to surface that the product was too big and resource-hungry to fit the market’s existing desktop profile. Microsoft had heard this complaint before with other products, and in the past Moore’s Law (which, roughly paraphrased, states that computing capacity doubles every 18 months, and which was fully operative at the time) had bailed the company out. But this time, Microsoft blinked. NT was quickly hustled offstage and Microsoft cobbled together a 16/32-bit hybrid that would eventually be known as Windows 95 and switched promotional gears, telling everyone that this product was in fact the desktop upgrade Microsoft had been promising.
After Windows 95’s release, Windows 3.x’s huge installed base, IBM’s ineptitude in marketing the competing OS/2, and a massive promotional campaign all contributed to the product’s tremendous sales success. But over time, the positioning problem grew in the critical desktop arena. Windows NT, then 2000 (the more things change . . .), had always been available in a “workstation” version that directly competed with the Windows 9x family. After all, both product lines were called Windows. They were both advertised as 32-bit operating systems. The desktop versions were comparably priced. They looked alike. So, which to buy?
Microsoft tried to help customers make the decision via a classically bad 1996 ad campaign many referred to as “Two Nags Racing.” A two-page spread, it featured a picture of two horses running neck-and-neck with the caption “You See a Horse Race. We See Two Thoroughbreds.” Apparently no one at Microsoft had realized that, uh, yes, but the horses are racing. And as we all know, only one horse can win. So, which customer is going to ride the losing steed? Faced with such a choice, corporate America paused (and the ad was quickly yanked).
Luckily for Microsoft, a way out of the dilemma appeared. Windows NT was repositioned as a network operating system (NOS) and aimed at Novell, where it achieved massive success due to the Utah firm’s marketing ineptitude. Despite some ongoing speculation and criticism, this enabled Microsoft to successfully navigate away from the dark and stormy positioning clouds that loomed ahead of it and towards further product success. In 2001, the company grasped the Holy Grail of product line integration with the release of Windows XP, which enabled the company to offer one product named Windows to the market. Despite the weather, everything became sunny every day in Seattle.
By 2005, Microsoft was 10 years into its rule over computing and the view from Redmond was splendid. Microsoft Office’s grip on the desktop applications market had been locked down and seemed unassailable. Internet Explorer’s market share hovered close to 90%. The company enjoyed strong sales of its development, server and SQL database products. Microsoft set the standards now, not IBM.
Even better, a new market was opening up and Microsoft was sitting pretty there as well. A new term, “smartphone,” was entering computing’s lexicon. A smartphone was a hybrid device, a blend of a mobile phone and a personal digital assistant (PDA). While PDAs had never become the multi-billion dollar category foretold since the introduction of the Apple Newton, they’d carved out a foothold in the market and Microsoft’s Windows Mobile OS had made a nice transition from PDAs to hot new smartphones such as Samsung’s BlackJack. By 2007, Windows Mobile enjoyed a 30% market share in smartphone OSs. To drive the point home, that year BlackJack owners were sweating out the possibility that Windows Mobile 6.0 would not be released for their devices.
There were a few splotchy clouds drifting across what was otherwise a crystalline horizon. Losing that anti-trust suit to the U.S. DOJ had been painful. Bill Gates had left the company as a result and that was a blow, though his replacement Steve Ballmer remained in fine spirits throughout the ordeal. More inconvenient was that after the case had been settled, the DOJ draped an anti-trust anaconda around Microsoft’s windpipe with instructions to squeeze if Redmond started to show signs of reverting to its bad old ways. Fortunately, reptiles tend to sleep most of the time, though the snake did become agitated if Microsoft engineers started becoming too busy with browser technology.
And of course, as everyone knew they would after the case was settled, the Euros started suing the company left and right. Annoying, but when you enjoy two major high-tech monopolies, your profits can easily cover the costs of keeping legal vultures at bay. Just the cost of doing business.
It started becoming less good in 2006 with the release of what was supposed to be Windows XP’s successor, Windows Vista. Codenamed “Longhorn,” Microsoft promised it would be stuffed with all sorts of good things such as a journaling file system that, like Doctor Who’s TARDIS, could practically travel back in time to retrieve files, security protection that even supercomputers couldn’t crack, a gorgeous new interface, better printing, searching, networking, and on and on. Everyone in the industry had already heard this a million times in the past and the company’s Marketspeak was translated into a standard statement that read Longhorn would be late, it would have bugs, they would be fixed over the course of several incremental releases, most of the new features would eventually work, and Microsoft would sell a ton of it. What choice did anyone have?
In January 2007, Vista was duly released, and everyone stepped politely aside to avoid impeding the inevitable upgrade stampede. And waited. And waited some more. Then even more before everyone realized the stampede wasn’t coming.
It was a puzzle. It couldn’t have been lack of awareness of the product; it had been covered relentlessly by the high-tech press for years. When it was released, Microsoft pulled out all the normal PR stops with an expensive launch event, extensive advertising, an on-stage show starring Bill Gates and Steve Ballmer, and much more. All the normal PR bases were covered, but Vista just refused to sell. Why?
The answer came back to a positioning failure. Vista didn’t sell because it violated one of three fundamental laws of product positioning—the product’s feature set as delivered (not promised) failed to convince Windows XP users, Vista’s primary competition, that it was worth the time and hassle of replacing a perfectly fine and working installation.
In the years between the product’s announcement and its delivery, Microsoft had steadily chopped away at the advanced features that were supposed to make Vista a compelling purchase. The advanced file system went to the same place that Doctor Who’s TARDIS does when it dematerializes. The “hacker proof” code was scrapped and your new Vista system could still be cracked open by the 15-year-old next door. The new “Aero” interface that was supposed to be more beautiful than Hawaiian sunsets drained power from your laptop at an alarming rate, leading people to turn it off and settle for a look more akin to Seattle during a drizzle.
Making it worse was that some of the things Microsoft did add weren’t wanted or were annoying. For example, copy protection, and a “User Account Control” (UAC) notification system that, when you wanted to move or copy a file, asked, “Are you sure?” And when you said yes, asked “Absolutely positive?” And when you said “YES,” asked you, “Are you sure you don’t want to change your mind?” That was often the last thing it said before you put your fist through the screen.
In short, there was no compelling reason to buy Vista and people didn’t. Microsoft gamed the adoption figures, but no one was fooled. Windows XP remained the preferred version for several more years. It turned out that Windows users did have a choice after all.
Vista’s market flop shocked Microsoft and certainly didn’t help the company’s bottom line, but no one panicked. Microsoft was still powerful, still set the standards, and still owned two invaluable monopolies. Vista was a setback, not a catastrophe. Microsoft’s attitude was reinforced with the success of Windows 7 in 2009. The product was well reviewed, avoided many of the issues that plagued Vista, and was a solid sales success.
Despite Windows 7’s success, a danger sign presented itself after the launch. This was the continued unwillingness of many Windows XP users to give up the OS. Many people remained unconvinced that the features offered by Windows 7 justified the pain of switching. In 2012, XP enjoyed a worldwide market share of 47% compared to Windows 7’s 36%.
Nonetheless, Windows 7 made many people feel better about Microsoft’s place in the world, and the company refocused its attention on nurturing its monopolies while also planning to conquer the remaining application markets it didn’t control.
And obviously work on Jupiter (Windows 8), Windows 7’s planned replacement, went full steam ahead.
On October 11, 2011, Steve Jobs, as we all must do, departed this earth, in his case from complications caused by pancreatic cancer. While he died much younger and sooner than he, his friends and family, and most of the general public would have wished, Jobs had the satisfaction of knowing he left this life at the very peak of his career and impact on high-tech. When he took complete control of Apple in 1997 (Jobs during his first turn at Apple was kept on a leash and was never the company’s CEO), Apple seemed to be past its peak and sales and revenue were sliding. In fact, under Jobs the slide continued, with revenue declining from around $11 billion in 1997 to $6 billion in 2001, despite some widely heralded marketing campaigns and huge media coverage of the return of Apple’s prodigal son. It seemed that Jobs was no magic man despite his outsized reputation.
Jobs’ failure to return Apple to computing’s pinnacle of power was not surprising, and I wrote about it fairly extensively in the second edition of In Search of Stupidity. Here’s what I said in 2005:
Why not take a stab at planning to put Apple back on the throne from which it once reigned microcomputing 25 years ago? After all, everyone is bored with Windows. Linux, the only possible other competitor, has all the computing charm of a diesel truck and requires a degree in computer science to install. And everything the Apple Fairy Godmother said is true, and she left out some hard revenue facts besides. In 2001, Apple’s annual revenue hovered around $6 billion. In 2005, Apple sold more than 32 million iPods, and more than one billion songs were downloaded from its iTunes service by the winter of 2006. Yearly revenues for 2005 were almost $14 billion, with more than a billion of that being profit…
… Apple’s growth is coming from consumer electronics, not computers, and no one on this planet has ever figured out how to take a company from 4% market share to industry dominance in the face of an entrenched competitor determined to defend its turf. Apple came close to industry supremacy in the late 1970s and early 1980s, but this was before IBM woke up. And despite Microsoft’s creeping development of the senescence that inevitably afflicts all megasized corporations, unless a big meteor hits Redmond and Bellevue, Apple cannot hope that Steve Ballmer and Bill Gates are going to stand idly by while it lops off significant amounts of market share and money from Microsoft.
This was exactly right. Apple has never come close to regaining its former prominence in manufacturing and selling laptop, desktop, and server systems, nor does it any longer want to. Beginning with the iPod, Jobs “pivoted” (this is the new hot “wonder term” that’s replaced the more prosaic “do something else”) into consumer electronics, the iPod being succeeded by the iPhone, Jobs’ pièce de résistance, and as his encore, the iPad, computing’s most successful tablet. The Macintosh is now an insignificant part of Apple’s revenues, and as I pointed out earlier, the company is in the process of preparing to leave that aspect of its business behind via a transition strategy.
Jobs’ strategy took Apple to 2011 revenues of $108 billion. No turnaround of such magnitude had ever been seen in business, and we may never see such an achievement again. The market made up its mind: Jobs was indeed magic, and high-tech CEOs everywhere decided they wanted to transfer some of his marketing and sales fairy dust to themselves.
There was of course no practical way this could be accomplished; while the Catholic Church and Buddhist denominations encourage the collection of physical remains of saints under the label of “holy relics,” Jobs and his family had no interest in this sort of thing.
Another option was available, and that was “sympathetic magic.” Sympathetic magic works on the assumption that if you behave like, dress like, and even resemble a magical person, some measure of their spirit will transfer to you. Unfortunately, the process is fraught with challenges. We’ve already seen what happened when Jeff Bezos built a special workshop just like Steve Jobs’ original “Macintosh Pirate HQ” and attempted to recreate the magic of the original Mac rollout. Ugh. The failure came about because while Bezos had the lab, he completely lacked Jobs’ understanding of product positioning.
Then there was Travis Kalanick, one-time CEO of hot, hot, hot unicorn Uber. During his tenure as CEO, Uber became the number one ride sharing company, and by the time of its IPO the company’s market capitalization had peaked close to $70 billion. However, in an attempt to reproduce the Jolly Roger atmosphere Jobs generated at Apple during the Mac’s development, Kalanick apparently amped things up to the point where he had to issue a companywide memo asking employees to stop having sex with their bosses, throwing beer kegs out of windows, and puking on the office furniture. Kalanick also over-channeled Jobs when he was recorded screaming at Uber drivers unhappy with the firm’s payment structure. This seemed a bit unfair because Jobs was also famous for publicly abusing people (smoothie ladies, Daniel Kottke, the MobileMe guy, random employees) and no one got that worked up about it. However, Jobs had the good luck to die before he could be hoist by the petard he’d created, a smartphone with a high-definition camera and video recorder in everyone’s hands. Kalanick didn’t and was escorted out of the CEO suite as his antics came to light and went viral across the globe.
The brief career of Seth Bannon, one-time CEO of Amicus, also deserves a quick examination. This isn’t the best-known case of Steve Jobs Disease, but it is the funniest. Amicus was a hot, hot, hot little company that had published an app that managed fundraising for left-wing political causes and people. It was generating revenue and Bannon was on the way to becoming a New York City version of Kalanick. He was profiled in Forbes, Fortune, and all the best blogs and pubs.
Bannon had the look! Hoodie, textbook millennial laser-graded-one-inch-stubble, a “Make Mistakes” T-shirt.
He went to Burning Man!
He was accepted by Y-Combinator!
He was even allowed to occupy the same Hacker House Mark Zuckerberg stayed in!
You know, the one in The Social Network movie.
He actually set up his desk in the house in the same place as the Zuck!
Then, one day in 2014, he walked into Amicus HQ and laid off the whole company without warning, à la Jobs walking into Pixar and summarily firing a group of employees. (When it was pointed out to Jobs that he’d offered no severance pay and he was asked to give the unfortunate workers two weeks’ notice, he agreed, but made the layoff date retroactive two weeks.)
Why had “Zuckerbannon” fired everyone? No money in the corporate piggy bank! You see, our young entrepreneur had failed to pay New York State and City payroll taxes for years. And no workers’ comp for months. When the company forked it all over to the state (including fines), the piggy bank was busted.
Why hadn’t Bannon paid his company’s taxes?
Let Seth explain it all to you:
I hired our first accountant this year. I knew our books hadn’t been well kept and was pretty sure we owed some state taxes. He determined we owed about $5,000 in state and local taxes, which we quickly paid. But shortly after, he noticed there hadn’t been any payroll tax payments coming out of our bank account.
Impossible, I thought, Bank of America’s payroll system automatically withholds payroll taxes every month, it’s all automated. But it wasn’t. There was a single “submit tax” button in a separate part of the Bank of America website that had to be clicked to actually pay the taxes.
Bannon also liked letting people know that he’d dropped out of Harvard just like Zuckerberg and, of course, Bill Gates. (Jobs never went to college either.)
Except, he’d actually dropped out of Harvard’s continuing education school. You know, the one where you study things like macramé, driver’s ed, and how to obtain your real estate license?
Which is not quite the same thing as dropping out of Harvard.
When I was reading about Seth, I started cracking up, and was then inspired to write my second novel—Selling Steve Jobs’ Liver: A Story of Startups, Innovation, and Connectivity in the Clouds. (Yes, Bannon makes an appearance, lightly disguised, in the book and has the opportunity to establish a deep personal connection with Steve Jobs. You’ll have to read the book to find out more.)
When it was all over, Bannon was no longer a CEO and Amicus was out of business.
The most virulent case of Steve Jobs Disease was that of Elizabeth Holmes, one-time CEO of one-time bio-tech company Theranos. Holmes founded Theranos in 2003 when she dropped out of Stanford at 19 to develop a blood-draw technology that was going to disrupt phlebotomy. All of us have had to have our blood drawn, and many people find it an uncomfortable and disquieting process as vial after vial of red fluid is withdrawn from their bodies. Some people find it so disturbing that they require sedation before the procedure.
For a missile aimed at blowing apart a centuries-old nexus of culture, commerce, politics, and artistic creation, the Kindle ereader was somewhat underwhelming. The device was neither a piece of machine art nor an example of cutting edge design. But as we’ve already seen with Microsoft Windows, if there’s enough pent-up demand for a technology, even the ordinary and adequate can spark a marketing stampede.
This was the case with the Kindle. As I wrote in the second edition of In Search of Stupidity, the only factor preventing the widespread adoption of ebooks was that the quality of experience was poor because all of the devices sucked. New pages took several seconds to display. The cell phones of 2005 were inadequate for prolonged reading. The tablets and laptops of the period were too heavy and clunky.
The Kindle didn’t suck, though its appearance was a bit awkward. It was light. Its low power E Ink display matched the experience of reading paper. Pages displayed at an acceptable speed. It could hold some 200 books and publications in its 250 MB of internal storage. Integrated 3G wireless let you easily download books off Amazon’s online store. The unit was comfortable to hold.
The Kindle wasn’t inexpensive at $399, but when you considered the convenience and the fact that ebooks could be expected to be cheaper than their print equivalents, the price was acceptable. And soon after the Kindle’s release, Kindle reader apps appeared for iOS, Android, PCs, and on Amazon’s own Fire tablets. Any device could now be a Kindle. And Amazon had spent much time and effort ensuring that close to 100,000 titles were available for sale on day one of the device’s release.
The quality of experience threshold had been met.
The Kindle was not a technology surprise. While I take some pride in correctly diagnosing why the 2001 ereader initiative failed and predicting the release of a successful device, I certainly wasn’t the only person to believe ereaders and ebooks were inevitable. To most of us pundits, the only question was when a successful unit would be released and what it would look like. The subsequent disruption to the book business would simply be an extension of the digital wave already overtaking newspapers, magazines, paper mail, fax, and other print technologies.
What we should have focused on more closely was who would release the breakthrough gizmo. In 2006, if you had asked technologists to guess who would create the first successful ereader, companies such as Microsoft, Apple, maybe Adobe because of its PDF technology, and perhaps HP because of its long association with digital printing, would have made the list of logical candidates. Some people might have pointed to this or that hot new startup. Few people would have guessed Amazon, thought of at the time as a mere merchant, not a member of the high-tech elite.
The first Kindle sold out in five hours and began to disrupt the book industry, to the publishers’ horror (though since Amazon had shown them prototypes of the device months before its release, they shouldn’t have been that surprised). Yet somehow the industry had persuaded itself that print was unassailable. Shortly before the launch, Barnes & Noble CEO Stephen Riggio was quoted saying, “The physical value of the book is something that cannot be replicated in digital form.” This was the same company that in its 2001 annual report proclaimed that it would “seize our future on all fronts: retail, online and digital.”
Making things worse for the publishers was that Amazon had failed to inform them that it intended to price all of its ebooks at $9.99, price-rigging the ebook market in a single shot. They found out about it halfway through Jeff Bezos’ launch speech for the Kindle and its supporting platform, Kindle Direct Publishing (KDP), on November 19, 2007 at the Waldorf Hotel in New York City. Reportedly, they were stunned by the announcement, but it’s fair to ask why none of the major publishers thought to ask about Amazon’s pricing model while it was creating all that new content, and insisted on an answer before the event.
Demonstrating that 2007’s initial demand was no fluke, by 2009 ebook purchases, almost all of them Kindle, reached 3 million units. In 2010, 8 million. By 2013, ebooks accounted for 23% of trade publishing revenues, with 90% of sales flowing through the Kindle platform.
The trade publishers hated the $9.99 model. They were worried the one-price-fits-all straitjacket would devalue books, and they were right (though this problem would plague self-publishers far more than the trade houses). They thought ebooks would cannibalize sales of print books, and they did, though it was stupid to think that raising the price of ebooks would have much impact on the long-term future of print. They were afraid the Kindle was going to destroy the brick and mortar stores, and by 2010 superstore Borders was on the verge of collapse and would die the next year.
The publishers now faced apocalypse, while doomsday loomed ahead for the brick and mortar resellers. The pricing of the fastest growing segment of the book market was controlled by Amazon, and in the world of vendors and channels, the ability to dictate a market’s pricing model is to own the market. Soon, Jeff Bezos ruled the trade publishers. It was time to band together and sail forth to, if not kill, then at least teach a lesson he wouldn’t forget to the Channel Shark who was making their lives miserable and threatening to eviscerate their businesses. To tame the beast, a shiny new harpoon was brought on board their virtual Orca. This fearsome weapon was named “Agency Pricing” and the publishers knew that once launched, a fierce channel war would erupt. These struggles are the business equivalent of domestic disputes, where the parties involved know each other, are co-dependent, and are intimately aware of the other side’s weak and pain points. They are always nasty and bloody, but usually not fatal.
The reason the publishers were determined to move to agency pricing was that it would enable them to regain control of their ebook destiny. Prior to the Kindle’s introduction, the publishers had sold their books to the resellers on a wholesale basis. In this model, the publisher assigns a book a “cover” or list price, and sells it to a reseller at a discount off list. The bookseller then marks up the title to whatever the market will bear. The table below provides a basic example of wholesale in action.
Table 8. Wholesale book pricing model

List or Cover Price: $20.00
Wholesale Price to Reseller: $10.00 (represents a 50% discount off list)
Store Price to Buyer: $15.00 (represents a 25% buyer discount off list price; the reseller earns a gross markup of 50%)
Wholesale pricing is the most popular channel financial model and it is used in many industries. It provides a great deal of flexibility to the reseller, who is free to charge above list price if demand makes it possible.
By contrast, in agency pricing, the vendor sets an item or service’s price and requires the reseller to adhere to it. In return, the reseller receives a set percentage of the retail sale.
Table 9. Agency book pricing model

List or Cover Price: $20.00
Agency Fee to Reseller: $6.00 (represents a 30% commission on list price)
Store Price to Buyer: $20.00 (the vendor-set list price; the reseller may not change it)
Despite some misconceptions, agency pricing is legal and widely used in different industries. Apple runs its App Store on agency pricing, and every purchase you make sends a 30% commission to Apple’s bottom line.
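The arithmetic behind the two pricing models is simple enough to sketch in a few lines of code. This is a minimal illustration, assuming a hypothetical $20 list price and the same discount and commission figures used in the tables above:

```python
def wholesale_split(list_price, reseller_discount, store_discount):
    """Wholesale model: the reseller buys at a discount off the list
    price, then sets whatever store price the market will bear."""
    reseller_cost = list_price * (1 - reseller_discount)
    store_price = list_price * (1 - store_discount)
    gross_margin = store_price - reseller_cost  # the reseller's take
    return reseller_cost, store_price, gross_margin

def agency_split(list_price, commission):
    """Agency model: the vendor fixes the store price at list and
    pays the reseller a set commission on every sale."""
    reseller_fee = round(list_price * commission, 2)
    vendor_share = round(list_price - reseller_fee, 2)
    return reseller_fee, vendor_share

# Wholesale: $20 list, sold to the reseller at 50% off,
# sold to the buyer at 25% off list
print(wholesale_split(20.00, 0.50, 0.25))  # (10.0, 15.0, 5.0)

# Agency: $20 list, 30% commission to the reseller
print(agency_split(20.00, 0.30))           # (6.0, 14.0)
```

Note the structural difference: under wholesale the reseller controls `store_price` and therefore its own margin; under agency the vendor controls the price and the reseller’s take is fixed, which is exactly why the publishers wanted the switch.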
As the publishers prepared to leave port, joining them onboard the Orca was a powerful new friend, Steve Jobs. In 2010, Jobs was ready to release his last opus, the iPad. He was interested in participating in the ebook boom and thought his new tablet was just the device to compete with the Kindle. He approached the major publishers and offered them the opportunity to sell their books through Apple’s new iBookstore via the agency model, with a 30% commission accruing to Apple on every sale.
The Big Six houses, Hachette, HarperCollins, Macmillan, Penguin, Random House, and Simon & Schuster, met among themselves and with Apple, and, with the exception of Random House, thought this was a good idea. At one stroke they’d break out of $9.99 and help create a major competitor to Amazon. The bargain was struck.
The new Apple deal included a most favored nation (MFN) proviso. MFN meant that if another reseller undercut Apple’s ebook price, Apple was free to match it. This encouraged the publishers to maintain control over their pricing and precluded Amazon from using the loss-leader strategy it had employed in other markets.
While preparations for the coming channel war were underway, there was one group of people in publishing’s firmament who remained ignored and forgotten, the self-published authors. But if everyone was ignoring them, they weren’t ignoring the Kindle. The introduction of the first ebook platform to achieve widespread acceptance meant that the old barriers that had stood between them and mass numbers of readers could be bypassed. Amazon’s new Kindle platform allowed anyone to upload their book and sell it. There were no agents to contact and beg to submit your book to an editor, no slush piles, no bored and uninterested editors, and no endless stream of rejection letters. Heaven seemed to have descended to earth.
Unfortunately for self-publishers’ dreams, they were offered no immunity from Jeff Bezos’ belief that their margin was his opportunity. When the Kindle store opened for business, Amazon imposed a huge 70% stocking fee on every upload. In other words, if you sold a book on Amazon for $9.99, you received $2.99 (less download fees) and Amazon received $6.99 (plus download fees).
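The split is simple percentage arithmetic, and a minimal Python sketch makes it concrete. The 70/30 figures come from the text; truncating each share to a whole cent is my assumption, made so the result reproduces the $2.99/$6.99 figures above, and per-megabyte delivery fees are ignored:

```python
import math

def kindle_split(list_price, author_share=0.30):
    """Split a Kindle sale between a self-published author and Amazon.

    author_share is the 30% the text describes Amazon relabeling a
    "royalty." Truncating each side to a whole cent is an assumption
    made to match the $2.99/$6.99 figures; download fees are ignored.
    """
    author_cut = math.floor(list_price * author_share * 100) / 100
    amazon_cut = math.floor(list_price * (1 - author_share) * 100) / 100
    return author_cut, amazon_cut

# On a $9.99 ebook, the author keeps $2.99 and Amazon keeps $6.99.
print(kindle_split(9.99))
```

However the rounding is handled, the essential point stands: roughly 70 cents of every sales dollar went to Amazon, not the author.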
This hadn’t fit anyone’s vision of a writer/reseller split, but Amazon deflected criticism via a brilliant marketing coup. In an example of using Newspeak to change the popular narrative that would have had Eric Blair (George Orwell) rising from his grave to shout “Bravo!”, Amazon labeled the $2.99 a “royalty.” (The person at Amazon who came up with this idea should have received a nice bonus and free Prime shipping for life.)
This is ridiculous. Amazon has never paid a self-published author a dime of royalties. Ever. The author/publisher royalty system has been in place for more than 200 years, and its parameters are well understood. Amazon does not:
Amazon does charge you to:
What Amazon provides is access to your book in its store. That’s why channels exist. In every industry I’ve studied, none attempts to extort a 70% stocking fee from its vendors. Such fees, where they exist at all, typically range between 2% and 3% of list price.
To appreciate how wonderful the success of Amazon’s gambit has been for the company, read the two sentences below. The math on each works out the same, but which sounds better?
Yes, exactly. And while statement two is untrue, the fact that almost the entire industry now uses Amazon’s vocabulary has enabled the company to grab the high ground in discussions of its financial treatment of self-publishers. Remember, it’s up to you to create demand for your book. It was satisfying to hit the upload button on the Kindle system and be informed your book was available for sale, but you were now one of 46 million titles. To stand out from the horde, you were going to have to produce and promote your book. You would have to solicit reviews, pay for promotions, put up a website to help publicize your book, spend time and money building an email list, and on and on and on.
And you were going to have to fund all that with the $2.99 left over from every $10 of sales.
Once you had cleared Amazon’s “royalty” nonsense from your head, it became evident how crushing Amazon’s stocking fee was to a self-published author’s bottom line. While it was true the publishers no longer stood between your book and the market, Amazon had replaced them with formidable time and money constraints.
“Google is committed to creating a diverse and inclusive workforce. Our employees thrive when we get this right. We aim to create a workplace that celebrates the diversity of our employees, customers, and users. We endeavor to build products that work for everyone by including perspectives from backgrounds that vary by race, ethnicity, social background, religion, gender, age, disability, sexual orientation, veteran status, and national origin.”
The above is text taken directly from the Google website at https://diversity.google/commitments/. Note that the list of protected perspectives omits political beliefs.
On November 9, 2016, the greatest triggering event in the history of political correctness and the social justice warrior (SJW) movement took place. That night, Donald J. Trump, a New York City real estate developer, marketer, star of the reality TV show The Apprentice, and very recent Republican, defeated Hillary Clinton, also from New York, a former First Lady, U.S. senator, and U.S. Secretary of State, in a free and open election for the 45th presidency of the United States. Despite the predictions of politicians, pundits, and pollsters, Trump won a solid Electoral College victory of 306 to 232 while simultaneously losing the popular vote 46.1% to Clinton’s 48.2%. It was the most unexpected upset in American political history.
The reaction of many Democrats upon the announcement of Trump’s victory was also unprecedented. Across the country, Clinton voters wailed, gnashed their teeth, screamed into the sky, and fell to the ground crying. On television, moisture forced itself visibly out of the tear ducts of some anchorfolks, while others became emotionally distraught. Breaking with all precedent, candidate Clinton did not appear on screen to gracefully concede the election, congratulate the winner, and roll out the eternal clichés about how she would continue the fight, how the future of America remained bright, and how we should all come together as a country until four years hence, when the American people would undoubtedly correct the mistake they’d just made. Instead, she reportedly threw the mother of all temper tantrums and was in no condition to appear in public until the next morning, at which time she dutifully performed the expected ritual.
California, which had recently instituted an electoral system designed to suppress conservative and Republican turnout, shared in the progressive sorrow. A nascent secessionist movement revved up and the state did its best to imitate South Carolina circa 1861. A completely unconfirmed rumor spread that so many Golden Staters threw themselves to the ground to pound the tips of their patent leather shoes against the earth in protest against the cosmic injustice of it all that there were fears the vibrations would trigger the San Andreas Fault.
And then there was Google’s all hands corporate meeting at the Googleplex, held shortly after the election. The meeting ran just over an hour and was filmed for internal use only. In September 2018, the video was anonymously leaked and can be viewed in its entirety online at the link in the footnote.
These TGIF (thank God it’s Friday) meetings are held regularly at the world’s dominant search engine firm and allow employees to comment and grouse to upper management. Many other companies have their own versions of these get-togethers. In every company I’ve ever worked for or consulted with, political discussions were either brief or unwelcome. Business was business, and customers’ ideology and voting habits were not relevant to the bottom line.
Not so at this meeting. If a video editor had removed every mention of the word Google and its derivations (googly, Googler, googleness) from the footage, a neutral observer would think they were watching an election post mortem held at the HQ of the California branch of the Democratic National Committee.
The video’s atmosphere is grim, moist, and huggy. Sergey Brin, Google co-founder, kicks off the proceedings by telling everyone that “most people are pretty upset and pretty sad.” But then he pivots to happier news by mentioning that California has legalized marijuana. This lifts the gloom a bit as the happy Googlers cheer the good news.
Celebrating their virtual contact high, everyone hugs each other. Then Eileen Naughton, Google VP of People Operations, promises to help fearful Googlers relocate to Canada. Sundar Pichai, Google CEO, drops broad hints that Google will be “adjusting” its search engines to deal with “filter bubbles.” No one comes right out and says Trump voters are Paleolithic neo-Nazis, but lots of hints are dropped— “tribalism that is destructive in the long term,” “voting is not a rational act,” “low-information voters,” and similar banal observations typically applied by the left to the right. VP for Global Affairs Kent Walker informs the rapt audience that Google’s “job is to educate policy makers.” (I’d always thought the company’s job was to build a search engine that gave the most accurate and neutral results possible, thus enabling people to educate themselves.)
Periodically, the prevailing narrative is interrupted by assurances that Google understands that conservatives may feel uncomfortable disagreeing with the opinions held by every member of upper management and by what appears to be 100% of the meeting’s attendees. No one during the session stands up to offer a viewpoint that differs in any way from the room’s prevailing zeitgeist.
The session reaches its climax when the moistest white guy in the audience stands up and reads from a little script urging everyone to go through Google’s “unconscious bias” training (these programs are scams), watch a left-wing documentary currently playing at Google, and start political arguments during Thanksgiving.
As a Google stockholder, I was appalled. I found Brin’s delight over his employees’ easier access to a drug that makes you stupid and laugh at things that aren’t funny inappropriate. Were people who rely on Google’s much heralded navigation software going to have to worry that their self-driving car would one day run a solid red light and T-bone a minivan full of kids because a stoned Google programmer compiled the wrong code into the system while fumbling for a Twinkie to quell the munchies? That the people responsible for Google Maps may one day tell me to go left into a ravine instead of right onto a road because they were all sitting around giggling and not paying attention to the process of inputting accurate satellite data into their GPS mapping software?
The whole session was a PR disaster. It is very bad business to insult 50% of your potential customer base. I kept hoping that at some point a Google board member, perhaps one attending the meeting on an impromptu basis, would leap on stage, throw a bag over Brin’s head, and drag him away before he could talk further. I began to speculate that the whole get-together was some sort of practice run for a Saturday Night Live satire about a group of rich, clueless, California high-tech nerds sitting around congratulating themselves on their inclusiveness while all believing and saying the same thing. But alas, no.
When the sniffles had dried and the last hug unwound, I stared at the screen and wondered what business would trust Google to provide reliable search results on any topic touching politics, conservative demographics, or thousands of related data points. Google was asked exactly this question when the video leaked. Its response was:
“At a regularly scheduled all hands meeting, some Google employees and executives expressed their own personal views in the aftermath of a long and divisive election season. For over 20 years, everyone at Google has been able to freely express their opinions at these meetings. Nothing was said at that meeting, or any other meeting, to suggest that any political bias ever influences the way we build or operate our products. To the contrary, our products are built for everyone, and we design them with extraordinary care to be a trustworthy source of information for everyone, without regard to political viewpoint.”
Despite this cheery assurance, many people had their doubts. In 2019, those doubts would be confirmed.
In July 2017, 28-year-old James Damore, a Google employee, made one of the biggest mistakes of his young career. He took seriously Google’s claims that it sought diversity in its workplace and that “everyone at Google has been able to freely express their opinions.” He discovered his mistake after he posted a 10-page document entitled “Google’s Ideological Echo Chamber” on one of the company’s internal forums for discussion and debate. The paper’s focus was Google’s efforts to correct what it perceived as too few women working as software engineers. Posting it violated none of Google’s workplace regulations, nor did he misuse the company’s business resources. In fact, his behavior had at least implicitly been endorsed by Google in June, when a stockholder at a company meeting asked Eric Schmidt, chairman of Alphabet, the holding company controlling Google and several other spin-offs, whether conservatives were welcome at Google. Schmidt responded:
“The company was founded under the principles of freedom of expression, diversity, inclusiveness and science-based thinking. You’ll also find that all of the other companies in our industry agree with us.”
Damore’s monograph is written in an earnest, post-graduate style, includes several charts and graphs buttressing his statements, and makes the following arguments:
The paper lists Google’s discriminatory practices with some specificity. Damore states:
“I strongly believe in gender and racial diversity, and I think we should strive for more. However, to achieve a more equal gender and race representation, Google has created several discriminatory practices, including:
The paper offers suggestions on how Google can improve the current situation. They include:
Damore argues that one of the reasons for the disparity in male/female staffing in software engineering is that women, based on inherent preferences, choose not to enter coding as a career. He also asserts that women are more prone to “neuroticism” and “anxiety,” which in turn can generate workplace stress and perhaps a desire to avoid programming. He suggests that Google seek to make the engineering environment less stressful, thus encouraging more women to become programmers.
After Damore uploaded his paper, a couple of weeks passed while Googlers interested in the topic read it. Some people were impressed by his arguments and thanked him for having the courage to bring the matter forward, though they were a decided minority.
On August 5, 2017, the paper was leaked to Gizmodo, an online site left over from Gawker that covers science, science fiction, and technology. The headline accompanying the release read “Exclusive: Here’s The Full 10-Page Anti-Diversity Screed Circulating Internally at Google.”
The paper quickly went viral, and SJW eyes widened and forehead veins bulged. A Google engineer held her breath on Twitter and threatened to quit if the company didn’t “take action.” Another emailed James Damore the message “You’re a misogynist and a terrible person. I will keep hounding you until one of us is fired. F**k you.” Another large-hearted champion of diversity posted on a Google forum that “If Google management cares enough about diversity and inclusion, they should, and I urge them to, send a clear message by not only terminating Mr. Damore, but also severely disciplining or terminating those who have expressed support for his memo.” (Irony is apparently an unknown element at Google.)
Large bundles of wood were piled up around a stake set up in the middle of the Googleplex in preparation for the public burning of Damore’s career. On August 7, the career was marched out, tied to the stake, and immolated when Google announced the hapless engineer’s firing (done via email). The stated reason was that he had violated the company’s conduct code by “advancing harmful gender stereotypes in our workplace.” Damore sued Google; the case went to arbitration and is still pending.
There were some problems with this public incineration. Eric Schmidt had told everyone that Google was all in on science, and Damore seemed to have it on his side. Below are some results of a quick search, using Google, for the terms women, anxiety, and neurotic:
The Anxiety and Depression Association of America
From puberty to age 50, women are nearly twice as likely as men to develop an anxiety disorder.
Women are nearly twice as likely as men to be diagnosed with an anxiety disorder in their lifetime. In the past year, prevalence of any anxiety disorder was higher for females (23.4%) than for males (14.3%).1 The term "anxiety disorder" refers to specific psychiatric disorders that involve extreme fear or worry, and includes generalized anxiety disorder (GAD), panic disorder and panic attacks, agoraphobia, social anxiety disorder, selective mutism, separation anxiety, and specific phobias.
This was just the start of the search:
(By the way guys, don’t use the above to get cocky. We’re more likely to be serial killers.)
The first edition of "In Search of Stupidity: Over 20 Years of High-Tech Marketing Disasters" was released in 2002, the second in 2006. Stupidity has sold over 100,000 copies worldwide and has been translated into several languages, including Chinese, Italian, German, Hebrew, Korean, Polish, and Japanese.