The Art of the Parlay, Or: How I Learned to Stop Worrying About Platform Licensing and Market Share

Toss it back and forth long enough and a nugget of conventional wisdom eventually comes to be treated as fact. With regard to Apple and the Macintosh, the prime example is the idea that Apple made a catastrophic error in the 1980s by not licensing the Macintosh.

This idea has been repeated so often by so many sources that today, most people, even Mac users, simply accept it at face value: If only Apple had licensed the Macintosh, they could have been Microsoft.

But this is not a fact. It’s conjecture, and barring a time machine, it can never be proven. But even if you could go back to 1984 and show Apple’s then-executives a glimpse of the future and the Mac’s eventual market share — merely “licensing” the Mac very likely would not have made a difference. In fact, in an alternate universe where Apple had licensed the Macintosh or Mac OS in the mid-80s, things could have ended up worse for Apple, as in bankrupt-and-out-of-business worse.

There are a few simple reasons why nearly everyone thinks Apple could have conquered the PC industry had they licensed the Mac:

  • The Macintosh was indisputably years ahead of every other PC platform in terms of user-interface design. The mouse pointer. The desktop metaphor. Overlapping windows. Icons. WYSIWYG word processing. Ten years later, every desktop computer in the world offered similar features; but in 1984, they were only on the Mac.

  • Apple licensed neither the Mac hardware nor software.

  • Microsoft licensed MS-DOS, and later Windows, to any IBM-compatible PC manufacturer who was willing to pay for a license.

  • At a basic conceptual level, from a user’s perspective, Windows works pretty much like the Mac OS.

  • It’s generally agreed that the first version of Windows that didn’t suck shipped in 1995, a decade after the arrival of the Mac.

  • Apple matured into a modestly profitable computer company. Macs account for about 5 percent of the computers in the U.S., and 2 percent world-wide.

  • Microsoft became the most fabulously profitable and successful corporation in the history of the world. Over 90 percent of the computers in the world run some version of Windows.

Given these facts, it’s not hard to see how the conventional wisdom came to be. Apple had a 5-10 year lead in terms of UI design; if only they’d licensed the Mac OS in the 80s…

But we need to stop right here, because if we want to be realistic, if we want to be even vaguely rigorous while playing this particular game of “What If?”, then we need to clarify exactly what Apple could have licensed in the mid-80s.

The operating system, of course, we might decide — because that’s what Microsoft did, and they made tens of billions of dollars doing so.

Except that Apple couldn’t just license the “Mac OS” (which wasn’t called “Mac OS” until the mid-90s) in 1984, because there weren’t any other computers that could run it. Much of the original Mac operating system was implemented in ROM, as hardware. The Mac’s designers didn’t do this to tie the operating system to Apple’s proprietary hardware; they did it because it was necessary in terms of price, performance, and the meager memory and storage they had available. Each 400 KB floppy disk had to store the System (to boot the Mac), whatever apps you wanted to run, and your data files. Every KB of the Mac Toolbox in ROM freed up another KB of space on your floppy disks.

Or consider the display. The Mac’s GUI depended on a 512-by-342 pixel monochrome display, capable of displaying text in the novel color scheme of black type on a white background. This, at a time when PC displays were typically used as character-based terminals displaying orange or green type on a black background, and could manage at best 320-by-200 pixels in four colors, or 640-by-200 in monochrome.
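(To put rough numbers on that gap, using my own back-of-the-envelope figures rather than anything from Apple’s documentation: a one-bit-per-pixel framebuffer at 512-by-342 needs about 512 × 342 ÷ 8 ≈ 22 KB, a sizable chunk of the original Mac’s 128 KB of RAM, while a CGA card’s 320-by-200 four-color mode fits in roughly 16 KB of video memory and offers nowhere near the resolution a WYSIWYG desktop demands.)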

In short, what were then called IBM-compatible PCs were technically incapable of providing a user experience even vaguely resembling that of the Macintosh. Apple could not have simply licensed the “Mac OS” to run on any existing personal computer platform. (Including their own wildly-successful Apple II platform; more on this later.)

Thus Apple couldn’t have merely licensed the operating system in the mid-80s. OK, then they should have licensed the entire platform to other hardware manufacturers. Admittedly this was possible, and, according to Jim Carlton’s Apple book, it was exactly what Apple’s executives considered but rejected. (Carlton’s book is seriously flawed in many ways [not the least of which is his conclusion that the company was on the verge of going out of business circa 1999], but it’s worth reading if taken with several grains of salt.) The idea was that Apple would license the Mac platform to a handful of big-name companies like Kodak, Motorola, and AT&T, not a wide-open licensing scheme where any company could decide to start making Mac clones.

Obviously, the idea was ultimately rejected. Apple was earning startlingly high profit margins on Mac hardware at the time, and they didn’t want to share. The competition from licensing clearly would have driven margins down, but the potential for increased volume could have driven overall profits up. Would it have worked? Maybe, but even in hindsight it can’t be deemed a sure thing, and claiming that it “definitely” would have worked is bluster. More to the point at hand, it was not at all like the software-only licensing plan that turned Microsoft into a goliath.

So what’s left in our bag of hypothetical “license the Mac in the ’80s” scenarios? Most quote-unquote “business analysts” (who are in large part responsible for the “Apple should have licensed the Mac” conventional wisdom) argue that Apple should have moved away from the hardware business entirely. Surely you’ve heard it argued that Apple ought to become “a software company”, by which they really mean “a company just like Microsoft”.

The idea being that the Mac’s relatively low market share, in the face of its superior usability and design, stems from the corporate market’s resistance, then and now, to buying proprietary hardware. Thus Apple should have developed and licensed a version of the Mac OS that ran on Intel PC hardware. Then you do a little hand-waving, and boom: Apple could have been Microsoft.

The first problem with this idea, as stated earlier, is that IBM-compatible PCs simply weren’t capable of providing a Mac-like user experience in 1984, and it was many years before they were. And by the time PCs were capable of providing a Mac-like experience, Microsoft’s MS-DOS was already entrenched as the monopoly OS.

The second problem is that it’s based on the dubious assumption that the corporate IT market is innately resistant to proprietary single-vendor hardware platforms, but has no reluctance whatsoever to tie themselves to proprietary single-vendor software platforms. That the corporate market has in large part chosen an open hardware platform (x86 processor IBM-compatible PCs) and a closed software platform (DOS/Windows) is quite possibly just the way things turned out, not necessarily the way things were destined to be.

My explanation is simply that the corporate market went with IBM-compatible machines at the start of the PC revolution because that seemed like the safest route. “No one ever got fired for choosing IBM” was the IT mantra of the ’70s and ’80s. Every subsequent step from that point onward was simply the path of least resistance. Switching to the Mac, or to any other hardware platform for that matter, would have been risky and expensive. It’s easier to do what everyone else is doing, and it’s easier to stick with what you already have.

If the purpose behind these hypothetical musings is to find a way Apple could have capitalized on the Mac to obtain a significantly larger long-term PC market share, Apple would have had to do something different early on. Even by the late ’80s, they would have been playing catch-up against Microsoft.

And so what “analysts” are really saying Apple should have done is not to have made the Macintosh at all, but rather to have written a GUI-based operating system compatible with the existing PCs of 1984. This is a big enough “what if” that it certainly could have significantly changed the course of computing history, but make no mistake: such an operating system would have borne very little resemblance indeed to the actual Macintosh we knew and loved.

And before you start nodding your head in yes, yes, that’s what Apple should have done agreement, it’s essential to note that such an endeavor would have turned Apple into a direct competitor to Microsoft, in the market Microsoft deemed most essential to dominate.

SURGEON GENERAL’S WARNING: Competing Directly Against Microsoft May Be Hazardous to Your Company

Such a plan to license a GUI operating system to run on Intel PC hardware may well have been Apple’s only and best chance to become an industry colossus sitting atop a Scrooge-McDuck-style mountain of gold, but that’s not to say it would have been a good chance.

The press — both mainstream and tech trade — long ago labeled “Apple-vs.-Microsoft” as one of the biggest rivalries in the industry. It’s not entirely fair to blame this solely on the press — Apple’s infamous and ill-fated “look-and-feel” lawsuit against Microsoft certainly fanned the flames.

But the truth is that Apple and Microsoft have seldom been direct competitors. Microsoft’s overriding goal for the past 25 years has been to develop and maintain an operating systems monopoly on the IBM-compatible PC platform. Apple has never entered that market, and therefore never threatened any of the markets Microsoft deemed essential.

Apple has, of course, offered their own rival hardware platform. But that’s indirect competition, not direct. Apple’s pitch has always been that you should buy an Apple computer, not that you should replace Microsoft’s OS with theirs. And I don’t think Microsoft has ever considered the Macintosh a serious threat to the IBM PC hardware platform.

Companies which compete directly against Microsoft have a tendency to go out of business quickly and gruesomely.

There is only room for one PC operating systems monopoly. It’s therefore much more likely that a plan from Apple to license a Mac-like operating system for PCs — at any point in industry history — would have resulted in Apple being crushed by Microsoft than it is that Apple would have replaced Microsoft as the dominant software vendor.

The truth is that there was no sure-fire way, even in hindsight, for Apple to turn the Macintosh into a Windows-style monopoly. But what we have arrived at is a more insightful analysis as to the key difference between Apple’s and Microsoft’s strategies through the ’80s and ’90s.

The conventional wisdom holds that the difference is that Microsoft went down the path of open licensing, whereas Apple chose to remain proprietary. Both statements are true, but they’re not the key to understanding the companies’ differences. That they ended up on different sides of the open/proprietary divide is an effect, not the cause.

The key difference is that Microsoft focused — intensely and purposefully — on parlaying each of their successes into bigger successes. They got lucky once, when they got IBM to agree to license MS-DOS as the operating system for the IBM PC. (I say they were “lucky” not to discount the shrewdness on the part of Bill Gates and his then-colleagues, but simply because IBM so vastly underestimated the importance of the OS.)

Microsoft parlayed their DOS command-line-era monopoly into the Windows GUI-era monopoly, then parlayed their Windows monopoly into the Office monopoly. In terms of Microsoft’s revenue and profits, everything other than Windows and Office is negligible. And both were parlayed entirely from MS-DOS.

Apple, on the other hand, seldom even attempted (let alone succeeded) to parlay any of their successes into further successes. The Apple II was a phenomenally successful platform. When you hear people state that Apple used to possess 15 to 20 percent market share in personal computers, they’re not talking about the Mac. (Or if they are, they’re misinformed, which is likely.) It was the Apple II that held such high market share, not the Mac.

According to Owen Linzmayer’s “Apple Confidential” [p. 19], Apple’s revenue from the Macintosh didn’t surpass that from the Apple II until 1986, and it almost certainly would have taken even longer if the Jobs-led management had not effectively abandoned the Apple II platform, a decision made not because the Apple II wasn’t popular, but despite the fact that it was.

The Macintosh was neither based on nor compatible with the Apple II. Neither platform could run software written for the other. The result is that there was no particular way for an Apple II user to switch to a Mac other than by buying all new hardware and all new software. (There was an Apple II add-on card available for the Mac LC in the early ’90s, but that was six or seven years after the Mac debuted.)

Whether this was a mistake or not is impossible to say. It’s not hyperbole to describe the original Macintosh as revolutionary — the entire desktop-metaphor GUI was utterly unlike the character-based terminal and command-line systems of the era. But that the Mac offered no text mode or command-line — no stdin, no stdout — wasn’t just artistic purity. If the Mac had offered such a mode, even just as a secondary option behind the GUI, the temptation would have been perhaps irresistible for third-party developers to write character-based “Mac” software that wasn’t the least bit Mac-like at all, but instead was pretty much just like the software they’d been writing for DOS and the Apple II.

Since Apple provided no such mode, developers who wanted to write software for the Macintosh had to write Mac-like software. But other than carrying the Apple brand name, the Macintosh was altogether dissimilar from the wildly popular Apple II. This was good for users, but hard for developers. Writing Mac software was both different and difficult.

In the ’90s, under John Sculley, Apple repeated this pattern of building a new platform that stood apart from an existing, popular platform with the Newton, and this time it clearly was a mistake. The biggest problem with the Newton wasn’t the size, or the price, or the initially piss-poor handwriting recognition. The biggest problem was that it didn’t really have anything at all to do with the Macintosh.

Yes, there was the Newton Connection Utility, which let you synch/export/import some data between your Mac (or Windows PC) and a Newton, and which you needed to use to install new and updated software on the Newton. But it stunk. It was always evident that the Newton was designed to stand alone; that its designers’ target was a future when you wouldn’t need a desktop computer, just your personal digital assistant. The biggest difference between the Palm Pilot and the Newton wasn’t the Pilot’s smaller size or lower price, but that the Pilot shipped with much more useful desktop software and better data synching capabilities.

The Newton introduced numerous wonderful UI design concepts. You never had to explicitly save or name anything; your notes, contacts, and other data were all just there. It was enough to make MPT weep for joy. But it didn’t really do anything to take advantage of the $3000 Mac on your desk. You couldn’t even mount it on your desktop.

The Palm Pilot (and later, Microsoft’s Pocket PC (née WinCE) platform) was designed and marketed as a computer peripheral, parlaying off the success of the PC. The Newton was designed and marketed not as a Mac or PC peripheral, but as the Next Big Thing, technology that would supplant the personal computer, not complement it. If the Newton had hooked to your Mac the way Palm Pilots and Pocket PCs connected to PCs, it would have been much more successful, possibly vastly so.

Apple’s goal should have been to make the Newton a must-have Mac peripheral. Instead, because Apple created only tenuous ties between the Mac and the Newton, the Newton was easy to do without.

It’s the Parlay, Stupid

Your not-so-humble business analysts state as fact that “the market” favors open platforms. But is that really so? It’s hard not to be blinded by the sun-like intensity of Microsoft’s staggering profits; but if you set aside Microsoft’s DOS/Windows/Office monopoly, where else is this adage proven?

Let’s look at console gaming platforms. Nintendo and Sony have built successful platforms that are almost completely closed. Not only do they build and sell their own proprietary hardware, but they require third-party developers to obtain permission and to pay for the privilege of selling games that run on their systems.

And so when Microsoft — patron saint of the “open platform” business analysts — entered the console gaming market with the Xbox, they licensed the hardware openly to other manufacturers, and they allowed third-party games to run without licensing fees.

Right?

No, of course not. The Xbox is every bit as proprietary as Nintendo’s and Sony’s gaming platforms.

I posit that the Windows monopoly is an anomaly, and exists only because of IBM’s decision to license the DOS operating system from Microsoft, rather than buying it or writing their own from scratch. Microsoft didn’t choose or decide the “open” nature of the IBM-compatible hardware business; they just went along for the ride and then took full advantage of their fortunate position. (Cf. Edison’s famous adage: “Opportunity is missed by most people because it is dressed in overalls and looks like work”; Microsoft worked its corporate ass off to build its monopolies.)

But while these gaming platforms contradict the idea that the market prefers open platforms, they reinforce the importance of the parlay. Nintendo has parlayed their initial success with the NES into a series of successful next-generation platforms: from NES to Super Nintendo to Nintendo 64 to GameCube. While these platforms were neither software- nor cartridge-compatible with one another, each was founded on the same familiar series of Nintendo-branded games and characters.

Sony followed up their initial success with the PlayStation with the PlayStation 2, a system that plays original PlayStation games: the ultimate compatibility parlay.

And the Xbox, although Microsoft’s initial foray into console gaming, is very much a parlay. Not off another console, but off PC gaming. The Xbox hardware is quite similar to Wintel PC hardware, and the Xbox programming APIs for game developers are very similar to those of Windows. In other words, Microsoft has made it easy for PC game developers to port their games to the Xbox.

Point being not that parlaying one successful platform into a new one is easy, or that it doesn’t require hard work and clever engineering, but simply that it’s easier, and more likely to succeed, than building a new platform from scratch.

Think Difference

Thus the difference between Microsoft and Apple wasn’t about open-vs.-closed; it was about pragmatism-vs.-idealism.

If Microsoft had been like Apple, Windows would have been something totally new, divorced from and incompatible with MS-DOS. It would have been much better — in the same way the Mac was “better” — but quite probably much less popular.

If Apple had been like Microsoft, the Mac would have been built on top of the Apple II platform and the Newton would have been designed as a Macintosh peripheral.

But that’s not what happened. For about a decade, Windows was merely a graphical shell that ran on top of MS-DOS. Most PC “power users” configured their machines to boot into a DOS command line; Windows only started after you typed “win” at a command prompt. Even after Microsoft moved Windows away from its DOS foundations, it never abandoned DOS software compatibility, and never encouraged a culture of strict adherence to proper graphical user interface guidelines.

The result is that a lot of early “Windows” software was just DOS software that ran inside a window and provided a menu bar and a few clickable buttons. Or else it was flat-out pure DOS software running inside a DOS compatibility window.

These decisions severely limited Windows, and crippled it in any objective side-by-side comparison to a Mac.

But it allowed DOS users — most especially including the gazillions of corporations that had bought IBM-compatible PCs — to migrate from DOS to Windows slowly and incrementally.

No matter what computer you currently used — IBM PC or Apple II — switching to the Mac meant buying all new hardware and all new software.

I’m not saying this was a mistake; I offer it only as an explanation of why Apple earns millions per quarter while Microsoft earns billions. But I don’t know that Apple should have done very much differently at all during the ’80s and early ’90s. The real Mac was revolutionary; a hypothetical Mac based on the Apple II would have been evolutionary.

That would have been the pragmatic approach, but Apple was an idealistic company staffed by idealistic employees. Read the stories about the development of the Mac at Folklore.org and it’s hard to imagine that team working on pragmatic, practical evolutionary improvements to the Apple II platform.

It’s also essential to remember that the entire concept of the personal computer was still relatively new in 1984. A brand-new platform, eschewing compatibility with existing software, was something that had appeared every few years up until that point. It wasn’t at all clear that a revolution like the original Macintosh couldn’t take off and supersede all existing competitors.

The upshot and point of all this being that the Mac’s low market share is quite possibly (and, to my mind, quite probably) unrelated to Apple’s decision not to license the Macintosh, but rather the result of their decision not to parlay off the success of the Apple II. And if they had limited the “Macintosh” to what could have been built upon the Apple II, or if the Mac had somehow offered full compatibility with Apple II software, then what they’d have built would have been something very different from what we know as the original Mac.

We might have ended up with something merely “new and improved”, which isn’t nearly as exciting as “insanely great”.