Four Legs Good, Two Legs Better

In George Orwell’s Animal Farm, when the pigs take over the farm and set up their workers’ paradise, the mantra of the revolution, repeated ad infinitum by a Greek chorus of bleating sheep, is “Four Legs Good, Two Legs Bad”. Which pretty much sums up the level of debate we’ve had in the camps of the Motorola/Macintosh and Intel/Microsoft alliances for the last two decades. It’s also a war that’s been fought on two fronts – from the mud-bogged trenches of the Mac/Windows jihadists to the free-flowing desert warfare of the Intel/Motorola skirmishes. And, as any general will tell you, a war fought on two fronts is bloody hard work, with the principal sufferers along the way being the confused and shell-shocked civilian population.

But one part of that war is heading for a conclusion: Apple is switching to Intel. Let me say that again: Apple. Is. Switching. To. Intel. It’s like watching Martin Luther walk up to the church in Wittenberg and nail a piece of paper to the door, only to find that, rather than the 95 Theses, it’s an advert for a lap-dancing club. So it’s probably time for a little reflection, not to mention the eating of crow. I’ll have ketchup with mine…

History and Heresy

It started in the early eighties, when the Motorola 680X0 series used by Apple was ranged against the Intel 8088/8086 and later 80X86 chips in contemporary PCs. At that stage, it was more of a civil war between two basically similar architectures, where the arguable functional elegance of the Motorola design was more often than not outweighed by the firepower of Intel’s production teams, who usually managed to get that little bit more out of their silicon – a theme that has continued ever since. The big shift came in the early nineties, when Apple went from the 680X0’s Complex Instruction Set Computer (CISC) to the spare minimalism of the Reduced Instruction Set Computer (RISC) in the Motorola PowerPC chips. This progression should have delivered, and in some areas did deliver, CPUs that were highly scalable, of low unit cost and very energy efficient, especially when compared to the CISC and hybrid complexities of Intel’s behemoth Pentium processors – an agile cavalry horse against an armoured war elephant. Of course that’s not the whole story – if you imagine the horse being kept in a locked stable and fed a diet of Big Macs while the elephant is being taught to tap-dance by a skilled team of choreographers, you get a little closer to the truth. And it’s time to drop that particular metaphor before it stands on me.

A few years ago, at dinner with a director of Intel, I confessed to having, on numerous occasions, written off the future of the Intel architecture as unscaleable, of high complexity and cost (keeping yield low and unit cost high) and having an inherently high power consumption, only to be repeatedly confounded by the efforts of the production engineers at Intel. He listened carefully to what I had to say, then turned to me, smiled, and said simply, “Our production director walks on water”.
OK, so he had the investment to work with from the massive economies of scale that come from producing the heart of 90%+ of the world’s PCs, but we’ll at least allow him and his teams the ability to skip lightly over a decent-sized pond.

Shock Troops of the Jihad

In all of this, we developers and technocrats have been the sheep at the rally – and I was laughing with the rest at Apple’s original “Toasted Bunny” Pentium II-bashing ad and engaging in the ignoble practice of slinging meta-mud from behind the barriers. I’m guilty as charged – I really couldn’t argue otherwise, both as an Apple user since 1980 and as someone who used to lecture on and evangelise the benefits of RISC architectures. The second of those followed a time of epiphany and wonder when I was given sole custody of one of the early Pyramid RISC machines and was able to blow away Rutherford Lab’s ICL/Fujitsu mainframe supercomputer with a system the size of a two-drawer filing cabinet that sat beside my desk and gently blew warm air over my sandalled toes and through my beard.


The paradox I’ve had is that I’ve always liked Intel as a company – they’re engineering-led, focussed, creative and good to work with. They were a usefully proactive investor in my last company (I may well have been the only person ever to turn up at an Intel CTOs’ conference with a Powerbook under his arm) and I’ve consulted to them. My argument was with the apparent brick walls facing their chip evolution, itself heavily constrained by the need for backward compatibility with the 80X86 instruction set, whereas Apple had freed themselves in one mighty bound by building the backward compatibility into the operating system, in one of the greatest and largely unsung feats of software engineering. That difference is now moot, though – the performance gap depends more on the demonstration chosen on a given occasion than on the properties of the processor, and there has been a long and well-documented history of Motorola, and now IBM, failing to deliver on either promises or needs. I first saw Mac OS X running on Intel in late 2000, and was sworn to secrecy on pain of excommunication from the Holy Sanctum of Infinite Loop but, given that Apple’s Darwin core runs very happily on the platform, it’s been one of the most open secrets in the industry that Apple has been keeping its options open. It’s now exercised that option, and yesterday saw Intel’s President and CEO, Paul Otellini, on stage at WWDC with Steve Jobs, thankfully to some decently enthusiastic applause from the assembled Macolytes.

Pragma, Principle and Irony

So what does it mean for those of us who just want to get on with doing our jobs (small j) with the best possible tools? Immediately, good news for all: Apple will benefit from economies of scale in processor procurement, from Intel’s production engineering expertise and from economies across the industry as it becomes easier for third parties to integrate their peripheral boards with the Apple architecture. Above all, it should give Apple a much-needed boost in the evolution of their laptops, a crucial product range which has effectively hit a brick wall through the Motorola/IBM failure to deliver G5s with acceptable thermal properties and power consumption. My own primary machine these days is my G4 Powerbook, which delivers very fine performance and abysmal battery life – quite the opposite of Powerbooks of old. So all good news for the consumer of product.

The hidden elephant here is of course the monopolisation of Intel’s place in the market – scarcely something to worry them in the short term but, on the principle that diversity promotes innovation, I do have a concern about what we get in the longer term. For the moment, I’m not concerned, if for no other reason than that every Intel engineer I’ve ever spoken with takes keeping with or ahead of the curve of Moore’s Law as a matter of personal and professional pride. In that longer outlook, though, there’s another emergent contender, one that’s possibly the ultimate irony for Apple – the games consoles, which are rapidly becoming consumer-priced multi-processor supercomputers, and all of which are based on good old IBM PowerPC technology. The irony of course is that just as Apple’s core platform gains ultimate credibility in the gaming arena, Apple itself moves the other way with alacrity. ’Twas ever thus, but I can see that, as the boundaries between purpose-specific devices and general-use computers blur, there will be plenty of architectural competition to keep Intel on their corporate toes.

I haven’t even touched here on the Mac/Windows argument – the Mac experience is not about the processor – but, as an aside, I am picking up a market vibe that is more considered in its pro-Mac OS leanings than any I’ve ever known: the number of individuals and companies I know who are switching and staying switched is the highest I can remember, and the whole experience of using Tiger is a (nearly) unalloyed pleasure. The adoption of Intel processors is not going to lead to Mac OS X on generic PC hardware, so any credibility gained is likely to be conceptual rather than empirical, but it all helps. Perhaps those murmurs are the sound of a turning tide? Of course, if Mac OS X were suddenly to become the OS of choice for the masses, I’d immediately have to go off and find another underdog to support. Now where did I put those BeOS CDs? To sum up the Intel resonance, though, we can turn once more to Animal Farm, to the point where the pigs sell the faithful old horse to the glue factory, stand up on their hind legs and take over the world, declaring solemnly that, “Four Legs Good, Two Legs Better”.

2 thoughts on “Four Legs Good, Two Legs Better”

  1. *does happy watching-Mac-users-eat-crow dance*
    Fab writeup though. Danny O’B was wondering how Steve would make the announcement without desperation or pique, and there was certainly a hint of both. On the other hand, it’s a very clear signal to any of Apple’s suppliers that nothing is indispensable, and this is probably a good thing overall for the company’s long-term future, though AAPL has taken a minor dive (probably because sales are going to suffer – I was seriously tempted to get a Powerbook in the next couple of months, but that’s going to have to wait a year unless I find a particularly tasty deal).
    I had always written off Tales Of Marklar in the past for various reasons, mainly because of potential customer confusion, but I’d forgotten about the fat binaries thing. Getting developers to make them is easy – they seem quite happy to jump through any hoops that Apple provides. And there’s the obvious VMware benefit too, which is another reason why I’m holding off on that PB.
    I was wondering why they couldn’t just go for the multicore chips that IBM is doing for the 360, but I didn’t realise the power difference, nor that the new IBM chips are radically different in certain ways (no Altivec, for a start). I’ve heard that Altivec-to-SSE3 is not going to be too painful.
    As for handing Intel even more of a monopoly, one Slashdot poster put it very well: you can be certain that regular meetings with AMD have been and will continue to be a fixture in the Apple diary.

  2. It’s just been pointed out to me that, in German, this is turning into phrases like, “Mac/Fenstern-Jihadists zur freifließenden Wüstenkriegführung”. Ouch. And apologies to the syllabically-challenged…
