With Apple switching over to the x86 family, we need to ask ourselves this question: Exactly how unhealthy is a one-chip monoculture for desktop computing?
Monocultures have never been a good thing in agriculture or genetics, or in society for that matter.
In society, if everyone thinks exactly alike and agrees with everyone else, what do you end up with? Lemmings. A corporation filled with yes-men is an example of this sort of problem.
Until recently, we've always had competitive chip architectures on the desktop.
In the earliest days of the desktop revolution, around 1978, a number of chip alternatives emerged, and machines were built around each of them. The Z-80, the 8080, the 8085, the 6502, and the 6800 come to mind. Over time, this devolved into two chip families for the low-end PC business and a few different chips for high-end Unix workstations.
Until recently, it could be argued that the x86, PowerPC, SPARC, MIPS and Alpha chips were all in the mix. Then came Intel's grand announcement that it was sick of the x86 architecture it had invented and was going to change the game by introducing the be-all, end-all chip of chips — the Itanium!
When this chip was announced, years in advance of its actual design and production, there was the most amazing carnival I've ever seen.
Every company jumped on board and declared that this chip was indeed the future, the one they would use for sure. This included Sun, which said it would move off its own SPARC chips and onto the Itanium.
This dislocation was particularly disastrous for the Alpha chip, which was arguably the most powerful chip of its day and would probably still be so if it were not for the Itanium bandwagon.
Anyway, the Alpha chip is now dead because of it. But almost all the other chips are dead too, as companies have actually fallen backward in time and have adopted the old x86 architecture in its new suit of multicore screamers with extended instruction sets.
But in a market dominated by picky engineers, how long can this last? And if it doesn't last, would it mean we have the potential for a platform shift?
The desktop scene had entered stasis with two platforms and two processor camps. But Apple's move has broken that stasis.
We still have two platforms, but only one processor camp, which means that eventually there should be only one platform, since software written for either platform will run on the same chips. However that shakes out, it will stifle innovation.
So I'm looking for a new platform to arrive, and there is already talk about the new IBM/Sony/Toshiba Cell chip becoming a PC platform.
The PlayStation 3 is the target product for this chip. The target market, of course, is console gaming. But the development of the Cell chip seems like a lot of design work just for a game console, if you ask me.
If the chip ends up doing nothing besides running the PS3, then it has to be the most decadent semiconductor in history. And the fact is this chip will probably crop up experimentally in specialized computer systems of some sort. Then it could make a run at the desktop.
The way this works is that one of those picky engineers, cited above, will be sitting at home reading the spec sheets of the chip and how it works.
A light bulb will go on over the engineer's head upon spotting some oddball capability that few people understand. The engineer will recognize it, though, because he or she will have been wrestling with exactly the problem the chip can now solve.
Now we're off to the races, as this same scenario happens all over the place.
Then again, the Cell chip may be a false start rather than the trigger for a platform shift, and continued consolidation will be our future instead. How boring!
We'll see these paths clearly in the next few months as Apple finalizes its shift and as the PS3 emerges from the lab.
Copyright © 2006 Ziff Davis Media Inc. All Rights Reserved. Reproduction in whole or in part in any form or medium without express written permission of Ziff Davis Media Inc. is prohibited.