It’s getting dark for Intel

Author’s note: while at first glance this article may seem far off the mark, the main reason these fumbles on Intel’s part didn’t cost the company more in market share was their anticompetitive tactics to push AMD out of the market by forcing OEMs’ hand. There was no judgement in court against Intel because they settled with AMD for $1.25 billion in 2009, which is a rather clear admission of guilt. Additionally, both the EU and US governments filed antitrust cases against Intel.

September 2000 – Intel’s blackest month in countless years

Following the official withdrawal of the 1.13 GHz PIII in the last days of August, in the wake of this loss of face it also became pretty obvious that Intel will have to ditch its grandiose plans of breathing new life into the dying P6 core with a 200 MHz FSB, a 0.13 micron process, and larger on-die L2 caches. It seems the Coppermine core (the last and most advanced modification of the half-decade-old P6 core, introduced with the Pentium Pro 150 MHz in the mid-1990s) simply won’t be able to go much further. While the AMD Thunderbird recently reached 1.1 GHz without breaking a sweat and still has plenty of headroom for future clock increases, the Coppermine can’t be produced in sufficient quantities even at the 1 GHz level – and the 1.13 GHz version no longer even works properly.

 

Celeron’s throat cut by Intel’s own incompetence

But it’s not only the higher-end consumer and corporate market where Intel is losing ground every day. In the lower-end range it looks even worse, as virtually all hardware reviewers have come to the same conclusion in the last couple of weeks: the new AMD Duron delivers tough competition to Intel’s Pentium III at prices lower than even the Celeron’s. What chance does the Celeron still have on the market under these conditions? This cheap processor was a superb alternative to the PII/PIII back when the only non-Intel competition consisted of flawed and slow Socket 7 processors like the AMD K6-x series and the Cyrix II/III, around which the Celeron could run circles. Nowadays the competition includes a Duron that has about the same lead over the Celeron as the Celeron had over the K6-2.

Intel’s questionable marketing tactic of crippling processors to create different categories – instead of developing different designs – now yields “fruit”: the whole concept threatens to collapse. In the era of slow, FPU-weak Socket 7 competition, Intel could get away with a CPU offering too low an FSB speed and too small an L2 cache (especially since, from time to time, the Celeron’s clock speed was close on the heels of the PII/PIII’s – something you can no longer say about the new Coppermine Celerons). But now that the even cheaper low-end competition offers a much stronger FPU, a 4x larger L1 cache, and a 3x faster FSB (with an up to 2x faster memory bus), the Celeron’s days are numbered.
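To put those ratios in perspective, here is a minimal sketch using the commonly cited specs of the two chips – the exact figures (32 KB vs. 128 KB of L1, 66 MHz vs. 200 MHz effective FSB) are my own assumption, not numbers stated in this article:

```python
# Rough ratio check, assuming the commonly cited figures for the
# Coppermine-128 Celeron and the original (Spitfire) Duron.
specs = {
    "Celeron (Coppermine-128)": {"l1_kb": 32,  "fsb_mt_s": 66},
    "Duron (Spitfire)":         {"l1_kb": 128, "fsb_mt_s": 200},  # 100 MHz double-pumped bus
}

cel = specs["Celeron (Coppermine-128)"]
dur = specs["Duron (Spitfire)"]
print(f"L1 cache: {dur['l1_kb'] / cel['l1_kb']:.0f}x larger on the Duron")
print(f"FSB:      {dur['fsb_mt_s'] / cel['fsb_mt_s']:.1f}x faster on the Duron")
```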

 

Timna laid to rest

Descending further down the ladder, Intel managed to stumble in the very lowest-end market as well. Their system-on-a-chip design, Timna, was canceled in the last days of September 2000. Unlike the Cyrix Media GX, which has been successfully used in various desktop and mobile lines by Compaq, its Intel counterpart Timna came too late, and being bound to an overpriced memory standard, RDRAM, made it more expensive than a standard Celeron-based PC with, say, a fully integrated SiS chipset. With the introduction of the Micro ATX and Flex ATX standards, as well as the spread of graphics and sound integration into chipsets, Timna became not only too expensive but also obsolete on every level, not least because of its inflexibility.

 

Servers galore?

Let’s jump to the other end of the spectrum, the highest-end workstation and server area. It doesn’t look much better there either. The first 64bit Intel processor ever – current codename Itanium, original codename for many years Merced – has been stopped before release. It seems the first 64bit CPU actually coming out of Intel will be their second-generation design, codenamed McKinley.

No wonder the Itanium died before birth. Not only was it to be a pure 64bit CPU, meaning that all current 32bit applications would run far slower than on a current 32bit CPU (something AMD’s x86 Hammer design wisely avoids by incorporating an additional 32bit core into their 64bit CPUs), it also had yield problems above 600-700 MHz, and at 800 MHz it ran into basic functionality problems. In light of the Sun UltraSPARC III, which is going to reach 900 MHz within the next few months, the Itanium would never have stood a chance.

A good question is how much of a chance the next-generation 64bit Intel CPU will have. By that time AMD’s Hammer might be out as well, and Sun will be on its way to the UltraSPARC IV. It’s sheer luck for Intel that Sun will bring out the UltraSPARC V supposedly about two years later than originally planned. Were that CPU to arrive on its original schedule, it would debut at the same time as the first – comparatively pitiful – 64bit CPU from Intel.

It’s not the first time Intel has failed to deliver a product in the server market. The Xeon was to be the most powerful 32bit CPU line. In summer 1998, just before its release, Intel’s press releases were full of hype about the Xeon being superior in every way to the Pentium Pro, because it could even do 8-way multiprocessing. The reality was once again a different matter: the first Xeons capable of 4-way multiprocessing appeared nearly half a year after the original launch – the Xeons shipped in the first months were only dual-SMP capable. As for the 8-way capable Xeons, they came more than a year too late. The first shipping servers with eight 550 MHz Xeons arrived in early 2000, when the Intel Pentium III and the AMD Athlon had already hit the 800 MHz line.

And now, at the beginning of the fourth quarter, 8-way (or for that matter, even 4-way) Xeons are still running only on the AGP-less 450NX chipset at a 100 MHz FSB, with a maximum clock frequency of 700 MHz. Yes, there are PIII Xeons that are running on a 133 MHz FSB (with RDRAM) on AGP-enabled chipsets, with clock speeds of up to 1 GHz – but only for 2-way SMP. Definitely not what a server with a heavy workload needs.

 

Pentium 4 – savior or the last failure?

Intel’s last hope to re-conquer the CPU market lies in their brand new Pentium 4, the first new 32bit (IA-32) CPU design since the Pentium Pro 150 MHz back in the mists of time. But some aspects of the P4’s design might pre-program it for failure. Most prominent of all is its very deep pipeline, which makes it easier to increase clock speeds – but on the other hand causes a large performance hit in clock-for-clock comparisons with less deeply pipelined CPUs. This design might have been conceived at a time when Intel was the undisputed processor king and, in the absence of competition, the only measure buyers could go by was the CPU’s clock frequency. That’s why Intel’s original plan was to stop the Pentium III at 1 GHz and start the Pentium 4 at 1.5 GHz, to cover up the fact that, clock for clock, the Pentium III is easily 20-30% faster than its successor.
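A quick back-of-the-envelope sketch of what that clock-for-clock gap means in practice – the 20-30% figure comes from the paragraph above, everything else is assumed purely for illustration:

```python
# If the PIII does 20-30% more work per clock, how does a 1.5 GHz P4
# translate into "PIII-equivalent" clock speed? (Illustrative estimate only.)
p4_clock_ghz = 1.5

for piii_per_clock_advantage in (0.20, 0.30):
    equivalent_piii_ghz = p4_clock_ghz / (1 + piii_per_clock_advantage)
    print(f"PIII {piii_per_clock_advantage:.0%} faster per clock -> "
          f"a {p4_clock_ghz} GHz P4 performs roughly like a {equivalent_piii_ghz:.2f} GHz PIII")
```

In other words, under those assumptions the 1.5 GHz launch part would land somewhere between a 1.15 and a 1.25 GHz Pentium III.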

But now, with the AMD Athlon scaling nearly as easily and already beating the Pentium III at similar clock speeds, this deception by Intel will backfire. Instead of people being amazed at how quickly the 1.5, 1.6, and 1.7 GHz P4s are introduced one after the other, they’ll read the reviews on the hardware sites and see that the 1.5, 1.6, and 1.7 GHz Athlons – most probably released not much later than their P4 counterparts – perform much better and cost a lot less.

Intel’s last hope for the Pentium 4 is its quad-pumped (4×100 MHz) front side bus, offering theoretically twice the bandwidth of current Athlon systems (200 MHz). They want to feed it with RDRAM memory, but with the current PC800 RDRAM design they’d need dual-channel RDRAM to match the 400 MHz FSB. That means not only insanely high memory costs, but also chipset and mainboard costs significantly higher than those of a current SDRAM-based (or, later on, DDR SDRAM-based) Athlon mainboard. And if they use single-channel RDRAM, the memory bandwidth of Pentium 4 systems will be lower than that of the upcoming Athlon platforms with PC266 DDR SDRAM, with increased latencies on top. So Intel either uses dual-channel RDRAM and ends up with system costs so high that the Pentium 4 will have no chance in the consumer market and will be forced into the workstation segment only, or they use single-channel RDRAM and end up not only with somewhat higher system costs than a DDR-equipped Athlon system, but also with lower performance across the board.
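For reference, a minimal sketch of the peak-bandwidth arithmetic behind that argument – the bus widths and memory figures are my own assumptions (64-bit FSBs and SDRAM/DDR buses, 16-bit PC800 Rambus channels), not numbers taken from the article:

```python
# Theoretical peak bandwidths (GB/s) for the buses discussed above.
def peak_bandwidth_gbs(width_bits, transfers_mhz):
    """Peak bandwidth in GB/s for a bus of the given width and transfer rate."""
    return width_bits / 8 * transfers_mhz * 1e6 / 1e9

buses = {
    "Pentium 4 FSB (4 x 100 MHz, 64-bit)":  peak_bandwidth_gbs(64, 400),
    "Athlon FSB (2 x 100 MHz, 64-bit)":     peak_bandwidth_gbs(64, 200),
    "PC800 RDRAM, single channel (16-bit)": peak_bandwidth_gbs(16, 800),
    "PC800 RDRAM, dual channel":            peak_bandwidth_gbs(32, 800),
    "PC133 SDRAM (64-bit)":                 peak_bandwidth_gbs(64, 133),
    "PC266 DDR SDRAM (64-bit)":             peak_bandwidth_gbs(64, 266),
}

for name, gbs in buses.items():
    print(f"{name:40s} ~{gbs:.1f} GB/s")
```

Under these assumptions, only dual-channel PC800 matches the P4’s 3.2 GB/s FSB, while a single channel (1.6 GB/s) falls short of PC266 DDR (about 2.1 GB/s) – exactly the cost-versus-performance trap described above.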

And the most inexplicable design failure of the Pentium 4 is not even chipset/memory related. This brand new architecture has a considerably weaker FPU than its predecessor, the Pentium III/Celeron. This “glitch” might be the ultimate downfall of this CPU at a time when increased clock frequencies are needed mostly for complex calculations like 3D modeling and 3D games. Nobody has really needed increased integer performance for years. That was one of the reasons the Super 7 processors could never really break Intel’s hegemony. Although a 400 MHz AMD K6-2 offered more than double the integer performance of a 200 MHz Intel Pentium, at twice the clock frequency it still had lower FPU performance than the older Intel competitor. That meant a significant performance loss in number crunching such as 3D games, and no noticeable performance increase in standard apps, where the chipset and memory have a much larger impact on performance than processor speed.

 

Chipsets – from the top to the bottom in just over a year

The Pentium 4 already has plenty of troubles of its own as a processor, but now it’s been further delayed by serious failures in the only available P4 chipset, the i850 (codename Tehama). It is but one in a whole series of chipset fiascos over the last 12-18 months.

Up to early 1999, Intel had been known for many years as the manufacturer of the fastest and most stable chipsets on the PC market. In the classic Socket 7 era, even people who opted for an AMD K5/K6 CPU mostly chose an Intel-chipset-based mainboard to go with it. No serious hobby system builder wanted to play around with VIA’s or ALi’s notoriously low-quality and incompatible chipsets; those mostly ended up in the lowest-quality discount PCs.

This situation has been changing rapidly since about mid-1999. Although VIA chipsets are still a far cry from being as stable as an Intel LX or BX, AMD succeeded in releasing a chipset last fall that could go toe to toe with the BX in stability. And the biggest impact on this shift came from Intel itself. It started with the questionable move of bullying mainboard manufacturers into buying an obsolete would-be 3D accelerator (the i740) in a bundle if they wanted to receive enough chipsets; Intel later gave up on this tactic, integrated the same (by then even more obsolete) graphics core into a new chipset (the i810), and eliminated the possibility of an external AGP slot – and thus began the trip to the bottom for Intel chipsets.

A very embarrassing fiasco was the i820 (Camino), whose first generation of mainboards had to be scrapped and whose second generation was recalled, both because of basic stability problems. Now it continues with the i850, which should have been in mass production by now but has been stopped because of heavy performance problems with 3D graphics.

 

The last rat remaining on the sinking ship: Rambus

Most of the problems with the i820 chipset, as well as with the future market acceptance of the Pentium 4, originate from Intel’s alliance with Rambus. They wanted to rule the DRAM market, but lost their bid for supremacy. As the market acceptance of RDRAM sinks every month, Rambus has shifted its activities from any kind of productive work to suing everyone in sight, trying to spread fear among the DRAM and chipset makers. It’s amazing that such childish behaviour is accepted in modern business life at all (on second thought, considering what Microsoft did over the last decade to achieve what they achieved, such tolerance of Rambus’ practices isn’t that surprising any more).

 

Afraid of the dark

With winter approaching, it’s getting dark and cold, even in sunny California. Especially dark and cold for Intel, which has been enjoying a lazy sunbath for far too long.

No wonder that at the last Intel Developer Forum the main emphasis was, for the first time ever, not on processors but on networking products. Are we about to lose a major bully from the CPU boxing ring?
