When Intel announced that the company was working on Larrabee as its next-generation graphics part, almost everybody thought that Intel would kill ATI/nVidia with ease. After all, the company had knocked AMD off its feet with the Core architecture, and Intel felt as secure as ever.
Over the course of the last couple of years, I have closely followed Larrabee through on- and off-the-record discussions with a significant number of Intel employees. As time progressed, the warning lights stopped being blips in the distance and became big flashing lights right in front of our faces. After discussing what happened at the Intel Developer Forum and the Larrabee demo with Intel's own engineers, industry analysts and the like, there was no point in going back.
This article is a summation of our information on Larrabee: hundreds of e-mails and chats, a lot of roadmaps and covert discussions. When we asked Intel's PR about Larrabee, the comment was that this story was "invented" and has nothing to do with the truth. We were also told that our sources were "CRAP", which we duly forwarded to the sources themselves. We will cherish the comments that ensued for the remainder of our days, including a meeting that followed the comment "since [Intel] PR claims we don't work on LRB, this is a blue cookie". There were also some questionable statements about our integrity, but here at Bright Side of News* we are going to continue doing what we have done in the past - disclose information regardless of how good or bad it is. We hope that it is good, but if it's not - don't expect us to stay put.
Unfortunately for the PR, marketing and sales divisions, every company owes its existence to the engineers who pour their hearts into projects; if it weren't for them, you would not have chips with hundreds of millions, and now billions, of transistors in them. Engineers don't speak in Ronspeak, but rather are quite open. This is what we would call a real inconvenient truth.
For a company of Intel's stature, we did not expect a project such as Larrabee to develop in the way it has. In fact, according to information gathered over the years, LRB just doesn't look like an Intel project at all [read: sloppy execution, wrong management decisions]. The amount of leaks we received over the course of the past years simply surprised us; on several occasions, we had the opportunity to see internal roadmaps and hear about frustrations over unrealistic expectations from management. First and foremost, the release dates: Intel's internal roadmaps originally cited "sampling in 2008", then "release in Q4 2008", then "release in 2009". By summer 2008, we saw "mid-2009" and "H2 2009", changing to "2010" and "H2 2010", and after a recent conversation with several engineers who mentioned "we need another 13-18 months to get [Larrabee] out", the time [unfortunately] came to complete this story.

The Road to Larrabee
In a lot of ways, Larrabee had issues even before Intel committed to the project. Intel pondered LRB for years, but the CPU-centric company was in love with the idea of a Pentium 4 reaching 10 GHz [the double-pumped ALU would then work at 20 GHz] that would be strong enough to power graphics as well. This was the same line of thought Sony took with the IBM Cell processor - originally, the PlayStation 3 was supposed to have two Cell CPUs at 4 GHz each. After that idea went bust, nVidia got the hot potato of making the PS3 graphics chip with less than 12 months to launch. With Larrabee, we can't exactly call Imagination Technologies the savior.
Just like the Cell peaked at 3.2 GHz and had numerous issues [bear in mind that every shipped PS3 has eight SPE units, of which only seven are used], Intel had an internal war over Tejas [the successor to the Pentium 4 and the next step of the NetBurst architecture]. The fallout in the company was huge - even Intel's die-hards realized that the NetBurst architecture had turned into a NetBust, and free rein was given to the Israeli team that worked on the Pentium M processors [the central part of the Centrino platform]. During those wars, we heard the slogan "Megahertz is Megahurtzing us", especially in sales discussions against a very hot topic at the time, the AMD Opteron. There is also an internal story that touches on the power struggle happening at the time, but we will leave that one for another time. Let's just say that there was a reason why a certain executive started pushing Larrabee like there's no tomorrow.
After the CPU side of the company put the finishing touches on the Core architecture, Intel was sure that Core was the ticket to ride and started turning its focus to becoming a real platform company through products as well, not just PowerPoint slideware. Paying a significant fee to a licensing company [Imagination Technologies] so that you can use its power-efficient but "performance-shortfalling" technologies [PowerVR] in low-end netbook and notebook chipsets is somewhat ludicrous if you're a company that designs ASICs [Application-Specific Integrated Circuits] that sell in hundreds of millions of units per year. Now, the performance shortfall is not exactly Imagination Technologies' fault, as we all thought. The problem is that Intel hired a third-party vendor called Tungsten Graphics [now a wholly owned subsidiary of VMware Inc.] to create the drivers for these parts. The problem with those drivers is the fact that "GMA500 suffers from utterly crappy drivers. Intel didn't buy any drivers from Imagination Technologies for the SGX, but hired Tungsten Graphics to write the drivers for it. Despite the repeated protest from the side of Imagination Technologies to Intel, Tungsten drivers DO NOT use the onboard firmware of the chip, forcing the chip to resort to software vertex processing."
There you have it, folks: the reason why "Intel graphics sux" is not exactly the hardware, but rather dubious political decisions that left a VMware subsidiary writing drivers which force the CPU to do the work. We remember when Intel used this as a demo of the performance difference between Core 2 Duo and Quad, but the problem is that it's one thing to demonstrate it, and another to shove it onto your buyers.
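To make the software-versus-hardware vertex processing distinction concrete, here is a minimal sketch - assuming a Windows machine with the DirectX 9 SDK, and using only the standard Direct3D 9 caps query rather than anything from Intel's or Tungsten's actual driver code - of how an application can check whether a driver exposes hardware vertex processing at all. On a driver that never enables the chip's vertex hardware, the D3DDEVCAPS_HWTRANSFORMANDLIGHT capability bit is simply absent, and every vertex is transformed on the CPU.

```cpp
// Minimal Direct3D 9 caps query: does the driver offer hardware
// vertex processing, or will the CPU be doing the transform work?
// Build on Windows with: cl checkvp.cpp d3d9.lib
#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) {
        std::printf("Direct3D 9 is not available.\n");
        return 1;
    }

    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
        std::printf("No HAL device - no hardware acceleration at all.\n");
        d3d->Release();
        return 1;
    }

    // If the driver never wires up the chip's vertex hardware, this cap is
    // missing and a device can only be created with
    // D3DCREATE_SOFTWARE_VERTEXPROCESSING, i.e. the CPU does the work.
    if (caps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT)
        std::printf("Hardware vertex processing is available.\n");
    else
        std::printf("Software vertex processing only - the CPU transforms every vertex.\n");

    d3d->Release();
    return 0;
}
```

Games routinely perform a check like this at startup, which is why a part whose driver leaves the vertex hardware idle gets treated - and benchmarked - as a software-transform device, no matter what silicon actually sits underneath.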
Secondly, Intel decided to use the PowerVR SGX535 core, which can be found in Apple's iPhone, the Nokia N900, the Sony Ericsson XPERIA and similar smartphones. Had the company opted for the SGX545, we would probably have seen netbook and notebook parts [the ones based on the mobile 945 chipset] beating the living daylights out of Intel's desktop chipsets, which use Intel's old GenX graphics hardware.
As a consequence of underpowered hardware and questionable driver decisions, Intel was always ridiculed by game development teams and cursed at whenever a publisher would demand that a game support Intel's integrated graphics - resulting in the paradox of having the best CPU and the worst GPU [several key developers warned us never to describe GMA graphics as a "GPU"]. In fact, during the preparation of this story, an affair involving GMA graphics and driver optimizations in 3DMark Vantage broke out, courtesy of Tech Report. Tim Sweeney of Epic Games showed no courtesy toward Intel's graphics capabilities, commenting that "Intel's integrated graphics just don't work. I don't think they will ever work." But even that statement could be considered courteous compared to his later one: "[Intel] always say 'Oh, we know it has never worked before, but the next generation ...' It has always been the next generation. They go from one generation to the next one and to the next one. They're not faster now than they have been at any time in the past."
Bear in mind that Intel has also opened its doors to developers: Tim Sweeney is now knee-deep in creating Unreal Engine 4.0, one of the first engines that will utilize Larrabee... when it gets out. Naturally, direct access to the hardware will apply to AMD and nVidia graphics too, with the Fermi architecture now supporting native C++ code execution. But that's another story...
The idea of Larrabee finally came to fruition towards the middle of this decade. Intel started to hire people around the globe, with a lot of focus on sites in Oregon, California and Germany. In order to build a highly complex chip, you have to rely on several teams working on various aspects, and this might well be the reason for the current state of the Larrabee project and the slippage in the roadmaps by some two years, with the potential for further slips.