Intel's Larrabee Architecture Disclosure: A Calculated First Move
by Anand Lal Shimpi & Derek Wilson on August 4, 2008 12:00 AM EST - Posted in GPUs
Things That Could Go Wrong
I had to write this section because, as strongly as Intel has been executing these past couple of years, we must keep in mind that in the GPU market Intel isn't only the underdog; it's going up against the undefeated. NVIDIA is the company that walked into 3dfx's house and walked away with its IP, the company that could be out-engineered and outperformed by ATI for an entire year and still emerge dominant. This is Intel's competition: the most Intel-like of all the manufacturers in the business, and a highly efficient one at that.
Intel may benefit from the use of its advanced manufacturing fabs in making Larrabee, but it is also burdened by them. NVIDIA has been building GPUs, some quite large, without ever investing a dime in its own manufacturing facilities. There's much that could go wrong with Larrabee; the short list follows.
Manufacturing, Design and Yield
Before we get to any of the GPU-specific concerns about Larrabee, there are the basics that apply to making any chip. There's always the chance that the design could be flawed, that it might not reach the right clock speeds or deliver the right performance, or that it might not yield well enough. Larrabee has a good chance of being Intel's largest die produced in desktop-like volumes; while Intel is good at manufacturing, we can't rule these out as concerns.
Performance
As interesting as Larrabee sounds, it's not going to arrive for at least another year. NVIDIA should have even higher performing parts out by then, making GT200 look feeble by comparison. If Intel can't deliver a real advantage over the best from NVIDIA and AMD, Larrabee won't get very far as little more than a neat architecture.
Drivers and Developer Relations
Intel's driver team is hardly its strong point today. On the integrated graphics side we continue to see tons of issues; even as we test the new G45 platform we're still bumping into driver problems, and we're hearing, even from within Intel, that the IGP driver team leaves much to be desired. Remember that NVIDIA as a company is made up mostly of software engineers; drivers are paramount to making a GPU successful, and Intel hasn't proved itself here.
I asked Intel who was working on the Larrabee drivers; thankfully, the current driver team is hard at work on the current IGP platforms and not on Larrabee. Intel has a number of its own software engineers working on Larrabee's drivers, as well as a large team that came over from 3DLabs. It's too early to say whether this is a good thing, and we have no idea of Intel's capabilities from a regression testing standpoint, but great architecture or not, drivers can easily decide the winner in the GPU race.
Developer relations are also very important. Remember the NVIDIA/Assassin's Creed/DirectX 10.1 fiasco? NVIDIA's co-marketing campaign with nearly all of the top developers is an incredibly strong force. While Intel has the clout to be able to talk to game developers, we're bound to see the clash of two impossibly strong forces here.
101 Comments
DerekWilson - Monday, August 4, 2008 - link
this is a pretty good observation ...but no matter how much potential it has, performance in games is going to be the thing that actually makes or breaks it. it's of no use to anyone if no one buys it. and no one is going to buy it because of potential -- it's all about whether or not they can deliver on game performance.
Griswold - Monday, August 4, 2008 - link
Well, it seems you don't get it either.

helms - Monday, August 4, 2008 - link
I decided to check out the development of this game I heard about ages ago that seemed pretty unique, not only the game but the game engine for it. Going to the website, it seems Intel acquired them at the end of February.
http://www.projectoffset.com/news.php
http://www.projectoffset.com/technology.php
I wonder how significant this is.
iwodo - Monday, August 4, 2008 - link
I forgot to ask, how will the software renderer work out on the Mac? Since all DirectX code is run through the software renderer, doesn't that fundamentally mean most current Windows-based games could run on the Mac with little work?

MamiyaOtaru - Monday, August 4, 2008 - link
Not really. Larrabee will be translating DirectX to its software renderer. But unless Microsoft ports the DirectX API to OS X, there will be nothing for Larrabee to translate.

Aethelwolf - Monday, August 4, 2008 - link
I wonder if game devs can write their games in DirectX, have the software renderer convert them into Larrabee's ISA on the Windows platform, and capture the binary somehow. Distribute the DirectX version on Windows and the software ISA version for the Mac. No need for two separate code paths.

iwodo - Monday, August 4, 2008 - link
Can anyone point out which of the assumptions Anand makes are false? That would be great, because what he is saying simply seems too good to be true. One point to mention: the 4MB cache takes up nearly 50% of the die size, so if Intel could rely more on bandwidth and save on cache, they could put in a few more cores.
And am I the only one who thinks 2010 is far away for an introduction? Summer 2009 seems like a much better time; then they would have another 6 - 8 months before moving to 32nm with higher clock speeds.
And as for the game developers, with the cash Intel has, 10 million for every high-profile studio like Blizzard and 50 million to EA to optimize for Intel would only cost them 100 million in pocket money.
ZootyGray - Monday, August 4, 2008 - link
I was thinking of all the P90s I threw away - could have made a CPU sandwich, with a lil peanut software butter, and had this tower of babel thing sticking out the side of the case with a fan on top, called lazarus, or something - such an opportunity to utilize all that old tek - such imagery.

griswold u r funny :)
Griswold - Monday, August 4, 2008 - link
You definitely are confused. Time for a nap.

paydirt - Monday, August 4, 2008 - link
STFU Griswald. It's not helpful for you to grade every comment. Grade the article if you like... Anandtech, is it possible to add an ignore user function for the comments?