NVIDIA Analyst Day: Jen-sun Goes to the Mat With Intel
by Derek Wilson on April 11, 2008 12:00 AM EST - Posted in GPUs
Intel's Graphics Performance Disadvantage
It is no secret that Intel's integrated graphics are very, very slow and nearly useless for most modern 3D graphics applications. So when Intel says we should see their integrated graphics parts increase in performance 10x by 2010, we should get excited, right? That is much faster than Moore's Law in terms of performance over time (we should only expect a little more than 2x performance over a two-year period, not an order of magnitude).
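For the curious, the arithmetic behind that parenthetical is straightforward; we're assuming the textbook 18-to-24-month doubling cadence here, which is our assumption rather than a figure from NVIDIA's presentation:

\[
2^{24/24} = 2\times \quad\text{to}\quad 2^{24/18} \approx 2.5\times
\]

In other words, two years of Moore's Law buys you roughly 2x to 2.5x, nowhere near the 10x Intel is promising.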
The problem with this situation, as noted by NVIDIA, is that today's Intel integrated graphics parts are more than 10x slower than today's affordable discrete graphics parts.
Looking at that chart, we can see that Intel integrated graphics will be competitive with today's sub-$100 hardware in 3DMark06 by the year 2010. With four-year-old games, NVIDIA's current hardware will still blow away Intel's 2010 integrated solution, and the margin climbs even higher if you look at a modern game. NVIDIA claims that these tests were done with low quality settings as well, but we can't speak to that as we weren't the ones running the tests. If that's the case, then the differences could be even larger.
The bottom line is that Intel doesn't have a solution that is within reach of current graphics hardware even if it could produce 10x performance today. In two years, after NVIDIA, AMD, and even S3 have again at least doubled performance from what we currently have, there's no way Intel can hope to keep up.
And NVIDIA wasn't pulling any punches. Jen-sun went so far as to say: "I wouldn't mind it if sometimes they just say thank you – that it's possible to enjoy a game on an Intel microprocessor [because of NVIDIA graphics hardware]." This is certainly an audacious statement of the facts, even if it happens to be the truth. AMD can take some of the credit there as well, of course, but the point is that Intel couldn't make it on its own as a gaming platform today without the help of its competitors. Believe me when I say that we are trying to put a neutral spin on all this, but Jen-sun was out for blood today, and he absolutely hit his mark.
To prove the point further, they went to the data Valve has been collecting through Steam. This data must be taken with the proper dose of salt – it doesn't reflect the entire GPU market by any means. The Steam data reflects a specific subset of GPU users: gamers. Even more to the point, this is a specific subset of gamers: gamers who play games using Steam and who chose to report their system information (anonymously, of course). Yes, the sample size is large, but this is by no means a random sample of anything and thus loses some points in the statistical accuracy department.
Steam data indicates that NVIDIA GPUs sit in 60% of Steam gamers' boxes, while Intel GPUs appear in only 2% of those surveyed. As Jen-sun pointed out, this isn't about juggling a few percentage points: gamers who use Steam clearly do not use Intel GPUs to play their games. The picture gets even grimmer for Intel when you look at DX10-specific data: NVIDIA has about 87% of the GPUs running in DX10 boxes while Intel powers only 0.11% (about which Jen-sun said, "I think that's just an error," and he may well have been right). He was on target when he said that in this case "approximately zero seems statistically significant."
The implication is clear: while few gamers use Intel GPUs in the first place, gamers aren't using Intel GPUs for DX10 gaming at all. We certainly buy that, as Intel does not even have a DX10 part on the market right now, but again, this data is not a statistically accurate representation of the entire gaming population. Taken in the proper context, and as part of a broader look at the industry, the Steam data is certainly not a bad indicator.
Further refuting the idea that Intel can displace NVIDIA, Jen-sun addressed Ron Fosner's assertion that multicore processors can handle graphics better than a dedicated graphics card ever could. This is where NVIDIA got in a little counter-FUD of its own, with Jen-sun showing that adding CPU cores does nothing for gaming or graphics benchmark performance. In this case, Intel was certainly referring to the ability of CPUs to handle graphics code written specifically for multicore CPUs. But for the time being, when you compare adding cores to your system against adding a more powerful GPU, NVIDIA offers up to 27x more bang for your buck.
Currently Intel has some fairly decent lab demos, and there have been murmurs of a software renderer renaissance (I'd love to see John Carmack and Tim Sweeney duke it out one more time in software; maybe that's just me), but there just isn't anything in production that even tries to show what a CPU can or can't do when in direct competition with a GPU for graphics quality and performance. And there are reasons for that: it still isn't practical to develop the software. Maybe when everyone is running 8-core, 16-thread CPUs we'll see something interesting. But right now, and even in the next few years, rasterization is going to be the way to go, and pure FLOPS with massive parallelism will win every time over Intel's programmability and relatively light parallelism.
Which brings us to a point NVIDIA made later in the day: GPUs are already multicore to the point where NVIDIA likes to refer to them as "manycore" (we saw Intel do the same with its 100+ core concepts a few years back, when it first started pushing parallelism). It's a stretch for me to think of the 128 SPs in a G80 or G92 GPU as "cores" because they aren't really fully independent, but with the type of data GPUs normally tackle the effect is very similar. Certainly, no matter how you slice it, GPUs are much wider hardware than any current multicore CPU.
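To make that "wide hardware" point concrete, here is a minimal CUDA sketch of our own (not something NVIDIA presented; the kernel, sizes, and launch configuration are purely illustrative): each thread does the same trivial per-element operation, and the hardware fans thousands of such threads out across its stream processors.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// One thread per array element: the data-parallel pattern GPUs are built for.
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // unique element index for this thread
    if (i < n)
        y[i] = a * x[i] + y[i];                     // independent per-element work
}

int main()
{
    const int n = 1 << 20;                          // a million elements
    size_t bytes = n * sizeof(float);

    float *hx = new float[n], *hy = new float[n];   // host-side data
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    float *dx, *dy;                                 // device-side copies
    cudaMalloc(&dx, bytes);
    cudaMalloc(&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    // Launch far more threads than there are SPs; the hardware scheduler keeps them fed.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);
    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);

    printf("y[0] = %f (expect 4.0)\n", hy[0]);
    cudaFree(dx); cudaFree(dy);
    delete[] hx; delete[] hy;
    return 0;
}
```

The individual "cores" here never branch off to do their own thing, which is exactly why calling an SP a core is a stretch; but for workloads shaped like this, the distinction hardly matters.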
The point NVIDIA needs to make is that the argument is far from over, as the battle hasn't even really begun. At one point Jen-sun said "[Intel] can't go around saying the GPU is going to die when they don't know anything about it." This is a fair statement, but NVIDIA can't write Intel off either. Intel certainly will know about GPUs if it truly intends to go down the path it seems destined to travel: either it will push the CPU (and all its multicore glory) as a bastion of graphics power (for which some free multicore CPU graphics development tools might be nice, hint hint), or it will essentially enter the graphics market outright with whatever Larrabee ends up actually becoming.
43 Comments
Wiz33 - Wednesday, June 4, 2008 - link
Intel has no bargaining power in the gamer circle. Even if they withheld licensing for the next-gen platform, gamers would just stay with the current-gen chipset for NVIDIA SLI, since games are usually much more GPU-bound than CPU-bound. In my case, I'm a serious gamer (but FPS-lite). I just clocked over 40 hours on Mass Effect PC since installing it last Thursday evening. With my current setup, an E6750 and an 8800 GTS, I still have tons of upgrade path in both CPU and GPU without moving onto the next Intel platform.
sugs - Sunday, May 11, 2008 - link
As an IC designer, I can tell you right away that 3D graphics on the scale of the products that NVidia/ATI produce is not easy. Just look at the demise of Matrox, S3, and others. I think Intel is going to have problems getting the performance of their offerings to a competitive level in the near future, but they do have a lot of resources and it might be different 5 years down the line.
kenour - Tuesday, April 15, 2008 - link
Dear Jen-sun,
All Intel wants is SLI on their chips (AS DO A LOT OF GAMERS)... so neck up, you little arrogant prick, and license it to them! Don't come out with your little chest puffed out playing the tough guy! If you licensed SLI technology to Intel so their high-end chipsets would support SLI (officially, without having to use hacked drivers) for, say, $50 US, and Intel SLI-enabled all their X38/X48 boards, imagine the money that would come in. But you're too busy trying to hold on to the pathetic market share of your pathetic chipsets. There are so many gamers like me out there who would gladly purchase a second high-end NVIDIA card and SLI them, but won't, because there is no way we would use an NVIDIA chipset... I would pay a $50 US premium on a mobo to have SLI on an Intel chipset, and then I would buy another high-end card. So put your pride aside and give them (AND US) what they want! More money for you, better gaming platform for us.
Lots of Love,
Kenour.
P.S. Yes, I'm still pissed off about the rumour that SLI would be available on the X38 :P It was reported here and at Tom's, from memory, then retracted a week later... That was the happiest week of my life :P (well, in regards to the PC world).
ielmox - Wednesday, April 16, 2008 - link
I think nVidia is holding on to SLI as a marketing gimmick, because SLI doesn't make economic sense except for an extremely small market of wealthy and elitist gamers. I don't see any real value to SLI aside from the bragging rights of somewhat increased performance at a huge cost, and I think nVidia's strategy is guided by this knowledge. SLI uses a lot more power, generates much more heat, is buggier, and is harder to set up, all while offering diminishing returns compared with a dual-GPU or even single-GPU card. In fact, unless you're SLI'ing the latest and greatest cards, you are better off with a non-SLI setup. Realistically, only a very tiny minority of gamers would ever go for an SLI setup, so I'm guessing nVidia understands there is not much potential for financial gain.
SLI is a bit of a white elephant to most people.
gochichi - Monday, April 14, 2008 - link
The Intel/NVIDIA combo is totally the "it" combo in computer gaming and has been for some time. AMD is working on "tidy little packages" with their new integrated graphics platform that can just about "game" right now, not in 2010. NVIDIA, not Intel, are the people who need to be working on an Intel-platform equivalent in the integrated sector.
I am glad to be an NVIDIA customer, and I am also glad to see they're not taking cheap shots at AMD. They even came out kind of defending AMD, which is understandable; both are smaller companies and both respect each other's products.
I can just picture it now: AMD laptops with synergy for $500 or less and no equivalent Intel solution due to a lack of cooperation with Nvidia.
perzy - Monday, April 14, 2008 - link
Well, the thing is, I think Intel has no choice. The x86 CPU is DEAD. The heat wall keeps the frequency down (seen any 4GHz chips lately?) and, well, they can't keep adding another core forever. Intel is in dire PANIC, believe me. They must branch out, and the GPU, PPU, and maybe a little audio PU are the chips with any development years left in them.
And no, there are no quantum or laser chips yet...
Come on, if a blond guy from Sweden like me can understand this, why don't you spell it out for everybody?
Galvin - Monday, April 14, 2008 - link
Actually, hitting 4GHz would be easy for Intel; hell, a lot of people get those things to 4GHz on air. So yeah, they could do 4GHz if they wanted to :)
perzy - Tuesday, April 15, 2008 - link
So do you think that Intel is content and everything is going according to plan? According to that plan we should be at 10GHz by now, and still using the NetBurst architecture... The 3.8GHz P4 was so hot that Intel had to ship it with expensive thermal paste; otherwise it would throttle constantly.
It's strange to me that everybody (hardware sites, for example) seems to think this heat thing is a little snag, a bump in the road. It isn't!
'Oh lookey, now I get 2 cores for the price of one. How nice!'
The chipmakers are trying to hide the crisis they're in. (Stock prices..)
Why else do they buy GPU and PPU makers?
Galvin - Tuesday, April 15, 2008 - link
I don't think Intel has a leg to stand on in the graphics market. The point I was making is that if Intel wants to sell a Core Duo at 4GHz, it's very doable, since people can clock these to 4GHz today on air cooling. That's the only point I was making.
Galvin - Sunday, April 13, 2008 - link
I listened to the whole presentation. Nvidia has a whole computer on a chip; I didn't even know they had this. I was impressed, and this will be nice for mobile devices. Have to wait and see where this goes.
CUDA I've known about for as long as anyone else. I can't wait till compressors for zip, encoding, etc. all become real-time, something no CPU will ever pull off.
We all know Intel is weak in graphics, but Intel has tons of cash. I don't think Nvidia is going anywhere, and they'll most likely get bigger in time.
There are only two companies in the world that can make this kind of graphics technology: AMD and Nvidia. To claim that Intel can just magically make a GPU to compete in a few years is crazy, imo.