NVIDIA Analyst Day: Jen-sun Goes to the Mat With Intel
by Derek Wilson on April 11, 2008 12:00 AM EST - Posted in
- GPUs
Intel's Graphics Performance Disadvantage
It is no secret that Intel's integrated graphics are very, very slow and nearly useless for most modern 3D graphics applications. So when Intel says we should see their integrated graphics parts increase in performance 10x by 2010, we should get excited, right? That is much faster than Moore's Law in terms of performance over time: over a two-year period we should expect only a little more than a 2x improvement, not an order of magnitude.
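As a quick sanity check on that math (the doubling periods below are our own illustrative assumptions, not anything Intel or NVIDIA presented), the comparison takes only a few lines:

```python
# Back-of-the-envelope: how much improvement does a Moore's-law-style
# doubling cadence predict over two years, versus Intel's claimed 10x?
# The doubling periods here are illustrative assumptions, not vendor data.

years = 2.0

for doubling_period_months in (18, 24):
    doublings = years * 12 / doubling_period_months
    scaling = 2 ** doublings
    print(f"{doubling_period_months}-month doubling over {years:.0f} years: "
          f"~{scaling:.1f}x")

print("Intel's stated target over the same window: 10x")
```

With an 18-month cadence that works out to roughly 2.5x, and with a 24-month cadence exactly 2x, which is where the "a little more than 2x" figure above comes from.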
The problem with this situation, as noted by NVIDIA, is that today's Intel integrated graphics parts are already more than 10x slower than today's affordable discrete graphics parts.
Looking at that chart, we can see that Intel's integrated graphics will only be competitive with today's sub-$100 hardware in 3DMark06 by the year 2010. With four-year-old games, NVIDIA's current hardware will still blow away Intel's 2010 integrated solution, and the margin only climbs higher if you look at a modern game. NVIDIA claims that these tests were done with low quality settings as well, but we can't speak to that as we weren't the ones running the tests. If that's the case, then the differences could be even larger.
The bottom line is that Intel doesn't have a solution that is within reach of current graphics hardware even if it could produce 10x the performance today. In two years, after NVIDIA, AMD, and even S3 have again at least doubled performance from what we currently have, there's no way Intel can hope to keep up.
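To put rough numbers on that argument, here is a small sketch using purely illustrative figures of our own (normalizing today's cheap discrete card to 1.0); none of these values come from NVIDIA's slides:

```python
# Rough projection of the performance gap, using illustrative numbers only.
# Normalize today's cheap discrete GPU to 1.0; assume Intel integrated is
# ~10x slower today, Intel improves 10x by 2010, and discrete parts merely
# double in the same window (a conservative assumption for NVIDIA/AMD/S3).

discrete_2008 = 1.0
intel_2008 = discrete_2008 / 10          # "more than 10x slower" today

intel_2010 = intel_2008 * 10             # Intel's promised 10x by 2010
discrete_2010 = discrete_2008 * 2        # discrete vendors at least double

print(f"Intel 2010 vs. cheap discrete 2008: {intel_2010 / discrete_2008:.1f}x")
print(f"Intel 2010 vs. cheap discrete 2010: {intel_2010 / discrete_2010:.1f}x")
# Result: at best parity with 2008's sub-$100 card, and still half the speed
# of whatever is cheap in 2010 -- before counting any high-end parts.
```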
And NVIDIA wasn't pulling any punches. Jen-sun went so far as to say: "I wouldn't mind it if sometimes they just say thank you – that it's possible to enjoy a game on an Intel microprocessor [because of NVIDIA graphics hardware]." This is certainly an audacious statement of the facts, even if it happens to be the truth. AMD can take some of the credit there as well, of course, but the point is that Intel couldn't make it as a gaming platform today without the help of its competitors. Believe me when I say that we are trying to put a neutral spin on all this, but Jen-sun was out for blood today, and he absolutely hit his mark.
To prove the point further, they went to the data Valve has been collecting through Steam. This data must be taken with the proper dose of salt – it doesn't reflect the entire GPU market by any means. The Steam data reflects a specific subset of GPU users: gamers. Even more to the point, it is a specific subset of gamers: those who play games through Steam and who chose to report their system information (anonymously, of course). Yes, the sample size is large, but this is by no means a random sample of anything, and it thus loses some points in the statistical accuracy department.
Steam data indicates that NVIDIA GPUs are used in 60% of Steam gamers' boxes, while Intel GPUs are in only 2% of systems surveyed. As Jen-sun pointed out, this isn't about juggling a few percentage points: gamers who use Steam clearly do not use Intel GPUs to play their games. The picture gets even grimmer for Intel when you look at DX10-specific data: NVIDIA has about 87% of the GPUs running in DX10 boxes, while Intel powers only 0.11% (about which Jen-sun said, "I think that's just an error," and he may well have been right). He was on target when he said that in this case "approximately zero seems statistically significant."
The implication is clear: while few gamers use Intel GPUs in the first place, gamers aren't using Intel GPUs for DX10 gaming at all. We certainly buy that, as Intel does not even have a DX10 part on the market right now, but again, this data is not a statistically accurate representation of the entire gaming population. Steam data is certainly not a bad indicator when taken in the proper context and as part of a broader look at the industry.
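For what it's worth, the "approximately zero" line holds up even after accounting for sampling error; the bigger caveat remains who gets sampled. A minimal sketch, assuming a survey population of one million DX10-capable machines (the real figure wasn't disclosed at the event), makes the point:

```python
# Why "approximately zero" can still be statistically meaningful: even with
# sampling error, a 0.11% share measured across a large sample has a very
# tight confidence interval. The sample size below is an assumed figure for
# illustration -- the actual Steam survey population isn't given here.
import math

n = 1_000_000          # assumed number of DX10-capable survey respondents
p = 0.0011             # Intel's reported share of DX10 boxes (0.11%)

std_err = math.sqrt(p * (1 - p) / n)
low, high = p - 1.96 * std_err, p + 1.96 * std_err
print(f"95% CI for Intel's DX10 share: {low:.4%} to {high:.4%}")
# Even at the top of the interval, the share rounds to ~0.1% -- the sampling
# bias of a gamer-only survey is a far bigger caveat than the sampling error.
```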
Further refuting the idea that Intel can displace NVIDIA, Jen-sun addressed Ron Fosner's assertion that multicore processors can handle graphics better than a dedicated graphics card ever could. This is where NVIDIA got into a little counter-FUD action of its own, with Jen-sun showing that adding CPU cores does nothing for gaming or graphics benchmark performance. In this case, Intel was certainly referring to the ability of CPUs to handle graphics code written specifically for multicore CPUs. But for the time being, when you compare adding cores to your system to adding a more powerful GPU, NVIDIA offers up to 27x more bang for your buck.
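NVIDIA didn't publish the math behind that figure, but a performance-per-dollar comparison of roughly the following shape is presumably what's meant. The prices and frame-rate gains below are hypothetical placeholders of ours, not NVIDIA's test data:

```python
# Sketch of a performance-per-dollar comparison of the kind behind NVIDIA's
# "up to 27x" claim. All prices and frame-rate gains here are hypothetical
# placeholders -- NVIDIA did not publish the underlying numbers in this deck.

def perf_per_dollar(fps_gain, added_cost):
    """Frames per second gained per dollar spent on the upgrade."""
    return fps_gain / added_cost

# Upgrading a dual-core CPU to a quad-core: games see little benefit because
# the renderer isn't written to use the extra cores.
cpu_upgrade = perf_per_dollar(fps_gain=2.0, added_cost=200.0)

# Spending the same money on a midrange discrete GPU instead.
gpu_upgrade = perf_per_dollar(fps_gain=54.0, added_cost=200.0)

print(f"GPU upgrade delivers ~{gpu_upgrade / cpu_upgrade:.0f}x "
      f"more performance per dollar in this (made-up) example")
```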
Currently Intel has some fairly decent lab demos, and there have been murmurs of a software renderer renaissance (I'd love to see John Carmack and Tim Sweeney duke it out one more time in software; maybe that's just me), but there just isn't anything in production that even tries to show what a CPU can or can't do when in direct competition with a GPU for graphics quality and performance. And there's a reason for that: it still isn't practical to develop the software. Maybe when everyone is running 8-core, 16-thread CPUs we'll see something interesting. But right now, and even over the next few years, rasterization is going to be the way to go, and pure FLOPS with massive parallelism will win every time over Intel's programmability and relatively light parallelism.
Which brings us to a point NVIDIA made later in the day: GPUs are already multicore to the point where NVIDIA likes to refer to them as "manycore" (as we saw Intel do with its 100+ core concepts a few years back when it first started pushing parallelism). It's a stretch for me to think of the 128 SPs in a G80 or G92 GPU as "cores" because they aren't really fully independent, but with the type of data GPUs normally tackle the effect is very similar. Certainly, no matter how you slice it, GPUs are much wider hardware than any current multicore CPU.
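To make the "not fully independent" distinction concrete, here is a toy sketch of ours (plain Python, purely illustrative, and not how G80/G92 actually schedules work) contrasting a few independent CPU cores with a wide array of lockstep stream processors:

```python
# Toy illustration of "manycore" GPU-style execution versus a few independent
# CPU cores. Purely illustrative -- this is not how G80/G92 schedules work,
# just the data-parallel idea behind calling its 128 SPs "cores".

NUM_SPS = 128

def gpu_style(op, data):
    """One instruction stream, applied to every lane's element in lockstep.
    The lanes can't wander off and do unrelated work, which is why SPs
    aren't 'cores' in the CPU sense."""
    assert len(data) == NUM_SPS
    return [op(x) for x in data]

def cpu_style(tasks):
    """A handful of truly independent cores: each runs its own function
    on its own data, with no requirement that they match."""
    return [task(arg) for task, arg in tasks]

data = list(range(NUM_SPS))
print(gpu_style(lambda x: x * x, data)[:4])                    # wide, uniform work
print(cpu_style([(abs, -3), (len, "hello"), (max, (1, 2))]))   # divergent work
```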
The point NVIDIA needs to make is that the argument is far from over, as the battle hasn't even really begun. At one point Jen-sun said, "[Intel] can't go around saying the GPU is going to die when they don't know anything about it." This is a fair statement, but NVIDIA can't write Intel off either. Intel certainly will know about GPUs if it truly intends to go down the path it seems destined to travel: either pushing the CPU (and all its multicore glory) as a bastion of graphics power (for which some free multicore CPU graphics development tools might be nice, hint hint), or essentially entering the graphics market outright with whatever Larrabee ends up actually becoming.
43 Comments
Griswold - Friday, April 11, 2008 - link
I bet it gave you wet dreams.

jtleon - Friday, April 11, 2008 - link
Griswold.... You need to return to Wally World! LOL
jtleon
Synosure - Friday, April 11, 2008 - link
It seems like everyone is just ignoring AMD and their hybrid solution. It would have been nice to hear his opinion on it.

Griswold - Friday, April 11, 2008 - link
It's easier to talk about the blue FUD machine than the other (albeit troubled) competitor that is actually competing with your own company on all fronts.

Ananke - Friday, April 11, 2008 - link
Intel and AMD are aiming at two things:

1. Integrated low-power graphics - implemented in mobile computerized devices: laptops, UMPCs, smart phones, video/audio players, etc. These markets have the fastest growth.

2. Parallel processing; the design and thus the know-how of the present GPU players in parallel processing is immense. Such tech solutions would be suitable for financial, military, and scientific modeling, markets which command hefty profit margins.
These are the reasons why AMD bought ATI.

My point - corporations do things which will accelerate margins or accelerate growth. Financial analysts are not interested in nominal values only.
Intel had to choose between acquisition and internal development of products. It seems like they chose the internal approach, since ATI was already bought, and an Nvidia purchase is too expensive and complicated to make financial sense. Sun Microsystems and IBM are already well situated in the high-margin parallel processing market. However, IBM recently got screwed by a government ban on orders, and since they moved so many strategic operations overseas, I don't see them easily coming back to the big-margin market. HP abandoned their PA-RISC line a long time ago, so they rely on AMD and Intel for chip supply now. So, exciting times for Intel and AMD in grabbing new market territories.

Nvidia is left with the discrete graphics market only. It is a popular market among gamers, magazines, and general consumers, but it is not the market where the huge money is made. And I don't see a collision between Intel's and Nvidia's interests, except in the mobile market. What investors are being warned about is that the big guys have curbed opportunities for revenue and profit growth.
joegee - Friday, April 11, 2008 - link
"You already have the right machine to run Excel. You bought it four years ago... How much faster can you render the blue screen of death?" -- Jen Hsun-HuangGiven that this was in response to questions about nVidia's Vista driver problems, I don't know that this helps nVidia's case. Apparently those devices best able to render the BSoD quickly are those made by nVidia. This is not something that will become a new benchmark any vendor would care to win.
I would like a video card that will run both Excel *and* the latest games, Mr. Hsun-Huang.
-Joe G.
chizow - Friday, April 11, 2008 - link
Those Steam figures look familiar, Derek. ;) I'm surprised JH didn't bring up the Microsoft class action suit as another example of Intel integrated chipsets failing miserably. Nice peek into the current market climate, although there wasn't as much discussion about the future as I had hoped.

DerekWilson - Friday, April 11, 2008 - link
heh yeah ... but the steam numbers still say absolutely nothing about the state of the market in early 2007. they are a good indicator for what is happening now, and i never meant to imply otherwise.
i'd love to see something more forward looking as well...
Genx87 - Friday, April 11, 2008 - link
I don't see how a company that has 0% market share above integrated graphics is going to motivate devs to write game engines that do ray tracing instead of rasterization. John Carmack had an interview about this two months ago, and he wasn't impressed with what Intel has and wasn't convinced ray tracing is better at everything than rasterization. He felt it would be a hybrid situation at best and that Intel is dreaming.

Pyrosma - Saturday, April 12, 2008 - link
John Carmack wasn't very impressed with Half Life when he first saw it, either. And it was built with his game engine. Oops.