NVIDIA Analyst Day: Jen-Hsun Goes to the Mat With Intel
by Derek Wilson on April 11, 2008 12:00 AM EST
Posted in: GPUs
The Tenderloin and the Two Buck Chuck
As for the idea of Intel integrating a GPU onto their CPUs, NVIDIA painted a rather distasteful picture of mixing together something excellent with something incredibly subpar. The first analogy Jen-Hsun pulled out was one of someone's kid topping off a decanted bottle of '63 Chateau Latour with an '07 Robert Mondavi. The idea of Intel combining their very well engineered CPUs with their barely passable integrated graphics is an abomination to be avoided at all costs.
This isn't to say that CPUs and GPUs shouldn't work together, but that Intel should stick to what they know. In fact, NVIDIA heavily pushed the idea of heterogeneous computing, but decried the idea that taking a system block diagram and drawing a box around the CPU and GPU would actually do anything useful. NVIDIA definitely wants their hardware to be the manycore floating point compute hardware paired with Intel's multicore general purpose processors, and they tried to paint a picture of a world where both are critical to any given system.
Certainly both CPUs and GPUs are needed today, and unless Intel can really pull out some magic, that won't change for the foreseeable future. NVIDIA made a big deal of relating this pair to Star Trek technology: you need both your impulse engines and your warp drive. Neither is useful for the task the other is designed for: short range navigation can't be done with a warp drive, and impulse engines aren't suitable for long distance travel requiring faster than light speeds. The bottom line is that hardware should be designed and used for the task that best suits it.
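To make that division of labor concrete, here is a minimal CUDA sketch of the kind of heterogeneous setup NVIDIA was describing: the host CPU handles setup, control flow, and I/O, while the GPU runs the wide, data-parallel floating point work. This is our illustration, not anything shown at the analyst day; the SAXPY operation, array size, and launch configuration are arbitrary choices.

```cuda
// Illustrative sketch of CPU/GPU division of labor (not from the analyst day).
#include <cstdio>
#include <cuda_runtime.h>

// GPU side: one lightweight thread per element -- the manycore FP workload.
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];   // simple multiply-add per element
}

int main()
{
    const int n = 1 << 20;                 // 1M elements (arbitrary size)
    size_t bytes = n * sizeof(float);

    // CPU side: general purpose work -- allocation, initialization, I/O.
    float *hx = (float *)malloc(bytes), *hy = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    float *dx, *dy;
    cudaMalloc(&dx, bytes);
    cudaMalloc(&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover every element.
    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, dx, dy);
    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);

    printf("y[0] = %f\n", hy[0]);          // expect 3*1 + 2 = 5
    cudaFree(dx); cudaFree(dy); free(hx); free(hy);
    return 0;
}
```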
Again, this says nothing about what happens if Intel brings to market a competitive manycore floating point solution. Maybe the hardware they design will be up to the task, and maybe it won't. But Jen-Hsun really wanted to get across the idea that the current incarnation of the CPU and the current incarnation of Intel's GPU technology are nowhere near sufficient to handle anything like what NVIDIA's hardware enables.
Coming back to the argument that it's best to stick with what you know, Jen-Hsun stated his belief that "you can't be a great company by doing everything for everybody." Intel hardware works fine for running operating systems and for applications where visualization is not a factor at all: what NVIDIA calls Enterprise Computing (in contrast to Visual Computing). Going further, he postulated that "the best way for Google to compete against Microsoft is not to build another operating system."
Making another backhanded comment about Intel, Jen-Hsun later defended NVIDIA's recent loss in market share for low end notebook graphics. He held that the market just wasn't worth competing in and that other companies offered solutions that fit it better. Defending NVIDIA's lack of competition in this segment, he said he doesn't tell himself, "Jen-Hsun, when you wake up in the morning, go steal somebody else's business," but rather "we wake up in the morning saying, 'ya know, we could change the world.'"
43 Comments
Griswold - Friday, April 11, 2008
I bet it gave you wet dreams.
jtleon - Friday, April 11, 2008
Griswold....You need to return to Wally World! LOL
jtleon
Synosure - Friday, April 11, 2008
It seems like everyone is just ignoring AMD and their hybrid solution. It would have been nice to hear his opinion on it.
Griswold - Friday, April 11, 2008
It's easier to talk about the blue FUD machine than the other (albeit troubled) competitor that is actually competing with your own company on all fronts.
Ananke - Friday, April 11, 2008
Intel and AMD are aiming at two things:
1. Integrated low-power graphics, implemented in mobile computerized devices: laptops, UMPCs, smart phones, video/audio players, etc. These markets have the fastest growth.
2. Parallel processing; the design experience, and thus the know-how, of the present GPU players in parallel processing is immense. Such tech solutions would be suitable in financial, military, and scientific modeling, markets which command hefty profit margins.
These are the reasons why AMD bought ATI.
My point: corporations do things which will accelerate margins or accelerate growth. Financial analysts are not interested in nominal values only.
Intel had to choose between acquisition and internal development of products. It seems like they chose the internal approach, since ATI was already bought, and an Nvidia purchase is too expensive and complicated to make financial sense. Sun Microsystems and IBM are already well situated in the high-margin parallel processing market. However, IBM was recently hit with a government ban on orders, and since they moved so many strategic operations overseas, I don't see them easily coming back to the big-margin market. HP abandoned their PA-RISC line a long time ago, so they rely on AMD and Intel for their chip supply now. So, exciting times for Intel and AMD for grabbing new market territories.
Nvidia is left with the discrete graphics market only. It is a popular market among gamers, magazines, and general consumers, but it is not the market where the huge money is made. And I don't see a collision between Intel and Nvidia interests, except in the mobile market. What investors are being warned about is that the big guys have curbed the opportunities for revenue and profit growth.
joegee - Friday, April 11, 2008
"You already have the right machine to run Excel. You bought it four years ago... How much faster can you render the blue screen of death?" -- Jen Hsun-HuangGiven that this was in response to questions about nVidia's Vista driver problems, I don't know that this helps nVidia's case. Apparently those devices best able to render the BSoD quickly are those made by nVidia. This is not something that will become a new benchmark any vendor would care to win.
I would like a video card that will run both Excel *and* the latest games, Mr. Huang.
-Joe G.
chizow - Friday, April 11, 2008
Those Steam figures look familiar, Derek. ;) I'm surprised JH didn't bring up the Microsoft class action suit as another example of Intel integrated chipsets failing miserably. Nice peek into the current market climate, although there wasn't as much discussion about the future as I had hoped.
DerekWilson - Friday, April 11, 2008
heh yeah ... but the steam numbers still say absolutely nothing about the state of the market in early 2007. they are a good indicator for what is happening now, and i never meant to imply otherwise.
i'd love to see something more forward looking as well...
Genx87 - Friday, April 11, 2008
I don't see how a company that has 0% market share above integrated graphics is going to motivate or get devs to write game engines to do ray tracing vs. rasterization. John Carmack had an interview about this 2 months ago and he wasn't impressed with what Intel has, and wasn't convinced ray tracing is better at everything than rasterization. He felt it would be a hybrid situation at best and Intel is dreaming.
Pyrosma - Saturday, April 12, 2008
John Carmack wasn't very impressed with Half Life when he first saw it, either. And it was built with his game engine. Oops.