NVIDIA Analyst Day: Jen-Hsun Goes to the Mat With Intel
by Derek Wilson on April 11, 2008 12:00 AM EST - Posted in GPUs
Introduction
Hot on the heels of the launch of their 9800 series products, NVIDIA is holding a Financial Analyst Day. These events generally lack the technical glitz and glitter of an Editors' Day, but the announcements and material covered are no less important to NVIDIA as a company. NVIDIA has an unusually large institutional ownership rate at 84% (versus 79% and 66% for AMD and Intel respectively), so the company holds these Analyst Days in part to keep its institutional investors happy and well informed about its progress.
As far as we members of the press are concerned, however, Analyst Days are a valuable chance to learn about the GPU market: anything that could impact the bottom line helps us understand NVIDIA's direction, motivation, and even the reasoning behind some of its engineering decisions. Today saw plenty of posturing for battles to come, and we were not disappointed.
Waking up the Beast
Most of the morning was dedicated to a little PR damage control, with NVIDIA stepping out to defend itself against the doom-and-gloom statements of other players in the industry. With Intel posturing for a move into the graphics market while simultaneously proclaiming the downfall of rasterization and discrete graphics, NVIDIA certainly has reason to address the matter.
And we aren't talking about some standard press release boilerplate filled with fluffy marketing speak. This time, Jen-Hsun Huang, the man himself, stepped out front and addressed some of the concerns others in the industry have put forth. And he was out for blood. We don't get the chance to hear from Jen-Hsun too often, so when he speaks, we are more than happy to listen.
One of the first things that Jen-Hsun addressed (though he didn't spend much time on it) is the assessment by Intel's Pat Gelsinger that rasterization is not scalable and won't suit future demands. He largely dismissed this statement as "wrong and pointless to argue about," but the arguments made over the course of the day all relate back to it in aggregate. The bottom line seems to be that Intel's current approach to graphics can't scale fast enough to meet the demands of future games, but that says nothing about NVIDIA's and AMD's solutions, which are at least one if not two orders of magnitude faster than Intel graphics right now. In fact, at one point Jen-Hsun said: "if the work that you do is not good enough … Moore's law is your enemy."
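For readers who want a concrete picture of what is actually being argued over, here is a minimal sketch (our own illustration, not anything shown by NVIDIA or Intel) of the structural difference between the two approaches: a rasterizer loops over triangles and touches only the pixels each triangle covers, while a ray tracer loops over every pixel and queries the scene for each ray. The single hard-coded triangle, the orthographic projection, and the 64x64 "screen" are assumptions made purely for this example.

#include <algorithm>
#include <cmath>
#include <cstdio>

struct Vec2 { float x, y; };

// One hard-coded triangle in screen space, purely for illustration.
static const Vec2 tri[3] = { {10.0f, 10.0f}, {50.0f, 12.0f}, {30.0f, 45.0f} };

// 2D edge function: its sign tells which side of edge (a -> b) the point lies on.
static float edgeFn(const Vec2& a, const Vec2& b, float px, float py) {
    return (px - a.x) * (b.y - a.y) - (py - a.y) * (b.x - a.x);
}

// A point is inside the triangle when all three edge functions share one sign
// (written this way so the test works for either winding order).
static bool inside(float px, float py) {
    float e0 = edgeFn(tri[0], tri[1], px, py);
    float e1 = edgeFn(tri[1], tri[2], px, py);
    float e2 = edgeFn(tri[2], tri[0], px, py);
    return (e0 >= 0 && e1 >= 0 && e2 >= 0) || (e0 <= 0 && e1 <= 0 && e2 <= 0);
}

// Rasterization: loop over triangles (one here), then over only the pixels in each
// triangle's bounding box. Cost scales with triangle count times covered area.
static int rasterize(int width, int height) {
    int minX = std::max(0, (int)std::floor(std::min({tri[0].x, tri[1].x, tri[2].x})));
    int maxX = std::min(width - 1, (int)std::ceil(std::max({tri[0].x, tri[1].x, tri[2].x})));
    int minY = std::max(0, (int)std::floor(std::min({tri[0].y, tri[1].y, tri[2].y})));
    int maxY = std::min(height - 1, (int)std::ceil(std::max({tri[0].y, tri[1].y, tri[2].y})));
    int covered = 0;
    for (int y = minY; y <= maxY; ++y)
        for (int x = minX; x <= maxX; ++x)
            if (inside(x + 0.5f, y + 0.5f)) ++covered;
    return covered;
}

// Ray tracing: loop over every pixel and fire one ray per pixel into the scene.
// With an orthographic "camera" looking straight down the z axis, a ray hits the
// triangle exactly when the pixel center lies inside its 2D projection, so the
// same inside() test doubles as the intersection test. Cost scales with pixel
// count times the cost of each scene query, which is why acceleration structures
// and raw compute matter so much for real scenes.
static int raytrace(int width, int height) {
    int hits = 0;
    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x)
            if (inside(x + 0.5f, y + 0.5f)) ++hits;
    return hits;
}

int main() {
    const int w = 64, h = 64;
    std::printf("rasterizer covered %d pixels\n", rasterize(w, h));
    std::printf("ray tracer hit     %d pixels\n", raytrace(w, h));
    return 0;
}

Both loops produce the same image here; the debate is about which loop structure scales better as scenes, resolutions, and hardware grow.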
This seems as good a time as any to address the tone of the morning. Jen-Hsun was very aggressive in his rebuke of the statements made against his company. Many times he talked about how inappropriate it is for larger companies to pick on smaller ones through the use of deceptive marketing tactics (ed: Intel is 11.5 times as large as NVIDIA by market cap). To such attacks, he says "It's just not right!" and "we've been taking it, every single fricking day… enough is enough!" NVIDIA, Jen-Hsun says, must rely on the truth to carry its message in the absence of massive volumes of marketing dollars.
Certainly, statements can be technically true while still painting a picture slightly different from reality, but for the most part what Jen-Hsun said made a lot of sense. Of course, it mostly addresses reality as it is today and doesn't speculate about what may happen when Larrabee hits the scene or if Intel decides to really go after the discrete graphics market. And rightly enough, Jen-Hsun points out that many of Intel's comments serve not only to spread doubt about the viability of NVIDIA, but will also have the effect of awakening the hearts and minds of one of the most tenaciously competitive companies in computing. Let's see how that works out for them.
43 Comments
Griswold - Friday, April 11, 2008 - link
I bet it gave you wet dreams.
jtleon - Friday, April 11, 2008 - link
Griswold... You need to return to Wally World! LOL
jtleon
Synosure - Friday, April 11, 2008 - link
It seems like everyone is just ignoring AMD and their hybrid solution. It would have been nice to hear his opinion on it.
Griswold - Friday, April 11, 2008 - link
It's easier to talk about the blue FUD machine than about the other (albeit troubled) competitor that is actually competing with your own company on all fronts.
Ananke - Friday, April 11, 2008 - link
Intel and AMD are aiming at two things:
1. Integrated low-power graphics, implemented in mobile computerized devices: laptops, UMPCs, smart phones, video/audio players, etc. These markets have the fastest growth.
2. Parallel processing; the design experience and thus the know-how of the present GPU players in parallel processing is immense. Such solutions are suitable for financial, military, and scientific modeling, markets that command hefty profit margins.
These are the reasons why AMD bought ATI.
My point: corporations do things that will accelerate margins or accelerate growth. Financial analysts are not interested in nominal values only.
Intel had to choose between acquisition and internal development of products. It seems they chose the internal approach, since ATI was already bought and an NVIDIA purchase would be too expensive and complicated to make financial sense. Sun Microsystems and IBM are already well situated in the high-margin parallel processing market. However, IBM was recently hit with a government ban on orders, and since they moved so many strategic operations overseas, I don't see them easily coming back to the big-margin market. HP abandoned their PA-RISC line a long time ago, so they rely on AMD and Intel for their chip supply now. So, exciting times for Intel and AMD in grabbing new market territories.
NVIDIA is left with the discrete graphics market only. It is a popular market among gamers, magazines, and general consumers, but it is not where the huge money is made. And I don't see a collision between Intel's and NVIDIA's interests, except in the mobile market. What investors are being warned about is that the big guys have curbed the opportunities for revenue and profit growth.
joegee - Friday, April 11, 2008 - link
"You already have the right machine to run Excel. You bought it four years ago... How much faster can you render the blue screen of death?" -- Jen Hsun-HuangGiven that this was in response to questions about nVidia's Vista driver problems, I don't know that this helps nVidia's case. Apparently those devices best able to render the BSoD quickly are those made by nVidia. This is not something that will become a new benchmark any vendor would care to win.
I would like a video card that will run both Excel *and* the latest games, Mr. Huang.
-Joe G.
chizow - Friday, April 11, 2008 - link
Those Steam figures look familiar, Derek. ;) I'm surprised JH didn't bring up the Microsoft class action suit as another example of Intel integrated chipsets failing miserably. Nice peek into the current market climate, although there wasn't as much discussion about the future as I had hoped.
DerekWilson - Friday, April 11, 2008 - link
heh yeah ... but the Steam numbers still say absolutely nothing about the state of the market in early 2007. They are a good indicator of what is happening now, and I never meant to imply otherwise.
I'd love to see something more forward-looking as well...
Genx87 - Friday, April 11, 2008 - link
I don't see how a company that has 0% market share above integrated graphics is going to motivate devs to write game engines that do ray tracing instead of rasterization. John Carmack gave an interview about this two months ago; he wasn't impressed with what Intel has and wasn't convinced that ray tracing is better than rasterization at everything. He felt it would be a hybrid situation at best and that Intel is dreaming.
Pyrosma - Saturday, April 12, 2008 - link
John Carmack wasn't very impressed with Half-Life when he first saw it, either. And it was built with his game engine. Oops.