NVIDIA Analyst Day: Jen-Hsun Goes to the Mat With Intel
by Derek Wilson on April 11, 2008 12:00 AM EST - Posted in GPUs
New Spin on Computer Marketing
Beyond all the FUD fighting, NVIDIA talked about a new push toward marketing computers not in terms of CPU speed or GPU model number or whatever the spec of the week may be, but in terms of what the computer is designed to do. NVIDIA, OEMs, and retailers have all gotten behind the idea that it would be a good practice to start building and marketing their systems not as low-end, midrange, or high-end, but as gaming computers, multimedia computers, workstations, or business application computers.
If system builders choose to balance CPU and GPU power to favor specific applications, rather than simply throwing all low-end, all midrange, or all high-end components at a system, they can deliver much better bang for the buck to people looking to use their PC for a specific purpose. NVIDIA refers to this idea as the Optimized PC Initiative. It's something of a side note and not an NVIDIA-centric line of thought, but it is an idea that could really help the uninitiated understand what they are getting when they purchase a system. In fact, this is one of the areas that really impressed us with the Gateway P-6831 FX notebook: it is balanced for great midrange gaming performance.
Final Words
No one can deny Jen-Hsun's love for his company and his hardware, but while his presentation was impressive and impassioned, we must not discount Intel's ability to compete. They are impressive in capability (they've got a lot of brilliant engineers over there) and size (they've also got a lot of money). We also can't forget that Intel is a silicon company. They've got absolutely huge resources to dedicate to producing the most bleeding-edge silicon base to house their ICs. With the sheer size and power requirements of today's GPUs, every little bit helps. Intel engineers building a massively parallel floating-point engine to match the power of the GPU, fabbed on Intel's own silicon, could be a huge coup if they are only willing to really commit to the task and put their money (and their minds) where it matters most.
Now that both Intel and NVIDIA have hit the mat and acknowledged each other as true competitors, we hope to see some huge things happen in terms of computer graphics and massively parallel floating-point computing in general. Today marks the beginning of a new era in the desktop PC world: the beginning of a battle between the world's greatest silicon company and the world's greatest dedicated IC design house.
43 Comments
Griswold - Friday, April 11, 2008 - link
I bet it gave you wet dreams.

jtleon - Friday, April 11, 2008 - link
Griswold.... You need to return to Wally World! LOL
jtleon
Synosure - Friday, April 11, 2008 - link
It seems like everyone is just ignoring AMD and their hybrid solution. It would have been nice to hear his opinion on it.

Griswold - Friday, April 11, 2008 - link
It's easier to talk about the blue FUD machine than about the other (albeit troubled) competitor that is actually competing with your own company on all fronts.

Ananke - Friday, April 11, 2008 - link
Intel and AMD are aiming at two things:
1. Integrated low-power graphics - implemented in mobile computerized devices: laptops, UMPCs, smart phones, video/audio players, etc. These markets have the fastest growth.
2. Parallel processing; the design and thus the know-how of the present GPU players in parallel processing is immense. Such tech solutions would be suitable in financial, military, and scientific modeling, markets which command hefty profit margins.
These are the reasons why AMD bought ATI.
My point - corporations do things which will accelerate margins or accelerate growth. Financial analysts are not interested in nominal values only.
Intel had to choose either acquisition or internal development of products. It seems like they chose the internal approach, since ATI was already bought, and an Nvidia purchase is too expensive and complicated to make financial sense. Sun Microsystems and IBM are already situated well in the high-margin parallel processing market. However, IBM recently got hit with a government ban on orders, and since they moved so many strategic operations overseas, I don't see them easily coming back to the big-margin market. HP abandoned their PA-RISC line a long time ago, so they rely on AMD and Intel for chip supply now. So, exciting times for Intel and AMD for grabbing new market territories.
Nvidia is left with the discrete graphics market only. It is a popular market among gamers, magazines, and general consumers, but it is not the market where the big money is made. And I don't see a collision between Intel and Nvidia interests, except in the mobile market. What investors are being warned about is that the big guys have curbed opportunities for revenue and profit growth.
joegee - Friday, April 11, 2008 - link
"You already have the right machine to run Excel. You bought it four years ago... How much faster can you render the blue screen of death?" -- Jen-Hsun Huang

Given that this was in response to questions about nVidia's Vista driver problems, I don't know that this helps nVidia's case. Apparently those devices best able to render the BSoD quickly are those made by nVidia. This is not something that will become a new benchmark any vendor would care to win.
I would like a video card that will run both Excel *and* the latest games, Mr. Hsun-Huang.
-Joe G.
chizow - Friday, April 11, 2008 - link
Those Steam figures look familiar Derek. ;) I'm surprised JH didn't bring up the Microsoft class action suit as another example of Intel integrated chipsets failing miserably. Nice peek into the current market climate, although there wasn't as much discussion about the future as I had hoped.

DerekWilson - Friday, April 11, 2008 - link
heh yeah ... but the steam numbers still say absolutely nothing about the state of the market in early 2007. they are a good indicator for what is happening now, and i never meant to imply otherwise.
i'd love to see something more forward looking as well...
Genx87 - Friday, April 11, 2008 - link
I don't see how a company that has 0% market share above integrated graphics is going to motivate or get devs to write game engines to do ray tracing vs. rasterization. John Carmack had an interview about this 2 months ago, and he wasn't impressed with what Intel has and wasn't convinced ray tracing is better at everything than rasterization. He felt it would be a hybrid situation at best and that Intel is dreaming.

Pyrosma - Saturday, April 12, 2008 - link
John Carmack wasn't very impressed with Half Life when he first saw it, either. And it was built with his game engine. Oops.