NVIDIA's Bumpy Ride: A Q4 2009 Update
by Anand Lal Shimpi on October 14, 2009 12:00 AM EST - Posted in
- GPUs
Chipsets: One Day You're In and the Next, You're Out
Presently, NVIDIA’s chipset business is far from dead. Its chipsets are in nearly every single Apple computer on the market, not to mention countless other OEM systems. I’m not sure how much money NVIDIA is making from these chipsets, but they are selling.
NVIDIA won Apple's chipset business; Intel was not happy
Long term I don’t see much of a future for NVIDIA’s chipset business. NVIDIA said that they have no interest in pursuing an LGA-1156 chipset given Intel’s legal threats. Even if NVIDIA had a license to produce DMI chipsets, I’m not sure it makes sense.
NVIDIA's Advantage: A single chip GeForce 9400M instead of a dated Intel solution
Once the ‘dales hit, every single mainstream CPU from Intel is going to come with graphics on-package. Go out one more generation and Sandy Bridge brings the graphics on-die. AMD is doing the same thing starting in 2012.
It’s taken longer than expected, but there’s honestly no need for a third-party chipset maker anymore. Most of the performance differentiation in chipsets has moved onto the CPU die anyway; all that’s left is SATA, USB, and a bunch of validation work that no one likes doing. NVIDIA is much better off building a low-cost discrete GeForce 9400M-class GPU and selling that. There’s much less headache involved in selling discrete GPUs than selling chipsets, plus graphics is NVIDIA’s only value add when it comes to chipsets - everyone knows how to integrate a USB controller by now. I’d say the same about SATA, but AMD still has some AHCI silliness that it needs to sort out.
NVIDIA committed to supporting existing products in the channel and continues to poke fun at AMD with lines like this:
“On AMD platforms, we continue to sell a higher quantity of chipsets than AMD itself. MCP61-based platforms continue to be extremely well positioned in the entry CPU segments where AMD CPUs are most competitive vs. Intel”
As successful as NVIDIA’s AMD chipsets are today, AMD tells us that nearly all OEM designs going forward use AMD’s own chipsets. Again, NVIDIA’s chipset business is quite healthy today, but I don’t see much of a future in it - and that’s not necessarily a bad thing.
The only reason NVIDIA’s chipset business has lasted this long is because AMD and Intel couldn’t get their houses in order quickly enough. AMD is finally there and Intel is getting there, although it remains to be seen how well the next-generation of Atom platforms will work in practice.
A pair of Ion motherboards we reviewed
The main reason Ion got traction in the press was because it could play Blu-ray content. If Intel had done the right thing from the start and paired Atom with a decent chipset, NVIDIA would never have had the niche for Ion to fit into.
106 Comments
sbuckler - Wednesday, October 14, 2009 - link
Not all doom and gloom: http://www.brightsideofnews.com/news/2009/10/13/nv... Which also puts them in the running for the next Wii, I would have thought?
Zapp Brannigan - Thursday, October 15, 2009 - link
Unlikely. Tegra is basically just an ARM11 processor, allowing full backwards compatibility with the current ARM9 and ARM7 processors in the DS. If Nintendo want full backwards compatibility with the Wii 2, then they'll have to stick with the current IBM/ATI combo.
papapapapapapapababy - Wednesday, October 14, 2009 - link
1) ATI launches crappy cards, 2) Anand realizes "crappy cards, we might need NVIDIA after all", 3) Anand does some NVIDIA damage control, 4) the damage control sounds like wishful thinking to me, 5) lol: "While RV770 caught NVIDIA off guard, Cypress did not". XD
NVIDIA knew they were going to -FAIL- and made a conscious decision to KEEP FAILING? Guys, guys, they were caught off guard AGAIN. It does not matter if they knew it! IT IS STILL A BIG FAIL! THEY KNEW? What kind of nonsense is that? BTW, they could not shrink anything at launch except that OEM garbage... what makes you think that Fermi is going to be any different?
whowantstobepopular - Wednesday, October 14, 2009 - link
"1) ati launches crappy cards, 2) anand realizes "crappy cards, we might need nvidia after all" 3) anand does some nvidia damage control"
ROFL
Maybe Anand wrote this article to lay to rest the last vestiges of SiliconDoc's recent rantings.
Seriously, Anand and team...
You guys do a fine job of thoroughly covering the latest official developments in the enthusiast PC space. You're doing the right thing by sticking to info that's confirmed. Charlie, Fudo and others are covering the rumours just fine, so what we really need is what we get here at Anandtech: Thorough, prompt reviews of tech products that have just released, and interesting commentary on PC market developments and directions (such as the above article).
I like the fact that you add a little bit of your own interpretation into these sorts of commentaries, and at the same time make sure we know what is fact and what is interpretation.
I guess I see it this way: You've been commentating on this IT game for quite a few years now, and the articles show it. There are plenty of references to parallels between current situations and historic ones, and these are both interesting and informative. This is one of many aspects of the articles here at Anandtech that make me (and others, it seems) keep coming back. Your knowledge of the important points in IT history is confidence inspiring when it comes to weighing up the value of your commentaries.
Finally, I have to commend the way that everyone on the Anandtech team appears to read through the comments under their articles. It's rather encouraging when suggestions and corrections for articles are noted and acted upon promptly, even when it involves extra work (re-running benchmarks, creating new graphs etc.). And the touch of humour that comes across in some of the replies (and articles) from the team makes a good comedic interlude during an otherwise somewhat bland day at work.
Keep up the good work Anandtech!
Transisto - Wednesday, October 14, 2009 - link
I like this place. . .
shotage - Wednesday, October 14, 2009 - link
Thumbs up to this post. These are my thoughts and sentiments also. Thank you to all @ Anandtech for excellent reading! Comments included ;)
Shayd - Wednesday, October 14, 2009 - link
Ditto, thanks!
Pastuch - Wednesday, October 14, 2009 - link
Fantastic post. I couldn't have said it better myself.
Scali - Wednesday, October 14, 2009 - link
We'll have to see. nVidia competed just fine against AMD with the G80 and G92. The biggest problem with GT200 was that they went 65 nm rather than 55 nm, but even so, it still held up against AMD's parts because of its performance advantage. G92 especially was hugely successful: incredible performance at a good price. Yes, the chip was larger than a 3870, but who cared?
Don't forget that GT200 is based on a design that is now 3 years old, which is ancient. Just going to GDDR5 alone will make the chip significantly smaller and less complex, because you only need half the bus width for the same bandwidth.
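To illustrate the bus-width point with a rough back-of-the-envelope sketch (the clock and width figures below are approximate retail specs, used purely for illustration): GDDR5 moves twice the data per pin per clock compared with GDDR3, so a 256-bit GDDR5 bus can roughly match a 512-bit GDDR3 bus.

```python
def bandwidth_gb_s(bus_width_bits, effective_rate_gt_s):
    """Peak memory bandwidth in GB/s: (bus width in bytes) x effective transfer rate."""
    return bus_width_bits / 8 * effective_rate_gt_s

# GTX 280 (GT200): 512-bit GDDR3 at ~2.2 GT/s effective
gtx280 = bandwidth_gb_s(512, 2.214)   # ~141.7 GB/s
# Radeon HD 5870: 256-bit GDDR5 at ~4.8 GT/s effective
hd5870 = bandwidth_gb_s(256, 4.8)     # ~153.6 GB/s

print(f"GTX 280: {gtx280:.1f} GB/s on a 512-bit bus")
print(f"HD 5870: {hd5870:.1f} GB/s on a 256-bit bus")
```

Half the bus width means far fewer memory-interface pads and traces, which is where the die-size and board-complexity savings come from.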
Then there's probably tons of other optimizations that nVidia can do to the execution core to make it more compact and/or more efficient.
I saw someone who estimated the number of transistors per shader processor based on the current specs of Fermi, compared to G80/G92/GT200. The result was that they were all around 5.5M transistors per SP, I believe. So that means that effectively nVidia gets the extra flexibility 'for free'.
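The arithmetic behind that estimate is easy to redo. The transistor counts below are approximate public figures (and the Fermi numbers were preliminary at the time), so treat this as an illustration rather than an authoritative comparison:

```python
# Approximate transistor counts (millions) and shader processor counts
chips = {
    "G80":   (681, 128),
    "G92":   (754, 128),
    "GT200": (1400, 240),
    "Fermi": (3000, 512),  # preliminary figures as of late 2009
}

for name, (transistors_m, sps) in chips.items():
    # Crude metric: total transistors divided evenly across SPs
    print(f"{name}: {transistors_m / sps:.1f}M transistors per SP")
```

Every generation lands in the same 5-6M-transistors-per-SP neighborhood, which is the basis for the "extra flexibility for free" argument.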
Combine that with the fact that 40 nm allows them to scale to higher clockspeeds, and allows them to pack more than twice the number of SPs on a single chip, and the chip as a whole will probably be more efficient anyway, and it seems very likely that this chip will be a great performer.
And if you have the performance, you dictate the prices. It will then be the salvage parts and the scaled down versions of this architecture that will do the actual competing against AMD's parts, and those nVidia chips will obviously be in a better position to compete on price than the 'full' Fermi.
If Fermi can make the 5870 look like a 3870, nVidia is golden.