NVIDIA's Bumpy Ride: A Q4 2009 Update
by Anand Lal Shimpi on October 14, 2009 12:00 AM EST - Posted in GPUs
Chipsets: One Day You're In and the Next, You're Out
Presently, NVIDIA’s chipset business is far from dead. Its chipsets are in nearly every single Apple computer on the market, not to mention countless other OEM systems. I’m not sure how much money NVIDIA is making from these chipsets, but they are selling.
NVIDIA won Apple's chipset business, Intel was not happy
Long term I don’t see much of a future for NVIDIA’s chipset business. NVIDIA said that they have no interest in pursuing an LGA-1156 chipset given Intel’s legal threats. Even if NVIDIA had a license to produce DMI chipsets, I’m not sure it makes sense.
NVIDIA's Advantage: A single chip GeForce 9400M instead of a dated Intel solution
Once the ‘dales (Clarkdale and Arrandale) hit, every single mainstream CPU from Intel is going to come with graphics on-package. Go out one more generation and Sandy Bridge brings the graphics on-die. AMD is doing the same thing starting in 2012.
It’s taken longer than expected, but there’s honestly no need for a third party chipset maker anymore. Most of the performance differentiation in chipsets has been moved onto the CPU die anyway, all that’s left are SATA, USB, and a bunch of validation that no one likes doing. NVIDIA is much better off building a discrete GeForce 9400M GPU at low cost and selling that. There’s much less headache involved with selling discrete GPUs than selling chipsets, plus graphics is NVIDIA’s only value add when it comes to chipsets - everyone knows how to integrate a USB controller by now. I’d say the same about SATA but AMD still has some AHCI silliness that it needs to sort out.
NVIDIA committed to supporting existing products in the channel and continues to poke fun at AMD with lines like this:
“On AMD platforms, we continue to sell a higher quantity of chipsets than AMD itself. MCP61-based platforms continue to be extremely well positioned in the entry CPU segments where AMD CPUs are most competitive vs. Intel”
As successful as NVIDIA’s AMD chipsets are today, AMD is telling us that nearly all OEM designs going forward use AMD chipsets. Again, NVIDIA’s chipset business is quite healthy today, but I don’t see much of a future in it - not that it’s a bad thing.
The only reason NVIDIA’s chipset business has lasted this long is because AMD and Intel couldn’t get their houses in order quickly enough. AMD is finally there and Intel is getting there, although it remains to be seen how well the next-generation of Atom platforms will work in practice.
A pair of Ion motherboards we reviewed
The main reason Ion got traction in the press was because it could play Blu-ray content. If Intel had done the right thing from the start and paired Atom with a decent chipset, NVIDIA would never have had the niche for Ion to fit into.
106 Comments
iwodo - Wednesday, October 14, 2009 - link
Why has no one thought of an Nvidia chipset using PCI Express x8? Couldn't you theoretically make an mGPU with I/O functions (the only things left are SATA, USB, and Ethernet) and another PCI Express x8 link, so the mGPU communicates with another Nvidia GPU over its own lanes without going back to the CPU?
chizow - Wednesday, October 14, 2009 - link
[quote]Let’s look at what we do know. GT200b has around 1.4 billion transistors and is made at TSMC on a 55nm process. Wikipedia lists the die at 470mm^2, that’s roughly 80% the size of the original 65nm GT200 die. In either case it’s a lot bigger and still more expensive than Cypress’ 334mm^2 40nm die.[/quote]
Anand, why perpetuate this myth comparing die sizes and price on different process nodes? Surely someone with intimate knowledge of the semiconductor industry like yourself isn't claiming a single TSMC 300mm wafer on 40nm costs the same as 55nm or 65nm?
A wafer is just sand with some copper interconnects; the raw material price means nothing for the end price tag. Cost is determined by capitalization of the assets used to manufacture the goods; the actual raw material involved means very little. Obviously the uncapitalized investment in the new 40nm process exceeds that of 55nm or 65nm, so obviously prices would need to be higher to compensate.
I can't think of anything "old" that costs more than the "new" despite the "old" being larger. If you think so, I have about 3-4 100 lb CRT TVs I want to sell you for current LCD prices. ;)
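To put rough numbers on it, here's a quick back-of-the-envelope sketch in Python. The die areas are the ones quoted above; the wafer prices are made-up placeholders (actual TSMC pricing isn't public), so treat it purely as an illustration of why die area alone doesn't tell you the cost per chip:
[code]
import math

# Rough dies-per-wafer / cost-per-die comparison.
# Die areas come from the article; wafer prices are ASSUMED placeholders,
# not real TSMC numbers; they only illustrate the shape of the argument.

def gross_dies_per_wafer(die_area_mm2, wafer_diameter_mm=300.0):
    """Usable-area estimate: wafer area / die area, minus an edge-loss term."""
    r = wafer_diameter_mm / 2.0
    return int(math.pi * r * r / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

chips = {
    "GT200b (55nm)":  {"area_mm2": 470.0, "wafer_cost_usd": 4000.0},  # assumed price
    "Cypress (40nm)": {"area_mm2": 334.0, "wafer_cost_usd": 5500.0},  # assumed price
}

for name, c in chips.items():
    dies = gross_dies_per_wafer(c["area_mm2"])
    cost = c["wafer_cost_usd"] / dies
    print(f"{name}: ~{dies} gross dies per wafer, ~${cost:.0f} per die (before yield)")
[/code]
With those assumed wafer prices, the bigger 55nm die and the smaller 40nm die land in roughly the same cost-per-die ballpark before yield, which is exactly the point: you can't read cost off die area without knowing what a wafer costs on each node.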
In any case, I think the concerns about selling GT200b parts are a bit unfounded and mostly to justify the channel supply deficiency. We already know the lower bounds of GT200b pricing, the GTX 260 has been selling for $150 or less with rebates for quite some time already. If anything, the somewhat artificial supply deficiency has kept demand for Nvidia parts high.
I think it was more of a calculated risk by Nvidia to limit their exposure to excess inventory in the channel, which was reportedly a big issue during the 65nm G92/GT200 to 55nm G92b/GT200b transition. There were also some rumors about Nvidia moving to more of a JIT delivery system to avoid some of the purchasing discounts that major partners were exploiting. They basically waited until the last day of the quarter for Nvidia to discount and unload inventory in an effort to beef up quarterly results.
chizow - Wednesday, October 14, 2009 - link
Properly formatted portion meant to be quoted in above post for emphasis.
MadMan007 - Wednesday, October 14, 2009 - link
Possibly the most important question for desktop PC discrete graphics, from gamers who aren't worried about business analysis, is: 'What will be the rollout of the Fermi architecture to non-high-end, non-high-cost graphics cards?' Is NV going to basically shaft that market by going with the cut-down GT200-series DX10.1 chips like GT220? (OK, that's a little *too* low-end, but I mean the architecture.) As much as we harped on the G92 renaming, at least it was competitive versus the HD4000 series in certain segments, and the large GT200s, the GTX 260 in particular, were OK after price cuts. The same is very likely not going to be true for DX10.1 GT200 cards, especially when you consider less tangible things like DX11, which people will feel better buying anyway.
Answer that question and you'll know the shape of desktop discrete graphics for this generation.
vlado08 - Wednesday, October 14, 2009 - link
I think that Nvidia is preparing Fermi to meet Larrabee. They are probably confident that AMD isn't a big threat. They know them very well and know what they are capable of and what to expect, but they don't know their new opponent. Everybody knows that Intel has very strong financial power, and if they want something they do it; sometimes it takes more time, but eventually they see everything through. They are a force not to be underestimated. If Nvidia has any time advantage, they should use it.
piesquared - Wednesday, October 14, 2009 - link
[quote]Other than Intel, I don’t know of any company that could’ve recovered from NV30.[/quote]
How about recovering from both Barcelona AND R600?
Scali - Thursday, October 15, 2009 - link
AMD hasn't recovered yet. They've been making losses quarter after quarter, and I expect more losses in the future now that Nehalem has gone mainstream. Some financial experts have put AMD on their list of companies most likely to go bankrupt in 2010:
http://www.jiltin.com/index.php/finance/economy/bi...
bhougha10 - Wednesday, October 14, 2009 - link
The one thing that is not considered in these articles is the real world. In the real world, people were not waiting for the ATI 5800s to come out, but they are waiting for the GTX 300s to come out. The perception in the marketplace is that NVIDIA is the name brand and ATI is the generic. It is not a big deal at all that the GTX 300s are late. Nvidia had the top cards for 3/4ths of the year; it is only healthy that ATI have the lead for 1/4th of the year. The only part that is bad for NVIDIA is that they don't have this stuff out for Christmas, and I am not sure that is even a big deal. Even so, these high-end cards are not gifts, they are budget items (i.e. like I plan to wait till the beginning of next year to buy this or that).
Go do some online gaming if you think this is all made up. You will have your AMD fanboys, but the percentage is low. (Sorry, didn't want to say fanboy.)
These new 210/220 GTs sound like crap, but the people buying them won't know that; they will pay the money and not be any the wiser that they could have gotten a better value. They only thought of the NVIDIA name brand.
Anyway, I say this in regard to the predictions of NVIDIA's eventual demise. It's the same reason these financial analysts can't predict the market well.
Another great article; a lot of smart guys working for these tech sites.
Zool - Wednesday, October 14, 2009 - link
So you think that the company that has the top card and is the name brand is the winner? Even with the top cards for 3/4ths of the year, they lost quite a lot of money on their predicted margins with the 4000-series Radeons on the market. They couldn't even bring GT200 down to midrange cards; the price for them was too high. And you think that with that monster GT300 it will be better? They will surely sell it, but at what cost to them?
bhougha10 - Wednesday, October 14, 2009 - link
AMD has lost money for the last 3 years (as far as I looked back), and not just a little money but a lot of money. NVIDIA, on the other hand, lost a little money last year. So, not sure of the point of that last reply. The point of the original post was that there is an intrinsic value or novelty that comes with a company. AMD has very little novelty. Those are just the facts. I have both AMD and NVIDIA stock; I want them both to win. It's good for everyone if they do.