NVIDIA's Bumpy Ride: A Q4 2009 Update
by Anand Lal Shimpi on October 14, 2009 12:00 AM EST, posted in GPUs
Chipsets: One Day You're In and the Next, You're Out
Presently, NVIDIA’s chipset business is far from dead. Its chipsets are in nearly every Apple computer on the market, not to mention countless other OEM systems. I’m not sure how much money NVIDIA is making from these chipsets, but they are selling.
NVIDIA won Apple's chipset business; Intel was not happy
Long term I don’t see much of a future for NVIDIA’s chipset business. NVIDIA said that they have no interest in pursuing an LGA-1156 chipset given Intel’s legal threats. Even if NVIDIA had a license to produce DMI chipsets, I’m not sure it makes sense.
NVIDIA's Advantage: A single chip GeForce 9400M instead of a dated Intel solution
Once the ‘dales (Clarkdale and Arrandale) hit, every single mainstream CPU from Intel is going to come with graphics on-package. Go out one more generation and Sandy Bridge brings the graphics on-die. AMD is doing the same thing starting in 2012.
It’s taken longer than expected, but there’s honestly no need for a third party chipset maker anymore. Most of the performance differentiation in chipsets has been moved onto the CPU die anyway, all that’s left are SATA, USB, and a bunch of validation that no one likes doing. NVIDIA is much better off building a discrete GeForce 9400M GPU at low cost and selling that. There’s much less headache involved with selling discrete GPUs than selling chipsets, plus graphics is NVIDIA’s only value add when it comes to chipsets - everyone knows how to integrate a USB controller by now. I’d say the same about SATA but AMD still has some AHCI silliness that it needs to sort out.
NVIDIA committed to supporting existing products in the channel and continues to poke fun at AMD with lines like this:
“On AMD platforms, we continue to sell a higher quantity of chipsets than AMD itself. MCP61-based platforms continue to be extremely well positioned in the entry CPU segments where AMD CPUs are most competitive vs. Intel”
As successful as NVIDIA’s AMD chipsets are today, AMD is telling us that nearly all OEM designs going forward use AMD chipsets. Again, NVIDIA’s chipset business is quite healthy today, but I don’t see much of a future in it - not that it’s a bad thing.
The only reason NVIDIA’s chipset business has lasted this long is because AMD and Intel couldn’t get their houses in order quickly enough. AMD is finally there and Intel is getting there, although it remains to be seen how well the next-generation of Atom platforms will work in practice.
A pair of Ion motherboards we reviewed
The main reason Ion got traction in the press was because it could play Blu-ray content. If Intel had done the right thing from the start and paired Atom with a decent chipset, NVIDIA would never have had the niche for Ion to fit into.
106 Comments
medi01 - Saturday, October 17, 2009
Since when are sales based only on product quality? How many buyers are actually aware of performance details?
How many of those that are aware are NOT loyal to one brand?
To me it rather seems nVidia, despite not having an answer to the 5800 series for a few months, will still successfully sell overpriced cards. It's AMD that will continue to make huge losses, despite having good products and a "better value" pricing policy.
Customers should be interested in healthy competition. "I buy the inferior product from a company that already dominates the market" will simply kill the underdog, and then it'll show us... :(
kashifme21 - Wednesday, October 21, 2009
By supporting consoles, they might have gotten sales in the short term. However, in the long term it has been a disaster for both of them. Console sales, even putting the Xbox 360 and PS3 together, do not exceed 60 million, which imo is not much of an amount for these two companies.
What has happened now is that the focus of developers has shifted to the consoles, which is why jumps in graphics have stagnated. The result is that even PC users don't need constant upgrades the way they used to, which means fewer sales from the PC market for Nvidia and ATI.
Also, as a note: previously a PC user would need to upgrade about every 2 years to stay up to date with the latest games; now, since games are developed with consoles in mind, the cycle has become 7 years. So where a PC user previously needed to upgrade once every 2 years, it's only going to be once every 7 years now; there simply won't be games out to take advantage of the hardware.
Also, the console market ATI and Nvidia chose to cater to is in the same situation: a console user will simply buy one console and then make no hardware purchase for the rest of the generation unless the console fails.
Imo, going for console sales might have given a sale in the short term, but in the long term it's been bad for all the PC hardware makers, be it CPU, GPU, RAM, chipsets, etc. As time goes on this will get worse, especially if another console generation is supported.
KhadgarTWN - Sunday, October 25, 2009
On the console part, I have a slightly different thought. For a very short period, consoles boost GPU sales; a bit longer, consoles devour PC gaming and hurt hardware sales, that's true.
But a little longer out? If selling GPUs to consoles is a mistake, could they "fix" it?
Think about it: if AMD (ATi) and nVidia had refused to develop the graphics parts of the consoles, and Larrabee failed, where is the next-gen console?
In the years of the PS2/DC/Xbox, that was no big deal: Sony had its EE + GS, and Nintendo had its own parts. Maybe the Xbox would never have been born, but no big deal; consoles would still be strong and prosperous.
For now: no AMD, no Xbox 360; no nVidia, no PS3. And on the Nintendo side? It's doubtful they could use their own parts.
msroadkill612 - Wednesday, October 28, 2009
Am trying to make amends for being a bit of a leech on geek sites and not contributing. Bit off topic, but I hope some of you will find the post below useful. There is a deluge of requests around the geek forums re what graphics card to buy, but never have I seen a requester specify whether they plan an always-on PC vs. one that's "on only when in use".
A$0.157 / kWh, Sydney, Australia - Oct 09 (A$1 = ~US$0.92 now)
USA prices link:
http://www.eia.doe.gov/cneaf/electricity/epm/table...
(so Sydney prices are very similar to California - 15.29 US cents / kWh)
so:
in an always-on PC, for a graphics card which draws 10 watts less at IDLE than an alternative card:
(My logic here is: if you're having fun at FULL LOAD, then who cares what the load power draw/cost is; in most cases this is a small portion of the week and can logically be ignored - IF, and this is a big IF, your power save / sleep settings are set and working correctly.)
A$0.157 / 1000 x 10 W = A$0.00157 per hour (substitute your local cost - always increasing, though, which negates you bean counters' net-present-worth objections)
x 24 x 365
= A$13.75 p.a. for each extra 10 W of idle draw (I hope the math is right)
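For anyone who wants to plug in their own numbers, here is a minimal Python sketch of that same arithmetic; the A$0.157/kWh rate and the 10 W delta are just the figures quoted above, and the function name and structure are purely illustrative:

```python
# Annual cost of extra idle power draw for an always-on PC.
# Rate (A$0.157/kWh) and wattage (10 W) are the figures quoted in this comment;
# substitute your own local electricity price and card wattages.

HOURS_PER_YEAR = 24 * 365  # always-on PC

def annual_idle_cost(extra_watts: float, price_per_kwh: float) -> float:
    """Cost per year of drawing `extra_watts` continuously at `price_per_kwh`."""
    kwh_per_year = extra_watts / 1000.0 * HOURS_PER_YEAR
    return kwh_per_year * price_per_kwh

if __name__ == "__main__":
    print(annual_idle_cost(10, 0.157))  # ~13.75 (A$ per year for an extra 10 W at idle)
```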
If you use air conditioning all day all year, you can theoretically double this cost.
If, however, you use electric bar radiators for heating all day all year, then I'm afraid, my dear Watson, the "elementary" laws of physics dictate that you must waste no further time on this argument. It does not concern you, except to say that you are in the enviable position of soon being able to buy a formerly high-end graphics card (am thinking dual core Nvidia here) for about the cost of a decent electric radiator, and getting quite decent framerates thrown in free with your heating bill.
Using the specs below and the prices above, an HD 4870 (90 W) costs US$84.39 more to idle all year in California than the 5850/5870 (27 W) at current prices. In about 18 months the better card should repay the premium paid for it (see the sketch after the spec list below).
Hope this helps some of you cost justify your preferred card to those who must be obeyed.
ATI HD 5870 & 5850 idle power: 27 W
ATI Radeon HD 4870 idle: 90 W
ATI HD 5770: 18 W
ATI HD 5750: 16 W
780G chipset mobo - AMD claims idle power consumption of the IGP is just 0.95 W!
Some Nvidia card idle power specs (GPU / idle watts):
NVIDIA GeForce GTX 280: 30 W
Gigabyte GeForce 9800 GX2 GV-NX98X1GHI-B: 85 W
ZOTAC GeForce 9800 GTX AMP! Edition ZT-98XES2P-FCP: 50 W
FOXCONN GeForce 9800 GTX Standard OC Edition 9800GTX-512N: 48 W
ZOTAC GeForce 9800 GTX 512MB ZT-98XES2P-FSP: 53 W
MSI NX8800GTX-T2D768E-HD OC GeForce 8800 GTX: 76 W
ZOTAC GeForce 8800 GT 512MB AMP! Edition ZT-88TES3P-FCP: 33 W
Palit GeForce 9600 GT 1GB Sonic NE/960TSX0202: 30 W
FOXCONN GeForce 8800 GTS FV-N88SMBD2-OD: 59 W
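And a rough Python sketch of the running-cost comparison above, using a few of the idle wattages listed and the ~15.29 US cents/kWh California rate quoted earlier; the price premium in the payback line is an assumed, illustrative figure, not something quoted in this comment:

```python
# Idle-power running-cost comparison for an always-on PC, using idle wattages
# from the list above and the ~15.29 US cents/kWh California rate quoted earlier.

HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH = 0.1529  # US$ / kWh (California figure quoted above)

idle_watts = {
    "ATI HD 5870/5850": 27,
    "ATI HD 4870": 90,
    "GeForce GTX 280": 30,
    "GeForce 8800 GTX (MSI OC)": 76,
}

def annual_cost(watts: float) -> float:
    """US$ per year to hold this idle draw 24/365."""
    return watts / 1000.0 * HOURS_PER_YEAR * PRICE_PER_KWH

for name, w in sorted(idle_watts.items(), key=lambda kv: kv[1]):
    print(f"{name:28s} {w:3d} W  ~US${annual_cost(w):6.2f}/year at idle")

# Difference between the HD 4870 and HD 5850/5870, and payback time for a
# hypothetical price premium (the premium value is illustrative, not quoted).
diff = annual_cost(90) - annual_cost(27)       # ~US$84.4/year
premium = 120.0                                # assumed premium in US$
print(f"Payback: {premium / diff:.1f} years")  # ~1.4 years with this assumption
```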
Wolfpup - Friday, October 16, 2009
Okay, huh? I get it for the low-end stuff, but for mid-range does this mean we're going to be wasting a ton of transistors on worthless integrated video that won't be used, taking up space that could have been more cache or another core or two?!? I have NO use for integrated graphics. Never have, never will.
Seramics - Friday, October 16, 2009
Nvidia is useless and going down. I don't mind if they get out of the market at all.
Sandersann - Friday, October 16, 2009
Even if money is not an issue for you, you want Nvidia and ATI to do well, or at least stay competitive, because the competition encourages innovation. Without it, you will have to pay more for fewer features and less speed. We might get a taste of that this Christmas and into Q1 2010.
medi01 - Friday, October 16, 2009
http://store.steampowered.com/hwsurvey/
AMD - 27%
nVidia - 65% (ouch)
shin0bi272 - Friday, October 16, 2009
Hey, can you post the most common models too? The last time I looked at that survey the 7800 GTX was the fastest card and most people on the survey were using like 5200s or something outrageous like that.
tamalero - Sunday, October 18, 2009
http://store.steampowered.com/hwsurvey/videocard/
Most people are on the 8800 series; it doesn't reflect the "current" gen.
In the last gen, the 48XX series is increasing to 11% while the 260 series is around 4%.