GeForce 8800 Roundup: The Best of the Best
by Josh Venning on November 13, 2006 11:04 AM EST - Posted in GPUs
Noise
In the era of power and performance per watt, noise is just as important as any other metric. NVIDIA has managed to outfit all of its high end GPUs with relatively quiet coolers, and the 8800 series is no different. We tested the noise level of each of these cards while the system was idle, and we also measured the ambient noise level of the room (with the system off) for reference: 38.1 dB. Under load, none of the cards got any louder, as we never heard the fans speed up.
The only card that really stood out in these tests was the Sparkle Calibre 8800 GTX. While the peltier element does a great job cooling the GPU, the fans required to cool the peltier make this card much louder than the competition. The Calibre 8800 GTX was about 6 dB louder than any of the other 8800s, with the rest of the cards all registering a fairly consistent noise level of about 48 dB. In fact, the Sparkle Calibre is one of the few cards that can actually surpass ATI X1950 XTX noise levels, a dubious achievement at best.
Final Words
We've already seen what kind of performance the 8800 GTX and GTS are capable of in our 8800 launch article. Today we put each of our 8800 samples through a series of tests to see what kind of power consumption, heat, and noise levels they produce, as well as what kind of user overclocks they can reach. We found that the EVGA e-GeForce 8800 GTX and the MSI GeForce NX8800 GTX got the highest overclocks of the group, and they saw some impressive gains in performance as a result.
The EVGA GeForce 8800 GTX w/ ACS3 does a pretty good job of keeping heat down, resulting in the highest overclock of the roundup, and the card had no problems with excess noise. On the opposite end of the spectrum was the Sparkle Calibre 8800 GTX. Thanks to the card's unique peltier cooler, the Calibre 8800 GTX had extremely high power demands, even for a card as power hungry as the 8800. Although the GPU ran cooler thanks to the peltier element, we couldn't overclock it any further, and it managed to be the loudest card in the roundup. Extra noise, heat, and power with no tangible benefit is not what we like to see.
Because most of the 8800s we had for this roundup kept the reference design, we didn't see much difference between them in terms of power, heat, and noise (with the exception of the Sparkle Calibre 8800 GTX). Also, as we said earlier, because it's so early in the 8800's launch, the prices for these cards were generally the same: $650 for the 8800 GTXs and $480 for the GTS cards. We weren't able to find either of the Sparkle cards or the Leadtek Winfast 8800 GTX for sale yet, and the EVGA cards currently available appear to be the standard (i.e. non-ACS3) model, but aside from the Calibre sample, we can expect them to carry around the same price tags.
If we had to recommend one of these 8800 cards over the others, the slight nod goes to the EVGA e-GeForce 8800 GTX w/ ACS3; not only did our sample of this card get one of the highest overclocks, but it also ran fairly cool compared to our other 8800 GTXs. If it comes to market for nearly the same price as the others, the decision is simple; on the other hand, a $25-$50 price premium might be too much. If you are among the lucky few able to put down the money for one of these cards, the EVGA 8800 GTX with ACS3 cooling is the one to go for. If you can't find the EVGA card, then pretty much any of the reference designs will do, and although Sparkle gets extra points for trying something different with its peltier cooler, the implementation just didn't work out.
34 Comments
JarredWalton - Monday, November 13, 2006
It appears Oblivion isn't fully able to use all the SPs at present. The stock 8800 GTX should still have about 17% more potential core performance, although maybe not? If the SPs run at 1.35 GHz, what runs at 575 MHz? Or in the case of the OC'ed GTS, at 654 MHz? It could be they have a similar number of ROPs or some other logic that somehow makes the core clock more important in some cases. Or it could just be that the drivers need more optimizations to make the GTX outperform the GTS in all games. Obviously Oblivion isn't GPU bandwidth limited; beyond that, more testing will need to be done.

dcalfine - Monday, November 13, 2006
What about the liquid-cooled BFG 8800 GTX? Any news on that? I'd be interested in seeing how it compares in speed, overclockability, temperature, and power consumption.
Keep up the good work though!
shamgar03 - Monday, November 13, 2006
I ordered one; hopefully it will do well in the overclocking section. I am a bit concerned with the differences in overclocking the cards from different manufacturers. Does anyone know the cause of that? I mean, if two cards are exactly the same as the reference design except for the sticker, you have to wonder if there is a bit of variance in the quality of semiconductor production. Maybe favored distributors get the better cores? Any thoughts on what causes these differences?

yyrkoon - Monday, November 13, 2006
I assume this text about the Sparkle card is in reference to its inability to overclock? In my opinion, I would rather use this card, or another card that ran equally well (or better) and remained as cool (or cooler). I don't know about you guys, or anyone else, but the thought of a graphics card approaching 90C at load (barring the Sparkle) scares the crap out of me, and if this is a sign of things to come, then I'm not sure what my future options are. Let's not forget about 300+ watts under load...

Just as the heat and power consumption are an issue (once again, in my opinion), equally disturbing is the brass it takes to charge $650 USD for a first-generation card that obviously needs a lot of work. Yes, it would be nice to own a card that pumps out graphics better than anything previous; however, I personally would rather pay $650 for something that ran a lot cooler and offered just as much performance, or better.

Now, to the guy talking about Vista RC2 drivers from nVidia... Do you really expect someone to keep up on drivers for a "product" that is basically doomed to die a quiet death? "RC2"... release candidate... As far as I'm aware, the last I checked, a lot of the graphics features (of Vista) in these betas weren't even implemented. This means that, quite possibly, the drivers between RC2 and release could be a good bit different. Personally, I'd rather have nVidia work on the finished-product drivers rather than the release candidate drivers any day of the week. Aside from yourself, I hardly think anyone cares if you want to run RC2 until May 2007 (legally).
Griswold - Thursday, November 23, 2006
I fail to see your issue with temperatures. These cards were designed to run safely at these temperatures. Just because the figures are higher than you have become used to over the years doesn't mean it's bad.

RMSistight - Monday, November 13, 2006
How come the Quad SLI setup was not included in the tests? Quad SLI owners want to know.

DigitalFreak - Monday, November 13, 2006
LOL. You really want to see how badly a $1200 setup will get spanked by a single card that costs half as much? You must be a masochist.
penga - Monday, November 13, 2006
Hey, I'm always interested in the most exact wattage number a card uses, and I find it hard to do the math from the given total system power consumption and conclude how much only the card eats. So my idea was: why not use a mainboard with an integrated graphics card and compare the numbers? Hope you get the idea. What do you think, wouldn't that work?

DerekWilson - Monday, November 13, 2006
The only way to do this would be to place extremely low resistance (but high current) shunt resistors in the power lines AND build a PCIe riser card to measure the power supplied by the motherboard while the system is running at load. There isn't a really good way to report the power of just the card any other way -- using an onboard graphics card wouldn't do it because the rest of the system would be using a different amount of power as well (different cards require the system to do different types of work -- a higher powered graphics card will cause the CPU, memory, and chipset to all work harder and draw more power than a lower performance card).
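A minimal back-of-the-envelope sketch of this point, using purely hypothetical wattages (none of these are measured values): because the platform's share of the total draw changes depending on which card it is feeding, subtracting an integrated-graphics baseline from the total system draw misattributes that difference to the card.

```python
# Hypothetical numbers purely for illustration -- not measured values.
# Total wall power = card power + platform power (CPU, memory, chipset, ...),
# and the platform's share grows when it has to feed a faster card.

igp_total = 180.0            # assumed: system with integrated graphics under load (W)
platform_with_card = 230.0   # assumed: CPU/memory/chipset draw while feeding a fast card (W)
card_actual = 130.0          # assumed: what the discrete card itself really draws (W)

total_with_card = platform_with_card + card_actual   # 360 W at the wall

naive_estimate = total_with_card - igp_total          # 180 W
error = naive_estimate - card_actual                   # 50 W overestimate

print(f"Naive subtraction says the card uses {naive_estimate:.0f} W")
print(f"It actually uses {card_actual:.0f} W -- off by {error:.0f} W,")
print("because the rest of the system works harder with a faster card installed.")
```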
yyrkoon - Monday, November 13, 2006
Derek, I think he was asking: "why not use an integrated graphics motherboard as a reference system for power consumption tests?" However, it should be obvious that this wouldn't be a good idea from a game benchmark perspective, in that it's been my experience that integrated graphics mainboards don't normally perform as well and often use dated technology/components. Although I haven't really paid that much attention to detail, I would assume you guys use the "best" motherboard for gaming benchmarks, and probably use the same mainboard for the rest of your tests.