NVIDIA's GeForce 8800 (G80): GPUs Re-architected for DirectX 10
by Anand Lal Shimpi & Derek Wilson on November 8, 2006 6:01 PM EST
The Test
In our game tests, we enabled the highest level of quality each game offered for features and effects. Where it was an option we enabled 16xAF in game. In games with "texture filtering" settings (like Battlefield 2) we enabled the highest level of filtering in game, and in Oblivion we forced 16xAF in the driver control panel.
With the exception of Oblivion, we enabled AA in all our general performance tests. Where we were given the option, we chose 4xAA. In Black & White 2 and Company of Heroes we enabled AA in game (High for BW2 and Enabled for CoH).
CPU: Intel Core 2 Extreme X6800 (2.93GHz/4MB)
Motherboard: EVGA nForce 680i SLI, Intel BadAxe
Chipset: NVIDIA nForce 680i SLI, Intel 975X
Chipset Drivers: Intel 7.2.2.1007 (Intel), NVIDIA nForce 9.35
Hard Disk: Seagate 7200.7 160GB SATA
Memory: Corsair XMS2 DDR2-800 4-4-4-12 (1GB x 2)
Video Card: Various
Video Drivers: ATI Catalyst 6.10, NVIDIA ForceWare 96.97, NVIDIA ForceWare 91.47 (G70 SLI)
Desktop Resolution: 2560 x 1600 - 32-bit @ 60Hz
OS: Windows XP Professional SP2
A Few Words about Performance Per Watt
In the coming performance pages we will look at the performance of the 8800 series of graphics cards as well as the power consumption and performance per watt of our test systems. Note that the power consumption and performance per watt figures we report are for the entire system, not just the GPU. This is an important distinction to keep in mind: performance per watt of the GPU alone could be very different from what we're reporting here. What these numbers do tell you is how efficient each complete system is as we've configured it, and which of the setups we tested is the most power efficient.
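The metric itself is simple: average frame rate divided by total system power draw measured at the wall. The sketch below shows only that arithmetic; the frame rates, wattages, and card names are placeholders for illustration, not our measured results or our actual tooling.

```python
# A minimal sketch of how a system-level performance-per-watt number is
# derived. All values below are placeholders, not measured data.

def perf_per_watt(avg_fps: float, system_power_w: float) -> float:
    """Frames delivered per second for each watt the whole test system draws."""
    return avg_fps / system_power_w

# Hypothetical configurations running the same benchmark at the same settings.
configs = {
    "Card A system": {"avg_fps": 90.0, "system_power_w": 310.0},
    "Card B system": {"avg_fps": 75.0, "system_power_w": 240.0},
}

for name, c in configs.items():
    print(f"{name}: {perf_per_watt(c['avg_fps'], c['system_power_w']):.3f} fps/W "
          "(whole system, not the GPU alone)")
```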
Comments
aweigh - Friday, November 10, 2006 - link
You can just use the program DX Tweaker to enable Triple Buffering in any D3D game and use your VSYNC with negligible performance impact. So you can play with your VSYNC, a high res and AA as well. :)

aweigh - Friday, November 10, 2006 - link
I'm gonna buy an 88 specifically to use 4x4 SuperSampling in games. Why bother with MSAA with a card like that?

DerekWilson - Friday, November 10, 2006 - link
Supersampling can make textures blurry, especially very detailed textures. And the impact will be much greater with longer, more detailed pixel shaders, as the shaders must be evaluated at every sub-pixel when supersampling (a short sketch after this comment illustrates the cost difference).
I think transparency / adaptive AA are enough.
On your previous comment, I don't think we're to the point where we can hit triple buffering, vsync, high levels of AA AND high resolution (2560x1600) without some input lag (triple buffering plus vsync with framerates less than your refresh rate can cause problems).
If you're talking about enabling all these options on a lower resolution lcd panel, then I can definitely see that as a good use of the hardware. And it might be interesting to look at more numbers with these type of options enabled.
Thanks for the suggestion.
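As a rough illustration of Derek's point about supersampling cost, here is a back-of-the-envelope counting exercise. The resolution and sample counts are illustrative assumptions, and real-world cost also depends on shader length, bandwidth and caching, so treat this strictly as a sketch.

```python
# Counting shader work: 4x MSAA shades each pixel once (the extra samples only
# add coverage/depth work), while 4x4 supersampling runs the pixel shader for
# every sub-sample. Figures are illustrative, not benchmark data.

width, height = 2560, 1600
pixels = width * height

ssaa_samples = 16  # 4x4 supersampling: one shader invocation per sub-sample

msaa_shader_invocations = pixels                 # shading cost roughly unchanged
ssaa_shader_invocations = pixels * ssaa_samples  # shading cost scales with samples

print(f"Pixels per frame:             {pixels:,}")
print(f"4x MSAA shader invocations:   {msaa_shader_invocations:,}")
print(f"4x4 SSAA shader invocations:  {ssaa_shader_invocations:,} "
      f"({ssaa_shader_invocations // msaa_shader_invocations}x the shading work)")
```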
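On the triple buffering plus vsync point, the sketch below uses a simplified model (a fixed GPU frame time, no CPU/GPU overlap, no render-ahead beyond the spare back buffer) to show why double buffering with vsync quantizes the frame rate to divisors of the refresh rate, while triple buffering keeps throughput up at the cost of a finished frame possibly waiting in the queue. The 25 ms frame time is an assumed figure, not a measurement.

```python
import math

# Simplified model of vsync behaviour at 60Hz with a fixed GPU frame time.
refresh_hz = 60.0
refresh_interval_ms = 1000.0 / refresh_hz  # ~16.7 ms between vblanks
gpu_frame_time_ms = 25.0                   # assumed: raw rendering rate of 40 fps

# Double buffering + vsync: the GPU stalls until the next vblank before it can
# reuse the back buffer, so every frame occupies a whole number of refresh
# intervals and the frame rate snaps to 60, 30, 20, 15...
intervals_per_frame = math.ceil(gpu_frame_time_ms / refresh_interval_ms)
double_buffered_fps = refresh_hz / intervals_per_frame

# Triple buffering + vsync: the GPU keeps rendering into the spare buffer, so
# throughput stays near the raw rate, but a finished frame can wait up to one
# refresh interval in the queue before being displayed (added input latency).
triple_buffered_fps = min(1000.0 / gpu_frame_time_ms, refresh_hz)
extra_queue_latency_ms = refresh_interval_ms

print(f"Raw GPU rate:             {1000.0 / gpu_frame_time_ms:.1f} fps")
print(f"Double buffering + vsync: {double_buffered_fps:.1f} fps")
print(f"Triple buffering + vsync: {triple_buffered_fps:.1f} fps, "
      f"plus up to {extra_queue_latency_ms:.1f} ms of queue latency")
```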
aweigh - Saturday, November 11, 2006 - link
I never knew that about SuperSampling. Is it something similar to Quincunx blurring? And would using a negative LOD via RivaTuner/nHancer counteract the effect? How about NVIDIA's Digital Sharpness setting in Color Correction? I've found a smidge of sharpening can do wonders to improve overall clarity.
By the way, when you said Adaptive AA, were you referring to ATI cards?
Unam - Friday, November 10, 2006 - link
Derek, saw your comment regarding the rationale for the test resolution. While I understand your reasoning now, it still begs the question: how many of your readers have 30" LCD flat panels?
DerekWilson - Friday, November 10, 2006 - link
There might not be many out there right now, but it's still the right test platform for G80. We did test down to 1600x1200, so people do have information if they need it. But it speaks to who should own an 8800 GTX right now. It doesn't make sense to spend that much money on a part if you aren't going to get anything out of it with your 1280x1024 panel.
Owners of a 2560x1600 panel will want an 8800 GTX. Owners of an 8800 GTX will want a 2560x1600 panel. Smooth framerates and the ability to enable 4xAA in every game that allowed it are reason enough. People without a 2560x1600 panel should probably wait to buy the card until prices come down on the 8800 GTX or until games arrive that are able to push it harder.
Unam - Tuesday, November 14, 2006 - link
Derek, a follow-up on testing resolutions: the FPS numbers we see in your articles, are they maximum, minimum or average?
Unam - Friday, November 10, 2006 - link
Who the heck runs 2560x1600? At 4XAA? Come on guys, real world benchmarks please!

DerekWilson - Friday, November 10, 2006 - link
We did: 1600x1200, 1920x1440, and even 1280x1024 in Oblivion.
dragonsqrrl - Thursday, August 25, 2011 - link
....lol, owned.