NVIDIA's GeForce 8800 (G80): GPUs Re-architected for DirectX 10
by Anand Lal Shimpi & Derek Wilson on November 8, 2006 6:01 PM EST
Posted in GPUs
The 8800 GTX and GTS
Today we expect to see availability of two cards based on NVIDIA's G80 GPU: the GeForce 8800 GTX and 8800 GTS. Priced at $599 and $449 respectively, the two cards, as usual, differ in clock speeds and processing power.
8800 GTX (top) vs. 7900 GTX (bottom)
The 8800 GTX gets the full G80 implementation of 128 stream processors and 64 texture fetch units. The stream processors are clocked at 1.35GHz with the rest of the GPU running at 575MHz. The GTX has six 64-bit memory controllers operating in tandem, connected to 768MB of GDDR3 memory running at 900MHz. GDDR4 is supported but will be introduced on a later card.
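To put a rough number on what those clocks imply, here is a quick back-of-the-envelope calculation of the GTX's theoretical shader throughput. It assumes each stream processor retires one multiply-add (two floating-point operations) per clock and ignores any co-issued instructions, so treat it as our own estimate rather than an official peak figure.

    #include <cstdio>

    int main() {
        // GeForce 8800 GTX shader core, per the specs above
        const double stream_processors = 128;
        const double shader_clock_hz   = 1.35e9;  // 1.35GHz
        const double flops_per_sp      = 2;       // assumed: one MADD (mul + add) per clock

        double gflops = stream_processors * shader_clock_hz * flops_per_sp / 1e9;
        printf("Theoretical MADD throughput: %.1f GFLOPS\n", gflops);  // ~345.6
        return 0;
    }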
NVIO: Driving a pair of TMDS transmitters near you
You get two dual-link DVI ports driven by NVIDIA's new NVIO chip, which handles TMDS output and other functions NVIDIA hasn't yet disclosed. Keeping the TMDS transmitters on-die is difficult when logic elsewhere in the GPU runs at such high clock speeds, so with G80 NVIDIA moved them off-die and onto this separate chip. NVIO also supports HDCP, but a card needs the crypto ROM keys in order to offer full HDCP support. That final decision is up to the individual card manufacturers, although at this price point we hope they all choose to include HDCP support.
The 8800 GTX has two PCIe power connectors and two SLI connectors:
Two SLI connectors on the 8800 GTX
Bridges in action
The dual power connectors are necessary to avoid drawing more power from a single connector than the current ATX specification allows. The dual SLI connectors are for future applications, such as daisy-chaining three G80-based GPUs, much like ATI's latest CrossFire offerings.
dual power connectors
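To see why a single connector wouldn't cut it, here is a rough sketch of the available power budget. The 75W figures for the x16 slot and for each 6-pin auxiliary connector are commonly cited PCIe limits and are our assumption, not numbers from NVIDIA or from this article.

    #include <cstdio>

    int main() {
        // Assumed limits (not from the article): 75W via the x16 slot,
        // 75W per 6-pin auxiliary connector.
        const int slot_watts    = 75;
        const int six_pin_watts = 75;

        int one_connector  = slot_watts + 1 * six_pin_watts;  // 150W ceiling
        int two_connectors = slot_watts + 2 * six_pin_watts;  // 225W ceiling

        printf("Budget with one 6-pin connector:  %dW\n", one_connector);
        printf("Budget with two 6-pin connectors: %dW\n", two_connectors);
        return 0;
    }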
The GeForce 8800 GTS loses 32 SPs, bringing it down to 96 stream processors and 48 texture fetch units. The shader core runs at 1.2GHz, while the rest of the GTS runs at 500MHz. The GTS also has only five 64-bit memory controllers, with 640MB of GDDR3 memory running at 800MHz.
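The different memory controller counts and clocks translate directly into peak bandwidth. The sketch below works the numbers out, assuming the usual GDDR3 behavior of two transfers per clock; the resulting figures are our arithmetic based on the specs above, not quoted specifications.

    #include <cstdio>

    // Peak memory bandwidth in GB/s from bus width and memory clock
    static double bandwidth_gbs(int controllers, int bits_per_controller, double mem_clock_mhz) {
        double bus_bits = controllers * bits_per_controller;
        double transfers_per_sec = mem_clock_mhz * 1e6 * 2.0;  // GDDR3: two transfers per clock
        return bus_bits / 8.0 * transfers_per_sec / 1e9;
    }

    int main() {
        double gtx = bandwidth_gbs(6, 64, 900.0);  // 384-bit bus -> ~86.4 GB/s
        double gts = bandwidth_gbs(5, 64, 800.0);  // 320-bit bus -> ~64.0 GB/s
        printf("8800 GTX: %.1f GB/s\n", gtx);
        printf("8800 GTS: %.1f GB/s (%.0f%% of the GTX)\n", gts, gts / gtx * 100.0);
        return 0;
    }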
7900 GTX (left) 8800 GTS (middle) 8800 GTX (right)
The 8800 GTS has the same NVIO chip as the 8800 GTX, but the board itself is a bit shorter and it only features one SLI connector and one PCIe power connector.
Only one power connector on an 8800 GTS
...and only one SLI connector
Both cards are extremely quiet during operation and are audibly indistinguishable from a 7900 GTX.
111 Comments
aweigh - Friday, November 10, 2006 - link
You can just use the program DX Tweaker to enable Triple Buffering in any D3D game and use your VSYNC with negligible performance impact. So you can play with your VSYNC, a high res and AA as well. :)
aweigh - Friday, November 10, 2006 - link
I'm gonna buy an 8800 specifically to use 4x4 SuperSampling in games. Why bother with MSAA with a card like that?
DerekWilson - Friday, November 10, 2006 - link
Supersampling can make textures blurry -- especially very detailed textures. And the impact will be much greater with the use of longer, more detailed pixel shaders (as the shaders must be evaluated at every sub-pixel when supersampling).
I think transparency / adaptive AA are enough.
On your previous comment, I don't think we're to the point where we can hit triple buffering, vsync, high levels of AA AND high resolution (2560x1600) without some input lag (triple buffering plus vsync with framerates less than your refresh rate can cause problems).
If you're talking about enabling all these options on a lower resolution lcd panel, then I can definitely see that as a good use of the hardware. And it might be interesting to look at more numbers with these type of options enabled.
Thanks for the suggestion.
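For reference, "enabling triple buffering" in Direct3D 9 boils down to requesting two back buffers along with vsync when the device is created. The fragment below is a minimal, hypothetical sketch of that setup; it is not how DXTweaker itself hooks running games, and window creation plus error handling are omitted for brevity.

    #include <d3d9.h>

    // Minimal sketch: create a D3D9 device set up for triple buffering with vsync.
    // 'd3d' and 'hwnd' are assumed to come from an existing application.
    IDirect3DDevice9* CreateTripleBufferedDevice(IDirect3D9* d3d, HWND hwnd) {
        D3DPRESENT_PARAMETERS pp = {};
        pp.Windowed             = TRUE;
        pp.hDeviceWindow        = hwnd;
        pp.SwapEffect           = D3DSWAPEFFECT_DISCARD;
        pp.BackBufferFormat     = D3DFMT_UNKNOWN;           // match the desktop format
        pp.BackBufferCount      = 2;                        // two back buffers = triple buffering
        pp.PresentationInterval = D3DPRESENT_INTERVAL_ONE;  // wait for vblank (vsync)

        IDirect3DDevice9* device = NULL;
        d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                          D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &device);
        return device;  // NULL if creation failed
    }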
aweigh - Saturday, November 11, 2006 - link
I never knew that about SuperSampling. Is it something similar to Quincunx blurring? And would using a negative LOD via RivaTuner/nHancer counteract the effect? How about NVIDIA's Digital Sharpness setting in Color Correction? I've found a smidge of sharpening can do wonders to improve overall clarity.
By the way, when you said Adaptive AA, were you referring to ATI cards?
Unam - Friday, November 10, 2006 - link
Derek,
Saw your comment regarding the rationale for the test resolution. While I understand your reasoning now, it still raises the question: how many of your readers have 30" LCD flat panels?
DerekWilson - Friday, November 10, 2006 - link
There might not be many out there right now, but it's still the right test platform for G80. We did test down to 1600x1200, so people do have information if they need it.
But it speaks to who should own an 8800 GTX right now. It doesn't make sense to spend that much money on a part if you aren't going to get anything out of it with your 1280x1024 panel.
Owners of a 2560x1600 panel will want an 8800 GTX. Owners of an 8800 GTX will want a 2560x1600 panel. Smooth framerates with 4xAA enabled in every game that allowed it are reason enough. People without a 2560x1600 panel should probably wait to buy the card until prices come down on the 8800 GTX, or until games arrive that can push it harder.
Unam - Tuesday, November 14, 2006 - link
Derek,
A follow-up on test resolutions: the FPS numbers we see in your articles, are they maximum, minimum, or average?
Unam - Friday, November 10, 2006 - link
Who the heck runs 2560x1600? At 4xAA? Come on guys, real-world benchmarks please!
DerekWilson - Friday, November 10, 2006 - link
We did: 1600x1200, 1920x1440, and even 1280x1024 in Oblivion.
dragonsqrrl - Thursday, August 25, 2011 - link
....lol, owned.