NVIDIA's GeForce 8800 (G80): GPUs Re-architected for DirectX 10
by Anand Lal Shimpi & Derek Wilson on November 8, 2006 6:01 PM EST - Posted in GPUs
The thing to remember is that, even when all optimizations are disabled, there are other optimizations going on that we can't touch, and there always will be. The better these optimizations get, the faster we will be able to render accurate images. Gaining more control over what happens in the hardware is a nice bonus, but disabling optimizations for no reason just doesn't make sense. Thus, our tests will be done at default texture filtering quality on NVIDIA hardware. In order to understand the performance impact of High Quality vs. Quality texture filtering on NVIDIA hardware, we ran a few benchmarks with as many optimizations disabled as possible and compared the results to our default quality tests. Here's what we get:
We can clearly see that the G70 takes a performance hit from enabling high quality mode, while the G80 takes it in stride. We don't have the ability to specifically enable or disable individual optimizations on ATI hardware; Catalyst AI is the feature that dictates how much liberty ATI is able to take with a game, from filtering optimizations all the way to shader replacement. As a result, we can't tell whether the difference we see in Oblivion is due to shader replacement, filtering, or some other optimization on R580.
111 Comments
haris - Thursday, November 9, 2006 - link
You must have missed the article they published the very next day (http://www.theinquirer.net/default.aspx?article=35...) saying they goofed.
Araemo - Thursday, November 9, 2006 - link
Yes I did - thanks. I wish they had updated the original post to note the mistake, as it is still easily accessible via Google. ;) (And the 'we goofed' post is only shown when you drill down for more results.)
Araemo - Thursday, November 9, 2006 - link
In all the AA comparison photos of the power lines, with the dome in the background - why does the dome look washed out in the G80 images? Is that a driver glitch? I'm only on page 12, so if you explain it after that... well, I'll get it eventually. ;) But is that just a driver glitch, or is it an IQ problem with the G80 implementation of AA?
bobsmith1492 - Thursday, November 9, 2006 - link
Gamma-correcting AA sucks.
Araemo - Thursday, November 9, 2006 - link
That glitch still exists whether gamma-correcting AA is enabled or disabled, so that isn't it.
iwodo - Thursday, November 9, 2006 - link
I want to know if these power-hungry monsters have any power saving features. What happens if I am only using Windows most of the time? After all, CPUs have much better power management when they are idle or doing little work. Will I have to pay a higher electricity bill simply because I am a casual gamer with a power-hungry GPU?
Another question that popped into my mind: with CUDA, would it now be possible for a third party to program an H.264 decoder running on the GPU? Sounds good to me :D
DerekWilson - Thursday, November 9, 2006 - link
oh man ... I can't believe I didn't think about that ... video decoder would be very cool.
Pirks - Friday, November 10, 2006 - link
A decoder is not that interesting, but an MPEG-4 ASP/AVC ENCODER on the G80 GPU... man, I can't imagine AVC or ASP encoding IN REAL TIME... wow, just wooowww. I'm holding my breath here.
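To picture what one piece of such a project might look like, here is a minimal sketch of a single, simple stage of a GPU playback pipeline: an NV12-to-RGBA color conversion kernel written in CUDA. The kernel name, frame layout, and launch parameters are illustrative assumptions for this example rather than code from any shipping decoder, and the genuinely hard parts of H.264 (entropy decoding in particular) are far less friendly to this kind of data-parallel kernel.

// nv12_to_rgba.cu - illustrative sketch only (hypothetical names and layout).
// Converts an NV12 frame (full-resolution Y plane plus half-resolution
// interleaved UV plane) to packed RGBA on the GPU.
#include <cuda_runtime.h>
#include <stdint.h>

__global__ void nv12_to_rgba(const uint8_t* y_plane, const uint8_t* uv_plane,
                             uint8_t* rgba, int width, int height, int pitch)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;

    // Luma is sampled per pixel; each 2x2 block of pixels shares one UV pair.
    float Y = (float)y_plane[y * pitch + x];
    int uv_idx = (y / 2) * pitch + (x / 2) * 2;
    float U = (float)uv_plane[uv_idx]     - 128.0f;
    float V = (float)uv_plane[uv_idx + 1] - 128.0f;

    // Approximate BT.601 YUV -> RGB conversion, clamped to [0, 255].
    float r = Y + 1.402f * V;
    float g = Y - 0.344f * U - 0.714f * V;
    float b = Y + 1.772f * U;

    int out = (y * width + x) * 4;
    rgba[out + 0] = (uint8_t)fminf(fmaxf(r, 0.0f), 255.0f);
    rgba[out + 1] = (uint8_t)fminf(fmaxf(g, 0.0f), 255.0f);
    rgba[out + 2] = (uint8_t)fminf(fmaxf(b, 0.0f), 255.0f);
    rgba[out + 3] = 255;
}

// Host-side launch: one thread per output pixel.
void convert_frame(const uint8_t* d_y, const uint8_t* d_uv, uint8_t* d_rgba,
                   int width, int height, int pitch)
{
    dim3 block(16, 16);
    dim3 grid((width + block.x - 1) / block.x,
              (height + block.y - 1) / block.y);
    nv12_to_rgba<<<grid, block>>>(d_y, d_uv, d_rgba, width, height, pitch);
}

A stage like this maps cleanly onto thousands of independent threads, which is exactly the kind of work a GPU is good at; the serial bitstream parsing that precedes it would most likely stay on the CPU.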
Igi - Thursday, November 9, 2006 - link
Great article. The only thing I would like to see in a follow-up article is a performance comparison in CAD/CAM applications (Solidworks, ProEngineer, ...). BTW, how noisy are the new cards in comparison to the 7900 GTX and others (at idle and under load)?
JarredWalton - Thursday, November 9, 2006 - link
I thought it was stated somewhere that they are as loud (or quiet if you prefer) as the 7900 GTX. So really not bad at all, considering the performance offered.