NVIDIA's GeForce 8800 (G80): GPUs Re-architected for DirectX 10
by Anand Lal Shimpi & Derek Wilson on November 8, 2006 6:01 PM EST - Posted in GPUs
What's Gamma Correct AA?
Gamma correction is a technique used to map linearly increasing brightness data to a display device in a way that conveys linearly increasing intensity. Because displays are nonlinear devices, gamma correction requires a nonlinear adjustment to brightness values before they are sent to the display. Ideally, gamma corrected linear steps in the brightness of a pixel will result in linear steps in perceived intensity. The application to antialiasing is that high-contrast edges can appear under-antialiased if a pixel's brightness isn't raised enough for humans to perceive an increase in intensity once displayed by the monitor.
Unfortunately, gamma correct AA isn't always desirable. Different CRTs, LCDs, and TVs have different gamma characteristics that make any one gamma correction scheme more or less effective per device. It can also result in brighter sub-samples having a heavier influence on the color of a pixel than darker sub-samples, which causes problems for things like thin lines.
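To make the effect concrete, here is a minimal sketch (not NVIDIA's actual hardware path) of the difference between averaging AA sub-samples in gamma-encoded space and averaging them in linear light. A display gamma of 2.2 is assumed for illustration.

```python
GAMMA = 2.2  # assumed display gamma for this illustration

def blend_gamma_space(samples):
    """Naive AA: average the stored (gamma-encoded) sub-sample values."""
    return sum(samples) / len(samples)

def blend_linear_space(samples):
    """Gamma-correct AA: decode to linear light, average, then re-encode."""
    linear = [s ** GAMMA for s in samples]
    avg = sum(linear) / len(linear)
    return avg ** (1.0 / GAMMA)

# A 50/50 edge pixel covering black (0.0) and white (1.0) sub-samples:
samples = [0.0, 1.0]
print(blend_gamma_space(samples))   # 0.5
print(blend_linear_space(samples))  # ~0.73 -- noticeably brighter
```

The gamma-correct blend stores a value around 0.73 rather than 0.5, which is why the brighter sub-sample ends up with a disproportionate influence on the final pixel, and why thin dark features such as wires and antennas can appear to thin out or wash against a bright sky.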
To illustrate the difference, we'll look at images of Half-Life taken on G80 with and without gamma correction enabled.
[Image comparison: 16XQ No Gamma vs. 16XQ Gamma]
We can see the antenna decrease in clarity because each of the brighter sub-samples carries a disproportionately higher weight than the darker sub-samples. As far as the roof line is concerned, our options are either to see the roof blur out into the sky or to watch the sky cut into the roof.
Really, edge AA with and without gamma correction is six of one and half a dozen of the other. Combine this with the fact that the effect varies depending on the monitor being used, and with the degraded visibility of thin lines, and we feel that gamma correct AA isn't a feature that improves image quality so much as it simply changes it.
While we are happy that NVIDIA has given us the choice to enable or disable gamma correct AA as we see fit, with G80 the default state has changed to enabled. While this doesn't have an impact on performance, we prefer rendering without gamma correct AA enabled and will do so in our performance tests. We hope that ATI will add a feature to disable gamma correct AA in the future as well. For now, let's take a look at R580 and G80 compared with gamma correction enabled.
[Image comparison: G80 4X Gamma vs. ATI 4X Gamma]
At 4xAA with gamma correction enabled, it looks like ATI is able to produce a better quality image. Some of the wires and antennas on NVIDIA hardware are a little more ragged looking, while ATI's images are smoothed better.
111 Comments
yyrkoon - Thursday, November 9, 2006 - link
If you're using Firefox, get and install the extension "flashblock". Just did this myself today, tired of all the *animated* ads bothering me while reading articles. Sorry AT guys, but we've had this discussion before, and it's really annoying.
JarredWalton - Thursday, November 9, 2006 - link
Do you want us to be able to continue as a site? Because ads support us. Anyway, his problem is related to not seeing images, so your comment about blocking ads via flashblock is completely off topic.

yyrkoon - Thursday, November 9, 2006 - link
Of course I want you guys to continue on as a site, I just wish it were possible without annoying flashing ads in a section where I'm trying to concentrate on the article. As for the off-topic part, yeah, my bad, I misread the full post (bad habit). Feel free to edit or remove that post of mine :)
archcommus - Thursday, November 9, 2006 - link
What browser are you using?

falc0ne - Thursday, November 9, 2006 - link
firefox 2.0

JarredWalton - Thursday, November 9, 2006 - link
If Firefox, I know there's an option to block images not on the originating website. In this case, images come from image.anandtech.com while the article is on www.anandtech.com, so that may be the cause of your problems. IE7 and other browsers might have something similar, though I haven't ever looked. Other than that, perhaps some firewall or ad blocking software is to blame - it might be getting false positives?

archcommus - Thursday, November 9, 2006 - link
Wow to Anandtech - another amazing, incredibly in-depth article. It is so obvious this site is run by dedicated professionals who have degrees in these fields, versus most other review sites where the authors just take pictures of the product and run some benches. Articles like this keep the AT reader base very, very strong.

Also wow to the G80, obviously an amazing card. My question: is 450W the PSU requirement for the GTX only, or for both the GTX and GTS? I ask because I currently have a 400W PSU and am wondering if it will be sufficient for next-gen DX10 class hardware, and I know I would not be buying the highest model card. I also only have one HDD and one optical drive in my system.
Yet another wow goes out to the R&D monetary investment - $475 million! It's amazing that that amount is even acceptable to nVidia, I can't believe the sales of such a high end, enthusiast-targeted card are great enough to warrant that.
JarredWalton - Thursday, November 9, 2006 - link
Sales of the lower end parts which will be based off G80 are what make it worthwhile, I would guess. As for the PSU, I think that 450W is for the GTX, and more is probably a safe bet (550W would be in line with a high-end system these days, although 400W ought to suffice if it's a good quality 400W). You can see that the GTX tops out at just under 300W average system power draw with an X6800, so if you use an E6600 and don't overclock, a decent 400W ought to work. The GTS tops out around 260W average with the X6800, so theoretically even a decent 350W will work fine. Just remember to upgrade the PSU if you ever add other components.

photoguy99 - Thursday, November 9, 2006 - link
I just wanted to second that thought -AT articles have incredible quality and depth at this point - you guys are doing great work.
It's actually getting embarrassing for some of your competing sites; I browsed the Tom's article and it had so much fluff and retread I had to stop.
Please don't forget the effort is noticed and appreciated.
shabby - Wednesday, November 8, 2006 - link
It wasn't mentioned in the review, but what's the purpose of the 2nd SLI connector?