NVIDIA's GeForce 8800 (G80): GPUs Re-architected for DirectX 10
by Anand Lal Shimpi & Derek Wilson on November 8, 2006 6:01 PM EST - Posted in GPUs
Texture Filtering Image Quality
Texture filtering is always a hot topic when a new GPU is introduced. For the past few years, every new architecture has brought a new take on where and how to optimize texture filtering. The community is also very polarized, and people can get genuinely fired up over claims that one company or another is performing an optimization that degrades the user's experience.
The problem is that all of 3D graphics is an optimization problem. If GPUs were built to render every detail of every scene without any optimization, we would be looking at seconds per frame rather than frames per second. Even so, the highest quality texture filtering available is a great place from which to start working our way down to what most people will actually use.
The good news is that G80 completely eliminates angle-dependent anisotropic filtering. Finally we have a return to GeForce FX quality anisotropic filtering. When stacked up against R580 High Quality AF with no optimizations enabled on either side (High Quality mode for NVIDIA, Catalyst AI Disabled for ATI), G80 definitely shines. On the left, we can see that at 8xAF NVIDIA's new architecture is able to more accurately filter textures based on distance from and angle to the viewer. On the right, we see ATI's angle-independent 16xAF degrade in quality to the point where different texture stages start bleeding into one another in undesirable ways.
[Image quality comparison: ATI vs. G80 anisotropic filtering]
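As a quick illustration of where the application's control ends and the hardware's begins, the sketch below (our own illustrative code, not anything from either vendor's driver) shows how a program typically requests anisotropic filtering through OpenGL's EXT_texture_filter_anisotropic extension. The application only asks for a maximum degree of anisotropy; the actual sample pattern, including any angle dependence, is entirely up to the driver and GPU, which is why G80 and R580 can look so different at the same nominal 16xAF setting.

```c
/* Minimal sketch: requesting anisotropic filtering via OpenGL's
 * EXT_texture_filter_anisotropic extension. Assumes a valid GL context,
 * a bound 2D texture, and that the extension is supported. */
#include <GL/gl.h>

#ifndef GL_TEXTURE_MAX_ANISOTROPY_EXT
#define GL_TEXTURE_MAX_ANISOTROPY_EXT     0x84FE
#define GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT 0x84FF
#endif

void request_16x_af(void)
{
    GLfloat max_aniso = 1.0f;
    GLfloat requested = 16.0f;

    /* Query the highest anisotropy the hardware exposes (16 on G80/R580). */
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_aniso);

    /* Ask for up to 16x, clamped to whatever the hardware supports.
     * How those samples are actually taken is the hardware's decision. */
    if (requested > max_aniso)
        requested = max_aniso;
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, requested);

    /* Trilinear base filter; anisotropic filtering builds on top of it. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
}
```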
Oddly enough, ATI's 16xAF is more likely to cause shimmering with the High Quality AF box checked than without. Even when looking at an object like a flat floor, we can see the issue pop up in the D3DAFTester. NVIDIA has been battling shimmering issues due to some of their optimizations over the past year or so, but these issues could be avoided through driver settings. There isn't really a way to "fix" ATI's 16x high quality AF issue.
[Image quality comparison: ATI Normal Quality AF vs. ATI High Quality AF]
But we would rather have angle-independent AF than not, so for the rest of this review we will enable High Quality AF on ATI hardware. This gives us a fairer comparison to G80, even if we still aren't looking at a true apples-to-apples matchup. G70 is not capable of angle-independent AF, so there we'll be stuck with the rose pattern we've been so familiar with over the past few years.
There is still the question of how much impact optimization has on texture filtering. With G70, disabling optimizations resulted in more trilinear filtering being done, and thus a potential performance decrease. The visual difference is minimal in most cases, as trilinear filtering is really only needed to blur the transition between mipmap levels on a surface.
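For those wondering what exactly gets scaled back, the sketch below outlines the basic idea behind trilinear filtering (purely illustrative on our part, not how any particular GPU implements it): two bilinear samples are taken from the mip levels nearest the computed level of detail and blended together. "Brilinear"-style optimizations simply narrow the range of LODs over which that blend is performed, which is why any difference is usually only visible right at the mipmap transitions.

```c
/* Illustrative sketch of trilinear filtering, not real driver or GPU code. */
#include <math.h>

typedef struct { float r, g, b; } Color;

/* Hypothetical stand-in for a real bilinear texture fetch; returns a flat
 * shade per mip level just so the sketch is self-contained. */
static Color sample_bilinear(int mip_level, float u, float v)
{
    (void)u; (void)v;
    float shade = 1.0f / (float)(1 + mip_level);
    Color c = { shade, shade, shade };
    return c;
}

Color sample_trilinear(float u, float v, float lod)
{
    int   lo   = (int)floorf(lod);  /* nearer (more detailed) mip level */
    int   hi   = lo + 1;            /* next coarser mip level           */
    float frac = lod - (float)lo;   /* position between the two levels  */

    Color a = sample_bilinear(lo, u, v);
    Color b = sample_bilinear(hi, u, v);

    /* Blend the two bilinear samples: this is what hides the seam
     * between mip levels that plain bilinear filtering leaves visible. */
    Color out = {
        a.r + (b.r - a.r) * frac,
        a.g + (b.g - a.g) * frac,
        a.b + (b.b - a.b) * frac
    };
    return out;
}
```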
[Image quality comparison: G70 Normal Quality AF vs. G70 High Quality AF]
On G80, we see a similar effect when comparing default quality to high quality. Of course, with angle-independent anisotropic filtering we have to worry much less about shimmering in the first place, so optimizations shouldn't cause any issues here. Default quality does show a difference in the amount of trilinear filtering being applied, but this does not negatively impact visual quality in practice.
[Image quality comparison: G80 Normal Quality AF vs. G80 High Quality AF]
111 Comments
yyrkoon - Thursday, November 9, 2006 - link
If you're using Firefox, get and install the extension "flashblock". Just did this myself today; tired of all the *animated* ads bothering me while reading articles. Sorry AT guys, but we've had this discussion before, and it's really annoying.
JarredWalton - Thursday, November 9, 2006 - link
Do you want us to be able to continue as a site? Because ads support us. Anyway, his problem is related to not seeing images, so your comment about blocking ads via flashblock is completely off topic.
yyrkoon - Thursday, November 9, 2006 - link
Of course I want you guys to continue on as a site, just wish it were possible without annoying flashing ads in a section where I'm trying to concentrate on the article. As for the off topic part, yeah, my bad, I misread the full post (bad habit). Feel free to edit or remove that post of mine :)
archcommus - Thursday, November 9, 2006 - link
What browser are you using?
falc0ne - Thursday, November 9, 2006 - link
firefox 2.0
JarredWalton - Thursday, November 9, 2006 - link
If Firefox, I know there's an option to block images not on the originating website. In this case, images come from image.anandtech.com while the article is on www.anandtech.com, so that may be the cause of your problems. IE7 and other browsers might have something similar, though I haven't ever looked. Other than that, perhaps some firewall or ad blocking software is to blame - it might be getting false positives?
archcommus - Thursday, November 9, 2006 - link
Wow to Anandtech - another amazing, incredibly in-depth article. It is so obvious this site is run by dedicated professionals who have degrees in these fields versus most other review sites where the authors just take pictures of the product and run some benches. Articles like this keep the AT reader base very very strong.
Also wow to the G80, obviously an amazing card. My question: is 450W the PSU requirement for the GTX only, or for both the GTX and GTS? I ask because I currently have a 400W PSU and am wondering if it will be sufficient for next-gen DX10 class hardware, and I know I would not be buying the highest model card. I also only have one HDD and one optical drive in my system.
Yet another wow goes out to the R&D monetary investment - $475 million! It's amazing that that amount is even acceptable to nVidia, I can't believe the sales of such a high end, enthusiast-targeted card are great enough to warrant that.
JarredWalton - Thursday, November 9, 2006 - link
Sales of the lower end parts which will be based off G80 are what make it worthwhile, I would guess. As for the PSU, I think that 450W is for the GTX, and more is probably a safe bet (550W would be in line with a high-end system these days, although 400W ought to suffice if it's a good quality 400W). You can see that the GTX tops out at just under 300W average system power draw with an X6800, so if you use an E6600 and don't overclock, a decent 400W ought to work. The GTS tops out around 260W average with the X6800, so theoretically even a decent 350W will work fine. Just remember to upgrade the PSU if you ever add other components.
photoguy99 - Thursday, November 9, 2006 - link
I just wanted to second that thought - AT articles have incredible quality and depth at this point - you guys are doing great work.
It's actually getting embarrassing for some of your competing sites; I browsed the Tom's article and it had so much fluff and retread that I had to stop.
Please don't forget the effort is noticed and appreciated.
shabby - Wednesday, November 8, 2006 - link
It wasn't mentioned in the review, but what's the purpose of the 2nd SLI connector?