NVIDIA GeForce 8800 GT: The Only Card That Matters
by Derek Wilson on October 29, 2007 9:00 AM EST- Posted in
- GPUs
G92: Funky Naming for a G80 Derivative
If we expected G9x to represent a new architecture supporting a GeForce 9 series, we would be wrong. Part of the reason NVIDIA gave for moving away from NVxx code names was to bring code name and product name closer to parity (G7x is GeForce 7, G8x is GeForce 8), yet it seems NVIDIA has broken this rule rather early on. Code names are generated automatically, but why we only ended up with three different G8x parts before hitting G9x is certainly a mystery, and one NVIDIA didn't feel like enlightening us on, as it no doubt has to do with unannounced products.
While not a new architecture, the GPU behind the 8800 GT has certainly been massaged quite a bit from the G80. The G92 is fabbed on a 65nm process, and even though it has fewer SPs, less texturing power, and fewer ROPs than the G80, it's made up of more transistors (754M vs. 681M). This is partly because G92 integrates the updated video processing engine (VP2) and the display engine that previously resided off chip. Now all the display logic, including TMDS hardware, is integrated onto the GPU itself.
In addition to the new features, there have been some enhancements to the architecture that likely added a few million transistors here and there as well. While we were unable to get any really good details, we were told that lossless compression ratios were increased in order to enable better performance at higher resolutions over the lower bandwidth memory bus attached to the G92 on 8800 GT. We also know that the proportion of texture address units to texture filtering units has increased to a 1:1 ratio (similar to the 8600 GTS, but in a context where we can actually expect decent performance). This should also improve memory bandwidth usage and texturing power in general.
Because NVIDIA was touting the addition of hardware double precision IEEE 754 floating point on their workstation hardware coming sometime before the end of the year, we suspected that G92 might include this functionality. It seems, however, that the hardware behind that advancement has been pushed back for some reason. G92 does not support hardware double precision floating point. This is only really useful for workstation and GPU computing applications at the moment, but because NVIDIA designs one GPU for both consumer and workstation applications, it will be interesting to see if they do anything at all with double precision on the desktop.
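For readers wondering why double precision matters to GPU computing at all: single precision carries only a 24-bit significand, so it cannot even represent every integer past 2^24, which quietly corrupts long accumulations in scientific workloads. A quick stdlib-Python illustration (the `to_f32` helper is our own, used here to simulate single-precision rounding on the CPU):

```python
import struct

def to_f32(x: float) -> float:
    # Round a Python float (which is a double) to the nearest
    # representable single-precision value and back.
    return struct.unpack('f', struct.pack('f', x))[0]

big = 16_777_216.0  # 2**24: the last integer single precision stores exactly

# In single precision, adding 1 is lost entirely:
print(to_f32(big + 1.0) == big)   # True

# In double precision, the increment survives:
print(big + 1.0 == big)           # False
```

The same effect shows up when summing millions of small terms on a GPU, which is exactly why workstation and HPC customers were asking for double precision hardware.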
With every generation, we can expect buffers and on-chip memory to be tweaked based on experience with the previous iteration of the hardware. This could also have resulted in additional transistors. But regardless of the reason, this GPU packs quite a number of features into a very small area. The integration of these features into one ASIC is possible economically because of the 65nm process: even though there are more transistors, the physical die takes up much less space than the G80.
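The die-size claim is easy to sanity-check with back-of-the-envelope scaling. Assuming ideal area scaling with the square of the feature size (real layouts only approximate this, and the 90nm figure for G80 is the known process node, not a measured shrink factor), the arithmetic works out roughly as follows:

```python
# Rough sketch: why G92 can hold more transistors in a smaller die.
g80_transistors = 681e6   # G80, built on 90nm
g92_transistors = 754e6   # G92, built on 65nm

# Ideal per-transistor area shrink moving from 90nm to 65nm
area_scale = (65 / 90) ** 2

# G92 die area relative to G80 under ideal scaling
relative_die_area = (g92_transistors / g80_transistors) * area_scale
print(f"G92 die area vs. G80 (ideal scaling): {relative_die_area:.0%}")
```

Even with roughly 11% more transistors, ideal scaling predicts a die well under two-thirds the size of G80's, which is consistent with the much smaller package NVIDIA is shipping.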
90 Comments
vijay333 - Monday, October 29, 2007 - link
just activated the step-up on my current 8800GTS 320MB -- after shipping costs and discounting the MIR from back then, I actually get the 8800GT 512MB for -$12 :)
bespoke - Monday, October 29, 2007 - link
Lucky bastard! :)
vijay333 - Monday, October 29, 2007 - link
hehe...great timing too. only had 5 days remaining before the 90-day limit for the step-up program expired :)
clockerspiel - Monday, October 29, 2007 - link
Generally, Anandtech does an excellent job with its reviews and uses robust benchmarking methodology. Any ideas why the Tech Report's results are so different?
http://www.techreport.com/articles.x/13479
Frumious1 - Monday, October 29, 2007 - link
Simply put? TechReport is doing some funny stuff (like HardOCP often does) with their benchmarking on this one. I have a great idea: let's find the WORST CASE SCENARIO for the 8800 GT vs. the 8800 GTS 640 and then ONLY show those resolutions! 2560x1600 4xAA/16xAF? Ignoring the fact that 16xAF isn't noticeably different from 8xAF, and that 4xAA is hardly necessary at 2560x1600, there are just too many questions left by the TR review. They generally come to the same conclusion that this is a great card, but it's almost like they're struggling to find ANY situation where the 8800 GT might not be as good as the 8800 GTS 640.
For a different, more comprehensive look at the 8800 GT, why not try the FiringSquad review (http://www.firingsquad.com/hardware/nvidia_geforce...)? They test at a variety of resolutions with a decent selection of GPUs and games. Out of all of their results, the only situation where the 8800 GTS 640 comes out ahead of the 8800 GT is in Crysis at 2xAA/8xAF at 1920x1200. Granted, they don't have 2560x1600 resolutions in their results, but how many midrange people use 30" LCDs? For that matter, how many high-end gamers use 30" LCDs? I'm sure they're nice, but for $1300+ I have a lot of other stuff I'd be interested in purchasing!
There are a lot of things that we don't know about testing methodology with all of the reviews. What exact detail settings are used, for example, and more importantly how realistic are those settings? Remember Doom 3's High Quality and Ultra Quality? Running everything with uncompressed textures to artificially help 512MB cards appear better than 256MB cards is stupid. Side by side screenshots showed virtually no difference. I don't know what the texture settings are in the Crysis demo, but I wouldn't be surprised if a bunch of people are maxing everything out and then crying about performance. Being a next gen title, I bet Crysis has the ability to stress the 1GB cards - whether or not it really results in an improved visual experience.
Maybe we can get some image quality comparisons when the game actually launches, though - because admittedly I could be totally wrong and the Crysis settings might be reasonable.
Parafan - Monday, October 29, 2007 - link
I just dont like being fed by the same site telling 2 totally different things when picking my new GPU card.
Parafan - Monday, October 29, 2007 - link
Ive been following anandtech test results very carefully since the UT3 demo was released. What I can find comparing these results to the others in UT3 just doesnt make any sense:
1st: Looking at http://www.anandtech.com/video/showdoc.aspx?i=3140...
shows the new 8800GT beating the 2900XT by almost 120fps vs 105fps or so, at 1280x1024 in UT3.
2nd: Looking at the first and second GPU tests at http://www.anandtech.com/video/showdoc.aspx?i=3128...
shows the 2900XT on top at about 108.5fps, vs. the 8800 Ultra, GTX, and GTS at 104.2, 98.3, and 97.2 at 1280x1024.
Pretty close numbers, you see.
3rd: Looking at the new test again, 8800GT vs. 8800GTS: http://www.anandtech.com/video/showdoc.aspx?i=3140...
shows the 8800GT beating the 8800GTS at 1280x1024, close to 120fps vs 105fps. The GTS is still over 100, when it was below 100 in the previous test.
But the huge difference is at 1600x1200: the 8800GT right above 100fps, while the GTS is around 90? In the previous test the GTS showed results as low as 77fps. Cmon, something smells weird.
See where im going?
http://www.anandtech.com/video/showdoc.aspx?i=3140...
just showed the 8600GTS performing a lot worse in this new test compared to the old one, at all resolutions.
and again
http://www.anandtech.com/video/showdoc.aspx?i=3140...
The 8800GT and 8800GTX perform about the same, at the highest almost 120fps. Compared to the previous test that's like 20fps better than the GTX performed last time. Why don't these tests correspond at all to the one just made?
Seems like all the 8800GT, GTX, and Ultra cards just got a whole freaking lot better, making the 2900XT look worse, WHICH I FIND DOUBTFUL. Someone bring the facts to the table.
Dont tell me 2 extra GB of RAM made the nvidia cards play a lot better, and the ati card a lot worse!
DerekWilson - Monday, October 29, 2007 - link
We used a different driver version this time -- in fact, we've gone through two driver revisions from NVIDIA here. The AMD card didn't slip significantly in performance at all (differences were all within 3%).
We did rerun the numbers, and we really think it's a driver issue -- the new NV driver improved performance.
Parafan - Wednesday, November 7, 2007 - link
Well clearly a graphics issue this must be. But I read nvidia 169.xx drivers were made for optimizing performance while lowering graphics quality. This was proved when the water was less nice in Crysis with 169.04 and 169.01 than with their previous 163.xx drivers.