NVIDIA's GeForce 7800 GTX Hits The Ground Running
by Derek Wilson on June 22, 2005 9:00 AM EST - Posted in GPUs
Inside The Pipes
The pixel pipe is made up of two vector units and a texture unit that all operate together to facilitate effective shader program execution. There are also a couple of mini-ALUs in each shader pipeline that allow operations such as a free fp16 normalize and other specialized features that relate to and assist the two main ALUs. Even though this block diagram looks slightly different from the ones shown during the 6800 launch, NVIDIA has informed us that these mini-ALUs were also present in NV4x hardware.

There was much talk when the 6800 launched about the distinct functionality of each of the main shader ALUs. In NV4x, only one ALU had the ability to perform a single-clock MADD (multiply-add). Similarly, only one ALU assisted in texture address operations for the texture unit. Simply having these two distinct ALUs (regardless of their functional differences) is a large part of what pushed NV4x so far ahead of the NV3x architecture.
In their ongoing research into commonly used shaders (and likely much of their work with shader replacement), NVIDIA discovered that a very high percentage of shader instructions were MADDs. Multiply-add is extremely common in 3D mathematics as linear algebra, matrix manipulation, and vector calculus are a huge part of graphics. G70 implements MADD on both main Shader ALUs. Taking into account the 50% increase in shader pipelines and each pipe's ability to compute twice as many MADD operations per clock, the G70 has the theoretical ability to triple MADD performance over the NV4x architecture (on a clock for clock basis).
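To put rough numbers behind that claim, here is a minimal sketch in Python. The 16- and 24-pipe counts and the one-versus-two full-speed MADD ALUs per pipe come from the figures above; the matrix helper is purely illustrative of why transforms decompose into MADDs, not anything specific to NVIDIA's hardware.

```python
# Illustrative sketch: why matrix math reduces to MADDs, and the
# clock-for-clock MADD throughput implied by 16 pipes x 1 MADD ALU (NV4x)
# versus 24 pipes x 2 MADD ALUs (G70). Numbers beyond those pipe/ALU counts
# are not NVIDIA's; this is just arithmetic.

def mat4_mul_vec4(m, v):
    """Transform a 4-component vector by a 4x4 matrix.

    Each accumulation step is a multiply-add, which is why MADD dominates
    typical vertex and pixel shader workloads.
    """
    out = []
    for row in range(4):
        acc = 0.0
        for col in range(4):
            acc = m[row][col] * v[col] + acc  # one MADD per term
        out.append(acc)
    return out

def madds_per_clock(pixel_pipes, madd_alus_per_pipe):
    """Theoretical vec4 MADD issues per clock across all pixel pipes."""
    return pixel_pipes * madd_alus_per_pipe

nv4x = madds_per_clock(16, 1)  # 16 MADDs per clock
g70 = madds_per_clock(24, 2)   # 48 MADDs per clock
print(f"G70 / NV4x theoretical MADD ratio: {g70 / nv4x:.1f}x")  # 3.0x
```

A 50% increase in pipes multiplied by a doubling of MADD-capable ALUs per pipe is where the theoretical 3x figure comes from.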
Of course, we pressed the development team to tell us if both shader ALUs featured identical functionality. The answer is that they do not. Other than knowing that only one ALU is responsible for assisting the texture hardware, we were unable to extract a detailed answer about how similar the ALUs are. Suffice it to say that they still don't share all features, but NVIDIA certainly feels that the current setup will allow G70 to extract twice the shader performance for a single fragment over NV4x (depending on the shader, of course). We have also learned that the penalty for branching in the pixel shaders is much lower than in previous hardware. This may or may not mean that the pipelines are less dependent on following the exact same instruction path, but we really don't have the ability to determine what is going on at that level.
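As a way to picture why branching in pixel shaders historically carried a penalty, the following is our own simplified model, not disclosed G70 behavior: fragments shaded as a group pay for both sides of a branch whenever the group diverges. The batch size and instruction costs below are invented for illustration.

```python
# Simplified, assumed model of per-batch branch cost on SIMD-style pixel
# shader hardware. Batch size and instruction costs are made up; NVIDIA has
# not told us G70's actual scheduling granularity.

def batch_branch_cost(takes_branch, then_cost, else_cost):
    """Instruction slots spent shading one batch of fragments around a branch."""
    if all(takes_branch):
        return then_cost              # coherent batch: only the taken side runs
    if not any(takes_branch):
        return else_cost
    return then_cost + else_cost      # divergent batch: both sides are executed

coherent = batch_branch_cost([True] * 16, then_cost=20, else_cost=4)               # 20
divergent = batch_branch_cost([True] * 8 + [False] * 8, then_cost=20, else_cost=4)  # 24
print(coherent, divergent)
```

Smaller batches or smarter scheduling would shrink the divergent case, which is presumably the sort of improvement NVIDIA is hinting at, though we can only speculate at that level of detail.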
127 Comments
CrystalBay - Wednesday, June 22, 2005 - link
Does this card play Riddick smoothly @ shader 2++ ?????

fishbits - Wednesday, June 22, 2005 - link
"In aboot 5 years i figure we'll be paying 1000 bucks for a video card. These prices are getting out of control, every generation is more expensive then the last. Dont make me switch to consoles damnit."Funny, I can't afford the very best TVs the minute they come out. Same for stereo components. But I don't cry about it and threaten "Don't make me switch to learning the ukelele and putting on my own puppet shows to entertain myself!" Every time a better component comes out, it means I get a price reduction and feature upgrade on the items that are affordable/justifiable for my budget.
Seriously, where does the sense of entitlement come from? Do these people think they should be able to download top-of-the-line graphics cards through BitTorrent? Do they walk around Best Buy cursing out staff, manufacturers and customers for being so cruel as to buy and sell big-ass plasma TVs?
On second thought, get your console and give up PC gaming. That way you can stop being miserable, and we can stop being miserable hearing about your misery.
tazdevl - Wednesday, June 22, 2005 - link
Funny how the single-card deltas here are higher than at any other site. Underwhelmed by the amount of money and lack of performance increase.
Have to commend nVIDIA for ensuring retail availability at launch.
archcommus - Wednesday, June 22, 2005 - link
Impressive, but I'm still happy with my X800 XL purchase for only $179. From what it seems, with a 1280x1024 display, I won't need the kind of power this card delivers for a very long time. And less than $200 compared to $600, with still excellent performance for now and the foreseeable future? Hmm, I'll take the former.

Lonyo - Wednesday, June 22, 2005 - link
I would have liked some 1280x1024 benchmarks with 8xAA from the nVidia cards and 6xAA from ATi to see if it's worth getting something like a 7800GTX with 17/19" LCDs to run some super high quality settings in terms of AA/AF.

segagenesis - Wednesday, June 22, 2005 - link
I'm not disappointed. For one thing, the price of current cards will likely drop now, and there will also be mid-range parts soon to choose from. I think the transparency AA is a good idea for, say... World of Warcraft. The game is loaded with them, and too often you can see the blockiness of trees/grass/whatever.

#44 - Actually, are you new to the market? :) I remember when early "accelerated" VGA cards were nearly $1000. Or more.
Everybody lambasted NVIDIA last year for the lack of product (6800GT/Ultra) to the market, so them actually making a presence this year instead of a paper launch should also be commended. Of course, now what is ATI gonna pull out of its hat?
KeDaHa - Wednesday, June 22, 2005 - link
"The screenshot shows very clearly that SSAA provides quite a quality improvement over no AA"

The difference is bloody minuscule, perhaps if you used an image SLIGHTLY larger than 640x480 to highlight the difference?
L3p3rM355i4h - Wednesday, June 22, 2005 - link
Wowzers. Time to get rid of teh 9800...

shabby - Wednesday, June 22, 2005 - link
In aboot 5 years i figure we'll be paying 1000 bucks for a video card. These prices are getting out of control, every generation is more expensive then the last. Dont make me switch to consoles damnit.
Xenoterranos - Wednesday, June 22, 2005 - link
Hell, for the same price as an SLI setup I can go out and get a 23 inch cinema display... And since these cards can't handle the 30" native resolution anyway, it's a win-win. And yeah, what's up with the quality control on these benchmarks! I mean really, I almost decided to wait for the ATI next-gen part when I saw this (GeForce man since the GeForce2 GTS!)