NVIDIA's Tiny 90nm G71 and G73: GeForce 7900 and 7600 Debut
by Derek Wilson on March 9, 2006 10:00 AM EST - Posted in GPUs
NVIDIA's Die Shrink: The 7900 and 7600
The first major benefit to NVIDIA comes in the form of die size. The original G70 is a 334mm² chip, while the new 90nm GPUs come in at 196mm² for G71 and 125mm² for G73. Compare these to the G72 (the chip used in the 7300) at 77mm² and the R580 at 353mm² for a good idea of the current range of sizes for 90nm GPUs, and it becomes clear that NVIDIA hardware is generally much smaller than ATI hardware. The difference in die size between the high end ATI and NVIDIA hardware comes down to the design decisions each company made. ATI chose to employ full-time fp32 processing with very good loop granularity, floating point blending with anti-aliasing, a high quality anisotropic filtering option, and the ability to support more live registers at full speed in a shader program. These are certainly desirable features, but NVIDIA has flatly told us that it doesn't believe most of them have a place in hardware yet, given current and near-term games and the poor performance characteristics of code that makes use of them.
Of course, in graphics there are always chicken-and-egg problems. We would prefer it if every company could offer every feature at high performance for a low cost, but that just isn't possible. We applaud ATI's decision to stick its neck out and include some truly terrific features at the expense of die size, and we hope it inspires developers to really take advantage of what SM3.0 has to offer. At the same time, a hardware feature that goes unused is useless (hardware is only as good as the software that runs on it). If NVIDIA is right about the gaming landscape, a smaller die with great performance in current and near-term games gives it a clear competitive edge. Also note that NVIDIA has been an early adopter of features that went largely unused in the past (e.g., fp32 in the FX line), so perhaps it has learned from experience.
The smaller the die, the more chips fit on a single silicon wafer. Since a wafer costs the same to manufacture regardless of how many ICs it holds or how many of them work, a small IC and high yield decrease the cost per die for NVIDIA. Lower cost per die is a huge deal in the IC industry, especially in the GPU segment. Not only does a lower cost mean the opportunity for higher profit margins, it also gives NVIDIA the ability to price very aggressively while still running in the black. With ATI's newest lineup offering quite a bit of performance and more features than NVIDIA hardware, this is all good news for consumers. ATI has a history of pulling out major victories when it needs to, but with NVIDIA's increased flexibility, we hope to see more bang for the buck across the board.
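To put some rough numbers on that, the sketch below estimates gross die counts per wafer using a common first-order dies-per-wafer approximation. The 300mm wafer size, the formula itself, and the uniform-cost assumption are ours, not NVIDIA's; real numbers depend on scribe lines, defect density, and yield, so treat this strictly as an illustration of why area matters.

```python
import math

def gross_dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """First-order approximation: usable wafer area divided by die area,
    minus an edge-loss term for partial dies around the circumference."""
    radius = wafer_diameter_mm / 2.0
    area_term = math.pi * radius ** 2 / die_area_mm2
    edge_term = math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2)
    return int(area_term - edge_term)

# Die areas quoted in the article
for name, area in [("G70 (110nm)", 334), ("G71 (90nm)", 196), ("G73 (90nm)", 125)]:
    print(f"{name}: ~{gross_dies_per_wafer(area)} gross dies per 300mm wafer")

# With a fixed wafer cost, relative cost per die scales inversely with die
# count, before even accounting for yield improving on smaller dies.
g70, g71 = gross_dies_per_wafer(334), gross_dies_per_wafer(196)
print(f"G71 fits roughly {g71 / g70:.1f}x as many dies per wafer as G70")
```

Even this crude estimate has G71 yielding nearly 1.8x the candidate dies per wafer of G70, which is exactly the pricing flexibility described above.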
We haven't had the opportunity to test an X1800 GTO for this launch. We requested a board from ATI, but they were apparently unable to ship us one before the launch. The ability of ATI to sustain this product as well as it did the X800 GTO is certainly questionable as well (after all, the X800 GTO could be built from any of three different GPUs from different generations while the X1800 GTO has significantly fewer options). However, we are hopeful that the X1800 GTO will be a major price performance leader that will put pressure on NVIDIA to drop the price of their newest parts even lower than they already are. After all, in the end we are our readers' advocates: we want to see what is best for our community, and a successful X1800 GTO and the flexibility of NVIDIA after this die shrink would certainly be advantageous for all enthusiasts. But we digress.
The end result of this die shrink, regardless of where the prices on these parts begin to settle, is two new series in the GeForce line: the 7900 at the high end and 7600 at the midrange.
The Newest in High End Fashion
The GeForce 7900 Series is targeted squarely at the high end. The 7900 GTX assumes its position at the very top of the lineup, while the 7900 GT specs out very similarly to the original 7800 GTX. Thanks to the 90nm process, NVIDIA was able to hold the new 7900 GTX to power and thermal specifications similar to the 7800 GTX 512 while delivering much higher performance. In the case of the 7900 GT, performance on the order of the 7800 GTX can be delivered in a much smaller, cooler package that pulls less power.
While the 7900 GTX will outperform anything else NVIDIA has on the table right now, the 7900 GT gives NVIDIA a more cost-effective and efficient solution for those who want 7800 GTX level performance. The specifics of the new lineup are as follows, with a quick look at the theoretical throughput these numbers imply after the lists:
7900 GTX:
8 vertex pipes
24 pixel pipes
16 ROPs
650 MHz core clock
1600 MHz memory data rate
512MB of memory on a 256-bit bus
$500+
7900 GT:
8 vertex pipes
24 pixel pipes
16 ROPs
450 MHz core clock
1320 MHz memory data rate
256MB of memory on a 256-bit bus
$300 - $350
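Since raw clocks and bus widths don't mean much on their own, here is a quick back-of-the-envelope conversion of the listed specs into theoretical peaks. The formulas are the standard ones (bandwidth = effective data rate × bus width; pixel fill = core clock × ROPs), and the texel-rate line assumes one texture unit per pixel pipe; actual performance will land below these ceilings.

```python
# Theoretical peaks derived from the spec lists above.
# The one-texture-unit-per-pixel-pipe assumption is ours.
cards = {
    "7900 GTX": {"core_mhz": 650, "mem_mhz_eff": 1600, "bus_bits": 256, "rops": 16, "pixel_pipes": 24},
    "7900 GT":  {"core_mhz": 450, "mem_mhz_eff": 1320, "bus_bits": 256, "rops": 16, "pixel_pipes": 24},
}

for name, c in cards.items():
    bandwidth_gbs = c["mem_mhz_eff"] * 1e6 * c["bus_bits"] / 8 / 1e9  # GB/s
    pixel_fill    = c["core_mhz"] * 1e6 * c["rops"] / 1e9             # Gpixels/s
    texel_rate    = c["core_mhz"] * 1e6 * c["pixel_pipes"] / 1e9      # Gtexels/s
    print(f"{name}: {bandwidth_gbs:.1f} GB/s, {pixel_fill:.1f} Gpix/s, {texel_rate:.1f} Gtex/s")
```

This works out to 51.2 GB/s and 10.4 Gpixels/s for the 7900 GTX versus 42.2 GB/s and 7.2 Gpixels/s for the 7900 GT; note how closely the GT's numbers track the original 7800 GTX (430 MHz core, 1200 MHz effective memory), which matches the positioning described above.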
Aesthetically, the 7900 GTX takes on the same look as the 7800 GTX 512, with its massive heatsink and large PCB. In sharp contrast to the more powerful 7900 GTX and its 110nm sibling, the 7800 GTX 512, the 7900 GT sports a rather lightweight heatsink/fan solution. Take a look at the newest high end cards to step onto the stage:
Midrange Chic
With the introduction of the 7600 GT, NVIDIA hopes it has an X1600 XT killer on its hands. Not only is this part designed to outperform the 6800 GS, but NVIDIA also hopes to keep it price-competitive with ATI's upper midrange. Did we mention that it requires no external power?
In our conversations with NVIDIA about this launch, they really tried to drive home the efficiency message. They like to claim that their parts have fewer transistors than competing ATI GPUs while providing similar or greater performance (ignoring the fact that the R5xx GPUs actually have more features than the G70 and process everything at full precision). Sitting in a PR meeting, it's easy to dismiss such claims as hype and fluff, but seeing the specs and performance of the 7600 GT, coupled with its lack of a power connector and its compact thermal solution, opened our eyes to what efficiency can mean for the end user. This is what you get packed into this sleek midrange part:
7600 GT:
5 vertex pipes
12 pixel pipes
8 ROPs
560 MHz core clock
1400 MHz memory data rate
256MB of memory on a 128-bit bus
$180 - $230
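Running the 7600 GT's specs through the same formulas as above gives a theoretical 22.4 GB/s of memory bandwidth (1400 MHz effective × 128 bits ÷ 8), a 4.5 Gpixel/s fill rate (560 MHz × 8 ROPs), and roughly 6.7 Gtexels/s, again assuming one texture unit per pixel pipe. These are ceilings rather than measured results, but on paper they put the card comfortably ahead of the 6600 GT it replaces.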
And since NVIDIA wants this card to take over from the 6600 GT, we get all of that in a neat little package:
Now that we've taken a look at what NVIDIA is offering this time around, let us take a step back and absorb the competitive landscape.
Comments
Z3RoC00L - Thursday, March 9, 2006
Anandtech doesn't favor ATi over nVIDIA. Have you checked out the majority of reviews? The only site giving nVIDIA a decisive win is HardOCP. If you want fanboism and retardation (yes, a new word I invented), please feel free to visit http://www.HardOCP.com. But if you want solid benchmarks, only a few places offer them: Beyond3D, Anandtech, and FiringSquad. You can also check Techreport & Hothardware. Want a list?
- Anandtech (GeForce 7600 and 7900 series)
- Beyond 3D (GeForce 7600 series)
- Bjorn 3D (GeForce 7600 and 7900 series)
- ExtremeTech(GeForce 7600 and 7900 series)
- Firing Squad (GeForce 7900 series)
- Firing Squad (GeForce 7600 series)
- Guru 3D (GeForce 7600 and 7900 series)
- Hard OCP (GeForce 7900 series)
- Hardware Zone (ASUS GeForce 7900 GT)
- HEXUS (GeForce 7600 and 7900 series)
- Hot Hardware (GeForce 7600 and 7900 series)
- Legit Reviews (XFX GeForce 7900 GTX XXX Edition)
- NV News (eVGA GeForce 7900 GT CO)
- PC Perspective (GeForce 7600 and 7900 series)
- PenStar Systems (eVGA GeForce 7600 CO)
- The Tech Report (GeForce 7600 and 7900 series)
- Tom's Hardware Guide (GeForce 7600 and 7900 series)
- Tweak Town (BFG GeForce 7900 GTX)
- Club IC (French) (GeForce 7900 GT)
- iXBT (Russian) (GeForce 7600 and 7900 series)
- Hardware.FR (GeForce 7900 series)
- Hardware.FR (GeForce 7600 series)
All in all, the X1900 XTX comes out the winner in the high end segment when high end features (AA and AF) are used, and when heavy shaders are used as well. But it's not a clear victory. Results go both ways, and much like the X800 XT PE vs. 6800 Ultra (with roles reversed), there will never be a clear winner between these two cards.
I for one prefer the X1900XTX, I like the fact that it will last a tad longer and offer me better Shader performance, better performance under HDR, Adaptive AA, High Quality AF, HDR + AA, AVIVO and the AVIVO converter tool. But that's just my opinion.
Fenixgoon - Thursday, March 9, 2006
You do realize that the X1900 XT and XTX beat the 7800 series, right? That's all NVIDIA has had until now. I'm glad to see the 7900 take the lead (albeit the few frames it gains generally don't matter). What concerns me is the budget market. I'd like to see both ATI and NVIDIA do some more work on producing better budget cards. My X800 Pro is still an awesome mid-range card that can hang with many of these new series cards, minus SM3 (I bought it some months ago as a final AGP upgrade). In the end, of course, stiff competition = better price/performance for us.
Spoonbender - Thursday, March 9, 2006
Been living under a rock for the last 3 years? ATI's drivers are fine these days. I still prefer NVidia's drivers, but that's mainly a matter of preference. Quality-wise, there's only the slightest difference these days. And NVidia isn't all that compatible either. They've ditched support for everything up to (and including) the GeForce 2 in their newer drivers. But really, who cares? I doubt you'd get much more performance out of a GF2 by using newer drivers.
As for the bias, I'm surprised NVidia does so well in this test. I was expecting them to take a beating performance-wise.
But geez, what you're saying is really "I don't know anything about ATI, but the fact that AT includes their cards in benchmarks means they must be evil liars..."
Spinne - Thursday, March 9, 2006
If you've never had experience with an ATI GPU, how qualified are you to judge their software? I've used cards made by both companies, and I would not badmouth ATI's drivers anymore. Ever since the Catalyst series came out, their drivers have been pretty decent. The "driver gap" is highly overrated and untrue in my experience, at least under Windows. Under Linux, my apartment mate tells me ATI's drivers suck; then again, he's never used them, though I'd still give some weight to his opinion. In any case, there's no point in buying a high end card like this for a Linux box.
rgsaunders - Thursday, March 9, 2006
First of all, let me say that Anandtech is usually the first place I visit when looking for information on new hardware; however, I find that your video card reviews seem to have fallen prey to the same pattern as other review sites. Although it's nice to know how these cards perform for gaming, the vast majority of users do more than game with their machines. It would be very beneficial to those of us looking for a new video card to see results of comparative video quality for text use and photo editing as well as the normal gaming tests. In the past, I have returned video cards because of their extremely poor text quality, even though they were good for gaming. The gaming community is a vocal minority online; the vast majority of users spend most of their time on text processing or photo editing and only a small portion of their time gaming. Please include the requested tests in upcoming video card reviews so as to provide a balanced, professional review of these products and stand out from all the other review sites that concentrate primarily on gaming.
Spinne - Thursday, March 9, 2006
Can you specify which cards you've had to return due to poor text quality? As far as I know, no cards have had problems with 2D in a very, very long time. In any case, you'd have to be insane and very rich to splurge on a G71 or R580 class card for Photoshop or 2D desktop performance. It's like buying a '70 Dodge Challenger for driving to work. I do, however, feel that AT needs to talk about image quality in 3D some. With all the different modes of AF and AA out there, and the cores themselves performing so well, IQ becomes a large factor in the decision making process.
rgsaunders - Thursday, March 9, 2006
In the past I have had to return Asus and Abit GeForce based cards due to their dubious text/2D quality. There are differences in filter design between the various cards, ATI and nVidia alike, depending on the actual manufacturer, and this at times has a noticeable effect on the quality of the text. I agree that IQ in 3D is important; however, I think text and 2D IQ are also important. A G71 or R580 class card may be overkill if all you do with your computer is Photoshop or MS Office, but for some of us the computer is a multipurpose device used for the full gamut of applications, including occasional gaming. In the main, I usually stay a step behind the bleeding edge of video performance, as do many others. Today's bleeding edge is tomorrow's mainstream card, and unless you review everything the first time around, there is no information with regard to text and 2D IQ.
Zoomer - Monday, March 13, 2006
These are most likely reference cards, and reference cards from nVidia have in the past proven to output a much better signal than what will be produced later on, especially when the price cutting starts.
Zoomer - Monday, March 13, 2006
One more thing. Derek, why don't you guys take the time required to produce a nice review? Is it really necessary to get the article up and running on the day of the launch? If you got the cards late, bash the company for it. And take all the time you need to do a proper review like those AT has done in the past.
Reviews with just benchmarks and paraphrased press release info are REALLY boring and a turn-off. For example, I couldn't bear to look at the graphs, as they weren't relevant. I skipped right to the end.
Whatever happened to overclocking investigations? Testing for core/memory bottlenecks by tweaking the frequencies? Such information is USEFUL, as it means all those with the same card out there DO NOT have to repeat it for themselves. Recall AT's TNT/GF2 era articles. If my memory is correct, there were pages of such investigation, and a final recommendation was made to clock the memory up to its limit first, and then clock up the core.
Image quality comparisons like those done for the Radeon 32 DDR, R200, etc. are almost absent.
Quality of components used? Granted, this is moot for engineering sample cards, but an investigation of the cooling solution would be good. Reliability and noise of the cooling solution should be included. Are these ultra-fine fins dust traps? Is that small high-RPM screamer a possible candidate for early failure?
Performance is only one small part of the whole picture. Everyone and their dog publishes graphs. However, only a select few go beyond that, and even fewer of those have earned the trust of many.
Questar - Thursday, March 9, 2006
According to HardOCP, the 7900 has horrible texture shimmering issues.