Vendor Cards: EVGA e-GeForce 7800 GTX
by Derek Wilson & Josh Venning on July 16, 2005 12:05 AM EST - Posted in GPUs
Performance Tests
For the frame-rate tests, we chose three popular games that give our cards a good range of graphical variety to tackle: Half-Life 2 and Doom 3 at 1920x1440, and Battlefield 2 at 2048x1536. We used 8x AF in all three games; Doom 3 and Battlefield 2 enable it automatically at their "high" quality setting. If you are interested in seeing how the 7800 stands up to other cards, take a look at our original G70 review. Note that this series of articles isn't meant to be a comparative review between the G70 and other GPUs, but rather is designed to help you choose a 7800.
Here is the system that we used for the benchmarks:
MSI K8N Neo4 Platinum/SLI motherboard
AMD Athlon 64 FX-55 Processor
1 GB OCZ 2:2:2:6 DDR400 RAM
Seagate 7200.7 120 GB Hard Drive
OCZ 600 W PowerStream Power Supply
Half-Life 2, despite its graphical quality, gets excellent frame rates on a wide variety of cards, and this one is no exception. There is only about a 3 fps difference between the reference card and the EVGA e-GeForce 7800 GTX, and the EVGA gains roughly another 3 fps when overclocked. Each of these increases is under 3%, and as a general rule, an increase of less than 3% is insignificant. The increases in Doom 3 and Battlefield 2, however, are significant.
Doom 3 looks great and, with the video quality turned up, can tax the best cards out there. The EVGA gets about 2.5 fps more than the reference card and picks up almost 3 fps more again when overclocked. That works out to a 3.2% out-of-the-box performance increase over 7800 GTX parts clocked at 430MHz, and a 7% advantage over the same parts when further overclocked.
Battlefield 2, the most graphically intensive of the three, looks and runs very well on this card; those looking for the best video card to run this game need not look any further. Again, we see about a 3 fps advantage for the EVGA over the reference card, and when overclocked, it gains almost 4 fps more. That's about a 5.1% increase in performance over the factory overclock; not much more than in the other two games, but notable nonetheless. Compared to the reference card, the overclocked EVGA shows a 10.3% performance increase, which is definitely noticeable.
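The percentage figures quoted throughout these tests are simple relative gains over a baseline frame rate. A minimal sketch of that arithmetic, using hypothetical fps values chosen to match the roughly 10.3% Battlefield 2 result:

```python
def percent_gain(baseline_fps, test_fps):
    """Relative frame-rate increase of test_fps over baseline_fps, in percent."""
    return (test_fps - baseline_fps) / baseline_fps * 100

reference = 73.0     # hypothetical reference 7800 GTX average fps
overclocked = 80.5   # hypothetical overclocked EVGA average fps

print(round(percent_gain(reference, overclocked), 1))  # prints 10.3
```

The same function covers the smaller out-of-the-box deltas as well; only the inputs change.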
For those who think that an extra 4 frames per second isn't enough to matter: over the course of one minute, 4 fps adds up to 240 frames that wouldn't have been rendered otherwise. It all comes down to smoothness, and those 240 frames help fill in any gaps in the action. If we are going to pay somewhere near $600 for a card, we expect all the smoothness that we can get.
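The frames-per-minute claim above is straightforward to check:

```python
# A sustained 4 fps advantage over one minute of play:
fps_advantage = 4
seconds_per_minute = 60

extra_frames = fps_advantage * seconds_per_minute
print(extra_frames)  # prints 240
```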
A good question to ask is, "Is it worth the risk of heat damage to overclock my card?" The short answer: it depends. The card is already faster than the reference design right out of the box, and while overclocking gives a small additional boost in frame rate, it's not enough to make a very big impact on your gaming experience. However, those hoping for every last bit of an advantage might just get it by pushing the card to its limit. Do so at your own risk - overclocking these high-transistor-count GPUs, already running at high clock speeds, can damage your new $600 baby.
26 Comments
Fluppeteer - Friday, July 22, 2005 - link
Isn't that an old ATi card?

I don't know about a 7800Ultra, but it looks like the Quadro 4500 (based on the 7800) might be on for a SIGGRAPH launch. Since the 4400's a 512MB card, I doubt the 4500 will be a 256MB one. And hopefully *that* will bode well for a 512MB consumer card.
Fingers crossed.
Mind you, if a 4500 is a 7800GTX-based card (as the 3400 is a 6800GTo card), perhaps there'll be a 5500 (7800Ultra-based) in the manner of the 4400. By which point, presumably people will have stopped selling GeForce 5500 cards, or it's going to get confusing (other than a factor of a hundred in the price).
araczynski - Thursday, July 21, 2005 - link
i think i'll wait for the 8800xyz

Fluppeteer - Wednesday, July 20, 2005 - link
I understand the eVGA 7800GTX card (unusually) has a dual-link DVI connection. Since this was a feature which seemed to cause a lot of confusion among 6800-series card manufacturers, I just wondered if the reviewers (or anyone else) had the chance to test it? If it *is* dual link, is an external TMDS transmitter used? What's the quality of nVidia's TMDS transmitter implementation this time round (reports on the 6800 series were critical)?

The 7800GTX is probably the best card out there for trying to render at the resolutions supported by an IBM T221 or Apple's 30" cinema display - although I'm inclined to wait for a 512MB version for my T221; it would be good to know whether it's capable of driving one.
Cheers
smn198 - Tuesday, July 19, 2005 - link
A suggestion regarding measuring the card's noise output and the way you measured the sound:
"We had to do this because we were unable to turn on the graphics card's fan without turning on the system."
Would it be possible to measure the voltages going to the fan when the card is idle and under full load? Then supply the fan with those voltages while the system is off, using a different power supply such as a battery (which is silent) and a variable resistor.
It would also be interesting to see a graph of how the noise increases when going from idle to full load over 10 minutes (or however long it takes to reach the maximum speed) on cards which have variable-speed fans. Instead of trying to measure the noise with the system on, again measure the voltage over time, then use your battery, variable resistor, and voltage meter to recreate those voltages, and combine the results with the voltage/time data to produce noise/time data.
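The last step of that suggestion is just joining two measurement tables: a voltage-over-time log (taken with the system running) and a noise-per-voltage calibration (taken with the system off). A sketch, with all numbers hypothetical:

```python
# Hypothetical measurements: (seconds since load start, fan volts)
voltage_over_time = [(0, 5.0), (300, 7.5), (600, 12.0)]

# Hypothetical calibration taken with the silent battery rig: volts -> dBA
noise_by_voltage = {5.0: 28.0, 7.5: 33.5, 12.0: 41.0}

# Combine the two into the noise/time data the commenter asks for.
noise_over_time = [(t, noise_by_voltage[v]) for t, v in voltage_over_time]
print(noise_over_time)  # prints [(0, 28.0), (300, 33.5), (600, 41.0)]
```

In practice the logged voltages would not land exactly on the calibration points, so interpolation between calibration readings would be needed.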
Thanks
PrinceGaz - Sunday, July 17, 2005 - link
Just look at the original review to see how the 7800GTX compares with older cards; they looked at a lot more games and a wider range of settings. This series of articles is a comparison of 7800GTX cards and is meant to focus on the differences between them. We all know a 7800GTX is faster than a 6800 Ultra, so there is no point including that on the graphs.
Zak - Sunday, July 17, 2005 - link
I agree, without any comparison to older cards this is pretty useless. Z.

PrinceGaz - Sunday, July 17, 2005 - link
Any word from Derek or Josh(?) as to why AA was not enabled in the tests? It would certainly be a lot more meaningful than the current set of results at resolutions as high as 2048x1536 without AA, where the lowest average framerate in any game is over 70fps. The argument about 4fps more being worthwhile because it is an extra 240 frames per minute is one of the daftest things I've read in a gfx card review.

Unless you include minimum framerates (and ideally a framerate graph like [H] do) and comment on playability at different resolutions and AA settings, remarks like an overclocked card getting 76fps being worthwhile over the non-overclocked one only managing 72fps are ludicrous. I bet you couldn't even tell the two apart in a test where you weren't told which was which. Turn on 4x AA and let's see how they stand up. It may come down more to memory bandwidth, but that's okay. I'm sure some manufacturer will use Samsung's 1.4ns (1400MHz) chips, or at least their 1.5ns (1333MHz) chips, sooner or later, assuming the core and circuit board are up to handling those speeds.
stephenbrooks - Sunday, July 17, 2005 - link
On page 5 (Heat, Power and Noise) it says under the first graph:-----
As you can see in the graph, there's no difference in the temperature of the reference card and the EVGA e-GeForce 7800 GTX in normal mode, and it only went up by one degree when we overclocked it.
-----
...that's not quite right; in fact, both the e-GeForces were at 81C (overclocked or not), whereas the reference card was at 80C.
overclockingoodness - Sunday, July 17, 2005 - link
#9 and #17: If you want to see the numbers, maybe you should go read the original 7800GTX review. This is just a vendor series, and they are comparing the vendors' performance, which is always going to differ by a couple of frames here and there. It's useless to include 6800 and ATI cards in there.

z0mb1e - Sunday, July 17, 2005 - link
I agree with #9, it would be nice if it had some numbers from the 6800 and an ATI card