Vendor Cards: MSI NX7800GTX
by Derek Wilson & Josh Venning on July 24, 2005 10:54 PM EST
Performance Tests
Again, we're sticking with the same three games that we tested in the last article: Battlefield 2, Doom 3, and Half Life 2. (Half Life 2 and Doom 3 are tested at 1920x1440 and Battlefield 2 at 2048x1536.) We tested the card on the same system as the EVGA:
MSI K8N Neo4 Platinum/SLI motherboard
AMD Athlon 64 FX-55 Processor
1 GB OCZ 2:2:2:6 DDR400 RAM
Seagate 7200.7 120 GB Hard Drive
OCZ 600 W PowerStream Power Supply
As we mentioned earlier, we added a set of benchmarks with 4xAA enabled. These give us a better look at the subtle differences between each card's performance, since enabling AA adds stress to memory bandwidth on these parts. One of the first things to look at is how the numbers compare between the EVGA and MSI cards out of the box, without any overclocking. Our tests quickly showed no difference between the MSI card's stock performance and that of our reference card, so that question was effectively answered in our previous article. Please note that, for this reason, we did not add a separate MSI NX7800 GTX entry to our graphs; NX7800GTX out-of-the-box performance is the reference result, highlighted in green.
As you can see, the EVGA slightly outperforms the MSI across the board at stock speeds. This was predictable, given that our EVGA e-GeForce 7800 GTX came to us with its core clocked at 450MHz, as opposed to MSI's standard 430MHz. When it comes to maximum overclocks, however, our MSI card was able to surpass what we saw from the EVGA part. In Battlefield 2, the percentage gain is more pronounced without 4xAA enabled; there, our NX7800GTX overclock gave us a frame rate increase of 10.4%.
Doom 3 saw roughly similar percentage gains from overclocking with and without AA. Without AA, overclocking the MSI card returned a 5.3% gain; with AA, we see slightly more at 7.5%.
Half Life 2 is the reverse of Battlefield 2. We see a higher increase in performance from overclocking with AA enabled than without. This could be because we are bumping into a CPU limitation without AA turned on. With AA enabled, we see an 8.8% increase in performance when overclocked, as opposed to only a 5.3% increase with no AA.
All of the gains that we see here from overclocking are fairly significant, and on par with what we would expect from a 12.8% increase in core clock speed (430MHz to 485MHz), based on our analysis of clock speeds in the 7800 GTX. Of course, we know that core speed is not as straightforward a measure as we would like it to be, but we will continue to press NVIDIA on the matter.
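For readers who want to check the arithmetic, here is a minimal sketch; the clock speeds are the ones quoted above, and the function is just the standard percentage-gain formula (nothing specific to our test harness):

```python
def percent_gain(stock, overclocked):
    """Percentage improvement of an overclocked result over a stock result."""
    return (overclocked - stock) / stock * 100

# Core clock: MSI's standard 430MHz vs. our 485MHz maximum stable overclock
print(f"core clock gain: {percent_gain(430, 485):.1f}%")  # -> core clock gain: 12.8%

# The same formula applied to any (stock fps, overclocked fps) pair from
# the graphs reproduces the benchmark percentages quoted above.
```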
In comparing the EVGA and MSI maximum core clock numbers, remember that every card is different and may not achieve the same results that we've seen here. Since these cards use the same HSF, we would expect similar overclocking headroom, and hopefully the more we test, the more we'll learn about how variable the 7800 GTX is in terms of maximum clock speed. It is obvious from the numbers that there is no practical difference in performance between a G70 clocked at 475MHz and one clocked at 485MHz.
Comments
Fluppeteer - Friday, July 29, 2005 - link
"Advertise" is perhaps a strong word, but the PDF data sheet on the eVGA web sitedoes say that one output is dual link (even though the main specifications say
the maximum digital resolution is 1600x1200, which is nonsense, like all resolution
claims, even for most single link cards).
I couldn't (last I looked) find anything about dual link support on the MSI site.
But then, MSI have in the past ignored that the 6800GTo was dual link, and then
claimed that their (real) 6800GT *was* dual link, and that the SiI transmitters
were unnecessary... (Although I'm still mystified how the PNY AGP 6600GT seems to
have dual dual link support without external transmitters.)
I'm presuming both heads have analogue output, btw (I only ask because the GTo,
for some astonishing reason, only has digital output on its single link head).
Past experience (with the 6800) suggests that the reason none of the manufacturers
mention it is that very few people actually know what dual link DVI *is*. A lot
probably haven't tried it - there being, last I looked, only three monitors which
can use it anyway, two of which are discontinued. nVidia caused a lot of confusion
by claiming support in the chipset and putting an external transmitter on their
reference card, which most manufacturers left off without updating their specs.
Unfortunately, nVidia seem to fob off all their tech support to the manufacturers,
who aren't always qualified to answer questions - I've not found anywhere to send
driver feature requests, for example. Seeing the external transmitter make it to
released boards is a vast relief to me.
Now the Quadro 4500 has been announced, I'm hoping the 512MB boards will appear
(and they might be DDL). Fingers crossed.
DerekWilson - Thursday, July 28, 2005 - link
Yes. Again, the SiI TMDS transmitter for dual-link is on the PCB. So far, we have not seen any 7800 cards without dual-link on one port. NVIDIA didn't even make this clear at their initial launch, but it is there. If we see a board without dual-link, we'll let you know.
Wulvor - Monday, July 25, 2005 - link
For that extra $4, you are also paying for a longer warranty. eVGA has a 1+1 warranty, so 1 year out of the box and another year when you register online at eVGA. MSI, on the other hand, has a 3 year warranty, and BFG a lifetime warranty. It must be the corporate purchaser in me; $4 is well worth the extra year (or 2). But I guess if you are going to be on the "bleeding" edge, then you are buying a new video card every 6 months anyway, so who cares?
smn198 - Monday, July 25, 2005 - link
A suggestion regarding measuring the card's noise output, and the way you measured the sound:
"We had to do this because we were unable to turn on the graphics card's fan without turning on the system."
Would it be possible to measure the voltages going to the fan when the card is idle and under full load? Then supply the fan with those voltages while the system is off, using a different power supply such as a battery (which is silent) and a variable resistor.
It would also be interesting to see a graph of how the noise increases when going from idle to full load over 10 minutes (or however long it takes to reach maximum speed) on cards which have variable-speed fans. Instead of trying to measure the noise with the system on, again measure the voltage over time, then use your battery, variable resistor, and voltage meter to recreate those voltages, and combine this with the voltage/time data to produce noise/time data.
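In sketch form, the reconstruction step might look something like this (all numbers are hypothetical, just to illustrate joining the two measurements):

```python
# Step 1: log fan voltage over time while the system runs under load.
# Step 2: with the system off, drive the fan from a battery + variable
#         resistor and record the noise level at each logged voltage.
# Step 3: join the two tables to recover noise over time.

voltage_over_time = [(0, 5.0), (120, 7.5), (300, 9.0)]  # (seconds, volts) - hypothetical
noise_at_voltage = {5.0: 31.0, 7.5: 36.5, 9.0: 40.0}    # volts -> dBA - hypothetical

noise_over_time = [(t, noise_at_voltage[v]) for t, v in voltage_over_time]
print(noise_over_time)  # [(0, 31.0), (120, 36.5), (300, 40.0)]
```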
Thanks
DerekWilson - Monday, July 25, 2005 - link
We are definitely evaluating different methods for measuring sound. Thanks for the suggestions. Just to be clear, even after hours of looping tests on the 7800 GTX overclocked to 485/625, we never once heard an audible increase in the fan's speed.
This is very much unlike our X850 parts, which spin up and down frequently during any given test.
We have considered attempting to heat the environment to simulate a desert-like climate (we've gotten plenty of email from military personnel asking about heat tolerance on graphics cards), but it is more difficult than it would seem to heat the environment without causing other problems in our lab.
Suggestions are welcome.
Thanks,
Derek Wilson
at80eighty - Tuesday, July 26, 2005 - link
"We have considered attempting to heat the environment to simulate a desert-like climate [...] but it is more difficult than it would seem to heat the environment without causing other problems in our lab"
Derek, if you really wanna simulate desert-like heat in the room, may I suggest inviting Monica Bellucci to your lab... should work like a charm :p
reactor - Monday, July 25, 2005 - link
I've been using MSI cards for a few years now; their fans always seem to run at top speed, and I've found they usually run at higher RPMs (slightly louder) than other manufacturers'. I think that explains why the card is cooler while drawing more power, and why you didn't notice a difference in sound as the card was stressed. I'm not entirely certain, but that's from my own experiences with MSI cards. Good article, looking forward to the BFG.
yacoub - Monday, July 25, 2005 - link
"As you can see, The EVGA slightly outperforms the MSI across the board at stock speeds."Either I'm reading it wrong or you mis-wrote that line, since I see the e-VGA normal and OC'd, the NVidia reference, and the MSI OC'd, but no MSI at stock speeds. Thus it's hard to compare th EVGA stock speeds vs the MSI stock speeds when one of them isn't on the charts.
DerekWilson - Monday, July 25, 2005 - link
Check the bold print on the Performance page: MSI stock performance is the same as the NVIDIA reference performance at 430MHz.
To compare stock numbers, compare the green bar to the EVGA @ 450/600.
Sorry for the confusion, but we actually tested all the games a second time and came up with the exact same numbers. Rather than add another bar, we thought it'd be easier to just reference the one.
If you guys would rather see multiple bars for equivalent results across the board, we can certainly do that.
Thanks,
Derek Wilson
davecason - Monday, July 25, 2005 - link
Since the MSI card drew a lot more power than expected but remained cooler than the eVGA card, I was thinking that some of the excess may be due to the cooling of the card itself. Maybe the fan on the MSI card works harder than the one on the eVGA card. The people at AnandTech could test the power usage of the stock video-card cooling fans independently to see what their effect is on power load. This may explain the 6 extra watts used by the MSI card. This information might be mildly useful to someone who is already stressing their power supply with other things (such as several hard drives). Does anyone think that is worth doing?