MultiGPU Update: Two-GPU Options in Depth
by Derek Wilson on February 23, 2009 7:30 AM EST - Posted in
- GPUs
Power Consumption
All of these results are taken at the wall (total system power) while running the 3DMark Vantage POM (parallax occlusion mapping) shader test. This test uses very few other system resources and focuses on the GPU. This means that the numbers you see here are LOWER than total system power while playing a game -- often by more than 50W and sometimes 100W, depending on the game, benchmark, and system. These numbers show clearer differences in GPU power draw, which is why we stick with them. They should NOT be used to determine a proper PSU for a given graphics card solution unless you build a couple hundred watts of headroom into your calculation.
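To make the headroom point concrete, here is a minimal sketch of the kind of back-of-the-envelope PSU sizing the paragraph above describes. The specific wattage values and helper name are illustrative assumptions, not measurements from this review:

```python
# Hypothetical PSU sizing sketch based on at-the-wall measurements.
# The overhead and headroom figures below are assumed examples only.

def psu_target_watts(measured_wall_load, gaming_overhead=100, headroom=200):
    """The shader-test wall reading understates in-game draw, so add an
    assumed gaming overhead plus a couple hundred watts of headroom
    before choosing a PSU rating."""
    return measured_wall_load + gaming_overhead + headroom

# e.g. a 350W at-the-wall reading suggests shopping for roughly a 650W unit
print(psu_target_watts(350))  # -> 650
```

The exact overhead varies by game and system, which is why the article hedges with "often by more than 50W and sometimes 100W."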
For idle power, NVIDIA's 55nm GT200 parts take the cake. We don't have a 55nm GTX 260 in house, but we would expect its idle power to come in on par with or lower than our GTX 285's. AMD's 4850 hangs with the lower power NVIDIA options, but both the 512MB and 1GB 4870 variants pull more power than any of our other single GPU solutions.
Among the multiGPU options, SLI definitely draws the most power. The X2 single-card multiGPU options draw less power than the two-card solutions, and this carries over to NVIDIA as well, with the GTX 295 coming in at lower power than 2x GTX 260s. If we had tested two 1GB 4870 cards, we would expect to see more than 285W of power draw at idle, which is quite high indeed.
Things change up a bit when we explore load power draw. The lowest idle power parts end up drawing the most power under load. The additional draw of the GTX 280 and GTX 285 is not unexpected, as they offer a typically higher level of performance for their power. All of our multiGPU options draw more power than all of our single GPU configurations, so, though we didn't calculate the figures, you can expect a performance per watt advantage wherever a single GPU leads any multiGPU option in performance.
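The performance-per-watt reasoning above can be sketched in a few lines. The frame rates and wattages below are hypothetical placeholders, not figures from this review; the point is only the arithmetic:

```python
# Hedged sketch of the performance-per-watt comparison the article alludes to.
# All numbers here are assumed for illustration, not measured results.

def perf_per_watt(fps, watts):
    """Simple efficiency metric: frames per second per watt of system draw."""
    return fps / watts

single_gpu = perf_per_watt(60.0, 300.0)  # hypothetical single-GPU config
multi_gpu = perf_per_watt(55.0, 450.0)   # hypothetical multiGPU config

# If a single GPU leads (or even matches) a multiGPU setup in performance
# while drawing less power, its performance per watt is strictly higher.
assert single_gpu > multi_gpu
```

This is why the article can claim the efficiency advantage without computing the numbers: higher numerator over a lower denominator always wins.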
95 Comments
View All Comments
MamiyaOtaru - Tuesday, February 24, 2009 - link
So we have to be perfect in every way to point out errors? NBA players shouldn't listen to their coaches because their coaches can't play as well as they do? Game reviewers shouldn't trash a game because they couldn't make a better one?
ggathagan - Tuesday, February 24, 2009 - link
When it comes to grammatical errors as insignificant as the ones pointed out, yes. If you're going to be that critical, then you'd best check your own grammar.
cptnjarhead - Wednesday, February 25, 2009 - link
Grammar shmammar, you guys need to move out of your mom's basement and get laid. :)
bigboxes - Wednesday, February 25, 2009 - link
+1
stym - Monday, February 23, 2009 - link
I am curious to see how a pair of Radeon 4830s would perform in this lineup. A single one is quite weak at these resolutions, but I am willing to bet a pair of them would hold its own against a single GTX 280. Oh, and it would be much cheaper, too ($180 including the bridge).
Could you possibly include that setup next?
DerekWilson - Monday, February 23, 2009 - link
You are right that a single 4830 won't be enough to perform on par with these guys ... but I don't think two of them would really be worth it against the GTX 280, except maybe at lower resolutions. The 1GB 4830 will run you at least $145, so you're looking at $290 for two of them, and the 4850 X2 2GB is the same price. The 512MB 4830 will be limited by memory usage at higher resolutions, just like the 4850 512MB.
We might look at the 4830 in CrossFire internally and see if it warrants an update, but so far it isn't in the roadmap for the rest of the series.
stym - Monday, February 23, 2009 - link
I was thinking 512MB 4830s, which are in the $90~$110 price range. That price range is the only reason I mention them, because it puts the price tag of a pair of those in the exact same range as a Radeon 4830 512MB or even a GTX 260.
You said that a 4850 1GB doesn't make sense, and that's even more obvious for a 4830.
pmonti80 - Monday, February 23, 2009 - link
I too find that this would be an interesting match at the $200+ price tag.
wilkinb - Monday, February 23, 2009 - link
Why not just drop AoC? It was bad when it came out, has always had issues and odd results, and no one I know played it for more than 2 months... If you want to have an MMO, why not use one that people play? And maybe one that's more mature in development...
I know you will say it adds value, but you don't know if it's bad code or just showing a different view.
ajoyner - Tuesday, February 24, 2009 - link
Most of the issues with the game are gone. There are currently no other MMOs out there that have the graphics or combat system to tax a GPU like this game. Your comment on testing a game that people play is very subjective. There are many MMOs out there that I would not touch....WoW, cough, cough.....but that doesn't mean other people don't enjoy them. I think having this game as one that is regularly benchmarked adds a great deal of value to the article.