MultiGPU Update: Two-GPU Options in Depth
by Derek Wilson on February 23, 2009 7:30 AM EST - Posted in GPUs
Calculating Value: Performance per Dollar
Have you ever wondered what you get for your money? That's probably a silly question, as anyone reading this page could guess. There are a couple of ways to present this data, and we wanted something simple to understand. It is important to remember that in the way we've presented this information, absolute performance is not accounted for at all: the only metric we are looking at on this page is how much you get for the money you spend. Keep in mind that a good deal on 25 frames per second might not be what you are after: absolute performance is important too, and we'll be looking at that in the next section. In general, more expensive solutions perform better, so even if they show lower "value" the performance increase could be worth it to some buyers.
We will be using these prices for this calculation.
| Card | Price (USD) |
|------|-------------|
| NVIDIA GeForce GTX 285 SLI | $700 |
| NVIDIA GeForce GTX 280 SLI | $630 |
| NVIDIA GeForce GTX 260 SLI | $400 |
| NVIDIA GeForce 9800 GTX+ SLI | $290 |
| NVIDIA GeForce GTX 295 | $500 |
| NVIDIA GeForce GTX 285 | $350 |
| NVIDIA GeForce GTX 280 | $315 |
| NVIDIA GeForce GTX 260 Core 216 | $225 |
| NVIDIA GeForce GTX 260 | $200 |
| NVIDIA GeForce 9800 GX2 | $300 |
| NVIDIA GeForce 9800 GTX+ | $145 |
| ATI Radeon HD 4850 X2 | $290 |
| ATI Radeon HD 4870 512MB CrossFire | $350 |
| ATI Radeon HD 4850 CrossFire | $290 |
| ATI Radeon HD 4870 X2 | $440 |
| ATI Radeon HD 4870 1GB | $220 |
| ATI Radeon HD 4870 512MB | $175 |
| ATI Radeon HD 4850 | $145 |
These prices were gathered from Newegg and Google and do NOT include mail-in rebates.
Our method here is to look at the performance you get for every hundred dollars spent. Specifically, this answers the question: how many frames per second do you get in a specific game for every hundred dollars you spend on a particular graphics card? To calculate this, we divided our performance data (average framerate) by the cost of the card and then multiplied the result by 100. This isn't a number that means anything tangible on its own; it's a metric that helps us relate the value of cards within a specific test. You can't compare these numbers between games, or even between resolutions, except in terms of relative order -- you need to look at one test and one resolution at a time.
To illustrate: if all the cards in a test had a score of 10, that would mean for every hundred dollars you spend you get 10 frames per second of performance in that test. Of course, though our value chart shows all the cards on equal footing, more expensive cards will have proportionally higher performance: if you wanted 30 frames per second in that specific benchmark, you would need to spend at least $300.
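The calculation above can be sketched in a few lines. The price here matches the GTX 260 entry in the table, but the framerate is a hypothetical placeholder, not one of our benchmark results:

```python
# Value metric used in this article: average FPS per $100 spent.

def fps_per_100_dollars(avg_fps: float, price: float) -> float:
    """Divide average framerate by card price, then scale by 100."""
    return avg_fps / price * 100

# Hypothetical example: a $200 card averaging 30 fps in one test
# scores 15.0 -- i.e. 15 fps for every $100 spent.
value = fps_per_100_dollars(30, 200)
print(round(value, 1))
```

Remember that these scores are only comparable within a single test at a single resolution; the units cancel into "fps per $100," which has no meaning across different benchmarks.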
So this isn't the bottom line in what to buy. These benchmarks are an indication of relative value outside absolute performance. Absolute performance is also a value metric: higher performance is more valuable, and may be disproportionately more valuable if it crosses a playability threshold. These graphs will help show how much of a premium or a deal you are paying or getting on your absolute performance relative to other parts.
In general, multiGPU solutions will show less "value" than single GPU counterparts because we see less than linear scaling. If a two card solution costs twice as much while performance scales at less than 2x, we'll see a lower "value" result. The single card multiGPU options have a better chance at improving value than two card solutions, as they can sometimes be found for less than twice the cost of their nearest single card single GPU derivative.
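A quick illustration of that point, with assumed numbers rather than measured results: if two cards cost exactly twice as much as one but only scale to 1.7x the framerate, the pair delivers 85% of the single card's value score.

```python
# Sketch of why multi-GPU "value" drops with sub-linear scaling.
# The 1.7x scaling figure below is an illustrative assumption,
# not a measured result from our tests.

def value_ratio(perf_scaling: float, cost_factor: float) -> float:
    """Value of a multi-GPU setup relative to a single card:
    performance gained divided by extra cost paid."""
    return perf_scaling / cost_factor

# Two cards at twice the price scaling to 1.7x the framerate:
# the pair's fps-per-dollar is 85% of the single card's.
print(value_ratio(1.7, 2.0))
```

This is also why a single-card multiGPU product can look better here: if its cost factor over the nearest single-GPU card is 1.8x rather than 2.0x, the same 1.7x scaling yields a value ratio above 0.9.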
95 Comments
SiliconDoc - Wednesday, March 18, 2009 - link
Oh, I'm sorry, go to that new technology red series, the 3000 series, and get that 3870... that covers that GAP the reds have that they constantly WHINE nvidia has but DOES NOT. Yes, go back a full gpu gen for the reds' midrange card...
GOOD GOLLY - how big have the reddies been lying? !?
HUGE!
MagicPants - Monday, February 23, 2009 - link
I know you're trying to isolate and rate only the video cards, but the fact of the matter is that if you spend $200 on a video card that bottlenecks a $3000 system, you have made a poor choice. By your metrics, integrated video would "win" a number of tests because it is more or less free. You should add a chart where you include the cost of the system. Also, feel free to use something besides the i7 965. Spending $1000 on a CPU in an article about GPU thrift seems wrong.
oldscotch - Monday, February 23, 2009 - link
I'm not sure that the article was trying to demonstrate how these cards compare in a $3000.00 system, as much as it was trying to eliminate any possibility of a CPU bottleneck.
MagicPants - Monday, February 23, 2009 - link
Sure, if the article was about pure performance this would make sense, but in performance per dollar it's out of place. If you build a system for $3000 and stick a $200 GTX 260 in it and get 30 fps, you've just paid $106 ($3200/30 fps) per fps.
Take that same $3000 system and stick in a $500 GTX 295, and assume you get 50 fps in the same game. Now you've paid just $70 ($3500/50 fps) per fps.
In the context of that $3000 system, the GTX 295 is the better buy because the system is "bottlenecking" the price.
OSJF - Monday, February 23, 2009 - link
What about micro stuttering in MultiGPU configurations? I just bought an HD 4870 1GB today; the only reason I didn't choose a MultiGPU configuration was all the talk about micro stuttering on German tech websites.
DerekWilson - Monday, February 23, 2009 - link
In general, we don't see micro stuttering except at high resolutions in memory-intensive games that already show lowish average framerates ... games, hardware, and drivers have gotten a bit better on that front when it comes to two GPUs, to the point where we don't notice it as a problem when we do our hands-on testing with two GPUs.
chizow - Monday, February 23, 2009 - link
Nice job on the review Derek, certainly a big step up from recent reviews of the last 4-5 months. A few comments though:
1) Would be nice to see a step back and look at CPU scaling with 1, 2, and 3 GPUs. Obviously you'd have to cut down the number of GPUs tested, but perhaps one from each generation would be a good starting point for this analysis.
2) Some games where there's clearly artificial frame caps or limits, why wouldn't you remove them in your testing first? For example, Fallout 3 allows you to remove the frame cap/smoothing limit, which would certainly be more useful info than a bunch of SLI configs hitting 60FPS cap.
3) COD5 is interesting though, did you contact Treyarch about the apparent 60FPS limit for single-GPU solutions? I don't recall any such cap with COD4.
4) Is the 4850X2 still dependent on custom drivers from Sapphire? I've read some horror stories about official releases not being compatible with the 4850X2, which would certainly put owners behind the 8-ball as a custom driver would certainly have the highest risk of being dropped when it comes to support.
5) Would've been nice to have seen an overclocked i7 used, since it's clearly obvious CPU bottlenecks are going to come into play even more once you go to 3- and 4-GPU solutions, while reducing the gain and scaling for the faster individual solutions.
Lastly, do you plan on discussing or investigating the impact of multi-threaded optimizations from drivers in Vista/Win7? You mentioned it in your DX11 article, but both NVIDIA and ATI have already made improvements in their drivers, which seem to be directly credited for some of the recent driver gains. In particular, I'd like to see if it's a WDDM 1.0-1.1 benefit from a multi-threaded driver that extends to the DX9/10/11 paths, or if it's limited strictly to WDDM 1.0-1.1 and DX10+ paths.
Thanks, look forward to the rest.
SiliconDoc - Wednesday, March 18, 2009 - link
Thank you very much." 2) Some games where there's clearly artificial frame caps or limits, why wouldn't you remove them in your testing first? For example, Fallout 3 allows you to remove the frame cap/smoothing limit, which would certainly be more useful info than a bunch of SLI configs hitting 60FPS cap.
3) COD5 is interesting though, did you contact Treyarch about the apparent 60FPS limit for single-GPU solutions? I don't recall any such cap with COD4.
4) Is the 4850X2 still dependent on custom drivers from Sapphire? I've read some horror stories about official releases not being compatible with the 4850X2, which would certainly put owners behind the 8-ball as a custom driver would certainly have the highest risk of being dropped when it comes to support.
"
#2 - a red rooster booster that limits nvidia winning by a large margin- unfairly.
#3 - Ditto
#4 - Ditto de opposite = this one boosts the red card unfairly
Yes, when I said "red fan boy" is all over Derek's articles, I meant it.
DerekWilson - Monday, February 23, 2009 - link
thanks for the feedback ... we'll consider some of this going forward.
we did what we could to remove artificial frame caps. in fallout 3 we set iPresentInterval to 0 in both .ini files and framerate does go above 60 -- it just doesn't average above 60, so it looks like a vsync issue when it's an LOD issue.
we didn't contact anyone about COD5, though there's a console variable that's supposed to help but didn't (except for the multiGPU solutions).
we're looking at doing overclocking tests as a follow up. not 100% on that, but we do see the value.
as for the Sapphire 4850 X2, part of the reason we didn't review it initially was because we couldn't get drivers. Ever since 8.12 we've had full driver support for the X2 from AMD. We didn't use any specialized drivers for that card at all.
we can look at the impact of multithreaded optimizations, but this will likely not come until DX11 as most of the stuff we talked about requires DX11 to work. I'll talk to NVIDIA and AMD about current multithreaded optimizations, and if they say there is anything useful to see in current drivers we'll check it out.
thanks again for the feedback
chizow - Monday, February 23, 2009 - link
Oh and specifically, much better layout/graph display with the resolution selections! :)