AMD's CrossFireX: Tri & Quad GPU Preview
by Anand Lal Shimpi on February 21, 2008 3:00 AM EST - Posted in GPUs
Power Consumption
In our 3-way SLI review we saw power consumption figures close to 800W at the wall outlet; thankfully, with cooler-running GPUs, the CrossFireX numbers aren't as bad:
Number of GPUs | Idle Power | Load Power (Bioshock)
1 x Radeon HD 3870 | 148W | 257W
2 x Radeon HD 3870 (1 X2) | 181W | 361W
3 x Radeon HD 3870 (2 X2 + 1) | 211W | 406W
4 x Radeon HD 3870 (2 X2) | 240W | 538W
With four GPUs we're drawing over 500W at the wall when running our Bioshock benchmark. There's still no clear need for a power supply larger than 1kW, but the better-safe-than-sorry mentality appears to be in full effect.
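To put those numbers in perspective, here's a quick back-of-the-envelope sketch of how much each additional card adds at the wall. The wattages are the load figures from the table above; the script itself is just illustrative:

```python
# Incremental wall-socket power per added Radeon HD 3870 under the
# Bioshock load, using the figures from the table above.
load_watts = {1: 257, 2: 361, 3: 406, 4: 538}

for gpus in range(2, 5):
    delta = load_watts[gpus] - load_watts[gpus - 1]
    print(f"GPU #{gpus}: +{delta}W at the wall under load")

# Output: GPU #2: +104W, GPU #3: +45W, GPU #4: +132W
```

The uneven steps fit the scaling story below: the third GPU adds the least power under load, which hints it isn't being kept as busy as the others.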
Final Words
We have to give AMD credit: there was no cherry picking of titles for this preview. For the most part, the benchmarks AMD itself selected showed no real need for 4-way CrossFireX over 3-way. We do appreciate the honesty, but it's clear that the world just isn't ready for a quad-GPU solution.
Due to the state of AMD's driver optimizations, DX10 games (Crysis, Bioshock) currently only scale well up to 3 GPUs and not much beyond, while DX9 games will generally scale better all the way up to 4 GPUs. We expected the opposite to be true, but AMD provided us with technical insight as to why this is the case:
"The biggest issue is DX10 has a lot more opportunities for persistent resources (resources rendered or updated in one frame and then read in subsequent frames). In DX9 we only had to handle texture render targets, which we have a good handle on in the DX10 driver. In addition to texture render targets DX10 allows an application to render to IBs and VBs using stream out from the GS or as a traditional render target. An application can also update any resource with a copy blt operation, but in DX9 copy blt operations were restricted to offscreen plains and render targets. This additional flexibility makes it harder to maximize performance without impacting quality.
Another area that creates issues is constant buffers, which is new for DX10. Some applications update dynamic constant buffers every frame while other apps update them less frequently. So again we have to find the right balance that generally works for quality without impacting performance.
We are also seeing new software bottlenecks in DX10 that we continue to work through. These software bottlenecks are sometimes caused by interactions with the OS and the Vista driver model that did not exist for DX9, most likely due to the limited feature set. Software bottlenecks impact our multi-GPU performance more than single GPU and can be a contributing factor to limited scaling.
We’re continuing to push hard to find the right solution to each challenge and boost performance and scalability wherever we can. As you can see, there are a lot of things that factor in."
From AMD's explanation it sounds like there's still a lot of work to be done on the CrossFireX driver. While we can expect to see its public debut in March, it seems like it'll be a while before we're anywhere close to ideal scaling. We've found ourselves in this position with many-GPU designs in the past; at least the players are taking things a bit more seriously this time around.
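As a rough way to visualize the problem AMD describes, here is a toy Python model of alternate-frame rendering. It is not AMD's actual driver logic, and the function name and resource counts are made up for illustration; it only shows why a resource written in one frame and read in the next forces traffic between GPUs:

```python
# Toy model of alternate-frame rendering (AFR): frames go to GPUs in
# round-robin order. A "persistent" resource (written in frame N, read in
# frame N+1 - AMD's example is DX10 render targets, stream-out buffers and
# constant buffers) forces a cross-GPU transfer before it can be used.
def afr_syncs(num_gpus, frames, persistent_resources):
    owner = {}   # resource id -> GPU that last wrote it
    syncs = 0
    for frame in range(frames):
        gpu = frame % num_gpus
        for res in range(persistent_resources):
            if res in owner and owner[res] != gpu:
                syncs += 1        # resource lives on another GPU: transfer needed
            owner[res] = gpu      # this GPU rewrites the resource this frame
    return syncs

print(afr_syncs(4, 100, 0))   # 0   - nothing persists, AFR scales freely
print(afr_syncs(4, 100, 3))   # 297 - roughly one transfer per resource per frame
print(afr_syncs(1, 100, 3))   # 0   - a single GPU never has to synchronize
```

The more resources a game keeps alive between frames, the more inter-GPU traffic AFR incurs, which is one way to read AMD's point about DX10 offering "a lot more opportunities for persistent resources."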
28 Comments
Spacecomber - Friday, February 22, 2008 - link
Having recently read the preview of the 9600GT at Anandtech, one of the things that stood out from that article was how SLI seemed to do better than Crossfire on the games that were tested. Crysis was the only game that was used in both this article and that one, and 3850's were run in Crossfire for the earlier article, not 3870's. Nevertheless, it looks like Crossfire performance gains going from 1 to 2 ATI cards is now on a par (with the new AMD/ATI drivers) with going from 1 to 2 Nvidia cards. Perhaps this will prove to be the reason for AMD/ATI selecting the tests they did in this preview. CrossfireX does about as well as SLI on these particular games?
(Though we'll not see the results, we know that Derek is trying these new ATI drivers out on his Skulltrail system, if it's possible. ;-) )
Zoomer - Thursday, February 21, 2008 - link
Are you allowed to only test these games, or only allowed to publish and talk about these games? I don't see how ATi can enforce such a requirement. *cough* ghost *cough*
Wirmish - Thursday, February 21, 2008 - link
PC Perspective also tested the beta CrossFireX drivers. Their system is identical, except for the hard disk.
http://www.pcper.com/article.php?aid=523&type=...
RESULTS:
Bioshock
Anand (0xAA/0xAF) -> 2=63, 3=87, 4=93
PCPer (0xAA/8xAF) -> 2=65, 3=84, 4=92
Call of Duty 4
Anand (4xAA/16xAF) -> 2=50, 3=72, 4=93
PCPer (4xAA/ 8xAF) -> 2=43, 3=56, 4=64
Unreal Tournament 3
Anand (0xAA/16xAF) -> 2=84, 3=113, 4=115
PCPer (0xAA/ 8xAF) -> 2=58, 3=54, 4=58
How do you explain these results?
POWER CONSUMPTION - ENTIRE SYSTEM
Anand (Bioshock) -> 2=361W, 3=406W, 4=538W
PCPer (CoD 4) ----> 2=407W, 3=527W, 4=663W
Why didn't you choose the most power-hungry game to measure the consumption of the system?
Paracelsus - Thursday, February 21, 2008 - link
You've listed the 4-way CF over 1 card gain as 268%. It should be 368%. (95.2/25.3 = 3.68)
The numbers are confusing, comparing 3-way to 2-way, etc. Why do that? It makes more sense to compare 3-way to 1-way; then it's easy to compare against perfect scaling of 200, 300, 400%.
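For anyone following this thread, here is a small sketch of the two percentage conventions being mixed up: speedup relative to one card versus gain over one card. The fps figures are the ones quoted in the comment above; the script is only illustrative:

```python
# Two ways to express 4-way scaling against a single card, using the fps
# figures quoted above (25.3 fps for one card, 95.2 fps for four).
single_fps, quad_fps = 25.3, 95.2

speedup = quad_fps / single_fps                            # ~3.76x a single card
percent_of_single = speedup * 100                          # ~376% of the single-card rate
percent_gain = (quad_fps - single_fps) / single_fps * 100  # ~276% faster than one card

print(f"{speedup:.2f}x | {percent_of_single:.0f}% of one card | +{percent_gain:.0f}% gain")
# On the "percent of one card" convention, perfect scaling for 2/3/4 GPUs
# would be 200/300/400%.
```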
mechwarrior1989 - Thursday, February 21, 2008 - link
I got confused by it too, but the Test Bench isn't what they listed on the page. Either that's a typo or that's just supposed to refer to the Nvidia benchmark that they did with the Tri-SLI.
kalrith - Thursday, February 21, 2008 - link
Yeah, it looks like they entered the test setup from the Nvidia article rather than the one used in this article. It would make sense if they entered both test setups, but not to completely exclude the AMD setup used for most of the tests.
Anand Lal Shimpi - Thursday, February 21, 2008 - link
Woops :) Fixed :)
Take care,
Anand
Arbie - Thursday, February 21, 2008 - link
Good catch for Anandtech! AMD told you it was a Phenom board but you saw it was really a QX9650. The company must be in dire straits to try a trick like that. Sad.
donkeycrock - Thursday, February 21, 2008 - link
The graph is the best and most effective one I've seen in anybody's review in a long time. Cheers and well done.
brunis - Thursday, February 21, 2008 - link
Hi, a lot of people are still playing WoW, me included. I just bought the Samsung 245B, a wide 24" screen. I'd love to see a CPU+GPU update for the WoW guide - whether I should invest in an extra 8800GTS or a new Core 2 or AMD Phenom CPU.
regards,
Brunis