AMD's Radeon HD 4870 X2 - Testing the Multi-GPU Waters
by Anand Lal Shimpi & Derek Wilson on August 12, 2008 12:00 AM EST - Posted in GPUs
Final Words
I've never felt totally comfortable with single-card multi-GPU solutions. While AMD reached new levels of seamless integration with the Radeon HD 3870 X2, there was always the concern that the performance of your X2 would either be chart-topping or merely midrange depending on how good AMD's driver team was that month. The same is true for NVIDIA GPUs: most games we test have working SLI profiles, but there's always the concern that one won't. It's not such a big deal for us when benchmarking, but it is a big deal if you've just plopped down a few hundred dollars and expect top performance across the board.
Perhaps I'm being too paranoid, but the CrossFire Sideport issue highlighted an important, um, issue for me. I keep getting the impression that multi-GPU is great for marketing but not particularly important when it comes to actually investing R&D dollars into design. With every generation, especially from AMD, I expect to see a much more seamless use of multiple GPUs, but instead we're given the same old solution: we rely on software profiles to ensure that multiple GPUs work well in a system, rather than having a hardware solution where two GPUs truly appear, behave and act as one to the software. Maybe it's not in the consumer's best interest for the people making the GPUs to be the same people making the chipsets; it's too easy to try and use multi-GPU setups to sell more chipsets when the focus should really be on making multiple GPUs more attractive across the board, and just...work. But I digress.
The Radeon HD 4870 X2 is good: it continues to be the world's fastest single-card solution, provided that you're running a game with CrossFire support. AMD's CrossFire support has been quite good in our testing, scaling well in all but Assassin's Creed. Of course, that one is a doubly bitter pill for AMD when combined with the removal of DX10.1 support in the latest patch (which we did test with here). That has nothing to do with CrossFire support, of course, but the lack of scaling, together with the fact that 4xAA has the potential to be free on AMD hardware yet isn't, means the card just doesn't stack up well in that test.
In addition to being the fastest single-card solution, the 4870 X2 in CrossFire is also the fastest two-card solution at 2560x1600 in every test we ran but one (once again, Assassin's Creed). It is very important to note that 4-way CrossFire was not the fastest solution in as many cases at resolutions below 2560x1600. This is generally because there is more overhead associated with 4-way CrossFire, which can become the major bottleneck in performance at lower resolutions. It isn't that the 4870 X2 in CrossFire is unplayable at lower resolutions; it's just a waste of money.
We have yet to test 3-way SLI with the newest generation of NVIDIA hardware, and three GTX 260s may indeed give two 4870 X2 cards a run for their money. We also have no doubt that a 3x GTX 280 solution is going to be the highest performing option available (though we lament that anyone would waste so much money on so much power that is, at this point in time, unnecessary).
For now, AMD and NVIDIA have really gone all in on this generation of hardware. AMD may not have the fastest single GPU, but they have done a good job of shaking up NVIDIA's initial strategy and forcing them to adapt their pricing to keep up. Right now, the consumer can't go wrong with a current-generation solution for less than $300 in either the GTX 260 or the HD 4870. These cards compete very well with each other, and gamers will have to pay attention to which titles they want greater performance in before they buy.
The GTX 280 is much more reasonable at $450, but you are still paying a premium for the fastest single-GPU solution available. Despite a price that is more than 150% that of the GTX 260 and the 4870, you just don't get that return in performance. It is faster than the GTX 260, and most of the time it is faster than the 4870 (though there are times when AMD's $300 part outperforms NVIDIA's $450 part). The bottom line is that if you want performance at a level above the $300 price point in this generation, you're going to get less performance per dollar.
When you start pushing up over $450 and into multi-GPU solutions, you do have to be prepared for even more diminished returns on your investment, and the 4870 X2 is no exception. Though it scales well in most cases and leads the pack in single-card performance when it scales, there is no guarantee that scaling will be there, let alone good, in every game you want to play. AMD is putting a lot into this, and you can expect us to keep pushing them to get performance improvements as near to linear as possible with multi-GPU solutions. But until we have shared framebuffers and real cooperation on rendering frames from a multi-GPU solution, we just aren't going to see the kind of robust, consistent results most people will expect when spending more than $550 on graphics hardware.
93 Comments
Spoelie - Tuesday, August 12, 2008 - link
How come 3dfx was able to have a transparent multi-GPU solution back in the 90's - granted, memory still was not shared - when it seems impossible for everyone else these days? Shader functionality problems? Too much integration (a single-card Voodoo2 was a 3-chip solution to begin with)?
Calin - Tuesday, August 12, 2008 - link
The SLI from 3dfx used scan-line interleaving (Scan-Line Interleave, to be exact). The new SLI still has scan-line interleaving, amongst other modes. The reason 3dfx was able to use this is that the graphics library used was their own, and it was built specifically for the task. Now, Microsoft's DirectX is not built for this SLI thing, and it shows (see the CrossFire profiles, each selected for the best performance in a given game).
Also, 3dfx's SLI had a dongle feeding the video signal from the second card (slave) into the first card (master), and the video from the two cards was interleaved. Now, this uses lots of bandwidth, and I don't think DirectX is able to generate scenes in "only even/odd lines", and much of the geometry work must be done by both cards (so if your game engine is geometry bound, SLI doesn't help you).
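As a rough illustration of the scan-line interleaving Calin describes, here's a toy sketch - not 3dfx's actual implementation; the names and the render_scanline() stand-in are all hypothetical - of how two GPUs can each rasterize alternate scanlines and have the master interleave the halves into one frame:

```python
# Toy model of 3dfx-style scan-line interleaving (SLI).
WIDTH, HEIGHT = 640, 480

def render_scanline(gpu_id: int, y: int) -> list:
    # Stand-in for one GPU rasterizing a single scanline; we fill
    # the line with the GPU's id so the interleaving is visible.
    return [gpu_id] * WIDTH

def render_half_frame(gpu_id: int, parity: int) -> dict:
    # Each GPU renders only the lines matching its parity
    # (GPU 0 -> even lines, GPU 1 -> odd lines).
    return {y: render_scanline(gpu_id, y) for y in range(parity, HEIGHT, 2)}

def interleave(even_lines: dict, odd_lines: dict) -> list:
    # The "master" card merges the two half-frames - roughly what
    # the Voodoo2 pass-through dongle did in analog form.
    return [even_lines[y] if y % 2 == 0 else odd_lines[y]
            for y in range(HEIGHT)]

frame = interleave(render_half_frame(0, 0), render_half_frame(1, 1))
assert frame[0][0] == 0 and frame[1][0] == 1  # alternating line ownership
```

Note that both "GPUs" would still need the full scene to know what lands on their lines, which is exactly the geometry-duplication cost the comment points out.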
mlambert890 - Friday, August 15, 2008 - link
Great post... Odd that people seem to remember 3DFX and don't remember GLIDE or how it worked. I'm guessing they're too young to have actually owned the original 3D cards (I still have my dedicated 12MB Voodoo cards in a closet), and they just hear something on the web about how "great" 3DFX was. It was a different era and there was no real unified 3D API. Back then we used to argue about OpenGL vs GLIDE, and the same types of malcontents would rant and rave about how "evil" MSFT was for daring to think to create DirectX.
Today a new generation of ill-informed malcontents continues to rant and rave about Direct3D and slam NVidia for "screwing up" 3DFX, when the reality is that time moves on and NVidia used the IP from 3DFX that made sense to use (OBVIOUSLY - sometimes the people spending hundreds of millions and billions have SOME clue what they're buying/doing, and actually have CS PhDs rather than just "forum posting cred").
Zoomer - Wednesday, August 13, 2008 - link
Ah, I remember wanting to get a Voodoo5 5000, but ultimately decided on the Radeon 32MB DDR instead. Yes, 32MB DDR framebuffer!
JarredWalton - Tuesday, August 12, 2008 - link
Actually, current SLI stands for "Scalable Link Interface" and has nothing to do with the original SLI other than the name. Note also that 3dfx didn't support anti-aliasing with SLI, and they had issues going beyond the Voodoo2... which is why they're gone.
CyberHawk - Tuesday, August 12, 2008 - link
nVidia bought them... and is now incapable of taking advantage of the technology :D
StevoLincolnite - Tuesday, August 12, 2008 - link
They could have at least included support for 3DFX GLIDE so all those GLIDE-only games would continue to function. Also, ATI had a "Dual GPU" card (the Rage Fury MAXX) years before nVidia released one.
TonyB - Tuesday, August 12, 2008 - link
Can it play Crysis though? Two of my friends' computers died while playing it.
Spoelie - Tuesday, August 12, 2008 - link
No it can't, the Crysis benchmarks are just made up. Stop with the bearded comments already.
MamiyaOtaru - Wednesday, August 13, 2008 - link
Dude was joking. And it was funny. It's apparently pretty dangerous to joke around here. Two of my friends died from it.