AMD's Radeon HD 4870 X2 - Testing the Multi-GPU Waters
by Anand Lal Shimpi & Derek Wilson on August 12, 2008 12:00 AM EST - Posted in GPUs
These Aren't the Sideports You're Looking For
Remember this diagram from the Radeon HD 4850/4870 review?
I do. It was one of the last block diagrams I drew for that article, and I did it at the very last minute and wasn't really happy with the final outcome. But it was necessary because of that little red box labeled CrossFire Sideport.
AMD made a huge deal out of making sure we knew about the CrossFire Sideport, promising that it meant something special for single-card, multi-GPU configurations. It also made sense that AMD would do something like this; after all, the whole point of AMD's small-die strategy is to exploit the benefits of pairing multiple small GPUs. It's supposed to be more efficient than designing a single large GPU, and if you're going to build your entire GPU strategy around it, you had better design your chips from the start to be used in multi-GPU environments - even more so than your competitors do.
AMD wouldn't tell us much about the CrossFire Sideport initially, other than that it meant some very special things for CrossFire performance. We were intrigued, but before we could get too excited AMD let us know that its beloved Sideport wouldn't be enabled. Here's how it would work if it were:
The CrossFire Sideport is simply another high-bandwidth link between the GPUs. Data can be sent between them via a PCIe switch on the board, or via the Sideport. The two aren't mutually exclusive; enabling the Sideport doubles the amount of GPU-to-GPU bandwidth available on a single Radeon HD 4870 X2. So why disable it?
According to AMD, the performance impact is negligible: average frame rates don't see a gain, though every now and then you'll see a boost in minimum frame rates. There's also an issue where power consumption could go up enough that you'd run out of headroom on the board's two PCIe power connectors. Board manufacturers also have to lay out the additional lanes connecting the two GPUs, which does increase board costs (although ever so slightly).
AMD decided that, since there's essentially no performance increase yet a real increase in power consumption and board cost, it made more sense to leave the feature disabled.
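As a quick illustration of the doubling claim above, here's a back-of-the-envelope sketch. The per-link figure is a placeholder of our own, not an AMD specification; the only assumption carried over from AMD is that the Sideport adds a second GPU-to-GPU link roughly equal to the existing PCIe path.

```python
# Rough sketch of the "doubled GPU-to-GPU bandwidth" claim.
# The 8 GB/s figure below is hypothetical, chosen only for illustration.
pcie_switch_link = 8.0            # GB/s between GPUs via the on-board PCIe switch (placeholder)
sideport_link = pcie_switch_link  # Sideport assumed comparable, per AMD's "doubles" claim

print(f"Without Sideport: {pcie_switch_link:.1f} GB/s between GPUs")
print(f"With Sideport:    {pcie_switch_link + sideport_link:.1f} GB/s between GPUs (2x)")
```

Whatever the real per-link number is, the point stands: the Sideport is purely additive bandwidth, which is why its value hinges entirely on whether games are actually bandwidth-limited between the two GPUs.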
The reference 4870 X2 design includes hardware support for the CrossFire Sideport, should AMD ever want to enable it via a software update. However, there's no requirement that the GPU-to-GPU connection be included on partner designs. My concern is that, in an effort to reduce costs, we'll see some X2s ship without the Sideport traces laid out on the PCB, and if AMD does happen to enable the feature in its drivers later on, some X2 users will be left in the dark.
I pushed AMD for a firm commitment on how it was going to handle future support for the Sideport, and honestly, right now it's looking like the feature will never get enabled. AMD should never have mentioned that it existed, especially if there was a good chance it wouldn't be enabled. AMD (or more specifically ATI) does have a history of making a big deal of GPU features that never get used (TruForm, anyone?), so it's not too unexpected, but it's still annoying.
The lack of anything special on the 4870 X2 to make the two GPUs work better together is bothersome. You would expect a company that has built its GPU philosophy on going after the high-end market with multi-GPU configurations to have done something more than NVIDIA when it comes to actually shipping a multi-GPU card. AMD insists that a unified frame buffer is coming; it just needs to make economic sense first. The concern here is that NVIDIA could just as easily adopt AMD's small-die strategy going forward if AMD isn't investing more R&D dollars than NVIDIA into enabling multi-GPU-specific features.
The lack of CrossFire Sideport support, or any other AMD-only multi-GPU-specific features, reaffirms what we said in our Radeon HD 4800 launch article: AMD and NVIDIA don't really have different GPU strategies, they simply target different markets with their baseline GPU designs. NVIDIA aims at the $400 - $600 market while AMD shoots for the $200 - $300 market. And both companies have similar multi-GPU strategies; AMD simply needs to rely on its multi-GPU strategy more heavily.
93 Comments
M1KEO - Saturday, August 16, 2008 - link
Buying a high-end video card has little to no effect on the price of gasoline, seeing as very few power plants run off of oil. And are you relating electricity usage to forest fires and floods, which are all natural disasters and have been happening for millennia? Look at what scientists are saying, and realize temperatures were actually warmer in the 1980s than they are now, and that plants even flourish with more CO2 in the atmosphere because that is what they use to make oxygen.

far327 - Sunday, August 17, 2008 - link
Whatever makes you sleep better at night. Your approach is as if energy, regardless of how it is produced or distributed, is an endless commodity. Whereas I am trying to take a more conservative approach toward the idea that energy is a valuable resource because of the ways we import and produce it. Now if energy were made via solar or wind, I would loosen up a bit with my energy spending habits, because it would then be renewable energy. I'm just saying, don't feed the pig if it's already overweight. Eventually that pig will not be able to walk, and the meat will spoil. We as a country need to completely change the way we think about our energy spending habits. If we buy these power-hog cards, we create a viable market for Nvidia and AMD to invest in year after year, and the exorbitant, careless energy spending cycle continues... We are therefore feeding that pig until it eventually collapses. WAKE UP AND SMELL THE NEWS PEOPLE!! Global warming is not even debatable anymore! It is a very real threat to our existence as a people. I am done with this childish debate and I'm sure all of you will be happy I leave the board, but don't say you weren't all warned.

BenPope - Thursday, August 14, 2008 - link
I guess SidePort will become useful on 4-way plus... in much the same way as two or more HyperTransport links scale in 4- and 8-way Opteron CPUs. So if you have 4 GPUs, the Sideports could connect diagonal corners to reduce the two-hop latency and increase bandwidth.
Barack Obama - Thursday, August 14, 2008 - link
:)

oldhoss - Thursday, August 14, 2008 - link
Uh oh... bedwetting, tree-huggin' liberal alert! ;-P

Hrel - Thursday, August 14, 2008 - link
How the heck did you not include the 9800 GX2 in your testing? I mean, that's Nvidia's only comparable card, and you said yourself it outperforms the GTX 280. When you factor in that it only costs 285 dollars on Newegg, it's a great buy. I'm actually amazed and sincerely confused as to why that card wasn't included in this review. Big mistake, Anandtech; not a small oversight but a complete disregard for common sense.

jeffrey - Thursday, August 14, 2008 - link
Usually, NDA dates are known well in advance for the latest and greatest tech. That means that many people are excited and looking forward to insight on release day. I was happy to see the 4870 X2 posted when I opened the site. I was even happier to see that the authors of the review were Anand and Derek. This to me usually means a well-thought-out, unbiased article with unique industry insights.
The article seemed rushed, incomplete, and unbalanced. What a disappointment! ATI released the current performance king in the 4870 X2, a mid-level 4850 X2, AND refreshed the 4870 and 4850 by doubling the RAM!
So much time and effort was wasted in the article whining about AMD/ATI not using the Sideport that driver versions and system specs weren't even included.
This post probably sounds like a broken record now that I'm number 70-something giving feedback that is not very positive. I just want this site to stay the best, and I felt I owed it to you, Anand and Derek, to try and push you to do better. Thanks for all the great work that you have done over the years.
Bezado11 - Wednesday, August 13, 2008 - link
I loved the article, and it shows that the new king of cards is the 4870 X2; however, I think you're doing a bit of extra work for a benchmark nobody will use. AoC is tanking hard; not sure if you guys are aware of that game's overall lack of integrity. Since AoC is not going to be a well-played or widely viewed game, why use it as a benchmark standard? I mean, we won't care one bit about it sooner or later because the game is in its death stages. Just a heads up on that. I think taking the AoC benchmark out of future reviews would be advised. Stick to what we know best and what stresses the hardware the most, like Crysis, etc. AoC, for heaven's sake, doesn't even support DX10 yet.
Griswold - Thursday, August 14, 2008 - link
While I don't play AoC or plan on doing so, you just showed what a foolish idiot you are by claiming its imminent demise. It has been the fastest-selling MMO launch in history; I think "some" people will stick with it, and even more will return when the content problem has been solved. Just because you don't like it doesn't mean it's not a good benchmark. I mean, I couldn't care less about all these "Quake Wars" and "Assassin's Creeds" that are, in my opinion, played by dumbass kids such as you, but hell, I won't complain about them being used as benchmarks.
Scour - Wednesday, August 13, 2008 - link
This article is way too negative toward AMD/ATI's cards. It looks like the reviewer hates ATI, dunno why. First the negative article about the 790GX chipset, now this :(