ATI Radeon HD 3870 X2: 2 GPUs 1 Card, A Return to the High End
by Anand Lal Shimpi on January 28, 2008 12:00 AM EST - Posted in GPUs
Last May, AMD introduced its much delayed Radeon HD 2900 XT at $399. In a highly unexpected move, AMD indicated that it would not be introducing any higher end graphics cards. We discussed this in our original 2900 XT review:
"In another unique move, there is no high end part in AMD's R600 lineup. The Radeon HD 2900 XT is the highest end graphics card in the lineup and it's priced at $399. While we appreciate AMD's intent to keep prices in check, the justification is what we have an issue with. According to AMD, it loses money on high end parts which is why we won't see anything more expensive than the 2900 XT this time around. The real story is that AMD would lose money on a high end part if it wasn't competitive, which is why we feel that there's nothing more expensive than the 2900 XT. It's not a huge deal because the number of people buying > $399 graphics cards is limited, but before we've started the review AMD is already giving up ground to NVIDIA, which isn't a good sign."
AMD has since released even more graphics cards, including the competitive Radeon HD 3870 and 3850, but it still lacked a high end offering. The end of 2007 saw a slew of graphics cards released that brought GeForce 8800 GTX performance to the masses at lower price points, but nothing any faster. Considering we have yet to achieve visual perfection in PC games, there's still a need for even faster hardware.
At the end of last year both AMD and NVIDIA hinted at bringing back multi-GPU cards to help round out the high end. The idea is simple: take two fast GPUs, put them together on a single card and sell them as a single faster video card.
These dual-GPU designs are even more important today because of the SLI/CrossFire limitations that exist on various chipsets. With few exceptions, you can't run SLI on anything other than an NVIDIA chipset, and unless you're running an AMD or Intel chipset, you can't run CrossFire. These self-contained SLI/CrossFire graphics cards will work on anything, however.
AMD is the first out of the gates with the Radeon HD 3870 X2, based on what AMD is calling its R680 GPU. Despite the codename, the product name tells the entire story: the Radeon HD 3870 X2 is made up of two 3870s on a single card.
The card is long; at 10.5" it's the same length as a GeForce 8800 GTX or Ultra. AMD is particularly proud of its PCB design, which is admittedly quite compact despite featuring more than twice the silicon of a single Radeon HD 3870.
On the board we've got two 3870 GPUs, separated by a 48-lane PCIe 1.1 bridge (no 2.0 support here, guys). Sixteen lanes go to each GPU, and the final 16 lanes head directly to the PCIe connector and out to the motherboard's chipset.
Two RV670 GPUs surround the PCIe bridge chip
Thanks to the point-to-point nature of the PCI Express interface, that's all you need for this elegant design to work.
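To put that lane split in perspective, here's a quick back-of-the-envelope sketch of the peak bandwidth each link offers; the ~250MB/s-per-lane figure comes from the PCIe 1.1 spec rather than anything AMD has disclosed about the bridge itself:

```python
# Back-of-the-envelope PCIe bandwidth for the X2's internal layout.
# PCIe 1.1 moves roughly 250MB/s per lane, per direction (spec figure,
# not an AMD number); lane counts come from the board description above.
PCIE1_MBPS_PER_LANE = 250

links = {
    "bridge -> GPU 0":     16,  # each RV670 gets a full x16 link off the bridge
    "bridge -> GPU 1":     16,
    "bridge -> PCIe slot": 16,  # the remaining lanes run out to the chipset
}

for name, lanes in links.items():
    gbps = lanes * PCIE1_MBPS_PER_LANE / 1000
    print(f"{name}: x{lanes} = ~{gbps:.1f} GB/s per direction")
```

The upshot is that each GPU talks to the bridge over its own x16 link, while everything headed to or from system memory still funnels through the single x16 link back to the chipset.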
Each GPU has its own 512MB frame buffer, and the power delivery on the board has been reworked to cope with feeding two 3870 GPUs.
The Radeon HD 3870 X2 is built on a 12-layer PCB, compared to the 8-layer design used by the standard 3870. The more layers a PCB has, the easier routing and ground/power isolation become; AMD says this is why it's able to run the GPUs on the X2 faster than on the single-GPU board. A standard 3870 runs its GPU at 775MHz, while both GPUs on the X2 run at 825MHz.
Memory speed is reduced, however; the Radeon HD 3870 X2 uses slower, more readily available GDDR3 in order to keep board cost under control. While the standard 3870 uses 2.25GHz data rate GDDR4, the X2 runs its GDDR3 at a 1.8GHz data rate.
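For a sense of what that GDDR3 downgrade costs, here's a quick calculation of peak memory bandwidth per GPU; note that the 256-bit memory interface is carried over from the RV670 and isn't restated above:

```python
# Peak memory bandwidth per GPU, assuming the RV670's 256-bit memory
# interface (an assumption; the bus width isn't restated in the text above).
BUS_WIDTH_BITS = 256

def peak_bandwidth_gbps(data_rate_gtps):
    """GB/s = effective data rate (GT/s) x bus width (bits) / 8 bits per byte."""
    return data_rate_gtps * BUS_WIDTH_BITS / 8

print(peak_bandwidth_gbps(2.25))  # standard 3870, 2.25GHz GDDR4: 72.0 GB/s
print(peak_bandwidth_gbps(1.8))   # each X2 GPU, 1.8GHz GDDR3:    57.6 GB/s
```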
AMD expects the Radeon HD 3870 X2 to be priced at $449, which is actually cheaper than a pair of 3870s - making it sort of a bargain high end product. We reviewed so many sub-$300 cards at the end of last year that we were a bit put off by the near-$500 price tag at first; then we remembered how things used to be, and it seems that the 3870 X2 will be the beginning of a return to normalcy in the graphics industry.
One GPU on the Radeon HD 3870
74 Comments
footballrunner800 - Monday, January 28, 2008 - link
It's probably the drivers, since AnandTech is still not using x64, and with a 1GB card that gives Windows less than 3GB of usable memory. The review says that AMD came up with improvements at the last minute, so imagine when they perfect them.
Sunrise089 - Monday, January 28, 2008 - link
The drivers sure could.
bill3 - Monday, January 28, 2008 - link
I won't link the review, doubt it's even possible, but over at the H, Brent's review shows the 3870X2 in a much worse light. They show it outright losing to a single 8800GTX in COD4, Crysis, and UT3, while squeaking out a win in HL2.
In the forum review thread, when the differences between Brent's review and Anand's were brought up, it was basically claimed by Kyle that Anand's review is illegitimate because he only benchmarks "canned demos" (if you're familiar with H, such spiel is nothing new from them). Further, Kyle goes so far as to claim "AMD experienced a 60% fps increase in the Crysis canned GPU benchmark, we saw a couple frames a second in real gameplay."
Kyle also says your COD4 bench, one of the two you guys did that wasn't "canned" and therefore invalid, is also invalid because you only benched a cutscene. He hasn't said, but I'm assuming the only bench you guys did that meets the Kyle standard would be Bioshock, since it is real gameplay timed by fraps.
There are a few platform differences between the reviews, but Kyle has poo-pooed these as not making any major difference.
Thought you guys might be interested... the thread is the 3870X2 review thread at H forums.
Parhel - Monday, January 28, 2008 - link
HardOCP is just plain disreputable in every way. Their methodology is nonsense, their reviews are completely inadequate, and they continue to exist only because they drum up fake controversies and attempt to assassinate someone's character every few months.
I'm not exaggerating when I say that I take what The Inquirer has to say more seriously.
Frumious1 - Monday, January 28, 2008 - link
I'm just shocked by how many people seem to place any relevance on the HOCP garbage. "It's because we're REAL WORLD and everyone else is lies and fake stuff!" What a laugh. They play ONE resolution on TWO cards and pretend that's testing. Oh, and they don't use the same settings on both cards, they don't run at the same resolutions as previous reviews, they don't use the most comparable card (8800 GTS 512, anyone?), their testing isn't remotely similar to anything at any other site (hence they can just make claims about how they're doing it right and everyone else is wrong)... I could go on.
Anyway, Anand and crew are best served in my opinion by completely ignoring this childish stuff from Kyle and his cohorts. You can choose: either all of the enthusiast sites except HOCP are wrong, or Kyle is wrong. Occam's Razor suggests that the latter is far more likely.
AnnonymousCoward - Tuesday, January 29, 2008 - link
Yeah, one resolution and one competitor card don't say much.
JWalk - Monday, January 28, 2008 - link
Yeah, Kyle is currently yelling at people in the forum thread. He has been asked multiple times why Anand, and most other sites, have a completely different view in their benchmarks. (Keep in mind that HardOCP only benchmarked 4 freaking games on 2 cards. According to Kyle, it would have been too much work to benchmark more games or more cards. Awww... not hard work... anything but that. LOL)
Then, he either chants "canned" benchmarks over and over, or he tells the person asking the question to get lost and never come back to HardOCP.
It has even been pointed out more than once that the review sample he received might be the problem. Maybe he should try another card. But he is in full-blown arrogant a$$ mode right now. ;)
Devo2007 - Monday, January 28, 2008 - link
Not surprising coming from Kyle -- he comes across that way quite often.
I don't like his reviews much -- not because of the canned benchmarks vs. real-world gameplay they preach about, but rather the fact that they typically only compare one or two other cards to the one being tested. I have a GTS 320MB, and it would be nice to see whether the GTS 512MB would be worth it. Sadly, no direct comparison was done, because the GTS 320MB doesn't compete with said card.
I generally try to read several reviews to get an idea of a product - one site is never enough for these things. :)
boe - Monday, January 28, 2008 - link
Usually the stock coolers are pretty dang loud. I'm wondering if any cooling solutions will be available from Zalman, Arctic, or the other standards?
It would have been sweet if there was a dB measurement chart.
Gholam - Monday, January 28, 2008 - link
GeCube has a version that looks like it has 2xZalman VF900-Cu on it. Custom PCB with 4 DVI outputs too.