ATI Radeon HD 3870 X2: 2 GPUs 1 Card, A Return to the High End
by Anand Lal Shimpi on January 28, 2008 12:00 AM EST - Posted in GPUs
Bioshock
Our Bioshock test involves a quick run through a section of the Medical Pavilion. Since the game has no built-in benchmark, our run-through is manual and we rely on FRAPS to measure the average frame rate. All of the enemies in the area are already dead, so we don't run the risk of running into one of them during our test.
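(For reference, the average frame rate from a manual run like this is just frames rendered divided by elapsed time. Below is a minimal sketch of that calculation in Python, assuming a FRAPS-style per-frame timestamp log; the column layout and header row are assumptions for illustration, not details of our actual setup.)

```python
# Minimal sketch: average FPS from a FRAPS-style per-frame timestamp log.
# Assumption (not from the article): a CSV with a header row and a
# millisecond timestamp in the second column for each rendered frame.
import csv
import sys

def average_fps(frametimes_csv):
    times_ms = []
    with open(frametimes_csv, newline="") as f:
        reader = csv.reader(f)
        next(reader)                        # skip the header row
        for row in reader:
            times_ms.append(float(row[1]))  # per-frame timestamp in ms

    elapsed_s = (times_ms[-1] - times_ms[0]) / 1000.0
    frames_rendered = len(times_ms) - 1     # count intervals, not timestamps
    return frames_rendered / elapsed_s

if __name__ == "__main__":
    # Usage: python average_fps.py <path to frame-time CSV>
    print(f"Average FPS: {average_fps(sys.argv[1]):.1f}")
```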
The Radeon HD 3870 X2 manages to outperform every other single-card solution in our tests, bested only by 8800 GT SLI, which is a good $150 more expensive than the ATI solution. It also looks like AMD has finally solved its DirectX 10 performance issues under Bioshock; it's not just that the 3870 X2 performs well here, but that AMD's driver team seems to have done some good work over the last few driver revisions.
74 Comments
poohbear - Monday, January 28, 2008 - link
Well, it's about time. Good job AMD, let's see you maintain the performance lead, damn it!
boe - Monday, January 28, 2008 - link
Howdy, I appreciate any benchmarks we can get, but if you do a follow-up on this card with newer drivers, I hope you will consider the following:
1. A comparison with a couple of older cards, e.g. the X1900 and 7900
2. A sound measurement of the cards, e.g. dB at full utilization from 2 feet
3. Crossfire performance if this card supports it.
4. Benchmarking on FEAR - all bells and whistles turned on
5. DX10 vs. DX9 performance.
Thanks again for creating this article - I'm considering this card.
perzy - Monday, January 28, 2008 - link
Am I the only one tired of all these multi-cores? I guess programming gets even more complex now. Will all games in the future have development cycles like Duke Nukem Forever, 10+ years...? Are the GPUs hitting the heat wall too now?
Soon I'll stop reading these hardware sites. The only reports in the near future will be 'yet another core added.' Yippee.
wien - Monday, January 28, 2008 - link
Coding for a multi-GPU setup is not really any different than coding for a single-GPU one. All the complexity is handled by the driver, unlike with multi-core CPUs.
FXi - Monday, January 28, 2008 - link
Have to say they did a good job, not great, but very good. We do need to see the 700 though, as this won't hold them for long. The other thing both camps need to address is dual monitors with SLI/CF. It's been forever since this tech came out and it still hasn't been fixed. Dual screens are commonplace and people like them. Could be one large and one smaller, or dual midrange, but people want the FPS without losing their 2nd screen.
I'm sure there will be a rash of promises to fix this that won't materialize for years :) (as before)
ChronoReverse - Tuesday, January 29, 2008 - link
Actually, that was one of the things ATI fixed. Dual screens will work even if the game is in a window _spanning_ the monitors. I'll see if I can find the review that showed that.
murphyslabrat - Monday, January 28, 2008 - link
Come on AMD, don't die until we get the Radeon 4870 X2!
Retratserif - Monday, January 28, 2008 - link
Third to last paragraph. The fact "hat" both?
Overall good article. Too bad we didn't get to see temps or overclocks.
PeteRoy - Monday, January 28, 2008 - link
Will Nvidia and ATI just put out more of the same instead of innovating new technologies? Wasn't that what killed 3dfx? Nvidia should know.
kilkennycat - Monday, January 28, 2008 - link
The next-gen GPU family at nVidia is in full development. Hold on to your wallet till the middle of this year (2008). You may be in for a very pleasant surprise.