The Radeon HD 5970: Completing AMD's Takeover of the High End GPU Market
by Ryan Smith on November 18, 2009 12:00 AM EST - Posted in GPUs
The catch however is that what we don’t have is a level of clear domination when it comes to single-card solutions. AMD was shooting to beat the GTX 295 with the 5870, but in our benchmarks that’s not happening. The 295 and the 5870 are close, perhaps close enough that NVIDIA will need to reconsider their position, but it’s not enough to outright dethrone the GTX 295. NVIDIA still has the faster single-card solution, although the $100 price premium is well in excess of the <10% performance premium.
-From Our Radeon 5870 Review, On The GTX 295 vs. The 5870
Let’s get straight to the point, shall we? Today AMD is launching the 5970, their dual-GPU card that finishes building out AMD’s technical domination of the high-end market. With it AMD delivers the absolute victory over NVIDIA’s GTX 295 that the Radeon 5870 couldn’t quite achieve and at the same time sets the new high water mark for single-card performance.
This also marks the last AMD product introduction of the year. The rest of the Evergreen series, comprising the sub-$100 low-end parts, will be launching next year.
 | AMD Radeon HD 5970 | AMD Radeon HD 5870 | AMD Radeon HD 5850 |
Stream Processors | 2x1600 | 1600 | 1440 |
Texture Units | 2x80 | 80 | 72 |
ROPs | 2x32 | 32 | 32 |
Core Clock | 725MHz | 850MHz | 725MHz |
Memory Clock | 1GHz (4GHz data rate) GDDR5 | 1.2GHz (4.8GHz data rate) GDDR5 | 1GHz (4GHz data rate) GDDR5 |
Memory Bus Width | 2x256-bit | 256-bit | 256-bit |
Frame Buffer | 2x1GB | 1GB | 1GB |
Transistor Count | 2x2.15B | 2.15B | 2.15B |
TDP | 294W | 188W | 151W |
Manufacturing Process | TSMC 40nm | TSMC 40nm | TSMC 40nm |
Price Point | $599 | $400 | $300 |
The 5970 serves as the now-obligatory dual-GPU part: two Cypress dies mounted on a single, dual-slot video card. AMD clocks it at 725MHz for the core and 1GHz (4GHz effective) for the GDDR5 memory. The card comes equipped with 2GB of GDDR5, which is split between the two GPUs, giving it an effective memory capacity of 1GB. The card will be selling for $600, at least so long as vendors and retailers hold the line on MSRP.
In practice this makes the card something between a 5850 in Crossfire mode and a 5870 in Crossfire mode. The clocks are the same as the 5850's, but here all 20 SIMD units are enabled. That leaves a roughly 15% clockspeed gap between the 5970 and the 5870CF, so officially the 5870CF will continue to be the faster setup. However, as we'll see in a bit, the stock 5970 can be a bit deceiving.
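As a rough back-of-the-envelope illustration of where the stock card sits on paper (using only the shader counts and clocks from the table above, and ignoring CrossFire scaling efficiency and memory bandwidth entirely), the arithmetic looks like this:

```python
# Back-of-the-envelope single-precision throughput from the spec table above.
# FLOPS = stream processors x 2 ops (multiply-add) x core clock.
# Ignores CrossFire scaling efficiency, memory bandwidth, and everything else.

def gflops(stream_processors, core_clock_mhz):
    return stream_processors * 2 * core_clock_mhz / 1000.0

configs = {
    "5870 CF (2x1600 @ 850MHz)": gflops(2 * 1600, 850),
    "5970    (2x1600 @ 725MHz)": gflops(2 * 1600, 725),
    "5850 CF (2x1440 @ 725MHz)": gflops(2 * 1440, 725),
}

for name, value in configs.items():
    print(f"{name}: {value:.0f} GFLOPS")

# 5870 CF: 5440, 5970: 4640, 5850 CF: 4176
# The stock 5970 sits between the two CrossFire setups, ~15% behind a 5870 CF on paper.
```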
This also brings up the matter of the name of the card. We asked AMD what happened to the X2 tag, and the answer is that they didn’t want to use it since the card was configured neither like a 5850 nor a 5870 – it was closer to a mythical 5860. So rather than call it an odd (or worse yet, wrong) name, AMD just gave it a new model number entirely. We suspect AMD wanted to be rid of the X2 name – their processors go up to X4 after all – but there you go as far as an official reason is concerned. It looks like special multi-GPU tags are now gone in both the NVIDIA and AMD camps.
Moving on, for power the 5970 uses an 8-pin and a 6-pin power connector (although the 6-pin sits on top of a spot silk-screened for another 8-pin). The TDP is 294W, bringing it in just under the 300W ATX limit. Idle power is 42W, thanks to AMD's aggressive power optimizations present in the entire 5000 series.
As some of you may have noticed, in spite of the fact that this card is at least a pair of 5850s, it consumes less than the 302W (2x151W) such a setup would require. In order to meet the 300W limit, AMD went and binned Cypress chips specifically for the 5970, in order to find chips that could operate at 725MHz at only 1.05V (the 5850 runs at 1.088V). Given the power creep coming from the 4800 series, binning for the best chips is the only way AMD could get a 300W card out.
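Dynamic power scales roughly with the square of voltage at a fixed clock, so even a small drop in core voltage pays off across two GPUs. The following is a first-order sketch of that effect (an approximation for illustration, not AMD's actual power figures):

```python
# First-order estimate of what the 5970's binned core voltage buys.
# Assumes dynamic power scales with V^2 at a fixed clock; real chips also
# have leakage and board-level power draw, so treat this as illustrative only.

v_5850 = 1.088  # stock 5850 core voltage (from the article)
v_5970 = 1.050  # binned 5970 core voltage (from the article)

scale = (v_5970 / v_5850) ** 2
print(f"Dynamic power scale factor: {scale:.3f}")           # ~0.93

# Naively applied to a pair of 5850-class GPUs (2 x 151W TDP = 302W):
print(f"Estimated: ~{302 * scale:.0f}W vs. 302W unbinned")  # ~281W
```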
AMD’s official guidance for this card is that the minimum requirements are a 650W power supply, and they recommend a 750W power supply. The recommended power supply will become more important later on when we talk about overclocking.
Finally, AMD is also launching Crossfire Eyefinity support with the 5970, and thus far only the 5970. Currently Eyefinity doesn't work with Crossfire mode on any of AMD's cards due to driver limitations. The drivers that the 5970 will be shipping with enable Crossfire Eyefinity support on the 5970 for 22 games – currently AMD is using whitelisting and is enabling games on a case-by-case basis. Crossfire Eyefinity will make its way into the mainstream Catalyst drivers and be enabled for other cards early next year.
114 Comments
GourdFreeMan - Friday, November 20, 2009 - link
Having not bought MW2, I can say conversely that the lack of differentiation between console and PC features hurts game sales. According to news reports, in the UK PC sales of MW2 account for less than 3% of all sales. This is neither representative of the PC share of the gaming market (which should be ~25% of all "next-gen" sales based on quarterly reports of revenue from publishers), nor the size of the install base of modern graphics cards capable of running MW2 at a decent frame rate (which should be close to the size of the entire console market based on JPR figures). Admittedly the UK has a proportionately larger console share than the US or Germany, but I can't imagine MW2 sales of the PC version are much better globally. I am sure executives will be eager to blame piracy for the lack of PC sales, but their target market knows better...
cmdrdredd - Wednesday, November 18, 2009 - link
[quote]Unfortunately, since playing MW2, my question is: are there enough games that are sufficiently superior on the PC to justify the initial expense and power usage of this card? Maybe that's where eyefinity for AMD and PhysX for nVidia come in: they at least differentiate the PC experience from the console. I hate to say it, but to me there just do not seem to be enough games optimized for the PC to justify the price and power usage of this card, that is unless one has money to burn.[/quote]
Yes this is exactly my thoughts. They can tout DX11, fancy schmancy eyefinity, physx, everything except free lunch and it doesn't change the fact that the lineup for PC gaming is bland at best. It sucks, I love gaming on PC but it's pretty much a dead end at this time. No thanks to every 12 year old who curses at you on XBox Live.
The0ne - Wednesday, November 18, 2009 - link
My main reason to want this card would be to drive my 30" LCDs. I have two Dells already and will get another one early next year. I don't actually play games much but I like having the desktop space for my work:
-VMs at higher resolution
-more open windows without switching too much
-watch movie(s) while working
-bigger font size but maintaining the aspect ratio of programs :)
Currently I have my main setup on one 30" and extended to my 73" TV. The TV is only 1080p so space is a bit limited. Plus working on the TV sucks big time :/
shaolin95 - Wednesday, November 18, 2009 - link
I am glad ATI is able to keep competing as that helps keep prices at a "decent" level. Still, for all of you so amazed by Eyefinity, do yourselves a favor and try 3D Vision with a big-screen DLP, then you will laugh at what you thought was cool and "3D" before.
You can have 100 monitors but it is still just a flat world....time to join REAL 3D gaming guys!
Carnildo - Wednesday, November 18, 2009 - link
Back in college, I was the administrator for a CAVE system. It's a cube ten feet on a side, with displays on all surfaces. Combine that with head tracking, hand tracking, shutter glasses, and surround sound, and you've got a fully immersive 3D environment. It's designed for 3D visualization of large datasets, but people have ported a number of 3D shooters to the platform. You haven't lived until you've seen a life-sized opponent come around the corner and start blasting away at you.
7Enigma - Wednesday, November 18, 2009 - link
But Ryan, I feel you might need to edit a couple of your comparison comments between the 295 and this new card. Based on the comments in several previous articles, quite a few readers do not look at (or understand) the charts and instead rely on the commentary below them. Here are some examples:
"Meanwhile the GTX 295 sees the first of many falls here. It falls behind the 5970 by 30%-40%. The 5870 gave it a run for its money, so this is no surprise."
This one for Stalker is clear and concise. I'd recommend you repeat this format for the rest of the games.
"As for the GTX 295, the lead is only 20%. This is one of the better scenarios for the GTX 295."
This comment was for Battleforge and IMO is confusing. To someone not reading the chart it could be viewed as saying the 295 has a 20% advantage. Again I'd stick with your Stalker comment.
"HAWX hasn’t yet reached a CPU ceiling, but it still gets incredibly high numbers. Overclocking the card gets 14% more, and the GTX 295 performance advantage is 26%."
Again, this could be seen as the 295 being 26% faster.
"Meanwhile overclocking the 5970 is good for another 9%, and the GTX 295 gap is 37%."
This one is less confusing as it doesn't mention an advantage but should just mention 37% slower.
Finally I think you made a typo in the conclusion where you said this:
"Overclock your 5970 to 5870 speeds if you can bear the extra power/heat/noise, but don’t expect 5970CF results."
I think you meant 5870CF results...
Overall, though, the article is really interesting as we've finally hit a performance bottleneck that is not so easily overcome (due to power draw and ATX specifications). I'm very pleased, however, that you mention first in the comments that this truly is a card meant for multi-monitor setups only, and even then, may be bottlenecked by design. The 5870 single card setup is almost overkill for a single display, and even then most people are not gaming on >24" monitors.
I've said it for the past 2 generations of cards but we've pretty much maxed out the need for faster cards (for GAMING purposes). Unless we start getting some super-high-res goggles that are reasonably priced, there just isn't much further to go due to display limitations. I mean honestly, are those slightly fuzzy shadows worth the crazy performance hit in an FPS? I honestly am having a VERY difficult time seeing a difference in the first set of pictures of the soldier's helmet. The pictures are taken slightly off angle from each other and even then I don't see what the arrow is pointing at. And if I can't see a significant difference in a STILL shot, how the heck am I to see a difference in-game!?
OK enough rant, thanks for the review. :)
Anand Lal Shimpi - Wednesday, November 18, 2009 - link
Thanks for the edits, I've made some corrections for Ryan that will hopefully make the statements more clear. I agree that the need for a faster GPU on the desktop is definitely minimized today. However I do believe in the "if you build it, they will come" philosophy. At some point, the amount of power you can get in a single GPU will be great enough that someone has to take advantage of it. Although we may need more of a paradigm shift to really bring about that sort of change. I wonder if Larrabee's programming model is all we'll need or if there's more necessary...
Take care,
Anand
7Enigma - Wednesday, November 18, 2009 - link
Thank you for the edits and the reply, Anand. One of the main things I'd like to see GPU drivers implement is an artificial framerate cap option. These >100fps results in several of the tests at insane resolutions are not only pointless, but add unnecessary heat and stress to the system. Drop back down to normal resolutions that >90% of people have and it becomes even more wasteful to render 150fps.
I always enable V-sync in my games for my LCD (75Hz), but I don't know if this is actually throttling the GPU so it doesn't render more than 75fps. My hunch is that in the background it's rendering as fast as it can and only showing on screen what the refresh rate allows.
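A framerate cap of the sort described above is conceptually just a pacing loop around the render call; here is a minimal sketch (illustrative only; render_frame is a hypothetical placeholder, and this is not how any particular driver or v-sync path is actually implemented):

```python
import time

TARGET_FPS = 75                  # e.g. match a 75Hz LCD
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds allotted per frame

def render_frame():
    """Placeholder for the game's actual rendering work."""
    pass

while True:
    start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - start
    # Sleep away the rest of the frame budget so the GPU isn't asked to
    # render more frames than the display can ever show.
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)
```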
Zool - Wednesday, November 18, 2009 - link
I tried out full-screen FurMark with vsync on and off (in 640*480) and the difference was 7 degrees Celsius. I have a custom cooler on the 4850 and a 20cm side fan on the case, so that's quite a lot.
7Enigma - Thursday, November 19, 2009 - link
Thanks for the reply Zool, I was hoping that was the case. So it seems like if I ensure vsync is on I'm at least limiting the gpu to only displaying the refresh rate of the LCD. Awesome!