AMD's Radeon HD 5870: Bringing About the Next Generation Of GPUs
by Ryan Smith on September 23, 2009 9:00 AM EST - Posted in GPUs
Sometimes a surprise is nice. Other times it’s nice for things to go as planned for once.
Compared to the HD 4800 series launch, AMD’s launch of the HD 5800 series today falls into the latter category. There are no last-minute announcements, no pricing games, and no NDAs rolled back unexpectedly. Today’s launch is about as normal as a new GPU launch can get.
However, with the lack of last-minute surprises, it becomes harder to keep things under wraps. When details of a product launch are announced well ahead of time, inevitably someone on the inside can’t help but leak what’s going on. The result is that what we have to discuss today isn’t going to come as a great surprise for some of you.
As early as a week ago, the top thread on our video forums had the complete and correct specifications for the HD 5800 series. So if you’ve been peeking at what’s coming down the pipe (naughty naughty), then much of this is going to be a confirmation of what you already know.
Today’s Launch
Three months ago AMD announced the Evergreen family of GPUs, AMD’s new line of DirectX 11 based GPUs. Two weeks ago we got our first briefing on the members of the Evergreen family, and AMD publicly announced their Eyefinity technology running on the then-unnamed Radeon HD 5870. Today finally marks the start of the Evergreen launch, with cards based on the first chip, codenamed Cypress, being released. Out of Cypress come two cards: the Radeon HD 5870 and the Radeon HD 5850.
| | ATI Radeon HD 5870 | ATI Radeon HD 5850 | ATI Radeon HD 4890 | ATI Radeon HD 4870 |
|---|---|---|---|---|
| Stream Processors | 1600 | 1440 | 800 | 800 |
| Texture Units | 80 | 72 | 40 | 40 |
| ROPs | 32 | 32 | 16 | 16 |
| Core Clock | 850MHz | 725MHz | 850MHz | 750MHz |
| Memory Clock | 1.2GHz (4.8GHz data rate) GDDR5 | 1GHz (4GHz data rate) GDDR5 | 975MHz (3.9GHz data rate) GDDR5 | 900MHz (3.6GHz data rate) GDDR5 |
| Memory Bus Width | 256-bit | 256-bit | 256-bit | 256-bit |
| Frame Buffer | 1GB | 1GB | 1GB | 1GB |
| Transistor Count | 2.15B | 2.15B | 959M | 956M |
| Manufacturing Process | TSMC 40nm | TSMC 40nm | TSMC 55nm | TSMC 55nm |
| Price Point | $379 | $259 | ~$180 | ~$160 |
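The "data rate" figures in the table follow directly from GDDR5's signaling: data moves on both edges of two clocks, so the effective rate is 4x the memory clock, and peak bandwidth is just the bus width times that rate. A quick sketch of the arithmetic (the function name is ours):

```python
# Effective GDDR5 data rate and peak memory bandwidth, from the table above.
# GDDR5 is effectively quad-pumped: data rate = 4x the memory clock.
def gddr5_bandwidth(mem_clock_ghz, bus_width_bits):
    data_rate_gt_s = mem_clock_ghz * 4           # effective transfers per second (GT/s)
    return bus_width_bits / 8 * data_rate_gt_s   # bytes per transfer * rate = GB/s

print(gddr5_bandwidth(1.2, 256))  # HD 5870: 153.6 GB/s
print(gddr5_bandwidth(1.0, 256))  # HD 5850: 128.0 GB/s
```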
So what’s Cypress in a nutshell? It’s an RV790 (Radeon HD 4890) with virtually everything doubled, plus the additional hardware needed to meet the DirectX 11 specification, new features such as Eyefinity and angle-independent anisotropic filtering, and lower idle power usage, all fabricated on TSMC’s 40nm process. Beyond that, Cypress is a direct evolution/refinement of the RV7xx, and closely resembles its ancestor in design and internal workings.
The leader of the Evergreen family is the Radeon HD 5870, which will be AMD’s new powerhouse card. The 5870 features 1600 stream processors divided among 20 SIMDs, 80 texture units, and 32 ROPs, with 1GB of GDDR5 on board connected to a 256-bit memory bus. The 5870 is clocked at 850MHz for the core, and 1.2GHz (4.8GHz effective) for the memory, giving it a maximum compute performance of 2.72 TFLOPS. Load power is 188W, and idle power is a tiny 27W. It is launching at an MSRP of $379.
Below that we have the 5850 (which we will not be reviewing today), a slightly cut-down version of the 5870. Here we have 1440 stream processors divided among 18 SIMDs, 72 texture units, the same 32 ROPs, and the same 256-bit memory bus. The 5850 is clocked at 725MHz for the core and 1GHz for the memory, giving it a maximum compute performance of 2.09 TFLOPS. With the disabled units, load power is slightly reduced to 170W, and it has the same 27W idle power. AMD expects the 5850 to deliver approximately 80% of the performance of the 5870, and is pricing it at $259.
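The compute figures quoted above fall straight out of the specs: each stream processor can issue one fused multiply-add (two FLOPs) per clock. A quick sketch of the arithmetic (the function name is ours):

```python
# Peak single-precision compute throughput from the quoted specs.
# Each stream processor performs one multiply-add (2 FLOPs) per clock.
def peak_tflops(stream_processors, core_clock_mhz):
    flops = stream_processors * 2 * core_clock_mhz * 1e6  # FLOPs per second
    return flops / 1e12                                   # convert to TFLOPS

print(peak_tflops(1600, 850))  # HD 5870: 2.72 TFLOPS
print(peak_tflops(1440, 725))  # HD 5850: ~2.09 TFLOPS
```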
Availability is going to be an issue, so we may as well get the subject out of the way. While today is a hard launch, it’s not quite as hard a launch as we would like to see. AMD is launching the 5800 series with Dell, so it shouldn't come as a surprise if Dell has cards when e-tailers don't.
The situation with general availability is murky at best. The first thing we heard was that there may be a week of lag, but as of today AMD tells us that they expect e-tailers to have 5870 cards on the 23rd, and 5850 cards next week. In any case, whatever cards do make it into the channel are going to be in short supply, which matches the overall vibe we’re getting from AMD: supplies are going to be tight initially compared to demand. So even after the first few days it may be hard to get a card. Given tight supply, we’ll be surprised if prices stick to the MSRP, and we’re likely to see e-tailers charge a premium in the first days. Depending on just how high demand is, it may take a while for prices to fall to their MSRPs and for AMD to completely clear the backlog of demand for these cards.
Update: As of 5am EDT, we have seen the availability of 5870s come and go. Newegg had some in stock, but they have since sold out. So indeed AMD did make the hard launch (which we're always glad to see), but it looks like our concerns about a limited supply are proving to be true.
Finally, we asked AMD about the current TSMC 40nm situation, and they have told us that they are happy with it. Our concern was that problems at TSMC (specifically: yield) would be a holdup in getting more cards out there, but this does not look to be the case. However given the low supply of the cards compared to where AMD expects the supply to be, TSMC’s total 40nm capacity may not be to AMD’s liking.
327 Comments
ClownPuncher - Wednesday, September 23, 2009 - link
Absolutely, I can answer that for you. Those 2 "ports" you see are for aesthetic purposes only; the card has a shroud internally, so those 2 ports neither intake nor exhaust any air, hot or otherwise.
Ryan Smith - Wednesday, September 23, 2009 - link
ClownPuncher gets a cookie. This is exactly correct; the actual fan shroud is sealed so that air only goes out the front of the card, to the outside of the case. The holes do serve a cooling purpose though: they allow airflow to help cool the bits of the card that aren't hooked up to the main cooler; various caps and what have you.

SiliconDoc - Wednesday, September 23, 2009 - link
Ok good, now we know. So the problem now moves to the tiny half-size exhaust port on the back. Did you stick your hand there and see how much that is blowing? Does it whistle through there? lol
Same amount of air (or a bit less) in half the exit space... that's going to strain the fan and/or reduce flow, no matter what anyone claims to the contrary.
It sure looks like ATI is doing a big favor to aftermarket cooler vendors.
GhandiInstinct - Wednesday, September 23, 2009 - link
Ryan,

Developers aren't pushing graphics anymore. It's not economical; PC game support is slowing down, everything is console now, which is DX9. What purpose does this ATI card serve with DX11 and all this other technology that games won't even make use of 2 years from now?

Waste of money..
ClownPuncher - Wednesday, September 23, 2009 - link
Clearly he should stop reviewing computer technology like this because people like you are content with gaming on their Wii and iPhone.

This message has been brought to you by Sarcasm.
Griswold - Wednesday, September 23, 2009 - link
So you're echoing what nvidia recently said, when they claimed DX11/gaming on the PC isn't all that (anymore)? I guess nvidia can close shop (at least the gaming-relevant part of it) now and focus on GPGPU. Why wait for GT300 as a gamer?

Oh right, it's gonna be blasting past the 5xxx and suddenly DX11 will be the holy grail again... I see how it is.
SiliconDoc - Wednesday, September 23, 2009 - link
rofl - It's great to see red roosters not crowing and hopping around flapping their wings and screaming nvidia is going down.

Don't take any of this personally except the compliments, you're doing a fine job. It's nice to see you doing my usual job, albeit from the other side, so allow me to compliment your fine perceptions. Sweltering smart.
But, now, let's not forget how ambient occlusion got poo-pooed here and shading in the game was said to be "an irritant" when Nvidia cards rendered it with just driver changes for the hardware. lol
Then of course we heard endless crowing about "tesselation" for ati.
Now it's what, SSAA (rebirthed), and Eyefinity, and we'll hear how great it is for some time to come. Let's not forget the endless screeching about how terrible and useless PhysX is by Nvidia, but boy when "open standards" finally gets "Havok and Ati" cranking away, wow the sky is the limit for in game destruction and water movement and shooting and bouncing, and on and on....
Of course it was "Nvidia's fault" that "open havok" didn't happen.
I'm wondering if 30" top resolution will now be "all there is!" for the next month or two until Nvidia comes out with their next generation - because that was quite a trick, switching from a top rez of 30" DOWN to 1920x when Nvidia put out their 2560x GTX275 driver and it whomped Ati's card at 30" 2560x, but switched places at 1920x, which was then of course "the winning rez" since Ati was stuck there.
I could go on but you're probably fuming already and will just make an insult back, so let the spam-posting IZ2000, or whatever its name will be this time, handle it.
BTW there's a load of bias in the article and I'll be glad to point it out in another post, but the reason the red rooster rooting is not going beyond any sane notion of "truthful" or even truthiness is because this 5870 Ati card is already perceived as "EPIC FAIL"!
I cannot imagine this is all Ati has, and if it is they are in deep trouble I believe.
I suspect some further releases with more power soon.
Finally - Wednesday, September 23, 2009 - link
Team Green - full foam ahead!

*hands over towel*

There you go. Keep on foaming, I'm all amused :)
araczynski - Wednesday, September 23, 2009 - link
Is DirectX11 going to be as worthless as 10, in terms of being used in any meaningful way in a meaningful number of games?

My 2 4850's are still keeping me very happy in my 'ancient' E8500.
curious to see how this compares to whatever nvidia rolls out, probably more of the same, better in some, worse in others, bottom line will be the price.... maybe in a year or two i'll build a new system.
of course by that time these'll be worthless too.
SiliconDoc - Wednesday, September 23, 2009 - link
Well it's certainly going to be less useful than PhysX, which is here said to be worthless, but of course DX11 won't get that kind of dissing, at least not for the next two months or so, before NVidia joins in.

Since there's only 1 game "kinda ready" with DX11, I suppose all the hype and heady talk will have to wait until... until... uhh.. the 5870's are actually available and not just listed on the egg and tiger.
Here's something else in the article I found so very heartwarming:
---
" Wrapping things up, one of the last GPGPU projects AMD presented at their press event was a GPU implementation of Bullet Physics, an open source physics simulation library. Although they’ll never admit it, AMD is probably getting tired of being beaten over the head by NVIDIA and PhysX; Bullet Physics is AMD’s proof that they can do physics too. "
---
Unfortunately for this place, one of my friends pointed me to this little exposé that shows ATI uses NVIDIA CARDS to develop "Bullet Physics" - ROFLMAO
-
" We have seen a presentation where Nvidia claims that Mr. Erwin Coumans, the creator of Bullet Physics Engine, said that he developed Bullet physics on Geforce cards. The bad thing for ATI is that they are betting on this open standard physics tech as the one that they want to accelerate on their GPUs.
"ATI’s Bullet GPU acceleration via Open CL will work with any compliant drivers, we use NVIDIA Geforce cards for our development and even use code from their OpenCL SDK, they are a great technology partner. “ said Erwin.
This means that Bullet physics is being developed on Nvidia Geforce cards even though ATI is supposed to get driver and hardware acceleration for Bullet Physics."
---
rofl - hahahahahha now that takes the cake!
http://www.fudzilla.com/content/view/15642/34/
--
Boy do we "hate PhysX" as ati fans, but then again... why not use the nvidia PhysX card to whip up some B Physics. Folks, I couldn't make this stuff up.