AMD’s Radeon HD 5770 & 5750: DirectX 11 for the Mainstream Crowd
by Ryan Smith on October 13, 2009 12:00 AM EST - Posted in GPUs
Meet The 5770
We’ll start our look at today’s cards with the 5770. As we mentioned previously, the 5770 is the full-speed Juniper card, with all 10 SIMDs enabled and clocked at 850MHz for the core, paired with 1.2GHz (4.8GHz data rate) GDDR5 for the RAM. As has become standard for just about every card over $100 these days, the card is equipped with 1GB of RAM. Attaching this RAM to the GPU is a 128-bit bus, giving the card 76.8GB/sec of memory bandwidth. The use of such fast RAM piques our interest in particular, since it means vendors are spending just as much to equip a 5770 with RAM as they are a 5870.
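The bandwidth figure falls straight out of the bus width and data rate; a minimal sketch of the arithmetic (GDDR5 moves data four times per memory clock, hence the 4.8GHz effective rate from a 1.2GHz clock):

```python
# Sanity-check the 5770's quoted memory bandwidth.
bus_width_bits = 128          # memory bus width
memory_clock_hz = 1.2e9       # GDDR5 memory clock
data_rate_hz = memory_clock_hz * 4  # GDDR5 transfers 4x per clock -> 4.8GHz

# bytes per transfer (16) times transfers per second
bandwidth_gb_s = (bus_width_bits / 8) * data_rate_hz / 1e9
print(f"{bandwidth_gb_s:.1f} GB/s")  # 76.8 GB/s
```

The same math explains why the 5870, with a 256-bit bus at the same data rate, lands at exactly double the bandwidth.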
For cooling, the 5770 uses the same general plastic shroud as the 5800 series cards, which AMD’s 5700 team told us is called the Phoenix. Here the shroud hangs over the rear of the card by just less than half an inch, making it more like the 5870 than the 5850, where the shroud stopped at the edge of the card. The card itself is 8.25” long; the shroud overhang brings it to just shy of 8.75”.
5870, 5850, 5770
Interestingly enough, we’ve been told that the Phoenix shroud isn’t going to be sticking around for long. The first wave of cards launching today and for the near future will use the shroud, but once AMD’s vendors begin using their own designs, AMD doesn’t expect most of them to stick with it. XFX has specifically been named as a party that will keep using the shroud on its products, but the rest are likely to change. With a TDP of only 108W, the Phoenix shroud is probably overbuilt and certainly more expensive than vendors would like, particularly for mainstream products with their thinner margins. We would expect the vendors that do switch to move to more traditional dual-slot coolers, likely ones that aren’t shrouded at all and don’t exhaust hot air outside of the case.
While we were fine with the shroud on the 5800 series, we do take slight issue with it on the 5770. Because this card’s single 6-pin PCIe power connector is on the rear of the card, the shroud overhang sits directly in front of it. This wasn’t an issue on the 5870, since its power connectors were on top, or on the 5850, where the shroud stopped at the end of the card. But here the extra shroud makes it much harder to see what you’re doing when plugging in a PCIe power connector unless you’re looking at the rear of the card, and it makes it a bit harder to remove the plug once seated.
AMD made it clear to us that they considered this in the design of the 5770, and to their credit the shroud never makes inserting or removing a PCIe power plug impossible, but that doesn’t mean we have to like it. We would have liked to see the shroud stop at the end of the card, so that it would be just as easy to work with as the 5850.
At any rate, along with keeping the 5800 series shroud, the 5770 keeps the port configuration. 1 DisplayPort, 1 HDMI port, and 2 DVI ports make up the card’s output options. This is intentional on the part of AMD, as they want to push Eyefinity on these cards just as much as they do on their high-end products, which means they want to use the same ideal configuration. We wouldn’t be shocked to see this modified at the same time as vendors dropping the shroud though; 1 DVI, 1 HDMI, and 1 DisplayPort with a flexible HDMI->DVI adapter is a likely configuration.
The price of DisplayPort->DVI dongles rears its head once again here, and even more severely. DisplayPort monitors are still rare, so the most likely Eyefinity configuration is going to be 3x DVI, which is going to require a dongle. Those dongles are still going for $100+ right now, which is a significant fraction of the price of the card itself. We talked to AMD about this issue, but it’s something that’s out of their hands for the moment.
AMD is pricing this card at $159. This puts it in competition with the cheapest GTX 260s from NVIDIA, and AMD’s 4870, the latter of which tends to sell for only $10 less for the 1GB version. AMD wouldn’t give us a clear idea on how long they expect the 4870 to last, but it seems clear that they intend to phase out the 4870 with the 5770. This may not be such a great idea, but we’ll get to that after we take a look at performance.
Notably, this leaves a $100 pricing hole in AMD’s 5000-series product lineup, since the next card up is the $259 5850. AMD pointed out to us that this is by no means unprecedented (the 4800 series launch saw a $100 gap between the 4870 and 4850) but we’re not used to seeing such a gap in recent times. This price gap makes a little more sense with AMD’s target demographics: the 5800 series is for 2560x1600 gaming, while the 5770 is targeted for 1920x1200/1080. So as far as they’re concerned, there isn’t a demographic gap to make the price gap a problem.
Anyhow, for the time being, the 4890 will partially bridge that pricing gap, continuing to occupy a range of around $180-$200.
For today’s launch, availability is expected to be in the “tens of thousands” of units. We suspect the situation is going to mirror the 5870 launch (tight availability at first), but we’ll see. For this launch period, AMD is also extending the DiRT 2 freebie offer to vendors that want to include it with their 5770 cards, so most if not all cards should come with a voucher for the game, redeemable when it ships in December.
117 Comments
squeezee - Tuesday, October 13, 2009 - link
Remember that there is more to the card than just the ROP/TU/ALUs. If the other logic is intact it could give the dual 5770s a net larger amount of cache, more resources for scheduling, rasterization, etc.

Ryan Smith - Tuesday, October 13, 2009 - link
Exactly. Geometry is also a big thing; the 5800 series and 5700 series have the same geometry abilities. Unfortunately this isn't something we can really test in a meaningful manner.

Torres9 - Tuesday, October 13, 2009 - link
"The 5770 is 108W at load and 18W at idle, meanwhile the 5850 is 86W at load and 16W at idle."

Do you mean the 5750, or is the 5850 really that good?
ET - Tuesday, October 13, 2009 - link
I'm again seeing many comments of "DX11 gives me nothing". Well, you buying it gives developers one more reason to develop for it. If you stick to DX10, then it'd take more time to move to DX11. Really. Until the majority of the market moves to a new feature set (and hopefully Windows 7 will help move out of DX9), developers will only use higher end features as "special features".

MadMan007 - Tuesday, October 13, 2009 - link
1 word for real DX11 rollout: consoles.

ET - Thursday, October 15, 2009 - link
You're right, though not the way you think. Xbox programming is more like DX11 than DX9 or DX10, and the Xbox also has a tessellation unit (though simpler than in the DX11 parts), so moving to DX11 would make developers' lives easier.

What users don't get is the difference between API and hardware capabilities. Even if developers limit themselves to DX9-level capabilities for console compatibility, using only DX10 or DX11 to develop will be much easier than using both DX9 and DX10, and will result in faster and less buggy code (optimising for two very different APIs is hard).
xipo - Tuesday, October 13, 2009 - link
As MadMan007 says, there won't be a large adoption rate from developers towards DX11 until the NEXT generation of consoles ships (around 2012) supporting DX11... Win7 won't matter because game developers are still going to make games for DX9-DX11... Probably the very few games that come out being DX11-only are going to be some kind of tech demos & suck 4ss!

ET - Tuesday, October 13, 2009 - link
I haven't seen it stated, but I'd like to know if the 4850 benchmarked is 512MB or 1GB. If it's 512MB then the comparison with the 5750 isn't valid.poohbear - Tuesday, October 13, 2009 - link
You never mentioned that the performance of the 5770 might be a driver issue. The hardware is certainly capable of outdoing the 4870, as we can see in Far Cry 2, so maybe it's just a driver issue?

Ryan Smith - Tuesday, October 13, 2009 - link
I don't believe it's a driver issue. If anything it's a Far Cry 2-specific issue, but that's something I'm going to have to do some more digging for.