More Mainstream DX10: AMD's 2400 and 2600 Series
by Derek Wilson on June 28, 2007 8:35 AM EST - Posted in GPUs
Introduction
We've known about the basic architecture of AMD's lower end DX10 hardware since mid-May, but retail product hadn't made its way out the door until now. Finally launching today, and available within the next two weeks (according to AMD), the Radeon HD 2400 XT and Pro and the Radeon HD 2600 XT and Pro will bring competition to the $50 - $150 DX10 graphics card market. These are the cards that most people will actually end up purchasing, so both AMD and NVIDIA would like to come out on top in this segment.
But even before we begin, we have to go back to the 8800 GTS 320 and talk about what a terrific value it is for people who want great performance and don't need ultra high resolutions with AA cranked up. If $300 is in the budget for graphics, this is the way to spend it. We would really love to offer more flexibility in our recommendation, but both NVIDIA and AMD have seen fit to leave a huge gap in performance between their lower high end and upper low end parts. We saw this with the 8600 GTS falling well short of the 8800 series, and we will see it again with the HD 2600 XT not even getting close to the 2900 XT.
AMD's price gap will be even larger than NVIDIA's, leaving a hole between $150 and $400 with nothing to fill it. This seems excessive, with no other product lines hinted at until a refresh arrives somewhere down the line. When the 8600 series launched, we were quite disappointed with the performance of the part and hoped that AMD would step up to the plate and offer a real challenger that could fill the needs of midrange graphics hardware buyers everywhere. Instead, we are left with a sense of desolation and a feeling that neither AMD nor NVIDIA knows how to properly target the $200 - $300 price range. We would go so far as to say that neither camp offers top-to-bottom DX10, but rather something more along the lines of top and bottom end solutions.
But regardless of what is lacking in their lineup, the new Radeon HD cards are aimed at filling a specific need. We will talk about what they bring to the table and how they manage to do the job AMD has designed them to perform. First up is a brief look back at what's actually inside these GPUs.
UPDATE: In going back to add power tests, we discovered that the GeForce 8600 GTS we used had a slight overclock over the stock version. We have rerun our tests with the GeForce 8600 GTS at stock clock speeds, and our current graphs reflect the new data. The changes, generally on the order of 5%, did not have a significant impact on the overall outcome of the article. There are a couple of cases where the performance gap narrows, but the fact remains that the 8600 GTS is underpowered and the 2600 XT generally more so.
We do apologize for the initial testing error, and we will certainly do everything we can to avoid such problems in the future.
96 Comments
DerekWilson - Thursday, June 28, 2007 - link
I agree that we need to know DX10 performance, which is why we're doing a followup. I would think it would be clear that, if I were buying a card now, I'd buy a card that performed well under DX9.
All the games I play are dx9, all the games I'll play over the next 6 months will have a dx9 codepath, and we don't have dx10 tests that really help indicate what performance will be like on games designed strictly around dx10.
We always recommend people buy hardware to suit their current needs, because these are the needs we can talk about through our testing better.
TA152H - Thursday, June 28, 2007 - link
OK, that recommendation part is a little scary. You should be balancing the two, because as you know, the future does come. DX9 will exist for the next six months, but there are already games using DX10 that look better than DX9. Plus, Vista surely loves DX10.

But, we can agree to disagree on what's more important. I think this site's backward looking style is obvious, and while I fundamentally disagree with it, at least you guys are consistent in your love for dying technology. Then again, I still prefer Win 2K over XP, so I guess I'm guilty of it too, but in this case my primary concern would be DX10. It's better, noticeably so. But, the main thing is, you're judging something for what it's not made for. AMD's announcement made it very clear that DX10 was the main point, along with HD visual effects. Yet you chose to test neither and condemn the hardware for legacy code. Read the announcement, and judge it on what it's supposed to be for. Would you condemn a Toyota Celica because it's not as fast as a Porsche? Or a Corvette because it's got bad fuel economy? I doubt it, because that's not why they were made. Why condemn this part without testing it for what it was for? I didn't see DX9 mentioned anywhere in their announcement. Maybe that was a hint?
Chaotic42 - Thursday, June 28, 2007 - link
Yes, but how many people are going to purchase low-to-mid range cards to play games that aren't coming out for several months?

poohbear - Thursday, June 28, 2007 - link
A Celica compared to a Porsche?!?! Dude, that analogy is waayyyyyy off. How the hell is a Toyota Celica supposed to represent DX9 & a Porsche DX10?!?! With a Porsche you can see instant results and enjoy it instantly; there's nothing out right now on DX10, and I don't think even in 3 years the DX10 API would ever encompass the differences between a Celica and a Porsche. Get over yourself.

KhoiFather - Thursday, June 28, 2007 - link
Wow, what worthless cards! Like does ATI really think people are going to buy this crap? Maybe for a media box and that's about it, but for us mid-range gamers, it's worthless! All this hype and wait for nothing, I tell ya!

Chadder007 - Thursday, June 28, 2007 - link
Yeah, WTF?? They are all sometimes WORSE than the X1650XT!!! What is going on? According to the specs it should be better, could it be driver issues still??

tungtung - Thursday, June 28, 2007 - link
I don't think drivers alone will help much ... besides, ATI has never really been known to magically put up strong numbers through driver updates.

Personally I'd say the 2xxx line from AMD/ATI has just sunk into the deep abyss. First it was months late, and the performance is light years behind ... all the while the price is just, well, not right.
As much as I hate saying this ... it seems that we'll have to wait till Intel dips their giant feet into the graphics industry before NVIDIA and (especially) AMD/ATI wake up and think carefully about their next products (that is, if they can bring a competitive product) ... especially in the mainstream and value market.
OrSin - Thursday, June 28, 2007 - link
Very easy to guess what is happening here. Both camps are targeting the high end gamers that switched to Vista and can afford the high end cards, plus the OEM card deals, so they can push Vista on people again. Neither company wants to lower their high end sales by releasing a mid-level part. I just wonder if the cards are just more expensive to make for DX10. I don't see a reason they would be, but maybe I'm wrong.

Until Vista is used by more gamers, my guess is they will not release a mid range card. Early adopters of software are getting screwed.
DigitalFreak - Thursday, June 28, 2007 - link
Agreed. I couldn't help laughing when I read the Final Words section. Kinda like "The Nvidia 86xx/85xx cards suck, and the ATI 26xx/24xx suck worse!"

WTF happened this generation? The only cards worth their salt are the 88xx series. Nvidia dropped the ball with their low end stuff, and AMD.... well, AMD never really showed up for the game.
smitty3268 - Thursday, June 28, 2007 - link
I think it's clear that with these low end cards, ATI and NVIDIA both came to the conclusion that they could either spend their transistor budget implementing the DX10 spec or adding performance, and they both went with DX10. Probably so they could be marketed as Vista compatible, or whatever. It's still a mystery why they didn't choose to make any midrange cards, as they tend to sell fairly well AFAIK. Perhaps these were meant to be midrange cards and ATI/NVIDIA were just shocked by how badly performance scaled downwards in their current designs, and were forced to reposition them as cheaper cards.