ATI Radeon HD 4890 vs. NVIDIA GeForce GTX 275
by Anand Lal Shimpi & Derek Wilson on April 2, 2009 12:00 AM EST - Posted in GPUs
The Cards and The Test
In the AMD department, we received two cards. One was an overclocked part from HIS and the other was a stock clocked part from ASUS. Guess which one AMD sent us for the review. No, it's no problem, we're used to it. This is what happens when we get cards from NVIDIA all the time. They argue and argue for the inclusion of overclocked numbers in GPU reviews when it's their GPU we're looking at. Of course when the tables are turned so are the opinions. We sincerely appreciate ASUS sending us this card and we used it for our tests in this article. The original intent of trying to get a hold of two cards was to run CrossFire numbers, but we only have one GTX 275 and we would prefer to wait until we can compare the two to get into that angle.
The ASUS card also includes a utility called Voltage Tweaker that allows gamers to increase some voltages on their hardware to help improve overclocking. We didn't have the chance to play with it ourselves, but more control is always nice to have.
For the Radeon HD 4890 our hardware specs are pretty simple. Take a 4870 1GB and overclock it. Crank the core up 100 MHz to 850 MHz and the memory clock up 75 MHz to 975 MHz. That's the Radeon HD 4890 in a nutshell. However, to reach these clock levels, AMD revised the core, adding decoupling capacitors and new timing algorithms and altering the ASIC power distribution for enhanced operation. These slight changes increased the transistor count from 956M to 959M. Otherwise, the core features/specifications (texture units, ROPs, z/stencil) remain the same as the HD 4850/HD 4870 series.
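As a quick sanity check on what those clock bumps buy, here is a back-of-the-envelope sketch of peak shader throughput and memory bandwidth. The 800 stream processor count, the 2 FLOPS-per-SP figure, and GDDR5's 4x data-rate multiplier are commonly cited RV770/RV790 numbers we're assuming here, not figures pulled from this article.

```python
# Back-of-the-envelope math for the clock bumps described above.
# Assumptions (not from the article): 800 stream processors at
# 2 FLOPS each per clock, a 256-bit bus, and GDDR5's 4x data rate.

def gflops(core_mhz, stream_processors=800, flops_per_sp=2):
    """Peak single-precision throughput in GFLOPS."""
    return core_mhz * 1e6 * stream_processors * flops_per_sp / 1e9

def bandwidth_gbps(mem_mhz, bus_bits=256, data_rate=4):
    """Peak memory bandwidth in GB/s for a GDDR5 bus."""
    return mem_mhz * 1e6 * data_rate * (bus_bits / 8) / 1e9

for name, core, mem in [("HD 4870 1GB", 750, 900), ("HD 4890", 850, 975)]:
    print(f"{name}: {gflops(core):.0f} GFLOPS, {bandwidth_gbps(mem):.1f} GB/s")

# HD 4870 1GB: 1200 GFLOPS, 115.2 GB/s
# HD 4890:     1360 GFLOPS, 124.8 GB/s
```

On paper, then, the 4890 offers roughly 13% more shader throughput and about 8% more memory bandwidth than a stock 4870 1GB.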
Most vendors will also be selling overclocked variants that run the core at 900 MHz. AMD would like to treat these overclocked parts as a separate entity altogether, but we will continue to treat them as enhancements of the stock version whether they come from NVIDIA or AMD. In our eyes, the difference between, say, an XFX GTX 275 and an XFX GTX 275 XXX is XFX's call; the latter is their enhanced take on the stock part. We aren't going to look at the XFX 4890 and the XFX 4890 XXX any differently. In doing reviews of vendors' cards, we'll consider overclocked performance closely, but for a GPU launch, we will be focusing on the baseline version of the card.
On the NVIDIA side, we received a reference version of the GTX 275. Its design looks similar to that of the other GT200-based hardware.
Under the hood here is the same setup as half of a GTX 295, but with higher clock speeds. That means the GTX 275 has the memory amount and bandwidth of the GTX 260 (448-bit wide bus), but the shader count of the GTX 280 (240 SPs). On top of that, the GTX 275 posts clock speeds closer to the GTX 285 than the GTX 280. Core clock is up 31 MHz from a GTX 280 to 633 MHz, shader clock is up 108 MHz to 1404 MHz, and memory clock is also up 108 MHz to 2322 MHz. This means that in shader limited cases we should see performance closer to the GTX 285, while in bandwidth limited cases we'll still be faster than the GTX 260 Core 216 because of the clock speed boost across the board.
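To make the shader-limited vs. bandwidth-limited argument concrete, here is a rough sketch that plugs those clocks into peak numbers. The GTX 275 and GTX 280 figures follow from the clocks quoted above; the GTX 285 and GTX 260 Core 216 rows, the effective (double data rate) memory clocks, and the 3-FLOPS-per-SP counting convention are assumptions based on NVIDIA's published reference specs rather than figures from this article.

```python
# Rough comparison of where the GTX 275 lands between its siblings.
# GTX 275/280 clocks come from the article; the GTX 285 and
# GTX 260 Core 216 entries are assumed reference specs.  Memory clocks
# are effective (double data rate); GT200 SPs counted at 3 FLOPS/clock.

CARDS = {
    # name:              (SPs, shader MHz, mem MHz eff., bus bits)
    "GTX 260 Core 216":  (216, 1242, 1998, 448),
    "GTX 275":           (240, 1404, 2322, 448),
    "GTX 280":           (240, 1296, 2214, 512),
    "GTX 285":           (240, 1476, 2484, 512),
}

for name, (sps, shader, mem, bus) in CARDS.items():
    gflops = sps * shader * 3 / 1000    # MAD + MUL per SP per clock
    bw = mem * bus / 8 / 1000           # GB/s
    print(f"{name:18s} {gflops:6.0f} GFLOPS  {bw:6.1f} GB/s")
```

By this math the GTX 275 lands within about 5% of the GTX 285 in shader throughput, while its memory bandwidth sits between the GTX 260 Core 216 and the GTX 280, which is exactly the split described above.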
Rather than just an overclock of a pre-existing card, this is a blend of the two configurations from which it was born, with an overclock on top. And sure, it's also half a GTX 295, and that is convenient for NVIDIA. It's not just that it's different; it's that this setup should have a lot to offer, especially in games that aren't bandwidth limited.
That wraps it up for the cards we're focusing on today. Here's our test system, which is the same as for our GTS 250 article except for the addition of a couple of drivers.
The Test
Test Setup
CPU | Intel Core i7-965 3.2GHz
Motherboard | ASUS Rampage II Extreme X58
Video Cards | ATI Radeon HD 4890, ATI Radeon HD 4870 1GB, ATI Radeon HD 4870 512MB, ATI Radeon HD 4850, NVIDIA GeForce GTX 285, NVIDIA GeForce GTX 280, NVIDIA GeForce GTX 275, NVIDIA GeForce GTX 260 Core 216
Video Drivers | ATI Catalyst 8.12 hotfix (9.4 Beta for HD 4890), NVIDIA ForceWare 185.65
Hard Drive | Intel X25-M 80GB SSD
RAM | 6 x 1GB DDR3-1066 7-7-7-20
Operating System | Windows Vista Ultimate 64-bit SP1
PSU | PC Power & Cooling Turbo Cool 1200W
294 Comments
SiliconDoc - Friday, April 24, 2009 - link
You failed to read his post, and therefore the context of my response, you IDIOT. Can you run a second ATI card for PhysX? NO.
Can you run an ATI card and a second NVIDIA card for PhysX? Not without a driver hack - check TechPowerUp for the how-to and files, as I've already mentioned.
So, THAT'S WHAT WE WERE TALKING ABOUT, DUMMY.
Now you can take your stupidity along with you; no one can stop it.
pizzimp - Friday, April 3, 2009 - link
From an objective point of view there is not really a clear winner. At the lower resolutions do you really care if you are getting 80 FPS vs. 100 FPS? IMO it is the higher resolutions that matter. I would think any real gamer is always looking to upgrade their monitor :).
I wonder how old you guys are that are posting? Who cares if something is "rebadged" or just an OC version of something? Bottom line is how does the card play the game?
IMO both cards are good. It comes down to price for me.
SiliconDoc - Monday, April 6, 2009 - link
Ahh, you just have to pretend framerates you can't see or notice, and only the top rate or the average, never the bottom framerate... Then you must discount ALL the OTHER NVIDIA advantages, from CUDA, to Badaboom, to better Folding@home scores, to PhysX, to game release day EVGA drivers ready to go, to forced game profiles in the NVIDIA control panel (none for ATI) - and on and on and on...
Now, after 6 months of these red roosters screaming ATI wins it all because it had the top resolution of the 30" monitor sewed up and lost in lower resolutions, these red roosters have done a 180-degree about-face... now the top resolution just doesn't matter -
Dude, the red ragers are lying loons, it's that simple.
The 2 year old 9800 GTX core is the 4870 without GDDR5. Think about that, and how deranged they truly are.
I bet they have been fervently praying to their red god hoping that change doesn't come in the form of GDDR5 on that old G80/G92/G92b core - because then instead of it competing with the 4850, it would be a 4870 - and THAT would be an embarrassment - a severe embarrassment. The crowing of the red roosters would diminish... and they'd be bent over sucking up barnyard dirt and chickseed for a long, long time. lol
Oh well, at least ATI might get 2 billion from Obama to cover its losses... it's sad when a red rooster card could really use a bailout, isn't it?
helldrell666 - Friday, April 3, 2009 - link
Well, you have a point there. But the card is still not operating on a WHQL driver, and the percentage of those who use 30" monitors is negligible compared to the owners of 22"/24" monitors. I think this is probably due to the 256-bit memory interface compared to the 448-bit that the GTX 275 has. Even at Xbit Labs the 4890 drops significantly in performance compared to the GTX 285.
7Enigma - Friday, April 3, 2009 - link
From a subjective point of view you may feel that way, but from an objective point of view there is a clear winner, and it is the 4890. Left 4 Dead and Call of Duty are the only two 30" display tests where the 275 significantly defeated the 4890. In all of the other tests the 4890 either dominated (G.R.I.D., Fallout 3) or was within 4% of the 275, which I would call a wash. At all other resolutions the 4890 was the undisputed leader. So I find it difficult to say there is no clear winner. What NVIDIA should have done was not nerf their 22" and 24" resolutions for the very few people that game at 30" with the latest drivers. To be honest, I wish the article had included all of the results from the 182 drivers (they show just G.R.I.D. but allude to other games also having similar reduced results except at the highest res). It could very likely be a wash then if the 275 is more competitive at the resolutions 99% of the people buying this level/price of card are going to be playing at.
Anand, is there any way you could post, even just in the comments, the numbers for the rest of the games with the 182 NVIDIA drivers? I don't mind doing the comparison work to see how much closer the 275 would be to the 4890 if they had kept the earlier drivers.
7Enigma - Friday, April 3, 2009 - link
Ah, I see now that the 185s are specifically to enable support for the 275 card, so you can't run the 275 with the 182 drivers. Still, it would be interesting to see all the data for what happened to the 285 using the newest drivers that decrease performance at lower resolutions.
minime - Friday, April 3, 2009 - link
First, thanks for your review(s). I've been a silent reader and word-of-mouth spreader for years. Second, don't you think reviewers should point their fingers a little more aggressively at power consumption? Not because it's trendy nowadays, but because it's just not sane to waste that much energy in idle (2D, anyone remember?) mode. I was thrilled by what you alone (don't take it as disrespect) were able to achieve on the SSD issue.
SiliconDoc - Monday, April 6, 2009 - link
PSST! The ATI cards use something like 30 watts more power in idle - and like 3 watts less in 3D - so on the power thing, well, they just declare ATI the winner... LOL. They said they were "really surprised" at the 30 watts less in idle for the NVIDIA card - they just couldn't figure it out - and kept rechecking... but yeah... the 260 was kicking butt... but... that doesn't matter - ATI takes the win using 1-3 watts less in 3D.
So, you know, the red roosters shall not be impugned!
Capisce?
VulgarDisplay - Friday, April 3, 2009 - link
It appears that you may have had Vsync turned on, which caps the game at 60 fps, in some of the CoD: W@W tests. It's pretty apparent something is up when the NVIDIA card posts the same FPS at 1680x and 1920x. Either way, it still seems like the 4890 wins at those resolutions, which is different from most sites, which pretty much say it's a wash across the board. I'll take NVIDIA's drivers over ATI's any day.
SiliconDoc - Monday, April 6, 2009 - link
Hey, any little trick that smacks NVIDIA down a notch is not to be pointed out.