ATI Radeon HD 4890 vs. NVIDIA GeForce GTX 275
by Anand Lal Shimpi & Derek Wilson on April 2, 2009 12:00 AM EST - Posted in
- GPUs
The Cards and The Test
In the AMD department, we received two cards. One was an overclocked part from HIS and the other was a stock clocked part from ASUS. Guess which one AMD sent us for the review. No, it's no problem, we're used to it. This is what happens when we get cards from NVIDIA all the time. They argue and argue for the inclusion of overclocked numbers in GPU reviews when it's their GPU we're looking at. Of course, when the tables are turned, so are the opinions. We sincerely appreciate ASUS sending us this card, and we used it for our tests in this article. The original intent of getting hold of two cards was to run CrossFire numbers, but we only have one GTX 275, so we would prefer to wait until we can compare the two multi-GPU setups before getting into that angle.
The ASUS card also includes a utility called Voltage Tweaker that allows gamers to increase some voltages on their hardware to help improve overclocking. We didn't have the chance to play with the feature ourselves, but more control is always a nice feature to have.
For the Radeon HD 4890 our hardware specs are pretty simple. Take a 4870 1GB and overclock it. Crank the core up 100 MHz to 850 MHz and the memory clock up 75 MHz to 975 MHz. That's the Radeon HD 4890 in a nutshell. However, to reach these clock levels, AMD revised the core by adding decoupling capacitors and new timing algorithms, and altered the ASIC power distribution for enhanced operation. These slight changes increased the transistor count from 956M to 959M. Otherwise, the core features/specifications (texture units, ROPs, z/stencil) remain the same as the HD 4850/HD 4870 series.
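Those clock bumps translate directly into peak-throughput numbers. As a rough sketch (assuming, since the core specs are otherwise unchanged, that the 4890 keeps the 4870's 256-bit GDDR5 memory bus; the helper function below is ours, not an AMD tool):

```python
def gddr5_bandwidth_gbs(mem_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s.

    GDDR5 transfers 4 bits per pin per clock, so the effective data
    rate is 4x the base memory clock.
    """
    transfers_per_sec = mem_clock_mhz * 1e6 * 4
    bytes_per_transfer = bus_width_bits / 8
    return transfers_per_sec * bytes_per_transfer / 1e9

# Radeon HD 4870 1GB: 900 MHz memory clock, 256-bit bus
hd4870 = gddr5_bandwidth_gbs(900, 256)   # 115.2 GB/s
# Radeon HD 4890: 975 MHz memory clock, 256-bit bus
hd4890 = gddr5_bandwidth_gbs(975, 256)   # 124.8 GB/s
print(f"4870: {hd4870:.1f} GB/s, 4890: {hd4890:.1f} GB/s")
```

In other words, an 8.3% bandwidth bump to go with the 13.3% core-clock increase.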
Most vendors will also be selling overclocked variants that run the core at 900 MHz. AMD would like to treat these overclocked parts as a separate entity altogether. But we will continue to treat these parts as enhancements of the stock version, whether they come from NVIDIA or AMD. In our eyes, the difference between, say, an XFX GTX 275 and an XFX GTX 275 XXX is XFX's call; the latter is their part enhancing the stock version. We aren't going to look at the XFX 4890 and the XFX 4890 XXX any differently. In reviews of vendors' cards we'll consider overclocked performance closely, but for a GPU launch, we will be focusing on the baseline version of the card.
On the NVIDIA side, we received a reference version of the GTX 275. It looks similar to the design of the other GT200-based hardware.
Under the hood here is the same setup as half of a GTX 295, but with higher clock speeds. That means the GTX 275 has the memory amount and bandwidth of the GTX 260 (448-bit wide bus) but the shader count of the GTX 280 (240 SPs). On top of that, the GTX 275 posts clock speeds closer to the GTX 285 than the GTX 280. Core clock is up 31 MHz from a GTX 280 to 633 MHz, shader clock is up 108 MHz to 1404 MHz, and effective memory clock is also up 108 MHz to 2322 MHz. This means that in shader limited cases we should see performance closer to the GTX 285, and in bandwidth limited cases we'll still be faster than the GTX 260 Core 216 because of the clock speed boost across the board.
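To see where that leaves the GTX 275 on paper, here's a quick back-of-the-envelope sketch using only the numbers above (240 SPs at 1404 MHz, a 448-bit bus at an effective 2322 MT/s); the 3 FLOPs per SP per clock figure for GT200's dual-issue MAD+MUL is our assumption, not something stated in this article:

```python
def shader_gflops(sp_count: int, shader_clock_mhz: float, flops_per_sp: int = 3) -> float:
    # GT200 can dual-issue a MAD (2 FLOPs) plus a MUL (1 FLOP) per SP per clock
    return sp_count * shader_clock_mhz * 1e6 * flops_per_sp / 1e9

def bandwidth_gbs(effective_mtps: float, bus_width_bits: int) -> float:
    # effective_mtps: data rate in mega-transfers/s (already doubled for GDDR3)
    return effective_mtps * 1e6 * (bus_width_bits / 8) / 1e9

gtx275_flops = shader_gflops(240, 1404)   # ~1011 GFLOPS, GTX 280-class shader power
gtx275_bw = bandwidth_gbs(2322, 448)      # ~130 GB/s on the GTX 260's 448-bit bus
```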
Rather than just an overclock of a pre-existing card, this is a blend of two configurations, clocked higher than either of the parts from which it was born. And sure, it's also half a GTX 295, and that is convenient for NVIDIA. It's not just that it's different; it's that this setup should have a lot to offer, especially in games that aren't bandwidth limited.
That wraps it up for the cards we're focusing on today. Here's our test system, which is the same as for our GTS 250 article except for the addition of a couple drivers.
The Test
| Test Setup | |
|---|---|
| CPU | Intel Core i7-965 3.2GHz |
| Motherboard | ASUS Rampage II Extreme X58 |
| Video Cards | ATI Radeon HD 4890, ATI Radeon HD 4870 1GB, ATI Radeon HD 4870 512MB, ATI Radeon HD 4850, NVIDIA GeForce GTX 285, NVIDIA GeForce GTX 280, NVIDIA GeForce GTX 275, NVIDIA GeForce GTX 260 Core 216 |
| Video Drivers | Catalyst 8.12 hotfix (9.4 Beta for HD 4890), ForceWare 185.65 |
| Hard Drive | Intel X25-M 80GB SSD |
| RAM | 6 x 1GB DDR3-1066 7-7-7-20 |
| Operating System | Windows Vista Ultimate 64-bit SP1 |
| PSU | PC Power & Cooling Turbo Cool 1200W |
294 Comments
Psyside - Thursday, April 2, 2009 - link
Can anyone tell me about the testing method, average or maximum fps? Thanks.
Jamahl - Thursday, April 2, 2009 - link
Some sites have the GTX 275 clearly winning at all games, all resolutions.
helldrell666 - Thursday, April 2, 2009 - link
You can't trust every site you check, especially since most of those sites don't post their funders' names on their main page. You must've heard of HardOCP's Kyle, who was blacklisted by NVIDIA because he mentioned that the GTS 250 is a renamed 9800 GTX.
7Enigma - Thursday, April 2, 2009 - link
I think this is due to NVIDIA shooting themselves in the foot with the 185 drivers. With the performance penalty at the normal resolutions, anyone testing with the 185s is going to get lower results than someone testing with the previous drivers. And I'm sure you could find 10 games that all perform better on ATI/NVIDIA. That's the problem with game selection, and the only real answer is what types of games you play and what engines you think will be used heavily for the next 2 years.
SiliconDoc - Monday, April 6, 2009 - link
Well the REAL ANSWER is - if you play at 2560, or even if you don't, and have been a red raging babbling lying idiot red rooster for 6 months plus pretending along with Derek that 2560x is the only thing that matters, now you have a driver for NVidia that whips the ati top dog core... If you're ready to reverse 6 months of red ranting and raving for 2560X ati wins it all, just keep the prior NV driver, so the red roosters screaming they now win because they suddenly are stuck at the LOWER REZ tier to claim a win, can be blasted to pieces anyway - at that resolution.
So - NVidia now has a driver choice - the new for the high rez crown they took from the red fanboy ragers, and the prior driver which SPANKS THE RED CARD AGAIN at the lower rez.
Make sure to collude with all the raging red roosters to keep that as hush hush as possible.
1. spank the 790 at lower rezz with the older Nvidia driver
2. spank the 790 at the highest rez with the new driver
_______________________
Don't worry if you can't understand, just keep hopping around flapping those little wings and clucking so that red gobbler jounces around - don't worry, soft PhysX can display that flabby flapper!
The0ne - Tuesday, April 7, 2009 - link
Can someone ban this freaking idiot. The last few posts of his have been nothing but moronic, senseless rants. Jesus Christ, buy a gun and shoot yourself already.
SiliconDoc - Tuesday, April 7, 2009 - link
Ahh, you don't like the points, so now you want death. Perhaps you should be banned, mr death wisher. If you don't like the DOZENS of valid points I made, TOO BAD - because you have no response - now you sound like krz1000 and his endless list of names, the looney red rooster that screeches the same thing you just did, then posts a link to youtube with a freaky slaughter video.
If I wasn't here, the endless LIES would go unopposed, now GO BACK and respond to my points LIKE A MAN, if you have anything, which no doubt, you do not.
helldrell666 - Thursday, April 2, 2009 - link
According to Xbitlabs, the 4890 beats the GTX 285 at 1920x1200 with 4x AA in CoD5, Crysis Warhead, Stalker CS, and Fallout 3, and loses in Far Cry 2. Here, the 4890 matches the GTX 285 in Far Cry 2 and CoD5, with slightly lower fps in Crysis Warhead. Strange....
7Enigma - Thursday, April 2, 2009 - link
That is crazy. There is no way variations should be that huge between the 2 tests, regardless of the area they chose to test in the game. Anandtech has it as essentially a wash, while Xbit has the 4890 20% faster!?! (COD:WaW)
7Enigma - Thursday, April 2, 2009 - link
Just looked closer at the Xbitlabs review. The card they used was an OC variant with a 900MHz core instead of the stock 850MHz. In certain games that are not super graphically intensive, I'm willing to bet that at 1920x1200 they may still be core starved and not memory starved, so a 50MHz increase may explain the discrepancy. I've got to admit you need to take the Xbitlabs article with a grain of salt if they are using the OC variant as the base 4890 in all of their charts... that's pretty shady...