ATI Radeon HD 4890 vs. NVIDIA GeForce GTX 275
by Anand Lal Shimpi & Derek Wilson on April 2, 2009 12:00 AM EST - Posted in GPUs
CUDA - Oh there’s More
Oh, I’m not done. Other than PhysX, NVIDIA is stressing CUDA as another huge feature that no other GPU maker in the world has.
For those who aren’t familiar, CUDA is a programming interface to NVIDIA hardware. Modern-day GPUs are quite powerful, easily capable of churning out billions if not a trillion instructions per second when working on the right dataset. The problem is that harnessing such power is a bit difficult. NVIDIA put a lot of effort into developing an easy-to-use interface to the hardware, and eventually it evolved into CUDA.
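To give you an idea of what that interface actually looks like, here’s a minimal sketch of a CUDA program (our own illustrative vector add, not NVIDIA sample code). The kernel is essentially C with a few extensions; the host code allocates memory on the GPU, copies the data over, launches a million threads, and copies the result back.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Kernel: each thread adds one pair of elements.
__global__ void vecAdd(const float* a, const float* b, float* c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = a[i] + b[i];
}

int main()
{
    const int n = 1 << 20;                  // one million elements
    const size_t bytes = n * sizeof(float);

    // Host-side buffers
    float* h_a = (float*)malloc(bytes);
    float* h_b = (float*)malloc(bytes);
    float* h_c = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i) { h_a[i] = 1.0f; h_b[i] = 2.0f; }

    // Device-side buffers
    float *d_a, *d_b, *d_c;
    cudaMalloc((void**)&d_a, bytes);
    cudaMalloc((void**)&d_b, bytes);
    cudaMalloc((void**)&d_c, bytes);

    // Copy inputs to the GPU, run the kernel across n threads, copy the result back
    cudaMemcpy(d_a, h_a, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(d_b, h_b, bytes, cudaMemcpyHostToDevice);

    const int threadsPerBlock = 256;
    const int blocks = (n + threadsPerBlock - 1) / threadsPerBlock;
    vecAdd<<<blocks, threadsPerBlock>>>(d_a, d_b, d_c, n);

    cudaMemcpy(h_c, d_c, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %.1f\n", h_c[0]);        // expect 3.0

    cudaFree(d_a); cudaFree(d_b); cudaFree(d_c);
    free(h_a); free(h_b); free(h_c);
    return 0;
}
```

The catch, of course, is that the <<<blocks, threads>>> launch syntax only means something to NVIDIA’s compiler and NVIDIA’s GPUs, which brings us to the next point.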
Now CUDA only works on certain NVIDIA GPUs and certainly won’t talk to Larrabee or anything in the ATI camp. Both Intel and ATI have their own alternatives, but let’s get back to CUDA for now.
The one area where GPU computing has already had a tremendous impact is the HPC market. The applications there lend themselves very well to GPU programming, and thus we see incredible CUDA penetration there. What NVIDIA wants, however, is CUDA in the consumer market, and that’s a little more difficult.
The problem is that you need a compelling application, and the first major one we looked at was Elemental’s Badaboom. The initial release of Badaboom fell short of the mark, but over time it became a nice tool. While it’s not the encoder of choice for people looking to rip Blu-ray movies, it’s a good, fast way of getting your DVDs and other videos onto your iPod, iPhone or other portable media player. It only works on NVIDIA GPUs and, if you have a fast enough GPU, is much faster than doing the same conversion on a CPU.
The problem with Badaboom is that, like GPU-accelerated PhysX, it only works on NVIDIA hardware, and NVIDIA isn’t willing to give away GPUs to everyone in the world - thus we have another catch-22 scenario.
Badaboom is nice. If you have an NVIDIA GPU and you want to get DVD-quality content onto your iPod, it works very well. But spending $200 - $300 on a GPU to run a single application just doesn’t seem like something most users would be willing to do. NVIDIA wants the equation to work like this:
Badaboom -> You buy an NVIDIA GPU
But the equation really works like this:
Games (or clever marketing) -> You buy an NVIDIA GPU -> You can also run Badaboom
Now if the majority of applications in the world required NVIDIA GPUs to run, then we’d be dealing in a very different environment, but that’s not reality in this dimension.
294 Comments
joeysfb - Wednesday, April 15, 2009 - link
Hahaha! An eye for an eye. Guess the tables have turned. AMD used to be in a needy position... taking it from left, right, center, and back from players like NVIDIA.
joeysfb - Monday, April 13, 2009 - link
Good job AnandTech!! Really like your behind-the-scenes commentary.
araczynski - Saturday, April 11, 2009 - link
So far my overclocked 4850 CrossFire setup has been keeping me happy. I'll come back into the market when the 5000 series rolls out and I upgrade my rig in general.
ChemicalAffinity - Thursday, April 9, 2009 - link
Can someone ban this guy? I mean seriously.
SiliconDoc - Friday, April 24, 2009 - link
Are you on drugs? Is that why you don't understand or have a single counterpoint? Come on, come up with at least one that refutes my endless stream of corrections to the lies you've lived with for months.
No?
Ban the truth instead?
Yeah, that wouldn't help you.
Ananke - Thursday, April 9, 2009 - link
I had a 4850, a 4870 1GB, a 260-216, and an overclocked 280. Ran them on a 24" screen at 1920x1200 - Crysis and Warhead, Far Cry 2, GTA4, STALKER... whatever else you can imagine. My experience:
The Radeons are hotter and noisier. You HAVE to increase the fan speed and it is audible. Image quality in games is very good though. Crysis in particular looked better on the Radeons; the bullet tracing and sunshine effects were spectacular. The GTX 280 on max everything in Crysis was also very beautiful. However, that card gets HOT, so you would be better off with a 285. I didn't like the image quality of the Radeons in movies, but maybe my settings were not good. The 4850 is definitely not worth the money - too hot in my testing.
So the 4870 or 4890 1GB is definitely worth buying; performance is on par with the 285 at 1920x1200 - Crysis was 27-41 FPS with a standard Radeon 4870, and 31-45 with the 280 OC at 615 MHz.
If the 285 were priced at $250, it would be the best buy. If it costs more, it is NOT worth the money, unless you really want a bigger and quieter card. Performance-wise it is the same as the Radeon 4890, which now costs $229 and can be overclocked. I did overclock the GTX 280 and 285, which didn't show any performance change; I guess they are constrained by memory bandwidth?
So, honestly, for the money the Radeon 4890 at $229 is the better choice. If you can find a 4870 1GB for $169, it's worth considering also. The 896MB on the NVIDIA cards is a constraint, so I would not recommend anything but the 285, but that is expensive.
Truenofan - Tuesday, April 7, 2009 - link
Whoops, I meant the Arctic Cooling S1 Rev 2.
Truenofan - Tuesday, April 7, 2009 - link
I don't get what's going on with SiliconDoc, but I enjoy my 4870. It works best at my resolution (1920x1200) and it cost less than the 275 even with the AC S1. Runs very chilly (45C idle, 57C load overclocked). I don't need PhysX or an application to do video encoding that costs extra, adding to the total cost of the video card. Gaming is its sole purpose for me and it does that extremely well. $180 + $80 for the video applications costs more than what my 4870 ran me, and it completely outclasses that at stock speeds, let alone a 275 (260) or 280 (270), which mine still cost less than. Now you can get a 4870 for what the 260 runs. Where's the logic in that? Just so you can run a few games with PhysX that aren't even that good? To do some video encoding? I'll stick with my lower-cost 4870.
SiliconDoc - Tuesday, April 7, 2009 - link
I see, now your 4870 completely outclasses even the 280. LOL. Your 4870 is matched with the 260, not the 275, and not the 280.
You don't have anything but another set of lies, so it's not about you determining "my problem" or you "not knowing what it is"; it's the obvious lies required for you to "express your opinion". Maybe you should read my responses over the twenty-some pages and tell me why any of the 20-plus solid points that destroy the lies of the reds are incorrect. You think you might try it? I mean, we have a lot more than just YOUR OPINION, false as you presented it, to determine what is correct. For instance:
http://www.fudzilla.com/index.php?option=com_conte...
Now, not even your 4870 overclocked XXX can beat the GTX260 GLH. In your MIND, though, it does, huh....? lol
Too bad for you. I, unlike you, know what your problem is, and that is exactly what should bother you about me.