ATI Radeon HD 4890 vs. NVIDIA GeForce GTX 275
by Anand Lal Shimpi & Derek Wilson on April 2, 2009 12:00 AM EST - Posted in GPUs
New Drivers From NVIDIA Change The Landscape
Today, NVIDIA will release its new 185 series driver. This driver not only enables support for the GTX 275, but also affects performance across NVIDIA's lineup in a good number of games. We retested our NVIDIA cards with the 185 driver and saw some very interesting results. For example, take a look at before and after performance with Race Driver: GRID.
As we can clearly see, on the cards we tested, performance decreased at lower resolutions and increased at 2560x1600. GRID was the most extreme example, but we saw flattened resolution scaling in most of the games we tested. This could definitely affect the competitiveness of a part depending on whether we are looking at low or high resolutions.
A trade-off was made that improves performance at ultra high resolutions at the expense of performance at lower resolutions. The cause could be something as simple as added driver overhead (and thus more CPU limitation), or something much more complex; we haven't been told exactly what creates this situation. With higher end hardware, this decision makes sense: resolutions lower than 2560x1600 tend to perform fine already, while 2560x1600 is more GPU limited and could benefit from a boost in most games.
Significantly different resolution scaling characteristics can appeal to different users. An AMD card might look better at one resolution, while the NVIDIA card could come out on top at another. In general, we think these changes make sense, but it would be nicer if the driver automatically figured out the best approach based on the hardware and the resolution in use (and thus didn't degrade performance at lower resolutions).
In addition to the performance changes, we see the addition of a new feature. In the past we've seen filtering techniques, optimizations, and even dynamic manipulation of geometry added to the driver. Some features have stuck and some just faded away. One of the most popular additions was the ability to force Full Screen Antialiasing (FSAA), enabling smoother edges in games. This feature was more important at a time when most games didn't offer an in-game way to enable AA: the driver took over and implemented AA even in games that didn't expose an option to adjust it. Today the opposite is true, and most games let us enable and adjust AA ourselves.
Now we have the ability to enable a feature that isn't natively available in many games, and that could be either loved or hated. You tell us which.
Introducing driver enabled Ambient Occlusion.
What is Ambient Occlusion, you ask? Look into a corner, around trim, or at anything concave in general. These areas will be a bit darker than the surrounding surfaces (depending on the depth and other factors), and NVIDIA has included a way to simulate this effect in its 185 series driver. Here is an example of what AO can do:
Here's an example of what AO generally looks like in games:
This, as with other driver enabled features, significantly impacts performance and might not work in all games or at all resolutions. Whether gamers like Ambient Occlusion will depend on its visual impact in a specific game and on whether performance remains acceptable. There are already games that make use of ambient occlusion natively, and some games on which NVIDIA hasn't been able to implement AO at all.
There are different ways to render an ambient occlusion effect, and NVIDIA implements a technique called Horizon Based Ambient Occlusion (HBAO for short). The advantage is that this method is likely highly optimized to run well on NVIDIA hardware; the downside is that developers give up control over the ultimate quality and technique used for AO if they leave it to NVIDIA to handle. On top of that, if a developer wants to guarantee that the feature works for everyone, they would need to implement it themselves, as AMD doesn't offer a parallel solution in its drivers (in spite of the fact that its hardware is easily capable of running AO shaders).
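The basic idea behind screen-space techniques like HBAO is to look at the depth buffer around each pixel: if nearby pixels are significantly closer to the camera, they partially block ambient light, so the pixel is darkened. The sketch below is a deliberately simplified illustration of that idea in Python, not NVIDIA's actual HBAO algorithm; the function name, sampling pattern, and distance falloff are all our own invention for illustration.

```python
import math

def ssao_factor(depth, x, y, radius=2, strength=1.0):
    """Crude screen-space ambient occlusion for one pixel of a depth buffer.

    Samples neighboring depths; neighbors closer to the camera than the
    center pixel count as occluders and darken the pixel. Returns a
    shade factor in [0, 1], where 1.0 means fully lit (no occlusion).
    """
    h, w = len(depth), len(depth[0])
    center = depth[y][x]
    occlusion = 0.0
    samples = 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dx == 0 and dy == 0:
                continue
            nx, ny = x + dx, y + dy
            if 0 <= nx < w and 0 <= ny < h:
                samples += 1
                diff = center - depth[ny][nx]  # > 0: neighbor is closer
                if diff > 0:
                    # Nearer occluders contribute more; contribution
                    # falls off with screen-space distance.
                    occlusion += diff / (diff + math.hypot(dx, dy))
    return max(0.0, 1.0 - strength * occlusion / samples) if samples else 1.0
```

On a flat surface every neighbor sits at the same depth, so the factor stays at 1.0; a pixel next to nearer geometry (a wall meeting a floor, the inside of a corner) returns a value below 1.0 and gets darkened. A real implementation runs as a pixel shader on the GPU and reconstructs view-space positions and horizon angles rather than comparing raw depths, which is where the performance cost the driver feature incurs comes from.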
We haven't done extensive testing with this feature yet, for either quality or performance. Only time will tell whether this addition ends up being gimmicky or really hits home with gamers. And if more developers built native support for the effect into their games, we wouldn't even need the option. But it is always nice to have something new and unique to play around with, and we are happy to see NVIDIA pushing effects in games forward by all means possible, even to the point of including effects like this in the driver.
In our opinion, lighting effects like this belong in engine and game code rather than in the driver, but until that happens it's always great to have an alternative. We wouldn't think it a bad idea if AMD picked up on this and did it too, but whether it is more worthwhile to do this or to spend that energy encouraging developers to adopt this and comparable techniques for more complex lighting is totally up to AMD. And we wouldn't fault them either way.
294 Comments
piesquared - Thursday, April 2, 2009 - link
Must be tough trying to write a balanced review when you clearly favour one side of the equation. Seriously, you toe NV's line without hesitation, including soon to be extinct physx, a reviewer released card, and unreleased drivers at the time of your review. And here's the kicker: you ignore the OC potential of AMD's new card, which as you know is one of its major selling points. Could you possibly bend over any further for NV? Obviously you are perfectly willing to do so. F'n frauds
Chlorus - Friday, April 3, 2009 - link
What?! Did you even read the article? They specifically say they cannot really endorse PhysX or CUDA and note the lack of support in any games. I think you're the one toeing a line here.
SiliconDoc - Monday, April 6, 2009 - link
The red fanboys have to chime in with insanities so the reviewers can claim they're fair because "both sides complain". Yes, the red rooster whiner never read the article, because if he had he would remember the line that neither card overclocked well, and that overclocking would come in a future review (in other words, they were rushed again, or got a chum card and knew it - whatever).
So, they didn't ignore it , they failed on execution - and delayed it for later, so they say.
Yeah, red rooster boy didn't read.
tamalero - Thursday, April 9, 2009 - link
Jesus dude, you have a strong persecution complex, right? It's like "ohh noes, they're going against my beloved nvidia, I MUST STOP THEM AT ALL COSTS".
I wonder how much nvidia pays you? ( if not, you're sad.. )
SiliconDoc - Thursday, April 23, 2009 - link
That's interesting: not a single counterpoint, just two whining personal attacks. Better luck next time - keep flapping those red rooster wings.
(You don't have any decent counterpoints to the truth, do you flapper?)
Sometimes things are so out of hand someone has to say it - I'm still waiting for the logical rebuttals - but you don't have any, neither does anyone else.
aguilpa1 - Thursday, April 2, 2009 - link
All these guys talking about how irrelevant physx is and how not so many games use it don't get it. The power of physx is bringing the full strength of those GPUs to bear on everyday apps like CS4 or Badaboom video encoding. I used to think it was kind of gimmicky myself until I bought the "very" inexpensive badaboom encoder and wow, how awesome was that! I forgot all about the games.
Rhino2 - Monday, April 13, 2009 - link
You forgot all about gaming because you can encode video faster? I guess we are just 2 different people. I don't think I've ever needed to encode a video for my ipod in 60 seconds or less, but I do play a lot of games.
z3R0C00L - Thursday, April 2, 2009 - link
You're talking about CUDA, not Physx. Physx is useless, as HavokFX will replace it as a standard through OpenCL.
sbuckler - Thursday, April 2, 2009 - link
No, physx has the market; HavokFX is currently demoing what physx did 2 years ago. What will happen is that the moment HavokFX becomes anything approaching a threat, nvidia will port Physx to OpenCL and kill it.
As far as ATI users are concerned the end result is the same - you'll be able to use physics acceleration on your card.
z3R0C00L - Thursday, April 2, 2009 - link
You do realize that Havok Physics is used in more games than Physx, right (including all the source engine based games)? And that Diablo 3 makes use of Havok Physics? Just thought I'd mention that to give you time to change your conclusion.