The Elder Scrolls IV: Oblivion GPU Performance
by Anand Lal Shimpi on April 26, 2006 1:07 PM EST - Posted in GPUs
Our Settings
We tested at two major settings, one we defined as High Quality and the other we called Medium Quality. The settings were as follows:
| Oblivion Performance Settings | High Quality | Medium Quality |
|---|---|---|
| Resolution | 1280x1024 | 1024x768 |
| Texture Size | Large | Medium |
| Tree Fade | 50% | 25% |
| Actor Fade | 65% | 50% |
| Item Fade | 65% | 50% |
| Object Fade | 65% | 50% |
| Grass Distance | 50% | 25% |
| View Distance | 100% | 100% |
| Distant Land | On | On |
| Distant Buildings | On | On |
| Distant Trees | On | Off |
| Interior Shadows | 50% | 30% |
| Exterior Shadows | 50% | 30% |
| Self Shadows | On | Off |
| Shadows on Grass | On | Off |
| Tree Canopy Shadows | On | Off |
| Shadow Filtering | High | Low |
| Specular Distance | 50% | 50% |
| HDR Lighting | On | On |
| Bloom Lighting | Off | Off |
| Water Detail | High | Normal |
| Water Reflections | On | On |
| Water Ripples | On | On |
| Window Reflections | On | On |
| Blood Decals | High | Low |
| Anti-aliasing | Off | Off |
Note that when we describe a setting as 65%, we mean that its slider is moved 65% of the way to the right. As you can see from the table above, our High Quality settings aren't as extreme as they could be, and the Medium Quality settings are better suited to upper mid-range cards than to true budget parts. Since we were dealing with such a wide spread of GPUs, we had to err on the side of more stressful visual settings, especially in the mid-range, in order to adequately characterize the performance of all of the GPUs. We didn't want to end up with a graph where everything performed the same because we had been too lax with our detail settings.
At the end of the day, these two configurations are what we would strive for in order to get good performance while maintaining a good gameplay experience.
High End Settings
Mid Range Settings
Note that the ATI Radeon X850/X800 series of GPUs doesn't support Shader Model 3.0, which is required for HDR in Oblivion. Thus we had to leave the X850/X800 out of our default tests with HDR enabled, and instead ran a second set of configurations with HDR disabled and Bloom enabled.
100 Comments
bobsmith1492 - Wednesday, April 26, 2006 - link
I'm playing with a Mobility 9700 (basically a 9600 Ultra) in my laptop with a Pentium M and a gig of RAM at 1024, medium settings about like you set them. Where in the world did all those extra settings come from, though (shadows, water)? Is that something outside the game itself?

ueadian - Thursday, April 27, 2006 - link
I played this game fine on my X800 XL at high settings. Yeah, it probably dipped into the 20s, but honestly I never really noticed "lag". I short-circuited my X800 XL by stupidly putting a fan with a metal casing on top of it; it went ZZZZT and died. I bought a 7900 GT for $299.99 and voltmodded it to GTX speeds, and I really don't notice a difference while playing the game. I'm sure if I paid attention to FPS I'd see it, but really, the only place I noticed lag with my X800 XL at high settings was near Oblivion gates, and my 7900 GT at 680 core / 900 mem locks up near Oblivion gates as well.

I was sort of forced to "upgrade" my card, but the 7900 GT is the best value for the money right now, considering you can do a pencil mod to get it to run past GTX speeds fairly easily. I have a crappy CRT whose max resolution is 1024x768 and don't plan on upgrading it anytime soon, so I don't need 512MB of memory to throw the resolution up to godly high settings. Besides, I'm pretty blind; I find it easier to play most online games like FPSes at lower resolutions just to gain an advantage.

Oblivion is near perfection as a GAME; it's the most enjoyable game I've ever played, and I've been playing games since Doom. Yeah, the engine does suck, and I was really disappointed to have my brand new top-of-the-line video card actually STUTTER in a game, but really, does it completely ruin the whole game for you? If you have played it, you know that it doesn't.

thisisatest - Thursday, April 27, 2006 - link
7900 series isn't what I consider to be the top of the line. There is high end and there is top of the line. The top of the line is clear.poohbear - Wednesday, April 26, 2006 - link
I'm really curious to see how dual-core CPUs perform, as Oblivion is supposed to take advantage of multithreading. If AnandTech could do a CPU performance chart, that'd be great. FiringSquad did a CPU performance chart but at only two resolutions, 800x600 and 1280x1024; they found significant differences between dual-core and single-core at 800x600 but no difference at 1280x1024. Now, I play at 1024x768 on my 6800 GT, so I'm wondering if a dual-core would help at that resolution. Also, if you could investigate some of the supposed tweaks for dual-cores and whether they truly work, that'd be great too. Thanks.

Eris23007 - Wednesday, April 26, 2006 - link
A friend of mine is playing it on a 3.4GHz Northwood; he told me that when he enabled HyperThreading he got an immediate ~10% (or so) improvement.
That's a pretty good indication that dual cores will help a *lot*, in my view...
mpeavid - Thursday, April 27, 2006 - link
10% is very poor multithreading performance. A decent multithreaded app should give 40-60% and higher for highly efficient code.

nullpointerus - Thursday, April 27, 2006 - link
HT isn't the same as having dual cores. IIRC, a ~10% improvement from HT is fairly typical in areas where true dual cores see significantly better returns.

Akaz1976 - Wednesday, April 26, 2006 - link
Anyone have any idea how the 9800 Pro compares to the X800?

hoppa - Friday, April 28, 2006 - link
What this test fails to mention is that I'm running a 9800 Pro, an Athlon XP 3000+, and 1.5GB of RAM at 1280x768, and the game runs quite well even at medium settings. This game is very stressful with everything maxed out, but it still manages to run incredibly well on older rigs at lower settings. Had I not played this game, after seeing this article I would've thought it'd be impossible on my rig, but the truth is I've got plenty of computing power to spare.

xsilver - Wednesday, April 26, 2006 - link
The 9800 Pro is considered midrange/low-end now -- I guess that article is coming later. My guess is approximately 10% less than the lowest card on each graph besides the 7300 GS (also, you don't have HDR).