The Elder Scrolls IV: Oblivion Performance
While it is disappointing that Oblivion doesn't have a built-in benchmark, our FRAPS tests have proven fairly repeatable and very demanding on every part of a system. While these numbers reflect real-world playability of the game, please remember that our test system uses the fastest processor we could get our hands on. If a purchasing decision is to be made on Oblivion performance alone, please check out our two articles on the CPU and GPU performance of Oblivion. We have used the most graphically intensive benchmark in our suite, but the rest of the platform will make a difference. We can still easily demonstrate which graphics card is best for Oblivion even if our numbers don't translate directly to what our readers will see on their own systems.
Running through the forest toward an Oblivion gate while fireballs fly past our head is a very graphically taxing benchmark. To run it, we load a saved game and run through the scene with FRAPS. To start the benchmark, we hit "q", which simply runs forward, and we start and stop FRAPS at predetermined points in the run. While no two runs are 100% identical, our benchmark scores are usually fairly close. We run the benchmark a couple of times just to be sure there wasn't a one-time hiccup.
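For readers who want to sanity-check their own FRAPS runs the same way, here is a minimal Python sketch (our own illustration, not an actual FRAPS tool): given the average FPS each run reports, it computes the mean and flags any run that strays from it by more than a tolerance, which is exactly how a one-time hiccup would show up.

```python
# Average FPS across benchmark runs and flag outlier runs.
# The 5% tolerance is an assumption for illustration, not a FRAPS setting.
def summarize_runs(fps_per_run, max_spread=0.05):
    mean = sum(fps_per_run) / len(fps_per_run)
    outliers = [f for f in fps_per_run
                if abs(f - mean) / mean > max_spread]
    return mean, outliers

# Three runs within a few tenths of a frame of each other pass cleanly.
mean, outliers = summarize_runs([34.8, 35.2, 35.0])
print(round(mean, 1), outliers)  # 35.0 []
```

If a run lands well outside the spread of the others, we simply discard it and rerun rather than let a background hiccup skew the average.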
As for settings, we tested a few different configurations and decided on this group of options:
Oblivion Performance Settings

| Setting | Value |
|---|---|
| Texture Size | Large |
| Tree Fade | 60% |
| Actor Fade | 20% |
| Item Fade | 10% |
| Object Fade | 25% |
| Grass Distance | 30% |
| View Distance | 100% |
| Distant Land | On |
| Distant Buildings | On |
| Distant Trees | On |
| Interior Shadows | 45% |
| Exterior Shadows | 20% |
| Self Shadows | Off |
| Shadows on Grass | Off |
| Tree Canopy Shadows | Off |
| Shadow Filtering | High |
| Specular Distance | 80% |
| HDR Lighting | On |
| Bloom Lighting | Off |
| Water Detail | Normal |
| Water Reflections | On |
| Water Ripples | On |
| Window Reflections | On |
| Blood Decals | High |
| Anti-aliasing | Off |
Our goal was to get acceptable performance from the current generation of cards at 1280x1024. For the most part we succeeded, but the X1600 XT just wasn't up to the task, and reducing settings for one consistently underperforming card wasn't worth it to us. The game is also very enjoyable and playable at these settings. While more is better in this game, no GPU or CPU is going to give you everything.
While very graphically intensive, and first person, Oblivion isn't a twitch shooter. Our experience leads us to conclude that 20fps delivers a good experience. It's playable a little lower, but watch out for some jerkiness that may pop up. Dropping to 16fps and below is too low to be acceptable. The main point to take home is that you really want as much eye candy as possible: while Oblivion is an immersive and awesome game from a gameplay standpoint, the graphics certainly help draw the gamer in.
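As a rough encoding of the thresholds above, the bands look something like this (the exact cutoffs are our judgment calls for this particular game, not hard rules):

```python
# Playability bands for Oblivion as we judge them; the cutoffs
# (20 and 16 fps) are subjective, not industry standards.
def playability(avg_fps):
    if avg_fps >= 20:
        return "good"      # smooth enough for a good experience
    if avg_fps > 16:
        return "playable"  # playable, but expect occasional jerkiness
    return "too low"       # 16 fps and below isn't acceptable
```

For a twitch shooter we would set these bars considerably higher; Oblivion's slower pace is what makes 20fps livable.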
At our target resolution, the 6600 GT is utterly unplayable, but the 7600 GT falls into our passable performance range when running on a Core 2 Extreme X6800. Other than the X1600 XT, ATI cards sweep this test in terms of performance and value. In fact, at 1280x1024, we would recommend turning up a few more settings on the X1900 GT and X1900 XT.
For those of you who want higher settings than what we picked, a lower resolution is likely in order. At 1024x768, the 7900 GT and X1800 GTO gain enough headroom to increase their load and remain playable, while an overclocked 7900 GT may be able to handle a little more at a higher resolution. Generally, resolution matters less than effects in this game, so we would certainly suggest trading resolution for eye candy. The exception to the rule is that 800x600 starts to look a little too grainy for our tastes. It really isn't necessary to push up to 1600x1200 or beyond, but the X1900 XT does make it possible for those interested.
74 Comments
gmallen - Friday, August 11, 2006 - link
Most of the PC enthusiast population interested in mid-range cards is still running AGP motherboards (this is based on sales of PCIe vs. AGP motherboards). Where are these cards?

Josh7289 - Friday, August 11, 2006 - link
They don't exist.
arturnowp - Friday, August 11, 2006 - link
Hi. It's written that all cards in Oblivion were tested with HDR Lighting, which the X800 GTO doesn't support. I think your results are misleading. The same with SC: Chaos Theory...
BTW: Who plays Oblivion with Actor Fade at 20%, Item Fade at 10%, and Object Fade at 25%? You get better graphics and performance setting those options to 50-60% and turning off grass, which consumes a lot of power and doesn't look good. In foliage it's better to see your enemies from a greater distance, say when riding a horse ;-)
arturnowp - Friday, August 11, 2006 - link
OK, there is a mention of SC: Chaos Theory, but all in all the conclusions are misleading: "Owners of the X800 GTO may have a little more life left in their card depending on how overclocked the card is, but even at stock clocks, it might be wise to hang on for another product cycle if possible", where the GeForce 6600 GT performs on par with the X800 GTO. It would be better to exclude the X800 GTO from the charts or mark it as an SM 2.0 card. Better still, the GeForce 6600 GT should be tested in SM 2.0 mode...

nv40 - Friday, August 11, 2006 - link

Don't know why: http://www.xbitlabs.com/articles/video/display/pow...
Some of the differences between the tests are so large that they almost shocked me.
For instance:
7900GT on 84.21 with an FX-60 averages 54 FPS at 1600x1200 with 4xAA/16xAF at X-bit Labs.
7900GT on 91.33 with an X6800 averages just 35 FPS at 1600x1200 with only 4xAA at AnandTech.
A problem with 91.33? The Intel 975X? The X6800? NVIDIA?
That's more than a 40% performance difference, despite the X6800 being far superior to the FX-60.
coldpower27 - Friday, August 11, 2006 - link
They probably aren't running the same timedemo sequences.

nv40 - Friday, August 11, 2006 - link
Maybe... but there's only a 9% difference for the X1900 GT (41 vs. 38), and the 7900 GT definitely performed much worse at AnandTech than at X-bit Labs in general.
There's no telling which is correct, but if both are right, the conclusions would probably be drawn like below:
1. Driver problem: 91.33 is much slower than 84.21 (an NV cheat, or a 91.33 problem)
2. CPU problem: the X6800 is far inferior to the FX-60 in games (ridiculous, and far from true in every test)
3. Platform problem: NVIDIA cards perform much worse on Intel chipsets (975X)
Sharky974 - Friday, August 11, 2006 - link
I agree. I clearly remember X-bit declaring the 7900 GT the winner of the vast majority of benches vs. the X1900 GT. In fact, overall the X1900 GT wasn't warmly received. I really feel this deserves some looking into.
For example, I'll have to go look, but I think Firing Squad also showed the X1900 GT as inferior to the 7900 GT.
As it stands now, it's like Anand's platforms are somehow ATI-biased; on the other hand, I believe X-bit's platform is NVIDIA-biased. X-bit reviews nearly always show NVIDIA winning.
Sharky974 - Friday, August 11, 2006 - link
http://www.firingsquad.com/hardware/sapphire_radeo...

I started on the first page of benches.
As one glaring example:
Firing Squad: Quake 4, 1280x1024, 4xAA/8xAF: 7900 GT - 87.2, X1900 GT - 60.6
http://www.firingsquad.com/hardware/sapphire_radeo...
Anand: Quake 4, 1280x1024, 4xAA: 7900 GT - 45.1, X1900 GT - 49.8
http://images.anandtech.com/reviews/video/roundups...
With similar settings, FS has the 7900 GT getting nearly double the frames Anand does. The X1900 GT also gets significantly more in the FS review, from 49 to 60 FPS, but nowhere near the change the 7900 GT sees, with the net effect that the X1900 GT ekes out a win at Anand but loses by nearly 27 FPS at FS.
The X1900 GT is definitely a better card than I had remembered, even in the FS benches, though.
Also, FS was using an FX-57 and Anand a much more powerful CPU, making the results all the more puzzling.
In addition to some of the other suggestions, I'd question drivers. FS was using older drivers on both since it is an older review. Perhaps NVIDIA's drivers have seen a large performance decrease, or ATI's a similar increase? That seems fairly unlikely, though, as I don't think you normally get huge differences from driver to driver.
Unless NVIDIA really was cheating re: 16-bit filtering as the Inq claimed a while back, so they fixed it, causing a massive performance decrease? :) Again though, that suggestion is made half-jokingly.
This definitely needs a lot of looking into, I feel. Anand's results are quite different from others around the web at first blush.
JarredWalton - Friday, August 11, 2006 - link
Levels can make a huge difference in performance. For example, Far Cry has segments that get about 80 FPS max on any current CPU (maybe higher with a Core 2 Extreme overclocked...), but other areas of the game run at 150+ FPS on even a moderate CPU like a 3500+. I don't have a problem providing our demo files, but some of them are quite large (Q4 is about 130 MB if I recall). SCCT, FEAR, and X3 provide a reference that anyone can compare to, if they want. The only other thing is that ATI driver improvements are certainly not unlikely, especially in Quake 4.