Power Within Reach: NVIDIA's GeForce 8800 GTS 320MB
by Derek Wilson on February 12, 2007 9:00 AM EST - Posted in GPUs
Half-Life 2: Episode One Performance
Games based on Valve's Source engine have long been a staple of our test suites. Constantly evolving, HL2:EP1 shows off some of the newer features of the engine. We are using the latest version of the game available on Steam as of the publication of this article. Our benchmark makes use of the timedemo functionality provided through Source. The demo we recorded is a battle with a flying gunship in a wooden house, in which both the house and the gunship are blown to bits. All the settings are turned up as high as they will go.
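For readers who want to reproduce this kind of run, Source's demo tooling is driven from the developer console. A typical session looks roughly like this (the demo name here is our own placeholder, not the one we used):

```
record gunship_run     // begin recording a demo while playing
stop                   // end the recording
timedemo gunship_run   // replay as fast as the hardware allows and report average fps
```

Because timedemo renders frames as fast as possible rather than in real time, the reported framerate isolates GPU and driver throughput from gameplay timing.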
Half-Life 2: Episode One brings us another test where the extra 320MB of RAM the original 8800 GTS enjoys makes no difference when AA is not enabled. Performance between the two variants is identical here with all the settings but AA cranked up. Even at 2560x1600 there is no real difference, which is quite impressive for a card with less than 512MB of RAM.
Enabling 4xAA shows us that we can still get good performance at resolutions below 2560x1600 along with antialiasing. Up until our highest resolution test, the two 8800 GTS cards performed very similarly. This means that even at high resolutions with AA enabled, HL2:EP1 doesn't incur the same penalty on the 8800 GTS 320MB that other games do.
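Part of why antialiasing is so sensitive to onboard memory is simple framebuffer arithmetic. The sketch below is our own simplified model (one multisampled color buffer, one multisampled depth/stencil buffer, one single-sample resolve target), not NVIDIA's actual allocation scheme, which adds swap-chain buffers and may compress samples:

```python
def msaa_framebuffer_bytes(width, height, samples, color_bpp=4, depth_bpp=4):
    """Rough framebuffer footprint for multisampled rendering.

    Simplified model: multisampled color + multisampled depth/stencil
    + a single-sample resolve target. Real drivers allocate more.
    """
    pixels = width * height
    ms_color = pixels * samples * color_bpp   # one color sample per MSAA sample
    ms_depth = pixels * samples * depth_bpp   # depth/stencil is also per-sample
    resolve = pixels * color_bpp              # final single-sample image
    return ms_color + ms_depth + resolve

# 2560x1600 with 4xAA: ~140 MiB before a single texture is loaded
print(msaa_framebuffer_bytes(2560, 1600, 4) / 2**20)
```

Under this model, 4xAA at 2560x1600 consumes roughly 140 MiB of the 320MB card's memory on framebuffers alone, versus about 47 MiB without AA, which goes a long way toward explaining why AA at high resolution is where the two GTS variants diverge.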
55 Comments
nicolasb - Monday, February 12, 2007 - link
The conclusion to this article does not seem to bear much resemblance to the actual observations. In virtually every case the card performed well without AA, but dismally as soon as 4xAA was switched on. A fair conclusion would be to recommend the card for resolutions up to 1920x1200 without AA, but definitely not with.
DerekWilson - Monday, February 12, 2007 - link
The GTS 320MB still performs well if taken on its own at 19x12 with 4xAA ... But I will modify the comment to better reflect what I mean.
nicolasb - Tuesday, February 13, 2007 - link
The way the conclusion now reads is a big improvement, IMNSHO. :-)
munky - Monday, February 12, 2007 - link
I was expecting better performance with AA enabled, and the article just glossed over the fact that in half the games with AA the card performed on par with or worse than last-gen cards that cost less.
Bob Markinson - Monday, February 12, 2007 - link
For the base Oblivion install, yes, it's not so much of a memory hog. In-game texture use usually doesn't exceed 256 MB with HDR and 4xAA on @ 1152x864. (Also, please test AA performance with HDR too; both ATI and NVIDIA support them simultaneously on their current-gen cards.)
Most popular texture mods will push memory usage north of 500 MB. I've seen it hit over 700 MB. Thus, there's a good chance that any 256 MB card would be crippled with texture swapping. I should know, mine is.
DerekWilson - Monday, February 12, 2007 - link
What texture mod would you recommend we test with?
Bob Markinson - Monday, February 12, 2007 - link
Qarl's Texture Pack 2 and 3 are quite popular world texture packs. Please check this site for more details: http://devnull.devakm.googlepages.com/totoworld
Note that version 3 really does need a lot of texture memory. Also, check out Qarl's 4096 compressed landscape LOD normal map texture pack, it'll add far more depth than the plain, overly filtered Oblivion LOD textures.
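Those memory figures are easy to sanity-check: texture memory grows quadratically with resolution, so a pack that doubles texture dimensions quadruples the footprint. Below is a back-of-the-envelope estimator (our own sketch; it ignores DXT's 4x4 block minimum and driver padding, so treat it as a lower bound):

```python
def texture_bytes(width, height, bytes_per_pixel, mipmaps=True):
    """Approximate memory for one texture, optionally with a full mip chain."""
    total, w, h = 0, width, height
    while True:
        total += int(w * h * bytes_per_pixel)
        if not mipmaps or (w == 1 and h == 1):
            break
        w, h = max(w // 2, 1), max(h // 2, 1)  # each mip halves both dimensions
    return total

# One 4096x4096 uncompressed RGBA8 texture with mips: ~85 MiB
# The same texture in DXT1 (0.5 bytes/pixel): ~10.7 MiB
print(texture_bytes(4096, 4096, 4), texture_bytes(4096, 4096, 0.5))
```

By this estimate, a couple hundred 2048x2048 DXT1 replacements with mip chains already land somewhere around 530 MiB, which is consistent with the 500-700 MB figures quoted above.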
DerekWilson - Monday, February 12, 2007 - link
We will take a look at those texture packs and do some testing... Hopefully we can provide a follow-up further exploring the impact of memory on the 8800 architecture.
blackbrrd - Monday, February 12, 2007 - link
I looked at the Oblivion scores, and the first thing that hit me was: they are using the standard crappy-looking textures! No Oblivion fan running an 8800 GTS would run with the standard texture pack. It is, at times, really, really bad.
Running a texture pack like the one above is quite normal. If you have enough video card memory there isn't much of a slowdown - except when the data is loaded into memory - which happens all the time... It does make the game look nicer, though!