NVIDIA GeForce 8800 GTS 512 & GeForce 8800 GT 256MB: Playing with Memory and G92
by Anand Lal Shimpi on December 11, 2007 12:00 AM EST - Posted in
- GPUs
8800 GT 512MB vs. 256MB
When AMD released the Radeon HD 3800 series, NVIDIA responded by saying that a cheaper 256MB version of the 8800 GT was on its way, priced below $200. NVIDIA delivered on part of that promise: we do have a 256MB 8800 GT in hand, but it's not a sub-$200 card. The 8800 GT 256 we have is the Alpha Dog Edition XXX from XFX, priced at $229 not including a $10 mail-in rebate. That's not too far off the mark, but it's still not less than $200.
The XFX card we have runs at a 650MHz core clock but only has a 1.6GHz memory data rate. The reference 512MB card runs at 600MHz core/1.8GHz memory.
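The gap in raw memory bandwidth between the two clocks is easy to quantify. A quick sketch, assuming the 256-bit memory bus that both 8800 GT variants share (bandwidth = bus width in bytes × effective data rate):

```python
def bandwidth_gb_s(bus_width_bits: int, data_rate_gt_s: float) -> float:
    """Peak memory bandwidth in GB/s: bus width in bytes times effective data rate."""
    return bus_width_bits / 8 * data_rate_gt_s

# Assumes the 256-bit bus used by both 8800 GT variants
reference_512mb = bandwidth_gb_s(256, 1.8)  # 57.6 GB/s
xfx_256mb = bandwidth_gb_s(256, 1.6)        # 51.2 GB/s
print(f"512MB reference: {reference_512mb} GB/s, XFX 256MB: {xfx_256mb} GB/s")
```

In other words, the XFX card gives up roughly 11% of the reference card's memory bandwidth before the smaller frame buffer even enters the picture.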
Quake Wars starts off showing us a trend we'll see quite often with the 256MB 8800 GT: it performs virtually identically to its 512MB sibling up to 1600 x 1200, then there's a sharp drop-off:
The performance hit isn't as pronounced when you turn on AA; instead you get a 10 - 20% hit across the board:
Bioshock shows the same thing: competitive performance up to 1600 x 1200, but at 1920 x 1200 the 512MB card has a 16% advantage, growing to 60% at 2560 x 1600. It's worth noting that neither card is really playable at 2560 x 1600 in Bioshock.
World in Conflict moves the choke point up to 1600 x 1200; the two cards behave similarly at 1280 x 1024, but the 512MB 8800 GT holds on to a 20% minimum advantage at 1600 x 1200 and grows it to 40% at 2560 x 1600.
Older titles like Half Life 2 and Oblivion show absolutely no difference between the two cards, telling us that the current wave of games, and most likely all those to follow, requires frame buffers larger than 256MB. While 256MB could cut it in the Half Life 2 and Oblivion days, the same just isn't true anymore.
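A back-of-the-envelope estimate shows why resolution is the trigger. The sketch below is a rough illustration only (real drivers allocate memory quite differently): it counts just the color, back, and depth render targets, with MSAA multiplying the per-pixel sample storage:

```python
def render_targets_mb(width: int, height: int,
                      bytes_per_pixel: int = 4,
                      buffers: int = 3,
                      aa_samples: int = 1) -> float:
    """Rough render-target footprint in MB: front, back and depth buffers,
    each scaled by the MSAA sample count. Illustrative only."""
    return width * height * bytes_per_pixel * buffers * aa_samples / 2**20

print(render_targets_mb(1600, 1200))                # ~22 MB
print(render_targets_mb(2560, 1600, aa_samples=4))  # 187.5 MB
```

Add textures, geometry, and driver overhead on top of those render targets, and 2560 x 1600 (especially with AA) simply doesn't fit in 256MB, while lower resolutions leave plenty of headroom.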
What we have here is an 8800 GTS 512 that's $50 more for not much more gain, and a 256MB 8800 GT that's at least $70 cheaper for a lot less performance. If you plan on keeping this card for any length of time, it looks like 512MB is the way to go. Frame buffer demands of modern games are only going to increase, and what we're seeing here today is an indication that the transition to 512MB as a minimum for high end gaming performance is officially underway. The 768MB frame buffer of the 8800 GTX still isn't strictly required, but 512MB looks like the sweet spot.
56 Comments
Griswold - Thursday, December 13, 2007 - link
Also (partly) wrong. It's a good price/performance part and it's in short supply; that's why it's priced higher. And I'm willing to bet the supply shortage is artificial. Look at the availability of the GTS 512 - it seems to be much better than that of the GT. It's no surprise: Nvidia's margins on the GT must be abysmal compared to those of higher-priced units (that's a given, but they also rendered almost their complete lineup obsolete for several weeks prior to the launch of the GTS 512), yet they needed that horse to compete at the 3850/3870 price point. And you really need to stop talking out of your ass about the 3850. It's selling well and it's selling at MSRP because supply is decent (and you lecture him about fundamentals...). I think there was a claim in The Register of 150k units in 3 weeks. Well, that's three times the number of 8800GT units available in the same timeframe. Speaks for itself.
neogodless - Tuesday, December 11, 2007 - link
Whew... just bought an 8800GT and would like to feel like it was a good buy for a *little while*! Hope it has enough supply to help drive prices down in general though...
R3MF - Tuesday, December 11, 2007 - link
Where are the G92 GTS cards with memory over 2.0GHz? Does this presage the entrance of a G92 GTX with memory at 2.4GHz and a higher core clock?
It isn't rocket science to put some decent-speed memory on a midrange card. Witness the 3870 with 2.35GHz memory, so why haven't any of the so-called "OC" versions of the G92 GTS got overclocked memory?
At the same time we all want a card that can play Crysis at 1920x1200 at High details and still get around 30FPS. The GTS can get ~30FPS at Medium details... whoopy-do!
So, we know it's possible to economically provide more bandwidth and we know it's necessary, but nobody has done so, including the OC'd versions.
Is this because there is a G92 GTX product around the corner?
Yes, I know there is rumoured to be a G92 GX2 dual-card sometime in January, but how about a non-cack single-card version?
A card with:
720MHz core clock
2000MHz shaders
2400MHz memory
1GB of memory
would absolutely rock, so why haven't we got one?
kilkennycat - Tuesday, December 11, 2007 - link
Memory tweaking of the current series is a tiny marginal benefit with a huge increase in power dissipation. The G92 represents the last gasp of the current G8x/G9x architecture. The shrink was absolutely essential to nVidia's GPU business to get away from the huge, power-hungry and low-yield G80 GPU.
The true high-end replacement family for the 8xxx series is coming around Q2 of 2008. It has been in design for at least the past year and is NOT just a tweak of the G8x/G9x architecture. If you really HAVE TO upgrade your system right now, just get a SINGLE 8800GT 512. At this point in time, do not invest in SLI. Keep your hands in your pockets and wait for the next gen. A single copy of the high-end version of the next-gen GPU family from nVidia is likely to have more GPU horsepower than dual 8800GTX.
Griswold - Thursday, December 13, 2007 - link
"The true high-end replacement family for the 8xxx-series is coming around Q2 of 2008. It has been in design for at least the past year and is NOT just a tweak of the G8x/G9x architecture."Its going to be an evolved (note: thats a fair bit more than just tweaked) G80/G92. You dont design a completely new architecture in a year. Remember what nvidia claimed at launch of the G80? Its been in the works for several years. They will squeeze every bit of revenue out of this architecture before they launch their true next generation architecture (on which at least one team must have been working since the launch of G80).
retrospooty - Tuesday, December 11, 2007 - link
A card with:
720MHz core clock
2000MHz shaders
2400MHz memory
1GB of memory
would absolutely rock, so why haven't we got one?
Ummm... wait until the high-end card is released in January and then see what the specs are. It's supposed to be a dual-GPU version like the 7950 GX2 was, so think 2x 8800GT SLI performance. The memory won't likely be 2400MHz, but it will be dual-channel for 512-bit aggregate bandwidth.