DX10 for the Masses: NVIDIA 8600 and 8500 Series Launch
by Derek Wilson on April 17, 2007 9:00 AM EST
Posted in: GPUs
Final Words
DirectX 10 is here, and NVIDIA has the hardware for it. Much like ATI led the way into DX9, NVIDIA has taken hold of the industry, and we can expect developers to take their DX10 cues from G80 behavior. After all, 8800 cards have been available for nearly half a year without any other DX10 alternative, so developers and consumers alike have moved toward NVIDIA for now. Hopefully the robust design of DX10 will help avoid the pitfalls we saw in getting consistent DX9 performance across multiple GPU architectures.
Now that affordable GeForce 8 Series hardware is here, we have to weigh in on NVIDIA's implementation. While the 8600 GT improves on the performance of its spiritual predecessor, the 7600 GT, we don't see significant performance improvements over hardware already available at the new cards' target prices. In NVIDIA's favor, our newest and most shader intensive tests (Oblivion and Rainbow Six: Vegas) paint the 8600 hardware in a more favorable light than older tests that rely less on shader programs and more on texture, z, and color fill rates.
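To make the fill rate versus shader argument concrete, here is a rough back-of-the-envelope sketch. The unit counts and clocks are assumptions taken from public spec sheets rather than our own measurements, and peak numbers like these ignore real-world efficiency, so treat it as an illustration only:

```python
# Back-of-the-envelope peak throughput estimates. Unit counts and clocks
# below are assumptions from public spec sheets, not measured data.

def fill_rate_gpix(rops, core_mhz):
    """Peak color fill rate in Gpixels/s (ROPs x core clock)."""
    return rops * core_mhz / 1000.0

def shader_gflops(alus, shader_mhz, flops_per_clock=2):
    """Peak programmable shading rate in GFLOPS, counting a MAD as 2 flops."""
    return alus * shader_mhz * flops_per_clock / 1000.0

# Assumed: 8600 GTS (G84) has 32 scalar ALUs at 1.45GHz and 8 ROPs at 675MHz;
# 7600 GT (G73) has 8 ROPs at 560MHz. G73's vec4 pipelines make a direct
# GFLOPS comparison misleading, so only fill rate is shown for it.
print(f"8600 GTS fill rate: {fill_rate_gpix(8, 675):.1f} Gpix/s")
print(f"7600 GT  fill rate: {fill_rate_gpix(8, 560):.1f} Gpix/s")
print(f"8600 GTS shader:    {shader_gflops(32, 1450):.0f} GFLOPS")
```

Under these assumptions, the fill rate advantage over the 7600 GT is modest (roughly 20%) while shader throughput grows far more, which is consistent with the flat results we see in older, fill-rate-bound tests.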
We plan to look into this issue further and will publish a second article on 8600 GTS/GT performance in the near future using games like S.T.A.L.K.E.R., Supreme Commander, and Company of Heroes. Hopefully these tests will help confirm our conclusion that near-future titles placing a heavier emphasis on shader performance will benefit more from G84 based hardware than from previous models.
Whatever we feel about where performance should be, we are very happy with the work NVIDIA has put into video processing. We hope our upcoming video decoding performance update will bear out the expectations NVIDIA has set by claiming 100% GPU offload of H.264 decoding; VC-1 and MPEG-2 are not fully decoded on the GPU, but at least in the case of MPEG-2, decoding isn't nearly as CPU intensive anyway. Including two dual-link DVI ports, even on $150 hardware, with the ability to play HDCP protected content over a dual-link connection really makes the 8600 GTS and 8600 GT the hardware of choice for those who want HD video on their PC.
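For reference, the dual-link requirement comes down to pixel clock arithmetic: a single DVI link tops out at a 165MHz pixel clock, and the largest panels exceed that. A minimal sketch, where the ~12% blanking overhead is our approximation of reduced-blanking timings:

```python
# Approximate check of which resolutions exceed single-link DVI.
# The 165MHz single-link TMDS limit comes from the DVI spec; the 12%
# blanking overhead is an assumed approximation, not an exact timing.

SINGLE_LINK_LIMIT_MHZ = 165.0

def approx_pixel_clock_mhz(width, height, refresh_hz, blanking=1.12):
    """Estimate pixel clock as active pixels plus blanking overhead."""
    return width * height * refresh_hz * blanking / 1e6

for w, h in [(1280, 800), (1920, 1200), (2560, 1600)]:
    clk = approx_pixel_clock_mhz(w, h, 60)
    link = "dual-link" if clk > SINGLE_LINK_LIMIT_MHZ else "single-link"
    print(f"{w}x{h}@60Hz: ~{clk:.0f}MHz -> {link}")
```

This is why 30" panels running at their native 2560x1600 need dual-link DVI, and why HDCP support over a dual-link connection matters; the point comes up again in the comments below.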
For users who own 7600 GT, 7900 GS, or X1950 Pro hardware, we can't recommend an upgrade to one of these new parts. Even though the new features and higher performance in a few applications are welcome, there's not enough of a difference to justify the upgrade. On the other hand, those searching for new hardware in the $150-$200 range certainly won't be disappointed with 8600 based graphics. These cards aren't quite the silver bullet NVIDIA had with the 6600 series, but DX10 support and great video processing are nothing to sneeze at. The features the 8600 series supports add quite a bit of value where pure framerate may be lacking.
These cards are a good fit for users with a 1280x1024 panel, though some newer games may need a couple of settings turned down from the maximum to run smoothly. That's the classic definition of midrange, so in some ways it makes sense. At the same time, NVIDIA hasn't won the battle yet, as AMD has yet to unveil its DX10 class hardware. With midrange performance that's merely on par with the older hardware occupying these price points, NVIDIA has left itself open this time around. We'll have to wait and see if AMD can capitalize.
Comments
JarredWalton - Tuesday, April 17, 2007 - link
It's not surprising that G84 has some enhancements relative to G80. I mean, G80 was done six months ago. I'd expect VP2 is one of the areas they worked on improving a lot after comments post-8800 launch. Now, should they kill the current G80 and make a new G80 v1.1 with VP2? That's up for debate, but you can't whine that older hardware doesn't have newer features. "Why doesn't my Core 2 Duo support SSE4?" It's almost the same thing. I wouldn't be at all surprised to see a new high-end card from NVIDIA in the future with VP2, but when that will be... dunno.

harshw - Tuesday, April 17, 2007 - link
So ... to confirm, the card *does* let you watch HDCP content on a Dell 3007WFP at 2560x1600 ? Of course, the card would probably scale the stream to the panel resolution ...
DerekWilson - Tuesday, April 17, 2007 - link
The card will let you watch HDCP protected content at the content's native resolution -- 1920x1080 progressive at max. Currently, if you want to watch HDCP protected content on a Dell 30", you need to drop your screen resolution to 1280x800 and watch at that res -- the video is downscaled from 1920x1080. Higher resolutions on the panel require dual-link DVI, and now HDCP protected content over a dual-link connection is here.
AnnonymousCoward - Tuesday, April 17, 2007 - link
Maybe I'm in the minority, but I don't care about this HDCP business. The players are still ultra expensive, and the resolution benefit doesn't really change how much I enjoy a movie. Also, a 30" screen is pretty small to be able to notice a difference between HD and DVD, if you're sitting at any typical movie-watching distance from the screen. Well, I would guess so at least.

Spoelie - Wednesday, April 18, 2007 - link
We're talking about 30" LCD monitors with humongous resolutions, not old 30" LCD TVs at 1366x768 or something. Or do you really not see any difference between
http://www.imagehosting.com/out.php/i433150_BasicR... and http://www.imagehosting.com/out.php/i433192_HDDVD....
or
http://www.imagehosting.com/out.php/i433157_BasicR... and http://www.imagehosting.com/out.php/i433198_HDDVD....
Myrandex - Tuesday, April 17, 2007 - link
I loved how the two 8600 cards listed only 256MB of memory while the 8500 card showed 256MB / 512MB. Gotta love marketing attempting to grab the masses' attention by throwing more RAM into a situation where it doesn't really help...

Jason
KhoiFather - Tuesday, April 17, 2007 - link
Horrible, horrible performance. I'm so disappointed it's not even funny! I'm so waiting for ATI to release their mid-range cards and blow Nvidia out of the water and into space.

jay401 - Tuesday, April 17, 2007 - link
Very true, and not only because the vast majority of gamers are still running XP, but also because no games out to this point gain anything from DX10/Vista (aside from one or two that add a few graphical tweaks here and there in DX10).
When there are enough popular, well-reviewed DX10/Vista focused games available that demonstrate appreciable performance improvement when running in that environment, such that you can create a test suite around those games, then it would be time to transition to that sort of test setup for GPUs.
Griswold - Tuesday, April 17, 2007 - link
The real reason would be that nobody wants to go through the nightmare of dealing with nvidia drivers under vista. ;)

jay401 - Tuesday, April 17, 2007 - link
Derek, you should add the specs of the 8800GTS 320MB to the spec chart on page 2, unless of course NVidia forbids you, because it would make it too obvious how many stream processors and how much bus width they've cut from these new cards.

Now what they'll do is end production of the 7950GTs to ensure folks can't continue to pick them up cheaper and will be forced to move to the 8600GTS, which doesn't yet offer superior performance.
gg neutering these cards so much that they lose to your own previous generation hardware, NVidia.