DX10 for the Masses: NVIDIA 8600 and 8500 Series Launch
by Derek Wilson on April 17, 2007 9:00 AM EST - Posted in GPUs
Final Words
DirectX 10 is here, and NVIDIA has the hardware for it. Much like ATI led the way into DX9, NVIDIA has taken hold of the industry, and we can expect to see developers take their DX10 cues from G80 behavior. After all, 8800 cards have been available for nearly half a year without any other DX10 alternative, so developers and consumers alike have moved towards NVIDIA for now. Hopefully the robust design of DX10 will help avoid the pitfalls we saw in getting consistent DX9 performance across multiple GPU architectures.
Now that affordable GeForce 8 Series hardware is here, we have to weigh in on NVIDIA's implementation. While the 8600 GT improves on the performance of its spiritual predecessor, the 7600 GT, we don't see significant performance gains over hardware already available at the new parts' target prices. In NVIDIA's favor, our newest and most shader intensive tests (Oblivion and Rainbow Six: Vegas) paint the 8600 hardware in a more favorable light than older tests that rely less on shader programs and more on texture, z, and color fill rates.
We are planning on looking further into this issue and will be publishing a second article on 8600 GTS/GT performance in the near future using games like S.T.A.L.K.E.R., Supreme Commander, and Company of Heroes. Hopefully these tests will help confirm our conclusion that near-future titles placing a heavier emphasis on shader performance will benefit more from G84-based hardware than from previous models.
Whatever we feel about where performance should be, we are very happy with the work NVIDIA has put into video processing. We hope our upcoming video decoding performance update will reflect the expectations NVIDIA has set by claiming 100% GPU offload of H.264 decode; VC-1 and MPEG-2 are not decoded entirely on the GPU, but at least in the case of MPEG-2, decoding isn't nearly as CPU intensive in the first place. Including two dual-link DVI ports even on $150 hardware, with the capability to play HDCP protected content over a dual-link connection, really makes the 8600 GTS and 8600 GT the hardware of choice for those who want HD video on their PC.
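To make the decode offload claims a bit more concrete, below is a minimal, hypothetical probe written against Windows Vista's public DXVA2 API. Everything in it (the throwaway D3D9 device, the GUID check) is our own illustration, and it only reports whether the graphics driver advertises a bitstream-level (VLD) H.264 decoder profile, not how much of the pipeline the silicon actually accelerates:

```cpp
// Hypothetical probe: ask the display driver, via DXVA2, whether it
// exposes a bitstream-level (VLD) H.264 decoder profile.
// Requires Windows Vista or later; link d3d9.lib and dxva2.lib.
#include <windows.h>
#include <initguid.h>   // materialize the GUIDs declared in dxva2api.h
#include <d3d9.h>
#include <dxva2api.h>
#include <cstdio>

int main() {
    // DXVA2 services hang off a Direct3D 9 device, so spin up a
    // minimal windowed device against the desktop window.
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed         = TRUE;
    pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat = D3DFMT_UNKNOWN;
    pp.hDeviceWindow    = GetDesktopWindow();

    IDirect3DDevice9* dev = nullptr;
    if (FAILED(d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                 pp.hDeviceWindow,
                                 D3DCREATE_SOFTWARE_VERTEXPROCESSING,
                                 &pp, &dev))) {
        d3d->Release();
        return 1;
    }

    IDirectXVideoDecoderService* svc = nullptr;
    if (SUCCEEDED(DXVA2CreateVideoService(dev, IID_IDirectXVideoDecoderService,
                                          reinterpret_cast<void**>(&svc)))) {
        UINT  count = 0;
        GUID* guids = nullptr;
        if (SUCCEEDED(svc->GetDecoderDeviceGuids(&count, &guids))) {
            bool h264 = false;
            for (UINT i = 0; i < count; ++i)
                if (IsEqualGUID(guids[i], DXVA2_ModeH264_E)) h264 = true;
            std::printf("H.264 VLD decode profile %s by the driver\n",
                        h264 ? "exposed" : "not exposed");
            CoTaskMemFree(guids);  // list is allocated with CoTaskMemAlloc
        }
        svc->Release();
    }
    dev->Release();
    d3d->Release();
    return 0;
}
```

A driver that lists a VLD profile is accepting the raw bitstream from the player, which is roughly the level at which G84's BSP engine is said to take over bitstream processing.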
For users who own 7600 GT, 7900 GS, or X1950 Pro hardware, we can't recommend an upgrade to one of these new parts. Even though the new features and higher performance in a few applications are welcome, there's not enough of a difference to justify the upgrade. On the other hand, those who are searching for new hardware to buy in the $150 - $200 range will certainly not be disappointed with 8600 based graphics. These cards aren't quite the silver bullet NVIDIA had with the 6600 series, but DX10 support and great video processing are nothing to sneeze at. The features the 8600 series supports do add quite a bit of value where pure framerate may be lacking.
These cards are a good fit for users who have a 1280x1024 panel, though some of the newer games may need to have a couple settings turned down from the max to run smoothly. That's the classic definition of midrange, so in some ways it makes sense. At the same time, NVIDIA hasn't won the battle yet, as AMD has yet to unveil their DX10 class hardware. With midrange performance that's just on par with the old hardware occupying the various price points, NVIDIA has left themselves open this time around. We'll have to wait and see if AMD can capitalize.
60 Comments
kilkennycat - Tuesday, April 17, 2007 - link
(As of 8AM Pacific Time, April 17) See:
http://www.zipzoomfly.com/jsp/ProductDetail.jsp?Pr...
http://www.zipzoomfly.com/jsp/ProductDetail.jsp?Pr...
Chadder007 - Tuesday, April 17, 2007 - link
That's really not too bad for a DX10 part. I just wish we actually had some DX10 games to see how it performs though....
bob4432 - Tuesday, April 17, 2007 - link
that performance is horrible. everyone here is pretty dead on - this is strictly for marketing to the non-educated gamer. too bad they will be disappointed and probably return such a piece of sh!t item. what a joke.
come on ati, this kind of performance should be in the low end cards, this is not a mid-range card. maybe if nvidia sold them for $100-$140 they may end up in somebody's htpc but that is about all they are good for.
glad i have a 360 to ride out this phase of cards while my x1800xt still works fine for my duties.
if i were the upper management at nvidia, people would be fired over this horrible performance, but sadly the upper management is more than likely the cause of this joke of a release.
AdamK47 - Tuesday, April 17, 2007 - link
nVidia needs to have people with actual product knowledge dictate what the specifications of future products will be. This disappointing lineup has marketing written all over it. They need to wise up or they will end up like Intel and their failed marketing-derived NetBurst architecture.
wingless - Tuesday, April 17, 2007 - link
In the article they talk about the PureVideo features as if they are brand new. Does this mean they ARE NOT implemented in the 8800 series? The article talked about how 100% of the video decoding process is on the GPU but it did not mention the 8800 core, which worries the heck outta me. Also, does the G84 have CUDA capabilities?
DerekWilson - Tuesday, April 17, 2007 - link
CUDA is supported.
DerekWilson - Tuesday, April 17, 2007 - link
The 8800 series supports PureVideo HD the same way the GeForce 7 series does -- through VP1 hardware. The 8600 and below support PureVideo HD through VP2 hardware, the BSP, and other enhancements which allow 100% offload of decode.
While the 8800 is able to offload much of the process, it's not 100% like the 8600/8500. Both support PureVideo HD, but G84 does it with lower CPU usage.
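For those curious what the CUDA answer above looks like in practice, here is a minimal, hypothetical host-side sketch -- plain C++ linked against the CUDA runtime, no kernel required -- that enumerates devices and prints their compute capability (G80 reports 1.0, while the G84/G86 parts report 1.1):

```cpp
// Hypothetical device-capability query using the CUDA runtime API.
// Compile with nvcc, or any C++ compiler linked against cudart.
#include <cuda_runtime.h>
#include <cstdio>

int main() {
    int n = 0;
    if (cudaGetDeviceCount(&n) != cudaSuccess || n == 0) {
        std::printf("No CUDA-capable GPU found\n");
        return 1;
    }
    for (int i = 0; i < n; ++i) {
        cudaDeviceProp prop;
        if (cudaGetDeviceProperties(&prop, i) != cudaSuccess) continue;
        // major.minor is the compute capability: G80 = 1.0, G84/G86 = 1.1
        std::printf("Device %d: %s, compute capability %d.%d, %d multiprocessors\n",
                    i, prop.name, prop.major, prop.minor,
                    prop.multiProcessorCount);
    }
    return 0;
}
```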
wingless - Tuesday, April 17, 2007 - link
I just checked NVIDIA's website and it appears only the 8600 and 8500 series support PureVideo HD, which sucks balls. I want 8800GTS performance with PureVideo HD support. Guess I'll have to wait a few more months, or go ATI, but ATI's future isn't stable these days.
defter - Tuesday, April 17, 2007 - link
Why do you want 8800GTS performance with improved PureVideo HD support? Are you going to pair an 8800GTS with a $40 Celeron? The 8800GTS has more than enough power to decode H.264 at HD resolutions as long as you pair it with a modern CPU: http://www.anandtech.com/printarticle.aspx?i=2886
This improved PureVideo HD is aimed at low-end systems that are using a low-end CPU. That's why this feature is important for low/mid-range GPUs.
wingless - Tuesday, April 17, 2007 - link
If I'm going to spend this kind of money on an 8800 series card, then I want VP2 100% hardware decoding. Is that too much to ask? I want all the extra bells and whistles. Damn, I may have to go ATI for the first time since 1987 when I had that EGA Wonder.