NVIDIA GeForce 8600: Full H.264 Decode Acceleration
by Anand Lal Shimpi on April 27, 2007 4:34 PM EST - Posted in GPUs
Serenity (VC1)
Our final test uses VC1 content, which means the GeForce 8600's new BSP engine sits idle here: it is hardwired for H.264 CAVLC/CABAC bitstreams. When decoding VC1 content, the new 8600 (and the 8500) perform essentially the same as the GeForce 8800 GTX or the GeForce 7 series GPUs. While they do include support for the inverse transform, that doesn't appear to make any significant difference to the load on the CPU.
For some reason, ATI's offerings continue to give us much higher CPU utilization figures; in this case it's as if hardware assist isn't working at all. We haven't been following ATI's AVIVO over the past several Catalyst revisions, so it is possible that ATI broke compatibility somewhere along the line. It could also be just one more software bug that PowerDVD needs to fix. ATI's hardware is supposed to handle motion compensation while NVIDIA's does not, so in theory ATI should be producing lower CPU utilization numbers in these VC1 tests.
Under WinDVD the story is no different; the new GPUs (as expected) do the same amount of decode work as the old ones and CPU utilization remains unchanged. Given that VC1 is predominantly an HD-DVD codec, the CPU utilization figures we're seeing here aren't terrible.
While NVIDIA has stated that it will look into adding a VC1 compatible BSP in future GPU revisions, it's not absolutely necessary today.
64 Comments
JarredWalton - Saturday, April 28, 2007 - link
The peak numbers may not be truly meaningful other than indicating a potential for dropped frames. Average CPU utilization numbers are meaningful, however. Unlike SETI, there is a set amount of work that needs to be done in a specific amount of time in order to successfully decode a video. The video decoder can't just give up CPU time to lower CPU usage, because the content has to be handled or frames will be dropped.

The testing also illustrates the problem with ATI's decode acceleration on their slower cards, though: the X1600 XT is only slightly faster than doing all the work on the CPU in some instances, and in the case of VC1 it may actually add more overhead than acceleration. Whether that's due to ATI's drivers/hardware or the software application isn't clear, however. Looking at the WinDVD vs. PowerDVD figures, the impact of the application used is obviously not negligible at this point.
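The frame-budget reasoning above can be sketched in a few lines. This is an illustrative sketch only, with made-up per-frame decode times; the function name and the sample trace are hypothetical, not AnandTech's measurement methodology. The point is that at a fixed frame rate each frame has a hard time budget, so any frame whose decode time exceeds the budget is dropped no matter how low the average load is.

```python
# Illustrative sketch (hypothetical numbers): why decode work is a fixed
# budget per frame. At 24 fps each frame must be decoded within ~41.7 ms;
# the decoder cannot "give up" CPU time without dropping frames.

FPS = 24
FRAME_BUDGET_MS = 1000.0 / FPS  # ~41.7 ms per frame at 24 fps

def playback_stats(decode_times_ms):
    """Return (average load %, peak load %, dropped frame count)
    for a list of per-frame decode times in milliseconds."""
    loads = [t / FRAME_BUDGET_MS * 100.0 for t in decode_times_ms]
    avg = sum(loads) / len(loads)
    peak = max(loads)
    # A frame that takes longer than its budget misses its deadline.
    dropped = sum(1 for t in decode_times_ms if t > FRAME_BUDGET_MS)
    return avg, peak, dropped

# Hypothetical trace: mostly cheap frames plus one expensive frame.
trace = [20.0, 22.0, 19.0, 45.0, 21.0, 20.0]
avg, peak, dropped = playback_stats(trace)
print(f"avg {avg:.1f}%  peak {peak:.1f}%  dropped {dropped}")
```

The sketch shows why both numbers matter: the average load tells you how much CPU headroom is left for other tasks, while a single over-budget frame (peak above 100%) is enough to drop a frame even when the average looks comfortable.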
BigLan - Saturday, April 28, 2007 - link
Does the 8600 also accelerate x264 content? It's looking like x264 will become the successor to Xvid, so if these cards can, they'll be the obvious choice for HD HTPCs. I guess the main question would be whether WinDVD or PowerDVD can play x264. I suspect they can't, but Nero ShowTime should be able to.
MrJim - Tuesday, May 8, 2007 - link
Accelerating x264 content would be great, but I don't know what the big media companies would think about that; maybe ATI or NVIDIA will lead the way, hopefully.
Xajel - Saturday, April 28, 2007 - link
I'm just asking why those enhancements are not in the higher-end 8800 GPUs. I know the 8600 will be used in HTPCs more than the 8800, but that's just not a good reason to leave them out!
Axbattler - Saturday, April 28, 2007 - link
Those cards came out five months after the 8800, long enough for them to add the tech, it seems. I'd expect them in the 8900 (or whatever NVIDIA names their refresh), though. Actually, it would be interesting to see if they add them to the 8800 Ultra.
Xajel - Saturday, April 28, 2007 - link
I don't expect the Ultra to have them; AFAIK the Ultra is just a tweaked version of the GTX with higher clocks for both core and RAM. I can expect them in the successor to my 7950 GT, though.
Spacecomber - Friday, April 27, 2007 - link
I'm not sure I understand why NVIDIA doesn't offer an upgraded version of its own decoder software instead of relying on other software companies to put something together that works with its hardware.
thestain - Friday, April 27, 2007 - link
http://www.newegg.com/product/product.asp?item=N82...
All this tech jock sniffing with the latest and greatest, but this old reliable is a better deal, isn't it? For watching movies, for the ordinary non-owner of the still-expensive HD DVD players and HD DVDs, for standard-definition content, even without the nice improvements NVIDIA has made, it seems to me that the old tech still does a pretty good job.
What do you think of this ole 6600 compared to the 8600 in terms of price paid for the performance you are going to see and enjoy in reality?
DerekWilson - Saturday, April 28, 2007 - link
The key line there is "if you have a decent CPU", which means a Core 2 Duo E6400. For people with slower CPUs, the 6600 will not cut it, and the 8600 GT/8500 GT will be the way to go.
The AES-128 step still needed to be done on older hardware (as it needs to decrypt the data stream sent to it by the CPU), but using dedicated hardware rather than the shader hardware for this should help save power or free up resources for other shader processing (post-processing such as noise reduction, etc.).
Treripica - Friday, April 27, 2007 - link
I don't know if this is too far off-topic, but what PSU was used for testing?