NVIDIA GeForce 8600: Full H.264 Decode Acceleration
by Anand Lal Shimpi on April 27, 2007 4:34 PM EST - Posted in GPUs
The Test
We chose to test with four NVIDIA GPUs and two ATI GPUs. From NVIDIA we used the GeForce 8800 GTX, 8600 GTS, 8600 GT and the 7950 GT. The 8800 GTX and 7950 GT have the same VP as the rest of the GeForce 7 line, so they should offer fairly similar performance to everything else in NVIDIA's lineup that runs above 400MHz (remember that NVIDIA's VP stops working at core clocks below 400MHz). We included both 8600 cards to confirm NVIDIA's claim that the two 8600s would perform identically when it comes to H.264 decoding.
ATI uses its shader units to handle video decode, so there's more performance variance between GPUs. ATI only guarantees decode acceleration for 720p or higher content on X1600 or faster GPUs, and thus we included two parts in this review: a Radeon X1600 XT and a Radeon X1950 XTX; in theory the latter should be a bit better at decode acceleration.
For our host CPU we chose the recently released Intel Core 2 Duo E6320, running at 1.86GHz with a 4MB L2 cache. As always, we reported both average and maximum CPU utilization figures. There will be some variability between numbers since we're dealing with manual measurements of CPU utilization, but you should be able to get an idea of basic trends.
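Since we report both average and maximum figures from periodic manual readings of CPU utilization, the same summary can be sketched in a few lines of code. This is a hypothetical stdlib-only helper (the `summarize_utilization` function and the sample readings are our own illustration, not part of the actual test procedure):

```python
# Hypothetical helper: condense periodic CPU-utilization readings (percent)
# into the average and maximum figures reported in the charts.
from statistics import mean

def summarize_utilization(samples):
    """Return (average, peak) CPU utilization from a list of percent samples."""
    if not samples:
        raise ValueError("need at least one sample")
    return round(mean(samples), 1), max(samples)

# Example: readings taken once per second during a stressful playback scene
readings = [62.0, 71.5, 88.0, 100.0, 93.5, 70.0]
avg, peak = summarize_utilization(readings)
print(avg, peak)  # 80.8 100.0
```

Manual sampling introduces some jitter, which is why we emphasize trends rather than exact percentages.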
We chose three HD-DVD titles for our performance test: Yozakura (H.264), The Interpreter (H.264) and Serenity (VC1). Yozakura is a Japanese HD-DVD that continues to be the most stressful test we've encountered; even on some of the fastest Core 2 systems it will still peak at 100% CPU utilization. Keep in mind that the NVIDIA GPUs don't handle CAVLC/CABAC for VC1 decode as VP2 is hardwired for H.264 decode, thus our VC1 test shouldn't show any tremendous improvement thanks to the new GPUs.
We used the Microsoft Xbox 360 HD-DVD drive for all of our tests.
System Test Configuration
CPU: Intel Core 2 Duo E6320 (1.86GHz/4MB)
Motherboard: ASUS P5B Deluxe
Chipset: Intel P965
Chipset Drivers: Intel 8.1.1.1010
Hard Disk: Seagate 7200.7 160GB SATA
Memory: Corsair XMS2 DDR2-800 4-4-4-12 (1GB x 4)
Video Cards: NVIDIA GeForce 8800 GTX, NVIDIA GeForce 8600 GTS, NVIDIA GeForce 8600 GT, NVIDIA GeForce 7950 GT, ATI Radeon X1950 XTX, ATI Radeon X1600 XT
Video Drivers: ATI Catalyst 7.4, NVIDIA ForceWare 158.16
Desktop Resolution: 1920 x 1080 - 32-bit @ 60Hz
OS: Windows Vista Ultimate 32-bit
64 Comments
JarredWalton - Saturday, April 28, 2007 - link
The peak numbers may not be truly meaningful other than indicating a potential for dropped frames. Average CPU utilization numbers are meaningful, however. Unlike SETI, there is a set amount of work that needs to be done in a specific amount of time in order to successfully decode a video. The video decoder can't just give up CPU time to lower CPU usage, because the content has to be handled or frames will be dropped.

The testing also illustrates the problem with ATI's decode acceleration on their slower cards, though: the X1600 XT is only slightly faster than doing all the work on the CPU in some instances, and in the case of VC1 it may actually add more overhead than acceleration. Whether that's due to ATI's drivers/hardware or the software application isn't clear, however. Looking at the WinDVD vs. PowerDVD figures, the impact of the application used is obviously not negligible at this point.
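The point about a fixed amount of work per unit of time can be made concrete with a toy frame-budget model. This is a simplification (real decoders buffer ahead, so a single slow frame doesn't necessarily drop), and the function and numbers here are illustrative, not measured:

```python
# Toy model: at a given frame rate each frame has a fixed time budget;
# in this simplified view, any frame whose decode time exceeds the budget
# is counted as dropped (buffering in real decoders softens this).
def dropped_frames(decode_times_ms, fps=24.0):
    budget_ms = 1000.0 / fps  # ~41.7 ms per frame at film rate
    return sum(1 for t in decode_times_ms if t > budget_ms)

# Illustrative per-frame decode costs in milliseconds
times = [30.0, 45.2, 41.0, 50.1, 38.9]
print(dropped_frames(times))  # 2
```

This is why average utilization matters: the decoder must sustain the budget continuously, not just hit it occasionally.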
BigLan - Saturday, April 28, 2007 - link
Does the 8600 also accelerate x264 content? It's looking like x264 will become the successor to xvid, so if these cards can, they'll be the obvious choice for HD-HTPCs.

I guess the main question would be if WinDVD or PowerDVD can play x264. I suspect they can't, but Nero ShowTime should be able to.
MrJim - Tuesday, May 8, 2007 - link
Accelerating x264 content would be great, but I don't know what the big media companies would think about that; maybe ATI or NVIDIA will lead the way, hopefully.

Xajel - Saturday, April 28, 2007 - link
I'm just asking why those enhancements are not in the higher 8800 GPUs? I know the 8600 will be used more in HTPCs than the 8800, but that's just not a good reason to not include them!
Axbattler - Saturday, April 28, 2007 - link
Those cards came out 5 months after the 8800, long enough for them to add the tech, it seems. I'd expect them in the 8900 (or whatever NVIDIA names their refresh), though. Actually, it would be interesting to see if they add it to the 8800 Ultra.

Xajel - Saturday, April 28, 2007 - link
I don't expect the Ultra to have them; AFAIK the Ultra is just a tweaked version of the GTX with higher clocks for both core and RAM. I can expect it in my 7950 GT's successor, though.
Spacecomber - Friday, April 27, 2007 - link
I'm not sure I understand why NVIDIA doesn't offer an upgraded version of their decoder software, instead of relying on other software companies to get something put together to work with their hardware.

thestain - Friday, April 27, 2007 - link
http://www.newegg.com/product/product.asp?item=N82...

All this tech jock sniffing with the latest and greatest, but this old reliable is a better deal, isn't it? For watching movies... for the ordinary non-owner of the still expensive HD DVD players and HD DVDs... for standard definition content... even without the nice improvements NVIDIA has made, it seems to me that the old tech still does a pretty good job.
What do you think of this ole 6600 compared to the 8600 in terms of price paid for the performance you are going to see and enjoy in reality?
DerekWilson - Saturday, April 28, 2007 - link
The key line there is "if you have a decent CPU"... which means a C2D E6400. For people with slower CPUs, the 6600 will not cut it, and the 8600 GT/8500 GT will be the way to go.
The AES-128 step still needed to be done on older hardware (as it needs to decrypt the data stream sent to it by the CPU), but using dedicated hardware rather than the shader hardware to do this should help save power or free up resources for other shader processing (post-processing like noise reduction, etc.).
Treripica - Friday, April 27, 2007 - link
I don't know if this is too far off-topic, but what PSU was used for testing?