HD Video Playback: H.264 Blu-ray on the PC
by Derek Wilson on December 11, 2006 9:50 AM EST - Posted in GPUs
X-Men: The Last Stand CPU Overhead
The first benchmark we will see compares the CPU utilization of our X6800 when paired with each of our graphics cards. While we didn't test multiple variations of each card this time, we did test the reference clock speeds for each type. Based on our initial HDCP roundup, we can say that overclocked versions of these NVIDIA cards will see lower CPU utilization. ATI hardware doesn't seem to benefit from higher clock speeds. We have also included CPU utilization for the X6800 without any help from the GPU for reference.
The leaders of the pack are the NVIDIA GeForce 8800 series cards. While the 7 Series hardware doesn't do as well, we can see that clock speed does affect video decode acceleration with these cards. It is unclear whether this will continue to be a factor with the 8 Series, as the results for the 8800 GTX and GTS don't show a difference.
ATI hardware is very consistent, but it just doesn't reduce CPU utilization as much as NVIDIA hardware. This is different from what our MPEG-2 tests indicated. We do still see a marked improvement over our unassisted decode performance test, which is good news for ATI hardware owners.
The second test we ran explores how different CPUs perform when decoding X-Men 3. We used NVIDIA's 8800 GTX and ATI's X1950 XTX in order to determine best and worst case scenarios for each processor. The following data isn't based on average CPU utilization, but on maximum CPU utilization. This gives us an indication of whether or not any frames have been dropped: if CPU utilization never hits 100%, we should always have smooth video. The analog to maximum CPU utilization in game testing is minimum framerate: both tell us the worst case scenario.
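For readers who want to gather similar numbers themselves, here is a minimal sketch of this kind of logging in Python. It relies on the third-party psutil package and is only an illustration of the method, not the instrumentation used for the results in this article: it samples system-wide CPU utilization once per second during playback and reports both the average and the peak.

# Minimal sketch: log average and peak CPU utilization while a movie plays.
# Requires the third-party psutil package; start playback, then run this script.
import psutil

def log_cpu_utilization(duration_seconds=120, interval=1.0):
    samples = []
    for _ in range(int(duration_seconds / interval)):
        # cpu_percent(interval=...) blocks for the interval and returns the
        # system-wide utilization over that window, as a percentage.
        samples.append(psutil.cpu_percent(interval=interval))
    return sum(samples) / len(samples), max(samples)

if __name__ == "__main__":
    average, peak = log_cpu_utilization()
    print(f"Average CPU utilization: {average:.1f}%")
    print(f"Maximum CPU utilization: {peak:.1f}% (hitting 100% suggests dropped frames)")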
While only the E6700 and X6800 are capable of decoding our H.264 movie without help, we can confirm that GPU decode acceleration will allow us to use a slower CPU in order to watch HD content on our PC. The X1950 XTX clearly doesn't help as much as the 8800 GTX, but both make a big difference.
86 Comments
Stereodude - Wednesday, December 13, 2006 - link
Also, there's this post on AVSforum: http://www.avsforum.com/avs-vb/showthread.php?p=91... The poster had no problems playing back X-Men 3 with a "P4 3.2Ghz HT system and a Radeon X1950Pro". Clearly a 3.2GHz HT P4 isn't nearly as powerful as any of those C2D processors, nor is the X1950Pro as powerful as the various nVidia cards.
Stereodude - Wednesday, December 13, 2006 - link
Perhaps, but nVidia intentionally sent them an H.264 torture test disc that's not available in the US. That also doesn't explain why the 7600GT nearly cut the CPU usage in half in one review but only helped by 20% in the other. Also, nVidia says an E6330 or X2 4200+ with a 7600GT is adequate for the most demanding H.264 titles. That sure doesn't agree with the conclusion of this AnandTech piece, which says you need an 8800GTX card to use an E6300.
balazs203 - Wednesday, December 13, 2006 - link
In the PC Perspective article they say: "In our testing the H.264 bit rates were higher than the VC-1 rates, in the high 18-19 Mbps up to 22 Mbps in some cases."
That is about half the maximum bitrate of the disc AnandTech tested.
Stereodude - Wednesday, December 13, 2006 - link
Since when does bitrate = difficulty to decode?
DerekWilson - Thursday, December 14, 2006 - link
Bitrate does equal difficulty to decode, because a higher bitrate means there is more to do per frame.
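As a rough back-of-the-envelope illustration of that point (the 24 fps figure and the bitrates below are nominal examples, not measured values from these discs), the amount of data the decoder has to work through per frame scales directly with the stream's bitrate:

# Rough illustration: a higher bitrate means more bits to decode per frame.
# The frame rate and bitrates here are nominal examples, not measured values.
def bits_per_frame(bitrate_mbps, fps=24.0):
    return bitrate_mbps * 1_000_000 / fps

for mbps in (18, 22, 40):
    print(f"{mbps} Mbps at 24 fps -> {bits_per_frame(mbps) / 1e6:.2f} Mbit per frame")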
frogge - Tuesday, December 12, 2006 - link
64-bit OS vs 32-bit...
puffpio - Tuesday, December 12, 2006 - link
Will you start using more modern CPU tests for H.264 encoding? Currently you use QuickTime, right? That doesn't use many of H.264's advanced features. Have you considered using x264 (an open-source H.264 encoder that produces the best-quality encodes of any publicly available H.264 encoder) with a standard set of encoding parameters?
Nothing taxes a CPU better than video encoding :)
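A minimal harness along those lines might look like the sketch below; the input file name is a placeholder and the x264 options shown are only examples (check x264 --help for the flags your build supports).

# Illustrative sketch: time an x264 encode as a CPU benchmark.
# The input file name is a placeholder; the options are examples only.
import subprocess
import time

def time_x264_encode(source="test_clip.y4m", output="out.264"):
    cmd = ["x264", "--crf", "20", "--threads", "0", "-o", output, source]
    start = time.time()
    subprocess.run(cmd, check=True)  # raises if x264 exits with an error
    return time.time() - start

if __name__ == "__main__":
    elapsed = time_x264_encode()
    print(f"Encode finished in {elapsed:.1f} seconds")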
rain128 - Tuesday, December 12, 2006 - link
I'm a little bit skeptical about those test results, because my home computer played the Deja Vu clip (trailer 1 in 1080p, downloaded from the Apple website) at 40-60% CPU usage with the current version of the NVIDIA drivers. With older drivers (I don't know the exact version; I installed them over a year ago) the average was between 50-70%. For a decoder I used PowerDVD 7: I installed the trial, and even though Cyberlink's webpage says the H.264 codec doesn't work in the trial version, I had no problems with it. GSpot reported Cyberlink's H.264 codec as the default rendering path. For fullscreen playback I used BSPlayer; strangely, Windows Media Player didn't want to play the clip even though all the other players had no problem finding the installed codecs.
TIP: with BSPlayer you can see the dropped frame count.
Renoir - Tuesday, December 12, 2006 - link
The H.264 clips on the Apple website tend to have lower bit rates than those found on Blu-ray discs, so that explains your CPU usage.
DerekWilson - Tuesday, December 12, 2006 - link
This is what we have found as well, and it's also why looking at BD and HD DVD performance is more important than looking at downloaded clips as we have in the past.