HD Video Playback: H.264 Blu-ray on the PC
by Derek Wilson on December 11, 2006 9:50 AM EST - Posted in GPUs
X-Men: The Last Stand CPU Overhead
The first benchmark we will see compares the CPU utilization of our X6800 when paired with each one of our graphics cards. While we didn't test multiple variations of each card this time, we did test the reference clock speeds for each type. Based on our initial HDCP roundup, we can say that overclocked versions of these NVIDIA cards will see better CPU utilization. ATI hardware doesn't seem to benefit from higher clock speeds. We have also included CPU utilization for the X6800 without any help from the GPU for reference.
The leaders of the pack are the NVIDIA GeForce 8800 series cards. While the 7 Series hardware doesn't do as well, we can see that clock speed does affect video decode acceleration with these cards. It is unclear whether this will continue to be a factor with the 8 Series, as the results for the 8800 GTX and GTS don't show a difference.
ATI hardware is very consistent, but it just doesn't improve performance as much as NVIDIA hardware does. This is different from what our MPEG-2 tests indicated. We do still see a marked improvement over our unassisted decode performance test, which is good news for ATI hardware owners.
The second test we ran explores how different CPUs perform when decoding X-Men 3. We used NVIDIA's 8800 GTX and ATI's X1950 XTX in order to establish a best and worst case scenario for each processor. The following data isn't based on average CPU utilization, but on maximum CPU utilization. This gives us an indication of whether or not any frames have been dropped: if CPU utilization never hits 100%, we should always have smooth video. The analog to max CPU utilization in game testing is minimum framerate: both tell us the worst case scenario.
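For readers who want to capture this kind of worst-case number themselves, the sketch below shows one way to log average and maximum CPU utilization while a clip plays. It uses Python's psutil package purely as an illustration; it is an assumption on our part, not the monitoring tool used for the results here.

```python
# A minimal sketch (assumes the psutil package is installed): sample
# system-wide CPU utilization once per second while the clip plays,
# then report the average and the maximum. The maximum is the "worst
# case" number discussed above; if it ever reaches 100%, frames may drop.
import psutil

samples = []
try:
    while True:
        # cpu_percent(interval=1.0) blocks for one second and returns
        # utilization over that window, averaged across all cores.
        samples.append(psutil.cpu_percent(interval=1.0))
except KeyboardInterrupt:
    # Stop sampling with Ctrl+C when playback ends.
    pass

if samples:
    print(f"average CPU utilization: {sum(samples) / len(samples):.1f}%")
    print(f"maximum CPU utilization: {max(samples):.1f}%")
```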
While only the E6700 and X6800 are capable of decoding our H.264 movie without help, we can confirm that GPU decode acceleration will allow us to use a slower CPU in order to watch HD content on our PC. The X1950 XTX clearly doesn't help as much as the 8800 GTX, but both make a big difference.
86 Comments
Renoir - Tuesday, December 12, 2006 - link
Which is exactly the reason why I've been waiting so long for an article like this!
redpriest_ - Tuesday, December 12, 2006 - link
Have you guys tried this configuration out? I have 2 GeForce 8800 GTXs in SLI, and using either the 97.02 or 97.44 driver and a 30" Dell monitor with a resolution capability of 2560x1600, I found I cannot play Blu-ray content at anything higher than a desktop resolution of 1280x800 (exactly half that resolution because of the way the dual-DVI bandwidth is set up). This means I cannot even experience full 1080p! Try anything higher than that and Cyberlink BD complains and says, set your resolution to less than 1920x1080. This sucks. I hope there is a fix on the way.
redpriest_ - Tuesday, December 12, 2006 - link
I should add I found this on nvidia's website: "The Dell 3007WFP and Hewlett Packard LP3065 30" LCD monitors require a graphics card with a dual-link DVI port to drive the ultra high native resolution of 2560x1600 which these monitors support. With the current family of NVIDIA GeForce 8 & 7 series HDCP capable GPUs, playback of HDCP content is limited to single-link DVI connection only. HDCP is disabled over a dual-link DVI connection. The highest resolution the Dell 30" 3007WFP supports in single-link DVI mode is 1280x800 and therefore this is the highest resolution which HDCP playback is supported in single-link DVI mode on current GeForce 8 & 7 series HDCP capable GPUs. On other 3rd party displays with a native resolution of 1920x1200 and below, the graphics card interfaces with the monitor over a single-link DVI connection. In this case, playback of content protected Blu-Ray and HD-DVD movies is possible on HDCP capable GeForce 8 & 7 series GPUs."
Someone needs to tip nvidia and other graphics card manufacturers that this is unacceptable. If I shell out $4000 ($2000 monitor, $1400 for two 8800 GTXs in SLI, and $600 Blu-ray drive) IT SHOULD WORK.
DerekWilson - Tuesday, December 12, 2006 - link
agreed, but don't blame NVIDIA -- blame the MPAA ... HDCP was designed around single link dvi and hdmi connections and wasn't made to work with dual link in the first place. I wouldn't be surprised if the problem NVIDIA is having has absolutely nothing to do with their hardware's capability. In addition, dell's design is flawed -- they only support resolutions above 12x8 with dual link dvi. it may have taken a little extra hardware, but there is no reason that they should not support up to at least 1920x1080 over a single link.
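As a rough sanity check on that last point: single-link DVI tops out at a 165MHz pixel clock, and 1920x1080 at 60Hz fits under that with reduced-blanking timings. The numbers in the sketch below are generic CVT-RB totals (an assumption), not anything pulled from the 3007WFP's actual EDID, so take it as an illustration only.

```python
# Rough check that 1920x1080 @ 60Hz fits on a single DVI link.
# Totals below are generic CVT reduced-blanking values (assumed),
# not the Dell 3007WFP's actual timings.
SINGLE_LINK_MAX_MHZ = 165.0  # single-link DVI pixel clock ceiling

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock a mode requires, in MHz."""
    return h_total * v_total * refresh_hz / 1e6

# CVT-RB totals for 1920x1080 @ 60 Hz: 2080 pixels per line, 1111 lines
clk = pixel_clock_mhz(2080, 1111, 60)
verdict = "fits within" if clk <= SINGLE_LINK_MAX_MHZ else "exceeds"
print(f"1920x1080@60 needs ~{clk:.1f} MHz, which {verdict} "
      f"single-link DVI's {SINGLE_LINK_MAX_MHZ:.0f} MHz limit")
```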
ssiu - Wednesday, December 13, 2006 - link
I would blame the 30" monitors -- they should at least support 1080p in single-link DVI mode, just like the way 24" monitors do.Renoir - Tuesday, December 12, 2006 - link
Wasn't blaming anyone in particular (although I'm always happy to bash the MPAA), just noting how stupid the situation is. Supporting a max of 12x8 over single link is inexcusable as far as I'm concerned.
DerekWilson - Thursday, December 14, 2006 - link
then the problem you have is specifically with Dell.
Renoir - Tuesday, December 12, 2006 - link
That is ridiculous! That's the problem with tech: you can't take anything for granted these days. Things that seem obvious and sensible often turn out to be not as they seem. What a joke!
poisondeathray - Monday, December 11, 2006 - link
sorry if this has been answered already... is PowerDVD multithreaded? is your CPU utilization balanced across both cores? what effect does a quadcore chip have on CPU utilization?
thx in advance
Renoir - Tuesday, December 12, 2006 - link
poisondeathray, you read my mind! I have exactly the same question.