HD Video Playback: H.264 Blu-ray on the PC
by Derek Wilson on December 11, 2006 9:50 AM EST
Posted in: GPUs
X-Men: The Last Stand CPU Overhead
The first benchmark compares the CPU utilization of our X6800 when paired with each of our graphics cards. While we didn't test multiple variants of each card this time, we did test each type at its reference clock speed. Based on our initial HDCP roundup, we can say that overclocked versions of these NVIDIA cards will see lower CPU utilization. ATI hardware doesn't seem to benefit from higher clock speeds. For reference, we have also included the X6800's CPU utilization without any help from the GPU.
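For readers who want to reproduce this kind of measurement at home, here is a minimal sketch of one way to log utilization while the player is decoding. It assumes the Python psutil package; this is an illustration, not the tool used for our testing.

```python
import statistics
import psutil  # assumed available: pip install psutil

def sample_cpu_utilization(duration_s=60, interval_s=1.0):
    """Poll system-wide CPU utilization once per interval while the
    player is decoding, and return the list of samples."""
    samples = []
    for _ in range(int(duration_s / interval_s)):
        # cpu_percent() blocks for interval_s and returns the average
        # utilization (0-100, across all cores) over that window.
        samples.append(psutil.cpu_percent(interval=interval_s))
    return samples

if __name__ == "__main__":
    samples = sample_cpu_utilization()
    print(f"average CPU utilization: {statistics.mean(samples):.1f}%")
```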
The leaders of the pack are the NVIDIA GeForce 8800 series cards. While the 7 Series hardware doesn't do as well, we can see that clock speed does affect video decode acceleration with these cards. It is unclear whether this will continue to be a factor with the 8 Series, as the results for the 8800 GTX and GTS don't show a difference.
ATI hardware is very consistent, but it just doesn't improve performance as much as NVIDIA hardware does. This is different from what our MPEG-2 tests indicated. We do still see a marked improvement over unassisted decode, which is good news for ATI hardware owners.
The second test we ran explores how different CPUs perform when decoding X-Men 3. We used NVIDIA's 8800 GTX and ATI's X1950 XTX in order to determine best and worst case scenarios for each processor. The following data isn't based on average CPU utilization, but on maximum CPU utilization. This gives us an indication of whether any frames have been dropped: if CPU utilization never hits 100%, we should always have smooth video. The analog to max CPU utilization in game testing is minimum framerate; both tell us the worst case scenario.
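To make the average/maximum distinction concrete, this small helper (continuing the hypothetical sampling sketch above) reduces a run of samples to both statistics; the 99% ceiling is an arbitrary illustrative threshold, not a figure from our testing.

```python
def summarize(samples, ceiling=99.0):
    """Report average and maximum utilization; a maximum pinned at or
    near 100% suggests the decoder may have dropped frames."""
    return {
        "average": sum(samples) / len(samples),
        "max": max(samples),
        "possible_drops": max(samples) >= ceiling,
    }

print(summarize([62.0, 71.5, 88.0, 100.0, 93.5]))
# {'average': 83.0, 'max': 100.0, 'possible_drops': True}
```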
While only the E6700 and X6800 are capable of decoding our H.264 movie without help, we can confirm that GPU decode acceleration will allow us to use a slower CPU in order to watch HD content on our PC. The X1950 XTX clearly doesn't help as much as the 8800 GTX, but both make a big difference.
86 Comments
Xajel - Monday, December 11, 2006 - link
I don't know why Anand these days doesn't care about AMD. I just hope they don't think that everybody in the world has a Core 2... I'm not a fan of AMD, but the benefit of this kind of article is seeing how much power you need to handle these scenarios, and I guess the majority of people today still have older CPUs.
These tests should, in my opinion, cover a wider range of CPUs: Pentium 4 (with and without HT), Pentium D, Athlon 64, Athlon 64 X2, even the Quad FX platform. This would help readers know whether their system can handle these things or not.
michael2k - Monday, December 11, 2006 - link
I would hazard that most AMD CPUs won't be fine, if most Intel CPUs won't be fine and the E6600 outclasses all AMD CPUs... But I was just looking at the AMD/C2D comparison from July, and the newest AMD CPUs may do fine.
mino - Monday, December 11, 2006 - link
Same here. What about Quad FX (under Vista)?
At $500, the FX-70 is at the level of the E6700...
AlexWade - Monday, December 11, 2006 - link
The HD DVD 360 add-on works on a PC, so why wasn't that tested too?

DerekWilson - Monday, December 11, 2006 - link
We are going to do a follow-up using the 360 HD DVD drive (actually, I'm working on it right now).

ShizNet - Tuesday, December 12, 2006 - link
Great! What file footprint advantage is there in H.264? 1/4? 1/6? 1/10 compared to MPEG-2? And if so, can't you store H.264 on an ol' DVD? I've read HD DVD/BD has way more space to offer than the movie alone needs; for that reason HD DVD/BD will include games, extra endings, rating-film options, trailers... Great write-up as usual.
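A rough back-of-envelope answer, using illustrative bitrates rather than anything measured for this article: H.264 is generally credited with roughly two to three times the compression efficiency of MPEG-2 at comparable quality, so its footprint is closer to 1/2 or 1/3 than 1/10.

```python
def movie_size_gb(bitrate_mbps, runtime_min):
    # size in GB = bits/sec * seconds / 8 bits-per-byte / 1e9
    return bitrate_mbps * 1e6 * runtime_min * 60 / 8 / 1e9

print(f"MPEG-2 @ 20 Mbps, 120 min: {movie_size_gb(20, 120):.1f} GB")  # 18.0 GB
print(f"H.264  @  8 Mbps, 120 min: {movie_size_gb(8, 120):.1f} GB")   #  7.2 GB
# At these hypothetical rates the H.264 cut is near dual-layer DVD
# capacity (8.5 GB), so a short HD title could plausibly fit on one.
```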
artifex - Tuesday, December 12, 2006 - link
I would love to see that article include visual comparisons with a 360 running the HD-DVD adapter. If I buy the adapter, I may be using it on both.

therealnickdanger - Monday, December 11, 2006 - link
Yeah, that is curious. Besides, if you're serious about HD movies, the films with the highest picture quality are currently all encoded in VC-1. Sure, H.264 has the potential to be the best, but that hasn't been demonstrated yet. VC-1 also takes less grunt to decode, so the article could cater to many more users than just X6800 owners... just a thought.
Orbs - Monday, December 11, 2006 - link
I'd love to see that tested and compared.

Eug - Monday, December 11, 2006 - link
If an E6400 at 2.13 GHz is OK, is a T7400 at 2.16 GHz also OK? The T7400 is slightly faster, but it has a slower memory bus.