HD Video Playback: H.264 Blu-ray on the PC
by Derek Wilson on December 11, 2006 9:50 AM EST - Posted in GPUs
X-Men: The Last Stand CPU Overhead
The first benchmark compares the CPU utilization of our X6800 when paired with each of our graphics cards. While we didn't test multiple variations of each card this time, we did test the reference clock speeds for each type. Based on our initial HDCP roundup, we can say that overclocked versions of these NVIDIA cards will see lower CPU utilization; ATI hardware doesn't seem to benefit from higher clock speeds. We have also included CPU utilization for the X6800 without any help from the GPU for reference.
The leaders of the pack are the NVIDIA GeForce 8800 series cards. While the 7 Series hardware doesn't do as well, we can see that clock speed does affect video decode acceleration with these cards. It is unclear whether this will continue to be a factor with the 8 Series, as the results for the 8800 GTX and GTS don't show a difference.
ATI hardware is very consistent across clock speeds, but it just doesn't reduce CPU load as much as NVIDIA hardware. This is different from what our MPEG-2 tests indicated. We do still see a marked improvement over our unassisted decode performance test, which is good news for ATI hardware owners.
The second test we ran explores how different CPUs perform when decoding X-Men 3. We used NVIDIA's 8800 GTX and ATI's X1950 XTX in order to establish the best and worst case scenarios for each processor. The following data isn't based on average CPU utilization, but on maximum CPU utilization. This gives us an indication of whether or not any frames have been dropped: if CPU utilization never hits 100%, we should always have smooth video. The analog to maximum CPU utilization in game testing is minimum framerate: both tell us the worst case scenario.
While only the E6700 and X6800 are capable of decoding our H.264 movie without help, we can confirm that GPU decode acceleration will allow us to use a slower CPU in order to watch HD content on our PC. The X1950 XTX clearly doesn't help as much as the 8800 GTX, but both make a big difference.
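For readers who want to reproduce this kind of measurement at home, here is a minimal sketch of logging average and peak CPU utilization while a clip plays back. It assumes the psutil Python package, and the duration and sampling interval are illustrative; this is not the tool used to generate the numbers in this article.

```python
# Minimal sketch: sample overall CPU utilization during playback and report
# the average and the peak. Assumes the psutil package is installed.
import time
import psutil

def sample_cpu(duration_s=120, interval_s=0.5):
    samples = []
    end = time.time() + duration_s
    while time.time() < end:
        # cpu_percent(interval=...) blocks for the interval and returns
        # system-wide utilization as a single percentage.
        samples.append(psutil.cpu_percent(interval=interval_s))
    return sum(samples) / len(samples), max(samples)

if __name__ == "__main__":
    avg, peak = sample_cpu()
    print(f"Average CPU utilization: {avg:.1f}%")
    print(f"Maximum CPU utilization: {peak:.1f}%")
    if peak >= 100.0:
        print("CPU saturated at some point -- dropped frames are likely.")
```

A peak that never reaches 100% corresponds to the "no dropped frames" criterion described above, in the same way minimum framerate captures the worst case in game testing.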
86 Comments
DerekWilson - Monday, December 11, 2006 - link
cool -- we'll have to investigate this.
liquidaim - Monday, December 11, 2006 - link
Did you use the 3d clocks for ati cards or the normal 2d? Just wondering if that was taken into account for the MPEG-2 tests previously and not here, which is why ati cards didn't perform as well.
Not a fanboy, just asking for clarification.
DerekWilson - Monday, December 11, 2006 - link
I don't believe you can specify what clock ATI uses when decoding video -- I think this is handled internally. It may be that the hardware that helps accelerate MPEG-2 the most is tied to clock, while the majority of what benefits H.264 is not. We'll have to dig further to really know.
pata2001 - Monday, December 11, 2006 - link
It was the same thing when MPEG2 came out. Heck, even in the old days of 386s, PCs were too slow to decode MPEG1 VCDs, to the point that we had separate MPEG1 decoder cards. Remember when DVD came out, there was a big push for GPU accelerated hardware iDCT. Today, most CPUs are powerful enough to decode MPEG2 on their own. The same thing again with MPEG4. By the time 4-core/8-core CPUs become mainstream, we won't be hearing about the need for GPU acceleration as much anymore. And by that time, there will probably be the next next-gen HD format that is too demanding for the CPUs of that time. Cycle and repeat.
DerekWilson - Monday, December 11, 2006 - link
MPEG-4 contains many advanced features not currently in use. We first saw MPEG-4 part 2 in the form of DivX, but MPEG-4 part 10 takes quite a bit more work. Some of the profiles and levels of H.264/AVC will be too much for quad core CPUs to handle. These may not be adopted by studios for use on physical media, but the codec itself is very forward looking.
But in the end, you are correct -- the entire MPEG-4 spec will be a simple matter in a handful of years.
This is the case with everything though. Even if something will one day pose no trouble to computers, we can't ignore current performance. Studios must balance current performance with the flexibility to support the type of image quality they will want near the end of the life cycle of BD and HDDVD formats.
I always look forward to this kind of thing, and it's why I test hardware -- I want to know what my PC can currently do with what is out there.
I suppose the "news" is that we've got something everyone wouldn't mind having that very few will be able to use for the time being.
Staples - Monday, December 11, 2006 - link
This is good news that MPEG2 won't become the standard for BD. Until today, I figured all movies were in MPEG2 and if this became standard and won the format war, we would be stuck with what could arguably give a worse picture than HDDVD using VC1.
How do you know which movies are 50GB and/or H.264? Does it usually say on the box or does the player tell you?
DerekWilson - Monday, December 11, 2006 - link
In our experience with Blu-ray, the format is listed on the box. HDDVDs have been a little more cryptic and we are having to ask for help determining format.
For our X-Men BD, the back of the case stated AVC @ 18 Mbps.
I don't think disk size has been listed on the case, and we've had to ask for this info from industry sources.
CrystalBay - Monday, December 11, 2006 - link
Are AMD X2's unable to efficiently work in these scenarios?
DerekWilson - Monday, December 11, 2006 - link
AMD CPUs will very likely perform worse than Core 2 Duo CPUs. We are considering doing a CPU comparison.
Xajel - Monday, December 11, 2006 - link
It's logical to be worse, but most users are using these processors and they really wanna know if their rigs can handle it... it's not about AMD only, there's plenty of Pentium 4 and Pentium D in these rigs, even Athlon XP still rocks in some...
What about a core scaling test? I mean:
1- Single Core
2- Single Core with Hyper Threading
3- Two Cores
4- Two Cores with Hyper Threading
5- Four Cores
it will be hard to do this scaling as they are not all from one architecture (1 to 4 are NetBurst with Pentium 4, Pentium D, and Pentium EE, while the last is the Core arch).
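One way to approximate that kind of core scaling on a single machine, without swapping CPUs, is to pin the player process to a subset of logical CPUs. The sketch below is a hypothetical illustration using psutil; the process name "PowerDVD.exe" is only an example, and affinity can only limit core count -- it cannot toggle Hyper-Threading (that requires the BIOS) or change the underlying architecture.

```python
# Hypothetical sketch: approximate a core-scaling test by restricting the
# player process to a subset of logical CPUs. Assumes psutil; the process
# name "PowerDVD.exe" is only an example and depends on the player used.
import psutil

def pin_process(name, cpu_ids):
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] and proc.info["name"].lower() == name.lower():
            proc.cpu_affinity(cpu_ids)  # limit the process to these CPUs
            print(f"Pinned {name} (pid {proc.pid}) to CPUs {cpu_ids}")

if __name__ == "__main__":
    pin_process("PowerDVD.exe", [0])      # emulate a single-core run
    # pin_process("PowerDVD.exe", [0, 1]) # emulate a dual-core run
```

On Windows the same effect can be had from Task Manager's "Set Affinity" option, so a quick single-core vs. dual-core comparison is possible without any scripting.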