HD Video Playback: H.264 Blu-ray on the PC
by Derek Wilson on December 11, 2006 9:50 AM EST - Posted in GPUs
Final Words
We've been hearing for quite some time that Blu-ray and HD DVD movies could prove too much for today's desktop microprocessors; now we finally have the proof. X-Men: The Last Stand, encoded with the H.264/MPEG-4 AVC High Profile at 1080p, requires more processing power to decode than affordable dual core CPUs can deliver. We are at the point where GPU decode acceleration is essentially required with all but the highest end processors in order to achieve an acceptable level of quality while watching HD content on the PC.
NVIDIA hardware performs better under our current set of drivers and the beta build of PowerDVD we are using, but exactly how well GeForce 7 Series hardware handles the decode process is more dependent on the specific card being used than it is with ATI: in general, higher performance NVIDIA cards do better at decoding our H.264 Blu-ray content. The 7950 GX2 doesn't perform on par with the rest of the high end NVIDIA cards because SLI doesn't help with video decode. With the exception of the X1600 Pro, all of the ATI cards we tested performed almost identically.
While there isn't much more to say about performance right now, we do need to consider that we are working with an early release of our player software, and that ATI and NVIDIA are always improving their driver support for video decode acceleration. We can't count on current hardware getting faster, but it is nice to know the possibility exists, and we will continue to track performance with future player and driver updates.
But no matter what we see in the future, NVIDIA has done an excellent job with the 8800 series. G80-based cards will definitely lead the way in HD video decode performance, making it possible to stick with a cheaper CPU and still get a good experience. Of course, nothing about playing HD content on the PC is cheap right now, especially if we are talking about pairing an 8800 with a Blu-ray drive.
For those who don't have the money to build a computer around Blu-ray or HD DVD, a standalone player is the other option. We tested our Samsung player with X-Men: The Last Stand to see if it could handle the demands of an H.264 movie (as any good CE player should), and we were happy to see that the Samsung box didn't seem to have any problems playing our movie.
As for recommendations, based on our testing we would not suggest anything less than an Intel Core 2 Duo E6600 for a system designed to play HD content. The E6400 may work well enough, but not even the 8800 GTX can guarantee zero dropped frames with the E6300. ATI owners will want to lean towards an E6700, but can get away with the E6600 in a pinch. Keep in mind that X-Men: The Last Stand is only one of the first H.264 movies to come out; we may see content that is more difficult to decode in the future, and a faster processor is a good way to build in headroom and ensure a quality HD experience on the PC.
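To put the "dropped frames" criterion in concrete terms: film content on Blu-ray runs at 24 frames per second, which gives the decoder a hard budget of roughly 41.7 ms per frame. A small illustrative calculation (the per-frame decode times below are hypothetical examples, not measurements from our testing):

```python
# Frame time budget for 1080p24 film content (illustrative only).
fps = 24
budget_ms = 1000 / fps    # ~41.7 ms available to decode each frame

# Hypothetical average decode times; a CPU that can't finish inside
# the budget falls behind and frames get dropped. GPU decode assist
# shrinks the CPU's share of this per-frame cost.
for decode_ms in (30, 40, 45):
    verdict = "keeps up" if decode_ms <= budget_ms else "drops frames"
    print(f"{decode_ms} ms/frame vs {budget_ms:.1f} ms budget: {verdict}")
```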
86 Comments
DerekWilson - Monday, December 11, 2006 - link
cool -- we'll have to investigate this.

liquidaim - Monday, December 11, 2006 - link
Did you use the 3D clocks for the ATI cards or the normal 2D clocks? Just wondering if that was taken into account for the MPEG-2 tests previously and not here, which might be why the ATI cards didn't perform as well.
Not a fanboy, just asking for clarification.
DerekWilson - Monday, December 11, 2006 - link
I don't believe you can specify what clock ATI uses when decoding video -- I think this is handled internally. It may be that the hardware that helps accelerate MPEG-2 the most is tied to clock, while the majority of what benefits H.264 is not. We'll have to dig further to really know.

pata2001 - Monday, December 11, 2006 - link
It was the same thing when MPEG-2 came out. Heck, even in the old days of 386s, PCs were too slow to decode MPEG-1 VCDs, to the point that we had separate MPEG-1 decoder cards. Remember when DVD came out, there was a big push for GPU accelerated hardware iDCT. Today, most CPUs are powerful enough to decode MPEG-2 on their own. The same thing is happening again with MPEG-4. By the time 4-core/8-core CPUs become mainstream, we won't be hearing about the need for GPU acceleration as much anymore. And by that time there will probably be a next next-gen HD format that is too demanding for the CPUs of that day. Cycle and repeat.

DerekWilson - Monday, December 11, 2006 - link
MPEG-4 contains many advanced features not currently in use. We first saw MPEG-4 Part 2 in the form of DivX, but MPEG-4 Part 10 takes quite a bit more work. Some of the profiles and levels of H.264/AVC will be too much even for quad core CPUs to handle. These may not be adopted by studios for use on physical media, but the codec itself is very forward looking. In the end, though, you are correct -- the entire MPEG-4 spec will be a simple matter in a handful of years.

This is the case with everything though. Even if something will one day pose no trouble to computers, we can't ignore current performance. Studios must balance current performance with the flexibility to support the kind of image quality they will want near the end of the life cycle of the BD and HD DVD formats.
I always look forward to this kind of thing, and it's why I test hardware -- I want to know what my PC can currently do with what is out there.
I suppose the "news" is that we've got something everyone wouldn't mind having that very few will be able to use for the time being.
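To put rough numbers behind that history, consider raw macroblock throughput: both MPEG-2 and H.264 decode in 16x16-pixel macroblocks, and 1080p24 pushes nearly five times as many of them per second as a DVD, before counting H.264's much higher per-macroblock cost (CABAC entropy decoding, in-loop deblocking). A quick illustrative calculation, not part of our testing:

```python
from math import ceil

def macroblocks_per_second(width, height, fps):
    # Both MPEG-2 and H.264 work on 16x16 macroblocks; dimensions are
    # padded up to a multiple of 16 (1080 lines are coded as 1088).
    return ceil(width / 16) * ceil(height / 16) * fps

dvd = macroblocks_per_second(720, 480, 30)   # MPEG-2 DVD
bd = macroblocks_per_second(1920, 1080, 24)  # 1080p24 Blu-ray film

print(f"DVD:     {dvd:,} macroblocks/s")                    # 40,500
print(f"Blu-ray: {bd:,} macroblocks/s ({bd / dvd:.1f}x)")   # 195,840 (4.8x)
```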
Staples - Monday, December 11, 2006 - link
This is good news that MPEG-2 won't become the standard for BD. Until today, I figured all movies were in MPEG-2, and if that became standard and BD won the format war, we would be stuck with what could arguably give a worse picture than HD DVD using VC-1.

How do you know which movies are 50GB and/or H.264? Does it usually say on the box, or does the player tell you?
DerekWilson - Monday, December 11, 2006 - link
In our experience with Blu-ray, the format is listed on the box. HD DVDs have been a little more cryptic, and we are having to ask for help determining the format. For our X-Men BD, the back of the case stated AVC @ 18 Mbps.

I don't think disc size has been listed on the case, and we've had to ask for this info from industry sources.
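That 18 Mbps figure also makes it easy to ballpark the size of the video stream itself. A rough calculation, assuming the film's roughly 104 minute runtime (audio tracks and extras come on top of this):

```python
# Rough size of the video stream from the bitrate printed on the case.
bitrate_bps = 18e6    # AVC @ 18 Mbps, per the back of the X-Men BD case
runtime_s = 104 * 60  # approximate theatrical runtime

size_gb = bitrate_bps * runtime_s / 8 / 1e9
print(f"~{size_gb:.1f} GB of video")  # ~14 GB for the video alone
```

At roughly 14 GB, the video alone would fit on a single-layer 25GB disc, which is part of why capacity isn't something you can infer from the packaging.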
CrystalBay - Monday, December 11, 2006 - link
Are AMD X2s unable to work efficiently in these scenarios?

DerekWilson - Monday, December 11, 2006 - link

AMD CPUs will very likely perform worse than Core 2 Duo CPUs. We are considering doing a CPU comparison.
Xajel - Monday, December 11, 2006 - link
It's logical that they'd be worse, but most users are running these processors and they really want to know if their rigs can handle it... it's not about AMD only; there are plenty of Pentium 4 and Pentium D chips in these rigs, and even the Athlon XP still rocks in some.
What about a core scaling test? I mean:
1- Single Core
2- Single Core with Hyper Threading
3- Two Cores
4- Two Cores with Hyper Threading
5- Four Cores
It will be hard to do this scaling test since they are not all from one architecture (1 to 4 would be NetBurst with the Pentium 4, Pentium D, and Pentium EE, while the last is the Core architecture).
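One way to approximate such a core scaling test on a single machine is to pin the player process to a subset of cores and watch for dropped frames at each step. A minimal sketch of the idea, assuming the third-party psutil package and "PowerDVD.exe" as a stand-in process name:

```python
# Hypothetical core-scaling harness: pin the player to 1..N cores and
# sample CPU load at each step while a human watches for dropped frames.
import time
import psutil

def find_player(name="PowerDVD.exe"):
    # "PowerDVD.exe" is an assumed process name, not a verified one.
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == name:
            return proc
    raise RuntimeError(f"{name} is not running")

player = find_player()
total_cores = psutil.cpu_count()

for n in range(1, total_cores + 1):
    player.cpu_affinity(list(range(n)))     # restrict playback to n cores
    time.sleep(5)                           # let playback settle
    load = psutil.cpu_percent(interval=30)  # average system load over 30s
    print(f"{n} core(s): {load:.0f}% average CPU")
```

This only varies the number of cores; comparing Hyper-Threading on and off would still require a BIOS toggle between runs, and as Xajel notes, results across different architectures wouldn't be directly comparable.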