HD Video Playback: H.264 Blu-ray on the PC
by Derek Wilson on December 11, 2006 9:50 AM EST - Posted in GPUs
Final Words
We've been hearing for quite some time now that Blu-ray and HD DVD movies could prove to be too much for today's desktop microprocessors; today we finally have the proof. X-Men: The Last Stand encoded using the H.264/MPEG-4 AVC High Profile at 1080p requires more processing power to decode than affordable dual core CPUs can handle. We are at a point where GPU decode acceleration is essentially required with all but the highest end processors in order to achieve an acceptable level of quality while watching HD content on the PC.
NVIDIA hardware performs better under our current set of drivers and the beta build of PowerDVD we are using, but exactly how well GeForce 7 Series hardware handles the decode process is more dependent on the specific card being used than it is with ATI. In general, higher performance NVIDIA cards do better at decoding our H.264 Blu-ray content. The 7950 GX2 doesn't perform on par with the rest of the high end NVIDIA cards, as SLI doesn't help with video decode. With the exception of the X1600 Pro, every ATI card we tested affected performance almost identically.
While there isn't much more to say about performance right now, we do need to consider that we are working with an early release of our player software, and ATI and NVIDIA are always improving their driver support for video decode acceleration. While we can't count on seeing improved performance in the future on current hardware, it is always nice to know that the possibility exists. We will continue to track performance with future player and driver updates.
But no matter what we see in the future, NVIDIA has done an excellent job with the 8800 series. G80 based cards will definitely lead the way in HD video decode performance, making it possible to stick with a cheaper CPU and still get a good experience. Of course, nothing about playing HD content on the PC is cheap right now, especially if we are talking about using an 8800 in conjunction with our Blu-ray drive.
For those who don't have the money to build a computer around Blu-ray or HD DVD, a standalone player is the other option. We tested our Samsung player with X-Men: The Last Stand to see if it could handle the demands of an H.264 movie (as any good CE player should). We were happy to see that the Samsung box didn't seem to have any problems playing our movie.
As for recommendations, based on our testing we would not suggest anything less than an Intel Core 2 Duo E6600 for use in a system designed to play HD content. The E6400 may work well enough, but not even the 8800 GTX can guarantee zero dropped frames on the E6300. ATI owners will want to lean towards an E6700 processor, but can get away with the E6600 in a pinch. Keep in mind, though, that X-Men: The Last Stand is only one of the first H.264 movies to come out. We may see content that is more difficult to decode in the future, and a faster processor is a good way to build in the headroom needed to ensure a quality HD experience on the PC.
86 Comments
Tujan - Monday, December 11, 2006 - link
So here's a Sony notebook. It probably uses less than 40 or 50 watts, has an HDMI connector on it, and runs on a battery. No less.
http://www.learningcenter.sony.us/assets/itpd/note...
So what is my question here? This is a Centrino Core Duo for a notebook, with graphics enough to run using only battery power.
As well, the notebook has a Blu-ray drive which can be written to, AND can play Blu-ray titles.
Is this mostly in the licensing? How can it be, when the processors and graphics cards used on the desktop are such absolute 'top notch' parts, and the notebook puts the works of them to shame?
Blu-ray and HDMI on battery power.
This was one of AnandTech's ads, incidentally - hi AnandTech (ad-click), hi Sony.
cmdrdredd - Monday, December 11, 2006 - link
I too wonder how a laptop can play Blu-ray fine but a $400+ video card with a CPU probably 2x+ more powerful and more memory... can't.
fanbanlo - Monday, December 11, 2006 - link
Most efficient software decoder! Maybe we don't need Core 2 Duo after all! http://www.coreavc.com/
DerekWilson - Monday, December 11, 2006 - link
My understanding is that CoreAVC doesn't work in conjunction with HD DVD/BD -- that it doesn't support AACS.
totalcommand - Monday, December 11, 2006 - link
Blu-ray support will be added to CoreAVC soon.
KashGarinn - Tuesday, December 12, 2006 - link
When CoreAVC supports HD DVD and Blu-ray H.264, I'd be very interested in seeing this article updated with the comparison.
Regarding the article itself, I thought it wasn't up to normal AnandTech standards: skimping on the H.264 details that make it better, and giving the reason as "but these are a little beyond the scope of this article." What is AnandTech coming to? That's like saying "we're going to compare graphics cards with DirectX 9 capabilities, but explaining what DirectX is, is a little beyond the scope of this article."
Also, not comparing AMD CPUs? What's up with that?
And I find it odd that you didn't comment on the strangeness of NVIDIA having better acceleration across the board than the ATI cards, especially as the ATI cards have better shader throughput and so are most likely hampered by software rather than hardware. So this: "ATI hardware is very consistent, but just doesn't improve performance as much as NVIDIA hardware." - only paints an incorrect picture.
I would give this article a 2 out of 5: 1 for at least covering the basics (H.264 is a better codec than MPEG-2) and 1 for showing that ATI needs to improve its decoder, even though you don't point it out.
K.
ninjit - Monday, December 11, 2006 - link
I had a question about why you chose the Golden Gate bridge scene to stress test the decoding capabilities of the various setups. You said that you chose that point in the movie because it had the highest bitrate (41 Mbps), indicating a more complex scene.
To me, though, that would indicate LESS encoding done by H.264, and consequently LESS decoding work needed to be done for playback of that particular scene.
I justify that by thinking with a very complex scene the codec cannot compress the stream as much because it would introduce too many artifacts, so the compression rate is dropped and the data rate increased to compensate for that particular section in time.
Is my reasoning correct? If not, can someone explain to me why?
I don't think choice of scene should change the graphs in terms of relative performance between setups, but it would affect absolute numbers - an easy way to check whether my thinking is wrong or not is to see if there are more dropped frames in the Golden Gate scene on the software-decoded E6600 vs. other less busy scenes.
DerekWilson - Monday, December 11, 2006 - link
We tried to explain this a little bit, so I'm sorry if we didn't get it across well enough.
I'm not an expert on H.264 by any means, but I can talk about other types of decoding as they relate to still images.
The issue isn't really less compression -- when using H.264, we are always using H.264 complexity to encode the bitstream. We don't fall back to just saving raw pixel data if a scene is overly complex -- we encode more detailed information about the scene.
For instance, with still images, run length encoding can be performed with huge compression, especially in images with large blocks of identical colors (like logos or images on a solid background color). Basically, the idea is to list a color and then the number of pixels that use that color. For an image that is a single solid color, you could list the color and then the number of pixels in the image. This is a very small file with little processing requirement that represents a full image. If, on the other hand, we have a checkerboard pattern with every other pixel being a different color, we have to list the color of every pixel, BUT we also have to process every color to make sure of how many consecutive pixels it represents (even if it only represents one). Thus, we end up doing more processing than we would on a smaller (lower "bitrate") file.
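To make the checkerboard example concrete, here is a quick Python sketch of that toy encoder (purely our illustration of the idea -- this is not how H.264 or any real video codec works):

# Toy run length encoder/decoder: a row of pixels becomes (color, count) pairs.
def rle_encode(pixels):
    runs = []
    for color in pixels:
        if runs and runs[-1][0] == color:
            runs[-1][1] += 1          # extend the current run
        else:
            runs.append([color, 1])   # start a new run
    return runs

def rle_decode(runs):
    pixels = []
    for color, count in runs:
        pixels.extend([color] * count)  # one step of work per run
    return pixels

solid = ["white"] * 8             # one big block of a single color
checker = ["black", "white"] * 4  # every pixel differs from its neighbor

print(rle_encode(solid))    # [['white', 8]] -- tiny "file", one decode step
print(rle_encode(checker))  # eight one-pixel runs -- bigger "file", more work
assert rle_decode(rle_encode(checker)) == checker  # lossless round trip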
This example is very fabricated, as sophisticated run length encoding can handle more complex patterns, but it serves to illustrate the point: when using a specific type of encoding, higher bitrates can (and usually do) mean more complexity and processing.
As we mentioned, using no encoding requires zero processing. MPEG-2 can compress the data to lower the bitrate while increasing computational complexity, but higher bitrate MPEG-2 means more data to process per frame -- which means more CPU overhead for higher bitrates under MPEG-2. The same is true with H.264 -- bitrates are generally lower than MPEG-2 and require more processing power, but as H.264 encoded movies use more bitrate (more data per frame), more processing is required.
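As a back-of-the-envelope illustration of "more data per frame" (our own toy math, assuming film's 24 frames per second and using the 41 Mbps peak from our test scene):

# How much compressed data must the decoder chew through per frame at a
# given bitrate? 24 fps is assumed for film content.
def kbytes_per_frame(bitrate_mbps, fps=24):
    return bitrate_mbps * 1_000_000 / fps / 8 / 1024

for mbps in (10, 25, 41):  # two illustrative bitrates plus our 41 Mbps peak
    print(f"{mbps} Mbps -> ~{kbytes_per_frame(mbps):.0f} KB per frame")

At the 41 Mbps peak, that works out to roughly 200KB of compressed data per frame, each of which must be fully decoded in under 42 milliseconds to hold 24 fps.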
I hope this helps.
Also, to clarify -- the spot in the video that reaches 41 Mbps corresponds to the highest CPU utilization (we can see this on the perfmon timeline).
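For readers who want to check this on their own systems, a script along these lines captures a similar CPU timeline (a hypothetical recreation using the third-party psutil package, not the perfmon setup we actually used):

# Log total CPU utilization once per second to a CSV while a movie plays,
# roughly reproducing a perfmon-style timeline. Requires the third-party
# psutil package (pip install psutil).
import csv
import time

import psutil

with open("cpu_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["elapsed_s", "cpu_percent"])
    start = time.time()
    for _ in range(180):  # sample for three minutes around the peak scene
        usage = psutil.cpu_percent(interval=1.0)  # system-wide, averaged over 1s
        writer.writerow([round(time.time() - start, 1), usage])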
ninjit - Monday, December 11, 2006 - link
Thanks for the explanation, Derek. That was very helpful.
jeffbui - Monday, December 11, 2006 - link
The PS3 is able to play Blu-ray back at 50% over normal speed without dropping frames. That gives an idea of how much power these consoles are capable of.
Some interesting tidbits from a translation of an article interviewing PS3 developers:
-H.264 decoding itself was not very difficult for Cell with moderate optimization, and they could play a movie in realtime on the first try, unlike the very difficult SACD optimization. However, because they began development without knowing the final Blu-ray standard, they set the goal very high: decoding two full HD H.264 streams at 40Mbps simultaneously. Besides that, the clockspeed of the devkit was lower than that of the final product, which made development difficult. The current decoder can decode full HD H.264 with 3 SPEs.
-An SCE developer recommends trying 1.5x fast-forward playback in the PS3 BD player to see the power of Cell. When it's connected to a display via 1080/60p, it becomes very smooth, as Cell has enough margin for video decoding. In 1.5x fast-forward playback it decodes all frames, then inserts them into 60fps output with sped-up audio.