ATI's Avivo Update - H.264 Acceleration & a Special Downloadable Surprise
by Anand Lal Shimpi on December 16, 2005 3:09 PM EST
Posted in: GPUs
H.264 Decode Acceleration - As Promised
One of the things that ATI had promised us was that, by the end of the year, their Radeon X1000 series of GPUs would have hardware-accelerated H.264 decode support, and with Catalyst 5.13, ATI is delivering on that promise.
Starting next Tuesday, Radeon X1000 owners will be able to download ATI's Catalyst 5.13 driver and a Cyberlink H.264 decoder, free of charge; the decoder hooks into ATI's GPU and enables hardware acceleration of H.264. More specifically, the Radeon X1000 GPU, in combination with the Cyberlink H.264 decoder, handles the in-loop deblocking, motion compensation and inverse transform steps of H.264 decoding. Unfortunately, ATI only had a beta ready for us in time for this review, so there were some bugs; ATI is hoping to have the final version available on the 22nd.
The end result is reduced CPU utilization, which makes H.264 playback possible on lower-end systems and lessens the performance impact on all systems. ATI's work on H.264 decode acceleration is extremely important because H.264 is the codec of choice for both Blu-ray and HD-DVD.
So how does it work? It's all fairly simple: you install the Cyberlink H.264 decoder, which then lets you play H.264 content in Windows Media Player with ATI GPU acceleration. The bundle also includes an ATI skin for Windows Media Player, but thankfully, you can revert back to the original WMP skin.
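For the curious, the mechanism is standard Windows plumbing: Windows Media Player builds its playback pipeline out of DirectShow filters, so a decoder that registers itself with a high enough merit gets picked automatically whenever a matching file is rendered; presumably that is how the Cyberlink decoder slots in. The C++ sketch below shows the idea with a bare-bones DirectShow graph; it is not ATI's or Cyberlink's code, and the clip path is hypothetical.

```cpp
// Minimal sketch: build a DirectShow graph around an H.264 clip and play it.
// If a decoder filter such as Cyberlink's is registered with sufficient
// merit, Intelligent Connect selects it automatically when RenderFile()
// assembles the graph. Error handling is omitted for brevity.
#include <dshow.h>
#pragma comment(lib, "strmiids.lib")
#pragma comment(lib, "ole32.lib")

int main()
{
    CoInitialize(NULL);

    IGraphBuilder *pGraph   = NULL;
    IMediaControl *pControl = NULL;
    IMediaEvent   *pEvent   = NULL;

    // Create the filter graph manager and grab its control interfaces.
    CoCreateInstance(CLSID_FilterGraph, NULL, CLSCTX_INPROC_SERVER,
                     IID_IGraphBuilder, (void **)&pGraph);
    pGraph->QueryInterface(IID_IMediaControl, (void **)&pControl);
    pGraph->QueryInterface(IID_IMediaEvent,   (void **)&pEvent);

    // Ask DirectShow to chain source -> decoder -> renderer for this file.
    // Whichever registered H.264 decoder wins merit selection does the work.
    pGraph->RenderFile(L"C:\\clips\\trailer_720p.mov", NULL); // hypothetical path

    pControl->Run();                              // start playback
    long evCode = 0;
    pEvent->WaitForCompletion(INFINITE, &evCode); // block until the clip ends

    pEvent->Release();
    pControl->Release();
    pGraph->Release();
    CoUninitialize();
    return 0;
}
```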
The decoder will let you play any H.264-encoded movie in Windows Media Player, including H.264 QuickTime movies, and of course, they are all GPU accelerated. As you can guess, this only works on Radeon X1000 series GPUs.
Because H.264 decoding is an extremely processor-intensive task, the level of acceleration that you can get varies based on what type of GPU you have. ATI tells us that the limitations are not artificial; they are directly related to the number of functioning ALUs on the GPU (in other words, the more pixel pipes you have, the more processing power you have). The breakdown is pretty simple:
Radeon X1300 owners will be able to get hardware acceleration at up to 480p, X1600 owners get it at up to 720p, and X1800 owners get full acceleration at up to 1080p. ATI did mention that they are working on relaxing those limits, but that is a time-intensive driver and algorithm optimization process that may or may not happen.
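To put that tiering in concrete terms, the short sketch below maps pipe counts to the resolution caps ATI quotes. To be clear, the function and the exact thresholds are our own illustration; ATI's driver exposes nothing like this.

```cpp
// Purely illustrative shorthand for the tiering ATI describes, where the
// resolution cap tracks the number of pixel pipes (ALUs) on the board.
struct DecodeCap { int width; int height; };

DecodeCap MaxAcceleratedResolution(int pixelPipes)
{
    if (pixelPipes >= 16) return {1920, 1080}; // X1800 class: up to 1080p
    if (pixelPipes >= 12) return {1280,  720}; // X1600 class: up to 720p
    return {720, 480};                         // X1300 class: up to 480p
}
```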
For our tests, we used a Radeon X1600 XT paired with 720p content from Apple's QuickTime HD gallery. Unfortunately, due to the beta nature of the decoder, we couldn't get all of the content to work. ATI has told us that there are bound to be issues with the decoder, given its beta state, but it is at least functional in most cases, and the final version should be available next week.
Our test of choice was the third Chronicles of Narnia trailer from Apple's HD gallery. We used perfmon to record CPU utilization on our Athlon 64 3500+ testbed and report the minimum, average and maximum values during playback of the trailer. Our reference point is the same testbed running the trailer in QuickTime 7, which isn't GPU accelerated, compared against Windows Media Player 10 with the Cyberlink H.264 decoder offloading work to the GPU. While there are bound to be some differences between the two players, the majority of CPU time is spent in the decoder, so the variance between players is negligible.
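For those who want to replicate the measurement, the sketch below shows a minimal perfmon-style sampler built on the Windows PDH API, reading the same processor-time counter and reducing it to the min/avg/max figures reported in the table; the 1 Hz interval and 120-sample window are our own choices, not part of the testing methodology.

```cpp
// Minimal perfmon-style CPU sampler using the Windows PDH API.
#include <windows.h>
#include <pdh.h>
#include <cstdio>
#pragma comment(lib, "pdh.lib")

int main()
{
    PDH_HQUERY   query;
    PDH_HCOUNTER counter;

    PdhOpenQuery(NULL, 0, &query);
    PdhAddCounter(query, TEXT("\\Processor(_Total)\\% Processor Time"),
                  0, &counter);

    PdhCollectQueryData(query); // rate counters need a priming sample

    double minU = 100.0, maxU = 0.0, sum = 0.0;
    const int samples = 120;    // ~2 minutes at 1 Hz, roughly one trailer
    for (int i = 0; i < samples; ++i)
    {
        Sleep(1000);
        PdhCollectQueryData(query);

        PDH_FMT_COUNTERVALUE value;
        PdhGetFormattedCounterValue(counter, PDH_FMT_DOUBLE, NULL, &value);

        if (value.doubleValue < minU) minU = value.doubleValue;
        if (value.doubleValue > maxU) maxU = value.doubleValue;
        sum += value.doubleValue;
    }

    printf("min %.1f%%  avg %.1f%%  max %.1f%%\n", minU, sum / samples, maxU);

    PdhCloseQuery(query);
    return 0;
}
```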
CPU utilization during 720p trailer playback (Athlon 64 3500+):

| Decoder | Min | Avg | Max |
|---------|-----|-----|-----|
| QuickTime (no acceleration) | 18.8% | 53.1% | 78.1% |
| Cyberlink H.264 (GPU acceleration) | 9.4% | 32.2% | 57.8% |
Average CPU utilization without ATI's GPU acceleration is a staggering 65% higher (53.1% vs. 32.2%). Peak CPU usage with GPU acceleration also stays under 60%, while without it, the peak hovers just below 80%.
While we didn't have a Radeon X1800 XT on hand to test, the benefits there should be even greater, since it gets GPU-assisted decode at up to 1080p as well.
39 Comments
ShadowVlican - Friday, December 16, 2005
how bout a review on the quality of the transcoded files? we all know that all encoders are not equal, that is why some mpeg2 encoders cost more than my car

mongoosesRawesome - Saturday, December 17, 2005
yea, i agree. speed is all well and good, but if the output sucks then why bother?

JustAnAverageGuy - Friday, December 16, 2005
Read the article.
PrinceGaz - Friday, December 16, 2005
He's talking about ENcoding quality for transcoding purposes, not playback quality.

If a test is done on MPEG2 encoding quality, I would suggest using CCE SP as the comparison encoder as it is generally considered the best available (though it is a touch expensive to purchase).
tfranzese - Monday, December 19, 2005
Read a different article then. AT isn't the only place to cover this (FiringSquad had some IQ coverage).

Andyvan - Friday, December 16, 2005
I'm wondering, if you have both a cheap ATI card and an NVIDIA card installed in your computer, whether you would be allowed to run the converter.

-- Andyvan
Rys - Friday, December 16, 2005
Yes, as long as one of the boards is an X1K, the transcoding tool will run. I currently have a GeForce 7800 GTX as my primary board, and an X1800 XL as the secondary one. The new driver, decoder and transcoding tool all run fine.

synic - Friday, December 16, 2005
read the article, it says X1000 or greater only

Araemo - Friday, December 16, 2005
"we will look at other performance comparisons upon request from you all"Just one: Compare DVD->Divx against AutoGK(Using the official Divx.com codec?) Does the ATI tool even support ripping from an actual DVD(Or decrypted DVD files) to another format? I am curious.
fnord123 - Friday, December 16, 2005 - link
Please compare against the Microsoft Windows Media Encoder (http://www.microsoft.com/windows/windowsmedia/9ser...). A lot of Media Center Extender and XBox 360 people are using it to recode their .avi files to .wmv (Divx isn't supported by 360/MCExtenders). It is a slow process, so if the ATI accelerator speeds it up, they will have a bunch of buyers!