ATI's Avivo Update - H.264 Acceleration & a Special Downloadable Surprise
by Anand Lal Shimpi on December 16, 2005 3:09 PM EST - Posted in GPUs
H.264 Decode Acceleration - As Promised
One of the things that ATI had promised us was that by the end of the year, their Radeon X1000 series of GPUs would have hardware accelerated H.264 decode support; and with Catalyst 5.13, ATI is delivering on that promise.
Starting next Tuesday, Radeon X1000 owners will be able to download, for free, ATI's Catalyst 5.13 driver and a Cyberlink H.264 decoder that hooks into ATI's GPU and enables hardware acceleration of H.264. More specifically, ATI's Radeon X1000 GPU in combination with the Cyberlink H.264 decoder will handle the in-loop deblocking, motion compensation and inverse transform that occur during H.264 decoding. Unfortunately, ATI only had a beta ready for us in time for this review, so there were some bugs. Right now, ATI is hoping to have the final version available on the 22nd.
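To make the division of labor a little more concrete, here is a rough sketch of which decode stages move to the GPU when acceleration kicks in. Only the stage list and the three offloaded steps come from ATI's description above; everything else, including the names, is purely illustrative and not taken from ATI's or Cyberlink's software.

```python
# Minimal, hypothetical sketch of the CPU/GPU work split described in the article.
H264_DECODE_STAGES = [
    "bitstream parsing / entropy decoding",  # stays on the CPU
    "inverse transform",                     # offloaded to the GPU
    "motion compensation",                   # offloaded to the GPU
    "in-loop deblocking",                    # offloaded to the GPU
]

GPU_OFFLOADED = {"inverse transform", "motion compensation", "in-loop deblocking"}

def plan_decode(gpu_accelerated: bool) -> dict:
    """Return which processor handles each decode stage."""
    return {
        stage: "GPU" if (gpu_accelerated and stage in GPU_OFFLOADED) else "CPU"
        for stage in H264_DECODE_STAGES
    }

if __name__ == "__main__":
    for stage, unit in plan_decode(gpu_accelerated=True).items():
        print(f"{stage:38s} -> {unit}")
```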
The end result is that CPU utilization is reduced, making the playback of H.264 movies possible on lower-end systems and reducing the performance impact on all systems. ATI's work on H.264 decode acceleration today is extremely important because H.264 is the codec of choice for both Blu-ray and HD-DVD.
So how does it work? It's all fairly simple. You just install the Cyberlink H.264 decoder, which then lets you play H.264 content in Windows Media Player with ATI GPU acceleration. The bundle also includes an ATI skin for Windows Media Player, but thankfully, you can revert to the original WMP skin.
The decoder will let you play all H.264 encoded movies in Windows Media Player, including H.264 Quicktime movies, and of course, they are all GPU accelerated. As you can guess, this only works on Radeon X1000 series GPUs.
Because H.264 decoding is an extremely processor-intensive task, the level of acceleration that you can get varies with the type of GPU you have. ATI tells us that the limitations are not artificial; they are directly related to the number of functioning ALUs on the GPU (in other words, the more pixel pipes you have, the more processing power you have). The breakdown is pretty simple:
Radeon X1300 owners will be able to get hardware acceleration at up to 480p, X1600 owners get it for 720p, and X1800 owners get full acceleration at up to 1080p. ATI did mention that they are working on bringing those limits down, but that is a time-intensive driver and algorithm optimization process that may or may not happen.
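Put another way, each tier maps to a resolution ceiling. The tiers and ceilings in the sketch below are exactly the ones ATI quotes above; the helper function itself is just a hypothetical illustration, not anything from the Catalyst driver or the Cyberlink decoder.

```python
# Hypothetical helper reflecting ATI's stated resolution ceilings per GPU tier.
MAX_ACCELERATED_RESOLUTION = {
    "Radeon X1300": "480p",
    "Radeon X1600": "720p",
    "Radeon X1800": "1080p",
}

RESOLUTION_HEIGHT = {"480p": 480, "720p": 720, "1080p": 1080}

def can_accelerate(gpu: str, content_resolution: str) -> bool:
    """True if the given GPU tier gets hardware-accelerated H.264 decode at this resolution."""
    ceiling = MAX_ACCELERATED_RESOLUTION.get(gpu)
    if ceiling is None:
        return False  # not a Radeon X1000-series part: no acceleration at all
    return RESOLUTION_HEIGHT[content_resolution] <= RESOLUTION_HEIGHT[ceiling]

print(can_accelerate("Radeon X1600", "720p"))   # True
print(can_accelerate("Radeon X1300", "1080p"))  # False
```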
For our tests, we used a Radeon X1600 XT and paired it with some 720p content from Apple's Quicktime HD gallery. Unfortunately, due to the beta nature of the decoder, we couldn't get all of the content to work. ATI has told us that there are bound to be issues with the decoder, thanks to its beta state, but it is at least functional in most cases and the final version should be available next week.
Our test of choice was the third Chronicles of Narnia trailer from Apple's HD gallery. We used perfmon to record CPU utilization on our Athlon 64 3500+ test bed and reported the minimum, average and maximum values during playback of the trailer. Our reference point is the test bed running the trailer in Quicktime 7, which isn't GPU accelerated, compared against Windows Media Player 10 with the Cyberlink H.264 decoder offloading some tasks to the GPU. While there are bound to be some differences between the two players, the majority of CPU time is spent in the decoder, so the variance between players is negligible.
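For those curious about the bookkeeping, reducing a perfmon log to the three figures we report is trivial. The sample values in the sketch below are made up purely for illustration; they are not our measurements.

```python
# Reduce a series of CPU-utilization samples (percent) to min/avg/max.
# The values below are invented for illustration, not taken from our perfmon logs.
cpu_samples = [22.0, 35.5, 48.0, 61.2, 57.8, 40.1, 18.8, 30.6]

minimum = min(cpu_samples)
average = sum(cpu_samples) / len(cpu_samples)
maximum = max(cpu_samples)

print(f"min {minimum:.1f}%  avg {average:.1f}%  max {maximum:.1f}%")
```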
Decoder | Min | Avg | Max
Quicktime (no Acceleration) | 18.8% | 53.1% | 78.1%
Cyberlink H.264 (GPU Acceleration) | 9.4% | 32.2% | 57.8%
The average CPU utilization without ATI's GPU acceleration is a staggering 65% higher, not to mention that the peak CPU usage with GPU acceleration manages to stay under 60% while it otherwise hovers just below 80%.
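That 65% figure comes straight from the averages in the table; as a quick sanity check on the arithmetic:

```python
# Relative increase in average CPU utilization without GPU acceleration,
# using the averages from the table above.
software_avg = 53.1     # Quicktime, no acceleration
accelerated_avg = 32.2  # Cyberlink H.264 decoder with GPU offload

increase = (software_avg - accelerated_avg) / accelerated_avg * 100
print(f"{increase:.0f}% higher without GPU acceleration")  # prints "65% higher ..."
```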
While we didn't have a Radeon X1800 XT to test on hand, the benefits there should be even greater, since you can get GPU assisted decode at 1080p as well.
39 Comments
Pete84 - Friday, December 16, 2005 - link
Looks like the AIW series got yet another shit in the multimedia arm. Capture video and then convert it to whichever format you desire, very nice.
vijay333 - Friday, December 16, 2005 - link
yes...another "shot" indeed :)
Pete84 - Friday, December 16, 2005 - link
Oops, that is what happens when I rush typing :p
RandomFool - Friday, December 16, 2005 - link
I'm just waiting for the more creative users to show up. :)
ksherman - Friday, December 16, 2005 - link
Thanks AT! you guys ROCK! I do a bit of video converting after I finish a movie project, and it seems as though this proggie might work a lot faster than the other ones I have used!
ksherman - Friday, December 16, 2005 - link
one question though, if this tool is eventually released with GPU assisted recoding, is this going to be an ATI-only product, or will I be able to use it with my 7800??
Thalyn - Friday, December 16, 2005 - link
Even though it currently uses the CPU to process the transcode, the final product will depend on features present on the X1x00 series and not on the NV4x (6x00/7x00) series. Specifically, it makes use of GPGPU - a function set that allows the graphics card to process more generic code, rather than just graphics, for features such as physics or, in this case, video transcoding. It's true that SM3 cards have been used for this purpose before (I recall an audio DSP program written to use a 6800 Ultra, since it could do the task about 5x faster than a P4 3.0e), but this time around it's been designed to work outside of DirectX - ergo, ATi only unless nVidia incorporates GPGPU at a later date.
-Jack
ksherman - Friday, December 16, 2005 - link
well, it seems I was a little too quick on the draw... doesn't work at all with nVidia cards :(... guess I really shouldn't have expected that, or just read the little two-sentence summary on the main page. alas, I am still saddened :(