Coming Soon to HD DVD: Silicon Optix HD HQV
by Derek Wilson on February 8, 2007 1:25 AM EST - Posted in
- GPUs
The HD HQV Tests
The version of HQV Silicon Optix provided for us contains tests for several aspects of HD video decoding: noise reduction, resolution loss, and deinterlacing artifacts (jaggies). We will break down the specifics of each test and talk about what we are looking for. This time around, Silicon Optix's scoring system allows more variability within each test, but we will try to be as objective as possible in our analysis.
Noise Reduction
The first test in the suite is the noise reduction test, which is broken down into two parts. Initially, we have an image of a flower that shows large blocks of nearly solid color without much motion. This tests the ability of the video processor to remove spatial noise.
When no noise reduction is applied, we see some static or sparking in the flower and background. Hardware will score higher the more noise it is able to eliminate without introducing any artifacts into the image.
The second test that looks at noise reduction presents us with a scene in motion. It is more difficult to eliminate noise while keeping objects in motion crisp and clear. In this test, we are looking for noise reduction as well as a lack of blurring on the ship.
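The two noise tests above reward exactly this trade-off: smooth away static noise without blurring motion. The sketch below is purely illustrative (the HQV test does not publish an algorithm, and the threshold value is our assumption): pixels are blended with the previous frame only where little change is detected, so sparkle in static areas is averaged away while moving objects are left untouched.

```python
# Minimal sketch of motion-adaptive temporal noise reduction.
# Illustrative only -- not Silicon Optix's or any vendor's algorithm.
MOTION_THRESHOLD = 30  # assumed tuning value, 8-bit pixel scale

def reduce_noise(prev_frame, cur_frame, threshold=MOTION_THRESHOLD):
    """Blend each pixel with the previous frame unless it moved."""
    out = []
    for prev_row, cur_row in zip(prev_frame, cur_frame):
        out_row = []
        for p, c in zip(prev_row, cur_row):
            if abs(c - p) < threshold:      # static area: average away noise
                out_row.append((p + c) // 2)
            else:                           # motion detected: keep new pixel
                out_row.append(c)
        out.append(out_row)
    return out

# A noisy static pixel is smoothed; a genuinely moving pixel is preserved.
prev = [[100, 100, 0]]
cur  = [[110, 100, 255]]   # 110: sparkle noise; 255: real motion
print(reduce_noise(prev, cur))  # [[105, 100, 255]]
```

The threshold controls the trade-off the test scores: set it too high and moving objects smear (detail is lost); too low and static noise survives.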
Scoring for these tests ranges from 0 to 25, with the highest score going to hardware that is able to reduce noise while maintaining a clear image that has no artifacts. While Silicon Optix has stated that scores can range anywhere from 0 to 25, they break down four suggested scores to use as a guide. Here's the breakdown:
25 - The level of noise is noticeably reduced without loss of detail
15 - The level of noise is somewhat reduced and detail is preserved
7 - The level of noise is somewhat reduced but detail is lost
0 - There is no apparent reduction in noise and/or image detail is significantly reduced or artifacts are introduced
Until we have a better feel for the tests and the variability between hardware, we will stick to using only these delineations.
Video Resolution Loss
After noise reduction, we look at video resolution loss. Resolution loss can occur as a result of deinterlacing, and effectively reduces the amount of information that is displayed. In interlaced HD video, alternating fields display the odd and even scanlines of one image. Simple deinterlacing techniques either duplicate the data in one field and discard the other (often called "bob"), or average the data in two fields together to create a frame ("blend"). Both of these techniques cause artifacts, and both remove detail from the image.
When objects are not in motion, interlaced fields can simply be combined into one frame with no issue, and good hardware should be able to detect whether anything is moving or not and perform the appropriate deinterlacing method. In order to test the ability of hardware to accurately reproduce material from interlaced video in motion, Silicon Optix has included an SMPTE test image at 1920x1080 with a spinning bar over top to force hardware to employ the type of deinterlacing it would use when motion is detected. In the top and bottom left corners of the SMPTE test pattern are boxes that have alternating black and white horizontal lines that are one pixel tall. A high quality deinterlacing algorithm will be able to reproduce these very fine lines, and it is these that we are looking for in our test pattern.
Interestingly, AMD, NVIDIA, and PowerDVD software all fail to adequately reproduce the SMPTE resolution chart. We'll have to show a lower resolution example based on a smaller 512x512 version of the chart, but our comments apply to the full resolution results.
If the hardware averages the interlaced fields, the fine lines will be displayed as a grey block, while if data is thrown out, the block will be either solid black or solid white (depending on which field is left out).
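The effect described above is easy to demonstrate. In this sketch (our illustration, not the benchmark's code), a tiny frame of alternating one-pixel black and white scanlines is split into fields and reconstructed both naive ways: duplicating one field yields a solid block, and averaging the fields yields uniform grey.

```python
# Why the one-pixel-line boxes expose weak deinterlacing (illustrative).
# 0 = black, 255 = white; rows alternate every scanline.
frame = [[255] * 4 if y % 2 == 0 else [0] * 4 for y in range(4)]
top_field = frame[0::2]      # even scanlines (all white)
bottom_field = frame[1::2]   # odd scanlines (all black)

# "Bob": keep one field and line-double it -> the lines become a solid block
bob = [row for row in top_field for _ in range(2)]

# "Blend": average the two fields -> the lines become a grey block
blend = [[(a + b) // 2 for a, b in zip(r1, r2)]
         for r1, r2 in zip(top_field, bottom_field)]

print(bob[0][0], bob[1][0])   # 255 255 -> fine lines lost to solid white
print(blend[0][0])            # 127    -> fine lines lost to grey
```

Either way the single-pixel detail is gone, which is exactly what the pass/fail scoring of this test checks for.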
Scoring for this test is an all-or-nothing 25 or 0: either the hardware loses resolution or it does not.
Jaggies
A good deinterlacing algorithm should be able to avoid the aliasing along diagonal lines that is apparent with less sophisticated techniques. This test returns from the original standard definition HQV test, and is a good judge of how well hardware is able to handle diagonal lines of varying slope.
Here we want each of the three lines to maintain smoothness while moving back and forth around part of the circle. Scoring is based on a sliding scale between 0 and 20 with suggested breakdowns based on which bars maintain smooth edges. We will again be sticking with a score that matches the suggested options Silicon Optix provides rather than picking numbers in between these values.
20 - All three bars have smooth edges at all times
10 - The top two bars have smooth edges, but the bottom bar does not
5 - Only the top bar has a smooth edge
0 - None of the bars have smooth edges
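A rough sketch of where jaggies come from (our illustration, not the HQV test content): a diagonal edge steps one pixel per scanline, but if a field is dropped and line-doubled, the edge can only step every other line, producing coarse two-pixel stairs.

```python
# Diagonal-edge aliasing under naive bob deinterlacing (illustrative).
def edge_position(frame):
    """Column index of the first white pixel on each row."""
    return [row.index(1) for row in frame]

# A 1-pixel-per-line diagonal edge in a 4x5 frame (0 = black, 1 = white)
frame = [[1 if x >= y else 0 for x in range(5)] for y in range(4)]
print(edge_position(frame))          # [0, 1, 2, 3] -> smooth diagonal

field = frame[0::2]                  # keep even scanlines only
bob = [row for row in field for _ in range(2)]  # line-double the field
print(edge_position(bob))            # [0, 0, 2, 2] -> jagged stair steps
```

Edge-adaptive deinterlacers interpolate along the direction of the edge instead of straight vertical duplication, which is what keeps the rotating bars in this test smooth.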
Film Resolution Loss
This test is nearly the same as the video resolution loss test, and the score breakdown is the same: 25 if it works or 0 if it does not. This time around, interlaced video of the SMPTE test pattern is generated using a telecine process to produce 1080i video at 60 fields per second from a 24 fps progressive source. Because of the difference in frame rates between video and film, a 3:2 cadence must be used where one frame of film is stretched across 3 interlaced fields and the next frame of film is stretched across 2 fields.
One major advantage of this process is that it is reversible, meaning that less guess work needs to go into properly deinterlacing video produced from a film source. The process of reversing this 3:2 pulldown is called inverse telecine, and can be employed very effectively to produce a progressive image from interlaced media. If this is done correctly, no resolution needs to be lost.
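The 3:2 cadence and its reversal can be sketched in a few lines (frames and fields are just labels here; real inverse telecine must also detect the cadence, which this sketch assumes is known). Four 24 fps film frames become ten fields, and collapsing the cadence recovers the original frames exactly, which is why no resolution needs to be lost.

```python
# Sketch of 3:2 pulldown (telecine) and its lossless reversal (illustrative).
def telecine(frames):
    """Spread film frames across interlaced fields in a 3:2 cadence."""
    fields = []
    for i, f in enumerate(frames):
        copies = 3 if i % 2 == 0 else 2   # 3 fields, then 2, repeating
        fields.extend([f] * copies)
    return fields

def inverse_telecine(fields):
    """Recover the original film frames by collapsing the 3:2 cadence."""
    frames, i, copies = [], 0, 3
    while i < len(fields):
        frames.append(fields[i])
        i += copies
        copies = 2 if copies == 3 else 3  # alternate 3 and 2
    return frames

film = ["A", "B", "C", "D"]               # four 24 fps film frames
fields = telecine(film)                   # ten fields = 60-field-rate video
print(fields)                    # ['A','A','A','B','B','C','C','C','D','D']
print(inverse_telecine(fields))  # ['A', 'B', 'C', 'D'] -- nothing lost
```

Hardware that fails this test is falling back to motion-adaptive deinterlacing instead of recognizing the cadence and reassembling the original progressive frames.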
Rather than having a moving bar over top of the test pattern, the image shifts back and forth from left to right and resolution loss can make the image appear to strobe or produce the appearance of vertical lines along the edges of fine lines.
Film Resolution Loss - Stadium Test
The final test is a practical test of film resolution loss, showing what can happen when a film source is not accurately reproduced. In this case, flickering in the stadiums or a moiré pattern can become apparent.
Scoring for this test is another all-or-nothing score: the video decoder being tested receives either a 10 or a 0.
Now that we've gotten familiar with these tests, let's take a look at how AMD and NVIDIA stack up under HD HQV.
27 Comments
ShizNet - Thursday, February 8, 2007 - link
i agree with the last dude - if we are talking about PC hard/software mixed with consumer electronics [40"+ LCD i guess] why not add this guy (http://usa.denon.com/ProductDetails/623.asp) or similar to the mix? and see: should people put more money into VidCard/CPU [for best 1080p] or save for a receiver/DVD in their HTPC? otherwise - great that you guys are getting down and dirty to address some issues and breaking the ice for the rest of us, before we spend all that $$$ and get middle-of-the-road performance
Visual - Thursday, February 8, 2007 - link
i dont even understand exactly what you guys just tested... was this just some test-disc played with a software player? why didn't you start the article with more information about the test? what was the system's configuration?
what codec is used for the content, and does it have the proper flags and information needed for correct deinterlacing?
which player app and decoders you used, etc?
if there were flaws in the playback, isn't it the software's fault, not the hardware's? if there were differences on ati/nvidia hardware, isn't it because the software used their built-in capabilities improperly and in different ways? surely there can be player software that handles deinterlacing perfectly without even using any hardware acceleration...
with a digital source like a hddvd/bluray disc, i don't think these kind of tests can even apply. noise reduction, wtf? we're talking of digital storage, not audio tapes after all. noise can't just appear with age. if there is "noise" on the source, it was probably put there on purpose, not real "noise" but something that was meant to be there. why should the playback system remove it?
resolution loss and jaggies, stuff that is related to deinterlacing, and it just pisses me off. why oh why should anyone be bothered with deinterlacing in this day and age?
you say "Interlaced media is available on both HD DVD and Blu-ray" but from what i've heard, the majority (if not all) of hd-dvd and blu-ray content is currently stored as 1080p on the discs. who and why would be as dumb as to produce interlaced hd content?
DerekWilson - Thursday, February 8, 2007 - link
I've updated page 3 of the article with information on the HD DVD player used and the drivers used for AMD and NVIDIA cards. The software player enabled hardware acceleration which enables AMD and NVIDIA to handle much of the decode and deinterlacing of the HD content. This is a test of the hardware and drivers provided by AMD and NVIDIA.
Codec doesn't matter and proper flags don't matter -- a good deinterlacing algorithm should detect the type of content being played. In fact, AMD and NVIDIA both do this for standard definition content.
It might be possible for software HD DVD and Blu-ray players to handle proper deinterlacing, but most software DVD players don't even do it as effectively as possible. There are no HD DVD or Blu-ray players that we know of that support the type of adaptive deinterlacing necessary to pass the HD HQV test.
I do apologize if I didn't explain noise well enough.
The problem comes in the transfer of a movie from film to digital media. CCDs used to pick up light shining through film will absolutely introduce noise, especially in large blocks of similar color like sky. Even digital HD cameras don't have an infinite color space and will have problems with noise in similar situations due to small fluctuations in the exact digital color at each pixel for each frame.
This type of noise can be reduced by post processing, but studios usually do not do this. All you need to do is watch X-Men 3 on Blu-ray to see that noise is a huge problem.
In addition, encoding and compression introduce noise. This noise can't be removed except in the decode process.
Noise is a major issue in HD content, and while much of it could be fixed with post processing, it looks horrible at high resolution.
As for interlacing, most movies will definitely be progressive. But there are some that are 1080i and will need good deinterlacing support.
The big issue, as has been pointed out elsewhere in the comments, is TV. 1080i is the standard here.
In fact, when stations start distributing series on HD DVD and Blu-ray, it is very likely we will see them in interlaced format. Most of my DVD collection consists of TV series, so I consider deinterlacing an important step in HD video playback.
As much as I dislike interlaced content in general, it is unfortunately here to stay.
RamarC - Friday, February 9, 2007 - link
Because a TV program is broadcast in 1080i in no way means that's the format it is captured/mastered in. "24p" is the current standard for mastering of most network programming and it can result in 720p or 1080i or 1080p content. http://www.digital-digest.com/highdefdvd/faq.html#...
An interview with Microsoft in Audioholics magazine in January 2006 indicated that HD DVD movies will be stored in 1080p format like BD, even if initial players can only output at 1080i.
Interlaced HD/BluRay content will be a rarity and the performance of playback software with that content is a trivial issue.
ianken - Friday, February 9, 2007 - link
"There are no HD DVD or Blu-ray players that we know of that support the type of adaptive deinterlacing necessary to pass the HD HQV test." Because they don't need it, as the content is 1080p.
Silicon Optix is in the business to sell video processing chips. Their benchmark is designed to get people to look for players with their hardware.
For properly authored discs NR and adaptive deinterlace is wasted.
The thing I like about the HQV dics is that sites like this use them and that motivates ATI and NVIDIA to pass them and that gets folks a better 1080i broadcast experience. It's in the realm of poorly encoded broadcast HD TV that this stuff is important.
IMHO.
autoboy - Thursday, February 8, 2007 - link
Sorry about being a huge pain in the ass. I really do like reading your articles about video processing and they are always quite good. For me though, there is always something that seems to be missing. I just found this quote from the head of the multimedia division at Nvidia:
FiringSquad: PureVideo seems to do more than regular bob deinterlacing when tested with the HQV Benchmark DVD. Can you give us any more details on what's being done?
Scott Vouri: Yes, we do much more than regular ‘bob’ deinterlacing, but unfortunately we can’t disclose the algorithms behind our de-interlacing technology. I do want to point out that HQV doesn’t even test one of the best things about our spatial-temporal de-interlacing – the fact that we do it on 1080i HD content, which is quite computationally intensive.
So it appears that they at least do adaptive deinterlacing, which means they do what they say, which means they should do inverse telecine and 3:2 pulldown correctly as well. I just can't help but think there is something missing from your setup. They should score better than a 0. Is the HQV benchmark copy protected? Can it be played on regular mpeg2 decoders? Is the PowerDVD hardware acceleration broken?
autoboy - Thursday, February 8, 2007 - link
So the codec doesn't matter for deinterlacing? The decoder decodes the video into a sort of raw format and then the video card takes over the deinterlacing? Hmm. I didn't know that. I was under the impression that the codec was the most important part of the equation. Why is interlaced video such a mystery to most of us? I have been trying to fully understand it for 6 months and I find out that I still don't know anything. I just want proper deinterlacing. Is that too much to ask? Is it really that hard to get good video playback on a PC for interlaced material! Come on...