Coming Soon to HD DVD: Silicon Optix HD HQV
by Derek Wilson on February 8, 2007 1:25 AM EST - Posted in GPUs
Introduction
In the past, when testing video playback features of PC graphics hardware, we have looked at the HQV benchmark by Silicon Optix. Over the years, HQV scores have improved, as we can see when comparing our first article on the subject to one written four months later. Current scores are nearly perfect on both NVIDIA and AMD hardware. But there is something lacking in these tests: they only provide insight into how hardware performs when handling standard definition content.
With the introduction of HD DVD and Blu-ray content, we have been waiting for a benchmark with which to test the image quality of HD playback. Graphics hardware may ultimately have less of an impact on the HD viewing experience, as media and players natively support 1080p, but it is still an important link in the chain. Interlaced media is available on both HD DVD and Blu-ray, and high-quality deinterlacing at HD resolutions is just as important as it is with DVDs.
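For readers curious about what deinterlacing actually involves, here is a minimal sketch of the simplest technique, known as "bob" deinterlacing, written in Python with NumPy. This is purely illustrative on our part: the function name, the field layout, and the 8-bit luma assumption are our own, and shipping hardware relies on far more sophisticated motion-adaptive logic, which is exactly what HQV is designed to stress.

    import numpy as np

    def bob_deinterlace(field, top_field=True):
        # field: a single half-height field of 8-bit luma samples
        # (a NumPy array of shape (height, width), dtype uint8).
        h, w = field.shape
        frame = np.empty((h * 2, w), dtype=field.dtype)
        offset = 0 if top_field else 1
        frame[offset::2] = field  # the scan lines we actually have
        # Invent the missing lines by averaging their vertical neighbors.
        for y in range(1 - offset, h * 2, 2):
            above = frame[max(y - 1, offset)]
            below = frame[min(y + 1, h * 2 - 2 + offset)]
            frame[y] = ((above.astype(np.uint16) + below) // 2).astype(field.dtype)
        return frame

The other basic building block is "weave," which recombines two consecutive fields into one frame; it is perfect for static scenes but produces combing artifacts on motion. A good deinterlacer effectively chooses between bob and weave per pixel based on detected motion, and that decision quality is what separates the contenders in a test like this.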
The benchmark looks not only at deinterlacing quality but at noise reduction as well. Noise can actually be more of a problem with HD video due to the clarity with which it is rendered. While much of the problem could be fixed if movie studios included noise reduction as a post-processing step, there isn't much content on which noise reduction is currently performed. This is likely due to a combination of the cost involved and the fact that it hasn't been as necessary in the past. In the meantime, we are left with a viewing experience that might not live up to viewers' expectations, when a little noise reduction during decoding could have a huge impact on image quality.
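To make the idea concrete, here is a hedged sketch of the simplest sort of temporal noise reduction a decoder might apply. The blending weight and motion threshold are illustrative assumptions of ours, not values taken from the HQV test or from any shipping player.

    import numpy as np

    def temporal_nr(prev, curr, alpha=0.5, motion_thresh=12.0):
        # prev, curr: consecutive frames as NumPy arrays of identical shape.
        # Where the frames barely differ, the change is probably noise, so
        # blend toward the previous frame; where they differ a lot, assume
        # motion and pass the current pixel through untouched.
        p = prev.astype(np.float32)
        c = curr.astype(np.float32)
        still = np.abs(c - p) < motion_thresh
        out = np.where(still, alpha * p + (1.0 - alpha) * c, c)
        return out.round().astype(curr.dtype)

Raise the threshold or the blending weight and the grain vanishes, but fine moving detail starts to smear as well, which is precisely the trade-off discussed below.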
There are downsides to noise reduction, as it can reduce detail. This is especially true if noise was added to the video intentionally for effect. We don't run into this problem often, but it is worth noting. On the whole, noise reduction will improve the clarity of the content, especially given the current trend in Hollywood of ignoring the noise issue.
We have wanted to play with an HD version of HQV for a while, and we are glad to have our hands on this early version. Before we take a look at just how the competition stacks up, we will examine the tests themselves and Silicon Optix's scoring system.
Comments
JarredWalton - Thursday, February 8, 2007
*grumble* Should be "we've done HQV...."

ShizNet - Friday, February 9, 2007
A big part of GPU driver problems is backwards compatibility [GF2-7, Rad. 7-X1, DX6-9...]; DX10 is a totally new beast - why not draw the line and, from now on, develop separate drivers for legacy devices and DX10+ ones?
That would keep both old and new drivers in 'good' shape, and there'd be no need for bloated files full of old junk.
Wwhat - Sunday, February 11, 2007
Since DX10 is Vista-only and Vista uses a whole new driver model, it is obvious and inevitable that separate drivers are being developed for post-DX10 hardware, heh. So why are you asking for something that everybody already knows is going on and sees happening? Have you not heard about the issues concerning Vista and the trouble the graphics companies have had releasing drivers for it?
Plus, since ATI-nay-AMD has lots of X1-series-only stuff, it's clear they have already separated their drivers in that sense too.
kilkennycat - Thursday, February 8, 2007
I'm sure that Silicon Optix would be only too happy to quickly develop a hardware HDTV silicon solution for nVidia and ATi/AMD or their board partners as a manufacturing option for their graphics cards. No doubt Silicon Optix developed the HD HQV tests both to weed out the under-performers AND to encourage the widest possible use of their silicon. It would save nVidia and ATi the bother of even more driver complication and possible tweaks to their GPU hardware (for mucho, mucho $$) for the few who want the highest-quality HD replication (regardless of whether the source is 1080p, 1080i, or even 720p) from their PCs. The same few would probably be only too willing to shell out the $50 or so extra for a "high-quality HD" option version of their favorite video card.

abhaxus - Thursday, February 8, 2007
I use either VLC or DScaler to watch 1080i on my PC. I've got an X800XL, so I don't have the ability to use Avivo. I'd be interested to see how this disc fares on those two solutions; I've always liked VLC's X method of deinterlacing.

RamarC - Thursday, February 8, 2007
The testing seemed to focus on de-interlacing issues. HD DVD (and Blu-ray) are intended to store progressive (non-interlaced) content. Some early titles (and crappy transfers) may be stored as 1080i, but by the middle of this year, 95%+ of all HD titles will be 1080p and de-interlacing will be a non-issue.

ShizNet - Thursday, February 8, 2007
why focus on interlaced content?
can you say TV-broadcasting?
the same 95%+ of 'stuff' you'll be watching is TV/Cable/Dish [which are 1080i], not [HD]DVDs or IPTV, for the next 5 years+
even when all TV stations go digital, it's only 540p - don't confuse that with HDTV, 720/1080[i/p]. Only the BIG ones with deep pockets will go HDTV full time.
autoboy - Thursday, February 8, 2007
You guys are missing the point of this test. Broadcast TV is almost all 1080i content, and deinterlacing is very important. The HD DVD is simply the source of the benchmark, but it should be able to test the playback capability of PCs for broadcast HD as well as interlaced HD DVD where it exists. Playing progressive scan images is easy, and the only thing that should affect it is the noise reduction, which I don't use because it usually reduces detail. Still... this article left me with more questions than answers.
What decoder did you use for the ATI and Nvidia tests? The Nvidia PureVideo decoder or PureVideo HD?
Did you turn on cadence detection on the ATI card and inverse telecine on the Nvidia card?
What video cards did you use? You usually use a 7600 GT and an X1900 Pro.
What drivers did you use?
What player did you use?
Is this test only for HD DVD decoders, or can you use any MPEG-2 decoder? That would make this a much more relevant test, since 1080i HD DVD is rare and broadcast HD is what really matters here.
What codec does HD HQV use? MPEG-2? VC-1? H.264? Most VC-1 and H.264 content is progressive scan anyway, and Nvidia does not claim to support PureVideo with anything but MPEG-2.
Did you turn on noise reduction in the Nvidia control panel?
Why does Nvidia claim HD spatial-temporal deinterlacing, HD inverse telecine, HD noise reduction, etc. in their documentation but cannot do any of the above in reality? Is this H.264 and not supported?
hubajube - Thursday, February 8, 2007
Well, this settles whether or not I build an HTPC for HD movie playback. This, combined with needing a fast (read: expensive) CPU as well as an HDCP-capable video card, pretty much kills an HTPC in the short term. I'll just get a standalone player for now.

cjb110 - Thursday, February 8, 2007
Could you get more HD DVD players and push them through this test?! Also include the DVD results too, as it's no good if a player can only do one format correctly.
tbh I think it's pretty atrocious that only recently, with the Denon 5910 and the Oppo players, have we gotten a DVD player that actually plays DVDs 'properly'.