ATI's Avivo vs. NVIDIA's PureVideo: De-Interlacing Quality Compared
by Anand Lal Shimpi on October 5, 2005 9:00 AM EST - Posted in GPUs
De-Interlacing Quality: Jaggies Pattern 2
"Motion adaptive de-interlacing is just one step in cleaning up scan line artifacts. The video signal processor should also employ directional filtering to catch rapidly moving shapes that may change direction, speed, and angle.NVIDIA has the clear advantage in this test, as ATI's de-interlacing shows jaggies in all three bars, while NVIDIA's leaves only the bottom bar with jagged edges.
In the second jaggies test, you'll see a cluster of three equally spaced white bars of the same thickness, rapidly moving up and down between a 5-degree and 35-degree angle. If all three bars appear to have jagged edges at all times, the video processor does not use directional filtering to smooth the images. If all three bars are smooth throughout the test, the video processing earns a passing grade."
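The "directional filtering" this test looks for is a form of edge-directed interpolation: when reconstructing a missing scan line, the processor interpolates along the local edge direction rather than straight down the column. As a rough illustration of the idea, here is a minimal Python/NumPy sketch of classic edge-based line averaging (ELA); this is a textbook technique, not either vendor's actual hardware algorithm:

```python
import numpy as np

def ela_deinterlace_field(field_lines: np.ndarray) -> np.ndarray:
    """Edge-based Line Averaging (ELA): rebuild the missing scan lines of a
    single field by averaging along whichever direction (left diagonal,
    vertical, right diagonal) shows the smallest luma difference.
    `field_lines` holds one field's scan lines; the gaps are synthesized."""
    h, w = field_lines.shape
    out = np.empty((h * 2, w), dtype=field_lines.dtype)
    out[0::2] = field_lines                      # keep the real scan lines
    for y in range(h - 1):
        above, below = field_lines[y], field_lines[y + 1]
        for x in range(1, w - 1):
            # Candidate directions, scored by how well the endpoints match.
            cands = [
                (abs(int(above[x - 1]) - int(below[x + 1])), above[x - 1], below[x + 1]),  # "\"
                (abs(int(above[x])     - int(below[x])),     above[x],     below[x]),      # "|"
                (abs(int(above[x + 1]) - int(below[x - 1])), above[x + 1], below[x - 1]),  # "/"
            ]
            _, a, b = min(cands, key=lambda c: c[0])
            out[2 * y + 1, x] = (int(a) + int(b)) // 2
        # Image borders: fall back to plain vertical averaging.
        out[2 * y + 1, 0] = (int(above[0]) + int(below[0])) // 2
        out[2 * y + 1, -1] = (int(above[-1]) + int(below[-1])) // 2
    out[-1] = field_lines[-1]                    # duplicate the last line
    return out
```

A processor that only ever averages vertically (the middle candidate) is exactly what this HQV test catches: the moving bars stay jagged because diagonal edges are never interpolated along their own direction.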
[Comparison screenshot: ATI's image quality shown; NVIDIA's visible on mouseover in the original article.]
| Scoring | Description |
|---------|-------------|
| 5 | All three bars have smooth edges at all times |
| 3 | The top two bars have smooth edges, but the bottom bar does not |
| 1 | Only the top bar has smooth edges |
| 0 | None of the bars have smooth edges |
20 Comments
ST - Thursday, October 6, 2005 - link
Any chance you can get 1080i deinterlacing tests in the future? 480i source material is fine, but with OTA HDTV widely available now, and the 7800 GT/GTX line flaunting HD spatial-temporal deinterlacing, I'm sure this is what most readers want to know about.

ksherman - Wednesday, October 5, 2005 - link
I installed PureVideo, but what players actually take advantage of it?

rbV5 - Wednesday, October 5, 2005 - link
It's nice to see a detailed look into a vastly overlooked area of video card performance. Kudos for using a standard to measure by; now if we see more of this type of scrutiny from more reviewers, perhaps we'll actually get to see these features enabled rather than reading about how great it's going to be some day. Now let's take a good look at connectivity, custom resolution support, 1:1 pixel mapping, codec support...
LoneWolf15 - Wednesday, October 5, 2005 - link
nVidia already fooled me with this once. They called it PureVideo, and I bought a GeForce 6800 AGP and waited eagerly for driver support that never came for hardware decode of HD WMV files (or hardware encode of MPEG), because the NV40/45 design was borked. nVidia left every single user that bought an NV40/45 card in the lurch. No recourse. So everyone who bought one with the hope of using PureVideo was screwed.

Not making that mistake again with any company. If a feature isn't supported at the time I purchase a product, that feature doesn't exist. I'm not going to believe press releases anymore, seeing as touted features can be revoked if drivers or hardware don't work out right. Never again.

Note: I now own an ATI X800XL, and have nothing against ATI or nVidia other than that I'm too cynical to believe either of them on any feature until I see that feature in action.
Lifted - Wednesday, October 5, 2005 - link
I was thinking the exact same thing. Never again will I buy something that will have features added at a later date. This is just a marketing tactic because they already know the hardware won't handle what they promised.

Patman2099 - Wednesday, October 5, 2005 - link
Is it just me, or is there no mention in the article of what deinterlacing option they used on the ATI board? You can change it in CCC; I've found that Adaptive looks best on my Radeon 9700.

Which deinterlacing mode was used?
Anand Lal Shimpi - Wednesday, October 5, 2005 - link
I just amended the article to include this information: "Both the ATI and NVIDIA drivers were set to auto-detect what de-interlacing algorithm the hardware should use. We found that this setting yielded the best results for each platform in the HQV benchmark."
If I forced the adaptive or motion adaptive settings, some of the HQV tests did worse, while none improved in image quality.
Take care,
Anand
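For context on the setting names discussed above: "motion adaptive" de-interlacing weaves the two fields together where the picture is static and falls back to single-field interpolation (bob) where it detects motion. A toy Python/NumPy sketch of that idea follows; it is not ATI's or NVIDIA's actual implementation, and the motion threshold is an arbitrary assumption:

```python
import numpy as np

def motion_adaptive_deinterlace(prev_frame, cur_top_field, cur_bot_field, thresh=12):
    """Simplified per-pixel motion-adaptive deinterlacing: weave (keep both
    fields) where the picture is static, bob (average adjacent lines from
    one field) where motion is detected. Inputs are full-height luma
    arrays; each field's valid data sits on alternating rows."""
    woven = np.empty_like(cur_top_field)
    woven[0::2] = cur_top_field[0::2]            # top field rows
    woven[1::2] = cur_bot_field[1::2]            # bottom field rows

    # Bob: rebuild the bottom-field rows from the top field alone.
    bobbed = woven.copy()
    bobbed[1:-1:2] = (woven[0:-2:2].astype(np.int16) +
                      woven[2::2].astype(np.int16)) // 2

    # Motion map: where the new frame differs from the previous one,
    # weaving would show comb artifacts, so use the bobbed pixels there.
    motion = np.abs(woven.astype(np.int16) - prev_frame.astype(np.int16)) > thresh
    return np.where(motion, bobbed, woven).astype(woven.dtype)
```

The quality differences HQV measures come largely from how well the motion map is computed; a crude frame difference like the one above misclassifies noise as motion, which is presumably why forcing a mode rather than letting the driver auto-detect can hurt some tests.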
user loser - Wednesday, October 5, 2005 - link
Am I the only one who thinks the NV version of "De-Interlacing Quality: Vertical Detail" (page 3) is worse? Some of the red/green alternating lines are completely green or lose detail. Compare to the original:

http://www.belle-nuit.com/testchart.html

(720 x 486, NTSC)
And how often do the different film cadence modes really get used? (However, they get the same amount of points (weight) as some of the more elementary tests.) And I can't tell the functional difference between ATI and NV in the second image on page 9, "De-Interlacing Quality - Mixed 3:2 Film With Added Video Titles."

Or are the reasons for these differences only visible in moving video?
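For context on the cadence question: 3:2 pulldown is how 24 fps film is fitted into NTSC's ~60 fields per second, and it is the pattern the HQV cadence tests exercise. A minimal Python sketch of the field sequence a cadence detector has to lock onto (the frame letters and field-parity labels are illustrative):

```python
# 3:2 pulldown: four 24 fps film frames (A, B, C, D) become ten 60 Hz fields.
film = ["A", "B", "C", "D"]
pulldown = []
for i, frame in enumerate(film):
    copies = 3 if i % 2 == 0 else 2              # alternate 3 fields, 2 fields
    for _ in range(copies):
        parity = "t" if len(pulldown) % 2 == 0 else "b"   # top/bottom field
        pulldown.append(frame + parity)

print(pulldown)
# ['At', 'Ab', 'At', 'Bb', 'Bt', 'Cb', 'Ct', 'Cb', 'Dt', 'Db']
# A cadence detector looks for the repeated fields (At...At, Cb...Cb).
# Once the 3:2 rhythm is locked, fields from the same film frame can be
# woven back together losslessly instead of being deinterlaced.
```

This is why cadence handling matters for film-sourced DVDs and broadcasts: detect the pattern correctly and the original progressive frames come back for free; miss it and you deinterlace material that never needed it.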
TheSnowman - Wednesday, October 5, 2005 - link
[quote]And I can't tell the functional difference between ATI/NV in the second image in page 9 "De-Interlacing Quality - Mixed 3:2 Film With Added Video Titles". Or are the reasons for these differences only visible in moving video?[/quote]

Nah, de-interlacing artifacts would always turn up in a proper still framegrab and be easier to see that way as well, but I can't see any de-interlacing artifacts in any of the shots that are claimed to have such issues, so I'm at a loss to understand the author's conclusions on that page. The first ATI shot does show some nasty compression for some reason or another, but I don't see any interlacing issues in the shots on that page from either ATI or Nvidia.
Anand Lal Shimpi - Wednesday, October 5, 2005 - link
It's tough to see here, but those are actually supposed to be interlacing artifacts. They appear as compression artifacts here, but in motion you get a very clear lined pattern.

Take care,
Anand