# ATI & NVIDIA Video Enhancement Quality Tested



## W1zzard (Jun 16, 2010)

High-Definition content is available everywhere nowadays. We take a close look at what the drivers from AMD and NVIDIA offer to improve the quality of video playback. HQV Benchmark 2.0 uses 39 different tests that put the available image enhancement features in the spotlight.

*Show full review*


----------



## EastCoasthandle (Jun 16, 2010)

No surprise there.  Is the HQV Benchmark 2.0 something users can try for themselves (free/demo)?


----------



## W1zzard (Jun 16, 2010)

unfortunately it's not free. if you are a member of the press, they have a form on their site


----------



## WarEagleAU (Jun 16, 2010)

well you could give us a link to said form, ha ha. Glad to see both ATI and NVidia are taking control of video processing and realizing it is more about movies and content than about games these days.


----------



## DannibusX (Jun 16, 2010)

Nice review!  I never even thought of benchmarking for video playback.


----------



## mstenholm (Jun 16, 2010)

Did I miss that part where you did a monitor calibration with a good calibrator? Skin tones will be off with most consumer monitors such as the LG out of the box.


----------



## W1zzard (Jun 16, 2010)

mstenholm said:


> Did I miss that part where you did a monitor calibration with a good calibrator? Skin tones will be off with most consumer monitors such as the LG out of the box.



hqv accounts for that by switching a "normal" skin image with yellow- and red-tinted versions


----------



## W1zzard (Jun 16, 2010)

those are the 3 test images, no matter your calibration you can clearly see the differences. 

with proper correction all 3 skin colors will look right and the shirt and background color won't change


----------



## Delta6326 (Jun 16, 2010)

sweet review!


----------



## DaC (Jun 16, 2010)

Does someone know where to find a tutorial about image calibration? I really can't make up my mind on how to configure playback settings.
BTW, really nice review Wizz.


----------



## mstenholm (Jun 16, 2010)

DaC said:


> Does someone know where to find a tutorial about image calibration? I really can't make up my mind on how to configure playback settings.
> BTW, really nice review Wizz.



First step would be to buy/borrow a calibrator. A cheaper and way less accurate method is to obtain a correctly "exposed" photo and adjust your monitor until you are happy. Let's assume that you took any of these three photos. If you used the first one and got your monitor to show natural skin tones, then all other photos/videos/whatever would have a wrong color/temperature/gamma.


----------



## Izliecies (Jun 16, 2010)

W1zzard said:


> those are the 3 test images, no matter your calibration you can clearly see the differences.
> 
> with proper correction all 3 skin colors will look right and the shirt and background color won't change



Which one is the "right one"?


----------



## Mistral (Jun 16, 2010)

Izliecies said:


> Which one is the "right one"?



If I had to guess, the first. Or maybe the third. The second one is way off, too yellowish. Then again, that's on my screen.

Great read, thanks W1zz.


----------



## newtekie1 (Jun 16, 2010)

Very interesting read.  However, I wonder how much of a difference this actually makes.  I mean, it seems to me like this is one of those "you'll notice it when you freeze the image and look at it really really hard, but not when you are actually watching the movie" type of things.  I'm being serious here: how much of a real-world difference is there between the worst and the best in this test?  Is there really going to be a noticeable difference between an HD4290 w/ Defaults or Optimized and an HD5870 w/ Default or Optimized?


----------



## erocker (Jun 16, 2010)

newtekie1 said:


> Very interesting read.  However, I wonder how much of a difference this actually makes.  I mean, it seems to me like this is one of those "you'll notice it when you freeze the image and look at it really really hard, but not when you are actually watching the movie" type of things.  I'm being serious here: how much of a real-world difference is there between the worst and the best in this test?  Is there really going to be a noticeable difference between an HD4290 w/ Defaults or Optimized and an HD5870 w/ Default or Optimized?



I know I get a pretty drastic difference between the "out of the box" settings for my 5850 and going into CCC and setting my video settings. I remember helping DonInKansas adjust his video settings and he was pretty amazed at the difference.

I do wish both ATi and Nvidia would have a nice mix of different presets to choose from.


----------



## Sasqui (Jun 16, 2010)

Way interesting.  Personally, I've been watching more and more DVD content from my PC and noticed a big improvement in 1080 upscale quality with my 5870 over my old 4870 card. (perhaps it's the software...?)

Would like to hear details on what is "optimized"?  Was that simply trial and error to find the best driver settings?


----------



## air_ii (Jun 16, 2010)

I read on another forum that the "optimised" settings will be integrated as default in the next Catalyst release (no clue why they didn't do it in the 10.6).


----------



## W1zzard (Jun 16, 2010)

air_ii said:


> I read on another forum that the "optimised" settings will be integrated as default in the next Catalyst release (no clue why they didn't do it in the 10.6).



yes i mentioned that too. what they will enable by default is our "default on" settings.
since our "optimized" ones were tuned test by test, there is no single setting that fits all tests


----------



## W1zzard (Jun 16, 2010)

Sasqui said:


> Would like to hear details on what is "optimized"?  Was that simply trial and error to find the best driver settings?



yes that's basically it. you can change the sliders with ccc in the foreground and the video in the background will change instantly


----------



## wahdangun (Jun 16, 2010)

wizz, can u give a photo alongside the score? it's hard to imagine just by looking at the score.

maybe each photo in out-of-the-box, default on, and optimized.


----------



## trt740 (Jun 16, 2010)

Very nice review well done


----------



## erocker (Jun 16, 2010)

Didn't even notice you are using Catalyst 10.6 in this review. Looks like AMD is updating their site right now. Can't wait to try them out.


----------



## trickson (Jun 16, 2010)

THANK YOU FOR THE GREAT REVIEW !!!!!! I want to see them 10.6 drivers and would really love to use that benchmark for mine. But it is not free (yet). Anyway, keep up the great work.


----------



## Sasqui (Jun 16, 2010)

W1zzard said:


> yes that's basically it. you can change the sliders with ccc in the foreground and the video in the background will change instantly



I can't recall if all of the sliders have the performance vs. quality level listed with them.  If so, I think you're saying you simply bumped all of them to highest quality for the best test.

I can't imagine a 5870 even blinking with everything set to highest.


----------



## DannibusX (Jun 16, 2010)

W1z, just wanted to give you a heads up if you didn't know, but Terry Makedon linked this review on his Twitter account.

http://twitter.com/CatalystMaker


----------



## shevanel (Jun 16, 2010)

very cool.


----------



## mechtech (Jun 16, 2010)

Hey W1ZZ.

What were the "optimized" settings you used for the radeon 5870?

Were they just set in Catalyst Control Center?

If so could you please share them with us, pretty please


----------



## Steevo (Jun 16, 2010)

I have been using deblocking, mosquito noise removal, edge enhancement, and denoise in CCC. Are these the adjustments you used W1zz?


----------



## W1zzard (Jun 16, 2010)

Steevo said:


> I have been using deblocking, mosquito noise removal, edge enhancement, and denoise in CCC. Are these the adjustments you used W1zz?



yes, just play the video, open ccc and change the sliders as you watch the video, find out what works best for you



wahdangun said:


> wizz, can u give a photo alongside the score? it's hard to imagine just by looking at the score.
> 
> maybe each photo in out-of-the-box, default on, and optimized.



i was considering that, but realized it is impossible to take decent photos of the screen that show the scoring criteria.

1) screenshot = no-no because of hdcp
2) camcorder = won't work because it's only full hd, which is not enough resolution to capture another 1080p screen at full res
3) digital camera = didn't succeed, mostly because exposure is either too short or too long


----------



## crow1001 (Jun 16, 2010)

Superb article, cheers.


----------



## SK-1 (Jun 17, 2010)

mechtech said:


> Hey W1ZZ.
> 
> What were the "optimized" settings you used for the radeon 5870?
> 
> ...



+1 please.


----------



## Steevo (Jun 17, 2010)

I have mine set at 16 edge enhancement, 11 denoise, 18 mosquito noise removal, and 13 deblocking, and the deblocking varies with content for me.


----------



## theubersmurf (Jun 17, 2010)

I'm not surprised ATI won out there, they emphasized video heavily for several years in the middle of the decade.


----------



## Hayder_Master (Jun 17, 2010)

it's a great idea to see a new test and benchmark for graphics cards on our site, awesome work w1zzard


----------



## Mussels (Jun 17, 2010)

that's quite a lot of change from 'out of the box' to the optimised results


----------



## naoan (Jun 17, 2010)

awww, the deblocking and mosquito noise removal are only available on the 5XXX series... well I guess avisynth will do for now.
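for anyone curious what a "denoise" slider actually does underneath: the crudest spatial denoise is just a neighborhood average. here is a toy numpy sketch of that idea (my own stand-in, not the actual filter avisynth or CCC implements); it also shows why denoising eats grain and real detail alike:

```python
import numpy as np

def box_denoise(img, radius=1):
    """Naive box-blur denoiser for a 2-D grayscale image: replace each
    pixel with the mean of its (2*radius+1)^2 neighborhood. Borders are
    handled by edge padding. Noise is averaged away, but fine detail
    (film grain included) goes with it."""
    pad = np.pad(img.astype(float), radius, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    k = 2 * radius + 1
    # sum all k*k shifted copies of the padded image, then average
    for dy in range(k):
        for dx in range(k):
            out += pad[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)
```

on a flat noisy patch the standard deviation drops nicely, but run it over a sharp edge and the edge gets smeared too, which is exactly the detail-loss trade-off being argued about later in this thread.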


----------



## wahdangun (Jun 17, 2010)

W1zzard said:


> yes, just play the video, open ccc and change the sliders as you watch the video, find out what works best for you
> 
> 
> 
> ...



can u use non-DRM video? like a movie trailer?


----------



## W1zzard (Jun 17, 2010)

wahdangun said:


> can u use non-DRM video? like a movie trailer?



afaik you can't take a screen capture of gpu-accelerated video. you try it, and if it works i can look into it further


----------



## OneCool (Jun 17, 2010)

Mussels said:


> thats quite a lot of change from 'out of the box' to the optimised results



Agreed!

It surprised me that the IGP was holding its own against the big dogs.


Although the outcome is not surprising at all. ATI has always had better video quality than Nvidia IMO.


----------



## wahdangun (Jun 17, 2010)

W1zzard said:


> afaik you can't take a screen capture of gpu-accelerated video. you try it, and if it works i can look into it further




what method are you using (a third-party program, or just prt-scr)?

maybe u can screen capture it by using a special program, just like when u want to capture in-game footage (u can't just use the prtscr key)?

btw I will search for the program and try it (I will PM u if it works).


----------



## W1zzard (Jun 17, 2010)

the official and probably legal answer is that you can't take screen captures of bluray hd content. print screen and screen capture utilities don't work, and powerdvd's save capture option is disabled for all bd content


----------



## Roph (Jun 17, 2010)

It's due to hardware overlays. http://en.wikipedia.org/wiki/Hardware_overlay#Screenshots


----------



## wahdangun (Jun 18, 2010)

W1zzard said:


> the official and probably legal answer is that you can't take screen captures of bluray hd content. print screen and screen capture utilities don't work, and powerdvd's save capture option is disabled for all bd content



why must it be blu-ray HD content? can't u use other HD content (like a movie trailer, a home-made video, or any freely distributed video)?


----------



## vidius maximus (Jun 18, 2010)

*Possible Tallying Error?*

Mmmm, looking at where the best card lost points, it didn't add up. On the individual scores page AMD scored a perfect 5/5 on every test except two. They fell short by only 8 points. But the total was 197/210 - that's 13 points shy of perfect. Should AMD's top score be 202 out of 210?


----------



## Mussels (Jun 18, 2010)

wahdangun said:


> why must it be blu-ray HD content? can't u use other HD content (like a movie trailer, a home-made video, or any freely distributed video)?



the test is about quality of played media that many people will have. testing a home video or a trailer is pointless, because the quality people care about is DVD/BR media - the stuff they'll actually use.


----------



## wahdangun (Jun 18, 2010)

Mussels said:


> the test is about quality of played media that many people will have. testing a home video or a trailer is pointless, because the quality people care about is DVD/BR media - the stuff they'll actually use.



but there are some HD trailers (and they have really good quality),


----------



## Noy (Jun 18, 2010)

vidius maximus said:


> Mmmm, looking at where the best card lost points, it didn't add up. On the individual scores page AMD scored a perfect 5/5 on every test except two. They fell short by only 8 points. But the total was 197/210 - that's 13 points shy of perfect. Should AMD's top score be 202 out of 210?



I think the table is missing a line of scores, as the maximum is 205, not 210 (from that table).
I personally only ever use CCC to boost the gamma in dark movies. I really hate when a movie is so dark that you can't see anything, but that might just be me on my CRT.


----------



## W1zzard (Jun 18, 2010)

ah, found the problem... the maximum score for skin tone correction is 10, not 5. that's a typo in the (5) number; the scoring was done correctly out of 10 (no setup can do it perfectly)


----------



## HillBeast (Jun 21, 2010)

Nice to see this sort of review. I've always wondered how the cards fared in terms of videos quality. Still in my opinion the key decider of whether a movie will look good on your computer comes down to your monitor and how well you have set that up. Even the best card can't make a rubbish screen look good.


----------



## pr0n Inspector (Jun 23, 2010)

moar post-processing filters = better quality? :shadedshu


----------



## Super XP (Jun 25, 2010)

Very nice indeed, thanks.


----------



## Indra EMC (Jun 28, 2010)

As you can see...

ATI Gpu always best for video.


----------



## shaddix (Aug 3, 2010)

> Personally I don't see the point of "film" grain noise included in many Blu-Ray movies, it would be easy to remove in post-processing and just creates an impression of lower fidelity for the majority of users, that's where the processing filters of graphics cards can shine: don't like the noise? Get rid of it with a few mouse clicks.



This is an ignorant statement, you should read up more on film and video if you're going to be writing articles on the subject.

This is like saying you don't see the point of pixels in video games as they get in the way of the image quality. The pixels (film grain) ARE the image. Removing grain always removes detail.


----------



## Mussels (Aug 3, 2010)

shaddix said:


> This is an ignorant statement, you should read up more on film and video if you're going to be writing articles on the subject.
> 
> This is like saying you don't see the point of pixels in video games as they get in the way of the image quality. The pixels (film grain) ARE the image. Removing grain always removes detail.



no... film grain is added deliberately in some movies and games. you don't even know what we're talking about.


----------



## shaddix (Aug 3, 2010)

Mussels said:


> no... film grain is added deliberately in some movies and games. you don't even know what we're talking about.



Then you should have stated as much. However, it's still irrelevant.
Purposeful film grain such as in Predator, or artificial grain such as in Battlestar Galactica, destroys detail if removed; there's no way around this. And stating that removing grain/noise improves image quality is like saying removing the stars gives you a better view of the night sky.

Do you think this:

looks better than this?:


----------



## Mussels (Aug 3, 2010)

why should i have been the one to state it? i didn't write anything in the article.

you've really done your research if you didn't even look at the author's name.

the first one clearly looks better, without the artifacting caused by the grain.


----------



## shaddix (Aug 3, 2010)

Mussels said:


> the first one clearly looks better, without the artifacting caused by the grain.


----------



## Mussels (Aug 3, 2010)

shaddix said:


>



so what's your intention here? what would you like to say?


----------



## HillBeast (Aug 3, 2010)

shaddix said:


>



Dude, are you just trolling for money or something? How was Mussels even remotely involved in the article? It was a W1zzard article. What's with the rolling emoticon?


----------



## Steevo (Aug 3, 2010)

The first one is cartoonish as it is way overprocessed. The second one is shit as the pixel noise is horrible. Somewhere between the two is the best.


Really though shaddix, you need a lesson on compression artifacting, lens issues, and noise introduced by analog circuitry. That is the cause of image grain: thermal noise when the voltage to a camera's CCD has to be turned up, or grain caused by minor differences in the actual film between each frame and the CCD capturing it, with the studio trying to remove it with the least loss of detail and/or the least amount of work.

So each "1080P high def" release will be very different, and the use of post-processing filters plays a large part in the final product. I prefer no noise as it looks like shit, and my personal high-def "prosumer grade" camcorder will produce better results than that. It makes me think there are monkeys doing the recapture if it was from film, or blithering idiots working the digital media.


All of battlestar was shot in digital HD, but the idiots who wanted to paint a futuristic picture decided that in the future, unlike today, camera standards have gone to shit. Apparently the entire show was shot from the perspective of some monkey man with a betamax camcorder following people around.


----------



## Mussels (Aug 4, 2010)

Steevo said:


> All of battlestar was shot in digital HD, but the idiots who wanted to paint a futuristic picture decided that in the future, unlike today, camera standards have gone to shit. Apparently the entire show was shot from the perspective of some monkey man with a betamax camcorder following people around.



and that's exactly the kind of film grain we're talking about. BSG was deliberately shot with a wobbly camera and extra grain added to make it look 'darker' and more 'gritty' (their words, from an interview i read).


----------

