# Bad image quality for NVIDIA GPUs at default settings



## birdie (Jul 4, 2015)

Some people claim that NVIDIA GPUs produce worse image quality than AMD GPUs at the same in-game settings, which gives NVIDIA an edge.

One example:





Another example:




I wonder if W1zzard could write an article comparing image quality in BF4 and GTA5 (these are the games where the difference is the most striking) for top NVIDIA (980Ti/Titan X) and AMD GPUs (R9 390X/Fury X).

Thank you.


----------



## MxPhenom 216 (Jul 4, 2015)

It's mostly color differences, I think.


----------



## R-T-B (Jul 4, 2015)

This myth is as old as time, I think... it started in force when the FX series cheated in benchmarks like 3DMark by not rendering parts of the scene. I don't think there's any truth to it anymore.


----------



## the54thvoid (Jul 4, 2015)

Okay - top one - what?

Bottom one - big diffs, but it's an AMD uber title and they're 660 Tis, ffs.

This thread will descend into farce quite quickly.  Sleeping Dogs is a very AMD-centric title (much like Grid 2(?), where the lighting FX were massively suited to AMD at the time).

Nvidia do a lot of work to improve gfx quality.  The problem is they lock it to Nvidia hardware (Gameworks).

This thread will suck.  It's gonna be carnage.


----------



## P4-630 (Jul 4, 2015)

All my first graphics cards were ATI. Then I tried an XFX GeForce 6800 GS XXX and was unhappy with the image quality at the time, so I bought an ATI X1600 Pro, which had better image quality back then. Later I had a laptop with a GT 425M, and now I'm using a laptop with a GTX 770M, and I'm quite happy with the image quality.
Not sure if there still is a big difference in image quality with the latest hardware; it's all just better than it used to be.


----------



## rtwjunkie (Jul 4, 2015)

I'm not sure I understand your premise that Nvidia's worse image quality gives them an edge.  Huh? An edge in what?

I think it depends on the user.  I prefer quality, but at a framerate that doesn't suck, so I can actually see it.  That means at LEAST 35 fps, with top-level images in games.  So if, in fact, because I am using an Nvidia card right now, you say I have worse image quality, how does that give me an edge?  I don't follow.

Anyway, what can be gained by deliberately starting what is sure to be a back-and-forth flame war?


----------



## AsRock (Jul 4, 2015)

The examples are terrible; they're not even the same scene. When this is done, it needs to be 100% the same picture at the very least.  It's best done with games that let you save in the area you want to test, and also with games that reconfigure for the graphics card change, as not all do.

Back in the Ghost Recon days I noticed the in-game menu would be transparent on ATI cards but not on nVidia's.


----------



## BiggieShady (Jul 4, 2015)

I feel like AMD and Nvidia have slightly different default contrast and vibrance in their drivers, which is the cause of the differently perceived colors.


----------



## birdie (Jul 4, 2015)

MxPhenom 216 said:


> Its mostly color differences I think.



Oh, maybe, just maybe, you've failed to notice the gigantic difference in anisotropic filtering, which is mostly missing in NVIDIA's screenshots.


----------



## birdie (Jul 4, 2015)

rtwjunkie said:


> I'm not sure I understand your premise that Nvidia's worse image quality gives them an edge.  Huh? An edge in what?
> 
> I think it depends on the user.  I prefer quality, but at least at a framerate that doesn't suck and make me able to see it.  That means at LEAST 35 fps, with top level images in games.  So, if in fact because I am using an Nvidia card right now, you say I have worse image quality, which gives me an edge?  I don't follow.
> 
> Anyway, what can be gained by deliberately starting what is sure to be a back and forth flame war?



Worse image quality usually means that fewer resources are spent rendering the frame, which means higher FPS. I wonder why that's so difficult to grasp.


----------



## rtwjunkie (Jul 4, 2015)

birdie said:


> Worse image quality usually means that less resources are spent rendering the frame which means higher FPS. I wonder why that's so difficult to grasp.



I explained why it's difficult to grasp, but you failed to read.

Your premise assumes that all people value FPS over image quality.  As I tried to convey, I would much rather force my images to top quality and sacrifice FPS, as long as it remains "playable", which for me is above 35 fps.

Since not everyone values FPS over image quality, the premise of your question is without merit, rendering further queries in this soon-to-be flame war moot.


----------



## MxPhenom 216 (Jul 4, 2015)

I don't know what you're getting at, but the anisotropic filtering is there in the first comparison, and it's comparing the same NVIDIA card against itself, not NVIDIA vs AMD, at default settings and high. And then Watch Dogs is a Gaming Evolved title.

EDIT: What is with all these newer people posting stuff like this who are clearly pro-AMD? It's like AMD has their own internet warriors now.


----------



## birdie (Jul 4, 2015)

MxPhenom 216 said:


> I dont know what you are getting at but the anisotropic filtering is there in the first comparison, and its comparing the same card NVIDIA card, not NVIDIA vs AMD, with default settings and high. And then Watch Dogs is Gaming Evolved title.
> 
> EDIT: What is with all these newer people posting stuff like this that are clearly pro AMD, Its like AMD has their own internet warriors now.



I'm not a fan of either company. I've just tried to draw people's attention to this ostensible discrepancy, which might not be real/present at all; that's why I asked for an independent comparison/review.



rtwjunkie said:


> I explained why it's difficult to grasp, but you failed to read.
> 
> Your premise assumes that all people value fps over image quality.  As I tried to convey, I would much rather force my images to top quality and sacrifice fps, as long as it remained "playable" which is above 35fps.
> 
> Since not everyone else either.values fps over image quality, the premise of your question is without merit, rendering further queries into this soon to be flame war moot.



My premise is that some people allege that NVIDIA is cheating, and thus its GPUs are in fact slower than competing AMD solutions. I'm not talking about the merits of higher FPS - that doesn't bother me at all. What bothers me is that if NVIDIA is indeed cheating, then we must reassess its performance metrics.


----------



## rtwjunkie (Jul 4, 2015)

No, you clearly said that Nvidia's worse image quality gives them an edge.

I'm making you aware that getting maximum FPS is not the most important thing for all people.

And thank you for confirming @MxPhenom 216's supposition that you are attempting to stir things up.


----------



## R-T-B (Jul 4, 2015)

BiggieShady said:


> I feel like AMD and Nvidia have slightly different default contrast and vibrance in drivers which is cause of differently perceived colors.



They do.  I can confirm I had to tweak my monitor's tint/color values when switching, to make it look the same as my old AMD.

That said, NVIDIA turns on several "optimizations" in their drivers by default.  These are shortcuts that won't really be noticed in real time, but shortcuts all the same.  AMD does the same with their drivers; they just do it less aggressively, and it appears in different areas.  However, only NVIDIA provides a way to turn said optimizations off.  See my settings:








rtwjunkie said:


> And thank you for confirming @MxPhenom 216's supposition that you are attempting to stir things up.



There may be a legit discussion here yet (vendor optimizations), but I am not confident we are mature enough to have it, honestly.


----------



## erocker (Jul 4, 2015)

MxPhenom 216 said:


> Its mostly color differences I think.


Huh? Are you looking at different screenshots than I'm looking at?


----------



## birdie (Jul 4, 2015)

rtwjunkie said:


> No, you clearly said that Nvidia worse image quality gives them an edge.
> 
> I'm making you aware that getting maximum FPS is not the most important thing for all people.
> 
> And thank you for confirming @MxPhenom 216's supposition that you are attempting to stir things up.



God. Where was I talking about "getting maximum FPS"?

Most people choose GPUs/CPUs based on their performance/price ratio, and if NVIDIA is cheating (I'm *not* claiming that - I simply don't know), then people make the wrong choices and get objectively worse results (i.e. image quality).


----------



## Steevo (Jul 4, 2015)

R-T-B said:


> They do.  I can confirm I had to tweak my monitors tint/color values when switching to make it look the same as my old AMD.
> 
> That said, NVIDIA turns on several "optimizations" in their drivers by default.  These are shortcuts that won't really be noticed in real time, but shortcuts all the same.  AMD does the same with there drivers they just do it less agressively and it appears in different areas.  However, only NVIDIA provides a way to turn said optimizations off.  See my settings:
> 
> ...




You seem to be incorrect, as AMD allows me to turn off "Surface Format Optimization", as it sometimes causes corruption in GTA5 with the current driver. I also let every game use its own in-game settings, other than Skyrim, and anisotropic filtering in GTA5.


----------



## MxPhenom 216 (Jul 4, 2015)

erocker said:


> Huh? Are you looking at different screenshots than I'm looking at?



Nope. The first pair compares the same GPU at different settings, and Watch Dogs is an AMD title. I could see a point in this thread if the first pictures were of AMD vs NVIDIA. I don't know what the default settings are for a Titan X in BF, but I'm pretty sure it's medium or low for most cards.

I have heard @MT Alex say, when he got his 770, that colors in games didn't seem as vibrant compared to his 5870s. That's why I said colors are the main difference, whether the shots in the OP show it or not.


----------



## the54thvoid (Jul 4, 2015)

I have a great idea.

How about we just ask @W1zzard if there is any merit in this discussion, as he tests gfx cards on a very regular basis.
Otherwise, this thread stands a genuine chance of becoming a pot of bile, favouritism and ignorant pseudo-analysis.
FWIW, the "AMD does better IQ" idea has been around since the year dot. I have never noticed any problems on either side, outside of developer-biased games.


----------



## Pill Monster (Jul 4, 2015)

birdie said:


> *Some people claim that NVIDIA GPUs produce worse image quality than AMD GPUs* *using the same in-game settings* which gives NVIDIA an edge.


Who.

Cares.




MxPhenom 216 said:


> EDIT: What is with all these newer people posting stuff like this that are clearly pro AMD, Its like AMD has their own internet warriors now.


I don't understand why these threads aren't locked immediately.


----------



## R-T-B (Jul 4, 2015)

Steevo said:


> View attachment 66257
> 
> 
> You seem to be incorrect, as AMD allows me to turn of "Surface Format Optimization" as it sometimes causes corruption with the current driver in GTA5. And as well I allow every game other than Skyrim, and Anisotropic filtering on GTA5, to use its own in game settings.



Good, you appear to be correct... glad to see that. I wonder how I missed that in my ownership of my R9...


----------



## GreiverBlade (Jul 4, 2015)

Well, coming from an R9 290 to a 980 ... I notice ... hum ... nothing (well, maybe since Final Fantasy XIV: Heavensward is an Nvidia GameWorks title ... that has something to do with it ...)


rtwjunkie said:


> I think it depends on the user.  I prefer quality, but at least at a framerate that doesn't suck and make me able to see it.  That means at LEAST 35 fps, with top level images in games.


Oh cool, @rtwjunkie: since my DSR 4K gives me 35 fps almost constantly, I could get a 4K display and play FFXIV HW in 4K (my limit is also 35 fps, at least)

Also ... since I have BF4 (and some other AMD-centric games ...) I could test ... the 980 ... but since I don't have quick-release fittings, I don't think I can bear the burden of emptying the loop to test the 290, then re-emptying it to test the 980.

aka: useless thread.


----------



## Xzibit (Jul 4, 2015)

MxPhenom 216 said:


> Nope. The first pair compares the same GPU to different settings, and Watch Dogs is an AMD title. I could see a point in this thread if the first pictures were of AMD vs NVIDIA. I dont know what the default settings are for titan x in bf but I'm pretty sure its medium or low for most cards.



That comparison is not referring to in-game settings; it's referring to the driver GUI slider.






The question will always be there, since driver settings can override game settings for both vendors.


----------



## GreiverBlade (Jul 4, 2015)

Xzibit said:


> That comparison is not referring to in-game settings its referring to the Driver GUI slider
> 
> 
> 
> ...


That's why I always tick "let the 3D application decide": I like to set my options in-game, so I can actually see the difference "in game".


----------



## rtwjunkie (Jul 4, 2015)

I must be missing something as well with the images.  Looking at the 660 Ti SLI vs 7950 CrossFire, it occurs to me that the 7950 image looks much closer to my Nvidia card's images in Sleeping Dogs than the 660 SLI screenie does.  I think it's merely a matter of better cards from both camps producing better pictures.  It's a really bad example to use to try and make a point.

@GreiverBlade I too prefer to set my adjustments in-game, rather than forcing anything from the driver.


----------



## Xzibit (Jul 4, 2015)

GreiverBlade said:


> that's why i always tick the "let the 3D application decide" since i like to set my options in game to see effectively the difference "in game"



You can still do both and only the applicable ones will be adjusted.


----------



## GreiverBlade (Jul 4, 2015)

Xzibit said:


> You can still do both and only the applicable ones will be adjusted.


Nah ... driver override is not acceptable for me; I never touch those settings. In-game is the way to go.


----------



## R-T-B (Jul 4, 2015)

GreiverBlade said:


> nah ... driver override is not acceptable for me i never touch those settings, in game is the way to go



I agree...  part of me is practically OCD about that.  Unless the game doesn't support AA or something silly, I almost never use driver overrides.


----------



## MxPhenom 216 (Jul 4, 2015)

GreiverBlade said:


> nah ... driver override is not acceptable for me i never touch those settings, in game is the way to go


Not if it's a game like Skyrim that uses a really crappy way of doing AF.


----------



## GreiverBlade (Jul 4, 2015)

MxPhenom 216 said:


> Not if its a game like Skyrim that uses a really crappy way of doing AF.


AF? Autofocus?  Oh, silly me ... AF ... anisotropic filtering ... right?

My Skyrim settings are always ruled by no AA or anisotropic filtering + ENB and a custom ini.


----------



## Bansaku (Jul 4, 2015)

The OP is right, but worded in such a way that people are reading into it too much. AMD and nVidia (and Intel) each have their own way of rendering a specific effect, and thus the rendered screen will have subtle differences. Example: in The Witcher 3, nVidia renders the gradient for shadows in a way that makes the effect 'smooth', whereas on AMD the same shadow will appear 'dithered'. It really depends on the game. In some titles you will see absolutely no difference (usually those from smaller studios and indie games), while AAA titles are usually endorsed (sponsored?) by either nVidia or AMD. Going back to The Witcher 3: nVidia handles AA like a champ, while I cannot even override the option in CCC. At the same time, I can go into CCC, override the game's nVidia tessellation, and use AMD's specific renderers to achieve next to no performance loss with Hairworks.

So the OP is absolutely correct! Some titles look 'worse' on nVidia, and likewise some look 'worse' on AMD.

However, it is all moot unless you have some sort of OCD and spend more time standing still, going over the scene with a fine-tooth comb, than enjoying an amazing game.


----------



## GhostRyder (Jul 4, 2015)

MxPhenom 216 said:


> Nope. The first pair compares the same GPU to different settings, and Watch Dogs is an AMD title. I could see a point in this thread if the first pictures were of AMD vs NVIDIA.


Watch Dogs is not an AMD title; it was even bundled with NVidia cards for a while and makes use of GameWorks.  The game displayed is Sleeping Dogs, which is part of AMD's program.

Meh, I'm not sure, honestly, but I have heard some people say color reproduction is better on AMD.  Personally I have never thought about comparing them on the same screen, and I could only do so with my laptop (GTX 675M) and desktop (R9 290X), but those are completely different cards and ages.  It could be true, but I have never thought about it.


----------



## kn00tcn (Jul 4, 2015)

Whoever made that first pic is a liar; they must have set in-game AF to disabled, then let the driver do nothing on the right, while the left is forced AF... That's nonsensical. I have my stuff on default (yes, the minor nvidia optimizations are enabled) & things are fine; I always use in-game settings & override only if the game lacks its own AF.

The second pic may be a bug, or the time of day, or who knows... it's odd that the shopkeeper's light is off.

These are very obvious differences, but with unclear reasons or settings; not sure how someone can't see them, they're even highlighted.

This has nothing to do with the user or what you personally like; this is about potentially higher benchmarks in reviews.

I also don't see why a 660 matters: if it's the same settings, then it should look the same. This is the first time I'm hearing that games are supposed to look different just because they're 'optimized' for AMD (Grid?? the only thing that should be different is if you use Intel with that special smoke enabled).

...I didn't quote the people I'm replying to.


----------



## Xzibit (Jul 4, 2015)

GhostRyder said:


> Meh, I am not sure honestly but I have heard some people say color reproduction is better on AMD.  Personally I have never thought about comparing it on the same screen but I could only with my laptop (GTX 675m) and desktop (R9 290X) but they are completely different cards and ages.  It could be true but I have never thought about it.



*It has to do with the default color signals*


----------



## the54thvoid (Jul 4, 2015)

Xzibit said:


> *It has do to with the default color signals*



That's why I've never noticed it; my 1440p Dell is still on a DVI connection.  My colours are as they should be.  Good to know, though, if I move to 4K, get a 980 Ti, and am required to use DP or HDMI.  Good little article.


----------



## LightningJR (Jul 4, 2015)

I remember back in the day GPU reviews did include screenshot captures to compare quality. I enjoyed that; I wish it would come back.

I always use "High Quality" over anything lower, not because I have ever noticed anything wrong; I'm just like that.


----------



## Pill Monster (Jul 5, 2015)

I'm surprised Inspector hasn't been mentioned.......


----------



## mirakul (Jul 5, 2015)

Interesting. Hey, the problem is: what settings do reviewers use in their benches? DEFAULT, mostly. And if those default settings offer poorer image quality in order to gain an edge in FPS, it's time to question nVidia.

It is reported that for a Titan X in BF4, changing from "let the 3D application decide" to a custom driver override setting that matches the Fury X's quality results in a 7 fps hit; not sure at which resolution, though.
http://www.overclock.net/t/1563386/...cn-vs-nvidia-maxwell-and-kepler#post_24126882


----------



## LAN_deRf_HA (Jul 5, 2015)

If you want a serious thread, drop the Sleeping Dogs example. That was a well-known and very game-specific issue; it's just going to make people roll their eyes at the OP's premise.


----------



## kn00tcn (Jul 5, 2015)

Xzibit said:


> *It has do to with the default color signals*


No, it doesn't/shouldn't. By that I mean it's incredibly obvious when a monitor is running in RGB limited mode over HDMI; I sure hope people who are switching/comparing vendors realize something is horribly wrong when blacks are grey... It's another reason why HDMI & TV standards suck (similarly, a lot of AMD users complain about the overscan setting).

To make matters worse, some monitors (like mine) make a mess when HDMI is used, so I have to use limited on the PS3 or any other HDMI device, then adjust my brightness/contrast as best I can.

The chatter & attempted screenshots over the years try to show small changes in color or details. In the past I really disliked nvidia's MSAA compared to ati's, and they also had worse AF until ~2008 or 2009.

BTW, that article is out of date; sometime last year NV updated NVCP to give you a dropdown for the RGB mode.

EDIT: I don't mean to make the historical context sound so biased, so let's note that FCAT had to be updated because AMD had small amounts of noise in the image that confused the color (frame) detection.
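For anyone unclear on why limited-vs-full range produces grey blacks: limited range reserves 16-235 for black-to-white, so a full-range display that isn't told to expand the signal shows 16 as dark grey instead of true black. A minimal sketch of the expansion, purely as illustration (this is my own arithmetic, not anything either driver exposes):

```python
def limited_to_full(v: int) -> int:
    """Expand a limited-range (16-235) 8-bit value to full range (0-255)."""
    v = min(max(v, 16), 235)            # clamp into the limited window
    return round((v - 16) * 255 / 219)  # 219 = 235 - 16

# Limited black (16) and white (235) map back to full-range 0 and 255;
# a display that skips this expansion renders 16 as dark grey instead of black.
print(limited_to_full(16), limited_to_full(128), limited_to_full(235))
# prints: 0 130 255
```

This is also why the difference is trivially visible on a solid black screen but easy to misread as a "color quality" gap in game screenshots.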


----------



## Xzibit (Jul 5, 2015)

kn00tcn said:


> no it doesnt/shouldnt, by that i mean it's incredibly obvious when a monitor is running in RGB limited mode over hdmi, i sure hope people that are switching/comparing vendors realize something is horribly wrong when blacks are grey... it's another reason why hdmi & tv standards suck (similarly, a lot of amd users complain about the overscan setting)
> 
> to make matters worse, some monitors (like mine) make a mess when hdmi is used, so i have to use limited on the ps3 or any other hdmi device, then adjust my brightness/contrast as best i can
> 
> ...



Not so obvious.  *G-Sync Monitors and Limited Range RGB 16-235 on NVIDIA GPUs*

The question was about defaults, and when you're using a lesser color signal you save on bandwidth.

*Limited*






*Full*


----------



## 15th Warlock (Jul 5, 2015)

AMD and Nvidia user here. The first thing I do after installing new drivers on every PC I own is max out the default IQ settings in the driver.

On both my AMD and Nvidia systems, barring some brand-specific effects like GameWorks or HBAO+, IQ looks the same.

I remember a time when Tom's Hardware and AnandTech used to compare the differences between both companies in terms of anisotropic filtering and antialiasing techniques, but such differences became so indistinguishable between the two teams that the comparisons were just dropped altogether.

I cannot speak for default driver-settings IQ, but I can tell you from my own empirical experience that, maxed out, both teams offer the same degree of IQ. Chances are default settings won't even matter for benching, as most hardware review sites list the IQ settings used for every game to make the comparisons equal, and such settings usually override whatever the driver IQ is set to.

IMHO, this topic is not even worth discussing; the times when _both_ teams were caught red-handed using specific 'low quality' settings when a given executable was launched, to cheat in a few games or apps, are long behind us.

Not worth wasting W1zzard's (or anyone else's) time on such a trivial topic, as far as I'm concerned.


----------



## the54thvoid (Jul 5, 2015)

It's not a cheat. As I said, per @Xzibit's link, it doesn't affect DVI, and a lot of folk still use DVI.
As the article alluded to, both NV and AMD have issues, NV's being worse, but it is a legacy issue that hopefully gets resolved as DP and HDMI evolve.
It would be good to see how many folks still use DVI.


----------



## Deleted member 67555 (Jul 5, 2015)

AMD and Nvidia do use different color schemes, as they are based on similar but different technologies... this is fact and is not in question whatsoever... AMD does use a larger color spectrum by default, and it makes no difference to most people; however, it makes no difference in FPS either...

I think people are confusing this with performance... in the end it's only a matter of preference to some people.


----------



## birdie (Jul 5, 2015)

jmcslob said:


> AMD and Nvidia do use different color schemes as they are based on similar but different technologies...this is fact and is not in question what so ever...AMD does use a larger color spectrum by default and it makes no difference to most people however this makes no difference in fps either...
> 
> I think people are confusing this with performance....in the end it's only a matter of preference to some people.



The screenshots in the original post clearly show a difference in rendering (sharpness/details/lighting) rather than colors, so your message is kinda off the mark.


----------



## the54thvoid (Jul 5, 2015)

birdie said:


> The screenshots in the original post clearly show the difference in rendering (sharpness/details/lighting) rather than colors so your message is kinda off the mark.



The thread has progressed to a discussion of colour, as this is, courtesy of Xzibit's link, a definite and tangible inherent issue with colour over DP or HDMI for Nvidia, as is scaling for AMD.
The screenshot is bogus as an unbiased piece of evidence. Using an ancient title that waaaaay back prompted discussion of vendor-specific FX is shoddy for 2015.
I used 7970s as my last AMD setup a couple of years ago, moved to a Titan (for single-GPU reasons), and there were no noticeable differences to me.
Of course I enjoyed the tessellated everything that Nvidia crammed in, even in unseen areas (water). That is what goes on nowadays: find out what the competing architecture does worse and shove a tonne of it into the game you help develop.
Years back, AMD did it with Sleeping Dogs and one of the Grid(?) titles. Nvidia do it quite a lot in everything they touch.


----------



## okidna (Jul 5, 2015)

I don't know if this is relevant or not, but the original poster/uploader of the video admitted that he changed the color settings/setup for the Fury X benchmark: http://forums.overclockers.co.uk/showpost.php?p=28258345&postcount=2394


----------



## mroofie (Jul 5, 2015)

This all started at wccftech (the conspiracy); why am I not surprised.
After the GTX 970 fiasco, every fanboy out there is looking for problems where there aren't any (especially now, after the Fury X got ripped).


----------



## mroofie (Jul 5, 2015)

okidna said:


> I don't know if this is relevant or not, but the original poster/uploader of the video admitted that he changed the color settings/setups on the FuryX benchmark : http://forums.overclockers.co.uk/showpost.php?p=28258345&postcount=2394


Thank you


----------



## mroofie (Jul 5, 2015)

Another link - read the comments, please.

http://forums.anandtech.com/showthread.php?t=2437903&page=3

Birdie needs to do some research


----------



## Xzibit (Jul 5, 2015)

mroofie said:


> Another link read the comments please
> 
> http://forums.anandtech.com/showthread.php?t=2437903&page=3
> 
> Birdie needs to do some research



That person is mistaking in-game settings for the driver UI settings I pointed out.

As far as color goes, you can see the confusion in this posted screen grab:







Somehow, limited range appeared more detailed and preferable to one person.

The color can be correct for both, but the DEFAULTS are in question.

For this to go anywhere, more people have to run default-only tests in various games to see if there is a difference in how the drivers handle IQ.

If he's capturing video on the AMD Fury X, CCC has separate video color enhancements which are on by default. He'll have to turn those off, or capture from the output, so they won't affect video comparisons.


----------



## mirakul (Jul 5, 2015)

Found something interesting here:
http://forums.anandtech.com/showthread.php?t=2437903&page=3


> Originally Posted by *Atomic Playboy*
> 
> 
> GTA V has an issue with the ingame Anisotropic Filtering on Nvidia cards; basically, it doesn't work right. Disable AF in the ingame menu and force 16x AF through the Nvidia control panel (make sure to choose "Override application setting"). Also set the filtering quality to "High Quality." That should fix the texture filtering issues (as shown in this comparison shot).http://international.download.nvidi...ering-interactive-comparison-1-on-vs-off.html


So does this mean every GTA V bench for nVidia cards is inaccurate in terms of AF settings?


----------



## the54thvoid (Jul 5, 2015)

I love my Nvidia image quality - so much that I bought shares in the company and smoked my AMD shares.

But seriously, I'm all for tech discussion, and praise to @Xzibit for being very neutral with good, honest info, but this thread is still toeing the line of doom.
I guarantee right now people are googling "Nvidia bad image quality" and just posting it here without any recourse to technical dissection or logical thought.
Think I'll go start a thread on why the Fury X has such poor frame-rate latency in many games.

Edit: it does, you know. Though not as bad as poor old Kepler.


----------



## AsRock (Jul 5, 2015)

Xzibit said:


> That person is mistaking in-game setting with driver UI settings I pointed out.
> 
> As far as color goes you can see the confusion in this posted screen grab
> 
> ...



Why are you still posting this crap? First off, the pictures are different AGAIN, just like the ones from Sleeping Dogs, where it could just be an angle or even a bug in the game or drivers; or, in this case, someone messed with the color settings lol.


----------



## Caring1 (Jul 5, 2015)

Isn't the purpose of this Comments and Feedback section for topics related to TPU, not general hardware or software?
If the OP wanted to whine about graphics cards and settings, he could have done so in one of the many existing threads!


----------



## LightningJR (Jul 5, 2015)

mroofie said:


> This all started at wccftech (the conspiracy) why am I not surprised
> After the gtx 970 fiasco every fanboy out there is looking for problems where there aren't none  (espically now after the fury x got ripped)



Thing is, this was an issue way back in the day. Both parties were reducing graphics quality to get better 3DMark scores. It would not surprise me to see it happen again.

I haven't seen any real or reputable sources doing these tests, so I don't really believe any of this yet. Until then, or until I do the tests myself (I have no AMD card other than a 4830), I'm calling it FUD. But I can't say it's not possible.


----------



## rtwjunkie (Jul 5, 2015)

Personally, my view is that if you really want to compare picture quality, the settings should be set IN GAME. Also, you MUST capture the exact same scene at the same in-game time of day. Only then can you see whether there are any real differences in the cards' ability to render a quality image, and only then can you compare frame rates.

Interestingly, I believe that is how W1zzard does HIS testing.


----------



## mirakul (Jul 5, 2015)

Can anyone check whether the default image quality setting in the nV control panel is the 'quality' override or 'let the application decide'? Because there's a high chance that most reviewers don't even touch it and just leave it at the default.


----------



## blued (Jul 5, 2015)

Xzibit said:


> Not so obvious.  *G-Sync Monitors and Limited Range RGB 16-235 on NVIDIA GPUs*
> 
> The question was as to defaults and when your using a lesser color signal you save on bandwidth.
> 
> ...




Nvidia fixed the issue many months ago with a driver update; you get full-color 0-255 over HDMI now. Really, you should know better than to post a year-old link.


----------



## the54thvoid (Jul 5, 2015)

mirakul said:


> Can anyone check whether the default setting of image quality in nV control panel is 'quality' override or 'let the application decide'? Because it's high chance that every reviewer don't even touch there and just let it to default.



I just checked mine - I'm perfectly happy with the settings.  They're mostly allowing application control, and all non-app-controlled options are set to high.  A few features are set to off, but these aren't ones I even care about.  I also have the option of doing whatever the hell I want.

@W1zzard - what settings do you use?  Just to shut this



Spoiler



muppet


 up.



blued said:


> Nvidia fixed the issue many months ago with a driver update. You get full-range 0-255 color over HDMI now. Really, you should know better than to post a year-old link.



Can you source that? Otherwise @Xzibit will tear you a new one - he's quite good at finding things.


----------



## blued (Jul 5, 2015)

birdie said:


> Some people claim that NVIDIA GPUs produce worse image quality than AMD GPUs using the same in-game settings which gives NVIDIA an edge.
> 
> One example:
> 
> ...





Where did you find the second pic? Also on the AT forums? Quite funny, because I also posted it there, next to a comparison pic from my Nvidia card, as an example of a disingenuous argument when I originally argued this a couple of years ago. Anyway, here is the same scene/location with my Nvidia card:






Just incredible how far people are willing to go to conceal or omit info or facts that may debunk their argument.


----------



## blued (Jul 5, 2015)

the54thvoid said:


> I just checked mine - I'm perfectly happy with the settings.  They're mostly allowing application control, and all non-app-controlled options are set to high.  A few features are set to off, but these aren't ones I even care about.  I also have the option of doing whatever the hell I want.
> 
> @W1zzard - what settings do you use?  Just to shut this
> 
> ...


Jeezuz... how about from people who actually own Nvidia cards, who may be more informed than those who don't? In case you don't believe them, well, here:

https://pcmonitors.info/articles/correcting-hdmi-colour-on-nvidia-and-amd-gpus/


----------



## blued (Jul 5, 2015)

the54thvoid said:


> Can you source that? otherwise @Xzibit will tear you a new one -* he's quite good at finding things.*


----------



## blued (Jul 5, 2015)

Btw, I mentioned this debate at the AT forums, where someone posted a set of pics comparing a 660 Ti vs a 7950... well, to his surprise, I posted my own Nvidia screenshots of the same scenes/locations in Sleeping Dogs. I can only imagine the embarrassment of those trying to ride on that argument.

See the differences:

His furnished pics.

My own pics.


----------



## mirakul (Jul 5, 2015)

the54thvoid said:


> I just checked mine - I'm perfectly happy with the settings.  They're mostly allowing application control, and all non-app-controlled options are set to high.  A few features are set to off, but these aren't ones I even care about.  I also have the option of doing whatever the hell I want.


Did I ask about your settings? I'm asking about the DEFAULT settings. You didn't bother to answer my question, btw.


----------



## okidna (Jul 5, 2015)

mirakul said:


> Can anyone check whether the default image quality setting in the nV control panel is 'quality' override or 'let the application decide'? There's a high chance that reviewers don't even touch it and just leave it at the default.



On 353.30, the differences between the Default settings and "Quality" are:

Default : Anisotropic Filtering - Application-controlled
Quality : Anisotropic Filtering - 8X

Default : Antialiasing Mode - Application-controlled
Quality : Antialiasing Mode - Override any application setting

Default : Antialiasing Setting - Application-controlled
Quality : Antialiasing Setting - 4X

That's it.


----------



## blued (Jul 5, 2015)

blued said:


> Nvidia fixed the issue many months ago with a driver update. You get full-range 0-255 color over HDMI now. Really, you should know better than to post a year-old link.


I should also add that this would only have been an issue for those who did not have DVI or DP as an option. Monitors with only HDMI must be rare.


----------



## Steevo (Jul 5, 2015)

mirakul said:


> Found something interesting there
> http://forums.anandtech.com/showthread.php?t=2437903&page=3
> 
> So does this mean every GTA V bench for nVidia cards is inaccurate in terms of the AF setting?


My 7970 is the same: AF set in game causes corruption or distortion effects, and forcing it through the driver fixes the issue. Likewise, surface format optimization causes transparent effects to be corrupted or distorted.


----------



## birdie (Jul 5, 2015)

mroofie said:


> Another link, read the comments please
> 
> http://forums.anandtech.com/showthread.php?t=2437903&page=3
> 
> Birdie needs to do some research



Yeah, really, ... or maybe in the same thread people are proving my point:

http://forums.anandtech.com/showpost.php?p=37532964&postcount=67



> Still poor lod distance for Titan X, and poor AF.


Which means NVIDIA is cheating, and their products might actually be slower than what reviews and reviewers are saying. Anyway, I've moved to the respective threads at the anandtech and overclockers forums, since people over there are actually discussing the issue instead of calling one another "fanboy".


----------



## the54thvoid (Jul 5, 2015)

birdie said:


> Which means NVIDIA is cheating, and their products might actually be slower than what reviews and reviewers are saying. Anyway, I've moved to the respective threads at the anandtech and overclockers forums, since people over there are actually discussing the issue instead of calling one another "fanboy".



Or, when a supposition is not fully supported, the OP ups sticks and goes somewhere else where he might find backup.
Really, there is no major issue here at all.  The default settings are almost all high quality, or off for some of the more exotic FX, and these settings are easily overridden in NCP.
For this topic to be relevant you actually do NEED to know what settings the reviews used.  Spouting assumptions about settings that you can't possibly know without that knowledge is just plain daft.
Ask the reviewers; don't create flame threads for your own delectation.  It's like me creating a thread about what boxer shorts you wear without knowing if you even wear them.
Find the proof, then we can discuss it. The null hypothesis is: "Reviewers selecting default NCP options see no fps increase over equivalent AMD CCC settings."
So: do reviewers use the defaults? Can you control the variables by matching identical AMD settings? Is there an unfair advantage from these two known factors?

Without knowing the variables used in reviews, all you get is people without any actual empirical evidence spouting unsubstantiated opinions.

End of.


----------



## rtwjunkie (Jul 5, 2015)

So, now you are outright accusing, instead of inquiring.  We all suspected as much when you presented two Sleeping Dogs screenshots that looked 5 miles apart in quality.

As to moving your discussion to overclockers and anandtech, enjoy their less-informed opinions and total head-nodding to your flame attempts.

Mods, on that note, can we please close this?


----------



## mroofie (Jul 5, 2015)

birdie said:


> Yeah, really, ... or maybe in the same thread people are proving my point:
> 
> http://forums.anandtech.com/showpost.php?p=37532964&postcount=67
> 
> ...


Look at blued's comment aka his pictures 



rtwjunkie said:


> So, now you are outright accusing, instead of inquiring.  We all suspected as much when you presented two Sleeping Dogs screenshots that looked 5 miles apart in quality.
> 
> As to moving your discussion to overclockers and anandtech, enjoy their less-informed opinions and total head-nodding to your flame attempts.
> 
> Mods, on that note, can we please close this?



Finally someone else with a brain too


----------



## mroofie (Jul 5, 2015)

rtwjunkie said:


> So, now you are outright accusing, instead of inquiring.  We all suspected as much when you presented two Sleeping Dogs screenshots that looked 5 miles apart in quality.
> 
> As to moving your discussion to overclockers and anandtech, enjoy their less-informed opinions and total head-nodding to your flame attempts.
> 
> Mods, on that note, can we please close this?


I have seen some comments on both of those sites. Shocking!

Viva retarded comments  


(P.S. I needed to use "retarded", don't kill me  )



AsRock said:


> Why are you still posting this crap? 1st off, the pictures are different AGAIN, just like the ones from Sleeping Dogs, where it could be just an angle or even a bug in the game or drivers, or in this case someone fucked with the color settings lol.



Epic comment is epic


----------



## erocker (Jul 5, 2015)

Since this thread/discussion is based on misinformation, continuing it seems to be a moot point.


----------

