
Bad image quality for NVIDIA GPUs at default settings

Artem S. Tashkinov
Some people claim that NVIDIA GPUs produce worse image quality than AMD GPUs at the same in-game settings, which gives NVIDIA an edge.

One example:
0UzUR3A.jpg

Another example:
example4i.jpg

I wonder if W1zzard could write an article comparing image quality in BF4 and GTA5 (these are the games where the difference is the most striking) for top NVIDIA (980Ti/Titan X) and AMD GPUs (R9 390X/Fury X).

Thank you.
 
It's mostly color differences, I think.
 
This myth is as old as time, I think... it started in force when the FX series cheated in benchmarks like 3DMark by not rendering parts of the scene. I don't think there's any truth to it anymore.
 
Okay - top one - what?

Bottom one - big diffs, but it's an AMD uber title and they're 660 Tis, ffs.

This thread will descend into a farce quite quickly. Sleeping Dogs is a very AMD-centric title (much like GRID 2(?), where the lighting FX were massively suited to AMD at the time).

NVIDIA do a lot of work to improve graphics quality. The problem is that they lock it to NVIDIA hardware (GameWorks).

This thread will suck. It's gonna be carnage.
 
My first graphics cards were all ATI. Then I tried an XFX GeForce 6800 GS XXX and was unhappy with the image quality at the time, so I went back to an ATI X1600 Pro, which looked better. Later I had a laptop with a GT 425M, and now I'm using a laptop with a GTX 770M, and I'm quite happy with the image quality.
Not sure if there is still a big difference in image quality with the latest hardware; it's all just better than it used to be.
 
I'm not sure I understand your premise that NVIDIA's worse image quality gives them an edge. Huh? An edge in what?

I think it depends on the user. I prefer quality, but at a framerate that doesn't suck, so I'm actually able to see it. That means at LEAST 35 fps with top-level image settings in games. So if, because I am using an NVIDIA card right now, you say I have worse image quality, what edge does that give me? I don't follow.

Anyway, what can be gained by deliberately starting what is sure to be a back-and-forth flame war?
 
The examples are terrible; they aren't even of the same scene. When this is done, it needs to be 100% the same picture at the very least. It's best done with games that let you save in the exact area you want to test, and also with a game that reconfigures itself when the graphics card changes, as not all do.
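If anyone actually wants to check the "100% the same picture" part, here is a minimal sketch of a pixel-for-pixel comparison. The filenames are hypothetical and it assumes Pillow is installed; both captures would need to be taken at the same resolution, in the same spot, with the same in-game settings.

```python
# Minimal sketch: compare two screenshots pixel-for-pixel.
# "nvidia.png" and "amd.png" are hypothetical filenames.
from PIL import Image, ImageChops

a = Image.open("nvidia.png").convert("RGB")
b = Image.open("amd.png").convert("RGB")

diff = ImageChops.difference(a, b)      # per-pixel absolute difference
bbox = diff.getbbox()                   # None if the images are identical

if bbox is None:
    print("Screenshots are pixel-identical.")
else:
    changed = sum(1 for px in diff.getdata() if px != (0, 0, 0))
    total = a.width * a.height
    print(f"{changed / total:.1%} of pixels differ, inside region {bbox}")
    diff.save("difference.png")         # brighter areas = bigger differences
```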

Back in the Ghost Recon days I noticed the in-game menu would be transparent with ATI cards but not with NVIDIA's.
 
I feel like AMD and NVIDIA ship slightly different default contrast and vibrance in their drivers, which is the cause of the differently perceived colors.
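For what it's worth, that kind of global color/contrast shift is easy to measure. A minimal sketch (hypothetical filenames again, Pillow assumed) that compares per-channel brightness and a rough contrast proxy for two captures of the same frame:

```python
# Minimal sketch: quantify a global color/contrast difference between two captures.
# Filenames are hypothetical; both screenshots should show the same frame.
from PIL import Image, ImageStat

for name in ("nvidia.png", "amd.png"):
    img = Image.open(name).convert("RGB")
    stat = ImageStat.Stat(img)
    r, g, b = stat.mean                       # mean ~ overall brightness per channel
    sr, sg, sb = stat.stddev                  # stddev ~ rough proxy for contrast
    print(f"{name}: mean R/G/B = {r:.1f}/{g:.1f}/{b:.1f}, "
          f"stddev R/G/B = {sr:.1f}/{sg:.1f}/{sb:.1f}")

# A uniform offset in the means points at a driver-level color/vibrance default,
# not at the game rendering anything differently.
```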
 
It's mostly color differences, I think.

Oh maybe, just maybe, you've failed to notice a gigantic difference in anisotropic filtering, which is mostly missing in NVIDIA's screenshots.
 
I'm not sure I understand your premise that NVIDIA's worse image quality gives them an edge. Huh? An edge in what?

I think it depends on the user. I prefer quality, but at a framerate that doesn't suck, so I'm actually able to see it. That means at LEAST 35 fps with top-level image settings in games. So if, because I am using an NVIDIA card right now, you say I have worse image quality, what edge does that give me? I don't follow.

Anyway, what can be gained by deliberately starting what is sure to be a back-and-forth flame war?

Worse image quality usually means that fewer resources are spent rendering the frame, which means higher FPS. I wonder why that's so difficult to grasp.
 
Worse image quality usually means that fewer resources are spent rendering the frame, which means higher FPS. I wonder why that's so difficult to grasp.

I explained why it's difficult to grasp, but you failed to read.

Your premise assumes that all people value fps over image quality. As I tried to convey, I would much rather force my images to top quality and sacrifice fps, as long as it remains "playable", which for me is above 35 fps.

Since not everyone values fps over image quality, the premise of your question is without merit, rendering further queries into this soon-to-be flame war moot.
 
I don't know what you are getting at, but the anisotropic filtering is there in the first comparison, and it's comparing the same NVIDIA card at default settings versus high, not NVIDIA vs AMD. And then Watch Dogs is a Gaming Evolved title.

EDIT: What is with all these newer people posting stuff like this who are clearly pro-AMD? It's like AMD has its own internet warriors now.
 
I don't know what you are getting at, but the anisotropic filtering is there in the first comparison, and it's comparing the same NVIDIA card at default settings versus high, not NVIDIA vs AMD. And then Watch Dogs is a Gaming Evolved title.

EDIT: What is with all these newer people posting stuff like this who are clearly pro-AMD? It's like AMD has its own internet warriors now.

I'm not a fan of either company. I've just tried to draw people's attention to this ostensible discrepancy, which might not be real/present at all; that's why I asked for an independent comparison/review.

I explained why it's difficult to grasp, but you failed to read.

Your premise assumes that all people value fps over image quality. As I tried to convey, I would much rather force my images to top quality and sacrifice fps, as long as it remains "playable", which for me is above 35 fps.

Since not everyone values fps over image quality, the premise of your question is without merit, rendering further queries into this soon-to-be flame war moot.

My premise is that some people allege that NVIDIA is cheating and that its GPUs are therefore in fact slower than competing AMD solutions. I'm not talking about the merits of higher FPS - that doesn't bother me at all. What bothers me is that if NVIDIA is indeed cheating, then we must reassess its performance metrics.
 
No, you clearly said that NVIDIA's worse image quality gives them an edge.

I'm making you aware that getting maximum FPS is not the most important thing for all people.

And thank you for confirming @MxPhenom 216's supposition that you are attempting to stir things up.
 
I feel like AMD and NVIDIA ship slightly different default contrast and vibrance in their drivers, which is the cause of the differently perceived colors.

They do. I can confirm I had to tweak my monitor's tint/color values when switching, to make it look the same as my old AMD.

That said, NVIDIA turns on several "optimizations" in their drivers by default. These are shortcuts that won't really be noticed in real time, but shortcuts all the same. AMD does the same with their drivers; they just do it less aggressively and it shows up in different areas. However, only NVIDIA provides a way to turn said optimizations off. See my settings:

Untitled601.png


And thank you for confirming @MxPhenom 216's supposition that you are attempting to stir things up.

There may be a legit discussion here yet (vendor optimizations), but I am not confident we are mature enough to have it, honestly.
 
No, you clearly said that NVIDIA's worse image quality gives them an edge.

I'm making you aware that getting maximum FPS is not the most important thing for all people.

And thank you for confirming @MxPhenom 216's supposition that you are attempting to stir things up.

God. Where was I talking about "getting maximum FPS"?

Most people choose GPUs/CPUs based on their performance/price ratio, and if NVIDIA is cheating (I'm not claiming it is - I simply don't know), then people make the wrong choices and get objectively worse results (i.e. image quality).
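To put numbers on the performance/price point, here is a tiny worked example. All figures are made up, purely to illustrate the argument:

```python
# Hypothetical numbers only - this just illustrates why quietly lowered image
# quality would distort a performance/price comparison.
cards = {
    "Card A": {"avg_fps": 60.0, "price_usd": 500.0},   # suppose it skips some work
    "Card B": {"avg_fps": 55.0, "price_usd": 450.0},   # suppose it renders everything
}

for name, c in cards.items():
    print(f"{name}: {c['avg_fps'] / c['price_usd'] * 100:.1f} fps per $100")

# If Card A only reaches 60 fps by rendering at lower quality, its
# fps-per-dollar figure is not comparable with Card B's.
```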
 
Untitled.jpg
They do. I can confirm I had to tweak my monitor's tint/color values when switching, to make it look the same as my old AMD.

That said, NVIDIA turns on several "optimizations" in their drivers by default. These are shortcuts that won't really be noticed in real time, but shortcuts all the same. AMD does the same with their drivers; they just do it less aggressively and it shows up in different areas. However, only NVIDIA provides a way to turn said optimizations off. See my settings:

Untitled601.png




There may be a legit discussion here yet (vendor optimizations), but I am not confident we are mature enough to have it, honestly.


You seem to be incorrect, as AMD allows me to turn off "Surface Format Optimization", as it sometimes causes corruption with the current driver in GTA5. I also let every game other than Skyrim (and anisotropic filtering in GTA5) use its own in-game settings.
 
Huh? Are you looking at different screenshots than I'm looking at?

Nope. The first pair compares the same GPU at different settings, and Watch Dogs is an AMD title. I could see a point in this thread if the first pictures were of AMD vs NVIDIA. I don't know what the default settings are for a Titan X in BF4, but I'm pretty sure it's medium or low for most cards.

I have heard @MT Alex say, when he got his 770, that colors in games didn't seem as vibrant compared to his 5870s. That's why I said colors are the main difference, whether the shots in the OP show it or not.
 
I have a great idea.

How about we just ask @W1zzard if there is any merit in this discussion, as he tests gfx cards on a very regular basis.
Otherwise, this thread is a genuine chance of chewing on a pot of bile, favouritism and ignorant pseudo-analysis.
FWIW, the "AMD does better IQ" idea has been around since year dot. I have never noticed any problems on either side, outside of developer-biased games.
 
Some people claim that NVIDIA GPUs produce worse image quality than AMD GPUs at the same in-game settings, which gives NVIDIA an edge.
Who.

Cares.


EDIT: What is with all these newer people posting stuff like this who are clearly pro-AMD? It's like AMD has its own internet warriors now.
I don't understand why these threads aren't locked immediately.
 


You seem to be incorrect, as AMD allows me to turn off "Surface Format Optimization", as it sometimes causes corruption with the current driver in GTA5. I also let every game other than Skyrim (and anisotropic filtering in GTA5) use its own in-game settings.

Good, you appear to be correct... Glad to see that. Wonder how I missed that during my time with my R9...
 
Well, coming from an R9 290 to a 980... I notice... hmm... nothing (well, maybe since Final Fantasy XIV: Heavensward is an NVIDIA GameWorks title... that has something to do with it...).
I think it depends on the user. I prefer quality, but at a framerate that doesn't suck, so I'm actually able to see it. That means at LEAST 35 fps with top-level image settings in games.
Oh cool @rtwjunkie, since my DSR 4K gives me 35 fps almost constantly, I could get a 4K display and play FFXIV:HW in 4K (my limit is also 35 fps at least :laugh: ).

Also... since I have BF4 (and some other AMD-centric games...) I can test... the 980... but since I don't have quick-release fittings, I don't think I can bear the burden of emptying the loop, testing the 290, then re-emptying it and testing the 980 :roll:

aka: useless thread.
 
Nope. The first pair compares the same GPU at different settings, and Watch Dogs is an AMD title. I could see a point in this thread if the first pictures were of AMD vs NVIDIA. I don't know what the default settings are for a Titan X in BF4, but I'm pretty sure it's medium or low for most cards.

That comparison is not referring to in-game settings; it's referring to the driver GUI slider:

vista_nvidia1.png


The question will always be there, since driver settings can override game settings for both vendors.
 
That comparison is not referring to in-game settings; it's referring to the driver GUI slider:

vista_nvidia1.png


The question will always be there, since driver settings can override game settings for both vendors.
That's why I always tick "Let the 3D application decide", since I like to set my options in-game, to actually see the difference in-game.
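For context, "let the 3D application decide" means the driver honors whatever the game requests itself. Below is a minimal illustration of that request on the application side, not any particular game's code; it assumes PyOpenGL and glfw are installed and that the GPU exposes the EXT anisotropic filtering extension.

```python
# Minimal illustration: an application asking for 16x anisotropic filtering on a
# texture. Whether the driver honors this depends on the control panel setting -
# "let the 3D application decide" means it does; the quality/performance slider
# can silently override it.
import glfw
from OpenGL.GL import glBindTexture, glGenTextures, glTexParameterf, GL_TEXTURE_2D
from OpenGL.GL.EXT.texture_filter_anisotropic import GL_TEXTURE_MAX_ANISOTROPY_EXT

glfw.init()
glfw.window_hint(glfw.VISIBLE, glfw.FALSE)       # hidden window, we only need a GL context
window = glfw.create_window(64, 64, "aniso demo", None, None)
glfw.make_context_current(window)

tex = glGenTextures(1)
glBindTexture(GL_TEXTURE_2D, tex)
# Real code would clamp this to GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT first.
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, 16.0)
print("Requested 16x anisotropic filtering for this texture")

glfw.terminate()
```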
 