
NVIDIA Announces Three New Mobile GPUs With Spring 2022 Availability

TheLostSwede

News Editor
Joined
Nov 11, 2004
Messages
16,886 (2.34/day)
Location
Sweden
System Name Overlord Mk MLI
Processor AMD Ryzen 7 7800X3D
Motherboard Gigabyte X670E Aorus Master
Cooling Noctua NH-D15 SE with offsets
Memory 32GB Team T-Create Expert DDR5 6000 MHz @ CL30-34-34-68
Video Card(s) Gainward GeForce RTX 4080 Phantom GS
Storage 1TB Solidigm P44 Pro, 2 TB Corsair MP600 Pro, 2TB Kingston KC3000
Display(s) Acer XV272K LVbmiipruzx 4K@160Hz
Case Fractal Design Torrent Compact
Audio Device(s) Corsair Virtuoso SE
Power Supply be quiet! Pure Power 12 M 850 W
Mouse Logitech G502 Lightspeed
Keyboard Corsair K70 Max
Software Windows 10 Pro
Benchmark Scores https://valid.x86.fr/yfsd9w
NVIDIA has just announced three new mobile GPUs, although the question is how new any of them really are, as the model names suggest they're anything but. First up is the GeForce RTX 2050, which should be based on the Turing architecture. The other two GPUs are the GeForce MX550 and MX570, both presumably based on the Ampere architecture, although NVIDIA hasn't confirmed the specifics.

The GeForce RTX 2050 features 2048 CUDA cores, more than the mobile RTX 2060, but at lower clock speeds and a vastly lower power draw of 30-45 W, depending on the notebook's design and cooling. It's also limited to 4 GB of GDDR6 memory on a 64-bit bus, which puts it in GeForce MX territory when it comes to memory bandwidth, as NVIDIA quotes a maximum of a mere 112 GB/s.
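
That 112 GB/s figure lines up with a 64-bit bus running 14 Gbps GDDR6; the data rate is an assumption here, since NVIDIA only quotes the resulting bandwidth. A quick sanity check in Python:

```python
# Peak memory bandwidth = (bus width in bytes) x (effective per-pin data rate).
# 14 Gbps GDDR6 is assumed; NVIDIA only publishes the 112 GB/s result.
bus_width_bits = 64
data_rate_gbps = 14            # effective transfers per pin per second

bandwidth_gbs = bus_width_bits / 8 * data_rate_gbps
print(f"{bandwidth_gbs:.0f} GB/s")   # -> 112 GB/s, matching NVIDIA's figure
```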




The two GeForce MX parts also support GDDR6 memory, but beyond that, NVIDIA hasn't released anything tangible in terms of specs. NVIDIA mentions that the GeForce MX550 will replace the MX450, and states that both GPUs are intended to boost performance for video and photo editing on the move, with the MX570 also being suitable for gaming. All three GPUs are said to ship in laptops sometime this coming spring.

Update: According to a tweet by ComputerBase, which has confirmed the information with Nvidia, the RTX 2050 and the MX570 are based on Ampere and the GA107 GPU, while the MX550 is based on Turing and the TU117 GPU. The MX570 is also said to support DLSS and "limited" RTX features, whatever that means.

View at TechPowerUp Main Site
 
Joined
Apr 15, 2021
Messages
879 (0.72/day)
More like col(crying out loud)... "regress" :(
Even though I paid a lot more than I should have, I'm glad I started having parts ordered for my new rig build back in March-April. The damn fans didn't get here until two weeks ago... so I should be going to pick it up within the next week and have it before Christmas holiday. :rockout:
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
16,671 (4.64/day)
Location
Kepler-186f
More like col(crying out loud)... "regress" :(
Even though I paid a lot more than I should have, I'm glad I started having parts ordered for my new rig build back in March-April. The damn fans didn't get here until two weeks ago... so I should be going to pick it up within the next week and have it before Christmas holiday. :rockout:

i'm still happy with my gtx 1070 setup overall. for some reason I gain like 20-30 fps when i hook it up to a monitor, and I have no idea why because I already have it set to "discrete" in the motherboard bios. so i should have same performance regardless. either way i am a happy camper. it runs FFXIV Endwalker on high settings at 120 fps at 1080p. been lots of fun, speaking of which i am going to go play some more now :rockout:
 
Joined
Sep 17, 2014
Messages
21,718 (6.00/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
i'm still happy with my gtx 1070 setup overall. for some reason I gain like 20-30 fps when i hook it up to a monitor, and I have no idea why because I already have it set to "discrete" in the motherboard bios. so i should have same performance regardless. either way i am a happy camper. it runs FFXIV Endwalker on high settings at 120 fps at 1080p. been lots of fun, speaking of which i am going to go play some more now :rockout:

Your screen is no longer sipping power budget from the laptop brick perhaps?
 
Joined
Jun 23, 2011
Messages
395 (0.08/day)
System Name potato
Processor Ryzen 9 5950X
Motherboard MSI MAG B550 Tomahawk
Cooling Custom WC Loop
Memory 2x16GB G.Skill Trident Z Neo 3600
Video Card(s) RTX3090
Storage 512GB, 2TB NVMe + 2TB SATA || 32TB spinning rust
Display(s) XIAOMI Curved 34" 144Hz UWQHD
Case be quiet dark base pro 900
Audio Device(s) Edifier R1800T, Logitech G733
Power Supply Corsair HX1000
Mouse Logitech G Pro
Keyboard Logitech G913
Software win 11 amd64
i'm still happy with my gtx 1070 setup overall. for some reason I gain like 20-30 fps when i hook it up to a monitor, and I have no idea why because I already have it set to "discrete" in the motherboard bios. so i should have same performance regardless. either way i am a happy camper. it runs FFXIV Endwalker on high settings at 120 fps at 1080p. been lots of fun, speaking of which i am going to go play some more now :rockout:
that's because the rendered frames still go through the Intel iGPU on their way to the internal display, while with an external display connected, the dGPU drives it directly without having to go through the iGPU first.
 
Joined
Jun 5, 2021
Messages
284 (0.24/day)
This is good for consumers... DLSS will help make these micro GPUs more powerful
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
16,671 (4.64/day)
Location
Kepler-186f
that's because the rendered frames still go through the Intel iGPU on their way to the internal display, while with an external display connected, the dGPU drives it directly without having to go through the iGPU first.

that shouldn't be the case when you have option for "Discrete" in mobo bios, which i do.

Your screen is no longer sipping power budget from the laptop brick perhaps?

this might be it. who knows
 
Joined
Jun 23, 2011
Messages
395 (0.08/day)
System Name potato
Processor Ryzen 9 5950X
Motherboard MSI MAG B550 Tomahawk
Cooling Custom WC Loop
Memory 2x16GB G.Skill Trident Z Neo 3600
Video Card(s) RTX3090
Storage 512GB, 2TB NVMe + 2TB SATA || 32TB spinning rust
Display(s) XIAOMI Curved 34" 144Hz UWQHD
Case be quiet dark base pro 900
Audio Device(s) Edifier R1800T, Logitech G733
Power Supply Corsair HX1000
Mouse Logitech G Pro
Keyboard Logitech G913
Software win 11 amd64
that shouldn't be the case when you have option for "Discrete" in mobo bios, which i do.

your laptop still using hard mux? interesting, what model & make is it?
 

Cheeseball

Not a Potato
Supporter
Joined
Jan 2, 2009
Messages
1,923 (0.34/day)
Location
Pittsburgh, PA
System Name Titan
Processor AMD Ryzen™ 7 7950X3D
Motherboard ASUS ROG Strix X670E-I Gaming WiFi
Cooling ID-COOLING SE-207-XT Slim Snow
Memory TEAMGROUP T-Force Delta RGB 2x16GB DDR5-6000 CL30
Video Card(s) ASRock Radeon RX 7900 XTX 24 GB GDDR6 (MBA)
Storage 2TB Samsung 990 Pro NVMe
Display(s) AOpen Fire Legend 24" (25XV2Q), Dough Spectrum One 27" (Glossy), LG C4 42" (OLED42C4PUA)
Case ASUS Prime AP201 33L White
Audio Device(s) Kanto Audio YU2 and SUB8 Desktop Speakers and Subwoofer, Cloud Alpha Wireless
Power Supply Corsair SF1000L
Mouse Logitech Pro Superlight (White), G303 Shroud Edition
Keyboard Wooting 60HE / NuPhy Air75 v2
VR HMD Occulus Quest 2 128GB
Software Windows 11 Pro 64-bit 23H2 Build 22631.3447
that shouldn't be the case when you have option for "Discrete" in mobo bios, which i do.



this might be it. who knows

That function might not be a proper mux switch. (Data is still going through the iGPU instead of bypassing it completely.) What's the make and model of your lappy?
 
Joined
Apr 17, 2021
Messages
542 (0.45/day)
System Name Jedi Survivor Gaming PC
Processor AMD Ryzen 7800X3D
Motherboard Asus TUF B650M Plus Wifi
Cooling ThermalRight CPU Cooler
Memory G.Skill 32GB DDR5-5600 CL28
Video Card(s) MSI RTX 3080 10GB
Storage 2TB Samsung 990 Pro SSD
Display(s) MSI 32" 4K OLED 240hz Monitor
Case Asus Prime AP201
Power Supply FSP 1000W Platinum PSU
Mouse Logitech G403
Keyboard Asus Mechanical Keyboard
lol... "progress"
Will be interesting to compare battery life of the Radeon 6500 XT in laptops versus the RTX 2050 (super OLD).
 
Joined
Feb 26, 2016
Messages
551 (0.18/day)
Location
Texas
System Name O-Clock
Processor Intel Core i9-9900K @ 52x/49x 8c8t
Motherboard ASUS Maximus XI Gene
Cooling EK Quantum Velocity C+A, EK Quantum Vector C+A, CE 280, Monsta 280, GTS 280 all w/ A14 IP67
Memory 2x16GB G.Skill TridentZ @3900 MHz CL16
Video Card(s) EVGA RTX 2080 Ti XC Black
Storage Samsung 983 ZET 960GB, 2x WD SN850X 4TB
Display(s) Asus VG259QM
Case Corsair 900D
Audio Device(s) beyerdynamic DT 990 600Ω, Asus SupremeFX Hi-Fi 5.25", Elgato Wave 3
Power Supply EVGA 1600 T2 w/ A14 IP67
Mouse Logitech G403 Wireless (PMW3366)
Keyboard Monsgeek M5W w/ Cherry MX Silent Black RGBs
Software Windows 10 Pro 64 bit
Benchmark Scores https://hwbot.org/search/submissions/permalink?userId=92615&cpuId=5773
I am 99% certain the 2050 is actually an Ampere GPU. The GFLOPS/Watt for the 45W spec is 134.44, while the 30W spec is 157.70. That is in line with Ampere Mobile GFLOPS/Watt, not Turing Mobile GFLOPS/Watt. Turing Mobile was in the range of 52-83, while Ampere Mobile's current range is 89-191. Source
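
Working backwards from those figures gives the implied boost clocks; a rough sketch, assuming the standard FP32 rate of 2 FLOPS per CUDA core per clock (the GFLOPS/W numbers are from the source linked above):

```python
# Derive the implied boost clock from the quoted GFLOPS/W figures,
# assuming the usual FP32 throughput of 2 FLOPS per CUDA core per clock.
cuda_cores = 2048

for watts, gflops_per_watt in [(45, 134.44), (30, 157.70)]:
    gflops = watts * gflops_per_watt
    clock_mhz = gflops * 1e9 / (2 * cuda_cores) / 1e6
    print(f"{watts} W: {gflops:.0f} GFLOPS -> ~{clock_mhz:.0f} MHz boost")
    # -> ~1477 MHz at 45 W and ~1155 MHz at 30 W
```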

i'm still happy with my gtx 1070 setup overall. for some reason I gain like 20-30 fps when i hook it up to a monitor, and I have no idea why because I already have it set to "discrete" in the motherboard bios. so i should have same performance regardless. either way i am a happy camper. it runs FFXIV Endwalker on high settings at 120 fps at 1080p. been lots of fun, speaking of which i am going to go play some more now :rockout:
If you have the iGPU enabled, disable it in device manager and see if your FPS goes up on the laptop display.
 


ARF

Joined
Jan 28, 2020
Messages
4,548 (2.74/day)
Location
Ex-usa | slava the trolls
Nvidia has a history of lower image quality, both in 3D games and on the 2D desktop. This is due to lower-quality fonts, and to damaged, fake-looking, dull colours and lower texture resolution in 3D.

I would stay away from their products; I don't buy anything Nvidia.
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
16,671 (4.64/day)
Location
Kepler-186f
Nvidia has a history of lower image quality, both in 3D games and on the 2D desktop. This is due to lower-quality fonts, and to damaged, fake-looking, dull colours and lower texture resolution in 3D.

I would stay away from their products; I don't buy anything Nvidia.

AMD to the moon!
 
Joined
Feb 20, 2019
Messages
7,799 (3.89/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Honestly, I'm done with chunky gaming laptops (I have been personally for a decade already, and I'm getting to that point with recommendations and work purchases for 3D/VR presentations too).

All that matters is what nVidia can achieve in under 50W. If you want more than 50W of dGPU gaming in a portable form factor you pretty much NEED to sit down at a desk and find a power outlet, at which point you don't really need a laptop. A decade ago, travelling to a location for LAN play was a reasonable thing, but the world has changed dramatically since then, COVID being just one of the factors for this shift in trends.

As far as I can tell, Nvidia don't really have any compelling Ampere chips in the sub-50W range yet. Turing made do with the 1650 Max-Q variants at 35 W, which were barely fast enough to justify themselves over some of the far more efficient APUs like the 4700U and 5700U, which at 25 W all-in allowed for far slimmer, cooler, quieter, lighter, longer-lasting laptops that can still game.

Unfortunately, these are still based on the GA107, and we've already seen how disappointing and underwhelming that is at lower TDPs - the 3050 isn't exactly an easy performance recommendation even at its default 80 W TDP. A cut-down version, running far slower than that? It may struggle to differentiate itself from the ancient 1650, and any wins it does score will likely come at the cost of performance per Watt, which kills the appeal.

I think the real problem with Ampere is that Samsung 8nm simply isn't very power-efficient. Yes, it's dense, but performance/Watt isn't where it shines. Sadly, there aren't any compelling TSMC 7nm GPUs making their way into laptops right now, so we're all stuck with 2019-esque performance/Watt solutions.
 
Last edited:
Joined
Jan 14, 2019
Messages
10,964 (5.37/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
What's the point of packing 2048 CUDA cores with an extremely low TDP and 64-bit memory bus? Would it not make more sense to use a smaller or (more) defective chip? :kookoo:
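
To put numbers on the imbalance, here's a rough sketch using figures from earlier in the thread (112 GB/s from the article, and the ~6 TFLOPS implied by the 45 W GFLOPS/W estimate above, which is itself an estimate, not an official spec):

```python
# Ratio of memory bandwidth to compute: how starved is a 2048-core GPU
# on a 64-bit bus? Both inputs come from earlier in this thread; the
# TFLOPS value is an estimate, not an NVIDIA-confirmed number.
bandwidth_gbs = 112      # NVIDIA's quoted figure
tflops = 6.0             # implied by the 45 W GFLOPS/W estimate above

bytes_per_flop = bandwidth_gbs / (tflops * 1e3)
print(f"~{bytes_per_flop:.3f} bytes/FLOP")   # ~0.019 - heavily bandwidth-limited
```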
 
Joined
Oct 22, 2014
Messages
13,727 (3.83/day)
Location
Sunshine Coast
System Name Lenovo ThinkCentre
Processor AMD 5650GE
Motherboard Lenovo
Memory 32 GB DDR4
Display(s) AOC 24" Freesync 1m.s. 75Hz
Mouse Lenovo
Keyboard Lenovo
Software W11 Pro 64 bit
What's the point of packing 2048 CUDA cores with an extremely low TDP and 64-bit memory bus? Would it not make more sense to use a smaller or (more) defective chip? :kookoo:
Won't that improve its compute power?
 

r9

Joined
Jul 28, 2008
Messages
3,300 (0.56/day)
System Name Primary|Secondary|Poweredge r410|Dell XPS|SteamDeck
Processor i7 11700k|i7 9700k|2 x E5620 |i5 5500U|Zen 2 4c/8t
Memory 32GB DDR4|16GB DDR4|16GB DDR4|32GB ECC DDR3|8GB DDR4|16GB LPDDR5
Video Card(s) RX 7800xt|RX 6700xt |On-Board|On-Board|8 RDNA 2 CUs
Storage 2TB m.2|512GB SSD+1TB SSD|2x256GBSSD 2x2TBGB|256GB sata|512GB nvme
Display(s) 50" 4k TV | Dell 27" |22" |3.3"|7"
VR HMD Samsung Odyssey+ | Oculus Quest 2
Software Windows 11 Pro|Windows 10 Pro|Windows 10 Home| Server 2012 r2|Windows 10 Pro
I'm really looking forward to Nvidia releasing the Riva TNT again.
 
Joined
Dec 25, 2020
Messages
5,731 (4.31/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Galax Stealth STL-03
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Nvidia has a history of lower image quality, both in 3D games and on the 2D desktop. This is due to lower-quality fonts, and to damaged, fake-looking, dull colours and lower texture resolution in 3D.

I would stay away from their products; I don't buy anything Nvidia.

I honestly thought the AMD vs. NVIDIA color accuracy conspiracy died off a decade ago. :kookoo:

There are no differences between the two vendors' image quality when outputting the same signal in the same pixel format. What may cause this impression is incorrect settings, whether reported by the EDID or entered manually, as NVIDIA has supported chroma subsampling since older hardware generations (including YCbCr 4:2:0 for 4K 60 Hz on Kepler GPUs with a DVI-D DL/HDMI 1.4 port), while AMD did not do so until Polaris.

Nowadays, every monitor and every GPU should be capable of displaying a full-range RGB signal at 8 bpc at a minimum.
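
For context on why that chroma subsampling existed at all: 4K 60 Hz at full RGB simply doesn't fit in an HDMI 1.4-class link, while 4:2:0 halves the pixel data and does. A back-of-the-envelope sketch of the link budget (594 MHz is the standard CTA-861 pixel clock for 3840x2160 @ 60 Hz; treat this as a sketch, not the spec):

```python
# Rough link-budget check: why 4K 60 Hz needed YCbCr 4:2:0 on HDMI 1.4.
# TMDS uses 8b/10b line coding, hence the 10/8 overhead factor.
pixel_clock_hz = 594e6
hdmi14_limit_gbps = 10.2       # 340 MHz TMDS clock x 3 lanes x 10 bits

for fmt, bits_per_pixel in [("RGB 4:4:4, 8 bpc", 24), ("YCbCr 4:2:0, 8 bpc", 12)]:
    gbps = pixel_clock_hz * bits_per_pixel * 10 / 8 / 1e9
    verdict = "fits" if gbps <= hdmi14_limit_gbps else "exceeds"
    print(f"{fmt}: {gbps:.2f} Gb/s ({verdict} the ~10.2 Gb/s HDMI 1.4 limit)")
    # RGB needs ~17.8 Gb/s; 4:2:0 needs ~8.9 Gb/s and squeaks through
```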
 

ARF

Joined
Jan 28, 2020
Messages
4,548 (2.74/day)
Location
Ex-usa | slava the trolls
I honestly thought the AMD vs. NVIDIA color accuracy conspiracy died off a decade ago. :kookoo:

There are no differences between the two vendors' image quality when outputting the same signal in the same pixel format. What may cause this impression is incorrect settings, whether reported by the EDID or entered manually, as NVIDIA has supported chroma subsampling since older hardware generations (including YCbCr 4:2:0 for 4K 60 Hz on Kepler GPUs with a DVI-D DL/HDMI 1.4 port), while AMD did not do so until Polaris.

Nowadays, every monitor and every GPU should be capable of displaying a full-range RGB signal at 8 bpc at a minimum.

Well, I think the reason this topic died off is that users themselves use such different models of screens, monitors, displays and panels, each with their own private settings, that an objective comparison is very difficult to perform.

But look at the green colour of the grass between RX 6900 XT on the right and RTX 3090 on the left.

1639933571689.png


RX 6900 XT vs RTX 3090 - Test in 8 Games l 4K l - YouTube
 

Joined
Dec 25, 2020
Messages
5,731 (4.31/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Galax Stealth STL-03
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Well, I think the reason this topic died off is that users themselves use such different models of screens, monitors, displays and panels, each with their own private settings, that an objective comparison is very difficult to perform.

But look at the green colour of the grass between RX 6900 XT on the right and RTX 3090 on the left.

Maybe I am blind, or maybe it's because I have an RTX 3090... but I can't tell the colours of the grass apart. Nor could I ever tell a difference between this card and the Radeon VII I had before it. And you especially wouldn't be able to see it after YouTube's video compression to begin with, nor are the scenes and lighting in that comparison exactly identical - you would need two reference gradation images displayed on reference monitors to even begin to make any comparison valid.

Sorry bro, that stuff is hearsay... has always been. The color reproduction on both companies' GPUs is fully accurate when set correctly.
 
Joined
Feb 20, 2019
Messages
7,799 (3.89/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
I'm fairly certain that the last time AMD vs Nvidia image quality came up, people put the matter to rest with binary XOR filters on screencaps of identical frames from repeatable benchmarks.
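
For anyone who wants to repeat that kind of test, here's a minimal sketch (the filenames are hypothetical, and it assumes both captures share the same resolution; needs NumPy and Pillow):

```python
# XOR two screencaps of the same rendered frame: any non-zero pixel is a
# genuine rendering difference. Filenames are placeholders.
import numpy as np
from PIL import Image

a = np.asarray(Image.open("frame_radeon.png").convert("RGB"))
b = np.asarray(Image.open("frame_geforce.png").convert("RGB"))

xor = np.bitwise_xor(a, b)
if not xor.any():
    print("Pixel-identical output")
else:
    differing = xor.any(axis=-1).mean()
    print(f"{differing:.2%} of pixels differ")
    Image.fromarray(xor).save("xor_diff.png")   # non-black pixels mark mismatches
```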

In the bad old days there were ATi's Quack3.exe and Nvidia's FX series cheating on aniso filtering. Both companies fixed those cheats once called out by the media. After that, we had Nvidia being very slow on the uptake with gamma-corrected MSAA; that was the last real rendering difference between the two companies, around the time AMD bought ATi.

There are however some exceptionally dumb issues with the Nvidia driver from this decade, one of which is still an issue even to this day:
  • A minor one that existed until quite recently (10-series cards) was that the driver defaults would drop filtering quality down a notch. That no longer seems to happen, but it was effectively Nvidia applying a "Balanced" preset wherever a game didn't request a specific setting, so many people unaware of this option, buried deep in the Nvidia driver's advanced 3D settings menu, were only getting "Quality" filtering, not "High quality" filtering:

    1639942957504.png


  • A more serious issue, still present even today (tested on two HDMI displays with both a 2060 and a 3060 Ti), is that the driver detects the display device as limited RGB (16-235); if you don't realise that and game in a well-lit room, you could easily mistake it for muted colours and dull highlights. I know for sure that it doesn't happen with all HDMI displays on Nvidia cards, but it's a big enough problem that I often see youtubers/streamers capturing output where black is not black - clearly 16-235 in all three channels. If, in a side-by-side image-comparison video on YT, the Radeon card looks to have better contrast, it's because of this messed-up, unfixed driver bug, which is subtle enough for many people to miss. (A quick range-expansion sketch follows after the screenshot below.)

    1639943321230.png
 
Joined
Jan 14, 2019
Messages
10,964 (5.37/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
I'm fairly certain that the last time AMD vs Nvidia image quality came up, people put the matter to rest with binary XOR filters on screencaps of identical frames from repeatable benchmarks.

In the bad old days there were ATi's Quack3.exe and Nvidia's FX series cheating on aniso filtering. Both companies fixed those cheats once called out by the media. After that, we had Nvidia being very slow on the uptake with gamma-corrected MSAA; that was the last real rendering difference between the two companies, around the time AMD bought ATi.

There are however some exceptionally dumb issues with the Nvidia driver from this decade, one of which is still an issue even to this day:
  • A minor one that existed until quite recently (10-series cards) was that the driver defaults would drop filtering quality down a notch. That no longer seems to happen, but it was effectively Nvidia applying a "Balanced" preset wherever a game didn't request a specific setting, so many people unaware of this option, buried deep in the Nvidia driver's advanced 3D settings menu, were only getting "Quality" filtering, not "High quality" filtering:

    View attachment 229435

  • A more serious issue, still present even today (tested on two HDMI displays with both a 2060 and a 3060 Ti), is that the driver detects the display device as limited RGB (16-235); if you don't realise that and game in a well-lit room, you could easily mistake it for muted colours and dull highlights. I know for sure that it doesn't happen with all HDMI displays on Nvidia cards, but it's a big enough problem that I often see youtubers/streamers capturing output where black is not black - clearly 16-235 in all three channels. If, in a side-by-side image-comparison video on YT, the Radeon card looks to have better contrast, it's because of this messed-up, unfixed driver bug, which is subtle enough for many people to miss.

    View attachment 229437
There's also an easy fix for these: never assume that a driver update preserved your settings. ;)
 