Friday, January 14th 2022

NVIDIA Releases GeForce 511.23 Game Ready Drivers

NVIDIA today released the GeForce 511.23 WHQL drivers. These drivers introduce support for the Windows 11 Dynamic Refresh Rate feature, and formally debut NVIDIA Deep Learning Dynamic Super Resolution (DLDSR). This interesting new feature is an inverse of DLSS, and works to improve the eye-candy of older games: your game is rendered at a higher resolution than your display head, and the render is intelligently scaled down to enhance detail. Such a feature already existed with DSR, but DLDSR adds certain "smarts" by adjusting the higher render resolution on the fly to improve performance. The drivers also add support for CUDA 11.6, even more OpenCL extensions, and a handful more Vulkan extensions, besides support for even more G-SYNC Compatible displays.
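For reference, DSR-style factors describe pixel-count multipliers rather than per-axis multipliers, so the internal render resolution scales each axis by the square root of the factor. A quick illustrative sketch of that arithmetic (this is just the resolution math, not NVIDIA's actual scaling code):

```python
import math

def dsr_render_resolution(native_w: int, native_h: int, factor: float) -> tuple[int, int]:
    """Compute the internal render resolution for a DSR-style pixel-count
    factor. The factor multiplies the total pixel count, so each axis is
    scaled by sqrt(factor)."""
    scale = math.sqrt(factor)
    return round(native_w * scale), round(native_h * scale)

# A 2560x1440 panel with a 2.25x factor renders internally at 4K,
# since sqrt(2.25) = 1.5 per axis:
print(dsr_render_resolution(2560, 1440, 2.25))  # (3840, 2160)
```

The downscale back to native resolution is where DLDSR's deep-learning filter comes in; the sketch above only shows why a 2.25x factor on a 1440p panel corresponds to a 4K internal render.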

As a Game Ready driver, version 511.23 adds optimizations for "God of War," including support for DLSS and Reflex; and "The Anacrusis." The fixes and known-issues list of this driver appears identical to that of the GeForce 511.17 drivers released earlier this week.

DOWNLOAD: NVIDIA GeForce 511.23 WHQL
Game Ready for
  • God of War
  • The Anacrusis
  • Hitman III Year 2
Gaming Technology Supported
  • Includes support for NVIDIA DLDSR (Deep Learning Dynamic Super Resolution)
New Features and Other Changes
  • Added support for Windows 11 Dynamic Refresh Rate.
  • Added support for CUDA 11.6.
  • The NVIDIA OpenCL driver has added support for new provisional extension specifications released by Khronos.
  • Added new OpenCL compiler technology as an opt-in feature.
Fixed Issues in this Release
  • [Detroit Become Human]: Random stuttering/freezing occurs in the game. [3389250]
  • Flicker/disappearing text when 12-bit color is used [3358884]
  • [HDR][G-Sync]: Mouse pointer gets stuck after turning on HDR from the Windows Control Panel or after toggling G-Sync from the NVIDIA control panel. [200762998]
To work around, click the mouse (right or left button). The mouse cursor will be sluggish for a few seconds before returning to normal operation.

Known Issues
  • [Deathloop][HDR]: TDR/corruption occurs in the game with Windows HDR enabled. [200767905] If this issue occurs, toggle the Windows HDR setting.
  • Sonic & All-Stars Racing Transformed may crash on courses where players drive through water. [3338967]
  • In multi-monitor configurations, the screen may display random black screen flicker. [3405228]
  • [NVIDIA Advanced Optimus][NVIDIA Control Panel]: After setting the display multiplexer type to "dGPU", the setting is not preserved across a reboot or resume from S4. [200779758]
  • [NVIDIA Image Scaling][Desktop]: The screen moves to the upper left corner on cold boot when Image Scaling is applied to the desktop. [3424721] Do not apply NVIDIA Image Scaling to the desktop. It is intended only for video upscaling or for games which do not run with a scaling resolution unless the same Image Scaling resolution is applied on the desktop.
  • [NVIDIA Image Scaling][DirectX 11 video apps]: With Image Scaling enabled, video playback is corrupted or results in a system hang after performing an HDR transition. [3431284]

66 Comments on NVIDIA Releases GeForce 511.23 Game Ready Drivers

#26
Mussels
Freshwater Moderator
Playing with the new DSR has set some stupid records for me:

5760x3240 @ 80Hz, in StarCraft II.
(3840x2160 80Hz reported by panel, 2560x1440 165Hz native)


Let's just say... it was pretty.
And my 3090 chugged at times lol.
Posted on Reply
#27
wolf
Better Than Native
MusselsPlaying with the new DSR has set some stupid records for me
It's kind of incredible really, the 3080 powers through a lot of games at 2.25x and the IQ is excellent. Bonus points if the game supports DLSS. 2560x1080 up to 5120x2160 down to 3440x1440, lol, and the IQ is phenomenal, and it still performs better than native in the titles I tested.
Posted on Reply
#28
Lycanwolfen
TiggerNvidia H100 Hopper, will cost as much as a family car :laugh:
3090TI single card cost the same as a car in Canada.
PilgrimIt's called supersampling and it does increase the overall quality of the image significantly. Say you have a 1440P monitor and your VGA is capable of running a certain game at very high FPS in 4K. Then supersampling stage can render the image in 4K and downsample it to 1440P which significantly improves the quality of the image rather than running it at native 1440P
You can take 1440p and shove it somewhere. Seriously, when the 10 series came out 4k was the thing. Then Nvidia backtracked to 1440p. Totally sad, I jumped on 4k gaming and then got screwed in the end by nvidia. Also I get a good laugh when people say certain games do not support SLI. Yes, maybe in the game it's not supported, but in the driver it is. I never set game settings for SLI, I do it on the driver side. I play like 3 games that say no support for SLI, yet if I check the SLI indicator and the GPUs in W10 it shows both being used and peaking at 100%, so guess what, SLI works. I know this because I have been doing SLI since my original Diamond Monster Voodoo 2's in SLI. I remember playing Quake 3 Arena on those babies and had better FPS than the newer single-card Voodoo 3's.
Also, in Canada the 3090 Ti carries a $3,261.99 price tag. You can buy a used card for under that, easy.
Posted on Reply
#29
MxPhenom 216
ASIC Engineer
These drivers seem to work quite a bit better than the 497.09s I had prior.
Posted on Reply
#30
R-T-B
MxPhenom 216These drivers seem to work quite a bit better than the 497.09s I had prior.
Agreed. My experience with them has been good so far.
Posted on Reply
#31
Unregistered
Any point in using this on my 980 Ti? Current driver is 497.29.
Posted on Edit | Reply
#32
fusseli
Still no optimal settings available in Halo Infinite with a 3070 Ti. Oh well, I just run it on ultra at ~100 fps.
Lycanwolfen3090TI single card cost the same as a car in Canada. ... Seriously when the 10 series came out 4k was the thing. Then Nvidia backtracked to 1440p. Totally sad I jumped on the 4k gaming and then got screwed in the end by nvidia.
Nvidia did sway everyone to 4K back around the 900 series, because that's when 4K and HDMI 2.0 all hit the market. Sorry you took the bait. 4K isn't hugely better for IQ compared to 1440p on a normal-sized monitor (or TV), yet takes much more processing power and VRAM. They are still hosing the market this way via the VRAM they put on new cards. My brand new 3070 Ti only has 8 GB, which is barely enough for 1440p ultrawide; it would not be enough for 4K. My 1080 Ti had 11 GB but a lot less horsepower. They are skimping on VRAM to reduce their own costs and force 4Kers to buy top-of-the-line cards. It's true.
Posted on Reply
#33
Fluffmeister
TiggerAny point in using this on my 980ti? current driver is 497.29
Not really, but I for one have grabbed them anyway. I'd like to get an RTX card at some point, but I'm not prepared to splash the sort of cash required to nab one at the moment.
Posted on Reply
#34
MxPhenom 216
ASIC Engineer
R-T-BAgreed. My experience with them has been good so far.
I haven't noticed any monitor flickering yet, even at the desktop, like I did with the 497s.
Posted on Reply
#35
lexluthermiester
GoldenXAaand they broke compute.
Who hired AMD devs?
Remember, just because a new driver is released doesn't mean you have to use it. If the driver you're using renders functionality you need and the new one breaks it, stick with the one that works.
robbSarcasm or are you truly that clueless about the hardware needed for this?
That was very clearly sarcasm, thus the raised-eyebrow emoji.
Posted on Reply
#36
fusseli
MxPhenom 216I haven't notice any monitor flickering yet even at desktop like I did with the 497s.
Yeah, I had flickering before too with G-Sync on in windowed mode, but it seems gone now.
Posted on Reply
#37
R-T-B
lexluthermiesterRemember, just because a new driver is released doesn't mean you have to use it. If the driver you're using renders functionality you need and the new one breaks it, stick with the one that works.
Indeed. FWIW my cuda based mining app is working, so CUDA seems ok.
Posted on Reply
#38
MxPhenom 216
ASIC Engineer
fusseliYeah I had flickering before too with gsynch on in windowed mode but it seems gone now.
I would get it even with G-Sync off, and that's in-game and on the desktop.
Posted on Reply
#39
Unregistered
FluffmeisterNot really, but I for one have grabbed them anyway. I'd like to get an RTX card at some point, but I'm not prepared to splash the sort of cash required to nab one at the moment.
Me too, I intend to wait 3 or 4 months and see how much cash I have secreted away. Till then the 980 Ti is still good, as I only game at 1080p atm anyway.
Posted on Edit | Reply
#40
Fluffmeister
TiggerMe too, I intend to wait 3 or 4 months and see how much cash I have secreted away. Till then the 980 Ti is still good, as I only game at 1080p atm anyway.
Definitely. I can still play at 1080p, 1440p and 4K depending on the game, and at 120 Hz with the first two. And thanks to DSR, anywhere in between too. So I definitely don't feel like I'm missing out that much.

Happy to wait.
Posted on Reply
#41
Recus
Lycanwolfen3090TI single card cost the same as a car in Canada.
Posted on Reply
#42
GoldenX
lexluthermiesterRemember, just because a new driver is released doesn't mean you have to use it. If the driver you're using renders functionality you need and the new one breaks it, stick with the one that works.
While this is perfect from a user standpoint, telling your users to revert when they also want to play God of War is an issue.
Posted on Reply
#43
MentalAcetylide
I'm sticking with the studio drivers for both of my cards. The last thing I need is game-driver updates messing with my renders.
Posted on Reply
#44
Mussels
Freshwater Moderator
Lycanwolfen3090TI single card cost the same as a car in Canada.

Also I get a good laugh when people say certain games do not support SLI. Yes maybe in the game it's not supported but in the driver it is. I never set game settings for SLI I do it on the driver side.
Get SLI working in StarCraft II: Legacy of the Void without Protoss pylon rings being broken, and get it working in Killing Floor 2 without vanishing textures.
Some games are 100% totally and utterly broken with SLI. SC2 had SLI support, and when the expansions came out they still claimed it had SLI, when the official profile just disables the second GPU and nothing more (WITHOUT fixing the pylon bug, at least when I last tried).

Lucky you, playing some esports titles and benchmarks... but there are many, many games out there that not only lack SLI support, but outright break when it's enabled.
Posted on Reply
#45
lexluthermiester
GoldenXWhile this is perfect from an user standpoint, telling your users to revert back when they also want to play God of War is an issue.
I understand your perspective, but disagree. There is nothing wrong with skipping a driver point release. God of War still runs on the drivers released in December. We have to remember that drivers and game optimizations are a very complicated task.
Posted on Reply
#46
R-T-B
lexluthermiesterI understand your perspective, but disagree. There is nothing wrong with skipping a driver point release. God of War still runs on the drivers released in December. We have to remember that drivers and game optimizations are a very complicated task.
Some games demand latest drivers to play, and check at the launcher. Rare, but some do. Personally I just say "screw em" to games that do that. Much like I do with overbearing anticheat solutions...
Posted on Reply
#47
lexluthermiester
R-T-BSome games demand latest drivers to play, and check at the launcher.
I have never encountered this with drivers less than two years old. I have seen it with older drivers though. Of course, I don't do much gaming on anything but titles from GOG, so that might be why.
R-T-BPersonally I just say "screw em" to games that do that. Much like I do with overbearing anticheat solutions...
Right there with you!
Posted on Reply
#48
R-T-B
I've had some clients play odd online games that required driver updates, but I'm totally with you: not in my library!
Posted on Reply
#49
WhoDecidedThat
LycanwolfenYou can take 1440p and shove it somewhere. Seriously when the 10 series came out 4k was the thing. Then Nvidia backtracked to 1440p. Totally sad I jumped on the 4k gaming and then got screwed in the end by nvidia.
Not just Nvidia. The entire industry is guilty of this. Case in point: the Xbox Series S is advertised as a 1440p 60 fps console, which is utter bullshit. It is a 1080p 60 fps console at best, with some titles managing only 1080p 30 fps. Upscaled to 1440p, of course. That upscaling is what was advertised.

Similarly, Xbox Series X/PS5 are advertised as 4K 120fps but it is more like 4K 60fps for modern titles. Can they run 4K 120fps? Yes, there will be titles that support it but those would be very few. You can't run something like Spiderman Miles Morales at 4K 120fps while retaining its impressive visual fidelity.

And you can bet that the RTX 3080/3090 won't be 4K cards for very long either as Ray Traced Global Illumination starts becoming more commonplace.

Graphics card companies need bullshit to advertise other than "our cards are faster than before". This is what ends up creating the 3D glasses hype, Nvidia PhysX hype, tessellation hype, 4K hype and, more recently, RTX hype. DLSS is probably the only feature they have created in recent times that isn't complete smoke and mirrors. It's pretty good for resolution scaling and anti-aliasing.

I am not saying that tessellation and RTX aren't useful. They absolutely are. But they are artificially tacked onto existing games, and then a few years later it turns out each is useful for some things only and isn't the game changer advertised. Hell, tessellation completely stopped being advertised by 2012-13, and it was all the rage in 2009-10.

There are also features that are hard to advertise but developers find very useful. Examples of this are DX11's Compute Shader and DX12's Mesh Shader.

This is why I think DLSS became useful so quickly. It doesn't need artistic supervision to be used and is relatively straightforward to implement in existing workflows.
Posted on Reply
#50
Mussels
Freshwater Moderator
This new scaling toy is going to break me

Fart Cry six, at what I might as well call 6K

Posted on Reply