Monday, May 18th 2015

NVIDIA Also Releases GeForce 352.86 WHQL Game Ready Driver

In addition to the first WHQL-signed GeForce driver for Windows 10, NVIDIA released the GeForce 352.86 WHQL Game Ready Driver for other Windows versions. This driver adds optimizations for "The Witcher 3: Wild Hunt," which releases tomorrow (19th May). The driver also adds SLI profiles and GeForce Experience optimal settings for the game. The Witcher 3: Wild Hunt is part of NVIDIA's "Two Times the Adventure" game bundle, in which buyers of NVIDIA's GeForce GTX 960, GTX 970, and GTX 980 graphics cards, and of notebooks with the GTX 970M and GTX 980M, receive game codes for free. The game also has some inherent optimizations for NVIDIA GeForce GPUs through its GameWorks integration, which adds PhysX effects, turf effects, and HBAO+.
DOWNLOAD: Windows 8/7/Vista 64-bit | Windows 8/7/Vista 32-bit | Windows XP 32-bit | Windows XP 64-bit

33 Comments on NVIDIA Also Releases GeForce 352.86 WHQL Game Ready Driver

#1
Hiryougan
"Optimalization". When 780 gets downgraded to the level of 960.
#2
mroofie
Hiryougan"Optimalization". When 780 gets downgraded to the level of 960.
Much butt hurt :0
#3
Hiryougan
Nah, I don't even own an NVIDIA GPU. But the Kepler abandonment is real, man.
#4
Prima.Vera
Hiryougan"Optimalization". When 780 gets downgraded to the level of 960.
what do you mean?
#5
Valeriant
Here we go, a couple hours to release time. As usual, a new driver ready for it. Will it break other games? To install or not to install, hmm. Maybe I'll try the game first before deciding.
#6
erixx
Why release a f*****g huge driver when it is just a new game profile? Or wait... Do drivers change for different games? :P
#7
jabbadap
Curse you, don't say turf effects when there are no turf effects in the game. Those effects would have been perfect for The Witcher.
#8
GhostRyder
Prima.Verawhat do you mean?
He is referring to things like this (not this specific site, just this in general):
www.techspot.com/review/1000-project-cars-benchmarks/page2.html
What he is referencing is that lately games have become less and less optimized for the Kepler cards, and it's starting to show. In a sense they are doing the same thing to Kepler now that Maxwell is out as they did to Fermi when Kepler came out. A few places have already been complaining about this, but it's not clear how big it is yet or if it's just a coincidence.

Glad there is a WHQL for Windows 10 out; we're almost there.
#9
rtwjunkie
PC Gaming Enthusiast
GhostRyderHe is referring to things like this (not this specific site, just this in general):
www.techspot.com/review/1000-project-cars-benchmarks/page2.html
What he is referencing is that lately games have become less and less optimized for the Kepler cards, and it's starting to show. In a sense they are doing the same thing to Kepler now that Maxwell is out as they did to Fermi when Kepler came out. A few places have already been complaining about this, but it's not clear how big it is yet or if it's just a coincidence.

Glad there is a WHQL for Windows 10 out; we're almost there.
So far I have not seen any ill effects. My 780 still has a lot of brute force, pretty much on par with a 970, so I'll be OK with my upgrade schedule that I always stick to... in this case the 980 Ti, after it has been out a few months and you can actually buy one.
#10
Fluffmeister
rtwjunkieSo far I have not seen any ill effects. My 780 still has a lot of brute force, pretty much on par with a 970, so I'll be OK with my upgrade schedule that I always stick to... in this case the 980 Ti, after it has been out a few months and you can actually buy one.
Certainly does well considering it's 2 years old now.
#11
qubit
Overclocked quantum bit
I hope they've finally fixed that recently introduced bug in the last few driver versions that prevents UT2004 from running properly in SLI mode, but I'm not holding my breath.
#12
GhostRyder
rtwjunkieSo far I have not seen any ill effects. My 780 still has a lot of brute force, pretty much on par with a 970, so I'll be OK with my upgrade schedule that I always stick to... in this case the 980 Ti, after it has been out a few months and you can actually buy one.
Is it a big deal right now? No, because it's only some recent titles not getting the attention needed, but the fact that it shows up already is quite troublesome in my book. I mean, it's almost like a "forced" upgrade for some, especially considering the reality of power between cards like the GTX 780/Ti and 970/980. But upgrade paths are different for everyone; I just think if someone spent 700 bucks or more on a card, they had better be able to use it for a couple of years at that price.
qubitI hope they've finally fixed that recently introduced bug in the last few driver versions that prevents UT2004 from running properly in SLI mode, but I'm not holding my breath.
LOL, man, do you even need both 780 Tis to run UT2004? :p
#13
rtwjunkie
PC Gaming Enthusiast
@GhostRyder The way I look at it, I already have gotten my two years out of it. Anything more until I buy a 980 Ti is a bonus!
#14
qubit
Overclocked quantum bit
GhostRyderLOL, man, do you even need both 780 Tis to run UT2004? :p
lol, I thought someone would say that. :) Here's why it matters.

1. The default configuration for my cards is of course SLI. However, it only works properly for UT2004 if I completely disable it in the driver first, not just in the 3D settings, which is a Royal Pain in the Ass.

2. I'm shooting for a solid 120 fps, vsync-locked, at max details at 1080p, and there are moments when the action is very busy that it actually drops below this, leading to visible judder, even with a 2700K CPU and 780 Ti. Having the second one would likely eliminate this. I know it's the card, because when I lower the resolution to something like 1024x768 (i.e. a very easy, lightweight mode) it never drops frames. It's surprising how big a load even an old game can put on a modern system, especially if it's not well optimised and you're maxing it out. Call of Duty: World at War is like this and runs at a surprisingly poor framerate on my system (haven't tried it with SLI yet, though). It's always above 60 fps, however.
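To put numbers on that judder, the idea is just to log per-frame times and count how many blow the 120 Hz budget of 1/120 s ≈ 8.33 ms; under vsync a missed frame gets held for a second refresh, i.e. a 16.7 ms hitch, which is exactly the stutter you see. Below is a minimal hypothetical Python sketch of that check, not hooked into any real game: render_frame() is just a dummy stand-in for the per-frame work (a tool like FRAPS does the same measurement against the actual game).

# Minimal, hypothetical sketch: log per-frame times and count frames that
# miss a 120 Hz vsync budget (~8.33 ms). render_frame() is a dummy stand-in
# for a game's real per-frame work.
import time

TARGET_HZ = 120
BUDGET_S = 1.0 / TARGET_HZ  # ~8.33 ms per frame at 120 Hz

def render_frame():
    time.sleep(0.007)  # dummy workload; a real game renders here

frame_times = []
last = time.perf_counter()
for _ in range(600):  # sample roughly five seconds' worth of frames
    render_frame()
    now = time.perf_counter()
    frame_times.append(now - last)  # wall-clock time this frame took
    last = now

missed = sum(1 for ft in frame_times if ft > BUDGET_S)
print(f"frames over budget: {missed}/{len(frame_times)}")
print(f"worst frame: {max(frame_times) * 1000:.2f} ms (budget {BUDGET_S * 1000:.2f} ms)")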
#15
GhostRyder
qubitlol, I thought someone would say that. :) Here's why it matters.

1. The default configuration for my cards is of course SLI. However, it only works properly for UT2004 if I completely disable it in the driver first, not just in the 3D settings, which is a Royal Pain in the Ass.

2. I'm shooting for a solid 120 fps, vsync-locked, at max details at 1080p, and there are moments when the action is very busy that it actually drops below this, leading to visible judder, even with a 2700K CPU and 780 Ti. Having the second one would likely eliminate this. I know it's the card, because when I lower the resolution to something like 1024x768 (i.e. a very easy, lightweight mode) it never drops frames. It's surprising how big a load even an old game can put on a modern system, especially if it's not well optimised and you're maxing it out. Call of Duty: World at War is like this and runs at a surprisingly poor framerate on my system (haven't tried it with SLI yet, though). It's always above 60 fps, however.
I had a feeling, but I am actually surprised that it needs more to do 120 Hz perfectly. Though to be fair, most games are not made to run well above 60 Hz, so it's an optimization thing. I understand your plight; it is a sad thing to see happen!
rtwjunkie@GhostRyder The way I look at it, I already have gotten my two years out of it. Anything more until I buy a 980 Ti is a bonus!
True, but I am also referencing the entire line, including the GTX Titan, Titan Black, Titan Z, and 780 Ti, some of which cost $1,000+. Some of us upgrade regularly (I normally skip one generation before purchasing, except in cases where I am not happy with something), but I feel it should be a choice to upgrade, not a forced upgrade because support stopped once the new-generation cards came out. Maxwell has not been out a year yet and is only the next generation beyond Kepler, so I feel Kepler should still have full support and optimizations at least until the generation after, for those wanting to stick with a card awhile. Some people buy anticipating keeping a card for 3 or more years, because they do not have a ton of money and invest in a high-end card precisely so they can keep it awhile and not worry. Many times I see a person asking that very question: what can they buy that will last 2+ years?

Either way, hopefully this or future drivers will fix the issue. If nothing else, at least they still provide SLI profiles for older cards.
#16
luches
Getting random driver crashes on the desktop now. To play Witcher 3, I gotta roll back and hope the game is not much affected by the older driver.
Just brilliant :shadedshu:

Win7 64bit Ultimate
780Ti
--------------------
#17
jsfitz54
luchesGetting random driver crashes on the desktop now. To play Witcher 3, I gotta endure constant driver crashes.
Just brilliant :shadedshu:

Win7 64bit Ultimate
780Ti
--------------------
Same desktop crashes here. Rolled back.
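If you want to confirm which driver Windows actually ended up on after the rollback, the sketch below is one way: query WMI through the stock wmic tool that ships with Windows 7/8 (a hypothetical Python helper, not something from this thread). Windows reports the internal driver version, e.g. 9.18.13.5286, whose last five digits correspond to the GeForce version, here 352.86.

# Hypothetical sketch: confirm the active display driver after a rollback by
# querying WMI via the stock wmic tool (ships with Windows 7/8).
# Windows reports the internal version, e.g. 9.18.13.5286; its last five
# digits map to the GeForce version (35286 -> 352.86).
import subprocess

out = subprocess.run(
    ["wmic", "path", "win32_videocontroller", "get", "name,driverversion"],
    capture_output=True, text=True, check=True,
)
print(out.stdout.strip())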
#18
rtwjunkie
PC Gaming Enthusiast
Note to self: "I have a 780, there's nothing here for you; do NOT upgrade drivers before playing Witcher 3 tonight." :laugh:
#19
Benses
It seems that the ability to switch the color profile between full and limited is not there with this driver release. So according to NVIDIA, to play Witcher 3 my monitor is stuck with a washed-out look.
#20
ab3e
We need AMD to step up with those drivers. For the moment, in CrossFire with my HD 7970M on my Alienware I get 30 FPS and a lot of micro stutter. If I disable CrossFire I can play on medium at 30-40 FPS on my 29" ultrawide 2560x1080 monitor, which says the game is well optimized even for low-spec PCs. An HD 7970M is like a desktop HD 7870, which is a pretty old card. My CrossFire configuration is a bit faster than a GTX 770, so it should handle the game fine once drivers are in place.
#21
rtwjunkie
PC Gaming Enthusiast
rtwjunkieNote to self: "I have a 780, there's nothing here for you; do NOT upgrade drivers before playing Witcher 3 tonight." :laugh:
Turns out it was a good decision not to upgrade the drivers for my 780. Totally unnecessary. Witcher 3 played flawlessly for me. See my notes/mini review in the Witcher 3 Discussion thread.
#22
...PACMAN...
BensesIt seems that the ability to switch the color profile between full and limited is not there with this driver release. So according to NVIDIA, to play Witcher 3 my monitor is stuck with a washed-out look.
This is under the choose resolution section now; scroll to the bottom and you should find it :) Drivers are working fine here, smooth in the few games I have tried, with no crashes or anomalies.
#23
Hiryougan
After looking at the first benchmarks, my fears are confirmed. NVIDIA abandoned Kepler. The R9 290 is on the same level as or better than the 780 Ti in Witcher 3.

At 1440p it completely smashes the 780 Ti.

And guys, remember that AMD still hasn't released drivers for Witcher 3 (it's on Catalyst 15.4.1 Beta), so I expect it to be even better then.
Btw, for those who think these benchmarks are fake: the site is pclab.pl, which is often considered an NVIDIA-leaning site, so that's totally out of the question. Results are really similar on other sites too.
#24
rtwjunkie
PC Gaming Enthusiast
1080p tests are on ultra. Having played 4 hours last night with my 780 on High, I can tell you it's one of the best-looking games out there and probably the smoothest gameplay I have ever played. Absolutely no reason to go above High once you see it. So those "Kepler is broke" benchmarks mean exactly... nothing.
#25
Hiryougan
rtwjunkie1080p tests are on ultra. Having played 4 hours last night with my 780 on High, I can tell you it's one of the best-looking games out there and probably the smoothest gameplay I have ever played. Absolutely no reason to go above High once you see it. So those "Kepler is broke" benchmarks mean exactly... nothing.
For you:

But I agree, the game looks absolutely amazing!