Monday, January 8th 2018

NVIDIA GeForce 390.65 Driver with Spectre Fix Benchmarked in 21 Games

The Meltdown and Spectre vulnerabilities have been making many headlines lately. So far, security researchers have identified three variants. Variant 1 (CVE-2017-5753) and Variant 2 (CVE-2017-5715) are Spectre, while Variant 3 (CVE-2017-5754) is Meltdown. According to its security bulletin, NVIDIA has no reason to believe that its display driver is affected by Variant 3. To strengthen security against Variants 1 and 2, the company released its GeForce 390.65 driver earlier today, so NVIDIA graphics card owners can sleep better at night.

Experience tells us that some software patches come with performance hits, whether we like it or not. We were more than eager to find out whether this is the case with NVIDIA's latest GeForce 390.65 driver. We therefore set about benchmarking this revision against the previous GeForce 388.71 driver in 21 different games at the 1080p, 1440p, and 4K resolutions. We even threw in an Ethereum mining test for good measure. Our test system is powered by an Intel Core i7-8700K processor overclocked to 4.8 GHz, paired with 16 GB of G.Skill Trident-Z 3866 MHz memory on an ASUS Maximus X Hero motherboard. We're running the latest BIOS, which includes fixes for Spectre, and a fully updated Windows 10 64-bit with the Fall Creators Update, which includes the KB4056891 Meltdown fix.
We grouped all 21 games, each at three resolutions, into a single chart. Each entry on the X axis represents a single test and shows the percentage difference between the old and the new driver. Negative values indicate a performance decrease with today's driver; positive values indicate a performance gain.
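As a quick illustration of the math behind each chart entry, here is a minimal C sketch using made-up FPS values rather than our measured results; each delta is the new driver's FPS relative to the old driver's, and an overall figure can be summarized as a simple average of all deltas.

/* Minimal sketch of the chart math; the FPS pairs are hypothetical. */
#include <stdio.h>

int main(void)
{
    /* { FPS with GeForce 388.71, FPS with GeForce 390.65 } -- made-up values */
    const double results[][2] = {
        { 143.2, 143.9 },   /* e.g. a 1080p test */
        {  97.5,  97.3 },   /* e.g. a 1440p test */
        {  52.1,  52.2 },   /* e.g. a 4K test    */
    };
    const int n = (int)(sizeof results / sizeof results[0]);
    double sum = 0.0;

    for (int i = 0; i < n; i++) {
        double delta = (results[i][1] / results[i][0] - 1.0) * 100.0;
        printf("test %d: %+.2f%%\n", i + 1, delta);
        sum += delta;
    }
    printf("average: %+.2f%%\n", sum / n);
    return 0;
}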

Cryptominers can rest assured that the new GeForce 390.65 driver won't negatively affect their profits. Our testing shows zero impact on Ethereum mining. With regard to gaming, there is no significant difference in performance either. The new driver actually gains a little bit of performance on average over the previous version (+0.32%). The results hint at small, undocumented performance gains in Wolfenstein II and F1 2017; the other games are nearly unchanged. Even if we exclude those two titles, the performance difference is still +0.1%. The variations you see in the chart above are due to random effects and the limited precision of taking measurements in Windows. For the kind of testing done in our VGA reviews, we typically expect a 1-2% margin of error between benchmark runs, even when using the same game, at identical settings, on the same hardware.

42 Comments on NVIDIA GeForce 390.65 Driver with Spectre Fix Benchmarked in 21 Games

#1
RejZoR
I don't get the point of this. The flaws and fixes for it affect CPU performance, not GPU performance. So, what's the point of testing GPU's? Unless the test includes pre and post CPU patches in relation with drivers pre and post patches.
#2
lilunxm12
RejZoR: I don't get the point of this. The flaws and fixes for it affect CPU performance, not GPU performance. So, what's the point of testing GPU's? Unless the test includes pre and post CPU patches in relation with drivers pre and post patches.
NVIDIA is patching its GPUs, not CPUs made by other vendors. NVIDIA does market the GPGPU capability of its products, so technically sensitive information may be stored in GPU cache; I guess NVIDIA just wants to show its attitude towards the market, and the actual possibility of being targeted should be minimal. Who's gonna use a GPU to process anything related to their password?
#3
spectatorx
RejZoR: I don't get the point of this. The flaws and fixes for it affect CPU performance, not GPU performance. So, what's the point of testing GPU's? Unless the test includes pre and post CPU patches in relation with drivers pre and post patches.
Exactly my thoughts. How is a GPU driver supposed to fix CPU-related problems, especially CPU architecture flaws? IMO this line in the changelog is PR BS, very much in NVIDIA's style. The closest thing to that I can recall is MFAA - an "AA" mode which "improves" MSAA - and for some reason the release of MFAA coincided with exactly the driver where users started to see more aliasing across many games.
#4
Kuroneko
RejZoR: I don't get the point of this. The flaws and fixes for it affect CPU performance, not GPU performance. So, what's the point of testing GPU's? Unless the test includes pre and post CPU patches in relation with drivers pre and post patches.
They might not really be affected by this, but they're working with other companies to iron out performance issues.
  • NVIDIA has determined that these exploits (or other similar exploits that may arise) do not affect GPU computing, so their hardware is mostly immune. They will be working with other companies to update device drivers to help mitigate any CPU performance issues, and they are evaluating their ARM-based SoCs (Tegra).
Source: www.androidcentral.com/meltdown-spectre (about halfway)
#5
arbiter
RejZoR: I don't get the point of this. The flaws and fixes for it affect CPU performance, not GPU performance. So, what's the point of testing GPU's? Unless the test includes pre and post CPU patches in relation with drivers pre and post patches.
If the CPU is busy handling other tasks instead of running the game and driver, it can affect fps a bit. If you ever play a game and stream at the same time, you'll notice that when OBS is encoding and using the CPU, you do have a bit of fps loss compared to when the game can use all of the CPU on its own. There is only so much NVIDIA can do to optimize the driver to limit the amount of CPU impact, though. If the Spectre update can cost 30% (and who knows who made that claim; as far as I know no source was quoted), you could in theory lose 20-30% fps.

Unless something has changed since the story I read where they ran a Doom test, GTX 1060 vs RX 480: if you had a high-end CPU both cards were close, with I think the 480 a hair ahead, but as they went down the line to slower CPUs, the performance of the GTX 1060 stayed consistent whereas the RX 480 lost a bit of fps. This was a while ago, and sure, AMD has worked on that and made it better since, but it does prove that a GPU driver, if not optimized, can be harmed a bit by the CPU being slowed down.
#6
Jism
RejZoR: I don't get the point of this. The flaws and fixes for it affect CPU performance, not GPU performance. So, what's the point of testing GPU's? Unless the test includes pre and post CPU patches in relation with drivers pre and post patches.
I remember a guy once found an exploit on NVIDIA video cards. He was watching a porn site, and when he closed it, it turned out that the GPU didn't really clear the memory holding a cache of what he had been watching.

In other words, everything you do on your computer gets stored in VRAM. This wasn't cleared the moment you closed or minimized the application.

I'm sure this is one of those fixes, among a few others. If you were banking or doing anything else with codes or sensitive information, I'm sure that gets saved in VRAM as well; the patch is written to prevent stealing data from one instance to another in VRAM.
#7
Camm
RejZoR: I don't get the point of this. The flaws and fixes for it affect CPU performance, not GPU performance. So, what's the point of testing GPU's? Unless the test includes pre and post CPU patches in relation with drivers pre and post patches.
Spectre allows for application snooping. Whilst the exploit is hardware-based, an application vendor can help mitigate Spectre's ability to snoop on its application by changing how it protects itself.
#8
SIGSEGV
RejZoR: I don't get the point of this. The flaws and fixes for it affect CPU performance, not GPU performance. So, what's the point of testing GPU's? Unless the test includes pre and post CPU patches in relation with drivers pre and post patches.
LOL. /jokesonthissiteisbeyondofmyimagination
#9
xorbe
spectatorx: Exactly my thoughts. How is a GPU driver supposed to fix CPU-related problems, especially CPU architecture flaws?
The GPU driver runs with privilege, and by recoding key indirect branches, it closes a side-channel data leak.
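To illustrate what "recoding key indirect branches" can mean in practice, here is a minimal, hypothetical C sketch (not NVIDIA driver code; the function names are invented). An indirect call through a function pointer is the kind of branch Spectre Variant 2 (CVE-2017-5715) mistrains; one published remedy is to rebuild such call sites with retpoline thunks, e.g. GCC's -mindirect-branch=thunk or Clang's -mretpoline, so the CPU cannot speculate into an attacker-chosen target.

/* Hypothetical example, not driver code: an indirect branch of the kind
 * a Variant 2 (CVE-2017-5715) attacker tries to mistrain. */
#include <stdio.h>

typedef int (*op_fn)(int);

static int square(int x) { return x * x; }

static int run_op(op_fn fn, int value)
{
    /* Indirect call: built normally this becomes something like `call *%rax`;
     * built with retpolines, the compiler emits a thunk that traps speculation
     * in a harmless loop instead of following a predicted target. */
    return fn(value);
}

int main(void)
{
    printf("%d\n", run_op(square, 7));
    return 0;
}

The C source is identical either way; the mitigation lives in how the binary is generated, which is why this kind of fix can ship quietly inside a driver update.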
#10
Prima.Vera
If you look at the graph closely, the biggest performance impact is in those games that are CPU-bound ;)
#11
eidairaman1
The Exiled Airman
They are covering their rears is all, not a big deal.
#12
Fluffmeister
Prima.Vera: If you look at the graph closely, the biggest performance impact is in those games that are CPU-bound ;)
I think the CPU duopoly needs a shake-up; both of them are dragging their feet and holding the industry back.

Raja will save the day.
#13
lexluthermiester
So the conclusion is that the differences fall within a statistical margin of error and have no appreciable effect on game performance.
#14
Slizzo
No benchmarks should be taking place until the microcode updates come out from Intel/Motherboard manufacturers. Until then these fixes are minimal.
#15
R0H1T
RejZoR: I don't get the point of this. The flaws and fixes for it affect CPU performance, not GPU performance. So, what's the point of testing GPU's? Unless the test includes pre and post CPU patches in relation with drivers pre and post patches.
We don't know how Spectre & Meltdown affect GPUs. I assume that if there's a fix, then there's also an exploitable hole somewhere that must be patched. The Meltdown patch is a software fix, so there might be more ways than one to exploit that vulnerability. Perhaps GPUs are also vulnerable, I dunno? Spectre is a bigger problem; there could be more exploits (aside from Variants 1 & 2) of that vulnerability in the future.
#16
lexluthermiester
R0H1T: We don't know how Spectre & Meltdown affect GPUs. I assume
Yes, you assume, incorrectly.
meltdownattack.com/
Read, Learn. Stop spreading misinformation. GPU's are not affected by these vulnerabilities.
#17
R0H1T
lexluthermiester: Yes, you assume, incorrectly.
meltdownattack.com/
Read, Learn. Stop spreading misinformation. GPU's are not affected by these vulnerabilities.
First of all stop pretending you know everything about spectre or meltdown.
The Project Zero team devised an attack by reading Intel manuals; they also looked at the Rowhammer attack & took it as a stepping stone to make this work!
Not only that, there were 4 independent teams who found it, all inside a year of the Rowhammer demo in 2016! Stop, read, learn -
TRIPLE MELTDOWN: HOW SO MANY RESEARCHERS FOUND A 20-YEAR-OLD CHIP FLAW AT THE SAME TIME
#18
jaggerwild
There has always been a JavaScript vulnerability; it runs on your GPU... It says in the thread title: Spectre Fix.
#19
lexluthermiester
R0H1T: First of all stop pretending you know everything about spectre or meltdown.
I'm not and I don't. However, I've read enough from the people who discovered and researched the problem to know that your conclusions and the misinformation you are stating are incorrect.
R0H1T: The Project Zero team devised an attack by reading Intel manuals; they also looked at the Rowhammer attack & took it as a stepping stone to make this work!
How does that apply to a GPU? IIRC, rowhammer was mitigated several years ago with software updates, firmware revisions and hardware redesigns. Not really a problem itself nor part of these problems.
R0H1T: Stop, read, learn -
Cute..
R0H1T
Twitter? Not something any researcher takes seriously.
R0H1T: TRIPLE MELTDOWN: HOW SO MANY RESEARCHERS FOUND A 20-YEAR-OLD CHIP FLAW AT THE SAME TIME
Interesting, but not relevant to the context of this discussion. Again, GPU's are not vulnerable to Meltdown or Spectre. GPU processors work in different way than that of a standard CPU. That's what makes them so useful in the tasks they are made for.
jaggerwild: There has always been a JavaScript vulnerability; it runs on your GPU...
LOL! Now that was funny!
#20
Camm
lexluthermiester: I'm not and I don't. However, I've read enough from the people who discovered and researched the problem to know that your conclusions and the misinformation you are stating are incorrect.

How does that apply to a GPU? IIRC, rowhammer was mitigated several years ago with software updates, firmware revisions and hardware redesigns. Not really a problem itself nor part of these problems.

Cute..

Twitter? Not something any researcher takes seriously.

Interesting, but not relevant to the context of this discussion. Again, GPU's are not vulnerable to Meltdown or Spectre. GPU processors work in different way than that of a standard CPU. That's what makes them so useful in the tasks they are made for.

LOL! Now that was funny!
Sigh. The GPU itself, as a hardware layer, hasn't been shown to be vulnerable; however, the driver that runs in software space can be affected by Spectre (or Meltdown), as the driver's memory space sits in the kernel, and thus it can ultimately affect the GPU.
#21
lexluthermiester
Camm: Sigh. The GPU itself, as a hardware layer, hasn't been shown to be vulnerable; however, the driver that runs in software space can be affected by Spectre (or Meltdown), as the driver's memory space sits in the kernel, and thus it can ultimately affect the GPU.
That's an interesting and intriguing point! However, because of the way both of these vulnerabilities work, the GPU is not directly vulnerable, and it would likely be so difficult to pull off that it would be a waste of an attacker's time and effort to attempt.
#22
R0H1T
lexluthermiester: How does that apply to a GPU? IIRC, rowhammer was mitigated several years ago with software updates, firmware revisions and hardware redesigns. Not really a problem itself nor part of these problems.
My bad, it's a timing side-channel attack on KASLR, demoed in 2016 ~
www.blackhat.com/docs/us-16/materials/us-16-Jang-Breaking-Kernel-Address-Space-Layout-Randomization-KASLR-With-Intel-TSX.pdf
Cute..

Twitter? Not something any researcher takes seriously.
So you've got nothing to counter & you choose to shut your eyes & ears? The tweet tells us about possible side-channel attacks that could work on x86-64.
lwn.net/Articles/738975/
gruss.cc/files/kaiser.pdf
Interesting, but not relevant to the context of this discussion. Again, GPU's are not vulnerable to Meltdown or Spectre. GPU processors work in different way than that of a standard CPU. That's what makes them so useful in the tasks they are made for.
Side channel attacks could affect every piece of hardware out there. Also what did Nvidia patch if there's nothing to patch in there, explain that?

edit - You must've also missed unified memory in CUDA then, starting with CUDA 6 IIRC?
devblogs.nvidia.com/parallelforall/unified-memory-in-cuda-6/
#23
lexluthermiester
R0H1T: My bad, it's a timing side-channel attack on KASLR, demoed in 2016 ~
www.blackhat.com/docs/us-16...Layout-Randomization-KASLR-With-Intel-TSX.pdf
Ok, now that's a bit different. KASLR is a very specific and complex attack. It's not easily carried out to begin with. Not sure how it will relate to Meltdown and Spectre, but the complexity would likely become exponential.
EDIT: I was thinking about something else when I saw that name. It seems KASLR and Kaiser are one and the same. However, this still doesn't change the fact that GPU's are not directly affected by MLTDWN&SPCTR.
R0H1T: So you've got nothing to counter & you choose to shut your eyes & ears.
Twitter is a convoluted mess most of the time and I will not waste my time with it.
R0H1T: lwn.net/Articles/738975/
gruss.cc/files/kaiser.pdf
Didn't really read up on Kaiser too much as it seemed easily fixed and somewhat limited to the Linux sector. But I did gloss over that pdf and am not seeing the connection between it, Meltdown, Spectre and GPU's.
R0H1T: Side channel attacks could affect every piece of hardware out there.
Perhaps, but they are notoriously difficult to pull off. Most attackers either won't or can't successfully harvest usable data from such an attack. The best most could hope for is to crash the target system.
R0H1T: Also what did Nvidia patch if there's nothing to patch in there, explain that?
IIRC, Nvidia's latest patch release had nothing to do with MLTDN&SPCTR. Do you have a link? Google is giving me nothing..
#24
R0H1T
lexluthermiester: Ok, now that's a bit different. KASLR is a very specific and complex attack. It's not easily carried out to begin with. Not sure how it will relate to Meltdown and Spectre, but the complexity would likely become exponential.
That KASLR demo was what led to Project Zero's discovery; it isn't as complex if there's a hardware (design) flaw. This is a developing situation, so I can't say if we'll see more Spectre or Meltdown variants. My assumption is that the GPU could be exposed in more ways than one; like Camm said ~ drivers, for instance, are vulnerable.

The GPU uarch might not have the same Spectre or Meltdown vulnerability, but we don't know if there are similar design flaws - ones which could in theory affect them and be found simply by studying the architecture in detail.
Starting From Zero

How did Horn independently stumble on the notion of attacking speculative execution in Intel's chips? As he tells it, by reading the manual.

In late April of last year, the 22-year-old hacker—whose job at Project Zero was his first out of college—was working in Zurich, Switzerland, alongside a coworker, to write a piece of processor-intensive software, one whose behavior they knew would be very sensitive to the performance of Intel's chips. So Horn dived into Intel's documentation to understand how much of the program Intel's processors could run out-of-order to speed it up.

He soon saw that for one spot in the code he was working on, the speculative execution quirks Intel used to supercharge its chip speed could lead to what Horn describes as a "secret" value being accidentally accessed, and then stored in the processor's cache. "In other words, [it would] make it possible for an attacker to figure out the secret," Horn writes in an email to WIRED. "I then realized that this could—at least in theory—affect more than just the code snippet we were working on, and decided to look into it."
This is what's important: a design flaw enabled 4 different teams to find the same vulnerabilities. So in essence it's just a matter of looking at the uarch long enough & hard enough. I'm not passing any judgement, but I'm not ruling it out either.
lexluthermiester: IIRC, Nvidia's latest patch release had nothing to do with MLTDN&SPCTR. Do you have a link? Google is giving me nothing..
From TPU ~
Security Update
Fixed CVE-2017-5753: Computer systems with microprocessors utilizing speculative execution and branch prediction may allow unauthorized disclosure of information to an attacker with local user access via a side-channel analysis.
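For context, the pattern a Variant 1 (CVE-2017-5753, bounds check bypass) fix targets looks roughly like the hypothetical C sketch below. This is illustrative only, not code from the NVIDIA driver, and the array and function names are invented: the CPU can speculatively take the in-bounds branch with an out-of-bounds index before the comparison retires, leaving a cache footprint an attacker can measure, and inserting a serializing instruction such as LFENCE after the bounds check is one of the published mitigations.

/* Illustrative Variant 1 sketch with made-up names -- not driver code. */
#include <stddef.h>
#include <stdint.h>
#include <emmintrin.h>                 /* _mm_lfence(), x86 only */

#define TABLE_SIZE 16
static uint8_t table[TABLE_SIZE];      /* data the caller is allowed to read */
static uint8_t probe[256 * 4096];      /* acts as the cache side channel     */

uint8_t read_checked(size_t untrusted_index)
{
    if (untrusted_index < TABLE_SIZE) {
        /* Without a barrier, the CPU may speculate past the bounds check,
         * load table[untrusted_index] out of bounds, and leave a trace of
         * the secret byte in the cache via the dependent probe[] access. */
        _mm_lfence();                  /* stall speculation until the check retires */
        return probe[table[untrusted_index] * 4096];
    }
    return 0;
}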
#25
lexluthermiester
R0H1T: design flaw
But this is what I'm trying to help you understand: these vulnerabilities are not "design flaws". The term "design flaw" directly implies a defect. That is not the case. The CPUs affected by these problems operate perfectly well and stably, and will keep doing so even if the vulnerability is exploited. While the vulnerability takes advantage of a quirk of a hardware function, those functions are not in and of themselves defects. Does that make sense?