Tuesday, November 6th 2007
NVIDIA Drivers Sacrificing Crysis Quality for Performance
An interesting article over at Elite Bastards has revealed that NVIDIA's latest 169.04 driver for its GeForce series of graphics cards may in fact be sacrificing image quality in the new Crysis game in order to gain better performance. If you look at the first two images below, you will see quite clearly that the reflections in the water have been unrealistically stretched and look quite odd. The obvious explanation would be that NVIDIA's new driver is having issues rendering these reflections, and you'd expect the company to fix it. However, the issue may run a little deeper than that. When the Crysis executable has its name changed from crysis.exe to driverbug.exe, the strange reflections mysteriously look normal all of a sudden, as shown in the second two images. Further tests revealed that the renamed executable actually performed around 7% worse in a number of tests on both an 8800 GTS and an 8800 GT compared to the default name. So it seems that NVIDIA has tried to subtly increase the framerate when playing Crysis on its cards at the cost of image quality, without giving customers any choice in the matter. Some sites are claiming that NVIDIA is using driver tweaks to 'cheat' and gain better performance, and whilst that may be an extreme way of looking at it, this certainly seems like more than just an accidental driver issue.
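If anyone wants to sanity-check that ~7% figure against their own runs, here is a minimal sketch of the comparison, assuming you have logged one FPS sample per line to two text files yourself; the file names and the script are hypothetical and not part of the Elite Bastards tests:

    # fps_compare.py - hypothetical helper for comparing two benchmark runs,
    # one with the default crysis.exe and one with the executable renamed
    # to driverbug.exe. Assumes each log holds one FPS sample per line.

    def average_fps(log_path):
        """Return the mean of the FPS samples in a log file."""
        with open(log_path) as log:
            samples = [float(line) for line in log if line.strip()]
        return sum(samples) / len(samples)

    default_fps = average_fps("crysis_run.log")      # default executable name
    renamed_fps = average_fps("driverbug_run.log")   # renamed executable

    delta = (default_fps - renamed_fps) / default_fps * 100
    print(f"Renamed run is {delta:.1f}% slower than the default executable name")

A positive delta somewhere in the neighbourhood of 7% would line up with what the renamed executable reportedly showed on the 8800 GTS and 8800 GT.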
Source:
Elite Bastards
81 Comments on NVIDIA Drivers Sacrificing Crysis Quality for Performance
And for everyone complaining that you wouldn't notice it in game - true, you probably wouldn't. But for a game that's supposed to be touting some of the most advanced graphics ATM, that claim doesn't hold up when your drivers are quietly trading image quality for in-game performance. IMO, that's partly why they decided to 'tweak' things you're probably not going to notice. How far do you feel they should be allowed to take these optimizations in the name of performance?
Even with ATI's GPUs currently behind the field, I'll still take their hardware and drivers over nVidia's. We might have driver optimizations too, but we're given control over turning them on and off. At least I'm confident that although I won't have the best FPS in a given game, I'll at least have much better graphics - and in videos, too. ATI has called nVidia out numerous times, even back when nVidia was touting their physics processing. It's just a general occurrence in hardware wars, but nVidia never responds to ATI's challenges, which says something in itself, IMO.
I can't notice a difference in quality myself, and if you think that nVidia's IQ is worse than AMD's then you're crazy, even with this.
Hitting nVidia when they're down, eh? Sure, it may be a mistake, but you always take it to the extreme.
edit>> I'm not saying I have a problem with the optimizations - it's doing it behind the customer's back that irritates me. At least give the consumer the option of turning the tweaks on and off. Extreme example: would you mind 45% pixelation in Crysis (or any game for that matter) if you got an extra 65 FPS? Would it bother you if you were given a choice, though?
My only concern with this is: what does it do to optimization or stability for other applications, and for the drivers in general?
I'm sticking to the WHQL until the Crysis hype wears off.
Oh, and I am seeing something... yes, I think a closed thread will be coming at some point :D
It hasn't happened by now... so what does that say of moderator foresight, or lack thereof?
I'm not trying to turn this into a fanboi flame-war, so I'm dropping it from here on out :toast:
It's not like you are going to notice a difference in quality. Plus, for most people PERFORMANCE is the issue when playing Crysis. Yes the game looks great, but there is NOBODY who can max the game out yet. Considering that most people will be running this game at ~30 fps, they want performance just as much as eye candy.
If the drivers alter the shadows a tiny bit but then let you move up texture quality from medium to high, then they're worth it because OVERALL you are getting better image quality and performance.
To whichever person said "why not just play all your games at 640x480" - I could counter, "why not play all your games at 1920, with 16xAA, 8xAF, maximum settings?" The answer is that you can't, because the performance hit would make it unplayable.
I'm a mechanic, alright? Say you go out and buy a Mustang GT, or a Corvette, or any high performance automobile, and you bring it to me at the shop just for an oil change and a general look-over.
Say, behind your back, I 'tweak' your car - remap the fuel timings in the computer, install a high flow air filter, high-flow fuel injectors, etc.
Your car runs much faster now, but you're also getting 5-10 mpg less than when you initially brought it into the shop. Would that bother you?
Your analogy would be better served if you sent a GPU in for repair/RMA and they returned it with modified clock speeds.
All these drivers are betas... meaningless. They're just testing different things and releasing them to end-users to try, test, review, debug and give feedback on. Even if they focus on performance over "IQ" with their WHQL drivers, who cares? Graphics cards are not all about visual quality, they're about rendering, and the performance required to render the visuals is already subject to degradation from higher resolutions, special features such as AA, HDR, AF etc., as well as the textures used by a 3D application. Visuals are meaningless if you don't have the ability to render them.
If anything, we should be annoyed that ATi and NVidia have not released their 'next-gen' cards sooner, knowing full well that demanding applications like Crysis were coming.
For those saying only FPS matters, take into account that you could also dumb down the quality in the CCC if you wanted to. It just makes a direct comparison between ATI cards and NVIDIA cards unfair if NVIDIA is cutting corners.
On a side note, isn't Nvidia's line, or wasn't it at one time, something about immersion in reality? Or immersion in crap? Yes, it looks like crap!!
At any rate, if all you care about is fps then go NVIDIA; if you like image quality, go ATI. Just don't put too much stock into benchmarks using Crysis + that driver, b/c you're not getting the full picture.
I'll look for the link but I think it was at hardocp or something.
What about other games? How many other ones have people been missing the _________ on? Feel free to insert any number of things that can be filtered out.
I found it funny that a while back a friend and I got together and, while playing CS:S with high-res packs installed, his always looked different than mine, despite the same in-game settings. Perhaps it was a previous gen of cards, but I think it was a 7600 something.
But the average consumer, or some of the dumb gamers, only looks at the top score to determine what they believe is best. Not average framerate, not quality, not quality settings, etc.
So in general this is pure and total horseshit. Nvidia has been caught yet again with their hands in the cookie jar and will no doubt try to play it off as a minor glitch that was accidental.
Maybe when they start releasing full new drivers and bragging about more fps, then you'd have something to criticise! But until then they haven't been "caught" at all. They're not even doing anything wrong. It's a BETA driver for a DEMO of a game.