Tuesday, November 6th 2007

NVIDIA Drivers Sacrificing Crysis Quality for Performance

An interesting article over at Elite Bastards has revealed that NVIDIA's latest 169.04 driver for its GeForce series of graphics cards may in fact be sacrificing image quality in the new Crysis game in order to gain better performance. If you look at the first two images below, you will see quite clearly that the reflections in the water have been unrealistically stretched and look quite odd. The obvious explanation for this would be that NVIDIA's new driver is having issues rendering the shadows and you'd expect the company to fix it. However, this issue may run a little deeper than that. When the Crysis executable has its name changed from crysis.exe to driverbug.exe, these strange shadows mysteriously look normal all of a sudden, as shown in the second two images. Further tests revealed that the renamed executable was actually performing around 7% worse on a number of tests using both an 8800 GTS and an 8800 GT compared to the default name. So it seems that NVIDIA has tried to subtly increase the framerate when playing Crysis using its cards at the cost of image quality, without giving customers any choice in the matter. Some sites are claiming that NVIDIA is using driver tweaks to 'cheat' and gain better performance, and whilst that may be an extreme way of looking at it, this certainly seems like more than just an accidental driver issue.
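The rename test is easy to reason about numerically: benchmark the game under its default executable name, rename crysis.exe to a name the driver won't recognise (driverbug.exe in Elite Bastards' test), benchmark again, and compare the averages. A minimal sketch of that comparison follows; the FPS figures are illustrative placeholders, not the article's measured data.

```python
# Hypothetical sketch of the executable-rename comparison described above.
# The FPS numbers are illustrative placeholders, NOT measured data.

def percent_change(default_fps: float, renamed_fps: float) -> float:
    """Percentage change of the renamed run relative to the default run."""
    return (renamed_fps - default_fps) / default_fps * 100.0

crysis_fps = 43.0     # average FPS as crysis.exe (driver profile active)
renamed_fps = 40.0    # average FPS as driverbug.exe (profile bypassed)

delta = percent_change(crysis_fps, renamed_fps)
print(f"driverbug.exe runs {delta:.1f}% vs. crysis.exe")
# prints: driverbug.exe runs -7.0% vs. crysis.exe
```

A roughly 7% drop under the renamed executable, consistent across runs and cards, is what distinguishes an application-specific driver profile from ordinary run-to-run variance.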
Source: Elite Bastards
Add your own comment

81 Comments on NVIDIA Drivers Sacrificing Crysis Quality for Performance

#26
a111087
everyone! download drivers for benchmarking and try to play with them! :D
Posted on Reply
#27
imperialreign
just my two cents (and fanboishness) - this is one of the reasons I can't stand nVidia. If you're going to 'tweak' your drivers like this, at least come out and say that there are 'optimizations' in the driver release. Not saying anything at all, and then playing dumb when someone catches on, is shady, sleazy, and poor business practice.

And for everyone complaining that you wouldn't notice it in game - true, you probably wouldn't. But for a game that's supposed to be touting some of the most advanced graphics ATM, that can't really be had when your drivers are trading away image quality for performance. IMO, that's partly why they decided to 'tweak' things you're probably not going to notice - and how far do you feel they should be allowed to take these optimizations in the name of performance?

Even with ATI's GPUs currently behind the field - I'll still take their hardware and drivers over nVidia. We might have driver optimizations, too, but we're given control over turning them on and off. At least I'm confident that although I won't have the best FPS in a given game, I'll at least have much better graphics - and in videos, too.
It's not just in Crysis; remember when ATI accused nVidia before?
ATI has accused nVidia numerous times, even back when nVidia was touting their physics processing. It's just a general occurrence in the hardware wars, but nVidia never responds to ATI's challenges, which says something in itself, IMO.
Posted on Reply
#28
Ravenas
imperialreignjust my two cents (and fanboishness) - this is one of the reasons I can't stand nVidia. If you're going to 'tweak' your drivers like this, at least come out and say that there are 'optimizations' in the driver release. Not saying anything at all, and then playing dumb when someone catches on, is shady, sleazy, and poor business practice.

And for everyone complaining that you wouldn't notice it in game - true, you probably wouldn't. But for a game that's supposed to be touting some of the most advanced graphics ATM, that can't really be had when your drivers are trading away image quality for performance. IMO, that's partly why they decided to 'tweak' things you're probably not going to notice - and how far do you feel they should be allowed to take these optimizations in the name of performance?

Even with ATI's GPUs currently behind the field - I'll still take their hardware and drivers over nVidia. At least I'm confident that although I won't have the best FPS in a given game, I'll at least have much better graphics - and in videos, too.
The only people who will be taking advantage of these optimizations are the people with lower-end graphics cards. Therefore, those people won't be "touting the best visuals ever seen". The only people who are going to get the top-quality graphics are the ones with the best graphics cards; it's just plain logic. These optimizations aren't doing anything but helping people with lower-end cards run a very graphics-intensive game. How is that bad business? If anything, it's great business.
Posted on Reply
#29
Weer
DaMultaTo the people in this thread that do not care about IQ. You should just run your game at 640x480 on all low textures to speed up your game even more.

This is PC Gaming, we expect nothing but the best in our games!!!!


Take my frames and give me the IQ!!!!!
The AMD fanboy speaks yet again.. :shadedshu

I can't notice a difference in quality myself, and if you think that nVidia's IQ is worse than AMD's then you're crazy, even with this.

Hitting nVidia when they're down, eh? Sure, it may be a mistake, but you always take it to the extreme.
Posted on Reply
#30
b1lk1
I couldn't care less how many optimizations they make in their drivers to make games faster, and if you choose higher FPS over higher IQ, that is everyone's own personal choice. The only part that makes me want to puke is how everyone shoves these online benchmarks of all cards down our throats as definitive proof of which card is better. Obviously, this skews results and makes 99% of the articles we read on which card is best a moot point. Depending on drivers, features and many other factors, each brand is best at something. For the group like me who prefer IQ, the choice is obvious. I just wish stories like this would stop all the nvidiots from jamming their crap down our throats that they are the best, when we all know that nvidia has used tweaks for years to give them the edge at default settings...
Posted on Reply
#31
imperialreign
These optimizations aren't doing anything but helping people with lower-end cards run a graphics-intensive game. How is that bad business?
When a game developer has built enough hype around a game, especially over the graphics that the game will incorporate, and a GPU hardware manufacturer has licensed their logo to back and support the game . . . you'll have people buying those GPUs just to play the game. Sure, you'll have great performance, but you're not getting everything that the game dev was pushing with their product. It's not fair to the consumer, or to the game developers, especially if people buy nVidia GPUs looking forward to the high-end graphics this game is supposed to offer.


edit>> I'm not saying I have a problem with the optimizations - it's doing it behind the customer's back that irritates me. At least give the consumer the option of turning the tweaks on and off. Extreme point: would you mind 45% pixelation in Crysis (or any game, for that matter) if you got an extra 65 FPS? Would it bother you if you were given a choice, though?
Posted on Reply
#32
newconroer
I am not surprised at this, the plethora of beta drivers churned out in the last month has been ridiculous. Not one has made a huge impact, and five of them have been 'optimized for Crysis.'

My only concern with this is, what happens to the optimization or stability concerns of other applications + the drivers in general?

I'm sticking to the WHQL until the Crysis hype wears off.
Posted on Reply
#33
Hawk1
WeerHitting nVidia when they're down, eh? Sure, it may be a mistake, but you always take it to the extreme.
Lol - when have they been down recently? Anyway, I think I would be pissed if they "optimized" my card to reduce IQ, but that's just me.

Oh, and I am seeing something..... yes I think a closed thread will be coming at some point:D
Posted on Reply
#34
Ravenas
imperialreignWhen a game developer has built enough hype around a game, especially over the graphics that the game will incorporate, and a GPU hardware manufacturer has licensed their logo to back and support the game . . . you'll have people buying those GPUs just to play the game. Sure, you'll have great performance, but you're not getting everything that the game dev was pushing with their product. It's not fair to the consumer, or to the game developers, especially if people buy nVidia GPUs looking forward to the high-end graphics this game is supposed to offer.
You're not going to get everything the game offers if you don't have the best hardware... I don't see what you're getting at.
Posted on Reply
#35
newconroer
|Oh, and I am seeing something..... yes I think a closed thread will be coming at some point|

It hasn't happened by now... so what does that say about moderator foresight... or lack thereof?
Posted on Reply
#36
imperialreign
imperialreignedit>> I'm not saying I have a problem with the optimizations - it's doing it behind the customer's back that irritates me. At least give the consumer the option of turning the tweaks on and off. Extreme point: would you mind 45% pixelation in Crysis (or any game, for that matter) if you got an extra 65 FPS? Would it bother you if you were given a choice, though?
that's kind of what I'm getting at.

I'm not trying to turn this into a fanboi flame-war, so I'm dropping it from here out :toast:
Posted on Reply
#37
Davidelmo
I can't believe that anyone is actually upset or feels "betrayed" by this...

It's not like you are going to notice a difference in quality. Plus, for most people, PERFORMANCE is the issue when playing Crysis. Yes, the game looks great, but there is NOBODY who can max the game out yet. Considering that most people will be running this game at ~30fps, they want eye candy as well as performance.

If the drivers alter the shadows a tiny bit but then let you move up texture quality from medium to high, then they're worth it because OVERALL you are getting better image quality and performance.

To whichever person said "why not just play all your games at 640x480" - I could counter, "why not play all your games at 1920, with 16xAA, 8xAF, maximum settings?" The answer is that you can't, because the performance hit would make it unplayable.
Posted on Reply
#38
imperialreign
one last thing from me (promise!), I'm going to re-word my edit:

I'm a mechanic, alright? Say you go out and buy a Mustang GT, or a Corvette, or any high performance automobile, and you bring it to me at the shop just for an oil change and a general look-over.

Say, behind your back, I 'tweak' your car - remap the fuel timings in the computer, install a high flow air filter, high-flow fuel injectors, etc.

Your car runs much faster now, but you're also getting 5-10 mpg less than when you first brought it into the shop. Would that bother you?
Posted on Reply
#39
newconroer
Yes, because you modified the product (the car/Mustang). Nvidia is not modifying our video cards.

Your analogy would be better served, if you sent a GPU in for repairs/RMA, and they returned it with modified clock levels.


All these drivers are betas... meaningless; they're just testing different things and releasing them to end-users to try, test, review, debug and give feedback on. Even if they focus on performance over "IQ" with their WHQL drivers, who cares? Graphics units are not all about visual quality, they're about rendering. And the performance required to render the visuals is already subject to degradation from higher resolutions, special features such as AA, HDR, AF etc., as well as the textures used by a 3D application. Visuals are meaningless if you don't have the ability to render them.

If anything, we should be annoyed that ATi and NVidia have not released their 'next-gen' cards sooner, having known full well that demanding applications like Crysis were coming.
Posted on Reply
#40
JacKz5o
Maybe that's why some people think ATI has better quality graphics than NVIDIA? :p
Posted on Reply
#41
Solaris17
Super Dainty Moderator
Personally, I don't like this type of thing... but it really doesn't matter to me. In Crysis it would, since I know I'll look for it and be pissed, but in things like UT2k4 it doesn't matter, because even though I jack my settings all the way up and play things at MAXXXXXX, ya know, and I love frame rates - all that aside, in 2k4 I never look at anything except the targeting reticle... nVidia could put a sign on the walls saying "world destruction in xx:xx:xx:x" and I wouldn't notice at all.
Posted on Reply
#42
magibeg
Well, I think what it all comes down to is that nVidia should TELL US they're doing this, as well as provide the option to turn it off. Sure, you could argue you get more FPS and that's all that matters, but someone else could also argue that they prefer a beautiful atmosphere when they play the game.

For those saying only FPS matters, take into account that you could also dumb down the quality in the CCC if you wanted to. It just makes a direct comparison between ATI cards and nVidia cards unfair if nVidia is cutting corners.
Posted on Reply
#43
Steevo
I don't know if I believe what I am hearing - that people would NOT notice the weird graphics? Wow.


On a side note, isn't Nvidia's tagline - or wasn't it at one time - some crap about immersion in reality?
Posted on Reply
#44
Steevo
Sorry, "Reality Redefined".

Or...

Yes, it looks like crap!!
Posted on Reply
#45
jydie
I would expect both ATI and Nvidia to put out drivers that produce the best image possible, and let me adjust the settings in the game to get more performance. Nvidia's 8600 and 8800 line of video cards are very good... so, I think tweaking a driver in this manner is a foolish thing to do. Maybe it was just an oversight that will be corrected in the next update.
Posted on Reply
#46
deekortiz3
Ummm... it's still the demo. It is barely optimized for anything. Seriously, wait for the retail version to come out; it has to be a lot better than the demo.
Posted on Reply
#47
brian.ca
That being an oversight is unlikely when you consider the timing... in the end they might change it back (especially considering the bad press), but as I understand it the driver came out specifically for Crysis and TimeShift(?), coinciding with the release of their new cards. So most reviews of the new cards that used Crysis as a benchmark will be slightly inflated - that was most likely the point of all this, and subsequently what was probably meant to be taken away from the articles about it.

At any rate, if all you care about is fps then go nVidia; if you like image quality, go ATI. Just don't put too much stock in benchmarks using Crysis + that driver, b/c you're not getting the full picture.
Posted on Reply
#48
thebeephaha
At any rate, if all you care about is fps then go nVidia; if you like image quality, go ATI. Just don't put too much stock in benchmarks using Crysis + that driver, b/c you're not getting the full picture.
Maybe... I read a review with the 2900 XT vs the 8800GTX on HL2 EP2 and it showed the 8800 owned the ATI on rendering quality.

I'll look for the link but I think it was at hardocp or something.
Posted on Reply
#49
Steevo
The problem is that it was fully intentional. When you rename the Crysis executable file, it is suddenly fixed.



What about other games? How many other ones have people been missing the _________ on? Feel free to insert any number of things that can be filtered out.


I found it funny that a while back a friend and I got together, and while playing CS:S with high-res packs installed, his always looked different than mine, despite the same in-game settings. Perhaps it was a previous gen of cards, but I was thinking it was a 7600 something.


But the average consumer, and some of the dumb gamers, only look at the top score to determine what they believe is best. Not average framerate, not quality, not quality settings, etc.


So in general this is pure and total horseshit. Nvidia has been caught yet again with their hands in the cookie jar and will no doubt try to play it off as a minor glitch that was accidental.
Posted on Reply
#50
Davidelmo
SteevoThe problem is that it was fully intentional. When you rename the Crysis executable file, it is suddenly fixed.



What about other games? How many other ones have people been missing the _________ on? Feel free to insert any number of things that can be filtered out.


I found it funny that a while back a friend and I got together, and while playing CS:S with high-res packs installed, his always looked different than mine, despite the same in-game settings. Perhaps it was a previous gen of cards, but I was thinking it was a 7600 something.


But the average consumer, and some of the dumb gamers, only look at the top score to determine what they believe is best. Not average framerate, not quality, not quality settings, etc.


So in general this is pure and total horseshit. Nvidia has been caught yet again with their hands in the cookie jar and will no doubt try to play it off as a minor glitch that was accidental.
Dude, it is a BETA driver. They're meant to be TESTING new things - the obvious two things to change are image quality and performance.

Maybe when they start releasing full new drivers and bragging about more fps, then you'd have something to criticise! But until then they haven't been "caught" at all. They're not even doing anything wrong. It's a BETA driver for a DEMO of a game.
Posted on Reply