# NVIDIA Drivers Sacrificing Crysis Quality for Performance



## Jimmy 2004 (Nov 6, 2007)

An interesting article over at Elite Bastards has revealed that NVIDIA's latest 169.04 driver for its GeForce series of graphics cards may in fact be sacrificing image quality in the new Crysis game in order to gain better performance. If you look at the first two images below, you will see quite clearly that the reflections in the water have been unrealistically stretched and look quite odd. The obvious explanation would be that NVIDIA's new driver is having trouble rendering the reflections, and you'd expect the company to fix it. However, this issue may run a little deeper than that. When the Crysis executable is renamed from crysis.exe to driverbug.exe, these strange reflections mysteriously look normal all of a sudden, as shown in the second two images. Further tests revealed that the renamed executable actually performed around 7% worse in a number of tests on both an 8800 GTS and an 8800 GT compared to the default name. So it seems that NVIDIA has tried to subtly increase the framerate when playing Crysis on its cards at the cost of image quality, without giving customers any choice in the matter. Some sites are claiming that NVIDIA is using driver tweaks to 'cheat' and gain better performance, and whilst that may be an extreme way of looking at it, this certainly seems like more than just an accidental driver issue.
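The rename test and the headline figure above are simple to reason about: if the driver keys its tweaks to crysis.exe, renaming the binary gives you untweaked numbers to compare against. Here is a minimal sketch of that arithmetic; the FPS figures are made up for illustration and are not the article's raw data:

```python
# Illustrative only: FPS figures are invented, not Elite Bastards' measurements.
def percent_delta(default_fps: float, renamed_fps: float) -> float:
    """Percent change going from the default exe name to the renamed one."""
    return (renamed_fps - default_fps) / default_fps * 100.0

# (crysis.exe, driverbug.exe) average FPS per card -- hypothetical numbers
runs = {
    "8800 GT":  (36.2, 33.7),
    "8800 GTS": (31.5, 29.3),
}

for card, (default_fps, renamed_fps) in runs.items():
    print(f"{card}: {percent_delta(default_fps, renamed_fps):+.1f}%")
```

A consistent drop of around 7% on the renamed exe, as reported, is what you would expect if the speed-up is tied to the executable's name rather than to anything actually being rendered.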



*View at TechPowerUp Main Site*


----------



## mdm-adph (Nov 6, 2007)

Remember -- it's not a conspiracy if there's profits to be made.


----------



## Steevo (Nov 6, 2007)

More of the same shit they did years ago.


----------



## b1lk1 (Nov 6, 2007)

Nvidia has always sacrificed image quality for performance.  That is the main reason I will always run ATI cards because I'd rather lose a few FPS for better IQ.  I would not call it cheating, but it's just another MAJOR reason to not use online benchmarking results to definitively define which card is best.


----------



## sheps999 (Nov 6, 2007)

b1lk1 said:


> Nvidia has always sacrificed image quality for performance.  That is the main reason I will always run ATI cards because I'd rather lose a few FPS for better IQ.  I would not call it cheating, but it's just another MAJOR reason to not use online benchmarking results to definitively define which card is best.




So an ATi 2900XT would get lower framerates than an 8800GTX, but the game/benchmark would look better, yar?


----------



## Dehx (Nov 6, 2007)

b1lk1 said:


> Nvidia has always sacrificed image quality for performance.  That is the main reason I will always run ATI cards because I'd rather lose a few FPS for better IQ.  I would not call it cheating, but it's just another MAJOR reason to not use online benchmarking results to definitively define which card is best.



This is really not news.  It's been pointed out a lot of times in the past that NVIDIA does this.  The last NV card I had was an FX-5500 back in '04.  I've bought ATI ever since.  Sure, ATI may come in behind a couple FPS on the stupid graphs and charts that NVIDIA is so bent on dominating.  But I'll say this: I've never regretted a purchase I've made from ATI (x850xt, x800gto, x1900gt, x1900xtx, x1950xt, x2600xt, x2900xt) - all great cards.


----------



## Richieb0y (Nov 6, 2007)

I can't believe this. If this is still in the retail drivers and they're no good, I think my next card is an ATI one; I may even go and cancel my pre-order.


----------



## mrw1986 (Nov 6, 2007)

Oh no, subtle things I'll never notice while playing! Performance is king.


----------



## FAXA (Nov 6, 2007)

Looks like I might have to order an ATI card when some better ones come out, 'cause upgrading my 8800GTS would be a bit stupid atm. If only my X1950Pro hadn't blown up ...

Is this only happening in Crysis?


----------



## pbmaster (Nov 6, 2007)

The only difference I can see is in the second and last one. Never would notice something like that while playing. Go nVidia!


----------



## Siren (Nov 6, 2007)

Hmm... I personally would rather sacrifice IQ for frame rates in this case... Crysis has such low frame rates as it is on some of the best cards out there, so why anyone would complain about this is beyond me.  Low frame rates in a multi-player game could mean the difference between killing or being killed.


----------



## Ravenas (Nov 6, 2007)

What's the resolution of those pictures?


----------



## sheps999 (Nov 6, 2007)

Ravenas said:


> What's the resolution of those pictures?



1280x1024


----------



## Dehx (Nov 6, 2007)

mrw1986 said:


> Oh no, subtle things I'll never notice while playing! Performance is king.



Subtle things you'll never notice?  Well, turn off AA and AF and put mip mapping at medium, and ATI is still the best choice.


----------



## DaMulta (Nov 6, 2007)

To the people in this thread that do not care about IQ. You should just run your game at 640x480 on all low textures to speed up your game even more.

This is PC Gaming, we expect nothing but the best in our games!!!!


Take my frames and give me the IQ!!!!!


----------



## Ravenas (Nov 6, 2007)

sheps999 said:


> 1280x1024



Doesn't seem like you could expect it to look much better than that at those resolutions.


----------



## OnBoard (Nov 6, 2007)

"you will see quite clearly that the shadows in the water have been unrealistically stretched and look quite odd."

I looked at the shadows for a long time and didn't see anything wrong. Now if it would read REFLECTIONS in the water, I would have spotted it instantly


----------



## DaMulta (Nov 6, 2007)

If I hadn't broken my hard drive last night I would show you some good screenshots at that res, Ravenas.


all 1280 x 1024


I had 100s more, but I lost them all......


----------



## zekrahminator (Nov 6, 2007)

You can definitely see the fanboys here...


The AMD guys are saying "Oh man I'm so glad I have an ATI card, it rocks my socks, and NVIDIA doesn't do this performance-for-quality crap to me". 

The NVIDIA guys are saying "Yeah, well we at least HAVE awesomely l33t performance!"

Really though, fanboyishness aside, I think this is a pretty sleazy thing for NVIDIA to do.


----------



## mrw1986 (Nov 6, 2007)

I wouldn't say fanboys... just differences between people who like quality parts. Also, you can't say ATI doesn't do what nVidia did; ATI has optimized drivers to work with specific games. It's just funny that when a company releases an awesome product that blows the competition out of the water, people freak out trying to make up stupid excuses. And no, I'm not a fanboy; I use ATI and nVidia and I love both of them. Both manufacturers base their drivers on executables for games to increase performance, so this whole "nvidia cheating" crap has to stop. What's next, Intel is cheating because they're kicking AMD's ass with better CPU technology? It just bothers me that people will go to these lengths over stupid stuff like this... Every company has its ups and downs, yet it's always the successful ones that get blemished in the public eye.


----------



## OrbitzXT (Nov 6, 2007)

Dehx said:


> Well, turn off AA and AF and put mip mapping at medium, and ATI is still the best choice.



That's a stupid thing to say, no one is talking about turning those specific features off here. nVidia realized a small thing like shadows was lowering performance and made a decision that the frames lost were not worth it, and I wholeheartedly agree. I don't even notice a difference.


----------



## Ravenas (Nov 6, 2007)

DaMulta said:


> If I hadn't broken my hard drive last night I would show you some good screenshots at that res, Ravenas.
> 
> 
> all 1280 x 1024
> ...



You may lose a very slight amount of quality, but your pictures are very similar to the originals at the same res.


----------



## hacker111 (Nov 6, 2007)

What is this? The graphics suck! LAUGH:]:]:]


----------



## hacker111 (Nov 6, 2007)

Let's see some intense graphics please!


----------



## Exceededgoku (Nov 6, 2007)

This is no real change though; nvidia has always done this. But tbh at least they are trying to improve things by whatever means possible, so I have no problem with this... I wish they would give the option to enable an "FPS mode" rather than just making the decision for us, though!


----------



## a111087 (Nov 6, 2007)

It's not just in Crysis; remember when ATI accused nvidia before? 
I think it is cheating, but because I don't have much money to spend on PC, I want every last FPS I can get, no matter what the IQ is.
So, I guess it's just down to what everyone prefers - quality or performance.


----------



## a111087 (Nov 6, 2007)

Everyone! Download drivers for benchmarking and try to play with them!


----------



## imperialreign (Nov 6, 2007)

just my two cents (and fanboyishness) - this is one of the reasons I can't stand nVidia.  If you're going to 'tweak' your drivers like this, at least come out and say that there are 'optimizations' in the driver release.  Not saying anything at all, and then acting stupid when someone catches on, is a shady, sleazy and poor business practice.

And for everyone pointing out that you wouldn't notice it in game - true, you probably wouldn't.  But for a game that's supposed to be touting some of the most advanced graphics ATM, that claim can't really be upheld when your drivers are quietly trading image quality for performance.  IMO, that's partly why they decided to 'tweak' things you're probably not going to notice.  And how far do you feel they should be allowed to take these optimizations in the name of performance?

Even with ATI's GPU currently behind the field right now - I'll still take their hardware and drivers over nVidia.  We might have driver optimizations, too, but we're given control over turning them on and off. At least I'm confident that although I won't have the best FPS in a given game, I'll at least have much better graphics - and in videos, too.




> It's not just in crysis, remember when Ati accused nvidia before?



ATI has accused nVidia _numerous times_, even when nVidia was touting their physics processing.  It's just a general occurrence in the hardware wars, but nVidia never responds to ATI's challenges, which says something in itself, IMO.


----------



## Ravenas (Nov 6, 2007)

imperialreign said:


> just my two cents (and fanboyishness) - this is one of the reasons I can't stand nVidia.  If you're going to 'tweak' your drivers like this, at least come out and say that there are 'optimizations' in the driver release.  Not saying anything at all, and then acting stupid when someone catches on, is a shady, sleazy and poor business practice.
> 
> And for everyone pointing out that you wouldn't notice it in game - true, you probably wouldn't.  But for a game that's supposed to be touting some of the most advanced graphics ATM, that claim can't really be upheld when your drivers are quietly trading image quality for performance.  IMO, that's partly why they decided to 'tweak' things you're probably not going to notice.  And how far do you feel they should be allowed to take these optimizations in the name of performance?
> 
> Even with ATI's GPU currently behind the field right now - I'll still take their hardware and drivers over nVidia.  At least I'm confident that although I won't have the best FPS in a given game, I'll at least have much better graphics - and in videos, too.



The only people who will be taking advantage of these optimizations are the people with the lower end graphics cards. Therefore, those people won't be "touting the best visuals ever seen". The only people who are going to get the top quality graphics are the ones with the best graphics cards; it's just plain logic. These optimizations aren't doing anything but helping people with lower end cards run a very graphics-intensive game. How is that bad business? If anything, it's great business.


----------



## Weer (Nov 6, 2007)

DaMulta said:


> To the people in this thread that do not care about IQ. You should just run your game at 640x480 on all low textures to speed up your game even more.
> 
> This is PC Gaming, we expect notting but the best in our games!!!!
> 
> ...



The AMD fanboy speaks yet again.. :shadedshu

I can't notice a difference in quality myself, and if you think that nVidia's IQ is worse than AMD's then you're crazy, even with this.

Hitting nVidia when they're down, eh? Sure, it may be a mistake, but you always go to the extreme.


----------



## b1lk1 (Nov 6, 2007)

I couldn't care less how many optimizations they make in their drivers to make games faster, and if you choose higher FPS over higher IQ, that is everyone's own personal choice.  The only part that makes me want to puke is how everyone shoves these online benchmarks down our throats as definitive proof of which card is better.  Obviously, this skews results and makes 99% of the articles we read on which card is best a moot point.  Depending on drivers, features and many other factors, each brand can be best.  For the group like me who prefer IQ, the choice is obvious.  I just wish stories like this would stop all the nvidiots from jamming their crap down our throats that they are the best, when we all know that nvidia has used tweaks for years to give them the edge at default settings.......


----------



## imperialreign (Nov 6, 2007)

> These optimizations aren't doing anything but helping people with lower end cards run a graphics-intensive game. How is that bad business?



When a game developer has built enough hype around a game, especially over the graphics that the game will incorporate, and a GPU hardware manufacturer has licensed their logo to back and support the game . . . you'll have people buying those GPUs just to play the game.  Sure, you'll have great performance, but you're not getting everything that the game dev was pushing with their product.  It's not fair to the consumer, or the game developers, especially if people buy nVidia GPUs looking forward to the high-end graphics this game is supposed to offer.


edit>>  I'm not saying I have a problem with the optimizations - it's doing it behind the customer's back that irritates me.  At least give the consumer the option of turning the tweaks on and off.  Extreme point: would you mind 45% pixelation in Crysis (or any game for that matter) if you got an extra 65 FPS?  Would it bother you if you were given a choice, though?


----------



## newconroer (Nov 6, 2007)

I am not surprised at this, the plethora of beta drivers churned out in the last month has been ridiculous. Not one has made a huge impact, and five of them have been 'optimized for Crysis.' 

My only concern with this is, what happens to the optimization or stability concerns of other applications + the drivers in general?

I'm sticking to the WHQL until the Crysis hype wears off.


----------



## Hawk1 (Nov 6, 2007)

Weer said:


> Hitting nVidia when they're down, eh? Sure, it may be a mistake, but you always go to the extreme.



Lol - when have they been down recently? Anyway, I think I would be pissed if they "optimized" my card to reduce IQ, but that's just me. 

Oh, and I am seeing something..... yes, I think a closed thread will be coming at some point


----------



## Ravenas (Nov 6, 2007)

imperialreign said:


> When a game developer has built enough hype around a game, especially over the graphics that the game will incorporate, and a GPU hardware manufacturer has licensed their logo to back and support the game . . . you'll have people buying those GPUs just to play the game.  Sure, you'll have great performance, but you're not getting everything that the game dev was pushing with their product.  It's not fair to the consumer, or the game developers, especially if people buy nVidia GPUs looking forward to the high-end graphics this game is supposed to offer.



You're not going to get everything the game offers if you don't have the best hardware... I don't see what you're getting at.


----------



## newconroer (Nov 6, 2007)

> Oh, and I am seeing something..... yes I think a closed thread will be coming at some point

It hasn't happened by now... so what does that say of moderator foresight, or lack thereof?


----------



## imperialreign (Nov 6, 2007)

imperialreign said:


> edit>>  I'm not saying I have a problem with the optimizations - it's doing it behind the customer's back that irritates me.  At least give the consumer the option of turning the tweaks on and off.  Extreme point: would you mind 45% pixelation in Crysis (or any game for that matter) if you got an extra 65 FPS?  Would it bother you if you were given a choice, though?



that's kind of what I'm getting at.

I'm not trying to turn this into a fanboi flame-war, so I'm dropping it from here out


----------



## Davidelmo (Nov 6, 2007)

I can't believe that anyone is actually upset or feels "betrayed" by this...

It's not like you are going to notice a difference in quality. Plus, for most people PERFORMANCE is the issue when playing Crysis. Yes the game looks great, but there is NOBODY who can max the game out yet. Considering that most people will be running this game at ~30fps, they want eye candy as well as performance.

If the drivers alter the shadows a tiny bit but then let you move up texture quality from medium to high, then they're worth it because OVERALL you are getting better image quality and performance.

To whichever person said "why not just play all your games at 640x480" - I could counter, "why not play all your games at 1920, with 16xAA, 8xAF, maximum settings?" The answer is that you can't, because the performance issues would make it unplayable.


----------



## imperialreign (Nov 6, 2007)

one last thing from me (promise!), I'm going to re-word my edit:

I'm a mechanic, alright?  Say you go out and buy a Mustang GT, or a Corvette, or any high performance automobile, and you bring it to me at the shop just for an oil change and a general look-over.

Say, behind your back, I 'tweak' your car - remap the fuel timings in the computer, install a high flow air filter, high-flow fuel injectors, etc.

Your car runs much _faster_ now, but you're also getting 5-10 mpg less than before you brought it into the shop.  Would that bother you?


----------



## newconroer (Nov 6, 2007)

Yes, because you modified the product (the car/Mustang). Nvidia is not modifying our video cards.

Your analogy would be better served, if you sent a GPU in for repairs/RMA, and they returned it with modified clock levels.


All these drivers are betas... meaningless; they're just testing different things and releasing them to end-users to try, test, review, debug and give feedback on. Even if they focus on performance over "IQ" with their WHQL drivers, who cares? Graphics units are not all about visual quality, they're about rendering. And the performance required to render the visuals is already subject to degradation from higher resolutions, special features such as AA, HDR, AF etc., as well as the textures used by a 3D application. Visuals are meaningless if you don't have the ability to render them.

If anything, we should be annoyed that ATi and NVidia have not released their 'next-gen' cards sooner, having known full well that demanding applications like Crysis were coming.


----------



## JacKz5o (Nov 6, 2007)

Maybe that's why some people think ATI has better quality graphics than NVIDIA?


----------



## Solaris17 (Nov 6, 2007)

Personally I don't like this type of thing... but it really doesn't matter to me. In Crysis I know I'll look for it and be pissed, but in things like UT2k4 it doesn't matter, because even though I jack my settings all the way up and play things at MAX (ya know, I love frame rates), in 2k4 I never look at anything except the targeting reticle... nvidia could put a sign on the walls saying world destruction in xx:xx:xx:x and I wouldn't notice at all.


----------



## magibeg (Nov 6, 2007)

Well, I think what it all comes down to is that nvidia should TELL US they're doing this, as well as provide the option to turn it off. Sure, you could argue you get more FPS and that's all that matters, but someone else could also argue that they prefer a beautiful atmosphere when they play the game. 

For those saying only FPS matters: take into account that you could also dumb down the quality in the CCC if you wanted to. It just makes a direct comparison between ati cards and nvidia cards unfair if nvidia is cutting corners.


----------



## Steevo (Nov 6, 2007)

I don't know if I believe what I am hearing - people would NOT notice the weird graphics? Wow.

On a side note, isn't Nvidia's tagline, or wasn't it at one time, something about immersion-in-reality crap?


----------



## Steevo (Nov 6, 2007)

Sorry, "Reality Redefined"



Or.




Yes it looks like crap!!


----------



## jydie (Nov 6, 2007)

I would expect both ATI and Nvidia to put out drivers that produce the best image possible, and let me adjust the settings in the game to get more performance.  Nvidia's 8600 and 8800 line of video cards are very good... so, I think tweaking a driver in this manner is a foolish thing to do.  Maybe it was just an oversight that will be corrected in the next update.


----------



## deekortiz3 (Nov 6, 2007)

Ummm... it's still the demo. It is barely optimized for anything. Seriously, wait for the retail version to come out; it has to be a lot better than the demo.


----------



## brian.ca (Nov 6, 2007)

An oversight is unlikely when you consider the timing. In the end they might change it back (especially considering the bad press), but as I understand it the driver came out specifically for Crysis and TimeShift, coinciding with the release of their new cards.  So most reviews of the new card that used Crysis as a benchmark will be slightly inflated - that was most likely the point of all this, and subsequently what was probably meant to be taken away from the articles about it.

At any rate, if all you care about is fps then go nvidia; if you like image quality, go ati.  Just don't put too much stock into benchmarks using Crysis + that driver, b/c you're not getting the full picture.


----------



## thebeephaha (Nov 6, 2007)

> At any rate, if all you care about is fps then go nvidia; if you like image quality, go ati. Just don't put too much stock into benchmarks using Crysis + that driver, b/c you're not getting the full picture.



Maybe... I read a review with the 2900 XT vs the 8800GTX on HL2 EP2 and it showed the 8800 owned the ATI on rendering quality.

I'll look for the link but I think it was at hardocp or something.


----------



## Steevo (Nov 6, 2007)

The problem is that it was fully intentional. When you rename the Crysis executable, the rendering is suddenly fixed. 


What about other games? How many other ones have people been missing the _________ in? Feel free to insert any number of things that can be filtered out.


I found it funny that a while back a friend and I got together, and while playing CS:S with high-res packs installed his always looked different than mine, despite the same in-game settings. Perhaps it was a previous gen of cards, but I think it was a 7600 something.


But the average consumer, and some of the dumb gamers, only look at the top score to determine what they believe is best. Not average framerate, not quality, not quality settings, etc.....


So in general this is pure and total horseshit. Nvidia has been caught yet again with their hands in the cookie jar and will no doubt try to play it off as a minor glitch that was accidental.


----------



## Davidelmo (Nov 6, 2007)

Steevo said:


> The problem is that it was fully intentional. When you rename the Crysis executable, the rendering is suddenly fixed.
> 
> 
> 
> ...



Dude, it is a BETA driver. They're meant to be TESTING new things - the obvious two things to change are image quality and performance.

Maybe when they start releasing full new drivers and bragging about more fps, then you'd have something to criticise! But until then they haven't been "caught" at all. They're not even doing anything wrong. It's a BETA driver for a DEMO of a game.


----------



## EastCoasthandle (Nov 6, 2007)

b1lk1 said:


> Nvidia has always sacrificed image quality for performance.  That is the main reason I will always run ATI cards because I'd rather lose a few FPS for better IQ.  I would not call it cheating, but it's just another MAJOR reason to not use online benchmarking results to definitively define which card is best.



Excellent post!  

It appears they have already updated their drivers to a certified WHQL 163.75


----------



## Jimmy 2004 (Nov 6, 2007)

EastCoasthandle said:


> It appears they have already updated their drivers to a certified WHQL 163.75



I think whoever posted that at Guru3D is a little confused 

As you said, that is the new WHQL driver, which means that it is not a beta, as the poster describes on the first line. Secondly, it is not the recommended driver for Crysis as the poster writes on the second line, as the beta driver (169.04) is newer than the 163.75 driver (albeit without WHQL status of course) and has been optimised for Crysis. The 163.75 driver was written pre-Crysis, and I doubt it's any different from the 163.75 beta driver other than that it has been certified by Microsoft.

Edit: and the 8800 GT isn't listed as being supported by those drivers.


----------



## AphexDreamer (Nov 6, 2007)

Does this have any relation to the F.E.A.R. exe name change thing that would give ATI users a 10 FPS boost in F.E.A.R., another Nvidia-supported game? lol.

If you notice, on those pics there is exactly a 10 FPS difference between the "bug" and "the way it was meant to be played" lol


----------



## AndyBroke (Nov 6, 2007)

Both ATI and Nvidia make drivers that they "optimize" for different games to get better performance. Sometimes it affects image quality. The only difference is that ATI users can choose whether they want the optimizations or not; ATI calls it Catalyst A.I.
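The distinction drawn here, per-game optimizations that the user can switch off, can be sketched as follows. This is an illustrative model only; the class and setting names are invented and bear no relation to either vendor's actual driver code:

```python
# Hypothetical sketch: both drivers may ship per-game tweaks, but only one
# exposes a Catalyst A.I.-style switch letting the user opt out.

class DriverSettings:
    def __init__(self, allow_app_optimizations: bool = True):
        # The user-facing switch: may per-game tweaks be applied at all?
        self.allow_app_optimizations = allow_app_optimizations

    def effective_quality(self, game_has_tweaks: bool) -> str:
        # Tweaks only take effect when they exist AND the user allows them.
        if game_has_tweaks and self.allow_app_optimizations:
            return "optimized (faster, possibly lower IQ)"
        return "reference (full IQ)"

print(DriverSettings(True).effective_quality(game_has_tweaks=True))
print(DriverSettings(False).effective_quality(game_has_tweaks=True))
```

With the switch off, the second call falls back to reference quality even though tweaks exist for the game, which is exactly the choice the post says one driver offers and the other does not.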


----------



## AphexDreamer (Nov 6, 2007)

AndyBroke said:


> Both ATI and Nvidia make drivers that they "optimize" for different games to get better performance. Sometimes it affects image quality. The only difference is that ATI users can choose whether they want the optimizations or not; ATI calls it Catalyst A.I.



Yeah, but it's not embedded into the driver; I don't even have CCC on to use that optimization.


----------



## EastCoasthandle (Nov 6, 2007)

Jimmy 2004 said:


> I think whoever posted that at Guru3D is a little confused
> 
> As you said, that is the new WHQL driver, which means that it is not a beta, as the poster describes on the first line. Secondly, it is not the recommended driver for Crysis as the poster writes on the second line, as the beta driver (169.04) is newer than the 163.75 driver (albeit without WHQL status of course) and has been optimised for Crysis. The 163.75 driver was written pre-Crysis, and I doubt it's any different from the 163.75 beta driver other than that it has been certified by Microsoft.
> 
> Edit: and the 8800 GT isn't listed as being supported by those drivers.



I think you are a little confused; the release notes do indicate improved compatibility for all TWIMTBP games, which in essence is the same thing.  Please read the release notes.  Although they don't mention Crysis specifically, it is implied.


----------



## Jimmy 2004 (Nov 6, 2007)

EastCoasthandle said:


> I think you are a little confused; the release notes do indicate improved compatibility for all TWIMTBP games, which in essence is the same thing.  Please read the release notes.  Although they don't mention Crysis specifically, it is implied.



Doesn't change the fact that it still isn't the _recommended_ driver for Crysis. It may perform better than the previous WHQL driver, but NVIDIA still recommends the 169.04 beta. I'm afraid whoever wrote the post at Guru3D is making stuff up. NVIDIA's WHQL drivers always lag considerably behind the betas.

Anyway, this is off topic so I'm not going to talk about it anymore.


----------



## EastCoasthandle (Nov 6, 2007)

Jimmy 2004 said:


> Doesn't change the fact that it still isn't the _recommended_ driver for Crysis. It may perform better than the previous WHQL driver, but NVIDIA still recommends the 169.04 beta. I'm afraid whoever wrote the post at Guru3D is making stuff up. NVIDIA's WHQL drivers always lag considerably behind the betas.
> 
> Anyway, this is off topic so I'm not going to talk about it anymore.



Actually that's not correct; it is the recommended WHQL driver for Crysis, because Crysis is part of TWIMTBP.  Please read the release notes; they clearly state Crysis is part of their TWIMTBP program :shadedshu.

Ok, you can *thank* me


----------



## lemonadesoda (Nov 6, 2007)

BENCHMARK WARS

Time for the marketing guys at nVidia (who forced the driver developers to do the "benchmark" cheating) to start falling on their swords. Time to boycott (or at least bad-mouth) nVidia for their attempted cheating.

I really don't care WHO does it; the fact that they did it to boost FPS benchmarks, for some cheap benchmark-ranking glory, is very sleazy and deserves a slap.


----------



## bassmasta (Nov 6, 2007)

I think I like my water shadows as they are.


----------



## mandelore (Nov 6, 2007)

mrw1986 said:


> Oh no, subtle things I'll never notice while playing! Performance is king.



Never notice? lol, it's blatantly obvious!


----------



## PVTCaboose1337 (Nov 6, 2007)

WHO CARES!?!  If they can do better, let them...  ATI can do the same thing!


----------



## pt (Nov 7, 2007)

What a nice little fanboy war, so I will add my fanboyism sentence:
nvidia sux44rz for trying to fool us with driver optimizations that give worse image quality
ati ftw


----------



## cdawall (Nov 7, 2007)

This is why I don't immediately update my vid card drivers.

This is 163.44. Are the splotches caused by the driver issue?



cdawall said:


> well my settings are all low/1024X768 and this is how it looks in game


----------



## imperialreign (Nov 7, 2007)

> This is 163.44. Are the splotches caused by the driver issue?



Possible, but not necessarily.  It could just be the way a 'shadow' is being cast across the FOV when using the binoculars.

If I notice anything like that while fiddling with Crysis later, I'll screenshot it.


----------



## cdawall (Nov 7, 2007)

imperialreign said:


> Possible, but not necessarily.  It could just be the way a 'shadow' is being cast across the FOV when using the binoculars.
> 
> If I notice anything like that while fiddling with Crysis later, I'll screenshot it.




thanks... at least it's not artifacting


----------



## imperialreign (Nov 7, 2007)

naw, those screenies look good, though.

 in the second screenie you posted, the AI on the dock looks like he's taking a leak!


----------



## cdawall (Nov 7, 2007)

He is!!! I missed the shot where he is actually peeing. I think I'll go back and try again.


----------



## petepete (Nov 7, 2007)

I always thought that Nvidia tried to get the upper hand over ATI framerate-wise on games... (rather unfairly).

Making all these games Nvidia 'the way it's meant to be played' had me thinking a bit.


This is just like the good old Intel marketing cheating over AMD in the Athlon era..


----------



## tkpenalty (Nov 7, 2007)

heh... while AMD tries to max out visual quality, nvidia tries to grab framerates... This is just personal preference... guys stop arguing about it.


----------



## erocker (Nov 7, 2007)

Crysis is actually running pretty well with these new "pre-crysis" drivers they just came out with today.


----------



## ThorAxe (Nov 7, 2007)

I have an 8800GTX at the moment, but my previous cards were x1900xt, x1800xt, x800xt, 9800 Pro, Geforce4 Ti4400, Geforce3 Ti200, Geforce2 Ultra, Geforce Pro, Geforce, Voodoo2 12mb SLI. So you see I am not a fanboi. 

I think it's great that the frame-rate challenged can get the opportunity to run the game a little faster without missing much. I've used a number of drivers for Crysis and you would really have to know what to look for to notice anything.

Talk about a lot of fuss over nothing! They are BETA drivers. YOU DON'T HAVE TO USE THEM.


----------



## Scrizz (Nov 7, 2007)

man... I can't even run crysis... stupid GMA 950


----------



## newtekie1 (Nov 7, 2007)

This is a BETA driver.  A beta driver that nVidia is currently working on implementing optimizations for Crysis in.  Of course it won't be perfect; that is why it is still a beta.

NVidia is trying to find the best balance of IQ and performance while implementing optimizations in the driver.  This driver screws up the shadow/reflection rendering.

NVidia most likely was just playing around with the shadow rendering to see what it would do, not purposely trying to cheat.  I wouldn't be surprised if the problem was fixed by the time the real drivers make it out to the market.  Of course then we will get all the ATI fanboys flooding the forums claiming they only fixed it because they were caught...

Of course the problem goes away when you rename the exe; that is how nVidia drivers work, as anyone that has actually used an nVidia card in the past few years would know.  The optimizations that they implement are based on profiles set up by nVidia for each exe.  Change the name of the exe and kiss the optimizations good-bye.
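The per-exe profile behaviour described above can be sketched as a simple lookup. This is a hypothetical model with invented profile contents, not NVIDIA's actual mechanism; the only point it illustrates is that a profile keyed on the executable name stops matching the moment the file is renamed:

```python
# Hypothetical sketch of per-application driver profiles. Profile names and
# settings are invented for illustration; real drivers do this in native code.

DEFAULT_PROFILE = {"water_reflections": "full", "app_optimizations": False}

# Optimizations keyed by exe name: rename the exe and the lookup misses.
APP_PROFILES = {
    "crysis.exe": {"water_reflections": "reduced", "app_optimizations": True},
}

def profile_for(exe_name: str) -> dict:
    # Fall back to the generic profile when no per-app entry matches.
    return APP_PROFILES.get(exe_name.lower(), DEFAULT_PROFILE)

print(profile_for("crysis.exe"))     # per-app profile: tweaks applied
print(profile_for("driverbug.exe"))  # lookup misses: default rendering path
```

This is consistent with both observations in the thread: the renamed driverbug.exe rendered correctly (default path) and ran roughly 7% slower (no tweaks applied).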


----------



## Ripcord (Nov 7, 2007)

Those 2 screenshots haven't been taken from the same place, so it's not reasonable to compare them. Slightly different viewing angle.


----------



## Ravenas (Nov 7, 2007)

Lol, well this seems like it needs to be said:

Nvidia. The way it's meant to be slowed.


----------



## EastCoasthandle (Nov 7, 2007)

Check out the results of the 163.75 WHQL drivers in this post


----------



## zOaib (Nov 8, 2007)

That is exactly what the 8800 GT was doing; the image quality sucked compared to my HD 2900 XT, and I also had to run everything on medium at 1280 x 1024 res to get a decent framerate, compared to my HD 2900 XT which was running silky smooth at my max resolution of 1920 x 1050 with 2x AA and high settings.

go figure


----------



## imperialreign (Nov 8, 2007)

Y'know - maybe it's just me, but does it seem like ATI has been rather quiet during the release storm of the Crysis demo?  Not even a word on any beta drivers . . .


----------



## PVTCaboose1337 (Nov 8, 2007)

I'm gonna take my own screenies...  we will see!


----------



## cdawall (Nov 8, 2007)

lol and post them in my comparo thread

http://forums.techpowerup.com/showthread.php?t=43633


----------

