# Crysis finally playable (Benchmark GTX 580 SLI)



## platinumyahoo (Nov 24, 2010)

When I bought Crysis in '07 I strapped up with the highest-end card at the time, the 9800 GTX, to run it. I was very disappointed to find the framerates extremely laggy on Very High settings. I then paired it with another 9800 GTX in SLI and it did a little better, but not well enough to play through the game; it was still a bit laggy.

The year after, I upgraded to a GTX 260-216 and it was an improvement, but even that card in SLI could not run Crysis on Very High on my 22" monitor (1680x1050).

Just this week I got my two GTX 580s in the mail and decided to try running Crysis on those boys. To my surprise, Crysis did very well with one GTX 580 and extraordinarily well when I put them in SLI.

I benched the first level a few times using FRAPS and copied the FRAPSLOG data here. Enough with the nonsense; on to the system and results!

*GTX 580*

2010-11-24 03:14:43 - Crysis
Frames: 2216 - Time: 49974ms - Avg: 44.343 - Min: 33 - Max: 67

2010-11-24 03:21:44 - Crysis
Frames: 1861 - Time: 45733ms - Avg: 40.693 - Min: 29 - Max: 53



*GTX 580 SLI*

2010-11-24 04:51:46 - Crysis
Frames: 3233 - Time: 54570ms - Avg: 59.245 - Min: 43 - Max: 63

2010-11-24 04:52:42 - Crysis
Frames: 2676 - Time: 45288ms - Avg: 59.089 - Min: 46 - Max: 63

2010-11-24 04:53:33 - Crysis
Frames: 8743 - Time: 148323ms - Avg: 58.946 - Min: 41 - Max: 63

2010-11-24 04:57:29 - Crysis
Frames: 2324 - Time: 38808ms - Avg: 59.885 - Min: 57 - Max: 62

2010-11-24 04:58:11 - Crysis
Frames: 6027 - Time: 100928ms - Avg: 59.716 - Min: 42 - Max: 63  

As you can see, one card runs this game really well on Very High at 1680x1050, but when paired with another GTX 580 it pretty much destroys the game. The average frame rate for one card is about 42 FPS, and the average in SLI is very close to 60 FPS! One card should be sufficient to run this game at this resolution with barely any noticeable lag, but if you plan on running the game at a higher resolution (maybe 1920x1200), I would recommend SLIing these cards.
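The Avg figures in the FRAPS logs above are just the Frames count divided by the Time in seconds. A minimal Python sketch (assuming the exact log format shown) reproduces them:

```python
import re

def fraps_avg(line):
    """Compute average FPS from a FRAPSLOG entry: frames / seconds."""
    m = re.search(r"Frames:\s*(\d+)\s*-\s*Time:\s*(\d+)ms", line)
    frames, time_ms = int(m.group(1)), int(m.group(2))
    return frames / (time_ms / 1000.0)

log = "Frames: 2216 - Time: 49974ms - Avg: 44.343 - Min: 33 - Max: 67"
print(round(fraps_avg(log), 3))  # 44.343, matching the logged Avg
```

Running it over each entry above gives back the same averages FRAPS reports.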

Hope this benchmark helps!


----------



## HookeyStreet (Nov 24, 2010)

Nice one 

Crysis was the reason I gave up on PC gaming nearly 3 years ago


----------



## pantherx12 (Nov 24, 2010)

Do you have v-sync on somewhere? It seems weird to hit a max of 63 so many times.

Also, I can't believe you're using two 580s at that resolution. Buy a new monitor next, dude! Or 3!


----------



## mdsx1950 (Nov 24, 2010)

Nice!

Though it was playable for me almost a year ago.


----------



## Mussels (Nov 24, 2010)

pantherx12 said:


> Do you have v-sync on somewhere? It seems weird to hit a max of 63 so many times.
> 
> Also, I can't believe you're using two 580s at that resolution. Buy a new monitor next, dude! Or 3!



He could be CPU-limited for the max FPS.


----------



## arroyo (Nov 24, 2010)

I remember that in 1998, Unreal was unplayable on my high-end desktop (Celeron 433/Riva TNT). Then 3 years later it started to work perfectly (Pentium III 800/GeForce 256). Crysis is the same story: CryEngine was released too early for the hardware it needed.


----------



## HookeyStreet (Nov 24, 2010)

Mussels said:


> He could be CPU-limited for the max FPS.



Would that i7 really be a bottleneck?


----------



## platinumyahoo (Nov 24, 2010)

HookeyStreet said:


> Would that i7 really be a bottleneck?



Haha, no, I don't think my 4GHz Core i7 is bottlenecking anything. I do have v-sync on, but I don't know why the max goes over 60 sometimes. It doesn't matter, though; this game looks absolutely stunning on Very High running that smoothly, despite its age.


----------



## platinumyahoo (Nov 24, 2010)

pantherx12 said:


> Do you have v-sync on somewhere? It seems weird to hit a max of 63 so many times.
> 
> Also, I can't believe you're using two 580s at that resolution. Buy a new monitor next, dude! Or 3!



I'm trying to get a 26" or 27" soon, but I need to find a good deal!


----------



## Mussels (Nov 24, 2010)

HookeyStreet said:


> Would that i7 really be a bottleneck?





platinumyahoo said:


> Haha, no, I don't think my 4GHz Core i7 is bottlenecking anything. I do have v-sync on, but I don't know why the max goes over 60 sometimes. It doesn't matter, though; this game looks absolutely stunning on Very High running that smoothly, despite its age.



If a 4.3GHz Wolfdale could bottleneck 4870 CrossFire (which I personally verified it did), I can EASILY say that two 580s would be held back by a 'mere' 4GHz i7.


Don't forget that these game engines are single-threaded for the graphics: the game only needs to max out that one thread on one core to be bottlenecked. On an i7 with HT, that's a mere 12.5% in Task Manager... so people assume they have plenty of power to spare, when they don't as far as games are concerned.

That cap might be at 60 FPS or at 200 FPS, but in the end there is ALWAYS a bottleneck, be it the CPU, GPU, game engine, or the monitor's refresh rate.
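The 12.5% figure is simply one maxed logical thread out of eight on a Hyper-Threaded quad-core i7: Task Manager's overall CPU reading is 100% divided by the number of logical CPUs. A trivial sketch of that arithmetic:

```python
def single_thread_ceiling(logical_cpus):
    """Overall CPU % reported when exactly one thread is pinned at 100%."""
    return 100.0 / logical_cpus

print(single_thread_ceiling(8))  # quad-core i7 with HT: 12.5
print(single_thread_ceiling(4))  # quad-core without HT: 25.0
```

So a game can be fully CPU-bound while the overall utilization graph looks almost idle.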


----------



## pantherx12 (Nov 24, 2010)

platinumyahoo said:


> I'm trying to get a 26" or 27" soon, but I need to find a good deal!



I'd try to help you out on that, but I'm more the guy to talk to if you want bargains in the UK : [

Would you care to do a run or two with v-sync off so we can see max frames please? : ]


----------



## platinumyahoo (Nov 24, 2010)

pantherx12 said:


> Would you care to do a run or two with v-sync off so we can see max frames please? : ]



2010-11-24 07:24:20 - Crysis
Frames: 2412 - Time: 44192ms - Avg: 54.580 - Min: 31 - Max: 79


----------



## platinumyahoo (Nov 24, 2010)

Mussels said:


> If a 4.3GHz Wolfdale could bottleneck 4870 CrossFire (which I personally verified it did), I can EASILY say that two 580s would be held back by a 'mere' 4GHz i7.
> 
> 
> Don't forget that these game engines are single-threaded for the graphics: the game only needs to max out that one thread on one core to be bottlenecked. On an i7 with HT, that's a mere 12.5% in Task Manager... so people assume they have plenty of power to spare, when they don't as far as games are concerned.
> ...



So are you saying the game would run similarly if I swapped out my i7 and put in my 4GHz E8400?


----------



## crazyeyesreaper (Nov 24, 2010)

Yes, it would run the same.

Example: Metro 2033.

A Core 2 Duo at 2GHz vs. an i7 at 3.8GHz, with the same GPU and RAM: the FPS is the same. It doesn't budge, not even by a single frame.

Crysis at best utilizes 2 CPU cores, and that includes AI, graphics, physics, etc. It still only BARELY utilizes 2 cores.

http://www.guru3d.com/article/cpu-scaling-in-games-with-quad-core-processors/9

Crysis is just as fast on a Core 2 Duo, Core 2 Quad, or even a Phenom I; the CPU made zero difference past the 1680x1050 range.

What ends up happening is that while there is CPU power to spare, the game engine maxes out the number of threads it can use, and at that point it can't push any further. Adding more GPU grunt will get you more FPS, but eventually you hit a wall where only better clock-for-clock performance helps, and even then the gains are minuscule.


----------



## Mussels (Nov 24, 2010)

platinumyahoo said:


> So are you saying the game would run similarly if I swapped out my i7 and put in my 4GHz E8400?



No, because the i7 is faster clock for clock.

If you could magically disable 2 of your i7's cores and get it to 5GHz, then yeah, you'd get higher FPS (when something else isn't the limit).


To put what I'm saying into context:

Something has to stop your FPS from going over 9000. Do you REALLY think two 580s are going to be that limit before your CPU is?


----------



## platinumyahoo (Nov 24, 2010)

Mussels said:


> something has to stop your FPS from going over 9000. do you REALLY think two 480's is going to be that limit, before your CPU is?



Correction: 580s!

But yeah, I get the gist of what you're saying; there always has to be a component that is the bottleneck stopping the FPS from going to infinity.


----------



## Mussels (Nov 24, 2010)

platinumyahoo said:


> Correction: 580s!
> 
> But yeah, I get the gist of what you're saying; there always has to be a component that is the bottleneck stopping the FPS from going to infinity.



And in this case, I bet it's the CPU, since DX9 and DX10 only allow single-threading for the graphics (DX11 fixes that).


----------



## brandonwh64 (Nov 24, 2010)

HookeyStreet said:


> Nice one
> 
> Crysis was the reason I gave up on PC gaming nearly 3 years ago



I'm sorry, but that's a dumb reason to give up PC gaming.


----------



## MatTheCat (Nov 24, 2010)

brandonwh64 said:


> I'm sorry, but that's a dumb reason to give up PC gaming.



Not really.

Why bother with something that is meant to be fun and part of chilling out, but in reality poses constant frustration from performance issues, pressure to upgrade, and then disappointment when you do upgrade, only to find that it was the game's code that was crappy and causing the performance issues all along?

Considering all the games we get are mere console ports, my gaming rig is like overkill x5 for any current popular title, or at least it should be. But all my 360 gaming mates are playing Black Ops online with no problems whatsoever, whilst I can't play it due to constant lag spikes, freezes, judder, etc. There is nothing technically special about this game at all, yet it runs like shit on my rig.

PC gamers pay out a hell of a lot for their hardware and are left laden with compatibility/performance issues pissing out of their ears, unable to play games they have already paid for until another 6 months' worth of patches finally sorts the issue(s) out (or perhaps not).

Being a PC gamer is a stress-ridden pain in the arse.


----------



## Bjorn_Of_Iceland (Nov 24, 2010)

brandonwh64 said:


> I'm sorry, but that's a dumb reason to give up PC gaming.


I'm sure he was just being humorous. You know how Crysis on Very High affected us all.


----------



## pantherx12 (Nov 24, 2010)

MatTheCat said:


> Not really.
> 
> Why bother with something that is meant to be fun and part of chilling out, but in reality poses constant frustration from performance issues, pressure to upgrade, and then disappointment when you do upgrade, only to find that it was the game's code that was crappy and causing the performance issues all along?
> 
> ...



Yes really, it's one game.

One game should never be a reason to stop playing games 

Imagine if you applied the same to books " ARRGH THIS BOOKS DISAPPOINTING AND THE PLOT IS SLOW, NEVER AGAIN WILL I RISK SUCH PROFOUND DISAPPOINTMENT! NOOOOOOOOOOOOOOOOOOOOOOOOOOOO"


----------



## streetfighter 2 (Nov 24, 2010)

I had to read the part about going from a GTX 260-216 to two GTX 580s several times. That's like going from a Lotus Elise to a Bugatti Veyron. They're both lots of fun, but the top speeds are 100mph apart.

That being said, my HD 5830 plays Crysis just swell.



MatTheCat said:


> Being a PC gamer is a stress-ridden pain in the arse.


Sometimes I think that when I imagine my friends fighting it out with their gaming rigs.  But then I look at the massive library of games I have (that still work) and all the free games I can get (flash, etc.) and I think, "well that's not so bad".  Of course, that's highly subjective.


----------



## mrw1986 (Nov 24, 2010)

I had Crysis running maxed out a couple of years ago on my GTX 280... didn't have any framerate problems. I even have that *.ini or whatever that enhances the graphics and whatnot.


----------



## MatTheCat (Nov 24, 2010)

pantherx12 said:


> Yes really, it's one game.
> 
> One game should never be a reason to stop playing games
> 
> Imagine if you applied the same to books " ARRGH THIS BOOKS DISAPPOINTING AND THE PLOT IS SLOW, NEVER AGAIN WILL I RISK SUCH PROFOUND DISAPPOINTMENT! NOOOOOOOOOOOOOOOOOOOOOOOOOOOO"



Yes.

But Crysis encapsulated all the frustration of wasted time spent tweaking settings, wasted time (and money) on bouts of misguided upgrade fever, and of course the lingering dissatisfaction with the less-than-satisfactory performance your rig presented you with. Yet all along it was the game that was crappily coded, as far superior, better-looking, and better-running titles such as BFBC2 have proven since. The Crysis story epitomised the endless unwinnable battle of the PC gaming enthusiast and all the tears, labour, and $$$ that go hand in hand with PC gaming.

With a shit book, you just put it down and don't read it. You don't waste a tremendous number of hours in vain trying to make it a better book, or spend hundreds of pounds paying the author to rewrite certain chapters to make the plot that little bit more engaging.



mrw1986 said:


> I had Crysis running maxed out a couple of years ago on my GTX 280... didn't have any framerate problems. I even have that *.ini or whatever that enhances the graphics and whatnot.



Comments like this are a pox on honest, naive PC gamers everywhere.

Just think of the poor kid who saves up his pocket money to buy that GTX 280 just so he can max out Crysis on Very High, only to find out that you were talking pish, or that 'no frame rate problems' just means you were perfectly happy with 25-35 FPS for the whole game (with no AA).


----------



## Frick (Nov 24, 2010)

It's still no reason to give up on PC gaming. Upgrading is a choice; you can also choose to turn down the settings a bit and simply skip the high-end hardware.


----------



## platinumyahoo (Nov 24, 2010)

MatTheCat said:


> Just think of the poor kid who saves up his pocket money to buy that GTX 280 just so he can max out Crysis on Very High, only to find out that you were talking pish, or that 'no frame rate problems' just means you were perfectly happy with 25-35 FPS for the whole game (with no AA).



Very good point. Just because you were happy with ~30 FPS in Crysis doesn't mean I am; I like a game to average 45+ FPS with a minimum of 35-40 to be completely happy. I am completely happy with the 580 SLI setup for Crysis, despite the fact that I'm selling one card. Even one 580 at this resolution is enough to get me to finally play through the whole game.

One last thing: where can I get this .ini that improves performance?


----------



## erixx (Nov 24, 2010)

This thread was long ago stuffed with crazy statements. First, showing off 2x 580s with such a tiny monitor... lol. Then the Crysis drama queens (I finished it long ago and forgot about it), wow. And finally the "PC = pain" etc. Sorry, but
ha ha haaaaaaaa haaaaaaaaa!


----------



## brandonwh64 (Nov 24, 2010)

Like panther and Frick said, it's only one game, and the game itself wasn't that fun anyway. There are plenty of PC games that play very well on older systems and are a blast to play.


----------



## pantherx12 (Nov 24, 2010)

platinumyahoo said:


> Very good point. Just because you were happy with ~30 FPS in Crysis doesn't mean I am; I like a game to average 45+ FPS with a minimum of 35-40 to be completely happy. I am completely happy with the 580 SLI setup for Crysis, despite the fact that I'm selling one card. Even one 580 at this resolution is enough to get me to finally play through the whole game.
> 
> One last thing: where can I get this .ini that improves performance?




I'd recommend learning to tweak it yourself if you don't already know how.
http://www.incrysis.com/forums/viewtopic.php?id=11614

I find a lot of settings can be raised, since there are so many settings you can lower without affecting things too much.


----------



## platinumyahoo (Nov 24, 2010)

erixx said:


> This thread was long ago stuffed with crazy statements. First, showing off 2x 580s with such a tiny monitor... lol. Then the Crysis drama queens (I finished it long ago and forgot about it), wow. And finally the "PC = pain" etc. Sorry, but
> ha ha haaaaaaaa haaaaaaaaa!



At least give me credit for providing some kind of reference for this crazy hardware-intensive game, despite my "tiny" monitor.

Playing Crysis on a bigger monitor on Very High with AA is probably only possible with a 580, or something close, in SLI.


----------



## Kreij (Nov 24, 2010)

I had no stress from Crysis. 
I grabbed the Demo and I set the resolution at max (2560x1600) as well as all the settings.
It ran like a slideshow, so I moved on. 

Life's too short to fret over one game.

It is kind of interesting that this game is still being used to judge hardware.

Oh ... and good job Plat. I found the thread interesting.


----------



## erixx (Nov 24, 2010)

Credit where it is due. Yeah, congrats for testing and sharing, really!


----------



## EarthDog (Nov 24, 2010)

LOL, more $ than sense here. Two GTX 580s for 1680x1050. Yeah, that's a CPU-limited 63 FPS.


----------



## Kreij (Nov 24, 2010)

@Plat : I'll be happy to test the boards out at 2560x1600 for you. Just mail them across the little pond and I'll get started right away.


----------



## newtekie1 (Nov 24, 2010)

MatTheCat said:


> Being a PC gamer is a stress-ridden pain in the arse.



Or you could just lower the settings so that it is playable.

Set it to 1280x720 resolution, with medium textures, no texture filtering, and no AF. See how it plays then, because that is what your Xbox buddies are getting. :shadedshu

Everyone complains about how poor PC gaming is because you are forced into upgrades. No one is forcing you to do shit. If you are happy with the shitty graphics quality you get with consoles, then you don't have to upgrade all that often with PCs either. You don't have to play games at max graphics settings for them to be fun, and if max graphics settings are what makes the game fun, either the game isn't worth playing or you really aren't a gamer.



brandonwh64 said:


> Like panther and Frick said, it's only one game, and the game itself wasn't that fun anyway. There are plenty of PC games that play very well on older systems and are a blast to play.



I actually enjoyed it greatly, but to each their own. I also played through and beat it with an X800 XL...


----------



## platinumyahoo (Nov 24, 2010)

Kreij said:


> @Plat : I'll be happy to test the boards out at 2560x1600 for you. Just mail them across the little pond and I'll get started right away.



Or just mail me your monitor


----------



## DrPepper (Nov 24, 2010)

HookeyStreet said:


> Would that i7 really be a bottleneck?



Yeah, I seem to have a bottleneck with my i7 at 3.6GHz and my single 470.


----------



## pantherx12 (Nov 24, 2010)

newtekie1 said:


> I actually enjoyed it greatly, but to each their own. I also played through and beat it with an X800 XL...



My first playthrough was at 1440x900 with an HD 4350.

15-19 FPS. Fortunately the dropped frames coincided with movement like running, so it just seemed a bit shakier than usual... well, aside from fights or breaking things, which slowed things down even more.

5-9 FPS on the last level with settings on Low.

(I really enjoyed it though!)


----------



## TheMailMan78 (Nov 24, 2010)

The fact that it takes two 580s in SLI to average 60 fps on a 1680x1050 monitor screams so much fail I can't even put it into words. Crysis sucks.


----------



## Kreij (Nov 24, 2010)

Crysis may go down in history as the epitome of inefficient coding for high-end, graphical eye-candy, but the fact that people still try to get it to work to their satisfaction says something too.


----------



## MatTheCat (Nov 24, 2010)

Kreij said:


> Crysis may go down in history as the epitome of inefficient coding for high-end, graphical eye-candy, *but the fact that people still try to get it to work to their satisfaction says something too.*



Exactly.

With the advent of good online console gaming (360, PS3), if you are still an enthusiastic PC gamer, chances are you are also an obsessive performance freak; otherwise it would make much more sense to game on a console.

Although there are more stunning games around than Crysis these days (that also run a lot better), at the time, Crysis represented the pinnacle of what gaming technology could bring to the consumer, and everyone wanted it running maxed on their system regardless of whether they thought the game was any good or not (I for one thought the actual gameplay was fantastic). Not everyone knew at the time that the game was a poorly coded hardware raper, with many thinking that a maxed-out, stunning Crysis was just a GPU or CPU upgrade away, only to be badly disappointed each time.

Crysis has been the ultimate 'cock-tease' in the world of PC gaming, having led millions of gamers down the exorbitantly expensive garden path of PC upgrades, only to laugh in their faces and slam the door shut every time.

platinumyahoo's post is valuable (to fellow performance freaks) in that it shows just what a dirty whore Crysis is: two GTX 580s in SLI, and still imperfect performance, although perhaps finally 'playable' at the highest settings.


----------



## pantherx12 (Nov 24, 2010)

You could just use two 6870s instead: http://techpowerup.com/reviews/ATI/Radeon_HD_6870_CrossFire/8.html

Cheaper.


----------



## crazyeyesreaper (Nov 24, 2010)

Sigh. And I find those who bash game engine code but can't see what's really going on to be even worse.
Crytek's game engine is actually state of the art, and in terms of the image quality it can produce, it is currently better than just about every game engine in use.

Where they failed is proper threading support: Crysis will use only 2 cores, and those are not used effectively. As Mussels stated in a couple of threads, DX9 and DX10 only use 1 thread for graphics, i.e. feeding the GPU info from the CPU; a DX11 overhaul, where that limitation doesn't exist, would in general give a huge performance benefit. One way to look at it: in BC2, going from dual to quad core, the minimum frame rate doubles. I won't say Crysis is a great game, as that's opinion-based, but I can give a few facts.

The way Crysis renders graphics is a bit different.

Everything is done dynamically, whereas in most games, and in Crysis 2, it will be static.

That means static lighting and, in some cases, static shadows and reflections.

Think of it in terms of the problem with ray tracing in games today: the hardware requirements are too high for it to go mainstream. Saying Crysis is horribly coded is in essence correct, but Crytek doesn't create DirectX or determine how many threads it allows for certain things. They were also a bit too ambitious in the size and scope of adding an entirely dynamic lighting system, among other aspects. Call it what you want, but to be blunt, most here have no idea how much information is being crunched to render the eye candy.

There's diffuse (color maps), specular (highlights), normal maps, glow maps, alpha maps, complex shaders, subsurface scattering effects. Now roughly add all those up to get a single character rendered; now add in multiple enemies, plants, objects, props, water; now add in the fact that all of that is being lit dynamically, much like standing in the sun or watching light shimmer on water, never static. The difference is minimal in games, to be sure, but we have now hit the wall where every 5% gain in image quality requires an exponential increase in rendering power.


----------



## TheMailMan78 (Nov 24, 2010)

crazyeyesreaper said:


> Sigh. And I find those who bash game engine code but can't see what's really going on to be even worse.
> Crytek's game engine is actually state of the art, and in terms of the image quality it can produce, it is currently better than just about every game engine in use.
> 
> Where they failed is proper threading support: Crysis will use only 2 cores, and those are not used effectively. As Mussels stated in a couple of threads, DX9 and DX10 only use 1 thread for graphics, i.e. feeding the GPU info from the CPU; a DX11 overhaul, where that limitation doesn't exist, would in general give a huge performance benefit. One way to look at it: in BC2, going from dual to quad core, the minimum frame rate doubles. I won't say Crysis is a great game, as that's opinion-based, but I can give a few facts.
> ...



Well, whatever they did, it wasn't that great. Two games were made with the engine, so people who DID know what was going on still thought it sucked.


----------



## crazyeyesreaper (Nov 24, 2010)

Not exactly, Mailman. Companies are in it to make a profit; no matter how forward-thinking a game engine might be, if it can't run on the lowest common denominator of systems, it won't get used.

Why do you think Bethesda still uses the Gamebryo engine after 7+ years?

Why is Unreal Engine 2.0 still used today?

Or the old but stalwart id Tech 4 engine, which saw use in Doom 3?

The industry uses what works best at the time on the greatest number of systems. It's also why companies don't change software often.

For example, we are using 3ds Max 2011 today, yet Bethesda still uses Max 4 and 5, released in 2002, for their games.

Another example: sure, a Ferrari is fast, but if you're stuck behind grandpa on a single-lane road and he's driving a Beetle, you're stuck at his pace. That pace is much like PC hardware adoption: most still use integrated or old DX9 GPUs, those with DX10 GPUs are now underpowered by our standards, and few are using decent DX11 GPUs. So we have the bleeding-edge hardware, sure, but any company that stresses that hardware is instantly called out for bad programming, etc. Yet I really don't know anyone on TPU who can program better or has the knowledge to do better, myself included.

Another way to look at it: Toy Story in 1995 took 2 hours per frame at 24 frames per second, so every 1 second of Toy Story took 48 hours to render. Cars, released in 2005, took 15 hours per frame x 24 FPS = 360 hours for 1 second of animation. For reference, the difference in hardware performance between 1995 and 2005 is about 300x.
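The render-time arithmetic quoted above (figures as the poster gives them, not independently verified) works out as follows:

```python
# Hours of render farm time per second of final footage:
# per-frame render hours multiplied by 24 frames per second.
toy_story_1995 = 2 * 24    # 48 hours per second of footage
cars_2005 = 15 * 24        # 360 hours per second of footage
print(toy_story_1995, cars_2005)
```

So even with roughly 300x faster hardware over that decade, per-second render cost only dropped from 48 to 360/300x-equivalent levels because the scene complexity grew to absorb the gains.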

What that means is that Crytek did something fairly amazing with what they built into the engine, but at the time of its release it wasn't ready for prime time; it's an engine that should have hit the market in 2010 to make sense.


----------



## TheMailMan78 (Nov 24, 2010)

crazyeyesreaper said:


> Not exactly, Mailman. Companies are in it to make a profit; no matter how forward-thinking a game engine might be, if it can't run on the lowest common denominator of systems, it won't get used.
> 
> Why do you think Bethesda still uses the Gamebryo engine after 7+ years?
> 
> ...



So then what was the purpose of creating Crysis?


----------



## LAN_deRf_HA (Nov 24, 2010)

Every Crysis thread turns into some whiny bullshit-a-thon.

http://forums.techpowerup.com/showpost.php?p=1985506&postcount=35


----------



## newtekie1 (Nov 24, 2010)

pantherx12 said:


> My first playthrough was at 1440x900 with an HD 4350.
> 
> 15-19 FPS. Fortunately the dropped frames coincided with movement like running, so it just seemed a bit shakier than usual... well, aside from fights or breaking things, which slowed things down even more.
> 
> ...



1024x768 @ Low settings on the X800 XL; of course, it was on a monitor that only did 1280x1024 anyway. It actually looked better than a lot of the games the X800 XL played, even with those other games maxed out settings-wise.

When you started lowering settings, the performance-to-IQ ratio was in line.



TheMailMan78 said:


> The fact that it takes two 580s in SLI to average 60 fps on a 1680x1050 monitor screams so much fail I can't even put it into words. Crysis sucks.



Again, at ever-so-slightly lower settings, it takes a hell of a lot less to run at 60 FPS and the game still looks stunning. Hardly what I would consider "sucks".



Kreij said:


> Crysis may go down in history as the epitome of inefficient coding for high-end, graphical eye-candy, but the fact that people still try to get it to work to their satisfaction says something too.



Not really inefficient coding, but rather just allowing the engine to use graphics horsepower beyond what is reasonably necessary. If they had just capped the engine settings at High and never allowed it to be maxed out, people would still praise it for how good it looked and wouldn't bitch about it running like shit when "maxed out".


----------



## crazyeyesreaper (Nov 24, 2010)

Well, LAN, most people don't care why something does X but not Y, or why Y causes performance loss while X looks just as good. Most can't see the difference, nor do they care to. As I've said, Crysis may or may not be a good game; that's for each person to decide on their own. As for what the engine is capable of, it's rather astounding.

Doesn't matter much, I suppose. Most people don't care to understand why their hardware can't push a game; instead they'd rather scream that the game sucks because their leet hardware can't handle max settings. I agree with newtekie1: drop it down one notch and it runs fine.

I see it this way: if you want great graphics and companies to push the envelope and use your hardware, then shut the hell up when they do and be thankful they're not spoon-feeding you FarmVille.


----------



## Benetanegia (Nov 24, 2010)

newtekie1 said:


> Or you could just lower the settings so that it is playable.
> 
> Set it to 1280x720 resolution, with medium textures, no texture filtering, and no AF. See how it plays then, because that is what your Xbox buddies are getting. :shadedshu
> 
> Everyone complains about how poor PC gaming is because you are forced into upgrades. No one is forcing you to do shit.[...]





crazyeyesreaper said:


> Sigh. And I find those who bash game engine code but can't see what's really going on to be even worse. [...]





crazyeyesreaper said:


> Not exactly, Mailman. Companies are in it to make a profit; no matter how forward-thinking a game engine might be, if it can't run on the lowest common denominator of systems, it won't get used.[...]



Very well put, both. It's been a long time since I read something with some sense on topics like this. Ah, common sense... please, could I take some and put it in a little bottle, so that I have one last reserve for when I finally run out of it due to all the nonsense on the internet?


----------



## crazyeyesreaper (Nov 24, 2010)

I only have this common sense, Bene, because just like you I dabble in 3D, and after pissing away $85 grand USD to attend Full Sail to do content creation for games, I did manage to absorb some info, lol.

Also, this is my only argument/debate on TPU for today. I am officially out of common sense to spoon-feed people with, and my patience is razor thin to the point that I need a beer and a big ol' cigar.


----------



## TheMailMan78 (Nov 24, 2010)

See, the problem with this is the fact that the game is three years old. THREE YEARS, and something as bleeding edge as two 580s in SLI has trouble pushing it at 1680x1050? Come on, guys. I would have bought this argument a year ago, maybe, but now it's out of hand.

Let me ask you guys this... WTF did they have running to develop this game and run it at ultra-high settings? Or are they still waiting on a GPU that can push their "great" coding?


----------



## Benetanegia (Nov 24, 2010)

TheMailMan78 said:


> See, the problem with this is the fact that the game is three years old. THREE YEARS, and something as bleeding edge as two 580s in SLI has trouble pushing it at 1680x1050? Come on, guys. I would have bought this argument a year ago, maybe, but now it's out of hand.
> 
> Let me ask you guys this... WTF did they have running to develop this game and run it at ultra-high settings? Or are they still waiting on a GPU that can push their "great" coding?



3 years have passed and what? What has really changed? Especifically what has changed for DX9? Nothing. Absolutely nothing has changed/improved in reality for anything that pushes DX9 to it's limits and probably never will. A Core i7 has a top 10-20% of single threaded performance increase per clock over a Core2 and a Core2 did the same for A64. A HD5870 has exactly the same triangle setup capabilities per clock (almost the entire front end actually) than a HD2900* and on Nvidia's side it is probably worse than it was when it comes to DX9/10 because of the way they have distributed it on Fermi.

*Why? Because there's only one setup unit, so throughput depends only on clocks. If you look back, every new generation used to bring higher clocks until (coincidence?) the 8800/HD 3000; since then clocks have barely increased, if at all.

crazyeyes has already said it: DX9/10 does not allow real multi-threaded rendering. You can put AI and physics on another core, and Crysis does, but you can never split the rendering, and there's far more to it than people think. "Rendering is the most parallel task a computer can do" is probably the biggest lie ever taken for granted in the computer world. Sure, it's highly parallel in that every pixel can be calculated separately, but calculating each pixel is mostly single-threaded, with lots of interdependencies, as Carmack said a long time ago. As he put it, creating good-looking shaders and effects requires fast single-threaded execution, and he'd rather have a 5 GHz CPU/GPU than multiple slower ones. That goes for multi-core CPUs as well as Fermi/Cayman-style architectures, where the front end has finally been parallelized. Of course, he was talking about DX9/DX10; parallelizing the front end does make sense now with DX11, but for DX9 it's worthless.
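The single-thread ceiling described above can be sketched with Amdahl's law: if a fraction of each frame's CPU work (the one DX9 submission thread) cannot be parallelized, extra cores buy very little. A minimal sketch; the 40% serial fraction is a made-up illustrative number, not measured Crysis data:

```python
# Amdahl's law: overall speedup from n workers when a fraction
# `serial` of the work must stay on one thread.
def amdahl_speedup(serial, n_workers):
    return 1.0 / (serial + (1.0 - serial) / n_workers)

# Hypothetical: 40% of CPU frame time stuck on the render thread.
for n in (1, 2, 4, 8):
    print(f"{n} cores -> {amdahl_speedup(0.4, n):.2f}x")
```

Even with eight cores the hypothetical speedup stays near 2x, which is why a faster single core mattered more than more cores for a DX9 renderer.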

Crysis has very complicated shaders and lighting algorithms, not to mention they are much more fine-grained and look much, much better than any other engine out there if you actually pay attention to them. They are real-time and, in a way, physically correct, and in that sense genuine. Every other game fakes it in one way or another, and I won't say they don't do a good job, because they do get great results; maybe it's even the way to go, given that they reach almost the same graphical fidelity at a lower cost, and Crytek might go that way too. But the thing is, they went with the "real thing" for everything (lighting, shadows, shading effects), and because of that, CryEngine 2 is extraordinarily well coded for what it does. They could have used other methods, or tweaked the ones in use down a notch, but that's no different from comparing a 1920x1200 16xAF 16xAA picture to a 1280x720 0xAF 0xAA one (consoles) and saying they look the same. The thing is, both arguments have people who would back them up. You can look at BFBC2 and Crysis and say: "OK, I think using a lower polycount and less accurate shadowing, lighting and shading is the way to go, the way every developer should go, because BC2 looks the same/better to me and runs better." And you would probably be right. But you can't say CryEngine 2 was poorly coded, given what's going on under the hood.

The TPU_GPU picture that I use as an avatar, and that won me the prize, took 15-20 minutes to render. Maybe I could have made it in a way that would have taken less than 1 minute (fewer polys, no ray tracing, no occlusion) and looked 95% as good, but I chose to use every quality-enhancing option to make it look better.


----------



## TheMailMan78 (Nov 24, 2010)

Benetanegia said:


> Three years have passed, and what? What has really changed? Specifically, what has changed for DX9? Nothing. Absolutely nothing has changed/improved in reality for anything that pushes DX9 to its limits, and probably nothing ever will. A Core i7 has at most a 10-20% single-threaded performance increase per clock over a Core 2, and a Core 2 did the same over an A64. An HD 5870 has exactly the same triangle setup capability per clock (almost the entire front end, actually) as an HD 2900*, and on Nvidia's side it's probably worse than it was when it comes to DX9/10, because of how they distributed it on Fermi.
> 
> *Why? Because there's only one setup unit, so throughput depends only on clocks. If you look back, every new generation used to bring higher clocks until (coincidence?) the 8800/HD 3000; since then clocks have barely increased, if at all.
> 
> ...



So basically what you are saying is we will never have hardware worthy of crytek's 133t programing skillz?


----------



## DonInKansas (Nov 24, 2010)

I'm wondering why people are still pissing and moaning over a 3-year-old game.


----------



## TheMailMan78 (Nov 24, 2010)

DonInKansas said:


> I'm wondering why people are still pissing and moaning over a 3-year-old game.



Shut up I'm trying to learn me sumtin.


----------



## Benetanegia (Nov 24, 2010)

TheMailMan78 said:


> So basically what you are saying is we will never have hardware worthy of crytek's 133t programing skillz?



I'm mostly saying that hardware took a different direction than Crytek thought, and there's no going back.

Blame can be put mostly on 3 things:

- Crysis had to be delayed for almost a year for reasons unrelated to the tech. This actually worked against them because of what I'll explain in the next two points; basically, the most powerful card remained the same 8800, but HD resolutions stormed the front. The tech had already been shown at E3 2005, iirc, with almost all the eye candy it ended up having, running on a single 7900 GTX @ 1280x1024. That choice might seem odd, but move on to the next point to understand/remember why.

- They were creating Crysis for 1280x1024 or 1680x1050. Back in the day those were upper mainstream, almost high-end; even most people on this forum had 1440x900 LCDs in 2006. Suddenly, in a process that lasted only a few months, 1920x1200 was "normal" and 1680x1050 was the mandatory minimum. (Compare that to the 5+ years it had taken the masses to move on from 1024x768.)

- GPU development slowed down a lot during the final years of Crysis's development. NV's 7800 and ATI's X1800 were not exactly what the companies wanted them to be, and it took almost another year to see the 7900 and X1900; that generation lasted two years. Next came the G80 and R600, followed a year later by a "next generation" that was actually a bit slower: another generation lasting almost two years. Previously, the cycle for doubling performance was half that. Even the biggest pessimists would have predicted a 3-4x jump in performance across those four "generations", but the actual increase from a 7800 to an 8800 Ultra (the fastest GPU of that generation until the GTX 2xx launched) was little more than 2x.

- Oh, and CPU performance hit the wall. The only real improvement came from adding extra cores.
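The arithmetic behind the GPU point above is easy to check: four generations at the old roughly-doubling cadence would compound to 16x, while the actual ~2x from 7800 to 8800 Ultra implies a far weaker per-generation gain. A throwaway sketch; the generation count and overall factors come from the post, nothing here is measured:

```python
# Per-generation speedup implied by an overall gain across g generations.
def per_gen_factor(overall_gain, generations):
    return overall_gain ** (1.0 / generations)

print(round(per_gen_factor(16.0, 4), 2))  # old cadence: ~2x every generation
print(round(per_gen_factor(2.0, 4), 2))   # what actually shipped: ~1.19x
```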


----------



## TheMailMan78 (Nov 24, 2010)

Benetanegia said:


> I'm mostly saying that hardware took a different direction than Crytek thought, and there's no going back.
> 
> Blame can be put mostly on 3 things:
> 
> ...



So yeah. We will never play it maxed out. They made a game no one will ever be able to play the way it's meant to be played, because the whole industry failed. Not them?

Edit: I'm not being a dick. Just trying to figure your reasoning.


----------



## pantherx12 (Nov 24, 2010)

I don't think anyone failed really, things just didn't go to plan.

I still think Crysis is one of the best-looking games out there. Better than STALKER and Battlefield (other games that impressed me graphically).

With custom tweaks to make it look even better, the only limiting factor really is hardware, because the engine is so scalable (which is epic and impressive, tbh).


----------



## DrPepper (Nov 24, 2010)

TheMailMan78 said:


> So yeah. We will never play it maxed out. They made a game no one will ever be able to play the way it's meant to be played, because the whole industry failed. Not them?
> 
> Edit: I'm not being a dick. Just trying to figure your reasoning.



Well, no. They took a gamble that the industry would take a different course than it did.


----------



## Benetanegia (Nov 24, 2010)

TheMailMan78 said:


> So yeah. We will never play it maxed out. They made a game no one will ever be able to play the way it's meant to be played, because the whole industry failed. Not them?



Kinda. The engine was created (finished) by late 2004/early 2005, and there's no going back unless you want to start from scratch. Back then there were no dual-core CPUs yet, and very few people thought we would hit the MHz wall, or that AMD and Intel would follow the easy path of adding cores instead of making them actually faster. The only attempt I've seen from either in that direction is Bulldozer, 5 years later.

And on the GPU front, like I said, it was a massive slowdown paired with a change in people's standards regarding resolution and AA/AF. The 8800 was much faster than the previous gen, but very few people I know actually saw that, because at the same time they bought a card from that gen, they also went from gaming at 1280x1024 0/2xAA to 1920x1200 4xAA. And you know it's true for many here on TPU too.


----------



## crazyeyesreaper (Nov 24, 2010)

Exactly. Like everything in the tech world, it's constantly changing; you can try to predict it and you might get lucky, but people forget that from the first PC ever made until around 2005 it was all single-core CPUs that just kept getting faster and faster. Their engine was finalized, and as Bene said, there was no going back. That doesn't change the fact that Crytek's engine is still the most impressive game engine, period, in terms of the image quality it can achieve. Now, Cry Engine 3, as far as I'm aware, corrects the flaws of Cry Engine 2, but there's a serious issue: because of all the bitching and moaning (see this thread and many others), Crytek is going BACKWARDS in terms of tech, i.e. static lighting and reduced effects. So sure, Crysis 2 will look good and it will run better, but at the end of the day they sacrificed ultimate visual quality to make everyone who complained happy. As they say, them's the breaks.

All we can do now is see what happens next.


----------



## pantherx12 (Nov 24, 2010)

I wonder if Crytek will leave an option to turn real-time lighting back on.

Wouldn't surprise me if they did.


----------



## crazyeyesreaper (Nov 25, 2010)

I'm hoping so, because considering the difference when it comes to real-time renders of character meshes using

Gamebryo

Unreal 3

Marmoset

Cry Engine 2

Cry Engine 2 is far superior in terms of what I could achieve, so I'm hoping we get that option, Panther. But then again, if they do include it and it brings systems to their knees, I'll have to listen to uneducated asshats condemn them for it.


Also, if you want a game that really is UNoptimized, go play Saints Row 2 on PC, where an i7 and dual 480s will still, in most situations, net you only a 15 fps average.


----------



## Benetanegia (Nov 25, 2010)

I didn't know they were going to use static lighting; I thought the constraints a city imposes (vs. jungle) would be enough. That makes me sad. (I hope it's something for the console versions only.)

Regarding being able to enable it again: maybe only through the .ini or in Sandbox? That would be great for me.


----------



## newtekie1 (Nov 25, 2010)

TheMailMan78 said:


> See the problem with this is the fact the game is three years old. THREE YEARS and something as bleeding edge as two 580's in SLI have trouble pushing it at 1680x1050? Come on guys? I would have bought this argument a year ago maybe but now its out of hand.
> 
> Let me ask you guys this....WTF did they have running to develop this game and run it at ultra high settings? Or are they to still waiting on a GPU that can push their "great" coding?



There are some faults with that argument.

When Crysis was released, the common screen resolution was 1280x1024.  And it most certainly could be maxed out with the graphics hardware of the time.  It took Tri-SLI 8800 Ultras or 9800 GTXs, but it was doable.

If you look at some other benchmarks, it doesn't take SLI GTX 580s to push it at 1680x1050 at max settings.  In fact, a pair of GTX 480s can do it; they can even manage 60 FPS at 1920x1200 as well.  I would even venture to guess a pair of GTX 470s would be playable at max settings at 1680x1050.

You can complain about how bad the coding was all you want, but the fact is that the max settings were extreme even by today's standards, and the game was playable on a huge range of hardware.  A poorly coded modern game wouldn't run on hardware that was ancient even 3 years ago, yet Crysis does.  That, if anything, is a testament to how well the game actually was coded.  Just because the developers gave us the option to push graphics quality way beyond what was necessary doesn't mean the game is poorly coded.  The argument that you can't play it on max settings so it must be crap coding is idiotic.


----------



## crazyeyesreaper (Nov 25, 2010)

Yeah, from what I remember from press releases, it's static lighting because the consoles can't handle the dynamic lighting along with the in-game detail, so they went with a slight detail increase at the expense of lighting and shadowing. That might change, I don't know, but for now it looks like the whiners caused perhaps one of the best game engines I've had the privilege to tinker with to get downgraded. I'm all for calling out companies that suck at optimization (example: Saints Row 2), but oh well. I'm just one customer, and their bottom line needs improving, so that's that.



> The argument that you can't play it on max settings so it must be crap coding is *asinine*.



Fixed. Needed more oomph.


----------



## DrPepper (Nov 25, 2010)

crazyeyesreaper said:


> Yeah, from what I remember from press releases, it's static lighting because the consoles can't handle the dynamic lighting along with the in-game detail, so they went with a slight detail increase at the expense of lighting and shadowing. That might change, I don't know, but for now it looks like the whiners caused perhaps one of the best game engines I've had the privilege to tinker with to get downgraded. I'm all for calling out companies that suck at optimization (example: Saints Row 2), but oh well. I'm just one customer, and their bottom line needs improving, so that's that.
> 
> 
> 
> Fixed. Needed more oomph.



I saw a real-time demo on the PS3 that had dynamic lighting?


----------



## Benetanegia (Nov 25, 2010)

DrPepper said:


> I saw a real-time demo on the PS3 that had dynamic lighting?



Yeah, this is 2010, so there's probably always going to be one dynamic light for the characters, but that light probably won't affect anything else. Meaning the environment will have static lighting and precomputed shadows.


----------



## crazyeyesreaper (Nov 25, 2010)

Exactly.

Crysis + Warhead = all lights are dynamic

Crysis 2 = 1 dynamic light, everything else is static.

I'm not sure, but they might be able to pull off two dynamic light sources. We all know dynamic reflections etc. are going to suffer or be replaced with static ones, and soft shadows will most likely be disabled and replaced with regular precomputed shadows that are somewhat blocky on all hardware. Only time will really tell. As I said, with great care and understanding, static can almost match dynamic sources, but it actually takes a lot more work, and as we all know, more work = more money needed; more money means the publisher gets more say, and the game suffers. It's a vicious circle. Again, at the end of the day you're limited by the bottom line, and all companies need a profit, so we're likely to see worse graphics than a four-year-old game, or on par with it, but no actual improvement, which is what would be expected.


And while the PC will get AA, last I checked the 360 and PS3 versions will have no AA. Usually 360 games can manage 2x AA at 720p, and the PS3 gets a form of morphological AA, but neither system gets AA in this game, to maintain a proper frame rate.


----------



## TheMailMan78 (Nov 25, 2010)

newtekie1 said:


> There are some faults with that argument.
> 
> When Crysis was released, the common screen resolution was 1280x1024.  And it most certainly could be maxed out with the graphics hardware of the time.  It took Tri-SLI 8800 Ultras or 9800 GTXs, but it was doable.
> 
> ...



No, my argument was that after three years it still barely breaks 60 FPS maxed out. However, Ben explained why. Do I still think the game is coded badly? No. What I do think is that someone at Crytek should lose their job for betting the industry would take a different route than it did. 

I could still call Crysis a failure because of this, and because no one else decided to use the engine due to its failure of vision. But then again, you all would say it's not their fault they built a boat with no water to put it in. It's still the best boat there is, and it's the world's fault for not developing correctly.


----------



## crazyeyesreaper (Nov 25, 2010)

No. As I said, man, the engine's not fail, and Crytek's not fail for developing it. What is fail, as you said, is trying to predict technology; when it changes so rapidly, you're better off not bothering, lolz.

But then again, we're used to 60 fps, whereas almost every console game is pegged at 30 fps, and many slow down to the low teens.

The fact that Crysis 2 is supposedly 720p with no AA of any form on consoles, just to maintain 30 fps, suggests they're having a bit of an issue. My guess is the game will be much like COD in that it will run around 600p to maintain frame rate. The issue I have is with the failure of Crysis to be seen as a success in the industry: how much will Crysis 2 suffer for it? Given that Crysis was so hard to run, will uneducated people shy away from the game itself?

The big fail for Crytek isn't that their game ran horribly or that they made a wrong guess; it's that the stigma attached to Crysis is likely to hinder sales, or so I'd be willing to bet. So while console sales may boom for those who want to say "haha, I run Crysis too", many PC gamers without high-end hardware may just say forget it and not bother. I have a feeling that should Crysis 2 fail on PC, we might see Crytek pull out of PC development and focus on consoles, in which case we basically lose the last company trying to push the boundary in all forms.

Just remember, to this day there are three big engines that get used:

Source

Unreal

Id Tech

Of those, Unreal and Source dominate. Now, considering that Unreal 2.0 could be modified by a studio to surpass Unreal 3, after all this time what improvements or real changes have we seen? Current game engines are much like the ATI hardware from the 2000-5000 series: tack on more shit and just run with it, lol. But eventually you hit a wall and things change; the question is, what will change?

Let's face it, PC gaming isn't dying, but PC gaming INNOVATION is in serious decline. I don't really want to play the same console-level graphics for another 5 damn years; I want my holodeck now, damn it. And I don't know where I'm going with this; I guess my brain just killed itself.

Maybe Bene can figure out where I'm going with this rant, 'cause I can't.


----------



## Benetanegia (Nov 25, 2010)

TheMailMan78 said:


> No my argument was after three years it still barley breaks 60 FPS maxed out. However Ben explained why. Do I still think the game is coded badly? No. What I do think is someone should lose their job at Crytek for estimating the industry would take a different route then it did.
> 
> I could still call Crisis fail because of this fact and that no one else decided to use this engine because of its failure of vision. But then again you all would say its not their fault they built a boat with no water to put it in. Its still the best boat there is and its the worlds fault for not developing correctly



Predictions are always a gamble and you cannot really blame anyone. It only looks like they screwed up because nobody else gambled, lol. Others didn't "fail", but they didn't improve anything either. All other games from 2006-2009 used 3+ year-old engines. The COD4/5/6/7 engine is really 2004 tech, obviously based on the Quake 3 engine; UE3 is little more than UE2 (2.5) on steroids; and pretty much every other game engine is a rehash of an older one supporting higher-res textures, HDR and little more. What's more, I'd say 90% of newer games are based on either UE2 or Quake 3 (based as in "licensed the engine once, then created my own by reverse engineering it").

It's really sad (for me at least) that I recently played HL2 Cinematic Mod 10 and never really felt like I was playing a 6-year-old game, because it doesn't look very different from (worse than) most of the new games I've played recently, with the exception of Crysis and Metro 2033. I'm playing Oblivion with some mods now (Quarls TP, OBGE) and it's the same story (yeah, the lack of proper games forced me into this nostalgia). In 2004/2005, when those games were new, doing that would have been unbelievable.


----------



## CDdude55 (Nov 25, 2010)

For me, the question that always popped up in my head with Crysis, since the day it came out, was whether the game was poorly coded, or whether it was just so far ahead of its time that we wouldn't see reasonable frame rates until hardware matured. It might have been a little of both. It's taken about 3 years to really see Crysis get dismantled, yet the issue is basically nonexistent on current hardware; my GTX 470 can max Crysis with a reasonable framerate, but that couldn't be said of the high-end 8800s back in '07. Then again, the fact that it's taken this long for that feat to be accomplished tells me a different story.


----------



## Mussels (Nov 25, 2010)

CDdude55 said:


> For me, the question that always popped up in my head with Crysis, since the day it came out, was whether the game was poorly coded, or whether it was just so far ahead of its time that we wouldn't see reasonable frame rates until hardware matured. It might have been a little of both. It's taken about 3 years to really see Crysis get dismantled, yet the issue is basically nonexistent on current hardware; my GTX 470 can max Crysis with a reasonable framerate, but that couldn't be said of the high-end 8800s back in '07. Then again, the fact that it's taken this long for that feat to be accomplished tells me a different story.



Poorly coded for sure. There are just performance bottlenecks in the engine itself.


Raw power lets us grunt our way through some of it, but the level of power required is far above what it should be.


----------



## newtekie1 (Nov 25, 2010)

TheMailMan78 said:


> No, my argument was that after three years it still barely breaks 60 FPS maxed out. However, Ben explained why. Do I still think the game is coded badly? No. What I do think is that someone at Crytek should lose their job for betting the industry would take a different route than it did.
> 
> I could still call Crysis a failure because of this, and because no one else decided to use the engine due to its failure of vision. But then again, you all would say it's not their fault they built a boat with no water to put it in. It's still the best boat there is, and it's the world's fault for not developing correctly.



So your argument is that if the developers of Crysis had artificially limited what the engine was capable of when releasing Crysis, then Crysis and the engine wouldn't have been a failure in your eyes?  Do you really think that is the best, most logical approach?


----------



## crazyeyesreaper (Nov 25, 2010)

There is no performance bottleneck in the engine itself; there's only so much info a damn CPU can provide to the game. It was you who stated that DX9/10 can only supply one thread for rendering. Well, if you've worked in 3D and seen what's on the plate, it's going to take more than what the DX of the time offered. It's highly likely that if CryEngine 2 had been built on DX11, i.e. today, no one would have complained. Then again, who knows what DX10 could have been? It was changed to suit Nvidia at release so there would be a DX10 GPU in time for Vista, and we all know how that turned out. There's a lot that comes into play. The major issue is that CryEngine 2 is just too heavy for the way DX handled its functions; I explained some of this already, and so did Bene earlier in the thread. Fact is, you can make any game engine look unoptimized or perform badly, and those games don't even have dynamic lighting systems.

Example: Cry Engine 3 will be the first game engine ever produced that offers full global illumination in-game. Will Crysis 2 use it? Fuck no, but it's there, and it's the first of its kind.  This time around, utilizing DX11 with multi-threaded rendering, performance should be far more realistic.

After all, it comes down to choices. What if Crytek had never done a Very High setting? As was stated earlier, I also agree that had High been the max setting, we wouldn't be having this conversation.

The biggest performance issue is the lighting system in Crysis.

Example: in Unreal Engine 3 the lighting is entirely faked; it's pre-baked into the textures etc., so light sources don't really do much, as everything is static and most of the work is done before anything is ever rendered. Change that to an approach that does it on the fly while you're running and gunning, and it's a completely different approach,

much like ray tracing vs. photon mapping for the most realistic lighting in a 3D app.  There are infinite choices and approaches, but when you're limited in how you can approach something, and even more limited in how you can fix it, you make do with what you've got. The fact that the game scales on older hardware is a testament to the effort put into it, to an extent.  Again, anyone here who wants to complain that a game is unoptimized or doesn't run well seriously needs to go play Saints Row 2 on PC, and after that, tell me what a real unoptimized game is.
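The baked-vs-dynamic contrast above is easy to see in toy form: a lightmap costs one lookup per pixel no matter how many lights were baked into it, while a dynamic renderer pays per light, per pixel, every frame. A deliberately simplified 1-D sketch; the falloff formula and all values are invented for illustration, not engine code:

```python
# Baked: light contribution was computed offline and stored per texel.
def shade_baked(lightmap, texel):
    return lightmap[texel]  # O(1) at runtime, regardless of light count

# Dynamic: every light is evaluated against every pixel, every frame.
def shade_dynamic(lights, pos):
    total = 0.0
    for light_pos, intensity in lights:
        d = abs(light_pos - pos)            # 1-D stand-in for distance
        total += intensity / (1.0 + d * d)  # simple falloff with distance
    return total
```

Runtime cost of the dynamic path grows linearly with the light count; the baked path stays flat but can never react to moving lights, which is exactly the trade being described.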


----------



## Mussels (Nov 25, 2010)

you got it and lost it at the same time.

If it's not CPU limited (spare cores) and it's not GPU limited, then it's engine limited, for not being able to use multiple threads.

What's its engine? DX9 and 10 based. So yeah, it's limited by its engine. If they remade it in DX11, we'd see different results.


----------



## crazyeyesreaper (Nov 25, 2010)

True, but Microsoft writes DirectX, not Crytek, and last I remember, Nvidia being unable to deliver all the DX10 features resulted in a lesser version that fit both the GPU maker's (Nvidia) and Microsoft's (Vista launch) goals, and no matter how you slice it, it was a clusterfuck.  That said, DirectX is just a software layer, so it's not really engine limited; it's more an API and operating-system limitation of the time, in terms of what it allowed to be possible. That's fixed now, which is fact, but it took four years to get there, and it's far too late. Crysis 2 will be Crytek's chance to basically say, "Hey, we got what we needed; now we have the software interface layer to handle our engine." Will they? I don't know; I don't work for them, nor will I probably ever, lol. But the point is, blame falls squarely on everyone in this situation:

Crytek tried to guess which way the tech world was headed.

Microsoft didn't think ahead in terms of DX10.

DX10 hardware was in its infancy, and they already knew DX9 didn't offer the proper threading for what they were trying to do.  It was a perfect setup for a shitstorm, and that's what happened. My guess, though, is that we won't see that situation again, as I highly doubt that in today's age, with the cost of development, many companies will have the grunt to push things forward. We are ever so slowly grinding to a halt in terms of advancement: hardware, software, creativity and scope.


----------



## Mussels (Nov 25, 2010)

Regardless of which piece of the engine is the problem, and who made that piece, it's still the engine at fault.


----------



## crazyeyesreaper (Nov 25, 2010)

Well then, I guess we should all use OpenGL so there's no bottleneck. 

Point is, people can bitch and complain that the GAME is unoptimized, but it's not true, and that was the whole real point. Either way, this has dragged on way too long, and to be blunt, the thread should probably just be closed.


----------



## Benetanegia (Nov 25, 2010)

Mussels said:


> Regardless of which piece of the engine is the problem, and who made that piece, it's still the engine at fault.



No, not at all. It depends on the settings you use. You can play the game on High instead of Very High and it will be completely smooth on today's hardware, plus it will still look much better than 95% of new games and be technically superior to 100% of them. What's more, you can tweak the hell out of the graphics-related Cvars and make it look almost as good as Very High while running better than stock High settings.

The values (precision, accuracy, resolution, you name it) of the graphics and physics related Cvars on Very High are actually 4x higher than on High. The difference between Medium and High is at most 2x, and most Cvars are either the same or only a little higher. Your typical game/engine distributes the options linearly: Medium is 25% better than Low, High is 25% more than Medium, and Very High is 25% more than High. Crysis is something like Low +50% -> Medium +25% -> High +200% -> Very High. E.g. (it's been 3 years, so I don't remember the actual Cvar), there's one that limited how many dynamic light sources could affect a pixel, or something like that, and the values were 1, 2, 4 and 16, respectively. (The equivalent in nearly every other game engine is 1, 1, 1 and 1.)

Another one was the number of "photons" to be jittered per pixel (or per quad, 2x2 pixels, not sure) for calculating the lighting. Think of it as antialiased lighting, so that light and reflections have no hard edges (so far the only game I've seen with this). Once again, the values were 4, 16, 32 and 128 respectively, iirc. (Other games = 1, 1, 1 and 1.)
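The nonlinear spread of those Cvar values is the whole point: each settings step multiplies the work instead of adding a fixed slice. A quick check of the step ratios for the two examples above (values as recalled in the post, so treat them as approximate):

```python
def step_ratios(values):
    """Ratio of each quality step to the one below it."""
    return [b / a for a, b in zip(values, values[1:])]

lights_per_pixel = [1, 2, 4, 16]    # Low, Medium, High, Very High
photons_jittered = [4, 16, 32, 128]

print(step_ratios(lights_per_pixel))  # the High -> Very High step is 4x
print(step_ratios(photons_jittered))
```

Against a typical engine's flat 1, 1, 1, 1 progression, the last step alone quadruples the per-pixel work, which is consistent with Very High being disproportionately heavier than High.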

They did it that way because Very High was supposed to be the definitive setting. Crysis on High already looks much better than any other game released in 2007-2009; you were just not supposed to actually game on Very High, except for benchmarking purposes really, and eventually you would be able to play it, like you can now since the HD 58xx and GTX 400 series. That's something Crytek already said when they launched it, and it's the actual reason Very High was not available in DX9 mode. I know they told a different story, but that was pure marketing based on a deal with M$ and Nvidia. It was the same thing with Far Cry, and id did the same with Doom 3 and pretty much every other of their games. It's just that hardware (including and especially CPUs) evolved so much faster back in the day.

And like those Cvars above, there were many, many others. You could say it was not very well "optimized" if you think optimizing means "fitting certain quality settings to the available hardware" (aka lowering settings until it fits), instead of what optimizing actually is: making code run faster so that you don't have to lower the settings.

Crysis's code is brilliant because of how much it does on the available hardware. And yes, available hardware means the hardware that is actually exposed via DX9, which means no multi-threading in the *rendering code* and many other limitations. Crytek could have created 28 threads; that wouldn't change the fact that until DX11 the rendering is single threaded and will always be tightly tied to single-core performance, which is almost the same now as it was 5 years ago, when the engine was created.

I'm sorry, but in 2004/2005 no one really thought we would hit a wall at 3-4 GHz; not even Intel or AMD thought that, or they wouldn't have been caught out in the multi-core race once the MHz race ended so abruptly. It's not until the i7 that Intel made small improvements to single-core performance, and the only real work on that front is AMD's Bulldozer, and we have yet to see whether it really improves single-threaded performance, and by how much.


----------



## erixx (Nov 25, 2010)

Thread is still going on? Ha!

Give me "anytime" an "unoptimized engine" that is beautiful, and stuff the "optimized engine" that is corridor-based, always dark, and unable to display open daylight scenarios; read: the 'heavenly', 'divine', 'godly' *old* Quake- and Unreal-engined games. 

VIVA CRYTEK : )

As of today, Crysis is, or must be, perfectly playable, I believe, as I played it slightly below max settings years ago. This bitching is like buying a sports bike or car and complaining that there are no roads where you can take it to full speed in your fkkg country, LOL.

For sure it is not the "ENGINE's" fault, LOLLOL


----------



## Benetanegia (Nov 25, 2010)

erixx said:


> Thread is still going on? Ha!
> 
> Give me "anytime" an "unoptimized engine" that is beautiful, and stuff the "optimized engine" that is corridor-based, always dark, unable to display open and daylight scenarios, read: the 'heavenly' 'divine' 'godly' *old* Quake and Unreal engineered games.
> 
> ...



That's another way of saying it. hehehehe


----------



## pantherx12 (Nov 25, 2010)

Cheers for that post up there, Benetanegia. I knew the engine was scalable, but I didn't realise it was to such an extent. Really handy info.

Is it possible to patch games to newer versions of DX or no?


----------



## Bundy (Nov 25, 2010)

CDdude55 said:


> For me, the question that always popped up in my head with Crysis since the day it came out was whether the game was poorly coded or just ahead of its time, to the point where we wouldn't see reasonable frames until hardware matured. It might have been a little of both; it's taken about 3 years to really see Crysis get dismantled, yet the issue is basically non-existent on current hardware: my GTX 470 can max Crysis with a reasonable framerate, but that couldn't have been said of the high-end 8800s back in '07. But again, the fact that it's taken this long to see this feat accomplished tells me a different story.



Don't forget what has been said about monitor size. I was able to play the game at about 20-30 fps with an 8800 Ultra, but the screen was a 1280x960 CRT. The motion blur took care of the rest. I thought the AI in the game was a bit stupid, but in retrospect that probably also helped make the game playable. I liked the game and enjoyed playing it.

I do not think I'd like to play it online, on my current monitor or with bots that could aim. I'm firmly a 60fps type of gamer these days.


----------



## Benetanegia (Nov 25, 2010)

pantherx12 said:


> Is it possible to patch games to newer versions of DX or no?



Yes and no. It is possible, but it is not profitable to do so; it's a cost that is really hard to justify. To gain any advantage from the new DX iteration you would have to rewrite most of the engine, and ultimately it may even be easier to start from scratch. Besides, writing the code itself is not what costs the most in terms of time and people; it's testing and bug fixing, and tbh that's time better spent creating a new engine/game if you are a game developer.


----------



## pantherx12 (Nov 25, 2010)

Benetanegia said:


> Yes and no. It is possible, but it is not profitable to do so; it's a cost that is really hard to justify. To gain any advantage from the new DX iteration you would have to rewrite most of the engine, and ultimately it may even be easier to start from scratch. Besides, writing the code itself is not what costs the most in terms of time and people; it's testing and bug fixing, and tbh that's time better spent creating a new engine/game if you are a game developer.




That's a shame.

Ahh well, I can always hope that one day processing power is great enough for the computer itself to update the software autonomously to match it. (I can dream!)


----------



## LiveOrDie (Nov 25, 2010)

Crysis was full of bugs which made the fps lower than it should have been; Crysis Warhead fixed most of those bugs, so you will find you get better fps in Warhead on the same settings.


----------



## pantherx12 (Nov 25, 2010)

Live OR Die said:


> Crysis was full of bugs which made the fps lower than it should have been; Crysis Warhead fixed most of those bugs, so you will find you get better fps in Warhead on the same settings.



I thought in Warhead they did things like reduce the number of ambient animations and tone down the lighting effects by default, etc.


----------



## Benetanegia (Nov 25, 2010)

pantherx12 said:


> I thought in Warhead they did things like reduce the number of ambient animations and tone down the lighting effects by default, etc.



Yep. Probably a little bit of both, actually. But Crysis > Warhead, that's for sure, although many people think Warhead actually looks better. Same as BFBC2 looking better than Crysis: it's subjective, although maybe right overall, in the sense that large-scale subjectivity makes for an objective common POV.

In Warhead they added a higher amount of (less accurate) effects and made them more pronounced and "spectacular". IMO, although in some aspects more beautiful and eye-catching, Warhead looks far more cartoonish and CG/game-ish, as opposed to Crysis, which aimed at realism.

And that trait was actually extended to all facets of the game, like the gameplay and AI, which were made more straightforward and action-based, but IMO less genuine and attractive. I enjoyed Warhead, but to me it was just another generic FPS, unlike Crysis, which was unique and different, and I like that kind of thing.

For me a game doesn't even have to be fun to be good, in the most generic meaning of fun. I approach games as I'd approach a film or a novel: I want them to make me feel something I would not usually feel in real life (i.e. no drama please, and no "I love you so much"/"I love you too" either), but I don't want to just kill, kill, kill; I want to think too. IMO the best games released "lately" have been Crysis, Metro 2033, Cryostasis (yeah, you read that right) and Mafia 2, in no particular order. There have been some others, but most of the rest have felt "hollow" to me, completely insipid.


----------



## newtekie1 (Nov 25, 2010)

pantherx12 said:


> I thought in Warhead they did things like reduce the number of ambient animations and tone down the lighting effects by default, etc.



Yeah, basically in Warhead they artificially limited the engine to make people happy. No real optimization was done beyond that.

Not really the best thing to do, IMO, because it hinders progress and the development of technology and better IQ in games, but I guess it's what has to be done to shut up the whiny bitches who can't max it on their rigs.


----------



## DrPepper (Nov 25, 2010)

Live OR Die said:


> Crysis was full of bugs which made the fps lower than it should have been; Crysis Warhead fixed most of those bugs, so you will find you get better fps in Warhead on the same settings.



All the bugs were fixed with patches.

I can't think of many bugs, if any, that are still in the game. Also check Benetanegia's excellent post on why it's slow.


----------



## LiveOrDie (Nov 25, 2010)

DrPepper said:


> All the bugs were fixed with patches.
> 
> I can't think of many bugs, if any, that are still in the game. Also check Benetanegia's excellent post on why it's slow.



Even after the patches there was still a big list of errors printed in the console on load.


----------

