Friday, June 27th 2008
New Crysis Warhead Details Emerge
Some early details about Crytek's Crysis Warhead have been revealed today by PC Gamer magazine. Here are all the new key points at a glance:
Source:
Tiscali Games
- Begins when the original game's protagonist, Nomad, parts ways with Warhead's new hero, Psycho, and follows Psycho all the way until the two reunite aboard the aircraft carrier.
- Mostly located on the other side of the island.
- Less linear, more sandbox-style gameplay, as opposed to the original.
- Same nanosuit with the same functions, with more likely to be revealed later (a "surprise" in that regard is definitely promised).
- Singleplayer campaign to last 8-10 hours.
- At least two new weapons: a grenade launcher and dual SMGs.
- Improved enemy AI, better able to organize itself and follow tactics.
- New vehicles incl. Armored Scout Recon (about the size of a jeep with a mounted gun) and a hovercraft, both playable in multiplayer as well.
- New team-based multiplayer mode, less complex than the original two.
- Better optimized; runs faster than the original game on the same hardware.
- Won't require DX10 for maximum details and full effects.
- Dialogue written by BioShock's Susanna O'Connor.
- Completely stand-alone and as noted by developers, not an add-on but a full title.
- The possibility of a Crysis 2 is said to rely heavily on Warhead's sales.
31 Comments on New Crysis Warhead Details Emerge
I've found that it's actually more playable at higher resolutions with a multi-GPU setup as compared to running MGPU at a lower res
for example - on my system, Crysis seems to run a lot better at 1440x900 with crossfire than it does at 1280x800 with crossfire
but sometimes, though, I've felt that the game is just trying to render too much at one time . . . which would be fine - graphically it gives the user a much larger viewing distance and more detail, and it makes for a larger-appearing and more believable game world . . . but if one ventures into setting up a custom .cfg file, there are many cvars that can be adjusted to "tone down" what the game is trying to render, which increases performance substantially without really impacting game detail and depth - most of the time image quality doesn't even appear to be affected, or the change is so slight that you'd probably not notice it.
About optimization: as I said, the true meaning of optimization is to make the same thing run faster. Over time, because of the multiple definitions you can give to what the "same thing" is, "optimization" has drifted into "tone down until it runs fine".
Our perception is "flexible" and there's always a threshold you can play with and still be within the definition of "the same thing". It sounds complicated the way I put it, but the concept is easy. Take antialiasing for example: the picture at 4xAA and 16xAA looks pretty much the same - it isn't, but because the difference is within the threshold we can be perfectly happy with 4x. As you said, in Crysis there are a lot of things you can tone down and still get pretty much the same result because it's within the threshold, but you are not rendering the same picture. Crytek aimed too high at too many features, that's unquestionable, but they made them run "fast". Continuing with the AA example, imagine that 16x requires exactly twice the power of 4x, and that you have two identical games except one uses 4xAA and the other uses 16xAA. If the second one runs at exactly half the speed of the first one, it's just as well optimized. IMO Crysis goes one step further and makes the engine run better than that "half".
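To pin down that "just as well optimized" point with some made-up numbers (a toy sketch of the argument, nothing measured):

def throughput(relative_cost, fps):
    # work done per second = relative cost of one frame * frames per second
    return relative_cost * fps

aa4 = throughput(relative_cost=1.0, fps=40.0)   # baseline: 4xAA at a made-up 40 fps
aa16 = throughput(relative_cost=2.0, fps=20.0)  # 16xAA assumed to cost exactly twice per frame

print(aa4 == aa16)  # True -> same work per second, so equally well optimized

The point of that last sentence is that Crysis, IMO, beats this break-even: its heavier settings cost less than proportionally more frame time.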
As to why they aimed so high, IMO they didn't. ATI and Nvidia, prior to G80, were releasing new generations of cards (with ~2x the power of the previous one) every 12 months or so. This time it's been 20 months, almost twice as long. If the release pace had been the same, we would be at the doors of yet another next generation, with the current ones (GTX and HD4000) selling for half of today's price, plus the inevitable refresh cards that usually bring a ~25% increase in performance, not to mention multi-GPU. Even worse, both companies advertised the launch of the cards we are finally getting today for the end of 2007, so the decision to make Crysis tax those cards made more sense back then than it does today.
EDIT: As to why no other game has made the same mistake of aiming too high, the answer is very simple: consoles. Just look at PC-exclusive games and you will see that most of them are the ones that run slower. How could it be otherwise?
IMO "Crysis is ahead of its time" is not the right sentence in this case; a better one is "The time is late to Crysis".
Comparing 1280x720 to 1440x900, that leaves an average difference of about 5 FPS (give or take). That's a pretty negligible difference, IMO - you're left with better IQ from the higher res.
I can't say that those results would hold true for other users, though, based on system setups - load times for me are rather quick, but I attribute a lot of that to my system OC.

I see, that makes a lot of sense, really - I hadn't thought about it too much like that.
But then my question is - if a lot of the "extras" can easily be disabled or toned down without any noticeable impact on IQ . . . why the overkill? Wouldn't it have been better to tone everything back from the get-go if users can't actually see any difference between some of these settings?

Definitely agree on that - consoles have really dumbed things down over the years . . . but I would still think, based upon our hardware now, that PC games should still rule the roost . . .
Hell, I remember Duke Nukem 3D on PC looked better, ran faster, and loaded quicker than the Duke Nukem 64 port for the N64 . . . same went for Quake, Doom, and many other ports of that era
Digging through the cvars I found many hints that CryEngine uses some kind of photon-based lighting model. A photon model means that it doesn't just test whether a pixel is lit or not, but also to what extent, and not only with the usual multiplier. To explain, think of antialiasing (yeah, again :)) - it's exactly the same idea, but applied to lighting. I'll assume we all know what AA does and the reason to use it. Shortly said, AA uses more than one sample per pixel because that pixel does not necessarily need to have the colour of its center. Photon lighting serves the same purpose. Standard lighting models just check whether the pixel is lit and by what amount based on the light multiplier (after calculating attenuation if there is any, etc.). But, as with AA, what happens if almost half of the area the pixel represents (but not its center) should not be lit? The pixel should only be half lit, right?
Now here comes the controversy. Instead of using a number of photons (samples) per pixel that will be rendered, the lighting is done in geometry and the settings are established by the distance between photons (pretty much doubled for each setting, IIRC). The lighting is done in geometry because it's a rasterizer, not a ray-tracer, and because they probably use those same photons for creating shadow maps, among other things. And here is the problem: if you want to be sure you have enough photons for each pixel at 1920x1200, at lower resolutions you will have a lot more per pixel, which is overkill, but you have to do it if you want a proper render at high res. Maybe they could have made this variable depending on the resolution, but that could lead to more overhead than benefit, who knows...
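To put rough numbers on that (my own back-of-envelope figures, not anything from Crytek): if the photon spacing is fixed in world space and tuned so a surface gets roughly one photon per pixel at 1920x1200, then at lower resolutions the same surface covers fewer pixels, so every pixel receives proportionally more photons than it needs:

def photons_per_pixel(target_res, actual_res, per_pixel_at_target=1.0):
    # the same photons land on the surface, spread over fewer pixels at lower res
    tw, th = target_res
    aw, ah = actual_res
    return per_pixel_at_target * (tw * th) / (aw * ah)

for res in [(1920, 1200), (1440, 900), (1280, 720)]:
    print(res, round(photons_per_pixel((1920, 1200), res), 2))
# (1920, 1200) 1.0
# (1440, 900) 1.78
# (1280, 720) 2.5

That extra density at low resolutions is wasted work, but it's the price of guaranteeing enough photons per pixel at the highest resolution.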
Is this feature overkill in itself? Well, IMO it's one of the things that make Crysis stand above other games visually. How much realism and precision is too much?
it all just reminds me that good programmers can get the maximum amount of work done with the least amount of code and effort, while being as efficient as possible . . . but we don't really see that much anymore :ohwell:
I mean, just in comparison . . . I haven't dug too far into Crysis as a whole, so I'm no expert here by any means, but take in-game sound for example. Audio occlusion and environment are applied to voices based on where they originate and (IMHO) on the player character's POV . . . but if you notice, using the binoculars or weapon scopes, which really only pull a POV/DOF zoom, you can hear sounds that you shouldn't be able to, as you're in essence zooming closer to that location. It wouldn't have been that hard to base audio playback filters on the player character's relative position rather than POV, which should prevent being able to "zoom in" on sounds . . .
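Something like the following is what I mean - a minimal sketch with hypothetical names and a plain inverse-distance rolloff, not Crysis' actual audio code - where attenuation comes from the player character's position, so a POV zoom can't "zoom" the audio:

import math

def attenuation(listener_pos, source_pos, ref_dist=1.0, rolloff=1.0):
    # simple inverse-distance rolloff, clamped to [0, 1]
    d = math.dist(listener_pos, source_pos)
    return min(1.0, ref_dist / (ref_dist + rolloff * max(d - ref_dist, 0.0)))

player_pos = (0.0, 0.0, 0.0)    # where the player character actually stands
zoom_focus = (95.0, 0.0, 0.0)   # where the binocular/scope view is "looking"
enemy_voice = (100.0, 0.0, 0.0)

print(attenuation(player_pos, enemy_voice))  # ~0.01 - quiet, based on the real distance
print(attenuation(zoom_focus, enemy_voice))  # ~0.2 - what you get if the zoomed POV is the listener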
the even easier route would've been to bail out to Creative and implement EAX, where all the audio work has already been done, and all they would've needed to code was the EAX calls . . . or they could've gone the free route with OpenAL.
I have to commend them, though, on what they accomplished with the in-game sound engine, being able to "filter" audio like that . . . but considering how complex and bloated the game had become, moving audio filtering from the game engine to other software could've easily trimmed some fat at the cost of some slight audio latency.
But in the end, talking in absolutes, Crysis is not well optimized. No other game in the last few years is well optimized either. The only reason other games run well is that they were programmed to run on a 7900M GTX and a crappy cut-down version of the PowerPC, and the competition is on the same level.
As to the binocular sound, it's a gameplay feature. It was present in Far Cry too, and IIRC that's where they say the binoculars have some sort of directional microphone. Since gameplay in Far Cry and Crysis is open and happens on big maps, I suppose that was the only way they could add some enemy dialogue. That way you can hear their stories and jokes, they give you some advantage such as knowing where to attack them first, in Crysis you get the first glimpses of what the aliens are, etc...