Thursday, March 13th 2008
S.T.A.L.K.E.R.: Clear Sky DirectX 10 Screenshots
DirectX 10 games have never looked so good. PCGH posted a story today showing the first S.T.A.L.K.E.R.: Clear Sky DirectX 10 screenshots, along with an exclusive 4:31-minute clip of in-game footage from CeBIT 2008. The game is running on Windows Vista and video cards from NVIDIA's 9 series, probably a GeForce 9800 GX2. If you ask me, this game puts Crysis to shame, and most importantly, it runs smoothly!
Source:
PCGH
52 Comments on S.T.A.L.K.E.R.: Clear Sky DirectX 10 Screenshots
Clear Sky looks great, and DX10 support is even better news. I can't wait!
I only hope they implement dropped features from the original game, like sleeping and drivable cars.
I just have to be patient and prices will go down.
Did anyone else see that Amazon has an 8800 GT for $160? :D
I don't want another graphics showcase, and I hope the plot is a lot more solid than Shadow of Chernobyl's.
Do you know if DX10 performs better on XP than Vista with this hack?
And also does this mean you don't have DX9 on XP anymore?
alkyproject.blogspot.com/2007/04/finally-making-use-of-this-blog-i.html
That dude will be raking in the money pretty soon.
Crysis disabled some effects (Very High mode) in DX9, but you could force them on - people wrongly called it enabling DX10, when the Crytek team had merely turned them off, figuring DX9 rigs wouldn't have the power to run them anyway.
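For what it's worth, the commonly circulated way to force this was a config tweak, not a DX10 unlock. The sketch below assumes the `sys_spec_*` quality-group console variables from the usual Crysis tweak guides (names and values taken from those guides, so treat them as assumptions): dropping them into an `autoexec.cfg` in the game folder pushes the quality groups to their "Very High" values while still running the DX9 executable.

```
; autoexec.cfg in the Crysis install folder (assumed cvar names from tweak guides)
; 4 = the "Very High" preset for each sys_spec_* quality group
sys_spec_Shading = 4        ; forces the Very High shader set under DX9
sys_spec_ObjectDetail = 4
sys_spec_Shadows = 4
sys_spec_Water = 4
sys_spec_PostProcessing = 4
```

Nothing here adds DX10 features like geometry shaders; it just re-enables effect settings that were already implemented in the DX9 path but gated off by the preset menu.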
I think a lower-end GPU is OK if you match it with a high-end proc, like the Q6600. Perhaps you were running an X800 GTO with a weak single-threaded CPU?
The dude must have some skills = money
IT'S IMPOSSIBLE TO GET DIRECTX 10 ON WINDOWS XP!
Anyway, it does kinda look like Half-Life, but that's nice - I liked Half-Life. Half-Life with destructible environments and CoD 4-style effects, like shooting through materials, would be great.
It's just more polished photo imaging. We've seen good lighting and shading for quite a while now. What we haven't seen is flora (grass, leaves, etc.) with a natural, high quality to it. It may be textured in high resolution, but it's mostly 2D, often heavily aliased and noticeably "2D," as the clipping levels still seem archaic.
Since the "shader" trend began, we've seen more developers attempting to use shaders to mask the low quality of 2D objects, rather than using higher-quality objects natively.
A good example is Crysis. The videos and screenshots that boasted Crysis' "amazing" graphics came from testbed situations, where the hardware was able to manage the shader workload required to make these inadequate 2D assets (and some 3D ones) seem better than they are. The Crysis that was released to consumers, however, is obviously different. Crytek has stated that a "patch" of sorts, giving players that same level of visual quality, will be made available in the future, WHEN GPUs are able to manage the workload needed to render these shaders.
Personally, while I support the advancements in lighting and shading, which have done leaps and bounds for the visual quality of 3D applications (DOOM III comes to mind... bump mapping was a new world wonder!), it seems it's becoming overkill.
Maybe they fear that providing good-quality, interactive flora, grass, leaves, etc. would require more texture RAM, something that has also begun to spiral out of control, with GPUs needing up to 1 GB of dedicated on-board RAM.
Now we have two issues facing developers.