While screenshots can be misleading, both in a positive and a negative way, I don't see anything too impressive here.
It's just more polished photo imaging. We've seen good lighting and shading for quite a while now. What we haven't seen is flora (grass, leaves, etc.) with a natural, high-quality look to it. It may be textured in high resolution, but it's still largely 2D, often heavily aliased and noticeably flat, and the clipping on those 2D cutouts still looks archaic.
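To make the "archaic clipping" point concrete, here's a minimal sketch in Python, not tied to any particular engine, of why alpha-tested 2D foliage ends up with hard, stair-stepped edges: the binary cutoff throws away the smooth coverage information stored in the texture's alpha channel. The function names and sample values are mine, purely for illustration.

# Why alpha-tested 2D foliage shows hard, aliased edges.

def alpha_test(alpha: float, cutoff: float = 0.5) -> float:
    """Classic 1-bit 'clipping': a texel is either fully drawn or discarded."""
    return 1.0 if alpha >= cutoff else 0.0

def alpha_blend(alpha: float) -> float:
    """Smooth coverage: a texel contributes in proportion to its alpha."""
    return alpha

# Alpha values you might sample across a leaf's soft edge in the texture.
edge_samples = [0.9, 0.7, 0.55, 0.45, 0.3, 0.1]

print("alpha  tested  blended")
for a in edge_samples:
    print(f"{a:4.2f}   {alpha_test(a):4.1f}    {alpha_blend(a):4.2f}")

# The 'tested' column jumps straight from 1.0 to 0.0 mid-edge; that hard
# transition is the jagged outline you see on 2D grass and leaves in-game.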
Since the 'shader' trend began, we've seen more developers use shaders to mask the low quality of those 2D objects, rather than use higher-quality objects natively.
A good example is Crysis. The videos and screenshots that boasted Crysis' 'amazing' graphics came from testbed setups, where the hardware could handle the shader workload needed to make those inadequate 2D assets (and some 3D ones) look better than they are. The Crysis that was actually released to consumers is obviously different. Crytek has stated that a 'patch' of sorts, giving players that same level of visual quality, will be made available in the future, once GPUs can manage the workload needed to render those shaders.
Personally, while I support the advances in lighting and shading, which have done wonders for the visual quality of 3D applications (DOOM III comes to mind... bump mapping was a new world wonder!), it seems to be becoming overkill.
Maybe developers fear that providing good-quality, interactive flora (grass, leaves, etc.) would require more texture RAM, something that has also begun to spiral out of control, with GPUs now needing up to 1 GB of dedicated on-board RAM.
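For a sense of scale, here's some back-of-the-envelope texture memory math in Python. The figures are rough rules of thumb (uncompressed RGBA8 at 4 bytes per pixel, DXT5 block compression at about 1 byte per pixel, a full mip chain adding roughly a third), not numbers from any specific game.

# Rough texture memory arithmetic: why high-res foliage eats into 1 GB of VRAM.

def texture_bytes(width: int, height: int, bytes_per_pixel: float,
                  mipmaps: bool = True) -> float:
    """Approximate size of one texture; a full mip chain adds about 1/3."""
    base = width * height * bytes_per_pixel
    return base * 4 / 3 if mipmaps else base

MB = 1024 * 1024
for label, bpp in [("RGBA8 (uncompressed)", 4.0), ("DXT5 (compressed)", 1.0)]:
    size = texture_bytes(2048, 2048, bpp)
    print(f"2048x2048 {label}: ~{size / MB:.1f} MB with mipmaps")

# ~21 MB uncompressed vs ~5 MB compressed per 2048x2048 texture: a few dozen
# unique high-res textures per scene adds up fast, which is why so much
# foliage is still cheap 2D cutouts with heavy texture reuse.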
Now we have two issues facing developers.