No, not really. If Ada isn't, then RDNA 3 isn't either. I don't think it's that; we know UE5 has been in the works for some time. But all the games that have shipped with it so far have bizarrely high system requirements overall, and very little to show for it in the looks department. Neither Remnant II nor this game seems particularly visually appealing in the context of its system requirements.
Time will tell. I know as little as you; I just go on the raw numbers here. But the fact is there are several writings on the wall wrt optimization, market share and future prospects, and they don't happen to favor Nvidia's proprietary RT approach, but rather broad engine usage and solutions that are as hardware-agnostic as possible. It's not that this particularly favors AMD or RDNA3; it's just that RDNA3 has more raw raster perf plus better bandwidth, and it shows here, is my impression.
Just compare the 4090 and 7900XTX VRAM bandwidth and you'll see it echo the results of the bench; you're looking at roughly 1008 GB/s vs 960 GB/s there. The rest of the GPU is scaled around those numbers. This is also most of the reason the 4090 sits stellar and lonely at the top of the Ada stack while the rest trails by a mile: they lack that raw throughput, so on complex scenes the high-bandwidth cards excel while the others choke, at least a little more than they should. It's the same issue cards in every segment run into, except with UE5 the problem extends to the top end and becomes the equalizer, making it clear this is what truly limits cards going forward. Which has, again, been shown by the various cards that fall short somewhere in VRAM capacity or bandwidth and need extra TLC to make games run properly on them. That's how it really works after all: devs optimize for hardware. They don't do that if the hardware doesn't require it.
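For what it's worth, here's the back-of-envelope math behind those two bandwidth figures. This is just a rough sketch using the publicly listed bus widths and memory speeds (384-bit at 21 Gbps GDDR6X on the 4090, 384-bit at 20 Gbps GDDR6 on the 7900 XTX), not anything pulled from the benchmark itself:

```python
# Theoretical peak VRAM bandwidth: (bus width in bits / 8 bytes) * effective data rate.
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Rough peak memory bandwidth in GB/s from bus width and per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Public specs: both cards use a 384-bit bus.
print(peak_bandwidth_gbs(384, 21))  # ~1008 GB/s (RTX 4090, 21 Gbps GDDR6X)
print(peak_bandwidth_gbs(384, 20))  # ~960 GB/s (RX 7900 XTX, 20 Gbps GDDR6)
```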
This is now yet another game in the UE stable where the 7900XTX excels and gets close to a 4090, where it really shouldn't, given how these two cards stack up in just about every other game they face off in. Chalking that up to 'dev optimization' is, I feel, not being honest about what's in front of us.
However - I do fully agree these games don't really have much to show for their inflated required specs. If this is the future... I'm in the 'meh, what for' camp.
Remnant II does look underwhelming for how heavy it is, but Immortals of Aveum looks much better from a technical point of view.
I honestly think we are reaching diminishing returns when it comes to geometry detail and texture resolution. I've found Immortals of Aveum to be very detailed, but it still has that overall "artificial" look.
The wood texture doesn't lack detail or sharpness; it just looks artificial, and actually over-sharpened to me (and that's from a video capture).
View attachment 309823
There's dirt around and under the nails, there's small damage on the weapons, and you can clearly see the wrinkles on the skin. Yet it still looks very artificial.
View attachment 309825
IMO, the quality of the assets and the light/shadow is what will bridge offline 3D graphics and real-time graphics. Remember Unrecord? That game using high-quality assets from Megascans is a big part of its realistic look.
View attachment 309830
And oh man, so much this. There are absolutely ancient games on long-gone engines that manage to feel more realistic than the overpolished, oversharpened 'quality' assets we get today. This is not just diminishing returns, indeed... it's beyond the point of having a point.
What truly defines games these days isn't the engine, the box of special FX, the RT or no RT. What defines games graphically is their actual graphical design. There haven't been real limitations in 'photorealism' or in having sufficient pixels or polygons at your disposal for half a decade at the very least, probably more. The only real limitations these days are dev time and the cost of getting a product out the door. Engines? Whatever. Almost every engine produces palatable graphics now. It's also the reason things 'stagnate': there just isn't much fruit left on those trees. I've said it before... gaming graphics have plateaued for quite some time now. Effectively, the DX11 peak days are the actual peak of graphical fidelity. DX12 didn't give us much, if anything, except better API efficiency to make better use of the CPU.