There are exceptionally few Unity games that are really competent at a technical level, IMO. I would guess the most advanced of them is The Long Dark. There are great games made in Unity, such as Subnautica, but few if any are mechanically complex games with lots of maps, props, etc. - and even then, The Long Dark isn't exactly a bug-free game. I'd know, with almost 3,000 hours in it...
I'd go with Genshin Impact as the big Unity exception, actually. It manages a seamless open world and fluid, responsive combat, it launched without any major issues or bugs, and on top of that it allows for flawless crossplay across PC, consoles, and mobile. And their budget is massive - half a billion at this point, and it just keeps growing alongside the game.
I thought MiHoYo had their own engine? But if that is the case, then indeed, it is quite a polished game! And I caught no "Unity-isms" from it. Usually you can tell straight away when a game uses a certain engine (see the "Unreal-ism" phenomenon - UE games look alike). Amazing indeed.
Well, yeah, though a key difference is that Unity is rarely used for AAA projects that are on the radar of the majority of people and end up causing widespread frustration when they launch.
Not that Gollum is an attempt at an AAA game - literally the only thing it has going for it is the brand. You could tell from the very first trailer that it was headed for mediocrity, and not only in terms of budget...
And the publisher is not always to blame for the full extent of a failure when it happens. Not all companies can afford to postpone projects indefinitely - that costs A LOT of money.
When a publisher and a developer sign a contract for a release in, say, four years, and four years later the project is still half-baked, the publisher is not actually at fault.
It's not like Gollum is the issue here; the problem is widespread. It's in 90% of all the AAA games released nowadays. Sure, some rank worse than others on the shit scale, but no one cares to optimise or polish anything - or, in some cases, even finish what they started. They just patch whatever has been done and sell it. No cohesion at all.
The AAA game industry is a turd factory running at full speed.
Devs are focusing on advancing the industry. Crysis wouldn't have existed if someone hadn't started making use of programmable shaders, or using textures larger than 100 KB, or using 3D meshes in games at all.
The video game industry evolves by implementing new things; it always has. Only those addicted to dialing everything to the max have anything to complain about. DOOM 2016 itself ran like crap maxed out at 4K. Even at 1440p, only contemporary top-of-the-line systems could run it at 60 FPS. And this was a title that was *mostly* conservative in new tech adoption (API migration notwithstanding).
Not defending LoTR: Gollum, though. That game looks and sounds like absolute trash...
But there is a fundamental difference: one is the core of the game, the other is a pure extra. You could delete RT and still play the whole game from start to finish.
The industry's goal is to replace the entire paradigm with RT. People perceive it as "extra" because, unfortunately, marketing focuses more on the "Ooh! Shiny!" aspect.
Personally, I'm OK with the development being subsidized by those willing to shell out >$1K on a card. On my end, I can just do what I've always been doing as a PC user: turn some toggles on/off!
I don't mean incorporating modern things is bad. What I mean is that those modern things aren't really incorporated in the first place - they just get shovelled into the game with no optimisation, and the devs call it a day. Sure, Crysis ran like crap on high-end systems back in the day, but it was a visual masterpiece. It can still stand its ground nowadays, and it runs miles better on modern hardware than some of the newest games do. When a game runs at 5 FPS on current-gen hardware and looks jaw-dropping, I understand. But when a game runs at 5 FPS only because of RT, yet looks like shit otherwise, that's unacceptable.
In short: overall visual quality matters. Whether you have an RT toggle in the settings or not doesn't.
The issue isn't really RT, or any other detail, it's the same issue that's resulted in many of the recent AAA games being trash code pushed out the door.
These publishers are bought and become corporate mentality hellholes, where the objective is to finish projects as quickly and cheaply as possible so they can move on to the next profitable venture.
There is only one solution to this: never preorder, and never pay any amount of money to support this attitude - and yes, that includes discounted versions a year later.
Reward games that are a labour of love, and don't support anything else - this is how you shape the industry.
The Witcher III: Wild Hunt - Game of the Year Edition
Turbo Sloths
Watch Dogs Legion
Warhammer 40,000: Darktide
Wolfenstein: Youngblood
World Of Warcraft: Shadowlands
Wrench
Xuan-Yuan Sword VII
Cyberpunk is the obvious one - you'll need a $1.5K GPU to turn everything on, but it's the best example. Other games use RT at a much lower level: Dying Light 2, for instance, and even Elden Ring added RT in a recent patch that makes buildings and dungeons look a lot better.
Metro: Exodus Enhanced Edition has good RT. It's not overwhelming, it doesn't take the artistic direction into a new realm, just slightly adds to it. Oh, and it runs well, too.
Also, Deliver Us The Moon, which is sort of an RT-sponsored title, but still good. I just can't enable RT on AMD in this game for some reason.
A Plague Tale: Requiem
Bright Memory: Infinite
F.I.S.T.: Forged in Shadow Torch
Ghostrunner
Observer: System Redux
Resident Evil 7: Biohazard // so, better than 2 and 3?
Shadow of the Tomb Raider
Stay in the Light // RTX showcase
The Medium
Voidtrain
Oh, and that's on a 7800X3D + 4070 Ti, with a 27" 1440p 240 Hz OLED.
Thread's content overlooked... I think the real issue here is the 'market'.
Looking at the popularity of 'retro-graphics' games like WH40K:Boltgun, it's kinda clear that 'the (AAA) gaming community' at large is being 'led around by its nose'.
It is the market that's driving big companies to push their devs to get 'AAA' titles 'out the door, ASAP', which is where a lot of the technical oversights come from.
Q2RTX is a complete joke. Its RT is horribly inaccurate and most of the time barely improves the scene. Only the reflections are accurate, but SSR has been able to do that since 1999. The lighting absolutely is not.
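For anyone wondering why reflections are the one thing SSR could already fake: screen-space reflections just march the reflected ray through the depth buffer of the frame you've already rendered, so they can only ever show what's on screen, while full RT traces into the actual scene, which is what real lighting needs. Here's a toy CPU-side sketch of the SSR idea (real SSR runs in a shader; all names and values here are made up for illustration):

```cpp
// Toy CPU-side sketch of the screen-space reflection (SSR) idea.
// Real SSR runs in a pixel/compute shader against the depth buffer;
// this only illustrates why SSR can fake reflections but not lighting:
// the march only ever samples what is already on screen.
#include <cstdio>

constexpr int W = 16, H = 16;
float depth[H][W]; // hypothetical linear depth buffer for the rendered frame

// March a reflected ray in screen space; report a hit if the ray
// passes behind a surface recorded in the depth buffer.
bool ssrTrace(float x, float y, float z,
              float dx, float dy, float dz,
              int* hitX, int* hitY) {
    for (int i = 0; i < 32; ++i) {          // bounded march, as in real SSR
        x += dx; y += dy; z += dz;
        if (x < 0 || x >= W || y < 0 || y >= H)
            return false;                   // ray left the screen: SSR has no data
        if (z >= depth[(int)y][(int)x]) {   // crossed a stored surface: "hit"
            *hitX = (int)x; *hitY = (int)y;
            return true;
        }
    }
    return false;                           // nothing on screen along the ray
}

int main() {
    // Fill the buffer with a far background and one near blocker column.
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x)
            depth[y][x] = (x == 10) ? 2.0f : 8.0f;

    int hx, hy;
    if (ssrTrace(2.0f, 8.0f, 1.0f, 1.0f, 0.0f, 0.125f, &hx, &hy))
        std::printf("reflection hit on-screen at (%d,%d)\n", hx, hy);
    else
        std::printf("no on-screen hit: SSR shows nothing, RT would keep going\n");
    return 0;
}
```

The failure branch is the whole point: the moment the reflected ray leaves the screen or passes behind an occluder, SSR has nothing to show, while RT keeps tracing through the scene - which is also why SSR can't be extended into proper global lighting.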