
Cyberpunk 2077 Game and Performance Review Roundup—The Antidote to 2020?

Everything as expected.

Bought W3 after one of the last patches, and I'll do the same for CP2077. They will fix it, eventually...
 
Now 10 hours in, and aside from the odd FPS drop when loading a new area, no major bugs to report. Had a couple of instances of derpy AI, and the map and fast-travel system could stand some usability buffs, but no major issues.
 
Damn, pretty bad performance, dare I say unacceptable for a game in development for so long. The fact that core scaling is so poor makes me wonder how they even got the game running on old-gen consoles; it must go into slideshow mode in certain instances.

Almost like having to suddenly go to working remotely due to a global pandemic can royally screw up a game's development within a year of its launch date.

Also, I think it's the dumbest decision from these developers (not just CD Projekt) to keep developing new games for last-gen consoles. Move on already. Last-gen consoles are ass cheeks.

@W1zzard Any plans to do your own performance review of the game? I'd like to see your benchmarks.

Also, off topic for anyone who is currently playing Witcher 3: there are two mods on the Nexus website that restore its visuals to the initial E3 reveal quality, before they downgraded it for launch. One is essentially a lighting mod and the other updates every texture in the game. They make the game look insanely good.
 
Any plans to do your own performance review of the game? I'd like to see your benchmarks.
spent too much time playing the game and messing with my test scene ... hopefully article tomorrow
 
Damn, pretty bad performance, dare I say unacceptable for a game in development for so long. The fact that core scaling is so poor makes me wonder how they even got the game running on old-gen consoles; it must go into slideshow mode in certain instances.

I run it on a 1080 Ti at ultra settings at 1440p: the FPS is 100% acceptable for a single-player experience (no lag or funky stutter, and that's coming from an 'I need 400+ fps in CS:GO' dude). The game looks like nothing else out there, so IMO I'll take this game's 30-50 fps at ultra/1440p over, say, PlayerUnknown's choppy 100+.
 
Done with the main quest. The only bug I could repeat is that the scan/quickhack would sometimes get stuck if you used it too quickly after a cutscene or after exiting a building; annoying, but nothing a restart didn't fix. Had one instance of Deathproofitus where I had 0 HP but didn't die, and a few instances of the AI getting confused or stuck (punching them in the face usually fixed it).

All in all, nothing out of the norm for an RPG of this scale and complexity.

Performance-wise, the only complaint I had was the odd FPS drop in certain areas/cutscenes.

Game-tech score 8/10: performance is great given the scale and complexity, with room to improve. I don't think the core-scaling issue is unreasonable; from a programming perspective it makes sense to run a lot on one core. Managing all the assorted logic you need to make the game world sing is a tall order, and making one core do the brunt of it makes sense in a game this complex: every time you need to go back to cache/RAM you introduce delay you simply cannot afford if you want a smooth experience.
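Rough sketch of the pattern I mean (purely hypothetical, nothing to do with CD Projekt's actual engine code): keep the world-state update on one thread so its hot data stays in that core's cache, and only fan out the work that doesn't touch shared state.

```cpp
// Hypothetical main-loop sketch, NOT CDPR's engine: one thread owns the
// world simulation so its hot data stays in one core's cache, while
// independent work (audio, streaming, etc.) runs on a worker thread.
#include <thread>

struct World { /* NPC schedules, traffic, quest state ... */ };

void update_world(World& w, float dt) {
    // All gameplay logic touches the same data every frame; keeping it on
    // one core avoids bouncing cache lines between cores.
    (void)w; (void)dt;
}

void background_jobs() {
    // Work that never mutates shared world state can run in parallel safely.
}

int main() {
    World world;
    std::thread worker(background_jobs);      // stand-in for a whole job pool

    for (int frame = 0; frame < 3; ++frame)   // a few frames for the sketch
        update_world(world, 1.0f / 60.0f);    // the "main" core does the brunt

    worker.join();
}
```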

Gameplay score 8/10 (because of the irritating UI). The UI is a bit of a pain to navigate quickly when some choomba is set on flatlining you; more than once I got killed changing weapons or performing a quickhack.

Story score 11/10: I literally had a dream I was in the game last night. CD Projekt knocked it out as usual.
Overall 9/10, outstanding.
 
OneMoar said:
Do you enjoy being publicly ignorant?
You clearly need a mirror, as demonstrated by the vast majority of your posts.
OneMoar said:
DLSS is a near +30fps boost with little appreciable quality hit
DLSS gives a performance boost when COMPARED to any other form of AntiAliasing. Turning them both off allows the GPU to render frames WITHOUT any post processing. As always, when the GPU has to do LESS work, the framerate ALWAYS goes higher.
 
I don't think the core-scaling issue is unreasonable; from a programming perspective it makes sense to run a lot on one core. Managing all the assorted logic you need to make the game world sing is a tall order, and making one core do the brunt of it makes sense in a game this complex: every time you need to go back to cache/RAM you introduce delay you simply cannot afford if you want a smooth experience.

Except other games similar in scope fare much better. I don't know how the game logic was implemented, and neither do you, but the evidence still says performance is underwhelming. I suspect the bulk of development happened in the last year or two.
 
You clearly need a mirror, as demonstrated by the vast majority of your posts.

DLSS gives a performance boost when COMPARED to any other form of AntiAliasing. Turning them both off allows the GPU to render frames WITHOUT any post processing. As always, when the GPU has to do LESS work, the framerate ALWAYS goes higher.
I think he (she?) meant that running the game at 720p + DLSS gives you better performance than 1440p + no DLSS, just worded it quite badly.

Of course using any post processing method on the same resolution impacts performance negatively. :D
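If you want the napkin math behind that (rough numbers, nothing measured): at a 720p internal resolution the GPU shades about a quarter of the pixels it would at native 1440p, which is where most of the DLSS headroom comes from.

```cpp
// Napkin math only: pixel counts at native 1440p vs. a 720p DLSS internal
// resolution. Real DLSS quality modes use other internal resolutions too.
#include <cstdio>

int main() {
    const long native_px   = 2560L * 1440L;  // 3,686,400 pixels
    const long internal_px = 1280L *  720L;  //   921,600 pixels
    std::printf("Native 1440p: %ld px\n", native_px);
    std::printf("720p internal: %ld px (%.0f%% of native)\n",
                internal_px, 100.0 * internal_px / native_px);
}
```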
 
This reminds me of the Cyrix days, when Quake ran much better on Intel and Cyrix released their own benchmarks concluding that yes, it does in fact run better on Intel, but that regardless, both systems were able to produce 13 fps for smooth gameplay. lol
 
Damn, pretty bad performance, dare I say unacceptable for a game in development for so long. The fact that core scaling is so poor makes me wonder how they even got the game running on old-gen consoles; it must go into slideshow mode in certain instances.

It hasn't really been in development for that long; it's been in development for about two years IMO, since the release of the 2018/19 cinematic. Either way, I think it's an average, overhyped game even if it were bug-free.
 
I'm running Cyberpunk on my Core i9 Extreme / 3090 / 64GB DDR4 / SSD with all ray tracing settings turned up, but I'm playing on an Alienware curved 34" gaming monitor at 1440p until I can upgrade to Alienware's 38" 4K monitor.

The majority of the market is playing on a 1080p or 1440p monitor and as long as they have a 2060 or better, they should have no issues with detail turned way up and RTX ON.

I’ve had just 2 bug experiences:

#1 in a car chase, my shotgun disappeared from my hand while I shot at attacking drones.

#2 some inventory displays took so long to boot, I thought the game crashed.

Originally I was on the fence about this game and I was afraid that it was going to disappoint me. I can honestly say that this game hasn’t disappointed me at all and I’ve been amazed by the experience they’ve crafted here.

I give my experience as 9/10.

The GUI needs more explanation for crafting. It reminds me of when I played Fallout 4 and I didn't understand how to build a settlement until basically the end of the game. Had I known how to build up my settlement and trade lines throughout the game, and how to manage items, it would have made the gameplay a lot easier.

My major issue is that this game should have been made specifically for the PC, and they should not have tried to port it so early. That would have allowed them to get it out of the door months ago, and they could have focused the ports on the PlayStation 5 and Xbox Series X.

This game is basically unplayable on the original Xbox One and PS4.

This game barely works well on the Xbox One X or PlayStation 4. If they had targeted the Series X and PS5, they would have created the best possible experience at the sacrifice of a few gamers not being able to play it, but at least everyone who played it would have gotten a similar-quality experience.
 
Imagine thinking that DLSS is just another post-processing filter, then imagine thinking that it and anti-aliasing do anything close to the same thing.
 
Imagine thinking that DLSS is just another post-processing filter, then imagine thinking that it and anti-aliasing do anything close to the same thing.
If they didn’t do the same thing you could run them both simultaneously.
 
Maybe this is a bad take, and an even worse take because it's coming from an owner of a 5700 XT, but it seems like there is a lack of optimization if the game struggles to hit 60 FPS at 1080p (max settings, excluding ray tracing) on a video card that delivers a minimum of 60 FPS at 1440p in all other current-gen titles.
 
If they didn’t do the same thing you could run them both simultaneously.
But they do the same thing, just in different ways and to various levels of effectiveness. Not all games support DLSS, so the need for alternate forms of AntiAliasing still exists. A list of currently compatible games is below:

I own several of the games on that list but still disable DLSS and AA, because I just don't like the way it looks when weighed against the performance hit. I'd really rather have the extra performance.

With CP2077 I turn a lot of things down or off. It is the one game I have that is now bottlenecking my CPU (Xeon X3680) in a serious way. Time for an upgrade...
 
But they do the same thing, just in different ways and to various levels of effectiveness.
Correct. DLSS actually incorporates most of TAA within its process, which is why you can't run them both at the same time.
 
You don't seem to get it: DLSS offers superior image quality versus TAA alone, with none of the performance drag of MSAA. Why on God's green earth would I turn DLSS and anti-aliasing off? Might as well just throw my monitor right out the window. And the whole point about GPU load and performance is entirely irrelevant; DLSS does not run on the graphics core, it runs on the tensor cores.

So, to reiterate: why the hell would I turn anti-aliasing and DLSS off for a vastly inferior image and less performance?

Saying that DLSS is not needed is completely wrong. It absolutely is needed if you want the best image quality while maintaining good performance.

Otherwise you might as well just run everything at 720p on a 15-inch LCD from 2010.
 
You don't seem to get it: DLSS offers superior image quality versus TAA alone, with none of the performance drag of MSAA. Why on God's green earth would I turn DLSS and anti-aliasing off? Might as well just throw my monitor right out the window. And the whole point about GPU load and performance is entirely irrelevant; DLSS does not run on the graphics core, it runs on the tensor cores.

So, to reiterate: why the hell would I turn anti-aliasing and DLSS off for a vastly inferior image and less performance?

Saying that DLSS is not needed is completely wrong. It absolutely is needed if you want the best image quality while maintaining good performance.

Otherwise you might as well just run everything at 720p on a 15-inch LCD from 2010.
Actually, what DLSS does is upscale your image from a lower resolution and give you ALMOST the same quality you would get rendering natively at the higher resolution without any type of AA on. The main benefit is blurring most of the jaggies and giving you the ability to upscale.
 
Actually, what DLSS does is upscale your image from a lower resolution and give you ALMOST the same quality you would get rendering natively at the higher resolution without any type of AA on. The main benefit is blurring most of the jaggies and giving you the ability to upscale.
Yes, but how it works is irrelevant; the results are what matter, and the results are very good. No, I would not be against having a tensor-core-accelerated FXAA/TAA, but turning DLSS off and running with no anti-aliasing makes absolutely no sense; the result with DLSS is closer to MSAA than to a post effect.

Unilaterally claiming that DLSS is not needed, or that turning off anti-aliasing provides the same results, is flat-out wrong.

Not at 1440p or even 2160p. The new generation of DLSS has a bunch of settings: auto, performance, balanced, quality.

Combined with FPS targeting and an intelligent implementation, the entire thing is seamless, only scaling when needed to maintain the FPS target and not being overly aggressive unless performance is exceptionally poor.
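Something like this toy controller is what I picture the FPS-target part doing (my own guess at the logic, not NVIDIA's actual implementation): only drop the render scale when a frame misses the target, and only drop it hard when it misses badly.

```cpp
// Toy FPS-target controller sketch (hypothetical, not NVIDIA's code):
// nudge the render scale based on how far the frame time misses the target.
#include <algorithm>
#include <cstdio>

double adjust_render_scale(double scale, double frame_ms, double target_ms) {
    double over = frame_ms / target_ms;      // >1.0 means we missed the target
    if (over > 1.25)      scale -= 0.10;     // badly over budget: scale down hard
    else if (over > 1.02) scale -= 0.02;     // slightly over: gentle nudge
    else if (over < 0.90) scale += 0.02;     // lots of headroom: claw quality back
    return std::clamp(scale, 0.50, 1.00);    // keep 50%..100% of output resolution
}

int main() {
    double scale = 1.0;
    const double target_ms = 1000.0 / 60.0;  // 60 fps target
    for (double frame_ms : {14.0, 18.0, 25.0, 16.0}) {
        scale = adjust_render_scale(scale, frame_ms, target_ms);
        std::printf("frame %.1f ms -> render scale %.2f\n", frame_ms, scale);
    }
}
```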
 
makes absolutely no sense
To YOU maybe, but that's a matter of opinion. Some people prefer to turn it all off and get the most performance from their GPU. Why? Because the laddering effect, or "jaggies" if you wish, is so small at 1080p and above as to be unobservable unless you go looking for it. During active gameplay one is just not going to notice. With no AA it's just not a big deal. AA is a performance hog, and so is DLSS. Now that I think about it, I haven't used AA consistently since the Pentium 4 days.
 
k run your 32in monitor at 720p and report back on how much it bugs you
btw I am never wrong
 
k run your 32in monitor at 720p and report back on how much it bugs you
Ooooo, witty response, I'm floored... Really... No seriously, totally feeling the verbal smack-down from you...
btw I am never wrong
And I'm sure that in your head everyone else is a complete blithering idiot... Once again, you need a mirror...

So to sum up, for the other users watching: if you're playing Cyberpunk 2077 and you're not getting the framerates you wish, turn off DLSS and/or AntiAliasing and turn down some of the other settings that hit your CPU/GPU in a hard way, and you'll do better.
 