
Star Wars Jedi: Survivor Benchmark Test & Performance Analysis

Haha, your point that I was responding to was that people were biased against AMD because Nvidia does the same thing, which, as I pointed out, isn't true.

And DLSS has been free for developers to use since 2021. Why a developer wouldn't include it in an Unreal Engine 4 title that has FSR 2 is something only the developers truly know. But anyone who doesn't think AMD is restricting the use of a superior upscaling tech in their sponsored games has blinders on and/or has trouble recognizing patterns. So yeah, my point stands: F AMD.
What patterns? This is the first, and so far only, game that I've seen use only FSR and not DLSS.

Do you think there's really AMD's hand in this? Are you sure there's zero cost for a developer to use DLSS? Do you think that developer laziness can and should be excluded from the reasons why DLSS isn't there?

I'm not saying that DLSS shouldn't be in the game. What I'm saying is that AMD isn't necessarily at fault. I'd be looking at DLSS implementation costs and developer laziness as the main culprit.
 

Resident Evil 4 Remake also uses a pretty shite version of FSR, with no official support for DLSS. Comically, the modded-in DLSS is better than the developer's implementation of FSR2.
 
Two games are hardly a pattern. Personally, I think it's more the combination of DLSS not being open source and developer laziness than an AMD conspiracy behind this.
 
What about the games that only support DLSS?

Obviously it's OK when games only feature Nvidia proprietary features, come on you know the drill.

The same goes for performance: when performance craters in AMD-sponsored games, it's bad optimization; when it happens in Nvidia-sponsored titles, it's called innovation and pushing the boundaries of what is possible.

Can't you see the bias? Do you really think that AMD instructed the developers not to work together with Nvidia on a DLSS implementation? I'm smelling some far-fetched conspiracy theory here.

Not to mention that if a company with 10% total market share (or whatever it is these days) can command a giant like EA to do whatever it wants, then Nvidia should just throw in the towel; it's over.

What an absurdity.
 
Resident Evil Village: FSR only
Resident Evil 7: FSR only
Resident Evil 4: FSR only
Resident Evil 3: FSR only
Resident Evil 2: FSR only
The Callisto Protocol: FSR only
Dead Island 2 (tested on TPU): FSR only
Far Cry 6: FSR only
Sniper Elite 5: FSR only
Asterigos: Curse of the Stars: FSR only

But granted: not ALL AMD-partnered games are FSR only, though there are some notable names on the list. FSR eventually made its way to some DLSS-only games, but unless I'm wrong, the reverse hasn't happened so far.
 
Right, that's 6 games, then (not counting the different RE remakes). But my point stands: rather than coming up with a conspiracy theory, let's look at DLSS licensing costs in terms of time and money.
 
@W1zzard
How are you getting 90+ FPS in 1% lows, when so many people struggle to maintain 60 on the 7800X3D?

Or is the game broken just on Ryzen CPUs?
 
You forgot Saints Row. Boundary has yet to be released, but the developers admitted to removing DLSS, and it happened after they gained AMD sponsorship. It will ship with FSR. There are probably some others out there. The reverse has never happened, best I can tell. The only AMD-sponsored games that have been exceptions, that I know of, were Forspoken, Deathloop, and I think The Last of Us and Uncharted.

This. Is. A. Pattern.
 
Right, that's 6 games, then (not counting the different RE remakes). But my point stands: rather than coming up with a conspiracy theory, let's look at DLSS licensing costs in terms of time and money.
There's no information about any licensing cost to use DLSS. And DLSS has been modded in for free by the community, with great success, in some FSR-only games, and we haven't heard of any legal action from Nvidia against the people who made those mods available. Nvidia themselves released the open-source RTX Remix tool with DLSS included. Unless Nvidia is working very hard to hide DLSS's financial reality, everything suggests that it's free to use, and "amateurs" manage to use it. It's even used in some indie games.
You just have to agree to the terms of the NVIDIA DLSS End User License Agreement, and you can download the SDK. I used DLSS when I was playing around with UE4. It's a simple plug-in, but a real dev can refine it further.

Resident Evil 4 Remake: DLSS and XeSS Community Patch Review | TechPowerUp
Nvidia’s DLSS is now freely available to any developer who wants it | Rock Paper Shotgun
Nvidia's RTX Remix Runtime is now Open-Source | OC3D News (overclock3d.net)
 

I was seeing people have issues with a 12900K/4090 as well, stuck in the 60 FPS range. I also believe some parts of the game are much worse than others.

But I believe it's worse on Ryzen in general.

Obviously it's OK when games only feature Nvidia proprietary features, come on you know the drill.

You didn't get the memo? Only RTX owners are the true PC master race, well, at least until they run out of VRAM and start crying about unoptimized games, that is...
 
If you are going to implement only one upscaler on the premise that it can work for everyone, then you at least have the opportunity to focus on it so it's well implemented. Which isn't the case here, where it not only looks bad but also struggles to deliver more performance reliably. Sometimes it works, sometimes it doesn't; an acquaintance with an all-AMD system (7600 + RX 7900 XT + 6000 CL30) saw no increase from using it.
That kind of janky implementation doesn't help anyone. It doesn't build mindshare around the technical qualities of FSR, and it doesn't help the players. You want people to forget about DLSS? Then make sure that your product either equals it or beats it every time. An AMD-partnered game should be the best showing of the tech, not one of its worst representations.
Do you read what you write? Is DLSS that much easier to implement, or some magic solution that, even when implemented wrong, just works? An upscaling implementation that looks bad and brings no performance advantage is an implementation done wrong. We know that FSR offers roughly the performance of DLSS and acceptable, or even good enough, quality compared to DLSS. It's not that FSR is a broken tech that doesn't work, offers no performance gains, or always looks bad; it "only looks bad" here.
And who said that the whole point is for DLSS to be forgotten? You are the one indirectly pushing the idea that DLSS would work no matter what, that it's a safe bet that can't be implemented wrong and always performs as it should. Just read what you wrote. And yes, I DID.
Nvidia themselves released the open-source RTX Remix tool with DLSS included. Unless Nvidia is working very hard to hide DLSS's financial reality, everything suggests that it's free to use, and "amateurs" manage to use it. It's even used in some indie games.
Do you remember when it was that Nvidia decided to offer those for free?
 


What performance gains do you get with DLSS on an RTX 3070?


Do you remember when it was that Nvidia decided to offer those for free?
We already have modders who have implemented DLSS with a better end result than the devs' FSR, like this guy PureDark. He's currently trying to implement DLSS himself in this game, but Denuvo prevents him from doing so.


Hardware Unboxed also made a comparison, and DLSS is often the leader in visual quality when the two techs are implemented.


If this game had implemented both techs and both results were bad, that would have been more understandable than implementing one tech and still ending up with a result that doesn't show its full potential. I would also jump on a DLSS-only game that was dysfunctional. The "forget DLSS" part was mainly targeted at the people saying that FSR "is the way to go since it's hardware agnostic", when the tech still needs to achieve consistent quality parity in every game where both are available.

Did you miss the part where I said myself that this game is one of the worst showings of FSR? Where I said that what this game is doing is preventing the tech from generating mindshare? I am aware that FSR can reach parity with DLSS. It just doesn't happen that often, but for an AMD-partnered game, it should happen. What I said was: "If you are only going to use FSR, you can at least do it right, since you don't have to work on several upscaling techs at the same time."
I've just tried DLSS Quality in God of War and my FPS went from 64 to 85, at 1440p max settings. FSR shows a similar gain.

I won't deny that I do have a bias towards DLSS, but that bias is fueled by the data that I've gathered, showing that at this precise moment, DLSS does seem to be the better tech. DLSS was made freely available in 2021. The Rock Paper Shotgun article does mention that before, you needed to apply, not necessarily pay for it. Other articles mention the same news with different wording.
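Gains like that line up with the arithmetic of how these upscalers work: "Quality" mode in both FSR2 and DLSS2 renders at roughly 1/1.5 of the output resolution per axis, so the GPU shades well under half the pixels. A rough sketch (the per-axis scale factors are the published FSR2 ones; the helper name is my own):

```python
# Rough sketch: internal render resolution per FSR2/DLSS2 quality mode.
# Per-axis scale factors are the published FSR2 values; DLSS2 uses near-identical ratios.
SCALE = {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0, "Ultra Performance": 3.0}

def internal_resolution(out_w, out_h, mode):
    """Resolution the game actually renders at before upscaling to (out_w, out_h)."""
    s = SCALE[mode]
    return round(out_w / s), round(out_h / s)

for mode in SCALE:
    w, h = internal_resolution(2560, 1440, mode)
    print(f"{mode:17s} -> {w}x{h} ({w * h / (2560 * 1440):.0%} of output pixels)")
```

At 1440p Quality, that's roughly 1707x960 internal, about 44% of the output pixels, which is why a jump from 64 to 85 FPS is plausible.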
 
If DLSS is so easy to implement then Nvidia should simply hire some dude to make FSR2->DLSS mods for games that only have FSR2, like we already have for the dozens of community made DLSS2 -> FSR2 mods out there.

Blaming Respawn for choosing to use a free and open source temporal upscaler that works on all the PC GPUs and consoles is just stupid. It's even more stupid to blame AMD for it. FSR2 is simply more useful to implement for developers making multiplatform games.

The people complaining that the devs didn't choose to use a proprietary tech that only works on their GPU maker need to take a check on narcissism.


I won't deny that I do have a bias towards DLSS
Noo.. Really? We couldn't tell.
 

Did they choose it? Or did they choose to take the money for not implementing other technologies (it is an AMD sponsored game)?

I'd say that only implementing one upscaling method is just as bad as locking down DLSS to one vendor. Or maybe it's worse, since the tech is objectively inferior. And it's especially bad in this game, with a lot of pixelation. It might get fixed, but in its current state, I'd rather just set a lower resolution with regular upscaling.

AMD are not the good guy. They constantly demonstrate anti-consumer behavior just like NVIDIA and Intel. All these companies care about is getting your money.
 
I have a genuine question: which one is the more sensible approach in terms of fewer bugs and performance issues?
Having the game developed purely for console and then ported over to PC?
Having the game developed on PC and released on console at the same time?
 
Someone tried already, but Denuvo doesn't allow DLSS2 to be modded in; frame generation is the only mod available, and it's going to be janky. :D


If you want to see me trash talk Nvidia, I've done that too. I switch sides like I change my shirt. Maybe two years from now you might call me an AMD shill. (Already happened to me on the CPU side.)
 
Hardware Unboxed also made a comparison, and DLSS is often the leader in visual quality when the two techs are implemented.
Completely subjective, I can make a similar chart & put FSR winning 90% of the time. Why wouldn't my list be more fair?
 
The VRAM usage is likely explained by them porting over the unified memory code from console, something I mentioned in other threads will become more and more common in new games going forward. System RAM becoming less important, Video RAM becoming more important.

On FF7 Remake it needs over 2 gigs of VRAM just to get to the title screen.

Consoles are the baseline, so I would expect a budget GPU to have at least as much memory as an Xbox Series S, mid-range GPUs to have 16 GB, and high-end GPUs more. A £250 console that includes an optical drive, NAND drive, motherboard, CPU, GPU (APU), VRAM, case, and controller has more VRAM than a 4050, LOL.
A couple of things here. First, the Series S does not have an optical drive.

Second, consoles are universally sold at a loss; the only exception is some Nintendo hardware. The Series S, sold at a profit, would likely be $600+.

Third, the Series S is not playing this game at 4K60. Or 4K30. It's doing 1080p30, with dithering and a bunch of other tricks to reduce image quality, and I'll happily bet that it frequently drops into the mid-20s. You can easily see this yourself by playing the game at low settings, where it will run on hardware like the 1070, which is multiple gens old.
So low end 8 gigs RTX 4050/4060
Mid to high range 16 gigs 4070/4070TI/4080
Enthusiast 24+ gigs 4090
Disagree.
Mid range will be fine with 12GB. 16GB will only be needed for ultra settings, which are a waste on many modern games.
24 gigs is useless for gaming today.
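To put these VRAM figures in perspective, here's a back-of-the-envelope sketch of what raw texture data costs. It's illustrative only: `texture_bytes` is a made-up helper, and real games use block compression that typically cuts these numbers by 4-8x.

```python
def texture_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
    """Approximate GPU memory for one uncompressed texture.
    A full mip chain adds roughly one third on top of the base level."""
    base = width * height * bytes_per_pixel
    return int(base * 4 / 3) if mipmaps else base

# One uncompressed 4096x4096 RGBA8 texture with mips is ~85 MiB,
# so even a few dozen of them, uncompressed, would crowd an 8 GB card.
size_mb = texture_bytes(4096, 4096) / 2**20
print(f"{size_mb:.1f} MiB")
```

It's crude, but it shows why texture-heavy console ports push past 8 GB once render targets and geometry are added on top.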

Completely subjective, I can make a similar chart & put FSR winning 90% of the time. Why wouldn't my list be more fair?
Feel free to go test your own 50 game suite, take video and image comparisons, and find a list to prove your point.

Otherwise, you're just being pedantic.
 
Completely subjective, I can make a similar chart & put FSR winning 90% of the time. Why wouldn't my list be more fair?
That would depend on the criteria that you use: less shimmering, more texture clarity, less ghosting, better reconstruction of details, etc. Comparing which one has fewer rendering issues is possible. Now, if you say that DLSS is "prettier", yeah, that's 100% subjective.
 
That would depend on the criteria that you use: less shimmering, more texture clarity, less ghosting, better reconstruction of details, etc. Comparing which one has fewer rendering issues is possible. Now, if you say that DLSS is "prettier", yeah, that's 100% subjective.
That changes from scene to scene, and even within a scene, from one DLSS version to another; same with FSR. While DLSS may be more polished, given that it's also older, I wouldn't give it a landslide like they did.
Feel free to post a million images comparing them side by side; also, it's 26 games in that comparison, not 50.
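For what it's worth, side-by-side comparisons like these don't have to stay purely subjective. A minimal sketch (assuming numpy, and a hypothetical `load_frame` for reading images) that scores an upscaled frame against a native-resolution reference with PSNR, where higher means closer to native:

```python
import numpy as np

def psnr(reference: np.ndarray, upscaled: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB between two same-sized frames."""
    mse = np.mean((reference.astype(np.float64) - upscaled.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(peak**2 / mse)

# Hypothetical usage (load_frame is assumed, not a real API):
# native = load_frame("native_1440p.png")
# print("FSR2:", psnr(native, load_frame("fsr2_quality.png")))
# print("DLSS:", psnr(native, load_frame("dlss_quality.png")))
```

Reviewers tend to use perceptual metrics on top of this, since PSNR alone misses shimmering and ghosting, but it's a starting point for an argument that isn't just "prettier".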
 
3. There are countless cards that don't even feature in the review and would be much more relevant than the ones featured. Is this to make people upgrade? To make the high-end GPUs look good?
EA Denuvo = 5 GPU changes in 24 hours. I added new entries earlier today, will do more testing now to add a few more cards
 
Did they choose it? Or did they choose to take the money for not implementing other technologies (it is an AMD sponsored game)?
It's not always predictable who gets the money from whom in a cross-marketing deal. It's also not certain that this deal included some clause to forbid Respawn from implementing DLSS in the game.
The game could have used UE4's TAAU instead of FSR2, but it seems the latter is generally better. Regardless, FSR2 is objectively more usable than DLSS2.

FSR2 compatibility list:

XBox Series X
XBox Series S
PS5
Nvidia RTX
Nvidia GTX
AMD RDNA 1/2/3
AMD GCN


DLSS2 compatibility list:
Nvidia RTX


I'd say that only implementing one upscaling method is just as bad as locking down DLSS to one vendor.
"Implementing a temporal reconstruction method that works on all platforms is just as bad as implementing one that only serves a fraction of the PC and console gaming market".
This is delusional at best.


Or maybe it's worse, since the tech is objectively inferior.
Compliance / compatibility is part of what makes a technology superior. In that regard, DLSS is terrible by design. There's not even any proof the tech couldn't work through compute alone.


And it's especially bad in this game, with a lot of pixelation. It might get fixed, but in its current state, I'd rather just set a lower resolution with regular upscaling.
It's not bad at all, and the pixelation you mention is only perceptible when Digital Foundry takes 400% zoom screenshots of high-speed sequences. In normal gameplay, people hardly notice it.
But the good part is you can just turn it off and/or hold off on buying it until someone makes a DLSS2 mod for the game.



AMD are not the good guy. They constantly demonstrate anti-consumer behavior just like NVIDIA and Intel. All these companies care about is getting your money.
I don't care about either corporation; I care about which one offers the best deal for consumers in the short and long term.
Open-source technologies that work across all architectures are better for consumers than similar closed-source ones that only work on the last two GPU generations of one particular IHV.
 
We already have modders who have implemented DLSS with a better end result than the devs' FSR, like this guy PureDark. He's currently trying to implement DLSS himself in this game, but Denuvo prevents him from doing so.
Especially on AMD, Intel and GTX cards, the results must be phenomenal. Right?
Hardware Unboxed also made a comparison, and DLSS is often the leader in visual quality when the two techs are implemented.
Strange. Isn't this the site that Nvidia fans call "AMD biased"? Seen that video. No wonder a hardware implementation is better than a software one.
What you wrote before isn't really the same as what you're saying here. You were making it look like FSR is junk and DLSS would have been a de facto good implementation. In any case, implementing one tech and getting a bad result is on the developers, not the fault of a technology that has proven to be more than good enough in a number of games. Also, FSR will never have parity with DLSS, because it is software vs. hardware. On the other hand, DLSS will never be able to cover all gamers, because it is closed tech that runs on specific hardware. Having both is best. If only one, better that it be FSR, with DLSS coming later. Or XeSS, if it can work on all hardware the same as or better than FSR. I think there are cases where XeSS is better than FSR and cases where it just doesn't perform, meaning devs and Intel need to work more on it.
DLSS was made freely available right after AMD gave away the FSR source code. While AMD reacts to new techs from Nvidia, Nvidia reacts to AMD's approach of giving the equivalent techs away for free. The same happened with G-Sync: G-Sync was closed and hardware-only; FreeSync happened, then VESA's Adaptive-Sync followed (which was FreeSync anyway, but without AMD's branding), and Nvidia was forced to adopt a "G-Sync Compatible" tier to avoid losing the market they had created.
 
Here's an interesting article for anyone who wants a read. Besides TPU, this is the only site I trust to tell the unbiased truth. Better turn on your ad blocker before going to DSOG, though; otherwise you'll get barraged with ads.

Indeed, fighting amongst ourselves keeps us from uniting against the crappy publishers that spew out broken or unfinished games.

 

Indeed. Anyway, this made me chuckle:

 