
Star Wars Jedi: Survivor Benchmark Test & Performance Analysis

Especially on AMD, Intel and GTX cards, the results must be phenomenal. Right?

Strange. Isn't this the site that Nvidia fans call "AMD biased"?

Strange. I've seen that video. No wonder a hardware implementation is better than a software one.

What you wrote before isn't really the same as what you're saying here. You were making it look like FSR is junk and DLSS is a de facto good implementation. In any case, implementing one tech and getting a bad result is on the developers, not the fault of a technology that has proven to be more than good enough in a number of games. Also, FSR will never have parity with DLSS because it's software vs. hardware. On the other hand, DLSS will never be able to cover all gamers because it's closed tech and runs on specific hardware. Having both is best. If there's only going to be one, better that it be FSR, with DLSS to come later. Or XeSS, if it can work on every piece of hardware the same as, or better than, FSR. I think there are cases where XeSS is better than FSR and cases where it just doesn't perform, meaning devs and Intel need to work on it more.
DLSS was made freely available right after AMD gave away the FSR source code. While AMD reacts to new tech from Nvidia, Nvidia reacts to AMD's approach of giving the equivalent tech away for free. The same happened with G-Sync. G-Sync was closed and hardware-only. FreeSync happened, then VESA's Adaptive-Sync followed, which was FreeSync anyway but without AMD's branding, and Nvidia was forced to adopt a "G-Sync Compatible" tier to avoid losing the market it had created.
Yeah, maybe the tone of the original post was too strong, but it wasn't my intention to belittle FSR as bad in every scenario.

I haven't talked about it much in the TPU community, but the rumor of Microsoft working on an upscaler that might be implemented in DirectX is the best-case scenario (at least for Windows). Intel XeSS is also compatible with all hardware... but you'll get better results on Arc since it's XMX-accelerated. If every GPU is going to have an AI component down the line, it's best to have a single solution that can take advantage of it for everyone. And it seems silicon makers aren't keen on enabling hardware-accelerated features on their competitors' products, so it's the software guys who need to do it. Even if Nvidia opened up DLSS, they would probably do it the Intel way, and there would be complaints that "all is not equal" whenever a game looked or performed better with the hardware implementation.

I still vividly remember what happened with GPU compute: Apple played the "good guy" by making OpenCL an open standard, Nvidia made the closed CUDA, ATI tried something with ATI Stream, and Microsoft only did the "obscure" DirectCompute. Fourteen years later, Apple shunned the API it created and had to copy Nvidia to win back the offline 3D market it lost after Nvidia became persona non grata on macOS. Devs who had ignored OpenCL moved over to Metal really fast. ATI Stream died in favor of OpenCL, and OpenCL failed to gain traction for 3D rendering. The same thing happened on Windows, except that Microsoft never tried to compete with CUDA or to avoid a situation where, if you want to do offline 3D rendering, an RTX GPU is the only reasonable option. CUDA won because Nvidia made sure the tech would be easy to implement, and if you have an issue, Nvidia will help you fix it.

I don't want to see another case where an open standard ends up on the sidelines because its creator "just hoped for the best" and didn't give the tech enough support. The people making closed standards have a shark mentality: they will rub the interested parties the right way to make sure you keep using their stuff. You need to have the same attitude to compete. I have a personal history with AMD being too complacent about support in creative apps, even when they had the compute advantage :D
 
Did they choose it? Or did they choose to take the money for not implementing other technologies (it is an AMD sponsored game)?

I'd say that only implementing one upscaling method is just as bad as locking DLSS down to one vendor. Or maybe it's worse, since the tech is objectively inferior. And it's especially bad in this game, which has a lot of pixelation. It might get fixed, but in its current state I'd rather just set a lower resolution with regular upscaling.

AMD are not the good guy. They constantly demonstrate anti-consumer behavior just like NVIDIA and Intel. All these companies care about is getting your money.
What makes you assume that AMD paid the developers to not use DLSS?

Also, FSR works on everything. Anyone can use it if they want to. It's an option, not a necessity.
 
Lots of games were going to use DLSS until they got sponsored by AMD. Some of them were already using DLSS and then dropped it, like Boundary.
It's quite obviously a trend, and it wouldn't be so bad if a few other things lined up, like it being a great FSR 2.2 implementation, but as it stands it's just crap. I just cannot hang my hat on "it works for, you know... everyone!!1!" when it's crap. They'd have been better off dropping in FSR 1.0 at this rate, or XeSS considering that works for everyone, or simply leveraging the sponsorship to make sure the FSR implementation was decent, rather than forcing the omission of DLSS.

AMD-sponsored titles are quickly becoming a distinct subset to avoid. Some of the problems are not of their making, but some certainly are, and I'm disgusted by it.

AMD knows best; actually, AMD knows better than you. And they've decided that DLSS is bad for you, because it's bad for them, and they're the main character.
 
Lots of games were going to use DLSS until they got sponsored by AMD. Some of them were already using DLSS and then dropped it, like Boundary.
Fair enough. I still wouldn't completely write off Nvidia's closed, in-house approach and/or licensing fees, and developer laziness, as contributing factors.
 
It's quite obviously a trend, and it wouldn't be so bad if a few other things lined up, like it being a great FSR 2.2 implementation, but as it stands it's just crap. I just cannot hang my hat on "it works for, you know... everyone!!1!" when it's crap. They'd have been better off dropping in FSR 1.0 at this rate, or XeSS considering that works for everyone, or simply leveraging the sponsorship to make sure the FSR implementation was decent, rather than forcing the omission of DLSS.

AMD-sponsored titles are quickly becoming a distinct subset to avoid. Some of the problems are not of their making, but some certainly are, and I'm disgusted by it.

AMD knows best; actually, AMD knows better than you. And they've decided that DLSS is bad for you, because it's bad for them, and they're the main character.
I don't like upscaling and avoid it if I can. However, FSR is now the primary upscaler technology because of its easy implementation and wide applicability to all GPUs and consoles. DLSS 1/2 is a secondary technology, albeit a technically superior upscaler, for those GPUs and games that support it. I expect more games to prefer FSR as an upscaler in the future and hopefully the technology improves.
 
I expect more games to prefer FSR as an upscaler in the future and hopefully the technology improves.
And if AMD insists on blocking DLSS from being in their games, I can only hope FSR improves, or at least gets implemented well enough not to be garbage. FSR by itself would be good enough, if it were actually good enough. It feels very counterintuitive to me to block DLSS in AAA games where the largest subset of possible buyers will want that option; they will lose sales over this behaviour. Perhaps developers will increasingly see that too and include the others anyway, or just use Streamline, which I hope FSR gets added to. I have zero issues with FSR being in literally every game from now on, but that doesn't have to come at the expense of the others. It's not a zero-sum game; perhaps everyone could just use the one best suited to their setup. What a thought.
 
Wow. AMD-sponsored means evil. If I buy a 7600X for $289 Canadian, I get a copy of that game, which means I effectively get a 7600X for $200 Canadian. Thanks, AMD. BTW, that was how I got CoH3. All the people complaining about DLSS forget that it is a great game to begin with, and newsflash: if the consoles are using AMD, why the hell wouldn't they put FSR into all their games? I don't hear anyone calling out Nvidia for pimping CP2077, but the whining comes down to the fact that the game plays better on AMD than on Nvidia, and the fanboys are crying just like people crying over the firing of Tucker Carlson.
 
Wow. AMD-sponsored means evil. If I buy a 7600X for $289 Canadian, I get a copy of that game, which means I effectively get a 7600X for $200 Canadian. Thanks, AMD. BTW, that was how I got CoH3. All the people complaining about DLSS forget that it is a great game to begin with, and newsflash: if the consoles are using AMD, why the hell wouldn't they put FSR into all their games? I don't hear anyone calling out Nvidia for pimping CP2077, but the whining comes down to the fact that the game plays better on AMD than on Nvidia, and the fanboys are crying just like people crying over the firing of Tucker Carlson.
The FSR implementation here makes FSR2 look like DLSS 1. The performance on AMD is a fraction of what it should be. This should be running without hiccups at 120 FPS+ on a 7900 XTX at 4K with no RT. The graphics are not good enough to justify being this demanding on the hardware.

That this broken mess has AMD's name on it is bad for AMD. AMD fans should be furious that an AMD-sponsored title is of such poor quality.
 
And if AMD insists on blocking DLSS from being in their games, I can only hope FSR improves, or at least gets implemented well enough not to be garbage. FSR by itself would be good enough, if it were actually good enough. It feels very counterintuitive to me to block DLSS in AAA games where the largest subset of possible buyers will want that option; they will lose sales over this behaviour. Perhaps developers will increasingly see that too and include the others anyway, or just use Streamline, which I hope FSR gets added to. I have zero issues with FSR being in literally every game from now on, but that doesn't have to come at the expense of the others. It's not a zero-sum game; perhaps everyone could just use the one best suited to their setup. What a thought.
Yes, Nvidia is presently "the largest subset of possible buyers," but FSR serves Nvidia, AMD, and Intel GPUs alike. Whether other upscalers are used in addition to FSR is obviously something for the game developer to explore. I agree that DLSS is presently the technically superior upscaler of the bunch, but its proprietary status means it will never be the preferred upscaler.
 
Yes, Nvidia is presently "the largest subset of possible buyers," but FSR serves Nvidia, AMD, and Intel GPUs alike. Whether other upscalers are used in addition to FSR is obviously something for the game developer to explore. I agree that DLSS is presently the technically superior upscaler of the bunch, but its proprietary status means it will never be the preferred upscaler.
I also don't think it will be the upscaler of choice in, like, 10 years, but in the here and now its exclusion, purposeful or otherwise, is unforgivable for a PC release in 2023, especially in a game where it's barely more than a checkbox.

From what I gather, the reputable tech press agrees on this, and the only people who appear to be happy/OK about it are Radeon users and vocal Nvidia haters, representing such a small portion of potential buyers. I get the argument and agree that owners of 10-series or older GTX cards, etc. want FSR and can't use DLSS, sure, but that has nothing to do with also excluding DLSS.

All 3 solutions should be added when it's this easy, heck, add TSR too if you like, but I thoroughly reject the notion that, in the here and now, just giving us a (crap) FSR implementation is good enough, predicated solely on the basis that everyone can use it. I mean, the game has other options not everyone can use, after all.
 
A couple of things here. First, the Series S does not have an optical drive.

Second, consoles are universally sold at a loss; the only exception is some Nintendo hardware. The Series S, sold at a profit, would likely be $600+.

Third, the Series S is not playing this game at 4K60. Or 4K30. It's doing 1080p30, with dithering and a bunch of other tricks to reduce image quality, and I'll happily bet it frequently drops into the mid-20s. You can easily replicate this yourself by playing the game at low settings, where it will run on hardware like the 1070, which is multiple generations old.

Disagree.
Midrange will be fine with 12 GB. 16 GB will only be needed for ultra settings, which are a waste in many modern games.
24 GB is useless for gaming today.


Feel free to go test your own 50 game suite, take video and image comparisons, and find a list to prove your point.

Otherwise, you're just being pedantic.
Fair enough on the optical drive error, but I was comparing the Series S to low-end GPUs; those most definitely wouldn't be playing it at 4K either.

I would probably rather play a game at 720p/1080p for £250 than play at the same resolution for circa £500, or maybe at 1440p/4K for a couple of grand.

Also, in the latest AAA titles, 12 GB will starve you at middle-of-the-road settings. You also have to remember that people don't just buy a GPU for today; they expect it to last at least a few years. It's really just enthusiasts who are prepared to replace their hardware every year.

I could make the same argument back to you: why does a mid-range or low-end GPU need dedicated RT shaders?

EA Denuvo = 5 GPU changes in 24 hours. I added new entries earlier today, will do more testing now to add a few more cards
The game seems mostly CPU-bound. Could you please change up the system and try at least half a dozen CPUs or so, going from maybe Skylake up to current gen, and also Ryzen 1, 2, and 3?
 
A certain subset of users are becoming such elitists when it comes to texture quality. They are happy to turn down RT (which should be the new Ultra setting) or use FSR (which destroys image quality), but turning texture quality down to Medium/High apparently makes GPUs unusable :rolleyes:.

If I were to say something with that much elitism, I could say every GPU but the 4090 is obsolete for current AAA games, let alone AAA games in the next two years. And yeah, the 4090 will be obsolete in a few months when UE5 games come out; time to upgrade to the 4090 Ti then :D
[Attached chart: ray tracing performance, 3840x2160]
 
Is it plausible to add a lower-settings test run with fewer GPUs, to show how VRAM limitations affect performance?

Heck, testing it with just the 4090 and recording the VRAM figures at low/medium/high should be enough, and low effort. People can easily figure it out from there: "oh no, 8.1 GB on high, guess I can turn down one setting".
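
For what it's worth, here's a minimal sketch of how someone could log that kind of peak-VRAM figure themselves on an NVIDIA card, assuming nvidia-smi is on the PATH. Note that it reports total memory in use on the GPU, not a per-game breakdown, and the 300-second window is an arbitrary choice:

```python
# Rough sketch: poll GPU memory in use via nvidia-smi while the game runs,
# then print the peak. Reports total usage on the GPU, not just the game.
import subprocess
import time

def read_vram_mib() -> int:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return int(out.stdout.splitlines()[0].strip())  # first GPU, value in MiB

peak_mib = 0
deadline = time.time() + 300        # sample for five minutes of gameplay
while time.time() < deadline:
    peak_mib = max(peak_mib, read_vram_mib())
    time.sleep(1.0)                 # one sample per second is enough for a peak figure

print(f"Peak VRAM in use: {peak_mib} MiB (~{peak_mib / 1024:.1f} GiB)")
```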



A minimum-settings test would at least let us know which GPUs are capable of 4K 60 FPS at native res, even if it looked bad - these results tell me I wouldn't play the game on ultra even without RT, but then I'd need to go find another reviewer who tested lower settings to find anything else out.

This would absolutely add to the time it takes to test, so it would make sense to do it with a reduced set of GPUs - perhaps one control GPU and 3-4 VRAM-constrained ones at most?

3070 8GB
6700XT 12GB
4080 16GB
*control GPU, 4090 or whatever*
It'd let people see if lower settings make the extra VRAM irrelevant for users not at ultra settings, because the % difference between the cards would change and reveal its value to that market segment - do they need GPU performance, or more VRAM?

At least you can do the low-graphics testing with the same GPU while you wait out Denuvo's stupid hardware-change limitations.



Knowing whether 8 GB is needed for 4K at minimum settings without DLSS/FSR would be a deciding factor for a lot of people.


I'm all for testing games at ultra settings, but ray tracing and path tracing have made them useless for the majority of gamers now - they just wanna know what works smoothly.
Borderlands 3 is commonly tested, but one graphical setting is bugged and has a 40% performance impact for no visual change - volumetric fog, which hurts performance even on levels it isn't visible in.

No gamer would ever bother having that on in the real world, if they knew it was the reason for the struggling performance (especially in the early tutorial levels) - but every benchmarker has to include it, because it's in the default settings.
 
I gotta say I'm looking forward to UE5 being widely adopted; it doesn't seem to have the shader-loading issues and has a much better DX12 implementation. Still demanding, but it seems far better suited to PC use. UE4 has a lot of issues with stuttering, poor utilization of the GPU or CPU, etc. The shader stuff is maybe something the new consoles get away with thanks to GDDR6 as base RAM and absurdly fast storage?
 
Funny stuff in here.

Everything AMD does is junk, and a company that controls less than 10% of the discrete GPU market can manipulate the market. Poor, poor $685 billion market cap behemoth with 80% or more of the discrete market share, twice the income, and a much higher profit margin - a behemoth that can move the market where it wants, when it wants, and sell hardware to consumers at much higher prices because consumers see it as the absolute king of the market, and that has had strong ties with game developers for at least the last 15 years - doesn't stand a chance against the evil junk maker AMD.

Is this the comedy section?
 
At least you can do the low-graphics testing with the same GPU while you wait out Denuvo's stupid hardware-change limitations.

Denuvo is a digital plague. I detest it.
 
Well, apparently a modder can already implement DLSS SR + FG in Jedi Survivor, easily doubling the FPS, but is blocked by Denuvo.

If a game company hates its customers this much, do its games deserve to be bought? :D
 
It's not always predictable who gets the money from whom in a cross-marketing deal. It's also not certain that this deal included some clause to forbid Respawn from implementing DLSS in the game.
The game could have used UE4's TAAU instead of FSR2, but it seems the latter is generally better. Regardless, FSR2 is objectively more usable than DLSS2.

I could understand a developer only implementing FSR if it was a small indie studio with actual time and resource constraints. But this is EA, one of the major publishers, charging $70 for a game.

And yes, they decided to use FSR2 on consoles as well in this game. And at the moment that turned out to be a horrible decision. The game is completely unoptimized on all platforms, simply unfinished.
On PS5 it drops as low as 648p before reconstruction. 648p! Even Forspoken's lowest resolution was 720p. The highest resolution is 864p. And even despite this, the game constantly drops below 60 FPS, and even below 40 FPS in the most demanding scenes. That's a complete joke, considering the game looks like a blurry mess with such a low base resolution.
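
For context on how low that is, here's a quick sketch of the internal render resolutions the standard FSR2 quality presets imply at a 4K output, using AMD's published per-axis scale factors. The 648p/864p figures above come from dynamic resolution scaling rather than one of these fixed presets:

```python
# Illustration only: standard FSR2 preset scale factors and the internal
# render resolution each one implies at a 3840x2160 output.
PRESETS = {                # per-axis scale factor
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

out_w, out_h = 3840, 2160
for name, factor in PRESETS.items():
    w, h = round(out_w / factor), round(out_h / factor)
    print(f"{name:>17}: {w}x{h} internal -> {out_w}x{out_h} output")
```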

Most of the framerate drops actually seem to be related to ray tracing, which you cannot turn off on consoles. And that's probably a CPU bottleneck since the game is using Garbage Engine 4.

EA had absolutely no right to release the game in this state. Dead Space was actually a pretty good launch. The console version was using VRS at launch, which looked ugly, but they quickly fixed that. The PC version was great with very few issues. And the game was made on the Frostbite engine, which pretty much always runs great. The PC version has DLSS support and it runs perfectly fine on 8 GB cards.
 
@W1zzard, where did you do your FPS measurements? You seem to have avoided the heavy CPU bottleneck this game suffers from in many places.
Also, how did you measure VRAM usage?
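
Not speaking for the review's methodology, but for anyone wondering what usually sits behind such numbers, here's a rough sketch of how an average FPS and a "1% low" figure can be derived from a frametime log (for example, a PresentMon or CapFrameX capture). Definitions of "1% low" vary between tools; this is just one common one:

```python
# Sketch: turn a list of frame times (in milliseconds) into an average fps
# and a "1% low" (average fps over the slowest 1% of frames).
def summarize(frametimes_ms: list[float]) -> tuple[float, float]:
    avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)
    slowest = sorted(frametimes_ms, reverse=True)      # worst frames first
    worst = slowest[: max(1, len(slowest) // 100)]     # slowest 1%
    low_1pct_fps = 1000.0 * len(worst) / sum(worst)
    return avg_fps, low_1pct_fps

# Hypothetical capture: mostly ~60 fps frames with an occasional 30 fps hitch
avg, low = summarize([16.7, 15.9, 18.2, 33.1, 16.4] * 200)
print(f"avg: {avg:.1f} fps, 1% low: {low:.1f} fps")
```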
 
Funny stuff in here.

Everything AMD does is junk, and a company that controls less than 10% of the discrete GPU market can manipulate the market. Poor, poor $685 billion market cap behemoth with 80% or more of the discrete market share, twice the income, and a much higher profit margin - a behemoth that can move the market where it wants, when it wants, and sell hardware to consumers at much higher prices because consumers see it as the absolute king of the market, and that has had strong ties with game developers for at least the last 15 years - doesn't stand a chance against the evil junk maker AMD.

Is this the comedy section?
I thought about replying to this, then decided: why bother?
 
Is this the comedy section
Certainly feels that way!

  • Excusing the omission of highly desirable and easy-to-implement features
  • Likening Nvidia-sponsored titles with groundbreaking visual features and low fps to a broken, half-assed release that's missing easily added features
  • Parallel arguments about other companies that have nothing to do with this release
I suppose comedy is subjective, eh, but we can all laugh at the same things for different reasons.
 
Certainly feels that way!

  • Excusing the omission of highly desirable and easy-to-implement features
  • Likening Nvidia-sponsored titles with groundbreaking visual features and low fps to a broken, half-assed release that's missing easily added features
  • Parallel arguments about other companies that have nothing to do with this release
I suppose comedy is subjective, eh, but we can all laugh at the same things for different reasons.

Man, I just remembered DF's interview with the FSR2 lead developer, and AMD's intention was obvious nine months ago: they want to block DLSS/XeSS from games where they pay devs to include FSR 2.x.


Watch at 15:00, where Alex asks about Nvidia's Streamline API. Apparently the AMD engineer is arrogant enough to think their solution is the best for everyone, despite FSR 2.x being quite useless at 1440p and below.
 