
Star Wars Outlaws Performance Benchmark

All of these recent performance analysis game reviews keep reminding me that I seriously need to start looking for an upgrade to my 2060 lol.
 
Not only that, but apparently the 3060 Ti is beating the 6700 XT in both average and 1% lows, despite the extra 4GB of VRAM on the 6700 XT.
This is yet another case where you run out of raw performance before you run into VRAM issues anyway. 'I keep saying this because it's my experience with the latest games and my 3060 Ti: it simply runs out of performance, not VRAM. By the time I run out of VRAM, I already have to lower the settings anyway AND use DLSS on top to keep things decently playable.'

It may not even be raw-performance-limited, as AMD hasn't released game-ready drivers yet. That's merely a possible reason, not an excuse: AMD hasn't even released game-ready drivers for last week's new game darling, Wukong.

Any [s]day[/s] week now, AMD.
 
Right :roll: AMD has more VRAM and a wider bus, so that's not the problem; core speed is. Also, the game is an Nvidia-sponsored title, so please. Still, I'd be more than happy to take a 7900 XTX for a grand rather than a 4090 for two. Any day.
Computerbase has done tests on this with AMD & Nvidia 8GB GPUs where the AMD cards run into VRAM problems faster than the Nvidia ones. The assumption is that Nvidia has better VRAM management or data compression.

Their previous title was AMD-sponsored and it showed much the same results. That's to be expected, as both titles use RT by default.

Not sure why you're making the 7900 XTX vs 4090 comparison as if they're comparable cards, though. Is 4080 vs 7900 XTX not as good a comparison for your argument?
 
Now we see Nvidia, with less VRAM, getting better performance; that suggests AMD has a big problem with VRAM management.
Computerbase has done tests on this with AMD & Nvidia 8GB GPUs where the AMD cards run into VRAM problems faster than the Nvidia ones. The assumption is that Nvidia has better VRAM management or data compression.

I would say that a single game out of many demonstrating an issue is more indicative of behavior specific to that game than of some broader pattern across AMD GPUs in general. Although I do believe AMD cards tend to use a bit more VRAM in general.
 
Nvidia has a bunch of different SDK tricks to reduce VRAM footprint. It's basically "lossless" compression; if the devs used those tools (NTC, the texture exporter, etc.), they would save some space.
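
To put rough numbers on the kind of savings those tools chase, here's a quick sketch using classic block compression (BCn) as the baseline; NTC's neural approach targets even higher ratios, and the 4K texture size is just an illustrative assumption:

```python
# Rough VRAM math for a single texture: uncompressed RGBA8 vs. BC7.
# BCn block compression stores 16 bytes per 4x4 texel block.
def texture_mib(width: int, height: int, bytes_per_texel: float) -> float:
    """Mip-0 size of a texture in MiB."""
    return width * height * bytes_per_texel / 2**20

rgba8 = texture_mib(4096, 4096, 4)  # RGBA8: 4 bytes/texel
bc7   = texture_mib(4096, 4096, 1)  # BC7: 16 B per 4x4 block = 1 B/texel
print(f"RGBA8: {rgba8:.0f} MiB, BC7: {bc7:.0f} MiB ({rgba8 / bc7:.0f}:1)")
# -> RGBA8: 64 MiB, BC7: 16 MiB (4:1)
```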
 
It may not even be raw-performance-limited, as AMD hasn't released game-ready drivers yet. That's merely a possible reason, not an excuse: AMD hasn't even released game-ready drivers for last week's new game darling, Wukong.

Any [s]day[/s] week now, AMD.
That is possible, sure, but even then it's not like the game is suddenly much worse on an 8GB card, because even if I had more VRAM it wouldn't help with the lack of performance, which has been the case in the more recent titles I've tried/played so far. 'Especially UE5 games, where I'm actually totally fine with my 8GB but the performance is severely lacking the moment I dial up the settings.'
I mean, yeah, I guess when both cards become borderline obsolete/unplayable in new games, you can enjoy higher-res textures on the 12GB card with everything else on low. :laugh: 'At that point I would rather upgrade or not play the game at all, but that's just me.'
 
Someone on Reddit wrote that you can unlock a hidden "Outlaw" quality setting, similar to the "Unobtanium" setting in Avatar. Search it yourself; I'm too drunk and tired right now.
 
These graphics look like less than what 20GB of VRAM use would warrant.
That's 21GB allocated, to avoid the potential stutter that occurs when the GPU needs to shuffle textures in and out of a smaller VRAM buffer.
Allocated is not the same as in use.

Given that 8GB cards are running it okay, it's not the unoptimised VRAM hog that TLoU:P1 was at launch.
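
If anyone wants to watch that allocated figure for themselves while the game runs, here's a minimal sketch using the NVML Python bindings (Nvidia cards only; assumes the nvidia-ml-py package). Note that NVML's "used" figure is driver-side allocation, not the working set the game actually touches each frame:

```python
# Poll total vs. allocated VRAM on the first Nvidia GPU.
# Requires: pip install nvidia-ml-py
from pynvml import (nvmlInit, nvmlShutdown,
                    nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo)

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)
    mem = nvmlDeviceGetMemoryInfo(handle)
    # "used" = allocated by the driver, not actively-referenced data
    print(f"total:     {mem.total / 2**30:.1f} GiB")
    print(f"allocated: {mem.used / 2**30:.1f} GiB")
    print(f"free:      {mem.free / 2**30:.1f} GiB")
finally:
    nvmlShutdown()
```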
 
Oh boy. The 8GB brigade is gonna be soiling another round of pants with this one.
Why? If it's there, why not use it?
Because 640k should be enough for anyone.
 
That is possible, sure, but even then it's not like the game is suddenly much worse on an 8GB card, because even if I had more VRAM it wouldn't help with the lack of performance, which has been the case in the more recent titles I've tried/played so far. 'Especially UE5 games, where I'm actually totally fine with my 8GB but the performance is severely lacking the moment I dial up the settings.'
I mean, yeah, I guess when both cards become borderline obsolete/unplayable in new games, you can enjoy higher-res textures on the 12GB card with everything else on low. :laugh: 'At that point I would rather upgrade or not play the game at all, but that's just me.'

I was more complaining about the lack of performance on the AMD side for this game and Wukong, not so much about any lack of VRAM. I tested Wukong with unsupported Nvidia drivers and then the game-ready ones and got a 7-8% performance increase on the card I tested, so AMD-supported drivers could give a similar uplift.

However, as you said, the one refuge of more VRAM is being able to pump up the textures later in the card's life, though there are some games that stutter today when they run out of VRAM, like Hogwarts Legacy. I struggled with poor performance even when not CPU-bound (also a problem this game has) until I reduced the textures to fit my 8GB (and even 6GB!) cards, and now the game runs quite smoothly.

And then there are well-designed games like this one, which are happy to use all 20+GB if you've got it but still perform very well with "only" 8GB (identical 4060 Ti 8GB vs 16GB performance).

@W1zzard could we keep at least one 6GB or even 4GB card in the mix for testing to see how game engines deal with even less VRAM? Not on day 1 but maybe added to the mix later on. Like a 1650 Super and 3050 6GB or similar? 6500 XT and 5600 XT as well if possible.
 
Not only that, but apparently the 3060 Ti is beating the 6700 XT in both average and 1% lows, despite the extra 4GB of VRAM on the 6700 XT.
This is yet another case where you run out of raw performance before you run into VRAM issues anyway. 'I keep saying this because it's my experience with the latest games and my 3060 Ti: it simply runs out of performance, not VRAM. By the time I run out of VRAM, I already have to lower the settings anyway AND use DLSS on top to keep things decently playable.'
It's just a case where, I think, the Snowdrop engine loads up VRAM even if the data ends up not being needed, just because it can. Sadly, people will ignore that the game runs fine even on lower-VRAM cards because they want something to complain about.
 
And then there are well-designed games like this one, which are happy to use all 20+GB if you've got it but still perform very well with "only" 8GB (identical 4060 Ti 8GB vs 16GB performance).

The identical-performance part may come at the price of degraded visuals, and the 20+GB part may be sloppy texture management that slowly creeps up.
 
It's hard not to get fixated on your main character in a third-person game who is on screen probably 98% of the time (she was modelled after a stunning woman, btw). And based on videos I've seen, it's not only her, but basically every character in the game, except for your pet, Nix. I guess Ubisoft tried to be as generic as they could for a wide appeal, but that's how the game lost its charm.

As for the game, it appears a bit too stealthy for me, and based on reviews, the stealth mechanics aren't great, either. Such a shame. I might buy it on a massive discount someday, but for now, it's a pass for me.

I care more about the environments and atmosphere in games like this; 90% of the time I'm looking at the back of the protagonist's head. The environments can look pretty darn good, though, at least on a big OLED screen.

The bigger issue with this game is that the stealth and gunplay look generic.

My comment wasn't just about this game; I heard the same weird things about how the protagonist looks in Horizon Forbidden West, and about Silent Hill 2's supporting female characters after that trailer dropped. I just find it weird, I guess.

Don't get me wrong: if the way a particular character looks is enough to get somebody to not buy a game, more power to them; it's their money after all.
 
I don't think this is necessarily bad as a wandering experience, just to take in the world of Star Wars, but the actual gameplay seems lacking.

This game would have been a perfect candidate for modding; it feels a bit like Fallout 4 in that way. If Ubisoft allowed modding of Star Wars Outlaws, there would be a lot of sales years down the line, but we all know modding is like a dirty word in 2024. It's just not going to happen.

Anyway, interesting graphics scaling. For me, High + RR feels like a good middle ground; I can't really justify RTXDI yet.
 
Are you talking about the protagonist's character model? If so, your bar is set awfully high.
Ubisoft has set the bar higher itself in previous games. Comparing the main character's model in this to Kassandra from Assassin's Creed: Odyssey (a six-year-old game) is like a generational leap in the wrong direction. The facial animation is especially poor in the footage I've seen.
 
Star Wars Jedi: Survivor will allocate even more VRAM; I've seen it use nearly all of my 24GB.
 
I'm not sure why Ray Reconstruction comes with such a big performance hit; usually the difference should be much smaller.
It's pretty evident the number of samples per pixel is massively increased when RR is active. RR is supposed to replace the denoiser, but if the input has too low a sample count it probably doesn't look right, so I guess they just increased the sample count to make it work and destroyed performance in the process.
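
As a back-of-the-envelope illustration of why that would hurt (the sample counts below are purely my assumption; the game doesn't expose them):

```python
# Ray-tracing cost scales roughly linearly with samples per pixel (spp).
# These spp values are illustrative guesses, not measured game data.
pixels = 2560 * 1440                  # 1440p frame
spp_denoised, spp_rr = 1, 4           # hypothetical: denoiser vs. RR input
print(f"denoiser path: {pixels * spp_denoised / 1e6:.1f}M samples/frame")
print(f"RR path:       {pixels * spp_rr / 1e6:.1f}M samples/frame "
      f"({spp_rr // spp_denoised}x the tracing work)")
```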
 
Imho, the faces look much better, and RTXDI brings significant advantages even over the Ultra setting, yet overall this looks worse than The Witcher 3, which is almost 10 years old now.
 
W1zz... I'm curious, are you using the legendary Dell 3007WFP? The 2560x1600 thing intrigues me. :laugh:

Imho, the faces look much better, and RTXDI brings significant advantages even over the Ultra setting, yet overall this looks worse than The Witcher 3, which is almost 10 years old now.

Well, that's the importance of art style. This game may have accurate and advanced rendering techniques, but that doesn't mean much if you don't like the art style. As for me, admittedly, I'm turned off by knowing which developer is responsible for this game. IF I ever pick it up, it'll be when it's finally 75% off. I simply loathe Ubisoft.
 
Calling it Star Wars x Far Cry is the first thing that has truly got me interested; I might give it a look after some patch maturation. Crazy to see how well the 3080 is doing, beating the 6900 XT/6800 XT/7800 XT consistently, often matching a 7900 GRE, and sometimes equaling the 4070 Ti... Great to see they have memory management in order, using what's available smartly and allowing for the best possible experience on all cards.
 
A typo on the conclusions page:

Got a 1440p monitor? Then you need a RTX 3090, RTX 3070 Super, RX 7900 XT and faster.

I'm sure your intention was to say RTX 4070 Super; the RTX 3070 Super doesn't exist.
 
Are you talking about the protagonist's character model? If so, your bar is set awfully high.
It's not that she's an ugly human being (well, she isn't exactly beautiful, either, but that's beside the point), but that her proportions are wrong, her hair covers most of her face, and she's got very low-res textures and piss-poor facial animations. None of the above can be said about the woman Ubisoft modelled her after.

I care more about the environments and atmosphere in games like this; 90% of the time I'm looking at the back of the protagonist's head. The environments can look pretty darn good, though, at least on a big OLED screen.
I'm the same (an environmentalist gamer), but that's another issue with the game. Tatooine is the only planet in it that has any meaning for a SW fan, and even that's done poorly. The whole thing is just full of empty spaces and quests that send you from A to B for no reason whatsoever, so you spend most of the time on loading screens.

The bigger issue with this game is that the stealth and gunplay look generic.
Gunplay looks generic, but stealth looks crap, imo (I don't like stealth in general, but still). A single detection is enough for game over: no second chances, no alternate solutions, just game over, that's it. Maps are full of cameras that you can't disable, and explosive barrels that are usable, but using them is just another instant failure. What the hell, seriously? I don't know about other people, but to me, this seems more like a chore than fun.

My comment wasn't just about this game; I heard the same weird things about how the protagonist looks in Horizon Forbidden West, and about Silent Hill 2's supporting female characters after that trailer dropped. I just find it weird, I guess.
Fair enough. But still, movie and video game protagonists should be aspirational characters, something above the norm one way or another, and not basic, bland nobodies. If I don't find anything special in a living, breathing (not Gordon Freeman-like) protagonist, then why should I care about their story? When I started watching the trailer for Forspoken and it opened with the narrator saying "I'm an average girl, I live in New York, and I like cats", that was it for me. Instant turn-off. If that's all the developers can say, then I really won't give a damn.

Don't get me wrong: if the way a particular character looks is enough to get somebody to not buy a game, more power to them; it's their money after all.
That I agree with. Unfortunately, there's all the above as well.
 
Another sad day for AMD GPUs. I might have to bite the bullet and go for a 5080 this gen. The 8800 XT won't do better than the 7900 XT in raster even if you don't care about RT, and it might only match the 4070 Ti in RT.
 