
Star Wars Outlaws Performance Benchmark

It's not that she's an ugly human being (well, she isn't exactly beautiful either, but that's beside the point), but the fact that her proportions are wrong, her hair covers most of her face, and she's got very low-res textures and piss-poor facial animations. None of the above can be said about the woman Ubisoft modelled her after.


I'm the same (an environmentalist gamer), but that's another issue with the game. Tatooine is the only planet in it that has any meaning for a SW fan, and even that's done poorly. The whole thing is just full of empty spaces and quests that send you from A to B for no reason whatsoever, so you spend most of the time on loading screens.


Gunplay looks generic, but stealth looks crap, imo (I don't like stealth in general, but still). One single detection is enough for game over, no second chances, no alternate solutions, just game over, that's it. Maps are full of cameras that you can't disable, and explosive barrels that are usable, but doing so is just another instant failure. What the hell, seriously? I don't know about other people, but to me, this seems more like a chore than fun.


Fair enough. But still, movie and video game protagonists should be aspirational characters, something over the norm one way or another, and not basic, bland nobodies. If I don't find anything special in a living, breathing (not Gordon Freeman-like) protagonist, then why should I care about their story? When I started watching the trailer for Forspoken, and it started with the narrator saying "I'm an average girl, I live in New York, and I like cats", that was it for me. Instant turn off. If that's all the developers can say, then I really won't give a damn.


That I agree with. Unfortunately, there's all the above as well.

I only watched the DF video of the console version, which visually looked fine to me. I'm also not overly invested in Star Wars, at least not after Disney destroyed it lol.
 
My wife got this game yesterday. She's running a 5800X3D, an RX 6900 XT, and 64GB RAM on a 4K120 display. Looks good. She's running Ultra in cinematic mode, which I suspect is helping the graphics card keep the framerate up. She's very happy with the game so far, and she's a very picky bitch when it comes to games. It is weird that there isn't a driver update for AMD, though.
 

Funnily enough I've only followed this game somewhat because I think my wife would like it as well.
 

Michele said it feels like Jedi FO and Tomb Raider.
 

My wife loves all the AC games and Tomb Raider; I might grab it for her when she's on winter break, and it'll probably be on a decent sale by then.
 
RIP RTX 4080 Super 16GB: 18-21GB of VRAM usage at 1080p, 1440p, and 2160p... damn it, man, the RTX 3090 24GB has real staying power; it's better than the RTX 4080 16GB today!
 
Nah, the 4080 is fine; it's just a game that allocates all the VRAM it can, even though it doesn't help. The 3090 only does 60 fps at 1440p; the 4080 does 86.
 
It's impressive just how little visual difference there is between Low and Ultra with RT. A shame you only included interior shots in the comparisons, though.
 
I wonder if Star Wars Outlaws handles VRAM restrictions like Halo, loading in low-quality assets to prevent the fps from being butchered?
I see no mention of it in the review as far as I can tell.
Many textures drop to low-res for me if I go above my 12 gigs at 4K.
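For reference, the Halo-style behaviour I mean is the texture streamer dropping mip levels once its pool is full. A purely hypothetical sketch of that idea (nothing confirmed about how Outlaws does it; the function names and numbers are mine):

```python
# Hypothetical illustration of "load lower-quality assets instead of tanking
# the framerate": when the requested textures no longer fit the VRAM budget,
# drop the most expensive ones to the next-smaller mip until they do.
def mip_size_mb(base_size_mb: float, mip: int) -> float:
    """Each mip level is a quarter of the previous one (half width, half height)."""
    return base_size_mb / (4 ** mip)

def pick_mips(textures_mb: list[float], budget_mb: float) -> list[int]:
    """Return a mip level per texture so the total fits within the budget."""
    mips = [0] * len(textures_mb)  # start with full-resolution mips
    total = sum(textures_mb)
    while total > budget_mb:
        # Downgrade whichever texture currently costs the most memory.
        i = max(range(len(textures_mb)),
                key=lambda k: mip_size_mb(textures_mb[k], mips[k]))
        mips[i] += 1
        total = sum(mip_size_mb(t, m) for t, m in zip(textures_mb, mips))
    return mips

# Example: 3 GB worth of full-res textures squeezed into a 2 GB budget.
print(pick_mips([1024, 1024, 1024], 2048))  # -> [1, 1, 0]
```

The visible result is exactly what I'm seeing: the frame rate holds, but some textures render a mip or two lower than they should.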
 
This game has RT settings inside the Advanced Settings. Even if you disable RT (probably path tracing) in the settings outside of Advanced Settings, RT is still running anyway; on the High preset it's just set to Medium, which is not as demanding as High or Ultra. But in that case, all RTX GPUs will be running better than AMD anyway.
 
W1zz... I'm curious, are you using the legendary Dell 3007WFP? The 2560x1600 thing intrigues me. :laugh:
Dell U3011 :) 16:10 perfect for productivity, perfect size, bigger than 27", not as big as 32", 1:1 DPI scaling, and it seems that it'll never die. Using an old 1280x1024 17" Eizo on the side for Outlook and to check site layout on smaller screens
 
More than 20 gigs of allocated VRAM at fully maxed-out 4K is… something. Out of curiosity, if 4K Ultra at 60 frames is only 4090 territory, how bad is the framerate even on that card with RT and RTXDI and all that WITHOUT FG? Unplayable?
When will you people learn? More VRAM = more allocation.
I am at 18-20GB in tons of games on my 4090.

However, a friend of mine with a 4070 Ti 12GB runs the exact same settings with half that usage.

Most game engines allocate a given percentage of available VRAM, 80% for example: 0.8 x 24 = 19.2 GB.

ALLOCATION means nothing really. FPS, and especially the minimum lows, is what matters.
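To put rough numbers on the "given percentage" point, here's a trivial sketch; the 80% figure and the pool logic are just my assumptions, not anything pulled from this game or the review:

```python
# Hypothetical: an engine that sizes its texture streaming pool as a fixed
# fraction of whatever VRAM the card reports, regardless of what a frame
# actually needs. Allocation then scales with card size, not with the game.
def streaming_pool_gb(total_vram_gb: float, fraction: float = 0.8) -> float:
    """Reserve a fixed share of total VRAM for the streaming pool."""
    return total_vram_gb * fraction

for card, vram in [("24GB card", 24), ("16GB card", 16), ("12GB card", 12)]:
    print(f"{card}: ~{streaming_pool_gb(vram):.1f} GB allocated at identical settings")
```

Which is why a 4090 can sit at ~19 GB "used" while a 12GB card at the same settings reports roughly half of that.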
 

Exactly, just looking at allocation provides no real useful info; you need the 1% lows.
Looking at the 8GB and 16GB 4060 Ti appears to show no real difference (WHY DO WE NEED EXTRA VRAM!?),
but I bet the actual gameplay (performance and visuals) tells a vastly different story.
 

So, there's no stutter at max VRAM usage in this game? 18-21GB of VRAM at 1080p, 1440p, and 2160p on the 16GB RTX 4000 series?
 
VRAM allocation =/= VRAM usage, we all know that. The thing is that it's extremely difficult to objectively test for actual VRAM usage. A lot of games these days have countermeasures against VRAM overflow, like texture resolution changes, or intermittent area streaming/loading which may or may not stress your CPU and RAM, so you may or may not see a difference in a benchmark scenario. You might need to move away or leave your game area entirely to see a hitch, and then, you might not even see that hitch, depending on how sensitive you are to these things. Some games only start to stutter once the VRAM fills up, which may not happen right away, just X minutes into the game. There's lots of different things, and unfortunately (or fortunately?) personal sensitivity level to performance and detail changes is a factor, too.
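If anyone wants to watch this for themselves, here's a minimal sketch using the nvidia-ml-py (pynvml) bindings to log device-wide VRAM over time while playing. Keep in mind that NVML reports memory that has been allocated/reserved on the card, not the working set a frame actually touches, so these numbers alone still won't tell you whether you'll stutter:

```python
# Minimal VRAM logger using the pynvml bindings (pip install nvidia-ml-py).
# Reports allocation on the whole device, once per second, until Ctrl+C.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # .total/.used/.free in bytes
        print(f"VRAM allocated: {mem.used / 1024**3:.1f} / {mem.total / 1024**3:.1f} GB")
        time.sleep(1.0)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```

Correlating that log with a frametime capture is still the only way to tell whether a given spike actually turned into a hitch.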
 
Haha no. Even 8GB cards are doing well.

Look at the 4K/UHD testing, minimum fps. RTX 3070 8GB beats RX 6800 16GB.

None of these cards are able to run the game at these settings but 3070 8GB has higher average and minimum fps, meaning 8GB is plenty.

Game developers are not stupid. Like 80-90% of PC gamers have 8GB or less.
Even XSX and PS5 have 16GB total RAM, shared between OS, system and graphics.

Pretty much no games need massive amounts of VRAM, unless you absolutely max out the game with RT or path tracing, and then most GPUs will buckle anyway. GPU power is a problem long before VRAM in 99.9% of games. Ray tracing and path tracing will eat a lot of VRAM, but most GPUs with "low VRAM" won't be able to do RT/PT well anyway.
 
I guess neither the 1070 nor the 3060 GPU-based laptops I currently have will run this game well.

I'm heartbroken that I can't give money to either Disney or Ubisoft. Two stand-up, consumer-comes-first companies who are beloved by the masses. Dang, darn, damn.

I will carry on... somehow.
 
It's impressive just how little visual difference there is between Low and Ultra with RT. A shame you only included interior shots in the comparisons, though.
Yeah, if you look at those screenshots on the small screen of your phone. There is a difference in-game on a 4K TV, though, especially between RT and Low; between Ultra and High the quality difference is almost non-existent, but the FPS difference is massive.
 
Now we see Nvidia with less VRAM having better performance, which means AMD has a big problem with VRAM management.

I do not know how much DRAM is used as a temporary buffer. I expect 64 GiB of DRAM with fast timings and speeds will help.
When the bus is fast enough, data can be exchanged quite quickly between DRAM and the GPU's VRAM.

How the silicon, software, and firmware of a graphics card work also matters.
 
I care more about the environments and atmosphere; in games like this, 90% of the time I am looking at the back of the protagonist's head... The environments can look pretty darn good, at least on a big OLED screen, though.

The bigger issue with this game is that the stealth and gunplay look generic.

My comment wasn't just about this game; I heard the same weird things about how the protagonist looks in Horizon Forbidden West, and about Silent Hill 2's supporting female characters after the trailer dropped. I just find it weird, I guess.

Don't get me wrong: if the way a particular character looks is enough to get somebody to not buy a game, more power to them; it's their money, after all.
It's the usual outrage culture on one hand, and on the other I think it's the genuine feeling underneath that art direction is not much of a creative process anymore and more a political minefield that's being navigated.

It's a shame, because neither the outrage nor the politically correct art direction is helping games get better. I think people genuinely just have a strong radar for 'generic junk', and a lot of creative decisions seem to nudge towards generic rather than original.

I only watched the DF video of the console version, which visually looked fine to me. I'm also not overly invested in Star Wars, at least not after Disney destroyed it lol.
Well, then, you've just hit the nail on Kay's head there. To some, me included, Star Wars as a franchise has been relegated to the 'dime-a-dozen space opera' corner. It has to have its yearly installments now, etc. It's the same issue that plagues so many franchises that keep getting pushed to regular releases. They lose their soul, and any off chance of a talented team bringing it to new heights is culled by corporate guidelines.
 
I know that something is Nvidia optimized, when RX 6600 and RTX 3050 score almost the same.
 

A few things to consider: Radeons are performing poorly throughout. AMD has failed to release a game-ready driver to the public thus far, and the RX 6600 has a narrow memory interface all the same. Despite the scale, architectural strengths and weaknesses still apply. If the 6500 XT performs exceptionally poorly, it might just be a case of not enough memory bandwidth. I don't believe it's a grand conspiracy.
 
Double standards. When games come out, the lack of DLSS support and/or bad performance on Nvidia hardware is promoted as an anticompetitive move by AMD.
When FSR is not supported, or Radeons are performing poorly, it's again AMD's fault.

Anyway, with Nvidia controlling 80% of the market, games, especially those sponsored by Nvidia, are highly optimized for Nvidia's hardware. They are only tested for compatibility on Intel and AMD hardware. Intel and AMD cards will need something more than just a new driver, and that would probably be a patch from the game developers in 1-3 months from now.
 
@W1zzard your BIOS is a little out of date, btw


especially for the 14900K and the new microcode Intel is pushing (dunno if this has been mentioned)

 