
Assassin's Creed Odyssey: Benchmark Performance Analysis

Yeah, very poorly optimized IMO. It's completely unacceptable that a game this demanding doesn't have multi-GPU support. I definitely won't buy this one.
Honestly, it looks like some people are offended because they paid huge sums for the best hardware available and a game can't be run at the highest settings. :-P

The highest settings are too much for the GPUs we have today, but they won't be too much for the GPUs we get in 1-2 years. Games are expected to "last" longer than GPUs, aren't they? :-)

Actually it's not that bad either. The game will happily run on fairly pedestrian hardware once you limit some effects.
Look for an Xbox One S demo. The game looks fine despite the console being weaker than a GTX 1050.

I miss the times when GPU reviews included settings a step below maximum. The difference is often hard to notice on screen, but it costs a lot of FPS.
 
AMD-sponsored games are promoted as excellent-performance games. So what happened? What AMD black boxes were used?
AMD games are usually well optimised, granted, but I should remind you that the AC series has run like crap since at least Unity.

And yes, he is right: many were promoting the 2080 Ti as the ultimate single card for 4K 60 fps. Around the launch of the RTX series, three AAA games were released: Forza Horizon 4, Shadow of the Tomb Raider and AC Odyssey. FH4 runs like heaven on most cards. In the NV-sponsored SotTR, the 2080 Ti can't even manage a 60 fps average, and in AC Odyssey it's around 45. Two of the three games can't average 60 fps, let alone a locked 60 fps. But nice try from an "unbiased" user with Intel in the CPU line and NV in the GPU line. :D
 
I should remind you that the AC series has run like crap since at least Unity.
Really? Unity had issues the first year; it runs and plays fine now. Syndicate ran great. Origins not only ran well, it was also the best-looking yet at the time. The only catch was that you needed a decent CPU to tackle the two layers of DRM.
 
I appreciate the effort, but for a game like this, "Very High" and "High" benchmarks are important too, since it's a heavy performance hitter on Ultra.
 
Is Ryzen really that much faster than the i7? Or is undervolting adding that much performance? Only @W1zzard can test that to make sure, I guess...

Yes, it would be good if W1zzard could test it again, because I'm surely not getting 26 fps even with .ini tweaking maxed out. Then again, I'm using the 18.9.3 driver; why is he using the crappy 18.9.2 beta driver? That driver was horrible. I reverted back to 18.5 because of it.
 
Actually it's not that bad either. The game will happily run on fairly pedestrian hardware once you limit some effects.
Look for an Xbox One S demo. The game looks fine despite the console being weaker than a GTX 1050.


I'm still surprised how well my A10-5800K + 1050 Ti runs games. For an APU (which a lot of people slag off as total garbage) and a low-end graphics card, things are all good for the low-end user. If you tweak the game settings, you'll be surprised by what games you can play. Graphics have come quite a way, so even on medium settings a lot of games look good and play well on low-end systems.

For example, even my low-end setup plays Forza 6 at 4K locked at 30 fps on medium settings with no issues or stuttering. I was expecting a slideshow, so I'm more than happy with what's on offer at the low end, lol.
 
Honestly, it looks like some people are offended because they paid huge sums for the best hardware available and a game can't be run at the highest settings. :p


Some might be. I'm not so much offended personally as I am just surprised by the developer's decisions. I simply won't buy it due to the lack of multi-GPU support. I'll still play it via other means (likely at 1440p @ 60 fps), but it's just asinine to release a game that even the brand-new $1200 2080 Ti can't run steadily at 4K @ 60 fps and decide not to implement multi-GPU support.
 
Has anyone tried running this with a 4670K or similar? I couldn't get a stable 60 FPS in Origins with my system: CPU at 100% in cities, GPU at 60% at most. My guess is this would be the same story?
 
Some might be. I'm not so much offended personally as I am just surprised by the developer's decisions. I simply won't buy it due to the lack of multi-GPU support. I'll still play it via other means (likely at 1440p @ 60 fps), but it's just asinine to release a game that even the brand-new $1200 2080 Ti can't run steadily at 4K @ 60 fps and decide not to implement multi-GPU support.
I think we're already way past the point where implementing multi-GPU costs more than the revenue gained from multi-GPU clients, especially for such a mainstream title.

And again, the thing I mentioned earlier: would it be more acceptable to you if this game had launched with lower requirements at the cost of (I'd imagine slightly) lower image quality? Ubisoft could boost details in a future patch (like many MMORPGs do).
 
45 fps?
Fake.
In the video the GPU isn't heated up properly. If it's cold from idle, it'll boost higher for about 20 seconds and then drop clocks a bit.
 
I think we're already way past the point where implementing multi-GPU costs more than the revenue gained from multi-GPU clients, especially for such a mainstream title.

And again, the thing I mentioned earlier: would it be more acceptable to you if this game had launched with lower requirements at the cost of (I'd imagine slightly) lower image quality? Ubisoft could boost details in a future patch (like many MMORPGs do).

I think the money argument against implementing multi-GPU support is rendered null and void when you consider that Ubisoft continually implements multiple forms of DRM that don't prevent piracy. I can't even imagine what the licensing costs must be for Denuvo and VMProtect; they're clearly not worried about profit margin when implementing those DRMs, since the games still get cracked. I know many folks who refuse to buy the games simply because Denuvo is present (I'm not one of them, but there are plenty of people out there in that group).

I'm sure implementing multi-GPU support would cost nowhere near those licensing fees, and you'd gain another demographic who would want to buy the game, as well as happier customers in general who could play with better IQ settings and framerates. To me it really just comes down to: are you taking care of your customers? In Ubisoft's case here, and with AC Origins, the answer for me is a solid no. As a result, I have to decide whether to reward the developer for that with my money, and I simply choose not to. I'll play the game eventually, just as I did with Origins, but literally the only two variables in play here are: a) does Ubisoft make any money off of me, and b) what resolution do I play at?

To answer your second point: maybe a little, but I think they'd have to drop requirements quite a bit for me to be able to play at 4K @ 60 fps, and having to reduce image quality would be a pretty silly solution when the technology exists to let multiple GPUs handle it; the developer just chooses not to implement it. If I could play at 4K @ 60 fps with everything cranked on just one of my 1080 Ti cards, I'd go ahead and buy it, though. Hopefully that answers your question.
 
The fanboyism in this comment thread is sickening.

Did I mention AMD? I was criticizing the performance of the 2080 Ti. Triggered much?



meh..

Did he say you mentioned AMD? He's just pointing out that even AMD's high-end GPU can't hit 60 fps at 1080p, even though this game is actually an AMD-sponsored and optimized title. So by what logic are you complaining that the RTX 2080 Ti can't hit 60 fps at 4K? Yes, the price of the RTX 2080 Ti is absolutely absurd and anti-consumer, but it seems like you're forgetting all the cards on the table here. This game is simply too demanding to run at 4K, and there are hundreds of graphically beautiful games that run at 4K 60 fps+ on an RTX 2080 Ti. One overly demanding game missing the mark doesn't make it "4K un-ready". Nvidia can't do anything if a developer forgets to optimize their game. I blame AMD in this case, because they're the ones that collaborated with Ubisoft on this game's performance, not Nvidia.
 
I just tried to load AC Odyssey on my PC, and on both Win7 with AMD 19.3.2 drivers and Win7 with AMD 19.2.1 drivers I got crashes or BSODs. Stock settings for both CPU & GPU. Is anyone else getting this behaviour in the latest version of the game?
 
I just tried to load AC Odyssey on my PC, and on both Win7 with AMD 19.3.2 drivers and Win7 with AMD 19.2.1 drivers I got crashes or BSODs. Stock settings for both CPU & GPU. Is anyone else getting this behaviour in the latest version of the game?

Sounds normal for ubi.
 
Sounds normal for ubi.
No, it's not normal. This has been a fantastically stable game for the majority, and easily the best AC since II.

In fact, Ubi games on Uplay have been the most stable games for me for years.
 
No, it's not normal. This has been a fantastically stable game for the majority, and easily the best AC since II.

In fact, Ubi games on Uplay have been the most stable games for me for years.

Yeah, sure, just like R6 Siege was until the last few updates, lol. They can break it at any time.

They're beyond incompetent to me.
 
Generalities based on your dislike of Ubi and predicated on one game are hardly useful. I wouldn’t presume to make a comment about the stability or quality of Rainbow 6 unless I had played it. I recommend you do the same.
 
Playing it right now with no issues at all. It's a system hog though.
 
Volumetric Clouds is the biggest hog of all the settings: 10 FPS or more between Very High and Ultra.

[Attached screenshot: Assassin's Creed Odyssey, 2019-03-18]
 
I just got it as well. With the benchmark on "Very High", my i5-4690K @ 4.7 GHz with a GTX 1070 Ti (stock) on a 1920x1200 monitor got 69 FPS with a minimum of 40 FPS. I'm quite pleased with the performance on such an old CPU (albeit with a decent overclock). On "High" I got 78 FPS. I only have a 60 Hz monitor, so these kinds of results are fine.
 
I just got it as well. With the benchmark on "Very High", my i5-4690K @ 4.7 GHz with a GTX 1070 Ti (stock) on a 1920x1200 monitor got 69 FPS with a minimum of 40 FPS. I'm quite pleased with the performance on such an old CPU (albeit with a decent overclock). On "High" I got 78 FPS. I only have a 60 Hz monitor, so these kinds of results are fine.

Yeah, that's really good. Using the Very High preset at 1440p, I'm only getting an average of 68 fps in actual gameplay. I turned off Depth of Field, set Anti-Aliasing to low, cut Ambient Occlusion to high, and then the main culprit: those Volumetric Clouds. After that you can set other settings to Ultra and the resolution modifier to 120% and still see a performance gain. This game likes to be tinkered with.

[Attached screenshot: Assassin's Creed Odyssey, 2019-03-19]
 