
CPUs Bear Brunt of Ubisoft Deploying VMProtect Above Denuvo for AC:O

Out of curiosity, what do you consider opinion? That Witcher 3 has horseback riding that doesn't max out 4 CPU cores or that the AI in that video doesn't look above average?
I consider this opinion: "CPUs Bear Brunt of Ubisoft Deploying VMProtect Above Denuvo for AC:O". You know... the thing we are both commenting on.
 
Have you seen the draw distance and expanse of the game? Have you noticed, when the day/night cycle shifts, how huge the level is and how much dynamic stuff is constantly going on? This game has an incredible amount of detail and AI that isn't being chopped off by a veil of haze or boxed into a small close-quarters-combat area.

See, for example @56 seconds:
So standing still in a town on a bridge with 100% CPU on 4 cores and 36% is normal? This is assuming the person didn't force the CPU into some odd clocking. (There are too many reports of high CPU usage for me to claim they're forged.)

In a game menu, 60-80% CPU with nothing else running... how does a menu screen require 60-80% CPU? o_O Either DRM or a horribly designed game; either way, I'm never buying it.

The proof is a well-known, respected cracker, a credible expert, who says the DRM is queued every time you move, and more.

A well-known, credible cracker/programmer's word vs. a company's... I'll go with the former.
 
I haven't played it, but I have my doubts its AI is all that great. Their open worlds haven't been very dynamic so far; why would that change? I doubt it even reaches Bethesda's level of random kookiness. And if it did, it probably wouldn't need those boosted Metacritic reviews.
 
credible cracker
LOL, those two words...together.

I'm not saying Ubi is completely truthful here either, because something is wrong with performance. However, I'm more likely to believe results of people on this site who test the game.

As to Ubi, they are arrogant enough that if the 2nd DRM was actually to blame for the performance drop, they would tell us, and then tell us to "deal with it, it's because of pirates that we did it!"
 
What was the user's dynamic resolution setting? A low framerate target will force a low framerate and will still show high usage, since the game will scale its internal rendering resolution as high as possible while still holding the target framerate, buffer frames at that very high resolution, and then down-sample the output to match the monitor's resolution.

Users can make their games appear to perform at low framerates under very heavy system load simply by choosing settings that don't match their systems' abilities. Basically, the game will always try to use all of the system's resources, and the lower the framerate target is set, the more it looks like something is wrong, because the game keeps the load high even at low framerates. The benefit, of course, is improved image quality.
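The adaptive-quality loop described above can be sketched as a simple feedback controller on a render-scale knob. This is a minimal illustration, not Ubisoft's actual implementation (which isn't public); the class name, thresholds, and step sizes are all assumptions:

```python
# Minimal sketch of a dynamic-resolution feedback loop. The renderer is
# assumed to expose a "render scale" (fraction of native resolution);
# thresholds and step sizes here are arbitrary illustrative choices.

class RenderScaler:
    def __init__(self, target_fps=60, min_scale=0.5, max_scale=1.0):
        self.target_frame_time = 1.0 / target_fps  # frame budget in seconds
        self.min_scale = min_scale
        self.max_scale = max_scale
        self.scale = max_scale  # start at full resolution

    def update(self, last_frame_time):
        # Over budget: drop the internal resolution and upscale the output.
        if last_frame_time > self.target_frame_time * 1.05:
            self.scale = max(self.min_scale, self.scale * 0.95)
        # Comfortably under budget: spend the headroom on resolution again.
        elif last_frame_time < self.target_frame_time * 0.90:
            self.scale = min(self.max_scale, self.scale * 1.02)
        return self.scale

scaler = RenderScaler(target_fps=60)
# A slow frame (25 ms against a ~16.7 ms budget) lowers the render scale.
scaler.update(0.025)
```

A controller like this keeps the hardware near full load by design: any headroom gets spent raising the render scale, which is why high CPU/GPU usage by itself doesn't prove anything is wrong.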

https://forums.ubi.com/showthread.php/1759689
What is the maximum framerate of Assassin's Creed Origins on PC?
As on other platforms, Assassin's Creed Origins on PC uses a technology called Dynamic Resolution Rendering that adjusts the resolution on the fly to ensure a framerate as close as possible to the targeted one - in most cases it's totally invisible for the user. On PC, players can actually choose what baseline framerate they want to target: 30, 45, or 60 fps. They can also select their maximum framerate between 30, 45, 60, or 90 fps, or decide to completely uncap it.

Is it possible to turn off Dynamic Resolution Rendering?
Yes, by turning off the "Adaptive Quality" option in the menus - this will in fact deactivate Dynamic Resolution Rendering.

If there is indeed a problem, the most likely culprit is bugs and/or tuning needed in their dynamic resolution engine or the game in general, and the fix is for Ubisoft to patch those. This DRM stuff is completely unfounded. Can anyone provide PROOF of the DRM conspiracy? Nope.
 
I'm completely unfamiliar with Ubisoft's work, but does the DRM run in the same process as the main game? Because if it doesn't, that's your easy way of settling this for good.
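That suggestion can be tested by sampling per-process CPU time. Below is a minimal sketch, assuming you can obtain two snapshots of cumulative per-process CPU seconds (e.g. from Task Manager, `psutil`, or `/proc/<pid>/stat`); the process names are made up. One caveat: Denuvo and VMProtect are embedded in the game executable itself, so if the DRM runs in-process, a per-process breakdown can't separate it from the game's own work.

```python
def cpu_share(sample_a, sample_b, interval_s, cores):
    """Given two {name: cumulative_cpu_seconds} snapshots taken
    interval_s apart, return {name: percent_of_total_CPU_capacity}
    over that interval. How the snapshots are collected (psutil,
    /proc, Task Manager) is platform-specific and not shown here."""
    capacity = interval_s * cores  # total CPU-seconds available
    usage = {}
    for name, t_b in sample_b.items():
        t_a = sample_a.get(name, 0.0)
        usage[name] = 100.0 * (t_b - t_a) / capacity
    return usage

# Made-up example: over a 10 s window on a 4-core CPU, the game process
# burned 36 CPU-seconds (90% of capacity), a hypothetical external DRM
# helper only 2 (5%).
before = {"ACOrigins.exe": 100.0, "drm_helper.exe": 5.0}
after_ = {"ACOrigins.exe": 136.0, "drm_helper.exe": 7.0}
share = cpu_share(before, after_, interval_s=10, cores=4)
# share["ACOrigins.exe"] → 90.0, share["drm_helper.exe"] → 5.0
```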
 

IIRC, the first ver of Denuvo sucked some CPU cycles and it was noticeable. It's not a stretch of the imagination to assume the same here (true, proof would be nice). And it's Ubi; they're going to get blamed by default, because they're such dumb pricks.
 
IIRC, the first ver of Denuvo sucked some CPU cycles and it was noticeable.
That has never been proven either. After many years, it is still up for debate. Let's just say that the difference appears to be negligible.
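Whether a difference like that counts as negligible can be framed in rough statistical terms by comparing frame-time samples from a protected and an unprotected build. A minimal sketch using only the standard library; the frame-time numbers are invented for illustration:

```python
import statistics

def effect_size(frametimes_a, frametimes_b):
    """Mean frame-time difference (ms) between two benchmark runs,
    expressed relative to the pooled standard deviation: a crude
    effect size for "is this distinguishable from noise?"."""
    mean_a = statistics.mean(frametimes_a)
    mean_b = statistics.mean(frametimes_b)
    pooled_sd = statistics.stdev(frametimes_a + frametimes_b)
    return (mean_b - mean_a) / pooled_sd if pooled_sd else 0.0

# Invented samples: a "protected" build averaging ~0.2 ms slower than a
# "clean" build sits well inside run-to-run variation.
clean = [16.5, 16.8, 16.4, 17.0, 16.6]
drm   = [16.7, 17.0, 16.6, 17.2, 16.8]
effect = effect_size(clean, drm)
```

An effect size well under one pooled standard deviation would be hard to tell apart from run-to-run noise, which is what "negligible" amounts to here.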
 