
Anthem VIP Demo Benchmarked on all GeForce RTX & Vega Cards

Lol, if you know a game is CPU intensive yet don't even realize your system RAM might be holding you back, maybe research harder next time before you start your bashing. I mean, W1zzard benchmarks with 16 GB @ 3867 MHz 18-19-19-39 (as in every other video card review he does), compared to your 16 GB @ 2666 MHz OC'd to 2800 MHz at who knows what timings, could result in massive FPS differences. BTW, I would think twice about pairing a highest-end GPU with bargain-bin RAM, lol.
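For a rough sense of what those memory speeds mean on paper, here is a minimal back-of-the-envelope sketch. It assumes dual-channel DDR4 with a 64-bit bus per channel and ignores timings entirely, so it is a theoretical upper bound rather than a measured result; the kit labels are just placeholders for the two configurations mentioned above.

```python
# Rough theoretical peak bandwidth for dual-channel DDR4.
# Assumes a 64-bit (8-byte) bus per channel and ignores timings,
# so this is an upper bound, not a measured figure.

def ddr4_bandwidth_gbs(transfer_rate_mts: float, channels: int = 2, bus_bytes: int = 8) -> float:
    """Peak bandwidth in GB/s: transfers/s * bytes per transfer * channels."""
    return transfer_rate_mts * 1e6 * bus_bytes * channels / 1e9

# Illustrative labels for the two configurations discussed above.
for label, mts in [("Review bench (DDR4-3867)", 3867), ("2800 MHz OC kit (DDR4-2800)", 2800)]:
    print(f"{label}: ~{ddr4_bandwidth_gbs(mts):.1f} GB/s")

# Prints roughly 61.9 GB/s vs 44.8 GB/s -- about a 38% gap in raw bandwidth,
# which can show up in CPU/memory-limited scenes.
```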


Then he needs to bench it with other, lower-speed RAM, like 3000 or 3200 MHz. Not everyone uses 4000 MHz RAM, and RAM shouldn't make a 12+ FPS difference at 4K resolution.

BFV uses the same engine, and my FPS matches TPU's benchmarks.
 
I guess you forgot it's not the 1990s anymore.
It runs very well on Nvidia cards; AMD just has to follow up with another driver. If they followed your logic, their cards would be a broken mess in most current-gen games.

Ahem. Also, take note of the actual FPS.



@TheGuruStud is correct. Devs need to fix their shit. BFV is a very unique Frostbite example because both AMD and Nvidia have been extensively in-house to optimize it and help the blundering DICE devs out. But they really shouldn't have to; the GPU hardware isn't magically changing or anything.

Wizzard needs to add 1% lows as well.

Yes, yes he really does. 100% agreed. But I do not doubt his accuracy, regardless.
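For anyone unfamiliar with the metric, here is a minimal sketch of one common way to compute 1% lows from a frame-time log: average the slowest 1% of frames and convert back to FPS. Exact methodology differs between reviewers (some report the 99th-percentile frame time instead), and the frame-time values below are purely hypothetical.

```python
# One common way to derive "1% low FPS" from a frame-time log.
# Reviewers differ in methodology; this is only an illustration.

def one_percent_low_fps(frame_times_ms):
    """Average FPS over the slowest 1% of frames (largest frame times)."""
    worst = sorted(frame_times_ms, reverse=True)
    count = max(1, len(worst) // 100)          # the worst 1% of frames
    avg_worst_ms = sum(worst[:count]) / count
    return 1000.0 / avg_worst_ms               # ms per frame -> FPS

# Hypothetical capture: mostly ~16.7 ms frames (60 FPS) with a few 33 ms spikes.
frames = [16.7] * 990 + [33.3] * 10
print(f"Average FPS: {1000 * len(frames) / sum(frames):.1f}")
print(f"1% low FPS:  {one_percent_low_fps(frames):.1f}")
```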

Looks like CPU overhead on AMD cards; it may be resolved in drivers, unless they drag their ass solving it like they did in so many other DX11 games. Matching a 2060 is where the V64 is at, so no surprises there. The 1440p performance numbers look good; I'd be very glad to see my 1080 Ti run at 100 FPS.

I guess the AMD driver was not game-ready for Anthem after all.

I'm not seeing the CPU overhead here; Vega is right about where it should be across all resolutions: the 56 under a 1070, and the 64 just over it. Can it improve a few percent? Sure, it probably can. The REAL outlier here is in fact the GTX 1070 at 1440p. To me this looks a lot like lacking optimization on the developer side, not so much Nvidia/AMD. Pascal's performance is abysmal compared to Turing's, for example. No blanket statement about either camp would be accurate.
 
Actually, what I meant is that the Frostbite engine itself is AMD-favouring (it was auto-correcting). But the game itself is neutral, as it wasn't sponsored by either company, nor does it have any gimping such as Nvidia GameWorks.
What exactly makes you think it's "AMD favouring"? Yes, they did include Mantle support at the time, but considering that DICE was spearheading the need for a new API, that's only logical and tells nothing about "favouring" one manufacturer over the other.
 
Why is it that everyone over here seems to forget that the Vega counterparts are the GTX 1070 for the 56 & the GTX 1080 for the 64, respectively?

Vega 56 beats 1070 at 1440p & 4K and we would probably have the very same situation with 1080 & 64 (had the former been tested), which is more or less exactly what you would expect with a Frostbite engine.

The issue ain't that Vega is underperforming, but rather that the RTX cards perform surprisingly well with Anthem. Kinda like what we had with the id Tech 6 engine games.

And that ain't really an issue - it's great news for RTX owners, so let's just be happy for them, as there are not that many titles that show a big performance leap over the GTX 10xx generation.

Cheers!

Anthem+EA and RTX seem like a great match. I'll see myself out now :rolleyes:

It goes both ways. God forbid someone looks at DXR without eyes of lust.

I think RTX hate is getting more clicks, to be fair. :D I think what's most important is to pick some extreme and go hard on it.
 
Always hangs after the intro walk, during the first mission loading. Uninstalled.
 
What exactly makes you think it's "AMD favouring"? Yes, they did include Mantle support at the time, but considering that DICE was spearheading the need for a new API, that's only logical and tells nothing about "favouring" one manufacturer over the other.

The Frostbite engine was basically made for BF games, and those were tightly engineered with AMD in mind. I'm surprised that you don't know that a game engine can be specifically tailored towards a specific GPU brand. For example, Unreal Engine is tailored towards Nvidia. Just ask any game engine developer. It's also quite common knowledge among PC gaming enthusiasts that the Frostbite engine is AMD biased, Radeon biased to be precise.
 

That is so stupid: activations on graphics card changes? Something that gamers might do a lot?

You have this the wrong way around. Game devs need to fix their shit. Nvidia/AMD have been fixing their junk for far too long. If it were coded correctly, then there wouldn't be any issues and you wouldn't need a new driver to play a game. I guess everyone just forgets the 90s/early 00s. To make it worse, it's using Frostbite. Nothing should even have to be done for good performance all around.

Release is in a month and this is the state it's in? LOL. Another joke ruined by EA is what I see.

I agree. The developers have the current cards and drivers. They can't optimize their own game with the drivers that are out right now? Are they waiting for some magical ability that only future drivers can provide?
 
The Frostbite engine was basically made for BF games, and those were tightly engineered with AMD in mind. I'm surprised that you don't know that a game engine can be specifically tailored towards a specific GPU brand. For example, Unreal Engine is tailored towards Nvidia. Just ask any game engine developer. It's also quite common knowledge among PC gaming enthusiasts that the Frostbite engine is AMD biased, Radeon biased to be precise.

Sorry, but most gamers are wrong. Including you.

Even the quote you linked to: AMD has worked with DICE for Battlefield, not for Frostbite as an engine. Engines are not biased. They are objective, neutral; it's code. What does happen is that specific architectures are better suited to work with certain engines. That is precisely the other way around. Engines are not built for GPU architectures (at least, not in our generic gaming space anymore) - engines are iterative, and last many generations of GPUs. And in between engine and GPU, there is one vital part you've forgotten about: an abstraction layer, or an API like DX11/DX12/Vulkan. That is part of the reason why engines can be iterative. The API does the translation work, to put it simply.
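To make the abstraction-layer point concrete, here is a toy sketch (the class and backend names are made up for illustration, and this is nothing like how Frostbite or any real engine is actually structured): the engine issues generic draw calls against an interface, and a DX11 or Vulkan backend translates them for the driver and GPU underneath.

```python
# Toy illustration of an engine talking to GPUs through an API abstraction.
# Purely conceptual -- real engines and graphics APIs are far more involved.

from abc import ABC, abstractmethod

class GraphicsAPI(ABC):
    """The abstraction layer (think DX11 / DX12 / Vulkan)."""
    @abstractmethod
    def draw(self, mesh: str) -> None: ...

class DX11Backend(GraphicsAPI):
    def draw(self, mesh: str) -> None:
        print(f"[DX11]   issuing draw call for '{mesh}' -> driver -> GPU")

class VulkanBackend(GraphicsAPI):
    def draw(self, mesh: str) -> None:
        print(f"[Vulkan] recording command for '{mesh}' -> driver -> GPU")

class Engine:
    """Engine code is written against the interface, not a GPU vendor."""
    def __init__(self, api: GraphicsAPI) -> None:
        self.api = api

    def render_frame(self) -> None:
        for mesh in ("terrain", "javelin", "skybox"):
            self.api.draw(mesh)

# The same engine runs on either backend; the API does the translation work.
Engine(DX11Backend()).render_frame()
Engine(VulkanBackend()).render_frame()
```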

Look at NetImmerse/Gamebryo/Creation Engine, nearly two decades worth of games built on it. Ubisoft's Anvil is another example, used in games for the past ten years, multiple consoles and platforms.

Look at the Vega performance for Anthem compared to Battlefield. Literally nothing aligns so far with the idea that Frostbite is 'biased towards AMD'. If anything, that just applies to Battlefield, which makes sense, but that is mostly attributable not even to the engine, but to the API. AMD worked with DICE to implement Mantle, and many bits are similar between it and DX12/Vulkan.

Oh, one final note: Frostbite games were released on the PS3 (Nvidia GPU) long before AMD came to visit DICE.
https://nl.wikipedia.org/wiki/Frostbite_(engine)
 
I'm sick and tired of lazy devs and greedy publishers pushing unfinished products down our throats. Most of today's game releases would have been called betas 20 years back. Just look at the Assassin's Creed Odyssey joke: 30 FPS (0.1% lows of 16 FPS) at 4K with a GTX 1080 Ti, and that is a $60 game! Lousy code optimization is a performance killer, no matter what kind of monster hardware you own. Most publishers don't give a f... about PC gaming anymore. 90% of games are criminally badly optimized console ports.

Do you know how many pixels 4K is pushing? Most of you asshats are still using 1080p monitors. I've never even used a 1080p monitor; I went from 1280x1024 to 1920x1200 and now 3440x1440. People, stop buying 1080p monitors and maybe they'll care about optimizing for 4K resolutions.
 
The Frostbite engine was basically made for BF games, and those were tightly engineered with AMD in mind. I'm surprised that you don't know that a game engine can be specifically tailored towards a specific GPU brand. For example, Unreal Engine is tailored towards Nvidia. Just ask any game engine developer. It's also quite common knowledge among PC gaming enthusiasts that the Frostbite engine is AMD biased, Radeon biased to be precise.
@Vayra86 said it better than I could have already, for the most part, though you are right about Unreal Engine these days, only for the wrong reasons. The only way Unreal Engine is "tailored towards NVIDIA" is the inclusion of NVIDIA-specific features in the base engine, but that doesn't mean a game developer using Unreal Engine has to use those features. On that basis, one could argue that including Mantle was "tailoring towards AMD", but since DICE was actually spearheading the need for and development of the API, I wouldn't count it as such - and even then, not all games using Frostbite at the time included the support; it was built in as an option, just like the NVIDIA stuff in Unreal Engine.
 
Do you know how many pixels 4K is pushing? Most of you asshats are still using 1080p monitors. I've never even used a 1080p monitor; I went from 1280x1024 to 1920x1200 and now 3440x1440. People, stop buying 1080p monitors and maybe they'll care about optimizing for 4K resolutions.

Or... maybe don't adopt 4K for a gaming monitor and keep it sane instead, so you don't have to multiply your horsepower by the same ratio as your pixel count.

Why did anyone expect GPUs to suddenly be able to push 4x as many pixels comfortably as they've done for the last ten years? If you want to early adopt, don't whine about all the bad things that come with it. You're in a niche, deal with it. Not so sure why 'asshats are still' using 1080p monitors in your view; it's a pretty solid sweet-spot resolution for a normal viewing distance and gaming. It is without any shadow of a doubt the most optimal use of your GPU horsepower given the PPI of typical monitors. In that sense 4K is horribly wasteful: you render shitloads of pixels you'll never notice.
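The pixel-count arithmetic is easy to verify; a quick sketch for the resolutions mentioned in this thread shows 4K pushing almost exactly four times the pixels of 1080p, with the 3440x1440 ultrawide sitting at roughly 2.4x.

```python
# Pixel counts for the resolutions mentioned in this thread,
# relative to 1920x1080 as the baseline.

resolutions = {
    "1920x1080 (1080p)":     (1920, 1080),
    "2560x1440 (1440p)":     (2560, 1440),
    "3440x1440 (ultrawide)": (3440, 1440),
    "3840x2160 (4K)":        (3840, 2160),
}

baseline = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:>9,} px  ({pixels / baseline:.2f}x 1080p)")
```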

@Vayra86 said it better than I could have already, for the most part, though you are right about Unreal Engine these days, only for the wrong reasons. The only way Unreal Engine is "tailored towards NVIDIA" is the inclusion of NVIDIA-specific features in the base engine, but that doesn't mean a game developer using Unreal Engine has to use those features. On that basis, one could argue that including Mantle was "tailoring towards AMD", but since DICE was actually spearheading the need for and development of the API, I wouldn't count it as such - and even then, not all games using Frostbite at the time included the support; it was built in as an option, just like the NVIDIA stuff in Unreal Engine.

Correct, I wasn't going to go into that, but it's true, and on top of that there IS an issue with AMD's DX11 drivers being higher on overhead, like @cucker tarlson pointed out. I'm just not so sure that Anthem suffers from that issue specifically. On UE4 you're already seeing a somewhat different picture right about now.
 
Sorry, but most gamers are wrong. Including you.

Even the quote you linked to: AMD has worked with DICE for Battlefield, not for Frostbite as an engine. Engines are not biased. They are objective, neutral; it's code. What does happen is that specific architectures are better suited to work with certain engines. That is precisely the other way around. Engines are not built for GPU architectures (at least, not in our generic gaming space anymore) - engines are iterative, and last many generations of GPUs. And in between engine and GPU, there is one vital part you've forgotten about: an abstraction layer, or an API like DX11/DX12/Vulkan. That is part of the reason why engines can be iterative. The API does the translation work, to put it simply.

Look at NetImmerse/Gamebryo/Creation Engine, nearly two decades worth of games built on it. Ubisoft's Anvil is another example, used in games for the past ten years, multiple consoles and platforms.

Look at the Vega performance for Anthem compared to Battlefield. Literally nothing aligns so far with the idea that Frostbite is 'biased towards AMD'. If anything, that just applies to Battlefield, which makes sense, but that is mostly attributable not even to the engine, but to the API. AMD worked with DICE to implement Mantle, and many bits are similar between it and DX12/Vulkan.

Oh, one final note: Frostbite games were released on the PS3 (Nvidia GPU) long before AMD came to visit DICE.
https://nl.wikipedia.org/wiki/Frostbite_(engine)

Well, you might actually be right. Thanks for the info, bro. Also, I now see why BF games always favour AMD; I thought it was the game engine's fault. I thought the Frostbite engine was designed in a manner to favour AMD's architecture, as you said in "specific architectures are better suited to work with certain engines". But yeah, the Anthem score also made me think about that Frostbite and AMD collab. Also, AnvilNext looks Nvidia-favouring without having any Nvidia-related tech included. Care to explain that?

@Vayra86 said it better than I could have already for the most parts, though you are right about Unreal Engine these days, only for wrong reasons. Only way Unreal Engine is "tailored towards NVIDIA" is the inclusion of NVIDIA specific features in the base engine, but that doesn't mean that game developer using Unreal Engine would have to use those features. On that base, one could argue that including Mantle was "tailoring towards AMD" but since DICE was actually spearheading the need and development of the API, I wouldn't count it as such - and even then not all games using Frostbite at the time included the support - it was built-in as an option, just like the NVIDIA stuff in Unreal Engine.

It may be the case only in BF games, because BF games always seem to favour AMD a lot regardless of the API implementation. Also, many UE4 games don't use Nvidia tech but still run better on Nvidia. But I do agree with him; maybe it's not the game engine, but rather the game's development, regardless of sponsorship.
 
Well, you might actually be right. Thanks for the info, bro. Also, I now see why BF games always favour AMD; I thought it was the game engine's fault. I thought the Frostbite engine was designed in a manner to favour AMD's architecture, as you said in "specific architectures are better suited to work with certain engines". But yeah, the Anthem score also made me think about that Frostbite and AMD collab. Also, AnvilNext looks Nvidia-favouring without having any Nvidia-related tech included. Care to explain that?



It may be the case only in BF games, because BF games always seem to favour AMD a lot regardless of the API implementation. Also, many UE4 games don't use Nvidia tech but still run better on Nvidia. But I do agree with him; maybe it's not the game engine, but rather the game's development, regardless of sponsorship.

Many Ubisoft games are sponsored or supported by Nvidia GameWorks. TurfEffects, HBAO+, enhanced God Rays... the list is endless. Some of these GameWorks effects are more efficient than their non-GameWorks counterparts on Nvidia cards. Another one is TXAA; it was introduced with Assassin's Creed 3, for example... that's just off the top of my head.
 
All the big Ubi titles I played recently were Nvidia sponsored; it's been that way since AC Black Flag. Watch Dogs 2 has more Nvidia options than I can list. Odyssey is not, but Nvidia's driver team is always game-ready with a driver for such a big title, so you wouldn't feel that.
 
Many Ubisoft games are sponsored or supported by Nvidia GameWorks. TurfEffects, HBAO+, enhanced God Rays... the list is endless. Some of these GameWorks effects are more efficient than their non-GameWorks counterparts on Nvidia cards. Another one is TXAA; it was introduced with Assassin's Creed 3, for example... that's just off the top of my head.

AC Odyssey (AnvilNext) is AMD sponsored and optimized, and it doesn't have any Nvidia tech/features. Still, it runs much better on Nvidia than AMD. Origins wasn't sponsored/optimized for any particular brand and didn't include any Nvidia tech/features either; still, it performs better on Nvidia.
 
AC Odyssey (AnvilNext) is AMD sponsored and optimized, and it doesn't have any Nvidia tech/features. Still, it runs much better on Nvidia than AMD.
So you'd call that Nvidia biased instead of giving credit to their driver support. Whether it's AMD sponsored is not debatable; whether AMD optimized it is very questionable.
 
Whatever. The main thing was, AMD already has an Anthem-optimized Adrenalin 19.1.2, as mentioned in the TPU article. So don't come with the driver excuse, and the RTX 2060 is much cheaper than Vega 56/64 and gives much better performance. AMD has always had DX11 driver issues, so don't expect any better than this from them. Also, only some AMD fanboys are denying this article's credibility because they can't cope with the failure of the red team, even if that's just one game. It's very common among the AMD community to diss any article/review where AMD is underperforming and to come up with excuses to win the day. Even though I use AMD hardware, I just stay away from these brainless AMD communities.
 
I wouldn't take any of these results seriously. By the time Anthem releases, I bet both AMD and Nvidia will have a driver release that improves performance. The VIP demo is not even using the gold build of the game (apparently it is using a 6+ week old build). I'd love a proper performance review of Anthem after it releases. Also, the amount of fanboyism around here is surprising, and it's sad to see. Come on, dudes....
 
So you'd call that Nvidia biased instead of giving credit to their driver support. Whether it's AMD sponsored is not debatable; whether AMD optimized it is very questionable.

It's not about blaming or denying. The thing is, I think it's actually the game engine's fault, regardless of sponsorship. That's why Frostbite engine games such as Battlefield, Battlefront, FIFA 17-19 and ME Andromeda all favoured AMD, and for the same reason almost every UE4 and AnvilNext engine game favours Nvidia, regardless of who sponsored it. I might be wrong, so I was just asking, as that other guy seems to know this stuff.

I wouldn't take any of these results seriously. By the time Anthem releases, I bet both AMD and Nvidia will have a driver release that improves performance. Also... the VIP demo is not even using the gold build of the game (apparently it is using some 6+ week old build). I'd love a proper performance review of Anthem after it releases.

That seems reasonable, and I was going to say the same thing: the final product and the final drivers could produce a very different end result compared to this chart. And yeah, BioWare producer Mike Darrah said this demo is actually an almost 6-week-old build.
 
So you'd call that Nvidia biased instead of giving credit to their driver support. Whether it's AMD sponsored is not debatable; whether AMD optimized it is very questionable.
Yeah, I get "AMD Features" in Odyssey and FC5; for HDR there's a FreeSync 2 setting as opposed to generic HDR. Also, FC5 gets Rapid Packed Math, not sure about AC. But neither of those things would have any effect on NV performance.
 
It's not about blaming or denying. The thing is, I think it's actually the game engine's fault, regardless of sponsorship. That's why Frostbite engine games such as Battlefield, Battlefront, FIFA 17-19 and ME Andromeda all favoured AMD, and for the same reason almost every UE4 and AnvilNext engine game favours Nvidia, regardless of who sponsored it. I might be wrong, so I was just asking, as that other guy seems to know this stuff.



That seems reasonable, and I was going to say the same thing: the final product and the final drivers could produce a very different end result compared to this chart. And yeah, BioWare producer Mike Darrah said this demo is actually an almost 6-week-old build.

Andromeda didn't favour AMD.
All BioWare titles this gen were better on Nvidia.

Turing is a very impressive architecture technically; Turing cards perform really well in both Pascal- and GCN-favoured titles.
I don't think there will be any specific engine or game that performs significantly better on GCN versus Turing [relatively], which was the case for GCN vs. Pascal.
 
Yeah, I get "AMD Features" in Odyssey and FC5; for HDR there's a FreeSync 2 setting as opposed to generic HDR. Also, FC5 gets Rapid Packed Math, not sure about AC. But neither of those things would have any effect on NV performance.
Rapid Packed Math (aka half precision / FP16) helps Volta & Turing too
 
Rapid Packed Math (aka half precision / FP16) helps Volta & Turing too
Yeah, we all benefit from our compute units, which are generally underutilized.
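For a rough idea of why packed FP16 helps: on hardware that executes two FP16 operations per FP32 lane per clock (Vega's Rapid Packed Math, and similarly on Volta/Turing), the theoretical FP16 rate is simply double the FP32 rate. A minimal sketch follows; the TFLOPS figures are approximate and only illustrative, and real-world gains depend on how much shader work can tolerate half precision.

```python
# Theoretical throughput doubling from packed FP16 (two 16-bit ops per 32-bit lane).
# The FP32 TFLOPS figures below are approximate and purely illustrative.

def packed_fp16_tflops(fp32_tflops, packed_ratio=2):
    """Peak FP16 rate when hardware packs `packed_ratio` FP16 ops per FP32 lane."""
    return fp32_tflops * packed_ratio

gpus = {
    "Vega 64  (~12.7 FP32 TFLOPS)": 12.7,
    "RTX 2080 (~10.1 FP32 TFLOPS)": 10.1,
}

for name, fp32 in gpus.items():
    print(f"{name}: up to ~{packed_fp16_tflops(fp32):.1f} FP16 TFLOPS")
```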
 
Was your 2080 Ti using liquid nitrogen for the 4K results? My Titan RTX, water-cooled and shunted, was struggling to push the low 60s at 2100 MHz. You guys must be staring at a wall to get 72 FPS at 4K. At 9:55 the card gets overclocked.


 