Saturday, January 26th 2019

Anthem VIP Demo Benchmarked on all GeForce RTX & Vega Cards

Yesterday, EA launched the VIP demo for their highly anticipated title "Anthem". The VIP demo is only accessible to Origin Access subscribers or people who preordered. For the first hours after the demo launch, many players were plagued by server crashes or "servers are full" messages. It looks like EA didn't anticipate the server load correctly, or the rush of login attempts revealed a software bug that wasn't apparent under light load.

Things are running much better now, and we had time to run some Anthem benchmarks on a selection of graphics cards from both AMD and NVIDIA. We realized too late that even the Anthem demo comes with a five-activation limit, which gets triggered on every graphics card change. That's why we could only test eight cards so far. We'll add more when the activations reset.
We benchmarked Anthem at Ultra settings in 1920x1080 (Full HD), 2560x1440, and 3840x2160 (4K). The drivers used were NVIDIA 417.71 WHQL and yesterday's AMD Radeon Adrenalin 19.1.2, which includes performance improvements for Anthem.

At 1080p, it looks like the game is running into a CPU bottleneck with our Core i7-8700K (note how the scores for the RTX 2080 and RTX 2080 Ti are very close together). It's also interesting how the AMD cards start out slower at lower resolutions but close the gap to their NVIDIA counterparts as resolution increases. It's only at 4K that Vega 64 matches the RTX 2060 (a match-up that recent GPU reviews would lead you to expect at 1080p already).
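As a rough mental model of why a CPU bottleneck flattens the top of the chart: every frame costs both CPU time (game logic, draw-call submission) and GPU time, and the slower of the two sets the frame rate. Here is a minimal sketch, with made-up timings rather than measurements from this test:

```python
# Toy frame-rate model: the slower of the CPU and GPU stages limits fps.
# All timings below are invented for illustration, not measured values.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """With CPU and GPU working in a pipeline, the slower stage dominates."""
    return 1000.0 / max(cpu_ms, gpu_ms)

CPU_MS = 8.0  # fixed per-frame CPU cost on a given processor

# A faster GPU stops helping once its frame time drops below the CPU's:
for name, gpu_ms in [("mid-range GPU", 14.0), ("fast GPU", 8.5), ("faster GPU", 6.5)]:
    print(f"{name}: {fps(CPU_MS, gpu_ms):.0f} fps")
# mid-range GPU: 71 fps, fast GPU: 118 fps, faster GPU: 125 fps (CPU-capped)
```

This is the pattern seen with the RTX 2080 and RTX 2080 Ti at 1080p: once the GPU is fast enough, both cards end up waiting on the same CPU.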

We will add test results for more cards, such as the Radeon RX 570 and GeForce GTX 1060, after our activation limit is reset over the weekend.

134 Comments on Anthem VIP Demo Benchmarked on all GeForce RTX & Vega Cards

#101
Vayra86
cucker tarlson: I guess you forgot it's not the 1990s anymore.
It runs very well on nvidia cards, amd just has to follow with another driver. If they followed your logic, their cards would be a broken mess in most current-gen games.
Ahem. Also, take note of the actual FPS.

@TheGuruStud is correct. Devs need to fix their shit. BFV is a very unique Frostbite example because both AMD and Nvidia have extensively been in-house to optimize it and help the blundering DICE devs out. But they really shouldn't have to; the GPU hardware isn't magically changing or anything.
SystemMechanic: Wizzard needs to add 1% lows as well.
Yes, yes he really does. 100% agreed. But I do not doubt his accuracy, regardless.
cucker tarlson: Looks like CPU overhead on AMD cards, may be resolved in drivers unless they drag their ass with solving it like they did in so many other DX11 games. Matching a 2060 is where V64 is at, so no surprises there. 1440p performance numbers look good, I'd be very glad to see my 1080 Ti run at 100 fps.

I guess the AMD driver was not game-ready for Anthem after all.
I'm not seeing the CPU overhead here; Vega is right about where it should be across all resolutions: 56 under a 1070, and 64 just over it. Can it improve a few percent? Sure, it probably can. The REAL outlier here is in fact the GTX 1070 at 1440p. To me this looks a lot like lacking optimization on the developer side, not so much NV/AMD. Pascal's performance is abysmal compared to Turing's, for example. No blanket statement about either camp would be accurate.
Posted on Reply
#102
Kaotik
R4WN4K: Actually, what I meant is that the Frostbite engine itself (it was auto-correcting) is AMD-favouring. But the game itself is neutral, as it wasn't sponsored by either company, nor does it have any gimping such as nvidia GameWorks.
What exactly makes you think it's "AMD favouring"? Yes, they did include Mantle support at one point, but considering that DICE was spearheading the need for a new API, that's only logical and says nothing about "favouring" one manufacturer over the other.
Posted on Reply
#103
Vayra86
siluro818: Why is it that everyone over here seems to forget that the Vega counterparts are the GTX 1070 for the 56 & the GTX 1080 for the 64, respectively?

Vega 56 beats the 1070 at 1440p & 4K, and we would probably see the very same situation with the 1080 & 64 (had the former been tested), which is more or less exactly what you would expect with a Frostbite engine game.

The issue ain't that Vega is underperforming, but rather that the RTX cards perform surprisingly well in Anthem. Kinda like what we had with the id Tech 6 engine games.

And that ain't really an issue - it's great news for RTX owners, so let's just be happy for them, as there are not many titles that show a big performance leap over the GTX 10xx generation.

Cheers!
Anthem+EA and RTX seem like a great match. I'll see myself out now :rolleyes:
moproblems99: It goes both ways. God forbid someone looks at DXR without eyes of lust.
I think RTX hate is getting more clicks, to be fair with you :D Seems like what's most important is to pick some extreme and go hard on it.
Posted on Reply
#104
erixx
Always hangs after the intro walk, when the first mission is loading; uninstalled.
Posted on Reply
#105
Frutika007
Kaotik: What exactly makes you think it's "AMD favouring"? Yes, they did include Mantle support at one point, but considering that DICE was spearheading the need for a new API, that's only logical and says nothing about "favouring" one manufacturer over the other.
The Frostbite engine was basically made for BF games, and those were tightly engineered with AMD in mind. I'm surprised that you don't know a game engine can be specifically tailored towards a specific brand of GPU. For example, Unreal Engine is tailored towards Nvidia. Just ask any game engine developer. It's also quite common knowledge among PC gaming enthusiasts that the Frostbite engine is AMD-biased, Radeon-biased to be precise.
Posted on Reply
#106
Gasaraki
That is so stupid, activations on graphics card changes? Something that gamers might do a lot?
TheGuruStud: You have this the wrong way around. Game devs need to fix their shit. Nvidia/AMD have been fixing their junk for far too long. If it were coded correctly, then there wouldn't be any issues and you wouldn't need a new driver to play a game. I guess everyone just forgets the 90s/early 00s. To make it worse, it's using Frostbite. Nothing should even have to be done for good performance all around.

Release is in a month and this is the state it's in? LOL. Another joke ruined by EA is what I see.
I agree. The developers have the current cards and drivers. They can't optimize their own game with drivers that are out right now? Are they waiting for some magical ability that only future drivers can provide?
Posted on Reply
#107
Vayra86
R4WN4K: The Frostbite engine was basically made for BF games, and those were tightly engineered with AMD in mind. I'm surprised that you don't know a game engine can be specifically tailored towards a specific brand of GPU. For example, Unreal Engine is tailored towards Nvidia. Just ask any game engine developer. It's also quite common knowledge among PC gaming enthusiasts that the Frostbite engine is AMD-biased, Radeon-biased to be precise.
Sorry, but most gamers are wrong. Including you.

Even the quote you linked to: AMD has worked with DICE for Battlefield, not for Frostbite as an engine. Engines are not biased. They are objective, neutral; it's code. What does happen is that specific architectures are better suited to work with certain engines. That is precisely the other way around. Engines are not built for GPU architectures (at least, not in our generic gaming space anymore); engines are iterative, and last many generations of GPUs. And in between engine and GPU, there is one vital part you've forgotten about: an abstraction layer, i.e. an API like DX11/DX12/Vulkan. That is part of the reason why engines can be iterative. The API does the translation work, to put it simply.

Look at NetImmerse/Gamebryo/Creation Engine, nearly two decades' worth of games built on it. Ubisoft's Anvil is another example, used in games for the past ten years across multiple consoles and platforms.

Look at the Vega performance in Anthem compared to Battlefield. Literally nothing so far aligns with the idea that Frostbite is 'biased towards AMD'. If anything, that just applies to Battlefield, which makes sense, but even that is mostly attributable not to the engine but to the API. AMD worked with DICE to implement Mantle, and many bits are similar between it and DX12/Vulkan.

Oh, one final note: Frostbite games released on the PS3 (Nvidia GPU) long before AMD came to visit DICE.
nl.wikipedia.org/wiki/Frostbite_(engine)
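To make that layering concrete, here's a toy sketch in Python (purely illustrative; none of these class names come from any real engine or API): the engine is written against an API interface, and each vendor's driver implements that interface, so the same engine code runs unchanged on either GPU.

```python
# Toy model of the engine -> API -> driver layering described above.

class GraphicsAPI:
    """The abstraction layer, think DX11/DX12/Vulkan."""
    def draw(self, mesh: str) -> None:
        raise NotImplementedError

class VendorADriver(GraphicsAPI):
    def draw(self, mesh: str) -> None:
        print(f"Vendor A driver translates the draw call for {mesh}")

class VendorBDriver(GraphicsAPI):
    def draw(self, mesh: str) -> None:
        print(f"Vendor B driver translates the draw call for {mesh}")

class Engine:
    """Engine code targets the API, never a specific GPU architecture."""
    def __init__(self, api: GraphicsAPI) -> None:
        self.api = api

    def render_frame(self) -> None:
        self.api.draw("player_mesh")

# The exact same engine code runs on either vendor's hardware:
Engine(VendorADriver()).render_frame()
Engine(VendorBDriver()).render_frame()
```

How fast each draw call turns out to be is then down to the driver and the architecture underneath, which is where the apparent "bias" actually comes from.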
Posted on Reply
#108
Gasaraki
unikin: I'm sick and tired of lazy devs and greedy publishers pushing unfinished products down our throats. Most of today's game releases would have been called betas 20 years back. Just look at the Assassin's Creed Odyssey joke, 30 fps (0.1% lows of 16 fps) at 4K with a GTX 1080 Ti, and that is a $60 game! Lousy code optimization is a performance killer, no matter what kind of monster hardware you own. Most publishers don't give a f... about PC gaming anymore. 90% of the games are criminally badly optimized console ports.
Do you know how many pixels 4K is pushing? Most of you asshats are still using 1080p monitors. I've never even used a 1080p monitor; I went from 1280x1024 to 1920x1200 and now 3440x1440. People, stop buying 1080p monitors and maybe they'll care about optimizing for 4K resolutions.
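For reference, the raw pixel counts behind that point (a quick Python check; these are just the standard resolution figures):

```python
# Pixels per frame at the resolutions mentioned in this thread.
resolutions = {
    "1080p (1920x1080)": (1920, 1080),
    "1440p (2560x1440)": (2560, 1440),
    "ultrawide (3440x1440)": (3440, 1440),
    "4K (3840x2160)": (3840, 2160),
}

base = 1920 * 1080  # 2,073,600 pixels
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")
# 4K works out to 8,294,400 pixels, exactly 4x the per-frame work of 1080p.
```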
Posted on Reply
#109
Kaotik
R4WN4K: The Frostbite engine was basically made for BF games, and those were tightly engineered with AMD in mind. I'm surprised that you don't know a game engine can be specifically tailored towards a specific brand of GPU. For example, Unreal Engine is tailored towards Nvidia. Just ask any game engine developer. It's also quite common knowledge among PC gaming enthusiasts that the Frostbite engine is AMD-biased, Radeon-biased to be precise.
@Vayra86 already said it better than I could have, for the most part, though you are right about Unreal Engine these days, only for the wrong reasons. The only way Unreal Engine is "tailored towards NVIDIA" is the inclusion of NVIDIA-specific features in the base engine, but that doesn't mean a game developer using Unreal Engine has to use those features. On that basis, one could argue that including Mantle was "tailoring towards AMD", but since DICE was actually spearheading the need for and development of the API, I wouldn't count it as such. And even then, not all games using Frostbite at the time included the support; it was built in as an option, just like the NVIDIA stuff in Unreal Engine.
Posted on Reply
#110
Vayra86
Gasaraki: Do you know how many pixels 4K is pushing? Most of you asshats are still using 1080p monitors. I've never even used a 1080p monitor; I went from 1280x1024 to 1920x1200 and now 3440x1440. People, stop buying 1080p monitors and maybe they'll care about optimizing for 4K resolutions.
Or... maybe don't adopt 4K for a gaming monitor and keep it sane instead, so you don't have to multiply your horsepower by the same ratio as your pixel count.

Why did anyone expect GPUs to suddenly be able to push 4x as many pixels comfortably as they've done for the last ten years? If you want to early-adopt, don't whine about all the bad things that come with it. You're in a niche, deal with it. Not so sure why 'asshats are still' using 1080p monitors in your view; it's a pretty solid sweet-spot resolution for a normal viewing distance and gaming. It is without any shadow of a doubt the most optimal use of your GPU horsepower given the PPI of typical monitors. In that sense 4K is horribly wasteful, you render shitloads of pixels you'll never notice.
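To put a number on the PPI point, here's a quick Python sketch (the 24"/27" panel sizes are typical examples picked for illustration, not anything from the article):

```python
import math

# Pixel density along the diagonal for some typical monitor sizes.
def ppi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """Pixels per inch: diagonal resolution divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(f'24" 1080p: {ppi(1920, 1080, 24):.0f} PPI')  # ~92 PPI
print(f'27" 1440p: {ppi(2560, 1440, 27):.0f} PPI')  # ~109 PPI
print(f'27" 4K:    {ppi(3840, 2160, 27):.0f} PPI')  # ~163 PPI
```

At typical desktop viewing distances, the jump from ~109 to ~163 PPI is far less visible than the 2.25x increase in rendered pixels it demands, which is the waste being described.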
Kaotik: @Vayra86 already said it better than I could have, for the most part, though you are right about Unreal Engine these days, only for the wrong reasons. The only way Unreal Engine is "tailored towards NVIDIA" is the inclusion of NVIDIA-specific features in the base engine, but that doesn't mean a game developer using Unreal Engine has to use those features. On that basis, one could argue that including Mantle was "tailoring towards AMD", but since DICE was actually spearheading the need for and development of the API, I wouldn't count it as such. And even then, not all games using Frostbite at the time included the support; it was built in as an option, just like the NVIDIA stuff in Unreal Engine.
Correct, I wasn't going to go into that, but it's true. And on top of that, there IS an issue with AMD's DX11 drivers being higher on overhead, like @cucker tarlson pointed out. I'm just not so sure that Anthem suffers from that issue specifically. On UE4 you're already seeing a somewhat different picture right about now.
Posted on Reply
#111
Frutika007
Vayra86: Sorry, but most gamers are wrong. Including you.

Even the quote you linked to: AMD has worked with DICE for Battlefield, not for Frostbite as an engine. Engines are not biased. They are objective, neutral; it's code. What does happen is that specific architectures are better suited to work with certain engines. That is precisely the other way around. Engines are not built for GPU architectures (at least, not in our generic gaming space anymore); engines are iterative, and last many generations of GPUs. And in between engine and GPU, there is one vital part you've forgotten about: an abstraction layer, i.e. an API like DX11/DX12/Vulkan. That is part of the reason why engines can be iterative. The API does the translation work, to put it simply.

Look at NetImmerse/Gamebryo/Creation Engine, nearly two decades' worth of games built on it. Ubisoft's Anvil is another example, used in games for the past ten years across multiple consoles and platforms.

Look at the Vega performance in Anthem compared to Battlefield. Literally nothing so far aligns with the idea that Frostbite is 'biased towards AMD'. If anything, that just applies to Battlefield, which makes sense, but even that is mostly attributable not to the engine but to the API. AMD worked with DICE to implement Mantle, and many bits are similar between it and DX12/Vulkan.

Oh, one final note: Frostbite games released on the PS3 (Nvidia GPU) long before AMD came to visit DICE.
nl.wikipedia.org/wiki/Frostbite_(engine)
Well, you might be right actually. Thanks for the info, bro. I was wondering why BF games always favour AMD; I thought it was the game engine's fault. I thought the Frostbite engine was designed in a manner to favour AMD's architecture, as you said in "specific architectures are better suited to work with certain engines". But yeah, the Anthem scores also made me rethink that Frostbite and AMD collab. Also, AnvilNext looks Nvidia-favouring without having any Nvidia-related tech included. Care to explain that?
Kaotik: @Vayra86 already said it better than I could have, for the most part, though you are right about Unreal Engine these days, only for the wrong reasons. The only way Unreal Engine is "tailored towards NVIDIA" is the inclusion of NVIDIA-specific features in the base engine, but that doesn't mean a game developer using Unreal Engine has to use those features. On that basis, one could argue that including Mantle was "tailoring towards AMD", but since DICE was actually spearheading the need for and development of the API, I wouldn't count it as such. And even then, not all games using Frostbite at the time included the support; it was built in as an option, just like the NVIDIA stuff in Unreal Engine.
It may be the case only in BF games, because BF games always seem to favour AMD a lot regardless of API implementation. Also, many UE4 games don't use Nvidia tech but still run better on Nvidia. But I do agree with him; maybe it's not the game engine but rather the game's development, regardless of sponsorship.
Posted on Reply
#112
Vayra86
R4WN4K: Well, you might be right actually. Thanks for the info, bro. I was wondering why BF games always favour AMD; I thought it was the game engine's fault. I thought the Frostbite engine was designed in a manner to favour AMD's architecture, as you said in "specific architectures are better suited to work with certain engines". But yeah, the Anthem scores also made me rethink that Frostbite and AMD collab. Also, AnvilNext looks Nvidia-favouring without having any Nvidia-related tech included. Care to explain that?

It may be the case only in BF games, because BF games always seem to favour AMD a lot regardless of API implementation. Also, many UE4 games don't use Nvidia tech but still run better on Nvidia. But I do agree with him; maybe it's not the game engine but rather the game's development, regardless of sponsorship.
Many Ubisoft games are sponsored or supported by Nvidia GameWorks. TurfEffects, HBAO+, enhanced God Rays... the list is endless. Some of these GameWorks effects are more efficient than their non-GameWorks counterparts on Nvidia cards. Another one is TXAA; it was introduced with Assassin's Creed 3, for example... that's just off the top of my head.
Posted on Reply
#113
cucker tarlson
All big Ubi titles I played recently were Nvidia-sponsored; been that way since AC Black Flag. Watch Dogs 2 has more Nvidia options than I can list. Odyssey is not, but Nvidia's driver team is always game-ready with a driver for such a big title, so you wouldn't feel that.
Posted on Reply
#114
Frutika007
Vayra86: Many Ubisoft games are sponsored or supported by Nvidia GameWorks. TurfEffects, HBAO+, enhanced God Rays... the list is endless. Some of these GameWorks effects are more efficient than their non-GameWorks counterparts on Nvidia cards. Another one is TXAA; it was introduced with Assassin's Creed 3, for example... that's just off the top of my head.
AC Odyssey (AnvilNext) is AMD-sponsored and optimized; it also doesn't have any Nvidia tech/features. Still, it runs much better on Nvidia than AMD. Origins wasn't sponsored/optimized for any particular brand and didn't have any Nvidia tech/features included either; still, it performs better on Nvidia.
Posted on Reply
#115
cucker tarlson
R4WN4K: AC Odyssey (AnvilNext) is AMD-sponsored and optimized; it also doesn't have any Nvidia tech/features. Still, it runs much better on Nvidia than AMD.
So you'd call that Nvidia-biased instead of giving credit to their driver support. Whether it's AMD-sponsored is not debatable; whether AMD optimized it is very questionable.
Posted on Reply
#116
Frutika007
Whatever, the main thing is, AMD already has the Anthem-optimized Adrenalin 19.1.2, as mentioned in the TPU article. So don't come with the driver excuse, and the RTX 2060 is much cheaper than Vega 56/64 and gives much better performance. AMD has always had DX11 driver issues, so don't expect any better than this from them. Also, only some AMD fanboys are denying this article's credibility because they can't cope with the failure of the red team, even if that's just one game. It's very common in the AMD community to diss any article/review where AMD is underperforming and to come up with excuses to win the day. Even though I use AMD hardware, I just stay away from these brainless AMD communities.
Posted on Reply
#117
johnnyfiive
I wouldn't take any of these results seriously. By the time Anthem releases, I bet both AMD and Nvidia will have a driver release that will improve performance. The VIP demo is not even using the gold build of the game (apparently it is using a 6+ week old build). I'd love a proper performance review of Anthem after it releases. Also, the amount of fanboyism around here is surprising and it's sad to see. Come on, dudes...
Posted on Reply
#118
Frutika007
cucker tarlson: So you'd call that Nvidia-biased instead of giving credit to their driver support. Whether it's AMD-sponsored is not debatable; whether AMD optimized it is very questionable.
It's not about blaming or denying. The thing is, I think it's actually the game engine's fault, regardless of sponsorship. That's why Frostbite engine games such as Battlefield, Battlefront, FIFA 17-19, and ME Andromeda all favoured AMD, and for the same reason almost every UE4 and AnvilNext engine game favours Nvidia regardless of who sponsored it. I might be wrong, so I was just asking, as that other guy seems to know this stuff.
johnnyfiive: I wouldn't take any of these results seriously. By the time Anthem releases, I bet both AMD and Nvidia will have a driver release that will improve performance. Also... the VIP demo is not even using the gold build of the game (apparently it is using some 6+ week old build). I'd love a proper performance review of Anthem after it releases.
That seems reasonable, and I was going to say the same thing next: the final product and the final drivers could have a very different end result compared to this chart. And yeah, BioWare producer Mark Darrah said this demo is actually an almost six-week-old build.
Posted on Reply
#119
INSTG8R
Vanguard Beta Tester
cucker tarlson: So you'd call that Nvidia-biased instead of giving credit to their driver support. Whether it's AMD-sponsored is not debatable; whether AMD optimized it is very questionable.
Yeah, I get “AMD features” in Odyssey and FC5: for HDR there's a FreeSync 2 setting as opposed to generic HDR. Also, FC5 gets Rapid Packed Math, not sure about AC. But neither of those things would have any effect on NV performance.
Posted on Reply
#120
M2B
R4WN4K: It's not about blaming or denying. The thing is, I think it's actually the game engine's fault, regardless of sponsorship. That's why Frostbite engine games such as Battlefield, Battlefront, FIFA 17-19, and ME Andromeda all favoured AMD, and for the same reason almost every UE4 and AnvilNext engine game favours Nvidia regardless of who sponsored it. I might be wrong, so I was just asking, as that other guy seems to know this stuff.

That seems reasonable, and I was going to say the same thing next: the final product and the final drivers could have a very different end result compared to this chart. And yeah, BioWare producer Mark Darrah said this demo is actually an almost six-week-old build.
Andromeda didn't favour AMD.
All BioWare titles this gen were better on Nvidia.

Turing is a very impressive architecture technically; Turing cards perform really well in both Pascal- and GCN-favoured titles.
I don't think there will be any specific engine or game that performs significantly better on GCN versus Turing [relatively], which was the case for GCN vs. Pascal.
Posted on Reply
#121
Kaotik
INSTG8R: Yeah, I get “AMD features” in Odyssey and FC5: for HDR there's a FreeSync 2 setting as opposed to generic HDR. Also, FC5 gets Rapid Packed Math, not sure about AC. But neither of those things would have any effect on NV performance.
Rapid Packed Math (aka half-precision / FP16) helps Volta & Turing too.
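For anyone wondering what the "packed" part means: two FP16 values fit in the space of one FP32 value, so hardware with packed-math support can execute two half-precision operations in one FP32-sized slot. A tiny conceptual illustration in Python with NumPy (the real thing happens in the shader ALUs, of course, not in NumPy):

```python
import numpy as np

# Two FP16 values occupy the same 4 bytes as a single FP32 value.
one_fp32 = np.float32(1.0)
two_fp16 = np.array([1.0, 2.0], dtype=np.float16)
print(one_fp32.nbytes, two_fp16.nbytes)  # 4 4 -- identical storage footprint

# "Packed" math: one instruction operates on both halves at once,
# which is why FP16 throughput can be ~2x FP32 on supporting GPUs.
a = np.array([1.5, 2.5], dtype=np.float16)
b = np.array([0.5, 0.5], dtype=np.float16)
print(a + b)  # [2. 3.] -- two adds in the slot of one FP32-wide op
```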
Posted on Reply
#122
INSTG8R
Vanguard Beta Tester
Kaotik: Rapid Packed Math (aka half-precision / FP16) helps Volta & Turing too.
Yeah, we all benefit from our compute units that are generally underutilized.
Posted on Reply
#123
PCB
Was your 2080 Ti using liquid nitrogen for the 4K results? My Titan RTX, water-cooled and shunted, was struggling to push low 60s at 2100 MHz. You guys must be staring at a wall to get 72 fps at 4K. At 9:55 the card gets overclocked.
Posted on Reply
#124
Shobit
M2B: The problem is that even in AMD-sponsored titles such as AC Odyssey and RE2 Remake, they kinda suck, unfortunately.

Vega 64 is only half as fast as a 2080 Ti at 4K.
Minimum fps is not the whole story...
Posted on Reply
#125
EarthDog
johnnyfiive: I wouldn't take any of these results seriously. By the time Anthem releases, I bet both AMD and Nvidia will have a driver release that will improve performance. The VIP demo is not even using the gold build of the game (apparently it is using a 6+ week old build). I'd love a proper performance review of Anthem after it releases. Also, the amount of fanboyism around here is surprising and it's sad to see. Come on, dudes...
O.H........


Hai neighbor (Cbus)!!
Posted on Reply