Saturday, January 26th 2019

Anthem VIP Demo Benchmarked on all GeForce RTX & Vega Cards

Yesterday, EA launched the VIP demo for their highly anticipated title "Anthem". The VIP demo is only accessible to Origin Access subscribers or people who preordered. For the first hours after the demo launch, many players were plagued by server crashes or "servers are full" messages. It looks like EA didn't anticipate the server load correctly, or the rush of login attempts exposed a software bug that wasn't apparent under light load.

Things are running much better now, and we had time to run some Anthem benchmarks on a selection of graphics cards from AMD and NVIDIA. We realized too late that even the Anthem demo comes with a five-activation limit, which gets triggered by every graphics card change. That's why we could only test eight cards so far; we'll add more when the activations reset.
We benchmarked Anthem at Ultra settings in 1920x1080 (Full HD), 2560x1440 and 3840x2160 (4K). The drivers used were NVIDIA 417.71 WHQL and yesterday's AMD Radeon Adrenalin 19.1.2, which includes performance improvements for Anthem.

At 1080p, it looks like the game is running into a CPU bottleneck with our Core i7-8700K (note how close the scores for the RTX 2080 and RTX 2080 Ti are). It's also interesting how the AMD cards start out slower at lower resolutions, but close the gap to their NVIDIA counterparts as resolution is increased. It's only at 4K that Vega 64 matches the RTX 2060, a matchup you would already expect at 1080p based on results from recent GPU reviews.

We will add test results for more cards, such as the Radeon RX 570 and GeForce GTX 1060, after our activation limit is reset over the weekend.

134 Comments on Anthem VIP Demo Benchmarked on all GeForce RTX & Vega Cards

#51
siluro818
Why does everyone here seem to forget that the Vega counterparts are the GTX 1070 for Vega 56 and the GTX 1080 for Vega 64, respectively?

Vega 56 beats the 1070 at 1440p & 4K, and we would probably see the very same situation with the 1080 & Vega 64 (had the GTX 1080 been tested), which is more or less exactly what you would expect from a Frostbite engine title.

The issue ain't that Vega is underperforming, but rather that the RTX cards perform surprisingly well with Anthem. Kinda like what we had with the id Tech 6 engine games.

And that ain't really an issue - it's great news for RTX owners, so let's just be happy for them, as there aren't that many titles that show a big performance leap over the GTX 10-series generation.

Cheers!
#52
cdawall
where the hell are my stars
y0y0: 30 fps more with driver updates, are you insane? He even has a 9900K against an 8700K in a CPU-bound game, LMAO
This is a very easy thing to pull answers for.

Was W1z streaming the game while he played?

Does W1z have other applications running at the same time as the game is played?

When was the last clean install this guy did?

What driver is being used?

Does W1z run a fairly well known review site or does he have a grand total of 60 followers on YouTube?



Gee I know whose reviews I will be looking at.
#53
Kaotik
TheLostSwede: Are you implying @W1zzard doesn't know what he's doing when benchmarking? Then I suggest you might want to depart this site rather quickly and rather quietly.
While W1z surely knows what he's doing, it wouldn't be the first time TPU results have been messed up
#54
OneMoar
There is Always Moar
because a 9900K is that much faster than an 8700K... (less than 1.5 FPS)

God, I hate kids so much. So, so much.
#55
TheLostSwede
News Editor
Kaotik: While W1z surely knows what he's doing, it wouldn't be the first time TPU results have been messed up
Because it's the only site that has ever messed up benchmarks, right..? That's a very sweeping statement imho and unless you can prove something is wrong here, you can keep it to yourself.
#56
xkm1948
y0y0: I would rather trust VIDEO evidence than some numbers. Maybe you want to explain how he got 108 while it's ~80 in the video with a superior CPU?
Oh sure why are you here then? Just kindly F off.
#57
OneMoar
There is Always Moar
we should all leave tpu and start a forum where all reviews are some guy talking about tech he doesn't fully understand while randomly dropping stuff and making cringe worthy memes to attract the attention of the 12 to 15yro market

this post was sponsored by Tunnelbear but then the bear got hungry and ate it
#58
xkm1948
OneMoar: we should all leave tpu and start a forum where all reviews are some guy talking about tech he doesn't fully understand while randomly dropping stuff and making cringe worthy memes to attract the attention of the 12 to 15yro market

this post was sponsored by Tunnelbear but then the bear got hungry and ate it
I fking hate “tech-tubers”, at least the majority of them who are wannabes.
#59
VSG
Editor, Reviews & News
xkm1948: Oh sure why are you here then? Just kindly F off.
Let's not do this. We should always aim to improve and correct ourselves if we are at fault, and share knowledge to teach others in the other scenario.
#60
xkm1948
VSG: Let's not do this. We should always aim to improve and correct ourselves if we are at fault, and share knowledge to teach others in the other scenario.
You are implying people wanna have a clear logical argument. Truth is some just wanna stir shit
#61
moproblems99
cucker tarlson: Pclab has been doing it for years

pclab.pl/art79629-9.html
Fabulous, then Wizzard doesn't need to do it because someone else already is.
xkm1948: I fking hate “tech-tubers”, at least the majority of them who are wannabes.
Hey, you have to start somewhere. Wizzard probably wasn't spit out with a multimeter and probes at birth.
#62
xkm1948
Any benchmark these days that puts AMD in a bad spotlight will always end in a shit show like this one. Toxic AMD fanboys at their best.
#63
moproblems99
xkm1948: Any benchmark these days that puts AMD in a bad spotlight will always end in a shit show like this one. Toxic AMD fanboys at their best.
It goes both ways. God forbid someone looks at DXR without eyes of lust.
#64
cucker tarlson
moproblems99: It goes both ways. God forbid someone looks at DXR without eyes of lust.
If somebody wants to look at a $700 DXR GPU critically, that's fine. I can't justify the price of the RTX lineup myself, though that's almost exclusively because I think they really should've put 8 GB on the 2060 and 11 GB on the 2080. But if at the same time they praise AMD's $700 Radeon VII, it's ridiculous.
#65
moproblems99
cucker tarlson: If somebody wants to look at a $700 DXR GPU critically, that's fine. But if at the same time they praise AMD's Radeon VII, it's ridiculous.
Not really. What if someone thinks DXR is totally useless (like me)? What if someone has a use case for the (expected) compute of the VII? The problem is most people can't get over themselves when someone has a different viewpoint.

I used to think that people who bought Titans were stupid and ruining the industry. However, everyone has their reasons for buying them. When I figured that out, it became easy to see why people bought things, and then to accept that others are different.

It makes life much easier when you accept that others have different needs and viewpoints.
#66
xkm1948
moproblems99: It goes both ways. God forbid someone looks at DXR without eyes of lust.
I don't see rabid NVIDIA fans in ray tracing threads. Care to point one out for me here?
#67
moproblems99
xkm1948: I don't see rabid NVIDIA fans in ray tracing threads. Care to point one out for me here?
Not really. You'll find one on your own.
#68
Bansaku
So where are the GTX 1080 and 1080Ti benchmarks?! Fail...
#69
eidairaman1
The Exiled Airman
y0y0: 30 fps more with driver updates, are you insane? He even has a 9900K against an 8700K in a CPU-bound game, LMAO
Core i9 and Threadripper relative performance in games is about the same as the 8700K and 2700X; there's no sense in spending $800+ on HEDT when you gain no gaming performance advantage to offset the cost.
#70
OneMoar
There is Always Moar
If you gain +30 FPS from a driver update, then your driver team was doing something horribly wrong to begin with.

Can we finally let this fanboi roast die now? The kid obviously has no clue how shit actually works and is just regurgitating garbage he saw on YouTube.
#71
eidairaman1
The Exiled Airman
xkm1948: Any benchmark these days that puts AMD in a bad spotlight will always end in a shit show like this one. Toxic AMD fanboys at their best.
The only toxicity is from greenies always trolling AMD threads.
#72
moproblems99
Bansaku: So where are the GTX 1080 and 1080 Ti benchmarks?! Fail...
If you had read the article, you would have seen that there is an activation limit, so they have to wait until it resets.

Your reading: Fail...
#73
Vya Domus
Tsukiyomi91: Very impressed with how the RTX 2060 performs at 1440p Ultra. Beating the 1070, Vega 56, and Vega 64 at the same time is just... beautiful. A $350 card is more worthwhile right now than a "brand new" Pascal GPU or the hard-to-get Vega cards.
If you'll ever want to apply to a marketing position at Nvidia, just show them your comment. They'll love you.
#74
lexluthermiester
Assimilator: Only EA could be incompetent enough to put an activation limit on a demo.
And yet somehow, it simply is not surprising.

Back on topic, this game seems like it's going to be a beast and will require beefy hardware to run.
#75
TheinsanegamerN
The performance results make me wonder if EA cocked up optimization for AMD cards, or if AMD did. Given the history of both companies and their relative incompetence, both are likely.

Frostbite isn't a new engine; this shouldn't still be happening.
OneMoar: It actually is very relevant; Vega's fill rate is garbage.

To the people that don't get it, I'll make it as clear as I can:

AMD does not make gaming cards, they make workstation cards that happen to play games. And no, I don't care how they are marketed; they are workstation cards, because the only thing GCN is good at is compute.

Huge difference. There is absolutely no point in comparing AMD to NVIDIA anymore when it comes to gaming; they can't and do not compete, so just STOP, just STOP IT.
So was Kepler a "workstation" GPU as well? Should we have not called the GeForce GTX 680 a "gaming GPU"? Because GCN was a near 1:1 foe for Kepler.

GCN's problem is that it's old. It's not that "AMD makes workstation cards ONLY, STOP COMPARING THEM!!1!!"; AMD didn't have much in the way of funding, and all of it went into Ryzen, so GCN was left with table scraps. That has gotten them into their current position. GCN was given more compute power because, at the time, compute was seemingly the wave of the future for gaming, and GCN's first technical competitor was Fermi, a very compute-focused design. And when the 580 and 590 came out, nobody was typing "THESE ARE NOT GAMING CARDS OMGZZZZ!!!1!!"

AMD has made, and continues to make, gaming cards. Nobody is claiming the RX 580 is a compute card; anybody who does is a fool. Vega was a bit of a mistake, we all know that, just like the Tesla Titan was a mistake. Navi is their first large adjustment to the main GCN architecture; while I don't think it will be anything near the VLIW5/4 > GCN switch, Navi will undoubtedly make large efficiency gains, and I wouldn't be surprised if compute on consumer models was cut down to make the cards more competitive in the gaming segment.