
Anthem VIP Demo Benchmarked on all GeForce RTX & Vega Cards

The performance results make me wonder whether EA botched the optimization for AMD cards, or AMD did. Given the history of both companies and their relative incompetence, both are likely.

Frostbite isn't a new engine; this shouldn't still be happening.

It actually is very relevant: Vega's fill rate is garbage.

To the people that don't get it, I'll make it as clear as I can:

AMD does not make gaming cards; they make workstation cards that happen to play games.
And no, I don't care how they are marketed. They are workstation cards, because compute is the only thing GCN is good at.

Huge difference. There is absolutely no point in comparing AMD to Nvidia anymore when it comes to gaming; they can't and do not compete, so just STOP. Just STOP IT.
So was Kepler a "workstation" GPU as well? Should we not have called the GeForce 680 a "gaming GPU"? Because GCN was a near 1:1 foe for Kepler.

GCN's problem is that it's old. It's not that "AMD makes workstation cards ONLY, STOP COMPARING THEM!!1!!". AMD didn't have much in the way of funding, and all of it went into Ryzen, so GCN was left with table scraps. That's what got them into their current position. GCN was given more compute power because, at the time, compute was seemingly the wave of the future for gaming, and GCN's first technical competitor was Fermi, a very compute-focused design. And when the 580 and 590 came out, nobody was typing "THESE ARE NOT GAMING CARDS OMGZZZZ!!!1!!"

AMD has made, and continues to make, gaming cards. Nobody is claiming the RX 580 is a compute card; anybody who does is a fool. Vega was a bit of a mistake, we all know that, just like the Tesla Titan was a mistake. Navi is their first large adjustment to their main GCN arch. While I don't think it will be anything near the VLIW5/4 > GCN switch, Navi will undoubtedly make large efficiency gains, and I wouldn't be surprised if compute on consumer models was cut down to make the cards more competitive in the gaming segment.
 
I don't see rabid Nvidia fans in ray tracing threads. Care to point one out for me here?

There is no hardware requirement for ray tracing to work, so I am personally upset with AMD for not pushing a driver that makes it work, and with Nvidia for showing it can work without RT cores (Titan V) and then not offering a driver that makes it at least work on Pascal.
 
Okay, most people probably haven't even played this demo. I finally had to make an account after following TPU since 2008, because the website quality seems to have been degrading since last year.

1. So many clickbait articles.
2. Lots of spelling mistakes and other errors, where an RTX 2080 Ti suddenly becomes an RTX 1080 Ti in the following paragraph.

I am not sure how Wizzard benched the game, but I have played this and the 4K numbers are wrong for the 2080 Ti. It would be helpful if he posted all the other numbers, like his clock speeds, temps, and system specs.

With my 5 GHz 8700K, 2666 RAM OC'd to 2800 MHz, and a 2080 Ti running at 2040-2055 MHz with 16 Gbps VRAM, I don't even get an average of 60 fps.

At 4K:

So these benchmark numbers don't really represent actual gameplay numbers; in fact they are not even close. I get FPS dips to as low as 40 fps during group play in strongholds. The card struggles to hold 60 fps on ultra, let alone 72 fps.

The game seems to be CPU-heavy even at 4K: usage goes up to 50% with a 60 fps cap and shoots to 80-90% if the cap is removed.

So you can all stop defending. Play the game and test for yourself.

Wizzard needs to add 1% lows as well.
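For what it's worth, the 1% low being asked for here is just the frame rate implied by the slowest 1% of frames in a run. A minimal Python sketch of how it could be computed from a frame-time capture (the sample numbers below are made up purely for illustration):

```python
def fps_stats(frametimes_ms):
    """Return (average FPS, 1% low FPS) from a list of frame times in ms."""
    frames = sorted(frametimes_ms)                      # ascending: fastest frames first
    avg = 1000 * len(frames) / sum(frames)              # average FPS over the whole run
    # 1% low: FPS implied by the frame time at the 99th percentile
    idx = min(len(frames) - 1, int(0.99 * len(frames)))
    low = 1000 / frames[idx]
    return avg, low

# Illustrative run: mostly 16.7 ms frames (~60 fps) with one 25 ms stutter
avg, low = fps_stats([16.7] * 99 + [25.0])
# avg ≈ 59.6 fps, low = 40.0 fps: the stutter barely moves the average
# but dominates the 1% low, which is why reviewers report both numbers
```

This is why an average-FPS chart alone can hide exactly the kind of stronghold dips described above.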
 

A couple of points, in no specific order:
  • I'm willing to bet that for every person who benches the game, there will be a different set of results.
  • It's really hard to produce consistent results in live-action games.
  • 1% is within the margin of error.
  • EA sucks and should be avoided.
 
Or you guys could just not bother trying out the demo build of the game at all and keep it to yourselves. No one is pushing you into playing the game, or even has the luxury of spending their time benching a game that's still in its beta phase with different GPUs on the latest driver build, then changing to the next GPU and redoing the entire bench all over again, etc. Benchmarking is not an easy task, no matter how you look at it. If you're not happy with how W1zz does it, how about starting your own testing methodology and showing some proof, instead of crapping here all day saying that the results are "not feasible"?
 
Only EA could be incompetent enough to put an activation limit on a demo.
I'm surprised they didn't monetize it and make a micro-transaction to reset activation counters.
 
With all the negative backlash they've been getting lately, I doubt they're stupid enough to pull a fast one on already angry consumers.
 
Core i9 and Threadripper relative performance in games is about the same as the 8700K and 2700X; there's no sense in spending $800+ on HEDT when you gain no gaming performance advantage to offset the cost.
Except for high-refresh gaming, where Threadripper straight up sucks, Ryzen is very mediocre, and X299 CPUs still lag behind the mainstream i7s and i9s.
Even when comparing clock for clock and with faster memory, Ryzen just can't keep up with the 8700K when it comes to high-refresh gaming.




There is no hardware requirement for ray tracing to work, so I am personally upset with AMD for not pushing a driver that makes it work, and with Nvidia for showing it can work without RT cores (Titan V) and then not offering a driver that makes it at least work on Pascal.
RTX works on the Titan V, but it's slower than the flagship RTX cards, down to 2060-2070 level.
 
It also costs 2.5x less...

Nah, more like 3-3.5x. Vega 64 is around 400-450 now, and most 2080 Tis are around 1200-1600.
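Taking the price ranges quoted in this exchange at face value (they are street prices cited in the thread, not anything authoritative), the implied multiple is easy to bound:

```python
# Price ranges quoted in the thread, USD
vega64 = (400, 450)
rtx2080ti = (1200, 1600)

low_ratio = rtx2080ti[0] / vega64[1]   # cheapest 2080 Ti vs. priciest Vega 64
high_ratio = rtx2080ti[1] / vega64[0]  # priciest 2080 Ti vs. cheapest Vega 64
# low_ratio ≈ 2.67 and high_ratio = 4.0, so both "2.5x" and "3-3.5x" land
# inside (or near) the plausible range depending on which cards you compare
```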

Okay, most people probably haven't even played this demo. [...] So these benchmark numbers don't really represent actual gameplay numbers; in fact they are not even close. [...] Wizzard needs to add 1% lows as well.

I have been reading the same on reddit. Almost everyone unanimously agrees that these numbers are way off, and wonders how it was tested, because I read almost the same thing there: in gameplay, their numbers are nowhere close to the numbers in this review.
 
He probably just benched the prologue, not actual gameplay.
 
Because it's the only site that has ever messed up benchmarks, right..? That's a very sweeping statement, IMHO, and unless you can prove something is wrong here, you can keep it to yourself.
Didn't say it's the only site which sometimes messes up its results.
But since you asked, here's a recent one: https://www.techpowerup.com/reviews/NVIDIA/GeForce_RTX_2060_Founders_Edition/30.html
High and Medium are not supposed to be close together in performance; Low and Medium, and High and Ultra, are. This is shown by every single other BFV DXR benchmark, including TPU's benchmarks on the other RTX cards.
 
Didn't say it's the only site which sometimes messes up its results. [...] High and Medium are not supposed to be close together in performance; Low and Medium, and High and Ultra, are.
Well, that all depends on the choice of scene.
What is it with all these frigging know-it-all experts on TPU recently?
Are you the same person?

https://www.techpowerup.com/forums/threads/why-are-reviewers-so-lazy-not-talking-about-tpu.251199/

No one asks you to take TPU's results as the only ones; get some perspective.
If that's a recurring pattern, then it's not a one-off result.

I, for example, tend to think their choice of locations for CPU-bound gaming tests is not representative of what you may experience in the most CPU-intensive parts of games, and their RAM tests are likewise. But that provides me with another perspective on how things will run in scenarios other than CPU-bound ones.
 
Well, that all depends on the choice of scene.
In the case of BFV DXR, no it doesn't; there's a clear-cut difference in the settings. High and Ultra are similar, Low and Medium are similar, and it's seen both in the actual effects themselves and in performance.
It's not that hard; even TPU's own earlier results contradict the RTX 2060 ones.
What is it with all these frigging know-it-all experts on TPU recently?
I've mainly become active here after noticing worrisome trends in the site's news reporting going downhill.
Nope.
No one asks you to take TPU's results as the only ones; get some perspective.
If that's a recurring pattern, then it's not a one-off result.
No one said anything like that. I was asked to back up my claim that "even though W1z surely knows what he's doing, he still gets his results messed up sometimes", so I did.
 
Okay, most people probably haven't even played this demo. [...] With my 5 GHz 8700K, 2666 RAM OC'd to 2800 MHz, and a 2080 Ti running at 2040-2055 MHz with 16 Gbps VRAM, I don't even get an average of 60 fps. [...] The game seems to be CPU-heavy even at 4K.


Lol, if you know a game is CPU-intensive yet don't even realize your system RAM might be holding you back, maybe research harder next time before you start bashing. I mean, W1zzard benchmarks with 16 GB @ 3867 MHz 18-19-19-39 (as in every other video card review he's done); compared to your 16 GB @ 2666 MHz OC'd to 2800 MHz at who knows what timings, that could result in massive fps differences. BTW, I would think twice about putting the highest-end GPU together with bargain-bin RAM, lol.
 
I just love how the AMD fanboys need to come up with excuses as soon as they see their company fail. The article clearly says they are using Adrenalin 19.1.2, which is the Anthem-optimized driver, yet fanboys are still giving driver excuses for the red team's failure. Also, this game wasn't gimped/sponsored. It uses the Frostbyte engine, which is AMD-biased, and yet AMD fails in a neutral Frostbyte engine game.

Nah, more like 3-3.5x. Vega 64 is around 400-450 now, and most 2080 Tis are around 1200-1600.

I have been reading the same on reddit. Almost everyone unanimously agrees that these numbers are way off, and wonders how it was tested, because I read almost the same thing there: in gameplay, their numbers are nowhere close to the numbers in this review.

Vega 64 is $500+, and a good RTX 2080 Ti is easily found in the $1300 region. So yeah, it's 2.5x. Stop exaggerating things.
Also, these numbers are 100% accurate. Only AMD-biased people are crying, because it hurts their feelings that their red team is failing in a neutral Frostbyte engine game.
 
Only AMD-biased people are crying, because it hurts their feelings that their red team is failing in a neutral Frostbyte engine game.

No, correction: only delusional people get this worked up, on either side.
 
I just love how the AMD fanboys need to come up with excuses as soon as they see their company fail. [...] It uses the Frostbyte engine, which is AMD-biased, and yet AMD fails in a neutral Frostbyte engine game.

Vega 64 is $500+, and a good RTX 2080 Ti is easily found in the $1300 region. So yeah, it's 2.5x. Stop exaggerating things. [...]
I'm not sure who you're referring to with "AMD fanboys", but I find it hilarious how you first call Frostbite (not byte) an AMD-biased engine and then call it neutral. Try to make up your mind.
 
Just finished playing, and Vega is memory-starved in this game. OC the HBM to 1155 and 1440p runs 60 to 72 fps; 4K is 45 to 52 fps.
 
I just love how the AMD fanboys need to come up with excuses as soon as they see their company fail. [...]

Vega 64 is $500+, and a good RTX 2080 Ti is easily found in the $1300 region. So yeah, it's 2.5x. Stop exaggerating things. [...]


Not sure where you are seeing that. Seriously, are you just going to outright lie about what Vega 64s are going for? Jeez! $399 at Newegg for the reference card with 3 free games, and $369.99 brand new from Newegg's eBay store. What did I say about the 2080 Ti? I gave the range; I didn't say they were all $1500+. So go back and check Vega 64 prices again before you say I am making things up and exaggerating. You must have been living under a rock if you haven't seen Vega 64 prices for a few months.

And lol at Frostbite being AMD-biased. You call everyone else an AMD fanboy, yet you manage to sound like an Nvidia fanboy. Haha.
 
I'm not sure who you're referring to with "AMD fanboys", but I find it hilarious how you first call Frostbite (not byte) an AMD-biased engine and then call it neutral. Try to make up your mind.

Actually, what I meant is that the Frostbite engine itself (it was auto-correcting) favors AMD, but the game itself is neutral, as it wasn't sponsored by either company, nor does it have any gimping such as Nvidia GameWorks.
 
Not sure where you are seeing that. Seriously, are you just going to outright lie about what Vega 64s are going for? [...] You must have been living under a rock if you haven't seen Vega 64 prices for a few months. [...]

I don't really care what I sound like, because I have owned several GPUs from both brands and am currently using a Vega 56. Also, I said brand-new prices for the Vega 64; those prices you cited are for used/refurbished Vega 64s.
 
You should be able to at 1080p, though you may need to turn off anti-aliasing if you want to get near 60 fps. Some OCing is needed.
 