
Anthem VIP Demo Benchmarked on all GeForce RTX & Vega Cards

Joined
Jan 27, 2019
Messages
85 (0.04/day)
Lol, if you know a game is CPU intensive yet don't even realize your system RAM might be holding you back, maybe research harder next time before you start bashing. I mean, W1zzard benchmarks with 16 GB @ 3867 MHz 18-19-19-39 (as in every other video card review he did), compared to your 16 GB @ 2666 MHz OC'd to 2800 MHz at who knows what timings; that could result in massive FPS differences. BTW, I would think twice about putting the highest-end GPU together with bargain-bin RAM lol.


Then he needs to bench it with other, lower-speed RAM like 3000 or 3200 MHz. Not everyone uses ~4000 MHz RAM, and RAM shouldn't make a 12+ FPS difference at 4K resolution.

BFV uses the same engine, and my FPS matches TPU's benchmarks.
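For a sense of how far apart those two kits are on paper, here's a rough dual-channel DDR4 bandwidth comparison (a back-of-the-envelope sketch; real-world scaling also depends on timings and on how memory-bound the game actually is):

```python
# Back-of-the-envelope peak bandwidth for dual-channel DDR4:
# 8-byte bus per channel x 2 channels x transfer rate (MT/s).
def ddr4_dual_channel_gb_s(mt_per_s: int) -> float:
    return 8 * 2 * mt_per_s / 1000

for speed in (2666, 2800, 3867):
    print(f"DDR4-{speed}: ~{ddr4_dual_channel_gb_s(speed):.1f} GB/s")

# DDR4-2666 -> ~42.7 GB/s, DDR4-3867 -> ~61.9 GB/s: roughly 45% more raw
# bandwidth, which can show up as real FPS gains when a game is CPU- or
# memory-limited rather than GPU-limited.
```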
 
Joined
Sep 17, 2014
Messages
22,569 (6.03/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I guess you forgot it's not the 1990s anymore.
It runs very well on Nvidia cards; AMD just have to follow with another driver. If they followed your logic, their cards would be a broken mess in most current-gen games.

Ahem. Also, take note of the actual FPS.

(benchmark screenshot attached)


@TheGuruStud is correct. Devs need to fix their shit. BFV is a unique Frostbite example because both AMD and Nvidia have extensively been in-house to optimize it and help the blundering DICE devs out. But they really shouldn't have to; the GPU hardware isn't magically changing or anything.

W1zzard needs to add 1% lows as well.

Yes, yes he really does. 100% agreed. But I do not doubt his accuracy, regardless.
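For anyone unfamiliar with the metric: "1% lows" capture the stutter that averages hide. Here's a minimal sketch of one common way to derive it from a frame-time log (conventions vary between reviewers; some report the inverted 99th-percentile frame time instead):

```python
def one_percent_low_fps(frame_times_ms):
    """Average FPS over the slowest 1% of frames (one common convention)."""
    worst = sorted(frame_times_ms, reverse=True)   # slowest frames first
    n = max(1, len(worst) // 100)                  # worst 1% of samples
    avg_worst_ms = sum(worst[:n]) / n
    return 1000.0 / avg_worst_ms

# Example: a mostly smooth ~60 FPS run with a handful of 50 ms hitches.
frames = [16.7] * 990 + [50.0] * 10
print(f"average FPS: {1000 * len(frames) / sum(frames):.1f}")   # ~58.7
print(f"1% low:      {one_percent_low_fps(frames):.1f} FPS")    # 20.0
```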

Looks like CPU overhead on AMD cards; it may be resolved in drivers, unless they drag their ass with solving it like they did in so many other DX11 games. Matching a 2060 is where V64 is at, so no surprises there. The 1440p performance numbers look good; I'd be very glad to see my 1080 Ti run at 100 FPS.

I guess the AMD driver was not game-ready for Anthem after all.

I'm not seeing the CPU overhead here; Vega is right about where it should be across all resolutions: 56 under a 1070, and 64 just over it. Can it improve a few %? Sure, it probably can. The REAL outlier here is in fact the GTX 1070 at 1440p. To me this looks a lot like lacking optimization on the developer side, not so much Nv/AMD. Pascal's performance is abysmal compared to Turing's, for example. No blanket statement about either camp would be accurate.
 
Joined
Dec 22, 2011
Messages
289 (0.06/day)
Processor Ryzen 7 5800X3D
Motherboard Asus Prime X570 Pro
Cooling Deepcool LS-720
Memory 32 GB (4x 8GB) DDR4-3600 CL16
Video Card(s) PowerColor Radeon RX 7900 XTX Red Devil
Storage Samsung PM9A1 (980 Pro OEM) + 960 Evo NVMe SSD + 830 SATA SSD + Toshiba & WD HDD's
Display(s) Samsung C32HG70
Case Lian Li O11D Evo
Audio Device(s) Sound Blaster Zx
Power Supply Seasonic 750W Focus+ Platinum
Mouse Logitech G703 Lightspeed
Keyboard SteelSeries Apex Pro
Software Windows 11 Pro
Actually, what I meant is that the Frostbite engine itself (it was autocorrecting) is AMD-favouring. But the game itself is neutral, as it wasn't sponsored by either company, nor does it have any gimping such as Nvidia GameWorks.
What exactly makes you think it's "AMD favouring"? Yes, they did include Mantle support at one time, but considering that DICE was spearheading the need for a new API, that's only logical and tells nothing about "favouring" one manufacturer over the other.
 
Joined
Sep 17, 2014
Messages
22,569 (6.03/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Why is it that everyone over here seems to forget that the Vega counterparts are the GTX 1070 for the 56 & the GTX 1080 for the 64, respectively?

Vega 56 beats the 1070 at 1440p & 4K, and we would probably have the very same situation with the 1080 & 64 (had the former been tested), which is more or less exactly what you would expect with the Frostbite engine.

The issue ain't that Vega is underperforming, but rather that the RTX cards perform surprisingly well with Anthem. Kinda like what we had with the id Tech 6 engine games.

And that ain't really an issue - it's great news for RTX owners, so let's just be happy for them, as there are not many titles that show a big performance leap over the GTX 10xx generation.

Cheers!

Anthem+EA and RTX seem like a great match. I'll see myself out now :rolleyes:

It goes both ways. God forbid someone looks at DXR without eyes of lust.

To be fair with you, I think RTX hate is getting more clicks :D What seems most important is to pick some extreme and go hard on it.
 
Joined
Mar 24, 2010
Messages
5,047 (0.94/day)
Location
Iberian Peninsula
It always hangs after the intro walk, during the first mission loading. Uninstalled.
 
Joined
Jan 27, 2019
Messages
57 (0.03/day)
What exactly makes you think it's "AMD favouring"? Yes, they did include Mantle support at one time, but considering that DICE was spearheading the need for a new API, that's only logical and tells nothing about "favouring" one manufacturer over the other.

The Frostbite engine was basically made for BF games, and they were tightly engineered with AMD in mind. I'm surprised that you don't know that a game engine can be specifically tailored towards a specific GPU brand. For example, Unreal Engine is tailored towards Nvidia. Just ask any game engine developer. It's also quite common knowledge among PC gaming enthusiasts that the Frostbite engine is AMD-biased, Radeon-biased to be precise.
 

Attachments
(browser screenshot attached)
Joined
Nov 29, 2016
Messages
671 (0.23/day)
System Name Unimatrix
Processor Intel i9-9900K @ 5.0GHz
Motherboard ASRock x390 Taichi Ultimate
Cooling Custom Loop
Memory 32GB GSkill TridentZ RGB DDR4 @ 3400MHz 14-14-14-32
Video Card(s) EVGA 2080 with Heatkiller Water Block
Storage 2x Samsung 960 Pro 512GB M.2 SSD in RAID 0, 1x WD Blue 1TB M.2 SSD
Display(s) Alienware 34" Ultrawide 3440x1440
Case CoolerMaster P500M Mesh
Power Supply Seasonic Prime Titanium 850W
Keyboard Corsair K75
Benchmark Scores Really Really High
That is so stupid: activations tied to graphics card changes? Something that gamers might do a lot?

You have this the wrong way around. Game devs need to fix their shit. Nvidia/AMD have been fixing their junk for far too long. If it were coded correctly, then there wouldn't be any issues and you wouldn't need a new driver to play a game. I guess everyone just forgets the 90s/early 00s. To make it worse, it's using Frostbite. Nothing should even have to be done for good performance all around.

Release is in a month and this is the state it's in? LOL. Another joke ruined by EA is what I see.

I agree. The developers have the current cards and drivers. They can't optimize their own game with drivers that are out right now? Are they waiting for some magical ability that only future drivers can provide?
 
Joined
Sep 17, 2014
Messages
22,569 (6.03/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
The Frostbite engine was basically made for BF games, and they were tightly engineered with AMD in mind. I'm surprised that you don't know that a game engine can be specifically tailored towards a specific GPU brand. For example, Unreal Engine is tailored towards Nvidia. Just ask any game engine developer. It's also quite common knowledge among PC gaming enthusiasts that the Frostbite engine is AMD-biased, Radeon-biased to be precise.

Sorry, but most gamers are wrong. Including you.

Even the quote you linked to: AMD has worked with DICE for Battlefield. Not for Frostbite as an engine. Engines are not biased. They are objective, neutral; it's code. What does happen is that specific architectures are better suited to work with certain engines. That is precisely the other way around. Engines are not built for GPU architectures (at least, not in our generic gaming space anymore) - engines are iterative, and last many generations of GPUs. And in between engine and GPU, there is one vital part you've forgotten about: an abstraction layer, or, an API like DX11/DX12/Vulkan. That is part of the reason why engines can be iterative. The API does the translation work, to put it simply.

Look at NetImmerse/Gamebryo/Creation Engine, nearly two decades worth of games built on it. Ubisoft's Anvil is another example, used in games for the past ten years, multiple consoles and platforms.

Look at the Vega performance for Anthem compared to Battlefield. Literally nothing aligns so far with the idea that Frostbite is 'biased towards AMD'. If anything, that just applies to Battlefield, which makes sense, but that is mostly attributable not even to the engine, but to the API. AMD worked with DICE to implement Mantle, and many bits are similar between it and DX12/Vulkan.

Oh, one final note: Frostbite games were released on the PS3 (Nvidia GPU) long before AMD came to visit DICE.
https://nl.wikipedia.org/wiki/Frostbite_(engine)
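To put the abstraction-layer point in code form: engine code is written once against an API-shaped interface, and the vendor's driver behind each backend does the hardware-specific translation. A toy sketch with hypothetical class names, not any real engine's code:

```python
from abc import ABC, abstractmethod

class GraphicsAPI(ABC):
    """The abstraction layer (think DX11/DX12/Vulkan) the engine talks to."""
    @abstractmethod
    def draw(self, mesh: str) -> None: ...

class VulkanBackend(GraphicsAPI):
    def draw(self, mesh: str) -> None:
        # Below this call, the vendor's driver translates API commands into
        # architecture-specific GPU work; the engine never sees those details.
        print(f"[vulkan] drawing {mesh}")

class DX11Backend(GraphicsAPI):
    def draw(self, mesh: str) -> None:
        print(f"[dx11] drawing {mesh}")

class Engine:
    """Iterative engine code: written once against the API, not against a GPU."""
    def __init__(self, api: GraphicsAPI) -> None:
        self.api = api

    def render_frame(self) -> None:
        self.api.draw("scene")

Engine(VulkanBackend()).render_frame()  # same engine code on any vendor's card
```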
 
Joined
Nov 29, 2016
Messages
671 (0.23/day)
System Name Unimatrix
Processor Intel i9-9900K @ 5.0GHz
Motherboard ASRock x390 Taichi Ultimate
Cooling Custom Loop
Memory 32GB GSkill TridentZ RGB DDR4 @ 3400MHz 14-14-14-32
Video Card(s) EVGA 2080 with Heatkiller Water Block
Storage 2x Samsung 960 Pro 512GB M.2 SSD in RAID 0, 1x WD Blue 1TB M.2 SSD
Display(s) Alienware 34" Ultrawide 3440x1440
Case CoolerMaster P500M Mesh
Power Supply Seasonic Prime Titanium 850W
Keyboard Corsair K75
Benchmark Scores Really Really High
I'm sick and tired of lazy devs and greedy publishers pushing unfinished products down our throats. Most of today's game releases would have been called betas 20 years back. Just look at the Assassin's Creed Odyssey joke: 30 FPS (0.1% lows of 16 FPS) at 4K with a GTX 1080 Ti, and that is a $60 game! Lousy code optimization is a performance killer, no matter what kind of monster hardware you own. Most publishers don't give a f... about PC gaming anymore. 90% of games are criminally badly optimized console ports.

Do you know how many pixels 4K is pushing? Most of you asshats are still using 1080p monitors. I've never even used a 1080p monitor; I went from 1280x1024 to 1920x1200 and now 3440x1440. People, stop buying 1080p monitors and maybe they'll care about optimizing for 4K resolutions.
 
Joined
Dec 22, 2011
Messages
289 (0.06/day)
Processor Ryzen 7 5800X3D
Motherboard Asus Prime X570 Pro
Cooling Deepcool LS-720
Memory 32 GB (4x 8GB) DDR4-3600 CL16
Video Card(s) PowerColor Radeon RX 7900 XTX Red Devil
Storage Samsung PM9A1 (980 Pro OEM) + 960 Evo NVMe SSD + 830 SATA SSD + Toshiba & WD HDD's
Display(s) Samsung C32HG70
Case Lian Li O11D Evo
Audio Device(s) Sound Blaster Zx
Power Supply Seasonic 750W Focus+ Platinum
Mouse Logitech G703 Lightspeed
Keyboard SteelSeries Apex Pro
Software Windows 11 Pro
Frostbite engine was basically made for BF games and they were tightly engineered for AMD in their minds. I'm surprised that you don't know that a game engine can be specifically tailored towards an specific branded gpu. For example,unreal engine is tailored towards Nvidia. Just ask any game engine developer. It's also quite common knowledge among pc gaming enthusiasts that frostbite engine is AMD biased,Radeon biased to be precised.
@Vayra86 said it better than I could have already, for the most part, though you are right about Unreal Engine these days, only for the wrong reasons. The only way Unreal Engine is "tailored towards NVIDIA" is the inclusion of NVIDIA-specific features in the base engine, but that doesn't mean that a game developer using Unreal Engine has to use those features. On that basis, one could argue that including Mantle was "tailoring towards AMD", but since DICE was actually spearheading the need for and development of the API, I wouldn't count it as such - and even then, not all games using Frostbite at the time included the support - it was built in as an option, just like the NVIDIA stuff in Unreal Engine.
 
Joined
Sep 17, 2014
Messages
22,569 (6.03/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Do you know how many pixels 4K is pushing? Most of you asshats are still using 1080p monitors. I've never even used a 1080p monitor; I went from 1280x1024 to 1920x1200 and now 3440x1440. People, stop buying 1080p monitors and maybe they'll care about optimizing for 4K resolutions.

Or... maybe don't adopt 4K for a gaming monitor and keep it sane instead, so you don't have to multiply your horsepower by the same ratio as your pixel count.

Why did anyone expect GPUs to suddenly be able to push 4x as many pixels comfortably as they've done for the last ten years? If you want to early-adopt, don't whine about all the bad things that come with it. You're in a niche; deal with it. Not so sure why 'asshats are still' using 1080p monitors in your view; it's a pretty solid sweet-spot resolution for a normal viewing distance and gaming. It is without any shadow of a doubt the most optimal use of your GPU horsepower given the PPI of typical monitors. In that sense 4K is horribly wasteful; you render shitloads of pixels you'll never notice.
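The pixel-count arithmetic behind that, for reference:

```python
base_w, base_h = 1920, 1080
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440),
               "3440x1440": (3440, 1440), "4K UHD": (3840, 2160)}

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:9s}: {pixels / 1e6:.2f} MP ({pixels / (base_w * base_h):.2f}x 1080p)")

# 4K is exactly 4.00x the pixels of 1080p, so at identical settings the GPU
# shades roughly four times as many fragments per frame.
```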

@Vayra86 said it better than I could have already, for the most part, though you are right about Unreal Engine these days, only for the wrong reasons. The only way Unreal Engine is "tailored towards NVIDIA" is the inclusion of NVIDIA-specific features in the base engine, but that doesn't mean that a game developer using Unreal Engine has to use those features. On that basis, one could argue that including Mantle was "tailoring towards AMD", but since DICE was actually spearheading the need for and development of the API, I wouldn't count it as such - and even then, not all games using Frostbite at the time included the support - it was built in as an option, just like the NVIDIA stuff in Unreal Engine.

Correct, I wasn't going to go into that, but it's true. And on top of that, there IS an issue with AMD's DX11 drivers being higher on overhead, like @cucker tarlson pointed out. Just not so sure that Anthem suffers from that issue specifically. On UE4 you're already seeing a somewhat different picture right about now.
 
Joined
Jan 27, 2019
Messages
57 (0.03/day)
Sorry, but most gamers are wrong. Including you.

Even the quote you linked to: AMD has worked with DICE for Battlefield. Not for Frostbite as an engine. Engines are not biased. They are objective, neutral; it's code. What does happen is that specific architectures are better suited to work with certain engines. That is precisely the other way around. Engines are not built for GPU architectures (at least, not in our generic gaming space anymore) - engines are iterative, and last many generations of GPUs. And in between engine and GPU, there is one vital part you've forgotten about: an abstraction layer, or, an API like DX11/DX12/Vulkan. That is part of the reason why engines can be iterative. The API does the translation work, to put it simply.

Look at NetImmerse/Gamebryo/Creation Engine, nearly two decades worth of games built on it. Ubisoft's Anvil is another example, used in games for the past ten years, multiple consoles and platforms.

Look at the Vega performance for Anthem compared to Battlefield. Literally nothing aligns so far with the idea that Frostbite is 'biased towards AMD'. If anything, that just applies to Battlefield, which makes sense, but that is mostly attributable not even to the engine, but to the API. AMD worked with DICE to implement Mantle, and many bits are similar between it and DX12/Vulkan.

Oh, one final note: Frostbite games were released on the PS3 (Nvidia GPU) long before AMD came to visit DICE.
https://nl.wikipedia.org/wiki/Frostbite_(engine)

Well, you might be right, actually. Thanks for the info, bro. I now get why BF games always favour AMD; I thought it was the game engine's fault. I thought the Frostbite engine was designed in a manner that favours AMD's architecture, as you said: "specific architectures are better suited to work with certain engines". But yeah, the Anthem score also made me think about that Frostbite and AMD collab. Also, AnvilNext looks Nvidia-favouring without having any Nvidia-related tech included. Care to explain that?

@Vayra86 said it better than I could have already, for the most part, though you are right about Unreal Engine these days, only for the wrong reasons. The only way Unreal Engine is "tailored towards NVIDIA" is the inclusion of NVIDIA-specific features in the base engine, but that doesn't mean that a game developer using Unreal Engine has to use those features. On that basis, one could argue that including Mantle was "tailoring towards AMD", but since DICE was actually spearheading the need for and development of the API, I wouldn't count it as such - and even then, not all games using Frostbite at the time included the support - it was built in as an option, just like the NVIDIA stuff in Unreal Engine.

It may be the case in only BF games, because BF games always seem to favour AMD a lot regardless of API implementation. Also, many UE4 games don't use Nvidia tech but still run better on Nvidia. But I do agree with him; maybe it's not the game engine but rather the game's development, regardless of sponsorship.
 
Joined
Sep 17, 2014
Messages
22,569 (6.03/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Well, you might be right, actually. Thanks for the info, bro. I now get why BF games always favour AMD; I thought it was the game engine's fault. I thought the Frostbite engine was designed in a manner that favours AMD's architecture, as you said: "specific architectures are better suited to work with certain engines". But yeah, the Anthem score also made me think about that Frostbite and AMD collab. Also, AnvilNext looks Nvidia-favouring without having any Nvidia-related tech included. Care to explain that?



It may be the case in only BF games, because BF games always seem to favour AMD a lot regardless of API implementation. Also, many UE4 games don't use Nvidia tech but still run better on Nvidia. But I do agree with him; maybe it's not the game engine but rather the game's development, regardless of sponsorship.

Many Ubisoft games are sponsored or supported by Nvidia GameWorks. TurfEffects, HBAO+, enhanced God Rays... the list is endless. Some of these GameWorks effects are more efficient than their non-GameWorks counterparts on Nvidia cards. Another one is TXAA, which was introduced with Assassin's Creed 3, for example... that's just off the top of my head.
 
Joined
Aug 6, 2017
Messages
7,412 (2.76/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
All the big Ubi titles I played recently were Nvidia-sponsored; it's been that way since AC Black Flag. Watch Dogs 2 has more Nvidia options than I can list. Odyssey is not, but Nvidia's driver team is always game-ready with a driver for such a big title, so you wouldn't feel that.
 
Joined
Jan 27, 2019
Messages
57 (0.03/day)
Many Ubisoft games are sponsored or supported by Nvidia GameWorks. TurfEffects, HBAO+, enhanced God Rays... the list is endless. Some of these GameWorks effects are more efficient than their non-GameWorks counterparts on Nvidia cards. Another one is TXAA, which was introduced with Assassin's Creed 3, for example... that's just off the top of my head.

AC Odyssey (AnvilNext) is AMD-sponsored and optimized; also, it doesn't have any Nvidia tech/features. Still, it runs much better on Nvidia than AMD. Origins wasn't sponsored/optimized for any particular brand, and it didn't have any Nvidia tech/features included either; still, it performs better on Nvidia.
 
Joined
Aug 6, 2017
Messages
7,412 (2.76/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
AC Odyssey (AnvilNext) is AMD-sponsored and optimized; also, it doesn't have any Nvidia tech/features. Still, it runs much better on Nvidia than AMD.
So you'd call that Nvidia-biased instead of giving credit to their driver support. Whether it's AMD-sponsored is not debatable; whether AMD optimized it is very questionable.
 
Joined
Jan 27, 2019
Messages
57 (0.03/day)
Whatever, the main thing was, AMD already has an Anthem-optimized Adrenalin 19.1.2 driver, as mentioned in the TPU article. So don't come with the driver excuse; the RTX 2060 is much cheaper than Vega 56/64 and gives much better performance. AMD has always had DX11 driver issues, so don't expect any better than this from them. Also, only some AMD fanboys are denying this article's credibility because they can't cope with the failure of the red team, even if that's just one game. It's very common in the AMD community to diss any article/review where AMD is underperforming and to come up with excuses to win the day. Even though I use AMD hardware, I just stay away from these brainless AMD communities.
 
Joined
Apr 17, 2008
Messages
3,935 (0.65/day)
Location
West Chester, OH
I wouldn't take any of these results seriously. By the time Anthem releases, I bet both AMD and Nvidia will have driver releases that improve performance. The VIP demo is not even using the gold build of the game (apparently it is using a 6+ week old build). I'd love a proper performance review of Anthem after it releases. Also, the amount of fanboyism around here is surprising, and it's sad to see. Come on, dudes...
 
Joined
Jan 27, 2019
Messages
57 (0.03/day)
So you'd call that Nvidia-biased instead of giving credit to their driver support. Whether it's AMD-sponsored is not debatable; whether AMD optimized it is very questionable.

It's not about blaming or denying. The thing is, I think it's actually the game engine's fault, regardless of sponsorship. That's why Frostbite engine games such as Battlefield, Battlefront, FIFA 17-19 and ME Andromeda all favoured AMD, and for the same reason almost every UE4 and AnvilNext engine game favours Nvidia regardless of who sponsored it. I might be wrong, so I was just asking, as that other guy seems to know this stuff.

I wouldn't take any of these results seriously. By the time Anthem releases, I bet both AMD and Nvidia will have driver releases that improve performance. Also... the VIP demo is not even using the gold build of the game (apparently it is using some 6+ week old build). I'd love a proper performance review of Anthem after it releases.

That seems reasonable, and I was going to say the same thing: the final product and the final drivers could have a very different end result compared to this chart. And yeah, BioWare producer Mark Darrah said this demo is actually an almost six-week-old build.
 

INSTG8R

Vanguard Beta Tester
Joined
Nov 26, 2004
Messages
8,042 (1.10/day)
Location
Canuck in Norway
System Name Hellbox 5.1(same case new guts)
Processor Ryzen 7 5800X3D
Motherboard MSI X570S MAG Torpedo Max
Cooling TT Kandalf L.C.S.(Water/Air)EK Velocity CPU Block/Noctua EK Quantum DDC Pump/Res
Memory 2x16GB Gskill Trident Neo Z 3600 CL16
Video Card(s) Powercolor Hellhound 7900XTX
Storage 970 Evo Plus 500GB 2xSamsung 850 Evo 500GB RAID 0 1TB WD Blue Corsair MP600 Core 2TB
Display(s) Alienware QD-OLED 34” 3440x1440 144hz 10Bit VESA HDR 400
Case TT Kandalf L.C.S.
Audio Device(s) Soundblaster ZX/Logitech Z906 5.1
Power Supply Seasonic TX~’850 Platinum
Mouse G502 Hero
Keyboard G19s
VR HMD Oculus Quest 3
Software Win 11 Pro x64
So you'd call that Nvidia-biased instead of giving credit to their driver support. Whether it's AMD-sponsored is not debatable; whether AMD optimized it is very questionable.
Yeah, I get "AMD Features" in Odyssey and FC5; for HDR there's a FreeSync 2 setting as opposed to generic HDR. Also, FC5 gets Rapid Packed Math; not sure about AC. But neither of those things would have any effect on NV performance.
 

M2B

Joined
Jun 2, 2017
Messages
284 (0.10/day)
Location
Iran
Processor Intel Core i5-8600K @4.9GHz
Motherboard MSI Z370 Gaming Pro Carbon
Cooling Cooler Master MasterLiquid ML240L RGB
Memory XPG 8GBx2 - 3200MHz CL16
Video Card(s) Asus Strix GTX 1080 OC Edition 8G 11Gbps
Storage 2x Samsung 850 EVO 1TB
Display(s) BenQ PD3200U
Case Thermaltake View 71 Tempered Glass RGB Edition
Power Supply EVGA 650 P2
It's not about blaming or denying. The thing is, I think it's actually the game engine's fault, regardless of sponsorship. That's why Frostbite engine games such as Battlefield, Battlefront, FIFA 17-19 and ME Andromeda all favoured AMD, and for the same reason almost every UE4 and AnvilNext engine game favours Nvidia regardless of who sponsored it. I might be wrong, so I was just asking, as that other guy seems to know this stuff.



That seems reasonable, and I was going to say the same thing: the final product and the final drivers could have a very different end result compared to this chart. And yeah, BioWare producer Mark Darrah said this demo is actually an almost six-week-old build.

Andromeda didn't favour AMD.
All BioWare titles this gen were better on Nvidia.

Turing is a very impressive architecture technically; Turing cards perform really well on both Pascal- and GCN-favoured titles.
I don't think there will be any specific engine or game that performs significantly better on GCN versus Turing [relatively], which was the case for GCN vs. Pascal.
 
Joined
Dec 22, 2011
Messages
289 (0.06/day)
Processor Ryzen 7 5800X3D
Motherboard Asus Prime X570 Pro
Cooling Deepcool LS-720
Memory 32 GB (4x 8GB) DDR4-3600 CL16
Video Card(s) PowerColor Radeon RX 7900 XTX Red Devil
Storage Samsung PM9A1 (980 Pro OEM) + 960 Evo NVMe SSD + 830 SATA SSD + Toshiba & WD HDD's
Display(s) Samsung C32HG70
Case Lian Li O11D Evo
Audio Device(s) Sound Blaster Zx
Power Supply Seasonic 750W Focus+ Platinum
Mouse Logitech G703 Lightspeed
Keyboard SteelSeries Apex Pro
Software Windows 11 Pro
Yeah, I get "AMD Features" in Odyssey and FC5; for HDR there's a FreeSync 2 setting as opposed to generic HDR. Also, FC5 gets Rapid Packed Math; not sure about AC. But neither of those things would have any effect on NV performance.
Rapid Packed Math (aka half precision / FP16) helps Volta & Turing too
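The "packed" part is easy to visualize: two 16-bit halves occupy the storage of one 32-bit float, which is what lets RPM-capable hardware issue two FP16 operations per FP32 lane. A small numpy illustration of the storage side only (the throughput doubling happens in GPU silicon, not in host code like this):

```python
import numpy as np

pair = np.array([1.5, -2.25], dtype=np.float16)  # two half-precision values
packed = pair.view(np.uint32)                    # same 4 bytes as one float32
print(pair.nbytes, "bytes -> one 32-bit word:", hex(packed[0]))
# A single FP32-wide register carries both values, so an ALU with packed
# FP16 support can operate on the pair in one instruction.
```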
 

INSTG8R

Vanguard Beta Tester
Joined
Nov 26, 2004
Messages
8,042 (1.10/day)
Location
Canuck in Norway
System Name Hellbox 5.1(same case new guts)
Processor Ryzen 7 5800X3D
Motherboard MSI X570S MAG Torpedo Max
Cooling TT Kandalf L.C.S.(Water/Air)EK Velocity CPU Block/Noctua EK Quantum DDC Pump/Res
Memory 2x16GB Gskill Trident Neo Z 3600 CL16
Video Card(s) Powercolor Hellhound 7900XTX
Storage 970 Evo Plus 500GB 2xSamsung 850 Evo 500GB RAID 0 1TB WD Blue Corsair MP600 Core 2TB
Display(s) Alienware QD-OLED 34” 3440x1440 144hz 10Bit VESA HDR 400
Case TT Kandalf L.C.S.
Audio Device(s) Soundblaster ZX/Logitech Z906 5.1
Power Supply Seasonic TX~’850 Platinum
Mouse G502 Hero
Keyboard G19s
VR HMD Oculus Quest 3
Software Win 11 Pro x64
Rapid Packed Math (aka half precision / FP16) helps Volta & Turing too
Yeah, we all benefit from our compute units that are generally underutilized.
 

PCB

New Member
Joined
Jan 29, 2019
Messages
1 (0.00/day)
Was your 2080 Ti using liquid nitrogen for the 4K results? My Titan RTX, water-cooled and shunted, was struggling to push low 60s at 2100 MHz. You guys must be staring at a wall to get 72 FPS in 4K. At 9:55 the card gets overclocked.


 