Monday, February 18th 2019

AMD Radeon VII Retested With Latest Drivers

Just two weeks ago, AMD released their Radeon VII flagship graphics card. It is based on the new Vega 20 GPU, which is the world's first graphics processor built using a 7 nanometer production process. Priced at $699, the new card offers performance levels 20% higher than Radeon RX Vega 64, which should bring it much closer to NVIDIA's GeForce RTX 2080. In our testing we still saw a 14% performance deficit compared to RTX 2080. For the launch-day reviews AMD provided media outlets with a press driver dated January 22, 2019, which we used for our review.

Since the first reviews went up, people in online communities have speculated that these were early drivers and that newer drivers would significantly boost the performance of Radeon VII, making up lost ground against the RTX 2080. There's also the fabled "fine wine" phenomenon, in which the performance of Radeon GPUs improves incrementally over time. We put these theories to the test by retesting Radeon VII with AMD's latest Adrenalin 2019 19.2.2 drivers, using our full suite of graphics card benchmarks.
In the chart below, we show the performance deltas compared to our original review; for each title, three resolutions are tested: 1920x1080, 2560x1440, and 3840x2160 (in that order).
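For reference, the deltas in such a chart are simple relative FPS changes against the launch-review numbers. A minimal sketch of that calculation, using hypothetical placeholder FPS values (not our actual measurements):

```python
# Percentage delta of retest FPS vs. launch-review FPS, per title and resolution.
# All FPS numbers below are hypothetical placeholders, not actual review data.

def perf_delta(old_fps: float, new_fps: float) -> float:
    """Relative performance change in percent."""
    return (new_fps / old_fps - 1.0) * 100.0

results = {
    # title: {resolution: (launch_fps, retest_fps)}
    "Battlefield V": {"1920x1080": (110.0, 113.5), "2560x1440": (88.0, 90.5), "3840x2160": (48.0, 49.2)},
    "The Witcher 3": {"1920x1080": (120.0, 120.4), "2560x1440": (95.0, 95.3), "3840x2160": (55.0, 55.1)},
}

for title, resolutions in results.items():
    for res, (old, new) in resolutions.items():
        print(f"{title} @ {res}: {perf_delta(old, new):+.1f}%")
```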



Please do note that these results include performance gained from the washer mod and thermal paste change we had to make when reassembling the card. These changes reduced hotspot temperatures by around 10°C, allowing the card to boost slightly higher. To verify which performance improvements were due to the new driver and which were due to the thermal changes, we first retested the card using the original press driver (with washer mod and TIM). The result was a +0.2% performance improvement.

Using the latest 19.2.2 drivers added +0.45% on top of that, for a total improvement of +0.65%. Taking a closer look at the results, we can see that two titles saw significant gains from the new driver version: Assassin's Creed Odyssey and Battlefield V both achieve several-percent improvements, so it looks like AMD has worked some magic in those games to unlock extra performance. The remaining titles see small but statistically significant gains, suggesting there are some "global" tweaks AMD can implement to improve performance across the board; unsurprisingly, these gains are smaller than title-specific optimizations.
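Note that per-stage gains of this kind compound multiplicatively rather than adding up exactly, though for percentages this small the difference is negligible. A quick sketch using the two figures from this retest:

```python
# Small per-stage performance gains compound multiplicatively:
# total = (1 + g1) * (1 + g2) - 1, which for small gains is close to g1 + g2.

def compound(*gains_pct: float) -> float:
    """Combine percentage gains multiplicatively; returns total gain in percent."""
    total = 1.0
    for g in gains_pct:
        total *= 1.0 + g / 100.0
    return (total - 1.0) * 100.0

thermal_gain = 0.2   # washer mod + TIM change, still on the old press driver
driver_gain = 0.45   # Adrenalin 19.2.2 on top of that

print(f"total: +{compound(thermal_gain, driver_gain):.2f}%")  # ~ +0.65%
```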

Looking further ahead, it seems plausible that AMD can increase the performance of Radeon VII down the road, though we doubt enough optimizations can be found to match the RTX 2080, unless a lot of developers suddenly jump on the DirectX 12 bandwagon (which seems unlikely). It's also a question of resources: AMD can't spend time and money micro-optimizing every single title out there. Rather, the company seems to be doing the right thing: investing in optimizations for big, popular titles like Battlefield V and Assassin's Creed. Given how many new titles are coming out on Unreal Engine 4, and how far behind AMD is in those titles, I'd focus on UE4 optimizations next.

182 Comments on AMD Radeon VII Retested With Latest Drivers

#51
londiste
The game list is wrong. The actual games tested in Radeon VII review along with year, API and engine name are:
  • 2018 - DX11 - Assassin's Creed Odyssey (AnvilNext 2.0)
  • 2018 - DX11 - Battlefield V (Frostbite 3)
  • 2016 - DX11 - Civilization VI (Firaxis)
  • 2018 - DX11 - Darksiders 3 (Unreal Engine 4)
  • 2016 - DX12 - Deus Ex: Mankind Divided (Dawn)
  • 2017 - DX11 - Divinity Original Sin II (Divinity Engine)
  • 2018 - DX11 - Dragon Quest XI (Unreal Engine 4)
  • 2018 - DX11 - F1 2018 (EGO Engine 4.0)
  • 2018 - DX11 - Far Cry 5 (Dunia)
  • 2017 - DX11 - Ghost Recon Wildlands (AnvilNext)
  • 2015 - DX11 - Grand Theft Auto V (RAGE - Rockstar Advanced Game Engine)
  • 2017 - DX11 - Hellblade: Senua's Sacrifice (Unreal Engine 4)
  • 2018 - DX11 - Hitman 2 (Glacier 2.0)
  • 2018 - DX11 - Just Cause 4 (Apex)
  • 2018 - DX11 - Monster Hunter World (MT Framework)
  • 2017 - DX11 - Middle-earth: Shadow of War (LithTech)
  • 2015 - DX11 - Rainbow Six: Siege (AnvilNext)
  • 2018 - DX12 - Shadow of the Tomb Raider (Foundation)
  • 2018 - DX12 - Strange Brigade (Asura Engine)
  • 2015 - DX11 - The Witcher 3 (REDengine 3)
  • 2017 - Vulkan - Wolfenstein II (idTech6)
Posted on Reply
#52
cucker tarlson
londisteThe game list is wrong. The actual games tested in Radeon VII review along with year, API and engine name are:
  • 2018 - DX11 - Assassin's Creed Odyssey (AnvilNext 2.0)
  • 2018 - DX11 - Battlefield V (Frostbite 3)
  • 2016 - DX11 - Civilization VI (Firaxis)
  • 2018 - DX11 - Darksiders 3 (Unreal Engine 4)
  • 2016 - DX12 - Deus Ex: Mankind Divided (Dawn)
  • 2017 - DX11 - Divinity Original Sin II (Divinity Engine)
  • 2018 - DX11 - Dragon Quest XI (Unreal Engine 4)
  • 2018 - DX11 - F1 2018 (EGO Engine 4.0)
  • 2018 - DX11 - Far Cry 5 (Dunia)
  • 2017 - DX11 - Ghost Recon Wildlands (AnvilNext)
  • 2015 - DX11 - Grand Theft Auto V (RAGE - Rockstar Advanced Game Engine)
  • 2017 - DX11 - Hellblade: Senua's Sacrifice (Unreal Engine 4)
  • 2018 - DX11 - Hitman 2 (Glacier 2.0)
  • 2018 - DX11 - Just Cause 4 (Apex)
  • 2018 - DX11 - Monster Hunter World (MT Framework)
  • 2017 - DX11 - Middle-earth: Shadow of War (LithTech)
  • 2015 - DX11 - Rainbow Six: Siege (AnvilNext)
  • 2018 - DX12 - Shadow of the Tomb Raider (Foundation)
  • 2018 - DX12 - Strange Brigade (Asura Engine)
  • 2015 - DX11 - The Witcher 3 (REDengine 3)
  • 2017 - Vulkan - Wolfenstein II (idTech5)
Isn't Wolfenstein idTech 7 or 6+? I remember Doom was idTech 6 and then the devs said that Wolfenstein was a big technical advancement over that. Supports half precision and variable rate shading.
Posted on Reply
#53
SIGSEGV
notbThis is an accusation you could take to court and become a millionaire. Can you prove it?
accusation? really?
I tested various driver versions on my GPU and benched them. pfftt..
Posted on Reply
#54
londiste
Looking at the list of games:
- Assassin's Creed, Battlefield, Civilization, Far Cry, GTA, Just Cause, Tomb Raider, Witcher and Wolfenstein need to be in the list as the latest iterations of long-running game series along with their engines. The same applies to Hitman and possibly F1.
- Strange Brigade as a game is a one-off, but its engine is a newer iteration of the one behind Sniper Elite 4, which is one of the best DX12 implementations to date.
- Divinity: Original Sin 2, Monster Hunter and Shadow of War are a bit of one-offs, as relevant and popular games running unique engines.
- Hellblade is a game that is artistically important and actually has a good implementation of Unreal Engine 4.
- R6: Siege is a unique case: despite its release year it is a current and competitive game that is fairly heavy on GPUs.
- Deus Ex: Mankind Divided is a bit of a concession to AMD and DX12. This is a modified version of the same Glacier 2 engine that is behind the Hitman games.
- I am not too sure about the choice or relevance of Darksiders 3, Dragon Quest XI and Wildlands. The latest big UE4 releases and one of Ubisoft's non-Assassin's Creed open-world games?

It is not productive to color games based on whether they use Nvidia GameWorks. The main problem with GameWorks, as far as Nvidia vs AMD is concerned, was that it was closed source, making it impossible for AMD to optimize for it if needed. GameWorks has been open source since 2016 or so. AMD does not have a branded program in the same way; GPUOpen and the tools and effects provided in it are non-branded but are present in a lot of games.
cucker tarlsonIsn't Wolfenstein idTech 7 or 6+? I remember Doom was idTech 6 and then the devs said that Wolfenstein was a big technical advancement over that. Supports half precision and variable rate shading.
Wolfenstein II is idTech6. Fixed. Thanks.
Posted on Reply
#55
moproblems99
Good to see they sorted things out. Would have been nice to have this upfront considering the architecture isn't exactly new or anything.
trog100it seems to me that nvidea make better graphics card than amd which leaves those in camp having a difficult time justifying exactly why they are in red camp..
The real problem is that anyone cares. Grow up and move on. (Not directed at you.)
SIGSEGVI tested various driver versions on my GPU and benched them.
They also could have introduced an issue accidentally and not circled back, because it is not current gen.
Posted on Reply
#56
cucker tarlson
I don't think there's much in it



*there's no performance degradation for the 780 Ti; there's a very slight improvement.
*780 Ti vs 290X on 2016 drivers - the 780 Ti wins 18 runs out of 28, the 290X 10 out of 28.
Posted on Reply
#57
londiste
Both AMD and Nvidia have a list of featured games:
www.amd.com/en/gaming/featured-games
www.nvidia.com/en-us/geforce/games/latest-pc-games/#gameslist

I doubt W1zzard had that in mind or checked it when choosing games but there are 6 games benchmarked from both vendors' featured list and the games were not what I really expected:
AMD: Assassin's Creed Odyssey, Civilization VI, Deus Ex: Mankind Divided, Far Cry 5, Grand Theft Auto V, Strange Brigade.
Nvidia: Battlefield V, Darksiders 3, Divinity Original Sin II, Dragon Quest XI, Monster Hunter World, Shadow of the Tomb Raider.
:)
Posted on Reply
#58
INSTG8R
Vanguard Beta Tester
rvalenciaFrom www.techpowerup.com/reviews/AMD/Radeon_VII/

Assassin's Creed Origins (NVIDIA Gameworks, 2017)
Battlefield V RTX (NVIDIA Gameworks, 2018)
Civilization VI (2016)
Darksiders 3 (NVIDIA Gameworks, 2018), old game remaster, where's Titan Fall 2.
Deus Ex: Mankind Divided (AMD, 2016)
Divinity Original Sin II (NVIDIA Gameworks, 2017)
Dragon Quest XI (Unreal 4 DX11, large NVIDIA bias, 2018)
F1 2018 (2018), Why? Microsoft's Forza franchise is larger than this Codemaster game.
Far Cry 5 (AMD, 2018)
Ghost Recon Wildlands (NVIDIA Gameworks, 2017), missing Tom Clancy's The Division
Grand Theft Auto V (2013)
Hellblade: Senuas Sacrif (Unreal 4 DX11, NVIDIA Gameworks)
Hitman 2
Monster Hunter World (NVIDIA Gameworks, 2018)
Middle-earth: Shadow of War (NVIDIA Gameworks, 2017)
Prey (DX11, NVIDIA Bias, 2017)
Rainbow Six: Siege (NVIDIA Gameworks, 2015)
Shadows of Tomb Raider (NVIDIA Gameworks, 2018)
SpellForce 3 (NVIDIA Gameworks, 2017)
Strange Brigade (AMD, 2018)
The Witcher 3 (NVIDIA Gameworks, 2015)
Wolfenstein II (2017, NVIDIA Gameworks), Results different from www.hardwarecanucks.com/forum/hardware-canucks-reviews/78296-nvidia-geforce-rtx-2080-ti-rtx-2080-review-17.html when certain Wolfenstein II map exceeded RTX 2080'
Except it's AC Odyssey and it's AMD, but good effort regardless...
Posted on Reply
#59
M2B
medi01You just need to stop projecting. It isn't hard.



I mean, just how shameless can it become, seriously?
You got what performance upfront, when 960 beat 780Ti ($699), come again?

AMD perf improves over time; Nvidia falls behind not only AMD, but also its own newer cards.
As the card you bought gets older, NV doesn't give a flying sex act.
It needs quite a twisting to turn this into something positive.

What's rather unusual this time is AMD being notably worse at the perf/$ edge, at least with the game list picked at TPU.

290x was slower than 780Ti at launch, but it cost $549 vs $699, so there goes "I get 10% at launch" again.
Kepler architecture aged badly for some reason, but Maxwell has aged as it should.
The 1060 and 980 were at the same level of performance back in 2016 and they still are in 2019.
Nothing has changed (except for games that need more than 4GB of VRAM).

I don't think someone who spends $700 on a card even cares about its performance after ~3 years; a high-end owner needs high-end performance and generally upgrades sooner than a mid-range user.
Posted on Reply
#60
Unregistered
I've heard of some sort of undervolting which improves the card's thermals greatly; the whole process seems very easy, so why is no one bothering to use it?
Posted on Reply
#61
moproblems99
While all of this is very interesting, I fail to see how it impacts this gpu. New thread maybe? Perhaps some conspiracy can be brought to light from it.
Posted on Reply
#62
Vayra86
M2BKepler architecture aged badly for some reason, but Maxwell has aged as it should.
The 1060 and 980 were at the same level of performance back in 2016 and they still are in 2019.
Nothing has changed (except for games that need more than 4GB of VRAM).
Kepler aged badly because of VRAM, and that is exactly where AMD had more to offer in the high-end at the time.

They had a 7970 with 3GB VRAM to compete with GTX 670/680 2GB.
And later they had a 290X with 4GB VRAM to compete with GTX 780/780ti 3GB.

At the same time, the mainstream resolution slowly started moving to 1440p as Korean IPS panels were cheap overseas and many enthusiasts imported one. This heavily increased VRAM demands, alongside the consoles being released with 6GB to address, which meant the mainstream would quickly move to higher VRAM demands; it happened across just 1.5 generations of GPUs. Even in the Nvidia camp, VRAM almost doubled across the whole stack, and then doubled again with Pascal. That is why people are liable to think AMD cards 'aged well' and Nvidia cards lost performance over time. This culminated in the release of the 3.5GB 'fast' VRAM GTX 970. That little bit of history ALSO underlines why AMD now releases a 16GB HBM card: they are banking on the idea that people THINK it might double again over time, which is why you can find some people advocating the 16GB as a good thing for gaming. And of course the expense of having to alter the card.

If any supporter of Red needed confirmation bias, there it is :toast:. But it doesn't make it any less of an illusion that Nvidia drivers handicap performance over time.
Posted on Reply
#63
EarthDog
Xex360I've heard of some sort of undervolting which improves the card's thermals greatly, the whole process seems very easy, why no one is bothering to use it?
They are... but not in reviews... the majority of people don't bother. There is also the point of, why should anyone have to do this in the first place?
Posted on Reply
#65
M2B
Vayra86Kepler aged badly because of VRAM, and that is exactly where AMD had more to offer in the high-end at the time.

They had a 7970 with 3GB VRAM to compete with GTX 670/680 2GB.
And later they had a 290X with 4GB VRAM to compete with GTX 780/780ti 3GB.

At the same time, the mainstream resolution slowly started moving to 1440p as Korean IPS panels were cheap overseas and many enthusiasts imported one. This heavily increased VRAM demands, alongside the consoles being released with 6GB to address, which meant the mainstream would quickly move to higher VRAM demands; it happened across just 1.5 generations of GPUs. Even in the Nvidia camp, VRAM almost doubled across the whole stack, and then doubled again with Pascal. That is why people are liable to think AMD cards 'aged well' and Nvidia cards lost performance over time. This culminated in the release of the 3.5GB 'fast' VRAM GTX 970. That little bit of history ALSO underlines why AMD now releases a 16GB HBM card: they are banking on the idea that people THINK it might double again over time, which is why you can find some people advocating the 16GB as a good thing for gaming. And of course the expense of having to alter the card.

If any supporter of Red needed confirmation bias, there it is :toast:. But it doesn't make it any less of an illusion that Nvidia drivers handicap performance over time.
True and untrue.
It's not just VRAM; yeah, VRAM requirements have risen significantly since then, but don't forget you can easily remove the VRAM bottleneck by lowering the texture quality.
The 970 was slower than the 780Ti at launch but it's not the case today; it's actually ahead, even in cases where the VRAM isn't a limiting factor.
Posted on Reply
#66
IceShroom
M2BTrue and untrue.
It's not just VRAM; yeah, VRAM requirements have risen significantly since then, but don't forget you can easily remove the VRAM bottleneck by lowering the texture quality.
The 970 was slower than the 780Ti at launch but it's not the case today; it's actually ahead, without VRAM being a limiting factor.
Pay $350+ just to play with lower texture quality!!!!
I have better advice: buy a console (or two).
Posted on Reply
#67
Vayra86
But yeah, it's not just VRAM. If you speak of aging with regard to the 3GB cards, we've also seen the introduction of delta compression and much-improved GPU Boost for Nvidia starting with Maxwell, while AMD was busy rebranding everything. Maxwell was the crucial moment where they actually gave up and created the gaping hole in performance and perf/watt we're still looking at today.
Posted on Reply
#68
jabbadap
Vayra86Kepler aged badly because of VRAM, and that is exactly where AMD had more to offer in the high-end at the time.

They had a 7970 with 3GB VRAM to compete with GTX 670/680 2GB.
And later they had a 290X with 4GB VRAM to compete with GTX 780/780ti 3GB.

At the same time, the mainstream res started slowly moving to 1440p as Korean IPS panels were cheap overseas and many enthusiasts imported one. This heavily increased VRAM demands, alongside the consoles being released with 6GB to address, which meant mainstream would quickly move to higher VRAM demands, and it happened across just 1,5 generation of GPUs, even in the Nvidia camp the VRAM almost doubled across the whole stack, and then doubled again with Pascal. That is why people are liable to think AMD cards 'aged well' and Nvidia cards lost performance over time. This culminated in the release of the 3.5GB 'fast' VRAM GTX 970. That little bit of history ALSO underlines why AMD now releases a 16GB HBM card. They are banking on the idea that people THINK it might double again over time, that is why you can find some people advocating the 16GB as a good thing for gaming. And of course the expenses of having to alter the card.

If any supporter of Red needed confirmation bias, there it is :toast:. But it doesn't make it any less of an illusion that Nvidia drivers handicap performance over time.
Not only that, it does not have tiled rasterization either, so memory bandwidth became a problem for it too. I would be interested to see the original Kepler Titan added to the TPU benchmarks; performance should be close to the RX 570/GTX 1060 3GB at 1080p. @W1zzard still has one? One interesting bit: there were 6GB GTX 780s too, but the GTX 780 Ti was always 3GB, as the 6GB card was the Titan Black... But yeah, we are going way off topic now.

Radeon VII has a lot of VRAM; the only weakness in its memory subsystem is a fairly low ROP count. In normal tasks that is more than enough, but using MSAA can really tank performance. Luckily for AMD, MSAA is a dying breed; fewer and fewer games support that AA method anymore.
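To illustrate why MSAA leans on the ROPs: each pixel stores and resolves multiple color/depth samples, so the raw sample workload scales roughly linearly with the MSAA factor. A back-of-envelope sketch; the ROP count is Radeon VII's published 64, but the sustained clock is a rough assumption, and real workloads add overdraw and blending on top:

```python
# MSAA stores and blends N color/depth samples per pixel, so ROP and bandwidth
# work scale roughly linearly with the sample count. Rough illustration only.

def samples_per_second(width: int, height: int, msaa: int, fps: float) -> float:
    """Raw shaded-sample throughput needed for a given resolution, MSAA level, and frame rate."""
    return width * height * msaa * fps

# Radeon VII: 64 ROPs; ~1.75 GHz sustained clock is an assumption, not a measurement.
rop_throughput = 64 * 1.75e9  # samples/s upper bound at one sample per ROP per clock

for msaa in (1, 2, 4, 8):
    need = samples_per_second(3840, 2160, msaa, 60.0)
    print(f"4K60 @ {msaa}x MSAA: {need / 1e9:.2f} Gsamples/s "
          f"({100 * need / rop_throughput:.1f}% of raw ROP rate)")
```

The naive totals look small against the raw ROP rate, but in practice overdraw, blending and depth traffic multiply the per-sample work several times over, which is where a low ROP count starts to hurt.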
Posted on Reply
#69
Xaled
W1zzard says:
I'd focus on optimizations for UE4 next.

Nvidia is working closely with UE4 right now. I don't think AMD would risk working on UE4 optimizations only to have all that labor undone by a patch that lets AMD cards fall behind again. Just like Nvidia did with the DX11 HAWX game.
Posted on Reply
#70
SIGSEGV
cucker tarlsonI don't think there's much in it

*there's no performance degradation for the 780 Ti; there's a very slight improvement.
*780 Ti vs 290X on 2016 drivers - the 780 Ti wins 18 runs out of 28, the 290X 10 out of 28.
That's the main reason I stay with so-called "ANCIENT" driver. Thanks.
Posted on Reply
#71
cucker tarlson
SIGSEGVThat's the main reason I stay with so-called "ANCIENT" driver. Thanks.
what is the reason?
Posted on Reply
#72
SIGSEGV
cucker tarlsonwhat is the reason?
*there's no performance degradation for the 780 Ti; there's a very slight improvement.
you gave me an old graph that was supposedly released somewhere in 2016
Posted on Reply
#73
IceShroom
Vayra86But yeah, its not just VRAM, but if you speak of aging wrt the 3GB cards, we've also seen the introduction of delta compression and much improved GPU Boost for Nvidia starting with Maxwell, while AMD was busy rebranding everything. Maxwell was that crucial moment where they actually gave up and created the gaping hole in performance and perf/watt we're still looking at today.
Colour compression may be useful for SDR but not in HDR; enabling HDR on a GTX 1080 makes it slower than a Vega 64.
www.computerbase.de/2018-07/hdr-benchmarks-amd-radeon-nvidia-geforce/2/#diagramm-destiny-2-3840-2160
Posted on Reply
#75
cucker tarlson
SIGSEGVyou gave me the old graph that supposed released somewhere in 2016
so the gimping occurred after 2016 then?
when exactly ?
Posted on Reply