Monday, February 18th 2019

AMD Radeon VII Retested With Latest Drivers

Just two weeks ago, AMD released their Radeon VII flagship graphics card. It is based on the new Vega 20 GPU, which is the world's first graphics processor built using a 7 nanometer production process. Priced at $699, the new card offers performance levels 20% higher than Radeon RX Vega 64, which should bring it much closer to NVIDIA's GeForce RTX 2080. In our testing we still saw a 14% performance deficit compared to RTX 2080. For the launch-day reviews AMD provided media outlets with a press driver dated January 22, 2019, which we used for our review.

Since the first reviews went up, people in online communities have been speculating that these were early drivers and that newer drivers would significantly boost the performance of Radeon VII, making up lost ground against the RTX 2080. There's also the mythical "fine wine" phenomenon, where the performance of Radeon GPUs improves incrementally and significantly over time. We've put these theories to the test by retesting Radeon VII with AMD's latest Adrenalin 2019 19.2.2 drivers, using our full suite of graphics card benchmarks.
In the chart below, we show the performance deltas compared to our original review; for each title, three resolutions are tested: 1920x1080, 2560x1440, and 3840x2160 (in that order).

[Chart: per-title performance deltas vs. launch review at 1920x1080 / 2560x1440 / 3840x2160]

Please do note that these results include performance gained by the washer mod and thermal paste change that we had to do when reassembling the card. These changes reduced hotspot temperatures by around 10°C, allowing the card to boost a little higher. To verify which performance improvements were due to the new driver and which were due to the thermal changes, we first retested the card using the original press driver (with washer mod and TIM). The result was a +0.2% performance improvement.

Using the latest 19.2.2 drivers added +0.45% on top of that, for a total improvement of +0.653%. Taking a closer look at the results, we can see that two specific titles have seen significant gains from the new driver version: Assassin's Creed Odyssey and Battlefield V both achieve several-percent improvements. It looks like AMD has worked some magic in those games to unlock extra performance. The remaining titles see small but statistically significant gains, suggesting that there are some "global" tweaks AMD can implement to improve performance across the board; unsurprisingly, these gains are smaller than title-specific optimizations.
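
For those wanting to double-check the math: the per-step gains compound multiplicatively rather than additively, so the two numbers above combine to roughly +0.65% overall. A quick sketch, with made-up FPS pairs for the per-title deltas (placeholders, not our measured data):

```python
# Gains compound multiplicatively: +0.2% (thermal) then +0.45% (driver).
thermal_gain = 0.002
driver_gain = 0.0045
total = (1 + thermal_gain) * (1 + driver_gain) - 1
print(f"total improvement: {total:+.2%}")  # ~+0.65%

# Per-title deltas come from raw FPS pairs the same way.
# These FPS numbers are made up for illustration only.
old_fps = {"Battlefield V": 98.0, "Assassin's Creed Odyssey": 61.0}
new_fps = {"Battlefield V": 101.5, "Assassin's Creed Odyssey": 63.2}
for title, before in old_fps.items():
    print(f"{title}: {new_fps[title] / before - 1:+.1%}")
```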

Looking further ahead, it seems plausible that AMD can increase the performance of Radeon VII down the road, though we doubt that enough optimizations can be found to match the RTX 2080, unless a lot of developers suddenly jump on the DirectX 12 bandwagon (which seems unlikely). It's also a question of resources: AMD can't spend time and money micro-optimizing every single title out there. Rather, the company seems to be doing the right thing: investing in optimizations for big, popular titles like Battlefield V and Assassin's Creed. Given how many new titles are coming out on Unreal Engine 4, and how far AMD lags behind in those titles, I'd focus on UE4 optimizations next.

182 Comments on AMD Radeon VII Retested With Latest Drivers

#76
Xaled
cucker tarlsongod damn this thread is just too fun of a read for those who enjoy conspiracy theories.
A Theory like GPP for example?
Posted on Reply
#77
Vayra86
XaledA Theory like GPP for example?
Unproven gimped Nvidia drivers > GPP. Yep. Seems legit...

Why is this turning into an Nvidia driver performance slowchat? That horse is dead, buried, and probably cast into the ocean by now. Let it go unless you have actual data, and if you do, open a nice little topic for it. Man...
Posted on Reply
#78
M2B
jabbadapmemory subsystem is it has quite a low ROP count.
The ROPs are not part of the memory subsystem in AMD's case, as far as I know.
Posted on Reply
#79
turbogear
Xex360I've heard of some sort of undervolting which improves the card's thermals greatly; the whole process seems very easy, so why is no one bothering to use it?
Undervolting works well on all Vega generations and is very easy to do through Wattman.
I undervolted my custom watercooled Vega 64 before with nice performance boosts.
With a 1085 mV undervolt and a 2.5% frequency overclock, my Vega 64 has run for the past 1.5 years at frequencies in the 1670 MHz range, with a max temperature of 42°C, while consuming around 30 W less than at reference settings.

I have undervolted my Radeon VII to 1030 mV @ 1082 MHz.
In the two main games that I play at the moment, BF V and Black Ops 4, the average power consumption of my Radeon VII is around 250 W with this setting.
Waiting for waterblocks to become available to start fine-tuning the Radeon VII like the Vega 64.
The reference cooler is the limiting factor at the moment. :oops:
I have the same watercooling setup as I used for cooling the Vega 64 and Ryzen 2700X.
The only thing missing is a waterblock for the Radeon VII.
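
For anyone on Linux who would rather script this than click through Wattman, something like the sketch below should work via amdgpu's overdrive sysfs interface (needs root and the overdrive bit enabled in amdgpu.ppfeaturemask). The clock/voltage values are illustrative, not safe defaults for every card:

```python
# Rough sketch: undervolting through the Linux amdgpu overdrive interface.
# Run as root; requires amdgpu.ppfeaturemask with the overdrive bit set.
# Clock/voltage values below are ILLUSTRATIVE - tune for your own silicon.
OD_PATH = "/sys/class/drm/card0/device/pp_od_clk_voltage"

def od_write(cmd: str) -> None:
    with open(OD_PATH, "w") as f:
        f.write(cmd + "\n")

# Vega 20 exposes a three-point voltage curve; point 2 is the top state.
od_write("vc 2 1800 1030")  # curve point 2: 1800 MHz at 1030 mV
od_write("c")               # commit the new table
```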
Posted on Reply
#80
IceShroom
Vayra86That is great, what else you got? Maybe you ought to reread what you quoted and replied to, because your earlier comment had zero relation to what was being discussed and neither does this one.
The comment I replied to said that Nvidia, with colour compression, requires less bandwidth and VRAM. That is useful with 6-bit SDR colour and a limited (25-255) colour range. But HDR, with 8+ bit colour depth and the full 0-255 colour range, needs raw bandwidth; the article in the link shows that.
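
Back-of-the-envelope, ignoring compression entirely, raw scanout bandwidth scales linearly with bit depth; a quick sketch:

```python
# Uncompressed framebuffer traffic: width * height * bits/channel * channels * Hz.
# Ignores DCC, overdraw and chroma subsampling - purely back-of-the-envelope.
def scanout_gbs(width: int, height: int, bits_per_channel: int, hz: int) -> float:
    bits = width * height * bits_per_channel * 3 * hz  # 3 channels (RGB)
    return bits / 8 / 1e9  # GB/s

print(scanout_gbs(3840, 2160, 8, 60))   # ~1.49 GB/s, 8-bit SDR
print(scanout_gbs(3840, 2160, 10, 60))  # ~1.87 GB/s, 10-bit HDR
```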
Posted on Reply
#81
Vayra86
IceShroomThe comment I replied to said that Nvidia, with colour compression, requires less bandwidth and VRAM. That is useful with 6-bit SDR colour and a limited (25-255) colour range. But HDR, with 8+ bit colour depth and the full 0-255 colour range, needs raw bandwidth; the article in the link shows that.
Great, but the topic was the aging of cards without delta compression and whether drivers had any influence on it.
Posted on Reply
#82
xkm1948
For the shroom guy:

Buy a Radeon 7, dude. Then test performance along the way with new driver releases. We will all shut up if you give us hard numbers.

Plus you get to support AMD's GPU division. Two birds, one stone, right? I'm assuming you are not one of those hypocrites who are all talk but no action?
Posted on Reply
#83
the54thvoid
Super Intoxicated Moderator
@xkm1948 - You're like the guy from that ancient Remington advert who says, "I liked it so much, I bought the company!". Except, what a lot of people don't know is that you bought the Fury X, and were sorely let down not just by it, but by the promised software support.

I mean, I wanted to give AMD a chance and I bought a 1700X. It's alright. It's a good direction for AMD. So much so that I'll buy a Ryzen 2, guaranteed, when it releases. But the GFX dept at AMD... Not quite there yet. No matter how much some people suggest it is. I do think that if Intel throws enough money at it, they might even overtake AMD, and that will be their death knell.
Posted on Reply
#84
xkm1948
the54thvoid@xkm1948 - You're like the guy from that ancient Remington advert who says, "I liked it so much, I bought the company!". Except, what a lot of people don't know is that you bought the Fury X, and were sorely let down not just by it, but by the promised software support.
Yep. I was and still am constantly attacked by angry red mobs for pointing out problems with the Fury X. They do not hesitate for one second when it comes to defending their beloved brand.

Well, I was young and stupid and let my emotions cloud my judgement. Not anymore though...
Posted on Reply
#85
IceShroom
xkm1948For the shroom guy:

Buy a Radeon 7, dude. Then test performance along the way with new driver releases. We will all shut up if you give us hard numbers.

Plus you get to support AMD's GPU division. Two birds, one stone, right? I'm assuming you are not one of those hypocrites who are all talk but no action?
I am not a customer for $500+ GPUs; I am more interested in $120-140 GPUs.
Posted on Reply
#86
Super XP
JB_GamerIsn't it the case - the problem for AMD - that ALL games are tested and optimized for nVidia GPUs?
Nvidia seems to be a little more aggressive about getting games optimized for its GPUs, probably because AMD owns console graphics, and as of late most games are console ports onto PC. Every single console game is optimized for AMD Radeon GPUs.
IceShroomI am not a customer for $500+ GPUs; I am more interested in $120-140 GPUs.
Well, if you think about it, the maximum price tag for the highest of the high-performance GPUs should be $500 MAX. That's speaking of the Radeon VII and RTX 2080. The enthusiast price tag should be no more than $600; that would be an RTX 2080 Ti or a Radeon VII+ with custom cooling :D
GPUs, especially the high-end versions, are all overpriced, period.
Posted on Reply
#87
notb
Super XPWell, if you think about it, the maximum price tag for the highest of the high-performance GPUs should be $500 MAX. That's speaking of the Radeon VII and RTX 2080. The enthusiast price tag should be no more than $600; that would be an RTX 2080 Ti or a Radeon VII+ with custom cooling :D
OK. I've thought about it. It turns out the proper maximum price for a high-end GPU should be $2008.55. There's a big difference between our results. :/ How did you arrive at $500?
Posted on Reply
#88
turbogear
@W1zzard
Thanks a lot for the effort to retest. :toast:
As a Radeon VII owner, I was not expecting any large performance increase between 19.2.1 and 19.2.2; the time frame between the two is, I believe, too short.
It did, though, manage to fix a few issues with Wattman and overclocking.

What I personally noticed is that Black Ops 4 is running better. With 19.2.1, on my PC the framerate was swinging rapidly between 70 FPS and 120 FPS with every movement in game, and the GPU clock had a lot of dips. With the new driver it mostly stays above 100 FPS, close to 120 FPS, and the GPU clock is relatively stable. I have a BenQ Zowie XL2730 FreeSync 144 Hz monitor.
Posted on Reply
#89
HD64G
xkm1948Dude, I am ALL for scientific methods. I am a molecular geneticist. You make a hypothesis and you back it up with an experimental setup and results. Anyone can go around all day and fart claims out of thin air. But to make solid claims one has to get their hands dirty and do the experiments: FineWine as one of many examples.
Since you are fond of numbers that prove things scientifically, here is an example showing that AMD GPUs aren't in their best form at launch, which helps us customers get an equal or better product at a better price than it will deserve only a few months later. And for a customer who keeps his hardware for at least 3 years, this is an opportunity.

Posted on Reply
#90
Nkd
lasI'm simply stating facts. I wish that AMD GPUs were able to compete... All their "high-end" solutions have been terrible recently, leaving this segment to Nvidia.
Not sure what you mean. Does this card not compete at all? AMD needed CPUs to survive, so they focused on that; otherwise you would have no AMD and no Radeon graphics. Now that the CPU side is going well, they will eventually get to the GPU side. Given their need to survive as a company, I am not sure how this is so bad. They somehow managed to stay relevant, which in itself is good enough, while starting to give Intel some serious competition on the CPU side. AMD did what they had to and thought long term. If they had invested more in the GPU side, they might not be in business right now, because the CPU side is where they are going to survive as a company. It's not a matter of if, just when. I think in the next few years we will have their Zen of GPUs too.
Posted on Reply
#91
trog100
well i am glad i forked out for a 2080ti.. at least i dont have to worry about the rest..

i follow all this stuff because it interests me.. but its all being spoiled by too much negativity.. too much entitlement infringement.. too much silly red green bickering..
huge lengthy threads about f-ck all...

trog
Posted on Reply
#92
xkm1948
HD64GSince you are fond of numbers that prove things scientifically, here is an example showing that AMD GPUs aren't in their best form at launch, which helps us customers get an equal or better product at a better price than it will deserve only a few months later. And for a customer who keeps his hardware for at least 3 years, this is an opportunity.

This is exactly my point. For people who don’t believe in W1zzard’s review, go get a Radeon 7. Test it along the life span of the GPU and report back how it ages.
Posted on Reply
#93
Countryside
xkm1948This is exactly my point. For people who don’t believe in W1zzard’s review, go get a Radeon 7. Test it along the life span of the GPU and report back how it ages.
W1zzard's reviews are of top quality and I don't doubt his ability, but he is one person. Most people don't have the expertise or the equipment to test hardware, so they rely on reviews, and you don't make an educated decision based on one review.
Posted on Reply
#94
notb
Vayra86The point is, those examples exist and we already know what's up. Stop beating the dead horse...
While I'm not a huge fan of the argument quality we've seen here, I do share @xkm1948's enthusiasm for a more scientific approach (as shown earlier).
I mean: we've been hearing about FineWine for years and it doesn't look like this could ever stop. There are, as you said, "examples" or "traces" that something could be going on, but that's a bit like proofs of the existence of ether or gods: we've seen some weird stuff going on, so let's assume there's a reason and give it a catchy name.

We had a nice discussion about quality of journalism in another topic. Seriously, wouldn't you like to see a proper experiment for this theory?
It's not that hard either. You take a few cards from different generations, 20-30 driver editions released over 5 years, a few AAA games and you measure all data points.
Instead, we have hundreds of thousands of people on thousands of forums talking about a phenomenon that no one has ever confirmed in a controlled environment. And it's been going on for years. And it will keep going for as long as AMD makes GPUs.

Why haven't any of the large tech sites done this? Are they afraid of the result and an angry mob from either side?
Why hasn't AMD done this?
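
The analysis side would be trivial too; a sketch with placeholder numbers, nothing measured:

```python
from statistics import geometric_mean

# Hypothetical results matrix: fps[(card, driver, game)] from a fixed bench.
# All numbers below are placeholders, not measurements.
fps = {
    ("Vega 64", "launch", "GameA"): 60.0, ("Vega 64", "latest", "GameA"): 63.0,
    ("Vega 64", "launch", "GameB"): 45.0, ("Vega 64", "latest", "GameB"): 46.0,
}

def driver_index(card: str, driver: str, baseline: str, games: list) -> float:
    """Geometric mean of per-game speedups vs. the baseline driver."""
    return geometric_mean(fps[(card, driver, g)] / fps[(card, baseline, g)]
                          for g in games)

print(driver_index("Vega 64", "latest", "launch", ["GameA", "GameB"]))  # ~1.036
```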
Posted on Reply
#95
xkm1948
notbWhile I'm not a huge fan of the argument quality we've seen here, I do share @xkm1948's enthusiasm for a more scientific approach (as shown earlier).
I mean: we've been hearing about FineWine for years and it doesn't look like this could ever stop. There are, as you said, "examples" or "traces" that something could be going on, but that's a bit like proofs of the existence of ether or gods: we've seen some weird stuff going on, so let's assume there's a reason and give it a catchy name.

We had a nice discussion about quality of journalism in another topic. Seriously, wouldn't you like to see a proper experiment for this theory?
It's not that hard either. You take a few cards from different generations, 20-30 driver editions released over 5 years, a few AAA games and you measure all data points.
Instead, we have hundreds of thousands of people on thousands of forums talking about a phenomenon that no one has ever confirmed in a controlled environment. And it's been going on for years. And it will keep going for as long as AMD makes GPUs.

Why haven't any of the large tech sites done this? Are they afraid of the result and an angry mob from either side?
Why hasn't AMD done this?
HardOCP did. They found Nvidia does not gimp their cards, and AMD does not FineWine either.
Posted on Reply
#96
HammerON
The Watchful Moderator
Warning and reply bans issued. No more personal attacks. Play nice or move along.
Posted on Reply
#97
notb
xkm1948HardOCP did. They found Nvidia does not gimp their cards, and AMD does not FineWine either.
OK, I've just checked it. Not enough data points, wrong presentation. But otherwise the right direction. Thanks.
Posted on Reply
#98
bug
notbOK, I've just checked it. Not enough data points, wrong presentation. But otherwise the right direction. Thanks.
Wrong attitude, I'd say. You're looking for someone to disprove, by the book, something that has only been proven anecdotally.
Posted on Reply
#99
Mistral
If I may, a suggestion for benchmarks going forward: have a small icon next to the game titles indicating whether they are NVIDIA- or AMD-sponsored.
Posted on Reply
#100
xkm1948
On long-term performance: people don't usually keep their cards long enough to see the impact. At the current rate of progression, most modern GPUs have a useful gaming lifespan of 3-4 years. By the time the FineWine is ready, the GPU is already obsolete.
Posted on Reply