# AMD Radeon VII Retested With Latest Drivers



## W1zzard (Feb 18, 2019)

Just two weeks ago, AMD released their Radeon VII flagship graphics card. It is based on the new Vega 20 GPU, which is the world's first graphics processor built using a 7 nanometer production process. Priced at $699, the new card offers performance levels 20% higher than Radeon RX Vega 64, which should bring it much closer to NVIDIA's GeForce RTX 2080. In our testing we still saw a 14% performance deficit compared to RTX 2080. For the launch-day reviews AMD provided media outlets with a press driver dated January 22, 2019, which we used for our review.

Since the first reviews went up, people in online communities have been speculating that these were early drivers and that new drivers would significantly boost the performance of Radeon VII, making up lost ground against the RTX 2080. There's also the mythical "fine wine" phenomenon, where the performance of Radeon GPUs improves significantly over time. We've put these theories to the test by retesting Radeon VII with AMD's latest Adrenalin 2019 19.2.2 drivers, using our full suite of graphics card benchmarks.



 


In the chart below, we show the performance deltas compared to our original review. For each title, three resolutions are tested: 1920x1080, 2560x1440, and 3840x2160 (in that order).







Please do note that these results include performance gained by the washer mod and thermal paste change that we had to make when reassembling the card. These changes reduced hotspot temperatures by around 10°C, allowing the card to boost a little higher. To verify which performance improvements were due to the new driver and which were due to the thermal changes, we first retested the card using the original press driver (with washer mod and TIM). The result was a +0.2% performance improvement.

Using the latest 19.2.2 drivers added +0.45% on top of that, for a total improvement of +0.653%. Taking a closer look at the results, we can see that two specific titles have seen significant gains due to the new driver version: Assassin's Creed Odyssey and Battlefield V both achieve several-percent improvements, so it looks like AMD has worked some magic in those games to unlock extra performance. The remaining titles see small but statistically significant gains, suggesting that there are some "global" tweaks AMD can implement to improve performance across the board; unsurprisingly, these gains are smaller than title-specific optimizations.
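As a side note on the arithmetic: the thermal and driver gains compound multiplicatively rather than adding up. A quick sketch using the rounded figures above (rounding is why the naive result differs slightly from the measured +0.653%):

```python
# Performance deltas compound multiplicatively rather than adding.
# Values are the rounded figures quoted in the text above.
thermal_gain = 0.002   # +0.2% from the washer mod + thermal paste (old driver)
driver_gain = 0.0045   # +0.45% from the 19.2.2 driver on top of that

total_gain = (1 + thermal_gain) * (1 + driver_gain) - 1
print(f"total: {total_gain:+.2%}")  # roughly +0.65%
```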

Looking further ahead, it seems plausible that AMD can increase the performance of Radeon VII down the road, even though we doubt that enough optimizations can be discovered to match RTX 2080; perhaps if a lot of developers suddenly jump on the DirectX 12 bandwagon (which seems unlikely). It's also a question of resources: AMD can't spend time and money micro-optimizing every single title out there. Rather, the company seems to be doing the right thing: investing in optimizations for big, popular titles like Battlefield V and Assassin's Creed. Given how many new titles are coming out using Unreal Engine 4, and how much AMD is lagging behind in those titles, I'd focus on optimizations for UE4 next.

*View at TechPowerUp Main Site*


----------



## ShurikN (Feb 18, 2019)

So basically, aside from 2 titles, everything is within margin of error.


----------



## Flyordie (Feb 18, 2019)

Really, all they should do is focus on optimizing for the major game engines. 

I mean, over time there will be more and more refinements as their driver teams discover more ways to optimize. Right now, this seems more like a bug-fix and stability update to the drivers than anything else.


----------



## phill (Feb 18, 2019)

I still like the card and the performance it gives; it's a shame that it's not all it can be, though. Here's hoping some time it will be just where it needs to be.


----------



## bug (Feb 18, 2019)

> Since the first reviews went up, people in online communities have been speculating that these were early drivers and that new drivers would significantly boost the performance of Radeon VII, making up lost ground against the RTX 2080.


Can anyone remember the last AMD launch where disappointing performance wasn't expected to be mitigated in upcoming drivers?


> There's also the mythical "fine wine" phenomenon, where the performance of Radeon GPUs improves significantly over time.


Which won't go away, despite HardOCP looking at AMD's drivers over time and finding very modest increases, with a few (one) exceptions: https://www.hardocp.com/article/2017/01/30/amd_video_card_driver_performance_review_fine_wine/


----------



## londiste (Feb 18, 2019)

> Taking a closer look at the results we can see that two specific titles have seen significant gains due to the new driver version. Assassin's Creed Odyssey, and Battlefield V both achieve multi-digit improvements, looks like AMD has worked some magic in those games, to unlock extra performance.


Is the graph wrong? 5-6% is not multi-digit.


----------



## robert3892 (Feb 18, 2019)

I'd like to test this myself, but sadly my Radeon VII was dead on arrival and I had to send it back for a replacement. I suppose I'll get the replacement next week.


----------



## medi01 (Feb 18, 2019)

W1zzard said:


> the mythical "fine wine"



Remind me how 780Ti fared against 290x.
"Mythical" eh?



bug said:


> AMD launch where disappointing performance


Yay, that feeling aspect of AMD products is really something special now, isn't it?

I mean, there should be reasons a $139 1050 Ti outsells the 1.5-2 times faster 570 priced at $99, shouldn't there?


----------



## W1zzard (Feb 18, 2019)

londiste said:


> Is the graph wrong? 5-6% is not multi-digit.


Whoops, I meant "multiple-percent", fixed now


----------



## londiste (Feb 18, 2019)

medi01 said:


> I mean, there should be reasons a $139 1050 Ti outsells the 1.5-2 times faster 570 priced at $99, shouldn't there?


Stop harping on that. The GTX 1050 Ti has an MSRP of $139; the RX 570 has an MSRP of $169. For most of their lifetimes, relative prices have reflected that difference, only lately moving to where prices are now.


----------



## Zubasa (Feb 18, 2019)

medi01 said:


> I mean, there should be reasons a $139 1050 Ti outsells the 1.5-2 times faster 570 priced at $99, shouldn't there?


Yes, because you see the nVidia logo pop up more often in games, in ads, etc.
The average user might not even have an FPS counter on; they just take the "safe" option.
If they see nVidia / Intel in the system requirements, they buy that.


----------



## medi01 (Feb 18, 2019)

londiste said:


> relative prices have reflected that differenc


BS. The 570 wiped the floor with the 1050 Ti upfront.

There is the same story with the 280/280X vs. the 960, which was a terrible thing to buy; if one had to buy from the green camp, the 970 was there.
But it seems to be hard to accept that hordes of consumers make ill-informed GPU purchases, for some reason.


----------



## londiste (Feb 18, 2019)

medi01 said:


> BS. The 570 wiped the floor with the 1050 Ti upfront.
> 
> There is the same story with the 280/280X vs. the 960, which was a terrible thing to buy; if one had to buy from the green camp, the 970 was there.
> But it seems to be hard to accept that hordes of consumers make ill-informed GPU purchases, for some reason.


Yes, the RX 570 wipes the floor with the GTX 1050 Ti. They are cards from different segments.
The RX 570 is the intended competitor of the GTX 1060 3GB.
The GTX 1050 Ti is the intended competitor of the RX 560.

Pricing in the low end and midrange is FUBAR.


----------



## JB_Gamer (Feb 18, 2019)

Isn't it the case - the problem for Amd - that ALL games are tested and optimized for nVidia GPU'S?


----------



## xkm1948 (Feb 18, 2019)

FineWine hahahahahaha.

Nope. If I pay for full price on day 1 I expect full performance on day 1.


----------



## bug (Feb 18, 2019)

JB_Gamer said:


> Isn't it the case - the problem for Amd - that ALL games are tested and optimized for nVidia GPU'S?


Not really. Games call the DX API; they don't know what's going on below that level. Optimizations are more on the level of "don't use that feature extensively, because it will bring the hardware to its knees" or "while you do X, you can also do Y, because the hardware can process that in parallel".
It's only when you take those optimizations and try to apply them to _all_ hardware that you can get in trouble. But besides a couple of titles that went overboard with HairWorks, I don't recall developers being that dumb.


----------



## INSTG8R (Feb 18, 2019)

JB_Gamer said:


> Isn't it the case - the problem for Amd - that ALL games are tested and optimized for nVidia GPU'S?


Many on that list are AMD games. FC5, AC, and a couple more I can't be arsed to install and confirm.


----------



## las (Feb 18, 2019)

medi01 said:


> Remind me how 780Ti fared against 290x.
> "Mythical" eh?



Remind me how the 980 Ti fared against the Fury X.
The 7970 was the last good AMD GPU. It's been 7 years.
AMD's GPU business is in a worse state than ever now. I miss ATi.
Fury X, Vega 64, and now Radeon VII: all have been a complete joke.


----------



## xkm1948 (Feb 18, 2019)

las said:


> Remind me how the 980 Ti fared against the Fury X.



Yeah man dont wanna even think about that overclocker’s dream...


----------



## Countryside (Feb 18, 2019)

Let's try to keep this thread clean of "red is bad and green is good", or vice versa.


----------



## las (Feb 18, 2019)

Countryside said:


> Let's try to keep this thread clean of "red is bad and green is good", or vice versa.


I'm simply stating facts. I wish that AMD GPUs were able to compete. All their "high-end" solutions have been terrible recently, leaving this segment to Nvidia.


----------



## Imsochobo (Feb 18, 2019)

bug said:


> Can anyone remember the last AMD launch where disappointing performance wasn't expected to be mitigated in upcoming drivers?
> 
> Which won't go away, despite HardOCP looking at AMD's drivers over time and finding very modest increases, with a few (one) exceptions: https://www.hardocp.com/article/2017/01/30/amd_video_card_driver_performance_review_fine_wine/



290X
7970
6970
5870
4870
x1900
x1950xtx
x850
9800 pro


----------



## medi01 (Feb 18, 2019)

xkm1948 said:


> Nope. If I pay for full price on day 1 I expect full performance on day 1.


You mean, when you buy a 780 Ti, you expect it to be beaten by a 960 later on, right?



las said:


> Remind me how the 980 Ti fared against the Fury X.


*Why the hell not!*
You might discover interesting things if you check how results changed over time in TPU charts.
The Fury X used to beat the 980 Ti only at 4K at launch (and even then, barely); at 1440p it was about 10% behind:
https://www.techpowerup.com/reviews/AMD/R9_Fury_X/31.html





And where are we now:
https://www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_64/31.html





From 10% behind to several percent ahead. Not bad, is it?

And one more point: FineWine in general refers to the graceful aging of AMD cards, especially in contrast with what nVidia does to its customers.
It's not about some magic dust coming into play later on, or AMD purposefully crippling cards at launch (the way a certain other company does).


----------



## Countryside (Feb 18, 2019)

las said:


> I'm simply stating facts. I wish that AMD GPUs were able to compete. All their "high-end" solutions have been terrible recently, leaving this segment to Nvidia.



Everybody has a right to their own opinion, but with hardware, opinion matters not, and facts can be bent to serve one's narrative.


----------



## bug (Feb 18, 2019)

Imsochobo said:


> 290X
> 7970
> 6970
> 5870
> ...


So, over 5 years ago. Or 5 generations ago. Gotcha.


----------



## notb (Feb 18, 2019)

medi01 said:


> And one more point: FineWine in general refers to the graceful aging of AMD cards, especially in contrast with what nVidia does to its customers.
> It's not about some magic dust coming into play later on, or AMD purposefully crippling cards at launch (the way a certain other company does).


Man, it's you again on your vendetta against Nvidia. I hoped you would give up after the R VII launch.

FineWine doesn't refer to "graceful aging of AMD cards". It refers to AMD not being able to provide proper drivers at the time of launch. So with Nvidia you get that extra 10% on day one, and with AMD you have to wait.

Basically, you just said that instead of getting $1000 today you'd rather get it in monthly installments over a year, because then you'd have the sense of earning money.

Also, I would love to learn a way to revoke your rating rights, because you're just running around giving a -1 to anyone who doesn't share your love for Radeon chips. It undermines the already little sense that the ranking system has.


----------



## medi01 (Feb 18, 2019)

bug said:


> 5 generations ago


Remind us about the 5 generations we had between the 290X release and today.


----------



## EarthDog (Feb 18, 2019)

I've heard that the crashing was fixed!

Seems like those who were begging for a retest over performance issues didn't leave the wine in the cask long enough.

Edit: Hilarious that this gets downvoted. Another polarizing, toxic fanboy goes on ignore at TPU. I wonder if there is a max number of users one can ignore here... lol


----------



## medi01 (Feb 18, 2019)

notb said:


> vendetta


You just need to stop projecting. It isn't hard.



notb said:


> It refers to AMD not being able to provide proper drivers at the time of launch. So with Nvidia you get that extra 10% the first day, and with AMD you have to wait.



I mean, just how *shameless* can it become, seriously?
*You got what performance upfront, when the 960 beat the $699 780 Ti, come again?*

AMD performance improves over time; nVidia falls behind not only AMD, but its own newer cards.
As the card you bought gets older, NV doesn't give a flying sex act.
It takes quite some twisting to turn this into something positive.

What's rather unusual this time is AMD being notably worse on the perf/$ front, at least with the game list picked at TPU.

The 290X was slower than the 780 Ti at launch, but it cost $549 vs. $699, so there goes "I get 10% at launch" again.


----------



## cucker tarlson (Feb 18, 2019)

medi01 said:


> AMD performance improves over time; nVidia falls behind not only AMD, but its own newer cards.


factually wrong.



EarthDog said:


> I've heard that the crashing was fixed!
> 
> Seems like those who were begging for a retest over performance issues didn't leave the wine in the cask long enough.
> 
> Edit: Hilarious that this gets downvoted. Another polarizing, toxic fanboy goes on ignore at TPU. I wonder if there is a max number of users one can ignore here... lol


As one guy in the AdoredTV thread said, you can "speak only positively of AMD or piss off". I guess that involves disproved theories too, as long as they're flattering to AMD or harmful to Nvidia. Who cares if it's a debunked theory from 5 years ago? If there's an option to leave venomous comments and binge-downvote people, they're taking that opportunity.


----------



## bug (Feb 18, 2019)

medi01 said:


> Remind us about 5 generations we had between 290x release and today.


200 series -> 300 series -> 400 series -> 500 series -> Vega


----------



## cucker tarlson (Feb 18, 2019)

bug said:


> 200 series -> 300 series -> 400 series -> 500 series -> Vega


Fury and another RX 480 refresh (the RX 590) in between them.


----------



## las (Feb 18, 2019)

medi01 said:


> You mean, when you buy a 780 Ti, you expect it to be beaten by a 960 later on, right?
> 
> 
> *Why the hell not!*
> ...



Haha, you seem to forget that AMD called the Fury X an "overclocker's dream", while the 980 Ti performs ~20% faster out of the box in custom cards and easily gains another 10-20% on top of that. A fully overclocked 980 Ti custom card completely wrecks a Fury X today. 2GB more VRAM also helps with this.

Performance per watt on recent AMD top-end cards has been a disaster too. Vega 64 does not beat the 1080 while pulling 100% more power.

Radeon VII is just as bad.

The only reason the 290 series aged well was that the 390 series was a rebrand, and AMD kept optimizing them to try and keep up with Nvidia. Notoriously bad performance per watt on both series.

Once again, the 7970 was the last good AMD top-end GPU.


----------



## SIGSEGV (Feb 18, 2019)

> Given how many new titles are coming out using Unreal Engine 4, and how much AMD is lagging behind in those titles, I'd focus on optimizations for UE4 next.



hahaha....


cucker tarlson said:


> factually wrong.



But I hate their (NVIDIA) approach of reducing performance on older GPUs through driver updates.
That's beyond my comprehension.


----------



## medi01 (Feb 18, 2019)

cucker tarlson said:


> factually wrong.


*In your green face*, from page 1:











las said:


> Haha, but excuses...


There's no need to lose face over it; Huang is not worth it.



bug said:


> 200 series -> 300 series -> 400 series -> 500 series -> Vega





cucker tarlson said:


> Fury and another rx480 refresh (rx590) in between them.


The 200 series cannot count as "between", can it?
Should I count the 500 series twice? I don't know why, but there might be green reasons.
I would count rebrands as what they are: zero.

Now, if you'd stop twisting reality, we'd have to mention:

=> Polaris => Vega => Vega 7nm

That's all the releases we had between the 290X and now. On the green side of things:

=> Pascal => [.... Volta never materialized in the consumer market ....] => Turing


----------



## las (Feb 18, 2019)

medi01 said:


> In your green face, from page 1:
> 
> 
> 
> ...



Excuses? It's a fact that the 980 Ti wrecks the Fury X today.

Every single person with knowledge of GPUs knows that the reference 980 Ti performs MUCH worse than custom cards out of the box (yet even the reference card can make a Fury X look weak post-OC).

https://www.techpowerup.com/reviews/Gigabyte/GTX_980_Ti_G1_Gaming/33.html
https://www.techpowerup.com/reviews/MSI/GTX_980_Ti_Lightning/26.html

35-40% OC gain in the end, over the reference 980 Ti. Completely wrecks the Fury X at the end of the day.

Meanwhile, the Fury X can gain 3% from OC, though it bumps up the power usage like crazy. OVERCLOCKERS DREAM!!!


----------



## notb (Feb 18, 2019)

medi01 said:


> AMD performance improves over time; nVidia falls behind not only AMD, but its own newer cards.


But is this important? With Nvidia you get a lot more performance up front.
Imagine the amount of "finewine" that has to happen until Radeon VII matches Turing on efficiency. They would have to make it run on happy thoughts (which, I imagine, you have in spades).

And sure, since Nvidia optimizes for their newest architecture, an older product may become a little slower.
It would look exactly the same with AMD, if they actually changed the architecture from time to time.
But since they're still developing GCN, it's inevitable that older cards will get better over time.
We may see this change when they switch to a new architecture (if we live long enough, that is...).


> As card you bought gets older, NV doesn't give a flying sex act.


They made it very good at launch. I'm fine with that. I buy every product as is. I have no guarantee of future improvements and I don't assume they will happen.

What other durable goods become better during their lifetime? Why do you think this is what should be happening?
Do your shoes become better? Furniture? Cars?

When my car is recalled for repair, I'm quite mad that they didn't find the issue earlier. These are usually issues that shouldn't get through QC.
I imagine this is what you think: "what a lovely company: I already paid them for the car, but they keep fixing all these fatal flaws!".


> It needs quite a twisting to turn this into something positive.


Seriously, you'll lecture me on twisting?


----------



## EarthDog (Feb 18, 2019)

SIGSEGV said:


> I hate their (NVIDIA) approach of reducing performance on older GPUs through driver updates.
> That's beyond my comprehension.


I thought that was the part that was debunked... I don't recall where, but I do recall seeing this wasn't true after it was brought up years ago.

AMD has done a solid job with their drivers (though this release was quite awkward at best for some) and results do improve with time in some titles (as they do with Nvidia)... not all. This is also in part due to the stagnant GCN architecture that prevailed for several generations.

Anyway, I'm happy the issues from day 1 are mitigated... time will tell about performance improvements.


----------



## medi01 (Feb 18, 2019)

notb said:


> But is this important?


*In the goddamn context of "is FineWine real", IT'S THE FREAKING POINT, not simply "important".*




notb said:


> Imagine the amount of "finewine" that has to happen until Radeon VII matches Turing on efficiency.


That doesn't make FineWine "a myth".
Last time I checked on the rather AMD-unfriendly TPU, the power gap was 25%, so, uh, well, there is that.

But you are missing the point: for FineWine(tm) to work, it *just needs to get better over time*; there are no particular milestones it needs to beat.

There is no price parity between the 2080 and VII; the latter comes with 3 games and twice the RAM.



notb said:


> And sure, since Nvidia optimizes for their newest architecture, an older product may become a little slower.


I'm glad a 960 beating the $699 780 Ti is justifiable.




notb said:


> They made it very good at launch.


That's a dead horse, why not just leave it there?
The 290X was a bit behind, but it was $549 vs. $699; there was nothing "but better" about it. You paid 25%+ more for about 10% more performance at launch, which gradually disappeared.


----------



## notb (Feb 18, 2019)

SIGSEGV said:


> but I hate their (NVIDIA) approach of reducing performance on older GPUs through driver updates.
> That's beyond my comprehension.


This is an accusation you could take to court and become a millionaire. Can you prove it?


----------



## rvalencia (Feb 18, 2019)

INSTG8R said:


> Many on that list are AMD games. FC5,AC, and a couple more I can’t be arsed to install and confirm.


From https://www.techpowerup.com/reviews/AMD/Radeon_VII/

Assassin's Creed Origins (NVIDIA Gameworks, 2017)

Battlefield V RTX (NVIDIA Gameworks, 2018)

Civilization VI (2016)

Darksiders 3 (NVIDIA Gameworks, 2018), an old-game remaster; where's Titanfall 2?

Deus Ex: Mankind Divided (AMD, 2016)

Divinity Original Sin II (NVIDIA Gameworks, 2017)

Dragon Quest XI (Unreal 4 DX11, large NVIDIA bias, 2018)

F1 2018 (2018). Why? Microsoft's Forza franchise is larger than this Codemasters game.

Far Cry 5 (AMD, 2018)

Ghost Recon Wildlands (NVIDIA Gameworks, 2017), missing Tom Clancy's The Division

Grand Theft Auto V (2013)

Hellblade: Senua's Sacrifice (Unreal 4 DX11, NVIDIA Gameworks)

Hitman 2

Monster Hunter World (NVIDIA Gameworks, 2018)

Middle-earth: Shadow of War (NVIDIA Gameworks, 2017)

Prey (DX11, NVIDIA Bias, 2017 )

Rainbow Six: Siege (NVIDIA Gameworks, 2015)

Shadow of the Tomb Raider (NVIDIA Gameworks, 2018)

SpellForce 3 (NVIDIA Gameworks, 2017)

Strange Brigade (AMD, 2018),

The Witcher 3 (NVIDIA Gameworks, 2015)

Wolfenstein II (2017, NVIDIA Gameworks). Results differ from https://www.hardwarecanucks.com/for...a-geforce-rtx-2080-ti-rtx-2080-review-17.html where a certain Wolfenstein II map exceeded the RTX 2080.


----------



## medi01 (Feb 18, 2019)

notb said:


> This is an accusation you could take to court and become a millionaire.


Nope:

Apple: Yes, we're slowing down older iPhones 



rvalencia said:


> From https://www.techpowerup.com/reviews/AMD/Radeon_VII/
> 
> Assassin's Creed Origins (NVIDIA Gameworks, 2017)
> 
> ...



I colored things. And wow.

Isn't Hitman an AMD title though?


----------



## cucker tarlson (Feb 18, 2019)

medi01 said:


> *In your green face*, from page 1:


Great. You've successfully proved that changing the testing suite changes the result. Great sleuthing.

And the 980 Ti is still the faster card:

by 6% in PCGH's review (and that's at stock)

http://www.pcgameshardware.de/Radeon-VII-Grafikkarte-268194/Tests/Benchmark-Review-1274185/2/

by even more in the ComputerBase Rangliste; no 980 Ti here, but Fury X does very poorly

https://www.computerbase.de/thema/grafikkarte/rangliste/

Also, calm down, you're having a fanboy tantrum all over this thread.


----------



## medi01 (Feb 18, 2019)

cucker tarlson said:


> I'm too small a person to admit I was wrong....


No problem.



cucker tarlson said:


> by 6% in pcgh's review


In a FineWine discussion that doesn't matter; the contrast with previous PCGH results does.


----------



## notb (Feb 18, 2019)

medi01 said:


> I'm glad a 960 beating the $699 780 Ti is justifiable.


Well, after a product is launched, one company starts to develop a new, much faster successor, and the other company starts to figure out how the already-launched product works.
It's good we have a choice, right?

But I agree with you in this case: Maxwell was awesome.


> The 290X was a bit behind, but it was $549 vs. $699; there was nothing "but better" about it. You paid 25%+ more for about 10% more performance at launch, which gradually disappeared.


So what? I will replace the card at some point. I'll get a new one with a new "10% more performance at launch". I will keep having faster cards.

And you generally pay more (in %) than the performance difference. It has to be that way, because otherwise the market just wouldn't work (prices would converge).


----------



## cucker tarlson (Feb 18, 2019)

medi01 said:


> No problem.


What are you even doing? And why?


----------



## londiste (Feb 18, 2019)

notb said:


> Imagine the amount of "finewine" that has to happen until Radeon VII matches Turing on efficiency. They would have to make it run on happy thoughts


It kind of does, when undervolted. The problem is that Radeon VII is a full process node ahead. Also, there is probably a reason AMD overvolts things out of the box.
I don't think I have ever seen a serious attempt to undervolt a Turing GPU for comparison, though.


medi01 said:


> Isn't Hitman an AMD title though?


The Glacier engine used to favor AMD a lot back in Absolution, and initially in Hitman, which was one of AMD's DX12 showcases. Eventually the results pretty much evened out in DX12; not sure about DX11. When developing Hitman 2 they dropped the DX12 renderer and kept going with DX11.


----------



## jabbadap (Feb 18, 2019)

Heh, that escalated quickly, and I don't have any popcorn...

So the drivers are stable now, good. How is the noise, @W1zzard, any improvements in that department? I.e., changes in the fan profile, etc.


----------



## las (Feb 18, 2019)

Nothing escalated; everyone can see that the 980 Ti beats the Fury X with ease, hence no Fine Wine here.

Trying to prove that the Fury X can perform on par with a reference 980 Ti was fun though. Remember the 30-40% OC headroom next time.

I'm out xD


----------



## trog100 (Feb 18, 2019)

It seems to me that Nvidia makes better graphics cards than AMD, which leaves those in the red camp having a difficult time justifying exactly why they are there..

Either way, all this toxicity is getting rather boring..

trog


----------



## 95Viper (Feb 18, 2019)

No more changing what someone has posted to troll, insult, shame, create hate, or cause a toxic thread.
Also, none of the above, period.
No retaliatory comments, either.

Stay on topic.

Thank You


----------



## notb (Feb 18, 2019)

rvalencia said:


> Prey (DX11, NVIDIA Bias, 2017 )


Oh come on. This game was launched with Vega as a confirmation of AMD-Bethesda cooperation.
Counting Prey as favouring AMD (i.e. it could be even worse), it's:
AMD: 4
Nvidia: 14
undecided: 4

AMD: 4/18 ~= 22%
Nvidia: 14/18 ~= 78%
which is basically the market share these companies have in discrete GPUs.

Do you think it should be 50:50? Or what? And why?


----------



## efikkan (Feb 18, 2019)

Flyordie said:


> Really, all they should do is focus on optimizing for the major game engines.


No, not at all. They should focus on optimizing the driver in general, not on workarounds that "cheat" benchmarks.

Many have misconceptions about what optimizations really are. Games are rarely specifically optimized for targeted hardware, and likewise drivers are rarely optimized for specific games in their core. The few exceptions to this are cases to deal with major bugs or bottlenecks.

Games should not be written for specific hardware; they are written using vendor-neutral APIs. Game developers should focus on optimizing their engine for the API, and driver developers should focus on optimizing their driver for the API, because when they try to cross over, that's when things start to get messy. When driver developers "optimize" for games, they usually manipulate general driver parameters and replace some of the game's shader code, and in most cases it's not so much _optimization_ as trying to remove stuff without you seeing the degradation in quality. Games have long development cycles and are developed and tested against API specs, so it's obviously problematic when a driver suddenly diverges from spec and manipulates the game; generally this causes more problems than it solves. If you have ever experienced a new bug or glitch in a game after a driver update, now you know why…

This game "optimization" stuff is really just about manipulating benchmarks, and have been going on since the early 2000s. If only the vendors spend this effort on actually improving their drivers instead, then we'll be far better off!



JB_Gamer said:


> Isn't it the case - the problem for Amd - that ALL games are tested and optimized for nVidia GPU'S?


Not at all. Many AAA titles are developed primarily for consoles and then ported to PC; if anything, there are many more games with a bias favoring AMD than Nvidia.

Most people don't understand what causes games to be biased. First of all, a game is not biased just because it scales better on vendor A than vendor B. Bias is when a game has severe bottlenecks or special design considerations, intentional or "unintentional", that give one vendor a disadvantage it shouldn't have. That some games scale better on one vendor and other games scale better on another isn't a problem by itself; games are not identical, and different GPUs have various strengths and weaknesses, so we should use a wide selection to determine real-world performance. Significant bias happens when a game is designed around one specific feature and scales badly on different hardware configurations. A good example is games which are built for consoles but don't really scale with much more powerful hardware. In general, games are much less biased than most people think, and just because a benchmark doesn't confirm your presumptions doesn't mean the benchmark is biased.



medi01 said:


> AMD performance improves over time; nVidia falls behind not only AMD, but its own newer cards.


Over the past 10+ years, every generation has improved ~5-10% within its lifecycle.
AMD is no better at driver improvements than Nvidia; this myth needs to die.



SIGSEGV said:


> hahaha....
> but I hate their (NVIDIA) approach of reducing performance on older GPUs through driver updates.


FUD which has been disproven several times. I don't believe Nvidia has ever intentionally sabotaged older GPUs.


----------



## londiste (Feb 18, 2019)

The game list is wrong. The actual games tested in Radeon VII review along with year, API and engine name are:

2018 *- DX11 - *Assassin's Creed Odyssey (AnvilNext 2.0)
2018 *- DX11 - *Battlefield V (Frostbite 3)
2016 *- DX11 - *Civilization VI (Firaxis)
2018 *- DX11 - *Darksiders 3 (Unreal Engine 4)
2016 *- DX12 - *Deus Ex: Mankind Divided (Dawn)
2017 *- DX11 - *Divinity Original Sin II (Divinity Engine)
2018 *- DX11 - *Dragon Quest XI (Unreal Engine 4)
2018 *- DX11 - *F1 2018 (EGO Engine 4.0)
2018 *- DX11 - *Far Cry 5 (Dunia)
2017 *- DX11 - *Ghost Recon Wildlands (AnvilNext)
2015 *- DX11 - *Grand Theft Auto V (RAGE - Rockstar Advanced Game Engine)
2017 *- DX11 - *Hellblade: Senua's Sacrifice (Unreal Engine 4)
2018 *- DX11 - *Hitman 2 (Glacier 2.0)
2018 *- DX11 - *Just Cause 4 (Apex)
2018 *- DX11 - *Monster Hunter World (MT Framework)
2017 *- DX11 - *Middle-earth: Shadow of War (LithTech)
2015 *- DX11 - *Rainbow Six: Siege (AnvilNext)
2018 *- DX12 - *Shadow of the Tomb Raider (Foundation)
2018 *- DX12 - *Strange Brigade (Asura Engine)
2015 *- DX11 - *The Witcher 3 (REDengine 3)
2017 *- Vulkan - *Wolfenstein II (idTech6)


----------



## cucker tarlson (Feb 18, 2019)

londiste said:


> The game list is wrong. The actual games tested in Radeon VII review along with year, API and engine name are:
> 
> 2017 *- DX11 - *Assassin's Creed Odyssey (AnvilNext 2.0)
> 2018 *- DX11 - *Battlefield V (Frostbite 3)
> ...


Isn't Wolfenstein id7 or 6+? I remember Doom was id6, and then the devs said that Wolfenstein was a big technical advancement over that. It supports half precision and variable rate shading.


----------



## SIGSEGV (Feb 18, 2019)

notb said:


> This is an accusation you could take to court and become a millionaire. Can you prove it?



Accusation? Really?
I tested various driver versions on my GPU and benchmarked it. pfftt..


----------



## londiste (Feb 18, 2019)

Looking at the list of games:
- *Assassin's Creed*, *Battlefield*, *Civilization*, *Far Cry*, *GTA*, *Just Cause*, *Tomb Raider*, *Witcher* and *Wolfenstein* need to be in the list as the latest iteration of a long-running game series along with its engine. The same applies to *Hitman* and possibly *F1*.
- *Strange Brigade* as a game is a one-off, but its engine is a newer iteration of the one behind Sniper Elite 4, which is one of the best DX12 implementations to date.
- *Divinity: Original Sin 2*, *Monster Hunter* and *Shadow of War* are a bit one-offs, as relevant and popular games running unique engines.
- *Hellblade* is a game that is artistically important and actually has a good implementation of Unreal Engine 4.
- *R6: Siege* is a unique case: despite its release year it is a current, competitive game that is fairly heavy on GPUs.
- *Deus Ex: Mankind Divided* is a bit of a concession to AMD and DX12. This is a modified version of the same Glacier 2 engine that is behind the Hitman games.
- I am not too sure about the choice or relevance of *Darksiders 3*, *Dragon Quest XI* and *Wildlands*. The latest big UE4 releases and one of Ubisoft's non-Assassin's Creed open-world games?

It is not productive to color games based on whether they use Nvidia GameWorks. The main problem with GameWorks, as far as Nvidia vs. AMD is concerned, was that it was closed source, making it impossible for AMD to optimize for it if needed. GameWorks has been open source since 2016 or so. AMD does not have a branded program in the same way; GPUOpen and the tools and effects provided through it are non-branded, but they are present in a lot of games.


cucker tarlson said:


> Isn't Wolfenstein id7 or 6+ ? I remember doom was id6 and then devs said that wolfenstein was a big technical advancement over that.Supports half precision and variable rate shading.


Wolfenstein II is idTech6. Fixed. Thanks.


----------



## moproblems99 (Feb 18, 2019)

Good to see they sorted things out.  Would have been nice to have this upfront considering the architecture isn't exactly new or anything.



trog100 said:


> it seems to me that nvidea make better graphics card than amd which leaves those in camp having a difficult time justifying exactly why they are in red camp..



The real problem is that anyone cares. Grow up and move on. (Not directed at you.)



SIGSEGV said:


> I tested various driver version to my gpu and get it benched.



They also could have introduced an issue accidentally and didn't circle back because it's not current gen.


----------



## cucker tarlson (Feb 18, 2019)

I don't think there's much in it







*there's no performance degradation for the 780 Ti; there's a very slight improvement.
*780 Ti vs 290X on 2016 drivers - the 780 Ti wins in 18 runs out of 28, the 290X in 10 out of 28.
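Put as percentages (same counts as quoted above, nothing extra assumed), that split works out like this:

```python
# Win split from the 2016-driver comparison above: 28 game runs,
# the 780 Ti ahead in 18 of them, the 290X ahead in 10.
runs = 28
wins_780ti, wins_290x = 18, 10
assert wins_780ti + wins_290x == runs  # every run has a winner here
print(f"780 Ti: {wins_780ti / runs:.0%}, 290X: {wins_290x / runs:.0%}")
# 780 Ti: 64%, 290X: 36%
```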


----------



## londiste (Feb 18, 2019)

Both AMD and Nvidia have a list of featured games:
https://www.amd.com/en/gaming/featured-games
https://www.nvidia.com/en-us/geforce/games/latest-pc-games/#gameslist

I doubt W1zzard had that in mind or checked it when choosing games, but there are 6 benchmarked games from each vendor's featured list, and they were not what I really expected:
AMD: Assassin's Creed Odyssey, Civilization VI, Deus Ex: Mankind Divided, Far Cry 5, Grand Theft Auto V, Strange Brigade.
Nvidia: Battlefield V, Darksiders 3, Divinity Original Sin II, Dragon Quest XI, Monster Hunter World, Shadow of the Tomb Raider.


----------



## INSTG8R (Feb 18, 2019)

rvalencia said:


> From https://www.techpowerup.com/reviews/AMD/Radeon_VII/
> 
> Assassin's Creed Origins (NVIDIA Gameworks, 2017)
> 
> ...


Except it’s AC Odyssey and it’s AMD, but good effort regardless...


----------



## M2B (Feb 18, 2019)

medi01 said:


> You just need to stop projecting. It isn't hard.
> 
> 
> 
> ...



The Kepler architecture aged badly for some reason, but Maxwell has aged as it should.
The 1060 and 980 were at the same level of performance back in 2016, and they still are in 2019.
Nothing has changed (except for games that need more than 4GB of VRAM).

I don't think someone who spends $700 on a card even cares about its performance after ~3 years or so; a high-end owner needs high-end performance and generally upgrades sooner than a mid-range user.


----------



## Xex360 (Feb 18, 2019)

I've heard of some sort of undervolting which improves the card's thermals greatly, and the whole process seems very easy. Why is no one bothering to use it?


----------



## moproblems99 (Feb 18, 2019)

While all of this is very interesting, I fail to see how it impacts this GPU. New thread maybe? Perhaps some conspiracy can be brought to light from it.


----------



## Vayra86 (Feb 18, 2019)

M2B said:


> Kepler architecture aged bad for some reason but maxwell has aged as it should.
> 1060 and 980 were in the same level of performance back in 2016 and they are in 2019.
> Nothing has changed (except for games that need more than 4GB of VRAM).



Kepler aged badly because of VRAM, and that is exactly where AMD had more to offer in the high-end at the time.

They had a 7970 with 3GB VRAM to compete with GTX 670/680 2GB.
And later they had a 290X with 4GB VRAM to compete with GTX 780/780ti 3GB.

At the same time, the mainstream resolution started slowly moving to 1440p, as Korean IPS panels were cheap overseas and many enthusiasts imported one. This heavily increased VRAM demands, alongside the consoles being released with 6GB to address, which meant the mainstream would quickly move to higher VRAM requirements. And it happened across just 1.5 generations of GPUs: even in the Nvidia camp, VRAM almost doubled across the whole stack, and then doubled _again_ with Pascal. That is why people are liable to think AMD cards 'aged well' and Nvidia cards lost performance over time. This culminated in the release of the GTX 970 with its 3.5GB of 'fast' VRAM. That little bit of history ALSO underlines why AMD now releases a 16GB HBM card: they are banking on the idea that people THINK it might double again over time, which is why you can find some people advocating the 16GB as a good thing for gaming. And of course there is the expense of having to alter the card otherwise.

If any supporter of Red needed confirmation bias, there it is. But it doesn't make it any less of an illusion that Nvidia drivers handicap performance over time.


----------



## EarthDog (Feb 18, 2019)

Xex360 said:


> I've heard of some sort of undervolting which improves the card's thermals greatly, the whole process seems very easy, why no one is bothering to use it?


They are... but not in reviews... the majority of people don't bother. There is also the question of why anyone should have to do this in the first place.


----------



## IceShroom (Feb 18, 2019)

medi01 said:


> Nope:
> 
> Apple: Yes, we're slowing down older iPhones
> 
> ...


The older ones were AMD titles; the new Hitman 2 is an Nvidia title.


----------



## THANATOS (Feb 18, 2019)

medi01 said:


> I'm glad 960 beating $699 780Ti is justifiable.


The GTX 960 is not beating the GTX 780 Ti; that's total BS. You even posted a graph from TPU (1440p) where the GTX 780 Ti is way faster than the GTX 960.
https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1060/26.html


----------



## M2B (Feb 18, 2019)

Vayra86 said:


> Kepler aged badly because of VRAM, and that is exactly where AMD had more to offer in the high-end at the time.
> 
> They had a 7970 with 3GB VRAM to compete with GTX 670/680 2GB.
> And later they had a 290X with 4GB VRAM to compete with GTX 780/780ti 3GB.
> ...



True and untrue.
It's not just VRAM. Yes, VRAM requirements have risen significantly since then, but don't forget you can easily remove the VRAM bottleneck by lowering the texture quality.
The 970 was slower than the 780 Ti at launch, but that's not the case today; it's actually ahead, even in cases where VRAM isn't a limiting factor.


----------



## IceShroom (Feb 18, 2019)

M2B said:


> True and untrue.
> It's not just VRAM, yeah VRAM requirements have rised significantly since then but don't forget you can easily remove the VRAM bottleneck by lowering the Texture quality.
> 970 was slower than the 780Ti at launch but it's not the case today, it's ahead actually, without VRAM being a limiting factor.


Pay $350+ just to play with lower texture quality!!!!
I have better advice: buy a console (or two).


----------



## M2B (Feb 18, 2019)

IceShroom said:


> Pay $350+ just to play with lower Texture quality!!!!
> I have a better advice, Buy a(or two) console.



What the hell are you smoking?


----------



## Vayra86 (Feb 18, 2019)

But yeah, it's not just VRAM. If you speak of aging wrt the 3GB cards, we've also seen the introduction of delta compression and much improved GPU Boost for Nvidia starting with Maxwell, while AMD was busy rebranding everything. Maxwell was that crucial moment where they actually gave up and created the gaping hole in performance and perf/watt we're still looking at today.


----------



## jabbadap (Feb 18, 2019)

Vayra86 said:


> Kepler aged badly because of VRAM, and that is exactly where AMD had more to offer in the high-end at the time.
> 
> They had a 7970 with 3GB VRAM to compete with GTX 670/680 2GB.
> And later they had a 290X with 4GB VRAM to compete with GTX 780/780ti 3GB.
> ...



Not only that; it doesn't have a tiled rasterizer either, so memory bandwidth became a problem for it too. I would be interested to see the original Kepler Titan added to the TPU benchmarks. Performance should be close to an RX 570/GTX 1060 3GB at 1080p. @W1zzard still has one? One interesting bit, though: there were 6GB GTX 780s too, while the GTX 780 Ti was always 3GB, as the 6GB card was the Titan Black... But yeah, we are going way off topic now.

Radeon VII has a lot of VRAM; the only weakness in its memory subsystem is its quite low ROP count. In normal tasks that is more than enough, but using MSAA can really tank the performance. Luckily for AMD, MSAA is a dying breed; fewer and fewer games support that AA method anymore.


----------



## Xaled (Feb 18, 2019)

W1zzard said:


> I'd focus on optimizations for UE4 next.


Nvidia is working closely with UE4 right now. I don't think AMD would risk working on UE4 optimizations only to have all that labor undone by a patch that lets AMD cards fall behind again. Just like Nvidia did with the DX11 HAWX game.


----------



## cucker tarlson (Feb 18, 2019)

God damn, this thread is just too fun of a read for those who enjoy conspiracy theories.


----------



## SIGSEGV (Feb 18, 2019)

cucker tarlson said:


> I don't think there's much in it
> 
> *there's no performance degradation for 780ti,there's very slight improvement.
> *780ti vs 290x on 2016 drivers - 780ti wins in 18 runs out of 28,290x in 10 out of 28.



That's the main reason I stay with the so-called "ANCIENT" driver. Thanks.


----------



## cucker tarlson (Feb 18, 2019)

SIGSEGV said:


> That's the main reason I stay with so-called "ANCIENT" driver. Thanks.


what is the reason?


----------



## SIGSEGV (Feb 18, 2019)

cucker tarlson said:


> what is the reason?





> *there's no performance degradation for 780ti, there's very slight improvement.*



you gave me an old graph that was supposedly released sometime in 2016


----------



## IceShroom (Feb 18, 2019)

Vayra86 said:


> But yeah, its not just VRAM, but if you speak of aging wrt the 3GB cards, we've also seen the introduction of delta compression and much improved GPU Boost for Nvidia starting with Maxwell, while AMD was busy rebranding everything. Maxwell was that crucial moment where they actually gave up and created the gaping hole in performance and perf/watt we're still looking at today.


Colour compression may be useful in SDR but not in HDR. Enabling HDR on a GTX 1080 makes it slower than a Vega 64.
https://www.computerbase.de/2018-07...vidia-geforce/2/#diagramm-destiny-2-3840-2160


----------



## Vayra86 (Feb 18, 2019)

IceShroom said:


> Colour compression may be useful for SDR but not in HDR. Enable HDR on GTX 1080 makes it slower than Vega 64.
> https://www.computerbase.de/2018-07...vidia-geforce/2/#diagramm-destiny-2-3840-2160



That is great; what else have you got? Maybe you ought to reread what you quoted and replied to, because your earlier comment had zero relation to what was being discussed, and neither does this one.


----------



## cucker tarlson (Feb 18, 2019)

SIGSEGV said:


> you gave me the old graph that supposed released somewhere in 2016


so the gimping occurred after 2016 then?
when exactly?


----------



## Xaled (Feb 18, 2019)

cucker tarlson said:


> god damn this thread is just too fun of a read for those who enjoy conspiracy threories.


A Theory like GPP for example?


----------



## bug (Feb 18, 2019)

cucker tarlson said:


> god damn this thread is just too fun of a read for those who enjoy conspiracy threories.


Yes, but it is of paramount importance to establish how well cards that can't push modern AAA titles anyway have aged


----------



## Vayra86 (Feb 18, 2019)

Xaled said:


> A Theory like GPP for example?



Unproven gimped Nvidia drivers > GPP. Yep. Seems legit...

Why is this turning into an Nvidia driver performance slowchat? That horse is dead, buried and probably cast into the ocean by now. Let it go unless you have actual data and if you do, open a nice little topic for it. Man...


----------



## M2B (Feb 18, 2019)

jabbadap said:


> memory subsystem is it has quite low ROPs count.



The ROPs are not part of the memory subsystem in AMD's case, as far as I know.


----------



## turbogear (Feb 18, 2019)

Xex360 said:


> I've heard of some sort of undervolting which improves the card's thermals greatly, the whole process seems very easy, why no one is bothering to use it?



Undervolting works well on all Vega generations and is very easy to do through Wattman.
I undervolted my custom watercooled Vega 64 before with a nice performance boost.
With a 1085mV undervolt and a 2.5% frequency overclock, that Vega 64 has run for the past 1.5 years at frequencies in the range of 1670MHz, with a max temperature of 42°C, while consuming around 30W less than at reference settings.

I have undervolted my Radeon VII to 1030mV @1082MHz.
In the two main games that I play at the moment, BF V and Black Ops 4, the average power consumption on my Radeon VII is around 250W with this setting.
I'm waiting for waterblocks to become available to start fine-tuning the Radeon VII like the Vega 64.
The reference cooler is the limiting factor at the moment.
I have the same watercooling setup as was used for cooling the Vega 64 and Ryzen 2700X.
The only thing missing is a waterblock for the Radeon VII.


----------



## IceShroom (Feb 18, 2019)

Vayra86 said:


> That is great, what else you got? Maybe you ought to reread what you quoted and replied to, because your earlier comment had zero relation to what was being discussed and neither does this one.


The comment I replied to said that Nvidia's colour compression requires less bandwidth and VRAM. That is useful with 6-bit SDR colour and a not-full (25-255) colour range, but HDR, with 8+ bit colour depth and the 0-255 colour range, needs raw bandwidth; the article in the link shows that.


----------



## Vayra86 (Feb 18, 2019)

IceShroom said:


> The comment I replyed said Nvidia with colour compression requiers less bandwith and VRAM. Those useful with 6 bit SDR colour with not full(25-255) colour range. But with HDR with 8+ bit colour depth 0-255 colour rannge need Raw  bandwidth, the article in link shows that.



Great, but the topic was aging of cards _without delta compression_ and whether drivers had any influence on it.


----------



## xkm1948 (Feb 18, 2019)

For the shroom guy:

Buy a Radeon 7, dude. Then test performance along the way for new driver releases. We will all shut up if you give us hard numbers.

Plus you get to support the AMD GPU division. Two birds, one stone, right? I'm assuming you are not one of those hypocrites who are all talk but no action?


----------



## the54thvoid (Feb 18, 2019)

@xkm1948 - You're like the guy from that ancient Remington advert where the guy says, "I liked it so much, I bought the company!". Except, what a lot of people don't know is that you bought the Fury X, and were sorely let down by not just it, but the promised software support. 

I mean, I wanted to give AMD a chance and I bought a 1700X. It's alright. It's a good direction for AMD. So much so, I'll buy a Ryzen 2, guaranteed when it releases. But the GFX dept at AMD... Not quite there yet. No matter how much some people suggest it is. I do think, that if Intel throws enough money at it, they might even overtake AMD, and that will be their death knell.


----------



## xkm1948 (Feb 18, 2019)

the54thvoid said:


> @xkm1948 - You're like the guy from that ancient Remington advert where the guy says, "I liked it so much, I bought the company!". Except, what a lot of people don't know is that you bought the Fury X, and were sorely let down by not just it, but the promised software support.



Yep. I was and still am constantly attacked by angry red mobs for pointing out problems with the Fury X. They do not hesitate for one second when it comes to defending their beloved brand.

Well, I was young and stupid and let my emotions cloud my judgement. Not anymore tho...


----------



## siluro818 (Feb 18, 2019)

xkm1948 said:


> Well i was young and stupid and let my emotion clouded my judgement. Not anymore tho...


*looks at the 2080Ti in system specs*

If you say so bro xD


----------



## xkm1948 (Feb 18, 2019)

siluro818 said:


> *looks at the 2080Ti in system specs*
> 
> If you say so bro xD



I huv the money for a top of the line performer. U mad cause i can afford something good? lol

Also i am not your bro, kiddo


----------



## IceShroom (Feb 18, 2019)

xkm1948 said:


> For the shroom guy:
> 
> Buy a Radeon 7 dude. Then test performance along the way for new driver releases. We will all shut up if you give us hard numbers.
> 
> Plus you get to suport AMD GPu division. Two birds one stone right? I assuming you are not those hyppcrites who are all talks but no action?


I am not a customer for a $500+ GPU; I am more interested in $120-140 GPUs.


----------



## xkm1948 (Feb 18, 2019)

IceShroom said:


> I am not a customer for $500+ gpu, i am more interested in $120-140 GPU.



That's a pity. I was expecting you to buy a Radeon 7, for science and justice and all that.

Well, anyone else wanna go? *siluro818*? I see you sporting a Vega 64; time to go big for a Radeon 7 and show us that FineWine


----------



## Super XP (Feb 18, 2019)

JB_Gamer said:


> Isn't it the case - the problem for Amd - that ALL games are tested and optimized for nVidia GPU'S?


Nvidia seems to be a little more aggressive about getting games optimized for its GPUs, probably because AMD owns console graphics, and as of late most games are console ports to PC. Every single console game is optimized for AMD Radeon GPUs.



IceShroom said:


> I am not a customer for $500+ gpu, i am more interested in $120-140 GPU.


Well, if you think about it, the maximum price tag for the highest of the high-performance GPUs should be $500 MAX. This is speaking about the Radeon VII and RTX 2080. The enthusiast price tag should be no more than $600; that would be an RTX 2080 Ti or a Radeon VII+ with custom cooling.
GPUs, especially the high-end versions, are all overpriced, period.


----------



## HD64G (Feb 18, 2019)

xkm1948 said:


> That's a pity. I was expecting you to buy a Radeon 7, for science and justice and all that.
> 
> Well anyone else wanna go?*siluro818*? I see you sporting a Vega64, time to go big for a Radeon 7 and show us that FineWine



I suggest you relax a bit, *xkm1948*, as you have been writing constant provocative posts for hours in this thread, and I think that might end badly for you. Simple advice. I understand your pain or anger over your Fury X not being what you wanted it to be, but this attitude, in an AMD thread, towards anyone liking their products is doing harm for sure. And you have been doing that a lot for months now in any AMD GPU related thread. Don't take it as a personal attack; I'm just pointing out that the path you have been on lately is a dangerous one for a TPU forum member.


----------



## Assimilator (Feb 18, 2019)

AMD GPU fanboys to AMD drivers are like Trump supporters to Hillary's emails... you never hear the end of the BS, no matter how many times it's debunked.


----------



## notb (Feb 18, 2019)

Super XP said:


> Well if you think about it, the maximum price tag for the highest of the high performance GPUs should be MAX $500. This is speaking about the Radeon VII, RTX 2080. Enthusiast price tag should be no more than $600. That would be a RTX 2080Ti and a Radeon VII+ with custom cooling


OK. I've thought about it. It turns out the proper maximum price for high-end GPU should be $2008.55. There's a big difference between our results. :/ How did you get to those $500?


----------



## Vayra86 (Feb 18, 2019)

notb said:


> OK. I've thought about it. It turns out the proper maximum price for high-end GPU should be $2008.55. There's a big difference between our results. :/ How did you get to those $500?



Please explain the 55 cents, I'm intrigued


----------



## xkm1948 (Feb 18, 2019)

HD64G said:


> I suggest you relax a bit *xkm1948 *as you have been writing constant provocative posts for hours in this thread. And I think that might end bad for you. A simple advice. I understand your pain or anger for your FuryX not being what you wanted to be but this attitute in an AMD thread towards anyone liking their products is doing harm for sure. And you use to do that a lot for months now in any AMD GPU related thread. Don't take it as a personal attack, just pointing out that it is a dangerous path for a TPU forum member the one you follow lately.



Suggesting people buy and test for themselves is provocative? Since when can I no longer participate in discussions? They say what they want to say, I suggest they use their own purchases to back up their claims, and that is wrong? Are you trying to censor posts now?


----------



## EarthDog (Feb 18, 2019)

HD64G said:


> I suggest you relax a bit *xkm1948 *as you have been writing constant provocative posts for hours in this thread. And I think that might end bad for you. A simple advice. I understand your pain or anger for your FuryX not being what you wanted to be but this attitute in an AMD thread towards anyone liking their products is doing harm for sure. And you use to do that a lot for months now in any AMD GPU related thread. Don't take it as a personal attack, just pointing out that it is a dangerous path for a TPU forum member the one you follow lately.


The ignore list is brand-loyalty agnostic...

Really, there are some pretty scary opinions on both sides of the fence here that need to be reined in or just straight kicked out. If my ignore list hits 15 (and lord knows we are close!), I'm just walking from this fookn forum.


----------



## Vayra86 (Feb 18, 2019)

xkm1948 said:


> Suggesting people to buy and test themselves is provactive? Since when I can no longer participae in discussions? They say what they wanted to say, i suggest they use their own purchase to backup their claim and that is wrong? Are you trying to censor postings now?



Well, I have to agree it does all sound a bit baity and doesn't help the overall quality of topics. That is at least what I associate this with. It's not like you are genuinely interested in the answers. Consider it feedback...


----------



## notb (Feb 18, 2019)

Vayra86 said:


> Please explain the 55 cents, I'm intrigued


Well, the calculation was very long - I don't want to bore anyone with the details...
But a few variables cancelled and I got this in the end:
`exp(2*ln("current year" - 2009) + "which planet from Sun you're on") = exp(2*ln(10)+3)`
that's the way you calculate how much a GPU should cost, right?
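For what it's worth, the joke formula really does land on the figure quoted earlier in the thread; a quick check:

```python
import math

# The tongue-in-cheek pricing formula, evaluated for the year 2019 on the
# third planet from the Sun: exp(2*ln(2019 - 2009) + 3) = 100 * e**3.
years_since_2009 = 2019 - 2009
planet_from_sun = 3
price = math.exp(2 * math.log(years_since_2009) + planet_from_sun)
print(round(price, 2))  # 2008.55
```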

And it may actually explain @Super XP 's results, because, judging by his comments, he's clearly from a different time or planet...


----------



## xkm1948 (Feb 18, 2019)

Vayra86 said:


> Well I have to agree it does all sound a bit baity and it doesn't help the overall quality of topics. That is at least what I associate this with. Its not like you are genuinely interested in the answers. Consider it feedback...



Dude, I am ALL for scientific methods. I am a molecular geneticist. You make a hypothesis and you back it up with an experimental setup and results. Anyone can go around all day and fart claims out of thin air, but to make solid claims one has to get their hands dirty and do the experiments: FineWine being one of many examples.


----------



## Vayra86 (Feb 18, 2019)

xkm1948 said:


> Dude I am ALL for scientific methods. I am a molecular genetist. You make a hypothesis and you back it up with experimental setup and results. Anyone can go around all day and fart claims out lf thin air. But to make solid claims one has to get their hands dirty and get to do the experiments: FineWine as one of many examples.



The point is, those examples exist and we already know what's up. Stop beating the dead horse...


----------



## EarthDog (Feb 18, 2019)

Countryside said:


> Well why not try a new occupation at AMD and contribute in making their gpus better instead talking how shait their hardware is.


Sorry, you will need to get in line behind the dude from Adored who nearly lost his gag reflex trying to get a job at AMD.


----------



## notb (Feb 18, 2019)

Countryside said:


> Well why not try a new occupation at AMD and contribute in making their gpus better instead talking how shait their hardware is.


You do understand that molecular biology and semiconductor engineering are two different things, right?
But man... wouldn't it be nice if AMD finally employed (or at least cooperated with) people that represent the "pro" client segment.
Nvidia is *so active* in promoting and supporting GPGPU and it clearly didn't hurt their products...

When looking at recent AMD products and how they try to be attractive to content creators, it does remind me of how open-source software for photographers looks.
Software people are like: "we gave you all these fantastic features! Why are you still using that dreaded, expensive Photoshop?!"
And AMD is like: "we gave you all these fantastic cores and we crush Intel in benchmarks! Why are you still using that awful, expensive i7?!"



EarthDog said:


> Sorry, you will need to get in line behind the dude from Adored who nearly lost his gag reflex trying to get a job at AMD.


Man... I'm actually thinking about applying for a job at Nvidia. I wonder if I'd still be able to take part in all these lovely flames if they choose me... :/


----------



## turbogear (Feb 18, 2019)

@W1zzard
Thanks a lot for the effort to retest. 
As a Radeon VII owner, I was not expecting any large performance increase between 19.2.1 and 19.2.2; the time frame between the two is too short, I believe.
It did manage to fix a few issues with Wattman and overclocking, though.

What I personally noticed is that Black Ops 4 runs better. With 19.2.1 on my PC, the framerate was swinging rapidly between 70 FPS and 120 FPS with every movement in game, and the GPU clock had a lot of dips. With the new driver it mostly stays above 100 FPS, close to 120 FPS, and the GPU clock is relatively stable. I have a BenQ Zowie XL2730 FreeSync 144Hz monitor.


----------



## HD64G (Feb 18, 2019)

xkm1948 said:


> Dude I am ALL for scientific methods. I am a molecular genetist. You make a hypothesis and you back it up with experimental setup and results. Anyone can go around all day and fart claims out lf thin air. But to make solid claims one has to get their hands dirty and get to do the experiments: FineWine as one of many examples.


Since you are fond of numbers that prove things scientifically, here is an example that shows AMD GPUs aren't in their best form at launch, which helps us customers get an equal or better product at a better price than it will warrant only a few months later. And for a customer who keeps his hardware for at least 3 years, this is an opportunity.


----------



## Nkd (Feb 18, 2019)

las said:


> I'm simply stating facts. I would wish that AMD GPU's were able to compete.. All their "high-end" solutions have been terrible recently. Leaving this segment for Nvidia.



Not sure what you mean. Does this card not compete at all? AMD needed CPUs to survive, so they focused on that; otherwise you would have no AMD and no Radeon graphics. Now that the CPU side is going well, they will eventually get to the GPU side. Given their need to survive as a company, I am not sure how this is so bad. They somehow managed to stay relevant, which by itself is good enough, while starting to give Intel some serious competition on the CPU side. AMD did what they had to and thought long term. If they had invested more in the GPU side, they might not be in business right now, because the CPU side is where they will survive as a company. It's not a matter of if, just when. I think in the next few years we will have their Zen of GPUs too.


----------



## trog100 (Feb 18, 2019)

well i am glad i forked out for a 2080ti.. at least i dont have to worry about the rest.. 

i follow all this stuff because it interests me.. but its all being spoiled by too much negativity.. too much entitlement infringement.. too much silly red green bickering..
huge lengthy threads about f-ck all... 

trog


----------



## xkm1948 (Feb 18, 2019)

HD64G said:


> Since you are fond of numbers that prove things scientifically, here is an example that proves that AMD GPUs aren't shown in their best form at their launch and this helps us customers to get an equal or better product in better price than will deserve in a few months only. And for a customer that keeps his hardware at least for 3 years, this is an opportunity.




This is exactly my point. For people who don’t believe in W1zzard’s review, go get a Radeon 7. Test it along the life span of the GPU and report back how it ages.


----------



## Countryside (Feb 18, 2019)

xkm1948 said:


> This is exactly my point. For people who don’t believe in W1zzard’s review, go get a Radeon 7. Test it along the life span of the GPU and report back how it ages.



W1zzard’s reviews are of top quality and I don't doubt his ability, but he is one person. Most people don't have the expertise nor the equipment to test hardware, so they rely on reviews, and you don't make an educated decision based on one review.


----------



## siluro818 (Feb 18, 2019)

xkm1948 said:


> I huv the money for a top of the line peformer. U mad cause i can afford something good? lol
> 
> Also i am not your bro, kiddo


Oh nobody's arguing that you have the money, clearly lacking in every other area though


----------



## notb (Feb 18, 2019)

Vayra86 said:


> The point is, those examples exist and we already know what's up. Stop beating the dead horse...


While I'm not a huge fan of the argument quality we've seen here, I do share @xkm1948 's enthusiasm for a more scientific approach (as shown earlier).
I mean: we've been hearing about FineWine for years and it doesn't look like this could ever stop. There are, as you said "examples" or "traces" that something could be going on, but that's a bit like a proof of existence of ether or gods - we've seen some weird stuff going on, so let's assume there's a reason and let's give it a catchy name.

We had a nice discussion about quality of journalism in another topic. Seriously, wouldn't you like to see a proper experiment for this theory?
It's not that hard either. You take a few cards from different generations, 20-30 driver editions released over 5 years, a few AAA games and you measure all data points.
Instead, we have *hundreds of thousands of people* on *thousands of forums* talking about a phenomenon that no one has ever confirmed in a controlled environment. And it's been going *for years*. And it will keep going for as long as AMD will be making GPUs.

Why hasn't any of the large tech sites done this? Are they afraid of the result and angry mob from either side?
Why hasn't AMD done this?


----------



## siluro818 (Feb 18, 2019)

Wait, wait, wait: "I am a molecular geneticist."

*nondescript snorting sounds*

Oh god that was just... Try to put together some molecules for that blood pressure, mate.


----------



## xkm1948 (Feb 18, 2019)

notb said:


> While I'm not a huge fan of the argument quality we've seen here, I do share @xkm1948 's enthusiasm for a more scientific approach (as shown earlier).
> I mean: we've been hearing about FineWine for years and it doesn't look like this could ever stop. There are, as you said "examples" or "traces" that something could be going on, but that's a bit like a proof of existence of ether or gods - we've seen some weird stuff going on, so let's assume there's a reason and let's give it a catchy name.
> 
> We had a nice discussion about quality of journalism in another topic. Seriously, wouldn't you like to see a proper experiment for this theory?
> ...



HardOCP did. They found Nvidia does not gimp their cards and AMD does not FineWine either


----------



## HammerON (Feb 18, 2019)

Warning and reply bans issued. No more personal attacks. Play nice or move along.


----------



## notb (Feb 18, 2019)

xkm1948 said:


> HardOCP did. They found Nvidia does not gimp their cards and AMD does not FineWine either


OK, I've just checked it. Not enough data points, wrong presentation. But otherwise the right direction. Thanks.


----------



## bug (Feb 18, 2019)

notb said:


> OK, I've just checked it. Not enough data points, wrong presentation. But otherwise the right direction. Thanks.


Wrong attitude I'd say. You're looking for someone to disprove by the book something that has only anecdotally been proven.


----------



## Mistral (Feb 18, 2019)

If I may, a suggestion for benchmarks going forward: have a small icon next to the game titles, indicating if they are nVidia or AMD sponsored.


----------



## xkm1948 (Feb 18, 2019)

Long-term performance: people don't usually keep their cards long enough to see the impact. At the current rate of progression, most modern GPUs have a useful gaming lifespan of 3~4 years. By the time the FineWine is ready, the GPU is already obsolete.


----------



## Countryside (Feb 18, 2019)

Mistral said:


> If I may, a suggestion for benchmarks going forward: have a small icon next to the game titles, indicating if they are nVidia or AMD sponsored.



That's a decent idea, or instead of the icon you could just add text next to the game title: NVIDIA- or AMD-sponsored.


----------



## lexluthermiester (Feb 18, 2019)

The improvements offered by these updated drivers are on par with NVIDIA's own efforts for RTX and seem to show that, with proper optimizations, the Radeon VII is going to be a great card for a ton of usage scenarios.


----------



## Fouquin (Feb 18, 2019)

It doesn't look like a lot, but that's a pretty decent chunk of performance left on the table for the review driver. Seems they could have delayed the launch by a week or two and shown better results.



las said:


> Meanwhile Fury X can gain 3% from OC, bumps up the powerusage like crazy tho. OVERCLOCKERS DREAM!!!



Average core overclock for the Fury X is 1140MHz. That's 14%, not 3%. It still was not a great overclocker, but there's no need to fudge the numbers to push your narrative. Somewhat related, I tested an overclocked Fury X up against a Galax GTX 1070 EXOC and found the performance difference to be in the 5.5 - 11% range. Surprising to see the Fury X that close, to say the least.


----------



## John Naylor (Feb 18, 2019)

londiste said:


> Is the graph wrong? 5-6% is not multi-digit.



Whaddya talking about? ... there's a 5 and a 6. Two digits is more than one! In the post-2016 era, nothing means what the dictionary says any more. But speaking of multi-digit, I have to wonder... are the days of double-digit performance increases after a manual OC a memory from a bygone era? Haven't seen much of it from AMD since the 2xx series, but with nVidia we saw mid-teens normally, and even more than 31% on some cards. Now both camps seem to be aggressively overclocking the cards in the box, leaving us under 10% to grab on our own.




londiste said:


> Stop harping on that. GTX1050Ti has MSRP of $139, RX570 has MSRP of $169. For most part of the lifetime relative prices have reflected that difference only lately moving to where the prices are now.



I still don't see as it matters ... the 1060 remains the better buy over both in "performance per dollar".




Super XP said:


> Well if you think about it, the maximum price tag for the highest of the high performance GPUs should be MAX $500. This is speaking about the Radeon VII, RTX 2080. Enthusiast price tag should be no more than $600. That would be a RTX 2080Ti and a Radeon VII+ with custom cooling
> GPUs, especially the high end versions are all overpriced period.



If you do it accurately and account for inflation since the year 2000, and except for the current weirdness resulting from the lack of competition at the upper end, short supply and even tariffs, the average price of the top dog from nVidia hasn't strayed much from $700 in 17 years.











HD64G said:


> Since you are fond of numbers that prove things scientifically, here is an example that proves that AMD GPUs aren't shown in their best form at their launch and this helps us customers to get an equal or better product in better price than will deserve in a few months only. And for a customer that keeps his hardware at least for 3 years, this is an opportunity..



There are two sides to that coin, and I recently mentioned that in another thread. The 480 in particular improved significantly with the next driver release... but you're forgetting just one thing. Like the oft-heard comment "Well, card B might be slower than card A, but when overclocked, card B is almost as fast as card A". The fact remains, card A can be overclocked too, making the comment meaningless. Yes, the 480 got a nice bump afterwards and TPU dedicated an entire article to that subject. However, nVidia's drivers provided improvements too. And when you look at the 1060 versus the 480, 580 and even the 590, when all four cards are manually overclocked (based upon the data in TPU's test results on this site), the 1060 has still maintained the edge. The 480 OC'd about 6% ... the 580 about 4.4% ... the 590 did 3.9% ... the 1060 OCs over 18%. So any advantage we saw from those RX cards from aggressive clocking before putting them "in the box" was erased when users took them out of the box and did the manual OCs. Since the 2xx series, AMD cards have been more aggressively clocked "outta the box" and typically will manually OC only in single digits, while nVidia's cards have ranged from the mid-teens most of the time (with rare single-digit exceptions) to over 31% manual OC over reference. Even if we ignore driver improvements over time, that's a big hurdle to overcome.

Today, with the 2xxx / Radeon VII, we are seeing a watershed moment in that all the current cards see manual OCs of about 8% over reference, which makes side-by-side comparisons easier when using those charts.

Getting back to the "new drivers" issue, I have to wonder how much of these improvements are real. Is the game actually performing better across the board? Or has it just been tweaked a bit more to "look good" in that benchmark? That's what makes sites like TPU a "go-to" source for me, as Wiz doesn't use the in-game demo. As presented in the chart in Post #59, I have not seen any evidence that either side is doing better than the other in this respect; that chart says as much.

https://www.techpowerup.com/forums/...ested-with-latest-drivers.252691/post-3997015


----------



## Fluffmeister (Feb 18, 2019)

Ouch!!! Fact is, this turd of a card is getting too much attention. It was a token effort at best from AMD whilst they work with their Sony and Microsoft overlords to whip Navi into shape... poor show all round, AMD.


----------



## efikkan (Feb 18, 2019)

The only purpose of this card is to have something until the delayed Navi is ready.
But the attention is not the bad part; the bad part is those who claim it's a decent buy, both in forums and among Youtube opinionators.


----------



## John Naylor (Feb 18, 2019)

Fouquin said:


> Average core overclock for the Fury X is 1140MHz. That's 14%, not 3%. It still was not a great overclocker, but there's no need to fudge the numbers to push your narrative. Somewhat related, I tested an overclocked Fury X up against a Galax GTX 1070 EXOC and found the performance difference to be in the 5.5 - 11% range. Surprising to see the Fury X that close, to say the least.



Core/memory OCs don't scale into equivalent FPS increases... in most instances the best OCs on core and memory do NOT correspond at all to the highest fps.






As you can see above, the highest cores aren't delivering the most fps.






You'll note that "outta the box" the Fury X beat the "slower" 980 Ti (102.6) by 0.3 fps before overclocking.

However, when TPU OC'd the Fury, it brought just 5.1% to the table. Now let's look at what happened when TPU OC'd the 980 Tis:

On the MSI Gaming X 980 Ti, they OC'd it 27.2% to hit 130.5 fps, 20.7% faster than the Fury X
On the Giga G1 980 Ti, they OC'd it 31.4% to hit 130.5 fps, 24.7% faster than the Fury X
On the Zotac AMP X 980 Ti, they OC'd it 27.1% to hit 130.4 fps, 20.6% faster than the Fury X
On the Asus Strix X 980 Ti, they OC'd it 28.4% to hit 131.7 fps, 21.8% faster than the Fury X

I'm not going to speak to the "narrative", but the math here is clear. I don't care about who wins, but I do care about facts. The overclocking room for the 980 Ti is 5-6 times that of the Fury X. If this is "competing", it's the 1973 Belmont Stakes.
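For anyone who wants to sanity-check those deltas, a few lines of Python reproduce them (this assumes a stock Fury X at 102.6 + 0.3 = 102.9 fps and the +5.1% OC quoted above; the card names are just labels):

```python
# Cross-check of the quoted "% faster" figures. Assumes the stock
# Fury X sits at 102.6 + 0.3 = 102.9 fps and gains +5.1% from its OC.
fury_oc = (102.6 + 0.3) * 1.051   # ~108.1 fps after overclock

def pct_faster(fps: float) -> float:
    """How much faster a given fps result is than the OC'd Fury X, in %."""
    return (fps / fury_oc - 1) * 100

print(f"MSI Gaming X 980 Ti: {pct_faster(130.5):.1f}%")  # ~20.7%
print(f"Asus Strix X 980 Ti: {pct_faster(131.7):.1f}%")  # ~21.8%
```

Both come out right where the post puts them.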


----------



## Prima.Vera (Feb 18, 2019)

This card is unfortunately $200 more than it should have been. Nobody is going to pick this over a 1080 Ti, for example...


----------



## moproblems99 (Feb 18, 2019)

Prima.Vera said:


> This card is unfortunately $200 more than it should have been. Nobody is going to pick this over a 1080 Ti, for example...



And a few years too late.


----------



## sergionography (Feb 19, 2019)

Prima.Vera said:


> This card is unfortunately 200$ more than it should have been. Nobody is going to pick this over an 1080Ti for example...


Unless they wanted 16GB of fast memory and aren't exclusively gamers. This card has a niche. I hear a lot of complaints about this card, but it's honestly not as bad as everyone makes it out to be. Simply because it doesn't blow NVIDIA out of the water doesn't make it a bad card; RTX 2070 to 2080 performance is not exactly "bad". Also, this card is newer than the 1080 Ti and is likely to be supported for longer (for those who don't buy graphics cards every other year).


----------



## EarthDog (Feb 19, 2019)

If you compute, it's a great card for the money. If you don't, then it performs worse, uses a lot more power, and is louder than its competition... at the same price point. Compute saves it if those other factors matter.


----------



## B-Real (Feb 19, 2019)

Prima.Vera said:


> This card is unfortunately $200 more than it should have been. Nobody is going to pick this over a 1080 Ti, for example...


Radeon VII buyers get RE2, The Division 2 and DMC5. 1080 Ti buyers get ZERO games. That's a $150-180 package.



Fluffmeister said:


> Ouch!!! Fact is, this turd of a card is getting too much attention, it was a token effort at best from AMD whilst they work with their Sony and Microsoft overlords to get Navi into check.... poor show all round AMD.



As poor as the whole RTX roundup is. 



londiste said:


> Yes, RX570 wipes the floor with GTX1050Ti. They are cards from different segments.
> RX570 is the intended competitor of GTX1060 3GB.
> GTX1050Ti is the intended competitor of RX560.
> 
> Pricing in the lowend and midrage is FUBAR.


False, the 1050 Ti doesn't have an AMD counterpart. The GTX 1050 goes against (and trades blows with) the RX 560, the RX 470/570 against the 1060 3GB, and the RX 570/580 against the 1060 6GB. Anyway, it's true that anyone who buys a 1050 Ti for the same price as the RX 570 4GB is literally stupid af.


----------



## Xaled (Feb 19, 2019)

John Naylor said:


> If you do it accurately and account for inflation, since the year 2000, and excepting the current weirdness resulting from lack of competition at upper end, short supply and even tariffs, the average price of the top dog from nVidia  hasn't strayed that much from $700 in 17 years



Where is Titan in this table?


----------



## Jism (Feb 19, 2019)

And in the meantime, the Vega VII scoring merely 300 points behind the 2080TI:

https://www.3dmark.com/hall-of-fame-2/timespy+3dmark+score+extreme+preset/version+1.0/1+gpu

It likes water, and many people achieved clock speeds of over 2250 MHz. From a maximum boost of 1800 MHz, that is not bad at all! It's a great OC'er actually.


----------



## turbogear (Feb 19, 2019)

Radeon VII is not a bad card, especially for 2K gaming with FreeSync.
It may not match the 2080 in every title, but for people like me who come from a Vega 64 and enjoy FreeSync gaming, it is a viable upgrade.
I know that Nvidia supports FreeSync now, but so far only officially on a handful of monitors.
I thought long about whether I should go for the 2080 or the Radeon VII, but finally decided to go with the VII because of concerns about FreeSync.
As far as I checked, Nvidia does not support my BenQ XL2730 FreeSync monitor, and paying 500€ to 700€ more to get a good new G-Sync monitor would have been too much.
Only the cooling performance of the reference cooler on the Radeon VII is not good, but a good waterblock will fix that.


----------



## bajs11 (Feb 19, 2019)

M2B said:


> What the hell are you smoking?



that guy is funny
he tried to turn amd vs nvidia into pc vs console
i wonder what a console peasant is doing here though
this forum is about pc hardware...
they should sit in their mom's basement and play Mario Kart


----------



## metalfiber (Feb 19, 2019)

Imsochobo said:


> 290X
> 7970
> 6970
> 5870
> ...



Ah yes, I had an ATI 9800 Pro All-In-Wonder card... loved that card. Watched TV, recorded TV and then edited that video. ATI was truly the undisputed king back then. I wouldn't mind seeing that again one day.


----------



## Rebe1 (Feb 19, 2019)

But do the new drivers fix Wattman? Can you do OC/UV without any problems?


----------



## cucker tarlson (Feb 19, 2019)

Jism said:


> And in the meantime, the Vega VII scoring merely 300 points behind the 2080TI:
> 
> https://www.3dmark.com/hall-of-fame-2/timespy+3dmark+score+extreme+preset/version+1.0/1+gpu
> 
> It likes water > and many people archieved a clockspeed of over 2250Mhz. From a maximum boost of 1800Mhz that is not bad at all! It's a great OC'er actually.


It is, if you've got the cooling capacity to deal with a 500 W GPU.



Fouquin said:


> Average core overclock for the Fury X is 1140MHz. That's 14%, not 3%.





No, that's about 8.6%.

1140 is ~1.086x of 1050.
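A two-line check of that arithmetic (taking the 1050 MHz Fury X reference boost clock as given):

```python
# Overclock expressed as a percentage over a stock clock.
# Assumes the Fury X reference boost clock of 1050 MHz quoted above.
def oc_percent(oc_mhz: float, stock_mhz: float) -> float:
    return (oc_mhz / stock_mhz - 1) * 100

print(round(oc_percent(1140, 1050), 1))  # -> 8.6, nowhere near 14
```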




Fouquin said:


> Somewhat related, I tested an overclocked Fury X up against a Galax GTX 1070 EXOC and found the performance difference to be in the 5.5 - 11% range. Surprising to see the Fury X that close, to say the least.


That's synthetics; AMD always did great in 3DMark.

Look at actual games.


----------



## cyneater (Feb 19, 2019)

londiste said:


> Yes, RX570 wipes the floor with GTX1050Ti. They are cards from different segments.
> RX570 is the intended competitor of GTX1060 3GB.
> GTX1050Ti is the intended competitor of RX560.
> 
> Pricing in the lowend and midrage is FUBAR.



Pricing is FUBAR across the board.


----------



## londiste (Feb 19, 2019)

Jism said:


> And in the meantime, the Vega VII scoring merely 300 points behind the 2080TI:
> https://www.3dmark.com/hall-of-fame-2/timespy+3dmark+score+extreme+preset/version+1.0/1+gpu


Not really. 





> Description: 2Way CF 2000MHz/1200MHz


----------



## xkm1948 (Feb 19, 2019)

londiste said:


> Not really.



Crossfire and still can’t match a single 2080Ti.

Welp


----------



## INSTG8R (Feb 19, 2019)

londiste said:


> Not really.


Looks like one card to me no?


----------



## londiste (Feb 19, 2019)

INSTG8R said:


> Looks like one card to me no?


Click Detailed Result and read the description. Radeon VII is new and 3DMark's detection isn't the best even when tools are OK.


----------



## trog100 (Feb 19, 2019)

one of the first things i look for with any card is 3dmark scores.. the radeon vii struck me as very low.. some reviewers include timespy in their testing results.. i think TPU should too..

https://hexus.net/tech/reviews/graphics/126752-amd-radeon-vii/?page=4

trog


----------



## EarthDog (Feb 19, 2019)

INSTG8R said:


> Looks like one card to me no?


https://www.techpowerup.com/forums/threads/radeon-vii-gpu-z-support.252445/page-2#post-3997726


----------



## Frutika007 (Feb 19, 2019)

medi01 said:


> Remind me how 780Ti fared against 290x.
> "Mythical" eh?
> 
> 
> ...



Kepler architecture's failure has nothing to do with AMD's 'FineWine' myth. AMD didn't get better, Kepler got worse. Learn the difference. And 1050 Ti at $139 and RX 570 at $99?? Are you on drugs or something?? You want to know the reason?? Here's your reason: the GTX 1050 Ti launched in 2016 for $130. The RX 570 launched in 2017 for $170; that's also precisely when the mining craze started and AMD GPUs just disappeared from stock. A one-year gap at $40 more is not an appealing deal, not to mention the lack of availability due to mining.



JB_Gamer said:


> Isn't it the case - the problem for Amd - that ALL games are tested and optimized for nVidia GPU'S?



No, there are many games that are tested and optimized for AMD.



SIGSEGV said:


> hahaha....
> 
> 
> but I hate their (NVIDIA) approach to reduce performance through a driver update to older GPU.
> That's beyond my comprehension.



You are totally wrong. Nvidia doesn't reduce the performance of older GPUs through driver updates. That's just a lie and a misconception spread by AMD cultists.

Wow, already got a -1 rating from that ignorant delusional braindead AMD cultist named medi01



rvalencia said:


> From https://www.techpowerup.com/reviews/AMD/Radeon_VII/
> 
> Assassin's Creed Origins (NVIDIA Gameworks, 2017)
> 
> ...



From top to bottom, WRONG, WRONG, WRONG.

AC Origins GameWorks?? It wasn't even NVIDIA-sponsored; nobody sponsored it.
Battlefield 5 GameWorks?? Are you having a giggle, mate?? Do you even know what GameWorks is?? Oh my god, the ignorance and delusion in this comment is blowing my mind. BF5 only has RTX and DLSS. It still uses the Frostbite engine, which vastly favours AMD.
Let me correct you.

"Assassin's Creed Origins (NVIDIA Gameworks, 2017)" - WRONG

"Battlefield V RTX (NVIDIA Gameworks, 2018)" - WRONG

"Darksiders 3 (NVIDIA Gameworks, 2018), old game remaster, where's Titan Fall 2." - WRONG

"Dragon Quest XI (Unreal 4 DX11, large NVIDIA bias, 2018)" - WRONG

"Ghost Recon Wildlands (NVIDIA Gameworks, 2017)" - WRONG

"Hellblade: Senuas Sacrif (Unreal 4 DX11, NVIDIA Gameworks)" -WRONG

Hitman 2 - it's an AMD title I think, not sure

"Monster Hunter World (NVIDIA Gameworks, 2018)" - not sure, but probably WRONG

"Middle-earth: Shadow of War (NVIDIA Gameworks, 2017)" - WRONG

"Prey (DX11, NVIDIA Bias, 2017 )" - WRONG

"Rainbow Six: Siege (NVIDIA Gameworks, 2015)" - WRONG

"Shadows of Tomb Raider (NVIDIA Gameworks, 2018)" - WRONG, it's only RTX and that hasn't even been patched in yet

"Wolfenstein II (2017, NVIDIA Gameworks)" - WRONG

Most of the games here aren't even sponsored by NVIDIA and don't have any GameWorks in them. The only GameWorks game on this list is The Witcher 3. Some of the games on your NVIDIA list are actually AMD-sponsored titles, like Wolfenstein 2 and Prey.



B-Real said:


> Radeon VII buyers get RE2, Division 2 and DMC5. 1080Ti buyers get ZERO games. Thats a $150-180 packet.
> 
> 
> As poor as the whole RTX roundup is.
> ...



Who cares? RE2 is already cracked, DMC5 will get cracked, and Division 2 will suck just like Division 1 did. Not a single appealing game. I would rather take a price reduction than those games. So ultimately the 1080 Ti is a much better choice. Not to mention the 1080 Ti came out two years ago for the same $699 price tag. LOL. After two years, with a 7 nm process and the same $699 price tag, Radeon VII still can't beat the 1080 Ti. How pathetic! Where's the generational improvement, I wonder? And the RTX 2060 is a great GPU for the price, so the whole RTX roundup isn't poor.


----------



## eidairaman1 (Feb 19, 2019)

siluro818 said:


> *looks at the 2080Ti in system specs*
> 
> If you say so bro xD



He still does by trolling AMD topics



HD64G said:


> Since you are fond of numbers that prove things scientifically, here is an example that proves that AMD GPUs aren't shown in their best form at their launch and this helps us customers to get an equal or better product in better price than will deserve in a few months only. And for a customer that keeps his hardware at least for 3 years, this is an opportunity.


----------



## Frutika007 (Feb 19, 2019)

eidairaman1 said:


> He still does by trolling AMD topics



*looks at the AMD FX cpu and AMD 290 gpu in system specs*

If you say so bro



notb said:


> Oh come on. This game was launched with Vega as a confirmation of AMD-Bethesda cooperation.
> Counting Prey as favouring AMD (i.e. it could be even worse) it's:
> AMD: 4
> Nvidia: 14
> ...



Actually his list is completely wrong.


----------



## EarthDog (Feb 19, 2019)

notb said:


> Oh come on. This game was launched with Vega as a confirmation of AMD-Bethesda cooperation.
> Counting Prey as favouring AMD (i.e. it could be even worse) it's:
> AMD: 4
> Nvidia: 14
> ...


I don't think games for reviews need to be chosen by who sponsors them for 'balance', but by relative popularity as well as genre coverage. Meaning, the most popular games of each genre (FPS, 3PS, RTS, MMO, racing, etc.). When people go to buy games, if they have an NV or AMD card, do they only play those titles? No... it isn't even a concern for the sane.

There is simply no way everyone will be happy. But this method ensures game types are covered along with the more popular titles, and f-all to the AMD/NVIDIA sponsor pissing match.


----------



## danbert2000 (Feb 19, 2019)

I think we can all agree that out of the box, the Radeon VII is somewhere between a 2070 and a 2080, and that the improved drivers didn't change that. And it has no room for overclocking unless you get a golden sample and put it under water. AMD fans need to sit this one out, come back when Navi shows its cards. There's really not much of an argument. Nvidia is better at the high end and better at power efficiency out of the box, and that's how 95% of people will use these cards. Radeon VII is a stopgap card and it's not going to magically become better than the 2080 through drivers. It is a fine card but not the best.


----------



## Vayra86 (Feb 19, 2019)

Mistral said:


> If I may, a suggestion for benchmarks going forward: have a small icon next to the game titles, indicating if they are nVidia or AMD sponsored.



Got a better idea, put a black sticker over everything and only reveal the card brands after people formed an opinion. So all you get is Card ABCD with price ABCD and performance ABCD.

Then, when everything settles, you reveal the brands. It'd be very interesting in terms of perceived brand loyalty. Honestly, this whole brand loyalty thing is totally strange to me, for either camp. Neither AMD nor Nvidia is in it to make you happy; they're in the game to make money and keep the gears turning for the company. The consumer, and mostly the 'gamer', is just a target, nothing else. Ironically, BOTH camps are now releasing cards that almost explicitly tell us 'go f*k yourself, this is what you get, like it or not', and the only reason is that the general performance level is what it is. AMD's midrange is 'fine' for most gaming, and Nvidia's stack last gen was also 'fine' for most gaming. Radeon VII is a leftover from the pro segment, and Turing in a very similar way is a derivative of Volta, a GPU only released for pro markets. On top of that, even the halo card isn't a full die.

We're getting scraps and leftovers and bicker about who got the least shitty ones. How about taking the stance of the critical consumer instead, towards BOTH companies. If you want to win, that's how it works. And the most powerful message any consumer can send is simply not buying it.


----------



## cucker tarlson (Feb 19, 2019)

Having owned Nvidia for 4 years, I don't think I have ever complained about how my card performs in AMD-friendly games. It doesn't matter when performance is consistent; when it stops being consistent, then people start making ridiculous game-bias lists.


----------



## moproblems99 (Feb 19, 2019)

Vayra86 said:


> Got a better idea, put a black sticker over everything and only reveal the card brands after people formed an opinion. So all you get is Card ABCD with price ABCD and performance ABCD.
> 
> Then, when everything settles, you reveal brands. It'd be very interesting.



That would be a really cool review.  However, I think power consumption numbers would give it away.


----------



## jmcosta (Feb 20, 2019)

medi01 said:


> You mean, when you buy 780Ti, you expect it to be beaten by 960 later on, right?
> 
> 
> *Why the hell not!*
> ...



Unfortunately, that card has an insufficient memory buffer for a lot of games nowadays, which results in occasional stutter.
That minuscule performance gain isn't going to give you a better experience overall.
I used to own one and played many titles at 1440p, and it was the worst mistake I ever made.
There is a "FineWine", but it comes mostly from having bad drivers in the first months after a card's release; then, slowly, performance improves over time to the level it should have had in the first place.


----------



## rtwjunkie (Feb 20, 2019)

rvalencia said:


> Wolfenstein II (2017, NVIDIA Gameworks),


I’m pretty sure my copy ran on Vulkan.


----------



## king of swag187 (Feb 20, 2019)

robert3892 said:


> I'd like to test this myself but sadly my Radeon 7 arrived as a Dead on Arrival unit and I had to send it back for a replacement. I suppose I'll get the replacement next week.


Get a 2080 



jmcosta said:


> unfortunately that card has an insufficient memory buffer for a lot of games nowadays which results in occasional stutter.
> That minuscule performance gain isn't gonna give you better experience overall.
> I used to own one and played many titles at 1440p and it was the worst mistake i ever made
> There is a "finewine" but that comes mostly from having a bad driver in the first months of that card release and yeah then slowly there is an improvement over time to a level that should have had in the first place.


The VRAM isn't even an issue to me, tbh; it's the lack of drivers and how poorly Kepler aged.



danbert2000 said:


> I think we can all agree that out of the box, the Radeon VII is somewhere between a 2070 and a 2080, and that the improved drivers didn't change that. And it has no room for overclocking unless you get a golden sample and put it under water. AMD fans need to sit this one out, come back when Navi shows its cards. There's really not much of an argument. Nvidia is better at the high end and better at power efficiency out of the box, and that's how 95% of people will use these cards. Radeon VII is a stopgap card and it's not going to magically become better than the 2080 through drivers. It is a fine card but not the best.


14% slower than a 2080 and 5% slower than a 1080 Ti (on average, according to TPU)


----------



## moproblems99 (Feb 20, 2019)

king of swag187 said:


> 14% slower than a 2080 and 5% than a 1080 ti (on average, according to TPU)



Didn't the new drivers bring it to within 10% of the 2080? Almost in the middle.


----------



## Frutika007 (Feb 20, 2019)

moproblems99 said:


> Didn't the new drivers bring it under 10% to 2080?  Almost in the middle.



In two games, yes. But on average it gained less than 1% over the previous driver. So now it's 13% behind the RTX 2080 and 4% behind the 1080 Ti.


----------



## Nkd (Feb 20, 2019)

Prima.Vera said:


> This card is unfortunately $200 more than it should have been. Nobody is going to pick this over a 1080 Ti, for example...



The card will sell because it is really for dual-purpose users who need all that RAM for content creation. I've got a 2080 Ti, but to say no one is going to buy it is overblown. That is your opinion, not a fact; there are plenty of people picking up this card who see value in it for their workload.


----------



## _larry (Feb 20, 2019)

notb said:


> Man, it's you again on your vendetta against Nvidia. I hoped you would give up after R VII launch.
> 
> FineWine doesn't refer to "graceful aging of AMD cards". It refers to AMD not being able to provide proper drivers at the time of launch. So with Nvidia you get that extra 10% the first day, and with AMD you have to wait.
> 
> ...



My R9 290 cost me $250~ in 2014...
The performance has only gone up since I got it because of drivers and optimisations. I went from 1080p to 1440p without a second thought.
My friend has a 780Ti that constantly had stuttering issues in games... It cost him $400~ in 2014...
All in all, the AMD card is the better investment, because they are cheaper and just get better with updates...


----------



## efikkan (Feb 20, 2019)

_larry said:


> My R9 290 cost me $250~ in 2014...
> The performance has only gone up since I got it because of drivers and optimisations. I went from 1080p to 1440p without a second thought.
> My friend has a 780Ti that constantly had stuttering issues in games... It cost him $400~ in 2014...
> All in all, the AMD card is the better investment, because they are cheaper and just get better with updates...


It's not like Nvidia aren't improving their drivers either. In fact, over the past decade there has been no driver improvement large enough to shift the relative positioning between competitors, except perhaps for Nvidia 337.50, the update where Nvidia applied the driver-side optimizations of DirectX 12 to DirectX 11 and OpenGL, resulting in one of the greatest driver improvements ever. AMD decided not to do the same kind of optimizations in their driver, to retain a larger performance delta with DirectX 12 and to keep alive the myth that AMD is somehow better at DirectX 12…


----------



## Vayra86 (Feb 20, 2019)

_larry said:


> My R9 290 cost me $250~ in 2014...
> The performance has only gone up since I got it because of drivers and optimisations. I went from 1080p to 1440p without a second thought.
> My friend has a 780Ti that constantly had stuttering issues in games... It cost him $400~ in 2014...
> All in all, the AMD card is the better investment, because they are cheaper and just get better with updates...



Now ask a Fury X owner how he feels versus a 980ti owner...

Both camps have their great and not-so-great cards, and both also offer fantastic value propositions. It's hit or miss, all the time, for both green and red. In the end, yes, I think we can agree that the 290 versus the 780 (comparing the non-X to the Ti is not fair, in either price, performance, or market share - 290s and 780s sold like hotcakes and the higher models a whole lot less!) is a win for the 290 when it comes to both VRAM and overall performance, but not in terms of power/noise/heat. Also, you needed a pretty good AIB 290 or you'd have a vacuum cleaner - and those surely weren't 250 bucks.

Regardless. Consistency is an issue, and the driver approach of the two camps is different. I think it's a personal view on what is preferable: full performance at launch, or increasing performance over time - with a minor chance of gaining a few % over the competition near the end of a lifecycle. It also depends a lot on how long you tend to keep a GPU.

And yes, absolutely, a 780 Ti in 2017 and onwards ran into lots of problems; I've experienced those first hand. At the same time, I managed to sell that card for a nice sum of 280 EUR, which was exactly the amount I bought it for a year earlier, and that meant a serious discount on my new GTX 1080. Longevity isn't _just an advantage_. Keeping a card for a long time also means that by the time you want to upgrade, the value is almost gone, while you're also slowly bleeding money over time because perf/watt is likely lower. Running ancient hardware is not by definition cheaper - that is only true if that hardware is obsolete and will never be replaced anyway. If you are continuously upgrading, keeping a GPU longer than 1~1.5 years (make it 3 with the current-gen junk being released) is almost NEVER cheaper than replacing it with something of a similar price point and reselling your old stuff at a good price. And you get a bonus: you stay current.


----------



## turbogear (Feb 20, 2019)

All this talk about the Fury X wakes old memories.
I used to own one until 2017, before I replaced it with a Vega 64.
At that time I had a 1080p monitor @120Hz without FreeSync.
For that resolution the Fury X was not bad. The games I played back then all worked at Ultra settings with good frame rates.
I sold it on eBay at the time for over 300€.
Somebody may still be using it.


----------



## king of swag187 (Feb 20, 2019)

My Fury X is still going well at 1080p 144Hz, but it's quite underutilized by my brother (720p 60Hz, lol)


----------



## OneMoar (Feb 21, 2019)

drivers are not going to fix this card being a rebranded instinct card

AMD no longer makes gaming cards they make compute accelerators

this card is never going to perform efficiently under gaming loads, if you wanna play games you buy nVidia


----------



## notb (Feb 21, 2019)

OneMoar said:


> drivers are not going to fix this card being a rebranded instinct card
> 
> AMD no longer makes gaming cards they make compute accelerators
> 
> this card is never going to perform efficiently under gaming loads, if you wanna play games you buy nVidia


Which is really ironic, since their compute accelerators aren't really a hit (that's why they're rebranding them for gaming).
They wanted to unify workstation and high-end gaming. This whole business strategy turned out to be a failure.

Everything could change if AMD focused on making purpose-built gaming chips and unified console and PC gaming, which would make sense for a change...

We'll see what happens with their datacenter products. IMO they'll give up and Intel will take over.


----------



## bug (Feb 21, 2019)

OneMoar said:


> drivers are not going to fix this card being a rebranded instinct card
> 
> AMD no longer makes gaming cards they make compute accelerators
> 
> this card is never going to perform efficiently under gaming loads, if you wanna play games you buy nVidia


Tbh, Nvidia also used to put a lot of compute resources in their cards. But when everybody was stuck on 28nm, Nvidia looked to get the most out of the die space and, starting with Maxwell, cut back on compute resources in their gaming cards.
That doesn't make AMD's cards lesser gaming cards. But, in comparison, it does make GCN resemble NetBurst from an efficiency point of view.


----------



## Frutika007 (Feb 21, 2019)

RIP TechPowerUp. The AMD cultists have spoken.

Oh look, another AMD cultist giving AdoredTV's review link as irrefutable proof of Radeon VII's performance increase via the new driver.
And it looks like INSTG8R is also an AMD cultist, giving a -1 rating as I revealed the dark side of his cult.


----------



## OneMoar (Feb 21, 2019)

Enough. Will the sub-50-post new members kindly go pound sand?
Your YouTube drama is not welcome here.


----------



## Vayra86 (Feb 21, 2019)

R4WN4K said:


> RIP TechPowerUp. The AMD cultists have spoken.
> 
> Oh look, another AMD cultist giving AdoredTV's review link as irrefutable proof of Radeon VII's performance increase via the new driver.
> And it looks like INSTG8R is also an AMD cultist, giving a -1 rating as I revealed the dark side of his cult.



You are getting a -1 from me because I don't need to see random idiot comments. You can find anything on the internet, just google it and boom, done.

Relevance = -1000

If you want to troll, go to that place you found those screens at. Don't do it here.


----------



## Redwoodz (Feb 22, 2019)

OneMoar said:


> drivers are not going to fix this card being a rebranded instinct card
> 
> AMD no longer makes gaming cards they make compute accelerators
> 
> this card is never going to perform efficiently under gaming loads, if you wanna play games you buy nVidia




AMD has been doing the same thing since the 7970. It was Nvidia who removed compute performance from their GTX cards. Now they have added it back, charging you double and calling it RTX.


----------



## arbiter (Feb 27, 2019)

Redwoodz said:


> AMD has been doing the same thing since the 7970. It was Nvidia who removed compute performance from their GTX cards. Now they have added it back, charging you double and calling it RTX.


So then explain why the AMD card is the same price? How would Nvidia adding compute performance back make AMD cards cost the same?


----------



## londiste (Feb 27, 2019)

Redwoodz said:


> AMD has been doing the same thing since the 7970. It was Nvidia who removed compute performance from their GTX cards. Now they have added it back, charging you double and calling it RTX.


What exactly do you mean by compute? The discussion here has been about FP64 performance. RTX is an entirely different type of computation, and a very specialized one at that: BVH traversal.

FP64 is almost completely useless when it comes to gaming. It is useful in certain types of compute scenarios. Both manufacturers have struggled to find a balance between workstation/server/GPGPU cards and consumer cards in terms of compute features. If you look at the history, both have also settled to the balance points they decided upon - AMD at 1:16 and Nvidia at 1:32, with both trying to have a compute GPU at the top of their lineups that can do 1:2 or thereabouts.

When it comes to FP64, AMD history looks like this (a little messy due to reuse of GPUs over generations):
- HD4000/5000 high and midrange cards have FP64 at 1:5 FP32 (4870/4850/4770/4750, 5870/5850/5830). Lowend does not do FP64.
- Some of (higher end) HD6000/7000 have 1:4 (Tahiti, 7950/7970). HD7000 midrange has 1:16 (Pitcairn, 7870/7850), lowend has 1:16. Some really lowend things do not do FP64.
- High end R* 200 series (Hawaii, R9 290/290X) has 1:8, midrange has 1:4 (Tahiti, R9 280/280X) or 1:16 (Tonga, R9 285/285X) and lowend has 1:16. Some really lowend things do not do FP64.
- Fiji (Fury/FuryX) has 1:16
- RX400/500 has 1:16
- Vega10 (Vega56/Vega64) has 1:16
- Vega20 (Radeon VII) has 1:4

The FP64 situation on the Nvidia side looks like this:
- GTX200 series high end (GTX280/260) has 1:8.
- GTX400 series (Fermi) high end (GTX480/470) has 1:8, midrange and lowend (GTX460/450/440/430) has 1:12 and lowest end does not do FP64.
- GTX600 series (Kepler) has 1:24, except Titans at 1:3 and some lowend cards that are Fermi and have 1:12.
- GTX900 series (Maxwell) has 1:32.
- GTX1000 series (Pascal) has 1:32.
- Volta (Titan V) has 1:2.
- RTX2000/GTX1600 series (Turing) has 1:32.
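The ratios in those lists translate directly into peak FP64 rates: divide a card's peak FP32 throughput by the ratio's divisor. A minimal sketch in Python (the FP32 figures are approximate public spec-sheet numbers added for illustration, not taken from this thread):

```python
# Peak FP64 throughput follows from the FP32 peak and the FP64:FP32 ratio.
def fp64_tflops(fp32_tflops: float, ratio_divisor: int) -> float:
    """FP64 peak in TFLOPS, given the FP32 peak and the N of a 1:N ratio."""
    return fp32_tflops / ratio_divisor

# Approximate FP32 peaks (TFLOPS) from public spec sheets, paired with each
# card's FP64:FP32 ratio from the lists above.
cards = {
    "Radeon VII (1:4)": (13.8, 4),
    "Vega 64 (1:16)":   (12.7, 16),
    "RTX 2080 (1:32)":  (10.1, 32),
    "Titan V (1:2)":    (14.9, 2),
}

for name, (fp32, divisor) in cards.items():
    print(f"{name}: ~{fp64_tflops(fp32, divisor):.2f} TFLOPS FP64")
```

This is why Radeon VII's 1:4 ratio matters to compute users: it puts the card in a different class of FP64 throughput than any 1:16 or 1:32 consumer part.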


----------



## bug (Feb 27, 2019)

londiste said:


> What exactly do you mean by compute?



I believe he was referring to this: https://en.wikipedia.org/wiki/Maxwell_(microarchitecture)#Performance
However, that's Nvidia cutting back on double precision computing power, which is really different from what RTX and tensor cores do.


----------



## Redwoodz (Mar 2, 2019)

londiste said:


> What exactly do you mean by compute? The discussion here has been about FP64 performance. RTX is an entirely different type of computations and a very specialized one at that - BVH traversal.
> 
> FP64 is almost completely useless when it comes to gaming. It is useful in certain types of compute scenarios. Both manufacturers have struggled to find a balance between workstation/server/GPGPU cards and consumer cards in terms of compute features. If you look at the history, both have also settled to the balance points they decided upon - AMD at 1:16 and Nvidia at 1:32, with both trying to have a compute GPU at the top of their lineups that can do 1:2 or thereabouts.
> 
> ...



Well you stated the basis there, Nvidia using 1:32 vs AMD using 1:16 for the past couple generations.


----------



## notb (Mar 2, 2019)

Redwoodz said:


> Well you stated the basis there, Nvidia using 1:32 vs AMD using 1:16 for the past couple generations.


But you said that Nvidia "removed compute", so it doesn't really matter how they compare to AMD. What matters is what happened over the years, because that's what you're referring to.
And @londiste collected the data.
Over the last 10 years Nvidia moved from 1:8 / 1:12 to 1:32.
In the same period AMD moved from 1:4 / 1:5 to 1:16.

So it does seem that they both "removed compute". Do you agree?


----------



## hat (Mar 2, 2019)

notb said:


> Which is really ironic since their compute accelerators aren't really a hit (that's why they're rebranding them for gaming).
> They wanted to unify workstation and high-end gaming. This whole business strategy turned out to be a failure.
> 
> Everything could change if AMD focused on making purpose-built gaming chips and unify consoles and PC gaming, which would make sense for a change...
> ...


Eh, maybe. While it's true a console chip's primary purpose would be, well, gaming, console chips are also custom designed (to some extent) and last for years. The latest generation (PS4/Xbone) is the exception, with bigger and badder variants of the same console (a la PS4 Pro); previous consoles used more or less the same hardware for a long, long time compared to desktop parts. Every year or so a new generation of graphics cards comes out... meanwhile, the same hardware lasted 7 years with the PS3, and 5 years and counting with the PS4, if you toss out the PS4 Pro...

So, it's not quite the same animal. Making desktop chips requires constant improvements, whereas you make a decent chip for a console and it's expected to last 5 years or more. AMD already has Navi in the bag, primarily for the PS5, and we'll see that on desktops too, but it remains to be seen how well it will do, or what comes after that.


----------



## efikkan (Mar 3, 2019)

I think some need to look up what "compute" actually means. fp16, fp32, fp64, int32, FMA, and tensor operations all refer to compute; compute operations should not be confused with what people call "compute" workloads, which are typically CUDA or OpenCL simulations, etc.

The thing about fp64 is that it's not normally used during rendering; in fact, fp32 is overkill for parts of the rendering in games. E.g. after rasterization, when processing fragments ("pixels"), only a tiny fraction of the precision of fp32 is actually used.
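To put rough numbers on that: an 8-bit display channel only distinguishes 256 levels, while even half precision (fp16) resolves finer steps than that near 1.0. A quick NumPy check (illustrative, not from the post above):

```python
import numpy as np

# Gap between adjacent representable values at 1.0, per floating-point format.
fp16_step = float(np.spacing(np.float16(1.0)))  # 2**-10 ~= 0.000977
fp32_step = float(np.spacing(np.float32(1.0)))  # 2**-23 ~= 0.00000012

# Gap between adjacent 8-bit display levels on a [0, 1] color scale.
display_step = 1.0 / 255                        # ~0.003922

print(f"fp16 step near 1.0: {fp16_step:.6f}")
print(f"fp32 step near 1.0: {fp32_step:.8f}")
print(f"8-bit color step:   {display_step:.6f}")
```

Even fp16 resolves roughly four times finer than an 8-bit output channel needs, and fp32 is another factor of ~8000 finer still, which is why most of fp32's precision goes unused in fragment shading.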


----------



## Raven Rampkin (Mar 5, 2019)

londiste said:


> Yes, RX570 wipes the floor with GTX1050Ti. They are cards from different segments.
> RX570 is the intended competitor of GTX1060 3GB.
> GTX1050Ti is the intended competitor of RX560.
> 
> Pricing in the lowend and midrage is FUBAR.



Looks like it's only FUBAR when the AMD option is cheaper


----------

