# AMD Radeon VII 16 GB



## W1zzard (Feb 7, 2019)

The time has come. We're finally allowed to talk about Radeon VII performance numbers. The company's new flagship graphics card is the world's first to be made using a 7 nanometer production process. Also, it has the largest VRAM size of any card below $1000: 16 GB.

*Show full review*


----------



## laszlo (Feb 7, 2019)

as an amd user.... i wouldn't buy it... too much hype from the red team...


----------



## GeorgeMan (Feb 7, 2019)

Another disappointment. 1 TB/s of bandwidth and it's not even better than my 1080 Ti (stock vs stock) at my resolution (3440x1440)....
Excellent review as always!


----------



## EarthDog (Feb 7, 2019)

Well... first, ty W1z... 

So it seems that it's:

* 14% slower (average) than a 2080 at 1440p (its bread-and-butter res; the gap is even bigger at lower resolutions for the high-Hz crowd). 10% slower at 4K UHD.
* 90W _average_ power difference between it and a 2080 (peak is a whopping 125W difference in the power-virus app - W1z, stop using it... LOL).
* 7 dB louder under load than a 2080 and 4 dB louder than a 2080 Ti.


Well, again, it's another choice in the market, which is good for everyone. A compelling choice... not so much. A used 1080 Ti is as fast (4K UHD) or faster (FHD/QHD), uses less power, and is cheaper. The sound issue can be mitigated by AICs, but everything else is tough to swallow for most. I am surprised and quite disappointed that it is so much slower than a 2080.


----------



## Wavetrex (Feb 7, 2019)

Oh boy.
They really need to change that GCN architecture.

DOA card unfortunately.


----------



## oxidized (Feb 7, 2019)

What a huge failure. But I don't understand the inconsistency across games; could it be the drivers are still a bit fckd up?


----------



## birdie (Feb 7, 2019)

A decent card for hardcore AMD fans who want top performance. TDP doesn't really matter for such cards, since you're better served with water cooling anyway.

The only issue is that NVIDIA has leeway to lower prices (it doesn't use super-expensive HBM2) and to release Turing on 7nm, both of which could spell disaster for AMD; but given the rumors about this card's availability, I'm not really concerned.



oxidized said:


> What a huge failure. But I don't understand the inconsistency across games; could it be the drivers are still a bit fckd up?



I'm not sure about that. This is basically Vega 64 on steroids - it's the same architecture with better specs.


----------



## W1zzard (Feb 7, 2019)

EarthDog said:


> peak is a whopping 125W difference in the power virus app - W1z, stop using it... LOL


FurMark is just a free data point for curious readers; it has no influence on the outcome of the review.

"Peak (Gaming)" is the highest single value measured during gaming (Metro: Last Light); "Average (Gaming)" is the average of those values.
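For anyone curious how both numbers come out of the same run, a minimal sketch (the sample wattages below are made up for illustration, not actual review data):

```python
# Hypothetical power readings (watts) sampled during a gaming run.
samples_w = [271, 268, 275, 313, 266, 270]

peak_gaming = max(samples_w)                      # highest single reading
average_gaming = sum(samples_w) / len(samples_w)  # mean of all readings

print(peak_gaming)               # 313
print(round(average_gaming, 1))  # 277.2
```

A single spike (313 W here) sets the peak while barely moving the average, which is why the two figures can sit far apart.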


----------



## Deleted member 158293 (Feb 7, 2019)

It really is a compute card rather than a gaming card.  Swing & a miss in the consumer space; that build quality is really too high for consumer use and drives up the price.


----------



## EarthDog (Feb 7, 2019)

W1zzard said:


> Furmark is just a free data point for curious readers, it has no influence on the outcome of the review
> 
> "Peak (Gaming)" is the highest single value measured during gaming (Metro Last Light), "Average (Gaming)" is the average of these values


Yep, I get it; I'm not worried about the outcome, but the perception that its values mean much of anything is validated by its use. Both NV and AMD agree that it is a power virus and that it should not be used for testing... curiosity or not.


----------



## cdawall (Feb 7, 2019)

This card would have been a good purchase if they released it exactly one year ago. 

Day late and a dollar short for performance and pricing on this one.


----------



## jihadjoe (Feb 7, 2019)

I found the performance per dollar charts really amusing lol.

AMD has just managed to build the second-worst-value consumer card ever, beaten only by the 2080 Ti!


----------



## Space Lynx (Feb 7, 2019)

What a joke, Lisa Su lost all respect from me for allowing this product to be released. I was considering a 3700x, but screw it. 9700k and rtx 2080 is probably my next build. 

really disappointing, 30 fps slower across the board at 1080p almost... and using 100 watts more... wow so pathetic...


----------



## XXL_AI (Feb 7, 2019)

as Jensen said, it sucks. try again amd.  the results are mediocre at best. lol. they can't even compete with a 3yr old GPU.. roflmao.


----------



## rrrrex (Feb 7, 2019)

Looks like AMD has to rewrite its drivers, at least for the coming cards. The current drivers just can't utilize the CPU properly for the highest performance.


----------



## the54thvoid (Feb 7, 2019)

Oh hey, look, another disappointing release. For me it's not so much the performance, although as mentioned, GCN is just plain awful these days. If you NEED Vulkan or DX12 for your card to shine, that's not a great option. But that noise level..... on a three-fan design. I think AMD should carve off Radeon to Intel entirely. I'm still looking forward to Ryzen 2 though.... C'mon AMD, rooting for success there!


----------



## Deleted member 158293 (Feb 7, 2019)

...and...  OOS in most places with many countries not even able to list it...  that was fast


----------



## Nima (Feb 7, 2019)

So it trades blows with the RTX 2070 and is much slower than the 2080, even without factoring in ray tracing or DLSS. AMD made the RTX cards look cheap!


----------



## the54thvoid (Feb 7, 2019)

And at OcUK - it's £650 - £800.


----------



## Super XP (Feb 7, 2019)

lynx29 said:


> What a joke, Lisa Su lost all respect from me for allowing this product to be released. I was considering a 3700x, but screw it. 9700k and rtx 2080 is probably my next build.
> 
> really disappointing, 30 fps slower across the board at 1080p almost... and using 100 watts more... wow so pathetic...





GeorgeMan said:


> Another disappointment. 1 TB/s of bandwidth and it's not even better than my 1080 Ti (stock vs stock) at my resolution (3440x1440)....
> Excellent review as always!


*We all know AMD needs a complete GPU redesign.* Until that happens, this is what they can offer. Though I do believe the price is at least $150 too high, despite the three AAA games included.


----------



## M2B (Feb 7, 2019)

I said it's not going to beat the 1080Ti, and well... there you go.


----------



## Steevo (Feb 7, 2019)

Great for compute workloads, I bet if the crypto market hadn't crashed this card would have ruled them all. 

But crypto miners are not gamers, and I'm not buying crypto hardware when all I want is cheaper, faster performance.


----------



## EarthDog (Feb 7, 2019)

Its saving grace is compute.... so for those who need its compute abilities, it's going to be king of the hill in many (not all) circumstances.

https://www.anandtech.com/show/13923/the-amd-radeon-vii-review/15


----------



## Space Lynx (Feb 7, 2019)

yakk said:


> ...and...  OOS in most places with many countries not even able to list it...  that was fast



its not. they only had a handful to sell per store anyway. this is what AMD will tell shareholders: we sold out within minutes! but France only had 12 units to sell!  LOL  what a joke.


----------



## Deleted member 158293 (Feb 7, 2019)

lynx29 said:


> its not. they only had a handful to sell per store anyway. this is what AMD will tell shareholders: we sold out within minutes! but France only had 12 units to sell!  LOL  what a joke.



All good, just saw amd.com website has stock.


----------



## XXL_AI (Feb 7, 2019)

EarthDog said:


> Its saving grace is compute.... so for those who need its compute abilities, its going to be king of the hill in many (not all) circumstances.
> 
> https://www.anandtech.com/show/13923/the-amd-radeon-vii-review/15



I don't see the Titan RTX or RTX 2080 Ti on that list, do you?
I wonder how much amd paid to make those graphs look better than Nvidia's.


----------



## Space Lynx (Feb 7, 2019)

yakk said:


> All good, just saw amd.com website has stock.



anyone who pays $699 for 30 fps slower... actually, let's make that 40 fps slower at 1080p across the board, almost (cause the Radeon VII doesn't OC for crap compared to an OC'd 2080), while it costs you more on the energy bill each month...

well I will withhold my comments. LOL


----------



## phill (Feb 7, 2019)

As disappointing as it might be for some, I feel like this...
With driver updates and such, I believe they might be able to close the gaps a bit.  I do feel that if the price was around the £600 mark it would have been a better buy (maybe that will happen at some point), but even with slower performance than everyone was hoping for, in the games where the drivers and such are working for it, this AMD card is a step in the right direction 

I would like to say hats off to them, but instead I'll tip my hat and just wait till the aftermarket-cooled cards are around and go from there.  Still no regrets about buying my two 1080 Ti's, but I'd definitely be looking at this card for the games where it pushes past the 2080 and gets near the 2080 Ti


----------



## EarthDog (Feb 7, 2019)

XXL_AI said:


> I don't see the Titan RTX or RTX 2080 Ti on that list, do you?
> I wonder how much amd paid to make those graphs look better than Nvidia's.


Not a dime... don't be a fool.

Correct... the $3000 Titan RTX isn't in there, nor the $1,100 2080 Ti. But will they be 2x faster to justify their pricing???? Look at the bigger picture.



phill said:


> As disappointing as it might be for some, I feel like this...
> With driver updates and such, I believe they might be able to close the gaps a bit.  I do feel that if the price was around the £600 mark it would have been a better buy (maybe that will happen at some point), but even with slower performance than everyone was hoping for, in the games where the drivers and such are working for it, this AMD card is a step in the right direction
> 
> I would like to say hats off to them, but instead I'll tip my hat and just wait till the aftermarket-cooled cards are around and go from there.  Still no regrets about buying my two 1080 Ti's, but I'd definitely be looking at this card for the games where it pushes past the 2080 and gets near the 2080 Ti


Drivers will vary from no gains to several percent. It will vary by title. But as someone said, there isn't much difference under the hood here compared to Vega, and that is a mature process. On this front, I won't hold my breath.

The aftermarket cards will be quieter, but will also fetch an additional premium, making price-to-performance (where AMD typically shines) even worse...


----------



## R0H1T (Feb 7, 2019)

M2B said:


> I said it's not going to beat the 1080Ti, and well... there you go.


Say that again


----------



## phanbuey (Feb 7, 2019)

phill said:


> As disappointing as it might be for some, I feel like this...
> With driver updates and such, I believe they might be able to close the gaps a bit.  I do feel that if the price was around the £600 mark it would have been a better buy (maybe that will happen at some point), but even with slower performance than everyone was hoping for, in the games where the drivers and such are working for it, this AMD card is a step in the right direction
> 
> I would like to say hats off to them, but instead I'll tip my hat and just wait till the aftermarket-cooled cards are around and go from there.  Still no regrets about buying my two 1080 Ti's, but I'd definitely be looking at this card for the games where it pushes past the 2080 and gets near the 2080 Ti



That's pretty optimistic, especially given this isn't a new design.  It's doubtful the drivers are very different from Vega's, if at all.  I would not sell your 1080 Ti's for this any more than for the 2080 - they are all virtually the same card.

Definitely wait for the 7nm redesign.


----------



## Space Lynx (Feb 7, 2019)

phill said:


> As disappointing as it might be for some, I feel like this...
> With driver updates and such, I believe they might be able to close the gaps a bit.  I do feel that if the price was around the £600 mark it would have been a better buy (maybe that will happen at some point), but even with slower performance than everyone was hoping for, in the games where the drivers and such are working for it, this AMD card is a step in the right direction
> 
> I would like to say hats off to them, but instead I'll tip my hat and just wait till the aftermarket-cooled cards are around and go from there.  Still no regrets about buying my two 1080 Ti's, but I'd definitely be looking at this card for the games where it pushes past the 2080 and gets near the 2080 Ti




a driver update will close a 40 fps gap over an OC'd 2080 at 1080p while using less wattage? lol mmk.  i still play a lot of DX11 games, so yeah... no thanks.


----------



## ArbitraryAffection (Feb 7, 2019)

R0H1T said:


> Say that again


Sorry, but these are all best-case for AMD, and thus cherry-picked.

Overall this card is a major letdown: 7nm, massive power use, worse perf/watt than NVIDIA on 16nm, and it can't universally beat the 1080 Ti after two years.


----------



## cucker tarlson (Feb 7, 2019)

holy crap, slower than a 1080 Ti? Am I seeing this correctly?


----------



## Space Lynx (Feb 7, 2019)

ArbitraryAffection said:


> Sorry, but these are all best-case for AMD, and thus cherry-picked.
> 
> Overall this card is a major letdown: 7nm, massive power use, worse perf/watt than NVIDIA on 16nm, and it can't universally beat the 1080 Ti after two years.




yep, in 95% of games it's terrible.  i love how people cherry-pick the 5% of games it does well in, LOL


----------



## ArbitraryAffection (Feb 7, 2019)

cucker tarlson said:


> holy crap, slower than a 1080 Ti? Am I seeing this correctly?


Yes.

I cried a bit inside.


----------



## kings (Feb 7, 2019)

How disappointing: performance between an RTX 2070 and an RTX 2080, while consuming close to 300W.

This card is terrible on performance per watt, and on price/performance too.

Like I said before, it's a rushed PR-stunt card, just so people don't forget AMD in this segment.

The only good thing I see is the 16GB of VRAM, which can be interesting for a niche of consumers!


----------



## R0H1T (Feb 7, 2019)

ArbitraryAffection said:


> Sorry, but these are all best-case for AMD, and thus cherry-picked.
> 
> Overall this card is a major letdown: 7nm, massive power use, worse perf/watt than NVIDIA on 16nm, and it can't universally beat the 1080 Ti after two years.


Best case? So what about the best-case scenarios for the 1080 Ti, do they not count? It's a virtual tie, depending on resolution & quality.


----------



## Space Lynx (Feb 7, 2019)

ArbitraryAffection said:


> Yes.
> 
> I cried a bit inside.



ya, it is depressing. I had high hopes that, if nothing else, it would OC well and then compete... but nope, it apparently doesn't OC at all (according to Guru3D)


----------



## cucker tarlson (Feb 7, 2019)

lynx29 said:


> What a joke, Lisa Su lost all respect from me for allowing this product to be released. I was considering a 3700x, but screw it. 9700k and rtx 2080 is probably my next build.
> 
> really disappointing, 30 fps slower across the board at 1080p almost... and using 100 watts more... wow so pathetic...


come on, what does your cpu choice have to do with that?


----------



## KarymidoN (Feb 7, 2019)

Everybody is saying the card is trash, but it beats the 2080 by a good margin in 4 titles; maybe with some optimization and driver maturity it will be a good card. As it is RN, it's just not there.
This card is just a rebranded Radeon Instinct.


----------



## M2B (Feb 7, 2019)

R0H1T said:


> Say that again






TPU's performance summary is based on significantly more games and it's definitely more accurate because of this.


----------



## Deleted member 158293 (Feb 7, 2019)

lynx29 said:


> anyone who pays $699 for 30 fps slower... actually, let's make that 40 fps slower at 1080p across the board, almost (cause the Radeon VII doesn't OC for crap compared to an OC'd 2080), while it costs you more on the energy bill each month...
> 
> well I will withhold my comments. LOL



As I posted earlier, this shouldn't be a consumer card. I need compute, and the awesome build quality + price target is perfect for 24/7 use on the cheap.

I wouldn't spend more than $250 just to play games...


----------



## Space Lynx (Feb 7, 2019)

cucker tarlson said:


> come on,what does your cpu choice have to do with that ?



because Lisa Su was the last person in the industry I trusted. and she lost that trust today. now i trust none of them, intel, amd, or nvidia. so i might as well just go with classic intel and nvidia for my next build.  9700k is the best gaming chip especially since i see it for $369 free ship no tax right now.


----------



## R0H1T (Feb 7, 2019)

M2B said:


> TPU's performance summary is based on significantly more games and it's definitely more accurate because of this.


It's not more accurate just because it has more games; it does cover more titles, for sure. If you take an avg from 100 games, will that be more or less accurate?
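FWIW, the effect of sample size on a performance summary is easy to see with a toy calculation (the per-game ratios below are invented for illustration, not TPU's or anyone else's measurements): with only a few games, a couple of outlier titles dominate the summary; with more games, they get diluted.

```python
from statistics import geometric_mean

# Hypothetical fps ratios (card under test / reference card), one per game.
small_sample = [1.10, 1.05, 0.80]   # 3 games, two of them friendly outliers
large_sample = small_sample + [0.85, 0.88, 0.82, 0.90, 0.86, 0.84, 0.87]

print(round(geometric_mean(small_sample), 3))   # 0.974 -> near parity
print(round(geometric_mean(large_sample), 3))   # 0.892 -> ~11% behind
```

Neither number is more "accurate" per se; a larger, representative game selection just makes the summary less sensitive to any single title.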


----------



## ArbitraryAffection (Feb 7, 2019)

cucker tarlson said:


> come on,what does your cpu choice have to do with that ?


Lol, they said that? :O Ryzen is fantastic and taking the fight directly to Intel. Radeon's just... a bit behind xD (still can have great value, take my 570 for example). 

3700X will be in another league to 9700K. I'm not abandoning AMD because of the VII being a bit.. mediocre. xD


----------



## Space Lynx (Feb 7, 2019)

yakk said:


> As I posted earlier, this shouldn't be a consumer card, I'm needing compute and the awesome build quality + price target is perfect for 24/7 use on the cheap.
> 
> I wouldn't spend more than $250 just to play games...



but it is and lisa su advertised it as such.


----------



## ArbitraryAffection (Feb 7, 2019)

lynx29 said:


> because Lisa Su was the last person in the industry I trusted. and she lost that trust today. now i trust none of them, intel, amd, or nvidia. so i might as well just go with classic intel and nvidia for my next build.  9700k is the best gaming chip especially since i see it for $369 free ship no tax right now.


Feel free to waste your money. Ryzen 3000 is going to make a lot of 9th-gen parts obsolete at their current prices.


----------



## phanbuey (Feb 7, 2019)

It's very fast in DX12, but it needs a triple-slot cooler and a $100-$150 price drop.


----------



## cucker tarlson (Feb 7, 2019)

now to include a lit-up gadget in the box and rephrase low availability as "limited edition", and fanboys will line up like crazy for a 300W 1080 Ti at $700.



lynx29 said:


> because Lisa Su was the last person in the industry I trusted. and she lost that trust today. now i trust none of them, intel, amd, or nvidia. so i might as well just go with classic intel and nvidia for my next build.  9700k is the best gaming chip especially since i see it for $369 free ship no tax right now.


then you're the sucker for trusting a business.


----------



## M2B (Feb 7, 2019)

R0H1T said:


> It's not more accurate just because it has more games, it does cover more titles for sure. What if you take an avg from 100 games, will that be more/less accurate?



Yeah, it's more accurate just because it has more games. Simple logic.


----------



## Space Lynx (Feb 7, 2019)

ArbitraryAffection said:


> Lol, they said that? :O Ryzen is fantastic and taking the fight directly to Intel. Radeon's just... a bit behind xD (still can have great value, take my 570 for example).
> 
> 3700X will be in another league to 9700K. I'm not abandoning AMD because of the VII being a bit.. mediocre. xD



we will see. the 9700k still beats AMD's 2700x in min fps and 0.1% lows across the board at 1080p, by 10 fps and sometimes as much as 30 fps. but @W1zzard no longer benches min fps. and overall avg and peak fps is almost always won by the 9700k, in maybe 95% of games; i mean the 2700x is very close, but yeah i dunno

if the 3700x can do 4.8ghz, and the 9700k 5ghz - i will be very interested to see those benches. rumors are IPC is not improving with the 3700x, only clock speeds, so the 9700k at 5ghz prob will still be the winner.



ArbitraryAffection said:


> Feel free to waste your money. Ryzen 3000 is going to make a lot of 9th-gen parts obsolete at their current prices.



I'm waiting for benches before I buy. so no worries. but we will see.


----------



## moproblems99 (Feb 7, 2019)

I was going to buy one of these if it was within 5% of the 2080, or thereabouts.  14% is a tough sell.



lynx29 said:


> What a joke, Lisa Su lost all respect from me for allowing this product to be released. I was considering a 3700x, but screw it. 9700k and rtx 2080 is probably my next build.
> 
> really disappointing, 30 fps slower across the board at 1080p almost... and using 100 watts more... wow so pathetic...



What? You can't be serious.  Why would you not get a perfectly good CPU that is likely going to be better than the 9700K, because of this GPU? What did you really expect from this GPU?  Simple math told you this is where it was going to be.  Besides, the i9-9900K is basically the equivalent of this GPU in the CPU space.


----------



## R0H1T (Feb 7, 2019)

M2B said:


> Yeah, It's more *accurate just because it has more games*. Simple logic.


No it's not, certainly not logical the way you make it sound!


----------



## ArbitraryAffection (Feb 7, 2019)

lynx29 said:


> we will see. 9700k still beats AMD 2700x in min fps and 0.1% mins across the board at 1080p by 10 fps sometimes as high as 30 fps. but @W1zzard no longer benches min fps testing. and overall fps avg and highs is always won by 9700k for about 95?% of games, i mean 2700k is very close, but yeah i dunno
> 
> if 3700x can do 4.8ghz, and 9700k at 5ghz - i will be very interested to see those benches, rumors are IPC is not improving with 3700x, only clock speeds.  so 9700k at 5ghz prob will still be the winner.
> 
> ...


Rumours of IPC not improving? Where did you hear that? The Zen 2 arch has an entirely re-designed front-end, including a better branch predictor. That alone will remove some bottlenecks in the execution engine (Zen 1 is front-end limited, apparently) and give higher per-clock performance. Honestly, I'm expecting around 5-10% higher IPC in most things. We'll see how gaming IPC improves, especially with regard to DRAM latency, which I fear may not actually improve that much with the separate I/O die.


----------



## XXL_AI (Feb 7, 2019)

EarthDog said:


> Not a dime... don't be a fool.
> 
> Correct... the $3000 Titan RTX isn't in there, nor the 2080 Ti for $1100. But will it be 2x faster to justify their pricing???? Look bigger picture.
> 
> ...



My $1200 Titan X Pascal never stopped for a second in two years of deep-learning training; I've never seen an AMD GPU last that long under heavy compute work. When I pay for it, I expect workstation-class performance. The Radeon VII is marketed as a GPU that can do all the jobs; these reviews are proof it'll contribute to global warming, nothing more. Also, AMD as a whole is able to make 1000 cards at best; I wonder how many GPU units they have left for other companies  lol. amd lost the gpu war, and this gpu and its hype are the proof of it.


----------



## Space Lynx (Feb 7, 2019)

moproblems99 said:


> I was going to buy one of these if it was within 5% of the 2080. Or there abouts.  14% is a tough sell.
> 
> 
> 
> What? You can't be serious?  Why would you not get a likely perfectly good cpu that is going to be likely better than the 9700K because of this GPU? What did you really expect from this GPU?  Simple math told you this is where it was going to be.  Besides, the i9 9900K is basically the equivalent of this GPU in the CPU space.




yeah im serious.  look at the ryzen benches below. ryzen is a joke compared to intel, for some reason GND is the only website that tests Ryzen properly. 



http://imgur.com/a/oTNrz


----------



## dirtyferret (Feb 7, 2019)

Overpriced, and by the time you need 16GB the GPU will be far slower than what will be on the market for a fraction of the cost.  I'm hoping AMD comes out with some solid cards in the $200-250 price range soon, as I think Nvidia is dropping the ball there.


----------



## moproblems99 (Feb 7, 2019)

lynx29 said:


> yeah im serious.  look at the ryzen benches below. ryzen is a joke compared to intel, for some reason GND is the only website that tests Ryzen properly.
> 
> 
> 
> http://imgur.com/a/oTNrz



What does a 1700X vs 8700k have to do with a 3XXX and 9700k?  Everyone knows there was a decent jump from 1XXX to 2XXX.  There is also likely going to be a similar jump from 2XXX to 3XXX.  All they need is that last jump to have parity.


----------



## W1zzard (Feb 7, 2019)

lynx29 said:


> yeah im serious.  look at the ryzen benches below. ryzen is a joke compared to intel, for some reason GND is the only website that tests Ryzen properly.
> 
> 
> 
> http://imgur.com/a/oTNrz


those graphs aren't starting at 0!


----------



## ArbitraryAffection (Feb 7, 2019)

lynx29 said:


> yeah im serious.  look at the ryzen benches below. ryzen is a joke compared to intel, for some reason GND is the only website that tests Ryzen properly.
> 
> 
> 
> http://imgur.com/a/oTNrz


The 1700X is 6% slower in that bench; the bars are misleading lmao. Also, the 1700X is cheaper, and there it is vs a decent max-OC 8700K while the 1700X is only at 3.7 GHz. Try again, this time with the 2700X, which is the 8700K's competitor (and even that is cheaper).


----------



## cucker tarlson (Feb 7, 2019)

ArbitraryAffection said:


> 3700X will be in another league to 9700K. I'm not abandoning AMD because of the VII being a bit.. mediocre. xD


it's still gonna get beat in games.
it's gonna beat it in cinnabon and video encoding tho.

look at the CPU performance numbers at 720p, it doesn't get much more CPU-intensive than that

Spoiler: *(720p benchmark charts, attachments not shown)*

The 9600K stock is 15% faster than the 2700X, while the 2700X is only 12% faster than the 1600.
The 2700X has a 20% higher clock than the 1600, plus more cores too. I'll add that going from 6 to 8 cores is a much more critical core increase than going from 8 to higher, and there's a 7% IPC improvement too.

https://www.computerbase.de/2018-04/amd-ryzen-2000-test/5/

now what are you getting with zen 2? probably less than a 20% core-clock boost, a core-count increase (but it's from 8 to 10/12 now), and probably more than a 7% ipc improvement, though not anything massive. now let's take a look at that 15% number I mentioned earlier again.
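To make the arithmetic explicit, a back-of-envelope sketch (all inputs are the rough percentages quoted in this thread, treated as assumptions, not measurements of mine):

```python
def compound_gain(clock_gain: float, ipc_gain: float) -> float:
    """Compound a clock-speed gain with an IPC gain (both as fractions)."""
    return (1 + clock_gain) * (1 + ipc_gain) - 1

# 1600 -> 2700X: ~20% clock plus ~7% IPC on paper...
on_paper = compound_gain(0.20, 0.07)  # ~0.284, i.e. ~28% theoretical
observed = 0.12                       # ...yet only ~12% showed up in games

# Games realized well under half of the paper gain.
scaling_efficiency = observed / on_paper  # ~0.42
```

If Zen 2's paper gains translate into games at a similar rate, the realized uplift lands short of that 15% stock gap; that's the shape of the argument, whatever the exact numbers turn out to be.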


----------



## Basard (Feb 7, 2019)

Well..... this is embarrassing.


----------



## moproblems99 (Feb 7, 2019)

W1zzard said:


> those graphs aren't starting at 0 !



Didn't even catch that!  Also, how much cheaper was the 1700X vs the 8700K, to have a video encode take 767 seconds instead of 792 seconds?


----------



## ArbitraryAffection (Feb 7, 2019)

cucker tarlson said:


> it's still gonna get beat in games.
> 
> look at cpu performance numbers at 720p,doesn't get much more cpu intensive than that
> 
> ...


In games, it will mostly be a bit behind Intel, you're right. But it's going to be so close that it doesn't matter, unless you're pushing 240Hz and playing highly competitive games (2700X can do 100fps gameplay). So Intel will likely have that gaming niche. But Zen2 is going to offer more cores, better value and performance in multi-threaded workloads including those with AVX2. Similar to how the 2000 series competes now, but with more of the good stuff


----------



## moproblems99 (Feb 7, 2019)

cucker tarlson said:


> now what are you getting with zen 2? probably less than a 20% core-clock boost, a core-count increase (but it's from 8 to 10/12 now), and probably more than a 7% ipc improvement, though not anything massive. now let's take a look at that 15% number I mentioned earlier again.



IPC is not the end-all.  With that IPC increase and a clock-speed bump, there will essentially be parity.


----------



## delshay (Feb 7, 2019)

Can you do a Superposition 1080p Extreme run & upload a screenshot, please?


----------



## iO (Feb 7, 2019)

If only AMD wouldn't go the cheap route and apply such unnecessarily high voltages to every card; some reviewers were able to push its efficiency to Turing levels with some undervolting.


----------



## Dante Uchiha (Feb 7, 2019)

It stalls or wins in some AAA games and loses in some lesser-known or Nvidia-biased games. Its average consumption in games is similar to the 2080.
It looks decent; it was a much better launch than the Vega 64.




M2B said:


> TPU's performance summary is based on significantly more games and it's definitely more accurate because of this.



IMHO, TPU tests a lot of games, which dilutes the advantage the Radeon VII has in some really relevant AAA games.



lynx29 said:


> driver update will close a 40 fps gap over a oc'd 2080 at 1080p while using less wattage? lol mmk.  i still play a lot of DX11 games so yeah... no thanks.



1080p on a US$700 GPU? The only resolution that matters in this thread is 4K. "Ah, but I want to play 1080p @ 250fps"; whatever, that crowd doesn't even represent 0.0001% of the market. Nobody cares.


----------



## OneMoar (Feb 7, 2019)

well, can't say I didn't see this coming
about the only times it comes close to a 2080 are in Battlefield V & Far Cry 5, which are probably the worst examples of an optimised, properly sorted game engine on the market. as per usual, all those CUs and no way to utilize them

#gcnneedstodie


----------



## unikin (Feb 7, 2019)

What was AMD thinking? Why release a GPU that is slower than the 2080 and costs €100 more (at least here in Germany: €749 vs €649)? What a joke... coming from an AMD fan


----------



## phill (Feb 7, 2019)

EarthDog said:


> Drivers will vary from no gains to several percent. It will vary by title. But as someone said, there isn't much difference under the hood here compared to Vega, and that is a mature process. On this front, I won't hold my breath.
> The aftermarket cards will be quieter, but will also fetch an additional premium, making price-to-performance (where AMD typically shines) even worse...



I believe it's another card that people have a choice to buy or not. I can understand the reasons why people might not, but I can also understand why people would.  I like to think positively about AMD; they have come a decent way.  Nvidia does have a lot more chance of getting it right, as it likes to release 20 versions of the same card compared to AMD's one or two..  But again, everything comes down to choice, and I wouldn't blame anyone for buying it or not 
I hope they don't do what Nvidia does and add a few hundred quid to the price for a slightly tweaked version..  But.....  Who knows



phanbuey said:


> That's pretty optimistic especially given this isn't a new design.  It's doubtful the drivers are very different from Vega, if at all.  I would not sell your 1080ti's for this anymore than the 2080 - they are all virtually the same card.
> Definitely wait for the 7nm redesign.



If it was a complete ground-up build I would have hoped for something even more, but if this is just taking the older Vega and building on it, I can see why it's not quite as good as we were hoping for..   I don't think there's anything out there that I'd swap my 1080 Ti's for at all...



lynx29 said:


> driver update will close a 40 fps gap over a oc'd 2080 at 1080p while using less wattage? lol mmk.  i still play a lot of DX11 games so yeah... no thanks.



I never mentioned power issues; I'm guessing that's AMD giving the GPU core a little more juice than it needed, just like Vega..  Hopefully that might change in a new card release, whenever that might be, but I don't really tend to worry about power usage...  Efficiency is great, but still, we've all had less efficient GPUs in our time...  I can remember a few


----------



## cucker tarlson (Feb 7, 2019)

ArbitraryAffection said:


> In games, it will mostly be a bit behind Intel, you're right. But it's going to be so close that it doesn't matter, unless you're pushing 240Hz and playing highly competitive games (2700X can do 100fps gameplay). So Intel will likely have that gaming niche. But Zen2 is going to offer more cores, better value and performance in multi-threaded workloads including those with AVX2. Similar to how the 2000 series competes now, but with more of the good stuff


there are people, and I think it's the vast majority, that would take the 9600K's gaming performance and the 9900K's utility performance at the price of a 9700K (or slightly lower) in a heartbeat.
not me tho, I want purely gaming chips only


----------



## Xuper (Feb 7, 2019)

Computerbase tested with UV. *Did you try it?*

https://www.computerbase.de/2019-02...istungsaufnahme-der-grafikkarte-wolfenstein-2

From 275 W to 207 W in Wolf2
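The quoted undervolting result works out to roughly a quarter less power. A minimal sketch of the arithmetic, using the figures from the linked Computerbase test (the helper name is just for illustration):

```python
# Relative power saving from the Computerbase undervolting figures
# quoted above (275 W stock vs. 207 W undervolted in Wolfenstein II).
def power_saving(stock_w: float, undervolted_w: float) -> float:
    """Return the power saving as a percentage of stock draw."""
    return (stock_w - undervolted_w) / stock_w * 100

print(f"{power_saving(275, 207):.1f}% less power at the same performance")
```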


----------



## XXL_AI (Feb 7, 2019)

lynx29 said:


> What a joke, Lisa Su lost all respect from me for allowing this product to be released. I was considering a 3700x, but screw it. 9700k and rtx 2080 is probably my next build.
> 
> really disappointing, 30 fps slower across the board at 1080p almost... and using 100 watts more... wow so pathetic...



in the end of the day, lisa su is the niece of jensen huang, all gpu is a family business


----------



## Space Lynx (Feb 7, 2019)

W1zzard said:


> those graphs aren't starting at 0 !



i know the person who made them, I will ask him. I don't think he is trying to be misleading though, GND is a small site.


----------



## sixor (Feb 7, 2019)

lol so amd7 = 2 x 580s for more money than 2x580


----------



## Space Lynx (Feb 7, 2019)

XXL_AI said:


> in the end of the day, lisa su is the niece of jensen huang, all gpu is a family business



all the more reason all 3 companies are dumb to me. so might as well go with the best performers and retire from this hobby, silicon is about dead anyway. I don't have much faith in these new chiplet designs incoming

the fact AMD allowed such a conflict of interest with their main competitor, a relative CEO... says all I need to know about if I trust companies or not.

i trust none of them. therefore I will buy the best performer, and retire.


----------



## R0H1T (Feb 7, 2019)

Xuper said:


> Computerbase tested with UV.*Did you try it* ?
> 
> https://www.computerbase.de/2019-02...istungsaufnahme-der-grafikkarte-wolfenstein-2
> 
> From 275w to 207w in Wolf2


In kingdom come deliverance it's even worse (better?) almost 90W saved 
https://www.computerbase.de/2019-02...ahme-der-grafikkarte-kingdom-come-deliverance


----------



## cucker tarlson (Feb 7, 2019)

Xuper said:


> Computerbase tested with UV.*Did you try it* ?
> 
> https://www.computerbase.de/2019-02...istungsaufnahme-der-grafikkarte-wolfenstein-2
> 
> From 275w to 207w in Wolf2


Jesus Christ, who was responsible for this launch? Monkeys? Granted, not every card will UV the same, but god damn, this is more than one tier down in power draw. Close to the difference between the 2080 Ti and the friggin 2070.


----------



## Divide Overflow (Feb 7, 2019)

This is what happens when you bring a compute card to a gaming card fight.


----------



## XXL_AI (Feb 7, 2019)

lynx29 said:


> all the more reason all 3 companies are dumb to me. so might as well go with the best performers and retire from this hobby, silicon is about dead anyway. I don't have much faith in this new chiplet designs incoming
> 
> the fact AMD allowed such a conflict of interest with their main competitor, a relative CEO... says all I need to know about if I trust companies or not.
> 
> i trust none of them. therefore I will buy the best performer, and retire.


its the only logical move. being fanboy is a waste of time, so as following hypes.


----------



## efikkan (Feb 7, 2019)

There are really no surprises here.
AMD should have branded this their new "Frontier Edition", since this card has no chance as a pure gaming card.


----------



## oxidized (Feb 7, 2019)

birdie said:


> I'm not sure about that. This is basically Vega 64 on steroids - it's the same architecture with better specs.



Implying what? That Vega 64 wasn't a huge failure?


----------



## 15th Warlock (Feb 7, 2019)

Another overhyped product, reminds me of the hype preceding the launch of bulldozer, when will AMD learn?

In the meantime, we all lose, I said it in the announcement thread, AMD needs to get their shit together, competition is necessary in this market, as Nvidias inflated prices show.

Well, if AMD won’t compete, we can hope the falling stock value of Nvidia will for it to lower the prices for their cards, but I don’t have much hope in that.


----------



## Brusfantomet (Feb 7, 2019)

There were 4 available here in Norway at launch (checked 15:02); somebody has ordered one as there are now 3 available (not me btw).

Would guess that the Radeon VII would boost nicely under water, but I don't think there is a block for it yet?


----------



## willace (Feb 7, 2019)

Just watched a lot of YouTube reviews before this.  It has a very broken driver, where black screens and bad OC are widespread.  Wondering if you guys encountered this???


----------



## Tsukiyomi91 (Feb 7, 2019)

guess my prediction that this is a minor flop from AMD is hitting reality too close to home. But this isn't a gaming card anyways... for those doing rendering & simulation workflows, at least there are more options than picking a FirePro or Quadro, which are super expensive at the higher end of the spectrum.


----------



## iO (Feb 7, 2019)

efikkan said:


> There are really no surprises here.
> AMD should have branded this their new "Frontier Edition", since this card has no chance as a pure gaming card.


That was probably their original plan but  the RTX prices got them the chance to sell it as a gaming card.


----------



## Xuper (Feb 7, 2019)

R0H1T said:


> In kingdom come deliverance it's even worse (better?) almost 90W saved
> https://www.computerbase.de/2019-02...ahme-der-grafikkarte-kingdom-come-deliverance



 
Is anyone there responsible for overvolting the card? Wish AMD would release a new driver to do auto UV, damn 90 W......


----------



## Space Lynx (Feb 7, 2019)

Tsukiyomi91 said:


> guess my prediction that this is a minor flop from AMD is hitting reality too close to home. But, this isn't a gaming card anyways... for those who are doing rendering & simulation workflow, at least there's more option than picking the FirePro or Quadro, which are super expensive at the higher end spectrum.



we agree on this, but Lisa Su bragged about it being a gaming card... misleading. :/  I wonder if Lisa Su has any stocks in nvidia... cause Nvidia stocks have gone up a lot in last 3 days. i think 5-10% gain. lol

shady stuff imo when your Uncle is head of Nvidia and you just launched the only competing card.


----------



## R0H1T (Feb 7, 2019)

Xuper said:


> Is anyone there responsible for overvolting card ? wish AMD releases new driver to do auto UV, damn 90w......


It's a common problem with AMD across the board; even their CPUs are unnecessarily overvolted.


----------



## ppn (Feb 7, 2019)

Funny how almost half the chip is empty. The real size would be 232 mm² if they dropped the empty spaces and went 2048-bit, since 4096-bit clearly doesn't benefit the card in any way. This is an RX 590-sized chip, ~$299.


----------



## Super XP (Feb 7, 2019)

M2B said:


> I said it's not going to beat the 1080Ti, and well... there you go.


On Ultra High Quality, it actually does beat it.


----------



## efikkan (Feb 7, 2019)

iO said:


> That was probably their original plan but  the RTX prices got them the chance to sell it as a gaming card.


Highly unlikely. It's not like this comes close to being competitive even with "RTX prices".
This card is the result of AMD needing to launch _something_, as Navi is facing delays.


----------



## Super XP (Feb 7, 2019)

R0H1T said:


> It's a common problem with AMD across the board, even their CPU's are unnecessarily overvolted.


You don't think it's because of the Boost Mode?


----------



## W1zzard (Feb 7, 2019)

willace said:


> Just watched a lot of YouTube review before this.  It got a very broken driver where black screen and bad OC is very wide spread.  Wondering if you guys encountered this???


Nope, ran 100% stable. OC worked better than on 1st gen Vega, too.


----------



## EarthDog (Feb 7, 2019)

phill said:


> why people could


Do tell. Compute is the only reason I could come up with after seeing these results. Otherwise, it's slower, uses a lot more power, and is quite noisy in this form. Remind me why I should pay the same money for less? 


phill said:


> Nvidia does have a lot more chance at getting it right as it does like to release 20 of the same cards compared to AMD's one or two..


Sorry.. what?


----------



## trog100 (Feb 7, 2019)

i do wish wiz would include a Time Spy run along with all his game runs.. the first thing i am gonna do now is search for a review that does..

i know it's synthetic but it is a kind of universal synthetic.. he he

trog


----------



## Tsukiyomi91 (Feb 7, 2019)

@Super XP by a very small margin & at 4K with DX12 enabled. That is still not enough when you factor in how bad HBM2's OCing ability is.
@lynx29 that's an even bigger revelation for some, including me, when i heard that Lisa's uncle IS THE CEO of Green Team.


----------



## unikin (Feb 7, 2019)

ComputerBase.de review title says it all: "Zu laut, zu langsam und zu teuer, aber mit 16 GB HBM2" (Too loud, too slow and too expensive, but with 16 GB HBM2).

Vega 64 was a very bad launch; this is a disaster. AMD, go back to the drawing board and bring us something that can compete, even if it takes you 2 or 3 years. Why shame yourself with this joke?

Btw: you should undervolt it. AMD didn't even bother to do it by default:

"Die Folgen des Undervolting sind sofort spürbar: Eine deutlich geringere Leistungsaufnahme, ein hörbar leiserer Betrieb und die Performance bleibt gleich oder wird in wenigen Fällen sogar minimal besser – falls vorher das Power Limit eingeschritten ist." (The consequences of undervolting are immediately noticeable: significantly lower power consumption, audibly quieter operation, and performance that stays the same or, in a few cases, even gets minimally better, if the power limit was stepping in before.)

https://www.computerbase.de/2019-02..._ist_der_eigentliche_koenig_der_radeon_vii_uv


----------



## Space Lynx (Feb 7, 2019)

Tsukiyomi91 said:


> @Super XP by a very small margin & at 4K with DX12 enabled. That is still not enough when you factor in how badly HBM v2's OCing ability.
> @lynx29 that's an even more larger revelation for some, including me when i heard that Lisa's uncle IS THE CEO of Green Team.



I honestly can't believe the AMD board room allowed her as CEO being the niece of Nvidia CEO... conflict of interest big time... especially if she is like my niece (who loves me and sees her almost as father like cause her father left her at a young age). She sure did have a big smile knowing her Radeon VII in her hand was 30 fps slower at 1080p almost across the board while running 100 watts more... LOL maybe she does have nvidia stock, who knows.


----------



## Super XP (Feb 7, 2019)

Tsukiyomi91 said:


> @Super XP by a very small margin & at 4K with DX12 enabled. That is still not enough when you factor in how badly HBM v2's OCing ability.
> @lynx29 that's an even more larger revelation for some, including me when i heard that Lisa's uncle IS THE CEO of Green Team.


Agreed. 
This is what AMD has to play with at the moment. But they are in need of a completely new next generation GPU Architecture. 
Hopefully Navi gives us the best possible price / performance, as this is what AMD can compete on if they play their cards right.


----------



## Deleted member 158293 (Feb 7, 2019)

lynx29 said:


> but it is and lisa su advertised it as such.



It could be advertised as a rocket ship,  don't care for branding or marketing.   I'm planning on using it as an inexpensive opencl workhorse.

Aiming for gaming market is a mistake.  If it was just for gaming a much cheaper PCB & component selection would easily make the card at least $100 cheaper to produce.


----------



## Super XP (Feb 7, 2019)

lynx29 said:


> I honestly can't believe the AMD board room allowed her as CEO being the niece of Nvidia CEO... conflict of interest big time... especially if she is like my niece (who loves me and sees her almost as father like cause her father left her at a young age). She sure did have a big smile knowing her Radeon VII in her hand was 30 fps slower at 1080p almost across the board while running 100 watts more... LOL maybe she does have nvidia stock, who knows.


*"""NVIDIA and AMD are fierce competitors, interestingly NVIDIA’s CEO Jen-Hsun Huang is the uncle of AMD’s CEO Lisa Su.  """*
WHAT? I never knew this? LOL,



yakk said:


> It could be advertised as a rocket ship,  don't care for branding or marketing.   I'm planning on using it as an inexpensive opencl workhorse.
> 
> Aiming for gaming market is a mistake.  If it was just for gaming a much cheaper PCB & component selection would easily make the card at least $100 cheaper to produce.


How about aim it for Crypto Mining lol


----------



## Space Lynx (Feb 7, 2019)

Super XP said:


> Agreed.
> This is what AMD has to play with at the moment. But they are in need of a completely new next generation GPU Architecture.
> Hopefully Navi gives us the best possible Price / Performance, as this is what AMD can compete on if they play the cards right.



Navi is still GCN based I think. Can anyone more technical confirm this?  I believe 2020/2021 is when non-GCN cards arrive, and quite frankly I am tired of waiting.



yakk said:


> It could be advertised as a rocket ship,  don't care for branding or marketing.   I'm planning on using it as an inexpensive opencl workhorse.
> 
> Aiming for gaming market is a mistake.  If it was just for gaming a much cheaper PCB & component selection would easily make the card at least $100 cheaper to produce.



good for you. Nvidia has 75% of the gaming market share, and drivers will always be better optimized for Nvidia in the majority of games, because if something goes wrong, Nvidia will have the loudest voice on forums, etc


----------



## londiste (Feb 7, 2019)

Anandtech has clock-for-clock with Vega 64. FPS is the same except for Shadow of War and GTA V, which appear to be rather heavily memory-dependent.
https://www.anandtech.com/show/13923/the-amd-radeon-vii-review/18


----------



## ShurikN (Feb 7, 2019)

So the card performs more or less as expected. They needed something as a stopgap to Navi and to be somewhat competitive in the high end. Unfortunately for them, all the chips they have are compute ones. It truly is a Vega successor in name, origin and disappointing price/performance. Although the performance more or less depends on the game and the state of drivers, or the lack thereof at the moment. I'll have to keep an eye on YT reviews once drivers stop crashing.
But what really baffles me is how they managed to make a triple-fan cooler this bad. Or better yet, how did Sapphire manage it, since they apparently made it. 
After looking at GN's video just now, it only consumes 20W more on average than a stock 2080 FE, yet it's a lot louder to maintain similar temperatures... How this cooler saw the light of day is beyond me.
And so, with all the negatives piling up, the best thing about the card is the free games you're getting.
They need to ditch GCN as soon as possible. This is going nowhere...



lynx29 said:


> What a joke, Lisa Su lost all respect from me for allowing this product to be released. I was considering a 3700x, but screw it. 9700k and rtx 2080 is probably my next build.


If this card is the sole reason why you wouldn't buy Ryzen 3000, then you wouldn't have bought it in the first place. You just needed an excuse not to go with an AMD cpu (gpu choice aside)...


----------



## Vya Domus (Feb 7, 2019)

lynx29 said:


> *because Lisa Su was the last person in the industry I trusted. and she lost that trust today. now i trust none of them, intel, amd, or nvidia. so i might as well just go with classic intel and nvidia* for my next build.  9700k is the best gaming chip especially since i see it for $369 free ship no tax right now.





lynx29 said:


> yeah im serious.  look at the ryzen benches below. ryzen is a joke compared to intel, for some reason GND is the only website that tests Ryzen properly.
> 
> 
> 
> http://imgur.com/a/oTNrz






lynx29 said:


> What a joke, *Lisa Su lost all respect from me for allowing this product to be released. I was considering a 3700x, but screw it. 9700k and rtx 2080 is probably my next build.*
> 
> really disappointing, 30 fps slower across the board at 1080p almost... and using 100 watts more... wow so pathetic...





lynx29 said:


> we agree on this, but *Lisa Su bragged about it being a gaming card... misleading. :/  I wonder if Lisa Su has any stocks in nvidia... cause Nvidia stocks have **gone up a lot in last 3 days. i think 5-10% gain. lol*
> 
> *shady stuff imo when your Uncle is head of Nvidia and you just launched the only competing card.*




You are quite literally the definition of a fanboy,  you're constantly oscillating between contradictory ideas and outright conspiracy theories/lies/FUD. You also don't know how to read a chart.

At this point you are littering this thread with nonsense, stop it.


----------



## R0H1T (Feb 7, 2019)

lynx29 said:


> I honestly can't believe the AMD board room allowed her as CEO being the niece of Nvidia CEO... conflict of interest big time... especially if she is like my niece (who loves me and sees her almost as father like cause her father left her at a young age). She sure did have a big smile knowing her Radeon VII in her hand was 30 fps slower at 1080p almost across the board while running 100 watts more... LOL maybe she does have nvidia stock, who knows.


She isn't related to JHH, that's 1000% FUD 


> The Taiwanese media reported that the two were a distant relative, but she dismissed it as "not true."


via GT from this page - http://weeklybiz.chosun.com/site/data/html_dir/2018/08/31/2018083101687.html


----------



## Space Lynx (Feb 7, 2019)

Vya Domus said:


> You are quite literally the definition of a fanboy,  you're constantly oscillating between contradictory ideas and outright conspiracy theories/lies. You also don't know how to read a chart.
> 
> At this point you are littering this thread with nonsense, stop it.



you got it captain, have fun with AMD. take care folks, its been fun



R0H1T said:


> She isn't related to JHH, that's 1000% FUD
> via GT from this page - http://weeklybiz.chosun.com/site/data/html_dir/2018/08/31/2018083101687.html



thank you for clarifying this, I did not know, lol. oh boy. media, terrible.


----------



## Super XP (Feb 7, 2019)

lynx29 said:


> Navi is still GCN based I think. Can anyone more technical confirm this?  I believe 2020/2021 is non-GCN cards. and quite frankly I am tired of waiting.



OK, fair enough, but what are you waiting for anyway? I can play any game at 2K with Ultra High settings with my Sapphire RX 580 8GB Radeon card and my Ryzen 1700X, yet I didn't pay an arm and a leg for the higher-end GPUs. Anyhow, for me anyways, I will not buy an Nvidia card, all because of what they stand for and how many times they've pushed anti-competitive practices on AMD, etc.


----------



## jesdals (Feb 7, 2019)

I wanted an RX Vega 64, but wasn't sure about my 3-monitor setup and the 8GB of RAM, so I have waited for the 16GB version for more than a year. The reference design has but one problem (besides all the bugs the reviews talk of): the lack of DVI/HDMI support for 3 monitors.

What would be best for a 3 times 1920x1200 setup in Eyefinity, a DisplayPort converter to HDMI or a DVI-D solution?

BTW Danish price around 5,749 DKK including tax, delivery 22 Feb 2019; actually some in stock, but on sale for as high as 6,500 DKK.


----------



## londiste (Feb 7, 2019)

jesdals said:


> What would be best for 3 times 1920x1200 setup in Eyefinity, a displayport HDMI or DVI-D solution?


Displayport is the current and up to date video signal standard. With possibilities for improvement for the future.


----------



## jesdals (Feb 7, 2019)

londiste said:


> Displayport is the current and up to date video signal standard. With possibilities for improvement for the future.


forgot the "have to convert from" part


----------



## EarthDog (Feb 7, 2019)

jesdals said:


> 3 monitor setup and the 8GB of ram, so have waited for the 16GB version for more than a year.


I don't imagine many titles go over 8GB in the first place, so waiting was... well, I wouldn't have. Your res has almost 1/3 fewer pixels than 4K UHD, and 8GB is enough for MOST titles at 4K UHD.


----------



## R0H1T (Feb 7, 2019)

The release is underwhelming, given some of the hype. But really anyone calling this a "failed product" must've missed the other dozen or so dumpster fires from either vendor in this decade!


----------



## ShurikN (Feb 7, 2019)

unikin said:


> ComputerBase.de review title says it all: "Zu laut, zu langsam und zu teuer, aber mit 16 GB HBM2  (Too loud, too slow and too expensive, but with 16 GB HBM2 "
> 
> Vega 64 was a very bad lunch, this is a disaster. AMD go back to drawing boards and bring us something that can compete, even if it takes you 2 or 3 years. Why shame yourself with this joke?
> 
> ...


That's what I've been thinking as well. For example, Vega 56 is a monster of a card ONCE you undervolt it...  Why AMD doesn't do this themselves is beyond me. Of course not all chips are equal, and some will undervolt more, some less, but there has to be a sweet spot that could be hit on all cards. This is definitely not it, as clearly nothing was done on AMD's side at all. Again.


----------



## Tsukiyomi91 (Feb 7, 2019)

ok... thank god that cleared the air. owo


----------



## Robcostyle (Feb 7, 2019)

How about that - ur 1500$ 2080 Ti is not top-of-the-top anymore 
https://www.guru3d.com/index.php?ct...dmin=0a8fcaad6b03da6a6895d1ada2e171002a287bc1

P.S. Actually, strange, the results cannot be that broad. The question is - who is a liar?


----------



## PerfectWave (Feb 7, 2019)

At Guru3D.com it is better than Vega 64 and in other games beats the 2080


----------



## unikin (Feb 7, 2019)

jesdals said:


> I wanted a RX Vega 64, but wasnt sure about my 3 monitor setup and the 8GB of ram, so have waited for the 16GB version for more than a year. The reference design have but one problem (besides of all the bugs the reviews talk of), the lack of DVI/HDMI support for 3 monitors.
> 
> What would be best for 3 times 1920x1200 setup in Eyefinity, a displayport converter to HDMI or DVI-D solution?
> 
> BTW danish price around 5.749 dkr including tax and delivery 22. feb 2019, actually some in stock but on sale for as high ad 6.500 dkr.



I don't know what the problem is here. You can easily connect 4x 1920x1200px monitors to any GPU. I have 1 connected to HDMI, 2 to DP, and 1 to DVI-D. HDMI is the worst connection as it takes the longest to get recognized by the display (I even have to do a manual monitor rescan from time to time if it doesn't connect). That is on a GTX 1050 Ti. I even had 5 displays connected to an AMD HD 6870 with no problem (Nvidia only supports up to 4).


----------



## M2B (Feb 7, 2019)

Robcostyle said:


> How about that - ur 1500$ 2080 Ti is not top-of-the-top anymore
> https://www.guru3d.com/index.php?ct...dmin=0a8fcaad6b03da6a6895d1ada2e171002a287bc1
> 
> P.S. Actually, strange, the results cannot be that broad. The question is - who is a liar?







How about this?


----------



## willace (Feb 7, 2019)

W1zzard said:


> Nope, ran 100% stable. OC worked better than on 1st gen Vega, too.



Thanks.  Interesting.....


----------



## Tsukiyomi91 (Feb 7, 2019)

only at maxed-out settings at 4K is where HBM2 shines. Still, not many are gaming at 4K when 1440p at 144Hz or faster is the more preferred option... with a mere 4fps difference at stock clocks, the 2080 is a more feasible pick because of its better OCing potential (despite power limit issues).


----------



## jabbadap (Feb 7, 2019)

Robcostyle said:


> How about that - ur 1500$ 2080 Ti is not top-of-the-top anymore
> https://www.guru3d.com/index.php?ct...dmin=0a8fcaad6b03da6a6895d1ada2e171002a287bc1
> 
> P.S. Actually, strange, the results cannot be that broad. The question is - who is a liar?



DX11 vs DX12.


----------



## JRMBelgium (Feb 7, 2019)

For me, a Battlefield-exclusive gamer, it's a great buy. But honestly, I won't recommend it and I'm still hoping for driver improvements. Let's hope BF V is a showcase of what's possible with proper optimizations.


----------



## PerfectWave (Feb 7, 2019)

wondering why Dragon Quest XI is present in this suite of games. Seems that TPU's suite of games favors NVIDIA more...


----------



## the54thvoid (Feb 7, 2019)

Robcostyle said:


> How about that - ur 1500$ 2080 Ti is not top-of-the-top anymore
> https://www.guru3d.com/index.php?ct...dmin=0a8fcaad6b03da6a6895d1ada2e171002a287bc1
> 
> P.S. Actually, strange, the results cannot be that broad. The question is - who is a liar?





PerfectWave said:


> At GURU3D.com it is better then VEGA64 and in other games beat 2080



I'd suggest Guru3D has some issues with their results. For example, in Deus Ex: MD at 1080p, a 1080 Ti beats a 2080 Ti. Highly dubious result.


----------



## DeOdView (Feb 7, 2019)

@W1zzard:  No 3DMark?   I noticed it missing from other websites, too.

almost none had it!  

???


----------



## M2B (Feb 7, 2019)

the54thvoid said:


> I'd suggest Guru3D has some issues with their results. For example, in Deus Ex MD, at 1080p ,1080ti beats a 2080ti. Highly dubious result.



CPU bound. Their CPU is not fast enough to handle high-end GPUs properly at lower resolutions.


----------



## PerfectWave (Feb 7, 2019)

the54thvoid said:


> I'd suggest Guru3D has some issues with their results. For example, in Deus Ex MD, at 1080p ,1080ti beats a 2080ti. Highly dubious result.



Deus Ex: MD is an AMD title and uses DX12. All AMD cards are strong in this game and beat their counterparts.


----------



## qubit (Feb 7, 2019)

So, another expensive, hot, slow and noisy flagship graphics card from AMD that doesn't really compete with NVIDIA. I'll pass, again. No wonder AMD get so much flak and NVIDIA charges what it likes for their cards and has a dominant marketshare.

I'll probably get accused of being an NVIDIA fanboy by AMD apologists, but I'm actually telling the truth. Just read the conclusion of this review for the evidence.

Can't wait for AMD to do a Zen with their graphics cards and I have some optimism that they will. Eventually.


----------



## the54thvoid (Feb 7, 2019)

PerfectWave said:


> DEUX EX MD is an AMD title and use DX12. All AMD cards are strong in this game and beat their counterpart.



Read my post please. I said a 1080 Ti beats a 2080 Ti. That's very unlikely to happen.

EDIT: just checked OcUK - MSI and ASUS available (9 and 10+ in stock), both at £799. So we're looking at 2080 Ti levels of price/perf increase. The R7 is 25% better than Vega 64 but, at £799, is 100% more expensive. Ouch.
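The price/performance gap in that edit can be sketched quickly. The ~£400 Vega 64 street price below is an assumption implied by "100% more expensive"; the +25% figure is the quoted performance delta:

```python
# Hypothetical price/performance sketch. Assumed: Vega 64 at £400
# (implied by "100% more expensive"); R7 at £799 with +25% performance.
def perf_per_pound(relative_perf: float, price_gbp: float) -> float:
    """Relative performance delivered per pound spent."""
    return relative_perf / price_gbp

vega64 = perf_per_pound(1.00, 400)  # baseline
r7 = perf_per_pound(1.25, 799)      # +25% performance at £799
print(f"Radeon VII offers {r7 / vega64:.0%} of Vega 64's perf per pound")
```

On those assumptions the R7 delivers only about five-eighths of Vega 64's value per pound, which is the "ouch" in numbers.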


----------



## ppn (Feb 7, 2019)

It is competing with the 2070 Ti. When Nvidia shrinks the 2070 to 7nm, it's game over.


----------



## Metroid (Feb 7, 2019)

Radeon VII is only good in DX12 games, and in DX12 it is only behind the RTX 2080 Ti, and by very few fps. So if you will use only DX12 with this GPU, then this is the best GPU for the price to date. I'm impressed with the DX12 performance; I'm depressed with the DX11 performance.


----------



## unikin (Feb 7, 2019)

It's mind-boggling why AMD didn't bother to undervolt the R7 by default as one of the BIOS options. From computerbase.de: "Die Radeon VII ist mit Undervolting energieeffizienter als die GeForce RTX 2080 FE" (The Radeon VII is more energy efficient with undervolting than the GeForce RTX 2080 FE).

It looks like AMD doesn't really give a flying f... about the R7 launch. I've lost all hope in AMD gaming GPUs. They don't even bother to do things that could be done effortlessly to make the GPU look better. Why, AMD?


----------



## HD64G (Feb 7, 2019)


I can see big variations in results between reviewers, and almost all of them faced big driver issues. AMD messed up another launch because their drivers weren't ready.

But the point is that in most reviews, the R7 is very close to the 2080 @1440p and even closer @4K than what @W1zzard 's review shows. Maybe the drivers are responsible for this. Who knows?


----------



## the54thvoid (Feb 7, 2019)

What all reviews agree on, though, is the terrible acoustics. It doesn't bother some folk, but to many, quiet PCs are important. This card is frankly bettered by the 2080 in almost all gaming regards.


----------



## repman244 (Feb 7, 2019)

Looks like AMD made their GTX 480.


----------



## W1zzard (Feb 7, 2019)

PerfectWave said:


> wondering why in this suite of game there is the presence of Dragon Quest XI. Seems that these suite of games at TPU are more NVIDIA advantage...


It was one of the first Unreal Engine 4 titles. It'll probably get replaced by a new game release next time I retest all cards


----------



## GoldenX (Feb 7, 2019)

So, a die shrink and some optimizations. Boring.
They didn't even enable the promised features of the first Vega (like Rapid Packed Math).


----------



## W1zzard (Feb 7, 2019)

repman244 said:


> Looks like AMD made their GTX 480.


If it makes the company rethink and reinvent itself, like NVIDIA did, then this will be a good thing


----------



## Fluffmeister (Feb 7, 2019)

Not great for a TSMC 7nm part, frankly, but then it was always gonna fall short of even Pascal. Just shows what a gem the 1080 Ti was two years ago.


----------



## moproblems99 (Feb 7, 2019)

EarthDog said:


> uses a lot more power



I would like to see W1zzard undervolt the card and re-bench the power draw.

And before anyone says "we shouldn't have to undervolt it": we also shouldn't have to overclock, because the cards should come out at their max. Undervolting is just as relevant as overclocking.



lynx29 said:


> LOL maybe she does have nvidia stock, who knows.



She would be sad if she got it at the peak.


----------



## londiste (Feb 7, 2019)

GoldenX said:


> So, a die shrink and some optimizations. Boring.
> They didn't even enable the promised features of the first Vega (like Rapid Packed Math).


RPM is enabled on Vegas, including Vega56/64 and has brought them some wins in game benchmarks. Unfortunately for AMD, Turings also have RPM.


----------



## EarthDog (Feb 7, 2019)

moproblems99 said:


> And before anyone says: We shouldn't have to undervolt it. We also shouldn't have to overclock because the cards should come out at their max. Undervolting is just as relevant as overlocking.


I think that data set is interesting to see the results, but, that's about it. 

I don't buy a card to turn it down to save power. I disagree wholeheartedly that UV is as relevant as overclocking. People don't typically turn their GPUs down (unless they have to), they turn it up (because they want to, and can). 

Also, third-party cards will undervolt as well. Whether this one UVs/UCs better, who knows (that is sample-related too). 


----------



## Vya Domus (Feb 7, 2019)

GoldenX said:


> They didn't even enable the promised features of the first Vega (like Rapid Packed Math).



Rapid Packed Math meant double rate FP16, pretty sure it is enabled on both.


----------



## Metroid (Feb 7, 2019)

I knew this GPU would be like 14nm Vega plus more transistors and a few more things. I hate how hot it is, but it can't be helped. I'm more interested in Navi; if 7nm Navi is what the RX 480 was for AMD on 14nm, then we'll have a winner if the price is right.


----------



## unikin (Feb 7, 2019)

moproblems99 said:


> I would like to see W1zzard undervolt the card and rebench the power draw.



Here you go:




70 W less power draw at stock performance when undervolted. That's 23 W lower power draw than the RTX 2080 for 9% worse performance, or 70 W more (OC'd) than a stock 2080 FE for the same performance.


----------



## moproblems99 (Feb 7, 2019)

EarthDog said:


> I think that data set is interesting to see the results, but, that's about it.
> 
> I don't buy a card to turn it down to save power. I disagree wholeheartedly that UV is as relevant as overclocking. People don't typically turn their GPUs down (unless they have to), they turn it up (because they want to, and can).
> 
> Also, other party cards will undervolt as well. If this is able to UV/UC better, who knows (that is sample related too).



I don't understand.  I don't lose performance when I undervolt.  I undervolt and overclock and gain performance.  How is this not relevant?  Where is the turning down?


----------



## EarthDog (Feb 7, 2019)

The turning down is a power-savings and heat-savings thing... which mostly isn't done unless there is a need. Again, I think the dataset would be good to see, just to learn how far they pushed this thing out of its sweet spot, but that is all I can glean from it. 

That we want/need to do it at all is the biggest issue to me, regardless of outcome.


----------



## DeOdView (Feb 7, 2019)

What happened to 3DMark??? Did it just disappear? 

None tested except for 3DGuru! Seriously???


----------



## moproblems99 (Feb 7, 2019)

EarthDog said:


> It's a shame to want/need to do it is the biggest issue to me, regardless of outcome.



I don't disagree with you there at all. It is disheartening that this is still required this late in the game. However, when I build my engines and cars, I don't just build for max power; I also look at how to squeeze a little extra mpg out of it. Undervolting is squeezing out that little extra mpg and should not be considered 'turning down'. Using extra volts when you don't need to is a waste, unless you are losing performance, which you are not in this case.

Edit: I almost just bought the Sapphire; it was in stock at the cracked Egg. My better sense (not my wife) won out, and I will wait for a price drop.


----------



## EarthDog (Feb 7, 2019)

It depends on what the car is... When I'm building motors, it's for max power. IDGAF about gas mileage. (Bad analogy... but I get ya.)

Again, its neat that it can, but so can other cards. I would look at that dataset, golf clap, and move on (all the while wondering why the hell they can't get power use down and performance up in the same ballpark as the competition). Here is to hoping this testing is able to bring power in line and performance... but you wont catch me holding my breath (nor the overwhelming majority who wouldn't know/care enough to do this).


----------



## Imsochobo (Feb 7, 2019)

Meh, but sometimes you need an amdgpu. 
But really meh!


----------



## kastriot (Feb 7, 2019)

So it's a 7nm version of Vega 56/64 with higher clocks and 16GB HBM2, but I guess they will ditch Vega 56/64 and use an 8GB GDDR5X version of this card at much lower prices for gamers.

Btw Nice review!


----------



## GoldenX (Feb 7, 2019)

Vya Domus said:


> Rapid Packed Math meant double rate FP16, pretty sure it is enabled on both.


Still off in OpenGL.


----------



## B-Real (Feb 7, 2019)

lynx29 said:


> What a joke, Lisa Su lost all respect from me for allowing this product to be released. I was considering a 3700x, but screw it. 9700k and rtx 2080 is probably my next build.
> 
> really disappointing, 30 fps slower across the board at 1080p almost... and using 100 watts more... wow so pathetic...


So you won't buy a CPU because of this GPU. ROFLMAO. I wouldn't use "pathetic" on the GPU, but on your attitude. And I want to see the millions of gamers using a $700 or $800 card at FHD. Haha.



the54thvoid said:


> And at OcUK - it's £650 - £800.


And the 2080 costs £700+. You get ~10% less performance for a ~10% price difference, and don't forget it has twice the amount of more expensive VRAM. I just don't understand why they didn't release an 8GB version, which could have brought the price down.



XXL_AI said:


> as Jensen said, it sucks. try again amd.  the results are mediocre at best. lol. they can't even compete with a 3yr old GPU.. roflmao.


What do you mean it can't compete with a 3 year old GPU? Explain to me. Thank you. This doesn't change anything as Jensen is a tard.


----------



## moproblems99 (Feb 7, 2019)

EarthDog said:


> It depends on what the car is...



Yeah, cars are like religion and politics...best to just not discuss some times.  That said, if it is fwd or has four doors, leave it alone.

Edit:

Agree with you on the performance and power.  I am hoping the influx of money from Zen finds its way to the GPU department.  They can't live like this forever.


----------



## Xuper (Feb 7, 2019)

Holy COW! 90 MH/s 

https://cryptomining-blog.com/10628-amd-radeon-vii-gfx906-delivers-90-mhs-ethash-out-of-the-box/

That's a lot. Guess it's a Frontier Edition for gaming but good for mining. Of course, mining is crashing.


----------



## Vya Domus (Feb 7, 2019)

GoldenX said:


> Still off in OpenGL.



I don't understand this; RPM is not something that can be turned off or on.

You can use half float in OpenGL just fine:

https://www.khronos.org/registry/OpenGL/extensions/AMD/AMD_gpu_shader_half_float.txt
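For anyone lost in this exchange: Rapid Packed Math is about *execution* rate (two FP16 operations per 32-bit SIMD lane per clock), while the linked extension is only about *using* half floats in shaders. A purely illustrative Python sketch of the storage side of it — two half floats sharing one 32-bit word — using the standard library's binary16 format (this is conceptual, not GPU code):

```python
import struct

def pack_half2(a: float, b: float) -> bytes:
    """Pack two FP16 values into one 32-bit word (little-endian, 'e' = binary16)."""
    return struct.pack('<ee', a, b)

def unpack_half2(word: bytes) -> tuple:
    """Recover the two FP16 values from the packed 32-bit word."""
    return struct.unpack('<ee', word)

word = pack_half2(1.5, -2.0)
assert len(word) == 4                      # two halves fit in one 32-bit register
assert unpack_half2(word) == (1.5, -2.0)   # both values exactly representable in FP16

# FP16 only has ~11 bits of significand, so most values round:
lossy = struct.unpack('<e', struct.pack('<e', 0.1))[0]
print(lossy)  # close to, but not exactly, 0.1
```

The double-rate part is the hardware executing an op on both packed halves at once, which is exactly why Turing having the same trick erases AMD's advantage here.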


----------



## Basard (Feb 7, 2019)

Holy crap, as if the card isn't embarrassing enough.... the forum response to it is even worse!


----------



## laszlo (Feb 7, 2019)

ArbitraryAffection said:


> Yes.
> 
> I cried a bit inside.



I don't, as I have in mind the previous years' GPU launches, which I may say were also (over-)hyped by the red team in comparison with their green counterparts; it seems they try to squeeze the max out of the current architecture without adding extra value for the price/consumption... etc.

In a way it's understandable, as they really managed to kick Intel's ..ss in the CPU segment, and it seems the CPU division is more important than the GPU one... for now, hopefully.

This won't affect them at all, as next-gen console hardware will use theirs, but we all await the day when they'll kick green's ..ss with something new and comparable, if not better, than the current NVIDIA architecture (with or without RTX and other current features, which seem to be irrelevant in the current gaming market).

In the end it's all about pricing, as there are no bad products, only badly priced ones; competition is good for all of us, as our wallets should dictate the market and not vice versa.


----------



## The Riddler (Feb 7, 2019)

Is 331 mm² only the GPU's die size, or the sum of the areas of the GPU and the HBM2 stacks? If it's only the GPU's, that's a big failure. The 14 nm RX 580 is 232 mm² and Vega 64 is 495 mm².


----------



## Anymal (Feb 7, 2019)

W1zzard said:


> Furmark is just a free data point for curious readers, it has no influence on the outcome of the review
> 
> "Peak (Gaming)" is the highest single value measured during gaming (Metro Last Light), "Average (Gaming)" is the average of these values


Since it's a comparison...


----------



## Totally (Feb 7, 2019)

Damn shame; this would have been a very good product a few months to a year ago.


----------



## xkm1948 (Feb 7, 2019)

It is GCN, what do you expect? 7nm isn't magic unicorn sauce that fixes everything. As for compute, that is also irrelevant. CUDA and TensorFlow dominate scientific computing, and this GPU has no dedicated support for either of those. 

Also paging @Vya Domus your dream card is here. Time to show your support for the underdog!


----------



## GhostRyder (Feb 7, 2019)

Well, it's a decent card, but it's still not the game changer that AMD needed. Seems there are some games really holding it back, keeping that summary lower than it should be, but that's the name of the game. Not a bad card, but nothing I have an interest in.


----------



## dicktracy (Feb 7, 2019)

Meh...


----------



## Xuper (Feb 7, 2019)

What I'm saying is that the price is not good. Given the RTX 2070, AMD needs to cut the price to $599 or even lower to make it attractive.


----------



## B-Real (Feb 7, 2019)

xkm1948 said:


> It is GCN, what do you expect? 7nm isn't magic unicorn sauce that fixes everything. As for compute, that is also irrelevant. CUDA and TensorFlow dominate scientific computing, and this GPU has no dedicated support for either of those.
> 
> Also paging @Vya Domus your dream card is here. Time to show your support for the underdog!


As a 2080Ti owner who was milked, I understand you are crying. 



Xuper said:


> What I'm saying is that the price is not good. Given the RTX 2070, AMD needs to cut the price to $599 or even lower to make it attractive.



Hope you say the same for the RTX 2080 too.


----------



## Deleted member 172152 (Feb 7, 2019)

Game choice seems to make a big difference. Some other reviews show a much smaller gap to the 2080, with the Radeon VII landing between the 1080 Ti and 2080.

Objectively good? No. Subjectively? Gimme gimme gimme!


----------



## Xuper (Feb 7, 2019)

B-Real said:


> As a 2080Ti owner who was milked, I understand you are crying.
> 
> Hope you say the same for the RTX 2080 too.



Given the size of the chip, NVIDIA can't reduce it. How much does it cost?


----------



## Xaled (Feb 7, 2019)

M2B said:


> View attachment 115950
> 
> TPU's performance summary is based on significantly more games and it's definitely more accurate because of this.


Not that I'm saying the VII is a good card, but TPU's methodology always favors NVIDIA's cards, especially at lower resolutions. For me, when TPU says the 2080 is 14% faster than the VII, it means approx. 7% to me. 
I wonder why the 1060 is no longer faster than Vega 56, though.


----------



## unikin (Feb 7, 2019)

AMD's GPU department is in a very bad situation right now. I was hoping we would be reading reviews of Navi by now. It looks like 7nm Navi isn't coming until 2H or even Q3/Q4 of 2019. They have missed the opportunity to punish NVIDIA for the overpriced RTX series. NVIDIA will be close to migrating Turing to 7 nm by the time high-end Navi comes out. AMD will stay one gen behind no matter what they do now.


----------



## B-Real (Feb 7, 2019)

And with the $700 price, don't forget you are getting $180 worth of games, while you get $60 with RTX 2070 and above. It also changes the relative price of the card.


----------



## Vya Domus (Feb 7, 2019)

xkm1948 said:


> Also paging @Vya Domus your dream card is here. Time to show your support for the underdog!



What's with you in every RTG-related discussion?

Does that Fury X still torment you?


----------



## londiste (Feb 7, 2019)

B-Real said:


> And with the $700 price, don't forget you are getting $180 worth of games, while you get $60 with RTX 2070 and above. It also changes the relative price of the card.


$120 worth of games with RTX 2080, by the same measure.
And no, included games do not change the relative value of the card.


----------



## moproblems99 (Feb 7, 2019)

Vya Domus said:


> What's with you on every RTG related discussion.



Isn't it obvious?  You clearly can't bioreplicate red colored transistors.  Anyone worth their proteins knows this.


----------



## erocker (Feb 7, 2019)

How disappointing.


----------



## xkm1948 (Feb 7, 2019)

Vya Domus said:


> What's with you on every RTG related discussion.
> 
> Does that Fury X still torment you ?



Who decided I cannot comment on AMD GPU thread? Is TPU your site? Just time to support your champion that’s all. I hate seeing people who are just all talks but no action. You are certainly a man of action! So show the support! Snag a Radeon 7


----------



## Imsochobo (Feb 7, 2019)

B-Real said:


> And with the $700 price, don't forget you are getting $180 worth of games, while you get $60 with RTX 2070 and above. It also changes the relative price of the card.



It's no better than the "V64 vs 1080" argument back when they came out.
But... I ended up with a V64 at a way, way lower price than a 1080; if not, there is no way in hell I'd buy such bad hardware!

Edit:
On another note, it tells us Vega isn't memory starved; no fanboy can complain about anything now, because it couldn't possibly have had a better chance:
16 GB.
1 TB/s bandwidth.
Great VRM.
A capable cooler.

Still, it cannot compete. I do wonder if they burned 100 W just on factory overvolting again... my Vega performs better at 230 W total than at 300 W, and I just increased the memory clock, undervolted, and tuned the states...
No need to go up to 1600 MHz when it's only there for microseconds!


----------



## EarthDog (Feb 7, 2019)

xkm1948 said:


> Who decided I cannot comment on AMD GPU thread? Is TPU your site? Just time to support your champion that’s all. I hate seeing people who are just all talks but no action. You are certainly a man of action! So show the support! Snag a Radeon 7


LOL, Quoting > You...

...who are you responding to? LOL


----------



## xkm1948 (Feb 7, 2019)

EarthDog said:


> LOL, Quoting > You...
> 
> ...who are you responding to? LOL




Vya Domus


----------



## EarthDog (Feb 7, 2019)

Oh, so the reality is: Ignore list > me. 

Most sites with an ignore feature will either still quote the ignored user or show that there is ignored content within a post. Hahaha!


----------



## Scougar (Feb 7, 2019)

This was a stand-in halo product whilst waiting for Navi. They needed something to show, and it is very clear they didn't want to do this, given they got rid of a guy for trying to release it to start with... but then did so anyway. It is the pinnacle of what can be achieved on Vega right now, and no doubt further drivers will make it better. However... I don't expect the product life to be very long, or them to produce very many. This is AMD's P4, and they are trying to make it work until they can get another architecture out the door.

For the gentleman/lady who won't buy an upcoming Ryzen chip because of this graphics card release being hot and power hungry... you are about to do the same thing with your Intel purchase: hot, power hungry, and likely performing worse than the competition. Think about it. Buy what makes sense IMO, not what feels good. Why get less performance for more money, when the clearly better 3xxx series will be the smarter choice (unless AMD manage to completely hose things)?


----------



## Metroid (Feb 7, 2019)

Xuper said:


> Holy COW ! 90Mhs
> 
> https://cryptomining-blog.com/10628-amd-radeon-vii-gfx906-delivers-90-mhs-ethash-out-of-the-box/
> 
> That's a lot. Guess it's a Frontier Edition for gaming but good for mining. Of course, mining is crashing.



Mining has been dead since last year and if it comes back then best bet is 2020/2021.


----------



## 95Viper (Feb 7, 2019)

Stay on topic.
Quit the side discussions.
Stop trying to instigate drama with each other.

Thank You.


----------



## Metroid (Feb 7, 2019)

londiste said:


> $120 worth of games with RTX 2080, by the same measure.
> And no, included games do not change the relative value of the card.



It only does if you were about to buy the games. I mean, if Resident Evil 2 were included, I would save $60. I'm not a fan of any of the games included, but for those who like them and were about to buy them, it's nonetheless a saving.

For me, those people should have the option to sell the included games they don't want; then the value would be there regardless. Steam gifts should be a choice; I'm not sure if you can still sell Steam gifts anyway, Valve changed that a lot.
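The bundle-value disagreement above boils down to simple arithmetic: the bundle discounts the card only by the value *to you*. A quick sketch (the $700 price and $180 bundle figures come from this thread; the `fraction_you_wanted` knob is my own illustrative assumption):

```python
def effective_price(sticker: float, bundle_msrp: float,
                    fraction_you_wanted: float) -> float:
    """Card price after crediting only the bundle value you'd have paid for anyway."""
    return sticker - bundle_msrp * fraction_you_wanted

# Radeon VII at $700 with a $180 game bundle:
print(effective_price(700, 180, 0.0))  # 700.0 -> wanted none of the games
print(effective_price(700, 180, 1.0))  # 520.0 -> would have bought all of them
```

Which is exactly why both posters are right under their own assumptions: at `0.0` the bundle changes nothing, at `1.0` it's a real discount.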


----------



## ppn (Feb 7, 2019)

The Riddler said:


> Is 331 mm² only GPU's die size or the sum of the areas of GPU and HBM2 stacks? If it's only GPU's that's a big failure. 14 nm RX 580 is 232 mm²  and Radeon 64 is 495 mm².









The CUs occupy less than 200 mm²; the memory and PCIe controllers had to be moved further away from the heat. The thing is that the 2080 can and will be shrunk, hopefully to ~300 mm², and clocked 30% higher at under 200 watts. What else is left to do? So Vega 20 would have to match or beat the 2080 Ti to compete per die area, and it is only just beating an overclocked 2070 now. Per watt is lost no matter what.


----------



## eidairaman1 (Feb 7, 2019)

ShurikN said:


> That's what I've been thinking as well. For example, Vega 56 is a monster of a card ONCE you undervolt it... Why AMD doesn't do this themselves is beyond me. Of course not all chips are equal, and some will undervolt more, some less, but there has to be a sweet spot that could be hit on all cards. This is definitely not it, as clearly nothing was done from AMD's side at all. Again.



It was rushed like Vega 56/64; no fine-tuning.

As you said earlier, it is a stop gap card.

Cooler is nicer than the past blowers that both camps use.


----------



## looks (Feb 7, 2019)

Let's not forget that the people critical of this card in the comment section haven't even brought up the horrible noise this card produces yet. Red team fans have a lot of fronts to defend.


----------



## Scougar (Feb 7, 2019)

looks said:


> lets not forget that the people critical of this card in the comment section haven't even brought up the horrible noise this card produces yet. red team fans have a lot of fronts to defend.



The weird thing is, there are some totally different reports on this.  Some say it's loud, others say it is very quiet.  I think this will come out in the wash...


----------



## GoldenX (Feb 7, 2019)

Vya Domus said:


> I don't understand this, RPM is not something that can be turned off or on.
> 
> You can use half float in OpenGL just fine :
> 
> https://www.khronos.org/registry/OpenGL/extensions/AMD/AMD_gpu_shader_half_float.txt


Rapid Packed Math is a new stage in the pipeline, with the idea of improving FP16 performance over plain half float. My GCN 1.0 can do half float just fine.


----------



## EarthDog (Feb 7, 2019)

Scougar said:


> The weird thing is, there are some totally different reports on this.  Some say it's loud, others say it is very quiet.  I think this will come out in the wash...


Loud/Quiet is subjective. What is seemingly consistent across reviews is that it is 'louder' (by a few/several db) than reference 2080 and 2080Ti. AIC cards (at an additional cost) will likely mitigate that issue.


----------



## Captain_Tom (Feb 7, 2019)

"At a better price, such as $599, the Radeon VII, despite its shortcomings, could have forced NVIDIA to trim pricing of the RTX 2080 and RTX 2070 "

The era of AMD bending over backwards to offer a wholly better product than Nvidia... only to have its fanboys buy the Nvidia cards anyways, is over.  Period.

Enjoy the expensive future you paid for, people. This card will simply be produced in lower numbers, with AMD pointing to its other use cases for sales (professional work/mining/memory-heavy gaming). AMD no longer cares about winning at _everything_, lol.


----------



## W1zzard (Feb 7, 2019)

EarthDog said:


> a few/several db


Part of the problem seems to be the non-linearity of the dBA scale; it's perceived as at least 2x as loud.
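To put a rough number on that: a common psychoacoustic rule of thumb (an assumption on my part, not a measurement from the review) is that every +10 dB is perceived as roughly twice as loud, so a ~7 dBA gap sounds bigger than the raw figures suggest:

```python
def perceived_loudness_ratio(delta_db: float) -> float:
    """Approximate perceived loudness ratio for a dB difference.

    Rule of thumb: +10 dB is perceived as roughly 2x as loud.
    """
    return 2 ** (delta_db / 10)

print(round(perceived_loudness_ratio(10), 2))  # 2.0  -> twice as loud
print(round(perceived_loudness_ratio(7), 2))   # 1.62 -> roughly the gap reviews measured
```

It's only an approximation (perceived loudness also depends on frequency content, which is why one card's hum annoys more than another's), but it explains why "a few dB" is not a small difference.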


----------



## JRMBelgium (Feb 7, 2019)

Captain_Tom said:


> "At a better price, such as $599, the Radeon VII, despite its shortcomings, could have forced NVIDIA to trim pricing of the RTX 2080 and RTX 2070 "
> 
> The era of AMD bending over backwards to offer a wholly better product than Nvidia... only to have its fanboys buy the Nvidia cards anyways, is over.  Period.
> 
> Enjoy the expensive future you paid for people.  This card will simply be produced in lower numbers, and point to its other usecases for sales (Professional Work/Mining/High Memory dependent gaming).  AMD no longer cares about winning at _everything _lol.



This is true. Many times AMD was faster, but gamers waited for NVIDIA to lower prices and bought NVIDIA anyway. Personally, I don't see why they would ever start a price/performance war like that again. Not unless they can start a war like that and still make lots of money, which is clearly not the case with the Radeon VII. Who knows, if Navi can replace all RX 5xx and Vega cards with power consumption and production cost below the NVIDIA counterparts, they could start a price war; if not, they should just match NVIDIA's prices and fight Intel exclusively.


----------



## FYFI13 (Feb 7, 2019)

Oh, so they made a super power hungry GTX 1070. Where's that guy who says "wow"?


----------



## EarthDog (Feb 7, 2019)

W1zzard said:


> Part of the problem seems to be the non-linearity of the dBA scale. it's at least 2x as loud perceived


Absolutely.

Subjectivity comes into play... one man's buzzing is another man's Flight of the Bumblebee! Or one man's loud is another man's tolerable. I imagine this card is better than the blower solution, so it is an improvement there... but it almost seems driven out of necessity (300 W) rather than a bid to really quiet things down (though does it really matter why?). It's hard to say HOW much louder it is exactly, but it is fair to say most reviews are showing it a few dB louder. I also found the FE cards, subjectively, not so bad under load... there was enough 'headroom', in my observation, to get louder without being annoying. It won't require earplugs, but we do know it will be louder (however each and every one of us hears it).


----------



## Deleted member 158293 (Feb 7, 2019)

Xuper said:


> Holy COW ! 90Mhs
> 
> https://cryptomining-blog.com/10628-amd-radeon-vii-gfx906-delivers-90-mhs-ethash-out-of-the-box/
> 
> That's a lot. Guess it's a Frontier Edition for gaming but good for mining. Of course, mining is crashing.



It is looking very good in OpenCL; I'd expect at least another 10-20% improvement with updates and tweaking in the coming weeks. Mining runs on a high-low market cycle, and it'll take a while yet before it gets going on the up cycle again to pay for itself. Unless GPU mining finally burns out, with all the garbage coins leaving ASICs to mine Bitcoin proper, and then everyone will be happy.


----------



## Captain_Tom (Feb 7, 2019)

JRMBelgium said:


> This is true. Many times AMD was faster, but gamers waited for Nvidia to lower prices and bought Nvidia anyways. Personally, I don't see why they would ever start a price/performance war like that again. Not unless they can start a war like that and still make lots of money, wich is clearly not the case with Radeon 7. Who knows, if Navi can replace all RX 5xx and Vega cards with a power consumption and production cost below the Nvidia counterparts, they could start a price war, if not, they should just match Nvidia's prices and fight Intel exclusively.



Exactly; even Ryzen 1000 refused to start a total price war with Intel - and it was a vastly superior product stack. However, Ryzen 3000 will probably go for a kill shot, and only because they will be able to make CPUs twice as good as Intel's for less money. Until that happens for Radeon, they will not even try - and that won't happen till 2021 at the earliest, unfortunately.

However, Navi may actually lower prices quite a bit, but only to where they were in 2012 at best - although they were pretty great then, huh?


----------



## Scougar (Feb 7, 2019)

FYFI13 said:


> Oh, so they made a super power hungry GTX 1070. Where's that guy who says "wow"?



GTX 1070 ?  How did you get that out of this review?



Captain_Tom said:


> Exactly, even Ryzen 1000 refused to start a total price war with Intel - and it was a vastly superior product stack.  However Ryzen 3000 will probably go for a kill shot, and only because they will be able to make CPU's twice as good as Intel's for less money.  Until that happens for Radeon, they will not even try - and that won't happen till 2021 at the earliest unfortunately.
> 
> However Navi may actually lower prices quite a bit, but only to where they were in 2012 at best - although they were pretty great then, huh?



Because the vast majority of people buying cards don't research; they know the NVIDIA name/advertising or are directed to it by store employees. Most purchasers are not enthusiasts. They are basically BOSE buyers, who just buy without looking, thinking it's the best, regardless of whether it is or not. That's a reality.


I am pretty amazed that putting a few washers in to increase mounting pressure made such a difference (10°C) in junction temps. I haven't had much time to look at reviews in depth as I'm at work... but it looks like it has potential. I wonder what an AIO will do if it is mostly thermal throttling.


----------



## cucker tarlson (Feb 7, 2019)

In virtually every review I've seen, an overclocked RTX 2070 is trading blows with an overclocked Radeon VII, at least at 2560x1440 or 3440x1440, and actually winning in many cases.
PCGH has it at 6% faster than the 2070 across 21 games at 3 resolutions:
http://www.pcgameshardware.de/Radeon-VII-Grafikkarte-268194/Tests/Benchmark-Review-1274185/2/


----------



## Xuper (Feb 7, 2019)

http://www.pcgameshardware.de/Radeon-VII-Grafikkarte-268194/Tests/Benchmark-Review-1274185/3/

From desktop idle through Crysis 3, it consumes less than the 2080 FE, while in Anno 2070 it's 303 W vs. 224 W.
When you undervolt, GPU Boost becomes stable.


----------



## mandelore (Feb 7, 2019)

That is really disappointing. I was *sooo* hoping this would be the one to replace my ageing 295x2 setup. It's not. When CF works properly, many years later AMD's finest still doesn't match up to a 295x2. I may now go for a 2080 at a similar price. The Radeon VII is £800 on overclockers.co.uk! For an extra £180 you can get a 2080 Ti. Great performance bump from Vega, but sad to see a flagship card end up a lagship card...


----------



## Robcostyle (Feb 7, 2019)

M2B said:


> How about this?


Ah, c'mon, don't ruin this - I was just about to compose a letter to Mr. Leather Jacket, with a screenshot attached, saying that his GPU pricing is not viable anymore, considering the one existing RT title.


----------



## danbert2000 (Feb 7, 2019)

For all you guys talking about undervolting the card to make it compete, AMD likely tested the GPUs that were rolling out of the fab, and had a choice of a high voltage to make a large amount of them work, or a lower voltage and throwing away some of the marginal chips. It looks like they chose the former. Sure, the review samples are probably okay to undervolt, but just like overclocking, some of the chips won't be able to magically shed 80 W and keep the same performance. They don't even have a full product line to reuse marginal GPUs, so this is what we get. 5000 hot volted cards instead of maybe 3000 tuned cards.

I'm really astounded at the thermal pad solution, and that the card is as cool as it is with that on. What was AMD thinking? And what's with all of these temperature sensors jacking the fans up? I'm guessing that during testing, they had some issues with components frying when they went with just GPU temps.


----------



## purecain (Feb 7, 2019)

The thing I look for is frame times... this card has some potential. Hopefully they have something much better down the line; I haven't looked for a while to see what they have planned. There are definitely a few mistakes in here for Lisa to learn from. I really want this card to be better... come on AMD!!!!!!!


----------



## kings (Feb 7, 2019)

I don't know about the US, but in Europe this card is way overpriced.

The cheapest ones are around 749€, and we can find many RTX 2080 models for 650€~680€.

This Radeon VII has to drop substantially in price to be an option, at least in Europe!


----------



## cucker tarlson (Feb 7, 2019)

kings said:


> I dont know in the US, but in Europe this card is way overpriced.
> 
> The cheapest one are around 749€ and we can find many RTX 2080 models for 650€~680€.
> 
> This Radeon VII has to lower the price substantially to be an option, at least in Europe!


in PL there's a grand total of 0 of them


----------



## Metroid (Feb 7, 2019)

Like 14nm Vega, this card was never aimed at gaming; a compute card in disguise is what it has always been.


----------



## Deleted member 172152 (Feb 7, 2019)

I bought an ASRock one; I regret the downsides, but glad I could NOT get a 2080! Close enough.


----------



## looks (Feb 7, 2019)

Scougar said:


> The weird thing is, there are some totally different reports on this.  Some say it's loud, others say it is very quiet.  I think this will come out in the wash...



Hmm, the TPU review here, plus Gamers Nexus, Paul's Hardware, and Hardware Unboxed all say it's pretty loud; the others I've watched just didn't mention the acoustics. I haven't seen a review that said it was very quiet; maybe you could point me towards them?


----------



## cucker tarlson (Feb 7, 2019)

danbert2000 said:


> For all you guys talking about undervolting the card to make it compete, AMD likely tested the GPUs that were rolling out of the fab, and had a choice of a high voltage to make a large amount of them work, or a lower voltage and throwing away some of the marginal chips. It looks like they chose the former. Sure, the review samples are probably okay to undervolt, but just like overclocking, some of the chips won't be able to magically shed 80 W and keep the same performance. They don't even have a full product line to reuse marginal GPUs, so this is what we get. 5000 hot volted cards instead of maybe 3000 tuned cards.
> 
> I'm really astounded at the thermal pad solution, and that the card is as cool as it is with that on. What was AMD thinking? And what's with all of these temperature sensors jacking the fans up? I'm guessing that during testing, they had some issues with components frying when they went with just GPU temps.


Fact is, they were chasing the scenario where they could show the Radeon VII matching the 2080 in a few cherry-picked games, and that cost them the extra power draw.


----------



## Vya Domus (Feb 7, 2019)

Scougar said:


> Wonder what an AIO will do if it is mostly thermal throttling.



Probably not much. A lot of people don't understand what is going on here: 300 W over ~330 mm² is very difficult to cool, to the point where past a certain threshold you simply can't do anything; the surface area of the chip ends up being the bottleneck. It's also the reason why they made the fans spin up faster; there is no mystery here.

Bizarrely, the only thing AMD has to do now is to simply make a bigger chip running at lower clocks. That is if they'll ever make another high end consumer GPU.
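A back-of-envelope heat-flux comparison makes the point (the board powers and die areas below are the commonly cited figures for these cards — treat them as approximate, and note board power overstates die-only power somewhat for both):

```python
# (board power in W, die area in mm^2) -- approximate public figures
cards = {
    "Radeon VII": (300, 331),
    "RTX 2080":   (215, 545),
}

for name, (watts, area) in cards.items():
    # Heat flux through the die surface, W per mm^2
    print(f"{name}: {watts / area:.2f} W/mm^2")
```

The Radeon VII concentrates roughly 0.91 W/mm² versus ~0.39 W/mm² for the 2080's much larger die, which is exactly why "bigger chip at lower clocks" cools so much more easily.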


----------



## Mistral (Feb 7, 2019)

An interesting card. And if you discount outliers like Civ5, Darksiders 3 and Dragon Quest XI, the performance is quite solid too. Still, those prices from both camps...


----------



## cucker tarlson (Feb 7, 2019)

Vya Domus said:


> Probably not much. A lot of people don't understand what is going on here, 300W over 330 mm^2 is very difficult to cool down, to the point where you simply can't do anything, the surface area of the chip ends up being the bottleneck. It's also the reason why they made it so that the fans spin up faster, there is no mystery here.
> 
> Bizarrely, the only thing AMD has to do now is to simply make a bigger chip running at lower clocks. That is if they'll ever make another high end consumer GPU.


I bet if they refreshed V64 on 12nm with 4 stacks of HBM2, that'd come out faster.


----------



## M2B (Feb 7, 2019)

Vya Domus said:


> Bizarrely, the only thing AMD has to do now is to simply make a bigger chip running at lower clocks. That is if they'll ever make another high end consumer GPU.



It doesn't work on GCN, GCN benefits way more from higher clocks than higher shader count.


----------



## cucker tarlson (Feb 7, 2019)

M2B said:


> It doesn't work on GCN, GCN benefits way more from higher clocks than higher shader count.


but it has a tremendously hard time clocking high, too.


----------



## Vya Domus (Feb 7, 2019)

M2B said:


> GCN benefits way more from higher clocks than higher shader count.



Just as is the case with pretty much every GPU architecture ever, performance scales almost linearly with shader count, and so do other things such as power consumption. With clock speed, that is never the case.


----------



## M2B (Feb 7, 2019)

Vya Domus said:


> Just as is the case with pretty much every GPU architecture ever, performance scales almost linearly with shader count, and so do other things such as power consumption. With clocks that is never the case.



Vega 64, with ~15% more shaders than Vega 56, performs only 3% better than the 56 at the same clocks; it's not linear at all.
That's why AMD just increases clocks instead of shader count on their high-end GPUs.


----------



## Athlonite (Feb 7, 2019)

$1300 NZD for this mess? I think not. It's loud, it uses a fuckton of power, and the HSF is strangled by the stupid shroud design that blocks the part that requires the most airflow.

perf/Watt = loser
perf/$ = loser
noise = loser

It ends up being a cut-down compute card that's trying to be a gamer GPU and just fails miserably. They would have been better off just die-shrinking the Vega 64 and 56 instead of this POS.


----------



## Vya Domus (Feb 7, 2019)

M2B said:


> Vega 64 with 15% more shaders than vega 56 performs only 3% better than the 56 at the same clocks, it's not linear at all.



It's more like 10%, not 3%. And of course nothing scales perfectly linearly, as I said, but shader count comes closer than anything else.



M2B said:


> That's why AMD just increases clocks instead of shader count on their high-end GPUs.



They did that simply because they chose not to bother with the major redesign a wider GPU would have required, and they were also starting to be limited by the node. Hence they kept increasing clocks, but pushing clocks means pushing voltage too, and power rises superlinearly with voltage, which is also why this still uses 300W. It's all really simple.
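The "wider but slower" trade-off being argued here can be sketched with the textbook dynamic-power relation P ≈ C·V²·f. This is a simplification (it ignores leakage and assumes perfect shader scaling), and every number below is invented for illustration:

```python
# Simplified dynamic power model: P scales with switched capacitance
# (proportional to shader count), voltage squared, and frequency.
# Illustrative only; real silicon adds leakage and imperfect scaling.
def dynamic_power(units, voltage, freq_ghz, k=1.0):
    return k * units * voltage**2 * freq_ghz

# A narrow chip pushed to high clocks needs high voltage.
narrow = dynamic_power(units=60, voltage=1.10, freq_ghz=1.80)
# A hypothetical wider chip at lower clocks can run at lower voltage.
wide   = dynamic_power(units=80, voltage=0.95, freq_ghz=1.35)

print(f"narrow: {narrow:.0f}  wide: {wide:.0f}")
```

Both configurations deliver the same raw throughput (60 × 1.80 = 80 × 1.35 unit-GHz), but the wide one draws about 25% less power in this toy model, which is the whole case for "bigger chip, lower clocks".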


----------



## M2B (Feb 7, 2019)

Vya Domus said:


> It's more like 10%, not 3%. And of course nothing scales perfectly linearly, as I said, but shader count comes closer than anything else.



That 10% advantage over vega 56 is due to higher memory bandwidth and higher core clock on the 64.

https://hothardware.com/news/amd-radeon-rx-vega-56-unlocked-vega-64-bios-flash

But...whatever you say.


----------



## Ravenas (Feb 7, 2019)

Seems like driver problems are causing inconsistencies, i.e. Battlefield and Far Cry. Look at the R7 trading blows with the 2080 there, but in every other game it's all over the place.


----------



## XXL_AI (Feb 7, 2019)

hardware sucks, so fanboys are trying to blame software..


----------



## TheGuruStud (Feb 7, 2019)

Avoiding SE4 this time, eh? Lol.  39 fps vs 50s on V64 didn't look good, did it? Lows have been mysteriously missing, too (hint: Vega excels at this and nvidia is poo).


----------



## Countryside (Feb 7, 2019)

Ravenas said:


> Seems like driver problems are causing inconsistencies. IE battlefield and farcry. Look at the R7 trading blows with the 2080, but every other game it's all over the place.



Buggy, unstable drivers have been reported by many reviewers.


----------



## Imsochobo (Feb 7, 2019)

Captain_Tom said:


> Exactly, even Ryzen 1000 refused to start a total price war with Intel - and it was a vastly superior product stack.  However Ryzen 3000 will probably go for a kill shot, and only because they will be able to make CPU's twice as good as Intel's for less money.  Until that happens for Radeon, they will not even try - and that won't happen till 2021 at the earliest unfortunately.
> 
> However Navi may actually lower prices quite a bit, but only to where they were in 2012 at best - although they were pretty great then, huh?



AMD selling a chip of the same die size and performance for up to half the price?
Ryzen and AMD GPUs are two vastly different worlds.

Ryzen can compete with anything and be priced aggressively with great profit margins, even undercutting Intel's entire lineup!
That's because of Infinity Fabric and its scalability; Intel throws away chips AMD can easily sell.
Vega can only be sold at no profit, but I'd rather see them selling at no profit than not at all!


----------



## notb (Feb 7, 2019)

I won't comment on the performance very much. RTX cards are very fast, so it was expected that Radeon VII would look relatively worse than Vega 64 did (Vega 64 matched the 1080).

There are two more important things to take away:

1) *The noise level is unacceptable*
Is it the loudest card available today? Certainly near the "top". And it's an expensive, high-end model with 3 fans.
I wonder how AMD supporters feel about it? Do they think AMD lost contact with reality? Or maybe AMD simply doesn't respect their customers?

2) Many people kept saying that 7nm is the cure for all AMD problems. We get the first product and it clearly hasn't helped a lot.
Of course this node will get better over time, but AMD doesn't have time - Zen 2 is around the corner.


----------



## r9 (Feb 7, 2019)

Those 100 cards for sale in the UK look like 100 too many now.
NVIDIA is kicking AMD's butt, and at the same time its stock is plummeting.
I feel bad for both AMD and NVIDIA; I would hate for any of the CPU/GPU manufacturers to go out of business and drop out.


----------



## Countryside (Feb 7, 2019)

notb said:


> I won't comment on the performance very much. RTX cards are very fast, so it was expected that Radeon VII will look relatively worse than Vega 64 did (Vega 64 matched 1080).
> 
> There are two more important things to take away:
> 
> ...



Makes me wonder why they only allowed the REF model and did not allow partners to change the cooler.


----------



## cucker tarlson (Feb 7, 2019)

TheGuruStud said:


> Avoiding SE4 this time, eh? Lol.  39 fps vs 50s on V64 didn't look good, did it? Lows have been mysteriously missing, too (hint: Vega excels at this and nvidia is poo).




lol, those lows are truly excellent 
https://www.techspot.com/review/1789-amd-radeon-vii/
Min fps lower than the 2080 in 9 games out of 11 and lower than the 1080 Ti in 6.

See this too:
https://www.purepc.pl/karty_graficz...force_rtx_2080_test_kart_graficznych?page=0,4

Dunno where you got that min fps theory; nvidia has always been more consistent with them.


----------



## JRMBelgium (Feb 7, 2019)

Xuper said:


> http://www.pcgameshardware.de/Radeon-VII-Grafikkarte-268194/Tests/Benchmark-Review-1274185/3/
> 
> from Desktop to Crysis 3 , it consumes less than FE 2080.while Anno 2070 , 303w vs 224w
> when you do UV , GPU boost becomes stable.



First thing I'll do is undervolt that baby


----------



## r.h.p (Feb 7, 2019)

ASUS GeForce RTX 2080 STRIX OC 8GB GDDR6 (SKU: ROG-STRIX-RTX2080-O8G-GAMING, ASUS): $1,529

MSI Radeon VII 7nm Triple Fan 16GB HBM2 (SKU: Radeon VII 16G, MSI): $1,169

AUS Dollars. Both listed at the local stores here (Bentley, Cockburn, Wangara, Osborne Park).


----------



## Xuper (Feb 7, 2019)

So, out of stock? Who are those buyers? Perhaps it's the 1/4-rate FP64 at that price!


----------



## r.h.p (Feb 7, 2019)

Xuper said:


> so out of stock ? who are those buyers? perhaps 1/4 FP64 at those price !



maybe lol


----------



## Mescalamba (Feb 7, 2019)

So, as expected: a lot of hype and not much improvement.

"No support for DirectX Raytracing"
I wouldn't hold this against AMD, not now at least. It will be a worthy negative once we have a lot of RTX games.

Also, the price here is the equivalent of 939 USD. Good luck selling one... (not that there actually are ANY).


----------



## r.h.p (Feb 7, 2019)

Mescalamba said:


> So, as expected. A lot of hype and not much improvement.
> 
> No support for DirectX Raytracing
> I wouldnt hold this against AMD, not now at least. It will be worthy negative when we will have a lot of RTX games.
> ...



man, I can't even get my shit together to start playing BF5 even though I'm die-hard for the series, so ray tracing
isn't even an issue. If the 300 bucks difference is true, then that will go to a FreeSync 144Hz 32-inch monitor.


----------



## efikkan (Feb 7, 2019)

notb said:


> I won't comment on the performance very much. RTX cards are very fast, so it was expected that Radeon VII will look relatively worse than Vega 64 did (Vega 64 matched 1080).


It is worth mentioning that this is the first "top" model from AMD in recent years that doesn't come close to its Nvidia counterpart in performance, so in essence we can call this their largest failure yet.



notb said:


> Many people kept saying that 7nm is the cure for all AMD problems. We get the first product and it clearly hasn't helped a lot.
> 
> Of course this node will get better over time, but AMD doesn't have time - Zen 2 is around the corner.


Yes, with AMD it's always the next big thing that will come and save them 

TSMC "7nm" is a very large die shrink, and should allow for some substantial gains. The fact that AMD seem to only get ~25% more performance is quite remarkable, and far away from their >40% goal. As you pointed out, many think the node shrinks will equalize the differences between AMD and Nvidia, but in fact, it will have the opposite effect, as more efficient architectures will scale even better. We also have to remember that this might be the last "good" node shrink in a while, and Nvidia haven't used it yet…


----------



## Fluffmeister (Feb 7, 2019)

Yeah, it's kinda sad people are already excited that an undervolted 7nm part can get close to Pascal/Turing levels of efficiency. The hype was that the original Vega 64 was going to give the 1080 Ti a run for its money, yet here we are with Vega built on cutting-edge 7nm tech with a whopping 1TB/s of bandwidth, and ultimately it still falls short.

Ouch.


----------



## Jism (Feb 7, 2019)

I'd still buy it. Reports suggest that 50 to 90 W could be shaved off simply by undervolting. For my purpose (1440p gaming) it should be perfect.


----------



## r.h.p (Feb 7, 2019)

Fluffmeister said:


> Yeah it's kinda sad people are already excited an undervolted 7nm part can get close to Pascal/Turing levels of efficiency. The hype was the original Vega 64 was going to give the 1080 Ti a run for it's money, yet here we are with Vega built on cutting edge 7nm tech with a whopping 1TB/s of bandwidth, and ultimately it stills falls short.
> 
> Ouch.




It's true, I agree, but in my opinion AMD is producing CPU / GPU / mobo chips; geez, they have probably bitten off a bit more of the cake than they can chew, maybe...??


----------



## efikkan (Feb 7, 2019)

Jism said:


> I'd still buy it. Reports seems that the power could be shaved off with a 50 to 90w on simply undervolting. For my purpose (1440p gaming) it should be perfect.


Why? Undervolting or not, it's still not good value.
And undervolting will hurt your stability.


----------



## r.h.p (Feb 7, 2019)

Also Great Review as always W1zzard


----------



## Xuper (Feb 7, 2019)

efikkan said:


> Why? Undervolting or not, it's still not good value.
> And undervolting will hurt your stability.



A proper UV won't hurt it. AMD cards do have proper voltage control.


----------



## efikkan (Feb 7, 2019)

Xuper said:


> a proper UV won't hurt it.AMD cards do have proper voltage control.


It wouldn't hurt the hardware, but it can make it unstable over time, for as long as the undervolt is applied. All CPUs and GPUs ship with a marginal overvoltage to ensure stability as the chip degrades. You can of course choose to remove this margin, risking that the machine becomes gradually more unstable until the point where you have to raise the voltage again.


----------



## Vayra86 (Feb 7, 2019)

Shame to see it land equal to a 1080 Ti only in the best-case scenario. If it had matched it across the board, we would've had a 2080 contender. As it is now, meh. It's just not consistent, and that is a big shame. Losing at 1080p isn't a big issue, but to have only Strange Brigade as an impressive result is very weak and not enough to make it a compelling choice. DX11 is still dominant and barely even losing ground, and that won't be changing soon.

I had expected it to land a bit higher.

Still, great review and very informative. The most interesting bit to me is the merit of 7nm. Pushing perf/watt up by over 20% is quite impressive, especially on this ancient architecture, and that is even considering this card has been clocked quite high out of the box. Will be cool to see what Nvidia can do with this node - and hopefully what Navi will do with it.



Ravenas said:


> Seems like driver problems are causing inconsistencies. IE battlefield and farcry. Look at the R7 trading blows with the 2080, but every other game it's all over the place.



Nah, I think this is AMD pushing its resources where they think it counts. Far Cry was, for a long time, a hard nut to crack for them, so they've probably invested in it more than in the rest. Battlefield is similarly a title where AMD has been on the ball. Strange Brigade as well. There is no reason to believe the performance in every other title will radically change for the better, and even if it does at some point in the future, it's too late, as usual. 



TheGuruStud said:


> Avoiding SE4 this time, eh? Lol.  39 fps vs 50s on V64 didn't look good, did it? Lows have been mysteriously missing, too (hint: Vega excels at this and nvidia is poo).



Come on, don't be a sore loser, don't spread BS to somehow create some sort of saving grace here - it doesn't exist and it doesn't help anyone, least of all yourself... I remember this one from Ryzen too... 'but the minimums'  Please.



cucker tarlson said:


> in virtually every review I've seen rtx 2070 oc is trading blows with RVII oc, at least at 2560x1440 or 3440x1440,and actually winning in many cases.
> pcgh has it at 6% faster than 2070 in 21 games across 3 resolutions
> http://www.pcgameshardware.de/Radeon-VII-Grafikkarte-268194/Tests/Benchmark-Review-1274185/2/



OC vs OC is dangerous territory with the RTX line, because we're looking at FEs with a higher TDP out of the box here... Not all cards have that, and it's nothing to really rely on given the extent of GPU Boost these days on Nvidia. It's still a good observation, but I wouldn't think too much of it. Vega is _much _more tweakable than Turing; look at undervolting, for example.


----------



## Xzibit (Feb 7, 2019)

Xuper said:


> so out of stock ? who are those buyers?* perhaps 1/4 FP64 at those price !*



Compare it to Nvidia's current RTX lineup for FP64: it's a steal for those workloads.

RTX 2080 Ti = 420.2 GFLOPS
RTX Titan = 509.8 GFLOPS
Radeon VII = 3,360 GFLOPS

That is, if you're looking to save money.
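Taking those FP64 figures at face value, a quick dollars-per-TFLOP comparison makes the point. The launch MSRPs used below are my assumption, not from this thread:

```python
# Rough dollars per FP64 TFLOP, using the GFLOPS figures quoted above
# and assumed launch MSRPs in USD (prices are an assumption).
cards = {
    "RTX 2080 Ti": (999,  0.4202),   # (price USD, FP64 TFLOPS)
    "Titan RTX":   (2499, 0.5098),
    "Radeon VII":  (699,  3.360),
}

for name, (price, tflops) in cards.items():
    print(f"{name}: ${price / tflops:,.0f} per FP64 TFLOP")
```

Even against the 2080 Ti, the Radeon VII comes out roughly an order of magnitude cheaper per FP64 TFLOP under these assumptions.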


----------



## Super XP (Feb 7, 2019)

notb said:


> I won't comment on the performance very much. RTX cards are very fast, so it was expected that Radeon VII will look relatively worse than Vega 64 did (Vega 64 matched 1080).
> 
> There are two more important things to take away:
> 
> ...


Radeon VII does well enough for PC gaming, but it's not necessarily a gaming card. And so far 7nm is looking great for the upcoming Zen 2. Let's see how Navi turns out; hopefully that GPU is enough to replace AMD's entire RX 500 lineup.



Metroid said:


> Like vega 14nm this card was never aimed for gaming, compute workload in disguise is what has always been.


Correct, but they are marketing the cards as gaming cards. 
Perhaps they shouldn't market them this way.


----------



## Fluffmeister (Feb 7, 2019)

Countryside said:


> Buggy and unstable driver has been reported by many reviewers



It's AMD, of course the day-0 drivers are a mess, there would be no finewine otherwise!


----------



## Metroid (Feb 7, 2019)

Super XP said:


> Correct, but they are marketing the cards as gaming cards.
> Perhaps they shouldn't market them this way.



On DX12, Radeon VII is not as bad as it looks on DX11. Regarding marketing it as a gaming card: AMD would never say it's for compute workloads, as investors would not like it. Smart gamers are settling for RTX 2060 price/performance. I'm betting on Navi to deliver.


----------



## Jism (Feb 7, 2019)

efikkan said:


> Why? Undervolting or not, it's still not good value.
> And undervolting will hurt your stability.





efikkan said:


> voltage



Actually, more voltage causes degradation over time; less voltage causes less degradation. That said, these cards are built to last 3 to 8 years or so before slowly giving out, and you can safely undervolt any CPU or GPU within margins. It's true that a UV might be fully stable in a game but crash once other resources on the GPU are used, for example the video encoder. That's why undervolting should be done with a lot of precaution. At 1090 mV my RX 580 is perfectly stable in games for HOURS, down from the original 1150 mV, but turning on Radeon ReLive screen recording crashes it; I need to dial back to 1110 mV to get it fully stable.

Just lowering it by 40 mV saved about 30 watts of total consumption (Furmark), which is, of course, free performance.
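As a back-of-the-envelope check on numbers like these: dynamic power scales roughly with the square of voltage at fixed clocks, so the saving from a modest undervolt can be estimated (same caveat as always: leakage and boost behavior are ignored, so this is a sketch, not a measurement):

```python
# Estimate the power saving from an undervolt, assuming dynamic
# power ~ V^2 at fixed clocks. Purely a first-order approximation.
def saving_fraction(v_new_mv, v_old_mv):
    return 1 - (v_new_mv / v_old_mv) ** 2

s = saving_fraction(1110, 1150)   # the RX 580 example above
print(f"~{s:.1%} lower dynamic power")
print(f"~{300 * s:.0f} W saved on a 300 W board")
```

The naive V² estimate (~7%) undershoots the 30 W measured on the RX 580, which is plausible since the model ignores leakage and any clock or boost-state changes that come along with the undervolt.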


----------



## Super XP (Feb 7, 2019)

Metroid said:


> On dx12 radeon vii is not as bad as it looks on dx11, regarding them marketing as a gaming card, AMD would never say is for compute workloads as investors would not like it, smart gamers are settling down for rtx 2060 price performance. I'm betting on Navi to deliver.


Hopefully the 7nm Navi will be enough to further accelerate Radeon GPU adoption. I heard that Navi may be a new design? Or highly enhanced/modified and be offered in both GDDR6 and HBM2.


----------



## Vayra86 (Feb 7, 2019)

Super XP said:


> Hopefully the 7nm Navi will be enough to further accelerate Radeon GPU adoption. I heard that Navi may be a new design? Or highly enhanced/modified and be offered in both GDDR6 and HBM2.



Navi so far is smoke and mirrors. All I'm hearing is problems and that we will not even see MCM designs anytime soon. If that is the case, Navi is nothing other than Vega in disguise.


----------



## efikkan (Feb 7, 2019)

Jism said:


> Actually, more voltage will cause degradation over time. Less voltage will do less degradation.


Try reading my second post again.
I know the difference between degraded hardware and stability issues. All hardware degrades over time and requires more voltage to remain stable, which is why hardware runs at a marginally higher voltage than it initially needs. Higher voltage beyond a certain limit does of course accelerate degradation, but undervolting the hardware doesn't prevent this.


----------



## Fluffmeister (Feb 7, 2019)

Xzibit said:


> You compare it to Nvidia current RTX line up for FP64 its a steal for those workloads.
> 
> RTX 2080 Ti = 420.2 GFLOPS
> RTX Titan = 509.8 GFLOPS
> ...



Good point, it's kinda late to the game as a gaming card, so I guess this is a positive spin.

A compute card is not great for gamers, so those who use FP64 should buy it instead. Wow.


----------



## Super XP (Feb 7, 2019)

Vayra86 said:


> Navi so far is smoke and mirrors. All I'm hearing is problems and that we will not even see MCM designs anytime soon. If that is the case, Navi is nothing other than Vega in disguise.



At this point, who knows. But one thing I know for a fact: AMD is hard at work, because they need an RX 580 successor fast. 
This is what I know about Navi. 
* Multi-Chip Design
* Infinity Fabric
* 7nm
* GDDR6 & HBM2 variants. 
* AMD has been lab-testing 7nm Navi, and it's supposedly performing better than expected. This is based on a Fudzilla source, which has been right more often than not.
* Supposedly going to be called RX 680, RX 690, etc.
* Launch Date - May/June 2019 & Release Date - July 2019.


----------



## notb (Feb 7, 2019)

Countryside said:


> Makes me wonder why they only allowed the REF model and did not allow partners to change the cooler.


This card may be important as a statement for AMD. AIBs don't need it. You think it would be profitable for them to design a 300W cooling solution for a few thousand cards (spread among 5+ big AIBs)?

AMD basically ordered a container of these, called the AIBs and asked if they want to put a sticker on it.
I doubt they make any money out of it (maybe if AMD is paying them for marketing).


efikkan said:


> It is worth mentioning that this is the first "top" model from AMD in recent years that don't come close to their Nvidia counterpart in performance, so in essence we can call this their largest fail yet.


True. We got used to the fact that the top "Ti" is out of range, but 390X matched 980 and Vega 64 matched 1080 (at least on performance).
I do believe it has more to do with Turing being awesome than Radeon VII being bad, but at the end of the day it doesn't matter. The gap grows.

At this rate next AMD gen will be competing with Nvidia's 3070. That would basically mean Nvidia being able to make notebook GPUs that outperform AMD's 3-fan ovens. Sad.


> TSMC "7nm" is a very large die shrink, and should allow for some substantial gains. The fact that AMD seem to only get ~25% more performance is quite remarkable, and far away from their >40% goal. As you pointed out, many think the node shrinks will equalize the differences between AMD and Nvidia, but in fact, it will have the opposite effect, as more efficient architectures will scale even better. We also have to remember that this might be the last "good" node shrink in a while, and Nvidia haven't used it yet…


Exactly. 7nm should be a big improvement and turned out to be expensive sh*t.
This is very weird indeed. Totally not in line with what AMD and TSMC have been telling us.

TSMC 7nm is either very immature at this point and it needs a lot of time to be polished or it only works for small, low-voltage chips.
Which would basically mean TSMC isn't as far ahead of Intel as many thought.

Maybe if AMD did a 7nm RX570 successor, it would shine in performance and efficiency. But they pushed this architecture as far as they could and ended up with this junk.
Mind you, if 7nm is very expensive and makes these GPUs unprofitable, a "maxed-out" limited edition was the better choice than a sensible mid-range one...

But let's mention CPUs once again. This card is such bad news for Zen 2.
There was already a leak that 8-core 7nm Ryzen will retain the ~100W TDP level. Everyone was like: "Naaah, AMD has already done this before. 8-cores on 7nm won't go past 60W and the future 12- and 16-core will also have 100W TDP". But looking at Radeon VII, I do believe in the 100W power draw. Which means 16-core Ryzen will pull 160W just like Threadripper. Not a big deal, but also not the miracle we've been promised.


Xuper said:


> a proper UV won't hurt it.AMD cards do have proper voltage control.


If they had such great voltage control and big margins, AMD would tune them properly in the factory.
What you actually wanted to say is: AMD has high quality variance.


----------



## Vya Domus (Feb 7, 2019)

Vayra86 said:


> All I'm hearing is problems and that we will not even see MCM designs anytime soon.



MCM starts making sense when you are close to the end of the life cycle of a node and you can no longer pack any more transistors into a single die while maintaining a certain cost/power envelope. In other words MCM will be reserved for the moment when you need to build the absolute fastest GPU you can make and nothing else can get you there.

That being said there is no reason AMD has to push for this right now, we are at the beginning of 7nm and there is still a lot of headroom left for monolithic GPUs.


----------



## Super XP (Feb 7, 2019)

Well this is interesting?
https://www.tomshardware.com/news/amd-navi-everything-we-know,38242.html


> The Radeon RX 3080 will supposedly employ AMD's Navi 10 silicon produced with TSMC's 7nm FinFET manufacturing process. It also comes equipped with 8GB of GDDR6 memory and a modest 150W TDP (thermal design power) rating. Performance-wise, the Radeon RX 3080 is believed to be 15 percent faster than AMD's current Radeon RX Vega 64 graphics card. The Radeon RX 3080 is expected to carry a $249.99 (~£197.55) price tag.



If this is true, WOW.


----------



## moproblems99 (Feb 7, 2019)

Vayra86 said:


> Navi is nothing other than Vega in disguise



If this ends up being true...it will be a sad day.


----------



## M2B (Feb 7, 2019)

Super XP said:


> If this is true



It's not.


----------



## Fluffmeister (Feb 7, 2019)

Yeah, AdoredTV dreams. If true, it will instantly render this Radeon VII turkey dead.

As always for AMD fans, I suggest you keep waiting.


----------



## Super XP (Feb 7, 2019)

M2B said:


> It's not.


You see, that is what we don't know. But multiple sources state Navi seems to be performing above expectations. This includes Fudzilla sources and wccftech sources.



Fluffmeister said:


> Yeah AdoredTV dreams, if true it will instantly render this Radeon VII turkey dead:
> 
> 
> 
> ...


It seems to me this Radeon VII turkey (LOL) is a stop-gap till Navi is released. OR it's simply a disguise to help catch Nvidia off guard haha


----------



## M2B (Feb 7, 2019)

Super XP said:


> This includes Fudzilla sources and wccftech sources.



Don't trust them.


----------



## Vya Domus (Feb 7, 2019)

M2B said:


> It's not.



It is. 

Which one of us is right ?


----------



## Super XP (Feb 7, 2019)

M2B said:


> Don't trust them.


I do, because most of what they've reported was correct or very close to the facts. The same goes for wccftech.



Vya Domus said:


> It is.
> 
> Which one of us is right ?


----------



## notb (Feb 7, 2019)

Super XP said:


> Radeon VII does good enough for PC Gaming, but its not necessarily a gaming card.


It's not a workstation card either.


> And so far 7nm is looking great for the upcoming Zen 2. Let's see how Navi turns out, hopefully this GPU is enough to replace AMD's entire RX 500 lineup.


How do you know it is looking great? You've seen it? Or are you judging by the marketing materials we've been fed with? Because what they said about 7nm GPUs was also rather _optimistic_.


Super XP said:


> Well this is interesting?
> https://www.tomshardware.com/news/amd-navi-everything-we-know,38242.html
> If this is true, WOW.


No offense man... I understand you like the brand and so on.
But you can't be so naive as to think that in a few months AMD will release another 7nm GPU with half of Radeon VII's power consumption at 1/5th the price.


----------



## qubit (Feb 7, 2019)

@W1zzard wrote: "Gaming noise levels are just way too high; the fans ramp up very quickly and become a nuisance mere seconds after putting some gaming load on the card. If you compare this to NVIDIA's offerings, it'll be a day-and-night difference! If you value low noise, the Radeon VII is definitely not for you."

I can't believe this. Just how does AMD manage to screw this up on a $700 flagship card when it's a solved problem?!  When AMD have released so many generations of graphics cards over the last 15 years or so? When NVIDIA gets it right, or at least reasonably right, on all their reference cards?

This right here is a dealbreaker, even forgetting about all the other limitations I pointed out in my earlier post. This card is yet another underwhelming lemon.

If it were sold at around two thirds to half the price, then it might start to become viable. Maybe. Personally, if someone gave the thing to me for free, I'd eBay it straight away and put that money towards an RTX 2080.

Someone at AMD needs to be fired, they really do.

I wait with bated breath for the day AMD does a Zen on its graphics cards and gives us genuine choice and competition to NVIDIA.


----------



## M2B (Feb 7, 2019)

Vya Domus said:


> It is.
> 
> Which one of us is right ?



The one who uses his brain.


----------



## FYFI13 (Feb 7, 2019)

Scougar said:


> GTX 1070 ?  How did you get that out of this review?


*RTX 2070. RTX 2000-series cards are so overpriced that sometimes I forget they exist.


----------



## Super XP (Feb 7, 2019)

notb said:


> It's not a workstation card either.
> 
> How do you know it is looking great? You've seen it? Or are you judging by the marketing materials we've been fed with? Because what they said about 7nm GPUs was also rather _optimistic_.
> 
> ...


I was under the impression that Navi is a GPU Re-Design. All based on what I've read from TechPowerUp, Toms Hardware, Fudzilla, wccftech, Anandtech, HotHardware, PC Gamer, PCGames, TechRadar, HEXUS, Guru3D, Digital Trends, KitGURU, etc.,

Of course, we will simply have to wait and see...

This 7 days ago...
https://www.pcgamesn.com/amd/amd-7nm-competitive-advantage-2019
*AMD claims 7nm is a “big competitive advantage” over its rivals in 2019*


----------



## Mescalamba (Feb 7, 2019)

Xzibit said:


> You compare it to Nvidia current RTX line up for FP64 its a steal for those workloads.
> 
> RTX 2080 Ti = 420.2 GFLOPS
> RTX Titan = 509.8 GFLOPS
> ...



Just checked mining: one RVII does 90 MH/s, and 100 is probably possible. That's a LOT of computing power. Also a bit strange, since it completely doesn't correspond with gaming performance (a 1080 Ti does 50 at best, Vega 64 about the same), so RVII is actually a huge compute jump. Where the heck is that performance in games?
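The mining figure actually lines up with memory bandwidth rather than shader throughput. Ethash is bandwidth-bound; assuming the standard access pattern of ~64 random 128-byte DAG reads per hash, the hash rate implies:

```python
# Ethash does ~64 random reads of 128 bytes from the DAG per hash,
# i.e. ~8 KB of memory traffic per hash (assumed access pattern).
hashes_per_s = 90e6                       # the 90 MH/s figure above
bytes_per_hash = 64 * 128                 # 8192 bytes
implied_gb_s = hashes_per_s * bytes_per_hash / 1e9
print(f"~{implied_gb_s:.0f} GB/s of the ~1000 GB/s HBM2 budget")
```

That works out to roughly 737 GB/s, i.e. about three quarters of the card's 1 TB/s, which is why a card that loses in games can still lead in a purely bandwidth-bound workload.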


----------



## Vya Domus (Feb 7, 2019)

M2B said:


> The one who uses his brain.



I am sorry to hear that then. My condolences for your cognitive impairment.


----------



## moproblems99 (Feb 7, 2019)

notb said:


> But you can't be so naive to think that in few months AMD will release another 7nm GPU with half of Radeon VII power consumption and for 1/5th of the price.



Why not?  It isn't like VII was a new architecture they spent any time creating.  VII was merely a test run on 7nm to say "we can, and we did it first."  Navi has been in development for a long time.


----------



## efikkan (Feb 7, 2019)

notb said:


> Maybe if AMD did a 7nm RX570 successor, it would shine in performance and efficiency. But they pushed this architecture as far as they could and ended up with this junk.
> 
> Mind you, if 7nm is very expensive and makes these GPUs unprofitable, a "maxed-out" limited edition was the better choice than a sensible mid-range one...


I doubt that the node shrink will benefit the smaller chips more than larger ones. Usually upper mid-range to high-end chips achieve the highest performance per watt.

But still, the success of chips on the 7nm node depends on two primary factors: the node and the architecture. I would hope Navi has some architectural improvements/tweaks, but I'm not expecting anything radical.

This Radeon VII card is simply a stop-gap and a "backup plan" due to delays of Navi. Back in October we heard that AMD got engineering samples in their labs and they were looking "good", and then three months later at CES they didn't have a single fully working sample, and whipped up Radeon VII instead. That is a good indicator that something is seriously wrong with the "bigger" Navi chip. That might not even be the node at fault.



notb said:


> But let's mention CPUs once again. This card is such bad news for Zen 2.
> 
> There was already a leak that 8-core 7nm Ryzen will retain the ~100W TDP level. Everyone was like: "Naaah, AMD has already done this before. 8-cores on 7nm won't go past 60W and the future 12- and 16-core will also have 100W TDP". But looking at Radeon VII, I do believe in the 100W power draw. Which means 16-core Ryzen will pull 160W just like Threadripper. Not a big deal, but also not the miracle we've been promised.


I wouldn't read too much into that, since engineering samples can be all over the place in terms of energy consumption. It can actually go both ways; do you remember the unrealistic Polaris engineering sample?

7nm will probably be good eventually, but until EUV arrives, it will continue to cause limitations. We can hope that Zen 2 is carefully crafted to deal with these production problems, but there will have to be some kind of trade-off between density, performance and yields until "7nm+" arrives.


----------



## M2B (Feb 7, 2019)

Vya Domus said:


> I am sorry to hear that then. My condolences for your cognitive impairment.



I love you so much.


----------



## notb (Feb 7, 2019)

qubit said:


> This right here is a dealbreaker, even forgetting about all the other limitations I pointed out in my earlier post. This card is yet another underwhelming lemon.


First thing I noticed.
The card is just way too loud for a consumer part that's supposed to be used at home. It's really getting close to vacuum-cleaner territory.

I can only imagine gaming with this in well-isolated headphones. Preferably if you live alone or keep the PC in a basement.
I can't believe people here are suggesting it for work. Imagine spending 8 hours a day next to this thing. And your colleagues in the office could share the pleasure!


----------



## xkm1948 (Feb 7, 2019)

Just read a lot of reviews from multiple sites. The consensus is fairly clear: this is a statement card. A statement to the market that says AMD is not done making consumer GPUs yet. And that is it. Value is not good at the current price ($599 would be the sweet spot). 

My interest is in primitive shaders and tile-based rasterization, whose actual implementation status was left ambiguous:
https://www.techpowerup.com/240879/amd-cancels-implicit-primitive-shader-driver-support

Are there ways to test those features, mighty @W1zzard ? I see you mentioned that some improvements may have been made in the silicon.

Also, would it be possible to do a clock-to-clock comparison versus Vega 56/64? Downclock the HBM on the Radeon VII to the same VRAM bandwidth and compare the performance there.





Fluffmeister said:


> As always for AMD fans, I suggest you keep waiting.




Naw man. Waiting is for losers. I suggest any members here who consider themselves truly AMD fans go out there and bring a Radeon VII home. Actions are louder than a thousand words.


----------



## Super XP (Feb 7, 2019)

Back in June 2018, TSMC was ahead of schedule on 7nm production. I can only imagine this getting better throughout Q1 and Q2 of 2019. Unless I am missing something.


----------



## zenlaserman (Feb 7, 2019)

This thing looks _great_, IMO.  The Radeon VII is the first single-GPU consumer card by AMD to exceed 100 Gpixels/s out of the box.  nV has been over that mark for a while, ofc.  If one can cool this card enough and get it to a steady 2GHz, it'll be truly awesome for all work.  In typical FineWine fashion, this card is gonna get better and better for games, because that's what happens with something as overly complicated as GCN on that front.  For a long time, they have simply lacked the Gpixels/s for high average fps across all games.

People gonna complain tho.  Kids gonna look at superficial benchmarks and cry deeply.


----------



## xkm1948 (Feb 7, 2019)

Super XP said:


> Back in June 2018, TSMC was ahead of schedule for 7nm production. I can only imagine this to get better throughout the Q1 & Q2 of 2019. Unless I am missing something.



Things will get interesting once Nvidia also switches to 7nm.

Without introducing a more efficient uarch, AMD cannot compete with Nvidia in terms of power consumption and absolute performance.

I agree with @OneMoar 

#GCNMUSTDIE


----------



## Vayra86 (Feb 7, 2019)

Vya Domus said:


> MCM starts making sense when you are close to the end of the life cycle of a node and you can no longer pack any more transistors into a single die while maintaining a certain cost/power envelope. In other words MCM will be reserved for the moment when you need to build the absolute fastest GPU you can make and nothing else can get you there.
> 
> That being said there is no reason AMD has to push for this right now, we are at the beginning of 7nm and there is still a lot of headroom left for monolithic GPUs.



Well, given the state of Vega right now, don't you think they actually do need it? They can keep screwing around in the lower-midrange with Navi otherwise, and we all know what that means.



moproblems99 said:


> If this ends up being true...it will be a sad day.



Precisely, but if we are to believe Vya, that day is soon upon us...


----------



## Super XP (Feb 7, 2019)

xkm1948 said:


> Thing will be interesting once Nvidia also switches to 7nm.
> 
> Without introducing a more effiecent uarc. AMD cannot compete with Nvidia in terms of power consumption and absolute performance.
> 
> ...


They can compete but they probably can't take Nvidia in the high end. With the Navigation GPUs.


----------



## Vayra86 (Feb 7, 2019)

Super XP said:


> With the Navigation GPUs.



LOL autocarrot for the when!


----------



## M2B (Feb 8, 2019)

xkm1948 said:


> interesting



Not for AMD lovers though.


----------



## Vayra86 (Feb 8, 2019)

notb said:


> First thing I noticed.
> The card is just way too loud for a consumer part that's supposed to be used at home. It's really getting close to vacuum cleaners.
> 
> I can only imagine gaming with this in very well isolated headphones. Preferably: if you live alone or keep the PC in a basement.
> I can't believe people here are suggesting it for work. Imagine spending 8 hours a day next to this thing. And your colleagues in the office could share the pleasure!



Older GCN cards were louder than 43 dB. We've seen 45-46 more often than not on stock AMD blowers. It's odd, but VII is actually an improvement...



zenlaserman said:


> This thing looks _great_, IMO.  The Radeon VII is the first single GPU consumer card by AMD to have over 100 Gpixels/sec out of the box.  nV has been over that mark for a while ofc.  If one can cool this card enough and get it to a steady 2GHz, it'll be truly awesome for all work.  In typical FineWine fashion, this card is gonna get better and better for games, because that's what happens with something overly complicated as GCN on that front.  For a long time, they have simply lacked the Gpixels/sec for high average fps across all games.
> 
> People gonna complain tho.  Kids gonna look at superficial benchmarks and cry deeply.



Ehm. I guess everyone's entitled to their opinion, but... I think you need a reality check.


----------



## M2B (Feb 8, 2019)

Vayra86 said:


> Ehm. I guess everyone's entitled to their opinion, but... I think you need a reality check.



What happened to the surprise element? 
Do you still see it?


----------



## Maye88 (Feb 8, 2019)

Do you have a part number or a link or any info about the washers you used? Gonna try that trick myself.


----------



## Tolga (Feb 8, 2019)

"16GB HBM Ram"  Good choice of rams for price competition 
8gb ram will be enough for everyone...

they are doing something by producing only 7nm of the same architecture , 7nm 4000 shader gpu *disappointment*


----------



## notb (Feb 8, 2019)

Vayra86 said:


> Older GCN cards were louder than 43 dB. We've seen 45-46 more regularly than not on stock AMD blowers. Its odd but VII is actually an improvement...


Just for the record: dB noise measurements are relative. You know that the card is not emitting 43 dB, right? 

There's really no point in discussing whether 43 dB is a lot or not. What is important is that this card is more or less as loud as the reference (blower) Vega 64. And that should be enough for people who sat next to the latter.
Another thing - subjective, but pointed out by many reviewers (including TPU) - is that the noise is very unpleasant (as in: it's hard to concentrate next to this thing @100%).


----------



## Vayra86 (Feb 8, 2019)

notb said:


> Just for the record: dB noise measurements are relative. You know that the card is not emitting 43 dB, right?
> 
> There's really no point in discussing whether 43 dB is a lot or not. What is important, is that this card is more or less as loud as the reference (blower) Vega 64. And that should be enough for people who sat next to the latter.
> Another thing - subjective, but pointed out by many reviewers (including TPU) - is that the noise is very unpleasant (as in: it's hard to concentrate next to this thing @100%).



Here you go. Is 51 dB still relative to you?  Let's just say vacuum-cleaner statements belong under the review below, not this VII; that card was probably more like hair-dryer levels.  Calling 43 dB 'not fit for a consumer part' is, I think, lacking perspective.

https://www.techpowerup.com/reviews/AMD/HD_7970_GHz_Edition/27.html
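For readers skimming the dB back-and-forth above: decibels are logarithmic, so the gap between the two readings is larger than it looks. A quick sketch (the helper functions are mine, and the ~51 dBA / ~43 dBA figures are the approximate readings quoted in this thread, so treat the numbers as illustrative):

```python
def spl_power_ratio(db_a: float, db_b: float) -> float:
    """Ratio of acoustic power implied by two SPL readings (dB is a log scale)."""
    return 10 ** ((db_a - db_b) / 10)

def spl_pressure_ratio(db_a: float, db_b: float) -> float:
    """Ratio of sound pressure implied by two SPL readings."""
    return 10 ** ((db_a - db_b) / 20)

# HD 7970 GHz Edition (~51 dBA in TPU's old review) vs Radeon VII (~43 dBA):
print(round(spl_power_ratio(51, 43), 2))     # ~6.31x the acoustic power
print(round(spl_pressure_ratio(51, 43), 2))  # ~2.51x the sound pressure
```

In other words, an 8 dB gap means the older blower was putting out several times the acoustic power, which is the perspective being argued over here.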


----------



## Aquinus (Feb 8, 2019)

These are definitely interesting results. I read this review immediately after reading the one on Phoronix (which is obviously the Linux side of things), and performance there seems a lot closer than the results here. I'm even more amused by the difference in Deus Ex, which is faster in Windows but slower in Linux, while in Unigine Heaven and Superposition it's well past the 2080. It doesn't seem like a bad card, at least on the Linux side of the house.


----------



## Super XP (Feb 8, 2019)

Vayra86 said:


> LOL autocarrot for the when!


Auto Correct lol 
Meant to say Navi


----------



## Xuper (Feb 8, 2019)

One thing: this card is good for those who have 4K FreeSync.


----------



## Fluffmeister (Feb 8, 2019)

Xuper said:


> One thing , this card is good for those who have 4K Freesync



Not really.


----------



## Xuper (Feb 8, 2019)

Fluffmeister said:


> Not really.


Care to elaborate?


----------



## xkm1948 (Feb 8, 2019)

Xuper said:


> Care to elaborate?



1440p 144hz yeah. 4k60hz hell nah.


----------



## SystemMechanic (Feb 8, 2019)

Once games start using DLSS, this card will be trash.


----------



## zenlaserman (Feb 8, 2019)

Vayra86 said:


> Ehm. I guess everyone's entitled to their opinion, but... I think you need a reality check.



Funny you say I need a reality check, when it is you who is about to get one.

My opinion stems from experience - I've been using Radeons since the beginning of the name.  They've always been tops for all-around work in my rigs.  They work well clocked low, clocked high, with or without software accompanying the drivers, unlike nV.  Even AMD's HD-series onboard will accelerate CAD nicely.  I don't just game, which is what the loudest of whiners about this card seem to do.  AMD's GPUs have consistently proven to me to be superior for all-around work.  If I wanted a GPU just for gaming I'd buy a console, and nice, they have Radeons, too.

I've had more than enough bad experiences with nV as a company during my years in Silicon Valley, with their GPUs at times when I've tried using them for my diverse workload, and with their software when it attempts to dictate to me what hardware I should have in my PC via BSOD.

You've made clear your pro-nV leanings in many threads.  That's fine.  Maybe you'll be original one day.


----------



## EarthDog (Feb 8, 2019)

xkm1948 said:


> 1440p 144hz yeah. 4k60hz hell nah.


A 2080 is barely 4K60-capable (and isn't all the time). If this is slower by several %... some IQ sacrifices will have to be made to reach that magic number. It can moonlight as one, but it isn't optimal.


zenlaserman said:


> because that's what happens with something overly complicated as GCN on that front.


Hasn't GCN been out for a few years now... GCN2 as well? Instinct cards have also been out for a few months, but those are pro cards with the same base driver (and plenty of other things to make them different). With that in mind, it's odd the drivers didn't come out more polished. While I do believe driver maturation brings performance increases in some titles, it's not going to make up 14% across the board. Some titles may get that bump, others none at all.

Anyway, as I've mentioned earlier, if you can utilize the compute side of this card, it's really a no-brainer to snag as a prosumer. But AMD placed this as a gaming card with a side of compute (even though FP16 is cut down considerably... still faster than NV!), and most users are not prosumers.

I have to admit, for all your diverse workloads in Silicon Valley, I'm surprised you or your employer didn't spring for the professional cards with full capabilities in the first place.


----------



## xkm1948 (Feb 8, 2019)

zenlaserman said:


> Funny you say I need a reality check, when it is you who is about to get one.
> 
> My opinion stems from experience - I've been using Radeons since the beginning of the name.  They've always been tops for all-around work in my rigs.  They work well clocked low, clocked high, with or without software accompanying the drivers, unlike nV.  Even AMD's HD-series onboard will accelerate CAD nicely.  I don't just game, which is what the loudest of whiners about this card seem to do.  AMD's GPUs have consistently proven to me to be superior for all-around work.  If I wanted a GPU just for gaming I'd buy a console, and nice, they have Radeons, too.
> 
> ...



Vayra86 is pro-NV???? Man, am I living in some alternative universe? Vayra86 shits on both GPU makers for any BS they pull, as he is NOT A F*KING FANBOY.  He is one of the few level-headed members here.

AMD fans truly will do anything to twist reality around themselves for their safe space.



EarthDog said:


> a 2080 is barely 4k 60 capable (and isnt all the time). if this slower by a several %... some IQ sacrifices will have to be made to reach that magic number. It can moonlight as one... but, isnt optimal.
> Hasnt gcn been out for a few years now...gcn2 as well? Instinct cards have also been out for a few months, but those are pro cards with the same base driver (but plenty of other things to make them different). With that in mind, it's odd the drivers didnt come out more polished. While I do believe with driver maturation comes, in some titles, performance increases, it's not going to make up 14% across the board. Some titles may get that bump, others none at all.
> 
> Anyway, as I've mentioned earlier, if you can utilize the compute side of this card, it's really a no brainer to snag it as a prosumer. But, AMD placed this as gaming card with a side of compute (even though fp16 is cut down considerably...still faster than NV!).
> ...




Once again, very few researchers in academia use AMD GPU accelerators. CUDA and TensorFlow absolutely demolish OpenCL in scientific computing.

And the only good use of AMD's compute I can think of is actually crypto-mining. Sad.


----------



## moproblems99 (Feb 8, 2019)

xkm1948 said:


> Just read a lot of reviews from multiple sites. The consensus is fairly clear: this is a statement card. A statement to the market that says AMD is not done making consumer GPU yet. And that is it. Value is not good for the current price ($599 would be the sweet spot).



What would have possibly made you think anything different?  I understand we cannot truly judge value until benches.  However, we pretty much knew the price.  We also knew this was Vega II with near-zero architectural differences from the OG Vega.  The only actual things changing were clocks and VRAM.  We also knew it was going to 7nm and that all those efficiency improvements were going to be eaten by clocks.

I honestly cannot figure out why anybody expected anything different than what we got besides a few % points on performance.

People continue to baffle me.


----------



## Dbiggs9 (Feb 8, 2019)

I just ordered one.


----------



## xkm1948 (Feb 8, 2019)

Dbiggs9 said:


> I Just ordered one.



Do report back how it works. Judging by W1zzard's review, with the improved Wattman this can definitely overclock a bit.




Be like Dbiggs9: support the underdog with an actual purchase. Don't be like those who are all bark and no bite.


----------



## Xzibit (Feb 8, 2019)

Frame times and an interesting look at use case content creators on the 16GB side


----------



## Zubasa (Feb 8, 2019)

SystemMechanic said:


> once games start using dlss, this card will be trash.


I am not sure people pay $700 to play games at reduced image quality. 
DLSS or some kind of image up-scaling / dynamic resolution would be very useful for lower-end cards.


----------



## Kissamies (Feb 8, 2019)

Much better than I thought it's going to be. Also one good thing is that it's not from Nvidia.


----------



## Prima.Vera (Feb 8, 2019)

This card is not a failure. It's just that the price needs to be *at least* $200 lower. A second-hand GTX 1080 Ti is the natural choice; that card is so good it makes the AMD one DOA on day one.


----------



## W1zzard (Feb 8, 2019)



Maye88 said:


> Do you have a part number or a link or any info about the washers you used? Gonna try that trick myself.


Anything that fits, really. Prefer plastic over metal if you have a choice, to reduce the chance of shorts if you lose one in your case.


----------



## sepheronx (Feb 8, 2019)

Guess I will keep my GTX 1070 a lot longer.

I think people are being way too... aggressive over this.

We knew it wouldn't be that amazing.  It isn't a bad card either.  It is just loud and over-volted.  Clearly it shows that the voltage can be reduced considerably.  Noise is something I can't stand.  This card would not have fit in my case anyway, but I wouldn't mind having it.  Its DX12 performance is good.

I think it is safe to wait on driver updates.  Some of the performance in some titles made little sense (Dragon Quest 11, for example).  And if others are experiencing driver-related issues, maybe it is best to wait?  Recall the RX 570.  That card did not perform very well even compared to a GTX 1050, yet over time it surpassed it in performance.  While it sucks and we shouldn't have to wait for proper drivers, I gather that may have to be the case.

If this card was $400 - $500, I would buy it.  Not at the current price though.


----------



## Vya Domus (Feb 8, 2019)

Vayra86 said:


> They can keep screwing around in the lower-midrange with Navi otherwise, and we all know what that means.



Unfortunately, or fortunately for them, that's what will happen. They can mitigate their losses in the mid-range a lot better.


----------



## cucker tarlson (Feb 8, 2019)

TheGuruStud said:


> Avoiding SE4 this time, eh? Lol.  39 fps vs 50s on V64 didn't look good, did it? Lows have been mysteriously missing, too (hint: Vega excels at this and nvidia is poo).





Vayra86 said:


> Come on, don't be a sore loser, don't spread BS to somehow create some sort of saving grace here - it doesn't exist and it doesn't help anyone, least of all yourself... I remember this one from Ryzen too... 'but the minimums'  Please.



Funny how he can *precisely* recall the numbers of one test from the Vega launch but can't find one review to support his min-fps theory.

There you go:

http://www.pcgameshardware.de/Radeon-VII-Grafikkarte-268194/Tests/Benchmark-Review-1274185/2/

21 games; the 2080 has a higher min-fps number in 16 of them. Meanwhile, the Radeon VII's min-fps numbers generally stay very close to the 2070's, actually losing in 6 out of 21.
SE4 has been running well on Nvidia cards for a long time already; they followed up with a driver update shortly after release, but a biased and uninformed person would certainly prefer to use old and irrelevant data. See for yourself: SE4 is included in the PCGH review I'm basing all this on. V64 is 1% faster in avg fps and 8% faster in min fps compared to the GTX 1080, while the 1080 Ti is 1.38x faster in avg and 1.32x faster in min fps.

Sad to see the red-team fanbase come and defend the indefensible with made-up theories. It kinda tells you the whole story about this card that it can't prove its value on its own merits; it looks bleak compared to the 2080 and is a worse value buy than the 2070 by a mile too. 10% slower than the 2080 while 6% faster than the 2070, no RT features, no AIB versions... that tells you exactly why people are complaining about the price. This should've been $500, and it would give potential 2070 buyers a better alternative. In reality it's been grossly overpriced, and it probably will not come down in price either; I can't see anyone but AMD fans choosing this.


----------



## GoldenX (Feb 8, 2019)

What was the last "good" desktop GCN card? 390x?
This is getting repetitive, AMD releases a thermonuclear reactor slower than Nvidia, and Nvidia just adds another zero to their prices. Loop.


----------



## cucker tarlson (Feb 8, 2019)

GoldenX said:


> What was the last "good" desktop GCN card? 390x?


470/480 and 570/580


----------



## Blueberries (Feb 8, 2019)

Nice to see AMD finally release some competition for the 1080 ti.








...what year is it again?


----------



## Vayra86 (Feb 8, 2019)

zenlaserman said:


> Funny you say I need a reality check, when it is you who is about to get one.
> 
> My opinion stems from experience - I've been using Radeons since the beginning of the name.  They've always been tops for all-around work in my rigs.  They work well clocked low, clocked high, with or without software accompanying the drivers, unlike nV.  Even AMD's HD-series onboard will accelerate CAD nicely.  I don't just game, which is what the loudest of whiners about this card seem to do.  AMD's GPUs have consistently proven to me to be superior for all-around work.  If I wanted a GPU just for gaming I'd buy a console, and nice, they have Radeons, too.
> 
> ...



If you are reading my posts, surely you will also have read that I do see this card fills a niche. But we are looking at a product marketed for gaming. That is _my _context. You didn't talk about your other workloads earlier; you spoke of kids crying deeply over game benchmarks... and FineWine, which we haven't really seen much of the past few years.


----------



## Bjorn_Of_Iceland (Feb 8, 2019)

Man.. this sucks. Price it at $599, and it will be good.

Maybe wait for Fine Wine? Ferment it for what? 5 years? lol.


----------



## medi01 (Feb 8, 2019)

Brief undervolting effects (UV):









Blueberries said:


> for the 1080 ti.


Nice that 2080 is now called 1080Ti.
Progress, they said.



Tolga said:


> "16GB HBM Ram"  Good choice of rams for price competition


They had to gamble and it didn't quite work (bigger Polaris was also stopped in the process).
But thanks to nVidia bending customers over, price is not that big of an issue.


----------



## laszlo (Feb 8, 2019)

As a conclusion, for me, AMD took a Vega 20 (Instinct MI60), disabled a few CUs, wrote a driver for gaming support, and threw the final product onto the market as the VII.

Basically they re-purposed a few thousand stock chips, which makes me think the MI60 is not selling that well; maybe this is the reason for the limited availability, as they're testing the market, and if sales go well then stock can be used up this way...


----------



## dicktracy (Feb 8, 2019)

Same price as the 2080 while being a little slower and having way fewer features is a hard sell. The 2080 has RTX, DLSS, G-Sync + FreeSync, the brand-name advantage, and EVGA backing. Yeah, no... and this is their fully specced GPU. From a technical standpoint, Nvidia's maxed-out GPU is the RTX Titan, and this thing is barely on par with the cut-down 2080!? Lol. RTG is beyond doomed once Nvidia goes to 7nm.


----------



## Xuper (Feb 8, 2019)

dicktracy said:


> Same price as 2080 while being a little slower and having way less features is a hard sell. 2080 has RTX, DLSS, GSync + *FreeSync*, brand-name advantage, and EVGA backing. Yeah no... and this their full specced GPU. From a technical standpoint, Nvidia’s maxed out GPU is the RTX Titan and this thing is barely onpar with the gutted 2080!? Lol. RTG is beyond doom once Nvidia goes to 7nm



If I want FreeSync to work flawlessly, I won't get an NV card. Bad idea. Nvidia needs much more work to ensure that FreeSync works without any issues, but I'm sure they'll never bother, due to their strategy.


----------



## MonteCristo (Feb 8, 2019)

Meanwhile in Phoronix.


----------



## Xuper (Feb 8, 2019)

Ofc, it's just the Radeon Instinct MI50 as a gaming card; AMD had worked on their Linux driver to improve the MI50's performance in Linux.
One owner mentioned he reduced the voltage from 1150 mV to 1050 mV and fan RPM dropped from 2800 to 1900.
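As a rough sanity check on that report, a voltage drop alone predicts a sizeable power saving. A first-order sketch (my own back-of-envelope, assuming the classic CMOS dynamic-power relation P ∝ f·V² at fixed clocks and that the quoted numbers are millivolts; it ignores leakage and any clock changes):

```python
def dynamic_power_scale(v_new_mv: float, v_old_mv: float) -> float:
    """First-order CMOS dynamic-power scaling: P ~ f * V^2, so at a
    constant frequency the ratio is simply (V_new / V_old)^2."""
    return (v_new_mv / v_old_mv) ** 2

# Reported undervolt: 1150 mV -> 1050 mV at (roughly) the same clocks
scale = dynamic_power_scale(1050, 1150)
print(round(scale, 3))  # ~0.834, i.e. roughly 17% less dynamic power
```

A ~17% cut in dynamic power would plausibly explain the fan dropping from 2800 to 1900 RPM, since fan speed tracks the heat that has to be moved.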


----------



## cucker tarlson (Feb 8, 2019)

zenlaserman said:


> Funny you say I need a reality check, when it is you who is about to get one.
> 
> My opinion stems from experience - I've been using Radeons since the beginning of the name.  They've always been tops for all-around work in my rigs.  They work well clocked low, clocked high, with or without software accompanying the drivers, unlike nV.  Even AMD's HD-series onboard will accelerate CAD nicely.  I don't just game, which is what the loudest of whiners about this card seem to do.  AMD's GPUs have consistently proven to me to be superior for all-around work.  If I wanted a GPU just for gaming I'd buy a console, and nice, they have Radeons, too.
> 
> ...


No one is interested in your background story; stick to the point of the discussion.
Though there ain't much to discuss here; the numbers say it all. That's why people are stirring up drama.


----------



## deemon (Feb 8, 2019)

Super XP said:


> *We all know AMD needs a complete GPU Re-Design.* Until this happens, this is what they can offer. Though I do believe the price is at least $150 too high, despite the 3 x AAA Games included.



And the games aren't even included in most cases. Only a few shops here and there support this AMD bonus. None in my country.


----------



## Metroid (Feb 8, 2019)

Super XP said:


> Hopefully the 7nm Navi will be enough to further accelerate Radeon GPU adoption. I heard that Navi may be a new design? Or highly enhanced/modified and be offered in both GDDR6 and HBM2.



You heard it right. Navi will be a very efficient GPU, like Polaris is on 14nm. 2 to 4x more performance than the RX 480, sold at around $200 - $400, is a win to me.


----------



## Vya Domus (Feb 8, 2019)

Funny how all these people who are really adamant to point out how this card is useless/irrelevant/shit and call out the "red team" for defending it also keep coming back to argue about the most bizarre things. 

Oxymoron much ?


----------



## medi01 (Feb 8, 2019)

dicktracy said:


> brand-name advantage


When used in the context of which product is better, it is spelled "brain damage".


----------



## XXL_AI (Feb 8, 2019)

MonteCristo said:


> View attachment 116028
> 
> Meanwhile in Phoronix.



Because it literally is the MI50 card rebranded as the Radeon VII. AMD failed so hard at AI that they are trying to sell their stock as a gaming GPU. I applied for their AI platform 3 years ago and haven't heard from them since. No one got accepted to their platform because there is no platform. AMD is a troll.


----------



## notb (Feb 8, 2019)

MonteCristo said:


> Meanwhile in Phoronix.


Of course it is good at computation. It's a rebranded compute accelerator with video output added.
Do you really want to be the person who buys a card *for gaming* because it's good at running simulations? 

Also, the other characteristics (power consumption, noise) are very datacenter-ish, not PC-friendly.


Xuper said:


> If I want freesync works flawless I won't get NV card.Bad idea.Nvidia needs much more work to ensure that Freesync works without any issue , but I'm sure they never bother it due to their strategy


On the other hand, you're fine with the fact that this Radeon needs tweaking, undervolting, etc. just to make it slightly competitive (or rather: less atrocious).
Nvidia cards are basically plug & play.


----------



## SIGSEGV (Feb 8, 2019)

XXL_AI said:


> Because* it literally is MI50 card rebranded as Radeon VII.* AMD failed so hard on AI so they are trying to sell their stock as gaming gpu. I've applied for their AI platform 3 years ago and haven't heard from ever since. No one got accepted to their platform because there is no platform. amd is a troll.



If your story is true, then I would be delighted to get one or two Radeon VIIs. I need them badly, and hey, look, the price is cheap. lol

Thanks for your information.

...


----------



## xkm1948 (Feb 8, 2019)

Vya Domus said:


> Funny how all these people who are really adamant to point out how this card is useless/irrelevant/shit and call out the "red team" for defending it also keep coming back to argue about the most bizarre things.
> 
> Oxymoron much ?


Hey, where is your Radeon 7, dude? Once again dictating who can or cannot comment on a thread?


----------



## [XC] Oj101 (Feb 8, 2019)

@W1zzard 

Thanks for the write-up 

Firstly, I am not insinuating anything here, but do you have any explanation for being able to overclock the card when nobody else could? der8auer couldn't get a performance increase, Gamers Nexus apparently couldn't get an increase (I'm going by Roman's video here, I don't watch GN), and the same goes for Tom's Hardware and Guru3D. From the reviews I've seen, you're the only one to successfully overclock the card. Do you have any idea why?


----------



## xkm1948 (Feb 8, 2019)

[XC] Oj101 said:


> @W1zzard
> 
> Thanks for the write-up
> 
> Firstly, I am not insinuating anything here, but do you have any reason for being able to overclock the card when nobody else could? der8uaer couldn't get a performance increase, Gamers Nexus apparently couldn't get an increase (I'm going by Roman's video here, I don't watch GN), and the same goes for Toms Hardware and Guru3D. From the reviews I've seen, you're the only one to successfully overclock the card. Do you have any idea why?



Just in case you forgot, it is the creator of GPU-Z you are talking about here: W1zzard. His programming skills outmatch all those you mentioned above COMBINED.

In the age of "tech-tubers", people tend to forget the existence of real tech reviewers.

Nothing against tech-tubers, but they are non-professionals. They make videos for entertainment; the review is just add-on value.


----------



## W1zzard (Feb 8, 2019)

[XC] Oj101 said:


> Firstly, I am not insinuating anything here, but do you have any reason for being able to overclock the card when nobody else could?


I make GPU-Z


----------



## EarthDog (Feb 8, 2019)

@W1zzard W1z - New one out that supports it?

I remember a review said you were on vacation (how dare you!) and none of the apps worked.


----------



## W1zzard (Feb 8, 2019)

EarthDog said:


> @W1zzard W1z - New one out that supports it?
> 
> I remember a review said you were on vacation (how dare you!) and none of the apps worked.


Back from holiday, still working on it


----------



## XXL_AI (Feb 8, 2019)

SIGSEGV said:


> if your story is true then i would be delightful to get one or two of radeon vii. i need them badly and hey look the price is cheap. lol
> 
> thanks for your information.
> 
> ...



With NSA's RSA reverse-engineering tool, I'm sure you can rewrite the firmware to use them as compute cards. There is no stock in my country; when they arrive, I'm going to get one just to verify my models on ROCm & OpenCL.


----------



## cucker tarlson (Feb 8, 2019)

[XC] Oj101 said:


> @W1zzard
> 
> Firstly, I am not insinuating anything here, but do you have any reason for being able to overclock the card when nobody else could?






W1zzard said:


> I make GPU-Z


sorry,but I just had to seize that moment


Spoiler



when you're the only one to oc RVII cause you wrote the software yourself you be like


----------



## repman244 (Feb 8, 2019)

zenlaserman said:


> Funny you say I need a reality check, when it is you who is about to get one.
> 
> My opinion stems from experience - I've been using Radeons since the beginning of the name.  They've always been tops for all-around work in my rigs.  They work well clocked low, clocked high, with or without software accompanying the drivers, unlike nV.  Even AMD's HD-series onboard will accelerate CAD nicely.  I don't just game, which is what the loudest of whiners about this card seem to do.  AMD's GPUs have consistently proven to me to be superior for all-around work.  If I wanted a GPU just for gaming I'd buy a console, and nice, they have Radeons, too.
> 
> ...



While I don't care what brand I buy (I buy whatever has the best price/performance/power consumption), gaming cards from either AMD or Nvidia are poor for serious CAD work and won't accelerate anything; for that you need a FirePro or Quadro. With Nvidia, at least, you can use CUDA even on gaming cards.


----------



## [XC] Oj101 (Feb 8, 2019)

W1zzard said:


> I make GPU-Z



Thanks for the reply 

Can you give us any insight as to how you got around it (genuine interest) or is that top secret for now?


----------



## ManofGod (Feb 8, 2019)

yakk said:


> Really is a compute card rather than a game card.  Swing & a miss in the consumer space, that build quality is really too high for consumer use and driving up price.



Yeah, much better to use the Nvidia Space Invaders build quality instead.


----------



## Vayra86 (Feb 8, 2019)

cucker tarlson said:


> sorry,but I just had to seize that moment
> 
> 
> Spoiler
> ...


----------



## moproblems99 (Feb 8, 2019)

xkm1948 said:


> His programming skills out match all those you mentioned above COMBINED.



Can you explain what programming skills have to do with overclocking and electronics?


----------



## Shambles1980 (Feb 8, 2019)

M2B said:


> View attachment 115950
> 
> TPU's performance summary is based on significantly more games and it's definitely more accurate because of this.



Yeah, it's 5% better overall on average, but a 1080 Ti is over £1k new in the UK while the Radeon VII is around £680...
You should focus on the 2080, which is about the same price and currently 14% faster (in specific test workloads; it's slower by a similar amount in others).

I don't doubt driver support will improve the Radeon VII's performance numbers, but as it stands it's underperforming, and you can't blame anyone other than AMD if it turns out to be 100% the drivers' fault.


----------



## Vya Domus (Feb 8, 2019)

xkm1948 said:


> Hey where is your Radeon 7 dude? Once again dictating who can or cannot comment on a thread?




Recognized yourself in my description I see. You may continue posting tho.


----------



## evolucion8 (Feb 8, 2019)

Shambles1980 said:


> yeah its 5% better overall on average but a 1080ti is over £1k new in the uk, the r vii is around £680...
> You should focus on the 2080 which is about the same price and 14% better currently. (under specified test workloads. its slower by the same amount in others)
> 
> i don't doubt driver support will improve the r vii performance numbers, but as it stands its under performing and you cant blame any one other than amd if it turns out to be 100% the drivers fault.



Both are essentially tied, except in a few games running well over 150 fps, where a CPU bottleneck hits AMD's single-threaded DX11 draw-call path. But GCN is very underutilized; with driver improvements, and with more advanced games that demand more resources, the story will repeat itself. When the RX Vega 64 launched, even the water-cooled version had a hard time matching the GTX 1080; now the air-cooled version consistently matches it and outperforms it in recent demanding games, and the water-cooled version is less than 18% behind the GTX 1080 Ti. That is why jumping from an RX 64 LC to the Radeon VII doesn't make a lot of sense, and why AMD was able to match and outperform the GTX 1080 Ti with what is simply a shrunk Vega at higher clocks. This generation of GPUs has been boring overall. At least even the mainstream GPUs are powerful enough to max games out at mainstream resolutions.


----------



## [XC] Oj101 (Feb 8, 2019)

moproblems99 said:


> Can you explain what programming skills have to do with overclocking and electronics?



There's no need. The problem was that no current software could interface with the card other than AMD's own, which was a failure.


----------



## jabbadap (Feb 8, 2019)

moproblems99 said:


> Can you explain what programming skills have to do with overclocking and electronics?



Well, it's kind of hard to overclock if you don't have the software to do it  

To put this in perspective, W1zzard was the author of the now long-deprecated ATITool. I don't actually know the current state of Radeon overclocking: does AMD expose an API for it (like NVAPI from Nvidia), or is it still handled through driver exploits in third-party OC software? Either way, you need some hardware knowledge and programming skills to make use of them.


----------



## xkm1948 (Feb 8, 2019)

Vya Domus said:


> Recognized yourself in my description I see. You may continue posting tho.



Do an unboxing when you get your Radeon 7, man. I am eagerly awaiting your unbiased review of the Radeon 7. And hey, you can finally be rid of that evil 1080, right? Which brand are you getting? Sapphire seems to have the best support and warranty.

Keep us posted!


----------



## cucker tarlson (Feb 8, 2019)

xkm1948 said:


> Do an unboxing when you get your Radeon 7 man. I am eagerly waiting for your unbiased review of Radeon 7. And hey, you can finally be rid of that evil 1080 right. Which brand are you getting? *Sapphire seems to have the best support and warranty*
> 
> Keep us posted!


I can attest to that being true. I had two R9 290 Tri-X cards die (bought new), both promptly replaced with new ones. It turned out to be my PSU's fault. Sapphire was very kind not to ask questions, though, and just sent replacements. Swell company; if I ever go back to Radeon, they're getting my money, no hesitation.


----------



## Shambles1980 (Feb 8, 2019)

I had a Sapphire 7850 that messed up; they wanted me to send it to Taiwan, insured, at my own expense, which would have cost me more than buying a second-hand replacement.
Genuinely don't like Sapphire.


----------



## cucker tarlson (Feb 8, 2019)

Shambles1980 said:


> i had a sapphire 7850 that messed up they wanted me to send the thing to Taiwan at my own expense insured and that would have costed me more than buying a second hand replacement one.
> genuinely don't like sapphire.


My experience with Sapphire couldn't be further from what you're describing. I had the original receipts, though; I don't know what your situation was.


----------



## evolucion8 (Feb 8, 2019)

I had a Sapphire 290X reference card that had black-screen issues over DisplayPort. They exchanged it with no questions asked, even without a receipt, as I'd bought it from eBay right after the mining crash in 2013. It lasted me two years before I swapped it for a Fury X; it was very powerful but a bit on the hot side.


----------



## xkm1948 (Feb 8, 2019)

Yeah, the AIO pump on my old Fury X died just out of warranty. Sapphire still fixed it for me for just $50.


----------



## qubit (Feb 8, 2019)

xkm1948 said:


> Yeah the AIO pump on my old FuryX went busted just out of warranty. Sapphire still fixed it for me for just $50.


That's a seriously reasonable cost, good for you. It's not uncommon to see prices upwards of $150 for fixes like this.


----------



## GoldenX (Feb 8, 2019)

cucker tarlson said:


> 470,480/570,580


Mmm, 470/570 I believe, the 80s... not so much. But that could be the same argument with the 2080 and the 2070.


----------



## Super XP (Feb 8, 2019)

Shambles1980 said:


> i had a sapphire 7850 that messed up they wanted me to send the thing to Taiwan at my own expense insured and that would have costed me more than buying a second hand replacement one.
> genuinely don't like sapphire.


I've had nothing but great experiences with Sapphire. Had to return and get a card replaced a while back. There was no questions asked. It went as smooth as silk. RMA box came with the new Sapphire replacement card. Placed the defective product in the same box. Etc. No issues to report.


----------



## HD64G (Feb 8, 2019)

I don't know if anyone in this thread has mentioned that the Radeon 7 has 1/4-rate double precision, instead of the 1/16 rate the pre-launch rumours claimed. We are talking about serious compute power, surely the highest ever in a consumer GPU. Maybe AMD wanted another customer segment, beyond gamers and miners, willing to buy it.

Another use case for this GPU is 4K video production, where it is also unmatched thanks to its 16 GB of 1 TB/s HBM2 VRAM. Details in the video below.
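A quick back-of-the-envelope check of that compute claim (a sketch only: the 3840 stream processors and ~1.8 GHz boost clock are Radeon VII's published specs, and the FP64 rate is assumed to be exactly 1/4 of FP32):

```python
# Peak theoretical throughput = shaders x 2 ops/cycle (FMA) x clock x rate.
def peak_tflops(shaders: int, clock_ghz: float, rate: float = 1.0) -> float:
    return shaders * 2 * clock_ghz * rate / 1000.0

SHADERS = 3840      # Radeon VII stream processors
BOOST_GHZ = 1.8     # approximate peak boost clock

fp32 = peak_tflops(SHADERS, BOOST_GHZ)                  # FP32 peak
fp64_quarter = peak_tflops(SHADERS, BOOST_GHZ, 1 / 4)   # the 1/4 rate shipped
fp64_sixteenth = peak_tflops(SHADERS, BOOST_GHZ, 1 / 16)  # the rumoured 1/16 rate

print(f"FP32:        {fp32:.2f} TFLOPS")
print(f"FP64 @ 1/4:  {fp64_quarter:.2f} TFLOPS")
print(f"FP64 @ 1/16: {fp64_sixteenth:.2f} TFLOPS")
```

At 1/4 rate that works out to roughly 3.5 TFLOPS of FP64, versus under 0.9 TFLOPS had the 1/16 rumour been true, which is what makes the card interesting outside gaming.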


----------



## Xuper (Feb 8, 2019)

OK, a guy told me he set his Radeon VII to -100 mV at 1700 MHz, and got temp = 72°C, junction temp = 90°C, fan = 1950 RPM (maybe he chose the silent fan mode).
This is exactly what happened with my Ryzen 1600X: at 3950 MHz the minimum stable voltage is 1.33 V, but at 4 GHz I have to set at least 1.405 V.
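The rough dynamic-power arithmetic behind why that undervolt helps (a sketch only: the 1.05 V stock voltage and 1800 MHz stock clock are assumptions for illustration, and P ∝ f·V² ignores leakage):

```python
# Dynamic power scales roughly with frequency x voltage squared: P = C * f * V^2.
def relative_power(f_new: float, v_new: float, f_ref: float, v_ref: float) -> float:
    return (f_new / f_ref) * (v_new / v_ref) ** 2

STOCK_V = 1.05   # assumed stock voltage, volts
STOCK_F = 1800   # assumed stock clock, MHz

# The -100 mV / 1700 MHz setting from the post, relative to those stock numbers:
rel = relative_power(1700, STOCK_V - 0.100, STOCK_F, STOCK_V)
print(f"~{(1 - rel) * 100:.0f}% less dynamic power")  # roughly 23% at these assumed numbers
```

The quadratic voltage term is why a modest -100 mV cuts power (and so heat and fan speed) far more than the small clock drop would suggest.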


----------



## Shatun_Bear (Feb 8, 2019)

HD64G said:


> I don't know if anyone in this thread has mentioned that Radeon 7 has 1/4in double precision instead of 1/16 of the single precision the rumours went about before its launch. We are talking about serious compute power and for sure, the highest ever for a consumer gpu. Maybe AMD wanted to have another customer segment willing to buy it apart from gamers or miners.
> 
> And another case of use for this gpu is 4K video production in which is also incomparable because of its 1TB/s HBM2 16GB VRAM. Details in the video below.



You're not wrong:


> As you'll see in the video embedded at the top of this page, I could put together a simple Adobe Premiere project using six clips and three transitions that would crash on the Titan, but export just fine on the Radeon 7. This isn't down to drivers or compute power, but 100 per cent down to GPU memory. For content creators pushing 4K video hard, Radeon 7 is still expensive - but it's offering a pro-level memory allocation that make working with challenging projects easier.


----------



## gamerman (Feb 8, 2019)

I can't believe the Radeon VII is a 7 nm GPU!!

It's slow, runs hot, and is terribly noisy.


If AMD can't build a more efficient, faster GPU on a 7 nm process with 16 GB of HBM2, I'd recommend they stop making the junk they've offered people for the last 3.5 years.

AMD offers lots of heat and noise for little performance, with no new tech or new ideas; RTX rendering, for example, isn't supported.
And they dare ask almost €800 for that GPU!!
Well, I wouldn't take it even for free!

The Radeon VII has nearly the power draw of an RTX 2080 Ti but less performance than a 205 W RTX 2080!!!!!!!
And all that heat and noise, at a high price.

Really, really a junk GPU, which nobody in their right mind should even consider.

I must say it again: it's a 7 nm GPU. I hope people understand what a huge advantage that should be for AMD... BIG!

Nvidia is on 12 nm, the Radeon VII on 7 nm; that's almost half the size, which should mean it runs much, much cooler and performs far better, especially with that HBM2 memory.

STILL Nvidia crushes Radeon easily.


That tells you everything: AMD doesn't have the skills to build GPUs, yet they keep trying, and people keep trusting the red team. How dare AMD!


Well, the REAL comparison comes when Nvidia releases its own 7 nm GPUs, the RTX 3000 series, soon; I'm sure we'll see totally different GPUs.

I guarantee an RTX 3060 will beat the Radeon VII easily while drawing over 120 W less.

Intel also releases its first GPU soon, and then, I say, it will be Nvidia vs. Intel.

AMD GPUs (and, I'd say, CPUs too) are a loser's choice. Such terrible, lousy junk AMD offers people... shame, AMD... go away.



Last:

AIB versions won't help the Radeon VII's noise or heat, because it already has three good-quality fans installed, so AIB versions will be just the same.
And as everyone knows, the Radeon VII can't be overclocked; it melts!!


----------



## XXL_AI (Feb 8, 2019)

gamerman said:


> i cant belive that radeon VII is 7nm process gpu!!
> 
> 
> also intel release they 1st gpu soon, and i say,then it will be..soon nvidia vs ntel.



it won't be nvidia vs. intel imo, because intel is still meltdown vs. spectre.


----------



## evolucion8 (Feb 8, 2019)

gamerman said:


> i cant belive that radeon VII is 7nm process gpu!!
> 
> its so slow,burn out heat and has terrible noisy.
> 
> ...



What? The Radeon VII uses 30 W more than an RTX 2080 Ti; how is 30 W more a big deal? Of course it's not as good, considering the RTX 2080 Ti is a bit faster, but a GPU that uses slightly more power than the RTX 2080 Ti while being only around 18% slower on average isn't a big deal, considering the RTX 2080 Ti is twice as expensive. I think PC gaming is not your forte.


----------



## EarthDog (Feb 8, 2019)

XXL_AI said:


> it won't be nvidia vs. intel imo, because intel is still meltdown vs. spectre.


What does that mean?



evolucion8 said:


> What? The RX Vega 7 uses 30W more than a RTX 2080 Ti, how come that 30W more is big of a deal? Of course not as good considering the RTX 2080 Ti is a bit faster but having a GPU that uses slightly more power than the RTX 2080 Ti and just being around 18% slower average, isn't big of a deal considering the RTX 2080 Ti is twice more expensive. I think PC gaming is not your suit.


Who are you talking to? Quote that person unless its directly above you...so confusing...

Anyway, its performance (according to most, if not all, reviews) and market positioning (according to AMD) put it against the RTX 2080, which it draws about 90 W more than (average gaming) while generally performing slower. If you want to go off TDP, that's 225 W vs 300 W, or roughly 33% more power for generally lesser performance. You are, for some odd reason, comparing it against a card that is dozens of percent faster (39% at QHD/UHD) and that it isn't intended to compete with. Compare like things. 

(Data is from TPU, FYI.) So I'm not sure where you're seeing that the 2080 Ti is a "bit" faster... unless a 'bit' is 39%. Hell, a 'bit' isn't even 20%...
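The efficiency gap being argued over works out like this (a sketch using the round numbers quoted in this thread: the RTX 2080 as a 100% baseline at ~225 W, the Radeon VII at ~14% slower and ~300 W; the exact figures vary by review and resolution):

```python
# Performance-per-watt: relative performance divided by power draw.
def perf_per_watt(relative_perf: float, watts: float) -> float:
    return relative_perf / watts

rtx2080 = perf_per_watt(1.00, 225)   # baseline: 100% performance at ~225 W
radeon7 = perf_per_watt(0.86, 300)   # ~14% slower at ~300 W, per the review

print(f"Radeon VII efficiency vs RTX 2080: {radeon7 / rtx2080:.2f}x")
```

In other words, on these assumed numbers the Radeon VII delivers only about two-thirds of the 2080's performance per watt, which is the crux of the 7 nm disappointment.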


----------



## Super XP (Feb 8, 2019)

2080 Ti is A Lot and 2080 is a bit. Lol


----------



## Deleted member 158293 (Feb 8, 2019)

ManofGod said:


> Yeah, much better to use the Nvidia Space Invaders build quality instead.



That is the unfortunate reality of a lot of consumer-grade products: cheap components, like Nvidia's (the reference 2060 being a real low point even for them, with its hand-soldered ghetto power extension, though I guess at least it works), home routers, etc. AMD is the exception; they usually put high-grade components on their PCBs. They just really went over the top with the RVII, even for them.


----------



## evolucion8 (Feb 8, 2019)

EarthDog said:


> What does that mean?
> 
> Who are you talking to? Quote that person unless its directly above you...so confusing...
> 
> ...




I quoted gamerman, as you can see in my previous post, lol. Are you on a mobile phone? I've noticed some scaling issues with Chrome on Android.


----------



## EarthDog (Feb 8, 2019)

OMG.. that's twice now. The ignore function is KILLING me!!!!

It doesn't show you quoted anyone on my screen!!! Sorry man!!!


----------



## Super XP (Feb 8, 2019)

Let's get some facts out of the way with respect to 7nm Navi and this Radeon VII variant.
* Navi is supposed to be a *brand-new GPU* microarchitecture, *not GCN-based*.



> QUOTE: I have also been told that Navi will be a *new microarchitecture* (in other words the *first AMD Radeon uArch to not be based on GCN*).


* AMD's CEO confirms Navi for 2019, with a possible Q2 2019 release for the low- and mid-range Navi parts. 
* Higher-end Navi is looking at a Q4 2019 release schedule.
* The Radeon VII is, or was, meant for professionals, and is positioned against Nvidia's 2080. Its *huge 16GB* of memory, which no game will take advantage of, and its unnecessarily expensive components don't translate into better gaming performance per watt or per dollar. It could also be a stop-gap to test the waters at 7nm. It's not meant for everybody, nor was it meant to sell in the hundreds of thousands.
* Can the Radeon VII play games at 1080p, 1440p, or 2160p with high enough settings? Absolutely yes. It's just not really meant as a gaming card.
* The Radeon VII is AMD's fastest GPU to date. Relative to its Nvidia competition, it's simply too costly.

Now, having said that: who's excited about 7nm Navi?

I am. AMD has the upper hand in both CPU and GPU. The Zen project proved successful and forced Intel onto its knees, and AMD is also first to 7nm. I can only imagine they will want to capitalize on this as soon as architecturally possible (7nm Navi / Zen 3). That would give them a much-needed lead, seeing how the underdog AMD is battling two fierce competitors very well known for devious anti-competitive practices (hard facts).

Are you excited about 7nm Navi? And why?


----------



## cucker tarlson (Feb 8, 2019)

Super XP said:


> Lets get some facts out of the way, with respect to 7nm Navi and this Radeon VII variant.
> * Navi is suppose to be a *Brand New GPU* Micro-Architecture and *NONE gcn based*.
> 
> * AMD's CEO Confirms Navi for 2019, with a possible Q2 2019 release for low & mid range Navi's.
> ...


How sure are you that it isn't just improved GCN with a new name?


----------



## Xex360 (Feb 8, 2019)

Very disappointed. AMD should have at least matched the Vega 64's core count; beating the 2080 should have been the priority, and nothing else is acceptable. Everything about this card is bad, relatively speaking. Annoyingly (because as customers we want more competition to drive prices down), Nvidia can use a two-year-old GPU to beat it. What a waste.


----------



## Super XP (Feb 8, 2019)

cucker tarlson said:


> how sure are you that it's just not improved gcn with a new name.


Very sure, because of the source I linked. They've been right many times before.


----------



## VSG (Feb 8, 2019)

EarthDog said:


> OMG.. that's twice now. The ignore function is KILLING me!!!!
> 
> It doesn't show you quoted anyone on my screen!!! Sorry man!!!



I spoke to W1zzard about it, and this is a Xenforo limitation. Nothing he can do about it.


----------



## EarthDog (Feb 8, 2019)

At the bottom of the thread there's always the "show ignored content" link... I'll have to train myself to use it, or assume going forward that people aren't talking to thin air. 

Ty.


----------



## vMax65 (Feb 8, 2019)

Super XP said:


> Lets get some facts out of the way, with respect to 7nm Navi and this Radeon VII variant.
> * Navi is suppose to be a *Brand New GPU* Micro-Architecture and *NONE gcn based*.
> 
> * AMD's CEO Confirms Navi for 2019, with a possible Q2 2019 release for low & mid range Navi's.
> ...



And AMD isn't devious??? You said it yourself, and I quote: '* The Radeon VII is or was meant for the professional(s).' and 'Again, meant for the professional(s)'... yet AMD launched this GPU as a gaming GPU; nowhere on any of the boxes does it say 'for professionals'. They also launched an RX 560 with cut-down cores and kept the same name!!!! I know both Nvidia and Intel run the same FUD marketing and do some fairly dodgy stuff to get people to buy their products, but please don't paint AMD as the white knight, because it isn't. It's a business with shareholders. Intel has been on top only because AMD made a huge mistake on the CPU side with Bulldozer/Piledriver, etc., allowing Intel to speed away. With Zen they've gotten back to the top, and long may that continue.

As for the Radeon VII, it's not bad, and hardcore AMD fans will be more than happy; it's just two years too late, and can only just match or slightly beat the GTX 1080 Ti. The RTX 2080 at least has additional new technology in its RT cores and Tensor cores. Okay, not many games can actually use them yet, but Nvidia can at least say it has new tech, and if DXR/DLSS does take off it will only get worse for AMD, as Nvidia will have a generational lead in that new arena. And by the way, it's still at least $100 too expensive; if AMD had released the Radeon VII at $500 or even $550 it would have been great, as at least the value would have been there.

As for Navi, I pray it's good, as we so badly need proper competition, and we cannot blame Nvidia for AMD not being good enough in the GPU space. Fingers crossed Navi can compete, but boy, with the next-gen RTX cards die-shrunk and even more efficient, it's going to be very hard to beat. That's why Navi needs to be stellar.

And finally, I'm not a fan of any company, just of whichever company gives me the best all-around performance at the price I want to pay or have budgeted for. I don't care if it's AMD, Intel, Nvidia, or any other company for that matter.


----------



## Super XP (Feb 8, 2019)

vMax65 said:


> And AMD isn't devious??? You said it yourself and I quote '* The Radeon VII is or was meant for the professional(s).' and 'Again, meant for the professional(s)'.....AMD have launched this GPU as a Gaming GPU....no where does it say for 'professionals' on any of the boxes...They also launched a RX 560 with cut down cores and still called it the same name!!!! No I know both Nvidia and Intel have the same FUD marketing and do some fairly dodgy stuff to get people to buy there products but please do not say AMD is the white knight as it is not...It's a business with shareholders.....Intel have been at the top only becouse AMD on the CPU side made a huge mistake with Bulldozer/Piledriver etc, etc...Allowing Intel to speed away. With Zen they have gotten back to the top and long may this continue....
> 
> As to the Vega 7, It's not bad and hardcore AMD fans will be more than happy, it's just 2 years too late and can only just match or slightly beat the GTX 1080ti. With the RTX 2080 at least it does have additional new technology in the RT Cores and Tensor cores....okay we do not have much in terms of games that can actually use it but they can at least say we have new tech and if DXR/DLSS does take off then it will only get worse for AMD as Nvidia will have a generational lead in this new arena....and by the way, it is still at least $100 to expensive...If AMD had released the Vega 7 at say $500 or even $550 it would have been great as at least the value would have been there.
> 
> ...


I'm talking about the many times both Intel and Nvidia have done things to hurt the AMD brand.

Sure, AMD is devious to a certain extent, but it usually only harms itself. Lol.

With regards to Navi, I too hope AMD pulls a rabbit out of the hat this time. 

But I do blame Nvidia for the anti-consumer nonsense they tried to pull on AMD recently. 
Lest we forget.


----------



## SIGSEGV (Feb 9, 2019)

I still don't get why AMD always keeps raw compute power in their gaming GPUs.
Just strip that stuff out, ffs. lol


----------



## Dbiggs9 (Feb 9, 2019)

XXL_AI said:


> Because it literally is MI50 card rebranded as Radeon VII. AMD failed so hard on AI so they are trying to sell their stock as gaming gpu. I've applied for their AI platform 3 years ago and haven't heard from ever since. No one got accepted to their platform because there is no platform. amd is a troll.



If you listened to AMD's Q4 earnings call, you'd know they dodged the crypto drop thanks to much, much higher Epyc and Instinct card sales.


----------



## Kissamies (Feb 9, 2019)

evolucion8 said:


> What? The RX Vega 7 uses 30W more than a RTX 2080 Ti, how come that 30W more is big of a deal? Of course not as good considering the RTX 2080 Ti is a bit faster but having a GPU that uses slightly more power than the RTX 2080 Ti and just being around 18% slower average, isn't big of a deal considering the RTX 2080 Ti is twice more expensive. I think PC gaming is not your suit.


He's just an AMD hater; has been for over a decade on the Finnish hardware sites Muropaketti and io-tech.

And his Finnish is as "good" as his English. Just ignore that crap.


----------



## Shatun_Bear (Feb 9, 2019)

Again, W1zzard's performance summary paints the Radeon 7 in the worst possible light of all the big sites. It's like an anomaly. This is partly because of the choice of old titles, or games severely unoptimized for AMD (that's no one's fault but AMD's, but it still skews the performance figures), included in the suite.

In DX12 there is no question the Radeon 7 is on par with an RTX 2080. But when you throw in something like Dragon Quest XI, where performance is 30%+ worse (!!), or Divinity: Original Sin, Hellblade, etc., the average goes right down, leading him to conclude performance is comparable to a 2070, not a 2080. That is a joke. Sure, if you play old unoptimized DX11 games, that's the case.

In games where the drivers are there, like Battlefield V, it's faster than a 2080, and it will only continue to get faster in the coming months.


----------



## the54thvoid (Feb 9, 2019)

Shatun_Bear said:


> In games where the drivers are there, like Battlefield 5, it is faster than a 2080 and will only continue to get faster in the coming months.



Please stop and logically interpret your own statement. By implication, you're admitting AMD doesn't make good drivers for games. Ironically, DX12 removes much of that overhead. 
There's a flip side, where Nvidia focuses on a DX12 driver and gets impressive performance. Basically, what you're saying is that when the driver team tries hard, the hardware shines. That is equally true for both Nvidia and AMD.


----------



## HD64G (Feb 9, 2019)

Some dry-ice overclocking of the Radeon 7 in a new video from der8auer shows clocks reaching 2149 MHz. He suggests it's achievable and the only problem is driver immaturity. So, AMD made a bad launch again due to the drivers not being ready.


----------



## efikkan (Feb 9, 2019)

Super XP said:


> * Navi is suppose to be a *Brand New GPU* Micro-Architecture and *NONE gcn based*.


You know this is a piece of news from Wccftech itself, which is mostly one guy sitting in a basement. This is the same "source" that generated contradictory info on Nvidia Turing right up to a couple of weeks before release. I never trust news Wccftech "sources" themselves.
So I would appreciate a real source, or at least a more reputable one.



Super XP said:


> * AMD's CEO Confirms Navi for 2019, with a possible Q2 2019 release for low & mid range Navi's.


I don't doubt their internal target is/was end of Q2 2019 for the smaller Navi chip(s). But if this was still true, then normally they should have fully working engineering samples by now, and if they did, they certainly would have showed them off at CES. There is still a marginal chance their next stepping is solving all the problems and good enough for volume production, but don't be surprised if the launch slides into August or later.



Super XP said:


> * The Radeon VII is or was meant for the professional(s). And tied against the 2080 Nvidia GPU.


Not true in general gaming, and you know it.



Super XP said:


> The AMD ZEN project proved successful and forced Intel down on its knees. AMD is also 1st with 7nm. I can only imagine they will want to capitalize on this as soon as architecturally possible (7nm Navi / Zen 3).


We all appreciate AMD's return to relevance in the CPU market, and I believe Zen 2 will expand their foothold in both the desktop and server markets.
But we still have to remember that AMD is playing catch-up with Intel in terms of core speed, and that Ice Lake (Sunny Cove) has been ready for two years, just waiting for a suitable node. So AMD's return to relevance is not just their own achievement; it owes at least as much to Intel struggling with its new node. Had Intel foreseen those issues and made a backup plan of backporting Ice Lake to 14 nm instead of iterating on Coffee Lake (their actual backup plan), Zen and Zen 2 would have faced much tougher competition. So a lot of AMD's success comes down to Intel's misfortune, and that is not something you can assume will continue.


----------



## rtwjunkie (Feb 9, 2019)

Shatun_Bear said:


> Again, Wizzard's performance summary paints Radeon 7 in the worst possible light out of all the big sites. It's like an anomaly. This is partly because of the choice of old titles or severely unoptimized games for AMD (this is no-one but AMD's fault,


I’m curious why the mix of titles is a bad thing.  Will a Radeon 7 user only play games in which the game was optimized for AMD or made in DX12?  With my last AMD card I didn’t decide to not play certain games just because they work better with Nvidia cards. 

I really don’t understand why then we shouldn’t see a performance summary rating, since it includes a wide variety of games played.


----------



## HD64G (Feb 9, 2019)

rtwjunkie said:


> I’m curious why the mix of titles is a bad thing.  Will a Radeon 7 user only play games in which the game was optimized for AMD or made in DX12?  With my last AMD card I didn’t decide to not play certain games just because they work better with Nvidia cards.
> 
> I really don’t understand why then we shouldn’t see a performance summary rating, since it includes a wide variety of games played.


In fact, some games in @W1zzard 's review are heavily skewed toward Nvidia GPUs (at 1440p, Darksiders 3 gives a 30% difference, Civ VI 28%, Dragon Quest XI 43%, and Hitman 2 35%). On the other side, only Strange Brigade gives a 10% difference in the Radeon 7's favour. Without those games, the difference on average would be 9% at 1440p and 5% at 4K. The 2080 would win for sure, but not by that far. Guru3D's average results are exactly the ones you get without those skewed games. Some DX12 games are also absent (Sniper Elite 4). I am not judging the professionalism of our @W1zzard; I am just suggesting some changes to his game list that would make the results better balanced.
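The sensitivity being argued here is easy to demonstrate (a sketch with hypothetical per-game numbers shaped like the deficits quoted above; each value is Radeon VII performance relative to the RTX 2080, so 0.70 means 30% slower, and the "Game N" entries stand in for the rest of a suite that is roughly tied):

```python
from statistics import geometric_mean

# Hypothetical per-game relative performance (Radeon VII / RTX 2080).
OUTLIERS = {
    "Darksiders 3":    0.70,   # ~30% behind, per the post
    "Civilization VI": 0.72,
    "Dragon Quest XI": 0.57,
    "Hitman 2":        0.65,
    "Strange Brigade": 1.10,   # ~10% ahead
}
# Stand-ins for the remainder of the suite, roughly tied:
REST = {f"Game {i}": 0.95 for i in range(1, 16)}

full = geometric_mean(list(OUTLIERS.values()) + list(REST.values()))
trimmed = geometric_mean(REST.values())

print(f"All games:        {full:.3f}")
print(f"Outliers removed: {trimmed:.3f}")
```

A geometric mean over per-game ratios is the usual way to build such a summary; even so, a handful of extreme titles in a twenty-game suite visibly drags the overall figure, which is the whole dispute about game selection.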


----------



## XXL_AI (Feb 9, 2019)

EarthDog said:


> What does that mean?
> 
> Who are you talking to? Quote that person unless its directly above you...so confusing...
> 
> ...



I don't trust intel.


----------



## londiste (Feb 9, 2019)

HD64G said:


> In fact, some games in @W1zzard 's review are heavily biased for nVidia GPUs (Darksiders 3 gives 30% diff, Civ6 gives 28% diff, Dragon Quest XI gives 43% diff and Hitman 2 gives 35% diff @1440P). On the other side, only Strange Brigade gives 10% diff for Radeon 7. Without those games, the difference on average would be 9% @1440P and 5% in 4K. The 2080 would win for sure but not by that far. Guru3D's review results on average are exaclty the ones without those biased games.


What exactly makes Battlefield, DX:MD or Strange Brigade a less biased game? 
Hitman used to be a very AMD-favorable benchmark up until some point, Hitman 2 is a direct sequel.


----------



## Shatun_Bear (Feb 9, 2019)

the54thvoid said:


> Please stop to logically interpret your own statement. By implication, you're admitting AMD don't make good drivers for games. Ironically, DX12 removes much of that overhead.
> There is a flip side to this where Nvidia focus on a DX12 driver to get impressive performance. Basically what you're saying is, when the driver team tries hard, the hardware shines. This is equally true for both Nvidia and AMD.



Nope, it's well known that Nvidia optimised Turing heavily for the Battlefield engine, specifically BFV DX12. Both vendors have done that, and the Radeon 7 is still faster.

And don't be silly: Nvidia has poured far more money and effort into its DX12 drivers than AMD, and it's still behind in lots of titles.


----------



## Super XP (Feb 9, 2019)

efikkan said:


> You know this is a piece of news from Wccftech itself, which is mostly one guy sitting in a basement. This is the same "source" who generated contradictory info on Nvidia Turing all the way up to a couple weeks ahead of release. I never trust the news Wccftech "source" themselves.
> So I would appreciate a "real" source, or at least a more reputable one.
> 
> 
> ...



I think you are Way Off.
Wccftech is a reputable site. Claiming otherwise is simply your opinion.  They've been 1st to break news many times over again. I also hold Fudzilla with the same high regard.

In DX12 the Radeon VII is on par with the 2080 no matter what anybody says. Unless dozens of review sites are lying.

With regards to Navi, those words came from AMD's CEO. And look at the time stamp; it wasn't that long ago.

Of course in the end, it's all speculation till the hard facts come out.


----------



## HD64G (Feb 9, 2019)

londiste said:


> What exactly makes Battlefield, DX:MD or Strange Brigade a less biased game?
> Hitman used to be a very AMD-favorable benchmark up until some point, Hitman 2 is a direct sequel.


Only the previous Hitman title in its DX12 mode was clearly better on AMD GPUs. Hitman: Absolution and Hitman 2 are much better on nVidia GPUs. Check this site's past reviews and you will find that out.


----------



## Super XP (Feb 9, 2019)

Shatun_Bear said:


> Nope, it's well known that Nvidia optimised Turing heavily for the Battlefield engine, specifically BF5 DX12. They've both done that - and the Radeon 7 is still faster.
> 
> And don't be silly - Nvidia have invested far more money into their DX12 drivers than AMD and they're still behind in lots of titles.


Don't confuse Nvidia's shady business practices with investments. And how terribly they treat their so-called partners.


----------



## efikkan (Feb 9, 2019)

Super XP said:


> I think you are Way Off.
> Wccftech is a reputable site. Claiming otherwise is simply your opinion. They've been first to break news many times over. I also hold Fudzilla in the same high regard.


Then I would kindly suggest you check your compass.
Wccftech, Videocardz and Fudzilla are known as low-quality sources. That's based on their reporting, not my opinions. The only time they are on top of real news is when they are referring to others. Wccftech has served up so much false information just in the past year that you shouldn't take any of their "own sources" seriously any more.



Super XP said:


> In DX12 the Radeon VII is on par with the 2080 no matter what anybody says. Unless dozens of review sites are lying.


AMD is glorious no matter what the evidence says…
Still spinning the "DirectX 12 is better for AMD" myth… Also ignoring the games where AMD performs worse in DirectX 12 than in 11. We are still waiting for games which use DirectX 12 without an abstraction layer to emulate DirectX 11, and what we see in games is largely the result of bias, much of it due to ports from consoles.


----------



## EarthDog (Feb 9, 2019)

Super XP said:


> Wccftech is a reputable site.


Bwaaaaaaahahahahahahahahalololol. No. Just no.... it isn't. It's a running joke with them, along with Fraudzilla... oops, Fudzilla.


----------



## Super XP (Feb 9, 2019)

efikkan said:


> Then I would kindly suggest you check your compass.
> Wccftech, Videocardz and Fudzilla are known as low-quality sources. That's based on their reporting, not my opinions. The only time they are on top of real news is when they are referring to others. Wccftech has served up so much false information just in the past year that you shouldn't take any of their "own sources" seriously any more.
> 
> 
> ...


I think you are confused. I never said AMD is better in DX12. I clearly said review sites show the VII does better in DX12 and stays on par with the 2080. You clearly must have missed a few of my previous posts. 

With regards to Wccftech and Fudzilla we will agree to disagree.


----------



## delshay (Feb 9, 2019)

This is a copy & paste of my posting in a Reddit Thread.

HOLD ON, THAT'S WRONG. You cannot increase mounting pressure in the location where the washer sits in the photo. Those tiny springs are just there to hold the screws so they don't fall out, but they can also act as a stopper. The mounting pressure comes from the way the X bracket is made near the centre. So to increase mounting pressure, the washer needs to be placed on the screw first, then go through the X bracket. The current placement of the washer in the photo will reduce mounting pressure.

Adding a washer on the screw first will also make no difference whatsoever, as there should be a stopper at the bottom of each screw hole, on the X bracket itself or on the cooler, so that you can't over-tighten the screws.

The only way to increase mounting pressure is to increase the height of the pads (normally black rubber) on the inner part of the X bracket. You can see these on the underside of the X bracket, which looks like it has four rubber pads, normally glued to it. This is what it looks like on my R9 Nano, & I expect most cards that use the X bracket will be more or less the same.

EDIT: Another way is to file down the stopper at each screw hole on the X bracket or on the cooler, & you may have to remove all four springs, as these can also be your limiter.

NOTE: The stopper could be the spring itself, or it could be built into the cooler at the screw hole, which means you would have to cut this down. None of these modifications are recommended, but the safest one is changing the four innermost rubber pads.


----------



## EarthDog (Feb 9, 2019)

Super XP said:


> With regards to Wccftech and Fudzilla we will agree to disagree.


Flat earthers agree to disagree with those who know the earth is round as well... 

 Be woke.


----------



## Shambles1980 (Feb 9, 2019)

omg.. its bad enough the fan boy bickering, but do we really need to drag flat earthers in to this ??


----------



## 64K (Feb 9, 2019)

EarthDog said:


> OMG.. that's twice now. The ignore function is KILLING me!!!!
> 
> It doesn't show you quoted anyone on my screen!!! Sorry man!!!



Well, stop putting people on ignore man. 

I bought me one of these new-fangled mouses with a scroll wheel. That works well for me.


----------



## Super XP (Feb 9, 2019)

EarthDog said:


> Flat earthers agree to disagree with those who know the earth is round as well...
> 
> Be woke.


But the earth is actually Flat lol


----------



## vMax65 (Feb 9, 2019)

Super XP said:


> But the earth is actually Flat lol


Oh come on, everyone knows it's a cube shape....duh


----------



## EarthDog (Feb 9, 2019)

The latest from that camp is donut shaped....


...no seriously.


(But we're getting OT here, ha)


----------



## efikkan (Feb 9, 2019)

The biggest contribution Radeon VII makes to the gaming market is justifying the price of graphics cards. Many feel the RTX 2080 is a little too expensive, but it will only look better now compared to this card.


----------



## Shatun_Bear (Feb 9, 2019)

rtwjunkie said:


> I’m curious why the mix of titles is a bad thing.  Will a Radeon 7 user only play games in which the game was optimized for AMD or made in DX12?  With my last AMD card I didn’t decide to not play certain games just because they work better with Nvidia cards.
> 
> I really don’t understand why then we shouldn’t see a performance summary rating, since it includes a wide variety of games played.



Well, I should be more clear. My problem is the numerous people here forming the conclusion that the Radeon 7 is 'terrible', 'embarrassing' or 'a disaster' because it gets 123 fps in F1 2018 and the 2080 gets 135 fps. Or similar examples. It's not their best card, but it's not terrible. Like I said, it goes toe to toe with the 2080 in the vast majority of DX12 titles.

People should cut AMD more slack. They have about a third of the budget of Nvidia, yet their re-purposed MI60 still offers 2080-like performance outside of older DX11 titles. Do you not want more competition in the graphics card market? I find it hard to understand this ultra-critical attitude towards their products when they are doing us all a favour, literally, by being there and competing with the giant that Nvidia has become.


----------



## EarthDog (Feb 9, 2019)

I don't think anyone compared it like that, mentioning fps in detail. But everything needs context. Surely it's a capable card at 1440p, but that doesn't change the fact that for the same price you have a lot more power use, a noisier cooling solution, and slower performance overall (varies from a few percent to the teens depending on the review site and titles used).

People are used to AMD offering the best performance for the buck, and this offering couldn't be further from that. It's a solid improvement over the V64 and, finally, a competitor in the high-end space. If you need compute abilities, it's a winner hands down. If you don't, which the majority do not, then it has to be compared on its gaming abilities (where AMD has marketed the card). With drivers having issues out of the gate (it's GCN... not new), it wasn't a smooth launch.


----------



## moproblems99 (Feb 9, 2019)

EarthDog said:


> donut shaped


Donut hole right?



Shatun_Bear said:


> People should cut AMD more slack



Honestly, no they shouldn't.  It is no one's fault but AMD's that they bought ATI, likely knowing they couldn't afford it.  Whether it was a good choice or not will be determined by whether they ever crawl out of this GPU hole.


----------



## efikkan (Feb 9, 2019)

We shouldn't cut AMD more slack, we should strive for being unbiased and judge them fairly compared to their competition.

When it comes to the buggy launch driver, I will reserve judgement for a few days to see if this was a last-minute glitch or if the driver is generally bad. I have a deep understanding of how nasty bugs can sneak into last-minute "fixes" of software, but such problems should be easily fixed once identified.


----------



## Shatun_Bear (Feb 9, 2019)

moproblems99 said:


> Donut hole right?
> 
> 
> 
> Honestly, no they shouldn't.  It is no one's fault but AMD they bought Ati likely knowing they couldn't afford it.  Whether it was a good choice or not will be determined if they ever crawl out of this GPU hole.





efikkan said:


> We shouldn't cut AMD more slack, we should strive for being unbiased and judge them fairly compared to their competition.
> 
> When it comes to the buggy launch driver, I will reserve judgement for a few days to see if this was a last minute glitch in the driver, or if the driver is generally bad. I have a deep understanding for how nasty bugs can sneak into last minute "fixes" of software, but then such problems should be easily fixed once identified.



Exactly, and judging them fairly doesn't mean you conclude that this card is terrible, or a disaster, which seems to be your thinking. Judging them fairly - your words - would be concluding this is a good card but is not as good value as a 2080.

You also cannot forgo context in this market. The fact is, they're competing with a company with a vastly bigger budget that rakes in billions of profit each quarter, whereas AMD turns in $250m for the entire year if they're lucky. I don't care what you say, that needs to be considered *in a conclusion about a product*. This 'let's just deal with the card in front of us, ignore all context, be ultra-critical of it' attitude is stupid. If a start-up company made a CPU almost as good as Intel's fastest, would you conclude 'Nah, this is terrible, it's not as good as Intel's best, not worth it'? No, that would be moronic, wouldn't it? That's an extreme example of your stance.


----------



## franc bruschetta (Feb 9, 2019)

lynx29 said:


> What a joke, Lisa Su lost all respect from me for allowing this product to be released. I was considering a 3700x, but screw it. 9700k and rtx 2080 is probably my next build.
> 
> really disappointing, 30 fps slower across the board at 1080p almost... and using 100 watts more... wow so pathetic...



No... 16 fps slower. You need to use a calculator if you don't know how to do math.


----------



## efikkan (Feb 9, 2019)

Shatun_Bear said:


> Exactly, and judging them fairly doesn't mean you conclude that this card is terrible, or a disaster, which seems to be your thinking. Judging them fairly - your words - would be concluding this is a good card but is not as good value as a 2080.


Judging it fairly is considering the facts without bias, and even calling something terrible if it is, but I didn't call this one terrible.



Shatun_Bear said:


> You also cannot forego context in this market. <snipped babble>. If a start-up company made a CPU almost as good as Intel's fastest, would you conclude 'Nah this is terrible, it's not as good as Intel's best, not worth it'? No, that would be moronic, wouldn't it? That's an extreme example of your stance.


Nice attempt at a _reductio ad absurdum_.

The problem with Radeon VII is what it offers, or the lack thereof, for the gaming market. When a product has some disadvantages, it should compensate with other advantages, justifying its market position. When you have products A and B, and A is equal to or better than B in every practical way, why would you ever buy B? And coming up with the argument "but B is almost as good in these use cases" is not good enough; when there is a clear winner, why would you choose something else?

Regardless of how you try to avoid reality, Radeon VII performs somewhere in the middle between the RTX 2070 and RTX 2080. That would be fair enough if it were priced accordingly or gave some other major benefits. But unfortunately it's priced like the RTX 2080, and has significantly worse thermals and noise. Even when completely ignoring the raytracing feature, Radeon VII still remains an inferior choice compared to the RTX 2080 and RTX 2070, and that's a fact, not my opinion.


----------



## franc bruschetta (Feb 9, 2019)

The Radeon VII clearly beats the RTX 2070 in the performance summary, so why, at this link, https://www.techpowerup.com/gpu-specs/radeon-vii.c3358 , are both shown as equal? The correct figures are 100% for the Radeon VII and 94% for the RTX 2070.


----------



## vMax65 (Feb 9, 2019)

Shatun_Bear said:


> Exactly, and judging them fairly doesn't mean you conclude that this card is terrible, or a disaster, which seems to be your thinking. Judging them fairly - your words - would be concluding this is a good card but is not as good value as a 2080.
> 
> You also cannot forego context in this market. The fact is, they're competing with a company with a vastly bigger budget who rake in billions of profit each quarter whereas AMD turn in  $250m for the entire year if they're lucky. I don't care what you say that needs to be considered* in a conclusion of a product.* This 'let's just deal with the card in front of us', ignore all context, be ultra critical of it is stupid. If a start-up company made a CPU almost as good as Intel's fastest, would you conclude 'Nah this is terrible, it's not as good as Intel's best, not worth it'? No, that would be moronic, wouldn't it? That's an extreme example of your stance.



Look, I get what you are saying, especially about context, but this is not a start-up or someone new to the GPU business; it's an established player that has been around since near the beginning as ATI. The fact is they brought out a GPU at $699 that runs hotter, drinks more power and only just matches or slightly exceeds a GPU from two years ago, the GTX 1080 Ti, and remember the 1080 Ti was manufactured on a 16nm process as opposed to this 'first ever' 7nm gaming GPU from AMD. As importantly, it is coming up against a competitor in the Nvidia RTX 2080, which was slated across the board for its price (and rightly so), yet the RTX 2080 runs faster, cooler and more efficiently on a 12nm process than the world's first 7nm GPU, and with RTX, Nvidia at least has an excuse for the price hike in that it has genuine new technology in the RT and Tensor cores, however relevant they are right now.

I think competition is so, so important to drive technology forward and to bring prices down, so I do want AMD to succeed. But let's just be honest for a moment, as we can only compare this product to its competition, and in the context of gaming, since it was released as a gaming GPU. Sadly it falls a bit short. I have to admit I am surprised at AMD for releasing this GPU, especially with the launch having a less than perfect start... Bad drivers and whatever else you want to say do not help, and again, how is this possible when it is still based on GCN from years ago? First impressions do count... maybe not to the hardcore, but they do to the general public, who are not stupid.


----------



## EarthDog (Feb 9, 2019)

Shatun_Bear said:


> Judging them fairly - your words - would be concluding this is a good card but is not as good value as a 2080.


But there is more to it for many than value. Noise, performance, power use/heat mitigation... these are not inconsequential things.

Few called it terrible... just about as many can't call a spade a spade either. It just depends on the user as to what they are looking for out of a GPU. Many find that trio a deal-breaker, especially at the same price point. Add to that the driver issues out of the gate, and it's pretty easy to see why many are very disappointed in the release.


----------



## Shatun_Bear (Feb 9, 2019)

efikkan said:


> Judging it fairly is considering the facts without bias, and even calling something terrible if it is, but I didn't call this one terrible.
> 
> 
> Nice attempt on a _reductio ad absurdum_.
> ...



Oh come on, you mention RTX features that no-one can use yet you choose to ignore the three free games that come with the Radeon 7. That's not fair, is it? At worst you'll get $50 for selling those so that takes the price down to $650.

But you're being silly, I said the 2080 is a better choice. The Radeon 7 is overpriced, yes, I would also agree. But the reaction of some of you is over the top to say the least.


----------



## EarthDog (Feb 9, 2019)

Shatun_Bear said:


> Oh come on, you mention RTX features that no-one can use yet you choose to ignore the three free games that come with the Radeon 7.


Anyone with an RTX card and BF V can use them... more titles, AAA titles, come out this year. 

Some Nvidia GPUs offer a free game as well.


----------



## efikkan (Feb 9, 2019)

Shatun_Bear said:


> Oh come on, you mention RTX features that no-one can use…


I beg your pardon, did you even read my post?
I said I was ignoring the raytracing feature, not giving AMD any disadvantage here, which in fact is giving AMD the benefit of the doubt.



Shatun_Bear said:


> …yet you choose to ignore the three free games that come with the Radeon 7. That's not fair, is it? At worst you'll get $50 for selling those so that takes the price down to $650.


Nvidia and board partners have game bundles all the time.
Game bundles are only worth something if they offer a game you were going to buy anyway. And don't pretend you can just sell them and get the value, you'll be lucky to get 10 cents on the dollar.


----------



## Deleted member 158293 (Feb 9, 2019)

efikkan said:


> The biggest contribution Radeon VII makes to the gaming market is justifying the price of graphics cards. Many feel the RTX 2080 is a little too expensive, but it will only look better now compared to this card.



Indeed!

With a small two-company video card oligopoly, if these companies decide to raise prices permanently, their word is law.


----------



## Shatun_Bear (Feb 9, 2019)

efikkan said:


> I beg your pardon, did you even read my post?
> I said I was ignoring the raytracing feature, not giving AMD any disadvantage here, which in fact is giving AMD the benefit of the doubt.
> 
> 
> ...



It's not just one game, it's three, and yes, I know they have no value if you don't want them (of course!), which is why I said you can sell them. 

Your reaction to this card earlier was over the top; that was my argument:



efikkan said:


> The biggest contribution Radeon VII makes to the gaming market is justifying the price of graphics cards. Many feel the RTX 2080 is a little too expensive, but it will only look better now compared to this card.



I mean if you think that is the biggest contribution of this card.... The biggest contribution of this card is that it's the fastest AMD card on the market or a viable upgrade for Freesync owners.


----------



## Space Lynx (Feb 9, 2019)

franc bruschetta said:


> no...16fps slower, you need to use a calculator if you don t know how to do math



Incorrect, there are a couple that do close to 30 fps lower at 1080p.


----------



## os2wiz (Feb 9, 2019)

I strongly criticize the reviewer for his selection of games being heavily weighted against AMD. Almost every game is DX11, when everyone knows DX12 games run better on AMD graphics. Other reviewers were more balanced with their game selections. Hardware Canucks and Hardware Unboxed found the Radeon 7 within 4 to 5% of the RTX 2080 and definitely faster than the 1080 Ti in most games. Poor review with skewed results. A hack job.

Der Bauer has shown that overclocking on the Radeon 7 is hampered by a faulty driver. Until the driver is corrected, overclocking cannot be properly evaluated.


----------



## M2B (Feb 9, 2019)

Nobody buys a GPU just to play the latest AAA games; games such as Dragon Quest XI are a good indicator of performance in lesser-known titles:






Look how badly AMD does at Ace Combat 7, which is a UE4 title; Nvidia is basically miles ahead of AMD when it comes to lesser-known or older titles.
Go bash AMD instead of the reviewer.


----------



## Xzibit (Feb 9, 2019)

M2B said:


> Look how bad AMD sucks in Ace Combat 7, *which is a UE4 title, nvidia is basically miles ahead* of AMD when it comes to less known or older titles.
> Go bash AMD instead of the reviewer.



I would expect as much given their partnership


			
Tim Sweeney said:

> "*Epic developed Unreal Engine 4 on NVIDIA hardware*, and it looks and runs best on GeForce."


----------



## efikkan (Feb 9, 2019)

A wide selection of games should be a requirement for any review, as it evens out some of the outliers. It also tends to cancel out some of the shader tweaks both vendors are doing ("optimizations" as they call them, but really cheating). So a good selection of scalable games can only be positive to illustrate the _true_ and unbiased performance of a card, which is also the only metric that gives us an indication useful for predicting future games.

I wouldn't worry about game engines like Unreal being biased; if anything it's the opposite, as it's one of the most bloated engines there is. Many of the games using it are not scalable enough.


----------



## M2B (Feb 9, 2019)

efikkan said:


> Many of the games using it are not scalable enough.



What exactly do you mean by "scalable"? Please explain.


----------



## efikkan (Feb 9, 2019)

M2B said:


> What do you exactly mean by "scalable"? Explain please.


As you know, some games simply don't benefit from higher performance. I would consider benchmarking such games irrelevant, unless the point is to check if the game runs or not.


----------



## purecain (Feb 9, 2019)

Is everyone thinking of putting the card straight under water? I wonder how the card reacts.

There's a lot of fun to be had experimenting with this card. I hope Crossfire is working... 

PS: I also think the dodginess of the card's drivers etc. is all part of the AMD experience. I used to get really excited over getting good frames and decent benchmarking results...


----------



## M2B (Feb 9, 2019)

efikkan said:


> As you know, some games simply don't benefit from higher performance. I would consider benchmarking such games irrelevant, unless the point is to check if the game runs or not.



As for Unreal Engine 4, all I see here is good scalability across the different segments/generations of cards; Turing is exceptionally good at UE4 titles, and the RVII provides a significant performance jump over the Vega 64, too.
Recent Assassin's Creed games are not that good at scalability but, honestly, it doesn't matter to the general reader; all they want to see is performance in the latest AAA titles.


----------



## rtwjunkie (Feb 9, 2019)

os2wiz said:


> I strongly criticize reviewer for his selection of games being heavily weighted against AMD. Almost every game is DX11, when everyone knows DX 12 games run better on AMD graphics.


So AMD users only play DX12 games?  If they don’t, then sit down and be quiet.  It is a selection that covers many of the games played by AMD and Nvidia users, because gamers are gamers. 

Gamers play games based on fun factor, not whether it has been tuned to their species of GPU.  Your complaint against W1zzard is silly, frivolous, and even ludicrous.


----------



## notb (Feb 9, 2019)

Shatun_Bear said:


> Again, Wizzard's performance summary paints Radeon 7 in the worst possible light out of all the big sites. It's like an anomaly.


Mathematically speaking, because there is a finite set of review sites, one of them had to be the one that puts AMD in the worst light.
So actually it's not an anomaly, but a necessity.


> This is partly because of the choice of old titles or severely unoptimized games for AMD (this is no-one but AMD's fault, but still, it skews the performance figures) included in the suite.


No, it doesn't.
You know which games were used and which settings - you have the data you need to make a conscious decision as an *intelligent* human being.

An issue would arise if we would not know which games were used or how well these cards performed in each of them.


> That is a joke. Sure, if you play old unoptimized DX11 games, that's the case.


So you'll forbid us to play old games now? Seriously?


> In games where the drivers are there, like Battlefield 5, it is faster than a 2080 and will only continue to get faster in the coming months.


So there's a clear recommendation from user @Shatun_Bear : if you need a card today, buy something else. If you can wait until Navi arrives, buy Radeon VII. Oops...


rtwjunkie said:


> I’m curious why the mix of titles is a bad thing.  *Will a Radeon 7 user only play games in which the game was optimized for AMD* or made in DX12?  With my last AMD card I didn’t decide to not play certain games just because they work better with Nvidia cards.


*Actually, that quite possibly is correct.*
Do we still remember Ashes of the Singularity? As a game: not so good. As a benchmark convincing you that you made a good choice buying AMD: perfect.
And this game wouldn't have been remotely as popular as it was, if not for strong interest from AMD fans.
In fact, in one of the discussions on this forum, a known AMD supporter said he would buy and play this game only to get the feeling he was using the whole potential of his hardware. 


> I really don’t understand why then we shouldn’t see a performance summary rating, since it includes a wide variety of games played.


Because some people don't understand statistics, and calculating any aggregate always makes some people confused and enraged.
It's like those people who constantly moan that the average salary is stupid because more than half of the population earns less.
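The salary example works exactly that way; here is a toy illustration (figures invented):

```python
import statistics

# Invented salary figures (in thousands) with one high outlier.
salaries = [30, 32, 35, 38, 40, 45, 50, 60, 250]

mean = statistics.mean(salaries)      # pulled upward by the outlier
median = statistics.median(salaries)  # the "typical" earner
below_mean = sum(1 for s in salaries if s < mean)

print(f"mean={mean:.1f}, median={median}, {below_mean}/{len(salaries)} earn below the mean")
```

With any right-skewed distribution, the mean sits above the median, so most values fall below the "average" by construction. That's not a flaw in the aggregate; it's a property of the data.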
@W1zzard We totally appreciate the work you're doing processing the data and preparing the final presentation. But have you ever considered just supplementing it with the raw measurements?
Wouldn't that be an awesome new phenomenon on PC review sites?


HD64G said:


> I am suggesting him some changes in his gamelist that would make the results better balanced.


If you're suggesting games based on whether they hurt AMD or not, you're actually adding some bias AMD supporters are so worried about all the time.
For an ideally unbiased set of games, we would have to take all the titles that exist and draw a few at random (a random sample). I doubt that would make the review more sensible.
So maybe a popularity approach? For example: get the 20 most popular demanding games from Steam?
It could be biased, but would you agree it's at least representative? Well.. Civ VI would certainly be on that list...
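Both selection schemes are trivial to sketch; the catalogue and player counts below are invented stand-ins, not real Steam data:

```python
import random

random.seed(42)  # reproducible demo

# Hypothetical catalogue and player counts -- illustrative only.
all_games = [f"game_{i:03d}" for i in range(200)]
players = {g: random.randint(1_000, 1_000_000) for g in all_games}

# "Ideally unbiased": a uniform random sample of everything that exists.
random_pick = random.sample(all_games, 20)

# "Popularity" approach: the 20 most-played titles.
popular_pick = sorted(all_games, key=players.get, reverse=True)[:20]
```

A uniform sample is unbiased but mostly obscure; the popularity pick is representative of what people actually play, at the cost of whatever vendor bias popular titles carry.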


Shatun_Bear said:


> And don't be silly - Nvidia have invested far more money into their DX12 drivers than AMD and they're still behind in lots of titles.


If Nvidia was ahead in everything, I guess we wouldn't be having this discussion, right?
But yeah, it's quite possible that if you spend more money on drivers, software may work better on your hardware. AMD should try that.


Shambles1980 said:


> omg.. its bad enough the fan boy bickering, but do we really need to drag flat earthers in to this ??


It's almost certain that flat-earthers are the dumbest representatives of our species. So yeah, I think it's worth mentioning from time to time.

Imagine you don't know what a flat-earther is, and one day your daughter tells you that she has a boyfriend who plays the piano and is a flat-earther. And you think: "hmm, he plays the piano - he'll make a great son-in-law one day!"


----------



## Shambles1980 (Feb 10, 2019)

notb said:


> Imagine you don't know what a flat-earther is and one day your daughter tells you that she has a boyfriend who plays the piano and is a flat-earther. And you think: "hmm, a he plays the piano - he'll make a great son-in-law one day!"



You can change the word "flat-earther" into any religion you want, and then it's a harder stance to take, although probably equally valid. It still has no place in a tech forum discussion.


----------



## RichF (Feb 10, 2019)

sepheronx said:


> I think people are being way too....aggressive over this.
> 
> We knew it wont be that amazing.  It isn't a bad card either.  It is just loud, and over volted.  Clearly it shows that the voltage can be reduced considerably.  Noise is something I cant stand.


Which is it? Not bad or something you can't stand?

The undervolting solution is based on speculation. Just because some cards can be undervolted quite a bit doesn't mean they all can, unless you or anyone else has solid evidence to support that speculation.

And, if that speculation is actually correct then AMD should be pummelled for gross incompetence — by sending out review samples that were grossly overvolted and/or selling such cards to consumers.

Also, since there has been talk about Navi's release date, here is the latest from the rumour mill:

AMD Navi 7nm GPUs Based Radeon RX Graphics Cards Reportedly Delayed Till October 2019
https://wccftech.com/amd-navi-7nm-radeon-rx-gpus-delayed-q4-2019/



			
WCC article said:

> Another reason could be that the Navi GPUs are also allegedly going to be featured in next-gen consoles and AMD would want to dedicate a good chunk of supply for those before they ship the chips out to AIBs for production of desktop-based Radeon RX  graphics cards.


Yeah, AMD is really working for PC gamers, if this is true. 

I've told people many times that the console scam* hurts everyone. If this rumour is accurate then it's even more sharp-and-stinking evidence to back that.

*Selling low-end PCs with artificially redundant software walled gardens is a scam. They're not consoles — they are cheap PCs that aren't compatible, intentionally and for nothing but an anti-consumer purpose, with the PC gaming software standard. Something like the Switch, which uses novel form factors and usage modes, is a different story. Sony and MS "consoles" could very easily be eliminated from the market and replaced by a Vulkan + OpenGL on Linux universal PC gaming platform. It's not necessary to pay the Windows tax.


----------



## sepheronx (Feb 10, 2019)

You should learn to differentiate between performance and noise.

I'm sensitive to noise. But I'm willing to accept variations in performance and power use.

Maybe for some I should have specified that?


----------



## RichF (Feb 10, 2019)

sepheronx said:


> You should learn to differentiate between performance and noise.
> 
> I'm sensitive to noise. But I'm willing to accept variations in performance and power use.
> 
> Maybe for some I should have specified that?


You can't have your cake and eat it, too.

Claim 1: "It's loud. Noise is something I can't stand."
Claim 2: "It's not a bad card."

This is obvious so I'm not going to spend more time on it.


----------



## GoldenX (Feb 10, 2019)

2019, the year of the shill. RTX Shills, Radeon 7 Shills, Intel i9 Shills, Zen Shills.


----------



## sepheronx (Feb 10, 2019)

RichF said:


> You can't have your cake and eat it, too.
> 
> Claim 1: "It's loud. Noise is something I can't stand."
> Claim 2: "It's not a bad card."
> ...



I wasn't aware you made the rules regarding how I may judge the card. So it's either good or bad based on every single criterion? So noise being bad, and mentioned as such, still means the card is bad overall, even though one can maybe work around the noise and enjoy the card's performance if that performance is good? It sounds more like you are being pretentious.

May be obvious to you.  Maybe not to others.  I will take your constructive criticism into consideration next time I decide to measure how good or bad a device is.


----------



## os2wiz (Feb 10, 2019)

M2B said:


> Nobody buys a GPU just to play latest AAA Games, games such as Dragon Quest Xi are a good indicator of the performance in lesser known titles:
> 
> View attachment 116129
> 
> ...


No, the developers should be trashed. AMD doesn't have the deep pockets, like Nvidia, to pay every developer to optimize their products for AMD hardware. The manuals and software tools needed to code the optimizations are available for free. But the lazy f--ks will not do it unless AMD slips them cash and sends a consultant to help them with the task.


----------



## c2DDragon (Feb 10, 2019)

This card is a joke.
I'm so comforted that I didn't wait for Volta (nVidia's unicorn) and bought a 1080 Ti at the start of 2018 when prices were back to launch levels.
Right now, if you want to upgrade, what do you have? Overpriced failures vs overpriced fake innovations.
AMD shouldn't have bought ATI. I can't remember a well-performing card with good power control from the red team since it's been called AMD...
Intel should launch into the dedicated GPU market; that could be a not-so-bad thing from a consumer point of view, I think.


----------



## HD64G (Feb 10, 2019)

notb said:


> If you're suggesting games based on whether they hurt AMD or not, you're actually adding some bias AMD supporters are so worried about all the time.
> For an ideally unbiased set of games, we would have to take all the titles that exist and draw few (random sample). I doubt that would make the review more sensible.
> So maybe a popularity approach? For example: get the 20 most popular demanding games from Steam?
> It could be biased, but would you agree it's at least representative? Well.. Civ VI would certainly be on that list...


I am simply suggesting not having 3 games using the same engine (Unreal engine 4) with 2 of them showing much worse performance for a specific brand (Only Senua's Sacrifice is optimised well for both AMD and nVidia out of those 3). Outliers aren't good for objectivity. And I don't think Civ6 is neutral in how it performs for AMD. But the solution to this could be for @W1zzard to show the performance of all games and he can count only the not biased ones (<25% diff for the same tier GPUs) in the summary. A win-win scenario for all imho.
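The cutoff rule suggested above can be sketched in a few lines. The game names, FPS numbers, and the `within_threshold` helper below are all hypothetical, purely to illustrate the <25% idea:

```python
# Hypothetical (card_x_fps, card_y_fps) pairs for two same-tier GPUs.
games = {
    "Game A": (77.3, 88.0),
    "Game B": (88.9, 92.0),
    "Game C": (61.5, 95.0),  # >25% gap: would be excluded as "biased"
    "Game D": (93.1, 90.0),
}

def within_threshold(a: float, b: float, pct: float = 0.25) -> bool:
    """True if the relative gap between a and b is under pct."""
    return abs(a - b) / min(a, b) < pct

# Keep only titles where the two cards land within 25% of each other.
kept = sorted(name for name, fps in games.items() if within_threshold(*fps))
print(kept)
```

All games would still be shown individually; only the summary average would be computed over `kept`.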


----------



## R0H1T (Feb 10, 2019)

Could a geometric mean be more useful, instead of the arithmetic one?
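For illustration, a quick sketch of how the two averages treat one outlier title. The per-game performance ratios below are made up, not taken from the review:

```python
import math

# Hypothetical per-game performance ratios of card A vs card B;
# one outlier title inflates the arithmetic mean.
ratios = [0.95, 0.92, 0.97, 0.90, 1.43]

arithmetic = sum(ratios) / len(ratios)
# Geometric mean: exp of the average of the logs.
geometric = math.exp(sum(math.log(r) for r in ratios) / len(ratios))

print(f"arithmetic mean: {arithmetic:.3f}")
print(f"geometric mean:  {geometric:.3f}")
```

The geometric mean is the usual choice for averaging ratios, since it treats a 10% win and a 10% loss symmetrically and is less distorted by a single lopsided title.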


----------



## efikkan (Feb 10, 2019)

os2wiz said:


> No, the developers should be trashed. AMD doesn't have the deep pockets, like Nvidia, to pay every developer to optimize their products for AMD hardware. The manuals and software tools needed to code the optimizations are available for free. But the lazy f--ks will not do it unless AMD slips them cash and sends a consultant to help them with the task.


Like many, you have misconceptions of how games/game engines work.
It is exceedingly rare that games have specific optimizations for hardware, and even making such optimizations would be nearly pointless, since they would be tied to specific iterations of GPU architectures, not vendors.

The kind of bias that exists in some games is usually not intentional, but simply a consequence of the game being developed for and tested on one set of hardware. Such bias in most cases favors AMD, as many AAA titles are developed exclusively for AMD-based consoles, while "no" games are developed exclusively for Nvidia.

And I want to emphasize: just because card A scales better than card B in a game doesn't mean there is bias in the game engine; it can simply be a consequence of a better balance of resources on that card. So a card performing better in some games than others is not proof of bias, but often a mismatch between people's expectations and reality.


----------



## HD64G (Feb 10, 2019)

Some new benchmarks using the latest public driver from AMD show that the R7 isn't as bad as it seemed vs the RTX 2080 (-7% @1440p on average across 33 games)


Spoiler: 1440P 33 games graph @8m30s


----------



## notb (Feb 10, 2019)

HD64G said:


> I am simply suggesting not having 3 games using the same engine (Unreal engine 4) with 2 of them showing much worse performance for a specific brand (Only Senua's Sacrifice is optimised well for both AMD and nVidia out of those 3). Outliers aren't good for objectivity. And I don't think Civ6 is neutral in how it performs for AMD. But the solution to this could be for @W1zzard to show the performance of all games and he can count only the not biased ones (<25% diff for the same tier GPUs) in the summary. A win-win scenario for all imho.


The test includes 21 games out of dozens of popular titles. How do you know they're representative of all games? And how do you know they're representative of gamers' needs?
If you don't know that, how can you say what's an outlier?
With such a small sample, how do you know the few games you've mentioned are outliers in the population?

Don't use the word "outlier" casually. It carries a lot of statistical weight.
If you don't like some results, don't look at them. TPU provides figures for each game.

The choice of games currently used by TPU doesn't represent my taste. I find only 6 of the 21 interesting and have played 4 of them. I don't look at the average score when I choose a GPU. Neither should you.


----------



## HD64G (Feb 10, 2019)

notb said:


> The test includes 21 games out of dozens of popular titles. How do you know they're representative for all games? And how do you know they're representing for gamers' needs?
> If you don't know that, how can you say what's an outlier?
> Because with such a small sample, how do you know that few games you've mentioned are outliers in the population?
> 
> ...


"Outlier" here means a statistical anomaly that greatly increases the variance. I know these things from my time at university, so I think I used the word correctly here.

I agree with the rest of what you wrote (not looking at the average is good practice indeed), but most people know that @W1zzard is a great professional and blindly look at the average results, which, imho, currently contain some big outliers that heavily skew them.

So, my suggestions are aimed at the best objectivity possible. Statistics is a great tool, if practised correctly, for drawing conclusions from tests containing many results.


----------



## Kissamies (Feb 10, 2019)

purecain said:


> where is everyone thinking of putting the card straight under water?  I wonder how the card reacts?


That was the first thing that came to my mind when I learned the cooler isn't good. Well, it always comes to my mind, as I run a custom CPU/GPU loop.


----------



## Athlonite (Feb 10, 2019)

Chloe Price said:


> That was the first thing which came to my mind when I knew that the cooler isn't good. Well, that comes always to my mind as I run a CPU/GPU custom loop.



It'd probably be a great card in a custom WC loop, mainly because the stock cooler on it is shite. Admittedly it's better than the single squirrel-cage fan coolers, but still shite at the end of the day nonetheless. It will be interesting to see what the likes of EK do for full-coverage blocks.


----------



## Voluman (Feb 10, 2019)

Thanks for the review again, great as always 

Is there any chance to test with more modern APIs too? In many titles you use the DX11 render path, which makes sense given the market (and your previous datapoints). It just makes me wonder where DX12 and Vulkan stand nowadays.
Can you check Forza Horizon 4 at some point too? It has a reworked order processor/buffer which is supported by the VII, if I remember correctly.


----------



## Blueberries (Feb 10, 2019)

AMD shit the bed with HBM2. Theoretically it was a smart investment (it is faster) but GDDR6 makes it irrelevant with current content for gamers and it's too expensive to sell their product at a price that would be competitive. 

I had high hopes for HBM2, it's just too little too late.


----------



## Super XP (Feb 11, 2019)

Blueberries said:


> AMD shit the bed with HBM2. Theoretically it was a smart investment (it is faster) but GDDR6 makes it irrelevant with current content for gamers and it's too expensive to sell their product at a price that would be competitive.
> 
> I had high hopes for HBM2, it's just too little too late.


HBM2 is not too little too late. But what it isn't for is the PC Gaming Market. Because there doesn't seem to be an advantage in gaming, just a high price tag. 
On the other hand, For a professional card, go all in with HBM2.


----------



## Blueberries (Feb 11, 2019)

Super XP said:


> HBM2 is not too little too late. But what it isn't for is the PC Gaming Market. Because there doesn't seem to be an advantage in gaming, just a high price tag.
> On the other hand, For a professional card, go all in with HBM2.



Is this not advertised as a gaming card?


----------



## Super XP (Feb 11, 2019)

Blueberries said:


> Is this not advertised as a gaming card?


Yes, I believe so. Ask AMD why? I'm curious as to how they'd answer that question lol


----------



## Xzibit (Feb 11, 2019)

Super XP said:


> HBM2 is not too little too late. But what it isn't for is the PC Gaming Market. Because there doesn't seem to be an advantage in gaming, just a high price tag.
> On the other hand, For a professional card, go all in with HBM2.



You can go back and watch the unveiling at CES. AMD marketed it as a Gaming + Creation card from the start.


----------



## londiste (Feb 11, 2019)

Voluman said:


> Can you check with Forza Horizon 4 at some point too? It has a reworked order processor/buffer which is supported by the VII, if i know correctly.


A couple of posts above there was talk about outliers. There is something in FH4 that brings down the performance of Nvidia cards compared to AMD ones, at least at 1080p and 1440p. At 2160p the results are back to roughly where you would expect them. On the other hand, TechReport in their Radeon VII review found (and AMD told them) that 8x MSAA is rough on AMD in that title.


----------



## Artas1984 (Feb 11, 2019)

I will call this card Vega VII, IDK.

People should not be criticizing Vega VII's performance at 1080p and 1440p. It is clearly a 4K card. No one makes a card with 16 GB of VRAM and 1 terabyte per second of bandwidth for 1440p. That will be a thing in the future, but not now. Having said that, it's equal to the GTX 1080 Ti and is 5-10% short of the RTX 2080, so it's not a failure. Just because it has the most horrible cooler since the R9 290 means shit.

Partner models with much better coolers and core overclocks will come eventually, at least equaling the RTX 2080 FE.


----------



## arbiter (Feb 11, 2019)

The card isn't bad, but the part they failed on was going with 16 GB instead of 8 GB. If they had gone with 8 GB, the price undercut would have made the card a better option to look at, but at the current price point it's kind of a wash, with the checkbox win going to the RTX 2080.


----------



## phill (Feb 11, 2019)

Wow, that spiraled out of control quickly... That and 3 days of not being on the forum lol. So apologies for the out-of-date reply..



EarthDog said:


> Do tell. Compute is the only reason I could come up with after seeing these results. Otherwise, its slower, uses a lot more power, and is quite noisey in this form. Remind me why I should pay the same money for less?
> 
> Sorry.. what?



As I said, you could, meaning people have a choice; right or wrong, it's for people to decide what they want to have 

I was referring to the amount of cash, cards, etc. that Nvidia releases; they're either giving Intel ideas or just releasing a lot of cards to confuse people... Either way, this thread is the same as an AMD vs Intel one.. Everyone has a preference and will voice it high and wide... It's all a choice for whoever buys one    If it was closer to the £600 mark or even less than that, I'd definitely consider it..   Sadly, not enough cash to do so right now..



phill said:


> I never mentioned power issues, I'm guessing that's AMD giving the GPU core a little more juice than it needed, just like Vega..  Hopefully it might change in a new card release, whenever that might be but I don't really tend to worry about power usage...  Efficiency is great but still, we've all had worse efficient GPUs in our time...  I can remember a few



Seems I did mention the power issue as well... So I was kinda right   (Makes a change! )


----------



## Dante Uchiha (Feb 11, 2019)

lynx29 said:


> yeah im serious.  look at the ryzen benches below. ryzen is a joke compared to intel, for some reason GND is the only website that tests Ryzen properly.
> 
> 
> 
> http://imgur.com/a/oTNrz



Any processor is a joke on this pathetic software.


----------



## EarthDog (Feb 11, 2019)

Artas1984 said:


> I will call this card Vega VII, IDK.
> 
> People should not be criticizing Vega VII performance based on 1080P and 1440P. It is clearly a 4K card. No one makes a 16 GB VRAM and 1 TERABYTE of bandwidth card even for 1440P. That will be a thing for the future, but not for now. Having said that, it's equal to GTX 1080 Ti and is 5 - 10 % short of RTX 2080, so it's not a failure. Just because it has the most horrable cooler since R9 290 means shit.
> 
> Second party factory models with much better coolers and core over-clocks will come eventually, at least equaling the RTX 2080 FE.


Its performance numbers dictate that it isn't. That type of bandwidth and that amount of VRAM aren't needed at 4K either. It is simply a cut-down Instinct card marketed for gaming. Nobody said it's a failure... (well, maybe one or two people did) but people are disappointed that it is, in general, slower than a 2080, costs the same, uses a lot more power, is noisy, had borked drivers at launch... etc. While I wouldn't call it a failure, there are a few shortcomings when compared to the RTX 2080 at the same price point.


----------



## evolucion8 (Feb 11, 2019)

HD64G said:


> In fact, some games in @W1zzard 's review are heavily biased for nVidia GPUs (Darksiders 3 gives 30% diff, Civ6 gives 28% diff, Dragon Quest XI gives 43% diff and Hitman 2 gives 35% diff @1440P). On the other side, only Strange Brigade gives 10% diff for Radeon 7. Without those games, the difference on average would be 9% @1440P and 5% in 4K. The 2080 would win for sure but not by that far. Guru3D's review results on average are exactly the ones without those biased games. And some DX12 games are absent also (Sniper Elite 4). I am not judging the professionalism of our @W1zzard but I am suggesting him some changes in his gamelist that would make the results better balanced.



This is something I've always complained about with those charts: the games that give nVidia a huge advantage are very easy on resources, running at over 180-250 fps, which triggers Radeon's notorious CPU bottlenecks on DX11 draw calls at such huge frame rates while making no difference to playability. The RTX 2080 would still be faster overall, but the gap would be smaller. This isn't about putting AMD in a better light; it's about using games that are well optimized regardless of the platform. Games like Dragon Quest XI look like hot garbage and even a GTX 1050 can max them.


----------



## GoldenX (Feb 11, 2019)

It's not Nvidia's fault that AMD never spent money on multi-thread optimizations on their drivers. AMD's performance in DX11 is nothing compared to OpenGL, their performance difference there is abysmal.


----------



## evolucion8 (Feb 11, 2019)

GoldenX said:


> It's not Nvidia's fault that AMD never spent money on multi-thread optimizations on their drivers. AMD's performance in DX11 is nothing compared to OpenGL, their performance difference there is abysmal.



I am not blaming nVidia; it's just that the GCN architecture itself is forward-looking and the DX11 API is obsolete, it's 10 years old! But GCN already needs a revamp: those 4 shaders per CU and 64 ROPs are hindering its potential, and to a lesser degree, so are the hardware scheduler and the tessellator.


----------



## John Naylor (Feb 11, 2019)

> With 16 GB VRAM, Radeon VII has more memory than any other graphics card below $1000, twice that of RTX 2080, which really makes no measurable difference in any of our tests.



Thanks for that 


jihadjoe said:


> I found the performance per dollar charts really amusing lol.
> 
> AMD has just managed to build the second worst value consumer card ever, beaten only by the 2080Ti!



And people will still buy them. When you sell them as fast as you can make them, value is not a concept sales managers worry about. Like a shoe made by Jimmy Choo, many folks pay for status; others will pay because having nothing but the best is "their thing". With AMD unable to put anything in this space, you don't have to care about value when you have no competition.




lynx29 said:


> because Lisa Su was the last person in the industry I trusted. and she lost that trust today. now i trust none of them, intel, amd, or nvidia. so i might as well just go with classic intel and nvidia for my next build.  9700k is the best gaming chip especially since i see it for $369 free ship no tax right now.



It's called capitalism... corporations are bound to serve, better said legally required to maximize profits for, their shareholders (within the law), no one else. A situation where stores can't keep cards in stock and you lower prices... that's a betrayal of your responsibilities to shareholders.




moproblems99 said:


> What? You can't be serious?  Why would you not get a likely perfectly good cpu that is going to be likely better than the 9700K because of this GPU? What did you really expect from this GPU?  Simple math told you this is where it was going to be.  Besides, the i9 9900K is basically the equivalent of this GPU in the CPU space.



Because every time AMD says they will be... they ain't. And then the fans are all exclaiming, "yeah, but it's great at (insert one of the things that 98% of consumers never do)".




lynx29 said:


> i know the person who made them, I will ask him. I don't think he is trying to be misleading though, GND is a small site.



That's actually quite common... it's done to emphasize differences. 98 to 99 looks a lot bigger when the axis starts at 95 than when it starts at 0... it's often argued that starting at 0 makes the graphic too big.




R0H1T said:


> It's a common problem with AMD across the board, even their CPU's are unnecessarily overvolted.



It's how AMD has competed since the 290X... they basically overclock the bejeezus out of their cards before boxing them. When the 290X was released, they got loads of press saying it was faster than the 780... but it was a faux victory... once the web sites had a chance to test those cards, which came out of the box running at 95C, the 780 was the faster card. Since then, AMD's overclocking headroom has been limited to single digits, whereas nVidia's is almost always double digits, sometimes breaking 30%.




jesdals said:


> I wanted a RX Vega 64, but wasnt sure about my 3 monitor setup and the 8GB of ram, so have waited for the 16GB version for more than a year. The reference design have but one problem (besides of all the bugs the reviews talk of), the lack of DVI/HDMI support for 3 monitors.
> 
> What would be best for 3 times 1920x1200 setup in Eyefinity, a displayport converter to HDMI or DVI-D solution?
> 
> BTW danish price around 5.749 dkr including tax and delivery 22. feb 2019, actually some in stock but on sale for as high ad 6.500 dkr.



2160p is just 4 x 1080p, so if you have the memory to run 4K, you have a third more memory than you need for 3 x 1080p. This card has 3 DisplayPorts, as do most 2080s.


----------



## GoldenX (Feb 11, 2019)

evolucion8 said:


> I am not blaming nVidia, is just that the GCN architecture itself is forward looking and the DX11 API is obsolete, its 10 years old! But GCN already needs a revamp, those 4 shaders per CU and 64 ROPs are hindering the potential, and on lesser degree, the hardware scheduler and the Tessellator.


I remember that during the first GCN releases, GCN was praised for giving priority to DX11 over 9.
DX11 and OpenGL are "obsolete" (for gaming), but not all developers want to "waste time" on the extra work that DX12, Vulkan, and Metal mean (and it's A LOT of work), so, for some years, like it was with DX9, DX11 will still be a priority.
Also, there's the issue that Vulkan is based off OpenGL ES, which means it's not as feature-rich (bindless textures, for example), plus whatever is lost by dropping OpenGL compatibility.


----------



## evolucion8 (Feb 11, 2019)

GoldenX said:


> I remember that during the first GCN releases, GCN was praised for giving priority to DX11 over 9.
> DX11 and OpenGL are "obsolete" (for gaming), but not all developers want to "waste time" on the extra work that DX12 and Vulkan and Metal means (and it's A LOT of work), so, for some years, like it was with DX9, DX11 will still be a priority.
> Also, there's even the issue that Vulkan is based off OpenGL ES, that means that it's not as feature rich (bindless textures for example), plus what's missing from dropping OpenGL compat.



Vulkan is based on Mantle. Unless I am wrong, Mantle is not based on OpenGL and is even programmed differently.


----------



## John Naylor (Feb 11, 2019)

notb said:


> I won't comment on the performance very much. RTX cards are very fast, so it was expected that Radeon VII will look relatively worse than Vega 64 did (Vega 64 matched 1080).
> 
> There are two more important things to take away:
> 
> ...



1.   The noise:

@ Idle: AMD VII noise = 27 dBA / MSI 2080 Gaming = 0 dBA.
@ Load: AMD VII noise = 43 dBA / MSI 2080 Gaming = 36 dBA... that's 1.625 times louder... the reference nVidia card is quieter than MSI's factory-OC'd version.

2.  It's been a long while since we have seen products from AMD competing at the top end in the gaming segment, that's why all the pre-launch talk is about 7nm and # of cores ... and no talk about results.  Mantle was gonna change everything, HBM was gonna change everything, 7 nm was gonna change everything.  The "pump and dump" crowd use this to rake in millions from uninformed investors and it keeps the name in the news.  In the end the new machine's value remains tied to the performance it delivers (increase in actual user productivity or fps) , versus the cost of putting it on the desk (everything inside the box or connected to it.)

We could very well see significant improvements from the AIB cards, or after what I call the 2nd beta period (the first 3 months after release) when most problems are addressed, but given the starting point we are looking at, I see this as a $550 card.




efikkan said:


> It is worth mentioning that this is the first "top" model from AMD in recent years that don't come close to their Nvidia counterpart in performance, so in essence we can call this their largest fail yet.



AMD last took the title with the 290X... it lasted about a week. They lost it in the press when the 780 Ti dropped a week later... but they actually lost it before that, when the web sites compared the 290X OC'd with the 780 OC'd. Because of the decision to aggressively OC the 2xx series cards before putting them in the box, they lost the title once those cards were tested.




xkm1948 said:


> Just read a lot of reviews from multiple sites. The consensus is fairly clear: this is a statement card. A statement to the market that says AMD is not done making consumer GPU yet. And that is it. Value is not good for the current price ($599 would be the sweet spot).



I won't contest the $599, though $550 seems more appropriate to my eyes, considering the PSU should be 100 watts larger... and it will need an extra fan to keep the case interior at the same temps. But unless the AIB folks get that noise down, it's a deal breaker... it should include a set of 30-foot cables so the PC could be placed in another room. As for the statement... statements get made every day... look at politics... the question is, is anyone taking it seriously? If the price stays at $700, I have to say no.




repman244 said:


> While I don't care what brand I buy (I buy whatever has the best price/performance/power consumption), if you buy a gaming card either from AMD or NVIDIA they are all crap for any serious CAD work and won't accelerate anything, for that you need FirePro or Quadro. With NVIDIA at least you can use CUDA even on gaming cards.



That's a common misconception, at least in the construction fields. I have been a practicing engineer for over 40 years and started my own consulting business in 1990. Because we couldn't get what we wanted without a $2,000 markup, we built our own boxes, for us and others. The fact is AutoCAD is almost entirely single-threaded; there's no need for, or benefit from, 6, 8, 10, or more core CPUs because the program can't use them. In addition, FirePro / Quadro are historically poor performers against GTX cards when doing 2D and 3D CAD work. I'm not saying that a $2,000 / $4,000 FirePro / Quadro won't beat a $600 GTX card, but they win by less than 1% in 2D CAD and actually lose by a significant margin in 3D CAD. Quadro and FirePro excel at modeling, rendering, and animation. Radeon has often taken the top spot over all comers in AutoDesk Inventor. Maya most often favors Quadro, followed by FirePro, and the same goes for SolidWorks. Most architect / engineer offices I visit, with say 6-10 CAD stations, will typically have just one on Quadro for the rendering stuff and the rest for straight 2D / 3D CAD. The one reason for using Quadro in this environment... no AutoDesk support. When visiting another office some years ago, after a meeting and a lunch, we went back to the office and one of the folks asked me about AutoCAD performance tweaks. I sat down at his workstation (Quadro) and maneuvered my way around the drawing... then opened it on my lappie (GTX) and it was faster. Like anything inside a PC case, it's wise to match your hardware and budget to your specific software requirements.


----------



## GoldenX (Feb 11, 2019)

evolucion8 said:


> Vulkan is based on Mantle. Unless if I am wrong, Mantle is not based on OpenGL and even programs differently.


Vulkan took the "command list" approach from Mantle, but it's still Khronos work (like OpenGL). If Vulkan were only Mantle (a GCN-specific API), it wouldn't work on Nvidia and Intel too.


----------



## notb (Feb 11, 2019)

John Naylor said:


> 1.   The noise:
> 
> @ Idle AMD VII Noise = 27 dbA / MSI 2080 Gaming = 0 dbA.
> @ Load AMD VII Noise = 43 dbA / MSI 2080 Gaming =36 dbA ... that's 1.625 times louder .... the reference nVidia card is quieter tha MSI's factory OCd version.


1.625 times what? Power of the source? Certainly, *not*. :-D


----------



## londiste (Feb 11, 2019)

evolucion8 said:


> This is something that I always complained about those charts, cause those games that puts nVidia on a huge advantage, are very easy on resources, they run over 180-250fps, which triggers Radeons notorious CPU bottlenecks on DX11 draw calls when drawing such huge amount of fps which makes no difference on playability.


So, Darksiders 3, Civ6, Dragon Quest XI, Hitman 2, right?
At 1440p, Radeon VII does 77.3, 88.9, 93.1 and 61.5 FPS average in these games.
At 2160p, Radeon VII does 39.3, 79.0, 44.4, 39.7.
What are you talking about, these are very far from 180-250 FPS.



HD64G said:


> I am simply suggesting not having 3 games using the same engine (Unreal engine 4) with 2 of them showing much worse performance for a specific brand (Only Senua's Sacrifice is optimised well for both AMD and nVidia out of those 3). Outliers aren't good for objectivity. And I don't think Civ6 is neutral in how it performs for AMD. But the solution to this could be for @W1zzard to show the performance of all games and he can count only the not biased ones (<25% diff for the same tier GPUs) in the summary. A win-win scenario for all imho.


Unreal Engine is by far the most popular engine around. One could argue that at 3 out of 21 it is underrepresented.
There are biased games and always will be. DX12 and Vulkan will make that a lot worse, with lower level APIs developers will write engines and games than end up being more focused to specific architectures.


----------



## notb (Feb 11, 2019)

Super XP said:


> Yes I believe so. Ask AMD why? I'm curious to how they answer that question lol


If the rumors about ~5000 pieces being made are true, then there's no point asking "why". Who cares? Most of us won't even see this card live.

If they decide to make more of them, then the answer is pretty obvious. I've given it a few times already in other topics.
No one buys Radeon Instinct.
AMD orders HBM possibly a year or so in advance (maybe someone here knows?), but they ended up with an awful GPU. What do you expect them to do? Throw it in the bin? They do whatever they can to sell the inventory.
In this case, they rebrand a GPGPU accelerator as a gaming card. Not a very efficient one, but with enough oomph to pull it off (performance-wise).
Mind you, it's still awful as consumer electronics.


----------



## jabbadap (Feb 11, 2019)

notb said:


> 1.625 times what? Power of the source? Certainly, *not*. :-D



In loudness, 7dB more is ~1.625 times more loud in psychoacoustics.
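For anyone who wants to check the arithmetic: the common psychoacoustic rule of thumb is that perceived loudness doubles roughly every 10 dB, so a 7 dB delta works out to 2^(7/10) ≈ 1.62x as loud, while the underlying sound power ratio is 10^(7/10) ≈ 5x. A minimal sketch, assuming that rule of thumb:

```python
def loudness_ratio(delta_db: float) -> float:
    """Perceived loudness ratio for a level change of delta_db
    (rule of thumb: loudness doubles per 10 dB)."""
    return 2 ** (delta_db / 10)

def power_ratio(delta_db: float) -> float:
    """Sound power (intensity) ratio for the same level change."""
    return 10 ** (delta_db / 10)

print(f"{loudness_ratio(7):.3f}")  # ~1.62x as loud
print(f"{power_ratio(7):.2f}")     # ~5x the acoustic power
```

That matches both numbers thrown around in this thread: the ~1.625x "louder" figure and the ~5x power increase.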


----------



## notb (Feb 11, 2019)

jabbadap said:


> In loudness, 7dB more is ~1.625 times more loud in psychoacoustics.


So first of all: I would love to see a text that describes how "n times more loud" is measured. Any source?
Of course I know we don't hear the power increase (which in this case is ~5), but where does the 1.625 come from?

Second: even assuming something called "psychoacoustics" works, what you said may only be correct for some reference levels.
Certainly, 30dB vs 23dB and 100dB vs 93dB have different values of perceived loudness increase.


----------



## RichF (Feb 11, 2019)

sepheronx said:


> I wasn't aware you made the rules regarding how I may measure the card. So it is either good or bad based upon exactly every criteria? So noise being bad and mentioning it as such still means card is bad overall, even though that maybe one can work around the noise in turn to enjoy the performance of the card if it is good? It sounds more like you are being pretentious.
> 
> May be obvious to you.  Maybe not to others.  I will take your constructive criticism into consideration next time I decide to measure how good or bad a device is.


The first sentence is a straw man. I didn't make the rules about logical expression with the English language, which is what my post was about. Logical expression dictates, regardless of your feelings (i.e. the pretentiousness complaint), that a person can only take one position at a time, not two contradictory positions simultaneously.

When a person says a card is loud and that they _can't stand _noise, that person is clearly saying noise is unacceptable to them and that the card is loud. This means that the card is included in the realm of things the person cannot tolerate. This means that, for that person, the card is not good (also known as unacceptable).

Performance encompasses all aspects of how a product performs, including the noise it produces. Furthermore, regardless of how many FPS it delivers, you already declared that the product belongs to the category of unacceptable products due to its noise output and your lack of tolerance for noise. Draw a Venn diagram.

Finally, the accusation about pretentiousness is an example of concern trolling (tone complaint). It's a distraction from the point and an example of the _ad hominem_ fallacy.



notb said:


> So first of all: I would love to see a text that describes how "n times more loud" is measured. Any source?
> Of course I know we don't hear the power increase (which in this case is ~5), but where does the 1.625 come from?
> 
> Second: even assuming something called "psychoacoustics" works, what you said may only be correct for some reference levels.
> Certainly, 30dB vs 23dB and 100dB vs 93dB have different values of perceived loudness increase.


It seems that this site has all the data anyone here could possibly want on this subject:

http://www.sengpielaudio.com/calculator-levelchange.htm


----------



## notb (Feb 11, 2019)

RichF said:


> It seems that this site has all the data anyone here could possibly want on this subject:
> 
> http://www.sengpielaudio.com/calculator-levelchange.htm


I think it doesn't (not in the actual physics part, anyway).
But it is bashing hasty usage of psychoacoustics (a lot), so it was a nice read after all. ;-)


----------



## sepheronx (Feb 12, 2019)

RichF said:


> The first sentence is a straw man. I didn't make the rules about logical expression with the English language, which is what my post was about. Logical expression dictates, regardless of your feelings (i.e. the pretentiousness complaint), that a person can only take one position at a time, not two contradictory positions simultaneously.
> 
> When a person says a card is loud and that they _can't stand _noise, that person is clearly saying noise is unacceptable to them and that the card is loud. This means that the card is included in the realm of things the person cannot tolerate. This means that, for that person, the card is not good (also known as unacceptable).
> 
> ...



So reviewers are wrong in saying that pro being performance and con being noise should instantly mean the card is bad, even if they say it isn't a bad card? I can't stand certain noise but can tolerate it regarding other considerations to the card.  Still seems rather odd to me that you can negate every other consideration.

If it may help: I can't stand noise. But depending on situation and other criteria, I can look past it.

Is that better?

You are nitpicking for pretty much no reason. You decided to single someone out for what? I would rather you spend your time worrying about other things than cherry-picking a sentence because you seem to have misunderstood the point. Other than derailing the thread, why not talk about the card? If you have nothing else to say other than being the sentence police, might I suggest ignoring me from now on please?


----------



## Super XP (Feb 12, 2019)

Without factoring in the noise, power draw (despite being 7nm) and high $$ tag, the Radeon VII isn't a bad card after closely looking at several reviews. W1zzard's wonderful review already gave me the same conclusion, but I had to venture out and check another 10-12 sites. The majority have results similar to W1zzard's review, all on par with what this card is capable of.


----------



## notb (Feb 12, 2019)

Super XP said:


> Without factoring in the noise, power draw (Despite being 7nm) and high $$ tag, the Radeon VII isn't a bad card after closely looking at several reviews. W1zzards wonderful review already gave me this same conclusion. But had to venture out and check another 10-12 sites. The majority all have similar results as W1zzards review. All on par with what this card is capable of.


Without factoring the noise, fuel consumption and high price, Focus RS is a perfect everyday car.

Honestly... with this kind of logic you can justify anything.


----------



## R0H1T (Feb 12, 2019)

The part about high power draw is slightly exaggerated; nearly all Vegas can be undervolted, & in the case of the VII the power savings are substantially higher. Depending on workload & Si lottery, the max (system) power draw can be reduced by anywhere between 50W and nearly 100W. Granted, this ought to have been done by AMD or board partners themselves, but it's not as big an issue as it was on the original Vega.


----------



## londiste (Feb 12, 2019)

R0H1T said:


> The part about high power draw is slightly exaggerated, nearly all Vegas can be undervolted & in case of VII the power savings are substantially higher. Depending on workload & Si lottery, the max (system) power draw can be reduced anywhere between 50W to nearly 100W - granted this ought to have been done by AMD or board partners themselves, but it's not as big an issue as it was on the original Vega.


Silicon lottery seems to be a thing; if I remember correctly, it was Guru3D who wrote that undervolting gained them 10, maybe 15 watts.
My Vega64 undervolting resulted in a very modest boost - if memory serves, about +40MHz at the same power consumption. It still hit the power limit and did not reach boost clocks.

By the way, have you seen any reviews doing serious undervolting for RTX2080 for comparison?


----------



## R0H1T (Feb 12, 2019)

Any reviews that show the 2080 (or 2080 Ti) being undervolted & clocking higher/better like the VII - no, I haven't. Also, btw, Nvidia's x80 cards have been the most efficient GPUs in their lineup since Kepler.





londiste said:


> Silicon lottery seems to be a thing, if I remember correctly it was Guru3D who wrote that undervolting gave them 10, maybe 15 watts.


They didn't specify whether they undervolted whilst OCing; also, a couple of German sites (earlier in the thread) tested this under more workloads/games & the results were impressive.

From Legit Reviews - https://www.legitreviews.com/amd-radeon-vii-16gb-video-card-review_210489/14


> *3DMark Firestrike*
> 
> *Stock 1069mv, GPU 1801MHz, HBM2 1000MHz, 0% Power – 342 Watts in GT1 (overall 21,724 / GPU 26,941)*
> Undervolted to 934mV, GPU 1801MHz, HBM2 1000MHz, +20% Power – 291 Watts in GT1 (overall 21,658 / GPU 27,240)
> ...
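
Running the quoted GT1 numbers, the undervolt is worth roughly 15% less power at essentially the same score - a back-of-the-envelope sketch using only the figures quoted above:

```python
# Back-of-the-envelope efficiency check from the quoted Legit Reviews
# GT1 figures (stock 1069 mV vs. undervolted 934 mV Radeon VII)
stock_watts, stock_gpu_score = 342, 26941
uv_watts, uv_gpu_score = 291, 27240

power_saving = 1 - uv_watts / stock_watts  # fraction of power saved
perf_per_watt_gain = (uv_gpu_score / uv_watts) / (stock_gpu_score / stock_watts) - 1

print(f"power saving:       {power_saving:.1%}")        # ~14.9%
print(f"perf/W improvement: {perf_per_watt_gain:.1%}")  # ~18.8%
```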


----------



## londiste (Feb 12, 2019)

934mV is a very good result. Computerbase was at ~950mV. Most sites tend to end up at around 980mV. Both voltage and clocks are test-specific though, and each site seems to use a different test for OC and UV.
Undervolting into the sub-1V range does put Radeon VII efficiency at around RTX 2080 level.

The thing to keep in mind about undervolting is that it is by no means guaranteed; it has a lot in common with overclocking in that regard. Judging from forum and Reddit posts by Radeon VII owners, some games need more voltage to be stable - Battlefield V and Shadow of the Tomb Raider are usually brought up as examples.


----------



## R0H1T (Feb 12, 2019)

I can bet my anonymous internet reputation that every VII can be undervolted, given the stock voltages we're seeing, then again the increased efficiency will depend on a lot of things including workload & the rest of the system.


----------



## EarthDog (Feb 12, 2019)

One has to wonder: if these cards perform so much better and use so much less power undervolted (and many can/do), why did AMD put this version of the VII in the wild? Feels like they're shooting themselves in the foot, no?

Why wouldn't they want to be closer to the 2080 in performance and power? There's a reason for it...


----------



## R0H1T (Feb 12, 2019)

I'm not saying it's as good as the 2080 in terms of performance or efficiency; it is, however, better than Pascal in both areas. Also, wrt why AMD did this - money? Apart from Ryzen & TR, you'd be hard-pressed to remember any spectacular launches from the red camp in well over half a decade, if not more. This doesn't justify the outcome, but it could explain why AMD does this repeatedly.


----------



## londiste (Feb 12, 2019)

R0H1T said:


> Also wrt why AMD did this - money?


AMD needs to be in the picture. It is a forced move. Vega64 could no longer cut it as a viable product high enough in the ladder. Leaving pricing aside, RTX 2080Ti, RTX 2080, RTX 2070 and GTX 1080Ti from last generation were all offering higher performance. At the same time GTX 1080 and RTX 2060 are uncomfortably close. Plus all the Titans.

Radeon VII would be efficient as hell with voltage and frequency pulled just a little bit back. Unfortunately, that would put it in a really uncomfortable place with regards to performance and I doubt it could be priced much lower than the $699 MSRP.

Edit:
In fact, Computerbase tested it at 240W power limit (-20%, the lowest it would go). It apparently lost a mere 3% performance overall and 6% in the worst game:
https://www.computerbase.de/2019-02..._vs_rx_vega_64_bei_gleicher_leistungsaufnahme
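
If those numbers hold, the perf-per-watt math is striking - a rough sketch (assuming the -20% power limit translates to -20% board power, which is a simplification):

```python
# Rough perf/W gain implied by Computerbase's 240 W power-limit test:
# -20% power for ~3% average performance loss (figures from the linked article)
power_factor = 0.80  # power limit reduced by 20%
perf_factor = 0.97   # ~3% average performance loss

efficiency_gain = perf_factor / power_factor - 1
print(f"implied perf/W gain: {efficiency_gain:.1%}")  # ~21%
```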


----------



## EarthDog (Feb 12, 2019)

R0H1T said:


> I'm not saying it's as good as 2080 in terms of performance or efficiency, it is however better than Pascal in both areas. Also wrt why AMD did this - money? Apart from Ryzen & TR you'd be hard-pressed to remember any spectacular launches from the red camp in well over half decade, if not more. This doesn't justify the outcome but possibly could explain why AMD do this repeatedly.


I'm not saying you did, nor was I talking about Pascal. 

I'm not asking why they released the card in general... (it's b/c of market presence, not money... the rumor is they are losing their arse on these cards... remember, these are $1500 Instinct cards with a different BIOS and drivers)...


I'm wondering: if "all" the cards undervolt and show performance improvements, why didn't they release it like that? It would be more attractive in the market..


----------



## TheGuruStud (Feb 13, 2019)

Looky what I found https://techreport.com/review/34461/revisiting-the-radeon-vii-and-rtx-2080-at-2560x1440/3

Manual testing reveals canned benchmarks can be a total joke. Surprise.
This has been going on for a while.


----------



## trog100 (Feb 13, 2019)

EarthDog said:


> I'm not saying you did, nor was I talking about Pascal.
> 
> I'm not asking why they released the card in general... (its b/c of market presence, not money.. the rumor is they are losing their arse on these cards...remember these are $1500 cards with a different bios and drivers in the Instinct cards)...
> 
> ...



i think you are right about high-end market presence.. they have a desperate need for that.. worth losing a bit of money on for a limited-edition (gaming) run.. i doubt these things will ever be available in real numbers.. 

trog


----------



## os2wiz (Feb 13, 2019)

Super XP said:


> *We all know AMD needs a complete GPU Re-Design.* Until this happens, this is what they can offer. Though I do believe the price is at least $150 too high, despite the 3 x AAA Games included.


Hi Super XP. I ordered the Radeon VII 2 days ago on AMD's website; it will be delivered Friday. I placed the order when I heard the new driver was released and that FP64 compute is better than on any other consumer graphics card. Some strategy games definitely will take advantage of that capability, plus Adobe Premiere. I expect more games will eventually be programmed to leverage FP64 compute. Strategy games are really giant databases, which obviously use a lot of compute per turn.



efikkan said:


> Like many, you have misconceptions of how games/game engines work.
> It is exceedingly rare that games have specific optimizations for hardware, and even making such optimizations would be nearly pointless, since they would be tied to specific iterations of GPU architectures, not vendors.
> 
> The kind of bias that exists in some games are usually not intentional, but simply a consequence of the game being developed for and tested on one set of hardware. Such bias is in most cases favoring AMD, as many AAA titles are developed exclusively for AMD based consoles, while "no" games are developed exclusively for Nvidia.
> ...


What you are saying is only a half-truth at best. Optimizations have made immense performance differences when AMD has collaborated with software vendors. The same goes on with Nvidia, which pays development costs for far more games than AMD does, so those developers automatically use Nvidia hardware. Yes, different hardware has different theoretical capabilities; often those capabilities are never reached. And sometimes the cost for AMD to fully enable a technology in Vega has proven too time-consuming and costly to pursue. We know that at least 2 capabilities in Vega were never exploited, as each developer would have had to invest too much time and money to employ them in their games, so no API was ever written for them.


----------



## os2wiz (Feb 14, 2019)

Based on the drastically improved Valentine's Day driver from AMD, another review is in order. Undervolting now works flawlessly. The cards easily overclock to 2000 MHz and above, and the HBM2 can overclock to 1150 MHz. The card has significant room for overclocking. I am positive that if this review is repeated with the latest driver, performance will be at least 5 percent better. Fan speed and noise are much easier to regulate. Night and day difference.


----------



## londiste (Feb 15, 2019)

os2wiz said:


> I placed the order when I heard the new driver was released and that the FP64 compute function is better than on any other consumer graphics cards. Some strategy games definitely will take advantage of that capability plus Adobe Premier.  I expect more games will eventually be programmed to leverage FP64 compute. Strategy games are really giant data bases which obviously use a lot of compute function per turn.


No, games will not take advantage of FP64. On one hand, there are not enough GPUs around with sufficient FP64 performance on the other hand games do not need that level of precision. As the best case scenario for FP64 is 1/2 the performance of FP32, that hit is huge. Right now, the drive is exactly the opposite - as both Vega and Turing can do FP16 at 2 times FP32, games are trying to use less precision wherever they can to get more performance.
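
The precision trade-off is easy to demonstrate - a small sketch using NumPy's float16/float32 types as stand-ins for GPU half and single precision (illustrative only):

```python
import numpy as np

# float16 keeps ~3 significant decimal digits; float32 keeps ~7.
# A tiny increment that FP32 preserves is rounded away entirely in FP16 --
# acceptable for much shading work, nowhere near FP64-class precision.
x32 = np.float32(1.0) + np.float32(1e-4)  # stays distinguishable from 1.0
x16 = np.float16(1.0) + np.float16(1e-4)  # rounds back to exactly 1.0

print(x32 > 1.0)   # True
print(x16 == 1.0)  # True
```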


os2wiz said:


> Often times these capabilities are never reached.  Then sometimes the cost of AMD to fully enable a technology in Vega has proven too time consuming and costly to pursue.We know thatat least 2 capabilities in Vega were never exploited as each developer would have to invest too much time and money to employ it in their games so no API was ever written for it.


These were not pursued because AMD was not able to get them to work properly. The capabilities need to be implemented and exposed in APIs and/or drivers before developers can start using them.


----------



## os2wiz (Feb 15, 2019)

londiste said:


> No, games will not take advantage of FP64. On one hand, there are not enough GPUs around with sufficient FP64 performance on the other hand games do not need that level of precision. As the best case scenario for FP64 is 1/2 the performance of FP32, that hit is huge. Right now, the drive is exactly the opposite - as both Vega and Turing can do FP16 at 2 times FP32, games are trying to use less precision wherever they can to get more performance.
> These were not pursued because AMD was not able to get them to work properly. The capabilities need to be implemented and exposed in APIs and/or drivers before developers can start using them.



That those features did not work was not a hardware failure. It was the amount of investment required to develop the API and train software developers to implement it. At that time, AMD was just starting to emerge from the financial funk they had fallen into, and the time required to fix the features was too long to make a difference for Vega's success. AMD should now revisit those capabilities; they have considerably more resources.
But no one has responded to my post on the latest drivers from AMD, which now unlock the full overclocking and undervolting capabilities of the GPU and memory. Could amount to a greater than 10% performance improvement, and it eliminates the loud fan issue.


----------



## evolucion8 (Feb 15, 2019)

os2wiz said:


> That those features did not work was not a hardware failure. It was the amount of investment required to develop the API and train software developers to implement it. At that time AMD was just starting to emerge from the financial funk they had fallen into. The time required to fix the features was too long to make a difference with Vegas  success. AMD now should revisit those capabilities . They now have considerably more resources.
> But no one has responded to my post on the latest drivers from AMD that now unlock the full overclocking and undervolting capabilities of the gpu and memory. Could amount to a greater than 10% performance improvement and eliminates the loud fan issue.



Yeah, it is not the first time AMD's older technology gets revisited and then implemented in a newer API. Tessellation is one of them, geometry instancing is another, along with smaller features like the ones in DX10.1 and DX11.2 that got implemented later, even in DX12 - which itself grabbed features from Mantle, which evolved into Vulkan. So I think 16-bit precision will make a good comeback without resorting to proprietary stuff.


----------



## os2wiz (Feb 15, 2019)

GeorgeMan said:


> Another dissapointment. 1TB/s bandwidth and it's not even better than my 1080Ti (stock vs stock) at my resolution (3440x1440)....
> Excellent review as always!


Poor selection of games. I believe only 3 were DX12. Why were there no Stardock games? Star Control: Origins is a great game that is DX12. The skewed game selection made the results more lopsided than they should have been. Now that the February 14 drivers from AMD are available, overclocking works completely (manual and auto) for both memory and GPU. Undervolting works as well. Fan noise is now completely controllable without performance loss. I ask TechPowerUp to rerun the review; the difference in results will be surprising.



sepheronx said:


> So reviewers are wrong in saying that pro being performance and con being noise should instantly mean the card is bad, even if they say it isn't a bad card? I can't stand certain noise but can tolerate it regarding other considerations to the card.  Still seems rather odd to me that you can negate every other consideration.
> 
> If it may help: I can't stand noise. But depending on situation and other criteria, I can look past it.
> 
> ...



The noise issue is dead. Yesterday's driver release allows for undervolting, which will reduce fan speed and noise with almost no impact on performance.


----------



## BaneSilvermoon (Feb 16, 2019)

When adding the washers, did you tighten down the screws or leave them a little loose? I tried some nylon washers on mine yesterday and saw no change whatsoever. My overclocked card still reaches 114°C.

Would love to see those temps come down and be able to happily maintain the OC, or possibly push it higher. I have gotten +191MHz on the GPU and +200MHz on the VRAM - a significant performance boost.


----------



## arbiter (Feb 16, 2019)

os2wiz said:


> Poor selection of games. I believe only 3 were DX12. Why were there no Stardock games?? Star Control Origins is a great game that is DX12. The skewed game selection made the results more lopsided than they should have been. Now that the February 14 drivers from AMD are available the overclocking works completely (manual and auto) both of memory and gpu. The undervolting works as well. Fan noise now completely controllable without performance loss. I ask TechPowerUp to rerun the review; the difference in results will be surprising.


Saying it's a poor selection of games is just making excuses for the poor performance. The games used are high-end games that a lot of people play now or played when they were out. Star Control: Origins only had 1 month of over 1000 peak players, and their other game, "Ashes of the Singularity", was for the most part a glorified tech demo made into a game. Using games that never really had much of a player base is a waste of time.


----------



## os2wiz (Feb 16, 2019)

Super XP said:


> Without factoring in the noise, power draw (Despite being 7nm) and high $$ tag, the Radeon VII isn't a bad card after closely looking at several reviews. W1zzards wonderful review already gave me this same conclusion. But had to venture out and check another 10-12 sites. The majority all have similar results as W1zzards review. All on par with what this card is capable of.


I hope you have seen Not an Apple Fan's YouTube review of the Radeon VII, which he revisited after the latest Adrenalin driver release on February 14. All issues that had previously been reported have been resolved: fan noise, by applying auto undervolt; manual overclocking can now bring GPU frequency to at least 2000MHz, with reports of close to 2100MHz; and memory can be overclocked consistently to at least 1150MHz, with reports of up to 1200MHz. He changed his conclusions, which were negative in his initial review, to neutral now. He says he no longer regrets purchasing the card, only that it costs $700. But he also acknowledges that with 16GB of HBM2, it could not cost less than $700.



arbiter said:


> Saying its poor selection of games is just making excuses for why it has poor performance. The games used are high end games that a lot of people would play either now or when they were out. Star control origins only had 1 month of over 1000 peak players and as for their other game "Ashes of the Singularity" was for most part a glorified tech demo made in to a game. Using games that never really had much of a player base is a waste of time.


Star Control Origins by Stardock is a top 50 game. It should have been in the mix. Are you going to stretch credibility and say very few of the DX12 games are popular?  That would be a complete lie.


----------



## dalekdukesboy (Feb 17, 2019)

Hey Wizzard or anybody, you overclock the card and run Heaven benchmark...but you don't list the settings or resolution?


----------



## W1zzard (Feb 18, 2019)

BaneSilvermoon said:


> When adding the washers, did you tighten down the screws or leave them a little loose? I tried some Nylon washers on mine yesterday and saw no change whatsoever. My over-clocked card still reaches 114c.


I tightened them down very hard, don't overtighten though



dalekdukesboy said:


> Hey Wizzard or anybody, you overclock the card and run Heaven benchmark...but you don't list the settings or resolution?


I'm using only a subset of the whole Heaven benchmark, so nothing you can easily reproduce


----------



## Ronamo (Feb 18, 2019)

Hey, do you know if the RTX 2080 used for this test is the Founders Edition?
The FE costs $800 and is factory overclocked; FE cards usually perform 4-5% better than the ones sold at $700, so comparing cards with a $100 price difference seems unfair to me. The average difference between the stock 2080 and the RVII is around 4-5%.


----------



## EarthDog (Feb 18, 2019)

Ronamo said:


> Hey, do you know if the rtx 2080 used for test is the founder's edition ?
> FE cost 800 dollars and is OC, they usually perform 4-5% better than the ones sold at 700 dollars so comparing cards with a 100 dollars difference seems unfair to me. The difference between stock 2080 and RVII is around 4-5%.


Yes.

And that meager overclock from reference to FE isn't typically 4-5% better...1-2% at most.


----------



## arbiter (Feb 18, 2019)

os2wiz said:


> Star Control Origins by Stardock is a top 50 game. It should have been in the mix. Are you going to stretch credibility and say very few of the DX12 games are popular?  That would be a complete lie.


top 50? "2,607 all-time peak" yea ok if you say so unless you mean bottom 50.

https://steamcharts.com/app/271260


----------



## Ronamo (Feb 18, 2019)

EarthDog said:


> Yes.
> 
> And that meager overclock from reference to FE isn't typically 4-5% better...1-2% at most.


Check out this: https://www.anandtech.com/show/13346/the-nvidia-geforce-rtx-2080-ti-and-2080-founders-edition-review

The difference between FE and stock goes up to 6% depending on the game. When comparing graphics cards that close, I don't think you can neglect it.


----------



## EarthDog (Feb 18, 2019)

Up to.. but it averaged a mere couple of percent. This is around the difference between stock and overclocked versions of cards (out of the box) anyway. 

There are bigger fish to fry is my point. But if you are in the market for a blower cooler reference Turing card, sure.


----------



## Ronamo (Feb 18, 2019)

EarthDog said:


> Up to.. but it averaged a mere couple of percent. This is around the difference between stock and overclocked versions of cards (out of the box) anyway.
> 
> There are bigger fish to fry is my point. But if you are in the market for a blower cooler reference Turing card, sure.



Check the tests: there are 2 games with almost no difference, every other has a 3-4% difference, and 2 show 6.2% and 6.8% differences, so it would have been interesting to see how the RVII competes in a 20-game benchmark. I'm not saying it would have changed much in the conclusion, but a 10% average difference seems very high for 2 cards meant to perform the same.

Most reviewers write that the RTX 2080 and RVII cost the same, so yes, it would have been fair to compare $700 models (not only blowers; there are decent models like the MSI Ventus at this price). OC versions usually cost around $750.


----------



## EarthDog (Feb 18, 2019)

It may be a bit closer, but the song remains the same as they say. 

Also, AMD may have overclocked versions as well so... there is that


----------



## Ronamo (Feb 18, 2019)

EarthDog said:


> It may be a bit closer, but the song remains the same as they say.
> 
> Also, AMD may have overclocked versions as well so... there is that



No, there are no custom cards currently announced, not even by Sapphire or PowerColor. It's pretty disappointing, because the card works better undervolted and OCed at 1850MHz - it reduces its noise and increases its performance.


----------



## EarthDog (Feb 18, 2019)

It's funny because AMD said that - no custom anything... then suddenly we saw PowerColor with its Red Devil cooler on the card. That won't be free... so who knows what is true or not at this point. But I don't know why PC would advertise it if it wasn't true...

It is disappointing the card works better like that... a head-scratcher at minimum... one would think it should have come out of the box that way...


----------



## dalekdukesboy (Feb 18, 2019)

W1zzard said:


> I tightened them down very hard, don't overtighten though
> 
> 
> I'm using only a subset of the whole Heaven benchmark, so nothing you can easily reproduce



Thanks W1zz! Sucks that it isn't something simple to reproduce, but good to know for reference. Do you only use part of the benchmark for technical reasons, or is it just for brevity's sake when you are benchmarking so much stuff?



Super XP said:


> Without factoring in the noise, power draw (Despite being 7nm) and high $$ tag, the Radeon VII isn't a bad card after closely looking at several reviews. W1zzards wonderful review already gave me this same conclusion. But had to venture out and check another 10-12 sites. The majority all have similar results as W1zzards review. All on par with what this card is capable of.



I hope nobody rates me similarly in regards to my performance at anything, lol. Talk about putting a positive spin on polishing a turd. However, you are about right, sadly. This card is #2... in performance, behind Nvidia - and I think they should make it a poop-brown color scheme rather than the regular red AMD colors.


----------



## HwGeek (Feb 21, 2019)

*Just found out that the vBIOS update on the Asus page contains all the ATIFlash files, which can be extracted from the .exe*
https://www.asus.com/Graphics-Cards/RADEONVII-16G/HelpDesk_Download/
So now we have the new ATIFlash utility.


----------



## Artas1984 (Mar 10, 2019)

Gamers Nexus tested the Radeon VII under water cooling with no additional overclock. As a result, it managed to slightly outperform the RTX 2080 in half of the cases.

EAT THIS:

As I was speculating, this card has more potential; it sucked because it was simply throttling under a bad cooler. With proper cooling, it works how it should.


----------



## EarthDog (Mar 10, 2019)

It's a shame users need to spend hundreds to unlock that potential...or undervolt even.


----------



## moproblems99 (Mar 10, 2019)

EarthDog said:


> It's a shame users need to spend hundreds to unlock that potential...or undervolt even.



I really, really, really thought about picking one up especially with the water blocks releasing but I just can't do it.  By the time I pick up the block, additional radiator, and pieces I am looking at nearly $1k.  At that point, I would really just consider a 2080ti.  If I could get the block and card for about $750 I'd do it.  So I wait, for Navi.

And I grow more skeptical by the month.  Where's my sad face?  Oh there it is...


----------



## HD64G (Apr 26, 2019)

Now that the drivers are better for this new GPU (fan curve fix included) and availability is much better, this video is relevant imho


----------



## notb (Apr 26, 2019)

HD64G said:


> After the drivers being better for this new gpu (fan curve fix included) and availability getting much better, this video is relevant imho


So maybe AMD should have released it now, when it's finished?

How big are the improvements? Is there a source for people who can read?


----------



## GoldenX (Apr 26, 2019)

notb said:


> So maybe AMD should have released it now, when it's finished?
> 
> How big are the improvements? Is there a source for people who can read?


Mostly a tie, maybe a slight lead for the R7, with Nvidia winning in sponsored titles - or, better said, in titles AMD hasn't driver-optimized.
I would still prefer a 2080, if only for the better power consumption. Too bad Nvidia's low end is non-existent.


----------



## Fluffmeister (Apr 26, 2019)

Fact is, the 1080 Ti was the better option... years ago. I guess patience isn't always a virtue. And Nvidia still has those 7nm gains to come... this is what the Vega hype suggested but failed to deliver when it really mattered.


----------



## Space Lynx (Apr 26, 2019)

GoldenX said:


> Mostly a tie, maybe a slight lead on the R7, with Nvidia winning on sponsored titles, or it's better to say in non driver optimized titles from AMD.
> I would still prefer a 2080 only for the better power consumption. Too bad the low end of Nvidia is non-existent.




low end is non-existent?  a 1660 for 1080p is amazing.  1660 ti, 2060 - all overclock well and do great for low end.


----------



## GoldenX (Apr 26, 2019)

lynx29 said:


> low end is non-existent?  a 1660 for 1080p is amazing.  1660 ti, 2060 - all overclock well and do great for low end.


That's not low end for a third world country.


----------



## EarthDog (Apr 26, 2019)

How about a 1650?

Otherwise, a potato will do.


----------



## eidairaman1 (Apr 26, 2019)

notb said:


> So maybe AMD should have released it now, when it's finished?
> 
> How big are the improvements? Is there a source for people who can read?



Take time and watch


----------



## Fluffmeister (Apr 26, 2019)

EarthDog said:


> How about a 1650?
> 
> Otherwise, a potato will do.



In a third world country, a potato will do.


----------



## notb (Apr 26, 2019)

eidairaman1 said:


> Take time and watch


I don't watch vlogs and I have no idea who "Timmy Joe PC Tech" is. That would just be 20 minutes lost on something I have no reason to believe.
I'm hoping for an article from a renowned website. That's how these things work - on respect and trust.
If this guy made a video about AMD selling Radeon to Intel, you would just mock his 100k YouTube followers and an Instagram account full of family photos. ;-)


----------



## Space Lynx (Apr 26, 2019)

GoldenX said:


> That's not low end for a third world country.



Do you drink, smoke? Go out with friends to the movies? Go out to eat? I made $10 an hour most of my life, sometimes $8 an hour, but I never did any of those things. Gaming is all I do, and it wasn't hard at all to afford even a high-tier PC.

Work, and spend your money wisely for long-term enjoyment, not short term.


----------



## BaneSilvermoon (Apr 27, 2019)

After having the card for a few months and constantly testing it, I've found heat/cooling performance is the downside. The card is always throttled. In months of testing, the highest average GPU speed I've ever gotten was around 1,550 MHz. If the card could maintain its stock speed, performance would be fantastic.


----------



## GoldenX (Apr 27, 2019)

lynx29 said:


> Do you drink, smoke? Go out with friends to the movies? Go out to eat? I made $10 an hour most of my life, sometimes $8 an hour, but I never did any of those things. Gaming is all I do, and it wasn't hard at all to afford even a high-tier PC.
>
> Work, and spend your money wisely for long-term enjoyment, not short term.


Man, you really have no idea. Multiply the cost of hardware by... let me check the current rate... 46x, and add 50% thanks to taxes and reasons. Also, earn less than that.
A 1650 would be "fine" if it weren't so badly priced compared to the competition, and a 570 would be good if it didn't have that stupid power consumption GCN has carried since 2012.


----------



## Space Lynx (Apr 27, 2019)

GoldenX said:


> Man, you really have no idea. Multiply the cost of hardware by... let me check the current rate... 46x, and add 50% thanks to taxes and reasons. Also, earn less than that.
> A 1650 would be "fine" if it weren't so badly priced compared to the competition, and a 570 would be good if it didn't have that stupid power consumption GCN has carried since 2012.



I guess I didn't realize Argentina was third world; I thought most of South America was first world, outside of Venezuela. With as much oil as they have, they should be first world... heh


----------



## EarthDog (Apr 27, 2019)

GoldenX said:


> Man, you really have no idea. Multiply the cost of hardware by... let me check the current rate... 46x, and add 50% thanks to taxes and reasons. Also, earn less than that.
> A 1650 would be "fine" if it weren't so badly priced compared to the competition, and a 570 would be good if it didn't have that stupid power consumption GCN has carried since 2012.


I mean... you're going to have to suck up the power and heat thing I guess.


----------



## GoldenX (Apr 27, 2019)

lynx29 said:


> I guess I didn't realize Argentina was third world; I thought most of South America was first world, outside of Venezuela. With as much oil as they have, they should be first world... heh


Check the inflation charts. We suck.

First World means siding with America during the Cold War, Third means "meh, we'll do it our way", which we did, kinda, for a time.


----------



## HD64G (Apr 27, 2019)

BaneSilvermoon said:


> After having the card for a few months and constantly testing it, I've found heat/cooling performance is the downside. The card is always throttled. In months of testing, the highest average GPU speed I've ever gotten was around 1,550 MHz. If the card could maintain its stock speed, performance would be fantastic.


Some pictures of your case with the side panel off could help us check your airflow, which I suspect isn't optimal if it's causing that much throttling. Or maybe the fan curve you set is the cause.


----------



## Deleted member 172152 (Apr 27, 2019)

HD64G said:


> Some pictures of your case with the side panel off could help us check your airflow, which I suspect isn't optimal if it's causing that much throttling. Or maybe the fan curve you set is the cause.


Indeed, that's utter BS. ANY Radeon VII can do 1,750 MHz easily, and mine even does up to 2,000 on auto OC, which, admittedly, did crash, but for no obvious reason other than bad drivers, a broken OS drive I had to return, and Windows generally being utter crap!

That person is trolling, needs a better case, or even has a faulty VII. Honestly, no matter which it is, I'll make sure not to take "advice" from that person, simply because every review CLEARLY showed much better clocks, so they could have figured out something was not quite right THREE MONTHS ago! XD


----------



## JRMBelgium (Apr 27, 2019)

It would be nice if TPU did another review. Retest the RTX 2060, 2070, 2080 and 2080 Ti and the Vega 56/64 and Radeon VII on their latest drivers.


----------



## W1zzard (Apr 27, 2019)

JRMBelgium said:


> It would be nice if TPU did another review. Retest the RTX 2060, 2070, 2080 and 2080 Ti and the Vega 56/64 and Radeon VII on their latest drivers.


Will do that soon, with new games, too, 9900k


----------



## JRMBelgium (Apr 27, 2019)

W1zzard said:


> Will do that soon, with new games, too, 9900k



Good to know. Thx for the heads up


----------



## vega22 (Apr 27, 2019)

@W1zzard  I know it's time consuming, but any chance you could throw an AMD test bed into the mix too?

Maybe just a few titles for CPU comparisons?

If only because the 2600/2700X are more likely to be what end users are running vs. the small % who will buy the i9.

Thanks for the update nonetheless. I look forward to seeing which titles gain from the step up in CPU power.


----------



## W1zzard (Apr 27, 2019)

vega22 said:


> I know it's time consuming, but any chance you could throw an amd test bed in to the mix too?


way too time consuming to maintain two platforms, if zen 2 turns out to be a winner i might change to amd, but for now the cpu limit for 1080p with 2080 ti could be a problem


----------



## vega22 (Apr 27, 2019)

Yeah, I wasn't thinking about maintaining both, but more just running a few of the outliers again on the AMD system, to see how much single-threaded games lag behind and whether games with multi-thread support claw anything back.

But as you say, with Zen 2 looming on the horizon it makes sense to wait for that before investing too much time into it.


----------



## Brusfantomet (Apr 27, 2019)

Anyone here got a Radeon VII? I was wondering what the spacing of the retention screws is; I have some old universal water-cooling blocks lying around, and if they fit it would be an interesting upgrade over my two 290X in CF.



GoldenX said:


> Check the inflation charts. We suck.
> 
> First World means siding with America during the Cold War, Third means "meh, we'll do it our way", which we did, kinda, for a time.



Funny thing is, Sweden was firmly in the «non-aligned» camp, making it a third world country. (insert Scandinavian piss talking and old jokes here)


----------



## Space Lynx (Apr 27, 2019)

W1zzard said:


> Will do that soon, with new games, too, 9900k



Only a 1-2 month wait for the 3700X though, I thought? Might as well just wait a bit longer and do a 9900K/3700X comparison as well on the latest drivers.

I'm not sure of the workload required for this kind of testing; I'd just hate to see you get burned out now and not want to do a 3700X vs. 9900K comparison on the latest drivers with various GPUs, etc. The Windows 10 May 2019 Update is almost here too, so best to just wait another couple of months imo.


----------



## JRMBelgium (Apr 27, 2019)

lynx29 said:


> Only a 1-2 month wait for the 3700X though, I thought? Might as well just wait a bit longer and do a 9900K/3700X comparison as well on the latest drivers.
>
> I'm not sure of the workload required for this kind of testing; I'd just hate to see you get burned out now and not want to do a 3700X vs. 9900K comparison on the latest drivers with various GPUs, etc. The Windows 10 May 2019 Update is almost here too, so best to just wait another couple of months imo.



It's still 3 months before the Ryzen 3700X releases. That's a long time to wait for something to test when it's possible to do it right now.

Honestly, I get why people want to see benchmarks on the fastest CPU possible, but if you look at the market share, it's so freakin' low it doesn't make sense. The best-selling CPUs are currently the 2600, 2600X and 2700X. Just look at Amazon or Mindfactory; even in my country, the R5 2600 is the best-rated and best-selling CPU at the moment. But what do reviewers use? The 9900K. Only to remove the bottleneck that 9/10 gamers DO have, showing results 9/10 gamers will never see...

If you search for car reviews, do you want a realistic review, or a review on the smoothest road possible with no turns, no wind and no ups and downs? So why do PC reviewers try so hard to show the least representative figures?


----------



## GoldenX (Apr 27, 2019)

JRMBelgium said:


> It's still 3 months before the Ryzen 3700X releases. That's a long time to wait for something to test when it's possible to do it right now.
>
> Honestly, I get why people want to see benchmarks on the fastest CPU possible, but if you look at the market share, it's so freakin' low it doesn't make sense. The best-selling CPUs are currently the 2600, 2600X and 2700X. Just look at Amazon or Mindfactory; even in my country, the R5 2600 is the best-rated and best-selling CPU at the moment. But what do reviewers use? The 9900K. Only to remove the bottleneck that 9/10 gamers DO have, showing results 9/10 gamers will never see...
>
> If you search for car reviews, do you want a realistic review, or a review on the smoothest road possible with no turns, no wind and no ups and downs? So why do PC reviewers try so hard to show the least representative figures?


Because those are GPU reviews; you want to see the best performance the card has. I would like to have dual tests, one with a 9900K, one with a 2700X.


----------



## Space Lynx (Apr 27, 2019)

JRMBelgium said:


> It's still 3 months before the Ryzen 3700X releases. That's a long time to wait for something to test when it's possible to do it right now.
>
> Honestly, I get why people want to see benchmarks on the fastest CPU possible, but if you look at the market share, it's so freakin' low it doesn't make sense. The best-selling CPUs are currently the 2600, 2600X and 2700X. Just look at Amazon or Mindfactory; even in my country, the R5 2600 is the best-rated and best-selling CPU at the moment. But what do reviewers use? The 9900K. Only to remove the bottleneck that 9/10 gamers DO have, showing results 9/10 gamers will never see...
>
> If you search for car reviews, do you want a realistic review, or a review on the smoothest road possible with no turns, no wind and no ups and downs? So why do PC reviewers try so hard to show the least representative figures?



I agree with you mostly. Honestly, I don't want to own a CPU that hits 95°C while playing games at stock settings. That's the main reason I'm waiting for the 3700X, so it's not even just about money; it's about room temperature, fan noise, etc.

I might go for a 3800X or 3700X, but I definitely know my next CPU is going to be one of those. I just hope they don't hit 90+°C at stock like the 9900K does. (My cooler is the giant Noctua NH-D14 heatsink.)


----------



## Deleted member 172152 (Apr 27, 2019)

lynx29 said:


> I agree with you mostly. Honestly, I don't want to own a CPU that hits 95°C while playing games at stock settings. That's the main reason I'm waiting for the 3700X, so it's not even just about money; it's about room temperature, fan noise, etc.
>
> I might go for a 3800X or 3700X, but I definitely know my next CPU is going to be one of those. I just hope they don't hit 90+°C at stock like the 9900K does. (My cooler is the giant Noctua NH-D14 heatsink.)


AMD is fairly easy to cool nowadays, but what do I know with an H150i Pro! XD  Still, everything's super cool with everything quiet; I doubt that would be the case with a 9900K, and I could even overclock, no problem at all!


----------



## JRMBelgium (Apr 27, 2019)

lynx29 said:


> I agree with you mostly. Honestly, I don't want to own a CPU that hits 95°C while playing games at stock settings. That's the main reason I'm waiting for the 3700X, so it's not even just about money; it's about room temperature, fan noise, etc.
>
> I might go for a 3800X or 3700X, but I definitely know my next CPU is going to be one of those. I just hope they don't hit 90+°C at stock like the 9900K does. (My cooler is the giant Noctua NH-D14 heatsink.)



Same here, buddy. My next upgrade is going to be the Ryzen 3700X, and if AMD blesses X470 owners with Ryzen 4xxx support (and that is to be expected), I'll upgrade next year as well.
Fun thing about AMD upgrades: you don't have to get a second job for them.


----------



## RichF (May 2, 2019)

W1zzard said:


> Will do that soon, with new games, too, 9900k


Fury X, too, please.


----------



## Athlonite (May 2, 2019)

JRMBelgium said:


> Fun thing about AMD upgrades: you don't have to get a second job for them.



That entirely depends on where you live. Where I live, unfortunately at the bottom of the world in a small nation, it costs a lot to upgrade: $1200+ just for a new mobo, CPU and RAM.


----------



## W1zzard (May 3, 2019)

RichF said:


> Fury X, too, please.


No plans for Fury X, I dropped it for the last rebench and nobody noticed


----------



## Ja.KooLit (May 3, 2019)

Man, I thought this would force the green cards to bring down prices...

Meh. I was hoping to get that 2080, but even a 1080 Ti is still too pricey for me this time, due to higher pricing here in Korea...

Oh well, I'll still hold on to my 1060 for quite some time.


----------



## Aquinus (May 3, 2019)

W1zzard said:


> but for now the cpu limit for 1080p with 2080 ti could be a problem


Are 2080 Ti owners really playing games at 1080p? Is that really a realistic use case? The one person I know personally who has one bought it just so he can drive 4K smoothly. The other co-worker, who plays at 1080p with a 144 Hz panel, still uses a 1080 Ti.


----------



## Space Lynx (May 3, 2019)

Aquinus said:


> Are 2080 Ti owners really playing games at 1080p? Is that really a realistic use case? The one person I know personally who has one bought it just so he can drive 4K smoothly. The other co-worker, who plays at 1080p with a 144 Hz panel, still uses a 1080 Ti.



I have a 240 Hz monitor at 1080p; technically, a 2080 Ti isn't powerful enough. I can tell the difference between 144 Hz and 220 Hz or so - very fluid. Some people, like myself, might prefer that. ULMB is nice, but it hitches too much for my taste.


----------



## londiste (May 3, 2019)

Aquinus said:


> Are 2080 Ti owners really playing games at 1080p? Is that really a realistic use case? The one person I know personally who has one bought it just so he can drive 4K smoothly. The other co-worker, who plays at 1080p with a 144 Hz panel, still uses a 1080 Ti.


As far as testing is concerned, being the least CPU-limited is the goal, and the CPU for that today is the 9900K. The RTX 2080 Ti - and to a lesser degree the next tier of cards like the GTX 1080 Ti, RTX 2080 and Radeon VII - is quite CPU-limited at 1080p, and that shows in the results.


----------



## RichF (May 3, 2019)

W1zzard said:


> No plans for Fury X, I dropped it for the last rebench and nobody noticed


Thanks for letting me know I'm nobody.

It makes good sense to test it so I would suggest a change of plans.

It would also be nice to see DOOM in the testing since, as far as I know, it is the best proof-of-concept for AMD's hardware.


----------

