Tuesday, July 25th 2017

AMD Radeon RX Vega Put Through 3DMark

Ahead of its July 27 unveiling at AMD's grand media event on the sidelines of SIGGRAPH, performance benchmarks of the elusive Radeon RX Vega consumer graphics card surfaced once again. Someone with access to an RX Vega sample, with its GPU clocked at 1630 MHz and memory at 945 MHz, put it through 3DMark. One can tell that it's the RX Vega and not the Pro Vega Frontier Edition from its 8 GB of video memory.

In three test runs, the RX Vega-powered machine yielded graphics scores of 22,330 points, 22,291 points, and 20,949 points. This puts its performance either on par with or below that of the GeForce GTX 1080, but comfortably above the GTX 1070. The test bench consisted of a Core i7-5960X processor, with graphics driver version 22.19.640.2.
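For a back-of-the-envelope feel for what those leaked clocks imply on paper, here is a quick sketch in Python. The 4,096 stream processors and 2,048-bit HBM2 bus below are assumptions (the full Vega 10 configuration), not confirmed specs for this sample.

    # Rough theoretical figures implied by the leaked clocks.
    # Assumes 4,096 stream processors and a 2,048-bit HBM2 bus (unconfirmed).
    gpu_clock_mhz = 1630
    mem_clock_mhz = 945          # HBM2 is double data rate
    stream_processors = 4096
    bus_width_bits = 2048

    fp32_tflops = stream_processors * 2 * gpu_clock_mhz / 1e6        # 2 FLOPs per SP per clock (FMA)
    bandwidth_gbs = mem_clock_mhz * 2 * bus_width_bits / 8 / 1000    # MT/s x bus width, in GB/s

    print(f"~{fp32_tflops:.1f} TFLOPS FP32, ~{bandwidth_gbs:.0f} GB/s")   # ~13.4 TFLOPS, ~484 GB/s

If those assumptions hold, the sample sits in the roughly 13-TFLOPS class usually quoted for Vega 10, which is why the 8 GB memory size is the clearest tell separating it from the 16 GB Frontier Edition.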
Source: VideoCardz

175 Comments on AMD Radeon RX Vega Put Through 3DMark

#101
cdawall
where the hell are my stars
EarthDogYes and no... clearly a 90 W difference isn't down to the single/dual pumps on the card. So either that water cooler isn't doing as good a job as other water-cooled implementations, or the water-cooled version is using an arseload more power.
It is a single 120 mm aluminum radiator; traditional thought puts the maximum thermal load through one of those at 150 W, assuming a good radiator of brass/copper construction and a pump with actual flow. We already saw this same nonsense garbage cooling with the 295X2: the radiator was well past its efficiency curve, and temps/noise suffered substantially for it. A 375 W board TDP through a 120 mm radiator is a bloody joke.
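For a rough sanity check on those numbers, here is a minimal steady-state estimate in Python, treating the radiator as a lumped heat exchanger (Q = UA x dT). The 12.5 W/°C UA value is only a ballpark assumption for a single 120 mm radiator with a moderate fan, not a measured figure.

    # Liquid-to-air temperature delta needed to shed a given heat load at steady state.
    # Q = UA * dT  ->  dT = Q / UA. The default UA is an assumed ballpark, not a measurement.
    def coolant_delta_c(heat_w: float, ua_w_per_c: float = 12.5) -> float:
        """Return the coolant temperature rise above ambient air, in degrees C."""
        return heat_w / ua_w_per_c

    for load in (150, 250, 375):
        print(f"{load} W -> coolant ~{coolant_delta_c(load):.0f} C above ambient")
    # 150 W -> ~12 C, 250 W -> ~20 C, 375 W -> ~30 C (with the assumed UA)

With that assumed UA, a 375 W load means coolant sitting roughly 30 °C over ambient: the radiator can move the heat, but only at coolant temperatures and fan speeds most people would not accept.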
Posted on Reply
#103
OneMoar
There is Always Moar
the fact that it needs water cooling to beat a 1070 is a joke -.- and an unfunny one
Posted on Reply
#104
cadaveca
My name is Dave
Mark LittleClocks are definitely difficult to decipher nowadays. I like to learn but I'm not sure the picture is very clear yet.
Want it to be more confusing? 3DMark does not list the actual clocks the card ran at during the benchmark; it merely lists the maximum reported. My 1080, which is a stock-clocked card (but with a better cooler), regularly reports 1924 MHz as its clock in 3DMark, but it usually runs @ 1872 MHz.

I know that's not that big of a difference, but it's still a good 50 MHz.

Here's my 1080 with a 7900X (both at stock, but memory bus @ 3600 MHz, because 100 GB/s memory bandwidth :p). I ran this just now, just for you (note the 1911 MHz reported clock):

www.3dmark.com/fs/13200948
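(If you want to see what the card actually runs at during a benchmark rather than the maximum 3DMark reports, something like the sketch below works; it assumes an NVIDIA card with nvidia-smi on the PATH, and AMD cards would need the vendor's own tooling instead.)

    # Log the live graphics/memory clocks once per second while a benchmark runs.
    # Assumes an NVIDIA GPU with nvidia-smi available; Ctrl+C to stop.
    import subprocess
    import time

    while True:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=clocks.current.graphics,clocks.current.memory",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        core_mhz, mem_mhz = out.split(", ")
        print(f"{time.strftime('%H:%M:%S')}  core {core_mhz} MHz  mem {mem_mhz} MHz")
        time.sleep(1)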
cdawallIt is a single 120 mm aluminum radiator; traditional thought puts the maximum thermal load through one of those at 150 W, assuming a good radiator of brass/copper construction and a pump with actual flow. We already saw this same nonsense garbage cooling with the 295X2: the radiator was well past its efficiency curve, and temps/noise suffered substantially for it. A 375 W board TDP through a 120 mm radiator is a bloody joke.
Most companies rate 120 mm as good for 250W+. You can thank Koolance for that. ;)


koolance.com/radiator-1-fan-120mm-30-fpi-copper

That one is rated for 400w. ROFL.
Posted on Reply
#105
efikkan
Captain_TomRX Vega is a 13.1+ TF card with HBM2, RPM, HBC, and finally Tiled Rasterization. It is a monster card, period.
A hot monster, sure.
Captain_TomWill AMD ever get their drivers to work? Nobody knows. But at the very least the potential is there, and Frontier clearly wasn't using all of its features.
So, AMD can't live up to the hype and you conclude it's due to their drivers? Perhaps your expectations are unrealistic?
What evidence is there for "potential"?

Do you remember Fiji? ~53% more GFlop/s and still it was beaten by GTX 980 Ti. So much for "potential".
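That ~53% figure checks out on paper if you compare reference base clocks; a quick sketch in Python (shader counts and clocks below are the reference specs, and real boost clocks narrow the gap somewhat):

    # Theoretical FP32 throughput: shaders x 2 FLOPs per clock x clock, at reference base clocks.
    def tflops(shaders: int, clock_mhz: int) -> float:
        return shaders * 2 * clock_mhz / 1e6

    fury_x   = tflops(4096, 1050)   # ~8.6 TFLOPS
    gtx980ti = tflops(2816, 1000)   # ~5.6 TFLOPS
    print(f"Fury X: ~{(fury_x / gtx980ti - 1) * 100:.0f}% more theoretical FP32")   # ~53%

On paper Fiji was miles ahead; in games it was not, which is exactly the point.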
Captain_TomIf Vega really turns out this bad, it will be because Vega was never built for gaming. GCN=gaming, Vega=professional work.
So, in full damage control mode already?
I knew we were going to see people in denial about Vega…
Posted on Reply
#106
Captain_Tom
Eric3988Agreed. Pricing can make or break the product. I think AMD can appreciate gaining market share on the competitors (look at Ryzen and Threadripper). $400-500 would make the card tempting in that performance bracket, even if it heats up your room and adds to the electric bill.
There is no such thing as a bad product, just bad pricing ;).


Let's assume that Vega is indeed a worst-case scenario and its performance will fall short of what its specs suggested. If so, I would say this should be the line-up:
  • $600 = Fully unlocked and water-cooled Vega. This would be the 1080 Ti competitor (still a little weaker) that trades efficiency for more features and future proofing.
    • In general this is just for the AMD fanboys.
    • Even if it uses 400w, I think it is imperative AMD has something comparable to the 1080 Ti so that CPU benchmarks sometimes use an AMD card.
    • Over time this will likely catch up to the 1080 Ti or higher, but it will take many driver revisions.
  • $450 = Fully unlocked (or not?), lower clocked air-cooled Vega. This would narrowly beat the 1080 (Or trade blows) while using 250 - 300w.
    • This is the card AMD needs to nail perfectly. Most people don't buy uber graphics cards, and something at this caliber is sorely needed at a lower price point.
    • Paired with a Freesync monitor this would be a very attractive alternative to Nvidia's high-end cards.
    • The fact that you could crossfire 3 of these for around the same price as a Titan Xp would be a big selling point.
  • $350= Cut down, lower clocked, 4GB Vega. This is the card that beats the 1070 for a tad less money.
    • A very important gap to fill between the RX 580 and RX Vega cards.
    • I would expect 8GB and Nano variants.
    • It would be sweet if they could make a GDDR5 version to further reduce price, but I know that won't happen.
^^^I know this is a long-shot, but I feel this is what's required to even have a chance of competing. Nvidia can surely drop prices by $50 across their entire line-up, and a minor Pascal refresh could be out by December (Or a 12nm refresh by Spring 2018). Plus even at these aggressive prices AMD would still be profiting some on each card.
Posted on Reply
#107
OneMoar
There is Always Moar
Captain_TomThere is no such thing as a bad product, just bad pricing ;).


Let's assume that Vega is indeed a worst-case scenario and its performance will fall short of what its specs suggested. If so, I would say this should be the line-up:
  • $600 = Fully unlocked and water-cooled Vega. This would be the 1080 Ti competitor (still a little weaker) that trades efficiency for more features and future proofing.
    • In general this is just for the AMD fanboys.
    • Even if it uses 400w, I think it is imperative AMD has something comparable to the 1080 Ti so that CPU benchmarks sometimes use an AMD card.
    • Over time this will likely catch up to the 1080 Ti or higher, but it will take many driver revisions.
  • $450 = Fully unlocked (or not?), lower clocked air-cooled Vega. This would narrowly beat the 1080 (Or trade blows) while using 250 - 300w.
    • This is the card AMD needs to nail perfectly. Most people don't buy uber graphics cards, and something at this caliber is sorely needed at a lower price point.
    • Paired with a Freesync monitor this would be a very attractive alternative to Nvidia's high-end cards.
    • The fact that you could crossfire 3 of these for around the same price as a Titan Xp would be a big selling point.
  • $350= Cut down, lower clocked, 4GB Vega. This is the card that beats the 1070 for a tad less money.
    • A very important gap to fill between the RX 580 and RX Vega cards.
    • I would expect 8GB and Nano variants.
    • It would be sweet if they could make a GDDR5 version to further reduce price, but I know that won't happen.
^^^I know this is a long-shot, but I feel this is what's required to even have a chance of competing. Nvidia can surely drop prices by $50 across their entire line-up, and a minor Pascal refresh could be out by December (Or a 12nm refresh by Spring 2018). Plus even at these aggressive prices AMD would still be profiting some on each card.
NONE of that is going to happen
no way in hell; under water with a 400 W TDP it's barely matching a vanilla 1080
Posted on Reply
#108
Captain_Tom
efikkanA hot monster, sure.


So, AMD can't live up to the hype and you conclude it's due to their drivers? Perhaps your expectations are unrealistic?
What evidence is there for "potential"?

Do you remember Fiji? ~53% more GFlop/s and still it was beaten by GTX 980 Ti. So much for "potential".


So, in full damage control mode already?
I knew we were going to see people in denial about Vega
What denial? LMAO!

Can you read? Most of my posts here center around Vega being a big disappointment so far (And likely overall). But everything I said still stands:
  • Vega Frontier's drivers ARE terrible. Learn how to read some reviews; there are bugs everywhere. That the drivers are bad is a fact; whether better drivers will improve performance is opinion.
  • Although none of us are fortune tellers, it would be idiotic to think performance won't increase by a decent margin considering how bad they are now. In fact, GCN 1.0 gained so much performance from its 12.11 drivers that TechPowerUp said "The 7870 felt like an entirely different card", and GCN had nowhere near the issues Vega clearly has, buddy:
www.techpowerup.com/reviews/AMD/Catalyst_12.11_Performance/
  • If GCN 1.0 could gain 10-20% in the first year, it is not insane to think Vega could gain the same or even more performance considering how big a departure this architecture is. Again, I am not saying Vega will become substantially stronger, but it is not at all crazy to think it could.
  • The 980 Ti vs Fury debate is a dead horse. Stop beating it. The only thing I will say is that the Fury X is currently trading blows with the 1070 while the 980 Ti is treading water above the 390X. If you call that a victory, congratulations.
Posted on Reply
#109
Captain_Tom
OneMoarNONE of that is going to happen
no way in hell; under water with a 400 W TDP it's barely matching a vanilla 1080
Call me crazy, but I don't think that's how it will shake out in the reviews. We will see though, and if it is that bad... it's bad lol
Posted on Reply
#110
Basard
A 120 mm rad will have a 150 W cooling capacity on one chip and 250 W on another... it depends on how much heat each chip can tolerate.
Posted on Reply
#111
EarthDog
BasardA 120 mm rad will have a 150 W cooling capacity on one chip and 250 W on another... it depends on how much heat each chip can tolerate.
Que?
Posted on Reply
#112
KainXS
Just bringing my popcorn to see how the local AMD Defense Force will defend this.

This entire Vega launch is like a train wreck in slow motion at this point.
Posted on Reply
#113
Kronauer
Isn't Fire Strike 1.1 a fairly old DirectX 11 benchmark, btw?
I mean, even Polaris, now in its second year, is designed for DirectX 12/Vulkan gaming, yet everybody loses their mind over a DirectX 11 benchmark score.
I'm confident that Vega will show very good fps numbers and prove that these older benchmarks do not represent the real-world performance of a newer graphics card.
Posted on Reply
#114
efikkan
Captain_TomVega Frontier's drivers ARE terrible. Learn how to read some reviews; there are bugs everywhere. That the drivers are bad is a fact; whether better drivers will improve performance is opinion.
And what is the evidence that Vega's drivers are any worse than anything else?
This is just the same old excuse; "the drivers are immature, buy it and the performance will come".
Captain_TomIf GCN 1.0 could gain 10-20% in the first year, it is not insane to think Vega could gain the same or even more performance considering how big a departure this architecture is.
GCN 1.0 was a brand new architecture, while Vega is a slight refinement in comparison.

Don't forget that the competition also improves their drivers, so even if Vega improves 10% over the next year, Nvidia will also improve.
Captain_TomThe 980 Ti vs Fury debate is a dead horse. Stop beating it.
You don't like the tough facts, do you?
Fury X (Fiji) offered significantly more computational performance, more memory bandwidth, etc., and yet it was beaten by a GTX 980 Ti that was much "weaker" on paper. As I always say, theoretical figures are irrelevant; only real performance matters. That's why I bring it up: the same history keeps repeating itself. AMD falls short, but fans claim there is potential waiting to be unleashed (which of course never happens).
Posted on Reply
#115
evernessince
gdallskYet with far greater power consumption, same performance and slightly higher price tag.
If you ask me people who bought the R9 390 won, simply because they could have cashed out for $500 or more during the height of the ether craze. The GTX 970 only goes for around $200.
Posted on Reply
#116
Slizzo
Captain_TomOver time this will likely catch up to the 1080 Ti or higher, but it will take many driver revisions.
If you think that driver updates will allow this card to make up a 30% or greater performance deficit, you have your blinders on.

Of course, my statement assumes that RX Vega will perform similarly to the Frontier Edition, which I also assume is a safe bet.
Posted on Reply
#117
the54thvoid
Intoxicated Moderator
KronauerIsn't Fire Strike 1.1 a fairly old DirectX 11 benchmark, btw?
I mean, even Polaris, now in its second year, is designed for DirectX 12/Vulkan gaming, yet everybody loses their mind over a DirectX 11 benchmark score.
I'm confident that Vega will show very good fps numbers and prove that these older benchmarks do not represent the real-world performance of a newer graphics card.
DX11 AAA games are still being made. DX12 has not been the Holy Grail some hoped for. Besides, Vega cannot improve much over Fiji in DX12; it's already well enough designed for it.
Posted on Reply
#118
cdawall
where the hell are my stars
cadavecaMost companies rate 120 mm as good for 250W+. You can thank Koolance for that. ;)


koolance.com/radiator-1-fan-120mm-30-fpi-copper

That one is rated for 400w. ROFL.
At what noise level? If this releases at a $500 price point to compete with the 1080, I expect the noise level to match: not the 50 dB that the 290X released with. That gives them the 36 dB and 37 dB of the reference 1070/1080 to at least equal. The 295X2, a reference card with a single 120 mm radiator and a similar power profile (430 W average consumption), was able to maintain 40-41 dB (I do not know the length of time W1zzard runs this test, for reference...).

I stand by 150w being a nice safe, silent number. There is a reason I can passively cool my 5960x if notched down to stock clocks.
Captain_TomWhat denial? LMAO!

Can you read? Most of my posts here center around Vega being a big disappointment so far (And likely overall). But everything I said still stands:
  • Vega Frontier's drivers ARE terrible. Learn how to read some reviews; there are bugs everywhere. That the drivers are bad is a fact; whether better drivers will improve performance is opinion.
  • Although none of us are fortune tellers, it would be idiotic to think performance won't increase by a decent margin considering how bad they are now. In fact, GCN 1.0 gained so much performance from its 12.11 drivers that TechPowerUp said "The 7870 felt like an entirely different card", and GCN had nowhere near the issues Vega clearly has, buddy:
www.techpowerup.com/reviews/AMD/Catalyst_12.11_Performance/
  • If GCN 1.0 could gain 10-20% in the first year, it is not insane to think Vega could gain the same or even more performance considering how big a departure this architecture is. Again, I am not saying Vega will become substantially stronger, but it is not at all crazy to think it could.
  • The 980 Ti vs Fury debate is a dead horse. Stop beating it. The only thing I will say is that the Fury X is currently trading blows with the 1070 while the 980 Ti is treading water above the 390X. If you call that a victory, congratulations.
I am going to take this point by point.

How do you rate drivers as bad? Are they crashing? Do you have proof that we will see improvement, or are you just assuming this?

Vega is just GCN with an elongated pipeline to allow higher clock speeds, plus the addition of an HBC. Neither of those should require a ground-up driver rewrite. This is further proven by the use of a Fury driver from the get-go. Consider it GCN 2.1.

10-20%, which is wishful thinking, puts this even with an AIB 1080, which still consumes less than half the power.

The 980 Ti vs Fury debate is awful, more so when you try to make the Fury sound better than it is. The reference 980 Ti is barely edged out by the Fury, which sees no improvement in performance from AIB cards in normal situations.



The AIB-model 980 Tis still consistently compete with, and often best, 1070s.
Captain_TomThere is no such thing as a bad product, just bad pricing ;).


Let's assume that Vega is indeed a worst-case scenario and its performance will fall short of what its specs suggested. If so, I would say this should be the line-up:
  • $600 = Fully unlocked and water-cooled Vega. This would be the 1080 Ti competitor (still a little weaker) that trades efficiency for more features and future proofing.
    • In general this is just for the AMD fanboys.
    • Even if it uses 400w, I think it is imperative AMD has something comparable to the 1080 Ti so that CPU benchmarks sometimes use an AMD card.
    • Over time this will likely catch up to the 1080 Ti or higher, but it will take many driver revisions.
  • $450 = Fully unlocked (or not?), lower clocked air-cooled Vega. This would narrowly beat the 1080 (Or trade blows) while using 250 - 300w.
    • This is the card AMD needs to nail perfectly. Most people don't buy uber graphics cards, and something at this caliber is sorely needed at a lower price point.
    • Paired with a Freesync monitor this would be a very attractive alternative to Nvidia's high-end cards.
    • The fact that you could crossfire 3 of these for around the same price as a Titan Xp would be a big selling point.
  • $350= Cut down, lower clocked, 4GB Vega. This is the card that beats the 1070 for a tad less money.
    • A very important gap to fill between the RX 580 and RX Vega cards.
    • I would expect 8GB and Nano variants.
    • It would be sweet if they could make a GDDR5 version to further reduce price, but I know that won't happen.
^^^I know this is a long-shot, but I feel this is what's required to even have a chance of competing. Nvidia can surely drop prices by $50 across their entire line-up, and a minor Pascal refresh could be out by December (Or a 12nm refresh by Spring 2018). Plus even at these aggressive prices AMD would still be profiting some on each card.
Long shot is a nice way to put "not going to bloody happen"
Posted on Reply
#119
EarthDog
Maybe I'm myopic, but I'm not betting on drivers improving things enough to reach a 1080 Ti, no way. 5-10%, sure. I'm also not betting on Vulkan being the savior either.

That is simply too much risk for a consumer to wait for market saturation of dx12/vulkan games and improved drivers.
Posted on Reply
#120
cdawall
where the hell are my stars
EarthDogMaybe I'm myopic, but I'm not betting on drivers improving things enough to reach a 1080 Ti, no way. 5-10%, sure. I'm also not betting on Vulkan being the savior either.

That is simply too much risk for a consumer to wait for market saturation of dx12/vulkan games and improved drivers.
Considering the Vega FE card is kicking out worse numbers in DX12 than in DX11, that is a good bet
Posted on Reply
#121
efikkan
And the myth about AMD's superiority in Direct3D 12 and Vulkan lives on…
There is nothing inherent in these APIs giving GCN an edge; rather, many of the initial games were developed as AMD exclusives. Nvidia also chose to bring the driver-side improvements of Direct3D 12 to all APIs, giving them a lower "relative gain". Over time Nvidia has prioritized Direct3D 12 more, reducing AMD's initial advantage. And considering that nearly all games so far use an abstraction layer instead of the new APIs directly, we have no evidence to claim AMD has an advantage. The claims that AMD is superior in these APIs are approaching superstition at this point.
Posted on Reply
#122
cadaveca
My name is Dave
cdawallI stand by 150w being a nice safe, silent number. There is a reason I can passively cool my 5960x if notched down to stock clocks.
I wasn't disagreeing with your sentiment; the fact remains that a 120mm rad can remove a lot of heat, given the right... uh... conditions. Yeah. ;)
EarthDogThat is simply too much risk for a consumer to wait for market saturation of dx12/vulkan games and improved drivers.
These GPUs should have two focuses, I think: HPC and 4K. What you mention isn't even in the picture. The question is, how many GPUs does AMD expect you to buy to get there? If you look at the entire PC ecosystem right now, it's obvious that AMD wants to sell high-end users more than a single GPU. The average consumer gets one GPU, maybe capable of 2560x1080 at more than reasonable framerates. To aim for any other performance target is unrealistic. You have to position yourself properly in the market, both in performance/price and in matching the consumer base you intend to sell to. Understanding who that audience is really explains everything, but I see very few people that truly understand where AMD is headed here. I'd like to blame AMD for this, but unfortunately, I know better.


I see a lot of reverse hype going on lately, and it annoys the crap out of me, but only because people don't see things for what they obviously are.
Posted on Reply
#123
cdawall
where the hell are my stars
cadavecaI wasn't disagreeing with your sentiment; the fact remains that a 120mm rad can remove a lot of heat, given the right... uh... conditions. Yeah. ;)
Hehe you mean like the 120x38mm delta that is spec'd for the rad you posted lol
Posted on Reply
#124
xkm1948
Well, during heavy load the Fury X radiator can become too hot to touch as well. And that is at just stock speed.

Anyway the Vega train is DOA. Move on. People should be gearing up for the Volta/Navi train now.
Posted on Reply
#125
Fluffmeister
cdawallThe 980 Ti vs Fury debate is awful, more so when you try to make the Fury sound better than it is. The reference 980 Ti is barely edged out by the Fury, which sees no improvement in performance from AIB cards in normal situations.



The AIB-model 980 Tis still consistently compete with, and often best, 1070s.
It's funny, isn't it. I enjoy revisiting the comments section of the GTX 1080 review (among others); their main argument then was that it was barely faster than AIB 980 Tis.

Short memories I guess.
Posted on Reply