Tuesday, July 25th 2017

AMD Radeon RX Vega Put Through 3DMark

Ahead of its July 27 unveiling at AMD's grand media event on the sidelines of SIGGRAPH, performance benchmarks of the elusive Radeon RX Vega consumer graphics card surfaced once again. Someone with access to an RX Vega sample, with its GPU clocked at 1630 MHz and memory at 945 MHz, put it through 3DMark. One can tell it's the RX Vega and not the Radeon Vega Frontier Edition by its 8 GB of video memory (the Frontier Edition ships with 16 GB).

In three test runs, the RX Vega-powered machine yielded graphics scores of 22,330 points, 22,291 points, and 20,949 points. This puts its performance on par with or slightly below that of the GeForce GTX 1080, but comfortably above the GTX 1070. The test bench consisted of a Core i7-5960X processor, with graphics driver version 22.19.640.2.
Source: VideoCardz
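For a quick sense of scale, here is a minimal sketch comparing the three leaked runs, and the best of them against a ballpark reference GTX 1080 graphics score; the ~20,000 figure is an assumption drawn from the comments below, not a measured value:

```python
# Spread across the three leaked RX Vega runs, and the best run versus an
# assumed ballpark reference GTX 1080 graphics score (~20,000 per the
# comments below; an assumption for illustration, not a measured figure).
vega_runs = [22330, 22291, 20949]
gtx1080_reference = 20000  # assumed ballpark

best, worst = max(vega_runs), min(vega_runs)
print(f"run-to-run spread: {(best - worst) / best:.1%}")  # ~6.2%
print(f"best run vs assumed reference 1080: "
      f"{(best - gtx1080_reference) / gtx1080_reference:+.1%}")  # ~+11.7%
```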

175 Comments on AMD Radeon RX Vega Put Through 3DMark

#76
cdawall
where the hell are my stars
Mark LittleClocks are definitely difficult to decipher nowadays. I like to learn but I'm not sure the picture is very clear yet. I have a question and a comment:

First, are clocks given in the 3dmark benchmark leak fixed? Are all turbos, boosts, etc. disabled? If so, then a 1630 MHz Vega is performing almost the same as a 1924 MHz GTX 1080.

Second, from the Gigabyte article I linked, all the game benchmark results are performed with the default settings of the Gigabyte card, not the overclocked ones (we don't know anything about overclocking a Vega).

As shown on the clock profiles page, the clocks range from 1696 to 2038 MHz. I assume they change based on the load and card temperature. The average is around 1982 MHz.
www.techpowerup.com/reviews/Gigabyte/GTX_1080_Aorus_Xtreme_Edition/34.html

It might seem that 1924 MHz is very different from the max 2038 MHz because there is a 2 instead of a 1, but the difference is actually quite small (~5.5%). Plus, it is hard to compare a moving clock speed to a fixed clock speed benchmark result. But if we take the average of 1982 MHz on the Gigabyte card (3% above 1924 MHz) and adjust the 3dmark score, you get ~23260. The 1630 MHz Vega received a score of 22330, which is 4% lower. Yes, you can probably overclock a GTX 1080 an additional 4% to get above 2 GHz. You also might be able to overclock a Vega 4%. Time will tell.

So again, all I'm saying is that Vega is equivalent to an overclocked (factory or manual) GTX 1080 according to these leaked 3dmark scores.

If we look at power consumption, TPU measured 243 W peak on the factory-overclocked Gigabyte card. Rumors peg the liquid-cooled Vega (I'm assuming that is the highest score) at 375 W. So AMD is getting about the same performance as Nvidia but at 35% higher power. That suuuuuuuuuuuucccccccccccccckkkkkkkkkkkkksss in my book. Others may not care.
And if you look at the post I made earlier, as far as more efficient NVIDIA cards go, the 1080 Ti massively underclocked still beats it.
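For anyone who wants to re-run the scaling arithmetic from the quote above, a minimal sketch; the 22,580 base score is back-derived from the ~23260 figure, and linear clock-to-score scaling is an assumption, not a given:

```python
# Re-running the clock-scaling arithmetic from the quoted post. Assumes the
# 3DMark graphics score scales linearly with core clock, which is an
# approximation at best (memory bandwidth and boost behavior also matter).
base_clock = 1924   # MHz, clock behind the GTX 1080 score being referenced
avg_clock  = 1982   # MHz, average boost clock of the Gigabyte Aorus card
base_score = 22580  # assumption, back-derived from the ~23260 adjusted figure

scaled_1080 = base_score * avg_clock / base_clock
vega_score = 22330
print(f"scaled GTX 1080 score: ~{scaled_1080:.0f}")   # ~23261
print(f"Vega deficit: {(scaled_1080 - vega_score) / scaled_1080:.1%}")  # 4.0%
```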
Posted on Reply
#77
Captain_Tom
cdawallI don't think even a combo of Intel's and NVIDIA's driver teams could fix a card that only competes with a 1080 Ti downclocked to a 55% TDP... that is a lot of ground to make up.
It's not a matter of having a better driver team, it's a matter of the card just being too complicated, period.

But don't overlook the fact that this is the first arch to use Rapid Packed Math, High Bandwidth Cache, and (supposedly) tiled rasterization all at once. Then throw in the fact that AMD has been on GCN for 6 years straight.
Posted on Reply
#78
Daven
cdawallAnd if you look at the post I made earlier, as far as more efficient NVIDIA cards go, the 1080 Ti massively underclocked still beats it.
Oh yes. I quite agree. The Ti version puts AMD even more to shame. This is why I hope they price Vega around $500 like the GTX 1080.
Posted on Reply
#79
cdawall
where the hell are my stars
Captain_TomIt's not a matter of having a better driver team, it's a matter of the card just being too complicated, period.

But don't overlook the fact that this is the first arch to use Rapid Packed Math, High Bandwidth Cache, and (supposedly) tiled rasterization all at once. Then throw in the fact that AMD has been on GCN for 6 years straight.
2 years since the first benchmarks started making the rounds. In that time AMD has done what? I strongly doubt their card is any more advanced than the HPC products on the market.
Posted on Reply
#80
Vayra86
Captain_TomThen throw in the fact that AMD has been on GCN for 6 years straight.
That's six years to optimize it incrementally, but AMD was happier adding shaders and going to 512-bit. They started considering incremental refinements with Tonga, which was a dud, and we all know the rebrand hell that followed. AMD could have seen this coming for at least 3-4 years; you can't argue that TBR shouldn't have been here long ago already.

Sorry, but 'card too difficult' I just don't buy. GCN is a stubborn fucker that wanted to be a jack of all trades, and AMD has been paying the price for that ever since DX11. They didn't choose HBM because they figured it'd make for a nice, cost-effective GPU; they did it because they were desperate for board TDP budget and out of good ideas for GCN itself. Hawaii was the writing on the wall, really. The only thing that saved Polaris from being horrible was delta compression > smaller bus, and a smaller node.
Posted on Reply
#81
Eric3988
The timing of this card really couldn't be much worse, but the fact is that if AMD can provide nearly-1080 performance for less than the 1080's MSRP, it can do well. Doubtless Nvidia won't sit still, so keep on waiting if that's your thing. I, for one, need a new GPU to replace the super-long-in-the-tooth HD 6970 in my wife's PC. She'll get my 970 and I'll take Vega with the FreeSync monitor I purchased recently.
Posted on Reply
#82
Captain_Tom
Vayra86That's six years to optimize it incrementally, but AMD was happier adding shaders and going to 512-bit. They started considering incremental refinements with Tonga, which was a dud, and we all know the rebrand hell that followed. AMD could have seen this coming for at least 3-4 years; you can't argue that TBR shouldn't have been here long ago already.

Sorry, but 'card too difficult' I just don't buy. GCN is a stubborn fucker that wanted to be a jack of all trades, and AMD has been paying the price for that ever since DX11. They didn't choose HBM because they figured it'd make for a nice, cost-effective GPU; they did it because they were desperate for board TDP budget and out of good ideas for GCN itself. Hawaii was the writing on the wall, really. The only thing that saved Polaris from being horrible was delta compression > smaller bus, and a smaller node.
What about Hawaii?

Hawaii wiped the floor with Kepler, and Grenada even managed to stay competitive with Maxwell.
Eric3988The timing of this card really couldn't be much worse, but the fact is that if AMD can provide nearly-1080 performance for less than the 1080's MSRP, it can do well. Doubtless Nvidia won't sit still, so keep on waiting if that's your thing. I, for one, need a new GPU to replace the super-long-in-the-tooth HD 6970 in my wife's PC. She'll get my 970 and I'll take Vega with the FreeSync monitor I purchased recently.
In my opinion AMD needs to just give up on making a profit on gaming Vega. At this point they need to keep market share, and Zen should make them enough money to be fine until Navi.

If Vega is a 300w 1080, it should cost $400 at most.
Posted on Reply
#84
Eric3988
Captain_TomIn my opinion AMD needs to just give up on making a profit on gaming Vega. At this point they need to keep market share, and Zen should make them enough money to be fine until Navi.

If Vega is a 300w 1080, it should cost $400 at most.
Agreed. Pricing can make or break the product. I think AMD can appreciate gaining market share from the competition (look at Ryzen and Threadripper). $400-500 would make the card tempting in that performance bracket, even if it heats up your room and adds to the electric bill.
Posted on Reply
#85
Vayra86
Eric3988Agreed. Pricing can make or break the product. I think AMD can appreciate gaining market share from the competition (look at Ryzen and Threadripper). $400-500 would make the card tempting in that performance bracket, even if it heats up your room and adds to the electric bill.
The bulk of market share is within the RX 480 segment, and Polaris was targeting that. Not Vega. The only way you gain market share with enthusiast/high end is by having the halo card actually be the fastest and allowing for trickle-down, which is everything Vega's not. Trickle-down HBM won't happen in the foreseeable future.

@Captain_Tom about Hawaii, read my post. It was GCN with a 512-bit bus and more shaders to counter a major architectural inefficiency, one that still exists today and is now hitting its absolute ceiling.
Posted on Reply
#86
Tomgang
I came, I saw, and I do not want it.

If this is how RX Vega performs, and if the rated TDP of 300 or 375 W is true, I do not want it.

Then I am just even more happy I did not wait for Vega and got a 1080 Ti.

Besides high power use that will affect your electricity bill, there are also other downsides to a high-TDP card:

It heats up the room it is in faster.
Potentially more noise from fans.
It needs to be bigger to make space for a larger cooler that can handle a 300 W+ TDP, if that TDP turns out to be true of course.

Can you really live with a card that uses close to double the power for the same performance another card can deliver at half the draw? I cannot.

Nope, RX Vega doesn't impress me much so far.
Posted on Reply
#87
refillable
I know it's disappointing, but how did you deduce that it's slower than the GTX 1080? Most reference 1080s got around 20000, and since this one is reference, it should be the other way around.
Posted on Reply
#88
Eric3988
Vayra86The bulk of market share is within the RX 480 segment, and Polaris was targeting that. Not Vega. The only way you gain market share with enthusiast/high end is by having the halo card actually be the fastest and allowing for trickle-down, which is everything Vega's not. Trickle-down HBM won't happen in the foreseeable future.

@Captain_Tom about Hawaii, read my post. It was GCN with a 512-bit bus and more shaders to counter a major architectural inefficiency, one that still exists today and is now hitting its absolute ceiling.
There's more than one way to skin a cat, as they say. Anything at or above the 1070 is enthusiast/high end, and with what we know of Vega, it will be at that level at least. At a competitive price point it can gain some share. I'm not saying it's going to flip the market share around in one cycle, but at this point AMD is looking for any gains it can muster, right? What's important is that they don't screw this up. I know we have our team red and green, but we should all be rooting for both to do well to keep competition healthy and prices low for all.
Posted on Reply
#89
Nkd
EarthDogWell, if this is the XTX air/water... that isn't good. If it is something lower in the product stack...

300/375 W vs 180 W (GTX 1080) doesn't look good in performance/W. Looks like they will slide in on price-to-performance ratio and undercut.
Gaming Vega has a TDP of 275 W for the entire board before overclocking. VideoCardz had the numbers earlier this month, I think.
Posted on Reply
#90
Basard
Captain_TomI am sure it will (and it does) dominate at certain professional workloads, just like Bulldozer did. However, gamers will say "Why didn't they just die-shrink Phenom II/Polaris?!"
Bingo! We have a winner! I would have jizzed my pants over an 8-core Phenom @ 4 GHz....
Posted on Reply
#91
EarthDog
NkdGaming Vega has a TDP of 275 W for the entire board before overclocking. VideoCardz had the numbers earlier this month, I think.
There were two values, one for the air-cooled card and one for water-cooled. The water-cooled one shows 375 W there, while air is 285 W. Most other sites report 300/375. A cheesy pump or two sure as hell isn't 90 W. ;)

videocardz.com/amd/radeon-500
Posted on Reply
#92
Slizzo
Mark LittleClocks are definitely difficult to decipher nowadays. I like to learn but I'm not sure the picture is very clear yet. I have a question and a comment:

First, are clocks given in the 3dmark benchmark leak fixed? Are all turbos, boosts, etc. disabled? If so, then a 1630 MHz Vega is performing almost the same as a 1924 MHz GTX 1080.

Second, from the Gigabyte article I linked, all the game benchmark results are performed with the default settings of the Gigabyte card, not the overclocked ones (we don't know anything about overclocking a Vega).

As shown on the clock profiles page, the clocks range from 1696 to 2038 MHz. I assume they change based on the load and card temperature. The average is around 1982 MHz.
www.techpowerup.com/reviews/Gigabyte/GTX_1080_Aorus_Xtreme_Edition/34.html

It might seem that 1924 MHz is very different from the max 2038 MHz because there is a 2 instead of a 1, but the difference is actually quite small (~5.5%). Plus, it is hard to compare a moving clock speed to a fixed clock speed benchmark result. But if we take the average of 1982 MHz on the Gigabyte card (3% above 1924 MHz) and adjust the 3dmark score, you get ~23260. The 1630 MHz Vega received a score of 22330, which is 4% lower. Yes, you can probably overclock a GTX 1080 an additional 4% to get above 2 GHz. You also might be able to overclock a Vega 4%. Time will tell.

So again, all I'm saying is that Vega is equivalent to an overclocked (factory or manual) GTX 1080 according to these leaked 3dmark scores.

If we look at power consumption, TPU measured 243 W peak on the factory-overclocked Gigabyte card. Rumors peg the liquid-cooled Vega (I'm assuming that is the highest score) at 375 W. So AMD is getting about the same performance as Nvidia but at 35% higher power. That suuuuuuuuuuuucccccccccccccckkkkkkkkkkkkksss in my book. Others may not care.
The definition of an overclocked card has become somewhat of a misnomer now that Pascal has launched. Our understanding of overclocking has changed drastically with the advent of GPU Boost 3.0 (which is on all Pascal-based GPUs). ALL 1080s, whether they're Founders Edition or not, will generally boost up to at least ~1900 MHz; that is really what you should look at as the normalized frequency of the cards. It's rare you'll see one that won't boost to that level or higher unless it's starved for air.

What you should be looking at is STOCK-for-STOCK performance, no matter if the card is "overclocking" itself or not. Compare a Founders Edition card to a stock RX Vega, which I'm sure is what you will see all the news outlets test once they are able to publish their reviews of RX Vega.
Posted on Reply
#93
OneMoar
There is Always Moar
wow, slower than a 1080 and nearly double the power consumption
sign me up

an overclocked 1080 draws about 200 W

Vega draws >350 W at the clocks needed to match a 1080

if you run it at ~1300 to 1400 MHz it _only_ draws 290 W and gets its ass beat by the 1070
pretty simple math
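Written out, that math looks something like this. A minimal sketch: the wattages are the rough figures claimed in this thread, not measurements, and the score is from the leak above:

```python
# Performance-per-watt at roughly equal 3DMark performance. The wattages are
# the rough figures claimed in this thread, not measured values.
score = 22330  # leaked RX Vega graphics score, roughly GTX 1080 level
watts = {"GTX 1080 (OC, claimed)": 200, "RX Vega (claimed)": 350}

for card, w in watts.items():
    print(f"{card}: {score / w:.0f} points per watt")

# At the same score, 350 W vs 200 W works out to 75% higher power draw.
print(f"extra draw: {(350 - 200) / 200:.0%}")
```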
Posted on Reply
#94
EarthDog
Mark LittleClocks are definitely difficult to decipher nowadays. I like to learn but I'm not sure the picture is very clear yet. I have a question and a comment:

First, are clocks given in the 3dmark benchmark leak fixed? Are all turbos, boosts, etc. disabled? If so, then a 1630 MHz Vega is performing almost the same as a 1924 MHz GTX 1080.

Second, from the Gigabyte article I linked, all the game benchmark results are performed with the default settings of the Gigabyte card, not the overclocked ones (we don't know anything about overclocking a Vega).

As shown on the clock profiles page, the clocks range from 1696 to 2038 MHz. I assume they change based on the load and card temperature. The average is around 1982 MHz.
www.techpowerup.com/reviews/Gigabyte/GTX_1080_Aorus_Xtreme_Edition/34.html

It might seem that 1924 MHz is very different from the max 2038 MHz because there is a 2 instead of a 1, but the difference is actually quite small (~5.5%). Plus, it is hard to compare a moving clock speed to a fixed clock speed benchmark result. But if we take the average of 1982 MHz on the Gigabyte card (3% above 1924 MHz) and adjust the 3dmark score, you get ~23260. The 1630 MHz Vega received a score of 22330, which is 4% lower. Yes, you can probably overclock a GTX 1080 an additional 4% to get above 2 GHz. You also might be able to overclock a Vega 4%. Time will tell.

So again, all I'm saying is that Vega is equivalent to an overclocked (factory or manual) GTX 1080 according to these leaked 3dmark scores.

If we look at power consumption, TPU measured 243 W peak on the factory-overclocked Gigabyte card. Rumors peg the liquid-cooled Vega (I'm assuming that is the highest score) at 375 W. So AMD is getting about the same performance as Nvidia but at 35% higher power. That suuuuuuuuuuuucccccccccccccckkkkkkkkkkkkksss in my book. Others may not care.
They aren't if you know... as I explained in post #44 to ya. It's a clear picture. :)

1. Can't compare clock speeds between the two.
2. Correct. It was stock for that specific card.
3. The BOOST clocks vary by temperature, correct. Not so much by load, unless it's a light load and it drops to a different set of clocks lower than the base clock.
4. We get it... just sharing with you the way to properly read GPU-Z and how Boost works with NVIDIA cards.
5. Again, we get what you are saying, but how you got there wasn't correct. Looks settled now. :)
6. Yes, and that is a factory-overclocked 1080. When AIBs get their hands on Vega, its power use will go up, making that 35% value larger. :(
Posted on Reply
#95
OneMoar
There is Always Moar
either way you are looking at a 400 W card losing to a 200 W card
that costs less, btw

I don't understand AMD. You would think they would give up on GCN; it's obviously shit at scaling to more compute units

time and time again they throw more CUs at the problem and all we get is more power consumption and MEH performance
Posted on Reply
#96
cdawall
where the hell are my stars
EarthDogThey aren't if you know... as I explained in post #44 to ya. It's a clear picture. :)

1. Can't compare clock speeds between the two.
2. Correct. It was stock for that specific card.
3. The BOOST clocks vary by temperature, correct. Not so much by load, unless it's a light load and it drops to a different set of clocks lower than the base clock.
4. We get it... just sharing with you the way to properly read GPU-Z and how Boost works with NVIDIA cards.
5. Again, we get what you are saying, but how you got there wasn't correct. Looks settled now. :)
6. Yes, and that is a factory-overclocked 1080. When AIBs get their hands on Vega, its power use will go up, making that 35% value larger. :(
We actually don't know number 6 for sure. Remember how the Polaris and Fermi cards pull equal or sometimes even less power than their OEM equivalents due to the temperature drop? Fermi was notorious for pulling less than stock power when heavily overclocked on water. Now I am not saying it will be a miracle 50% decrease, but I could see a drop.
Posted on Reply
#97
Slizzo
OneMoareither way you are looking at a 400 W card losing to a 200 W card
that costs less, btw

I don't understand AMD. You would think they would give up on GCN; it's obviously shit at scaling to more compute units

time and time again they throw more CUs at the problem and all we get is more power consumption and MEH performance
Vega and the previous Fury have the same number of shaders (4096). From what we've seen of the Frontier Edition, the resulting increase in performance is directly attributable to the increase in clock speed. (Tests have been done with a Frontier Edition at 1050 MHz, and its performance matched that of the R9 Fury X.)
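As arithmetic, that claim looks like this. A minimal sketch: the clocks come from this thread's leak and the 1050 MHz test mentioned above, and pure linear clock scaling is the stated assumption:

```python
# If Vega at Fury X-like clocks performs like a Fury X, then the uplift in
# this leak is roughly just clock scaling, with little per-clock (IPC) gain.
fury_x_level_clock = 1050  # MHz, where the FE reportedly matched the Fury X
leaked_vega_clock  = 1630  # MHz, the GPU clock in this 3DMark leak

scaling = leaked_vega_clock / fury_x_level_clock
print(f"expected uplift from clocks alone: {scaling - 1:.0%}")  # ~55%
```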
Posted on Reply
#98
EarthDog
cdawallWe actually don't know number 6 for sure. Remember how the Polaris and Fermi cards pull equal or sometimes even less power than their OEM equivalents due to the temperature drop? Fermi was notorious for pulling less than stock power when heavily overclocked on water. Now I am not saying it will be a miracle 50% decrease, but I could see a drop.
Yes and no... clearly a 90 W difference isn't down to a single or dual pump on the card. So either that water cooler isn't doing as good a job as other water-cooled implementations, or the water-cooled version is using an arseload more power.
Posted on Reply
#99
OneMoar
There is Always Moar
EarthDogYes and no... clearly a 90 W difference isn't down to a single or dual pump on the card. So either that water cooler isn't doing as good a job as other water-cooled implementations, or the water-cooled version is using an arseload more power.
we know from various benches of the FE what happens when the card goes beyond 1500 MHz: the power consumption skyrockets
we also know that AMD is using some ultra-aggressive clock-gating to keep the card from drawing 400 fking watts
I leave the rest to your imagination
Posted on Reply
#100
EarthDog
I'm just trying to figure out how a 120 mm AIO does the job well enough... I mean, they strapped one on the 295X2 (500 W), and it 'worked'... but sweet l0rd baby jebus, that radiator was HOT to the touch!!
Posted on Reply