Tuesday, July 25th 2017
AMD Radeon RX Vega Put Through 3DMark
Ahead of its July 27 unveiling at AMD's grand media event on the sidelines of SIGGRAPH, performance benchmarks of the elusive Radeon RX Vega consumer graphics card surfaced once again. Someone with access to an RX Vega sample, with its GPU clocked at 1630 MHz and memory at 945 MHz, put it through 3DMark. One can tell it's the RX Vega and not the Pro Vega Frontier Edition from its 8 GB of video memory.
In three test runs, the RX Vega-powered machine yielded graphics scores of 22,330 points, 22,291 points, and 20,949 points. This puts its performance on par with or slightly below that of the GeForce GTX 1080, but comfortably above the GTX 1070. The test bench consisted of a Core i7-5960X processor, and graphics driver version 22.19.640.2 was used.
Source:
VideoCardz
175 Comments on AMD Radeon RX Vega Put Through 3DMark
But don't underestimate the fact that this is the first architecture to use Rapid Packed Math, High Bandwidth Cache, and (supposedly) tiled rasterization all at once. Then throw in the fact that AMD has been on GCN for 6 years straight.
Sorry, but 'card too difficult' I just don't buy. GCN is a stubborn fucker that wanted to be a jack of all trades, and AMD has been paying the price for that ever since DX11. They didn't choose HBM because they figured it would make for a nice, cost-effective GPU; they did it because they were desperate for board TDP budget and out of good ideas for GCN itself. Hawaii was the writing on the wall, really. The only thing that saved Polaris from being horrible was delta compression > smaller bus, and a smaller node.
Hawaii wiped the floor with Kepler, and Grenada even managed to stay competitive with Maxwell. In my opinion AMD needs to just give up on making a profit on gaming Vega. At this point they need to keep market share, and Zen should make them enough money to be fine until Navi.
If Vega is a 300w 1080, it should cost $400 at most.
@Captain_Tom: about Hawaii, read my post. It was GCN with a 512-bit bus and more shaders to counter a major architectural inefficiency issue, one that still exists today and is now hitting its absolute ceiling.
If this is how RX Vega performs, and if the rated TDP of 300 or 375 watts is true, I don't want it.
Then I'm just even more happy I didn't wait for Vega and got a 1080 Ti.
Besides high power use that will affect your electricity bill, there are also other downsides to a high-TDP card:
It heats up the room it's in faster.
Potentially more noise from fans.
It needs to be bigger to make room for a cooler that can handle a 300 W+ TDP, if that TDP turns out to be true of course.
Can you really live with a card that uses close to double the power for the same performance another card can deliver at half the power draw? I cannot.
Nope, RX Vega doesn't impress me much so far.
videocardz.com/amd/radeon-500
What you should be looking for is stock-for-stock performance, no matter whether the card is "overclocking" itself or not. Compare a Founders Edition card to a stock RX Vega, which I'm sure is what you will see all the news outlets test once they are able to publish their RX Vega reviews.
sign me up
An overclocked 1080 draws about 200 W.
Vega draws >350 W at the clocks needed to match a 1080.
If you run it at ~1300 to 1400 MHz it _only_ draws 290 W and gets its ass beat by the 1070.
Pretty simple math.
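As a rough sketch of that math (a hypothetical Python calculation using the wattages claimed in this thread as assumptions, not measured data):

# Rough perf-per-watt comparison using the figures claimed above.
# All numbers are assumptions from this thread, not measurements.
gtx_1080_power_w = 200.0   # claimed draw of an overclocked GTX 1080
vega_power_w = 350.0       # claimed Vega draw at the clocks needed to match a 1080

# Treat the 1080's performance as the baseline (1.0), since the claim
# is that Vega only matches it at those clocks.
relative_perf = 1.0

perf_per_watt_1080 = relative_perf / gtx_1080_power_w
perf_per_watt_vega = relative_perf / vega_power_w

print(f"GTX 1080: {perf_per_watt_1080:.4f} perf/W")
print(f"RX Vega:  {perf_per_watt_vega:.4f} perf/W")
print(f"Vega uses ~{vega_power_w / gtx_1080_power_w - 1:.0%} more power for the same performance")

By those claimed numbers, Vega would draw roughly 75% more power than an overclocked 1080 for the same performance, which is the comparison the "simple math" above is pointing at.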
1. Can't compare clock speeds between the two.
2. Correct. It was stock for that specific card.
3. The boost clocks vary by temperature, correct. Not so much by load, unless it's a light load and the card drops to a different set of clocks lower than the base clock.
4. We get it... just sharing with you how to properly read GPU-Z and how Boost works with NVIDIA cards.
5. Again, we get what you are saying, but how you got there wasn't correct. Looks settled now. :)
6. Yes, and that is a factory-overclocked 1080. When AIBs get their hands on Vega, its power use will go up, making that 35% value larger. :(
that costs less btw
I don't understand AMD. You would think they would give up on GCN; it's obviously shit at scaling to more compute units.
Time and time again they throw more CUs at the problem, and all we get is more power consumption and meh performance.
We also know that AMD is using some ultra-aggressive clock gating to keep the card from drawing 400 fking watts.
I leave the rest to your imagination