
AMD Radeon RX Vega 64 8 GB

On the Amazon best-selling CPU list, the Ryzen 1700X jumped to 2nd position (from somewhere around the top 20, or a bit higher). Guess why.

RX Vega bundles?
 
You people talk about FreeSync / G-Sync like it is a huge leap in your gaming experience, when in fact it is not. It pretty much feels just like regular sync, really. Of course, I am talking about the case where you are consistently getting high fps on your screen (which is pretty much a given if you have that shiny high-end GPU inside). G-Sync / FreeSync is overrated.
 
You people talk about FreeSync / G-Sync like it is a huge leap in your gaming experience, when in fact it is not. It pretty much feels just like regular sync, really. Of course, I am talking about the case where you are consistently getting high fps on your screen (which is pretty much a given if you have that shiny high-end GPU inside). G-Sync / FreeSync is overrated.
For me GSync is a very big plus
 
Just checked availability and pricing in the UK (Scan): the cheapest Vega 64 is £110 more expensive than the cheapest GTX 1080, and those have aftermarket cooling. I am seriously hoping these prices drop once things have settled.
 
You people talk about FreeSync / G-Sync like it is a huge leap in your gaming experience, when in fact it is not. It pretty much feels just like regular sync, really. Of course, I am talking about the case where you are consistently getting high fps on your screen (which is pretty much a given if you have that shiny high-end GPU inside). G-Sync / FreeSync is overrated.
Without G-Sync on my 3440x1440 monitor, anything below 60 fps would become a stutter party. With G-Sync enabled I can play even at 30 fps with maximum fluidity.
 
Indeed, it is smooth and stutter free. However, imo, people can live without it.

Personally, I have gsync turned on all the time even with competitive games like Overwatch and PUBG.


But to opt for a power-inefficient and hot card just for the sake of gaming on a FreeSync (or G-Sync, as the case may be) monitor is not worth it imo.
 
Indeed, it is smooth and stutter free. However, imo, people can live without it.

Personally, I have gsync turned on all the time even with competitive games like Overwatch and PUBG.

But to opt for a power-inefficient and hot card just for the sake of gaming on a FreeSync monitor is not worth it imo.

You should try Enhanced Sync with FreeSync, and/or just Enhanced Sync on its own.
I tested Chill with it; I am very impressed with the trio.

We also tested some die-hard G-Sync fans with Enhanced Sync, and that was no sweat for those "lag" complainers.
Specifically with Enhanced Sync and FreeSync, it seems AMD have it sorted. Their hardware still needs drivers and optimizations from game devs to tap into the new features and performance methods, or maybe that never happens.
Or better: make hardware that fits the gaming market better.

I am pro AMD's supporting features rather than their hardware; on the latter I'd rather not give my honest opinion :)

Also, Wattman seems to be a nightmare.
 
As much as people like to think this is a miner's delight of a card, I kinda wonder if that is the case given the MH/s per watt. For example, a Vega 64 can draw up to 300 watts, which it probably will to get that 33 MH/s, yet for half that power a 1070 is listed at 26.5 MH/s. You could get two 1070s; yes, it costs more, but you get ~53 MH/s for the same power envelope. So the stock 64 at least might be a problem, and the 56 might be the one miners go for instead, if they go for Vega at all.
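A quick back-of-the-envelope check of that hashrate-per-watt argument, using the figures quoted above (33 MH/s at ~300 W for Vega 64, 26.5 MH/s at ~150 W per GTX 1070; these are the poster's rough numbers, not measured values):

```python
# Rough hashrate-per-watt comparison using the figures quoted above.
# All numbers are the poster's estimates, not measurements.
cards = {
    "RX Vega 64":  {"mhs": 33.0, "watts": 300},   # assumed worst-case draw
    "GTX 1070":    {"mhs": 26.5, "watts": 150},
    "2x GTX 1070": {"mhs": 53.0, "watts": 300},   # two cards, same envelope
}

for name, c in cards.items():
    eff = c["mhs"] / c["watts"]  # MH/s per watt
    print(f"{name:12s} {c['mhs']:5.1f} MH/s @ {c['watts']:3d} W -> {eff:.3f} MH/s per W")
```

On those numbers the 1070 pair lands at roughly 0.18 MH/s per W versus about 0.11 for the stock Vega 64, which is the point being made about the 56 being the more likely mining pick.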
 
When is it not an excuse and instead a clear reflection of reality?
You were the one claiming "Vega is a compute chip that can also play some games", while Vega10 is their gaming chip.

You people talk about FreeSync / G-Sync like it is a huge leap in your gaming experience, when in fact it is not. It pretty much feels just like regular sync, really. Of course, I am talking about the case where you are consistently getting high fps on your screen (which is pretty much a given if you have that shiny high-end GPU inside). G-Sync / FreeSync is overrated.
Then you're doing it wrong.
G-Sync is the greatest improvement in gaming in years. Smooth gaming is more important than framerate or resolution.
 
AMD, what is the point? You can't even buy one because they are all sold out. The in-stock prices are $100 more than announced, so the Vega 64 Liquid is $699. I'm not sure why you would spend 700 bucks on a Vega 64 Liquid when you can get a 1080 Ti for 750 bucks. When will it actually be available to purchase and play games with?
 
@W1zzard
Can you please explain your scale?
How can a terrible product like this get an 8.6 out of 10? And how bad does a product have to be to get something like a 5? What is the point of a 1-10 scale when there is barely any difference between a bad card like this and a good card like the GTX 1080 (which got 9.1)?

In my book, a 5 should be a middling score: a completely OK, decent product. Vega 64 fails to deliver in terms of performance, efficiency, noise, etc. It deserves a score of 3, or 4 at most.

-----

Who in their right mind would buy Vega 56/64 for gaming? GTX 1070/1080 are clearly better options.

Well, a little late to respond, but I just went through some old reviews and saw your quite valid question. The old mighty "Fermistor" GTX 480 got 8.2, which was not really that bad compared to the competition (which got 9.5). The GTX 590 blew up and got 7.0. And the fixed Fermi GTX 580 got 9.0, while the direct competitor reached a low point of 8.0...

All in all, W1zzard is usually quite consistent with scores. E.g. he gave the GTX 1080 FE a 9.1, which was higher than the score this gets (8.6). And then there's the scoring difference between reference and custom cards. The highest score for a custom GTX 1080 is the Zotac GTX 1080 AMP! Extreme at 9.9 (which I don't really get, given the crappy VRM cooling it has). I'm quite confident there will be a custom RX Vega 64 card that scores higher than the GTX 1080 FE, and for good reason too (better cooling, no price premium, better performance, so a better card altogether; power and heat remain, but those can be managed).
 
This card is of no use to gamers. It just matches the competition's card, which has been out for a year, and does so only while consuming 125 W+ more power. Still, miners will snatch up every one they can make, so I guess it's still a win for AMD.
 
I had a chance to purchase an RX Vega 64 at Micro Center but declined it and told them to sell it to someone else. I know heat is always an issue with reference cards, and even if I resold an RX Vega 64 for $200-$300 over MSRP, $200-$300 isn't going to get me ahead in life. I quite frankly don't need the money. I will wait for AIB cards, like I mentioned in other posts.
 
1. Most of the extra transistors were used on improving frequency
2. RTG sacrificed efficiency (IPC) from Fiji to improve frequency
<cut>
4. GloFo manufacturing simply cannot tame the power consumption at such high frequency.
1 and 2 make no sense. Making the cores more complex would require higher voltages, which in the end would limit the frequency. Regarding 4, the problem is not the node: AMD had the same problems when they were using the same 28 nm node as Nvidia. Vega is power-limited; since its circuit design is less efficient, it needs more power to make the transistors switch quickly enough to maintain stability, which in the end results in high energy consumption and throttling.
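The trade-off being argued here roughly follows the standard dynamic-power relation P ≈ C·V²·f: raising the clock costs power linearly, but any voltage bump needed to hold that clock costs power quadratically. A minimal sketch; the 1.05 GHz → 1.7 GHz clocks come from the Anandtech quote below, while the voltage figure is purely an assumed illustration:

```python
# Illustrative dynamic-power scaling: P ~ C * V^2 * f.
# The voltage number is made up to show the shape of the trade-off,
# not an actual Vega or Fiji operating point.
def relative_power(v, f, v0=1.0, f0=1.0):
    """Power relative to a baseline part running at voltage v0, frequency f0."""
    return (v / v0) ** 2 * (f / f0)

freq_ratio = 1.7 / 1.05  # Fiji ~1.05 GHz -> Vega ~1.7 GHz (per Anandtech)
print(f"{relative_power(v=1.00, f=freq_ratio):.2f}x power from frequency alone")
print(f"{relative_power(v=1.15, f=freq_ratio):.2f}x power if voltage also rises ~15%")
```

Frequency alone gives about 1.6x the power; add even a modest voltage increase on top and you are well past 2x, which is where the "power-limited" behaviour described above comes from.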
 
Why would I buy this Vega card when I can get a used GTX 1080 on eBay for ~$499 (actually saw a different Gigabyte 1080 for $475) which equals it and saves me about $100 a year in electricity (if not more)?

I'm all for an AMD comeback riding on the coat tails of Ryzen's success, but this is like arriving too late to a party. With warm beer.

As for the Miners. Go for it. Not interested.

http://www.ebay.com/itm/GigaByte-Wi...357924&hash=item440974e951:g:aXUAAOSwgxxZk2zR
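For what it's worth, here is a minimal sketch of where a "$100 a year" electricity figure can come from, assuming roughly the 125 W load-power gap mentioned earlier in the thread, plus an assumed usage pattern and electricity rate (none of these inputs are from the original post):

```python
# Back-of-the-envelope annual cost of an extra ~125 W under load.
# Hours/day and $/kWh are assumptions for illustration only.
EXTRA_WATTS = 125
RATE_PER_KWH = 0.13  # assumed average US residential rate, $/kWh

def annual_cost(hours_per_day):
    kwh = EXTRA_WATTS / 1000 * hours_per_day * 365
    return kwh, kwh * RATE_PER_KWH

for label, hours in [("4 h/day gaming", 4), ("24/7 (e.g. mining)", 24)]:
    kwh, dollars = annual_cost(hours)
    print(f"{label:18s} {kwh:6.0f} kWh/year -> ${dollars:.0f}/year")
```

At a few hours of gaming per day the difference is closer to $25 a year; the ~$100+ figure only really materialises with heavy daily use, higher electricity rates, or the card running around the clock.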
 
1 and 2 make no sense. Making the cores more complex would require higher voltages, which in the end would limit the frequency. Regarding 4, the problem is not the node: AMD had the same problems when they were using the same 28 nm node as Nvidia. Vega is power-limited; since its circuit design is less efficient, it needs more power to make the transistors switch quickly enough to maintain stability, which in the end results in high energy consumption and throttling.

My source: http://www.anandtech.com/show/11717/the-amd-radeon-rx-vega-64-and-56-review/2

Regarding not expanding the actually important part, the CU arrays:

Anandtech said:
Talking to AMD’s engineers about the matter, they haven’t taken any steps with Vega to change this. They have made it clear that 4 compute engines is not a fundamental limitation – they know how to build a design with more engines – however to do so would require additional work. In other words, the usual engineering trade-offs apply, with AMD’s engineers focusing on addressing things like HBCC and rasterization as opposed to doing the replumbing necessary for additional compute engines in Vega 10

Not shown on AMD’s diagram, but confirmed in the specifications, is how the CUs are clustered together within a compute engine. On all iterations of GCN, AMD has bundled CUs together in a shader array, with up to 4 CUs sharing a single L1 instruction cache and a constant cache. For Vega 10, that granularity has gone up a bit, and now only 3 CUs share any one of these cache sets. As a result there are now 6 CU arrays per compute engine, up from 4 on Fiji.

Regarding the extra transistors
Anandtech said:
That space is put to good use however, as it contains a staggering 12.5 billion transistors. This is 3.9B more than Fiji, and still 500M more than NVIDIA’s GP102 GPU. So outside of NVIDIA’s dedicated compute GPUs, the GP100 and GV100, Vega 10 is now the largest consumer & professional GPU on the market.

Given the overall design similarities between Vega 10 and Fiji, this gives us a very rare opportunity to look at the cost of Vega’s architectural features in terms of transistors. Without additional functional units, the vast majority of the difference in transistor counts comes down to enabling new features.

Talking to AMD’s engineers, what especially surprised me is where the bulk of those transistors went; the single largest consumer of the additional 3.9B transistors was spent on designing the chip to clock much higher than Fiji. Vega 10 can reach 1.7GHz, whereas Fiji couldn’t do much more than 1.05GHz. Additional transistors are needed to add pipeline stages at various points or build in latency hiding mechanisms, as electrons can only move so far on a single (ever shortening) clock cycle; this is something we’ve seen in NVIDIA’s Pascal, not to mention countless CPU designs. Still, what it means is that those 3.9B transistors are serving a very important performance purpose: allowing AMD to clock the card high enough to see significant performance gains over Fiji.
 
Why would I buy this Vega card when I can get a used GTX 1080 on eBay for ~$499 (actually saw a different Gigabyte 1080 for $475) which equals it and saves me about $100 a year in electricity (if not more)?

I'm all for an AMD comeback riding on the coat tails of Ryzen's success, but this is like arriving too late to a party. With warm beer.

As for the Miners. Go for it. Not interested.

http://www.ebay.com/itm/GigaByte-Wi...357924&hash=item440974e951:g:aXUAAOSwgxxZk2zR

At least you might get it that cheap. Here in Europe it costs €650 for the vanilla blower model, while you can get a custom GTX 1080 Ti for €699 and a custom GTX 1080 for under €550...
 
At least you might get it that cheap. Here in Europe it costs €650 for the vanilla blower model, while you can get a custom GTX 1080 Ti for €699 and a custom GTX 1080 for under €550...
Vega would probably be in the same overpriced scenario.
 
Vega would probably be in the same overpriced scenario.

I bought an RX Vega 64 for GTX 1070 money here in Norway.
The next day it was at almost GTX 1080 Ti price (I have no idea why, really)...
 
Enhanced Sync:
This contains NOTHING that NVidia does not have, though I think the article implies it does at the end. At best it works the same. There are however Freesync monitors with no LFC whereas GSYNC always supports this.

Enhanced Sync is equivalent to NVidia's:
FAST SYNC + Adaptive VSync + GSync (again with the LFC caveat)

(Adaptive VSync is VSYNC ON and OFF automatically. It is not used if GSYNC is working. Same on Freesync)

AMD did a video where they made it sound simple. For example, they talked about showing the "last frame created" once you go over the top of the Freesync range (so FAST SYNC), but what they failed to mention is that this is pointless unless you can generate over 2x the FPS; otherwise you never get a 2nd frame that allows you to drop the first one. For example, on a 144Hz monitor with a worst-case 48Hz to 90Hz Freesync range it is:

0 to 48FPS (VSYNC OFF or ON; thus screen tear or stuttering)
48FPS to 90FPS (FASTSYNC; ideal tear-free, minimal lag zone)
90FPS to 144FPS (VSYNC OFF or ON; if VSYNC ON then stutter as you are not synching to 144Hz)
144FPS (VSYNC ON; if chosen by default no screen tear, not much latency)
144FPS to approx 300FPS (VSYNC OFF needed; screen-tear may or may not be obvious but "twitch" shooters may prefer to 144FPS VSYNC ON)
300FPS+ (if you choose "Enhanced" I guess it doesn't cap but works like FAST SYNC so no screen tear as it draws 144FPS but draws only the latest full frame. So similar to 144FPS VSYNC ON but slightly less lag. Very, very minor so only the very BEST twitch shooters could tell)

*See how CONFUSING that setup is (again a worst-case but Freesync range can be hard to find). On a 144Hz GSYNC monitor it is always THIS or close:

0 to 30FPS (each frame doubled to stay in GSYNC mode. So 29FPS becomes "58FPS")
30FPS to 144FPS (GSYNC)

.. above 144FPS options same as Freesync

**Again though, some of the good Freesync monitors are close. Some are 40Hz to 144Hz so you have a nice range and the driver supports LFC so drops below 40FPS are not a big deal.
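To make that worst-case example a bit more concrete, here is a minimal sketch that maps a frame rate to the behaviour described above (the 48-90 Hz FreeSync range and 144 Hz refresh are the worst-case numbers from the example, not a real monitor, and the mode labels are simplified):

```python
# Sketch of the behaviour described above for the worst-case example:
# 144 Hz panel, FreeSync range 48-90 Hz, no LFC.
REFRESH = 144
FS_LOW, FS_HIGH = 48, 90

def freesync_behaviour(fps):
    if fps < FS_LOW:
        return "below range: plain VSYNC on/off (stutter or tearing)"
    if fps <= FS_HIGH:
        return "in range: variable refresh, tear-free, minimal lag"
    if fps <= REFRESH:
        return "above range: VSYNC on (stutter) or off (tearing)"
    return "Enhanced/Fast Sync: newest complete frame shown at 144 Hz"

def gsync_behaviour(fps):
    if fps < 30:
        # Low-framerate compensation: frames are repeated so the panel
        # stays inside its variable-refresh window (29 fps -> "58 Hz").
        return f"LFC: frames doubled, panel runs at {fps * 2} Hz"
    if fps <= REFRESH:
        return "G-Sync: variable refresh, tear-free"
    return "above refresh: same options as FreeSync (VSYNC or Fast Sync)"

for fps in (29, 45, 70, 120, 200):
    print(f"{fps:3d} fps | FreeSync: {freesync_behaviour(fps)}")
    print(f"        | G-Sync:   {gsync_behaviour(fps)}")
```

The point of the comparison is simply that the G-Sync mapping is the same on every monitor, while the FreeSync one depends entirely on the panel's advertised range and whether LFC is available.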

GSync:
May cost more but it is SUPERIOR overall. There are some Freesync monitors with LFC, as discussed, that are very good, but it is hit and miss. Even the better ones may not be able to maintain color/blur as well, since they use OVERDRIVE, which is problematic with changing frame times (unless you, for example, build a hardware module to help with that).

FREESYNC 2/HDR:
This makes it closer to GSYNC by requiring LFC support, but with variable frame times and a wider range of colors/brightness due to HDR, it is much harder to make this work. Price may be a big jump up from normal Freesync, whereas on GSync 2 the add-on module should reduce the monitor manufacturers' R&D considerably, so if they can get the price of the modules down, GSYNC and FREESYNC 2 should get closer in price, with GSYNC 2 likely to remain superior.

OTHER:
My main issue with the REVIEW, which was mostly excellent, was that I saw no reference to what a top-end GTX1080 could do, or even which card was used. Later we need to compare two Asus Strix models (3-fan), GTX1080 vs VEGA64, and see how they do in terms of performance and noise. Liquid cooling seems mostly pointless if it costs close to a GTX1080Ti that beats it in every way.

GAMERS NEXUS noted that the VEGA56 has a voltage/power cap which is currently impossible to overcome, but there does appear to be enough headroom left to nearly MATCH the VEGA64 (or at least the air-cooled VEGA64, which has temperature throttling).
 
RX-VEGA my two cents:
After looking at other reviewers with different results, it appears it is best to conclude that VEGA56 is nearly identical to the GTX1070 and VEGA64 close to the GTX1080 on AVERAGE.

I expect VEGA to age better due to FP16, ACE, and due to the fact that the basic GCN architecture is in the consoles.

Now, many people say "so what, that's in the FUTURE and by then... blah blah", but a lot of people buy a graphics card and keep it for 3+ years, so guessing how the card will age is very important. I have seen the "FINE WINE" info before for AMD vs NVidia and was not really impressed, but I do think it is completely DIFFERENT now, because software never really had a chance to optimize for GCN before, since DX12/Vulkan was required to implement the best features.

But... on the other hand NVidia tends to do a better job at more timely drivers.

Power (HEAT) is another issue, in particular for the VEGA64, since I can NOT use that card in my small room: the room would simply get far too hot. An extra 100 watts or so makes a HUGE difference. VEGA56 is more reasonable, though I'd still get something like an Asus Strix.

VEGA64 solves the heat issue (update: I mean temperature issue, not heat) with liquid cooling but then costs so much that you should just get a GTX1080Ti instead.

None of this matters for the cheaper VEGA64 and VEGA56 unless the PRICE is right, and that may be a big problem until mining demand dies down AND stock is sufficient that resellers don't overcharge.

*So in general there are pros and cons, but I think VEGA56 in particular will be the best value mostly due to its FUTURE improvements relative to the GTX1070, and assuming the price is nearly IDENTICAL to a GTX1070 with the same cooler.

FEATURES: most people don't use extra features but it should be looked at if interested. How well does RECORDING compare, or features like ANSEL for 2D and 3D screenshots (in only a few titles so far). There is also VR SUPPORT and frankly I don't know how they compare. AMD's asynchronous architecture in theory should be better but NVidia tends to do better with their software support.

AMD has been improving in software quite a bit, to the point that they MATCH NVidia most of the time, but I wouldn't say they are quite as good yet.

If an Asus Strix VEGA56 card was priced at roughly $450USD today that would be an excellent buy IMO.

(I do not see any advantage to having the HBCC for gaming unless the game needs more than 8GB, and also can't normally swap the data around in a timely fashion. HBM2 though does appear to help at higher resolutions, though possibly not enough to justify the cost since AMD could probably have dropped prices more so the VALUE proposition might have been better with say GDDR5x instead of HBM2 and maybe a $349 RX-VEGA56 MSRP)
 
RX-VEGA my two cents:
After looking at other reviewers with different results, it appears it is best to conclude that VEGA56 is nearly identical to the GTX1070 and VEGA64 close to the GTX1080 on AVERAGE.

I expect VEGA to age better due to FP16, ACE, and due to the fact that the basic GCN architecture is in the consoles.

Now, many people say "so what, that's in the FUTURE and by then... blah blah", but a lot of people buy a graphics card and keep it for 3+ years, so guessing how the card will age is very important. I have seen the "FINE WINE" info before for AMD vs NVidia and was not really impressed, but I do think it is completely DIFFERENT now, because software never really had a chance to optimize for GCN before, since DX12/Vulkan was required to implement the best features.

But... on the other hand NVidia tends to do a better job at more timely drivers.

Power (HEAT) is another issue, in particular for the VEGA64, since I can NOT use that card in my small room: the room would simply get far too hot. An extra 100 watts or so makes a HUGE difference. VEGA56 is more reasonable, though I'd still get something like an Asus Strix.

VEGA64 solves the heat issue (update: I mean temperature issue, not heat) with liquid cooling but then costs so much that you should just get a GTX1080Ti instead.

None of this matters for the cheaper VEGA64 and VEGA56 unless the PRICE is right, and that may be a big problem until mining demand dies down AND stock is sufficient that resellers don't overcharge.

*So in general there are pros and cons, but I think VEGA56 in particular will be the best value mostly due to its FUTURE improvements relative to the GTX1070, and assuming the price is nearly IDENTICAL to a GTX1070 with the same cooler.

FEATURES: most people don't use extra features but it should be looked at if interested. How well does RECORDING compare, or features like ANSEL for 2D and 3D screenshots (in only a few titles so far). There is also VR SUPPORT and frankly I don't know how they compare. AMD's asynchronous architecture in theory should be better but NVidia tends to do better with their software support.

AMD has been improving in software quite a bit, to the point that they MATCH NVidia most of the time, but I wouldn't say they are quite as good yet.

If an Asus Strix VEGA56 card was priced at roughly $450USD today that would be an excellent buy IMO.

(I do not see any advantage to having the HBCC for gaming unless the game needs more than 8GB, and also can't normally swap the data around in a timely fashion. HBM2 though does appear to help at higher resolutions, though possibly not enough to justify the cost since AMD could probably have dropped prices more so the VALUE proposition might have been better with say GDDR5x instead of HBM2 and maybe a $349 RX-VEGA56 MSRP)
Enhanced Sync:
This contains NOTHING that NVidia does not have, though I think the article implies it does at the end. At best it works the same. There are however Freesync monitors with no LFC whereas GSYNC always supports this.

Enhanced Sync is equivalent to NVidia's:
FAST SYNC + Adaptive VSync + GSync (again with the LFC caveat)

(Adaptive VSync is VSYNC ON and OFF automatically. It is not used if GSYNC is working. Same on Freesync)

AMD did a video where they made it sound simple. For example, they talked about showing the "last frame created" once you go over the top of the Freesync range (so FAST SYNC), but what they failed to mention is that this is pointless unless you can generate over 2x the FPS; otherwise you never get a 2nd frame that allows you to drop the first one. For example, on a 144Hz monitor with a worst-case 48Hz to 90Hz Freesync range it is:

0 to 48FPS (VSYNC OFF or ON; thus screen tear or stuttering)
48FPS to 90FPS (FASTSYNC; ideal tear-free, minimal lag zone)
90FPS to 144FPS (VSYNC OFF or ON; if VSYNC ON then stutter as you are not synching to 144Hz)
144FPS (VSYNC ON; if chosen by default no screen tear, not much latency)
144FPS to approx 300FPS (VSYNC OFF needed; screen-tear may or may not be obvious but "twitch" shooters may prefer to 144FPS VSYNC ON)
300FPS+ (if you choose "Enhanced" I guess it doesn't cap but works like FAST SYNC so no screen tear as it draws 144FPS but draws only the latest full frame. So similar to 144FPS VSYNC ON but slightly less lag. Very, very minor so only the very BEST twitch shooters could tell)

*See how CONFUSING that setup is (again a worst-case but Freesync range can be hard to find). On a 144Hz GSYNC monitor it is always THIS or close:

0 to 30FPS (each frame doubled to stay in GSYNC mode. So 29FPS becomes "58FPS")
30FPS to 144FPS (GSYNC)

.. above 144FPS options same as Freesync

**Again though, some of the good Freesync monitors are close. Some are 40Hz to 144Hz so you have a nice range and the driver supports LFC so drops below 40FPS are not a big deal.

GSync:
May cost more but it is SUPERIOR overall. There are some Freesync monitors with LFC, as discussed, that are very good, but it is hit and miss. Even the better ones may not be able to maintain color/blur as well, since they use OVERDRIVE, which is problematic with changing frame times (unless you, for example, build a hardware module to help with that).

FREESYNC 2/HDR:
This makes it closer to GSYNC by requiring LFC support, but with variable frame times and a wider range of colors/brightness due to HDR, it is much harder to make this work. Price may be a big jump up from normal Freesync, whereas on GSync 2 the add-on module should reduce the monitor manufacturers' R&D considerably, so if they can get the price of the modules down, GSYNC and FREESYNC 2 should get closer in price, with GSYNC 2 likely to remain superior.

OTHER:
My main issue with the REVIEW, which was mostly excellent, was that I saw no reference to what a top-end GTX1080 could do, or even which card was used. Later we need to compare two Asus Strix models (3-fan), GTX1080 vs VEGA64, and see how they do in terms of performance and noise. Liquid cooling seems mostly pointless if it costs close to a GTX1080Ti that beats it in every way.

GAMERS NEXUS noted that the VEGA56 has a voltage/power cap which is currently impossible to overcome, but there does appear to be enough headroom left to nearly MATCH the VEGA64 (or at least the air-cooled VEGA64, which has temperature throttling).

Why should I take any advice from someone

1) who can't figure out how NOT to double post,
2) who doesn't have the card to see how it works compared to the competition, and
3) who only has theory and a wall of text explaining their uninformed ideas?

W1zz has the card, tried the features, and found them to be better than the Nvidia implementation.

As to your "overdrive" idea, overdrive is OVERCLOCKING the pixel clock and causing the refresh rate to go above specified, which has nothing to do with lower than refresh rate Freesync. My TV has 6Gb of memory and an AMD chip in it and I only experience tearing if I don't turn on max frame rates, as it already interleaves frames adaptively. I don't think you understand how "overdrive" works VS just syncing frames to the front or back porch signal http://lmgtfy.com/?q=HDMI+front+porch and the fact that many TV's already perform interleaving or frames (2/3 pull down) on 24Hz sources to prevent backlight and frame flickering issues and all you have to do is turn the feature on and even low frame rates don't cause stuttering or tearing. Freesync was just an extension of that and used technology already in use.
 
The irony of Vega is that if it were good for gamers, it would be perfect for miners. Either way, no gamers would use it.
 