
AMD Radeon RX Vega 64 8 GB

Joined
Sep 2, 2011
Messages
1,019 (0.21/day)
Location
Porto
System Name No name / Purple Haze
Processor Phenom II 1100T @ 3.8Ghz / Pentium 4 3.4 EE Gallatin @ 3.825Ghz
Motherboard MSI 970 Gaming/ Abit IC7-MAX3
Cooling CM Hyper 212X / Scythe Andy Samurai Master (CPU) - Modded Ati Silencer 5 rev. 2 (GPU)
Memory 8GB GEIL GB38GB2133C10ADC + 8GB G.Skill F3-14900CL9-4GBXL / 2x1GB Crucial Ballistix Tracer PC4000
Video Card(s) Asus R9 Fury X Strix (4096 SP's/1050 Mhz)/ PowerColor X850XT PE @ (600/1230) AGP + (HD3850 AGP)
Storage Samsung 250 GB / WD Caviar 160GB
Display(s) Benq XL2411T
Audio Device(s) motherboard / Creative Sound Blaster X-Fi XtremeGamer Fatal1ty Pro + Front panel
Power Supply Tagan BZ 900W / Corsair HX620w
Mouse Zowie AM
Keyboard Qpad MK-50
Software Windows 7 Pro 64Bit / Windows XP
Benchmark Scores 64CU Fury: http://www.3dmark.com/fs/11269229 / X850XT PE http://www.3dmark.com/3dm05/5532432
On Amazon's best-selling CPU list, the Ryzen 1700X jumped to 2nd place (from somewhere around the top 20). Guess why.

RX Vega bundles?
 
Joined
Jan 2, 2008
Messages
3,296 (0.53/day)
System Name Thakk
Processor i7 6700k @ 4.5Ghz
Motherboard Gigabyte G1 Z170N ITX
Cooling H55 AIO
Memory 32GB DDR4 3100 c16
Video Card(s) Zotac RTX3080 Trinity
Storage Corsair Force GT 120GB SSD / Intel 250GB SSD / Samsung Pro 512 SSD / 3TB Seagate SV32
Display(s) Acer Predator X34 100hz IPS Gsync / HTC Vive
Case QBX
Audio Device(s) Realtek ALC1150 > Creative Gigaworks T40 > AKG Q701
Power Supply Corsair SF600
Mouse Logitech G900
Keyboard Ducky Shine TKL MX Blue + Vortex PBT Doubleshots
Software Windows 10 64bit
Benchmark Scores http://www.3dmark.com/fs/12108888
You people talk about FreeSync / G-Sync like it's a huge leap in your gaming experience, when in fact it is not. It feels pretty much like regular sync, really. Of course, I'm talking about when you consistently get high fps on your screen (which is pretty much the case if you have that shiny high-end GPU inside). G-Sync / FreeSync is overrated.
 
Joined
Sep 24, 2014
Messages
1,269 (0.34/day)
Location
Birmingham UK
System Name El Calpulator
Processor AMD Ryzen R7 7800X3D
Motherboard ASRock X670E Pro RS
Cooling ArcticCooling Freezer 3 360ARGB AIO
Memory 32GB Corsair Vengance 6000Mhz C30
Video Card(s) MSI RTX 4080 Gaming Trio X @ 2925 / 23500 mhz
Storage 5TB nvme SSD + Synology DS115j NAS with 4TB HDD
Display(s) Samsung G8 34" QD-OLED + Samsung 28" 4K 60hz UR550
Case Montech King 95 PRO Blue
Audio Device(s) SB X4+Logitech Z623 2.1+Astro A50 Wireless
Power Supply be quiet! Pure Power 12 M 1000W ATX 3.0 80+ Gold
Mouse Logitech G502X Plus LightSpeed Hero Wireless plus Logitech G POWERPLAY Wireless Charging Mouse Pad
Keyboard Logitech G915 LightSpeed Wireless
Software Win 11 Pro
Benchmark Scores Just enough
You people talk about FreeSync / G-Sync like it's a huge leap in your gaming experience, when in fact it is not. It feels pretty much like regular sync, really. Of course, I'm talking about when you consistently get high fps on your screen (which is pretty much the case if you have that shiny high-end GPU inside). G-Sync / FreeSync is overrated.
For me, G-Sync is a very big plus.
 
Joined
Feb 8, 2012
Messages
3,014 (0.64/day)
Location
Zagreb, Croatia
System Name Windows 10 64-bit Core i7 6700
Processor Intel Core i7 6700
Motherboard Asus Z170M-PLUS
Cooling Corsair AIO
Memory 2 x 8 GB Kingston DDR4 2666
Video Card(s) Gigabyte NVIDIA GeForce GTX 1060 6GB
Storage Western Digital Caviar Blue 1 TB, Seagate Baracuda 1 TB
Display(s) Dell P2414H
Case Corsair Carbide Air 540
Audio Device(s) Realtek HD Audio
Power Supply Corsair TX v2 650W
Mouse Steelseries Sensei
Keyboard CM Storm Quickfire Pro, Cherry MX Reds
Software MS Windows 10 Pro 64-bit

Tatty_Two

Gone Fishing
Joined
Jan 18, 2006
Messages
25,944 (3.75/day)
Location
Worcestershire, UK
Processor Intel Core i9 11900KF @ -.080mV PL max @220w
Motherboard MSI MAG Z490 TOMAHAWK
Cooling DeepCool LS520SE Liquid + 3 Phanteks 140mm case fans
Memory 32GB (4 x 8GB SR) Patriot Viper Steel Bdie @ 3600Mhz CL14 1.45v Gear 1
Video Card(s) Asus Dual RTX 4070 OC + 8% PL
Storage WD Blue SN550 1TB M.2 NVME//Crucial MX500 500GB SSD (OS)
Display(s) AOC Q2781PQ 27 inch Ultra Slim 2560 x 1440 IPS
Case Phanteks Enthoo Pro M Windowed - Gunmetal
Audio Device(s) Onboard Realtek ALC1200/SPDIF to Sony AVR @ 5.1
Power Supply Seasonic CORE GM650w Gold Semi modular
Software Win 11 Home x64
Just checked availability and pricing in the UK (Scan): the cheapest Vega 64 is £110 more expensive than the cheapest 1080, and those have aftermarket cooling. I am seriously hoping these prices drop once things have settled.
 
Joined
Sep 15, 2011
Messages
6,762 (1.40/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
You people talk about FreeSync / G-Sync like it's a huge leap in your gaming experience, when in fact it is not. It feels pretty much like regular sync, really. Of course, I'm talking about when you consistently get high fps on your screen (which is pretty much the case if you have that shiny high-end GPU inside). G-Sync / FreeSync is overrated.
Without G-Sync, on my 3440x1440 monitor anything below 60 fps would become a stutter party. With G-Sync enabled I can play even at 30 fps with maximum fluidity.
 
Joined
Jan 2, 2008
Messages
3,296 (0.53/day)
System Name Thakk
Processor i7 6700k @ 4.5Ghz
Motherboard Gigabyte G1 Z170N ITX
Cooling H55 AIO
Memory 32GB DDR4 3100 c16
Video Card(s) Zotac RTX3080 Trinity
Storage Corsair Force GT 120GB SSD / Intel 250GB SSD / Samsung Pro 512 SSD / 3TB Seagate SV32
Display(s) Acer Predator X34 100hz IPS Gsync / HTC Vive
Case QBX
Audio Device(s) Realtek ALC1150 > Creative Gigaworks T40 > AKG Q701
Power Supply Corsair SF600
Mouse Logitech G900
Keyboard Ducky Shine TKL MX Blue + Vortex PBT Doubleshots
Software Windows 10 64bit
Benchmark Scores http://www.3dmark.com/fs/12108888
Indeed, it is smooth and stutter-free. However, imo, people can live without it.

Personally, I have G-Sync turned on all the time, even in competitive games like Overwatch and PUBG.

But opting for a power-inefficient and hot card just for the sake of gaming on a FreeSync (or G-Sync, as the case may be) monitor is not worth it imo.
 
Joined
Feb 19, 2009
Messages
1,162 (0.20/day)
Location
I live in Norway
Processor R9 5800x3d | R7 3900X | 4800H | 2x Xeon gold 6142
Motherboard Asrock X570M | AB350M Pro 4 | Asus Tuf A15
Cooling Air | Air | duh laptop
Memory 64gb G.skill SniperX @3600 CL16 | 128gb | 32GB | 192gb
Video Card(s) RTX 4080 |Quadro P5000 | RTX2060M
Storage Many drives
Display(s) AW3423dwf.
Case Jonsbo D41
Power Supply Corsair RM850x
Mouse g502 Lightspeed
Keyboard G913 tkl
Software win11, proxmox
Indeed, it is smooth and stutter-free. However, imo, people can live without it.

Personally, I have G-Sync turned on all the time, even in competitive games like Overwatch and PUBG.

But opting for a power-inefficient and hot card just for the sake of gaming on a FreeSync monitor is not worth it imo.

You should try Enhanced Sync with FreeSync, or just Enhanced Sync on its own.
I tested Chill together with them, and I am very impressed with the trio.

We also tested some die-hard G-Sync fans with Enhanced Sync, and there was no complaining from the usual "lag" complainers.
With Enhanced Sync and FreeSync specifically, AMD seem to have it sorted. Their hardware needs drivers and optimizations from game devs to tap into the new features and methods for performance, or maybe that never happens.
Or better: make hardware that fits the gaming market better.

I am pro AMD's supporting features, not their hardware; on the latter I'd rather not give my honest opinion :)

Also, WattMan seems to be a nightmare.
 
Joined
Jun 13, 2012
Messages
1,409 (0.31/day)
Processor i7-13700k
Motherboard Asus Tuf Gaming z790-plus
Cooling Coolermaster Hyper 212 RGB
Memory Corsair Vengeance RGB 32GB DDR5 7000mhz
Video Card(s) Asus Dual Geforce RTX 4070 Super ( 2800mhz @ 1.0volt, ~60mhz overlock -.1volts)
Storage 1x Samsung 980 Pro PCIe4 NVme, 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB273u 27inch IPS G-Sync 165hz
Audio Device(s) Logitech Z906 5.1
Power Supply Corsair RMx Series RM850x (OCZ Z series PSU retired after 13 years of service)
Mouse Logitech G502 hero
Keyboard Logitech G710+
As much as people like to think this is a miner's delight, I wonder if that is really the case given the MH/s per watt. For example, a Vega 64 can draw up to 300 W, which is probably what it takes to hit that 33 MH/s, yet at half that power a 1070 is listed at 26.5 MH/s. You could get two 1070s; it costs more, yes, but that's 53 MH/s in the same power envelope. So stock might be a problem for the 64 at least, but the 56 might be the one miners go for instead, if they go for either.
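The per-watt arithmetic above can be sketched quickly; the figures are the ones claimed in the post, not measured values (and note 2 x 26.5 is 53 MH/s):

```python
# Rough hashrate-per-watt comparison using the figures quoted above
# (claimed numbers from the post, not measured values).
cards = {
    "Vega 64": {"mhs": 33.0, "watts": 300.0},   # claimed draw and hashrate
    "GTX 1070": {"mhs": 26.5, "watts": 150.0},  # "half that" power
}

for name, c in cards.items():
    print(f"{name}: {c['mhs'] / c['watts']:.3f} MH/s per watt")

# Two 1070s in the same ~300 W envelope:
total = 2 * cards["GTX 1070"]["mhs"]
print(f"2x GTX 1070: {total:.1f} MH/s at 300 W")
```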
 
Joined
Jun 10, 2014
Messages
2,995 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
When is it not an excuse and instead a clear reflection of reality?
You were the one claiming "Vega is a compute chip that can also play some games", while Vega10 is their gaming chip.

You people talk about FreeSync / G-Sync like it's a huge leap in your gaming experience, when in fact it is not. It feels pretty much like regular sync, really. Of course, I'm talking about when you consistently get high fps on your screen (which is pretty much the case if you have that shiny high-end GPU inside). G-Sync / FreeSync is overrated.
Then you're doing it wrong.
G-Sync is the greatest improvement in gaming in years. Smooth gaming is more important than framerate or resolution.
 
Joined
Oct 24, 2013
Messages
10 (0.00/day)
AMD, what is the point? You can't even buy one because they are all sold out. The stock prices are $100 more than announced, so the Vega 64 Liquid is $699. I'm not sure why you would spend 700 bucks on a Vega 64 Liquid when you can get a 1080 Ti for 750 bucks. When will it actually be available to purchase and play games with?
 
Joined
Mar 10, 2014
Messages
1,793 (0.45/day)
@W1zzard
Can you please explain your scale?
How can a terrible product like this get an 8.6 out of 10? And how bad does a product have to be to get something like a 5? What is the point of a 1-10 scale when there is barely any difference between a bad card like this and a good card like the GTX 1080 (which got 9.1)?

In my book, a 5 should mean a mediocre score: a completely OK, decent product. Vega 64 fails to deliver in terms of performance, efficiency, noise, etc. It deserves a score of 3, 4 at most.

-----

Who in their right mind would buy Vega 56/64 for gaming? GTX 1070/1080 are clearly better options.

Well, a little late to respond, but I just went through some old reviews and saw your quite valid question. The old mighty "fermistor" GTX 480 got 8.2, which was not really that bad compared to the competition (which got 9.5). The GTX 590 blew up and got 7.0. And the fixed "fermistor" GTX 580 got 9.0, while the direct competitor reached a low point of 8.0...

All in all, W1zzard is usually quite consistent with scores. E.g., he gave the GTX 1080 FE a 9.1, which was higher than the 8.6 this card gets. And then there is the score difference between reference and custom cards. The highest score for a custom GTX 1080 is the Zotac GTX 1080 AMP! Extreme at 9.9 (which I don't really get, with that crappy VRM cooling it has). I'm quite confident there will be a custom RX Vega 64 card that scores higher than the GTX 1080 FE, and for good reason too: better cooling, no price premium, and better performance, so a better card altogether (power and heat aside, though those can be managed).
 
Joined
Nov 30, 2006
Messages
1,002 (0.15/day)
Location
NorCal
System Name Modest Box
Processor i5-4690K @ 4.7 Ghz
Motherboard ASUS Z97-C
Cooling Noctua NH-D15
Memory G.Skill Ares DDR3-2400 16GB
Video Card(s) Colorful GTX 950
Storage OCZ Vertex 460A 480GB
Display(s) HP w2558hc
Case Cooler Master Stacker 830
Audio Device(s) Onboard Realtek
Power Supply Gigabyte 750W Gold
Mouse Microsoft Intellimouse Explorer
Software Windows 10 64 Bit
This card is of no use to gamers. It merely matches the competition's card, which has been out for a year, and does so only while consuming 125 W+ more power. Still, miners will snatch up every one they can make, so I guess it's still a win for AMD.
 
Joined
Mar 3, 2017
Messages
38 (0.01/day)
I had a chance to purchase an RX Vega 64 at Microcenter, but I declined and told them to sell it to someone else. I know heat is always an issue with reference cards, but more to the point, flipping an RX Vega 64 for $200-$300 over MSRP isn't going to get me ahead in life; I quite frankly don't need the money. I will wait for AIB cards, as I mentioned in other posts.
 
Joined
Jun 10, 2014
Messages
2,995 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
1. Most of the extra transistors were used on improving frequency
2. RTG sacrificed efficiency (IPC) from Fiji to improve frequency
<cut>
4. GloFo manufacturing simply cannot tame the power consumption at such high frequency.
1 and 2 make no sense. Making the cores more complex would require higher voltages, which in the end would limit the frequency. Regarding 4, the problem is not the node: AMD had the same problems back when they were using the same 28 nm node as Nvidia. Vega is power limited; since its circuit design is less efficient, it needs more power to make the transistors respond quickly enough to maintain stability, which in the end results in high energy consumption and throttling.
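The voltage/frequency point can be illustrated with the first-order CMOS dynamic power relation P = C * V^2 * f: if reaching higher clocks forces the voltage up, power grows superlinearly with frequency. The numbers below are purely illustrative placeholders, not actual Vega characteristics:

```python
# First-order CMOS dynamic power: P = C_eff * V^2 * f.
# All numbers are illustrative placeholders, not real Vega data.
def dynamic_power(c_eff_nf, volts, freq_ghz):
    """Dynamic power in watts (nF * V^2 * GHz gives W)."""
    return c_eff_nf * volts**2 * freq_ghz

# If reaching higher clocks forces voltage up, power grows superlinearly:
for f, v in [(1.05, 1.00), (1.40, 1.10), (1.70, 1.20)]:
    print(f"{f:.2f} GHz @ {v:.2f} V -> {dynamic_power(150, v, f):.0f} W")
```

A ~62% clock increase here costs well over double the power, which is the shape of the trade-off being argued about.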
 
Joined
Jul 20, 2013
Messages
236 (0.06/day)
System Name Coffee Lake S
Processor i9-9900K
Motherboard MSI MEG Z390 ACE
Cooling Corsair H115i Platinum RGB
Memory Corsair Dominator Platinum RGB 32GB (2x16GB) DDR4 3466 C16
Video Card(s) EVGA RTX 2080 Ti XC2 Ultra
Storage Samsung 970 Pro M.2 512GB - Samsung 860 EVO 1TB SSD - WD Black 2TB HDD
Display(s) Dell P2715Q 27" 3840x2160 IPS @ 60Hz
Case Fractal Design Define R6
Power Supply Seasonic 860 watt Platinum
Mouse SteelSeries Rival 600
Keyboard Corsair K70 RGB MK.2
Software Windows 10 Pro 64 bit
Why would I buy this Vega card when I can get a used GTX 1080 on eBay for ~$499 that equals it and saves me about $100 a year in electricity, if not more? (I actually saw a different Gigabyte 1080 for $475.)

I'm all for an AMD comeback riding on the coattails of Ryzen's success, but this is like arriving too late to the party. With warm beer.

As for the Miners. Go for it. Not interested.

http://www.ebay.com/itm/GigaByte-Wi...357924&hash=item440974e951:g:aXUAAOSwgxxZk2zR
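The ~$100/year figure depends entirely on usage hours and local rates; a back-of-envelope sketch with assumed numbers (the 125 W gap is from the earlier post, the hours and rate are assumptions):

```python
# Back-of-envelope yearly cost of a ~125 W power gap.
# Hours/day and $/kWh are assumptions; adjust for your own usage and rates.
extra_watts = 125
hours_per_day = 8        # assumed heavy gaming use
price_per_kwh = 0.28     # assumed electricity rate in $/kWh

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
print(f"{kwh_per_year:.0f} kWh/year -> ${kwh_per_year * price_per_kwh:.0f}/year")
```

At lighter usage or cheaper rates the figure drops well below $100, so the claim holds only for heavy use.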
 
Joined
Mar 18, 2008
Messages
5,717 (0.93/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
1 and 2 make no sense. Making the cores more complex would require higher voltages, which in the end would limit the frequency. Regarding 4, the problem is not the node: AMD had the same problems back when they were using the same 28 nm node as Nvidia. Vega is power limited; since its circuit design is less efficient, it needs more power to make the transistors respond quickly enough to maintain stability, which in the end results in high energy consumption and throttling.

My source:
http://www.anandtech.com/show/11717/the-amd-radeon-rx-vega-64-and-56-review/2

Regarding not expanding the actually important part, the CU arrays:

Anandtech said:
Talking to AMD’s engineers about the matter, they haven’t taken any steps with Vega to change this. They have made it clear that 4 compute engines is not a fundamental limitation – they know how to build a design with more engines – however to do so would require additional work. In other words, the usual engineering trade-offs apply, with AMD’s engineers focusing on addressing things like HBCC and rasterization as opposed to doing the replumbing necessary for additional compute engines in Vega 10

Not shown on AMD’s diagram, but confirmed in the specifications, is how the CUs are clustered together within a compute engine. On all iterations of GCN, AMD has bundled CUs together in a shader array, with up to 4 CUs sharing a single L1 instruction cache and a constant cache. For Vega 10, that granularity has gone up a bit, and now only 3 CUs share any one of these cache sets. As a result there are now 6 CU arrays per compute engine, up from 4 on Fiji.

Regarding the extra transistors
Anandtech said:
That space is put to good use however, as it contains a staggering 12.5 billion transistors. This is 3.9B more than Fiji, and still 500M more than NVIDIA’s GP102 GPU. So outside of NVIDIA’s dedicated compute GPUs, the GP100 and GV100, Vega 10 is now the largest consumer & professional GPU on the market.

Given the overall design similarities between Vega 10 and Fiji, this gives us a very rare opportunity to look at the cost of Vega’s architectural features in terms of transistors. Without additional functional units, the vast majority of the difference in transistor counts comes down to enabling new features.

Talking to AMD’s engineers, what especially surprised me is where the bulk of those transistors went; the single largest consumer of the additional 3.9B transistors was spent on designing the chip to clock much higher than Fiji. Vega 10 can reach 1.7GHz, whereas Fiji couldn’t do much more than 1.05GHz. Additional transistors are needed to add pipeline stages at various points or build in latency hiding mechanisms, as electrons can only move so far on a single (ever shortening) clock cycle; this is something we’ve seen in NVIDIA’s Pascal, not to mention countless CPU designs. Still, what it means is that those 3.9B transistors are serving a very important performance purpose: allowing AMD to clock the card high enough to see significant performance gains over Fiji.
 
Joined
Mar 10, 2014
Messages
1,793 (0.45/day)
Why would I buy this Vega card when I can get a used GTX 1080 on eBay for ~$499 that equals it and saves me about $100 a year in electricity, if not more? (I actually saw a different Gigabyte 1080 for $475.)

I'm all for an AMD comeback riding on the coattails of Ryzen's success, but this is like arriving too late to the party. With warm beer.

As for the Miners. Go for it. Not interested.

http://www.ebay.com/itm/GigaByte-Wi...357924&hash=item440974e951:g:aXUAAOSwgxxZk2zR

At least you might get it that cheap. Here in Europe it costs 650€ for the vanilla blower one, while you can get a custom GTX 1080 Ti for 699€ and a custom GTX 1080 for under 550€...
 
Joined
Jan 2, 2008
Messages
3,296 (0.53/day)
System Name Thakk
Processor i7 6700k @ 4.5Ghz
Motherboard Gigabyte G1 Z170N ITX
Cooling H55 AIO
Memory 32GB DDR4 3100 c16
Video Card(s) Zotac RTX3080 Trinity
Storage Corsair Force GT 120GB SSD / Intel 250GB SSD / Samsung Pro 512 SSD / 3TB Seagate SV32
Display(s) Acer Predator X34 100hz IPS Gsync / HTC Vive
Case QBX
Audio Device(s) Realtek ALC1150 > Creative Gigaworks T40 > AKG Q701
Power Supply Corsair SF600
Mouse Logitech G900
Keyboard Ducky Shine TKL MX Blue + Vortex PBT Doubleshots
Software Windows 10 64bit
Benchmark Scores http://www.3dmark.com/fs/12108888
At least you might get it that cheap. Here in Europe it costs 650€ for the vanilla blower one, while you can get a custom GTX 1080 Ti for 699€ and a custom GTX 1080 for under 550€...
Vega would probably be in the same overpriced scenario.
 
Joined
Feb 19, 2009
Messages
1,162 (0.20/day)
Location
I live in Norway
Processor R9 5800x3d | R7 3900X | 4800H | 2x Xeon gold 6142
Motherboard Asrock X570M | AB350M Pro 4 | Asus Tuf A15
Cooling Air | Air | duh laptop
Memory 64gb G.skill SniperX @3600 CL16 | 128gb | 32GB | 192gb
Video Card(s) RTX 4080 |Quadro P5000 | RTX2060M
Storage Many drives
Display(s) AW3423dwf.
Case Jonsbo D41
Power Supply Corsair RM850x
Mouse g502 Lightspeed
Keyboard G913 tkl
Software win11, proxmox
Vega would probably be in the same overpriced scenario.

I bought an RX Vega 64 for GTX 1070 money here in Norway.
The next day it was at almost GTX 1080 Ti price (I have no idea why, really)...
 
Joined
Apr 11, 2006
Messages
52 (0.01/day)
Enhanced Sync:
This contains NOTHING that NVidia does not have, though I think the article implies at the end that it does. At best it works the same. There are, however, Freesync monitors with no LFC, whereas GSYNC always supports it.

Enhanced Sync is equivalent to NVidia's:
FAST SYNC + Adaptive VSync + GSync (again with the LFC caveat)

(Adaptive VSync is VSYNC ON and OFF automatically. It is not used if GSYNC is working. Same on Freesync)

AMD did a video where they made it sound simple. For example, they talked about showing the "last frame created" once you go over the top of the Freesync range (so FAST SYNC), but what they failed to mention is that this is pointless unless you can generate over 2x the FPS; otherwise you never get a 2nd frame that lets you drop the first one. For example, on a 144Hz monitor with a worst-case 48Hz to 90Hz Freesync range it is:

0 to 48FPS (VSYNC OFF or ON; thus screen tear or stuttering)
48FPS to 90FPS (FASTSYNC; ideal tear-free, minimal lag zone)
90FPS to 144FPS (VSYNC OFF or ON; if VSYNC ON then stutter as you are not synching to 144Hz)
144FPS (VSYNC ON; if chosen by default no screen tear, not much latency)
144FPS to approx 300FPS (VSYNC OFF needed; screen-tear may or may not be obvious but "twitch" shooters may prefer to 144FPS VSYNC ON)
300FPS+ (if you choose "Enhanced" I guess it doesn't cap but works like FAST SYNC so no screen tear as it draws 144FPS but draws only the latest full frame. So similar to 144FPS VSYNC ON but slightly less lag. Very, very minor so only the very BEST twitch shooters could tell)

*See how CONFUSING that setup is (again a worst case, and the Freesync range can be hard to find). On a 144Hz GSYNC monitor it is always THIS or close to it:

0 to 30FPS (each frame doubled to stay in GSYNC mode. So 29FPS becomes "58FPS")
30FPS to 144FPS (GSYNC)

.. above 144FPS options same as Freesync

**Again though, some of the good Freesync monitors are close. Some are 40Hz to 144Hz so you have a nice range and the driver supports LFC so drops below 40FPS are not a big deal.
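The ranges above boil down to a small lookup. This is a sketch of the worst-case example in the post; the thresholds are the post's hypothetical numbers (48-90 Hz FreeSync window, no LFC, 144 Hz panel), not properties of any real monitor:

```python
# Sketch of the FreeSync mode ranges described above, for a hypothetical
# 144 Hz monitor with a worst-case 48-90 Hz FreeSync window (no LFC).
def sync_mode(fps, fs_low=48, fs_high=90, refresh=144):
    if fps < fs_low:
        return "below range: VSYNC ON/OFF (stutter or tearing)"
    if fps <= fs_high:
        return "FreeSync (variable refresh, tear-free)"
    if fps <= refresh:
        return "above range: VSYNC ON/OFF (not synced)"
    return "Enhanced Sync (show latest full frame only)"

for fps in (30, 60, 120, 300):
    print(f"{fps:3d} fps -> {sync_mode(fps)}")
```

With a G-Sync module, or a FreeSync monitor with LFC, the first branch disappears: frames below the window are simply doubled to stay in the variable-refresh range.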

GSync:
May cost more, but it is SUPERIOR overall. There are some Freesync monitors with LFC, as discussed, that are very good, but it is hit and miss. Even the better ones may not maintain color/blur as well, since they use OVERDRIVE, which is problematic with changing frame times (unless you, for example, add a hardware module to help with that).

FREESYNC 2/HDR:
This brings it closer to GSYNC by requiring LFC support, but with variable frame times and a wider range of colors/brightness due to HDR, it is much harder to make this work. The price may be a big jump up from normal Freesync, whereas with GSync 2 the add-on module should reduce the monitor manufacturers' R&D considerably, so if they can get the price of the modules down, GSYNC and FREESYNC 2 should get closer in price, with GSYNC 2 likely to remain superior.

OTHER:
My main issue with the REVIEW, which was mostly excellent, is that I saw no reference to what a top-end GTX1080 could do, or even which card was used. Later we need to compare two Asus Strix models (3-fan), GTX1080 vs VEGA64, and see how they do in terms of performance and noise. Liquid cooling seems mostly pointless if it costs close to a GTX1080Ti that beats it in every way.

GAMERS NEXUS noted that the VEGA56 has a voltage/power cap which is currently impossible to overcome, but there does appear to be enough headroom left to nearly MATCH the VEGA64 (or at least the air-cooled VEGA64, which throttles on temperature).
 
Joined
Apr 11, 2006
Messages
52 (0.01/day)
RX-VEGA my two cents:
After looking at other reviewers with different results, it seems best to conclude that the VEGA56 is nearly identical to the GTX1070 and the VEGA64 close to the GTX1080, on AVERAGE.

I expect VEGA to age better due to FP16, ACE, and due to the fact that the basic GCN architecture is in the consoles.

Now, many people said "so what, that's in the FUTURE and by then... blah blah", but a lot of people buy a graphics card and keep it for 3+ years, so guessing how a card will age is very important. I have seen the "FINE WINE" argument for AMD vs NVidia before and was not really impressed, but I do think it is completely DIFFERENT now, because software never really had a chance to optimize for GCN before; DX12/Vulkan was required to implement the best features.

But... on the other hand NVidia tends to do a better job at more timely drivers.

Power (HEAT) is another issue, in particular for the VEGA64, since I can NOT use that card in my small room; the room temperature would get far too hot. An extra 100 watts or so makes a HUGE difference. VEGA56 is more reasonable, though I'd still get something like an Asus Strix.

VEGA64 solves the heat issue (update: I mean the temperature issue, not the heat output) with liquid cooling, but then charges so much that you should just get a GTX1080Ti instead.

None of this matters for the cheaper VEGA64 and VEGA56 unless the PRICE is right, and that may be a big issue until mining is no longer a factor AND stock is sufficient that resellers don't overcharge.

*So in general there are pros and cons, but I think the VEGA56 in particular will be the best value, mostly due to its FUTURE improvements relative to the GTX1070, and assuming its price is nearly IDENTICAL to a GTX1070 with the same cooler.

FEATURES: most people don't use the extra features, but they are worth looking at if you're interested. How does RECORDING compare, or features like ANSEL for 2D and 3D screenshots (in only a few titles so far)? There is also VR SUPPORT, and frankly I don't know how they compare there. AMD's asynchronous architecture should in theory be better, but NVidia tends to do better with their software support.

AMD has been improving in software quite a bit, to the point that they MATCH NVidia most of the time, but I wouldn't say they are quite as good yet.

If an Asus Strix VEGA56 card was priced at roughly $450USD today that would be an excellent buy IMO.

(I do not see any advantage to HBCC for gaming unless the game needs more than 8GB and can't normally swap the data around in a timely fashion. HBM2 does appear to help at higher resolutions, though possibly not enough to justify the cost; AMD could probably have dropped prices more, so the VALUE proposition might have been better with, say, GDDR5X instead of HBM2 and maybe a $349 RX-VEGA56 MSRP.)
 
Joined
Nov 4, 2005
Messages
12,013 (1.72/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs, 24TB Enterprise drives
Display(s) 55" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
RX-VEGA my two cents:
After looking at other reviewers with different results, it appears it is best to conclude that VEGA56 is nearly identical to the GTX1070 and VEGA64 close to the GTX1080 on AVERAGE.

I expect VEGA to age better due to FP16, ACE, and due to the fact that the basic GCN architecture is in the consoles.

Now many people said "so what, that's in the FUTURE and by then... blah blah" well a lot of people buy a graphics card and keep it for 3+ years so guessing how the card should age is very important. I have seen the "FINE WINE" info before for AMD vs NVidia and was not impressed really, but I do think it is completely DIFFERENT now because the software never really had a chance to optimize for GCN before since DX12/Vulkan was required to implement the best features.

But... on the other hand NVidia tends to do a better job at more timely drivers.

Power (HEAT) is another issue. In particular for the VEGA64 since I can NOT use that card in my small room as the room temperature would be far too hot. An extra 100Watts or so makes a HUGE difference. VEGA56 is more reasonable though I'd still get something like an Asus Strix.

VEGA64 solves the heat issue (update: I mean temperature issue not heat) with liquid cooling but then charges so much that you should just get a GTX1080Ti instead.

None of this matters for cheaper VEGA64 and VEGA56 unless the PRICE is right and that may be a big issue until mining is no longer an issue AND stock is sufficient that resellers don't overcharge.

*So in general there are pros and cons, but I think VEGA56 in particular will be the best value mostly due to its FUTURE improvements relative to the GTX1070, and assuming the price is nearly IDENTICAL to a GTX1070 with the same cooler.

FEATURES: most people don't use extra features but it should be looked at if interested. How well does RECORDING compare, or features like ANSEL for 2D and 3D screenshots (in only a few titles so far). There is also VR SUPPORT and frankly I don't know how they compare. AMD's asynchronous architecture in theory should be better but NVidia tends to do better with their software support.

AMD has been improving in software quite a bit to the point they MATCH NVidia most of the time but i wouldn't say they are quite as good yet.

If an Asus Strix VEGA56 card was priced at roughly $450USD today that would be an excellent buy IMO.

(I do not see any advantage to having the HBCC for gaming unless the game needs more than 8GB, and also can't normally swap the data around in a timely fashion. HBM2 though does appear to help at higher resolutions, though possibly not enough to justify the cost since AMD could probably have dropped prices more so the VALUE proposition might have been better with say GDDR5x instead of HBM2 and maybe a $349 RX-VEGA56 MSRP)
Enhanced Sync:
This contains NOTHING that NVidia does not have, though I think the article implies it does at the end. At best it works the same. There are however Freesync monitors with no LFC whereas GSYNC always supports this.

Enhanced Sync is equivalent to NVidia's:
FAST SYNC + Adaptive VSync + GSync (again with the LFC caveat)

(Adaptive VSync is VSYNC ON and OFF automatically. It is not used if GSYNC is working. Same on Freesync)

AMD did a video where they made it sound simple. For example, they talked about the "last frame created" once you go over the top of Freesync range (so FAST SYNC) but what they failed to mention is that it is pointless to do that unless you can generate over 2x the FPS otherwise you never get a 2nd frame that allows you to drop the first frame. For example on a 144Hz monitor with a worst-case 48Hz to 90Hz Freesync range it is:

0 to 48FPS (VSYNC OFF or ON; thus screen tear or stuttering)
48FPS to 90FPS (FREESYNC; the ideal tear-free, minimal-lag zone)
90FPS to 144FPS (VSYNC OFF or ON; if VSYNC ON then stutter, as you are not syncing to 144Hz)
144FPS (VSYNC ON; if chosen, by default no screen tear and not much latency)
144FPS to approx 300FPS (VSYNC OFF needed; screen tear may or may not be obvious, but "twitch" shooters may prefer 144FPS VSYNC ON)
300FPS+ (if you choose "Enhanced" I guess it doesn't cap but works like FAST SYNC, so no screen tear: the panel refreshes at 144Hz but shows only the latest full frame. Similar to 144FPS VSYNC ON but slightly less lag. Very, very minor, so only the very BEST twitch-shooter players could tell)
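The zones listed above can be sketched as a small Python helper. This is purely illustrative (the function name and the zone descriptions are mine, not any real driver API), using the worst-case 48-90Hz range on a 144Hz panel from the example:

```python
# Hypothetical sketch of the behaviour zones described above, for a 144 Hz
# panel with a worst-case 48-90 Hz Freesync window. Names and strings are
# illustrative only, not any actual driver interface.

def sync_zone(fps, range_lo=48, range_hi=90, refresh=144):
    """Classify a frame rate into the zones described in the post."""
    if fps < range_lo:
        return "below range: VSYNC ON or OFF, so stutter or tearing"
    if fps <= range_hi:
        return "in range: tear-free, minimal lag (the ideal zone)"
    if fps < refresh:
        return "above range but below refresh: back to VSYNC ON or OFF"
    if fps == refresh:
        return "at refresh: VSYNC ON caps cleanly with no tearing"
    return "above refresh: only the latest full frame is shown (FAST SYNC style)"

print(sync_zone(60))
print(sync_zone(120))
print(sync_zone(300))
```

Note how the awkward part is the 90-144FPS gap, where neither adaptive sync nor Fast Sync applies.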

*See how CONFUSING that setup is? (Again, a worst case, but the Freesync range can be hard to find.) On a 144Hz GSYNC monitor it is always THIS or close to it:

0 to 30FPS (each frame doubled to stay in GSYNC mode. So 29FPS becomes "58FPS")
30FPS to 144FPS (GSYNC)

.. above 144FPS options same as Freesync

**Again though, some of the good Freesync monitors are close. Some are 40Hz to 144Hz so you have a nice range and the driver supports LFC so drops below 40FPS are not a big deal.
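The frame doubling described above (GSYNC below 30FPS, or Freesync LFC below the window floor) is just arithmetic: repeat each frame enough times that the effective rate lands back inside the variable-refresh window. A minimal sketch, assuming a 40Hz floor as in the example; this is not any real driver algorithm:

```python
import math

# Illustrative arithmetic only (not an actual driver implementation):
# Low Framerate Compensation repeats each frame so the effective refresh
# rate lands back inside the variable-refresh window.

def lfc_effective_rate(fps, range_lo=40):
    """Smallest integer multiple of fps at or above the window floor."""
    if fps >= range_lo:
        return fps  # already inside the window, no repetition needed
    return fps * math.ceil(range_lo / fps)

print(lfc_effective_rate(29))  # 58: each frame shown twice, as in the GSYNC example
print(lfc_effective_rate(20))  # 40: doubled to reach the window floor
```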

GSync:
May cost more, but it is SUPERIOR overall. There are some Freesync monitors with LFC, as discussed, that are very good, but it is hit and miss. Even the better ones may not be able to maintain color/blur as well, since they use OVERDRIVE, which is problematic with changing frame times (unless you, for example, make a hardware module to help with that).

FREESYNC 2/HDR:
This brings it closer to GSYNC by requiring LFC support, but with variable frame times and a wider range of colors/brightness due to HDR, it is much harder to make this work. Price may be a big jump up from normal Freesync, whereas on GSync 2 the add-on module should reduce the monitor manufacturers' R&D considerably; if they can get the price down on the modules, GSYNC and FREESYNC 2 should get closer in price, with GSync 2 likely to remain superior.

OTHER:
My main issue with the REVIEW, which was mostly excellent, is that I saw no reference to what a top-end GTX1080 could do, or even which card was used. Later we need to compare two Asus Strix models (3-fan), GTX1080 vs VEGA64, and see how they do in terms of performance and noise. Liquid cooling seems mostly pointless if it costs close to a GTX1080Ti, which beats it in every way.

GAMERS NEXUS noted that the VEGA56 has a voltage/power cap that is currently impossible to overcome, but there does appear to be enough headroom left to nearly MATCH VEGA64 (or at least the air-cooled VEGA64, which has temperature throttling).

Why should I take any advice from someone

1) who can't figure out how to NOT double post?
2) who doesn't have the card to see how it works compared to the competition?
3) who only has theory and a wall of text explaining their uninformed ideas?

W1zz has the card, tried the features, and found them to be better than the Nvidia implementation.

As to your "overdrive" idea: overdrive is OVERCLOCKING the pixel clock, causing the refresh rate to go above specified, which has nothing to do with below-refresh-rate Freesync. My TV has 6GB of memory and an AMD chip in it, and I only experience tearing if I don't turn on max frame rates, as it already interleaves frames adaptively. I don't think you understand how "overdrive" works vs. just syncing frames to the front or back porch signal (http://lmgtfy.com/?q=HDMI+front+porch), and the fact that many TVs already perform interleaving of frames (2:3 pulldown) on 24Hz sources to prevent backlight and frame flickering issues; all you have to do is turn the feature on, and even low frame rates don't cause stuttering or tearing. Freesync was just an extension of that and used technology already in use.
 
Vega irony is that if it was good for gamers it would be perfect for miners. Either way no gamers would use it.
 