
AMD Vega Discussion Thread

cdawall

where the hell are my stars
Joined
Jul 23, 2006
Messages
27,683 (4.08/day)
Location
Houston
System Name Moving into the mobile space
Processor 7940HS
Motherboard HP trash
Cooling HP trash
Memory 2x8GB
Video Card(s) 4070 mobile
Storage 512GB+2TB NVME
Display(s) some 165hz thing that isn't as nice as it sounded
The 5930K was released on the 29th of August 2014, while the FX-8350 was released on the 23rd of October 2012. :p (That doesn't take away from the fact that in the TPU review it was not compared to HEDT, though.)

Fine, FX-8370, whatever; the point is they aren't in the same class. I am sure that if and when Naples is released for consumers, we will see a comparison by TPU between it and Intel's HEDT. Maybe if AMD wanted the world's perfect comparison, they would have released a heavily binned 4.2-4.3 GHz Ryzen quad-core on day zero instead of a stack of low-clocked 8-cores into a market that wasn't ready for them.
 
Joined
Nov 29, 2016
Messages
699 (0.23/day)
System Name Unimatrix
Processor Intel i9-9900K @ 5.0GHz
Motherboard ASRock x390 Taichi Ultimate
Cooling Custom Loop
Memory 32GB GSkill TridentZ RGB DDR4 @ 3400MHz 14-14-14-32
Video Card(s) EVGA 2080 with Heatkiller Water Block
Storage 2x Samsung 960 Pro 512GB M.2 SSD in RAID 0, 1x WD Blue 1TB M.2 SSD
Display(s) Alienware 34" Ultrawide 3440x1440
Case CoolerMaster P500M Mesh
Power Supply Seasonic Prime Titanium 850W
Keyboard Corsair K75
Benchmark Scores Really Really High
That logic always baffled me. If NVIDIA sells something for 700€ and AMD releases a product that matches it, in what parallel universe is AMD not allowed to charge the same amount for their product? People always have this weird illusion that just because it's AMD, they have to basically give stuff away for free. But NVIDIA? They can charge 1200€ all day long. And they'll ask for more. ¯\_(ツ)_/¯

If it matches the GTX 1080 Ti, then I'm willing to pay the same price. Or, if they feel comfortable setting the bar a bit lower to attract more customers, then so be it. That's competition and the free market. The more of it there is, the better for us customers.

If it cost the same, why would I buy the AMD product? Just like if Intel had a chip with the same performance at the same price as AMD, people would buy the familiar one.
 
Joined
Apr 15, 2009
Messages
1,051 (0.18/day)
Processor Ryzen 9 5900X
Motherboard Gigabyte X570 Aorus Master
Cooling ARCTIC Liquid Freezer III 360 A-RGB
Memory 32 GB Ballistix Elite DDR4-3600 CL16
Video Card(s) XFX 6800 XT Speedster Merc 319 Black
Storage Sabrent Rocket NVMe 4.0 1TB
Display(s) LG 27GL850B x 2 / ASUS MG278Q
Case be quiet! Silent Base 802
Audio Device(s) Sound Blaster AE-7 / Sennheiser HD 660S
Power Supply Seasonic Vertex PX-1200
Software Windows 11 Pro 64
From AMD's financial review meeting the other day:
"AMD's "Vega" GPU architecture is on track to launch in Q2"
 
Joined
Apr 30, 2011
Messages
2,745 (0.54/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
If it cost the same, why would I buy the AMD product? Just like if Intel had a chip with the same performance at the same price as AMD, people would buy the familiar one.
Graphics APIs behave very differently on each architecture. For example, anyone who will mostly play modern games with Vulkan or DX12 might prefer to game on AMD, while whoever plays mostly older titles, or current ones but not many future releases, could choose NVIDIA. So even if Vega and the 1080 Ti matched each other on average at the same price, someone could still choose based on the games they'd prefer to play over the next few years.
 

bug

Joined
May 22, 2015
Messages
14,118 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
They even compared the 1800X to the 6900K in the Ryzen launch event. And then TPU didn't even include the direct competitor, a chip that costs twice as much but performs around the same. As much as I love TPU in general and their reviews, they often have weird ways of doing things.

And I personally think all the reviewers are doing AMD an injustice with all the dramatization of the "uh oh, games run worse on Ryzen" thing. Do they really? Apart from a handful (of a really clumsy carpenter) that actually show a considerable difference, most are insignificant, and they run at framerates high enough for it to be irrelevant. And for the future: core count IS the future. IPC will never increase enough to offset the fact that you can't clock cores to infinity either. Having them at 5 GHz is really pushing them hard, and the 7700K is very close to that. Game studios will simply have to start making games more multicore-aware and less dependent on just a few threads clocked ridiculously high. Unless Intel pulls some magic out of nowhere and gives CPUs a 2x IPC boost. But I find that highly unlikely.
It's not dramatization, it's a matter of covering all bases.

Because, conversely, who really benefits from 16 threads at home? Nobody does video transcoding, 3D rendering or serves millions of HTTP requests around the clock.
When the performance is (or isn't) there, it's a review's job to highlight it.
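
To make the "more multicore-aware" point above concrete, here is a minimal sketch of the idea: per-frame work split across however many cores the machine has instead of one hot thread. The types and numbers are made up for illustration; this is not code from any real engine.

[CODE]
// Toy illustration of "multicore-aware" game code: instead of updating all
// entities on one thread, the per-frame work is split across worker threads.
// All names here are invented for the example; this is not any real engine.
#include <algorithm>
#include <cmath>
#include <iostream>
#include <thread>
#include <vector>

struct Entity { float x = 0.f, y = 0.f; };

// Simulate one frame of work for a slice of the entity list.
void updateSlice(std::vector<Entity>& entities, size_t begin, size_t end, float dt) {
    for (size_t i = begin; i < end; ++i) {
        entities[i].x += std::sin(static_cast<float>(i)) * dt;
        entities[i].y += std::cos(static_cast<float>(i)) * dt;
    }
}

int main() {
    std::vector<Entity> entities(200000);
    const float dt = 1.0f / 60.0f;

    // Split the frame's entity updates across however many cores are available.
    unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::thread> pool;
    size_t chunk = entities.size() / workers;

    for (unsigned w = 0; w < workers; ++w) {
        size_t begin = w * chunk;
        size_t end = (w == workers - 1) ? entities.size() : begin + chunk;
        pool.emplace_back(updateSlice, std::ref(entities), begin, end, dt);
    }
    for (auto& t : pool) t.join();

    std::cout << "Updated " << entities.size() << " entities on "
              << workers << " threads\n";
}
[/CODE]

The point of the sketch is simply that the frame's workload scales with core count rather than with the clock speed of a single thread.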
 
Joined
Oct 2, 2004
Messages
13,791 (1.85/day)
It matters. That would be like comparing RX Vega only to the GTX 1070, not even bothering to compare it to anything else, because someone decided so for no real reason. Or excluding Titan cards from "gaming" reviews because they are meant for developers and not gamers; and yet people still buy them for gaming. How is the Ryzen 1800X any different? Some people just buy them because they want to. Do I need 12 threads? Probably not. But I wanted them. Do you need a BMW M4 with 400 horsepower? Probably not. But you wanted it, so you got it. That's why.
 
Joined
Nov 3, 2013
Messages
2,141 (0.52/day)
Location
Serbia
Processor Ryzen 5600
Motherboard X570 I Aorus Pro
Cooling Deepcool AG400
Memory HyperX Fury 2 x 8GB 3200 CL16
Video Card(s) RX 6700 10GB SWFT 309
Storage SX8200 Pro 512 / NV2 512
Display(s) 24G2U
Case NR200P
Power Supply Ion SFX 650
Mouse G703 (TTC Gold 60M)
Keyboard Keychron V1 (Akko Matcha Green) / Apex m500 (Gateron milky yellow)
Software W10
If it cost the same, why would I buy the AMD product? Just like if Intel had a chip with the same performance at the same price as AMD, people would buy the familiar one.
If they cost the same, what is pushing you to buy nV and/or Intel?
 
Joined
Oct 2, 2004
Messages
13,791 (1.85/day)
Fanboyism, usually. That's my experience. What usually pushes me are the extra features one offers. AMD has a huge edge with extras like the superior-looking Crimson control panel, Wattman, the Chill feature, the frame limiter and a few other goodies like FreeSync. The only area where NVIDIA really has a serious edge is its Adaptive V-Sync and Fast Sync modes, which are super useful. It can also be something as silly as the X factor: GDDR5X on top-tier Pascals or HBM2 on Vega...
 

OneMoar

There is Always Moar
Joined
Apr 9, 2010
Messages
8,814 (1.62/day)
Location
Rochester area
System Name RPC MK2.5
Processor Ryzen 5800x
Motherboard Gigabyte Aorus Pro V2
Cooling Thermalright Phantom Spirit SE
Memory CL16 BL2K16G36C16U4RL 3600 1:1 micron e-die
Video Card(s) GIGABYTE RTX 3070 Ti GAMING OC
Storage Nextorage NE1N 2TB ADATA SX8200PRO NVME 512GB, Intel 545s 500GBSSD, ADATA SU800 SSD, 3TB Spinner
Display(s) LG Ultra Gear 32 1440p 165hz Dell 1440p 75hz
Case Phanteks P300 /w 300A front panel conversion
Audio Device(s) onboard
Power Supply SeaSonic Focus+ Platinum 750W
Mouse Kone burst Pro
Keyboard SteelSeries Apex 7
Software Windows 11 +startisallback
More memory bandwidth doesn't make up for a crappy core. GCN is crap at pretty much everything in the 'traditional' DX11-style pipeline,
and that's still how people are coding: write the game engine, then let the GPU/driver figure out how to make it go fast.
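
For readers unfamiliar with the distinction being made, here is a rough, hypothetical C++ sketch of the two submission models: the DX11-style path where the driver validates and optimizes every draw call behind the engine's back, and the DX12/Vulkan-style path where the engine records command lists up front. The types are invented for illustration and are not a real graphics API.

[CODE]
// Conceptual sketch (hypothetical types, not a real graphics API) of the
// "let the driver figure it out" DX11-style model vs. an explicit
// DX12/Vulkan-style model where the app pre-records work and the driver
// does far less guessing per draw call.
#include <cstdio>
#include <vector>

struct DrawCall { int material; int mesh; };

// DX11 style: every draw goes through driver-side state validation and
// hazard tracking, so the driver carries the optimization burden every frame.
void submitImmediate(const std::vector<DrawCall>& calls) {
    for (const auto& c : calls) {
        // driver: validate state, resolve hazards, patch resources... per call
        std::printf("driver-managed draw: material %d, mesh %d\n", c.material, c.mesh);
    }
}

// DX12/Vulkan style: the engine records command lists up front and reuses
// them; the driver mostly just plays them back.
struct CommandList { std::vector<DrawCall> recorded; };

CommandList record(const std::vector<DrawCall>& calls) {
    return CommandList{calls};   // validation cost paid once, at record time
}

void submitExplicit(const CommandList& list) {
    for (const auto& c : list.recorded)
        std::printf("pre-recorded draw: material %d, mesh %d\n", c.material, c.mesh);
}

int main() {
    std::vector<DrawCall> frame = {{0, 1}, {0, 2}, {1, 3}};
    submitImmediate(frame);            // the traditional path most engines still use
    CommandList list = record(frame);  // the explicit path newer APIs expose
    submitExplicit(list);
}
[/CODE]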
 
Joined
Mar 18, 2008
Messages
5,717 (0.92/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
Fanboyism, usually. That's my experience. What usually pushes me are the extra features one offers. AMD has a huge edge with extras like the superior-looking Crimson control panel, Wattman, the Chill feature, the frame limiter and a few other goodies like FreeSync. The only area where NVIDIA really has a serious edge is its Adaptive V-Sync and Fast Sync modes, which are super useful. It can also be something as silly as the X factor: GDDR5X on top-tier Pascals or HBM2 on Vega...

superior-looking Crimson control panel, Wattman
Well, sort of. However, recent Crimson drivers have caused some problems for 390 owners, specifically broken Wattman VRAM overclocking on the 390.

Chill feature
Tried it in both TF2 and Fallout 4. Hate it. The drop in frame rate to ~40 is quite easy to pick up with the human eye, and the lag resulting from bouncing between low and high FPS makes me want to vomit. So in short: cool feature on paper, bad experience in real life. I find Frame Rate Target Control a far better tool than Chill.

Honestly, as a long-time ATi GPU owner and user, I must say they are great at "promising" features in their PowerPoint slides. The implementation, though, is usually seriously lacking.

Take the ZeroCore feature, for example. It was promised as an energy-saving feature on AMD's Fury X GPU, as shown here:


The green LED indicates that the card is in ultra energy-saving mode, e.g. while browsing websites or word processing. Guess what? It never worked! Apparently AMD only enabled this on Windows 7 with the old Catalyst Control Center; after moving on to Windows 10 they just conveniently ignored it completely. I shot them ~20 emails, and posted on their forum, asking for the feature to be added back in the Crimson drivers, and they never responded.

This is just one small example of their lack of software support. I don't mind their FineWine, but I do want the features they promised to actually work.
 
Joined
Apr 30, 2012
Messages
3,881 (0.83/day)
The GPU Tach was Tachy. That was before the GPU RGB craze too.

They still have it; on the ES Vega, when it was demoing Doom at 60-70 FPS @ 4K, it only had three lights lit for load.


 
Joined
Mar 18, 2008
Messages
5,717 (0.92/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
The GPU Tach was Tachy. That was before the GPU RGB craze too.

They still have it; on the ES Vega, when it was demoing Doom at 60-70 FPS @ 4K, it only had three lights lit for load.




In case you didn't pay attention: I was talking about the green-LED ZeroCore feature, not the load-based tachometer.
 
Joined
Mar 10, 2010
Messages
11,880 (2.17/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Gskill Trident Z 3900cas18 32Gb in four sticks./16Gb/16GB
Video Card(s) Asus tuf RX7900XT /Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores laptop Timespy 6506
superior-looking Crimson control panel, Wattman
Well, sort of. However, recent Crimson drivers have caused some problems for 390 owners, specifically broken Wattman VRAM overclocking on the 390.

Chill feature
Tried it in both TF2 and Fallout 4. Hate it. The drop in frame rate to ~40 is quite easy to pick up with the human eye, and the lag resulting from bouncing between low and high FPS makes me want to vomit. So in short: cool feature on paper, bad experience in real life. I find Frame Rate Target Control a far better tool than Chill.

Honestly, as a long-time ATi GPU owner and user, I must say they are great at "promising" features in their PowerPoint slides. The implementation, though, is usually seriously lacking.

Take the ZeroCore feature, for example. It was promised as an energy-saving feature on AMD's Fury X GPU, as shown here:


The green LED indicates that the card is in ultra energy-saving mode, e.g. while browsing websites or word processing. Guess what? It never worked! Apparently AMD only enabled this on Windows 7 with the old Catalyst Control Center; after moving on to Windows 10 they just conveniently ignored it completely. I shot them ~20 emails, and posted on their forum, asking for the feature to be added back in the Crimson drivers, and they never responded.

This is just one small example of their lack of software support. I don't mind their FineWine, but I do want the features they promised to actually work.
My, my, you blather on. You own a Fury, yet now you care about R9 390 compatibility. Shit man, every tech company has issues with every product; see the columns in this fair land for many an Intel, NVIDIA and AMD feck-up. There are lists and lists for all your kit, and mine, yet it still works. Stop crying about bollocks. Feckin' power-save-mode moaning on TPU, what have we come to? Mine are never at rest.
 
Joined
Apr 30, 2012
Messages
3,881 (0.83/day)
In case you didn't pay attention: I was talking about the green-LED ZeroCore feature, not the load-based tachometer.

I was. You said ZeroCore was supposed to be a Fury feature, so I stopped there. ZeroCore has been around longer than Fury. I took your gripe to be with the LED light, which maybe, with some bad luck, just isn't turning on for you.
 
Joined
Mar 18, 2008
Messages
5,717 (0.92/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
I was. You said ZeroCore was supposed to be a Fury feature, so I stopped there. ZeroCore has been around longer than Fury. I took your gripe to be with the LED light, which maybe, with some bad luck, just isn't turning on for you.

Nope. The green LED indicator was for all Fury owners.

It seems really hard for some people to resist the urge to curse and attack. Not you, Xzibit, the one above you. Being civil and keeping the discussion on topic is hard, I guess.
 
Joined
Jul 24, 2011
Messages
91 (0.02/day)
Location
Philadelphia
Processor i5 6500 @ 4.5
Motherboard Asus z170
Memory 16gb ddr4 3000
Video Card(s) gtx 1070
Storage 1tb Seagate 7200 rpm
Case Antec 1200
Power Supply Corsair 750
Software windows 10 pro
Joined
Mar 18, 2008
Messages
5,717 (0.92/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
Joined
Apr 15, 2009
Messages
1,051 (0.18/day)
Processor Ryzen 9 5900X
Motherboard Gigabyte X570 Aorus Master
Cooling ARCTIC Liquid Freezer III 360 A-RGB
Memory 32 GB Ballistix Elite DDR4-3600 CL16
Video Card(s) XFX 6800 XT Speedster Merc 319 Black
Storage Sabrent Rocket NVMe 4.0 1TB
Display(s) LG 27GL850B x 2 / ASUS MG278Q
Case be quiet! Silent Base 802
Audio Device(s) Sound Blaster AE-7 / Sennheiser HD 660S
Power Supply Seasonic Vertex PX-1200
Software Windows 11 Pro 64
Something tells me that Guru3D article was correct about Vega.
So you're saying that hundreds of thousands of investors have insider performance information on an unreleased product, and since the stock price has had a correction, Vega must be a dud? Couldn't the market simply be reacting to the recently announced fiscal 2017 Q1 results? :banghead:
 
Joined
Oct 2, 2004
Messages
13,791 (1.85/day)
@xkm1948
The Chill feature allows you to set a lower and an upper limit. You can set the bottom to 60 FPS and the top to whatever your monitor's refresh rate is, so 144 FPS if you have a 144 Hz display.
 
Joined
Mar 18, 2008
Messages
5,717 (0.92/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
@xkm1948
The Chill feature allows you to set a lower and an upper limit. You can set the bottom to 60 FPS and the top to whatever your monitor's refresh rate is, so 144 FPS if you have a 144 Hz display.



For people with a FreeSync monitor it may be a good feature; like you said, let it float from 60 to 100+.

For people without a FreeSync monitor it really doesn't do much. I would rather just cap the FPS at 58-59 to avoid the V-Sync lag, as well as the tearing when the FPS goes over 60.
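
For clarity on what is being compared here, a toy sketch of the two limiting strategies follows: a Chill-style limiter that floats between a floor and a ceiling depending on input activity, and a plain fixed cap (just set the floor equal to the ceiling). This is made-up logic for illustration, not AMD's driver code.

[CODE]
// Toy frame limiter sketch (hypothetical, not AMD's Chill implementation):
// with recent input, aim for the upper FPS target; when idle, drift down to
// the lower target. Setting fpsMin == fpsMax gives the plain "cap at 58-59" case.
#include <chrono>
#include <cstdio>
#include <thread>

using clk = std::chrono::steady_clock;

int main() {
    const double fpsMin = 60.0, fpsMax = 144.0; // Chill-style floor/ceiling
    bool inputActive = false;                   // stand-in for input sampling

    auto last = clk::now();
    for (int frame = 0; frame < 10; ++frame) {
        inputActive = (frame % 3 == 0);         // fake some mouse movement

        // Pick the target: active input -> ceiling, idle -> floor.
        double targetFps = inputActive ? fpsMax : fpsMin;
        auto frameBudget = std::chrono::duration<double>(1.0 / targetFps);

        // ... render the frame here ...

        // Sleep away whatever is left of the frame budget.
        auto next = last + std::chrono::duration_cast<clk::duration>(frameBudget);
        std::this_thread::sleep_until(next);
        last = clk::now();

        std::printf("frame %d targeted %.0f FPS\n", frame, targetFps);
    }
}
[/CODE]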
 
Joined
Mar 18, 2008
Messages
5,717 (0.92/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
Well, I'll be able to test it if I buy RX Vega :p

People want what they don't have. You bought a 980 Ti and I bought a Fury X. Somehow we are both yearning for a GPU from the opposite camp.

I know NVIDIA doesn't treat its old-gen GPUs very kindly, but the grass isn't that much greener on AMD's side.
 
Joined
Feb 18, 2005
Messages
6,145 (0.84/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) Dell S3221QS(A) (32" 38x21 60Hz) + 2x AOC Q32E2N (32" 25x14 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G604
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
People want what they don't have. You bought a 980 Ti and I bought a Fury X. Somehow we are both yearning for a GPU from the opposite camp.

I know NVIDIA doesn't treat its old-gen GPUs very kindly, but the grass isn't that much greener on AMD's side.

Why don't you two just swap GPUs? :p
 