
AMD's Vega-based Cards to Reportedly Launch in May 2017 - Leak

Joined
Oct 2, 2004
Messages
13,791 (1.87/day)
Yeah, well, they apparently found a way. The traditional "prefetching" just wouldn't make any kind of sense, because we already do that in existing games. All Unreal 3.x and 4.x games use texture streaming that is location-based around the player, so you don't store the textures for the entire level in memory, just for the segment the player is in and has a view of. The rest is streamed into VRAM on a per-need basis as you move around, handled by the engine itself.
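
Roughly, that kind of location-based streaming boils down to something like this (a minimal C++ sketch with made-up Segment/texture names, not actual Unreal Engine code):

```cpp
#include <cmath>
#include <cstdio>
#include <string>
#include <unordered_set>
#include <vector>

// Hypothetical world segment: a chunk of the level with its own texture set.
struct Segment {
    float x, z;                          // segment centre in world space
    std::vector<std::string> textures;   // textures this segment needs
};

// Location-based streaming: keep in VRAM only the textures of segments
// near the player, evict the rest as the player moves around.
class TextureStreamer {
public:
    explicit TextureStreamer(float radius) : radius_(radius) {}

    void Update(float playerX, float playerZ, const std::vector<Segment>& world) {
        std::unordered_set<std::string> wanted;
        for (const Segment& s : world) {
            const float dx = s.x - playerX, dz = s.z - playerZ;
            if (std::sqrt(dx * dx + dz * dz) <= radius_)
                wanted.insert(s.textures.begin(), s.textures.end());
        }
        // Evict textures whose segments are no longer near the player.
        for (auto it = resident_.begin(); it != resident_.end();) {
            if (!wanted.count(*it)) {
                std::printf("evict     %s\n", it->c_str());
                it = resident_.erase(it);
            } else {
                ++it;
            }
        }
        // Stream in newly needed textures (stand-in for an async VRAM upload).
        for (const std::string& name : wanted)
            if (resident_.insert(name).second)
                std::printf("stream in %s\n", name.c_str());
    }

private:
    float radius_;                              // streaming distance around the player
    std::unordered_set<std::string> resident_;  // textures currently in VRAM
};

int main() {
    const std::vector<Segment> world = {
        {0.0f,   0.0f, {"rock.dds", "grass.dds"}},
        {500.0f, 0.0f, {"sand.dds"}},
    };
    TextureStreamer streamer(250.0f);
    streamer.Update(0.0f, 0.0f, world);    // player in segment 0: its textures load
    streamer.Update(500.0f, 0.0f, world);  // player moved: old set evicted, new loaded
    return 0;
}
```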
 
Joined
Feb 12, 2015
Messages
1,104 (0.31/day)
Oh, that's quite far away. :(

One year after Pascal. FWIW, a Titan X (not the full GP102 core) is 37% faster than a 1080 at 4K in Vulkan Doom. It's just that when people call Vega's performance stupendous and then repeat such things, it's a bit baity. Once Vega is out, it will be on a smaller node, with more cores and more compute, more 'superness' than Pascal. So if it doesn't beat the Titan X, frankly, it's not superb enough.



The architecture is a year old.


I fully expect Vega 10 to beat the Titan XP. In fact, I would say the real question is whether it beats the Titan XP Black.
 
Joined
Jun 10, 2014
Messages
2,995 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
Yeah, well, they apparently found a way. The traditional "prefetching" just wouldn't make any kind of sense, because we already do that in existing games. All Unreal 3.x and 4.x games use texture streaming that is location-based around the player, so you don't store the textures for the entire level in memory, just for the segment the player is in and has a view of. The rest is streamed into VRAM on a per-need basis as you move around, handled by the engine itself.
Manual prefetching in a game is possible, because the developer may be able to reason about likely movement several frames ahead; I've done this myself in simulators.
But it's impossible for the GPU to do this by itself, since it only sees memory accesses and instructions, and can only act on clear patterns in those.
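
For what it's worth, the manual variant I mean looks something like this (a toy C++ sketch using dead reckoning; the types and numbers are made up):

```cpp
#include <cstdio>

struct Vec3 { float x, y, z; };

// Extrapolate where the camera will be N frames ahead from its current
// velocity (dead reckoning). Only the game can do this: the driver/GPU
// never sees "camera" or "velocity", just buffers and draw calls.
Vec3 PredictPosition(const Vec3& pos, const Vec3& velPerFrame, int framesAhead) {
    return { pos.x + velPerFrame.x * framesAhead,
             pos.y + velPerFrame.y * framesAhead,
             pos.z + velPerFrame.z * framesAhead };
}

int main() {
    Vec3 camera   = { 100.0f, 0.0f, 50.0f };
    Vec3 velocity = { 2.5f, 0.0f, -1.0f };   // units per frame

    // Predict ~30 frames (half a second at 60 FPS) ahead and kick off
    // streaming for whatever segment that position falls into.
    Vec3 future = PredictPosition(camera, velocity, 30);
    std::printf("prefetch assets around (%.1f, %.1f, %.1f)\n",
                future.x, future.y, future.z);
    // In a real engine this would queue async loads for that segment's
    // textures/meshes so they are resident before the player arrives.
    return 0;
}
```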
 
Joined
Oct 2, 2004
Messages
13,791 (1.87/day)
Manual prefetching in a game is possible, because the developer may be able to reason about likely movement several frames ahead; I've done this myself in simulators.
But it's impossible for the GPU to do this by itself, since it only sees memory accesses and instructions, and can only act on clear patterns in those.

You do know a GPU isn't just hardware these days; there's also a software side, and drivers can know a whole lot about what the game is shuffling through VRAM and how.
 
Joined
Jun 10, 2014
Messages
2,995 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
You do know a GPU isn't just hardware these days; there's also a software side, and drivers can know a whole lot about what the game is shuffling through VRAM and how.
Even the driver is not able to reason about how the camera might move and which resources might be needed in the future. The only things the driver and the GPU see are simple buffers, textures, meshes, etc. The GPU has no information about the internal logic of the game, so it has no ability to reason about what might happen. The only way to know this is if the programmer intentionally designs the game engine to tell the GPU somehow.
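
To put it another way, here's a toy C++ illustration of the information gap (hypothetical structs, not any real driver interface):

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

// What a driver sees per frame: opaque resource handles and commands.
// There is no "camera", "player" or "level" at this layer, so it cannot
// predict what the game logic will need next.
struct DriverView {
    std::vector<uint64_t> boundTextures;  // just IDs, no semantics
    std::vector<uint64_t> boundBuffers;
};

// What the engine knows: semantic state it can actually reason about.
struct EngineView {
    float cameraPos[3];
    float cameraVel[3];
    // ... scene graph, visibility sets, streaming triggers ...
};

int main() {
    const DriverView d{{0x1A2B, 0x3C4D}, {0x9F00}};
    std::printf("driver sees %zu texture handles, %zu buffer handles, and nothing else\n",
                d.boundTextures.size(), d.boundBuffers.size());
    // So prefetch knowledge has to flow top-down (engine -> API -> driver),
    // e.g. via explicit residency/priority hints in modern graphics APIs.
    return 0;
}
```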
 
Joined
Oct 2, 2004
Messages
13,791 (1.87/day)
It does. It sees which resources are being used and on what basis, and arranges the usage accordingly at a broader scale, not down to the individual texture level.
 
Joined
Nov 4, 2005
Messages
12,015 (1.72/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs, 24TB Enterprise drives
Display(s) 55" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
Even the driver is not able to reason about how the camera might move and which resources might be needed in the future. The only things the driver and the GPU see are simple buffers, textures, meshes, etc. The GPU has no information about the internal logic of the game, so it has no ability to reason about what might happen. The only way to know this is if the programmer intentionally designs the game engine to tell the GPU somehow.
If you played through a game and identified all the data points where FPS dropped below 60, you could go find the reasons and preemptively correct the issue by either prefetching data, precooking some physics (just like Nvidia still does with PhysX on their GPUs), or addressing whatever else caused the slowdown, and implement that in the drivers to fetch said data or start processing it ahead of time.
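
Something like this offline pass over recorded frame times would be the starting point (a toy C++ sketch with placeholder numbers):

```cpp
#include <cstdio>
#include <vector>

// Toy offline pass over recorded frame times: flag every frame that took
// longer than the 60 FPS budget (16.67 ms) as a candidate for prefetching
// or precomputation. Illustrative only; a real profile would also record
// what the engine was doing on those frames.
int main() {
    const double budgetMs = 1000.0 / 60.0;
    const std::vector<double> frameMs = { 14.2, 15.9, 16.1, 31.0, 15.5, 24.8, 16.0 };

    for (size_t i = 0; i < frameMs.size(); ++i) {
        if (frameMs[i] > budgetMs)
            std::printf("frame %zu: %.1f ms (<60 FPS) -> investigate: "
                        "late texture fetch? physics spike?\n", i, frameMs[i]);
    }
    return 0;
}
```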
 
Joined
Feb 7, 2006
Messages
739 (0.11/day)
Location
Austin, TX
System Name WAZAAM!
Processor AMD Ryzen 3900x
Motherboard ASRock Fatal1ty X370 Pro Gaming
Cooling Kraken x62
Memory G.Skill 16GB 3200 MHz
Video Card(s) EVGA GeForce GTX 1070 8GB SC
Storage Micron 9200 Max
Display(s) Samsung 49" 5120x1440 120hz
Case Corsair 600D
Audio Device(s) Onboard - Bose Companion 2 Speakers
Power Supply CORSAIR Professional Series HX850
Keyboard Corsair K95 RGB
Software Windows 10 Pro
Good catch; I didn't see that till you pointed it out. I had assumed it was relative performance (%) between the best-performing card and all the others, but seeing the top-performing card at 98.4 indicates that is not the case.


I mentioned it in my first post with charts/tables.

Performance is from the "Performance Summary" table in the Titan X Pascal review for 1440p. And I made the disclaimer that this methodology is super "hand-wavey".

Yeah, I get it. It's imperfect for many reasons and wrong for others. But it at least provides some sort of methodology for trying to make predictions.
 
Joined
Feb 7, 2006
Messages
739 (0.11/day)
Location
Austin, TX
System Name WAZAAM!
Processor AMD Ryzen 3900x
Motherboard ASRock Fatal1ty X370 Pro Gaming
Cooling Kraken x62
Memory G.Skill 16GB 3200 MHz
Video Card(s) EVGA GeForce GTX 1070 8GB SC
Storage Micron 9200 Max
Display(s) Samsung 49" 5120x1440 120hz
Case Corsair 600D
Audio Device(s) Onboard - Bose Companion 2 Speakers
Power Supply CORSAIR Professional Series HX850
Keyboard Corsair K95 RGB
Software Windows 10 Pro
Your conclusion based on the data is wrong. You need to break the data into its proper components: look at the 9xx, 10xx, 3xx, and 4xx series separately, since they are all different architectures. When you lump them together as just AMD vs. Nvidia, you hide the fact that the scaling of the 10xx series (1060 -> 1080) is pretty bad and is being propped up by the 9xx.

It's not 'wrong'. Those data points hold meaningful information.

If you think that a different methodology would be better, then put the info into a spreadsheet and show us what you come up with.

EDIT:
Once you plot everything out, it's pretty easy to see that those changes don't actually mean much for either company. The linear model actually fits pretty well over the past couple generations.

[attached chart: upload_2017-1-13_16-35-24.png]


Personally, I think the biggest issue with this model is that the Fury X drastically skews the projection down for high-GFLOPS cards. If we set the intercepts to 0 (as makes logical sense, since zero FLOPS should mean zero performance), the picture changes a bit:

[attached chart: upload_2017-1-13_16-37-1.png]


If you want to point out how this methodology is flawed/wrong/terrible, it would help to show us what you think is better. With pictures and stuff.
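
For the curious, fitting a line through the origin is a one-liner: with the intercept fixed at zero, the least-squares slope is sum(x*y)/sum(x*x). A minimal C++ sketch (the data points are placeholders, not the actual values behind the charts above):

```cpp
#include <cstdio>
#include <utility>
#include <vector>

// Least-squares fit through the origin: with the intercept fixed at zero,
// the best-fit slope is sum(x*y) / sum(x*x). Zero intercept encodes the
// assumption that a hypothetical 0-GFLOPS card scores 0.
double FitThroughOrigin(const std::vector<std::pair<double, double>>& pts) {
    double sxy = 0.0, sxx = 0.0;
    for (const auto& [x, y] : pts) {
        sxy += x * y;
        sxx += x * x;
    }
    return sxy / sxx;
}

int main() {
    // Placeholder (GFLOPS, relative-performance) pairs: NOT the actual
    // chart data from this post.
    const std::vector<std::pair<double, double>> pts = {
        {4400, 44}, {6500, 62}, {8900, 80}, {10157, 90}
    };
    const double slope = FitThroughOrigin(pts);
    std::printf("perf ~= %.5f * GFLOPS\n", slope);
    std::printf("predicted score at 12500 GFLOPS: %.1f\n", slope * 12500);
    return 0;
}
```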
 
Last edited:
Joined
Mar 10, 2014
Messages
1,793 (0.45/day)
The problem with your GFLOPS chart is that GFLOPS depends on GPU clock, and you have taken the IHVs' published numbers. The rule of thumb is that on the AMD side the published GFLOPS are probably a little too high, since the card throttles clocks back below the rated "up to" clock (the Fury X being an exception because of its water cooling; also, the GFLOPS for the RX 480 are not from the "up to" clock, which would give 5,834 GFLOPS). For Nvidia, the published GFLOPS figure is too small, because the real GPU clock is higher than the rated boost clock.

E.g., Titan X GFLOPS are given as 2 × 1.417 GHz × 3584 cores = 10,157 GFLOPS, while in gaming the GPU clock is higher than that; in TPU's review, the boost clock for the Titan X Pascal is ~1.62 GHz, which means 2 × 1.62 GHz × 3584 cores = 11,612 GFLOPS. This matters because the real boost clock of Nvidia's cards varies more per card, so the real GFLOPS differs more from the published value.
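
The arithmetic here is just 2 FLOPs (one fused multiply-add) per core per clock, so which clock you plug in changes everything (a quick C++ check of the numbers above):

```cpp
#include <cstdio>

// Theoretical single-precision throughput: 2 FLOPs per core per clock
// (one fused multiply-add) x shader cores x clock in GHz = GFLOPS.
// Which clock to plug in is exactly the ambiguity described above.
double GFlops(int cores, double clockGHz) { return 2.0 * cores * clockGHz; }

int main() {
    // Titan X (Pascal), 3584 CUDA cores:
    std::printf("at the rated boost of 1.417 GHz: %.0f GFLOPS\n", GFlops(3584, 1.417)); // ~10,157
    std::printf("at the observed    ~1.62  GHz: %.0f GFLOPS\n", GFlops(3584, 1.62));    // ~11,612
    return 0;
}
```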
 
Joined
Jun 10, 2014
Messages
2,995 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
It does. It sees which resources are being used and on what basis, and arranges the usage accordingly at a broader scale, not down to the individual texture level.
Neither the driver nor the GPU knows anything about the internal state of the game engine.

If you played through a game and identified all the data points where FPS dropped below 60, you could go find the reasons and preemptively correct the issue by either prefetching data
The GPU has no ability to reason about why data is missing; it's simply a device processing the data it's ordered to.
 
Joined
Dec 29, 2014
Messages
861 (0.24/day)
AMD just didn't play in the high end this generation for various reasons.

Reasons being, they are a very small company building both CPUs and GPUs. They have to pick their battles, and the bleeding-edge, highest-performing product isn't one of them.

I'm more interested in Vega 11. It's a more mainstream product, and if it really does deliver good performance/watt (better than Pascal), that would be something. It would also give AMD a true laptop card.
 
Joined
Apr 15, 2009
Messages
1,035 (0.18/day)
Processor Ryzen 9 5900X
Motherboard Gigabyte X570 Aorus Master
Cooling ARCTIC Liquid Freezer III 360 A-RGB
Memory 32 GB Ballistix Elite DDR4-3600 CL16
Video Card(s) XFX 6800 XT Speedster Merc 319 Black
Storage Sabrent Rocket NVMe 4.0 1TB
Display(s) LG 27GL850B x 2 / ASUS MG278Q
Case be quiet! Silent Base 802
Audio Device(s) Sound Blaster AE-7 / Sennheiser HD 660S
Power Supply Seasonic Vertex PX-1200
Software Windows 11 Pro 64
I'm amused how some people are highly skeptical of some info from WCCFTech but other info is considered gospel. :rolleyes:
 

Kanan

Tech Enthusiast & Gamer
Joined
Aug 22, 2015
Messages
3,517 (1.03/day)
Location
Europe
System Name eazen corp | Xentronon 7.2
Processor AMD Ryzen 7 3700X // PBO max.
Motherboard Asus TUF Gaming X570-Plus
Cooling Noctua NH-D14 SE2011 w/ AM4 kit // 3x Corsair AF140L case fans (2 in, 1 out)
Memory G.Skill Trident Z RGB 2x16 GB DDR4 3600 @ 3800, CL16-19-19-39-58-1T, 1.4 V
Video Card(s) Asus ROG Strix GeForce RTX 2080 Ti modded to MATRIX // 2000-2100 MHz Core / 1938 MHz G6
Storage Silicon Power P34A80 1TB NVME/Samsung SSD 830 128GB&850 Evo 500GB&F3 1TB 7200RPM/Seagate 2TB 5900RPM
Display(s) Samsung 27" Curved FS2 HDR QLED 1440p/144Hz&27" iiyama TN LED 1080p/120Hz / Samsung 40" IPS 1080p TV
Case Corsair Carbide 600C
Audio Device(s) HyperX Cloud Orbit S / Creative SB X AE-5 @ Logitech Z906 / Sony HD AVR @PC & TV @ Teufel Theater 80
Power Supply EVGA 650 GQ
Mouse Logitech G700 @ Steelseries DeX // Xbox 360 Wireless Controller
Keyboard Corsair K70 LUX RGB /w Cherry MX Brown switches
VR HMD Still nope
Software Win 10 Pro
Benchmark Scores 15 095 Time Spy | P29 079 Firestrike | P35 628 3DM11 | X67 508 3DM Vantage Extreme
The reason they crippled it was that the Titan was cutting into their own workstation graphics business. People aren't going to give up their hard-earned cash when they don't have to, and the original Titan presented a very viable alternative to their Quadro line, so the Titan in that form had to go. I'd say it was more a self-preservation move than cost-cutting.
Nah, nothing got crippled after the original Titan. It's just that the Titan X (Maxwell and Pascal) are gaming architectures; they never had DP (double-precision) hardware in the first place.
 
Joined
Feb 7, 2006
Messages
739 (0.11/day)
Location
Austin, TX
System Name WAZAAM!
Processor AMD Ryzen 3900x
Motherboard ASRock Fatal1ty X370 Pro Gaming
Cooling Kraken x62
Memory G.Skill 16GB 3200 MHz
Video Card(s) EVGA GeForce GTX 1070 8GB SC
Storage Micron 9200 Max
Display(s) Samsung 49" 5120x1440 120hz
Case Corsair 600D
Audio Device(s) Onboard - Bose Companion 2 Speakers
Power Supply CORSAIR Professional Series HX850
Keyboard Corsair K95 RGB
Software Windows 10 Pro
The problem with your GFLOPS chart is that GFLOPS depends on GPU clock, and you have taken the IHVs' published numbers. The rule of thumb is that on the AMD side the published GFLOPS are probably a little too high, since the card throttles clocks back below the rated "up to" clock (the Fury X being an exception because of its water cooling; also, the GFLOPS for the RX 480 are not from the "up to" clock, which would give 5,834 GFLOPS). For Nvidia, the published GFLOPS figure is too small, because the real GPU clock is higher than the rated boost clock.

E.g., Titan X GFLOPS are given as 2 × 1.417 GHz × 3584 cores = 10,157 GFLOPS, while in gaming the GPU clock is higher than that; in TPU's review, the boost clock for the Titan X Pascal is ~1.62 GHz, which means 2 × 1.62 GHz × 3584 cores = 11,612 GFLOPS. This matters because the real boost clock of Nvidia's cards varies more per card, so the real GFLOPS differs more from the published value.

That doesn't actually change any of the predictions, because nothing in the model depends on Nvidia's "performance/FLOP" when 'predicting' Vega's performance. All it does is drop the dot for the Titan X a little further below the linear fit line.

The graphs & charts I made aren't 'wrong' or 'flawed'. They follow standard practices.

What you're doing--making specific modifications due to information you have--is more in line with Bayesian statistics.

It's not better, it's not worse. It's different.
 
Joined
Apr 16, 2010
Messages
2,070 (0.39/day)
System Name iJayo
Processor i7 14700k
Motherboard Asus ROG STRIX z790-E wifi
Cooling Pearless Assasi
Memory 32 gigs Corsair Vengence
Video Card(s) Nvidia RTX 2070 Super
Storage 1tb 840 evo, Itb samsung M.2 ssd 1 & 3 tb seagate hdd, 120 gig Hyper X ssd
Display(s) 42" Nec retail display monitor/ 34" Dell curved 165hz monitor
Case O11 mini
Audio Device(s) M-Audio monitors
Power Supply LIan li 750 mini
Mouse corsair Dark Saber
Keyboard Roccat Vulcan 121
Software Window 11 pro
Benchmark Scores meh... feel me on the battle field!
......doesn't matter when they release it 'cause Nvidia has a counter punch ready. Honestly, in spite of how good the "10" series cards are, I feel as though they were just a fundraiser for what's coming next
......but in the meantime, even after AMD's new power-up, Nvidia's like

 
Joined
Jun 10, 2014
Messages
2,995 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
......doesn't matter when they release it 'cause Nvidia has a counter punch ready.
I'm just curious, what is the counterpart from Nvidia? (Except for the 1080 Ti, which will arrive before Vega)
 
Joined
Apr 16, 2010
Messages
2,070 (0.39/day)
System Name iJayo
Processor i7 14700k
Motherboard Asus ROG STRIX z790-E wifi
Cooling Pearless Assasi
Memory 32 gigs Corsair Vengence
Video Card(s) Nvidia RTX 2070 Super
Storage 1tb 840 evo, Itb samsung M.2 ssd 1 & 3 tb seagate hdd, 120 gig Hyper X ssd
Display(s) 42" Nec retail display monitor/ 34" Dell curved 165hz monitor
Case O11 mini
Audio Device(s) M-Audio monitors
Power Supply LIan li 750 mini
Mouse corsair Dark Saber
Keyboard Roccat Vulcan 121
Software Window 11 pro
Benchmark Scores meh... feel me on the battle field!
I'm just curious, what is the counterpart from Nvidia? (Except for the 1080 Ti, which will arrive before Vega)

....counter "punch" .......Nvidia could just drop the price of the 1080s, or start talking about the 20 series, or do nothing and let the 1080 Ti ride. They seem confident that it'll retain the crown even after Vega......
 

the54thvoid

Super Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
13,120 (2.39/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsumg 960 Pro M.2 512Gb
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure POwer M12 850w Gold (ATX3.0)
Software W10
I fully expect Vega 10 to beat the Titan XP. In fact, I would say the real question is whether it beats the Titan XP Black.

Your enthusiasm is... Optimistic. Would be lovely if true.
 
Joined
Jun 10, 2014
Messages
2,995 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
....counter "punch" .......Nvidia could just drop the price of the 1080s, or start talking about the 20 series, or do nothing and let the 1080 Ti ride. They seem confident that it'll retain the crown even after Vega......
They may lower the price, but they have no new chips until Volta next year.
 
Joined
Oct 2, 2015
Messages
27 (0.01/day)
Not really; these are targeted at the 1080 Ti, not the 1080. This would be about average for an AMD vs. Nvidia release schedule. I do, however, find the hype train a bit annoying, as per usual. I was expecting a Jan-Feb release.
 
Joined
Oct 2, 2015
Messages
27 (0.01/day)
Hype = Pascal DX12 hardware support, HBM2. When will Pascal justify the 1080/Titan pricing?
 
Joined
Feb 12, 2015
Messages
1,104 (0.31/day)
Vega 11 replacing Polaris 10? So is this going to be a complete next-gen product stack, aka RX 5xx? I thought it's supposed to add to the current product stack at the high end. The information seems off.

No, the information is dead on with literally everything we have been told by AMD.


Look at their product roadmap:

2015 = Fury HBM, R9 300 28nm

2016 = Polaris 10, 11

2017 = Vega 10, 11


Just read their bloody roadmap!!! 2016 had the 480 as a stopgap while they got their nearly all-HBM2 line-up ready.
 
Joined
Mar 7, 2007
Messages
1,426 (0.22/day)
Processor E5-1680 V2
Motherboard Rampage IV black
Video Card(s) Asrock 7900 xtx
Storage 500 gb sd
Software windows 10 64 bit
Benchmark Scores 29,433 3dmark06 score
Filling the SPs with the much enhanced scheduler should do it easily.

I'm no expert here, but just from looking at the specs and it being on 14 nm tech, especially if it clocks like hell (plus whatever they set the standard clocks at), I would honestly suspect this card is going to be in 1080 Ti territory, and depending on the application/game, drivers, etc., it may beat it.
 
Joined
Jun 10, 2014
Messages
2,995 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
Just read their bloody roadmap!!! 2016 had the 480 as a stopgap while they got their nearly all-HBM2 line-up ready.
I've not seen the details of Vega 11 yet, but I assume it will be GDDR5(X).

HBM has so far been a bad move for AMD, and it's not going to help Vega 10 for gaming either. GP102 doesn't need it, and it's still going to beat Vega 10. HBM or better will be needed eventually, but let's see if even Volta needs it for gaming. AMD should have spent their resources on the GPU rather than memory bandwidth they don't need.
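
For reference, peak bandwidth is just bus width x per-pin data rate / 8, which shows why GP102 gets by on GDDR5X (a quick C++ check; the figures are the commonly quoted specs):

```cpp
#include <cstdio>

// Peak memory bandwidth in GB/s: bus width (bits) x per-pin data rate
// (Gbps, effective) / 8 bits per byte.
double BandwidthGBs(int busBits, double gbpsPerPin) {
    return busBits * gbpsPerPin / 8.0;
}

int main() {
    // Fury X, HBM1: 4096-bit bus, 1 Gbps effective per pin.
    std::printf("Fury X (HBM1):        %.0f GB/s\n", BandwidthGBs(4096, 1.0));  // 512
    // Titan X Pascal, GDDR5X: 384-bit bus, 10 Gbps per pin.
    std::printf("Titan X Pascal (G5X): %.0f GB/s\n", BandwidthGBs(384, 10.0));  // 480
    return 0;
}
```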
 