
AMD Radeon VII Detailed Some More: Die-size, Secret-sauce, Ray-tracing, and More

Joined
Jun 10, 2014
Messages
2,978 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
What are the drawbacks? What advantages does the 2080 have? You can't be talking about RTX and DLSS, can you?
Primarily a major difference in TDP: 215W vs. ~300W.

When you have competing products A and B which perform and cost the same, but one of them has a major disadvantage, why would anyone ever buy it?
 
Joined
Mar 10, 2015
Messages
3,984 (1.13/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
I don't consider that a major disadvantage. It's probably less than $20 a year. If that is the only disadvantage then I don't see a problem. Also, throw that 215W out after you start overclocking and lower that 300W when you undervolt.
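For anyone who wants to sanity-check that figure, here's the rough math on the power gap (the gaming hours and electricity price below are assumptions for illustration, not measurements):

```python
# Back-of-the-envelope electricity cost of the ~85 W gap between the two cards.
# All inputs here are assumptions for illustration, not measured numbers.
power_gap_w = 300 - 215        # W difference under gaming load
hours_per_day = 3              # assumed gaming hours per day
price_per_kwh = 0.13           # assumed electricity price in $/kWh

kwh_per_year = power_gap_w / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.0f} kWh/year -> ${cost_per_year:.2f}/year")
# ~93 kWh/year -> ~$12/year with these inputs, i.e. under the $20 figure above
```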
 
Joined
Nov 21, 2010
Messages
2,350 (0.46/day)
Location
Right where I want to be
System Name Miami
Processor Ryzen 3800X
Motherboard Asus Crosshair VII Formula
Cooling Ek Velocity/ 2x 280mm Radiators/ Alphacool fullcover
Memory F4-3600C16Q-32GTZNC
Video Card(s) XFX 6900 XT Speedster 0
Storage 1TB WD M.2 SSD/ 2TB WD SN750/ 4TB WD Black HDD
Display(s) DELL AW3420DW / HP ZR24w
Case Lian Li O11 Dynamic XL
Audio Device(s) EVGA Nu Audio
Power Supply Seasonic Prime Gold 1000W+750W
Mouse Corsair Scimitar/Glorious Model O-
Keyboard Corsair K95 Platinum
Software Windows 10 Pro
According to AMD the cost of 7nm is significant, and with 16 GB of HBM2 I can't imagine this card is cheap for them to build, but I assume they are at least making some money.

All that slide really says is that if the die size stays the same, costs go up; and when does a die not get smaller when going from a larger process to a smaller one? With that in mind, it suggests they were profiting with decreasing margins right up until the jump to 7nm.
 
Joined
Jul 5, 2013
Messages
27,380 (6.61/day)
Late to the party again, but I'd say this is a decent answer to RTX. Maybe not the show-stopper that Ryzen was, but damn decent nonetheless. It seems AMD has kicked it up.

Good luck with RayTracing in software; if that were viable we would have had it already. If they do it, it is just a desperate move to not look obsolete.
Raytracing has been done in software for decades, just not real-time.
Do not expect RayTracing in hardware until the end of 2020, and even then they will be years behind nVidia, who will by that time be in the process of readying their third-gen RTX cards for release.
You don't and can't know any of that.
 
Joined
May 15, 2014
Messages
235 (0.06/day)
[QUOTE="Kaotik, post: 3974472, member: 101367"Unless proven otherwise it should be 64, as the Vega 20 diagrams from Instinct release clearly show 4 Pixel Engines per Shader Engine.[/QUOTE]

I must admit I took the 128 ROPs report as a given. If the Instinct diagrams aren't just high-level copies of the Vega 10 slides, then it's definitely 64 ROPs for Vega 20.
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
47,166 (7.56/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
AMD has confirmed that the card's ROP count is 64.
 

Nkd

Joined
Sep 15, 2007
Messages
364 (0.06/day)
Good luck with RayTracing in software; if that were viable we would have had it already. If they do it, it is just a desperate move to not look obsolete.

Do not expect RayTracing in hardware until the end of 2020, and even then they will be years behind nVidia, who will by that time be in the process of readying their third-gen RTX cards for release.

We need Intel to enter the market with RayTracing from the get-go in 2020.

I also have a feeling that AMD may be working secretly with Intel on RayTracing tech to set up a unified standard against Nvidia's RTX.

A 3rd-gen RTX card? Not happening, lol. Nvidia is not going to refresh until 2020; that's when they will have 7nm. You really think Nvidia is going to replace the RTX 20 series after less than 12 months? They don't have a process to shrink to and they are not in a hurry to do it. Heck, they stretched Pascal for 2 years. So Nvidia is going to have 3 RTX generations in 3 years, lol. Do you realize what you are saying?

Late to the party again, but I'd say this is a decent answer to RTX. Maybe not the show-stopper that Ryzen was, but damn decent nonetheless. It seems AMD has kicked it up.


Raytracing has been done in software for decades, just not real-time.

You don't and can't know any of that.


Yeah, he thinks Nvidia is going to release 3 RTX generations in 3 years: 2018, 2019 and then 2020, when Pascal alone went for 2 years. Not sure about that, rofl.
 
Joined
Sep 2, 2014
Messages
660 (0.18/day)
Location
Scotland
Processor 5800x
Motherboard b550-e
Cooling full - custom liquid loop
Memory cl16 - 32gb
Video Card(s) 6800xt
Storage nvme 1TB + ssd 750gb
Display(s) xg32vc
Case hyte y60
Power Supply 1000W - gold
Software 10
The only way they could have pulled this off is if they had priced the Radeon 7 at $649 or $599, not $699. $699 is the same price as the RTX 2080, but the 2080 doesn't have the heat or power use, and it has RT cores, Tensor cores, etc. Overall the RTX 2080 is expensive because it has new tech in it. If I have to pay the same price, I will buy the one with the lower power draw, the lower heat, and the more advanced tech in it.



$699 is a good price for that performance, plus don't forget how the stock cooler looks. It's not a shitty blower-style design.
The rumor is that it costs close to $750 to make the Radeon 7 cards. So no, they are not making money. This is just to stop the bleeding.
 
Joined
Apr 10, 2013
Messages
302 (0.07/day)
Location
Michigan, USA
Processor AMD 1700X
Motherboard Crosshair VI Hero
Memory F4-3200C14D-16GFX
Video Card(s) GTX 1070
Storage 960 Pro
Display(s) PG279Q
Case HAF X
Power Supply Silencer MK III 850
Mouse Logitech G700s
Keyboard Logitech G105
Software Windows 10
What are the drawbacks? What advantages does the 2080 have? You can't be talking about RTX and DLSS, can you?
The drawback is the rumored price of $699 and the missing technology. If you can get the technology with the other product at the same price, why settle? It is like choosing between two identical cars where one has headlights and one doesn't. The salesman can say, "Hey, it is light out right now, maybe you won't need those headlights." AMD's engineering has always been adequate, but it sold by undercutting the competition's pricing. If AMD intends to match the competition's GPU prices, I can't see how they continue to improve their already dismal market share. NVIDIA's release and pricing led to a major crash in their sales and stock value; I am not sure why a strengthening AMD would want to embrace that model. AMD has a long way to go before they can price with the big boys.
 
Joined
Apr 26, 2008
Messages
1,136 (0.19/day)
Location
london
System Name Staggered
Processor Intel i5 6600k (XSPC Rasa)
Motherboard Gigabyte Z170 Gaming K3
Cooling RX360 (3*Scythe GT1850) + RX240 (2*Scythe GT1850) + Laing D5 Vario (with EK X-Top V2)
Memory 2*8gb Team Group Dark @3000Mhz 16-16-16-36 1.25v
Video Card(s) Inno3D GTX 1070 HerculeZ
Storage 256gb Samsung 830 + 2*1tB Samsung F3 + 2*2tB Samsung F4EG
Display(s) Flatron W3000H 2560*1600
Case Cooler Master ATCS 840 + 1*120 GT1850 (exhaust) + 1*230 Spectre Pro + Lamptron FC2 (fan controller)
Power Supply Enermax Revolution 85+ 1250W
Software Windows 10 Pro 64bit
It should be noted that Nvidia has a huge Achilles heel with the RTX series: RT operations are INT-based, and the card needs to flush to switch between FP and INT operations.

Dedicated hardware acceleration for RT is a smokescreen IMO; the key is whether you can cut your FP or INT instructions down as small as possible and run as many of them in parallel as possible. AMD does have some FP division capability, so it's possible that some cards can be retrofitted for RT.
Source? I tried googling it and couldn't find anything.
 
Joined
Feb 18, 2005
Messages
5,755 (0.80/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
I just remembered how the AMD fanboys were pissing all over the RTX 2080's $699 price tag at launch, but Radeon VII comes along at the same price and suddenly people are claiming it's great value.

No, great value would be if it wasn't just a Vega respin with double the memory bandwidth, double the VRAM, 250MHz higher clocks, and an extra $200 tacked on to the price. The die shrink to 7nm is going to help with power and heat, but this is still Vega/GCN 5 with all its limitations, and I honestly don't expect this card to outperform the RTX 2080 in the way AMD is claiming.
 
Joined
Jun 10, 2014
Messages
2,978 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
For reference, Vega 20 would need roughly 40% more performance than Vega 10 to be on par with the RTX 2080. I do wonder what changes are going to make that possible.
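To show where a figure like that could come from, here is a quick sketch using an assumed relative-performance number (the 72% below is an assumption roughly in line with aggregate 4K results at the time, not an official figure):

```python
# If Vega 64 lands at ~72% of an RTX 2080 in an aggregate benchmark (assumed),
# the uplift Vega 20 needs over Vega 10 to reach parity is:
vega64_rel_perf = 0.72                      # assumed, relative to RTX 2080 = 1.0
required_uplift = 1 / vega64_rel_perf - 1
print(f"required uplift over Vega 10: {required_uplift:.0%}")   # ~39%
```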
 
Joined
Oct 1, 2006
Messages
4,930 (0.75/day)
Location
Hong Kong
Processor Core i7-12700k
Motherboard Z690 Aero G D4
Cooling Custom loop water, 3x 420 Rad
Video Card(s) RX 7900 XTX Phantom Gaming
Storage Plextor M10P 2TB
Display(s) InnoCN 27M2V
Case Thermaltake Level 20 XT
Audio Device(s) Soundblaster AE-5 Plus
Power Supply FSP Aurum PT 1200W
Software Windows 11 Pro 64-bit
I just remembered how the AMD fanboys were pissing all over the RTX 2080's $699 price tag at launch, but Radeon VII comes along at the same price and suddenly people are claiming it's great value.

No, great value would be if it wasn't just a Vega respin with double the memory bandwidth, double the VRAM, 250MHz higher clocks, and an extra $200 tacked on to the price. The die shrink to 7nm is going to help with power and heat, but this is still Vega/GCN 5 with all its limitations, and I honestly don't expect this card to outperform the RTX 2080 in the way AMD is claiming.
Both are bad value; one being worse than the other doesn't mean either card is good value.
 

Nkd

Joined
Sep 15, 2007
Messages
364 (0.06/day)
Primarily a major difference in TDP: 215W vs. ~300W.

When you have competing products A and B which perform and cost the same, but one of them has a major disadvantage, why would anyone ever buy it?

The RTX 2080 draws around 225W. It remains to be seen what the actual power usage is on the Radeon 7 during gaming. For that we wait for reviews.

The only way they could have pulled this off is if they had priced the Radeon 7 at $649 or $599, not $699. $699 is the same price as the RTX 2080, but the 2080 doesn't have the heat or power use, and it has RT cores, Tensor cores, etc. Overall the RTX 2080 is expensive because it has new tech in it. If I have to pay the same price, I will buy the one with the lower power draw, the lower heat, and the more advanced tech in it.

The rumor is that it costs close to $750 to make the Radeon 7 cards. So no, they are not making money. This is just to stop the bleeding.

I don't think that's how much it costs them to make; that was what they originally wanted to sell it at. Yeah, I have no doubt they are not making much on it.

Plus, let's hold off on the heat part. Wait for the reviews; you can't complain about heat when you haven't seen the temps yet. Will it use more power? Yeah, sure, but that doesn't mean it's going to run hot.
 

M2B

Joined
Jun 2, 2017
Messages
284 (0.10/day)
Location
Iran
Processor Intel Core i5-8600K @4.9GHz
Motherboard MSI Z370 Gaming Pro Carbon
Cooling Cooler Master MasterLiquid ML240L RGB
Memory XPG 8GBx2 - 3200MHz CL16
Video Card(s) Asus Strix GTX 1080 OC Edition 8G 11Gbps
Storage 2x Samsung 850 EVO 1TB
Display(s) BenQ PD3200U
Case Thermaltake View 71 Tempered Glass RGB Edition
Power Supply EVGA 650 P2
For reference, Vega 20 would need roughly 40% more performance than Vega 10 to be on par with the RTX 2080. I do wonder what changes are going to make that possible.


This video shows the performance of a Vega 64 clocked at 1,750MHz against an RTX 2080 running at stock clocks. (Also, don't forget Vega 64 has 4 more CUs than Radeon VII, which makes up for the 50MHz core clock deficit.)
Even the memory on the AMD side is overclocked, and at those clocks the Vega has about 580GB/s of memory bandwidth, which is quite a lot.
This is pretty much what you would expect a Radeon VII to do, maybe a little bit better.
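For reference, that bandwidth figure falls straight out of the HBM2 clock and bus width; a minimal sketch (the 1,130 MHz memory overclock is an assumed value chosen to land near the ~580 GB/s mentioned above):

```python
# HBM2 bandwidth = bus width (bits) * data rate (2 transfers per clock per pin) / 8
def hbm2_bandwidth_gbs(bus_width_bits: int, mem_clock_mhz: float) -> float:
    return bus_width_bits * mem_clock_mhz * 2 / 8 / 1000  # GB/s

print(hbm2_bandwidth_gbs(2048, 945))    # Vega 64 stock:            ~484 GB/s
print(hbm2_bandwidth_gbs(2048, 1130))   # assumed OC in the video:  ~579 GB/s
print(hbm2_bandwidth_gbs(4096, 1000))   # Radeon VII:               ~1024 GB/s ("1 TB/s")
```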
 
Joined
Jun 10, 2014
Messages
2,978 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
The RTX 2080 draws around 225W. It remains to be seen what the actual power usage is on the Radeon 7 during gaming. For that we wait for reviews.
AMD promises "25% more performance at the same power", whatever that means.
25% is not enough to be on par with RTX 2080.

But as you say, reviews will tell the truth.
 
Joined
Mar 10, 2015
Messages
3,984 (1.13/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
If you can get the technology with the other product

I fail to see the missing technology. RTX is usable in one game...and the series is trash. DLSS looks like shit compared to the other available methods. I fail to see what benefits the 2080 has.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.48/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
AMD promises "25% more performance at the same power", whatever that means.
25% is not enough to be on par with RTX 2080.

But as you say, reviews will tell the truth.
If you take it at face value, Radeon VII has 25% more performance at the same power consumption (295W).
13% of that performance comes from the higher boost clock of 1800 MHz (remember, it's 4 CUs short).
12% likely comes from Radeon VII's ability to hold its boost clock longer than Vega 64 does.

You know how it goes: they're likely talking about games where Vega 64 does really well against Turing. I highly doubt they're talking about an average.
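A rough sketch of where a split like that could come from, assuming Vega 64 sustains roughly 1.5 GHz in practice and that performance scales simply with clock × CU count (both are assumptions, not AMD's numbers):

```python
# Naive throughput scaling: performance ~ sustained clock * CU count.
v64_cus, v64_clock_mhz = 64, 1500    # assumed real-world sustained clock for Vega 64
vii_cus, vii_clock_mhz = 60, 1800    # Radeon VII: 4 fewer CUs, 1800 MHz boost

scaling = (vii_cus * vii_clock_mhz) / (v64_cus * v64_clock_mhz)
print(f"clock x CU scaling alone: {scaling - 1:+.1%}")   # ~+12.5% with these inputs
# Whatever is left of the claimed +25% would have to come from holding clocks
# better and from the doubled memory bandwidth.
```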
 
Joined
Nov 1, 2018
Messages
584 (0.27/day)
I wonder why AMD is stuck at a maximum of 4096 SPs?
I mean... Fury, Vega (1), Vega II... they are almost identical.

Considering that the new chip is rather small at 331 mm², what stopped them from making a 450 mm² chip, for example, and fitting 72 CUs in it, or 96?!
It would wipe the floor with the 2080 Ti with 6144 SPs (let's say a few get cut for being defective; even with 5760 SPs it would still crush it with raw compute power and that massive 1 TB/s of bandwidth, WHILE BEING A SMALLER CHIP thanks to 7nm).

Instead, they just shrunk Fury, then shrunk it again without adding anything :(
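To put rough numbers on that idea, here's a sketch of how SP counts and die area would scale if everything grew linearly with CU count (a simplification: uncore blocks like the memory PHYs don't scale, so a real die would land elsewhere):

```python
# GCN packs 64 stream processors (SPs) per CU; Vega 20 is 64 CUs in ~331 mm^2.
SP_PER_CU = 64
base_cus, base_area_mm2 = 64, 331

for cus in (72, 90, 96):
    sps = cus * SP_PER_CU
    naive_area = base_area_mm2 * cus / base_cus  # pretends the whole die scales with CUs
    print(f"{cus} CUs -> {sps} SPs, naively ~{naive_area:.0f} mm^2")
# 72 CUs -> 4608 SPs (~372 mm^2), 90 -> 5760 SPs (~466 mm^2), 96 -> 6144 SPs (~497 mm^2)
```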
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.48/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
Because bigger = lower yields. AMD is all about mass production these days.
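The yield point can be made concrete with the classic Poisson defect model, yield ≈ e^(-D0*A); the defect density below is purely an assumed value for an early 7nm process, not a published TSMC figure:

```python
import math

def poisson_yield(area_mm2: float, defects_per_cm2: float) -> float:
    """Fraction of dies that come out defect-free under a simple Poisson model."""
    return math.exp(-defects_per_cm2 * area_mm2 / 100.0)

D0 = 0.5  # assumed defects per cm^2 for an early 7nm process (illustrative only)
for area in (331, 450, 550):
    print(f"{area} mm^2 -> ~{poisson_yield(area, D0):.0%} defect-free dies")
# Larger dies lose yield quickly, which is one reason to keep Vega 20 small and
# sell partially disabled dies (like the 60-CU Radeon VII) as salvage parts.
```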
 
Joined
Nov 21, 2010
Messages
2,350 (0.46/day)
Location
Right where I want to be
System Name Miami
Processor Ryzen 3800X
Motherboard Asus Crosshair VII Formula
Cooling Ek Velocity/ 2x 280mm Radiators/ Alphacool fullcover
Memory F4-3600C16Q-32GTZNC
Video Card(s) XFX 6900 XT Speedster 0
Storage 1TB WD M.2 SSD/ 2TB WD SN750/ 4TB WD Black HDD
Display(s) DELL AW3420DW / HP ZR24w
Case Lian Li O11 Dynamic XL
Audio Device(s) EVGA Nu Audio
Power Supply Seasonic Prime Gold 1000W+750W
Mouse Corsair Scimitar/Glorious Model O-
Keyboard Corsair K95 Platinum
Software Windows 10 Pro
I just remembered how the AMD fanboys were pissing all over the RTX 2080's $699 price tag at launch, but Radeon VII comes along at the same price and suddenly people are claiming it's great value.

No, great value would be if it wasn't just a Vega respin with double the memory bandwidth, double the VRAM, 250MHz higher clocks, and an extra $200 tacked on to the price. The die shrink to 7nm is going to help with power and heat, but this is still Vega/GCN 5 with all its limitations, and I honestly don't expect this card to outperform the RTX 2080 in the way AMD is claiming.

I see people justifying the power consumption, but I don't see it; there's only ONE comment stating "...because the 2080 is $699." Was its 10-series counterpart also $699 at launch?

AMD promises "25% more performance at the same power", whatever that means.
25% is not enough to be on par with RTX 2080.

But as you say, reviews will tell the truth.

Good spot, but it probably means what it says it does: it's 25% more efficient. Assuming it's being compared to the V64, when consuming the same amount of power it does 25% more work. We could probably figure out how much power this card really sucks down with that; assuming power/perf scales linearly, a little guesstimation (2080 power * [V64/2080] ratio * [V7/V64] ratio) puts the card around 400-450W.
 
Joined
Jul 15, 2006
Messages
1,256 (0.19/day)
Location
Noir York
Processor AMD Ryzen 7 5700G
Motherboard ASUS A520M-K
Cooling Scythe Kotetsu Mark II
Memory 2 x 16GB SK Hynix CJR OEM DDR4-3200 @ 4000 20-22-20-48
Video Card(s) Colorful RTX 2060 SUPER 8GB GDDR6
Storage 250GB WD BLACK SN750 M.2 + 4TB WD Red Plus + 4TB WD Purple
Display(s) AOpen 27HC5R 27" 1080p 165Hz curved VA
Case AIGO Darkflash C285
Audio Device(s) Creative SoundBlaster Z + Kurtzweil KS-40A bookshelf / Sennheiser HD555
Power Supply Great Wall GW-EPS1000DA 1kW
Mouse Razer Deathadder Essential
Keyboard Cougar Attack2 Cherry MX Black
Software Windows 10 Pro x64 22H2
I wonder why AMD is stuck at a maximum of 4096 SPs?
I mean... Fury, Vega (1), Vega II... they are almost identical.

Considering that the new chip is rather small at 331 mm², what stopped them from making a 450 mm² chip, for example, and fitting 72 CUs in it, or 96?!
It would wipe the floor with the 2080 Ti with 6144 SPs (let's say a few get cut for being defective; even with 5760 SPs it would still crush it with raw compute power and that massive 1 TB/s of bandwidth, WHILE BEING A SMALLER CHIP thanks to 7nm).

Instead, they just shrunk Fury, then shrunk it again without adding anything :(
They say they have removed the 4 Shader Engine limitation on GCN5 (Vega), but I don't believe that. Instead they put in things to mitigate the limitation, like DSBR, NGG fast path and HBCC, some of which are broken. AMD should dump GCN for gaming cards and start anew.

Even if I didn't have my Vega 56, I wouldn't buy this card at all; for one, it still carries the same limitations as Fiji. They only increased the clock speed and added tiny improvements here and there. The only reason I bought my Vega 56 is that it didn't have the dreaded 4GB limitation of the Fury, so new games won't choke, and I got it cheap after the mining crash.
 
Joined
Oct 14, 2017
Messages
210 (0.08/day)
System Name Lightning
Processor 4790K
Motherboard asrock z87 extreme 3
Cooling hwlabs black ice 20 fpi radiator, cpu mosfet blocks, MCW60 cpu block, full cover on 780Ti's
Memory corsair dominator platinum 2400C10, 32 giga, DDR3
Video Card(s) 2x780Ti
Storage intel S3700 400GB, samsung 850 pro 120 GB, a cheep intel MLC 120GB, an another even cheeper 120GB
Display(s) eizo foris fg2421
Case 700D
Audio Device(s) ESI Juli@
Power Supply seasonic platinum 1000
Mouse mx518
Software Lightning v2.0a
I wonder why AMD is stuck at a maximum of 4096 SPs?
I mean... Fury, Vega (1), Vega II... they are almost identical.

Considering that the new chip is rather small at 331 mm², what stopped them from making a 450 mm² chip, for example, and fitting 72 CUs in it, or 96?!
It would wipe the floor with the 2080 Ti with 6144 SPs (let's say a few get cut for being defective; even with 5760 SPs it would still crush it with raw compute power and that massive 1 TB/s of bandwidth, WHILE BEING A SMALLER CHIP thanks to 7nm).

Instead, they just shrunk Fury, then shrunk it again without adding anything :(

Because bigger = lower yields. AMD is all about mass production these days.

So... the rules of interactive entertainment are?
Think little? Stay little.
Think big? Get BIG.

Isn't that why Nvidia has money for drivers and AMD doesn't? The money they get from thinking big pays for their drivers, while AMD fails again and again with drivers and still hasn't learned that game developer relations are important. Let me guess: the "solution" to developer relations is more GB/s of memory bandwidth and another 1000 MHz. They never learn. Do it wrong once, you're stupid; do it wrong twice, you're a fool; do it wrong three times, you're insane.
 
Joined
Mar 10, 2015
Messages
3,984 (1.13/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
Do it wrong once, you're stupid; do it wrong twice, you're a fool; do it wrong three times, you're insane.

What does complaining about driver issues that don't exist make you?
 
Joined
Oct 14, 2017
Messages
210 (0.08/day)
System Name Lightning
Processor 4790K
Motherboard asrock z87 extreme 3
Cooling hwlabs black ice 20 fpi radiator, cpu mosfet blocks, MCW60 cpu block, full cover on 780Ti's
Memory corsair dominator platinum 2400C10, 32 giga, DDR3
Video Card(s) 2x780Ti
Storage intel S3700 400GB, samsung 850 pro 120 GB, a cheep intel MLC 120GB, an another even cheeper 120GB
Display(s) eizo foris fg2421
Case 700D
Audio Device(s) ESI Juli@
Power Supply seasonic platinum 1000
Mouse mx518
Software Lightning v2.0a
Where do you see complaining? And where do you see "don't exist"?

AMD shills? I can respect that, and you look like a fighter too: "fight for your right to game on AMD, kill anyone who looks like they're against AMD"?
But you could use your brain: a game developer relations program would benefit AMD more than a few more MHz and a few more GB/s of memory bandwidth.
You don't see the advantage of that? For your own good? What does that make you?
How does async compute enabled in all games sound to you? Should I mention how much faster Doom 4 was with async enabled? And that was just one game where it was used... WITHOUT AMD's help... and that's just the beginning. Can you imagine what it could mean if AMD were involved? In all games?
Oh wait, you are a Radeon expert; I'm sorry, you must know more than I do.
 