
The future of RDNA on Desktop.

Joined
Sep 7, 2017
Messages
16 (0.01/day)
Now that AMD's RX 9070 series GPUs have been announced and will go on sale starting this Thursday, I have been reading rumors that this is it for RDNA on desktop. The next revision of RDNA, RDNA 5, will be for mobile and gaming consoles. The next architecture for desktop GPUs will be UDNA. If RDNA 4 is successful beyond all expectations, could AMD leadership be persuaded to do one more iteration of RDNA? Call it RDNA 4+, configured with 20 GB of GDDR7 (the current boards use GDDR6). Call the new card the 9070 XTX, and do a similar upgrade on the current vanilla 9070 with 20 GB of GDDR7.

As you can tell, UDNA 1 doesn't give me the warm and fuzzies. And it looks to me like AMD has a winner in the 9070 series.

Any thoughts?
 
Joined
Jan 14, 2019
Messages
14,754 (6.58/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case It's not about size, but how you use it
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanic
VR HMD Not yet
Software Linux gaming master race
RDNA 4 is the last RDNA architecture. The next mainstream desktop one is UDNA. They decided so because it costs AMD too much to develop two architectures simultaneously, not to mention that sporadic ROCm support on (cheaper) gaming cards drives away sales from people who need compute but don't have the need and/or cash for an enterprise card, and so have to choose Nvidia for CUDA. As for whether it's good for gaming or not, we'll see. Personally, I'm due for an upgrade anyway, so I'll just get a 9070 XT and call it quits for a good 2-3 gens.
 
Joined
Jul 13, 2016
Messages
3,534 (1.12/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage P5800X 1.6TB 4x 15.36TB Micron 9300 Pro 4x WD Black 8TB M.2
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) JDS Element IV, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse PMM P-305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
Now that AMD's RX 9070 series GPUs have been announced and will go on sale starting this Thursday, I have been reading rumors that this is it for RDNA on desktop. The next revision of RDNA, RDNA 5, will be for mobile and gaming consoles. The next architecture for desktop GPUs will be UDNA. If RDNA 4 is successful beyond all expectations, could AMD leadership be persuaded to do one more iteration of RDNA? Call it RDNA 4+, configured with 20 GB of GDDR7 (the current boards use GDDR6). Call the new card the 9070 XTX, and do a similar upgrade on the current vanilla 9070 with 20 GB of GDDR7.

As you can tell, UDNA 1 doesn't give me the warm and fuzzies. And it looks to me like AMD has a winner in the 9070 series.

Any thoughts?

We might see higher-VRAM variants, but I don't see AMD spending any time on significant revisions of future RDNA cards past the soon-to-be-released series.

That would just detract resources from UDNA, and that is AMD's number one priority right now. Keep in mind that UDNA will include all their gaming, compute, and AI improvements from now on. The name might not sound exciting to gamers, but it should be, given that some of the compute and AI benefits have already been integrated into RDNA 4.
 
Joined
Dec 28, 2012
Messages
4,255 (0.96/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
AMD could have easily given it 6144 units and GDDR7 with just a 16% larger die area of 420 mm².
Nah we're not allowed to have high end cards anymore.
 
Joined
Jan 14, 2019
Messages
14,754 (6.58/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case It's not about size, but how you use it
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanic
VR HMD Not yet
Software Linux gaming master race
AMD could have easily given it 6144 units and GDDR7 with just a 16% larger die area of 420 mm².
They could have, but a larger chip comes with higher manufacturing costs, which translates into a higher price. People look to Nvidia at that price point, so it would have been a futile investment.
 
Joined
Jun 19, 2024
Messages
535 (2.07/day)
System Name XPS, Lenovo and HP Laptops, HP Xeon Mobile Workstation, HP Servers, Dell Desktops
Processor Everything from Turion to 13900kf
Motherboard MSI - they own the OEM market
Cooling Air on laptops, lots of air on servers, AIO on desktops
Memory I think one of the laptops is 2GB, to 64GB on gamer, to 128GB on ZFS Filer
Video Card(s) A pile up to my knee, with a RTX 4090 teetering on top
Storage Rust in the closet, solid state everywhere else
Display(s) Laptop crap, LG UltraGear of various vintages
Case OEM and a 42U rack
Audio Device(s) Headphones
Power Supply Whole home UPS w/Generac Standby Generator
Software ZFS, UniFi Network Application, Entra, AWS IoT Core, Splunk
Benchmark Scores 1.21 GigaBungholioMarks

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
43,896 (6.80/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
UDNA is the next arch after RDNA
 
Joined
Dec 25, 2020
Messages
7,760 (5.07/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) NVIDIA RTX A2000
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Sony MDR-V7 connected through Apple USB-C
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse (2017)
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
Now that AMD's RX 9070 series GPUs have been announced and will go on sale starting this Thursday, I have been reading rumors that this is it for RDNA on desktop. The next revision of RDNA, RDNA 5, will be for mobile and gaming consoles. The next architecture for desktop GPUs will be UDNA. If RDNA 4 is successful beyond all expectations, could AMD leadership be persuaded to do one more iteration of RDNA? Call it RDNA 4+, configured with 20 GB of GDDR7 (the current boards use GDDR6). Call the new card the 9070 XTX, and do a similar upgrade on the current vanilla 9070 with 20 GB of GDDR7.

As you can tell, UDNA 1 doesn't give me the warm and fuzzies. And it looks to me like AMD has a winner in the 9070 series.

Any thoughts?

No, it's not that simple. If Navi 48's memory controller wasn't designed to handle G6X or G7 from the get-go, upgrading the memory won't be a trivial matter. They could at most extend memory capacity by adopting 32 Gbit ICs; 16 GB on a 256-bit bus likely means eight 16 Gbit ICs installed. But higher-density memory chips and clamshell layouts tend to have drawbacks, like inferior timings, worse heat dissipation, and, in the case of a clamshell configuration, higher power consumption.
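To put numbers on that, here is the standard capacity arithmetic (the density options are illustrative, not a leaked SKU list):

```python
# Standard GDDR6 capacity arithmetic for a 256-bit card like Navi 48.
BUS_BITS = 256            # Navi 48's memory bus
IC_BITS = 32              # each GDDR6 IC has a 32-bit interface

ics = BUS_BITS // IC_BITS                      # 8 ICs in a normal layout
for density_gbit in (16, 32):
    print(f"{ics} x {density_gbit} Gbit = {ics * density_gbit / 8:.0f} GB")

# Clamshell doubles the IC count (two chips share each 32-bit channel),
# trading extra power and heat for capacity:
print(f"clamshell: {2 * ics} x 16 Gbit = {2 * ics * 16 / 8:.0f} GB")
```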
 
Joined
Jan 2, 2024
Messages
791 (1.85/day)
Location
Seattle
System Name DevKit
Processor AMD Ryzen 5 3600 ↗4.0GHz
Motherboard Asus TUF Gaming X570-Plus WiFi
Cooling Koolance CPU-300-H06, Koolance GPU-180-L06, SC800 Pump
Memory 4x16GB Ballistix 3200MT/s ↗3800
Video Card(s) PowerColor RX 580 Red Devil 8GB ↗1380MHz ↘1105mV, PowerColor RX 7900 XT Hellhound 20GB
Storage 240GB Corsair MP510, 120GB KingDian S280
Display(s) Nixeus VUE-24 (1080p144)
Case Koolance PC2-601BLW + Koolance EHX1020CUV Radiator Kit
Audio Device(s) Oculus CV-1
Power Supply Antec Earthwatts EA-750 Semi-Modular
Mouse Easterntimes Tech X-08, Zelotes C-12
Keyboard Logitech 106-key, Romoral 15-Key Macro, Royal Kludge RK84
VR HMD Oculus CV-1
Software Windows 10 Pro Workstation, VMware Workstation 16 Pro, MS SQL Server 2016, Fan Control v120, Blender
Benchmark Scores Cinebench R15: 1590cb Cinebench R20: 3530cb (7.83x451cb) CPU-Z 17.01.64: 481.2/3896.8 VRMark: 8009
This gen RDNA doesn't need GDDR7 to compete. Straight up.
 
Joined
May 13, 2008
Messages
989 (0.16/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
Now that AMD's RX 9070 series GPUs have been announced and will go on sale starting this Thursday, I have been reading rumors that this is it for RDNA on desktop. The next revision of RDNA, RDNA 5, will be for mobile and gaming consoles. The next architecture for desktop GPUs will be UDNA. If RDNA 4 is successful beyond all expectations, could AMD leadership be persuaded to do one more iteration of RDNA? Call it RDNA 4+, configured with 20 GB of GDDR7 (the current boards use GDDR6). Call the new card the 9070 XTX, and do a similar upgrade on the current vanilla 9070 with 20 GB of GDDR7.

As you can tell, UDNA 1 doesn't give me the warm and fuzzies. And it looks to me like AMD has a winner in the 9070 series.

Any thoughts?

I wouldn't worry about the name. Just think of the 9070 XT as a low-end UDNA chip you're getting early.

As I've said before, it's likely RDNA4 is competing with the next-gen low-end Rubin (128-bit) and *maybe* (if they release a high-clocked 16/32GB model) a cut-down slightly higher-end chip (7680sp on 3nm).

The point is 1080p RT minimums and 1440p RT (sometimes upscaled) averages, and that is likely how the XT/XTX will be divided. Again, I reference Wukong at 1080p and 1440p, or Spider-Man 2 at 1080p or 1440p.

It's just bringing the new market segmentation to you a little early because right now some games will be okay-ish at 1440p, but long-term it'll be a 1080p card. 1440p will be the 18GB market on 3nm.

This might piss some people off long-term, but remember that nVIDIA still thinks that's worth $750-1000. Again, this is why this chip should be cheap and they shouldn't sell it as 1440p/4k *really*...but whatever.

We're at a transitional point between pure raster and the standardization of RT, so I guess they will.

Beyond that, I wouldn't worry.
AMD could have easily given it 6144 units and GDDR7 with just a 16% larger die area of 420 mm².
You mean ~12288 SP (essentially a 7900 XTX but with RT/FSR improvements and higher clocks)? That's next-gen...I think they purposely avoided it this gen, given they'd have to sell it for >$1000.
When nVIDIA cut the costs of the 4080 (segment), I think they decided it wasn't worth pursuing on 4nm. Next-gen AMD/nVIDIA are going to fight that battle HARD (~4090+ performance), as lots of people want that.

If you mean 6144 SP, like the 5070, that too (approximate configuration; 3x 1920 SP or 2048 SP chiplets) is probably next-gen. Same with nVIDIA. It will likely replace the 9070 XT almost directly in performance on 128-bit/GDDR7.
 
Joined
Dec 31, 2020
Messages
1,212 (0.80/day)
I mean a 9090 for €900 with 50% more units (96 × 64 = 6144), 192 ROPs, and 24 Gbps memory. It doesn't have to be 384-bit: 30% faster at almost no extra cost or die increase, to crush the 5080. But I get it. As it is now, the chip's 2:1 aspect ratio is very nice too.
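Rough math behind those numbers (my arithmetic; the configuration is hypothetical, nothing announced):

```python
# Rough arithmetic behind the hypothetical "9090" (nothing announced).
base_cus, sp_per_cu = 64, 64          # 9070 XT: 64 CUs x 64 SPs = 4096 SPs
big_cus = 96                          # the proposed 96 x 64 configuration
print(f"{big_cus * sp_per_cu} SPs (+{big_cus / base_cus - 1:.0%} units)")

# 24 Gbps GDDR6 on the same 256-bit bus vs today's 20 Gbps:
for gbps in (20, 24):
    print(f"{gbps} Gbps x 256-bit = {gbps * 256 / 8:.0f} GB/s")
# -> 640 vs 768 GB/s: +20% bandwidth without going to 384-bit.
```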
 
Joined
May 13, 2008
Messages
989 (0.16/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
I mean a 9090 for €900 with 50% more units (96 × 64 = 6144), 192 ROPs, and 24 Gbps memory. It doesn't have to be 384-bit: 30% faster at almost no extra cost or die increase, to crush the 5080. But I get it. As it is now, the chip's 2:1 aspect ratio is very nice too.
With AMD it would be 6 clusters of 1024 (2048, by the way I judge things); yes, 192 ROPs/384-bit with GDDR6, but it will be 96 or 128 ROPs/256-bit with GDDR7.

I agree they should have made it, monolithic or otherwise, giving us a more reasonably priced 4090, but they didn't.

A thing people need to understand is that at its max clock potential, N48 could use 425 W. With 32 GB, ~475 W.
Yes, a chip with 50% more units might need only 25% more power for the chip, plus 25-75 W for the RAM (if 24/48 GB), but that's still an expensive/niche product using ~600 W.
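Plugging in my own figures above to show where ~600 W comes from (speculation, not measurements):

```python
# Where ~600 W comes from, using the figures above (speculative, not data).
n48_max_w = 425                    # assumed max-clock N48 board power
chip_w = n48_max_w * 1.25          # 50% more units at ~25% more chip power
for ram_w, cfg in ((25, "24 GB"), (75, "48 GB")):
    print(f"{cfg}: ~{chip_w + ram_w:.0f} W")
# -> ~556 W to ~606 W, i.e. a roughly 600 W niche product.
```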
 
Joined
Oct 19, 2022
Messages
368 (0.42/day)
Location
Los Angeles, CA
Processor AMD Ryzen 7 9800X3D (+PBO 5.4GHz)
Motherboard MSI MPG X870E Carbon Wifi
Cooling ARCTIC Liquid Freezer II 280 A-RGB
Memory 2x32GB (64GB) G.Skill Trident Z Royal @ 6200MHz 1:1 (30-38-38-30)
Video Card(s) MSI GeForce RTX 4090 SUPRIM Liquid X
Storage Crucial T705 4TB (PCIe 5.0) w/ Heatsink + Samsung 990 PRO 2TB (PCIe 4.0) w/ Heatsink
Display(s) AORUS FO32U2P 4K QD-OLED 240Hz (DP 2.1 UHBR20 80Gbps)
Case CoolerMaster H500M (Mesh)
Audio Device(s) AKG N90Q w/ AudioQuest DragonFly Red (USB DAC)
Power Supply Seasonic Prime TX-1600 Noctua Edition (1600W 80Plus Titanium) ATX 3.1 & PCIe 5.1
Mouse Logitech G PRO X SUPERLIGHT
Keyboard Razer BlackWidow V3 Pro
Software Windows 10 64-bit
... I'll just get a 9070 XT and call it quits for a good 2-3 gens.

For 2-3 gens?!! Given how badly optimized most modern games are, I doubt it will run games at more than 1080p Medium in 4-5 years :twitch:
 
Joined
May 13, 2008
Messages
989 (0.16/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
For 2-3 gens?!! Given how badly optimized most modern games are, I doubt it will run games at more than 1080p Medium in 4-5 years :twitch:

This guy gets it. :laugh:

I think it'll run 1080p fine for a while, until 18GB becomes a standard (just like 16GB is becoming over 12GB now), but I think 1440p is going to require some concessions (especially if you use features like FG, etc).
It's all relative. Some people (like me) will say 1080p, others will say they can make it work fine for 1440p (until more games are built towards 192-bit/18GB configs and/or built toward the PS6).
I'm not going to argue, because as I've said countless times...you can make ANYTHING work and different things are acceptable to different people.
I try to be cautious when explaining things (to the best of my ability/understanding at any given moment), which is why I trend towards worst-case and use minimum 60 fps frame rates at high settings.
 
Joined
Mar 23, 2016
Messages
4,862 (1.49/day)
Processor Core i7-13700
Motherboard MSI Z790 Gaming Plus WiFi
Cooling Cooler Master RGB Tower cooler
Memory Crucial Pro 5600 32GB kit OCed to 6600
Video Card(s) XFX Speedster SWFT309 AMD Radeon RX 6700 XT CORE Gaming
Storage 970 EVO NVMe M.2 500GB,,WD850N 2TB
Display(s) Samsung 28” 4K monitor
Case Phantek Eclipse P400S
Audio Device(s) EVGA NU Audio, Edifier Bookshelf Speakers R1280
Power Supply EVGA 850 BQ
Mouse Logitech G502 Hero
Keyboard Logitech G G413 Silver
Software Windows 11 Professional v24H2
This gen RDNA doesn't need GDDR7 to compete. Straight up.
The large chunk of cache on-die mitigates the need for faster VRAM. My 6700 XT can act like it has ~850 GB/s of memory bandwidth because of 96 MB of on-die cache. Actual memory bandwidth is 384 GB/s.
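One simple way to model that "effective" number (a toy sketch treating cache hits as free; the hit rate is an assumed free parameter, and AMD's own marketing math may differ in detail):

```python
# Toy model: if a fraction `hit` of memory requests are served from the
# on-die cache (treated as free), VRAM only serves the misses, so the card
# behaves as if it had raw_bw / (1 - hit) of bandwidth.
raw_bw = 384.0                      # RX 6700 XT: 192-bit GDDR6 @ 16 Gbps
for hit in (0.40, 0.55, 0.65):      # hit rate is the assumed free parameter
    print(f"hit rate {hit:.0%}: effective ~{raw_bw / (1 - hit):.0f} GB/s")
# A ~55% hit rate (plausible at 1440p) lands on the ~850 GB/s figure.
```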

 
Joined
Jul 13, 2016
Messages
3,534 (1.12/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage P5800X 1.6TB 4x 15.36TB Micron 9300 Pro 4x WD Black 8TB M.2
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) JDS Element IV, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse PMM P-305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
The large chunk of cache on-die mitigates the need for faster VRAM. My 6700 XT can act like it has ~850 GB/s of memory bandwidth because of 96 MB of on-die cache. Actual memory bandwidth is 384 GB/s.


The 9070 XT is getting a cache size downgrade to 64 MB, but it's faster than last gen.

Very close to the cache size of the 5070 Ti, only the 5070 Ti has faster memory. If the 9070 XT ends up being around 5070 Ti-level performance, that's a pretty big win for AMD, as it'd mean they are doing more with less. Using GDDR6 brings the BOM of the Radeon cards down and ensures better availability (assuming GDDR7 could be a production bottleneck).
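For reference, the raw bandwidth gap works out like this (standard bus-width × data-rate arithmetic, using the published memory speeds):

```python
# Raw memory bandwidth = bus width (in bytes) x effective data rate (Gbps).
def bandwidth_gbs(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

print(f"9070 XT: {bandwidth_gbs(256, 20):.0f} GB/s (GDDR6 @ 20 Gbps)")
print(f"5070 Ti: {bandwidth_gbs(256, 28):.0f} GB/s (GDDR7 @ 28 Gbps)")
# -> 640 vs 896 GB/s; the large cache is what has to close that gap.
```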

Hard to say whether AMD is truly happy with GDDR6 or just decided to focus all engineering resources on next gen.
 
Joined
Aug 12, 2019
Messages
2,320 (1.14/day)
Location
LV-426
System Name Custom
Processor i9 9900k
Motherboard Gigabyte Z390 arous master
Cooling corsair h150i
Memory 4x8 3200mhz corsair
Video Card(s) Galax RTX 3090 EX Gamer White OC
Storage 500gb Samsung 970 Evo PLus
Display(s) MSi MAG341CQ
Case Lian Li Pc-011 Dynamic
Audio Device(s) Arctis Pro Wireless
Power Supply 850w Seasonic Focus Platinum
Mouse Logitech G403
Keyboard Logitech G110
Looking forward to the next AMD GPU, which will use UDNA. With the current RDNA 4's much improved ray tracing capability, things will get better.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
43,896 (6.80/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
The 9070 XT is getting a cache size downgrade to 64 MB, but it's faster than last gen.

Very close to the cache size of the 5070 Ti, only the 5070 Ti has faster memory. If the 9070 XT ends up being around 5070 Ti-level performance, that's a pretty big win for AMD, as it'd mean they are doing more with less. Using GDDR6 brings the BOM of the Radeon cards down and ensures better availability (assuming GDDR7 could be a production bottleneck).

Hard to say whether AMD is truly happy with GDDR6 or just decided to focus all engineering resources on next gen.
I'd say the latter, not the former.
 
Joined
May 13, 2008
Messages
989 (0.16/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
The large chunk of cache on-die mitigates the need for faster VRAM.
I think this will return. If you do the math, AMD really needs to match nVIDIA's cache structure next-gen. If this means they double L3 (again), transition to similar L2 like nVIDIA, or something else...I don't know.
As I've said, with RDNA3 this dropped to roughly half of nVIDIA. With RDNA 4, perhaps 2/3.

All that matters is that the compute is fed, and I think it will be. As I said, I think 20 Gbps will be good for ~3150 MHz, and a typical memory overclock to 2700 MHz (21.6 Gbps) for ~3400 MHz. This is likely the limit of 375 W.
So, it makes sense. I like it.
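To make that proportionality explicit (a sketch of my assumption that memory speed should scale with core clock, a rule of thumb rather than a measured law):

```python
# Working assumption: memory bandwidth should scale with core clock.
base_mem_gbps, base_core_mhz = 20.0, 3150    # the pairing claimed above
oc_mem_gbps = 2700 * 8 / 1000                # 2700 MHz GDDR6 -> 21.6 Gbps
core = base_core_mhz * oc_mem_gbps / base_mem_gbps
print(f"{oc_mem_gbps:.1f} Gbps supports ~{core:.0f} MHz core")
# -> ~3400 MHz, matching the estimate above.
```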
The 9070 XT is getting a cache size downgrade to 64 MB, but it's faster than last gen.

Very close to the cache size of the 5070 Ti, only the 5070 Ti has faster memory. If the 9070 XT ends up being around 5070 Ti-level performance, that's a pretty big win for AMD, as it'd mean they are doing more with less. Using GDDR6 brings the BOM of the Radeon cards down and ensures better availability (assuming GDDR7 could be a production bottleneck).

Hard to say whether AMD is truly happy with GDDR6 or just decided to focus all engineering resources on next gen.

You know what I think is funny? Faster L3. You know why the L3 is faster? Because it runs at core speed. You know what's faster in N48? The core. LOL.
Also, you may need to understand the difference between L2/L3. L2 is literally twice as fast.

I do commend them for what they pulled off with GDDR6. I think they made a very economical and common-sense chip, and if it can keep 1080p RT 60 fps minimums, that's cool. The 5070 surely can't.
I also hope whatever they can pull off at the high end makes people at 1440p happy enough for a while...although it might be a little bit of a stretch IMHO for my tastes. It'll probably be 'ok', but a novelty.
At least they don't sell it the way nVIDIA does the 5080 16GB, which is quickly going to become a very humorous joke when people realize how much difference 2 GB can and does make to minimum frame rates at 1440p.
The 5080 is the most expensive 1080p card I have ever seen. Seriously. No joke. Now imagine if a card was 9216 SP @ 3780 MHz with 18 GB of RAM, not 10752 @ 2640 MHz and 16 GB. Exactly. 60 fps. Heh.
 
Joined
Dec 25, 2020
Messages
7,760 (5.07/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) NVIDIA RTX A2000
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Sony MDR-V7 connected through Apple USB-C
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse (2017)
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
At least they don't sell it the way nVIDIA does the 5080 16GB, which is quickly going to become a very humorous joke when people realize how much difference 2 GB can and does make to minimum frame rates at 1440p.
The 5080 is the most expensive 1080p card I have ever seen. Seriously. No joke. Now imagine if a card was 9216 SP @ 3780 MHz with 18 GB of RAM, not 10752 @ 2640 MHz and 16 GB. Exactly. 60 fps. Heh.

Regardless of vendor, 16 GB is a perfectly adequate amount of video memory for any GPU designed with 1440p and even 4K in mind, even when using demanding settings including ray tracing and the often optional ultra-high-quality textures. Neither the 5080 nor the 9070 XT will suffer from frame buffer capacity-related issues before their computing power is exhausted in almost any game today and for some time to come, a few niches notwithstanding. Those who really wanna go there have the 7900 XTX, 4090 and 5090 available to them.
 
Joined
May 13, 2008
Messages
989 (0.16/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
Regardless of vendor, 16 GB is a perfectly adequate amount of video memory for any GPU designed with 1440p and even 4K in mind, even when using demanding settings including ray tracing and the often optional ultra-high-quality textures. Neither the 5080 nor the 9070 XT will suffer from frame buffer capacity-related issues before their computing power is exhausted in almost any game today and for some time to come, a few niches notwithstanding. Those who really wanna go there have the 7900 XTX, 4090 and 5090 available to them.
It really isn't. I know you think it is. It isn't. Look at this. Look at the scaling. I know you will say you don't play benchmarks. But you need to understand that games are built upon standards, like benchmarks.

What is the aim for this segment? 120 fps. This is shown in the games I linked, by being roughly half. What can this not reach even at absurd clocks? 120. Why? RAM. Add 2 GB of RAM and figure it out.

The Founders Edition runs at 2640 MHz constant...so that'll help you figure it out.

Like I say, a 9216 SP part (@ ~3780 MHz) with 18 GB would score 120 in that bench, and by extension keep 60 fps in games.

Want to make a bet on this? I will bet you one heart eyes emoji I am 100% correct.
 
Joined
Feb 24, 2023
Messages
3,637 (4.92/day)
Location
Russian Wild West
System Name D.L.S.S. (Die Lekker Spoed Situasie)
Processor i5-12400F
Motherboard Gigabyte B760M DS3H
Cooling Laminar RM1
Memory 32 GB DDR4-3200
Video Card(s) RX 6700 XT (vandalised)
Storage Yes.
Display(s) MSi G2712
Case Matrexx 55 (slightly vandalised)
Audio Device(s) Yes.
Power Supply Thermaltake 1000 W
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
Benchmark Scores My PC can run Crysis. Do I really need more than that?
I think 20 Gbps will be good for ~3150 MHz, and a typical memory overclock to 2700 MHz (21.6 Gbps) for ~3400 MHz.
You've puzzled me. How does it even work? Game engines differ, and various resolutions need different and not linearly scaling amounts of VRAM bandwidth and on-die compute power, plus God knows what else; all this makes 20 Gbps massive overkill in some scenarios and not even half enough in others. I do agree it's not completely tragic to only have G6 @ 20 Gbps with N48 in mind, but there will definitely be scenarios where it's anemic. The 9070 non-XT feels a lot more balanced (yet a lot worse per $).

I assume using G7 is way too expensive, even if we're talking chips that can't reasonably achieve 28 Gbps, and this is why we don't see a 9070 XT with, say, G7 @ 26 coming.

Unfortunately, there's no 160+ CU beast to witness. I'm sick of the top-tier GPU only ever being green and of AMD never even trying.
 
Joined
Dec 25, 2020
Messages
7,760 (5.07/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) NVIDIA RTX A2000
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Sony MDR-V7 connected through Apple USB-C
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse (2017)
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
It really isn't. I know you think it is. It isn't. Look at this. Look at the scaling. I know you will say you don't play benchmarks. But you need to understand that games are built upon standards, like benchmarks.

What is the aim for this segment? 120 fps. This is shown in the games I linked, by being roughly half. What can this not reach even at absurd clocks? 120. Why? RAM. Add 2 GB of RAM and figure it out.

The reason the RTX 4090 outperforms the 5080 is because its core is (sometimes significantly) more powerful, not because the 5080 is memory capacity starved. To run into the limitations of 16 GB, you currently have to go all-out, with the most extreme scenarios (and it would still fit into memory by a hair) - not to mention W1zz tested this on the 5090, where this would be about 49.5% of its capacity. On a 16 GB card it would preallocate less, and use a bit less as a result.

 
Joined
May 13, 2008
Messages
989 (0.16/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
You've puzzled me. How does it even work? Game engines differ, and various resolutions need different and not linearly scaling amounts of VRAM bandwidth and on-die compute power, plus God knows what else; all this makes 20 Gbps massive overkill in some scenarios and not even half enough in others. I do agree it's not completely tragic to only have G6 @ 20 Gbps with N48 in mind, but there will definitely be scenarios where it's anemic. The 9070 non-XT feels a lot more balanced (yet a lot worse per $).

I assume using G7 is way too expensive, even if we're talking chips that can't reasonably achieve 28 Gbps, and this is why we don't see a 9070 XT with, say, G7 @ 26 coming.

Unfortunately, there's no 160+ CU beast to witness. I'm sick of the top-tier GPU only ever being green and of AMD never even trying.

It's about saturation points, limitations in one respect or another. I don't get what you don't understand.

GDDR7 truly isn't that expensive, but the reality is it's just not needed yet. It will be next gen. Right now it is largely used as a selling point, more than an actual advantage.

As shown with N48, there are ways to optimize designs (towards current gaming trends) without it, as I explained. This comes from matching compute ability, cache, and memory bandwidth to each other.

Buffer size is important to an extent because of current trends relating compute capability to standardized resolutions/settings (especially as new ones emerge, such as upscaling/RT/FG).
It changes and is fluid, but there are general guidelines that correspond to products. I have outlined this before. It is why a 12 GB nVIDIA card does not hit >45 TF, and why the 7800 XT was limited this way, etc.
It is also why a product like the 9070 XT sits just over 45 TF, with the vanilla 9070 likely limited in the same regard. Going above that allows a similar market until the card is generally limited by its buffer, as in the case of the 5080.
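For reference, those TF figures come from the standard peak-FP32 formula (published shader counts and clocks; counting RDNA 4's dual-issue FP32 is what doubles the naive number):

```python
# Peak FP32 = shaders x FLOPs per shader per clock x clock (GHz) / 1000.
# An FMA is 2 FLOPs; RDNA 3/4 can dual-issue FP32, so count 4 per shader,
# while NVIDIA's CUDA-core count is already per-ALU, so count 2.
def tflops(shaders: int, flops_per_clk: int, clock_ghz: float) -> float:
    return shaders * flops_per_clk * clock_ghz / 1000

print(f"9070 XT:  {tflops(4096, 4, 2.97):.1f} TF")   # ~48.7, 'just over 45 TF'
print(f"RTX 5080: {tflops(10752, 2, 2.64):.1f} TF")  # at the ~2640 MHz cited
```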

This is why you could literally see a 3500 MHz N48 competing with the 5080, even though a 5080 is capable of much higher compute throughput.
The 5080 can run at a faster clock, but generally playable settings (i.e. 60 fps minimums) are unattainable due to its inadequate buffer. Why do you think nVIDIA clocked it at 2640 MHz, and not closer to 3154 MHz (like AMD)?

The amusing part is that for the 5080 to make sense, it needs 24 GB of memory (really 18-20, but 24 GB is the only option) at 3.23 GHz, which is the *exact* top of the 5nm dense voltage curve (seen on A15/M1).
If nVIDIA productizes this I will be amazed, because that is what they will sell next-gen and claim it is faster than a 5080...100% without doubt. They *could* do this, but likely won't unless they HAVE TO.
This is how you make safeguards within your product stack/chip designs, but don't push things forward too quickly if you don't have to (and hence save money on each inch of progress).

You will also notice only 3 out of 10 5080 16GB SKUs reviewed on this website are capable of that for any prolonged period of time. This is by design.

The reason the RTX 4090 outperforms the 5080 is because its core is (sometimes significantly) more powerful, not because the 5080 is memory capacity starved. To run into the limitations of 16 GB, you currently have to go all-out, with the most extreme scenarios (and it would still fit into memory by a hair) - not to mention W1zz tested this on the 5090, where this would be about 49.5% of its capacity. On a 16 GB card it would preallocate less, and use a bit less as a result.

We'll see how it bears out. 4090 outperforms it for these reasons, yes, but minimums (and often benchmarks) are largely correlated to limitations.

I'm just sayin' a 9216 SP part, with a 3780 MHz core and 36 Gbps RAM, is a much more rational pairing of resources. Again, we shall see!

As mentioned about a trillion times, I do not agree with how W1zzard perceives memory usage, nor how he tests its limitations. There are many variables in this beyond pure benchmarks, such as swapping.

What happens when a game is targeting 16 GB, as you say, and you turn on FG? nVIDIA doesn't want you to know, and hence hides the native frame rate when you do this. This is what I'm talking about: many variables.
 