
The future of RDNA on Desktop.

Joined
Sep 7, 2017
Messages
16 (0.01/day)
RDNA 4 is the last RDNA architecture. The next mainstream desktop one is UDNA. They decided so because it costs AMD too much to develop two architectures simultaneously, not to mention that sporadic ROCm support on (cheaper) gaming cards drives away sales from people who need compute but don't have the need and/or cash for an enterprise card, and so have to choose Nvidia for CUDA. As for whether it's good for gaming or not, we'll see. Personally, I'm due for an upgrade anyway, so I'll just get a 9070 XT and call it quits for a good 2-3 gens.
I know that
We might see higher VRAM variants, but I don't see AMD spending any time on significant revisions for future RDNA cards past the soon-to-be-released series.

That would just detract resources from UDNA, and that is AMD's number one priority right now. Keep in mind that UDNA will now include all their gaming, compute, and AI improvements from now on. The name might not sound exciting to gamers, but it should be, given that some of the benefits to compute and AI have already been integrated into RDNA4.

It's kind of hard to give an accurate assessment until we know what is really going on with Nvidia. Specifically, I'm thinking about the "missing ROP" saga. We don't have a clue exactly what happened, but Jensen's declaration that it affects only 0.5 percent reeks so badly it makes my eyes water. The loss of an entire Nvidia production run is not out of the realm of possibility. Who is responsible for the mess will play a key role in determining how it's cleaned up. Both Nvidia and AMD use TSMC 4nm nodes. There are three different TSMC 4nm lines: N4, N4P and N4X. Those lines are booked years in advance, and squeezing extra capacity out of them will not be easy. How much extra stock AMD ordered may also play a big part. AMD may have ordered for a two-year run of cards. If the entire Nvidia production run has to be replaced, it could be a very long time before they receive replacements, like 6 to 9 months, and even then new dies would be flowing in at a trickle. AMD could keep the market satiated for a while, but at that pace of fulfillment the extra demand would crush AMD's inventory: a 24-month stock could only last 18 months. What happens then? Who gets priority? They both use TSMC.
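To put rough numbers on that inventory math, here's a toy sketch in Python. It assumes only that the stockpile was sized for 24 months of AMD's own demand and that AMD absorbs some fraction of displaced Nvidia buyers; the one-third figure is made up purely for illustration, not a real market number.

```python
# Toy inventory model: how long a 24-month stockpile lasts if demand rises.
# The extra-demand fraction below is purely illustrative, not a real market number.

months_of_stock_at_normal_demand = 24
extra_demand_fraction = 1 / 3  # assumed share of displaced Nvidia buyers AMD picks up

months_it_actually_lasts = months_of_stock_at_normal_demand / (1 + extra_demand_fraction)
print(f"stock lasts ~{months_it_actually_lasts:.0f} months instead of 24")  # ~18 months
```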
 
Joined
May 13, 2008
Messages
989 (0.16/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
I know that


It's kind of hard to give an accurate assessment until we know what is really going on with Nvidia. Specifically, I'm thinking about the "missing ROP" saga. We don't have a clue exactly what happened, but Jensen's declaration that it affects only 0.5 percent reeks so badly it makes my eyes water. The loss of an entire Nvidia production run is not out of the realm of possibility. Who is responsible for the mess will play a key role in determining how it's cleaned up. Both Nvidia and AMD use TSMC 4nm nodes. There are three different TSMC 4nm lines: N4, N4P and N4X. Those lines are booked years in advance, and squeezing extra capacity out of them will not be easy. How much extra stock AMD ordered may also play a big part. AMD may have ordered for a two-year run of cards. If the entire Nvidia production run has to be replaced, it could be a very long time before they receive replacements, like 6 to 9 months, and even then new dies would be flowing in at a trickle. AMD could keep the market satiated for a while, but at that pace of fulfillment the extra demand would crush AMD's inventory: a 24-month stock could only last 18 months. What happens then? Who gets priority? They both use TSMC.

What I find amazing is: how do you disable 8 ROPs in a 16-ROP cluster? It doesn't make any sense. The more rational explanation? We're being lied to about how nVIDIA GPUs are laid out.

This has happened before: entire clusters were disabled, affecting different units than described depending on how the die was harvested, rather than disabling units within separate clusters (as marketed).
I'll have to go look it up, but it may be hard to find. nVIDIA took a lot of shit for it, so it would not surprise me if they started lying to people about how chips are organized to hide these discrepancies.
 
Joined
Mar 23, 2016
Messages
4,862 (1.49/day)
Processor Core i7-13700
Motherboard MSI Z790 Gaming Plus WiFi
Cooling Cooler Master RGB Tower cooler
Memory Crucial Pro 5600 32GB kit OCed to 6600
Video Card(s) XFX Speedster SWFT309 AMD Radeon RX 6700 XT CORE Gaming
Storage 970 EVO NVMe M.2 500GB, WD850N 2TB
Display(s) Samsung 28” 4K monitor
Case Phantek Eclipse P400S
Audio Device(s) EVGA NU Audio, Edifier Bookshelf Speakers R1280
Power Supply EVGA 850 BQ
Mouse Logitech G502 Hero
Keyboard Logitech G G413 Silver
Software Windows 11 Professional v24H2
What I find amazing is: how do you disable 8 ROPs in a 16-ROP cluster? It doesn't make any sense. The more rational explanation? We're being lied to about how nVIDIA GPUs are laid out.

This has happened before: entire clusters were disabled, affecting different units than described depending on how the die was harvested, rather than units within separate clusters but not whole clusters.
I'll have to go look it up, but it may be hard to find. nVIDIA took a lot of shit for it, so it would not surprise me if they started lying to people about how chips are organized to hide these discrepancies.
Didn’t Gamers Nexus speculate it’s the fused off shaders in a GPC?
 
Joined
May 13, 2008
Messages
989 (0.16/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
Didn’t Gamers Nexus speculate it’s the fused off shaders in a GPC?
I didn't see that (haven't made the video rounds lately), but sounds plausible.

The point is, you shouldn't be able to separate them that way according to how we're led to believe each GPU is laid out. There would have to be a different number of clusters, or each cluster would have to be halved.
 
Joined
Mar 23, 2016
Messages
4,862 (1.49/day)
Processor Core i7-13700
Motherboard MSI Z790 Gaming Plus WiFi
Cooling Cooler Master RGB Tower cooler
Memory Crucial Pro 5600 32GB kit OCed to 6600
Video Card(s) XFX Speedster SWFT309 AMD Radeon RX 6700 XT CORE Gaming
Storage 970 EVO NVMe M.2 500GB, WD850N 2TB
Display(s) Samsung 28” 4K monitor
Case Phantek Eclipse P400S
Audio Device(s) EVGA NU Audio, Edifier Bookshelf Speakers R1280
Power Supply EVGA 850 BQ
Mouse Logitech G502 Hero
Keyboard Logitech G G413 Silver
Software Windows 11 Professional v24H2
I didn't see that (haven't made the video rounds lately), but sounds plausible.

The point is, you shouldn't be able to separate them that way according to how we're led to believe each GPU is laid out. There would have to be a different number of clusters, or each cluster would have to be halved.
After the GTX 970's 3.5 GB of fast RAM issue, they decoupled the ROPs and L2 cache from the memory controllers.
 
Joined
May 13, 2008
Messages
989 (0.16/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
Thanks, I'll have to watch it when I have a moment to absorb it all.

People who haven't been around as long as some of us don't remember some of the weird crap nVIDIA has been caught doing.

Randomly cut L2, some but not all RAM doubled over a bus, the whole 970 situation, disabling whole clusters instead of separate pieces of them (as sold), which *can* impact performance... etc., etc.

I'm fairly certain the ways nVIDIA cuts costs hasn't changed...I'm fairly certain they just got better at hiding it. *This* is what I keep trying to explain. They obfuscate, and when they can't, they lie.

When they get caught in their lies they say "it isn't a big deal" (Huang is very good at this), when it IS a big deal! Their ability to gaslight is incredible. I'm sure Steve has touched upon that at some point.

I'm not on some tirade or trying to fanboy...It just boggles my mind how much they have gotten away with, and now they get away with even more (because many people don't understand how GPUs work).

I'm not perfect in explaining it all, and I don't get *everything* right all the time, but I certainly can see a lot of things they have done and are doing that most don't appear to notice/understand.

Thank goodness for GPU-Z and how it works, or this would've been able to slip by as well.
How many never use GPU-Z, though?
And how many don't understand how they segment/limit/obsolete products in ways that are ridiculously unfair to consumers?
I try to explain these things, but I don't know how to do it while coming across as impartial and getting people to understand. This is why I get frustrated. I'm not cheerleading, but what they do HAS to stop.
And it won't unless people understand all this stuff. How to get it across, I really honestly don't know!

edit: I was writing that as you were (some people don't remember that stuff, and that isn't even all of it). This is true, but it's still connected to the shader cluster AFAIK? Perhaps I am mistaken.
 
Last edited:
Joined
Mar 23, 2016
Messages
4,862 (1.49/day)
Processor Core i7-13700
Motherboard MSI Z790 Gaming Plus WiFi
Cooling Cooler Master RGB Tower cooler
Memory Crucial Pro 5600 32GB kit OCed to 6600
Video Card(s) XFX Speedster SWFT309 AMD Radeon RX 6700 XT CORE Gaming
Storage 970 EVO NVMe M.2 500GB, WD850N 2TB
Display(s) Samsung 28” 4K monitor
Case Phantek Eclipse P400S
Audio Device(s) EVGA NU Audio, Edifier Bookshelf Speakers R1280
Power Supply EVGA 850 BQ
Mouse Logitech G502 Hero
Keyboard Logitech G G413 Silver
Software Windows 11 Professional v24H2
Thanks, I'll have to watch it when I have a moment to absorb it all.

People who haven't been around as long as some of us don't remember some of the weird crap nVIDIA has been caught doing.

Randomly cut L2, some but not all RAM doubled over a bus, the whole 970 situation, disabling whole clusters instead of separate pieces of them (as sold), which *can* impact performance... etc., etc.

I'm fairly certain the ways nVIDIA cuts costs hasn't changed...I'm fairly certain they just got better at hiding it. *This* is what I keep trying to explain. They obfuscate, and when they can't, they lie.

When they get caught in their lies they say "it isn't a big deal" (Huang is very good at this), when it IS a big deal! Their ability to gaslight is incredible. I'm sure Steve has touched upon that at some point.

edit: I was writing that as you were (some people don't remember that stuff, and that isn't even all of it). This is true, but it's still connected to the shader cluster AFAIK? Perhaps I am mistaken.
It goes farther back, to the GTX 660 with 2 GB of RAM: the last 512 MB piggybacked on two memory controllers that already had ICs occupying them, so that 512 MB was slow to access since it wasn't spread out over the whole memory bus. I had a GTX 660 and it was fine until that last 512 MB was used; then the frame rate would stutter.
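To illustrate why that hurts, here's a rough back-of-the-envelope sketch. The 192-bit bus and 6 Gbps figures are assumptions for illustration, and the slow chunk is modeled as sitting behind a single 64-bit controller; the point is the ratio, not the exact numbers.

```python
# Back-of-the-envelope look at a segmented VRAM layout like the GTX 660's.
# Bus width and data rate are illustrative assumptions, not verified specs.

def bandwidth_gbs(bus_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s for a given bus width and per-pin data rate."""
    return bus_bits * data_rate_gbps / 8

fast_segment = bandwidth_gbs(192, 6.0)  # first 1.5 GB, interleaved across the full bus
slow_segment = bandwidth_gbs(64, 6.0)   # hypothetical last 512 MB behind one controller

print(f"fast: ~{fast_segment:.0f} GB/s, slow: ~{slow_segment:.0f} GB/s "
      f"({slow_segment / fast_segment:.0%} of peak)")
```

Once a game spills into that last chunk, every frame that touches it is limited to roughly a third of the card's peak bandwidth, which is exactly what stutter looks like.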
 
Joined
May 13, 2008
Messages
989 (0.16/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
It goes farther back, to the GTX 660 with 2 GB of RAM: the last 512 MB piggybacked on two memory controllers that already had ICs occupying them, so that 512 MB was slow to access since it wasn't spread out over the whole memory bus. I had a GTX 660 and it was fine until that last 512 MB was used; then the frame rate would stutter.
YEEEEPPP. Nice pull!

I had a 970 (before it self-combusted) and it did the same thing (stutter when accessing last GB).

I think people like us understand RAM (buffer/bandwidth) limitations better than most, because we know what it looks like.
 
Last edited:
Joined
Dec 31, 2020
Messages
1,212 (0.80/day)
What I find amazing is: how do you disable 8 ROPs in a 16-ROP cluster? It doesn't make any sense. The more rational explanation?

The ROP block is actually divided into 2x8.
1741070149638.png

RDNA4: 4x8
1741070969662.png
 
Joined
May 13, 2008
Messages
989 (0.16/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
Joined
May 13, 2008
Messages
989 (0.16/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX

Right. But do you think of it that way when you look at it? I guess now people do.

Sorry, I think of the RASTER ENGINE as the ROPs, although yes, I understand the colors coordinate. I don't see AMD's RBs separated into some weird, sometimes-fractured, sometimes-not diagram.

AMD is 4*32 btw. 128 ROPs on N48. (edit, so you're saying 4x8 per chunk? I didn't know that).

So, point being, yes...What Steve is saying and what I am saying are the same thing, and yes...nVIDIA has done this before (while selling it/shipping to reviewers without this being the case).

I would imagine one shader cluster needs to stay active per group of 8 ROPs. By disabling whole chunks, they lost the ROPs (probably).


Wouldn't you call that 24x1024 instead of 12x2048? I would. Still, so it would be 22 disabled... how do you get the equivalent of 24 stacks (3 chunks) of ROPs disabled? It doesn't make sense.

Are they further divided? Is it really 512sp per 4 ROPs? I apologize, I knew it was *something* like that...I had not looked at the diagram in a bit; just remembered something didn't line up.

I guess the simplest answer is literally just severing something that shouldn't have been, but it's still weird. Kinda curious how it affects things beyond pixel throughput.

I know some have tested it. I haven't caught up on it (been a little out of the loop this past week). That would probably answer those questions rather than just speculating, I suppose.
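For what it's worth, here's a toy fill-rate comparison just to put a number on the theoretical side; the ROP count and clock below are placeholders, not confirmed specs for any particular card.

```python
# Toy fill-rate comparison: the theoretical cost of losing one 8-ROP partition.
# ROP count and boost clock are illustrative placeholders, not confirmed specs.

def fill_rate_gpix(rops: int, clock_ghz: float) -> float:
    """Theoretical pixel fill rate in Gpixels/s (ROPs x clock)."""
    return rops * clock_ghz

nominal = fill_rate_gpix(176, 2.4)       # advertised configuration (placeholder numbers)
affected = fill_rate_gpix(176 - 8, 2.4)  # same card with one 8-ROP partition fused off

print(f"nominal: {nominal:.0f} Gpix/s, affected: {affected:.0f} Gpix/s, "
      f"loss: {1 - affected / nominal:.1%}")  # ~4.5% theoretical
```

A ~4-5% theoretical fill-rate drop would only show up in games where pixel throughput is actually the bottleneck, which is presumably how a much smaller average performance figure gets quoted.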

1741073057303.jpeg
 
Last edited:
Joined
Jan 14, 2019
Messages
14,754 (6.58/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case It's not about size, but how you use it
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanic
VR HMD Not yet
Software Linux gaming master race
For 2-3 gens?!! Given how badly optimized most modern games are, I doubt it will run games at more than 1080p Medium in 4-5 years :twitch:
Which modern games?

I'm currently playing Kingdom Come Deliverance 2 at 1440 UW high with no FSR on a 6750 XT. A 9070 XT will allow me to max it out.

Space Marine 2 is also fine with reduced settings (on my 6750 XT), only that it doesn't look as good.

The last big title I played before that was Alan Wake 2, which ran acceptably as long as I stayed away from RT.

Hellblade 2 is doable with FSR - only that I don't like upscaling.

Avatar FoP runs like shit, I give you that.

Anyway, my backlog is huge, I could complete half of them on my 6750 XT no problem. :)

I'd say the latter and not the former
Also price. GDDR7 is expensive, the benefits wouldn't necessarily have made it worth it.
 
Joined
Dec 25, 2020
Messages
7,760 (5.07/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) NVIDIA RTX A2000
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Sony MDR-V7 connected through Apple USB-C
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse (2017)
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
Joined
May 13, 2008
Messages
989 (0.16/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
Which modern games?

I'm currently playing Kingdom Come Deliverance 2 at 1440 UW high with no FSR on a 6750 XT. A 9070 XT will allow me to max it out.

Space Marine 2 is also fine with reduced settings (on my 6750 XT), only that it doesn't look as good.

The last big title I played before that was Alan Wake 2, which ran acceptably as long as I stayed away from RT.

Hellblade 2 is doable with FSR - only that I don't like upscaling.

Avatar FoP runs like shit, I give you that.

Anyway, my backlog is huge, I could complete half of them on my 6750 XT no problem. :)


Also price. GDDR7 is expensive, the benefits wouldn't necessarily have made it worth it.

Again, it's all relative, and I truly do try to respect that. :)

Well, now we can add it to being a future without older PCs involved :)

Huh. I can't remember the last time I had a board without UEFI. That would have been... hmm... before Sandy Bridge? I don't know the exact date. That's getting pretty old.

I was more pissed about certain machines not being able to run Windows 11. Until I ran Windows 11. Then I was like, "okay...that's fine." :p
 
Joined
Nov 27, 2023
Messages
2,968 (6.41/day)
System Name The Workhorse
Processor AMD Ryzen R9 5900X
Motherboard Gigabyte Aorus B550 Pro
Cooling CPU - Noctua NH-D15S Case - 3 Noctua NF-A14 PWM at the bottom, 2 Fractal Design 180mm at the front
Memory GSkill Trident Z 3200CL14
Video Card(s) NVidia GTX 1070 MSI QuickSilver
Storage Adata SX8200Pro
Display(s) LG 32GK850G
Case Fractal Design Torrent (Solid)
Audio Device(s) FiiO E-10K DAC/Amp, Samson Meteorite USB Microphone
Power Supply Corsair RMx850 (2018)
Mouse Razer Viper (Original) on a X-Raypad Equate Plus V2
Keyboard Cooler Master QuickFire Rapid TKL keyboard (Cherry MX Black)
Software Windows 11 Pro (24H2)
@Dr. Dro
It’s not like running modern GPUs under CSM even makes much sense - no ReBAR, and technically regular consumer Windows “requires” UEFI functions anyway. A non-issue. Anything pre-UEFI makes no real sense to pair with a 9070 in any case.
 
Joined
Jan 14, 2019
Messages
14,754 (6.58/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case It's not about size, but how you use it
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanic
VR HMD Not yet
Software Linux gaming master race
Again, it's all relative, and I truly do try to respect that. :)
Exactly my point. :)

I'm more into visuals and atmosphere than high FPS, that's why I'm gonna be fine with the 9070 XT. Someone else might not be.

Besides, generational upgrades aren't so great these days.
 
Joined
Dec 31, 2020
Messages
1,212 (0.80/day)
Right. But do you think of it that way when you look at it? I guess now people do.

Sorry, I think of the RASTER ENGINE as the ROPs, although yes, I understand the colors coordinate. I don't see AMD's RBs separated into some weird, sometimes-fractured, sometimes-not diagram.

The layout of the RB+ is shown very accurately in the diagram; Nvidia Blackwell's probably isn't shown accurately at all.
Now imagine that instead of just 4 shader engines there were 2 more. There's probably not enough room around the edges for 384 bits, but 96 RDNA 4 CUs would be possible.

1741076956802.png
 
Joined
May 13, 2008
Messages
989 (0.16/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
Exactly my point. :)

I'm more into visuals and atmosphere than high FPS, that's why I'm gonna be fine with the 9070 XT. Someone else might not be.

Besides, generational upgrades aren't so great these days.

Yeah, and I'm all about not getting drops to <60fps while keeping everything pretty. Everyone's different, and that's cool. People should buy what they can afford and be happy. That's the truth.

It's just one of those things where you don't want people to go into some of these purchases thinking they'll be able to run 1440p RT maxed out with decent frames forever, because they won't.
Especially when it's pretty clear both the 9070 XT and 5080 (with more RAM) will each be replaced with cheaper parts next generation. That's all I'm trying to get across.

I think the 9070 XT will end up being the 1080p card once RT settles, and this makes sense. When that will be, I don't exactly know. Probably before most people upgrade again, I would think. That's why I say that.
The 9216sp GeForce will probably replace the 5080, be cheaper, faster, have more RAM, and be the go-to for 1440p (or PS6 parity); that's why I say that.
The '7900 XTX' version of something like this, which doesn't exist yet but will, or something similar to an AD103/B203 but with an extra shader cluster, will replace the 4090 for 1440p->4K upscaling; that's why I say that.
Also, the 5090 is ridiculous because it can't keep 4K RT above 60 fps, but the next gen probably will, so that's why I say that.

It's not that these things aren't good (especially for their relative price) right now, or that you can't be happy with them, just put it in perspective because of the raster->RT (and console) transition.
The layout of the RB+ is shown very accurately in the diagram; Nvidia Blackwell's probably isn't shown accurately at all.
Now imagine that instead of just 4 shader engines there were 2 more. There's probably not enough room around the edges for 384 bits, but 96 RDNA 4 CUs would be possible.

I agree. I don't understand why you don't think it's suitable for 384-bit? Too much heat density? That's probably true. I think that's why they were opting for chiplets.
 
Joined
Dec 25, 2020
Messages
7,760 (5.07/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) NVIDIA RTX A2000
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Sony MDR-V7 connected through Apple USB-C
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse (2017)
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
@Dr. Dro
It’s not like running modern GPUs under CSM even makes much sense - no ReBAR, and technically regular consumer Windows “requires” UEFI functions anyway. A non-issue. Anything pre-UEFI makes no real sense to pair with a 9070 in any case.

Agreed, just sharing this bit of knowledge I came across :)
 
Joined
Feb 24, 2023
Messages
3,637 (4.92/day)
Location
Russian Wild West
System Name D.L.S.S. (Die Lekker Spoed Situasie)
Processor i5-12400F
Motherboard Gigabyte B760M DS3H
Cooling Laminar RM1
Memory 32 GB DDR4-3200
Video Card(s) RX 6700 XT (vandalised)
Storage Yes.
Display(s) MSi G2712
Case Matrexx 55 (slightly vandalised)
Audio Device(s) Yes.
Power Supply Thermaltake 1000 W
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
Benchmark Scores My PC can run Crysis. Do I really need more than that?
Joined
May 13, 2008
Messages
989 (0.16/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
Agreed, just sharing this bit of knowledge I came across :)
Time to retire that old Lynnfield. Wait, I never had a Lynnfield. Time to retire that old Athlon/Opteron X2. I have a couple of those stashed away, one delidded and cracked all to hell by a younger-me waterblock mount. :p

In all honesty, I wonder why this is. There must be some reason that I'm not grasping at this moment. Interesting to see the huge rambling overclocking disclaimer.

Easily overcome by lowering the clocks, innit?
Isn't that the whole point of this architecture? Well, RT (TMUs)/8-bit (AI), I know...but in terms of upgrading the actual raster capability.

I doubt they would just want to compete with the 5080 with something like that... in fact, it would appear they already are without it.

No...I think it was a 4090 competitor. I just think it would've been too expensive and used more power than 4090, hence been a harder sell.
I also think they may have feared a similar part coming from nVIDIA...which they could have made...and still probably will! This whole generation feels like a half-step to the 3nm parts people really want.
AMD didn't put out the higher-end chip and chose to wait, while nVIDIA held back (on the parts most people actually buy).
 
Last edited:
Joined
Dec 31, 2020
Messages
1,212 (0.80/day)
I agree. I don't understand why you don't think it's suitable for 384-bit? Too much heat density? That's probably true. I think that's why they were opting for chiplets.
There isn't enough room along the die edge for the memory channels of a bigger Navi. Barely 32 bits more would fit.

1741079157011.png
 
Joined
Feb 24, 2023
Messages
3,637 (4.92/day)
Location
Russian Wild West
System Name D.L.S.S. (Die Lekker Spoed Situasie)
Processor i5-12400F
Motherboard Gigabyte B760M DS3H
Cooling Laminar RM1
Memory 32 GB DDR4-3200
Video Card(s) RX 6700 XT (vandalised)
Storage Yes.
Display(s) MSi G2712
Case Matrexx 55 (slightly vandalised)
Audio Device(s) Yes.
Power Supply Thermaltake 1000 W
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
Benchmark Scores My PC can run Crysis. Do I really need more than that?
Isn't that the whole point of this architecture?
I never heard of any point to this arch, tbf... Even if the most optimistic leaks come true, we still have a major mindshare problem that AMD failed to address in time. RDNA4 will be "Hm, at least it's not as bad as Blackwell" at the very most. The opportunity window (late '24) is now ancient history; now people are actively purchasing Ada GPUs because there's still NO RDNA4 GPU TO BUY WHATSOEVER and RDNA3 is straight-up obsolete. 6 Franklins is not exactly the worst idea, but it's almost gotten there. Cheap enough to sell, not cheap enough to impact the market. At 650 USD, it would've sold maybe 1 percent worse, but margins would've been better. At 550 USD, it'd be aggressive enough.

Anyway, in both RDNA2 and RDNA3 we had top-dog SKUs clocking lower than middle-ground ones (6900 XT at 2.35 GHz vs 6700 XT at 2.5 GHz; 7900 XTX at 2.65 GHz vs 7700 XT at 2.8 GHz). Why not make a 96 CU RDNA4 GPU running at, say, 2.75 GHz instead of 2.97? That would still be leagues faster than the 9070 XT, and heat would stay reasonable.
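Quick sanity check on that, as a toy calculation only: it assumes throughput scales with CU count times clock (it never does perfectly, since bandwidth and power get in the way) and takes 64 CUs at ~2.97 GHz as the assumed 9070 XT baseline.

```python
# Toy scaling estimate: hypothetical 96 CU RDNA4 part at lower clocks vs. the 9070 XT.
# Assumes throughput ~ CUs x clock, ignoring bandwidth/power limits, so it's an
# optimistic upper bound, not a prediction.

def relative_throughput(cus: int, clock_ghz: float) -> float:
    return cus * clock_ghz

baseline = relative_throughput(64, 2.97)  # 9070 XT-class figures (assumed)
big_navi = relative_throughput(96, 2.75)  # hypothetical 96 CU part at 2.75 GHz

print(f"hypothetical big die: ~{big_navi / baseline:.2f}x the baseline, in theory")  # ~1.39x
```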
 
Joined
May 13, 2008
Messages
989 (0.16/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
There isn't enough room along the die edge for the memory channels of a bigger Navi. Barely 32 bits more would fit.
Right, but chiplets. Probably connected to a host chip (that has a 128-bit controller) on an interposer. I don't think 384-bit monolithic was ever an option.

I never heard of any point to this arch, tbf... Even if the most optimistic leaks come true, we still have a major mindshare problem that AMD failed to address in time. RDNA4 will be "Hm, at least it's not as bad as Blackwell" at the very most. The opportunity window (late '24) is now ancient history; now people are actively purchasing Ada GPUs because there's still NO RDNA4 GPU TO BUY WHATSOEVER and RDNA3 is straight-up obsolete. 6 Franklins is not exactly the worst idea, but it's almost gotten there. Cheap enough to sell, not cheap enough to impact the market. At 650 USD, it would've sold maybe 1 percent worse, but margins would've been better. At 550 USD, it'd be aggressive enough.

Anyway, in both RDNA2 and RDNA3 we had top-dog SKUs clocking lower than middle-ground ones (6900 XT at 2.35 GHz vs 6700 XT at 2.5 GHz; 7900 XTX at 2.65 GHz vs 7700 XT at 2.8 GHz). Why not make a 96 CU RDNA4 GPU running at, say, 2.75 GHz instead of 2.97? That would still be leagues faster than the 9070 XT, and heat would stay reasonable.
It's only a couple days. :p

I don't want to argue the price again. I agree it should be $550, but again I also think it will be proportionally faster than the 128-bit Rubin nVIDIA will eventually release at that price, so it makes sense to me.

You mean, why didn't they make a 7900 XTX non-fubar edition? Again, I think it just didn't make sense because there was no market. Then it would be competing with a 5080 again, like this one is, but more expensive.

I really honestly think they don't want to make the cards people want, because then nobody will upgrade.
9070 xt is one of those cards people want, but it's the low-end. 4090 is as well, but it's cost-prohibitive on purpose.
 
Last edited: