
The future of RDNA on Desktop.

Joined
Sep 17, 2014
Messages
23,435 (6.13/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
For 2-3 gens?!! Given how badly optimized most modern games are, I doubt it will run games at more than 1080p Medium in 4-5 years :twitch:
'Given how badly optimized'

Guess what, you have the option to play one of the fourteen thousand other games that have been released and are getting released almost daily.
Part of this comes down to a crucial question: how much hardware are you willing to fork out (and money) to counteract the shitty work of developers and publishers? How far do you go in funding a sick market that relies on the customer to fix 'a game' with either upscaling or a shitload of extra shaders?

Food for thought. It's not like you can't play games if you skip past the bullshit. I mean seriously... paying north of 1k to look at PT in Cyberpunk... what the fuck yo. Get your head examined. It's a complete nothingburger and the exact same game. 'But muh shadows respond naturally'... guess what, there have been hundreds if not thousands of rasterized games given sufficient TLC to have perfectly convincing lighting and shadows. It's from a magical age where developers did that work for you: you paid for a game and were done with it, capable of just enjoying it without stacking sixteen post effects over your image to get it playable. And it gets even more mystical... those games didn't cost 79,99, but 59,99.

All this new age of gaming gets you is fast food gaming at a 3-star restaurant price. I guess a lot of people didn't get that memo yet, but it's clear as day: that is literally what optimized workflows mean. Optimizing is in fact a cool word for 'not doing' things while you're left with the impression it's better. You're just not doing things you used to do to achieve a goal. The vast majority of poster-boy games are barely even proper, functional games these days. But oh, oh, those reflections. In the meantime you can't post five screenshots of Cyberpunk without some actor clipping into another one or failing to sit in a chair properly. Optimized.

And then you turn off RT and notice that, even without an upscale, a 6800XT from two gens ago just murders anything you can throw at it. Wake up. We're being taken for a ride.
 

Frick

Fishfaced Nincompoop
Joined
Feb 27, 2006
Messages
19,901 (2.87/day)
Location
north
System Name Black MC in Tokyo
Processor Ryzen 5 7600
Motherboard MSI X670E Gaming Plus Wifi
Cooling Be Quiet! Pure Rock 2
Memory 2 x 16GB Corsair Vengeance @ 6000Mhz
Video Card(s) XFX 6950XT Speedster MERC 319
Storage Kingston KC3000 1TB | WD Black SN750 2TB |WD Blue 1TB x 2 | Toshiba P300 2TB | Seagate Expansion 8TB
Display(s) Samsung U32J590U 4K + BenQ GL2450HT 1080p
Case Fractal Design Define R4
Audio Device(s) Plantronics 5220, Nektar SE61 keyboard
Power Supply Corsair RM850x v3
Mouse Logitech G602
Keyboard Dell SK3205
Software Windows 10 Pro
Benchmark Scores Rimworld 4K ready!
For 2-3 gens?!! Given how badly optimized most modern games are, I doubt it will run games at more than 1080p Medium in 4-5 years :twitch:

That will absolutely kill PC gaming.
 
Joined
Feb 24, 2023
Messages
3,637 (4.92/day)
Location
Russian Wild West
System Name D.L.S.S. (Die Lekker Spoed Situasie)
Processor i5-12400F
Motherboard Gigabyte B760M DS3H
Cooling Laminar RM1
Memory 32 GB DDR4-3200
Video Card(s) RX 6700 XT (vandalised)
Storage Yes.
Display(s) MSi G2712
Case Matrexx 55 (slightly vandalised)
Audio Device(s) Yes.
Power Supply Thermaltake 1000 W
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
Benchmark Scores My PC can run Crysis. Do I really need more than that?
It's only a couple days
It should have been October '24. Not March '25. Almost everyone who cares and has (had, rather) the quid has already bought something else.
 
Joined
May 13, 2008
Messages
989 (0.16/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
It should have been October '24. Not March '25. Almost everyone who cares and has (had, rather) the quid has already bought something else.

I don't agree that people bought something else. I don't think there's really another option for something like this. Maybe a 7900xtx, but there you're paying for the extra raster/RAM to get similar RT, and without FSR4.
I think there's a reason they don't want to port it: upscaled 1080p on a 7900xtx with FSR4 might not perform and/or look half-bad.

A 5070ti exists, if you could find one at a reasonable price, but that too is still more expensive. This is about bringing this level of RT (and better upscaling) to a more affordable range.

I also don't know a ton of people who bought overpriced, impossible-to-find Blackwells, but then again maybe I just know the wrong people.

I do think this should have launched sooner, I'm not arguing that, but hopefully the extra time shows in their drivers/features/performance.

'Given how badly optimized'

Guess what, you have the option to play one of the fourteen thousand other games that have been released and are getting released almost daily.
Part of this comes down to a crucial question: how much hardware are you willing to fork out (and money) to counteract the shitty work of developers and publishers? How far do you go in funding a sick market that relies on the customer to fix 'a game' with either upscaling or a shitload of extra shaders?

Food for thought. It's not like you can't play games if you skip past the bullshit. I mean seriously... paying north of 1k to look at PT in Cyberpunk... what the fuck yo. Get your head examined. It's a complete nothingburger and the exact same game. 'But muh shadows respond naturally'... guess what, there have been hundreds if not thousands of rasterized games given sufficient TLC to have perfectly convincing lighting and shadows. It's from a magical age where developers did that work for you: you paid for a game and were done with it, capable of just enjoying it without stacking sixteen post effects over your image to get it playable. And it gets even more mystical... those games didn't cost 79,99, but 59,99.

All this new age of gaming gets you is fast food gaming at a 3-star restaurant price. I guess a lot of people didn't get that memo yet, but it's clear as day: that is literally what optimized workflows mean. The vast majority of poster-boy games are barely even proper, functional games these days. But oh, oh, those reflections. In the meantime you can't post five screenshots of Cyberpunk without some actor clipping into another one or failing to sit in a chair properly.

And then you turn off RT and notice that, even without an upscale, a 6800XT from two gens ago just murders anything you can throw at it. Wake up. We're being taken for a ride.
C'mon, man.

W1zzard, your alt is showing, brah. :p

(:love:)

(I'm kidding, I get what you're saying and respect it. You're... not wrong that it's sometimes true. That's why I recommend 6800xts to a lot of thrifty people.)
 
Joined
Feb 24, 2023
Messages
3,637 (4.92/day)
I also don't know a ton of people that bought overpriced and impossible to find Blackwells
No one knows. I specifically mentioned Ada GPUs which are still in abundance, especially on the 2nd hand market. Blackwell is a paper launch, not worth discussing.
hopefully the extra time shows in their drivers/features/performance.
I bought my 6700 XT two years ago. Driver quality never changed. ZERO new features I would recommend enabling. FSR 3.x is unusable at 1440p and below. AFMF is a no-go because the frames are way too fake. HYPR-RX is just an "I don't know how to select lower settings" button. Performance in games never changed either: drivers from '22 show the exact same FPS numbers, with maybe three exceptions that don't work correctly on older drivers.

I have absolutely no idea why one would think AMD are cooking something tasty. They could, but one promille (0.1%) is the most I can give them.

In the meantime you can't post five screenshots of Cyberpunk without some actor clipping into another one or failing to sit in a chair properly.
CP77 is truly one of the games of all time. I've yet to name a game in a worse technical state after being actively patched for more than four years. Even GTA IV wasn't that woeful.
 
Joined
May 13, 2008
Messages
989 (0.16/day)
I bought my 6700 XT two years ago. Driver quality never changed. ZERO new features I would recommend enabling. FSR 3.x is unusable at 1440p and below. AFMF is a no-go because the frames are way too fake. HYPR-RX is just an "I don't know how to select lower settings" button. Performance in games never changed either: drivers from '22 show the exact same FPS numbers, with maybe three exceptions that don't work correctly on older drivers.

I have absolutely no idea why one would think AMD are cooking something tasty. They could, but one promille (0.1%) is the most I can give them.

I don't like FSR at 1440p (960p upscaled) or below either, so I feel that. I was never a 6700xt guy, but I get where you're coming from. I really think the 6800xt was the first worthwhile upgrade in a long time.
Below that there are a million different options; ofc I always recommended the 2080Ti. I know people still think that's weird. It really isn't! They were cheap, and still decent! Also, DLSS is not FSR1/2/3.
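The "1440p is really 960p upscaled" point comes straight from the per-axis scale factors upscalers use. A minimal sketch, assuming the factors AMD documents for FSR 2 (Quality 1.5x, Balanced 1.7x, Performance 2.0x); other upscalers and versions use different numbers:

```python
# Internal (pre-upscale) render resolution per quality mode.
# Scale factors are FSR 2's documented per-axis ratios; other
# upscalers/versions differ (treat these as assumptions).
SCALE = {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}

def render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Return the internal resolution the GPU actually renders at."""
    factor = SCALE[mode]
    return round(out_w / factor), round(out_h / factor)

# 2560x1440 output in Quality mode renders at 1707x960 internally,
# i.e. "FSR at 1440p" is a 960p-tall image scaled back up.
print(render_resolution(2560, 1440, "Quality"))
```

Same arithmetic explains why 4K Performance mode is "really" 1080p: 3840/2.0 by 2160/2.0.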
 
Joined
Feb 24, 2023
Messages
3,637 (4.92/day)
Location
Russian Wild West
System Name D.L.S.S. (Die Lekker Spoed Situasie)
Processor i5-12400F
Motherboard Gigabyte B760M DS3H
Cooling Laminar RM1
Memory 32 GB DDR4-3200
Video Card(s) RX 6700 XT (vandalised)
Storage Yes.
Display(s) MSi G2712
Case Matrexx 55 (slightly vandalised)
Audio Device(s) Yes.
Power Supply Thermaltake 1000 W
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
Benchmark Scores My PC can run Crysis. Do I really need more than that?
If properly overclocked, it's closer in performance to a 3080 than to a 3070. So it's truly not a weird thing to buy.
 
Joined
May 13, 2008
Messages
989 (0.16/day)
If properly overclocked, it's closer in performance to a 3080 than to a 3070. So it's truly not a weird thing to buy.
Exactly. It's better (value) than anything up to that thing, and the cheaper version of that thing is called a 6800xt. That's what I'm saying. :)

See, I think some people think I'm biased. I'm really not. I think the 2080 Ti has been an excellent card ever since its price tanked with the release of the 3080 (and 6800xt).

The 6800xt has now (generally, not always) just gotten so cheap it's hard not to recommend. If nVIDIA ever has a card like that again, or some weird price fluke happens, I will rec it.

I just don't see it happening. The next card above the 6800xt (right now) is this one. If it were cheaper, that would be wonderful. At some point it will be, because next-gen will happen.

If 4090s suddenly become massive deals at that point, I will suggest them. They might, but they might not. I don't know how that will go.
 
Joined
Jan 14, 2019
Messages
14,754 (6.58/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case It's not about size, but how you use it
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanic
VR HMD Not yet
Software Linux gaming master race
Yeah, and I'm all about not getting drops to <60fps while keeping everything pretty. Everyone's different, and that's cool. People should buy what they can afford and be happy. That's the truth.

It's just one of those things where you don't want people going into some of these purchases thinking they'll be able to run 1440p RT maxed out with decent frames forever, because they won't.
Especially when it's pretty clear that both the 9070 xt and the 5080 (with more RAM) will each be replaced by cheaper parts next generation. That's all I'm trying to get across.
That's why I said that I'm gonna put a hold on my upgrades, because I'm gonna be fine with what the 9070 XT has. It was a statement, not a recommendation to others.

I could easily sit out RDNA 4 with my 6750 XT - I just don't want to.

I think the 9070 xt will end up being the 1080p card once RT settles, and this makes sense. When that will be, I don't exactly know. Probably before most people upgrade again, I would think. That's why I say that.
The 9216sp GeForce will probably replace the 5080, be cheaper, be faster, have more RAM, and be the go-to for 1440p (or PS6 parity); that's why I say that.
The '7900xtx' version of something like this, which doesn't exist yet but will, or something similar to an AD103/B203 but with an extra shader cluster, will replace the 4090 for 1440p->4k upscaling; that's why I say that.
Also the 5090 is ridiculous because it can't keep 4k RT above 60fps, but the next gen probably will; so that's why I say that.

It's not that these things aren't good (especially for their relative price) right now, or that you can't be happy with them, just put it in perspective because of the raster->RT (and console) transition.
RT has been 'the future' for a solid 7 years now, and only now are we starting to see games where it's mandatory. By the time it goes mainstream, the 9070 series will be long obsolete. Give it a good 5-7 more years.

Right, but chiplets. Probably connected to a host chip (that has a 128-bit controller) on an interposer. I don't think 384-bit monolithic was ever an option.
Chiplets are a cost-saving measure (and not a good one on GPUs), not something that improves memory connectivity. On RDNA 3, you still had to connect the main die to the memory controller chiplets somehow. Those connections don't just magically disappear when you put the memory controller onto a separate die. It increases complexity without delivering the cost savings AMD had hoped for.
 
Joined
Jan 2, 2024
Messages
791 (1.85/day)
Location
Seattle
System Name DevKit
Processor AMD Ryzen 5 3600 ↗4.0GHz
Motherboard Asus TUF Gaming X570-Plus WiFi
Cooling Koolance CPU-300-H06, Koolance GPU-180-L06, SC800 Pump
Memory 4x16GB Ballistix 3200MT/s ↗3800
Video Card(s) PowerColor RX 580 Red Devil 8GB ↗1380MHz ↘1105mV, PowerColor RX 7900 XT Hellhound 20GB
Storage 240GB Corsair MP510, 120GB KingDian S280
Display(s) Nixeus VUE-24 (1080p144)
Case Koolance PC2-601BLW + Koolance EHX1020CUV Radiator Kit
Audio Device(s) Oculus CV-1
Power Supply Antec Earthwatts EA-750 Semi-Modular
Mouse Easterntimes Tech X-08, Zelotes C-12
Keyboard Logitech 106-key, Romoral 15-Key Macro, Royal Kludge RK84
VR HMD Oculus CV-1
Software Windows 10 Pro Workstation, VMware Workstation 16 Pro, MS SQL Server 2016, Fan Control v120, Blender
Benchmark Scores Cinebench R15: 1590cb Cinebench R20: 3530cb (7.83x451cb) CPU-Z 17.01.64: 481.2/3896.8 VRMark: 8009
Hard to say whether AMD is truly happy with GDDR6 or just decided to focus all engineering resources on next gen.
Fffhahaha, doubt it. Look at the current state of GDDR7 on the 50 series. It's new, and we're already treating it like it's way off in the future before it matures.
The potential is very much there for bigger sizes and speeds, but all these products can easily suffer from potential that's "too high" to be viable right now.
Let Nvidia deal with the teething issues and early-adopter tax of this new technology, figure out what's an issue and whatever. They have way more time and money.
AMD can adopt GDDR7 later as it matures into better products, and reap the benefits of creating super exciting 90-class products that actually sell by the boatload.
Huh. I can't remember the last time I had a board without UEFI.
I have a few pairings:
Pentium 4 and the ABit IS7 (BIOS)
Phenom II X4 and Asus M4A78T-E (BIOS/EFI)
Athlon 2650e and eMachines boxer (BIOS)
FX-8370 and Gigabyte GA-970A-DS3P (EFI)

First two are out of service and the rest are SFF/2U jobbers. If the 9060 ships in HHHL flavor, I may snag one for the rack. On the other hand if I ever have to main the FX again for any reason, there won't be a problem. Nice.
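For anyone sorting their own fleet into those two camps: on Linux the kernel exposes `/sys/firmware/efi` only when the machine booted via UEFI, so the check is tiny. A sketch; the `sysfs_root` parameter is only there so the logic can be exercised against a fake tree:

```python
import os

def boot_mode(sysfs_root: str = "/sys") -> str:
    """Report how a Linux system was booted.

    The kernel populates <sysfs>/firmware/efi only when the
    firmware handed off via UEFI; its absence means legacy BIOS.
    """
    if os.path.isdir(os.path.join(sysfs_root, "firmware", "efi")):
        return "UEFI"
    return "Legacy BIOS"

print(boot_mode())
```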
 
Joined
May 13, 2008
Messages
989 (0.16/day)
RT has been 'the future' for a solid 7 years now, and only now are we starting to see games where it's mandatory. By the time it goes mainstream, the 9070 series will be long obsolete. Give it a good 5-7 more years.


Chiplets are a cost-saving measure (and not a good one on GPUs), not something that improves memory connectivity. On RDNA 3, you still had to connect the main die to the memory controller chiplets somehow. Those connections don't just magically disappear when you put the memory controller onto a separate die. It increases complexity without delivering the cost savings AMD had hoped for.

First part is correct, and why I didn't recommend it. Second part is incorrect. This is the start of it actually happening... and why AMD made this card (as it'll kinda-sorta do 1440p, and will do 1080p).
It will complete happening (I mean in mainstream games people actually want to play) within the life of the next generation, as that is when the PS6 will launch (with more games making those things mandatory).

Have you seen the mock-ups of the cancelled chip?
There is a version like the one you describe (which is that^), but I know I've seen 3 chips each connected to a host chip in multiples. I just don't remember where I saw it.

I would look for it, but what's the point? I'm sure something will leak eventually about however they choose to do it on 3nm, be it that way, the way I remember seeing it, or something else.

They could wire the chips to the MC through the interposer, I would imagine, or through bridge chips, not unlike what you see on current Zen CPUs.

The word has always been that a 9x 18432sp chip was happening, and is still happening, and the plan is eventually 12. I don't know what it will look like; that's just what the leakers say. :p
 
Joined
Sep 17, 2014
Messages
23,435 (6.13/day)
First part is correct, and why I didn't recommend it. Second part is incorrect. This is the start of it actually happening... and why AMD made this card (as it'll kinda-sorta do 1440p, and will do 1080p).
It will complete happening (I mean in mainstream games people actually want to play) within the life of the next generation, as that is when the PS6 will launch (with more games making those things mandatory).

Have you seen the mock-ups of the cancelled chip?
There is a version like the one you describe (which is that^), but I know I've seen 3 chips each connected to a host chip in multiples. I just don't remember where I saw it.

I would look for it, but what's the point? I'm sure something will leak eventually about however they choose to do it on 3nm, be it that way, the way I remember seeing it, or something else.

They could wire the chips to the MC through the interposer, I would imagine, or through bridge chips, not unlike what you see on current Zen CPUs.

The word has always been that a 9x 18432sp chip was happening, and is still happening, and the plan is eventually 12. I don't know what it will look like; that's just what the leakers say. :p
It will complete happening on the current hardware? Right. (This includes the PS6, which will be RDNA4.)

AFAIK we're still juggling raster-optimized shaders / SMs against RT loads. Even Nvidia. From a hardware POV it hasn't happened at all yet. Performance per mm² of die still nosedives by a good 40-50% with RT on, and that's not even doing full PT.

I think you need a massive reality check here. RT only passes as 'playable' with the addition of upscalers that at least double your framerate, OR by rendering at a much lower resolution. And then you're not talking about your average 300-400 dollar GPU either (the equivalent of what's in a console)... you're talking recent x70 territory or better. A 9070XT, for example, ergo last gen's high end.

We are still at least a decade away from a gaming paradigm that 'requires RT'. In fact, given the market penetration of affordable, RT-capable GPUs, I think we've been at a complete standstill since Turing - you can credit Nvidia for that. Back then the poster child was Battlefield, and oh my, those leaves are lit differently and the puddle still reflects things but now looks like a mirror. Today it's Cyberpunk and Indiana Jones, and oh my, look how the sand is colored more yellow now.

Come on man. It's a nothingburger. People ain't gonna pay for this shit en masse. They aren't, and the feature is still turned off in the vast majority of games. Look at any and all big game teasers and announcements. RT? It's whatever. It's still all about the games. People don't give a rat's ass about RT, and it's unobtainium even if they did.
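The arithmetic behind "playable only with an upscaler" can be put in back-of-the-envelope form: if RT alone eats 40-50% of throughput, only an upscaler that roughly doubles framerate gets you back to (or past) the raster number. A toy model using the post's ballpark figures, not benchmarks:

```python
# Toy model: framerate after the RT hit, then after upscaling.
# rt_cost=0.45 and upscaler_gain=2.0 are the post's rough figures,
# not measured numbers.
def effective_fps(raster_fps: float, rt_cost: float = 0.45,
                  upscaler_gain: float = 2.0) -> float:
    rt_fps = raster_fps * (1.0 - rt_cost)  # the RT nosedive
    return rt_fps * upscaler_gain          # recovered via upscaling

base = 90.0  # hypothetical raster fps on some midrange card
print(effective_fps(base, upscaler_gain=1.0))  # RT alone: below 60
print(effective_fps(base))                     # with a 2x upscaler
```

With a 45% hit, a 90 fps raster card lands at 49.5 fps with RT on, which is exactly why a ~2x upscaler is treated as mandatory rather than optional.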
 
Joined
May 13, 2008
Messages
989 (0.16/day)
It will complete happening on the current hardware? Right.

AFAIK we're still juggling raster-optimized shaders against RT loads. Even Nvidia. From a hardware POV it hasn't happened at all yet. Performance per mm² of die still nosedives by a good 40-50% with RT on, and that's not even doing full PT.

I think you need a massive reality check here. RT only passes as 'playable' with the addition of upscalers that at least double your framerate, OR by rendering at a much lower resolution. And then you're not talking about your average 300-400 dollar GPU either (the equivalent of what's in a console)... you're talking recent x70 territory or better. A 9070XT, for example, ergo last gen's high end.

We are still at least a decade away from a gaming paradigm that 'requires RT'. In fact, given the market penetration of affordable, RT-capable GPUs, I think we've been at a complete standstill since Turing. Back then the poster child was Battlefield, and oh my, those leaves are lit differently and the puddle still reflects things but now looks like a mirror. Today it's Cyberpunk and Indiana Jones, and oh my, look how the sand is colored more yellow now.

Come on man. It's a nothingburger. People ain't gonna pay for this shit en masse. They aren't, and the feature is still turned off in the vast majority of games. Look at any and all big game teasers and announcements. RT? It's whatever. It's still all about the games. People don't give a rat's ass about RT.

We really aren't still balancing it, though. That's kinda my point. There are showpieces, sure, but they are the exception and not the rule.

You can start to see with stuff like Wukong and SM2 that there truly is a standard. One is upscaled (Wukong) and the other native (SM2), but they're still very much congruent with each other.
I know people think it's very crazy that Sony would have the most important IP from one of their most technically capable studios ported by the most prestigious port house in the world to prepare for the PS6.
Why, I don't know. People are very weird. By weird, I mean very slow.

The en masse you speak of is called a PlayStation. The PC crowd will follow, but some will already be somewhat prepared.

The 9070 xt will hit that level for 1080p mins; it appears pretty apparent that's the goal. Overclocked, maybe 1440p averages, and that will likely be the goal for the XTX at stock (which lines up with a stock 5080).

And really, this makes sense. Because if it's competing with a low-end Rubin and a cut-down, slightly higher-end Rubin, those would be the markets.

1080p (9070 xt, 6144sp Rubin)
1440p averages (90x0, 7680sp cut-down Rubin)
1440p mins (9216sp Rubin, replacing the 5080)
1440p->4k (12288sp from both, replacing the 4090)
4k (18432sp, new tier)

I don't need a reality check; some people just need to pay attention to what's currently happening before the exception becomes the standardized future and it steamrolls them.
 
Joined
Jul 5, 2013
Messages
29,451 (6.91/day)
The next Revision of RDNA, RDNA 5, will be for Mobile and Gaming Consoles.
This has yet to be definitively verified by AMD. If it has the advances and advantages over RDNA4 that are rumored, AMD would be foolish to keep it exclusive to mobile and gaming consoles.
The next node for Desktop GPUs will be UDNA.
That is also yet to be confirmed, though it is possible, perhaps even likely.
As you can tell, UDNA1 doesn't give me the warm and fuzzies.
Why not?
 
Joined
Feb 18, 2005
Messages
6,181 (0.84/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) Dell S3221QS(A) (32" 38x21 60Hz) + 2x AOC Q32E2N (32" 25x14 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G604
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
This is why you could literally see a 3500mhz N48 competing with 5080

The amusing part is for 5080 to make sense it needs 24GB of memory (really 18-20, but 24GB is the only option) at 3.23ghz
You make claims like these yet never provide any evidence for them, besides waffling about why MOAR VRAM BETTAR and thus AMD is superior, and your theorycrafted walls of text are getting quite tiresome.

In contrast, W1zzard's reviews consistently provide hard data showing exactly the opposite of what you claim. Your only response to this is "but his testing methodology is wrong", yet you never explain why.

Basically you believe you're right, but you have no evidence to back it up, and much like a certain American president you obfuscate that fact with many words. You know what they say about opinions and anuses, right?
 
Joined
May 13, 2008
Messages
989 (0.16/day)
You make claims like these yet never provide any evidence for them, besides waffling about why MOAR VRAM BETTAR and thus AMD is superior, and your theorycrafted walls of text are getting quite tiresome.

In contrast, W1zzard's reviews consistently provide hard data showing exactly the opposite of what you claim. Your only response to this is "but his testing methodology is wrong", yet you never explain why.

Basically you believe you're right, but you have no evidence to back it up, and much like a certain American president you obfuscate that fact with many words. You know what they say about opinions and anuses, right?

You can believe what you'd like. I've explained it several times in my walls of text. I've shown you ample examples of what I'm talking about. Which part have I not explained to your liking?

I have no problem with an adverse stance or opinion, but I can prove what I'm talking about; that's the point. It's your straw-man misinterpretation of my words that I find mighty tiring.

I have explained why W1zzard's reviews are problematic: they don't show minimums in general testing, including RT, and they exclude games that are constantly swapping VRAM (which can cause stutter or low-res assets to persist), which isn't captured by average framerates, or sometimes by ANY framerate. Upscaling, including minimum framerates, needs to be shown, as it is now crucial to how games are played.

Do you have anything of substance to contribute?
 
Joined
Jan 14, 2019
Messages
14,754 (6.58/day)
First part is correct, and why I didn't recommend it. Second part is incorrect. This is the beginning of it actually happening... and why AMD made this card (as it'll kinda-sorta do 1440p, and will do 1080p).
It will finish happening (I mean in more mainstream games people actually want to play) within the life of the next generation, as that is when the PS6 will launch (with more games using those things as mandatory).
I respect your opinion, but allow me to disagree. AMD is focusing on RT with RDNA 4 because that's where RDNA 2 and 3 were severely lacking, and not because the last 7 years of stagnation is now magically turning into a full RT revolution. That's right, I said stagnation. Turing saw an X % performance decrease with RT on vs off, Ampere the same, Ada the same, and... yep, Blackwell the same as well. So even on Nvidia, the overall performance improved, but RT hasn't. They keep yapping about being on the N-th gen RT core now, but what has really changed? Without measurable performance or block diagrams, no one knows.

Have you seen the mock-ups of the cancelled chip?
There is a version like the one you describe (which is that ^), but I know I've seen 3 chips each connected to a host chip in multiples. I just don't remember where I saw it.
That's an interesting one, thanks for the link. :)

I can only assume that AMD cancelled it because R&D costs of the interposer and packaging were / would have been through the roof. After recent losses, they needed a much more cost-efficient design. Something simple that just works - and I'm hoping that RDNA 4 is just that.
 
Joined
May 13, 2008
Messages
989 (0.16/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
I respect your opinion, but allow me to disagree. AMD is focusing on RT with RDNA 4 because that's where RDNA 2 and 3 were severely lacking, and not because the last 7 years of stagnation is now magically turning into a full RT revolution. That's right, I said stagnation. Turing saw an X % performance decrease with RT on vs off, Ampere the same, Ada the same, and... yep, Blackwell the same as well. So even on Nvidia, the overall performance improved, but RT hasn't. They keep yapping about being on the N-th gen RT core now, but what has really changed? Without measurable performance or block diagrams, no one knows.


That's an interesting one, thanks for the link. :)

I can only assume that AMD cancelled it because R&D costs of the interposer and packaging were / would have been through the roof. After recent losses, they needed a much more cost-efficient design. Something simple that just works - and I'm hoping that RDNA 4 is just that.

We can agree to disagree on RT. It's one of those things, you know? When do you really say it has arrived? When there is a game you want to play? At what fidelity do you want to play it? It's all opinion.
I think AMD chose this moment because they know it's going to be much more prevalent and want something out there in whatever market is feasible (which is currently the mid-range, soon low-end).

As for the cancelled chip, I already said why I think it happened. AFAIK, it is still happening, just on 3nm. You're right that they needed something simple/efficient, and depending on your definition, N48 is that.

Will some people call its max clock potential efficient (versus using a design with more units at a lower clock)? Probably not. But in terms of cost, it's the opposite. So, again, it's a matter of opinion.

None of this is new; it is what AMD does. They enter a market and/or focus on features when the time is right. How do they know the time is right? Because they make the console chips (and scale accordingly).
 
Joined
Jan 14, 2019
Messages
14,754 (6.58/day)
Location
Midlands, UK
We can agree to disagree on RT. It's one of those things, you know? When do you really say it has arrived? When there is a game you want to play? At what fidelity do you want to play it? It's all opinion.
I think AMD chose this moment because they know it's going to be much more prevalent and want something out there in whatever market is feasible (which is currently the mid-range, soon low-end).
I'll say it has arrived when it runs smoothly at 1440p on an x60-class GPU.

I think AMD chose this moment because they knew very well that bad RT support was an excuse not to buy RDNA 2 or 3 for many.

Like you yourself said, it's all personal opinion. Mine is this. :)
 
Joined
Feb 24, 2023
Messages
3,637 (4.92/day)
Location
Russian Wild West
System Name D.L.S.S. (Die Lekker Spoed Situasie)
Processor i5-12400F
Motherboard Gigabyte B760M DS3H
Cooling Laminar RM1
Memory 32 GB DDR4-3200
Video Card(s) RX 6700 XT (vandalised)
Storage Yes.
Display(s) MSi G2712
Case Matrexx 55 (slightly vandalised)
Audio Device(s) Yes.
Power Supply Thermaltake 1000 W
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
Benchmark Scores My PC can run Crysis. Do I really need more than that?
I'll say RT is here when I don't need to have a million doobies to be unable to distinguish RT effects from real life, and all that whilst at least 50% of what could be done with RT is done using RT. In every game I've seen, including but not limited to CP77, SH2, AW2, and Control, RT only looks okay in select sub-scenarios and generally only hampers both the visuals and the performance. We're horribly far away from this.

According to AMD, the 9070 XT trails the 5070 Ti in heavy RT by at least 19% (their CP77 benchmark showed it at 81%), so it's not like we're getting the necessary uplifts.

As per the monetary part of the question, having reasonable RT (= at least two times better than current Path Tracing) at 1080p at 60+ FPS should cost under $400 for RT to really be a thing.
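As a sanity check on the percentages being thrown around (a hedged sketch; the 81% relative-performance figure is taken from this post's claim, not from any independent benchmark):

```python
# If the 9070 XT lands at 81% of the 5070 Ti in a heavy-RT CP77 run,
# "trails by 19%" and "the 5070 Ti is 19% faster" are not the same claim.
relative = 0.81                # 9070 XT score / 5070 Ti score, per the figure above

deficit = 1.0 - relative       # 9070 XT's shortfall: 0.19 -> "trails by 19%"
lead = 1.0 / relative - 1.0    # 5070 Ti's advantage: ~0.235 -> ~23.5% faster

print(f"9070 XT trails by {deficit:.0%}; 5070 Ti leads by {lead:.1%}")
```

In other words, a 19% deficit from one side reads as a roughly 23.5% lead from the other, which is worth keeping in mind when comparing bar charts.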
 
Joined
May 13, 2008
Messages
989 (0.16/day)
I'll say it has arrived when it runs smoothly at 1440p on an x60-class GPU.

I think AMD chose this moment because they knew very well that bad RT support was an excuse not to buy RDNA 2 or 3 for many.

Like you yourself said, it's all personal opinion. Mine is this. :)

But if it has not arrived, why does it matter? Why did people choose it? Why are they capitulating if it's not something that is required?

I agree it largely didn't matter before, especially given how quickly it was deprecated each gen (which is why people shouldn't have bought into it, imho). But now that it is settling, they are jumping in.

Again, I think the next-gen '60' is 1080p, similar to the 9070 XT. Again, people will make 1440p work until there are greater requirements (probably dictated by PS games that may be built using more compute/VRAM at base).
So that would mean probably 2nm before it's 1440p... but it's difficult to know. 1440p might just stay '70/80' class card territory, with general requirements increasing.
This is due to technological slowdown. In the past, what you want would have happened. But now it's conceivable it may not; rather, slight adjustments may be made in each perf tier (and general gaming) to sustain it.

If you think about it, though... it's possible. 2nm is a 1.28x increase in general transistor density, meaning if we get 6144sp to replace 8192sp going from 4nm to 3nm, we could get 8192sp on 2nm replacing 9216sp on 3nm (at a 14% higher clock). So, yeah... maybe with an 11060... haha. Maybe. If it has enough RAM. Given how the low-end typically works, it is generally limited in some way from entering the 1440p market.
This is on purpose, otherwise they would have no reason to have a stack.
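The napkin math in that paragraph can be sketched out; purely illustrative, and every number here is the post's own assumption (a 1.28x 3nm-to-2nm density gain, hypothetical shader counts, a 14% clock bump), not vendor data:

```python
# Density budget: a die area that fits 6144 shaders on 3nm could fit
# roughly 1.28x as many on 2nm, landing near a natural 8192sp config.
density_gain = 1.28
sp_3nm = 6144
sp_2nm_budget = sp_3nm * density_gain   # ~7864, close to 8192

# Performance equivalence: 8192 shaders at a 14% higher clock deliver
# about as many shader-clocks as 9216 shaders at the old clock.
effective_shader_clocks = 8192 * 1.14   # ~9339, slightly above 9216

print(round(sp_2nm_budget), round(effective_shader_clocks))
```

So under those assumptions the arithmetic does hold up: the density budget rounds to the 8192sp figure, and the clock-adjusted shader-clock product slightly exceeds the 9216sp part it would replace.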

But...judging by current games...what you want will *probably* happen on 2nm. That's fair; I'll give you that.
I would argue once there is a full stack it will begin to occur, but it may indeed take longer for greater saturation; sure.

I'll say RT is here when I don't need to have a million doobies to be unable to distinguish RT effects from real life, and all that whilst at least 50% of what could be done with RT is done using RT. In every game I've seen, including but not limited to CP77, SH2, AW2, and Control, RT only looks okay in select sub-scenarios and generally only hampers both the visuals and the performance. We're horribly far away from this.

According to AMD, the 9070 XT trails the 5070 Ti in heavy RT by at least 19% (their CP77 benchmark showed it at 81%), so it's not like we're getting the necessary uplifts.

As per the monetary part of the question, having reasonable RT (= at least two times better than current Path Tracing) at 1080p at 60+ FPS should cost under $400 for RT to really be a thing.
This is because it's an option. Things are much different when it's built into the inherent structure of a game. There are many examples of technologies where this has been the case over the years.

What you are saying wrt price is fair for general 1080p RT (again, mins... please don't take that classification too rigidly) imo, and like I said, I think that's next gen and will replace this very thing.
There's a reason the 9070 vanilla will probably not be (on purpose, as its price may drop low). AMD is trying to make some extra money while they can (and before the market settles into it).
 
Joined
Sep 17, 2014
Messages
23,435 (6.13/day)
Location
The Washing Machine
the exception becomes the standardized future and it steam rolls them.
It never does, though; that's what I'm saying. The content will simply adapt or it won't sell. This ain't a matter of 'oh, you are missing 2 GB to run it' - ALL cards use crutches to get usable RT.
 
Joined
May 13, 2008
Messages
989 (0.16/day)
It never does, though; that's what I'm saying. The content will simply adapt or it won't sell. This ain't a matter of 'oh, you are missing 2 GB to run it' - ALL cards use crutches to get usable RT.
This just really isn't true. Look at a 4090. It will typically upscale most games from 1440p to 4K, which is a very reasonable thing. Games like Wukong it will run at 1440p (yes, upscaled).
Yes, other cards are *purposely* hobbled so it doesn't line up well. Like how a 5080 isn't suitable for 1440p (60 fps mins). The 9070 XT likely will be for 1080p. Then they may try to say the 'XTX' is good enough for 1440p averages.
Which is exactly what the 5080 does, for a crap-load more money.
Especially when you add features like FG and/or DLSS, which requires ~5-7% more performance (look at W1zzard's DLSS 3 vs 4 benches if you don't believe me on that one, btw), and again, this is by design.

That's kinda what I've been saying! :p

The 9070 XT is likely about this with a cheaper card than nVIDIA has (which, again, is the point of what nVIDIA does). That you can build one towards a spec; you don't have to go high/low, and can stay consistent.
Where the 5070 is a POS (that needs to upscale regardless), or instead you need at least a $750 card to get 1080p 60 fps mins.

Again, I know there are different situations; when I say 1080p60 mins, in Wukong that means upscaling. In Spider-Man it doesn't. I think Spider-Man is the more realistic notion, for many reasons.
Including the fact they make the PlayStation, and that game will almost certainly be ported with settings from the PC at some ratio similar to a then-available 3nm card on the market with 60 fps mins.

In Spider-Man 3, who's to say that won't be the default? It probably WILL BE. That's all I'm trying to say. All of that makes sense, doesn't it?

I think nVIDIA (and AMD) will hit the nail on the head next gen, with (hopefully) better pricing. I don't think they'll be able to dance around it any further. If you take the clocks/specs I've suggested, this bears out.
 
Joined
Sep 17, 2014
Messages
23,435 (6.13/day)
Location
The Washing Machine
This just really isn't true. Look at a 4090. It will typically upscale most games from 1440p to 4K, which is a very reasonable thing. Games like Wukong it will run at 1440p (yes, upscaled).
Yes, other cards are *purposely* hobbled so it doesn't line up well. Like how a 5080 isn't suitable for 1440p (60 fps mins). The 9070 XT likely will be for 1080p. Then they may try to say it's good enough for 1440p averages.
Which is exactly what the 5080 does, for a crap-load more money.
Especially when you add features like FG and/or DLSS, which requires ~5-7% more performance (look at W1zzard's DLSS 3 vs 4 benches if you don't believe me on that one, btw), and again, this is by design.

That's kinda what I've been saying! :p

The 9070 XT is likely about this with a cheaper card than nVIDIA has (which, again, is the point of what nVIDIA does). That you can build one towards a spec; you don't have to go high/low, and can stay consistent.
Where the 5070 is a POS (that needs to upscale regardless), or instead you need at least a $750 card to get 1080p 60 fps mins.

Again, I know there are different situations; when I say 1080p60 mins, in Wukong that means upscaling. In Spider-Man it doesn't. I think Spider-Man is the more realistic notion, for many reasons.
Including the fact they make the PlayStation, and that game will almost certainly be ported with settings from the PC at some ratio similar to a then-available 3nm card on the market with 60 fps mins.

In Spider-Man 3, who's to say that won't be the default? It probably WILL BE. That's all I'm trying to say. All of that makes sense, doesn't it?

I think nVIDIA (and AMD) will hit the nail on the head next gen, with (hopefully) better pricing. I don't think they'll be able to dance around it any further. If you take the clocks/specs I've suggested, this bears out.
Lol, you have yet to show an example of non-upscaled RT running OK on even a 4090, mate. These cards are solidly in unobtanium land. Upscaling is still a crutch; vendor/version/game-specific support required.
 
Joined
Jan 14, 2019
Messages
14,754 (6.58/day)
Location
Midlands, UK
But if it has not arrived, why does it matter? Why did people choose it? Why are they capitulating if not something that is required?
Because marketing. People see some red and green bars in a review and go "I ain't gonna buy that crap" without thinking. They see something that Nvidia does better than AMD, not something that both manufacturers more or less suck at.

This just really isn't true. Look at a 4090. It will typically upscale most games from 1440p to 4K, which is a very reasonable thing. Games like Wukong it will run at 1440p (yes, upscaled).
Yes, other cards are *purposely* hobbled so it doesn't line up well. Like how a 5080 isn't suitable for 1440p (60 fps mins). The 9070 XT likely will be for 1080p. Then they may try to say it's good enough for 1440p averages.
Which is exactly what the 5080 does, for a crap-load more money.
Especially when you add features like FG and/or DLSS, which requires ~5-7% more performance (look at W1zzard's DLSS 3 vs 4 benches if you don't believe me on that one, btw), and again, this is by design.
Upscaling has never been a reasonable thing on a top-of-the-line GPU. If a 5080 isn't suitable for RT at 1440p, like you said, then you just proved my point: we're far from RT being usable in games.
 