
The future of RDNA on Desktop.

Joined
Jul 15, 2022
Messages
995 (1.00/day)
The price of the RX 9070 is probably going to be competitive with its Nvidia counterparts in a few months' time.
New AMD Radeon cards often have the problem that their prices are less competitive at release than a few months after release.

And it looks to me like AMD has a winner in the 9070 series.

AMD has always had great products.
RX 580 8GB was usually cheaper than the Nvidia GTX 1060 6GB.
RX 7600 and RTX 3060 8GB have been the same price for months but the RX 7600 is faster on average.

In terms of FPS/dollar, AMD was often narrowly better than Nvidia in the segment where most GPUs are sold.
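A comparison like this is easy to tabulate. A minimal sketch, with placeholder FPS and price figures rather than real benchmark data:

```python
# Rank GPUs by FPS per dollar. All numbers below are hypothetical
# placeholders for illustration, not real benchmark results.
def fps_per_dollar(cards):
    """Return (name, fps/price) pairs, best value first."""
    ranked = ((name, fps / price) for name, (fps, price) in cards.items())
    return sorted(ranked, key=lambda pair: pair[1], reverse=True)

cards = {
    "Card A": (60.0, 270.0),  # (average FPS, street price in USD)
    "Card B": (58.0, 300.0),
}

for name, value in fps_per_dollar(cards):
    print(f"{name}: {value:.3f} FPS/$")
```

Street price rather than MSRP is the interesting input here, which is why the value picture shifts a few months after launch.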
 
Joined
Sep 7, 2017
Messages
37 (0.01/day)
So AMD will get rid of the Radeon branding? It makes no sense to ditch the brand now that they're getting a massive improvement with RDNA4, bringing some needed recognition to the Radeon brand.
AMD needs to stick with a naming scheme and get people aware of the brand, unless AMD is deciding to do a Ryzen with their graphics division and wants to start fresh.

Having no mid-range or $500-700 tier card would completely kill off AMD cards for most gamers, leaving Nvidia with over 90% market share.
If AMD is going to do a two-chip gaming + compute die design, shouldn't one or both of the chips be scalable?
To address the second part of your response: AMD, along with Nvidia and Intel, is planning to migrate close to all of the low-end gaming card market, along with a sizable chunk of the mid-range gaming card market, over to gaming laptops. AMD has a high degree of confidence that its upcoming APUs will do the trick. Intel has always had an iGPU strategy; it remains to be seen what Intel can stitch together with its EMIB technology. The $64,000 question is what Nvidia can do to get in this game. Their problem is the CPU side of the equation, which they are working feverishly to get into. All of this will erode the traditional dGPU market. By the end of the decade, all that could be left is the high end and a small chunk of the dGPU market.

A dual-chiplet / dual-mode application design will not come cheap. At $1,000 the card would be a loss leader; even at $1,500 it would still be losing money. The only way out of the conundrum is to market it to design and engineering houses for 10 to 20 grand a pop. That would make the proposition palatable.

In the long run, what I expect for mid-range users is a much slower pace of development. Instead of a new generation of cards every 2 years, it will be a new generation every 6 years, perhaps staggered with a software update every 3 years: major software updates in the odd years, with minor updates timed to coincide with new hardware releases.

The price of the RX 9070 is probably going to be competitive with its Nvidia counterparts in a few months' time.
New AMD Radeon cards often have the problem that their prices are less competitive at release than a few months after release.



AMD has always had great products.
RX 580 8GB was usually cheaper than the Nvidia GTX 1060 6GB.
RX 7600 and RTX 3060 8GB have been the same price for months but the RX 7600 is faster on average.

In terms of FPS/dollar, AMD was often narrowly better than Nvidia in the segment where most GPUs are sold.
The RX 580's GCN was strictly second best to Nvidia's Pascal offerings when it came to gaming. GCN was about number crunching, and those cards generated a mean mining hash rate. I wouldn't be surprised if we see a developed form of the GCN architecture resurface on the compute-chip side of UDNA.
 
Last edited:
Joined
Oct 5, 2024
Messages
255 (1.41/day)
Location
United States of America
To address the second part of your response: AMD, along with Nvidia and Intel, is planning to migrate close to all of the low-end gaming card market, along with a sizable chunk of the mid-range gaming card market, over to gaming laptops. AMD has a high degree of confidence that its upcoming APUs will do the trick. Intel has always had an iGPU strategy; it remains to be seen what Intel can stitch together with its EMIB technology. The $64,000 question is what Nvidia can do to get in this game. Their problem is the CPU side of the equation, which they are working feverishly to get into. All of this will erode the traditional dGPU market. By the end of the decade, all that could be left is the high end and a small chunk of the dGPU market.

A dual-chiplet / dual-mode application design will not come cheap. At $1,000 the card would be a loss leader; even at $1,500 it would still be losing money. The only way out of the conundrum is to market it to design and engineering houses for 10 to 20 grand a pop. That would make the proposition palatable.

In the long run, what I expect for mid-range users is a much slower pace of development. Instead of a new generation of cards every 2 years, it will be a new generation every 6 years, perhaps staggered with a software update every 3 years: major software updates in the odd years, with minor updates timed to coincide with new hardware releases.
You have a source on any of this?
 
Joined
Mar 13, 2021
Messages
512 (0.35/day)
Processor AMD 7600x
Motherboard Asrock x670e Steel Legend
Cooling Silver Arrow Extreme IBe Rev B with 2x 120 Gentle Typhoons
Memory 4x16Gb Patriot Viper Non RGB @ 6000 30-36-36-36-40
Video Card(s) XFX 6950XT MERC 319
Storage 2x Crucial P5 Plus 1Tb NVME
Display(s) 3x Dell Ultrasharp U2414h
Case Coolermaster Stacker 832
Power Supply Thermaltake Toughpower PF3 850 watt
Mouse Logitech G502 (OG)
Keyboard Logitech G512
Samsung is still the bleeding edge according to those that are interested in their enthusiast products but you'll have to ask them, I stick to Micron.

To address the second part of your response: AMD, along with Nvidia and Intel, is planning to migrate close to all of the low-end gaming card market, along with a sizable chunk of the mid-range gaming card market, over to gaming laptops. AMD has a high degree of confidence that its upcoming APUs will do the trick. Intel has always had an iGPU strategy; it remains to be seen what Intel can stitch together with its EMIB technology. The $64,000 question is what Nvidia can do to get in this game.
APUs are what they are looking for in this space. What will be interesting to see is whether the next-gen Xbox/PS will be based on an effectively off-the-shelf design rather than the custom SoC of previous generations. I suspect the custom route was mostly because, at their respective design times, the APU offerings were VERY anemic:

PS4 (8-core APU with 1152 shader units) vs. the Athlon 5370 of similar design/release date (4-core APU with 128 shader units)
PS5 (8-core APU with at least 36 CUs) vs. the Ryzen 7 4700G of similar design/release date (8 cores with 8 CUs)

With the rise of handheld gaming, and with AMD pushing the CU count of parts like the Ryzen AI Max+ PRO 395, already a 16-core, 40 CU part, it is getting close, IMO, to that point. I mean, this APU has more CUs than a 7600 XT and an equivalent boost clock, as long as power/cooling is available.
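For a rough sense of how a 40 CU APU stacks up against a 32 CU desktop card, peak FP32 throughput can be estimated from CU count and clock. A sketch using the simple single-issue formula (64 shaders per CU, 2 FLOPs per clock for FMA); the clocks are illustrative assumptions, and RDNA 3's dual-issue can double the paper number:

```python
def fp32_tflops(cus, clock_ghz, shaders_per_cu=64):
    """Peak FP32 TFLOPS: shaders * 2 FLOPs/clock (FMA) * clock in GHz."""
    return cus * shaders_per_cu * 2 * clock_ghz / 1000

apu = fp32_tflops(40, 2.9)    # 40 CU APU at an assumed ~2.9 GHz boost
dgpu = fp32_tflops(32, 2.75)  # 32 CU card at an assumed ~2.75 GHz boost
print(f"APU: {apu:.1f} TFLOPS, dGPU: {dgpu:.1f} TFLOPS")
```

The paper numbers only materialize if the APU can actually sustain those clocks, hence the power/cooling caveat.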

The $64,000 question is what Nvidia can do to get in this game. Their problem is the CPU side of the equation, which they are working feverishly to get into. All of this will erode the traditional dGPU market. By the end of the decade, all that could be left is the high end and a small chunk of the dGPU market.
They will be very happy to hear about the Epic Games Store's support for Snapdragon, as it could make something like their Grace cores more attractive to offer into the consumer market, but there is a lot left to do before that becomes a reality.

A dual-chiplet / dual-mode application design will not come cheap. At $1,000 the card would be a loss leader; even at $1,500 it would still be losing money. The only way out of the conundrum is to market it to design and engineering houses for 10 to 20 grand a pop. That would make the proposition palatable.
I actually disagree massively on this. The reason Ryzen has been so successful in both the consumer and server spaces is the scalability offered by the chiplet design. AMD has been trying to replicate that success with GPUs, but I suspect it was a case of a little too much, too soon for this generation. Looking at some of the specs, they have taken what they learnt with Ryzen and applied it to their GPUs on a much larger scale, and it actually shows what could soon be happening to their CPUs as well!

 
Joined
Jun 26, 2023
Messages
108 (0.17/day)
Processor 7800X3D @ Curve Optimizer: All Core: -25
Motherboard TUF Gaming B650-Plus
Memory 2xKSM48E40BD8KM-32HM ECC RAM (ECC enabled in BIOS)
Video Card(s) 4070 @ 110W
Display(s) SAMSUNG S95B 55" QD-OLED TV
Power Supply RM850x
Joined
Sep 7, 2017
Messages
37 (0.01/day)


APUs are what they are looking for in this space. What will be interesting to see is whether the next-gen Xbox/PS will be based on an effectively off-the-shelf design rather than the custom SoC of previous generations. I suspect the custom route was mostly because, at their respective design times, the APU offerings were VERY anemic:

PS4 (8-core APU with 1152 shader units) vs. the Athlon 5370 of similar design/release date (4-core APU with 128 shader units)
PS5 (8-core APU with at least 36 CUs) vs. the Ryzen 7 4700G of similar design/release date (8 cores with 8 CUs)

With the rise of handheld gaming, and with AMD pushing the CU count of parts like the Ryzen AI Max+ PRO 395, already a 16-core, 40 CU part, it is getting close, IMO, to that point. I mean, this APU has more CUs than a 7600 XT and an equivalent boost clock, as long as power/cooling is available.


They will be very happy to hear about the Epic Games Store's support for Snapdragon, as it could make something like their Grace cores more attractive to offer into the consumer market, but there is a lot left to do before that becomes a reality.


I actually disagree massively on this. The reason Ryzen has been so successful in both the consumer and server spaces is the scalability offered by the chiplet design. AMD has been trying to replicate that success with GPUs, but I suspect it was a case of a little too much, too soon for this generation. Looking at some of the specs, they have taken what they learnt with Ryzen and applied it to their GPUs on a much larger scale, and it actually shows what could soon be happening to their CPUs as well!

To address the final point

1) Ryzen was never designed for server applications; it was strictly desktop. Recall that at introduction back in 2017, Ryzen was the desktop CPU for personal use, Threadripper was for HEDT (high-end desktop), and Epyc was exclusively for server applications. Although reports have surfaced within the last 6 months about Epyc being used in desktop applications, CPUs designed for the desktop don't have the capacity to address RAM in the size, and at the speed, required for server applications.

2) The reason for using chiplets in CPUs (read: Ryzen) was to split the I/O function off from the main compute die, creating two smaller dies. Smaller dies yield better in wafer fabrication, which lowers the cost to manufacture the CPU, and the split was done in such a manner as to impose no performance penalty on the CPU.
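The yield argument in point 2 can be made concrete with the standard Poisson yield model, Y = exp(-A * D0). A sketch with an assumed, purely illustrative defect density:

```python
import math

def die_yield(area_cm2, defect_density):
    """Poisson yield model: P(zero defects on a die) = exp(-A * D0)."""
    return math.exp(-area_cm2 * defect_density)

D0 = 0.1  # assumed defects per cm^2; real process numbers vary

mono = die_yield(6.0, D0)  # one big 600 mm^2 monolithic die
half = die_yield(3.0, D0)  # one 300 mm^2 die after splitting off I/O

# Because dies are tested and paired after dicing, the fraction of
# usable wafer area tracks the small-die yield, not the big-die yield.
print(f"600 mm^2 die: {mono:.1%} yield; 300 mm^2 die: {half:.1%} yield")
```

In practice the I/O die is also built on a cheaper, older node than the compute die, which compounds the cost advantage.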

3) Unfortunately there is no straightforward application of chiplets to GPUs. AMD's Infinity Fabric, while fast, still introduces latency into time-sensitive operations, and pulling apart a GPU is not as easy as segmenting a CPU. If you remember, when RDNA was first introduced there was speculation about a chiplet version in the future, and AMD was quick to shoot that down. The speculation continued at the release of RDNA 2, and AMD softened its position to "never say never". Two years ago AMD let slip that it had something cooking in the way of a GPU involving chiplets. The fundamental problems with applying chiplets to GPUs have not gone away, but it appears to me the new UDNA card will be a dual-function card with an unorthodox application of chiplets: one die performing gaming tasks while the second die handles compute programs like CAD (computer-aided design). The most logical application of chiplets here is to share resources between the dies.

Neither of these two dies will be small, so it follows that the UDNA card will not be cheap. From a business standpoint it makes a great deal of sense: since the compute side of the chip handles applications used in business, AMD could charge businesses appropriately for the card, on the order of 10 to 20 thousand dollars, while a much lower, loss-leading price of 1,500 to 2,000 dollars would be charged for the gaming version. To eliminate the shenanigans of businesses buying the graphics version of the card and repurposing it for compute, the compute function could be nerfed and locked; the user would then have to call on AMD for a license key to unlock the card, after AMD receives its 18,000 bucks. The revenue from the compute cards would make the product highly profitable, and the graphics card price would be set to restrict consumption on the consumer side so as to provide a constant flow of cards to the enterprise sector.

We are still about 2 to 4 years away from that.
 
Last edited:
Joined
Dec 17, 2024
Messages
131 (1.21/day)
Location
CO
System Name Zen 3 Daily Rig
Processor AMD Ryzen 9 5900X with Optimus Foundation block
Motherboard ASUS Crosshair VIII Dark Hero
Cooling Hardware Labs 360GTX and 360GTS custom loop, Aquacomputer HighFlow NEXT, Aquacomputer Octo
Memory G.Skill Trident Z Neo 32GB DDR4-3600 (@ 3733 CL14)
Video Card(s) Nvidia RTX 3080 Ti Founders Edition with Alphacool Eisblock
Storage x2 Samsung 970 Evo Plus 2TB, Crucial MX500 1TB
Display(s) LG 42" C4 OLED
Case Lian Li O11 Dynamic
Power Supply be Quiet! Straight Power 12 1500W
Mouse Corsair Scimitar RGB Elite Wireless
Keyboard Keychron Q1 Pro
Software Windows 11 Pro
The future of RDNA is UDNA.

Are they still doing an RDNA5 before UDNA or are we going to UDNA from RDNA4? The rebrand to 9000-series kinda makes sense for next gen starting a new naming scheme (maybe UDNA?).
 
Joined
Sep 7, 2017
Messages
37 (0.01/day)
The future of RDNA is UDNA.

Are they still doing an RDNA5 before UDNA or are we going to UDNA from RDNA4? The rebrand to 9000-series kinda makes sense for next gen starting a new naming scheme (maybe UDNA?).
The fact that the new naming scheme is associated with RDNA 4 puts it in the RDNA family.

It would seem apparent that once UDNA is successfully launched, even the UDNA series name would fall away entirely.
 
Joined
Sep 7, 2017
Messages
37 (0.01/day)
The latest data from AMD out of the AI PC Summit in Beijing stated that 200,000 9070/9070 XT cards were sold worldwide at the March 8 launch. The US inventory was snapped up in a matter of hours. If all of the cards were MSRP models, the total retail value was $120 million; if 25 percent were of the premium overclocking variety, the total cash value of the lot was in excess of $130 million.

Even with a huge stock of cards on hand at launch, barely a dent was made in demand. Both AMD and its AIB partners have sent a note of encouragement to prospective customers to "sit tight, more cards are on the way", adding "new cards are coming in every week, expect the situation to stabilize in April".
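The arithmetic behind those figures works out if you assume the mix skews toward the $599 9070 XT MSRP; a quick sketch (the $200 premium for overclocked models is my assumption, not AMD's number):

```python
units = 200_000
msrp = 599                # assumed blended price, ~ the 9070 XT MSRP
base = units * msrp       # all-MSRP scenario: about $120 million

premium_share = 0.25      # 25% premium overclocked cards
premium_uplift = 200      # assumed extra dollars per premium card
total = base + int(units * premium_share * premium_uplift)

print(f"all-MSRP: ${base / 1e6:.1f}M, with premium mix: ${total / 1e6:.1f}M")
```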
 
Last edited:
Joined
Jun 19, 2024
Messages
649 (2.25/day)
System Name XPS, Lenovo and HP Laptops, HP Xeon Mobile Workstation, HP Servers, Dell Desktops
Processor Everything from Turion to 13900kf
Motherboard MSI - they own the OEM market
Cooling Air on laptops, lots of air on servers, AIO on desktops
Memory I think one of the laptops is 2GB, to 64GB on gamer, to 128GB on ZFS Filer
Video Card(s) A pile up to my knee, with a RTX 4090 teetering on top
Storage Rust in the closet, solid state everywhere else
Display(s) Laptop crap, LG UltraGear of various vintages
Case OEM and a 42U rack
Audio Device(s) Headphones
Power Supply Whole home UPS w/Generac Standby Generator
Software ZFS, UniFi Network Application, Entra, AWS IoT Core, Splunk
Benchmark Scores 1.21 GigaBungholioMarks
Intel's TMG is one sorry collection of stumblebums. Intel now has to purchase wafers from TSMC for most of its own chips because its collection of nincompoops in Oregon and Arizona can't do it. They say they are outputting on 4nm in Arizona.... for what? Yet they have to buy wafers in from TSMC.

Intel has bought chips from TSMC for over three decades. It’s not a bad thing.

Your history lesson for today.

latest data from AMD out of the AI PC Summit in Beijing stated that 200,000 9070/9070 XT cards were sold worldwide at the March 8 launch
AMD has already stated that report is false.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
44,157 (6.81/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
The future of RDNA is UDNA.

Are they still doing an RDNA5 before UDNA or are we going to UDNA from RDNA4? The rebrand to 9000-series kinda makes sense for next gen starting a new naming scheme (maybe UDNA?).
No, it's UDNA; RDNA 4 is the last iteration.

RDNA has no future though; this is its last gen :p
Nbc
 
Joined
Sep 7, 2017
Messages
37 (0.01/day)
Intel has bought chips from TSMC for over three decades. It’s not a bad thing.

Your history lesson for today.


AMD has already stated that report is false.
But what AMD has said is that first-week sales of the RX 9070/9070 XT exceeded those of ALL previous AMD GPUs over a similar launch period by a factor of 10.

Regarding the former: if Intel cannot produce a quality product to satisfy all of its own needs after 55 years in business, how can that not be "a bad thing"? It demonstrates unqualified incompetence.

The future of RDNA is UDNA.

Are they still doing an RDNA5 before UDNA or are we going to UDNA from RDNA4? The rebrand to 9000-series kinda makes sense for next gen starting a new naming scheme (maybe UDNA?).
The demand for the 9070/9070 XT is unprecedented. It will be interesting to watch what becomes of it in the coming months. If demand continues strong, AMD would be foolish not to address it. Will AMD run out of wafers before demand is satisfied? If so, would they order another run? If the answer to that question is yes, does AMD entertain the idea of implementing engineering change orders?

It's still way too early to tell, but we should have some clues by late this summer. Stay tuned.
 
Last edited:
Joined
Jun 19, 2024
Messages
649 (2.25/day)
System Name XPS, Lenovo and HP Laptops, HP Xeon Mobile Workstation, HP Servers, Dell Desktops
Processor Everything from Turion to 13900kf
Motherboard MSI - they own the OEM market
Cooling Air on laptops, lots of air on servers, AIO on desktops
Memory I think one of the laptops is 2GB, to 64GB on gamer, to 128GB on ZFS Filer
Video Card(s) A pile up to my knee, with a RTX 4090 teetering on top
Storage Rust in the closet, solid state everywhere else
Display(s) Laptop crap, LG UltraGear of various vintages
Case OEM and a 42U rack
Audio Device(s) Headphones
Power Supply Whole home UPS w/Generac Standby Generator
Software ZFS, UniFi Network Application, Entra, AWS IoT Core, Splunk
Benchmark Scores 1.21 GigaBungholioMarks
Regarding the former: if Intel cannot produce a quality product to satisfy all of its own needs after 55 years in business, how can that not be "a bad thing"? It demonstrates unqualified incompetence.

You’re going to be really blown away when you learn that automobile manufacturers regularly make products for each other.

I guess they are all incompetent too.
 
Joined
Jun 19, 2024
Messages
649 (2.25/day)
System Name XPS, Lenovo and HP Laptops, HP Xeon Mobile Workstation, HP Servers, Dell Desktops
Processor Everything from Turion to 13900kf
Motherboard MSI - they own the OEM market
Cooling Air on laptops, lots of air on servers, AIO on desktops
Memory I think one of the laptops is 2GB, to 64GB on gamer, to 128GB on ZFS Filer
Video Card(s) A pile up to my knee, with a RTX 4090 teetering on top
Storage Rust in the closet, solid state everywhere else
Display(s) Laptop crap, LG UltraGear of various vintages
Case OEM and a 42U rack
Audio Device(s) Headphones
Power Supply Whole home UPS w/Generac Standby Generator
Software ZFS, UniFi Network Application, Entra, AWS IoT Core, Splunk
Benchmark Scores 1.21 GigaBungholioMarks
Joined
Sep 7, 2017
Messages
37 (0.01/day)
No, it's UDNA; RDNA 4 is the last iteration.


Nbc
I wouldn't be so sure of that. RDNA 4 has injected a new dynamic into the equation: if you find yourself riding a hot horse, you ride that horse until you ride it into the ground. It seems apparent that AMD is going to need additional runs of wafers to satiate demand. Would those new RDNA 4+ chips have engineering changes to accommodate a future version of upscaling software (read: FSR 5)? Does AMD say no to that?
 
Last edited:
Joined
Oct 5, 2024
Messages
255 (1.41/day)
Location
United States of America
I wouldn't be so sure of that. RDNA 4 has injected a new dynamic into the equation: if you find yourself riding a hot horse, you ride that horse until you ride it into the ground. It seems apparent that AMD is going to need additional runs of wafers to satiate demand. Would those new RDNA 4+ chips have engineering changes to accommodate a future version of upscaling software (read: FSR 5)? Does AMD say no to that?
Yes, AMD will always say no to that, because the UDNA chips are already well into their design cycle (it takes years to go from a blank sheet to the final product). Making an RDNA 5 from scratch when you didn't have that plan in mind years ago would mean a massive delay to both RDNA 5 and UDNA.

FSR 5 or 6 or whatever it is called will be on UDNA platforms if it requires new hardware and could be backported to RDNA 4 if no new hardware is required.
 
Joined
Sep 7, 2017
Messages
37 (0.01/day)
Yes, AMD will always say no to that, because the UDNA chips are already well into their design cycle (it takes years to go from a blank sheet to the final product). Making an RDNA 5 from scratch when you didn't have that plan in mind years ago would mean a massive delay to both RDNA 5 and UDNA.

FSR 5 or 6 or whatever it is called will be on UDNA platforms if it requires new hardware and could be backported to RDNA 4 if no new hardware is required.
RDNA 5 is already in the works as a successor to RDNA 3.5 in laptop APUs and in the next generation of gaming consoles.

We don't know where UDNA will fit in the dGPU lineup. It could be only a halo product featuring both an RDNA die and a CDNA die on one interposer, and that will not be cheap. If RDNA 4 is a winner and selling strong, why would AMD pull the plug on it? With some minor tweaks it could keep selling well beyond the two-year cycle normally allotted to most GPUs. That would leave AMD a large cushion to release UDNA.
 
Joined
Jun 19, 2024
Messages
649 (2.25/day)
System Name XPS, Lenovo and HP Laptops, HP Xeon Mobile Workstation, HP Servers, Dell Desktops
Processor Everything from Turion to 13900kf
Motherboard MSI - they own the OEM market
Cooling Air on laptops, lots of air on servers, AIO on desktops
Memory I think one of the laptops is 2GB, to 64GB on gamer, to 128GB on ZFS Filer
Video Card(s) A pile up to my knee, with a RTX 4090 teetering on top
Storage Rust in the closet, solid state everywhere else
Display(s) Laptop crap, LG UltraGear of various vintages
Case OEM and a 42U rack
Audio Device(s) Headphones
Power Supply Whole home UPS w/Generac Standby Generator
Software ZFS, UniFi Network Application, Entra, AWS IoT Core, Splunk
Benchmark Scores 1.21 GigaBungholioMarks
RDNA 5 is already in the works as a successor to RDNA 3.5 in laptop APUs and in the next generation of gaming consoles.
Are you being willfully ignorant?

 
Joined
Sep 7, 2017
Messages
37 (0.01/day)
Are you being willfully ignorant?

No, I'm just ignoring you.

This thread is about RDNA 4 and the 9070 XT, and you came in here looking to steer the discussion away from that point. There are plenty of other places for Nvidia fanboys to salivate over their fantasies, like the clown who spent $8,500 on an Nvidia Pro 6000 and then tested it against an Nvidia RTX 5090, only to find a 5% FPS increase in Fortnite. Talk about a dope! A fool and his money...

Oh, did you read the news about Intel canceling the announced high-end Battlemage card, BMG-G31? It was published on Wccftech this morning.

 
Joined
Jun 19, 2024
Messages
649 (2.25/day)
System Name XPS, Lenovo and HP Laptops, HP Xeon Mobile Workstation, HP Servers, Dell Desktops
Processor Everything from Turion to 13900kf
Motherboard MSI - they own the OEM market
Cooling Air on laptops, lots of air on servers, AIO on desktops
Memory I think one of the laptops is 2GB, to 64GB on gamer, to 128GB on ZFS Filer
Video Card(s) A pile up to my knee, with a RTX 4090 teetering on top
Storage Rust in the closet, solid state everywhere else
Display(s) Laptop crap, LG UltraGear of various vintages
Case OEM and a 42U rack
Audio Device(s) Headphones
Power Supply Whole home UPS w/Generac Standby Generator
Software ZFS, UniFi Network Application, Entra, AWS IoT Core, Splunk
Benchmark Scores 1.21 GigaBungholioMarks
No, I'm just ignoring you.

This thread is about RDNA 4 and the 9070 XT, and you came in here looking to steer the discussion away from that point. There are plenty of other places for Nvidia fanboys to salivate over their fantasies, like the clown who spent $8,500 on an Nvidia Pro 6000 and then tested it against an Nvidia RTX 5090, only to find a 5% FPS increase in Fortnite. Talk about a dope! A fool and his money...

Oh, did you read the news about Intel canceling the announced high-end Battlemage card, BMG-G31? It was published on Wccftech this morning.

With fans like you, it explains why RDNA is dead on the desktop. There’s only so many ignorant suckers in the world.

*plonk*
 
Joined
May 7, 2023
Messages
805 (1.15/day)
Processor Ryzen 5700x
Motherboard Gigabyte Auros Elite AX V2
Cooling Thermalright Peerless Assassin SE White
Memory TeamGroup T-Force Delta RGB 32GB 3600Mhz
Video Card(s) PowerColor Red Dragon Rx 6800
Storage Fanxiang S660 1TB, Fanxiang S500 Pro 1TB, BraveEagle 240GB SSD, 2TB Seagate HDD
Case Corsair 4000D White
Power Supply Corsair RM750x SHIFT
The price of the RX 9070 is probably going to be competitive with its Nvidia counterparts in a few months' time.
New AMD Radeon cards often have the problem that their prices are less competitive at release than a few months after release.
They are much better value than their Nvidia counterparts already, even at scalper prices
 
Joined
Sep 7, 2017
Messages
37 (0.01/day)
Are you being willfully ignorant?

Here is another one for you: "Moore's Law is Dead" has released another leak video within the last 24 hours, focusing on the Medusa Point APUs. At the 19-minute mark he states the chip will have an RDNA module, not UDNA, and presumably an RDNA 5 chip given the APU's release date (mid-2027).
 
Joined
May 13, 2008
Messages
1,106 (0.18/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
he states the chip will have an RDNA module, not UDNA, and presumably an RDNA 5 chip given the APU's release date (mid-2027)
One must consider that APU graphics architecture lags behind the actual discrete GPU architecture, sometimes significantly, perhaps depending on whether the new architecture makes sense wrt power/die space/other reasons.
It would appear they are not worked on in tandem (GPU arch + APU using it); rather, a module of a completed GPU architecture is bolted onto a newer APU when it is designed. It's long been this way.
You'll find APUs constructed with Vega when we had RDNA2 on desktop, as that is just how it appears to work at AMD. If they could cut that lag it would be great, but as I said, there are probably reasons.
Things likely won't truly shift until the GPU is an MCM chiplet design (at some point), and to me that's what I call 'UDNA', even if that's not the actual definition. It would appear that won't happen until post-2027.

At least in APUs. We could see it in other markets earlier, perhaps even as APUs continue to use a more centralized design (which could be continued to be called RDNAx).

Consider that UDNA may be mostly defined by that movement to chiplets in general (used in both the MI series and discrete graphics; probably also eventually MCM APUs), either in conjunction with CPU cores or not.
I have absolutely no idea when this will be ready; it may be with the next GPU series, it may not. Whether they choose to call the next thing RDNA5 or UDNA (perhaps in conjunction with a programming model), IDK. The fact his contact referred to RDNA5 does make me wonder if, instead of chiplets, we get a couple of monolithic dies (at least at first) on 3nm, with perhaps the change to chiplets later.

At the end of the day, I don't think things will change that much from RDNA4 regardless. Cache ratio might change a little; it might be (mostly) external w/ UDNA; but the general architecture still similar I think.

I've been thinking about next-gen a lot: while it *does* make sense to make an efficient 384-bit chip on 3nm (say something like 18432sp @ 3780/36000), and chiplets yield better, other options also exist.
And monolithic could still make sense for a company like AMD. They don't *have* to attack that highest-end market (and likely won't on monolithic given cost/risk) head-on with a high-unit setup.

For instance, there are absolutely perf targets you can hit with a high-clocked 12288sp part (over 4ghz/40gbps instead of 3.7/36gbps) versus using a more-efficient lower-clocked design with similar units.
nVIDIA may use the latter to sell a higher-end part versus the lower-end one (that otherwise may satiate many 4k/1440RT [even upscaled to 4k] scenarios), especially if DLSS becomes more costly.
Theoretically the less-unit chip (even w/ high clocks) would still be able to get yield and not be priced absurdly, which I do believe is AMD's main goal (chiplets or not).
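The trade-off being described, fewer shaders at higher clocks versus more shaders at efficiency clocks, falls out of the usual peak-FP32 estimate (shaders x 2 FLOPs/clock x clock). A sketch using the unit counts and clocks from the paragraph above; these are speculative configurations, not announced parts:

```python
def peak_tflops(shaders, clock_ghz):
    """Peak FP32 TFLOPS: shaders * 2 FLOPs/clock (FMA) * clock in GHz."""
    return shaders * 2 * clock_ghz / 1000

wide_slow = peak_tflops(18432, 3.78)   # hypothetical 384-bit flagship
narrow_fast = peak_tflops(12288, 4.1)  # smaller die pushed past 4 GHz

print(f"18432sp @ 3.78 GHz: {wide_slow:.0f} TFLOPS")
print(f"12288sp @ 4.1 GHz: {narrow_fast:.0f} TFLOPS")
```

The smaller die gives up peak throughput but wins on yield and cost, which is the wager being described here.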

Part of me wonders if this indeed will happen; if AMD will let nVIDIA make their 18432sp part to replace 5090, and they'll happily just make a better 4090 w/ more perf than a '6080'.
That way they would be staying in their known markets and not venturing into the very small market of >$1000 GPUs, while for all intents and purposes covering the same realistic goals (and majority of folks).
I guess it would depend how well a super high-end part would accomplish native 4k RT or maybe 1440p (and up-scaled to 4k) PT. I don't think AMD will target more than 1080p (upscaled to 4k) PT.

If you look at the performance of 4090 (where it is now ~55fps), you could imagine (increasing) scenarios where a (efficiently-clocked) 12288sp part would not necessarily hold that setting, even w/ an OC.
But perhaps if AMD shoots for higher clocks it may in fact hold those scenarios. It may use more power, but perhaps serve the overwhelming majority of the market better. IDK if that will happen, but it's possible.

I've got to believe information on whatever they are launching in 2026 will leak soon (and how that coincides with what the next-gen consoles will use), given tape-out expected by EOY and MP early next.
And hopefully we get some insight into the long-term of what's coming after that as well (as-in if there is indeed monolithic chips coming and then an eventual chiplet design in 2027 or later).

TBH, the most exciting part of that video is the idea of AMD using N2X for CPU chiplets; I've wondered if they'd do something like that for a long time. Glad to hear it would appear they are. Yay.

Now imagine them doing something like that for UDNA; where I/O chip and cache are perhaps 3/4/5nm, but GPU chiplets a heavily-advanced process; even if it could only yield small chips. Exciting!
That's exactly what they have to do to compete with nVIDIA imho: they need to find a way to deliver highest-end performance with a lower cost/risk; as they currently find themselves a tough sell for those users.
First they're going for the kill wrt Intel, apparently...but I could absolutely see them doing the same in GPUs. And TBH, they need to do that imho.
 
Last edited: