
AMD RDNA 5 a "Clean Sheet" Graphics Architecture, RDNA 4 Merely Corrects a Bug Over RDNA 3

Joined
Nov 11, 2016
Messages
3,411 (1.16/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
Yeah, I would really like to see the BOM cost; if it was high, it would make me feel better lol.

Not sure if we can take MLID's word for it, but he said the BOM cost of the 4090 is like $1,100; Nvidia is holding to their 60% margins LOL.
 
Joined
Sep 10, 2018
Messages
6,912 (3.05/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
Not sure if we can take MLID's word for it, but he said the BOM cost of the 4090 is like $1,100; Nvidia is holding to their 60% margins LOL.


$1,000-1,100 sounds about right honestly, given that the cooler alone probably cost way too much, $200-ish.
 
$1,000-1,100 sounds about right honestly, given that the cooler alone probably cost way too much, $200-ish.

Next up, Nvidia charges 70% margins while Radeon charges 30%; who needs gamers anyway, right :D
 
Next up, Nvidia charges 70% margins while Radeon charges 30%; who needs gamers anyway, right :D

I think the 5080 will be overpriced again, in the $1,200 range, with the 5090 looking great at $1,600-1,800 but 50-60% faster....
 
Joined
Dec 25, 2020
Messages
6,722 (4.70/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Not sure if we can take MLID's word for it, but he said the BOM cost of the 4090 is like $1,100; Nvidia is holding to their 60% margins LOL.

I think MLID is wrong; that $1,100 estimate is way, way too high. They should be much cheaper to manufacture than that. What you're paying for is not only a high margin, it's also the cost of software development over time (driver engineers cost money) and their hardware R&D costs.

If I had to shoot blindly, I'd shoot at around ~$400 for a finished, packaged 4090 FE.

I think the 5080 will be overpriced again, in the $1,200 range, with the 5090 looking great at $1,600-1,800 but 50-60% faster....

Well, if it provides a sufficient lead in performance, efficiency, or feature set, it definitely will.
 
Joined
Jul 13, 2016
Messages
3,279 (1.07/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
The duopoly must continue; Nvidia is pricing their gaming GPUs just high enough to make sure of that.

It's so easy for AMD and Nvidia to figure out their competitor's minimum prices, given that they share the same chip manufacturer (TSMC), the same GDDR manufacturer (Samsung), and the same PCB manufacturers (AIBs).

Who knows, perhaps Nvidia will charge higher margins next-gen, just so Radeon can improve their terrible margins.

Aside from the 5090, I don't think there's much more Nvidia can charge. They've already priced their products at what the market will bear; there's only so much money regular consumers have to spend on a graphics card. It's more likely that Nvidia will give customers less rather than charge more; it's shrinkflation, basically. Of course, it's possible that Nvidia increases prices anyway, because frankly they'd be just fine selling more dies to the AI and enterprise markets.

Hard to tell what's going to happen this gen, although I do agree it's likely AMD and Nvidia will price around each other again instead of competing. Intel is a wildcard as well; they might have some presence in the midrange if they get a decent uArch out the door.
 
I think MLID is wrong; that $1,100 estimate is way, way too high. They should be much cheaper to manufacture than that. What you're paying for is not only a high margin, it's also the cost of software development over time (driver engineers cost money) and their hardware R&D costs.

If I had to shoot blindly, I'd shoot at around ~$400 for a finished, packaged 4090 FE.

Well, the AD102 chip alone already cost $300-400 per chip (using a silicon cost calculator), not to mention GDDR6 cost $13 per GB back in 2022 (GDDR6 prices have fallen since then). A $1,000-1,100 BOM cost on the 4090 back in 2022 is quite a realistic figure; the BOM cost is probably less in 2024, but not in the $400-500 range. Selling AD102 in workstation cards (like the $7,000 RTX 6000 Ada) increases the profit margins massively.

A quick Google search shows Nvidia has 29,600 employees, while AMD has 26k and Intel has 120k (that's some impressive revenue generated per employee at Nvidia). This means Nvidia's software development costs should not be that much higher than AMD's.

TL;DR: Nvidia can maintain super high margins because they use everything effectively.
 
Well, the AD102 chip alone already cost $300-400 per chip (using a silicon cost calculator), not to mention GDDR6 cost $13 per GB back in 2022 (GDDR6 prices have fallen since then). A $1,000-1,100 BOM cost on the 4090 back in 2022 is quite a realistic figure; the BOM cost is probably less in 2024, but not in the $400-500 range. Selling AD102 in workstation cards (like the $7,000 RTX 6000 Ada) increases the profit margins massively.

A quick Google search shows Nvidia has 29,600 employees, while AMD has 26k and Intel has 120k (that's some impressive revenue generated per employee at Nvidia). This means Nvidia's software development costs should not be that much higher than AMD's.

TL;DR: Nvidia can maintain super high margins because they use everything effectively.

Mm, it's that we don't know the wafer prices, the cost of packaging and assembly, and all that. Most of the other components you can extrapolate from bulk pricing (it's cheaper than what you'd pay through Mouser or Digikey, even in large quantities, but how much cheaper exactly?), and PCB costs also come down with bulk purchases... it's hard without insider info, but I have a relatively hard time picturing each AD102 going for $300+ at this point in time. It's two-year-old silicon, after all. Besides, 4090s have a megaton of disabled cache and cores; I'd wager most of the bad AD102s are going into 4090s as is.

It's really complicated.
 
Mm, it's that we don't know the wafer prices, the cost of packaging and assembly, and all that. Most of the other components you can extrapolate from bulk pricing (it's cheaper than what you'd pay through Mouser or Digikey, even in large quantities, but how much cheaper exactly?), and PCB costs also come down with bulk purchases... it's hard without insider info, but I have a relatively hard time picturing each AD102 going for $300+ at this point in time. It's two-year-old silicon, after all. Besides, 4090s have a megaton of disabled cache and cores; I'd wager most of the bad AD102s are going into 4090s as is.

It's really complicated.

They pay the same amount per die regardless of what they have to disable... It's a per-wafer cost, not per-die.
 
Mm, it's that we don't know the wafer prices, the cost of packaging and assembly, and all that. Most of the other components you can extrapolate from bulk pricing (it's cheaper than what you'd pay through Mouser or Digikey, even in large quantities, but how much cheaper exactly?), and PCB costs also come down with bulk purchases... it's hard without insider info, but I have a relatively hard time picturing each AD102 going for $300+ at this point in time. It's two-year-old silicon, after all. Besides, 4090s have a megaton of disabled cache and cores; I'd wager most of the bad AD102s are going into 4090s as is.

It's really complicated.

Well, from 2022 data, TSMC charged per wafer:
$10k for 7nm
$16k for 5nm
At 70% yield, each AD102 should cost ~$260.
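The napkin math above can be sketched in a few lines. The standard dies-per-wafer approximation plus this thread's assumptions (a ~609 mm² AD102, a $16k 5nm-class wafer, 70% yield) lands on roughly the same ~$260 figure; real foundry pricing and defect-density yield models are more involved, so treat this as an illustration only.

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Common approximation: gross wafer area over die area, minus an edge-loss term."""
    r = wafer_diameter_mm / 2
    return math.floor(
        math.pi * r**2 / die_area_mm2
        - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    )

def cost_per_good_die(wafer_price_usd: float, die_area_mm2: float, yield_rate: float) -> float:
    """Spread the wafer price over only the dies that actually work."""
    good_dies = math.floor(dies_per_wafer(die_area_mm2) * yield_rate)
    return wafer_price_usd / good_dies

# AD102 is ~609 mm^2; thread assumes a $16k wafer and 70% yield
print(round(cost_per_good_die(16_000, 609, 0.70)))  # prints 258, close to the ~$260 quoted
```

At an 80% yield the same math gives around $225 per die, which is roughly the "$200-ish" figure mentioned later in the thread.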
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.65/day)
Location
Ex-usa | slava the trolls
Do you think chiplets are about gamers? Far from it. The post you replied to demonstrates that it's a cost-saving technique, nothing else. Better yields on smaller chips, the ability to link chips made on different nodes, etc.

1. Err, I am against chiplets if I have to sacrifice a significant amount of performance.
2. Like I said previously, they can use older processes with higher-IPC architectures in order to offset the transistor count deficit of using an older process.
And still, no one will stop them from making a second revision of Navi 31 with a larger, ~700 mm² monolithic die, transferring the cost onto gamers as far as possible, and putting the profit margin at negative values or around zero, like they have already done with the consoles.
 
Joined
Apr 19, 2018
Messages
1,227 (0.51/day)
Processor AMD Ryzen 9 5950X
Motherboard Asus ROG Crosshair VIII Hero WiFi
Cooling Arctic Liquid Freezer II 420
Memory 32Gb G-Skill Trident Z Neo @3806MHz C14
Video Card(s) MSI GeForce RTX2070
Storage Seagate FireCuda 530 1TB
Display(s) Samsung G9 49" Curved Ultrawide
Case Cooler Master Cosmos
Audio Device(s) O2 USB Headphone AMP
Power Supply Corsair HX850i
Mouse Logitech G502
Keyboard Cherry MX
Software Windows 11
Not sure if we can take MLID's word for it, but he said the BOM cost of the 4090 is like $1,100; Nvidia is holding to their 60% margins LOL.
I wouldn't mind betting a year's salary on it being much closer to $600-700 to manufacture and ship, if you assume it's a Founders card direct from nVidia. Not talking about profits, R&D, driver dev, etc.

$120 for the PCB and all components
$300 for the chip
$150 for the memory
$60 for the cooler
$20 for the packaging
$5 accessories
$15 shipping
=$670

Obviously, this is manufacturer-specific, as the larger ones will get these items cheaper, and we don't know what the relationship is between nGreedia and the OEMs, or how much they charge to supply a GPU chip. But we can easily assume nGreedia charges at least double to the OEMs. So $1,100 could well be it for the OEMs, as they probably get the PCB, components, and memory far cheaper than I stated.
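As a sanity check on the margin claims floating around this thread: gross margin is just (price − cost) / price. Taking the $670 BOM estimate above against the 4090's $1,599 launch MSRP gives roughly the 60% figure mentioned earlier, while the $1,100 MLID figure would imply a much thinner margin on the card itself. A quick sketch (the BOM numbers are this thread's guesses, not disclosed figures):

```python
def gross_margin(price: float, cost: float) -> float:
    """Fraction of the selling price kept after direct costs."""
    return (price - cost) / price

msrp = 1599  # RTX 4090 launch MSRP
for bom in (670, 1100):  # this post's estimate vs. the MLID figure quoted earlier
    print(f"BOM ${bom}: {gross_margin(msrp, bom):.0%} gross margin")
# BOM $670: 58% gross margin
# BOM $1100: 31% gross margin
```

Which is why the BOM estimate matters so much to this argument: a $670 BOM is consistent with ~60% margins at MSRP, while an $1,100 BOM is not.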
 
Not sure if we can take MLID's word for it, but he said the BOM cost of the 4090 is like $1,100; Nvidia is holding to their 60% margins LOL.

I wouldn't mind betting a year's salary on it being much closer to $600-700 to manufacture and ship, if you assume it's a Founders card direct from nVidia. Not talking about profits, R&D, driver dev, etc.

$120 for the PCB and all components
$300 for the chip
$150 for the memory
$60 for the cooler
$20 for the packaging
$5 accessories
$15 shipping
=$670

Obviously, this is manufacturer-specific, as the larger ones will get these items cheaper, and we don't know what the relationship is between nGreedia and the OEMs, or how much they charge to supply a GPU chip. But we can easily assume nGreedia charges at least double to the OEMs. So $1,100 could well be it for the OEMs.

I would correct it even further:

$90 for the PCB and all components
$200 for the chip
$100 for the memory
$60 for the cooler
$10 for the packaging
$10 for accessories
$15 for shipping
= $485
 
I would correct it even further:

$90 for the PCB and all components
$200 for the chip
$100 for the memory
$60 for the cooler
$10 for the packaging
$10 for accessories
$15 for shipping
= $485
Yeah, I guess if we assume an 80% chip yield, the chip would be around the $200-ish point, so you are probably closer. I'm sure the big OEMs have huge discounts on everything but the GPU itself, compared to the 1,000-unit prices we get to see, which are designed to make us feel false value in our purchases.
 
Yeah, I guess if we assume an 80% chip yield, the chip would be around the $200-ish point, so you are probably closer. I'm sure the big OEMs have huge discounts on everything but the GPU itself, compared to the 1,000-unit prices we get to see, which are designed to make us feel false value in our purchases.

Also, the chip used is a salvaged chip with disabled shaders, and Nvidia probably pays for the working dies, not for the wasted wafers.
 
Joined
Feb 20, 2019
Messages
8,277 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
I think the 5080 will be overpriced again, in the $1,200 range, with the 5090 looking great at $1,600-1,800 but 50-60% faster....
Based on the past trend of the 3090, and 4090 pricing over the product cycle so far, the 4090 is going to stay at $2,000 when the 5090 launches at a new, even higher price.

I'd love to be wrong, but Nvidia charges what people will pay, and people will continue to pay $2,000 for a 4090 regardless of what else is out there. Nvidia are not going to, say, undermine their own profits on their current highest-margin part just because they've developed an even more profitable one! That's basic free-market capitalism, which is Nvidia's bible.
 
Joined
Jun 1, 2010
Messages
380 (0.07/day)
System Name Very old, but all I've got ®
Processor So old, you don't wanna know... Really!
Please don't beat me up, this is just my personal take.
GPU compute for the datacenter and AI isn't particularly latency-sensitive, so the latency penalty of a chiplet MCM approach is almost irrelevant, and the workloads benefit hugely from the raw compute bandwidth.

GPU for high-fps gaming is extremely latency-sensitive, so the latency penalty of chiplet MCM is 100% a total dealbreaker.

AMD hasn't solved/evolved the inter-chiplet latency well enough for them to be suitable for a real-time graphics pipeline yet, but that doesn't mean they won't.
Yes. This is why they put band-aids on RDNA 3 in the form of RDNA 4, while doing RDNA 5 from scratch.
Also, I guess AI isn't that latency-sensitive because those products have tons of HBM memory, which mitigates part of the issue.
You may be disappointed to hear that by the time any such bubble pops, they will remain a multi-trillion-dollar corporation.

$NVDA is priced as it is because they provide both the hardware and software tools for AI companies to develop their products. OpenAI, for example, is a private corporation (similar to Valve), and AI is widely considered to be in its infancy. It's a lesson in not mocking a solid ecosystem.
Indeed. They have already amassed so much wealth that even if the bubble burst today, they would be able to calmly sip drinks while having a warm bath. They just want to increase the margins even more while they can.
Also, OpenAI recently got some HW from JHH, so I doubt they are that "Open" after all. Not to mention the data sellout to MS, etc. If the AI guys want any progress, they should do something really independent, as the cartel lobby has already been established.
Never, for sure.
It's simply a question of cost because low end parts need to be cheap, which means using expensive nodes for them makes absolutely zero sense.

I can confidently say that it's not happened in the entire history of AMD graphics cards, going back to the early ATi Mach cards, 35 years ago!
Look at the column for manufacturing node; the low end of each generation is always last year's product rebranded, or, if it's actually a new product rather than a rebrand, it's always an older process node to save money.

So yes, please drop it. I don't know how I can explain it any more clearly to you. Low end parts don't get made on top-tier, expensive, flagship manufacturing nodes, because it's simply not economically viable. Companies aiming to make a profit will not waste their limited quantity of flagship node wafer allocations on low-end shit - that would be corporate suicide!

If Pirelli came across a super-rare, super-expensive, extra-sticky rubber but there was a limited quantity of the stuff, they could use it to make 1,000 of the best Formula 1 racing tyres ever seen and give their brand a huge marketing boost and recognition, OR they could waste it making 5,000 more boring, cheap, everyday tyres for commuter workhorse cars like your grandma's Honda Civic.
True. But if you recall events that far back, you can also see that these older nodes were always the bread and butter, at least for AMD, and for nVidia until the Ada generation. There's nothing wrong with having simpler SKUs made from lower-end chips on cheaper, stable nodes. Heck, even nVidia managed to produce and sell dozens of millions of hot-garbage chips on Samsung's dog-shit 8nm (10nm) node.
What is expensive today will not necessarily be expensive tomorrow. Wafer prices fall; N4 will be ancient technology in 5 or 10 years.
Saying never means you must have an alternative in mind? What is it? Making the RX 7600 on 6nm for 20 more years?

Not for 20 years, but if the older, less refined node doesn't hinder performance and power efficiency, then IMHO it's quite a viable solution. It's better to sell more cards akin to the 7600 on N6 than to make a few expensive, broken top-end chips on the finest node that nobody wants to buy.
Ohhh, you mean on N4 once N4 is old and cheap?
Sure, that'll eventually happen. That's where N6 is right now - but it's not relevant to this discussion, is it?
Why not? At least for AMD it's still relevant, since they've hit an invisible wall/threshold in their GPU architecture where the node doesn't bring an advantage anymore, at least for current products. I would even dare to say that if AMD had made the entire Radeon RX 7000 series monolithic and on 6nm, it would have been more viable than the broken 5nm MCM. And it would have bought them time to fix and refine their MCM approach so that RDNA 4 would have been bug-free.
This is especially essential in view of the current horrible situation with TSMC allocations, where all the top nodes were completely consumed by Apple and by nVidia with its "AI"-oriented chips. So, e.g., making something decent that still isn't sensitive to being on older nodes.

Don't get me wrong. I'm all for stopping the manufacturers from farting out broken, inferior chips and products for the sake of profits, especially since it takes a lot of materials and resources that could otherwise be put into more advanced, more stable, and more powerful products. But there should be some middle ground.
At least some portion of "inferior" older N6 etc. products could be made, at reasonable prices, just to meet demand as a temporary solution, since so many people are sitting on ancient HW that needs to be replaced but are holding off on the purchase, as only overpriced and pointless products fill the market.
Yeah, I would really like to see the BOM cost; if it was high, it would make me feel better lol.
Everyone would. But that won't happen anywhere near soon. There's a reason margins are about 60% for nVidia, and were for AMD until recently.
They won't disclose it, as it would shatter the "premium" brand image that they have both managed to maintain despite being called out for their shenanigans. It has happened many times that nVidia turned out to have cheaped out on the design while still asking a huge premium. Until both nVidia's and AMD's reputation, public image, and blind following shatter, nothing will change.
I think the 5080 will be overpriced again, in the $1,200 range, with the 5090 looking great at $1,600-1,800 but 50-60% faster....
I guess nVidia won't make it "cheaper" or sell it for the same price. They made it perfectly clear about five years ago that they would stack their newer, more powerful solutions above previous-gen stuff while keeping the previous gen's prices, since newer is greater, thus more expensive. I can't find the reference, but AFAIR it was around the RTX inception.
Aside from the 5090, I don't think there's much more Nvidia can charge. They've already priced their products at what the market will bear; there's only so much money regular consumers have to spend on a graphics card. It's more likely that Nvidia will give customers less rather than charge more; it's shrinkflation, basically. Of course, it's possible that Nvidia increases prices anyway, because frankly they'd be just fine selling more dies to the AI and enterprise markets.

Hard to tell what's going to happen this gen, although I do agree it's likely AMD and Nvidia will price around each other again instead of competing. Intel is a wildcard as well; they might have some presence in the midrange if they get a decent uArch out the door.
Regular consumers, no. But there are a lot of crypto substitutes, AKA "AI", that would gladly buy any compute power for any money. As would the dumb rich folks and YT influencers, who create a public image of this being "acceptable".
1. Err, I am against chiplets if I have to sacrifice a significant amount of performance.
2. Like I said previously, they can use older processes with higher-IPC architectures in order to offset the transistor count deficit of using an older process.
And still, no one will stop them from making a second revision of Navi 31 with a larger, ~700 mm² monolithic die, transferring the cost onto gamers as far as possible, and putting the profit margin at negative values or around zero, like they have already done with the consoles.
Sadly, MCM won't go anywhere, since it means higher profit margins for AMD, and they would do anything to keep it that way. Although it's cheaper to produce, that doesn't manifest itself in the final pricing.
 
Err, I am against chiplets if I have to sacrifice a significant amount of performance.

The University of Toronto's paper demonstrates that the benefits of chiplets scale as you increase the number of CPU cores, including lower latency than a monolithic chip if an active interposer is used. I'd imagine that applies to GPUs as well, a market that would be more tolerant of the higher price tag of an active interposer.

For large silicon products, chiplets are a must-have: the cost savings, increased yields, the ability to modularize your product (the benefits of which could be a subject of their own), and better binning (you bin every chiplet you make, not just a specific SKU), among other things, make them a revolution in chip design.

Interposers will also prove a very fruitful field for performance gains in the near future, as latency, bandwidth, and energy usage, among other factors, improve over time, enabling more chiplets at lower latencies, using less energy to transmit data between said chiplets.
 
Indeed. They have already amassed so much wealth that even if the bubble burst today, they would be able to calmly sip drinks while having a warm bath. They just want to increase the margins even more while they can.
Also, OpenAI recently got some HW from JHH, so I doubt they are that "Open" after all. Not to mention the data sellout to MS, etc. If the AI guys want any progress, they should do something really independent, as the cartel lobby has already been established.

Oh, they're far from "Open" nowadays. Elon Musk is even suing the company for breach of its founding contract, since it was intended to be a non-profit organization and nowadays it's very much a for-profit.
 
Joined
Jan 14, 2019
Messages
12,337 (5.76/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
1. Err, I am against chiplets if I have to sacrifice a significant amount of performance.
2. Like I said previously, they can use older processes with higher-IPC architectures in order to offset the transistor count deficit of using an older process.
And still, no one will stop them from making a second revision of Navi 31 with a larger, ~700 mm² monolithic die, transferring the cost onto gamers as far as possible, and putting the profit margin at negative values or around zero, like they have already done with the consoles.
1. Chiplet or no chiplet, I don't care as long as it works and doesn't cost an arm and a leg.
2. Older processes would throw efficiency out of the window. Do you want a 400-450+ Watt 7900 XT? I don't.
 
1. Chiplet or no chiplet, I don't care as long as it works and doesn't cost an arm and a leg.
2. Older processes would throw efficiency out of the window. Do you want a 400-450+ Watt 7900 XT? I don't.

The one thing that confuses me: if RDNA 4 is just fixed RDNA 3 with better RT, why not do a full lineup? Maybe chiplets were so bad for them that they don't want to make a large die is all I can think of. Going back to the RDNA 1 game plan just seems defeatist to me.
 
The one thing that confuses me: if RDNA 4 is just fixed RDNA 3 with better RT, why not do a full lineup? Maybe chiplets were so bad for them that they don't want to make a large die is all I can think of. Going back to the RDNA 1 game plan just seems defeatist to me.
I read somewhere that there are two teams working on GPUs at AMD. The one that worked on RDNA 2 is working on RDNA 4, and the one that worked on RDNA 3 is working on RDNA 5. I don't know how true this is, though, and I can't remember the source.

Edit: Maybe AMD saw that chiplets aren't good for a GPU yet, so they put the project on hold until they figure things out (speculation).
 
I read somewhere that there are two teams working on GPUs at AMD. The one that worked on RDNA 2 is working on RDNA 4, and the one that worked on RDNA 3 is working on RDNA 5. I don't know how true this is, though, and I can't remember the source.

Edit: Maybe AMD saw that chiplets aren't good for a GPU yet, so they put the project on hold until they figure things out (speculation).

Part of me wonders if RDNA 4 is only a thing because Sony wanted better RT and some sort of bespoke AI silicon, and AMD figured they might as well use a similar but larger version on desktop.
 
Joined
Jul 13, 2016
Messages
3,279 (1.07/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
The one thing that confuses me: if RDNA 4 is just fixed RDNA 3 with better RT, why not do a full lineup? Maybe chiplets were so bad for them that they don't want to make a large die is all I can think of. Going back to the RDNA 1 game plan just seems defeatist to me.

The only way it makes sense to me is if AMD is going to have multiple GCDs on a single GPU / product (possibly AMD has other enterprise products combining CPU / XCU / GCD). Otherwise, as you say, there's no reason AMD couldn't have just refined the 7000 series. Of course, there's no guarantee that multiple GCDs are coming to the consumer market even in that case; it could be that AMD can do it with CoWoS packaging but not with an organic substrate, which would restrict it to enterprise only. That would make sense if AMD's strategy is to rapidly push for gains in the AI / enterprise markets. The downside is that it'll undoubtedly take a hit in the consumer market unless it prices aggressively.
 
The only way it makes sense to me is if AMD is going to have multiple GCDs on a single GPU / product (possibly AMD has other enterprise products combining CPU / XCU / GCD). Otherwise, as you say, there's no reason AMD couldn't have just refined the 7000 series. Of course, there's no guarantee that multiple GCDs are coming to the consumer market even in that case; it could be that AMD can do it with CoWoS packaging but not with an organic substrate, which would restrict it to enterprise only. That would make sense if AMD's strategy is to rapidly push for gains in the AI / enterprise markets. The downside is that it'll undoubtedly take a hit in the consumer market unless it prices aggressively.

It would be interesting to see how a B100 gaming variant would work with basically 2 GPU dies... Guessing it's not practical for gaming from a cost perspective, but they could bring back the Titan and charge 3-4k for it, assuming it works lol... Not holding my breath.
 