
AMD Radeon RX 9070 Series Technical Deep Dive

Joined
Mar 23, 2012
Messages
605 (0.13/day)
Processor Intel i9-9900KS @ 5.2 GHz
Motherboard Gigabyte Z390 Aorus Master
Cooling Corsair H150i Elite
Memory 32GB Viper Steel Series DDR4-4000
Video Card(s) RTX 3090 Founders Edition
Storage 2TB Sabrent Rocket NVMe + 2TB Intel 960p NVMe + 512GB Samsung 970 Evo NVMe + 4TB WD Black HDD
Display(s) 65" LG C9 OLED
Case Lian Li O11D-XL
Audio Device(s) Audeze Mobius headset, Logitech Z906 speakers
Power Supply Corsair AX1000 Titanium
simple gamers buy what is best when it comes to their purchasing power
It's a small sample size, but all the people I know IRL who play games on PC are "simple gamers" who don't get involved in tech forums / etc. And all of them - for years - buy whatever Nvidia GPU is within their budget. None of them have ever purchased a GPU more expensive than around $400. They don't even think about AMD. I'm only beginning to hear them even think about AMD CPUs and AMD has been better than Intel for enthusiasts for years.

I don't think people on these forums recognize what a niche bubble the online tech space is. The 9070 XT is between fine and great for this bubble, depending on your expectations and what measuring stick you use.

For the "simple gamer"? This isn't going to do anything.
 
Joined
Dec 26, 2006
Messages
3,995 (0.60/day)
Location
Northern Ontario Canada
Processor Ryzen 5700x
Motherboard Gigabyte X570S Aero G R1.1 BiosF5g
Cooling Noctua NH-C12P SE14 w/ NF-A15 HS-PWM Fan 1500rpm
Memory Micron DDR4-3200 2x32GB D.S. D.R. (CT2K32G4DFD832A)
Video Card(s) AMD RX 6800 - Asus Tuf
Storage Kingston KC3000 1TB & 2TB & 4TB Corsair MP600 Pro LPX
Display(s) LG 27UL550-W (27" 4k)
Case Be Quiet Pure Base 600 (no window)
Audio Device(s) Realtek ALC1220-VB
Power Supply SuperFlower Leadex V Gold Pro 850W ATX Ver2.52
Mouse Mionix Naos Pro
Keyboard Corsair Strafe with browns
Software W10 22H2 Pro x64
Hopefully circumstances will never again arise where a GPU like the 6400/6500 series has to exist as a discrete GPU.

IMO they're fantastic GPUs for their intended purpose (providing extra shader power to weak integrated Radeon laptop graphics that already have a full media engine and all the expected display outputs).

They were never designed to be discrete GPUs, because they're not a complete GPU by modern expectations. I bet the engineers who worked on them cry themselves to sleep at night knowing how their great idea was misused in the most inappropriate and disappointing way imaginable :\
Ya, but with today's market, a 4GB card with a chip like that for $100 would be a nice option: for an HTPC, or as a replacement for a CPU with no iGPU if needed, especially if your card dies and you don't have $600 to get a new GPU.

I remember my mom's old PC's GPU dying during COVID; I had to get a GT 1030 2GB for $130 because the next card up was $900 :|
 
Joined
May 11, 2022
Messages
55 (0.05/day)
System Name LAMP 2017
Processor Intel Core i7-7700K
Motherboard MSI Z270I GAMING PRO CARBON AC
Cooling Noctua NH-U9S
Memory Crucial 2x8 GB DDR4-2400 CL17
Video Card(s) EVGA GTX 1080 Ti SC2 HYBRID
Storage WD Black SN850X 4TB, Samsung 870 QVO 8TB
Display(s) Alienware AW3821DW
Case NCASE M1 V2
Audio Device(s) RME ADI-2 Pro, Neumann KH120A, Sennheiser HD 660S2
Power Supply Corsair SF750 Platinum
Mouse Logitech G303 Shroud Edition
Keyboard (varies)
This was a very informative article summarizing everything we know about the 9070 cards for now. Eagerly awaiting the reviews next week!
 
Joined
Jan 27, 2015
Messages
1,779 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
We're comparing price/performance using die sizes for dies that were stable and performant, right? Just figured we'd take it to its logical conclusion. It's $875 adjusted for inflation, so a 45% increase in price for a 68% increase in die size means it's a better value. N4C is also mature: it's the third revision of TSMC's 5nm process and is cheaper to produce than their high-end nodes.

While we're comparing die sizes, a more relevant look would be at last gen's 7800 XT - the 9070s are a value regression compared to that card.

Die size is a frankly stupid way to look at it.

What does just the wafer at an N4 cost vs N5 vs N7? What was the initial capital investment for the Fab?

Cost per transistor is a better way to measure it, and that hasn't changed much in a decade. In fact, it has been going up.

And these new GPUs have a *lot* more transistors in them than GPUs from a decade ago, or even just a few years ago.

For example, the 7700XT has ~28B transistors @ N5 node, while the 6700XT had ~18B @ N7 node. And the cost per transistor went up, so.. you get the picture.
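To make the cost-per-transistor point concrete, here's a back-of-envelope sketch. The wafer prices and achieved densities below are illustrative assumptions on my part, not TSMC's actual quotes:

```python
# Back-of-envelope cost per billion transistors at two nodes.
# Wafer prices and densities here are ILLUSTRATIVE ASSUMPTIONS, not real quotes.

WAFER_AREA_MM2 = 70_000  # usable area of a 300 mm wafer, roughly

def usd_per_billion_transistors(wafer_price_usd, density_mtr_per_mm2):
    """Wafer price divided by how many billion transistors fit on the wafer."""
    transistors_b = WAFER_AREA_MM2 * density_mtr_per_mm2 / 1_000  # MTr -> BTr
    return wafer_price_usd / transistors_b

n7 = usd_per_billion_transistors(9_000, 91)    # assumed N7 wafer price / density
n5 = usd_per_billion_transistors(17_000, 134)  # assumed N5 wafer price / density
print(f"N7: ${n7:.2f}/BTr, N5: ${n5:.2f}/BTr")
# Under these assumptions the newer node is NOT cheaper per transistor.
```

Under any plausible set of numbers the takeaway is the same: wafer price has risen faster than density lately, so cost per transistor is flat or rising.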

 
Joined
Oct 6, 2021
Messages
1,764 (1.42/day)
System Name Raspberry Pi 7 Quantum @ Overclocked.
Die size is a frankly stupid way to look at it.

What does just the wafer at an N4 cost vs N5 vs N7? What was the initial capital investment for the Fab?

Cost per transistor is a better way to measure it, and that hasn't changed much in a decade. In fact, it has been going up.

And these new GPUs have a *lot* more transistors in them than GPUs from a decade ago, or even just a few years ago.

For example, the 7700XT has ~28B transistors @ N5 node, while the 6700XT had ~18B @ N7 node. And the cost per transistor went up, so.. you get the picture.

SRAM can no longer keep pace with the growing density of logic, and has become the hardest component to scale in modern designs that demand ever-larger caches. At times, achieving a 50% performance gain requires nearly doubling the transistor count, which only further distorts the cost picture.

Cumulative Scaling (28nm to 3nm):
  • Logic: ~9.8x density increase (0.10x area).
  • SRAM (N3B): ~3.42x density increase (0.29x area).
  • SRAM (N3E): ~3.26x density increase (0.31x area).


Next, we must consider the development cost per chip. These costs are far from negligible. How many units must be sold to break even?

- Estimated Design Costs
Industry reports (IBS, Gartner) provide ballpark figures for total chip design costs (including EDA tools, IP licensing, engineering, and tape-out) at advanced nodes. These are rough averages and vary by chip complexity (like CPU vs. simple ASIC):

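The logic/SRAM area factors above imply that how much a die actually shrinks depends on its logic-to-SRAM mix. A quick sketch of the weighted shrink, using the ~0.10x logic and ~0.29x SRAM (N3B) factors quoted:

```python
# Effective area shrink (28nm -> N3B) for a die that is part logic, part SRAM,
# using the area factors quoted above: logic ~0.10x, SRAM ~0.29x.

def effective_shrink(logic_fraction, logic_factor=0.10, sram_factor=0.29):
    """Area-weighted shrink factor; smaller means a bigger shrink."""
    sram_fraction = 1.0 - logic_fraction
    return logic_fraction * logic_factor + sram_fraction * sram_factor

for logic_frac in (0.9, 0.7, 0.5):
    f = effective_shrink(logic_frac)
    print(f"{logic_frac:.0%} logic -> die shrinks to {f:.2f}x of its 28nm area")
```

A cache-heavy GPU die shrinks noticeably less than the headline logic-density figure suggests, which is exactly the distortion described above.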
 
Joined
Dec 31, 2020
Messages
1,203 (0.79/day)
Processor E5-4627 v4
Motherboard VEINEDA X99
Memory 32 GB
Video Card(s) 2080 Ti
Storage NE-512
Display(s) G27Q
Case MATREXX 50
Power Supply SF850L
How do you explain that the GCD absorbs the 4x MCDs (2,050 million transistors, with 16 MB Infinity Cache at ~50 MTr/mm²) and overall density still remains ~150 MTr/mm²? Oh, it's N7.
 
Joined
Jan 27, 2015
Messages
1,779 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
SRAM can no longer keep pace with the growing density of logic, and has become the hardest component to scale in modern designs that demand ever-larger caches. At times, achieving a 50% performance gain requires nearly doubling the transistor count, which only further distorts the cost picture.

Cumulative Scaling (28nm to 3nm):
  • Logic: ~9.8x density increase (0.10x area).
  • SRAM (N3B): ~3.42x density increase (0.29x area).
  • SRAM (N3E): ~3.26x density increase (0.31x area).

Next, we must consider the development cost per chip. These costs are far from negligible. How many units must be sold to break even?

- Estimated Design Costs
Industry reports (IBS, Gartner) provide ballpark figures for total chip design costs (including EDA tools, IP licensing, engineering, and tape-out) at advanced nodes. These are rough averages and vary by chip complexity (like CPU vs. simple ASIC):


Right, but in simpler terms there is no relationship between die size and cost when comparing different nodes.

Another way of looking at this is, if Nvidia were to make a 1080 Ti GPU today on N5 it would probably cost exactly the same to make as it did originally. However, the die would likely be ~1/7th the original size.

Originally it was 470mm^2 @ 16nm so it would probably be ~70-100mm^2 (67mm^2 literally, but as you say some parts haven't shrunk as much as the ideal transistor) on the N5 node, or roughly the size of a modern A18 Apple SoC that is on TSMC N3E.
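A rough version of that arithmetic (GP102's transistor count is public; the achieved N5 density is an assumption on my part):

```python
# Ideal-scaling estimate of a 1080 Ti (GP102) die ported to N5.
# The transistor count is public; the achieved N5 density is an ASSUMPTION.

GP102_TRANSISTORS_B = 11.8    # ~11.8B transistors in ~471 mm^2 on 16nm
ASSUMED_N5_DENSITY = 138.0    # MTr/mm^2, a plausible achieved N5 logic density

area_n5 = GP102_TRANSISTORS_B * 1_000 / ASSUMED_N5_DENSITY  # BTr -> MTr / density
print(f"~{area_n5:.0f} mm^2 on N5")  # ~86 mm^2

# SRAM and analog/PHY shrink far less than logic, so a real port would land
# above a pure-logic ideal; this sits inside the 70-100 mm^2 range guessed above.
```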
 
Joined
Nov 23, 2023
Messages
124 (0.27/day)
Die size is a frankly stupid way to look at it.

What does just the wafer at an N4 cost vs N5 vs N7? What was the initial capital investment for the Fab?

Cost per transistor is a better way to measure it, and that hasn't changed much in a decade. In fact, it has been going up.

And these new GPUs have a *lot* more transistors in them than GPUs from a decade ago, or even just a few years ago.

For example, the 7700XT has ~28B transistors @ N5 node, while the 6700XT had ~18B @ N7 node. And the cost per transistor went up, so.. you get the picture.

If that's the case and the picture is true, I don't know what's going on with AMD. Production cost would be wildly over MSRP at 53,900 million transistors.
 
Joined
Jan 27, 2015
Messages
1,779 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
If that's the case and the picture is true, I don't know what's going on with AMD. Production cost would be wildly over MSRP at 53,900 million transistors.

No. Both Nvidia and AMD have huge margins on these chips, always have.
 
Joined
Oct 6, 2021
Messages
1,764 (1.42/day)
System Name Raspberry Pi 7 Quantum @ Overclocked.
Right, but in simpler terms there is no relationship between die size and cost when comparing different nodes.

Another way of looking at this is, if Nvidia were to make a 1080 Ti GPU today on N5 it would probably cost exactly the same to make as it did originally. However, the die would likely be ~1/7th the original size.

Originally it was 470mm^2 @ 16nm so it would probably be ~70-100mm^2 (67mm^2 literally, but as you say some parts haven't shrunk as much as the ideal transistor) on the N5 node, or roughly the size of a modern A18 Apple SoC that is on TSMC N3E.

It would indeed be more expensive on 5nm; you're essentially overlooking the development costs behind the chip and only factoring in the cost per isolated wafer, which doesn't align with reality.
Companies use the density budget offered at each node to increase performance, in the form of products that hook the customer. Having RT and AI in the equation has made this a little worse, but at the end of the day that's the logic: basically more things to divide your area budget into.


"In the first half of the year the industry shipped 18.2 million graphics cards for desktop systems, up 46% from the same period in 2023, according to JPR. However, shipments of graphics boards in Q3 2024 totaled 8.1 million units — down from 8.9 million units in Q3 2023. This is believed to be a result of inventory correction at AMD as well as the end of both Ada Lovelace and RDNA 3 lifecycles."

Nvidia’s dominant hold on 80-90% of the GPU market grants them the unique advantage of both healthy margins and competitive pricing. However, instead of fair prices, they have consistently opted to prioritize higher profit margins at any cost. In contrast, AMD, which holds only 10-20% of the market share, struggles to cover its development expenses with this limited revenue.

If AMD didn't have the consoles and other parallel businesses using Radeon IP they'd be out of business. Their margin in the gaming sector is too thin to pay for the R&D needed, I suppose that's also the reason for merging CDNA and RDNA: to simplify hardware and software development.
 
Joined
Jan 27, 2015
Messages
1,779 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
It would indeed be more expensive on 5nm; you're essentially overlooking the development costs behind the chip and only factoring in the cost per isolated wafer, which doesn't align with reality.
Companies use the density budget offered at each node to increase performance, in the form of products that hook the customer. Having RT and AI in the equation has made this a little worse, but at the end of the day that's the logic: basically more things to divide your area budget into.


"In the first half of the year the industry shipped 18.2 million graphics cards for desktop systems, up 46% from the same period in 2023, according to JPR. However, shipments of graphics boards in Q3 2024 totaled 8.1 million units — down from 8.9 million units in Q3 2023. This is believed to be a result of inventory correction at AMD as well as the end of both Ada Lovelace and RDNA 3 lifecycles."

Nvidia’s dominant hold on 80-90% of the GPU market grants them the unique advantage of both healthy margins and competitive pricing. However, instead of fair prices, they have consistently opted to prioritize higher profit margins at any cost. In contrast, AMD, which holds only 10-20% of the market share, struggles to cover its development expenses with this limited revenue.

If AMD didn't have the consoles and other parallel businesses using Radeon IP they'd be out of business. Their margin in the gaming sector is too thin to pay for the R&D needed, I suppose that's also the reason for merging CDNA and RDNA: to simplify hardware and software development.

That chart attempts to aggregate all of that into one display. We're not arguing here; I'm just saying that you can present the information in a simple, easy way without digging into the morass of a million details.

Edit: What's going on is no different from what I see in other places, like cars. Cars are extremely expensive today compared to 10 years ago. But 10 years ago I didn't have Apple CarPlay, backup sensors, lane-keeping cameras, blind-spot monitors, auto stop-start, hotspot WiFi with cellular WAN, satellite tracking of my car, an app to see my tire pressure and decode any CEL codes, an LCD infotainment system, and so on. Some of that existed, sure, but in higher trims and mostly from higher-end manufacturers.

So today everybody wants the features that used to only exist in high end cars, and guess what? They get to pay high end car prices.

Same in the GPU space. GPUs are not getting much more advanced, they're just getting more transistors in a smaller area hence higher performance. And the cost per transistor has not gone down, so they simply cost more.






 
Joined
Jan 27, 2015
Messages
1,779 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
Oh. In that case I'm just gonna ignore that, then... AMD still should have priced their cards at $450/$550.

Neither AMD nor Nvidia set the prices for their cards, the market does that. Just like you can't get a 5070 Ti for $750, you won't be able to get one of these cards for $600.

I do think they both stumbled into a new strategy during COVID though.

They post a complete BS MSRP, knowing they won't supply enough cards to the market. Then they use all their capacity to make server CPUs (AMD) and AI accelerators (Nvidia) that keep corporate margins and profits high, and only produce enough GPUs to show that they do exist and to maintain a presence in those markets.

Then the market has its way with prices.

The traditional way to do this is to do market research and set the price based on a supply / demand curve, but I don't think either company cares enough about the GPU market to try to actually find the sweet spot in the curve. They clearly are not doing that now though.

What they are doing now is padding the pockets of retailers, to stay in good graces in case they need retail cooperation later on (if say, crypto and AI both collapse). MSRPs are just a trick right now. If they could get away with it, I bet neither AMD nor Nvidia would make a single GPU.

 
Joined
Dec 1, 2022
Messages
539 (0.65/day)
Neither AMD nor Nvidia set the prices for their cards, the market does that. Just like you can't get a 5070 Ti for $750, you won't be able to get one of these cards for $600.

I do think they both stumbled into a new strategy during COVID though.

They post a complete BS MSRP, knowing they won't supply enough cards to the market. Then they use all their capacity to make server CPUs (AMD) and AI accelerators (Nvidia) that keep corporate margins and profits high, and only produce enough GPUs to show that they do exist and to maintain a presence in those markets.

Then the market has its way with prices.

The traditional way to do this is to do market research and set the price based on a supply / demand curve, but I don't think either company cares enough about the GPU market to try to actually find the sweet spot in the curve. They clearly are not doing that now though.

What they are doing now is padding the pockets of retailers, to stay in good graces in case they need retail cooperation later on (if say, crypto and AI both collapse). MSRPs are just a trick right now. If they could get away with it, I bet neither AMD nor Nvidia would make a single GPU.

The market responded to the 5070 Ti being $1000 by buying it anyway; the same happened during the crypto hype, when people complained yet used bots or paid scalpers to buy cards at 2-3x over MSRP.
I think Nvidia could get away with it, especially since they're making loads of money with AI; gaming is just pocket change to them. But the same doesn't apply to AMD, since they've only lost market share and don't have the mindshare that gets people buying a new GPU no matter how high the prices are. AMD claimed good availability with the 9070, and the rumor is cards have been in the hands of retailers since January, so hopefully there is enough stock to meet demand.
 
Joined
Jan 27, 2015
Messages
1,779 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
The market responded to the 5070 Ti being $1000 by buying it anyway; the same happened during the crypto hype, when people complained yet used bots or paid scalpers to buy cards at 2-3x over MSRP.
I think Nvidia could get away with it, especially since they're making loads of money with AI; gaming is just pocket change to them. But the same doesn't apply to AMD, since they've only lost market share and don't have the mindshare that gets people buying a new GPU no matter how high the prices are. AMD claimed good availability with the 9070, and the rumor is cards have been in the hands of retailers since January, so hopefully there is enough stock to meet demand.

When MSRP is set at a level that is not correctly gauging demand, the market starts to correct for that - it's actually called 'price discovery'.

The market is looking for the correct price, and as you say it elevated it from $750 to what I see as around $1200 for the 5070 Ti. It may settle a little lower, it's still in price discovery. But I can guarantee you that the correct market price is not $750.

The reason AMD's market share shrank is that they did not make enough cards, plain and simple. If they made more, prices (of both AMD and Nvidia cards) would drop and move down the curve.

This curve is a real thing. And I guarantee you that both companies know what the curve looks like for their products, and know they did not set their MSRP correctly for the supply they were going to generate. They just aren't interested in maximizing profits here, it's more of a hedge against disruptions in AI and Data Center spaces. Their profits come from there, not here.
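As a toy illustration of that curve (every number here is made up purely for illustration): with inelastic supply, the clearing price is read off the demand curve, and adding supply walks the price back down toward MSRP.

```python
# Toy price-discovery model: linear demand, fixed (inelastic) monthly supply.
# Demand parameters are INVENTED for illustration, not market data.

def clearing_price(supply_units, max_price=1500.0, slope=0.002):
    """Linear demand: quantity demanded = (max_price - price) / slope.
    With fixed supply, price settles where demand equals supply."""
    return max(0.0, max_price - slope * supply_units)

msrp = 750
for supply in (100_000, 250_000, 400_000):
    p = clearing_price(supply)
    print(f"supply {supply:>7}: market price ~${p:,.0f} (MSRP ${msrp})")
```

Under these assumptions, tripling supply is what brings the street price down to MSRP; the MSRP number itself does nothing.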

 

Joined
Nov 23, 2023
Messages
124 (0.27/day)
The reason AMD's market share shrank is that they did not make enough cards, plain and simple. If they made more, prices (of both AMD and Nvidia cards) would drop and move down the curve.
I don't think that's true. The 7000 series was well stocked throughout its lifespan, and many of those cards were priced poorly at release, so they didn't sell enough volume and lost market share.
Neither AMD nor Nvidia set the prices for their cards, the market does that. Just like you can't get a 5070 Ti for $750, you won't be able to get one of these cards for $600.

I do think they both stumbled into a new strategy during COVID though.

They post a complete BS MSRP, knowing they won't supply enough cards to the market. Then they use all their capacity to make server CPUs (AMD) and AI accelerators (Nvidia) that keep corporate margins and profits high, and only produce enough GPUs to show that they do exist and to maintain a presence in those markets.

Then the market has its way with prices.

The traditional way to do this is to do market research and set the price based on a supply / demand curve, but I don't think either company cares enough about the GPU market to try to actually find the sweet spot in the curve. They clearly are not doing that now though.

What they are doing now is padding the pockets of retailers, to stay in good graces in case they need retail cooperation later on (if say, crypto and AI both collapse). MSRPs are just a trick right now. If they could get away with it, I bet neither AMD nor Nvidia would make a single GPU.

If they want to increase market share and push volume, which is their stated goal this generation, they must lower their prices. That pans out on the supply/demand graph as moving the supply curve to the right.
 
Joined
May 8, 2024
Messages
28 (0.09/day)
Looks interesting. Even though AMD isn't targeting the high end, they may have a real winner on their hands with RDNA4, if the performance is really there at a price that undercuts similar offerings from Nvidia. Even so, I'll wait to see independently verified benchmarks, and whether they're even available, before getting excited.

Personally, I'm not at all interested in video cards that cost anywhere close to $1000 or more. In my 33 years of computing the single most expensive GPU I've ever purchased in actual dollars is my current RX 6600 XT which was $320 in Aug 2022.

Even adjusting for inflation the single most expensive GPU I ever purchased was a Radeon DDR for $255 in Dec 2000 or about $465 in early 2025 dollars.
 
Joined
Sep 17, 2014
Messages
23,422 (6.13/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Right, but in simpler terms there is no relationship between die size and cost when comparing different nodes.

Another way of looking at this is, if Nvidia were to make a 1080 Ti GPU today on N5 it would probably cost exactly the same to make as it did originally. However, the die would likely be ~1/7th the original size.

Originally it was 470mm^2 @ 16nm so it would probably be ~70-100mm^2 (67mm^2 literally, but as you say some parts haven't shrunk as much as the ideal transistor) on the N5 node, or roughly the size of a modern A18 Apple SoC that is on TSMC N3E.
Die size + node ≈ transistor count. It's the same thing, ballpark at least, as die size corrected for inflation. After all, shrinks, like inflation, happen over time.
 
Joined
Oct 30, 2020
Messages
437 (0.28/day)
Location
Toronto
System Name GraniteXT
Processor Ryzen 9950X
Motherboard ASRock B650M-HDV
Cooling 2x360mm custom loop
Memory 2x24GB Team Xtreem DDR5-8000 [M die]
Video Card(s) RTX 3090 FE underwater
Storage Intel P5800X 800GB + Samsung 980 Pro 2TB
Display(s) MSI 342C 34" OLED
Case O11D Evo RGB
Audio Device(s) DCA Aeon 2 w/ SMSL M200/SP200
Power Supply Superflower Leadex VII XG 1300W
Mouse Razer Basilisk V3
Keyboard Steelseries Apex Pro V2 TKL
Impressive that they managed to cram that many transistors into 356 mm². Doesn't that surpass TSMC's quoted N4C peak density?
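Quick sanity check on those numbers (the ~53.9B transistor and 356 mm² figures come from the article discussion):

```python
# Implied average density of Navi 48, per the figures quoted in the thread.
transistors_m = 53_900          # ~53.9B transistors, in millions
area_mm2 = 356                  # quoted die size

density = transistors_m / area_mm2   # MTr per mm^2
print(f"~{density:.0f} MTr/mm^2")
# This is a whole-die average across logic, SRAM, and PHY, so it isn't
# directly comparable to TSMC's peak logic-only density figure.
```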

Also seems like it scales beyond 304w TDP. If power limits are fully uncapped, I think it's time to get one of the 3x8 pin variants and crank it all the way up.

9070 vanilla is priced like a turd
 
Joined
Jan 27, 2015
Messages
1,779 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
Die size + node = # transistors. Its the same thing, ballpark, at least, as die size and counting for inflation. After all, shrinks, like inflation, happens over time.

Nope, doesn't work like that anymore. It costs more $$ to make 100M transistors at 3nm than it did at 28nm.

And we are putting a lot more transistors in a new GPU.

So as an example, the 980 was on 28nm and had 5.2B transistors on a 398mm2 die. The 4080 was on N5 with 45.9B transistors and 378mm2. Based on the chart, I would expect the 4080 die to cost about 8.8X more to make than the 980 die. There's no inflation adjustment there, and this is just for the GPU itself - not the other components (board, heat sink, etc.).

So it starts to make sense why the 4080 would have an MSRP of $1200, vs the 980's $550 in 2014.

Now for inflation, $550 in 2014 is $738 today. The rest is mostly going to be that GPU die.
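Putting the same arithmetic in one place (the 8.8x die-cost multiplier is read off the chart in this post; the 2014-to-now inflation factor of ~1.34 is an assumption):

```python
# Reproducing the 980 -> 4080 arithmetic above. The 8.8x multiplier is taken
# from the chart in this post; the inflation factor is an ASSUMPTION (~1.34).

GTX980_MSRP_2014 = 550
DIE_COST_MULTIPLIER = 8.8       # 4080 die vs 980 die fab cost, per the chart
INFLATION_2014_TO_NOW = 1.34

msrp_inflation_adjusted = GTX980_MSRP_2014 * INFLATION_2014_TO_NOW
print(f"980 MSRP in today's dollars: ~${msrp_inflation_adjusted:.0f}")
# The remaining gap up to the 4080's $1200 MSRP is attributed mostly to the
# far more expensive die (plus margin), not to inflation alone.
```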




Ref:
 
Joined
Dec 31, 2020
Messages
1,203 (0.79/day)
Processor E5-4627 v4
Motherboard VEINEDA X99
Memory 32 GB
Video Card(s) 2080 Ti
Storage NE-512
Display(s) G27Q
Case MATREXX 50
Power Supply SF850L
Yet a 9070 with 54B transistors costs $599 in 2025, lol. Nvidia has some explaining to do.

Impressive they managed to cram that many transistors into 356mm2. Doesn't that surpass TSMC's quoted N4C peak density?

That peak figure is for 50% logic; the rest is SRAM and PHY. When a die has more logic, that number shoots up.

But the 7900 XTX had the same ~150 MTr/mm² density with its cache on external dies, so something doesn't add up.
 
Joined
Sep 17, 2014
Messages
23,422 (6.13/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Nope, doesn't work like that anymore. It costs more $$ to make 100M transistors at 3nm than it did at 28nm.

And we are putting a lot more transistors in a new GPU.

So as an example, the 980 was on 28nm and had 5.2B transistors on a 398mm2 die. The 4080 was on N5 with 45.9B transistors and 378mm2. Based on the chart, I would expect the 4080 die to cost about 8.8X more to make than the 980 die. There's no inflation adjustment there, and this is just for the GPU itself - not the other components (board, heat sink, etc.).

So it starts to make sense why the 4080 would have an MSRP of $1200, vs the 980's $550 in 2014.

Now for inflation, $550 in 2014 is $738 today. The rest is mostly going to be that GPU die.



Ref:
I don't disagree with that either. But the statement I was discussing compared the 9070 XT's value to a 1070 Ti at a similar inflation-corrected MSRP and a similar die size; by that measure the 9070 XT has a reasonable price point given its die size. More than reasonable, even.
 
Joined
Jan 27, 2015
Messages
1,779 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
I dont disagree on that either. But when looking at the subject and statement I was discussing, about 9070XT value compared to a 1070ti at a similar MSRP corrected for inflation and similar die size; this does mean the 9070XT has a reasonable price point given its die size. More than reasonable even.

I would agree the MSRP is reasonable. Keep in mind AMD and Nvidia have huge margins. AMD's latest non-GAAP gross margin is 53%, meaning they make something for $0.47 and sell it for $1.00. That is inclusive of all operating, R&D, marketing, transport, etc. costs.

But that margin is almost certainly not on GPUs.

Their margins are extremely high on server CPUs. The Epyc 9755, for example, has 16 chiplets for a total combined die size of 1130 mm², about 3x the size of a single 9070 die, yet it costs almost $13,000. An entire 9070 XT card MSRPs at $600, so three of them (the same combined die size) come to $1800, vs $13,000 for a single Epyc 9755. There are some extra costs with the server chip since it uses chiplets, but no way is it that much.

So I don't think the market price for their GPUs will be reasonable; I don't think they'll make enough supply to keep prices at MSRP. Not when they can make truckloads more money on server CPUs.
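The per-area comparison above, spelled out (figures taken from this post; note that a card MSRP bundles VRAM, board, and cooler, so the GPU die's real $/mm² is even lower than shown):

```python
# Silicon revenue per mm^2, using the figures from the post above.
# A card MSRP includes board/VRAM/cooler, so the second number OVERSTATES
# what AMD earns per mm^2 of the GPU die itself.

epyc_price, epyc_area = 13_000, 1_130   # Epyc 9755: 16 chiplets, ~1130 mm^2
card_price, die_area = 600, 356         # 9070 XT card MSRP vs Navi 48 die

print(f"Epyc: ~${epyc_price / epyc_area:.1f}/mm^2")
print(f"9070 XT (whole card): ~${card_price / die_area:.1f}/mm^2")
```

The gap of roughly an order of magnitude is the opportunity cost the post is describing: every wafer spent on GPUs is a wafer not spent on far more lucrative server silicon.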

That's not an outcome I'm hoping for, just my cold pragmatic view. I'd like to get a 5070 or 9070 / 9070 XT class card, so I hope I'm wrong on the market pricing.
 
Joined
Jun 6, 2021
Messages
765 (0.56/day)
System Name Red Devil
Processor AMD 5950x - Vermeer - B0
Motherboard Gigabyte X570 AORUS MASTER
Cooling NZXT Kraken Z73 360mm; 14 x Corsair QL 120mm RGB Case Fans
Memory G.SKill Trident Z Neo 32GB Kit DDR4-3600 CL14 (F4-3600C14Q-32GTZNB)
Video Card(s) PowerColor's Red Devil Radeon RX 6900 XT (Navi 21 XTX)
Storage 1 x Western Digital SN850 1GB; 1 x WD Black SN850X 4TB; 1 x Samsung SSD 870EVO 2TB
Display(s) 1 x MSI MPG 321URX QD-OLED 4K; 2 x Asus VG27AQL1A
Case Corsair Obsidian 1000D
Audio Device(s) Raz3r Nommo V2 Pro ; Steel Series Arctis Nova Pro X Wireless (XBox Version)
Power Supply AX1500i Digital ATX - 1500w - 80 Plus Titanium
Mouse Razer Basilisk V3
Keyboard Razer Huntsman V2 - Optical Gaming Keyboard
Software Windows 11
Will buy to support. Keen to support Team Red
I'm doing the same thing. I was dead set on buying a 5090, but at this level of greed I simply cannot support it. I know I won't get anything remotely close to the performance I want, but to hell with it: team red for another 2 years or so.
 