
AMD Radeon RX 9070 Series Technical Deep Dive

Joined
Nov 15, 2020
Messages
978 (0.62/day)
System Name 1. Glasshouse 2. Odin OneEye
Processor 1. Ryzen 9 5900X (manual PBO) 2. Ryzen 9 7900X
Motherboard 1. MSI x570 Tomahawk wifi 2. Gigabyte Aorus Extreme 670E
Cooling 1. Noctua NH D15 Chromax Black 2. Custom Loop 3x360mm (60mm) rads & T30 fans/Aquacomputer NEXT w/b
Memory 1. G Skill Neo 16GBx4 (3600MHz 16/16/16/36) 2. Kingston Fury 16GBx2 DDR5 CL36
Video Card(s) 1. Asus Strix Vega 64 2. Powercolor Liquid Devil 7900XTX
Storage 1. Corsair Force MP600 (1TB) & Sabrent Rocket 4 (2TB) 2. Kingston 3000 (1TB) and Hynix p41 (2TB)
Display(s) 1. Samsung U28E590 10bit 4K@60Hz 2. LG C2 42 inch 10bit 4K@120Hz
Case 1. Corsair Crystal 570X White 2. Cooler Master HAF 700 EVO
Audio Device(s) 1. Creative Speakers 2. Built in LG monitor speakers
Power Supply 1. Corsair RM850x 2. Superflower Titanium 1600W
Mouse 1. Microsoft IntelliMouse Pro (grey) 2. Microsoft IntelliMouse Pro (black)
Keyboard Leopold High End Mechanical
Software Windows 11
AMD snatches defeat from the jaws of victory again... $600/$550 are too expensive for these cards to be compelling.

Single digit market share incoming within two years. I hate this garbage GPU timeline we're on.
I don't have an issue with price if the reports on performance are in the ballpark. I think the non-XT should have been a bit cheaper, maybe $525.
 

Davinmk

New Member
Joined
Mar 2, 2025
Messages
2 (1.00/day)
I'm doing the same thing. I was dead set on buying a 5090, but at this level of greed, I simply can't support it. I know I won't get anything remotely close to the performance I want, but to hell with it, I'm going team red for another 2 years or so.
I mean, yeah, it may only be 60% of the performance of the 5090, but it's also 20-30% of the price.
 
Joined
Sep 10, 2007
Messages
245 (0.04/day)
Location
Zagreb, Croatia
System Name My main PC - C2D
Processor Intel Core 2 Duo E4400 @ 320x10 (3200MHz) w/ Scythe Ninja rev.B + 120mm fan
Motherboard Gigabyte GA-P35-DS3R (Intel P35 + ICH9R chipset, socket 775)
Cooling Scythe Ninja rev.B + 120mm fan | 250mm case fan on side | 120mm PSU fan
Memory 4x 1GB Kingmax MARS DDR2 800 CL5
Video Card(s) Sapphire ATi Radeon HD4890
Storage Seagate Barracuda 7200.11 250GB SATAII, 16MB cache, 7200 rpm
Display(s) Samsung SyncMaster 757DFX, 17“ CRT, max: 1920x1440 @64Hz
Case Aplus CS-188AF case with 250mm side fan
Audio Device(s) Realtek ALC889A onboard 7.1, with Logitech X-540 5.1 speakers
Power Supply Chieftec 450W (GPS450AA-101A) /w 120mm fan
Software Windows XP Professional SP3 32bit / Windows 7 Beta1 64bit (dual boot)
Benchmark Scores none
Got to say, yesterday I was cautiously optimistic. After reading about 70-80% of this thread, I am realizing the train has left the station FOREVER and I'm just living in the past.

We're here talking about $700-800 GPUs being fine, and many selling out at $1000+ - FOR WHAT?!? Playing rehashed games, just this time remastered with "RTX"?

As someone said, I AM getting old. But is 40+ really that old? I do remember GPUs always being the most expensive part of gaming, but I also remember drooling over a 300€ (tax included!) card that could run ANYTHING I threw at it, and it paid for itself in 2 months since I worked as a game reviewer at the time. I'm no longer in the review business; that was "just" a job that got me through college without parents/banks/debts... I easily make roughly 8x more these days.

Then I look back and see my last card being a GTX 1070 Ti, which I got for free from a business partner when it was already old and leftover in inventory. That PC is also now rotting in a closet, because I switched to just using my company laptop (with a built-in mobile RTX 3060). Realistically, the last GPUs I actually bought with my own money (that I can remember) were the likes of the Radeon 9700, then some Nvidia card, I think a GeForce 6800 LE with unlocked units throwing it one tier up, then a Radeon X1950 Pro, then the HD4870/4890, then nothing for a long time until I found, I think, a second-hand R9 2x0 or something, and later swapped that for an R9 3xx something; I barely remember those two cards anymore. Most of that was 2002-2009 or thereabouts, and cards from that period had high OCs, custom (air) cooling, tweaked BIOSes, unlocked features and the like. It was a hobby and fun, and relatively affordable even for a student with his first low-paid jobs. But that means I haven't bought a real GPU since 2014/2015?! That GTX 1070 Ti was a gift around 2019/2020, and my current laptop was bought around Sept/Oct 2022 as an EoL model released about a year before that.

The way things are going I'll become a console gamer, maybe switch to handheld like Steam Deck and just forget about all this bullsh** forever.

Tell me, people, is it logical that a laptop with an RTX 3060 was 1000€ (with 25% sales tax included!!), including screen, battery, and all the PC parts, while at the same time roughly the same specs in a desktop WITHOUT screen and battery cost more? It's the same with the topic at hand. We're talking about companies having 50-60% margins, but let's not forget, that's 50% ASML margins selling to TSMC, then 50% TSMC margins selling to AMD, then 50% margins for AMD selling to the OEM, then more sane 15% margins for the OEM, then God knows how much margin when retailers and scalpers take over, and another 20-25% sales tax depending on the country. If we start at $100, that ends up at $700-800 by the time it reaches our consumer homes. So unless you're in the stock market and one of those profiting from all that, everyone else is getting milked to the limit... actually beyond the limit. Ask your boss for a 50% raise and see what they think about it when you try to milk them. You'll get slapped back to reality. Then why on earth are people still buying goddamn $1k+ GPUs... Slap yourselves out of it - please!
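Purely as an illustration of that margin-stacking arithmetic (every percentage below is the post's rough assumption, not an audited figure), a quick sketch in Python:

```python
# Margin-stacking sketch using the post's illustrative figures.
# None of these percentages are official; they are assumptions from the rant above.

def apply_margins(base_cost: float, stages: list[tuple[str, float]]) -> float:
    """Run a base cost through a chain of markups (0.50 means +50%)."""
    price = base_cost
    for name, markup in stages:
        price *= 1 + markup
        print(f"{name:<24} -> {price:8.2f}")
    return price

chain = [
    ("ASML -> TSMC",       0.50),
    ("TSMC -> AMD",        0.50),
    ("AMD -> OEM/AIB",     0.50),
    ("OEM/AIB -> retail",  0.15),
    ("retailer / scalper", 0.50),  # assumed; the post only says "God knows how much"
    ("sales tax (VAT)",    0.25),
]

apply_margins(100.0, chain)
# With a ~50% retail/scalper markup assumed, $100 at the start of the chain
# lands around $730 at the consumer, in the ballpark of the post's $700-800.
```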

Sorry for the rant, but it was heavily needed...

Edit: Oh, and someone said we'll end up 10 years from now with people unable to afford it all... It already happened, like 3 years ago. They're just running on inertia until it hits everyone. With 8.5 billion people, selling under 75 million COMBINED iGPUs + dGPUs in 2025 is telling already, but note that only 10% of that is dGPUs... If I haven't forgotten my math, that's roughly 0.09% of people who can (and are willing to) buy a dGPU. And it's already declining.

Source: https://www.tomshardware.com/pc-com...te-igpus-increase-while-discrete-gpus-decline
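A quick sanity check of that arithmetic, using the post's own round figures (assumptions, not exact market data):

```python
# Back-of-the-envelope check of the figures above (round assumptions, not exact data).
world_population = 8.5e9   # people
gpus_sold_2025   = 75e6    # combined iGPU + dGPU shipments, per the post / linked article
dgpu_share       = 0.10    # the post's assumption: ~10% of shipments are discrete GPUs

dgpus_sold = gpus_sold_2025 * dgpu_share       # ~7.5 million cards
print(f"{dgpus_sold / world_population:.2%}")  # ~0.09% of people buying a dGPU
```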
 
Last edited:
Joined
Aug 9, 2019
Messages
1,783 (0.88/day)
Processor 7800X3D 2x16GB CO
Motherboard Asrock B650m HDV
Cooling Peerless Assassin SE
Memory 2x16GB DR A-die@6000c30 tuned
Video Card(s) Asus 4070 dual OC 2610@915mv
Storage WD blue 1TB nvme
Display(s) Lenovo G24-10 144Hz
Case Corsair D4000 Airflow
Power Supply EVGA GQ 650W
Software Windows 10 home 64
Benchmark Scores Superposition 8k 5267 Aida64 58.5ns
If FSR 4 ends up being close to DLSS 4, I'm inclined to go 9070/XT instead of a 5070 Ti. The MSRP on the 5070 Ti is a joke; the actual price where I live is 1300 USD for the cheapest model in stock. If I can get a 9070 XT for about 700 USD, which is what it will retail for in Norway at MSRP, and get the same performance and power consumption, with FSR 4 being close to DLSS 4, I'm going team red again. This reminds me of the 5700/5700 XT, which offered similar performance to the 2060S/2070S at a much lower price.
 
Joined
Nov 23, 2023
Messages
131 (0.28/day)
Yep, I know the feeling. 2nm is the last time SRAM gets denser, then we're hardstuck forever. Crazy how people shill "$699 is actually good because the REAL MSRP for Nvidia is $999"; it's just embarrassing how much people will refuse to take their own side.

Too bad none of these specialized parts will be worth making at some point in the future, with iGPUs deprecating the low end and creeping upward and all. The least they could do is pack them with VRAM to keep things relevant for a while, but nope. It's all just going to be sand in the wind at the end of the day.

Maybe node shrinks dying will finally force devs, engines, and publishers to push optimization to the fore, but I feel like it's more likely the whole industry will crash before it gets to that point.
 
Joined
Jul 21, 2016
Messages
116 (0.04/day)
AMD should have priced the 9070 XT at less than $600 because the 5080, which is $1000, has a similar die size? Are you serious? :kookoo:


Let me try to justify why I mentioned this earlier.
It was not about Nvidia, but about AMD catching up in performance per area.

1. Foundries produce 300mm wafers, which are cut into parts we call dies, and die size has a physical limit (YES, I KNOW there is a new method that uses the full wafer, but no consumer chip is using that).
2. The cost of a single wafer has gone up.
3. Transistor density has hit a few walls: no more >2x density from a single full-node shrink, and it's more difficult to cool tiny dies when they dissipate 300W, so peak frequencies have hit a wall too.
4. TDP has hit the 300-400W wall due to material property limitations, and the average consumer doesn't want a 1kW heater on their desk.

So what is left? If you want to make a guess at the best possible high-end GPU on a specific architecture, for example "what would be the peak theoretical performance of a Navi 4x at a 750mm² die size", the only route of improvement left for monolithic dies is size.
Since TSMC publishes its "average density" improvement for each node, we can use that.

If you take Navi 3x and AD10x and calculate what a shrink to the new process node would give, then compare that to what was actually released, you end up with a performance number that approximates the architectural improvement between the previous and next gen.
It's not perfect 1:1 scaling; for the largest dies it's usually around 70-90%, depending on power constraints, but you can get a very good estimate.
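A minimal sketch of that estimation approach; the die areas, density factor, and scaling efficiency below are placeholder assumptions for illustration, not measured figures:

```python
# Sketch of the "shrink the old die, compare to the new one" estimate described above.
# All inputs are illustrative assumptions, not real measurements.

def silicon_gain(old_area_mm2: float, new_area_mm2: float,
                 density_gain: float, scaling_eff: float = 0.8) -> float:
    """Expected perf ratio from transistor budget alone.

    density_gain: the foundry's quoted average density improvement for the node jump.
    scaling_eff:  how much of the extra transistor budget turns into performance
                  (roughly 70-90% for the largest, power-limited dies, per the post).
    """
    transistor_ratio = (new_area_mm2 * density_gain) / old_area_mm2
    return 1 + (transistor_ratio - 1) * scaling_eff

def architectural_gain(measured_gain: float, silicon_only: float) -> float:
    """Whatever uplift is left once the silicon contribution is factored out."""
    return measured_gain / silicon_only

# Hypothetical example: a 520 mm^2 old die, a 600 mm^2 new die on a node with
# ~1.3x average density, and a measured 1.6x gen-on-gen performance uplift.
silicon = silicon_gain(520, 600, 1.3)
print(f"silicon-only gain:  ~{silicon:.2f}x")
print(f"architectural gain: ~{architectural_gain(1.6, silicon):.2f}x")
```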


While it's nice to compare perf/watt, absolute performance and all the other metrics, there is one metric that matters most to the companies, and that's the cost per GPU at a given performance level.
AMD has been behind in performance per die area for a long time, I think since Pascal, which is one of the reasons they skipped "high-end" models every now and then.
It makes no sense for AMD to compete on performance when Nvidia can achieve the same performance at a much lower cost.

Before that, it was the opposite. At some point, they got so far ahead that Nvidia had its own GPU-oven memes, but then Bulldozer happened and AMD has been playing catch-up ever since.
Unlike Intel, Nvidia did not sit back and watch but actually kept innovating, so there was no "Zen moment" on the GPU side.

While the Navi 4x architecture seems to be the closest they have been, they are still not quite there; but maybe they're close enough.

I don't think we'll ever see 350mm² dies sold for $400 again, since it's not only the cost of the die that increased but everything around it: VRAM chips, more expensive coolers and MOSFETs due to higher power, higher-quality PCBs due to tighter signal requirements, etc.

HOWEVER, IF AMD reaches parity on performance per die area, the cost of the GPU board is pretty much the same, so while Nvidia might price its top end higher and higher, AMD desperately wants that market and would definitely be able to release cheaper cards without losing money. And I WISH Intel stays in the game; then we might see interesting innovation.

Edit:
All that inflation and VRAM-requirement talk is bullshit: the 980 Ti was a ~600mm² die with 6GB of VRAM released at $650, and the 1080 Ti was a ~470mm² die with 11GB of VRAM released at $700 after 10 months, because there was no competition from AMD anymore. That was ~9 years ago.
Funny how all those cherry-picked charts of GPU die sizes and prices begin in 2014, when AMD stopped competing.
 
Last edited:

Ruru

S.T.A.R.S.
Joined
Dec 16, 2012
Messages
13,471 (3.02/day)
Location
Jyväskylä, Finland
System Name 4K-gaming / console
Processor 5800X @ PBO +200 / i5-8600K @ 4.6GHz
Motherboard ROG Crosshair VII Hero / ROG Strix Z370-F
Cooling Alphacool Eisbaer 360 / Alphacool Eisbaer 240
Memory 32GB DDR4-3466 / 16GB DDR4-3600
Video Card(s) Asus RTX 3080 TUF OC / Powercolor RX 6700 XT
Storage 3.5TB of SSDs / several small SSDs
Display(s) 4K120 IPS + 4K60 IPS / 1080p60 HDTV
Case Corsair 4000D AF White / DeepCool CC560 WH
Audio Device(s) Sony WH-CH720N / TV speakers
Power Supply EVGA G2 750W / Fractal ION Gold 550W
Mouse Razer Basilisk / Logitech G400s
Keyboard Roccat Vulcan 121 AIMO / NOS C450 Mini Pro
VR HMD Oculus Rift CV1
Software Windows 11 Pro / Windows 11 Pro
Benchmark Scores They run Crysis
AMD opting to go for performance comparisons against 4-year-old GPUs. Great! And to top it off, the majority of slides feature MBA designs that we were told were "only artist concepts", which I strongly doubt.
I don't see an issue here, since there are a lot of us Ampere users like me and you.
 
Joined
Jan 29, 2021
Messages
1,965 (1.31/day)
Location
Alaska USA
PowerColor recommends a 900W PSU for their 9070 XT and Asus recommends a 750W PSU for theirs, so I'm going to guess 850W is the sweet spot?
 
Joined
May 29, 2017
Messages
668 (0.24/day)
Location
Latvia
Processor AMD Ryzen™ 7 7700
Motherboard ASRock B650 PRO RS
Cooling Thermalright Peerless Assassin 120 SE + Arctic P12 MAX
Memory XPG Lancer Blade 6000Mhz CL30 2x16GB
Video Card(s) SAPPHIRE PURE AMD Radeon™ RX 7800 XT 16GB
Storage Lexar NM790 2TB + Lexar NM790 2TB
Display(s) HP X34 UltraWide IPS 165Hz
Case Lian Li Lancool 207 + Arctic P12/14 MAX
Audio Device(s) Airpulse A100 + Ruark RS1
Power Supply Sharkoon Rebel P20 750W
Mouse Cooler Master MM730
Keyboard Krux Atax PRO Gateron Yellow
Software Windows 11 Pro
PowerColor recommends a 900W PSU for their 9070 XT and Asus recommends a 750W PSU for theirs, so I'm going to guess 850W is the sweet spot?
On average, the RX 9070 XT will not be as power hungry as the RX 7900 XTX, so a good-quality 750W PSU should be fine.
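As a rough way to sanity-check those recommendations, a minimal PSU-sizing sketch; the wattages are illustrative assumptions (check your actual card's and CPU's specs), not vendor figures:

```python
# Rough PSU-sizing sanity check. Component wattages are illustrative assumptions,
# not official figures; plug in your own parts' numbers.

def recommended_psu_w(gpu_board_power_w: float, cpu_power_w: float,
                      rest_of_system_w: float = 75,
                      gpu_transient_factor: float = 1.3,
                      headroom: float = 0.2) -> float:
    """Estimate a PSU rating: steady load plus a GPU transient allowance,
    plus extra headroom so the unit isn't running near its limit."""
    peak_load = gpu_board_power_w * gpu_transient_factor + cpu_power_w + rest_of_system_w
    return peak_load * (1 + headroom)

# Assumed example: ~300W board-power GPU, ~200W CPU, ~75W for the rest of the system.
print(f"~{recommended_psu_w(300, 200):.0f} W")
# ~800 W with these assumptions, which sits between the 750W and 900W vendor
# recommendations quoted above; exact margins depend on PSU quality and transients.
```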
 
Joined
Nov 23, 2023
Messages
131 (0.28/day)
What a "shot in the foot". This way, AMD encourages people to buy mid and low-end cards from Nvidia.

What "great" administrators AMD has...
Or any and every single Intel dGPU on the planet. They must think it's okay because they can just have their CPUs do it.
 
Joined
Apr 13, 2017
Messages
194 (0.07/day)
System Name AMD System
Processor Ryzen 7900 at 180Watts 5650 MHz, vdroop from 1.37V to 1.24V
Motherboard MSI MAG x670 Tomahawk Wifi
Cooling AIO240 for CPU, Wraith Prism's Fan for RAM but suspended above it without touching anything in case.
Memory 32GB dual channel Gskill DDR6000CL30 tuned for CL28, at 1.42Volts
Video Card(s) Msi Ventus 2x Rtx 4070 and Gigabyte Gaming Oc Rtx 4060 ti
Storage Samsung Evo 970
Display(s) Old 1080p 60FPS Samsung
Case Normal atx
Audio Device(s) Dunno
Power Supply 1200Watts
Mouse wireless & quiet
Keyboard wireless & quiet
VR HMD No
Software Windows 11
Benchmark Scores 1750 points in cinebench 2024 42k 43k gpu cpu points in timespy 50+ teraflops total compute power.
Let's see if the lack of integer cores vs the Nvidia 5000 series lets it boost higher, like 3.5-3.7GHz for a good OC. Almost CPU-like frequencies. Also, the scalar unit works like a CPU, I guess. But is it for scheduling?
 