
AMD Radeon RX 9070 XT Boosts up to 3.10 GHz, Board Power Can Reach up to 330W

Cheeseball

Not a Potato
Supporter
Joined
Jan 2, 2009
Messages
2,056 (0.35/day)
Location
Pittsburgh, PA
System Name Titan
Processor AMD Ryzen™ 7 7950X3D
Motherboard ASRock X870 Taichi Lite
Cooling Thermalright Phantom Spirit 120 EVO CPU
Memory TEAMGROUP T-Force Delta RGB 2x16GB DDR5-6000 CL30
Video Card(s) ASRock Radeon RX 7900 XTX 24 GB GDDR6 (MBA)
Storage Crucial T500 2TB x 3
Display(s) LG 32GS95UE-B, ASUS ROG Swift OLED (PG27AQDP), LG C4 42" (OLED42C4PUA)
Case Cooler Master QUBE 500 Flatpack Macaron
Audio Device(s) Kanto Audio YU2 and SUB8 Desktop Speakers and Subwoofer, Cloud Alpha Wireless
Power Supply Corsair SF1000
Mouse Logitech Pro Superlight 2 (White), G303 Shroud Edition
Keyboard Keychron K2 HE Wireless / 8BitDo Retro Mechanical Keyboard (N Edition) / NuPhy Air75 v2
VR HMD Meta Quest 3 512GB
Software Windows 11 Pro 64-bit 24H2 Build 26100.2605
The 330W is just the TDP it will target, so spikes will most likely be similar to a 7900 XT with a +15% power target. Not exactly problematic. There is no way they'll hit 7900 XTX performance: it has 96 compute units vs. this one's 64, if we believe the rumors.
Ah, right, I forgot it was rumored to have only 64 CUs. But 330 W TGP for that many CUs? Unless they're pulling an RTX 3060-to-4060 (where the 4060 is still ~10% better for much less power due to the newer arch), that much power draw doesn't seem to correlate with the number of CUs it may have. The 7900 GRE has 80 CUs at 260 W and the 7900 XT has 84 at 300 W. I don't think AMD is just going to throw efficiency out of the window because of higher game/boost clocks.

Perhaps RDNA4 is a big architectural improvement where 64 CUs can now do just as much work as 80 or 72 CUs at slightly higher clocks? Hopefully this is the case.
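For what it's worth, the board-power arithmetic above is easy to sketch. The +15% power-target offset mirrors the earlier post; the ~2x transient-spike factor is purely an illustrative assumption, not an AMD specification:

```python
# Rough sketch of the board-power arithmetic discussed above.
# The +15% power-target offset mirrors the earlier post; the ~2x
# transient-spike factor is purely an illustrative assumption,
# not an AMD specification.

def board_power(base_tgp_w: float, power_target_pct: float = 0.0,
                spike_factor: float = 2.0) -> dict:
    """Sustained and estimated transient board power, in watts."""
    sustained = base_tgp_w * (1.0 + power_target_pct / 100.0)
    return {
        "sustained_w": round(sustained, 1),
        "transient_est_w": round(sustained * spike_factor, 1),
    }

# Rumored RX 9070 XT: 330 W TGP, with a hypothetical +15% power target.
print(board_power(330))                        # stock
print(board_power(330, power_target_pct=15))   # raised power target
```

With the +15% target the sustained figure lands around 380 W, which is why the raised-PT spikes end up in 7900 XT territory.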
 
Joined
Jan 8, 2017
Messages
9,520 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Latest leak says the 9070 XT is within 5% of the 4080.
Not possible according to the shader count; one or the other is wrong.

Navi 31 was clearly designed to go after the RTX 4090
It clearly wasn't: different die sizes and process nodes. Not to mention that more space is wasted when using chiplets; they simply aren't comparable.
 
Joined
Mar 16, 2017
Messages
2,170 (0.76/day)
Location
Tanagra
System Name Budget Box
Processor Xeon E5-2667v2
Motherboard ASUS P9X79 Pro
Cooling Some cheap tower cooler, I dunno
Memory 32GB 1866-DDR3 ECC
Video Card(s) XFX RX 5600XT
Storage WD NVME 1GB
Display(s) ASUS Pro Art 27"
Case Antec P7 Neo
Some examples:

Intel releases broken CPUs; people keep buying them because it's Intel.

Yet Zen 5 was nailed to the cross with rusty nails even though the issue was due to Winblows.

Intel GPUs have horrible drivers; people don't even mention that, since only AMD has horrible drivers.

3 or so articles about leaks and rumors about the upcoming RDNA4 GPUs, and 99% of the comments are negative and hostile towards AMD.

But I guess that those things are figments of our imagination.

I no longer believe in most of today’s reviewers.

To me, they are bribed influencers.

Granted, places like LTT might not have a choice but to take those bribes just because of how many employees are there and how much their salaries are.

Same for Tom’s and many others.
I have been around long enough to remember the Tom's article that reviewed how CPUs would perform when you intentionally removed the HSF while the system was running. The Intel CPU would throttle; the Athlon CPU would burn up. A seemingly pointless article, but one that painted Intel in a better light at a time when AMD was actually competitive for the first time. Also, we can't forget the CTS Labs "Zen flaws" release. When Intel was taking hits for all its flaws, suddenly a "research company" spilled the beans on AMD vulnerabilities, too. Legit issues or not, it had all the looks of a shell company established to produce a hit piece, and they have published nothing else.
 
Joined
Jul 21, 2016
Messages
107 (0.03/day)
Some examples:

Intel releases broken CPUs; people keep buying them because it's Intel.

Yet Zen 5 was nailed to the cross with rusty nails even though the issue was due to Winblows.

Intel GPUs have horrible drivers; people don't even mention that, since only AMD has horrible drivers.

3 or so articles about leaks and rumors about the upcoming RDNA4 GPUs, and 99% of the comments are negative and hostile towards AMD.

But I guess that those things are figments of our imagination.

I no longer believe in most of today’s reviewers.

To me, they are bribed influencers.

Granted, places like LTT might not have a choice but to take those bribes just because of how many employees are there and how much their salaries are.

Same for Tom’s and many others.
I have been around long enough to remember the Tom's article that reviewed how CPUs would perform when you intentionally removed the HSF while the system was running. The Intel CPU would throttle; the Athlon CPU would burn up. A seemingly pointless article, but one that painted Intel in a better light at a time when AMD was actually competitive for the first time. Also, we can't forget the CTS Labs "Zen flaws" release. When Intel was taking hits for all its flaws, suddenly a "research company" spilled the beans on AMD vulnerabilities, too. Legit issues or not, it had all the looks of a shell company established to produce a hit piece, and they have published nothing else.

Nothing generates clicks like a thumbnail and a title in the form of "[insert great product] sucks - and here's why" or "THE BIG FLAW WITH [insert popular product]"

or "AMD Radeon RX 9070 XT Boosts up to 3.10 GHz"(but that's barely +5% VS 6000 series)

Even though AMD announced that they pulled out from the high-end, even if their marketing department won't advertise this as a mid-range, the narrative is "9070 XT to compete with 5090" - then you get 40% slower performance and it's a disappointment, even if it's priced accordingly

Or it's another "poor volta" moment
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,252 (1.28/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
Edit: just did a quick check on 2008 vs 2016 vs 2024
9800 GTX -> 1080Ti =11x
1080Ti -> 4090 = 3.3x
Why did you choose the 9800 GTX, which was a die-shrunk and tweaked 8800 GTX, and not the GTX 280, which also released in 2008? It would have still served your point with a 7.87x uplift to a 1080 Ti.
Navi 31 was clearly designed to go after the RTX 4090 and it utterly failed to do this
Yeah, it absolutely was - the total die area, the BOM as you say. At best, I'd say they wanted to at least split the difference between a 4080 and a 4090, and they still fell a good 15% short.
 

freeagent

Moderator
Staff member
Joined
Sep 16, 2018
Messages
8,941 (3.90/day)
Location
Winnipeg, Canada
Processor AMD R7 5800X3D
Motherboard Asus Crosshair VIII Dark Hero
Cooling Thermalright Frozen Edge 360, 3x TL-B12 V2, 2x TL-B12 V1
Memory 2x8 G.Skill Trident Z Royal 3200C14, 2x8GB G.Skill Trident Z Black and White 3200 C14
Video Card(s) Zotac 4070 Ti Trinity OC
Storage WD SN850 1TB, SN850X 2TB, SN770 1TB
Display(s) LG 50UP7100
Case Fractal Torrent Compact
Audio Device(s) JBL Bar 700
Power Supply Seasonic Vertex GX-1000, Monster HDP1800
Mouse Logitech G502 Hero
Keyboard Logitech G213
VR HMD Oculus 3
Software Yes
Benchmark Scores Yes
Oh yeah.. they are totally making a 4070Ti..

Screenshot 2024-12-26 213726.jpg
 
Joined
Dec 28, 2012
Messages
3,969 (0.91/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
Meh, doubt it. AMD could have clawed away so much marketshare from nGreedia if they had priced their previous-generation GPUs well; however, they too decided to price gouge their customers. The only light at the end of the tunnel I see is Intel, strange as that may sound.

Anyhoo, 3 GHz clocks are awesome; hope we see those on the new nGreedia GPUs too. :)
I know math is hard, and throwing out silly names like "ngreedia" must be the peak of comedy, but you do know that when your margins are 2-3%, you can't just cut prices willy-nilly and stay in business for long, right?

When it comes to GPUs, people just outright refuse to believe that inflation is real.
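The margin point above is simple arithmetic: at the 2-3% net margin claimed in this post, even a small price cut puts each unit underwater. A sketch with illustrative numbers (the margin figure is the post's claim, not from any financial statement, and per-unit cost is assumed fixed):

```python
# Sketch: per-unit effect of a price cut at a thin net margin.
# The ~3% margin figure is the one claimed in the post above,
# not taken from any financial statement; per-unit cost is assumed fixed.

def profit_after_cut(price: float, net_margin: float, cut_pct: float) -> float:
    """Per-unit profit after cutting the price by cut_pct percent."""
    cost = price * (1.0 - net_margin)           # implied per-unit cost
    new_price = price * (1.0 - cut_pct / 100.0)
    return new_price - cost

# A $500 card at a 3% net margin: a 5% price cut turns each sale
# into roughly a $10 loss.
print(profit_after_cut(500.0, 0.03, 5.0))
```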

Why did you choose to use the 9800GTX that was a die shrunk and tweaked 8800GTX and not the GTX280 which also released in 2008? it would have still served your point with a 7.87x uplift to a 1080Ti.

Yeah it absolutely was, the total die area, the BOM as you say, at best I'd say they wanted to at least split the difference between a 4080 and 4090 and then still fell a good 15% short.
This is why I wish Anandtech hadn't gone to crap; a circuit analysis of an MCD would be fascinating. I'd bet good money that if the chips were on the same 5 nm node as the GPU, and you didn't need those MCM interconnects, the memory controllers would be a lot smaller and the total die size would be a lot closer to the 4080's.
 
Joined
Dec 25, 2020
Messages
7,076 (4.83/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
This is why I wish Anandtech hadn't gone to crap; a circuit analysis of an MCD would be fascinating. I'd bet good money that if the chips were on the same 5 nm node as the GPU, and you didn't need those MCM interconnects, the memory controllers would be a lot smaller and the total die size would be a lot closer to the 4080's.

Thankfully, the smartest folks who used to be part of Anandtech's editorial staff now publish at Chips and Cheese, which is a much more technical publication.

It clearly wasn't: different die sizes and process nodes. Not to mention that more space is wasted when using chiplets; they simply aren't comparable.

It is painfully obvious; no technicality will work around the fact that this graphics card was designed to achieve much greater heights than it actually did. It is a very complex design and the overall die area is exceptionally large. Let's not pretend that AMD is several nodes behind, or that the chiplets are at fault, or anything of the sort.

The decisions to implement things like a 384-bit interface with 12 memory chips, or the power target (because, as we have come to know, it scales miserably past its default 330 to 350 W range, even if you pump 475 W+ into it) - none of these are taken lightly when designing a graphics card. They are very much aware of their product's strengths and weaknesses, yet the end result is that both it and the 7900 XT visibly had to be subjected to pre-launch price cuts once they saw they stood no chance in hell against the RTX 4090 despite being late, especially with the horribly broken launch drivers (which they knew were a problem at the time).

I guarantee you, if the RTX 4080 had come first by itself and the 4090 had been delayed by no more than a month, the 7900 XTX would have had a $1,499 MSRP (justifying itself against the 4080 by having 24 GB over 16 GB), and they would have placed the 7900 XT at $1,199 while telling people they could still get 20 GB and almost as much performance, still sounding like a good deal.
 
Joined
Jan 14, 2019
Messages
12,648 (5.82/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Joined
Jan 8, 2017
Messages
9,520 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
It is painfully obvious, there is no technicality that will work around the fact that this graphics card was designed to achieve much greater heights than it actually did.
There actually isn't a single technical detail that proves it did: lower transistor count, smaller die size, lower TDP and shader count, etc. AMD never expected it to compete with whatever Nvidia was going to sell as its flagship. I'll remind you that both the 4090 and the 4080 were more expensive than the 7900 XTX.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
42,735 (6.69/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
330 W? Yikes!:fear:
A 3080 Ti has that TDP in the BIOS code. I just gave someone Gainward and Palit BIOSes to use on a PNY card, since all three use the same PCB and Gainward has the exact same cooler as the PNY.
 
Joined
Jan 14, 2019
Messages
12,648 (5.82/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
A 3080 Ti has that TDP in the BIOS code. I just gave someone Gainward and Palit BIOSes to use on a PNY card.
I'd still like to see lower values by default without any tinkering. I guess I'll have a reference model, then (fingers crossed that it'll be available in the UK).
 
Joined
Aug 8, 2024
Messages
47 (0.33/day)
System Name New AMD Build
Processor Ryzen 7 9700X
Motherboard ASUS ROG Strix X870-F Gaming WIFI
Cooling Cooler Master Liquid 360 Atmos
Memory G.Skill Trident Z5 Neo RGB DDR5-6000 CL30-40-40-96 1.40V 64GB (2x32GB) - EXPO
Video Card(s) ASUS TUF Gaming GeForce RTX 4080 SUPER
Storage 2x Samsung 2TB 990 Pro NVMe M.2 SSD
Display(s) Samsung Odyssey OLED G8/G80SD 32" 4K 240Hz
Case Corsair 5000D Airflow Tempered Glass Mid-Tower ATX
Power Supply Corsair HX1000i Fully Modular Ultra-Low Noise Platinum ATX 1000 Watt
Software Windows 11 Professional
Perhaps RDNA4 is a big architectural improvement where 64 CUs can now do just as much work as 80 or 72 CUs at slightly higher clocks? Hopefully this is the case.
Yes probably exactly that. 64 CUs, new arch, faster clocks, more cache, and maybe more shading units/cores per CU.
 
Joined
Sep 17, 2014
Messages
22,729 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Yes probably exactly that. 64 CUs, new arch, faster clocks, more cache, and maybe more shading units/cores per CU.
Sounds like a fantasy though, that.
 
Joined
Apr 30, 2020
Messages
1,007 (0.59/day)
System Name S.L.I + RTX research rig
Processor Ryzen 7 5800X 3D.
Motherboard MSI MEG ACE X570
Cooling Corsair H150i Cappellx
Memory Corsair Vengeance pro RGB 3200mhz 32Gbs
Video Card(s) 2x Dell RTX 2080 Ti in S.L.I
Storage Western digital Sata 6.0 SDD 500gb + fanxiang S660 4TB PCIe 4.0 NVMe M.2
Display(s) HP X24i
Case Corsair 7000D Airflow
Power Supply EVGA G+1600watts
Mouse Corsair Scimitar
Keyboard Cosair K55 Pro RGB
The GPU market is really sad since the pandemic+AI boom.
We went from a new GPU architecture with lower power or higher performance at similar prices every 12-18 months, to new GPUs (not necessarily a new arch) every 18-36 months with the same or lower performance - but the new GPUs can render 720p much faster, so let's increase the prices.

There are no more new GPUs released for $100-250 from AMD or Nvidia... you still get the same 480/1060 performance from 8 years ago, though (or is it 290X performance from 11 years ago?)


Edit: just did a quick check on 2008 vs 2016 vs 2024
9800 GTX -> 1080Ti =11x
1080Ti -> 4090 = 3.3x


Everything else is just brainwashed kids and marketing bots
That's a much larger time frame from the 9800 GTX to the 1080 Ti.
You would have to go from the GTX 680 to the 1080 Ti to match the same time frame as 1080 Ti to RTX 4090. It's basically the same ~3.3x increase.

The main difference is the price: the GTX 680 launched at $499 and the 1080 Ti at $699. That's only 40% more.
Then going from the 1080 Ti at $699 to the RTX 4090, the price is about 2.3 times higher. Even with inflation, the Nvidia RTX cards are all horribly priced, and always have been since their introduction.
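The price comparisons above reduce to simple ratios; a quick sketch using the launch MSRPs quoted in this post ($499 / $699 / $1,599):

```python
# Sketch of the generational price ratios above, using the launch
# MSRPs quoted in this post ($499 / $699 / $1,599).

msrp = {"GTX 680": 499, "GTX 1080 Ti": 699, "RTX 4090": 1599}

def ratio(old: str, new: str) -> float:
    """Price of `new` as a multiple of `old`."""
    return msrp[new] / msrp[old]

print(f"GTX 680 -> 1080 Ti:  {ratio('GTX 680', 'GTX 1080 Ti'):.2f}x")  # 1.40x (+40%)
print(f"1080 Ti -> RTX 4090: {ratio('GTX 1080 Ti', 'RTX 4090'):.2f}x") # 2.29x
```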

The RX 9070 is only showing to come close to the RX 7900 GRE, with about a 20% increase in RT fps, while having the same total CU count. AMD did the same thing with RDNA3: they gave 20% more RT cores and barely made it faster than RDNA2 by cheaping out on shaders and RT cores. Meanwhile, every single one of Nvidia's generations has increased both shaders and RT cores by far more than 20%.
The card is horribly overpriced at $649, or even $449. It is so slow compared to the 5000 series that it should have a short release window of 6 to 8 months; there is almost no reason to release it at all.
The price should be $350; it's last-generation-tier performance, and it will still be behind.
 

Frick

Fishfaced Nincompoop
Joined
Feb 27, 2006
Messages
19,684 (2.86/day)
Location
w
System Name Black MC in Tokyo
Processor Ryzen 5 7600
Motherboard MSI X670E Gaming Plus Wifi
Cooling Be Quiet! Pure Rock 2
Memory 2 x 16GB Corsair Vengeance @ 6000Mhz
Video Card(s) XFX 6950XT Speedster MERC 319
Storage Kingston KC3000 1TB | WD Black SN750 2TB |WD Blue 1TB x 2 | Toshiba P300 2TB | Seagate Expansion 8TB
Display(s) Samsung U32J590U 4K + BenQ GL2450HT 1080p
Case Fractal Design Define R4
Audio Device(s) Plantronics 5220, Nektar SE61 keyboard
Power Supply Corsair RM850x v3
Mouse Logitech G602
Keyboard Dell SK3205
Software Windows 10 Pro
Benchmark Scores Rimworld 4K ready!
Please go ahead and read what they wrote about the price, and what the situation was when they wrote it. You want so badly to drive your point home that you didn't even read it.

"AMD hasn't officially dropped the price of the RX 6950 XT to $599, at least as far as we're aware — the AMD store lists the reference card at $699 (and it's out of stock). But Newegg, Amazon, and others have regularly had RX 6950 XT cards priced in the $599–$699 range for several months, basically since the RX 7900 XT launched. Will such cards continue to exist, now that the RTX 4070 launch is over? Perhaps until they're all sold, or until a lower tier RX 7800- or 7700-series card comes along with similar performance (and hopefully at a better price).

And that's the problem. One card has an official $599 MSRP and should remain at that price point going forward, the other got a price promotion to hit $599 and those cards seem to have disappeared now. The best we can find is $619 at present. Sure, it's "only" $20 extra, but it's also for roughly equivalent performance across our gaming suite, and again we favor Nvidia's GPU across a larger suite of testing that factors in DLSS and AI.

If you can find an RX 6950 XT for the same price as an RTX 4070, it's worth considering, but that's not a decisive win. And in fact, since you'll also use about 100–150 watts more power while gaming with the RX 6950 XT, depending on how much you play games, the ongoing cost of owning the RX 6950 XT could also be higher."

I genuinely don't know what we're arguing about anymore.
 
Joined
Dec 25, 2020
Messages
7,076 (4.83/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
I genuinely don't know what we're arguing about anymore.

I stopped following some time ago, tbh. These always devolve into some victim complex where a pitiful one is facing a great evil, followed by a stream of self-reassuring posts and brand loyalty remarks. It's amusing at first, but it gets old fast.
 
Joined
Feb 24, 2023
Messages
3,151 (4.68/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC / FULLRETARD
Processor i5-12400F / 10600KF / C2D E6750
Motherboard Gigabyte B760M DS3H / Z490 Vision D / P5GC-MX/1333
Cooling Laminar RM1 / Gammaxx 400 / 775 Box cooler
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333 / 3 GB DDR2-700
Video Card(s) RX 6700 XT / R9 380 2 GB / 9600 GT
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 / 500 GB HDD
Display(s) Compit HA2704 / MSi G2712 / non-existent
Case Matrexx 55 / Junkyard special / non-existent
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / non-existent
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
Why is AMD making a 0-class GPU?
Because they started realistically anticipating the sales.

No one but about a couple dozen ludomaniacs wants a 330 W midrange krankenachtung that's pretending to be a GPU whilst being open about being an AMD product. Catching up with upper midrange Ada in terms of RT performance could have been an achievement three years ago when Ada GPUs didn't exist. In this day and age, this is like a postman finally delivering you a fax machine. Cool but you ordered that a lifetime ago.

This is a loop of underdelivery. AMD promise 7th heaven but what you see on shelves is shitposting. Then they don't get good revenue and once again promise sky high quality but then the product/feature comes out a couple years too late and it's still worse than what competition had readily available when AMD only were running an ad campaign. Disregard facts, pretend it's a part of a genius plan, get set, repeat.

What AMD need is to become pounders, not smacktalkers. Or to sell their graphics division to someone who ACTUALLY cares.
 
Joined
Jan 14, 2019
Messages
12,648 (5.82/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Because they started realistically anticipating the sales.

No one but about a couple dozen ludomaniacs wants a 330 W midrange krankenachtung that's pretending to be a GPU whilst being open about being an AMD product. Catching up with upper midrange Ada in terms of RT performance could have been an achievement three years ago when Ada GPUs didn't exist. In this day and age, this is like a postman finally delivering you a fax machine. Cool but you ordered that a lifetime ago.

This is a loop of underdelivery. AMD promise 7th heaven but what you see on shelves is shitposting. Then they don't get good revenue and once again promise sky high quality but then the product/feature comes out a couple years too late and it's still worse than what competition had readily available when AMD only were running an ad campaign. Disregard facts, pretend it's a part of a genius plan, get set, repeat.

What AMD need is to become pounders, not smacktalkers. Or to sell their graphics division to someone who ACTUALLY cares.
I don't think RT is that important as long as games are largely built around consoles, and consoles are built largely around AMD. You can see that in how most games are very conservative with forced RT options, treating it as an optional feature instead.
 
Joined
Feb 24, 2023
Messages
3,151 (4.68/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC / FULLRETARD
Processor i5-12400F / 10600KF / C2D E6750
Motherboard Gigabyte B760M DS3H / Z490 Vision D / P5GC-MX/1333
Cooling Laminar RM1 / Gammaxx 400 / 775 Box cooler
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333 / 3 GB DDR2-700
Video Card(s) RX 6700 XT / R9 380 2 GB / 9600 GT
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 / 500 GB HDD
Display(s) Compit HA2704 / MSi G2712 / non-existent
Case Matrexx 55 / Junkyard special / non-existent
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / non-existent
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
as long as games are largely built around consoles, and consoles are built largely around AMD.
Sony explicitly threatened to ditch AMD if the latter didn't improve RT. Console gaming also demands RT, but AMD doesn't deliver on it, which is why console games aren't exactly the pinnacle of rays and bounces. RT is going to become more and more popular in game development regardless of what you or I think about it.

Not my point anyway. My point is that whatever AMD made feature-wise over the last decade was at least a year later than the green equivalent and never caught up in quality. FSR from almost 2025 is still worse than DLSS from 2020. "But it's brand agnostic and open source and shiet" - why should I care, if it doesn't improve my experience, or if similar performance at similar quality can be had on a cheaper green GPU because DLSS P beats FSR Q not only in speed but also in quality in almost all games? RT from almost 2025 is still worse than on the RTX 3090, a 2020 GPU. Professional workload performance is still worse, too. AMD's frame generation is still worse than what NVIDIA had on day 0. This applies to everything.
 
Joined
Jan 14, 2019
Messages
12,648 (5.82/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Sony explicitly threatened to ditch AMD if the latter didn't improve RT. Console gaming also demands RT, but AMD doesn't deliver on it, which is why console games aren't exactly the pinnacle of rays and bounces. RT is going to become more and more popular in game development regardless of what you or I think about it.

Not my point anyway. My point is that whatever AMD made feature-wise over the last decade was at least a year later than the green equivalent and never caught up in quality. FSR from almost 2025 is still worse than DLSS from 2020. "But it's brand agnostic and open source and shiet" - why should I care, if it doesn't improve my experience, or if similar performance at similar quality can be had on a cheaper green GPU because DLSS P beats FSR Q not only in speed but also in quality in almost all games? RT from almost 2025 is still worse than on the RTX 3090, a 2020 GPU. Professional workload performance is still worse, too. AMD's frame generation is still worse than what NVIDIA had on day 0. This applies to everything.
I can't really add much to the topic other than that I don't give a rat's arse about DLSS or FSR, let alone frame generation, which is buggy as hell most of the time and shoots your input lag into the sky - except when you have a high enough base frame rate, in which case you don't need FG in the first place.

The fact that AMD fails to deliver on bullshit gimmicks invented by Nvidia to artificially divide the market bears no significance in my eyes.
 
Joined
Jan 2, 2024
Messages
636 (1.76/day)
Location
Seattle
System Name DevKit
Processor AMD Ryzen 5 3600 ↗4.0GHz
Motherboard Asus TUF Gaming X570-Plus WiFi
Cooling Koolance CPU-300-H06, Koolance GPU-180-L06, SC800 Pump
Memory 4x16GB Ballistix 3200MT/s ↗3800
Video Card(s) PowerColor RX 580 Red Devil 8GB ↗1380MHz ↘1105mV, PowerColor RX 7900 XT Hellhound 20GB
Storage 240GB Corsair MP510, 120GB KingDian S280
Display(s) Nixeus VUE-24 (1080p144)
Case Koolance PC2-601BLW + Koolance EHX1020CUV Radiator Kit
Audio Device(s) Oculus CV-1
Power Supply Antec Earthwatts EA-750 Semi-Modular
Mouse Easterntimes Tech X-08, Zelotes C-12
Keyboard Logitech 106-key, Romoral 15-Key Macro, Royal Kludge RK84
VR HMD Oculus CV-1
Software Windows 10 Pro Workstation, VMware Workstation 16 Pro, MS SQL Server 2016, Fan Control v120, Blender
Benchmark Scores Cinebench R15: 1590cb Cinebench R20: 3530cb (7.83x451cb) CPU-Z 17.01.64: 481.2/3896.8 VRMark: 8009
That's just an excuse for not making enthusiast cards. Enthusiast cards also sell the midrange and low-end cards.
Hello? NOT an appropriate position. AMD is making a massive pivot with RDNA+CDNA unification, returning to the kind of product stack that used to be the standard. They have let us know this is the last stop before the big change. So who is expected to buy the new range of cards? We don't even have a handle on that. It could be guys like me on an RX 480/580 who buy once every 5-10 years, try to hang on as the card gets long in the tooth, then run into problems with any new shift. Maybe it's people from the nVidia camp who have been in it too long and want to test the AMD waters. We all expect the next flagship to be a really small die, with the next round of enthusiast cards shipping much later under UDNA, most likely with MUCH larger dies comparable to the 6950 XT given the way AMD has been doing packaging the last few generations. Supposedly the kind of cut we saw between Pascal and Volta (Tesla).

Imagine calling for enthusiast cards on a new process with last gen's memory and speeds while some very serious hardware and software changes are in the works for a distant scheduled release, one likely to double or triple the die size (mm²) with the promise of significantly less headache. What I'm saying is that the 8000/9000 rollout is likely going to be the redheaded stepchild of the bunch, and then BAM, we get something completely insane like Tesla V100 die sizes. The small chip will be really cool, but if it's the designated capstone of this era, that party is going to be really short-lived.
RDNA 4 is probably a fixed RDNA 3 plus the RT improvements Sony demanded from them to remain a customer. AMD's only new problem is Intel, but I think they will try to ignore Intel for now.
This is not even a guess. RDNA3 is a complete product stack, and RDNA4 is some supposed "bugfix" even though there's no new information on features. Everybody who has bought a 7000-series card has already settled in with whatever features and issues it has, and they're good for the next couple of years, if not the rest of the decade. Those guys are all set. Weird miner products like the BC-250 have already exposed the reality of PS5 hardware, and we're still in this really stupid MX standoff with GPUs, as none of the three companies are producing anything that competes with one another. We don't have enough information on how this goes, and it gets shakier with each dumb leak thread. Guess we'll have to wait until next week to be sure.
"AMD hasn't officially dropped the price of the RX 6950 XT to $599, at least as far as we're aware — the AMD store lists the reference card at $699 (and it's out of stock). But Newegg, Amazon, and others have regularly had RX 6950 XT cards priced in the $599–$699 range for several months."
$500-550ish *with water block
The people that waited on 6800XT-6950XT price drops have been snackin GOOD these past several months. The guys picking up 6000 series mid-range are all happy.
I would consider it too, but there's this nagging voice telling me it's cheaper to go with the newer product, even paying for creator features I don't need, like AV1 encode and a buffer over 16GB.
I'll never need these on desktop since 1080p144 is my max, but the moment I hop into VR or some 3D kit, it's an immediate jump to something like dual 4K. There's no ignoring it anymore.
But... a lot can happen in a year. We could see weird price hikes and drops here and there, maybe a whole new semiconductor or a new manufacturing technology. We gonn' find out.
 
Joined
Mar 23, 2005
Messages
4,094 (0.57/day)
Location
Ancient Greece, Acropolis (Time Lord)
System Name RiseZEN Gaming PC
Processor AMD Ryzen 7 5800X @ Auto
Motherboard Asus ROG Strix X570-E Gaming ATX Motherboard
Cooling Corsair H115i Elite Capellix AIO, 280mm Radiator, Dual RGB 140mm ML Series PWM Fans
Memory G.Skill TridentZ 64GB (4 x 16GB) DDR4 3200
Video Card(s) ASUS DUAL RX 6700 XT DUAL-RX6700XT-12G
Storage Corsair Force MP500 480GB M.2 & MP510 480GB M.2 - 2 x WD_BLACK 1TB SN850X NVMe 1TB
Display(s) ASUS ROG Strix 34” XG349C 144Hz 1440p + Asus ROG 27" MG278Q 144Hz WQHD 1440p
Case Corsair Obsidian Series 450D Gaming Case
Audio Device(s) SteelSeries 5Hv2 w/ Sound Blaster Z SE
Power Supply Corsair RM750x Power Supply
Mouse Razer Death-Adder + Viper 8K HZ Ambidextrous Gaming Mouse - Ergonomic Left Hand Edition
Keyboard Logitech G910 Orion Spectrum RGB Gaming Keyboard
Software Windows 11 Pro - 64-Bit Edition
Benchmark Scores I'm the Doctor, Doctor Who. The Definition of Gaming is PC Gaming...
These upcoming GPUs should be priced at $499 USD at most. Anything higher and AMD will have issues trying to capture new market share, IMO. They might even end up losing market share. RDNA3 was already overpriced, hence the current market share situation.

If AMD wants to gain share, it should cater to the mainstream market with this GPU launch.

Price will be the most important factor here, and we all know AMD will decide it after Nvidia releases theirs, and that's where AMD's problem is.
I agree, as they priced themselves out of the market with RDNA3. And as we see today, they continue to lose market share against Ngreedia. As I mentioned before, the top-of-the-line mainstream RDNA4 GPU, the RX 9070 XT, must not exceed $499 USD. If it does, AMD is not interested in gaining market share.
BTW, I hate the 9070 naming scheme. They should have stuck with 8700 XT or 9700 XT.