
AMD Announces the $999 Radeon RX 7900 XTX and $899 RX 7900 XT, 5nm RDNA3, DisplayPort 2.1, FSR 3.0 FluidMotion

Joined
Mar 10, 2010
Messages
11,878 (2.20/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360 EK extreme rad + 360 EK slim, all push; CPU EK Suprim, GPU full cover, all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
Thing is, they attacked Nvidia, but when it came to making a real show of how good the 7900 XTX is, it was a fail performance-wise. On top of that, they went chiplet to make it cheaper, but the price is still hiked up.

They should be ashamed of the details released on the card(s); they acted like kids.
It wasn't the best performance preview, but it was an architectural PR release. I would also say that both companies tend to make quick and big strides in driver development over the first few months of a new architecture's release.
It might make sense to let later reviews on newer drivers speak directly to its performance; closer to release, AMD's drivers will be better.

I believe they went chiplets to make their cards viable while also advancing their knowledge of 2.5D and 3D packaging, and they clearly weren't ready for side-by-side GPU tiles, so a baby step with MCDs and a GCD was produced. Imagine the same chip monolithic: it would have been big, expensive, and in the same performance band anyway, but it would also likely have been £1,600, a harder sell.
 
Joined
Jan 14, 2019
Messages
12,627 (5.81/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
- Throwing shade at Nvidia every chance they got, making their fanboys think the 7900 XTX was actually super fast, and then not even comparing it with Nvidia's GPUs.
Because there's no direct competition for Nvidia this time around. I thought we'd already established that.

- Ridiculously stupid presentation: 8K gaming, the DisplayPort 800 Hz demo, throwing 7900 XTX FPS numbers out and making viewers guess what they were meant to be (max? avg?).
It's a technological presentation aimed at DP 2.1. If you've seen a product launch video from any company that specifies whether demonstrated FPS numbers are avg or max, please show me. I haven't.

- That monitor advertisement :roll:
They needed that to justify DP 2.1. You don't care about it, I don't care about it; we're free to move on and look at other things.

- Not bothering to show their cherry-picked benchmarks against any of their old flagships or Nvidia's.
Because they're not aiming at old flagships. They're releasing a product on its own merits. Besides, if they compared against old Nvidia products, that would only signal that they're behind, which is not necessarily the case (at least technologically), regardless of raw performance data.

Edit: But they actually did show comparisons against the 6950 XT.

Thing is, they attacked Nvidia, but when it came to making a real show of how good the 7900 XTX is, it was a fail performance-wise. On top of that, they went chiplet to make it cheaper, but the price is still hiked up.

They should be ashamed of the details released on the card(s); they acted like kids.
How is it a fail? Do you know something that the rest of us don't?

This time Nvidia will not know the performance of these cards until 2 weeks before Xmas.
This is probably why they didn't give more detailed performance numbers. They didn't want Nvidia to gain the upper hand by knowing what they're up to. Nvidia still hasn't released the 4080 after all. Imagine if the 7900 XTX ends up being faster than the $1200 4080. I'm not saying that it will be, but it might be.
 
Last edited:
Joined
Sep 17, 2014
Messages
22,724 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
- HDMI 2.1 is still there. No DP 2.1 monitors are on the market just yet; they'll arrive in 2023 at the earliest. DSC is still there with zero visible difference.
- Such heavy GPUs have existed before. I've not heard of any mass reports of issues related to their weight.
- It's not. The FE card runs at around 67°C.
- Capped to ~315 W (70% of its 450 W TDP), it loses less than 5% of performance (quick math after this list).
- The 90 series cards have always been super expensive. They are not for the average gamer.
- I don't give a .... about its UI. It works. At the same time, I get lost in AMD's UI. Which tab do I need? Where's the option I'm looking for? Where's digital vibrance? People have been asking AMD for that option for years. Competitive CS:GO players do not touch AMD cards because of that.
- EVGA, what? Who the .... cares? 95% of the world has never seen an EVGA card.
- Out of literally tens of thousands of cards sold, fewer than a few dozen people have reported issues. And it looks like all of them did something wrong, such as bending the cable too much or not fully inserting the adapter. Again, a card for the rich or for professionals.
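On the power-cap point, the numbers are easy to sanity-check. A minimal sketch (the 450 W stock TDP is the 4090's spec; the <5% loss is the claim being tested, not my measurement):

```python
# Back-of-the-envelope check on the power-cap claim.
tdp_w = 450                           # RTX 4090 stock TDP
capped_w = 0.70 * tdp_w               # ~315 W, the "70% of TDP" cap
perf_retained = 0.95                  # "loses less than 5% of performance"
perf_per_watt_gain = perf_retained / 0.70
print(f"cap: {capped_w:.0f} W, perf/W vs stock: ~{perf_per_watt_gain:.2f}x")  # ~1.36x
```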

Literally not a single argument.

It's a free market. AMD is all yours. Remember the Radeon 9800. Should I remind you of its release price? It was $400. Corporations are not your friends, even though you want to love AMD so much.
I'm inclined to agree here.

The 4090 isn't a bad product. It's just pushed too far in every metric: size/dimensions, wattage, price, and marketing. It's just another gen with just another Nvidia card on top of the stack; meanwhile, the price moves further away from reality every generation post-Pascal. It's literally overinflated, and it's also literally a better product in every way once you dial it back a little. Remarkably similar to Intel's offerings - so here we have two companies with a similar grasp on the market over their direct competition, using the same approach to keep generational leaps 'intact'.

Meanwhile, we've seen new Ryzen chips with similar, yet still more conservative, ideas and more sensible limitations - even if 95°C is similarly uncanny, it's still less, as is peak TDP, as is tuning out of the box. And the GPU now confirms a similar approach from AMD.

The trend is there... but the approaches still differ as do the results. Time will tell what was the best way forward...
 
Joined
Sep 17, 2014
Messages
22,724 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
You can keep your wccftech where the sun doesn't shine; don't even remotely try to make an argument based on that source, or on YT, with me. All I will point out is your own sheep mentality; scroll a few pages back for proof. Or zoom in on my avatar and consider what it tries to convey.

Your laugh smilies suit neither you nor your responses. No need to make a fool of yourself.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.60/day)
Location
Ex-usa | slava the trolls
You can keep your wccftech where the sun doesn't shine; don't even remotely try to make an argument based on that source, or on YT, with me. All I will point out is your own sheep mentality; scroll a few pages back for proof.

The RTX 4090 is not just a bad product, it's an awful product that should not exist at all. :D

The NVIDIA GeForce RTX 4090 is the newest graphics card from the company. Its physical size and power were said to be unmatched. However, since its release, the graphics card has been reported to overheat its connection, melting the connection port and the cable. A recent post on Reddit shows that a native ATX 3.0 power supply using the 12VHPWR power connector is now having the same melting issues.

:D
 
Joined
Jun 2, 2017
Messages
9,405 (3.40/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
I'm inclined to agree here.

The 4090 isn't a bad product. It's just pushed too far in every metric: size/dimensions, wattage, price, and marketing. It's just another gen with just another Nvidia card on top of the stack; meanwhile, the price moves further away from reality every generation post-Pascal. It's literally overinflated, and it's also literally a better product in every way once you dial it back a little. Remarkably similar to Intel's offerings - so here we have two companies with a similar grasp on the market over their direct competition, using the same approach to keep generational leaps 'intact'.

Meanwhile, we've seen new Ryzen chips with similar, yet still more conservative, ideas and more sensible limitations - even if 95°C is similarly uncanny, it's still less, as is peak TDP, as is tuning out of the box. And the GPU now confirms a similar approach from AMD.
I watched der8auer load a VBIOS that allowed the 4090 to draw 1,000 W at 1.35 V. It's insane that a GPU can pull that amount of power, but as for the performance increase, it was underwhelming outside of benchmark scores.
 
Joined
Sep 17, 2014
Messages
22,724 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
The RTX 4090 is not just a bad product, it's an awful product that should not exist at all. :D



:D
Right, let's go back to where we were: you overinflated the negatives, and I'm bringing some nuance to that comparison by saying it was pushed too far. I'm fully aware of the 12VHPWR issue, but it's like Intel CPUs: a power-limited chip makes for a highly performant piece of silicon with great efficiency.

What are you looking for exactly? I don't get it.

I watched der8auer load a VBIOS that allowed the 4090 to draw 1,000 W at 1.35 V. It's insane that a GPU can pull that amount of power, but as for the performance increase, it was underwhelming outside of benchmark scores.
Yeah, it's the same shit we've been seeing since GPU Boost 3.0: the overclock is done out of the box. Any extra investment is futile.

But now we've crossed the barrier where not only is it effectively OC'd out of the box (or just pushed out of its efficiency curve), you also get to invest in specific cooling to keep the whole thing from running way below advertised speeds. These top-end products don't really cost what the spec sheet says. They're way, way more expensive, to get that last 5% of perf that gets eclipsed just a year later :p

Like I said elsewhere... we're in silly land with the top end of every consumer chip stack right now. And it will continue until the penny drops, collectively, and we stop boasting about burning north of 1 kW to play a game. Especially in this age.
 
Last edited:
Joined
Sep 3, 2019
Messages
3,600 (1.85/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 200W PPT limit, 80C temp limit, CO -6-14, +50MHz (up to 5.0GHz)
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F39b, AGESA V2 1.2.0.C
Cooling Arctic Liquid Freezer II 420mm Rev7 (Jan 2024) with off-center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3667MT/s 1.42V CL16-16-16-16-32-48 1T, tRFC:280, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~467W (382W current) PowerLimit, 1060mV, Adrenalin v24.12.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR400/1000, VRR on
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, ATX v2.4, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v24H2, OSBuild 26100.2605), upgraded from Win10 to Win11 on Jan 2024
I didn't even think about that. Didn't they increase the Infinity Cache size? That alone should have benefits.
They actually reduced the Infinity Cache amount compared to the RX 6000 series (128 MB >> 96 MB) for the top GPUs.
Yet it is way faster and increases performance because of the new architecture's interconnect between dies.


Basically, we are talking about up to a few TB/s of actual effective bandwidth. That's why bus width alone hasn't meant much on AMD since the introduction of Infinity Cache.

So they can always increase it further beyond 96 MB, maybe even double it...
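To picture why bus width alone stopped telling the story, here's a toy blend of cache and VRAM bandwidth. The hit rates and the cache bandwidth below are illustrative assumptions, not AMD specs; only the 960 GB/s raw figure (384-bit bus at 20 Gbps GDDR6) matches the announced 7900 XTX memory setup:

```python
# Toy model: accesses served by the cache run at cache speed, the rest hit VRAM.
def effective_bandwidth(vram_gbs: float, cache_gbs: float, hit_rate: float) -> float:
    return hit_rate * cache_gbs + (1.0 - hit_rate) * vram_gbs

vram = 384 * 20 / 8     # 960 GB/s raw (384-bit bus, 20 Gbps GDDR6)
cache = 5000.0          # hypothetical Infinity Cache bandwidth, "a few TB/s"
for hit in (0.3, 0.5, 0.7):
    print(f"{hit:.0%} hit rate -> ~{effective_bandwidth(vram, cache, hit):,.0f} GB/s")
```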
 
Last edited:

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.60/day)
Location
Ex-usa | slava the trolls
Let's say the 4090 is made for "hall of fame" benchmarking and record scores for users called K|NGP|N and the like.
But for the regular market, the risk of the card becoming a real fire hazard is something like 50-50 - in other words, it's close to impossible for the average user to keep the card alive.

I am not "overinflating" the negatives - the negatives do exist, and this time they are extremely serious - maybe the most serious since the original Fermi GTX 480 launch 13 years ago.
I will not support yours or anyone's "political correctness" and underestimation of the serious risks.
 
Joined
Jul 9, 2015
Messages
3,413 (0.99/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Can someone confirm/deny that "the 4090 is only 33% faster in Lumen than the 3090 Ti" is true?

If you're wondering what the heck that is, here it is (an Unreal 5 demo using Lumen):


Cool, eh? That's using "software ray tracing". Now, "hardware RT" in Lumen should be faster, shouldn't it?
Let this sink in:


Lumen also comes with hardware ray-tracing, but most developers will be sticking to the former, as it's 50% slower than the SW implementation even with dedicated hardware such as RT cores. Furthermore, you can't have overlapping meshes with hardware ray-tracing, or masked meshes either, as they greatly slow down the ray traversal process. Software ray tracing basically merges all the overlapping meshes into a single distance field, as explained above.


a better product in every way once you dial it back a little
Indeed.
Such as the 4080 12 GB, after seeing what AMD is up to.

You dial it back a little:

nVidia "unlaunches" 4080 12GB

and suddenly it's a better product than before. :D
 

AsRock

TPU addict
Joined
Jun 23, 2007
Messages
19,113 (2.99/day)
Location
UK\USA
It wasn't the best performance preview, but it was an architectural PR release. I would also say that both companies tend to make quick and big strides in driver development over the first few months of a new architecture's release.
It might make sense to let later reviews on newer drivers speak directly to its performance; closer to release, AMD's drivers will be better.

I believe they went chiplets to make their cards viable while also advancing their knowledge of 2.5D and 3D packaging, and they clearly weren't ready for side-by-side GPU tiles, so a baby step with MCDs and a GCD was produced. Imagine the same chip monolithic: it would have been big, expensive, and in the same performance band anyway, but it would also likely have been £1,600, a harder sell.

Not the best? It was terrible. I wasn't expecting it to be better or as good; I just wanted something from them that didn't read like it was written up by kids.

As for side-by-side GPU tiles, I never expected that, as they've only just started with chiplets - so, step by step, and they make extra money at the very least.

I'll buy one in December if I get the chance. I would have been happy with the 6900 XT, but AMD seems to cut support after 6-8 years or so, and I was thinking it might be cut even sooner.
 
Joined
Jan 27, 2015
Messages
1,747 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
Because there's no direct competition for Nvidia this time around. I thought we'd already established that.

This is actually the thing people are arguing about. AMD has pretty clearly ceded the high end and will have nothing to compete beyond the 4080 16GB - and maybe not even that.

For this reason, the price comparisons vs the 4090 are also fallacies. A $999 AMD flagship card probably lands below the $1199 4080 16GB in performance. They are in the same ballpark in price, given that Nvidia has always commanded a 10-20% premium.

Not that it matters to 99% of folks, who are not getting a 4090 anyway. Most of the people arguing here have older mid- or upper-midrange GPUs (now low end) and probably aren't in the market for a new one anyway, so they're just arguing about something they aren't going to buy from either corporation.
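The premium arithmetic is easy to check against the launch MSRPs (a quick sketch; list prices only, street prices will differ):

```python
# "Same ballpark once you add Nvidia's usual 10-20% premium"
xtx_usd, rtx4080_usd = 999, 1199
premium = (rtx4080_usd - xtx_usd) / xtx_usd
print(f"4080 16GB premium over the 7900 XTX: {premium:.0%}")  # ~20%
```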
 
Joined
Sep 3, 2019
Messages
3,600 (1.85/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 200W PPT limit, 80C temp limit, CO -6-14, +50MHz (up to 5.0GHz)
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F39b, AGESA V2 1.2.0.C
Cooling Arctic Liquid Freezer II 420mm Rev7 (Jan 2024) with off-center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3667MT/s 1.42V CL16-16-16-16-32-48 1T, tRFC:280, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~467W (382W current) PowerLimit, 1060mV, Adrenalin v24.12.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR400/1000, VRR on
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, ATX v2.4, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v24H2, OSBuild 26100.2605), upgraded from Win10 to Win11 on Jan 2024
This is actually the thing people are arguing about. AMD has pretty clearly ceded the high end and will have nothing to compete beyond the 4080 16GB - and maybe not even that.

For this reason, the price comparisons vs the 4090 are also fallacies. A $999 AMD flagship card probably lands below the $1199 4080 16GB in performance. They are in the same ballpark in price, given that Nvidia has always commanded a 10-20% premium.

Not that it matters to 99% of folks, who are not getting a 4090 anyway. Most of the people arguing here have older mid- or upper-midrange GPUs (now low end) and probably aren't in the market for a new one anyway, so they're just arguing about something they aren't going to buy from either corporation.
How did you come up with that conclusion? Almost nothing of what you wrote here adds up...
Everything so far suggests that the $900 7900 XT will be the rival of the $1200 4080 16GB, and the $1000 7900 XTX will be just short of the $1600 4090, things that can also change with AIB OC variants.
All this in rasterization, just to be clear. RT on the new AMD GPUs is a full step behind RTX 40 (3090/Ti territory).
 
Joined
Jan 27, 2015
Messages
1,747 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
How did you come up with that conclusion? Almost nothing of what you wrote here adds up...
Everything so far suggests that the $900 7900 XT will be the rival of the $1200 4080 16GB, and the $1000 7900 XTX will be just short of the $1600 4090, things that can also change with AIB OC variants.
All this in rasterization, just to be clear. RT on the new AMD GPUs is a full step behind RTX 40 (3090/Ti territory).

Did you not read my post?

I said "A $999 AMD flagship card that probably lands below the $1199 4080 16GB in performance. They are in the same ballpark in price, given that Nvidia has always commanded a 10-20% premium."

The rest of your post is gibberish, as if Nvidia doesn't have AIB OC variants as well.
 
Joined
Sep 3, 2019
Messages
3,600 (1.85/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 200W PPT limit, 80C temp limit, CO -6-14, +50MHz (up to 5.0GHz)
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F39b, AGESA V2 1.2.0.C
Cooling Arctic Liquid Freezer II 420mm Rev7 (Jan 2024) with off-center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3667MT/s 1.42V CL16-16-16-16-32-48 1T, tRFC:280, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~467W (382W current) PowerLimit, 1060mV, Adrenalin v24.12.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR400/1000, VRR on
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, ATX v2.4, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v24H2, OSBuild 26100.2605), upgraded from Win10 to Win11 on Jan 2024
I said "A $999 AMD flagship card that probably lands below the $1199 4080 16GB in performance. They are in the same ballpark in price, given that Nvidia has always commanded a 10-20% premium."
Yes, I did read it, and none of it makes sense. Just like your mention of the low bit rate of the new (or future) GPUs in another thread(?)

The gap between the 4090 and the 4080 16GB is big, and at least the 7900 XTX will land between them, if not the 7900 XT too, if the 6950 XT vs 7900 XTX scaling (1.5x) holds up.

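For what it's worth, here's where AMD's own "up to 1.5x the 6950 XT" figure would land on a relative-performance scale. The 4080 and 4090 placements below are hypothetical placeholders for illustration, not measured results:

```python
# Rough placement of a 1.5x-of-6950-XT card between the 4080 16GB and the 4090.
baseline_6950xt = 100                    # 6950 XT as the 100% reference
xtx_estimate = 1.5 * baseline_6950xt     # AMD's first-party "up to 1.5x" claim
rtx4080_guess, rtx4090_guess = 130, 165  # hypothetical 4K placements
print(f"7900 XTX estimate: {xtx_estimate:.0f}% of a 6950 XT")
print("lands between them:", rtx4080_guess < xtx_estimate < rtx4090_guess)
```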
 
Joined
Jan 27, 2015
Messages
1,747 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
Yes, I did read it, and none of it makes sense. Just like your mention of the low bit rate of the new (or future) GPUs in another thread(?)

The gap between the 4090 and the 4080 16GB is big, and at least the 7900 XTX will land between them, if not the 7900 XT too, if the 6950 XT vs 7900 XTX scaling (1.5x) holds up.



It's pretty well known at this point, at least by those who haven't deeply imbibed the AMD Kool-Aid and aren't blindly making some kind of excuse (albeit for what, I know not), that the 7900 XT and possibly the 7900 XTX are competitors to the 4080 16GB.

But you keep thinking what you want to think. I'm sure you'll have some reason or other later on for being blind to the obvious...

[Radeon RX 7900 XTX] is designed to go against the 4080, and we don't have benchmark numbers on the 4080. That's the primary reason why you didn't see any NVIDIA compares. […] A $999 card is not a 4090 competitor, which costs 60% more; this is a 4080 competitor.

— Frank Azor to PCWorld


 
Joined
Sep 3, 2019
Messages
3,600 (1.85/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 200W PPT limit, 80C temp limit, CO -6-14, +50MHz (up to 5.0GHz)
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F39b, AGESA V2 1.2.0.C
Cooling Arctic Liquid Freezer II 420mm Rev7 (Jan 2024) with off-center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3667MT/s 1.42V CL16-16-16-16-32-48 1T, tRFC:280, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~467W (382W current) PowerLimit, 1060mV, Adrenalin v24.12.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR400/1000, VRR on
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, ATX v2.4, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v24H2, OSBuild 26100.2605), upgraded from Win10 to Win11 on Jan 2024
It's pretty well known at this point, at least by those who haven't deeply imbibed the AMD Kool-Aid and aren't blindly making some kind of excuse (albeit for what, I know not), that the 7900 XT and possibly the 7900 XTX are competitors to the 4080 16GB.

But you keep thinking what you want to think. I'm sure you'll have some reason or other later on for being blind to the obvious...



I'm well aware of that statement. AMD can be very conservative in their statements at this point, for their own reasons, and people can think what they want and ignore the numbers.
 
Joined
Jan 27, 2015
Messages
1,747 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
I'm well aware of that statement. AMD can be very conservative in their statements at this point, for their own reasons, and people can think what they want and ignore the numbers.

Uh-huh... the chief marketing guy at AMD is being conservative about performance...
 
Joined
Sep 3, 2019
Messages
3,600 (1.85/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 200W PPT limit, 80C temp limit, CO -6-14, +50MHz (up to 5.0GHz)
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F39b, AGESA V2 1.2.0.C
Cooling Arctic Liquid Freezer II 420mm Rev7 (Jan 2024) with off-center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3667MT/s 1.42V CL16-16-16-16-32-48 1T, tRFC:280, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~467W (382W current) PowerLimit, 1060mV, Adrenalin v24.12.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR400/1000, VRR on
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, ATX v2.4, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v24H2, OSBuild 26100.2605), upgraded from Win10 to Win11 on Jan 2024
Uh-huh... the chief marketing guy at AMD is being conservative about performance...
Yes, I can understand your frustration...
 
Joined
Oct 26, 2022
Messages
57 (0.07/day)
@W1zzard What's the actual HDMI version of the 7900 XT & 7900 XTX?
The latest 2022 2.1a at 48 Gbps, or the 2020 2.1 at 40 Gbps like that of the 6900 & 6950 XT?
 
Joined
May 19, 2009
Messages
1,868 (0.33/day)
Location
Latvia
System Name Personal \\ Work - HP EliteBook 840 G6
Processor 7700X \\ i7-8565U
Motherboard Asrock X670E PG Lightning
Cooling Noctua DH-15
Memory G.SKILL Trident Z5 RGB Black 32GB 6000MHz CL36 \\ 16GB DDR4-2400
Video Card(s) ASUS RoG Strix 1070 Ti \\ Intel UHD Graphics 620
Storage 2x KC3000 2TB, Samsung 970 EVO 512GB \\ OEM 256GB NVMe SSD
Display(s) BenQ XL2411Z \\ FullHD + 2x HP Z24i external screens via docking station
Case Fractal Design Define Arc Midi R2 with window
Audio Device(s) Realtek ALC1150 with Logitech Z533
Power Supply Corsair AX860i
Mouse Logitech G502
Keyboard Corsair K55 RGB PRO
Software Windows 11 \\ Windows 10
The problem is that the budget cards will cost $500-600 too.

Yep, exactly this - and they are what the majority of people usually buy.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.60/day)
Location
Ex-usa | slava the trolls
Yep, exactly this - and they are what the majority of people usually buy.

No.

Current pricing in Germany:

Radeon RX 6400 - 169.00
Radeon RX 6500 XT - 199.00
Radeon RX 6600 - 261.99

Radeon RX 6600 XT - 339.00
Radeon RX 6650 XT - 339.00
Radeon RX 6700 XT - 439.00
Radeon RX 6750 XT - 499.90
Radeon RX 6800 - 559.00
Radeon RX 6800 XT - 635.90
Radeon RX 6900 XT - 748.00

Radeon RX 6950 XT - 899.00

The majority of people will buy up to the RX 6650 XT, which goes for 339 as of now, but its price should spiral downward because it's really only a 1080p card.
 
Joined
Sep 26, 2022
Messages
236 (0.29/day)
Location
Portugal
System Name Main
Processor 5700X
Motherboard MSI B450M Mortar
Cooling Corsair H80i v2
Memory G.SKILL Ripjaws V 32GB (2x16GB) DDR4-3600MHz CL16
Video Card(s) MSI RTX 3060 Ti VENTUS 2X OC 8GB GDDR6X
Display(s) LG 32GK850G
Case NOX HUMMER ZN
Power Supply Seasonic GX-750
No.

Current pricing in Germany:

Radeon RX 6400 - 169.00
Radeon RX 6500 XT - 199.00
Radeon RX 6600 - 261.99

Radeon RX 6600 XT - 339.00
Radeon RX 6650 XT - 339.00
Radeon RX 6700 XT - 439.00
Radeon RX 6750 XT - 499.90
Radeon RX 6800 - 559.00
Radeon RX 6800 XT - 635.90
Radeon RX 6900 XT - 748.00

Radeon RX 6950 XT - 899.00

The majority of people will buy up to the RX 6650 XT, which goes for 339 as of now, but its price should spiral downward because it's really only a 1080p card.
What about the 6700 non-XT?
 
Joined
Jul 9, 2015
Messages
3,413 (0.99/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
nothing to compete beyond the 4080 16GB - and maybe not even that.
In which la-la land will AMD have "nothing to compete" with a 40% cut-down of the 4090?
Tell us more about the "unlaunching" of the other 4080...

:D
 