
AMD Radeon RX 7900 XT Now $100 Cheaper Than GeForce RTX 4070 Ti SUPER

Joined
Sep 17, 2014
Messages
22,929 (6.07/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I just realized it's a futile argument. Neither side will ever concede on this; it's an endless quarrel. Right now, the deck is stacked against AMD. They're fighting an uphill battle: they're the ones trailing in the market, in performance, in features, etc., and that means they're the ones who need to overcome, adapt, and innovate in order to lead.

You guys are happy? Brilliant! That's a great sign. I hope they continue their work so that I can be happy with them again. There's nothing wrong with that. I just want the make-believe to end: pretending that issues aren't real, that arguments are overblown, and that the small indie is always being ruthlessly bullied by the greedy meanie... it has all gotten so old by now. Every time I read the word "nGreedia" I just feel bitter disappointment. I fully realize that I often get on AMD fans' case. It's generally done in good faith.



ok fine :)

Really, it's a lot less personal than it sounds. I've no real problem with you. It'd be good to see you broaden your horizons a little, though. AMD doesn't have your best interests at heart. :toast:
It is futile; that is exactly what I've concluded :) Glad we agree. It's not meant personally, indeed. Opinions differ, and they also change over time. For over a decade I would not touch AMD with a ten-foot pole because of arguments similar to the ones we see today.

The differences have been there for a long time. Today, I'm of the opinion that AMD has positioned itself much better, in that the chiplet is likely a far more crucial piece of tech for advancing GPUs than further refinement of monolithic dies, where it's already evident that we barely gain more FPS per dollar on the hardware alone. That's worrying and requires a solution, one even Nvidia is going to have to adopt.
 
Last edited:
Joined
Apr 29, 2014
Messages
4,323 (1.10/day)
Location
Texas
System Name SnowFire / The Reinforcer
Processor i7 10700K 5.1ghz (24/7) / 2x Xeon E52650v2
Motherboard Asus Strix Z490 / Dell Dual Socket (R720)
Cooling RX 360mm + 140mm Custom Loop / Dell Stock
Memory Corsair RGB 16gb DDR4 3000 CL 16 / DDR3 128gb 16 x 8gb
Video Card(s) GTX Titan XP (2025mhz) / Asus GTX 950 (No Power Connector)
Storage Samsung 970 1tb NVME and 2tb HDD x4 RAID 5 / 300gb x8 RAID 5
Display(s) Acer XG270HU, Samsung G7 Odyssey (1440p 240hz)
Case Thermaltake Cube / Dell Poweredge R720 Rack Mount Case
Audio Device(s) Realtec ALC1150 (On board)
Power Supply Rosewill Lightning 1300Watt / Dell Stock 750 / Brick
Mouse Logitech G5
Keyboard Logitech G19S
Software Windows 11 Pro / Windows Server 2016
Nah, I can assure you it's not the brand; people will eventually switch to praising whatever they think is more premium (without even necessarily buying it). Look at how Intel has slowly lost its mindshare: as soon as the 5800X3D made its appearance, a lot of Intel loyalists vanished.

If the next X3D CPU is $1,000, you'd absolutely have tons of people arguing it's actually OK because it's the fastest gaming CPU after all, and that if you think it's horrid value and a stupid choice, you're just a poor pleb or something.
I do agree; they have overcome Intel's stranglehold on the CPU market and have become a true competitor. But I think the CPU battle was a little easier than the GPU battle, which is going to take significantly more to win.
Yeah, if they did price the CPU like that, I'm sure we would hear that argument. Totally agree.

I spend a premium on premium features and a stable ecosystem. I have no loyalty to a brand, nor do I fall victim to marketing tricks. In fact, I often comment on Nvidia's marketing posts on social media in a negative light; for example, their recent trend of comparing Turing and Ampere rendering a native image against Ada with DLSS-G and DLSS-RR enabled is simply dishonest. AMD has not been able to provide any of that; if you purchase a 7900 XTX today, you're settling. You're invariably getting the second-class experience, and frankly, I didn't build my computer for that. Good for anyone who did, though. But the ravings and ramblings of AMD fans constantly dumping on "nGreedia" and aggressively defending each and every move of their beloved FRIEND help absolutely nobody either. It only feeds their delusion.

Nvidia's RT is also based on DXR. There's no difference, except they've got a generation's worth of a head start: Turing was DXR-capable and RDNA was not. Not only that, AMD also took the lazy route and didn't implement the low-performance but, at the time, important software fallback driver for it, something Nvidia went out of its way to add to Pascal. It may not have been fast enough for gamers, but it sure helped solidify GeForce as the premier RT vendor, simply because you could develop DXR software on Pascal but not on AMD's products. The result is that not only are Nvidia's drivers far more mature and their software more robust, but graphics programmers are also more familiar with how Nvidia hardware works, because they've had such a massive head start.

Since I play on Windows, I'm not exactly drawn to open source; in fact, I couldn't care less as long as I get the best experience. The vast majority of people share this view beyond a knee-jerk "oh yeah, FOSS is great, I love me some FOSS", seconds before pulling an iPhone out of their pocket. They like the free part; the open source part, only devs care about.
Settling for more performance where it matters? I mean, unless you buy the 4090, which is hands down the most powerful card, you're "settling" technically. I love my Titan X, I love my laptop with the 1660 Ti (though I don't heavily game on it), and there are plenty of good nVidia features that are appealing. However, the way we keep playing this features game when talking about ray tracing, as though the clouds parted and heaven shined this new feature down on us and we can never look back, is ridiculous. Same with DLSS, which I would argue is a much better feature overall than ray tracing, since it improves performance instead of killing it for some slightly better lighting effects (though neither DLSS nor FSR is perfect, and both can cause weird visual problems). Every feature has its upsides and downsides, but at the end of the day, raw performance is the most important thing on a gaming GPU when comparing similarly priced components, as that is what most people are going to rely on when using the card.

I mean, if we are going to talk about feature sets, why does the nVidia Control Panel look straight out of Windows XP while AMD's driver looks like a modern experience? We could even discuss a lot of the options in the AMD driver versus nVidia's.

My point is not to tell you that you made a horrible mistake buying the RTX 4080 over the RX 7900 XTX, or to tell anyone they made a bad decision on the card they bought. That is not for me to say, and even if I did, it would be irrelevant and pointless, because they are free to purchase what they like. But the constant threads telling people to kneel at the altar of ray tracing are not helping anyone (I am aiming that at everyone talking about it that way, not saying you're running around to every thread doing that), and they make it seem significantly bigger than it is. The 4090 is the only card that can run ray tracing somewhat acceptably, to where the games that actually show a noticeable difference can be run at playable FPS. (Yes, I am aware that mixing DLSS with it helps a lot; however, that situation is not perfect and can cause visual anomalies, and if we are talking about trying to make something look better, why use something that can potentially hurt visual clarity in the process?)
 
Joined
Dec 25, 2020
Messages
7,310 (4.92/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) NVIDIA RTX A2000
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse (2017)
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~

Quantifying "more performance where it matters" is crucial to making an informed decision, though. In raster it's a 4% weighted average including outliers, something the SUPER reduced to 1%. Factor in the value additions, the power consumption and efficiency, the support schedule, etc., and you start to get a pretty different picture. I would argue it's far more likely I'll need the extra leg-up when tackling an RT game than otherwise, and there the weighted gap is 16% (original) and 20% (SUPER) in Nvidia's favor. The point of buying an expensive (near-)flagship is to experience latest-generation games, after all.

We know AMD routinely refreshes and reskins their control panel every year. What matters is what's under the hood, though. Nvidia has just been cramming their feature controls under the "Manage 3D Settings" page of the age-old NVCP. It's less than optimal, but it's not really a deal breaker. Nvidia's drivers are time-tested at this point.
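For anyone unfamiliar with how review sites arrive at figures like "4% weighted average", it's a weighted mean of per-game performance ratios, with outliers down-weighted rather than dropped. A minimal sketch of the method; every FPS number and weight below is made up purely for illustration and comes from no actual review:

```python
# Weighted-average performance delta between two hypothetical cards.
# All figures below are invented, purely to illustrate the method.
games = {
    # title: (fps_card_a, fps_card_b, weight)
    "Game 1": (120, 115, 1.0),
    "Game 2": (90, 95, 1.0),
    "Game 3": (60, 52, 0.5),  # outlier: down-weighted instead of dropped
}

num = sum(w * (a / b) for a, b, w in games.values())
den = sum(w for _, _, w in games.values())
delta = num / den - 1
print(f"Card A vs Card B: {delta:+.1%}")  # → Card A vs Card B: +2.7%
```

The choice of weights (and whether to use a weighted arithmetic or geometric mean) is exactly why two outlets can quote different "average" gaps for the same pair of cards.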
 
Joined
Jan 14, 2019
Messages
13,565 (6.17/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanic
VR HMD Not yet
Software Linux gaming master race
Why are you still going on about the extra 20% in RT, when we've already established that it's still unplayable unless you lay down some cash for a 4090?

Before anybody says "market share" again:
[Attached screenshot: Screenshot_20240208_182821_YouTube.jpg]
 
Joined
Dec 25, 2020
Messages
7,310 (4.92/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) NVIDIA RTX A2000
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse (2017)
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~

We're talking about high-end GPUs aren't we? 4080+.
 
Joined
Jan 14, 2019
Messages
13,565 (6.17/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanic
VR HMD Not yet
Software Linux gaming master race
We're talking about high-end GPUs aren't we? 4080+.
Do you mean the 4090? AMD has nothing there, so I don't know where the extra 20% comes from. Or if you mean 4080 Super vs 7900 XTX, then I admit it's a mighty impressive battle, considering that the 4080 Super somehow managed to be cheaper than the vanilla 4080, almost matching the 7900 XTX in price. At that level, I really wouldn't think twice about getting Nvidia. But only at that level, and only as long as AMD doesn't lower the prices on the 7900 series (again), which they will have to.
 
Joined
Apr 29, 2014
Messages
4,323 (1.10/day)
Location
Texas
System Name SnowFire / The Reinforcer
Processor i7 10700K 5.1ghz (24/7) / 2x Xeon E52650v2
Motherboard Asus Strix Z490 / Dell Dual Socket (R720)
Cooling RX 360mm + 140mm Custom Loop / Dell Stock
Memory Corsair RGB 16gb DDR4 3000 CL 16 / DDR3 128gb 16 x 8gb
Video Card(s) GTX Titan XP (2025mhz) / Asus GTX 950 (No Power Connector)
Storage Samsung 970 1tb NVME and 2tb HDD x4 RAID 5 / 300gb x8 RAID 5
Display(s) Acer XG270HU, Samsung G7 Odyssey (1440p 240hz)
Case Thermaltake Cube / Dell Poweredge R720 Rack Mount Case
Audio Device(s) Realtec ALC1150 (On board)
Power Supply Rosewill Lightning 1300Watt / Dell Stock 750 / Brick
Mouse Logitech G5
Keyboard Logitech G19S
Software Windows 11 Pro / Windows Server 2016
I am not following your logic on this. The Super is barely 1% better overall than its non-Super counterpart in all the relative performance slides. Compared to the 7900 XTX, sure, at 1080p it's a 1% difference, but we are clearly seeing a lot of games become more CPU-bound, which is why at 1440p and 2160p the 4090's lead grows by a significant margin across the board (and why the gap widens further as you move up). Besides, both of these cards are crazy for 1080p; you're better off saving money and getting the 4070 Ti Super, the 7900 XT, or maybe even the 7800 XT or 4060 Ti if that is where you want to game.

As for outliers, looking at the charts in the 4080 Super review versus the RX 7900 XTX, I don't see any major outliers with ridiculous differences (Remnant 2 would be one odd result, since the 7900 XTX beats the 4090 there), and some of that can be explained by the engines being used; I generally see older engines preferring nVidia. I only like results being thrown out when a game shows a ridiculous result that can't be explained (for instance, if the RTX 4090 were matching an RX 7800 in a specific game, you'd know something was wrong). Which also goes to the argument about experiencing the latest games: both sides have a decent list of recent games that favor them, though I would argue AMD has more, at least judging by the comparison chart here between the two.

Like you said, though, if ray tracing is all you care about, which you are alluding to as your reason to buy, then yes, that is a clear victory. But seeing the FPS people get in games like Cyberpunk on a 4080 with ray tracing enabled at 2160p: it does look nice, but the performance was abysmal. My friend with a 4090 has tried it, and it can make a noticeable difference and at least be playable in my book, but even he turns it off most of the time to keep his higher refresh rate.

As for the skin refresh, that is literally my point: they at least refresh the panel and keep it modern. And under the hood is fine, too; the whole "AMD drivers crash and suck" argument is from 10 years ago and is irrelevant today. AMD drivers have been fine for a long time and, to me, seem significantly more premium than the nVidia Control Panel. But I would not claim that in this day and age one is categorically better than the other in terms of driver performance.

Do you mean 4090? AMD has nothing there, so I don't know where the 20% extra comes from. Or if you mean 4080 Super vs 7900 XTX, then I admit, it's a mighty impressive battle, considering that the 4080 Super somehow managed to be cheaper than the vanilla 4080, almost matching the 7900 XTX in price. At that level, I really wouldn't think twice about getting Nvidia. But only at that level, and only as long as AMD doesn't lower the prices on the 7900 series (again), which they will have to.
To be fair, this thread is about a card $100 cheaper than the 4080 Super, so it can still be had a decent bit cheaper.
 
Last edited:
Joined
Jan 6, 2020
Messages
15 (0.01/day)
Just a month or two, maybe... Other than that, the 4090 stays very close to its MSRP. We're talking an almost 1.5-year lifespan at this point. You cherry-picked 10% to "prove" something that's false either way. There was less than a week when the 4090 was double the 4080's price. Today, it's $1,200 vs $1,800.
Why do you keep using the old 4080 price?

The 4080 Super was out by the time you made this comment, in stock for $1,000, and even before that, a 4080 was usually $1,100.

Additionally, it was December and January that had the 4090 mostly over $2,000. It only dropped below $2,000 within the last three weeks, and it's still hard to find at that price. The only cards consistently below $2,000 are from Gigabyte and MSI.

So $1,100 vs $2,050 (average selling price for the 4080/4080S vs. the 4090). Yeah, that justifies being called double, as it is an over-85% increase in price.
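A quick check of that percentage, using the average-selling-price figures quoted above (which are the poster's estimates, not tracked market data):

```python
# Price increase between the quoted average selling prices.
avg_4080, avg_4090 = 1100, 2050  # USD, figures from the post above
increase = avg_4090 / avg_4080 - 1
print(f"{increase:.0%} more expensive")  # → 86% more expensive
```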

Like, yeah, gaming is much more accessible than it was a while ago, especially compared to 2021 and 2022, but these extra-tiny PCBs with so much simplification don't deserve to be sold for that money. The 4060 series is a joke. The 4070 series is a rip-off. The 4080 is just... why? And I can't say I'm happy with what the competition is doing, because I know from my own experience what it's like to do a whole lot of nothing.

The 4060 series is a joke, I do agree. The 4070 series isn't too much of a rip-off, though.

The 3080 to the 4070 Super is a decent uplift, comparable to the 970-to-1060 jump in performance and bang-for-buck increase.

(£750 / £540) × 1.1 ≈ 1.53, i.e. roughly 50% better bang for buck.
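As a sanity check, bang for buck is just performance per pound, so the improvement is the performance ratio divided by the price ratio. The £750/£540 prices are from the comparison above; the 1.10 performance ratio is the poster's assumed uplift, not a measured figure:

```python
# Bang-for-buck comparison: performance per pound, old card vs new card.
# Prices are from the post above; the 1.10 performance ratio is the
# poster's assumption (4070 Super ~10% faster than the 3080).
old_price, new_price = 750.0, 540.0  # GBP
perf_ratio = 1.10                    # new card vs old card

old_value = 1.0 / old_price          # perf per pound (old card's perf = 1.0)
new_value = perf_ratio / new_price

improvement = new_value / old_value - 1.0
print(f"Bang-for-buck improvement: {improvement:.0%}")  # → 53%
```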

My only complaint is that the 4070 Super should've been the 4070 from the start.

The 4070 Ti should've slotted between the 4070 Ti Super and the 4080: basically the current 4080 at 250 W (so 6-9% less performance than the 4080), at £729/$749.

The 4080: keep it the same as the 4080 Super. It just needs an extra $50 off, at $949.
 