
RX 7900 XTX vs. RTX 4080

and take home a product that has no buts or ifs

A product that's advertised as needing upscaling and interpolated frames, because the performance is so bad games would be unplayable without them, still sounds like a lot of ifs and buts. Let's stop kidding ourselves.
 
I wouldn't call a 20% price difference insignificant, but you're right about the lack of RT performance compared to Ada. I'm not sure if that'll change even with RDNA 4 as AMD seems to be sticking to the trend of minimizing the area dedicated to RT.

There's enough information out there on both cards for someone to decide if the 20% premium is worth it to them. For me it would be, for others it wouldn't; let's not overcomplicate it.


Actually, if it were me I would just get the 4090 instead (I actually looked at them and was like nah after reviews). I don't feel like either of these cards is actually worth buying, but everyone has to make that decision for themselves.
 
Actually, if it were me I would just get the 4090 instead (I actually looked at them and was like nah after reviews). I don't feel like either of these cards is actually worth buying, but everyone has to make that decision for themselves.
I agree that the 4090 is actually the better-priced product of the two. As far as your point about the 20% price difference between the 4080 and the 7900 XTX is concerned, it makes sense to buy the 4080 if ray tracing performance is important to the buyer. Again, even in that case, the 4090 would be even better from a performance-per-dollar perspective.
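To put rough numbers on the performance-per-dollar angle, here is a quick back-of-the-envelope sketch. The prices are the launch MSRPs, but the relative-performance figures are placeholder assumptions for illustration only, not review data, so swap in real numbers before drawing any conclusion:

Code:
# Rough performance-per-dollar comparison.
# Relative-performance values are placeholder assumptions, not measured results.
cards = {
    "RX 7900 XTX": {"price_usd": 999,  "relative_perf": 95},   # assumed
    "RTX 4080":    {"price_usd": 1199, "relative_perf": 100},  # baseline
    "RTX 4090":    {"price_usd": 1599, "relative_perf": 135},  # assumed ~35% faster in heavy RT
}

for name, card in cards.items():
    value = card["relative_perf"] / card["price_usd"] * 100
    print(f"{name}: {value:.1f} perf points per $100")

With these placeholder figures the 4090 edges out the 4080 on perf per dollar; whether that actually holds depends entirely on the review numbers you plug in.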
 
I agree that the 4090 is actually the better-priced product of the two. As far as your point about the 20% price difference between the 4080 and the 7900 XTX is concerned, it makes sense to buy the 4080 if ray tracing performance is important to the buyer. Again, even in that case, the 4090 would be even better from a performance-per-dollar perspective.

If AMD can get to within 10% of the Nvidia alternatives in RT in games like the Witcher next-gen update and CP 2077, I would definitely consider them. I do hope they improve FSR, even if that means locking it to their GPUs, or at least having an improved version that takes advantage of their hardware, for example. I really liked the 7970 and 290X back in the day, even though I also owned the competing Nvidia cards. I always have two main systems and would love to get back to owning both vendors' GPUs if AMD makes something compelling to me again.

I'm really hoping by next gen you can just blindly buy whatever is cheaper in a given performance segment but I'm not holding my breath.
 
@oxrufiioxo
It does make sense that AMD should lock down FSR, but by allowing Nvidia or Intel(?) users to utilize FSR, maybe AMD will get more customers down the line when they see what it can do. Does FSR work better on AMD GPUs? I sure hope so.
 
@oxrufiioxo
It does make sense that AMD should lock down FSR, but by allowing Nvidia or Intel(?) users to utilize FSR, maybe AMD will get more customers down the line when they see what it can do. Does FSR work better on AMD GPUs? I sure hope so.

Currently it works similarly regardless of GPU, which is good. I'm just saying they should do it like XeSS, where Intel hardware runs it better but it still works on all hardware.
 
VRAM usage at 2160p is 15-16 GB, so... my vote goes to the Radeon RX 7900 XTX 24 GB...

Hey Zeke Anderson from Tour of Duty, you here too?... :D
 
A product that's advertised as needing upscaling and interpolated frames, because the performance is so bad games would be unplayable without them, still sounds like a lot of ifs and buts. Let's stop kidding ourselves.

Except that isn't how they advertise this segment, they just boast that their frame interpolation tech boosts frame rates far beyond usual generational leaps... All the while they withhold the goods. It's brilliant, if not entirely anti-consumer.

Still, to say a 4080 needs DLSS 3 FG to pull its frame rates, while it handily outperforms the 6950 XT, 3090 Ti, and in most cases even the 7900 XT without it, is... eh?
 
Actually if it was me I would just get the 4090 (I actually looked at them and was like nah after reviews) instead I don't feel like either of these cards is actually worth buying but everyone has to make that decision for themselves.
Kind of taking the words out of my mouth here.
I need a card that'll satisfy my 4K good/high framerate needs. And literally none of them makes me want to click Buy. I'll eventually cave for one, most likely the XTX, but none of them makes me WANT to buy, and the price is the reason. If the 4080 weren't such a scam with its pricing, I'd have taken it already; it's pretty much the perfect card for what I want.
 
Kind of taking the words out of my mouth here.
I need a card that'll satisfy my 4K good/high framerate needs. And literally none of them makes me want to click Buy. I'll eventually cave for one, most likely the XTX, but none of them makes me WANT to buy, and the price is the reason. If the 4080 weren't such a scam with its pricing, I'd have taken it already; it's pretty much the perfect card for what I want.
 
Except that isn't how they advertise this segment, they just boast that their frame interpolation tech boosts frame rates far beyond usual generational leaps... All the while they withhold the goods. It's brilliant, if not entirely anti-consumer.

Still, to say a 4080 needs DLSS 3 FG to pull its frame rates, while it handily outperforms the 6950 XT, 3090 Ti, and in most cases even the 7900 XT without it, is... eh?

The point is that even if the RT performance is better than AMD's it's still overall horrible and it's nothing either of these companies should boast about, like ever. This is what you see on their product page :

View attachment 290378


Unless you're gonna tell me that 22fps is perfectly playable then no, this is advertised as being necessary and not just optional.
 
The point is that even if the RT performance is better than AMD's it's still bad and it's nothing either of these companies should boast about, like ever. This is what you see on their product page :

View attachment 290378

Unless you're gonna tell me that 22fps is perfectly playable then no, this is advertised as being necessary and not just optional.
I think this is to illustrate how good DLSS 3 is at getting more FPS, not how great RT performance is. Because we all know the performance when you switch RT on is crap, and that's on a graphics card that has the best RT performance you can get, at a huge price markup.
 
I think this is to illustrate how good DLSS 3 is at getting more FPS, not how great RT performance is. Because we all know the performance when you switch RT on is crap, and that's on a graphics card that has the best RT performance you can get, at a huge price markup.

Again, is anyone playing these games without DLSS? Obviously not, because they'd suck otherwise; the main purpose of these things is to make the abhorrent RT performance more palatable. This has always been about RT performance, that's why they came up with it in the first place.
 
Again, is anyone playing these games without DLSS? Obviously not, because they'd suck otherwise; the main purpose of these things is to make the abhorrent RT performance more palatable. This has always been about RT performance, that's why they came up with it in the first place.
I'm playing with DLSS Quality, no FG. The framerate is perfect.

Overdrive mode also works great on AMD, you just have to drop the settings a bit.
[Cyberpunk 2077 RTX Path Tracing Overdrive screenshot]
 
Again, is anyone playing these games without DLSS? Obviously not, because they'd suck otherwise; the main purpose of these things is to make the abhorrent RT performance more palatable. This has always been about RT performance, that's why they came up with it in the first place.
Not saying RT performance is great. Actually, it sucks with or without DLSS. When only one graphics card can kinda play it nicely by current FPS standards, and given the notion that 60 FPS is not enough, the FPS we get with RT nowadays is laughable. People can play without DLSS; what is funny is the ignorance and arrogance around the FPS you get. 30 fps is supposedly perfect for RT, yet 144 FPS is the minimum for raster. It's just crazy what people say sometimes.
I'm playing with DLSS Quality, no FG. The framerate is perfect.
Sure, with one card that can somewhat pull it off. A card that is the best there is to date.
Is it perfect? The next day you'll say you need at least 144 FPS because you can see the difference between 100 and 144. I've seen those "my framerate is perfect" opinions.
 
Sure, with one card that can somewhat pull it off. A card that is the best there is to date.
Is it perfect? The next day you'll say you need at least 144 FPS because you can see the difference between 100 and 144. I've seen those "my framerate is perfect" opinions.
70-80 fps is fine for me for single-player games; I'd rather have the visuals pushed than 500 fps. If the card were able to get 144, that would mean the visuals aren't as pushed as they could be.
 
The only unreasonable people are the ones
You really can't with Vya especially, and a few others; there's no amount of reason that can get through twisted logic and mental gymnastics. At least I'm lucky, pretty sure they've got me on ignore.
 
You really can't with Vya especially, and a few others; there's no amount of reason that can get through twisted logic and mental gymnastics. At least I'm lucky, pretty sure they've got me on ignore.
Nah, I don't think anyone is that far gone, I think they already know the difference is 45+% - they are just acting dumb so they don't have to admit it. I refuse to accept they haven't realized it yet.
 
The point is that even if the RT performance is better than AMD's it's still overall horrible and it's nothing either of these companies should boast about, like ever. This is what you see on their product page :

View attachment 290378

Unless you're gonna tell me that 22fps is perfectly playable then no, this is advertised as being necessary and not just optional.

NVIDIA's marketing can be downright deceptive, I'll agree with you on this and I'll be the first to put my pitchforks up over it, but at the same time there are a few things to consider:

1. That's Cyberpunk, isn't it? Which will run like a dog regardless of hardware.
2. Let's suppose a hypothetical where the 4080 truly gets 22 fps in an RT workload. The 4090 would get around 27, and an eventual full Ada would do around 33. That'd mean a 7900 XTX or 3090/Ti would do 15-16, and a 7900 XT 13-14 (rough math sketched below). That's pretty much my guesstimate based on their performance differences in usual review gaps. It's obviously a workload that's not sustainable on current-generation hardware, and I don't think it will be realistic any time soon.
3. A lot of the argument over frame generation revolves around the claim that it supposedly shouldn't be used in low frame rate situations... which is intriguing, really, just more FUD spread by NVIDIA's fanboys trying to justify a 40 series card.
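As a rough check on point 2, the arithmetic is simple enough to spell out. The scaling ratios here are the same guesstimates pulled from typical review gaps, not measured numbers:

Code:
# Back-of-the-envelope FPS scaling from the assumed 22 fps 4080 baseline.
# The ratios are rough guesses based on typical review gaps, not measurements.
baseline_fps = 22.0  # hypothetical 4080 result in a heavy RT workload

relative_to_4080 = {
    "RTX 4090":    1.25,  # assumed ~25% faster
    "RTX 4080":    1.00,
    "RX 7900 XTX": 0.70,  # assumed heavy-RT deficit vs. the 4080
    "RX 7900 XT":  0.60,
}

for card, ratio in relative_to_4080.items():
    print(f"{card}: ~{baseline_fps * ratio:.0f} fps")

That lands at roughly 28 / 22 / 15 / 13 fps, which is the same ballpark as the guesstimate above.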
 
NVIDIA's marketing can be downright deceptive, I'll agree with you on this and I'll be the first to put my pitchforks up over it, but at the same time there are a few things to consider:

1. That's Cyberpunk, isn't it? Which will run like a dog regardless of hardware.
2. Let's suppose a hypothetical where the 4080 truly gets 22 fps in an RT workload. The 4090 would get around 27, and an eventual full Ada would do around 33. That'd mean a 7900 XTX or 3090/Ti would do 15-16, and a 7900 XT 13-14. That's pretty much my guesstimate based on their performance differences in usual review gaps. It's obviously a workload that's not sustainable on current-generation hardware, and I don't think it will be realistic any time soon.
3. A lot of the argument over frame generation revolves around the claim that it supposedly shouldn't be used in low frame rate situations... which is intriguing, really, just more FUD spread by NVIDIA's fanboys trying to justify a 40 series card.

Aw, the 4090 being 75% faster than the 3090 in rasterization already makes it worth the $1,600 price tag, aren't you being a little bit too salty :roll:
 
NVIDIA's marketing can be downright deceptive, I'll agree with you on this and I'll be the first to put my pitchforks up over it, but at the same time there are a few things to consider:

1. That's Cyberpunk, isn't it? Which will run like a dog regardless of hardware.
2. Let's suppose a hypothetical where the 4080 truly gets 22 fps in an RT workload. The 4090 would get around 27, and an eventual full Ada would do around 33. That'd mean a 7900 XTX or 3090/Ti would do 15-16, and a 7900 XT 13-14. That's pretty much my guesstimate based on their performance differences in usual review gaps. It's obviously a workload that's not sustainable on current-generation hardware, and I don't think it will be realistic any time soon.
3. A lot of the argument over frame generation revolves around the claim that it supposedly shouldn't be used in low frame rate situations... which is intriguing, really, just more FUD spread by NVIDIA's fanboys trying to justify a 40 series card.
The problem with DLSS 3 is the same as with other proprietary features. It works today, and tomorrow it's forgotten and something else replaces it, which is obviously tied to new hardware. So new hardware purchases are in order. Kind of a G-Sync scenario. It also costs a lot, and even if you have it, it may not work as intended with all products (the low frame rate situation).
 
NVIDIA's marketing can be downright deceptive, I'll agree with you on this and I'll be the first to put my pitchforks up over it, but at the same time there are a few things to consider:

1. That's Cyberpunk, isn't it? Which will run like a dog regardless of hardware.
2. Let's suppose a hypothetical where the 4080 truly gets 22 fps in an RT workload. The 4090 would get around 27, and an eventual full Ada would do around 33. That'd mean a 7900 XTX or 3090/Ti would do 15-16, and a 7900 XT 13-14. That's pretty much my guesstimate based on their performance differences in usual review gaps. It's obviously a workload that's not sustainable on current-generation hardware, and I don't think it will be realistic any time soon.
3. A lot of the argument over frame generation revolves around the claim that it supposedly shouldn't be used in low frame rate situations... which is intriguing, really, just more FUD spread by NVIDIA's fanboys trying to justify a 40 series card.
2) The numbers are mostly correct.

3) It is true that FG isn't great at low fps. You need to be hitting 45-50 to make it decent, 60+ to make it great.
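If it helps to see that written out, here is a tiny sketch that classifies a base (pre-FG) frame rate against those same cut-offs. The ~1.8x presented-frame multiplier is an illustrative assumption, not a measured figure; the decent/great thresholds are the ones above:

Code:
# Classify FG suitability by base (pre-FG) frame rate, per the thresholds above.
# FG roughly multiplies presented frames, but input latency still tracks the
# base rate - which is why the base fps is what matters here.
FG_MULTIPLIER = 1.8  # assumed, for illustration only

def fg_verdict(base_fps: float) -> str:
    presented = base_fps * FG_MULTIPLIER
    if base_fps >= 60:
        return f"great (~{presented:.0f} fps presented)"
    if base_fps >= 45:
        return f"decent (~{presented:.0f} fps presented)"
    return f"not great (~{presented:.0f} fps presented, but it feels like {base_fps:.0f})"

for fps in (30, 45, 60, 80):
    print(f"{fps} fps base -> {fg_verdict(fps)}")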
 
Aw, the 4090 being 75% faster than the 3090 in rasterization already makes it worth the $1,600 price tag, aren't you being a little bit too salty :roll:
So if you have a 5090 that's 50% faster than the 4090 in both RT and raster, will it be worth a price tag of $2,600? Just a simple question to check your opinion on that.

2) The numbers are mostly correct.

3) It is true that FG isn't great at low fps. You need to be hitting 45-50 to make it decent, 60+ to make it great.
Yes, and the problem is that in CP 2077 only one card can do that.
 
So if you have a 5090 that's 50% faster than the 4090 in both RT and raster, will it be worth a price tag of $2,600? Just a simple question to check your opinion on that.
But the 4090 wasn't $2,600. Actually, I paid less for my 4090 than my 3090 :roll:

Yes, and the problem is that in CP 2077 only one card can do that.
The 4080 can do it as well - probably using DLSS Balanced, or 1440p with DLSS Quality.
 
But the 4090 wasn't $2,600. Actually, I paid less for my 4090 than my 3090 :roll:
Read again. Yeah, you did, sure.


The 4080 can do it as well - probably using DLSS Balanced, or 1440p with DLSS Quality.
Yes, Balanced, and you lose visual quality with a card that costs $1,200.

The MSRP pricing looks like weird stuff to me:
1080 Ti = $699 (full die)
2080 Ti = $999 (full die)
3080 Ti = $1,199 (not full die)
3090 = $1,499 (not full die)
3090 Ti = $1,999 (full die)
4090 = $1,599 (not full die)
Why do I have a problem with this? Well, within five years we'll have low-end cards with a $1k price tag if all goes according to NV's plan.
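For what it's worth, the step-to-step jumps in that list are easy to tally. This just walks the MSRPs exactly as listed above and prints the change, nothing more:

Code:
# Percent change between successive MSRPs as listed in the post above.
msrps = [
    ("1080 Ti", 699),
    ("2080 Ti", 999),
    ("3080 Ti", 1199),
    ("3090",    1499),
    ("3090 Ti", 1999),
    ("4090",    1599),
]

for (prev_name, prev_price), (name, price) in zip(msrps, msrps[1:]):
    change = (price - prev_price) / prev_price * 100
    print(f"{prev_name} -> {name}: {change:+.0f}%")

That works out to roughly +43%, +20%, +25%, +33% and then -20% for the 4090, which is the trend being complained about; the 4090 drop only looks good because the 3090 Ti's MSRP was so high.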
 