
RX 7900 XTX vs. RTX 4080

Finally, the 7900 XTX performed really well with Fortnite's hardware-accelerated RT. I think with both consoles rocking AMD and Unreal Engine 5.2 being a big deal, AMD may go from abysmal to nearly as performant as Nvidia in the near future (without a change to the silicon). That said, I think we'll need Nvidia's and AMD's next generations to fully extract consistent performance from Lumen in the UE5 engine.

Both cards are awesome! You can't really lose out either way.

Although if you own a PSVR2 and want to use it with SteamVR don't make the mistake I did and forget you need a USB-C video out!

Abysmal? The 7900 XTX isn't that far behind the 4080 in RT:

[Attached chart: RT relative performance, RX 7900 XTX vs. RTX 4080]


Surely you were using hyperbole; 16% is an edge, not abysmal.

I've noticed that the cheapest RTX 4080 is ~$1,200 while the cheapest RX 7900 XTX is ~$1,000, but the RX 7900 XTX delivers higher performance than the 4080 in almost every game, in both minimum and average FPS, according to W1zzard's review:
https://www.techpowerup.com/review/asrock-radeon-rx-7900-xtx-taichi/34.html

Does the RX 7900 XTX lack the hardware ray tracing capabilities of Team Green? Would the hardware ray tracing effects of say Control or Cyberpunk 2077 or Metro: Exodus Enhanced Edition look any worse on the RX 7900 XTX than they would on an RTX 4080?

Both vendors support ray tracing and it'd look identical regardless of brand. As illustrated in the graph above, the 4080 is faster in RT but the 7900 XTX is still very capable.

It's too bad Team Red doesn't support G-Sync, otherwise I'd already be buying an RX 7900 XTX...

This is poorly worded; AMD can't support G-Sync because it's a proprietary Nvidia technology. It's not AMD's choice to avoid implementing G-Sync support.

That said, G-Sync is essentially irrelevant today; the vast majority of monitors coming out are FreeSync monitors that work on Intel, AMD, and Nvidia graphics cards. Most vendors have gotten their FreeSync scalers good enough to the point where they are as good as the G-Sync ones, if not better. A good example of this is the new Dell Alienware 34" OLED, where the G-Sync variant has an annoying loud fan because the newer G-Sync module Nvidia supplies comes with one. In addition, the HDR tone-mapping on FreeSync Premium Pro monitors is superior.

It makes sense to get the 4080 if you want the lower power consumption or you plan on utilizing CUDA or other professional features (NVENC is still better). Contrary to what's been said thus far, Nvidia is not more stable. HWUB has a section on this in their recent AMA:

I do think the 4080 is the superior card but I do not think it's better value than the 7900 XTX. I'd say it's slightly worse value if you aren't using all of the features the card has to offer.

I'm surprised no one has asked yet, but what exactly is your use case for this product? What games do you intend to play, at what resolution, and for how many years do you plan on keeping the card? You may very well be able to get away with a last-gen card and save the money towards your next upgrade or real-life things. $1,200+ for a video card is a lot of money for a product that's going to generate marginally prettier graphics than a $500 graphics card could. I feel like a lot of tech enthusiasts tend to chase the higher number on the bar charts without stopping and thinking of the practical benefits.
 
VRAM usage at 2160p = 15-16 GB, so... my vote goes to the Radeon RX 7900 XTX 24 GB...
Are you sure it's usage and not just allocation? That's fairly typical. Having said that, I'm seeing this game get absolutely shredded for its horrible performance issues at launch, so I won't hang my hat on this game as an example, much like Hogwarts, which got promptly fixed.
 
Are you sure it's usage and not just allocation? That's fairly typical. Having said that, I'm seeing this game get absolutely shredded for its horrible performance issues at launch, so I won't hang my hat on this game as an example, much like Hogwarts, which got promptly fixed.

It could be either. Resident Evil 4 Remake allocates 20GB on a 7900 XTX and uses 16GB with RT enabled at max graphics. Might be why the 3070 and 3070 Ti crash the game when you turn on RT with the memory buffer above 0.5GB (which is low, FYI).

Everyone knew the 3070, 3070 Ti, and 3080 were kneecapped by their VRAM. They are on pace to be the quickest-outmoded cards due to a lack of VRAM. Precisely why I didn't downgrade to a 3070 / 3080 from my 1080 Ti; those cards are not designed to last. Neither is the 4070 Ti.
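For anyone who wants to check the allocation vs. usage question on their own Nvidia card, here's a minimal sketch using NVML via the nvidia-ml-py (pynvml) package. Treat it as illustrative only: it reports device-wide allocated memory (across all processes), not a game's true working set, and GPU index 0 is an assumption.

```python
# Minimal sketch: read total / used / free VRAM via NVML (nvidia-ml-py package).
# "used" here is memory allocated on the device by all processes, which still
# overstates what a game actually needs at any given moment.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU; adjust if needed
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)

gib = 1024 ** 3
print(f"total:            {mem.total / gib:.1f} GiB")
print(f"used (allocated): {mem.used / gib:.1f} GiB")
print(f"free:             {mem.free / gib:.1f} GiB")

pynvml.nvmlShutdown()
```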
 
It could be either. Resident Evil 4 Remake allocates 20GB on a 7900 XTX and uses 16GB with RT enabled at max graphics. Might be why the 3070 and 3070 Ti crash the game when you turn on RT with the memory buffer above 0.5GB (which is low, FYI).
Everyone knew the 3070, 3070 Ti, and 3080 were kneecapped by their VRAM. They are on pace to be the quickest-outmoded cards due to a lack of VRAM.
Interesting, I'm playing it maxed out, at 2160p with RT on, but I've dropped in the DLSS mod and am using the 2.5.1 DLL. The warning is red, but the game runs buttery smooth on my 3080.

I certainly understand the sentiment, but I believe terms like kneecapped or obsolete take that notion too far when settings can just be turned down if they're an issue and the game will still run fine. It's not like the vast majority of gamers' video cards are currently kneecapped/obsolete because they can't run everything maxed out; the vast, vast majority of gamers are already compromising on settings somewhere, like high instead of max or optimised settings, lowering resolution, RT off, FSR on, and so on.

Personally, I wanted a video card that would give me a good RT experience and be a big upgrade from a 1080, and the 3080 was that card. I made the choice to possibly need to drop texture settings over the life of the card due to the 10GB, but retain RT, where on a 6800 XT I'd have had to almost entirely forgo RT from day 1. Naturally, to each their own; other people make other choices for their own reasons and I'd never deny them that.
 
VRAM usage at 2160p = 15-16 GB, so... my vote goes to the Radeon RX 7900 XTX 24 GB...
Vram "usage" or rather allocation is dynamic. Games will allocate (not necessarily need) more vram according to the capacity of GPU. No current games (esp console ports) has ever had its performance compromised due to 16gb vram being insufficient.

Cards have 24GB or 20GB of VRAM because of the GPU's memory bus width (384-bit or 320-bit), not because games will need that much any time soon.
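To make the bus-width point concrete, here's a tiny sketch of the arithmetic (assuming the common 2 GB GDDR6/GDDR6X chips, one per 32-bit channel; the card names are just illustrative):

```python
# Capacity follows bus width: one memory chip per 32-bit channel,
# so capacity = (bus width / 32) * chip density. 2 GB chips assumed.
def vram_capacity_gb(bus_width_bits: int, chip_gb: int = 2) -> int:
    chips = bus_width_bits // 32        # number of 32-bit channels / chips
    return chips * chip_gb

for name, bus in [("7900 XTX", 384), ("7900 XT", 320),
                  ("RTX 4080", 256), ("RTX 4070 Ti", 192)]:
    print(f"{name}: {bus}-bit bus -> {vram_capacity_gb(bus)} GB")
# 7900 XTX: 384-bit bus -> 24 GB
# 7900 XT: 320-bit bus -> 20 GB
# RTX 4080: 256-bit bus -> 16 GB
# RTX 4070 Ti: 192-bit bus -> 12 GB
```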
 
Vram "usage" or rather allocation is dynamic. Games will allocate (not necessarily need) more vram according to the capacity of GPU. No current games (esp console ports) has ever had its performance compromised due to 16gb vram being insufficient.

Cards have 24GB or 20GB of VRAM because of the GPU's memory bus width (384-bit or 320-bit), not because games will need that much any time soon.

RTX 4070 Ti 12GB, 2160p, VRAM usage = 12GB...

RTX 4090 24GB, 3840 x 1600, VRAM usage = 15-16GB...
 
I'd go with the Radeon option if you don't care about ray tracing a lot. It seems like a better option around that price, and you guys have nice prices over there. If I had to make that choice, I'd have to pay 2,000 USD for the 4080 and 1,675 USD for the 7900 XTX...
 
RTX 4070 Ti 12GB, 2160p, VRAM usage = 12GB...

RTX 4090 24GB, 3840 x 1600, VRAM usage = 15-16GB...
Not sure why you quote me and post these links. It's as if you didn't get the point.

I'd go with the Radeon option if you don't care about ray tracing a lot. It seems like a better option around that price, and you guys have nice prices over there. If I had to make that choice, I'd have to pay 2,000 USD for the 4080 and 1,675 USD for the 7900 XTX...
And in many other places the difference is a lot less, like $1,000 vs. $1,200. And the added power draw of the XTX may require some people to upgrade their PSU.
 
Not sure why you quote me and post these links. It's as if you didn't get the point.


And in many other places the difference is a lot less, like $1,000 vs. $1,200. And the added power draw of the XTX may require some people to upgrade their PSU.
Yeah, but also more VRAM can pull some people to the red side too.
 
Everyone picking the 4080 for RT specifically needs to realize that you're paying 20% more for a card that is 20% faster in RT, BUT the XTX has 50% more VRAM.
And if you think the latter doesn't matter much, just take a look at what's happening with the 3080 in Hogwarts or RE4 to get an idea of how things are gonna go down a couple of years from now...
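Putting the poster's own numbers in one place, here's a trivial back-of-envelope sketch (the prices and the 20% RT delta are the approximate figures quoted in this thread, not benchmark data):

```python
# Back-of-envelope: 20% more money for 20% more RT performance is roughly
# a wash in perf-per-dollar; the XTX's remaining edge is the extra VRAM.
price_xtx, price_4080 = 1000, 1200   # USD street prices quoted earlier in the thread
rt_ratio = 1.20                      # 4080 ~20% faster in RT (thread figure)
vram_xtx, vram_4080 = 24, 16         # GB

print(f"4080 price premium:        {price_4080 / price_xtx - 1:.0%}")              # 20%
print(f"4080 RT perf per dollar:   {rt_ratio * price_xtx / price_4080:.2f}x the XTX")  # ~1.00x
print(f"XTX VRAM advantage:        {vram_xtx / vram_4080 - 1:.0%}")                # 50%
```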
 
I'd go with the Radeon option if you don't care about ray tracing a lot. It seems like a better option around that price and you guys having nice prices over there. If i had to make that choice i'd have to pay 2000 USD for 4080 and 1675 USD for 7900 XTX...
What do you mean by "don't care about ray tracing a lot"? The XTX is on par with the 3090/Ti when it comes to ray tracing. So the gamers running their expensive GeForces should not care about ray tracing either...
 
Everyone picking the 4080 for RT specifically needs to realize that you're paying 20% more for a card that is 20% faster in RT, BUT the XTX has 50% more VRAM.
And if you think the latter doesn't matter much, just take a look at what's happening with the 3080 in Hogwarts or RE4 to get an idea of how things are gonna go down a couple of years from now...
Yes, VRAM usage has gone up in most recent games. The 3080 is a 10GB card and can no longer do 4K maxed out in very demanding games (which were the conditions I believe HUB demonstrated).

The Last of Us Part I appears to be the latest unoptimized mess of a game:

[Attached image: The Last of Us Part I - PC]

 
What do you mean by "don't care about ray tracing a lot"? The XTX is on par with the 3090/Ti when it comes to ray tracing. So the gamers running their expensive GeForces should not care about ray tracing either...
What I meant is, I'd not specifically choose a GPU for ray tracing, which rules out Nvidia cards, since ray tracing is a huge part of Nvidia's appeal. Also, I wrote "if", so it is a personal selection process. I'd not get the 4080 for its higher ray tracing performance. I'd like more VRAM for myself, but everyone is searching for something different, aren't we? It is all about personal criteria.
 
What I meant is, I'd not specifically choose a GPU for ray tracing, which rules out Nvidia cards, since ray tracing is a huge part of Nvidia's appeal. Also, I wrote "if", so it is a personal selection process. I'd not get the 4080 for its higher ray tracing performance. I'd like more VRAM for myself, but everyone is searching for something different, aren't we? It is all about personal criteria.

I agree with you.
 
24GB of VRAM, a wider memory bus, DP 2.1, and USB-C ports (for my PSVR2 hacks) would have all been really useful to me.

Although if you own a PSVR2 and want to use it with SteamVR don't make the mistake I did and forget you need a USB-C video out!
Wait, is there a way to get PSVR2 to work on PC?
 
I'm surprised no one has asked yet, but what exactly is your use case for this product? What games do you intend to play, at what resolution, and for how many years do you plan on keeping the card? You may very well be able to get away with a last-gen card and save the money towards your next upgrade or real-life things. $1,200+ for a video card is a lot of money for a product that's going to generate marginally prettier graphics than a $500 graphics card could. I feel like a lot of tech enthusiasts tend to chase the higher number on the bar charts without stopping and thinking of the practical benefits.

The majority of us who already weighed in on this topic leaned 4080. The OP is just hoping AMD will do better so that he can get an Nvidia card at a better price; if you look at his multiple threads about GPUs, that will become obvious.

He is upgrading from an aging 1080 Ti and has a G-Sync-only monitor, so the idea of switching to Team Red is mostly unappealing due to that.

Abysmal? The 7900 XTX isn't that far behind the 4080 in RT:

[Attached chart: RT relative performance, RX 7900 XTX vs. RTX 4080]

Surely you were using hyperbole; 16% is an edge, not abysmal.


While overall RT performance isn't too bad, in any game that uses it heavily, like CP2077 and The Witcher 3, the 7900 XTX takes a massive hit; a 4090 is something like 100% faster in RT there, and I'd probably call that abysmal as well. That makes it probably 60-70% slower than the 4080 in those scenarios. So it really comes down to the games you want to use RT in; in some games the 7900 XTX is okay, in others not so much.

UE 5.2 does give me some hope that AMD cards won't be too far off, at least in that engine, but other than Fortnite we really don't have any examples.
 
Yes, VRAM usage has gone up in most recent games. The 3080 is a 10GB card and can no longer do 4K maxed out in very demanding games (which were the conditions I believe HUB demonstrated).

The Last of Us Part I appears to be the latest unoptimized mess of a game
Actually, as with the Uncharted 4 port before it, it seems that the issues are mostly in the Nvidia camp; once again I have no issues with stuttering, crashing, black screens on startup, etc. on AMD.
That is, if people aren't trying to play during the shader pre-compilation, which sees your CPU regularly utilized at 99%.
Add to that the fact that RAM usage during the process peaked at 27GB on my 32GB machine, and it's pretty clear what's causing the crashes.
They should have moved this outside of the game menu and into the setup instead of allowing people to attempt to play it before it is done...
 
Interesting, I'm playing it maxed out, at 2160p with RT on, but I've dropped in the DLSS mod and am using the 2.5.1 DLL. The warning is red, but the game runs buttery smooth on my 3080.

I certainly understand the sentiment, but I believe terms like kneecapped or obsolete take that notion too far when settings can just be turned down if they're an issue and the game will still run fine. It's not like the vast majority of gamers' video cards are currently kneecapped/obsolete because they can't run everything maxed out; the vast, vast majority of gamers are already compromising on settings somewhere, like high instead of max or optimised settings, lowering resolution, RT off, FSR on, and so on.

Personally, I wanted a video card that would give me a good RT experience and be a big upgrade from a 1080, and the 3080 was that card. I made the choice to possibly need to drop texture settings over the life of the card due to the 10GB, but retain RT, where on a 6800 XT I'd have had to almost entirely forgo RT from day 1. Naturally, to each their own; other people make other choices for their own reasons and I'd never deny them that.

Yep, that's part of the oddity as well. There's a two-part video on the topic here:

There are some texture loading issues noted in the 2nd part.

I think going from 1080 to 3080 is a good upgrade. I just wish these cards included a bit more VRAM.
 
Actually, as with the Uncharted 4 port before it, it seems that the issues are mostly in the Nvidia camp; once again I have no issues with stuttering, crashing, black screens on startup, etc. on AMD.
Haven't bought the game and won't until it's in a better state. But I have been following the game's threads on other forums (Guru3D), and there are quite a few Nvidia owners who have no issues with the game. Of course, in any game that has issues, Nvidia would probably figure prominently, because they have a far larger GPU market share (82%), with the rest split between AMD and Intel.

 
Everyone picking the 4080 for RT specifically needs to realize that you're paying 20% more for a card that is 20% faster in RT, BUT the XTX has 50% more VRAM.
And if you think the latter doesn't matter much, just take a look at what's happening with the 3080 in Hogwarts or RE4 to get an idea of how things are gonna go down a couple of years from now...
It's not just 20% faster in RT, lol
 
4090 or AMD would be my reply. Nvidia's 4000-series lineup is incredibly bad value, especially the 4080.
 
4090 or AMD would be my reply. Nvidia's 4000-series lineup is incredibly bad value, especially the 4080.

The 4070 Ti is definitely a nope, but I'd still prefer a 4080 FE over any 7900 XTX, although I wouldn't feel overly great about spending 1,000-1,200 USD on either. I'd only go that route if I couldn't afford the 4090.
 
The 4070 Ti is definitely a nope, but I'd still prefer a 4080 FE over any 7900 XTX, although I wouldn't feel overly great about spending 1,000-1,200 USD on either. I'd only go that route if I couldn't afford the 4090.

Yeah, it obviously depends on pricing where you live. I'm in Denmark, where the cheapest 7900 XTX is 8,690 DKK (1,262 USD) and the cheapest 4080 is 10,990 DKK (1,600 USD). In that case the 4080 is just so stupidly priced that there isn't any argument for getting it.

A 4090 can be had for 14,013 DKK (2,040 USD), so if you absolutely must have Nvidia, and you are already spending that much money, you might as well spend those ~400 USD more and get a much, much better product.
 
Yeah, it obviously depends on pricing where you live. I'm in Denmark, where the cheapest 7900 XTX is 8,690 DKK (1,262 USD) and the cheapest 4080 is 10,990 DKK (1,600 USD). In that case the 4080 is just so stupidly priced that there isn't any argument for getting it.

A 4090 can be had for 14,013 DKK (2,040 USD), so if you absolutely must have Nvidia, and you are already spending that much money, you might as well spend those ~400 USD more and get a much, much better product.

If either of those cards were 1,200-1,600 USD here, they would both automatically be a nope.

It's really crappy how badly these cards are priced in other countries; even Canada gets shafted.
 