
RX 7900 XTX vs. RTX 4080

If either of those cards was 1200-1600 USD, they would both automatically be a nope.

It's really crappy how badly these cards are priced in other countries; even Canada gets shafted.

Europe gets shafted hardcore with prices, yeah. It's always been the case, but with this gen Nvidia increased prices even more for Europe specifically. But it is what it is, and people still want their hardware.

My "4090 or amd" comment was ofc based on those prices. But if 7900xtx and 4080 is the same 1000 usd price... tough call. I reckon the 7900xtx will age MUCH better. Even in a year or two the extra banwidth and vram is going to prove advantageous at "normal" resolutions. And dlss's advantage over fsr diminishes for every day, so... yeah, i don't know. If he has a g-sync module monitor already it will obviously make the most sense to get the 4080, but otherwise even at the same 1000 usd price i'd lean towards the 7900xtx.
 
If either of those cards was 1200-1600 USD, they would both automatically be a nope.

It's really crappy how badly these cards are priced in other countries; even Canada gets shafted.
Denmark has a 25% sales tax (VAT), which is why PC components such as those GPUs are so expensive there. Canada also has an outrageously high sales tax.
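To put rough numbers on it (the $999 card price below is a hypothetical example, and the Canadian rate is an assumption since the combined rate varies by province):

def price_with_tax(pre_tax_usd, tax_rate):
    # simple pre-tax price * (1 + tax); ignores exchange rates and retailer margins
    return pre_tax_usd * (1 + tax_rate)

# Danish VAT is the 25% mentioned above; 13% for Canada is assumed (varies by province)
for country, rate in (("Denmark", 0.25), ("Canada", 0.13)):
    print(f"{country}: ~${price_with_tax(999, rate):.0f} equivalent at the till")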
 
Depends on the game sample; TPU's RT game selection overall skews slightly in the Radeon's favour, with a few titles in there that have purposely light/AMD-sponsored implementations, and of course some heavy/Nvidia-sponsored ones too. There can obviously be games that show a much larger difference (45.5%) and ones that show virtually no difference (3.1%) - it just depends which titles you play and whether RT is desirable to you. Averages don't tell the whole story.
 
That's like testing a 7950X3D against an R5 1600 at 8K resolution and claiming they are similar.

Most games in that list don't make much use of RT, so those are mostly raster results. Games that actually use multiple RT effects show a gap as big as the Atlantic. I get that AMD fans don't really care about the facts, but I'm still hoping...
 
Depends on the game sample; TPU's RT game selection overall skews slightly in the Radeon's favour, with a few titles in there that have purposely light/AMD-sponsored implementations, and of course some heavy/Nvidia-sponsored ones too. There can obviously be games that show a much larger difference (45.5%) and ones that show virtually no difference (3.1%) - it just depends which titles you play and whether RT is desirable to you. Averages don't tell the whole story.
Did you just discover the hot water? Isn't it the exact same story with raster too? Do we not rely on averages PRECISELY because there are outliers in either direction? Wth man xD

That's like testing a 7950X3D against an R5 1600 at 8K resolution and claiming they are similar.

Most games in that list don't make much use of RT, so those are mostly raster results. Games that actually use multiple RT effects show a gap as big as the Atlantic. I get that AMD fans don't really care about the facts, but I'm still hoping...
My man, once UE5 games start flooding the market - you know, the ones like UE5 Fortnite where there's hardly any difference in RT performance - where are you gonna hide exactly?
 
Did you just discover the hot water?
What does this even mean?
Isn't it the exact same story with raster too? Do we not rely on averages PRECISELY because there are outliers in either direction?
Yeah, and TPU's averages are just for that particular sample of games; another site's average will be different, so 16% isn't a be-all and end-all that I'd hang my hat on. Anyone buying a 7900 XTX who's interested in RT and expecting it to only perform 16% worse than a 4080 in RT is setting themselves up for disappointment over the coming years.

Yeah it's the same with raster averages, but the game sample size is WAAAAY bigger too, and even then I look at titles that I'll be playing specifically to see which product is better for me. Perhaps I'm just thorough.
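Just to illustrate how much the sample matters, here's a toy calculation (the per-game deficits are invented purely for illustration, they're not TPU's or anyone else's numbers):

# Toy example: the "average RT deficit" depends entirely on which games you average.
light_rt = {"Title A": 3, "Title B": 5, "Title C": 8}    # RT-light titles, % behind
heavy_rt = {"Title D": 35, "Title E": 45}                # RT-heavy titles, % behind

def avg(sample):
    return sum(sample.values()) / len(sample)

print(f"RT-light sample: {avg(light_rt):.0f}% behind on average")
print(f"RT-heavy sample: {avg(heavy_rt):.0f}% behind on average")
print(f"Mixed sample:    {avg({**light_rt, **heavy_rt}):.0f}% behind on average")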
 
My man, once UE5 games start flooding the market - you know, the ones like UE5 Fortnite where there's hardly any difference in RT performance - where are you gonna hide exactly?
Why would I need to hide anywhere? When that happens, I'll admit the difference in RT is whatever it is. Right now the gap is huge in any game that actually uses a lot of RT effects. I don't even understand why you are contesting the point.

Hogwarts demonstrates the point quite easily. ComputerBase tested the game on a 4080 and a 7900 XTX, and they found that the difference between them is 6% with 1 RT effect on, 22% with 2 RT effects on, and 50% with all RT effects on. Doesn't that tell you anything?

Also, the whole point about outliers is wrong. There are outliers in raster because some games perform much better on AMD and some on Nvidia. That is not the case with RT. RT games either perform much better on Nvidia or don't have many RT effects, so they perform similarly. Is there a game with heavy RT where AMD gets a big win?
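For what it's worth, that progression is just basic FPS arithmetic; the frame rates below are made up to mirror the shape of those 6%/22%/50% figures, they are not ComputerBase's data:

# Hypothetical FPS figures showing how per-effect RT cost widens the relative gap.
scenarios = {
    "1 RT effect":    {"RTX 4080": 90, "RX 7900 XTX": 85},
    "2 RT effects":   {"RTX 4080": 73, "RX 7900 XTX": 60},
    "all RT effects": {"RTX 4080": 60, "RX 7900 XTX": 40},
}

for name, fps in scenarios.items():
    gap = fps["RTX 4080"] / fps["RX 7900 XTX"] - 1
    print(f"{name}: 4080 ahead by {gap:.0%}")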
 
Did you just discover the hot water? Isn't it the exact same story with raster too? Do we not rely on averages PRECISELY because there are outliers in either direction? Wth man xD


My man, once UE5 games start flooding the market - you know, the ones like UE5 Fortnite where there's hardly any difference in RT performance - where are you gonna hide exactly?
The problem here is that the actual FPS you get with RT isn't all that high, so all the performance you can get is welcome, whereas with raster you won't be wanting for more performance on either card.

So that is why people say if you want RT, you're better off with green. Don't live in denial; rather, try to understand why people say what they say ;)
 
The problem here is that the actual FPS you get with RT isn't all that high, so all the performance you can get is welcome, whereas with raster you won't be wanting for more performance on either card.

So that is why people say if you want RT, you're better off with green. Don't live in denial; rather, try to understand why people say what they say ;)
Also, the average in RT is completely unreliable. If you want to play an RT game, either the difference between green and red will be minimal because the game doesn't use much RT, or it will be kinda huge because it does.
 
Performance is most important, and RT on the 4080 is still bad.

Exactly, who cares if it's 20% faster if it's still bad.

No one plays anything with RT on without using some form of upscaling which makes these performance differentials meaningless.
 
Exactly, who cares if it's 20% faster if it's still bad.

No one plays anything with RT on without using some form of upscaling which makes these performance differentials meaningless.
I'm using RT in all titles that have it and love it. Some use the term "upscaling" as if it's an equal feature between AMD and Nvidia. It's not. Even the pro-AMD HUB affirms DLSS is better than FSR, and at 4K it's almost indistinguishable from native.
 
Some use the term "upscaling" as if it's an equal feature between AMD and Nvidia.
Upscaling is upscaling regardless of which is better; I don't get your point.
 
Denmark has a 25% sales tax (VAT), which is why PC components such as those GPUs are so expensive there. Canada also has an outrageously high sales tax.

That's part of the explanation, but Nvidia specifically increased the MSRP in euros vs. USD this time around.
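A rough way to sanity-check that claim (every figure below is a placeholder assumption, not an actual launch price or exchange rate):

# Back out how much of a local EUR price is NOT explained by VAT plus the exchange rate.
usd_msrp    = 1000   # hypothetical US MSRP (pre-tax)
eur_price   = 1300   # hypothetical EU shelf price (VAT included)
vat         = 0.20   # typical EU VAT, varies by country
usd_per_eur = 1.05   # assumed exchange rate

expected_eur = usd_msrp / usd_per_eur * (1 + vat)   # converted MSRP with VAT added
markup = eur_price / expected_eur - 1
print(f"Regional markup beyond tax and exchange rate: {markup:.0%}")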
 
To be fair, I would not choose either of these and would wait for a better deal. The RT vs. no-RT argument is meaningless here for me anyway. There are games that even a 4090 can barely make playable with RT, so what about these lower-tier cards? I can bet that the hope of current cards running future titles faster, when they sometimes can't even run current games well, is evaporating faster than water in a desert. I'd wait for the next-gen cards to see what their RT performance is like and then decide. For now, I'd stick to raster and make sure that works well. RT is an addition, or a showcase if you are willing to check out what devs can do with it in games, but relying solely on RT with current cards? Fool's errand in my opinion.
 
To be fair, I would not choose either of these and would wait for a better deal. The RT vs. no-RT argument is meaningless here for me anyway. There are games that even a 4090 can barely make playable with RT, so what about these lower-tier cards? I can bet that the hope of current cards running future titles faster, when they sometimes can't even run current games well, is evaporating faster than water in a desert. I'd wait for the next-gen cards to see what their RT performance is like and then decide. For now, I'd stick to raster and make sure that works well. RT is an addition, or a showcase if you are willing to check out what devs can do with it in games, but relying solely on RT with current cards? Fool's errand in my opinion.
I don't think Nvidia will produce and improve their GPUs fast enough. The announced 4060-4070 barely feels like an upgrade.
 
I find RT more than playable at 4K on a 3080 and at 1080p on an A2000 (3050 level), FWIW, and I have high standards for fidelity and FPS, but I can appreciate that everyone's bar for success is different. Some people's bar for success is daft and confusing, mind you, but hey, to each their own.
 
I find RT more than playable at 4K on a 3080 and at 1080p on an A2000 (3050 level), FWIW, and I have high standards for fidelity and FPS, but I can appreciate that everyone's bar for success is different. Some people's bar for success is daft and confusing, mind you, but hey, to each their own.
What you consider playable is relative. I'd rather stay at 4K with no RT than 1080p with RT. That also depends on the game you are playing. In time, current GPUs won't be able to run 1080p with RT either, whether due to VRAM or simply being too slow. RT performance on these GPUs degrades way faster than raster performance. What you take from it is, as you said, to each their own.
 
Hogwarts demonstrates the point quite easily. ComputerBase tested the game on a 4080 and a 7900 XTX, and they found that the difference between them is 6% with 1 RT effect on, 22% with 2 RT effects on, and 50% with all RT effects on. Doesn't that tell you anything?

Also, the whole point about outliers is wrong. There are outliers in raster because some games perform much better on AMD and some on Nvidia. That is not the case with RT. RT games either perform much better on Nvidia or don't have many RT effects, so they perform similarly. Is there a game with heavy RT where AMD gets a big win?
Except that Hogwarts Legacy is an outlier in itself, because no other game behaves like that, and that result may very well be a bug - CB's analysis says so, btw - yet you are happy to embrace that particular bit of info with open arms xD
Look, I get that you have accepted the idea that RTX equals ray tracing as ground truth, but that couldn't be further from the truth.
This is the only modern RT result that matters - Epic's own implementation of their own UE RT tech, which is not based on RTX:
https://www.techspot.com/review/2599-radeon-7900-xtx-vs-geforce-rtx-4080/#Fortnite_DX12_RT
And you need to understand - I don't give two rusty $hits about AMD. I just can't stand anyone and anything that seeks to manipulate information for their own profit, and boy, Nvidia has been all about that in the last decade. Thankfully that RTX circus at least is coming to an end...
 
Except that Hogwarts Legacy is an outlier in itself, because no other game behaves like that, and that result may very well be a bug - CB's analysis says so, btw - yet you are happy to embrace that particular bit of info with open arms xD
Look, I get that you have accepted the idea that RTX equals ray tracing as ground truth, but that couldn't be further from the truth.
This is the only modern RT result that matters - Epic's own implementation of their own UE RT tech, which is not based on RTX:
https://www.techspot.com/review/2599-radeon-7900-xtx-vs-geforce-rtx-4080/#Fortnite_DX12_RT
And you need to understand - I don't give two rusty $hits about AMD. I just can't stand anyone and anything that seeks to manipulate information for their own profit, and boy, Nvidia has been all about that in the last decade. Thankfully that RTX circus at least is coming to an end...
It's not just Hogwarts; every game behaves the same way. Take Cyberpunk for example: the fewer effects you turn on, the smaller the gap is between AMD and Nvidia. That's like... just common sense. When a game uses just a tiny amount of RT, the difference is smaller, and vice versa.

The only manipulation is by people adding games with minimal to no RT to the RT averages and then claiming the difference is 16%. That's just a joke, and it's similar to testing CPUs at 4K and claiming there ain't much difference between an i3 and a 7950X3D.
 
Except that Hogwarts Legacy is an outlier in itself, because no other game behaves like that, and that result may very well be a bug - CB's analysis says so, btw - yet you are happy to embrace that particular bit of info with open arms xD
Look, I get that you have accepted the idea that RTX equals ray tracing as ground truth, but that couldn't be further from the truth.
This is the only modern RT result that matters - Epic's own implementation of their own UE RT tech, which is not based on RTX:
https://www.techspot.com/review/2599-radeon-7900-xtx-vs-geforce-rtx-4080/#Fortnite_DX12_RT
And you need to understand - I don't give two rusty $hits about AMD. I just can't stand anyone and anything that seeks to manipulate information for their own profit, and boy, Nvidia has been all about that in the last decade. Thankfully that RTX circus at least is coming to an end...

While UE5 does give me hope, Radeon still performs like $#!+ in the two games I like RT in best, Witcher 3 and CP 2077... Hopefully once actual games using UE5 release we'll get a better picture of how the 7900 XT/XTX perform. Although next-generation cards will likely be out first, I am looking forward to The Witcher 4, Hellblade 2, and Gears 6, which will all be using UE5.
 
While UE5 does give me hope, Radeon still performs like $#!+ in the two games I like RT in best, Witcher 3 and CP 2077... Hopefully once actual games using UE5 release we'll get a better picture of how the 7900 XT/XTX perform. Although next-generation cards will likely be out first, I am looking forward to The Witcher 4, Hellblade 2, and Gears 6, which will all be using UE5.
Well, naturally - everything in these CDPR games is designed for RTX specifically, the same way it is with Metro Exodus, or Portal RTX, etc.
It's not even about these titles not being optimized for AMD hardware - they are effectively running in emulation mode on Radeon.
 
Well, naturally - everything in these CDPR games is designed for RTX specifically, the same way it is with Metro Exodus, or Portal RTX, etc.
It's not even about these titles not being optimized for AMD hardware - they are effectively running in emulation mode on Radeon.

At least for me the choice was 4090 or 7900 XTX, and even at a 60% discount the Radeon card wasn't very appealing; I'd probably still go with the 4080 over it. But there is enough information/benchmarks out there for anyone to make their own informed decision on what's best for them. If all I played was COD I would have grabbed a 7900 XTX, for example.
 
Thankfully that RTX circus at least is coming to an end...

Not so sure about that; looks like they're now pushing the full path tracing stuff, which is even heavier than regular ray tracing.

It looks like their end game is to make everything literally unplayable without upscaling and frame interpolation.
 