
AMD Radeon RX 7900 XTX

"Stockholm syndrome is a coping mechanism to a captive or abusive situation. People develop positive feelings toward their captors or abusers over time. This condition applies to situations including child abuse, coach-athlete abuse, relationship abuse and sex trafficking."
 
The target is for the x80/x800 cards to be at $999/€999.
Mobile phone companies have been selling the same thing for ages, every year, at that price and above.

Anyone would do the same in Lisa's or Jensen's shoes, watching crap technology being sold at ridiculous prices.
And no one disagrees that GPU development is way harder than anything else (at the consumer level).

I think the problem will be solved in the next two or three generations, when both companies have chiplets and can produce GPUs for every pocket.
A monolithic design does not let you sell anything but high-end products at a good profit.

Anyway, although I want to upgrade my GPU and can cover the cost of a 7900 XTX/4080, it's a matter of principle.
Some friends were lucky enough to buy a 3080 for £649. Taking all the factors into account, my limit for the next x80 is 799.
 
Fair point, but in the wrong thread; there's a 4080 thread for Nvidia-only buyers to complain about price.

Or are you one of those "AMD should be competitive or I can't get my cheap Nvidia" types?
 

Looool. I think my post fits in both topics.

I had a 5700 XT (and an RX 470, 7970, 6950, 4850, X1950 Pro, etc.) before buying the 2080 Ti, so no.
I would buy a 7900 XTX if it were at the 6800 XT's MSRP.

 
True, I saw someone on YouTube comment on the 4080, "for $1000 I would buy it with no remorse", LOL. It is so incredibly overpriced that even at $200 less, it's still up there. Well, I don't see any incentive to replace my 6800 XT unless the 4080/7900 XTX goes down to $750 at most.
I also have an RX 6800 XT and I plan to keep using it for many years to come. It's a great performer and won't have any VRAM limitations like my R9 Fury did. I think that it'll be at least a decade before 16GB isn't enough and that's assuming that there will ever be a time when 16GB isn't enough. I say that because it's already more than enough for Unreal Engine 5 which is essentially photo-realistic. You can't get much better than that.
And even then, I don't know if my 5900X would bottleneck them at 1440p/144Hz.
Actually, the RX 6800 XT would be the bottleneck there, especially at 1440p, not the R9-5900X. See, the RX 6800 XT is the performance equal of the RTX 3080 and TechPowerUp has these results for the R9-5900X with an RTX 3080. I'll use 1080p because the game will be completely GPU bottlenecked at 1440p:
cyberpunk-2077-1920-1080.png

So you see here that your R9-5900X is tied with my R7-5800X3D with an RTX 4080, so our two rigs would be equal in performance because we both have RX 6800 XTs. However, if we had a more powerful card, like an RTX 3090 Ti for example:
CP2077.png

All other things being equal, your CPU's output increased by 7.5%, which means that the RTX 3080 was the bottleneck here. While it's true that Cyberpunk is pretty hard on GPUs, this is only 1080p, and if the RTX 3080 is the bottleneck at 1080p in Cyberpunk, then it would definitely be the bottleneck in almost all modern titles at 1440p. Exceptions would be strategy games like Civilization, because it relies completely on the CPU for game speed. I often joke that the ideal setup for Civilization 6 would be a Threadripper coupled with an RX 580.

If the performance increases at the same resolution with a faster GPU, it means that the CPU isn't the bottleneck so your R9-5900X is just fine. Even with the bottleneck, you're still looking at ~60FPS at 1440p with the RX 6800 XT in Cyberpunk 2077, certainly nothing worth upgrading any time soon. That game is often considered to be a "worst-case-scenario" for Radeon cards so other games would be even better. Here's a Cyberpunk demo at 1440p using an R9-5950X (which is pretty much the same as the R9-5900X in gaming) and an RX 6800 XT:
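The bottleneck logic above boils down to a quick check: hold resolution and settings constant, swap in a faster GPU, and see whether the frame rate moves. Here's a minimal sketch of that reasoning; the function name and FPS figures are hypothetical, not TPU's actual data:

```python
def infer_bottleneck(fps_base_gpu: float, fps_faster_gpu: float,
                     tolerance: float = 0.02) -> str:
    """Compare two runs at identical resolution/settings that differ
    only in GPU. A meaningful FPS gain means the old GPU was the limit;
    a flat result means the CPU (or game engine) was."""
    gain = fps_faster_gpu / fps_base_gpu - 1.0
    return "GPU-bound" if gain > tolerance else "CPU-bound"

# Hypothetical numbers echoing the ~7.5% uplift described above
# (RTX 3080 -> RTX 3090 Ti at 1080p):
print(infer_bottleneck(120.0, 129.0))  # GPU-bound
print(infer_bottleneck(120.0, 121.0))  # CPU-bound
```

The 2% tolerance is just there to absorb run-to-run variance; real benchmark comparisons would also need identical drivers and test scenes, as pointed out later in the thread.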
 
I was able to get one of these directly from AMD. Great pricing and shipping for sure. Just now installing the graphics drivers on Win11. Not much luck on the Linux side outside of Ubuntu and Ubuntu flavours.

Ubuntu and Red Hat always get their hands on these first.
 
Seems like a good offering from AMD: the same RT performance as a 3090 Ti and as fast as a 4080 in everything else, all for less money while using less power. What's not to like?
 
The price :D
The 7900 XTX is fine. It's 47% faster than the 6900 XT while having the same launch MSRP, even after two bad years of inflation. If people want to fight the good fight and try to lower prices, sure, have at it, but some people seem to think the 7900 XTX should really be a 7800 XT, which is delusional at best. The "halo" cards from both AMD and Nvidia have never been the best bang for your buck, except the 1080 Ti, which was clearly a mistake on Nvidia's part.

The real problem is the 7900 XT. Right now the 7900 XT is only a 34% improvement over the 6800 XT while costing $900. AMD won't release a 7800 XT that performs better than the 7900 XT, so you are probably looking at, at best, around a 30% improvement going from a 6800 XT to a 7800 XT, at who knows what cost. For comparison, I think the 5700 XT to 6700 XT was a 35% improvement at a 20% price hike; if they followed that trend, it would put the 7800 XT at slightly higher performance than the 7900 XT while costing $780 MSRP.
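The gen-on-gen comparison above is really just performance gain divided by price gain. A quick sketch (the function name is mine; the MSRPs are the round figures quoted in this post):

```python
def value_change(perf_gain: float, price_old: float, price_new: float) -> float:
    """Relative change in performance-per-dollar across a generation."""
    return (1.0 + perf_gain) / (price_new / price_old) - 1.0

# 5700 XT ($400) -> 6700 XT ($480): ~35% faster for ~20% more money
print(f"{value_change(0.35, 400, 480):+.1%}")  # +12.5% more perf/$
```

A positive result means the new card is a better deal per dollar; a 34% uplift for a 38% price hike, as with the 7900 XT over the 6800 XT, comes out negative.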
 
seem to think 7900 XTX should really be a 7800 XT, which is delusional at best. The "halo" cards for both AMD and nvidia have never been the best bang for your buck
People are not "thinking" that, and AMD does not have a "halo" product just because they decided to name it like one. A "halo" product has the highest performance at the highest price, like the 4090 this gen, and the 3090 and 6900 XT last gen. The 6900 XT traded blows with the 3090. Fast-forward two years, and now the best from AMD is tied with Nvidia's x80-class GPU. That's not a halo product; that's an x80-class GPU like the 6800 XT. AMD is just taking advantage of the huge price increase on Nvidia's x80 class and pricing to match.
The 7900 XTX is fine. It's 47% faster than the 6900 XT
39% according to this site, and 31% vs the previous top of the line, the 6950 XT. The 4090 is 45% faster than the 3090 Ti; that is a true generational leap (and not the biggest one at that).
the 7900 XT is only a 34% improvement over the 6800 XT, while costing $900
27% improvement (again, according to TPU) for a 38% increase in price. It would need to be at the very least $750 for the increase in performance to exceed the increase in price; in that case it would be 27% faster for 15% more money. That would still be a poor deal, but at least not a rip-off. The 7900 XT and 7900 XTX are clearly not for 6800 XT owners, or for people who care about price/performance.
you are probably looking at best around a 30% improvement going from a 6800 XT to a 7800 XT
Actually, 17-20% would be more realistic, and for no less than $700.
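For what it's worth, the arithmetic in the post above checks out against the round MSRPs used in this thread ($650 for the 6800 XT; the 27% figure is the TPU uplift cited above):

```python
perf_gain = 0.27   # TPU relative-performance uplift cited above
msrp_6800xt = 650  # round launch-MSRP figure used in this thread

for price in (900, 750):
    price_gain = price / msrp_6800xt - 1
    print(f"${price}: +{perf_gain:.0%} perf for +{price_gain:.0%} price")
# $900: +27% perf for +38% price
# $750: +27% perf for +15% price
```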
 
39% according to this site. And 31% vs previous top of the line 6950 XT. 4090 is 45% faster than the 3090Ti, that is a true generational leap(and not the highest).
Seriously? Can we come back to this in a year?
 
In all your agitation, you've remained frozen in 1990. Gentlemen, today a video card is not only used for FPS in gaming. Does it support OptiX or CUDA? Does the encode/decode block match Nvidia's? Given the price of video cards, and given that no respected creator is looking at any Radeon, demand in this segment is huge. Oh, and let's not forget DLSS, which is completely missing from AMD.
You gave the example of Doom Eternal. What performance does an AMD video card get with DLSS on? Ah, it doesn't support that technology! So how does it perform with FSR? Ah, the game doesn't have it implemented.
On top of that, gentlemen, we are at the end of 2022. If we are still investing a lot of money, we invest it intelligently, and "intelligently" does not mean buying video cards released two years ago. I'm just saying.
Do you work for nVidia? You're spouting their BS extremely well for someone who doesn't.

CUDA - Are you a student or professional in the CAD market? No? So who cares?
DLSS - It's no better than FSR, because you wouldn't be able to tell which is which unless they were side by side and you were looking at the screen through a magnifying glass with the game running at 25% speed. What an absolute joke, a joke that you have either fallen for or have a vested interest in.

It didn't take long for me to no longer be able to take you seriously, between your nonsensical way of stringing words together and your clear lack of gaming experience. On to the list you go. I prefer to forget that people like you exist.
 
CUDA is everywhere, my friend.
For example, for your use case, in rendering: OptiX (RTX only) -> CUDA (Nvidia cards only) -> OpenCL (AMD). If a program does not support OptiX, you can confidently use CUDA, which is much more efficient and better implemented than OpenCL.
I don't know what reviews you've read, but I must not be from Earth if you think FSR = DLSS. Even if they were equal, nVidia cards support both.

For example, I have the privilege of choosing what I want with an nVidia card, in Cyberpunk, Shadow of the Tomb Raider, RDR 2 and many, many others.
In Doom Eternal, and not only there, you don't have an FSR option.
"The gentle lamb sucks from two ewes," a Romanian proverb that decided the winner of the great battle: 5700X versus 2070 Super.
 

Attachments

  • cyber DLSS.jpg
  • cyber FSR.jpg
  • doom dlss only.jpg
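The OptiX -> CUDA -> OpenCL priority described above amounts to a simple backend pick. Here's a sketch of that fallback order with made-up names; this is not any real renderer's API, just the logic as stated:

```python
def pick_render_backend(vendor: str, has_rt_cores: bool,
                        app_supports_optix: bool) -> str:
    """Fallback order described above: OptiX (RTX only) -> CUDA
    (any Nvidia card) -> OpenCL (AMD and everything else)."""
    if vendor == "nvidia":
        if has_rt_cores and app_supports_optix:
            return "OptiX"
        return "CUDA"  # still preferred over OpenCL on Nvidia hardware
    return "OpenCL"

print(pick_render_backend("nvidia", True, True))   # OptiX
print(pick_render_backend("nvidia", False, True))  # CUDA
print(pick_render_backend("amd", True, True))      # OpenCL
```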
For example, I have the privilege of choosing what I want with an nVidia card
And that's exactly why your comments have zero relevance on price; you're like the top 1% of the income bracket. Most people complaining hold one of these two points of view:

1. They have the money, hard-earned by working (money does not grow on trees), and don't agree with a $350-$500 increase just because AMD or Nvidia wants to earn more.
2. Their budget was capped at $800, which always got them an x80 GPU, and now they have to settle for x70 performance, because their income did not increase by 70%.

This is not about AMD vs Nvidia anymore.

I hope you understand something very basic (common sense).
 
Or
Proverb: I'm too poor to buy cheap if it means compromise.
You keep saying that I don't understand, but your blinkers don't let you see that the market regulates everything. You complain too much that your old 256-line CRT costs less than the new 4K OLEDs.
 
The market, my ass. We are quick to forget that if mining were still viable, you would not be able to buy a 4080, and the 7900 XT is (was) not selling for the same reason: it was too expensive for the market. The hardest 7900 XTX to get will be the 3×8-pin variants, and I expect a $100-200 price difference for those units. They must be selling, too, because the price has already risen by $300 on Newegg for, of all things, the Sapphire Reference Edition. You can expect a bump in the rest of the AIB cards at first as well: the Pulse is currently $70 more than the Nitro, which makes no sense, but those prices are listed from when the cards launched.
 
As demand for Radeon is high here, the difference between the 4080 and the 7900 XTX (the only available model) is only ~$60, not $200. I wouldn't worry about $60 when the 4080 is clearly above the 7900 XTX at ... I won't repeat them, but they are features.
The market regulates everything, gentlemen. No one is forcing you to buy, and you won't have a heart attack if you don't. What the hell, are you North Koreans undercover?
 
Yes, yes, we know about those features. You seem to forget that not everyone has your specific use case in mind. This is a thread for the 7900 XTX. Don't you, with no specs listed, have a 4080?

I don't know what reviews you've read, but I must not be from Earth if you think FSR = DLSS. Even if they were equal, nVidia cards support both.
Yep, thanks to Nvidia, that company that really shows how much it cares about gamers.
 
As the magic of cheap fuel ended in 1970 (as if), maybe you should also inspect the other end of the glass: video cards are not expensive now; they were very cheap until now. The Cinderella story for only $600 is over. It remains to be seen how many will adapt to reality.
 

There's a flaw in your comparison: @W1zzard's tests are CUSTOM, and I don't know how exactly HU tests their games.

You can compare them directly ONLY WHEN all things are equal (except the processor, of course, in this specific case), but since they aren't, you can't.

Either you compare both tests from TPU or both tests from HU, and EVEN THEN it assumes ALL ELSE is the same (from drivers to Windows version, etc.).
 
The differences in the test benches aren't big enough to be significant, so I don't worry about it.
 
View attachment 274144
16% RT gap, on par in raster, with more games pushing past the 4080 than ending up worse.

$999 is priced right in relation to the 4080, but not priced right in absolute terms. The 7900 XT, on the other hand, is half a tier below the 4080 but has virtually the same perf/$.
Overall this makes AMD's offering on the (too-)pricey side IMHO, much like Nvidia's.

Guess I'm saving that 13th month for now :)
How come your chart is different from the ones in the TechPowerUp review? I double-checked; it's different.

Are you hoping for a miracle from the drivers?
How old are you? Why did you come to a 7900 XTX forum to spam?

Be happy with your RTX 4080 Ti purchase; it's the best thing since sliced bread.

For me, I only bought the 7900 XTX because it's the most powerful GPU that still uses normal 8-pin power connectors. The Nvidia 16-pin adapter is fxxking ugly, and it's too much effort to redo a custom one.
 
Put the green Kool-Aid down, step back, and chill.

I have both brands; CUDA means f-all for gaming anyway.

CUDA has not stopped my Vega 64 or this 7900 XT from playing any of these games.

Nothing Nvidia sells is a must-have, though admittedly the same goes for AMD and Intel.

Your trolling is getting ridiculous; this isn't an Nvidia thread, and you're looking like a shill at the moment.
 