
AMD Radeon RX 7900 XTX

???
1.7x the performance of the 6950 XT at 4K in Cyberpunk would at least put the 7900 XTX on par with the 4090, yet it's only 1.42x faster; same story in most other games.



113 / 84 = 1.34x the fps
I knew something was wrong with their marketing, so I didn't believe it. Anyone who thinks it's true (either AMD's or Nvidia's) deserves to be unhappy.
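
For what it's worth, here's the arithmetic in Python using the 4K Cyberpunk numbers quoted above; a rough sanity check against AMD's "up to 1.7x" slide, not review data:

Code:
fps_6950xt = 84                            # 4K Cyberpunk result quoted above
fps_7900xtx = 113

measured = fps_7900xtx / fps_6950xt        # ~1.35x
claimed = 1.7                              # AMD's "up to 1.7x" slide

print(f"measured uplift: {measured:.2f}x")
print(f"fps the slide implies: {fps_6950xt * claimed:.0f}")              # ~143
print(f"shortfall vs the claim: {(claimed / measured - 1) * 100:.0f}%")  # ~26%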
 
Wow, performance is honestly what I was hoping for.

Price is still too high, though; at least there is competition for the 4080.
 
Good raster value compared to the RTX 40 series, but that's about it.

Performance per dollar is at the same level as the previous generation. And even if this is the future and we have to get used to it, at this moment in time it is unacceptable.

I just do not see how they are envisioning this future. Who will pay $1000 for mid-range cards in a few years?
 
I want a bench of this thing under water. Does it scale well when kept cool?
 
People have short memories

Portal "3x" performance
 
Color me surprised, W1zzard blatantly biased yet again, pairing an AMD GPU with an Intel CPU. He thinks he's slick. He knows AMD GPUs perform best with AMD CPUs... I am done with W1zzard's antics.
Do we also need to get AMD SSDs, power supplies, keyboards and energy drinks for it to run better?
 
This is a really impressive showing by AMD if you compare it to the 4080. For $200 less, you get better performance out of the box in most of the tested games. RT performance isn't on the same level, but it seems like a massive jump versus RDNA2. It looks like AMD still has a crapload of driver work to do: a few games are really not scaling as expected, the multi-display idle bug is back, and what is up with the video decoding power consumption? I'm willing to bet AMD works this stuff out with drivers.
 
'Unacceptable RT performance', that's quite something.

First of all, I guess you consider all 3000-series cards 'unacceptable' in RT too, since they offer similar performance there. Secondly, RT is still a gimmick; Nvidia has made you believe it is worth paying a premium for 16% more of it.
AMD must believe that as well if they're making you pay a premium for RT performance comparable to the previous Nvidia generation... And for Nvidia, not offering any generational upgrade in RT performance would have been unacceptable too, after all the time and money they've spent researching and promoting it.
 
Another disappointing product due to price. This card should have obliterated the 4080 in raster to justify its price (RT is still stupid; even the 4090 and its RT cores get slaughtered).
They are just milking us.
 
I guess those rumors about AMD hardware bugs were right. In the AMD slides, and from AMD themselves, I swear it said this card would push 3 GHz. If it actually could have reached that frequency, I think it could have been a competitor to the 4090. Right now the card is kind of all over the place in FPS, frame times and power consumption. The card was clearly not ready, but they pushed it out anyway. Let's see what they can do to salvage this in the next couple of months with drivers.
 
If anything, this launch has made me very happy that I was able to snag a 6950 XT OCF on the cheap for Black Friday.
 
3000 series cards being released in late 2022 with a $1000 price tag would be unacceptable.

Maybe people are getting tired of AMD settling to be as fast as NVIDIA's previous gen every release.

So you've just taken what I said to the next level. Instead of just considering RT performance a major factor in a card's value, you are wholly defining a card by its RT performance, saying these are 'as fast as Nvidia's previous gen' while ignoring their actual (raster) performance. That's crazy framing.

The XTX is faster than Nvidia's previous gen; it's only in RT that there is a 16% gap with the current gen.
 
Look at the (% changes) for the 6950 XT versus the 7900 XTX. The 7900 XTX's efficiency in raytracing is basically on par with the 6000 series. This is disappointing, especially as they specifically said they worked on raytracing performance for this architecture. It looks like the architecture changes to increase raytracing performance didn't translate well into real-world game raytracing performance.

Remember that even Nvidia didn't improve on RT performance. Ampere had around a 40-45% drop in performance when you turned on RT, and so does Ada.

Neither company improved upon RT performance when compared to last gen. The only reason things look better is simply because of the rasterization performance gains. I think it's sad that neither company made any improvements with RT... although it doesn't really matter to me, because I don't care about RT. I never use it in any games that support it, because for me it really doesn't make any difference. You're moving too fast through the games to really take in the small visual changes that RT brings.
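
To put the "same relative hit" point in numbers, a minimal sketch; the fps values are made up, only the ~45% RT hit comes from the post above:

Code:
rt_hit = 0.45                          # ~40-45% drop with RT on, per above

old_raster = 100.0                     # hypothetical last-gen raster fps
old_rt = old_raster * (1 - rt_hit)     # 55 fps with RT on

new_raster = old_raster * 1.5          # hypothetical +50% raster gain
new_rt = new_raster * (1 - rt_hit)     # 82.5 fps with RT on

print(new_rt / old_rt)                 # 1.5x: exactly the raster gain, nothing RT-specific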
 
'Unacceptable RT performance', that's quite something.

First of all, I guess you consider all 3000-series cards 'unacceptable' in RT too, since they offer similar performance there. Secondly, RT is still a gimmick; Nvidia has made you believe it is worth paying a premium for 16% more of it.
Unacceptable for a top-tier product that costs $1000 and will be with us for the next two+ years.
It’s a gimmick to you. I value it a lot more than the raster performance.
 
Looks like they didn’t repeat the 6800/6900 sweetness.

Maybe they should have just added additional shaders to RDNA2 and made it bigger?

For the price, I guess it's OK.

@W1zzard A 2% difference at 4K between an AMD and an Intel CPU. Who was whining and complaining about that? Not even worth changing the test bed for. At 1080p, OK, but running these cards at that resolution would be ridiculous.
I guess there was no way to do confirmation testing on the DP and HDMI versions? I may have missed it, but for media testing, which codec(s) did you use? Do these cards offer full 10-bit colour output?
 
This is more like Zen 1 IMO - they're sacrificing efficiency for the 1st-gen modular design, but it has the potential to help them scale (or realize massive efficiencies) down the line. The first Zen had the same issues.

If they had just die-shrunk and optimized RDNA 2, they would have gotten better results, for sure, but they're opting for the Zen strategy.

I definitely think it's priced OK relative to the 4080 -- I don't agree with the reviews saying "Nvidia has to lower the price", since they do have more features and better RT performance; to me the 4080 and 7900 XTX are at parity.
No, it's Bulldozer. AMD was promising higher performance than what the reviews show. I think they were hoping to improve their drivers considerably in this last month. But they didn't. Or we will have to accept that their marketing numbers were best-case scenarios close to the border of being lies. Zen 1 was a new architecture that was offering tremendous IPC gains over Bulldozer (which probably was the easy part), while also incorporating a first step into chiplet design. Zen 1 was a clear jump into the future. RDNA3 is a big "What The Hell?".

Here we have an architecture that probably fails to be utilized at its full potential, like what we were seeing with Bulldozer. As you say, if they had shrunk RDNA2 they might have gotten better results. The same was true back when Bulldozer came out. Me and others were thinking that a 32nm Phenom III with 8 cores could have been a better product than an "eight core" Bulldozer.

It would have been priced OK if it were clearly beating the RTX 4080. Nvidia only has to drop the RTX 4080's price by $200 and the RX 7900 XTX is DOA at $1000. Or release the RTX 4070 Ti at $800 and see that card beating the RX models in RT performance. People will go that direction.

Damn, 2 years of Nvidia monopoly and Intel coming to bite AMD from behind. At least they will be selling plenty of APUs.
 
So you've just taken what I said to the next level. Instead of just considering RT performance a major factor in a card's value, you are wholly defining a card by its RT performance, saying these are 'as fast as Nvidia's previous gen' while ignoring their actual (raster) performance. That's crazy framing.

The XTX is faster than Nvidia's previous gen; it's only in RT that there is a 16% gap with the current gen.
Engines are updated or are being updated to have RT as a default replacement for baked lighting. This process is moving in one direction.

I meant what I said.
 
What about a memory temperature benchmark? The 7900 XTX has 384-bit GDDR6 while the 4080 has 256-bit GDDR6X. Does 384-bit GDDR6 run cooler than 256-bit GDDR6X?

The greatest thing about the 7900 GPUs is the inclusion of DisplayPort 2.1, finally. Other than that, the 7900 XTX's price is a little lower but its power consumption a little higher (5 nm vs 4 nm). Performance per price it's pretty much the same. I would still prefer AMD for the DisplayPort 2.1. However, the DP 2.1 here is only 54 Gbps maximum (UHBR13.5), not the full 80 Gbps, and that is bad. Full HDMI 2.1 is 48 Gbps, and both AMD and Nvidia offer the full 48 Gbps on HDMI 2.1. AMD is counting on DSC here; DSC has been available on DisplayPort since 1.4 (and HDMI 2.1 supports it as well).
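
Rough bandwidth math on that (Python; approximate payload after line-encoding overhead, blanking ignored, so real requirements are a bit higher):

Code:
dp21_uhbr13_5 = 54 * 128 / 132    # ~52.4 Gb/s payload, what the 7900 cards expose
dp21_uhbr20   = 80 * 128 / 132    # ~77.6 Gb/s payload, full DP 2.1
hdmi21_frl    = 48 * 16 / 18      # ~42.7 Gb/s payload

def raw_gbps(h, v, hz, bpc=10):
    # uncompressed RGB video bandwidth, blanking ignored
    return h * v * hz * bpc * 3 / 1e9

print(raw_gbps(3840, 2160, 144))  # ~35.8 Gb/s: fits UHBR13.5 uncompressed
print(raw_gbps(3840, 2160, 240))  # ~59.7 Gb/s: needs DSC, or full UHBR20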

 
No, it's Bulldozer. AMD was promising higher performance than what the reviews show. I think they were hoping to improve their drivers considerably in this last month. But they didn't. Or we will have to accept that their marketing numbers were best-case scenarios close to the border of being lies. Zen 1 was a new architecture that was offering tremendous IPC gains over Bulldozer (which probably was the easy part), while also incorporating a first step into chiplet design. Zen 1 was a clear jump into the future.

Here we have an architecture that probably fails to be utilized at its full potential, like what we were seeing with Bulldozer. As you say, if they had shrunk RDNA2 they might have gotten better results. The same was true back when Bulldozer came out. Me and others were thinking that a 32nm Phenom III with 8 cores could have been a better product than Bulldozer.

It would have been priced OK if it were clearly beating the RTX 4080. Nvidia only has to drop the RTX 4080's price by $200 and the RX 7900 XTX is DOA at $1000. Or release the RTX 4070 Ti at $800 and see that card beating the RX models in RT performance. People will go that direction.

Damn, 2 years of Nvidia monopoly and Intel coming to bite AMD from behind. At least they will be selling plenty of APUs.

I'm a huge fan of the new Intel, but they're not even close to 'biting' AMD from behind. If Intel can match the 6800 XT with Battlemage next year I will be impressed, but I doubt it.

This architecture does have similarities to Bulldozer, but at the end of the day it pushes the FPS needed to compete with the 80 class.

They will need to drop the price, for sure -- but hopefully the silicon savings allow for that.
 
I knew something was wrong with their marketing, so I didn't believe it. Anyone who thinks it's true (either AMD's or Nvidia's) deserves to be unhappy.

There is wrong, and then there is that.

Small samples always have a certain level of inaccuracy, but claiming a 70% bump in CP2077 and then getting just 43% at HUB vs a 6950 XT and 63% here vs a 6900 XT is utterly misleading compared to what AMD claimed. It is such a shame as well, because AMD were building a track record of being trustworthy in their marketing performance claims (they were with Zen 2, Zen 3, Zen 4, RDNA and RDNA2), so for them to do this totally destroys years and years of hard work.
 
Remember that even Nvidia didn't improve on RT performance. Ampere had around a 40-45% drop in performance when you turned on RT, and so does Ada.

Neither company improved upon RT performance when compared to last gen. The only reason things look better is simply because of the rasterization performance gains. I think it's sad that neither company made any improvements with RT... although it doesn't really matter to me, because I don't care about RT. I never use it in any games that support it, because for me it really doesn't make any difference. You're moving too fast through the games to really take in the small visual changes that RT brings.

NVIDIA did improve RT performance by a lot for Ada Lovelace relative to Ampere; it's just that the base frame rate increased as well. If RT performance were the same, generational improvements in RT titles would be a lot smaller than in purely rasterized games. Math requires respect.
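
One way to check that: divide the RT-title uplift by the raster uplift; anything above 1.0 is an RT-specific gain. The fps values below are invented purely to show the method:

Code:
def rt_specific_gain(old_raster, old_rt, new_raster, new_rt):
    # ratio of the RT uplift to the raster uplift;
    # > 1.0 means the RT hit itself shrank, i.e. a real RT improvement
    return (new_rt / old_rt) / (new_raster / old_raster)

# hypothetical: raster 100 -> 160 fps (1.6x), RT-on 55 -> 99 fps (1.8x)
print(rt_specific_gain(100, 55, 160, 99))   # ~1.13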

Some examples from a Russian website which runs some purely artificial benchmarks:
 

Attachments: 3dmdxr.png, 3dmportroyal.png, 3dmspeedway.png, rtboundary.png
Way overhyped... Now I know why the 4080 is so terribly priced, and now we will likely get an $800-900 4070. Way to go, AMD.


The only hope is that AIB cards perform way better.
 
Not sure what people were expecting, given that so many comments are talking about disappointment :S

It's slightly faster than a 4080, while cheaper. Did you think AMD were retaking the performance crown while charging half as much? lol

Nice card IMO.
 