
AMD Announces the $999 Radeon RX 7900 XTX and $899 RX 7900 XT, 5nm RDNA3, DisplayPort 2.1, FSR 3.0 FluidMotion

Nice looking card :)

 
Decoupled shader clocks... that's a throwback, reminding me of back in the day when you could overclock the shader and core clocks separately on GeForce cards.
 
I'll be waiting for the rest of the lineup, and some benchmarks, before I can judge this. It looks good on paper, I love the idea of using chiplets, especially for GPUs which are traditionally extremely huge. Could be a winner, but it's too early to tell.

Love it. No bells and whistles, just raw performance, without compromises, at a sane price.
These prices are not sane. I remember when I could buy a card of this caliber for less than a third of the price.
edit: I didn't factor in the euro tax. That makes it less than a third, not just less than half.
 
They are so far ahead in mindshare they can afford to price much higher than AMD.

If AMD could price their cards the same as Nvidia, they would. They are not giving us a 7900 XTX at $1,000 out of the kindness of their hearts.

Look at their CPU division: the minute they caught up to Intel, prices went to $h1+.

The same will happen if they ever reach overall parity with Nvidia on the GPU side of things.

AMD isn't competing in the "ultra high end".
 
Napkin math says this $999 GPU kills the 4080 at its current price
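Spelling out that napkin math as a quick sketch. The ~$1,199 4080 price and the assumption that the two cards land near raster parity are placeholders of mine, not figures from this thread:

```python
# Perf-per-dollar comparison under an ASSUMED performance parity.
def perf_per_dollar(relative_perf: float, price_usd: float) -> float:
    """Relative performance units bought per dollar spent."""
    return relative_perf / price_usd

xtx = perf_per_dollar(1.00, 999)       # 7900 XTX at announced MSRP
rtx_4080 = perf_per_dollar(1.00, 1199) # hypothetical parity at assumed price

advantage = xtx / rtx_4080 - 1  # ~0.20, i.e. ~20% more perf per dollar
```

If the 4080 turns out meaningfully faster, the gap shrinks accordingly; the point is just that at equal raster performance the price difference alone is a ~20% value gap.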
 
What's the reason for them not showing benchmarks vs the 4090, or at least the 3090 Ti? They really enjoyed showcasing the 6800 vs Nvidia GPUs.

If they used the 3090 Ti, people would ask why not use the 4090. Given the 4090 carries a 60% price premium, they are not really targeting the same market, and there is no 4080 16GB to compare against at a closer price point, so they just compared to their old flagship.
 
Ahah, they already announced that a couple of weeks ago, right after the first problems with the 12-pin connector appeared.
The actual cards would have been designed well before the 12VHPWR problem surfaced.
I mean, the AMD PR team and the 3D animators caught up with the news rather quickly and decided to re-render the 3D cuts and PowerPoint slides to make this a selling point :)
I am impressed with how efficiently they caught up with the news.
 
I think he is asking whether you can find those settings you just mentioned in the Radeon driver, so where do you find "Adaptive AA w/ supersampling"? I'd like to know as well, for older games.
 
What's the reason for them not showing benchmarks vs the 4090, or at least the 3090 Ti? They really enjoyed showcasing the 6800 vs Nvidia GPUs.
It's pretty obvious. The numbers they provided were 50-70% faster than their current gen. That makes the 7900 XTX around 0-20% slower than the 4090 (depending on the game) in pure rasterization. I expect they would be comparing it to the 4080, if it were out.

This was first and foremost a marketing presentation, and at this time there's just nothing from Nvidia in the $900-1,000 price range that makes for a good performance showcase of their GPUs.
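A quick sanity check of that "0-20% slower" estimate. The 1.5-1.7x uplift over the 6950 XT comes from the post above; the 4090 multiplier is an assumed placeholder of mine (roughly 1.7-1.9x a 6950 XT), not a measured number:

```python
# How far behind the 4090 the XTX lands, given both cards' ASSUMED
# performance multipliers relative to the same 6950 XT baseline.
def deficit_vs_4090(xtx_mult: float, rtx_4090_mult: float) -> float:
    """Fractional deficit: 0.0 = dead even, 0.2 = 20% slower."""
    return 1 - xtx_mult / rtx_4090_mult

best = deficit_vs_4090(1.7, 1.7)   # optimistic title: dead even
worst = deficit_vs_4090(1.5, 1.9)  # pessimistic title: ~21% slower
```

So the "0-20% slower, depending on the game" range falls straight out of those two multipliers; everything hinges on what the 4090 multiplier really is per title.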
 
Well... I might get one. I will just get in trouble... but I can always say I saved a thousand bucks by not going with Nvidia lol...
 
Nice toy(s) to have, pity there aren't any AAA games atm that motivate me enough to upgrade.
Perhaps with Starfield next year.... but still too early to predict performance demands in that title.
 
Because a graphics bell and whistle that tanks your framerate while producing effects so minute that even in still frames people can't see the difference is totally worthless to everyone but spec-whores. To those of us who PLAY games, RT is functionally useless except for ambient occlusion, which can be done far more easily with shader tricks that don't require the power of a nuclear sub to run. The number of newer games coming out with impressive lighting effects that don't need any form of RT should indicate that RT, at least in its current form, will go down the same road as HairWorks and PhysX.
I mean, there are plenty of arguments in there. SSR is far more jarring than RT reflections. I don't really see RT going anywhere; both major players are heavily invested in RT now.
 
Need @W1zzard's review already, I think we'll see a lot of variance vs the 4090 depending on the game tested.

Also keen to hear more about this FSR 3.0 FluidMotion with frame generation: what will it work on, how good is it, can it piggyback on games that already have DLSS 3? So many questions.
 
So, basically, AMD just did a switcheroo with the naming scheme here, in my opinion, huh?

What would have been the 7900 XT is now the 7900 XTX.

What would have been the 7800 XT is now the 7900 XT.

"It would look weird and it would expose us for charging $899 just for our 7800 XT, but hey, let's pull an Nvidia... a tad less... and our fans will defend us... Yeah!... the Board!" :laugh:
 
Nvidia price cuts... will they do it?
 
I just realized this: it has AV1 encoding, which you could use for OBS streaming. NVENC was the only other thing Nvidia had an upper hand on.
 

This average includes games where the 4090 is CPU-bound even in 4K, so the XTX will be affected by that as well.

But it could be very close in AMD-optimized titles. It should definitely win performance per dollar (in rasterization). Performance per watt might be similar, as the 4090 is surprisingly efficient.
 
Well played Nvidia, people now see $899 to $999 as great value.
 
Well, that $999 card will be about $1,537 here.

That's a hard pass. I'm going to have to go back to my other hobbies, since I am being priced out of this one.
Don't say that. Hopefully AMD will be good to us Canadians and let us buy it from their website for MSRP plus conversion. I don't want to see mining premiums applied to this card. I am going to Memory Express tomorrow to talk to my friend.
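For anyone wondering how $999 USD turns into ~$1,537 locally: it's the exchange rate plus sales tax. The FX rate and tax figures below are assumed placeholders (late-2022 USD to CAD hovered around 1.36-1.38), not quotes from this thread:

```python
# Landed local price = MSRP converted at an ASSUMED FX rate,
# then marked up by an ASSUMED combined sales tax.
def local_price(usd: float, fx_rate: float, sales_tax: float) -> float:
    """Local-currency price including tax."""
    return usd * fx_rate * (1 + sales_tax)

cad = local_price(999, 1.37, 0.12)  # ~1533 CAD, close to the $1,537 quoted
```

Retailer margins and regional MSRP adjustments would push the street price a bit higher still.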
 
Monitors will follow. Imagine buying an RTX 4090 and getting stuck at 4K 144 Hz forever.
My Overwatch 2 is running 600 fps at 4K, and sorry bruh, you have to play at 144 Hz on your horsegarbo Nvidia. What is this.
Not sure if you're trolling, or you really are serious with this total nonsense
 
Well played Nvidia, people now see $899 to $999 as great value.
The GTX 690 was $1000 in 2012

The GTX 590 was $1000 in 2010

The 8800 Ultra was $830 in 2007, which is over $1,100 adjusted for inflation.

Halo cards hitting 4 figures is not a new thing.

In the context of halo cards, the 7900 XTX is a "great value," in that it is substantially cheaper than the 4090 while likely not being much slower, if not occasionally faster.
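Checking the inflation claim above. The cumulative CPI factor is an approximation of mine (US CPI 2007 to 2022 is roughly +38%), not an official figure from this thread:

```python
# Inflation adjustment: multiply a historical price by the ASSUMED
# cumulative CPI factor to express it in present-day dollars.
def inflation_adjust(price_then: float, cpi_factor: float) -> float:
    """Historical price expressed in current dollars."""
    return price_then * cpi_factor

ultra_2022 = inflation_adjust(830, 1.38)  # ~1145, so "over $1100" holds
```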
 
The GTX 690 was $1000 in 2012

The GTX 590 was $1000 in 2010

The 8800 Ultra was $830 in 2007, which is over $1,100 adjusted for inflation.

Halo cards hitting 4 figures is not a new thing.

In the context of halo cards, the 7900 XTX is a "great value," in that it is substantially cheaper than the 4090 while likely not being much slower, if not occasionally faster.

Well, indeed, yet something like the $699 GTX 1080 Ti was milking the market, and the $1,000 Titan was an abomination confirming their greed.

How times have changed.
 