
RDNA4 (RX 9070XT / 9070) launch announced for (delayed to) March 2025

Do we have to trot out the "80% of RTX GPU owners (20/30/40-series) turn on DLSS in PC games." stat again? It's whether those people deem the potential extra price worth it for the potential extra frames (and improved quality). Doesn't AMD also have a fancy new all in one app like Nvidia for ease of access?

I doubt that number. A lot of people use the custom presets, which turn DLSS on by default; I doubt 80% are actually tinkering with DLSS. Honestly, if the game runs fine I don't turn DLSS on in most games I play.
There is probably an 80% of gamers that use motion blur because it comes as the default.
 
I'm wondering if ease of use / familiarity would give pause to "switching sides"? That's why I mentioned the AMD all-in-one app.
 
That is the reason. RT, DLSS, Frame Gen, and that's just the beginning. The general consensus is that those things are essential and as such Nvidia will be a better buy. If the 9070XT is as fast as the 5070 Ti in raster but costs the same as a 5070, people would still recommend getting the 5070. Add to this that more and more people (and gamers specifically) are into making videos, 3D modeling and whatever, so the line between professional and consumer has indeed blurred quite a bit, and in that space AMD has no chance of catching up.
DLSS is essential, RT is debatable depending on what tier you're buying, and framegen is just nice to have in a select few titles, but AMD has its own solution there, so who cares.

In order for me to trade away DLSS I'd want a card that's 20% faster in both raster and RT.
 
DLSS is essential, RT is debatable depending on what tier you're buying, and framegen is just nice to have in a select few titles, but AMD has its own solution there, so who cares.

In order for me to trade away DLSS I'd want a card that's 20% faster in both raster and RT.
My first experience with DLSS (back when I had a 3070) was so off-putting that I still wouldn't touch any upscaler with a 10 ft pole. Sure, the tech has evolved and game devs basically depend on it today, but I'd still rather have higher firepower than better upscaling any day.
 
Apologies, I phrased myself excessively poorly. What I mean is, that Twitter profile can't be proven to be the real David McAfee. McAfee at AMD is part of Ryzen product management (which, given his other responsibilities, likely involves his need to be informed about projects in the works) and the 'client channel business' department, or more accurately client communications and marketing.

The 'David McAfee' on Twitter is likely not real. It doesn't help that the little blue badge was only acquired 'sometime in January 2025' rather than around the time of the account's creation, when he would have already been working for the company in his current very public position, never mind that he for some reason hadn't made a Twitter account for his professional presence until '22, at no point during either his nearly 18-year history with AMD or his 6-year history on LinkedIn. AFAIK, only his LinkedIn is verifiably him. Anything else is more easily falsified than verified. The only tie to AMD is that he's followed by an account presenting itself as Anush Elangovan, who is part of a different department in a different location and whose account is similarly suspect.

TL;DR too many of the account's details are suspiciously off and it's setting off my 'fake shit' alarms.

Not everyone had the need or the means to purchase the "blue badge" on Twitter earlier, and I was one of them. That's why my accounts on Twitter have been private, and I already lost one account because of the way Twitter operates after Musk purchased it...

I do not want to support Musk that's why I do not have the blue badge myself.

Again, David McAfee shouldn't state this, because it's a death note for the RX 9070 and RX 9070 XT. People have been waiting, and if March 2025 is true, the majority will have bought Nvidia by then, even most of those waiting on the new Radeon cards, such as myself.

I might just go RX 7800 XT or RX 7900 GRE if these rumours are true.
 
I don't think that's the consensus at all; just look at the fake frames discussion everywhere. There's also the competitive gaming crowd, which is enormous and doesn't care at all about that, and the general backlash on the 5070 = 4090 Ti claim.

Pricing is the issue here. If they release a card with good raster performance at a very good price compared to whatever Nvidia has in the same raster category, people will buy AMD, and influencers will praise AMD and shit on Nvidia's AI and 5070 = 4090 Ti claims, fake frames, ghosting and lag, etc...

What people want is a decently priced mid range gpu with good raster, the rest is BS.
I also feel like their marketing emphasis on FG tells a story that most of the gains going forward will come from FG, but everybody knows that FG performs best when the GPU has decent raw power to begin with... That's one aspect where I think they are a bit tone deaf. There were a ton of people who assumed that the DLSS 3 example in Cyberpunk was generating frames from 27 FPS (hence a terrible experience), when the base FPS was actually boosted by upscaling first.

It's factually still in their interest to boost raster performance, but their marketing is indirectly saying that they might stop improving on that aspect.
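To put rough numbers on that (purely illustrative, assuming a hypothetical 2x upscaling gain and 2x frame generation, not measured data): FG only multiplies whatever base frame rate the renderer actually achieves, so the upscaling step that runs first matters a lot.

def fg_pipeline(native_fps, upscale_gain=2.0, fg_factor=2.0):
    # hypothetical scaling factors; real gains vary per game and preset
    rendered_fps = native_fps * upscale_gain    # upscaling raises the rendered frame rate
    displayed_fps = rendered_fps * fg_factor    # frame generation adds interpolated frames on top
    latency_ms = 1000 / rendered_fps            # responsiveness still tracks rendered frames only
    return rendered_fps, displayed_fps, latency_ms

print(fg_pipeline(27))                     # ~54 rendered / ~108 displayed FPS, ~18.5 ms per rendered frame
print(fg_pipeline(27, upscale_gain=1.0))   # FG straight off 27 FPS: 54 displayed, still a ~37 ms feel

Which is why interpolating straight from a 27 FPS base would indeed feel terrible, while an upscaled base does not.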
 
My first experience with DLSS (back when I had a 3070) was so off-putting that I still wouldn't touch any upscaler with a 10 ft pole. Sure, the tech has evolved and game devs basically depend on it today, but I'd still rather have higher firepower than better upscaling any day.
That's exactly how I feel about native :D
 
Yea, I had a quick glance around and it seems like a lot of people were saying that they rated them higher than the Sapphire cards last gen. Interesting to see what has changed, if anything.
I didn't really look into Powercolor's last gen, but it didn't seem to have the pass-through cooling, and was a bit thicker than dual-slot.

That is the reason. RT, DLSS, Frame Gen, and that's just the beginning. The general consensus is that those things are essential and as such Nvidia will be a better buy.
God save us from a world where Nvidia's features are "essential". :fear:

We already have $2k graphics cards, how much more of a monopoly do you want?

If the 9070XT is as fast as the 5070 Ti in raster but costs the same as a 5070, people would still recommend getting the 5070. Add to this that more and more people (and gamers specifically) are into making videos, 3D modeling and whatever, so the line between professional and consumer has indeed blurred quite a bit, and in that space AMD has no chance of catching up.
Why do some people think that everybody has become a video maker (I'm not gonna say content creator, because that needs actual content, not just a random dude rambling while playing a game) all of a sudden?

DLSS is essential,
No it's not. It's a helping hand. FSR exists, too.

framegen is just nice to have in a select few titles, but AMD has its own solution there, so who cares.
FG works at high FPS where you don't need it, but it's completely useless at low FPS where you'd actually need it. It's a gimmick just to score Nvidia a nice CES presentation with pretty green bars.
 
With the recent release of Indiana Jones, and I think FF will be the same, Nvidia has already won and the AI crap will be absolutely essential. I blame AMD for that; if they hadn't played the "-50 USD game" they could have a higher market share and things would not be dictated like this by Nvidia.
 
No it's not. It's a helping hand. FSR exists, too.
It is essential for me. FSR isn't at the same level, sadly.


FG works at high FPS where you don't need it, but it's completely useless at low FPS where you'd actually need it. It's a gimmick just to score Nvidia a nice CES presentation with pretty green bars.

You can't tell someone what works and what he needs. I've been using it on a bunch of games where my CPU can't push a higher framerate. Even at 50-70 FPS it works fine. I didn't say it's mandatory, and MFG is even less so, but it's still nice to have for games like Hogwarts.
 
These are insane differences if you don't know what you are looking at. PCGH, the site that conducted the test, was using ultra settings to get those differences (because if you lowered the textures one click, you wouldn't see them). The problem is, a big majority of these games are already running BELOW 30 FPS at 1080p on the 16 GB card. So what's the freaking point?
The point is, why one can't play with ultra textures even on lower tier cards? I'd much rather lower any other setting than textures quality.
Also, as long as game developers remain constrained by low amount of VRAM, we won't be seeing any major update in textures quality any time soon.

A better question is why Nvidia can't really do more than 8 GB of VRAM below the 700€ mark, except for the RTX 3060 (12 GB) and RTX 4060 Ti (16 GB).
For the 2nd generation in a row, Nvidia is gonna serve 8 GB along with a 128-bit bus in the lower mainstream class (xx60) and 12 GB (192-bit) in the mid-mainstream.
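For context, the bus width more or less dictates the capacity options: GDDR6/GDDR6X chips use a 32-bit interface and commonly come in 2 GB densities, so the chip count (and therefore the gigabytes) follows from the bus, with a clamshell layout doubling it. A minimal sketch, assuming 2 GB chips:

def vram_options(bus_width_bits, chip_gb=2):
    chips = bus_width_bits // 32                   # one 32-bit channel per memory chip
    return chips * chip_gb, chips * chip_gb * 2    # standard layout vs. clamshell (chips on both PCB sides)

print(vram_options(128))   # (8, 16)  -> 8 GB, or 16 GB as on the 4060 Ti 16 GB
print(vram_options(192))   # (12, 24) -> 12 GB in the mid-mainstream

So as long as Nvidia sticks to 128-bit in the xx60 class, the realistic choices are 8 GB or a pricier 16 GB clamshell card.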

The 7800 XT has the same performance as the 6800 XT, a $649 2020 GPU. It even had the same VRAM. If the 9070 XT isn't twice as fast as that at the same price, something is very rotten here.
The 7800 XT was not a real successor to the 6800 XT, hardware-wise. The RX 6800 XT was a $649 2020 GPU, while the RTX 3090 was a $1499 2020 GPU that ended up being only 14% faster in 4K, 10% in 1440p and 8% in 1080p than the RX 6800 XT, which also achieved a 14.6% better performance-per-watt ratio on average. The 6800 XT was a very good card back then.

Source: AMD Radeon RX 6800 XT Review - NVIDIA is in Trouble

AMD drivers allocate system RAM too. Where else would you overflow with an 8-12 GB card?
Sure, my point was about occupying less system RAM when there is enough VRAM.

In a strict sense, even AMD said nothing about the launch date. Not once.
That's true, there's only one confirmed piece of information regarding the RX 9070 (XT) launch, and that's Q1'25 (official presentation).
Several dates have been floated as possible launch dates: CES, 15.01.2025, 23.01.2025, but nothing was confirmed.

The cards have been sent to retailers and reviewers.
This does not mean that the cards will launch within 2 weeks of the first units being delivered to retailers. Unfortunately for us. It's chaos.

That is the reason. RT, DLSS, Frame Gen, and that's just the beginning. The general consensus is that those things are essential and as such Nvidia will be a better buy.
I would not call it general consensus.

I'd like that to be true, but when even the reviews here on TPU for every Radeon and Arc card come with a "No DLSS" con...
Imagine if, in reviews for cars made by Ford, Volkswagen, KIA, Hyundai, Jeep, etc., the conclusion listed a con saying "The car is not equipped with a BMW engine."
 
It is essential for me. FSR isn't at the same level, sadly.
If it is for you, that's fine. :)

Personally, I don't think FSR 3 is that bad, but I'd still rather avoid all upscaling if performance allows.

You can't tell someone what works and what he needs. I've been using it on a bunch of games where my CPU can't push a higher framerate. Even at 50-70 FPS it works fine. I didn't say it's mandatory, and MFG is even less so, but it's still nice to have for games like Hogwarts.
Well, if you want to turn 100 FPS into 200, fair enough. I just don't see the point.
 
It helps when needed, but why essential? I still find a lot of issues with it.
Because it offers the best image quality at a given performance level.
Well, if you want to turn 100 FPS into 200, fair enough. I just don't see the point.
More like 50 into 90.

The point is, why one can't play with ultra textures even on lower tier cards? I'd much rather lower any other setting than textures quality.
Also, as long as game developers remain constrained by low amount of VRAM, we won't be seeing any major update in textures quality any time soon.
It's the 7600 8 GB that has issues though, not the 4060. The 4060 8 GB was faster than the 16 GB 7600. I think Nvidia has better compression algorithms.
 
The 7800 XT was not a real successor to the 6800 XT, hardware-wise. The RX 6800 XT was a $649 2020 GPU, while the RTX 3090 was a $1499 2020 GPU that ended up being only 14% faster in 4K, 10% in 1440p and 8% in 1080p than the RX 6800 XT, which also achieved a 14.6% better performance-per-watt ratio on average. The 6800 XT was a very good card back then.

Source: AMD Radeon RX 6800 XT Review - NVIDIA is in Trouble
The $499 7800 XT is the successor of the $479 6700 XT. People get hung up on model names way too much. Had it been called the 7700 XT, and the real 7700 XT just a non-XT, both would have been a lot more successful.

Imagine if, in reviews for cars made by Ford, Volkswagen, KIA, Hyundai, Jeep, etc., the conclusion listed a con saying "The car is not equipped with a BMW engine."
Exactly! If having no DLSS makes a graphics card a loser, then you're doomed to buy only Nvidia for the rest of your life. Welcome to the monopoly of $2k graphics cards, lads! ;)
 
Surely I'm missing something, but if you need 100+ FPS you are a competitive gamer, and those hate DLSS, FSR, etc... If you aren't, I'd prefer 60 FPS over 100+ FPS with ghosting and all the rest that comes with DLSS.
Sure, if things seem OK and the FPS could be better, I can turn DLSS on a bit, but I'd much prefer not to, and I'm willing to sacrifice FPS to avoid image issues.

CP77 is the DLSS flagship; it has been trained on supercomputers for years now, and I can still find flickering and ghosting with DLSS today.
 
Surely I'm missing something, but if you need 100+ FPS you are a competitive gamer, and those hate DLSS, FSR, etc... If you aren't, I'd prefer 60 FPS over 100+ FPS with ghosting and all the rest that comes with DLSS.
Sure, if things seem OK and the FPS could be better, I can turn DLSS on a bit, but I'd much prefer not to, and I'm willing to sacrifice FPS to avoid image issues.

CP77 is the DLSS flagship; it has been trained on supercomputers for years now, and I can still find flickering and ghosting with DLSS today.
Exactly my thoughts. I'm an atmospheric gamer, so I'm fine with 50 FPS. FG doesn't work below that, so it's useless to me. If you're a competitive gamer, however, the extra latency and ghosting of FG kills your reaction times, I imagine.
 
I've been thinking about buying back the 7800 XT that I once had, so I'm good with whatever that is faster and comes at a good enough price. I have expectations, but they're not through the roof.
Yeah, I get it. For me that would not do. My 6900 XT is a bit faster than a 7800 XT. I wish to get 7900 XTX performance for some decent money and less power draw.
 
Yeah, I get it. For me that would not do. My 6900 XT is a bit faster than a 7800 XT. I wish to get 7900 XTX performance for some decent money and less power draw.
That would be nice, though I'm expecting the 9070 XT to fall a bit short - as long as it's a small bit, like 7900 XT level, I don't mind at all.

It's the 7600 8 GB that has issues though, not the 4060. The 4060 8 GB was faster than the 16 GB 7600. I think Nvidia has better compression algorithms.
As far as I know, the 4060 Ti is affected as well. Games run fine until the VRAM overflows and the card starts pulling data from system RAM. Sometimes it happens straight away. Sometimes it takes a bit of game time. Sometimes the game loads low-quality models and textures to save performance (Halo Infinite).
 
As far as I know, the 4060 Ti is affected as well. Games run fine until the VRAM overflows and the card starts pulling data from system RAM. Sometimes it happens straight away. Sometimes it takes a bit of game time. Sometimes the game loads low-quality models and textures to save performance (Halo Infinite).
I was specifically referring to the PCGH review that was posted on a previous page, where they show big differences between the 8 and 16 GB 7600 GPUs. Problem is, the 8 GB 4060 was faster than both, so it's not the VRAM per se that's the issue there.

They go out of their way to avoid latency; no one turns DLSS on for competitive gaming.
DLSS / FSR reduce latency. You are probably referring to FG.
 
I was specifically referring to the PCGH review that was posted on a previous page, where they show big differences between the 8 and 16 GB 7600 GPUs. Problem is, the 8 GB 4060 was faster than both, so it's not the VRAM per se that's the issue there.


DLSS reduces latency. You are probably referring to FG.
Sure, I just mean, performance issues aren't the only symptoms of running out of VRAM.
 
Do we have to trot out the "80% of RTX GPU owners (20/30/40-series) turn on DLSS in PC games." stat again? It's whether those people deem the potential extra price worth it for the potential extra frames (and improved quality). Doesn't AMD also have a fancy new all in one app like Nvidia for ease of access?
It sounds more like a marketing stat than something users are even aware of; I wonder how many people just leave it on if the game recommends DLSS?
If someone prefers the software features, I think that's fine, but I don't see how paying more for software is a good thing instead of having better performance without the need for faked frames or a lowered resolution.
Because it offers the best image quality at a given performance level.
If that's what you prefer, fine, but I don't see why it should be a necessity. I'd rather turn down a few things than have to use upscaling.
More like 50 into 90.
But that 90fps will still feel like 50fps in terms of latency.
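Rough numbers on that point (illustrative, assuming input is only sampled once per rendered frame): going from 50 rendered FPS to ~90 displayed FPS with FG changes how often a frame hits the screen, not how often the game reacts to you.

def fg_feel(rendered_fps, displayed_fps):
    input_cadence_ms = 1000 / rendered_fps     # the game still samples input once per rendered frame
    display_cadence_ms = 1000 / displayed_fps  # how often a (real or interpolated) frame is shown
    return input_cadence_ms, display_cadence_ms

# 50 rendered -> ~90 displayed: motion looks ~11 ms smooth, but input still updates every 20 ms,
# and interpolation also holds a rendered frame back until the next real one exists.
print(fg_feel(50, 90))

Upscaling is the opposite case: it raises the rendered frame rate itself, which is why it reduces latency rather than adding to it.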
 
If that's what you prefer, fine, but I don't see why it should be a necessity. I'd rather turn down a few things than have to use upscaling.
It's a necessity for him, not for everyone who has ever bought a GPU since the inception of DLSS or FSR; I'm not sure what we are arguing about here.
But that 90fps will still feel like 50fps in terms of latency.
Personally, I think camera smoothness is the most important thing, especially in a 3rd-person game like Elden Ring or Genshin, for example. Getting a smoother picture overall helps as well. Also, I could not tell the difference between 50 and 40 ms.
 