
RDNA4 (RX 9070XT / 9070) launch announced for (delayed to) March 2025

Status
Not open for further replies.
You know what I think. If these cards can indeed perform like a 7900 card, the retailers are in a real pickle. The alleged leaked price is that of the 7700 XT. Just imagine what would happen to the value of every GPU from the 7700 XT up if they are the same price or more expensive. Hence the delay, to try to move those old cards. I expect another game bundle offer as an incentive. Look for 7000 series cards to get some deep discounts in the coming weeks.
Sure, that sucks, but isn't that the case every gen? It's AMD's fault for manufacturing too many GPUs that people don't want to buy, at current prices at least.
 
It's a necessity for him, not for everyone who has ever bought a GPU since the inception of DLSS or FSR. I'm not sure what we are arguing about here.
The average consumer looking to buy a new GPU will think having DLSS is important when reviewers list not having it as a con.
Personally, I think camera smoothness is the most important, especially in a third-person game, Elden Ring or Genshin for example. Getting a smoother picture overall helps as well. Also, I could not tell the difference between 50 and 40 ms.
IMO camera smoothness and lower input lag are both important; if a game looks like 90 fps but still feels like a laggy 40-50 fps mess, then it's not worth using upscaling over just turning a few settings down.
 
The average consumer looking to buy a new GPU will think having DLSS is important when reviewers list not having it as a con.

IMO camera smoothness and lower input lag are both important; if a game looks like 90 fps but still feels like a laggy 40-50 fps mess, then it's not worth using upscaling over just turning a few settings down.
Turning a few settings down won't make your CPU pump out any more frames.
 
I don't think that's the consensus at all. Just look at the fake frames discussion everywhere; there's also the competitive gaming crowd, which is enormous and doesn't care about that at all, and the general backlash against the 5070 = 4090 Ti claim.

Pricing is the issue here. If they release a card with good raster performance at a very good price compared to whatever Nvidia has in the same raster category, people will buy AMD, and influencers will praise AMD and shit on Nvidia's AI and 5070 = 4090 Ti claims, fake frames, ghosting, lag, etc...

What people want is a decently priced mid-range GPU with good raster; the rest is BS.

If that was the case AMD would sell more cards.
God save us from a world where Nvidia's features are "essential". :fear:

We already have $2k graphics cards, how much more of a monopoly do you want?

I don't want it, but those things are what people talk about. "But for RT, and hence DLSS, Nvidia is better" is just a stated fact.

Why do some people think that everybody has become a video maker (I'm not gonna say content creator, because that needs actual content, not just a random dude rambling while playing a game) all of a sudden?

Aspiring video creator mind you. "I'm not doing it right now but I would like to do X and for that Nvidia is better so..."

No it's not. It's a helping hand. FSR exists, too.

Go read any review w1z has ever made. DLSS being the superior tech is also a stated fact.

I would not call it general consensus.

Hyperbolic maybe, but for AAA gaming all of those things will only get more important. Look at Black Myth: Wukong. And again, DLSS being better than FSR is just a stated fact.
 
The average consumer looking to buy a new GPU will think having DLSS is important when reviewers list not having it as a con.
First of all, that was not his point; he likes FSR as well, the picture quality is just not that good (currently).
IMO camera smoothness and lower input lag are both important; if a game looks like 90 fps but still feels like a laggy 40-50 fps mess, then it's not worth using upscaling over just turning a few settings down.
Yesn't. I also think you should at least get to 45 stable frames (lower your settings and so on, you know the drill).

But why argue against DLSS if you hate 40-50 fps in games? Shouldn't you like it then? ("Free" performance for a small difference in picture quality.)
 
DLSS / FSR upscaling reduces latency. You are probably referring to FG.

it's not black or white like that:


and most competitive users just don't use it because there is no upside to it in most cases and you get cons
 
Go read any review w1z has ever made. DLSS being the superior tech is also a stated fact.
DLSS 4 looks noticeably better than FSR 3.1, but at what performance cost... There's a chance that the new FSR will finally close the gap, at least visually.
 
If that was the case AMD would sell more cards.

People get all upset when I talk about AMD cards, but I will stand by this: AMD's problem is the drivers, the software. They are absolute dogshit, and you can't change my mind. They burnt way too many people with their cards. I'm also not happy to be saying this, so I'm not a hater, I'm a realist.

All of this to say: their problem is drivers, and they are unwilling to drop pricing to compensate for that. I will easily pay $100 more for an Nvidia card just to not have to troubleshoot shit all the time, and like me there are countless others.
 
AMD's problem is the drivers
Please stop straight up lying! I agree about FSR and RT performance disadvantage but drivers ? Cmon.....
 
People get all upset when I talk about AMD cards, but I will stand by this: AMD's problem is the drivers, the software. They are absolute dogshit, and you can't change my mind. They burnt way too many people with their cards. I'm also not happy to be saying this, so I'm not a hater, I'm a realist.

All of this to say: their problem is drivers, and they are unwilling to drop pricing to compensate for that. I will easily pay $100 more for an Nvidia card just to not have to troubleshoot shit all the time, and like me there are countless others.
My experience as well. I've had a lot of issues with Nvidia, but fixes were usually a week or a month away. I had issues with AMD that I waited years for...
 
it's not black or white like that:

and most competitive users just don't use it because there is no upside to it in most cases and you get cons
Wow, not even 1 ms difference between DLSS on & off (in the worst case scenario).

ok nvidia (and amd) pack it up, the tech is garbage!

(Also a 3-year-old video, and probably outdated...)

My experience as well. I've had a lot of issues with Nvidia, but fixes were usually a week or a month away. I had issues with AMD that I waited years for...
Funnily enough, I noticed a bug in FF7 Remake on Steam where the whole screen becomes white and you only see the UI elements. I've only seen other people with RTX 30 series cards having the problem.

(It gets stuck there until you start photo mode and exit it, or save & reload your current save state.)

The game is also 2.5 years old. Maybe an issue on Square Enix's side, but who knows.
 
Please stop straight up lying! I agree about FSR and RT performance disadvantage but drivers ? Cmon.....

You don't seem to know what an opinion is. I can't be lying when I'm giving my opinion; it's my opinion, not a binary true-or-false statement. That's just moronic, man.

If I said Lisa Su is not AMD's CEO, or that Ryzen is a GPU brand, those are statements that can be true or false.
If I say AMD's software is shit, or that milk tastes bad, those are opinions, not binary true-or-false statements.
See the difference?
 
Imagine if reviews for cars made by Ford, Volkswagen, KIA, Hyundai, Jeep, etc. listed a con in the conclusion saying "The car is not equipped with a BMW engine."
Not a very good comparison; a "BMW engine" is not a killer feature, but DLSS very much is. And before you say "DLSS is just upscaling", it's not - it started out as such but it's become much more, including frame generation, and that's what makes it killer. The consumer market agrees, and if you as an individual do not - you're welcome to that opinion, but please remember it is just that, unsupported by the available facts. As such it makes perfect sense for W1zz to note it in his reviews.
 
it's not black or white like that:


and most competitive users just don't use it because there is no upside to it in most cases and you get cons
Am I blind? DLSS ON improved latency (dramatically) in every single scenario.

Funnily enough, I noticed a bug in FF7 Remake on Steam where the whole screen becomes white and you only see the UI elements. I've only seen other people with RTX 30 series cards having the problem.

(It gets stuck there until you start photo mode and exit it, or save & reload your current save state.)

The game is also 2.5 years old. Maybe an issue on Square Enix's side, but who knows.
There are lots of bugs I've had over the years with Nvidia, I'm not claiming otherwise: black screens, driver crashes, artifacts, monitors failing to recover from sleep, and all sorts of things. The thing is, there was an immediate fix in the next driver, or the problem solved itself if I manually downgraded to the previous driver. That hasn't been my experience with AMD, where such issues exist for years (yes, plural).
 
Having DLSS noted as a feature is fine, but IMO listing its absence as a con isn't, and only leads to readers questioning the reviewer's bias. Frame gen and upscaling are just nice-to-haves for those who want them; they shouldn't be pushed as something needed, but unfortunately the AAA games industry has been using DLSS as an easy way to avoid game optimization.
 
I'm not a doctor, but you should listen to his conclusion. No eyes needed, you can just listen.
Which is a no. If you use DLSS to increase FPS (i.e., you are GPU bound), it decreases latency, since it increases the framerate.
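The arithmetic behind that point can be sketched quickly (illustrative numbers only, no claims about any specific game or card):

```python
# Back-of-the-envelope frame-time math behind "more FPS = lower latency".
# All numbers here are illustrative, not measured benchmarks.

def frame_time_ms(fps: float) -> float:
    """Time one frame spends in the render pipeline, in milliseconds."""
    return 1000.0 / fps

native = frame_time_ms(45)    # ~22.2 ms per frame, GPU bound at native res
upscaled = frame_time_ms(90)  # ~11.1 ms per frame if upscaling doubles FPS

# Each frame reaches the screen ~11 ms sooner, so input feels snappier too.
print(f"native: {native:.1f} ms, upscaled: {upscaled:.1f} ms, "
      f"saved per frame: {native - upscaled:.1f} ms")
```

Frame generation is the separate case people argue about: interpolated frames don't shorten the real render interval, which is why the latency complaints center on FG rather than upscaling.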
 
unfortunately the AAA games industry has been using DLSS as an easy way to avoid game optimization.
Unfortunately you've failed to educate yourself on how the Moore's Law wall is precluding the generational advancements in graphics horsepower that we've become accustomed to. Upscaling and frame generation are required technologies if we want to see graphics fidelity continue to improve over and above what GPUs can offer. They are not hacks, they are not lazy, they are not fake frames, they are a solution to a fundamental physical constraint. Denying the facts doesn't change them.
 
Am I blind? DLSS ON improved latency (dramatically) on every single scenario.
In Fortnite it's a teeny tiny bit worse...

But even a cat would have problems noticing the difference in delay here... (Also 3 year old video, as I already mentioned)
 
...

Go read any review w1z has ever made. DLSS being the superior tech is also a stated fact.

....

So... point of discussion. W1z states that there is no DLSS, then immediately launches into "I know it's an Nvidia exclusive feature." It's almost like the defensiveness comes from knowing that DLSS and FSR are competing ways to do the same thing, which is inherently to guess at what frames 2-6 are when you've got 1 and 7. They are great if you're running a build that is nominally capable of the raw output needed to generate the actual frames, but it's kind of like our step-up in performance is being derived exclusively from whichever guessing algorithm you want to pretend is best. W1z inherently believes that DLSS is better, while I believe that nearly two decades after 1080p became the mainstream resolution, anything billed as a dedicated GPU should be able to handle it, and 4K has been theoretically possible at playable framerates since Nvidia's 30x0 series.

I personally don't want FSR or DLSS. It's a way to hoodwink the stupid, like those older shooters that "fixed" performance issues by narrowing the FOV. I don't know about you, but those shooters wound up giving me a splitting headache over time. Likewise, the vaseline-and-prayers of frame interpolation to generate "smoother" motion seems silly to me. This is especially true when hardware from the 30x0 generation is supposed to be way slower than the new 50x0 generation, but can still put up 144+ FPS at 2560x1440.


This is why I'm looking forward to the Nvidia vs AMD fight at the middle-high end cards... that used to just be the middle end. I don't care about the halo products because they can't justify their price for gaming, and they inevitably require so much power that they might as well be space heaters. If Nvidia and AMD can put out cards that actually perform at about the same level, for the $550 price, then maybe we've got some hope. My fear with this level of delay is that cards already in retailer hands will not remain boxed. My fear is that Nvidia setting the new pricing for these cards will mean that AMD is only confident in their product if it undercuts the Nvidia price, which means they think they aren't competing as a good product. This is already a generation where they said they want to focus on the future... but in the here and now we should be getting 3080 performance at 5060 pricing. What I see is two companies playing chicken over who releases the only competing high-end cards of this generation... and neither believes their card has enough inherent value to stand on its own.


Side note though, the team red and team green nonsense in this thread has been amusing to no end. I drink value-brand Kool-Aid, so I've got another 3-6 months before these start being discounted in any significant way, unless the pricing is so wonky that either brand cannot move off store shelves. That... should not happen so soon after the last time.
 
Yep, nVidia -10%. Selling point - level 9000. WP if true :D
they can't keep getting away with it!!

Well... they don't, but they seem to try it anyway every time (hopefully a wrong rumour)
 
Unfortunately you've failed to educate yourself on how the Moore's Law wall is precluding the generational advancements in graphics horsepower that we've become accustomed to. Upscaling and frame generation are required technologies if we want to see graphics fidelity continue to improve over and above what GPUs can offer. They are not hacks, they are not lazy, they are not fake frames, they are a solution to a fundamental physical constraint. Denying the facts doesn't change them.
Moore's law or Jensen's law? The claim that cards have to get more expensive while hardware can no longer improve is a bunch of BS. There is a way around the limits of monolithic die improvements; if AMD can make a chiplet GPU, then I'm sure Nvidia can figure it out.
It sounds like you're already buying into the marketing that upscaling and fake frames are a performance improvement, not just a clever trick to convince gamers to keep buying the next gen, which will be required to run DLSS 4 with even more fake frames.
And I'll stop calling it fake frames when Nvidia stops marketing fake frames as a performance uplift over the previous gen, but I expect reviewers will hype it up and call it a con on cards that don't have it.
 
they can't keep getting away with it!!

Well... they don't, but they seem to try it anyway every time (hopefully a wrong rumour)

it isn't a bug, it's a feature
 
In Fortnite it's a teeny tiny bit worse...
But even a cat would have problems noticing the difference in delay here... (Also 3 year old video, as I already mentioned)
I assume that's CPU bound, right? What does his fps look like? In that case it makes sense.
 