Thursday, February 6th 2025

AMD's Frank Azor Expects Upcoming Presentation to Fully Detail RDNA 4 GPUs

AMD debuted its first wave of RDNA 4 graphics cards—consisting of Radeon RX 9070 XT and RX 9070 (non-XT) models—at the beginning of January. At the time, press outlets and PC gaming hardware enthusiasts were equally flummoxed by Team Red's confusing presentation strategy. Invited attendees of CES 2025 were allowed to handle demonstration samples, but board partners appeared to be sworn to secrecy regarding technical specifications and performance figures. Miscellaneous leaks and rumors have seeped out since then—according to insiders, AMD was prepping its new Radeon product line for launch late last month. A rescheduled rollout seems to be in the works, possibly landing next month: Benchlife (via VideoCardz) believes that a pre-launch showcase event is lined up for late February.

Following publication of the latest RDNA 4-related leaks, a brave soul engaged with AMD's Frank Azor on social media. Dee Batch, a loyal and longtime supporter of Radeon gaming hardware, sent a query to Team Red's chief architect of gaming solutions: "can we see the RDNA 4 full presentation? I honestly feel you can prevent many gamers from getting a GeForce RTX 5070 or RTX 5070 Ti GPU...Please, do not miss this opportunity to gain gamer mind share." Azor replied with a short sentence: "yes, full details are coming soon." This brief interaction attracted additional participants—VideoCardz noted that the Team Red executive was taking on board feedback about expectations surrounding RDNA 4 MSRPs. Late last month, Azor refuted rumors of Radeon RX 9070 XT pricing starting at a baseline of $899. NVIDIA has officially disclosed price points of $549 (RTX 5070) and $749 (RTX 5070 Ti)—AMD enthusiasts have their fingers crossed, hoping for competitive numbers.
Sources: Dee_Batch Tweet, VideoCardz

81 Comments on AMD's Frank Azor Expects Upcoming Presentation to Fully Detail RDNA 4 GPUs

#76
alwayssts
AusWolf: That's a load of ***. No offense.


No, I am not the type. The 4090 is way out of my comfort zone of spending for a toy. Even the 9070 XT is, but since I'm not planning to upgrade for the next 2-3 generations, I'll swallow it just this once.

Edit: I'm also not the type to use the word "invest" to describe buying something that depreciates faster in value than a used electric car, purely for personal entertainment.


Speculation.
Didn't see this post earlier. I really don't want to fight, because when I tell people stuff like this they almost *never* remember. I look like an ass now for going out on a limb, and they don't care when I'm right soon afterwards. It's the same way the 3080 hasn't held up well for 1440p raster or 1080p RT; the 4080 isn't holding up well for native 1440p RT (barely native 1080p; 60 fps in that scenario is a good preview of N48) and suffers in 1080p->4K upscaling because of the DLSS4 performance hit (<60 fps); and the 4070 Ti struggles for native/upscaled 1080p RT. All true, and all things I explained to people before they happened (much to them saying the same things you are; again, I was right). Just remember, it's much easier to make current observations than to extrapolate the (sometimes even near) future. Even those current observations are difficult for some to understand.

Point is, why are people buying cards for those features if they can't hold up for even one generation? IMO it's trickery from nVIDIA to spur constant upgrades, and it pisses me off because people JUST DON'T GET IT.

It's really not a load of shit. Look at the 4070 Ti right now: slipping from 1440p to 1080p for raster/RT, if that.
Do people upscale ~1080p to 1440p (and sometimes 4K)? Yes they do. Does 'quality' 1440p upscaling take less raster work than native 1080p? Yes it does (see the quick numbers below). Is this what nVIDIA is banking on? YES IT IS. The internal render isn't even 1080p.
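To put rough numbers on that, here is a minimal sketch comparing internal render resolutions against native 1080p, assuming the commonly cited per-axis scale factors for DLSS/FSR-style quality modes (an assumption; exact factors vary by vendor and version):

```python
# Rough pixel-count comparison: upscaler internal resolution vs. native 1080p.
# The per-axis scale factors below are the commonly cited values for
# Quality/Balanced/Performance modes (an assumption, not vendor-confirmed).

MODES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_res(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    """Internal render resolution for a given output size and per-axis scale."""
    return round(out_w * scale), round(out_h * scale)

def megapixels(w: int, h: int) -> float:
    return w * h / 1e6

native_1080p = megapixels(1920, 1080)  # ~2.07 MP
for mode, scale in MODES.items():
    w, h = internal_res(2560, 1440, scale)
    print(f"1440p {mode}: {w}x{h} internal (~{megapixels(w, h):.2f} MP) "
          f"vs. native 1080p (~{native_1080p:.2f} MP)")
```

Quality mode at a 1440p output renders roughly 1707x960 (about 1.64 MP versus 2.07 MP for native 1080p), so a card upscaling to 1440p is indeed rasterizing fewer pixels than one running native 1080p.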
Look at 7900xtx, which is a 4k raster card. Do some people upscale 1080pRT to 1440p/4k? Yes they do and yes they will. And 1080p up-scaling IQ sucks with FSR3 imo.
N48 is clearly aiming for 1440p raster and 1080p RT (FSR4 to 1440p/4K). This might work now, but many options we've seen (global illumination, frame generation, etc.) require slightly more compute/RAM. This is why the 5080 falters.
Not all games are all these things.

The point I've been trying to get across about the 4090 is that on 3 nm that will be an '80'-level card (likewise AMD's 7900 XTX successor), and it has held up well for 1440p RT/4K raster. nVIDIA has kept this from happening at a lower tier through weird segmentation, hence the 5080 is still not a 1440p RT/4K raster card, and the 7900 XTX didn't have the needed architectural improvements (high clocks/bandwidth/RT/ML up-scaling) for 1440p RT or good 1080p up-scaling. I think it will all mesh next-gen, with both vendors having decent options.

It's absolutely fine if you're a mid-range guy; I also think that's a smart play. I also think those cards don't really exist right now; a better option is 18 GB on 3 nm (with more compute than N48/GB203). If a 24 GB 5080 existed, I would say it is that...but it very purposely doesn't; nVIDIA does not want people to have a non-90 card that will last. nVIDIA's 12 GB cards need more of everything for 1080p. Most 16 GB cards need more compute/RAM for 1440p/1080p RT (especially upscaled higher), and the 5080 needs more RAM for higher-end settings to make sense long-term at 1440p. This is why this generation kind of sucks.

While AMD is striking a good balance (for 1080p RT and 1440p raster->4K up-scaling right now), I still don't think it will be enough to make people in that 'mid-range' segment happy long-term: partially because I don't think its compute/RAM is enough for 1440p, and partially because up-scaling IQ from 1080p->4K quality is (with FSR3), and could be (with FSR4), less than ideal.

But how do you explain that to people? Are reviews going to show a ton of 1080p RT->4K upscaling (which almost nobody currently does) and a ton of pictures comparing FSR/DLSS4 1080p->4K IQ?
Will people understand that one may look better while the other may keep 60 fps minimums, but never the twain shall meet? Which is better? I think neither is what you want.
Wait until you can have both (at similar/cheaper price), is what I would suggest.

So much of this depends on what is acceptable to an individual and the games they play. That's why it's not an argument; it's a conversation. I can prove my point and show it, though. I'm not saying it because of any affiliation.
Settings, upscaling quality, running RT or not, etc.; how people judge performance...it all varies.
So while people can make examples contrary to what I'm saying, I prefer explaining the worst case and the best experience (60 fps minimums and one-step up-scaling).
This way people don't get fucked over by planned obsolescence from nVIDIA, and/or by even well-matched configurations (such as N48 may be from AMD) quickly being outdated for what ($) they're being sold to do.
This is why N48 should be cheaper; but you can also extrapolate that they're conceivably setting this price so as not to (comparatively) devalue the next generation from also having high prices (~$1,000 for a ~4090).
If N48 were priced lower (as it should be, given these deficiencies and/or its inability to future-proof with the currently required optimal configurations), those next-gen cards would also have to be priced lower (~$750-800).
Likewise, the 192-bit cards that replace these would have to be in the range we've come to expect from AMD (~$400-600) if these were priced that way. Instead, they can also be ~$650-750, like these cards.

I'm a futurist: I look at things and I plan, based on provable trends. You can call it speculation if you'd like. To me it's called being prepared so people don't get the rug pulled out from under them.
I also call bullshit when I (and perhaps others, although I don't really see many people talking about it) see companies making plays to skew results and/or pad margins for each tier. People deserve to know.
Even if many don't completely understand right now, because a lot of these technologies and ways of judging performance (like up-scaling/RT performance and IQ) are so new to a lot of people's experience.
Most will eventually understand, I hope, even if it takes DF et al. videos, but they won't ever learn until someone starts explaining it all to them so they can begin to look at things this way.

With respect, you are a perfect example. IDC if you listen to me; listen to DF and everyone else as they explain this going forward. I'm used to it; just remember this conversation happened earlier.
#77
Jtuck9
JustBenching: If FSR 4 provides better-than-native image quality at the same framerate, it will be a no-brainer to use, just like DLSS.
I think for a lot of people it's still a no-brainer if you want "better" performance and are perhaps a tad more forgiving when it comes to ghosting etc. Especially if you've paid a fair whack for your computer.

I have fond memories of trying to play Oblivion at like 12 FPS, Half-Life not running because we didn't have enough RAM, etc...
#78
Bronan
alwayssts: I think FSR4 will end up being very similar to DLSS3. That's just a guess (by kinda figuring out what AMD is doing with their architecture), but I get the feeling it will probably find support.
I could understand instances of developers not wanting to incorporate the shader model scaling from FSR3, but given both FSR4/DLSS will likely be very similar, I think they'll both find similar support.

At least in new titles.

...I hope. :p


You really should, though. Because it truly is the future. You don't have to like it, but at some point you're probably going to need to learn to accept it. It is important that games support it.
The fact that many more games currently support nVIDIA's implementation matters to a degree. But as we get into more instances of people upscaling from 1080p (which I suspect will become very common relatively soon, given most cards won't really handle 1440p+ RT very well), what matters is that the best implementations from all companies, in both performance and image quality, are supported.

This is why nVIDIA buffed up DLSS IQ with DLSS4. They know it just doesn't make sense for nearly anyone to do 1440p+ RT natively, especially right now. Hence, they improved the quality of 1080p upscaling.
A small relative price to pay to increase the market for the features beyond those willing to shell out a ridiculous amount of money for a GPU to render that stuff natively.

Likewise, AMD is probably shooting for a similar thing, but just targeting up-scaling to 1440p (in terms of getting IQ okay) rather than 4K, because N48 is a card targeted at 1440p raster.

I would argue there are people out there with a 7900 XT/XTX who would like to see better ~1080p->4K up-scaling quality, or even ~1080p->1440p, but I wonder if AMD will kick the can until UDNA. That would suck.
LoL, I do not need that silly upscaling, nor will I use it for a long time, maybe even till I die.

I admit I will take a look every time, but I really do not see the value in using it for real gaming.

I run my games at 4K now, so it's time for a new AMD GPU: either the 9070 XT or a 7900 XTX, which is already slowly dropping in price. We'll see once others have bought the card and I can see it with my own eyes.
Which one it will be has to wait until I see the 9070 XT running on a few friends' machines.
And I might upgrade the CPU as well, if the upcoming one really gives a nice boost.
I kinda know for sure already it's gonna be another AMD-only system.
#79
alwayssts
Bronan: LoL, I do not need that silly upscaling, nor will I use it for a long time, maybe even till I die.

I admit I will take a look every time, but I really do not see the value in using it for real gaming.

I run my games at 4K now, so it's time for a new AMD GPU: either the 9070 XT or a 7900 XTX, which is already slowly dropping in price. We'll see once others have bought the card and I can see it with my own eyes.
Which one it will be has to wait until I see the 9070 XT running on a few friends' machines.
And I might upgrade the CPU as well, if the upcoming one really gives a nice boost.
I kinda know for sure already it's gonna be another AMD-only system.
I still don't think you understand. If you're going to be using RT, pretty soon it won't be an option unless you pay big bucks or are generally willing to run native 1080p.
As I've clearly shown, a 4080 already has trouble upscaling 1080p RT to 4K; the 5080 can't do 1440p RT (or 4K raster) well enough to say it will last any length of time.
N48 surely won't be much different from the 4080, and the 7900 XTX's RT (relatively) sucks versus its raster, so it is also a ~1080p RT card, if that. And the 7900 XTX has FSR3, whose 1080p->4K up-scaling kinda sucks.
Like I said, maybe they can make it work (with N48/FSR4), but it could be at the cost of IQ. Again, 16 GB of RAM is also going to be a limitation. If you're just running raster, I understand the 7900 XTX.

Like I said, some of you just don't get it. I know some do, and that's why it's like :banghead:, because I care and want people to understand, but truly so many just do not...which apparently is a TON of people.

That said, it's not a war. Be happy with what you buy and I hope you enjoy it. I just don't want anyone to be blindsided over the next couple of years, or even sooner. That's my only goal.
#80
Jtuck9
alwayssts: I still don't think you understand. If you're going to be using RT, pretty soon it won't be an option unless you pay big bucks or are generally willing to run native 1080p.
As I've clearly shown, a 4080 already has trouble upscaling 1080p RT to 4K; the 5080 can't do 1440p RT (or 4K raster) well enough to say it will last any length of time.
N48 surely won't be much different from the 4080, and the 7900 XTX's RT (relatively) sucks versus its raster, so it is also a ~1080p RT card, if that. And the 7900 XTX has FSR3, whose 1080p->4K up-scaling kinda sucks.
Like I said, maybe they can make it work (with N48/FSR4), but it could be at the cost of IQ. Again, 16 GB of RAM is also going to be a limitation. If you're just running raster, I understand the 7900 XTX.

Like I said, some of you just don't get it. I know some do, and that's why it's like :banghead:, because I care and want people to understand, but truly so many just do not...which apparently is a TON of people.

That said, it's not a war. Be happy with what you buy and I hope you enjoy it. I just don't want anyone to be blindsided over the next couple of years, or even sooner. That's my only goal.
If you end up picking up one of these cards, I'd be interested in seeing you do some comparisons and point out the limitations of the hardware.
#81
alwayssts
I really should. I just enjoy crunching the data and learning other perspectives more than actually testing...I think it's more interesting, and it keeps me from getting lost in the sauce.

The point is, I'm hoping some of the people with a platform (who actually need to fill airtime) actually do it. Some are certainly more eloquent/entertaining conveyors of information than I am.

The problem is, many of them have an agenda and/or have to conform to certain 'rules' set by nVIDIA. Because of that, it truly is difficult without doing it independently...in that, you are correct.