Thursday, February 6th 2025

AMD's Frank Azor Expects Upcoming Presentation to Fully Detail RDNA 4 GPUs
AMD debuted its first wave of RDNA 4 graphics cards—consisting of Radeon RX 9070 XT and RX 9070 (non-XT) models—at the beginning of January. At the time, press outlets and PC gaming hardware enthusiasts were equally flummoxed by Team Red's confusing presentation strategy. Invited attendees of CES 2025 were allowed to handle demonstration samples, but board partners appeared to be sworn to secrecy regarding technical specifications and performance figures. Miscellaneous leaks and rumors have seeped out since then—according to insiders, AMD was prepping its new Radeon product line for launch late last month. A rescheduled rollout is seemingly in the works, possibly landing next month. Benchlife (via VideoCardz) believes that a pre-launch showcase event is lined up for late February.
Following publication of the latest RDNA 4-related leaks, a brave soul engaged with AMD's Frank Azor on social media. Dee Batch, a loyal and long-term supporter of Radeon gaming hardware, sent a query to Team Red's chief architect of gaming solutions: "can we see the RDNA 4 full presentation? I honestly feel you can prevent many gamers from getting a GeForce RTX 5070 or RTX 5070 Ti GPU... Please, do not miss this opportunity to gain gamer mind share." Azor replied with a short sentence: "yes, full details are coming soon." This brief interaction attracted additional participants—VideoCardz noted that the Team Red executive was taking on board feedback about expectations surrounding RDNA 4 MSRPs. Late last month, Azor refuted rumors of Radeon RX 9070 XT pricing starting at a baseline of $899. NVIDIA has officially disclosed price points of $549 (RTX 5070) and $749 (RTX 5070 Ti)—AMD enthusiasts have their fingers crossed for competitive, still-TBA numbers.
Sources:
Dee_Batch Tweet, VideoCardz
Point is, why are people buying cards for those features if they can't hold up for even one generation? IMO it's trickery from nVIDIA to spur constant upgrades, and it pisses me off because people JUST DON'T GET IT.
It's really not a load of shit. Look at the 4070 Ti right now; it's slipping from 1440p to 1080p for raster/RT, if that.
Do people upscale ~1080p to 1440p (and sometimes 4k)? Yes they do. Does 'quality' 1440p upscaling take less raster work than native 1080p? Yes it does; the internal render isn't even 1080p. Is this what nVIDIA is banking on? YES IT IS.
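The render-resolution arithmetic behind that claim is easy to check. Below is a minimal Python sketch, assuming the commonly published ~1.5x per-axis scale for the FSR/DLSS "Quality" presets (exact ratios vary by game and preset):

```python
# Rough pixel-count comparison: internal render resolution of a "Quality"
# upscaling preset vs. native rendering. The 1.5x per-axis factor is the
# commonly published FSR/DLSS Quality ratio; treat it as an assumption.

def internal_res(width, height, per_axis_scale):
    # Internal (pre-upscale) render resolution for a given per-axis factor.
    return int(width / per_axis_scale), int(height / per_axis_scale)

def pixels(width, height):
    return width * height

native_1080p = pixels(1920, 1080)          # 2,073,600 px rendered natively

q_1440 = internal_res(2560, 1440, 1.5)     # ~1706 x 960 internal for 1440p Quality
q_4k   = internal_res(3840, 2160, 1.5)     # 2560 x 1440 internal for 4k Quality

print(f"native 1080p:           {native_1080p:,} px")
print(f"1440p Quality internal: {q_1440} -> {pixels(*q_1440):,} px")
print(f"4k Quality internal:    {q_4k} -> {pixels(*q_4k):,} px")
```

At that factor, 1440p "Quality" rasterizes roughly 1.6 million pixels per frame versus ~2.07 million for native 1080p, which is the "not even 1080p" gap the comment is pointing at; 4k "Quality" renders a full 1440p image internally.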
Look at the 7900 XTX, which is a 4k raster card. Do some people upscale 1080p RT to 1440p/4k? Yes they do and yes they will. And 1080p up-scaling IQ sucks with FSR3 imo.
N48 is clearly aiming for 1440p raster and 1080p RT (FSR4 to 1440p/4k). This might work now, but many options we've seen (GI/FG/etc.) require slightly more compute/RAM. This is why the 5080 falters.
Not all games are all these things.
The point I've been trying to get across about the 4090 is that on 3nm that will be an '80'-level card, and likewise AMD's 7900 XTX successor, and it has held up well for 1440p RT/4k raster. nVIDIA has kept this from happening at a lower tier through weird segmentation, hence the 5080 is still not a 1440p RT/4k raster card, and the 7900 XTX didn't have the needed arch improvements (high clocks/bandwidth/RT/ML up-scaling) for 1440p RT (or good 1080p up-scaling), but I think it will all mesh next gen with both having decent options.

It's absolutely fine if you're a mid-range guy; I also think that's a smart play. I also think those cards don't really exist right now; a better option is 18GB on 3nm (with more compute than N48/GB203). If a 24GB 5080 existed, I would say it is that... but it very purposely doesn't; nVIDIA does not want people to have a non-90 card that will last.

nVIDIA's 12GB cards need more of everything for 1080p. Most 16GB cards need more compute/RAM for 1440p/1080p RT (especially up-scaled higher), and the 5080 needs more RAM for higher-end settings to make sense long-term at 1440p. This is why this generation kind of sucks.

While AMD is striking a good balance (for 1080p RT/1440p raster -> 4k up-scaling right now), I still don't think it will be enough to make people in that 'mid-range' segment happy long-term. That's partially because I don't think its compute/RAM is enough for 1440p, and partially because up-scaling IQ from 1080p -> 4k quality is (with FSR3), and could be (with FSR4), less than ideal.
But how do you explain that to people? Are reviews going to show a ton of 1080pRT->4k upscaling (which almost nobody currently does) and a ton of pictures comparing FSR/DLSS4 1080p->4k IQ?
Will people understand that one may look better and the other may keep 60fps mins, but never the two shall meet? Which is better? I think neither is what you want.
Wait until you can have both (at similar/cheaper price), is what I would suggest.
So much of this depends on what is acceptable to an individual and the games they play. That's why it's not an argument; it's a conversation. I can prove my point and show it, though. I'm not saying it because of any affiliation.
Settings, upscale quality, running RT or not, etc; how people judge performance...it all varies.
So while people can make examples contrary to what I'm saying, I prefer explaining the worst case and the best experience (60fps mins and one-step up-scaling).
This way people don't get fucked over by planned obsolescence from nVIDIA, and/or by even well-matched configurations (such as N48 may be from AMD) quickly becoming outdated for what ($) they're being sold to do.
This is why N48 should be cheaper; but you can also extrapolate that they're conceivably setting this price so as not to (comparatively) devalue the next generation, which can then also carry high prices (~$1000 for a ~4090-class card).
If N48 were priced lower (as it should be, given these deficiencies and/or its inability to future-proof with the currently required optimal configurations), those next-gen cards would also have to be priced lower (~$750-800).
Likewise, the 192-bit cards that replace these cards would have to land in the range we've come to expect from AMD (~$400-600) if these did. This way they can also be ~$650-750, like these cards.
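To make that pricing argument easier to follow, here is a minimal sketch laying out the two ladders described above; every figure is the commenter's own speculation (or the rumored range for N48), not a confirmed MSRP:

```python
# Hypothetical price ladders from the comment above -- speculation, not confirmed MSRPs.

# What the commenter expects if N48 launches at the rumored ~$650-750:
as_priced = {
    "N48 (256-bit, this gen)":       "~$650-750",
    "next-gen ~4090-class":          "~$1000",
    "next-gen 192-bit replacements": "~$650-750",
}

# What the commenter argues the ladder should look like if N48 were priced
# to match its limitations:
as_argued = {
    "N48 (256-bit, this gen)":       "below ~$650-750",
    "next-gen ~4090-class":          "~$750-800",
    "next-gen 192-bit replacements": "~$400-600",
}

for tier in as_priced:
    print(f"{tier:32} expected: {as_priced[tier]:18} argued: {as_argued[tier]}")
```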
I'm a futurist: I look at things and I plan, based on provable trends. You can call it speculation if you'd like. To me it's called being prepared so people don't get the rug pulled out from under them.
I also call bullshit when I (and perhaps others, although I don't really see many people talking about it) see companies making plays to skew results and/or pad margins for each tier. People deserve to know.
Even if many don't completely understand right now, because a lot of these technologies and ways to judge performance (like up-scaling/RT [perf/IQ]) are so new to a lot of people's experience.
Most will eventually understand, I hope, even if it takes DF et al. videos, but they won't ever learn until someone starts to explain it all to them so they can begin to look at things this way.
With respect, you are a perfect example. IDC if you listen to me; listen to DF and everyone else as they explain this going forward. I'm used to it; just remember this conversation happened earlier.
I have fond memories of trying to play Oblivion at like 12 FPS, Half-Life not running because we didn't have enough RAM, etc...
I admit I will look once every time, but I really do not see the value in using it for real gaming.
I run my games at 4K now, so it's time for a new AMD GPU: either the 9070 XT or a 7900 XTX, which is already slowly dropping in price. I will see once others have bought the card and I can see it with my own eyes.
Which one it will be has to wait until I see the 9070 XT running on a few friends' machines.
And I might upgrade the CPU as well, if the upcoming one really gives a nice boost.
I kinda know for sure already that it is gonna be another AMD-only system.
As I've clearly shown, a 4080 already has trouble upscaling 1080p RT -> 4k; the 5080 can't do 1440p RT (or 4k raster) well enough to say it will last any length of time.
N48 surely won't be much different from a 4080, and the 7900 XTX's RT (relatively) sucks versus its raster, so it is also a ~1080p RT card, if that. The 7900 XTX has FSR3, whose up-scaling from 1080p -> 4k kinda sucks.
Like I said, maybe they can make it work (with N48/FSR4), but it could be at the cost of IQ. Again, 16GB of RAM is also going to be a limitation. If you're just running raster, I understand the 7900 XTX.
Like I said, some of you just don't get it. I know some do, and that's why it's like :banghead:, because I care and want people to understand, but truly so many just do not... which apparently is a TON of people.
That said, it's not a war. Be happy with what you buy and I hope you enjoy it. I just don't want anyone to be blind-sided over the next couple years, or even sooner. That's my only goal.
The point is, I'm hoping some of the people with a platform (who actually need to fill airtime) actually do it. Some are certainly more eloquent/entertaining conveyors of information than I am.
The problem is, many of them have an agenda and/or have to conform to certain 'rules' provided by nVIDIA. Because of this, it truly is difficult without doing it independently... in that, you are correct.