Thursday, February 6th 2025

AMD's Frank Azor Expects Upcoming Presentation to Fully Detail RDNA 4 GPUs
AMD debuted its first wave of RDNA 4 graphics cards—consisting of Radeon RX 9070 XT and RX 9070 (non-XT) models—at the beginning of January. At the time, press outlets and PC gaming hardware enthusiasts were equally flummoxed by Team Red's confusing presentation strategy. Invited attendees of CES 2025 were allowed to handle demonstration samples, but board partners appeared to be sworn to secrecy regarding technical specifications or performance figures. Miscellaneous leaks and rumors have seeped out since then—according to insiders, AMD was prepping its new Radeon product line for launch late last month. A re-scheduled rollout is seemingly in the works, possibly on next month's calendar entry. Benchlife (via VideoCardz) believes that a pre-launch showcase event is lined up for late February.
Following publication of the latest RDNA 4-related leaks, a brave soul engaged with AMD's Frank Azor on social media. Dee Batch, a loyal and long-term supporter of Radeon gaming hardware, sent a query to Team Red's chief architect of gaming solutions: "can we see the RDNA 4 full presentation? I honestly feel you can prevent many gamers from getting a GeForce RTX 5070 or RTX 5070 Ti GPU...Please, do not miss this opportunity to gain gamer mind share." Azor replied with a short sentence: "yes, full details are coming soon." This brief interaction attracted additional participants—VideoCardz noted that the Team Red executive was taking on board feedback about expectations surrounding RDNA 4 MSRPs. Late last month, Azor refuted rumors of the Radeon RX 9070 XT pricing starting at a baseline of $899. NVIDIA has officially disclosed price points of $549 (RTX 5070) and $749 (RTX 5070 Ti)—AMD enthusiasts have their fingers crossed in hope of TBA competitive numbers.
Sources:
Dee_Batch Tweet, VideoCardz
81 Comments on AMD's Frank Azor Expects Upcoming Presentation to Fully Detail RDNA 4 GPUs
Topic for another thread, I suppose.
Yeah, you and I think the same in this regard, I think.
I'm just saying this card's raster performance is aimed toward 1440p, and that's where most people will probably use it.
Nothing really exists for good native 4k outside a 4090/5090 (1440pRT) or 7900xtx (raster, 1080pRT). People can use it ofc, just saying it could be a limitation (VRAM/raster) and it's clearly not the aim.
Because of this (and the inferred up-scaling capability [the difference between total compute and the ROP limitation]), I would assume the general quality of scaling could be limited to something similar to DLSS3. Obviously you could use that for 1080p->4k, but its real strength wasn't really beyond one step (1080p->1440p or 1440p->4k) imho.
DLSS4 1080p->4k is much better than DLSS3. That is literally what Huang has sold this generation on because none of the non-90 cards are true 4k or 1440pRT cards.
That means upscaling from at least 1440p, if not 1080p. This was clearly pushed in both the CES presentation and the '5070 is a 4090' (by up-scaling 1080p->4k plus 4x frame gen).
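To put rough numbers on that pitch (a back-of-the-envelope sketch in Python; the ~1/16 figure is just my arithmetic, not an official NVIDIA number):

native_1080p = 1920 * 1080                    # pixels actually rendered per frame
native_4k = 3840 * 2160                       # pixels displayed per frame at 4k
upscale_fraction = native_1080p / native_4k   # 0.25 -> 1/4 of the pixels rendered
framegen_fraction = 1 / 4                     # 4x frame gen: 1 rendered frame in 4
print(upscale_fraction * framegen_fraction)   # 0.0625 -> ~1/16th of native-4k work

So the headline parity comes from doing roughly a sixteenth of the native rendering work per displayed frame, with upscaling and frame gen filling in the rest.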
Now, I would argue <45TF/12GB means 5070 is going to suck at 1080pRT (especially 1440p) and maybe even raster soon regardless, which is the whole point of N48 (to succeed where it falters), but I digress.
All I was trying to say is I think nVIDIA is pushing for higher-quality scaling now (because it doesn't make sense for most markets to render native right now). AMD targeted the market it does, more-or-less.
I guess we'll have to see how 1080p->1440p/4k performance is on FSR4/DLSS4 comparatively. I think FSR will end up similar to DLSS3, in which case I wouldn't personally love using it for 4k. Others might.
Again, I'm sure there are instances where 1440pRT might be playable and the scale to 4k will look just fine (given its performance, and 1440p->4k has always looked fine imho).
I wouldn't be surprised if it's fairly quickly relegated to 1080pRT, is my point.
If 1080p->4k upscaling is less than stellar (IQ<DLSS4), one might consider other options and/or stick to 1080p->1440p...which is probably what they expect given how the rest of the card should typically perform.
This is why this whole situation sucks for people like me (who write, not do videos*). There are SOOO many variables and things to consider now, and trying to have it all gel for people is difficult w/o lots of words.
Especially for people that haven't used these things to any great extent and/or don't understand the aims of each product. I'm sure it's super confusing to a TON of people (and I bet that makes nVIDIA happy).
I get that people don't want to read all the words; I get some people probably don't even want to watch long videos to understand it all. That is why this whole situation is a nightmare when trying to help.
*Thought of a video one of the Steves did (the framegen one) the other day. It's still a lot of words and difficult to get across, even if you do videos. Writing it just extra sucks, is my point.
www.tomshardware.com/pc-components/gpus/amd-research-suggests-plans-to-catch-up-to-nvidia-using-neural-supersampling-and-denoising-for-real-time-path-tracing
We'll see how the math holds up.
Also, my bad for assuming people would instantly be aware of what Lisa Su mentioned recently (chronically online).
It will make sense as more cards launch and time moves forward a bit. You may very well be the type that invests in a 4090-level card next time you upgrade. So will I! Others, something slightly better than N48, imho.
Because the latter will likely be similar in performance to the next-gen consoles. Because of that, everything not up to that tier will be relegated to 1080pRT (or less) and/or 1440p raster (or less) at high settings.
4090 prolly still 1440pRT ~60fps mins. That's what I'm trying to get across. They're trying to be the 4090 of the mid-range (1080p/1440p where the 4090 is 1440p/4k). This requires good upscaling IQ, esp 1080p->4k.
IMHO that will require >60TF and 18GB of VRAM (long-term), but they might be able to sell it on that scenario given what's currently available. Does that make sense? I honestly understand it's confusing. Marketing is marketing, but like I say...I absolutely expect there to be *some* 1440pRT (where a 4070 Ti/5070 may not quite make the cut, it does), but soon relegated to 1080pRT bc it might be asking too much.
This is absolutely where the quality of FSR will come into play. The same is obviously true for the 5070 (which will soon mostly be a 1080p card, +/- RT, if that imo). DLSS4 *does* make 1080p->1440p/4k look okay imho.
Still doesn't mean the 5070 won't run out of VRAM/raster, especially at native 1440p. I think 16GB will become the 1080p standard pretty soon. It kinda already is (in some titles, like MH with high-res textures loaded).
N48 has more of both which could squeak them by (for now), but the question then becomes upscale quality.
1440p->4k will probably be okay; 1080p->1440p too. This was generally true for DLSS3 as well (imho; all of this is subjective), so I don't know why it wouldn't.
If it can't match DLSS4, which I doubt it can (right now), and instead lands around DLSS3, it will work/run okay, but w/ IQ compromises that we can then nitpick to death at 1080p->4k. That's all I'm trying to get across. :p
The point to all of this is there is SOOO much more than just looking at a FPS chart these days. If you do, in that instance AMD might look like they struck the right balance, but it's much more complicated.
Just like when I segment these tiers, some people might say 'x card can run higher resolution sometimes'. That's true, I'm just saying this is where I think things will shake out long-term for high-end games.
What's your definition of 4k?™ (nVIDIA, you can have that one for free). I didn't really trademark it...yet.
To me 4k requires a 1440p upscale; I can get behind some that argue ~1270p depending on viewing distance ('balanced' upscaling to 4k). I would not call 1080p upscaling 4k, but nVIDIA is trying to sell it.
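For reference, the per-axis scale math behind those figures, sketched in Python (the 0.667/0.58/0.5 factors are the commonly cited DLSS-style quality/balanced/performance ratios; I'm assuming them here since FSR4's exact presets are TBA):

output_height = 2160                                                # 4k output
presets = {"quality": 0.667, "balanced": 0.58, "performance": 0.5}  # per-axis factors
for name, factor in presets.items():
    print(name, round(output_height * factor))                      # internal render height
# quality -> ~1440p, balanced -> ~1253p (the ~1270p ballpark), performance -> 1080p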
Likely, soon, AMD will be trying to sell it too, perhaps with less success. We shall see. I would love to be surprised by AMD's IQ, and a hopefully minimal performance hit when using it. I'm not holding my breath, though. Not yet.
Maybe UDNA, especially when there is more spare horsepower on tap.
As far as RT goes, I think AMD's implementation will be excellent; perhaps better than nVIDIA's.
The question again becomes how much that performance can affect playable resolutions/scales/framerates (for these products in particular), especially long-term.
Again, I think we'll need slightly more pure grunt/ram to do what these products are aiming to do long-term. They may succeed short-term. This is also true for every nVIDIA part outside 4090/5090.
Edit: I'm also not the type to use the word "invest" to describe buying something that depreciates faster in value than a used electric car, purely for personal entertainment. Speculation.
(I went with that instead of City of Angels/Wings of Desire. Still not sure I made the right call. Sometimes my references are Far Away, So Close!)
I absolutely love your comparison to streaming quality. 100% nailed it. Some are higher bit-rate than others, different compression algos, codecs etc. Some are good, some are bad. Again, what's 'good' 4k? IDFK.
You get what you get and I guess you deal with it. Same thing here; but at least we have multiple companies whose options we can compare (rather than being forced into one by a video host).
I think there will be a big back and forth over IQ vs perf. For instance, I truly think DLSS4 was created to increase ~1080p upscale IQ, but it also relegated some cards below old performance thresholds (60fps mins).
There is no standard, and in that respect we are at the mercy of whatever AMD/nVIDIA decide to do to strike that balance. Like I've said, how long before some are switching .dlls for better perf rather than IQ?
That's mostly an exaggeration, but hopefully you get my point. They can move the goalposts however they please; it's another metric/variable they can screw with to require and/or outdate products.
Your question about input latency is fair, but I don't think any of them are/will be *that* bad (JMO). At least not purely up-scaling/rt. Maybe FG depending on how severe these companies lean into it.
I don't like how nVIDIA's (new) antilag thing appears to work wrt IQ, but that's a matter of preference. I think the main thing to get right is frame-metering with all this stuff going on; so it's at least consistent.
It's kind of why I think (at least currently) >3x framegen doesn't make sense (rn). They can make 3x work, but not 4x, and 4x kinda makes things look bad (from what I've seen). Lossless Scaling has this problem too.
This is why I think my aim will be 240hz with 3x framegen (1440p @ 80hz upscaled to 4k/240). I think a lot of people will end up going and/or wanting to go this route (for many games).
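The arithmetic behind that target, sketched in Python (nothing fancy; the base framerate is just the refresh divided by the framegen multiple):

refresh_hz = 240
framegen_multiple = 3                      # 3x: one rendered frame plus two generated
base_fps = refresh_hz / framegen_multiple
print(base_fps)                            # 80.0 -> render 1440p at ~80 fps, upscale to 4k/240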
Might require some beastly hardware for games with heavy RT (certainly PT), so perhaps a lower target some/most of the time.
Hopefully the latency doesn't suck too much for all this stuff (wrt how they are supposed to be played baseline; not just as an option). We shall see. I hope developers don't use it as a crutch too often.
"In the service of"...
How many beautiful things must be sacrificed at the altar of convenience?
How many people will never see LoA in a theater and truly recognize and appreciate its splendor? Some would argue too many. And what diminishing effect might that have on those it could have inspired to strive for something similar?
In a way, lesser-quality/upscaled 4k could be like that. Because with the digital medium, the ability of an outside party to decide what quality is 'acceptable' and control what people see is bullshit.
How many native 1440p/4k assets will be sacrificed to the up-scaling Gods? Probably a ton. You can argue that may allow them to make more and/or use more systems, but that's not the point.
The craftsmanship and constraints of the former reality were something to appreciate. There may be less of that (at least that most people ever see). I think that's a shame, especially if the upscale quality isn't great.
I apologize for waxing poetic, but this greater theme is something that matters to myself and many others quite a lot.
I imagine many people told that guy to get with the times and adapt; that it brought accessibility through convenience. They weren't wrong, and in similar fashion that can be applied in many ways today.
In a way though, what about the type that actually experience things the way they were meant to be experienced, rather than in a more convenient and/or accessible (and conceivably lesser) way?
Those people probably live the best lives, imo. They probably get the most from those things, and they may often be impacted more; perhaps taught something, and/or given something to aspire to, more often.
The type perhaps less-likely to take art for granted. They don't settle for "we have ___ at home", which literally encapsulates my argument. They don't believe a home theater is 'good-enough', certainly not a phone.
Because it's still not a theater.
No, they go to the theater. If they haven't all closed because, instead of buying a ticket, many watched the film on their phone. And when they do close, people like myself (and others like me) get very upset.
People should experience things in the best possible way they can. That's the point. And now sometimes people don't even have the option due to some people choosing convenience. For 'settling'.
People should not settle for a lesser experience if a better one is possible and they are able, and shouldn't encourage an atmosphere/ecosystem that promotes the "less than" or 'good-enough' experience.
I want that for everyone; on any scale and regarding any type of thing. That's all. Whenever I see the LoA thing I get really pissed off. So do a lot of people that mourn the death of theaters and/or even physical media.
This is why streaming (let alone its quality) and things like crappy upscaling (think AI remasters) might truly spur the second death of greater thought, let alone art and its greater appreciation. I'm not even joking.
www.nytimes.com/2015/12/20/arts/television/streaming-tv-isnt-just-a-new-way-to-watch-its-a-new-genre.html
www.denofgeek.com/tv/how-online-streaming-is-changing-tv-storytelling/
It's why the prospect of playing certain games on something like the Steam Deck doesn't really interest me beyond the technical accomplishment. I also wonder if the resurgence of handhelds is less about providing a unique mobile experience (a la Nintendo) and more about having to do something with your less powerful chips. Like that grey zone the Strix Halo seemingly finds itself in.
This is one of the many things I don't get about modern society. Why do people demand constant entertainment, even if it diminishes the value of it?