Saturday, March 1st 2025

Early Leak Claims AMD Radeon RX 9060 XT Might Reach NVIDIA GeForce RTX 4070 Territory
Judging by the current state of gaming GPUs, it might appear to some that true budget-class cards are a thing of the past. That said, it is almost certain that both NVIDIA and AMD are cooking up entry-level GPUs to cater to folks who can't shell out the astoundingly high prices that modern mid-range and high-end GPUs command, with AMD having already confirmed that RX 9060-class cards will launch sometime in Q2 of this year. Previous leaks have indicated that the RX 9060 will likely hit the scene with 12 GB of GDDR6 VRAM, whereas its XT sibling will boast an additional 4 GB. NVIDIA is also expected to drop the RTX 5060 and 5060 Ti sometime towards the end of this month, likely in 8 GB and 16 GB flavors of the shinier GDDR7 spec.
Now, a fresh leak by Moore's Law Is Dead (MLID) claims that the Radeon RX 9060 XT will outperform the RTX 4060 Ti, slotting in between the RTX 4060 Ti and the Radeon RX 7700 XT. Moreover, he added that AMD may even push clocks to bring the card closer to RTX 4070 territory - a sweet position to hold indeed. Regarding the launch date, MLID expects the card to hit the arena sometime in April. Of course, as with all leaks and rumors, take this information with a grain of salt, especially considering that MLID's assertions are sourced from a single party. The RTX 5060/Ti is expected to be priced in the $400-$500 range, which means the RX 9060 XT will likely have to land at the lower end of that range to make for a compelling value proposition.
Sources:
Moore's Law is Dead, Spotted by Notebookcheck
I remain skeptical.
Are you the guy buying 27" 4K monitors? Wasted PPI in my opinion. Who sits THAT close to their screen?
Cards like the 5090 and 4090 should NOT exist; they are very customer- and market-unfriendly, as they take away a ton of wafer production volume for a small number of chips and give developers an excuse to optimize their games much more poorly. Chips above 400-450 mm² should be left for the "professional" GPUs.
Perhaps they should go back to naming it "Titan", but I don't think the market wants these products gone. Considering the margins on these cards, they do largely fund the development of the whole lineup.
However, I use a 27" 4K monitor at work, and it's way better than the two 1080p monitors I used to consider essential. Text is super sharp even with a lot of it on the screen. And even a potato iGPU can put text on a 4K monitor.
There is a reason the 27" 1440p display rules the roost... it's kinda perfect. Yeah, you can start arguing for 42" 4K monitors/TVs on your desk, but to me that's a lil' extreme bc it's too big to see everything. JMO.
This is why I'll never understand running native 4K for gaming, given the extra power/perf needed. I run 1440p and upscale it to 4K, and I honestly don't think *most* people need more.
I sit 5-6' from a 65" 4K OLED, depending on whether I'm doing the gamer meme. I'd reckon you'd have to sit closer than 5' to notice a difference from native 1440p... and most don't do that bc then you can't see the whole screen.
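For the curious, here's a quick back-of-the-envelope check of that in Python, assuming the common rule of thumb that 20/20 vision resolves roughly one arcminute (a simplification - real perception of contrast and aliasing is messier - but fine for napkin math):

```python
import math

def ppi(diag_in, w_px, h_px):
    # Pixel density: diagonal pixel count divided by diagonal inches.
    return math.hypot(w_px, h_px) / diag_in

def resolvable_out_to_ft(ppi_val, acuity_arcmin=1.0):
    # Distance (feet) beyond which one pixel subtends less than the
    # eye's resolving power (~1 arcminute for 20/20 vision).
    pitch_in = 1.0 / ppi_val
    dist_in = pitch_in / math.tan(math.radians(acuity_arcmin / 60.0))
    return dist_in / 12.0

for name, (w, h) in [("4K", (3840, 2160)), ("1440p", (2560, 1440))]:
    p = ppi(65, w, h)
    print(f'65" {name}: {p:.0f} PPI, pixel grid resolvable out to '
          f"~{resolvable_out_to_ft(p):.1f} ft")
```

That puts a 65" 4K grid below the one-arcminute threshold past roughly four feet, and a 1440p grid past roughly six - the same ballpark as the estimate above.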
As an obscene perfectionist, I think my setup is perfect. Obviously no two people are the same, but I feel solid recommending something like that as it truly is the most economical for a great experience.
Now, getting into the era of 1440p->4k up-scaling with RT...which essentially requires something like a 4090...That part sucks. I'll eat it next-gen though, bc that's whatcha' do if it's yer' thing.
8k though? Good luck with that. Next-gen's absurd high-end will be about running 4k native RT. Again, I don't think most people need it, and certainly most can't afford it, and I think that's okay.
While next-gen certainly is about features (again, I believe the 9070xt is the bottom of the new paradigm: 1440p raster, 1080p RT, or 960p->1440p upscaled RT), I think most overestimate the PS6.
The most we can hope for is 1080p->4k up-scaling wrt demanding games, and doing something like that would require something with the grunt of a 5080, or a next-gen (9216sp?) 192-bit/18GB chip.
My hope is the PS6 essentially uses the 256-bit setup from AMD (similar to a 7900xtx but with RT/FSR improvements) but packed dense and clocked super low, making it similar to the desktop 192-bit parts (5080).
The thing people truly do not understand, and they will very soon, is that RT will become STANDARDIZED. You will NEED a card capable of this stuff if you want to run those games in any decent way.
Right now the 9070xt will be the cheapest card for 60 fps at any kind of common resolution (listed above). Yes, you can upscale from a lower rez and/or use a lesser card and lower settings, but that's not the point.
People can argue what's acceptable to a point, but too many just do not understand the shift that is happening. There is a reason next-gen LOW-END (think the 9060's market) will be like a 9070 xt.
I'm honestly not trying to fight with people that don't get it, only prepare them. There is a reason why the 9070 xt does what it does, a reason the 3nm stack will be what it is, and a reason the PS6 will do what it does.
It may honestly catch some people off-guard, but that's why I caveat literally everything with *think about what you buy right now*, because for many people it just ain't gonna do what you want pretty soon.
Again, IN THOSE GAMES. Not all games are those games, but increasingly more will be (especially once the PS6 lands), and I'm also trying to be considerate that people don't want to be limited in the games they play. As I have said many times, the 4090 exists because it is literally the foundation of 1440p->4k up-scaled RT, which to many people is the (reasonable) grail.
This will trickle down next-gen to $1000 cards. And the gen after that. And the gen after that.
The cards above that next-gen (36GB?) will be about native 4kRT. 5090 is a weird freakin' thing that I agree is mostly a novelty of what's currently possible, but not a tier.
The next generational leap after this is we'll probably all be gaming in the cloud. :p
It astounds me that it's taken so long for this to come to market; once you realize the PS5 Pro exists, there's no reason for an xx50-class APU not to exist. Sure, you'd need a higher-end 200-250 W cooler for it, but those already exist, and many people would flock to such a solution. Memory bandwidth is probably the biggest barrier here, as Strix Halo needs soldered RAM and quad-channel bandwidth. I hope there's a way to get past the soldered RAM with CAMM2 or something of the sort in the future to make this a real option. Basically, this would leave the dGPU market to 70-class and higher GPUs on both sides. I believe many people would happily pay 600-700 USD for such an APU: 8-12 Zen 5 cores and a 32 CU RDNA 4 iGPU with a 250 W power limit, 6700 XT/4060 Ti-like performance, and a large VRAM cap (96 GB!?) as is possible with the laptop platform.
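To put rough numbers on that bandwidth barrier, here's a napkin-math sketch in Python (peak theoretical figures only; the bus widths and transfer rates are my own assumptions from memory, and things like Infinity Cache are ignored):

```python
def peak_bandwidth_gbs(bus_width_bits, rate_mtps):
    # Peak theoretical bandwidth: bus width (bits) x transfer rate (MT/s),
    # divided by 8 bits per byte, expressed in GB/s.
    return bus_width_bits * rate_mtps / 8 / 1000

configs = {
    "128-bit dual-channel DDR5-5600 (typical desktop)": (128, 5600),
    "256-bit LPDDR5X-8000 (Strix Halo-style, soldered)": (256, 8000),
    "128-bit GDDR6 @ 18 Gbps (RTX 4060 Ti)": (128, 18000),
    "192-bit GDDR6 @ 16 Gbps (RX 6700 XT)": (192, 16000),
}
for name, (bus, rate) in configs.items():
    print(f"{name}: ~{peak_bandwidth_gbs(bus, rate):.0f} GB/s")
```

A regular dual-channel DDR5 socket lands around 90 GB/s, nowhere near the ~256-384 GB/s the dGPUs in that class get - which is exactly why the soldered quad-channel setup exists, and why CAMM2 would have to close that gap for a socketed version to be viable.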
Here's to hoping on all fronts. Cheers.
20/20 is the average across everyone, good AND bad eyesight included.
And that's before correcting the bad eyesight with spectacles or lenses.
With proper prescription glasses for those who need them, the average for people under 40 is about 20/12.
At less than 2' that's a lot more than 100ppi.
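Putting numbers on that (using the convention that 20/20 means resolving about one arcminute of visual angle, so 20/12 works out to roughly 0.6 arcminute), a quick Python sketch:

```python
import math

def resolvable_ppi(distance_in, acuity_arcmin):
    # Finest pixel density the eye can resolve at a given distance:
    # one pixel must subtend at least acuity_arcmin of visual angle.
    pitch_in = distance_in * math.tan(math.radians(acuity_arcmin / 60.0))
    return 1.0 / pitch_in

for label, acuity in [("20/20", 1.0), ("20/12", 0.6)]:
    print(f"{label} vision at 24 in: up to ~{resolvable_ppi(24, acuity):.0f} PPI")
```

That works out to roughly 143 PPI for 20/20 and roughly 239 PPI for 20/12 at two feet - both comfortably above 100 PPI, and the latter above a 27" 4K panel's ~163 PPI.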
5070ti beat by a 9070XT
9060 beating Nvidia's x60ti? That's going to get people thinking.
Yes, names do matter... if all you do is skim reviews and collect sound bites. Which the overwhelming majority of consumers do.
And the funniest thing is, Nvidia can't do another 'woops, we'll pull this 4080 "12GB" and call it something else nao' either.
As usual, I will err on the cautious side and set myself up to be pleasantly surprised, rather than the other way around.