Saturday, March 1st 2025

Early Leak Claims AMD Radeon RX 9060 XT Might Reach NVIDIA GeForce RTX 4070 Territory

Judging by the current state of gaming GPUs, it might appear to some that true budget-class cards are a thing of the past. That said, it is almost certain that both NVIDIA and AMD are cooking up entry-level GPUs to cater to folks who can't shell out the astoundingly high prices that modern mid-range and high-end GPUs command, with AMD having already confirmed the launch of RX 9060-class cards sometime in Q2 of this year. Previous leaks have indicated that the RX 9060 will likely hit the scene with 12 GB of GDDR6 VRAM, whereas its XT sibling will boast an additional 4 GB. NVIDIA is also expected to drop the RTX 5060 and 5060 Ti cards sometime towards the end of this month, likely in 8 GB and 16 GB flavors of the shinier GDDR7 spec.

Now, a fresh leak by Moore's Law Is Dead (MLID) claims that the Radeon RX 9060 XT will outperform the RTX 4060 Ti, slotting in between the RTX 4060 Ti and the Radeon RX 7700 XT. Moreover, he added that AMD may even push clocks to bring the card closer to RTX 4070 territory - a sweet position to hold indeed. Regarding the launch date, MLID expects the card to hit the arena sometime in April. Of course, as with all leaks and rumors, take this information with a grain of salt, especially considering that MLID's assertions are sourced from a single party. The RTX 5060/Ti is expected to be priced in the $400-$500 range, which means the RX 9060 XT will likely have to land at the lower end of that range in order to make for a compelling value proposition.
Sources: Moore's Law is Dead, Spotted by Notebookcheck

54 Comments on Early Leak Claims AMD Radeon RX 9060 XT Might Reach NVIDIA GeForce RTX 4070 Territory

#26
efikkan
N/A: Such a loss. What genius disabled half the chip? Unless it is defective, that does not go well with the claim of good yields. Final specs should be more generous.
If this were accurate, wouldn't that put the RX 9060 XT close to half the performance of the RX 9070 XT? Even with slightly higher clocks, that seems like a very big gap within the mid-range, and that's not even considering the RX 9060.

I remain skeptical.
Posted on Reply
#27
Darmok N Jalad
This is exactly what they did with the original Navi 10 launch. We had the 5700 XT, the 5700, and the 5600 XT, all based on the same Navi 10. It allowed the 5600 XT to have a 192-bit bus, which would again be a key component in making the 9060 a 12 GB card. Really, this can make a lot of sense if the yields work out that way, as the AIBs can reuse the same PCB across all designs, which helps hit lower MSRPs. The second-gen Navi was Navi 14, a much smaller 128-bit chip that topped out at the 5500 XT. Makes me wonder if we'll have something like that again, but maybe a 9050 could potentially match the 7600 at a lower price point. The 7600 is not bad at all, even at 1440p with a mix of high-to-medium settings.
Posted on Reply
#28
AusWolf
hsew: Given RTX 50's current track record, I can't even see the 5060 Ti matching the 4070. nVidia better hope these rumors don't come to fruition.
Exactly my thoughts.
Posted on Reply
#29
Nater
Dristun: Another internet forum special. Unless you want HRR, an IPS 4K monitor costs $300. Same with 4K TVs, and there for $300 you're getting a much better screen too. If that's stupid money for you, I have no idea how you could afford the rig from your specs.
I have stupid money. Like I said, I have to feed 4 more gamers though. I see no value in a $300 4K monitor. I'd rather have a higher refresh, lower res, more FPS, higher quality settings in-game.

Are you the guy buying 27" 4K monitors? Wasted PPI, in my opinion. Who sits THAT close to their screen?
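
For context on the PPI numbers thrown around in this thread, here is a minimal sketch (Python) of the standard diagonal pixel-density calculation; the sizes and resolutions below are just the common configurations mentioned in the discussion, not anyone's specific monitor:

import math

def ppi(width_px, height_px, diagonal_in):
    # Pixel density: diagonal pixel count divided by diagonal length in inches.
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 27)))  # ~163 PPI: 27" 4K
print(round(ppi(2560, 1440, 27)))  # ~109 PPI: 27" 1440p
print(round(ppi(3840, 2160, 32)))  # ~138 PPI: 32" 4K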
Posted on Reply
#30
Caring1
Nater: I have stupid money. Like I said, I have to feed 4 more gamers though. I see no value in a $300 4K monitor. I'd rather have a higher refresh, lower res, more FPS, higher quality settings in-game.

Are you the guy buying 27" 4K monitors? Wasted PPI, in my opinion. Who sits THAT close to their screen?
I've got a 32" 4K monitor on my desk beside my 24" 1080p monitor, and both are no more than 2' away from my face.
Posted on Reply
#31
bug
Right. So 9070XT ~4080 perf level, 9060XT ~4070. With the 9070 slotted in between. Anyone up for a bridge in Brooklyn?
Posted on Reply
#32
rattlehead99
TumbleGeorge: Hmm, I hope that the next generation with UDNA and GDDR7 will compete much better in 2026-2027... and I hope that the AMD generation after that, released after the PS6 and before 2030, will put all models in 8K gaming territory, including entry-level ones. 2028-2029 must bring mass 8K access for all gamers with (new) dedicated graphics cards.
From all the leaks shown, the RX 9070 XT is as good as the RTX 5070 Ti, or with an overclock at 3.3 GHz it will match the 5080.

Cards like the 5090 and 4090 should NOT exist; they are very customer- and market-unfriendly, as they take away a ton of wafer production volume for a small number of chips and give developers an excuse to optimize their games much more poorly. Chips above 400-450 mm² should be left for the "professional" GPUs.
Posted on Reply
#33
efikkan
rattlehead99: Cards like the 5090 and 4090 should NOT exist; they are very customer- and market-unfriendly, as they take away a ton of wafer production volume for a small number of chips and give developers an excuse to optimize their games much more poorly. Chips above 400-450 mm² should be left for the "professional" GPUs.
A very large portion of 80-class and 90-class graphics cards are bought by "professional" users; that is the primary reason why they are priced like that.
Perhaps they should go back to naming them "Titan", but I don't think the market wants these products gone. Considering the margins on these cards, they largely fund the development of the whole lineup.
Posted on Reply
#34
N/A
An overclocked-from-the-factory 9060 XT would have no juice left to squeeze. Why not just give it 3072 units at 2.0 GHz and let the user decide?
bug: Right. So 9070XT ~4080 perf level, 9060XT ~4070. With the 9070 slotted in between. Anyone up for a bridge in Brooklyn?
I'm not taking that bridge. We shouldn't trust AMD's slides. But it would be a technological marvel if it beats the 5080.
Posted on Reply
#35
Denver
Daven: It seems the following will be pretty close:

9070XT = 7900XTX = 5070Ti = 4080S
9070 = 7900XT = 5070 = 4070TiS
9060XT = 7800XT = 4070S
9060 = 7700XT = 4060Ti/4070

Just waiting on the 5060/9060 series performance and pricing to get the complete picture. But as long as Blackwell is out of stock and overpriced, RDNA4 is the only option.

Oh, and the 5080 is still DOA and completely irrelevant. Nvidia screwed that one up by wanting the 5090 to be so much faster that the 5080 ended up barely faster than much lower-priced cards. Buyer beware on the 5080.
Really? It's obvious to me that the 5070 barely competes with the 7900 GRE. The 9070 non-XT will be at least 20% faster; the two will tie in RT-intensive games, but AMD will be slightly superior in titles with subtle RT.
Posted on Reply
#36
Squared
Nater: My brother doesn't have a 4K TV. My parents don't. My nephew doesn't. None of my kids run 4K monitors. I don't even run one.

Over 80% of people out there are at 1440p and UNDER, according to the Steam survey I just looked up. 4K is 3% of Steam gamers. (Maybe I'm reading it wrong.)

4K is NICHE, for people with stupid money or people stupid with their money. If I didn't have 3 kids with gaming rigs + wifey with a laptop, I'd have the stupid money, and I'd be 5K actually.
I'm still using a 2560×1080 monitor I bought 9 years ago and have no plans to upgrade. What would I even upgrade to? I'd have to triple my GPU spending to run a better monitor. (I could lower game resolution but it's super annoying compared to just using native resolution.)
Nater: I have stupid money. Like I said, I have to feed 4 more gamers though. I see no value in a $300 4K monitor. I'd rather have a higher refresh, lower res, more FPS, higher quality settings in-game.

Are you the guy buying 27" 4K monitors? Wasted PPI, in my opinion. Who sits THAT close to their screen?
A friend of mine bought a 4K monitor 10 years ago. Later he switched to a 1440p monitor because he uses it to play games. Today he's shopping for $1k graphics cards that stores don't even have in stock, because he wants to go back to 4K. I don't get it.

However, I use a 27" 4K monitor at work, and it's way better than using the two 1080p monitors I once considered essential. Text is super sharp even with a lot of it on the screen. But even a potato iGPU can put text on a 4K monitor.
Posted on Reply
#37
N/A
A 4K 240 Hz 27" OLED for me. 8K60 only exists in one 31.5" Dell going for $3K. But this is exactly how 4K started.
Posted on Reply
#38
mechtech
Nater: Maybe WE are. As in people in this forum.

My brother doesn't have a 4K TV. My parents don't. My nephew doesn't. None of my kids run 4K monitors. I don't even run one.

Over 80% of people out there are at 1440p and UNDER, according to the Steam survey I just looked up. 4K is 3% of Steam gamers. (Maybe I'm reading it wrong.)

4K is NICHE, for people with stupid money or people stupid with their money. If I didn't have 3 kids with gaming rigs + wifey with a laptop, I'd have the stupid money, and I'd be 5K actually.
I got my 4K monitor for $300 CAD and the 4K TV was also $300 CAD. If you are getting plain 60 Hz, there is not much difference in $$ between 1080p and 4K at the low end.
Posted on Reply
#39
kanecvr
Dristun: Another internet forum special. Unless you want HRR, an IPS 4K monitor costs $300. Same with 4K TVs, and there for $300 you're getting a much better screen too. If that's stupid money for you, I have no idea how you could afford the rig from your specs.
4K displays are cheap, but graphics cards able to consistently push over 60 FPS at 4K are not, which is why lots of us are running 2K setups.
Posted on Reply
#40
Why_Me
Grebber01: 12GB lol :( I hope not
12 GB for a card that's meant for gaming at 1080p is just fine.
Posted on Reply
#41
medi01
Denver: AMD will use Navi 44 (200 mm²?) against the 5060/5060 Ti (188 mm²), not such a big and expensive chip lol

For example, the 7600 XT +30% is already at 7700 XT level; just a little bit more and that ties with the 4070.
Perf doesn't scale linearly with CUs.
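
A rough worked example of that point, using the two cards' actual CU counts and the +30% figure quoted above (the performance delta is the claim from the quote, not a measured number, so treat the result as a sketch):

# Sketch: linear-CU math vs. the quoted +30% performance gap.
cu_7600xt, cu_7700xt = 32, 54
cu_ratio = cu_7700xt / cu_7600xt              # ~1.69x the compute units
perf_ratio = 1.30                             # the "+30%" claim quoted above
efficiency = (perf_ratio - 1) / (cu_ratio - 1)
print(f"{cu_ratio:.2f}x CUs -> {perf_ratio:.2f}x perf ({efficiency:.0%} of linear)")

If the +30% figure is even roughly right, only about 44% of the extra CUs show up as performance, which is exactly the sublinear scaling being described.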
Posted on Reply
#42
Nater
Caring1: I've got a 32" 4K monitor on my desk beside my 24" 1080p monitor, and both are no more than 2' away from my face.
30-32" is the sweet spot for a 4K desktop monitor, IMO.
mechtech: I got my 4K monitor for $300 CAD and the 4K TV was also $300 CAD. If you are getting plain 60 Hz, there is not much difference in $$ between 1080p and 4K at the low end.
The value is there; it's double the cost, though (for 4x the resolution).
Posted on Reply
#43
alwayssts
Nater: Dude. We're not even at 4K mainstream yet. Hell, even 1440p isn't. The PS6 will be 4K@120 in the best case.

Features and quality will drive GPUs, not resolution. If resolution were a driver, we'd be there by now. Once you get up over 120 PPI, you can't tell the difference sitting at a desk. The only reason to push for 8K would be movie-theater-sized screens to game on.
I'd argue even ~100 PPI is fine for most people. I know *some* people have 20/20 vision, but in reality I think most people don't. I certainly don't.
There is a reason the 27'' 1440p display rules the roost...It's kinda perfect. Yeah, you can start arguing for 42'' 4k monitors/tvs on your desk, but to me that's a lil' extreme bc it's too big to see everything. JMO.
This is why I'll never even understand running native 4k for gaming, given the extra power/perf needed. I run 1440p and upscale it to 4k, and I honestly don't think *most* people need more.
I sit 5-6' from a 65" 4K OLED, depending on if I'm doing the gamer meme. I'd reckon you'd have to sit closer than 5' to notice a difference from 1440p native...and most don't do that bc then you can't see the whole screen.
As an obscene perfectionist, I think my setup is perfect. Obviously no two people are the same, but I feel solid recommending something like that as it truly is the most economical for a great experience.

Now, getting into the era of 1440p->4k up-scaling with RT...which essentially requires something like a 4090...That part sucks. I'll eat it next-gen though, bc that's whatcha' do if it's yer' thing.
8k though? Good luck with that. Next-gen's absurd high-end will be about running 4k native RT. Again, I don't think most people need it, and certainly most can't afford it, and I think that's okay.
While next-gen certainly is about features (again, I believe the 9070 xt is the bottom of the new paradigm; 1440p raster, 1080p RT, or 960p->1440p upscaled RT), I think most overestimate the PS6.
The most we can hope for is 1080p->4k up-scaling wrt demanding games, and doing something like that would require something with the grunt of a 5080, or a next-gen (9216sp?) 192-bit/18 GB chip.
My hope is the PS6 essentially uses the 256-bit setup from AMD (similar to a 7900xtx but with RT/FSR improvements) but packed dense and clocked super low, making it similar to the desktop 192-bit parts (5080).

The thing people truly do not understand, and they will very soon, is that RT will become STANDARDIZED. You will NEED a card capable of this stuff if you want to run those games in any decent way.
Right now 9070xt will be the cheapest for a 60fps at any kind of common resolution (listed above). Yes, you can upscale from a lower-rez and/or use a lesser card and lower settings, but that's not the point.
People can argue what's acceptable to a point, but too many just do not understand the shift that is happening. There is a reason next-gen LOW-END (like the market of 9060) will be like a 9070 xt.

I'm honestly not trying to fight with people that don't get it, only to prepare them. There is a reason why the 9070 xt does what it does, a reason the 3nm stack will be what it is, and a reason the PS6 will do what it does.

It may honestly catch some people off-guard, but that's why I caveat literally everything with *think about what you buy right now*, because for many people it just ain't gonna do what you want pretty soon.
Again, IN THOSE GAMES. Not all games are those games, but increasingly more will be (especially once PS6), and I'm also trying to be considerate that people don't want to be limited on the games they play.
rattlehead99: Cards like the 5090 and 4090 should NOT exist; they are very customer- and market-unfriendly, as they take away a ton of wafer production volume for a small number of chips and give developers an excuse to optimize their games much more poorly. Chips above 400-450 mm² should be left for the "professional" GPUs.
As I have said many times, 4090 exists because it is literally the foundation of 1440p->4k up-scaled RT, which to many people is the (reasonable) grail.
This will trickle down next-gen to $1000 cards. And the gen after that. And the gen after that.
The cards above that next-gen (36GB?) will be about native 4kRT. 5090 is a weird freakin' thing that I agree is mostly a novelty of what's currently possible, but not a tier.

After the next generational leap, we'll probably all be gaming in the cloud. :p
Posted on Reply
#44
defaultname91
N/A: Such a loss. What genius disabled half the chip? Unless it is defective, that does not go well with the claim of good yields. Final specs should be more generous.
I have to assume it's actually a 48 CU part and that the eventual 9060 will be a 40 CU version. Otherwise I don't really see the point, seeing as we now have Strix Halo with 40 CUs on-chip (yes, I know it's RDNA 3.5, but I'm eager to see how much more performance per CU RDNA 4 can bring at iso-frequency/power draw). The Strix Halo concept, regardless of necessitating cutting down in certain situations, will hopefully get brought over to desktop APUs, handhelds, and other form factors. It should basically kill the market for low-end GPUs, meaning we'll never see another RX 6400/5500 XT type card, as well as killing the nVIDIA xx50 line.

It astounds me that it's taken so long to come to market. When you realize the PS5 Pro exists, there's no reason for an xx50-class APU not to exist; sure, you'll need a higher-end 200-250 W cooler for it, but those already exist, and many people would flock to such a solution. I think memory bandwidth is probably the biggest barrier here, as Strix Halo needs soldered RAM for its quad-channel bandwidth. I hope there's a way to get past this issue, or past the soldered RAM with CAMM2 or something of the sort, in the future to make this a real option. Basically, this would leave the dGPU market to 70-class and higher GPUs on both sides. I believe many people would happily pay 600-700 USD for such an APU with 8-12 Zen 5 cores and a 32 CU RDNA 4 iGPU at a 250 W power limit, with 6700 XT/4060 Ti-like performance and the large VRAM cap (96 GB!?) that is possible with the laptop platform.

Here's to hoping on all fronts. Cheers.
Posted on Reply
#45
Enterprise24
I won't trust anything from MLID. Too much BS.
Posted on Reply
#46
lexluthermiester
While these are just rumors, I hope they pan out and even do better. NVidia needs a swift kick in the goolies for the absolute BS of the 5000 series launch.
Posted on Reply
#47
CyberPomPom
alwayssts: I'd argue even ~100 PPI is fine for most people. I know *some* people have 20/20 vision, but in reality I think most people don't. I certainly don't.
Most people definitely have better than 20/20 vision. By definition.

20/20 is the average, with everyone included, people with good AND bad eyesight alike.
And that's before correcting the bad eyesight with spectacles or lenses.

The average for people under 40, with proper prescription glasses for those who need them, is about 20/12.
At less than 2', that's a lot more than 100 PPI.
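
As a quick sanity check of that last line, here is a minimal sketch (Python) using the common 1-arcminute convention for 20/20 acuity; this is an approximation, not a vision-science model:

import math

def resolvable_ppi(distance_in, arcmin=1.0):
    # PPI at which a single pixel subtends the given visual angle;
    # densities above this blur together for that viewer.
    pixel_in = distance_in * math.tan(math.radians(arcmin / 60))
    return 1 / pixel_in

print(round(resolvable_ppi(24)))       # ~143 PPI: 20/20 vision at 2'
print(round(resolvable_ppi(24, 0.6)))  # ~239 PPI: ~20/12 vision at 2'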
Posted on Reply
#48
Vayra86
It seems Nvidia is about to get 'exposed' for its continuous shifting of tier performance to higher price points.

The 5070 Ti beaten by a 9070 XT.
The 9060 beating Nvidia's x60 Ti? That's going to get people thinking.

Yes, names do matter... if all you do is skim reviews and collect sound bites. Which the overwhelming majority of consumers do.
And the funniest thing is, Nvidia can't do another 'woops we'll pull this 4080 '12gb' and call it something else nao' either.
Posted on Reply
#49
Bwaze
lexluthermiester: While these are just rumors, I hope they pan out and even do better. NVidia needs a swift kick in the goolies for the absolute BS of the 5000 series launch.
I don't think there will be any "kicking". If AMD cards by any chance present good price/performance, demand will correct the price very soon. I doubt AMD will want to commit to making a large volume of cheap GPUs while they are constrained at TSMC and can't make server CPUs and Ryzen 9800X3Ds fast enough.
Posted on Reply
#50
bug
Bwaze: I don't think there will be any "kicking". If AMD cards by any chance present good price/performance, demand will correct the price very soon. I doubt AMD will want to commit to making a large volume of cheap GPUs while they are constrained at TSMC and can't make server CPUs and Ryzen 9800X3Ds fast enough.
This is exactly how you gain market share. Sell at discount, even at a loss if you can afford it. It's not a short-term plan.
As usual, I will err on the cautious side and set myself up to be pleasantly surprised, rather than the other way around.
Posted on Reply