
Early Leak Claims AMD Radeon RX 9060 XT Might Reach NVIDIA GeForce RTX 4070 Territory

I don't think there will be any "kicking". If AMD cards by any chance offer good price/performance, then demand will correct the price very soon. I doubt AMD will want to commit to making a large volume of cheap GPUs while they are capacity-constrained at TSMC and can't even make server CPUs and the Ryzen 9800X3D fast enough.
This is exactly how you gain market share. Sell at a discount, even at a loss if you can afford it. It's not a short-term plan.
As usual, I will err on the cautious side and set myself up to be pleasantly surprised, rather than the other way around.
 
This is exactly how you gain market share. Sell at a discount, even at a loss if you can afford it. It's not a short-term plan.
As usual, I will err on the cautious side and set myself up to be pleasantly surprised, rather than the other way around.
Yes, I understand that, but I don't think normal market conditions exist right now.

All the tech companies right now are constrained by the fact that most of their products, even the ones that are direct competitors, are made by a single manufacturer - TSMC - which can't simply expand its production capacity. And common consumer products aren't those companies' biggest revenue drivers either: Nvidia makes most of its money selling server AI equipment, and AMD from selling server CPUs.

So I don't see any possibility of AMD suddenly focusing on consumer GPUs to regain market share, and thereby lowering their income by not making as many server CPUs. I think not presenting the new GPU lineup at CES was telling enough that consumer GPUs aren't a priority, even more so than not aiming at the high end at all.
 
It seems the following will be pretty close:

9070XT = 7900XTX = 5070Ti = 4080S
9070 = 7900XT = 5070 = 4070TiS
9060XT = 7800XT = 4070S
9060 = 7700XT = 4060Ti/4070

Just waiting on the 5060/9060 series performance and pricing to get the complete picture. But as long as Blackwell is out of stock and overpriced, RDNA4 is the only option.

Oh and the 5080 is still DOA and completely irrelevant. Nvidia screwed that one up by wanting the 5090 to be so much faster that the 5080 ended up barely faster than much lower-priced cards. Buyer beware on the 5080.

I don't think so.

Top line seems about right.

The 9070 may equal the 7900XT, give or take, but I suspect the 5070 doesn't even match the 4070S, let alone the 4070Ti.

The 9060XT has too little bandwidth to match the 7800XT, so I see it closer to the 7700XT / 4070.

With how small N44 probably is, I don't see the point in a cut part at launch. Just release the XT with 16GB of VRAM for $330 and call it done.

If they do get some faulty dies, save them and then release a 9050XT with a 96bit bus and 12GB of VRAM for ~$250 once there is enough of a stockpile for it to last. I really think 96bit with 12GB will be a far better 1080p part than an 8GB 128bit part would be.

That is the stack I would have if I were AMD.
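For reference, the bandwidth arithmetic behind the 96bit vs 128bit trade-off above — a minimal sketch, assuming 20 Gbps GDDR6 purely for illustration (neither the memory speed nor the SKU is confirmed):

```python
def mem_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bits / 8) * per-pin data rate in Gbit/s."""
    return bus_width_bits / 8 * data_rate_gbps

# 20 Gbps GDDR6 is an assumed speed for illustration; the 96bit part is hypothetical
print(mem_bandwidth_gbs(128, 20))  # 320.0 GB/s for a 128bit card
print(mem_bandwidth_gbs(96, 20))   # 240.0 GB/s for the hypothetical 96bit "9050XT"
```

Under that assumption, the cut-down part gives up roughly a quarter of the peak bandwidth but keeps the 12GB of capacity, which is the trade-off being argued for.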
 
I hope there's a power-efficient offering from AMD this generation. The answer isn't always to chase peak performance with total disregard to performance/Watt.

I've picked an Nvidia card countless times for builds that are thermally-constrained because the Nvidia equivalent GPU is pulling 30-40% less power for the same end result. It's why I begrudgingly put up with the 4060Ti in one of my machines, and I can't even remember the last time I put a Radeon in a mini ITX build...
 
I hope there's a power-efficient offering from AMD this generation. The answer isn't always to chase peak performance with total disregard to performance/Watt.

I've picked an Nvidia card countless times for builds that are thermally-constrained because the Nvidia equivalent GPU is pulling 30-40% less power for the same end result. It's why I begrudgingly put up with the 4060Ti in one of my machines, and I can't even remember the last time I put a Radeon in a mini ITX build...
Same here. Plus, the lower the TDP, the less you have to worry about the quality of the cooling solution.
 
Same here. Plus, the lower the TDP, the less you have to worry about the quality of the cooling solution.
Did you see the leaked slide from HUB a minute ago?


It's pretty terrifying if those numbers are even vaguely accurate for the "330W" 9070XT
 
Did you see the leaked slide from HUB a minute ago?


It's pretty terrifying if those numbers are even vaguely accurate for the "330W" 9070XT
Haven't seen them. Too high for my taste, but I guess what mileage those cards offer in return matters, too.
 
Haven't seen them. Too high for my taste, but I guess what mileage those cards offer in return matters, too.
I'm guessing that's total system power as the 263W 7800XT is showing as 344W
 
I'd argue even ~100PPI is fine for most people. I know *some* people have 20/20 vision, but in reality I think most people don't. I certainly don't.
There is a reason the 27'' 1440p display rules the roost...It's kinda perfect. Yeah, you can start arguing for 42'' 4k monitors/tvs on your desk, but to me that's a lil' extreme bc it's too big to see everything. JMO.
This is why I'll never even understand running native 4k for gaming, given the extra power/perf needed. I run 1440p and upscale it to 4k, and I honestly don't think *most* people need more.
I sit 5-6' from a 65'' 4K OLED, depending on if I'm doing the gamer meme. I'd reckon you'd have to sit closer than 5' to notice a difference from 1440p native... and most don't do that bc then you can't see the whole screen.
As an obscene perfectionist, I think my setup is perfect. Obviously no two people are the same, but I feel solid recommending something like that as it truly is the most economical for a great experience.
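For what it's worth, the arithmetic behind the ~100 PPI point above — a quick sketch using the screen sizes already mentioned (diagonal pixel count divided by the panel diagonal):

```python
import math

def ppi(h_px: int, v_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by the panel diagonal in inches."""
    return math.hypot(h_px, v_px) / diagonal_in

print(round(ppi(2560, 1440, 27), 1))  # ~108.8 PPI for a 27'' 1440p monitor
print(round(ppi(3840, 2160, 42), 1))  # ~104.9 PPI for a 42'' 4K screen
```

Both land right around that ~100 PPI mark, so the 27'' 1440p desk monitor and a 42'' 4K screen have nearly the same pixel density.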

Now, getting into the era of 1440p->4k up-scaling with RT...which essentially requires something like a 4090...That part sucks. I'll eat it next-gen though, bc that's whatcha' do if it's yer' thing.
8k though? Good luck with that. Next-gen's absurd high-end will be about running 4k native RT. Again, I don't think most people need it, and certainly most can't afford it, and I think that's okay.
While next-gen certainly is about features (again, I believe 9070xt is the bottom of the new paradigm; 1440p raster, 1080pRT, or 960p->1440p upscaled RT), I think most overestimate the PS6.
The most we can hope for is 1080p->4k up-scaling wrt demanding games, and doing something like that would require something with the grunt of a 5080, or a next-gen (9216sp?) 192-bit/18GB chip.
My hope is the PS6 essentially uses the 256-bit setup from AMD (similar to a 7900xtx but with RT/FSR improvements) but packed dense and clocked super low, making it similar to the desktop 192-bit parts (5080).

The thing people truly do not understand, and they will very soon, is that RT will become STANDARDIZED. You will NEED a card capable of this stuff if you want to run those games in any decent way.
Right now 9070xt will be the cheapest for a 60fps at any kind of common resolution (listed above). Yes, you can upscale from a lower-rez and/or use a lesser card and lower settings, but that's not the point.
People can argue what's acceptable to a point, but too many just do not understand the shift that is happening. There is a reason next-gen LOW-END (like the market of 9060) will be like a 9070 xt.

I'm honestly not trying to fight with people that don't get it, only prepare them. There is a reason why the 9070 xt does what it does, there is a reason the 3nm stack will be what it is, and also why the PS6 will do what it does.

It may honestly catch some people off-guard, but that's why I caveat literally everything with *think about what you buy right now*, because for many people it just ain't gonna do what you want pretty soon.
Again, IN THOSE GAMES. Not all games are those games, but increasingly more will be (especially once PS6), and I'm also trying to be considerate that people don't want to be limited on the games they play.


As I have said many times, 4090 exists because it is literally the foundation of 1440p->4k up-scaled RT, which to many people is the (reasonable) grail.
This will trickle down next-gen to $1000 cards. And the gen after that. And the gen after that.
The cards above that next-gen (36GB?) will be about native 4kRT. 5090 is a weird freakin' thing that I agree is mostly a novelty of what's currently possible, but not a tier.

The next generational leap after this is we'll probably all be gaming in the cloud. :p
If we get to gaming in the cloud and on a subscription, mark my words, gaming will become so bad, bland and/or boring that you wouldn't even want to play games. Every entertainment industry that moves from providing products to being a service turns to crap.
 
Hmm, I hope the next generation with uDNA and GDDR7 will compete much better in 2026-2027... and I also hope that the AMD generation after that, released after the PS6 and before 2030, will put all models in 8K gaming territory, including entry-level models. 2028-2029 must bring mass 8K access for all gamers with (new) dedicated graphics cards.

8K will never be a thing, because it's the biggest waste of computational power you can imagine. You need a 50" desk monitor or a 100" TV to notice the detail, and you need to have the detail in the first place. Nobody will master anything in 8K because it's a ridiculous waste of resources. Nobody will stream anything in 8K because it's a waste of bandwidth. Nobody will render anything in 8K because you'd have to do X360/PS3-level graphics. Most console games render below 1080p, and you want 8K in the next generation? I think you missed the memo on how small process node improvements are becoming, while getting much more expensive.
 
8K will never be a thing, because it's the biggest waste of computational power you can imagine. You need a 50" desk monitor or a 100" TV to notice the detail, and you need to have the detail in the first place. Nobody will master anything in 8K because it's a ridiculous waste of resources. Nobody will stream anything in 8K because it's a waste of bandwidth. Nobody will render anything in 8K because you'd have to do X360/PS3-level graphics. Most console games render below 1080p, and you want 8K in the next generation? I think you missed the memo on how small process node improvements are becoming, while getting much more expensive.
I wouldn't be too sure about that. Not being able to make the best of it never stopped resolutions from going up. To this day, the "4K" signal is so compressed it barely equals a proper FHD transmission. Can't see the details? You already can't see them on 4K screens at a typical sitting distance. True story: my 55" 4K TV sits less than 4m/13ft away and has some dead pixels that I only noticed up close. Back on the couch, I know where they are and still can't spot them.
 
8K will never be a thing, because it's the biggest waste of computational power you can imagine.
That is a matter of opinion. Not everyone agrees.
You need a 50" desk monitor or a 100" TV to notice the detail, and you need to have the detail in the first place.
Again more opinion, not supported by history or fact.
Nobody will master anything in 8K because it's a ridiculous waste of resources.
What?!? There have been 8k+ cameras for nearly 10 years. Either you've been hiding under a rock or you have to stop with the drugs.
Nobody will stream anything in 8K because it's a waste of bandwidth.
Again, that is your opinion. There are those who have multi-gigabit per second internet connections. A good quality 4k stream requires 30Mbps. 8k = 4k x 4. So naturally 8k = 120Mbps. I can do that a few times over with my fiber connection and still have bandwidth to spare. And where people CAN do something, they frequently WANT to do that something.
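To spell that arithmetic out — a rough sketch that takes the 30Mbps 4K figure quoted above as given and assumes bitrate scales linearly with pixel count (real codecs won't scale quite that cleanly):

```python
def stream_bitrate_mbps(base_mbps: float, base_pixels: int, target_pixels: int) -> float:
    """Scale a reference bitrate by pixel count (naive linear-scaling assumption)."""
    return base_mbps * target_pixels / base_pixels

PIXELS_4K = 3840 * 2160   # 8,294,400 pixels
PIXELS_8K = 7680 * 4320   # 33,177,600 pixels, i.e. 4x 4K

# 30 Mbps for a good 4K stream is the figure quoted above, not a measured value
print(stream_bitrate_mbps(30, PIXELS_4K, PIXELS_8K))  # -> 120.0 Mbps for 8K
```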
Nobody will render anything in 8K because you'd have to do X360/PS3-level graphics. Most console games render below 1080p, and you want 8K in the next generation? I think you missed the memo on how small process node improvements are becoming, while getting much more expensive.
Games are a completely different beast from video streaming. While you're right that we're not there yet computationally, 4K and 8K screens would afford a flexibility which can be utilized for certain special FX that can take advantage of the extra resolution.

With the exception of that last bit, your statement edges on the blatantly ignorant and ridiculous. Every time a new advancement comes along that pushes the boundaries, someone chimes in and says it'll never happen, and then it invariably does. They said that with 1440p, they said it with 4K, and now here we are with someone saying it about 8K.

Are we sensing a pattern yet?

Now before you respond with some smarmy egocentric diatribe, sit back, think back to the past and just accept that your current thought on this matter does not meet with historical trends or merit. Everyone who said that 1080p, 1440p and 2160p would never be a thing was wrong, because here we are.

The future is coming, it needs only time and the work of technological engineers.
 
Everyone relax, the 3090 and 7900XTX were already "8k" gaming cards :peace:
I wouldn't say that. The 7900XTX, that's plausible. The 3090? Not so much, but it's close. The 4080/4080ti/4090 that's in the ballpark. We're not far away from 8k, it's gonna happen, just a matter of when.
 
1) What?!? There have been 8k+ cameras for nearly 10 years. Either you've been hiding under a rock or you have to stop with the drugs.

2) Again, that is your opinion. There are those who have multi-gigabit per second internet connections. A good quality 4k stream requires 30Mbps. 8k = 4k x 4. So naturally 8k = 120Mbps. I can do that a few times over with my fiber connection and still have bandwidth to spare. And where people CAN do something, they frequently WANT to do that something.

3) Now before you respond with some smarmy egocentric diatribe, sit back, think back to the past and just accept that your current thought on this matter does not meet with historical trends or merit. Everyone who said that 1080p, 1440p and 2160p would never be a thing was wrong, because here we are.

1) What does that have to do with mastering? 4K Blu-ray has been around since 2016, yet the vast majority of new movies were being mastered at 2K and upscaled to 4K. It was only about 2 years ago that we finally started seeing native 4K in new movies, which is irrelevant most of the time anyway. Movies are inherently soft; they use filters, tons of post-processing and CGI, and that is all done at low resolutions. You can take screenshots from 4K movies, resize them to 1080p and compare. Most of the time they look identical. Even 720p often looks identical (House of the Dragon is an extremely soft TV show, where 4K is completely useless).
It's all about budget constraints, about resources, about time and manpower. Nobody wants to waste any of that on something that offers no value.

2) You're talking about the consumer point of view again. YouTube wanted to lock 4K behind Premium; do you think they'll want to pay for 8K in every video? People are watching podcasts in 4K, which is a complete waste of bandwidth. They're watching videos with graphs, where 4K offers no benefit whatsoever. It costs them crazy amounts of money, and allowing 8K would raise those costs even more, with no financial benefit to them.
And what about streaming services like Netflix? They all keep raising prices, introducing ads, and 4K quality is already constrained. How much would they have to charge for 8K, where you wouldn't even see a difference because of the reasons I stated in point 1?
People are getting sick of streaming services. It might all just completely crash if they keep doing what they're doing.

3) With a bump in resolution has to come a bump in screen size. That's actually the definition of resolution: it's a combination of pixel density, screen size and viewing distance (a rough sketch of that math is below). Calling 4K a "resolution" is just a marketing term that's been normalized in everyday use. Your eyes need to actually be able to resolve the detail. You say it's an opinion, but there are plenty of studies that show the optimal viewing distance for a given screen size and "resolution".
And there's a limit to what screen size people want. Most people don't want a screen bigger than 32" on their desk. Many say that even 32" is too big for a monitor.
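As promised, a rough sketch of that density/size/distance relationship in terms of pixels per degree of visual angle. The specific screens and viewing distances below are just illustrative, and ~60 PPD is the commonly cited threshold for 20/20 (1 arcminute) acuity:

```python
import math

def pixels_per_degree(ppi: float, viewing_distance_in: float) -> float:
    """Pixels subtended by one degree of visual angle at the given distance (inches)."""
    return 2 * viewing_distance_in * math.tan(math.radians(0.5)) * ppi

# Illustrative setups, not anyone's actual measurements
print(round(pixels_per_degree(109, 24), 1))  # 27'' 1440p (~109 PPI) viewed at 24'' -> ~45.7 PPD
print(round(pixels_per_degree(163, 24), 1))  # 27'' 4K (~163 PPI) viewed at 24'' -> ~68.3 PPD
```

Past roughly 60 PPD the extra pixels stop being resolvable, which is why density, size and distance have to be considered together.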


I really wish the obsession with resolution stopped. On one hand, you get those crying about upscaling and saying "native or bust", even when they're presented with direct comparisons showing little to no difference. On the other hand, you have console gamers playing titles that upscale from 720p and they think they're getting 4K because that's the signal being output from the console.
I guess you're right, though. People will just gobble up whatever is thrown at them. They'll buy shiny new 8K monitors that offer no benefit, but they'll be happy because bigger number better.

The only thing is that for that to happen, there needs to be a semiconductor revolution. The current advancement of silicon manufacturing will not be enough to make 8K viable. Not from the consumer side, but from the content provider side.
 
The current advancement of silicon manufacturing will not be enough to make 8K viable.
Currently, no. But that was true for 4K and for 1080p. Advances happen and time marches on. If you can't see the forest for the trees, that's on you.
 
I wouldn't say that. The 7900XTX, that's plausible. The 3090? Not so much, but it's close. The 4080/4080ti/4090 that's in the ballpark. We're not far away from 8k, it's gonna happen, just a matter of when.

-Just saying both AMD and Nvidia marketed those cards as "8k" cards*

*With FSR/DLSS ultra performance + FG
 
This is exactly how you gain market share. Sell at a discount, even at a loss if you can afford it. It's not a short-term plan.
As usual, I will err on the cautious side and set myself up to be pleasantly surprised, rather than the other way around.
While having regular discounts certainly can help boost sales, what has been a persistent problem for AMD over the last decade is that whenever they have a portion of the market where they have an edge over Nvidia, Nvidia still outsell them by 10x in that segment, not because Nvidia have more fanboys (AMD have far more), but because the selection of models in stock at or around MSRP has been terrible. Far too often I'll see base models from AMD sold out for months, and only the pricey models in stock. Paying ~$20-30 extra for Nvidia's counterpart that's either in stock or has confirmed delivery dates is then very appealing.

So now with AMD having a slight edge with RX 9070, and a good chance of well positioned RX 9060 / 9060 XT, they should above all flood the market with as many as possible, which will also prevent stores from overcharging.

If they do get some faulty dies save them and then release a 9050XT with a 96bit bus and 12GB of VRAM for ~$250 when there is enough of a stockpile for it to last. I really think 96bit with 12GB will be a far better 1080p part than an 8GB 128bit part would be.
Having more VRAM doesn't give you more performance; it only makes you capable of displaying higher details, provided you have the bandwidth and computational performance to go along with it. A theoretical "9050 XT" would be way too slow to make any real use of 12 GB during gaming, and would probably approach ~10 FPS in loads that actually require it. Meanwhile, having faster and slightly more expensive VRAM would probably yield an extra ~3-5% performance across the board, which is way more valuable to the customer than extra VRAM that would only come into play when running unrealistic use cases no buyer would ever hit.
 