Thursday, January 30th 2025

AMD Radeon 9070 XT Rumored to Outpace RTX 5070 Ti by Almost 15%

It would be fair to say that the GeForce RTX 5080 has been quite disappointing, coming in only around 16% faster in gaming than the RTX 4080 Super. Unsurprisingly, this gives AMD a lot of opportunity to offer excellent price-to-performance with its upcoming RDNA 4 GPUs, considering that the RTX 5070 and RTX 5070 Ti aren't really expected to pull off any miracles. According to a recent tidbit shared by the well-known leaker Moore's Law Is Dead (MLID), the Radeon RX 9070 XT is expected to be around 3% faster than the RTX 4080, if AMD's internal performance goals are anything to go by. MLID also notes that RDNA 4's performance is improving by roughly 1% each month, which makes it quite likely that the RDNA 4 cards will exceed those targets.

If it does turn out that way, the Radeon RX 9070 XT, according to MLID, should be roughly 15% faster than its competitor from the green camp, the RTX 5070 Ti, and roughly match the RTX 4080 Super in gaming performance. The Radeon RX 9070, on the other hand, is expected to be around 12% faster than the RTX 5070. Of course, these improvements are limited to rasterization performance; once ray tracing enters the picture, the gains are expected to be substantially more modest, as per tradition. Citing our Cyberpunk 2077 4K ray tracing data, MLID stated that his sources indicate the RX 9070 XT falls somewhere between the RTX 4070 Ti Super and RTX 3090 Ti, whereas the RX 9070 should trade blows with the RTX 4070 Super. Considering AMD's track record with ray tracing, that sounds quite enticing.
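For readers who want to see how these rumored percentages stack up, here is a quick back-of-the-envelope sketch. The inputs are only the rumored figures quoted above, not measured data:

```python
# Back-of-the-envelope check of how the rumored figures compose.
# Inputs are rumors quoted in the article, not measurements.
xt_vs_4080 = 1.03     # RX 9070 XT rumored ~3% faster than RTX 4080
xt_vs_5070_ti = 1.15  # RX 9070 XT rumored ~15% faster than RTX 5070 Ti

# If both rumors hold simultaneously, they imply this gap between
# the RTX 4080 and the RTX 5070 Ti:
implied_4080_vs_5070_ti = xt_vs_5070_ti / xt_vs_4080
print(f"Implied RTX 4080 vs RTX 5070 Ti: {implied_4080_vs_5070_ti:.3f}x")  # ~1.117x
```

If independent reviews put the RTX 4080 and RTX 5070 Ti closer together than that, one of the two rumored percentages has to give.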

Of course, it will all boil down to pricing once the RDNA 4 cards hit the scene. If AMD does manage to undercut its competitors from NVIDIA by a reasonable margin, there is no doubt that RDNA 4 will be the better choice for most people. However, NVIDIA's undeniable lead in ray tracing, paired with DLSS 4, will presumably make things more complicated than ever before. It is unclear what AMD has up its sleeve with FSR 4. Recent rumors do point to pretty good compatibility, but as with all rumors, be sure to take any pre-release whispers with a grain of salt.
Source: MLID via YouTube

74 Comments on AMD Radeon 9070 XT Rumored to Outpace RTX 5070 Ti by Almost 15%

#51
Lost_Wanderer
Well if Blackwell is this generation’s G92, I’d like Navi 48 to be this generation’s RV770. I’d still wait for reviews.
#52
Aylimta
Sooo, incoming 700 USD price and yet another dud launch?
#53
Hiner101
I understand that after that mess of an uplift from Blackwell, it's hard to believe that a product can improve from one generation to the next. But it's true, it's always been this way. Companies improve their products; it's not strange at all. NVIDIA has done it in the past, but this time it's AMD that has done it. I don't get why that's so hard to accept...

We always need to wait for the reviews to get the final numbers, and clearly no one wants to sell nonsense like NVIDIA did with the 5070 = 4090 claim ;)

By the way, have you seen the benchmarks of the XTX with the DeepSeek R1 model compared to the 4090? As soon as you step out of the NVIDIA "ecosystem", which has become as closed and dangerous for evolution as Apple's, and move to a more open system, the numbers start to look different. I'd love to see what would happen if game developers began to lean on NVIDIA a bit less. It's logical that NVIDIA has a much larger market share and has been leveraging it for years, but there's life beyond it as well.

The 9070 XT will definitely be very competitive against the 5070 Ti. By the way, it's not new to see the 9070 XT performing close to the XTX in raster; there's always been talk of achieving performance near the 4080 or the XTX in raster. And we've known for months that ray tracing has improved significantly, especially since the PS5 Pro with RDNA 3.5 (at least +50%) was introduced, which has been somewhat of a development platform for RDNA 4.

The 5080 was never its target; the focus remains on the 5070 Ti. But considering how little improvement NVIDIA has brought, it will still be exciting to see the 9070 XT in the high-end range of the charts!
#54
Vayra86
I'm calling BS on this story.
#55
_roman_
Another day, another "AMD graphics card does this and that" story without facts.
#56
mkppo
Lost_Wanderer: Well if Blackwell is this generation's G92, I'd like Navi 48 to be this generation's RV770. I'd still wait for reviews.
This exact thought came to my mind lol. Where's that 4870 team that David Wang and co. led? They were great. There was no shitty marketing, great prices, great cards, no nonsense. They made Nvidia look like absolute fools that generation. IIRC Nvidia did a massive 25% or so price cut a couple of months after launch because of RV770's arrival. Not happening now, but for a card that's supposed to be a stopgap till UDNA, if it can somehow be close to the 4080S, that's not too far off a 5080, which I would've never expected, but that 5080 turned out to be a POS on PCIe.

The other thing I noticed is that in the Nvidia 5xxx reviews, TPU had that strange, unexpected and frankly unnecessary line about not being sure if AMD will be in the GPU space in a couple of years. Didn't really expect it from them, as it was... idk, something the trashy rumor sites would post, and I'll stop at that. Anyway, what people fail to realize is development cycles and the company's position at the time.

1) GCN was being developed around 2008-2009, when AMD inherited the arch in its infancy from ATI, who were doing quite well. It was a banger, and even though they ran out of money right after launch, it served them well for a decade.

2) RDNA was developed around 2016-2017, when AMD were deep in debt and putting all their money, hopes and dreams into Ryzen. It turned out okay, but nothing close to what GCN achieved.

3) UDNA is being developed now, when AMD have money, resources, time and a bunch of clowns in their marketing department. Speaking to people at AMD, they're putting a lot of resources into that thing and rightly so - their whole AI money pit depends on it. There's every possibility it's going to be another banger, but let's wait and see. I just can't see it being worse than RDNA on the 'relative to competition' basis.

It's supposed to launch around the same time TPU claims AMD's discrete GPU division might not be around, so erm... let's wait and see, I suppose.
#57
AusWolf
mkppo: This exact thought came to my mind lol. Where's that 4870 team that David Wang and co. led? [...] It's supposed to launch around the same time TPU claims AMD's discrete GPU division might not be around, so erm... let's wait and see, I suppose.
Someone correct me if I'm wrong, but TPU reviews are somewhat Nvidia-flavoured in general. I mean, the 5080 gets a "highly recommended" badge despite being nothing more than a 4080 Super rinse-and-repeat for the exact same price? C'mon...

I didn't even know about the janky comment on "AMD not being in the business later". I stopped reading/watching conclusions ages ago because they are way too swayed by the reviewer's personal taste, and that's not just TPU, it's everywhere. Anyway, why would AMD quit the business when they own basically the entire console APU market? That comment is just stupid.
#58
Denver
AusWolf: Someone correct me if I'm wrong, but TPU reviews are somewhat Nvidia-flavoured in general. I mean, the 5080 gets a "highly recommended" badge despite being nothing more than a 4080 Super rinse-and-repeat for the exact same price? C'mon... [...]
Maybe using Cyberpunk to measure efficiency—one of the games with the biggest gains in Blackwell over the previous generation—or picking unpopular titles that favor Nvidia, or even making a big ad out of the 5xxx series launch, creates that impression. But TPU is neutral, just like Digital Foundry, and everything is just another conspiracy theory, right? RIGHT?

Caution: The above post contains twisted humor, sarcasm and cannot be used as grounds for legal action.
#59
alwayssts
3valatzy: Question 1: How? It will have extremely low memory bandwidth - leaks of GPU-Z show a mediocre 644 GB/s. For comparison, the RTX 5090 is able to reach 2176 GB/s!

Question 2: How, with only 16 GB? If VRAM doesn't matter, then why do they put in so much, and why not limit the cards to 10 GB (as with the RTX 3080)?

16GB is the right amount of RAM, just like 12GB is the right amount for the 5070 and 18-20GB is enough for the 5080. Whoops, one of those didn't happen.
Gotta sell it later with 24GB of RAM as a 5080 Super when 3GB module prices go down. (Probably Micron, but ignore that part!)
Whoops, it doesn't have 8 clusters so it doesn't need 24GB of RAM; better sell it again as a 6080 with a slightly higher clock speed when we can make it cheaper on 3nm.
I ain't even lying. The 5080 at 16GB is ridiculous; always was. That's why these cards will get so close: because they're well-matched in terms of compute/VRAM.

As I've said before, N31 was equalized to 2720 MHz with 20 Gbps RAM. If you figure the same cache and it's essentially 2/3 of N31, it would be the same.

The difference is that it's most likely used in a 7511.111/64 ROP use case, hence you get the clock speed of 2.97 GHz. Strangely, just above where a 7800 XT can overclock... WEIRD!

To me, these rumors look correct in terms of absolute performance. Tom is a reliable guy who shares credible information (at the time he receives it), and the numbers fit with my thesis of these cards' likely capability.
That said, again, to me this looks like OC/absolute performance... unless AMD is adjusting clock speeds to where I think they should have been all along, and there are some cache/faster-RAM shenanigans we aren't privy to.
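On the bandwidth figure quoted at the top of this post: the leaked 644 GB/s is consistent with a 256-bit bus at roughly 20 Gbps GDDR6. A minimal sketch, assuming the 256-bit bus from the leaks (unconfirmed), showing how the number falls out:

```python
# Memory bandwidth = bus width (bits) x per-pin data rate (Gbps) / 8 bits per byte.
# Assumed from leaks (unconfirmed): 256-bit bus; the per-pin rate here is
# back-solved from the leaked 644 GB/s (a plain 20 Gbps gives an even 640 GB/s).
bus_width_bits = 256
data_rate_gbps = 20.125

bandwidth_gb_per_s = bus_width_bits * data_rate_gbps / 8
print(f"{bandwidth_gb_per_s:.0f} GB/s")  # 644 GB/s
```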
#60
AusWolf
Denver: Maybe using Cyberpunk to measure efficiency—one of the games with the biggest gains in Blackwell over the previous generation—or picking unpopular titles that favor Nvidia, or even making a big ad out of the 5xxx series launch, creates that impression. [...]
Of course. The 5080 is a massive improvement, it's very cheap, it washes your car and makes coffee, too. We all know that, right? ;)
#61
alwayssts
Denver: But TPU is neutral, just like Digital Foundry,
I literally just spit out my tea.

I come here for the charts: voltage/clock ranges etc., to deduce bandwidth/arch limitations through them.
There is great info here, but it ain't in the conclusion. Also, I have only once accused W1zzard of payola from PNY/nVIDIA. Only once. Because I don't want to get banned.
That said, he created some amazing tools/charts that these companies use to build their product stacks, because many consumers see them as the gold standard, so he deserves some kickbacks respect for that wonderful continued work over many years.
Remember: 8GB is enough.

I listen to DF to scream out loud about their bias, mostly (which I think hurts/confuses many consumers). They certainly have ins at nVIDIA for info, which, if you parse through the regurgitated marketing (which they perhaps have to repeat to keep those ins), is actually interesting. It's a shame many in the general populace get brainwashed by it though.
Remember that time a guy revealed to them that frame-gen runs on the tensor cores? That was pretty hilarious (whoops!). Guess we can't sell the Optical Flow snake oil anymore; put it in the back with the G-Sync module. I'll be looking forward to DF never mentioning it again so FG doesn't have to be back-ported.
Again, I go to them for their frame-time/rate vids, image comparisons, mentioning the resolution scale, analyzing clocks etc., and they're very good at that. They deserve massive respect for pioneering/updating those tests et al. for more consumers to see. Also, sometimes they zoom-crop FFVIIR bikinis, sometimes they don't. I feel that. I do.
But I'll never forget the time they had to be overnighted a card to test an AMD feature, because they don't even keep them around. That was pretty telling. Speak for the average/balanced/long-term consumer wrt products they do not. Proudly, I guess. If they could do it with a little less propaganda though, that would be cool.
#62
EsliteMoby
I doubt 64 CUs will do much other than being another refresh of the 7800 XT. I guess they were using the best-case scenario at 1080p.
#63
AusWolf
EsliteMoby: I doubt 64 CUs will do much other than being another refresh of the 7800 XT. I guess they were using the best-case scenario at 1080p.
Don't forget about clock speeds. The 7800 XT runs at 2.4 GHz, the 9070 XT is at 2.9-3 GHz.
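For a rough sense of what that clock bump alone is worth at the same 64 CU count, here is a quick sketch; it assumes performance scales linearly with clock, which real games never quite do, so treat it as an upper bound:

```python
# Naive clock-scaling estimate: same CU count, only the clock changes.
# Linear scaling is an upper bound; memory bandwidth and power limits
# always eat into it in practice.
clock_7800_xt_ghz = 2.4   # typical game clock, per the post above
clock_9070_xt_ghz = 2.97  # rumored

upper_bound = clock_9070_xt_ghz / clock_7800_xt_ghz - 1
print(f"Clock-for-clock upper bound: +{upper_bound:.1%}")  # about +24%
```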
#64
mkppo
Denver: Maybe using Cyberpunk to measure efficiency—one of the games with the biggest gains in Blackwell over the previous generation—or picking unpopular titles that favor Nvidia, or even making a big ad out of the 5xxx series launch, creates that impression. [...]
The efficiency thing needs to be revised. I just had a look at other reviews, and 5080 efficiency isn't better than the 4080S; it depends on the game. CP shows the 5xxx series in a much better light than it should be in this regard.

With efficiency swaying so much from game to game, there needs to be an average of sorts, I suppose. CP also comparatively shows AMD worse off as well.

There's a power-draw average for games, and a performance summary as well, in the review itself. Pretty sure an equation can be put in there, because the data for the averages already exists. I haven't put much thought into it; it's only something that crossed my mind rn. But when I glanced over that data, it seems the 5080 consumes around 10% more power for 12% more performance. Certainly not 11% more efficient overall, not even close.
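To make that arithmetic explicit: efficiency is performance per watt, so the two percentages combine as a ratio, not by subtraction. A minimal sketch using the rough figures from the post above:

```python
# Perf-per-watt from the approximate averages quoted above.
perf_ratio = 1.12   # 5080 vs 4080S: ~12% more performance (approximate)
power_ratio = 1.10  # 5080 vs 4080S: ~10% more power draw (approximate)

efficiency_gain = perf_ratio / power_ratio - 1
print(f"Overall efficiency gain: {efficiency_gain:.1%}")  # ~ +1.8%, nowhere near 11%
```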
#65
sLowEnd
AusWolf: Someone correct me if I'm wrong, but TPU reviews are somewhat Nvidia-flavoured in general. I mean, the 5080 gets a "highly recommended" badge despite being nothing more than a 4080 Super rinse-and-repeat for the exact same price? C'mon...
It's "highly recommended" because of the sad state of the market. The 5080 is still technically the best card you can get new at its price point.
#66
Sound_Card
ZoneDymo: and that is supposed to be exciting?

This comment is so over the top, it reads like sarcasm
Nope, it is the mass psychosis I have talked about here for a bit.

The poetry writes itself; it's like Lady Luck loves AMD. lol.

Watch AMD release a 32GB 9070 XT just to mind-f'k the market with AI.
#67
AusWolf
sLowEnd: It's "highly recommended" because of the sad state of the market. The 5080 is still technically the best card you can get new at its price point.
It's the only card you can get at this price point, but whether that alone makes it good enough for a recommendation is questionable at best, imo.
Sound_Card: Nope, it is the mass psychosis I have talked about here for a bit. [...] Watch AMD release a 32GB 9070 XT just to mind-f'k the market with AI.
I'm just afraid that even if Nvidia throws such a low ball, AMD still won't know how to respond and will overprice the 9070 XT. Let history prove me wrong.
#68
Sound_Card
It's starting to make sense why AMD has three 8-pin power connectors and giant coolers on some SKUs now. They plan to clock their midrange into 5080 territory, by the looks of what is happening with Nvidia's clown show.
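For context on what three 8-pin connectors imply: the PCIe CEM spec rates each 8-pin connector at 150 W and the x16 slot at 75 W, so the rated ceiling works out as below (the card's actual power limit will be whatever AMD sets, almost certainly well under this):

```python
# Rated board-power ceiling for a card with three 8-pin PCIe connectors.
# Connector and slot ratings are from the PCIe CEM spec; real cards are
# configured below this ceiling.
PCIE_8PIN_W = 150  # rated watts per 8-pin connector
PCIE_SLOT_W = 75   # rated watts from the x16 slot

ceiling_w = 3 * PCIE_8PIN_W + PCIE_SLOT_W
print(f"Rated ceiling: {ceiling_w} W")  # 525 W
```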
#69
alwayssts
mkppo: This exact thought came to my mind lol. Where's that 4870 team that David Wang and co. led? [...] It's supposed to launch around the same time TPU claims AMD's discrete GPU division might not be around, so erm... let's wait and see, I suppose.
Bonus points for mentioning Baumann's baby. I also mentioned that in a post not too long ago. I think I posted it, maybe I deleted it... I forget. I do that sometimes.
I get very saddened by people buying into nVIDIA's savvy crap that does not benefit them long-term but they think does (until they complain about it later), and reviewers aren't helping. I won't get into it.
Some people kinda/sorta already did, but it goes beyond that in ways I don't want to get into. I don't want to start a fight with any YouTube math teachers who want allocation.
I look to AMD for solace, it isn't there, and I get mad. It doesn't mean they don't and/or can't make good-value products, but they used to LEAD in very important aspects, and make their strengths known.
I know it comes out in my posts, and I apologize for that. There's just something about the culture changing from nerds to normies who think they understand, but don't; really it's mostly nVIDIA marketing.

...and AMD's marketing is awful to boot, which doesn't help. The whole 9070 series thing is a gigantic clusterfuck the likes of which I have never seen before, and they should be ashamed.
I get that they want time to catch up on features, but how they went about exposing this series and then trying to shove it back in the closet is beyond ridiculous. The price/placement uncertainty... it's bad form.

Whoever they have now in marketing is no Dave Baumann. Hell, whoever they have now is no Scott Herkelman. Very obvious things are in disarray now, perhaps because of layoffs.
Sound_Card: It's starting to make sense why AMD has three 8-pin power connectors and giant coolers on some SKUs now. They plan to clock their midrange into 5080 territory, by the looks of what is happening with Nvidia's clown show.
Higher than that, my friend.
AusWolf: Don't forget about clock speeds. The 7800 XT runs at 2.4 GHz, the 9070 XT is at 2.9-3 GHz.
Overclock a 7800 XT. It's clocked in the toilet at stock, for marketing purposes around this very card.

It gains around 19% performance in many cases. Look at W1zzard's reviews.

The best a 7800 XT can clock is 2936 MHz, but a 7700 XT (oddly similar across multiple cards) does 3133 MHz. That doesn't make any sense, other than that the 7800 XT was obviously going to be clock-limited but instead got capped by its power limit.
Oddly, the 7900 XTX can also hit around 3165-3200 MHz on the same arch before going power-bananas?

It's stinky. Reeks of artificial product segmentation (granted, RDNA 4 has 8-bit ops; tensor cores). He's not wrong (for the most part).

The question truly is how high it will clock. If it's only ~3.3-3.4 GHz max, that's bad, considering the die size is ~15-20% larger than it should be.
Not necessarily bad for those products, but for the chip overall, if it can't be binned higher (at >375 W).

If it's 3.5-3.7 GHz on 3x 8-pin (and they release a >20 Gbps RAM card), then we're talking an actual improvement wrt chip design and not just marketing tactics.
#70
mkppo
alwayssts: Bonus points for mentioning Baumann's baby. [...] Whoever they have now in marketing is no Dave Baumann. Hell, whoever they have now is no Scott Herkelman. Very obvious things are in disarray now, perhaps because of layoffs.
Your thoughts echo mine; I couldn't have said it better, and holy shit, it's all coming back - it was Dave Baumann who AMD hired from ATI, and he wasn't only in charge of marketing but also pushed the engineers to amp up the RV770 into what it became. Here's the article if you're interested. Also, what a bloody awesome 'review'; I'd recommend it to anyone who has time to kill, and I can guarantee you'll learn a thing or two. I remember learning something new every time a new review came up on AT, or just browsing through their and B3D's forums back then. It's been almost 20 years now, but it makes me sad that those reviews are long gone. People got into gimmicks more, YT became the primary source of reviews, and 98% of those reviewers don't know a thing about what they're reviewing; they drink the kool-aid, lick the manufacturers and give a glowing review so they get another sample. You can guess which company is licked more - you know, the one that's slightly more threatening.

There have been times in the past where I couldn't really understand why reviewers were taking the angle they were. AMD had the better architecture a few times going up against Nvidia, but it wasn't really reflected in reviews and (partly) consequently in sales. Take the 290X - it was architecturally much superior to the GTX 780, and that was reflected in its longevity. Guess what the reviewers, and consequently the general public, thought of it? It's hot, loud and sucks power. Sales? Pfft. To those who weren't there: no, there were no proprietary features that were worth their salt, nothing. Sure, Nvidia kept trying one thing after another to lock consumers into their ecosystem, but they all failed until then (they certainly learned from their failures though - see today). AMD's brand perception of being the 'cheaper Intel' didn't help, nor did their decade of pulling GCN along when they were broke. I don't want to really get into it either, but I do hope things change (as a whole) in the next decade.

Then there's Mantle. AMD literally fixed a whole clusterfuck of issues under the hood, and it paved the way for the DX12/Vulkan we all enjoy now. Most people are under the impression that it's only Nvidia who released all the great new features throughout the past couple of decades. Sure, they released a whole heap of features, and a few of them turned out to be great successes today, but I won't detail the ones they bought and killed off, or the pissfight they had with tessellation. There were times it felt like (and it turned out to be true) they were only trying to increase their performance margin at the expense of consumers. I won't get into the fact that it's still happening. Not a word from peeps though; it's okay. But let's not forget that the other camp did a lot for your GPUs as well.

Lisa needs to talk to a few ex-ATI people in marketing. And fire Azor and a couple of the other clowns today. This stupidity really needs to stop. I thought they got their marketing shit together with the 6xxx launch, but then they decided to go ahead and fuck up the other two - somehow, one of them before they even launched it. That's a new low tbh.

AMD/ATI has come back from way behind a few times. The 9700 Pro, the 4870, and to a lesser extent the 3870 and 7970, are ones that come to mind. Hell, the 6900 XT was the same, and it wasn't that long ago. No one's lead is insurmountable, but proprietary features are hard to crack, and I doubt I'll see much change in the competitive landscape anytime soon. One can hope.
#71
AusWolf
I just had a thought...

Everyone thought AMD delayed the 9070 XT because they found out that Nvidia's cards are too good, so the price had to be adjusted down.

What if they actually found out that Nvidia doesn't offer anything on top of last gen in the midrange, so the price of the 9070 XT actually has to be adjusted up?

So it's not "hey look, the 5070 Ti is only $750, so we can't sell the 9070 XT for $900", but instead "look at these pieces of crap, we really shouldn't be selling the 9070 XT for $500, how about $700 instead".
#72
JustBenching
How can MLID know where it scores in Cyberpunk 2077 when TPU runs a custom scene? He is making stuff up, as per usual.
#73
JustBenching
Hecate91: Sure, AMD was the greedy one, while Nvidia tried to sell the 4080 12GB for $900, then re-labeled it as a 4070 Ti for $800, and then re-released it as a Super version; that's the real greed.
Just so we are on the same page, that 4080 12GB / 4070 Ti was still better in both raster/$ and RT/$ than its competitor, the 7900 XT. So yeah, AMD is super greedy when they managed to out-greed the greediest Nvidia card.
AusWolf: Someone correct me if I'm wrong, but TPU reviews are somewhat Nvidia-flavoured in general. I mean, the 5080 gets a "highly recommended" badge despite being nothing more than a 4080 Super rinse-and-repeat for the exact same price? C'mon... [...]
Last gen, both the XTX and the 4080 got Editor's Choice. Should that make me feel like TPU is leaning AMD? You all need to stop with those conspiracies.
#74
kondamin
Hmm, a bit late publishing this. It would have helped more if he had put out his leak a day earlier, so more people would have stayed home and slightly lowered the demand for the 5080.