
AMD Radeon 9070 XT Rumored to Outpace RTX 5070 Ti by Almost 15%

Joined
Nov 20, 2021
Messages
146 (0.13/day)
…I’m gonna wait until reviews come out to confirm, because that’s way too good to be true.

For once, I actually think it's better for AMD to keep quiet. Remember the numbers they were advertising for the 7900 XT and XTX, and how they got nowhere close?
Yep, that did no good at all.

Just let the cards do the talking themselves when they release.
EXACTLY
 
Joined
May 15, 2020
Messages
49 (0.03/day)
System Name Gaming
Processor AMD Ryzen 7 9800X3D
Motherboard Asus ROG Strix B850-E
Cooling TR Phantom Spirit 120 EVO w/ two NF-A12x25 chromax.black, and TG Kryonaut Extreme
Memory TEAMGROUP T-Create Expert 2x32GB DDR5-6400 (CL 34, CTCWD564G6400HC34BDC01)
Video Card(s) PowerColor Red Devil RX 6950 XT
Storage 1x Samsung 990 Pro 4TB, 1x Lexar NM790 8TB
Display(s) AOC Agon PRO AG274QZM 27” (240Hz, 1440p Mini-LED), Razer Raptor 27" (165Hz, 1440p)
Case GAMDIAS NESO P1 B w/ 4x Silent Wings Pro 140mm, and 6x Grand Tornado 120mm
Audio Device(s) Klipsch ProMedia THX 2.1, Razer BlackShark V2
Power Supply FSP Hydro Ti PRO 1000W (HTI-1000M-GEN5)
Mouse ENDGAME GEAR OP1 8k
Keyboard Ducky One 3 Pro Full Size w/ MX2A Blues, and PBT Pudding Keycaps
Software Windows 11 Pro
Well if Blackwell is this generation’s G92, I’d like Navi 48 to be this generation’s RV770. I’d still wait for reviews.
 
Joined
Jun 26, 2021
Messages
7 (0.01/day)
I understand that after that mess of an uplift from Blackwell, it’s hard to believe that a product can improve from one generation to the next. But it’s true, it’s always been this way. Companies enhance their products. It’s not strange at all. NVIDIA has done it in the past, but this time it’s AMD that has done it. I don’t get why that’s so hard to accept...

We always need to wait for the reviews to get the final numbers, and clearly no one wants to sell nonsense like NVIDIA did with the 5070 = 4090 claim ;)

By the way, have you seen the benchmarks of the XTX with the DeepSeek R1 model compared to the 4090? As soon as you step out of the NVIDIA "ecosystem", which has become as closed and as dangerous for evolution as Apple's, and move to a more open system, the numbers start to look different. I’d love to see what would happen if game developers began to consider NVIDIA a bit less. It’s logical, NVIDIA has a much larger market share and has been leveraging it for years, but there's life beyond it as well.

The 9070 XT will definitely be very competitive against the 5070 Ti. By the way, it’s not new to see the 9070 XT performing close to the XTX in raster; there’s always been talk of achieving performance near the 4080 or the XTX in raster. And we’ve known for months that ray tracing has improved significantly, especially since the PS5 Pro with RDNA3.5 (at least +50%) was introduced, which has been somewhat of a development platform for RDNA4.

The 5080 was never its target; the focus remains on the 5070 Ti. But considering how little improvement NVIDIA has brought, it will still be exciting to see the 9070 XT in the high-end range of the charts!
 
Joined
Sep 17, 2014
Messages
23,106 (6.10/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I'm calling BS on this story.
 
Joined
Oct 30, 2020
Messages
377 (0.24/day)
Location
Toronto
System Name GraniteXT
Processor Ryzen 9950X
Motherboard ASRock B650M-HDV
Cooling 2x360mm custom loop
Memory 2x24GB Team Xtreem DDR5-8000 [M die]
Video Card(s) RTX 3090 FE underwater
Storage Intel P5800X 800GB + Samsung 980 Pro 2TB
Display(s) MSI 342C 34" OLED
Case O11D Evo RGB
Audio Device(s) DCA Aeon 2 w/ SMSL M200/SP200
Power Supply Superflower Leadex VII XG 1300W
Mouse Razer Basilisk V3
Keyboard Steelseries Apex Pro V2 TKL
Well if Blackwell is this generation’s G92, I’d like Navi 48 to be this generation’s RV770. I’d still wait for reviews.
This exact thought came to my mind lol. Where's that 4870 team that David Wang and co led? They were great. There was no shitty marketing, great prices, great cards, no nonsense. They made nvidia look like absolute fools in that generation. IIRC nvidia had like a massive 25% or so price cut a couple of months after launch because of RV770's arrival. Not happening now, but for a card that's supposed to be a stopgap till UDNA, if it can somehow get close to the 4080S, that's not too far off a 5080, which I would've never expected, but then that 5080 turned out to be a POS on PCI-E.

The other thing I noticed is in the nvidia 5xxx reviews TPU had that strange, unexpected and frankly unnecessary line about not being sure if AMD will be in the GPU space in a couple of years. Didn't really expect it from them as it was...idk something the trashy rumor sites would post and i'll stop at that. Anyway, what people fail to realize is development cycles and the company's position at the time.

1) GCN was being developed around 2008-2009, when AMD inherited the arch during its infancy from ATI, who were doing quite well. It was a banger, and even though they ran out of money right after launch, it served them well for a decade.

2) RDNA was developed around 2016-2017 when AMD were deep in debt and putting all their money, hopes and dreams on Ryzen. It turned out okay, but nothing close to what GCN achieved.

3) UDNA is being developed now, when AMD have money, resources, time and a bunch of clowns in their marketing department. Speaking to people at AMD, they're putting a lot of resources into that thing and rightly so - their whole AI money pit depends on it. There's every possibility it's going to be another banger, but let's wait and see. I just can't see it being worse than RDNA on the 'relative to competition' basis.

It's supposed to launch around the same time TPU claims AMD discrete GPU division might not be around so erm..let's wait and see I suppose.
 
Joined
Jan 14, 2019
Messages
13,951 (6.32/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanic
VR HMD Not yet
Software Linux gaming master race
This exact thought came to my mind lol. Where's that 4870 team that David Wang and co led? They were great. There was no shitty marketing, great prices, great cards, no nonsense. They made nvidia look like absolute fools in that generation. IIRC nvidia had like a massive 25% or so price cut a couple of months after launch because of RV770's arrival. Not happening now, but for a card that's supposed to be a stopgap till UDNA, if it can somehow get close to the 4080S, that's not too far off a 5080, which I would've never expected, but then that 5080 turned out to be a POS on PCI-E.

The other thing I noticed is in the nvidia 5xxx reviews TPU had that strange, unexpected and frankly unnecessary line about not being sure if AMD will be in the GPU space in a couple of years. Didn't really expect it from them as it was...idk something the trashy rumor sites would post and i'll stop at that. Anyway, what people fail to realize is development cycles and the company's position at the time.

1) GCN was being developed around 2008-2009, when AMD inherited the arch during its infancy from ATI, who were doing quite well. It was a banger, and even though they ran out of money right after launch, it served them well for a decade.

2) RDNA was developed around 2016-2017 when AMD were deep in debt and putting all their money, hopes and dreams on Ryzen. It turned out okay, but nothing close to what GCN achieved.

3) UDNA is being developed now, when AMD have money, resources, time and a bunch of clowns in their marketing department. Speaking to people at AMD, they're putting a lot of resources into that thing and rightly so - their whole AI money pit depends on it. There's every possibility it's going to be another banger, but let's wait and see. I just can't see it being worse than RDNA on the 'relative to competition' basis.

It's supposed to launch around the same time TPU claims AMD discrete GPU division might not be around so erm..let's wait and see I suppose.
Someone correct me if I'm wrong, but TPU reviews are somewhat Nvidia-flavoured in general. I mean, the 5080 gets a "highly recommended" badge despite being nothing more than a 4080 Super rinse-and-repeat for the exact same price? C'mon...

I didn't even know about the janky comment on "AMD not being in the business later". I stopped reading/watching conclusions ages ago because they are way too swayed by the reviewer's personal taste, and that's not just TPU, it's everywhere. Anyway, why would AMD quit the business when they own basically the entire console APU market? That comment is just stupid.
 
Joined
Oct 6, 2021
Messages
1,648 (1.36/day)
Someone correct me if I'm wrong, but TPU reviews are somewhat Nvidia-flavoured in general. I mean, the 5080 gets a "highly recommended" badge despite being nothing more than a 4080 Super rinse-and-repeat for the exact same price? C'mon...

I didn't even know about the janky comment on "AMD not being in the business later". I stopped reading/watching conclusions ages ago because they are way too swayed by the reviewer's personal taste, and that's not just TPU, it's everywhere. Anyway, why would AMD quit the business when they own basically the entire console APU market? That comment is just stupid.
Maybe using Cyberpunk to measure efficiency—one of the games with the biggest gains in Blackwell over the previous generation—or picking unpopular titles that favor Nvidia, or even making a big ad out of the 5xxx series launch, creates that impression. But TPU is neutral, just like Digital Foundry, and everything is just another conspiracy theory, right? RIGHT?

Caution: The above post contains twisted humor, sarcasm and cannot be used as grounds for legal action.
 
Joined
May 13, 2008
Messages
782 (0.13/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
Question 1: How? It will have extremely low memory bandwidth - leaks of GPU-Z show a mediocre 644 GB/s. For comparison, the RTX 5090 is able to reach 2176 GB/s!

Question 2: How, with only 16 GB? If VRAM doesn't matter, then why do they put in so much, and why not limit the cards to 10 GB (as on the RTX 3080)?


16 GB is the right amount of RAM, just like 12 GB is the right amount for the 5070 and 18-20 GB is enough for the 5080. Whoops, one of those didn't happen.
Gotta sell it later with 24 GB of RAM as the 5080 Super when 3 GB module prices go down. (Probably Micron, but ignore that part!)
Whoops, it doesn't have 8 clusters so it doesn't need 24 GB of RAM; better sell it again as the 6080 with a slightly higher clock speed when we can make it cheaper on 3 nm.
I ain't even lying. The 5080's 16 GB is ridiculous; always was. That's why these cards will get so close: because they're well-matched in terms of compute/VRAM.

As I've said before, N31 was equalized to 2720 MHz with 20 Gbps RAM. If you figure the same cache, and this is essentially 2/3 of an N31, the balance would be the same.

The difference is it's most likely used in a 7511.111/64 ROP use-case, hence you get the clock speed of 2.97 GHz. Strangely, just above where a 7800 XT can overclock... WEIRD!

To me, these rumors look correct in terms of absolute performance. Tom is a reliable guy who shares credible information (at the time he receives it), and the numbers fit with my thesis of their likely capability.
That said, again, to me, this looks like OC/absolute performance... unless AMD is adjusting clock speeds to where I think they should have always been all along, and there's some cache/faster-RAM shenanigans we aren't privy to.
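
To put rough numbers on that balance argument: a minimal sketch, assuming relative shader throughput simply scales with CU count × clock and ignoring any cache or architectural differences. The N31 figures are the public 7900 XTX specs; the 64 CU, ~2.97 GHz and 644 GB/s values are only the rumored/leaked numbers discussed in this thread.

Code:
# Crude compute-to-bandwidth balance check. Relative compute = CUs x clock,
# which ignores IPC, cache and everything else, so treat it as a sanity
# check rather than a prediction.
def relative_compute(cus, clock_ghz):
    return cus * clock_ghz

n31_compute = relative_compute(96, 2.72)   # N31 "equalized" at 2720 MHz
n31_bw = 384 / 8 * 20                      # 384-bit bus x 20 Gbps = 960 GB/s

n48_compute = relative_compute(64, 2.97)   # rumored 64 CUs at ~2.97 GHz
n48_bw = 644                               # GB/s from the leaked GPU-Z shot

print(f"N31: {n31_compute / n31_bw:.3f} relative compute per GB/s")   # ~0.272
print(f"N48: {n48_compute / n48_bw:.3f} relative compute per GB/s")   # ~0.295
ratio = (n48_compute / n48_bw) / (n31_compute / n31_bw)
print(f"N48 runs about {ratio - 1:.0%} more compute per GB/s than N31")  # ~9%

By that crude measure the rumored part is only slightly more compute-heavy per unit of bandwidth than N31, which is why the 644 GB/s figure alone doesn't look like a dealbreaker.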
 
Joined
Jan 14, 2019
Messages
13,951 (6.32/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanic
VR HMD Not yet
Software Linux gaming master race
Maybe using Cyberpunk to measure efficiency—one of the games with the biggest gains in Blackwell over the previous generation—or picking unpopular titles that favor Nvidia, or even making a big ad out of the 5xxx series launch, creates that impression. But TPU is neutral, just like Digital Foundry, and everything is just another conspiracy theory, right? RIGHT?

Caution: The above post contains twisted humor, sarcasm and cannot be used as grounds for legal action.
Of course. The 5080 is a massive improvement, it's very cheap, it washes your car and makes coffee, too. We all know that, right? ;)
 
Joined
May 13, 2008
Messages
782 (0.13/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
But TPU is neutral, just like Digital Foundry,
I literally just spit out my tea.

I come here for the charts; voltage/clock ranges etc. To deduce bandwidth/arch limitations through them, etc.
There is great info here, but it ain't in the conclusion. Also, I have only once accused W1zard of payola from PNY/nVIDIA. Only once. Because I don't want to get banned.
That said, he created some amazing tools/charts that these companies use to create their product stacks because many consumers see them as the gold standard, so he deserves some kickbacks... I mean, respect... for that wonderful continued work over many years.
Remember: 8GB is enough.

I listen to DF to scream out loud about their bias, mostly (which I think hurts/confuses many consumers). They certainly have ins at nVIDIA for info, which, if you parse through the regurgitated marketing (which they perhaps have to repeat to keep those ins), is actually interesting. It's a shame many in the general populace get brainwashed by it though.
Remember that time the guy revealed to them that frame-gen was done on tensor? That was pretty hilarious (whoops!). Guess we can't sell the Optical Flow snake oil anymore; put it in the back with the G-sync module. I'll be looking forward to DF never mentioning it again so FG doesn't have to be back-ported.
Again, I go to them for their frame-time/rate vids, image comparisons, mentioning the resolution scale, analyzing clocks, etc., and they're very good at that. They deserve massive respect for pioneering/updating those tests et al for more consumers to see. Also, sometimes they zoom-crop FFVIIR bikinis, sometimes they don't. I feel that. I do.
But I'll never forget the time they had to be overnighted a card to test an AMD feature, because they don't even keep them around. That was pretty telling. They do not speak for the average/balanced/long-term consumer wrt products. Proudly, I guess. If they could do it with a little less propaganda, though, that would be cool.
 
Joined
Jun 5, 2023
Messages
65 (0.11/day)
I doubt 64 CU will do much other than being another refresh of 7800XT. I guess they were using the best-case scenario with 1080p
 
Joined
Jan 14, 2019
Messages
13,951 (6.32/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanic
VR HMD Not yet
Software Linux gaming master race
I doubt 64 CU will do much other than being another refresh of 7800XT. I guess they were using the best-case scenario with 1080p
Don't forget about clock speeds. The 7800 XT runs at 2.4 GHz, the 9070 XT is at 2.9-3 GHz.
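
As a rough upper bound, and assuming performance scaled perfectly with CU count × clock (it won't, and any RDNA4 IPC changes are unknown), the rumored specs versus the 7800 XT's public ones work out like this:

Code:
# Upper-bound scaling sketch: assumes performance is proportional to
# CU count x clock. Real gains will be lower; RDNA4 IPC changes unknown.
xt_7800 = 60 * 2.43   # 7800 XT: 60 CUs at its ~2.43 GHz official boost clock
xt_9070 = 64 * 2.97   # 9070 XT: rumored 64 CUs at ~2.97 GHz

print(f"theoretical uplift: {xt_9070 / xt_7800 - 1:.0%}")   # ~30%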
 
Joined
Oct 30, 2020
Messages
377 (0.24/day)
Location
Toronto
System Name GraniteXT
Processor Ryzen 9950X
Motherboard ASRock B650M-HDV
Cooling 2x360mm custom loop
Memory 2x24GB Team Xtreem DDR5-8000 [M die]
Video Card(s) RTX 3090 FE underwater
Storage Intel P5800X 800GB + Samsung 980 Pro 2TB
Display(s) MSI 342C 34" OLED
Case O11D Evo RGB
Audio Device(s) DCA Aeon 2 w/ SMSL M200/SP200
Power Supply Superflower Leadex VII XG 1300W
Mouse Razer Basilisk V3
Keyboard Steelseries Apex Pro V2 TKL
Maybe using Cyberpunk to measure efficiency—one of the games with the biggest gains in Blackwell over the previous generation—or picking unpopular titles that favor Nvidia, or even making a big ad out of the 5xxx series launch, creates that impression. But TPU is neutral, just like Digital Foundry, and everything is just another conspiracy theory, right? RIGHT?

Caution: The above post contains twisted humor, sarcasm and cannot be used as grounds for legal action.

The efficiency thing needs to be revised. I just had a look at other reviews, and the 5080's efficiency isn't better than the 4080S's; it depends on the game. Cyberpunk shows the 5xxx series in a much better light than it deserves in this regard.

With efficiency swinging so much from game to game, there needs to be an average of sorts, I suppose. Cyberpunk also shows AMD comparatively worse off.

There's an average gaming power draw and a performance summary in the review itself. Pretty sure an equation can be put in there, because the data for the averages already exists. I haven't put much thought into it, it's just something that crossed my mind right now, but when I glanced over that data, it seems the 5080 consumes around 10% more power for 12% more performance. Certainly not 11% more efficient overall, not even close.
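
For what it's worth, the equation is trivial with the averages the review already publishes. A minimal sketch, using the rough ~10%/~12% figures eyeballed above rather than measured data:

Code:
# Perf-per-watt from a review's own averages. The two ratios below are the
# rough figures eyeballed above, not measured data.
rel_performance = 1.12   # 5080 vs 4080S, ~12% ahead on the performance summary
rel_power = 1.10         # ~10% higher average gaming power draw

efficiency_gain = rel_performance / rel_power - 1
print(f"overall efficiency gain: {efficiency_gain:.1%}")   # ~1.8%, nowhere near 11%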
 
Joined
Sep 27, 2008
Messages
1,223 (0.20/day)
Someone correct me if I'm wrong, but TPU reviews are somewhat Nvidia-flavoured in general. I mean, the 5080 gets a "highly recommended" badge despite being nothing more than a 4080 Super rinse-and-repeat for the exact same price? C'mon...
It's "highly recommended" because of the sad state of the market. The 5080 is still technically the best card you can get new at its price point.
 
Joined
Aug 3, 2006
Messages
264 (0.04/day)
Location
Austin, TX
Processor Ryzen 6900HX
Memory 32 GB DDR4LP
Video Card(s) Radeon 6800m
Display(s) LG C3 42''
Software Windows 11 home premium
And that is supposed to be exciting?



This comment is so over the top, it reads like sarcasm

Nope, it is the mass psychosis I have talked about here for a bit.

The poetry writes itself, it's like lady luck loves AMD. lol.

Watch AMD release a 32gb 9070xt just to mind f'k the market with AI.
 
Joined
Jan 14, 2019
Messages
13,951 (6.32/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanic
VR HMD Not yet
Software Linux gaming master race
It's "highly recommended" because of the sad state of the market. The 5080 is still technically the best card you can get new at its price point.
It's the only card you can get at this price point, but whether that alone makes it good enough for a recommendation is questionable at best, imo.

Nope, it is the mass psychosis I have talked about here for a bit.

The poetry writes itself, it's like lady luck loves AMD. lol.

Watch AMD release a 32gb 9070xt just to mind f'k the market with AI.
I'm just afraid that even if Nvidia throws such a low ball, AMD still won't know how to respond and will overprice the 9070 XT. Let history prove me wrong.
 
Joined
Aug 3, 2006
Messages
264 (0.04/day)
Location
Austin, TX
Processor Ryzen 6900HX
Memory 32 GB DDR4LP
Video Card(s) Radeon 6800m
Display(s) LG C3 42''
Software Windows 11 home premium
It's starting to make sense why AMD has three 8-pin power connectors and giant coolers on some SKUs now. They plan to clock their midrange into 5080 territory, by the looks of what is happening with Nvidia's clown show.
 
Joined
May 13, 2008
Messages
782 (0.13/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
This exact thought came to my mind lol. Where's that 4870 team that David Wang and co led? They were great. There was no shitty marketing, great prices, great cards, no nonsense. They made nvidia look like absolute fools in that generation. IIRC nvidia had like a massive 25% or so price cut a couple of months after launch because of RV770's arrival. Not happening now, but for a card that's supposed to be a stopgap till UDNA, if it can somehow get close to the 4080S, that's not too far off a 5080, which I would've never expected, but then that 5080 turned out to be a POS on PCI-E.

The other thing I noticed is in the nvidia 5xxx reviews TPU had that strange, unexpected and frankly unnecessary line about not being sure if AMD will be in the GPU space in a couple of years. Didn't really expect it from them as it was...idk something the trashy rumor sites would post and i'll stop at that. Anyway, what people fail to realize is development cycles and the company's position at the time.

1) GCN was being developed around 2008-2009, when AMD inherited the arch during its infancy from ATI, who were doing quite well. It was a banger, and even though they ran out of money right after launch, it served them well for a decade.

2) RDNA was developed around 2016-2017 when AMD were deep in debt and putting all their money, hopes and dreams on Ryzen. It turned out okay, but nothing close to what GCN achieved.

3) UDNA is being developed now, when AMD have money, resources, time and a bunch of clowns in their marketing department. Speaking to people at AMD, they're putting a lot of resources into that thing and rightly so - their whole AI money pit depends on it. There's every possibility it's going to be another banger, but let's wait and see. I just can't see it being worse than RDNA on the 'relative to competition' basis.

It's supposed to launch around the same time TPU claims AMD discrete GPU division might not be around so erm..let's wait and see I suppose.

Bonus points for mentioning Baumann's baby. I also mentioned that in a post not too long ago. I think I posted it, maybe I deleted it...I forget. I do that sometimes.
I get very saddened by people buying into nVIDIA's savvy crap that they think benefits them long-term but doesn't (until they complain about it later), and reviewers aren't helping. I won't get into it.
Some people kinda/sorta already did, but it goes beyond that in ways I don't want to get into. I don't want to start a fight with any YouTube math teachers that want allocation.
I look to AMD for solace, it isn't there, and I get mad. It doesn't mean they don't and/or can't make good-value products, but they used to LEAD in very important aspects and make their strengths known.
I know it comes out in my posts, and I apologize for that. There's just something about the culture changing from nerds to normies who think they understand but don't; it's really mostly nVIDIA marketing.

...and AMD's marketing is awful to boot, which doesn't help. The whole 9070 series thing is a gigantic clusterfuck the likes of which I have never seen before, and they should be ashamed.
I get that they want time to catch up on features, but how they went about exposing this series and then trying to shove it back in the closet is beyond ridiculous. The price/placement uncertainty... it's bad form.

Whoever they have now in marketing is no Dave Baumann. Hell, whoever they have now is no Scott Herkelman. It's very obvious things are in disarray now, perhaps because of layoffs.

It's starting to make sense why AMD has three 8-pin power connectors and giant coolers on some SKUs now. They plan to clock their midrange into 5080 territory, by the looks of what is happening with Nvidia's clown show.
Higher than that, my friend.

Don't forget about clock speeds. The 7800 XT runs at 2.4 GHz, the 9070 XT is at 2.9-3 GHz.
Overclock a 7800 XT. It's clocked in the toilet at stock for the marketing purposes of this very card.

It gains around 19% performance in many cases. Look at W1zard's reviews.

The best a 7800 XT can clock is 2936 MHz, but a 7700 XT (oddly similar across multiple cards) can hit 3133 MHz. That doesn't make any sense, other than that the 7800 XT was obviously going to be clock-limited but instead got PL-limited.
Oddly, the 7900 XTX can also hit around 3165-3200 MHz on the same arch before going power-bananas?

It's stinky. Reeks of artificial product segmentation (granted, RDNA4 has 8-bit ops; tensor cores). He's not wrong (for the most part).

The question truly is how high it will clock. If it's only ~3.3-3.4 GHz max, that's bad, considering the die size is ~15-20% larger than it should be.
Not necessarily bad for those products, but for the chip overall, if it can't be binned higher (at >375 W).

If it's 3.5-3.7 GHz on 3x 8-pin (and they release a >20 Gbps RAM card), then we're talking about an actual improvement in chip design and not just marketing tactics.
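
Putting numbers on the headroom comparison, using the clocks quoted above (the 7800 XT's official boost is ~2430 MHz; the RDNA4 clock targets here are speculation, not spec):

Code:
# Clock-headroom arithmetic from the figures quoted above. The RDNA4
# targets are speculative, not confirmed specs.
print(f"7800 XT, 2936 MHz OC vs 2430 MHz boost: {2936 / 2430 - 1:+.0%}")  # ~+21%
print(f"9070 XT, 3.4 GHz vs ~2.97 GHz rumored:  {3.4 / 2.97 - 1:+.0%}")   # ~+14%
print(f"9070 XT, 3.6 GHz vs ~2.97 GHz rumored:  {3.6 / 2.97 - 1:+.0%}")   # ~+21%

That ~21% clock headroom on the 7800 XT lines up roughly with the ~19% performance gain mentioned above.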
 
Joined
Oct 30, 2020
Messages
377 (0.24/day)
Location
Toronto
System Name GraniteXT
Processor Ryzen 9950X
Motherboard ASRock B650M-HDV
Cooling 2x360mm custom loop
Memory 2x24GB Team Xtreem DDR5-8000 [M die]
Video Card(s) RTX 3090 FE underwater
Storage Intel P5800X 800GB + Samsung 980 Pro 2TB
Display(s) MSI 342C 34" OLED
Case O11D Evo RGB
Audio Device(s) DCA Aeon 2 w/ SMSL M200/SP200
Power Supply Superflower Leadex VII XG 1300W
Mouse Razer Basilisk V3
Keyboard Steelseries Apex Pro V2 TKL
Bonus points for mentioning Baumann's baby. I also mentioned that in a post not too long ago. I think I posted it, maybe I deleted it...I forget. I do that sometimes.
I get very saddened by people buying into nVIDIA's savvy crap that they think benefits them long-term but doesn't (until they complain about it later), and reviewers aren't helping. I won't get into it.
Some people kinda/sorta already did, but it goes beyond that in ways I don't want to get into. I don't want to start a fight with any YouTube math teachers that want allocation.
I look to AMD for solace, it isn't there, and I get mad. It doesn't mean they don't and/or can't make good-value products, but they used to LEAD in very important aspects and make their strengths known.
I know it comes out in my posts, and I apologize for that. There's just something about the culture changing from nerds to normies who think they understand but don't; it's really mostly nVIDIA marketing.

...and AMD's marketing is awful to boot, which doesn't help. The whole 9070 series thing is a gigantic clusterfuck the likes of which I have never seen before, and they should be ashamed.
I get that they want time to catch up on features, but how they went about exposing this series and then trying to shove it back in the closet is beyond ridiculous. The price/placement uncertainty... it's bad form.

Whoever they have now in marketing is no Dave Baumann. Hell, whoever they have now is no Scott Herkelman. It's very obvious things are in disarray now, perhaps because of layoffs.

Your thoughts echo mine, I couldn't have said it better, and holy shit, it's all coming back: it was Dave Baumann whom AMD hired from ATI, and he wasn't only in charge of marketing but also pushed the engineers to amp up the RV770 to what it became. Here's the article if you're interested. Also, what a bloody awesome 'review'; I'd recommend it to anyone who has time to kill, and I can guarantee you'll learn a thing or two. I remember learning something new every time a new review came up on AT, or just browsing through their and B3D's forums back then. It's been almost 20 years now, but it makes me sad that those reviews are long gone; people got into gimmicks more, YT became the primary source of reviews, and 98% of those reviewers don't know a thing about what they're reviewing, drink the kool-aid, lick the manufacturers and give a glowing review so they get another sample. You can guess which company is licked more, you know, the one that's slightly more threatening.

There have been times in the past where I couldn't really understand why reviewers were taking the angle they were. AMD had better architectures a few times going up against nVidia, but it wasn't really reflected in reviews or (partly consequently) sales. Take the 290X: it was architecturally much superior to the GTX 780, and that was reflected in its longevity. Guess what the reviewers, and consequently most people, thought of it? It's hot, loud and sucks power. Sales? Pfft. To those who weren't there: no, there were no proprietary features that were worth their salt, nothing. Sure, nvidia kept trying one thing after the other to lock consumers into their ecosystem, but they all failed until then (they certainly learned from their failures though; see today). AMD's brand perception of being the 'cheaper Intel' didn't help, nor did their decade of dragging GCN along when they were broke. I don't want to really get into it either, but I do hope things change (as a whole) in the next decade.

Then there's Mantle. AMD literally fixed a whole clusterfuck of issues under the hood, and it paved the way for the DX12/Vulkan we all enjoy now. Most people are under the impression that it's only Nvidia who released all the great new features throughout the past couple of decades. Sure, they released a whole heap of features and a few of them turned out to be great successes today, but I won't go into detail about the ones they bought and killed off, or the pissfight they had with tessellation. There were times it felt like (and it turned out to be true) they were only trying to increase their performance margin at the expense of consumers. I won't get into the fact that it's still happening. Not a word from peeps though, it's okay. But let's not forget that the other camp did a lot for your GPUs as well.

Lisa needs to talk to a few ex ATI people in marketing. And fire Azor and a couple of the other clowns today. This stupidity really needs to stop. I thought they got their marketing shit together with 6xxx launch but then they decided to go ahead and fuck up the other two, somehow one before they even launched it. That's a new low tbh.

AMD/ATI has come back from way behind a few times before. The 9700 Pro and the 4870, and to a lesser extent the 3870 and 7970, are the ones that come to mind. Hell, the 6900 XT was the same, and it wasn't that long ago. No one's lead is insurmountable, but proprietary features are hard to crack, and I doubt I'll see much change in the competitive landscape anytime soon. One can hope.
 
Joined
Jan 14, 2019
Messages
13,951 (6.32/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanic
VR HMD Not yet
Software Linux gaming master race
I just had a thought...

Everyone thought AMD delayed the 9070 XT because they found out that Nvidia's cards are too good so the price had to be adjusted down.

What if they actually found out that Nvidia doesn't offer anything on top of last gen in the midrange, so the price on the 9070 XT actually has to be adjusted up?

So it's not like "hey look, the 5070 Ti is only $750, so we can't sell the 9070 XT for $900", but instead "look at these pieces of crap, we really shouldn't be selling the 9070 XT for $500, how about $700 instead".
 
Joined
Jun 14, 2020
Messages
4,224 (2.50/day)
System Name Mean machine
Processor AMD 6900HS
Memory 2x16 GB 4800C40
Video Card(s) AMD Radeon 6700S
How can MLID know where it scores in Cyberpunk 2077 when TPU runs a custom scene? He is making stuff up, as per usual.
 
Joined
Jun 14, 2020
Messages
4,224 (2.50/day)
System Name Mean machine
Processor AMD 6900HS
Memory 2x16 GB 4800C40
Video Card(s) AMD Radeon 6700S
Sure, AMD was the greedy one, while Nvidia tried to sell the 4080 12GB for $900, then re-labeled it as a 4070 Ti for $800, and then re-released it as a Super version. That's the real greed.
Just so we are on the same page, that 4080 12GB / 4070 Ti was still better in both raster/$ and RT/$ than its competitor, the 7900 XT. So yeah, AMD is super greedy when they managed to out-greed the greediest nvidia card.

Someone correct me if I'm wrong, but TPU reviews are somewhat Nvidia-flavoured in general. I mean, the 5080 gets a "highly recommended" badge despite being nothing more than a 4080 Super rinse-and-repeat for the exact same price? C'mon...

I didn't even know about the janky comment on "AMD not being in the business later". I stopped reading/watching conclusions ages ago because they are way too swayed by the reviewer's personal taste, and that's not just TPU, it's everywhere. Anyway, why would AMD quit the business when they own basically the entire console APU market? That comment is just stupid.
Last gen, both the XTX and the 4080 got Editor's Choice. Should that make me feel like TPU is leaning AMD? You all need to stop with these conspiracies.
 
Joined
Jan 11, 2022
Messages
1,060 (0.95/day)
Hmm, a bit late publishing this. It would have helped more if he had put out his leak a day earlier, so more people would have stayed home and slightly lowered demand for the 5080.
 