
AMD Radeon RX 7600 Early Sample Offers RX 6750 XT Performance at 175W: Rumor

Joined
Jan 9, 2023
Messages
297 (0.43/day)
You dream too far...
In the real world, 10-15% more performance than the 6650 XT at a $250 price would sell very well.
I'd bet it's going to have an MSRP of $299 and perform similarly to the 6700 XT. That's roughly a 20% improvement in value.

2-slot and dual-fan, not triple.
I don't see why not. There's the 6650 XT Fighter: similar consumption, and a 2-slot, dual-fan design.
Obviously don't look at the premium models, since that's where they sell the markup.
 
Joined
Mar 28, 2020
Messages
1,753 (1.03/day)
I feel the performance is within expectations. Navi 33, as I recall, is almost like an RDNA 2+: there's little in common between Navi 31 and Navi 33, since the latter is built on 6 nm as a monolithic design. I do wonder if it's got dedicated RT cores like the actual RDNA3 chips.
 
Joined
Dec 25, 2020
Messages
6,734 (4.71/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Precisely why I brought up that I suspected they couldn't make a product competitive with the RTX 4070 at the same wattage.

That massive inertia in their most popular segment isn't something a company that believes in its product would show, especially when moving faster could retake badly needed market share.

This is the proof. It's a full tier of performance below its competition in the same weight class; the RTX 4070 will leave this thing in the dust in every regard. Shame that Nvidia charges so much for that privilege.
 

tabascosauz

Moderator
Supporter
Staff member
Joined
Jun 24, 2015
Messages
8,147 (2.37/day)
Location
Western Canada
System Name ab┃ob
Processor 7800X3D┃5800X3D
Motherboard B650E PG-ITX┃X570 Impact
Cooling NH-U12A + T30┃AXP120-x67
Memory 64GB 6400CL32┃32GB 3600CL14
Video Card(s) RTX 4070 Ti Eagle┃RTX A2000
Storage 8TB of SSDs┃1TB SN550
Case Caselabs S3┃Lazer3D HT5
Rocking an RX 7900 XT right now, I can say that the biggest sign of the card's immaturity is its power draw. It constantly stays above what's necessary, or jitters around, and has a very hard time properly powering down.

For example, I can run Overwatch 2 on Ultra (also Epic, but frankly I saw no visual difference, so why bother) with Radeon Chill on and comfortably get 140 fps: the card eats 150 W.
I then start a basic YouTube video or Twitch stream and just idle away: the card eats 70 W.
Dropping back to the best possible idle power draw (29 W) is a test of patience, because the card will literally dance around 40-55 W for several minutes before settling at its real minimum, even when all you're doing is idling or typing on these forums.

All of that to say: the card clearly has the chops for good efficiency, but it's woefully immature in its power management.
I'm eagerly waiting for the next driver update to see what actually gets done, and I expect to shave off a few watts with each new driver over the next year or so.

To be fair, between the huge upcoming APUs and the dwindling value of cards with less than 12 GB of VRAM, I'm really expecting the low-end enthusiast segment to be squeezed out over the next few years. No need to put in big efforts when you can just shove 10-40 RDNA3 CUs onto your CPU...

The regular fluctuations in idle and load power are normal; they're just the result of more precisely and dynamically controlled core and memory clocks, the same way Ryzen idle is touchy. Ever experienced the 90 W multi-monitor idle on Navi 31? I guess not...

I would've been pretty happy with 47 W idle on two screens, since that's basically normal behaviour using RDNA2 as a reference, and proof that VRAM isn't forced to 100% all the time. It's not Ada's 10 W idle, but it's good enough considering that, by design, Navi 31 dumps a whole bunch of wasted power on the memory-related rails. Next time you get the chance, spend some time watching HWiNFO. The core and (to an extent) the VRAM itself idle effectively; it's everything else associated with the new interconnect that doesn't.

If you've ever played a lighter game, you'd know not to wish for *more* downclocking in-game. The lack of a stable VRAM clock under lighter loads is exactly what torpedoes the Radeon experience on an otherwise strong card, and why no GeForce GPU suffers the same fate: pick a memory P-state and stick to it, even if it's not the max VRAM clock, and even if core clocks stay low.

There might be some future optimizations for excessive draw on the memory rails, but none of the other problems are unique to RDNA3. They've been around for years.

Regardless, if Navi33 is monolithic as it looks to be, it won't suffer any of the same problems as Navi31 (except Radeon's immortal hatred for multi-monitor power draw, of course).
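If you'd rather log it than eyeball HWiNFO, here's a rough sketch for Linux that polls the amdgpu board power sensor once a second, handy for seeing exactly how long the card takes to fall back to its idle floor. The sysfs path and the sensor name (power1_average vs. power1_input) vary by card and kernel, so treat both as assumptions to adjust:

```python
#!/usr/bin/env python3
"""Log amdgpu board power once a second (Linux), to watch how long the card
takes to drop back to idle after a load. card0 and power1_* are assumptions:
adjust them for your system."""
import glob
import time

sensors = (glob.glob("/sys/class/drm/card0/device/hwmon/hwmon*/power1_average")
           + glob.glob("/sys/class/drm/card0/device/hwmon/hwmon*/power1_input"))
if not sensors:
    raise SystemExit("No amdgpu power sensor found under /sys/class/drm/card0")

sensor = sensors[0]
while True:
    with open(sensor) as f:
        microwatts = int(f.read().strip())  # hwmon reports microwatts
    print(f"{time.strftime('%H:%M:%S')}  {microwatts / 1_000_000:6.1f} W")
    time.sleep(1)
```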
 
Joined
Sep 17, 2014
Messages
22,439 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Efficiency is a complicated topic and can't be reduced to a single game. If you use Doom Eternal, for example, then it's suddenly a 20-30% difference. The 4080 only uses 280-290 W in it while being faster than a 7900 XTX, IIRC, so it's very much an example in favor of Ada, but it's a good example of how much one game can swing the results in favor of either Nvidia or AMD.

Although, you're that guy who mistakenly believes GPUs always hit max power consumption, right? That reminds me, I should show you some screenshots of synthetic GPU workloads, like TSE or Port Royal, where my 4090 runs below 400 W.

Also, once you cap games with a frame limit, RDNA3 loses a ton of efficiency. That might not matter to you if you never run a cap, but a lot of people run games with frame caps.
Euh... what? Can you state the obvious a bit more while you're at it?

Of course a GPU with a higher performance ceiling is going to be able to clock lower and run more efficiently when capped to a given framerate.
This is why I use the numbers from a large bench suite like TPU's instead of cherry-picking. The cherry-pick you made, and especially the conclusions you draw from it, couldn't be further from the truth; it might be the number today, but it won't stay that way. Consider also that the 4090 and 4080 are CPU-limited more often than not, so they'll run more efficiently than a fully loaded 7900 card to begin with. Part of the reason for that is the higher CPU overhead of Nvidia's drivers in newer APIs. I think right now the 4070~4070 Ti is the best indicator of the real efficiency Ada has; those GPUs aren't limited in the test aggregate I linked.

So no, I don't believe GPUs run at max utilization all the time, but that IS the real way to measure differences in power efficiency: you need to level the playing field, which means all GPUs should run as close to 100% utilization as possible. That's the peak stock performance you get, and you will land there sooner or later as games get heavier and the GPU has to work relatively harder.

I mean, what point are you trying to make here? That Ada is more efficient? It's as efficient as the TPU test shows; no matter what you link, it doesn't show us anything different, you've just misinterpreted the numbers here. I hope this sheds some light.
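To make the frame-cap point concrete, here's a toy perf-per-watt calculation; every FPS and wattage figure below is invented purely for illustration, not a measurement of any real card:

```python
# Perf-per-watt = average FPS divided by average board power.
def perf_per_watt(avg_fps: float, avg_board_power_w: float) -> float:
    return avg_fps / avg_board_power_w

# Hypothetical "GPU A" and "GPU B", uncapped (both near 100% utilization)...
uncapped = {
    "GPU A": perf_per_watt(142, 284),  # ~0.50 FPS/W
    "GPU B": perf_per_watt(135, 345),  # ~0.39 FPS/W
}
# ...and the same two GPUs capped to 120 FPS, where the faster one can
# downclock further, so the gap looks different.
capped_120 = {
    "GPU A": perf_per_watt(120, 190),  # ~0.63 FPS/W
    "GPU B": perf_per_watt(120, 260),  # ~0.46 FPS/W
}

for label, results in (("uncapped", uncapped), ("120 FPS cap", capped_120)):
    for gpu, eff in results.items():
        print(f"{label:>12}: {gpu} = {eff:.2f} FPS/W")
```

Same cards, different test conditions, different-looking gap, which is exactly why a full suite at full load is the fairer baseline.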
 
Joined
Sep 6, 2013
Messages
3,329 (0.81/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later, got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 32GB - 16GB G.Skill RIPJAWS 3600+16GB G.Skill Aegis 3200 / 16GB JUHOR / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes/ NVMes, SATA Storage / NVMe boot(Clover), SATA storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
Performance looks good. VRAM capacity, nope.... nope.... nope.... nope. It should have been at least 12GB and just keep the RX 6600 in the market as a cheaper, 8GB VRAM option.
That also means the 7700 will come with 12 GB, when we should be getting 16 GB.
All this marketing about RX 6800 and 16GB and then ... 8GB. Meh...
 
Joined
Feb 3, 2017
Messages
3,753 (1.32/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
However, I don't quite get anyone who's pissing over general RDNA3 efficiency; it's damn close to Ada, really. And it does that while having much more VRAM, which does cost power.
RDNA3 cards so far are using GDDR6 vs GDDR6X on Ada. At least in RTX3000 series GDDR6X proved to be quite a power hog.
IIRC, 2GB GDDR6 chips consumed ~40% more power vs 1GB GDDR6 chips.
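Just to put rough numbers on the "memory costs power" idea, here's a back-of-the-envelope sketch; the per-chip wattages and the 200 W board figure are made-up placeholders for illustration, not datasheet values:

```python
# Total power of the memory modules alone (ignores the memory controller/PHY).
def vram_power(chip_count: int, watts_per_chip: float) -> float:
    return chip_count * watts_per_chip

configs = {
    "8x 2GB GDDR6 @ 1.5 W/chip (hypothetical)": vram_power(8, 1.5),
    "12x 2GB GDDR6X @ 2.5 W/chip (hypothetical)": vram_power(12, 2.5),
}

board_power_w = 200.0  # hypothetical total board power, just for scale
for name, watts in configs.items():
    share = watts / board_power_w
    print(f"{name}: {watts:.0f} W ({share:.0%} of a {board_power_w:.0f} W board)")
```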
 
Joined
Dec 25, 2020
Messages
6,734 (4.71/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
RDNA3 cards so far are using GDDR6 vs GDDR6X on Ada. At least in RTX3000 series GDDR6X proved to be quite a power hog.
IIRC, 2GB GDDR6 chips consumed ~40% more power vs 1GB GDDR6 chips.

Second-generation G6X (introduced with the 3090 Ti) greatly alleviated power consumption, and the 4070, which is the GPU in the same power tier, uses standard G6, doesn't it? Because if not... the performance-per-watt disparity is even more extreme.

Price aside, the 4070 should murder this product in practically every regard. The performance-per-watt and feature-set gap... ignore that, it's not a gap, it's an abyss between them. For this GPU to be reasonable, it can't cost more than $250.
 
Joined
May 15, 2020
Messages
697 (0.42/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI Tomahawk 450 MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
Performance looks good. VRAM capacity, nope.... nope.... nope.... nope. It should have been at least 12GB and just keep the RX 6600 in the market as a cheaper, 8GB VRAM option.
That also means the 7700 will come with 12 GB, when we should be getting 16 GB.
All this marketing about RX 6800 and 16GB and then ... 8GB. Meh...
It's not out yet, but if you listen to the whole video, it sounds like AMD really aren't sure what to launch in this market, which is kinda funny.

And 8 GB of VRAM is indeed low, but it all depends on the price. I bought a 3050 laptop a few months ago, and it still made lots of sense for the money, although obviously I will never be able to use high-res textures with it.

If the price is right and higher VRAM alternatives are not priced obscenely this could still make sense.
 
Joined
Dec 28, 2012
Messages
3,877 (0.89/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabrent Rocket 4.0 2TB, MX500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
It's not out yet, but if you listen to the whole video, it sounds like AMD really aren't sure what to launch in this market, which is kinda funny.
I said something similar a few days ago and was mocked. I don't understand how AMD doesn't know what to do; they've been doing this for 20 years now. Either that, or they have a LOT of unused RDNA2 lying around, which is even funnier.
And 8 GB of VRAM is indeed low, but it all depends on the price. I bought a 3050 laptop a few months ago, and it still made lots of sense for the money, although obviously I will never be able to use high-res textures with it.

If the price is right and higher VRAM alternatives are not priced obscenely this could still make sense.
Laptops are a different beast: on a small screen you're probably not going to see the difference between medium and ultra in some games, or need fancy AA options. My venerable Alienware 15 R2 finally ran into issues this year with the 3 GB framebuffer on its 970M. I'd say I got my use out of it.
 
Joined
May 15, 2020
Messages
697 (0.42/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI Tomahawk 450 MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
I said something similar a few days ago and was mocked. I don't understand how AMD doesn't know what to do; they've been doing this for 20 years now. Either that, or they have a LOT of unused RDNA2 lying around, which is even funnier.
Well, it's unclear whether they can achieve an aggressive price-to-performance ratio this generation. It seems they thought the initial pricing on the 7900s was aggressive. That's really funny, actually.


Laptops are a different beast: on a small screen you're probably not going to see the difference between medium and ultra in some games, or need fancy AA options. My venerable Alienware 15 R2 finally ran into issues this year with the 3 GB framebuffer on its 970M. I'd say I got my use out of it.
Well, I still play on it more often than not, docked to a 1440p ultrawide. But that's irrelevant; what I mean is that it's not like 6 GB, 8 GB or 10 GB GPUs should cease to exist right this very moment. It's just that they shouldn't be marketed as no-compromise options. If the price is right, the compromise can be perfectly acceptable.
 
Joined
Sep 6, 2013
Messages
3,329 (0.81/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later, got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 32GB - 16GB G.Skill RIPJAWS 3600+16GB G.Skill Aegis 3200 / 16GB JUHOR / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes/ NVMes, SATA Storage / NVMe boot(Clover), SATA storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
It's not out yet, but if you listen to the whole video, it sounds like AMD really aren't sure what to launch in this market, which is kinda funny.
It's not funny. It's a problem. When the competition has both the performance and the support of consumers, it's a problem. Even if you come out with a product at a much better price point, a simple price drop on the competitor's equivalent product and you're finished. Considering that Nvidia, no matter what, will build the VRAM obsolescence switch into its products, AMD can only play the "more VRAM at any price point" marketing card to have some kind of advantage. Offering cards with the same VRAM at the same price points only helps Nvidia sell more cards. We see it with the RTX 3050 and RX 6600: the RX 6600 is faster and cheaper, yet people buy the RTX 3050. But "more VRAM", that could sell cards on its own.

And 8 GB of VRAM is indeed low, but it all depends on the price. I bought a 3050 laptop a few months ago, and it still made lots of sense for the money, although obviously I will never be able to use high-res textures with it.

If the price is right and higher VRAM alternatives are not priced obscenely this could still make sense.
8 GB of VRAM makes the cards cheaper to produce and market, but we've seen that a lower price is not enough to persuade people to prefer AMD cards over Nvidia when the VRAM is the same. Considering the prices of modern cards and the fear that future models will be even more expensive, a GPU with more VRAM could be seen as a more future-proof product and a way to avoid having to pay for a new GPU again in 1-2 years. Unfortunately, AMD decided to cheap out here, while doing the complete opposite in CPUs, where they built a platform as sturdy as a tank to make it future-proof.
 
Joined
May 15, 2020
Messages
697 (0.42/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI Tomahawk 450 MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
8 GB of VRAM makes the cards cheaper to produce and market, but we've seen that a lower price is not enough to persuade people to prefer AMD cards over Nvidia when the VRAM is the same. Considering the prices of modern cards and the fear that future models will be even more expensive, a GPU with more VRAM could be seen as a more future-proof product and a way to avoid having to pay for a new GPU again in 1-2 years.
Well, maybe they will achieve just that with their 7700 XT? Let's not forget that, due to their stupid naming scheme, the 7600 might actually be a 6500 equivalent this generation. It's unclear from the rumor.
Or, if they can't reach aggressive price/performance, just discount the 6000 series, keep selling the top RDNA3 parts, and move as fast as possible to RDNA4 for a full lineup, instead of wasting money launching GPUs with poor price/performance.
 
Joined
Dec 25, 2020
Messages
6,734 (4.71/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
8 GB of VRAM makes the cards cheaper to produce and market, but we've seen that a lower price is not enough to persuade people to prefer AMD cards over Nvidia when the VRAM is the same. Considering the prices of modern cards and the fear that future models will be even more expensive, a GPU with more VRAM could be seen as a more future-proof product and a way to avoid having to pay for a new GPU again in 1-2 years. Unfortunately, AMD decided to cheap out here, while doing the complete opposite in CPUs, where they built a platform as sturdy as a tank to make it future-proof.

(rant warning)

I can't help but point out the sheer karmic rebalancing at play here. AMD marketing hit NVIDIA for being stingy with VRAM? Then they proceed to release an 8 GB GPU anyway. It's not even the first time; they did the same with the 4 GB 6500 XT, and even temporarily removed a hit piece from their website that claimed 4 GB GPUs were dead.


AMD marketing hit NVIDIA over the 12VHPWR overheating failures? Their own MBA 7900 XTXs were cooking due to manufacturing defects in the vapor chamber.


As if that wasn't enough, all the memes about RTX 4090 power connectors catching fire and the i9-13900KS's high power consumption became a reason for grief once socket AM5 Ryzens began burning up inside the socket and going up in smoke, taking the motherboard along with them, all over a failure in the AGESA/SMU power management code. For the substrate to bubble up and blow out like these are doing? You need an insane amount of current to damage a processor that way.

Arguing this on Discord, I was called multiple names for pointing that out, until buildzoid came out with his video and brought up that one of the root causes of the problem could be an AGESA bug, and then no one talked me down on it again.



Guess what AMD and motherboard vendors have done? They withdrew all current BIOS versions and began issuing high-priority AGESA updates to everyone. I'm gonna lay my cards down here: AMD fans really need to stop being so proud and rabid in defense of this company. They're not your friends, they're not your old pals, they're not an indie, they're not an underdog. Nothing they do is out of the kindness of their hearts. Begin demanding better products of them. They can do better.

If this series of blunders continues to occur routinely and every product they put out is massively overhyped, underdelivered and completely underwhelming, NVIDIA will charge you a thousand bucks for the RTX 5070 instead of just 600 for the 4070 as they do now, and Intel is gonna bring back $1,800 Extreme Edition CPUs the second they regain an absolute performance lead.
 
Joined
Apr 10, 2020
Messages
504 (0.30/day)
It's a total shitshow coming from both teams. The xx60 class used to give us previous-gen xx80 performance for the same price (or a bit higher, adjusted for inflation). What we're getting now is the xx60 class being the previous xx70 class for 20% more money, from both teams. The 4070 Ti should have been the 4070 and cost 549 bucks, the 4080 should have been the 4070 Ti for $749, 800 max, and the 4070 should have been released as a 4060 Ti for 400 to 440 bucks. Same goes for AMD: the 7900 XTX should have been the "7800 XT" for 800 bucks, and the 7900 XT should have been the "7800" for 600 to 700 bucks.

And 'RX 7600 Early Sample Offers RX 6750 XT Performance' is a joke beyond belief, as it wouldn't even compete with ngreedia's previous-gen 3070-class GPU. Well, maybe if it's priced at 250 bucks it would be an acceptable buy, but we all know that's not gonna happen. It's likely gonna get a $349 MSRP and rot on shelves like most new Radeon products lately :banghead:
 
Joined
Mar 19, 2023
Messages
153 (0.25/day)
Location
Hyrule Castle, France
Processor Ryzen 5600x
Memory Crucial Ballistix
Video Card(s) RX 7900 XT
Storage SN850x
Display(s) Gigabyte M32U - LG UltraGear+ 4K 28"
Case Fractal Design Meshify C Mini
Power Supply Corsair RM650x (2021)
You can't afford to be wrong when you consider yourself newsworthy. Anything coming from him/them is hot air, imo.
Absolutely ridiculous.
Anyone doing this kind of projection work about yet-unreleased tech products is sure to be at least as wrong as the people planning to make the products. The semiconductor industry keeps coming up with new ideas and throwing them away. In every generation of GPUs, the cuts to full dies are basically arbitrary, made wherever the engineers found things to work well enough. With every new technology, the possibility of it failing or being abandoned is always present.

Demanding that MLID or any leaker be 100% accurate is absolutely idiotic. People getting on their high horse and demanding 100% accuracy or bust haven't understood the first thing about innovation or high tech. And if you're gonna say that the only worthwhile news is news that's been thoroughly verified:
  1. Zen 4 X3D is a lie, since we just got voltage issues on some
  2. The 4070s are both a lie, since their VRAM has yet to be proven broadly insufficient
  3. Intel was unworthy of being newsworthy for over 10 years, because 10 years after the fact we found Meltdown in their CPUs
  4. TPU is not newsworthy because it's analysing cards that'll get driver updates and be proven completely wrong later (see RDNA3 power draw on day 1 vs. 4 months later)
And so on. The entire point of the bleeding edge is that we're always living in novelty and uncertainty. If you don't want to listen to leakers, that's your right, but getting all uppity and demanding 100% accuracy is basically denying the very concept of high tech. You're asking a field that's all about innovation to always be sure of what's going to happen, as if leakers were either wizards with crystal balls who saw the future or pants-on-fire liars.

I swear half the stupidity of this demand comes from the leakers themselves never admitting they got anything wrong. There's a freakish aura about MLID and the like, where if TPU reviews a product and it's proven inaccurate later on, that's OK, but if it's MLID, he should have foreseen everything perfectly. It's so dumb.
I listen to MLID. The guy is an AMD shill with a severe tendency to toot his own horn and listen to himself talk, but the actual quality of conversation (his podcasts, except the ones with his bro, where it's just MLID agreeing with MLID), general industry analysis and overall intelligence are all high. He's one of the most interesting people to listen to if you can get past the shilling and self-aggrandizing. And nobody should ask him, or any news website, to get everything right. It's all about evaluating how right and how wrong they tend to be and weighing the validity of their statements, not demanding 100% truth on educated guesses and climbing a 40 m ladder up to your saddle when they get a guess wrong.
 
Joined
Jan 14, 2019
Messages
12,337 (5.76/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Demanding that MLID or any leaker be 100% accurate is absolutely idiotic. People getting on their high horse and demanding 100% accuracy or bust haven't understood the first thing about innovation or high tech. And if you're gonna say that the only worthwhile news is news that's been thoroughly verified:
  1. Zen 4 X3D is a lie, since we just got voltage issues on some
  2. The 4070s are both a lie, since their VRAM has yet to be proven broadly insufficient
  3. Intel was unworthy of being newsworthy for over 10 years, because 10 years after the fact we found Meltdown in their CPUs
  4. TPU is not newsworthy because it's analysing cards that'll get driver updates and be proven completely wrong later (see RDNA3 power draw on day 1 vs. 4 months later)
1. We've got voltage issues on one. There is literally one confirmed case of CPU damage and we don't know how or why it happened. That doesn't make the whole series a lie.
2. The 4070 isn't a lie. It's got 12 GB VRAM. It's a fact. Whether you like it or not is up to you to decide.
3. We found Meltdown years after the release of the first affected CPU generation. It's a vulnerability that somebody found, not a designed-in feature of the product.
4. TPU analyses cards with release drivers. Release drivers aren't a lie. They exist. No one can test the future, obviously.

I am not demanding 100% accuracy from MLID. What I want is for TPU not to use YouTube channels that are famous for making random shit up as sources for newsworthy articles. Or, if that's fine, then I think (based on nothing) that the 7700 XT will have 3,840 shader cores and 16 GB of VRAM over a 256-bit bus and will fall between the 6800 XT and 6900 XT. That's it, I've said it. Can somebody write an article now? :rolleyes:

MLID is not a leak channel. It's a fake channel.
 
Joined
Mar 19, 2023
Messages
153 (0.25/day)
Location
Hyrule Castle, France
Processor Ryzen 5600x
Memory Crucial Ballistix
Video Card(s) RX 7900 XT
Storage SN850x
Display(s) Gigabyte M32U - LG UltraGear+ 4K 28"
Case Fractal Design Meshify C Mini
Power Supply Corsair RM650x (2021)
I'm gonna lay my cards down here: AMD fans really need to stop being so proud and rabid in defense of this company.
You're totally missing the point.

AMD is at this stage our only real chance of getting any kind of competition. Intel was the fattest monopoly in tech for far too long. Their prices and laziness were the stuff of legends.
Nvidia is currently the untouchable king of parallel processing (aka GPUs). Nothing is getting close. AMD is barely capable of offering somewhat competitive products for gaming.

We don't like AMD because we like red, or for Lisa Su's pretty face. We like them because AMD managed to go from near bankruptcy to actually taking Intel down from their status as Lazy Gods.
Nvidia is being extremely greedy right now, and unsurprisingly so. We support AMD because we think that they are our only shot at actually opening up this terrible market into something more palatable.
They're not your friends, they're not your old pals, they're not an indie, they're not an underdog. Nothing they do is out of the kindness of their hearts. Begin demanding better products of them. They can do better.
And as for this stupid line:
  1. AMD IS an underdog by any definition of the term. They were, and still are, woefully understaffed and underpowered vs. Nvidia and Intel. Their sales performance in CPUs and server chips is nothing short of extraordinary when you consider that AMD, all sectors combined (CPU and GPU), had a quarter of Intel's personnel not two years ago. I've heard more than once that RTG's entire staff is about the size of Nvidia's driver team alone.
  2. "AMD is not indie" is a meaningless phrase; no industrial-scale manufacturer or designer can be indie. Nobody's expecting Raspberry Pi power or prices, we just expect competition, and that can only come from an industrial-scale provider. You don't need to be a mom-and-pop corner shop to get my sympathy, you need to show me that you're bringing something good to people. AMD is bringing something good across the board in CPUs, and I want to see that in GPUs too.
  3. "Not your friends/pals" yes I'm aware of how capitalism works, thanks. You will see me supporting Intel and Nvidia if AMD ever succeeds at bringing them down and starts becoming the monopoly. It's all about playing giants one against the other so that we humble buyers can reap the best situation.
  4. "Begin demanding better products" is another entirely silly idea, since the only ways to "demand" that is to either wait for them to produce them, buy the competition, or do not buy at all. High grade consumer chips are a duopoly in both GPUs and CPUs. That translates to "buy Nvidia, buy Intel, or buy nothing and pout". NONE of these options help the consumers at all. What you're asking for is basically the attitude of a teenager that wants his mum to buy him a new videogame. If you want to actually help "better products", you have two choices: go work for AMD and help them actually make these better products, or build your own company that'll compete better than AMD or Nvidia. Or find a way to force Jensen to lower prices and his margins.
"They can do better" is the most void and pointless phrase from a consumer that is encouraging no competition, encouraging no growth, and does nothing at all but sit at home and wait for the big players to either make him cough up the money, or not cough it up. Nothing you're suggesting is helping at all.

I am not demanding 100% accuracy from MLID. What I want is for TPU not to use YouTube channels that are famous for making random shit up as sources for newsworthy articles. Or, if that's fine, then I think (based on nothing) that the 7700 XT will have 3,840 shader cores and 16 GB of VRAM over a 256-bit bus and will fall between the 6800 XT and 6900 XT. That's it, I've said it. Can somebody write an article now? :rolleyes:
"Random shit up" isn't a thing MLID does.
And yes, it's fine for you to invent that and for it to be a news source on TPU. As long as you get it right. If you get it wrong, then it's not fine.

And as I've explained at length, the one and only metric that matters is how often you get it right vs how often you don't.
MLID is solid enough that he can claim to mostly get things right. That's why he's in articles.
 
Joined
Jan 14, 2019
Messages
12,337 (5.76/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
"Random shit up" isn't a thing MLID does.
And yes, it's fine for you to invent that and for it to be a news source on TPU. As long as you get it right. If you get it wrong, then it's not fine.

And as I've explained at length, the one and only metric that matters is how often you get it right vs how often you don't.
MLID is solid enough that he can claim to mostly get things right. That's why he's in articles.
I disagree on that, but have it your way. I consider anything coming from MLID to be non-news, as they've been proven completely wrong on multiple occasions. Anyone can make a YouTube channel to spread bollocks.

I agree with the above (not quoted) part of your post. AMD will never become better than Nvidia if everyone takes the "wait and see" approach. Of course, we're not charity organisations buying bad products just to support an underdog company, but AMD's products at this point are not so much worse that one can't have a decent gaming experience on an AMD system. Not to mention the price disparity. I'd much rather have an "inferior" AMD card for 50-100 bucks less than feed the green greed and make sure the next generation of Nvidia cards ends up even more expensive than this one.
 
Joined
Mar 19, 2023
Messages
153 (0.25/day)
Location
Hyrule Castle, France
Processor Ryzen 5600x
Memory Crucial Ballistix
Video Card(s) RX 7900 XT
Storage SN850x
Display(s) Gigabyte M32U - LG UltraGear+ 4K 28"
Case Fractal Design Meshify C Mini
Power Supply Corsair RM650x (2021)
The regular fluctuations in idle and load power are normal; they're just the result of more precisely and dynamically controlled core and memory clocks, the same way Ryzen idle is touchy. Ever experienced the 90 W multi-monitor idle on Navi 31? I guess not...
Yes, since I had it and fixed it in about 2 minutes.
I would've been pretty happy with 47 W idle on two screens, since that's basically normal behaviour using RDNA2 as a reference, and proof that VRAM isn't forced to 100% all the time. It's not Ada's 10 W idle, but it's good enough considering that, by design, Navi 31 dumps a whole bunch of wasted power on the memory-related rails. Next time you get the chance, spend some time watching HWiNFO. The core and (to an extent) the VRAM itself idle effectively; it's everything else associated with the new interconnect that doesn't.
Yes, I read up on that. I wonder how much they'll be able to calm it down in the next 18 months until RDNA4 rolls around.
If you've ever played a lighter game, you'd know not to wish for *more* downclocking in-game. The lack of a stable VRAM clock under lighter loads is exactly what torpedoes the Radeon experience on an otherwise strong card, and why no GeForce GPU suffers the same fate: pick a memory P-state and stick to it, even if it's not the max VRAM clock, and even if core clocks stay low.
No no, I'm not asking for more downclocking in-game. 150 W for 4K Ultra OW2 is way better than I expected. Deep Rock Galactic somehow pushed it to the maximum 315 W (before undervolting), and Radeon Chill brought it back down to ~220 W.

I'm just aiming for two things:
  • proper idling at 30 W without taking the piss (not five whole minutes to properly power down after watching a video)
  • video playback not sitting at a ridiculous 70 W+ but rather at around 40 to 50 W
In games, I have no complaints. I'm quite happy.
There might be some future optimizations for excessive draw on the memory rails, but none of the other problems are unique to RDNA3. They've been around for years.
Aye, that's basically the gist of my demand. I want it optimised.
Regardless, if Navi33 is monolithic as it looks to be, it won't suffer any of the same problems as Navi31 (except Radeon's immortal hatred for multi-monitor power draw, of course).
Agreed, we shall see soon.
 
Joined
Dec 25, 2020
Messages
6,734 (4.71/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Nothing you're suggesting is helping at all.

Know what helps even less? Excusing them at every turn. But suit yourself, I'm not looking to start a flame war here. My objective of pointing out that they do the same things that their supposedly impure competition does has been achieved.
 
Joined
Mar 19, 2023
Messages
153 (0.25/day)
Location
Hyrule Castle, France
Processor Ryzen 5600x
Memory Crucial Ballistix
Video Card(s) RX 7900 XT
Storage SN850x
Display(s) Gigabyte M32U - LG UltraGear+ 4K 28"
Case Fractal Design Meshify C Mini
Power Supply Corsair RM650x (2021)
I disagree on that, but have it your way. I consider anything coming from MLID to be non-news, as they've been proven completely wrong on multiple occasions. Anyone can make a YouTube channel to spread bollocks.
Let's agree to disagree.
Perhaps I'll make a BollocksTech channel then.
If I imitate a convincing British accent, I'm sure to get all the Americans to subscribe to my every word.
 
Joined
Jan 14, 2019
Messages
12,337 (5.76/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Let's agree to disagree.
Perhaps I'll make a BollocksTech channel then.
If I imitate a convincing British accent, I'm sure to get all the Americans to subscribe to my every word.
Exactly! :laugh:
 
Joined
Mar 19, 2023
Messages
153 (0.25/day)
Location
Hyrule Castle, France
Processor Ryzen 5600x
Memory Crucial Ballistix
Video Card(s) RX 7900 XT
Storage SN850x
Display(s) Gigabyte M32U - LG UltraGear+ 4K 28"
Case Fractal Design Meshify C Mini
Power Supply Corsair RM650x (2021)
Know what helps even less? Excusing them at every turn.
I don't recall excusing RDNA 3. Perhaps only at first, when it seemed like the underwhelming performance would be quickly fixed. If anything, I've come to the same conclusion as you: currently, AMD deserves no more love than Nvidia. I hesitated over a 4080 for weeks, and ended up going with the XT because I literally couldn't bear the Nvidia cult mentality and the price was completely unacceptable for the hardware.

If anything, I bought RDNA3 because my choices were between extremely overpriced and flawed. I bought it thinking that I'm giving AMD a good chance here: they can fix/optimise the card until it is worth its price, or not. If not, I will make sure to go pay the Jensen Tax next time.

Also, the entire open-source aspect did tip my decision AMD's way. There's a strong streak of open-sourcing valuable technologies that Nvidia certainly can't claim to have on their side. Inferior or not, AMD tech does feel like it's meant to be passed along to the public and is generally trying to help people. DLSS may be amazeballs, but if it's both vendor-locked and generation-locked, it can go in the gutter. FSR actually helps broadly, even if insufficiently for now.
(and tbh, Adrenalin is the kind of software quality that makes me want to never go back to Nvidia's GeCrap Experience)
But suit yourself, I'm not looking to start a flame war here. My objective of pointing out that they do the same things that their supposedly impure competition does has been achieved.
It's never about purity, it's about results.
AMD's results have been to bring Intel back down to Earth. AMD's results may yet bring Nvidia back down to Earth too.
I'd rather support that than anything else.

Although, if Intel actually bothers to carry on with their Arc lineup, I might be interested as well. The hardware wasn't there in gen 1 (large chips for poor performance), but the software had some great highlights (XeSS and the video encoders).

I've reconsidered.
I'll make a "HatedTV" channel.
Instead of making wild claims in a thick Scottish accent about how amazing AMD's next thing will be, I'll make wild claims in a thick French accent about how Intel's stuff is going to ruin us all with how good it is.
And I'll shill some shitcoins for extra Hatred.
 
Joined
Jan 14, 2019
Messages
12,337 (5.76/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
I've reconsidered.
I'll make a "HatedTV" channel.
Instead of making wild claims in a thick Scottish accent about how amazing AMD's next thing will be, I'll make wild claims in a thick French accent about how Intel's stuff is going to ruin us all with how good it is.
And I'll shill some shitcoins for extra Hatred.
Some food for thought: :toast:

The guy tested the 4090 before it existed: :laugh:
 