
AMD CEO Dr Lisa Su Confirms Mainstream RDNA3 GPUs in Q2-2023

bug

Joined
May 22, 2015
Messages
13,836 (3.95/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
@Avro Arrow If anything, more complex parts are all but expected to fail sooner. They're pricier so one would assume they undergo more thorough testing. I somehow doubt that, since added functionality/parts increase test scenarios exponentially.

Also, TigerDirect... the latest to bite the dust :cry:
 
Joined
Jan 14, 2019
Messages
12,548 (5.79/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
I don't know what you're talking about because that has never been true. The Phenom II X4 940 drew almost 220W at max load while the FX-8350 drew over 250W at max load.

Check this out:

AM2+ Era (Techspot):

AM3+ Era (Techspot):


And check out the FX-9590's numbers from AnandTech!

AM4 Era (Techspot):

(Note that the Ryzen 7 5700X consumes 32 fewer watts than the Ryzen 7 5800X so it would be at 174W total system draw.)

AM5 Era:

In the AM5 era, Intel's CPUs just look like hyper-OC versions of their previous gens, while AMD has that stupid "race to 95°C" thing maxing out their power use, because both of them want their performance numbers maxed out for review benchmark charts like these. IIRC, the R7-5800X3D uses a bit more power than the R7-7700X in Eco Mode. I get the feeling that Eco Mode is the same as AMD Cool'n'Quiet, a setting that was turned on by default in all AMD CPUs and APUs before Zen 4.

Other than that, it doesn't appear that CPU power usage has appreciably gone up over the years. They're (almost) all in the 150-275W range of total system power from the AM2+ era through the AM4 era, with power consumption in the AM5 era being artificially inflated to produce greater performance numbers. So, no, 125W did not mean 125W any more than it does today (unless you're Intel and say that the i9-13900K has a TDP of 125W). Tech advancement not only increases performance, it also increases efficiency.
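For context on how little that "125W" label means, here's a quick back-of-the-envelope sketch using Intel's published power limits for the i9-13900K (the PL1/PL2 figures below come from Intel's public spec, not from the charts linked above):

```python
# Intel's "TDP" (Processor Base Power / PL1) vs. what the chip is allowed
# to draw under sustained load (Maximum Turbo Power / PL2).
pl1_watts = 125  # the number printed on the box
pl2_watts = 253  # the sustained turbo limit on most enthusiast boards

print(f"Advertised: {pl1_watts} W, allowed: {pl2_watts} W "
      f"({pl2_watts / pl1_watts:.1f}x the advertised figure)")
```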

The most power-hungry consumer-grade CPU before the i9-13900K was the FX-9590 from the AM3+ era. It didn't perform even close to the R7-5950X but it used a crap-tonne more power.

Hell, even with the insanely-powerful video cards of today, the most power-hungry video card ever made came out nine years ago, in 2014, with a TDP of 580W. The suggested PSU for that card was 950W.
Powercolor Radeon R9 290x2 Devil 13 4GB

Things aren't nearly as bad today with regard to power use as they appear. It's just that, with the war in Ukraine and the resultant spike in energy costs across the EU (caused by terrible energy decisions made by clueless politicians), power usage has come under more of a microscope than it ever had before. Couple that with the artificially-inflated power consumption numbers caused by AMD and Intel wanting to occupy the "top spot" on benchmark charts, and you get the current hand-wringing.

Let's face it, people are just plain stupid sometimes. They behave like the top-spot CPU or GPU is somehow relevant to them even if they're not buying that specific product. Like, sure, the RTX 4090 is the fastest card in the world, but what does that have to do with the noob who bought an RTX 4070 because he assumed it must be faster than an RX 7900 XT because "It's nVidia, just like the RTX 4090!"?

This is the kind of guano-insane mindset that has brought us to where we are now.
But the diagrams you linked show total system power. I was talking about CPU only power consumption. If you compare numbers on your linked diagrams, you see that the FX-8150 is around 7700X level, which was absolutely insane back then, but it doesn't even come close to the 7950X, which sits a good 100 W higher. That's what motherboards have to deal with today, and that's (partly) why there's a bigger difference in the low and high end.
^^^ From the post that you were responding to. Please note the bold/italic text. ^^^
Since you "like" it when I repeat myself... :roll:

Power sipped through 4 slots is not the same as power consumed through one single socket. In the first case, you just build the normal PCI-e circuitry 4 times. In the second case, you have to design an entirely new power delivery to suit the higher load.
 
Joined
Dec 10, 2022
Messages
486 (0.66/day)
System Name The Phantom in the Black Tower
Processor AMD Ryzen 7 5800X3D
Motherboard ASRock X570 Pro4 AM4
Cooling AMD Wraith Prism, 5 x Cooler Master Sickleflow 120mm
Memory 64GB Team Vulcan DDR4-3600 CL18 (4×16GB)
Video Card(s) ASRock Radeon RX 7900 XTX Phantom Gaming OC 24GB
Storage WDS500G3X0E (OS), WDS100T2B0C, TM8FP6002T0C101 (x2) and ~40TB of total HDD space
Display(s) Haier 55E5500U 55" 2160p60Hz
Case Ultra U12-40670 Super Tower
Audio Device(s) Logitech Z200
Power Supply EVGA 1000 G2 Supernova 1kW 80+Gold-Certified
Mouse Logitech MK320
Keyboard Logitech MK320
VR HMD None
Software Windows 10 Professional
Benchmark Scores Fire Strike Ultra: 19484 Time Spy Extreme: 11006 Port Royal: 16545 SuperPosition 4K Optimised: 23439
@Avro Arrow If anything, more complex parts are all but expected to fail sooner. They're pricier so one would assume they undergo more thorough testing. I somehow doubt that, since added functionality/parts increase test scenarios exponentially.
Sure, that's also a possibility, but that also didn't usually happen. Companies like MSi know what they're doing when it comes to making motherboards; it's old hat to them and would've been old hat even back then. There was probably just a tiny flaw somewhere on the board that got missed, and I was the unlucky recipient of it. The reason that I'll never buy MSi again is that they were a$$holes about it. See, if I were running customer service for a company like that, sure, the warranty period is the warranty period, but if a customer had purchased one of my expensive flagship products and it failed only three months after the warranty period was over, I would totally allow the customer to send the item in for examination. If it was clear that they'd done nothing to cause the problem, I would definitely take care of them. A customer who buys a flagship product is valuable, and given the choice between eating $100 to gain a loyal-as-hell customer (remember, the warranty had technically expired) who buys flagship boards, or saving $100 and possibly losing that customer (because their perception of my company would be terrible at that point), I'd choose the former seven days a week and twice on Sundays. If a flagship product fails only three months after the warranty expires, a company should be embarrassed by that. Instead, MSi was completely nonchalant.

At the end of the call, I informed them that I worked for Tiger Direct and that I would not sell another MSi-branded item for as long as I worked there. I estimate that they lost about $20,000 in sales over the next year while ASRock, ECS and Gigabyte probably gained the most benefit as a result.
Also, TigerDirect... the latest to bite the dust :cry:
Yeah, but don't feel bad. It bit the dust because it was a terrible company with terrible management. The upper-level management was a bunch of crooks and cronies. Tiger Direct deserved to die.

But the diagrams you linked show total system power.
Yeah, they ALL do, and since pretty much everything else in the system hasn't really changed in power use, they're all relevant. What, do you think that a hard drive or some RAM uses an extra 50W?

I was talking about CPU only power consumption. If you compare numbers on your linked diagrams, you see that the FX-8150 is around 7700X level, which was absolutely insane back then, but it doesn't even come close to the 7950X, which sits a good 100 W higher. That's what motherboards have to deal with today, and that's (partly) why there's a bigger difference in the low and high end.
My point was that there have been high-watt CPUs in every era and so motherboards had to be made to deal with them. Hell, in the AM2 era, Phenom I CPUs sometimes melted their motherboards. Do you think that happened because they didn't use much juice? Oh hell no! :laugh:
Since you "like" it when I repeat myself... :roll:
It sure beats repeating me! :D
Power sipped through 4 slots is not the same as power consumed through one single socket.
Not at the socket itself, but it all comes from a single source that must be made more robust to handle it.
In the first case, you just build the normal PCI-e circuitry 4 times. In the second case, you have to design an entirely new power delivery to suit the higher load.
And what is that normal PCI-e circuitry all attached to? The power distribution circuits of the motherboard itself, where 300W flows as 300W before it's divided up into 4 circuits of 75W. That's how circuits work. No matter what the wattage is at each endpoint, the source has to carry all of them combined.
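To put some rough numbers on that (a back-of-the-envelope sketch only, assuming all of the slot power is drawn from the 12 V rail, when in reality it's split between 12 V and 3.3 V):

```python
# Four PCIe slots at the 75 W slot limit, all fed from the same board input.
slots = 4
watts_per_slot = 75
rail_voltage = 12  # simplification: treating all slot power as 12 V

total_watts = slots * watts_per_slot          # 300 W at the source
source_current = total_watts / rail_voltage   # 25 A the board input must carry

print(total_watts, "W total,", source_current, "A at the source")
```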
 

bug

Joined
May 22, 2015
Messages
13,836 (3.95/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
@Avro Arrow Yeah, I avoid MSI because of some subpar interaction with their customer support as well.

As for TigerDirect, I can't say I gave them a lot of business (I am US based). It's still sad to see brick and mortar going through these hard times. I know online is all the rage, but if you want to try a mouse or a keyboard before buying, or just look at a monitor to gauge whether it's everything the reviews make it out to be... well, good luck with that. Sure, you can return your online purchase, but that's just wasteful. And you get to pay for it.
 
Joined
Jan 14, 2019
Messages
12,548 (5.79/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Yeah, they ALL do and since pretty much everything else in the system hasn't really changed with power use, they're all relevant. What, do you think that a hard drive or some RAM uses an extra 50W?
No, but since I was talking about CPU only power consumption, those diagrams aren't really an answer to what I said.

My point was that there have been high-watt CPUs in every era and so motherboards had to be made to deal with them. Hell, in the AM2 era, Phenom I CPUs sometimes melted their motherboards. Do you think that happened because they didn't use much juice? Oh hell no! :laugh:
Of course, because OCP, OVP and such weren't as robust as they are these days. VRMs also melted some motherboards because they were poorly built. My point stands: the difference between entry-level and high-end is much greater now than it used to be.

Not at the socket site itself but it all comes from a single source that must be made more robust to handle that.
Oh, the 24-pin cable/connector can handle that. ;)
From Wikipedia:
The 20–24-pin Molex Mini-Fit Jr. has a power rating of 600 volts, 8 amperes maximum per pin (while using 18 AWG wire).[16] As large server motherboards and 3D graphics cards have required progressively more and more power to operate, it has been necessary to revise and extend the standard beyond the original 20-pin connector, to allow more current using multiple additional pins in parallel. The low circuit voltage is the restriction on power flow through each connector pin; at the maximum rated voltage, a single Mini-Fit Jr pin would be capable of 4800 watts.
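For anyone wondering where that 4800 watt figure comes from, it's just P = V × I at the connector's maximum rated voltage; at the 12 V an ATX board actually runs, the same pin carries far less. A quick sketch using only the per-pin figures from the quote above:

```python
# Per-pin capability of a Molex Mini-Fit Jr contact (figures from the quote).
max_current_a = 8      # A per pin with 18 AWG wire
rated_voltage = 600    # V, the connector's maximum voltage rating
atx_rail      = 12     # V, what the 12 V rail actually delivers

print(rated_voltage * max_current_a)  # 4800 W: the headline per-pin number
print(atx_rail * max_current_a)       # 96 W: per-pin capacity at 12 V
```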

And what is that normal PCI-e circuitry all attached to? The power distribution circuits of the motherboard itself where 300W flows as 300W before it's divided up into 4 circuits of 75W. That's how circuits work. No matter what the wattage is at each end of it, it's all of them together at the source.
Okay, show me how complicated the 75 W PCI-Express power delivery circuit is compared to a CPU VRM.
 