
Official Statement from AMD on the PCI-Express Overcurrent Issue

Joined
Apr 29, 2014
Messages
4,290 (1.11/day)
Location
Texas
System Name SnowFire / The Reinforcer
Processor i7 10700K 5.1ghz (24/7) / 2x Xeon E52650v2
Motherboard Asus Strix Z490 / Dell Dual Socket (R720)
Cooling RX 360mm + 140mm Custom Loop / Dell Stock
Memory Corsair RGB 16gb DDR4 3000 CL 16 / DDR3 128gb 16 x 8gb
Video Card(s) GTX Titan XP (2025mhz) / Asus GTX 950 (No Power Connector)
Storage Samsung 970 1tb NVME and 2tb HDD x4 RAID 5 / 300gb x8 RAID 5
Display(s) Acer XG270HU, Samsung G7 Odyssey (1440p 240hz)
Case Thermaltake Cube / Dell Poweredge R720 Rack Mount Case
Audio Device(s) Realtek ALC1150 (onboard)
Power Supply Rosewill Lightning 1300Watt / Dell Stock 750 / Brick
Mouse Logitech G5
Keyboard Logitech G19S
Software Windows 11 Pro / Windows Server 2016
Ok, so this thread seems to be spiraling into a war. Should we lock and load?

In all seriousness, here is what it boils down to:

1: AMD decided to put a 6-pin instead of an 8-pin on the reference card to appear lower-power, instead of being smart and giving us the overclocking headroom and fewer problems.
2: AMD needs to release a driver fix to stop the card from overdrawing from the PCIe slot, and either shift that load to the 6-pin or limit it.
3: Even if you buy this card, you're not going to kill your motherboard with it unless you have the most basic/cheapest motherboard possible, and even then I would be skeptical.

Fact is, this should not be a problem, but it is. Is it a big problem that is going to result in dead motherboards? No, because motherboards, especially in this day and age, are pretty tough even on the cheap side. I have overloaded a motherboard's PCIe slots before; it takes a lot to actually do some damage. But the fact is AMD was beyond foolish not only to skip the 8-pin, but to let this pass through like this instead of letting the 6-pin take the brunt. PSUs in this day and age have at least an 8-pin, even on the cheapest entry-level unit you would want to buy for a gaming rig (speaking ~500 watt). Either way, this does not ruin the card or the value of what you're getting, but it definitely makes aftermarket variants look a lot more appealing.
 
Joined
Jun 21, 2016
Messages
1,125 (0.37/day)
System Name Team Crimson
Processor AMD FX 8320
Motherboard Gigabyte GA-78LMT-USB3
Cooling Corsair H80i
Memory DDR3 16GB Crucial Ballistix Sport
Video Card(s) Sapphire Nitro+ RX 480 8GB RAM
Storage Samsung 850 EVO 250GB / Crucial MX300 275GB
Display(s) AOC 2752H 27" 1080p
Case NZXT Source 220 Windowed
Power Supply Antec Earthworks 650W
Mouse Logitech M510/ AGPTEK T-90 Zelotes
Keyboard Logitech K360/ iBUYPOWER TTC RED Switch Mechanical
Software Windows 8.1 64 Bit
1: AMD decided to put a 6-pin instead of an 8-pin on the reference card to appear lower-power, instead of being smart and giving us the overclocking headroom and fewer problems.

Ed from Sapphire gave a cryptic answer while under NDA when asked what connector the RX 480 NITRO would have: he said it has an 8-pin, but that you really don't need it. He seemed to suggest that plugging in the additional 2 pins was optional.
 
Joined
Jul 1, 2005
Messages
5,197 (0.73/day)
Location
Kansas City, KS
System Name Dell XPS 15 9560
Processor I7-7700HQ
Memory 32GB DDR4
Video Card(s) GTX 1050/1080 Ti
Storage 1TB SSD
Display(s) 2x Dell P2715Q/4k Internal
Case Razer Core
Audio Device(s) Creative E5/Objective 2 Amp/Senn HD650
Mouse Logitech Proteus Core
Keyboard Logitech G910
That's why AIBs have different hardware IDs to identify specific hardware, so that the driver doesn't "assume" things like this, it "knows" them.

Oh man you're fucking delusional. Source please.
 
Joined
Mar 10, 2014
Messages
1,793 (0.46/day)
Ed from Sapphire gave a cryptic answer while under NDA when asked what connector the RX 480 NITRO would have: he said it has an 8-pin, but that you really don't need it. He seemed to suggest that plugging in the additional 2 pins was optional.

Nothing cryptic about that; those 2 extra pins are just ground. Again, it's quite safe to draw more than 75W from a 6-pin connector if you have a high-end PSU.

I thought AIBs were adding 8-pins for higher total power limits for overclocking?
I have not seen anything concrete showing that an 8-pin would decrease the draw on the PCIe slot, as my understanding is that this is regulated by the GPU itself.

That is correct; the reference RX 480 has a solid VRM. It's just routed for a 50/50 power distribution between the PCIe slot and the PCIe power connector.
 
Joined
Sep 6, 2013
Messages
3,329 (0.81/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 32GB - 16GB G.Skill RIPJAWS 3600+16GB G.Skill Aegis 3200 / 16GB JUHOR / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes/ NVMes, SATA Storage / NVMe boot(Clover), SATA storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
Son, you just went full retard. I think we're about done here.
Nope. I will have to try really, really hard to drop to your level.

You still are really not getting how nvidia boost works...
Thanks to GPU boost, basically 0w.

Maybe I am missing something, but I don't see anyone explaining how you can get 20% extra performance without consuming any more power. Can someone explain that magic to me? The card consumes 74W on average at default clocks; it gets overclocked, it scores 20% higher in performance, and I am to assume that average power consumption remained under 75W because of Nvidia Boost? Oh, please explain.

I learned about the PCI-E bus because of this:
View attachment 76280
...and this:
View attachment 76281

...and about 28 more reasons in my office - I fix this stuff occasionally, if you know what I mean.

Now, since we've come down to defensive insults: what makes you a specialist in this area?

P.S. Boards are not for sale! Can trade a Z77 for cheap air conditioning :toast:
When you don't have any arguments, just throw degrees and hardware in the other person's face; that will make you look more credible, I guess. You think you are the first person on the internet to start a post with "You should hear me, I am an engineer" and then write unbelievable BS? I am not saying that you are talking BS. I am just saying that taking pictures of your hardware doesn't make you an expert. You think I bought my PC yesterday? And no, I haven't thought about PCIe power draw, and I bet 99% of those posting in here haven't either. The last time I worried about a graphics card and a bus was when running a GeForce 2 MX64 with the AGP at 83MHz.

Reinforcements... :ohwell:
 
Joined
Mar 10, 2014
Messages
1,793 (0.46/day)
Nope. I will have to try really, really hard to drop to your level.




Maybe I am missing something, but I don't see anyone explaining how you can get 20% extra performance without consuming any more power. Can someone explain that magic to me? The card consumes 74W on average at default clocks; it gets overclocked, it scores 20% higher in performance, and I am to assume that average power consumption remained under 75W because of Nvidia Boost? Oh, please explain.
:ohwell: When you don't have any arguments, just throw degrees and hardware in the other person's face; that will make you look more credible, I guess. You think you are the first person on the internet to start a post with "You should hear me, I am an engineer" and then write unbelievable BS? I am not saying that you are talking BS. I am just saying that taking pictures of your hardware doesn't make you an expert. You think I bought my PC yesterday? And no, I haven't thought about PCIe power draw, and I bet 99% of those posting in here haven't either. The last time I worried about a graphics card and a bus was when running a GeForce 2 MX64 with the AGP at 83MHz.

Reinforcements...

I don't think you would get that 20% performance out of it unless you have a truly amazing chip. Nvidia sets power restrictions in the BIOS; if you don't ask for more power while overclocking (i.e., don't touch the TDP percentage), it will throttle the clocks to stay within the power limit set by the BIOS.
 

silentbogo

Moderator
Staff member
Joined
Nov 20, 2013
Messages
5,540 (1.38/day)
Location
Kyiv, Ukraine
System Name WS#1337
Processor Ryzen 7 5700X3D
Motherboard ASUS X570-PLUS TUF Gaming
Cooling Xigmatek Scylla 240mm AIO
Memory 4x8GB Samsung DDR4 ECC UDIMM
Video Card(s) MSI RTX 3070 Gaming X Trio
Storage ADATA Legend 2TB + ADATA SX8200 Pro 1TB
Display(s) Samsung U24E590D (4K/UHD)
Case ghetto CM Cosmos RC-1000
Audio Device(s) ALC1220
Power Supply SeaSonic SSR-550FX (80+ GOLD)
Mouse Logitech G603
Keyboard Modecom Volcano Blade (Kailh choc LP)
VR HMD Google dreamview headset(aka fancy cardboard)
Software Windows 11, Ubuntu 24.04 LTS
 
Joined
Sep 2, 2014
Messages
259 (0.07/day)
Location
Emperor's retreat/Naboo Moenia
System Name Order66
Processor Ryzen 7 3700X
Motherboard Asus TUF GAMING B550-PLUS
Cooling AMD Wraith Prism (BOX-cooler)
Memory 16GB DDR4 Corsair Desktop RAM Vengeance LPX 3200MHz Red
Video Card(s) GeForce RTX 3060Ti
Storage Seagate FireCuda 510 1TB SSD
Display(s) Asus VE228HR
Case Thermaltake Versa C21 RGB
Audio Device(s) onboard Realtek
Power Supply Corsair RM850x
Software Windows10 64bit
Nope. I will have to try really, really hard to drop to your level.
.............................................................

Sorry mate, but the following comment of yours, which @Assimilator quoted, wasn't among your best:
john_ said:
You see, there are many things that the press will not tell you. You just learned about the PCIe bus power draw because the RX480 is an AMD card. If it was an Nvidia card, you wouldn't have known about it.

Seriously?!!
If it was an Nvidia card we would never have heard about it? :eek:
AMD managed to confuse the entire gaming community with their propaganda vs. the GTX 970 memory issue, and made people believe that the card had less memory than advertised, and now you expect me to believe that if NV's cards had a similar power issue (which is something far more serious than memory size, since it's a safety matter), no one would know?!! o_O
 
Joined
Feb 8, 2012
Messages
3,014 (0.65/day)
Location
Zagreb, Croatia
System Name Windows 10 64-bit Core i7 6700
Processor Intel Core i7 6700
Motherboard Asus Z170M-PLUS
Cooling Corsair AIO
Memory 2 x 8 GB Kingston DDR4 2666
Video Card(s) Gigabyte NVIDIA GeForce GTX 1060 6GB
Storage Western Digital Caviar Blue 1 TB, Seagate Barracuda 1 TB
Display(s) Dell P2414H
Case Corsair Carbide Air 540
Audio Device(s) Realtek HD Audio
Power Supply Corsair TX v2 650W
Mouse Steelseries Sensei
Keyboard CM Storm Quickfire Pro, Cherry MX Reds
Software MS Windows 10 Pro 64-bit
Oh, please explain.
Ok, I'll be the one explaining this time.
The low-power 950 is at all times limited to a 75W power target... at all times. The sample that @W1zzard reviewed probably had considerably better ASIC quality than average, meaning it was able to reach higher clocks at lower voltages than an average sample. The rest is Boost 2.0: the power target is the same 75W, the clocks are offset by 200 MHz, and the boost tightens the voltages to stay inside 75W, and voilà, a stable overclock. Every review has a dynamic OC clock vs. voltage table... as you can see, there are multiple clock samples for each voltage state.
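
To make that concrete, here is a minimal sketch (in Python, with invented numbers; the real firmware tables and the chip's effective capacitance are not public) of how a Boost-style governor can hold a fixed power target by walking a clock/voltage table, using the P = C*V²*F approximation that comes up later in this thread:

```python
# Hypothetical clock (MHz) -> voltage (V) table, in the spirit of the
# dynamic OC tables in W1zzard's reviews. All values are illustrative.
VF_TABLE = [(1024, 0.90), (1126, 0.95), (1228, 1.00), (1329, 1.05), (1430, 1.10)]

C_EFF = 4.3e-8  # effective switching capacitance; invented, tuned so the
                # top state lands just under 75 W

def modeled_power(mhz, volts):
    """Dynamic power estimate from the P = C * V^2 * f approximation."""
    return C_EFF * volts**2 * (mhz * 1e6)

def pick_boost_state(power_limit_w=75.0):
    """Return the highest table entry whose modeled power fits the limit."""
    best = VF_TABLE[0]
    for mhz, volts in VF_TABLE:  # table is sorted by ascending clock/power
        if modeled_power(mhz, volts) <= power_limit_w:
            best = (mhz, volts)
    return best

print(pick_boost_state())      # (1430, 1.1) -> ~74 W, inside the target
print(pick_boost_state(65.0))  # throttles to a lower clock/voltage state
```

A better-than-average sample effectively shifts this table, reaching each clock at a lower voltage, which is how the reviewed card could hold higher clocks inside the same 75W budget.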
 
Joined
Sep 6, 2013
Messages
3,329 (0.81/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 32GB - 16GB G.Skill RIPJAWS 3600+16GB G.Skill Aegis 3200 / 16GB JUHOR / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes/ NVMes, SATA Storage / NVMe boot(Clover), SATA storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
Ok, I'll be the one explaining this time.
The low-power 950 is at all times limited to a 75W power target... at all times. The sample that @W1zzard reviewed probably had considerably better ASIC quality than average, meaning it was able to reach higher clocks at lower voltages than an average sample. The rest is Boost 2.0: the power target is the same 75W, the clocks are offset by 200 MHz, and the boost tightens the voltages to stay inside 75W, and voilà, a stable overclock. Every review has a dynamic OC clock vs. voltage table... as you can see, there are multiple clock samples for each voltage state.

The thing is, BiggieShady, that I am not talking about frequencies here. The card could boost to 2GHz and stay under 75W under certain conditions. But I am not talking about frequencies, am I? I am talking about performance. If the card weren't gaining 20% performance but 1-3%, I wouldn't be making any fuss about it.
 
Joined
Feb 8, 2012
Messages
3,014 (0.65/day)
Location
Zagreb, Croatia
System Name Windows 10 64-bit Core i7 6700
Processor Intel Core i7 6700
Motherboard Asus Z170M-PLUS
Cooling Corsair AIO
Memory 2 x 8 GB Kingston DDR4 2666
Video Card(s) Gigabyte NVIDIA GeForce GTX 1060 6GB
Storage Western Digital Caviar Blue 1 TB, Seagate Barracuda 1 TB
Display(s) Dell P2414H
Case Corsair Carbide Air 540
Audio Device(s) Realtek HD Audio
Power Supply Corsair TX v2 650W
Mouse Steelseries Sensei
Keyboard CM Storm Quickfire Pro, Cherry MX Reds
Software MS Windows 10 Pro 64-bit
But I am not talking about frequencies, am I? I am talking about performance.
If you are talking about performance then you are talking about frequency ... you are not gaining performance by dynamically adding compute units :rolleyes:
 
Joined
Nov 5, 2015
Messages
501 (0.15/day)
Location
Skopje, Macedonia
System Name The Tesseract Cube
Processor AMD Ryzen 5 3600
Motherboard MSI X570A-PRO
Cooling DeepCool Maelstrom 240mm, 2 X DeepCool TF120S (radiator fans), 4 X DeepCool RF120 (case fans)
Memory 2 x 16gb Kingston HyperX 3200mhz
Video Card(s) Sapphire Radeon RX 6800 Nitro + 16GB
Storage Corsair MP400 G3 1TB, Western Digital Caviar Blue 1TB
Display(s) MSI MAG241C Full HD, 144hz FreeSync
Case DeepCool Matrexx 55
Audio Device(s) MB Integrated, Sound Blaster Play 3 (Headset)
Power Supply Corsair CX650M Modular 80+ Bronze
Mouse Corsair Dark Core Pro Wirless RGB
Keyboard MSI GK30 Mecha-Membrane
Software Windows 10 Pro
Benchmark Scores CPUZ: Single Thread - 510 Multi Thread - 4.050 Cinebench R20: CPU - 3 500 score
All this talk for no reason.
Everything is back to square one.
Nothing solved, nothing learned.
Just fanboys fighting all over TPU.

Truth be told, nobody should compare AMD to Nvidia, because they have different approaches to the GPU market.

I have owned 4 AMD cards, none of which blew up my system (on a side note, my old PSU almost did; people know what I mean).

I guess this one won't either.
JayzTwoCents said clearly in his review of this card: "It makes systems unstable if they have low- or mid-class mobos."

It does not blow up hardware, and it never will. It is just an excuse to make AMD look bad over a small problem their product has.

So what? No company makes a perfect product of any type, and nobody bitches about most of those brands and names.
 
Joined
Sep 6, 2013
Messages
3,329 (0.81/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 32GB - 16GB G.Skill RIPJAWS 3600+16GB G.Skill Aegis 3200 / 16GB JUHOR / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes/ NVMes, SATA Storage / NVMe boot(Clover), SATA storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
If you are talking about performance then you are talking about frequency ... you are not gaining performance by dynamically adding compute units :rolleyes:
Does anyone understand basic things here?

By increasing the frequency you don't necessarily gain performance. If the card is limited in how much power it will take from the PCIe bus, staying at or under 75W, it will throttle. But if the result of overclocking the card is 20% extra performance, then the card doesn't stop at 75W; it asks for, and gets, more power from the PCIe bus. Remember: at standard speeds, based on the review, the card is already at 74W average. Not talking about the 79W peak; let's ignore that. If at 100% performance you have an average power consumption of 74W, then even if you keep the voltages stable, by increasing the frequency of the GPU AND the frequency of the GDDR5 you are going higher in power consumption. And power consumption probably increases by more than the 20% performance gain. For Nvidia's Boost to do some magic that keeps the card at 75W, it would have to drop voltages automatically at higher frequencies, and the card would have to remain stable.
 
Joined
Nov 19, 2012
Messages
753 (0.17/day)
System Name Chaos
Processor Intel Core i5 4590K @ 4.0 GHz
Motherboard MSI Z97 MPower MAX AC
Cooling Arctic Cooling Freezer i30 + MX4
Memory 4x4 GB Kingston HyperX Beast 2400 GT/s CL11
Video Card(s) Palit GTX 1070 Dual @ stock
Storage 256GB Samsung 840 Pro SSD + 1 TB WD Green (Idle timer off) + 320 GB WD Blue
Display(s) Dell U2515H
Case Fractal Design Define R3
Audio Device(s) Onboard
Power Supply Corsair HX750 Platinum
Mouse CM Storm Recon
Keyboard CM Storm Quickfire Pro (MX Red)
Ok, so this thread seems to be spiraling into a war. Should we lock and load?

In all seriousness, here is what it boils down to:

1: AMD decided to put a 6-pin instead of an 8-pin on the reference card to appear lower-power, instead of being smart and giving us the overclocking headroom and fewer problems.
2: AMD needs to release a driver fix to stop the card from overdrawing from the PCIe slot, and either shift that load to the 6-pin or limit it.
3: Even if you buy this card, you're not going to kill your motherboard with it unless you have the most basic/cheapest motherboard possible, and even then I would be skeptical.

Fact is, this should not be a problem, but it is. Is it a big problem that is going to result in dead motherboards? No, because motherboards, especially in this day and age, are pretty tough even on the cheap side. I have overloaded a motherboard's PCIe slots before; it takes a lot to actually do some damage. But the fact is AMD was beyond foolish not only to skip the 8-pin, but to let this pass through like this instead of letting the 6-pin take the brunt. PSUs in this day and age have at least an 8-pin, even on the cheapest entry-level unit you would want to buy for a gaming rig (speaking ~500 watt). Either way, this does not ruin the card or the value of what you're getting, but it definitely makes aftermarket variants look a lot more appealing.

Good intentions, not quite the most accurate info, though...

1: AMD decided to split the power supply 50/50 between the external power connector (which happens to be a 6-pin in this case) and the PCI-E slot. To illustrate:

front_full.jpg


This is a problem because, while the official spec for the 6-pin connector is 75W, it can realistically provide upwards of 200W continuously without any ill effects.
The PCI-E slot and the card's x16 connector have 5 (five) flimsy pins at their disposal for power transfer. Those cannot physically supply much more than 1A each; the better ones can sometimes handle 1.2A before significantly accelerating oxidation (due to both the heating and the current passing through), which increases resistance, which in turn requires more current to deliver the same power, further accelerating the oxidation... It's a feedback loop eventually leading to failure. (See the rough numbers sketched at the end of this post.)

2: AMD cannot fix this via drivers. The PCB has trace breaks, with missing resistors and wires, that would otherwise bridge the PCI-E slot supply to the 6-pin power connector; bridged, the connector would naturally be preferred by the current flow, since its path has the lower resistance and that's the path current prefers to take. It can only be permanently fixed by physical modification, no other way. AMD can lower the total power draw and thus relieve the stress on the PCI-E slot, but that will probably cost some of the GPU's performance. We'll see.

3: Buying and using this card won't kill your motherboard... straight away. Long-term consequences are unpredictable, but they cannot be positive. Would driving your car in first gear only, bumping into the RPM limiter all the time, kill it? Well, not right away, but... yeah. It's the same here: you're constantly at the realistic limit of an electromechanical system, and constant stress is not going to make it work longer or better, that's for sure.

The AIB partners would do well to design their PCBs such that the PCI-E slot only supplies power beyond the first 150W drawn from the auxiliary power connector, or something like that. Perhaps give one of the six phases to the slot and the remaining five to the connector... Or better yet, power the memory from the slot and the GPU from the power connector exclusively. Breaking the PCI-E spec that way is much less damaging, given the actual capabilities of the Molex Mini-Fit Jr. 2x3-pin connector that we like to call the 6-pin PCI-E power.
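
As a rough sanity check on the numbers above, a back-of-envelope sketch in Python (the 5-pin and ~1.0-1.2A figures and the 50/50 split are from this post, and the ~165W total is the measured board power discussed in this thread; everything else is assumed for illustration):

```python
# 5 (+12 V) supply pins in the slot, each realistically good for ~1.0-1.2 A.
PINS = 5
AMPS_PER_PIN = 1.1   # assumed midpoint of the ~1.0-1.2 A range
VOLTS = 12.0

slot_capacity_w = PINS * AMPS_PER_PIN * VOLTS
print(f"Realistic 12 V slot delivery: ~{slot_capacity_w:.0f} W")  # ~66 W

# Reference card, 50/50 split, at ~165 W total board power:
total_w = 165.0
slot_w = total_w / 2
print(f"Slot draw: {slot_w:.1f} W = {slot_w / VOLTS / PINS:.2f} A per pin")  # ~1.38 A

# Total board power that keeps the slot at the 75 W spec with a 50/50 split:
print(f"Total that keeps the slot in spec: {2 * 75.0:.0f} W")

# The alternatives suggested above, at the same 165 W total:
print(f"1 of 6 phases on the slot: ~{total_w / 6:.1f} W")  # ~27.5 W
print("Memory fed from the slot:  ~30 W (assumed memory rail power)")
```

Every scheme except the 50/50 split leaves the slot pins comfortably below their realistic ~1.1A ceiling.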
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.10/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Maybe I am missing something, but I don't see anyone explaining how you can get 20% extra performance without consuming any more power. Can someone explain that magic to me? The card consumes 74W on average at default clocks; it gets overclocked, it scores 20% higher in performance, and I am to assume that average power consumption remained under 75W because of Nvidia Boost? Oh, please explain.

Because performance is not directly related to power draw. Raising clock speeds does very little to power draw; it is raising the voltage that increases power draw. On the GTX 950 with the 6-pin, the GPU runs at 1.3V. On the GTX 950 without the 6-pin, the GPU runs at 1.0V. That is a massive difference, and the reason the card stays at 75W. It is also the reason the 6-pinless GTX 950 barely overclocks enough to match the stock speeds of the 6-pin GTX 950. The GTX 950 Strix with no overclock boosts to 1408MHz (@1.3V); the 6-pinless GTX 950 with an overclock only boosts to 1440MHz (@1.0V). That 1.0V is why it stays under 75W, and GPU Boost will lower that voltage and the clock speeds if it needs to in order to stay under 75W.
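
For a sense of how much the voltage term dominates, here are the two cards' numbers run through the P ∝ V²·f approximation posted later in this thread (the clocks and voltages are the ones quoted above; the model is a simplification and the result only a rough ratio):

```python
# Relative dynamic power of the two GTX 950 variants, P ~ V^2 * f
# (same GPU, so the capacitance constant cancels out of the ratio).
v_6pin,  f_6pin  = 1.3, 1408e6   # 6-pin Strix: 1.3 V @ 1408 MHz boost
v_nopin, f_nopin = 1.0, 1440e6   # 6-pinless:   1.0 V @ 1440 MHz boost

ratio = (v_6pin**2 * f_6pin) / (v_nopin**2 * f_nopin)
print(f"6-pin card vs. 6-pinless dynamic power: ~{ratio:.2f}x")  # ~1.65x
```

On that model the 6-pinless card draws roughly 60% of the 6-pin card's dynamic power despite its marginally higher clock, which is the headroom that keeps it inside 75W.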
 
Joined
Sep 6, 2013
Messages
3,329 (0.81/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 32GB - 16GB G.Skill RIPJAWS 3600+16GB G.Skill Aegis 3200 / 16GB JUHOR / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes/ NVMes, SATA Storage / NVMe boot(Clover), SATA storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
Because performance is not directly related to power draw. Raising clock speeds does very little to power draw; it is raising the voltage that increases power draw. On the GTX 950 with the 6-pin, the GPU runs at 1.3V. On the GTX 950 without the 6-pin, the GPU runs at 1.0V. That is a massive difference, and the reason the card stays at 75W. It is also the reason the 6-pinless GTX 950 barely overclocks enough to match the stock speeds of the 6-pin GTX 950. The GTX 950 Strix with no overclock boosts to 1408MHz (@1.3V); the 6-pinless GTX 950 with an overclock only boosts to 1440MHz (@1.0V). That 1.0V is why it stays under 75W, and GPU Boost will lower that voltage and the clock speeds if it needs to in order to stay under 75W.

Power draw goes up with frequency, not as much as with voltage, but it does go up. And not by very little; you are wrong there, especially when you overclock both the memory and the GPU.

Please try NOT to ignore the fact that the average power draw in the review at defaults is 74W. Even if the GTX 950 runs at 1.0V instead of 1.3V, in the end it consumes 74W on average. So even if the difference in voltage is massive, as you say, the card still uses 74W on average. So the starting line is there, at 74W. The card overclocks really well in W1zzard's review and gets 20% extra performance (I keep writing this; everyone conveniently ignores it). You don't get 20% extra performance with lower clocks and voltage. So if the card is at 74W at defaults, for that 20% extra performance it probably jumps to 90W through the PCIe bus. If it were staying at 75W, there wouldn't have been any serious performance gains, and W1zzard's conclusion would have been that the card is power limited.

Am I right @W1zzard ?
 
Joined
Nov 19, 2012
Messages
753 (0.17/day)
System Name Chaos
Processor Intel Core i5 4590K @ 4.0 GHz
Motherboard MSI Z97 MPower MAX AC
Cooling Arctic Cooling Freezer i30 + MX4
Memory 4x4 GB Kingston HyperX Beast 2400 GT/s CL11
Video Card(s) Palit GTX 1070 Dual @ stock
Storage 256GB Samsung 840 Pro SSD + 1 TB WD Green (Idle timer off) + 320 GB WD Blue
Display(s) Dell U2515H
Case Fractal Design Define R3
Audio Device(s) Onboard
Power Supply Corsair HX750 Platinum
Mouse CM Storm Recon
Keyboard CM Storm Quickfire Pro (MX Red)
For CPUs and GPUs, power dissipation increases linearly with frequency and proportionally to the square of the voltage. In simple terms, P = C*V²*F, where C = internal capacitance (specific to the individual specimen), V = voltage, and F = frequency. This is an oversimplification, but it provides a nice model that's fairly accurate until you get to LN2 stuff...
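
Plugging the disputed GTX 950 numbers into that model (a sketch: C is back-solved so the stock point matches the ~74W average from the review; the stock clock is an assumed value and cancels out of the ratio anyway):

```python
# P = C * V^2 * F, with C chosen so the stock operating point is ~74 W.
stock_v, stock_f, stock_p = 1.0, 1190e6, 74.0  # assumed stock voltage/clock
C = stock_p / (stock_v**2 * stock_f)

# Same voltage, +20% core clock -- the scenario being argued about:
oc_p = C * stock_v**2 * (stock_f * 1.20)
print(f"+20% clock at constant voltage: ~{oc_p:.0f} W")  # ~89 W, past the 75 W spec
```

On this simple model a 20% clock increase at constant voltage costs 20% more dynamic power; whether a real card eats that or throttles instead is exactly what this thread is arguing about.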
 
Joined
Oct 2, 2004
Messages
13,791 (1.87/day)
@McSteel
Are you sure the phases are physically tied each to one power input or the other? I wanted to ask exactly that, if anyone can trace the wiring on the PCB...

Either way, if AMD limits power draw to an actual 150W, that technically wouldn't really be cheating; they'd just be bringing it to what they've been advertising the whole time. Assuming they did it on purpose to boost framerates in reviews, hoping no one would notice, is just foolish, seeing what kind of shitstorm everyone made out of this. Especially since all reviewers also tackle power consumption, it would be a straight giveaway, as it was now.

So, calling it intentional, I'm not buying it. No one is this stupid.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.10/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Power draw goes up with frequency, not as much as with voltage, but it does go up. And not by very little; you are wrong there, especially when you overclock both the memory and the GPU.

Please try NOT to ignore the fact that the average power draw in the review at defaults is 74W. Even if the GTX 950 runs at 1.0V instead of 1.3V, in the end it consumes 74W on average. So even if the difference in voltage is massive, as you say, the card still uses 74W on average. So the starting line is there, at 74W. The card overclocks really well in W1zzard's review and gets 20% extra performance (I keep writing this; everyone conveniently ignores it). You don't get 20% extra performance with lower clocks and voltage. So if the card is at 74W at defaults, for that 20% extra performance it probably jumps to 90W through the PCIe bus. If it were staying at 75W, there wouldn't have been any serious performance gains, and W1zzard's conclusion would have been that the card is power limited.

Am I right @W1zzard ?

No one is ignoring it. We just keep telling you it is happening with no extra power draw; you are ignoring what we keep telling you. Clock speeds do not affect power draw by a noticeable amount, maybe 1W. Voltage affects power draw. GPU Boost guarantees the card stays within its power limit. Nvidia learned from its mistakes already: it went through this growing phase with Fermi and has since developed very good tech to guarantee cards don't go over their power limit.
 
Joined
Nov 19, 2012
Messages
753 (0.17/day)
System Name Chaos
Processor Intel Core i5 4590K @ 4.0 GHz
Motherboard MSI Z97 MPower MAX AC
Cooling Arctic Cooling Freezer i30 + MX4
Memory 4x4 GB Kingston HyperX Beast 2400 GT/s CL11
Video Card(s) Palit GTX 1070 Dual @ stock
Storage 256GB Samsung 840 Pro SSD + 1 TB WD Green (Idle timer off) + 320 GB WD Blue
Display(s) Dell U2515H
Case Fractal Design Define R3
Audio Device(s) Onboard
Power Supply Corsair HX750 Platinum
Mouse CM Storm Recon
Keyboard CM Storm Quickfire Pro (MX Red)
@McSteel
Are you sure the phases are physically tied each to one power input or the other? I wanted to ask exactly that, if anyone can trace the wiring on the PCB...

Either way, if AMD limits power draw to an actual 150W, that technically wouldn't really be cheating; they'd just be bringing it to what they've been advertising the whole time. Assuming they did it on purpose to boost framerates in reviews, hoping no one would notice, is just foolish, seeing what kind of shitstorm everyone made out of this. Especially since all reviewers also tackle power consumption, it would be a straight giveaway, as it was now.

So, calling it intentional, I'm not buying it. No one is this stupid.

Yeah, you can see that in this video. OK, the guy in it may not hold a master's in electronics, but it's clear the power phases are completely separated and the GPU simply draws power 50/50 from them.
A bit more current is drawn from the slot than from the aux connector, simply due to the higher resistance of the slot's power pins...

I'm sure @W1zzard could confirm if he could find a bit of free time to do it :)
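
To illustrate the "current takes the lower-resistance path" point from earlier in the thread (this is what a hardware bridge between the two supplies would exploit; the reference card's phases are split, and the resistances here are invented, not measured):

```python
# If the slot and the 6-pin fed one bridged rail, current would divide
# inversely with path resistance instead of being forced 50/50 by the VRM.
r_slot, r_aux = 0.030, 0.010  # ohms; assumed connector/trace resistances
i_total = 165.0 / 12.0        # ~13.75 A at ~165 W board power

i_slot = i_total * r_aux / (r_aux + r_slot)
i_aux  = i_total * r_slot / (r_aux + r_slot)
print(f"Slot: {i_slot:.2f} A, 6-pin: {i_aux:.2f} A")  # ~3.4 A vs. ~10.3 A
```

With the lower-resistance 6-pin path carrying most of the current, the slot's five pins would sit well under their ~1.1A-per-pin realistic limit.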
 
Joined
Sep 2, 2014
Messages
259 (0.07/day)
Location
Emperor's retreat/Naboo Moenia
System Name Order66
Processor Ryzen 7 3700X
Motherboard Asus TUF GAMING B550-PLUS
Cooling AMD Wraith Prism (BOX-cooler)
Memory 16GB DDR4 Corsair Desktop RAM Vengeance LPX 3200MHz Red
Video Card(s) GeForce RTX 3060Ti
Storage Seagate FireCuda 510 1TB SSD
Display(s) Asus VE228HR
Case Thermaltake Versa C21 RGB
Audio Device(s) onboard Realtek
Power Supply Corsair RM850x
Software Windows10 64bit
Joined
Sep 6, 2013
Messages
3,329 (0.81/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 32GB - 16GB G.Skill RIPJAWS 3600+16GB G.Skill Aegis 3200 / 16GB JUHOR / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes/ NVMes, SATA Storage / NVMe boot(Clover), SATA storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
No one is ignoring it. We just keep telling you it is happening with no extra power draw; you are ignoring what we keep telling you. Clock speeds do not affect power draw by a noticeable amount, maybe 1W. Voltage affects power draw. GPU Boost guarantees the card stays within its power limit. Nvidia learned from its mistakes already: it went through this growing phase with Fermi and has since developed very good tech to guarantee cards don't go over their power limit.
That thing you wrote, which I put in bold? In your dreams. In fact, it would have been a dream of mine too: just increase the frequencies on my hardware and expect only 1W more power consumption after getting 20% extra performance. Not to mention that in that case the RX 480 would have stayed close to 166W at any frequency, yet it goes to 187W if I remember correctly. Doesn't it? Yes, yes, I know: GPU Boost is a magical feature offering free performance for 1 extra watt.

No need to quote me again. Just read McSteel's post and stop there. Save us both some time.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.10/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
That thing you wrote, which I put in bold? In your dreams. In fact, it would have been a dream of mine too: just increase the frequencies on my hardware and expect only 1W more power consumption after getting 20% extra performance. Not to mention that in that case the RX 480 would have stayed close to 166W at any frequency, yet it goes to 187W if I remember correctly. Doesn't it? Yes, yes, I know: GPU Boost is a magical feature offering free performance for 1 extra watt.

No need to quote me again. Just read McSteel's post and stop there. Save us both some time.

With normal operation, when the clocks go up, the voltage goes up with them. That is why W1z includes voltage/clock tables in his reviews. AMD had to increase the voltage on the RX 480 to keep it stable at the clock speeds they wanted (this is also probably why it overclocks so poorly at stock voltage). However, when W1z does his overclocking he does not increase the voltage; he leaves it at stock. So while he increases the clock speeds, the voltage stays the same, the current going through the GPU stays the same, and you get no real power consumption increase.

In fact, one of the tricks to overclocking Nvidia cards is to actually lower the voltage to get higher clock speeds. If your card is stable but hitting the power limit, you can lower the voltage and raise the clocks to get better performance. It is a commonly used trick, and one I had to use on my GTX 970s.
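
The undervolt trick drops straight out of the P = C*V²*F model posted earlier: at a fixed power cap, the sustainable frequency scales with 1/V², so lowering the voltage buys clock headroom. A sketch with invented GTX 970-ish voltages:

```python
# At a fixed power limit, P = C * V^2 * F  =>  F_max ~ P_cap / (C * V^2),
# so dropping the voltage buys a (V_old / V_new)^2 frequency headroom.
v_old, v_new = 1.212, 1.150  # invented stock vs. undervolted values

headroom = (v_old / v_new) ** 2
print(f"Clock headroom at the same power cap: +{(headroom - 1) * 100:.1f}%")  # ~+11%
```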
 