Is my 3080 TUF OC a golden sample? (2160 MHz while gaming)

  • Thread starter Deleted member 193596

Deleted member 193596

Guest
So I've had a 3080 TUF OC for around a week now, and I've been overclocking it every day (one +15 MHz bump per day, validated by several hours of 1440p gaming).


I currently sit at +180 MHz on the core, which results in cold boosts of up to 2200 MHz with the stock fan curve.


My average gaming clock speed is 2145 MHz, and it peaks at 2160 MHz.

In very demanding titles that hold the card at its 352W maximum non-stop, it runs at around 2145 to 2160 MHz.

3080 overclock.png


By the way, that boost clock is 165 MHz above the EVGA FTW3 Ultra's.
GPU Z 3080.png


Here is the clock speed at full power consumption and utilization: 2145 to 2160 MHz on average.
3080 overclock 100% load.png
 
Joined
Sep 3, 2019
Messages
3,050 (1.71/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 160W PPT limit, 75C temp limit, CO -9~14
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F37h, AGESA V2 1.2.0.B
Cooling Arctic Liquid Freezer II 420mm Rev7 (Jan 2024) with off center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3600MHz 1.42V CL16-16-16-16-32-48 1T, tRFC:280, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~465W (378W current) PowerLimit, 1060mV, Adrenalin v24.7.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR1000, VRR on
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v23H2, OSB 22631.3737)
You "measure" power draw from readings in HWiNFO?
Just curious... does this sensor report the whole card (GPU+VRAM) or just the GPU?
I think nVidia doesn't provide any info about VRAM other than clocks. You may have to add another 30-40W of VRAM power draw on top of that 350W for total card power, given that GDDR6X is more power-hungry than GDDR6, which is around 25W max for an 8x1GB (14Gb, 1800MHz) VRAM configuration.

Very impressive clocks though... I was under the impression that 30-series AIB cards have a limited power budget, worse than the FE ones.
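To make the arithmetic in the post above concrete, here is a trivial sketch of the board-power breakdown being suggested. Every number is an assumption taken from the estimates above (a GPU-only sensor reading, the 30-40W GDDR6X guess, a bit of VRM/fan overhead), not a measured value:

```python
# Back-of-the-envelope total card power, IF the HWiNFO sensor were GPU-only.
gpu_core_w = 350   # hypothetical GPU-only sensor reading
vram_w = 35        # midpoint of the 30-40W GDDR6X estimate above
misc_w = 15        # assumed VRM losses, fans, etc.

total_w = gpu_core_w + vram_w + misc_w
print(total_w)  # 400
```

(As the OP clarifies below, the 352W figure is already total board power, so this correction doesn't apply in his case.)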
 

Deleted member 193596

Guest
You "measure" power draw from readings in HWiNFO?
Just curious... does this sensor report the whole card (GPU+VRAM) or just the GPU?
I think nVidia doesn't provide any info about VRAM other than clocks. You may have to add another 30-40W of VRAM power draw on top of that 350W for total card power, given that GDDR6X is more power-hungry than GDDR6, which is around 25W max for an 8x1GB (14Gb, 1800MHz) VRAM configuration.

Very impressive clocks though... I was under the impression that 30-series AIB cards have a limited power budget, worse than the FE ones.


The 350W is board power, of course.

In GPU-Z, the GPU chip alone pulls around 200W, while board power is 352W.

The 110% power limit corresponds to 352W.
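As a quick sanity check on those numbers: if the stock limit is 320W (an assumption, but consistent with 110% landing on 352W), the slider maps to watts like this:

```python
STOCK_LIMIT_W = 320  # assumed stock board-power limit of the 3080 TUF OC

def slider_to_watts(percent: float, stock_w: float = STOCK_LIMIT_W) -> float:
    """Map a power-limit slider percentage to an absolute board-power cap."""
    return stock_w * percent / 100

print(slider_to_watts(110))  # 352.0
```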
 
Only 200W for the GPU and 150W for the rest... it's hard to believe.
May I see what sort of sensors GPU-Z is reporting for the card?
I'm interested in such info.
 

Deleted member 193596

Guest
Only 200W for the GPU and 150W for the rest... it's hard to believe.
May I see what sort of sensors GPU-Z is reporting for the card?
I'm interested in such info.
That's completely normal for a 3080.

The memory alone pulls 70W.

3080 powerconsumption.png
 
Screenshot of GPU-Z sensors?
 
OK, I saw it now.
 
Joined
May 12, 2017
Messages
2,207 (0.84/day)
So I've had a 3080 TUF OC for around a week now, and I've been overclocking it every day (one +15 MHz bump per day, validated by several hours of 1440p gaming).


I currently sit at +180 MHz on the core, which results in cold boosts of up to 2200 MHz with the stock fan curve.


My average gaming clock speed is 2145 MHz, and it peaks at 2160 MHz.

In very demanding titles that hold the card at its 352W maximum non-stop, it runs at around 2145 to 2160 MHz.

View attachment 171612

By the way, that boost clock is 165 MHz above the EVGA FTW3 Ultra's. View attachment 171613

Here is the clock speed at full power consumption and utilization: 2145 to 2160 MHz on average. View attachment 171615

Let's see your score in Superposition 1080p Extreme. Upload a screenshot.

Download here: https://benchmark.unigine.com/superposition
 
Joined
Aug 27, 2011
Messages
983 (0.21/day)
Processor Intel core i9 13900ks sp117 direct die
Motherboard Asus Maximus Apex Z790
Cooling Custom loop 3*360 45mm thick+ 3 x mo-ra3 420 +Dual D5 pump and dual ddc pump
Memory 2x24gb Gskill 8800c38
Video Card(s) Asus RTX 4090 Strix
Storage 1TB Samsung 860Evo,2*2tb Samsung 970Evo Plus, 1tb Intel 660p nvme
Display(s) Sammsung G7 32”
Case Dynamic XL
Audio Device(s) Creative Omni 5.1 usb sound card
Power Supply Corsair AX1600i
Mouse Model O-
Keyboard Hyper X Alloy Origin Core
Run Time Spy or Port Royal and see if it can hold the clock. If it holds, it's very good.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,020 (1.28/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
Could be a golden sample, but I find with my 3080 TUF that it really depends on what you load the card with.

Could you spin up Quake II RTX (the first few levels are free) and a few heat-soaked runs of Time Spy Extreme? Those are the brutalizer tests. I have a feeling that after 10+ minutes of Quake II RTX the clock will be a fair bit lower, but I could be wrong; it would definitely show how much of a golden sample it is.
 
Joined
Mar 18, 2008
Messages
5,717 (0.96/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
RTX games are the real test. Fire up Metro Exodus, play for a solid hour, and report your clocks.
 

wolf

Better Than Native
RTX games are the real test. Fire up Metro Exodus, play for a solid hour, and report your clocks.
Absolutely, and even more so with non-RTX games: the older and 'lighter' they are, the higher the card can and will clock. I don't play either of the games he's shown screenshots of, but to my eyes it's WoW and Modern Warfare?
 

Deleted member 193596

Guest
Absolutely, and even more so with non-RTX games: the older and 'lighter' they are, the higher the card can and will clock. I don't play either of the games he's shown screenshots of, but to my eyes it's WoW and Modern Warfare?
I've played Battlefield V, Hitman 2, R6 Siege, NFS Heat, and Crysis Remastered so far.

In all of those games the clock speed hovers between 2125 and 2160 MHz, except in Siege with everything maxed out: it runs at over 300 FPS and the card drops to 2085-2100 MHz.
 
My 3090 hits 2160 MHz in BFV, but in Time Spy and Port Royal it drops to stock clocks. It can't hold it.
 
My 3090 hits 2160 MHz in BFV, but in Time Spy and Port Royal it drops to stock clocks. It can't hold it.

You and I both asked the OP to run a benchmark, and he has done neither test so far. The OP needs to show results; then we can judge whether it's a golden chip.
 

Deleted member 193596

Guest
You and I both asked the OP to run a benchmark, and he has done neither test so far. The OP needs to show results; then we can judge whether it's a golden chip.

Ever heard of time zones?
I posted this at midnight, went to bed, and I have a job.

I've only been home for a few hours, and earlier I only replied during my lunch break.

Here is a random Time Spy run:
 
Joined
Jan 5, 2006
Messages
18,428 (2.72/day)
System Name AlderLake / Laptop
Processor Intel i7 12700K P-Cores @ 5Ghz / Intel i3 7100U
Motherboard Gigabyte Z690 Aorus Master / HP 83A3 (U3E1)
Cooling Noctua NH-U12A 2 fans + Thermal Grizzly Kryonaut Extreme + 5 case fans / Fan
Memory 32GB DDR5 Corsair Dominator Platinum RGB 6000MT/s CL36 / 8GB DDR4 HyperX CL13
Video Card(s) MSI RTX 2070 Super Gaming X Trio / Intel HD620
Storage Samsung 980 Pro 1TB + 970 Evo 500GB + 850 Pro 512GB + 860 Evo 1TB x2 / Samsung 256GB M.2 SSD
Display(s) 23.8" Dell S2417DG 165Hz G-Sync 1440p / 14" 1080p IPS Glossy
Case Be quiet! Silent Base 600 - Window / HP Pavilion
Audio Device(s) Panasonic SA-PMX94 / Realtek onboard + B&O speaker system / Harman Kardon Go + Play / Logitech G533
Power Supply Seasonic Focus Plus Gold 750W / Powerbrick
Mouse Logitech MX Anywhere 2 Laser wireless / Logitech M330 wireless
Keyboard RAPOO E9270P Black 5GHz wireless / HP backlit
Software Windows 11 / Windows 10
Benchmark Scores Cinebench R23 (Single Core) 1936 @ stock Cinebench R23 (Multi Core) 23006 @ stock
Ever heard of time zones?
I posted this at midnight, went to bed, and I have a job.

I've only been home for a few hours, and earlier I only replied during my lunch break.

Here is a random Time Spy run:

When running the benchmark, keep GPU-Z's sensor tab running in the background and post a screenshot of it after the run.
The GPU won't stay at maximum boost. You can try other benchmarks as well, like Superposition, Heaven, or Valley.
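If you'd rather log this than watch a sensor window, the idea can be sketched in a few lines: sample clock and power during the run, then summarize afterwards. The summary below is plain Python; the commented-out sampling loop assumes NVIDIA's NVML via the third-party `pynvml` package and an NVIDIA GPU, so treat it as a sketch rather than a drop-in tool:

```python
from statistics import mean

def summarize(samples, power_limit_w=352):
    """Summarize (clock_mhz, power_w) samples taken during a benchmark run."""
    clocks = [c for c, _ in samples]
    at_limit = sum(1 for _, p in samples if p >= power_limit_w * 0.99)
    return {
        "avg_clock": mean(clocks),
        "max_clock": max(clocks),
        "pct_power_limited": 100 * at_limit / len(samples),
    }

# Sampling loop sketch (requires an NVIDIA GPU and `pip install pynvml`):
# import time, pynvml
# pynvml.nvmlInit()
# h = pynvml.nvmlDeviceGetHandleByIndex(0)
# samples = []
# for _ in range(600):  # ~10 minutes at 1 Hz
#     clock = pynvml.nvmlDeviceGetClockInfo(h, pynvml.NVML_CLOCK_GRAPHICS)
#     power = pynvml.nvmlDeviceGetPowerUsage(h) / 1000  # milliwatts -> watts
#     samples.append((clock, power))
#     time.sleep(1)

demo = [(2160, 352), (2145, 352), (2100, 310)]
print(summarize(demo))
```

The `pct_power_limited` figure is the useful one here: a workload that truly saturates the card should sit near 100%.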
 
Only 1982 MHz in Time Spy. Not that impressive.
 

wolf

Better Than Native
Here is a random Time Spy run:
As said above, the average clock frequency is 1,982 MHz, which isn't bad, but I don't think it's golden-sample territory. Can you heat-soak it and run Time Spy Extreme specifically, with your best clocks?
 
Joined
Nov 11, 2016
Messages
3,266 (1.16/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
Why so harsh on the OP? The fact that his 3080 can reach 2160 MHz means the chip reaches very high clocks.
A golden sample, or a good ASIC score, means the chip can reach high clocks. These chips tend to be leaky, meaning high power consumption at a given frequency/voltage, and they do wonders once the power limit is removed.
Vice versa, a chip with a low ASIC score tends to be less leaky at a given frequency/voltage, so it might get a higher Time Spy score when locked to the same PL as a high-ASIC chip, but it can't reach clocks above 2050 MHz under any circumstances.
TL;DR: the OP has to shunt-mod his card to get the most out of his 3080 :D
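For context on why a shunt mod raises the effective power limit: the card senses current through a tiny shunt resistor, and soldering a second resistor in parallel lowers the resistance it measures across, so the controller under-reads current and power by that ratio. A rough sketch of the arithmetic, where the resistor values are illustrative assumptions rather than the TUF's actual shunts:

```python
def parallel(r1_mohm: float, r2_mohm: float) -> float:
    """Equivalent resistance of two resistors in parallel (milliohms)."""
    return r1_mohm * r2_mohm / (r1_mohm + r2_mohm)

stock_shunt = 5.0   # assumed stock shunt value, milliohms
added = 5.0         # assumed resistor soldered on top of it
modded = parallel(stock_shunt, added)   # 2.5 milliohms

# The controller still assumes the stock value, so it sees only half the
# real current: at a reported "352W" the card actually draws about double.
underread_factor = modded / stock_shunt      # 0.5
actual_power_at_limit = 352 / underread_factor
print(actual_power_at_limit)  # 704.0
```

This is also why shunt mods are risky: the cooler and VRM were never specified for that kind of sustained draw.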
 

wolf

Better Than Native
Why so harsh on the OP? The fact that his 3080 can reach 2160 MHz means the chip reaches very high clocks.
Definitely not trying to be harsh, just trying to get a couple of things out of it; for one, an apples-to-apples comparison against my own RTX 3080 TUF.

The chip can definitely touch, and potentially hold, some very high clocks, but the 3080 is an interesting beast in the sense that it is SUCH a beast that it takes a lot to load it to the maximum extent possible. Power draw, power limiting, and sustained boost clocks are wildly variable depending on, among other things, the workload. To really see what the GPU is capable of in terms of sustained boost clocks, you need to saturate it with the heaviest workloads imaginable. 4K with all the trimmings is one such avenue, which is why I also recommend Quake II RTX: it's fully path-traced and will bring any RTX-capable card to its knees in sustained clocks and power draw. It's more nuanced than any GPU I've owned before in this respect; for example, two different games can both show 99% GPU utilisation, yet one pulls 270W and the other 350W at a locked max clock/voltage, which is a bit different from the OP using the core clock offset slider.

But at the end of the day, I can appreciate that if these are the games the OP plays and he is getting 2100+ MHz in them, that isn't bad by any means. My methodology is more to test and set profiles around my worst-case scenario rather than the best case. It does make me wonder, though, about my 270W vs 350W example: if I take the game that only wants 270W from the card, I can probably tweak a profile with much higher clocks just for that game.
 
Definitely not trying to be harsh, just trying to get a couple of things out of it; for one, an apples-to-apples comparison against my own RTX 3080 TUF.

The chip can definitely touch, and potentially hold, some very high clocks, but the 3080 is an interesting beast in the sense that it is SUCH a beast that it takes a lot to load it to the maximum extent possible. Power draw, power limiting, and sustained boost clocks are wildly variable depending on, among other things, the workload. To really see what the GPU is capable of in terms of sustained boost clocks, you need to saturate it with the heaviest workloads imaginable. 4K with all the trimmings is one such avenue, which is why I also recommend Quake II RTX: it's fully path-traced and will bring any RTX-capable card to its knees in sustained clocks and power draw. It's more nuanced than any GPU I've owned before in this respect; for example, two different games can both show 99% GPU utilisation, yet one pulls 270W and the other 350W at a locked max clock/voltage, which is a bit different from the OP using the core clock offset slider.

But at the end of the day, I can appreciate that if these are the games the OP plays and he is getting 2100+ MHz in them, that isn't bad by any means. My methodology is more to test and set profiles around my worst-case scenario rather than the best case. It does make me wonder, though, about my 270W vs 350W example: if I take the game that only wants 270W from the card, I can probably tweak a profile with much higher clocks just for that game.

Actually, the heaviest workload doesn't test the ASIC quality of the card; it only tests how far the power limit allows you to go :D. As you raise the PL, your scores improve accordingly.

The OP's 3080 chip was clearly meant to go into the Strix OC version, which has a much higher PL so the chip can stretch its legs.

Think of it like this: for an average 3080, a 400W PL is enough for its maximum clock of 2050-2070 MHz, whereas the OP's can reach 2160 MHz, which would need something like a 500W PL to sustain in every type of workload.

Nvidia was really clever to use the power limit as a way to standardize performance. In the Maxwell era and before, you could have chips that clocked 100-200 MHz apart, making performance between models wildly different, and vendors (EVGA specifically) sold higher-ASIC chips for a lot more money. Now, with a tight PL in place, a higher or lower ASIC makes little difference; cards generally end up around the same clocks under the same PL.
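For what it's worth, the "2160 MHz needs a much higher PL" intuition can be sketched with the usual first-order model: dynamic power goes as P ≈ k·f·V², and since voltage rises roughly linearly with frequency near the top of the V/f curve, power scales roughly with the cube of frequency. Anchoring the constants to the 400W / ~2060 MHz point above (both illustrative assumptions, not measurements):

```python
def power_at(freq_mhz: float, ref_freq: float = 2060.0, ref_power_w: float = 400.0) -> float:
    """Crude dynamic-power estimate: P ~ f * V^2 with V rising roughly
    linearly in f near Fmax, so power goes roughly as frequency cubed."""
    return ref_power_w * (freq_mhz / ref_freq) ** 3

print(round(power_at(2160)))  # 461
```

A cube-law toy model like this ignores leakage and workload mix, so the real requirement could easily be higher, as suggested above.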
 

wolf

Better Than Native
Actually, the heaviest workload doesn't test the ASIC quality of the card; it only tests how far the power limit allows you to go.
And this is more the angle I'm taking anyway: given that RTX 3080s are held back most of all by the PL, we try to maximise what we can get out of our silicon-lottery 'win' within that ~375W PL. The lighter the workload, the higher his clocks will be, which can give a false impression of amazing ASIC quality.

The ASIC quality might be amazing; I'm not the judge of that. But in a normalised workload between our individual cards, restricted by an identical PL, it doesn't seem like it will hold clocks much higher. Golden sample? Maybe. I suppose that's not my call to make, and you seem pretty convinced. Unless he's willing to do hard mods like a shunt mod, we may never truly find out, because the card can't draw unlimited power to show us.
 