
NVIDIA GeForce RTX 4080 Could Get 23 Gbps GDDR6X Memory with 340 Watt Total Board Power

Joined
Apr 12, 2013
Messages
7,003 (1.70/day)
The planet won't be saved by limiting GPU power consumption. In Europe, we actually punish ourselves to do even more for the planet, while elsewhere they don't give a fxxx about energy consumption, fuel consumption, waste management/recycling, renewable energy, etc.

If you don't limit heavy industry globally, not just in Europe or America or Asia, since that is the main cause of climate change, it's worthless doing anything else.
It's a matter of scale.
You achieve nothing with 1,000 electric vehicles while just one old truck in Europe or China or Africa or America or wherever consumes and pollutes 100 times more.
That's BS & you know it!

Again BS, pretty sure your online shopping is an ecological disaster ~

Like I was harping on about in the other thread, you've got to stop the rot starting with yourself; there's no point blaming a$$hole corporations or other individuals if you aren't doing more than the bare minimum at your end. Glad one of the saner voices out there picked up on this!

What's your per capita energy/resource consumption in the West? Wanna try that again :rolleyes:

Another similar take on this topic ~
 
Joined
Jun 18, 2015
Messages
575 (0.17/day)
700->600->450->340

I think we could see a '150W TDP' rumor in the middle of September.

The only reason they pulled it from 700 W to 340 W is the drop in cryptocurrency value: once they figured out that, under current circumstances, even a 700 W card won't make a good mining card, they pulled the values back to more "humane" ones. All the rumors about optimizations and so on are just BS.
 
Joined
Jan 21, 2021
Messages
67 (0.05/day)
People need to stop rationalizing these senseless increases in power draw; it's absurd, pure and simple.

3080 is about 40% faster than the 2080Ti.
3080 has about 30% higher TDP than 2080Ti.

3080 was built on 8nm compared to 12nm of 2080Ti...

That's not progress!
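
To put rough numbers on it (using the approximate figures above):

    perf/W gain ≈ 1.40 / 1.30 ≈ 1.08

That's only ~8% more performance per watt out of a 12 nm → 8 nm move.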

What are these CLOWNS up to?
Seriously
WHAT THE F**

I had a write-up about CPUs drafted, essentially reaming out Intel. But as I wrote the above,...

WHAT THE F*


Anyone?
 
Joined
Nov 11, 2016
Messages
3,259 (1.16/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razer DeathAdder V3
Keyboard Razer Huntsman V3 Pro TKL
Software win11
3080 is about 40% faster than the 2080Ti.
3080 has about 30% higher TDP than 2080Ti.

3080 was built on 8nm compared to 12nm of 2080Ti...

That's not progress!

What are these CLOWNS up to?
Seriously
WHAT THE F**

I had a write-up about CPUs drafted, essentially reaming out Intel. But as I wrote the above,...

WHAT THE F*


Anyone?

Just drag the power slider on the 3080 down to 250 W and it would still outperform the 2080 Ti by 30%.
If people don't know how to do that, perhaps just buy an RX 6400 and call it a day.
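
For anyone who'd rather script it than drag a slider, here's a minimal sketch using NVIDIA's NVML Python bindings (import pynvml, from the nvidia-ml-py package); it assumes a card and driver that permit software power limits, plus admin/root rights:

# Minimal sketch: cap an NVIDIA GPU's board power limit in software,
# the scripted equivalent of dragging the Afterburner power slider.
# Assumes the nvidia-ml-py package; needs admin/root to apply.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# NVML works in milliwatts
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
target_mw = 250_000  # e.g. the 250 W from the 3080 example above

# Clamp to what the board actually allows before applying
target_mw = max(min_mw, min(max_mw, target_mw))
pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)

print(f"Power limit set to {target_mw / 1000:.0f} W "
      f"(board allows {min_mw / 1000:.0f}-{max_mw / 1000:.0f} W)")
pynvml.nvmlShutdown()

Same effect as the slider, and it typically doesn't persist across reboots, so run it at startup if you want it permanent.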
 
Joined
Feb 22, 2022
Messages
101 (0.11/day)
System Name Lexx
Processor Threadripper 2950X
Motherboard Asus ROG Zenith Extreme
Cooling Custom Water
Memory 32/64GB Corsair 3200MHz
Video Card(s) Liquid Devil 6900XT
Storage 4TB Solid State PCI/NVME/M.2
Display(s) LG 34" Curved Ultrawide 160Hz
Case Thermaltake View T71
Audio Device(s) Onboard
Power Supply Corsair 1000W
Mouse Logitech G502
Keyboard Asus
VR HMD NA
Software Windows 10 Pro
Isn't the reality, without cognitive dissonance, that we should ALL show some restraint to give future generations a habitable planet?
I hope, while you're fretting about the fate of the planet, you stop farting, don't breed and don't have pets.

I also hope that if, one day, this is discovered to be one of the greatest cons of our time, those driving the climate-change narrative get jail time.

Also, I hear the earth is actually flat.
 
Joined
Jun 18, 2021
Messages
2,422 (2.14/day)
They're not the only ones ;)

No 'founders/reference model' did, but AIB cards certainly did, and the high end of the stack still had 250 W+ parts, like dual-GPU cards; same for AMD.

Also AMD single GPU between Fermi and current generation:
  • R9 290/390 - 275w
  • R9 290X/390X - 290w/275W
  • R9 Fury X - 275w
  • Vega 64 - 295w
  • Radeon VII - 295w

I'm not accounting for factory OC, of course, and the higher power from the AMD side (which still couldn't compete with NVIDIA at the time) goes to show the point I made in a different comment: these power-level increases with Ampere happened because they needed them to retain the gaming/benchmark crown.

If you don’t like it….. don’t buy it. Nobody is forcing you to buy a 4080. You can buy a 4070 or 4060 with the power draw you desire, and better performance, and save money! Win win!

It's not as simple or clear-cut. This is an overarching trend that extends to the entire lineup, and it will only get worse if not scrutinized. My original response, which many interpreted as merely climate-conscious or whatever (not false, but not really the point I was going for), was aimed specifically at the "normal in this day and age" comment. What in this day and age makes it OK to raise a power target they imposed on themselves for more than a decade? A day and age with all the current problems, no less, and with more software solutions to bridge the gap, like DLSS (which NVIDIA even pioneered!).

I'm not telling anyone what to buy or not; enjoy whatever you enjoy. But not that long ago there was pretty much only one player in the GPU space (AMD wasn't really competitive), and right now there are still only two (Intel has yet to deliver anything other than teasers), and both are just using cheap tricks like moving the power needle to beat the other instead of actually innovating (I'd say NVIDIA more so, since AMD is closer to its usual higher target; that should be viewed as a disadvantage, as it was for years).
 
Joined
Jun 16, 2022
Messages
74 (0.10/day)
Location
Hungary
System Name UNDERVOLTED (UV) silenced, 135W limited, high energy efficient PC
Processor Intel i5-10400F undervolt
Motherboard MSI B460 Tomahawk
Cooling Scythe Ninja 3 rev.B @ 620RPM
Memory Kingston HyperX Predator 3000 4x4GB UV
Video Card(s) Gainward GTX 1060 6GB Phoenix UV 775mV@1695MHz, 54% TDP (65 Watt) LIMIT
Storage Crucial MX300 275GB, 2x500GB 2.5" SSHD Raid, 1TB 2.5" SSHD, BluRay writer
Display(s) Acer XV252QZ @ 240Hz
Case Logic Concept K3 (Smallest Full ATX/ATX PSU/ODD case ever..)
Audio Device(s) Panasonic Clip-On, Philips SHP6000, HP Pavilion headset 400, Genius 1250X, Sandstrøm Hercules
Power Supply Be Quiet! Straight Power 10 500W CM
Mouse Logitech G102 & SteelSeries Rival 300 & senior Microsoft IMO 1.1A
Keyboard RAPOO VPRO 500, Microsoft All-In-One Keyboard, Cougar 300K
Software Windows 10 Home x64 Retail
Benchmark Scores More than enough Fire Strike: 3dmark.com/fs/28356598 Time Spy: 3dmark.com/spy/30561283
Seems the Green company has a fckN DARK future planned with high-power-consuming products. :slap: I can't see this trend as a good thing.
1. At the very least, don't raise the TDP; better still, lower it bit by bit with newer architectures, and
2. software devs should work the hardware with better optimization.
These two things should be the trend of this time period. Not just because of energy prices, but much more because of humanity's energy consumption.

The US administration should regulate these companies along these two directives.
 
Joined
Sep 17, 2014
Messages
21,572 (6.00/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
I hope, while you're fretting about the fate of the planet, you stop farting, don't breed and don't have pets.

I also hope that if, one day, this is discovered to be one of the greatest cons of our time, those driving the climate-change narrative get jail time.

Also, I hear the earth is actually flat.
Everybody farts! I'm on a real farting spree the last couple of days, tbh. I kid you not.

3080 is about 40% faster than the 2080Ti.
3080 has about 30% higher TDP than 2080Ti.

3080 was built on 8nm compared to 12nm of 2080Ti...

That's not progress!

What are these CLOWNS up to?
Seriously
WHAT THE F**

I had a write-up about CPUs drafted, essentially reaming out Intel. But as I wrote the above,...

WHAT THE F*


Anyone?
The price of RT.
 
Joined
Sep 13, 2021
Messages
86 (0.08/day)
Every new GPU generation has higher computing power per watt. End of rational discussion.
What a person believes they need for gaming (high-end GPUs . . . ) is irrational.
Suggested solution:
only non-overclockable or underclocked GPUs and CPUs are sold in the USA.

The free versions in the rest of the world. :clap:
 
Joined
Apr 15, 2021
Messages
868 (0.73/day)
As far as I'm concerned, unless you're gaming on a steam deck, switch or equivalent, you're bloody irresponsible with your outrageous and wasteful power usage in the name of a better gaming experience.

Your 125w GFX card is a disgrace to future generations, when power is only going to cost more and the planet is heating up, we're in a recession and next summer is set to be the hottest on record!

Responsible gamers should all limit themselves to sub 30w draw for the entire device, or you'll be answering to my sweaty, poor grandchildren dammit.
I think the most I draw is around 340 watts when gaming, but it usually stays between 240 and 285. Even running Frontline 30 vs. 30 in World of Tanks with max settings doesn't seem to go above 300 watts much. Rendering draws a lot more, since Iray just goes full blast on whatever devices you have set up to render, which could be the CPU, one or more graphics cards, or both. The one card gets up to 85°C when rendering, since it's a workstation card that uses a blower.
 
Joined
Dec 28, 2012
Messages
3,630 (0.86/day)
System Name Skunkworks
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabrent Rocket 4.0 2TB, MX500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software openSUSE tumbleweed/Mint 21.2
I mean, I outlined the many ways in which I do.

That's pretty personal, dude. I wouldn't call running an undervolted 3080, or say a 4080, an utter waste, but hey, you do you. I told you, all I can really do is rationalise it to myself; the whataboutism is a fun little extra.

Again man, you do you. I don't think my 3080 is bullshit, and yeah, RT so far has been a very fun path to walk with this class of card. Metro EE, DOOM Eternal, Control, Dying Light 2, Spiderman... The experience with all the dials up is right up my alley; just because it's not up yours doesn't automatically make it bullshit to everyone else.

Well, just keep buying into that segment then; easy. Both RDNA3 and Ada will have products that make do with 1x 8-pin / <225 W, and they'll be faster than Ampere or RDNA2 (at equal power draw). Vote with that wallet.

They're not the only ones ;)

No 'founders/reference model' did, but AIB cards certainly did, and the high end of the stack still had 250 W+ parts, like dual-GPU cards; same for AMD.

Also AMD single GPU between Fermi and current generation:
  • R9 290/390 - 275w
  • R9 290X/390X - 290w/275W
  • R9 Fury X - 275w
  • Vega 64 - 295w
  • Radeon VII - 295w


Winning Logic. I also consistently see a lot of complaints from what amounts to people who would've never bought the card anyway, even if it fit their power budget, because of things like brand and price.
What always gets me about the complaints is the new xx8x cards drawing so much more power than the old xx8x cards, while ignoring the sheer performance difference.
Like, if you want a 225 W card, not only can you still buy one, but now it's a cheaper xx7x- or xx6x-tier card. So you don't need the flagship anymore!

Do people just want the top-tier cards to be limited in power so they feel justified in spending $700 on an RTX 4060? Doesn't make any sense to me.
 
Joined
Jun 16, 2022
Messages
74 (0.10/day)
Location
Hungary
System Name UNDERVOLTED (UV) silenced, 135W limited, high energy efficient PC
Processor Intel i5-10400F undervolt
Motherboard MSI B460 Tomahawk
Cooling Scythe Ninja 3 rev.B @ 620RPM
Memory Kingston HyperX Predator 3000 4x4GB UV
Video Card(s) Gainward GTX 1060 6GB Phoenix UV 775mV@1695MHz, 54% TDP (65 Watt) LIMIT
Storage Crucial MX300 275GB, 2x500GB 2.5" SSHD Raid, 1TB 2.5" SSHD, BluRay writer
Display(s) Acer XV252QZ @ 240Hz
Case Logic Concept K3 (Smallest Full ATX/ATX PSU/ODD case ever..)
Audio Device(s) Panasonic Clip-On, Philips SHP6000, HP Pavilion headset 400, Genius 1250X, Sandstrøm Hercules
Power Supply Be Quiet! Straight Power 10 500W CM
Mouse Logitech G102 & SteelSeries Rival 300 & senior Microsoft IMO 1.1A
Keyboard RAPOO VPRO 500, Microsoft All-In-One Keyboard, Cougar 300K
Software Windows 10 Home x64 Retail
Benchmark Scores More than enough Fire Strike: 3dmark.com/fs/28356598 Time Spy: 3dmark.com/spy/30561283
Every new GPU generation has higher computing power per watt. End of rational discussion.
Efficiency and overall power consumption are two different things. Especially given the paradox whereby growing efficiency leads to more power consumption via a rising standard of living:
the Jevons paradox. Read about it.
That is one of the reasons we see overall global energy consumption growing without stop.

Seems a bit off under an NVIDIA topic, BUT actually it is not, because I play games, and I, and ever more people, care about the environment and consuming less energy.

And I am not some idiot fan of Greta. I get allergic reactions just seeing her. We simply have to do things a little better and live a slightly more nature-conscious life. That's it. That is why I wrote: at least do not raise the TDP. But they keep pushing it higher over time. Not so far in the past, the GTX 1080 was only 180 W TDP.
(source: TPU GTX 1080) If the technology is developing and efficiency is growing, it is enough to cap the new card's TDP, because overall performance will still increase. But going by the historical data, the Jevons paradox wins...
 
Joined
Oct 6, 2009
Messages
2,824 (0.52/day)
Location
Midwest USA
System Name My Gaming System
Processor Intel i7 4770k @ 4.4 Ghz
Motherboard Asus Maximus VI Impact (ITX)
Cooling Custom Full System Water cooling Loop
Memory G.Skill 1866 Mhz Sniper 8 Gb
Video Card(s) EVGA GTX 780 ti SC
Storage Samsung SSD EVO 120GB - Samsung SSD EVO 500GB
Display(s) ASUS W246H - 24" - widescreen TFT active matrix LCD
Case Bitfenix Prodigy
Power Supply Corsair AX760 Modular PSU
Software Windows 8.1 Home Premium
Another 'leak', another tweet, another thread, another bit of 'inside info'.
No matter the content, if you engaged in any way, then it worked.
The PR is very effective nowadays; see how much talk about watts is being generated.
So easy to get people to participate in the PR loop.

I'm doing my part too, of course: free entertainment that keeps me busy for a while, a sort of geek HW escapism, and the PR machine gets its profit.
Win-win situation :)
This is the trend in ALL Journalism. Report the first thing you hear, damn the facts, don't check references, and never retract anything.

All journalism has turned into the Jerry Springer rumor mill. (Politics, Tech, Economics, Crime, all of it!)

TPU at least used to say things like "Take this with a grain of salt" but that is becoming more rare here as well.
That said, when it comes to tech I do like to read some rumors :)~
 
Joined
Jan 25, 2020
Messages
2,100 (1.28/day)
System Name DadsBadAss
Processor I7 13700k w/ HEATKILLER IV PRO Copper Nickel
Motherboard MSI Z790 Tomahawk Wifi DDR4
Cooling BarrowCH Boxfish 200mm-HWLabs SR2 420/GTX&GTS 360-BP Dual D5 MOD TOP- 2x Koolance PMP 450S
Memory 4x8gb HyperX Predator RGB DDR4 4000
Video Card(s) Asrock 6800xt PG D w/ Byski A-AR6900XT-X
Storage WD SN850x 1TB NVME M.2/Adata XPG SX8200 PRO 1TB NVMe M.2
Display(s) Acer XG270HU
Case ThermalTake X71 w/5 Noctua NF-A14 2000 IP67 PWM/3 Noctua NF-F12 2000 IP67 PWM/3 CorsairML120 Pro RGB
Audio Device(s) Klipsch Promedia 2.1
Power Supply Seasonic Focus PX-850 w/CableMod PRO ModMesh RT-Series Black/Blue
Mouse Logitech G502
Keyboard Black Aluminun Mechanical Clicky Thing With Blue LEDs, hows that for a name?!
Software Win11pro
My grandchildren game on phones or pathetic little tablets (friggin blasphemy). They hate using the computers I built them, for anything (equally blasphemous). Touchscreen or death for them.

So don't worry, GPUs aren't going to kill the world.


This is clearly stated to be a prediction. Does TPU now have to stop assuming its forum members can read AND comprehend?
 
Joined
Mar 9, 2018
Messages
188 (0.08/day)
You can do both with good design, but you can't get those chart-topping scores that everyone wants to brag about without over-the-top power consumption.

Edit: Added Horizon: Zero Dawn

I did some tests recently with my 6600XT, an efficient card to begin with, but one that still ships overclocked out of its efficiency range out of the box.

Default core clock: 2668 MHz, ~2600 MHz in-game, 16Gbps Mem, 132-150W (Power +20%), 1.15v
Overclocked: 2750 MHz, ~2690 MHz in-game, 17.6Gbps Mem, 145-163W (Power +20%, hits power limits), 1.15v
Underclocked: 2050 MHz, ~2000 MHz in-game, 16Gbps Mem, 70-75W, 0.862v

But what of performance? Canned game benchmark runs, as I'm not a consistent in-game benchmarker:

R5 5600, 3200 MHz CL16, 1440p, PCIe 3.0 (B450 board), no Motion Blur

CP2077 - 18% lower fps at High settings
SotTR - 12% lower fps at HUB rec settings (~High but better visuals)
Forza Horizon 4 - 17% lower fps at Ultra settings
Horizon: Zero Dawn - 16% lower fps at Favor Quality settings (High)

1% lows were about the same in CP2077 (need to do more runs at 2690 to confirm), -13% in SotTR, -14% in FH4, and -11% in H:ZD.

So about a -15% FPS tradeoff for half the power usage. Comparing runs directly to each other, the 2000MHz test used 51% of the power of the 2690MHz tests in the same games (148W in CP2077 and SotTR vs 75W, 118W in FH4 vs 60W, 139W in H:ZD vs 71W).

15% fewer frames is a lot and it's not a lot, depending on what you're looking for.
Thank you, this is the type of discussion we need right now. Anyone who's worried about power consumption can do their own tweaking. The tools are there, have been for years now, and are really easy to use, with no risk whatsoever and without even voiding the warranty.
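
To make the tradeoff in those numbers concrete, here's a quick sketch over the figures quoted above (relative perf/W is just relative fps divided by relative power):

# Rough perf-per-watt check of the 6600XT numbers quoted above
# (~2690 MHz stock-ish runs vs. the 2000 MHz @ 0.862 V runs).
runs = {
    # game:              (fps_delta, watts_hi, watts_lo)
    "CP2077":            (-0.18, 148, 75),
    "SotTR":             (-0.12, 148, 75),
    "Forza Horizon 4":   (-0.17, 118, 60),
    "Horizon Zero Dawn": (-0.16, 139, 71),
}

for game, (fps_delta, w_hi, w_lo) in runs.items():
    rel_perf = 1.0 + fps_delta   # underclocked fps relative to stock
    rel_power = w_lo / w_hi      # underclocked power relative to stock
    gain = rel_perf / rel_power  # perf/W multiplier from underclocking
    print(f"{game:18}: {rel_perf:.2f}x perf at {rel_power:.2f}x power"
          f" -> {gain:.2f}x perf/W")

It works out to roughly 1.6-1.7x perf/W in every title: ~15% fewer frames for about half the power, exactly the tradeoff described.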
 
Joined
Sep 13, 2021
Messages
86 (0.08/day)
Efficiency and overall power consumption are two different things. Especially given the paradox whereby growing efficiency leads to more power consumption via a rising standard of living:
the Jevons paradox. Read about it.
That is one of the reasons we see overall global energy consumption growing without stop.
Overall global energy consumption is growing because large parts of Asia were underdeveloped; about 2 billion people left poverty during industrialization, and the world population is rising. In the Western world, energy consumption has been going down for about the last 10-15 years. This trend will hold.

Seems a bit off under an NVIDIA topic, BUT actually it is not, because I play games, and I, and ever more people, care about the environment and consuming less energy.

And I am not some idiot fan of Greta. I get allergic reactions just seeing her. We simply have to do things a little better and live a slightly more nature-conscious life. That's it. That is why I wrote: at least do not raise the TDP. But they keep pushing it higher over time. Not so far in the past, the GTX 1080 was only 180 W TDP.
(source: TPU GTX 1080) If the technology is developing and efficiency is growing, it is enough to cap the new card's TDP, because overall performance will still increase. But going by the historical data, the Jevons paradox wins...
They raise the TDP because there is competition. There are consumers who want the fastest and consumers who want the most economical solution. I prefer the economical solution: I use undervolting anyway, run my 2070 at around 100-150 W, and will buy a 4070, which I will also bring down to about 150 W, more than doubling the computing power. It depends on what the consumer wants.

From my life experience, I would say you can't stop climate change; there is no global dictatorship able to bring everyone's consumption down to a scientifically calculated, sustainable level, stop population growth, and bring all people to the same level. So what will happen: production costs in the West rise, and production moves to countries like China that are much worse in environmental protection and ideology. Keep technological leadership, big economic resources, and free markets; then you will be able to react adequately to climate change and protect your population. You can see this everywhere: poor countries aren't able to protect their people and are destroying their nature more rapidly. If you interfere, you get a war. NVIDIA or AMD will change nothing here; those people have other problems to discuss than the power consumption of high-end GPUs.
 
Joined
Jul 20, 2020
Messages
933 (0.64/day)
System Name Gamey #1 / #2
Processor Ryzen 7 5800X3D / Core i7-9700F
Motherboard Asrock B450M P4 / Asrock B360M P4
Cooling IDCool SE-226-XT / CM Hyper 212
Memory 32GB 3200 CL16 / 32GB 2666 CL14
Video Card(s) PC 6800 XT / Soyo RTX 2060 Super
Storage 4TB Team MP34 / 512G Tosh RD400+2TB WD3Dblu
Display(s) LG 32GK650F 1440p 144Hz VA
Case Corsair 4000Air / CM N200
Audio Device(s) Dragonfly Black
Power Supply EVGA 650 G3 / Corsair CX550M
Mouse JSCO JNL-101k Noiseless
Keyboard Steelseries Apex 3 TKL
Software Win 10, Throttlestop
Thank you, this is the type of discussion we need right now. Anyone who's worried about power consumption can do their own tweaking. The tools are there, have been for years now, and are really easy to use, with no risk whatsoever and without even voiding the warranty.

Most people don't care; I do only because I came into PC gaming on NUCs and old laptops, where squeezing out every last bit of efficiency by undervolting and controlling clocks is needed to eke out the optimal gaming and cooling experience from power- and thermally-limited hardware.

Getting a Gaming X 1060 PC was cool-running luxury but soon I was back to the efficiency game with a cheap PNY 1080 with a crap cooler. Running that at its efficiency peak (1911-1923 MHz @0.9v, 135W max) taught me where current GPUs do their best and allowed it to run at 74°C in warm summer rooms instead of thermal throttling at 83°C. Plus, electronics are likely to last longer if not run at their very edge of capability. IMO 5 or even 10% down is an OK tradeoff for longer equipment life.

It does seem to me based on this 6600XT that using an Infinity Cache with a narrower memory bus allows for greater reduction in minimum power usage, or at least a closer to linear reduction in power usage when underclocking into the efficiency zone. In other words, I expect the RTX 3000 series not to be able to lower their power as much as the RX 6000 series because of their wider busses. Also using GDDR6X memory will kill these types of efficiency improvements. I'd love to see how a 6700XT or 6800/XT does at these lower clocks to see if they also benefit as much from running in their efficiency zone.

It would be interesting to see if there's a sweet spot of perhaps even larger cache and relatively narrow but fastest GDDR6 non-X memory bus which allows for greater FPS per watt when run in the efficiency zone. Like a 6800XT core count with a 192-bit bus but 256MB IC that performs like a 6800 but with 6700XT or even lower power requirements when run around 2000MHz cores? It'll never happen but I wonder if that would even be a viable performer or if instead it runs up against another performance brick wall that I'm not considering.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,017 (1.27/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
I think the most I draw is around 340 watts when gaming
For 99.9% of my games I run my 3080 at 1725 MHz @ 775 mV; most games sit at 200-230 W, and absolute ball-tearers that use every last ounce can have it hit 255 W. Stock is about 5% faster than the UV and uses the full 320 W; unlimited power plus an overclock is 7-8% faster than the UV and uses 370 W. Bear in mind it's at 4K targeting 120 fps too, so it's really pushing the Ampere arch.

No matter the TDP of the next product I buy, and it could be from either brand, I'd certainly undervolt it to increase efficiency.
 

Lei

Joined
Jul 3, 2021
Messages
1,143 (1.02/day)
Location
usually in my shirt
Processor 3900x - Bykski waterblock
Motherboard MSI b450m mortar max BIOS Date 27 Apr 2023
Cooling αcool 560 rad - 2xPhanteks F140XP
Memory Micron 32gb 3200mhz ddr4
Video Card(s) Colorful 3090 ADOC active backplate cooling
Storage WD SN850 2tb ,HP EX950 1tb, WD UltraStar Helioseal 18tb+18tb
Display(s) 24“ HUION pro 4k 10bit
Case aluminium extrusions copper panels, 60 deliveries for every piece down to screws
Audio Device(s) sony stereo mic, logitech c930, Gulikit pro 2 + xbox Series S controller, moded bt headphone 1200mAh
Power Supply Corsair RM1000x
Mouse pen display, no mouse no click
Keyboard Microsoft aio media embedded touchpad (moded lithium battery 1000mAh)
Software Win 11 23h2 build 22631
Benchmark Scores cine23 20000
The only reason they pulled it from 700 W to 340 W is the drop in cryptocurrency value: once they figured out that, under current circumstances, even a 700 W card won't make a good mining card, they pulled the values back to more "humane" ones. All the rumors about optimizations and so on are just BS.
Hello, Ethereum is going proof-of-stake, not Bitcoin.
Satoshi Nakamoto is dead, you know; no one can make Bitcoin PoS. It will be mined for another ~140 years.

 
Joined
Jan 18, 2020
Messages
724 (0.44/day)
Nvidia's earnings were a bloodbath on the consumer side, and that's before any real recession has gotten going. This will be why the 4080 specs are changing: it'll need to be cheap, with reasonable power consumption.

Nvidia's gaming department revenue was down 33% year-over-year to $2.04 billion, which was a sharper decline than the company anticipated. Nvidia said that the miss was because of lower sales of its gaming products, which are primarily graphics cards for PCs.
 
Joined
Jul 15, 2020
Messages
991 (0.67/day)
System Name Dirt Sheep | Silent Sheep
Processor i5-2400 | 13900K (-0.025mV offset)
Motherboard Asus P8H67-M LE | Gigabyte AERO Z690-G, bios F26 with "Instant 6 GHz" on
Cooling Scythe Katana Type 1 | Noctua NH-U12A chromax.black
Memory G-skill 2*8GB DDR3 | Corsair Vengeance 4*32GB DDR5 5200Mhz C40 @4000MHz
Video Card(s) Gigabyte 970GTX Mini | NV 1080TI FE (cap at 85%, 800mV)
Storage 2*SN850 1TB, 230S 4TB, 840EVO 128GB, WD green 2TB HDD, IronWolf 6TB, 2*HC550 18TB in RAID1
Display(s) LG 21` FHD W2261VP | Lenovo 27` 4K Qreator 27
Case Thermaltake V3 Black|Define 7 Solid, stock 3*14 fans+ 2*12 front&buttom+ out 1*8 (on expansion slot)
Audio Device(s) Beyerdynamic DT 990 (or the screen speakers when I'm too lazy)
Power Supply Enermax Pro82+ 525W | Corsair RM650x (2021)
Mouse Logitech Master 3
Keyboard Roccat Isku FX
VR HMD Nop.
Software WIN 10 | WIN 11
Benchmark Scores CB23 SC: i5-2400=641 | i9-13900k=2325-2281 MC: i5-2400=i9 13900k SC | i9-13900k=37240-35500
Nvidia's earnings were a bloodbath on the consumer side, and that's before any real recession has gotten going. This will be why the 4080 specs are changing: it'll need to be cheap, with reasonable power consumption.
Very naive to think this kind of GPU spec changes in response to forum posts..

Much more reasonable: many people profit from the 'leaks' prior to launch, to generate traffic and to get your 'unbelievable and amazed' reaction. All PR stunts.

You may see many power consumption numbers, and all are true for a specific test. Each test looks into a specific parameter, and some of the tests need to loosen all TDP constraints, hence 900 W+ for the 4090 and so on.
So the company gets endless, free exposure for its upcoming product by letting you see and talk about each test's power consumption and the 'fluctuation' between tests. Power consumption, TDP, and global-warming buzzwords together, all in the service of a very intentional marketing campaign. Nothing more.

And if they, as a bonus, made you think you have any degree of influence on the process before launch, well then, it's a pure win on their side, and you will probably earn them even more free PR in the future because you posted and, for sure, made a change.
;)
 
Joined
Sep 13, 2021
Messages
86 (0.08/day)
Nvidia's earnings were a bloodbath on the consumer side, and that's before any real recession has gotten going. This will be why the 4080 specs are changing: it'll need to be cheap, with reasonable power consumption.
There will be no cheap prices and reasonable power consumption for high-end GPUs. Wrong peer group.
 
Joined
Apr 15, 2021
Messages
868 (0.73/day)
There will be no cheap prices and reasonable power consumption for high-end GPUs. Wrong peer group.
Exactly. Unless there's some new earth-shattering technological breakthrough with regard to energy, high performance + cheap will never be possible. Otherwise, we would all be driving sports cars & using high-end computers that get 120+ FPS on a 16K display at a cost of pennies per day for the energy consumed. That equates to something like doing WORK requiring the energy of small-scale nuclear fusion/fission in your own house or car without having to put the energy in. Solar is about as good as it gets, but the costs become astronomical when you consider the hardware & recurring costs. The laws of physics have us by the balls. :(
 
Joined
Jul 5, 2019
Messages
307 (0.17/day)
Location
Berlin, Germany
System Name Workhorse
Processor 13900K 5.9 Ghz single core (2x) 5.6 Ghz Allcore @ -0.15v offset / 4.5 Ghz e-core -0.15v offset
Motherboard MSI Z690A-Pro DDR4
Cooling Arctic Liquid Cooler 360 3x Arctic 120 PWM Push + 3x Arctic 140 PWM Pull
Memory 2 x 32GB DDR4-3200-CL16 G.Skill RipJaws V @ 4133 Mhz CL 18-22-42-42-84 2T 1.45v
Video Card(s) RX 6600XT 8GB
Storage PNY CS3030 1TB nvme SSD, 2 x 3TB HDD, 1x 4TB HDD, 1 x 6TB HDD
Display(s) Samsung 34" 3440x1400 60 Hz
Case Coolermaster 690
Audio Device(s) Topping Dx3 Pro / Denon D2000 soon to mod it/Fostex T50RP MK3 custom cable and headband / Bose NC700
Power Supply Enermax Revolution D.F. 850W ATX 2.4
Mouse Logitech G5 / Speedlink Kudos gaming mouse (12 years old)
Keyboard A4Tech G800 (old) / Apple Magic keyboard
Exactly. Unless there's some new earth-shattering technological breakthrough with regard to energy, high performance + cheap will never be possible. Otherwise, we would all be driving sports cars & using high-end computers that get 120+ FPS on a 16K display at a cost of pennies per day for the energy consumed. That equates to something like doing WORK requiring the energy of small-scale nuclear fusion/fission in your own house or car without having to put the energy in. Solar is about as good as it gets, but the costs become astronomical when you consider the hardware & recurring costs. The laws of physics have us by the balls. :(
It's not necessarily that the laws of physics have us by the balls, but rather that we have insatiable wants. We always want more and can never be fully content.
It's a never-ending battle of ever-moving goalposts.
 
Joined
Jan 21, 2021
Messages
67 (0.05/day)
Just drag the power slider on the 3080 down to 250 W and it would still outperform the 2080 Ti by 30%.
If people don't know how to do that, perhaps just buy an RX 6400 and call it a day.

The 1030 in the HTPC I made for my parents runs at 1733MHz with 0.900V (instead of ~1.05V)
I don't have a 2080Ti, but I bet... if you dragged its slider down too...

Edit: every device should have a button that switches clocks and voltages to "most for least".

One thing needs to be taken into account to make this button work its best, though: as time passes, extra voltage over the minimum required at manufacture is needed, so that things don't become unstable during their warranty periods (i.e., "broken" to most people).

Up to this point, what AMD/Intel/NVIDIA/all semiconductor companies have done is use a voltage offset. They ask: what's required for stability of the chip while running at its peak temperature before throttling (usually 100°C)? 1.000 V. So what's its voltage set to? 1.100 V. This ensures that at month 6, when 1.007 V is required, an RMA isn't happening.

Instead of doing this, there is no reason voltage can't be scheduled to increase over time depending on hours powered on, temperature, and utilization. To keep things REALLY simple, they could even just go by time since manufacture and be really conservative with the ramp-up (see the sketch at the end of this post); there would still be a span of about two years over which they could ramp from 1.000 V to 1.100 V. TONNES of power would be saved worldwide, even from that.

My 2500K: I ran it at 4200 MHz, undervolted by more than 0.25 V below its VID for 3700 MHz (the stock max Turbo Boost speed), so this isn't a new problem.
Today, because it's old and still in use (parents' HTPC), it's getting 0.02 V less than its VID specifies and running, Prime-stable, at 4800 MHz with DDR3-2133 C10 at 1.675 V. VCCIO was increased to 1.200 V to keep the minimum 0.5 V differential that prevents damage to the IMC (people running Nehalem/Sandy/Ivy/Haswell/Broadwell with DDR3 at 1.65 V should have done this, but I don't think the majority did; I guess it didn't end up being too important lol). But back to my point: so much more voltage is used than is needed, and so much power is wasted because of it.
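
That time-since-manufacture idea is simple enough to sketch. A toy illustration only; the 1.000 V floor, 1.100 V guard band, and two-year ramp are this post's example numbers, not any vendor's actual scheme:

# Toy sketch of the aging-compensation idea above: instead of a fixed
# worst-case voltage, ramp from the day-one stable level toward the
# traditional guard-banded level over ~2 conservative years of service.
V_NEW = 1.000      # volts needed for stability at manufacture
V_GUARD = 1.100    # traditional fixed worst-case setting
RAMP_DAYS = 730    # fully ramped after roughly two years

def vcore_for_age(days_in_service: float) -> float:
    """Linearly interpolate between fresh and guard-banded voltage."""
    frac = min(max(days_in_service / RAMP_DAYS, 0.0), 1.0)
    return V_NEW + frac * (V_GUARD - V_NEW)

for days in (0, 180, 365, 730, 1000):
    print(f"day {days:4}: {vcore_for_age(days):.3f} V")

At month 6 (~180 days), when the example above says ~1.007 V is needed, the ramp is already at ~1.025 V: still stable, yet ~0.075 V below the fixed 1.100 V guard band, which is where the worldwide power savings would come from.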
 