
NVIDIA RTX 4080 Rumored To Feature 420 W TDP

Joined
Mar 31, 2020
Messages
1,519 (0.88/day)
The upcoming generation of NVIDIA graphics cards looks set to feature significantly higher power budgets than its predecessors, according to a recent claim from leaker Kopite. The RTX 4090 has been rumored to feature a TDP above 400 W for some time, and this latest leak indicates that the RTX 4080 may also ship with an increased power requirement of 420 W. This RTX 4080 (PG139-SKU360) would represent an increase of 100 W over the RTX 3080, with power rises also expected for the RTX 4070 and RTX 4060. The RTX 4070 could see a power budget as high as 400 W if NVIDIA chooses to use GDDR6X memory for the card, while the RTX 4060 is rumored to see a 50 W increase to at least 220 W. Preliminary rumors indicate a launch date for these cards in late 2022.
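Taken together, the rumored figures imply the following gen-over-gen jumps (a quick sketch: the previous-gen numbers are reference-card TDPs, and every 40-series value here is an unconfirmed leak):

```python
# Rumored Ada TDPs vs. Ampere reference TDPs (all 40-series values are leaks, not confirmed).
rumored = {
    # card pair: (previous-gen reference TDP in W, rumored next-gen TDP in W)
    "RTX 4080 vs 3080": (320, 420),
    "RTX 4070 vs 3070": (220, 400),   # upper bound, if GDDR6X is used
    "RTX 4060 vs 3060": (170, 220),   # "at a minimum" per the leak
}

for pair, (old, new) in rumored.items():
    pct = (new - old) / old * 100
    print(f"{pair}: +{new - old} W ({pct:.0f}% increase)")
```

On these numbers the rumored 4070 would be the biggest proportional jump, at over 80% more board power than its predecessor.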



View at TechPowerUp Main Site | Source
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
42,609 (6.67/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
Probably need this just to stay cool.

 
Joined
Sep 15, 2011
Messages
6,761 (1.39/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
Somebody from gov should regulate this nonsense. This is getting ridiculous. A GPU that consumes more than a standard PSU from a couple of years ago...
 
Joined
Jun 16, 2013
Messages
1,457 (0.35/day)
Location
Australia
The relevant factor is, of course, the cost of the electricity, not how much the device actually consumes.
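A rough running-cost estimate for the rumored 420 W card (hours per day and price per kWh are illustrative assumptions, not figures from this thread):

```python
# Back-of-envelope running cost of a 420 W GPU at full load.
# Usage hours and electricity price are assumed, illustrative values.
tdp_watts = 420
hours_per_day = 3          # assumed daily gaming time
price_per_kwh = 0.30       # assumed price; varies widely by region

kwh_per_year = tdp_watts / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.0f} kWh/year ≈ {cost_per_year:.0f} per year at {price_per_kwh}/kWh")
```

Under these assumptions the GPU alone draws roughly 460 kWh per year, so the price per kWh where you live dominates the real-world cost.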
 
Joined
Dec 29, 2010
Messages
3,809 (0.75/day)
Processor AMD 5900x
Motherboard Asus x570 Strix-E
Cooling Hardware Labs
Memory G.Skill 4000c17 2x16gb
Video Card(s) RTX 3090
Storage Sabrent
Display(s) Samsung G9
Case Phanteks 719
Audio Device(s) Fiio K5 Pro
Power Supply EVGA 1000 P2
Mouse Logitech G600
Keyboard Corsair K95
It's kind of ironic, considering the world is coming to grips with global warming, that they keep turning up the wick on these parts. Hello, we want more performance at higher efficiency, not less.
 
Joined
Jan 25, 2011
Messages
531 (0.10/day)
Location
Inside a mini ITX
System Name ITX Desktop
Processor Core i7 9700K
Motherboard Gigabyte Aorus Pro WiFi Z390
Cooling Arctic esports 34 duo.
Memory Corsair Vengeance LPX 16GB 3000MHz
Video Card(s) Gigabyte GeForce RTX 2070 Gaming OC White PRO
Storage Samsung 970 EVO Plus | Intel SSD 660p
Case NZXT H200
Power Supply Corsair CX Series 750 Watt
Somebody from gov should regulate this nonsense. This is getting ridiculous. A GPU that consumes more than a standard PSU from a couple of years ago...

Unfortunately, there is no way around it. Dennard scaling hit a wall while Moore is still going. Transistors/die can still increase significantly but power/transistor isn't going down at the same rate. If you want more performance, you have to give it more power. The last thing we need is governments regulating high-performance computing.
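The Dennard-scaling point can be made concrete with the standard dynamic-power relation P ≈ α·C·V²·f. A sketch with illustrative numbers (not real process data) shows why stalled voltage scaling keeps power per transistor roughly flat even as transistors keep shrinking:

```python
# Dynamic switching power: P ≈ alpha * C * V^2 * f
# All numbers below are illustrative, not real process figures.
def dynamic_power(alpha, cap_farads, volts, freq_hz):
    """Activity factor * switched capacitance * V^2 * frequency."""
    return alpha * cap_farads * volts**2 * freq_hz

# In the Dennard era a node shrink also cut voltage, so power per
# transistor fell fast. Post-Dennard, capacitance still shrinks a bit,
# but V barely moves and clocks keep climbing:
p_old = dynamic_power(alpha=0.2, cap_farads=1.0e-15, volts=1.2, freq_hz=2.0e9)
p_new = dynamic_power(alpha=0.2, cap_farads=0.8e-15, volts=1.1, freq_hz=2.7e9)

print(f"power per transistor: {p_new / p_old:.2f}x")  # barely improves
# With transistor counts per die still roughly doubling, total chip power climbs.
```

The V² term is the crux: a 10% voltage drop used to buy back the power cost of everything else, and that lever is mostly gone.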
 
Joined
Mar 7, 2011
Messages
4,625 (0.92/day)
Somebody from gov should regulate this nonsense. This is getting ridiculous. A GPU that consumes more than a standard PSU from a couple of years ago...
Why just GPUs? Even Intel CPUs are getting out of hand, and with the rumoured return of HEDT, workstations might need at least a pair of kW-class PSUs to power up.

On another note, that leaker seems to be a Kimi Räikkönen fan.
 
Joined
Oct 17, 2011
Messages
63 (0.01/day)
System Name HydroAMD
Processor AMD Ryzen 5600X
Motherboard X570 Aorus Elite
Cooling Watercooling: 360mm Fans: 6x Corsair RGB 120mm
Memory 2x16GB G.Skill TridentZ Neo @ 3600Mhz
Video Card(s) XFX Speedster MERC319 AMD Radeon RX 6900 XT Black
Storage Samsung Evo 960 500GB - 2TB HP EX950 m.2 - 1TB 850 Pro
Display(s) Samsung 32" Curved QHD C32HG70 2560×1440
Case Corsair Crystal 570X
Audio Device(s) Sound Blaster Z
Power Supply Corsair AX860
Mouse Logitech G900 Chaos Spectrum
Keyboard Corsair K95 Platinum XT
Software Windows 11 Pro
Benchmark Scores 3DMark Time Spy - 18284
Joined
Dec 29, 2010
Messages
3,809 (0.75/day)
Processor AMD 5900x
Motherboard Asus x570 Strix-E
Cooling Hardware Labs
Memory G.Skill 4000c17 2x16gb
Video Card(s) RTX 3090
Storage Sabrent
Display(s) Samsung G9
Case Phanteks 719
Audio Device(s) Fiio K5 Pro
Power Supply EVGA 1000 P2
Mouse Logitech G600
Keyboard Corsair K95
Why just GPUs? Even Intel CPUs are getting out of hand, and with the rumoured return of HEDT, workstations might need at least a pair of kW-class PSUs to power up.

On another note, that leaker seems to be a Kimi Räikkönen fan.
Don't forget the optional hidden under-table chiller, which would go great with NVIDIA's 900 W GPU. You'll need to order two chillers lmao.
 
Joined
May 11, 2018
Messages
1,292 (0.53/day)
It looks more and more like I'll skip this generation. Going from a GTX 1080 Ti to an RTX 3080 was quite a leap in consumption and generated heat, and custom water cooling just means the heat is transferred to my room more quietly. I don't really want to add another 100 W of heat...
 

Sipu

New Member
Joined
Apr 1, 2022
Messages
4 (0.00/day)
AMD be like "we can do that now?" Funny how high power consumption was always an issue in the 2000s (OK, performance was shit), but now both Intel and NVIDIA are going crazy. Considering how efficient Zen and RDNA are, it will be interesting to see if they can scale that up to ridiculous power with the new gens.
 
Joined
Jun 21, 2013
Messages
606 (0.14/day)
Processor Ryzen 9 3900x
Motherboard MSI B550 Gaming Plus
Cooling be quiet! Dark Rock Pro 4
Memory 32GB GSkill Ripjaws V 3600CL16
Video Card(s) 3060Ti FE 0.9v
Storage Samsung 970 EVO 1TB, 2x Samsung 840 EVO 1TB
Display(s) ASUS ProArt PA278QV
Case be quiet! Pure Base 500
Audio Device(s) Edifier R1850DB
Power Supply Super Flower Leadex III 650W
Mouse A4Tech X-748K
Keyboard Logitech K300
Software Win 10 Pro 64bit
That's an assumption if ever there was one...
Where I live, all domestic electricity is hydro generated. :D
Energy = heat. So even if you are using 100% green energy, you are still contributing to global warming if you are using more of it than necessary.
 
Joined
Apr 2, 2011
Messages
2,849 (0.57/day)
Energy = heat. So even if you are using 100% green energy, you are still contributing to global warming if you are using more of it than necessary.

? I can't tell whether this is a troll answer. By definition, global climate change (because "warming" is inaccurate) is not influenced by the consumption of energy...and that's trivial to demonstrate. Nuclear power is taking unstable atoms, concentrating them until they create a controlled chain reaction, and then using their heat to drive a turbine. This process has literally been going on since the earth formed...and contributes no additional greenhouse gases. If you'd like to argue, there are millennia-old caves in Mexico that demonstrate this process is literally older than man. (Naica Crystal Caves)

By this logic, the only option would be to define that what you want is necessary, then do a calculation on total power draw rather than peak draw. Theoretically then, a 4080 shouldn't exist because the 3060 exists...and it can do the same calculations much slower, but overall more efficiently. Likewise, a 3060 should not exist because there are lower-spec and more efficient GPUs...let alone the system you've got being less efficient using CISC processors rather than RISC. If you don't get it yet, there's a point where this argument of efficiency is a joke race to the bottom...because technically humans have pencils. Our biological processes are more efficient than any computer...so it'd be more efficient to physically color and illustrate a game than to use a computer of any kind...which hopefully highlights how silly this point is by proxy.


With regards to the thread topic, a 4080 using 420 watts of power is...whatever it is. I'm honestly looking forward to the market cards that target 1080p resolution gaming...which in this generation might be the 4060 or 4070. If they release as absolutely energy hungry messes, then it's going to be a strong case to break out the 30x0 cards as a much better value for the money. It'll also hopefully be an opportunity for AMD to get its crap together and release cards right. That is to say they release a competitive product that forces Nvidia to be more price competitive (with drivers and BIOSes that aren't an utter joke).

Personally, I've dealt with the AMD 5700 xt lineup...and having months of wait to get the properly tuned bios for the cards that didn't make them stutter more than was anywhere near acceptable was...frustrating. Nvidia pushing an idiotic level of power consumption would be that opportunity in a nutshell. I may be a bit more optimistic than rational though.
 
Joined
May 11, 2018
Messages
1,292 (0.53/day)


If they release as absolutely energy hungry messes, then it's going to be a strong case to break out the 30x0 cards as a much better value for the money.

When we're talking about resolutions... I know that VR isn't even remotely relevant in the market, but it could really do with a doubling or tripling of graphics card performance.

And about going with 30x0 cards if the 40x0 proves inefficient - remember the 20x0 release. In spring of 2018 we had a crypto-market collapse, and used GTX 1080 Ti cards were plentiful in the summer (but prices in stores remained high; the market was still drunk on the success of the crypto boom). When the RTX 2080 was released in September 2018, the prices of used 1080 Ti cards actually shot up - because the new generation didn't bring any price/performance increase; you paid to be a guinea pig for new technologies like RT and DLSS, which were only very slowly getting released.
 
Joined
Sep 8, 2020
Messages
219 (0.14/day)
System Name Home
Processor 5950x
Motherboard Asrock Taichi x370
Cooling Thermalright True Spirit 140
Memory Patriot 32gb DDR4 3200mhz
Video Card(s) Sapphire Radeon RX 6700 10gb
Storage Too many to count
Display(s) U2518D+u2417h
Case Chieftec
Audio Device(s) onboard
Power Supply seasonic prime 1000W
Mouse Razer Viper
Keyboard Logitech
Software Windows 10
Regardless of power consumption, GPU prices are still very high in Europe; we'll have to wait a couple of years for prices to settle down, if they ever do.
 
Joined
Mar 28, 2020
Messages
1,761 (1.02/day)
The question now is, how much more performance are we getting from the increase in power requirements? 420 W represents a 100 W jump from a reference RTX 3080, which is slightly over a 30% increase, and I would expect in return that the card performs 50% faster at a minimum. I guess we will know towards the end of the year.
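For what it's worth, even under those hoped-for numbers the card would be more efficient, just hotter in absolute terms. A quick sanity check (the 50% performance figure is the poster's expectation, not a benchmark):

```python
# Hypothetical perf-per-watt change if a 420 W RTX 4080 were 50% faster
# than a 320 W reference RTX 3080 (an expectation, not a measured result).
power_old, power_new = 320, 420
perf_old, perf_new = 1.00, 1.50   # relative performance, assumed

power_increase = power_new / power_old - 1
eff_gain = (perf_new / power_new) / (perf_old / power_old) - 1

print(f"power: +{power_increase:.0%}, perf/W: +{eff_gain:.0%}")
```

So a 50% performance gain for 31% more power would still be about a 14% efficiency improvement, which would be modest by historical gen-over-gen standards.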
 

bug

Joined
May 22, 2015
Messages
13,843 (3.95/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Ffs, I have been buying midrange because the power draw was moderate and easy to manage...
 
Joined
Aug 24, 2004
Messages
217 (0.03/day)
Looking forward to miners dumping their 3080's on ebay for $300 after crypto circles the drain some more.
 
Joined
Sep 17, 2014
Messages
22,673 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Don't forget the optional hidden under-table chiller, which would go great with NVIDIA's 900 W GPU. You'll need to order two chillers lmao.

I think you figured it all out now. That's where SLI went! Double cooling for a single card!

Energy = heat. So even if you are using 100% green energy, you are still contributing to global warming if you are using more of it than necessary.
That's not how it works. Excess energy is converted to heat, but heat on its own does not contribute to global warming ;) Read a bit on the greenhouse effect.

Ffs, I have been buying midrange because the power draw was moderate and easy to manage...
Nvidia is just helping you and me to buy even lower in the stack. Which is a futile exercise, because are we really gaining performance of any measure gen-to-gen then?

I think GPU is slowly reaching a dead end. Turing was the writing on the wall. Its either going to have to go MCM, or it will stall. There is no other low hanging fruit either, much the same as it is on CPU.
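Rough orders of magnitude back up the greenhouse-effect correction above: direct waste heat from all human energy use is tiny compared to CO2 radiative forcing. All figures below are approximate, cited from memory:

```python
# Order-of-magnitude comparison: direct anthropogenic waste heat vs.
# greenhouse forcing. Approximate figures, for scale only.
global_power_use_w = 18e12        # ~18 TW of global primary energy consumption
earth_surface_m2 = 5.1e14         # Earth's surface area in m^2

direct_heat_forcing = global_power_use_w / earth_surface_m2
greenhouse_forcing = 2.0          # W/m^2, approximate CO2 radiative forcing

print(f"direct waste heat: ~{direct_heat_forcing:.3f} W/m^2")
print(f"greenhouse forcing: ~{greenhouse_forcing} W/m^2 "
      f"(~{greenhouse_forcing / direct_heat_forcing:.0f}x larger)")
```

On these numbers, all of humanity's waste heat combined is well over an order of magnitude smaller than the greenhouse effect of emitted CO2, so a hotter GPU matters climate-wise mainly through how its electricity is generated, not through the heat it dumps into your room.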
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.61/day)
Location
Ex-usa | slava the trolls
? I can't tell whether this is a troll answer. By definition, global climate change (because "warming" is inaccurate) is not influenced by the consumption of energy...and that's trivial to demonstrate. Nuclear power is taking unstable atoms, concentrating them until they create a controlled chain reaction, and then using their heat to drive a turbine. This process has literally been going on since the earth formed...and contributes no additional greenhouse gases. If you'd like to argue, there are millennia-old caves in Mexico that demonstrate this process is literally older than man. (Naica Crystal Caves)

By this logic, the only option would be to define that what you want is necessary, then do a calculation on total power draw rather than peak draw. Theoretically then, a 4080 shouldn't exist because the 3060 exists...and it can do the same calculations much slower, but overall more efficiently. Likewise, a 3060 should not exist because there are lower-spec and more efficient GPUs...let alone the system you've got being less efficient using CISC processors rather than RISC. If you don't get it yet, there's a point where this argument of efficiency is a joke race to the bottom...because technically humans have pencils. Our biological processes are more efficient than any computer...so it'd be more efficient to physically color and illustrate a game than to use a computer of any kind...which hopefully highlights how silly this point is by proxy.


With regards to the thread topic, a 4080 using 420 watts of power is...whatever it is. I'm honestly looking forward to the market cards that target 1080p resolution gaming...which in this generation might be the 4060 or 4070. If they release as absolutely energy hungry messes, then it's going to be a strong case to break out the 30x0 cards as a much better value for the money. It'll also hopefully be an opportunity for AMD to get its crap together and release cards right. That is to say they release a competitive product that forces Nvidia to be more price competitive (with drivers and BIOSes that aren't an utter joke).

Personally, I've dealt with the AMD 5700 xt lineup...and having months of wait to get the properly tuned bios for the cards that didn't make them stutter more than was anywhere near acceptable was...frustrating. Nvidia pushing an idiotic level of power consumption would be that opportunity in a nutshell. I may be a bit more optimistic than rational though.

I don't agree with anything in this long post. Everything you just posted is wrong.

420 watts for a mid-range card is unacceptable, and ugly of NVIDIA.
The 4080 won't be the high end. There will be a 4090, a 4090 Ti, and allegedly a Titan-class flagship.

Yes, climate change is indeed influenced by energy use - transport, industry, etc., which burn polluting coal, oil, and fossil gas.


Let's hope AMD's new 7000-series cards are OK on power consumption (look at the thread above, which states that 750 W and 650 W PSUs are recommended), and that NVIDIA will lose market share this generation.

And no, 1080p gaming is crap, move on to 2160p - better and nicer.
 
Joined
Dec 5, 2020
Messages
203 (0.14/day)
It's kind of ironic, considering the world is coming to grips with global warming, that they keep turning up the wick on these parts. Hello, we want more performance at higher efficiency, not less.
Whatever power they'll draw they'll be much more efficient. Efficiency and higher absolute power draw are not mutually exclusive.
 