Monday, June 6th 2022
NVIDIA RTX 4080 Rumored To Feature 420 W TDP
The upcoming generation of graphics cards from NVIDIA looks set to feature significantly higher power budgets than its predecessors, according to a recent claim from leaker kopite7kimi. The RTX 4090 has been rumored to feature a TDP above 400 W for some time, and this latest leak indicates that the RTX 4080 may also ship with an increased power requirement of 420 W. This RTX 4080 (PG139-SKU360) would represent an increase of 100 W over the RTX 3080, with power rises also expected for the RTX 4070 and RTX 4060. The RTX 4070 could see a power budget as high as 400 W if NVIDIA chooses to use GDDR6X memory for the card, while the RTX 4060 is rumored to see a 50 W increase to at least 220 W. Preliminary rumors indicate a launch date for these cards in late 2022.
Sources:
@kopite7kimi (via VideoCardz), @kopite7kimi
80 Comments on NVIDIA RTX 4080 Rumored To Feature 420 W TDP
Edit: Looking back on the past 35 years of PC gaming, I remember being blown away in the late 90's to mid 2000's when games like Thief, Half-Life and FEAR came out. Not just the graphics, but the whole package, e.g., The Dark Engine's audio propagation physics on an Aureal 3D sound card, enemies communicating with each other in FEAR in an actually believable way, etc., and thinking damn, what are games going to be like in 25 years' time? 25 years on, and advances in these areas are flat as hell. Everything is solely about graphics, and even that's starting to suffer the same diminishing-excitement problem as Hollywood CGI, i.e., once you've seen the 300th similar-looking photo-realistic clone, it's no longer that 'interesting' anymore, and you start craving stuff that actually looks and feels different, e.g., Dishonored's water-colour or Bioshock's underwater art-deco styles. I'd take more of these types of games that run exceptionally well on an average GPU any day over the 'need' for a 1,500 W RTX 6090 Ti Super Titan for Crysis 6's 7-hour-long 'content'...
High-power CPUs and GPUs should be licensed and used only by those who really need them, not by ordinary end-users who just want to play games.
https://wccftech.com/nvidia-ada-lovelace-gpus-4n-process-node-advantage-over-5nm-amd-rdna-3/
"The reasons why NVIDIA may have selected TSMC's 4N as the candidate for its next-gen gaming GPU lineup are kind of obvious. The upcoming cards will be real power-hungry, and the company is going to optimize them as much as it can by utilizing the 4N process node. AMD, on the other hand, will be utilizing a mix of TSMC 5 nm and 6 nm process nodes for its upcoming MCM and monolithic GPUs based on the RDNA 3 graphics architecture, and while those don't bring the optimizations that 4N does, they will feature an MCM approach that is expected to be highly efficient. So at the end of the day, NVIDIA gets the better node while AMD delivers a better design approach."

Well, let's wait and see who has the more efficient package this round. :cool: Power efficiency will be a big selling point for many. Can you even see your monitor with the bush fires raging down there? :wtf:
Here's a healthy reality check on what most normal people are actually playing: ray-traced Cyberpunk 2077 is barely more popular than "Russian Fishing 4" and is on the brink of falling out of the chart. In the middle of the chart, Skyrim, Age of Empires 2, Don't Starve, Stardew Valley, etc. have sat for years, and no one cares about ray-tracing in those. Now look at the top of the chart: all the competitive CS:GO and Fortnite players are going to do is turn RT off for better FPS. It's fair to say less than 5% of the gaming market is interested in RT (and not all of those are interested in it "at any cost"). Perceptions that "everyone will want this" newest gimmick have often been heavily skewed on hardware-enthusiast tech forums. Ray-tracing is like VR or 3D glasses: some will want it, and it'll be a premium feature for those who do, but many won't, and just like VR it certainly isn't going to become a new game-development "minimum requirement baseline" for the bulk of the market, and not just because of GPU cost.
If the NVIDIA microarchitecture is much worse, then those 5% won't matter at all.
Idiots after their 15 minutes of fame (and don't come after me, because at least one of them is, and maybe both are).
And it's also when people play the SP games that they want gorgeous graphics, and I'd argue that most people don't want to turn down too many details (unless they're in the top percentile).
And a couple of those still require a bit of power to output high FPS.
"It is too early to speculate on RTX 40 series power draw, but it is quite clear we will be seeing a 600W card in this generation"
It's too early, but let me speculate anyway :D:D:D