
NVIDIA GeForce RTX 4080 Could Get 23 Gbps GDDR6X Memory with 340 Watt Total Board Power

I suppose all I can do is rationalize my own purchase decision, based on what products end up existing, given that this is now an industry trend. I too think the trend isn't good, far from it.

I have made and continue to make large efforts to be as responsible as I can. I have solar power (and game my consumption to use it as much as feasible), I recycle (and separate multiple different kinds of recyclables), I compost, I have a water-wise garden, I've set up my house as best I can to be passively insulated instead of throwing power at the problem of comfort, I sold my motorcycle and ride a pushbike or electric scooter to work instead, I eat leftovers, I walk when it's close, the car we have was bought with an eye to economy and emissions, and the list goes on, including undervolting my system so it runs in its efficiency sweet spot. But I'm at the point in my life where I have little time for hobbies, and one that keeps me at home, safe, out of trouble etc. is a winner. I'm also at the point where I have reasonable money to spend on this hobby.

So yeah, I do have a 320 W GPU now, and may have an even more power-hungry one in a few months (you can bet I'll undervolt it though), and one day I might have to pay for that again, socially, physically or otherwise.

As always, make up your own mind about what's acceptable to buy and to support. Voting with your wallet might even get these companies to change the trend; I'd like to believe that's possible.

In the scheme of a world where people waste needlessly, buy gas-guzzling V8 trucks, litter (never mind being responsible with real waste), run massive air conditioners and heaters and live in comfort etc., and that's just regular middle-class people, never mind the 1% or 0.1% of planet rapers...

I have a high-end PC with a power-hungry GPU, and I'll do it again. Sorry.
Isn't the reality, without cognitive dissonance, that we should ALL show some restraint to leave future generations a habitable planet?

Whataboutisms won't get you, me, or any of our kids anywhere. At its core, all that is is egocentric behaviour. Hypocrisy. It's up to each of us to set boundaries. If this post represents yours, fine, but don't even begin telling anyone you're somehow doing something right... all you do is justify utter waste for yourself.

If you can own that, enjoy that 400 W bullshit GPU ;) If your gut tells you it doesn't feel right, the solution is very simple: don't buy into it. That is consumer power, and that is how 'we' enforce change. You literally said so yourself. If it doesn't feel right, it's not right. Screw what 'the industry' pushes for. That's the very 1% YOU are catering to. The 1% that releases a product you yourself consider 'a bit too much'!

Signed: an EV-driving, solar-powered homeowner. The point isn't saving a few more kWh than the next guy; the point is what we as consumers keep feeding. My point is: don't feed ever more power-hungry GPUs just because node advancements are stalling!

And... again... 'the price of RT...' :) Worth it?!
 
The planet won't be saved by limiting GPU consumption. In Europe, we actually punish ourselves to do even more for the planet, while elsewhere they don't give a fxxx about energy consumption, fuel consumption, waste management/recycling, renewable energy, etc.

If you don't limit heavy industry globally, not just in Europe or America or Asia, which is the main cause of climate change, it's pointless doing anything else.
It's a matter of scale.
You achieve nothing with 1,000 electric vehicles while just one old truck in Europe or China or Africa or America or wherever consumes and pollutes 100 times more.
 
All these rumors. So exciting and yet so useless. I mean, they change by the month or the week. Release these cards already and come out with the true final specs.

Well, here's the latest 4000-series rumor from me: RTX 4000 is just a revamped RTX 3000 series. I mean, NVIDIA has done it before... Just saying.
 
The planet won't be saved by limiting GPU consumption. In Europe, we actually punish ourselves to do even more for the planet, while elsewhere they don't give a fxxx about energy consumption, fuel consumption, waste management/recycling, renewable energy, etc.

If you don't limit heavy industry globally, not just in Europe or America or Asia, which is the main cause of climate change, it's pointless doing anything else.
It's a matter of scale.
You achieve nothing with 1,000 electric vehicles while just one old truck in Europe or China or Africa or America or wherever consumes and pollutes 100 times more.
Energy usage is a near-infinite stacking of all the little things, and they matter. It shows up very clearly in your own energy bill: one minute less in the shower per day, over a year, will make a noticeable dent in it. And that behaviour does indeed scale.

It's 100 W on a GPU. It's a minute or two less under warm water. It's a few hundred kWh yearly because of solar. Etc., etc.
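To put a rough yearly number on that 100 W, here's a quick back-of-the-envelope sketch; the daily hours and the electricity price are my own assumptions, not figures from this thread:

```python
# Back-of-the-envelope: yearly impact of shaving 100 W off a GPU.
# ASSUMPTIONS: 3 h of gaming per day and a 0.30 EUR/kWh tariff are
# illustrative guesses, not numbers from this thread.
watts_saved = 100        # W trimmed via undervolting / power limits
hours_per_day = 3        # assumed daily gaming time
eur_per_kwh = 0.30       # assumed electricity price

kwh_per_year = watts_saved * hours_per_day * 365 / 1000
print(f"~{kwh_per_year:.0f} kWh/year, ~{kwh_per_year * eur_per_kwh:.0f} EUR/year")
# -> ~110 kWh/year, ~33 EUR/year
```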

It really does matter, because our footprints are far too high overall. None of us with the wealth to buy these products have any right to point fingers...

That heavy industry and its processes exist because we buy into them. Every time. We have the power to kill a portion of it.
 
700->600->450->340

I think we could see a '150W TDP' rumor in the middle of September.
All the TDPs you mentioned are for different GPUs. An RTX 4090 Ti could hit 700 W at peak, sure; and don't forget models like the Kingpin or Hall of Fame.
 
Possibly they're trying to adapt the design to market conditions... There's no market for a 600 W, $1,000+ monster GPU now; the GPU boom is over. It all changed in the last few months.

If they actually want to shift any cards, the 4080 should be $500 or less and around 300 W.
 
All these people complain that the cards use too much energy, but I promise you everyone would be complaining to the moon if GPU progression were only a 5% improvement per generation.

We want faster cards that are also more efficient, but those two things don't usually go together. Typically, you sacrifice one to get the other.

The good thing is that it's a lot easier to grow plants and live in warm environments than in cold ones. For example, Antarctica has a population of less than 1,000, while millions of people live in the Sahara Desert.
 
All these people complain that the cards use too much energy, but I promise you everyone would be complaining to the moon if GPU progression were only a 5% improvement per generation.

We want faster cards that are also more efficient, but those two things don't usually go together. Typically, you sacrifice one to get the other.

Before the current generation, you could go all the way back to Fermi and no 80- or 80 Ti-tier card went above 250 W (and 250 W was mostly for the Ti cards, with the "regular" 80 having a much lower TDP). There were a couple of oddballs above that, but they were either workstation Titan cards or dual-GPU cards.

We're also in no way talking about a 5% performance improvement, and older cards could push more power when modded to do so. There are a lot of diminishing returns with increased power; it just so happens that NVIDIA now feels the need to go to those lengths.

The good thing is that it's a lot easier to grow plants and live in warm environments than in cold ones. For example, Antarctica has a population of less than 1,000, while millions of people live in the Sahara Desert.

That's just... dumb.
 
All these people complain that the cards use too much energy, but I promise you everyone would be complaining to the moon if GPU progression were only a 5% improvement per generation.

We want faster cards that are also more efficient, but those two things don't usually go together. Typically, you sacrifice one to get the other.

You can do both with good design, but you can't get those chart-topping scores that everyone wants to brag about without over-the-top power consumption.

Edit: Added Horizon: Zero Dawn

I did some tests recently with my 6600 XT, an efficient card to begin with, but one that's still overclocked out of its efficiency range out of the box.

Default core clock: 2668 MHz (~2600 MHz in-game), 16 Gbps memory, 132-150 W (power +20%), 1.15 V
Overclocked: 2750 MHz (~2690 MHz in-game), 17.6 Gbps memory, 145-163 W (power +20%, hits power limits), 1.15 V
Underclocked: 2050 MHz (~2000 MHz in-game), 16 Gbps memory, 70-75 W, 0.862 V

But what about performance? These are canned game benchmark runs, since I can't benchmark consistently in-game:

R5 5600, 3200 MHz CL16, 1440p, PCIe 3.0 (B450 board), no Motion Blur

CP2077 - 18% lower fps at High settings
SotTR - 12% lower fps at HUB-recommended settings (~High but better visuals)
Forza Horizon 4 - 17% lower fps at Ultra settings
Horizon: Zero Dawn - 16% lower fps at Favor Quality settings (High)

1% lows were about the same in CP2077 (need to do more runs at 2690 MHz to confirm), -13% in SotTR, -14% in FH4, and -11% in H:ZD.

So about a -15% FPS tradeoff for half the power usage. Comparing runs directly, the 2000 MHz tests used 51% of the power of the 2690 MHz tests in the same games (75 W vs 148 W in CP2077 and SotTR, 60 W vs 118 W in FH4, 71 W vs 139 W in H:ZD).

15% fewer frames is both a lot and not a lot, depending on what you're looking for.
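For anyone who wants to sanity-check that tradeoff, here's a minimal Python sketch that reworks the figures quoted above into perf-per-watt; the percentage drops and wattages are from my runs, the rest is just arithmetic:

```python
# Perf-per-watt change at 2000 MHz vs 2690 MHz, from the runs above.
# The ratio is relative FPS divided by relative power draw.
runs = {
    "CP2077": (18, 148, 75),   # (fps drop %, W at 2690 MHz, W at 2000 MHz)
    "SotTR":  (12, 148, 75),
    "FH4":    (17, 118, 60),
    "H:ZD":   (16, 139, 71),
}

for game, (drop, w_hi, w_lo) in runs.items():
    rel_fps = 1 - drop / 100   # FPS at 2000 MHz relative to 2690 MHz
    rel_power = w_lo / w_hi    # power at 2000 MHz relative to 2690 MHz
    print(f"{game}: {rel_fps / rel_power:.2f}x perf/W")
# -> 1.62x, 1.74x, 1.63x, 1.64x: roughly 60-70% better perf/W
# for ~15% fewer frames.
```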
 
All these people complain that the cards use too much energy, but I promise you everyone would be complaining to the moon if GPU progression were only a 5% improvement per generation.

We want faster cards that are also more efficient, but those two things don't usually go together. Typically, you sacrifice one to get the other.

The good thing is that it's a lot easier to grow plants and live in warm environments than in cold ones. For example, Antarctica has a population of less than 1,000, while millions of people live in the Sahara Desert.
Some life experience seems to be absent here lol
 
Hook, line, and sinker. Gobble gobble gobble.
 
During a global energy crunch and what will again be the hottest and driest summer on record!? The 3080 had a TDP of 320 W (not that far off, but still an increase; the 3080 Ti was 350 W), but looking just one generation back, the 2080 Ti used "just" 250 W, as did the 1080 Ti, and those were the top cards of the line-up, mind you. Now there's an expectation of an even more absurd 90/90 Ti tier with even higher power draws. Die shrinks and new generations should bring efficiency improvements, not just moar power draw.

People need to stop rationalizing these senseless increases in power draw; it's absurd, pure and simple.
If you game so much that an extra 100-150 W from the GPU (high-end ones haven't been below 200 W or so for at least 15 years) will affect you financially, or you believe it will affect the environment, then you've got a problem somewhere else...
 
700->600->450->340

I think we could see a '150W TDP' rumor in the middle of September.
We're getting closer to the 4090 being only 30% faster than the 3090 :D
 
The lower the TDP the better; we're hitting limits all around with such big chips. We'll see how Adios my dineros (TM) does with its recent advances in corporate responsibility across its value chain. :)
 
During a global energy crunch and what will again be the hottest and driest summer on record!? The 3080 had a TDP of 320 W (not that far off, but still an increase; the 3080 Ti was 350 W), but looking just one generation back, the 2080 Ti used "just" 250 W, as did the 1080 Ti, and those were the top cards of the line-up, mind you. Now there's an expectation of an even more absurd 90/90 Ti tier with even higher power draws. Die shrinks and new generations should bring efficiency improvements, not just moar power draw.

People need to stop rationalizing these senseless increases in power draw; it's absurd, pure and simple.
Hey, bud, if electricity prices and an extra 100 W of heat are that big of a deal to you, you shouldn't be looking at these halo products in the first place. And as a reminder, during this "power crunch" there is a global push for electric cars, which consume more power in a single highway trip than this card will in a year.
If you don't like it... don't buy it. Nobody is forcing you to buy a 4080. You can buy a 4070 or 4060 with the power draw you desire and better performance, and save money! Win-win!

I'm sure they'd be a lot happier with another insane increase in board power
20 W is insane now. Ridiculous.
 
And as a reminder, during this "power crunch" there is a global push for electric cars, which consume more power in a single highway trip than this card will in a year.
And the more you game, the less you go out and drive cars
 
If you don't like it... don't buy it. Nobody is forcing you to buy a 4080.
Yep. But the topic is all about the x80 ;) We don't even know where that will end up, and we know even less about the x70 or x60. We did, however, experience the messiest stack of NVIDIA GPUs in decades with Ampere. Ada is fast looking to go down the same path.

What I do know is that a 1080 uses 50% less power. That's significant. It was the segment I used to buy into; that's also significant. Shit escalated damn quickly. And for what?! A few realtime light rays?
 
Isn't the reality, without cognitive dissonance, that we should ALL show some restraint to leave future generations a habitable planet?
I mean, I outlined the many ways in which I do.
Whataboutisms won't get you, me, or any of our kids anywhere. At its core, all that is is egocentric behaviour. Hypocrisy. It's up to each of us to set boundaries. If this post represents yours, fine, but don't even begin telling anyone you're somehow doing something right... all you do is justify utter waste for yourself.
That's pretty personal, dude. I wouldn't call running an undervolted 3080, or say a 4080, an utter waste, but hey, you do you. I told you all I can really do is rationalise it myself; the whataboutism is a fun little extra.
If you can own that, enjoy that 400 W bullshit GPU ;)
And... again... 'the price of RT...' :) Worth it?!
Again man, you do you. I don't think my 3080 is bullshit, and yeah, RT so far has been a very fun path to walk with this class of card. Metro EE, DOOM Eternal, Control, Dying Light 2, Spider-Man... The experience with all the dials turned up is right up my alley; just because it's not up yours doesn't automatically make it bullshit to everyone else.
What I do know is that a 1080 uses 50% less power. That's significant. It was the segment I used to buy into; that's also significant. Shit escalated damn quickly. And for what?! A few realtime light rays?
Well, just keep buying into that segment then; easy. Both RDNA 3 and Ada will have products that call for 1x 8-pin / <225 W, and they'll be faster than Ampere or RDNA 2 at equal power draw. Vote with that wallet.
Before the current generation, you could go all the way back to Fermi and no 80- or 80 Ti-tier card went above 250 W
it just so happens that NVIDIA now feels the need to go to those lengths.
They're not the only ones ;)

No 'founders/reference model' did, but AIB cards certainly did, and the high end of the stack still had 250 W+ parts like dual-GPU cards; same for AMD.

Also, AMD single-GPU cards between Fermi and the current generation:
  • R9 290/390 - 275 W
  • R9 290X/390X - 290 W/275 W
  • R9 Fury X - 275 W
  • Vega 64 - 295 W
  • Radeon VII - 295 W
Hey, bud, if electricity prices and an extra 100 W of heat are that big of a deal to you, you shouldn't be looking at these halo products in the first place.
If you don't like it... don't buy it.
Winning logic. I also consistently see a lot of complaints from what amounts to people who would never have bought the card anyway, even if it fit their power budget, because of things like brand and price.
 
Hi,
Well, everyone needs something to buzz about :cool:

 
An RTX 4080 at about 350 W is just fine. In the next recession, energy prices and earnings will go down. As in China or the third world, low-budget GPUs or mobile phones will be used for games. Problem solved.
 
225 W is about what I'm looking for in a GPU, whether as a single card, glued together across multiple PCIe slots via SLI/CF, or in a single PC with individual GPU cards used separately.
 
Sheesh, you guys would spit out your pacifiers if I told you what's in my rig; I sometimes use both cards for rendering. I don't like the idea of higher wattages, but there's really no alternative for rendering 3D stuff in Iray at a reasonable speed.
 
Sheesh, you guys would spit out your pacifiers if I told you what's in my rig
As far as I'm concerned, unless you're gaming on a Steam Deck, Switch or equivalent, you're bloody irresponsible with your outrageous and wasteful power usage in the name of a better gaming experience.

Your 125 W graphics card is a disgrace to future generations, when power is only going to cost more, the planet is heating up, we're in a recession, and next summer is set to be the hottest on record!

Responsible gamers should all limit themselves to a sub-30 W draw for the entire device, or you'll be answering to my sweaty, poor grandchildren, dammit.
 