Tuesday, August 23rd 2022

NVIDIA GeForce RTX 4080 Could Get 23 Gbps GDDR6X Memory with 340 Watt Total Board Power

NVIDIA's upcoming GeForce RTX 40 series graphics cards are less than two months from their official launch. As we near the final specification draft, hardware leakers keep reporting that the specification is still changing. Today, @kopite7kimi updated his GeForce RTX 4080 GPU predictions with some exciting changes. First, the GPU memory gets an upgrade over the previously believed specification. The SKU was previously thought to use GDDR6X running at 21 Gbps; it is now believed to use a 23 Gbps variant. Faster memory should result in better overall performance, and we have yet to see what it can achieve with overclocking.
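For context, per-pin data rate translates into aggregate memory bandwidth as rate × bus width / 8. A quick sketch of what the bump is worth, assuming the rumored 256-bit memory bus, which the leak does not confirm:

```python
# Peak memory bandwidth (GB/s) = per-pin data rate (Gbps) x bus width (bits) / 8.
# The 256-bit bus width is an assumption; the leak only mentions the data rate.
def peak_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Return theoretical peak memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

old = peak_bandwidth_gbs(21, 256)  # earlier rumored 21 Gbps
new = peak_bandwidth_gbs(23, 256)  # updated 23 Gbps
print(f"{old:.0f} GB/s -> {new:.0f} GB/s (+{(new / old - 1) * 100:.1f}%)")
```

On those assumptions, the move from 21 to 23 Gbps is worth roughly 9.5% more raw bandwidth (672 GB/s to 736 GB/s).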

Next, another update for the NVIDIA GeForce RTX 4080 concerns the SKU's total board power (TBP). Previously, we believed it came with a 420 Watt TBP; however, kopite7kimi's sources now claim a 340 Watt TBP. This 80 Watt reduction is rather significant and could be attributed to NVIDIA optimizing for the most efficient design possible.
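To put the reduction in perspective, energy use scales linearly with board power for a given workload. A rough sketch of the yearly difference, with the hours per day and price per kWh as purely illustrative assumptions:

```python
# Yearly energy difference from an 80 W TBP reduction (420 W -> 340 W).
# 3 hours/day of full-load gaming and 0.40 per kWh are illustrative assumptions.
def annual_kwh(watts: float, hours_per_day: float) -> float:
    """Return energy in kWh consumed per year at the given sustained draw."""
    return watts * hours_per_day * 365 / 1000

saved_kwh = annual_kwh(420 - 340, hours_per_day=3)
saved_cost = saved_kwh * 0.40
print(f"~{saved_kwh:.0f} kWh/year saved, ~{saved_cost:.0f} per year at 0.40/kWh")
```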
Sources: kopite7kimi (Twitter), via VideoCardz
Add your own comment

82 Comments on NVIDIA GeForce RTX 4080 Could Get 23 Gbps GDDR6X Memory with 340 Watt Total Board Power

#26
gffermari
the planet won’t be saved by limiting the gpu consumption. In Europe, we actually punish ourselves to do even more for the planet while elsewhere they don’t give a fxxx about energy consumption, fuel consumption, waste management/recycling, renewable energy etc…

If you don’t limit the heavy industry globally , not just in Europe or America or Asia, that’s the main cause of the climate change, it’s worthless doing anything else.
It’s a matter of scale.
You are doing nothing by using 1000 electric vehicles while just one old truck in Europe or china or Africa or America or wherever consumes and pollutes 100 times more.
#27
Tomgang
All these rumors. So exciting and yet so useless. I mean, they change by the month or the week. Release these cards already and come out with the true final spec.

Well, here is the latest 4000 series rumor from me: RTX 4000 is just a revamped RTX 3000 series. I mean, NVIDIA has done it before... Just saying.
#28
Vayra86
gffermari: the planet won’t be saved by limiting the gpu consumption. In Europe, we actually punish ourselves to do even more for the planet while elsewhere they don’t give a fxxx about energy consumption, fuel consumption, waste management/recycling, renewable energy etc…

If you don’t limit the heavy industry globally , not just in Europe or America or Asia, that’s the main cause of the climate change, it’s worthless doing anything else.
It’s a matter of scale.
You are doing nothing by using 1000 electric vehicles while just one old truck in Europe or china or Africa or America or wherever consumes and pollutes 100 times more.
Energy usage is about a near-infinite stacking of all the little things that matter. It shows up very clearly in your own energy bill: one minute less in the shower per day for a year will make a noticeable dent in it. And that, as behaviour, does indeed scale.

It's 100 Wh per hour on a GPU. It's a minute or two less under warm water. It's a few hundred kWh yearly because of solar. Etc etc etc

It really does matter, because our footprints are far too high overall. None of us with the wealth to buy these products have any right to point fingers...

That heavy industry and its processes exist because we buy into it. Every time. We have the power to kill a portion of it.
#29
zo0lykas
Crackong: 700->600->450->340

I think we could see a '150W TDP' rumor in the middle of September.
All your mentioned tdp are different gpu's
Rtx 4090ti 700tdp on peak time, sure don't forget models like kinping or half of fame.
#30
PapaTaipei
zo0lykas: All your mentioned tdp are different gpu's
Rtx 4090ti 700tdp on peak time, sure don't forget models like kinping or half of fame.
"Half of fame" xD
#31
mb194dc
Possibly they're trying to adapt the design to market conditions.... There's no market for a 600w $1000+ monster GPU now; the GPU boom has totally gone. It all changed in the last few months.

If they actually want to shift any cards, 4080 should be $500 or less and 300w or so.
#32
Guwapo77
ZoneDymo: Used to be a website for stuff like this, think it was called Fudzilla.
Remember The Inquirer? Oh, the good ol' days.
#33
FeelinFroggy
All these people are complaining that the cards use too much energy, but I promise you everyone would be complaining to the moon if GPU progression was only a 5% improvement per generation.

We want faster cards that are more efficient, but those two things pull against each other. Typically, you sacrifice one for the other.

The good thing is it is a lot easier to grow plants and live in warm environments than cold. For example, Antarctica has a population of less than 1000, while millions of people live in the Sahara desert.
#34
trsttte
FeelinFroggy: All these people complaining the cards use too much energy, but I promise you everyone would be complaining to the moon if GPU progression was only 5% improvement per generation.

We want faster cards that are more efficient. Those 2 things are not interchangeable. Typically, you sacrifice one to make the other.
Before the current generation, you can go all the way back to fermi and no 80 or 80ti tier card went above 250W (250w being mostly for the TI cards with the "regular" 80 having a much lower TDP). There were a couple odd balls above that but they were either workstation titan cards or dual gpu cards.

We're also in no way talking about a 5% performance improvement, and older cards could also push more power when modded to do so. There are a lot of diminishing returns with power increases; it just so happens that nvidia now feels the need to go to those lengths.
FeelinFroggy: The good thing is it is a lot easier to grow plants and live in warm environments than cold. For example, Antarctica has a population of less than 1000, while millions of people live in the Sahara desert.
That's just... dumb.
#35
Lew Zealand
FeelinFroggy: All these people complaining the cards use too much energy, but I promise you everyone would be complaining to the moon if GPU progression was only 5% improvement per generation.

We want faster cards that are more efficient. Those 2 things are not interchangeable. Typically, you sacrifice one to make the other.
You can do both with good design, but you can't get those chart-topping scores that everyone wants to brag about without over-the-top power consumption.

Edit: Added Horizon: Zero Dawn

I did some tests recently with my 6600 XT, an efficient card to begin with, but one which is still overclocked out of its efficiency range out of the box.

Default core clock: 2668 MHz, ~2600 MHz in-game, 16Gbps Mem, 132-150W (Power +20%), 1.15v
Overclocked: 2750 MHz, ~2690 MHz in-game, 17.6Gbps Mem, 145-163W (Power +20%, hits power limits), 1.15v
Underclocked: 2050 MHz, ~2000 MHz in-game, 16Gbps Mem, 70-75W, 0.862v

But what of performance? Canned game benchmark runs, as I'm not a consistent in-game benchmarker:

R5 5600, 3200 MHz CL16, 1440p, PCIe 3.0 (B450 board), no Motion Blur

CP2077 - 18% lower fps at High settings
SotTR - 12% lower fps at HUB rec settings (~High but better visuals)
Forza Horizon 4 - 17% lower fps at Ultra settings
Horizon: Zero Dawn - 16% lower fps at Favor Quality settings (High)

1% lows were about the same in CP2077 (need to do more runs at 2690 to confirm), -13% in SotTR, -14% in FH4, and -11% in H:ZD.

So about a -15% FPS tradeoff for half the power usage. Comparing runs directly to each other, the 2000MHz test used 51% of the power of the 2690MHz tests in the same games (148W in CP2077 and SotTR vs 75W, 118W in FH4 vs 60W, 139W in H:ZD vs 71W).

15% fewer frames is a lot and it's not a lot, depending on what you're looking for.
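Put another way, as performance per watt (a rough Python sketch using my CP2077 numbers above; fps values are normalized to the stock run):

```python
# Performance per watt from the CP2077 runs above: 148 W at ~2690 MHz (stock)
# vs 75 W at 2000 MHz with 18% lower fps. fps is normalized to stock = 1.0.
def perf_per_watt(relative_fps: float, watts: float) -> float:
    """Return normalized frames per second per watt."""
    return relative_fps / watts

stock = perf_per_watt(1.00, 148)
underclocked = perf_per_watt(0.82, 75)
print(f"Efficiency gain at 2000 MHz: {underclocked / stock:.2f}x")
```

So the underclocked profile delivers roughly 1.6x the frames per watt, which lines up with the "about half the power for ~15% fewer frames" summary.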
#36
Vayra86
FeelinFroggy: All these people complaining the cards use too much energy, but I promise you everyone would be complaining to the moon if GPU progression was only 5% improvement per generation.

We want faster cards that are more efficient. Those 2 things are not interchangeable. Typically, you sacrifice one to make the other.

The good thing is it is a lot easier to grow plants and live in warm environments than cold. For example, Antarctica has a population of less than 1000, while millions of people live in the Sahara desert.
Some life experience seems to be absent here lol
#37
maxfly
Hook line and sinker. Gobble gobble gobble
#38
HenrySomeone
trsttte: During a global energy crunch and what will again be the hottest and driest summer on record!? The 3080 had a TDP of 320w (not that far off, but still an increase - the 3080ti was 350w), but looking just one generation back, the 2080ti used "just" 250w, as did the 1080ti - the top cards of the lineup, mind you. Now there's an expectation for an even more absurd 90/90ti tier with even higher power draws. Die shrinks and new generations should bring efficiency improvements, not just moar power draw.

People need to stop rationalizing these senseless increases in power draw; it's absurd, pure and simple.
If you game so much that an extra 100-150w from the gpu (high-end ones have never been below 200w or so for at least 15 years) will affect you financially, or if you believe it will affect the environment, then you've got a problem somewhere else...
#39
Lei
Crackong: 700->600->450->340

I think we could see a '150W TDP' rumor in the middle of September.
We're getting closer to 4090 being 30% faster than 3090 :D
#40
pavle
The lower the TDP the better; they're hitting limits all around with such big chips. We'll see how Adios my dineros (TM) does with their recent advances in corporate responsibility across their value chain. :)
#41
TheinsanegamerN
trsttte: During a global energy crunch and what will again be the hottest and driest summer on record!? The 3080 had a TDP of 320w (not that far off, but still an increase - the 3080ti was 350w), but looking just one generation back, the 2080ti used "just" 250w, as did the 1080ti - the top cards of the lineup, mind you. Now there's an expectation for an even more absurd 90/90ti tier with even higher power draws. Die shrinks and new generations should bring efficiency improvements, not just moar power draw.

People need to stop rationalizing these senseless increases in power draw; it's absurd, pure and simple.
Hey, bud, if electric prices and an extra 100w of heat are that big of a deal to you, you shouldn’t be looking at these halo products in the first place. And as a reminder, during this “power crunch” there is a global push for electric cars, which consume more power in a single highway trip than this card will in a year.
If you don’t like it….. don’t buy it. Nobody is forcing you to buy a 4080. You can buy a 4070 or 4060 with the power draw you desire, and better performance, and save money! Win win!
trsttteI'm sure they'd be a lot happier with another insane increase in board power
20 W is insane now. Ridiculous.
#42
Lei
TheinsanegamerN: And as a reminder, during this “power crunch” there is a global push for electric cars, which consume more power in a single highway trip than this card will in a year.
And the more you game, the less you go out and drive cars
#43
Vayra86
TheinsanegamerN: If you don’t like it….. don’t buy it. Nobody is forcing you to buy a 4080.
Yep. But the topic is all about the x80 ;) We don't even know where that will end up, and we know even less about the x70 or x60. We did, however, experience the absolute messiest stack of Nvidia GPUs in decades with Ampere. Ada is fast looking to go down the same path.

What I do know is that a 1080 uses 50% less power. That's significant. It was the segment I used to buy into. That's also significant; shit escalated damn quickly. And for what?! A few realtime light rays?
#44
wolf
Better Than Native
Vayra86: Isn't the reality, without cognitive dissonance, that we should ALL show some restraint to provide future generations a habitable planet?
I mean, I outlined the many ways in which I do
Vayra86: Whataboutisms won't get you, me or any of our kids anywhere. At its core, all that is is egocentric behaviour. Hypocrisy. It's up to each of us to set boundaries. If this post represents yours, fine, but don't even begin telling anyone you're somehow doing something right.... all you do is justify utter waste for yourself.
That's pretty personal dude, I wouldn't call running an undervolted 3080 or say 4080 an utter waste, but hey, you do you. I told you all I can really do is rationalise it myself, the whataboutism is a fun little extra.
Vayra86: If you can own that, enjoy that 400W bullshit GPU ;)
And... again.. 'the price of RT...' :) Worth?!
Again man, you do you. I don't think my 3080 is bullshit, and yeah, RT so far has been a very fun path to walk with this class of card. Metro EE, DOOM Eternal, Control, Dying Light 2, Spiderman... The experience with all the dials up is right up my alley; just because it's not up yours doesn't automatically make it bullshit to everyone else.
Vayra86: What I do know is that a 1080 uses 50% less power. That's significant. It was the segment I used to buy into. That's also significant; shit escalated damn quickly. And for what?! A few realtime light rays?
Well, just keep buying into that segment then, easy. Both RDNA3 and Ada will have products that ask for just 1x8pin / <225w, and they'll be faster than Ampere or RDNA2 (at equal power draw); vote with that wallet.
trsttte: Before the current generation, you can go all the way back to fermi and no 80 or 80ti tier card went above 250W
it just so happens that nvidia now feels the need to go to those lengths.
They're not the only ones ;)

No 'founders/reference model' did, but AIBs certainly did, and the high end of the stack still had 250w+ parts, like dual-GPU cards; same for AMD.

Also AMD single GPU between Fermi and current generation:
  • R9 290/390 - 275w
  • R9 290X/390X - 290w/275W
  • R9 Fury X - 275w
  • Vega 64 - 295w
  • Radeon VII - 295w
TheinsanegamerN: Hey, bud, if electric prices and an extra 100w of heat are that big of a deal to you, you shouldn’t be looking at these halo products in the first place.
TheinsanegamerN: If you don’t like it….. don’t buy it.
Winning Logic. I also consistently see a lot of complaints from what amounts to people who would've never bought the card anyway, even if it fit their power budget, because of things like brand and price.
#45
ThrashZone
Hi,
Well everyone needs something to buzz about :cool:

#46
Sisyphus
RTX 4080 at about 350W is just fine. In the next recession, energy prices and earnings will go down. Like in China or the third world, low-budget GPUs or mobile phones will be used for games. Problem solved.
Posted on Reply
#47
InVasMani
225w is about what I'm looking for in a GPU, whether in a single card, glued together across multiple PCIe slots in the form of SLI/CF, or in a single PC with individual GPU card usage.
Posted on Reply
#48
MentalAcetylide
Sheesh, you guys would spit your pacifier if I told you what was in my rig, and I sometimes use both cards for rendering. I don't like the idea of higher wattages, but there's really no alternative for rendering 3D stuff in iray at a reasonable speed.
Posted on Reply
#49
wolf
Better Than Native
MentalAcetylide: Sheesh, you guys would spit your pacifier if I told you what was in my rig
As far as I'm concerned, unless you're gaming on a steam deck, switch or equivalent, you're bloody irresponsible with your outrageous and wasteful power usage in the name of a better gaming experience.

Your 125w GFX card is a disgrace to future generations, when power is only going to cost more and the planet is heating up, we're in a recession and next summer is set to be the hottest on record!

Responsible gamers should all limit themselves to sub 30w draw for the entire device, or you'll be answering to my sweaty, poor grandchildren dammit.
#50
R0H1T
gffermari: the planet won’t be saved by limiting the gpu consumption. In Europe, we actually punish ourselves to do even more for the planet while elsewhere they don’t give a fxxx about energy consumption, fuel consumption, waste management/recycling, renewable energy etc

If you don’t limit the heavy industry globally , not just in Europe or America or Asia, that’s the main cause of the climate change, it’s worthless doing anything else.
It’s a matter of scale.
You are doing nothing by using 1000 electric vehicles while just one old truck in Europe or china or Africa or America or wherever consumes and pollutes 100 times more.
That's BS & you know it!

Again BS, pretty sure your online shopping is an ecological disaster ~

Like I was harping on in the other thread, you've got to stop the rot starting with oneself; there's no point blaming a$$hole corporations or other individuals if you aren't doing more than the bare minimum at your end. Glad one of the more sane voices out there picked up on this!

What's your per capita energy/resource consumption in the West? Wanna try that again :rolleyes:

Another similar take on this topic ~