Tuesday, August 23rd 2022

NVIDIA GeForce RTX 4080 Could Get 23 Gbps GDDR6X Memory with 340 Watt Total Board Power

NVIDIA's upcoming GeForce RTX 40-series graphics cards are less than two months from their official launch. As we near the final specification draft, hardware leakers keep reporting that the specification is still changing. Today, @kopite7kimi updated his GeForce RTX 4080 predictions with some notable changes. First, the GPU memory gets an upgrade over the previously believed specification: the SKU was earlier thought to use GDDR6X running at 21 Gbps, but it is now assumed to use a 23 Gbps variant. Faster memory will certainly help overall performance, and we have yet to see what it can achieve with overclocking.

Next, another update for the NVIDIA GeForce RTX 4080 concerns the SKU's total board power (TBP). Previously we believed it came with a 420 Watt TBP; however, kopite7kimi's sources now claim it has a 340 Watt TBP. This 80 Watt reduction is rather significant and could be attributed to NVIDIA optimizing for the most efficient design possible.
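For a rough sense of what the new numbers imply, here is a minimal back-of-the-envelope sketch; the 256-bit memory bus width is an assumption used purely for illustration and is not part of the leak.

```python
# Quick sanity math on the rumored figures. The 256-bit bus width is an
# assumption for illustration only; the leak does not specify the bus.
per_pin_old_gbps = 21     # previously rumored GDDR6X per-pin data rate
per_pin_new_gbps = 23     # newly rumored GDDR6X per-pin data rate
bus_width_bits = 256      # assumed memory bus width (hypothetical)

def bandwidth_gb_s(per_pin_gbps: float, bus_bits: int) -> float:
    """Aggregate bandwidth in GB/s: per-pin rate times bus width, over 8 bits per byte."""
    return per_pin_gbps * bus_bits / 8

old_bw = bandwidth_gb_s(per_pin_old_gbps, bus_width_bits)  # 672 GB/s
new_bw = bandwidth_gb_s(per_pin_new_gbps, bus_width_bits)  # 736 GB/s
print(f"21 Gbps -> {old_bw:.0f} GB/s, 23 Gbps -> {new_bw:.0f} GB/s "
      f"(+{(new_bw / old_bw - 1) * 100:.1f}% bandwidth)")

# TBP change claimed in the updated leak
print(f"TBP: 420 W -> 340 W, a {420 - 340} W reduction")
```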
Sources: kopite7kimi (Twitter), via VideoCardz

82 Comments on NVIDIA GeForce RTX 4080 Could Get 23 Gbps GDDR6X Memory with 340 Watt Total Board Power

#1
wolf
Better Than Native
Another day another rumor, what will it be tomorrow?

Part of me is convinced they do this on purpose to test the market: we had heaps of backlash at the higher power draws, and now the rumors are a lot more sane. I mean, circa 340 W for some people still isn't sane, but it's acceptable for such a tier product, in this day and age.
Posted on Reply
#2
ZoneDymo
wolfAnother day another rumor, what will it be tomorrow?

Part of me is convinced they do this on purpose to test the market: we had heaps of backlash at the higher power draws, and now the rumors are a lot more sane. I mean, circa 340 W for some people still isn't sane, but it's acceptable for such a tier product, in this day and age.
Used to be a website for stuff like this, think it was called Fudzilla.
Posted on Reply
#3
Bomby569
So actually no increase in TDP; considering the extra memory uses more power, it would probably amount to a decrease.
Posted on Reply
#4
mahirzukic2
ZoneDymoUsed to be a website for stuff like this, think it was called Fudzilla.
I used to go there almost every day. It has fallen off though; don't know why I went there in the first place.
Posted on Reply
#5
Shou Miko
I believe kopite7kimi could be correct about the 340 W total card power; he has a good track record and is a good source to follow on Twitter, together with OneRaichu, who also tweets a lot of different things.
Posted on Reply
#6
agent_x007
If they run the GPU at over 1 V (Vgpu), it will throttle regardless (with a lower TDP it will simply throttle harder/faster).
Until they decrease the max voltage (below 1 V) alongside it, TDP "tweaking" is meaningless.

EDIT: Oh, it should make PSU manufacturers feel better - so, there's that.
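A minimal sketch of the first-order dynamic-power relation behind that point (P ≈ C·V²·f); the voltage and clock figures below are purely illustrative assumptions, not leaked specs.

```python
# First-order dynamic power model: P ~ C_eff * V^2 * f (leakage ignored).
# Illustrative numbers only - not actual RTX 4080 voltages or clocks.
def dynamic_power(v_volts: float, f_ghz: float, c_eff: float = 1.0) -> float:
    """Relative dynamic power: effective capacitance * voltage squared * frequency."""
    return c_eff * v_volts ** 2 * f_ghz

baseline = dynamic_power(v_volts=1.05, f_ghz=2.6)     # hypothetical stock point
undervolted = dynamic_power(v_volts=0.95, f_ghz=2.6)  # same clock, lower voltage cap
print(f"Cutting the voltage cap from 1.05 V to 0.95 V at the same clock "
      f"reduces dynamic power by ~{(1 - undervolted / baseline) * 100:.0f}%")  # ~18%
```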
Posted on Reply
#7
trsttte
wolfit's acceptable for such a tier product, in this day and age.
During a global energy crunch and what will again be the hottest and driest summer on record!? The 3080 had a TDP of 320 W (not that far off, but still an increase - the 3080 Ti was 350 W), but looking just one generation back, the 2080 Ti used "just" 250 W, as did the 1080 Ti - the top cards of the lineup, mind you. Now there's an expectation for an even more absurd 90/90 Ti tier with even higher power draws. Die shrinks and new generations should bring efficiency improvements, not just moar power draw.

People need to stop rationalizing these senseless increases in power draw; it's absurd, pure and simple.
agent_x007Oh, it should make PSU manufacturers feel good - so there's that.
I'm sure they'd be a lot happier with another insane increase in board power that would let them sell another PSU upgrade and keep shipping the very high-power PSUs that will now sit on shelves after crypto went bust for a bit
Posted on Reply
#8
Minus Infinity
trsttteDuring a global energy crunch and what will again be the hottest and driest summer on record!? The 3080 had a TDP of 320 W (not that far off, but still an increase - the 3080 Ti was 350 W), but looking just one generation back, the 2080 Ti used "just" 250 W, as did the 1080 Ti - the top cards of the lineup, mind you. Now there's an expectation for an even more absurd 90/90 Ti tier with even higher power draws. Die shrinks and new generations should bring efficiency improvements, not just moar power draw.

People need to stop rationalizing these senseless increases in power draw; it's absurd, pure and simple.

I'm sure they'd be a lot happier with another insane increase in board power that would let them sell another PSU upgrade and keep shipping the very high-power PSUs that will now sit on shelves after crypto went bust for a bit
Someone gets it.
Posted on Reply
#9
Vayra86
Nice, I can save money here. Anything over 250W is a no go

And double that in Euro is another limit here. Going over? No buy...
Posted on Reply
#10
Crackong
700->600->450->340

I think we could see a '150W TDP' rumor in the middle of September.
Posted on Reply
#11
Flanker
Vayra86Nice, I can save money here. Anything over 250W is a no go

And double that in Euro is another limit here. Going over? No buy...
We can remain pascal buddies :toast:
Posted on Reply
#12
Dirt Chip
Another 'leak', another tweet, another thread, another piece of 'inside info'.
No matter the content, if you engaged in any way, it worked.
The PR is very effective nowadays; look how much talk about wattage is being generated.
It's so easy to get people to participate in the PR loop.

I'm doing my part too, of course - free entertainment that keeps me busy for a while, a sort of geek HW escapism, and the PR machine gets its profit.
A win-win situation :)
Posted on Reply
#13
DeathtoGnomes
The SKU was earlier thought to use GDDR6X running at 21 Gbps, but it is now assumed to use a 23 Gbps variant.
What about the new 24 Gbps memory, or was that left for overclockers to reach so they can feel like they achieved something?
Posted on Reply
#14
BluesFanUK
I remember going from AMD to Nvidia because of how incredibly efficient Maxwell was. Now they're just taking liberties, both in cost and efficiency.
Posted on Reply
#15
Chaitanya
AssimilatorBasically he is just making shit up as he goes along and morons like you believe him.
Shooting in the dark enough times and hoping something sticks eventually.
Posted on Reply
#16
Assimilator
Chaitanyashooting in dark enough times and hoping something will stick eventually.
A person manages to get 1 thing right once and idiots think that person is some kind of oracle forever. It boggles my mind how thinking is apparently something so few do anymore.
Posted on Reply
#17
Dristun
trstttePeople need to stop rationalizing these senseless increases in power draw; it's absurd, pure and simple.
If you don't need the performance that its higher power draw allows for, you can always go down a tier or buy a competitor's card that consumes 40 W less; what else is there to rationalize? People want 4K60 with RT, me included, and NVIDIA and AMD are happy to oblige. That's it, really.
Posted on Reply
#18
AnarchoPrimitiv
trsttteDuring a global energy crunch and what will again be the hottest and driest summer on record!? The 3080 had a TDP of 320 W (not that far off, but still an increase - the 3080 Ti was 350 W), but looking just one generation back, the 2080 Ti used "just" 250 W, as did the 1080 Ti - the top cards of the lineup, mind you. Now there's an expectation for an even more absurd 90/90 Ti tier with even higher power draws. Die shrinks and new generations should bring efficiency improvements, not just moar power draw.

People need to stop rationalizing these senseless increases in power draw; it's absurd, pure and simple.

I'm sure they'd be a lot happier with another insane increase in board power that would let them sell another PSU upgrade and keep shipping the very high-power PSUs that will now sit on shelves after crypto went bust for a bit
I couldn't agree more, and while our culture at large ceaselessly applauds everything technological and rarely, if ever, criticizes it, and we're asked to believe that technological progress is entirely a linear ascent, the truth is far from that. What you're observing - the fact that despite node shrinks and gains in technological efficiency, we are increasingly offered hardware with higher and higher net energy consumption - is called the "rebound effect", and it has been observed for the entirety of the techno-industrial order since it began (circa 1712, when steam power was first used to pump water from an English coal mine), in every technological segment/market, and at the macro level as a whole. Despite having the capacity for greater efficiency than ever before, we consume more energy per capita than at any previous time in human history.

While there are several contributors to this phenomenon, the paramount impetus is capitalism, specifically the paradigm of infinite expansion, growth and production. For example, while workers are literally more productive than ever, we are on average working more than we were in 1970, and that's because any gains in productivity and production efficiency are NEVER used to lower net consumption or to maintain current levels of production (i.e. workers drop from 40 to 32 hours per week, but production levels stay the same), but rather to increase net production and therefore increase energy/resource consumption as well (i.e. at best workers continue working the same amount, but more often they either work more hours or are expected to produce more units in the same amount of time).

When it comes to GPUs, the tradition of claiming higher performance with every new generation, coupled with marketing that places that specific aspect above all others (like efficiency), has turned it into a convention that the companies, their marketing departments, and the overwhelming majority of consumers anticipate and refuse to deviate from. I think we can all agree that if either AMD or Nvidia poured the efficiency gains harvested from node shrinks, architectural improvements, etc. into a new generation that provided just a 10% gain in performance but a 100% increase in efficiency, not only would the marketing departments not know what to do with such a product, but consumers would largely criticize and reject it. Would an extremely energy-efficient video card capable of being cooled with a now "old-fashioned" one- or two-slot cooler see widespread market adoption? I think not. But far from blaming consumers, this current reality is also the result of years of constant, incremental conditioning by companies to tolerate and accept ever higher energy consumption and ever larger video cards to manage that consumption... if we were to show a 2017 audience the four-slot 3090 Ti cooler that is now basically the default across the GPU line and completely transcends brand variance, I think they'd probably laugh at the ridiculousness of it and be mystified at how we arrived at such an endpoint (this reaction would be even more poignant if the hypothetical audience came from the end of 2014, when Nvidia's Maxwell architecture was released and every Nvidia fanboy couldn't bring up "efficiency" enough as an advantage over AMD's competing GCN iteration).

This has been the issue with CPUs as well. AMD has previously decreased power draw on their CPUs as a whole, while Intel has continuously increased it, to the point that a leak from just the other day showed a Raptor Lake engineering sample consuming nearly 400 watts with a water chiller deployed and the frequency approaching 6 GHz... that's what 64-core Epyc Milans consume! And unfortunately, while there has been sporadic criticism of Intel's high power draw from the enthusiast community, consumers on the whole have largely accepted this trend without opposition. Even more unfortunate is that AMD has undoubtedly taken notice of the market's seemingly infinite patience with perpetually increasing power draw and raised the TDP (or whatever AMD calls it) of their soon-to-be-released 16-core Zen 4 CPU from 105 watts to 170 watts. Intel has been doing this for a while and has experienced no market backlash, so AMD has been pushed by both Intel and Nvidia to increase power draw in both the CPU and dGPU markets to stay competitive. It's a great example of how, when a company would, or at least could, prioritize efficiency, the behavior of a competitor that holds a substantially larger market share - and thereby dictates the trends and direction of those markets - creates pressures under which a company like AMD, which not only has to match but greatly exceed its competition's performance to be considered by fickle consumers, is forced to abandon such priorities just to gain market share or even maintain its current position. This ultimately results in the evaporation of any alternative for consumers who wish to prioritize efficiency or decrease their overall energy consumption, thereby basically forcing ALL consumers to increase their net energy consumption whether they will it or not.
Posted on Reply
#19
wolf
Better Than Native
trsttteDuring a global energy crunch and what will again be the hottest and driest summer on record!?

People need to stop rationalizing these senseless increases in power draw; it's absurd, pure and simple.
I suppose that all I can rationalize is my own purchase decision, based on what products end up existing, given this is now an industry trend. I too think the trend isn't good, far from it.

I have made and continue to make large efforts to be as responsible as I can. I have solar power (and game my consumption to use it as much as feasible), I recycle (and separate multiple different kinds of recyclables), I compost, I have a water-wise garden, I've set up my house as best I can to be passively insulated instead of throwing power at the problem of comfort, I sold my motorcycle and I ride a pushbike or electric scooter to work instead, I eat leftovers, I walk when it's close, the car we have was bought with an eye to economy and emissions, the list goes on, including undervolting my system and making it run in the efficiency sweet spot. But I'm at the point in my life where I have little time for hobbies, and one that keeps me at home, safe, out of trouble, etc. is a winner, but I'm also at the point where I have reasonable money to spend on this hobby.

So yeah, I do have a 320w GPU now, and may have an even more power hungry one in a few months (you can bet I'll undervolt it tho), and one day I might need to pay again for that, socially, physically or otherwise.

Like always, make up your own mind about what's acceptable to buy and support; voting with your wallet might even get these companies to change the trend, and I'd like to believe that's possible.

In the scheme of a world where people waste needlessly, buy gas-guzzling V8 trucks, litter (never mind being responsible with real waste), buy massive air conditioners and heaters and live in comfort, etc. - and this is regular middle-class people, never mind the 1% or 0.1% of planet rapers...

I have a high end PC with a power hungry GPU, and I'll do it again. Sorry.
Posted on Reply
#20
sam_86314
Vayra86Nice, I can save money here. Anything over 250W is a no go

And double that in Euro is another limit here. Going over? No buy...
Yep, I think anything above 250W is too much.

I was worried about that when I got my 6800 XT, but I've been able to tune it so that it rarely goes above 200W while gaming (limited to 240W). It performs nearly twice as well as my old 5700 XT, and yet it only uses about 20-40W more to do so.
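A rough perf-per-watt calculation using those figures; the power numbers are the tuned values mentioned above, treated as approximate.

```python
# Rough efficiency comparison from the figures above: ~2x the performance
# for roughly 20-40 W more. The 170 W baseline is an assumed midpoint for the
# tuned 5700 XT, not a measured or stock value.
old_perf, old_watts = 1.0, 170   # 5700 XT (relative performance, approx. tuned power)
new_perf, new_watts = 2.0, 200   # 6800 XT ("nearly twice" the performance, ~200 W)

gain = (new_perf / new_watts) / (old_perf / old_watts)
print(f"Perf-per-watt improvement: ~{gain:.2f}x")  # ~1.70x
```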
Posted on Reply
#22
defaultluser
wolfAnother day another rumor, what will it be tomorrow?

Part of me is convinced they do this on purpose to test the market: we had heaps of backlash at the higher power draws, and now the rumors are a lot more sane. I mean, circa 340 W for some people still isn't sane, but it's acceptable for such a tier product, in this day and age.
It's basically the same board power as the last-gen 3080.

The higher PSU spec for the 4090 leaves room for future Ti revisions (and also a jump in 4080 Ti power consumption, if RDNA 3 is as competitive as it was last gen).

We will have to see what the rest of the board powers are, but I highly doubt they'll be much different!
Posted on Reply
#23
mechtech
New rumour: for every watt of power reduction, Nvidia will raise the price by $5. ;)
Posted on Reply
#24
ppn
I'm sure there was a 23 Gbps rumour before, or am I having déjà vu? At this point I don't even care if it is 200 or 400 watts, since it only costs me 20 euro monthly; my system is already pulling 350 watts at the wall, so 100 watts more or whatnot is irrelevant. Considering I would waste 12 hours of money-earning time during my most productive years, the focus on the real cost of playing games should be placed elsewhere. And I doubt the relaxation that PC games provide leads to better productivity at work or randomises your life in a way that is beneficial. Maybe it trains reactions, but it kills neurons at a higher rate than it trains them. Most likely it is very detrimental and leads to horrible consequences.
Posted on Reply
#25
Vayra86
wolfI suppose that all I can rationalize is my own purchase decision, based on what products end up existing, given this is now an industry trend. I too think the trend isn't good, far from it.

I have made and continue to make large efforts to be as responsible as I can. I have solar power (and game my consumption to use it as much as feasible), I recycle (and separate multiple different kinds of recyclables), I compost, I have a water-wise garden, I've set up my house as best I can to be passively insulated instead of throwing power at the problem of comfort, I sold my motorcycle and I ride a pushbike or electric scooter to work instead, I eat leftovers, I walk when it's close, the car we have was bought with an eye to economy and emissions, the list goes on, including undervolting my system and making it run in the efficiency sweet spot. But I'm at the point in my life where I have little time for hobbies, and one that keeps me at home, safe, out of trouble, etc. is a winner, but I'm also at the point where I have reasonable money to spend on this hobby.

So yeah, I do have a 320w GPU now, and may have an even more power hungry one in a few months (you can bet I'll undervolt it tho), and one day I might need to pay again for that, socially, physically or otherwise.

Like always, make up your own mind about what's acceptable to buy and support; voting with your wallet might even get these companies to change the trend, and I'd like to believe that's possible.

In the scheme of a world where people waste needlessly, buy gas-guzzling V8 trucks, litter (never mind being responsible with real waste), buy massive air conditioners and heaters and live in comfort, etc. - and this is regular middle-class people, never mind the 1% or 0.1% of planet rapers...

I have a high end PC with a power hungry GPU, and I'll do it again. Sorry.
Isn't the reality, without cognitive dissonance, that we should ALL show some restraint to leave future generations a habitable planet?

Whataboutisms won't get you, me, or any of our kids anywhere. At its core, all that is, is egocentric behaviour. Hypocrisy. It's up to each of us to set boundaries. If this post represents yours, fine, but don't even begin telling anyone you're somehow doing something right... all you do is justify utter waste for yourself.

If you can own that, enjoy that 400 W bullshit GPU ;) If your gut tells you it doesn't feel right, the solution is very simple: don't buy into it. That is consumer power, and that is how 'we' enforce change. You literally even said so yourself. If it doesn't feel right, it's not right. Screw what 'the industry' pushes for. That's the very 1% YOU are catering to. The 1% that releases a product you yourself consider 'a bit too much' even!

Signed: an EV-driving, solar-powered homeowner. The point isn't saving a few more kWh than the next guy; the point is what we as consumers keep feeding. My point is: don't feed ever more power-hungry GPUs just because node advancements are stalling!

And... again... 'the price of RT...' :) Worth it?!
Posted on Reply