Tuesday, August 23rd 2022

NVIDIA GeForce RTX 4080 Could Get 23 Gbps GDDR6X Memory with 340 Watt Total Board Power

NVIDIA's upcoming GeForce RTX 40 series graphics cards are less than two months from their official launch. As we near the final specification draft, hardware leakers keep reporting that the specification is still changing. Today, @kopite7kimi updated his GeForce RTX 4080 GPU predictions with some notable changes. First, the GPU memory gets an upgrade over the previously believed specification: the SKU was thought to use GDDR6X running at 21 Gbps, but it is now assumed to use a 23 Gbps variant. Faster memory should translate into better overall performance, and it remains to be seen what it can achieve with overclocking.

Next, another update for the NVIDIA GeForce RTX 4080 concerns the SKU's total board power (TBP). Previously, we believed it came with a 420 Watt TBP; however, kopite7kimi's sources now claim it has a 340 Watt TBP. This 80 Watt reduction is rather significant and could be attributed to NVIDIA optimizing for the most efficient design possible.
Sources: kopite7kimi (Twitter), via VideoCardz

82 Comments on NVIDIA GeForce RTX 4080 Could Get 23 Gbps GDDR6X Memory with 340 Watt Total Board Power

#76
Sisyphus
MentalAcetylide: Exactly. Unless there's some new, earth-shattering technological breakthrough in regards to energy, high performance + cheap will never be possible. Otherwise, we would all be driving sports cars & using high-end computers that can get 120+ FPS on a 16K display at the cost of pennies per day for the energy consumed. This equates to something like doing work requiring the energy of nuclear fusion/fission on a small scale in your own house or car without the need for putting the energy into it. Solar is about as good as it gets, but the costs become astronomical when you consider the hardware & recurrent costs. The laws of physics have us by the balls. :(
Psychology. People always want more. If energy were free, clean, and renewable, the spared resources would flow into more complex hardware, which depletes other resources and would have the same ecological impact. Diogenes is credited with saying: happy is he who has no needs. He probably lived around 400 BC in a barrel in Athens, without clothes, living on the leftovers of others. He ridiculed the life of the Athenians, who wasted all their time satisfying needs, most of which seemed useless to him. The word "cynic" goes back to him.
Back to the topic:
As I mentioned, computing power per Watt improves from generation to generation, by about 50-100%; real performance by 20-50%. Whoever wants to save money and energy buys a cheap entry-level CPU, limits boost, buys an entry-level GPU, limits boost/undervolts, and plays games that are 4-5 years old. Or buys a notebook with a mobile RTX 3060 or 3070, plugs in an HD or 4K monitor, done. Or plays console or smartphone games. It's up to everyone; that's freedom. Taxing energy costs is easier and fairer. No need for more bans and more regulation.
Posted on Reply
#77
tpu7887
agent_x007: There is no "work best"...


My solution:
Make cards with max boost limited to a lower frequency (1.5-2 GHz) and a voltage of 0.8-0.85 V, with additional V/F options available through an "OC" mode that can be enabled via a driver option/AIB program/3rd-party OC tool. Maybe display an appropriate warning about the implications (for longevity/thermals/noise/long-term performance/etc.) BEFORE using those higher-rated modes (or just do an "Uber" vBIOS switch on the PCB?).

BTW: ALL NV Boost 3.0 cards are capable of this (i.e. since Pascal), but NV simply likes to "push things to 11", because... reasons.
Proof?
ALL the GPUs mentioned contain stable, manufacturer-tested lower frequency/voltage combinations that a particular card can use. They're in the V/F table, and you can "lock" the GPU to any of them via Afterburner (for example).
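
(Aside: Afterburner exposes this through its V/F curve editor, but the same "lock it lower" idea can also be scripted. Below is a minimal, illustrative sketch using NVIDIA's NVML through the pynvml bindings; the clock and power numbers are made-up examples, and note that NVML can only lock the clock range and the power limit, it cannot set voltage directly.)

```python
# Sketch: cap a GPU's clock range and power limit from software, assuming the
# pynvml bindings (pip install nvidia-ml-py) and a reasonably recent driver;
# requires admin/root rights. Locking the clock range keeps the card on the
# lower (lower-voltage) part of its V/F table.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)           # first GPU in the system

# Hypothetical "efficiency" cap: keep the core between 1500 and 1800 MHz.
pynvml.nvmlDeviceSetGpuLockedClocks(gpu, 1500, 1800)

# Optionally drop the board power limit as well (values are in milliwatts).
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(gpu)
target_mw = max(min_mw, int(max_mw * 0.75))          # e.g. 75% of the stock limit
pynvml.nvmlDeviceSetPowerManagementLimit(gpu, target_mw)

# To undo: pynvml.nvmlDeviceResetGpuLockedClocks(gpu)
pynvml.nvmlShutdown()
```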
edit/aside: After a bit, it occurred to me that you might have thought I was suggesting more power. I'm not; you'll understand after you read the rest of my post below. Personally, a lot of the time I run my 3080 at 0.900 V and 1800 MHz with no boost. During the most demanding scenes I get crashes if the clock is set much past 1850 MHz at this power-saving, but probably not completely optimal, 0.900 V (this state is made possible by Asus's GPU Tweak II and its "Always 3D clocks" button). If Gigabyte shipped my card as I run it currently, it might last a year or a bit more of heavy use, but then you'd start seeing instability as the chip degrades.

Since Intel's introduction of Turbo Boost, even chips of the same type have had different VIDs. I believe GPUs are the same; if not, they need to (and so they will) catch up, since they draw so much more power than CPUs that it's a must. First, a bit of background for those who don't already know; you may need to look up "CPU VID" for more depth than the minimum I describe next (which is just what's needed to understand the last part of this post). VIDs are voltages in a table that a CPU reports to the motherboard so that it supplies the right voltage at every operating frequency the chip is designed to run: all base-clock and turbo frequencies get the right voltages (frequencies only reachable with overclocking are not covered).

I'm talking about a small modification to this standard, nothing else; the mode of operation is already in place. All that needs to be done is to start, at the beginning of operation (day 0), with a voltage set just above the absolute lowest voltage required for stability (which manufacturers already determine in order to set the VIDs the way they currently do, i.e. too high). From that day forward, on a monthly basis and using a formula with well-known, well-defined variables, very small voltage increases would be applied to all the voltages in the table to ensure reliable operation is maintained over the product's service life. How much lower could voltages be set at the beginning? Well, like you said, all chips are different. An extreme example: my old 2500K had a VID of 1.416 V for its 3700 MHz turbo boost. With that voltage it's Prime95 small-FFT stable indefinitely at 4800 MHz! With just 1.25 V it's Prime stable at 4300 MHz, and I think it needs only 1.10 V for 3700 MHz. That's 0.3 V! I don't think most 2nd-gen Intel chips have a 0.3 V discrepancy, but I bet it's 0.2 V when new, and after a decade of continual heavy use at high temps, still definitely less than 0.1 V. Conversely, my 9600K's VID of 1.16 V is only 0.07 V above what's needed. Maybe degradation is slower at the lower voltages of newer manufacturing processes, I don't know (nobody but Intel does, because it's proprietary, and I hate that). But all is not lost! We know the margin is lower now and chips are lasting just as long, so something must account for it.
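
To make the proposal concrete, here is a minimal sketch of the kind of formula being described; every parameter (initial margin, monthly bump, ceiling) is a made-up placeholder, not a value any vendor has published:

```python
# Sketch of the scheduled-voltage idea described above: start each VID entry
# just above the factory-measured minimum stable voltage and nudge it up by a
# small, fixed amount every month of service. All numbers are hypothetical.
def scheduled_vid(v_min_stable: float,
                  months_in_service: int,
                  start_margin: float = 0.010,   # assumed 10 mV initial guard band
                  monthly_bump: float = 0.001,   # assumed 1 mV/month aging allowance
                  v_ceiling: float = 1.20) -> float:
    """Return the voltage (in volts) to program for one V/F table entry."""
    v = v_min_stable + start_margin + monthly_bump * months_in_service
    return min(v, v_ceiling)                      # never exceed a safe ceiling

# Example: an entry whose true minimum is 1.10 V, three years into service.
print(f"{scheduled_vid(1.10, 36):.3f} V")         # -> 1.146 V under these assumptions
```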

We don't know all of the variables, or the exact values of the variables whose existence we've inferred through observation, but we don't need to! All that's needed for success is for the method I described to get back to the people who do know them. It would help them with temperatures, power density, and a bunch of other things I don't have to get into. So they should do it.

Environmentally, worldwide, a LOT, and I mean A LOT, of power would be saved.

How much is a lot?

Well, as I've said, a lot of the information isn't easy to come by, but even in the near term it's a ridiculously large number: tens of millions of devices, laptops, desktops and servers alike, could be improved. Even if average power consumption were reduced by just 10% during the first three years of service, that's a LOT. Someone who came up with a similarly effective strategy to help the environment might well have been handed prizes for it. Someone (I'm looking at you, random NVIDIA/Intel/AMD mid-level forum spies) should pass this up the chain. I'm not a climate activist or anything, but I don't like waste, especially waste that's completely unnecessary.
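
For a sense of scale, here is a rough back-of-envelope sketch; every input (fleet size, average draw, usage hours) is an assumption for illustration, not measured data, and it still lands on the order of a terawatt-hour per year:

```python
# Back-of-envelope sketch of the "how much is a lot?" claim above.
devices          = 50_000_000      # assumed fleet of affected laptops/desktops/servers
avg_power_w      = 60              # assumed average draw per device, in watts
hours_per_year   = 8 * 365         # assumed 8 hours of active use per day
savings_fraction = 0.10            # the 10% reduction discussed above

saved_kwh_per_year = devices * avg_power_w * hours_per_year * savings_fraction / 1000
print(f"~{saved_kwh_per_year / 1e9:.2f} TWh saved per year")   # ~0.88 TWh/year under these assumptions
```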
Posted on Reply
#78
InVasMani
Just a BIOS switch on all cards offering a low-level and/or mid-level power saving relative to reference design clocks would do wonders to save a ton of additional power. It wouldn't really cost much more if they just scrapped one of the numerous display outputs in favor of the better environmental option. Most cards have 3-4 display outputs as it is, and with today's panels not many people run triple-display setups anymore, since 4K and ultrawide screens are more appealing to most; bezels hurt immersion and are distracting compared to a seamless image.

Something like 25% to 50% power-saving options, with better efficiency tuning of voltages, since you can tighten voltage a bit when you drop frequency and aren't boosting as high. In fact, you can drop boost clocks by about 10%, bump base clocks up 10%, and come out ahead on efficiency, because reducing boost speeds lets you reduce voltage further. That minimizes the performance deficit while still leveraging higher efficiency to a good extent. You're basically shifting the two voltage goal posts in exchange for performance and efficiency trade-offs.
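
As a rough illustration of why that trade works out, here is a sketch using the common approximation that dynamic power scales with frequency times voltage squared (P ∝ f·V²); the clock/voltage pairs are invented for the example, not real card settings:

```python
# Rough sketch of the boost-clock/voltage trade-off described above,
# using the usual dynamic-power approximation P ~ f * V^2.
def relative_power(f_mhz: float, v: float, f_ref_mhz: float, v_ref: float) -> float:
    return (f_mhz / f_ref_mhz) * (v / v_ref) ** 2

# Reference: assumed stock boost of 1900 MHz at 1.05 V.
# Tuned:     boost dropped ~10% to 1700 MHz, voltage trimmed to 0.90 V (assumed).
p = relative_power(1700, 0.90, 1900, 1.05)
print(f"~{p:.0%} of stock power for ~{1700/1900:.0%} of the stock boost clock")
# -> roughly 66% of the power for about 89% of the clock, under these assumptions
```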
Posted on Reply
#79
Parn
A lot less power than what was rumored a couple of months ago, but it's still a no-go for me. Anything more than 250 W is simply absurd, especially with the current electricity prices in the UK.
Posted on Reply
#80
InVasMani
About 225-250 W is my cutoff point with my PSU.
Posted on Reply
#81
tpu7887
Hopefully it's true; a lot of pointless power consumption would be curtailed. 20% less power for 3% fewer frames sounds good to me! There still need to be options for people who want to overclock, though. It's annoying that my 3080, no matter what I do, won't draw more than ~370 W; its average with FurMark is just over 360 W.
It's not really a problem right now because I run 1850 MHz at 0.900 V, but in the future, when it's not a top-end card anymore? I'm sure I'll want to overclock a bit more.

edit: I wasn't even thinking of the high electricity prices! That must be why they're doing it... Maybe they'll have a button to clock it like a mobile GPU too!
Posted on Reply
#82
Ultra Taco
Is there any chance some models of the 4080 will have an ATX 2.0 power interface? I don't want to buy a new PSU.
Posted on Reply