Friday, March 25th 2022
NVIDIA GeForce RTX 4090/4080 to Feature up to 24 GB of GDDR6X Memory and 600 Watt Board Power
After launching the data center-oriented Hopper architecture, NVIDIA is preparing to transition the consumer segment to new, gaming-focused designs codenamed Ada Lovelace. Thanks to the folks over at Igor's Lab, we have some additional information about the upcoming lineup, including a sneak peek at a few features of the top-end GeForce RTX 4080 and RTX 4090 SKUs. According to Igor, NVIDIA is using the upcoming GeForce RTX 3090 Ti as a test run for the next-generation Ada Lovelace AD102 GPU: the company is testing the PCIe Gen5 power connector and wants to see how it fares with the biggest GA102 SKU.
Additionally, we find that the AD102 GPU is supposedly pin-compatible with GA102, meaning the pinout on AD102 matches what we see on GA102. There are 12 placements for memory modules on the AD102 reference design board, allowing for up to 24 GB of GDDR6X memory. As many as 24 voltage converters surround the GPU; NVIDIA will likely use the uP9512 controller, which can drive eight phases, resulting in three voltage converters per phase for proper power delivery. The total board power (TBP) is reportedly rated at up to 600 Watts, meaning that the GPU, memory, and power delivery combined dissipate up to 600 Watts of heat. Igor notes that board partners will bundle a 12+4-pin (12VHPWR) to quad 8-pin (legacy PCIe) adapter to ensure PSU compatibility.
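As a rough sanity check of the figures above (12 memory placements, 24 converters, eight phases, up to 600 W TBP), here is a small back-of-the-envelope sketch; the 2 GB-per-module size follows from 24 GB across 12 placements, while the core voltage and core-power split are assumptions for illustration only, not part of Igor's report.

# Back-of-the-envelope check of the reported board figures (illustrative only).
memory_placements = 12
gb_per_module = 2                      # 16 Gbit GDDR6X modules: 24 GB / 12 placements
total_memory_gb = memory_placements * gb_per_module         # 24 GB

converters = 24
phases = 8                             # an eight-phase controller like the uP9512
converters_per_phase = converters // phases                  # 3 power stages per phase

total_board_power_w = 600              # reported TBP
assumed_core_power_w = 450             # assumption: rough GPU-core share of the 600 W
core_voltage_v = 1.1                   # assumption: typical GPU core voltage
current_per_phase_a = assumed_core_power_w / core_voltage_v / phases

print(f"{total_memory_gb} GB GDDR6X, {converters_per_phase} converters/phase, "
      f"~{current_per_phase_a:.0f} A per phase out of a {total_board_power_w} W TBP")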
Source:
Igor's Lab
107 Comments on NVIDIA GeForce RTX 4090/4080 to Feature up to 24 GB of GDDR6X Memory and 600 Watt Board Power
GA106 is a 170w SKU. Not 75 ;)
So really, we moved the TDP of the 104 to a lower point in the stack. Again, you need to twist reality to get to favorable outcomes ;) You really just provided your own counterargument.
Last I checked, we were talking about the gaming stack...
I didn't have to twist anything but provide information that you clearly lack.
Do none of you understand how any of this works?
The reason that CPUs and GPUs are drawing more and more energy for apparently less performance is a simple function of physics.
Until a few years ago, a full node shrink (180 nm to 90 nm to 45 nm / 28 nm to 14 nm to 7 nm) generally meant that the linear feature size halved, so the same chip would occupy only about a quarter of the physical area after a shrink and transistor density roughly quadrupled. That gave you a fuckton of room to add more transistors for more functionality and performance.
But that's no longer possible, because node sizes aren't halving anymore; they're dropping by a nanometer or two at a time as we approach the literal physical limits of what silicon can do (the end of Moore's law). So extra transistor density isn't as easily available, hence die sizes have to increase to keep adding transistors.
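For what it's worth, the scaling argument above is easy to put numbers on: an idealized full node shrink halves the linear dimension and therefore quarters the area of the same design, while recent "shrinks" buy far less. A minimal sketch, assuming ideal geometric scaling (real nodes stopped tracking their nanometer names long ago):

# Idealized geometric scaling: halving the linear dimension quarters the area,
# i.e. roughly 4x the transistors fit in the same die area.
def ideal_scaling(old_nm: float, new_nm: float) -> tuple[float, float]:
    linear = new_nm / old_nm      # linear shrink factor
    area = linear ** 2            # the same design's area scales with the square
    density = 1 / area            # transistors per unit area scale inversely
    return area, density

for old, new in [(180, 90), (28, 14), (7, 5)]:
    area, density = ideal_scaling(old, new)
    print(f"{old} nm -> {new} nm: same chip at ~{area:.2f}x the area, ~{density:.1f}x the density")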
Then there's the unavoidable consequence of smaller and smaller nodes: less physical space between individual transistors, which leads to electromigration and, crucially, current leakage. That means not only do you need more energy input to overcome the leakage, you also have to dedicate more of your die to current-monitoring and current-controlling circuitry. Again, that leaves less die area for transistors that add functionality and performance.
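To make the leakage point concrete, the usual first-order CMOS power model splits chip power into dynamic switching power and static leakage; as transistors shrink, the static term stops being negligible. The coefficients below are made-up placeholders chosen only to show the shape of the trade-off, not measurements of any real GPU.

# First-order CMOS power model: P = alpha * C * V^2 * f (dynamic) + V * I_leak (static).
# All numbers are placeholders for illustration.
def chip_power(alpha, switched_capacitance_f, voltage_v, frequency_hz, leakage_current_a):
    dynamic_w = alpha * switched_capacitance_f * voltage_v ** 2 * frequency_hz
    static_w = voltage_v * leakage_current_a
    return dynamic_w, static_w

dyn, stat = chip_power(alpha=0.2, switched_capacitance_f=2e-7,
                       voltage_v=1.1, frequency_hz=2.5e9, leakage_current_a=25)
print(f"dynamic ~{dyn:.0f} W, leakage ~{stat:.0f} W")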
And because consumers don't care about any of the above and demand more functionality and performance generation-on-generation, the problem is going to continue to get worse until the semiconductor industry transitions to a successor to silicon.
No it's not.
Plus anyone with half a brain can put historical specs side by side and take a wild guess at future core counts. It's not interesting and we shouldn't care so much about it. 'First!' Well yay, want a cookie? None of the info really matters until stuff is on shelves and seriously reviewed anyway. There are many ways to seek efficiency, and the chips we are getting now are a mix of bad things. GPUs are brute-forcing expensive ray tracing because they eat anything without RT alive... so what we have there is just a saturated market searching for new money makers, in a day and age of diminishing returns and a major climate problem... plus a market where chips are scarce.
Fundamentally, that is a display of human greed at the expense of a lot of sense that unfortunately is not common.
Adults look at the horizon and see a problem; I'd say it's in fact the children who are still yearning for the good old progress we always had. 'Must have new toys' for... what exactly? Progress in the eternal shape of more, more, more simply cannot keep going. You must be joking. Power to them, I guess? The fact remains it's a workstation GPU, it's not even a full 106, and it's part of a stack that isn't the subject of this topic. I'm not even considering price lol...
Twisting reality to make your point. The 106 for gaming is a 170 W GPU that you falsely present as a 70 W one in order to say it beats a 1080 (GP104) at 180 W. Glad we have that sorted out. The reality is that the 106 now uses the same power as a 2016-era 104, and the current 104 is in the old 102 TDP range.
Base electricity is tiered by time of use:
www.hydroone.com/rates-and-billing/rates-and-charges/electricity-pricing-and-costs#TOU
Most people heat with natural gas, since electricity would be 6x-10x the cost, while Manitoba typically heats with electricity because it's cheaper there than natural gas.
Twenty-four
What we need is more cooking videos. I'm all-in on this one.
Seriously though, when are they going to stop that nonsense? I surmise a lot of down-the-road card failures will be memory-related.
Seeing 25¢ for a kilowatt-hour up north... wow. It's under 8¢ on this side of the lake.
Granted, I still pay Toledo Edison ~$60 a month for distribution/equipment fees.
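For reference, the gap between those two rates is easy to put into dollars; the 600 W board power comes from the article, while the 4 hours per day of full-load use is an assumed figure for illustration.

# Illustrative monthly running cost of a 600 W card at the two quoted rates.
board_power_kw = 0.6
hours_per_day = 4                    # assumption: daily full-load gaming time
kwh_per_month = board_power_kw * hours_per_day * 30    # ~72 kWh

for label, rate_per_kwh in [("~25 cents/kWh", 0.25), ("~8 cents/kWh", 0.08)]:
    print(f"{label}: ~${kwh_per_month * rate_per_kwh:.2f} per month")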
Maybe we shouldn't be buying GPUs every year or two to begin with though. From a consumer standpoint, I personally don't mind if there isn't something 'latest and greatest' constantly. Can't buy it all anyway. But I do wonder what happens to all of these industries dependent on excruciatingly fast product cycles with just slightly better stuff in them each time. It's always been kind of nuts to me how many microchips we churn out just to toss in a few years. I never saw how releasing products this way could be sustainable. It's a lot of waste, and people spend a lot of their finite financial resources on it. It doesn't seem necessary to reap the benefits of technology. To me, that's more marketing that makes people feel like they always need more, and on the other end are games promising more and more, utilizing the extra headroom. Pretty much every year, there will be some new upgrade to tempt your wallet.
That's the thing... the pace of actual performance advancements slowed, right? But the products didn't. They came out at the same rate as always, with smaller gains and prices that really only got higher year after year, like most other things. The more logical and (I think) fairer way is to hold a higher standard for what's considered a real gain and develop products for as long as that takes. Less waste, better products, less oversaturation in the markets and product stacks... possibly better delineation between them too. The thing is, it's impossible under the current economic model. No tech business could survive as it does today if it was waiting until it had serious gains for each new product. Nobody is footing the bill for the added operational costs. We are talking about entire industries that are almost predicated on breakneck advancement... that can plan for a decade's worth of new shit, a whole roadmap of relatively small incremental improvements. What happens when we can no longer do that with the knowledge and tools that we have? What does stagnating development in silicon transistor tech look like when it reaches its final stage? People have their pet materials, but those are more challenging to work with than silicon. Those challenges could reap huge dividends in the end, but they could also just ensure that it's all ridiculously expensive for a long time. Just getting one of them to replace silicon will be expensive for everyone.
Something kind of has to give there though. Somewhere, somehow.
I kind of wonder how many things about how we operate will change in my life. I don't really see technology, or especially stuff in the realm of consumer technology ever being like it was before.
I certainly understand the desire to have a decently powerful GPU that doesn't require an external power connector, and I also understand why so many (including myself) are fascinated with the A2000 - it demonstrates that the efficiency Maxwell brought hasn't gone anywhere. It's just that, because everything has become about performance and beating the competition (and I blame consumers for this as much as anyone), the manufacturers are clocking their silicon to the limits to try to get any win. And of course that's a zero-sum game, because as soon as one manufacturer does it, all of them will do it.
People will need to drop down a tier this gen if they don't want sticker shock, but they'll still get a lot stronger performance.
In summary, it's more expensive to buy if you aren't using it for mining, as NVIDIA intends it to be used. It's basically within the margin of error of the 6 GB version at or below 1440p (within +/- 5%), which is pretty pathetic in context. Also, there's no SLI, so there is no salvaging how bad it is for gamers, plus they clamped down on BIOS modding as well; it's not aimed at gaming at all, except at uninformed gamers. The GTX 960 4GB had more upside: you could SLI it and BIOS-mod the VRAM to run faster. This card is just a joke outright.
Meanwhile I'm just chilling with my 3090 using < 300W @ 4K RT DLSS Balanced in new AAA games, but hey DLSS is bad according to some too.