Thursday, May 9th 2024
NVIDIA Testing GeForce RTX 50 Series "Blackwell" GPU Designs Ranging from 250 W to 600 W
According to Benchlife.info insiders, NVIDIA is reportedly testing designs with various Total Graphics Power (TGP) levels, ranging from 250 Watts to 600 Watts, for its upcoming GeForce RTX 50 series "Blackwell" graphics cards. The designs span from a 250 W configuration aimed at mainstream users to a more powerful 600 W configuration tailored for enthusiast-level performance. The 250 W cooling system is expected to prioritize compactness and power efficiency, making it an appealing choice for gamers seeking a balance between capability and energy conservation. This design could prove particularly attractive for those building small form-factor rigs, or for AIBs looking to offer smaller coolers. At the other end of the spectrum, the 600 W cooling solution covers the highest TGP in the stack, which may exist only for testing purposes. Other SKUs with different power configurations fall in between.
We witnessed NVIDIA testing a 900-watt version of the Ada Lovelace AD102 GPU SKU, which never saw the light of day, so we should take this testing phase with a grain of salt. Often, the engineering silicon is the first batch made to enable software and firmware development, while the final silicon is more efficient, optimized to use less power and to align with regular TGP structures. The current highest-end SKU, the GeForce RTX 4090, uses a 450 W TGP. So, take this phase with some reservations as we wait for more information to come out.
Source:
Benchlife.info
84 Comments on NVIDIA Testing GeForce RTX 50 Series "Blackwell" GPU Designs Ranging from 250 W to 600 W
Just....buy a smaller GPU and stop crying? Just go get a 6750 XT or 4060 Ti and revel in the power efficiency! So the high end should be artificially gimped so you can have a dual-slot cooler? Because the only way they are doing this is to dramatically under-clock the thing out of the box or cut down the GPU die size, at which point you have a xx7x card with a xx8x name and price, which I seem to remember people didn't like with the 4000 series. The 4000 series is LIGHTYEARS ahead of the 1000 series in perf/W. Just because they didn't artificially limit their GPUs to 1080 Ti performance doesn't mean there haven't been major efficiency improvements. So if you don't want the heat, buy a lower-TDP card then? The point is, it's not a risk, nor is it going to electrocute your cat or burn down your house.
At least they still have a quite significant FPS lead over the old Radeon RX 7900 XTX, so why even bother to release useless products? I see pictures of heavily bent cards; sagging is quite an issue.
Broken, cracked PCBs with molten connectors will become a norm.
I choose to avoid, though :D
Solution - water cooling.
Also, something like that for rendering would suck ass because you would only be able to feasibly have one of those things in the same case without going all-out full custom system, cooling, wall outlets, etc., and probably a beefy 2000-watt PSU & a decent UPS w/ surge protection. That's easily looking at around $10k+ right there if you're a DIY.
Like I said, they're going to keep the wattage down, and I'm sure if they wanted to, they could make them more efficient, but it's cheaper for them to stick with their current manufacturing processes.
Otherwise, there will be little to no incentive for anyone to throw another large sum of money at an upgrade. For 10-15-20% more FPS it won't be worth it.
But you know, not many people buy this type of card. Few bought the 8800 GTX Ultra, the GTX 590, or the 1080 Ti; those are just impossible to find today because nobody bought them, right? Ahem....the 3090 Ti exists, and peaks at over 600 watts. Its sustained draw at stock is over 500 watts.
ATI Radeon HD 2900 XT Review - Value & Conclusion | TechPowerUp
- uses less power: 110 W vs. 120 W
- delivers more 1080p fps at time of review in W1zz's reviews: 96 vs. 83
- costs less after inflation: $300 vs. $323
- delivers 2.4x the fps at 1080p in today's games (a reasonable 3-gen uplift, ~34%/gen)
I bought the 1060 6G as my first PC gaming GPU, and the 4060 is a well-matched successor. I'll bet the 5060 will be a little higher power but still deliver more fps/W.
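The ~34%/gen figure in that comparison follows from compounding the quoted 2.4x total uplift over three generations. A quick sketch of the arithmetic (the 2.4x ratio and three-generation span are taken from the post above):

```python
# Back-of-envelope check: a 2.4x total fps gain spread over 3 generations
# compounds to roughly 34% per generation.
total_uplift = 2.4   # 4060 vs. 1060 fps ratio at 1080p, per the post
generations = 3      # 10-series -> 40-series, three generational steps

per_gen = total_uplift ** (1 / generations) - 1
print(f"~{per_gen:.0%} per generation")  # ~34% per generation
```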
I think what we are seeing is that the range of performance between the x60 and x80ti/90 has increased quite a bit. The TDP is stretched up mostly on the higher end.
So you really do get a tinier slice of silicon; the x60 is less of a card today than it was 3-6 generations back. Power grids have bigger issues, like EVs and solar. It's precisely NOT usage but power input that destroys network stability.
If we ever reach a point where the newest GPUs threaten to max out the power of the average domestic circuit, it won't be especially difficult to opt out.
The trend now is actually the opposite. You buy lower-emission TVs, refrigerators, washing machines... your average light bulb went from 100 watts down to 9 watts or so. The last good game was Crysis 3 back in 2013. After that and the original Far Cry, there is nothing even remotely appealing in graphics realism or breakthrough, revolutionary, innovative graphics.
Ray-tracing alone is still an embarrassing hoax.