
First NVIDIA GeForce RTX 5090 GPU with 32 GB GDDR7 Memory Leaks Ahead of CES Keynote

Joined Jun 20, 2024
Messages 447 (2.20/day)
It actually has better green credentials. Solar parks reflect more light than they absorb, which also warms the planet, and the production of the solar cells isn't green in that regard either. The same goes for the number of birds killed by wind farms. Nuclear is in fact greener than solar. Hydro is the cleanest of them all, but pretty limited in where it can be used, and it has its own environmental impacts.
Then you still have the problem of scaling your output up or down, which is far more difficult with both of those options than with a nuclear plant. If you want the world to really go green, then nuclear is an unavoidable power source at this time, unless fusion power makes a breakthrough.

The problem with nuclear plants is that all people see is Chernobyl: a plant that was badly designed from the start, and they forget the disaster happened because of bad management and operator error. At the same time it was somewhat of a blessing in disguise, because the world learned an enormous amount about what not to do and about the implications when it goes horribly wrong.
Hey, I'm not arguing against it - I'm not debating what is or isn't better - everything has pros and cons, and there's no such thing as truly clean energy yet once you include the full lifetime environmental impact.
What this originally stems from is nuclear being lumped into the same bracket as renewables, primarily because it's 'carbon neutral'. That's really a mistake, and I'm unsure whether it's deliberate on the part of the energy industry to try and lift its PR reputation with the public.

Chernobyl is an outlier - the Fukushima problems are actually more worrying: a supposedly well-designed facility in a well-off first-world country having multiple containment failures. Yeah, for sure, a natural disaster started the sequence of events, but it's a reminder that even after a shutdown/scram there is still an immediate problem of keeping everything under control while it cools down.
 
Joined Jun 2, 2017
Messages 9,519 (3.43/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP700, Seagate 530 2TB, Adata SX8200 2TB x2, Kingston 2TB x2, Micron 8TB, WD AN1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitech Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64, Steam, GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
Three Mile Island, acid rain. Do you have any idea how many nuclear power plants ring the Great Lakes? The biggest problem with nuclear is the waste. Depleted-uranium bullets and tank armor are not the solution. For those who don't know, the rods are made of uranium. Chernobyl introduced new cancers after it happened. Fukushima happened right on the ocean, and the ocean is pure chemistry in my eyes. When that oil platform the Americans built in the Gulf of Mexico blew up, one of the biggest surprises was that the ocean has algae that eat oil.
 
Joined Sep 15, 2011
Messages 6,815 (1.40/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
Why is that nonsense? Gaming is a wasteful hobby. Why do you need 150 watts to play bing bing wahoo? Hell, why do you need 75 watts? The Switch only pulls like 10 W at load; that's all you need for video games.

You say it's nonsense because YOU want a GPU with more than 75 W of capability. Funny how that works. Once you start down the road of "you don't need this", it WILL be taken to its logical extreme. Remember the UK's potato peeler license debacle?

I'd love to hear what this sensible argument is for NEEDING GPUs of more than 75 watts to play games.

No video card on the consumer side draws more than 1 kW of power. Not even close. The highest was the 600 W the 3090 Ti could tickle. If you're talking whole systems, buddy, SLI gaming PCs were pushing 2 kW back in 2008. Somehow you all survived.

A 1 kW card poses no more of a risk than a 500 W card does, or a 300 W card. If you have such an old building, you should be upgrading the wiring instead of buying GPUs to waste time gaming on.

Here in "rednek ville" we have modern 25 amp wiring with 20 amp breakers that can run a 5090 without burning the house down. Interesting that this is such a concern in the EU, with all those "very smart people" banning everything. Why are you even allowed to have a building with wiring that can't handle a 1,000 W load? That's literally knob-and-tube-style wiring.

If you have such widespread problems with wiring, the solution is NOT to ban GPUs, because you're gonna have to ban fridges, microwaves, and vacuum cleaners too. What you SHOULD be doing is banning that old wiring. This is why the EU gets referred to as a "nanny state".
Sorry, but amazingly, you misunderstood EVERYTHING I just wrote. You're acting like a typical spoiled American brat, thinking only he is right, so I'm not going to entertain your agenda any further.
For the rest of the readers: I'm just saying that for anything with such a high power draw there should be some minimum regulations in place; otherwise those burning connectors won't be the last thing that starts to burn once these microwave-oven-like monsters gradually creep up to 1 kW of power draw.
Just imagine, in not even 10 years, having a 1 kW GPU, a 500 W CPU, 100 W for two SSDs, etc. :)))
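A rough back-of-the-envelope check of the circuit-capacity argument above, in Python. The breaker ratings, the ~80% continuous-load rule of thumb, the PSU efficiency and the helper names are all illustrative assumptions for this sketch, not measurements:

```python
# Back-of-the-envelope circuit check (all figures are illustrative assumptions).

def circuit_capacity_w(volts: float, breaker_amps: float, derate: float = 0.8) -> float:
    """Usable continuous watts on a branch circuit, using the common
    ~80% rule of thumb for continuous loads."""
    return volts * breaker_amps * derate

def wall_draw_w(gpu_w: float, cpu_w: float, rest_w: float, psu_eff: float = 0.9) -> float:
    """Approximate draw at the wall: DC component load divided by PSU efficiency."""
    return (gpu_w + cpu_w + rest_w) / psu_eff

# The hypothetical "1 kW GPU, 500 W CPU, 100 W for the rest" build mentioned above.
draw = wall_draw_w(gpu_w=1000, cpu_w=500, rest_w=100)   # roughly 1,778 W at the wall

for label, volts, amps in (("US 120 V / 20 A circuit", 120, 20),
                           ("EU 230 V / 16 A circuit", 230, 16)):
    cap = circuit_capacity_w(volts, amps)
    verdict = "fits" if draw < cap else "does not fit"
    print(f"{label}: ~{cap:.0f} W usable vs ~{draw:.0f} W draw -> {verdict}")
```

On those assumed numbers even that extreme build squeezes onto a single modern circuit, which is roughly the point being argued; older wiring or a heavily shared circuit is a different story.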
 
Joined Dec 26, 2006
Messages 3,887 (0.59/day)
Location Northern Ontario Canada
Processor Ryzen 5700x
Motherboard Gigabyte X570S Aero G R1.1 BiosF5g
Cooling Noctua NH-C12P SE14 w/ NF-A15 HS-PWM Fan 1500rpm
Memory Micron DDR4-3200 2x32GB D.S. D.R. (CT2K32G4DFD832A)
Video Card(s) AMD RX 6800 - Asus Tuf
Storage Kingston KC3000 1TB & 2TB & 4TB Corsair MP600 Pro LPX
Display(s) LG 27UL550-W (27" 4k)
Case Be Quiet Pure Base 600 (no window)
Audio Device(s) Realtek ALC1220-VB
Power Supply SuperFlower Leadex V Gold Pro 850W ATX Ver2.52
Mouse Mionix Naos Pro
Keyboard Corsair Strafe with browns
Software W10 22H2 Pro x64
"The leaked materials show that Inno3D's variant will use a 3.5-slot cooling system"

not 4?!?!
not sure I believe it ;)
 
Joined Jun 5, 2018
Messages 243 (0.10/day)
How do you get a 50% increase in performance between the 5090 and the 4090 with only a 33% increase in CUDA cores, when the 4090 only beat the 4080 by 25% with a 67% increase in cores?
You are right, which is why he increased the price by 33% exactly.
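Purely to spell out the arithmetic behind that question, here is a small sketch; the percentages are the ones quoted in the post plus Nvidia's claimed 5090 uplift, not measured results, and the function name is made up:

```python
# How much of the extra core count shows up as extra performance (illustrative).
# Inputs are fractional increases, e.g. 0.25 means +25%.

def scaling_efficiency(perf_gain: float, core_gain: float) -> float:
    return perf_gain / core_gain

# 4090 vs 4080: ~+25% performance from ~+67% more CUDA cores (figures from the post).
print(f"4090 vs 4080: {scaling_efficiency(0.25, 0.67):.2f}")            # ~0.37

# 5090 vs 4090: claimed ~+50% performance from ~+33% more cores.
print(f"5090 vs 4090 (claimed): {scaling_efficiency(0.50, 0.33):.2f}")  # ~1.52

# Anything above 1.0 cannot come from core count alone; it would have to come
# from clocks, memory bandwidth (GDDR7 on a 512-bit bus) or architectural changes.
```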
 
Joined Dec 12, 2016
Messages 2,015 (0.68/day)
Are we basing this bet on Nvidia's numbers, or on TPU's? If TPU's, when does the card actually release?
Given that the Nvidia keynote showed only RT/AI/DLSS scores, we will have to wait for the TPU review, unless you really believe what Nvidia says and the 5090 is double the performance of the 4090, in which case you win and everyone else loses.
 
Joined Sep 26, 2022
Messages 232 (0.28/day)
No, the 5090 is not a "professional" card. It doesn't have fully enabled FP output, nor does it have ECC memory or pro driver support. It DOES have 32 GB of memory because it's got a 512-bit bus and the only other option would be 16 GB, which would have sent the internet into a hysterical frenzy. As a side benefit, you can tinker with LLMs and AI image generation on it. Yay!

If you can't afford or don't want a 5090, that's fine: go buy a 5070 and enjoy your game library. Or an RX 9070. Or an Intel B580. I swear, if people had whined like this a decade ago we never would have gotten anything larger than a GTX 550 Ti.

  • No, the 4090 already DOES have ECC, it's just not enabled in the control panel by default... no doubt the 5090 will have it too :). It most definitely can be a professional card; what you can do, professionally speaking, with a 5090 will be pretty advanced, just like it is today with the 4090... or will you tell me that the ECC on the 4090 was for gaming too?
  • The GDDR7 speed is very relevant and important here, but for me you've got it backwards: it's 512-bit because it has 32 GB of memory, just like the 24 GB of the 4090 (12 × 2 GB chips on 32-bit channels) makes it 384-bit (see the sketch at the end of this post). We are really not bandwidth-saturated on a 4090 at 4K; unless MFG is ultra intensive on bandwidth, I fail to see why... But my main point, which you kinda missed, is that we either get 16 GB and 256-bit on the 5080 (with a slightly faster memory clock to compensate, apparently) or 32 GB/512-bit on the 5090. It's a real shame that there's no in-between here, like a 24 GB/384-bit 5080 Ti or something. It's the lack of choice: 16 GB is not enough for me, it's not future-proof for 4K/High, yet 32 GB is too much for gaming, and I suspect it's aimed at AI and stock will be depleted because of it.

I don't purchase AMD except for CPUs. I plan on getting a 5090 pending independent reviews... because I don't have a choice VRAM-wise, which is exactly my point.
However, "a decade ago" we were not dealing with tons of scalpers just to get a damn Nvidia high-end card. Also, people whined at least as much in the past - I was quite young, but there was ATI vs Nvidia, the GTX 970 and its 3.5 GB of full-speed VRAM, etc. - yet we ended up with something slightly larger than a GTX 550 Ti :)...
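Since the 32 GB / 512-bit point comes up in both the quote and the reply, here is the capacity arithmetic sketched out, assuming 32-bit-wide GDDR devices, no clamshell, and today's 2 GB modules; the 3 GB GDDR7 option and the helper name are included only as hypotheticals:

```python
# VRAM capacity from bus width (illustrative; assumes 32-bit devices, no clamshell).

def vram_gb(bus_width_bits: int, gb_per_device: int) -> int:
    devices = bus_width_bits // 32          # one memory device per 32-bit channel
    return devices * gb_per_device

# Configurations discussed above, all with 2 GB devices.
for name, bus in (("4090", 384), ("5080", 256), ("5090", 512)):
    print(f"{name}: {bus}-bit -> {vram_gb(bus, 2)} GB")

# The missing middle ground the reply asks for would need either a 384-bit card
# with 2 GB devices, or denser 3 GB GDDR7 modules on a 256-bit bus.
print(f"384-bit w/ 2 GB devices: {vram_gb(384, 2)} GB; "
      f"256-bit w/ 3 GB devices: {vram_gb(256, 3)} GB")
```

Either of those hypothetical routes lands on the 24 GB tier the reply is asking for; whether such a card ever ships is a separate question.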
 