
Possible Specs of NVIDIA GeForce "Blackwell" GPU Lineup Leaked

I really wish NVIDIA had decided to increase the VRAM capacity and bus width over Ada. Not because more VRAM and a wider bus actually do anything for performance, but because it would at least stop Radeon fanboys crying about how NVIDIA is screwing buyers over. News flash: the 88% of people who own an NVIDIA GPU only feel screwed over by AMD's inability to compete.

99% of those 88% of NVIDIA owners don't have X080 or X090 cards. Most people have lower-tier cards, 50s, 60s, maybe a few 70s, where AMD is actually competitive and better value for the price. Most people just blindly buy NVIDIA without any research, because that's what they hear everywhere. Like people still believing Intel CPUs are the only thing to buy even today, because they live in a cave or something...

If you check modern games, they almost all use over 12 GB of VRAM at 4K. There is nothing to argue over here: Nvidia is limiting the VRAM to upsell people to a higher-tier, more expensive card and to limit how future-proof those cards are, just like when they release a new technology and limit it to only their newest series.
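For what it's worth, the usage claim is easy to sanity-check yourself. Here's a minimal Python sketch using the nvidia-ml-py NVML bindings (my choice of tooling, not something from this thread); keep in mind it reports VRAM *allocated*, which games routinely over-reserve, not VRAM strictly *required*:

```python
# Minimal sketch: log VRAM in use while a game runs, via the NVML
# bindings (pip install nvidia-ml-py). Caveat: this reports memory
# *allocated*, which games routinely over-reserve; it is not proof
# that a smaller card would actually stutter.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    for _ in range(60):  # one sample per second for a minute
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM used: {mem.used / 1024**3:.1f} / {mem.total / 1024**3:.1f} GiB")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```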
 
99% of those 88% of NVIDIA owners don't have X080 or X090 cards. Most people have lower-tier cards, 50s, 60s, maybe a few 70s, where AMD is actually competitive and better value for the price. Most people just blindly buy NVIDIA without any research, because that's what they hear everywhere. Like people still believing Intel CPUs are the only thing to buy even today, because they live in a cave or something...

If you check modern games, they almost all use over 12 GB of VRAM at 4K. There is nothing to argue over here: Nvidia is limiting the VRAM to upsell people to a higher-tier, more expensive card and to limit how future-proof those cards are, just like when they release a new technology and limit it to only their newest series.
So these lower tier cards most people have, would you also say they tend to have 4K monitors?

Seems most people have 1080p or 1440p...

[attached chart]


From TPU's latest 4070S review.
Alan Wake 2 RT @ 4K is what I would call an edge case.

[attached chart]


If you want to play native 4K RT, you'd best be buying a 4070 Ti S or better lol.
 
99% of those 88% of NVIDIA owners don't have X080 or X090 cards. Most people have lower-tier cards, 50s, 60s, maybe a few 70s
Yup.
The Steam hardware survey had the GTX 1650 as the most popular GPU of 2023, and it was finally overtaken this year by the 3060, a three-year-old GPU you can buy for $260, which is cheaper than even the cheapest 40-series offering.

AMD APUs and Intel iGPUs have a far higher market share than the 4070 Ti, 4080, and 4090, and all three of those high-end 40-series GPUs combined are still outnumbered by 2016's GTX 1060.

So these lower tier cards most people have, would you also say they tend to have 4K monitors?
One thing to remember is that good-quality 4K monitors start at under $185, with reviewer-recommended models starting from about $250, and a monitor purchase can easily last a decade.

Someone can get themselves a decent 4K monitor and the cost increase over a half-decent 1080p model may be under $100, which works out to an extra $10 a year over that decade.

Only pure gamers who need to spend every single penny of their budget on GPUs will compromise their monitor purchase to try to buy a higher-tier GPU. For anyone who also works, enjoys movies and TV shows, or just likes lots of desktop real estate, 4K monitors are both useful and extremely affordable. Nobody is forcing you to game at native resolution, and with DLSS/FSR/XeSS you can enjoy the benefits of a crisp, native-resolution UI whilst the game itself renders at a resolution your GPU can handle, whatever that may be.
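To put rough numbers on that last point, here's a small sketch of the internal resolutions the usual upscaler presets actually render at. The per-axis scale factors are the commonly cited values and are assumptions here; exact figures vary slightly between DLSS, FSR, and XeSS versions:

```python
# Rough sketch: internal render resolution per upscaler preset. Per-axis
# scale factors are the commonly cited values (assumed here); exact
# figures vary slightly between DLSS, FSR, and XeSS versions.

PRESETS = {
    "Quality":           2 / 3,   # 4K output renders at ~2560x1440
    "Balanced":          0.58,
    "Performance":       0.50,
    "Ultra Performance": 1 / 3,
}

def render_resolution(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    """Resolution the GPU actually renders before upscaling to the output."""
    scale = PRESETS[preset]
    return round(out_w * scale), round(out_h * scale)

for name in PRESETS:
    w, h = render_resolution(3840, 2160, name)
    print(f"4K output, {name}: renders {w}x{h}")
```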
 
Is a kilowatt PSU already needed for a single card?
 
32GB is really the minimum amount of memory the 5090 needs, TBH. The 24GB on my 4090 is barely enough to train LoRAs for SDXL, and running SDXL inference with IP-Adapters / LoRAs pushes it as well. 32GB isn't enough to fine-tune the main model, but at least it'll let the card keep up in other regards.

The lack of VRAM on cards lower in the stack would be expected but unfortunate. If I had to give people a tip, it would be that graphics have never been the deciding factor in whether a game is good, and frankly the returns are ever diminishing. It'd be one thing if we got actual AI in games (which would take up significant memory) that wasn't dumb as bricks, but slightly better-looking graphics aren't worth the poison pills Nvidia wants you to swallow with its VRAM on cards below the xx90 and their pricing.
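For anyone wondering why 24GB gets tight, here's a back-of-the-envelope sketch of where LoRA training memory goes. Every number in it (UNet size, adapter size, the activation figure) is an assumption for illustration, not a measurement:

```python
# Back-of-the-envelope sketch of SDXL LoRA training memory. All figures
# are illustrative assumptions: real usage depends on batch size,
# resolution, gradient checkpointing, text encoders, VAE, and framework
# overhead.

def gib(n_bytes: float) -> float:
    return n_bytes / 1024**3

base_params = 2.6e9   # assumed SDXL UNet parameter count
lora_params = 50e6    # assumed adapter size (depends on rank)

frozen_weights = base_params * 2            # fp16 base model, no gradients
adapter_states = lora_params * (2 + 2 + 8)  # fp16 weights + grads, fp32 Adam moments
activations    = 8e9                        # rough guess at 1024x1024, batch size 1

total = frozen_weights + adapter_states + activations
print(f"frozen base : {gib(frozen_weights):5.1f} GiB")
print(f"LoRA states : {gib(adapter_states):5.1f} GiB")
print(f"activations : {gib(activations):5.1f} GiB (rough)")
print(f"total       : {gib(total):5.1f} GiB, before CUDA context and everything else")
```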
 
The GB203's 256-bit bus & 10,752 CUDA cores look barely better than the 4080's specs, and we all know more cache is not a silver bullet. It could be that the 5080 will be no faster, or even a bit slower, than the 4090 in rasterization at 4K and above. No competition = no progress :banghead:
 
If the 5080 is what it takes to match a 4090 then I give up.
I'm not dropping over a grand on that crap and still buying a 4K monitor...
 
32GB is really the minimum amount of memory the 5090 needs, TBH. The 24GB on my 4090 is barely enough to train LoRAs for SDXL, and running SDXL inference with IP-Adapters / LoRAs pushes it as well. 32GB isn't enough to fine-tune the main model, but at least it'll let the card keep up in other regards.

The lack of VRAM on cards lower in the stack would be expected but unfortunate. If I had to give people a tip, it would be that graphics have never been the deciding factor in whether a game is good, and frankly the returns are ever diminishing. It'd be one thing if we got actual AI in games (which would take up significant memory) that wasn't dumb as bricks, but slightly better-looking graphics aren't worth the poison pills Nvidia wants you to swallow with its VRAM on cards below the xx90 and their pricing.

Yes, but those are not typical gaming loads, this is something that you would want from a workstation card, not a GeForce.

I agree that the VRAM on the lower models is not enough, and the GB203 specs are too low.
 
You'd be surprised how often I've heard "Aaaaand AMD display driver just crashed" from my buddy rocking a 6600 XT on a new AM5 system while playing the same game online.
And you'd be surprised that when my brother was running his 5700 XT and I was on my 3080, playing co-op, my game would crash to desktop while his wouldn't...

What's your point?
 
Only the "5090" may be worth buying.
But I'm planning to skip it.
 
And you'd be surprised that when my brother was running his 5700 XT and I was on my 3080, playing co-op, my game would crash to desktop while his wouldn't...

What's your point?

Maybe the older RDNA 2 cards have some kind of issue running on the new AM5 platform... Who knows.

Jesus, GB206 (presumably the 5060 and 5060 Ti) is still 128-bit and still potentially 8GB. Nvidia are expecting GDDR7 to do some heavy lifting there, and they've already complained about Micron GDDR7 not living up to claims, whilst Micron themselves aren't shouting their success from the rooftops.
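On paper the heavy lifting is at least plausible. A quick sketch of the bandwidth math, with ballpark per-pin data rates that are assumptions rather than confirmed SKU speeds:

```python
# Quick sketch: peak memory bandwidth from bus width and per-pin data rate.
# Data rates below are assumed ballpark figures (GDDR6 ~18 Gbps, GDDR7
# ~28 Gbps at launch), not confirmed SKU speeds.

def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    """Peak bandwidth in GB/s: bus width in bits x Gbps per pin, over 8 bits/byte."""
    return bus_bits * gbps_per_pin / 8

print(f"128-bit GDDR6 @ 18 Gbps: {bandwidth_gb_s(128, 18):.0f} GB/s")  # 288
print(f"128-bit GDDR7 @ 28 Gbps: {bandwidth_gb_s(128, 28):.0f} GB/s")  # 448
print(f"256-bit GDDR7 @ 28 Gbps: {bandwidth_gb_s(256, 28):.0f} GB/s")  # 896
```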

IMO, there's no reason for these three variants to exist at all. I am 100% certain Nvidia will screw everyone on VRAM anyway, as part of the upsell:
[attached spec table]

The 5090 has 50% more compute than a 4090, so it'll cost $3000. Nvidia will probably say it's 60% faster, which is how they'll justify the asking price. It's irrelevant anyway, because China will buy them all for AI, and demand will outstrip supply until something better than the 5090 comes along for AI datacenter racks.

Err, nope.
GB202 = 28GB
GB203 = 24GB
GB205 = 20GB
GB206 = 16GB
GB207 = 12GB

This is ok now.
 
  • The internet myth that AMD has fundamental driver problems and Nvidia doesn't
Man, doesn't it just suck when an Internet myth breaks my display after updating my RX 580's drivers a couple of years ago. I sure hope this myth gets relegated to an old wives' tale sometime soon; that might mean this non-existent problem affects even fewer fictional users.
 
The GB203's 256-bit bus & 10,752 CUDA cores look barely better than the 4080's specs, and we all know more cache is not a silver bullet. It could be that the 5080 will be no faster, or even a bit slower, than the 4090 in rasterization at 4K and above. No competition = no progress :banghead:
GDDR7. So these cards are def moving up a little bit. Emphasis on little, I guess

Man, doesn't it just suck when an Internet myth breaks my display after updating my RX 580's drivers a couple of years ago. I sure hope this myth gets relegated to an old wives' tale sometime soon; that might mean this non-existent problem affects even fewer fictional users.
Pics or it didn't happen ;) A driver... for a GPU, breaking a display?
 
Man, doesn't it just suck when an Internet myth breaks my display after updating my RX 580's drivers a couple of years ago. I sure hope this myth gets relegated to an old wives' tale sometime soon; that might mean this non-existent problem affects even fewer fictional users.
Why does everyone think that if their computer crashes, every computer in the world is also crashing? I'm sorry to hear about your problems. I hope you find a solution on the web or through some third party. Please do not forget that AMD has sold millions and millions of GPUs through discrete cards, iGPUs, consoles, smartphones, gambling machines, etc. There are no reported widespread outages because if there were, it would be breaking news on major media platforms. Your issues are yours. Not mine and not others.
 
Err, nope.
GB202 = 28GB
GB203 = 24GB
GB205 = 20GB
GB206 = 16GB
GB207 = 12GB
Ehm, nope? The first generation of GDDR7 will only have the same 16Gb module capacity that GDDR6(X) has now; 24Gb modules will only come later. We might see GPUs with 1.5x the VRAM later in Blackwell's life cycle or with a refresh, but certainly not from the start. So GB203 = 16GB max, or 32GB via clamshell.
 
I can't see the full GB203 being more than 15% faster than a 4080 Super in raster, and that's barely worthy of a 5070 Ti moniker
 
So the RTX 5090 can run Senua's Saga: Hellblade II at 8K max settings with no DLSS and achieve an average of 30 FPS?
 
Why does everyone think that if their computer crashes, every computer in the world is also crashing? I'm sorry to hear about your problems. I hope you find a solution on the web or through some third party. Please do not forget that AMD has sold millions and millions of GPUs through discrete cards, iGPUs, consoles, smartphones, gambling machines, etc. There are no reported widespread outages because if there were, it would be breaking news on major media platforms. Your issues are yours. Not mine and not others.
Never did I claim that all AMD products immediately catch fire and kick the nearest puppy, as you imply. You can read; you know this; there is no reason to exaggerate.
It is, however, curious that some people consider AMD drivers' bad reputation to be due to a "myth" when the volume of complaints is visibly higher than among NVIDIA users. I switched to NVIDIA shortly after and have had no such episode since. Such a stark difference might sound ridiculous, but yeah, not once have I had a significant issue after updating NVIDIA drivers.

This is my personal experience, and I notice more people complaining about AMD drivers than about NVIDIA's. Do with that information what you will.

Pics or it didn't happen ;) A driver... for a GPU, breaking a display?
Would that I could. As said, this was years ago, back in the RTX 20 era; no pics now. The monitor didn't explode, but there was no image displayed anymore. It was a frustrating experience figuring out what was going on, but I eventually switched to the iGPU and was able to roll back or uninstall the latest Radeon drivers; I don't remember which was the solution.
 
The 5080 is going to be about 10-20% short of a 4090, game dependent. This is on late beta hardware. 5080s will start at roughly £1250 and 5090s at about £1900. There is also a new DLSS version coming out, but it may not be ready for launch. Keep it quiet though.
 
Would that I could. As said, this was years ago, back in the RTX 20 era; no pics now. The monitor didn't explode, but there was no image displayed anymore. It was a frustrating experience figuring out what was going on, but I eventually switched to the iGPU and was able to roll back or uninstall the latest Radeon drivers; I don't remember which was the solution.

That doesn't sound like the drivers broke your display. If you moved to the iGPU and the display worked, then your display wasn't broken. Sounds like a driver issue for you.

Don't feel bad; everyone has had some kind of annoyance, from minor to major driver issues, on Nvidia's or AMD's side over their lifetime of using them. Anyone who says otherwise is lying, because, as much as some people would like to believe otherwise, not every driver works with every hardware/software configuration flawlessly.

I've had issues with Nvidia drivers that:
  • broke video playback (nothing would play, just a black screen)
  • caused stuttering
  • caused flickering of images
  • prevented secondary and tertiary monitors from waking after sleep (the image wouldn't come back on them; I had to power-cycle the computer to get it back)
  • caused all blacks (shadows) in games to flicker green
  • caused instability with SLI configurations
  • gave me random driver failures with crashes to desktop
I learned some time ago that if a driver works and I don't get issues playing games, I have zero reason to update it. There are times I run a driver for a year or more before I have to update, because I'm reinstalling Windows, swapping GPUs for whatever reason, or a new game I'm playing just won't run well without updated drivers.
 
Ehm, nope? The first generation of GDDR7 will only have the same 16Gb module capacity that GDDR6(X) has now; 24Gb modules will only come later. We might see GPUs with 1.5x the VRAM later in Blackwell's life cycle or with a refresh, but certainly not from the start. So GB203 = 16GB max, or 32GB via clamshell.

12 GB VRAM = 6 x 2GB chips.
16 GB VRAM = 8 x 2GB chips.
20 GB VRAM = 10 x 2GB chips.
24 GB VRAM = 12 x 2GB chips.
28 GB VRAM = 7 x 2 x 2GB via clamshell.
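Same math as a rule: one module per 32-bit controller, two in clamshell. A minimal sketch, assuming 16Gb (2GB) GDDR7 modules as the baseline; the die examples in the comments follow the rumored bus widths in this thread:

```python
# Minimal sketch of the capacity math above: one memory module per 32-bit
# controller, or two per controller in clamshell mode. Assumes 16Gb (2 GB)
# GDDR7 modules as the baseline.

def vram_gb(bus_bits: int, gb_per_module: int = 2, clamshell: bool = False) -> int:
    """VRAM in GB for a given bus width and module density."""
    controllers = bus_bits // 32           # one 32-bit controller per module
    return controllers * gb_per_module * (2 if clamshell else 1)

print(vram_gb(512))                        # 32 GB, e.g. a full 512-bit GB202
print(vram_gb(256))                        # 16 GB, e.g. GB203
print(vram_gb(256, clamshell=True))        # 32 GB via clamshell
print(vram_gb(128))                        #  8 GB, e.g. GB206
print(vram_gb(256, gb_per_module=3))       # 24 GB once 24Gb modules arrive
```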
 
@ARF : Yeah, I know. GB203 will be 8x32-bit, so 16GB. 20/24/28GB without clamshell will require GB202.

Edit: not 6x32-bit but 8x32-bit. I hope...
 
That doesn't sound like the drivers broke your display. If you moved to the iGPU and the display worked, then your display wasn't broken. Sounds like a driver issue for you.

I learned some time ago that if a driver works and I don't get issues playing games, I have zero reason to update it. There are times I run a driver for a year or more before I have to update, because I'm reinstalling Windows, swapping GPUs for whatever reason, or a new game I'm playing just won't run well without updated drivers.
I used to do that too, but games get "day 1 driver" releases so often that I'm just always updated now. Might go on another abstinence streak now that you mention it.

Dunno what the correct term is, but the screen was turned on, just showing black. No image. Maybe "the display broke" isn't accurate, but that's what I meant.
 