
NVIDIA Blackwell GeForce RTX 50 Series Opens New World of AI Computer Graphics

I still don't care about the $1k+ segment, but 5070 for $549 sounds pretty good. Of course, I need the benchmarks to go with that.
 
If all the server AI cards are really sold out for 2025, then these "Gaming" cards will also be grabbed by the AI / machine learning / neural network companies, leaving gamers to dream about $2000 RTX 5090 and $1000 RTX 5080...
Most of the X090 cards were grabbed up for AI, ML, DL, and NN work, as that's what they're designed for. I ditched my Titans and gave them to someone else when the 3090 hit, did the same with the 4090, and will do the same with the 5090. The person I gave them to didn't game on them either, and when he passed them on to his kid brother, who's in school, they weren't used for gaming.

It's been a given for all of these generations that the majority of these cards, even the Gigabyte and ASUS models, are not going to gaming. They're going to CUDA work, as they were designed for. And those of us doing that work don't have an issue paying inflated prices.

Maybe stick to AMD and don't buy CUDA products?
 
Most of the X090 cards were grabbed up for AI, ML, DL, NN work as that's what they are designed for.
Yeah, the only person I know who immediately wants the xx90 cards when they come out, for new builds or upgrades, is my friend. He runs a multi-million-dollar amphitheater design company and does huge CAD projects for initial design concepts/simulations, plus hours' worth of rendering. Quadro cards are actually slower for him, and the company that makes the CAD suite he uses doesn't recommend Quadro anyway, so we always go for the highest GeForce card with the largest frame buffer.

Everyone else I build rigs for who's gaming gets xx80 or below. More often than not it's mid-range cards like the xx60/xx70 and their AMD counterparts.

I stick by the idea that the xx90 really aren't here for gamers.
 
I stick by the idea that the xx90 really aren't here for gamers.
NVIDIA would make it easier on everyone if they brought back the Titan naming for the current 90-tier cards and left the 90 as a filler between the 80 and the Titan lineup.
 
NVIDIA would make it easier on everyone if they brought back the Titan naming for the current 90-tier cards and left the 90 as a filler between the 80 and the Titan lineup.

I agree. I liked the Titan naming for that reason
 
NVIDIA would make it easier on everyone if they brought back the Titan naming for the current 90-tier cards and left the 90 as a filler between the 80 and the Titan lineup.
Nah. People forget, but the Titans were neither here nor there. They were kind of marketed as professional cards, but they weren't Quadros, and they ran on the regular GeForce drivers. Nvidia kind of painted themselves into a corner: they had to make the "Titan" go away, so they introduced the "x90".
 
They were kind of marketed as professional cards, but they weren't Quadros.
I mean, that's precisely why I dislike the 90's not being called Titans.
 
I mean, that's precisely why I dislike the 90's not being called Titans.
Yeah, it's still a bastard of a card. But "Titan" was probably costing them money in terms of marketing or product support.
 
Not impressed at all. Looks like once all the smoke and mirrors of the new AI gimmicks clear, the 5090 is barely 20-30% faster in raster than the two-year-old 4090. Considering the price increase, I'll be honest -- I'd rather they had priced it at $2500 and made the 5090 twice as fast in raster over the 4090 compared to what we got; that would at least make the upfront cost worth it to be on the cutting edge for a few years. A measly 30% more doesn't even come anywhere near good enough to run UE5 games like Silent Hill 2 Remake at native 4K 60FPS, let alone 120FPS.
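The frame-rate arithmetic is easy to sketch. The baseline FPS below is a hypothetical figure for illustration, not a measured benchmark; only the ~30% uplift comes from the post:

```python
# Does a ~30% raster uplift reach native 4K 60 FPS in a heavy UE5 scene?
# baseline_fps is an assumed figure for the outgoing flagship, not measured data.
baseline_fps = 42.0          # assumed: previous-gen flagship, native 4K, heavy UE5 title
uplift = 1.30                # the ~30% generational raster gain cited above

new_fps = baseline_fps * uplift
print(f"{new_fps:.1f} FPS")  # 54.6 FPS: still short of a locked 60

# How big an uplift would a locked 60 FPS actually require from that baseline?
required = 60.0 / baseline_fps
print(f"{(required - 1) * 100:.0f}% uplift needed")  # 43% uplift needed
```

With those assumed numbers, even the full 30% gain leaves the card short of 60 FPS; the shortfall only grows if you target 120FPS.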

And with the 5080 now separated from the 5090 by a much larger $1000 gap, and still handicapped by a pathetic 16GB (a mere 4GB more than my ancient 10-year-old Titan X Maxwell, which didn't run even a quarter of today's VRAM-hogging AI gimmicks and cost considerably less), this entire generation seems like a huge "meh" for anything besides running local LLMs. I'll reserve my judgement until the reviews are out -- but from what I've seen so far, the only redeeming feature is the return to dual-slot flagship cards (from the 3-4 slot behemoths we've had for the past several generations).
 
I wonder if Blackwell can do lower voltage than Ada (~0.9 V). Considering the increased TDP, more aggressive undervolting would be welcome. The 5070 Ti looks like the most attractive card in this line-up, but 300 W is a lot of heat.
 
NVIDIA would make it easier on everyone if they brought back the Titan naming for the current 90-tier cards and left the 90 as a filler between the 80 and the Titan lineup.
I'd agree but that didn't stop the issue.

Nvidia flat out fucking said what the Titans were for. It did not stop Linus and other YouTube influencers from gaming on them and convincing idiot gamers to buy them. It did not stop system integrators from putting them in gaming machines and selling them as "SLI on a single GPU, in ITX form factor!". It did not stop gamers from buying the Titan V, a series that never released as a normal GTX, and posting e-peen benchmarks on forums.

Given all this, the issue isn't the naming. The issue isn't the marketing. The issue isn't Nvidia. The issue is PC gamers themselves. But that's the one group, the gamers, that will never assume even remote responsibility for the situation that was created. They're the most spoiled fucking brats of the bunch, even worse than Leather Jacket himself! And until they admit that, until they change their actions, the situation will remain what it is.

Money talks and bullshit walks, and I'm old enough to have seen it all. But what the young spoiled crowd forgets is that PC gamers have killed off pro standards before... and then never gotten them back. The best example is NVMe drives: U.2 and add-in cards slaughtered, and still slaughter, M.2 drives, but they also cost real money. There are even M.2 specs that never came to gamer boards. Gamers didn't want them, so they went away on gaming boards, and now of course gamers don't get those products, but the result is the result. The way to stop this is: don't buy the damn 5090s. Let them go away completely, relegated to higher prices and the workstation market, and they will cease to exist for gaming and you will never have to worry about them again!

When the Titan hit, the selling point was NOT gaming. It was a bunch of HPC shit it enabled on the cheap. It was YouTube that sold it to gamers, and Nvidia responded when they realized idiots part easily with their money... and here we are. The Titan naming did squat.
 
@SOAREVERSOR See my post above: while Nvidia said the Titan was a compute/pro card, it still ran on gaming drivers. It was a little confusing having Titan in the lineup; it's only slightly less confusing having the overpriced x90 instead.
To me it was always simple: the high-end stopped at x80. Anything above was and still is a "let's see how much money you will throw at a GPU" thing.
 
To me it was always simple: the high-end stopped at x80. Anything above was and still is a "let's see how much money you will throw at a GPU" thing.

The problem is that Nvidia completely changed the ranks of what counts as low, mid, high, and ridiculous range.

Right now we only have an announcement of graphics cards at $550, $750, $1000, and $2000, with no clear date for the rest of the range. And based on the previous generation, there won't be much focus there anyway. So the PC Glorious Master Race now starts above the price of a gaming console, the PlayStation 5, and that's just for the GPU, far from a whole system?

And sure, Nvidia focuses a lot of their marketing on how the x90 cards are made for actual work. But at the same time they waste no opportunity to present it as the top of the gaming range, call it "a beast", and exaggerate how much faster it is than the rest of the lineup…

And by cutting the x80 down further and further (we're now at roughly 50% of the RTX 5090), we even get into ridiculous situations like with the RTX 4080 and 4090, where reviewers concluded that the RTX 4080 was bad value compared to the previous generation (for 50% more cash you got less than 50% improvement), yet the value somehow got better if you stretched your spending from $1200 to $1600?
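That value paradox can be put into rough numbers. Only the $1200 and $1600 MSRPs come from the post above; the performance indices are illustrative assumptions, not benchmark results:

```python
# Illustrative price/performance comparison. Performance is an assumed index
# (previous-gen x80 = 100), NOT measured data; MSRPs for the 4080/4090 are
# the figures cited in the post.
cards = {
    "prev-gen x80 ($700)": (700, 100),   # assumed baseline index
    "RTX 4080 ($1200)":    (1200, 145),  # assumed: <50% faster for ~71% more cash
    "RTX 4090 ($1600)":    (1600, 200),  # assumed: big enough jump to beat the 4080 per dollar
}

for name, (price, perf) in cards.items():
    print(f"{name}: {perf / price * 100:.1f} perf per $100")
```

With those assumed indices the 4080 lands well below the old x80 in performance per dollar, while the 4090 edges it back out, which is exactly the odd "spend more for better value" conclusion the reviewers reached.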

What will be the conclusion this time, if the RTX 5080 really barely outpaces the RTX 4080 in raster while the RTX 5090 offers a clear upgrade, being the only card in the lineup with more than a 50% performance increase over the previous generation? "Don't bother with the lower end"?

Nvidia knows how to play their game.
 
*sniff* *sniff* Is it just me, or can anybody else smell an NZXT Flex no-strings (aka low-strings) attached offering here?

The problem is Nvidia completely changed the ranks of what is low, mid, high, and ridiculous range.
Give it 8 weeks, and I bet both my left hands that NZXT is gonna come up with some very flexible plan with Nvidia, for everyone (NO STRINGS ATTACHED!)

The problem is Nvidia completely changed the ranks of what is low, mid, high, and ridiculous range.
Thing is, at the end of the day, what are you gonna do? Complain to Donald? Complain to Jesus? Nope, you're gonna chew on it, and then you're gonna pay for it or pretend AMD is the better thing.
 