
NVIDIA: RTX 3090 Performance 10-15% Higher Than RTX 3080 in 4K

3090 is clearly meant for gaming first and everything else after. If it truly was a Titan replacement they would talk about AI, rendering, ML and so on, not DLSS...

No, the 3090 is clearly being marketed for gaming. Marketing and reality are two very different things. Why are you taking Nvidia's marketing as gospel? It's absolutely meaningless what they tell you - of course they want to get more money from you, of course they want to make you believe you need the 3090 for gaming. That's how marketing, sales and profits work. Just because they say it's a gaming card doesn't mean that it is. Just like the Titan was never the 'professional' product; the real professional SKU has always been the Quadro line. And the argument 'but if it was a Titan replacement, it would be called a Titan' doesn't really work here, because Nvidia is clearly making changes to its product stack.

First, there was no xx90 SKU for the 1000 and 2000 series cards. Second, the Titan branding failed. I have contacts in the PC hardware market, and trust me, those cards were almost impossible to move. It was a low-volume, low-sales product, even for a halo one. But you know how they can move more pieces of their halo product? By shifting it to be the top-end 'gaming' SKU, so they can sell it to gullible 1337 g4m3rs, who eat up the PR department's every word, because Nvidia told them this was the 1337-est card, and they must have t3h b3st!!!!111

Or you can just... like... not believe Nvidia's marketing and make your own purchasing decisions and conclusions? At the end of the day, Nvidia is a business and they DON'T have your financial interest in mind, only their own, so don't fall for their bulls**t. It's as simple as that. This is why independent reviewers, like this exact website, exist - so they can tell you what you're actually getting for your money.
 
If the 3090's performance is only 10-15% higher than the 3080's, then we can very much expect the 3080 20G to outperform the 3080 10G by less than 8%.

I am hearing this a lot. Since when does more VRAM give more performance?!? Do you get more CPU performance when you use 64GB instead of 32 or 16???
 
I am hearing this a lot. Since when does more VRAM give more performance?!? Do you get more CPU performance when you use 64GB instead of 32 or 16???
That's why I said it'll be less than 8%...

I am hearing this a lot. Since when does more VRAM give more performance?!? Do you get more CPU performance when you use 64GB instead of 32 or 16???
Also, by saying 'less than 8%' I meant to say: I am not impressed with the 20G variant.
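For what it's worth, here's a back-of-the-envelope sketch of that reasoning (the core counts are from the public spec sheets; the 12.5% figure is just the midpoint of the quoted 10-15% range):

```python
# The 3090 has ~20% more CUDA cores AND 14 GB more VRAM than the 3080,
# yet only gains 10-15% at 4K. If cores + memory together can't even
# match pure core scaling, a memory-only 20G bump should gain far less.
cores_3080, cores_3090 = 8704, 10496

core_advantage = cores_3090 / cores_3080 - 1   # ~0.206, i.e. 20.6% more cores
measured_gain = 0.125                          # midpoint of the quoted 10-15%

print(f"Core advantage:   {core_advantage:.1%}")
print(f"Measured 4K gain: {measured_gain:.1%}")
# Measured gain < core advantage, so the extra VRAM is contributing
# little to nothing -- hence expecting well under 8% from a 20G 3080.
```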
 
If the 3090's performance is only 10-15% higher than the 3080's, then we can very much expect the 3080 20G to outperform the 3080 10G by less than 8%.
Why do you think that bumping the RAM capacity (and only the RAM capacity) will make the card outperform the exact same chip, when there isn't a game that requires more?
It's like saying that a PC with 16GB of RAM will run Battlefield V slower than the exact same PC with 32GB of RAM.
 
Why do you think that bumping the RAM capacity (and only the RAM capacity) will make the card outperform the exact same chip, when there isn't a game that requires more?
It's like saying that a PC with 16GB of RAM will run Battlefield V slower than the exact same PC with 32GB of RAM.
Future proof? :)
 
I'm just wondering, is it truly getting "Pro" drivers? Is it REALLY this gen's Titan? Where's it going to land on a chart like this?

Notice how the Titan RTX ($2500) is right up there with the RTX 4000 ($800) and a good margin ahead of the 2080 Ti ($1000).

If this card slots in like that in the current lineup of Ampere GeForces & Quadros, and is at the top of the charts in 4K gaming, $1500 scales VERY well vs me having to buy a GeForce AND a Quadro.

It's just, does it get certified driver support? I doubt it will for CAD apps like the Quadros do.

[Chart: Dassault Systèmes SolidWorks 4K viewport performance, NVIDIA TITAN RTX]
 
Future proof? :)
Could be, but in my opinion, when that "future" is here, the 3080 will only be enough for 1080p or 1440p due to insufficient performance, and you won't need 20GB of VRAM at those resolutions.
Honestly, I've never seen a game that needed more than an 8GB card to run.
 
The 3080, 3090 & 3070 are not impressive at all. Not to mention they are way overpriced and have a high black screen failure rate.
If you can find one to buy given the limited supply.

 
Could be, but in my opinion, when that "future" is here, the 3080 will only be enough for 1080p or 1440p due to insufficient performance, and you won't need 20GB of VRAM at those resolutions.
Honestly, I've never seen a game that needed more than an 8GB card to run.
Well, I've already ordered the Gigabyte Vision. Personally, I couldn't care less about the 20G variant.
 
Guys, stop arguing about whether the 3090 is a Titan or not. It's neither a Titan nor a gaming card, or it's both, depending on how you want to look at it.

It's a Titan through flagship performance ("Titan class"), but it's not a Titan due to its very high TDP (Titans are usually 250W).

And Nvidia markets it to whoever wants to pay for it: gamers, pro users, miners, scalpers, whatever.
 
Could be, but in my opinion, when that "future" is here, the 3080 will only be enough for 1080p or 1440p due to insufficient performance, and you won't need 20GB of VRAM at those resolutions.
Honestly, I've never seen a game that needed more than an 8GB card to run.
Per your reasoning, I'm now at peace with my purchase:)
 
The 3080, 3090 & 3070 are not impressive at all. Not to mention they are way overpriced and have a high black screen failure rate.
If you can find one to buy given the limited supply.

I can bet that this failure is down to insufficient power delivery from the PSU itself, at least for most of them.

Per your reasoning, I'm now at peace with my purchase:)
I'm not saying it will never change, because it will, but I'm leaning more toward "later" than what people say.
Besides, look at a lot of the so-called proof people here are giving: 1080p uses 9GB of VRAM on a card that has 11GB available, but then you move to 1440p with the same game and it still uses around 9GB, even though the resolution has changed.
For instance, Battlefield V:
Here you have a 2080 Super at 4K Ultra.
Here you have a 2070 Super at 4K Ultra with DXR enabled.
Both use around 7GB. Some games just load as many textures as they have memory available. It doesn't mean they need that much VRAM.
Here you have a 2080 Ti running Battlefield V with DXR enabled. VRAM usage is around 9GB (more like 8.5GB mostly), but the 2080 Ti has 11GB available. Does that mean the game needs at least 9GB, or is it using that much because more is available? You know the answer, since the 2080 and 2070 run the game at 4K Ultra with no problem, not to mention the 2070 Super runs it at 4K with DXR on. If you reduce the settings, it's only for the frame rate.
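If you want to check those counters yourself, here's a minimal sketch using the NVML Python bindings (`pynvml` from the nvidia-ml-py package; assumes the card in question is GPU index 0):

```python
# Read the driver's VRAM counters while a game is running.
# Note: "used" is what's *allocated*, not what the game strictly needs;
# many engines cache textures up to whatever memory is available.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)

print(f"Used:  {mem.used / 2**30:.1f} GiB")
print(f"Total: {mem.total / 2**30:.1f} GiB")

pynvml.nvmlShutdown()
```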

Enjoy your card, dude. :) If the 20GB 3080 comes out and that means the 3080 10GB drops in price, I might buy it, since it's been a while since I've had an NV card :)
 
Thanks, bots. You saved NVIDIA. This makes the RTX 3090 irrelevant, and I guess these cards will have very limited supply/stock. :laugh:

 
When you go to the official product page for the 3090, you are greeted by the pic I just posted.
And then as you go down, the feature list is:
DLSS - gaming
Raytracing - gaming
8K HDR gaming - well that one is clear
Victory measured in milliseconds - gaming
And after all of those comes "creativity" part: "Take your creative projects to a new level with GeForce RTX 30 Series GPUs. Delivering AI-acceleration in top creative apps. Backed by the NVIDIA Studio platform of dedicated drivers and exclusive tools. And built to perform in record time. Whether you’re rendering complex 3D scenes, editing 8K video, or livestreaming with the best encoding and image quality, GeForce RTX GPUs give you the performance to create your best."

If you go to the TITAN RTX product page the feature list is:
TITAN RTX powers AI, machine learning, and creative workflows.
Start AI Development Now
Accelerate Data Science
Power Your Creativity
And the final one is NVLink, and it says:
"The result is an effective doubling of memory capacity to 48 GB, so that you can train neural networks faster, process even larger datasets, and work with some of the biggest rendering models."
Not once are the words game or gaming mentioned.
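As an aside on that NVLink claim: the "48 GB" isn't one automatic pool; the framework still has to split the work across the two cards explicitly. A minimal PyTorch model-parallel sketch of what that looks like (layer sizes made up for illustration; requires two CUDA devices):

```python
# Split a model across two GPUs so its weights can exceed one card's VRAM.
# NVLink speeds up the activation hop between them; it doesn't merge memory.
import torch
import torch.nn as nn

class SplitModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.front = nn.Linear(4096, 4096).to("cuda:0")  # first half on GPU 0
        self.back = nn.Linear(4096, 10).to("cuda:1")     # second half on GPU 1

    def forward(self, x):
        x = torch.relu(self.front(x.to("cuda:0")))
        return self.back(x.to("cuda:1"))                 # activations cross the link

model = SplitModel()
out = model(torch.randn(8, 4096))
print(out.shape)                                         # torch.Size([8, 10])
```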

3090 is clearly meant for gaming first and everything else after. If it truly was a Titan replacement they would talk about AI, rendering, ML and so on, not DLSS...
Zzzzzzzzzzzzzzzzzzzzz.... for 8K HDR gaming.

After/before... who cares... did you see what I linked?

It's a confusing message, for some, if the word Titan isn't in the name. People don't like to be outside of their box. I've linked a page where it says otherwise... take that for what it's worth... marketed towards both, perhaps?


Guys, stop arguing about whether the 3090 is a Titan or not. It's neither a Titan nor a gaming card, or it's both, depending on how you want to look at it.

It's a Titan through flagship performance ("Titan class"), but it's not a Titan due to its very high TDP (Titans are usually 250W).

And Nvidia markets it to whoever wants to pay for it: gamers, pro users, miners, scalpers, whatever.
NO SHIT?????!!!!

Except that the Titan classification has NOTHING to do with TDP.
 
I wish everyone finding the 3090 too expensive good luck (expecting a 3080 20GB to be 800 bucks) :laugh:
 
Just to update on the point I was making earlier about the Gaming + CAD benefits of the 3090, and why $1500 is a steal: it's beating an RTX 6000 in production applications, and those ring up at $4000, while also stomping all over the 2080 Ti in gaming. I've yet to find an answer on certified driver support. Oh well.

Benchies over at IgorsLab
 
Just to update on the point I was making earlier about the Gaming + CAD benefits of the 3090, and why $1500 is a steal: it's beating an RTX 6000 in production applications, and those ring up at $4000, while also stomping all over the 2080 Ti in gaming. I've yet to find an answer on certified driver support. Oh well.

Benchies over at IgorsLab
See Linus' review: no CAD support, only for content creators:

 
I don't understand the point of this card. It's marginally faster than the 3080, yet costs more than twice as much.

If anyone has $1500 they're going to dump on this card, you can just give me about half and spend the remainder on a 3080. You'll feel better about how you've spent the money and I'll feel better about keeping you from wasting $1500 on it. In the end, we both win!
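Just to put the value argument in numbers (a rough sketch assuming launch MSRPs of $699 and $1499, and the ~10-15% uplift quoted in the article):

```python
# Relative performance per dollar under assumed MSRPs and the quoted uplift.
price_3080, price_3090 = 699, 1499
uplift = 0.125                       # midpoint of the quoted 10-15%

value_3080 = 1.0 / price_3080        # normalized perf per dollar
value_3090 = (1.0 + uplift) / price_3090

print(f"3090 value vs 3080: {value_3090 / value_3080:.0%}")
# -> roughly 52%, i.e. about half the performance per dollar
```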
 
I don't understand the point of this card. It's marginally faster than the 3080, yet costs more than twice as much.

If anyone has $1500 they're going to dump on this card, you can just give me about half and spend the remainder on a 3080. You'll feel better about how you've spent the money and I'll feel better about keeping you from wasting $1500 on it. In the end, we both win!
Because it isn't just for gaming....

God DAMN did Nvidia bork the naming by not calling this a Titan... sheesh.
 
I think Jay has all the right things to say about the 3090:

If you can afford one and want a workstation card that doesn't cost an arm & a leg and/or a gaming card with the ultimate in performance, get one. Otherwise the pointless whining is pointless.
 
Meh, the performance increase is par for the course; there have always been diminishing returns the higher up the stack you go. While I'm not going to fully defend the professional aspects of this card, the price probably also has to do with the bin rate for these chips. They know they'd sell more if they had them, so the fact that only a small number were available probably means a chip at this level is harder to get. Does that justify the price? I don't know, but I can at least say the rest of the lineup isn't nearly as bad as it was last generation, so it's a bit of a trade-off.
 
You're ignoring the fact it comes with 24 GB VRAM and SLI support. It's not really designed for gamers.
Yet it lacks Quadro features, so it does not accelerate pro software... So should we also ignore the upcoming 3080 Ti, which is essentially this card, for $500 less?

The 3090 is a waste of effort.
 
See Linus' review: no CAD support, only for content creators:


Shitty. I was hopeful. Guess I'll wait for this gen's Quadro RTX 4000 and call it a day.
 
Yet it lacks Quadro features, so it does not accelerate pro software...

The 3090 is a waste of effort.
That doesn't mean it's useless. As benchmarks from Linus, Jay, and a number of other review sites have shown, the 3090 is a godsend for most productivity/rendering software. Yes, there are a few that don't work well, but those are the exceptions rather than the rule. There will be a Quadro version of Ampere and it will be fully featured, but it will also have a price tag that only the professionals who need that level of compute can afford and will pay.

The 3090 is the ultimate in gaming performance for this gen of GPUs. If you want the ultimate, the best of the best, the 3090 is it. It's only a waste for those who can't afford it.

So should we also ignore the upcoming 3080 Ti, which is essentially this card, for $500 less?
It's not going to be $500 less, and it likely will not be coming until next year, if at all.
 