
ASUS TUF Gaming GeForce RTX 3090 Ti Box Pictured

The miserable 10 GB on the RTX 3080 already decreases performance, forcing Nvidia into driver shenanigans that force lower-resolution textures in games.

RTX 3080 VRAM usage warnings and the issue with VRAM pool sizes: the compromise of 4K gaming | ResetEra

Far Cry 6 needs more VRAM than the Nvidia RTX 3080 has to load HD textures | PCGamesN
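
For a rough sense of why 10 GB gets tight at 4K with HD texture packs, here's a back-of-envelope VRAM budget. Every number in it (texture pool size, render-target count, overhead allowance) is an illustrative assumption, not a figure from the linked articles:

```python
# Back-of-envelope VRAM budget at 4K -- purely illustrative numbers.

BYTES_PER_GIB = 1024 ** 3

def texture_bytes(width, height, bytes_per_texel, mipmapped=True):
    """Size of one texture; a full mip chain adds roughly 1/3 overhead."""
    base = width * height * bytes_per_texel
    return base * 4 // 3 if mipmapped else base

# Assumed streaming pool: 300 textures at 4096x4096, block-compressed (~1 byte/texel).
texture_pool = 300 * texture_bytes(4096, 4096, 1)

# 4K render targets: G-buffer, depth, post-processing chain (assumed ~12 targets, 8 B/px).
render_targets = 12 * 3840 * 2160 * 8

# Geometry, shaders, OS/driver overhead -- rough allowance.
misc = 1.5 * BYTES_PER_GIB

total = texture_pool + render_targets + misc
print(f"Estimated VRAM: {total / BYTES_PER_GIB:.1f} GiB")
# ~8.5 GiB here; an HD pack that doubles the texture pool lands near 15 GiB,
# well past any 10 GB card.
```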

Exactly, and those shenanigans don't show up in reviews at launch. Win-win, right? 10 GB was enough, and people can happily live in the ignorance that it is. Unless they frequent TPU, where I'll keep rubbing that one in, since lots of buyers were convinced all was fine and 'oh, otherwise the card's perf isn't going to push those details anyway' or 'I'll just lower my textures'. On GPUs that are hitting the $1K MSRP mark. Less than a year post-launch. It's utterly ridiculous.

To each their own; it was easy to predict this.

So no, Nvidia won't do the hardware workaround for 'sufficient' VRAM anymore; after getting burned on Maxwell, they're designing around it, and they sell you the argument that all is well with 50% less VRAM relative to core power compared to Pascal. Meanwhile, they still release double-VRAM versions for the entire stack. Early-adopter heaven: cash in twice on fools with money. It's much better than being forced to settle with customers over your 3.5 GB.

The newest argument to avoid buyer's remorse is 'muh muh but we have DLSS, so it looks great anyway'. :roll::roll::roll: Okay, enjoy being on Nvidia's DLSS leash for your gaming; what used to be waiting for your SLI profiles is now waiting for your DLSS profiles. But! This time, it will all be different, right? :D

It's a piss-poor gen, this one. Even the 450 W on this 3090 Ti underlines it: they can't make a full die to save their lives without pulling out all the stops on voltage. Samsung 8 nm. Fan-tas-tic node.
 
"Expect next-generation graphics cards to consume more power"

"In a recent Tweet, the leaker @kopite7kimi stated that "400 is not enough" when referring to Nvidia's next-generation RTX 40 series."
I always found undervolted x70 cards a good compromise between performance and consumption (and price, of course). The 4070 might be the one for me.
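
For a feel of what that compromise looks like in numbers, here's a quick sketch. All figures (board power, undervolt results) are illustrative assumptions; actual undervolting headroom varies per chip:

```python
# Rough perf-per-watt comparison: stock vs. a mild undervolt.
# The numbers below are assumptions for illustration, not measurements.

stock_power_w = 285      # hypothetical x70-class board power
stock_perf = 100.0       # normalized performance at stock

uv_power_w = 220         # assumed draw after undervolting
uv_perf = 96.0           # assumed performance retained

stock_eff = stock_perf / stock_power_w
uv_eff = uv_perf / uv_power_w

print(f"Stock:     {stock_eff:.3f} perf/W")
print(f"Undervolt: {uv_eff:.3f} perf/W "
      f"({(uv_eff / stock_eff - 1) * 100:+.0f}% efficiency, "
      f"{uv_perf - stock_perf:+.0f}% performance)")
# With these assumed numbers: ~+24% efficiency for ~-4% performance.
```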
 
Yay, more stuff for miners to gobble up with bots, while ironically the common man, who doesn't know how to make bots to buy items with, will continue to not get the item.

"Ironic", you say? Why do you say that, Lynx? Because that is the argument that Lex Fridman and others make for the power and importance of crypto's success: that it will free the common man from the corrupt.

Hehe, such a funny irony.
 
I'm sure there will be at least 2 distraught gamers crying because they can't get one of these.
 
So, a 5% performance gain (max) for twice the price of an already massively overpriced card. Gotcha.
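
In perf-per-dollar terms (assumed round numbers, since final pricing wasn't confirmed at the time):

```python
# Hypothetical value comparison: RTX 3090 vs. rumored 3090 Ti pricing.
perf_3090, price_3090 = 100, 1500   # baseline perf, assumed $1,500 MSRP
perf_ti, price_ti = 105, 3000       # +5% perf at an assumed doubled price

value_3090 = perf_3090 / price_3090
value_ti = perf_ti / price_ti
print(f"Perf per dollar drops by {(1 - value_ti / value_3090) * 100:.0f}%")  # ~48%
```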
 
What planet do these lunatics live on? If you're releasing a card that exceeds 400 W, then something has gone seriously wrong internally.

Another generation to skip for me; I'll wait till the "mid" range cards have 3070 performance at a fraction of the power usage.
 
Good, you solved the "not enough heat" problem ... now solve the "min[e]or" problem ... ( [joke]aka: add an R18 tag on them, or make them burn when under load in a system with more than 3 GPUs[/joke] the "R18 tag" was a joke ... the second part ... mmhh, well, OK, that one is a joke too ... :oops: )

I guess my 1070 still has some time ahead of her ... well, I have yet to find a game I play on it that goes under 1620p30 and becomes unplayable, aside from Quake II RTX of course ... (not that RTX is something "worth it" for me :laugh: ). Most games, including recent ones, sit at 1620p50+ (as long as it doesn't drop below 40 I'm fine, and no AA with everything else at max/near max does the trick in most).
 
450 W? Are you kidding me? It will be another nuclear GPU like the GTX 590 and GTX 690. Power supply, R.I.P.
Those were dual-GPU cards. This has only one GPU and way higher TDP (or TBP, whatever).
What planet do these lunatics live on? If you're releasing a card that exceeds 400 W, then something has gone seriously wrong internally.

Another generation to skip for me; I'll wait till the "mid" range cards have 3070 performance at a fraction of the power usage.
The R9 295X2 was kind of understandable: it had two already power-hungry GPUs, liquid cooling in its reference design, and a 500 W TDP/TBP. Very high consumption, of course, but as I said, it was also a dual-GPU card.
 
Those were dual-GPU cards. This has only one GPU and way higher TDP (or TBP, whatever).

The R9 295X2 was kind of understandable: it had two already power-hungry GPUs, liquid cooling in its reference design, and a 500 W TDP/TBP. Very high consumption, of course, but as I said, it was also a dual-GPU card.
Yep, that one I understand; different times too. Nvidia kept lowering TDPs while the cards remained powerful; Maxwell was amazing. Lately they've gone backwards and done an Intel. You simply cannot brag about performance if you have to raise the TDP that much, and that'll ultimately cost the consumer too. PSUs of 1000 W+ will soon be the norm for them, and that's not OK.
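
A rough sizing sketch of why that follows. The spike multiplier and system figures below are assumptions for illustration (high-end Ampere cards are widely reported to have millisecond-scale transients well above rated board power):

```python
# Back-of-envelope PSU sizing for a 450 W GPU -- illustrative assumptions only.

gpu_tbp_w = 450          # rated board power of the 3090 Ti
spike_factor = 1.8       # assumed millisecond-scale transient multiplier
cpu_w = 250              # assumed high-end CPU under load
rest_of_system_w = 75    # fans, drives, RAM, board -- rough allowance
headroom = 1.2           # ~20% margin so the PSU isn't pinned at its limit

sustained = gpu_tbp_w + cpu_w + rest_of_system_w
transient = gpu_tbp_w * spike_factor + cpu_w + rest_of_system_w

print(f"Sustained load: {sustained} W")
print(f"Transient peak: {transient:.0f} W")
print(f"Suggested PSU:  {sustained * headroom:.0f} W minimum, "
      f"more if it can't ride out {transient:.0f} W spikes")
# With these assumed numbers: ~775 W sustained, ~1135 W spikes -- hence 1000 W+ units.
```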
 
Yep, that one I understand; different times too. Nvidia kept lowering TDPs while the cards remained powerful; Maxwell was amazing. Lately they've gone backwards and done an Intel. You simply cannot brag about performance if you have to raise the TDP that much, and that'll ultimately cost the consumer too. PSUs of 1000 W+ will soon be the norm for them, and that's not OK.
Yeah. They're doing exactly that, pushing out raw performance no matter the power consumption. Like you said, Maxwell (and Pascal) was awesome when it came to efficiency.
 
Maxwell was good on efficiency, and Pascal followed up and took it further. After that point, real-time ray tracing (RTRT) got pushed and efficiency went out the window.
 