
Acer Leaks GeForce RTX 5090 and RTX 5080 GPU, Memory Sizes Confirmed

16 GB on an 80-series card in 2025 will be a joke, especially when it's going to cost anything over $700.
 
I'm starting to resent this company. I haven't upgraded my graphics card in 5 years since I feel the current-gen 40-series is a rip-off. I was looking forward to a 5080 with 24 GB of RAM to be slightly future-proof, but nooooooooooo, that would have been too simple. Let's keep milking everyone by selling $1400 cards that will be hamstrung in 2 years for want of a small amount of missing RAM. And if you want more, it's going to cost you $3000+ thanks to scalpers.

I can't take this mentality anymore. I'm waiting for what AMD has to offer.
 
I think the 5080 will be the worst investment ever. I bet by the end of 2025 there will be a game that stutters because of low VRAM and requires lowering settings to get around it. You don't lower settings on a brand-new, state-of-the-art $1200 graphics card that's not even a year old!

But you know what, I bet nGreedia launches a refresh (Super/Ti BS) at the end of 2025 with 24 GB of memory, which is what the card should have had in the first place. Everyone who fell for the line that 16 GB is fine for at least 6 years "because consoles" will suddenly realise they own a $1200 paperweight that loses its value, because nobody wants to pay top dollar for a second-hand 16 GB card that can't play the latest games at high settings.

And to the people getting butthurt by this post: the bandwidth/speed of GDDR7 VRAM means NOTHING if the card is fetching textures from your slow system RAM at 60 GB/s. It will still stutter like it's 2016 all over again.
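The point above is simple arithmetic. Here's a back-of-envelope sketch in Python; the GDDR7 bandwidth figure is an illustrative assumption, not a spec number, and the 60 GB/s is the figure quoted in the post:

```python
# Time to move a 2 GB texture working set: local VRAM vs. streaming
# over the PCIe link from system RAM. All figures are illustrative
# assumptions for the sake of the comparison.
working_set_gb = 2.0

vram_bw_gbs = 900.0   # assumed effective GDDR7 bandwidth, GB/s (hypothetical)
pcie_bw_gbs = 60.0    # system-RAM fetch rate quoted in the post

t_vram = working_set_gb / vram_bw_gbs * 1000  # milliseconds
t_pcie = working_set_gb / pcie_bw_gbs * 1000  # milliseconds

print(f"from VRAM: {t_vram:.2f} ms, over PCIe: {t_pcie:.1f} ms")
```

A 60 fps frame budget is about 16.7 ms, so even spilling a fraction of the working set to system RAM blows the budget, regardless of how fast the on-card memory is.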

At this point I really can't see the point of the 5080 over the 5070 Ti. nGreedia has cut the 5080 to the bone; it's not even half the performance of the 5090, which makes me think it's actually the 5070 Ti rebadged as a 5080, and the real 5080 will be whatever ends up as the Super/Ti version that comes later.
 
I tried looking for it, but TPU didn't check it as you said.
That said, TPU doesn't have a particularly good track record for this. Some games start to lower texture quality, or just don't load textures at all, and it's not always acknowledged.
But TPU does "show" VRAM usage in a nice and concise way.
If I wanted to be asinine, I'd love it if TPU checked AMD and Intel GPU usage as well. Not as in-depth, but a couple of data points to check whether they match.
Exactly. All you need to do here is spot trends. Games readily saturate 8GB. They occasionally saturate 12GB. And every gen of cards handles VRAM differently. No need to get into every last small detail; if you zoom out, it's easy to see what you might need.

And you go by allocated memory, because that is the only guaranteed number you can count on to ensure you don't encounter stutter, quality reduction, dynamic scaling, etc. I'm not going to compare every tiny detail that they trick us with: I just buy cards that don't lack sufficient VRAM.
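The "go by allocated memory" rule above can be sketched as a simple heuristic: take the worst peak allocation you've seen across your games and require the card to cover it with some headroom. The per-game numbers and the 10% margin below are hypothetical placeholders, not measurements:

```python
# Hypothetical peak *allocated* VRAM observed per game, in GB.
peak_allocated_gb = {
    "game_a": 7.5,
    "game_b": 11.8,
    "game_c": 13.2,
}

def has_headroom(card_vram_gb, peaks, margin=0.10):
    """True if the card covers the worst observed peak plus a safety margin."""
    worst = max(peaks.values())
    return card_vram_gb >= worst * (1 + margin)

print(has_headroom(16, peak_allocated_gb))  # 16 GB vs. a 13.2 GB peak
print(has_headroom(12, peak_allocated_gb))  # 12 GB vs. a 13.2 GB peak
```

Allocation overstates true need (engines over-allocate when VRAM is plentiful), which is exactly why it's the conservative number to shop by: if the card covers allocation, it certainly covers actual use.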
 