Wednesday, December 18th 2024

Acer Leaks GeForce RTX 5090 and RTX 5080 GPU, Memory Sizes Confirmed

Acer has jumped the gun and listed its Predator Orion 7000 systems with the upcoming NVIDIA RTX 50 series graphics cards, namely the GeForce RTX 5080 and the GeForce RTX 5090. In addition, the listing confirms that the GeForce RTX 5080 will come with 16 GB of GDDR7 memory, while the GeForce RTX 5090 will get 32 GB of GDDR7 memory.

The Acer Predator Orion 7000 gaming PC was announced back in September, together with Intel's Core Ultra 200 series, and it does not come as a surprise that this high-end pre-built system will now be getting NVIDIA's new GeForce RTX 50 series graphics cards. In case you missed previous rumors, the GeForce RTX 5080 is expected to use the GB203-400 GPU with 10,752 CUDA cores, and come with 16 GB of GDDR7 memory on a 256-bit memory interface. The GeForce RTX 5090, on the other hand, gets the GB202-300 GPU with 21,760 CUDA cores and packs 32 GB of GDDR7 memory.
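For reference, a quick sketch putting the rumored figures side by side (all numbers are the leaked values above and remain unconfirmed until NVIDIA's announcement):

```python
# Rumored RTX 50 series specs from the leaks cited above (unconfirmed).
rumored_specs = {
    "RTX 5080": {"gpu": "GB203-400", "cuda_cores": 10_752, "vram_gb": 16, "bus_bits": 256},
    "RTX 5090": {"gpu": "GB202-300", "cuda_cores": 21_760, "vram_gb": 32},
}

# On paper the flagship is roughly twice the card: ~2.02x the CUDA cores and 2x the VRAM.
core_ratio = rumored_specs["RTX 5090"]["cuda_cores"] / rumored_specs["RTX 5080"]["cuda_cores"]
vram_ratio = rumored_specs["RTX 5090"]["vram_gb"] / rumored_specs["RTX 5080"]["vram_gb"]
print(f"5090 vs 5080: {core_ratio:.2f}x CUDA cores, {vram_ratio:.0f}x VRAM")
```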
The NVIDIA GeForce RTX 50 series is expected to be unveiled at the CES 2025 keynote, led by NVIDIA CEO Jensen Huang, on January 6 at 6:30 PM, and judging from earlier leaks by Inno3D, we might see a surprise or two with "Advanced DLSS Technology", "Neural Rendering", and other AI-oriented features.
Source: Videocardz.com

16 Comments on Acer Leaks GeForce RTX 5090 and RTX 5080 GPU, Memory Sizes Confirmed

#1
Daven
From reading the comments of other articles, it seems that just a handful of 4090 owners are looking forward to the 5090, as they have the money for generation-to-generation purchases of the highest SKU. I'm not seeing too many caring about the 5080 SKU and lower.
Posted on Reply
#2
mb194dc
Pretty sceptical to be honest; it'll be interesting to see what the generational gains are. Wasn't it only 6% more shaders on the 5070 Ti? Spend a couple of grand on a 5090, then use Advanced DLSS technology at Ultra Performance to upscale from 720p, so the image quality is garbage, full of shimmering etc., and worse than games from 20 years ago. The wonders of "AI".
Posted on Reply
#3
StimpsonJCat
16GB on a brand-new, $1200+ top-of-the-"consumer"-range 2025 graphics card is a total joke.

There will be absolutely no longevity in buying a top-range card with such a limited amount of VRAM. 12GB is the minimum to play modern games at 1440p, as confirmed by many games already in 2024. Are we seriously expected to believe that no games coming out in 2025 will start stuttering on a 16GB card? I don't trust nGreedia enough to put my money on that!
Posted on Reply
#4
Xaled
Increasing the memory from 24 GB to 32 GB is probably just a move to justify a price increase and/or a small generational performance increase.

I hope it is not, though. 32 GB is more than welcome, as in my content creation pipeline 24 GB is not enough for creating 4K images/image sequences, and 48 GB Quadros are way above my budget.

edit: Keeping 16 GB on xx80s could just be a temporary move to see how the land lies, and then make a "we responded to the community and upgraded it to 20 or 24" move.
Posted on Reply
#5
wNotyarD
Xaled: edit: Keeping 16 GB on xx80s could just be a temporary move to see how the land lies, and then make a "we responded to the community and upgraded it to 20 or 24" move.
Keeps the gap alive for the release of either (or both) 5080 Super and Ti.
Posted on Reply
#6
Hecate91
StimpsonJCat: 16GB on a brand-new, $1200+ top-of-the-"consumer"-range 2025 graphics card is a total joke.

There will be absolutely no longevity in buying a top-range card with such a limited amount of VRAM. 12GB is the minimum to play modern games at 1440p, as confirmed by many games already in 2024. Are we seriously expected to believe that no games coming out in 2025 will start stuttering on a 16GB card? I don't trust nGreedia enough to put my money on that!
Agreed, a $1200+ card should have 20 or 24GB of VRAM, although Nvidia has to keep the performance gap to get people to buy the flagship, and it seems the gap will get even wider with a 32GB 5090.
I really dislike the product segmentation Nvidia has with high-end cards; these cards with just enough VRAM force users to turn on DLSS and upscaling.
Posted on Reply
#7
Nostras
Nvidia leaving a huge gap between 16GB and 32GB is a bit ridiculous. It probably has to do with AI, but having to cough up over $1k for a 16GB card sounds awful.
Posted on Reply
#8
ErikG
Daven: From reading the comments of other articles, it seems that just a handful of 4090 owners are looking forward to the 5090, as they have the money for generation-to-generation purchases of the highest SKU. I'm not seeing too many caring about the 5080 SKU and lower.
No, anyone with a 4090 can skip the next generation. I have a 4090, and the next 2-3 years will be OK for 4K.
Posted on Reply
#9
Metroid
Is 512-bit GDDR7 on the 5090 confirmed?
Posted on Reply
#10
TSiAhmat
ErikG: No, anyone with a 4090 can skip the next generation. I have a 4090, and the next 2-3 years will be OK for 4K.
I think the question here is:

why did they buy a 4090?

-> They needed the card for its performance (4K 120 Hz, for example, or research, video editing, CAD, and so on...)
-> They wanted the best card on the market
(I am sure there are more reasons... this is to boil down the point I am trying to make)

If it was for the latter, the 4090 is not the best and shiniest thing on the market anymore, therefore a reason to get the new one.
Posted on Reply
#11
igormp
Xaled: Increasing the memory from 24 GB to 32 GB is probably just a move to justify a price increase and/or a small generational performance increase.

I hope it is not, though. 32 GB is more than welcome, as in my content creation pipeline 24 GB is not enough for creating 4K images/image sequences, and 48 GB Quadros are way above my budget.

edit: Keeping 16 GB on xx80s could just be a temporary move to see how the land lies, and then make a "we responded to the community and upgraded it to 20 or 24" move.
Reminder that, unlike the 3090->4090 move, those 32GB will provide a substantial memory bandwidth uplift, going from the ~1TB/s of the 3090/4090 (384-bit at 21Gbps) to almost 1.8TB/s (512-bit at 28Gbps), an 80% improvement that should translate to a similar ~80% uplift for stuff like LLMs.
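For anyone checking the math, a minimal back-of-the-envelope sketch (the 512-bit bus and 28Gbps GDDR7 speed are the rumored figures, not confirmed):

```python
# Peak memory bandwidth = bus width (bits) / 8 * per-pin data rate (Gbps).
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

rtx_4090 = bandwidth_gb_s(384, 21)  # ~1008 GB/s, i.e. ~1 TB/s
rtx_5090 = bandwidth_gb_s(512, 28)  # ~1792 GB/s, i.e. ~1.8 TB/s (rumored)

print(f"uplift: {rtx_5090 / rtx_4090 - 1:.0%}")  # ~78%, the "almost 80%" figure above
```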
Metroid: Is 512-bit GDDR7 on the 5090 confirmed?
So far everything points to that being the case. To get 32GB with 2GB DRAM modules you'd need 16 of them, which means either 256-bit in clamshell mode (which would be a downgrade from previous gens) or going with 512-bit for those 16 channels.

Another possibility would be to use 24Gb (3GB) modules with a 384-bit bus, but that would lead to 36GB.
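A minimal sketch of the module math behind those options (the 32-bit-per-package interface and the 2GB/3GB package capacities are standard GDDR7 figures; everything about the 5090 itself is still rumor):

```python
# Each GDDR7 package has a 32-bit interface; clamshell puts two packages on one channel.
PACKAGE_BUS_BITS = 32

def memory_config(total_gb: int, package_gb: int, clamshell: bool = False) -> str:
    packages = total_gb // package_gb
    bus_bits = packages * PACKAGE_BUS_BITS // (2 if clamshell else 1)
    mode = " (clamshell)" if clamshell else ""
    return f"{packages} x {package_gb}GB -> {total_gb}GB on a {bus_bits}-bit bus{mode}"

print(memory_config(32, 2))                  # 16 x 2GB -> 32GB on a 512-bit bus
print(memory_config(32, 2, clamshell=True))  # 16 x 2GB -> 32GB on a 256-bit bus (clamshell)
print(memory_config(36, 3))                  # 12 x 3GB -> 36GB on a 384-bit bus
```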
Posted on Reply
#12
stahlhart
Daven: From reading the comments of other articles, it seems that just a handful of 4090 owners are looking forward to the 5090, as they have the money for generation-to-generation purchases of the highest SKU. I'm not seeing too many caring about the 5080 SKU and lower.
The guys who can't live with themselves unless they are constantly in possession of the biggest benchmark weenie at any given moment.

The 4090 wasn't my first choice. I had been holding out for a 4080 Ti that never materialized, and I just couldn't pull the trigger on a 4080 that was overpriced for what you were getting in comparison at the time, but at the same time the 4090 felt like (and still feels like) overkill. And games of late aren't justifying an upgrade beyond that for me, so I'll probably sit this gen out.
Posted on Reply
#13
AnarchoPrimitiv
StimpsonJCat: 16GB on a brand-new, $1200+ top-of-the-"consumer"-range 2025 graphics card is a total joke.

There will be absolutely no longevity in buying a top-range card with such a limited amount of VRAM. 12GB is the minimum to play modern games at 1440p, as confirmed by many games already in 2024. Are we seriously expected to believe that no games coming out in 2025 will start stuttering on a 16GB card? I don't trust nGreedia enough to put my money on that!
Agreed... xx80-class cards will definitely be hitting well above $1000 this time around, so it would have been nice to see at least 24GB... What's the deal with Nvidia making the gulf between the xx80- and xx90-class cards even larger? IDK, seems weird to have an almost $1000 gap between the top card and the runner-up... Anyone know the strategy behind that one?
Posted on Reply
#14
wNotyarD
AnarchoPrimitiv: Agreed... xx80-class cards will definitely be hitting well above $1000 this time around, so it would have been nice to see at least 24GB... What's the deal with Nvidia making the gulf between the xx80- and xx90-class cards even larger? IDK, seems weird to have an almost $1000 gap between the top card and the runner-up... Anyone know the strategy behind that one?
Make the xx80 seem so inferior that you'll splash the cash for the halo xx90.
Posted on Reply
#15
igormp
AnarchoPrimitiv: Agreed... xx80-class cards will definitely be hitting well above $1000 this time around, so it would have been nice to see at least 24GB... What's the deal with Nvidia making the gulf between the xx80- and xx90-class cards even larger? IDK, seems weird to have an almost $1000 gap between the top card and the runner-up... Anyone know the strategy behind that one?
Remember that the top "consumer" chip (AD102 for the 4090, GB202 for the 5090) is also used for the workstation/datacenter market.
The 4090 didn't even have the full die enabled; the RTX 6000 Ada actually had more cores than it. I guess it's just a matter of having bigger chips, and yields being good enough that there's no reason to create smaller products out of them.
Posted on Reply
#16
Vayra86
StimpsonJCat: 16GB on a brand-new, $1200+ top-of-the-"consumer"-range 2025 graphics card is a total joke.

There will be absolutely no longevity in buying a top-range card with such a limited amount of VRAM. 12GB is the minimum to play modern games at 1440p, as confirmed by many games already in 2024. Are we seriously expected to believe that no games coming out in 2025 will start stuttering on a 16GB card? I don't trust nGreedia enough to put my money on that!
I frankly don't think 16GB is going to be a problem anywhere in 2025. And that's coming from a VRAM herald.

Sure, if you want to play the selection of three-odd games that really go there, then yes, but those are simulators that also like to have their own entire setup to begin with.

But yes, if you jumped on 4K, you might nip at the heels of 16GB sooner rather than later. Good reason not to. I'm a big advocate of 3440x1440 as a max res, or just 1440p. It's comfortable, you're not chasing new standards all the time, high refresh is easy to attain, you're not forced to upscale, native scaling works fine, yadayadayada.

The real VRAM issue is happening in x60/x70 territory with 8/12GB cards. But Nvidia is moving to 16GB there too, if rumors are true. x60 is just entirely "avoid" territory at this point.
wNotyarD: Make the xx80 seem so inferior that you'll splash the cash for the halo xx90.
Nah, last gen perhaps, but now? That gap is too large. It all depends on how they price the x90. If they give it away at spitting distance from a 5080, it's a no-brainer, but given the fact it's over twice the GPU, that's never happening. I rather think the 5080 might surprise us with a price below $1k. I think Nvidia got the memo that the 4080 was overpriced. It was a very problematic segment for them in Ada; x70 Ti-x80 was a mess even pre-release. We might even pray for that philosophy to trickle down the stack a bit. The x70 was heavily underspecced too in Ada.
Posted on Reply