
Acer Leaks GeForce RTX 5090 and RTX 5080 GPU, Memory Sizes Confirmed

GFreeman

News Editor
Staff member
Joined
Mar 6, 2023
Messages
1,582 (2.42/day)
Acer has jumped the gun and listed its Predator Orion 7000 systems with the upcoming NVIDIA GeForce RTX 50-series graphics cards, namely the GeForce RTX 5080 and the GeForce RTX 5090. In addition, the listing confirms that the GeForce RTX 5080 will come with 16 GB of GDDR7 memory, while the GeForce RTX 5090 will get 32 GB of GDDR7 memory.

The Acer Predator Orion 7000 gaming PC was announced back in September, together with Intel's Core Ultra 200 series, and it does not come as a surprise that this high-end pre-built system will now be getting NVIDIA's new GeForce RTX 50-series graphics cards. In case you missed previous rumors, the GeForce RTX 5080 is expected to use the GB203-400 GPU with 10,752 CUDA cores and come with 16 GB of GDDR7 memory on a 256-bit memory interface. The GeForce RTX 5090, on the other hand, gets the GB202-300 GPU with 21,760 CUDA cores and packs 32 GB of GDDR7 memory.



The NVIDIA GeForce RTX 50 series is expected to be unveiled at the CES 2025 keynote, led by NVIDIA CEO Jensen Huang, on January 6 at 6:30 PM, and judging from earlier leaks by Inno3D, we might see a surprise or two with "Advanced DLSS Technology", "Neural Rendering", and other AI-oriented features.

View at TechPowerUp Main Site | Source
 
Joined
Dec 12, 2016
Messages
1,944 (0.66/day)
From reading the comments on other articles, it seems that just a handful of 4090 owners are looking forward to the 5090, as they have the money for generation-to-generation purchases of the highest SKU. I'm not seeing too many who care about the 5080 SKU and lower.
 
Joined
Jan 18, 2020
Messages
832 (0.46/day)
Pretty sceptical, to be honest; it'll be interesting to see what the generational gains are. Wasn't it only 6% more shaders on the 5070 Ti? Spend a couple of grand on a 5090, then use Advanced DLSS technology at Ultra Performance to upscale from 720p, so the image quality is garbage, full of shimmering etc. and worse than games from 20 years ago. The wonders of "AI".
 
Joined
Sep 30, 2024
Messages
103 (1.29/day)
16GB on a brand new, $1200+ top of the "consumer" range 2025 graphics card is a total joke.

There will be absolutely no longevity in buying a top-range card with such a limited amount of VRAM. 12GB is the minimum to play modern games at 1440p, as confirmed by many games already in 2024. Are we seriously expected to believe that no games coming out in 2025 will start stuttering on a 16GB card? I don't trust nGreedia enough to put my money on that!
 
Joined
Jun 18, 2015
Messages
578 (0.17/day)
Increasing the memory from 24 GB to 32 GB is probably just a move to justify a price increase and/or a small generational performance increase.

I hope it is not, though. 32 GB is more than welcome, as in my content creation pipeline 24 GB is not enough for creating 4K images/image sequences, and 48 GB Quadros are way above my budget.

edit: Keeping 16 GB on the xx80s could just be a temporary move to see how the land lies, and then make a "we responded to the community and upgraded it to 20 or 24" move.
 
Joined
Sep 26, 2022
Messages
2,142 (2.63/day)
Location
Brazil
System Name G-Station 2.0 "YGUAZU"
Processor AMD Ryzen 7 5700X3D
Motherboard Gigabyte X470 Aorus Gaming 7 WiFi
Cooling Freezemod: Pump, Reservoir, 360mm Radiator, Fittings / Bykski: Blocks / Barrow: Meters
Memory Asgard Bragi DDR4-3600CL14 2x16GB
Video Card(s) Sapphire PULSE RX 7900 XTX
Storage 240GB Samsung 840 Evo, 1TB Asgard AN2, 2TB Hiksemi FUTURE-LITE, 320GB+1TB 7200RPM HDD
Display(s) Samsung 34" Odyssey OLED G8
Case Lian Li Lancool 216
Audio Device(s) Astro A40 TR + MixAmp
Power Supply Cougar GEX X2 1000W
Mouse Razer Viper Ultimate
Keyboard Razer Huntsman Elite (Red)
Software Windows 11 Pro
edit: Keeping 16 GB on the xx80s could just be a temporary move to see how the land lies, and then make a "we responded to the community and upgraded it to 20 or 24" move.
Keeps the gap alive for the release of either (or both) 5080 Super and Ti.
 
Joined
Dec 1, 2022
Messages
245 (0.33/day)
16GB on a brand new, $1200+ top of the "consumer" range 2025 graphics card is a total joke.

There will be absolutely no longevity in buying a top-range card with such a limited amount of VRAM. 12GB is the minimum to play modern games at 1440p, as confirmed by many games already in 2024. Are we seriously expected to believe that no games coming out in 2025 will start stuttering on a 16GB card? I don't trust nGreedia enough to put my money on that!
Agreed, a $1200+ card should have 20 or 24GB of VRAM, although Nvidia has to keep the performance gap to get people to buy the flagship, and it seems the gap will get even wider with a 32GB 5090.
I really dislike the product segmentation Nvidia has with high-end cards; these cards with just enough VRAM force users to turn on DLSS and upscaling.
 
Joined
Jan 9, 2023
Messages
320 (0.45/day)
Nvidia leaving a huge gap between 16GB and 32GB is a bit ridiculous. It probably has to do with AI, but having to cough up over 1k for a 16GB card sounds awful.
 
Joined
Apr 24, 2022
Messages
72 (0.07/day)
Location
UK
System Name JustGaming
Processor Ryzen 9 7950x3D
Motherboard Asus ProArt X670E-CREATOR WIFI
Cooling Thermalright Peerless Assassin 140 Black
Memory G.Skill Trident Z5 DDR5-6000 CL32
Video Card(s) Inno3D RTX 4090 24Gb
Storage Crucial T500 2TB x 3
Display(s) Asus ROG Swift OLED PG32UCDM
Case Asus ProArt PA602
Audio Device(s) SSL 2+
Power Supply SuperFlower 1200 Platinum
Mouse Razer Deathadder v2
Keyboard Montech Mkey TKL
From reading the comments on other articles, it seems that just a handful of 4090 owners are looking forward to the 5090, as they have the money for generation-to-generation purchases of the highest SKU. I'm not seeing too many who care about the 5080 SKU and lower.
No, anyone with a 4090 can skip the next generation. I have a 4090, and it will be fine for 4K for the next 2-3 years.
 
Joined
Nov 13, 2024
Messages
86 (2.39/day)
System Name le fish au chocolat
Processor AMD Ryzen 7 5950X
Motherboard ASRock B550 Phantom Gaming 4
Cooling Peerless Assassin 120 SE
Memory 2x 16GB (32 GB) G.Skill RipJaws V DDR4-3600 DIMM CL16-19-19-39
Video Card(s) NVIDIA GeForce RTX 3080, 10 GB GDDR6X (ASUS TUF)
Storage 2 x 1 TB NVME & 2 x 4 TB SATA SSD in Raid 0
Display(s) MSI Optix MAG274QRF-QD
Power Supply 750 Watt EVGA SuperNOVA G5
No, anyone with a 4090 can skip the next generation. I have a 4090, and it will be fine for 4K for the next 2-3 years.
I think the question here is:

why did they buy a 4090?

-> They needed the card for its performance (4K 120 Hz, for example, or research, video editing, CAD and so on...)
-> They wanted the best card on the market
(I am sure there are more reasons... this is to boil down the point I am trying to make)

If it was for the latter, it's not the best and shiniest thing on the market anymore, and therefore there's a reason to upgrade.
 
Joined
May 10, 2023
Messages
341 (0.58/day)
Location
Brazil
Processor 5950x
Motherboard B550 ProArt
Cooling Fuma 2
Memory 4x32GB 3200MHz Corsair LPX
Video Card(s) 2x RTX 3090
Display(s) LG 42" C2 4k OLED
Power Supply XPG Core Reactor 850W
Software I use Arch btw
Increasing the memory from 24 GB to 32 GB is probably just a move to justify a price increase and/or a small generational performance increase.

I hope it is not, though. 32 GB is more than welcome, as in my content creation pipeline 24 GB is not enough for creating 4K images/image sequences, and 48 GB Quadros are way above my budget.

edit: Keeping 16 GB on the xx80s could just be a temporary move to see how the land lies, and then make a "we responded to the community and upgraded it to 20 or 24" move.
Reminder that, unlike the 3090 -> 4090 move, those 32 GB will provide a substantial memory bandwidth uplift, going from the ~1 TB/s of the 3090/4090 (384-bit at 21 Gbps) to almost 1.8 TB/s (512-bit at 28 Gbps), an ~80% improvement that should translate to a similar uplift for stuff like LLMs.

Is 512-bit GDDR7 on the 5090 confirmed?
So far everything points to that being the case. To get 32 GB with 2 GB DRAM modules you'd need 16 of them, which means either 256-bit in clamshell mode (which would be a downgrade from previous gens) or a 512-bit bus with one module per 32-bit channel.

Another possibility would be to use 24 Gb (3 GB) modules on a 384-bit bus, but that would lead to 36 GB.
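
For anyone who wants to sanity-check those numbers, here's a quick back-of-the-envelope sketch in Python. It only encodes the rumored figures quoted in this thread (512-bit at 28 Gbps, 2 GB vs 3 GB modules), so treat it as arithmetic, not confirmation:

Code:
# Rough GDDR bus math for the figures discussed above (rumored specs, not confirmed).

def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    # Peak bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate in Gbps
    return bus_width_bits / 8 * data_rate_gbps

def capacity_gb(bus_width_bits: int, module_gb: int, clamshell: bool = False) -> int:
    # Each GDDR module sits on a 32-bit channel; clamshell puts two modules per channel.
    modules = (bus_width_bits // 32) * (2 if clamshell else 1)
    return modules * module_gb

gb_4090 = bandwidth_gb_s(384, 21)   # ~1008 GB/s
gb_5090 = bandwidth_gb_s(512, 28)   # ~1792 GB/s
print(f"4090: {gb_4090:.0f} GB/s, rumored 5090: {gb_5090:.0f} GB/s, "
      f"uplift: {gb_5090 / gb_4090 - 1:.0%}")

print(capacity_gb(512, 2))          # 16 x 2 GB, 512-bit            -> 32 GB
print(capacity_gb(256, 2, True))    # 16 x 2 GB, 256-bit clamshell  -> 32 GB
print(capacity_gb(384, 3))          # 12 x 3 GB, 384-bit            -> 36 GB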
 
Joined
Aug 9, 2024
Messages
134 (1.02/day)
Location
Michigan, United States
Processor Intel i7-13700K
Motherboard Asus ROG Strix Z690-E Gaming
Cooling NZXT Kraken Elite 360
Memory G.Skill Trident Z DDR5-6400
Video Card(s) MSI RTX 4090 Suprim X Liquid
Storage Western Digital SN850X 4Tb x 4
Case Corsair 5000D Airflow
Audio Device(s) Creative AE-5 Plus
Power Supply Corsair HX-1200
Software Windows 11 Pro 23H2
From reading the comments on other articles, it seems that just a handful of 4090 owners are looking forward to the 5090, as they have the money for generation-to-generation purchases of the highest SKU. I'm not seeing too many who care about the 5080 SKU and lower.

The guys who can't live with themselves unless they are constantly in possession of the biggest benchmark weenie at any given moment.

The 4090 wasn't my first choice -- I had been holding out for a 4080 Ti that never materialized, and I just couldn't pull the trigger on an overpriced (for what you were getting in comparison at the time) 4080, but at the same time the 4090 felt like (and still feels like) overkill. And games of late aren't justifying an upgrade beyond that for me, so I'll probably sit this gen out.
 
Joined
Nov 6, 2016
Messages
1,771 (0.60/day)
Location
NH, USA
System Name Lightbringer
Processor Ryzen 7 2700X
Motherboard Asus ROG Strix X470-F Gaming
Cooling Enermax Liqmax Iii 360mm AIO
Memory G.Skill Trident Z RGB 32GB (8GBx4) 3200Mhz CL 14
Video Card(s) Sapphire RX 5700XT Nitro+
Storage Hp EX950 2TB NVMe M.2, HP EX950 1TB NVMe M.2, Samsung 860 EVO 2TB
Display(s) LG 34BK95U-W 34" 5120 x 2160
Case Lian Li PC-O11 Dynamic (White)
Power Supply BeQuiet Straight Power 11 850w Gold Rated PSU
Mouse Glorious Model O (Matte White)
Keyboard Royal Kludge RK71
Software Windows 10
16GB on a brand new, $1200+ top of the "consumer" range 2025 graphics card is a total joke.

There will be absolutely no longevity in buying a top-range card with such a limited amount of VRAM. 12GB is the minimum to play modern games at 1440p, as confirmed by many games already in 2024. Are we seriously expected to believe that no games coming out in 2025 will start stuttering on a 16GB card? I don't trust nGreedia enough to put my money on that!
Agreed... xx80-class cards will definitely be hitting well above $1000 this time around, so it would have been nice to see at least 24GB. What's the deal with Nvidia making the gulf between the xx80- and xx90-class cards even larger? IDK, seems weird to have an almost $1000 gap between the top card and the runner-up... anyone know the strategy behind that one?
 
Joined
Sep 26, 2022
Messages
2,142 (2.63/day)
Location
Brazil
System Name G-Station 2.0 "YGUAZU"
Processor AMD Ryzen 7 5700X3D
Motherboard Gigabyte X470 Aorus Gaming 7 WiFi
Cooling Freezemod: Pump, Reservoir, 360mm Radiator, Fittings / Bykski: Blocks / Barrow: Meters
Memory Asgard Bragi DDR4-3600CL14 2x16GB
Video Card(s) Sapphire PULSE RX 7900 XTX
Storage 240GB Samsung 840 Evo, 1TB Asgard AN2, 2TB Hiksemi FUTURE-LITE, 320GB+1TB 7200RPM HDD
Display(s) Samsung 34" Odyssey OLED G8
Case Lian Li Lancool 216
Audio Device(s) Astro A40 TR + MixAmp
Power Supply Cougar GEX X2 1000W
Mouse Razer Viper Ultimate
Keyboard Razer Huntsman Elite (Red)
Software Windows 11 Pro
Agreed... xx80-class cards will definitely be hitting well above $1000 this time around, so it would have been nice to see at least 24GB. What's the deal with Nvidia making the gulf between the xx80- and xx90-class cards even larger? IDK, seems weird to have an almost $1000 gap between the top card and the runner-up... anyone know the strategy behind that one?
Make the xx80 seem so inferior that you'll splash the cash for the halo xx90.
 
Joined
May 10, 2023
Messages
341 (0.58/day)
Location
Brazil
Processor 5950x
Motherboard B550 ProArt
Cooling Fuma 2
Memory 4x32GB 3200MHz Corsair LPX
Video Card(s) 2x RTX 3090
Display(s) LG 42" C2 4k OLED
Power Supply XPG Core Reactor 850W
Software I use Arch btw
Agreed... xx80-class cards will definitely be hitting well above $1000 this time around, so it would have been nice to see at least 24GB. What's the deal with Nvidia making the gulf between the xx80- and xx90-class cards even larger? IDK, seems weird to have an almost $1000 gap between the top card and the runner-up... anyone know the strategy behind that one?
Remember that the top "consumer" chip (AD102 for the 4090, GB202 for the 5090) is also used for the workstation/datacenter market.
The 4090 didn't even have the full die enabled; the RTX 6000 Ada actually had more cores than it. I guess it's just a matter of having bigger chips, and yields being good enough that there's no reason to create smaller products out of them.
 
Joined
Sep 17, 2014
Messages
22,649 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
16GB on a brand new, $1200+ top of the "consumer" range 2025 graphics card is a total joke.

There will be absolutely no longevity in buying a top-range card with such a limited amount of VRAM. 12GB is the minimum to play modern games at 1440p, as confirmed by many games already in 2024. Are we seriously expected to believe that no games coming out in 2025 will start stuttering on a 16GB card? I don't trust nGreedia enough to put my money on that!
I frankly don't think 16GB is going to be a problem anywhere in 2025. And that's coming from a VRAM herald.

Sure, if you want to play the selection of 3 odd games that really go there, then yes, but they're simulators that also like to have their entire own setup to begin with.

But yes, if you jumped on 4K, you might nip at the heels of 16GB sooner rather than later. Good reason not to. I'm a big advocate of 3440x1440 as max res. Or just 1440p. It's comfortable, you're not chasing new standards all the time, high refresh is easy to attain, you're not forced to upscale, native scaling works fine, yadayadayada.

The real VRAM issue is happening in x60/x70 territory with 8/12GB cards. But Nvidia is moving to 16GB there too, if rumors are true. x60 is just entirely avoid territory at this point.

Make the xx80 seem so inferior that you'll splash the cash for the halo xx90.
Nah, last gen perhaps, but now? That gap is too large. It all depends on how they price the x90. If they give it away at spitting distance from a 5080, it's a no-brainer, but given the fact that it's over twice the GPU, that's never happening. I rather think the 5080 might surprise us with a price below 1k. I think Nvidia got the memo that the 4080 was overpriced. It was a very problematic segment for them in Ada - x70 Ti to x80 was a mess even pre-release. We might even pray for that philosophy to trickle down the stack a bit. The x70 was heavily underspecced too in Ada.
 
Joined
Jan 9, 2023
Messages
320 (0.45/day)
I frankly don't think 16GB is going to be a problem anywhere in 2025. And that's coming from a VRAM herald.

Sure, if you want to play the selection of 3 odd games that really go there, then yes, but they're simulators that also like to have their entire own setup to begin with.

But yes, if you jumped on 4K, you might nip at the heels of 16GB sooner rather than later. Good reason not to. I'm a big advocate of 3440x1440 as max res. Or just 1440p. It's comfortable, you're not chasing new standards all the time, high refresh is easy to attain, you're not forced to upscale, native scaling works fine, yadayadayada.

The real VRAM issue is happening in x60/x70 territory with 8/12GB cards. But Nvidia is moving to 16GB there too, if rumors are true. x60 is just entirely avoid territory at this point.


Nah, last gen perhaps, but now? That gap is too large. It all depends on how they price the x90. If they give it away at spitting distance from a 5080, it's a no-brainer, but given the fact that it's over twice the GPU, that's never happening. I rather think the 5080 might surprise us with a price below 1k. I think Nvidia got the memo that the 4080 was overpriced. It was a very problematic segment for them in Ada - x70 Ti to x80 was a mess even pre-release. We might even pray for that philosophy to trickle down the stack a bit. The x70 was heavily underspecced too in Ada.
I'm not so sure about that; Black Myth: Wukong is already going over 13GB of VRAM.
We can argue about cherry-picking and how unreasonable it is, but I see no reason why this trend won't continue.
I'd expect a game by 2026 to be quite troublesome to run on the 5080, assuming you want maximum eye candy (which does not sound unreasonable for a card well over 1k lol).
Funnily enough, Nvidia cards are at much higher risk because Nvidia users are much more likely to enable RT, which consumes significantly more VRAM.
 
Joined
Sep 9, 2022
Messages
106 (0.13/day)
16GB on a brand new, $1200+ top of the "consumer" range 2025 graphics card is a total joke.

There will be absolutely no longevity in buying a top-range card with such a limited amount of VRAM. 12GB is the minimum to play modern games at 1440p, as confirmed by many games already in 2024. Are we seriously expected to believe that no games coming out in 2025 will start stuttering on a 16GB card? I don't trust nGreedia enough to put my money on that!

Then don't buy it? Or is anyone holding a gun to your head and forcing you to buy a "total joke"? :kookoo:
 
Joined
Sep 17, 2014
Messages
22,649 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I'm not so sure about that; Black Myth: Wukong is already going over 13GB of VRAM.
We can argue about cherry-picking and how unreasonable it is, but I see no reason why this trend won't continue.
I'd expect a game by 2026 to be quite troublesome to run on the 5080, assuming you want maximum eye candy (which does not sound unreasonable for a card well over 1k lol).
Funnily enough, Nvidia cards are at much higher risk because Nvidia users are much more likely to enable RT, which consumes significantly more VRAM.
Over 13GB is not 16GB, and it certainly isn't 16GB-saturated stutter territory. Also, that is with FG; without it, 10GB would suffice for maxed-out 4K.

But 2026, sure, I can get into that. It's why I got myself a 20GB card :p No, not really, 16GB would have been fine too. As others have stated correctly, you can indeed just dial down from ultra and not feel all too much of it. Where I draw the line is medium and then still not getting the performance I want to target. And that's where 8GB is at right now, and 12GB is soonish. 16GB? Not by a long shot.
 
Joined
Feb 8, 2017
Messages
232 (0.08/day)
As I've posted in my thread on these forums, the new 5000-series generation of Nvidia cards is pretty much all overpriced turds. The 5090 will cost $2500 at least, and you can take that to the bank, with intentional scarcity pushing the cards up to even $3000.

All of the rest are just bare-bones turds with absurd prices.
 
Joined
May 10, 2023
Messages
341 (0.58/day)
Location
Brazil
Processor 5950x
Motherboard B550 ProArt
Cooling Fuma 2
Memory 4x32GB 3200MHz Corsair LPX
Video Card(s) 2x RTX 3090
Display(s) LG 42" C2 4k OLED
Power Supply XPG Core Reactor 850W
Software I use Arch btw
I'm not so sure about that; Black Myth: Wukong is already going over 13GB of VRAM.
We can argue about cherry-picking and how unreasonable it is, but I see no reason why this trend won't continue.
I'd expect a game by 2026 to be quite troublesome to run on the 5080, assuming you want maximum eye candy (which does not sound unreasonable for a card well over 1k lol).
Funnily enough, Nvidia cards are at much higher risk because Nvidia users are much more likely to enable RT, which consumes significantly more VRAM.
Tbh this test is kinda misleading, since it doesn't represent the actual used VRAM but rather the allocated amount. Too bad there were no tests run with something like a 3090 and a 4070 Ti (which share similar performance) at that exact 13GB config.
In the 4K tests we can see that both GPUs have similar performance (with the 4070 Ti often being faster), meaning that 12GB vs 24GB is not really an issue in this specific game.
You'd also have a hard time managing 60fps at that config anyway.
 
Joined
Jan 9, 2023
Messages
320 (0.45/day)
Tbh this test is kinda misleading, since it doesn't represent the actual used VRAM but rather the allocated amount. Too bad there were no tests run with something like a 3090 and a 4070 Ti (which share similar performance) at that exact 13GB config.
In the 4K tests we can see that both GPUs have similar performance (with the 4070 Ti often being faster), meaning that 12GB vs 24GB is not really an issue in this specific game.
You'd also have a hard time managing 60fps at that config anyway.
I tried looking for it, but TPU didn't check it, as you said.
That said, TPU doesn't have a particularly good track record for this. Some games start to lower texture quality or just not load textures at all, and it's not always acknowledged.
But TPU does "show" VRAM usage in a nice and concise way.
If I want to be asinine, I'd love it if TPU checked AMD and Intel GPU usage as well. Not as in-depth, but a couple of data points to check whether it matches.
 
Joined
May 10, 2023
Messages
341 (0.58/day)
Location
Brazil
Processor 5950x
Motherboard B550 ProArt
Cooling Fuma 2
Memory 4x32GB 3200MHz Corsair LPX
Video Card(s) 2x RTX 3090
Display(s) LG 42" C2 4k OLED
Power Supply XPG Core Reactor 850W
Software I use Arch btw
But TPU does "show" VRAM usage it in a nice and concise way.
To be fair, it's not easy to measure this at all without a debugger, which would also impact actual game performance.
The perfect point of comparison is models that come in different VRAM variants, like the 4060 Ti. But at this point there are plenty of known games that blow past an 8GB framebuffer.

So what's left is comparing a 3080 10GB vs a 12GB model, or even a 3080 Ti vs a 3090, and also vs a 4070 Ti, and seeing at which point their performance really diverges (which would still not account for the texture issues you mentioned).
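
For reference, here's a minimal sketch of the kind of number most tools actually report, using NVIDIA's NVML Python bindings (the nvidia-ml-py / pynvml package). It reads the device-level allocated figure, which is exactly the "allocated, not actually used" distinction above, not the working set a game touches each frame:

Code:
# Minimal sketch: read device-level VRAM numbers via NVML (pip install nvidia-ml-py).
# This is the "allocated" figure most overlays report, not what a game actually uses.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)    # first GPU in the system
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)     # .total / .free / .used, in bytes
    print(f"GPU 0: {mem.used / 2**30:.1f} GiB allocated of {mem.total / 2**30:.1f} GiB")
finally:
    pynvml.nvmlShutdown()

Even per-process NVML queries only report allocations, which is why comparing otherwise-identical cards with different VRAM sizes, as suggested above, is the more honest test.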
 
Joined
Aug 2, 2012
Messages
2,011 (0.44/day)
Location
Netherlands
System Name TheDeeGee's PC
Processor Intel Core i7-11700
Motherboard ASRock Z590 Steel Legend
Cooling Noctua NH-D15S
Memory Crucial Ballistix 3200/C16 32GB
Video Card(s) Nvidia RTX 4070 Ti 12GB
Storage Crucial P5 Plus 2TB / Crucial P3 Plus 2TB / Crucial P3 Plus 4TB
Display(s) EIZO CX240
Case Lian-Li O11 Dynamic Evo XL / Noctua NF-A12x25 fans
Audio Device(s) Creative Sound Blaster ZXR / AKG K601 Headphones
Power Supply Seasonic PRIME Fanless TX-700
Mouse Logitech G500S
Keyboard Keychron Q6
Software Windows 10 Pro 64-Bit
Benchmark Scores None, as long as my games runs smooth.
Hmm...

I expected the 5080 to have a bit more VRAM, seeing as it's a GPU aimed at ultrawide.
 