
Is 24 GB of video RAM worth it for gaming nowadays?

  • Yes

    Votes: 66 41.5%
  • No

    Votes: 93 58.5%

  • Total voters
    159
Joined
Jan 14, 2019
Messages
12,572 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
It's overkill for 4K and 8K at this point in time.
I would have happily gone with less, but I had no real choice at the time of purchase (GPU shortage, warranty return offer)


I really hope things like DirectStorage change this, because it'll make budget GPUs with more VRAM more useful. Preloading/streaming 16GB of data into a 3060 would certainly improve the experience vs using 4GB of it and hoping for the best.
Isn't DirectStorage all about negating the need for more VRAM by swapping out data as and when it's needed instead of storing all of it in the VRAM?
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.91/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Sasmsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
Is there anything new about it? Version 1.1 or 1.2?
Not yet. We've got the ability to transfer game content and hardware-decompress it, but games aren't using it fully yet.

Isn't DirectStorage all about negating the need for more VRAM by swapping out data as and when it's needed instead of storing all of it in the VRAM?
It won't need to swap if you have spare VRAM. It'll be like system RAM: content stays loaded, and the oldest/least-used data only gets swapped out when the space is needed.

I've got 20GB of cached content in Windows, used to speed things up - but if the RAM is needed, that's ditched instantly to free space.
It should behave the same: the game loads what's needed without disposing of anything while VRAM is free - but current methods are a bit clunky. DX9, for example, had to duplicate everything in system RAM and VRAM; DX10/11 ditched that. Now we're removing the CPU from processing the transfer, so in theory the GPU drivers can control when to purge instead of the game engine.

The simplest form would be "tell me what we need right now" and just keep streaming it in, only purging the least recently used content when full.
Seeing 4GB used on a 24GB card - with that 4GB changing content regularly - is an inefficient method used at present to cater to the weakest GPUs out there, because if the GPU driver isn't in control, the game engine simply can't tell how much VRAM it has.

DXDIAG is a great example of this, showing that I have either 24GB or ~56GB depending on how they count it:
(attached: DXDIAG screenshot)
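That "purge only the least-recently-used content when full" policy can be sketched in a few lines - a toy model with made-up asset sizes, not real driver behaviour:

```python
from collections import OrderedDict

class VramCache:
    """Toy model of the streaming policy described above: keep everything
    cached while VRAM is free, evict only the least-recently-used asset
    when space runs out. Sizes in GB; purely illustrative."""

    def __init__(self, capacity_gb):
        self.capacity = capacity_gb
        self.used = 0
        self.assets = OrderedDict()  # name -> size, ordered oldest-first

    def load(self, name, size_gb):
        if name in self.assets:
            self.assets.move_to_end(name)  # touched again: now most recent
            return
        # Purge least-recently-used content only when full
        while self.used + size_gb > self.capacity:
            _, old_size = self.assets.popitem(last=False)
            self.used -= old_size
        self.assets[name] = size_gb
        self.used += size_gb
```

On a 24 GB card nothing would ever get evicted until the working set actually exceeded 24 GB - which is exactly the "use spare VRAM like a cache" behaviour being described.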
 
Joined
Apr 30, 2020
Messages
999 (0.59/day)
System Name S.L.I + RTX research rig
Processor Ryzen 7 5800X 3D.
Motherboard MSI MEG ACE X570
Cooling Corsair H150i Cappellx
Memory Corsair Vengeance pro RGB 3200mhz 32Gbs
Video Card(s) 2x Dell RTX 2080 Ti in S.L.I
Storage Western digital Sata 6.0 SDD 500gb + fanxiang S660 4TB PCIe 4.0 NVMe M.2
Display(s) HP X24i
Case Corsair 7000D Airflow
Power Supply EVGA G+1600watts
Mouse Corsair Scimitar
Keyboard Cosair K55 Pro RGB
Not yet. We've got the ability to transfer game content and hardware-decompress it, but games aren't using it fully yet.


It won't need to swap if you have spare VRAM. It'll be like system RAM: content stays loaded, and the oldest/least-used data only gets swapped out when the space is needed.

I've got 20GB of cached content in Windows, used to speed things up - but if the RAM is needed, that's ditched instantly to free space.
It should behave the same: the game loads what's needed without disposing of anything while VRAM is free - but current methods are a bit clunky. DX9, for example, had to duplicate everything in system RAM and VRAM; DX10/11 ditched that. Now we're removing the CPU from processing the transfer, so in theory the GPU drivers can control when to purge instead of the game engine.

The simplest form would be "tell me what we need right now" and just keep streaming it in, only purging the least recently used content when full.
Seeing 4GB used on a 24GB card - with that 4GB changing content regularly - is an inefficient method used at present to cater to the weakest GPUs out there, because if the GPU driver isn't in control, the game engine simply can't tell how much VRAM it has.

DXDIAG is a great example of this, showing that I have either 24GB or ~56GB depending on how they count it:
View attachment 288616
The driver doesn't do very much in DX12, as DX12 is a lower-level API compared to the older ones.

If it had nearly as much control as in your example, it would lead to stalls and overflows, with massive latency penalties from constantly reading and writing data. Reading and writing data takes time; it's not free. (Overflows can come from almost anything: latency, execution, or RAM floods.)

The problem would look like this:
(GPU) where is x-y-b?
when the (GPU) needs to finish x-y-d after x-y-b,
while x-y-b needs to have b written again
before it's read for y-x-b. (That's where the read & write latency hit happens.)

Feeding data in and out all the time induces latency, while reading only from VRAM can be faster, as it's one action and not two. That traffic also has to come through the same bus the CPU communicates on.
The reason Resizable BAR exists is to eliminate the need for those constant writes; otherwise we could still get away with cards exposing a 256 MB window, and the increase to a 4 GB address space would never have been needed in the first place.

The majority of problems come from older code, from older DX standards still present in DX12, that always copies into system RAM while copying into VRAM. This happens so that anything that doesn't fit in VRAM can still be reached. With Resizable BAR, the GPU should be able to call on the CPU and fetch it from system memory instead - and that's where another latency hit happens, because it's more than one stop to get the information needed.
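The "one stop vs two stops" point can be put as simple hop-counting - the unit costs below are invented for illustration, not measured:

```python
# Toy latency model: fetching data already resident in VRAM is one action;
# fetching it across the shared bus via system RAM is two. Costs are
# made-up relative units, not real measurements.
HOP_COST = {"vram_read": 1, "ram_read": 3, "pcie_transfer": 10}

def access_cost(path):
    """Total cost of one access along a list of hops."""
    return sum(HOP_COST[hop] for hop in path)

resident = access_cost(["vram_read"])                  # data kept in VRAM
streamed = access_cost(["ram_read", "pcie_transfer"])  # fetched via CPU/RAM
# resident < streamed: every extra stop adds latency
```

Whatever the real numbers are on a given system, the shape of the argument is the same: the streamed path pays every hop on every miss, the resident path pays one.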
 
Joined
Jul 14, 2018
Messages
473 (0.20/day)
Location
Jakarta, Indonesia
System Name PC-GX1
Processor i9 10900 non K (stock) TDP 65w
Motherboard asrock b560 steel legend | Realtek ALC897
Cooling cooler master hyper 2x12 LED turbo argb | 5x12cm fan rgb intake | 3x12cm fan rgb exhaust
Memory corsair vengeance LPX 2x32gb ddr4 3600mhz
Video Card(s) MSI RTX 3080 10GB Gaming Z Trio LHR TDP 370w| 566.36 WHQL | MSI AB v4.65 | RTSS v7.36
Storage NVME 2+2TB gen3| SSD 4TB sata3 | 1+2TB 7200rpm sata3| 4+4+5TB USB3 (optional)
Display(s) AOC U34P2C (IPS panel, 3440x1440 75hz) + speaker 5W*2 | APC BX1100CI MS (660w)
Case lianli lancool 2 mesh RGB windows - white edition | 1x dvd-RW usb 3.0 (optional)
Audio Device(s) Nakamichi soundstation8w 2.1 100W RMS | Simbadda CST 9000N+ 2.1 352W RMS
Power Supply seasonic focus gx-850w 80+ gold - white edition 2021 | APC BX2200MI MS (1200w)
Mouse steelseries sensei ten | logitech g440
Keyboard steelseries apex 5 | steelseries QCK prism cloth XL | steelseries arctis 5
VR HMD -
Software dvd win 10 home 64bit oem + full update 22H2
Benchmark Scores -
VRAM usage at 2160p = 14 GB...

So, is the RTX 4070 Ti 12 GB obsolete at 2160p??
 

Mussels

Freshwater Moderator
The driver doesn't do very much in DX12, as DX12 is a lower-level API compared to the older ones.
DirectStorage - I specifically mentioned that as how I want things to change.

The OS controls that, and it's aware of RAM, VRAM, page file sizes and usage, and already does caching - it should be well able to load game data in advance, with the game containing information on what to load and keep hold of.
 
Joined
Jul 14, 2018
1.

Ultra

At 1920x1080, VRAM consumption: 8 GB cards use 8 GB, 12 GB cards 10 GB, 16 GB cards 11 GB, 24 GB cards 11 GB.

At 2560x1440: 8 GB cards use 8 GB, 12 GB cards 11 GB, 16 GB cards 12 GB, 24 GB cards 12 GB.

At 3840x2160: 8 GB cards use 8 GB, 12 GB cards 12 GB, 16 GB cards 14 GB, 24 GB cards 14 GB.

2.

RT

At 1920x1080: 8 GB cards use 7 GB, 12 GB cards 9 GB, 16 GB cards 8 GB, 24 GB cards 10 GB.

At 2560x1440: 8 GB cards use 8 GB, 12 GB cards 10 GB, 16 GB cards 9 GB, 24 GB cards 10 GB.

At 3840x2160: 12 GB cards use 10 GB, 16 GB cards 10 GB, 24 GB cards 11 GB.

3.

Ultra

At 1920x1080: 8 GB cards use 7 GB, 12 GB cards 10 GB, 16 GB cards 11 GB, 24 GB cards 13 GB.

At 2560x1440: 8 GB cards use 7 GB, 12 GB cards 10 GB, 16 GB cards 12 GB, 24 GB cards 14 GB.

At 3840x2160: 8 GB cards use 8 GB, 12 GB cards 10 GB, 16 GB cards 13 GB, 24 GB cards 15 GB.

4.

Ultra + RT

At 1920x1080: 8 GB cards use 7 GB, 12 GB cards 10 GB, 16 GB cards 9 GB, 24 GB cards 11 GB.

At 2560x1440: 8 GB cards use 7 GB, 12 GB cards 10 GB, 16 GB cards 10 GB, 24 GB cards 11 GB.

At 3840x2160: 8 GB cards use 8 GB, 12 GB cards 11 GB, 16 GB cards 12 GB, 24 GB cards 13 GB.

To play these four PC games safely, disable RT and use DLSS... Also, play at below 1440p (16:9), just to be safe and smooth...
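A notable pattern in those figures is that the game scales its footprint to the card. A quick sketch of the headroom arithmetic, with the 4K Ultra numbers from game 1 above hard-coded (copied from the post, not re-measured):

```python
# 4K (3840x2160) Ultra VRAM consumption from game 1 above, in GB,
# keyed by card size. Figures copied from the post.
usage_4k_ultra = {8: 8, 12: 12, 16: 14, 24: 14}

def headroom_gb(card_gb):
    """VRAM left spare at 4K Ultra for a card of the given size."""
    return card_gb - usage_4k_ultra[card_gb]

# 8 and 12 GB cards are completely full here; 16 GB keeps 2 GB spare,
# 24 GB keeps 10 GB spare.
```

The zero-headroom cases (8 and 12 GB at 4K) are exactly where the "is a 12 GB 4070 Ti obsolete at 2160p?" question comes from.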
 
Joined
Jan 14, 2019
VRAM usage at 2160p = 14 GB...

So, is the RTX 4070 Ti 12 GB obsolete at 2160p??
Just because a game occupies X amount of VRAM when it can doesn't mean it won't run smoothly with less available. What you see in monitoring tools is VRAM allocation. Actual usage is a different matter. When you go down in VRAM size and start to see stutters is when you're actually running out of VRAM. I wouldn't be surprised to see the majority of high-performance 8-12 GB cards (3070, 3080, 6700 XT) do well in games where you see more VRAM used on a 4090.
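The allocation-vs-usage distinction fits in one line: a card only truly runs out when the *actively used* working set exceeds its VRAM, not when a bigger card's allocation figure does. The 9 GB working set below is a hypothetical number for illustration:

```python
def runs_smoothly(working_set_gb, card_vram_gb):
    """A game stutters from VRAM pressure only when the data it actively
    touches exceeds the card's VRAM; mere allocation on a bigger card
    doesn't count against a smaller one."""
    return working_set_gb <= card_vram_gb

allocated_on_24gb_card = 14  # what a monitoring tool reports on a 4090
actual_working_set = 9       # hypothetical: what the game really touches

# The 14 GB reading alone doesn't obsolete a 12 GB card:
ok = runs_smoothly(actual_working_set, 12)
```

Only when the real working set itself crosses 12 GB does the smaller card start to stutter - which is the "go down in VRAM size until you see stutters" test described above.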
 

Mussels

Freshwater Moderator
That's the big problem at the moment: seeing allocation vs usage.

Saw some news today that MS just pushed out a DX12 update to improve upon what DX10 started so long ago (and DirectStorage is another step towards): not running content through system RAM before it reaches VRAM.


I'll spoiler my massive infodump, because it's a history of how this played out in the past from DX9 through DX12 Ultimate, with the important links at the bottom if you just want the news.

My view is: this will work, and work wonderfully, and then game devs will fuck it up by using a 5% performance gain as an excuse not to bother optimising anything else at all.
*DX9 had to duplicate things, so system RAM performance mattered a heap. This was because most of the alterations were done by the CPU, from decompression to texture swapping.
*DX10 began a de-sync of this, using the CPU less and letting the GPU do more: data was decompressed into system RAM, then moved to VRAM.
Texture arrays enabled swapping of textures on the GPU without CPU intervention.
Direct3D 9 had a limitation on the number of draw calls because of the CPU cost per draw call; in Direct3D 10, the cost of each draw call was reduced.
Then gaming websites put out articles saying DX10 didn't look any better:
"DX10 doesn't look any better, it's not worth it" - and that hurt GPU sales.

Suddenly, game devs had to implement features that made DX10 LOOK better, when its goal was to PERFORM better.

This was rarely ever discussed, except that it was a big feature of Aero in Windows 7 at the time: Aero ran DX10.1 hardware-accelerated on the GPU, *or* in a software-accelerated mode, since Intel and Nvidia hadn't quite caught up at the time of release and only had 'compatible' hardware.
If you had true DX10.1 hardware to move the rendering of the OS to the GPU, it felt more responsive and used fewer resources - it made those crappy Intel Atom netbooks feel a whole lot smoother.

DX10 and DX11 titles suffered the Crysis problem here:
Instead of using these features to improve performance, game devs crammed every new lighting effect into their games, RTX-style, and tanked performance instead.
Company of Heroes was the first DX10 game (added via patch), and it had *major* problems vs its DX9 rendering path - but you could run a single command-line switch to fix it all, because they had added physics-enabled flowers and rocks that ran on the CPU, negating all the performance gains.

DirectX 9 vs. DirectX 10 - Real World DirectX 10 Performance: It Ain't Pretty (anandtech.com)
CoH ran at about 50% of DX9 performance - but a single command, "-nolitter", would fix it, because DX10 added LITERAL TRASH to the game world with software-based physics interactions that undid any and all performance gains. Weirdly, almost all references to this command seem to have vanished online, despite it being a huge deal when it was introduced.

Some follow-up titles (the AnandTech link above) showed performance increases, but they were sadly the minority as we reached the Crysis point, because that horrible, nasty performance impact of making a game look a tiny bit better did one thing and one thing only:
It made the game controversial. It made the game newsworthy. It became profitable to have ultra settings that ran like shit, because it made people who COULD run it brag about it, started upgrade envy, and became marketing material for new GPUs as a requirement to play these games "properly".

*Now DX12 is allowing the CPU and GPU access at the same time (to different parts), thanks to ReBAR (see link below).

Up next is DirectStorage, allowing the GPU direct access to data from NVMe, bypassing CPU and RAM entirely - while the new DX12 feature would still let the CPU access and alter that content if needed, despite being removed from the process as it has been until now. That access would be critical for anything currently done 'in transit', so this feature lets them 'bugfix' more easily if there are problems with the new method.

This seems to be all about lowering CPU and RAM requirements without upping GPU requirements, while DirectStorage can help prevent issues with low-VRAM GPUs (by swapping new data in at high speed, with zero CPU and RAM overhead).

All they need next is for GPUs to cache more data in advance to use spare VRAM - but that runs the risk of increased power consumption, temperatures and even lower performance, so it's not something they'll leap at. My 3090 would use another 50W to run all the VRAM (like it did with mining) - and that 50W comes out of the GPU's power budget, harming performance.



Agility SDK 1.710.0-preview: GPU Upload Heaps and Non-normalized Sampling - DirectX Developer Blog (microsoft.com)

DirectX 12 Update Allows CPU and GPU to Access VRAM Simultaneously | Tom's Hardware (tomshardware.com)

To be clear, this works *now* without DirectStorage - it's just a patch away from games. Nvidia and Intel already support it, as long as you have ReBAR.
With this feature, a game's RAM and CPU utilization could decrease noticeably due to a reduction in data transfers alone. This is because the CPU no longer needs to keep copies of data on both system RAM and GPU VRAM to interact with it. Another bonus is that GPU video memory is very fast these days, so there should be no latency penalties for leaving data on the GPU alone. In fact, there will probably be a latency improvement with CPU access times on high-end GPUs with high-speed video memory.

For gamers, the only requirement you'll need is Resizable-Bar or Smart Access Memory support on both your CPU and GPU. Resizable-bar is the foundation for GPU Upload Heaps since the feature enables Windows to manage GPU VRAM directly.

For developers, the feature is already supported by Nvidia, Intel, and AMD drivers. For example, it's already included in Nvidia's latest Game Ready and Studio Drivers (version 531.41 or newer) and Intel A-series/Xe GPUs (with driver 31.0.101.4255 or newer). For AMD GPUs, developers must consult their AMD alliance manager to get a supported driver.
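The win described in those articles can be reduced to copy-counting: the classic path stages every asset in system RAM before it reaches VRAM, while an upload-heap-style path places it once. A toy model of that (not the actual D3D12 API):

```python
# Toy copy-counting model of the two loading paths discussed above.
PATHS = {
    "classic": ["storage->ram", "ram->vram"],  # CPU keeps a system-RAM copy
    "upload_heap": ["storage->vram"],          # ReBAR-style direct placement
}

def data_moved_gb(asset_gb, path):
    """Total gigabytes moved to get one asset into VRAM on a given path."""
    return asset_gb * len(PATHS[path])

# Loading the same 2 GB asset moves 4 GB on the classic path but only
# 2 GB on the direct path - that's the RAM and CPU saving in a nutshell.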
 

Mussels

Freshwater Moderator
I just did a massive spoilered ramble, edited into that post - 30 minutes sidetracked hunting down a lost piece of gaming history: a startup command from Company of Heroes that was SUPER IMPORTANT AND I NEEDED TO REMEMBER IT, DAMNIT

Summary: MS releases thing to boost performance, it works
Game devs: We gotta make money back from putting that in, make the game prettier in screenshots. NO. MATTER. THE. COST.
Media: All news is good news - can we spend 15 years talking about how your game runs like ass?
 
Joined
Sep 17, 2014
Messages
22,666 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
It also has 800% (or 8 times) more L2 cache than the RTX 3090 has. This depends on the architecture, or how it utilizes its own VRAM.
And it generates stutter in multiple games now, even at resolutions/settings from 2015.

You are right though, in general. But the outliers are what make or break GPUs.

GPU future-proofing is not a good idea. PSU, maybe.
But it's better to upgrade more frequently than to buy this much extra VRAM. It will cost less, IMO.
By the time your additional VRAM becomes useful, a much better GPU with a lower price/watt will have hit the market.
My 8GB 1080 would like a word, alongside the rest of the Pascal stack... it still runs anything, 7 years post-release. Even as the FPS goes down and VRAM usage creeps closer to 8GB over the years, the games stay stutter-free.
 

Mussels

Freshwater Moderator
My 8GB 1080 would like a word
Can confirm, my GTX1080 is quite happy too. I can even do 4K gaming in a few titles, although not at ultra settings if they're current-gen. FSR is obviously a thing, and a few games can be hacked to have DLSS work on it too
 
Joined
Mar 19, 2023
Messages
153 (0.24/day)
Location
Hyrule Castle, France
Processor Ryzen 5600x
Memory Crucial Ballistix
Video Card(s) RX 7900 XT
Storage SN850x
Display(s) Gigabyte M32U - LG UltraGear+ 4K 28"
Case Fractal Design Meshify C Mini
Power Supply Corsair RM650x (2021)
Is 24 GB of video RAM worth it for gaming nowadays?
How is that even a question? :D
The biggest games you'll find will barely scratch 16 GB.
By the time 24 GB gets used, the chips accompanying that VRAM will be very outdated.

IMO, if Nvidia is barely meeting VRAM requirements, AMD is overly cautious with it. Sometimes that makes great packages like the 6700 XT, which I still think is some sort of golden ticket to cheap but acceptable 1440p; sometimes I think it's silly, like with the 7900 XT, where you have 20 GB in a package that currently basically equals the 4070 Ti. Even if you believe in FineWine, the 7900 XT will probably not beat a 4080 or even equal it, and that card is sitting solid with 16 GB for a few years yet (3-4 tops, but that's enough). In 4 years, the 7900 XT's chip will have been beaten by basically any mid-range chip from RDNA4/Blackwell and later generations.

To me, the right balance is something like 16 GB on any mid-range and top card, with the extra RAM being used for productivity workloads - which AMD generally doesn't support nearly as well as Nvidia...
 

Lei

Joined
Jul 3, 2021
Messages
1,143 (0.90/day)
Location
usually in my shirt
Processor 3900x - Bykski waterblock
Motherboard MSI b450m mortar max BIOS Date 27 Apr 2023
Cooling αcool 560 rad - 2xPhanteks F140XP
Memory Micron 32gb 3200mhz ddr4
Video Card(s) Colorful 3090 ADOC active backplate cooling
Storage WD SN850 2tb ,HP EX950 1tb, WD UltraStar Helioseal 18tb+18tb
Display(s) 24“ HUION pro 4k 10bit
Case aluminium extrusions copper panels, 60 deliveries for every piece down to screws
Audio Device(s) sony stereo mic, logitech c930, Gulikit pro 2 + xbox Series S controller, moded bt headphone 1200mAh
Power Supply Corsair RM1000x
Mouse pen display, no mouse no click
Keyboard Microsoft aio media embedded touchpad (moded lithium battery 1000mAh)
Software Win 11 23h2 build 22631
Benchmark Scores cine23 20000
@Vayra86 @Mussels
Yeah, but there wasn't a GTX 1090 with 24 GB of VRAM in Pascal times.
Do you think that 7 years after the 4090, you'd appreciate its extra VRAM?
 
Joined
Jun 1, 2011
Messages
4,677 (0.94/day)
Location
in a van down by the river
Processor faster at instructions than yours
Motherboard more nurturing than yours
Cooling frostier than yours
Memory superior scheduling & haphazardly entry than yours
Video Card(s) better rasterization than yours
Storage more ample than yours
Display(s) increased pixels than yours
Case fancier than yours
Audio Device(s) further audible than yours
Power Supply additional amps x volts than yours
Mouse without as much gnawing as yours
Keyboard less clicky than yours
VR HMD not as odd looking as yours
Software extra mushier than yours
Benchmark Scores up yours
How is that even a question? :D
The biggest games you'll find will barely scratch 16 GB.
By the time 24 GB gets used, the chips accompanying that VRAM will be very outdated.
This. You are singling out one part of a video card as if it's the whole thing. It's like saying any CPU without 8 MB of cache is not "FUTURE PROOF".
 

Mussels

Freshwater Moderator
@Vayra86 @Mussels
Yeah, but there wasn't a GTX 1090 with 24 GB of VRAM in Pascal times.
You think that 7 years after the 4090, you'd appreciate its extra VRAM?
The GTX 1080 Ti with its 11 GB was far, FAR ahead of its time, and people who still own one are still thoroughly enjoying it.

The RX 480/580 8GB was exactly the same until the mining boom sucked them all up; they aged better than the lower-VRAM variants, and the performance difference (especially the minimum FPS) became very clear as the years went on

(Taken from a random YT video)
61 to 67FPS? Meh!
29 to 46 on the 0.1%? That's the difference between stuttering mess and playable.
1680438522183.png
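For context on those 0.1% figures: "lows" are usually computed from per-frame times rather than the average FPS counter — the average over the slowest fraction of frames. A minimal sketch (the frame times below are made-up numbers, not from the video):

```python
def percentile_low_fps(frame_times_ms, pct=0.1):
    """Average FPS over the slowest pct% of frames (e.g. the '0.1% low')."""
    worst = sorted(frame_times_ms, reverse=True)   # slowest frames first
    n = max(1, int(len(worst) * pct / 100))        # how many frames pct% covers
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# 1000 mostly-smooth 60 FPS frames with a handful of 35 ms stutters:
times = [16.7] * 995 + [35.0] * 5
print(round(percentile_low_fps(times, 1.0)))   # 1% low: prints 39
print(round(percentile_low_fps(times, 0.1)))   # 0.1% low: prints 29
```

This is why a card can post a fine average yet feel like a stuttering mess: a few slow frames barely move the mean but dominate the percentile lows.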
 
Joined
Sep 17, 2014
Messages
22,666 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
@Vayra86 @Mussels
Yeah, but there wasn't a GTX 1090 with 24 GB of VRAM in Pascal times.
You think that 7 years after the 4090, you'd appreciate its extra VRAM?
Well, the takeaway here is that the Pascal stack was properly balanced between VRAM and core.

The fastest card (pseudo x90?) in the stack was the 1080 Ti, which carried about 30% more perf for about 30% more VRAM, with 11 GB on board. It runs like a charm just the same. This trend trickles down all the way to the 1060 6GB and only gets 'neutered' downwards from there. You could say the 1070 and 1060 6GB had too much, in fact. Another thing that was properly balanced is the bus width and resulting bandwidth. Consider this set of specs to compare:

1080:
1680458279772.png

4070ti (the supposed 4080 'LE')
1680458330964.png

Here's the kicker:

1680458357788.png


I suppose in comparison, in part due to the use of cache, the 4080 and 4090 at 16 and 24 GB could be in a good place; the gap in perf between them is similar to the gap in VRAM. But when you then notice that titles can already saturate 12 GB on much less powerful cards, you could also defend the idea that a few years from now the 4080, at least, will be starved or closing in on it, and you could definitely push it into situations where core perf is sufficient but VRAM is not. Often? Probably not. But occasionally is enough for me to say planned obsolescence and stay away.

And then where is the 4090? Well, we know that exceeding the required VRAM isn't nearly as punishing to performance as dropping below the requirement for any game, and we also know that resolution increases don't really tax VRAM that much more. The 4090 may, and likely will, last that much longer for it. Another fact in that sense is that 24 GB is 50% more VRAM than the high-end 'norm' of 16 GB, which I think we can easily concede today. The norm only moves up slowly, and all games will be tailored to meet it, as nobody wants to follow Crytek down the drain.

Don't get me wrong, I don't see the 4080, and definitely not the 4090, as cards that will turn obsolete fast. But the 4080 is on that scale, somewhere, and below it in the stack it's going to be a bloodbath akin to the 1060 3GB. I consider it likely the 4070 Ti will turn out a much, much worse 3080 10GB, because for all its shitty VRAM cap, that one does have 760 GB/s of bandwidth. That's a very tall order for the 4070 Ti's cache.
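The balance argument above boils down to simple arithmetic: compare each stack's VRAM gap between its top two cards against the roughly 30% performance gap the post cites. A quick sketch (the perf figures are the post's rough estimates, not benchmarks):

```python
def pct_gap(base, other):
    """Percentage increase from base to other."""
    return (other / base - 1) * 100

# VRAM gap between the top two cards in each stack:
print(pct_gap(8, 11))    # 1080 -> 1080 Ti: 37.5 (% more VRAM)
print(pct_gap(16, 24))   # 4080 -> 4090:    50.0 (% more VRAM)
# Both pair with a perf gap of roughly 30%, so VRAM scales
# at or slightly ahead of core performance at the top of each stack.
```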

The driver doesn't do very much in DX12, since DX12 is a lower-level API compared to the older ones.

If it had anywhere near as much control as in your example, it would lead to stalls and overflows, with massive latency penalties from constantly reading and writing data. Reads and writes take time; they're not free. (Overflows can come from anything: latency, execution, or RAM floods.)

The problem would look like this:
(GPU) where is x-y-b?
when (GPU) needs to finish x-y-d after x-y-b,
while x-y-b needs to have b written again
before it's read for y-x-b. (That's where the read/write latency hit happens.)

Streaming data in and out all the time induces latency, while reading straight from VRAM can be faster because it's one action, not two. That traffic also has to come over the same bus the CPU communicates on.
The reason Resizable BAR exists is to eliminate the need for those constant writes; otherwise we could still get away with cards exposing a 256 MB window, and the increase to a 4 GB address space would never have been needed in the first place.

The majority of problems come from older code, carried over from older DX standards into DX12, that always copies into system RAM while copying into VRAM, just in case something doesn't fit in VRAM. With Resizable BAR, the GPU should be able to call on the CPU to fetch it from memory directly. That's where another latency hit happens, because it's more than one stop to get the information needed.
Which kinda says to us, you're left at the mercy of developer TLC to cater to Nvidia's low VRAM.
 
Last edited:
Joined
Dec 25, 2020
Messages
7,013 (4.81/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
DXDIAG is a great example of this, showing that I have either 24 GB or ~56 GB depending on how they count it
View attachment 288616

Both are correct. WDDM will allocate up to 50% of physical system RAM as VRAM. You get around 57000 MB because you have 64 GB of RAM installed. That is, the ~24000 MB physically present in the 3090 + 32700 MB or so of dynamically allocatable VRAM. If you lower to 32 GB, then that becomes a 16 GB allowance and your total memory will go down to around 40000 MB maximum. Here is my laptop:

1680459587702.png


4 GB physical + 50% of 16 GB RAM = around 12 GB addressable.
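The arithmetic WDDM applies here is simple enough to sketch (the function name is made up; the 50% shared-memory allowance is the behaviour described above):

```python
def total_graphics_memory_mb(dedicated_vram_mb, system_ram_mb):
    """WDDM-style total: dedicated VRAM plus a shared allowance of ~50% of system RAM."""
    shared_mb = system_ram_mb // 2
    return dedicated_vram_mb + shared_mb

# RTX 3090 (24 GB dedicated) with 64 GB of system RAM:
print(total_graphics_memory_mb(24_576, 65_536))  # 57344, roughly the ~57000 MB DXDIAG reports
# 4 GB laptop GPU with 16 GB of system RAM:
print(total_graphics_memory_mb(4_096, 16_384))   # 12288, i.e. around 12 GB addressable
```

Note the shared portion is an allowance, not a free performance win: anything spilled into system RAM is accessed over PCIe, far slower than on-board VRAM.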
 
Joined
Apr 30, 2020
Messages
999 (0.59/day)
System Name S.L.I + RTX research rig
Processor Ryzen 7 5800X 3D.
Motherboard MSI MEG ACE X570
Cooling Corsair H150i Cappellx
Memory Corsair Vengeance pro RGB 3200mhz 32Gbs
Video Card(s) 2x Dell RTX 2080 Ti in S.L.I
Storage Western digital Sata 6.0 SDD 500gb + fanxiang S660 4TB PCIe 4.0 NVMe M.2
Display(s) HP X24i
Case Corsair 7000D Airflow
Power Supply EVGA G+1600watts
Mouse Corsair Scimitar
Keyboard Cosair K55 Pro RGB
The GTX 1080 Ti with its 11 GB was far, FAR ahead of its time, and people who still own one are still thoroughly enjoying it.

The RX 480/580 8GB was exactly the same until the mining boom sucked them all up; they aged better than the lower-VRAM variants, and the performance difference (especially the minimum FPS) became very clear as the years went on

(Taken from a random YT video)
61 to 67FPS? Meh!
29 to 46 on the 0.1%? That's the difference between stuttering mess and playable.
View attachment 290119
Is that picture from while they were benchmarking it?


The easiest way to think about usage vs allocation:
Allocation is the amount of available/usable space the GPU requests to have open while the program is running. The program may or may not use all of it (it can be up to twice what is actually used); the headroom is there to prevent overflows and avoid spilling back into the CPU's RAM.
Usage is how much of that available space the program is actually using while it runs.
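The allocation-vs-usage distinction can be modelled with a toy reservation pool (the class and numbers below are made up for illustration; real drivers manage this internally):

```python
class VramPool:
    """Toy model: a game reserves (allocates) more VRAM than it actively uses."""

    def __init__(self, allocated_mb):
        self.allocated_mb = allocated_mb  # what monitoring tools typically report
        self.used_mb = 0                  # what the game's assets actually occupy

    def load_asset(self, size_mb):
        if self.used_mb + size_mb > self.allocated_mb:
            raise MemoryError("asset would overflow the reservation")
        self.used_mb += size_mb

pool = VramPool(allocated_mb=8_000)  # "8 GB allocated" in an overlay
pool.load_asset(3_000)               # only 3 GB of textures resident so far
print(pool.allocated_mb, pool.used_mb)  # prints: 8000 3000
```

This is why a tool reporting "8 GB used" doesn't necessarily mean the game would stutter on a 6 GB card; only the usage figure has to fit.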
 
Joined
Jan 14, 2019
Messages
12,572 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Is that picture from while they were benchmarking it?


The easiest way to think about usage vs allocation:
Allocation is the amount of available/usable space the GPU requests to have open while the program is running. The program may or may not use all of it (it can be up to twice what is actually used); the headroom is there to prevent overflows and avoid spilling back into the CPU's RAM.
Usage is how much of that available space the program is actually using while it runs.
This, not to mention some game assets may only be stored in the VRAM for further use, not affecting immediate game performance.
 
Joined
Dec 12, 2020
Messages
1,755 (1.19/day)
My 1080ti, even overclocked, doesn't work well in Control at 1440p -- I had to reduce the render resolution to 1080p from 1440p -- otherwise when dust and debris go flying the FPS tanks badly. This is the only game that had problems like that at 1440p, but there are a few games where a steady 60 FPS doesn't happen at 1440p.

The only game that allocates all my VRAM is RE7 and I can see why, the textures are disgustingly realistic.
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.91/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W)
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
I was gonna make a post on how Nvidia's product stacks work, but I'm changing my mind and summarising it as "they don't"


They start with a big GPU die and sell it as smaller variants when various parts are defective, like GP104 supporting 8 GB of RAM over a 256-bit bus

Using TPU's listing, for laziness.
Note how the main change between the first models was simple (fewer shaders, cutting parts down), but over time things didn't work out cost-wise, or maybe they'd already installed GDDR5X onto the boards before a failure was detected, and the stacks get WEIRD, sometimes with regional exclusives
1680499728344.png





Then they make smaller, cheaper cut-down designs later and do it all over again, like how the current 40 series uses higher-density RAM to use fewer modules overall, resulting in "why is my bus width so smol?!?" questions and all sorts of uneven comparisons between generations


Edit: real life got in the way; the rushed post missed the initial point:

Each generation is *weird* with VRAM values: the initial stack is logical-ish, and then higher-density RAM modules come out (or higher speeds aren't available in that density), meaning you get to choose between more VRAM and faster VRAM, and there's no real consensus on that. If the GPU performance is a lot lower to get that VRAM, it's likely not worth it unless DirectStorage changes things.
 
Last edited:
Joined
Sep 10, 2018
Messages
6,965 (3.03/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
I think people are overcomplicating this: if a GPU costs $400-600 it should have at least 12 GB of VRAM, if it costs $700-1200 it should come with at least 16 GB, and anything more expensive than that should come with 24 GB in 2023.

We as gamers should be voting with our wallets and expecting this as a minimum.

I'm not going to sit here and say what people should have purchased, but anyone who decided to grab a 6700 XT/6800 over a 3070/3070 Ti will likely have a much better gaming experience in general while spending around the same. Sure, the Nvidia options have superior RT and I much prefer DLSS, but does that matter if they don't have enough VRAM to actually run the latest games with ultra textures etc.? I know what the Nvidia fanboys will say: just lower settings. But to me that's ridiculous considering both GPUs cost well over 500 USD just one generation ago. It just feels bad.

Although I guess you could say Nvidia loyalists, me included, deserve this: they've been gimping VRAM since at least the 600 series, probably longer.

Considering how much Nvidia seems to want to push the market forward with new and improved technologies, it confuses me why they are so stingy with VRAM, other than to make people upgrade sooner than they'd otherwise have to if the GPUs had adequate VRAM.

If the 3070 Ti came with 16 GB and the 3080 had 20 GB, both those GPUs would have been viable much longer, and even in edge cases where a game isn't well optimised they could just brute-force it. Even if Nvidia wanted more money for such models, they could at least have given AIBs the choice to sell slightly more expensive variants.
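The price tiers above reduce to a simple rule of thumb. A quick sketch (the tiers are this post's opinion, not any official spec; the sub-$400 return value is my own assumption for the range the post doesn't cover):

```python
def min_vram_gb(price_usd):
    """Minimum VRAM (GB) the post argues a 2023 GPU should ship with, by price."""
    if price_usd > 1200:
        return 24
    if price_usd >= 700:
        return 16
    if price_usd >= 400:
        return 12
    return 8  # assumed floor for budget cards (not stated in the post)

print(min_vram_gb(549))   # prints 12
print(min_vram_gb(1199))  # prints 16
print(min_vram_gb(1600))  # prints 24
```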
 
Joined
Jul 14, 2018
Messages
473 (0.20/day)
Location
Jakarta, Indonesia
System Name PC-GX1
Processor i9 10900 non K (stock) TDP 65w
Motherboard asrock b560 steel legend | Realtek ALC897
Cooling cooler master hyper 2x12 LED turbo argb | 5x12cm fan rgb intake | 3x12cm fan rgb exhaust
Memory corsair vengeance LPX 2x32gb ddr4 3600mhz
Video Card(s) MSI RTX 3080 10GB Gaming Z Trio LHR TDP 370w| 566.36 WHQL | MSI AB v4.65 | RTSS v7.36
Storage NVME 2+2TB gen3| SSD 4TB sata3 | 1+2TB 7200rpm sata3| 4+4+5TB USB3 (optional)
Display(s) AOC U34P2C (IPS panel, 3440x1440 75hz) + speaker 5W*2 | APC BX1100CI MS (660w)
Case lianli lancool 2 mesh RGB windows - white edition | 1x dvd-RW usb 3.0 (optional)
Audio Device(s) Nakamichi soundstation8w 2.1 100W RMS | Simbadda CST 9000N+ 2.1 352W RMS
Power Supply seasonic focus gx-850w 80+ gold - white edition 2021 | APC BX2200MI MS (1200w)
Mouse steelseries sensei ten | logitech g440
Keyboard steelseries apex 5 | steelseries QCK prism cloth XL | steelseries arctis 5
VR HMD -
Software dvd win 10 home 64bit oem + full update 22H2
Benchmark Scores -
This is only War Thunder...

using the 3 DLC HD texture packs, movie graphics setting, 2160p...

VRAM usage:

warplane battles = 13 GB
ground forces = 16 GB
warship battles = 18 GB

Please, can anyone play this old free-to-play MMO with an RTX 4070 Ti 12 GB or RTX 4080 16 GB for a 2-3 hour session, at the same graphics settings and resolution?
 