
Modded NVIDIA GeForce RTX 3070 With 16 GB of VRAM Shows Impressive Performance Uplift

Joined
May 17, 2021
Messages
3,005 (2.34/day)
Processor Ryzen 7 5700X
Motherboard B550 Elite
Cooling Thermalright Peerless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200MHz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
Not true. Check this out

That was another video that made no sense at all. You shouldn't test that on cards with 24GB; it's like upgrading from 16GB of RAM to 32GB, where all your applications suddenly report they are using more. And I'm not sure Afterburner or any similar software has ever reported real VRAM usage.
I don't get what is so difficult about testing things in realistic scenarios: not ultra settings, not 4090s at 1080p. It's infuriating.

I'm sorry, but they are sellers, all of them. They want to sell you their videos; they make a living from our clicks, and testing realistic scenarios is not click material.

Next we'll be testing Cyberpunk at 4K on a 3dfx, using no keyboard, with one hand tied behind my back, while drinking cola and eating pizza driving down the highway, because lols.
 

Outback Bronze

Super Moderator
Staff member
Joined
Aug 3, 2011
Messages
2,028 (0.42/day)
Location
Walkabout Creek
System Name Raptor Baked
Processor 14900k w.c.
Motherboard Z790 Hero
Cooling w.c.
Memory 48GB G.Skill 7200
Video Card(s) Zotac 4080 w.c.
Storage 2TB Kingston kc3k
Display(s) Samsung 34" G8
Case Corsair 460X
Audio Device(s) Onboard
Power Supply PCIe5 850w
Mouse Asus
Keyboard Corsair
Software Win 11
Benchmark Scores Cool n Quiet.
Wouldn't it be nice to upgrade the RAM on your GPU like we do on our mobos? :)
 
Joined
Sep 1, 2020
Messages
2,340 (1.52/day)
Location
Bulgaria
Wouldn't it be nice to upgrade the RAM on your GPU like we do on our mobos? :)
No. With the extreme temperatures around the GPU, and especially the frequent swings across a large range, plus the heat from the operation of the memory chips themselves, it is better to have the chips soldered.
 
Joined
Aug 12, 2019
Messages
2,176 (1.13/day)
Location
LV-426
System Name Custom
Processor i9 9900k
Motherboard Gigabyte Z390 Aorus Master
Cooling Corsair H150i
Memory 4x8GB 3200MHz Corsair
Video Card(s) Galax RTX 3090 EX Gamer White OC
Storage 500gb Samsung 970 Evo PLus
Display(s) MSi MAG341CQ
Case Lian Li Pc-011 Dynamic
Audio Device(s) Arctis Pro Wireless
Power Supply 850w Seasonic Focus Platinum
Mouse Logitech G403
Keyboard Logitech G110
That was another video that made no sense at all. You shouldn't test that on cards with 24GB; it's like upgrading from 16GB of RAM to 32GB, where all your applications suddenly report they are using more. And I'm not sure Afterburner or any similar software has ever reported real VRAM usage.
I don't get what is so difficult about testing things in realistic scenarios: not ultra settings, not 4090s at 1080p. It's infuriating.

I'm sorry, but they are sellers, all of them. They want to sell you their videos; they make a living from our clicks, and testing realistic scenarios is not click material.

Next we'll be testing Cyberpunk at 4K on a 3dfx, using no keyboard, with one hand tied behind my back, while drinking cola and eating pizza driving down the highway, because lols.

I think the whole point of the video is to see how much VRAM it uses, and the 4090 and 7900 XTX are the current-generation cards with the most VRAM... from low settings at 1080p and 4K to max settings at 1080p and 4K.
It might not be accurate, but it gives an indication of how VRAM is used and how much new games require at low and ultra settings.
 

Juancito

New Member
Joined
Dec 19, 2022
Messages
3 (0.00/day)
This just shows the bad value proposition of the RTX 4070. True, power consumption has gone down significantly from the RTX 3070 to the RTX 4070. However, if the RTX 3070 shows such performance with extra VRAM, it confirms what reviewers have been saying: the RTX 4070 is actually a 4050 or 4060, at a higher price than the 3070 launched at. I just sold my 3070 thinking of the 4070, but I will actually wait for AMD's next launch.
 
Joined
Sep 17, 2014
Messages
22,422 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Surely it totally depends on res and settings. You don't need more than 8GB if you're playing at 1080p.

You can probably just tweak settings at higher resolutions to stay within it. Easier than modding hardware.
Sure, you can run on low as well; why even bother buying a new GPU at all, right?

I don't get this sentiment at all. You pay through the nose for even a midrange GPU, and then people are content with all sorts of quality reductions to keep it afloat. Meanwhile, there are ALSO similarly priced midrangers that don't force you into that, today or in the next three to five years.
Like... why even bother to begin with? Just use your IGP and save money; 720p is, after all, just a setting, eh.
 
Joined
Sep 15, 2011
Messages
6,715 (1.39/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
Looks like 10GB is more than enough, since those games were pulling a maximum of ~9600MB of VRAM...
 
Joined
Sep 17, 2014
Messages
22,422 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Looks like 10GB is more than enough, since those games were pulling a maximum of ~9600MB of VRAM...
I'm already seeing north of 12GB in TW WH3 at sub-4K (3440x1440).
The engine isn't even using any new tech; it's just a big game.
 
Joined
May 17, 2021
Messages
3,005 (2.34/day)
Processor Ryzen 7 5700X
Motherboard B550 Elite
Cooling Thermalright Peerless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200MHz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
I think the whole point of the video is to see how much VRAM it uses, and the 4090 and 7900 XTX are the current-generation cards with the most VRAM... from low settings at 1080p and 4K to max settings at 1080p and 4K.
It might not be accurate, but it gives an indication of how VRAM is used and how much new games require at low and ultra settings.

A card with more VRAM will allocate more VRAM, and the usage numbers are not reliable; they are probably doing the same thing, because they scale in the same way. You don't get any useful information this way.
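
For what it's worth, you can poke at the allocation-vs-usage gap yourself. Here is a minimal sketch, assuming an NVIDIA card and the pynvml bindings (pip install nvidia-ml-py); the device-level "used" figure is the kind of number overlays report, and note that the per-process figures are often simply unavailable under Windows' WDDM driver, which is part of why these readouts are so slippery:

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

# Device-level view: total/used/free as the driver reports them.
# "used" is memory allocated on the card by all processes combined,
# not what any one game is actually touching.
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"total {mem.total / 2**30:.1f} GiB | "
      f"allocated {mem.used / 2**30:.1f} GiB | "
      f"free {mem.free / 2**30:.1f} GiB")

# Per-process view: usually the interesting number, but on Windows (WDDM)
# usedGpuMemory often comes back as None, i.e. the driver won't say.
for p in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    used = "n/a" if p.usedGpuMemory is None else f"{p.usedGpuMemory / 2**20:.0f} MiB"
    print(f"pid {p.pid}: {used}")

pynvml.nvmlShutdown()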
 
Joined
Sep 1, 2020
Messages
2,340 (1.52/day)
Location
Bulgaria
A card with more VRAM will allocate more VRAM, and the usage numbers are not reliable; they are probably doing the same thing, because they scale in the same way. You don't get any useful information this way.
Improved minimum frames and a smaller delta between minimum and maximum is useful information. There must be a reason for this... The only difference is VRAM size. Think! :)
 
Joined
May 17, 2021
Messages
3,005 (2.34/day)
Processor Ryzen 7 5700X
Motherboard B550 Elite
Cooling Thermalright Peerless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200MHz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
Improved minimum frames and a smaller delta between minimum and maximum is useful information. There must be a reason for this... The only difference is VRAM size. Think! :)

What are we talking about now? Brian's video? I see different minimums and maximums, and different 0.1% lows, using the same amount of VRAM (24GB) and just a different card, AMD or Nvidia.

The cards even allocate and use VRAM differently just by changing brands, with the same VRAM size.

How are you concluding anything about maximums and minimums based on VRAM size? Talking about the 8GB comparisons; the 4K numbers are a bit irrelevant for this discussion.
 
Joined
Dec 5, 2013
Messages
637 (0.16/day)
Location
UK
I don't get this sentiment at all. You pay through the nose for even a midrange GPU, and then people are content with all sorts of quality reductions to keep it afloat.
Whilst I agree it doesn't make sense to skimp on VRAM if you're paying £600+ for a new GPU intended for 2023-2025 AAA Ultra gaming, in reality many of us tweak settings anyway out of personal preference. I cannot stand Depth of Myopia, Chromatic Abhorration, "Supernova Bloom", stupefied head-bob, and similar effects; I think they look ridiculous far more often than they have ever 'added realism'. And I'd still turn them off even if I owned a 65,536-yottabyte VRAM GPU...
 
Joined
Dec 25, 2020
Messages
6,679 (4.68/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Whilst I agree it doesn't make sense to skimp on VRAM if you're paying £600+ for a new GPU intended for 2023-2025 AAA Ultra gaming, in reality many of us tweak settings anyway out of personal preference. I cannot stand Depth of Myopia, Chromatic Abhorration, "Supernova Bloom", stupefied head-bob, and similar effects; I think they look ridiculous far more often than they have ever 'added realism'. And I'd still turn them off even if I owned a 65,536-yottabyte VRAM GPU...

Agreed, but head bob is great though, realism!!! o_O:laugh:
 
Joined
Sep 6, 2013
Messages
3,328 (0.81/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later I got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 32GB - 16GB G.Skill RIPJAWS 3600+16GB G.Skill Aegis 3200 / 16GB JUHOR / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes/ NVMes, SATA Storage / NVMe boot(Clover), SATA storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10

Fresh and still hot

No AMD vs Nvidia in this video. Only 8GB vs 16GB, pure Nvidia.
 
Joined
Sep 15, 2011
Messages
6,715 (1.39/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
I'm already seeing north of 12GB in TW WH3 at sub-4K (3440x1440).
The engine isn't even using any new tech; it's just a big game.
It's just VRAM caching. Nothing new. Games have done that for more than 10 years now...
 
Joined
Dec 28, 2012
Messages
3,869 (0.89/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabrent Rocket 4.0 2TB, MX500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
You know, it's amazing. People in this forum are falling over themselves saying the 3070 should have had 16GB at launch. Now, if Nvidia had launched a 16GB 3070, with the associated $80+ price increase to cover the additional memory cost, the same people would REEEE about the price, how it's too expensive, and how Nvidia is da big evil guy.
I'm already seeing north of 12GB in TW WH3 at sub-4K (3440x1440).
The engine isn't even using any new tech; it's just a big game.
Allocation. :) Is. :) Not. :) Utilization. :)

We need that clapping hand emoji.

Unless you are seeing consistent microstuttering at higher framerates, you are not running out of VRAM. So far nobody has demonstrated that with TW WH3.
 
Joined
Dec 25, 2020
Messages
6,679 (4.68/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
You know, it's amazing. People in this forum are falling over themselves saying the 3070 should have had 16GB at launch. Now, if Nvidia had launched a 16GB 3070, with the associated $80+ price increase to cover the additional memory cost, the same people would REEEE about the price, how it's too expensive, and how Nvidia is da big evil guy.

Allocation. :) Is. :) Not. :) Utilization. :)

We need that clapping hand emoji.

Unless you are seeing consistent microstuttering at higher framerates, you are not running out of VRAM. So far nobody has demonstrated that with TW WH3.

Allocation may not be utilization, but we've reached a point where games in general are beginning to no longer run adequately on 8 GB GPUs or 16 GB RAM PCs. People who make that argument often forgo or reject the concept of memory pressure. As physical memory nears exhaustion, data will first be compressed (which costs CPU cycles, but should still be manageable in most cases) and then, in order of priority, swapped onto slower memory tiers (whenever available) until it reaches storage, which is comparatively glacial even on high-speed NVMe SSDs.

Apple offers an easy way to read memory pressure in macOS's Activity Monitor, but Microsoft has yet to do something like this on Windows. A dead giveaway that you are short on RAM is when you begin to see the Compressed Memory figure rise (ideally you want this at 0 MB), and you'll be practically out of RAM once your commit charge exceeds your physical RAM capacity. This is also why you will never see RAM usage maxed out in the Windows Task Manager: it will attempt to conserve about 1 GB of RAM for emergency use at all times. This leads many people with the mindset of "I paid for 16 GB of RAM and use 16 GB of RAM I shall" to think that their computers are doing OK and that they aren't short at all.
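
If you want to watch the commit-charge side of this yourself, here is a minimal sketch, assuming Windows and the documented GetPerformanceInfo call from psapi.dll; this is my illustration of the check described above, not something Task Manager exposes directly:

import ctypes
from ctypes import wintypes

class PERFORMANCE_INFORMATION(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("CommitTotal", ctypes.c_size_t),     # current commit charge, in pages
        ("CommitLimit", ctypes.c_size_t),     # physical RAM + pagefile, in pages
        ("CommitPeak", ctypes.c_size_t),
        ("PhysicalTotal", ctypes.c_size_t),   # installed RAM, in pages
        ("PhysicalAvailable", ctypes.c_size_t),
        ("SystemCache", ctypes.c_size_t),
        ("KernelTotal", ctypes.c_size_t),
        ("KernelPaged", ctypes.c_size_t),
        ("KernelNonpaged", ctypes.c_size_t),
        ("PageSize", ctypes.c_size_t),
        ("HandleCount", wintypes.DWORD),
        ("ProcessCount", wintypes.DWORD),
        ("ThreadCount", wintypes.DWORD),
    ]

pi = PERFORMANCE_INFORMATION()
pi.cb = ctypes.sizeof(pi)
ctypes.windll.psapi.GetPerformanceInfo(ctypes.byref(pi), pi.cb)

# Counters are reported in pages, so scale by PageSize.
page = pi.PageSize
commit_gib = pi.CommitTotal * page / 2**30
ram_gib = pi.PhysicalTotal * page / 2**30
print(f"commit charge {commit_gib:.1f} GiB vs. physical RAM {ram_gib:.1f} GiB")
if commit_gib > ram_gib:
    print("Commit charge exceeds physical RAM: you're leaning on the pagefile.")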

A similar concept applies to GPU memory allocation on Windows. As you may know by now, Windows doesn't treat memory as absolute values, but rather as an abstract pool of addressable pages, with the MB values reported by the OS being estimates rather than a precise, accurate metric. Normally, the WDDM graphics driver will allocate the physical memory present on the graphics adapter plus up to 50% of system RAM, so, for example, if you have a 24 GB GPU plus 32 GB of RAM, you will have a maximum of 24 + 16 = around 40 GB of addressable GPU memory:

[attached screenshot: addressable GPU memory readout]


This means that an 8 GB GPU such as the RTX 3070 in a PC with 32 GB of RAM actually has around 24 GB of addressable memory. However, at that point, the graphics subsystem is no longer interested in performance but rather in preventing crashes, as it's fighting for resources demanded by programs in main memory. By running games that reasonably demand a GPU with that much dedicated memory to begin with, you can see where this is going fast.
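
As a back-of-envelope check, that budget rule is trivial to write down; the 50% share below is the default WDDM behaviour described above, not a value queried from the driver:

def addressable_gpu_memory(vram_gb: float, system_ram_gb: float) -> float:
    # Dedicated VRAM plus up to half of system RAM as "shared GPU memory".
    return vram_gb + 0.5 * system_ram_gb

print(addressable_gpu_memory(24, 32))  # 24 GB card + 32 GB RAM -> 40.0
print(addressable_gpu_memory(8, 32))   # RTX 3070 + 32 GB RAM  -> 24.0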

I believe we may be seeing this symptom in Hardware Unboxed's video, in The Last of Us, where the computer is attempting to conserve memory at all costs:

[attached screenshot: The Last of Us memory behaviour in Hardware Unboxed's video]



This is, of course, my personal understanding of things. Don't take it as gospel; I might be talking rubbish. But one thing's for sure: by ensuring that I always have more RAM than an application demands, I have dodged that performance problem for many years now.
 
Joined
Sep 17, 2014
Messages
22,422 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
You know, it's amazing. People in this forum are falling over themselves saying the 3070 should have had 16GB at launch. Now, if Nvidia had launched a 16GB 3070, with the associated $80+ price increase to cover the additional memory cost, the same people would REEEE about the price, how it's too expensive, and how Nvidia is da big evil guy.

Allocation. :) Is. :) Not. :) Utilization. :)

We need that clapping hand emoji.

Unless you are seeing consistent microstuttering at higher framerates, you are not running out of VRAM. So far nobody has demonstrated that with TW WH3.
Certainly, but it is an indicator for sure.
 
Joined
May 17, 2021
Messages
3,005 (2.34/day)
Processor Ryzen 7 5700X
Motherboard B550 Elite
Cooling Thermalright Peerless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200MHz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
Me watching these videos:

Sure, 8GB was stupid; my RX 480 came with 8GB back in 2016.
Using a 3070, especially considering the cost, to game at 1080p is absurd.
Using a 3070 to run a game at 1440p at ultra is also absurd; it brings nothing to the experience. HU thinks the same as me; they even made a video about it.
Don't buy one, not because of the VRAM but because of the price, even if you play CSGO competitively.
If you already have one, let HU make another video just so you feel even more buyer's remorse. Where were they when the 3070s released?!

And we keep beating these points around.
 
Joined
Sep 17, 2014
Messages
22,422 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Whilst I agree it doesn't make sense to skimp on VRAM if you're paying £600+ for a new GPU intended for 2023-2025 AAA Ultra gaming, in reality many of us tweak settings anyway out of personal preference. I cannot stand Depth of Myopia, Chromatic Abhorration, "Supernova Bloom", stupefied head-bob, and similar effects; I think they look ridiculous far more often than they have ever 'added realism'. And I'd still turn them off even if I owned a 65,536-yottabyte VRAM GPU...
Absolutely, I disable the same crap every single time :D

But textures must be maxed, same as LOD and all other things that affect real assets. The GPU hit is low, but the IQ win is high.
 
Joined
Sep 28, 2005
Messages
3,318 (0.47/day)
Location
Canada
System Name PCGR
Processor 12400f
Motherboard Asus ROG STRIX B660-I
Cooling Stock Intel Cooler
Memory 2x16GB DDR5 5600 Corsair
Video Card(s) Dell RTX 3080
Storage 1x 512GB Mmoment PCIe 3 NVME 1x 2TB Corsair S70
Display(s) LG 32" 1440p
Case Phanteks Evolve itx
Audio Device(s) Onboard
Power Supply 750W Cooler Master sfx
Software Windows 11
Absolutely, I disable the same crap every single time :D

But textures must be maxed, same as LOD and all other things that affect real assets. The GPU hit is low, but the IQ win is high.
High textures in RE4R are demanding on VRAM, though. My 3080 crashed a few times until I reduced some settings at 1440p.
 
Joined
Nov 15, 2020
Messages
913 (0.62/day)
System Name 1. Glasshouse 2. Odin OneEye
Processor 1. Ryzen 9 5900X (manual PBO) 2. Ryzen 9 7900X
Motherboard 1. MSI x570 Tomahawk wifi 2. Gigabyte Aorus Extreme 670E
Cooling 1. Noctua NH D15 Chromax Black 2. Custom Loop 3x360mm (60mm) rads & T30 fans/Aquacomputer NEXT w/b
Memory 1. G Skill Neo 16GBx4 (3600MHz 16/16/16/36) 2. Kingston Fury 16GBx2 DDR5 CL36
Video Card(s) 1. Asus Strix Vega 64 2. Powercolor Liquid Devil 7900XTX
Storage 1. Corsair Force MP600 (1TB) & Sabrent Rocket 4 (2TB) 2. Kingston 3000 (1TB) and Hynix p41 (2TB)
Display(s) 1. Samsung U28E590 10bit 4K@60Hz 2. LG C2 42 inch 10bit 4K@120Hz
Case 1. Corsair Crystal 570X White 2. Cooler Master HAF 700 EVO
Audio Device(s) 1. Creative Speakers 2. Built in LG monitor speakers
Power Supply 1. Corsair RM850x 2. Superflower Titanium 1600W
Mouse 1. Microsoft IntelliMouse Pro (grey) 2. Microsoft IntelliMouse Pro (black)
Keyboard Leopold High End Mechanical
Software Windows 11
A card with more VRAM will allocate more VRAM, and the usage numbers are not reliable; they are probably doing the same thing, because they scale in the same way. You don't get any useful information this way.
Yes, but if the game doesn't crash (which happens), then the quality of the game diminishes, sometimes very noticeably, because the card is scrambling to allocate resources and has to compromise. If that's what you want, then the 4070 is for you, because that's the future, like it or not.

You know, it's amazing. People in this forum are falling over themselves saying the 3070 should have had 16GB at launch. Now, if Nvidia had launched a 16GB 3070, with the associated $80+ price increase to cover the additional memory cost, the same people would REEEE about the price, how it's too expensive, and how Nvidia is da big evil guy.

Allocation. :) Is. :) Not. :) Utilization. :)

We need that clapping hand emoji.

Unless you are seeing consistent microstuttering at higher framerates, you are not running out of VRAM. So far nobody has demonstrated that with TW WH3.
Let's wait for the stuttering then :D
 
Joined
May 17, 2021
Messages
3,005 (2.34/day)
Processor Ryzen 7 5700X
Motherboard B550 Elite
Cooling Thermalright Peerless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200MHz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
Yes, but if the game doesn't crash (which happens), then the quality of the game diminishes, sometimes very noticeably, because the card is scrambling to allocate resources and has to compromise. If that's what you want, then the 4070 is for you, because that's the future, like it or not.

Not crash! But fail to start, or stutter. It's true, I'm not contesting that.
 