
Are game requirements and VRAM usage a joke today?

freeagent

Moderator
Staff member
Joined
Sep 16, 2018
Messages
9,168 (3.98/day)
Location
Winnipeg, Canada
Processor AMD R7 5800X3D
Motherboard Asus Crosshair VIII Dark Hero
Cooling Thermalright Frozen Edge 360, 3x TL-B12 V2, 2x TL-B12 V1
Memory 2x8 G.Skill Trident Z Royal 3200C14, 2x8GB G.Skill Trident Z Black and White 3200 C14
Video Card(s) Zotac 4070 Ti Trinity OC
Storage WD SN850 1TB, SN850X 2TB, SN770 1TB
Display(s) LG 50UP7100
Case Fractal Torrent Compact
Audio Device(s) JBL Bar 700
Power Supply Seasonic Vertex GX-1000, Monster HDP1800
Mouse Logitech G502 Hero
Keyboard Logitech G213
VR HMD Oculus 3
Software Yes
Benchmark Scores Yes
I bought one of, if not the, most hated Nvidia GPUs..

And I think it's really sweet lol..

Yeah, sure, I can hit the 12GB limit in some games, but this isn't supposed to be a 4K card :)

A lot of the time I find myself running out of horsepower before I run out of VRAM..
 
Joined
Jan 10, 2011
Messages
1,460 (0.29/day)
Location
[Formerly] Khartoum, Sudan.
System Name 192.168.1.1~192.168.1.100
Processor AMD Ryzen5 5600G.
Motherboard Gigabyte B550m DS3H.
Cooling AMD Wraith Stealth.
Memory 16GB Crucial DDR4.
Video Card(s) Gigabyte GTX 1080 OC (Underclocked, underpowered).
Storage Samsung 980 NVME 500GB && Assortment of SSDs.
Display(s) ViewSonic VA2406-MH 75Hz
Case Bitfenix Nova Midi
Audio Device(s) On-Board.
Power Supply SeaSonic CORE GM-650.
Mouse Logitech G300s
Keyboard Kingston HyperX Alloy FPS.
VR HMD A pair of OP spectacles.
Software Ubuntu 24.04 LTS.
Benchmark Scores Me no know English. What bench mean? Bench like one sit on?
unified memory code ported over lazily causing PS3 quality textures unless on a 16+ gig card.
I stopped following this thread for a while, but didn't we establish that consoles' UMA has little/nothing to do with the alleged poor optimisation?

The reason consoles do better on UE4/5 as far as shader compilation and traversal stutter go is that they have dedicated silicon for decompression, unlike PC where we have to brute-force it a bit, at least until DirectStorage or something similar becomes more prevalent.
Texture decompression on PC (as far as standard BC/DXT compression goes, at least) is done in hardware, not in software.
I doubt stutter has much to do with texture decoding. And textures have definitely nothing to do with shader compilation.
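To put rough numbers on the hardware-decode point: BC textures stay compressed in VRAM and are decoded per block by the texture units when sampled. A sketch of the size math (the block sizes are the standard BC1/BC7 ones; the 4096x4096 texture is just an example):

```python
# Rough VRAM-size math for block-compressed (BC/DXT) textures.
# BC formats store each 4x4 texel block in a fixed number of bytes, and
# the GPU's texture units decode blocks on the fly when sampling, so the
# texture stays compressed in VRAM -- no software decompression pass.

def texture_bytes(width, height, bytes_per_block=None, bytes_per_pixel=None):
    """Size of a single mip level, block-compressed or raw."""
    if bytes_per_block is not None:           # BC: one block covers 4x4 texels
        blocks_x = (width + 3) // 4
        blocks_y = (height + 3) // 4
        return blocks_x * blocks_y * bytes_per_block
    return width * height * bytes_per_pixel   # raw formats

w = h = 4096
raw_rgba8 = texture_bytes(w, h, bytes_per_pixel=4)   # uncompressed RGBA8
bc1 = texture_bytes(w, h, bytes_per_block=8)         # BC1: 8 bytes / 4x4 block
bc7 = texture_bytes(w, h, bytes_per_block=16)        # BC7: 16 bytes / 4x4 block

print(raw_rgba8 // 2**20, bc1 // 2**20, bc7 // 2**20)  # 64 8 16 (MiB)
```

So a 4K texture that would be 64 MiB raw sits in VRAM at 8-16 MiB, and the decode cost is fixed-function, which is why it isn't a plausible stutter source.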
 
Joined
Sep 10, 2018
Messages
7,200 (3.11/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
I bought one of, if not the, most hated Nvidia GPUs..

And I think it's really sweet lol..

Yeah, sure, I can hit the 12GB limit in some games, but this isn't supposed to be a 4K card :)

A lot of the time I find myself running out of horsepower before I run out of VRAM..

Even Nvidia didn't like it; they fixed it with 16GB of VRAM while keeping the same price, even though word on the street is that VRAM prices have increased since Ada launched.

It's a great GPU that should never have come with 12GB of VRAM, not because that's generally not enough, but because it typically cost over 800 USD. I agree, though, that the 4070 Ti OG loses a lot more steam at 4K than it should, and it almost never has to do with VRAM; more likely it's how much they crippled the bus width.

Honestly, if these Super cards were the ones that had launched from the start, I'd probably own a 4080 instead of a 4090, even though it has 33% less VRAM, and the 4070 Ti Super would be my de facto recommendation for anyone who couldn't afford $1k+.
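For what it's worth, the bus-width point is easy to quantify: raw bandwidth is just bus width times data rate. A quick sketch using the public specs of these two cards (numbers are illustrative; Ada partly offsets the narrower bus with a much larger L2 cache):

```python
# Memory bandwidth scales linearly with bus width:
#   bandwidth (GB/s) = bus_bits / 8 * data_rate (Gbps)
# Specs below are the public GDDR6X figures for the two cards.

def bandwidth_gbps(bus_bits, data_rate_gbps):
    return bus_bits / 8 * data_rate_gbps

cards = {
    "RTX 3080 (320-bit @ 19 Gbps)":    (320, 19),
    "RTX 4070 Ti (192-bit @ 21 Gbps)": (192, 21),
}
for name, (bus, rate) in cards.items():
    print(f"{name}: {bandwidth_gbps(bus, rate):.0f} GB/s")
# 320-bit @ 19 -> 760 GB/s; 192-bit @ 21 -> 504 GB/s, roughly a third less.
```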
 

freeagent

Moderator
Staff member
Joined
Sep 16, 2018
Messages
9,168 (3.98/day)
Location
Winnipeg, Canada
Processor AMD R7 5800X3D
Motherboard Asus Crosshair VIII Dark Hero
Cooling Thermalright Frozen Edge 360, 3x TL-B12 V2, 2x TL-B12 V1
Memory 2x8 G.Skill Trident Z Royal 3200C14, 2x8GB G.Skill Trident Z Black and White 3200 C14
Video Card(s) Zotac 4070 Ti Trinity OC
Storage WD SN850 1TB, SN850X 2TB, SN770 1TB
Display(s) LG 50UP7100
Case Fractal Torrent Compact
Audio Device(s) JBL Bar 700
Power Supply Seasonic Vertex GX-1000, Monster HDP1800
Mouse Logitech G502 Hero
Keyboard Logitech G213
VR HMD Oculus 3
Software Yes
Benchmark Scores Yes
I am just going to wait. I was going to get a 4080S but I will just wait for 50 series..

Maybe..

That 3070 Ti in my son's system is getting long in the tooth, he might get bumped to Ada :D

:laugh:
 
Joined
Sep 10, 2018
Messages
7,200 (3.11/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
I am just going to wait. I was going to get a 4080S but I will just wait for 50 series..

Maybe..

That 3070 Ti in my son's system is getting long in the tooth, he might get bumped to Ada :D

:laugh:

My 3080 Ti with the dreaded 12GB still kicks ass in my secondary system, no need to upgrade. Anyways, the 5070 will at least match the 4080S and will hopefully be much cheaper... who knows, maybe the 5080 will beat the 4090 for 999 USD, you'd think, but my crystal ball is pretty cloudy so you never know :laugh:
 
Last edited:
Joined
Apr 30, 2020
Messages
1,013 (0.59/day)
System Name S.L.I + RTX research rig
Processor Ryzen 7 5800X 3D.
Motherboard MSI MEG ACE X570
Cooling Corsair H150i Cappellx
Memory Corsair Vengeance pro RGB 3200mhz 32Gbs
Video Card(s) 2x Dell RTX 2080 Ti in S.L.I
Storage Western digital Sata 6.0 SDD 500gb + fanxiang S660 4TB PCIe 4.0 NVMe M.2
Display(s) HP X24i
Case Corsair 7000D Airflow
Power Supply EVGA G+1600watts
Mouse Corsair Scimitar
Keyboard Cosair K55 Pro RGB
The reason consoles do better on UE4/5 as far as shader compilation and traversal stutter go is that they have dedicated silicon for decompression, unlike PC where we have to brute-force it a bit, at least until DirectStorage or something similar becomes more prevalent.

UE4/5 suck because they use a single main thread for two things and sync it all at once, which causes call stalls. The first is physics; the second is the main rendering calls. The dedicated hardware doesn't help that much, really, since ReBAR should easily be able to overcome their small memory buses. The other reason consoles end up being faster for textures and loads is that they don't have to deal with the transient layer of an OS constantly sitting in between.
 
Joined
Dec 26, 2012
Messages
1,137 (0.26/day)
Location
Babylon 5
System Name DaBeast!!! DaBeast2!!!
Processor AMD AM4 Ryzen 7 5700X3D 8C/16T/AMD AM4 RYZEN 9 5900X 12C/24T
Motherboard Gigabyte X570 Aorus Xtreme/Gigabyte X570S Aorus Elite AX
Cooling Thermaltake Water 3.0 360 AIO/Thermalright PA 120 SE
Memory 2x 16GB Corsair Vengeance RGB RT DDR4 3600C16/2x 16GB Patriot Elite II DDR4 4000MHz
Video Card(s) XFX MERC 310 RX 7900 XTX 24GB/Sapphire Nitro+ RX 6900 XT 16GB
Storage 500GB Crucial P3 Plus NVMe PCIe 4x4 + 4TB Lexar NM790 NVMe PCIe 4x4 + TG Cardea Zero Z NVMe PCIe 4x4
Display(s) Samsung LC49HG90DMEX 32:9 144Hz Freesync 2/Acer XR341CK 75Hz 21:9 Freesync
Case CoolerMaster H500M/SOLDAM XR-1
Audio Device(s) iFi Micro iDSD BL + Philips Fidelio B97/FostexHP-A4 + LG SP8YA
Power Supply Corsair HX1000 Platinum/Enermax MAXREVO 1500
Mouse Logitech G303 Shroud Ed/Logitech G603 WL
Keyboard Logitech G915/Keychron K2
Software Win11 Pro/Win11 Pro
I'd decided to upgrade both my main rig, and my yet to be built 2nd rig (at the time), so I went with an RX 7900 XTX 24GB + RAM upgrade to 2x 16GB DDR4 3600C16 RAM. For my 2nd rig, I decided on an X570S mobo (to go with my R9 3900X from my main rig which was upgraded to a 5900X) + RX 6900XT 16GB (which was from my main rig, it should do fine at 3440x1440 75Hz gaming) + 2x 16GB DDR4 4000 RAM. Recently tried DSR (totally maxed out ingame setting) with AMD FMF + FSR2 and framerate stayed a mostly constant 75fps with mild dips here and there, so it was mostly smooth (let's not forget the 'wonderful' traversal hitching).

I'm looking forward to the upcoming Adrenalin driver with official AFMF support on the 24th, so the beta AFMF drivers will be out of the picture from then on. With this RAM and cards with at least 16GB of VRAM, I figure I should be fine for the next few years. My enthusiasm for gaming has been waning as of late (hey, I'll be hitting 60 next month!), so I'd prolly be using my rig for simple stuff like watching YT vids and listening to music.
 
Last edited:
Joined
Sep 10, 2018
Messages
7,200 (3.11/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
I'd decided to upgrade both my main rig, and my yet to be built 2nd rig (at the time), so I went with an RX 7900 XTX 24GB + RAM upgrade to 2x 16GB DDR4 3600C16 RAM. For my 2nd rig, I decided on an X570S mobo (to go with my R9 3900X from my main rig which was upgraded to a 5900X) + RX 6900XT 16GB (which was from my main rig, it should do fine at 3440x1440 75Hz gaming) + 2x 16GB DDR4 4000 RAM. Recently tried DSR (totally maxed out ingame setting) with AMD FMF + FSR2 and framerate stayed a mostly constant 75fps with mild dips here and there, so it was mostly smooth (let's not forget the 'wonderful' traversal hitching).

I'm looking forward to the upcoming Adrenalin driver with official AFMF support on the 24th, so the beta AFMF drivers will be out of the picture from then on. With this RAM and cards with at least 16GB of VRAM, I figure I should be fine for the next few years. My enthusiasm for gaming has been waning as of late (hey, I'll be hitting 60 next month!), so I'd prolly be using my rig for simple stuff like watching YT vids and listening to music.

As I like to say only the real ones have 24GB of Vram PCMR 4eva!!!! :toast: :laugh:


I hear you though my enthusiasm for pc gaming has also been waning maybe I'm just old and jaded or what's popular enough to develop these days just isn't for me.

It is nice not to have to worry about vram though at least for a couple years.
 
Joined
Jan 10, 2011
Messages
1,460 (0.29/day)
Location
[Formerly] Khartoum, Sudan.
System Name 192.168.1.1~192.168.1.100
Processor AMD Ryzen5 5600G.
Motherboard Gigabyte B550m DS3H.
Cooling AMD Wraith Stealth.
Memory 16GB Crucial DDR4.
Video Card(s) Gigabyte GTX 1080 OC (Underclocked, underpowered).
Storage Samsung 980 NVME 500GB && Assortment of SSDs.
Display(s) ViewSonic VA2406-MH 75Hz
Case Bitfenix Nova Midi
Audio Device(s) On-Board.
Power Supply SeaSonic CORE GM-650.
Mouse Logitech G300s
Keyboard Kingston HyperX Alloy FPS.
VR HMD A pair of OP spectacles.
Software Ubuntu 24.04 LTS.
Benchmark Scores Me no know English. What bench mean? Bench like one sit on?
UE4/5 suck because they use a single main thread for two things and sync it all at once, which causes call stalls.
Any citation/elaboration for this?

UE uses Nvidia's PhysX, which does internally run its physics calculations on different threads, even if the API call itself came from the same thread as the renderer's.

Not saying that this won't cause stalls. The renderer still has to wait for physics calc to finish. But that should be true for all engines and platforms.
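A toy sketch of that sync point (the timings and structure are invented for illustration, not UE's actual task graph): even with physics on worker threads, a frame where the main thread joins the physics job before submitting rendering pays for both in sequence, whereas overlapping them costs only the slower of the two.

```python
# Toy model of a frame whose main thread joins the physics job before it
# can submit rendering (a sync stall). Timings are invented for
# illustration; this is not UE's actual task graph.
import time
from concurrent.futures import ThreadPoolExecutor

def physics_step():
    time.sleep(0.008)          # pretend physics takes 8 ms

def render_submit():
    time.sleep(0.004)          # pretend draw-call submission takes 4 ms

with ThreadPoolExecutor(max_workers=1) as pool:
    # Serialized: main thread waits for physics, then submits rendering.
    start = time.perf_counter()
    pool.submit(physics_step).result()   # <- the stall
    render_submit()
    serial_ms = (time.perf_counter() - start) * 1000

    # Overlapped: submit rendering while physics runs on the worker.
    start = time.perf_counter()
    phys = pool.submit(physics_step)
    render_submit()
    phys.result()
    overlap_ms = (time.perf_counter() - start) * 1000

print(f"serialized ~{serial_ms:.0f} ms, overlapped ~{overlap_ms:.0f} ms")
```

The point being that the cost of the sync is the same on every platform; whether it stalls a frame depends on what else the engine lets the main thread do while waiting.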
 
Joined
Jan 14, 2019
Messages
13,146 (6.01/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Mainly Linux, little bit of Windows 10
Both consoles are 2070 Super/2080 level, or 6700 10G on the Radeon side, just with more VRAM... Oddly, even with the extra VRAM, texture quality is still trash in a lot of console games compared to the highest settings on PC.

My PS5 is almost always worse than my 6700 XT/7600 in all aspects in the games on both, for example.
I guess that's because consoles don't have VRAM. They only have system RAM which is shared with the GPU as well.
 
Joined
Sep 10, 2018
Messages
7,200 (3.11/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
I guess that's because consoles don't have VRAM. They only have system RAM which is shared with the GPU as well.

Still 12-14GB reserved for games, I believe... 8.5-9 with the Series S.

Part of it might be more recent console games adopting FSR, and usually very poor implementations of it. You'd think they'd be able to scale texture resolution separately, but even on PC some games do a poor job of that.
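Back-of-envelope numbers for those reservations (the OS-reserve figures below are the commonly reported ones, not official documentation):

```python
# Back-of-envelope unified-memory budgets (GiB). The OS-reserve figures
# are the commonly reported ones, not official documentation.
consoles = {
    "PS5":      {"total": 16.0, "os_reserve": 3.5},
    "Series X": {"total": 16.0, "os_reserve": 2.5},
    "Series S": {"total": 10.0, "os_reserve": 2.0},
}

for name, c in consoles.items():
    game_budget = c["total"] - c["os_reserve"]
    # On a unified pool this single budget covers both the "system RAM"
    # and "VRAM" roles, so the GPU-facing share is whatever remains after
    # gameplay state, audio, streaming buffers, and so on.
    print(f"{name}: ~{game_budget:.1f} GiB available to the game")
```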
 
Joined
Jan 14, 2019
Messages
13,146 (6.01/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Mainly Linux, little bit of Windows 10
Still 12-14GB reserved for games, I believe... 8.5-9 with the Series S.

Part of it might be more recent console games adopting FSR, and usually very poor implementations of it. You'd think they'd be able to scale texture resolution separately, but even on PC some games do a poor job of that.
I wonder if it's not the poor implementation, but the game using too high settings/textures and running out of RAM even with FSR on.
 
Joined
Feb 1, 2019
Messages
3,684 (1.70/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
I stopped following this thread for a while, but didn't we established that consoles' UMA has little/nothing to do with the alleged poor optimisation?

I also stopped for a while. I remember being told in this thread that it's apparently harder to use unified memory than not to, as it's not the default behaviour in engines like UE. However, there are games that have patched their behaviour to move stuff from VRAM to RAM to fix their issues, which goes against what that post said.
 
Joined
Jun 6, 2022
Messages
622 (0.66/day)
The VRAM topic was started right after the disaster of the other topic: Is 16GB of RAM enough for gaming?

We have a saying here, and I don't think it's just local: it hits you in the face. If the visual impact is so strong that you say "WTF is with this garbage", we have a topic for discussion. If you need a magnifying glass to find a one-pixel difference, it's just chatter about whether it's good.

The reason: developers are interested in maximum profit, and that target cannot be reached if you exclude a large part of the potential buyers from the start. How many potential buyers have 16+GB VRAM video cards?
 

SL2

Joined
Jan 27, 2006
Messages
2,463 (0.36/day)
The VRAM topic was started right after the disaster of the other topic: Is 16GB of RAM enough for gaming?
To me it's about 8 GB not being enough anymore (FOR SOME), and how the 3070 aged badly (FOR SOME) just because of not having enough VRAM.

Yes, this is just one game, but imagine if it had a bit more VRAM..
Skärmbild (74).png
 
Last edited:
Joined
Jun 6, 2022
Messages
622 (0.66/day)
From my garden (1080p, Ultra, FSR OFF)


Spare me the HU stuff. Forcing a 6800 XT down to 30-50 fps just to please your sponsor seems ridiculous, to me at least. Too bad you can't read the small print.

TPU Review
Clipboard01.jpg


-----------
Let's stay here, on this site, and analyze the reality that those at HU cannot reproduce. They are not allowed to.
According to the TPU reviews from June 9, 2021 and January 2024 (~20 games tested), in 2.5 years the difference between the 6700 XT 12GB and the 3070 8GB has increased. In favor of the 3070, of course. New games were added in the latest review, but the extra memory didn't help the 6700 XT at all.
Comparing the data of the two reviews, both video cards lost almost 40% because new games demand more from the GPU. On top of that, the 6700 XT lost even more.
There are ~20 games tested, and if those don't tell you anything either, take the exceptions and argue with them. The reality is different.


comp.jpg
 
Last edited:
Joined
Jan 10, 2011
Messages
1,460 (0.29/day)
Location
[Formerly] Khartoum, Sudan.
System Name 192.168.1.1~192.168.1.100
Processor AMD Ryzen5 5600G.
Motherboard Gigabyte B550m DS3H.
Cooling AMD Wraith Stealth.
Memory 16GB Crucial DDR4.
Video Card(s) Gigabyte GTX 1080 OC (Underclocked, underpowered).
Storage Samsung 980 NVME 500GB && Assortment of SSDs.
Display(s) ViewSonic VA2406-MH 75Hz
Case Bitfenix Nova Midi
Audio Device(s) On-Board.
Power Supply SeaSonic CORE GM-650.
Mouse Logitech G300s
Keyboard Kingston HyperX Alloy FPS.
VR HMD A pair of OP spectacles.
Software Ubuntu 24.04 LTS.
Benchmark Scores Me no know English. What bench mean? Bench like one sit on?
I also stopped for a while. I remember being told in this thread that it's apparently harder to use unified memory than not to, as it's not the default behaviour in engines like UE. However, there are games that have patched their behaviour to move stuff from VRAM to RAM to fix their issues, which goes against what that post said.
Memory managers are very elementary components of an engine. Supposing that someone made the mistake of designing their engine around dumping everything on graphics dedicated memory, "fixing" this would probably require an overhaul of the entire renderer (given that it's an architectural issue), not a simple patch.

From what little I understand of this topic, I think the UMA/NUMA differences would cause performance issues more likely than they would memory allocation issues.
Memory allocation issues are more likely caused by lousy garbage collection and inefficient assets/rendering. If someone made the mistake of allocating tons of high res textures without mips/streaming and didn't promptly clear them after use, they'd fill up the memory regardless of whether they're on an RTX4090 or a Zen3 APU.
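The garbage-collection point can be made concrete with a tiny sketch (a hypothetical `TextureStreamer`, not any real engine's API): with a residency budget and LRU eviction, memory use stays bounded no matter how much is requested; a cache that never evicts fills up any card, 4090 or APU alike.

```python
# Minimal sketch of why eviction policy, not total VRAM, decides whether
# memory fills up: a streamer with a budget evicts least-recently-used
# textures; a naive cache just accumulates until any card is full.
from collections import OrderedDict

class TextureStreamer:
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()     # name -> size_mb, in LRU order

    def request(self, name, size_mb):
        if name in self.resident:
            self.resident.move_to_end(name)   # touched: now most recent
            return
        # Evict least-recently-used textures until the new one fits.
        while sum(self.resident.values()) + size_mb > self.budget_mb:
            self.resident.popitem(last=False)
        self.resident[name] = size_mb

streamer = TextureStreamer(budget_mb=100)
for i in range(50):                # 50 textures x 16 MB = 800 MB requested
    streamer.request(f"tex_{i}", 16)

print(sum(streamer.resident.values()))  # stays within budget regardless of card
```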
 
Joined
Feb 1, 2019
Messages
3,684 (1.70/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
Memory managers are very elementary components of an engine. Supposing that someone made the mistake of designing their engine around dumping everything on graphics dedicated memory, "fixing" this would probably require an overhaul of the entire renderer (given that it's an architectural issue), not a simple patch.

From what little I understand of this topic, I think the UMA/NUMA differences would cause performance issues more likely than they would memory allocation issues.
Memory allocation issues are more likely caused by lousy garbage collection and inefficient assets/rendering. If someone made the mistake of allocating tons of high res textures without mips/streaming and didn't promptly clear them after use, they'd fill up the memory regardless of whether they're on an RTX4090 or a Zen3 APU.
What you just described isn't what's happening, though.

With the latest Far Cry game it was confirmed: they specifically fixed it by moving certain things that an older version kept in VRAM over to system RAM.

In games where there is VRAM saturation, when someone loads that game up on a 24-gig GPU, it doesn't fill up the VRAM on the 24-gig GPU or come close to it.

There are games today with really low-res textures consuming more VRAM than games with very good textures from years prior. If you are correct that it's not simple to adjust for a platform like the PC, then that could explain why some ports aren't optimised for the PC platform and dump too much stuff in VRAM.
 

SL2

Joined
Jan 27, 2006
Messages
2,463 (0.36/day)

When your 3070 Ti running with RT matches a 4090 without RT.. yeah, that's yet another example of why we shouldn't compare benchmarks the way you do. :roll:
1705395335120.png

1705395382995.png



Hint: You're not benching actual gameplay...

They didn't use a 6800 XT or a 6700 XT, and you clearly didn't watch the video. Pushing the settings is a way to find weaknesses, just like some want to see low-res game settings as a way to push the CPU harder in reviews, even though that doesn't represent real-world gameplay settings either.

HUB is just pointing out that 8 GB was too low for that price point, and this is old news. It's simply embarrassing to claim that they have to be sponsored, as it says more about you than about HUB. I bet you haven't heard how much crap they give the 6500 XT, over and over again. :D

1705392349463.png


Your VRAM graph is only 300 MB off HUB's 1440p results, and surprise, we're not talking about 8192 MB here.

If you had actually watched the video, you would know that comparing average numbers won't show anything meaningful. The newest TPU review you linked to ONLY has average numbers for the 3070, so that doesn't help either. HUB never said it was slow in every game.

You have every right to go after a reviewer, and who knows, maybe they're not always honest or perfect, but when you do it with such incoherence there's no point.
 
Last edited:

las

Joined
Nov 14, 2012
Messages
1,693 (0.38/day)
System Name Meh
Processor 7800X3D
Motherboard MSI X670E Tomahawk
Cooling Thermalright Phantom Spirit
Memory 32GB G.Skill @ 6000/CL30
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 360 Hz + 32" 4K/UHD QD-OLED @ 240 Hz + 77" 4K/UHD QD-OLED @ 144 Hz VRR
Case Fractal Design North XL
Audio Device(s) FiiO DAC
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight + Razer Deathadder V3 Pro
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
To me it's about 8 GB not being enough anymore (FOR SOME), and how the 3070 aged badly (FOR SOME) just because of not having enough VRAM.

Yes, this is just one game, but imagine if it had a bit more VRAM..
View attachment 329734

Cherrypicking is not hard. However, you needed to find early benchmarks from a disastrous launch title to see this, even with maximum ray tracing enabled. The game was fixed long ago, just like the AMD-sponsored RE4 and TLOU titles, which ran like crap on all cards when maxed out, including AMD's own 8GB cards. Today they run fine and look the same if not better. 8GB VRAM users could also just use High instead of Ultra, which fixed the problem too, and image quality was pretty much identical.

Avatar is one of the best-looking games right now, and the 3070 beats the Radeon 6800 on the ultra preset, even in 4K/UHD, in both average and minimum fps.

Also, the 6800 was never a 3070 competitor. The 6800 was $579 MSRP, the 3070 was $499 MSRP.

The 6700 XT was AMD's 3070 competitor, and the 3070 beat it at release and still beats it. They were priced at $479 and $499. Yet the 6700 XT performed closer to the 3060 Ti, which was $399.

Yeah, VRAM is always nice to have, if GPU power is up to the task. Typically it's not, and you will be forced to lower settings anyway, which lowers the VRAM requirement as well. It's generally a non-issue. Lots of VRAM doesn't futureproof a GPU. The GPU itself is going to be the limiting factor every single time.

I'd rather have a fast GPU with less VRAM than a weaker GPU with a lot of VRAM. Tweaking and lowering VRAM usage is easy, and the faster GPU will still blast out tons of frames running High instead of Ultra, and the difference in actual gameplay will be none.

Also, DLSS is better than FSR and will save a lot of RTX users from seeing shimmering and artifacts all over, like FSR has. I have tested DLSS vs FSR in tons of games by now, and the image quality in general is much better with DLSS. I hope AMD can improve FSR further.
 
Last edited:
Joined
Jan 10, 2011
Messages
1,460 (0.29/day)
Location
[Formerly] Khartoum, Sudan.
System Name 192.168.1.1~192.168.1.100
Processor AMD Ryzen5 5600G.
Motherboard Gigabyte B550m DS3H.
Cooling AMD Wraith Stealth.
Memory 16GB Crucial DDR4.
Video Card(s) Gigabyte GTX 1080 OC (Underclocked, underpowered).
Storage Samsung 980 NVME 500GB && Assortment of SSDs.
Display(s) ViewSonic VA2406-MH 75Hz
Case Bitfenix Nova Midi
Audio Device(s) On-Board.
Power Supply SeaSonic CORE GM-650.
Mouse Logitech G300s
Keyboard Kingston HyperX Alloy FPS.
VR HMD A pair of OP spectacles.
Software Ubuntu 24.04 LTS.
Benchmark Scores Me no know English. What bench mean? Bench like one sit on?
With the latest Far Cry game it was confirmed: they specifically fixed it by moving certain things that an older version kept in VRAM over to system RAM.
Anywhere to read more about this? Can't find anything on Google nor the patch notes.

In games where there is VRAM saturation, when someone loads that game up on a 24-gig GPU, it doesn't fill up the VRAM on the 24-gig GPU or come close to it.

There are games today with really low-res textures consuming more VRAM than games with very good textures from years prior. If you are correct that it's not simple to adjust for a platform like the PC, then that could explain why some ports aren't optimised for the PC platform and dump too much stuff in VRAM.

Perhaps because no hardware can render scenes with 24GB of assets in real time?

Poor implementations exist (looking at you, Cities Skylines 2 -_- ) and will continue to. But I'd be cautious comparing two different games if the metric used is (subjective) aesthetic quality. Same for concluding that memory utilisation is a bad thing. Optimisation prioritizes performance, not memory consumption. Uselessness of unused memory 'n all...
 

SL2

Joined
Jan 27, 2006
Messages
2,463 (0.36/day)
Cherrypicking is not hard. However, you needed to find early benchmarks from a disastrous launch title to see this, even with maximum ray tracing enabled. The game was fixed long ago, just like the AMD-sponsored RE4 and TLOU titles, which ran like crap on all cards when maxed out, including AMD's own 8GB cards. Today they run fine and look the same if not better. 8GB VRAM users could also just use High instead of Ultra, which fixed the problem too, and image quality was pretty much identical.
Four months in isn't early. Telling people to lower their settings isn't the right solution, even if it is a good afterthought.

Avatar is one of the best-looking games right now, and the 3070 beats the Radeon 6800 on the ultra preset, even in 4K/UHD, in both average and minimum fps.
No one said this happens in every game.
Also, the 6800 was never a 3070 competitor. The 6800 was $579 MSRP, the 3070 was $499 MSRP.
You don't know that. Do you know how I know that? Because it all happened during the worst global price hike for graphics cards in history. MSRP meant shit. During other releases, prices have been adjusted according to lots of factors, but that wasn't feasible at the time. Or, deduct $40 for the extra 8 GB and you're close enough. We can of course look at prices after that period, but no one knows how much they were affected by previous events.

Yeah VRAM is always nice to have, if GPU power is up to the task.
In my example it was, for a few seconds, and that's the most frustrating thing to see. If your GPU is slow, fine, you have to upgrade. But when you go from this
1705399301116.png

to this within seconds, you're gonna get pissed. The GPU is up to the task, obviously. And by that time it doesn't matter whether we're comparing to a 6800 or a 6700 XT, as both will do more than 25.
1705399376177.png


And of course there might be better examples out there than just this game, but you know it's going to happen again, because it's happened before. So many 3-4 GB cards in the past could have worked for longer.

I'd rather have a fast GPU with less VRAM than a weaker GPU with a lot of VRAM.
It's just a shame that the 3070 couldn't have both. You shouldn't have to choose.

Also, DLSS is better than FSR and will save a lot of RTX users from seeing shimmering and artifacts all over, like FSR has. I have tested DLSS vs FSR in tons of games by now, and the image quality in general is much better with DLSS. I hope AMD can improve FSR further.
That's a different matter, but since DLSS is overall better, it might compensate for the lack of VRAM in the short run when it comes to popularity among buyers, but perhaps not in the long run.

I highly doubt AFMF will deliver, or AsFMotherF like I call it, but I guess we'll soon find out.
 

las

Joined
Nov 14, 2012
Messages
1,693 (0.38/day)
System Name Meh
Processor 7800X3D
Motherboard MSI X670E Tomahawk
Cooling Thermalright Phantom Spirit
Memory 32GB G.Skill @ 6000/CL30
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 360 Hz + 32" 4K/UHD QD-OLED @ 240 Hz + 77" 4K/UHD QD-OLED @ 144 Hz VRR
Case Fractal Design North XL
Audio Device(s) FiiO DAC
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight + Razer Deathadder V3 Pro
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
Four months in isn't early. Telling people to lower their settings isn't the right solution, even if it is a good afterthought.


No one said this happens in every game.

You don't know that. Do you know how I know that? Because it all happened during the worst global price hike for graphics cards in history. MSRP meant shit. During other releases, prices have been adjusted according to lots of factors, but that wasn't feasible at the time. Or, deduct $40 for the extra 8 GB and you're close enough. We can of course look at prices after that period, but no one knows how much they were affected by previous events.


In my example it was, for a few seconds, and that's the most frustrating thing to see. If your GPU is slow, fine, you have to upgrade. But when you go from this
View attachment 329912
to this within seconds, you're gonna get pissed. The GPU is up to the task, obviously. And by that point it doesn't matter if we're comparing to a 6800 or a 6700 XT, as both will do more than 25.
View attachment 329913

And of course there might be better examples out there than this game, but you know it's going to happen again, because it has happened before. So many 3-4 GB cards in the past could have lasted longer.


It's just a shame that the 3070 couldn't have both. You shouldn't have to choose.


That's a different matter, but since DLSS is overall better, it might compensate for the lack of VRAM in the short run when it comes to popularity among buyers, though perhaps not in the long run.

I highly doubt AFMF (or "AsFMotherF", as I call it) will deliver, but I guess we'll soon find out.
Well, tons of other tests show that Callisto Protocol runs just fine on the 3070/3070 Ti.


Show 4K/UHD only and you'll see that the 3070 Ti beats the 16GB 6800 here, at max settings + RT.

2nd Link, TPU Testing-> https://www.techpowerup.com/review/the-callisto-protocol-benchmark-test-performance-analysis/5.html

The 3070 beats the 6800 while costing 80 dollars less, even at 4K/UHD on Ultra, and yes, the Radeon 6800 was more expensive than the 3070 around release, no matter what the GPU market was like. Higher MSRP = higher price. The 6700 XT was the 3070's competitor. AMD obviously doesn't price a 3070 competitor at 579 dollars.

Like I said, cherry-picking is not that hard; there are all kinds of weird results to find if you want.


In reality, VRAM is not really an issue for many people, especially not if you game at 1440p or below like 95% of PC gamers. However, no last-gen mid-range offering is going to max out the most demanding games at 1440p today without upscaling, which lowers the VRAM requirement.
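The VRAM saving from upscaling comes from the lower internal render resolution. A minimal sketch, assuming the commonly cited per-axis render scales (roughly 2/3 for a "Quality" mode, 0.5 for "Performance"; the function name is just for illustration):

```python
def internal_resolution(out_w, out_h, scale):
    """Internal render resolution for a given output size and per-axis
    upscaler render scale (e.g. ~2/3 for Quality, 0.5 for Performance)."""
    return round(out_w * scale), round(out_h * scale)

# 4K output with a Quality-style 2/3 per-axis scale
w, h = internal_resolution(3840, 2160, 2 / 3)
print(w, h)                                             # 2560 1440
print(f"{w * h / (3840 * 2160):.0%} of native pixels")  # 44% of native pixels
```

Render targets and other screen-sized buffers scale with that internal pixel count, which is why upscaling trims VRAM use; texture memory, on the other hand, is unaffected by the render scale.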

However, it's always funny when people find demanding games running at native 4K/UHD without upscaling, while maxing out ray tracing too, to make a point about how much VRAM matters, when the 16GB cards from the same generation are struggling hard as well.

The GPU has always been the most important part, and people generally don't need more than 8GB at 1440p or below. 12GB will last for many years here.

If you actually use a 4K/UHD+ monitor, refuse to use upscaling, and want to max out every single demanding game in the next 4-5 years, which GPU would you buy? Because not even my 4090 will do it.
I'll probably have the VRAM until 2028-2030, but the GPU will already seem dated when the 5090/5080 hit "soon".

VRAM never futureproofed a GPU and never will.

The 7900 XT would have been a better GPU with more cores/power and less VRAM; 16GB would have been more than fine. The 7900 XTX again: more cores/power and 16-20GB of VRAM would have been better. 24GB is not doing ANYTHING on the 7900 XTX, and never will. It looks good on paper, but the GPU can't even do ray tracing or path tracing well, which is exactly the stuff that eats VRAM.

Also, AMD keeps using GDDR6 instead of GDDR6X; the GDDR6X chips draw more power, and GDDR6 is kinda cheap.

AMD started this VRAM talk because of their marketing on the subject. TLOU and RE4 were rushed console ports sponsored by AMD. Ultra was not possible on 8GB cards at release. This has since been fixed. That "bug" was probably not an accident, as it came RIGHT AFTER AMD talked about how much VRAM matters.

AMD did this before. Back in the Shadow of Mordor days, when they released a 6GB Texture Pack -> https://www.pcgamer.com/spot-the-di...rdor-ultra-hd-textures-barely-change-a-thing/


The texture pack did NOTHING but max out the VRAM on all cards with less than 6GB, meaning most Nvidia cards at the time. The graphics looked identical.

Also, people are generally too stupid to realize that VRAM usage on a 4090/7900 XTX in game testing does not mean the actual VRAM requirement. Allocation is a big part of the number you see in tests; VRAM usage does NOT equal the required amount. This is nothing new.
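The allocation-vs-usage gap can be shown with a toy model. This is an analogy only, not how any real driver or engine pool works: an engine that reserves VRAM in big chunks will report more "allocated" memory than its assets actually occupy (the 256 MiB granularity below is hypothetical).

```python
import math

# Toy model: a pool that reserves VRAM in fixed-size chunks, so the
# "allocated" number a monitoring overlay reports exceeds what's in use.
class ChunkedPool:
    CHUNK_MIB = 256  # hypothetical reservation granularity

    def __init__(self):
        self.used_mib = 0  # what the game's assets actually occupy

    def request(self, mib):
        self.used_mib += mib

    @property
    def allocated_mib(self):
        # reservations round up to whole chunks
        return math.ceil(self.used_mib / self.CHUNK_MIB) * self.CHUNK_MIB

pool = ChunkedPool()
pool.request(900)            # assets actually needed: 900 MiB
print(pool.used_mib)         # 900
print(pool.allocated_mib)    # 1024 -> the number an overlay tends to show
```

The point is only that allocated ≥ used, so a usage readout on a 24GB card is an upper bound, not the minimum a game needs.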

And most games today also have pretty much identical-looking textures at the high and ultra presets.

Ultra often means uncompressed and high means slightly compressed. While playing, you don't see the difference at all.
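A rough back-of-envelope sketch of why the compression tier matters, using the standard per-texel rates (4 bytes for raw RGBA8, 1 byte for BC7; the 4/3 mip factor is the usual geometric-series approximation, and the numbers are illustrative, not from any particular game):

```python
def texture_vram_bytes(width, height, bytes_per_texel, mipmaps=True):
    """Rough footprint of one 2D texture. bytes_per_texel: 4.0 for raw
    RGBA8, 1.0 for BC7, 0.5 for BC1. A full mip chain adds ~1/3 on top
    (geometric series 1 + 1/4 + 1/16 + ... ~= 4/3)."""
    base = width * height * bytes_per_texel
    return base * 4 / 3 if mipmaps else base

MiB = 1024 * 1024
print(f"{texture_vram_bytes(4096, 4096, 4) / MiB:.1f} MiB")  # ~85.3 (raw RGBA8)
print(f"{texture_vram_bytes(4096, 4096, 1) / MiB:.1f} MiB")  # ~21.3 (BC7)
```

Roughly a 4x difference per texture for a visually similar result, which is why an "uncompressed ultra" preset can balloon VRAM without a matching jump in image quality.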

But sure, keep praising VRAM as the single thing that will make your GPU futureproof.

Nvidia already showcased long ago that their Neural Texture Compression tech can make graphics look much better while using a lot less VRAM. The tech exists.
 
Last edited:
Joined
Jun 6, 2022
Messages
622 (0.66/day)
When your 3070 Ti running with RT matches a 4090 without RT... yeah, that is yet another example of why we shouldn't compare benchmarks the way you do. :roll:
View attachment 329909
View attachment 329910
"TPU Custom Scene", not the benchmark included in the game. And there is a difference between 1080p and 1440p.
Anyway, the game runs smoothly with 8GB of VRAM, and you can keep worshipping HU. Maybe donate some money to them, since they make you happy, if you haven't noticed in the TPU review how much VRAM this game requires.

I didn't need a psychologist; the game ran fluently (1080p, ULTRA + RT, without FSR). But I think a psychiatrist is needed for those who think it's OK to buy a $500 video card and play at 50 FPS just to prove... I don't know what.
I repeat: the TPU reviews from 2021 and 2024 for the 3070 and 6700 XT are valid for the real world.
Shall we wait another 10 years? Maybe then the 6700 XT at a 2 fps average beats the 3070 at a 1.99 fps average and HU celebrates..

The Callisto Protocol Screenshot 2024.01.16 - 17.45.30.82.jpg
 
Last edited: