
Are game requirements and VRAM usage a joke today?

Joined
Sep 10, 2018
Messages
7,288 (3.15/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
VRAM is pretty binary -- you either have enough or you don't, and when you don't (which is increasingly common at higher res), the game just runs like trash or doesn't run at all.

That being said, if you plan to flip the card for a newer model in the next 18 months, then you're gonna be just fine.

If you're buying a 3070ti today for some 1440P gaming, I'm not sure you're going to be very happy with it.

My buddy, who just ditched his 6700XT, was going to pick up a used 3070ti for like $350. I told him don't do it, and he ended up with a 4070. He games at ultrawide 1440p. Something about how AMD cards capture clips or some crap was why he wasn't happy with the AMD card and FSR... probably niche reasons.
 
Joined
Dec 25, 2020
Messages
7,230 (4.89/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
My buddy, who just ditched his 6700XT, was going to pick up a used 3070ti for like $350. I told him don't do it, and he ended up with a 4070. He games at ultrawide 1440p. Something about how AMD cards capture clips or some crap was why he wasn't happy with the AMD card and FSR... probably niche reasons.

AMD's video recording (ReLive) really can't hold a candle to ShadowPlay, and FSR is markedly inferior to DLSS, so I understand that.

I can believe it; I got 12 fps on my laptop, which has an i9-13980HX and a 4090 Mobile. My GPU is usually a fair bit weaker than a 7900 XTX, but it does outperform it in some titles that favor NVIDIA.

4090 mobile = 4080 desktop = roughly equal or faster than 7900 XTX though. You sure you aren't power or thermally limited?

There must be a typo there - 1080p medium requires a 3070 or 6700 XT, but for 1440p, a 3060 or 6600 XT is enough?

6600 XT is targeting 30 fps

 
Joined
Jun 6, 2022
Messages
622 (0.65/day)
My logic is as follows: if I'm going to pay more, I want to get more for my money. More memory, faster modules, a wider bus, and a faster GPU. Above all, something actually, physically real: what we call hardware, or informally "iron". Just as I don't haggle over the price in the store but accept what's on the sticker, the corporation shouldn't sell me empty talk and fake frames, but something I can touch. I don't want to keep reducing and reducing and reducing settings because something is missing.
Paradoxically, paying more for a more powerful video card is always a compromise.
 
Joined
Oct 14, 2007
Messages
663 (0.11/day)
Location
Auburn Hills, MI
Processor Intel i9-13900KF
Motherboard ASUS Z790M-Plus
Memory 64 GB DDR5 @ 6000 G. Skill Trident Z
Video Card(s) ASUS TUF Gaming RTX 4090
Storage 2 TB SN850X + 2x 4 TB Lexar NM790
Display(s) 32" 4K/240 Hz W-OLED w/ 1080P/480Hz Mode + 39" 3440x1440 240 Hz W-OLED
Case Lian Li O11 Mini
Audio Device(s) Kali LP-UNF + Audeze Maxwell
Power Supply Corsair RM1000x
Mouse Logitech G502 X Plus
Keyboard Keychron Q6 Pro
4090 mobile = 4080 desktop = roughly equal or faster than 7900 XTX though. You sure you aren't power or thermally limited?

The mobile variant has a much lower power limit. It's capped at 175W. Performance is closer to a 4070 Ti or 7900XT.
 
Joined
Dec 25, 2020
Messages
7,230 (4.89/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
The mobile variant has a much lower power limit. It's capped at 175W. Performance is closer to a 4070 Ti or 7900XT.

I see. Still, 175 W should be plenty of fun with an AD103 chip. I don't think I've ever seen my 4080 much above 200 W when playing games, though :D
 
Joined
Jun 6, 2022
Messages
622 (0.65/day)
If you're buying a 3070ti today for some 1440P gaming, I'm not sure you're going to be very happy with it.
The RTX 3070 Ti has been on sale since June 2021. We are in November 2023. I don't know why we switched to prices when discussing VRAM.

The Last of Us was so unstable on 8 GB cards that NVIDIA released a hotfix for it. That fix eventually got worked into the mainline driver. It sounds like you may have played after that happened; I played before. It was not okay at 1440p, high settings, native. You could even say it was a 'problem', hence why there were fixes, both from NVIDIA and the game devs.


As for the rest of your arguments, I have already addressed/answered them in the past; just go back. And please don't insinuate things about me that you don't have evidence for.
You seemed to be saying that the VRAM limitation was the cause of your problems. I notice that you have now had a revelation.
 
Joined
Nov 13, 2007
Messages
10,895 (1.74/day)
Location
Austin Texas
System Name stress-less
Processor 9800X3D @ 5.42GHZ
Motherboard MSI PRO B650M-A Wifi
Cooling Thermalright Phantom Spirit EVO
Memory 64GB DDR5 6400 1:1 CL30-36-36-76 FCLK 2200
Video Card(s) RTX 4090 FE
Storage 2TB WD SN850, 4TB WD SN850X
Display(s) Alienware 32" 4k 240hz OLED
Case Jonsbo Z20
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse DeathadderV2 X Hyperspeed
Keyboard 65% HE Keyboard
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
The RTX 3070 Ti has been on sale since June 2021. We are in November 2023. I don't know why we switched to prices when discussing VRAM.

I didn't say anything about price. I purely meant it in terms of 8GB longevity now -- especially with games coded for the new consoles defaulting to a 12-16GB framebuffer, you're going to see the "game released and ran like a turd on 8GB cards until it got patched" situation much more often.
 
Joined
Sep 10, 2018
Messages
7,288 (3.15/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
I didn't say anything about price. I purely meant it in terms of 8GB longevity now -- especially with games coded for the new consoles defaulting to a 12-16GB framebuffer, you're going to see the "game released and ran like a turd on 8GB cards until it got patched" situation much more often.

They are mostly fine with no RT and settings turned down to high/medium, though... As long as that's the person's mindset at 1440p, it's fine... I think at 1080p, if the 8GB card is cheap enough, it's fine.

Not even sure why this is a debate. Are people defending the asinine 4060ti 8GB, which is also just as crap in its 16GB variant, just slightly less so? People in 2023 should really be targeting 12GB for AAA gaming. Why would anyone want a card that costs $400+ and will start losing steam in a year or two? Makes no sense.
 
Joined
Jun 6, 2022
Messages
622 (0.65/day)
That's an interesting point. Do you have any sources on that?

I have a 2070 on the shelf to test, but no rival card with more VRAM. The closest I have is a 6750 XT, which also has a more powerful GPU.



I agree. DLSS/FSR is nothing more than a "but... but..." reaction from the seller upon not being able to answer why I'm not getting more for my money.
There is no rival for the 2070/2070 Super. The 5700 XT has 8 GB.

I disagree. An RTX offers much more than an RX. In the "technologies" department, AMD is far behind: DLSS, ray tracing, low latency, NVENC and, last but not least, CUDA. You may not know that the old 3070 Ti surpasses the 7900 XTX in rendering thanks to OptiX, another feature offered by NVIDIA. In the new Cinebench 2024, an RX 6900 XT scores ~8,500, while an RTX 3070 Ti reaches 12,000 at stock.

As an RTX owner, I'm excited about DLSS.
I'll give an example with the very game criticized over the last few pages: The Last of Us. In the first video (April 30), I used 1080p and activated DLSS at the 1:30 mark. In the second video, I use DLSS exclusively at 1440p.
It should be noted that, after the VRAM disaster at launch, optimization between April 30th and November made it possible for 1440p to use the same amount of VRAM as 1080p and achieve the same framerate.
Some titles are sponsored by AMD, but the many negative player reviews bring them back to earth.

P.S. I used NVENC for recording. It reduces performance by ~5%; it's not phenomenal, but it's free and, as far as I know, AMD has nothing to compete with it. I also use it for encoding... skyrocket mode.

When comparing RTX to RX, only use rasterization. It is the only viable redoubt for AMD.
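On the NVENC P.S. above: here is a minimal sketch of what an NVENC offload can look like, written as a Python call into ffmpeg. The file names and bitrate are placeholders, and h264_nvenc has to be present in your ffmpeg build for this to work.

```python
# Hypothetical NVENC re-encode of a gameplay capture via ffmpeg.
# "clip.mkv", the 12M bitrate and the output name are placeholders.
import subprocess

cmd = [
    "ffmpeg", "-y",
    "-i", "clip.mkv",        # placeholder input recording
    "-c:v", "h264_nvenc",    # H.264 encode on the GPU's NVENC block
    "-b:v", "12M",           # assumed target bitrate for 1440p gameplay
    "-c:a", "copy",          # pass the audio through untouched
    "clip_nvenc.mp4",
]
subprocess.run(cmd, check=True)
```

The point is simply that the encode runs on the dedicated NVENC block rather than the shader cores, which is why the in-game hit stays in the low single digits.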

 
Joined
Mar 7, 2023
Messages
947 (1.40/day)
System Name BarnacleMan
Processor 14700KF
Motherboard Gigabyte B760 Aorus Elite Ax DDR5
Cooling ARCTIC Liquid Freezer II 240 + P12 Max Fans
Memory 32GB Kingston Fury Beast
Video Card(s) Asus Tuf 4090 24GB
Storage 4TB sn850x, 2TB sn850x, 2TB Netac Nv7000 + 2TB p5 plus, 4TB MX500 * 2 = 18TB. Plus dvd burner.
Display(s) Dell 23.5" 1440P IPS panel
Case Lian Li LANCOOL II MESH Performance Mid-Tower
Audio Device(s) Logitech Z623
Power Supply Gigabyte ud850gm pg5
Keyboard msi gk30
The RTX 3070 Ti has been on sale since June 2021. We are in November 2023. I don't know why we switched to prices when discussing VRAM.


You seemed to be saying that the VRAM limitation was the cause of your problems. I notice that you have now had a revelation.

Yes, with TLOU the VRAM limitation was the problem, which is why DLSS made it playable for me: it lowered the render resolution, and thus freed up just enough VRAM to stop it crashing at 1440p high. It was 60 fps with and 60 fps without (because I had V-sync on); the difference was the stability. But I feel like I've said this already. I'll remind you again that your original argument was that GPU limitation comes before VRAM limitation. With TLOU, in the early days at least, this was not the case.
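To put rough numbers on why a lower internal resolution frees memory, here's a back-of-the-envelope Python sketch. The bytes-per-pixel figure and the ~0.67 per-axis scale for DLSS Quality are assumptions for illustration, not what the game actually allocates.

```python
# Rough, hypothetical estimate of how much render-target memory a lower
# internal resolution frees up. Real engines allocate many more buffers,
# so treat BYTES_PER_PIXEL as an illustrative assumption only.

BYTES_PER_PIXEL = 48        # assumed combined G-buffer/HDR/depth cost per pixel
DLSS_QUALITY_SCALE = 2 / 3  # DLSS Quality renders at roughly 67% per axis

def render_target_mb(width: int, height: int, scale: float = 1.0) -> float:
    """Approximate render-target footprint in MB at a given internal scale."""
    pixels = int(width * scale) * int(height * scale)
    return pixels * BYTES_PER_PIXEL / (1024 ** 2)

native = render_target_mb(2560, 1440)
dlss_q = render_target_mb(2560, 1440, DLSS_QUALITY_SCALE)
print(f"Native 1440p render targets: ~{native:.0f} MB")
print(f"DLSS Quality render targets: ~{dlss_q:.0f} MB")
print(f"Freed up:                    ~{native - dlss_q:.0f} MB")
```

Textures and geometry don't shrink with the render scale, so the saving is only a slice of the total budget, but on a card that's a few hundred MB over the edge, that slice can be the difference between crashing and not.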
 
Joined
Jan 14, 2019
Messages
13,280 (6.07/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
AMD's video recording (ReLive) really can't hold a candle to ShadowPlay, and FSR is markedly inferior to DLSS, so I understand that.
That's what people say, although personally, I haven't seen a single pixel's worth of difference between ShadowPlay and ReLive, or between DLSS and FSR.

There is no rival for the 2070/2070 Super. The 5700 XT has 8 GB.

I disagree. An RTX offers much more than an RX. In the "technologies" department, AMD is far behind: DLSS, ray tracing, low latency, NVENC and, last but not least, CUDA. You may not know that the old 3070 Ti surpasses the 7900 XTX in rendering thanks to OptiX, another feature offered by NVIDIA. In the new Cinebench 2024, an RX 6900 XT scores ~8,500, while an RTX 3070 Ti reaches 12,000 at stock.

As an RTX owner, I'm excited about DLSS.
I'll give an example with the very game criticized over the last few pages: The Last of Us. In the first video (April 30), I used 1080p and activated DLSS at the 1:30 mark. In the second video, I use DLSS exclusively at 1440p.
It should be noted that, after the VRAM disaster at launch, optimization between April 30th and November made it possible for 1440p to use the same amount of VRAM as 1080p and achieve the same framerate.
Some titles are sponsored by AMD, but the many negative player reviews bring them back to earth.

P.S. I used NVENC for recording. It reduces performance by ~5%; it's not phenomenal, but it's free and, as far as I know, AMD has nothing to compete with it. I also use it for encoding... skyrocket mode.

When comparing RTX to RX, only use rasterization. It is the only viable redoubt for AMD.

That is entirely your own opinion that I may or may not agree with (I don't want to start another DLSS or no DLSS battle). I'll watch the videos when I get home from work, but I doubt that they'll sway me either way.
 
Joined
Mar 7, 2023
Messages
947 (1.40/day)
System Name BarnacleMan
Processor 14700KF
Motherboard Gigabyte B760 Aorus Elite Ax DDR5
Cooling ARCTIC Liquid Freezer II 240 + P12 Max Fans
Memory 32GB Kingston Fury Beast
Video Card(s) Asus Tuf 4090 24GB
Storage 4TB sn850x, 2TB sn850x, 2TB Netac Nv7000 + 2TB p5 plus, 4TB MX500 * 2 = 18TB. Plus dvd burner.
Display(s) Dell 23.5" 1440P IPS panel
Case Lian Li LANCOOL II MESH Performance Mid-Tower
Audio Device(s) Logitech Z623
Power Supply Gigabyte ud850gm pg5
Keyboard msi gk30
There is no rival for the 2070/2070 Super. The 5700 XT has 8 GB.

I disagree. An RTX offers much more than an RX. In the "technologies" department, AMD is far behind: DLSS, ray tracing, low latency, NVENC and, last but not least, CUDA. You may not know that the old 3070 Ti surpasses the 7900 XTX in rendering thanks to OptiX, another feature offered by NVIDIA. In the new Cinebench 2024, an RX 6900 XT scores ~8,500, while an RTX 3070 Ti reaches 12,000 at stock.

As an RTX owner, I'm excited about DLSS.
I'll give an example with the very game criticized over the last few pages: The Last of Us. In the first video (April 30), I used 1080p and activated DLSS at the 1:30 mark. In the second video, I use DLSS exclusively at 1440p.
It should be noted that, after the VRAM disaster at launch, optimization between April 30th and November made it possible for 1440p to use the same amount of VRAM as 1080p and achieve the same framerate.
Some titles are sponsored by AMD, but the many negative player reviews bring them back to earth.

P.S. I used NVENC for recording. It reduces performance by ~5%; it's not phenomenal, but it's free and, as far as I know, AMD has nothing to compete with it. I also use it for encoding... skyrocket mode.

When comparing RTX to RX, only use rasterization. It is the only viable redoubt for AMD.


Any chance we could not make this another AMD vs. NVIDIA thread? It was about the increasing VRAM requirements of games.
 
Joined
Sep 25, 2023
Messages
159 (0.34/day)
Location
Finland
System Name RayneOSX
Processor Ryzen 9 5900x 185w
Motherboard Asus Strix x570-E
Cooling EKWB AIO 360 RGB (9 ekwb vardar RGB fans)
Memory Gskill Trident Z neo CL18 3600mhz 64GB
Video Card(s) MSI RTX 3080 Gaming Z Trio 10G LHR
Storage Wayy too many to list here
Display(s) Samsung Odyssey G5 1440p 144hz 27 x2 / Samsung Odyssey CRG5 1080p 144hz 24
Case LianLi 011D white w/ Vertical GPU
Audio Device(s) Sound Blaster Z / Swisssonic Audio 2
Power Supply Corsair RM1000x (2021)
Mouse Logitech G502 Hyperion Fury
Keyboard Ducky One 2 mini Cherry MX silent Nordic
VR HMD Quest 2
Software Win10 Pro / Ubuntu 20.04
Benchmark Scores Timespy 17 219 https://www.3dmark.com/spy/49036100 PERKELE!
The funniest part is that some people forget that some users have higher VRAM requirements and end up needing a two-PC setup to handle everything, while a 4090 can't justify its price: VTubing, encoding, maxing out the GPU in several different ways, and so on.
 
Joined
Jan 14, 2019
Messages
13,280 (6.07/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
The funniest part is that some people forget that some users have higher VRAM requirements and end up needing a two-PC setup to handle everything, while a 4090 can't justify its price: VTubing, encoding, maxing out the GPU in several different ways, and so on.
Individual needs are always a thing. That's why I don't agree with blanket statements like "nobody should buy a graphics card with 8 GB VRAM in 2023".
 
Joined
Mar 7, 2023
Messages
947 (1.40/day)
System Name BarnacleMan
Processor 14700KF
Motherboard Gigabyte B760 Aorus Elite Ax DDR5
Cooling ARCTIC Liquid Freezer II 240 + P12 Max Fans
Memory 32GB Kingston Fury Beast
Video Card(s) Asus Tuf 4090 24GB
Storage 4TB sn850x, 2TB sn850x, 2TB Netac Nv7000 + 2TB p5 plus, 4TB MX500 * 2 = 18TB. Plus dvd burner.
Display(s) Dell 23.5" 1440P IPS panel
Case Lian Li LANCOOL II MESH Performance Mid-Tower
Audio Device(s) Logitech Z623
Power Supply Gigabyte ud850gm pg5
Keyboard msi gk30
Individual needs are always a thing. That's why I don't agree with blanket statements like "nobody should buy a graphics card with 8 GB VRAM in 2023".

That's true. I usually assume what's meant is "if you want to play new AAA games going forward for a couple of years". If all you play is League of Legends or something, it's not an issue at all. That's why PC gaming is so great: you always have a huge backlog of old games. I still game on an eMac from 2001 that has 32 MB of DDR VRAM. The CRT is great for games that were meant to run at lower resolutions; every resolution is native resolution on those, well, more or less.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,328 (1.29/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
If you're buying a 3070ti today for some 1440P gaming, I'm not sure you're going to be very happy with it.
Oh, 100%. I wouldn't buy a 10GB card today targeting 3440x1440 or 4K, but the one bought 3 years ago is proving non-problematic so far. Relative to the features I like to enable in games, the GPU is lacking more in outright grunt than it is in VRAM, and DLSS is proving to be the fine wine keeping it going strong. For my next GPU, which will need to see out 4K (up to) 120 Hz for another 2 generations, I will be looking at a bare minimum of 16GB, but likely 20+.
 
Joined
Oct 6, 2021
Messages
1,605 (1.35/day)
There is no rival for the 2070/2070 Super. The 5700 XT has 8 GB.

I disagree. An RTX offers much more than an RX. In the "technologies" department, AMD is far behind: DLSS, ray tracing, low latency, NVENC and, last but not least, CUDA. You may not know that the old 3070 Ti surpasses the 7900 XTX in rendering thanks to OptiX, another feature offered by NVIDIA. In the new Cinebench 2024, an RX 6900 XT scores ~8,500, while an RTX 3070 Ti reaches 12,000 at stock.

As an RTX owner, I'm excited about DLSS.
I'll give an example with the very game criticized over the last few pages: The Last of Us. In the first video (April 30), I used 1080p and activated DLSS at the 1:30 mark. In the second video, I use DLSS exclusively at 1440p.
It should be noted that, after the VRAM disaster at launch, optimization between April 30th and November made it possible for 1440p to use the same amount of VRAM as 1080p and achieve the same framerate.
Some titles are sponsored by AMD, but the many negative player reviews bring them back to earth.

P.S. I used NVENC for recording. It reduces performance by ~5%; it's not phenomenal, but it's free and, as far as I know, AMD has nothing to compete with it. I also use it for encoding... skyrocket mode.*

When comparing RTX to RX, only use rasterization. It is the only viable redoubt for AMD.

You call it technology; I call it a cancer on the gaming industry. I would never, ever choose a GPU based on this.

It is useful, first, for making the terrible games that are released broken "almost playable" (a side effect of Huang's Pandora's box having been opened), and second, for justifying the existence of products with similar prices and almost zero generational advancement, like the 3060 to 4060.

*The few people who make money from this type of activity will buy dedicated capture cards. Bro, not everyone buys a GPU to spend time recording games or other activities instead of enjoying a few hours of gameplay, given the hectic lives most people live.
 
Joined
Sep 25, 2023
Messages
159 (0.34/day)
Location
Finland
System Name RayneOSX
Processor Ryzen 9 5900x 185w
Motherboard Asus Strix x570-E
Cooling EKWB AIO 360 RGB (9 ekwb vardar RGB fans)
Memory Gskill Trident Z neo CL18 3600mhz 64GB
Video Card(s) MSI RTX 3080 Gaming Z Trio 10G LHR
Storage Wayy too many to list here
Display(s) Samsung Odyssey G5 1440p 144hz 27 x2 / Samsung Odyssey CRG5 1080p 144hz 24
Case LianLi 011D white w/ Vertical GPU
Audio Device(s) Sound Blaster Z / Swisssonic Audio 2
Power Supply Corsair RM1000x (2021)
Mouse Logitech G502 Hyperion Fury
Keyboard Ducky One 2 mini Cherry MX silent Nordic
VR HMD Quest 2
Software Win10 Pro / Ubuntu 20.04
Benchmark Scores Timespy 17 219 https://www.3dmark.com/spy/49036100 PERKELE!
Individual needs are always a thing. That's why I don't agree with blanket statements like "nobody should buy a graphics card with 8 GB VRAM in 2023".
Oh yeah, absolutely. For example, for those second-PC setups, 8 GB of VRAM is absolutely fine, whereas in the primary rig 10, 12 or 16 GB is plenty. I'm running my wife's old GTX 1080 and I barely hit 4-5 GB of VRAM with all the stuff I run, and I could still run more (I'm getting parts to reuse my old 3700X, as I'm a bit CPU-limited in some scenarios).

I've been wondering if I should just buy a 4060 or a 3060 because of the RT cores (AI motion capture), the newer NVENC and some other stuff.
Those GPUs can be put to plenty of other use cases, although the price at times isn't that good (I'm looking at you, European prices).

Graphics cards can be used for plenty of scenarios, not just "gaming", as they're general-purpose units that can run lots of parallel workloads really fast compared to a CPU. For example, I use the secondary PC to quickly render proxies for editing, and while that's running, my main rig is free for other stuff. Low-power GPUs really do rock in that sense, tbh.

Oh, 100%. I wouldn't buy a 10GB card today targeting 3440x1440 or 4K, but the one bought 3 years ago is proving non-problematic so far. Relative to the features I like to enable in games, the GPU is lacking more in outright grunt than it is in VRAM, and DLSS is proving to be the fine wine keeping it going strong. For my next GPU, which will need to see out 4K (up to) 120 Hz for another 2 generations, I will be looking at a bare minimum of 16GB, but likely 20+.
I totally agree, 10 GB for ultrawide 1440p is heavy, really heavy. I was able to get around the VRAM management on my 3080 by using HW scheduling and not going above high graphics settings in current titles; I normally average 9-9.7 GB with DLSS (Quality) and RT at 1440p, plus OBS at the same time (this is in Cyberpunk 2077). The next GPU I pick will have to have 16 GB of VRAM minimum, and I'll just stick there. I'd like to crank some settings higher, like the textures, but sadly not today.
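If you want hard numbers instead of eyeballing an overlay, a tiny logger around nvidia-smi does the job. The query flags below are standard nvidia-smi options; the 5-second interval and the single-GPU assumption are just choices for this sketch.

```python
# Quick-and-dirty VRAM logger built on nvidia-smi (ships with the NVIDIA driver).
import subprocess
import time

QUERY = [
    "nvidia-smi",
    "--query-gpu=memory.used,memory.total",
    "--format=csv,noheader,nounits",
]

def sample_vram():
    """Return (used_mib, total_mib) for the first GPU reported by nvidia-smi."""
    line = subprocess.check_output(QUERY, text=True).strip().splitlines()[0]
    used, total = (int(v) for v in line.split(","))
    return used, total

if __name__ == "__main__":
    while True:
        used, total = sample_vram()
        print(f"{time.strftime('%H:%M:%S')}  {used} / {total} MiB")
        time.sleep(5)  # arbitrary polling interval for this sketch
```

Run it in a terminal while you play and you'll see exactly how close to the 10 GB ceiling a given settings combo sits.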
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,328 (1.29/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
Joined
Sep 25, 2023
Messages
159 (0.34/day)
Location
Finland
System Name RayneOSX
Processor Ryzen 9 5900x 185w
Motherboard Asus Strix x570-E
Cooling EKWB AIO 360 RGB (9 ekwb vardar RGB fans)
Memory Gskill Trident Z neo CL18 3600mhz 64GB
Video Card(s) MSI RTX 3080 Gaming Z Trio 10G LHR
Storage Wayy too many to list here
Display(s) Samsung Odyssey G5 1440p 144hz 27 x2 / Samsung Odyssey CRG5 1080p 144hz 24
Case LianLi 011D white w/ Vertical GPU
Audio Device(s) Sound Blaster Z / Swisssonic Audio 2
Power Supply Corsair RM1000x (2021)
Mouse Logitech G502 Hyperion Fury
Keyboard Ducky One 2 mini Cherry MX silent Nordic
VR HMD Quest 2
Software Win10 Pro / Ubuntu 20.04
Benchmark Scores Timespy 17 219 https://www.3dmark.com/spy/49036100 PERKELE!
Can you elaborate on what this does to help/mitigate?
Normally you would run the programs that need GPU priority as admin, which gives them GPU/VRAM priority access; with hardware-accelerated GPU scheduling enabled, though, the memory and GPU are managed in a more dynamic way.
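If you want to check the toggle programmatically rather than digging through the Settings app, here's a minimal Python sketch. The HwSchMode value and its meaning (2 = on, 1 = off) are assumptions based on how the setting is commonly documented; the supported way to change it is still the Windows Graphics settings page plus a reboot.

```python
# Minimal sketch: read whether hardware-accelerated GPU scheduling (HAGS)
# appears to be enabled, via the commonly documented HwSchMode registry value.
import winreg

KEY_PATH = r"SYSTEM\CurrentControlSet\Control\GraphicsDrivers"

def hags_enabled():
    """Return True/False if HwSchMode is present, or None if it is not reported."""
    try:
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
            value, _ = winreg.QueryValueEx(key, "HwSchMode")
            return value == 2  # assumed: 2 = enabled, 1 = disabled
    except FileNotFoundError:
        return None  # older Windows build or unsupported GPU/driver

state = hags_enabled()
print("HAGS:", {True: "enabled", False: "disabled", None: "not reported"}[state])
```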

For the longest time, the OBS devs recommended against using it with OBS, because back when it was the hot new feature it caused totally random issues in OBS: scene rendering glitches, NVENC bugging out, etc.

After some testing, and the opinion of an OBS plugin dev, I decided to start testing really demanding games with OBS as admin and HW scheduling on and off.

One of the biggest differences: with OBS, the best way to capture games is hotkey hooking, so opening OBS first and then the game would cause some random rendering lag, which is due to the GPU being maxed out. With HW scheduling on, I was able to max out GPU usage and VRAM without OBS complaining about issues. I tested this with Cyberpunk 2077 with DLSS + RT, and with Battlefield V at native res and 144 fps without RT. I didn't have any issues thanks to running OBS without admin and with HW scheduling enabled; later on I ran some of my VTubing programs at the same time and it went surprisingly well.

My setup relies on frames being stable and OBS not complaining. When I stream/record, I cap my game at 120 fps while OBS runs at 60 fps; this is crucial, as I run a two-PC setup and any lag in OBS will introduce lag into the encoder PC. In the case of Cyberpunk 2077, I cap it at 60 for smooth fps without heavy ups and downs.

Since then I haven't run into VRAM issues. I don't have more in-depth knowledge about how HW scheduling works, and I'd love it if GN/TPU would do some tests on this.

Because my wife and I have the same setup, I'm going to run some more in-depth tests soon and push the VRAM even harder to see how it goes, as we like doing dual-POV streaming; our rigs are 98% the same.

If anyone would love to do tests with this, please ping me!
 
Joined
Jan 14, 2019
Messages
13,280 (6.07/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
Oh yeah, absolutely. For example, for those second-PC setups, 8 GB of VRAM is absolutely fine, whereas in the primary rig 10, 12 or 16 GB is plenty. I'm running my wife's old GTX 1080 and I barely hit 4-5 GB of VRAM with all the stuff I run, and I could still run more (I'm getting parts to reuse my old 3700X, as I'm a bit CPU-limited in some scenarios).

I've been wondering if I should just buy a 4060 or a 3060 because of the RT cores (AI motion capture), the newer NVENC and some other stuff.
Those GPUs can be put to plenty of other use cases, although the price at times isn't that good (I'm looking at you, European prices).

Graphics cards can be used for plenty of scenarios, not just "gaming", as they're general-purpose units that can run lots of parallel workloads really fast compared to a CPU. For example, I use the secondary PC to quickly render proxies for editing, and while that's running, my main rig is free for other stuff. Low-power GPUs really do rock in that sense, tbh.


I totally agree, 10 GB for ultrawide 1440p is heavy, really heavy. I was able to get around the VRAM management on my 3080 by using HW scheduling and not going above high graphics settings in current titles; I normally average 9-9.7 GB with DLSS (Quality) and RT at 1440p, plus OBS at the same time (this is in Cyberpunk 2077). The next GPU I pick will have to have 16 GB of VRAM minimum, and I'll just stick there. I'd like to crank some settings higher, like the textures, but sadly not today.
I've got 4 GB on the 1050 Ti in one of my HTPCs, and I can still play Kingdom Come: Deliverance on it. It fills up the whole 4 GB, but as long as I get 30-ish FPS on average, who cares? :D

I've got 2 GB on the GT 1030 in my other HTPC, and I can play Nascar Heat 5 on it just fine.

You can also use your GPU to contribute to science project calculations using Folding@Home or BOINC.

There should be no blanket statements, especially not in the extremely diverse gaming community.
 
Joined
Sep 25, 2023
Messages
159 (0.34/day)
Location
Finland
System Name RayneOSX
Processor Ryzen 9 5900x 185w
Motherboard Asus Strix x570-E
Cooling EKWB AIO 360 RGB (9 ekwb vardar RGB fans)
Memory Gskill Trident Z neo CL18 3600mhz 64GB
Video Card(s) MSI RTX 3080 Gaming Z Trio 10G LHR
Storage Wayy too many to list here
Display(s) Samsung Odyssey G5 1440p 144hz 27 x2 / Samsung Odyssey CRG5 1080p 144hz 24
Case LianLi 011D white w/ Vertical GPU
Audio Device(s) Sound Blaster Z / Swisssonic Audio 2
Power Supply Corsair RM1000x (2021)
Mouse Logitech G502 Hyperion Fury
Keyboard Ducky One 2 mini Cherry MX silent Nordic
VR HMD Quest 2
Software Win10 Pro / Ubuntu 20.04
Benchmark Scores Timespy 17 219 https://www.3dmark.com/spy/49036100 PERKELE!
I've got 4 GB on the 1050 Ti in one of my HTPCs, and I can still play Kingdom Come: Deliverance on it. It fills up the whole 4 GB, but as long as I get 30-ish FPS on average, who cares?
Oh yeah, totally, that experience for HTPCs is pretty insane!!


I've got 2 GB on the GT 1030 in my other HTPC, and I can play Nascar Heat 5 on it just fine.
Most people don't know this, but the GT 1030 can decode 4K HDR just fine! I have one, and that's what I used as a "CUDA" GPU in the first iteration of the streaming rig; Quick Sync did the encoding, and it was a pretty good combo (i7-7700K). My wife used that GT 1030 as her gaming GPU until she got the GTX 1080, so it's totally doable!
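If anyone wants to sanity-check that kind of decode headroom themselves, a decode-only ffmpeg pass works; a minimal sketch below, where "movie_4k_hdr.mkv" is a placeholder file and the approach assumes an ffmpeg build with CUDA/NVDEC support.

```python
# Hypothetical check that a 4K HDR file decodes in hardware (NVDEC) fast enough,
# by running a decode-only pass into a null output and reading the speed= stat.
import subprocess

cmd = [
    "ffmpeg", "-hide_banner",
    "-hwaccel", "cuda",          # NVDEC decode on the card
    "-i", "movie_4k_hdr.mkv",    # placeholder 4K HDR source
    "-f", "null", "-",           # decode only, discard the frames
]
subprocess.run(cmd, check=True)  # a speed well above 1x means real-time headroom
```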


You can also use your GPU to contribute to science project calculations using Folding@Home or BOINC.
I did some with my Vega 64 at some point. I've been wondering whether I should put it into my Proxmox server and leave it running there via passthrough; I need to look into it, as I have a GT 545 in there haha

There should be no blanket statements, especially not in the extremely diverse gaming community
Absolutely!
 
Joined
Jan 14, 2019
Messages
13,280 (6.07/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
I did some with my Vega 64 at some point. I've been wondering whether I should put it into my Proxmox server and leave it running there via passthrough; I need to look into it, as I have a GT 545 in there haha
If you're interested, pop to the BOINC thread here on TPU. There are some interesting projects with TPU teams that you can join. :)
 
Joined
Sep 25, 2023
Messages
159 (0.34/day)
Location
Finland
System Name RayneOSX
Processor Ryzen 9 5900x 185w
Motherboard Asus Strix x570-E
Cooling EKWB AIO 360 RGB (9 ekwb vardar RGB fans)
Memory Gskill Trident Z neo CL18 3600mhz 64GB
Video Card(s) MSI RTX 3080 Gaming Z Trio 10G LHR
Storage Wayy too many to list here
Display(s) Samsung Odyssey G5 1440p 144hz 27 x2 / Samsung Odyssey CRG5 1080p 144hz 24
Case LianLi 011D white w/ Vertical GPU
Audio Device(s) Sound Blaster Z / Swisssonic Audio 2
Power Supply Corsair RM1000x (2021)
Mouse Logitech G502 Hyperion Fury
Keyboard Ducky One 2 mini Cherry MX silent Nordic
VR HMD Quest 2
Software Win10 Pro / Ubuntu 20.04
Benchmark Scores Timespy 17 219 https://www.3dmark.com/spy/49036100 PERKELE!
If you're interested, pop to the BOINC thread here on TPU. There are some interesting projects with TPU teams that you can join. :)
I didn't know they had one; I'll write it down in case I forget!
 
Joined
Dec 25, 2020
Messages
7,230 (4.89/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
That's what people say, although personally, I haven't seen a single pixel's worth of difference between ShadowPlay and ReLive, or between DLSS and FSR.


That is entirely your own opinion that I may or may not agree with (I don't want to start another DLSS or no DLSS battle). I'll watch the videos when I get home from work, but I doubt that they'll sway me either way.

Until the 7900 XTX came around, the video encoder on Radeon was complete trash for H.264, even compared to Pascal's NVENC, so yeah, there was a very large gulf between them in quality. This is no longer the case as long as you have RDNA 3, but it still holds for the 6950 XT and older.


Just look at TPU's own reviews on the subject; you'll see DLSS always shines in the small details, doesn't shimmer as much, etc., although you may not have the panel or the eye for that level of detail sometimes. FSR's only true strength is its hardware-agnostic nature: you can get FSR to run on anything.
 