• Welcome to TechPowerUp Forums, Guest! Please check out our forum guidelines for info related to our community.

Are game requirements and VRAM usage a joke today?

Joined
Jan 14, 2019
Messages
13,274 (6.07/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
There is no rival for the 2070/2070 Super. 5700XT has 8GB.

I disagree. An RTX offers much more than an RX. In the "technologies" department, AMD is far behind: DLSS, ray tracing, low latency, NVENC and, last but not least, CUDA. I don't know if you noticed that the old 3070 Ti surpasses the 7900 XTX in rendering thanks to OptiX, another feature offered by Nvidia. In the new Cinebench 2024, an RX 6900 XT scores ~8,500, while an RTX 3070 Ti reaches 12,000 in stock form.

As an RTX owner, I'm excited about DLSS.
I will give an example from the very game criticized in the last few pages: The Last of Us. In the first video (April 30) I used 1080p and activated DLSS at minute 1:30. In the second video I use DLSS exclusively at 1440p.
It should be noted (after the VRAM disaster at launch) that between April 30th and November, game optimization made it possible for 1440p to use the same amount of VRAM as 1080p and achieve the same framerate.
Some titles are sponsored by AMD, but the many negative reviews from players bring them back down to earth.

P.S. I used NVENC for recording. It reduces performance by ~5%, which isn't phenomenal, but it's free and, as far as I know, AMD has nothing to compete with it. I also use it for video encoding, where it absolutely flies.
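For anyone curious what "using NVENC for encoding" looks like in practice, here is a minimal sketch via ffmpeg's `h264_nvenc` encoder. This is an illustrative command, not the poster's actual setup: it assumes an ffmpeg build with NVENC support, and the file names and bitrate are placeholders.

```shell
# Re-encode a recording on the GPU's NVENC block instead of the CPU.
# Assumptions: ffmpeg built with NVENC support, an Nvidia GPU present;
# the input/output names and the 8 Mb/s target are placeholders.
ffmpeg -i gameplay_raw.mkv \
       -c:v h264_nvenc -preset p5 -b:v 8M \
       -c:a copy \
       gameplay_nvenc.mp4
```

Because the encode runs on the dedicated NVENC module, the shader/CUDA cores stay almost entirely free for the game itself, which is consistent with the ~5% recording overhead mentioned above.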

When comparing RTX to RX, only compare rasterization. It's the only viable redoubt AMD has left.

I tried to see any difference in your first video, but all I see is encoding (bitrate-related) artefacts in both native and DLSS.

Until the 7900 XTX came around, the video encoder on Radeon was complete trash for H.264, even compared to Pascal's NVENC, so yes, there was a very large quality gulf between them. This is no longer the case as long as you have RDNA 3, but it still holds for the 6950 XT and older.


Just look at TPU's own reviews on the subject: you'll see DLSS always shines in the small details, it doesn't shimmer as much, etc., although you may not have the panel or the eye for detail to notice. FSR's only true strength is its hardware-agnostic nature; you can get FSR to run on anything.
I've never seen any side-by-side comparison until now, but I agree, Nvidia's H.264 looks better at the same bitrate (as for the graphs and numbers, I don't care).
 
Joined
Sep 25, 2023
Messages
159 (0.34/day)
Location
Finland
System Name RayneOSX
Processor Ryzen 9 5900x 185w
Motherboard Asus Strix x570-E
Cooling EKWB AIO 360 RGB (9 ekwb vardar RGB fans)
Memory Gskill Trident Z neo CL18 3600mhz 64GB
Video Card(s) MSI RTX 3080 Gaming Z Trio 10G LHR
Storage Wayy too many to list here
Display(s) Samsung Odyssey G5 1440p 144hz 27 x2 / Samsung Odyssey CRG5 1080p 144hz 24
Case LianLi 011D white w/ Vertical GPU
Audio Device(s) Sound Blaster Z / Swisssonic Audio 2
Power Supply Corsair RM1000x (2021)
Mouse Logitech G502 Hyperion Fury
Keyboard Ducky One 2 mini Cherry MX silent Nordic
VR HMD Quest 2
Software Win10 Pro / Ubuntu 20.04
Benchmark Scores Timespy 17 219 https://www.3dmark.com/spy/49036100 PERKELE!
I've never seen any side-by-side comparison until now, but I agree, Nvidia's H.264 looks better at the same bitrate
I think the only one that might have good comparisons on this is EposVox

Edit: I didn't notice the video referenced is by Epos, lol
 
Last edited:
Joined
Sep 1, 2020
Messages
2,451 (1.54/day)
Location
Bulgaria
Let's just look at when the clip was posted: April 8th, 2020. I can't figure out the date it was recorded, but it must have been before it was published, which is 7 months and 2 weeks before the very first RDNA 2 cards' release date. What in this video supports your claims about quality on RDNA 2 models?
 
Joined
Dec 25, 2020
Messages
7,226 (4.89/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
Let's just look at when the clip was posted: April 8th, 2020. I can't figure out the date it was recorded, but it must have been before it was published, which is 7 months and 2 weeks before the very first RDNA 2 cards' release date. What in this video supports your claims about quality on RDNA 2 models?

Epos has done these videos yearly; that's just the one I had bookmarked from back when I had my Radeon VII. RDNA 2's encoder still suffered compared to Nvidia at lower bitrates.

You really want RDNA 3 for equivalence with Nvidia. It's not a problem now, but it very much was until these cards released, years late.
 
Joined
Sep 25, 2023
Messages
159 (0.34/day)
Location
Finland
Epos has done these videos yearly; that's just the one I had bookmarked from back when I had my Radeon VII. RDNA 2's encoder still suffered compared to Nvidia at lower bitrates.

You really want RDNA 3 for equivalence with Nvidia. It's not a problem now, but it very much was until these cards released, years late.
Yep, a lot of people are also waiting for the AMD encoder revamp in OBS Studio; until then, a lot of people are just going Intel or Nvidia because of how big the difference is.
 
Joined
Mar 7, 2023
Messages
947 (1.40/day)
System Name BarnacleMan
Processor 14700KF
Motherboard Gigabyte B760 Aorus Elite Ax DDR5
Cooling ARCTIC Liquid Freezer II 240 + P12 Max Fans
Memory 32GB Kingston Fury Beast
Video Card(s) Asus Tuf 4090 24GB
Storage 4TB sn850x, 2TB sn850x, 2TB Netac Nv7000 + 2TB p5 plus, 4TB MX500 * 2 = 18TB. Plus dvd burner.
Display(s) Dell 23.5" 1440P IPS panel
Case Lian Li LANCOOL II MESH Performance Mid-Tower
Audio Device(s) Logitech Z623
Power Supply Gigabyte ud850gm pg5
Keyboard msi gk30
I've been wondering if I should just buy a 4060 or a 3060 because of the RT cores (AI motion capture), the newer NVENC and some other stuff.
Those GPUs can also be put to plenty of other use cases, although the price at times isn't that good (I'm looking at you, European prices).

I like the 3060 as pretty much the only cheap and somewhat modern Nvidia card with a reasonable amount of VRAM. However, it is getting pretty slow. I do believe the Yuzu team came out and said not to buy the 4060/4060 Ti because it performs worse than the 3060 due to the memory bandwidth, which I guess is important and not so easily offset with cache in Switch emulation. So it depends, I guess. But I really don't like the 4060 series out of principle.

One positive I can think of is the low-profile 4060. I think it's the best low-profile card you can get (though don't quote me on that).
 
Joined
Sep 25, 2023
Messages
159 (0.34/day)
Location
Finland
I like the 3060 as pretty much the only cheap and somewhat modern Nvidia card with a reasonable amount of VRAM. However, it is getting pretty slow. I do believe the Yuzu team came out and said not to buy the 4060/4060 Ti because it performs worse than the 3060 due to the memory bandwidth, which I guess is important and not so easily offset with cache in Switch emulation. So it depends, I guess. But I really don't like the 4060 series out of principle.

One positive I can think of is the low-profile 4060. I think it's the best low-profile card you can get (though don't quote me on that).
Yeah, it makes sense. The 3060 is pretty good bang for the buck compared to the "4060"; well, in this case the 4060 is a rebranded 4050 after all. I'm not surprised they recommend against buying it.
 
D

Deleted member 57642

Guest
Some points are valid, like:

- games being poorly optimized, so the GPU has to work harder while using more VRAM (for example: rendering far-away textures at high quality, which is mostly a waste of extra VRAM on details you barely notice or miss entirely)
- poorly optimized ports from consoles
- downgrading VRAM on the latest GPUs (lowering both memory bus width and quantity), so that later they can refresh them with the VRAM that was the norm in older generations, thus giving some people a reason to upgrade without actually bringing anything new to the table
- ray tracing still not being worth it (offering very little in visuals for its requirements)

But then there's "the elephant in the room": the obsession with ULTRA HIGH SETTINGS at 2K or 4K resolutions, even while owning a mid-range GPU. Not to mention, the game used as an example (Guild Wars 2) shone more on an artistic level than in having the highest level of detail for its time (which is not that surprising for an MMO of that scale). Even so, on ULTRA SETTINGS + 4K, GW2 can get close to 5 GB of VRAM in some regions, which is roughly 16x higher than the OP's 300 MB (GW2 also switched from DX9 to DX11, which significantly improved the FPS and stuttering beyond the buggy beta-testing phase):

...while in other regions it gets close to 3.6 GB of VRAM:


Yet the majority of GW2 players use a mid-range system (at best), and it runs just fine AT LOWER SETTINGS! Yes, shocking, I know: PC games have multiple graphical profiles (Very Low, Low, Medium, High, Ultra, Ultra+ and everything in between) so that a larger audience can enjoy a well-made game. I might be too old school, but there was a time when gameplay and storyline were the main priority; as for visuals, even the artwork was deemed more important than textures offering the highest level of detail for a given time period. It's quite an irony that the game used as an example might need a $2,000 video card (plus a CPU that can handle it without bottlenecking) to run at 4K+ and highest settings more than a decade after its release (especially if we also take modern refresh rates into account).

Last but not least, this is far from a new thing (some games having higher requirements than current tech can handle). Maybe some of you are too young (or at least were back in 2007), but there's an old meme about such a game catching people by surprise: "Yes, but can it run Crysis?!" :wtf: Another way to put it (if you're indeed too young to get the meme): despite being released in 2007, Crysis played in 8K at highest settings can go over 6 GB of VRAM:


That being said, it's only natural for VRAM requirements to get higher and higher. On the bright side, most developers won't shoot themselves in the foot by implementing a level of detail that only a very small minority of machines can handle (after all, profit is the top priority).
 
Joined
Jul 15, 2022
Messages
963 (1.06/day)
You see the same thing with Photoshop. In 2005, the app opened in about 11 seconds.

Meanwhile, we have M.2 SSDs, and CPUs and RAM have gotten much faster.

But Photoshop still opens in about 11 seconds in 2023. Suppose everything were perfectly optimized: there would be no need for powerful hardware, and then Nvidia, Intel, AMD and many other big companies couldn't sell much anymore.

There is a big financial motive to make software run as slow as possible on current hardware, otherwise billions of people don't feel they need another upgrade for their PC.

I also believe that 4 GB of VRAM could be more than sufficient for 1080p gaming. It has been shown many times that games using more than 4 GB of VRAM at 1080p (max settings) often look no sharper than games that use less than 4 GB at max settings.
 
D

Deleted member 57642

Guest
You see the same thing with Photoshop. In 2005, the app opened in about 11 seconds.

Meanwhile, we have M.2 SSDs, and CPUs and RAM have gotten much faster.

But Photoshop still opens in about 11 seconds in 2023. Suppose everything were perfectly optimized then there would be no need for powerful hardware, and then Nvidia, Intel and AMD and many other big companies could not sell much anymore.

Download the same Photoshop version from 2005 and you'll see an improvement. Likewise, run the latest version of Photoshop on hardware similar to what you used in 2005 (especially on an HDD) and you'll notice the drawbacks of past tech (it might take 4 minutes to load, and really stutter while using the latest features).

Software optimizations have their limits (they can help, but they can't compete with better hardware; there's no room for comparison). In terms of software development, the benefit of all the hardware advancements of the past decade tends to translate into more features and/or more advanced assets being used. Photoshop, for example, used to be highly skill-dependent in 2005 and was used mostly as an offline tool, dependent mainly on local resources. The latest Photoshop has AI on top of the old engine, which makes it quite dependent on online/cloud services, and it takes far less skill to use compared to the 2005 versions thanks to its many automated features. That said, these AI improvements, where an AI tool can cover hours of work in just a couple of seconds, save a lot of time (it can cover 10x more work than in 2005, and quite smoothly with the help of modern hardware).

As for gaming, as mentioned above, some people are obsessed with ULTRA SETTINGS at 2K/4K resolutions, which, if we're being real, would be impossible to achieve with any amount of software optimization (modern games running at maximum settings even on mid-range hardware or lower). More than that, most modern games do come with software optimizations tweakable from the options menu. What do you think all those graphics/video settings are? There are games where you can lower VRAM usage by 70% (compared to ULTRA settings) while also running at higher FPS and still looking great. But no, all the sliders have to be set to Maximum/Ultra+ and the game should still run great, or else it's deemed poorly optimized. That's being absurd/delusional. :kookoo:
 
Joined
Oct 18, 2013
Messages
6,309 (1.54/day)
Location
So close that even your shadow can't see me !
System Name The Little One
Processor i5-11320H @4.4GHZ
Motherboard AZW SEI
Cooling Fan w/heat pipes + side & rear vents
Memory 64GB Crucial DDR4-3200 (2x 32GB)
Video Card(s) Iris XE
Storage WD Black SN850X 8TB m.2, Seagate 2TB SSD + SN850 8TB x2 in an external enclosure
Display(s) 2x Samsung 43" & 2x 32"
Case Practically identical to a mac mini, just purrtier in slate blue, & with 3x usb ports on the front !
Audio Device(s) Yamaha ATS-1060 Bluetooth Soundbar & Subwoofer
Power Supply 65w brick
Mouse Logitech MX Master 2
Keyboard Logitech G613 mechanical wireless
VR HMD Whahdatiz ???
Software Windows 10 pro, with all the unnecessary background shitzu turned OFF !
Benchmark Scores PDQ
Sooooo, after 335 posts of everyone expounding on their own arguments, justifications, complaints and compliments on anything & everything related to this issue, can't we just summarize to say that the answer to the OP's original question is a big, fat, resounding

YES

Game requirements ARE out of control; however, the gaming folks have only themselves to blame, since they are constantly saying they want everything more & better & faster, be it textures, reflections, FPS or whatever, and in most cases are silly enough to pay the game developers & card manufacturers for those wants......

As the old saying goes: If the shoe fits, wear it you must....:eek:..:shadedshu:..:laugh:
 
Joined
Jun 6, 2022
Messages
622 (0.66/day)
You call it technology; I call it a cancer of the gaming industry. I would never, ever choose a GPU based on this.

It is useful, first, to make the terrible games being released broken look "almost playable" (a side effect of Huang's Pandora's box having been opened), and second, to justify the existence of products with similar prices and almost zero generational advancement, like the 3060 to 4060.

*The few people who make money from this type of activity will buy dedicated capture cards. Bro, not everyone purchases a GPU to spend time recording games or other activities instead of enjoying a few hours of gameplay, given the hectic lives most people live.
This "cancer" accompanies you everywhere you use the system: operating system, browser, antivirus, video playback, etc. It's called hardware acceleration.
Out of curiosity, see how much GPU usage you have just when loading the operating system. This "cancer" gave satisfaction to many owners of GeForce 8xxx cards (the 8800 GTS, I think) who managed to install modded drivers to unlock Quadro features. Now, with two or three exceptions, Quadro facilities are available on GeForce.
Understand that a video card is not only used for 3D rasterization.

NVENC (Nvidia's encoder) is not limited to recording. You can use it for encoding and streaming, for example. It is a separate module in the chip, completely independent of the cores.
 
Joined
Sep 25, 2023
Messages
159 (0.34/day)
Location
Finland
Understand that a video card is not only used for 3D rasterization.
Yeah, this is what I was referring to in one of my previous comments: GPUs can do many things, hence the narrative some people push that "nobody should buy an 8 GB card in 2023" is stupid.

NVENC (nVidia encoder) is not limited to recording. You can use it for encoding and streaming, for example. It is a separate module in the chip, completely independent of cores.
NVDEC for decoding, NVENC for encoding. Modded drivers let you have as many NVENC encoding sessions as you want, like on Quadro GPUs :)
It's been nice that Nvidia upped the limit for GeForce from 2 to 3, then from 3 to 5. It has let people stick with their GeForce cards without going for Quadro; most of them just swap to the Studio drivers.
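As an aside, the number of NVENC sessions currently in flight can be read from the driver itself. A hedged sketch for Nvidia cards (nvidia-smi ships with the driver, and the field names below come from its `--help-query-gpu` listing), purely illustrative since it needs real hardware:

```shell
# Show how many NVENC encoder sessions are currently active on the GPU,
# plus their aggregate FPS, to see how close you are to the session cap.
nvidia-smi --query-gpu=encoder.stats.sessionCount,encoder.stats.averageFps \
           --format=csv
```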
 
Joined
Jan 14, 2019
Messages
13,274 (6.07/day)
Location
Midlands, UK
Yeah, this is what I was referring to in one of my previous comments: GPUs can do many things, hence the narrative some people push that "nobody should buy an 8 GB card in 2023" is stupid.
Besides, there are many, many great games that run well with 8 GB, or even less than that. The newest AAA games aren't the only entertainment in existence, and Ultra isn't the only graphics option.
 
Joined
Jun 6, 2022
Messages
622 (0.66/day)
I tried to see any difference in your first video, but all I see is encoding (bitrate-related) artefacts in both native and DLSS.
Try to find a single frame showing 90 fps before DLSS is activated. Even if I film with a dedicated camera, many details are lost on YouTube, probably because it's a simple, free account.
So:
1. FPS difference between DLSS OFF/ON (video 1).
2. The degree of vRAM occupancy in video 1 (April, 1080p) and that of video 2 (November, 1440p).

DLSS does a good job (reviewers say so too) and a lot of work has been done to optimize the memory used in this game.

I stand by my point that the GPU limitation kicks in long before the VRAM limitation. I'm looking at the 6700 XT, and I don't see how 12 GB helps if in all new games (with one or two exceptions) it can't render over 50 fps average at 1440p. Even at 1080p it has big problems in many titles. Is 12 GB really not enough for 1080p?
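One way to test the "GPU limit before VRAM limit" claim yourself is to sample core utilization alongside memory use while a game runs. A minimal sketch for Nvidia cards (nvidia-smi ships with the driver; on Radeon you'd reach for a different tool), illustrative only since it needs real hardware:

```shell
# Log GPU core utilization next to VRAM used/total, once per second.
# If utilization pins near 100% while memory.used sits well below
# memory.total, the core, not the VRAM, is the bottleneck.
nvidia-smi --query-gpu=utilization.gpu,memory.used,memory.total \
           --format=csv -l 1
```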
 
Joined
Jan 14, 2019
Messages
13,274 (6.07/day)
Location
Midlands, UK
1. FPS difference between DLSS OFF/ON (video 1).
2. The degree of vRAM occupancy in video 1 (April, 1080p) and that of video 2 (November, 1440p).
Both parts of the footage look smooth, so why would I care?
 

freeagent

Moderator
Staff member
Joined
Sep 16, 2018
Messages
9,214 (3.99/day)
Location
Winnipeg, Canada
Processor AMD R7 5800X3D
Motherboard Asus Crosshair VIII Dark Hero
Cooling Thermalright Frozen Edge 360, 3x TL-B12 V2, 2x TL-B12 V1
Memory 2x8 G.Skill Trident Z Royal 3200C14, 2x8GB G.Skill Trident Z Black and White 3200 C14
Video Card(s) Zotac 4070 Ti Trinity OC
Storage WD SN850 1TB, SN850X 2TB, SN770 1TB
Display(s) LG 50UP7100
Case Fractal Torrent Compact
Audio Device(s) JBL Bar 700
Power Supply Seasonic Vertex GX-1000, Monster HDP1800
Mouse Logitech G502 Hero
Keyboard Logitech G213
VR HMD Oculus 3
Software Yes
Benchmark Scores Yes
Game requirements ARE out of control; however, the gaming folks have only themselves to blame, since they are constantly saying they want everything more & better & faster, be it textures, reflections, FPS or whatever, and in most cases are silly enough to pay the game developers & card manufacturers for those wants......
This is a good point.

Gotta pay to play, that has never, ever changed. This is why in the very beginning SLI was good. But we as humans are greedy in general.
 
Joined
Sep 25, 2023
Messages
159 (0.34/day)
Location
Finland
Besides, there are many, many great games that run well with 8 GB, or even less than that. The newest AAA games aren't the only entertainment in existence, and Ultra isn't the only graphics option.
Oh yeah, absolutely. I've had quite a year playing and recording all my old favorite games, and I still have a lot of them to go, haha. No need to play the shiniest thing all the time, tbh.
Ultra graphics are kind of a waste; anything on Ultra after 2016 is, in my opinion, a waste of performance/resources, and I'm glad I'm not the only one who thinks the same.
 
Joined
Jan 14, 2019
Messages
13,274 (6.07/day)
Location
Midlands, UK
Oh yeah, absolutely. I've had quite a year playing and recording all my old favorite games, and I still have a lot of them to go, haha. No need to play the shiniest thing all the time, tbh.
Ultra graphics are kind of a waste; anything on Ultra after 2016 is, in my opinion, a waste of performance/resources, and I'm glad I'm not the only one who thinks the same.
I'm not saying that Ultra is a waste. If your computer can run it, then do it; why the heck not. :)

All I'm saying is, it's not needed to enjoy the game, so if your computer can't run it, dialling back a notch (or two) isn't the end of the world.

Historically though, Ultra has always been a setting for future graphics cards, so I really don't get why people got angry all of a sudden.
 
Joined
Jun 6, 2022
Messages
622 (0.66/day)
NVDEC for decoding, NVENC for encoding. Modded drivers let you have as many NVENC encoding sessions as you want, like on Quadro GPUs :)
It's been nice that Nvidia upped the limit for GeForce from 2 to 3, then from 3 to 5. It has let people stick with their GeForce cards without going for Quadro; most of them just swap to the Studio drivers.
The problem back then was that you couldn't use hardware acceleration on a GeForce. To give an example, the burst of performance you get now when using a video card for rendering (in Blender, for example) was impossible in those days. In the GeForce 8xxx series, some smart guys found a niche, and with modded drivers you could use a GeForce in other applications, not just games.
P.S. As a rule, my RTX even hits 40% peak GPU usage when loading the operating system. Right-click, and GPU usage reacts. Open a folder, and GPU usage reacts. Nowadays it is totally wrong to separate the video card from the rest of the system and credit it only for 3D rendering.

Both parts of the footage look smooth, so why would I care?
Recordings of Counter-Strike at 60 fps versus 120 fps can both look smooth. In the game, you really don't care?
Either way, you get a spectacular jump with DLSS. If you're not interested now, it's a reserve for the future.
 
Last edited:
Joined
Sep 25, 2023
Messages
159 (0.34/day)
Location
Finland
The problem back then was that you couldn't use hardware acceleration on a GeForce. To give an example, the burst of performance you get now when using a video card for rendering (in Blender, for example) was impossible in those days. In the GeForce 8xxx series, some smart guys found a niche, and with modded drivers you could use a GeForce in other applications, not just games.
Oh, for real? Back in those days I was mostly using Radeon. Even though I had a 9800 GT, I used Radeon more, so I honestly didn't know that was a thing back then, omg.

P.S. As a rule, my RTX even hits 40% peak GPU usage when loading the operating system. Right-click, and GPU usage reacts. Open a folder, and GPU usage reacts. Nowadays it is totally wrong to separate the video card from the rest of the system and credit it only for 3D rendering.
Yeah, I don't really get why it's all piled up there. There must be a reason, but I've never bothered to look into it; it must have something to do with WDDM and how it handles desktop rendering, etc.
 
Joined
Mar 7, 2023
Messages
947 (1.40/day)
You see the same thing with Photoshop. In 2005, the app opened in about 11 seconds.

Meanwhile, we have M.2 SSDs, and CPUs and RAM have gotten much faster.

But Photoshop still opens in about 11 seconds in 2023. Suppose everything were perfectly optimized: there would be no need for powerful hardware, and then Nvidia, Intel, AMD and many other big companies couldn't sell much anymore.

There is a big financial motive to make software run as slow as possible on current hardware, otherwise billions of people don't feel they need another upgrade for their PC.

I also believe that 4 GB of VRAM could be more than sufficient for 1080p gaming. It has been shown many times that games using more than 4 GB of VRAM at 1080p (max settings) often look no sharper than games that use less than 4 GB at max settings.
My old Mac SE from 1987 with a 25 MHz accelerator card booted up near instantaneously, and so did Photoshop 1.0... the only drawback was that it was monochrome (there wasn't even gray) and the screen was like 8". But the stock CPU was 8 MHz, I think, so 25 MHz was a huge jump.

Tangent aside, yeah: the more resources developers have, the less clever they have to be with them. You can do a lot with very little; it's been shown so many times. Buuuut there's not much incentive for that, unfortunately.

On the bright side, at least computers double as furnaces now.
 
Joined
Apr 30, 2020
Messages
1,018 (0.59/day)
System Name S.L.I + RTX research rig
Processor Ryzen 7 5800X 3D.
Motherboard MSI MEG ACE X570
Cooling Corsair H150i Cappellx
Memory Corsair Vengeance pro RGB 3200mhz 32Gbs
Video Card(s) 2x Dell RTX 2080 Ti in S.L.I
Storage Western digital Sata 6.0 SDD 500gb + fanxiang S660 4TB PCIe 4.0 NVMe M.2
Display(s) HP X24i
Case Corsair 7000D Airflow
Power Supply EVGA G+1600watts
Mouse Corsair Scimitar
Keyboard Cosair K55 Pro RGB
This is a good point.

Gotta pay to play, that has never, ever changed. This is why in the very beginning SLI was good. But we as humans are greedy in general.

All of AMD's RDNA lineup supports mGPU, from the RX 5000 series up through the RX 7000 series.
I really feel like the next DirectX should make mGPU mandatory, because there are times when a single card just isn't going to cut it. Plus, I would rather still have that option to compete with frame generation.
I feel like the smaller buses are trying to mitigate the problem of becoming CPU-bottlenecked with larger buses, but that doesn't seem to be a problem for the RTX 4090.
 
Joined
Jan 14, 2019
Messages
13,274 (6.07/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
Recordings of Counter-Strike at 60 fps versus 120 fps can both look smooth. In the game, you really don't care?
Nope, 60 FPS is smooth enough for me. :) With that said, I don't play CS, or any other competitive shooter.

Either way, you get a spectacular jump with DLSS. If you're not interested now, it's a reserve for the future.
Exactly! It's a reserve. That's why I don't get it when people treat it like a must-have.
 
Joined
Jun 6, 2022
Messages
622 (0.66/day)
Nope, 60 FPS is smooth enough for me. :) With that said, I don't play CS, or any other competitive shooter.


Exactly! It's a reserve. That's why I don't get it when people treat it like a must-have.
In some titles, DLSS has been a "must have" since its launch, even at 1080p. As a reserve, it proves extremely useful for RTX 2000 owners or those using weaker video cards.
The Last of Us is a shooter. A high framerate helps enormously.

In the two captures there are only two big differences:
DLSS on (included in the preset) and DLSS off
FPS: min, max, average

Cyberpunk 2077 Ray Tracing Ulta preset.png

Cyberpunk 2077 RT Ultra DLSS off.png
 