
Is a game's graphical quality important to you?

Are graphics important?

  • Yes.

  • No.

  • I have a preference for presentation (e.g. cartoony, realistic, etc.).

  • Other (please specify).

  • Yes (up to a point).

  • No (up to a point).


Results are only viewable after voting.

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
4,993 (1.99/day)
Location
Swansea, Wales
System Name Silent
Processor Ryzen 7800X3D @ 5.15ghz BCLK OC, TG AM5 High Performance Heatspreader
Motherboard ASUS ROG Strix X670E-I, chipset fans replaced with Noctua A14x25 G2
Cooling Optimus Block, HWLabs Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 1x A14G2, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MT 26-36-36-48, 56.6ns AIDA, 2050 FCLK, 160 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear, MX900 dual gas VESA mount
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front, LINKUP Ultra PCIe 4.0 x16 white
Audio Device(s) Audeze Maxwell Ultraviolet w/upgrade pads & LCD headband, Galaxy Buds 3 Pro, Razer Nommo Pro
Power Supply SF750 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper Pro V2 8 KHz Mercury White w/Tiger Ice Skates & Pulsar Supergrip tape
Keyboard Wooting 60HE+ module, TOFU-R CNC Alu/Brass, SS Prismcaps W+Jellykey, LekkerV2 mod, TLabs Leath/Suede
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores Legendary
I was agreeing up to this point, so hold up. AMD didn't copy it; otherwise it would have been just as crappy and proprietary as GSync. AMD made VRR more standard by not requiring stupid special hardware for it and by making the standard open. It forced nVidia to support it. I'd hardly call that copying. nVidia tried to corner the market and AMD flipped that on its head. RT on nVidia cards is no different: more proprietary hardware and APIs to get the best performance at the cost of vendor lock-in, which is a dangerous thing with a company like nVidia given their history. If I bought a GPU today, RT performance would be at the bottom of my list of things I care about. I want the ecosystem to evolve just like VRR did, because eventually we won't need proprietary compute blocks or APIs to do this and it will all become standardized. Until then, I'll let everyone else play the role of guinea pig.
Proprietary hardware drives innovation because it results in profit - the fact that open source or non-licensed versions arrive a few years later is a benefit of this process, not a limitation.

You can see each company taking its own approach: Intel with their "optimized kernel" and "compatibility kernel", Nvidia with the RTX stack (which is part of the industry-standard set of tools NVIDIA offers, despite not being open source). AMD is going the route AMD always takes, which seems to be watching what Intel/NVIDIA do and releasing their own version a couple of years later. See DLSS/FSR, DLSS 3.0/FSR 3, and ray tracing support (AMD competing with Ampere, not Ada, and losing to Intel). Hopefully we see some kind of standardization, which seems to already be happening: you can run XeSS, FSR or DLSS in games if you have an NVIDIA GPU, despite each of these running in different ways and using different hardware to different extents. Unreal Engine 5.1 with Lumen is a good example: hardware ray tracing takes full advantage of NVIDIA RTX hardware, giving you better fidelity thanks to specific proprietary hardware with no loss of performance over the hybrid software approach, while AMD/Intel can still use the hybrid approach and get somewhat lower fidelity at a somewhat higher FPS cost with hardware RT enabled, but still share in the RT goodness.

I don't believe there's much evidence for open source taking over from proprietary. Last time I checked, NVIDIA CUDA and Microsoft Office are still the standards, despite there being open-source alternatives to each. I use the word "alternatives" rather than "equivalents", since I don't think there are equivalent stacks.

RT on nVidia cards is no different: more proprietary hardware and APIs to get the best performance at the cost of vendor lock-in, which is a dangerous thing with a company like nVidia given their history. If I bought a GPU today, RT performance would be at the bottom of my list of things I care about. I want the ecosystem to evolve just like VRR did, because eventually we won't need proprietary compute blocks or APIs to do this and it will all become standardized. Until then, I'll let everyone else play the role of guinea pig.
So your approach is basically to take a moral stance on a technical question, which is somewhat admirable, but you pay the cost of that in the product you get. Sadly, it seems most people prefer higher performance/fidelity to a more open standard that comes at the cost of those factors.

 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
41,886 (6.61/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
Eye roll. RT is just like PhysX and SLI: hardly implemented.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,162 (2.82/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
So your approach is basically to take a moral stance on a technical question, which is somewhat admirable, but you pay the cost of that in the product you get. Sadly, it seems most people prefer higher performance/fidelity to a more open standard that comes at the cost of those factors.
You're right. I don't tend to buy nVidia on moral grounds. They could have the best GPU and performance, but I don't care. It's hard to excuse a lot of what nVidia does. As a company, they're pretty hostile to other businesses in the industry and I can't ignore that.
 
Joined
Jan 25, 2020
Messages
2,202 (1.26/day)
System Name DadsBadAss
Processor I7 13700k w/ HEATKILLER IV PRO Copper Nickel
Motherboard MSI Z790 Tomahawk Wifi DDR4
Cooling BarrowCH Boxfish 200mm-HWLabs SR2 420/GTX&GTS 360-BP Dual D5 MOD TOP- 2x Koolance PMP 450S
Memory 4x8gb HyperX Predator RGB DDR4 4000
Video Card(s) Asrock 6800xt PG D w/ Byski A-AR6900XT-X
Storage WD SN850x 1TB NVME M.2/Adata XPG SX8200 PRO 1TB NVMe M.2
Display(s) Acer XG270HU
Case ThermalTake X71 w/5 Noctua NF-A14 2000 IP67 PWM/3 Noctua NF-F12 2000 IP67 PWM/3 CorsairML120 Pro RGB
Audio Device(s) Klipsch Promedia 2.1
Power Supply Seasonic Focus PX-850 w/CableMod PRO ModMesh RT-Series Black/Blue
Mouse Logitech G502
Keyboard Black Aluminun Mechanical Clicky Thing With Blue LEDs, hows that for a name?!
Software Win11pro
Yeah, it's time to start a new RT debate thread...

There hasn't been a response regarding the OP for so long I wasn't sure what thread I was reading.
 
Joined
Jan 10, 2011
Messages
1,436 (0.28/day)
Location
[Formerly] Khartoum, Sudan.
System Name 192.168.1.1~192.168.1.100
Processor AMD Ryzen5 5600G.
Motherboard Gigabyte B550m DS3H.
Cooling AMD Wraith Stealth.
Memory 16GB Crucial DDR4.
Video Card(s) Gigabyte GTX 1080 OC (Underclocked, underpowered).
Storage Samsung 980 NVME 500GB && Assortment of SSDs.
Display(s) ViewSonic VA2406-MH 75Hz
Case Bitfenix Nova Midi
Audio Device(s) On-Board.
Power Supply SeaSonic CORE GM-650.
Mouse Logitech G300s
Keyboard Kingston HyperX Alloy FPS.
VR HMD A pair of OP spectacles.
Software Ubuntu 24.04 LTS.
Benchmark Scores Me no know English. What bench mean? Bench like one sit on?
"oh a top tier baked lighting solution is barely distinguishable from low end RT, so what's the point" the point is that the top tier baked lighting solution is only likely to come from a AAA game studio who spent 1000s of developer hours achieving that.
This!
Studios marketing RT as merely an upgrade for screen-space reflections is highly misleading, but I suppose it's unavoidable. You need to convince consumers to pay for the required hardware R&D somehow...

You can't actually implement things like RT in a meaningful way. In fact, a forced perspective negates some 3D rules.
Camera perspective is irrelevant. All digital imagery projects onto a 2D plane. RT, like any other rendering method, addresses how to convert scene parameters into coloured pixels. It doesn't matter how the camera moves, or whether it moves at all. RT has been used for fixed images in archviz and product viz in print for ages now.

In Unravel's case, the scene is three-dimensional, and from what I gathered off a couple of YouTube videos, it uses the mainstream GI-estimation hacks you'd see in most contemporary games (light maps, AO, light probes, etc.). So it is a candidate for an RT treatment, hypothetically speaking.
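The "convert scene parameters into coloured pixels, regardless of camera movement" point can be shown with a toy from-scratch sketch (illustrative only, not any engine's actual code): cast one ray per pixel from a fixed camera at a single sphere and Lambert-shade the hits. Nothing in it cares whether the camera ever moves.

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Distance t along the ray to the nearest sphere hit, or None on a miss."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # quadratic 'a' is 1 for a normalised direction
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def render(width, height):
    """Trace one ray per pixel at a unit sphere; returns a grid of brightness values."""
    cam = (0.0, 0.0, -3.0)      # fixed camera: the method works the same either way
    light = (5.0, 5.0, -5.0)
    image = []
    for y in range(height):
        row = []
        for x in range(width):
            # Map the pixel to a point on a virtual screen plane at z = 0.
            px = (x + 0.5) / width * 2 - 1
            py = 1 - (y + 0.5) / height * 2
            d = (px, py, 3.0)
            norm = math.sqrt(sum(v * v for v in d))
            d = tuple(v / norm for v in d)
            t = ray_sphere(cam, d, (0.0, 0.0, 0.0), 1.0)
            if t is None:
                row.append(0.0)  # background: the ray hit nothing
            else:
                hit = tuple(c + t * v for c, v in zip(cam, d))
                n = hit  # unit sphere at the origin: normal equals the hit point
                to_l = tuple(l - h for l, h in zip(light, hit))
                ln = math.sqrt(sum(v * v for v in to_l))
                to_l = tuple(v / ln for v in to_l)
                # Lambert term: brightness falls off with angle to the light.
                row.append(max(0.0, sum(a * b for a, b in zip(n, to_l))))
        image.append(row)
    return image
```

Print-quality archviz renders do exactly this (with far better materials and many bounces) from a single, permanently fixed viewpoint.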
 

the54thvoid

Super Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
12,996 (2.39/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsumg 960 Pro M.2 512Gb
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure POwer M12 850w Gold (ATX3.0)
Software W10
Camera perspective is irrelevant. All digital imagery projects onto a 2D plane. RT, like any other rendering method, addresses how to convert scene parameters into coloured pixels. It doesn't matter how the camera moves, or whether it moves at all. RT has been used for fixed images in archviz and product viz in print for ages now.

In Unravel's case, the scene is three-dimensional, and from what I gathered off a couple of YouTube videos, it uses the mainstream GI-estimation hacks you'd see in most contemporary games (light maps, AO, light probes, etc.). So it is a candidate for an RT treatment, hypothetically speaking.

The issue at hand is that the view within the scene cannot be altered. It is a static facsimile of a 3D scene, unlike a first-person or third-person POV, where the player (or viewer) can navigate around the environment, and in doing so, the play of the light alters dynamically with each novel point of view. It is the movement and viewpoint from within a space that makes lighting important. You also totally missed the point when I said it can't be implemented in a meaningful way - the effect being to immerse you in the scene (which is not the focus of a side-scrolling adventure).
 
Joined
Jun 1, 2011
Messages
4,559 (0.93/day)
Location
in a van down by the river
Processor faster at instructions than yours
Motherboard more nurturing than yours
Cooling frostier than yours
Memory superior scheduling & haphazardly entry than yours
Video Card(s) better rasterization than yours
Storage more ample than yours
Display(s) increased pixels than yours
Case fancier than yours
Audio Device(s) further audible than yours
Power Supply additional amps x volts than yours
Mouse without as much gnawing as yours
Keyboard less clicky than yours
VR HMD not as odd looking as yours
Software extra mushier than yours
Benchmark Scores up yours
currently playing WoW classic
last "great" game I played, Divinity: Original Sin 2
game I played the most in 2022, 2021, 2020, 2019, etc., Civ V
game I'm most looking forward to, Baldur's Gate 3

so clearly intensive and state of the art graphics are a must for me in any game I play!
 
Joined
Apr 30, 2020
Messages
977 (0.59/day)
System Name S.L.I + RTX research rig
Processor Ryzen 7 5800X 3D.
Motherboard MSI MEG ACE X570
Cooling Corsair H150i Cappellx
Memory Corsair Vengeance pro RGB 3200mhz 32Gbs
Video Card(s) 2x Dell RTX 2080 Ti in S.L.I
Storage Western digital Sata 6.0 SDD 500gb + fanxiang S660 4TB PCIe 4.0 NVMe M.2
Display(s) HP X24i
Case Corsair 7000D Airflow
Power Supply EVGA G+1600watts
Mouse Corsair Scimitar
Keyboard Cosair K55 Pro RGB
Eye roll. RT is just like PhysX and SLI: hardly implemented.
That's a complete myth.

Ray tracing adoption is far higher than all three of those combined: roughly 55% of DX12 games. There are currently 272 DX12 games, and 151 of them support some sort of ray tracing feature.

Of the roughly 5,989 DX11/DX10/DX9 games, around 18% (1,064 games) supported SLI,
while about 6% (347 games) supported CrossFire.
The total number of DX11/DX10/DX9 games supporting CrossFire or SLI together averaged about 12% of those 5,989 games.
mGPU support in DX12 is actually doing better than both SLI and CrossFire on average: roughly 15%, about 40 games out of the 272 DX12 games.
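As a quick sanity check on the percentages quoted above (the counts are the post's own claims, not independently verified figures):

```python
# Figures as quoted in the post above - treat them as claims, not ground truth.
dx12_total, dx12_rt = 272, 151
legacy_total, sli_games, crossfire_games = 5989, 1064, 347

rt_share = dx12_rt / dx12_total * 100           # share of DX12 games with RT
sli_share = sli_games / legacy_total * 100      # share of legacy games with SLI
cf_share = crossfire_games / legacy_total * 100 # share of legacy games with CrossFire

print(f"RT in DX12:   {rt_share:.1f}%")   # ~55.5%, matching the ~55% claim
print(f"SLI (legacy): {sli_share:.1f}%")  # ~17.8%, matching the ~18% claim
print(f"CrossFire:    {cf_share:.1f}%")   # ~5.8%, matching the ~6% claim
```

The arithmetic does back the claimed ~55%, ~18%, and ~6% shares, given those game counts.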

PhysX support - which one? GPU PhysX? CPU PhysX? APEX? Or even HairWorks?

Damn near all Nvidia-sponsored "RTX" games that use DX12 will have CPU PhysX baked into the code; DX12 has no GPU PhysX available at all.
GPU PhysX support might have been the lowest share, maybe below 1% of those DX11/DX10/DX9 games, but that's specific to GPU PhysX.
However, I'm pretty sure the number of games with CPU PhysX support is well above 10%, as it's easily baked into the code of many "RTX" games.

On the topic itself:

There are games that fall into the category of "better graphics than before, but worse gameplay than the original". That includes games like Crysis Remastered and the GTA Trilogy Definitive Edition, and some older games did this too with changes to gameplay, like Conker: Live & Reloaded. Sometimes it makes the game worse; sometimes it makes it better. It can be done well, but other times too much of the old is taken out for the new, and we end up playing the original because we know how it works and are used to it. Sometimes it isn't the graphics that make a game good; it's just good, consistent gameplay that you can replay over and over, doing it different ways, finding things you missed, or finding new tricks throughout the game. I think Crysis Remastered is one where the engine port was a bad idea for the graphics, because it can't use more than two threads of a modern 16-thread CPU.
 
Joined
Jan 10, 2011
Messages
1,436 (0.28/day)
Location
[Formerly] Khartoum, Sudan.
System Name 192.168.1.1~192.168.1.100
Processor AMD Ryzen5 5600G.
Motherboard Gigabyte B550m DS3H.
Cooling AMD Wraith Stealth.
Memory 16GB Crucial DDR4.
Video Card(s) Gigabyte GTX 1080 OC (Underclocked, underpowered).
Storage Samsung 980 NVME 500GB && Assortment of SSDs.
Display(s) ViewSonic VA2406-MH 75Hz
Case Bitfenix Nova Midi
Audio Device(s) On-Board.
Power Supply SeaSonic CORE GM-650.
Mouse Logitech G300s
Keyboard Kingston HyperX Alloy FPS.
VR HMD A pair of OP spectacles.
Software Ubuntu 24.04 LTS.
Benchmark Scores Me no know English. What bench mean? Bench like one sit on?
It is a static facsimile of a 3D scene, unlike a first-person or third-person POV, where the player (or viewer) can navigate around the environment, and in doing so, the play of the light alters dynamically with each novel point of view. It is the movement and viewpoint from within a space that makes lighting important.
Not necessarily. Lighting is important whether the viewpoint changes or not, and has been since long before we started animating things (i.e., in painting and photography). I mentioned archviz specifically because it's one instance where RT has long been used, and taken as the standard, to produce imagery from an inherently fixed perspective.

That said, panning alters the view as much as dollying or orbiting does. And when using a perspective camera, panning has an effect on the third dimension (perspective distortion). Even if the camera were static, movement of the objects themselves alters the scene lighting (which is the main reason light probes were introduced). And that's without going into cases where the light sources themselves move.

There are cases where one would be hard pressed to find a use for an RT implementation: games such as Super Meat Boy or Celeste. But that's not because they are side-scrollers; it's because they are not designed to simulate illumination to begin with. Unravel, on the other hand, is.

You also totally missed the point when I said it can't be implemented in a meaningful way - the effect being to immerse you into the scene (which is not the focus of a side-scrolling adventure).
I admit that I was focusing on RT as an alternative to the status quo - that is, using it to produce the exact same result, but with less work. But that doesn't mean one can't improve on the existing product (even with mainstream methods).
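The light-probe point (baked lighting breaks for moving objects, so engines interpolate between precomputed probe samples) can be illustrated with a minimal one-axis sketch. This is a simplification for illustration: real engines store probes in 3D grids or tetrahedral meshes and encode directional irradiance (e.g. spherical harmonics), not a single RGB value.

```python
def lerp(a, b, t):
    """Linear interpolation between a and b by factor t in [0, 1]."""
    return a + (b - a) * t

def probe_irradiance(probes, x):
    """probes: list of (position, rgb) samples sorted by position along one axis.
    Returns the irradiance at x by linearly blending the two nearest probes,
    clamping to the end probes outside the sampled range."""
    if x <= probes[0][0]:
        return probes[0][1]
    if x >= probes[-1][0]:
        return probes[-1][1]
    for (x0, c0), (x1, c1) in zip(probes, probes[1:]):
        if x0 <= x <= x1:
            t = (x - x0) / (x1 - x0)
            return tuple(lerp(a, b, t) for a, b in zip(c0, c1))

# Hypothetical scene: a bright probe near a lamp, a dim one in shadow.
# An object moving between them picks up smoothly blended baked lighting,
# even though no light transport is computed at runtime.
probes = [(0.0, (1.0, 0.9, 0.8)), (4.0, (0.1, 0.1, 0.2))]
```

Halfway between the two probes, the object receives the average of the two samples, which is exactly the "estimation hack" that per-frame RT would replace with actual light transport.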
 

the54thvoid

Super Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
12,996 (2.39/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsumg 960 Pro M.2 512Gb
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure POwer M12 850w Gold (ATX3.0)
Software W10
I'll just say again - a meaningful difference. Consider though the topic is still about graphical quality, therefore the side-scroller family, whilst enjoying good graphics (amazing graphics in Unravel), does not need to be improved to enjoy the game. Hell, I'd happily go back and play Jet Set Willy for the pure nostalgia. Making everything shiny is not the be all and end all. Context is everything.
 
Joined
Sep 17, 2014
Messages
22,300 (6.02/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I'll just say again - a meaningful difference. Consider though the topic is still about graphical quality, therefore the side-scroller family, whilst enjoying good graphics (amazing graphics in Unravel), does not need to be improved to enjoy the game. Hell, I'd happily go back and play Jet Set Willy for the pure nostalgia. Making everything shiny is not the be all and end all. Context is everything.
Yep, that is exactly my point.

I don't deny RT will take over at some point for a large number of games, genres and engines. I don't deny Nvidia has an advantage here right now. But will it be the feature that cannot be missed? I think history shows us that is untrue. Good points were also raised by @dgianstefani, imho, about what will drive the market forward and what customers are buying. Yes, certainly, all true. And still, we see that market adoption is a fickle beast, and actual practice in the actual content we see is not quite as rosy as the expectations were before the content got released.

Lots of RT implementations in games right now are lackluster: way too costly for what they achieve (a common complaint about any game on any engine that takes too many resources to run at half-decent FPS, no matter what kind of graphical treat it is). And as things progress, we're also seeing a latency hit and the need to stack technologies at the cost of general graphical quality (DLSS 3 is a balancing act like that). Meanwhile, even with all these technologies, FPS is still lackluster on the fastest cards. We're also seeing these cards priced out of the market for most gamers.

We see progress on phones, as in: there are SoCs that can do RT. Good! Now let's see the content happen, and let's see the performance running that content on a small handful of watts.

And then there's the context of this topic when we take all of these points into consideration. I think the general desire is clearly not RT, but rather affordable gaming at decent quality. RT, for all of its cost, will be weighed on that scale, and its balancing act, as it stands right now, is uncertain. What is affordable is changing, and what is desirable in gaming has also changed - and not in favor of graphics over everything else. The overwhelming majority of gaming is about game concepts, not graphics.

What I am really waiting for with regard to RT is the same thing I would have expected by now from physics engines: implementation in a game where you really get to play around with these elements, where they are essential to its gameplay. Thát is probably the moment we'll start seeing RT as impossible to miss out on. But as long as it is fighting with rasterized over dev hours and market share, it will remain a 'nice to have' at best, and it'll always fight for attention against well-running rasterized implementations of lighting. It'll be usable in specific games, much like it is today.
 

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
4,993 (1.99/day)
Location
Swansea, Wales
System Name Silent
Processor Ryzen 7800X3D @ 5.15ghz BCLK OC, TG AM5 High Performance Heatspreader
Motherboard ASUS ROG Strix X670E-I, chipset fans replaced with Noctua A14x25 G2
Cooling Optimus Block, HWLabs Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 1x A14G2, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MT 26-36-36-48, 56.6ns AIDA, 2050 FCLK, 160 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear, MX900 dual gas VESA mount
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front, LINKUP Ultra PCIe 4.0 x16 white
Audio Device(s) Audeze Maxwell Ultraviolet w/upgrade pads & LCD headband, Galaxy Buds 3 Pro, Razer Nommo Pro
Power Supply SF750 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper Pro V2 8 KHz Mercury White w/Tiger Ice Skates & Pulsar Supergrip tape
Keyboard Wooting 60HE+ module, TOFU-R CNC Alu/Brass, SS Prismcaps W+Jellykey, LekkerV2 mod, TLabs Leath/Suede
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores Legendary
What I am really waiting for with regard to RT is the same thing I would have expected by now from physics engines: implementation in a game where you really get to play around with these elements, where they are essential to its gameplay. Thát is probably the moment we'll start seeing RT as impossible to miss out on. But as long as it is fighting with rasterized over dev hours and market share, it will remain a 'nice to have' at best, and it'll always fight for attention against well-running rasterized implementations of lighting. It'll be usable in specific games, much like it is today.
It's not fighting for anything.

Epic decided to make RT the standard for UE 5.1, as it's a major selling point for their engine: easier-to-implement and better-looking realistic lighting by default.

It's pretty much open and shut at this point. RT is the standard moving forward; we're already at 55%+ adoption rates for new games, and that is only going to increase.

If you count "new games on actual new engines" rather than "games released in 2022/23 using old engines", that 55% stat goes up quite a bit, I'd wager.
 
Joined
Jul 5, 2013
Messages
27,404 (6.62/day)
I voted 'No, up to a point'.

Games are meant to be played for fun (that's why they are games). Immersion can be a part of that fun, but it is very dependent upon the game and the person. I loved playing the Steam game Superhot, and the graphics are both simplistic and hyper-effective. That game, I think, would not benefit from shiny graphics - the entire point of the game is the gameplay.

When I want hyper-real graphics, I go for a walk outside.
I am of a similar opinion, voted "Yes, up to a point".

I enjoy the effort made to make a game visually appealing, but it should never come at the expense of the other parts of the game experience. The best GFX in the world would be utterly ruined by a lackluster OST, sloppy controls, or a bad plot.

Example: Chrono Trigger for the SNES.
That game had everything: excellent visuals (for the time and the hardware), amazing music, controls that helped immerse you in the world, and a story arc that was truly exceptional! SquareSoft didn't just push the limits of the hardware graphically; they focused equally on every aspect of the game to give the player the best experience possible.

Bad GFX can break a game, but amazing GFX can't make a game alone.
 
Joined
Jun 2, 2017
Messages
8,974 (3.31/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
Yeah, of course you're absolutely right - that's why every CPU and GPU designer, including ones bringing their first products to market (Intel), consoles, and literal phone GPUs, are adding support for RT, using valuable and expensive die space. Since raster will get us all the way when it comes to graphical fidelity, I guess all that Moore's Law and massive progression in processing power over the past several decades should culminate in zero new techniques and methods of rendering and processing, since we're clearly at the apex already. Maybe cinema should never have invested in 3D technology or advanced CGI? Too expensive and low-volume initially...

Thanks for sharing. Me pointing out that the lowest-tier RTX SKU, in its most power- and cooling-limited laptop form, from the literal first generation of RT GPUs - backed by our own TPU testing that shows the flagship 2080 Ti getting less than 20 FPS in a cutting-edge RT demo - is... surprising? Rude to point out? In need of a "reality check"? In case you missed the meaning, I mentioned the research paper's date as an example of just how cutting-edge that RT implementation is, since there's obviously huge variation in the complexity and level of integration of RT across different games.

All new consoles have literal hardware for RT baked into the design, and new engines use it by default. Just about every AAA game has been released with some form of RT, and consoles typically offer two modes - high quality (RT on) and high refresh - Spider-Man being one example.

And you call me arrogant?:laugh:

So, your examples of G-Sync (an Nvidia innovation in 2013, copied by AMD with FreeSync in 2015, now ubiquitous), HBAO+, etc. don't matter, since at initial adoption they had low market penetration? Interesting take... Pushing RT seems to be something the entire industry is doing, with NVIDIA being a leader, and Intel laughably having better RT support than AMD out of the gate (shame about the driver issues). But of course, it's entirely possible that every market analyst, engineer, executive, etc. at all of these companies has concurrently missed what you are saying and placed all their R&D into a dead end... wait, no, it's only a dead end until it's the majority of the market? Is that what you're saying? You mentioned 20 years a while back; I guess we'll see. Doubt it, though.


Sure bud. Whatever floats your boat. :toast:

I for one am all for innovations that push the envelope for detail and accuracy. Cheers to all the (much smarter than anyone here) engineers and researchers figuring out ways to make the virtuality even less distinguishable from reality.
So I guess ray tracing is Nvidia's idea? How old are you? None of what Nvidia has been doing is new. When you speak of Intel being better than AMD at ray tracing, I guess you are missing the point that they had to work with Intel to develop it in the first place, as you may not know that all Nvidia did was implement certain features of DX11 and DX12. PhysX (bought), SLI, G-Sync and DLSS are all Nvidia innovations, and all selfishly kept to themselves. AMD has adopted what made the PC space what it is: open source. I guess in your world it is not cool to buy $149 monitors that support FreeSync, or big-screen 4K TVs that are great for gaming - not because of G-Sync, but thanks to VRR, in other words FreeSync. Unreal 5 could do to Nvidia ray tracing the same thing FreeSync did to G-Sync. We can thank Nvidia, though; their selfishness is so complete that people love to treat these companies like they are entities and not a group of people making decisions. Is EVGA the first vendor to cut ties with Nvidia? What has the outcome of the burning connectors been? All I have heard is "leave your side panel off".
 
Joined
Jun 1, 2011
Messages
4,559 (0.93/day)
Location
in a van down by the river
Processor faster at instructions than yours
Motherboard more nurturing than yours
Cooling frostier than yours
Memory superior scheduling & haphazardly entry than yours
Video Card(s) better rasterization than yours
Storage more ample than yours
Display(s) increased pixels than yours
Case fancier than yours
Audio Device(s) further audible than yours
Power Supply additional amps x volts than yours
Mouse without as much gnawing as yours
Keyboard less clicky than yours
VR HMD not as odd looking as yours
Software extra mushier than yours
Benchmark Scores up yours
same thing occurs in movies

I still prefer this



over this

 
Joined
Sep 17, 2014
Messages
22,300 (6.02/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
same thing occurs in movies

I still prefer this



over this

Yup, and average movie quality has similarly taken a nosedive. More often than not, I struggle to understand the visual crap they plaster over some of this stuff.

Star Trek (the new series, first season) has it too. They put lens flares all over the goddamn screen, everywhere, for no reason at all. There is no artistic vision here, just 'look at my flashy lights'; it often adds nothing but does detract from the overall thing. In gaming, a lot of the same happens. I mean, how easy is it when you can automate your visual quality and then create a flashy trailer? No need for good game content. You can make do with a 5-hour campaign on some linear trail, draw a quest line from A to B, include some fake idea of customization or choice, and poof - next! We do in fact have a decent number of those games; they often coincide with the launch of a console or a new GPU line. This isn't new...

Generated entertainment = fast food entertainment.

No Man's Sky comes to mind. They're still working to complete their promises from the trailers. 'Just procedural' turned into a massive post-release development roadmap, because otherwise the whole game is effing boring. And frankly, it really still is; you're just less liable to burn out from the same crap, since you can do other pointless things in it. There's no game here, and there never really will be. It's still a casual space simulator with wacky colors and totally broken mechanics.

That's the exact issue with this in a nutshell. No real talent = no real games, just fast food junk. Nothing changes here, we won't magically have more talented devs, but we will have a magical surge of con artists.

It's not fighting for anything.

Epic decided to make RT the standard for UE 5.1 as it's a major selling point for their engine - easier to implement and better looking realistic lighting by default.

It's pretty much open and shut at this point. RT is the standard moving forward; we're already at 55%+ adoption rates for new games, and that is only going to increase.

If you count "new games with actual new engines" and not "games released in 2022/23 using old engines", that 55% stat goes up quite a bit, I'd wager.
Irrelevant if you can't run it, OR if the visual improvement is barely there. This is not 'adoption rate'. Adoption rate is measured in the user base.

Of course commerce is pushing the next big thing; that's why it's commerce. You're just kicking in open doors here. Lots of innovation also just dies after trying for a few years. Again: VR is your best example. Even 'killer app' Alyx didn't take it anywhere. Like, literally nowhere. It's ultra niche and not picking up momentum. RT is in the same place, regardless of game support, if the hardware isn't in players' hands. And, like I've pointed out half a dozen times now, that momentum ain't exactly picking up either, even with Nvidia 'owning 88%' of the PC market. The vast majority of that share can't even run RT, and of the cards that can run it, only a tiny top-end margin is actually capable enough to keep doing so and provide a solid experience - in a small handful of games.

But, it's nice you're all optimistic ;) I think we've come full circle by now; adoption rates are going nowhere - people still just upgrade when it's time and not before, with the high price of entry making that a much harder sell - and content isn't making it a must-have, that much is pretty clear going by the reactions in this topic. There are different demographics here than elsewhere; I'm sure you can find a popularity number elsewhere that's different. But the actual fact is: we just don't know where it's really going. And another fact: it's not primary to gaming for most gamers.
 
Last edited:
Joined
Feb 24, 2009
Messages
2,928 (0.51/day)
Location
Riverside, California
Processor AMD Ryzen 7 7800X3D
Motherboard AsRock X670E Phantom Gaming Lightning
Cooling Be Quiet! Dark Rock 4
Memory G.SKILL Trident Z5 Neo DDR5-6000 32GB (2 x 16GB)
Video Card(s) Sapphire Radeon RX 7900 XTX
Storage Samsung 980 PRO Series 1TB, Samsung 980 PRO Series 1TB, Crucial P3 NVMe M.2 2TB
Display(s) LG OLED55G2PUA
Case Lian Li O11 Dynamic XL ROG Certified
Audio Device(s) Digital out to high end dac and amps.
Power Supply EVGA GQ 1000W
Mouse Logitech G600
Keyboard Logitech G413 Carbon
VR HMD Oculus Rift CV1, Oculus Rift S, Quest 2, Quest 3
Software Windows 10 Pro
A big yes from me. Games that have crap graphics get a hard pass. I expect the games I play today to be on par with today's expected generation of graphics.

Sure, I can play older games and enjoy them for what they are, but I'm strictly speaking about new games, and with that in mind, the prettier the better. If the gameplay and such are crap, though, no amount of good graphics will save it.
 
Joined
Oct 21, 2005
Messages
7,039 (1.01/day)
Location
USA
System Name Computer of Theseus
Processor Intel i9-12900KS: 50x Pcore multi @ 1.18Vcore (target 1.275V -100mv offset)
Motherboard EVGA Z690 Classified
Cooling Noctua NH-D15S, 2xThermalRight TY-143, 4xNoctua NF-A12x25,3xNF-A12x15, 2xAquacomputer Splitty9Active
Memory G-Skill Trident Z5 (32GB) DDR5-6000 C36 F5-6000J3636F16GX2-TZ5RK
Video Card(s) ASUS PROART RTX 4070 Ti-Super OC 16GB, 2670MHz, 0.93V
Storage 1x Samsung 970 Pro 512GB NVMe (OS), 2x Samsung 970 Evo Plus 2TB (data), ASUS BW-16D1HT (BluRay)
Display(s) Dell S3220DGF 32" 2560x1440 165Hz Primary, Dell P2017H 19.5" 1600x900 Secondary, Ergotron LX arms.
Case Lian Li O11 Air Mini
Audio Device(s) Audiotechnica ATR2100X-USB, El Gato Wave XLR Mic Preamp, ATH M50X Headphones, Behringer 302USB Mixer
Power Supply Super Flower Leadex Platinum SE 1000W 80+ Platinum White, MODDIY 12VHPWR Cable
Mouse Zowie EC3-C
Keyboard Vortex Multix 87 Winter TKL (Gateron G Pro Yellow)
Software Win 10 LTSC 21H2
I think a lot of these graphics filters make things harder to look at, and that detracts from gameplay. I have 20/15 vision, but I find it hard to pick out elements on the screen through all the blur, filters, aberration, flare, etc. I have the same opinion of the ENBs that were so popular for games such as Skyrim. Sure, they produce a decent screenshot, but the gameplay suffers for it.
 
Joined
Jun 16, 2021
Messages
53 (0.04/day)
System Name 2rd-hand Hand-me-down V2.0, Mk. 3
Processor Ryzen R5-5500
Motherboard ASRock X370
Cooling Wraith Spire
Memory 2 x 16 GB G.Skill @ 3200 MHz
Video Card(s) Power Color RX 5700 XT
Storage 500 GB Crucial MX500, 2 TB WD SA510
Display(s) Acer 24.0" CB2 1080p
Case (early) DeepCool
Audio Device(s) Ubiquitous Realtek
Power Supply 650W FSP
Mouse Logitech
Keyboard Logitech
VR HMD What?
Software Yes
Benchmark Scores [REDACTED]
Back on topic...

Given my hardware situation at the moment, I'm largely revisiting the more senior titles in my game collection. I do have more graphically demanding games, as I previously owned two Ryzen+ platforms: one with a GTX 1660 Ti, the other with an RX 5600 XT. They were sold to keep a roof over my head. Presumably, someone else is enjoying their capabilities now. Or at least I hope so. (Believe me, roofs are good. Roofs are quite underrated.) But I do have some perspective on this topic, having played graphically upscale titles on reasonably decent gaming PCs in the not-too-distant past. Death Stranding, as an example.

So, having re-played things like Clive Barker's Undying, Deus Ex, Call of Pripyat, The Wheel Of Time, and several other games from the early 17th Century, and enjoying every one of them immensely while doing so, I feel I fall firmly in the category of one who feels that gameplay, story, mood and atmosphere are, far and away, more important than visuals. And, this isn't meant to imply that any of these older games are visually sub-standard.

I'm generalizing here, but the upswing in graphic showcasing (not necessarily graphic realism) has far outpaced many developers' abilities to crank up the storytelling and gameplay mechanics to an equivalent level. In fact, I don't see too much effort in that area at all. Just recycle the same generic scenario from the previous release and add a higher number at the end of the title's name. Maybe in Roman numerals, so it seems a little bit innovative.

This is most definitely a "me" problem, but I'm stunned to see that people are willingly paying so much for hardware to simply play what seems like, aside from graphics, such derivative and lackluster games.
 
Joined
Jun 1, 2011
Messages
4,559 (0.93/day)
Location
in a van down by the river
Processor faster at instructions than yours
Motherboard more nurturing than yours
Cooling frostier than yours
Memory superior scheduling & haphazardly entry than yours
Video Card(s) better rasterization than yours
Storage more ample than yours
Display(s) increased pixels than yours
Case fancier than yours
Audio Device(s) further audible than yours
Power Supply additional amps x volts than yours
Mouse without as much gnawing as yours
Keyboard less clicky than yours
VR HMD not as odd looking as yours
Software extra mushier than yours
Benchmark Scores up yours
I read this article in PC Gamer the other day. I was going to create a new thread for it, but it probably goes with this thread's theme.


2022 was a warning shot for the big-budget games industry


Even if we reduce these games to the thing they're best known for—pushing the graphical envelope—the industry is increasingly experiencing diminishing returns. The visual difference between games made in 1992 and 2002 is vast. The difference between games made in 2012 and 2022? It's still visible, but nowhere near as dramatic. Making any further graphical gains requires a disproportionate amount of effort, as demonstrated by the hardware demand of ray tracing compared to the visual improvements the tech actually provides. That's a bit of a problem when you've spent the last 30 years luring in players with those exciting graphical leaps.
 
Joined
Jan 14, 2019
Messages
12,252 (5.77/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
I read this article in PC Gamer the other day. I was going to create a new thread for it, but it probably goes with this thread's theme.


2022 was a warning shot for the big-budget games industry


Even if we reduce these games to the thing they're best known for—pushing the graphical envelope—the industry is increasingly experiencing diminishing returns. The visual difference between games made in 1992 and 2002 is vast. The difference between games made in 2012 and 2022? It's still visible, but nowhere near as dramatic. Making any further graphical gains requires a disproportionate amount of effort, as demonstrated by the hardware demand of ray tracing compared to the visual improvements the tech actually provides. That's a bit of a problem when you've spent the last 30 years luring in players with those exciting graphical leaps.
Game graphics are good enough already. It's time for game developers to focus on other things, I guess.
 
Joined
Jul 30, 2019
Messages
3,238 (1.68/day)
System Name Still not a thread ripper but pretty good.
Processor Ryzen 9 7950x, Thermal Grizzly AM5 Offset Mounting Kit, Thermal Grizzly Extreme Paste
Motherboard ASRock B650 LiveMixer (BIOS/UEFI version P3.08, AGESA 1.2.0.2)
Cooling EK-Quantum Velocity, EK-Quantum Reflection PC-O11, D5 PWM, EK-CoolStream PE 360, XSPC TX360
Memory Micron DDR5-5600 ECC Unbuffered Memory (2 sticks, 64GB, MTC20C2085S1EC56BD1) + JONSBO NF-1
Video Card(s) XFX Radeon RX 5700 & EK-Quantum Vector Radeon RX 5700 +XT & Backplate
Storage Samsung 4TB 980 PRO, 2 x Optane 905p 1.5TB (striped), AMD Radeon RAMDisk
Display(s) 2 x 4K LG 27UL600-W (and HUANUO Dual Monitor Mount)
Case Lian Li PC-O11 Dynamic Black (original model)
Audio Device(s) Corsair Commander Pro for Fans, RGB, & Temp Sensors (x4)
Power Supply Corsair RM750x
Mouse Logitech M575
Keyboard Corsair Strafe RGB MK.2
Software Windows 10 Professional (64bit)
Benchmark Scores RIP Ryzen 9 5950x, ASRock X570 Taichi (v1.06), 128GB Micron DDR4-3200 ECC UDIMM (18ASF4G72AZ-3G2F1)
... What has the outcome of the burning connectors been? All I have heard is leave your side panel off.
If I recall correctly (from GN videos I watched, and perhaps some other sources), the real problem with the new connectors seems to have been improper latching. An unlatched connector makes a bad connection, so current is no longer distributed evenly across the necessary pins, and with the reduced safety margins of NVidia's cable design that makes overheating much easier. A correctly latched cable, even with a good amount of bending, shouldn't cause an issue.
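To put rough numbers on why uneven current sharing matters, here's a minimal sketch. The figures are illustrative assumptions, not measured specs: a 600 W draw at 12 V split across 6 current-carrying pins, with ~6 mΩ of contact resistance per mated pin. The point is just that resistive heating scales with the square of per-pin current, so halving or thirding the number of pins in good contact multiplies the heat in each remaining contact.

```python
# Hedged sketch: resistive heating per pin in a 12VHPWR-style connector.
# All values are illustrative assumptions, not official specs:
#   600 W draw at 12 V, 6 current-carrying pins,
#   ~6 milliohm contact resistance per properly mated pin.

def per_pin_power(total_watts=600.0, volts=12.0, pins_in_contact=6,
                  contact_resistance_ohms=0.006):
    """Heat (I^2 * R) dissipated in each pin that is actually carrying current."""
    total_current = total_watts / volts                # e.g. 600 / 12 = 50 A
    current_per_pin = total_current / pins_in_contact  # assumes even sharing
    return current_per_pin ** 2 * contact_resistance_ohms

good = per_pin_power(pins_in_contact=6)  # all pins seated: ~8.3 A/pin
bad = per_pin_power(pins_in_contact=2)   # skewed plug, 2 good contacts: 25 A/pin
print(f"6 pins: {good:.2f} W/pin, 2 pins: {bad:.2f} W/pin")
```

Dropping from 6 good contacts to 2 triples the per-pin current, which means roughly 9x the heat concentrated in each remaining tiny contact area (since heating goes with current squared).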
 
Joined
Jul 5, 2013
Messages
27,404 (6.62/day)
Game graphics are good enough already. It's time for game developers to focus on other things, I guess.
Exactly. Hardware has gotten to the point where games can have amazing GFX with only a passing effort. This leaves devs time to focus on gameplay concepts and handling, storyline, sound FX and music, and refining all of the same.
 
Last edited:
Joined
Jun 1, 2011
Messages
4,559 (0.93/day)
Location
in a van down by the river
Processor faster at instructions than yours
Motherboard more nurturing than yours
Cooling frostier than yours
Memory superior scheduling & haphazardly entry than yours
Video Card(s) better rasterization than yours
Storage more ample than yours
Display(s) increased pixels than yours
Case fancier than yours
Audio Device(s) further audible than yours
Power Supply additional amps x volts than yours
Mouse without as much gnawing as yours
Keyboard less clicky than yours
VR HMD not as odd looking as yours
Software extra mushier than yours
Benchmark Scores up yours
gameplay concepts and handling, storyline, soundFX and music and refining all of same.
none of which you can put on the "back of a box" to sell games, and all of which actually need "creative" talent. It's just easier to make the graphics more demanding and add one more number at the end of the IP's title.
 
Joined
Jul 5, 2013
Messages
27,404 (6.62/day)
none of which you can put on the "back of a box" to sell games and actually needs "creative" talent. It's just easier to make the graphics more demanding and add one more number at the end of the IP's title
Unfortunately, the "back of the box" rarely exists anymore. Reviews are King these days.
 