
Is a game's graphical quality important to you?

Are graphics important?

  • Yes.

  • No.

  • I have a preference for presentation (e.g. cartoony, realistic, etc.).

  • Other (please specify).

  • Yes (up to a point).

  • No (up to a point).


It's literally so easy that I can take an ancient game like Oblivion, put it through one pass of RTX Remix, and have a ray-traced game at the other end with a few hours of work.
Sounds like we never even needed that Omniverse to begin with?
 
Sounds like we never even needed that Omniverse to begin with?
You mean Omniverse, of which RTX Remix is a part?
 
You mean Omniverse, of which RTX Remix is a part?
Yes, yes. I played Portal RTX last night actually, just to see, and that's done now too: unimpressive, pointless, and most importantly way too damn late. The game's too old and not what people want to play en masse, so it's pointless IMHO.

At this rate they'll be doing Crysis RTX in 15-20 years. Great.

Remix all the old shit games you want; sit in that NICHE, roll around, and enjoy it.

You might as well, because people en masse are not buying and playing Portal now, or Quake 2 for that matter.

A niche doesn't often earn enough for big corps.

Oh, and I eagerly await the big devs' take on Remix.

You can't remake your game for later generational re-releases if the gamers can do it themselves.
 
Yes, yes. I played Portal RTX last night actually, just to see, and that's done now too: unimpressive, pointless, and most importantly way too damn late. The game's too old and not what people want to play en masse, so it's pointless IMHO.

At this rate they'll be doing Crysis RTX in 15-20 years. Great.

Remix all the old shit games you want; sit in that NICHE, roll around, and enjoy it.

You might as well, because people en masse are not buying and playing Portal now, or Quake 2 for that matter.

A niche doesn't often earn enough for big corps.
RTX Remix is literally free software; it's not a niche, or an earner, or anything of the sort. It's a piece of software that demonstrates the advantages of RT (easy lighting) and AI (upscaling textures and converting surfaces to physically based materials so they interact with the world in an immersive fashion). NVIDIA offers it as a tool for modders, as an example of the advantages of their software/hardware approach. You're barking up the wrong tree, bud.
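For anyone curious what Remix actually automates, the flow is roughly: intercept the old game's fixed-function draw calls, swap in modder-supplied (or AI-inferred) PBR materials, and relight the rebuilt scene with a path tracer. A toy sketch of that shape in Python (hypothetical names throughout, not the real Remix API):

```python
# Toy sketch of an RTX-Remix-style flow (hypothetical names, NOT the
# real Remix API): capture legacy draw calls, swap assets, relight.
from dataclasses import dataclass

@dataclass
class DrawCall:            # stand-in for an intercepted legacy D3D call
    mesh: str
    texture_hash: str

def guess_pbr_from_diffuse(tex_hash):
    # Placeholder for the AI step that infers albedo/roughness/metalness.
    return {"albedo": tex_hash, "roughness": 0.8, "metallic": 0.0}

def remix_frame(draw_calls, overrides):
    scene = []
    for call in draw_calls:
        # Use a modder-supplied PBR material keyed to this texture hash,
        # otherwise fall back to an automatic guess.
        material = overrides.get(call.texture_hash) or guess_pbr_from_diffuse(call.texture_hash)
        scene.append((call.mesh, material))
    return scene  # the real runtime hands this to a path tracer

calls = [DrawCall("wall", "a1b2"), DrawCall("portal_gun", "c3d4")]
mods = {"c3d4": {"albedo": "portal_gun_4k", "roughness": 0.3, "metallic": 1.0}}
print(remix_frame(calls, mods))
```

The original game's baked and vertex lighting gets discarded entirely and replaced by the path-traced result, which is why "one pass and a few hours of work" is even plausible.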

Also, frankly, I doubt that your 2060 would be able to render Portal RTX at any quality level that would satisfy your evident demand for being impressed.

You can't remake your game for later generational rereleases if the gamer's can do it themselves.
Amazing how you turn a free set of innovative tools into a negative.
 
Also, frankly, I doubt that your 2060 would be able to render Portal RTX at any quality level that would satisfy your evident demand for being impressed.


Amazing how you turn a free set of innovative tools into a negative.
You need to try and escape your bubble now and again.
Nvidia is lying to you.
It doesn't take a 4090 to run RTX effectively.

It does take compromise, yes; I use it on a 1080p 144 Hz laptop.

Your leet's leaking out there; tuck it back in.

I was impressed with the new COD, until all the lights bled through walls. I retried that last night too. I'll clearly have to try it on the Vega; hopefully it will work better.

A reasonable look at last year's game release list clearly shows a love of re-releasing old games or remaking them; common sense says Remix will stir that pot, no?

Negative, realistic.
 
You need to try and escape your bubble now and again.
Nvidia is lying to you.
It doesn't take a 4090 to run RTX effectively.

It does take compromise, yes; I use it on a 1080p 144 Hz laptop.

Your leet's leaking out there; tuck it back in.

I was impressed with the new COD, until all the lights bled through walls. I retried that last night too. I'll clearly have to try it on the Vega; hopefully it will work better.

A reasonable look at last year's game release list clearly shows a love of re-releasing old games or remaking them; common sense says Remix will stir that pot, no?

Negative, realistic.
Yes, I'm sure it ran great on a laptop 2060.
[Attached screenshot: Screenshot_20230102-204333.png]
 
Portal RTX is a tech demo made by Nvidia for 40-series cards. It shouldn't form the basis of any argument, imo.
And the entire point of Remix is to sell 40-series cards.

A 3090 Ti can't even hit 60 fps at 1080p :shadedshu:
 
Portal RTX is a tech demo made by Nvidia for 40-series cards. It shouldn't form the basis of any argument, imo.
Then maybe people with hardware that can only run it as a slideshow could have a big think about why they might be unimpressed.

It's intentionally cutting-edge RT, and the fact that current-gen GPUs can run it at native framerates approaching 100, and DLSS framerates almost double that, certainly puts some perspective on the argument some here make that RT is 20 years away from adoption.
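For context on where the DLSS number comes from: the card shades far fewer native pixels and the upscaler reconstructs the rest. Rough arithmetic, assuming the commonly published per-axis scale factors (Quality about 67%, Performance 50%):

```python
# Back-of-envelope: pixels actually rendered per frame at 4K output
# under different DLSS modes (scale factors are per-axis).
native = (3840, 2160)
modes = {"Native": 1.0, "DLSS Quality": 2 / 3, "DLSS Performance": 0.5}

for name, s in modes.items():
    w, h = int(native[0] * s), int(native[1] * s)
    share = (w * h) / (native[0] * native[1])
    print(f"{name:18s} {w}x{h}  {w * h / 1e6:5.2f} Mpix  ({share:.0%} of native shading work)")
```

Performance mode renders roughly a quarter of the pixels, which is why "almost double the framerate" is plausible once fixed per-frame costs are accounted for.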

And the entire point of Remix is to sell 40-series cards.

A 3090 Ti can't even hit 60 fps at 1080p :shadedshu:
Would you prefer fidelity stagnated so owners of older GPUs could see a bigger number? The 3090 Ti is still a fast card; the release of the 4xxx series does nothing to change that. It's just that the benchmark has shifted, as with all generations.
 
Portal RTX is a tech demo made by Nvidia for 40-series cards. It shouldn't form the basis of any argument, imo.
Aww, don't ruin his looking-down-at-the-peasants moment; I'll recall it later nonetheless.

It ran fine. Made for the 40-series? Don't make me laugh.

It just didn't look much better in practice, and it didn't change the fact that it's an old game looking old.

RTX looks yawn, simple as.
 
Then maybe people with hardware that can only run it as a slideshow could have a big think about why they might be unimpressed.

It's intentionally cutting-edge RT, and the fact that current-gen GPUs can run it at native framerates approaching 100, and DLSS framerates almost double that, certainly puts some perspective on the argument some here make that RT is 20 years away from adoption.
As long as those current-gen GPUs cost over $1k, I think that argument shouldn't be discarded. When a $150 GPU runs it above 60 FPS at 1080p, I'll accept RT as a common feature.
 
Some of the techniques used in Portal RTX aren't even a year removed from being research papers.
 
I'm one who says graphics matter up to a point.
The real thing is to have fun, no doubt, but at the same time a game that turns into a frame-by-frame, jumbled slideshow isn't fun.
And although graphics don't have to be anything special, it is nice if they are sharp and clear enough that it doesn't look like you're playing the latest version of the Blob game with an all-blob cast.

As long as the graphics don't start chugging, the details don't get all blocky/blobby, and it has decent FPS (a consistent 45+ FPS minimum), it's all good.
It doesn't have to go screaming along at 200,000,000,000+++++ FPS to be fine, as long as the experience is smooth while fulfilling the other requirements; an average of 60 FPS, more or less, is well doable for me.
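The "consistent" part is the load-bearing word there: stutter hides in the worst frames, not in the average. A quick illustration with made-up frame times:

```python
# Average FPS hides stutter: convert frame times (ms) to FPS and compare
# the mean against the worst 1% of frames. Sample data is invented.
times_ms = [16.7] * 95 + [50.0] * 5   # mostly 60 FPS, a few 20 FPS spikes

avg_fps = 1000 * len(times_ms) / sum(times_ms)
worst_ms = sorted(times_ms)[-max(1, len(times_ms) // 100)]
print(f"average: {avg_fps:.0f} FPS")          # ~54 FPS, looks acceptable
print(f"1% low:  {1000 / worst_ms:.0f} FPS")  # 20 FPS, feels like a slideshow
```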
 
After many years of console-only gaming since the PS3, I'm fine with 60 fps PC gaming at 1080p/1440p, and as long as it doesn't look like junk and doesn't stutter, I'm good. My current gaming rig is rocking an RX 6600 XT, so it's not exactly top tier, but it works well.
 
I'm one who says graphics matter up to a point.
The real thing is to have fun, no doubt, but at the same time a game that turns into a frame-by-frame, jumbled slideshow isn't fun.
And although graphics don't have to be anything special, it is nice if they are sharp and clear enough that it doesn't look like you're playing the latest version of the Blob game with an all-blob cast.

As long as the graphics don't start chugging, the details don't get all blocky/blobby, and it has decent FPS (a consistent 45+ FPS minimum), it's all good.
It doesn't have to go screaming along at 200,000,000,000+++++ FPS to be fine, as long as the experience is smooth while fulfilling the other requirements; an average of 60 FPS, more or less, is well doable for me.
I grew up with Pong and Monaco GP, as many others did.

Looking back via emulators has taught me one thing: the mind makes up for a lot of shit.

I don't really get how that was an F1 car now, but I'm not in the mood either.

But playing on a Steam Deck has also reminded me that it's really mostly about the experience and how enjoyable it is. 30-40 FPS is fine on screens small or big; it depends only on the eyeballs watching whether it's OK.
 
Portal RTX is a tech demo made by Nvidia for 40-series cards. It shouldn't form the basis of any argument, imo.

It would be good to see AMD team up with a game dev and pimp, eh... rasterization, and eh... more VRAM use or something.
 
It's intentionally cutting-edge RT

More like intentionally made to run as atrociously as possible. There is nothing visually remarkable in it compared to other, more complex games that have RT and actually manage to run at a decent framerate. I don't know how anyone can make the case that the performance hit is in any way justifiable in a basic game like Portal. Not to mention that such conversions of old games are incredibly misleading, because you can achieve 90% of the same look with modern non-RT techniques, and once you become aware of that you realize just how ridiculous this is.

If the cutting edge of RT is making a 15-year-old game look modern while obliterating anything resembling a playable framerate on 99% of the available hardware, we're in for a rough ride.
 
I think the gameplay is always the priority. I like to feel the games I play, and that feeling doesn't seem to be affected by a trash resolution. Most of my all-time favorites are PS2 or PSP games, and those look like crap; I think the essence of those games is the gameplay.

I think 8-bit is better than blood Mario.
Nightmare fuel
 
It would be good to see AMD team up with a game dev and pimp, eh... rasterization, and eh... more VRAM use or something.
They do team up. Does FSR ring a bell?
 
So, where are all those indie studios giving us beautiful GI in games? Minecraft isn't exactly a little guy :)

Convince me. Minecraft really just has shitty lighting in the base game, so obviously an RT pass is going to have an impact there. They could also have just implemented a few dynamic light sources and shadowing.

And at the same time, a game like Valheim managed to do that just fine without RT. It has atmosphere up to the moon. And if you ever designed a few levels in Unreal Tourney, you also knew how to work with lighting to get the mood going. This isn't rocket science just because you might not realize it. I remember quite a few in-game editors where all of this was at your fingertips. On a potato PC.
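For reference, the kind of cheap trick being pointed at here: vanilla Minecraft's block lighting is essentially a flood fill, a light level from 0 to 15 that drops by one per block travelled from the source. A toy 2D sketch of the idea (my own simplification, not Mojang's actual code):

```python
# Toy flood-fill block lighting, Minecraft-style: BFS from each source,
# losing one light level (max 15) per block travelled.
from collections import deque

def flood_light(solid, sources, w, h):
    light = [[0] * w for _ in range(h)]
    queue = deque((x, y, 15) for x, y in sources)
    while queue:
        x, y, lvl = queue.popleft()
        if not (0 <= x < w and 0 <= y < h) or solid[y][x] or light[y][x] >= lvl:
            continue
        light[y][x] = lvl
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            queue.append((x + dx, y + dy, lvl - 1))
    return light

solid = [[0] * 8 for _ in range(4)]
solid[1][3] = 1                      # one wall block casting a "shadow"
for row in flood_light(solid, [(0, 0)], 8, 4):
    print(" ".join(f"{v:2d}" for v in row))
```

It runs on a potato precisely because it ignores bounce light, color, and directionality, which is also why an RT pass changes Minecraft's look so dramatically.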


Oh, is it? I find games with limitations a hundred times more memorable than a screen plastered full of post effects. Limitations breed creativity, and creativity breeds originality. Automated 'design' is a construct for fast-food products. Less is quite often more, and requiring talent to design a game is a pro, not a con.

Of course, removing limitations in game design is a good thing in general, but it's a mistake to think this can replace a talented dev and creative team.
Or that it somehow saves anyone money or effort.

Sure, Rockstar is no indie, but gosh, look at the lighting effects in Red Dead. They added RT to that, but sheesh, I feel like it was a waste of resources tbh.


Unreal Engine has RT by default now, and big and small studios are switching to it from Unity, in-house engines, etc. UE4 is still more common than UE5, but this doesn't mean UE5 isn't going to become the new standard.

Your argument about Minecraft doesn't make much sense; it's almost perfect for RT, since it's entirely procedurally/player generated. Baked light sources don't work, hence the shitty lighting. Valheim would also be an excellent candidate for RT if it didn't use the unoptimized mess that is Unity; you can see the result in the typically poor framerates once you get to building complex structures, for a game of its basic fidelity (atmosphere aside). You seem to be conflating this with an argument I'm not making ("RT is necessary for an atmospheric or successful game") and using that as some kind of logic to determine that RT is bad. It's pretty easy to build an internal combustion car too, and it relies on time-tested knowledge, but electric cars are still what the market is moving to.
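To make the baking point concrete: baked GI is a precomputation keyed to geometry that is assumed static, so any player edit invalidates it. A toy sketch (hypothetical names, invented costs in the comments):

```python
# Toy illustration: baked lighting is precomputed against static
# geometry, so a runtime edit leaves it stale.
def bake_gi(blocks):
    # Offline step: minutes to hours for a real scene.
    return {"key": hash(frozenset(blocks)), "data": "precomputed bounce light"}

world = {"floor", "wall", "torch"}
baked = bake_gi(world)

world.add("player_placed_block")     # the world changed at runtime
if baked["key"] != hash(frozenset(world)):
    # Options now: rebake (too slow), fake it with crude dynamic levels
    # (what vanilla Minecraft does), or compute lighting per frame (RT).
    print("baked lighting is stale")
```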

Most developers target the lowest common denominator, which even now is the PS4/XBO rather than the PS5/XSX, which actually have (weak) RT hardware, or the PC.

I don't need to convince you; the technology is there and is superior, and adoption rates are increasing and will continue to increase. Even AAA companies like Bethesda, for instance, make all sorts of crazy engine decisions that don't reflect an appropriate response to reality; it looks like Starfield will still be based on an updated x64 version of their Creation engine, which is over 10 years old at this point, all to avoid paying for a new engine, whether in-house or licensed. The game looks like crap in the early trailers, and I expect they'll pay the price for that shortsightedness in the inevitable jank and loss of immersion. Oh well, at least they can take the Skyrim route of releasing it 17 different times with minor updates.

Indie studios have potential; it's still up to them to use it. The tools exist, so they will be used once devs learn how. You'd be surprised at how many developers will choose to spend hundreds of hours doing things the old way rather than 10 hours learning something new and 10 hours building things the new way.


Post effects are literally used to cheaply fake the real deal, e.g. global illumination, physically accurate materials, etc. They're imitations designed as a cheap shortcut to an end result. Again, you're arguing against a point I'm not making. Limitations do not breed creativity, and concerning yourself with what are in essence fake or unnecessarily complicated tools, just to achieve the image in your mind's eye, is not a virtue.
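A concrete example of faking the real deal: ambient occlusion. Screen-space variants like HBAO+ estimate it from the depth buffer, while the ray-traced version actually asks, per point, how much of the hemisphere above it is blocked. A minimal sketch against a toy one-sphere scene (my own example: uniform sampling, no denoising):

```python
# Minimal ray-traced ambient occlusion at a point on a ground plane next
# to a unit sphere (toy scene, uniform hemisphere sampling).
import math, random

def hits_sphere(origin, direction, center=(0.0, 1.0, 0.0), radius=1.0):
    # Standard ray-sphere intersection test (direction must be unit length).
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2 * (ox * direction[0] + oy * direction[1] + oz * direction[2])
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * c
    return disc >= 0 and (-b - math.sqrt(disc)) > 1e-4

def ambient_occlusion(point, samples=2000):
    blocked = 0
    for _ in range(samples):
        # Uniform direction on the upper hemisphere (surface normal = +y).
        while True:
            d = [random.uniform(-1, 1) for _ in range(3)]
            n = math.sqrt(sum(v * v for v in d))
            if 0 < n <= 1:
                break
        d = [v / n for v in d]
        d[1] = abs(d[1])
        blocked += hits_sphere(point, d)
    return 1 - blocked / samples     # 1.0 = fully open sky

print(ambient_occlusion((1.2, 0.0, 0.0)))  # noticeably darkened by the sphere
print(ambient_occlusion((5.0, 0.0, 0.0)))  # nearly 1.0, sphere is far away
```

The screen-space fake breaks wherever the occluder isn't visible in the depth buffer; the traced version answers the visibility question directly. That's the shortcut being traded away.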

Who suggested it can replace a talented team?

Do construction companies that exclusively use manual labour over things like cranes and power tools make better products as a result? No, they take longer and cost more.


This shit worries me, in the same way the lack of other game engines does.
 
More like intentionally made to run as atrociously as possible. There is nothing visually remarkable in it compared to other, more complex games that have RT and actually manage to run at a decent framerate. I don't know how anyone can make the case that the performance hit is in any way justifiable in a basic game like Portal. Not to mention that such conversions of old games are incredibly misleading, because you can achieve 90% of the same look with modern non-RT techniques, and once you become aware of that you realize just how ridiculous this is.

If the cutting edge of RT is making a 15-year-old game look modern while obliterating anything resembling a playable framerate on 99% of the available hardware, we're in for a rough ride.
This.

Raster is there for a reason: it was hyper-efficient. Efficient enough to look pretty great, add lots of dynamic effects to the scene, and still run on a potato.
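To put rough numbers on "hyper-efficient", here is the ray budget even a modest path-tracing setup asks for (per-pixel ray counts are my assumptions, not measurements):

```python
# Back-of-envelope ray budget for 1080p60 path tracing, vs. the ~5-10
# "Giga Rays/s" figures quoted for first-gen RT hardware.
width, height, fps = 1920, 1080, 60
rays_per_pixel = 1 + 2 + 2   # 1 primary + 2 bounce + 2 shadow rays (assumed)

rays_per_frame = width * height * rays_per_pixel
print(f"{rays_per_frame / 1e6:.1f} M rays per frame")           # ~10.4 M
print(f"{rays_per_frame * fps / 1e9:.2f} G rays/s for 60 fps")  # ~0.62 G
# And that is before denoising, extra samples per pixel, or any shading
# cost. Raster touches each triangle once per frame, which is why it
# held the fort for 25 years.
```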

Nvidia, however, likes to sell a new potato every other year, and since RTX, at astronomical price tags. We've just seen a discussion happen where a 2060 (!) user couldn't properly run RT content, even though it's only two generations old and among the scarce cards that can run it to begin with. Then we get a misguided tour through Nvidia land where 'the 40-series can run it fine', even though the stack contains only north-of-$1K GPUs with the worst perf/$ in forever. Then, when it becomes undeniable that this won't help adoption, a 'research paper turned into a game' that is only one year in the past is suddenly grounds for helping that adoption rate. You couldn't make it up, but sound logic it is not.

Somebody really drank too much Kool-Aid here, and is being pretty arrogant about it too when proven wrong by reality checks. As a staff member, I might add, and as another event in a long series of examples. Tone of voice: disgusting, oozing gullible fool caught by marketing. The guy should start a YT channel; TPU is clearly not his place. Any content he spews is clearly tainted by confirmation bias. I'd prefer to stay far away from reading it.

'I'm right, you're wrong,' he says, when there is clear evidence in the market that RT adoption is not happening at any reasonable rate. The consoles aren't even pushing it, and cards that run it have extremely low market penetration. And that was due to the days of free and crypto money, whereas now people are looking at >10% inflation YoY. Good luck pushing nonsensically heavy processing feature sets when people are happy just to make ends meet. Enjoy those research papers and PoCs; that's about as far as it'll go. Much like VR, it simply prices itself out of the market.

Now excuse me while I push 3440x1440 on a GTX 1080 at 60~100 fps in pretty much all I want to play. :toast:

Then maybe people with hardware that can only run it as a slideshow could have a big think about why they might be unimpressed.
It's called common sense; try it someday. I see RT side by side with raster scenes, and I just can't find any reason to justify the expense. I've been here before; pretty much every gen, Nvidia pushes a new proprietary trick, and every time it wasn't worth jumping into: Hairworks, PhysX, Turf Effects, HBAO+, MLAA. It was all completely unnecessary, and the market didn't carry it. They're gimmicks, and only when they become ubiquitous do they start to matter. And they get there when the market is saturated with cards that can run it just fine. That is when devs are truly convinced. Right now it's like a crypto ICO, no more, no less, and you're just doing missionary work for someone else. And again, the writing is on the economic wall: a have/have-not situation is not going to help RT adoption, but it's exactly where we're at.

And you know what happens when the market does carry it? You get the G-Sync situation: Nvidia is forced to retract their turd because their whole business approach fell flat just like that. 'Pushing RT' the way Nvidia has ever since Turing is really not going places unless the products are accessible. The fact that Nvidia is increasing margins 'while pushing the industry' is a completely nonsensical shift as well; it directly counters their attempts to move RT forward. Gosh, I wonder why? Maybe you will figure it out someday, big thinker.

'Moore's Law is Dead, (because we created RTX)' - Huang, 2022.
 
Then maybe people with hardware that can only run it as a slideshow could have a big think about why they might be unimpressed.

It's intentionally cutting-edge RT,
Your leet's dripping from your lips now.

Clearly you're not aware there are scaling sliders that affect quality in games.

There are settings besides max ultra.
I can easily run it maxed on detail, and I did.

I saw that RTX added little.

I then noted that half the scalable details, in settings terms, looked much the same yet ran smoothly.

I then noted that RTX added little and didn't look much better than pre-baked lighting.

And finally I noted that an old game still looked old, and that my two prior playthroughs over ten years ago were, and are, enough.

Intentionally cutting edge is still a bit shit, apparently: pointless, and not worth any more investment than 3D TV got.

And finally, this isn't even a thread about RTX, so your single-minded Nvidia advertising just looks shill-like; if you aren't a shareholder or paid employee of Nvidia, you are a confused person and a dubious choice for an impartial proofreader for this site.
 
This.

Raster is there for a reason: it was hyper-efficient. Efficient enough to look pretty great, add lots of dynamic effects to the scene, and still run on a potato.
Yeah, of course you're absolutely right; that's why every CPU and GPU designer, including ones bringing their first products to market (Intel), console makers, and literal phone GPUs are adding support for RT, using valuable and expensive die space. Since raster will get us all the way when it comes to graphical fidelity, I guess all that Moore's Law and the massive progression in processing power over the past several decades should culminate in zero new techniques and methods of rendering and processing, since we're clearly at the apex already. Maybe cinema should never have invested in 3D technology or advanced CGI? Too expensive and low-volume initially...
Nvidia, however, likes to sell a new potato every other year, and since RTX, at astronomical price tags. We've just seen a discussion happen where a 2060 (!) user couldn't properly run RT content, even though it's only two generations old and among the scarce cards that can run it to begin with. Then we get a misguided tour through Nvidia land where 'the 40-series can run it fine', even though the stack contains only north-of-$1K GPUs with the worst perf/$ in forever. Then, when it becomes undeniable that this won't help adoption, a 'research paper turned into a game' that is only one year in the past is suddenly grounds for helping that adoption rate. You couldn't make it up, but sound logic it is not.

Somebody really drank too much Kool-Aid here, and is being pretty arrogant about it too when proven wrong by reality checks. As a staff member, I might add, and as another event in a long series of examples. Tone of voice: disgusting, oozing gullible fool caught by marketing. The guy should start a YT channel; TPU is clearly not his place. Any content he spews is clearly tainted by confirmation bias. I'd prefer to stay far away from reading it.
Thanks for sharing. Me pointing out that the lowest-tier RTX SKU, in its most power- and cooling-limited laptop form, from the literal first generation of RT GPUs, backed by our own TPU testing that shows the flagship 2080 Ti getting less than 20 FPS in a cutting-edge RT demo, is... surprising? Rude to point out? In need of a "reality check"? In case you missed the meaning, I mentioned the research paper date as an example of just how cutting edge that RT implementation is, since there's obviously huge variation in the complexity and integration level of RT across different games.
'I'm right, you're wrong,' he says, when there is clear evidence in the market that RT adoption is not happening at any reasonable rate. The consoles aren't even pushing it, and cards that run it have extremely low market penetration. And that was due to the days of free and crypto money, whereas now people are looking at >10% inflation YoY. Good luck pushing nonsensically heavy processing feature sets when people are happy just to make ends meet. Enjoy those research papers and PoCs; that's about as far as it'll go. Much like VR, it simply prices itself out of the market.
All new consoles have literal hardware for RT baked into the design, and new engines use it by default. Just about every AAA game has been released with some form of RT, and consoles typically offer two modes, high quality (RT on) and high refresh; Spider-Man is a good example.
Now excuse me while I push 3440x1440 on a GTX 1080 at 60~100 fps in pretty much all I want to play. :toast:

It's called common sense; try it someday.
And you call me arrogant? :laugh:
I see RT side by side with raster scenes, and I just can't find any reason to justify the expense. I've been here before; pretty much every gen, Nvidia pushes a new proprietary trick, and every time it wasn't worth jumping into: Hairworks, PhysX, Turf Effects, HBAO+, MLAA. It was all completely unnecessary, and the market didn't carry it. They're gimmicks, and only when they become ubiquitous do they start to matter.

And you know what happens when the market does carry it? You get the G-Sync situation: Nvidia is forced to retract their turd because their whole business approach fell flat just like that. 'Pushing RT' the way Nvidia has ever since Turing is really not going places unless the products are accessible. The fact that Nvidia is increasing margins 'while pushing the industry' is a completely nonsensical shift as well; it directly counters their attempts to move RT forward. Gosh, I wonder why? Maybe you will figure it out someday, big thinker.

'Moore's Law is Dead, (because we created RTX)' - Huang, 2022.
So, your examples of G-Sync (an Nvidia innovation from 2013, copied by AMD with FreeSync in 2015, now ubiquitous), HBAO+, etc. don't matter, since at initial adoption they had low market penetration? Interesting take... Pushing RT seems to be something the entire industry is doing, with NVIDIA being a leader, and Intel laughably having better RT support than AMD out of the gate (shame about the driver issues). But of course, it's entirely possible that every market analyst, engineer, and executive at all of these companies has concurrently missed what you are saying and placed all their R&D into a dead end... wait, no, it's only a dead end until it's the majority of the market? Is that what you're saying? You mentioned 20 years a while back; I guess we'll see. I doubt it, though.

Your leet's dripping from your lips now.

Clearly you're not aware there are scaling sliders that affect quality in games.

There are settings besides max ultra.
I can easily run it maxed on detail, and I did.

I saw that RTX added little.

I then noted that half the scalable details, in settings terms, looked much the same yet ran smoothly.

I then noted that RTX added little and didn't look much better than pre-baked lighting.

And finally I noted that an old game still looked old, and that my two prior playthroughs over ten years ago were, and are, enough.

Intentionally cutting edge is still a bit shit, apparently: pointless, and not worth any more investment than 3D TV got.

And finally, this isn't even a thread about RTX, so your single-minded Nvidia advertising just looks shill-like; if you aren't a shareholder or paid employee of Nvidia, you are a confused person and a dubious choice for an impartial proofreader for this site.
Sure bud. Whatever floats your boat. :toast:

I, for one, am all for innovations that push the envelope for detail and accuracy. Cheers to all the (much smarter than anyone here) engineers and researchers figuring out ways to make virtuality even less distinguishable from reality.
 
copied by AMD with FreeSync in 2015
I was agreeing up to this point, so hold up. AMD didn't copy it; otherwise it would have been just as crappy and proprietary as G-Sync. AMD made VRR more standard by not requiring stupid special hardware for it and by making the standard open. That forced nVidia to support it. I'd hardly call that copying. nVidia tried to corner the market, and AMD flipped that on its face. RT on nVidia cards is no different: more proprietary hardware and APIs to get the best performance, at the cost of vendor lock-in, which is a dangerous thing with a company like nVidia given their history. If I bought a GPU today, RT performance would be at the bottom of my list of things I care about. I want the ecosystem to evolve just like VRR did, because eventually we won't need proprietary compute blocks or APIs to do this, and it will all become standardized. Until then, I'll let everyone else play the role of guinea pig.
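For anyone who hasn't followed the VRR mechanics: a fixed-refresh panel can only show a late frame at the next tick, while an adaptive-sync panel refreshes the moment the frame is ready, within its supported Hz range. A toy comparison with invented frame times:

```python
# Toy comparison: fixed 60 Hz vs. VRR for irregular frame times (ms).
# On a fixed clock a late frame waits for the next tick (judder); with
# VRR the panel refreshes when the frame is ready (Hz limits ignored).
frame_times = [14, 22, 16, 30, 17]   # invented render times, ms
TICK = 1000 / 60                     # 16.7 ms fixed refresh interval

t = 0
for ft in frame_times:
    t += ft
    fixed = TICK * -(-t // TICK)     # ceil up to the next 60 Hz tick
    print(f"rendered {t:5.1f} ms  fixed: {fixed:5.1f} ms  vrr: {t:5.1f} ms")
```

Same frames, same GPU; only the display timing changes, which is exactly why this belonged in an open standard.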
 