
Nvidia's future DLSS iterations might have AI textures / NPC add-ons.

Joined
Sep 10, 2018
Messages
5,857 (2.76/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
Oh, you would definitely notice how 120 FPS with DLSS Balanced is much smoother than 80 FPS with DLAA before you notice anything else :laugh:.

But yeah, FSR's flaws can't be missed; I have seen people with Radeon cards prefer XeSS 1.3 or TSR over FSR 2.

XeSS 1.3 is definitely better than FSR in most games that support it. I don't like TSR, though, but maybe it's better than FSR.

Yeah, that's part of the reason I swapped to a 1440p ultrawide; I couldn't stomach having to drop to 4K DLSS Balanced to get the performance I wanted in some games. Although, since I have all three options (4K 120, 1440p UW 175, and 1440p 360), I can just pick what I like best game to game....

Thinking about getting a 32-inch 4K 240 Hz OLED next. Maybe Balanced DLSS will be OK at the smaller size...
 
Joined
Sep 17, 2014
Messages
21,457 (6.00/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Called it. DLSS 4 with new proprietary hardware requirements incoming.

Prepare to feel really shit about your Ada soon and then pride yourselves after purchase that you are once more chasing that 'cutting edge' in upscaling. It looks so much bettehhrrurur. Don't worry too much about necessity, Nvidia will find a way to make it so. And you will again believe it.

It would be hilarious if it wasn't so sad.

No, no, & hell no...

'Cause AI IS DLSS, textures, NPCs, and everything else, everywhere, all at once, all the time.....

If the chair is against the wall, the wall is against the chair, and you can't sit on either one, even if you wanted to...
What? Who said the wall has a roof and why isn't the chair turned 180 degrees then? Also, can I take some of those shrooms you've got there? :D
 
Joined
Nov 11, 2016
Messages
3,205 (1.15/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) LG OLED CX48"
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Viper Ultimate
Keyboard Corsair K75
Software win11
Anyway, back to topic: Nvidia continues to innovate in the gaming space, so why is that in any way a bad thing?

DLSS 2: Super Resolution
DLSS 3: Frame generation
DLSS 3.5: Ray Reconstruction
DLSS 4: Texture Reconstruction?
DLSS 5: AI NPCs?

Weird to find tech enthusiasts so afraid of new tech, LOL. Maybe create a new site called TechPowerDown and discuss how new tech scares you :roll:
 
Joined
Sep 17, 2014
Messages
21,457 (6.00/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Anyway, back to topic: Nvidia continues to innovate in the gaming space, so why is that in any way a bad thing?

DLSS 2: Super Resolution
DLSS 3: Frame generation
DLSS 3.5: Ray Reconstruction
DLSS 4: Texture Reconstruction?
DLSS 5: AI NPCs?

Weird to find tech enthusiasts so afraid of new tech, LOL
New tech is fine; it's just a shame Nvidia actively uses it to corner the market and exclude the competition.

The end result is that consoles, which make up over 50% of the gaming market, don't really do anything with it. Same games, just a different platform; same market, same target demographic. So now we have two worlds in gaming, and the market has lost a lot of its dynamic. We don't benefit from that as consumers/gamers.

It also doesn't help the adoption of RT; Nvidia is forced to throw money at devs to make them add RT as an afterthought. This is still happening; the implementations are often lackluster, and the rare games where they do shine are fed heavily with Nvidia TLC, because otherwise there really isn't a market. Gamers are already on Nvidia anyway, so why would devs make an effort? Gamers don't buy games simply 'because they have RT', and they never will. And since devs primarily develop for consoles first and for PC next/in tandem, this won't be changing anytime soon either.
 
Joined
Jan 14, 2019
Messages
10,496 (5.26/day)
Location
Midlands, UK
System Name Holiday Season Budget Computer (HSBC)
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 6500 XT 4 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Get ready for games written by AI, featuring generic, bland NPCs and a storyline without any meaning or significance with DLSS 6, exclusive to your $2,000 RTX 6090 Ti. :rockout:

Developers, enjoy your looong holidays! :pimp:
 
Joined
Sep 17, 2014
Messages
21,457 (6.00/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Get ready for games written by AI, featuring generic, bland NPCs and a storyline without any meaning or significance with DLSS 6, exclusive to your $2,000 RTX 6090 Ti. :rockout:

Developers, enjoy your looong holidays! :pimp:
Nah, developers are still busy just like they always were, but now they can focus their attention on fighting AI hallucinations. You don't need to be talented for that, just recognize and delete. Similar things apply to code: you don't need to write great code anymore, just let ChatGPT fix it for you. No need to be talented there either. It's a free-for-all for mediocre studios, isn't it lovely. And a few years later we'll have AI-expert devs that know all the LoRAs by heart. I can't wait.
 
Joined
Sep 4, 2022
Messages
220 (0.33/day)
Nah, I would take 50% more FPS and 50 W less, and go with DLSS Balanced plus a touch of Sharpen+ instead of DLAA.
DLAA vs DLSS.Balanced Sharpen+
It seems DLAA is on the right, where there is more detail in the textures all around the image. When using DLAA in Vermintide 2, I noticed that the textures subjectively have more resolution than with temporal anti-aliasing, but at the cost of 50 watts of additional power. I prefer DLAA if the hardware allows a playable frame rate.
Anyway, back to topic: Nvidia continues to innovate in the gaming space, so why is that in any way a bad thing?

DLSS 2: Super Resolution
DLSS 3: Frame generation
DLSS 3.5: Ray Reconstruction
DLSS 4: Texture Reconstruction?
DLSS 5: AI NPCs?

Weird to find tech enthusiasts so afraid of new tech, LOL. Maybe create a new site called TechPowerDown and discuss how new tech scares you :roll:
LOL, no one is scared of new technology; some are just trying to start a conversation about the inevitable direction of Nvidia's RTX lineup, especially when they said DLSS will eventually get to version 10.
What I am excited about is the potential of 3D textures with this technology. Huang mentioned textures and objects, meaning the textures don't have to be flat and can have added geometry.
Also, who do you think will be splitting the bill initially? It's us, the enthusiast beta testers, who make way for tomorrow's standards. :cool::rockout:

Nah, developers are still busy just like they always were, but now they can focus their attention on fighting AI hallucinations. You don't need to be talented for that, just recognize and delete. Similar things apply to code: you don't need to write great code anymore, just let ChatGPT fix it for you. No need to be talented there either. It's a free-for-all for mediocre studios, isn't it lovely. And a few years later we'll have AI-expert devs that know all the LoRAs by heart. I can't wait.
Yeah, with all the layoffs in the gaming industry, I foresee a boom in independent developers' work with the help of AI. I've always seen it as the programmers who can correct faulty AI code being the ones that will be ahead, though.
 
Joined
Jan 14, 2019
Messages
10,496 (5.26/day)
Location
Midlands, UK
System Name Holiday Season Budget Computer (HSBC)
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 6500 XT 4 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Nah, developers are still busy just like they always were, but now they can focus their attention on fighting AI hallucinations. You don't need to be talented for that, just recognize and delete. Similar things apply to code: you don't need to write great code anymore, just let ChatGPT fix it for you. No need to be talented there either. It's a free-for-all for mediocre studios, isn't it lovely. And a few years later we'll have AI-expert devs that know all the LoRAs by heart. I can't wait.
Well, that's both good and bad, I guess. It's bad because the days of talented developers are over, and good because other types of artists, who may not be so talented at code, can have access to game development. As for the end product, we'll see. This is all hypothetical, of course.
 
Joined
Sep 10, 2018
Messages
5,857 (2.76/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
Well, that's both good and bad, I guess. It's bad because the days of talented developers are over, and good because other types of artists, who may not be so talented at code, can have access to game development. As for the end product, we'll see. This is all hypothetical, of course.

I think in the long term it will be a net benefit as the tools get better and better. The average AAA game takes 4-5 years to develop, and that is just too long for most games to actually be profitable at current pricing without BS GaaS mechanics tied in, so my hope is that once the tools get significantly better, development can be cut down to 3-4 years, making new concepts possibly less risky for publishers. That is probably a best-case scenario, and the reality probably won't be so great.

There will always be indie stuff and smaller projects on PC; I don't think that will be affected.

My biggest issue with this AI NPC thing and AI texture thing is it being locked into one ecosystem. I am OK with upscalers and/or frame generation being locked to specific hardware, assuming that makes them better, but developers having to make basically two different versions of a game to accommodate one hardware set is something I will never be on board with, regardless.... Now, if both future consoles support something similar, sure, and that sounds like it could be a thing, with both Sony and Microsoft seemingly wanting AMD to make some sort of AI-based solution for them.
 
Joined
Mar 21, 2016
Messages
2,324 (0.77/day)
Well, that's both good and bad, I guess. It's bad because the days of talented developers are over, and good because other types of artists, who may not be so talented at code, can have access to game development. As for the end product, we'll see. This is all hypothetical, of course.

I see it more as more people being able to excel in more fields in ways they might not have otherwise. AI can take a lot of the time constraints out of learning how to excel at a lot of specific fields of expertise. It might not outright replace the experts in those fields, but it can certainly supplement a lot of them, which really isn't a bad thing. It's kind of a jack-of-all-trades scenario, like with a bard class: they don't exactly excel at much of anything, but are fairly satisfactory at pretty much everything at the same time.

I think in the long term it will be a net benefit as the tools get better and better. The average AAA game takes 4-5 years to develop, and that is just too long for most games to actually be profitable at current pricing without BS GaaS mechanics tied in, so my hope is that once the tools get significantly better, development can be cut down to 3-4 years, making new concepts possibly less risky for publishers. That is probably a best-case scenario, and the reality probably won't be so great.

There will always be indie stuff and smaller projects on PC; I don't think that will be affected.

My biggest issue with this AI NPC thing and AI texture thing is it being locked into one ecosystem. I am OK with upscalers and/or frame generation being locked to specific hardware, assuming that makes them better, but developers having to make basically two different versions of a game to accommodate one hardware set is something I will never be on board with, regardless.... Now, if both future consoles support something similar, sure, and that sounds like it could be a thing, with both Sony and Microsoft seemingly wanting AMD to make some sort of AI-based solution for them.

Chances are developers will treat it more like an extension layer to games: in addition to the standard design, you might have AI options that can be tapped into to do whatever. Those features might be in-game or out-of-game in part, much like other mods for games.
 
Joined
Sep 4, 2022
Messages
220 (0.33/day)
I got to experience the AI NPC at the Nvidia booth at CES 2024. It was wild. You could say anything you wanted and the NPC in Cyberpunk 2077 would naturally talk back to you. Imagine if side quests were generated from real-time conversations.
Had to find my old post about LLM in Unity
Imo

Are unscripted conversations with NPCs powered by an LLM really what gamers want?

Definitely not in its current form. I want to see the final product. The NPCs should be far from sounding like AI bots and at least attempt to sound natural if they want gamers to accept it; less monotone and more expression, like MetaHuman had. The expression should match the topic and not be overdone. The AI has a disconnect with the feeling that the artist wants the user to experience. This is why generative AI will still need a human artist to paint a more palatable conversation that is relatable and enhances your senses. The target demographic is humans, who feel and need a stimulus that currently only a creative human artist can hone to perfection.
If this were presented as-is, I would reject it. Does it have potential as a tool in the right artist's hands? Yes, I'm optimistic it does.
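To make the "side quests generated from real-time conversations" idea a bit more concrete, here's a very rough, hypothetical Python sketch of an LLM-driven NPC loop. The `call_llm` helper and the "QUEST:" tag convention are made up purely for illustration; nothing here reflects how Nvidia's actual demo works.

```python
# Hypothetical sketch of an LLM-driven NPC loop with conversation-derived
# side quests. `call_llm` is a stand-in for whatever chat backend a game
# would really use (local model, cloud API, etc.).

def call_llm(messages):
    """Placeholder chat backend: returns a canned reply so the sketch runs."""
    return ("The dockside guards have been acting strange lately...\n"
            "QUEST: Find out who is paying off the dockside guards.")

def npc_conversation(npc_name, persona, player_lines):
    # The system prompt pins the NPC to its persona and asks it to flag
    # quest hooks in a machine-readable way.
    messages = [{
        "role": "system",
        "content": (
            f"You are {npc_name}, {persona}. Stay in character. "
            "If the player agrees to help with a task, end your reply with "
            "a line starting with 'QUEST:' that describes the task."
        ),
    }]
    quests = []
    for line in player_lines:
        messages.append({"role": "user", "content": line})
        reply = call_llm(messages)
        messages.append({"role": "assistant", "content": reply})
        print(f"{npc_name}: {reply}")
        # Naive quest extraction; a real game would validate this against
        # actual world state before spawning anything.
        for text_line in reply.splitlines():
            if text_line.startswith("QUEST:"):
                quests.append(text_line[len("QUEST:"):].strip())
    return quests

# Example (hypothetical names):
# quests = npc_conversation("Rogue Fixer", "a jaded fixer in Night City",
#                           ["Heard any rumours?", "Sure, I'll look into it."])
```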
 
Joined
Aug 3, 2006
Messages
99 (0.02/day)
Location
San Antonio, TX
System Name Geil
Processor Ryzen 6900HS
Memory 16GB DDR5
Video Card(s) Radeon 6700S
Storage 1TB SSD
Display(s) 120Hz 2560x1600
Software Windows 11 home premium
Don't you already have to log into GeForce software to use your graphics card? I bet DLSS 6.0 will be subscription-based and their fans will call it a feature.
 
Joined
Sep 4, 2022
Messages
220 (0.33/day)
Don't you already have to log into GeForce software to use your graphics card? I bet DLSS 6.0 will be subscription-based and their fans will call it a feature.
The new Nvidia app doesn't require a login and is Nvidia's attempt to phase out the Nvidia Control Panel and GeForce Experience (which requires the login). The login is there as a choice, and you've been able to update the driver with a clean install without logging in for the past quarter.
 
Joined
Sep 10, 2018
Messages
5,857 (2.76/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
Don't you already have to log into GeForce software to use your graphics card? I bet DLSS 6.0 will be subscription-based and their fans will call it a feature.

You've never needed to use GeForce Experience to use an Nvidia GPU; there has been a driver-only option for ages, and prior to that you could just do a custom install and unselect it.
 
Joined
Nov 11, 2016
Messages
3,205 (1.15/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) LG OLED CX48"
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Viper Ultimate
Keyboard Corsair K75
Software win11
It seems DLAA is on the right, where there is more detail in the textures all around the image. When using DLAA in Vermintide 2, I noticed that the textures subjectively have more resolution than with temporal anti-aliasing, but at the cost of 50 watts of additional power. I prefer DLAA if the hardware allows a playable frame rate.

LOL, no one is scared of new technology; some are just trying to start a conversation about the inevitable direction of Nvidia's RTX lineup, especially when they said DLSS will eventually get to version 10.
What I am excited about is the potential of 3D textures with this technology. Huang mentioned textures and objects, meaning the textures don't have to be flat and can have added geometry.
Also, who do you think will be splitting the bill initially? It's us, the enthusiast beta testers, who make way for tomorrow's standards. :cool::rockout:

You got tricked by the Sharpen+ filter :cool: (it's pretty good at adding texture). Check the FPS counter: the DLSS Balanced run is at ~120 FPS while DLAA is only at ~80 FPS.
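For context on why sharpening "adds" texture: an unsharp mask just amplifies the local contrast that a blur removes, which makes fine detail pop without adding any real information. A minimal sketch with Pillow and NumPy (a generic unsharp mask; this is not how Nvidia's Sharpen+ filter is actually implemented):

```python
# Minimal unsharp-mask sketch: sharpened = original + amount * (original - blurred).
# Generic illustration of why sharpening makes detail "pop"; not Nvidia's filter.
import numpy as np
from PIL import Image, ImageFilter

def unsharp_mask(path, radius=2.0, amount=0.8):
    img = Image.open(path).convert("RGB")
    original = np.asarray(img, dtype=np.float32)
    blurred = np.asarray(img.filter(ImageFilter.GaussianBlur(radius)),
                         dtype=np.float32)
    detail = original - blurred          # high-frequency content the blur removed
    sharpened = np.clip(original + amount * detail, 0, 255).astype(np.uint8)
    return Image.fromarray(sharpened)

# unsharp_mask("screenshot.png").save("screenshot_sharpened.png")
```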

AMD just announced their own neural texture block compression too. With RX 7000 having some AI cores built in, I guess it will be locked to RX 7000 and later.
 
Joined
Jan 14, 2019
Messages
10,496 (5.26/day)
Location
Midlands, UK
System Name Holiday Season Budget Computer (HSBC)
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 6500 XT 4 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
My biggest issue with this AI NPC thing and AI texture thing is it being locked into one ecosystem
It reminds me... if AI is such a generic thing that all three chip vendors had to jump on it, then why does the software stack that uses it (DLSS, XeSS) have to be vendor-specific? What's so vendor-specific about tensor cores anyway? Isn't it just an overgrown NPU integrated into a GPU? Anyway, I digress.
 
Joined
Jan 8, 2024
Messages
106 (0.61/day)
It reminds me... if AI is such a generic thing that all three chip vendors had to jump on it, then why does the software stack that uses it (DLSS, XeSS) have to be vendor-specific? What's so vendor-specific about tensor cores anyway? Isn't it just an overgrown NPU integrated into a GPU? Anyway, I digress.

Nothing too special. Tensor cores are designed to be more efficient at a specific set of operations. You can technically run similar code on a GPU shader core or a CPU core, but the performance or quality may suffer. In the future, all the vendor-exclusive optimizations will be under the hood.
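Just to illustrate the "under the hood" part, here's a rough sketch (assuming PyTorch and an RTX-class card; not a proper benchmark). The same matrix multiply can run on a CPU core or on the GPU, where FP16 matmuls are dispatched to the tensor cores by the math library; nothing in the Python code itself is vendor-specific.

```python
# Rough sketch: the same matmul on CPU vs. GPU. On RTX cards, FP16 matmuls
# like this get routed to the tensor cores by the backend library;
# the code here stays the same regardless of vendor.
import time
import torch

def time_matmul(a, b, device, sync=False):
    a, b = a.to(device), b.to(device)
    if sync:
        torch.cuda.synchronize()
    start = time.perf_counter()
    c = a @ b
    if sync:
        torch.cuda.synchronize()  # wait for the GPU to actually finish
    return c, time.perf_counter() - start

n = 4096
x = torch.randn(n, n, dtype=torch.float16)
y = torch.randn(n, n, dtype=torch.float16)

_, cpu_s = time_matmul(x.float(), y.float(), "cpu")       # plain FP32 on the CPU
if torch.cuda.is_available():
    _, gpu_s = time_matmul(x, y, "cuda", sync=True)       # FP16 -> tensor cores
    print(f"CPU (FP32): {cpu_s:.3f}s   GPU (FP16): {gpu_s:.3f}s")
else:
    print(f"CPU (FP32): {cpu_s:.3f}s   (no CUDA device found)")
```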
 
Joined
Mar 21, 2016
Messages
2,324 (0.77/day)
As far as textures and geometry go, there are already ways to convert images to geometry, which has enormous implications for games and 3D modeling. It's really awesome technology, and I'm certain it will get better and better, and people will come up with interesting ways to tap into it and leverage it. Having ceilings, floors, and walls with actual 3D carvings of images is going to be very cool. Bump mapping is nice, but it only goes so far to add dimension.

We'll probably see developers scale back some of the poly count on a lot of objects and models in favor of applying more of it to the things I mentioned, since those are huge parts of most scenes. Not to mention, just jacking up the poly count doesn't significantly improve things. It can help, but the perceived image uplift has really diminishing returns for the GPU overhead involved, I'd argue. I mean, poly count has gotten easier on newer GPUs over time, but better utilizing the resources available is always going to be more ideal. It really depends in no small part on how photorealistic the developers are aiming to be, though.

I can't wait to see many more games incorporating more detail into scenes, like you'd see in old architectural temples in India that are intricately sculpted and dimensional, because that's really where I see this going: being able to inject that type of scene into games much more easily and quickly overall with the aid of AI. It's going to be really cool to consider what that sort of thing would look like if it were orcish, dwarven, elven, or other standard fantasy themes, and mythology and folklore in general. What carvings would dragons have etched into walls and temples? It's a huge difference having actual dimension versus bump mapping simulating dimensional details.
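For a concrete, non-AI taste of the image-to-geometry idea: treating a grayscale image as a heightmap and displacing a vertex grid with it already gives you real geometry with actual depth for shading and lighting, unlike a bump map. A minimal sketch (assuming NumPy and Pillow; it just writes a Wavefront OBJ, no particular engine or AI model implied):

```python
# Minimal sketch: turn a grayscale image into real displaced geometry by
# using pixel brightness as height, then write a Wavefront .obj file.
# Unlike bump/normal mapping, the resulting surface has actual depth.
import numpy as np
from PIL import Image

def heightmap_to_obj(image_path, obj_path, height_scale=0.1, step=4):
    # `step` skips pixels to keep the vertex count manageable.
    gray = np.asarray(Image.open(image_path).convert("L"), dtype=np.float32) / 255.0
    gray = gray[::step, ::step]
    rows, cols = gray.shape

    with open(obj_path, "w") as f:
        # One vertex per sampled pixel; brightness becomes the height offset.
        for r in range(rows):
            for c in range(cols):
                f.write(f"v {c} {r} {gray[r, c] * height_scale * max(rows, cols)}\n")
        # Two triangles per grid cell (OBJ indices are 1-based).
        for r in range(rows - 1):
            for c in range(cols - 1):
                i = r * cols + c + 1
                f.write(f"f {i} {i + 1} {i + cols}\n")
                f.write(f"f {i + 1} {i + cols + 1} {i + cols}\n")

# heightmap_to_obj("carving.png", "carving.obj")  # hypothetical input/output names
```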
 
Joined
Jan 14, 2019
Messages
10,496 (5.26/day)
Location
Midlands, UK
System Name Holiday Season Budget Computer (HSBC)
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 6500 XT 4 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
As far as textures and geometry go, there are already ways to convert images to geometry, which has enormous implications for games and 3D modeling. It's really awesome technology, and I'm certain it will get better and better, and people will come up with interesting ways to tap into it and leverage it. Having ceilings, floors, and walls with actual 3D carvings of images is going to be very cool. Bump mapping is nice, but it only goes so far to add dimension.

We'll probably see developers scale back some of the poly count on a lot of objects and models in favor of applying more of it to the things I mentioned, since those are huge parts of most scenes. Not to mention, just jacking up the poly count doesn't significantly improve things. It can help, but the perceived image uplift has really diminishing returns for the GPU overhead involved, I'd argue. I mean, poly count has gotten easier on newer GPUs over time, but better utilizing the resources available is always going to be more ideal. It really depends in no small part on how photorealistic the developers are aiming to be, though.

I can't wait to see many more games incorporating more detail into scenes, like you'd see in old architectural temples in India that are intricately sculpted and dimensional, because that's really where I see this going: being able to inject that type of scene into games much more easily and quickly overall with the aid of AI. It's going to be really cool to consider what that sort of thing would look like if it were orcish, dwarven, elven, or other standard fantasy themes, and mythology and folklore in general. What carvings would dragons have etched into walls and temples? It's a huge difference having actual dimension versus bump mapping simulating dimensional details.
Isn't that already done effectively with tessellation?
 
Joined
Jan 29, 2023
Messages
932 (1.79/day)
System Name KLM
Processor 7800X3D
Motherboard B-650E-E Strix
Cooling Arctic Cooling III 280
Memory 16x2 Fury Renegade 6000-32
Video Card(s) 4070-ti PNY
Storage 512+512+1+2+2+2+2+6+500+256+4+4+4
Display(s) VA 32" 4K@60 - OLED 27" 2K@240
Case 4000D Airflow
Audio Device(s) Edifier 1280Ts
Power Supply Shift 1000
Mouse 502 Hero
Keyboard K68
Software EMDB
Benchmark Scores 0>1000
All that has been the case since 3.7.10; 3.7.0 and 3.7.1 didn't have it. The version jumped from .1 to .10, though.
 
Joined
Mar 21, 2016
Messages
2,324 (0.77/day)
I'm saying you can take things like this and create very intricate, detailed 3D renders of 2D images. I mean, that's just a basic example case, really. That could easily be made into a really cool tile for a wall, floor, or ceiling, and it looks a lot better than bump mapping. It has real depth to it for shading and lighting. You can get lower poly or higher poly with it as well with some simple tricks. It's pretty awesome; the implications are massive. Really, it would be a great way to construct detailed 3-dimensional faces for any object in reality. You could tweak that as a face for a pillar and have this really dope-looking pillar with a lot of detail and dimension baked into it. Now, if you tried to model that by hand exactly as it looks, it would take quite a bit more painstaking time to do so. You can vary it reasonably well and easily too, so it's quite neat.


There is other cool Blender stuff I've seen as well, like this Minecraft example that's honestly mind-blowing. I think more can be done around the concept; you don't just need to use blocks, for example, or Minecraft blocks. It feels like that whole concept could become a very intricately advanced means to construct scenes dynamically and quickly around a handful of assets.

 
Joined
Jan 14, 2019
Messages
10,496 (5.26/day)
Location
Midlands, UK
System Name Holiday Season Budget Computer (HSBC)
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 6500 XT 4 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
I'm saying you can take things like this and create very intricate, detailed 3D renders of 2D images. I mean, that's just a basic example case, really. That could easily be made into a really cool tile for a wall, floor, or ceiling, and it looks a lot better than bump mapping. It has real depth to it for shading and lighting. You can get lower poly or higher poly with it as well with some simple tricks. It's pretty awesome; the implications are massive. Really, it would be a great way to construct detailed 3-dimensional faces for any object in reality. You could tweak that as a face for a pillar and have this really dope-looking pillar with a lot of detail and dimension baked into it. Now, if you tried to model that by hand exactly as it looks, it would take quite a bit more painstaking time to do so. You can vary it reasonably well and easily too, so it's quite neat.


There is other cool Blender stuff I've seen as well, like this Minecraft example that's honestly mind-blowing. I think more can be done around the concept; you don't just need to use blocks, for example, or Minecraft blocks. It feels like that whole concept could become a very intricately advanced means to construct scenes dynamically and quickly around a handful of assets.

AMD might have a slightly similar answer:
 
Joined
Jun 14, 2020
Messages
2,875 (1.94/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Might just have to do with it being UW 3440x1440, but DLSS looks pretty meh in a lot of games even with the latest DLL. I can only stand 4K in Quality mode, so maybe that's it..... Even Balanced at 4K is barf to me, although my 4K screen is 48 inches, which could be why. I do sit about 5-7 feet back; still doesn't matter.

If using DLSS, I prefer to use DLDSR in combination; then it looks fine.
You are not meant to use DLSS Balanced even at 4K. In screenshots it looks mostly fine, but when you are actually playing, it's a big no-no. Balanced, Performance, etc. are meant as a last resort: it's either that or you can't play the game.
 
Joined
Sep 10, 2018
Messages
5,857 (2.76/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
You are not meant to use DLSS Balanced even at 4K. In screenshots it looks mostly fine, but when you are actually playing, it's a big no-no. Balanced, Performance, etc. are meant as a last resort: it's either that or you can't play the game.

Oh, I know. I personally like DLAA at 1440p UW over 4K Balanced, but everyone's gotta make their own decisions.

Might be OK on a 32-inch 4K monitor, though; the pixel density is probably high enough to hide the issues a bit better than a 48-inch one. I did play around with the 42-inch C3 for a while, and it looked decently crisper than the 48-inch C1 I have; still, DLSS Balanced was a bit too much on it.
 