As far as VRAM becoming more important: yes and no. It doesn't really change that equation. If you're using the same resolution textures, nothing really changes, and if you keep the same amount of texture variety within a game scene, nothing changes either. You can simply swap in generated textures at the same resolution and the VRAM requirements shouldn't change much.
You're thinking of the case where the textures are pre-baked. The fact that this is being attached to DLSS indicates it's something that will run in real time. Even a small add-on like a LoRA takes an additional 1-1.5 GB of VRAM, and that would certainly have to sit on top of a smaller base model, since something like SDXL consumes around 16 GB. Mind you, SDXL tops out at 1024 x 1024 (or any resolution with the same number of pixels), so I'd imagine any generated images would have to be upscaled as well.
There is a lot I could say and touch upon, but you can totally create textures with AI these days, just simple ones with a basic AI model, not even a good-quality one like DALL-E 3.
DALL-E 3 has models that range from 7 billion to 70 billion parameters. Just for comparison, SDXL has 3.5 billion. How much processing power it requires to run locally is completely unknown unless you work at OpenAI.
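To put those parameter counts in perspective, here's a rough back-of-envelope sketch (not a measurement) of the VRAM needed just to hold the weights at fp16, i.e. 2 bytes per parameter. It deliberately ignores activations, text encoders, the VAE, and framework overhead, so real usage would be higher:

```python
# Rough back-of-envelope: VRAM needed just to hold model weights in fp16.
# Assumes 2 bytes per parameter; ignores activations, text encoders, VAE,
# and framework overhead, so actual consumption is higher than this.
def weights_vram_gb(params_billions: float, bytes_per_param: int = 2) -> float:
    return params_billions * 1e9 * bytes_per_param / 1024**3

for name, params in [("SDXL (3.5B)", 3.5), ("7B model", 7.0), ("70B model", 70.0)]:
    print(f"{name}: ~{weights_vram_gb(params):.1f} GB for weights alone")
# SDXL (3.5B): ~6.5 GB, 7B: ~13.0 GB, 70B: ~130.4 GB
```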
If you can generate textures locally on the device, you won't need to stream as much data from the host. The freed-up bandwidth can then be used for more assets, which can in turn be multiplied using AI. The more you buy...
It takes vastly longer to generate an image than it does to stream it from your HDD, or even from a server anywhere in the world. Generating a 1024x1024 image at 40 steps with SDXL takes my 4090 about 7 seconds, and that's using 100% of the GPU as well.
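For anyone who wants to check that kind of number on their own card, here's a minimal sketch using the diffusers library and the public SDXL base checkpoint. The prompt is just a made-up texture example, and timings will obviously vary with hardware and settings:

```python
# Minimal sketch: time a single 1024x1024, 40-step SDXL generation with diffusers.
# Assumes a CUDA GPU with enough VRAM for the fp16 SDXL base weights.
import time
import torch
from diffusers import StableDiffusionXLPipeline

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
).to("cuda")

start = time.perf_counter()
image = pipe(
    "seamless mossy stone wall texture",  # hypothetical texture prompt
    height=1024, width=1024, num_inference_steps=40,
).images[0]
print(f"Generated in {time.perf_counter() - start:.1f} s")
image.save("stone_wall.png")
```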
How much latency would this create, or would a local LLM be held in the GPU's memory for textures? Really interesting to think about.
Far, far, far too much. It's not remotely feasible to do in real time right now, and it won't be for at least the next four GPU generations.
I got to experience the AI NPC at the Nvidia booth at CES 2024. It was wild. You could say anything you wanted and the NPC in Cyberpunk 2077 would naturally talk back to you. Imagine if side quests were generated from real-time conversations.
The problem with entirely AI-generated quests is that they will all be meaningless filler. The AI will generate a generic situation with an associated quest. It'll reset the dungeon if you've already done it and spawn in NPCs with generic gear according to the theme. In other words, essentially Bethesda's Radiant quest system with extra flexibility.
That's using it the wrong way IMO. Instead, they should use that level of interaction to trigger hand-crafted quests. This lets players interact more naturally with NPCs while still keeping the content quality high.
I think you're missing the whole point. You generate the textures and swap them in place of the original textures in game. It doesn't matter if there is some latency while the texture is being created; once it exists there's no latency at all, because at that point it's saved to storage or held in VRAM (probably storage if you want to keep it). It's not overcomplicated. The same goes for objects, NPCs, dialog, animations, effects, sound, code, and a host of other possibilities.

It's only a matter of time before we have AI games we can build from the ground up to share and connect with others. People are grossly underestimating the potential of AI for gaming. The building blocks are already largely in place for much of it, though game engines will have to incorporate them into their design, if they haven't already begun to, to really make the process seamless for the end user.
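The flow being described could look something like the sketch below: serve the shipped texture until an AI-generated variant exists on disk, and only kick off generation in the background, once per texture. This is purely illustrative; generate_texture() is a stand-in for whatever model call would actually produce the image, and no real engine API is implied:

```python
# Hypothetical "generate once, then reuse" texture cache.
# Falls back to the original asset until the generated file lands on disk.
import threading
from pathlib import Path

CACHE_DIR = Path("generated_textures")
CACHE_DIR.mkdir(exist_ok=True)
_in_flight: set[str] = set()
_lock = threading.Lock()

def _worker(name: str, out_path: Path, generate_texture) -> None:
    try:
        generate_texture(name, out_path)  # writes the finished image to out_path
    finally:
        with _lock:
            _in_flight.discard(name)

def get_texture(name: str, original_path: Path, generate_texture) -> Path:
    """Return the cached AI texture if it already exists, otherwise the
    original asset, starting background generation at most once."""
    cached = CACHE_DIR / f"{name}.png"
    if cached.exists():
        return cached
    with _lock:
        if name not in _in_flight:
            _in_flight.add(name)
            threading.Thread(
                target=_worker, args=(name, cached, generate_texture), daemon=True
            ).start()
    return original_path  # keep using the shipped texture until then
```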
Latency is always important in games. People complain about games that compile shaders during gameplay, and that's why most games don't do that anymore. What you are proposing is ramping that up an insane amount and also creating a lot of write amplification on consumer SSDs that aren't designed for a ton of writes. A 4090 takes 7 seconds to generate a 1024 x 1024 image using 100% of the GPU. You'd be waiting minutes after the initial load-in before your screen stops freezing, and then it would freeze each and every time a new texture was being generated. Suffice it to say, even if it happens only once per texture, it's completely unacceptable. The alternative is to restrict how much of the GPU the AI could use, but that's not feasible for AI textures because they are dynamically generated and by extension no placeholder texture would exist. You couldn't just use a generic texture for everything currently being generated either; that would look horrendous.
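The "waiting minutes" part is simple arithmetic at ~7 seconds per texture. The per-scene texture counts below are illustrative guesses, not measurements from any real game:

```python
# Back-of-envelope: time to generate every unique texture in a scene
# at ~7 s each on a fully loaded 4090. Texture counts are hypothetical.
seconds_per_texture = 7
for textures_in_scene in (50, 200, 500):
    total_min = textures_in_scene * seconds_per_texture / 60
    print(f"{textures_in_scene} textures -> ~{total_min:.0f} min of generation")
# 50 -> ~6 min, 200 -> ~23 min, 500 -> ~58 min
```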
It makes far more sense for the dev to just include all the needed textures from the get-go. That ensures the experience players receive is consistent with the dev's vision and that they aren't fighting through a stutterfest.
Forget about generating more than one thing with AI. It's not feasible on a 4090, and it's definitely not feasible on the cards regular consumers have. You are going to need a card with about 30x the performance of a 4090 before that's feasible, and probably something like 80 GB of VRAM to boot, as each AI model is going to require its own memory.
Ever played an open-world survival building game? It's a lot like that, but on serious steroids, with inference thrown into the mix across all the development aspects of the game design. It won't happen immediately, but more and more of it is going to end up happening. There's plenty more I could touch upon, but I've already elaborated more than I really intended to. I think we'll be seeing huge advancements with AI as a creative tool, and they're coming quick. What is possible this year will feel like a bit of a joke compared to what's possible next year or in two years. It might be the same general premise, but the quality will have improved substantially due to more and better hardware combined with better training and inference algorithms.
Yes, AI is a great tool for devs. I'll ask it for character backgrounds for D&D. The output is almost always something cliché, or something I've read before, but it serves as a great base to take and make my own. That's the attitude devs should be going into AI with. Fully AI-generated content just feels like stealing others' IP.