Monday, June 3rd 2024

NVIDIA Project G-Assist Hands On and Under the Hood

On Sunday, NVIDIA announced Project G-Assist, an AI chatbot for gamers that can be pulled up in the middle of gameplay and asked for help. You could just pause your game and Google for help, but G-Assist can be activated in-game and is situationally aware of what you're playing (e.g. it knows where you're stuck and how to help you out). It won't play the game for you, but it will give you actionable info on how to play or improve. For example, you could ask it how to craft a particular weapon and where to find the required items in-game, and get concise guidance. G-Assist also knows about your graphics card, framerates, and other telemetry data.

We went hands-on with G-Assist and found that it's very capable of doing the things NVIDIA claims, short of playing the game for you. NVIDIA is showing a demo running in ARK Survival, but the impressive part is that there's no G-Assist integration in ARK itself; rather, it runs as an injected overlay that can capture user input. This means that G-Assist can work in ANY game, even without official support. We also learned how G-Assist works under the hood, particularly how the chatbot stays situationally aware of your game, and it's fascinating.

Besides an AI model that recognizes spoken text, G-Assist runs multiple computer vision models to understand what's going on in-game. The first is an OCR model that recognizes text on screen, like mission objectives and NPC names. On top of that, another model recognizes objects in the game, like enemy types. Since it's "seeing" your gameplay, it can match what it sees against its vast pre-trained knowledge base to come up with answers tailored to your exact in-game situation. NVIDIA says that the performance cost of having G-Assist running is very low because, unlike DLSS, the application doesn't sit inside the graphics rendering pipeline; it observes frames the way screen recording or streaming software would, and runs the compute-intensive model operations only after you give it a query to answer. To achieve that, it keeps a log of recent frames and only analyzes them when necessary.
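NVIDIA hasn't published G-Assist's internals, but the deferred-analysis idea is easy to sketch. The Python snippet below is purely illustrative: the FrameContextBuffer class, the eight-frame buffer size, and the ocr_model/detector_model/llm objects are our own assumptions standing in for whatever NVIDIA actually uses.

```python
from collections import deque

class FrameContextBuffer:
    """Keeps the last few captured frames; heavy vision models run only on demand."""

    def __init__(self, max_frames=8):
        # Ring buffer: old frames fall off automatically, so memory stays bounded.
        self.frames = deque(maxlen=max_frames)

    def on_frame_captured(self, frame):
        # Cheap path, runs every frame: just store the pixels, like a screen recorder.
        self.frames.append(frame)

    def answer_query(self, query, ocr_model, detector_model, llm):
        # Expensive path, runs only when the user actually asks something.
        context = []
        for frame in self.frames:
            text = ocr_model.run(frame)          # e.g. mission objectives, NPC names
            objects = detector_model.run(frame)  # e.g. enemy types, items on screen
            context.append({"text": text, "objects": objects})
        # The language model combines on-screen context with its pre-trained game knowledge.
        return llm.generate(query=query, context=context)
```

The important property is that on_frame_captured does no inference at all; the expensive OCR and detection passes only happen inside answer_query, which is consistent with NVIDIA's claim that the overlay costs almost nothing while idle.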

Besides being an assistive AI, it can also take action to apply the right graphics and game settings; for that, it's integrated with GeForce Experience Optimal Settings. For example, NVIDIA demonstrated live in the demo that you can tell it to improve your framerates, either by overclocking or by changing detail settings. You could also ask it to enable DLSS, or to undervolt the GPU. Since it has access to live telemetry from the GPU, you can also request a chart of latency, power usage, or GPU load. NVIDIA made it clear that this is a tech demo designed to show game developers what's possible if they integrate an AI-powered assistant into their games.
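To give a feel for the kind of live telemetry such an assistant can draw on, here is a small sketch that polls GPU load, power draw, and temperature through NVML using the pynvml Python bindings. This is our own example of telemetry sampling, not G-Assist's actual data path; the sampling interval and the shape of the readings are assumptions.

```python
import time
import pynvml  # NVML bindings, e.g. pip install nvidia-ml-py

def sample_gpu_telemetry(samples=10, interval_s=1.0):
    """Poll GPU load, power, and temperature: the kind of data an assistant could chart."""
    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
    readings = []
    try:
        for _ in range(samples):
            util = pynvml.nvmlDeviceGetUtilizationRates(handle)        # percent
            power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # mW -> W
            temp_c = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
            readings.append({"gpu_load": util.gpu, "power_w": power_w, "temp_c": temp_c})
            time.sleep(interval_s)
    finally:
        pynvml.nvmlShutdown()
    return readings
```

Plotting those readings over a play session is then a straightforward exercise, which is roughly what a "chart my power usage" request boils down to.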

27 Comments on NVIDIA Project G-Assist Hands On and Under the Hood

#26
L'Eliminateur
evernessince: Fair enough if you use that when you are stuck and have exhausted your options or when the game is not intuitive enough to give you that information naturally but using it otherwise is just robbing yourself of the experience of playing the game.
Experiences I might not want or care about, or that aren't any good in "that" game. Whether I use it all the time or only when stuck is a "me" issue at best, no different from having two monitors with one open on a forum walkthrough or YT walkthrough.
Posted on Reply
#27
lexluthermiester
I'm going to sidestep the rest of the conversation going on in this thread, as most of it is filled with metaphoric hand-grenades and land-mines.

What I will say is: Not impressed and very much suspicious of the potentials. It's one thing to have a technology assist and help, it's another to be hand-held and subverted.

No Thank You NVidia. You can keep this crap to yourself and off MY PC.
Posted on Reply