Monday, June 3rd 2024
NVIDIA Project G-Assist Hands On and Under the Hood
On Sunday, NVIDIA announced Project G-Assist, an AI chatbot for gamers that can be pulled up in the middle of gameplay and asked for help. You could just pause your game and Google for answers, but G-Assist can be activated in-game and is situationally aware of what you're playing (e.g. it knows where you're stuck and how to help you out). It won't play the game for you, but it will give you actionable information on how to play or improve. For example, you could ask it how to craft a particular weapon and where to find the required items in-game, and get concise guidance. G-Assist also knows about your graphics card, framerates, and other telemetry data.
We went hands-on with G-Assist and found that it's very capable of doing the things NVIDIA claims it can, short of playing the game for you. NVIDIA is showing a demo built around ARK Survival, but the impressive part is that there's no G-Assist integration in ARK itself; rather, it runs as an injected overlay that can capture user input. This means G-Assist can work in ANY game, even without official support.

We also learned how G-Assist works under the hood, particularly how the chatbot stays situationally aware of your game, and it's fascinating. Besides an AI model that recognizes spoken text, it runs multiple computer vision models to understand what's going on in-game. The first is an OCR model that recognizes text on screen, like mission objectives, NPC names, etc. On top of that, another model recognizes objects in the game, like enemy types. Since it's "seeing" your gameplay, it can combine what it observes with its vast pre-trained knowledge to come up with answers tailored to exactly what's happening on your screen. NVIDIA says the performance cost of having G-Assist running is very low: unlike DLSS, the application doesn't sit inside the graphics rendering pipeline; it passively sees frames the way screen recording or streaming software would, and runs the compute-intensive model operations only after you give it a query to answer. To achieve that, it keeps a log of previously captured frames and analyzes them only when necessary.
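NVIDIA hasn't published any of G-Assist's code, but the flow described above (buffer frames cheaply, then run OCR, object detection, and a language model only when the user asks something) can be sketched roughly in Python. Every name below (transcribe, run_ocr, detect_objects, answer_query, GameAssistant) is a hypothetical placeholder standing in for whatever models NVIDIA actually runs, not a real API.

```python
# Rough sketch (not NVIDIA's code) of the "log frames, analyze on demand" flow.
from collections import deque

# --- stand-ins for the real models; each would be a GPU-accelerated network ---
def transcribe(audio) -> str:              # speech-to-text
    return "how do I craft a spear?"

def run_ocr(frame) -> list[str]:           # on-screen text: objectives, NPC names...
    return ["Objective: gather 10 fiber"]

def detect_objects(frame) -> list[str]:    # in-game objects: enemy types, items...
    return ["raptor", "campfire"]

def answer_query(question, text, objects) -> str:  # language model given game context
    return f"Based on what's on screen ({text}, {objects}): ..."

class GameAssistant:
    def __init__(self, max_frames: int = 30):
        # Cheap part: keep a short rolling log of captured frames,
        # the way a screen recorder would, with no per-frame inference.
        self.frame_log = deque(maxlen=max_frames)

    def on_new_frame(self, frame) -> None:
        self.frame_log.append(frame)       # negligible cost while you play

    def on_user_query(self, audio) -> str:
        # Expensive part runs only now, after the user actually asks something.
        question = transcribe(audio)
        recent = list(self.frame_log)[-5:]
        text = [t for f in recent for t in run_ocr(f)]
        objects = [o for f in recent for o in detect_objects(f)]
        return answer_query(question, text, objects)
```

The key design point the sketch illustrates is that the only per-frame work is appending to a ring buffer; all model inference is deferred to query time, which matches NVIDIA's claim about the low performance cost.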
Besides being an assistive AI, it can also take actions to apply the right graphics and game settings. For that, it is integrated with GeForce Experience Optimal Settings. For example (and NVIDIA demonstrated this working live in the demo), you could tell it to improve your framerate, either by overclocking or by changing detail settings. You could also ask it to enable DLSS, or to undervolt the GPU. Since it has access to live telemetry from the GPU, you can also request a chart of latency, power usage, or GPU load. NVIDIA made it clear that this is a tech demo designed to show game developers what's possible if they integrate an AI-powered assistant into their games.
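The telemetry part is the easiest bit to reproduce outside of G-Assist: NVIDIA's NVML library already exposes live power and load readings. Below is a minimal sketch (not G-Assist's code, just the kind of data it has access to) that samples and charts them using the public pynvml bindings; it assumes an NVIDIA GPU and `pip install nvidia-ml-py matplotlib`.

```python
# Sample GPU power draw and load once a second for a minute, then chart them.
import time
import pynvml
import matplotlib.pyplot as plt

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

power_w, load_pct = [], []
for _ in range(60):
    power_w.append(pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0)    # milliwatts -> watts
    load_pct.append(pynvml.nvmlDeviceGetUtilizationRates(gpu).gpu)  # percent
    time.sleep(1)

pynvml.nvmlShutdown()

plt.plot(power_w, label="Power draw (W)")
plt.plot(load_pct, label="GPU load (%)")
plt.xlabel("Seconds")
plt.legend()
plt.show()
```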
27 Comments on NVIDIA Project G-Assist Hands On and Under the Hood
Like:
You could tell it "hey I wish this button performed this combo," and it would configure things accordingly—everything short of playing the game for you.
That’s an effing yikes for competitive integrity of a lot of fighting games, for example. There are moves that are deliberately limited by execution and just putting them on a button when they aren’t SUPPOSED to be is… yeah.
Also the impact on latency, stutter... It will not take off.
Darwin disagrees with this. He really does, I asked him.
Check the last screenshot; the query was "lower power consumption, but make sure my FPS don't drop much below 60".
And in that, has it not also taken knowledge out of your hands? Look, I can do this now. I think we're transitioning into an era where the skillset of the average human is outsourced. This is not a positive development if you ask me. We were already on that path, I recognize that, even prior to AI. But I feel we're crossing a line here, and already-reduced skillsets are painful for people. The gap between people of varying intelligence is also increasing: you either 'get it' or you never will. This also applies to the work people do: it's quickly evolving into a 'have/have not' situation. The have-nots are the ones that rely on prompts. The haves are the ones that control the machines.
The bottom line is that there's a growing group in societies 'that just don't get it anymore', and AI isn't the fix for that; it's part of the problem.
But that's thinking big.
Thinking smaller about this specific example...
Is this really such magic, or could we also just have a slider in the overclocking app that says 'maximum power usage'? The algorithm is there already; we didn't need AI for it (see the sketch after this comment). Games already default you to the graphics settings that fit a desired FPS target, too...
It's very efficient, isn't it, getting the info you need as you go. But is entertainment about efficiency? Perhaps to a segment of the market.
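The 'maximum power usage' slider the commenter describes indeed doesn't require AI; a naive feedback loop covers the "lower power, keep FPS above 60" case. A rough sketch, assuming a hypothetical read_current_fps() source (e.g. an overlay or capture-tool feed), administrator rights, and a board that allows power-limit changes via nvidia-smi:

```python
# Step the GPU power limit down while the framerate stays above the target.
import subprocess
import time

TARGET_FPS = 60
STEP_W = 10                        # lower the limit in 10 W steps
power_limit = 320                  # starting power limit in watts (example value)

def read_current_fps() -> float:
    # Hypothetical stand-in: hook up your own FPS source here.
    raise NotImplementedError("provide an FPS reading")

def set_power_limit(watts: int) -> None:
    # nvidia-smi's power-limit switch; needs admin rights.
    subprocess.run(["nvidia-smi", "-pl", str(watts)], check=True)

while power_limit > 150:           # arbitrary floor
    set_power_limit(power_limit - STEP_W)
    time.sleep(5)                  # let the framerate settle
    if read_current_fps() < TARGET_FPS:
        set_power_limit(power_limit)   # last step hurt FPS: roll back and stop
        break
    power_limit -= STEP_W
```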
You talk about "cheating", but do you realize how long it takes to type the question and get the answer? This will not affect any kind of "eSports" trash, or any PvP for that matter.
I see this as something I want yesterday. I can't even tell you how many times I've had to alt-tab out of a game to find some walkthrough due to poor game design, then constantly tab in and out to get the info I need to accomplish some silly task or objective I don't care about. With this I can get the AI to assist me INSIDE the game.
Think of stupid gather-to-unlock designs where you spend hours hunting "keys"/"pieces" in a maze level (or several), only to reach the door and have the game tell you you're still missing some, with absolutely NO idea which. Or dumb puzzles where the hint was some obscure NPC conversation or an item half the world away that you saw two months ago, if you were lucky, or presented in some cryptic way (Pathfinder: WOTR is egregious; it can be months between my sessions of that game).
Imagine loading a savegame you haven't touched in months or years, having absolutely no clue what you were supposed to do, and asking the chatbot for help prioritizing quests and items based on your map position.
So yeah, I'm 200% hyped for this; it will enrich my single-player and co-op gaming to whole new levels of practicality.
I can also see it being useful in MMORPGs with timed events. "Hey LLM, what timed events are on now and where should I go to take part in them?"
Gaming should be enjoyable, even more so if you're a working adult and gaming time is a precious commodity. As subjective as "enjoyment" is for each player and game type, there are some universal expectations per genre (for example: you might expect rogue-like/lite games to not offer on-demand saves, souls-like games to have punishing difficulty, and survival games to feature lots of crafting), so I see this as a very welcome extra for those cases where a game turns tedious, or you get stuck, or don't have time to pore through 20 forum threads to learn what the current "meta" is.
because it was the first hit in google…)
coming soon to g.assist…
“Open the pod bay door, g.assist" became one of the most quoted game situations of the decade when g.assist responded, "I'm sorry, Dave, I'm afraid I can't do that. This mission is too important for me to allow you to jeopardize it."
get it? "g-spot assist"…
Those types of technologies could actually be useful, but only if they are processed locally and are completely open source. Meaning the AMD alternative, even if severely lacking in some areas, would still be preferable to this.
I can't wait for the AI bubble to burst.