If it were filling VRAM and eating into system RAM, the fps would drop heavily. The 4060 Ti 8GB and 16GB run about the same.
Game engines have mechanisms to stream assets in. Storing assets in system RAM only becomes an issue when something frequently used gets pushed out there, or when the game frequently has to fetch assets back from it. Otherwise, streaming a single rock texture from system RAM, for example, will not cause any stuttering. The engine naturally keeps the most-accessed assets in VRAM while pushing the least-accessed ones out to system RAM (a rough sketch of that logic is below).
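To make the "most accessed stays in VRAM" idea concrete, here is a minimal C++ sketch of LRU-style residency management under a fixed VRAM budget. The names (`Asset`, `ResidencyManager`, `touch`) are hypothetical, not from any real engine; actual streamers also weigh priorities, per-frame usage, and transfer bandwidth, but the eviction idea is the same:

```cpp
#include <cstdint>
#include <iostream>
#include <list>
#include <string>
#include <unordered_map>

struct Asset {
    std::string name;
    std::uint64_t sizeBytes;
};

class ResidencyManager {
public:
    explicit ResidencyManager(std::uint64_t vramBudget) : budget_(vramBudget) {}

    // Called whenever the renderer touches an asset. Keeps it in VRAM,
    // evicting the least-recently-used assets to system RAM if needed.
    void touch(const Asset& asset) {
        auto it = lru_.find(asset.name);
        if (it != lru_.end()) {
            // Already resident: just move it to the front (most recent).
            order_.splice(order_.begin(), order_, it->second);
            return;
        }
        // Evict from the back (least recently used) until the asset fits.
        while (used_ + asset.sizeBytes > budget_ && !order_.empty()) {
            const Asset& victim = order_.back();
            std::cout << "evict to system RAM: " << victim.name << "\n";
            used_ -= victim.sizeBytes;
            lru_.erase(victim.name);
            order_.pop_back();
        }
        std::cout << "stream into VRAM: " << asset.name << "\n";
        order_.push_front(asset);
        lru_[asset.name] = order_.begin();
        used_ += asset.sizeBytes;
    }

private:
    std::uint64_t budget_;
    std::uint64_t used_ = 0;
    std::list<Asset> order_;  // most recently used at the front
    std::unordered_map<std::string, std::list<Asset>::iterator> lru_;
};

int main() {
    ResidencyManager vram(100);  // pretend 100-byte VRAM budget
    vram.touch({"hero_texture", 60});
    vram.touch({"rock_texture", 30});
    vram.touch({"hero_texture", 60});   // re-touch: stays hot in VRAM
    vram.touch({"foliage_atlas", 40});  // forces rock_texture out to RAM
}
```

Note how the hot asset (the hero texture) never leaves VRAM; only the rarely-touched rock gets demoted, which is why a single overflow asset does not cause constant stuttering.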
Most games do not start stuttering until roughly 25-30% over the card's VRAM capacity. We know this because channels like HWUB have demonstrated time and time again that when a game needs more memory than the card provides, main system memory usage rises to absorb the overflow. Some games also dynamically adjust texture quality or use texture swapping to keep things running smoothly, but these are not a cure-all and are typically used with caution, as they can come with very nasty side effects on cards without enough VRAM. In the case of a game silently using lower-quality textures, it is essentially lying to you about what texture settings you are running, which misleads anyone looking at VRAM utilization for such a game. You haven't lowered VRAM usage in that scenario, you've just lowered quality (a sketch of how that works is below).
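Here is a hedged C++ sketch of what that silent downgrade can look like: dropping the top mip level of a texture chain until it fits the available headroom. The pressure heuristic and all names are assumptions for illustration, and engines implement this in many different ways; the point is that the menu still says "Ultra" while the resident data is effectively a lower tier:

```cpp
#include <cstdint>
#include <iostream>

// Approximate size of a mip chain starting at the given top dimensions
// (4 bytes per texel; each mip is ~1/4 of the previous one).
std::uint64_t mipChainBytes(std::uint32_t w, std::uint32_t h) {
    std::uint64_t total = 0;
    while (w >= 1 && h >= 1) {
        total += std::uint64_t(w) * h * 4;
        if (w == 1 && h == 1) break;
        w = w > 1 ? w / 2 : 1;
        h = h > 1 ? h / 2 : 1;
    }
    return total;
}

int main() {
    std::uint32_t w = 4096, h = 4096;  // "Ultra" texture picked in the menu
    int droppedMips = 0;

    std::uint64_t vramFree = 40u * 1024u * 1024u;  // pretend headroom: 40 MiB

    // Under pressure, quietly drop the top mip until the chain fits.
    while (mipChainBytes(w, h) > vramFree && w > 1 && h > 1) {
        w /= 2;
        h /= 2;
        ++droppedMips;
    }

    std::cout << "menu setting: Ultra (4096x4096)\n"
              << "actually resident: " << w << "x" << h
              << " (" << droppedMips << " mip(s) dropped, "
              << mipChainBytes(w, h) / (1024.0 * 1024.0) << " MiB)\n";
}
```

A 4096x4096 chain is roughly 85 MiB here; one dropped mip cuts that to about 21 MiB, which is exactly why the reported VRAM usage looks fine while the image quality quietly isn't.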
In short, it's not as black and white as "game performance dips the instant you exceed the VRAM allowance"; you need to exceed it significantly before there is an impact. Gamers would be up in arms if it worked the way you seem to believe it does. I reiterate my point that TPU should show total system memory usage as well, particularly for lower-end cards. This matters for people with 16GB of system memory, as opposed to the 32GB used in the test bench.