What should happen, then? Would you prefer a sudden drop in performance as the game pulls high-quality textures from system RAM?
I used to watch VRAM allocation on my older 6GB cards in games from 2+ years ago, and frequently I'd see it run right up against the limit without dropping fps or obviously ignoring a texture. That was on Nvidia cards. My AMD card (5600 XT) would drop fps while swapping out, and frankly the experience was pretty bad. However, that was 3-5 years ago. Nowadays the behavior has converged: Nvidia drops frames (though not as badly as the 5600 XT did), AMD doesn't drop nearly as many, and the experience on both is similar.
Much of that comes down to AMD cards having more VRAM these days, which is helpful and which is really the main point: we're not getting enough VRAM to avoid these problems. I got 6GB on my $270 GTX 1060 in 2017, and 7 years later we're getting 8GB for the same price or more. That kinda sucks. For AMD cards, though, I wonder whether they also got their act together and implemented better texture management/compression, maybe via a driver update a year or so ago, IIRC.
But the second part is: has Nvidia's VRAM management gotten worse recently? I've been playing Hogwarts Legacy and Ark SA lately, and low VRAM hinders both games pretty badly even when adjusting other settings gets them running well otherwise. Crappy game design is a reasonable argument, and both are sucktastic in certain areas that *have* to be coding problems, but when turning textures down to Medium barely gets them to scrape by with decent 1% lows, VRAM becomes the single biggest problem on 8GB cards.
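For anyone curious how I watch VRAM allocation while a game runs, here's a minimal sketch using pynvml (assumes an Nvidia card and the nvidia-ml-py package; nvidia-smi or MSI Afterburner will show you the same numbers without any code):

```python
# Rough sketch: print VRAM usage once per second while a game runs.
# Assumes an Nvidia card and the nvidia-ml-py package (pip install nvidia-ml-py).
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        used_gb = mem.used / 1024**3
        total_gb = mem.total / 1024**3
        print(f"VRAM: {used_gb:.2f} / {total_gb:.2f} GB", end="\r")
        time.sleep(1)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```

Keep in mind this shows allocation, not true usage: games and drivers often reserve more than they actively touch, which is exactly why a card pinned at its limit doesn't always stutter.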