Omniverse Machinima
This is perhaps the most impressive software-side announcement by NVIDIA alongside Ampere. A few years ago, Ansel introduced a way for people to freeze their game and take pictures of their in-game content, including modifying the scene with custom lighting and filters. Omniverse Machinima is essentially Ansel for 3D animated filmmaking. The app lets you use the in-game assets of a compatible game to create movies. The possibilities are endless: maybe a DOOM Eternal fan can finally make cutscenes out of Codex lore using in-game assets, or a Star Trek fan can build cutscenes from STO assets.
NVIDIA believes that Omniverse Machinima is the first step toward democratizing 3D animated films the way YouTube democratized video, blogs democratized writing, and Spotify music.
Real-Time Raytraced Global Illumination
NVIDIA has a long history of providing SDKs to game developers to simplify common tasks. In the past, GameWorks addressed topics such as lighting, temporal anti-aliasing, hair rendering, and fluid simulation.
Now, NVIDIA is offering a complete solution for global illumination, which is the holy grail of lighting in computer-generated graphics. GI performs a physically correct simulation of light and is able to handle many shortcomings of traditional lighting techniques—it's much more hardware intensive, though.
To use RTXGI, developers have to place light probes throughout their environments, which are used to record and analyze how light travels through empty space in the scene. When rendering, these (few) light probes provide lighting data for objects that are physically close to them, while taking into account bouncing light and properly mixing the colors.
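The "blend lighting from nearby probes" idea can be illustrated with a small sketch. This is a simplification, not the RTXGI algorithm: real probes store directional irradiance data and visibility information, whereas the hypothetical `sample_lighting` function below assumes each probe holds a single RGB irradiance value and blends nearby probes by inverse distance.

```python
import math

def sample_lighting(position, probes, radius=5.0):
    """Blend irradiance from probes near `position`, weighted by
    inverse distance, so closer probes contribute more.
    `probes` is a list of (probe_position, irradiance_rgb) tuples."""
    total = [0.0, 0.0, 0.0]
    weight_sum = 0.0
    for probe_pos, irradiance in probes:
        d = math.dist(position, probe_pos)
        if d > radius:
            continue  # probe too far away to contribute
        w = 1.0 / (d + 1e-4)  # inverse-distance weight; epsilon avoids div by zero
        weight_sum += w
        for i in range(3):
            total[i] += w * irradiance[i]
    if weight_sum == 0.0:
        return (0.0, 0.0, 0.0)  # no probe in range: unlit fallback
    return tuple(c / weight_sum for c in total)

# Two probes: a warm one nearby and a cool one farther away.
probes = [((0.0, 0.0, 0.0), (1.0, 0.8, 0.6)),
          ((4.0, 0.0, 0.0), (0.2, 0.3, 0.9))]
# An object at (1, 0, 0) ends up mostly warm, tinted by the distant cool probe.
print(sample_lighting((1.0, 0.0, 0.0), probes))
```

The key property this demonstrates is that a handful of probes can light any number of objects in between them, which is what keeps the technique cheap relative to tracing rays per pixel.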
While it is obvious that RTXGI will work best on Ampere hardware, NVIDIA made sure the technology will also run on older architectures, even Pascal, which has no native raytracing support. I also see no reason why it wouldn't work on AMD hardware. The beauty is that developers can easily adjust the number of samples per frame to fine-tune image quality to available hardware resources. This means no separate assets or render paths have to be created by the developer: time equals money, after all.
NVIDIA also pointed out that RTXGI runs asynchronously to the render loop, which means you can adjust its simulation rate so it doesn't run with every single frame, but only at fixed time intervals. This further improves performance at the cost of some simulation accuracy that will barely be noticeable during gaming.
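Decoupling the simulation rate from the frame rate can be sketched as a small scheduler. This is an assumed pattern, not NVIDIA's implementation: the hypothetical `GiScheduler` accumulates frame time and triggers a (stand-in for the expensive) probe refresh only once per fixed interval, so a game running at 144 FPS might refresh its GI only 20 times per second.

```python
class GiScheduler:
    """Triggers a GI probe update at a fixed interval rather than
    every rendered frame (hypothetical sketch of rate decoupling)."""

    def __init__(self, interval_ms=50.0):
        self.interval_ms = interval_ms  # e.g. refresh probes every 50 ms
        self.accum_ms = 0.0
        self.updates = 0

    def tick(self, frame_dt_ms):
        """Call once per rendered frame; returns True only when the
        fixed interval has elapsed and a probe update should run."""
        self.accum_ms += frame_dt_ms
        if self.accum_ms >= self.interval_ms:
            self.accum_ms -= self.interval_ms
            self.updates += 1  # placeholder for the actual probe refresh
            return True
        return False

# One second of frames at 144 FPS (~6.94 ms per frame):
sched = GiScheduler(interval_ms=50.0)
refreshes = sum(sched.tick(1000.0 / 144) for _ in range(144))
print(refreshes)  # roughly 20 probe updates instead of 144
```

The interval is the knob NVIDIA described: widen it and the GI lags the scene slightly but costs less per second; tighten it and the simulation tracks lighting changes more closely.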