Raevenlord
News Editor
Further planting its roots in the VR SDK and development field, NVIDIA has just announced the availability of two more SDK packages, for its VRWorks Audio and 360 Video suites. Now part of NVIDIA's VRWorks suite of VR solutions, the VRWorks Audio SDK provides real-time ray tracing of audio in virtual environments, and is supported in Epic's Unreal Engine 4 (here's hoping this solution, or others like it, addresses the problems of today's game audio). The VRWorks 360 Video SDK, on the other hand, may be less interesting for graphics enthusiasts, as it addresses the complex challenge of real-time video stitching.
Traditional VR audio (and gaming audio, for that matter) provides an accurate 3D position for the audio source within a virtual environment. However, as handled today, sound is processed with little regard for anything but the location of the source. With VRWorks Audio, NVIDIA brings to the table the dimensions and material properties of the physical environment, helping create a truly immersive experience by modeling sound propagation phenomena such as reflection, refraction and diffraction. This is done in real time, on the GPU: the work leverages NVIDIA's OptiX ray-tracing technology, which allows VRWorks Audio to trace the path of sound in real time, delivering physically accurate audio that reflects the size, shape and material properties of the virtual environment.
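To give a feel for what "tracing the path of sound" means, here is a minimal conceptual sketch of geometric acoustics — my own illustration, not NVIDIA's implementation or API. It uses the classic image-source method: mirroring the source across a wall gives the length of the reflected path, from which a delay (via the speed of sound) and a gain (inverse-distance spreading plus wall absorption) follow. All names and numbers here are assumptions for illustration.

```python
# Conceptual geometric-acoustics sketch (NOT NVIDIA's VRWorks Audio API).
# Image-source method: mirror the source across a wall plane, so the
# reflected path length equals the distance from the mirrored source to
# the listener. Delay follows from the speed of sound; attenuation
# combines inverse-distance spreading with a wall absorption coefficient.
import math

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C

def path_delay_and_gain(src, listener, wall_x=None, absorption=0.3):
    """Return (delay_s, gain) for the direct path, or for a first-order
    reflection off a wall at x = wall_x when wall_x is given.
    `absorption` is a hypothetical material property in [0, 1]."""
    if wall_x is not None:
        # Mirror the source across the wall plane x = wall_x.
        src = (2 * wall_x - src[0], src[1], src[2])
    dist = math.dist(src, listener)
    delay = dist / SPEED_OF_SOUND
    gain = 1.0 / max(dist, 1.0)           # inverse-distance spreading
    if wall_x is not None:
        gain *= (1.0 - absorption)        # energy lost at the wall
    return delay, gain

src, listener = (0.0, 0.0, 0.0), (10.0, 0.0, 0.0)
direct = path_delay_and_gain(src, listener)
bounce = path_delay_and_gain(src, listener, wall_x=15.0)
# The reflection arrives later and quieter than the direct sound.
```

A real path tracer fires many such rays per source and accumulates higher-order bounces, refraction and diffraction as well, but the delay-plus-attenuation bookkeeping per path is the same idea.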
NVIDIA says that this is, at least for now, the only hardware-accelerated and path-traced audio solution that creates a complete acoustic image of the environment in real time without requiring any "pre-baked" knowledge of the scene. As the scene is loaded by the application, the acoustic model is built and updated on the fly, with audio effect filters being generated and applied on the sound source waveforms. The software release consists of a set of C-APIs for integration into any engine or application, and an integration for Epic's Unreal Engine 4 that's now available on GitHub.
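The "audio effect filters generated and applied on the sound source waveforms" step can be pictured as follows — again a hedged sketch with hypothetical names, not the actual C-API: the traced paths become (delay, gain) arrivals, the arrivals form an impulse response, and that impulse response is convolved with the dry source signal.

```python
# Hedged illustration of per-source audio filtering (names and numbers
# are my own, not VRWorks Audio's API): each traced path contributes an
# arrival (delay in seconds, gain); together they form an impulse
# response (IR) that is convolved with the dry source waveform.

SAMPLE_RATE = 48_000  # assumed sample rate for the illustration

def impulse_response(arrivals, length):
    """Build an IR of `length` samples from (delay_seconds, gain) pairs."""
    ir = [0.0] * length
    for delay_s, gain in arrivals:
        idx = round(delay_s * SAMPLE_RATE)
        if idx < length:
            ir[idx] += gain
    return ir

def convolve(signal, ir):
    """Direct-form convolution: out[n+k] += signal[n] * ir[k]."""
    out = [0.0] * (len(signal) + len(ir) - 1)
    for n, s in enumerate(signal):
        for k, h in enumerate(ir):
            out[n + k] += s * h
    return out

# A single unit impulse played through a two-arrival IR reproduces the
# IR itself: a direct arrival at t = 0 and an echo ~2 ms later at half
# the gain.
arrivals = [(0.0, 1.0), (0.002, 0.5)]
ir = impulse_response(arrivals, length=128)
wet = convolve([1.0], ir)
```

Because the scene geometry is re-traced as it changes, the IR — and hence the filter — can be rebuilt on the fly, which is what distinguishes this approach from "pre-baked" reverb.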
As for the VRWorks 360 Video SDK, it enables real-time 4K 360° video capture, stitching and streaming. Regarding this feature, Kinson Loo, CEO of Z CAM, said that "The fact that NVIDIA manages to stitch 4K 360 stereoscopic video in real time, making livestreaming possible, changes the production pipeline and enables entirely new use cases in VR."
View at TechPowerUp Main Site