News Posts matching #SDK

Bose Introduces the World's First Audio Augmented Reality Platform

This week at SXSW, Bose introduced Bose AR, the world's first audio augmented reality platform, along with glasses to hear - a Bose AR prototype that launches the future of mobile sound. Bose also announced the release schedule for its SDK for developers, manufacturers, and research institutions, along with collaborations currently under way, and venture funding for related start-ups.

Unlike other augmented reality products and platforms, Bose AR doesn't change what you see, but knows what you're looking at - without an integrated lens or phone camera. And rather than superimposing visual objects on the real world, Bose AR adds an audible layer of information and experiences, making every day better, easier, more meaningful, and more productive.

Dell Partners with Meta to Sell Meta 2 Augmented Reality Development Kit

Dell today announced it will be the first authorized reseller of the Meta 2 Augmented Reality Development Kit, equipping commercial companies with the tools needed to more easily innovate and adopt new AR technology applications that can advance their business. In partnership with Meta, Dell aims to make AR more accessible for business deployment, particularly in healthcare, manufacturing and construction, by providing tools for creating immersive experiences unique to the needs of those industries.

Dell is the only technology provider with an end-to-end ecosystem to consume, create and power VR and AR. The new offering with Meta stems from Dell's VR/AR Technology Partner Program, which brings together other innovators in VR and AR to test and collaborate on the best technology solutions for varying applications and experiences. This program allows Dell to help current and potential customers better navigate the new and rapidly evolving VR/AR ecosystem by working with partners to verify and certify the best software and hardware solutions for VR and AR applications - bringing standardization where it is needed most.

HTC Reveals Vive Focus Standalone VR Headset and Vive Wave VR Open Platform

HTC, a pioneer in innovative, smart mobile and virtual reality (VR) technologies, today held its VIVE Developer Conference 2017 (VDC2017), where it announced VIVE WAVE, a VR open platform and toolset that will open up the path to easy mobile VR content development and high-performance device optimization for third-party partners. Twelve hardware partners, namely 360QIKU, Baofengmojing, Coocaa, EmdoorVR, Idealens, iQIYI, Juhaokan, Nubia, Pico, Pimax, Quanta and Thundercomm, announced their support for the integration of Vive Wave as well as the VIVEPORT VR content platform into their future products. Vive Wave is a clear step forward in bringing together the highly fragmented mobile VR market that has grown up in China over the last several years. It saves tremendous effort by allowing developers to create content for a common platform and storefront across disparate hardware vendors. Over 35 Chinese and global content developers have already built VR content optimized for Vive Wave, with 14 showing live demos at the event. HTC also unveiled the VIVE FOCUS, its highly anticipated premium standalone VR headset for the China market, which is also based on the Vive Wave VR open platform.

Creative Launches Aurora Reactive SDK for Sound BlasterX Products

Creative Technology Ltd today announced that it would be launching the Aurora Reactive SDK. This tool would effectively convert the Aurora Reactive Lighting System found on Sound BlasterX products into an open platform, allowing developers the freedom to customize, animate and synchronize its lighting behavior. The 16.8 million color Aurora Reactive Lighting System is currently found on the Sound BlasterX Katana, Vanguard K08, Siege M04, AE-5, and Kratos S5.

The Aurora Reactive SDK is a system with APIs (Application Programming Interfaces) that allow third-party developers to program Creative's Sound BlasterX RGB-enabled hardware. The SDK will come complete with sample code, an API library, and documentation to enable even novice programmers to get started.
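
The kind of per-LED animation such an API exposes can be sketched in a few lines. The function below is purely illustrative - it computes the frames a host application would push to the hardware, and does not use actual Aurora Reactive API names.

```python
import colorsys

def spectrum_frame(num_leds, t, period=5.0):
    """Compute one frame of a spectrum-cycling effect: each LED gets an
    RGB color (0-255 per channel) offset along the hue wheel, with the
    whole pattern shifting over time t (seconds)."""
    frame = []
    for i in range(num_leds):
        hue = ((t / period) + i / num_leds) % 1.0
        r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
        frame.append((int(r * 255), int(g * 255), int(b * 255)))
    return frame

# A host application would send each frame to the device through the
# SDK's API; the call names would come from the Aurora Reactive docs.
frame = spectrum_frame(num_leds=8, t=0.0)
```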

Razer Announces the Wolverine Ultimate Gamepad for PC and Xbox One

Razer, the leading global lifestyle brand for gamers, today announced the officially licensed Razer Wolverine Ultimate gaming controller for Xbox One and PC. The Razer Wolverine Ultimate was designed to adapt itself to any gamer. Two interchangeable D-Pads, a range of interchangeable thumbsticks with different heights and shapes, and a total of 6 remappable triggers and buttons - both via Razer Synapse for Xbox and on-the-fly - provide maximum customizability.

An integrated RGB lighting strip that can be controlled via Razer Synapse for Xbox adds more ways to personalize the controller and introduces Razer Chroma to Xbox gamers everywhere. Gamers can choose from 16.8 million colors and a variety of effects that include Static, Spectrum Cycling, Breathing, Wave and more. Additionally, the Razer Wolverine Ultimate will be the first console product to support the Razer Chroma SDK, allowing developers to integrate advanced lighting capabilities for Xbox One games, and console controllers for next level gaming immersion.

Razer Takes Chroma Lighting Beyond Peripherals with the Hardware Development Kit

Razer, the leading global lifestyle brand for gamers, today announced the Razer Chroma Hardware Development Kit (HDK), the world's most advanced modular lighting system for PC gamers and enthusiasts. Integrated within the Razer Chroma ecosystem, the Chroma HDK offers all-in-one color customization with precise control down to the individual LED.

Users can shape and bend the LED strips to fit virtually any surface to light up an entire room, home or office for total game immersion. The individually controllable lights are integrated into Razer Synapse 3, and are powered by Razer Chroma technology, which unlocks customizable lighting features that can be synced across devices.

NVIDIA Announces OptiX 5.0 SDK - AI-Enhanced Ray Tracing

At SIGGRAPH 2017, NVIDIA introduced the latest version of their AI-based, GPU-enabled ray-tracing OptiX API. The company has been at the forefront of GPU-powered AI endeavors in a number of areas, including facial animation, anti-aliasing, denoising, and light transport. OptiX 5.0 brings a renewed focus on AI-based denoising.

AI training is still a brute-force exercise with finesse applied at the end: NVIDIA took tens of thousands of image pairs, each consisting of a render at one sample per pixel and a companion render of the same scene at 4,000 samples per pixel, and used them to train the AI to predict what a denoised image looks like. Taking the numbers NVIDIA used for its training, this means that in theory, users deploying OptiX 5.0 only need to render one sample per pixel of a given image, instead of the 4,000 samples per pixel that would be needed for its final presentation. Based on its learning, the AI will then fill in the blanks towards finalizing the image, saving the need to render all that extra data. NVIDIA quotes a 157x improvement in render time using a DGX Station with OptiX 5.0 deployed against the same render on a CPU-based platform (2x E5-2699 v4 @ 2.20 GHz). The OptiX 5.0 release also includes provisions for GPU-accelerated motion blur, which should do away with the need to render a frame multiple times and then apply a blur filter through a collage of the different frames. NVIDIA said OptiX 5.0 will be available in November. Check the press release after the break.
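
The quoted figures make the ray-count arithmetic easy to check. A quick sketch using NVIDIA's own sample counts (the frame resolution here is an assumption for illustration):

```python
# Back-of-envelope ray counts behind the OptiX 5.0 denoising claim,
# using the sample counts NVIDIA quoted: a 1 spp draft render stands
# in for a 4,000 spp reference once the denoiser fills in the rest.
width, height = 1920, 1080        # assumed frame size
spp_draft, spp_reference = 1, 4000

rays_draft = width * height * spp_draft
rays_reference = width * height * spp_reference

# The raw ray-tracing workload drops by the spp ratio; the quoted
# 157x render-time speedup also folds in GPU-vs-CPU throughput.
reduction = rays_reference / rays_draft
```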

NVIDIA Releases VRWorks Audio and 360 Video SDKs at GTC

Further planting its roots on the VR SDK and development field, NVIDIA has just announced availability of two more SDK packages, for their VRWorks Audio and 360 Video suites. Now a part of NVIDIA's VRWorks suite of VR solutions, the VRWorks Audio SDK provides real-time ray tracing of audio in virtual environments, and is supported in Epic's Unreal Engine 4 (here's hoping this solution, or other solutions similar to it, address the problem of today's game audio.) The VRWorks 360 Video SDK, on the other hand, may be less interesting for graphics enthusiasts, in that it addresses the complex challenge of real-time video stitching.

Traditional VR audio (and gaming audio, for that matter) provides an accurate 3D position of the audio source within a virtual environment. However, as it is handled today, sound is processed with little regard to anything but the location of its source. With VRWorks Audio, NVIDIA brings to the table considerations for the dimensions and material properties of the physical environment, helping to create a truly immersive environment by modeling sound propagation phenomena such as reflection, refraction and diffraction. This is to be done in real time, at the GPU level. This work leverages NVIDIA's OptiX ray-tracing technology, which allows VRWorks Audio to trace the path of sound in real time, delivering physically accurate audio that reflects the size, shape and material properties of the virtual environment.
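
The geometric core of this kind of audio ray tracing can be illustrated with the classic image-source trick for a single wall reflection. The scene below is a made-up example, not VRWorks code:

```python
import math

# Toy version of what geometric audio ray tracing computes: the extra
# arrival delay of a first-order wall reflection vs. the direct path.
SPEED_OF_SOUND = 343.0  # m/s at room temperature

source = (0.0, 0.0)
listener = (4.0, 0.0)
wall_y = 3.0  # a reflecting wall parallel to the source-listener axis

direct = math.dist(source, listener)

# Mirror the source across the wall: the straight line from the mirror
# image to the listener has the same length as the reflected path.
mirrored = (source[0], 2 * wall_y - source[1])
reflected = math.dist(mirrored, listener)

delay_ms = (reflected - direct) / SPEED_OF_SOUND * 1000
```

A full engine repeats this for many rays and higher-order bounces, attenuating each path by the wall materials it hits - which is exactly the workload OptiX accelerates on the GPU.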

NVIDIA Announces Public Ansel SDK, Developer Plugins

NVIDIA's Ansel, a framework for real-time screenshot filters and photographic effects, has seen the release of a public SDK and a few developer plugins to boot. Unreal Engine and Unity have both gained plugins for the technology, and it is reportedly coming to Amazon's Lumberyard engine as well. This should most assuredly aid the adoption of the technology, as well as open it up to new markets where it was previously unavailable, such as indie game development. The public SDK is presently available for download directly from NVIDIA at developer.nvidia.com/ansel.

NVIDIA Announces DX12 Gameworks Support

NVIDIA has announced DX12 support for their proprietary GameWorks SDK, including some new exclusive effects such as "Flex" and "Flow." Most interestingly, NVIDIA is claiming that simulation effects get a massive boost from Async Compute, nearly doubling performance on a GTX 1080 using that style of effects. Obviously, Async Compute is a DX12-exclusive technology. The performance gains in an area where NVIDIA is normally perceived not to do so well are indeed encouraging, even if only within their exclusive ecosystem. Whether GCN-powered cards will see similar gains when running GameWorks titles remains to be seen.

Shadowplay Now Automagically Records Your Greatest Moments

NVIDIA has announced a new SDK for its products known as Shadowplay Highlights. Shadowplay Highlights augments NVIDIA Shadowplay's existing game recording technology to automatically capture hot moments in your favorite videogame. Whether it's your latest Triple Kill or a particularly daring jump on the race track, if the game engine tells the SDK it's significant, Shadowplay spins up, combining previously recorded gameplay with live recordings, to create a perfect video of your glory moment. You can then edit the footage from within the game and directly upload it to a number of social networks.

The technology includes many options for quality or disk-space saving, and anything in between. Of course, as with all things Shadowplay, the technology will require a GeForce-branded graphics card, as well as support from game developers. A video demonstrating the technology follows after the break.

IBM and NVIDIA Team Up on World's Fastest Deep Learning Enterprise Solution

IBM and NVIDIA today announced collaboration on a new deep learning tool optimized for the latest IBM and NVIDIA technologies to help train computers to think and learn in more human-like ways at a faster pace. Deep learning is a fast growing machine learning method that extracts information by crunching through millions of pieces of data to detect and rank the most important aspects from the data. Publicly supported among leading consumer web and mobile application companies, deep learning is quickly being adopted by more traditional business enterprises.

Deep learning and other artificial intelligence capabilities are being used across a wide range of industry sectors; in banking to advance fraud detection through facial recognition; in automotive for self-driving automobiles and in retail for fully automated call centers with computers that can better understand speech and answer questions.

NVIDIA Releases VRWorks SDK Update for "Pascal"

NVIDIA today released a major update to its VRWorks SDK that enables game developers to implement new VR features introduced by the GeForce "Pascal" graphics processors, taking advantage of the new Simultaneous Multi-Projection Engine (SMP). The two major features introduced are Lens-Matched Shading and Single-Pass Stereo.

Lens-Matched Shading uses SMP to provide substantial performance improvements in pixel shading. The feature improves upon Multi-res Shading by rendering to a surface that more closely approximates the lens-corrected image that is output to the headset display. This avoids the performance cost of rendering many pixels that are discarded during the VR lens-warp post-process. Single-Pass Stereo, on the other hand, removes the need for the GPU to render the geometry and tessellation of a 3D scene twice (once for each eye/viewport), and lets both viewports share one pass of geometry and tessellation, thereby halving the tessellation and vertex-shading workload.
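
The claimed halving of geometry work follows directly from submitting the scene once instead of once per eye. A quick sketch with an assumed per-frame vertex count:

```python
# Illustrative vertex-shading cost for stereo rendering. Classic stereo
# runs the geometry pipeline once per eye; Single-Pass Stereo submits
# geometry once and SMP projects it to both viewports.
vertices_per_frame = 2_000_000  # assumed scene complexity
eyes = 2

classic_stereo = vertices_per_frame * eyes  # two full geometry passes
single_pass = vertices_per_frame * 1        # one shared pass

savings = 1 - single_pass / classic_stereo  # fraction of work avoided
```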

Dell Announces VR-Ready Precision Workstations

Dell today announced new Virtual Reality-ready solutions that feature refined criteria for optimal VR experience, whether consuming or creating VR content. Dell has defined VR-ready solutions by three criteria:
  • Minimum CPU, memory, and graphics requirements to support optimal VR viewing experiences;
  • Graphics drivers that are qualified to work reliably with these solutions; and,
  • Passing performance tests conducted by Dell using test criteria based on HMD (head-mounted display) suppliers, ISVs or 3rd party benchmarks where available.
Working closely with its hardware and software partners, Dell is formalizing its commitment to the future of VR by delivering solutions that are optimized for VR consumption and creation alongside ISV applications for professional customers.

Oculus to Begin Taking Pre-orders for the Oculus Rift CV1 on January 6

Oculus, makers of the popular Oculus Rift VR HMD, announced that it will open the gates for pre-orders for its upcoming Rift CV1 HMD on the 6th of January, 2016, at 08:00 Pacific Time. You'll be able to take it for a spin right out of the box with the bundled games Lucky's Tale and EVE: Valkyrie, two games built almost entirely around VR by leveraging the Oculus SDK.

2016 is shaping up to be the year VR takes off on a big scale, with consumer electronics giants planning to launch their VR headsets, game developers building their games around major VR SDKs, and graphics hardware companies like AMD and NVIDIA making major moves in the VR industry. AMD is sitting on a treasure-chest of IP with its LiquidVR technology, while NVIDIA recently announced a VR-ready certification program.

AMD Counters GameWorks with GPUOpen, Leverages Open-Source

AMD is in no mood to let NVIDIA run away with the PC graphics market with its GameWorks SDK, which speeds up PC graphics development (in turn increasing NVIDIA's influence over game development) in a predominantly AMD GCN-driven client base (20% PC graphics market share, and 100% game console market share). AMD's counter to GameWorks is GPUOpen, with the "open" referring to "open-source."

GPUOpen is a vast set of pre-developed visual-effects, tools, libraries, and SDKs, designed to give developers "unprecedented control" over the GPU, helping them get their software closer to the metal than any other software can. The idea here is that an NVIDIA GameWorks designed title won't get you as "close" to the metal on machines such as the Xbox One and PlayStation 4, or PCs with Radeon GPUs, as GPUOpen. Getting "close to the metal" is defined as directly leveraging features exposed by the GPU, with as few software layers between the app and the hardware as possible.

NVIDIA Releases GeForce 358.50 WHQL Game Ready Driver

NVIDIA released the GeForce 358.50 WHQL drivers, which are "Game Ready" for the Star Wars: Battlefront Open Beta. Open for pre-loading now on Origin for free, the game goes live in a day. In addition, the release includes updated driver support for the GameWorks VR SDK and the OpenGL ARB 2015 extensions, including support for OpenGL ES 3.2 on the desktop. Grab the driver from the links below.
DOWNLOAD: GeForce 358.50 WHQL for Windows 10 64-bit | Windows 10 32-bit | Windows 7/8 64-bit | Windows 7/8 32-bit

AMD Releases Catalyst 15.8 Beta Driver

AMD released the Catalyst 15.8 Beta driver. In addition to an updated display driver, with likely support for newly launched GPUs, the Radeon R9 Nano and R9 370X, the drivers offer DirectX 12 performance optimizations for "Ashes of the Singularity," and optimizations with stability updates for "Batman: Arkham Knight." More importantly, the driver integrates Oculus 0.7 SDK, and fixes a number of game-specific bugs.
DOWNLOAD: AMD Catalyst 15.8 Beta for Windows 10/8.1/7 64-bit | 32-bit

HGST Delivers World's First 10TB Enterprise HDD for Active Archive Applications

Helping the world harness the power of data, HGST, a Western Digital company (NASDAQ: WDC) today announced the first enterprise-class 10TB (terabyte) hard disk drive (HDD) for next-generation active archive applications. The host-managed Ultrastar Archive Ha10 SMR HDD sets a new standard in enabling the world's densest server and storage systems with unprecedented TCO levels. This industry-defining product is the result of combining two complementary technologies - HGST's second generation, field-proven HelioSeal platform and shingled magnetic recording (SMR) - to deliver unmatched storage density and power efficiency, without compromising reliability and performance predictability. With an industry-leading 10TB capacity, the Ultrastar Archive Ha10 gives customers a time-to-market capacity advantage for archival environments and applications where data is sequentially written and randomly read, such as social media, cloud storage, online backup, life sciences as well as media and entertainment.

HGST recognizes SMR as a core technology necessary for driving areal density increases. By overlapping or "shingling" the data tracks on top of each other, higher areal density can be achieved within the same physical footprint. Based on feedback from customers whose data center environments demand predictable performance and control of how data is handled, HGST has implemented a host-managed SMR solution. The sequential-write behavior of host-managed SMR complements active archive workloads.
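
The density gain from shingling can be illustrated with hypothetical track widths (these numbers are made up for illustration, not HGST's actual drive geometry):

```python
# Illustrative effect of shingling on track density. The write head
# lays down a wide track; shingling overlaps the next track onto it,
# leaving only a narrower readable band - so more tracks fit per inch.
write_track_width_nm = 75   # hypothetical full write-track width
exposed_track_nm = 50       # hypothetical band left after overlap

conventional_density = 1.0 / write_track_width_nm  # tracks per nm
shingled_density = 1.0 / exposed_track_nm

gain = shingled_density / conventional_density  # relative track density
```

The trade-off is that overwriting one track disturbs the ones shingled on top of it, which is why writes must be sequential and host-managed.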

Razer Introduces the Firefly Gaming Mousepad with Customizable Lighting FX

Razer, a leader in connected devices and software for gamers, today announced the release of the Razer Firefly, the first hard gaming mouse mat with Razer's Chroma lighting feature. The Chroma feature adds 16.8 million color options and numerous customizable lighting effects to the mouse mat, including reactive, wave and spectrum cycling.

Razer Firefly has lighting along its left, right and bottom borders and can sync up with other Chroma enabled Razer devices for virtually limitless color combinations.
Engineered with all the trusted Razer gaming-grade performance features, the Razer Firefly has a micro-textured finish for the balance between control and speed. The optimized reflective surface ensures mouse movements translate to precise cursor movements and rapid in-game responsiveness.

NVIDIA Frees PhysX Source Code

After Epic's Unreal Engine 4 and Unity 5 game engines went "free," with their source code put up by their makers for anyone to inspect freely, NVIDIA decided to join the bandwagon of showering game developers with technical empowerment by putting up the entire source code of PhysX 3.3.3, including its cloth and destruction physics code, on GitHub. The move to free up the PhysX code appears to be linked to the liberation of Unreal Engine 4's code.

NVIDIA PhysX has been the principal physics component of Unreal-driven game titles for several years now. There's a catch, though: NVIDIA is only freeing the CPU-based implementation of PhysX, and not its GPU-accelerated one, which leverages NVIDIA's proprietary CUDA GPU compute technology. There should still be plenty for game devs and students in the field to chew on. In another interesting development, the PhysX SDK has been expanded from its traditional Windows roots to cover more platforms, namely OS X, Linux, and Android. Find instructions on how to get your hands on the code at the source link.

AMD Announces New LiquidVR Technology

AMD announced an initiative to deliver the best possible VR experience for developers and users through new AMD technologies and partnerships. The first output of AMD's initiative is LiquidVR, a set of innovative technologies focused on enabling exceptional VR content development for AMD hardware, improved comfort in VR applications by facilitating performance, and plug-and-play compatibility with VR headsets. The upcoming LiquidVR SDK makes a number of technologies available which help address obstacles in content, comfort and compatibility that together take the industry a major step closer to true, life-like presence across all VR games, applications, and experiences.

In virtual reality, the concept of 'presence' is described as the perception of being physically present in a simulated, nonphysical world in a way that fully immerses the user. A key obstacle to achieving presence is motion-to-photon latency, the time between when users move their head and when their eyes see an updated image reflecting that new position. Minimizing motion-to-photon latency is critical to achieving both presence and comfort, two key elements of great VR.
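
A rough latency budget makes the constraint concrete. The 90 Hz refresh rate and ~20 ms comfort target below are commonly cited industry figures, not numbers from AMD's announcement:

```python
# Rough motion-to-photon budget at a typical VR refresh rate.
refresh_hz = 90
frame_budget_ms = 1000 / refresh_hz  # time to render one frame (~11.1 ms)

target_ms = 20  # commonly cited comfort threshold for motion-to-photon

# Rendering alone eats most of the budget; what remains must cover
# head tracking, compositing, and display scan-out.
remaining_ms = target_ms - frame_budget_ms
```

Techniques in SDKs like LiquidVR (e.g. late-latching of head-pose data and direct display access) exist precisely to squeeze those remaining milliseconds.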

AMD Releases Mantle Programming Guide and Reference API

AMD announced that it has published the complete 450-page programming guide for its Mantle 3D graphics API, along with the reference API itself. The two can be accessed from here. In the run-up to its GDC 2015 presentation, in a blog post written by the company's top technology exec Raja Koduri, the company said it will talk about the future of Mantle at GDC. The company intends to develop, maintain and support Mantle and its ecosystem, while maintaining that it will participate in the development and support of industry-standard APIs such as DirectX 12 and GLnext (the next major version of OpenGL).

Micron Announces Development of New Parallel Processing Architecture

Micron Technology, Inc., one of the world's leading providers of advanced semiconductor solutions, today announced the development of a fundamentally new computing architecture capable of performing high-speed, comprehensive search and analysis of complex, unstructured data streams.

Micron's Automata Processor (AP) is an accelerator that leverages the intrinsic parallelism of memory and aims to dramatically advance computing capabilities in areas such as bioinformatics, video/image analytics, and network security which pose challenges for conventional processor architectures because of the amount of complex, unstructured data.

AMD Debuts New SDK, Tools and Libraries, for Heterogeneous Computing Developers

AMD kicked off its 2013 Developer Summit (APU13) today, announcing a new unified Software Development Kit (SDK), an improved CodeXL tool suite with added features and support for the latest AMD hardware, and heterogeneous acceleration added to popular open-source libraries. Together, these tools provide a substantial step forward in productivity and ease of use for developers wishing to harness the full power of modern heterogeneous platforms spanning from servers to PCs to handheld devices.

"Developers are essential to our mission of realizing the full potential of modern computing technologies," said Manju Hegde, corporate vice president, Heterogeneous Solutions, AMD. "Enriching the developer experience by harnessing these technologies is a critical part of AMD's mission to accelerate developer adoption."