News Posts matching #SDK

Razer Announces the Wolverine Ultimate Gamepad for PC and Xbox One

Razer, the leading global lifestyle brand for gamers, today announced the officially licensed Razer Wolverine Ultimate gaming controller for Xbox One and PC. The Razer Wolverine Ultimate was designed to adapt itself to any gamer. Two interchangeable D-Pads, a range of interchangeable thumbsticks with different heights and shapes, and a total of six remappable triggers and buttons - configurable both via Razer Synapse for Xbox and on the fly - provide maximum customizability.

An integrated RGB lighting strip that can be controlled via Razer Synapse for Xbox adds more ways to personalize the controller and introduces Razer Chroma to Xbox gamers everywhere. Gamers can choose from 16.8 million colors and a variety of effects that include Static, Spectrum Cycling, Breathing, Wave and more. Additionally, the Razer Wolverine Ultimate will be the first console product to support the Razer Chroma SDK, allowing developers to integrate advanced lighting capabilities into Xbox One games and console controllers for next-level gaming immersion.
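
As a rough illustration of the kind of effect involved (plain Python, not the Razer Chroma SDK itself; the function name and cycle period are made up), the sketch below computes a spectrum-cycling color by sweeping hue over time. The 16.8 million color figure simply comes from 256 levels per RGB channel.

```python
# Illustrative sketch (not the Razer Chroma SDK): generate a spectrum-cycling
# RGB value by sweeping the hue of an HSV color over time.
import colorsys
import time

def spectrum_color(period_s: float = 5.0) -> tuple[int, int, int]:
    """Return an 8-bit RGB triple whose hue completes one cycle every period_s seconds."""
    hue = (time.time() % period_s) / period_s          # 0.0 .. 1.0
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)       # full saturation and brightness
    return int(r * 255), int(g * 255), int(b * 255)    # 256^3 = ~16.8 million colors

if __name__ == "__main__":
    print(spectrum_color())
```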

Razer Takes Chroma Lighting Beyond Peripherals with the Hardware Development Kit

Razer, the leading global lifestyle brand for gamers, today announced the Razer Chroma Hardware Development Kit (HDK), the world's most advanced modular lighting system for PC gamers and enthusiasts. Integrated within the Razer Chroma ecosystem, the Chroma HDK offers all-in-one color customization with precise control down to the individual LED.

Users can shape and bend the LED strips to fit virtually any surface to light up an entire room, home or office for total game immersion. The individually controllable lights are integrated into Razer Synapse 3, and are powered by Razer Chroma technology, which unlocks customizable lighting features that can be synced across devices.
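
A minimal sketch of what per-LED control looks like in principle, assuming a hypothetical strip of addressable LEDs (this is not Synapse 3 or the HDK's actual interface): each LED gets its own hue offset, so a wave of color appears to travel along the strip.

```python
# Illustrative sketch (not the Chroma HDK API): compute one per-LED frame of a
# "wave" effect, where the hue shifts with LED index and with time.
import colorsys

def wave_frame(num_leds: int, t: float, wavelength: float = 16.0) -> list[tuple[int, int, int]]:
    """One frame of a wave effect at time t (seconds) across num_leds addressable LEDs."""
    frame = []
    for i in range(num_leds):
        hue = ((i / wavelength) + t * 0.2) % 1.0       # spatial offset + slow drift over time
        r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
        frame.append((int(r * 255), int(g * 255), int(b * 255)))
    return frame

print(wave_frame(num_leds=8, t=0.0))
```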

NVIDIA Announces OptiX 5.0 SDK - AI-Enhanced Ray Tracing

At SIGGRAPH 2017, NVIDIA introduced the latest version of OptiX, its GPU-accelerated ray-tracing API. The company has been at the forefront of GPU-powered AI endeavors in a number of areas, including facial animation, anti-aliasing, denoising, and light transport. OptiX 5.0 brings a renewed focus on AI-based denoising.

AI training is still a brute-force affair with finesse applied at the end: NVIDIA took tens of thousands of image pairs, each consisting of a render at one sample per pixel and a companion render of the same scene at 4,000 rays per pixel, and used them to train the AI to predict what a denoised image looks like. In theory (picking up the numbers NVIDIA used for its training), users deploying OptiX 5.0 only need to render one sample per pixel of a given image, instead of the 4,000 rays per pixel that would be needed for its final presentation. Based on its learning, the AI then fills in the blanks to finalize the image, saving the need to render all that extra data.

NVIDIA quotes a 157x improvement in render time using a DGX Station with OptiX 5.0 deployed, against the same render on a CPU-based platform (2x E5-2699 v4 @ 2.20 GHz). The OptiX 5.0 release also includes provisions for GPU-accelerated motion blur, which should do away with the need to render a frame multiple times and then apply a blur filter through a collage of the different frames. NVIDIA said OptiX 5.0 will be available in November. Check the press release after the break.
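
The sketch below is a generic illustration of that training setup, not NVIDIA's actual denoiser: a small convolutional network in PyTorch is fit on pairs of low-sample renders and high-sample references, with random tensors standing in for real image pairs.

```python
# Illustrative sketch (not NVIDIA's OptiX denoiser): train a small convolutional
# network on pairs of (1-spp noisy render, high-spp reference render), which is
# the general approach described above. Random tensors stand in for real images.
import torch
import torch.nn as nn

denoiser = nn.Sequential(
    nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 3, kernel_size=3, padding=1),
)
optimizer = torch.optim.Adam(denoiser.parameters(), lr=1e-4)
loss_fn = nn.L1Loss()

for step in range(100):                      # placeholder for tens of thousands of image pairs
    noisy = torch.rand(4, 3, 64, 64)         # stand-in for 1-sample-per-pixel renders
    reference = torch.rand(4, 3, 64, 64)     # stand-in for ~4,000-rays-per-pixel references
    predicted = denoiser(noisy)
    loss = loss_fn(predicted, reference)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```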

NVIDIA Releases VRWorks Audio and 360 Video SDKs at GTC

Further planting its roots in the VR SDK and development field, NVIDIA has just announced the availability of two more SDK packages, for its VRWorks Audio and 360 Video suites. Now part of NVIDIA's VRWorks suite of VR solutions, the VRWorks Audio SDK provides real-time ray tracing of audio in virtual environments and is supported in Epic's Unreal Engine 4 (here's hoping this solution, or others like it, address the problems of today's game audio). The VRWorks 360 Video SDK, on the other hand, may be less interesting for graphics enthusiasts, in that it addresses the complex challenge of real-time video stitching.

Traditional VR audio (and gaming audio, for that matter) provides an accurate 3D position of the audio source within a virtual environment. However, as it is handled today, sound is processed with little regard to anything but the location of the source. With VRWorks Audio, NVIDIA brings to the table the dimensions and material properties of the physical environment, helping to create a truly immersive experience by modeling sound propagation phenomena such as reflection, refraction and diffraction. This is done in real time, at the GPU level. The work leverages NVIDIA's OptiX ray-tracing technology, which allows VRWorks Audio to trace the path of sound in real time, delivering physically accurate audio that reflects the size, shape and material properties of the virtual environment.
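
A toy model of the idea, with made-up geometry and material values rather than anything from VRWorks Audio: for a single sound path, the energy surviving each reflection and the distance-based propagation delay can be estimated per ray, which is the kind of quantity a ray-traced audio engine aggregates over many paths.

```python
# Illustrative sketch (not VRWorks Audio): estimate the energy and time-of-flight
# of one sound path that bounces off surfaces with given absorption coefficients.
SPEED_OF_SOUND_M_S = 343.0

def path_energy_and_delay(segment_lengths_m, surface_absorption):
    """Energy retained after each reflection, and total delay, for one ray path."""
    energy = 1.0
    for absorption in surface_absorption:       # e.g. ~0.04 for concrete, ~0.6 for heavy curtains
        energy *= (1.0 - absorption)
    delay_s = sum(segment_lengths_m) / SPEED_OF_SOUND_M_S
    return energy, delay_s

# A ray that travels 3 m to a concrete wall, then 4 m to the listener:
print(path_energy_and_delay([3.0, 4.0], [0.04]))
```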

NVIDIA Announces Public Ansel SDK, Developer Plugins

NVIDIA's Ansel, a framework for real-time screenshot filters and photographic effects, has seen the release of a public SDK and a few developer plugins to boot. Unreal Engine and Unity have both gained plugins for the technology, and it is reportedly coming to Amazon's Lumberyard engine as well. This should aid adoption of the technology and open it up to markets where it was previously unavailable, such as indie game development. The public SDK is available for download directly from NVIDIA at developer.nvidia.com/ansel.

NVIDIA Announces DX12 Gameworks Support

NVIDIA has announced DX12 support for their proprietary GameWorks SDK, including some new exclusive effects such as "Flex" and "Flow." Most interestingly, NVIDIA is claiming that simulation effects get a massive boost from Async Compute, nearly doubling performance on a GTX 1080 using that style of effects. Obviously, Async Compute is a DX12 exclusive technology. The performance gains in an area where NVIDIA normally is perceived to not do so well are indeed encouraging, even if only in their exclusive ecosystem. Whether GCN powered cards will see similar gains when running GameWorks titles remains to be seen.

Shadowplay Now Automagically Records Your Greatest Moments

NVIDIA has announced a new SDK for its products known as Shadowplay Highlights. Shadowplay Highlights augments the existing game-recording technology of NVIDIA Shadowplay to automatically capture hot moments in your favorite videogame. Whether it's your latest triple kill or a particularly daring jump on the race track, if the game engine tells the SDK a moment is significant, Shadowplay spins up, combining previously recorded gameplay with live recording to create a perfect video of your glory moment. You can then edit the footage from within the game and upload it directly to a number of social networks.

The technology includes many options for quality or disk-space savings, and anything in between. Of course, as with all things Shadowplay, the technology will certainly require a GeForce-branded graphics card and support from game developers as well. A video demonstrating the technology follows after the break.
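
The general flow can be sketched as a ring buffer of recent frames that is snapshotted when the game engine signals a highlight; the class and event names below are hypothetical, not the actual Shadowplay Highlights SDK.

```python
# Illustrative sketch (hypothetical names, not the Shadowplay Highlights SDK):
# keep a ring buffer of recent frames and snapshot it when the game reports a
# "highlight" event, mirroring the flow described above.
from collections import deque

class HighlightRecorder:
    def __init__(self, buffer_frames: int = 600):          # ~10 s of lead-up at 60 fps
        self.recent = deque(maxlen=buffer_frames)
        self.highlights = []

    def on_frame(self, frame):
        self.recent.append(frame)

    def on_highlight(self, label: str):
        # Keep the buffered lead-up; a real recorder would also capture a few
        # seconds of live footage after the event before closing the clip.
        self.highlights.append((label, list(self.recent)))

recorder = HighlightRecorder()
for i in range(1000):
    recorder.on_frame(f"frame-{i}")
recorder.on_highlight("triple kill")
print(len(recorder.highlights[0][1]))   # 600 buffered frames preceding the event
```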

IBM and NVIDIA Team Up on World's Fastest Deep Learning Enterprise Solution

IBM and NVIDIA today announced collaboration on a new deep learning tool optimized for the latest IBM and NVIDIA technologies to help train computers to think and learn in more human-like ways at a faster pace. Deep learning is a fast growing machine learning method that extracts information by crunching through millions of pieces of data to detect and rank the most important aspects from the data. Publicly supported among leading consumer web and mobile application companies, deep learning is quickly being adopted by more traditional business enterprises.

Deep learning and other artificial intelligence capabilities are being used across a wide range of industry sectors: in banking, to advance fraud detection through facial recognition; in automotive, for self-driving automobiles; and in retail, for fully automated call centers with computers that can better understand speech and answer questions.

NVIDIA Releases VRWorks SDK Update for "Pascal"

NVIDIA today released a major update to its VRWorks SDK that enables game developers to implement new VR features introduced by the GeForce "Pascal" graphics processors, taking advantage of the new Simultaneous Multi-projection Engine (SMP). The two major features introduced are Lens-Matched Shading and Single-Pass Stereo.

Lens-Matched Shading uses SMP to provide substantial performance improvements in pixel shading. The feature improves upon Multi-res Shading by rendering to a surface that more closely approximates the lens-corrected image that is output to the headset display, avoiding the performance cost of rendering many pixels that are discarded during the VR lens-warp post-process. Single-Pass Stereo, on the other hand, removes the need for the GPU to render the geometry and tessellation of a 3D scene twice (once for each eye/viewport) and lets both viewports share one pass of geometry and tessellation, thereby halving the tessellation and vertex-shading workload.
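
Some back-of-the-envelope numbers help here; the pixel counts and the 30% shading reduction below are assumptions for illustration, not NVIDIA figures, while the halved geometry-pass count follows directly from sharing one pass between both eyes.

```python
# Illustrative arithmetic (assumed values, not NVIDIA benchmarks) for the two features.
per_eye_pixels      = 1_512 * 1_680                 # hypothetical oversized render target per eye
lens_matched_pixels = 0.7 * per_eye_pixels          # hypothetical: fewer pixels shaded toward the warped edges

pixel_savings = 1.0 - lens_matched_pixels / per_eye_pixels
geometry_passes_traditional = 2                     # one geometry/tessellation pass per eye
geometry_passes_single_pass = 1                     # both viewports share a single pass

print(f"Pixel-shading work avoided per eye: {pixel_savings:.0%}")
print(f"Geometry/tessellation passes per frame: {geometry_passes_traditional} -> {geometry_passes_single_pass}")
```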

Dell Announces VR-Ready Precision Workstations

Dell today announced new Virtual Reality-ready solutions that feature refined criteria for optimal VR experience, whether consuming or creating VR content. Dell has defined VR-ready solutions by three criteria:
  • Minimum CPU, memory, and graphics requirements to support optimal VR viewing experiences;
  • Graphics drivers that are qualified to work reliably with these solutions; and,
  • Passing performance tests conducted by Dell using test criteria based on HMD (head-mounted display) suppliers, ISVs or 3rd party benchmarks where available.
Working closely with its hardware and software partners, Dell is formalizing its commitment to the future of VR by delivering solutions that are optimized for VR consumption and creation alongside ISV applications for professional customers.

Oculus to Begin Taking Pre-orders for the Oculus Rift CV1 on January 6

Oculus, makers of the popular Oculus Rift VR HMD, announced that it will open the gates for pre-orders of its upcoming Rift CV1 HMD on the 6th of January, 2016, at 08:00 Pacific Time. You'll be able to take it for a spin right out of the box with the bundled games Lucky's Tale and EVE: Valkyrie, two titles built almost entirely around VR by leveraging the Oculus SDK.

2016 is shaping up to be the year VR takes off on a big scale, with consumer electronics giants planning to launch their VR headsets, game developers building their games around major VR SDKs, and graphics hardware companies like AMD and NVIDIA making major moves in the VR industry. AMD is sitting on a treasure chest of IP with its LiquidVR technology, while NVIDIA recently announced a VR-ready certification program.

AMD Counters GameWorks with GPUOpen, Leverages Open-Source

AMD is in no mood to let NVIDIA run away with the PC graphics market on the back of its GameWorks SDK, which speeds up PC graphics development (and in turn increases NVIDIA's influence over game development in a predominantly AMD GCN-driven client base, with 20% PC graphics market share and 100% game console market share). AMD's counter to GameWorks is GPUOpen, with the "open" referring to "open source."

GPUOpen is a vast set of pre-developed visual-effects, tools, libraries, and SDKs, designed to give developers "unprecedented control" over the GPU, helping them get their software closer to the metal than any other software can. The idea here is that an NVIDIA GameWorks designed title won't get you as "close" to the metal on machines such as the Xbox One and PlayStation 4, or PCs with Radeon GPUs, as GPUOpen. Getting "close to the metal" is defined as directly leveraging features exposed by the GPU, with as few software layers between the app and the hardware as possible.

NVIDIA Releases GeForce 358.50 WHQL Game Ready Driver

NVIDIA released the GeForce 358.50 WHQL drivers, which are "Game Ready" for the Star Wars: Battlefront Open Beta. Open for pre-loading now on Origin for free, the game goes live in a day. In addition, the release includes updated driver support for the GameWorks VR SDK and the OpenGL ARB 2015 extensions, which includes support for OpenGL ES 3.2 on the desktop. Grab the driver from the links below.
DOWNLOAD: GeForce 358.50 WHQL for Windows 10 64-bit | Windows 10 32-bit | Windows 7/8 64-bit | Windows 7/8 32-bit

AMD Releases Catalyst 15.8 Beta Driver

AMD released the Catalyst 15.8 Beta driver. In addition to an updated display driver, with likely support for newly launched GPUs, the Radeon R9 Nano and R9 370X, the drivers offer DirectX 12 performance optimizations for "Ashes of the Singularity," and optimizations with stability updates for "Batman: Arkham Knight." More importantly, the driver integrates Oculus 0.7 SDK, and fixes a number of game-specific bugs.
DOWNLOAD: AMD Catalyst 15.8 Beta for Windows 10/8.1/7 64-bit | 32-bit

HGST Delivers World's First 10TB Enterprise HDD for Active Archive Applications

Helping the world harness the power of data, HGST, a Western Digital company (NASDAQ: WDC) today announced the first enterprise-class 10TB (terabyte) hard disk drive (HDD) for next-generation active archive applications. The host-managed Ultrastar Archive Ha10 SMR HDD sets a new standard in enabling the world's densest server and storage systems with unprecedented TCO levels. This industry-defining product is the result of combining two complementary technologies - HGST's second generation, field-proven HelioSeal platform and shingled magnetic recording (SMR) - to deliver unmatched storage density and power efficiency, without compromising reliability and performance predictability. With an industry-leading 10TB capacity, the Ultrastar Archive Ha10 gives customers a time-to-market capacity advantage for archival environments and applications where data is sequentially written and randomly read, such as social media, cloud storage, online backup, life sciences as well as media and entertainment.

HGST recognizes SMR as core technology necessary in driving areal density increases. By overlapping or "shingling" the data tracks on top of each other, higher areal density can be achieved within the same physical footprint. Based on feedback from customers whose data center environments demand predictable performance and control of how data is handled, HGST has implemented a host-managed SMR solution. The sequential write behavior of host-managed SMR complements active archive workloads.
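
A toy model of why host-managed SMR favors those workloads (this is not HGST firmware or a real zoned-block API): a zone only accepts writes at its current write pointer, while reads remain random-access.

```python
# Illustrative sketch (not a real zoned-block interface): a host-managed SMR zone
# that enforces sequential writes at its write pointer but allows random reads,
# matching the sequential-write, random-read archive workloads described above.
class SMRZone:
    def __init__(self, capacity_blocks: int):
        self.blocks = [None] * capacity_blocks
        self.write_pointer = 0

    def write(self, lba: int, data: bytes):
        if lba != self.write_pointer:
            raise ValueError("host-managed SMR: writes must land sequentially at the write pointer")
        self.blocks[lba] = data
        self.write_pointer += 1

    def read(self, lba: int) -> bytes:          # random reads are unrestricted
        return self.blocks[lba]

zone = SMRZone(capacity_blocks=4)
zone.write(0, b"archive-object-0")
zone.write(1, b"archive-object-1")
print(zone.read(0))
```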

Razer Introduces the Firefly Gaming Mousepad with Customizable Lighting FX

Razer, a leader in connected devices and software for gamers, today announced the release of the Razer Firefly, the first hard gaming mouse mat with Razer's Chroma lighting feature. The Chroma feature adds 16.8 million color options and numerous customizable lighting effects to the mouse mat, including reactive, wave and spectrum cycling.

Razer Firefly has lighting along its left, right and bottom borders and can sync up with other Chroma enabled Razer devices for virtually limitless color combinations.
Engineered with all the trusted Razer gaming-grade performance features, the Razer Firefly has a micro-textured finish for a balance between control and speed. The optimized reflective surface ensures mouse movements translate to precise cursor movements and rapid in-game responsiveness.

NVIDIA Frees PhysX Source Code

After Epic's Unreal Engine 4 and Unity 5 game engines went "free," with their source code put up by their makers for anyone to inspect, NVIDIA decided to join the bandwagon of showering game developers with technical empowerment by putting up the entire source code of PhysX 3.3.3, including its cloth and destruction physics code, on GitHub. The move to free the PhysX code appears to be linked to the liberation of the Unreal Engine 4 code.

NVIDIA PhysX has been the principal physics component of Unreal-driven game titles for several years now. There's a catch, though: NVIDIA is only freeing the CPU-based implementation of PhysX, and not its GPU-accelerated one, which leverages NVIDIA's proprietary CUDA GPU compute technology. There should still be plenty for game devs and students in the field to chew on. In another interesting development, the PhysX SDK has been expanded from its traditional Windows roots to cover more platforms, namely OS X, Linux, and Android. Find instructions on how to get your hands on the code at the source link.

AMD Announces New LiquidVR Technology

AMD announced an initiative to deliver the best possible VR experience for developers and users through new AMD technologies and partnerships. The first output of AMD's initiative is LiquidVR, a set of innovative technologies focused on enabling exceptional VR content development for AMD hardware, improved comfort in VR applications by facilitating performance, and plug-and-play compatibility with VR headsets. The upcoming LiquidVR SDK makes a number of technologies available which help address obstacles in content, comfort and compatibility that together take the industry a major step closer to true, life-like presence across all VR games, applications, and experiences.

In virtual reality, the concept of 'presence' is described as the perception of being physically present in a simulated, nonphysical world in a way that fully immerses the user. A key obstacle to achieving presence is motion-to-photon latency, the time between when users move their head and when their eyes see an updated image reflecting that new position. Minimizing motion-to-photon latency is critical to achieving both presence and comfort, two key elements of great VR.
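
Some illustrative arithmetic shows how little headroom that budget leaves; the 20 ms comfort target and 90 Hz refresh rate below are commonly cited industry figures used here as assumptions, not AMD numbers.

```python
# Illustrative arithmetic (assumed figures, not AMD's): tracking, rendering and
# scan-out together must fit inside only one or two display refresh intervals.
target_latency_ms = 20.0          # assumed motion-to-photon comfort target
refresh_hz = 90.0                 # typical consumer VR headset refresh rate
frame_time_ms = 1000.0 / refresh_hz

print(f"Frame interval at {refresh_hz:.0f} Hz: {frame_time_ms:.1f} ms")
print(f"Refresh intervals inside a {target_latency_ms:.0f} ms budget: {target_latency_ms / frame_time_ms:.1f}")
```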

AMD Releases Mantle Programming Guide and Reference API

AMD announced that it published the complete 450-page programming guide for its Mantle 3D graphics API, and the reference API itself. The two can be accessed from here. In the run up to its GDC 2015 presentation, in a blog post written by the company's top technology exec Raja Koduri, the company said it will talk about the future of Mantle in its GDC presentation. The company intends to develop, maintain and support Mantle and its eco-system, while maintaining that it will participate in the development and support of industry-standard APIs such as DirectX 12 and GLnext (the next major version of OpenGL).

Micron Announces Development of New Parallel Processing Architecture

Micron Technology, Inc., one of the world's leading providers of advanced semiconductor solutions, today announced the development of a fundamentally new computing architecture capable of performing high-speed, comprehensive search and analysis of complex, unstructured data streams.

Micron's Automata Processor (AP) is an accelerator that leverages the intrinsic parallelism of memory and aims to dramatically advance computing capabilities in areas such as bioinformatics, video/image analytics, and network security which pose challenges for conventional processor architectures because of the amount of complex, unstructured data.
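
The sketch below is not Micron's API; it is a plain-Python stand-in for the class of workload the AP targets, tracking every in-flight pattern match in parallel while a byte stream flows past.

```python
# Illustrative sketch (not the Automata Processor SDK): multi-pattern matching over
# an unstructured byte stream, keeping all partial matches alive simultaneously,
# which is the kind of automata workload described above.
def scan(stream: bytes, patterns: list[bytes]) -> list[tuple[int, bytes]]:
    """Report (end_offset, pattern) for every occurrence of every pattern."""
    hits = []
    partial = []                                   # (pattern_index, bytes_matched_so_far)
    for offset, byte in enumerate(stream):
        next_partial = []
        # Extend existing partial matches, and also try starting each pattern here.
        for p_idx, matched in partial + [(i, 0) for i in range(len(patterns))]:
            if patterns[p_idx][matched] == byte:
                if matched + 1 == len(patterns[p_idx]):
                    hits.append((offset, patterns[p_idx]))
                else:
                    next_partial.append((p_idx, matched + 1))
        partial = next_partial
    return hits

print(scan(b"GATTACA", [b"ATTA", b"ACA"]))   # [(4, b'ATTA'), (6, b'ACA')]
```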

AMD Debuts New SDK, Tools and Libraries, for Heterogeneous Computing Developers

AMD kicked off its 2013 Developer Summit (APU13) today, announcing a new unified Software Development Kit (SDK), an improved CodeXL tool suite with added features and support for the latest AMD hardware, and added heterogeneous acceleration in popular open-source libraries. Together, these tools provide a substantial step forward in productivity and ease of use for developers wishing to harness the full power of modern heterogeneous platforms, spanning from servers to PCs to handheld devices.

"Developers are essential to our mission of realizing the full potential of modern computing technologies," said Manju Hegde, corporate vice president, Heterogeneous Solutions, AMD. "Enriching the developer experience by harnessing these technologies is a critical part of AMD's mission to accelerate developer adoption."

Intel Releases SDK with OpenCL 1.2 Support for Intel Xeon Phi Coprocessors

Intel has announced the production release of the Intel SDK for OpenCL Applications XE 2013, which launched as a beta program in December. The new SDK broadens options for developers on Intel architecture and includes tools, optimization guides and training.

The SDK helps OpenCL developers improve performance and efficiency on Intel Xeon Phi coprocessors and Intel Xeon processors as well as create highly parallel applications for high performance computing workstations, data analytics and other uses. Download the Intel SDK for OpenCL Applications XE 2013 here.
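
A minimal example of the programming model, using the pyopencl bindings as a convenience for brevity (the SDK itself targets C/C++ hosts); the same OpenCL 1.2 kernel source should run unchanged on a Xeon Phi coprocessor or a Xeon CPU.

```python
# Minimal OpenCL vector-add sketch via pyopencl (a convenience assumption; not
# part of Intel's SDK). The kernel is plain OpenCL C and is device-agnostic.
import numpy as np
import pyopencl as cl

a = np.random.rand(1_000_000).astype(np.float32)
b = np.random.rand(1_000_000).astype(np.float32)

ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)
mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

program = cl.Program(ctx, """
__kernel void vadd(__global const float *a, __global const float *b, __global float *out) {
    int gid = get_global_id(0);
    out[gid] = a[gid] + b[gid];
}
""").build()

program.vadd(queue, a.shape, None, a_buf, b_buf, out_buf)
out = np.empty_like(a)
cl.enqueue_copy(queue, out, out_buf)
print(np.allclose(out, a + b))
```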

IDF 2013 Transforming Computing Experiences from the Device to the Cloud

During her keynote at the Intel Developer Forum today in Beijing, Diane Bryant, senior vice president and general manager of Intel's Datacenter and Connected Systems Group, discussed how her company is helping users harness powerful new capabilities that will improve the lives of people by building smarter cities, healthier communities and thriving businesses.

Bryant unveiled details of upcoming technologies and products that show how Intel aims to transform the server, networking and storage capabilities of the datacenter. By addressing the full spectrum of workload demands and providing new levels of application optimized solutions for enterprise IT, technical computing and cloud service providers, unprecedented experiences can be delivered.

Intel Media SDK Helps Accelerate Atom-Based Devices and Open Source Software

Intel Media SDK 2013 now supports fixed-function hardware acceleration for video playback, editing and conversion on Intel Atom processor-based tablets running Microsoft Windows operating systems.

The SDK, newly optimized for upcoming 4th generation Intel Core processors, codenamed "Haswell," includes enhanced support for Windows 8 including Microsoft DirectX 11, fully accelerated MPEG2 encode and MPEG/JPEG decode, and a Windows Store development sample. Use of Intel Media SDK 2013 also includes free licensing and source for integration with Open Source projects. The SDK is available as a free download.

MultiTouch launches Enriched Reality for Interactive Displays

MultiTouch Ltd, the world leader in interactive display systems, has announced its latest technology, Enriched Reality. With the new technology, MultiTouch's MultiTaction series of displays is able to detect every object that interacts with or is placed on the system.

Enriched Reality is based on MultiTouch's proprietary Computer Vision Through Screen (CVTS) technology and uses 2D optical markers for real-life object detection to uniquely identify any object attached to a marker. In addition to this, Enriched Reality supports blob tracking which recognizes all basic geometric shapes including circles, triangles and rectangles.
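
As a rough illustration of the blob-tracking half (not MultiTouch's CVTS pipeline), the OpenCV sketch below classifies basic shapes in a synthetic binary image by counting the vertices of each contour's polygonal approximation.

```python
# Illustrative sketch (not MultiTouch's CVTS technology): classify basic geometric
# shapes with OpenCV 4.x contours, the kind of blob tracking described above.
import cv2
import numpy as np

image = np.zeros((200, 200), dtype=np.uint8)
cv2.rectangle(image, (20, 20), (80, 80), 255, -1)     # a filled square
cv2.circle(image, (140, 140), 30, 255, -1)            # a filled circle

contours, _ = cv2.findContours(image, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
for contour in contours:
    approx = cv2.approxPolyDP(contour, 0.02 * cv2.arcLength(contour, True), True)
    if len(approx) == 3:
        label = "triangle"
    elif len(approx) == 4:
        label = "rectangle"
    else:
        label = "circle"
    print(label, cv2.contourArea(contour))
```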