News Posts matching #Unreal Engine

NCSOFT Unveiled its Digital Human for the First Time at GDC 2023

NCSOFT, a global premier game developer and publisher, unveiled its digital human technology together with Project M's trailer at the Game Developers Conference 2023 (GDC), currently being held in San Francisco, CA.

On March 22, NCSOFT revealed the video at State of Unreal, Epic Games' opening session at GDC held to introduce new Unreal Engine tools and technologies in collaboration with its partners. Here, Songyee Yoon, chief strategy officer (CSO) at NCSOFT, was on stage to introduce the company's latest project, Project M, and its trailer.

NVIDIA Omniverse Accelerates Game Content Creation With Generative AI Services and Game Engine Connectors

Powerful AI technologies are making a massive impact in 3D content creation and game development. Whether creating realistic characters that show emotion or turning simple text into imagery, AI tools are becoming fundamental to developer workflows - and this is just the start. At NVIDIA GTC and the Game Developers Conference (GDC), learn how the NVIDIA Omniverse platform for creating and operating metaverse applications is expanding with new Connectors and generative AI services for game developers.

Part of the excitement around generative AI stems from its ability to capture the creator's intent. The technology learns the underlying patterns and structures of data, and uses that to generate new content, such as images, audio, code, text, 3D models and more. Announced today, the NVIDIA AI Foundations cloud services enable users to build, refine and operate custom large language models (LLMs) and generative AI trained with their proprietary data for their domain-specific tasks. And through NVIDIA Omniverse, developers can get their first taste of using generative AI technology to enhance game creation and accelerate development pipelines with the Omniverse Audio2Face app.

Razer Introduces Universal Haptics SDK and Directional Haptics at GDC 2023

Razer, the leading global lifestyle brand for gamers, today announced the release of the Interhaptics universal HD haptic SDK and directional haptics at the Game Developers Conference (GDC) 2023 in San Francisco. This free SDK release focuses on enabling a heightened immersive gaming experience, bringing audio and visual effects to life with HD haptic feedback that can now be completely customized through the Interhaptics SDK.

With today's announcement, Interhaptics, the leading haptic technology platform, has expanded its support to include PlayStation 5*, PlayStation 4, Meta Quest 2, XInput controllers, iOS, and Android devices for game engines such as Unity and Unreal Engine. Additionally, the haptic composer software has been upgraded to include in-app testing for DualSense wireless controllers for PS5 and select Razer HyperSense headsets. Interhaptics can now deploy HD haptics on over 5 billion devices across multiple ecosystems. Developers can sign up for the waiting list for the Razer Kraken V3 HyperSense Dev Kit with programmable directional HD haptics at the Interhaptics website.

NVIDIA Accelerates Neural Graphics PC Gaming Revolution at GDC With New DLSS 3 PC Games and Tools

Ahead of next week's Game Developers Conference (GDC), NVIDIA announced an expanded game roster and new developer plug-ins for NVIDIA DLSS 3. The latest version of NVIDIA's AI-powered Deep Learning Super Sampling (DLSS) technology is now supported in an assortment of blockbuster games and franchises, and being integrated into Unreal Engine, one of the world's most popular game engines. The company is also publicly releasing the DLSS Frame Generation plug-in to further ease developer adoption of the technology.

"Neural graphics has revolutionized gaming since its introduction with NVIDIA DLSS, and we're now taking it to new heights," said Matt Wuebbling, vice president of global GeForce marketing at NVIDIA. "PC gaming super-franchises such as Diablo and Forza Horizon and Bethesda's new Redfall are raising the bar for image quality with stunning graphics while using DLSS to keep gameplay smooth as silk." Since its launch in 2018, NVIDIA DLSS has driven a neural graphics revolution in PC gaming. Neural graphics intertwines AI and graphics to create an accelerated rendering pipeline that continuously learns and improves. Instead of natively rendering every pixel in a frame, DLSS allows the game to render 1/8th of the pixels then uses AI and GeForce RTX Tensor Cores to reconstruct the rest of the pixels, dramatically multiplying frame rates, while delivering crisp, high-quality images that rival native resolution.

NCSOFT Debuts Trailer for its First Real-Time Strategy Game - 'Project G'

NCSOFT, a global premier game developer and publisher, today debuted the first trailer for the company's new mobile and PC title, 'Project G,' on its official YouTube channel. Project G is the company's first real-time strategy (RTS) game, coming to global players. This brand-new IP is currently under development as a strategy game set in a massive-scale war. Each player will expand their territory by accumulating limited resources and will also make use of different tactics in conquest wars between guilds.

The trailer features 100% real gameplay scenes with high-quality graphics currently under development, built on Unreal Engine. It showcases various game system details, including unique characters of different races, strategic combat executed with melee and ranged units, and tactical maneuvers of 'Dragons' and 'Strategic Arms' in objective and territory conquest wars. It also reveals in-game footage where battles between individual forces expand into massive warfare.

Embark Studios' THE FINALS has DLSS 3 support

THE FINALS, the upcoming free-to-play first-person shooter from Embark Studios, features support for both NVIDIA DLSS 3 and RTX Global Illumination. Surprisingly, neither NVIDIA nor Embark Studios announced the support, although NVIDIA did briefly note that the game will have DLSS support.

THE FINALS, described as a free-to-play, combat-centered game show, is a multiplayer team game built on Unreal Engine that takes place in virtual arenas which can be "altered, exploited, and even destroyed". The advanced destruction system in the game looks quite impressive, and while there are certainly things that are indestructible, it still looks like a lot of fun and brings a certain twist to the game.

AMD FSR 2.2 for Unreal Engine now available on GPUOpen

Although it has already been available in some games, AMD's FidelityFX Super Resolution 2.2 is now available as an Unreal Engine plugin over on GPUOpen. AMD's FidelityFX Super Resolution 2 has already been used in a number of games ever since AMD released the source code for the technology, including titles like Forza Horizon 5, Need For Speed Unbound, and F1 22, but implementation in various engines can take time, and it is now available as a plugin for Unreal Engine. FSR 2.2 brings several improvements, including new logic that should reduce "High-Velocity Ghosting," an issue that usually plagues racing games. It also features a new Debug API Checker, which should make debugging much easier for developers.
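For context, FSR 2 exposes a handful of documented quality modes, each defined by a per-axis scale factor. The short Python sketch below is plain arithmetic, not part of the plugin's API, showing the render resolution each mode implies for a 4K output.

```python
# Illustrative sketch: FSR 2's documented per-axis scale factors per quality
# mode, and the render resolution each implies for a 3840x2160 output.

FSR2_SCALE_FACTORS = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def render_resolution(output_w: int, output_h: int, mode: str) -> tuple:
    scale = FSR2_SCALE_FACTORS[mode]
    return round(output_w / scale), round(output_h / scale)

for mode in FSR2_SCALE_FACTORS:
    w, h = render_resolution(3840, 2160, mode)
    print(f"{mode:>17}: renders at {w}x{h}")
```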

Intel Xeon W-3400/2400 "Sapphire Rapids" Processors Run First Benchmarks

Thanks to Puget Systems, we have a preview of Intel's latest Xeon W-3400 and Xeon W-2400 workstation processors based on Sapphire Rapids core technology. Delivering up to 56 cores and 112 threads, these CPUs are paired with up to eight terabytes of eight-channel DDR5-4800 memory. For expansion, they offer up to 112 PCIe 5.0 lanes and come with up to a 350 W TDP; some models are unlocked for overclocking. This HEDT family for workstation use comes at a premium, with an MSRP of $5,889 for the top-end SKU, and motherboard prices are also on the pricey side. However, none of this should come as a surprise given the performance professionals expect from these chips. Puget Systems has published test results that include: Photoshop, After Effects, Premiere Pro, DaVinci Resolve, Unreal Engine, Cinebench R23.2, Blender, and V-Ray. Note that Puget Systems said: "While this post has been an interesting preview of the new Xeon processors, there is still a TON of testing we want to do. The optimizations Intel is working on is of course at the top, but there are several other topics we are highly interested in." So we expect better numbers in the future.
Below, you can see the comparison with AMD's competing Threadripper Pro HEDT SKUs, along with power usage using different Windows OS power profiles:

Intel XeSS Plugin Released for Unreal Engine

Intel released the XeSS Unreal Engine plugin, letting game developers integrate the performance enhancement technology with their Unreal Engine 4 and Unreal Engine 5 powered games, simulators, and 3D visualization applications. The plugin lets Unreal Engine take advantage of XeSS not just on Intel Arc "Alchemist" GPUs, where it benefits from the accelerated XMX code-path, but also on AMD and NVIDIA GPUs, where the technology takes advantage of the slower yet functional DP4a code-path. XeSS is technically a second-generation super-resolution technology that Intel claims is on par with AMD FSR 2.x and NVIDIA DLSS 2. Integrating it is as straightforward as adding AMD FSR support. Those interested can grab the plugin from the GitHub source link below.
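The DP4a path mentioned above refers to a GPU instruction that computes a four-element dot product of packed 8-bit integers and adds it to a 32-bit accumulator. The minimal Python sketch below mirrors only the instruction's semantics; it says nothing about XeSS's actual kernels.

```python
# Minimal sketch of what a DP4a instruction computes: the dot product of two
# four-element vectors of signed 8-bit integers, added to a 32-bit accumulator.
# This mirrors the instruction's semantics only, not XeSS's actual kernels.

def dp4a(a, b, acc):
    assert len(a) == len(b) == 4
    assert all(-128 <= v <= 127 for v in list(a) + list(b)), "lanes must fit in int8"
    return acc + sum(x * y for x, y in zip(a, b))

# One accumulation step, as used in quantized convolution / matrix math:
print(dp4a([1, -2, 3, 4], [5, 6, -7, 8], acc=10))  # 10 + (5 - 12 - 21 + 32) = 14
```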

Unreal Engine 5.1 Available with Updated Lumen, Nanite, and Virtual Shadow Maps

Epic Games is excited to announce that Unreal Engine 5.1 is now available. With this release, we've built upon the groundbreaking feature set introduced in UE5, making it more robust, efficient, and versatile for creators across all industries. As part of this effort, we've been stress-testing the engine against different workflows, making it applicable to more sectors. We've laid the groundwork for the Lumen dynamic global illumination and reflections system, the Nanite virtualized micropolygon geometry system, and Virtual Shadow Maps (VSM) to support games and experiences running at 60 FPS on next-gen consoles and capable PCs, enabling fast-paced competitive games and detailed simulations to run without latency.

Meanwhile, Nanite has also been updated with a Programmable Rasterizer to allow for material-driven animations and deformations via World Position Offset, as well as opacity masks. This exciting development paves the way for artists to use Nanite to program specific objects' behavior, for example Nanite-based foliage with leaves blowing in the wind. This release adds a number of features to improve efficiency for developers of games and other large-scale interactive projects, helping teams be more productive.
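As a rough mental model of what a World Position Offset-driven wind animation does, the sketch below displaces vertex positions with a time-varying sine wave. In Unreal this logic runs per vertex inside the material; the amplitude, frequency, and direction values here are purely illustrative assumptions, not Epic's implementation.

```python
import math

# Conceptual sketch of a wind-style World Position Offset: each vertex is
# displaced along a direction by a time-varying sine wave, with a per-vertex
# phase so the foliage doesn't sway in lockstep. Parameter values are illustrative.

def wind_offset(position, time, amplitude=5.0, frequency=1.5, direction=(1.0, 0.0, 0.0)):
    x, y, z = position
    phase = 0.1 * (x + y)                       # vary the sway across the object
    sway = amplitude * math.sin(frequency * time + phase)
    return (x + direction[0] * sway,
            y + direction[1] * sway,
            z + direction[2] * sway)

print(wind_offset((100.0, 40.0, 250.0), time=2.0))
```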

HaptX Introduces Industry's Most Advanced Haptic Gloves, Priced for Scalable Deployment

HaptX Inc., the leading provider of realistic haptic technology, today announced the availability of pre-orders of the company's new HaptX Gloves G1, a ground-breaking haptic device optimized for the enterprise metaverse. HaptX has engineered HaptX Gloves G1 with the features most requested by HaptX customers, including improved ergonomics, multiple glove sizes, wireless mobility, new and improved haptic functionality, and multiplayer collaboration, all priced as low as $4,500 per pair - a fraction of the cost of the award-winning HaptX Gloves DK2.

"With HaptX Gloves G1, we're making it possible for all organizations to leverage our lifelike haptics," said Jake Rubin, Founder and CEO of HaptX. "Touch is the cornerstone of the next generation of human-machine interface technologies, and the opportunities are endless." HaptX Gloves G1 leverages advances in materials science and the latest manufacturing techniques to deliver the first haptic gloves that fit like a conventional glove. The Gloves' digits, palm, and wrist are soft and flexible for uninhibited dexterity and comfort. Available in four sizes (Small, Medium, Large, and Extra Large), these Gloves offer the best fit and performance for all adult hands. Inside the Gloves are hundreds of microfluidic actuators that physically displace your skin, so when you touch and interact with virtual objects, the objects feel real.

AMD Launches FSR 2.0 Plugin for Unreal Engine 4 & 5

AMD has recently released two new plugins for Unreal Engine 4 and 5 that help developers easily add support for FidelityFX Super Resolution (FSR) 2.0 to their games. The first FSR 2.0 games only started to arrive this month, but with over 20 titles already announced to receive the technology, we expect that this latest development will greatly increase that number. AMD joins NVIDIA in offering Unreal Engine plugins to simplify the implementation of their respective temporal upscaling solutions, with AMD claiming that adding FSR 2.0 support takes only days for games that already use motion vectors. AMD has published developer guides on how to install and use the plugins, which can be found below.

Hell, It's About Time! Frost Giant Studios Unveils Stormgate, the First Truly Social RTS

Millions of viewers around the world today watched the world premiere cinematic trailer for Frost Giant Studios' eagerly-anticipated real-time strategy game, Stormgate. Southern California-based Frost Giant Studios was founded in 2020 by Tim Morten and Tim Campbell, veteran game development leaders who helped create some of the most acclaimed and best-selling PC games of all time, as well as some of the most-watched esports, including Blizzard Entertainment's WarCraft III and StarCraft II.

"We are building Stormgate for the real-time strategy community--past, present, and future," said Tim Morten, CEO and production director at Frost Giant Studios. "Our vision is to create a social experience that breaks down the barriers that have kept people away, to welcome back players who have been waiting for the next great RTS, and to prove that the RTS genre can thrive once again."

Epic Games Announces The Matrix Awakens: An Unreal Engine 5 Experience

Epic Games is excited to announce that The Matrix Awakens: An Unreal Engine 5 Experience is now available to download for free on PlayStation 5 and Xbox Series X/S. This boundary-pushing technical demo is an original concept set within the world of Warner Bros' The Matrix. Written and cinematically directed by Lana Wachowski, it features Keanu Reeves and Carrie-Anne Moss reprising their roles as Neo and Trinity and—in a reality-flipping twist—also playing themselves.

The project reunited many of the crew that worked on the seminal The Matrix trilogy, including James McTeigue, Kym Barrett, John Gaeta, Kim Libreri, Jerome Platteaux, George Borshukov, and Michael F Gay, in collaboration with teams across both Epic Games and partners, such as SideFX, Evil Eye Pictures, The Coalition, WetaFX (formerly Weta Digital), and many others.

Epic Games Officially Announces Unreal Engine 5

The wait is over—we're very excited to announce that Unreal Engine 5 is now available to download! With this release, we aim to empower both large and small teams to really push the boundaries of what's possible, visually and interactively. UE5 will enable you to realize next-generation real-time 3D content and experiences with greater freedom, fidelity, and flexibility than ever before. As you may have seen, the new features and workflows have already been production-proven for game development in Fortnite and The Matrix Awakens: An Unreal Engine 5 Experience demo.

Meanwhile, although some major new features like Lumen and Nanite have not yet been validated for non-games workflows (this is an ongoing goal for future releases), all creators will be able to continue using workflows supported in UE 4.27. But they'll also benefit from a redesigned Unreal Editor, better performance, artist-friendly animation tools, an extended mesh creation and editing toolset, improved path tracing, and much more. See the documentation for full details.

NVIDIA Launches New Omniverse Game Development Tools

Enriching its game developer ecosystem, NVIDIA today announced the launch of new NVIDIA Omniverse features that make it easier for developers to share assets, sort asset libraries, collaborate and deploy AI to animate characters' facial expressions in a new game development pipeline. With the NVIDIA Omniverse real-time design collaboration and simulation platform, game developers can use AI- and NVIDIA RTX-enabled tools, or easily build custom ones, to streamline, accelerate and enhance their development workflows. New features for game developers include updates to Omniverse Audio2Face, Omniverse Nucleus Cloud and Omniverse DeepSearch, as well as the introduction of an Unreal Engine 5 Omniverse Connector.

"Omniverse provides a powerful development pipeline that addresses the challenges of doing business in today's world," said Frank DeLise, vice president of Omniverse at NVIDIA. "Its ability to unify artists, art, tools and applications under a single platform can inspire collaboration among even the most dispersed game development organization."

CD PROJEKT RED Prepares a New Witcher Game with Unreal Engine 5

CD PROJEKT RED today revealed that the next installment in The Witcher series of video games is currently in development with Unreal Engine 5, kicking off a new saga for the franchise and a new technology partnership with Epic Games.

Today's announcement marks the first official confirmation of a new game in The Witcher series since the release of CD PROJEKT RED's previous single-player, AAA RPG in the franchise—The Witcher 3: Wild Hunt—which won a total of 250 Game of the Year awards and was later expanded upon with the Hearts of Stone and Blood & Wine add-ons.

AMD FidelityFX Super Resolution (FSR) Plugin for Unreal Engine 4 Released

AMD released the FidelityFX Super Resolution (FSR) plugin for Unreal Engine 4, allowing game developers to integrate the performance enhancement technology with their games. A competing technology to NVIDIA DLSS, FSR lets gamers improve the frame rates of their games by trading off quality. At the higher "Quality" presets, this quality loss is supposed to be practically unnoticeable, while still delivering significant improvements to frame rates. We detailed how the technology works in our article that gets under its hood and evaluates performance. At its launch, AMD announced a broad list of launch partners for the technology, but Unreal was a notable absentee. Over the following months, AMD appears to have worked toward bringing the tech to UE4 as well. The plugin is being distributed through AMD's GPUOpen portal.

DOWNLOAD: AMD FSR Plugin for Unreal Engine 4

Qualcomm Announces Snapdragon Spaces XR Developer Platform

Qualcomm introduces the Snapdragon Spaces XR Developer Platform, a headworn Augmented Reality (AR) developer kit to enable the creation of immersive experiences that seamlessly blur the lines between our physical and digital realities. With proven technology and an open, cross-device horizontal platform and ecosystem, Snapdragon Spaces delivers the tools to bring developers' ideas to life and revolutionize the possibilities of headworn AR. Snapdragon Spaces is in early access with select developers and is expected to be generally available in the Spring of 2022.

Qualcomm Technologies is a pioneer in Augmented Reality with over a decade of AR research and development. Utilizing these years of innovation and expertise, Snapdragon Spaces offers robust machine perception technology that is optimized for performance and low power for the next generation of AR glasses. The Snapdragon Spaces platform provides environmental and user understanding capabilities that give developers the tools to create headworn AR experiences that can sense and intelligently interact with the user and adapt to their physical indoor spaces. Some of the marquee environmental understanding features include spatial mapping and meshing, occlusion, plane detection, object and image recognition and tracking, local anchors and persistence, and scene understanding. The user understanding machine perception features include positional tracking and hand tracking.

AMD FidelityFX FSR Source Code Released & Updates Posted, Uses Lanczos under the Hood

AMD today in a blog post announced several updates to the FidelityFX Super Resolution (FSR) technology, its performance enhancement rivaling NVIDIA DLSS, which lets gamers dial up performance with minimal loss to image quality. To begin with, the company released the source code of the technology to the public under its GPUOpen initiative, under the MIT license. This makes it tremendously easy (and affordable) for game developers to implement the tech. Inspecting the source, we find that FSR relies heavily on a multi-pass Lanczos algorithm for image upscaling. Next up, we learn that close to two dozen games are already in the process of receiving FSR support. Lastly, it's announced that Unity and Unreal Engine support FSR.

AMD broadly detailed how FSR works in its June 2021 announcement of the technology. FSR sits within the render pipeline of a game, where an almost-ready lower-resolution frame that has been rendered, tone-mapped, and anti-aliased is processed by FSR in a two-pass process implemented as a shader, before the high-resolution output is passed on to post-processing effects that introduce noise (such as film grain). HUD and other in-game text (such as subtitles) are natively rendered at the target (higher) resolution and applied post-render. The FSR component makes two passes—upscaling and sharpening. We learn from the source code that the upscaler is based on the Lanczos algorithm, which was invented in 1979. Media PC enthusiasts will know Lanczos from MadVR, which has offered various movie upscaling algorithms in the past. AMD's implementation of Lanczos-2 is different from the original—it skips the expensive sin(), rcp() and sqrt() instructions and implements them in a faster way. AMD also added additional logic to avoid the ringing effects that are often observed on images processed with Lanczos.
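For reference, the textbook Lanczos-2 kernel looks like the sketch below. AMD's shader swaps the trigonometric and reciprocal math for cheaper approximations; this non-optimized Python version deliberately keeps the standard form and is not AMD's code.

```python
import math

# Reference (non-optimized) Lanczos-2 kernel: sinc(x) * sinc(x / 2) for |x| < 2,
# zero elsewhere. AMD's shader version approximates the expensive math; this
# sketch keeps the textbook form for clarity.

def lanczos2(x: float) -> float:
    if x == 0.0:
        return 1.0
    if abs(x) >= 2.0:
        return 0.0
    px = math.pi * x
    return (math.sin(px) / px) * (math.sin(px / 2.0) / (px / 2.0))

# Normalized filter weights for a sample point 0.3 pixels past a source texel:
taps = [lanczos2(0.3 - offset) for offset in (-1, 0, 1, 2)]
weights = [t / sum(taps) for t in taps]
print([round(w, 4) for w in weights])
```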

NVIDIA Working on Ultra Quality Mode for DLSS Upscaling

NVIDIA's Deep Learning Super Sampling (DLSS) technology has been developed to upscale lower resolutions using artificial intelligence and deep learning algorithms. By using this technique, users with RTX cards can increase their frame rates in supported games with minimal loss in image quality. Recently, AMD introduced FidelityFX Super Resolution (FSR), a competing technology that in one aspect might be technologically ahead of DLSS. How, you might wonder? Well, at the "Quality" setting, NVIDIA's DLSS renders the game at 66.6% of the resolution, upscaling it 1.5 times. AMD FSR, meanwhile, offers an "Ultra Quality" preset that renders the game at 77% of the resolution and upscales the image by only 1.3 times. This technically gives AMD FSR an advantage, as the image is poised to look better with less upscaling. DLSS, on the other hand, uses much more information, because it considers multiple frames in its temporal algorithm.
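The arithmetic behind those percentages, worked through for a 3840x2160 output, is shown in the minimal Python sketch below, using the scale factors quoted above.

```python
# The arithmetic behind the quoted percentages, for a 3840x2160 output.
# DLSS "Quality" renders at 1/1.5 of the resolution per axis; FSR's 77% figure
# corresponds to its 1.3x "Ultra Quality" mode.

def render_target(output_w, output_h, per_axis_scale):
    return round(output_w * per_axis_scale), round(output_h * per_axis_scale)

for name, scale in (("DLSS Quality (1.5x)", 1 / 1.5), ("FSR Ultra Quality (1.3x)", 1 / 1.3)):
    w, h = render_target(3840, 2160, scale)
    print(f"{name}: {w}x{h} ({scale:.1%} per axis, {scale ** 2:.1%} of the pixels)")
```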

That newfound competition could be what made NVIDIA rethink its options, and today we are getting some exciting news regarding DLSS. In the Unreal Engine 5 (UE5) documentation, there is a placeholder for an "Ultra Quality" DLSS mode, which is supposed to rival AMD's "Ultra Quality" mode and offer the best possible image quality. Currently, the latest DLSS version is 2.2.6.0, which is present in some DLSS-supported games and can be added to others with a DLL swap. The updated version containing the Ultra Quality preset, DLSS 2.2.9.0, is already present in UE5. Alexander Battaglia from Digital Foundry has made a quick comparison using the two versions; however, we are waiting for more in-depth testing to see the final results.

NVIDIA Bringing DLSS 2.0 Support to UE 5, Linux via Proton, and Even More Titles

NVIDIA is reportedly about to make some big announcements related to its DLSS 2.0 performance enhancement feature. Excerpts from the announcement were leaked to the web by VideoCardz. To begin with, the company is about to announce that both the upcoming Unreal Engine 5 and the current UE4 support DLSS 2.0, alongside the latest version 2021.2 of the Unity Engine. A large number of first-party game engines now support DLSS 2.0, including, notably, the Rockstar Games RAGE engine powering RDR2, CryEngine, Decima, AnvilNext, REDEngine, and more.

NVIDIA is also preparing to announce that a vast new selection of games support DLSS 2.0, including Red Dead Redemption 2 (support coming soon), Rainbow Six Siege, DOOM Eternal (patch scheduled for June 29), Rust, and more. Lastly, NVIDIA is about to announce that it is working with Valve to bring DLSS support to Linux via the Proton compatibility layer. This will enable playing AAA Windows games on Linux with DLSS enabled.

Epic Games Announces $1 Billion Funding Round

Today Epic Games announced that it completed a $1 billion round of funding, which will allow the company to support future growth opportunities. Epic's equity valuation is now $28.7 billion.

This round includes an additional $200M strategic investment from Sony Group Corporation, which builds on the already close relationship between the two companies and reinforces their shared mission to advance the state of the art in technology, entertainment, and socially-connected online services. Other investment partners include Appaloosa, Baillie Gifford, Fidelity Management & Research Company LLC, GIC, funds and accounts advised by T. Rowe Price Associates, Ontario Teachers' Pension Plan Board, funds and accounts managed by BlackRock, Park West, KKR, AllianceBernstein, Altimeter, Franklin Templeton and Luxor Capital. Epic continues to have only a single class of common stock outstanding and CEO Tim Sweeney remains the controlling shareholder of the company.

NVIDIA's DLSS 2.0 is Easily Integrated into Unreal Engine 4.26 and Gives Big Performance Gains

When NVIDIA launched the second iteration of its Deep Learning Super Sampling (DLSS) technique, used to upscale lower resolutions using deep learning, everyone was impressed by the quality of the rendering it puts out. However, have you ever wondered how it all looks from the developer side of things? Usually, games need millions of lines of code, and even some small features are not easy to implement. Today, thanks to Tom Looman, a game developer working with Unreal Engine, we have found out what the integration process of DLSS 2.0 looks like, and how big the performance benefits coming from it are.

In the blog post, you can take a look at the example game shown by the developer. The integration with Unreal Engine 4.26 is easy: it just requires that you compile your project with a special UE4 RTX branch and apply an AppID, which you can request on NVIDIA's website. You are probably wondering how the performance looks. Well, the baseline for the results was the TXAA sampling technique used in the game demo. DLSS 2.0 managed to bring anywhere from a 60-180% increase in frame rate, depending on the scene. These are rather impressive numbers, and it goes to show just how well NVIDIA has built its DLSS 2.0 technology. For a full overview, please refer to the blog post.
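To put the quoted range in perspective, here is a small Python sketch of what a 60-180% uplift means for a hypothetical 60 FPS baseline; the baseline figure is assumed here, not taken from the blog post.

```python
# Illustrative only: what a 60-180% frame-rate uplift means for a hypothetical
# 60 FPS TXAA baseline. The baseline value is assumed, not from the blog post.

baseline_fps = 60.0
for uplift in (0.60, 1.80):
    fps = baseline_fps * (1.0 + uplift)
    print(f"+{uplift:.0%}: {fps:.0f} FPS, frame time {1000.0 / fps:.1f} ms "
          f"(down from {1000.0 / baseline_fps:.1f} ms)")
```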

Sony Electronics Launches Groundbreaking Spatial Reality Display

Sony Electronics Inc. today announced the debut of the Spatial Reality Display (SR Display), a groundbreaking new product made with Sony's award-winning Eye-Sensing Light Field Display (ELFD) technology. The display, initially shared with attendees at the Consumer Electronics Show in January of this year, does not require virtual reality glasses or a headset. The SR Display enables creators across a variety of industries, from automotive and industrial design, to Computer Graphics (CG) and Visual Effects (VFX) designers and creators in film to bring ideas to life in stunning 3D displays.

"We're excited to bring the world's best technology to bear, moving the design and creation industry forward, particularly as the shift to digital has become so pronounced," stated Mike Fasulo, president and chief operating officer of Sony Electronics North America. "This technology drives new versatility, allowing us to advance an entirely new medium and experience for designers and creators everywhere."