Friday, March 23rd 2018
EPIC Games Shows off Homonymous Raytracing Proof of Concept Videos
EPIC delivered their "The State of Unreal" presentation at the Game Developers Conference in San Francisco this week, with a great deal of time devoted to raytracing technologies. EPIC Games has been at the forefront of graphics development for some years now, and their engines (alongside those of a few other key players in the industry) routinely push the boundaries of graphical fidelity. Naturally, something would be amiss if they weren't jumping on the raytracing bandwagon as well - and they did so with some of the most impressive proof of concept videos I've ever seen in real-time generated graphics. The company took the opportunity to unveil their artistic interpretations of real-time ray tracing, motion capture, and real-time facial animation mapping. Cue many videos, after the break.
The first video is one readers of TechPowerUp may have already seen: a real-time, fully raytraced rendition of the Star Wars character Phasma. The shading and reflections are top notch - as is the elevator music, really (winks). This video was made to showcase EPIC's partnership with NVIDIA and its use of the company's RTX technology, and if you don't believe it can be rendered in real time, you should - it's really only a matter of how much processing power one can throw at a given scene, isn't it? Another video explains what is going on, and focuses on the Depth of Field improvements this technology brings to the table.
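The article doesn't detail how the demo implements its Depth of Field, but the textbook ray-traced approach is the thin-lens camera model: instead of firing a single ray per pixel from one point, rays are jittered across a virtual lens aperture and aimed at the focal plane, so only objects at the focus distance stay sharp. Here's a minimal sketch of that idea - the function name, parameters, and the fixed camera orientation are my own simplifications, not anything from EPIC's implementation:

```python
import math
import random

def dof_camera_ray(cam_pos, pixel_dir, aperture_radius, focus_dist):
    """Generate one thin-lens camera ray for ray-traced depth of field.

    pixel_dir is the normalized direction of the ideal pinhole ray for this
    pixel. Points at focus_dist along that ray render sharp; everything else
    blurs in proportion to aperture_radius.
    """
    # The point on the focal plane this pixel should resolve sharply.
    focal_point = [cam_pos[i] + pixel_dir[i] * focus_dist for i in range(3)]

    # Jitter the ray origin to a uniformly random point on the lens disk.
    # Averaging many such rays per pixel yields physically plausible bokeh,
    # which rasterizers can only approximate with screen-space blur passes.
    r = aperture_radius * math.sqrt(random.random())
    phi = 2.0 * math.pi * random.random()
    # For brevity, assume the lens disk lies in the world XY plane.
    origin = [cam_pos[0] + r * math.cos(phi),
              cam_pos[1] + r * math.sin(phi),
              cam_pos[2]]

    direction = [focal_point[i] - origin[i] for i in range(3)]
    length = math.sqrt(sum(d * d for d in direction))
    return origin, [d / length for d in direction]
```

The key point is that the blur falls out of the scene geometry itself rather than a post-process filter, which is why ray tracing handles out-of-focus highlights and occlusion edges so much more gracefully.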
Another proof of concept video, brought to life to show the state of cutting-edge 3D modeling and motion capture, is the Siren video, developed by EPIC Games in partnership with Tencent, Cubic Motion, and 3Lateral. Motion capture is something we've already seen brought to incredible levels of detail, though; just think of last year's Hellblade: Senua's Sacrifice, which employed it in an extremely impressive, detailed, and expressive fashion.
Here's a behind-the-scenes video of Siren, as well. Don't you just love that accent?
And finally, my personal favorite of them all. I have a special admiration for Andy Serkis' performances - he's truly the best motion capture actor of our day. When you watch the following video, a collaboration between 3Lateral and EPIC Games developed on the Unreal Engine, try to remember that this is based purely on motion capture - you can actually recognize that quite clearly from the angle of the camera lens. If you ever wanted to see a rendition of Macbeth delivered by Andy Serkis in the skin of an alien (and who hasn't?), well, you can thank me.
It's a braver new world out there, folks. Could you have imagined, when you picked up your first Master System videogame, that this is where we'd end up? Now consider that this really isn't the best we can ever achieve. It's mind-boggling.
Source: via TechSpot
25 Comments on EPIC Games Shows off Homonymous Raytracing Proof of Concept Videos
As for that virtual woman, while it's impressive, the smile is still a dead giveaway that it's not a real person, no matter how realistic the rest is. They just can't seem to nail emotions perfectly no matter what. A character smiling has been the animator's nemesis for decades. Doesn't look like much has changed even in 2018. :P
But the Alien and the Woman just looked weird and off
Before they change their minds.
Soon (y)
It will probably end up in just a few games, and then just get sprinkled on top for some better AO or global illumination.
I don't think developers struggle with DX12 (or Vulkan); it's just that DX12 (and Vulkan) take way more work with very little to show for it. I.e., run a title in DX11 side by side with DX12 mode and you can barely spot the differences. Ray tracing is not like that.
I'm not saying it will be easy to do or that devs will feel right at home with it. But the reward is there, and I believe someone will do the legwork.
I just suspect it will end up as another buzzword, like some other "game changing" technologies...
The Star Wars troopers scene looked jaw-dropping, and the fact that it ran in real time on 4x Tesla cards means we'll have this in actual games within 4-5 years. It's far away, but not that far away...
(based on speed bumps between generations in the past)
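As a back-of-the-envelope check on that 4-5 year estimate, here is a minimal sketch. It assumes the demo needs roughly the combined throughput of four of today's high-end GPUs in a single card, and that single-GPU performance grows by some fixed factor every ~2-year generation - both numbers are illustrative assumptions, not figures from the article:

```python
import math

# Illustrative assumptions (not from the article):
# - the demo needs ~4x the throughput of one high-end GPU today
# - a new GPU generation ships roughly every 2 years
# - per-generation single-GPU speedup lands somewhere around 1.4x-2.0x
required_speedup = 4.0
years_per_generation = 2.0

for per_gen_speedup in (1.4, 1.7, 2.0):
    # Smallest n such that per_gen_speedup ** n >= required_speedup.
    generations = math.ceil(math.log(required_speedup) / math.log(per_gen_speedup))
    print(f"{per_gen_speedup:.1f}x per gen -> {generations} generations "
          f"(~{generations * years_per_generation:.0f} years)")

# Output:
# 1.4x per gen -> 5 generations (~10 years)
# 1.7x per gen -> 3 generations (~6 years)
# 2.0x per gen -> 2 generations (~4 years)
```

At the optimistic end that lines up with the 4-5 year guess; at the pessimistic end it's closer to a decade - which is roughly the spread of opinions in this thread.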
But it is years down the road, and I would wager the application-specific math hardware will be out first, much like the tessellation hardware that sat unused in old ATI cards.
Still an interesting demo.
I'm not that excited about the Unreal engine though…
This time it's got proper DX API support. True, tessellation got that years ago and it still doesn't run well on most video cards. But it's getting there.
Just by the time this comes out, most games will probably be streamed to us with a subscription from some cloud based platform somewhere. We live in interesting times.
So the argument goes from "looks great, but OMG, four Volta DGX, expensive, FU Nvidia"... to:
"OMG, it didn't really need the expensive DGX Volta monster... FU Nvidia."
The tweet = "We don't need a $50,000 station just $12,000 worth of GPUs"
Even Nvidia showed it during the GeForce RTX presentation.
Kind of hard to dispel when they ran the demo and, a few months later, Nvidia's CEO showed a slide saying exactly that. Even if Turing was used, you'll notice a 45 ms frame time is still under 30 fps.
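For reference on that last point: frame rate is just the reciprocal of frame time, so 45 ms per frame works out to roughly 22 fps, while 30 fps demands about 33.3 ms. A trivial sketch of the conversion:

```python
def fps_from_frame_time(frame_time_ms: float) -> float:
    """Convert a per-frame render time in milliseconds to frames per second."""
    return 1000.0 / frame_time_ms

print(fps_from_frame_time(45.0))  # ~22.2 fps - short of the 30 fps mark
print(fps_from_frame_time(33.3))  # ~30.0 fps - the frame budget for 30 fps
```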