Friday, March 23rd 2018

EPIC Games Shows off Homonymous Raytracing Proof of Concept Videos

EPIC delivered their "The State of Unreal" presentation at the Game Developers Conference in San Francisco this week, with a great deal of time devoted to raytracing technologies. EPIC Games has been at the forefront of graphics development for years now; their engines (alongside a few other key players in the industry) routinely push the boundaries of graphical fidelity. Naturally, something would be amiss if they weren't jumping on the raytracing bandwagon as well, and they did so with some of the most impressive proof-of-concept videos I've ever seen in real-time generated graphics. The company took the opportunity to unveil its artistic interpretations of real-time ray tracing, motion capture, and real-time facial animation mapping. Cue many videos, after the break.

The first video is one readers of TechPowerUp may have already seen: a real-time, fully raytraced rendition of the Star Wars character Phasma. The shading and reflections are top notch - as is the elevator music, really (winks). This video was made to showcase EPIC's partnership with NVIDIA and its use of NVIDIA's RTX technology, and if you don't believe it can be rendered in real time, you should - it's really only a matter of how much processing power one can throw at a scene, isn't it? Another video explains what is going on, and focuses on the Depth of Field improvements this technology brings to the table.


Another proof of concept video, made to show the state of cutting-edge 3D modeling and motion capture, is the Siren video, developed by EPIC Games in partnership with Tencent, Cubic Motion and 3Lateral. Motion capture is something we've already seen brought to incredible levels of detail, though; just think of last year's Hellblade: Senua's Sacrifice, which employed it in an extremely impressive, detailed, and expressive fashion.


Here's a behind the scenes video of Siren, as well. Don't you just love that accent?


And finally, my personal favorite of them all. I have a special admiration for Andy Serkis' performances - he's truly the best motion capture actor of our day. When you watch the following video, a collaboration between 3Lateral and EPIC Games developed on the Unreal Engine, try to remember that it is based purely on motion capture - something you can recognize quite clearly from the angle of the camera lens. If you ever wanted to see a rendition of Macbeth delivered by Andy Serkis in the skin of an alien (and who hasn't?), well, you can thank me.


It's a braver new world out there, folks. Could you imagine, when you picked up your first Master System game, that this is where we'd end up? Now consider that this really isn't the best we can ever achieve. It's mind-boggling.
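If you're wondering why real-time ray tracing has been out of reach for so long, the core idea is simple but brutally expensive: every pixel needs at least one ray-scene intersection test, and real renderers pile on shadows, bounces, and many samples per pixel. Here is a minimal, purely illustrative single-sphere ray caster in Python - a sketch of the per-pixel math, not how Unreal Engine or NVIDIA's RTX actually work:

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Return the nearest positive hit distance t, or None if the ray misses."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    # Quadratic coefficients; a == 1 because the direction is normalized.
    b = 2.0 * (dx * ox + dy * oy + dz * oz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

def render(width, height):
    """Cast one primary ray per pixel at a unit sphere 3 units down the -z axis."""
    center, radius = (0.0, 0.0, -3.0), 1.0
    light = (0.577, 0.577, 0.577)          # direction toward the light, normalized
    image = []
    for y in range(height):
        row = []
        for x in range(width):
            # Map the pixel to a point on an image plane at z = -1, then normalize.
            u = 2.0 * (x + 0.5) / width - 1.0
            v = 1.0 - 2.0 * (y + 0.5) / height
            norm = math.sqrt(u * u + v * v + 1.0)
            d = (u / norm, v / norm, -1.0 / norm)
            t = ray_sphere((0.0, 0.0, 0.0), d, center, radius)
            if t is None:
                row.append(0.0)            # ray missed: background
            else:
                p = tuple(t * d[i] for i in range(3))                      # hit point
                n = tuple((p[i] - center[i]) / radius for i in range(3))   # surface normal
                # Simple Lambertian shading: brightness = max(0, N . L)
                row.append(max(0.0, sum(n[i] * light[i] for i in range(3))))
        image.append(row)
    return image

img = render(32, 32)
```

Even this toy does width × height intersection tests against a single sphere per frame; a game scene holds millions of triangles, which is why acceleration structures and dedicated hardware are needed before any of this becomes playable.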
Source: via TechSpot

25 Comments on EPIC Games Shows off Homonymous Raytracing Proof of Concept Videos

#1
RejZoR
I must say this raytracing thing is going to be as big a revolution as the introduction of pixel shaders was over a decade and a half ago. If programmable pixel shaders pushed graphics to whole new levels, this is the equivalent. The fact that this was running in actual real time tells me general application isn't that far away. Which is pretty cool. Finally something major in the graphics segment after many years of tiny evolutionary improvements.

As for that virtual woman, while it's impressive, the smile is still a dead giveaway it's not a real person no matter how realistic the rest is. They just can't seem to nail emotions perfectly no matter what. A character smiling has been animator's nemesis for decades. Doesn't look like much has changed even in 2018. :P
Posted on Reply
#2
Omwe
It took 4 NVLinked Tesla V100s to run that Star Wars demo at 1080p24:

Posted on Reply
#3
ZoneDymo
The star wars demo looked good and was funny as well.
But the Alien and the Woman just looked weird and off
Posted on Reply
#4
Xzibit
Omwe said: It took 4 NVLinked Tesla V100's to run that Star Wars demo at 1080p24:
Get yours now for only $49,900.

Before they change their minds.
Posted on Reply
#5
_JP_
RejZoR said: As for that virtual woman, while it's impressive, the smile is still a dead giveaway it's not a real person no matter how realistic the rest is. They just can't seem to nail emotions perfectly no matter what. A character smiling has been animator's nemesis for decades. Doesn't look like much has changed even in 2018. :p
Going over the uncanny valley requires all the mining power in the world. And it will only be able to render at 720p. :p
Posted on Reply
#6
bug
Omwe said: It took 4 NVLinked Tesla V100's to run that Star Wars demo at 1080p24:

I've said from the beginning that current cards probably don't cut it. But now ray tracing is no longer decades away, it's merely years. Still great news considering what we're looking at.
Posted on Reply
#7
ikeke
So, in historical perspective, this is a similar increase in computing power required for desktop usage as 4x G80 (8800 GTX) -> 1x GF100 (GTX 480), or 4x GF100 -> 1x GK110 (GTX 780 Ti), or 4x GK110 -> 1x GV100.

Soon (y)
Posted on Reply
#8
iO
RejZoR said: I must say this Raytracing thing is going to be as big revolution as introduction of pixel shaders was over decade and a half ago. If programmable pixel shaders pushed the graphics to whole new levels, this is that equivalent. Given this was running in actual real-time tells me general application isn't that far away. Which is pretty cool. Finally something major in the graphics segment after many years of tiny evolutionary improvements.
Don't get too excited. Game devs already struggle with the implementation of DX12, and adding RT won't make it any easier.
It will probably end up in just a few games, sprinkled on top for some better AO or global illumination.
Posted on Reply
#9
bug
iO said: Dont get too excited. Game devs already struggle with the implementation of DX12 and adding RT wont make it any easier.
It probably will end up in just a few games and then just getting sprinkled on top of it for some better AO or global illumination..
If it gets into the Unreal engine, it will get into a lot of games.
I don't think developers struggle with DX12 (or Vulkan); it's just that DX12 (and Vulkan) take way more work with very little to show for it. I.e. run a title in DX11 side by side with DX12 mode and you can barely spot the differences. Ray tracing is not like that.
I'm not saying it will be easy to do or that the devs will feel right at home with it. But the reward is there, I believe someone will do the legwork.
Posted on Reply
#10
iO
bug said: If it gets into the Unreal engine, it will get into a lot of games.
I don't think developers struggle with DX12 (or Vulkan), it's just that DX12 (and Vulkan) take way more work with very little to show for. I.e. run a title in DX11 side by side with DX12 mode and you can barely spot the differences. Ray tracing is not like that.
I'm not saying it will be easy to do or that the devs will feel right at home with it. But the reward is there, I believe someone will do the legwork.
Every major engine supports stuff like displacement mapping, tessellation, AO, high-res textures and so on, which can drastically improve the visuals of a game, yet that doesn't mean developers automatically use it just because it's supported.
I'm just sceptical that it will end up as another buzzword like some other "game changing" technologies...
Posted on Reply
#11
RejZoR
Ray tracing is a game changer. Until now, almost everything was faked to mimic how things look in reality. Ray tracing simulates how light interacts with physical objects - there is no faking in it (which is why it's so expensive to use). We're all aware it's not just going to happen overnight, but having a glimpse of it running in a real-time sequence is an amazing achievement. Until now, ray tracing was an offline process that took several seconds or minutes to render a single frame. Pixel shaders didn't happen overnight either; at first they were basically only used for water and reflections in mirrors (remember how basic TES: Morrowind looked otherwise, but then there were hyper-real water waves and flow thanks to pixel shaders?). Today, pixel shaders are an integral part of basically every effect on the screen.

The Star Wars troopers scene looked jaw-dropping, and the fact it ran in real time on 4x Tesla cards means we'll have this in actual games within 4-5 years. It's far away, but not that far away...
Posted on Reply
#12
Fluffmeister
Yeah I certainly don't expect the latest triple A titles to suddenly look like the latest and greatest Star Wars film, but it certainly shows what we can all look forward to in the coming years.
Posted on Reply
#13
ikeke
Two generations past Volta we'll hopefully have enough power in a top-tier GPU to run the Star Wars troopers scene on one GPU in real time :)

(based on speed bumps between generations in the past)
Posted on Reply
#14
bug
iO said: Every major engine supports stuff like displacement mapping, tesselation, AO, HiRez textures and so on, which can drastically improve the visuals of a game, yet it doesnt mean that it gets automatically used by developers just because its supported.
I'm just sceptic that it will end up as another buzzword like some other "game changing" technologies...
And that is in spite of the fact that you see what ray tracing can do every time you go see a SciFi movie. It's your opinion and I can respect that.
Posted on Reply
#15
Divide Overflow
bug said: And that is in spite of the fact that you see what ray tracing can do every time you go see a SciFi movie.
True, but they aren't rendering those frames in real time. It's something to look forward to definitely.
Posted on Reply
#16
Steevo
Will we reach the time when a smaller process allows huge silicon chips with 4 times the processing power to do this in one chip? Yep.

But it is years down the road, and I would wager application-specific math hardware will come out first, much like tessellation was present but unused in old ATI cards.

Still an interesting demo.
Posted on Reply
#17
Easy Rhino
Linux Advocate
Wow, I haven't heard the name ray tracing in 10 years. They are still hyping it?
Posted on Reply
#18
efikkan
Don't expect fully raytraced games anytime soon, but it can still be used on voxels to create decent shadows and glare.

I'm not that excited about the Unreal engine though…
Posted on Reply
#19
bug
Easy Rhino said: Wow, I havn't heard the name Ray Tracing in 10 years. They are still hyping it?
If by "hyping" you mean "got it working in real time on almost consumer-grade hardware", then yes, they are ;)
This time it's got proper DX API support. True, tessellation got that years ago and it still doesn't run well on most video cards. But it's getting there.
Posted on Reply
#21
PowerPC
I like how they are demonstrating the Star Wars demo on a tablet...

By the time this comes out, most games will probably be streamed to us with a subscription from some cloud-based platform somewhere. We live in interesting times.
Posted on Reply
#22
Fluffmeister
Omwe said: It took 4 NVLinked Tesla V100's to run that Star Wars demo at 1080p24:

It didn't really.

So the argument goes from "looks great, but OMG, four Volta DGXs, expensive, FU Nvidia" to:

"OMG, it didn't really need an expensive DGX Volta monster... FU Nvidia."
Posted on Reply
#23
StrayKAT
This was the kind of demo I was hoping to see (that Metro thing Nvidia had was underwhelming).
PowerPC said: I like how they are demonstrating the Star Wars demo on a tablet...

Just by the time this comes out, most games will probably be streamed to us with a subscription from some cloud based platform somewhere. We live in interesting times.
If that were the case, Nvidia would be out of business. I'm pretty sure they don't want that. :P
Posted on Reply
#24
Xzibit
Fluffmeister said: It didn't really.


So the argument goes from looks great but OMG four Volta DGX, expensive FU Nvidia... to:

OMG it didn't really need expensive DGX Volta monster... FU Nvidia.
Reflection demo vs PICA PICA demo

The tweet = "We don't need a $50,000 station, just $12,000 worth of GPUs"

Even Nvidia showed it during the GeForce RTX presentation.

Kind of hard to dispel it when they ran the demo and a few months later Nvidia's CEO showed a slide saying as much. Even if Turing was used, you'll notice 45 ms per frame is still under 30 fps.
Posted on Reply
#25
Caring1
I'm not seeing any videos or links in the O.P.
Posted on Reply