Thursday, October 17th 2019
Intel and Wargaming Join Forces to Deliver Ray Tracing to World of Tanks
Intel has been very serious about its efforts in computer graphics lately, mainly because of its plans to launch a dedicated GPU lineup and bring new features to the graphics card market. Today, Intel and Wargaming, the maker of MMO titles like World of Tanks, World of Warships, and World of Warplanes, partnered to bring ray tracing to Wargaming's "Core" graphics engine, used in perhaps the best-known MMO title of them all: World of Tanks.
The joint efforts of Intel's and Wargaming's developers have led to an implementation of ray tracing that uses only software techniques, with no need for special hardware. Being hardware agnostic, it works on any graphics card that can run DirectX 11, as the "Core" engine is built on the DirectX 11 API. To achieve this, the developers created a solution that uses the CPU for fast, multi-threaded bounding volume hierarchy (BVH) construction, which then feeds the GPU's compute shaders that perform the actual ray tracing, making the feature entirely dependent on GPU shader cores. Several features have been reworked, with the emphasis on shadow quality. In the images below you can see exactly what difference the new ray-tracing implementation makes, and you can use almost any graphics card to get it. Wargaming notes that only "some FPS" will be sacrificed when ray tracing is turned on, so your GPU shouldn't struggle too much.
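To make that CPU/GPU split a bit more concrete, here is a minimal, illustrative sketch of the general approach (our own toy code, not Wargaming's): the CPU builds a bounding volume hierarchy over object bounding boxes each frame and flattens it into an array that a DirectX 11 compute shader could then traverse for shadow rays. All structure names and the node layout here are assumptions.

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

struct AABB { float min[3], max[3]; };

struct BVHNode {                 // flat, GPU-friendly node layout
    AABB bounds{};
    int  leftChild  = -1;        // -1 means this node is a leaf
    int  rightChild = -1;
    int  firstObject = 0;        // leaf payload: a range into objIndices
    int  count       = 0;
};

// Median-split BVH build. The real engine would do this on several CPU
// threads every frame (tanks move); the sketch stays single-threaded.
static int Build(std::vector<BVHNode>& nodes, std::vector<int>& objIndices,
                 const std::vector<AABB>& objects, int first, int count)
{
    const int nodeIndex = (int)nodes.size();
    nodes.push_back(BVHNode{});

    // Grow the node bounds over its object range.
    AABB box = objects[objIndices[first]];
    for (int i = 1; i < count; ++i) {
        const AABB& b = objects[objIndices[first + i]];
        for (int a = 0; a < 3; ++a) {
            box.min[a] = std::min(box.min[a], b.min[a]);
            box.max[a] = std::max(box.max[a], b.max[a]);
        }
    }
    nodes[nodeIndex].bounds = box;

    if (count <= 2) {                            // small range: make a leaf
        nodes[nodeIndex].firstObject = first;
        nodes[nodeIndex].count = count;
        return nodeIndex;
    }

    // Split at the median object centre along the widest axis.
    int axis = 0;
    const float ext[3] = { box.max[0] - box.min[0],
                           box.max[1] - box.min[1],
                           box.max[2] - box.min[2] };
    if (ext[1] > ext[axis]) axis = 1;
    if (ext[2] > ext[axis]) axis = 2;

    auto centre = [&](int i) { return 0.5f * (objects[i].min[axis] + objects[i].max[axis]); };
    std::nth_element(objIndices.begin() + first,
                     objIndices.begin() + first + count / 2,
                     objIndices.begin() + first + count,
                     [&](int a, int b) { return centre(a) < centre(b); });

    const int half  = count / 2;
    const int left  = Build(nodes, objIndices, objects, first, half);
    const int right = Build(nodes, objIndices, objects, first + half, count - half);
    nodes[nodeIndex].leftChild  = left;
    nodes[nodeIndex].rightChild = right;
    return nodeIndex;
}

int main()
{
    std::vector<AABB> objects = {                // stand-ins for tank bounding boxes
        {{0, 0, 0}, {1, 2, 1}}, {{5, 0, 3}, {6, 2, 4}},
        {{2, 0, 7}, {3, 2, 8}}, {{8, 0, 1}, {9, 2, 2}},
    };
    std::vector<int> objIndices(objects.size());
    for (int i = 0; i < (int)objects.size(); ++i) objIndices[i] = i;

    std::vector<BVHNode> nodes;
    Build(nodes, objIndices, objects, 0, (int)objects.size());

    // In the engine, 'nodes' and 'objIndices' would now be uploaded to
    // structured buffers so a DX11 compute shader can traverse the tree and
    // cast shadow rays, keeping the tracing itself on the GPU's shader cores.
    std::printf("Built a BVH with %zu nodes\n", nodes.size());
    return 0;
}
```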
72 Comments on Intel and Wargaming Join Forces to Deliver Ray Tracing to World of Tanks
To me it's like PhysX, which was supposed to be a huge hit; it might become one or it might not... time will tell.
Although one thing I find interesting in this news is that the RT here is partly CPU-driven (the BVH is built on the CPU).
If Nvidia had released a line of GTX 2000 cards without the RT stuff in them, at the normal, non-insane prices of old, then it would be fine.
But they don't, so now you are stuck with inflated prices for hardware you don't even want but have to pay for; that really is the problem here.
Yes, we remember when cell phones and SSDs were new. Did we buy those? Nope; instead we said they were too expensive and, just like here, argued that we did not need them, which at the time was correct, as everything in our world was built around not having those new gadgets yet. Same now with RTX, where we can say it's just not worth it.
Being an early adopter is just pretty much always a poor choice.
By the time RT is worth it and games are built from the ground up around it, prices will have come down a lot and quality will have gone up, but we all know this.
If RTRT is killing performance... then why is it being implemented now, when the hardware itself is not even ready for it? And yet you are paying through the nose for it...
Ultimately we all agree, but yeah, you should see why people don't care about RT, to the point of hating the attention it gets, and would rather have cheaper non-RT GPUs.
That, and this going-it-alone approach that Nvidia likes is not the way forward; they should work together with other companies and establish a standard to follow and improve on.
This is not comparable to other, past advances in graphics. It's a departure from what was an iterative and constant process of increasing efficiency and raw performance. RT adds a layer of gross inefficiency on top - conveniently timed with mainstream resolutions becoming very easy to render on rather cheap GPUs.
You said it right though! It's supposed to be expensive - RT is the means to that goal, indeed.
@ZoneDymo said it right. We don't fight RT; we are struggling to see the added value versus the added cost.
Time will tell if the tech ever gets wings. So far it hasn't, except in marketing and announcements. There is not a single truly jaw-dropping live product out there today. And that's not due to lack of attention - RT news is near clickbait-grade material.
What concerns me is that people are judging RTRT as if what we are seeing right now is all that we will get. This is only the tip of the iceberg.
Cautious optimism, that's about the best way to approach this.
I am thinking a little about the future here. We need a unified shader model, with shaders able to run the ray tracing calculations efficiently as well. Vulkan 2 or DirectX 13, maybe? Then every game developer could manage ray tracing however they like. This brings me to Intel's Larrabee, which was about as general-purpose as a GPU gets. It was, to simplify, just a bunch of x86 cores packed together. And there were some interesting ray tracing projects on it too.
Quake Wars Raytracing
Current ray tracing can still be used for adventure games. We only need 30 FPS for those. And a fully ray-traced adventure game based on this technology would look great :)
The current ray tracing attempts are difficult: companies try to pack it somehow into rasterized engines (WoT and the RTX promo games). There are better 3D model implementations for it, which come with their own advantages and disadvantages. Building a ray-traced game from the ground up, with the model and texture structures ray tracing prefers, might require some thinking outside the box. And if you want to support rasterization at that stage, you probably have to build two different games.
I hope Nvidia drops its proprietary RTX tech and goes with DXR and Vulkan ray tracing, just the same way AMD dropped Mantle for DX12 and Vulkan. This way we might all benefit from better, fully ray-traced games :)
The background thinking here is hardware PhysX: games could only add some physics simulation with it, but could not be built around it. A game developer would never cripple its sales by tying the game to one company's hardware.
Take this smack to the face then: scene-wide RT shadows in Shadow of the Tomb Raider, running at 60 FPS, 1440p, max settings on a single RTX 2070. This is the power of hardware RT! Go back to your dark cave of software RT that can't even draw some limited tank shadows without demolishing FPS.
It's worse in BF5, where RT is only used for reflections. Even with such limited RT, the current GPUs just aren't good enough.
There is an alternative to software or hardware RT: NO RT in 2019.
I like how the only thing you can convince yourself with is how poorly games run with RT on. Oh, this card runs LESS BAD! :laugh:
The major caveat, in my opinion, is that Nvidia half-assed the implementation big time.
We know that rendering with RTRT can deliver better illumination, shadow casting, and reflections than any comparable rasterized renderer could; in some cases it even enhances other rendering techniques, e.g. subsurface scattering or physically based textures, quite a bit.
But neither the current software nor the current RTX hardware can deliver that.
Battlefield V / Metro Exodus had to dial every RTX effect back to achieve fluid 60+ FPS at 1080p with the 2080 Ti / RTX Titan.
Shadow of the Tomb Raider only implemented RTX shadows.
For RTRT to clearly up the ante compared to current rasterized rendering, you need the full suite:
Global illumination with a sufficient number of sample rays and bounces to get proper lighting, shadows, and reflections at 60+ FPS.
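To illustrate why that is so expensive (a toy sketch of the general idea only, not any engine's or vendor's code): every pixel has to trace many sample paths, each with several bounces, so the ray count scales roughly with resolution × samples × bounces, before any denoising is even applied.

```cpp
#include <cmath>
#include <cstdio>
#include <random>

struct Vec { float x, y, z; };
static Vec   vadd(Vec a, Vec b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec   vmul(Vec a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float vdot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Toy "scene": a single sphere. Returns hit distance along d, or -1 for a miss.
static float hitSphere(Vec o, Vec d, Vec c, float r) {
    Vec oc = vadd(o, vmul(c, -1.0f));
    float b = vdot(oc, d);
    float disc = b * b - (vdot(oc, oc) - r * r);
    if (disc < 0.0f) return -1.0f;
    float t = -b - std::sqrt(disc);
    return t > 1e-3f ? t : -1.0f;
}

int main() {
    const int SAMPLES = 64;   // paths per pixel: the main quality/cost knob
    const int BOUNCES = 3;    // max diffuse bounces per path
    const Vec   sphereC = {0.0f, 0.0f, -3.0f};
    const float sphereR = 1.0f;

    std::mt19937 rng(42);
    std::uniform_real_distribution<float> uni(-1.0f, 1.0f);

    // Estimate the brightness of one pixel whose camera ray looks down -Z.
    float radiance = 0.0f;
    for (int s = 0; s < SAMPLES; ++s) {
        Vec o = {0.0f, 0.0f, 0.0f};
        Vec d = {0.0f, 0.0f, -1.0f};
        float throughput = 1.0f;
        for (int b = 0; b < BOUNCES; ++b) {
            float t = hitSphere(o, d, sphereC, sphereR);
            if (t < 0.0f) { radiance += throughput; break; }  // escaped: lit by a white "sky"
            // Move to the hit point and bounce in a random direction in the
            // hemisphere around the surface normal (a real renderer would
            // importance-sample the BRDF instead of sampling uniformly).
            o = vadd(o, vmul(d, t));
            Vec n = vmul(vadd(o, vmul(sphereC, -1.0f)), 1.0f / sphereR);
            Vec nd;
            do { nd = {uni(rng), uni(rng), uni(rng)}; } while (vdot(nd, nd) > 1.0f || vdot(nd, nd) < 1e-4f);
            if (vdot(nd, n) < 0.0f) nd = vmul(nd, -1.0f);      // flip into the hemisphere
            d = vmul(nd, 1.0f / std::sqrt(vdot(nd, nd)));
            throughput *= 0.5f;                                // 50% grey diffuse surface
        }
    }
    std::printf("pixel estimate: %.3f from %d paths x up to %d bounces\n",
                radiance / SAMPLES, SAMPLES, BOUNCES);
    return 0;
}
```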
RTX, unfortunately, is not optimized enough to deliver that with the current RTX cards, especially since only the upper range of cards (2080, 2080 SUPER, 2080 Ti, RTX Titan) can push the current RTX implementations at proper frame rates and resolutions.
Add to that, it is a feature proprietary to Turing, and the 20xx series received an extra bump in price on top of the last generations' steadily rising prices.
You are really hard-pressed to applaud Nvidia for their innovation here.
They delivered RTRT, yes, but at a mediocre level compared with traditional rasterized rendering, and at a steep price premium.
One can only hope that the somewhat bad impression Nvidia has made on RTRT by trying to corner the market with RTX gets dispelled by the coming solutions from Intel and AMD.
Even Nvidia believes that RTRT will not be ready until 2023.
As for Quake II RTX: a game from 1997 that requires a 2070 Super just to run at 1080p 60 FPS. Nvidia made this themselves, so there is no "the devs don't know how to make games" argument.
Also when you act like you know everything, and start calling people names, you better make sure what you say is true.
Minecraft RTX was announced in August 2019 and has no solid release date.
Also, AMD did not necessarily drop Mantle to make way for DX12 and Vulkan. AMD's initial plan was to keep Mantle at the forefront of modern 3D APIs, continuing to exist alongside DX and Vulkan. The problem is that AMD wanted to handle Mantle exactly the way Nvidia handles CUDA: AMD would have full control of Mantle's development, while others could take Mantle and work around it to make it run on their hardware. Mantle would always be built for AMD's GCN hardware strengths first. Game developers probably wanted to see Mantle handled exactly like Microsoft's DX or Khronos Group's OpenGL, where every IHV has a hand in shaping the API spec, because they knew that if AMD did not fully open Mantle's development to other IHVs, those IHVs were never going to fully support the API.
RT is a big toolbox. And the problem is, if you're only using part of it, it's no longer truly RT, it's just a fancy post effect. All the examples you gave are one-trick ponies... with 'optimal' hardware for it.
I'm not a huge fan of him, but Raja said it very right in some interview - RT shouldn't cost the end user extra money. Today, it does. You're paying less for a non-RT enabled GPU in both camps. This tech will ONLY survive in gaming if it gets shoved in slowly and gently, as harmless as possible. So far, Nvidia's Turing approach is more akin to someone stomping you in the face because 'It just works'.
That's a rather strange definition of 'now'. And an even stranger one of 'thriving'. If 'thriving' equals the amount of marketing/press announcements you get fed with... I guess VR rules the planet by now and RT is a close second. Maybe after 'AI', though.
RT GI void of defects? That's cool stuff, with a limited number of rays. It has a resolution much like any rasterized affair, handily helped out by algorithms to make it look better. Don't let the marketing get to your head - and when in doubt, fire up some RTX on low/medium for proof. Dynamic lighting is also not very new to us. The only real win is that it's less work to develop, because it is based on set rules; you're not left creating the rules yourself. That's about all the 'win' you will really have for the near future. The quality improvement only comes many years later, and only on high-end GPUs - definitely not next-gen consoles.
Still, it will be interesting to see what they are capable of squeezing out of that new console with an AMD GPU. So far we know just about nothing about that, but we do know Nvidia needs a massive die to get somewhat playable 'effects' going. So... I guess that 1080p 60 FPS PlayStation fun will be over soon. 25-30 FPS seems to be the golden target again.