Thursday, October 17th 2019

Intel and Wargaming Join Forces to Deliver Ray Tracing to World of Tanks

Intel has been very serious about its efforts in computer graphics lately, mainly because of its plans to launch a dedicated GPU lineup and bring new features to the graphics card market. Today, Intel and Wargaming, the maker of MMO titles like World of Tanks, World of Warships, and World of Warplanes, announced a partnership to bring a ray tracing feature to Wargaming's "Core" graphics engine, used in perhaps the best-known MMO title of them all - World of Tanks.

The joint efforts of Intel and Wargaming developers have led to an implementation of ray tracing that uses only regular software techniques, with no need for special hardware. Being hardware agnostic, this implementation works on any graphics card that can run DirectX 11, as the "Core" engine is written against the DirectX 11 API. To achieve this, the developers built a solution that uses the CPU for fast, multi-threaded bounding volume hierarchy (BVH) construction, which then feeds the GPU's compute shaders that perform the actual ray tracing, making the feature entirely dependent on the GPU's shaders/cores. Several features were reworked, with the emphasis put on shadow quality. In the images below you can see exactly what difference the new ray-tracing implementation makes, and you can use almost any graphics card to get it. Wargaming notes that only "some FPS" will be sacrificed when ray tracing is turned on, so your GPU shouldn't struggle too much.
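To make the described pipeline a bit more concrete, here is a minimal, hypothetical C++ sketch of the general idea - a BVH built on the CPU across a few threads, which in the real engine would then be flattened and uploaded for the DirectX 11 compute shaders to traverse. The data layout, names and scene are purely illustrative assumptions, not Wargaming's actual code.

```cpp
#include <algorithm>
#include <cstdio>
#include <future>
#include <memory>
#include <numeric>
#include <vector>

struct AABB {
    float min[3], max[3];
    void grow(const AABB& b) {
        for (int i = 0; i < 3; ++i) {
            min[i] = std::min(min[i], b.min[i]);
            max[i] = std::max(max[i], b.max[i]);
        }
    }
    float center(int axis) const { return 0.5f * (min[axis] + max[axis]); }
};

struct Node {
    AABB bounds{};
    std::unique_ptr<Node> left, right;
    int primitive = -1;                 // leaf: index of the primitive it covers
};

// Top-down median-split build; the upper levels build their two halves on
// separate threads, which is the "fast, multi-threaded BVH" part of the idea.
std::unique_ptr<Node> Build(std::vector<int>& prims, const std::vector<AABB>& boxes,
                            int begin, int end, int depth) {
    auto node = std::make_unique<Node>();
    node->bounds = boxes[prims[begin]];
    for (int i = begin + 1; i < end; ++i) node->bounds.grow(boxes[prims[i]]);

    if (end - begin == 1) { node->primitive = prims[begin]; return node; }

    int axis = depth % 3;
    int mid = (begin + end) / 2;
    std::nth_element(prims.begin() + begin, prims.begin() + mid, prims.begin() + end,
                     [&](int a, int b) { return boxes[a].center(axis) < boxes[b].center(axis); });

    if (depth < 3) {  // spawn threads only near the root; the halves touch disjoint ranges
        auto left = std::async(std::launch::async,
                               [&prims, &boxes, begin, mid, depth] { return Build(prims, boxes, begin, mid, depth + 1); });
        node->right = Build(prims, boxes, mid, end, depth + 1);
        node->left  = left.get();
    } else {
        node->left  = Build(prims, boxes, begin, mid, depth + 1);
        node->right = Build(prims, boxes, mid, end, depth + 1);
    }
    return node;
}

// In the real engine the finished tree would be flattened into a linear,
// GPU-friendly array and uploaded as a structured buffer for the DirectX 11
// compute shaders that do the ray traversal; here we just count the nodes.
int Count(const Node* n) { return n ? 1 + Count(n->left.get()) + Count(n->right.get()) : 0; }

int main() {
    // Hypothetical "scene": a row of axis-aligned boxes standing in for tank meshes.
    std::vector<AABB> boxes;
    for (int i = 0; i < 64; ++i)
        boxes.push_back({{float(i), 0.0f, 0.0f}, {float(i) + 1.0f, 2.0f, 1.0f}});

    std::vector<int> prims(boxes.size());
    std::iota(prims.begin(), prims.end(), 0);

    auto root = Build(prims, boxes, 0, int(prims.size()), 0);
    std::printf("BVH built: %d nodes over %zu primitives\n", Count(root.get()), boxes.size());
    return 0;
}
```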

72 Comments on Intel and Wargaming Join Forces to Deliver Ray Tracing to World of Tanks

#51
GreiverBlade
64KMy point is that it's expensive because the hardware is new just like with past new hardware. You've been around long enough to see that many times over.

We don't go straight to this overnight:

I fully know that, and in the movie industry it has its uses; as you said, for games, for now ... it's not there yet :laugh: (what was the hardware cost/need behind that demo?)

To me it's like PhysX, which was supposed to be a huge hit; it might become one or it might not ... time will tell.


Although one thing I find interesting in this news is that the RT is partly CPU driven.
#52
ZoneDymo
64KIt's embarrassing seeing people who normally embrace tech advancement fighting against RTRT so hard.
Of course hardware that improves handling RTRT is too expensive. It's supposed to be. Remember when cell phones were new or SSDs?
RTRT kills the performance of my GPU. It's supposed to. It's for the future. It's not for past GPUs.
Nvidia is taking control of RTRT with RTX GPUs. They may try but they will fail. AMD already has a hardware solution for handling RTRT better with the upcoming PS5. Intel has a solution as well.
Developers are slow to embrace RTRT and some implementations are sloppy. Time takes time.
To me it looks like a good thing for gamers eventually but there are of course some rough spots right now.
People don't really fight RT; people fight the high prices that come with it.
If Nvidia released a line of GTX 2000 cards without the RT stuff, at the normal, non-insane prices of old, then it would be fine.
But they don't, so now you are stuck with inflated prices for hardware you don't even want but have to pay for; that really is the problem here.

Yes, we remember when cell phones and SSDs were new, and did we buy them? Nope. Instead we said they were too expensive and, just like here, made arguments that we did not need them, which at the time was correct, because everything in our world was built around not having those new gadgets yet. Same now with RTX, where we can say it's just not worth it.
Being an early adopter is just pretty much always a poor choice.
By the time RT is worth it and games are built from the ground up using it, the prices will have gone down a lot and the quality will have gone up, but we all know this.

If RTRT is killing performance... then why is it implemented now, when the hardware itself is not even ready for it? And yet you are paying through the nose for it...

Ultimately we all agree, but yeah, you should see why people don't care about RT, to the point of hating the attention it gets, and would rather have had cheaper non-RT GPUs.
That, and this going at it alone that Nvidia likes to do is not the way to go; they should work together with other companies and establish a standard to follow and improve on.
#53
Vayra86
64KMy point is that it's expensive because the hardware is new just like with past new hardware. You've been around long enough to see that many times over.

We don't go straight to this overnight:

Yes, we do; we've done this a million times already. Now we try to brute force it in real time and sell it like it's new.

This is not comparable to other, past advances in graphics. It's a departure from what was an iterative and constant process of increasing efficiency and raw performance. RT adds a layer of gross inefficiency on top - conveniently timed with mainstream resolutions becoming very easy to render on rather cheap GPUs.

You said it right though! It's supposed to be expensive - RT is the means to that goal, indeed.

@ZoneDymo said it right. We don't fight RT; we are struggling to see the added value vs. the added cost.

Time will tell if the tech ever gets wings. So far it hasn't, except in marketing and announcements. There is not a single truly jaw-dropping live product out there today. And that's not due to lack of attention - RT news is near clickbait-grade material.
#54
64K
ZoneDymoPeople dont really fight RT, people fight the high prices because of it.
If Nvidia released a line of GTX2000 cards that did not have the RT stuff in it for the normal non insane prices of old, then it would be fine.
But they dont, so now you are stuck with these inflated prices for hardware you dont even want but have to pay for, that really is the problem here.

Yes we remember when cell phones were new and SSD's, and did we buy those? nope, instead we said its too expensive and just like here made arguments we did not need them, which at the time was correct as everything in our world was build around not having those new gadgets yet, same now with RTX where we can say its just not worth it.
Being an early adopter is just pretty much always a poor choice.
By the time RT is worth it and games are build from the ground up using it the prices will have gone down a lot and quality will have gone up, but we all know this.

If RTRT is killing performance....then why is it implemented now? when the hardware itself is not even ready for it? and yet you are paying through the nose for it...

Ultimately we all agree though, but yeah, you should see why people dont care about RT to the point of hating on the attention it gets and rather had cheaper non RT gpu's
That and this going at it themselves that Nvidia likes to do, is not the way to go, they should work together with other companies and establish a standard to follow and improve on.
We have to start somewhere. It's like this: if there is no hardware that supports RTRT, then why would developers learn how to use it and implement it? The hardware had to come first, and it was going to be expensive whether it was released now or 20 years from now. New hardware is always expensive, partly due to the expensive R&D that went into developing it. The early adopters are paying for that R&D just like always. Prices will continue to come down, just like SSD prices came down. What is necessary is competition, and now Nvidia will be getting that from AMD and Intel as well. When I saw that article about Sony saying there would be a hardware solution for RTRT in the PS5, that was the end of my caution about whether or not RTRT was here to stay. With Nvidia, AMD, Intel and the console makers all involved in RTRT, there's nothing to stop it now.

What concerns me is that people are judging RTRT as if what we are seeing right now is all that we will get. This is only the tip of the iceberg.
#55
Vayra86
64KWe have to start somewhere. It's like this. If there is no hardware that supports RTRT then why would Developers learn how to use it and implement it? The hardware had to come first and it was going to be expensive whether it was released now or 20 years from now. New hardware is always expensive partly due to the expensive R&D that went into developing it. The early adopters are paying for that R&D just like always. Prices will continue to come down just like SSD prices came down. What is necessary is competition and now Nvidia will be getting that from AMD and Intel as well. When I saw that article about Sony saying there would be a hardware solution for RTRT in the PS5 that was the end of my caution about whether or not RTRT was here to stay. With Nvidia, AMD, Intel and the Console makers all involved in RTRT there's nothing to stop it now.

What concerns me is that people are judging RTRT as if what we are seeing right now is all that we will get. This is only the tip of the iceberg.
You can look at VR for a very recent confirmation that a console announcement does not carry a technology to popularity on the market. PSVR ran quite well, but is VR here to stay now? Nope...

Cautious optimism, that's about the best way to approach this.
#56
64K
Vayra86You can look at VR for a very recent confirmation that a console announcement does not carry its popularity on the market. PSVR ran quite well, but is VR here to stay now? Nope...
That's a poor comparison. If the console came with a mandatory VR headset and the customer had to pay for it whether they wanted to or not then it would be more like what Sony is doing. Sony will be putting chips in the PS5 that will support RTRT and the customer will be paying the cost whether they want RTRT or not. To me that means Sony believes RTRT is here to stay and there are benefits to the tech.
#57
renz496
ZoneDymoPeople dont really fight RT, people fight the high prices because of it.
If Nvidia released a line of GTX2000 cards that did not have the RT stuff in it for the normal non insane prices of old, then it would be fine.
But they dont, so now you are stuck with these inflated prices for hardware you dont even want but have to pay for, that really is the problem here.
Nah, even if Nvidia released the 2080 Ti without RT for $700, people would still find something to complain about.
ZoneDymoIf RTRT is killing performance....then why is it implemented now? when the hardware itself is not even ready for it? and yet you are paying through the nose for it...
Why now? Because if we keep waiting for the hardware to catch up, we will never be able to implement it... ever. Even if GPU makers pushed RT five years from now, we would still say the same thing, because at that point we would say "it is too early to implement RT because we cannot get 60 FPS at 8K (or even 16K) resolution even with the fastest GPU".
ZoneDymoUltimately we all agree though, but yeah, you should see why people dont care about RT to the point of hating on the attention it gets and rather had cheaper non RT gpu's
That and this going at it themselves that Nvidia likes to do, is not the way to go, they should work together with other companies and establish a standard to follow and improve on.
That standard already exists; it's called MS DXR.
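As a rough illustration of what "supporting that standard" looks like from the application side, here is a minimal, hypothetical C++ check for DXR support through the stock D3D12 API. It assumes a Windows build environment with the D3D12 headers and is not taken from any of the products discussed here.

```cpp
#include <cstdio>
#include <d3d12.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    // Create a D3D12 device on the default adapter.
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device)))) {
        std::puts("No D3D12-capable device found.");
        return 1;
    }

    // DXR support is reported through the OPTIONS5 feature block.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    const bool dxr =
        SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5, &opts5, sizeof(opts5)))
        && opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;

    std::puts(dxr ? "DXR (DirectX Raytracing) is supported by this device/driver."
                  : "DXR is not supported here; a fallback or software path would be needed.");
    return 0;
}
```

If the reported tier is below D3D12_RAYTRACING_TIER_1_0, an engine would fall back to rasterized effects or to a software path like the one Wargaming describes in the article above.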
#58
Aerpoweron
FiendishOne of those games is using RT for full Global Illumination, the other is using RT for limited shadow effects, the fact that the performance hit is similar should give you a very good idea how much better dedicated hardware is. The Crytek demo ended up showing the same thing when they noted how much better performing RTX cards would be if the dedicated hardware was used.

This SIGGRAPH presentation showed some of the numbers regarding how much dedicated hardware is boosting RT performance, it's substantial.



The dedicated hardware itself, takes up a relatively tiny amount of die space, both the Tensor cores and the RT cores together only account for ~9% of the die space on current RTX cards and of that 9%, the RT cores represent only about a third of the amount. So assuming no other bottlenecks, you're getting 2x-3x times performance for a ~3% die space cost.

A lot of people think RT is too expensive even WITH the performance boost from dedicated hardware. Seems quite clear that going forward dedicated RT hardware is indeed a "must have" for real time RT, if it's ever going to be taken seriously at least.
I think there is some kind of mismatch between the power needed for RTX and what was expected. When you run games with RTX enabled, the power draw goes down. So the RT cores are being used, but the rest of the chip cannot be fully utilized. The RT cores should sit within the shader pipeline. With the power draw going down, the 3% of die space for them might not be enough to keep the rest of the shader pipeline fully occupied.

I am thinking a little into the future here. We need a unified shader model, with shaders able to run the ray tracing calculations efficiently as well. Vulkan 2 or DirectX 13, maybe? That way every game developer could manage the ray tracing how they like. This brings me to Intel Larrabee, which was a GPU as general-purpose as it gets. It was, to simplify it, just a bunch of x86 cores packed together. And there were some interesting ray tracing projects with it too:

Quake Wars Raytracing
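As a toy illustration of the kind of math such general-purpose cores crunch when ray tracing - entirely hypothetical and unrelated to Larrabee's or any shipping engine's code - here is a single shadow ray tested against one sphere occluder in C++:

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 {
    float x, y, z;
    Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
    float dot(const Vec3& o) const { return x * o.x + y * o.y + z * o.z; }
    float length() const { return std::sqrt(dot(*this)); }
    Vec3 normalized() const { float l = length(); return {x / l, y / l, z / l}; }
};

// Returns true if the ray origin->dir hits the sphere before maxDist,
// i.e. something blocks the path from the surface point to the light.
bool ShadowRayHitsSphere(Vec3 origin, Vec3 dir, Vec3 center, float radius, float maxDist) {
    Vec3 oc = origin - center;
    float b = oc.dot(dir);                  // dir is assumed to be normalized
    float c = oc.dot(oc) - radius * radius;
    float disc = b * b - c;
    if (disc < 0.0f) return false;          // ray misses the sphere entirely
    float t = -b - std::sqrt(disc);         // nearest intersection distance
    return t > 1e-4f && t < maxDist;        // occluder sits between point and light
}

int main() {
    Vec3 surfacePoint{0.0f, 0.0f, 0.0f};
    Vec3 light{0.0f, 10.0f, 0.0f};
    Vec3 occluder{0.0f, 5.0f, 0.0f};        // a sphere hanging between them

    Vec3 toLight = light - surfacePoint;
    bool shadowed = ShadowRayHitsSphere(surfacePoint, toLight.normalized(),
                                        occluder, 1.0f, toLight.length());
    std::printf("Point is %s\n", shadowed ? "in shadow" : "lit");
    return 0;
}
```

A real renderer fires millions of such rays per frame against a full acceleration structure, which is exactly why the hardware question matters so much.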

The current ray tracing can still be used for adventure games; we only need 30 FPS for those. And a fully ray-traced adventure game based on that technology would look great :)

The current ray tracing attempts are difficult. Companies try to pack it somehow into rasterized engines (WoT and the RTX promo games). There are 3D model representations better suited to it, which come with their own advantages and disadvantages. Building a ray-traced game from the ground up, with the model and texture structures that ray tracing prefers, might require some thinking out of the box. And if you want to support rasterization at that stage, you probably have to build two different games.

I hope that Nvidia drops its proprietary RTX tech and goes with DXR and Vulkan ray tracing, just the same way AMD dropped Mantle for DX12 and Vulkan. This way we might all benefit from better, fully ray-traced games :)
The background thinking here is the hardware PhysX games, which could only add physics simulation on top but could not build the game around it. A game developer would never cripple its sales by going with one company's hardware.
#59
MuhammedAbdo
ZubasaThe power of RT is not able to play on 60 fps in 2019 with a $500+ card, nice moral victory you have there.
All the % is nice in theory, but WoT is a PVP game that can get rather competitive, so no one will sacrifice even 10% fps, let alone 55%.
As for limited effects, all implmentations of Ray Tracing is very minimal given the horrible performance of even Hardware based RT.
To the point where it isn't hard to see noise / artifacts in the lighting. Practically the RT cores are just wasted die space for this generation of GPUs.
So here is a reality pill for you.
Don't like that Metro Exodus uses RT GI which is the heaviest form of RT there is? The kind that was thought to be impossible to run on a single GPU?!

Take this smack to the face then: scene-wide RT shadows in Shadow of the Tomb Raider, running at 60 fps, 1440p, max settings on a single RTX 2070. This is the power of hardware RT! Go back to your dark cave of software RT that can't even draw some limited tank shadows without demolishing fps.

#60
Zubasa
MuhammedAbdoDon't like that Metro Exodus uses RT GI which is the heaviest form of RT there is? The kind that was thought to be impossible to run on a single GPU?!

Take this smack to the face then: scene wide RT shadows in Shadow of Tomb Raider, running @60fps 1440p max settings on a single RTX 2070, This is the power of hardware RT! Go back to your dark cave of software RT that can't even draw some limited tank shadows without demolishing fps.

Not so heavy when the current RTX runs such a low ray count that I can see artifacts in the lighting quite often without even trying to look for them.
This is worse in BF5, where RT is only used for reflections. Even with such limited RT, the current GPUs just aren't good enough.
There is an alternative to software or hardware RT: NO RT in 2019.
I like how the only thing you can convince yourself with is how poorly games run with RT on: oh, this card runs LESS BAD! :laugh:
#61
Testsubject01
64KIt's embarrassing seeing people who normally embrace tech advancement fighting against RTRT so hard.

Of course hardware that improves handling RTRT is too expensive. It's supposed to be. Remember when cell phones were new or SSDs?

RTRT kills the performance of my GPU. It's supposed to. It's for the future. It's not for past GPUs.

Nvidia is taking control of RTRT with RTX GPUs. They may try but they will fail. AMD already has a hardware solution for handling RTRT better with the upcoming PS5. Intel has a solution as well.

Developers are slow to embrace RTRT and some implementations are sloppy. Time takes time.

To me it looks like a good thing for gamers eventually but there are of course some rough spots right now.
I can only speak for myself, and to clarify: I do believe that RTRT belongs in future 3D render engines.
The major caveat, in my opinion, is that Nvidia half-assed the implementation big time.

We know that rendering with RTRT can deliver better illumination, shadow casting and reflections than any comparable rasterized renderer could; in some cases it even enhances other rendering techniques, e.g. subsurface scattering or physically based rendering, quite a bit.

But neither the current software nor the current RTX hardware delivers that.

Battlefield V / Metro Exodus had to tune every RTX effect back to achieve fluid 60+ fps at 1080p with the 2080 Ti / RTX Titan.
Shadow of the Tomb Raider only implemented RTX shadows.

For RTRT to clearly up the ante compared to current rasterized rendering, you need the full suite:
Global illumination with a sufficient number of sample rays and bounces to get proper lighting, shadows, and reflections at 60+ fps.

RTX, unfortunately, is not optimized enough to deliver that with the current RTX cards, especially since only the upper range of cards (2080, 2080 Super, 2080 Ti, RTX Titan) can push the current RTX implementations at proper framerates and resolutions.
Add to that the fact that it is a feature proprietary to Turing, and that the 20xx series received an extra bump in price on top of the last generations' steadily rising prices.

You are really hard-pressed to applaud Nvidia for their innovation here.
They delivered RTRT, yes, but at a mediocre level against traditional rasterized rendering, and with a steep price premium.

One can only hope that the somewhat bad impression Nvidia has left on RTRT by trying to corner the market with RTX gets dispelled by the coming solutions from Intel and AMD.
#62
Zubasa
Testsubject01I can only speak for myself and to clarify, I do believe that RTRT belongs in future 3D render engines.
The major caveats, in my opinion, are that NVidia halfassed the implementation big time.

We know, that rendering with RTRT can deliver better illumination, shadow casting and reflections than any comparable rasterized renderer could, in some cases it even enhances other render technics, like e.g subsurface scattering or physically based rendered textures quite a bit.

But neither the current software does nor the current RTX hardware could.

Battlefield 5 / METRO Exodus had to tune every RTX effect back, to achieve fluid 60+ fps @ 1080p with the 2080ti / RTX TITAN.
Shadow of the Tomb Raider only implemented RTX shadows.

For RTRT to clearly up the ante compared to current rasterized rendering you need the full suite:
Global illumination with a sufficient amount of sample rays and bounces to get proper lighting, shadows, and reflections at 60+ fps.

RTX, unfortunately, is not optimized enough to deliver that with the current RTX cards, especially since only the upper range of cards (2080, 2080SUPER, 2080 TI, RTX TITAN) could push the current RTX implementations at proper framerates and resolution.
Add to it, that it is an proprietary feature to Turing and the 20xx series received an extra bump in price over the last generations steadily rising prices.

You get really hard-pressed, to applaud Nvidia for their innovation here.
They delivered RTRT, yes, but at a mediocre level against traditional rasterized rendering with a steep price premium.

One can only hope, that the somewhat bad impression Nvidia printed on RTRT in trying to corner the market with RTX gets dispelled with the coming solutions from Intel and AMD.
Pretty much this; we know real-time ray tracing might be great in the future, but somehow there are so many people who believe the future means 2018 and not 2023.
Even Nvidia believes that RTRT will not be ready until 2023.
#63
MuhammedAbdo
Testsubject01Battlefield 5 / METRO Exodus had to tune every RTX effect back, to achieve fluid 60+ fps @ 1080p with the 2080ti / RTX TITAN.
Metro and Battlefield V run at 1440p @ 60 fps on a 2080 Ti.
ZubasaNot so heavy when current RTX runs such a low ray count that I can see artificacts in the lighting quite often without trying to look for them.
It"s still miles ahead of any software RT solution, and it delivers massive IQ increases regardless of low ray count or whatever you convince yourself of.
ZubasaThis is worse on BF5 which is only used on reflections. Even with such limited RT, the current GPUs just aren't good enough.
Testsubject01Battlefield 5 / METRO Exodus had to tune every RTX effect back, to achieve fluid 60+ fps @ 1080p with the 2080ti / RTX TITAN.
Shadow of the Tomb Raider only implemented RTX shadows.
Battlefield V was just the first game to use RTX; developers have learned a lot since. Control used reflections + shadows + GI, and it ran and looked wonderful. We now have full path tracing for all shadows, reflections and lighting in games such as Quake 2 and Minecraft.
#64
Zubasa
MuhammedAbdoBattlefield V was just the first game to use RTX, developers learned a lot after it. Control used Reflections + Shadows + GI and it ran and looked wonderfully. We now have full Path tracing for all shadows, reflections and lighting for games such as Quake 2 and Minecraft.
Minecraft is exactly the "filthy non-RTX ray tracing / path tracing" that you hated so much.
As for Quake II RTX, that is a game from 1997 that requires a 2070 Super just to run at 1080p 60 FPS. Nvidia made this one themselves, so there is no "the devs don't know how to make games" argument.
#65
MuhammedAbdo
ZubasaMinecraft is the exactly the "filthy non-RTX Ray Tracing / Path Tracing " that you hated so much.
Minecraft is accelerated by RTX, genius; the non-RTX version is both of lower quality and lower performance.
#66
Zubasa
MuhammedAbdoMinecraft is accelerated by RTX genius, the non RTX version is both of lower quality and lower performance.
Oh really? There are many versions of Minecraft, and they are not really compatible with each other.
Also when you act like you know everything, and start calling people names, you better make sure what you say is true.
Minecraft RTX was announced in August 2019 and has no solid release date.
#67
renz496
AerpoweronThe current raytracing attemps are difficult. Companies try to pack it somehow into rasterizded engines. (WOT and RTX promo games). There are better 3d model implimentations for it, which come with it's own advantages and disadvantages. Building a raytraced game from the ground up, with the raytraced preferred model and texture structures might include some thinking out of the box. And if you want to support rasterization at that stage, you probably have to built 2 different games.
Our hardware is still very far from getting to that point, maybe 10 to 15 years away. Right now what they are trying to push is hybrid rendering, which combines RT and rasterization. In Nvidia's vision, they expect triple-A devs to completely ditch baked effects in favor of RT solutions in the 2023 time frame, and that is just triple-A; it will still take a few years after that for indie developers to adopt this approach.
AerpoweronI hope that Nvidia is dropping it's properitary RTX tech and goes with DXR and Vulcan ractracing. Just the same way AMD dropped Mantle for DX12 and Vulcan. This way me might all benefit from better, fully raytraced games :)
Background thinking was hardware PhysiX games, which could only add physical simulation, but could not built a game around it. A game developer would never cripple it's sales to go with one companys hardware.
IHV implementations will always be proprietary, even if they build their solutions on existing open standards. In Nvidia's case, they don't need to drop anything from RTX; their RTX solution already works with the existing open standards (Microsoft DirectX DXR and Vulkan). This time around Nvidia was smart not to push it via a proprietary API (like GPU PhysX needing CUDA) and made its RTX implementation compliant with the existing open standards from the get-go.

Also, AMD did not necessarily drop Mantle to make way for DX12 and Vulkan. AMD's initial plan was to make Mantle a leading modern 3D API that would continue to exist alongside DX and Vulkan. The problem is that AMD wanted to handle Mantle exactly the same way Nvidia handles CUDA: AMD would have full control of Mantle's development, but others could take Mantle and work around it to make it run on their hardware. Mantle would always be built for the strengths of AMD's GCN hardware first. Game developers probably wanted to see Mantle handled exactly like MS DX/Khronos Group OpenGL, where every IHV has a hand in shaping the API spec, because they knew that if AMD did not fully open Mantle's development to other IHVs, those IHVs were never going to fully support the API.
#68
Vayra86
MuhammedAbdoDon't like that Metro Exodus uses RT GI which is the heaviest form of RT there is? The kind that was thought to be impossible to run on a single GPU?!

Take this smack to the face then: scene wide RT shadows in Shadow of Tomb Raider, running @60fps 1440p max settings on a single RTX 2070, This is the power of hardware RT! Go back to your dark cave of software RT that can't even draw some limited tank shadows without demolishing fps.

Heaviest there is? Global illumination has been done for decades on all sorts of GPUs. The fact that RT needs such an amount of horsepower for it - while still handling only a *single point light* - is utterly ridiculous. It does show us nicely how incredibly far away we still are from achieving any sort of realism required to truly make full RT stick in games. Because as long as it's not there, it will be fighting a much cheaper-to-render rasterized opposition - in screenshots, trailers, and all non-RT games.

RT is a big toolbox. And the problem is, if you're only using part of it, it's no longer truly RT, it's just a fancy post effect. All these examples you gave are one-trick ponies... with 'optimal' hardware for it.

I'm not a huge fan of him, but Raja said it very right in some interview - RT shouldn't cost the end user extra money. Today, it does. You're paying less for a non-RT enabled GPU in both camps. This tech will ONLY survive in gaming if it gets shoved in slowly and gently, as harmless as possible. So far, Nvidia's Turing approach is more akin to someone stomping you in the face because 'It just works'.
#69
MuhammedAbdo
Vayra86Heaviest there is? Global Illumination has been done for decades on all sorts of GPUs.
That's static GI, low precision and full of defects. RT GI is completely dynamic and is devoid of most defects.
Vayra86This tech will ONLY survive in gaming if it gets shoved in slowly and gently, as harmless as possible
It's already thriving; the next consoles have RT now, and all future games will have it.
Vayra86far away we still are to achieving any sort of realism required to truly make full RT stick in games.
We don't need full RT now; what we need is a hybrid solution, which is what RTX is and what the next consoles will provide.
#70
kapone32
renz496our hardware still very far to get to that point. maybe in 10 to 15 years. right now what they try to push is hybrid rendering which combine both RT and rasterization. in nvidia vision they expect triple A dev to completely ditch baked effect to be completely replaced with RT solution in 2023 time frame. and that is just triple A. it still require a few years after that for indie developer to use this approach.



IHV implementation will always be proprietary even if they were build their solution based on existing open standard. in nvidia case they don't need to drop anything from RTX. their RTX solution already work with existing open standard (MS Direct X DXR and Vulkan). this time around nvidia is smart not to push it via their proprietary API (like GPU PhysX needing CUDA) and make their RTX implementation to be compliant with existing open standard from the get go.

Also AMD did not drop Mantle necessarily to make way for DX12 and Vulkan. AMD initial plan was to make Mantle the forefront for modern 3D API that will continue to exist alongside DX and Vulkan. the problem is AMD want to handle Mantle exactly the same way Nvidia did for their CUDA: AMD have full control of Mantle development but others can take mantle and work around it to make mantle work on their hardware. Mantle will be always built for AMD GCN hardware strength first. Game developer probably want to see mantle to be handled exactly like MS DX/Khronos Group OpenGL where every IHV have their hands in shaping the API spec because they know if AMD did not fully open Mantle development to other IVH then other IHV will never going to fully support the API.
Um, as far as I know Mantle essentially became DX12, as all of its benefits were integrated into that API.
#71
candle_86
RT is like HDR was in 2004. Sure, the GeForce 6 could do it, but it brought the mighty 6800 Ultra to its knees. It wasn't really until the 8800 that HDR got to a playable state at any decent resolution. Same with RT; give it time.
#72
Vayra86
MuhammedAbdoThat's static GI, low precision and full of defects. RT GI is completely dynamic and is void of most defects.


It's already thriving, next consoles have RT now, all future games will have it.

We don't need full RT now, what we need is Hybrid solution. Which is what RTX is and what next consoles will provide.
Next gen consoles that don't exist have it 'now'. Future games that don't exist have it 'now'.

That's a rather strange definition of 'now'. And an even stranger one of 'thriving'. If 'thriving' equals the amount of marketing/press announcements you get fed... I guess VR rules the planet by now and RT is a close second. Maybe after 'AI', though.

RT GI devoid of defects? That's cool stuff, with a limited number of rays. It has a resolution much like any rasterized affair, handily helped out by algorithms to make it look better. Don't let the marketing get to your head - and when in doubt, fire up some RTX on low/medium for proof. Dynamic lighting is also not very new to us. The only real win is that it's less work to develop, because it is based on set rules; you're not left creating the rules. That's about all the 'win' you will really have for the near future. The quality improvement only comes many years later, and only on high-end GPUs - definitely not next-gen consoles.

Still, it will be interesting to see what they are capable of squeezing out of that new console with an AMD GPU. So far we know just about nothing about that, but we do know Nvidia needed to create a massive die to get somewhat playable 'effects' going. So... I guess that 1080p 60 FPS PlayStation fun will be over soon; 25-30 seems to be the golden target again.