
Ray Tracing and Variable-Rate Shading Design Goals for AMD RDNA2

Joined
Jun 10, 2014
Messages
3,038 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
VRS is great; it's one of the more exciting new GPU features, really. It's a simple, tangible performance boost that comes from reducing detail in the parts of the scene you care less about, and why anyone would view that as a bad trade-off is beyond me. It's about utilizing resources where they can be put to best use, plain and simple. Horsepower means very little when you have no traction, which is exactly why drag cars do burnouts before they race: to warm those tires up a little so they grip the road when the driver gooses it. 3, 2, 1, punch it.
VRS is a technology that I've wanted for 10 years, but not as a way to reduce details in parts of the scene, only to improve select parts. I think this technology has great potential, but like with many other advanced techniques, it needs to be utilized right, otherwise the end result is bad.

Let's say you have a scene with a nice landscape in the lower half of the screen, and a sky (just a skydome or skybox) in the upper half. You might think that rendering the upper half with far fewer samples would be a good way to optimize away wasteful samples. But the truth is that low-detail areas like skies are very cheap to render in the first place, so you will probably end up with a very blurry area and only marginal performance savings.

To make matters worse, this will probably only increase the frame rate variance (if not applied very carefully). If you have a first-person game walking through a landscape, looking straight up or down will result in very high frame rates, while looking straight ahead into an open landscape will give low performance. Even if you don't use any particularly fancy LoD algorithms, the GPU is already pretty good at culling off-screen geometry, and I know from experience that trying to optimize away any "unnecessary" detail can actually increase this frame rate variance even more.
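For concreteness, here is a minimal sketch of the pattern being debated, using the D3D12 Tier 2 VRS API (assuming hardware that reports Tier 2, which adds a screen-space shading-rate image). BuildRateMask, skylineY and BindVrs are hypothetical names for illustration, and uploading the mask into an R8_UINT texture is omitted:

```cpp
#include <d3d12.h>
#include <vector>

// One byte per screen tile; the tile size comes from
// D3D12_FEATURE_DATA_D3D12_OPTIONS6::ShadingRateImageTileSize (often 16).
std::vector<UINT8> BuildRateMask(UINT width, UINT height, UINT tileSize,
                                 float skylineY /* horizon, 0..1 from top */)
{
    const UINT tilesX = (width  + tileSize - 1) / tileSize;
    const UINT tilesY = (height + tileSize - 1) / tileSize;
    std::vector<UINT8> mask(tilesX * tilesY, D3D12_SHADING_RATE_1X1);
    const UINT horizon = static_cast<UINT>(skylineY * tilesY);
    for (UINT y = 0; y < horizon; ++y)        // tiles above the horizon: sky
        for (UINT x = 0; x < tilesX; ++x)
            mask[y * tilesX + x] = D3D12_SHADING_RATE_2X2;  // 1 shade per 4 px
    return mask;
}

// cmdList is an ID3D12GraphicsCommandList5*; rateImage is the R8_UINT
// texture the mask above was uploaded into.
void BindVrs(ID3D12GraphicsCommandList5* cmdList, ID3D12Resource* rateImage)
{
    D3D12_SHADING_RATE_COMBINER combiners[2] = {
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH, // per-draw vs per-primitive
        D3D12_SHADING_RATE_COMBINER_MAX          // image may only coarsen it
    };
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, combiners);
    cmdList->RSSetShadingRateImage(rateImage);
}
```

Note the MAX combiner: the rate image can only make the sky tiles coarser than what each draw asked for, which is exactly why the savings are capped by how cheap the sky already was.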
 
Joined
Sep 3, 2019
Messages
3,850 (1.93/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 220W PPT limit, 85C temp limit, CO -8~14, +50MHz (up to 5.0GHz)
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F39b, AGESA V2 1.2.0.C
Cooling Arctic Liquid Freezer II 420mm Rev7 (Jan 2024) with off-center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3600MT/s 1.38V CL16-16-16-16-32-48 1T, tRFC:280, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~467W (382W current) PowerLimit, 1060mV, Adrenalin v24.12.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR400/1000, VRR on
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, ATX v2.4, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v24H2, OSBuild 26100.3037), upgraded from Win10 to Win11 on Jan 2024
Joined
May 31, 2016
Messages
4,485 (1.41/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Well, it's not like Nvidia already does it in the Wolfenstein games.
And why would that be blurry? It's not an image reconstruction technique.
In the VRS video linked earlier it does get blurry when the resolution is turned way down, so bringing up image reconstruction techniques here is beside the point.
I'm more worried about the implementation, so that it won't end up like NV's DLSS. Nvidia does it? You mean Nvidia's using it.
 
Joined
Jul 9, 2015
Messages
3,462 (0.99/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
No Radeon card has much of any value till they get this done.
At a time when less than 1% of games support it, and even the ones that do make a 2080 Ti sweat, yeah, I mean, a must-have, "no feature, no buy" feature.
Obviously.

Because once we get into full-throttle RT and "non-RT cards are not supported", some time in 2025, today's cards will absolutely be adequate to run it.
Apparently.

But, but, real-time ray tracing is a gimmick!

-- certain fanbois

Low-ray-count ray tracing, producing hellishly noisy images that get heavily "denoised" to produce a handful of effects in some otherwise traditionally rasterized scenes... is not a gimmick?

Because, let me guess, it has "RT" and "real time" in it?

Clearly, only fanbois would disagree with it!

Exciting times!
 
Joined
Aug 6, 2017
Messages
7,412 (2.69/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
At a time when less than 1% of games support it, and even the ones that do make a 2080 Ti sweat, yeah, I mean, a must-have, "no feature, no buy" feature.
Obviously.

Because once we get into full-throttle RT and "non-RT cards are not supported", some time in 2025, today's cards will absolutely be adequate to run it.
Apparently.



Low-ray-count ray tracing, producing hellishly noisy images that get heavily "denoised" to produce a handful of effects in some otherwise traditionally rasterized scenes... is not a gimmick?

Because, let me guess, it has "RT" and "real time" in it?

Clearly, only fanbois would disagree with it!

Exciting times!
Well, better go back to buying cards that can't do any of it.
Isn't it good to have a choice...
About that 1%: look how many triple-A games out now or announced for 2020 have RTX support.
Blurry and noisy? Depends.
This is RTX + DLSS in Control:

[Image: Control Screenshot 2019.09.20 - 17.40.43.82.png]
 
Last edited:
Joined
Nov 21, 2010
Messages
2,358 (0.45/day)
Location
Right where I want to be
System Name Miami
Processor Ryzen 3800X
Motherboard Asus Crosshair VII Formula
Cooling Ek Velocity/ 2x 280mm Radiators/ Alphacool fullcover
Memory F4-3600C16Q-32GTZNC
Video Card(s) XFX 6900 XT Speedster 0
Storage 1TB WD M.2 SSD/ 2TB WD SN750/ 4TB WD Black HDD
Display(s) DELL AW3420DW / HP ZR24w
Case Lian Li O11 Dynamic XL
Audio Device(s) EVGA Nu Audio
Power Supply Seasonic Prime Gold 1000W+750W
Mouse Corsair Scimitar/Glorious Model O-
Keyboard Corsair K95 Platinum
Software Windows 10 Pro
But, but, real-time ray tracing is a gimmick!

-- certain fanbois

[Image: Gene_Wilder_as_Willy_Wonka.jpeg]


Do tell, what has your user experience/utilisation of RTRT been over the course of owning your RTX card? Do you believe you will fully take advantage of its RTRT capabilities before replacing it with a next-gen Nvidia/AMD card that can actually handle it?

P.S. If it's a gimmick at present, there's nothing wrong with saying so.
 
Last edited:
Joined
Jul 9, 2015
Messages
3,462 (0.99/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Joined
May 25, 2019
Messages
12 (0.01/day)
[Attachment 139366]

Do tell, what has your user experience/utilisation of RTRT been over the course of owning your RTX card? Do you believe you will fully take advantage of its RTRT capabilities before replacing it with a next-gen Nvidia/AMD card that can actually handle it?

P.S. If it's a gimmick at present, there's nothing wrong with saying so.

So, err, dumb question. RT was originally developed by a software company, not Nvidia. Nvidia took it and developed a hardware method to (attempt to) make it have an "acceptable impact" and release it to the market earlier than said software company's solution. I feel that was a brilliant marketing move, but a poor end-user solution. RTRT would be great if it had little-to-no real-world impact on framerates and widespread adoption in the game development world. If there is a software solution, and hardware that can "add this in" without dropping framerates below targets such as 60 FPS for 60 Hz gaming, 144 FPS for 144 Hz gaming, or 240 FPS for 240 Hz gaming, why would we care whether the card has RT cores or not? Now, are we going to get there in the current or coming generation? Almost certainly not. But since development is currently relying on those RT cores for its programming models, aren't we actually delaying wider adoption just to get it "now", if we do get there?

VRS, on the other hand, is a potential framerate improvement that leaves us in a "better" state, provided it doesn't significantly impact image quality. Why would anyone NOT want this, IF it works as advertised?
 
Joined
Feb 3, 2017
Messages
3,915 (1.33/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
This is very interesting to me... the software-based ray tracing. Pretty good results IMO.
There is no good reason to be very keen on software-based raytracing (running on generic shaders, as in this case).

Until Crytek implements DXR there is no direct comparison. So far, Neon Noir running on a Vega 56/64 is about on par with it running on a GTX 1080. Anything DXR cards can do (mainly a considerably larger number of rays) is on top of that. If you want a comparison, check the differences between GTX and RTX cards in Battlefield V's DXR mode; the RTX 2060 should be generally on par with the GTX 1080, so it is a direct enough comparison, and Battlefield V employs DXR for the same kind of effects that Neon Noir achieves with its RT shaders.

In the VRS video linked earlier it does get blurry when the resolution is turned way down, so bringing up image reconstruction techniques here is beside the point.
I'm more worried about the implementation, so that it won't end up like NV's DLSS. Nvidia does it? You mean Nvidia's using it.
VRS is in DX12, and both Nvidia and Intel have this capability deployed. I believe Nvidia also has OpenGL and Vulkan extensions available for it; not sure about Intel.
VRS is not an image reconstruction technique. It does reduce image quality in parts of the image, but the option of using VRS is entirely up to the developer. When used well, in parts of the screen that do not benefit from more detail, with quality lowered to an acceptable degree, it provides a small but measurable performance boost for a minimal image quality penalty.
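Since "up to the developer" is the whole point here, a minimal sketch of the DX12 opt-in path may help: query the reported VRS tier and, only if present, ask for a coarse per-draw rate. TryEnableCoarseShading is an illustrative name; error handling is elided:

```cpp
#include <d3d12.h>

bool TryEnableCoarseShading(ID3D12Device* device,
                            ID3D12GraphicsCommandList5* cmdList)
{
    // Ask the driver which VRS tier (if any) the hardware exposes.
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 opts = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                           &opts, sizeof(opts))) ||
        opts.VariableShadingRateTier ==
            D3D12_VARIABLE_SHADING_RATE_TIER_NOT_SUPPORTED)
        return false;  // no VRS: render exactly as before

    // Tier 1 usage: one rate for everything drawn until the next call.
    // A null combiner array means both combiners default to PASSTHROUGH.
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, nullptr);
    return true;
}
```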
 
Last edited:
Joined
Aug 6, 2017
Messages
7,412 (2.69/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
Hell yeah, how dare I!?!?!?!
Oh wait:



I guess it is too much to expect users on a tech forum to have a basic understanding of the underlying technology...
I told you: you can pay the same and be happy to get less if you wish. You have a choice.
 
Joined
Nov 21, 2010
Messages
2,358 (0.45/day)
Location
Right where I want to be
System Name Miami
Processor Ryzen 3800X
Motherboard Asus Crosshair VII Formula
Cooling Ek Velocity/ 2x 280mm Radiators/ Alphacool fullcover
Memory F4-3600C16Q-32GTZNC
Video Card(s) XFX 6900 XT Speedster 0
Storage 1TB WD M.2 SSD/ 2TB WD SN750/ 4TB WD Black HDD
Display(s) DELL AW3420DW / HP ZR24w
Case Lian Li O11 Dynamic XL
Audio Device(s) EVGA Nu Audio
Power Supply Seasonic Prime Gold 1000W+750W
Mouse Corsair Scimitar/Glorious Model O-
Keyboard Corsair K95 Platinum
Software Windows 10 Pro
So, err, dumb question. RT was originally developed by a software company, not Nvidia. Nvidia took it and developed a hardware method to (attempt to) make it have an "acceptable impact" and release it to the market earlier than said software company's solution. I feel that was a brilliant marketing move, but a poor end-user solution. RTRT would be great if it had little-to-no real-world impact on framerates and widespread adoption in the game development world. If there is a software solution, and hardware that can "add this in" without dropping framerates below targets such as 60 FPS for 60 Hz gaming, 144 FPS for 144 Hz gaming, or 240 FPS for 240 Hz gaming, why would we care whether the card has RT cores or not? Now, are we going to get there in the current or coming generation? Almost certainly not. But since development is currently relying on those RT cores for its programming models, aren't we actually delaying wider adoption just to get it "now", if we do get there?

VRS, on the other hand, is a potential framerate improvement that leaves us in a "better" state, provided it doesn't significantly impact image quality. Why would anyone NOT want this, IF it works as advertised?

This goes back to the comment I quoted. I'm probably one of those "certain people" claimed to have said "RT is a gimmick", and that is out of context. How Nvidia is implementing it is a gimmick: no matter how you cut it, with the current batch of cards it's not practical or useful, and therefore a gimmick. RT itself was never thought or called superfluous by me personally. With that out of the way: we will probably get there eventually, but I feel that had Nvidia not snatched it up, we would have seen acceptable RT sooner, because the collaborative element has been removed; anyone working on RT from that point either develops their own from scratch or goes through Nvidia and uses their hardware.
 
Last edited:
Joined
Dec 18, 2018
Messages
24 (0.01/day)
System Name godzilla
Processor Intel Core i7-920
Cooling Air
Memory 12GB
Video Card(s) Nvidia Geforce 970
Storage Samsung 970 EVO
Display(s) LG OLED55C9
Audio Device(s) Sennheiser HD 800 S
Mouse Logitech G Pro Wireless
Keyboard Logitech G19
I would be surprised if they have a larger Navi ready, and if they did, the TDP would probably be in the 300-350W range.
Also don't forget that RX 5700 was renamed "last minute", even some official photos displayed "RX 690".
I'd take a 350W card that can pull 4K 60 FPS in Ubisoft's poorly optimized yearly AAAs. From either company.
Back in the day I owned an Asus Mars II, which was the Bitchin'fast! 3D2000 personified. It was a 365W TDP, triple-slot, 3x 8-pin monstrosity, and yes, it had over 20,000 BungholioMarks.

I had no problem with heat, and it lasted over 3 years. Good times.
 
Joined
Dec 22, 2011
Messages
3,890 (0.81/day)
Processor AMD Ryzen 7 5700X3D
Motherboard MSI MAG B550 TOMAHAWK
Cooling Thermalright Peerless Assassin 120 SE
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Hisense 55" U7K
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
I do love these pissing contests. Fermi famously got no love from the ATi/AMD crowd for being hot and power-hungry, but it was clearly the faster, more forward-looking tech. And in the grand scheme of things, AMD cards over recent years have made Fermi look kind to the environment!

Turing packs all this tech already and can only improve once Nvidia goes down to 7nm too. RTRT may realistically remain years off, but things like VRS coming to next-gen consoles can certainly offer some nice benefits as detail and resolution go up. AMD is playing catch-up, but there is no need to get too butt-hurt, people.
 
Joined
Mar 21, 2016
Messages
2,586 (0.79/day)
VRS is a technology that I've wanted for 10 years, but not as a way to reduce details in parts of the scene, only to improve select parts. I think this technology has great potential, but like with many other advanced techniques, it needs to be utilized right, otherwise the end result is bad.

Let's say you have a scene with a nice landscape in the lower half of the screen, and a sky (just a skydome or skybox) in the upper half. You might think that rendering the upper half with far fewer samples would be a good way to optimize away wasteful samples. But the truth is that low-detail areas like skies are very cheap to render in the first place, so you will probably end up with a very blurry area and only marginal performance savings.

To make matters worse, this will probably only increase the frame rate variance (if not applied very carefully). If you have a first-person game walking through a landscape, looking straight up or down will result in very high frame rates, while looking straight ahead into an open landscape will give low performance. Even if you don't use any particularly fancy LoD algorithms, the GPU is already pretty good at culling off-screen geometry, and I know from experience that trying to optimize away any "unnecessary" detail can actually increase this frame rate variance even more.
I think you used a poor example, because it's unlikely that scenario would be applied, or only sparingly. As far as frame-time variance is concerned, the hardware remains the same in either case; it's simply prioritizing render tasks a bit differently within the GPU. If anything, VRS could be used to improve frame-time variance in worst-case scenarios by selectively switching a few things to lower quality when frame rates dip below certain FPS trigger thresholds, until they normalize. Sure, it could get used poorly, but it could get used very well at the same time, and having it as an option doesn't hurt.

Here's an example: the GPU recognizes that the frame rate is below 60 FPS, or say below 30 FPS, which is even worse, since input lag gets really crappy really quickly below that point. When the trigger kicks in, 75% of the screen keeps whatever high-quality AA setting you determine and the other 25% gets set lower, until the frame rate normalizes. Frame-rate variance improves for a temporary bit of image-quality reduction; in the grand scheme, perhaps a good trade-off given the scenario described (see the sketch below). That could be applied to more than AA: shading, lighting, and geometry as well. It boils down to how it gets used and applied, but VRS has the premise of improving both quality and performance in variable ways. It just depends on how it gets injected into the render pipeline.
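A minimal sketch of that gear-shifting idea, independent of any particular engine; VrsGear, PickGear and the thresholds are made up for illustration. The important detail is the hysteresis: the FPS that drops a gear is lower than the FPS that restores it, so the controller doesn't flicker between rates at the boundary:

```cpp
// Coarser gears trade shading detail for frame rate.
enum class VrsGear { Full, Coarse2x2, Coarse4x4 };

VrsGear PickGear(float fps, VrsGear current)
{
    switch (current) {
    case VrsGear::Full:
        if (fps < 55.0f) return VrsGear::Coarse2x2;  // dipped below ~60
        break;
    case VrsGear::Coarse2x2:
        if (fps < 28.0f) return VrsGear::Coarse4x4;  // dipped below ~30
        if (fps > 65.0f) return VrsGear::Full;       // recovered with margin
        break;
    case VrsGear::Coarse4x4:
        if (fps > 35.0f) return VrsGear::Coarse2x2;
        break;
    }
    return current;  // stay put: up/down thresholds deliberately differ
}
```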

In the VRS video linked earlier it does get blurry when the resolution is turned way down, so bringing up image reconstruction techniques here is beside the point.
I'm more worried about the implementation, so that it won't end up like NV's DLSS. Nvidia does it? You mean Nvidia's using it.
Plenty of video streams do variable-rate adjustments similar to that, based on download speed, when there's traffic congestion. I honestly wouldn't mind a bit of selective DLSS smear temporarily if my FPS dipped below a frame-rate threshold I determined; it beats choppy frame rates and sloppy input lag.

Well, better go back to buying cards that can't do any of it.
Isn't it good to have a choice...
About that 1%: look how many triple-A games out now or announced for 2020 have RTX support.
Blurry and noisy? Depends.
This is RTX + DLSS in Control:

[Attachment 139363]
That scene looks like it's been post-processed with a Charmin Ultrasoft soap-opera effect. I cannot in good faith say I'm fond of the look. My vision isn't even 20/20, I'm blind as a bat without glasses, but that would drive me nuts personally, so I'd hate to think what people with good vision make of that dull mess. In terms of texture detail it practically looks like an mClassic upscaling a 720p console game; it's horrible, quite frankly, the quality simply isn't there. Bottom line: you can't make a Blu-ray-quality video out of a DVD, and the same is true of all the sacrifices RTRT makes, cutting the number of light passes and denoising heavily, only to run at an unsteady frame rate.

[Attachment 139366]

Do tell, what has your user experience/utilisation of RTRT been over the course of owning your RTX card? Do you believe you will fully take advantage of its RTRT capabilities before replacing it with a next-gen Nvidia/AMD card that can actually handle it?

P.S. If it's a gimmick at present, there's nothing wrong with saying so.
Let's not kid ourselves here: next-generation Nvidia/AMD cards won't be tricking anyone's eyes into thinking RTRT Crysis is real life either.

Hell yeah, how dare I!?!?!?!
Oh wait:



I guess it is too much to expect users on a tech forum to have a basic understanding of the underlying technology...
Speak of the devil, or close enough. Actually, that demo was one of the better examples of RTRT-type effects, aside from that staged Star Wars demo Nvidia did, which never materialized into actual playable games; go figure, who would've guessed it. Crytek did a pretty decent job, though definitely not perfect, simply because single-GPU hardware won't get us to GoPro-Hero realism at this point in time. We've got a ways to go before we reach that point.

This goes back to the comment I quoted. I'm probably one of those "certain people" claimed to have said "RT is a gimmick", and that is out of context. How Nvidia is implementing it is a gimmick: no matter how you cut it, with the current batch of cards it's not practical or useful, and therefore a gimmick. RT itself was never thought or called superfluous by me personally. With that out of the way: we will probably get there eventually, but I feel that had Nvidia not snatched it up, we would have seen acceptable RT sooner, because the collaborative element has been removed; anyone working on RT from that point either develops their own from scratch or goes through Nvidia and uses their hardware.
You make a good point: RTX could be viewed as a bit of a preemptive **** block attempt by Nvidia on ray tracing with developers, one that will ultimately slow the progression of ray tracing. No one wants another HairWorks or PhysX scenario down the road for ray tracing, but that could be right where things are headed. Luckily AMD is in the next-gen consoles, so we might avoid that scenario; a good chess-move follow-up by Lisa Su. I'm sure RTRT will improve and heat up further in the coming years, but at this stage it's safe to call it a bit of a gimmick given how it both looks and performs; neither is optimal, and both need tons more polish before people consider them high quality and highly desirable. I don't think too many people bought RTX cards for RTRT alone, but rather for performance/efficiency plus the RTX feature set, which includes RTRT among other tech like mesh shading and DLSS.

Turing packs all this tech already and can only improve once Nvidia goes down to 7nm too. RTRT may realistically remain years off, but things like VRS coming to next-gen consoles can certainly offer some nice benefits as detail and resolution go up.
Pretty much agreed. Nvidia moving to 7nm will certainly bring further advancements, but so too will AMD moving to 7nm EUV and increasing its GPU division's R&D budget over time as it continues to pay down debt from the ATI merger of years past. Intel's CPU stumbles will only benefit AMD, especially given its higher focus on the CPU side at present. AMD is definitely in a good position to shift gears between CPU and GPU, switching from 2WD to 4WD at any point, so that's a good thing; its worst days appear to be behind it. AMD has its work cut out ahead, especially on the GPU side of things, but I think they'll inch their way forward and regain market share from Nvidia over the coming years. I do believe a stronger R&D budget and less debt will make a big difference in their overall competitiveness. Plus Intel's stumbles should help, and those security issues could hurt Intel a lot; they won't just be forgotten, given their scale, which keeps getting deeper.
 
Last edited:
Joined
Aug 6, 2017
Messages
7,412 (2.69/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
That scene looks like it's been post-processed with a Charmin Ultrasoft soap-opera effect. I cannot in good faith say I'm fond of the look. My vision isn't even 20/20, I'm blind as a bat without glasses, but that would drive me nuts personally, so I'd hate to think what people with good vision make of that dull mess. In terms of texture detail it practically looks like an mClassic upscaling a 720p console game; it's horrible, quite frankly, the quality simply isn't there. Bottom line: you can't make a Blu-ray-quality video out of a DVD, and the same is true of all the sacrifices RTRT makes, cutting the number of light passes and denoising heavily, only to run at an unsteady frame rate.
:roll:
Well, thanks for the elaborate description.
I don't know why it looks like that to you; maybe you do need glasses after all.


You make a good point: RTX could be viewed as a bit of a preemptive **** block attempt by Nvidia on ray tracing with developers, one that will ultimately slow the progression of ray tracing.
Luckily AMD is in the next-gen consoles, so we might avoid that scenario; a good chess-move follow-up by Lisa Su.

:roll:
 
Last edited:
Joined
Jun 10, 2014
Messages
3,038 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
I'd take a 350W card that can pull 4K 60 FPS in Ubisoft's poorly optimized yearly AAAs. From either company.
Would you take a 350W card when you can get a 250W card with the same performance?
350W is pushing it in terms of cooling it without terrible noise levels.

I think you used a poor example, because it's unlikely that scenario would be applied, or only sparingly. As far as frame-time variance is concerned, the hardware remains the same in either case; it's simply prioritizing render tasks a bit differently within the GPU.
I think you missed the point. The hardware is of course the same, the variance is in the workload.

If anything, VRS could be used to improve frame-time variance in worst-case scenarios by selectively switching a few things to lower quality when frame rates dip below certain FPS trigger thresholds, until they normalize. Sure, it could get used poorly, but it could get used very well at the same time, and having it as an option doesn't hurt.
It is certainly possible to build an algorithm that uses performance metrics from previous frames and dynamically adjusts LoD on the fly; I have even looked into implementing something like that once. The issue is that you have to rely on the performance metrics of the last ~10 frames, so any adjustment will happen after the performance has changed, and will for this reason not reduce stutter. The best approach is to reduce the variance preemptively.

I stand by my claim that it can be used poorly, resulting in blurry scenes and in worst case flickering or artifacts.
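A tiny sketch of why such a controller reacts late, assuming a ~10-frame rolling average as the metric (FrameTimeWindow is an illustrative name, not anyone's API):

```cpp
#include <array>
#include <cstddef>

struct FrameTimeWindow {
    std::array<float, 10> samples{};  // last ~10 frame times, in ms
    std::size_t next = 0;

    void push(float ms) {             // overwrite the oldest sample
        samples[next] = ms;
        next = (next + 1) % samples.size();
    }

    float averageMs() const {
        float sum = 0.0f;
        for (float s : samples) sum += s;
        return sum / samples.size();
    }
};

// A single 33 ms hitch in a 16 ms stream lifts the average by under 2 ms,
// so any threshold on averageMs() fires well after the stutter was visible.
```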
 
Joined
Jul 9, 2015
Messages
3,462 (0.99/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
I told you: you can pay the same and be happy to get less if you wish. You have a choice.
People who insist RTX "is a must" (I won't repeat "less than 1% of games, yadayada") talk about having a choice.
Ironic.

Let me elaborate: not only is RTX a gimmick at this point (a pathetic number of rays, capable of producing noisy shadow/reflection-like effects with heavy de-noising), it is absolutely not clear how this area will develop. Whatever it becomes, as with FreeSync, it won't be NV alone deciding it. Heck, AMD alone could, since, wait for it:
1) AMD commands 35% of the GPU market (and is poised to grab more), but also
2) 100% of the console market (the Switch is in no way capable of RT-ing anyhow), which is expected to roll out next-gen consoles with GPUs at 2070/2080-ish levels.

Last, but not least, the screenshot you shared makes me smile. Looks like a generic adventure game to me.
Yeah, devs were able to fake reflections/shadows way before RT-ing; it's just a matter of the effort to implement them (it must be much easier with RT).

Would you take a 350W card when you can get a 250W card with the same performance?
It will depend on the price.
 
Joined
Aug 6, 2017
Messages
7,412 (2.69/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
People who insist RTX "is a must" (I won't repeat "less than 1% of games, yadayada") talk about having a choice.
Ironic.

Let me elaborate: not only is RTX a gimmick at this point (a pathetic number of rays, capable of producing noisy shadow/reflection-like effects with heavy de-noising), it is absolutely not clear how this area will develop. Whatever it becomes, as with FreeSync, it won't be NV alone deciding it. Heck, AMD alone could, since, wait for it:
1) AMD commands 35% of the GPU market (and is poised to grab more), but also
2) 100% of the console market (the Switch is in no way capable of RT-ing anyhow), which is expected to roll out next-gen consoles with GPUs at 2070/2080-ish levels.

Last, but not least, the screenshot you shared makes me smile. Looks like a generic adventure game to me.
Yeah, devs were able to fake reflections/shadows way before RT-ing; it's just a matter of the effort to implement them (it must be much easier with RT).


It will depend on the price.
Funny how you talk like that all the time while AMD reveals that their goal for 2020 is to match Nvidia's 2018. :laugh:
 
Last edited:
Joined
Jul 9, 2015
Messages
3,462 (0.99/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Joined
Aug 6, 2017
Messages
7,412 (2.69/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
Last edited:
Joined
Mar 21, 2016
Messages
2,586 (0.79/day)
I think you missed the point. The hardware is of course the same, the variance is in the workload.


It is certainly possible to build an algorithm that uses performance metrics from previous frames and dynamically adjusts LoD on the fly; I have even looked into implementing something like that once. The issue is that you have to rely on the performance metrics of the last ~10 frames, so any adjustment will happen after the performance has changed, and will for this reason not reduce stutter. The best approach is to reduce the variance preemptively.

I stand by my claim that it can be used poorly, resulting in blurry scenes and in worst case flickering or artifacts.
So long as VRS has gears it can shift through, hopefully ones that subdivide cleanly against standard 24 FPS animation rates, it should be a great option with little downside. It could be used poorly, but so can RTRT and other things, so that's nothing new.
 
Joined
Oct 2, 2019
Messages
89 (0.05/day)
Processor Ryzen 5 1600 3.8GHz@1.331V
Motherboard ASUS Strix B350-F
Cooling Noctua NH-D15
Memory Kingston HyperX Fury DDR4 16G*2 3000MHz CL16 @1.35V
Video Card(s) Msi RTX 2070 Super Gaming Z Trio
Storage WD Blue 500G SSD*1 WD Black 2TB*1
Display(s) ASUS ROG SWIFT PG279Q
Case Lian Li O11D XL
Audio Device(s) Edifier C3X
Power Supply Corsair RM750x
Mouse logitech G102 with modded micro switchs
Benchmark Scores Time Spy 9943
Fixing their drivers is probably more important, idk.
 
Joined
May 31, 2016
Messages
4,485 (1.41/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Plenty of video streams do variable-rate adjustments similar to that, based on download speed, when there's traffic congestion. I honestly wouldn't mind a bit of selective DLSS smear temporarily if my FPS dipped below a frame-rate threshold I determined; it beats choppy frame rates and sloppy input lag.
I'd rather skip RT and go with higher FPS than use DLSS to speed things up because RT is eating all the performance (I can see the difference in image quality in games with this thing on).
 
Joined
Nov 25, 2012
Messages
247 (0.06/day)
So, err, dumb question. RT was originally developed by a software company, not Nvidia. Nvidia took it and developed a hardware method to (attempt to) make it have an "acceptable impact" and release it to the market earlier than said software company's solution. I feel that was a brilliant marketing move, but a poor end-user solution. RTRT would be great if it had little-to-no real-world impact on framerates and widespread adoption in the game development world. If there is a software solution, and hardware that can "add this in" without dropping framerates below targets such as 60 FPS for 60 Hz gaming, 144 FPS for 144 Hz gaming, or 240 FPS for 240 Hz gaming, why would we care whether the card has RT cores or not? Now, are we going to get there in the current or coming generation? Almost certainly not. But since development is currently relying on those RT cores for its programming models, aren't we actually delaying wider adoption just to get it "now", if we do get there?

VRS, on the other hand, is a potential framerate improvement that leaves us in a "better" state, provided it doesn't significantly impact image quality. Why would anyone NOT want this, IF it works as advertised?

really CBF digging around in the past

but https://www.awn.com/news/sgi-demos-integration-art-vps-ray-tracing-hardware

You will find SGI had it ages ago, for CAD rather than gaming.
Pretty sure Sun had real-time ray tracing as well...

First-gen hardware/software tends to suck until it gets momentum.

An open standard would be nice.

Since consoles will be using AMD, and consoles drive PC gaming, unless some killer app uses it, ray tracing is just a gimmick at the moment.

Anyone else remember standalone physics cards? Wank factor 99%. There were some titles you could run that were cool, but other than that, a waste of money.
 
Joined
Oct 10, 2018
Messages
943 (0.41/day)
Unlike many irrational fanboys here, I have been very critical of AMD, in both the CPU and GPU products they released and the undeserved hype they got for the 7nm process. But I have to admit and admire the RX 5700: the best card for the money, and interestingly for power draw as well. You can fine-tune it to consume around 140W and get slightly better performance than stock, and it naturally beats both the 2060 Super and the 2060. Probably their best product of the year, and after that comes the Ryzen 5 3600.
 