Regarding next-gen raytracing performance, its importance, etc.:
First DX11 dGPU in 2009, first DX11 IGP in 2012, first game to require DX11 hardware (HLSL Shader Model 5) in 2013 with Crysis 3. But in reality that was a single AAA game; if you check the prevalent DX11 engines, Frostbite only made DX11 mandatory in 2015 (Battlefield 4 didn't need HLSL Shader Model 5), and the same goes for Unreal Engine 4. We saw some experimental first tries in 2014, but in general 2015 is the better marker for the first truly "mature" DX11 year.
So that's 6 years from the first DX11 dGPU and 3 years from the first DX11 IGP.
Then, 2 years later in 2011, we had the first DX11.1 dGPU, and so on up to today's hardware releases.
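To make "requires DX11 hardware (Shader Model 5)" concrete: a game enforces that by asking the runtime for a feature level 11_0 device at startup and refusing to run without one. A minimal sketch below (the gating logic around it is hypothetical, but D3D11CreateDevice and D3D_FEATURE_LEVEL_11_0 are the actual mechanism):

```cpp
#include <d3d11.h>
#include <cstdio>

// Minimal sketch: a DX11-only game refuses to start unless the GPU
// supports feature level 11_0, which is what exposes Shader Model 5.
bool HasShaderModel5Hardware()
{
    const D3D_FEATURE_LEVEL wanted = D3D_FEATURE_LEVEL_11_0;
    D3D_FEATURE_LEVEL got = {};

    // Ask the runtime for an 11_0 device; failure means the GPU
    // (e.g. a DX10-class part) can't run SM5 shaders at all.
    HRESULT hr = D3D11CreateDevice(
        nullptr,                    // default adapter
        D3D_DRIVER_TYPE_HARDWARE,
        nullptr, 0,
        &wanted, 1,                 // accept only feature level 11_0
        D3D11_SDK_VERSION,
        nullptr,                    // device itself not needed here
        &got,
        nullptr);

    return SUCCEEDED(hr) && got == D3D_FEATURE_LEVEL_11_0;
}

int main()
{
    if (!HasShaderModel5Hardware())
    {
        std::printf("This game requires a DX11 (Shader Model 5) GPU.\n");
        return 1;
    }
    std::printf("SM5 hardware found, continuing.\n");
    return 0;
}
```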
So for "mature" raytracing adoption I expect a similar path.
I won't take Turing in 2018 as the first DX12 Ultimate dGPU date, because it started at $350 (with the second-cheapest model at $500), so it didn't penetrate a large portion of the market anyway; also, the competition had nothing until 2020, the new raytracing-capable consoles only launched in 2020, and raytracing in general is a much bigger deviation than DX11 was from DX10. On top of that, it seems 2022 is the year for AMD IGPs and 2023 for Intel IGPs (and Intel has the majority of the IGP market), so probably 2025-2026 will be the first year that can be characterized as a "mature" DX12 Ultimate year.
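"DX12 Ultimate hardware" is likewise something engines detect explicitly rather than by release year. A minimal sketch of the raytracing part of that check, using the real CheckFeatureSupport path (DX12 Ultimate also mandates mesh shaders, sampler feedback, and VRS tier 2, which are queried through other OPTIONS structs):

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

// Minimal sketch: query the DXR tier the way a DX12 Ultimate-aware
// engine would before enabling a raytraced render path.
int main()
{
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
    {
        std::printf("No DX12-class GPU available at all.\n");
        return 1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                &opts5, sizeof(opts5));

    // DX12 Ultimate requires raytracing tier 1.1.
    if (opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_1)
        std::printf("DXR 1.1 capable (DX12 Ultimate class).\n");
    else if (opts5.RaytracingTier == D3D12_RAYTRACING_TIER_1_0)
        std::printf("DXR 1.0 only (first-wave raytracing hardware).\n");
    else
        std::printf("No hardware raytracing; fall back to raster.\n");

    return 0;
}
```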
And this is the optimistic scenario imo (the pessimistic scenario is that it waits for the PS6 / next Xbox Series X launch, 2028?). It's not that Nvidia failed or whatever, or that raytracing won't be compelling enough; it's just how the market operates anyway, more or less.