I agree with everything except the 'built around RTX from the ground up' part; there simply was not enough time to do that.
However, this reminds me of PhysX, where you needed a separate card to get good performance before the 400 series.
We are moving in the right direction regardless of how long it takes.
Edit: and like I said, the only issue is the price at which Nvidia chose to introduce this, with its MSRP.
Do you really think developers will build engines from the ground up just to reach a level of graphical fidelity comparable to what they can already achieve with simple, proven and 'cheap' techniques? Do you realize that most of the popular engines today follow a completely different model? There was good reason to implement this through DXR and the DX12 API:
low risk. Engines have become SaaS applications, so this whole RT business will be given a priority determined by their customers - the developers who have to implement a costly technique. Only very few studios are even in a position to really push an engine. Frostbite is one of those rare exceptions, and the 4A engine is probably another. Both are in-house engines, which means the amount of content they are used for is never going to be widespread.
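To make the 'low risk' point concrete: DXR is exposed as an optional DX12 feature that an engine can probe at runtime, so an RT path can sit next to the existing raster path instead of replacing it. A minimal sketch of that check (my own illustration, assuming a recent Windows SDK with the DXR headers):

```cpp
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Returns true if the adapter/driver exposes DXR (tier 1.0 or higher).
bool SupportsRaytracing(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}

// An engine would gate its RT effects behind this check and otherwise fall
// back to the existing raster techniques (SSR, baked GI, shadow maps).
```

That opt-in model is exactly why DXR carries little risk for engine vendors - and also why nothing forces them to prioritize it.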
I'll refer you to every other tech that required specific, dedicated dev time. It takes a very, very solid business case to even remotely start that kind of work. Where is that business case? A quick analysis: there isn't one, as long as RTRT isn't available in all but the slowest GPUs. So that 2080 Ti performance, right now, means absolutely nothing at all for the adoption of RTX. There IS no adoption rate as long as this is the price point it requires. And I think everyone can agree that the lower-tier GPUs offer unacceptable performance.
I agree there are quite possibly many tweaks and improvements to be made, no doubt about it. But in the same way, we had high hopes for DX12 features like mGPU. How's that going? It's another one of those time- and configuration-heavy implementations with questionable benefits that only touch a minority of the gaming market. Again: there was no business case to be made for it, and look where mGPU is today.
There are so many parallels with failed technologies it's unreal. The inability to recognize that, to me, is a dangerous form of blindness. This is NOT the same as hardware T&L or anything like that. The timeframe isn't the same, the industry support isn't the same, and the demand for such a tech is radically different. We will always chase photorealism in digital content, and yes, RT can be a means to that end. But somehow, ever since Nvidia called this a holy grail, we seem to have forgotten there were very, very good reasons to take a different approach than brute-forcing everything, because that is what RTRT really means, no matter how you optimize it. RTRT will ALWAYS be in competition with cheaper tech that gets very similar results and is supported everywhere.
In the end, the viability of any development in any kind of product depends on the price it can be sold at. Look at Elon Musk and his plans to colonize Mars: the very core of everything he plans hinges on bringing the cost of that return trip down to a reasonable level. If there is no profitability, it will die - if you cannot bring a development to the masses, it will die.
If there is one thing this industry (gaming) has shown us over the last ten years, it is a steady trend towards the 'one-size-fits-all' method of development. Consoles moved to x86. Most games are ported back and forth. Exclusives are rapidly drying up compared to the previous gen. Everything is aimed at maximizing profit while minimizing development time before a product is rushed out the door, only to be incrementally fixed afterwards by a much smaller team. That is not the landscape where RTRT will be able to flourish. It's one of those items on a loooong list of 'nice-to-haves' in software development. And it sure as hell isn't at the top of that list.