The sooner people understand that nVidia took GPUs designed specifically for datacenter workloads and only scrambled to figure out how to market them to gamers as an afterthought, the sooner you chumps who ponied up big $$ for their first-generation junk can stop hoping for the promised land of nGreedia.

Some intrepid soul at nVidia headquarters probably said during a strategy meeting: "I know! If we write a code wrapper, we can use the tensor cores to do primitive, slow real-time ray-tracing, and we can tell all the suckers who will buy anything we shovel that we 'painstakingly crafted the GPU to do real-time ray-tracing'! They'll actually believe we designed Turing as a gaming GPU! They're so gullible they'll actually THANK us for dumping our lower-binned Turing chips, the ones we couldn't sell to datacenters, onto them, and at eye-watering prices, too! We've already managed to move the price tag of 'premium' gaming cards from $300 just a few years ago all the way up to $1,200 for an x80 Ti. And now there are even saps who'll pay $2,500 for the ever-so-slightly faster Titan line-up we created to offload the tiny number of GPUs that aren't quite datacenter quality, but are just that extra smidgen less flawed than the x80 Ti chips!"

And guess what. It worked, as usual.