1) They came from the same wafer, using the same lithography mask as the Titan. It's called binning, OK? Don't you know that? It's how you can have the RX 580 and the RX 570: the wafer is made to produce only RX 580s, but due to defects, not all chips end up qualifying as an RX 580. You want a prime example? Look at Nvidia. Why was the 1070 Ti released a year later? Two reasons: one, Vega 56, and two, the process had improved so much that they had a surplus of chips better than the 1070 but disqualified from being a 1080 due to defects, so Nvidia released the 1070 Ti to make more money. This is the most basic "how to make money" of silicon wafers. A mask set used to etch a wafer costs in excess of several million dollars, so it would be extremely expensive to design a separate mask for each line of chips. The 1080 Ti is the lowest-quality silicon on the wafer, period; all the highest-quality silicon goes on to be sold in the professional market. Ever wonder why AMD's 1st-gen Ryzen Threadripper could clock way higher than its 1st-gen desktop counterparts? It's because Threadripper chips are the top ~10% of chips in terms of quality.
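To make the binning idea concrete, here's a minimal sketch of how one wafer's dies get sorted into SKUs. Every number in it (die count, block count, defect rate, bin thresholds) is made up for illustration; real yields and thresholds are not public:

```python
import random

# Minimal binning sketch. All numbers are hypothetical illustrations,
# NOT Nvidia's actual yields, defect rates, or bin thresholds.
DIES_PER_WAFER = 120   # roughly what ~471 mm^2 dies yield on a 300 mm wafer
BLOCKS_PER_DIE = 30    # functional units (e.g. SMs) per die
DEFECT_RATE = 0.04     # chance that any single block is defective

def bin_die() -> str:
    """Sort one die into a SKU based on how many blocks came out defective."""
    bad = sum(random.random() < DEFECT_RATE for _ in range(BLOCKS_PER_DIE))
    if bad == 0:
        return "fully enabled (Titan / Quadro tier)"
    if bad <= 2:
        return "slightly cut down (1080 Ti tier)"
    if bad <= 5:
        return "further cut down (lower SKU)"
    return "scrap"

wafer = [bin_die() for _ in range(DIES_PER_WAFER)]
for sku in sorted(set(wafer)):
    print(f"{sku}: {wafer.count(sku)} dies")
```

One mask, one wafer, several price points: the defective dies don't get thrown away, they get sold as the cheaper card.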
https://www.techpowerup.com/gpudb/2863/titan-x-pascal - Titan XP
https://www.techpowerup.com/gpudb/2877/geforce-gtx-1080-ti - 1080ti
Wow, suspiciously similar, wow, much wow. Both are 471 mm² with the exact same transistor count, wow. Yeah, they're totally not the same, you're right.
How could my eyes deceive me? Clearly these two GPU chips are not the same. Silly me.
Bonus
https://www.techpowerup.com/gpudb/2865/quadro-p6000 - Quadro P6000
Wow, much similar, such amazement. Oh noooo, I must get my eyes checked, they are totally not the same.
2) A 10-year project, bruh? What the hell? Gosh damn, Nvidia did not burn R&D on this for 10 years. I repeat: they did not spend 10 years' worth of R&D on RTX. What the F? $2 billion per year? Lmao.
"10 years in the making" does not mean $2 billion in R&D per year; that would be stupid business. Nvidia makes it sound like "oh, we've been designing this for 10 years." It's marketing. Ray tracing was done long ago; it's not new. They had the idea to do it in real time early on, but the technology (the lithography process) wasn't there yet. If it had been, Kepler would have shown some inkling of ray tracing capability in its architecture. And by that logic, AMD also "wasted" 10 years of R&D on this: their first ray tracing demo was back in 2008, go check it out.
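A quick sanity check on the money, using ballpark magnitudes (Nvidia's total company-wide R&D ran on the order of $1-2 billion per year over that decade, and that covers every product line, not just RTX):

```python
# Ballpark figures only: rough order-of-magnitude estimates, not exact
# filing numbers, used to stress-test the "$2B/year on RTX" claim.
claimed_rtx_only = 2e9 * 10     # the "$2B/year for 10 years on RTX" claim
approx_total_rnd = 1.5e9 * 10   # rough TOTAL Nvidia R&D, all products combined

print(f"Claimed RTX-only spend:    ${claimed_rtx_only / 1e9:.0f}B")
print(f"Approx. total company R&D: ${approx_total_rnd / 1e9:.0f}B")
# The claim would have RTX alone consuming more than the entire R&D
# budget for GPUs, Tegra, datacenter, automotive, and everything else.
```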
3) Good, good. Nvidia, listen to this guy: raise your profit margins so you can milk even more money out of consumers.
4) Turing's traditional cores are similar to Pascal's (rasterization, i.e. 99% of games); the only genuinely new stuff was the ray tracing cores, and tensor cores were already available in Volta. All of their R&D money went into the new stuff, and again, the R&D funds came from Pascal. And yet again, the new tech won't see mass adoption in games for another 2-3 years. The 2080 Ti is shown to be just barely capable of ray tracing at 60 FPS at 1080p. I repeat: 60 FPS at 1080p. Now, I don't know about you, but I prefer over 100 FPS in competitive games to "wow, pretty shadows." So what about the 2080? The 2070? Hell, the 2060? Ray tracing at 60 FPS at 480p for the 2060?
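To put rough numbers on that last jab, here's a crude back-of-envelope scaling. It assumes ray tracing performance scales linearly with RT core count and frame cost scales with pixel count (both naive assumptions), anchored to the reported "2080 Ti ≈ 60 FPS at 1080p" figure; note the 2060's specs were only rumored at the time:

```python
# Naive scaling model: FPS proportional to RT core count, frame cost
# proportional to pixel count. The 2060's core count was a rumor when
# this was written -- all of this is illustration, not a benchmark.
RT_CORES = {"2080 Ti": 68, "2080": 46, "2070": 36, "2060": 30}
PIXELS_1080P = 1920 * 1080

anchor_fps = 60                      # reported 2080 Ti result at 1080p
anchor_cores = RT_CORES["2080 Ti"]

for card, cores in RT_CORES.items():
    est_fps = anchor_fps * cores / anchor_cores
    pixels_at_60 = PIXELS_1080P * cores / anchor_cores  # pixels sustainable at ~60 FPS
    print(f"{card}: ~{est_fps:.0f} FPS at 1080p, "
          f"or ~60 FPS at ~{pixels_at_60 / 1e6:.2f} MP")
```

Under this naive model the 2060 lands closer to 720p than 480p at 60 FPS, but the trend is the point: every step down the stack makes ray tracing less playable.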