People might call me a troll, but the lack of RTX and DLSS is hardly a negative at this particular juncture.
The real negative is when you turn RTX/DLSS on, it eats more than half your frames, and it looks almost identical to ultra quality settings at 1440p. All that extra money spent on an RTX card, when all you needed was to run at 1440p or 4K.
I think we need to look at this objectively and without exaggerating the performance hit (30-50%, with the biggest hit at the highest resolutions).
At 4K, we are talking about 1.5% of the market, 95% of which is at 60 Hz. While I can understand the position "I'd rather not lose 30-40% of my performance to RT," half the games in TPU's test suite are doing 90 fps or better at 4K... So say two brothers have the same system except for the graphics card.
The nVidia 2080 Ti user will have about half of his games capped at 60 fps due to monitor limitations, but for those 10 games he can turn on RT with little or no penalty, still staying above his monitor's limit. The AMD Radeon VII user will have somewhat fewer of his games capped at 60 fps by the monitor, because the Ti is 40% faster. So the point is... we shouldn't decide which card to buy based on someone who can take advantage of things many can't.
At 1440p, adding up all the fps for the games in TPU's test suite, the $700 RTX 2080 is 13.3% faster than the $700 Radeon VII. With both cards overclocked, that grows to 21.8%. So what's the downside of RT? Let's look at the options here... $700 Radeon VII OC'd versus $700 RTX 2080 OC'd.
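For clarity on how that "X% faster" figure works, it's just the ratio of the frame rates summed across the whole test suite. A minimal sketch of the arithmetic, using made-up placeholder numbers rather than TPU's actual data:

```python
# Sketch of the "X% faster overall" arithmetic: sum each card's average fps
# across every game in the suite, then compare the totals.
# The per-game values below are hypothetical placeholders, not TPU's results.
radeon_vii_fps = [103.2, 119.0, 133.3, 150.9]   # placeholder per-game averages, OC'd
rtx_2080_fps   = [127.0, 153.1, 157.9, 184.3]   # placeholder per-game averages, OC'd

total_vii  = sum(radeon_vii_fps)
total_2080 = sum(rtx_2080_fps)

percent_faster = (total_2080 / total_vii - 1) * 100
print(f"RTX 2080 is {percent_faster:.1f}% faster overall")
```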
Now, at this point in time there aren't a lot of games that support it, so this is purely conjectural, in that we must assume that at some point a percentage of the games (say 20-30%, for the sake of argument) will add support. From the Metro article conclusions, Wiz puts the expected hit at 30-40% once it's out a bit and tweaked; I'll use 35%. So let's assume, for example, that some developers release updated versions of their games, and I'll pick 25% of the games on the list... numbers are 1440p with both cards overclocked.
Divinity OS2 could be played at 119.0 fps on an OC'd VII, 153.1 on a 2080 or 99.5 on a 2080 w/ RT enabled.
F1 could be played at 133.3 fps on an OC'd VII, 157.9 on a 2080 or 102.7 on a 2080 w/ RT enabled.
GTAV could be played at 150.9 fps on an OC'd VII, 184.3 on a 2080 or 119.8 on a 2080 w/ RT enabled.
Witcher 3 could be played at 103.2 fps on an OC'd VII, 127.0 on a 2080 or 82.5 on a 2080 w/ RT enabled.
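Those "w/ RT enabled" figures are just the 2080's rates with the assumed 35% hit applied, i.e. multiplied by 0.65; any small differences from the numbers above are rounding. A quick sketch of that calculation:

```python
# Apply the assumed 35% RT performance hit to the OC'd RTX 2080's 1440p numbers
# quoted above. The 0.65 factor (= 1 - 0.35) is the assumption taken from the
# Metro article conclusions; minor differences vs. the quoted figures are rounding.
RT_HIT = 0.35

fps_2080_oc = {
    "Divinity OS2": 153.1,
    "F1":           157.9,
    "GTA V":        184.3,
    "Witcher 3":    127.0,
}

for game, fps in fps_2080_oc.items():
    with_rt = fps * (1 - RT_HIT)
    print(f"{game}: {fps:.1f} fps -> {with_rt:.1f} fps with RT enabled")
```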
Now think of that from the perspective of folks playing on 65, 100, or 120 Hz monitors. It's an option and it doesn't cost you a dime. Now, if ya built your computer so you can brag about how many fps you get, fine. But if you are looking at it from the perspective of the gaming experience, frankly I don't think I'd care whether it was on or off for 3 of those games; I'd take the extra 20-30 fps and enjoy ULMB. However, on Witcher 3, if it came down to playing on a Radeon VII at 103.2 versus having the choice to play at 127.0 with ULMB or 82.5 with RT and ULMB, I'd like to experience the latter.
So again, let's look at the options here .... $700 Radeon VII OC'd versus $700 RTX 2080 OC'd
1. Of the 21 games in the test suite, only 1 game is under 80 fps with the 2080 (3 with the Radeon VII), which means turning any-kind-of-Sync off and using Motion Blur Reduction is an option ONLY on the 2080.
2. Of the 21 games in the test suite, with both cards overclocked, the 2080 is faster in 19 of them, the Radeon VII in 2 of them.
3. Overall, the 2080 is 22% faster with both cards OC'd.
4. So far... is RT even a factor in the decision here?
5. No one is mandating that you use it... so what is the downside?
It's like going down to buy a new SUV... and the salesman says, "Hey, ya know what, I can sell ya the 2WD model you came here for... but for the same price I can give ya the RT model: it comes with 4WD, a larger, more efficient engine that accelerates faster and uses less gas, comes standard with AC, and runs cooler"... and you turn it down because the carpeting in the trunk is red instead of green.
It was not so long ago that I was saying, "the 780 OC'd is faster than AMD's offering OC'd, but below that, weigh your options." Then it was, "Well, from the 970 price point on up, nVidia has the edge, but below that look at both cards in each price niche..." And then it was, "Well, from the xx60 price point on up..." Saying it isn't so doesn't change the numbers.
In short, at the $700 price point, like the $300 price point, AMD doesn't really have a horse in the race. Not having RT at this price point is not a deal killer, because no one else has it either. But it does have ULMB, which is certainly a significant incentive. And at the upper price tiers, providing the RT option is an incentive; not as much as ULMB, but any tech that gives the user different options to enhance the experience is a good thing.