
AMD Radeon RX 6900 XT Graphics Card OpenCL Score Leaks

I'd call it looking at the big picture, personally.
nah, in any AMD-related thread he always shits on anything they do, and when time proves him wrong, he always ends up moving the goalposts
usually it goes like this:
Zen 3 going to be faster than Intel in games? nope, no way, don't kid yourself. -> Zen 3 actually ends up being faster than Intel in games? heh, it's not an upgrade for Intel users anyway!
RDNA 2 going to be faster than a 2080 Ti in games? nope, no way, don't kid yourself. -> RDNA 2 actually ends up being faster than a 2080 Ti AND rivals Nvidia's top cards? well, it might be faster than the 2080 Ti by a nice margin, but the RT performance isn't up to par, so who cares if their raster performance is on point!

I've seen these two exact scenarios happen in threads he commented on; at this point I wouldn't be surprised if he gets paid to post such tripe... either that or it's massive cope/buyer's remorse, don't know which

Their RT performance, at this moment, is no better than first-gen RT from NV.
first/second-gen tessellation was slow as fuck too, who cares. RTRT effects won't become REALLY mainstream for at least another 5 years, and won't be mandatory in any game for at least 10 more years/the end of the new console generation... and by the time they do become mainstream, current RTX and RX cards will be outdated, and you'll be getting much better performance for less money, so really it's a flawed argument anyway
 
There are AMD people who do the saaaaaaaaaaaame thing in NV/Intel threads, bud. This place is a cesspool of polarizing toxic fanboys. I've put many, from both sides, on ignore, but at this point, since these people (somehow) continue to exist at this site, a lot of these threads look like Swiss cheese. :p

first/second-gen tessellation was slow as fuck too, who cares. RTRT effects won't become REALLY mainstream for at least another 5 years, and won't be mandatory in any game for at least 10 more years
Just comparing apples to apples, bud. It's slower, period. And if you want to use that feature, it won't be as good as the NV offerings. I would also bet good money that if their RT were as fast, suddenly a lot of AMD people would be talking up its merits... but here we are. Let the (curiously) loyal defend their brethren at all costs.
 
Just comparing apples to apples, bud. It's slower, period. And if you want to use that feature, it won't be as good as the NV offerings.
I won't want to use that feature for at least 5 years, just like I didn't want to use tessellation when it first emerged back in 2010 :D. The hardware is simply not mature enough yet, and the effects themselves do not provide a large enough visual uplift to justify the framerate impact
 
I won't want to use that feature for at least 5 years, just like I didn't want to use tessellation when it first emerged back in 2010 :D. The hardware is simply not mature enough yet, and the effects themselves do not provide a large enough visual uplift to justify the framerate impact
That's you. There are PLENTY of others who use it without issue or concern. 2nd gen RT is VASTLY improved and in many titles, you don't need DLSS for 60 FPS+ at 1440p even with a 3070.

Justify it all you want, but you know what I said is the truth. If you don't use it, that's fine, but there are plenty who do, more every day, and that train gains momentum thanks to consoles.
 
There he is, the cute little shill, moving the goalposts as usual!
I'm not particularly keen on raytracing performance and wouldn't use that as a deciding factor to judge a GPU at the moment, but I do value NVENC, DLSS is actually useful in some instances, and since I'm working from home I'm finding RTX Voice useful too.

Given how popular streaming and WFH are at the moment, those are features that can't really be ignored, and AMD simply doesn't compete at all. Like, they haven't even bothered trying! On top of that, the number of things that are vastly superior when run using CUDA instead of OpenCL is starting to get ridiculous. I know CUDA is proprietary, but Nvidia have put the work in to make it a thing and have spent years investing in it at this point. AMD have basically said "sure, we do OpenCL" and left it to rot, with developers left to fend for themselves. Is it any wonder that CUDA is now the dominant application acceleration API?

Before anyone wonders: no, I'm not an Nvidia fanboy. If anything I hate Nvidia for their regular unethical and shady practices and have a strong preference for the underdog (because that's the better option that promotes healthy competition and benefits us as consumers the most), but I can't deny the facts - RDNA2 is completely uncompetitive in many ways. The only thing it is actually competitive in is traditional raster-based gaming, and that's a shrinking slice of the market.
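To put a concrete (and admittedly minimal) example behind the CUDA point above: the sketch below is a plain CUDA vector add, host code included. It isn't from any of the quoted posts, and the names and sizes are arbitrary. The point is how little host-side setup CUDA needs (allocate, launch, synchronize), whereas an equivalent OpenCL program has to enumerate platforms and devices, create a context and queue, and compile the kernel source at runtime before it can do the same work - part of why developers gravitated to CUDA.

Code:
// Minimal CUDA vector-add sketch (illustrative only, not from the thread).
#include <cstdio>
#include <cuda_runtime.h>

__global__ void vecAdd(const float* a, const float* b, float* c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main()
{
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    float *a, *b, *c;
    // Unified memory keeps the example short; real code might use cudaMalloc + cudaMemcpy.
    cudaMallocManaged(&a, bytes);
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);

    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    const int threads = 256;
    const int blocks  = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(a, b, c, n);   // kernel launch is a one-liner
    cudaDeviceSynchronize();                   // wait for the GPU to finish

    printf("c[0] = %f\n", c[0]);               // expect 3.0
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}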
 
That's you. There are PLENTY of others who use it without issue.

Justify it all you want, but you know what I said is the truth. :)
fair enough :)
 
2nd gen RT is VASTLY improved and in many titles

Actually, they haven't improved RT AT ALL; they just added a LOT more cores. The performance impact is exactly the same as on the RTX 2000 series.
 
Actually, they haven't improved RT AT ALL; they just added a LOT more cores. The performance impact is exactly the same as on the RTX 2000 series.
Correct. Regardless of how they got there, RT performance is a lot better.
 
Correct. Regardless of how they got there, RT performance is a lot better.
I also think most current RT games were optimised for Nvidia's 1st-gen RT cores. As time goes on, I'm optimistic that if games are optimised more heavily for Ampere (which, irrespective of RT, will certainly be the case), then the divide between Turing and Ampere in RT processing will widen.

AMD's architectures improving over time relative to their Nvidia counterparts at launch may be at least in part due to their compute-heavy nature, and that might be how Ampere fares over time too.

To hammer in an old point: if RT, DLSS, voice, streaming, CUDA, etc. mean bupkis to you, and I know there would be MANY out there who fit the bill, more power to you - why pay for features you don't want or won't use? But accept that the other side of that coin is that if you want any or all of those, AMD is either weaker or non-existent in those spaces.
 
I also think most current RT games were optimised for Nvidia's 1st-gen RT cores. As time goes on, I'm optimistic that if games are optimised more heavily for Ampere (which, irrespective of RT, will certainly be the case), then the divide between Turing and Ampere in RT processing will widen.

I'm only guessing here, but I suspect that game devs will optimise for AMD's DXR implementation first and foremost. The consoles are a huge market with (relatively) fixed hardware that commands the lion's share of sales volume and profit.

As for the PC gaming market, yes - it's pretty big - but it's not all running AMD 6800-series raytracing cards or better.
 