Yes yes, very sympathetic when AMD locks the 6900 XT's voltage/frequency in order to sell higher SKUs: 6900 XT LC, 6900 XT XTXH, 6950 XT
Does the 6900 XT have locked voltage/frequency? I know mine is a Navi 21 XTX die ("Ultimate"), but that's just a special bin picked for scaling better to higher clock speeds at higher voltages. Is the stock Navi 21 XT 6900 XT locked down in terms of adjusting clock speeds or voltages? Yes, I know there's a frequency ceiling for what can be set in software, but in my experience that tends to be higher than what can be achieved stably without exotic cooling anyway, so I don't see the issue. All I've noticed about mine is that it doesn't really undervolt at all, but that's just a characteristic of that bin of the silicon - it still gets stupidly efficient with a moderate underclock.
That is actually a cost-saving measure. Nvidia traditionally engineers more complex chips. Yields for those are not that good at first. So you get the "not fully enabled" dies. Once production steps up, yields improve and fully unlocked chips become more viable. If they pushed for fully enabled dies you end up either with more expensive dies or with cut-down ones (to the level that can be produced initially) with nowhere to go once yields improve.
This is IMO a pretty reasonable approach - but in the market, it has the effect of saying "hey, this is the new cool flagship, the best of the best," only for six months to pass and for them to say "hey, forget that old crap, this is the best of the best!" Which, regardless of the realities of production, is a pretty shitty move when the most explicit selling point of the previous product was precisely that it was the best. There's obviously a sliding scale of how shitty this is, simply because something faster is always on the way, but IMO Nvidia tends to skew towards pissing on their fans more than anything in this regard.
I also don't get why people get hung up on dies being fully enabled or not. You get the product benched as-is and you know very well what it is capable of.
This I wholeheartedly agree with. Whether a die is fully enabled or not is entirely irrelevant - what matters is getting what you're paying for, as well as having some base-level honesty in marketing.
Just as RT cannot make a bad game good, good games can still be enhanced by it.
We get it, lots of you don't care about RT - you've only been shouting that from the rooftops since Turing's debut. Believe me, we hear you loud and clear.
Would y'all please be mindful that there is a subset of enthusiasts who do want good RT performance in future purchases, and for it not to be an afterthought, a checkbox to tick.
If it's not for you, awesome, but I tire of hearing nObOdY cArEs ABouT rAy TrACinG when clearly people do, so please, stop (silly to even ask I know, this will probably make the vocal among you double down and write me an essay on why RT is a gimmick or that 'most' people don't care, good on you!).
Personally I'd love to be able to very strongly consider AMD GPUs, but a prerequisite of that is for them to take RT more seriously and lessen the performance hit. They can certainly swing even more buyers their way if they deliver on that, so I eagerly await seeing whether the top product has made significant strides.
I mostly agree with this - in fact, I waited to buy a new GPU in order to get RT support - but I'm also perfectly fine with RT performance on my 6900 XT. I loved Metro Exodus with RT enabled at 1440p, and while I've only barely tried Control, that too seemed to work fine. Is the 6900 XT perfect? Obviously not. Are Nvidia's contemporary offerings faster? Yes - but not that much faster, and not enough for it to matter in 2-3 years as RT performance becomes more important. And either way, my GPU beats both current-gen consoles in RT, so I'll be set for base-level RT performance for the foreseeable future.
RT performance is absolutely an important aspect of the value of Nvidia's GPUs - the question is how important. For me, it's ... idk, maybe 5:1 raster-vs-RT? Rasterization is a lot more important overall, and for the foreseeable lifetime of this product and its contemporaries, I don't see the delta between them as that meaningful long term. When my 6900 XT performs between a 3070 and a 3080 in RT depending on the title, and the 3090 Ti is maybe 20% faster than those on average, they'll all go obsolete for RT at roughly the same time. There are absolutely differences, but I don't see them as big enough to dismiss AMD outright.