Actually, I think it was the GT 730 that was partly Fermi-based. Anyway, you knew what card you had, and Nvidia clearly told you which features to expect. That's the point.
There's no lottery and no browsing GitHub: official, precise information from the manufacturer. Isn't this how it's supposed to be?
The Nvidia site clearly states the chip name and family.
I mean: how would you feel if you went to a car dealer and they told you cars have gearboxes, but you'd have to check GitHub to find out whether there's a reverse gear?
Radeon VII has been around for 2.5 months, and since it's not mentioned by OBS, there's no way to learn what it can encode, right?
Wikipedia says it's VCE 4.1.
OBS says 4.0 was the "last VCE generation before AMD switched to the VCN naming scheme".
I look at the Radeon VII page and it still says "Video Coding Engine (VCE)".
https://www.amd.com/en/products/graphics/amd-radeon-vii
So OBS was wrong this time. Let's hope it's the first and last time, because what would we do otherwise...?
Mess.
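For what it's worth, until the documentation catches up, the most reliable way I've found is to ask the machine itself rather than the marketing page. Here's a rough sketch, assuming an ffmpeg build with hardware encoder support is installed; the encoder names (h264_amf, h264_nvenc, etc.) are ffmpeg's standard names, and the parsing is just illustrative:

```python
import subprocess

def list_hw_encoders():
    """List hardware encoders the local ffmpeg build exposes.

    Note: this shows what the build was compiled with, not what the
    driver/card will actually accept -- see encoder_works() for that.
    """
    out = subprocess.run(
        ["ffmpeg", "-hide_banner", "-encoders"],
        capture_output=True, text=True, check=True,
    ).stdout
    # AMD cards show up as *_amf (Windows) or *_vaapi (Linux),
    # Nvidia as *_nvenc, Intel as *_qsv.
    keywords = ("amf", "vaapi", "nvenc", "qsv")
    return [line.strip() for line in out.splitlines()
            if any(k in line for k in keywords)]

def encoder_works(name: str, width: int = 1920, height: int = 1080) -> bool:
    """Run a one-second test encode to see if the driver actually accepts it.

    Bump width/height to 3840x2160 to check whether the encode block
    handles 4K at all. (VA-API encoders need extra device/hwupload
    arguments, so this plain form mainly suits AMF/NVENC.)
    """
    cmd = ["ffmpeg", "-hide_banner", "-v", "error",
           "-f", "lavfi", "-i",
           f"testsrc=size={width}x{height}:rate=30:duration=1",
           "-c:v", name, "-f", "null", "-"]
    return subprocess.run(cmd, capture_output=True).returncode == 0

if __name__ == "__main__":
    for enc in list_hw_encoders():
        print(enc)
```

It still won't tell you everything the silicon can do, which is exactly the information the manufacturer should be publishing in the first place.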
Is it because AMD saves on everything it can to sell its products for 20% less than the competition? I'd rather pay more.
You can still game on a weaker card, but you can't grow encoding features on a card that doesn't have them.
The R9 390X was way more powerful than the 380X, but it could only encode 1080p while the 380X could do 4K. So which card is better for "gaming and streaming"?
And both cards are OK for 1440p high or even 4K low. It's not like there's much to admire anyway.
How is performance even a variable in this topic? We're talking about hardware encoding features. If you need more grunt, buy a bigger card. If you're playing Diablo 3 or Fortnite, a 1650 is fine. If you have a 250W ITX box, a 1080 Ti is an idiotic choice.