> It has something similar, but I'll look into that.

RDNA doesn't actually have AI (matrix) cores, though.
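Since "matrix cores" can sound like pure marketing, here is a minimal sketch of what they mean to a programmer: on Nvidia hardware, the tensor cores are exposed through CUDA's WMMA API, where one warp issues a whole 16x16x16 matrix multiply-accumulate as a single operation. This is only an illustration of the hardware feature under discussion, not anything from this thread; the kernel and variable names are my own, and it assumes a Volta-or-newer GPU.

```cuda
#include <cstdio>
#include <cuda_fp16.h>
#include <mma.h>

using namespace nvcuda;

// One warp computes C = A * B for a single 16x16 tile on the tensor cores.
// Requires compute capability 7.0+; compile with e.g. `nvcc -arch=sm_70 wmma_demo.cu`.
__global__ void wmma_tile(const half *a, const half *b, float *c) {
    wmma::fragment<wmma::matrix_a, 16, 16, 16, half, wmma::row_major> a_frag;
    wmma::fragment<wmma::matrix_b, 16, 16, 16, half, wmma::row_major> b_frag;
    wmma::fragment<wmma::accumulator, 16, 16, 16, float> c_frag;

    wmma::fill_fragment(c_frag, 0.0f);              // start the accumulator at zero
    wmma::load_matrix_sync(a_frag, a, 16);          // load the 16x16 A tile (leading dim 16)
    wmma::load_matrix_sync(b_frag, b, 16);          // load the 16x16 B tile
    wmma::mma_sync(c_frag, a_frag, b_frag, c_frag); // C += A * B in one matrix instruction
    wmma::store_matrix_sync(c, c_frag, 16, wmma::mem_row_major);
}

int main() {
    half *a, *b;
    float *c;
    cudaMallocManaged(&a, 16 * 16 * sizeof(half));
    cudaMallocManaged(&b, 16 * 16 * sizeof(half));
    cudaMallocManaged(&c, 16 * 16 * sizeof(float));
    for (int i = 0; i < 16 * 16; ++i) {
        a[i] = __float2half(1.0f);                  // all-ones inputs make the result easy to check
        b[i] = __float2half(1.0f);
    }

    wmma_tile<<<1, 32>>>(a, b, c);                  // exactly one warp drives the WMMA tile
    cudaDeviceSynchronize();
    printf("c[0] = %.1f (expected 16.0)\n", c[0]);  // dot product of 16 ones

    cudaFree(a);
    cudaFree(b);
    cudaFree(c);
    return 0;
}
```

For what it's worth, RDNA 3 does expose similar WMMA instructions, but they execute on the regular shader SIMDs rather than on separate matrix units, which is exactly the distinction being made above.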
> So you think Nvidia spends the vast majority of its marketing time and resources on a feature that literally nobody cares about? C'mon...

Yes, I think so.
> I'm not even entirely sure what you are arguing with here. That if Nvidia tried, they could somehow make it work on Ampere and Turing? Sure they could, I have no doubt about it. The question is, why should they spend resources doing that for free, and what would the end result be? Would it actually work properly? And if it didn't, we would go back to the same argument about artificial limitations and Nvidia being greedy, making it work like crap.

1. Caring about existing customers = better company image.
2. Wider game adoption.
3. Even if it doesn't work properly, it's at least a demo for your future purchase, just like RT on a 1660 Ti was.
4. If spending resources on it doesn't make sense, then why did they decide to do it now? Why not just leave it?