> It has something similar, but I'll look into that.

RDNA doesn't actually have dedicated AI (matrix) cores, though.
> So you think Nvidia spends the vast majority of its marketing time and resources on a feature that literally nobody cares about? C'mon...

Yes, I think it does.

> I'm not even entirely sure what you are arguing with here. That if Nvidia tried, they could somehow make it work on Ampere and Turing? Sure they could, I have no doubt about it. The question is, why should they spend resources doing that for free, and what would the end result be? Would it actually work properly? And if it didn't, we would go back to the same argument about artificial limitations and Nvidia being greedy, making it work like crap.

1. Caring about existing customers = a better company image.
2. Wider game adoption.
3. Even if it doesn't work properly, it's at least a demo for your future purchase, just like RT on a 1660 Ti was.
4. If spending resources on it doesn't make sense, then why did they decide to do it now? Why not just leave it?