> Stop watching LinuxTechTips, you'll get brain cancer.
I completely agree with you. I don't need influencers to form an opinion, though, unlike what you seem to be implying.
I said RTX was shit *during* the GDC keynote, before it dawned on most people here, or even on Linus himself. No, I don't need a cookie for that; I mention it just to illustrate my point. Long before the Turing launch I also said this generation would likely be a tiny jump in performance, and when RTX was first announced I predicted they were going to sell Turing on that feature alone. All of it on this very forum, so you can scan my post history to verify everything.
Crystal ball or common sense, you decide.