Processor | Ryzen 7 5700X |
---|---|
Motherboard | ASUS TUF Gaming X570-PRO (WiFi 6) |
Cooling | Noctua NH-C14S (two fans) |
Memory | 2x16GB DDR4 3200 |
Video Card(s) | Reference Vega 64 |
Storage | Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA |
Display(s) | Nixeus NX-EDG27, and Samsung S23A700 |
Case | Fractal Design R5 |
Power Supply | Seasonic PRIME TITANIUM 850W |
Mouse | Logitech |
VR HMD | Oculus Rift |
Software | Windows 11 Pro, and Ubuntu 20.04 |
Good God, man! That isn't my argument. I only said that older nodes are more profitable. Of course newer nodes are necessary to keep the industry moving forward. In time, they will also become more profitable as the initial investment is paid off.

Your whole argument is that we should still be on 3000 nm because it would definitely be cheaper to make. Got it. Never innovate, never change, never improve — wait, no em-dash — never innovate, never change, never improve: that's how to stay competitive. I've often said Intel was a successful company when they completely stopped innovating.
Not like you could shrink nodes or use chiplets, and over time they would outweigh the profit of older architectures. Of course not, that would be ridiculous. Must be why cell phones that used to cost $500 can be bought at 7-Eleven for 20 bucks.
You can make excuses and arguments and spend your entire paycheck on Intel every week, and it wouldn't make a difference. Because where the money actually is — the server market — they are inferior and nowhere to be seen. Too few cores because of their architecture, too little efficiency because of bad design. Period, full stop; the stock price reflects it perfectly.