- Joined: Nov 4, 2005
- Messages: 12,057 (1.72/day)
| System Name | Compy 386 |
|---|---|
| Processor | 7800X3D |
| Motherboard | Asus |
| Cooling | Air for now..... |
| Memory | 64 GB DDR5 6400 MHz |
| Video Card(s) | 7900XTX 310 Merc |
| Storage | Samsung 990 2TB, 2 SP 2TB SSDs, 24TB Enterprise drives |
| Display(s) | 55" Samsung 4K HDR |
| Audio Device(s) | ATI HDMI |
| Mouse | Logitech MX518 |
| Keyboard | Razer |
| Software | A lot. |
| Benchmark Scores | It's fast. Enough. |
Saying this is like the dot-com bubble isn't the negative point you seem to think it is. Online retail rose during the dot-com bubble and became successful because of the funding it provided:
Dot-com bubble - Wikipedia (en.wikipedia.org):
"In 2000, the dot-com bubble burst, and many dot-com startups went out of business after burning through their venture capital and failing to become profitable.[5] However, many others, particularly online retailers like eBay and Amazon, blossomed and became highly profitable."
This is the same for any new market: a ton of startups pop up, and most of them go out of business while the industry finds out what does and doesn't work.
If you are implying that AI has no use, you'd be dead wrong; from a medical and engineering perspective alone, it's already indispensable.
The AI of everything will burst.
Bubble? Burst? It was a medium-size correction before cancer-like growth that hasn't stopped since; it just eased a bit in 2023. One more bubble burst like that, and Nvidia + Amazon + Apple + Microsoft + Google + a couple of AI newcomers will constitute 80% of the world economy before 2050.
It's a bubble that is going to burst once people realize how little actual work it is doing and how many resources it's sucking up for mediocre AI BS; it can't make me coffee yet. The dot-com bubble burst when people didn't flock to all the websites that were flashy but did nothing, and this will be the same. Then AI will be put to use where it should be: drug research, climate modeling, DNA function modeling, physics.

The fact that even a stack of hardware can't handle the data density means it's never going to scale to truly functional status within current silicon limitations, except in some supercomputers or for niche applications. A company I work with uses machine learning to determine seed health/viability in near real time, and it takes a whopping two GPUs. I don't need AI to text my wife and ask what's for dinner.