[QUOTE]
[QUOTE]
Okay, here's a challenge: show me a real-world use case of AI with relevance for a relatively ordinary person that isn't already being done by conventional algorithms. Please. 'Cause I haven't seen a single one. I'd be impressed if you could even find one where a significant and relevant performance improvement can be seen.
[/QUOTE]
Sorry, but I don't understand how a thinking adult can write something like this in 2020.
Yes, ML/AI is just about "algorithms" - like everything else we do on computers. If you don't know of any use cases, Google will help you...
[QUOTE]
Sure, not everyone wants a desktop PC, so we don't expect car makers to talk about them in their keynotes, do we? And no, regulators don't (usually) develop technologies, but consumers shouldn't be fed decade-or-more-away vaporware to create excitement for tech that might never show up either. All the hype for autonomous cars is just that: hype.
[/QUOTE]
Not everyone wants a desktop either. I don't understand this argument...
Same with the "decade away" argument. So what? Should we wait for regulations and only then start developing? How would that even work?
[QUOTE]
This is nothing new, all streaming providers have migrated across codecs already, most from something previous to H.265 to H.265. The thing you're missing is that they don't scrub their libraries of the other formats when this is done. You're trying to make this out as some situation where end users will actually notice the transition, which it isn't - it just means that Netflix can plan to scrub legacy formats from their libraries X years ahead when most people have moved on to hardware supporting AV1. This has zero impact on consumers. What do I care if Netflix or YouTube saves money? YT is free, and there's no way those savings are doing anything but padding Netflix's bottom line.
[/QUOTE]
Imagine a situation where Netflix uses AV1 and only Intel has a hardware decoder. YouTube may be going for AV1 as well...
For these streaming companies, more efficient codecs are the primary way to save costs.
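To make the scale of that concrete, here is a back-of-the-envelope sketch. Every figure in it is assumed purely for illustration (the ~30% AV1-vs-HEVC bitrate saving is a commonly quoted ballpark; the egress price and viewing hours are made up), not anyone's real numbers:
[CODE]
# Back-of-the-envelope only: every number below is assumed for illustration.
HEVC_BITRATE_MBPS = 5.0        # assumed bitrate of an HEVC 1080p stream
AV1_BITRATE_SAVING = 0.30      # commonly quoted ballpark, assumed here
EGRESS_PRICE_PER_GB = 0.01     # hypothetical delivery cost in $/GB
HOURS_STREAMED = 1_000_000     # hypothetical monthly viewing hours

def gb_per_hour(bitrate_mbps: float) -> float:
    """Convert a stream bitrate (Mbit/s) into data volume per hour (GB)."""
    return bitrate_mbps / 8 * 3600 / 1000

hevc_gb = gb_per_hour(HEVC_BITRATE_MBPS) * HOURS_STREAMED
av1_gb = gb_per_hour(HEVC_BITRATE_MBPS * (1 - AV1_BITRATE_SAVING)) * HOURS_STREAMED

print(f"HEVC: {hevc_gb:,.0f} GB delivered -> ${hevc_gb * EGRESS_PRICE_PER_GB:,.0f}")
print(f"AV1:  {av1_gb:,.0f} GB delivered -> ${av1_gb * EGRESS_PRICE_PER_GB:,.0f}")
[/CODE]
The point is only that the saving scales linearly with hours streamed, which is why it matters to Netflix and YouTube and not to any individual viewer.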
[QUOTE]
Intel's presentation was about potential consumer electronics in various fields that either don't exist or don't work. Fiction at best. And, key, their major field of expertise, which also tends to consist of real consumer electronics, was barely mentioned at all.
[/QUOTE]
And Intel's presentation was all about "consumer electronics", just from the tech point of view (not final products). AMD's presentation was mostly about gaming and occasionally about "creators". And only a small share of consumers does that.
[QUOTE]
I know. There's nothing revolutionary about AI whatsoever, just a massive hype train over fancy algorithms that don't do much new.
[/QUOTE]
Most people associate AI with autonomous cars, robots, Terminators and so on. That's just not what it is.
[QUOTE]
... you actually believe that? Tell me, how is your laptop supposed to "optimize your workflow"? In what way? This is typical vague nonsense that will never, ever pan out. Is it supposed to tell you when to take toilet breaks and when to drink more coffee, or nag you when you've been spending too much time talking to your colleagues?
[/QUOTE]
Dell just announced XPS laptops will learn how you use them, to optimize your workflow over time. We'll see more and more applications like that.
So your laptop kind of "thinks". But it's not really a "Skynet scenario", is it?
[QUOTE]
There are some very tiny improvements that can be made, like pre-loading applications into memory if behavior is recognized that could be a precursor to using that application, but ... that's not going to make any kind of difference unless all you do every day is open and close slow-loading applications.
[/QUOTE]
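For what it's worth, that kind of behavior-triggered pre-loading doesn't need anything exotic. A minimal sketch of the idea (the app names, the 0.6 confidence threshold and the preload() stub are all hypothetical, purely for illustration, not any vendor's actual implementation):
[CODE]
# A first-order "what usually gets launched next" model. Everything here is
# a hypothetical illustration of the idea, nothing more.
from collections import Counter, defaultdict

class LaunchPredictor:
    def __init__(self, threshold: float = 0.6):
        self.transitions = defaultdict(Counter)  # previous app -> counts of the next app
        self.threshold = threshold

    def record(self, prev_app: str, next_app: str) -> None:
        """Learn from one observed launch sequence."""
        self.transitions[prev_app][next_app] += 1

    def predict(self, current_app: str):
        """Return the likely next app, or None if confidence is too low."""
        counts = self.transitions[current_app]
        if not counts:
            return None
        app, hits = counts.most_common(1)[0]
        return app if hits / sum(counts.values()) >= self.threshold else None

def preload(app: str) -> None:
    print(f"warming {app} into memory")  # stand-in for the real OS hook

predictor = LaunchPredictor()
for prev, nxt in [("outlook", "excel"), ("outlook", "excel"), ("outlook", "teams")]:
    predictor.record(prev, nxt)

candidate = predictor.predict("outlook")
if candidate:
    preload(candidate)  # Excel follows Outlook 2 times out of 3, so warm it up
[/CODE]
The "learning" in something like this is little more than a frequency count plus a confidence threshold, which is also why it's cheap enough to run quietly in the background.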
[QUOTE]
Good for you. You belong to a tiny niche even within the PC enthusiast space, which is itself a tiny niche of humanity. For the rest of us, this has pretty much zero tangible benefit.
[/QUOTE]
The "AI" umbrella term includes ML. And that's what I do most of the time (at work and as a hobby). I benefit from ML boost libraries and ML accelerators.
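As an illustration of that kind of everyday ML work (assuming "ML boost libraries" refers to gradient-boosting libraries, which is a guess), here is a minimal scikit-learn sketch on a synthetic dataset. Vendor-optimized builds of libraries like this, or dedicated ML accelerators, speed up exactly this sort of fit/score call:
[CODE]
# Minimal, self-contained gradient-boosting example on synthetic data.
# scikit-learn is used only as an illustration of this class of library.
from sklearn.datasets import make_classification
from sklearn.ensemble import HistGradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=10_000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = HistGradientBoostingClassifier(max_iter=200)  # the compute-heavy part
model.fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.3f}")
[/CODE]
The code doesn't get any smarter on faster hardware or an optimized build; it just finishes sooner, which is the benefit being argued about here.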
[QUOTE]
Which is exactly what I was saying. They're trying to sell (slightly fancy) algorithms as something brand new and revolutionary, when the fact is that real-world improvements from this tech are ... tiny. Outside of datacenters and research, at least.
[/QUOTE]
As for "AI" itself, i.e. when the computer just decides how to do stuff - it's really nothing new. Photo/video editing software has been doing that for a long time. It's just that today we have chips and libraries that make this faster.
And there are many more possible uses.
Maybe virus scanners will benefit. Maybe some productivity software will get faster (it already does). Maybe my laptop will last longer on battery. It's all great.
[/QUOTE]
Sure, some stuff will get marginally faster, some stuff will get marginally better, and things will improve over time. I'm all for smarter battery management and similar systems, as most computer infrastructure is shockingly dumb in a lot of ways. But none of this is anything close to the revolutionary PR BS they are trying to sell it as. AI is not a new computing paradigm, it's just more of the same. Which is fine - what we already have is pretty frickin' cool. But slight improvements to various parts of it isn't going to revolutionize anything, so they should really stop saying that.