> You might not hate the AI so much if you only had a […]

Hate is not the right word or description. We just don't care and don't want it on our machines. The people of the world have been fine before it; we'll continue being fine without it.
System Name | My second and third PCs are Intel + Nvidia |
---|---|
Processor | AMD Ryzen 7 7800X3D @ 45 W TDP Eco Mode |
Motherboard | MSi Pro B650M-A Wifi |
Cooling | be quiet! Shadow Rock LP |
Memory | 2x 24 GB Corsair Vengeance EXPO DDR5-6000 CL36 |
Video Card(s) | PowerColor Reaper Radeon RX 9070 XT |
Storage | 2 TB Corsair MP600 GS, 4 TB Seagate Barracuda |
Display(s) | Dell S3422DWG 34" 1440 UW 144 Hz |
Case | Corsair Crystal 280X |
Audio Device(s) | Logitech Z333 2.1 speakers, AKG Y50 headphones |
Power Supply | 750 W Seasonic Prime GX |
Mouse | Logitech MX Master 2S |
Keyboard | Logitech G413 SE |
Software | Bazzite (Fedora Linux) KDE Plasma |
> Having taken a look at the results of the poll thus far, 83%+ say no thank you to AI. You watching @microsoft, @apple, @Google?? Hmmm?? Most of us don't care and don't want it on our devices/PCs! Take a hint! DON'T force it on us. We will react poorly and to your detriment.
> Hate is not the right word or description. We just don't care and don't want it on our machines. The people of the world have been fine before it; we'll continue being fine without it. I'm not saying AI and such don't have usefulness, just that it doesn't belong on our personal devices by default.

Someone correct me if I'm wrong, but I don't think this AI craze is about us at all. It all started with Nvidia putting Tensor cores into Volta, and then Turing, for datacentre use, and it continued with other companies putting AI into their architectures for the same purpose. Nearly every current-gen CPU and GPU has some form of AI capability now, but there is no use case for us, home users. Sure, we've got DLSS, but we've also got FSR, which works without AI. The only reason we have these "useless cores" is that home PC architectures trickle down from professional/datacentre ones, as developing separate architectures for separate use cases would cost too much money and effort. The problem is that all this AI development hurts advancements in other areas: AI cores take up die area, engineering teams spend time working on AI instead of something else, and so on. Without those advancements, these companies have no choice but to try to sell their products by advertising AI, however useless it is to us. They either do this, or skip AI altogether, which would hurt them a lot more on the datacentre front. So then, the only choice companies have is to use the "you'll hear about it repeatedly until you end up liking it" approach in their home PC marketing.
System Name | Kuro |
---|---|
Processor | AMD Ryzen 7 7800X3D@65W |
Motherboard | MSI MAG B650 Tomahawk WiFi |
Cooling | Thermalright Phantom Spirit 120 EVO |
Memory | Corsair DDR5 6000C30 2x48GB (Hynix M)@6000 30-36-36-76 1.36V |
Video Card(s) | PNY XLR8 RTX 4070 Ti SUPER 16G@200W |
Storage | Crucial T500 2TB + WD Blue 8TB |
Case | Lian Li LANCOOL 216 |
Power Supply | MSI MPG A850G |
Software | Ubuntu 24.04 LTS + Windows 10 Home Build 19045 |
Benchmark Scores | 17761 C23 Multi@65W |
> Someone correct me if I'm wrong, but I don't think this AI craze is about us at all. […]

Considering the apparent popularity of certain local AI uses like roleplaying, starting off with the early popularity of AIDungeon 2 (does anyone remember that?) and now done with things like finetuned Llama 3 models, and the number of AI-generated avatars on this very forum*, I'd say it's good to have some options that run locally, as opposed to just relegating them to API and server space, or doing something like what Intel did by dummying out AVX512 on client processors (not that GPU makers didn't dummy out FP64, but for entirely unrelated reasons).
System Name | My second and third PCs are Intel + Nvidia |
---|---|
Processor | AMD Ryzen 7 7800X3D @ 45 W TDP Eco Mode |
Motherboard | MSi Pro B650M-A Wifi |
Cooling | be quiet! Shadow Rock LP |
Memory | 2x 24 GB Corsair Vengeance EXPO DDR5-6000 CL36 |
Video Card(s) | PowerColor Reaper Radeon RX 9070 XT |
Storage | 2 TB Corsair MP600 GS, 4 TB Seagate Barracuda |
Display(s) | Dell S3422DWG 34" 1440 UW 144 Hz |
Case | Corsair Crystal 280X |
Audio Device(s) | Logitech Z333 2.1 speakers, AKG Y50 headphones |
Power Supply | 750 W Seasonic Prime GX |
Mouse | Logitech MX Master 2S |
Keyboard | Logitech G413 SE |
Software | Bazzite (Fedora Linux) KDE Plasma |
> Considering the apparent popularity of certain local AI uses like roleplaying […] I'd say it's good to have some options that run locally, as opposed to just relegating them to API and server space […]

Sure, but these weren't anything anyone asked for - nor did they exist during the Turing debut. It's more of a "since we have it, let's use it for something" approach to AI than a burning need now fulfilled.
Not that it mattered for the greater majority of users, as you observed - at least until more visible uses of generative AI for home users pop up, and a lot of people are certainly trying. Not that they would produce anything groundbreaking or even useful, but the hardware capability has to be there first.
*My own was grabbed off HF when they ran SD 1.5 demo.
System Name | Kuro |
---|---|
Processor | AMD Ryzen 7 7800X3D@65W |
Motherboard | MSI MAG B650 Tomahawk WiFi |
Cooling | Thermalright Phantom Spirit 120 EVO |
Memory | Corsair DDR5 6000C30 2x48GB (Hynix M)@6000 30-36-36-76 1.36V |
Video Card(s) | PNY XLR8 RTX 4070 Ti SUPER 16G@200W |
Storage | Crucial T500 2TB + WD Blue 8TB |
Case | Lian Li LANCOOL 216 |
Power Supply | MSI MPG A850G |
Software | Ubuntu 24.04 LTS + Windows 10 Home Build 19045 |
Benchmark Scores | 17761 C23 Multi@65W |
> Sure, but these weren't anything anyone asked for - nor did they exist during the Turing debut. […]

Arguably the whole thing actually started with Tesla/GeForce 8, with unified shaders and CUDA, when people started to realize that video hardware can have more uses than pixel-flinging. Cue supercomputers built with thousands of slightly-modified GPUs in the 2010s topping the TOP500.
> Someone correct me if I'm wrong, but I don't think this AI craze is about us at all. […]
> Edit: It's strikingly similar to the crypto craze, which was also never meant to be the Regular Joe's bread and butter, but instead, a way for a handful of mining farm operators to get filthy rich.
> Edit 2: I also think that DLSS was an afterthought to have something to sell GeForce cards with Tensor cores by - proven by the existence of the 16-series. Nvidia probably had it as a plan B in case AI wouldn't stick.

You make some compelling points.
Processor | Intel i5-12600k |
---|---|
Motherboard | Asus H670 TUF |
Cooling | Arctic Freezer 34 |
Memory | 2x16GB DDR4 3600 G.Skill Ripjaws V |
Video Card(s) | EVGA GTX 1060 SC |
Storage | 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500 |
Display(s) | Dell U3219Q + HP ZR24w |
Case | Raijintek Thetis |
Audio Device(s) | Audioquest Dragonfly Red :D |
Power Supply | Seasonic 620W M12 |
Mouse | Logitech G502 Proteus Core |
Keyboard | G.Skill KM780R |
Software | Arch Linux + Win10 |
> Someone correct me if I'm wrong, but I don't think this AI craze is about us at all. It all started with Nvidia putting Tensor cores into Volta, and then Turing for datacentre use […]

Arguably, it started way before that, when Nvidia realized computers are meant for, well, computing, and started their own compute stack. Tensors were just an evolutionary step: they enabled something that was much more difficult to do before, and here we are now. Hating on AI is like hating AVX512, SSE or the floating point accelerator.
> Hating on AI is like hating AVX512, SSE or the floating point accelerator.

Except that AVX & SSE are instruction sets within the inner workings of a processor. AI is much more than that. It has the potential for good or evil. Look at it this way - we'll borrow your analogy: AVX & SSE are tools, like a hammer. Alone, they do nothing and are benign. AI is like a human. Give a human a hammer and we can do either good or evil with it. But at the end of the day, a hammer is only ever going to be a hammer. It's the human using it that determines what the outcome will be. It's the same with AI. Once programmed by a human, the result produced by AI and the hardware it runs on will show whether the result is good or bad.
Processor | Intel i5-12600k |
---|---|
Motherboard | Asus H670 TUF |
Cooling | Arctic Freezer 34 |
Memory | 2x16GB DDR4 3600 G.Skill Ripjaws V |
Video Card(s) | EVGA GTX 1060 SC |
Storage | 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500 |
Display(s) | Dell U3219Q + HP ZR24w |
Case | Raijintek Thetis |
Audio Device(s) | Audioquest Dragonfly Red :D |
Power Supply | Seasonic 620W M12 |
Mouse | Logitech G502 Proteus Core |
Keyboard | G.Skill KM780R |
Software | Arch Linux + Win10 |
> Except that AVX & SSE are instruction sets within the inner workings of a processor. AI is much more than that. It has the potential for good or evil. […]

At this moment, AI is also just a series of computations (i.e. a tool). It has no more potential for good or evil than your CPU's AVX computations being used to optimize the heating in your home or to optimize the trajectory of a ballistic missile.
> humans are afraid of anything they don't understand

Or can't control. Both are perfectly natural. It is the responsibility of the creators of AI tools to show not only what can be done, but also that those tools can be trusted not to cause harm.
Processor | Intel i5-12600k |
---|---|
Motherboard | Asus H670 TUF |
Cooling | Arctic Freezer 34 |
Memory | 2x16GB DDR4 3600 G.Skill Ripjaws V |
Video Card(s) | EVGA GTX 1060 SC |
Storage | 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500 |
Display(s) | Dell U3219Q + HP ZR24w |
Case | Raijintek Thetis |
Audio Device(s) | Audioquest Dragonfly Red :D |
Power Supply | Seasonic 620W M12 |
Mouse | Logitech G502 Proteus Core |
Keyboard | G.Skill KM780R |
Software | Arch Linux + Win10 |
> […] It is the responsibility of the creators of AI tools to show not only what can be done, but also that those tools can be trusted not to cause harm.

You cannot prove that something can be trusted not to cause harm. Are you familiar with Asimov's "The Naked Sun"?
System Name | My second and third PCs are Intel + Nvidia |
---|---|
Processor | AMD Ryzen 7 7800X3D @ 45 W TDP Eco Mode |
Motherboard | MSi Pro B650M-A Wifi |
Cooling | be quiet! Shadow Rock LP |
Memory | 2x 24 GB Corsair Vengeance EXPO DDR5-6000 CL36 |
Video Card(s) | PowerColor Reaper Radeon RX 9070 XT |
Storage | 2 TB Corsair MP600 GS, 4 TB Seagate Barracuda |
Display(s) | Dell S3422DWG 34" 1440 UW 144 Hz |
Case | Corsair Crystal 280X |
Audio Device(s) | Logitech Z333 2.1 speakers, AKG Y50 headphones |
Power Supply | 750 W Seasonic Prime GX |
Mouse | Logitech MX Master 2S |
Keyboard | Logitech G413 SE |
Software | Bazzite (Fedora Linux) KDE Plasma |
> You make some compelling points. Still, I don't like the push from the "big-wigs" to get "AI" on our devices. I don't trust AI or them. Its ultimate usefulness has yet to be proven and we don't need the software side of AI bogging down our devices/systems.

Is there even a software side in the consumer space besides Microsoft's pitiful attempts with their Copilot bullshit?
> […] It is the responsibility of the creators of AI tools to show not only what can be done, but also that those tools can be trusted not to cause harm.

Tools can be trusted to do no harm only as much as we humans can be trusted not to use our tools to do harm.
Processor | Intel i5-12600k |
---|---|
Motherboard | Asus H670 TUF |
Cooling | Arctic Freezer 34 |
Memory | 2x16GB DDR4 3600 G.Skill Ripjaws V |
Video Card(s) | EVGA GTX 1060 SC |
Storage | 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500 |
Display(s) | Dell U3219Q + HP ZR24w |
Case | Raijintek Thetis |
Audio Device(s) | Audioquest Dragonfly Red :D |
Power Supply | Seasonic 620W M12 |
Mouse | Logitech G502 Proteus Core |
Keyboard | G.Skill KM780R |
Software | Arch Linux + Win10 |
> Is there even a software side in the consumer space besides Microsoft's pitiful attempts with their Copilot bullshit?

There is a push, but it's coming from marketing more than anything else. Any program using the crappiest of models, or even something slightly more involved than if/then/else, gets an AI sticker these days. For example, what I'm working on right now got an "AI assistant" because we threw a bunch of generic, domain-oriented documents at a model and it can answer a few very basic questions now. If you don't do that, investors will think you're lagging and invest their money elsewhere.
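(A hand-wavy sketch of what "throwing documents at a model" typically amounts to - not the poster's actual product, just an illustration of stuffing the best-matching document chunks into a prompt. The `DOCS` list, the `top_chunks()` and `build_prompt()` helpers, and the keyword-overlap scoring are all made up for the example; real assistants would use embeddings and then hand the finished prompt to whatever model they run.)

```python
# Toy illustration of "throw documents at a model": pick the document
# chunks that best match the question and paste them into the prompt.
# The scoring is deliberately naive (keyword overlap); the overall
# shape is the same in real products.

DOCS = [
    "Invoices are processed within 30 days of receipt.",
    "Support tickets are answered in the order they arrive.",
    "The warranty covers manufacturing defects for two years.",
]

def top_chunks(question: str, docs: list[str], k: int = 2) -> list[str]:
    """Rank documents by how many question words they share."""
    words = set(question.lower().split())
    return sorted(docs, key=lambda d: -len(words & set(d.lower().split())))[:k]

def build_prompt(question: str) -> str:
    """Stuff the best-matching chunks into a question-answering prompt."""
    context = "\n".join(top_chunks(question, DOCS))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

# The resulting prompt would then be sent to whatever model the product uses.
print(build_prompt("How long does the warranty last?"))
```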
As with all unknowns, our choice with AI is to fear it or to learn it. The second one seems wiser to me, considering that all three chip-designing companies are heavily invested in it, so AI is going to stay whether we like it or not. Based on what I've gathered so far, "AI" is nothing more than a fancy name for a new type of processing core that sits beside our usual INT and FP units and is made for matrix calculations. It is not intelligent, self-aware or self-sufficient, and it has no sense of morality or anything else whatsoever. It computes whatever we humans want to compute, just like any other part of your PC. Without software, it sits unused.
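To put that claim in concrete terms (a toy sketch, not any particular vendor's NPU or instruction set - the layer sizes and the use of NumPy are just assumptions for illustration): the workload these extra units accelerate is essentially dense matrix arithmetic, like the single fully-connected layer below.

```python
import numpy as np

# Toy illustration: the "AI" part of an NPU/Tensor-core workload is mostly
# dense matrix math, e.g. one fully-connected neural-network layer.
rng = np.random.default_rng(0)

x = rng.standard_normal((1, 64))    # input activations (batch of 1)
W = rng.standard_normal((64, 32))   # learned weights
b = np.zeros(32)                    # learned biases

y = np.maximum(x @ W + b, 0.0)      # matrix multiply + bias + ReLU

print(y.shape)  # (1, 32) -- no intent or "intelligence", just arithmetic
```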
System Name | Kuro |
---|---|
Processor | AMD Ryzen 7 7800X3D@65W |
Motherboard | MSI MAG B650 Tomahawk WiFi |
Cooling | Thermalright Phantom Spirit 120 EVO |
Memory | Corsair DDR5 6000C30 2x48GB (Hynix M)@6000 30-36-36-76 1.36V |
Video Card(s) | PNY XLR8 RTX 4070 Ti SUPER 16G@200W |
Storage | Crucial T500 2TB + WD Blue 8TB |
Case | Lian Li LANCOOL 216 |
Power Supply | MSI MPG A850G |
Software | Ubuntu 24.04 LTS + Windows 10 Home Build 19045 |
Benchmark Scores | 17761 C23 Multi@65W |
> Is there even a software side in the consumer space besides Microsoft's pitiful attempts with their Copilot bullshit?
> As with all unknowns, our choice with AI is to fear it or to learn it. […]
> There is a push, but it's coming from marketing more than anything else. […]

For programs that are actually on the frontier of local AI performance, and actually at the power-user end of the consumer space, Ollama can be good. SOTA models of 70B+ size and their finetunes can produce some fun and mostly self-consistent stories on prompt, game-master better RPs than AIDungeon 2 or 3 ever did back in the day, and hold their end of a discussion like this reasonably well, generally making sense with few glaring errors and hallucinations while sometimes offering specific insights beyond the obvious. You can run those models reasonably well with 64 GB of RAM or more, if you can tolerate the slow inference.
And once again: really nothing for the end user to worry about.
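For anyone curious what that looks like in practice, here is a minimal sketch against Ollama's local REST API. It assumes the Ollama server is running on its default port and that a 70B-class model has already been pulled; the model tag and the prompt are placeholders, not recommendations.

```python
# Minimal local-inference sketch against Ollama's REST API
# (assumes the Ollama server is running on localhost:11434 and that
# the model named below has already been pulled with `ollama pull`).
import json
import urllib.request

payload = {
    "model": "llama3.1:70b",  # placeholder tag; any pulled model works
    "prompt": "Game-master a short scene: a rainy cyberpunk market.",
    "stream": False,          # return one JSON object instead of a token stream
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```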
> Or can't control. Both are perfectly natural. It is the responsibility of the creators of AI tools to show not only what can be done, but also that those tools can be trusted not to cause harm.
That has been going on for a while now for tech products. In the '90s it was naughties and FPS; slightly later it was - and still is - social networks and their associated ills, and things like phone use where they shouldn't be; now it's this. Never before has it been regarded as an existential risk by a whole lot of otherwise reasonable people, though.

> Tools can be trusted to do no harm only as much as we humans can be trusted not to use our tools to do harm.
Unfortunately (or not?), we live in an age when technological advancement is multitudes faster than the rate at which humanity's readiness for it is growing. Where it gets us, we'll see. Ride or die, I guess.
> You cannot prove that something can be trusted not to cause harm. Are you familiar with Asimov's "The Naked Sun"?

Testing and open disclosure easily go a long way toward demonstrating it.
Apple has the "Apple Intelligence". Google is working on something for Android and ChromeOS.Is there even a software side in the consumer space besides Microsoft's pitiful attempts with their Copilot bullshit?
System Name | ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017) |
---|---|
Processor | ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K |
Motherboard | ❶ X570-F ❷ Z390-E ❸ Z270-E |
Cooling | ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62 |
Memory | ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16 |
Video Card(s) | ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI |
Storage | ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD |
Display(s) | ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS |
Case | ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C |
Audio Device(s) | ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432 |
Power Supply | ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2 |
Mouse | ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502 |
Keyboard | ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610 |
Software | ❶ Win 11 ❷ 10 ❸ 10 |
Benchmark Scores | I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail |
> Testing and open disclosure easily go a long way toward demonstrating it.
> Apple has "Apple Intelligence". Google is working on something for Android and ChromeOS.
System Name | PCGOD |
---|---|
Processor | AMD FX 8350@ 5.0GHz |
Motherboard | Asus TUF 990FX Sabertooth R2 2901 Bios |
Cooling | Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED |
Memory | 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V) |
Video Card(s) | AMD Radeon 290 Sapphire Vapor-X |
Storage | Samsung 840 Pro 256GB, WD Velociraptor 1TB |
Display(s) | NEC Multisync LCD 1700V (Display Port Adapter) |
Case | AeroCool Xpredator Evil Blue Edition |
Audio Device(s) | Creative Labs Sound Blaster ZxR |
Power Supply | Seasonic 1250 XM2 Series (XP3) |
Mouse | Roccat Kone XTD |
Keyboard | Roccat Ryos MK Pro |
Software | Windows 7 Pro 64 |
> I have no use for AI or any interest in it for now so no.

It's a part of the hive mind/Skynet BS, no thanks.