
Would you pay more for hardware with AI capabilities?


  • Yes: 1,909 votes (7.4%)
  • No: 21,627 votes (83.9%)
  • Don't know: 2,227 votes (8.6%)

Total voters: 25,763
Joined
Mar 21, 2016
Messages
2,357 (0.78/day)
You might not hate the AI so much if you only had a
 

Attachments

  • WBRNMiJkaYxDkJD3VIlD--1--9gl54.jpg (540.1 KB)
Joined
Jul 5, 2013
Messages
26,047 (6.46/day)
Having taken a look at the results of the poll thus far, 83%+ say no thank you to AI.

You watching @microsoft, @apple, @Google?? Hmmm?? Most of us don't care and don't want it on our devices/PCs! Take a hint! DON'T force it on us. We will react poorly and to your detriment.

You might not hate the AI so much if you only had a
Hate is not the right word or description. We just don't care and don't want it on our machines. The people of the world were fine before it; we'll continue being fine without it.

I'm not saying AI and such don't have usefulness, just that it doesn't belong on our personal devices by default.
 
Last edited:
Joined
Jan 14, 2019
Messages
10,614 (5.28/day)
Location
Midlands, UK
System Name Holiday Season Budget Computer (HSBC)
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 6500 XT 4 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Having taken a look at the results of the poll thus far, 83%+ say no thank you to AI.

You watching @microsoft, @apple, @Google?? Hmmm?? Most of us don't care and don't want it on our devices/PCs! Take a hint! DON'T force it on us. We will react poorly and to your detriment.


Hate is not the right word or description. We just don't care and don't want it on our machines. The people of the world were fine before it; we'll continue being fine without it.

I'm not saying AI and such don't have usefulness, just that it doesn't belong on our personal devices by default.
Someone correct me if I'm wrong, but I don't think this AI craze is about us at all. It all started with Nvidia putting Tensor cores into Volta, and then Turing, for datacentre use, and it continued with other companies putting AI into their architectures for the same purpose. Nearly every current-gen CPU and GPU has some form of AI capability now, but there is no use case for us home users. Sure, we've got DLSS, but we've also got FSR, which works without AI. The only reason we have these "useless cores" is that home PC architectures trickle down from professional/datacentre ones, as developing separate architectures for separate use cases would cost too much money and effort. The problem is that all this AI development hurts advancements in other areas: AI cores take up die area, engineering teams spend time working on AI instead of something else, and so on. Without those advancements, these companies have no choice but to try to sell their products by advertising AI, however useless it is to us. They either do this, or skip AI altogether, which would hurt them a lot more on the datacentre front. So the only choice companies have is to use the "you'll hear about it repeatedly until you end up liking it" approach in their home PC marketing.

Edit: It's strikingly similar to the crypto craze, which was also never meant to be the Regular Joe's bread and butter, but instead, a way for a handful of mining farm operators to get filthy rich.

Edit 2: I also think that DLSS was an afterthought to have something to sell GeForce cards with Tensor cores by - proven by the existence of the 16-series. Nvidia probably had it as a plan B in case AI wouldn't stick.
 
Last edited:
Joined
May 22, 2024
Messages
188 (3.36/day)
System Name Kuro
Processor AMD Ryzen 7 7800X3D@65W
Motherboard MSI MAG B650 Tomahawk WiFi
Cooling Thermalright Phantom Spirit 120 EVO
Memory Corsair DDR5 6000C30 2x48GB (Hynix M)@6000 30-36-36-48 1.36V
Video Card(s) PNY XLR8 RTX 4070 Ti SUPER 16G@200W
Storage Crucial T500 2TB + WD Blue 8TB
Case Lian Li LANCOOL 216
Audio Device(s) Sound Blaster AE-7
Power Supply MSI MPG A850G
Software Ubuntu 24.04 LTS + Windows 10 Home Build 19045
Benchmark Scores 17761 C23 Multi@65W
Someone correct me if I'm wrong, but I don't think this AI craze is about us at all. It all started with Nvidia putting Tensor cores into Volta, and then Turing for datacentre use, and then it continued with other companies putting AI into their architectures for the same purpose. Nearly every current-gen CPU and GPU has some form of AI capability now, but there is no use case for us, home users. Sure, we've got DLSS, but we've also got FSR which works without AI. The only reason we have these "useless cores" is because home PC architectures trickle down from professional/datacentre ones, as developing separate architectures for separate use cases would cost too much money and effort. The problem is that all this AI development hurts advancements in other areas, AI cores take up die area, engineering teams spend time working on AI instead of something else, etc, and without those advancements, these companies have no choice but to try to sell their products by advertising AI, however useless it is to us. They either do this, or skip AI altogether, which would hurt them a lot more on the datacentre front.
Considering the apparent popularity of certain local AI uses like roleplaying, starting with the early popularity of AIDungeon 2 - does anyone remember that? - now done with things like finetuned Llama 3 models, and the number of AI-generated avatars on this very forum*, I'd say it's good to have some options that run locally, as opposed to relegating them to API and server space, or doing something like what Intel did by dummying out AVX512 on client processors (not that GPU makers didn't dummy out FP64, but for entirely unrelated reasons).

Not that it matters much for the great majority of users, as you observed, at least until more visible uses of generative AI for home users pop up, and a lot of people are certainly trying. Not that they would produce anything groundbreaking or even useful, but the hardware capability has to be there first.

*My own was grabbed off HF when they ran SD 1.5 demo.
 
Joined
Jan 14, 2019
Messages
10,614 (5.28/day)
Location
Midlands, UK
System Name Holiday Season Budget Computer (HSBC)
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 6500 XT 4 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Considering the apparent popularity of certain local AI uses like roleplaying, starting off with the early popularity of AIDungeon 2 - Does anyone remember that? - now done with things like finetuned Llama 3 models, and the number of AI-generated avatars on this very forum*, I'd say it's good to have some options that run local, as opposed to just relegating them to API and server space, or do something like what Intel did by dummying out AVX512 (Not that GPU makers didn't dummy out FP64, but for entirely unrelated reasons.)

Not that it mattered for the greater majority of users, as you observed. Until more visible use of generative AI for home users pops up, and a lot of people are certainly trying. Not that they would produce anything groundbreaking or even useful, but the hardware capability has to be there first.

*My own was grabbed off HF when they ran SD 1.5 demo.
Sure, but these weren't anything anyone asked for - nor did they exist during the Turing debut. It's more of a "since we have it, let's use it for something" approach to AI instead of a burning need now fulfilled.
 
Joined
May 22, 2024
Messages
188 (3.36/day)
System Name Kuro
Processor AMD Ryzen 7 7800X3D@65W
Motherboard MSI MAG B650 Tomahawk WiFi
Cooling Thermalright Phantom Spirit 120 EVO
Memory Corsair DDR5 6000C30 2x48GB (Hynix M)@6000 30-36-36-48 1.36V
Video Card(s) PNY XLR8 RTX 4070 Ti SUPER 16G@200W
Storage Crucial T500 2TB + WD Blue 8TB
Case Lian Li LANCOOL 216
Audio Device(s) Sound Blaster AE-7
Power Supply MSI MPG A850G
Software Ubuntu 24.04 LTS + Windows 10 Home Build 19045
Benchmark Scores 17761 C23 Multi@65W
Sure, but these weren't anything anyone asked for - nor did they exist during the Turing debut. It's more of a "since we have it, let's use it for something" approach to AI instead of a burning need now fulfilled.
Arguably the whole thing actually started with Tesla/GeForce 8, with unified shaders and CUDA, when people started to realize that video hardware can have more uses than pixel-flinging. Cue supercomputers built with thousands of slightly-modified GPUs in the 2010s topping the TOP500.

Again, the capabilities would have to be there first. I think there is always an unspoken and not necessarily burning need for better interactivity in video games, considering the kind of hype then associated with AI and atmospheric actions in games from Oblivion to Skyrim, before graphics with HD textures, fancy shaders, and ray-traced effects took over. Current advancements have not yet really percolated into that space, outside of tech demos.
 
Joined
Jul 5, 2013
Messages
26,047 (6.46/day)
Someone correct me if I'm wrong, but I don't think this AI craze is about us at all. It all started with Nvidia putting Tensor cores into Volta, and then Turing for datacentre use, and then it continued with other companies putting AI into their architectures for the same purpose. Nearly every current-gen CPU and GPU has some form of AI capability now, but there is no use case for us, home users. Sure, we've got DLSS, but we've also got FSR which works without AI. The only reason we have these "useless cores" is because home PC architectures trickle down from professional/datacentre ones, as developing separate architectures for separate use cases would cost too much money and effort. The problem is that all this AI development hurts advancements in other areas, AI cores take up die area, engineering teams spend time working on AI instead of something else, etc, and without those advancements, these companies have no choice but to try to sell their products by advertising AI, however useless it is to us. They either do this, or skip AI altogether, which would hurt them a lot more on the datacentre front. So then, the only choice companies have is to use the "you'll hear about it repeatedly until you end up liking it" approach in their home PC marketing.

Edit: It's strikingly similar to the crypto craze, which was also never meant to be the Regular Joe's bread and butter, but instead, a way for a handful of mining farm operators to get filthy rich.

Edit 2: I also think that DLSS was an afterthought to have something to sell GeForce cards with Tensor cores by - proven by the existence of the 16-series. Nvidia probably had it as a plan B in case AI wouldn't stick.
You make some compelling points.

Still, I don't like the push from the "big-wigs" to get "AI" onto our devices. I don't trust AI, or them. Its ultimate usefulness has yet to be proven, and we don't need the software side of AI bogging down our devices/systems.
 

bug

Joined
May 22, 2015
Messages
13,454 (4.02/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Someone correct me if I'm wrong, but I don't think this AI craze is about us at all. It all started with Nvidia putting Tensor cores into Volta, and then Turing for datacentre use, and then it continued with other companies putting AI into their architectures for the same purpose. Nearly every current-gen CPU and GPU has some form of AI capability now, but there is no use case for us, home users. Sure, we've got DLSS, but we've also got FSR which works without AI. The only reason we have these "useless cores" is because home PC architectures trickle down from professional/datacentre ones, as developing separate architectures for separate use cases would cost too much money and effort. The problem is that all this AI development hurts advancements in other areas, AI cores take up die area, engineering teams spend time working on AI instead of something else, etc, and without those advancements, these companies have no choice but to try to sell their products by advertising AI, however useless it is to us. They either do this, or skip AI altogether, which would hurt them a lot more on the datacentre front. So then, the only choice companies have is to use the "you'll hear about it repeatedly until you end up liking it" approach in their home PC marketing.

Edit: It's strikingly similar to the crypto craze, which was also never meant to be the Regular Joe's bread and butter, but instead, a way for a handful of mining farm operators to get filthy rich.

Edit 2: I also think that DLSS was an afterthought to have something to sell GeForce cards with Tensor cores by - proven by the existence of the 16-series. Nvidia probably had it as a plan B in case AI wouldn't stick.
Arguably, it started way before that, when Nvidia realized computers are meant for, well, computing, and started their own compute stack. Tensors were just an evolutionary step; they enabled something that was much more difficult to do before, and here we are now. Hating on AI is like hating AVX512, SSE or the floating point accelerator.
 
Last edited:
Joined
Jul 5, 2013
Messages
26,047 (6.46/day)
Hating on AI is like hating AVX512, SSE or the floating point accelerator.
Except that AVX & SSE are instruction sets within the inner workings of a processor. AI is much more than that. It has the potential for good or evil. Look at it this way, to borrow your analogy: AVX & SSE are tools, like a hammer. Alone they do nothing and are benign. AI is like a human. Give a human a hammer and they can do both good and evil with it. But at the end of the day, a hammer is only ever going to be a hammer. It's the human using it that determines what the outcome will be. It's the same with AI. Once programmed by a human, the result produced by AI and the hardware it runs on will show whether the result is good or bad.
 

bug

Joined
May 22, 2015
Messages
13,454 (4.02/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Except that AVX & SSE are instruction sets within the inner workings of a processor. AI is much more than that. It has the potential for good or evil. Look at it this way, to borrow your analogy: AVX & SSE are tools, like a hammer. Alone they do nothing and are benign. AI is like a human. Give a human a hammer and they can do both good and evil with it. But at the end of the day, a hammer is only ever going to be a hammer. It's the human using it that determines what the outcome will be. It's the same with AI. Once programmed by a human, the result produced by AI and the hardware it runs on will show whether the result is good or bad.
At this moment, AI is also just a series of computations (i.e. a tool). It has no more potential for good or evil than your CPU's AVX computations, which can be used to optimize the heating in your home or the trajectory of a ballistic missile.

Publicly available LLMs do not learn on their own. That's why you have GPT 3, 3.5, 4, 4o and so on. They're models validated by humans that will not step outside their approved boundaries. You cannot even query them without corrections being applied; they're a looong way from doing anything autonomously.

Yes, there is potential for harm, but which invention/advancement doesn't carry that? And yes, there is fear, but it's instinctual: humans are afraid of anything they don't understand.
 

bug

Joined
May 22, 2015
Messages
13,454 (4.02/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Or can't control. Both are perfectly natural. It is the responsibility of the creators of AI tools to show not only what can be done, but also that those tools can be trusted not to cause harm.
You cannot prove that something can be trusted to not cause harm. Are you familiar with Asimov's "The Naked Sun"?
 
Joined
Jan 14, 2019
Messages
10,614 (5.28/day)
Location
Midlands, UK
System Name Holiday Season Budget Computer (HSBC)
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 6500 XT 4 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
You make some compelling points.

Still, I don't like the push from the "big-wigs" to get "AI" onto our devices. I don't trust AI, or them. Its ultimate usefulness has yet to be proven, and we don't need the software side of AI bogging down our devices/systems.
Is there even a software side in the consumer space besides Microsoft's pitiful attempts with their Copilot bullshit?

As with all unknowns, our choice with AI is to fear it or learn it. The second one seems wiser to me considering that all three chip designer companies are heavily invested in it, so AI is going to stay whether we like it or not. Based on what I've gathered so far, AI is nothing more than a fancy name for a new type of processing cores besides our usual INT and FP units, made for matrix calculations. It is not intelligent, self-aware or self-sufficient, and it has no sense of morality or anything else whatsoever. It computes whatever we, humans want to compute, just like any other part of your PC. Without software, it sits unused.
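To put that "fancy name for matrix calculations" point in concrete terms, here is a toy sketch in plain Python (purely illustrative, nothing to do with any vendor's actual silicon): the work that tensor cores and NPUs accelerate is essentially multiply-accumulate arithmetic, which is all a dense neural-network layer is.

```python
# Toy illustration: a dense neural-network layer is just
# multiply-accumulate arithmetic - exactly the operation that
# tensor cores / NPUs are built to speed up.
def layer(weights, bias, x):
    """One dense layer: y = relu(W @ x + b), spelled out by hand."""
    return [
        max(0.0, sum(w * xi for w, xi in zip(row, x)) + b)
        for row, b in zip(weights, bias)
    ]

W = [[1.0, -2.0], [0.5, 0.5]]    # 2 inputs -> 2 outputs
b = [0.0, -0.25]
print(layer(W, b, [1.0, 1.0]))   # [0.0, 0.75]
```

No intelligence anywhere in there, just arithmetic; dedicated hardware exists only to do billions of these operations per second.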

Or can't control. Both are perfectly natural. It is the responsibility of the creators of AI tools to show not only what can be done, but also that those tools can be trusted not to cause harm.
Tools can be trusted to do no harm only as much as we, humans can be trusted not to use our tools to do harm.

Unfortunately (or not?), we live in an age when technological advancement is multitudes faster than the rate at which humanity's readiness for it is growing. Where it gets us, we'll see. Ride or die, I guess. :ohwell:
 

bug

Joined
May 22, 2015
Messages
13,454 (4.02/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Is there even a software side in the consumer space besides Microsoft's pitiful attempts with their Copilot bullshit?

As with all unknowns, our choice with AI is to fear it or learn it. The second one seems wiser to me considering that all three chip designer companies are heavily invested in it, so AI is going to stay whether we like it or not. Based on what I've gathered so far, AI is nothing more than a fancy name for a new type of processing cores besides our usual INT and FP units, made for matrix calculations. It is not intelligent, self-aware or self-sufficient, and it has no sense of morality or anything else whatsoever. It computes whatever we, humans want to compute, just like any other part of your PC. Without software, it sits unused.
There is a push, but it's coming from marketing more than anything else. Any program using the crappiest of models, or even something slightly more involved than if/then/else, gets an AI sticker these days. For example, what I'm working on right now got an "AI assistant" because we threw a bunch of generic, domain-oriented documents at a model, and it can answer a few very basic questions now. If you don't do that, investors will think you're lagging and invest their money elsewhere.
And once again: really nothing for the end user to worry about.
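For what it's worth, the "threw a bunch of documents at a model" pattern can be caricatured in a few lines. This is a deliberately dumbed-down, hypothetical sketch: real assistants use embeddings and a language model rather than word overlap, but the shape is the same (retrieve relevant text first, then answer from it).

```python
import re

# Caricature of document-grounded Q&A: rank documents by word overlap
# with the question and return the best match.
def tokens(text):
    # Lowercase and keep only alphabetic words, so punctuation
    # doesn't prevent matches like "warranty?" vs "warranty."
    return set(re.findall(r"[a-z]+", text.lower()))

def retrieve(question, documents):
    q = tokens(question)
    return max(documents, key=lambda d: len(q & tokens(d)))

docs = [
    "The warranty covers hardware defects for two years.",
    "Support tickets are answered within one business day.",
]
print(retrieve("How long is the warranty?", docs))
# -> "The warranty covers hardware defects for two years."
```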
 
Joined
May 22, 2024
Messages
188 (3.36/day)
System Name Kuro
Processor AMD Ryzen 7 7800X3D@65W
Motherboard MSI MAG B650 Tomahawk WiFi
Cooling Thermalright Phantom Spirit 120 EVO
Memory Corsair DDR5 6000C30 2x48GB (Hynix M)@6000 30-36-36-48 1.36V
Video Card(s) PNY XLR8 RTX 4070 Ti SUPER 16G@200W
Storage Crucial T500 2TB + WD Blue 8TB
Case Lian Li LANCOOL 216
Audio Device(s) Sound Blaster AE-7
Power Supply MSI MPG A850G
Software Ubuntu 24.04 LTS + Windows 10 Home Build 19045
Benchmark Scores 17761 C23 Multi@65W
Is there even a software side in the consumer space besides Microsoft's pitiful attempts with their Copilot bullshit?

As with all unknowns, our choice with AI is to fear it or learn it. The second one seems wiser to me considering that all three chip designer companies are heavily invested in it, so AI is going to stay whether we like it or not. Based on what I've gathered so far, AI is nothing more than a fancy name for a new type of processing cores besides our usual INT and FP units, made for matrix calculations. It is not intelligent, self-aware or self-sufficient, and it has no sense of morality or anything else whatsoever. It computes whatever we, humans want to compute, just like any other part of your PC. Without software, it sits unused.
There is a push, but it's coming from marketing more than anything else. Any program using the crappiest of models, or even something slightly more involved than if/then/else gets an AI sticker these days. For example, what I'm working on right now got an "AI assistant" because we threw a bunch of generic, domain-oriented, documents at a model and it can answer a few very basic questions now. If you don't do that, investors will think you're lagging and invest their money elsewhere.
And once again: really nothing for the end user to worry about.
For programs that are actually on the frontier of local AI performance, and actually at the power-user end of the consumer space, Ollama can be good. SOTA models of 70B+ size and their finetunes can produce a fun and mostly self-consistent story on prompt, game-master better RPs than AIDungeon 2 or 3 ever did back in the day, and hold their end of a discussion like this reasonably well, generally making sense with few glaring errors and hallucinations, while sometimes offering specific insights beyond the obvious. You can run those models reasonably well with 64 GB of RAM or more, if you can tolerate the slow inference.

Notably, that software and its backend currently have no support for dedicated NPUs, and are already memory bound on most current systems, as far as I'm aware.
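The memory-bound point can be made concrete with back-of-the-envelope arithmetic: generating one token streams (roughly) every model weight through memory once, so memory bandwidth, not compute, caps tokens per second. The numbers below are illustrative assumptions, not benchmarks:

```python
# Rough model of why local LLM inference is memory bound: each
# generated token reads (approximately) all model weights once,
# so throughput is capped by bandwidth / model size.
def tokens_per_second(params_billion, bits_per_weight, bandwidth_gb_s):
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return bandwidth_gb_s * 1e9 / weight_bytes

# 70B model, 4-bit quantised (~35 GB of weights), dual-channel DDR5
# at an assumed ~90 GB/s: only a couple of tokens per second.
print(round(tokens_per_second(70, 4, 90), 1))  # -> 2.6
```

Under those assumptions, a faster NPU wouldn't help much; the weights simply cannot be fetched any faster, which matches the observation that these workloads are already memory bound on most current systems.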
Or can't control. Both are perfectly natural. It is the responsibility of the creators of AI tools to show not only what can be done, but also that those tools can be trusted not to cause harm.
Tools can be trusted to do no harm only as much as we, humans can be trusted not to use our tools to do harm.

Unfortunately (or not?), we live in an age when technological advancement is multitudes faster than the rate at which humanity's readiness for it is growing. Where it gets us, we'll see. Ride or die, I guess. :ohwell:
That has been going on for a while now with tech products. In the '90s it was naughty content and FPS games; slightly later it was - and still is - social networks and their associated ills, and things like phone use where it shouldn't be; now it's this. Never before has it been regarded as an existential risk by a whole lot of otherwise reasonable people, though.

Problems with abuses of lesser generative AIs have already become evident, even though these tools don't actually have a will of their own.

As for tools not causing harm, observe the reception of Stable Diffusion 3, and how tools restrained from causing potential harm can also become less useful for their designed purpose, even though that specific example was more or less inevitable.
 
Last edited:
Joined
Mar 21, 2016
Messages
2,357 (0.78/day)
First I've heard of AI Dungeon; sounds neat. I'm going to take a peek at it just out of curiosity.

You decide to test the blade's power on this unsuspecting creature. With a swift motion, you swing the weapon, feeling the blood-red jewel's energy coursing through you. The blade sings through the air, leaving a trail of shimmering energy in its wake. The creature, sensing danger, turns to face you, its eyes wide with fear. But it's too late. The blade connects with a sickening thud, and the creature collapses, lifeless. You stand there, panting slightly, taking in the aftermath of your first kill. The creature lies still at your feet, its lifeless body a testament to the power of the relic.

I am a bad man XD
 
Last edited:
Joined
Jul 5, 2013
Messages
26,047 (6.46/day)
You cannot prove something cannot be trusted to not cause harm. Are you familiar with Asimov's "The Naked Sun"?
Testing and open disclosure go a long way toward demonstrating it.

Is there even a software side in the consumer space besides Microsoft's pitiful attempts with their Copilot bullshit?
Apple has "Apple Intelligence". Google is working on something for Android and ChromeOS.
 
Joined
Jan 1, 2019
Messages
448 (0.22/day)
The GPU is designed for running HLSL, which it is good at. Some weather applications can leverage the GPU to render climate models, but the scope is limited.
 
Joined
Jan 20, 2019
Messages
1,364 (0.68/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
voted "dont know"

At some point when its use is compellingly useful... yeah why not.

If AI automation wipes my ass after a dump without leaving the gaming chair, i'm IN!
 
Joined
Nov 25, 2012
Messages
247 (0.06/day)
Hell no.
AI is a buzzword.

Not that I have looked into it much, but is there an open standard that works?
Also, since I dislike big Green, I'm against anything they're pushing and any closed standards.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
40,698 (6.54/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
I have no use for AI or any interest in it for now so no.
It's a part of the hive mind/Skynet BS, no thanks.
 
Joined
Sep 1, 2020
Messages
2,110 (1.49/day)
Location
Bulgaria
By the time "AI" becomes a thing for the local user, all new hardware on the market will already have an NPU, so whoever builds a system with new components will have no choice but to buy "AI" hardware. So, this poll wouldn't make sense in that future.
 