
Would you pay more for hardware with AI capabilities?

  • Yes

    Votes: 2,086 7.3%
  • No

    Votes: 24,247 84.3%
  • Don't know

    Votes: 2,413 8.4%

  • Total voters
    28,746
  • Poll closed.
Alternate poll idea: since your GPU will have AI capabilities, your CPU will have AI capabilities as well, which one do you plan on using? :D
The operating system will choose for you sooner or later, and will likely do it with 95%+ accuracy by 2026, just as task schedulers already assign jobs to performance or efficiency CPU cores.

My guess is that the OS will run AI functions on the subsystem that has more free RAM available to the AI model. It may also weight a certain type of silicon if its performance-per-watt is well known, and/or factor in memory bandwidth. Given AMD's relative newness to AI/ML silicon, I expect modern OSes to assign AI/ML tasks to Nvidia discrete GPUs by default; a lot of today's AI code is already optimized for CUDA.
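To make the idea concrete, here's a toy Python sketch of that kind of placement heuristic. To be clear, this is pure speculation on my part: the device names, numbers, and weights are made up, and no real OS scheduler API is being modeled.

```python
# Toy sketch of the kind of heuristic an OS *might* use to place an AI/ML job.
# Everything here (device names, numbers, weights) is made up for illustration.
from dataclasses import dataclass

@dataclass
class Accelerator:
    name: str
    free_mem_gb: float      # memory currently available to the AI model
    mem_bw_gbps: float      # peak memory bandwidth
    perf_per_watt: float    # known efficiency score (arbitrary units)
    has_cuda: bool = False  # much of today's AI code is CUDA-optimized

def pick_device(devices, model_size_gb):
    # 1) Hard requirement: the model must fit in free memory.
    candidates = [d for d in devices if d.free_mem_gb >= model_size_gb]
    if not candidates:
        return None
    # 2) Soft preferences: weight efficiency, bandwidth, and CUDA support.
    def score(d):
        return (d.perf_per_watt * 2.0
                + d.mem_bw_gbps / 100.0
                + (5.0 if d.has_cuda else 0.0))
    return max(candidates, key=score)

devices = [
    Accelerator("NPU (CPU SoC)", free_mem_gb=12, mem_bw_gbps=120, perf_per_watt=8.0),
    Accelerator("Discrete GPU",  free_mem_gb=10, mem_bw_gbps=700, perf_per_watt=5.0, has_cuda=True),
]
print(pick_device(devices, model_size_gb=7).name)
```

In this made-up example the discrete GPU wins on bandwidth and CUDA support even though the NPU is more efficient per watt.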

We already know how this is going to play out on smartphones: premium models (Samsung Galaxy, iPhones, etc.) will get SoCs with AI/ML cores first, followed by trickle-down deployment to mid-range and entry-level SoCs a few years later. One should expect the same from the tablet market.

For mobile PCs, I expect AI-enabled SoCs on entry-level "business" ultrabooks. The high-end gaming models with discrete GPUs may end up being the last to receive AI silicon, but that would be on AMD Radeon-equipped notebooks. There aren't many of those today; Nvidia has pretty much cornered the notebook discrete GPU market.

Apple has it easy: Intel-based Macs are excluded from Apple Intelligence, and all Apple Silicon systems feature UMA (Unified Memory Architecture). This includes iPhone, iPad, and any other Apple devices that gain Apple Intelligence functionality later on. That basically means a single architectural approach, which makes their engineers' jobs way easier: they don't have to worry about whether to assign AI/ML jobs to discrete GPUs (and discrete GPU memory) because there are none.

Not true. With enough public pushback, companies will not include it if people will not buy it. Some are unwilling to pay extra; others will not buy anything that includes it.
We know smartphone manufacturers won't give consumers a choice, at least in the premium models. They will all have AI/ML cores whether Ned Nerdwad wants them or not. Joe Consumer won't even know the difference unless there's some sort of AI-assisted feature that isn't available to them because their SoC is excluded from supported systems.

Remember that smartphones with AI/ML cores have been around for years.

The discussion will end up restricted to a small subset of AMD Radeon discrete PC graphics card users. And that discussion will end up lasting about a year or so before the smart ones realize that running an old non-AI GPU provides less performance-per-watt than a modern GPU (with AI silicon) that has AI features turned off.

It's not like Radeon GPUs are particularly energy efficient to begin with anyhow.

:p
 
We know smartphone manufacturers won't give consumers a choice, at least in the premium models. They will all have AI/ML cores whether Ned Nerdwad wants them or not. Joe Consumer won't even know the difference unless there's some sort of AI-assisted feature that isn't available to them because their SoC is excluded from supported systems.
And that's a fair point. As long as it can be disabled, it would be acceptable. I'm talking about PCs; I don't want that AI/NPU crap in my PC.
 
Not true. With enough public pushback, companies will not include it if people will not buy it. Some are unwilling to pay extra; others will not buy anything that includes it.
Well, the NVIDIA 7000 series will be my next upgrade, and if the RTX 7070 (Ti) has AI, I will buy it regardless, as I want a new GPU; going without one is not an option (AMD can fek off).
 
True. This reminds me of the last time consumers pushed back at CPU and GPU makers and won. It was... wait a minute... I know this... I swear it will come to me...

Well, there was the GTX 970 class-action suit, and although it ended in a settlement, the customers did get $25 each. Hardly a big win, but at least it was something.
 
Well, there was the GTX 970 class-action suit, and although it ended in a settlement, the customers did get $25 each. Hardly a big win, but at least it was something.
Perfect example. And Nvidia has not made that segmented-memory-bus crap again.

So if we all voice our objections, the companies will listen.
 
Well, there was the GTX 970 class-action suit, and although it ended in a settlement, the customers did get $25 each. Hardly a big win, but at least it was something.
That's why I don't like partly disabled GPUs. Who knows what else they secretly disabled besides a few shader cores? No, if the parts are physically there, they should be fully usable. I know companies have to sell partially defective chips, too; I just don't want to be their experimental guinea pig.
 
That's why I don't like partly disabled GPUs. Who knows what else they secretly disabled besides a few shader cores? No, if the parts are physically there, they should be fully usable. I know companies have to sell partially defective chips, too; I just don't want to be their experimental guinea pig.
That sounds a bit paranoid.
"Experimental"? Partially disabled silicon has been with us for two decades, there's nothing experimental about it anymore.
"Secretly disabled"? Each review will show you a block diagram of the CPU/GPU under review, highlighting all the blocks and what's inside them. And fusing off only works at a block level. There's no risk of buying a GPU only to discover DX12 was "secretly disabled" leaving you with DX11 only.
 
If this poll is repeated a year or so from now, the most common answer won't be No (because I don't want AI in my PCs). Instead, it will be No (because I can't buy a PC without AI anyway).
 
Well, I was having a brainstorm about this. If they design games around CPU AI (GPUs have been using AI for some time), then yes, it can work. Will it make sense? No, because then you are forcing people who do not have a lot of money to upgrade. Poorer communities will suffer.

Don't get me wrong, AI has advantages. I am a web developer. We have clients that send us one sentence and tell us, "You know what you are doing, build a website." As a web developer, it is not our responsibility to source the content; we do research on the industry to get a better idea of how it works. So if it comes down to it, we use ChatGPT to help build the website. For building a website, writing an ad for marketing, etc., it is pretty good. But for general use, it is not really required.

Schools here are getting wary because kids can use it to bypass their projects. There should be a safety measure so that kids cannot freely access it. If it is work-related, yes, it can be very helpful, but it also makes people lazy. I only use it for work, nothing else.
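If anyone is curious what "using ChatGPT for work" looks like on our side, here is a minimal Python sketch: turning a one-line client brief into draft website copy. It assumes the official openai package and an API key in your environment; the model name, prompt, and brief are just placeholders, not a recommendation.

```python
# Minimal sketch: one-line client brief in, draft website copy out.
# Assumes the official "openai" Python package (v1+) and OPENAI_API_KEY set
# in the environment. Model name, prompt, and brief are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

brief = "Family-run plumbing business, 20 years of experience, serving the local area."

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; use whatever model you actually have access to
    messages=[
        {"role": "system",
         "content": "You write concise, factual website copy. Do not invent claims."},
        {"role": "user",
         "content": f"Write a short homepage intro and a one-line ad for: {brief}"},
    ],
)

print(response.choices[0].message.content)  # draft copy; a human still edits it
```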
 
That sounds a bit paranoid.
"Experimental"? Partially disabled silicon has been with us for two decades, there's nothing experimental about it anymore.
"Secretly disabled"? Each review will show you a block diagram of the CPU/GPU under review, highlighting all the blocks and what's inside them. And fusing off only works at a block level. There's no risk of buying a GPU only to discover DX12 was "secretly disabled" leaving you with DX11 only.
I don't think any review mentioned that the 4 GB 970 is actually a 3.5 GB card. And even though you're right 99% of the time, I still like using all the parts I physically have. It's just a "me" thing. :ohwell:

Well, I was having a brainstorm about this. If they design games around CPU AI (GPUs have been using AI for some time), then yes, it can work. Will it make sense? No, because then you are forcing people who do not have a lot of money to upgrade. Poorer communities will suffer.
Devs can't force people to upgrade. There has to be some kind of intermediary solution, like software rendering mode back when T&L was all the rage. No one will buy their games otherwise.
 
I don't think any review mentioned that the 4 GB 970 is actually a 3.5 GB card. And even though you're right 99% of the time, I still like using all the parts I physically have. It's just a "me" thing. :ohwell:
Fair enough. Reviews couldn't have mentioned that because, at the time, the specs Nvidia released to them were actually wrong (they did not include any info about that peculiar configuration). Hopefully the lawsuit that followed will ensure that was a one-off, never to be seen again.
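For anyone who never dug into the numbers behind the 970's layout, the gist can be worked out from the widely reported figures (256-bit bus, 7 Gbps GDDR5, with the last 0.5 GB hanging off a single 32-bit path). Treat this as back-of-the-envelope arithmetic, not a spec sheet:

```python
# Rough arithmetic behind the GTX 970's "3.5 GB + 0.5 GB" memory layout.
# Figures are the widely reported ones; treat them as approximate.
GDDR5_GBPS_PER_PIN = 7    # effective data rate per pin
FULL_BUS_BITS = 256       # total bus width
SLOW_SEGMENT_BITS = 32    # the last 0.5 GB sits behind a single 32-bit path

full_bw = FULL_BUS_BITS * GDDR5_GBPS_PER_PIN / 8       # GB/s if all 4 GB were equal
slow_bw = SLOW_SEGMENT_BITS * GDDR5_GBPS_PER_PIN / 8   # GB/s for the 0.5 GB segment
fast_bw = full_bw - slow_bw                            # GB/s for the 3.5 GB segment

print(f"Whole card (theoretical): {full_bw:.0f} GB/s")  # ~224 GB/s
print(f"Fast 3.5 GB segment:      {fast_bw:.0f} GB/s")  # ~196 GB/s
print(f"Slow 0.5 GB segment:      {slow_bw:.0f} GB/s")  # ~28 GB/s
# The two segments reportedly also can't be accessed at the same time,
# which is why games effectively treated it as a 3.5 GB card.
```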
 