I think everyone else here has missed the real question. I believe Wizard is asking if we'd pay more for a device with an NPU.
I think the word AI has made everyone think of LLMs and image generation, whereas NPUs like NVIDIA's Tensor Cores or Apple's Neural Engine enable a lot of great features I use daily.
Or potentially not so great features like Windows Recall that was just announced...
Have I got this right, @W1zzard?
The only things those cores are really used for are assisting with offline voice recognition and speeding up face unlock. I might be ignorant of other uses, but those are the only headline features that get any coverage/attention.
Apple's Neural Engine NPU is responsible for a whole loooooad of stuff. The things you mentioned, as well as:
Computer vision for face, object, landmark and nature recognition in the Photos app, so it can categorize faces, let you search by keyword, or tell you more about a plant or animal you took a photo of. Object detection in Photos lets you pluck the subject out of an image. And you can search for all of this from Spotlight.
Live Text, which provides OCR in Camera and Photos, letting you copy handwriting or printed text, or follow links, phone numbers and addresses. I often use it where text isn't selectable in an app or on a website: if I take a screenshot, I can select it there.
Video effects like background blurring, Center Stage and noise reduction.
People occlusion and motion tracking in augmented reality, hugely important on something like the Vision Pro.
Photographic Styles in the Camera app, Memories curation and stylistic effects in Photos, personalized recommendations like wallpaper suggestions, VoiceOver image captioning, finding duplicate images in Photos, and so on.
This might have been an interesting question five years ago.
Today it's a joke. Apple put ML cores in their phones with the iPhone X/iPhone 8 series, back in 2017. So no, I didn't bother submitting a vote in this poll. But it might have been fun before the pandemic.
Many NPU discussions are several years behind the times, and a lot of AI functions are already here without any fanfare. Ever gotten a fraud alert from your credit card issuer or bank recently? That's AI in action, in real life.
You use an M2 Pro Mac, so you're using Apple's "hardware with AI capabilities" every day and may not even know it. And that's not a bad thing; it means it's working efficiently.
This is all stuff that could be brute-forced on a CPU, but it's slow and inefficient. Just like trying to render an image without GPU acceleration, or enabling ray tracing without RT cores.
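To make the "brute force" point concrete, here's a minimal sketch (in plain Python, purely illustrative, not any vendor's actual code) of the core math behind most of these features: a single dense neural-network layer. On a CPU, every multiply-accumulate below is a separate scalar step; an NPU executes huge batches of them per cycle in dedicated silicon, which is where the speed and efficiency gap comes from.

```python
def dense_layer(weights, inputs, biases):
    """Brute-force matrix-vector multiply plus ReLU activation.

    One output neuron per weight row; each inner-loop step is a
    single multiply-accumulate (MAC) -- the operation NPUs
    parallelize massively in hardware.
    """
    outputs = []
    for row, bias in zip(weights, biases):
        acc = bias
        for w, x in zip(row, inputs):
            acc += w * x              # one MAC, one scalar CPU step
        outputs.append(max(0.0, acc))  # ReLU activation
    return outputs

# Toy 2-neuron layer over a 3-element input (made-up numbers).
w = [[0.5, -1.0, 2.0],
     [1.0,  0.0, -0.5]]
b = [0.25, -0.25]
x = [1.0, 2.0, 3.0]
print(dense_layer(w, x, b))  # -> [4.75, 0.0]
```

A real model stacks thousands of layers like this over millions of weights, so the per-MAC overhead of doing it on general-purpose cores adds up fast.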