
AMD Reiterates Belief that 2025 is the Year of the AI PC

Makes sense your dad has been very busy with me....
That's a tad concerning, given he's been dead about half a decade. Don't worry, I'm not insulted, because he is not particularly missed.
 
That's a tad concerning, given he's been dead about half a decade. Don't worry, I'm not insulted, because he is not particularly missed.
lol this shit is cracking me up. Going from tech to locker-room talk
 
Well yeah, ofc AMD said this.

Their entire naming scheme for everything this year involves cramming assloads of AI into every nook and cranny where it will fit. They bought a whole ass company for their "XDNA" fixed function hardware to drive AI.

They've bet maybe not the whole farm, but everything but the Donkey on AI.

They're not going to turn around and say "Lol, whups! The return on investment just doesn't look like it's there..."
 
That's a tad concerning, given he's been dead about half a decade. Don't worry, I'm not insulted, because he is not particularly missed.

Damn, that's always been my luck..... my bad, I was just running with your joke, which I took zero offense to as well.
lol this shit is cracking me up. Going from tech to locker-room talk

That's how crap AI is: instead of talking about how it can help us, we're cracking jokes about our computers wanting us to infect them with a virus....
 
That's how crap AI is: instead of talking about how it can help us, we're cracking jokes about our computers wanting us to infect them with a virus....
The s**tposting in this thread is worth more than today's "AI" ever will be.
 
The s**tposting in this thread is worth more than today's "AI" ever will be.
AI couldn't write the shitposts we are writing, because the models are tuned for extreme political correctness. Seriously, you don't need to be a Trumpist to know that that level of censorship is stupid.
 
Sure, buddy. 10 bucks says that in a blind test at a framerate that's okay to use framegen with, aka about 100 fps, you will not see any difference.

I can see a difference from 60 to 90, again in the 120-165 range, and again at 200+. It gets smoother and smoother the higher you go, and it's nice on the eyes and for immersion. That simple.
 
2025.............. year of the something, that's for sure.......................
 
don't come to me with that AI crap and backdoors for the government. F
 
2025, putting the AI in pAIn.
 
AMD is making me want to sell my 7900 XT and get an RTX 5080. Might just say fk it and do it, I dunno. All companies are the same, but big papa wants some multi-frame-gen goodness.

Oh, has it been a week already? You change hardware like other people change socks.
 
AI AI AI AI AI AI
Puerto Rico
AI AI AI AI AI AI
Puerto Rico

(Vaya Con Dios - Puerto Rico)
 
AI couldn't write the shitposts we are writing, because the models are tuned for extreme political correctness. Seriously, you don't need to be a Trumpist to know that that level of censorship is stupid.
Some people need a few years to come to terms with reality
Some people never do
Funny thing is, no matter how hard people want it, their utopia never happens, and that goes for all sides of every discussion.

That goes for AI... it goes for free speech. It's always going to be imperfect, FUBAR, and yet we'll keep trying to use it.

I guess we just have to follow AMD and 'keep believing' it can be better :)
 
View attachment 384372
What is this utter BS :slap: :laugh:
none of these really need AI to be possible...
Not required, but they perform better with AI/ML.
Offline support and lower latency are more about on-device AI processing rather than cloud processing.

Some of the stuff listed here can be done more efficiently with AI acceleration, especially on a laptop. You really don't want those things to be pure GPU/CPU tasks; that's going to drain your battery. Phones have been using AI acceleration for those things long before AI became the buzzword that it is now. The NPU has always been about efficiency rather than sheer speed. It also lets you keep your GPU/CPU compute resources for other stuff.

@csendesmark You can laugh if you want, but that's absolutely the rationale behind the NPU. Things like live translation, digital animated avatars, eye contact correction or auto framing are not computed just once; they are sustained workloads, which is why phones and laptops are the priority, while the desktop doesn't really need an NPU. Machine learning is also better at handling dynamic input data compared to a regular algorithm. Even cameras like the EOS 1DX MK III or the Sony A1 II make use of ML for enhanced subject recognition and autofocus, and have on-board ML acceleration.
View attachment 384486
View attachment 384487
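To make the "keep your GPU/CPU compute resources for other stuff" point concrete, here's a rough sketch of how an app would prefer the NPU and fall back to the CPU, assuming ONNX Runtime; the NPU provider name varies by vendor (VitisAIExecutionProvider is the one exposed by AMD's Ryzen AI stack, and "model.onnx" is a placeholder):

```python
# Rough sketch: run an ONNX model on the NPU if its execution provider is
# installed, otherwise fall back to the CPU.
import numpy as np
import onnxruntime as ort

available = ort.get_available_providers()
# VitisAIExecutionProvider only shows up when AMD's Ryzen AI software is installed.
preferred = ["VitisAIExecutionProvider", "CPUExecutionProvider"]
providers = [p for p in preferred if p in available] or ["CPUExecutionProvider"]

session = ort.InferenceSession("model.onnx", providers=providers)  # placeholder model
print("Running on:", session.get_providers()[0])

# A "sustained workload" (eye contact correction, auto framing, etc.) is just
# this inference repeated once per camera frame, roughly 30 times a second.
input_name = session.get_inputs()[0].name
frame = np.random.rand(1, 3, 224, 224).astype(np.float32)  # stand-in for a camera frame
output = session.run(None, {input_name: frame})
```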
 
Not required, but they perform better with AI/ML.
Offline support and lower latency are more about on-device AI processing rather than cloud processing.

Some of the stuff listed here can be done more efficiently with AI acceleration, especially on a laptop. You really don't want those things to be pure GPU/CPU tasks; that's going to drain your battery. Phones have been using AI acceleration for those things long before AI became the buzzword that it is now. The NPU has always been about efficiency rather than sheer speed. It also lets you keep your GPU/CPU compute resources for other stuff.

@csendesmark You can laugh if you want, but that's absolutely the rationale behind the NPU. Things like live translation, digital animated avatars, eye contact correction or auto framing are not computed just once; they are sustained workloads, which is why phones and laptops are the priority, while the desktop doesn't really need an NPU. Machine learning is also better at handling dynamic input data compared to a regular algorithm. Even cameras like the EOS 1DX MK III or the Sony A1 II make use of ML for enhanced subject recognition and autofocus, and have on-board ML acceleration.
View attachment 384486 View attachment 384487

"Some of the stuff listed here can be done more efficiently with AI acceleration"
Which ones tho?
View attachment 384494

On this:
View attachment 384495

Image generation is a totally different thing you brought up; for that an NPU is quite useful, never questioned that!
But this bullet-point list is pure marketing BS... none of it makes sense
 
Oh, has it been a week already? You change hardware like other people change socks.

I've had my 7900 XT since July 2023, but aight if you say so. I did get a 1070 briefly because I thought it would be fun to refurbish it with MX-6, etc., but that was just a fun side project. And yeah, of course I upgraded to a 7800X3D when my local Micro Center had them for $196; I'd have been a fucking idiot not to lol
 
"Some of the stuff listed here can be done more efficiently with AI acceleration"
Which ones tho?
View attachment 384494
On this:
View attachment 384495
Image generation is a totally different thing you brought up; for that an NPU is quite useful, never questioned that!
But this bullet-point list is pure marketing BS... none of it makes sense
- Offline support and lower latency are in the context of local AI vs. cloud AI. Cloud-based AI is almost never used for real-time applications because there's too much latency between the input and the output result; they're not talking about system latency.

- Digital avatars, auto subject framing and eye correction require subject detection, and a finer detection of the movement of said subject (since digital avatars are not static, but animated according to the facial movements of a person). I've read a lot about that subject: computers are really bad at perceiving the world like we do, deep learning improved that a lot, and the computer needs to apply that ML model on the fly, frame after frame (rough code sketch at the end of this post).
The Limits of Computer Vision, and of Our Own | Harvard Medicine Magazine

- In the same spirit, posture correction also requires the computer to recognize what's a bad posture and a good posture in the first place, which makes use of computer vision/subject recognition.

- Lighting enhancement is for video calls: it's there to make up for low light. Some implementations specifically brighten the speaker, while also trying to make up for the noise present in low-light situations. So subject detection and image reconstruction are involved.

- Auto translation can include real-time audio translation, which needs real-time speech/language recognition; I don't think they only mean text translation.

- Accessibility features are mostly for people who have issues with their eyesight, so you use subject recognition to describe what's happening on the screen, even when people don't make content with the description tag. Also improved speech-based input, because again, computers are by default bad at recognizing sounds and pictures like we do. To this day I'm still seeing examples of a computer making mistakes in audio transcription.

- Blue light dimming is just a behavior-based automated warm mode to reduce eye strain. Instead of enabling it at a set time, it's going to do it several times a day based on your behavior. That's the theory; I haven't seen that one implemented yet, and imo it sounds annoying.

- Longer battery life because there are more and more applications making use of on-device ML algorithms, so the NPU can increase battery life. Improved performance also because the NPU can leave the CPU/GPU available for other stuff.

I feel there's a misconception that AI is mostly about generating stuff, when it's also extensively used to make computers "see", "hear" and "understand" things that would be a pain to do with classic programming.
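For anyone curious what "applying that ML model frame after frame" looks like, here's a rough sketch of a per-frame subject-detection loop using OpenCV and MediaPipe's face detector (the webcam index 0 and the 0.5 confidence threshold are just placeholder choices, and this runs on the CPU; an NPU would execute the same kind of model more efficiently):

```python
# Rough sketch: sustained per-frame face/subject detection, the building block
# behind auto framing, eye contact correction, digital avatars, posture checks.
import cv2
import mediapipe as mp

detector = mp.solutions.face_detection.FaceDetection(min_detection_confidence=0.5)
cap = cv2.VideoCapture(0)  # default webcam; index is a placeholder

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB, OpenCV delivers BGR.
    results = detector.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.detections:
        h, w = frame.shape[:2]
        for det in results.detections:
            box = det.location_data.relative_bounding_box
            x, y = int(box.xmin * w), int(box.ymin * h)
            cv2.rectangle(frame, (x, y),
                          (x + int(box.width * w), y + int(box.height * h)),
                          (0, 255, 0), 2)
    cv2.imshow("subject detection", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break

cap.release()
cv2.destroyAllWindows()
```

The loop never stops while a call or recording is running, which is exactly why efficiency matters more than raw speed for this kind of workload.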
 