Wednesday, February 28th 2024
Qualcomm AI Hub Introduced at MWC 2024
Qualcomm Technologies, Inc. unveiled its latest advancements in artificial intelligence (AI) at Mobile World Congress (MWC) Barcelona. From the new Qualcomm AI Hub, to cutting-edge research breakthroughs and a display of commercial AI-enabled devices, Qualcomm Technologies is empowering developers and revolutionizing user experiences across a wide range of devices powered by Snapdragon and Qualcomm platforms.
"With Snapdragon 8 Gen 3 for smartphones and Snapdragon X Elite for PCs, we sparked commercialization of on-device AI at scale. Now with the Qualcomm AI Hub, we will empower developers to fully harness the potential of these cutting-edge technologies and create captivating AI-enabled apps," said Durga Malladi, senior vice president and general manager, technology planning and edge solutions, Qualcomm Technologies, Inc. "The Qualcomm AI Hub provides developers with a comprehensive AI model library to quickly and easily integrate pre-optimized AI models into their applications, leading to faster, more reliable and private user experiences."Qualcomm AI Hub - Developer's Gateway to Superior On-device AI Performance
The new Qualcomm AI Hub contains a library of pre-optimized AI models for seamless deployment on devices powered by Snapdragon and Qualcomm platforms. The library provides developers with more than 75 popular AI and generative AI models, such as Whisper, ControlNet, Stable Diffusion, and Baichuan 7B, optimized for superior on-device AI performance, lower memory utilization, and better power efficiency across different form factors, and packaged in various runtimes. Each model is optimized to take advantage of hardware acceleration across all cores of the Qualcomm AI Engine (NPU, CPU, and GPU), resulting in 4x faster inference times. The AI model library automatically handles model translation from the source framework to popular runtimes, works directly with the Qualcomm AI Engine Direct SDK, and applies hardware-aware optimizations. Developers can seamlessly integrate these models into their applications, reducing time-to-market and unlocking the benefits of on-device AI such as immediacy, reliability, privacy, personalization, and cost savings.
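As a rough illustration of what integrating one of these pre-optimized models might look like, here is a minimal Python sketch. The announcement only says the models are packaged in various runtimes; TensorFlow Lite is assumed here for illustration, and the file name, input handling, and hardware delegate are placeholders rather than confirmed details:

    # Minimal sketch: run a pre-optimized model artifact, assuming it was
    # exported as a TensorFlow Lite (.tflite) package. "model.tflite" is a
    # placeholder path; on a Snapdragon device a vendor delegate would
    # typically be attached for NPU acceleration (omitted for brevity).
    import numpy as np
    import tensorflow as tf

    interpreter = tf.lite.Interpreter(model_path="model.tflite")
    interpreter.allocate_tensors()

    input_details = interpreter.get_input_details()
    output_details = interpreter.get_output_details()

    # Feed a dummy input that matches the model's expected shape and dtype.
    shape = tuple(input_details[0]["shape"])
    dummy = np.random.random_sample(shape).astype(input_details[0]["dtype"])
    interpreter.set_tensor(input_details[0]["index"], dummy)
    interpreter.invoke()

    output = interpreter.get_tensor(output_details[0]["index"])
    print("Output shape:", output.shape)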
The optimized models are available today on the Qualcomm AI Hub, GitHub, and Hugging Face. New models will be added to the Qualcomm AI Hub on an ongoing basis, along with upcoming support for additional platforms and operating systems. Developers can sign up today to run the models themselves on cloud-hosted devices based on Qualcomm Technologies' platforms and gain early access to new features and AI models coming to the Qualcomm AI Hub.
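For developers starting from the Hugging Face distribution mentioned above, a short sketch of pulling a model package with the huggingface_hub client follows. The repository id is illustrative only; exact model names should be taken from Qualcomm's organization page on Hugging Face:

    # Minimal sketch: download a pre-optimized model package from Hugging Face.
    # The repo id below is an illustrative placeholder, not a confirmed name.
    from huggingface_hub import snapshot_download

    local_dir = snapshot_download(repo_id="qualcomm/Whisper-Tiny-En")
    print("Model files downloaded to:", local_dir)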
"We are thrilled to host Qualcomm Technologies' AI models on Hugging Face," said Clement Delangue, cofounder and CEO, Hugging Face. "These popular AI models, optimized for on-device machine learning and ready to use on Snapdragon and Qualcomm platforms, will enable the next generation of mobile developers and edge AI applications, making AI more accessible and affordable for everyone."Cutting Edge AI Research Advancements
Source: Qualcomm News
"With Snapdragon 8 Gen 3 for smartphones and Snapdragon X Elite for PCs, we sparked commercialization of on-device AI at scale. Now with the Qualcomm AI Hub, we will empower developers to fully harness the potential of these cutting-edge technologies and create captivating AI-enabled apps," said Durga Malladi, senior vice president and general manager, technology planning and edge solutions, Qualcomm Technologies, Inc. "The Qualcomm AI Hub provides developers with a comprehensive AI model library to quickly and easily integrate pre-optimized AI models into their applications, leading to faster, more reliable and private user experiences."Qualcomm AI Hub - Developer's Gateway to Superior On-device AI Performance
The new Qualcomm AI Hub contains a library of pre-optimized AI models for seamless deployment on devices powered by Snapdragon and Qualcomm platforms. This library provides developers with more than 75 popular AI and generative AI models, such as Whisper, ControlNet, Stable Diffusion, and Baichuan 7B, which are optimized for superior on-device AI performance, lower memory utilization, and better power efficiency, across different form factors and packaged in various runtimes. Each model is optimized to take advantage of hardware acceleration across all cores within the Qualcomm AI Engine (NPU, CPU, and GPU) resulting in 4X faster inferencing times. The AI model library automatically handles model translation from source framework to popular runtimes and works directly with the Qualcomm AI Engine direct SDK, then applies hardware-aware optimizations. Developers can seamlessly integrate these models into their applications, reducing time-to-market, and unlocking the benefits of on-device AI implementations such as immediacy, reliability, privacy, personalization, and cost savings.
The optimized models are available today on the Qualcomm AI Hub, GitHub, and Hugging Face. New models will continually be added to the Qualcomm AI Hub along with upcoming support for additional platforms and operating systems. Developers can sign up today to run the models themselves on cloud-hosted devices based on Qualcomm Technologies' platforms and get earlier access to new features and AI models coming through Qualcomm AI Hub.
"We are thrilled to host Qualcomm Technologies' AI models on Hugging Face," said Clement Delangue, cofounder and CEO, Hugging Face. "These popular AI models, optimized for on-device machine learning and ready to use on Snapdragon and Qualcomm platforms, will enable the next generation of mobile developers and edge AI applications, making AI more accessible and affordable for everyone."Cutting Edge AI Research Advancements
- For the first time running on an Android smartphone, Qualcomm AI Research is demonstrating Large Language and Vision Assistant (LLaVA), a 7+ billion parameter large multimodal model (LMM) that can accept multiple types of data inputs, including text and images, and generate multi-turn conversations with an AI assistant about an image. The LMM runs at a responsive token rate on device, resulting in enhanced privacy and reliability, greater personalization, and lower cost. LMMs with language understanding and visual comprehension enable many use cases, such as identifying and discussing complex visual patterns, objects, and scenes.
- Qualcomm AI Research is also showcasing its first demonstration of Low Rank Adaptation (LoRA) on an Android smartphone. Running Stable Diffusion with LoRA, users can create high-quality custom images based on personal or artistic preferences. LoRA reduces the number of trainable parameters of AI models, enabling greater efficiency, scalability, and customization of on-device generative AI use cases; a brief sketch of the idea appears after these research highlights. Beyond enabling fine-tuned language vision models (LVMs) for different artistic styles, LoRA is broadly applicable to customized AI models, such as large language models, for tailored personal assistants, improved language translation, and more.
- On a Windows PC, Qualcomm AI Research is showcasing a world-first on-device demonstration of a 7+ billion parameter LMM that can accept text and audio inputs (e.g., music or the sound of traffic) and then generate multi-turn conversations about the audio.
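As referenced in the LoRA item above, here is a brief, self-contained Python sketch of the core idea: freeze a pretrained weight matrix and train only a small low-rank update, so far fewer parameters need to be fine-tuned. The layer sizes and rank below are illustrative, not details from Qualcomm's demonstration:

    # Minimal sketch of Low Rank Adaptation (LoRA): instead of fine-tuning a
    # full weight matrix W (d_out x d_in), train two small matrices
    # A (r x d_in) and B (d_out x r) with rank r << d_in, d_out, so the
    # adapted layer computes W x + scale * B A x.
    import torch
    import torch.nn as nn

    class LoRALinear(nn.Module):
        def __init__(self, d_in, d_out, rank=8, alpha=16.0):
            super().__init__()
            self.base = nn.Linear(d_in, d_out, bias=False)
            self.base.weight.requires_grad_(False)  # frozen pretrained weight
            self.lora_a = nn.Parameter(torch.randn(rank, d_in) * 0.01)
            self.lora_b = nn.Parameter(torch.zeros(d_out, rank))
            self.scale = alpha / rank

        def forward(self, x):
            return self.base(x) + self.scale * (x @ self.lora_a.T @ self.lora_b.T)

    layer = LoRALinear(4096, 4096, rank=8)
    trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
    total = sum(p.numel() for p in layer.parameters())
    print(f"trainable {trainable:,} of {total:,} parameters")  # 65,536 of 16,842,752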
Commercial AI-enabled Devices on Display
- Smartphones: We are displaying a range of flagship commercial AI smartphones powered by the Snapdragon 8 Gen 3 Mobile Platform, including the HONOR Magic6 Pro, OPPO Find X7 Ultra, and Xiaomi 14 Pro. Each of these devices includes exciting new generative AI features, such as AI-generated image expansion (Xiaomi), AI-powered video creation and AI-powered calendar creation (HONOR), and image object eraser (OPPO).
- PCs: The new Snapdragon X Elite, with its 45 TOPS NPU, is built for on-device AI that will transform how we interact with our PCs. Using the popular and free image editor GIMP with the Stable Diffusion plug-in, Qualcomm Technologies is demonstrating that you can type a description of the image you want and generative AI will create it in 7 seconds, which is 3x faster than x86 competitors.
- Automotive: Leveraging its industry-leading AI hardware and software solutions, the Company is also demonstrating traditional and generative AI capabilities with the Snapdragon Digital Chassis Platform, aiming to deliver more powerful, efficient, private, safer, and personalized experiences for drivers and passengers.
- Consumer IoT: We're showcasing Humane's AI Pin that runs on a Snapdragon platform and offers users the ability to take AI with them everywhere in an entirely new, conversational, and screenless form factor.
- Connectivity: The new Snapdragon X80 Modem-RF System integrates a second-generation 5G AI processor to enhance cellular performance, coverage, latency, and power efficiency. We also introduced the Qualcomm FastConnect 7900 Mobile Connectivity System, the first AI-optimized Wi-Fi 7 system, which leverages AI to set a new bar for adaptable, high-performance, low-latency, and low-power local wireless connectivity.
- 5G Infrastructure: Qualcomm Technologies will showcase three groundbreaking AI-based enhancements for network management, including a generative AI assistant for radio access network (RAN) engineers to simplify network and slice management tasks, an AI-based open RAN application (rApp) that reduces network energy consumption, and an AI-based 5G network slice lifecycle management suite.