News Posts matching #AI


Advantech Unveils New Lineup Powered by AMD Ryzen Embedded 8000 Series Processors

Advantech, a global leader in embedded IoT computing, is excited to launch its latest Edge AI solutions featuring the advanced AMD Ryzen Embedded 8000 Series processors. The SOM-6873 (COMe Compact), AIMB-2210 (Mini-ITX motherboards), and AIR-410 (AI Inference System) deliver exceptional AI performance leveraging the first AMD embedded devices with integrated Neural Processing Units. These integrated NPUs are optimized to enhance AI inference efficiency and precision. Together with traditional CPU and GPU elements, the architecture delivers performance up to 39 TOPS. They also support dual-channel DDR5 memory and PCIe Gen 4, providing ample computing power with flexible TDP options and thermal solutions.
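The 39 TOPS figure is an aggregate across the NPU, CPU, and integrated GPU. A minimal sketch of that arithmetic follows; the per-engine split is an assumption based on AMD's published Ryzen AI figures (a 16 TOPS XDNA NPU), not on Advantech's announcement.

```python
# Assumed split (hypothetical, based on AMD Ryzen AI marketing figures,
# not stated in Advantech's announcement):
npu_tops = 16          # integrated XDNA NPU
cpu_and_igpu_tops = 23 # remainder from CPU and integrated GPU

total_tops = npu_tops + cpu_and_igpu_tops
print(total_tops)  # 39
```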

Value-added software services, such as the Edge AI SDK that integrates AMD Ryzen AI software, accelerate the model porting process. These features make the solutions ideal for edge applications such as HMI, machine vision in industrial automation, smart management and interactive services in urban entertainment systems, and healthcare devices such as ultrasound machines.

NVIDIA RTX "Blackwell" GPU with 96 GB GDDR7 Memory on 512-Bit Bus Appears

Recent shipping manifests suggest that NVIDIA works on a graphics card with 96 GB of GDDR7 memory. Documents reveal a product utilizing a 512-bit memory bus and a clamshell (memory on both PCB sides) design that combines two 3 GB modules per memory controller. This setup effectively doubles the memory capacity of existing workstation-oriented cards. The product is believed to use the GB202 chip, the only Blackwell desktop GPU with a 512-bit interface. The documents refer to a board labeled PG153, a designation not seen in any of NVIDIA's existing consumer GPUs. This finding points toward a professional or workstation model rather than a gaming product. There is a possibility that it could be part of the RTX 6000 Blackwell or RTX 8000 Blackwell series.
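The 96 GB figure follows directly from the bus width, module density, and clamshell layout. A quick sketch of the arithmetic, assuming standard 32-bit-wide GDDR7 devices:

```python
# Capacity arithmetic implied by the shipping manifests:
BUS_WIDTH_BITS = 512
BITS_PER_DEVICE = 32       # standard GDDR7 device interface width
MODULE_CAPACITY_GB = 3     # 24 Gbit GDDR7 modules
SIDES = 2                  # clamshell: memory on both PCB sides

devices_per_side = BUS_WIDTH_BITS // BITS_PER_DEVICE  # 16 devices
total_gb = devices_per_side * SIDES * MODULE_CAPACITY_GB
print(total_gb)  # 96
```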

NVIDIA's current top workstation card, the RTX 6000 "Ada," features 48 GB of memory. A move to 96 GB would be a substantial jump, enabling more complex workloads for content creation, data analysis, and AI. This GPU could carry a significantly higher power target than current workstation models. However, professional GPUs often maintain lower clock speeds to keep power consumption within limits that accommodate more stable operation in professional environments. There is no confirmed information regarding the card's official name or final specifications, such as core count or actual clock frequencies. NVIDIA's workstation GPUs have historically provided a higher core count than their gaming counterparts. If the rumored 96 GB GPU follows this pattern, it may surpass even the potential GeForce RTX 5090, which comes with 32 GB of GDDR7. NVIDIA is expected to hold its annual GPU Technology Conference in March. This event is viewed as a likely venue for official announcements. Until then, these details remain unverified.

Samsung Galaxy S25 Series Sets the Standard for AI Phones as a True AI Companion

Samsung Electronics Co., Ltd. today announced the Galaxy S25 Ultra, Galaxy S25+, and Galaxy S25, setting a new standard for a true AI companion with the most natural and context-aware mobile experiences Samsung has ever created. Introducing multimodal AI agents, the Galaxy S25 series is the first step in Samsung's vision to change the way users interact with their phone—and with their world. A first-of-its-kind customized Snapdragon 8 Elite Mobile Platform for Galaxy chipset delivers greater on-device processing power for Galaxy AI, plus superior camera range and control with Galaxy's next-gen ProVisual Engine.

"The greatest innovations are a reflection of their users, which is why we evolved Galaxy AI to help everyone interact with their devices more naturally and effortlessly while trusting that their privacy is secured," said TM Roh, President and Head of Mobile eXperience Business at Samsung Electronics. "Galaxy S25 series opens the door to an AI-integrated OS that fundamentally shifts how we use technology and how we live our lives."

MediaTek Adopts AI-Driven Cadence Virtuoso Studio and Spectre Simulation on NVIDIA Accelerated Computing Platform for 2nm Designs

Cadence today announced that MediaTek has adopted the AI-driven Cadence Virtuoso Studio and Spectre X Simulator on the NVIDIA accelerated computing platform for its 2 nm development. As design size and complexity continue to escalate, advanced-node technology development has become increasingly challenging for SoC providers. To meet the aggressive performance and turnaround time (TAT) requirements for its 2 nm high-speed analog IP, MediaTek is leveraging Cadence's proven custom/analog design solutions, enhanced by AI, to achieve a 30% productivity gain.

"As MediaTek continues to push technology boundaries for 2 nm development, we need a trusted design solution with strong AI-powered tools to achieve our goals," said Ching San Wu, corporate vice president at MediaTek. "Closely collaborating with Cadence, we have adopted the Cadence Virtuoso Studio and Spectre X Simulator, which deliver the performance and accuracy necessary to achieve our tight design turnaround time requirements. Cadence's comprehensive automation features enhance our throughput and efficiency, enabling our designers to be 30% more productive."

US Prepares for Stargate Project: 500 Billion Dollars of AI Infrastructure Buildout

On Tuesday, the newly inaugurated United States president, Donald Trump, announced a massive AI infrastructure expansion in the US called the Stargate Project. Stargate pools private investment across the country, with up to 500 billion US dollars committed to the project over the next four years, making it one of the most significant infrastructure projects ever planned, and this time it is all about AI and data centers. The initial phase involves deploying 100 billion US dollars immediately, while the remaining 400 billion will be deployed in stages over the next four years. OpenAI and SoftBank are leading the project, with SoftBank CEO Masayoshi Son serving as the project's chairman. Major equity partners include SoftBank, OpenAI, Oracle, and MGX. Major technology partners, who will supply the know-how, planning, software, and hardware, are Arm, Microsoft, NVIDIA, Oracle, and OpenAI.

OpenAI will take the operational lead on the entire project, while SoftBank oversees financial planning. Interestingly, the buildout has already begun. OpenAI is currently exploring sites in Abilene, Texas, where the first campus includes ten 500,000 sq. ft. data centers, with 20 planned for the future. The infrastructure expansion will most likely reach every US state that can provide ample land and power capacity, and OpenAI is looking for partners to help with the massive data centers' power, land, and construction. The most significant impact of this project will be on the power grid, which will require additional buildout and the deployment of small nuclear reactors running locally nearby to satisfy the power draw of hundreds of thousands, and even millions, of GPUs. OpenAI praises NVIDIA for its almost decade-long partnership, meaning most GPUs will likely be NVIDIA-sourced.

Seagate Anticipates Cloud Storage Growth due to AI-Driven Data Creation

According to a recent global Recon Analytics survey commissioned by Seagate Technology, business leaders across 15 industry sectors and 10 countries expect that adoption of artificial intelligence (AI) applications will generate unprecedented volumes of data, driving a boom in demand for data storage, in particular cloud-based storage. With hard drives delivering scalability and strong terabyte-per-dollar cost efficiency, cloud service providers rely on them to store mass quantities of data.

Recently, analyst firm IDC estimated that 89% of data stored by leading cloud service providers is stored on hard drives. Now, according to this Recon Analytics study, nearly two-thirds of respondents (61%) from companies that use cloud as their leading storage medium expect their cloud-based storage to grow by more than 100% over the next 3 years. "The survey results generally point to a coming surge in demand for data storage, with hard drives emerging as the clear winner," remarked Roger Entner, founder and lead analyst of Recon Analytics. "When you consider that the business leaders we surveyed intend to store more and more of this AI-driven data in the cloud, it appears that cloud services are well-positioned to ride a second growth wave."
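Growth of "more than 100% over the next 3 years" implies a substantial compound annual rate. A quick sketch of that arithmetic, taking the 100% figure as the floor:

```python
# Compound annual growth rate (CAGR) implied by doubling in three years:
total_growth = 1.0  # +100% over the period, taken at the stated floor
years = 3

cagr = (1 + total_growth) ** (1 / years) - 1
print(f"{cagr:.1%}")  # roughly 26% per year
```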

NVIDIA's Frame Generation Technology Could Come to GeForce RTX 30 Series

NVIDIA's deep learning super sampling (DLSS) has gone through many iterations, arriving at the current version 4 with its transformer model and new technologies such as DLSS Multi Frame Generation, which generates several additional frames from each rendered frame to increase the frame output per second. However, not every NVIDIA GPU generation supports these newer DLSS technologies. In an interview with Digital Foundry, Bryan Catanzaro, VP of Applied Deep Learning Research at NVIDIA, commented on trickling down some DLSS technologies to older GPU generations. For example, DLSS Ray Reconstruction, Super Resolution, and Deep Learning Anti-Aliasing (DLAA) work on NVIDIA GeForce RTX 20/30/40/50 series GPUs. However, DLSS Frame Generation is exclusive to the RTX 40 series and newer, and DLSS Multi Frame Generation is exclusive to the latest RTX 50 series.

However, there is hope for older hardware. "I think this is primarily a question of optimization and also engineering and then the ultimate user experience. We're launching this Frame Generation, the best Multi Frame Generation technology, with the 50 Series, and we'll see what we're able to squeeze out of older hardware in the future." So, frame generation will most likely arrive on the older RTX 30 series, with even a slight possibility of the RTX 20 series getting DLSS Frame Generation. Due to compute budget constraints, Multi Frame Generation will most likely remain an RTX 50 series exclusive, as that generation has more raw computing power to handle the technology.
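The difference between the two features comes down to how many frames are generated per rendered frame. A toy model of the resulting frame rates, ignoring generation overhead (real-world gains are lower):

```python
# Idealized frame-rate multiplier; assumes generated frames are free,
# which they are not in practice.
def displayed_fps(rendered_fps, generated_per_rendered):
    return rendered_fps * (1 + generated_per_rendered)

print(displayed_fps(60, 1))  # Frame Generation (1 extra frame): 120
print(displayed_fps(60, 3))  # Multi Frame Generation (3 extra frames): 240
```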

LG Display Unveils 4th-Generation OLED Panel Optimized for AI Era

LG Display, the world's leading innovator of display technologies, continues to lead the way in large-sized OLED technology by unveiling its fourth-generation OLED TV panel. 33% brighter than the previous generation and optimized for the AI TV era, it is the industry's first-ever OLED display to achieve a maximum brightness as high as 4,000 nits (one nit is one candela per square meter, roughly the brightness of a single candle).
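The two quoted figures are mutually consistent. A back-of-the-envelope check, where the roughly 3,000-nit prior-generation peak is inferred from the stated 33% gain rather than stated by LG Display:

```python
# Inferred prior-generation peak brightness (an assumption):
previous_peak_nits = 3000
claimed_gain = 1.33  # "33% brighter"

new_peak = previous_peak_nits * claimed_gain
print(round(new_peak))  # about 4,000 nits
```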

Panels with both high brightness and energy efficiency are essential for AI TVs, as they use upscaling that analyzes content in real time to deliver ultra-high picture quality of up to 8K. The industry also considers higher brightness to be a key picture quality factor because it enables more vivid images that are akin to natural human vision.

NVIDIA NeMo AI Guardrails Upgraded with Latest NIM Microservices

AI agents are poised to transform productivity for the world's billion knowledge workers with "knowledge robots" that can accomplish a variety of tasks. To develop AI agents, enterprises need to address critical concerns like trust, safety, security and compliance. New NVIDIA NIM microservices for AI guardrails—part of the NVIDIA NeMo Guardrails collection of software tools—are portable, optimized inference microservices that help companies improve the safety, precision and scalability of their generative AI applications.

Central to the orchestration of the microservices is NeMo Guardrails, part of the NVIDIA NeMo platform for curating, customizing and guardrailing AI. NeMo Guardrails helps developers integrate and manage AI guardrails in large language model (LLM) applications. Industry leaders Amdocs, Cerence AI and Lowe's are among those using NeMo Guardrails to safeguard AI applications. Developers can use the NIM microservices to build more secure, trustworthy AI agents that provide safe, appropriate responses within context-specific guidelines and are bolstered against jailbreak attempts. Deployed in customer service across industries like automotive, finance, healthcare, manufacturing and retail, the agents can boost customer satisfaction and trust.

Supermicro Empowers AI-driven Capabilities for Enterprise, Retail, and Edge Server Solutions

Supermicro, Inc. (SMCI), a Total IT Solution Provider for AI/ML, HPC, Cloud, Storage, and 5G/Edge, is showcasing the latest solutions for the retail industry in collaboration with NVIDIA at the National Retail Federation (NRF) annual show. As generative AI (GenAI) grows in capability and becomes more easily accessible, retailers are leveraging NVIDIA NIM microservices, part of the NVIDIA AI Enterprise software platform, for a broad spectrum of applications.

"Supermicro's innovative server, storage, and edge computing solutions improve retail operations, store security, and operational efficiency," said Charles Liang, president and CEO of Supermicro. "At NRF, Supermicro is excited to introduce retailers to AI's transformative potential and to revolutionize the customer's experience. Our systems here will help resolve day-to-day concerns and elevate the overall buying experience."

ADLINK Launches the DLAP Supreme Series

ADLINK Technology Inc., a global leader in edge computing, unveiled its new "DLAP Supreme Series", an edge generative AI platform. By integrating Phison's innovative aiDAPTIV+ AI solution, this series overcomes memory limitations in edge generative AI applications, significantly enhancing AI computing capabilities on edge devices. Without incurring high hardware costs, the DLAP Supreme series achieves notable AI performance improvements, helping enterprises reduce the cost barriers of AI deployment and accelerating the adoption of generative AI across various industries, especially in edge computing.

Lower AI Computing Costs and Significantly Improved Performance
As generative AI continues to penetrate various industries, many edge devices encounter performance bottlenecks when executing large language models due to insufficient DRAM capacity, affecting model operation and even causing issues such as inadequate token length. The DLAP Supreme series, leveraging aiDAPTIV+ technology, effectively overcomes these limitations and significantly enhances computing performance. Additionally, it supports generative language model training on edge devices, equipping them with AI model training capabilities and improving their autonomous learning and adaptability.

NVIDIA & IQVIA Build Specialized Agentic AI for Life Science & Healthcare Applications

IQVIA, the world's leading provider of clinical research services, commercial insights and healthcare intelligence, is working with NVIDIA to build custom foundation models and agentic AI workflows that can accelerate research, clinical development and access to new treatments. AI applications trained on the organization's vast healthcare-specific information and guided by its deep domain expertise will help the industry boost the efficiency of clinical trials and optimize planning for the launch of therapies and medical devices—ultimately improving patient outcomes.

Operating in over 100 countries, IQVIA has built the largest global healthcare network and is uniquely connected to the ecosystem with the most comprehensive and granular set of information, analytics and technologies in the industry. Announced today at the J.P. Morgan Conference in San Francisco, IQVIA's collection of models, AI agents and reference workflows will be developed with the NVIDIA AI Foundry platform for building custom models, allowing IQVIA's thousands of pharmaceutical, biotech and medical device customers to benefit from NVIDIA's agentic AI capabilities and IQVIA's technologies, life sciences information and expertise.

NVIDIA Scolds Outgoing US Administration Over AI Diffusion Ruling

For decades, leadership in computing and software ecosystems has been a cornerstone of American strength and influence worldwide. The federal government has wisely refrained from dictating the design, marketing and sale of mainstream computers and software—key drivers of innovation and economic growth. The first Trump Administration laid the foundation for America's current strength and success in AI, fostering an environment where U.S. industry could compete and win on merit without compromising national security. As a result, mainstream AI has become an integral part of every new application, driving economic growth, promoting U.S. interests and ensuring American leadership in cutting-edge technology.

Today, companies, startups and universities around the world are tapping mainstream AI to advance healthcare, agriculture, manufacturing, education and countless other fields, driving economic growth and unlocking the potential of nations. Built on American technology, the adoption of AI around the world fuels growth and opportunity for industries at home and abroad. That global progress is now in jeopardy. The Biden Administration now seeks to restrict access to mainstream computing applications with its unprecedented and misguided "AI Diffusion" rule, which threatens to derail innovation and economic growth worldwide.

Samsung Shows Galaxy Book5 Pro and Galaxy Book5 360 at CES 2025

Samsung has revealed two new laptop models at CES 2025: the Galaxy Book5 Pro and Galaxy Book5 360, both featuring Intel's latest processors and artificial intelligence capabilities. The Galaxy Book5 Pro comes in two sizes. The 16-inch version weighs 1.56 kg and includes a 76.1 Wh battery, while the 14-inch model weighs 1.23 kg with a 63.1 Wh battery. Both laptops use AMOLED displays with 2880×1800 resolution and variable refresh rates from 48 to 120 Hz. Storage options reach 1 TB, with either 16 GB or 32 GB of memory available.

The Galaxy Book5 360 features a 15.6-inch touchscreen that can fold back 360 degrees, effectively turning the laptop into a tablet. It has a 1920×1080 resolution display and weighs 1.46 kg, with a 68.1 Wh battery. Both models use Intel's Core Ultra processors with built-in NPUs that Samsung uses for features like AI Select for searching and Photo Remaster for image enhancement. The laptops also integrate with other Samsung devices through features like Multi Control for unified device navigation and Quick Share for file transfers. The new laptops include standard connectivity options like Wi-Fi 7 and Bluetooth 5.4. Samsung claims the Pro models can achieve up to 25 hours of battery life, though real-world usage typically yields shorter durations.

Aetina & Qualcomm Collaborate on Flagship MegaEdge AIP-FR68 Edge AI Solution

Aetina, a leading provider of edge AI solutions and a subsidiary of Innodisk Group, today announced a collaboration with Qualcomm Technologies, Inc., which unveiled its new Qualcomm AI On-Prem Appliance Solution and Qualcomm AI Inference Suite for On-Prem. This collaboration combines Qualcomm Technologies' cutting-edge inference accelerators and advanced software with Aetina's edge computing hardware to deliver unprecedented computing power and ready-to-use AI applications for enterprises and industrial organizations.

The flagship offering, the Aetina MegaEdge AIP-FR68, sets a new industry benchmark by integrating Qualcomm's Cloud AI family of accelerator cards. Each Cloud AI 100 Ultra card delivers an impressive 870 TOPS of AI computing power at 8-bit integer (INT8) while maintaining remarkable energy efficiency at just 150 W power consumption. The system supports dual Cloud AI 100 Ultra cards in a single desktop workstation. This groundbreaking combination of power and efficiency in a compact form factor revolutionizes on-premises AI processing, making enterprise-grade computing more accessible than ever.
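The efficiency claim can be read straight off the quoted numbers. A quick sketch of the per-card figure:

```python
# Efficiency implied by the quoted figures for one Cloud AI 100 Ultra card:
tops_int8 = 870  # quoted INT8 throughput
power_w = 150    # quoted power consumption

efficiency = tops_int8 / power_w
print(efficiency)  # 5.8 TOPS per watt
```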

MSI Shows Off MEG Vision, Aegis & Codex AI Gaming PCs at CES 2025

MSI presented multiple pre-built AI gaming desktop PCs at CES—their showroom tables were laden with offerings from the MEG, Aegis, Vision and Codex families. A lot of high-end components were mentioned on neighboring spec sheet placards—MSI has picked up the latest processor technology from Intel and AMD. Most of the showcased systems are not cutting-edge enough to feature NVIDIA's RTX 50 "Blackwell" series. A big exception is their second generation MEG VISION X AI gaming desktop—a specific GPU model was not mentioned, but it would be safe to assume that a custom MSI GeForce RTX 5090 card would be worthy enough for such a fancy system. Certain press outlets have reported that MSI organized a preview event last October—a lucky few inspected the MEG VISION X AI 2nd during a factory tour.

Intel's Core Ultra 9 285K "Arrow Lake-S" processor is reportedly a constant across all MEG VISION X AI 2nd build configurations. MSI has not revealed pricing, but industry experts reckon that the most expensive variant could be configured with the market's top-tier GPU, memory and storage options. Customers are encouraged to update hardware in the future—a product placard states that internals are easily accessed and simple to upgrade. A TechPowerUp staffer felt the need to examine the MEG VISION X AI 2nd's "AI HMI" touch screen interface. This unique selling point—with AI functionality—absolutely dominates the case's front panel.

HP Shaping the Future of Play at CES 2025

Today's gamers are not just players; we are the innovators and leaders of tomorrow's workforce. As someone who recognizes the incredible potential within our gaming community, I'm committed to supporting your journey both in and out of the game. In our latest announcements focusing on the future of play and work, we've shared exciting updates across various fronts, including some major innovations in gaming. I'm thrilled to introduce even more gear designed to enhance your gaming experience. From the powerful OMEN 16 Laptop and the compact OMEN 16L Desktop to our stunning new OMEN displays and precise HyperX mice, I can't wait to dive into the details and show you how these products will elevate your play and keep you ahead of the curve. We're not just creating innovations; we're shaping the future of play.

Power and Performance for All: The OMEN 16 Laptop and OMEN 16L Desktop
At HP, your feedback is our guiding star. You told us you want smooth, immersive gameplay on machines that stay cool and quiet; we created the third-generation OMEN 16 to be our coolest and quietest version yet. The OMEN 16 elevates your gaming experience with cutting-edge technology and thoughtful design, keeping performance at its peak. Let's dive into the technical specifics.

Supermicro Begins Volume Shipments of Max-Performance Servers Optimized for AI, HPC, Virtualization, and Edge Workloads

Supermicro, Inc., a Total IT Solution Provider for AI/ML, HPC, Cloud, Storage, and 5G/Edge, is commencing shipments of max-performance servers featuring Intel Xeon 6900 series processors with P-cores. The new systems feature a range of new and upgraded technologies with new architectures optimized for the most demanding high-performance workloads, including large-scale AI, cluster-scale HPC, and environments where a maximum number of GPUs are needed, such as collaborative design and media distribution.

"The systems now shipping in volume promise to unlock new capabilities and levels of performance for our customers around the world, featuring low latency, maximum I/O expansion providing high throughput with 256 performance cores per system, 12 memory channels per CPU with MRDIMM support, and high performance EDSFF storage options," said Charles Liang, president and CEO of Supermicro. "We are able to ship our complete range of servers with these new application-optimized technologies thanks to our Server Building Block Solutions design methodology. With our global capacity to ship solutions at any scale, and in-house developed liquid cooling solutions providing unrivaled cooling efficiency, Supermicro is leading the industry into a new era of maximum performance computing."

TOZO Shows Golden X2 Pro HiFi Wireless Earbuds at CES 2025

Audio manufacturer TOZO revealed its new Golden X2 Pro wireless earbuds at CES today, featuring dual digital-to-analog converters (DACs) and expanded smart features. These new earbuds are TOZO's push into the premium audio segment. The X2 Pro uses a hybrid driver system, pairing a 12 mm dynamic driver with a Knowles balanced armature. New earbuds support LDAC codec and Hi-Res Wireless Audio certification, with a frequency response ranging from 12 Hz to 44.1 kHz. The active noise cancellation system employs six microphones to reduce environmental noise by up to 50 decibels, while automatically adjusting to surrounding conditions. For spatial audio, the earbuds incorporate gyroscopic sensors to enable head tracking.
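Decibels are a logarithmic scale, so the quoted "up to 50 decibels" of noise reduction is a larger change than it may sound. A sketch of what it means in linear terms:

```python
# Converting the quoted ANC figure from decibels to linear ratios:
attenuation_db = 50

power_ratio = 10 ** (attenuation_db / 10)     # sound power reduced 100,000-fold
pressure_ratio = 10 ** (attenuation_db / 20)  # sound pressure reduced ~316-fold
print(round(power_ratio), round(pressure_ratio))
```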

A notable hardware addition is the 1.47-inch color display built into the charging case, which shows battery status and allows users to adjust settings without using their phone. The earbuds also include a feature called Vocal Vibration Detection that automatically adjusts audio when the user starts speaking. On the software side, TOZO has added several AI-powered features through its companion app, including language translation, voice transcription, and meeting note generation. The implementation of these features and their effectiveness in real-world use remains to be tested.

Palit Announces Pandora, a Compact AI Computer Powered by NVIDIA Jetson Orin NX Super

Palit Microsystems, a global leader in graphics technology, introduces Pandora, its first AI computer powered by the NVIDIA Jetson Orin NX Super. Available in 8 GB and 16 GB configurations, it delivers 117 and 157 AI TOPS respectively, helping address the demand for high-performance edge AI computing. Compact at 123x145x66 mm and weighing only 470 g, Pandora is designed for space-constrained environments where performance and efficiency are critical. Its active cooling system ensures optimal thermal performance, enabling reliable operation during demanding tasks. Built to handle complex edge AI workloads, Pandora excels in applications such as smart retail, education, robotics, and generative AI, making it a versatile solution for diverse industries.

Features of Pandora
Pandora offers comprehensive connectivity with a wide range of I/O options, including dual 1G Ethernet ports, four USB ports with OTG functionality, an HDMI 2.0 interface, and audio input/output capabilities, ensuring seamless integration with peripherals for diverse AI applications. Equipped with a pre-installed 128 GB SSD, its advanced expandability includes four M.2 slots for simultaneous support of SSDs, Wi-Fi, and 5G/LTE modules. Features like MIPI inputs, an additional 14-pin UART, and CAN bus interfaces enhance compatibility with robotics, autonomous vehicles, and industrial systems, providing exceptional flexibility.

Biostar and DEEPX Showcase Advanced Edge AI Solutions at CES 2025

BIOSTAR, a leading manufacturer of Edge AI embedded computers, IPC solutions, motherboards, graphics cards, and PC peripherals, is excited to announce its x86 Edge AI platform showcase at CES 2025. Scheduled for January 7 to 10 in Las Vegas, Nevada, USA, the showcase will see BIOSTAR, in partnership with DEEPX, an AI semiconductor company from South Korea, present cutting-edge x86 Edge AI solutions at DEEPX's booth (#9045) in the North Hall of the Las Vegas Convention Center.

DEEPX is a pioneering company in on-device AI. It develops advanced AI semiconductors that optimize performance, reduce power consumption, and enhance cost efficiency across various industries, including smart camera modules, smart mobility, smart factories, consumer electronics, smart cities, surveillance systems, and AI servers. Building upon their successful collaborations, BIOSTAR and DEEPX are set to unveil their latest joint innovations at CES 2025, further enhancing their prominence in the industrial computing ecosystem.

MAINGEAR Launches Desktops and Laptops with NVIDIA GeForce RTX 50 Series GPUs Based on Blackwell Architecture

MAINGEAR, the leader in premium-quality, high-performance gaming PCs, today announced its lineup of desktops and laptops equipped with NVIDIA GeForce RTX 50 Series GPUs. Powered by the NVIDIA Blackwell architecture, GeForce RTX 50 Series GPUs bring groundbreaking capabilities to gamers and creators. Equipped with a massive level of AI horsepower, the GeForce RTX 50 Series enables new experiences and next-level graphics fidelity. Users can multiply performance with NVIDIA DLSS 4, generate images at unprecedented speed, and unleash creativity with the NVIDIA Studio platform.

Plus, NVIDIA NIM microservices - state-of-the-art AI models that let enthusiasts and developers build AI assistants, agents, and workflows - are available with peak performance on NIM-ready systems.

NVIDIA Redefines Game AI With ACE Autonomous Game Characters

The term "AI" has been used in games for decades, traditionally describing non-playable characters (NPCs) that follow strict rules designed to mimic intelligence, adhere to a guided story, and provide scripted interaction with the player. However, with the rise of intelligent language models, game AI is primed for a truly intelligent overhaul. At CES 2025, NVIDIA is redefining game AI with the introduction of NVIDIA ACE autonomous game characters. First introduced in 2023, NVIDIA ACE is a suite of RTX-accelerated digital human technologies that bring game characters to life with generative AI. NVIDIA is now expanding ACE from conversational NPCs to autonomous game characters that use AI to perceive, plan, and act like human players. Powered by generative AI, ACE will enable living, dynamic game worlds with companions that comprehend and support player goals, and enemies that adapt dynamically to player tactics.

Enabling these autonomous characters are new ACE small language models (SLMs), capable of planning at human-like frequencies required for realistic decision making, and multi-modal SLMs for vision and audio that allow AI characters to hear audio cues and perceive their environment. NVIDIA is partnering with leading game developers to incorporate ACE autonomous game characters into their titles. Interact with human-like AI players and companions in PUBG: BATTLEGROUNDS, inZOI, and NARAKA: BLADEPOINT MOBILE PC VERSION. Fight against ever-learning AI-powered bosses that adapt to your playstyle in MIR5. And experience new gameplay mechanics made possible with AI-powered NPCs in AI People, Dead Meat, and ZooPunk.

NVIDIA Project G-Assist Comes To NVIDIA App In February, An AI Assistant For GeForce RTX AI PCs

Six months ago at Computex, NVIDIA showcased Project G-Assist - a tech demo that offered a glimpse of how AI assistants could elevate the PC experience for gamers, creators, and more. Today, we're excited to announce that the initial release of the G-Assist System Assistant feature is coming to GeForce RTX users via the NVIDIA app in February. As modern PCs become more powerful, they also grow more complex to operate. Users today face over a trillion possible combinations of hardware and software settings when configuring a PC for peak performance - spanning GPU, CPU, monitors, motherboard, peripherals, and more.
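The trillion-combination figure is plausible because the configuration space is a product of independent choices. An illustrative sketch with hypothetical option counts (the specific numbers below are not NVIDIA's):

```python
import math

# Hypothetical number of options per setting category (GPU clocks, CPU power
# profiles, display modes, and so on); the counts are illustrative only.
setting_options = [100, 50, 40, 30, 25, 20, 10, 10, 10]

combinations = math.prod(setting_options)
print(combinations)  # 3,000,000,000,000 - past the trillion mark
```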

NVIDIA built Project G-Assist, an experimental AI assistant that runs locally on GeForce RTX AI PCs, to simplify this experience. G-Assist helps users control a broad range of PC settings, from optimizing game and system settings, charting frame rates and other key performance statistics, to controlling select peripherals settings such as lighting - all via basic voice or text commands.

NVIDIA NIM Microservices and AI Blueprints Usher in New Era of Local AI

Over the past year, generative AI has transformed the way people live, work and play, enhancing everything from writing and content creation to gaming, learning and productivity. PC enthusiasts and developers are leading the charge in pushing the boundaries of this groundbreaking technology. Countless times, industry-defining technological breakthroughs have been invented in one place—a garage. This week marks the start of the RTX AI Garage series, which will offer routine content for developers and enthusiasts looking to learn more about NVIDIA NIM microservices and AI Blueprints, and how to build AI agents, creative workflows, digital humans, productivity apps and more on AI PCs. Welcome to the RTX AI Garage.

This first installment spotlights announcements made earlier this week at CES, including new AI foundation models available on NVIDIA RTX AI PCs that take digital humans, content creation, productivity and development to the next level. These models—offered as NVIDIA NIM microservices—are powered by new GeForce RTX 50 Series GPUs. Built on the NVIDIA Blackwell architecture, RTX 50 Series GPUs deliver up to 3,352 trillion AI operations per second of performance, 32 GB of VRAM and feature FP4 compute, doubling AI inference performance and enabling generative AI to run locally with a smaller memory footprint.
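The "smaller memory footprint" claim follows from weight storage scaling with bits per parameter. A sketch of the FP16-versus-FP4 comparison, using an illustrative 8-billion-parameter model (the model size is an assumption, and real deployments add overhead for activations and the KV cache):

```python
PARAMS = 8e9  # illustrative 8-billion-parameter model (an assumption)

def weights_gb(bits_per_param):
    """Approximate weight storage in decimal gigabytes."""
    return PARAMS * bits_per_param / 8 / 1e9

print(weights_gb(16))  # FP16: 16.0 GB
print(weights_gb(4))   # FP4:   4.0 GB, a 4x smaller footprint
```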
