News Posts matching #AI


NVIDIA Announces Turing-based Quadro RTX 8000, Quadro RTX 6000 and Quadro RTX 5000

NVIDIA today reinvented computer graphics with the launch of the NVIDIA Turing GPU architecture. The greatest leap since the invention of the CUDA GPU in 2006, Turing features new RT Cores to accelerate ray tracing and new Tensor Cores for AI inferencing which, together for the first time, make real-time ray tracing possible.

These two engines - along with more powerful compute for simulation and enhanced rasterization - usher in a new generation of hybrid rendering to address the $250 billion visual effects industry. Hybrid rendering enables cinematic-quality interactive experiences, amazing new effects powered by neural networks and fluid interactivity on highly complex models.

Five Years Too Late, Typo Fix Offers Improved AI in Aliens: Colonial Marines

It has been a long five years since Aliens: Colonial Marines launched as a hot mess, critically panned by gamers and critics alike. One of the reasons behind the negative reception was the game's poor AI. The Xenomorphs had a tendency to run straight into gunfire, or worse, would stand around or group up, making them easy targets. Suffice it to say, the Xenomorphs were far from scary. Now, a typographical error has been discovered as the reason behind some of those issues.

As noted on the ResetERA forums, a post by jamesdickinson963 on the ACM Overhaul ModDB page traced the problem to a spelling error in a single line of code within the game's ini file: the code has "teather" instead of the proper "tether". In theory, this simple mistake results in the "zone tether" failing to load the AI parameters attached to the broken bit of code.
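For illustration, the misspelled entry reportedly looks something like the following (an illustrative reconstruction based on the ModDB post, not a verbatim quote of the game's file):

```ini
; From the game's ini configuration (illustrative reconstruction).
; "Teather" should read "Tether" - the misspelled class name never
; resolves, so the AI parameters attached to the zone tether are
; silently skipped instead of being applied to the Xenomorphs.
ClassRemapping=PecanGame.PecanSeqAct_AttachXenoToTeather(ClassRemapping=PecanGame.PecanSeqAct_AttachPawnToTeather)
```

Correcting the spelling lets the remapping resolve, which is why such a small fix has such an outsized effect on the AI's behavior.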

Let's Go Driverless: Daimler, Bosch Select NVIDIA DRIVE for Robotaxi Fleets

(Editor's Note: NVIDIA continues to spread its wings in the AI and automotive markets, where it has rapidly become the de facto player. While the company's gaming products have certainly been the ones to project its image - and the profits - that allowed it to become one of the world's leading tech companies, it's hard to deny that AI and datacenter accelerators have become one of the chief profit centers for the company. The company's vision for Level 4 and Level 5 autonomous driving and the future of our connected cities is an inspiring one that comes straight from yesterday's science fiction. Here's hoping the human mind, laws and city design efforts accompany these huge technological leaps - or at least don't strangle them too much.)

Press a button on your smartphone and go. Daimler, Bosch and NVIDIA have joined forces to bring fully automated and driverless vehicles to city streets, and the effects will be felt far beyond the way we drive. While the world's billion cars travel 10 trillion miles per year, most of the time these vehicles are sitting idle, taking up valuable real estate while parked. And when driven, they are often stuck on congested roadways. Mobility services will solve these issues plaguing urban areas, capture underutilized capacity and revolutionize the way we travel.

Samsung Foundry and Arm Expand Collaboration to Drive High-Performance Computing Solutions

Samsung Electronics, a world leader in advanced semiconductor technology, today announced that its strategic foundry collaboration with Arm will be expanded to 7/5-nanometer (nm) FinFET process technology to remain a step ahead in the era of high-performance computing. Based on Samsung Foundry's 7LPP (7nm Low Power Plus) and 5LPE (5nm Low Power Early) process technologies, the Arm Artisan physical IP platform will enable 3GHz+ computing performance for Arm's Cortex-A76 processor.

Samsung's 7LPP process technology will be ready for initial production in the second half of 2018. 7LPP will be the company's first process technology to use extreme ultraviolet (EUV) lithography, and its key IPs are in development, expected to be completed by the first half of 2019. Samsung's 5LPE technology will allow greater area scaling and ultra-low power benefits, building on the latest innovations in the 7LPP process technology.

Baidu Unveils 'Kunlun' High-Performance AI Chip

Baidu Inc. today announced Kunlun, China's first cloud-to-edge AI chip, built to accommodate the high performance requirements of a wide variety of AI scenarios. The announcement includes the training chip "818-300" and the inference chip "818-100". Kunlun can be applied to both cloud and edge scenarios, such as data centers, public clouds and autonomous vehicles.

Kunlun is a high-performance and cost-effective solution for the high processing demands of AI. It leverages Baidu's AI ecosystem, which includes AI scenarios like search ranking and deep learning frameworks like PaddlePaddle. Baidu's years of experience in optimizing the performance of these AI services and frameworks afforded the company the expertise required to build a world class AI chip.

NVIDIA Joins S&P 100 Stock Market Index

With tomorrow's opening bell, NVIDIA will join the Standard & Poor's S&P 100 index, replacing Time Warner, whose spot was freed up by its merger with AT&T. This marks a monumental moment for the company, as membership in the S&P 100 is reserved for only the largest and most important corporations in the US. From the tech sector, the list comprises illustrious names such as Apple, Amazon, Facebook, Google parent Alphabet, IBM, Intel, Microsoft, Netflix, Oracle, PayPal, Qualcomm and Texas Instruments.

NVIDIA's stock has seen massive gains over the past few years, thanks to the company delivering record quarter after record quarter. Recent developments have transformed NVIDIA from a mostly gaming-GPU manufacturer into a leader in the fields of GPU compute, AI and machine learning. This of course inspires investors, and the stock has been highly sought after, now sitting above 265 USD, which puts the company's worth at over 160 billion USD. Congratulations!

ASUS Introduces Full Lineup of PCI-E Servers Powered by NVIDIA Tesla GPUs

ASUS, the leading IT company in server systems, server motherboards, workstations and workstation motherboards, today announced support for the latest NVIDIA AI solutions with NVIDIA Tesla V100 Tensor Core 32GB GPUs and Tesla P4 on its accelerated computing servers.

Artificial intelligence (AI) is translating data into meaningful insights, services and scientific breakthroughs. The size of the neural networks powering this AI revolution has grown tremendously. For instance, today's state-of-the-art neural network model for language translation, Google's MoE model, has 8 billion parameters, compared to the 100 million parameters of models from just two years ago.

To handle these massive models, NVIDIA Tesla V100 offers a 32GB memory configuration, which is double that of the previous generation. Providing 2X the memory improves deep learning training performance for next-generation AI models by up to 50 percent and improves developer productivity, allowing researchers to deliver more AI breakthroughs in less time. Increased memory allows HPC applications to run larger simulations more efficiently than ever before.

NGD Systems Delivers Industry-First 16TB NVMe Computational U.2 SSD

NGD Systems, Inc., the leader in computational storage, today announced the general availability (GA) of the 16 terabyte (TB) Catalina-2 U.2 NVMe solid state drive (SSD). The Catalina-2 is the first NVMe SSD with 16TB capacity that also offers NGD's powerful "In-Situ Processing" capabilities. The Catalina-2 does this without impacting the reliability, quality of service (QoS) or power consumption already available in currently shipping NGD products.

The use of Arm multi-core processors in Catalina-2 provides users with a well-understood development environment and the combination of exceptional performance with low power consumption. The Arm-based In-Situ Processing platform allows NGD Systems to pack both high capacity and computational ability into the first 16TB 2.5-inch form factor package on the market. The NGD Catalina-2 U.2 NVMe SSD only consumes 12W (0.75W/TB) of power, compared to the 25W or more used by other NVMe solutions. This provides the highest energy efficiency in the industry.
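The quoted 0.75W/TB figure follows directly from the numbers above; a quick arithmetic sketch (the 25W comparison assumes a competing drive of similar capacity, which is the article's framing rather than a measured benchmark):

```python
# Power efficiency of the Catalina-2 versus a typical NVMe SSD,
# using the figures quoted above.
CAPACITY_TB = 16
CATALINA_POWER_W = 12
TYPICAL_POWER_W = 25  # "25W or more" per the article

catalina_w_per_tb = CATALINA_POWER_W / CAPACITY_TB
typical_w_per_tb = TYPICAL_POWER_W / CAPACITY_TB  # assuming equal capacity

print(f"Catalina-2:    {catalina_w_per_tb:.2f} W/TB")  # 0.75 W/TB, as quoted
print(f"Typical drive: {typical_w_per_tb:.2f} W/TB")   # ~1.56 W/TB
```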

Intel's Mobileye Secures a Future-Focused Deal for 8 Million Self-Driving Systems in 2021

Intel's Mobileye, the AI and self-driving outfit the blue giant acquired last year for a cool $15.3 billion, has just announced, via an exclusive report to Reuters, that it has secured a contract to provide some 8 million self-driving systems to a European automaker. The deal is a future-focused one: by 2021, it will see distribution of Intel's EyeQ5 chip, which is designed for fully autonomous driving - an upgrade to the EyeQ4 that will be rolled out in the coming weeks, Reuters reports, according to Erez Dagan, senior vice president for advanced development and strategy at Mobileye.

Amnon Shashua, Mobileye's chief executive, said that "By the end of 2019, we expect over 100,000 Level 3 cars [where the car is self-driving but still allows for user intervention should the system be unable to progress for more than 10 seconds] with Mobileye installed." This deal is sure to make Intel even more of a player in the automotive space, where NVIDIA and a number of other high-profile companies have been making strides in recent years.

Probabilistic Computing Takes Artificial Intelligence to the Next Step

The potential impact of Artificial Intelligence (AI) has never been greater - but we'll only be successful if AI can deliver smarter and more intuitive answers. A key barrier to AI today is that natural data fed to a computer is largely unstructured and "noisy."

It's easy for humans to sort through natural data. For example: If you are driving a car on a residential street and see a ball roll in front of you, you would stop, assuming there is a small child not far behind that ball. Computers today don't do this. They are built to assist humans with precise productivity tasks. Making computers efficient at dealing with probabilities at scale is central to our ability to transform current systems and applications from advanced computational aids into intelligent partners for understanding and decision-making.
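As a toy illustration of the kind of reasoning involved, a probabilistic system weighs noisy evidence rather than matching exact patterns - here, a simple Bayesian update for the rolling-ball scenario (all probabilities below are invented purely for the example, not drawn from any real system):

```python
# Toy Bayesian update: how likely is a child nearby, given that a ball
# just rolled into the street? Priors are made up for illustration.
p_child = 0.05                 # prior: a child is near the street
p_ball_given_child = 0.30      # a nearby child often means a stray ball
p_ball_given_no_child = 0.01   # balls rarely roll out otherwise

# Total probability of seeing a ball roll out
p_ball = (p_ball_given_child * p_child
          + p_ball_given_no_child * (1 - p_child))

# Bayes' rule: posterior probability of a child given the ball
p_child_given_ball = p_ball_given_child * p_child / p_ball

print(f"P(child | ball) = {p_child_given_ball:.2f}")  # jumps from 0.05 to ~0.61
```

A single weak observation raises the belief from 5% to roughly 61% - the driver-like inference the paragraph describes, done with probabilities instead of fixed rules.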

A Very Real Intelligence Race: The White House Hosts 38 Tech Companies on AI

The White House today is hosting executives from 38 companies for a grueling day of trying to navigate the as-yet murky waters of AI development. The meeting, which includes representatives from Microsoft, Intel, Google, Amazon, Pfizer, and Ford, among others, aims to gather thoughts and ideas on how to supercharge AI development in a sustainable, safe, and cost-effective way.

Fields such as agriculture, healthcare and transportation are being singled out as areas of interest (military applications, obviously, are being discussed elsewhere). The Washington Post quotes Michael Kratsios, deputy chief technology officer at the White House, as saying in a recent interview that "Whether you're a farmer in Iowa, an energy producer in Texas, a drug manufacturer in Boston, you are going to be using these techniques to drive your business going forward."

VIA Launches VIA Edge AI Developer Kit

VIA Technologies, Inc., today announced the launch of the VIA Edge AI Developer Kit, a highly-integrated package powered by the Qualcomm Snapdragon 820E Embedded Platform that simplifies the design, testing, and deployment of intelligent Edge AI systems and devices.

The kit combines the VIA SOM-9X20 SOM Module and SOMDB2 Carrier Board with a 13MP camera module that is optimized for intelligent real-time video capture, processing, and edge analysis. Edge AI application development is enabled by an Android 8.0 BSP, which includes support for the Snapdragon Neural Processing Engine (NPE) and full acceleration of the Qualcomm Hexagon DSP, Qualcomm Adreno 530 GPU, or Qualcomm Kryo CPU to power AI applications. A Linux BSP based on Yocto 2.0.3 is set to be released in June this year.

More Humane AI: Microsoft Launches "AI for Accessibility" Initiative

Microsoft at its Build conference today announced one of the better use cases for AI yet: to empower those with disabilities. Dubbed the AI for Accessibility Initiative, this Microsoft program will see $25 million being deployed across five years to further research and development that specifically targets challenges faced by people with disabilities in three key areas: human connection, employment and modern life. The $25 million budget will be used by Microsoft as seed grants for developers, universities, institutions and other Microsoft partners, with the Redmond-based company pledging to further invest in - and scale up - development for key promising ideas born of this project. The AI bit comes from its implementation in inclusive design scenarios, scaled up through platforms, services, and different solutions.

Further, Microsoft will help partners include accessibility solutions in their products, which could provide a base model for accessibility technologies across families of products. Microsoft President Brad Smith said there are about a billion people around the world with some kind of disability, either temporary or permanent, and it's for these people, and those that will come after, that Microsoft is committing to this investment.

Adobe and NVIDIA Announce Partnership to Deliver New AI Services

At Adobe Summit, Adobe and NVIDIA today announced a strategic partnership to rapidly enhance their industry-leading artificial intelligence (AI) and deep learning technologies. Building on years of collaboration, the companies will work to optimize the Adobe Sensei AI and machine learning (ML) framework for NVIDIA GPUs. The collaboration will speed time to market and improve performance of new Sensei-powered services for Adobe Creative Cloud and Experience Cloud customers and developers.

The partnership advances Adobe's strategy to extend the availability of Sensei APIs and to broaden the Sensei ecosystem to a new audience of developers, data scientists and partners. "Combining NVIDIA's best-in-class AI capabilities with Adobe's leading creative and digital experience solutions, all powered by Sensei, will allow us to deliver higher-performing AI services to customers and developers more quickly," said Shantanu Narayen, president and CEO, Adobe. "We're excited to partner with NVIDIA to push the boundaries of what's possible in creativity, marketing and exciting new areas like immersive media."

Intel FPGAs Accelerate Artificial Intelligence for Deep Learning in Microsoft's Bing

Artificial intelligence (AI) is transforming industries and changing how data is managed, interpreted and, most importantly, used to solve real problems for people and businesses faster than ever.

Today's Microsoft Bing Intelligent Search news demonstrates how Intel FPGA (field programmable gate array) technology is powering some of the world's most advanced AI platforms. Advances to the Bing search engine with real-time AI will help people do more and learn more by going beyond delivering standard search results. Bing Intelligent Search will provide answers instead of web pages, and enable a system that understands words and the meaning behind them, as well as the context and intent of a search.

Fortigis Introduces Their Ultimate AI-Driven Security Router

Fortigis Inc., today announced the launch of Fortigis, a cloud-enabled privacy & security wireless Virtual Private Network (VPN) router that patrols and controls who connects to a user's Wi-Fi network. The Indiegogo campaign for Fortigis, which is being managed by the Launchpad Agency, has a funding goal of $50,000 USD, and features limited Super Early Bird Specials of $199 as well as several other rewards for backers.

Fortigis uses a Virtual Private Network (VPN) to provide users online privacy and anonymity by creating a secure and private network. The secure connection encrypts all of the user's incoming and outgoing web traffic so no one can trace their online activity, protecting them against identity theft, ad targeting and having their online transactions such as online shopping or banking hijacked.

Intel, NVIDIA Join Forces Towards Garnering US Government Support for AI Field

Intel and NVIDIA found themselves on the same side of the field last week, as both companies defended US government planning and intervention in the AI field. Naturally, these two giants' interests aligned in spurring the government to adopt policies and decision-making that would propel and accelerate the development of AI technologies. That move would certainly boost both companies' endgames, as both have carved out increased footholds in the AI industry in their respective areas.

This was just the first of three planned hearings with industry professionals in the AI field. The order of business: to inform the federal government in formulating public policy around AI technologies, with a defining stake - making sure the US maintains its leadership in AI. Texas Congressman Will Hurd, who spearheaded the testimony in the hearing, said that "dominance in artificial intelligence is a guaranteed leg up in the realm of geopolitics and economics." The NVIDIA and Intel spokespersons pointed to AI's impact on general systems' inefficiencies - in transportation, accounting, healthcare, cybersecurity, and defense systems - as definite areas of interest, with AI deployment promising extreme economic growth and reduced expenditure that could ultimately add eight trillion dollars to the U.S. economy by 2035.

NVIDIA Announces Q4 and Fiscal 2018 Results

NVIDIA (NASDAQ:NVDA) today reported record revenue for the fourth quarter ended January 28, 2018, of $2.91 billion, up 34 percent from $2.17 billion a year earlier, and up 10 percent from $2.64 billion in the previous quarter. GAAP earnings per diluted share for the quarter were a record $1.78, up 80 percent from $0.99 a year ago and up 34 percent from $1.33 in the previous quarter. Non-GAAP earnings per diluted share were $1.72, also a record, up 52 percent from $1.13 a year earlier and up 29 percent from $1.33 in the previous quarter.

For fiscal 2018, revenue was a record $9.71 billion, up 41 percent from $6.91 billion a year earlier. GAAP earnings per diluted share were a record $4.82, up 88 percent from $2.57 a year earlier. Non-GAAP earnings per diluted share were $4.92, also a record, up 61 percent from $3.06 a year earlier. "We achieved another record quarter, capping an excellent year," said Jensen Huang, founder and chief executive officer of NVIDIA. "In a powerful sign of our progress, attendees at NVIDIA's GPU Technology Conferences reached 22,000, up tenfold in five years, as software developers working in AI, self-driving cars, and a broad range of other fields continued to discover the acceleration and money-saving benefits of our GPU computing platform."
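The headline growth percentages follow from the quoted figures; a quick sanity check (nothing more than arithmetic on the numbers above):

```python
# Year-over-year and quarter-over-quarter revenue growth from the
# quarterly figures quoted above (in billions of USD).
q4_fy18 = 2.91
q4_fy17 = 2.17
q3_fy18 = 2.64

yoy = (q4_fy18 / q4_fy17 - 1) * 100
qoq = (q4_fy18 / q3_fy18 - 1) * 100

print(f"YoY growth: {yoy:.0f}%")  # ~34%, matching the reported figure
print(f"QoQ growth: {qoq:.0f}%")  # ~10%, matching the reported figure
```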

Thermaltake Releases World's First AI Voice Controlled Digital Power Supply

Thermaltake has infused Artificial Intelligence (AI) into its Smart Power Management (SPM) system and unveiled the new 'AI Voice Control' feature to enhance the DPS G Mobile APP, making it easier for mining and PC gaming enthusiasts to control and custom design their ultimate powerful PC system. The 'AI Voice Control' function is currently only featured in the Toughpower iRGB PLUS Titanium/Platinum digital power supply lineups for iOS, and the Android version will be coming soon.

Via mobile phone, users can access 'Voice Control' to easily turn the power supply on/off; change light mode, color and brightness; and control fan speed. In addition, the DPS G Mobile APP provides access to the PSU's key parameters, including historical efficiency, fan speed, wattage, voltage, temperature, total usage time/kWh/electricity cost, and the CPU/VGA's historical temperature. It also gives users control over the RGB fan lighting mode and remote PC power-off, plus real-time alerts covering major PSU abnormalities such as fan malfunction, overheating (above 140°F/60°C) or abnormal voltage levels (over/under 5% of the normal level). Best yet, it is fully TT RGB Plus Sync compatible, meaning users will be able to sync all RGB colors seamlessly across all Thermaltake TT RGB Plus-compatible product lines for even greater color coordination.

Leaked AI-powered Game Revenue Model Paper Foretells a Dystopian Nightmare

An artificial intelligence (AI) will deliberately tamper with your online gameplay as you scramble for more in-game items to win. The same AI will manipulate your state of mind at every step of your game to guide you towards more micro-transactions. Nothing in-game is truly fixed-rate. The game maps out your home, and cross-references it with your online footprint, to build a socio-economic picture of you, so the best possible revenue model and anti-buyer's-remorse strategy can be implemented on you. These, and more, are part of the dystopian nightmare that takes flight if a new AI-powered online game revenue model is implemented in MMO games of the near future.

The paper's slide-deck and signed papers (with corrections) were leaked to the web by an unknown source, with bits of information (names, brands) redacted. It has too much information to be dismissed offhand as a prank. It proposes leveraging AI to gather and build a socio-economic profile of a player to implement the best revenue-generation strategy. It also proposes using an AI to consistently "alter" the player's gameplay, such that the player's actions don't have the desired result leading toward beating the game, but instead an "unfair" consequence that motivates more in-game spending. The presentation spans a little over 50 slides, and is rich in text that requires little further explanation.

ASUS Announces Innovative Blue Cave AC2600 Wi-Fi Router

ASUS today announced Blue Cave, a stunningly designed high-performance AC2600 Wi-Fi router based on Intel's connected home technology that's built to meet the demanding needs of modern smart homes. Winner of a 2017 Computex Best Choice Golden Award, Blue Cave is an easy-to-use, innovatively designed router with stylish good looks and family-friendly features. It's built to cope easily with the increasingly complex and bandwidth-heavy demands of smart home networks, offering superb multi-device performance powered by the Intel Home Wi-Fi chipset, comprehensive commercial-grade security with ASUS AiProtection, and out-of-the-box IoT integration with Amazon Alexa and IFTTT.

"Combining years of ASUS experience in wireless innovation and Intel's connected home technology, Blue Cave delivers ultrafast Wi-Fi speeds throughout the home," said Tenlong Deng, General Manager, ASUS Networking and Wireless Devices Business Unit. "Not only powerful, its eye-catching design with no external antennas blends perfectly with any décor."

Vuzix to Showcase Their 'Blade' Augmented Reality Smart Glasses at CES 2018

Vuzix Corporation, a leading supplier of Smart Glasses and Augmented Reality (AR) technologies for the consumer and enterprise markets, is pleased to announce the official unveiling of the Vuzix Blade, which was awarded four International CES Innovation 2018 awards in the areas of Fitness, Sports and Biotech; Wireless Handset Accessories; Portable Media Players and Accessories as well as Computer Accessories.

The Vuzix Blade AR Smart Glasses provide a wearable smart display with a see-through viewing experience utilizing Vuzix's proprietary waveguide optics and Cobra II display engine. It's like having your computer or smartphone screen information right in front of you, wherever you go. The Vuzix Blade, weighing in at less than 3 oz., represents the first pair of smart glasses to provide the wearer with a sizable virtual screen and a brilliant palette of colors via a thin, completely see-through lens, in a fashionable form factor. The Vuzix Blade features adjustable FOV location and brightness, and is prescription ready, enabling any wearer, both indoors and out, to enjoy a comfortable visual experience. The Vuzix Blade, which can be paired seamlessly with Android or iOS phones, enables individuals or workers to leave their phone in their pocket while being visually presented with location-aware content driven by the user's phone. Built for today's world and usable for work or play, the Vuzix Blade AR Smart Glasses offer the wearer heads-up, hands-free mobile computing and connectivity in an unprecedented form factor.

Samsung Optimizes Exynos 9810 for AI Applications and Richer Multimedia Content

Samsung Electronics, a world leader in advanced semiconductor technology, today announced the launch of its latest premium application processor (AP), the Exynos 9 Series 9810. The Exynos 9810, built on Samsung's second-generation 10-nanometer (nm) FinFET process, brings the next level of performance to smartphones and smart devices with its powerful third-generation custom CPU, faster gigabit LTE modem and sophisticated image processing with deep learning-based software. In recognition of its innovation and technological advancements, Samsung's Exynos 9 Series 9810 has been selected as a CES 2018 Innovation Awards HONOREE in the Embedded Technologies product category and will be displayed at the event, which runs January 9-12, 2018, in Las Vegas, USA.

"The Exynos 9 Series 9810 is our most innovative mobile processor yet, with our third-generation custom CPU, ultra-fast gigabit LTE modem and deep learning-enhanced image processing," said Ben Hur, vice president of System LSI marketing at Samsung Electronics. "The Exynos 9810 will be a key catalyst for innovation in smart platforms such as smartphones, personal computing and automotive for the coming AI era."

The Laceli AI Compute Stick is Here to Compete Against Intel's Movidius

Gyrfalcon Technology Inc, an emerging AI chip maker in Silicon Valley, CA, has launched its Laceli AI Compute Stick, following Intel Movidius' announcement of its deep learning Neural Compute Stick in July of last year. With the company's first ultra-low power, high performance AI processor, the Lightspeeur 2801S, the Laceli AI Compute Stick delivers 2.8 TOPS of performance within 0.3 Watt of power, which is about 90 times more efficient than the Movidius USB Stick (0.1 TOPS within 1 Watt of power).
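The "90 times" claim can be sanity-checked from the quoted numbers (a rough back-of-the-envelope sketch; TOPS per watt is the metric implied by the announcement, and real-world efficiency depends heavily on workload):

```python
# Efficiency comparison in TOPS per watt, using the figures quoted above.
laceli_tops_per_w = 2.8 / 0.3    # ~9.33 TOPS/W
movidius_tops_per_w = 0.1 / 1.0  # 0.1 TOPS/W

ratio = laceli_tops_per_w / movidius_tops_per_w
print(f"Efficiency ratio: ~{ratio:.0f}x")  # ~93x, in line with the "90x" claim
```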

Lightspeeur is based on Gyrfalcon Technology Inc's APiM architecture, which uses memory as the AI processing unit. This eliminates the huge data movement that results in high power consumption. The architecture features true, on-chip parallelism, in situ computing, and eliminates memory bottlenecks. It has roughly 28,000 parallel computing cores and does not require external memory for AI inference.

LG's 2018 Speaker Lineup Combines Great Sound with Artificial Intelligence

LG Electronics (LG) at CES 2018 is showcasing a superior, clearer and smarter lineup of premium audio products that promises to change the way people think about home speakers. New for 2018 is Meridian Audio's advanced high-performance audio technology, which delivers more natural and warm sound. From immersive Dolby Atmos soundbars to portable Bluetooth speakers and its latest artificial intelligence (AI) speaker, LG has something for every music and movie lover.

Sound All Around and Above
LG's new SK10Y soundbar delivers 550W of powerful output and supports 5.1.2 channels by harnessing the power of Dolby Atmos technology. A unique aspect of the technology is that "sound objects" can be precisely placed anywhere in a three-dimensional space for immersive sound from all directions, including the ceiling, which enhances realism and the effect of being in the middle of the action. To create such powerful, textured sound, the SK10Y is equipped with multiple speakers - including a pair of powerful up-firing speakers - to envelop the listener from every angle. Users can adjust the volume of the up-firing speakers to optimize the sound for the height of the ceiling in the room.