News Posts matching #Driver


NVIDIA GeForce Game Ready Drivers 531.41 WHQL Released

NVIDIA has released the latest version of its GeForce Game Ready drivers, version 531.41 WHQL. The new release provides the best day-0 gaming experience for the Diablo IV beta, including DLSS 2 support, and brings the same DLSS 2 support to The Last of Us Part I, Smalland: Survive the Wild, and Deceive Inc. Version 531.41 WHQL is also the Game Ready driver for Resident Evil 4 and adds DLSS 3 support for Forza Horizon 5. Most importantly, it adds support for the technology preview of Cyberpunk 2077's Ray Tracing: Overdrive Mode. The new driver also adds several GeForce Experience profiles and fixes a number of issues.

DOWNLOAD: NVIDIA GeForce Game Ready 531.41 WHQL

Update: Added Open Issues to the release highlights.

3% of AMD Radeon Users May Experience Unusually Low 3DMark TimeSpy Performance, Driver Fix Underway

About 3% of AMD Radeon graphics card users may experience lower than usual 3DMark TimeSpy performance, says UL Benchmarks, developer of the 3DMark graphics benchmark suite. The issue came to light when a Google developer noticed that his RX 7900 XTX exhibited lower than expected performance in TimeSpy and took it up with UL. While the 3DMark developer hasn't been able to reproduce the issue on its end, it mentions that AMD is aware of it, has had more luck reproducing it, and is working on a driver-level fix. For now, UL offers no solution other than rolling back to older driver versions and testing again.

SAM/ReBAR Stripped Out of AMD Open-Source OpenGL Driver RadeonSI Gallium3D

Support for AMD's Smart Access Memory and the overarching Resizable BAR technology has been removed from the RadeonSI Gallium3D OpenGL driver as of today's Mesa 22.3.7 release. The comment in the announcement simply reads, "Disable Smart Access Memory because CPU access has large overhead." The nail in the coffin seems to have been a bug ticket submitted last month for the game Hyperdimension Neptunia Re;Birth1, in which the user reported the game running oddly slowly on their RX 6600, while they previously had no issues on the much older R9 380. The solution provided was to simply disable ReBAR/SAM, either with the radeonsi_disable_sam=true option (a launch sketch follows below) or via UEFI. In the comments of the ticket, lead RadeonSI developer Marek Olšák states, "We've never tested SAM with radeonsi, and it's not necessary there."

Apparently the performance advantages weren't panning out for RadeonSI, and since direct optimization of these features was not a primary goal, the decision was made to cut them out. Attempts to optimize SAM with RadeonSI date as far back as December 2020 and Mesa 21.0, but support for SAM under Linux goes back further still. None of the changes to RadeonSI will affect other drivers such as RADV, the open-source Radeon Vulkan driver; this code change is limited to the RadeonSI OpenGL driver alone.
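For context, below is a minimal sketch of how the workaround mentioned above could be applied from user space: a tiny launcher that sets the radeonsi_disable_sam option in the game's environment before starting it. This assumes a Mesa build that honours driconf options supplied as environment variables; the command being launched is whatever the user passes in, so no particular game is implied.

/* nosam.c - minimal sketch: run a command with RadeonSI's SAM/ReBAR path
 * disabled via the radeonsi_disable_sam driconf option, assuming the Mesa
 * build honours driconf options passed as environment variables.
 * Build: cc nosam.c -o nosam      Usage: ./nosam <game-binary> [args...]
 */
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

int main(int argc, char **argv)
{
    if (argc < 2) {
        fprintf(stderr, "usage: %s <command> [args...]\n", argv[0]);
        return 1;
    }

    /* Equivalent to prefixing the command with radeonsi_disable_sam=true */
    if (setenv("radeonsi_disable_sam", "true", 1) != 0) {
        perror("setenv");
        return 1;
    }

    execvp(argv[1], &argv[1]);   /* replaces this process; returns only on failure */
    perror("execvp");
    return 1;
}

Invoked as, for example, ./nosam ./SomeGame (the game name is purely illustrative), this is equivalent to prefixing the launch command with radeonsi_disable_sam=true.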

NVIDIA Releases 531.26 Hotfix for CPU Usage Bug

NVIDIA was quick to respond to the reported CPU usage bug present in Game Ready Driver 531.18 and has issued Hotfix Driver 531.26, which claims to resolve the issue. This release still includes the new RTX Video Super Resolution AI upscaling feature introduced with 531.18, which can be used with RTX 30 and 40 series cards in conjunction with the Chrome or Edge web browsers. Below are the release notes.

This hotfix addresses the following issues:
  • Higher CPU usage from NVIDIA Container might be observed after exiting a game [4007208]
  • [Notebook] Random bugcheck may be observed on certain laptops with GeForce GTX 10/MX250/350 series GPUs [4008527]
GeForce Hotfix Driver Version 531.26 can be found here, available for Windows 10 and 11 x64.

AMD Releases Adrenalin Edition 23.3.1 WHQL GPU Drivers

AMD has released its latest Adrenalin drivers for Radeon graphics cards. With support dating back to the RX 400 series, the latest Adrenalin 23.3.1 WHQL drivers bring a lot of improvements to the table, as well as support for the Halo Infinite Ray Tracing Update and Wo Long: Fallen Dynasty. Most importantly, the new driver carries a series of fixes, including for intermittent driver timeouts, system freezes, and BSODs with the latest Radeon RX 7000 series. Game-specific problems, such as issues with Premium Gold Packs in EA SPORTS FIFA 23 and lighting-effect corruption in Dying Light 2, have been fixed. As far as the Radeon RX 6000 series goes, this release fixes corruption seen in certain scenes with ray tracing enabled in Returnal. Check the list below for the entire set of changes.
Download: AMD Radeon Graphics Drivers 23.3.1 WHQL here.

Bug in NVIDIA Game Ready Driver 531.18 Causing High CPU Utilization

Last week's release of NVIDIA's Game Ready Driver introduced a new bug with the NVIDIA Display Container process that increases CPU usage by as much as 15% after closing a game. NVIDIA has since confirmed the bug on its forum in a feedback thread and assigned it bug tracking ID 4007208. The problem appears to stem from a telemetry service known as NvGSTPlugin.dll, or Game Session Telemetry Plugin, which is loaded by the NVIDIA Container after a game has been run. Some users report that completely removing the offending .dll solves the problem entirely, and a guide on how to do so has been posted on the r/nvidia subreddit in a thread about the release. If that sounds like far too much hassle, the prevailing advice is to simply remain on driver version 528.49 until the issue is resolved. NVIDIA is expected to release a hotfix driver as early as tomorrow to address this and possibly other issues.

Update Mar 7th: NVIDIA issued a hotfix driver release for this bug.

Rambus Delivers 6400 MT/s DDR5 Registering Clock Driver to Advance Server Memory Performance

Rambus Inc., a premier chip and silicon IP provider making data faster and safer, today announced the availability of its new 6400 MT/s DDR5 Registering Clock Driver (RCD) and sampling to the major DDR5 memory module (RDIMM) manufacturers. With a 33% increase in data rate and bandwidth over Gen 1 4800 MT/s solutions, the Rambus Gen 3 6400 MT/s DDR5 RCD enables a new level of main memory performance for data center servers. Delivering industry-leading latency and power, it offers optimized timing parameters for improved RDIMM margins.

"Data center workloads have an insatiable thirst for greater memory bandwidth and capacity, and our mission is to advance the performance of server memory solutions that meet this need for each new server platform generation," said Sean Fan, chief operating officer at Rambus. "We were first in the industry to 5600 MT/s, and now we have raised the bar with our Gen 3 DDR5 RCD capable of 6400 MT/s to support a new generation of RDIMMs for server main memory."

NVIDIA Could Release AI-Optimized Drivers, Improving Overall Performance

NVIDIA has been using artificial intelligence to design and develop parts of its chips, as we have seen in the past, making optimization much more efficient. However, today we have a new rumor that NVIDIA will use AI to optimize its driver performance to a degree that human engineers cannot reach. According to CapFrameX, NVIDIA is allegedly preparing special drivers with optimizations done by AI algorithms. As the source claims, the average improvement will be a 10% performance increase, with up to 30% in best-case scenarios. Presumably, AI can optimize on two fronts: the shader-compilation/game-optimization side, or power management, which includes clocks, voltages, and boost frequency curves.

It is still unclear which of these aspects the company's AI will optimize; however, it could be a combination of the two, given the drastic performance improvement that is expected. Special tuning of code for more efficient execution, combined with a better power/frequency curve, would bring efficiency one notch above current releases. We have already seen AI solve similar problems last year with the PrefixRL model, which compacted circuit designs by 25%. It also remains to be seen which cards NVIDIA plans to target; we can only assume that the latest-generation GeForce RTX 40 series will be the goal if the project is made public in Q1 of this year.

YoY Growth of NAND Flash Demand Bits Will Stay Under 30% from 2022 to 2025 as Demand Slows for PC Client SSDs, Says TrendForce

Client SSDs constituted a major driver of demand bit growth in the NAND Flash market for the past two years as the effects of the COVID-19 pandemic were spurring procurement activities related to working and studying from home. TrendForce currently projects that the attach rate of client SSDs among notebook computers will reach 92% in 2022 and around 96% in 2023. However, the demand surge related to the pandemic is subsiding, and the recent headwinds in the global economy have caused a demand freeze in the wider consumer electronics market. Hence, among the major application segments of the NAND Flash market, client SSDs are going to experience the most significant demand slowdown. This, in turn, will constrain demand bit growth as well. TrendForce projects that for the period from 2022 to 2025, the YoY growth rate of NAND Flash demand bits will remain below 30%.

The average NAND Flash content of client SSDs has already surpassed 500 GB this year. Quotes for 512 GB SSDs have fallen sharply and come to a level roughly comparable to the quotes given for 256 GB SSDs half a year ago. In fact, quotes for 512 GB SSDs are also near the level for HDDs of the same capacity. On the other hand, upgrading notebook SSDs to 1 TB or higher could be challenging for PC OEMs, mainly because the licensing fee for the Windows OS has a positive correlation with device specifications. Therefore, an increase in SSD capacity raises the cost of the whole notebook computer. With PC OEMs being less keen on adopting SSDs of 1 TB or higher, growth in the average NAND Flash content of client SSDs will also be more limited in the future.

Montage Technology Delivers World's First Gen1 DDR5 Clock Driver Engineering Samples

Montage Technology, a leading data processing and interconnect IC design company, today announced that it is delivering the world's first Gen1 DDR5 Clock Driver (CKD or DDR5CK01) samples to the top DRAM memory vendors for their development of memory modules used in new-generation desktop and notebook computers.

For a long time, the clock driver function has been integrated into the register clock driver (RCD) device, which is used on server platforms rather than in PCs. As the DDR5 data rate increases, the frequency of the clock signal gets higher and higher, and maintaining clock signal integrity becomes more and more challenging. As the DDR5 data rate reaches 6400 MT/s and above, memory modules such as the UDIMMs and SODIMMs used in desktop and notebook computers will need an on-DIMM clock driver to buffer and re-drive the clock signal of the memory module, in order to meet the signal-integrity and reliability requirements of the high-speed clock signal.

Intel Detects 43 GPU Driver Bugs... By Watching a Review Video

Intel has been in the firing line for the consecutive delays and general lack of clarity surrounding the launch of its Arc Alchemist family of discrete GPUs. Staggered availability has meant that the only currently available Arc GPU - the A380 - is still exclusive to the Chinese market, where the Internet café scene is still strong. Intel's drivers in particular have been fraught with bugs, and as we know, software can bring even the most competent hardware to its knees. So there is perhaps an echo of warning bells about the real state of Arc's software suite when Intel admits to having detected 43 different GPU driver bugs... while watching a review video from Gamers Nexus.

We've conducted our own review of Intel's Arc A380 (after importing it from China), and we did call attention to how we "encountered numerous bugs including bluescreens, corrupted desktop after startup, random systems hangs, system getting stuck during shutdown sequences, and more." Only AMD and NVIDIA seem to have an idea of just what it takes to break into, and maintain a position in, this particular product segment. Intel itself is still in the process of learning what that takes, as its own VP and general manager of the Visual Computing Group, Lisa Pearce, penned in a blog post.

Intel Driver Update Confirms VPU Integration in Meteor Lake for AI Workload Acceleration

Intel yesterday confirmed its plans to extend its Meteor Lake architecture toward shores other than general processing. According to Phoronix, Intel posted a new driver that lays the foundations for VPU (Versatile Processing Unit) support under Linux. The idea is that Intel will integrate this VPU within its 14th Gen Meteor Lake architecture, adding AI inferencing acceleration capabilities to its silicon - a sure-fire way to achieve major gains in AI processing, especially in performance per watt. Interestingly, Intel is somewhat following in Apple's footsteps here, as that company has included AI-dedicated processing cores in its desktop/laptop Apple Silicon processors since the M1 days.

Intel's VPU architecture will surely be derived from Movidius' designs, which Intel acquired back in 2016 for a cool $400 million. It's unclear which parts of the Movidius/Intel IP will be included in the VPU units paired with Meteor Lake: whether a full-blown, SoC (System on Chip)-like VPU design such as the Myriad X VPU, or whether Intel will take select bits of the architecture (plus the equivalent of five additional years of research and development) and sprinkle them on top of the upcoming architecture. We do know the VPU itself will include a memory management unit, a RISC-based microcontroller, a Neural Compute System (what exactly this compute system and its slices entail is the mysterious part), and network-on-chip capabilities.

AMD Introduces Radeon Raytracing Analyzer 1.0

Today, AMD's GPUOpen team announced a new tool for game developers using ray tracing technologies, designed to help them organize the model geometries in their scenes. Called Radeon Raytracing Analyzer (RRA) 1.0, it is officially available to download for Linux and Windows and is released as part of the Radeon Developer Tool Suite. With rendering slowly shifting from rasterization to ray tracing, developers need a tool that points out performance issues and possible workarounds in the process. With RRA, AMD has given Radeon developers a tool that can answer questions such as: how much memory is the acceleration structure using, how complex is the implemented BVH, how many acceleration structures are used, and is the geometry in the BLAS sufficiently axis-aligned? Developers will find it very useful for their ray tracing workloads.
AMD: RRA is able to work because our Radeon Software driver engineers have been hard at work, adding raytracing support to our Developer Driver technology. This means that once your application is running in developer mode - using the Radeon Developer Panel which ships with RRA - the driver can log all of the acceleration structures in a scene with a single button click. The Radeon Raytracing Analyzer tool can then load and interrogate the data generated by the driver, presenting it in an easy-to-understand way.

Intel's Day-0 Driver Updates Now Limited to Xe-based iGPUs and Graphics Cards

Intel Graphics, with its latest Graphics Driver 31.0.101.3222, has changed the coverage of its driver updates. The company will now provide game optimizations and regular driver updates only for its Gen12 (Iris Xe) and Arc "Alchemist" graphics products. Support for the Gen9, Gen9.5, and Gen11 iGPUs integrated with 6th through 10th generation Intel processors, namely "Skylake," "Kaby Lake," "Coffee Lake," "Ice Lake," and "Comet Lake," is relegated to a separate, quarterly driver update cycle that only covers critical fixes and security vulnerabilities, but not game optimizations.

Intel's regular Graphics Driver cycle, which includes Day-0 optimizations timed with new game releases, will only cover the Gen12 Xe iGPUs found in 11th Gen "Tiger Lake," "Rocket Lake," and 12th Gen "Alder Lake" processors, besides the DG1 Iris Xe graphics card and Arc "Alchemist" discrete GPUs. Version 31.0.101.3222 appears to be a transition point, and so it has drivers from both branches included within a 1.1 GB package (the main branch supporting game optimizations for the newer GPUs, and the legacy branch for the older iGPUs). You can grab this driver from here.

AMD Releases Ryzen Chipset Drivers 4.06.10.651

AMD has recently released new chipset drivers for Ryzen processors on a wide range of platforms, including A320, B350, X370, B450, X470, X399, A520, B550, X570, TRX40, and WRX80, running Windows 10 & 11. The new drivers include additional program support, display & airplane mode notifications, Ryzen 9 Mobile improvements, and various bug fixes. This latest release also sees the introduction of six new drivers, including official USB 4.0 support; the complete changelog and download link can be found below.

Download: AMD Ryzen Chipset Drivers (4.06.10.651)

NVIDIA GeForce 516.40 WHQL Drivers Released

NVIDIA today released the latest version of its GeForce software. Version 516.40 WHQL comes Game Ready for "Fall Guys: Free for All," with NVIDIA Reflex support for "ICARUS," and RTX ray tracing support for "Jurassic World Evolution 2," "Resident Evil 2," "Resident Evil 3," and "Resident Evil 7." Among the handful of issues fixed with this release are shadows not rendering correctly in "Enscape," brightness settings not being applied correctly on certain Lenovo notebooks with Advanced Optimus enabled, and the Club 3D CAC-1085 dongle being limited to a 4K 60 Hz display mode.

DOWNLOAD: NVIDIA GeForce 516.40 WHQL

NVIDIA Releases Security Update 473.47 WHQL Driver for Kepler GPUs

Ten years ago, in 2012, NVIDIA introduced its Kepler series of graphics cards based on TSMC's 28 nm node. The architecture has been supported by NVIDIA's drivers for quite a while now, and the last series to carry support is the 470 driver branch. Today, NVIDIA pushed a security update in the form of the 473.47 WHQL driver, which fixes various CVE-listed vulnerabilities that could lead to denial of service, information disclosure, or data tampering. This driver version brings no other fixes or additional features beyond the security patches. With severity ratings ranging from 4.1 to 8.5, covering risks such as code execution, denial of service, escalation of privileges, information disclosure, and data tampering, the 473.47 WHQL driver is another step in supporting the Kepler architecture until 2024, when NVIDIA plans to drop support for it. Supported cards are the GT 600, GT 700, GTX 600, GTX 700, Titan, Titan Black, and Titan Z series.

The updated drivers are available for installation on NVIDIA's website and for users of TechPowerUp's NVCleanstall software.

NVIDIA Releases Open-Source GPU Kernel Modules

NVIDIA is now publishing Linux GPU kernel modules as open source with dual GPL/MIT license, starting with the R515 driver release. You can find the source code for these kernel modules in the NVIDIA Open GPU Kernel Modules repo on GitHub. This release is a significant step toward improving the experience of using NVIDIA GPUs in Linux, for tighter integration with the OS and for developers to debug, integrate, and contribute back. For Linux distribution providers, the open-source modules increase ease of use.

They also improve the out-of-the-box user experience to sign and distribute the NVIDIA GPU driver. Canonical and SUSE are able to immediately package the open kernel modules with Ubuntu and SUSE Linux Enterprise Distributions. Developers can trace into code paths and see how kernel event scheduling is interacting with their workload for faster root cause debugging. In addition, enterprise software developers can now integrate the driver seamlessly into the customized Linux kernel configured for their project.

Marvell Introduces Industry's First 800G Multimode Electro-Optics Platform for Cloud Data Centers

Marvell (NASDAQ: MRVL) today announced the industry's first 800 Gbps or 8x 100 Gbps multimode platform solution that enables data center infrastructure to achieve dramatically higher speeds for short-reach optical modules and Active Optical Cable (AOC) applications. As artificial intelligence (AI), machine learning (ML) and high-performance computing (HPC) applications continue to drive greater bandwidth requirements, cloud-optimized solutions are needed that can bring lower power, latency and cost to short-range data center interconnections. The new 800G platform, which includes Marvell's PAM4 DSP with a multimode transimpedance amplifier (TIA) and driver, enables faster data center speeds scaling to 800 Gbps, using conventional cost-effective vertical-cavity surface-emitting laser (VCSEL) technology while accelerating time-to-market with plug-and-play deployment.

Today's data centers are packed with equipment utilizing optical modules or AOCs connected by multimode optical fiber optimized for communication over short distances within data centers. This 100G per lane multimode fiber provides cost-effective, low-power, short-reach connectivity. To support multi-gigabit transmissions, multimode architectures often use VCSEL transmitters, which offer the cost benefits of reliability, power efficiency and easy deployment.

Hackers Threaten to Release NVIDIA GPU Drivers Code, Firmware, and Hash Rate Limiter Bypass

A few days ago, we found out that NVIDIA Corporation had been hacked and that attackers managed to steal around 1 TB of sensitive data from the company. This includes various kinds of files, like GPU driver and GPU firmware source code, and something a bit more interesting. The LAPSUS$ hacking group responsible for the attack is now threatening to "help the mining and gaming community" by releasing a bypass for the Lite Hash Rate (LHR) GPU hash rate limiter. As the group notes, a full LHR V2 workaround for everything from GA102 to GA104 is up for sale and ready for further spreading.

Additionally, the hacking group is blackmailing the company, demanding that it remove the LHR limiter from its software or share details of the "hw folder," presumably a hardware folder with various confidential schematics and hardware information. NVIDIA has not responded to these claims and has made no official statement regarding the situation other than acknowledging that it is investigating an incident.

Update 01:01 UTC: The hackers have released part of their files to the public. It's an 18.8 GB RAR file that uncompresses to over 400,000 (!) files occupying 75 GB, mostly source code.

Intel Fails to Deliver on Promised Day-0 Elden Ring Graphics Driver

It seems that someone at Intel forgot to press "post" on the company's promised day-0 driver update for one of this year's most anticipated games - Elden Ring. The company previously announced a partnership with Elden Ring developer FromSoftware on an updated driver that would give Intel-based Elden Ring players streamlined performance and a (hopefully) bug-free experience when it comes to graphics rendering. But Elden Ring's launch day of February 24th has come and gone - and Intel is mum on where exactly its updated driver lies. For now, the latest available Intel graphics driver stands at version 101.1121, released in November last year.

It may be that driver development hit an unexpected snag, or perhaps Intel has simply opted to delay the driver's launch until there are actually some discrete-level graphics cards available for purchase - the company's initial Arc Alchemist lineup is expected to be announced and launched later this month. That would make sense, especially considering that a driver update this close to release might include some interesting data on the upcoming graphics cards that could be dug up by data miners. Even so, it doesn't seem like a good PR move for Intel to loudly promise an updated driver and then fail to release it - especially as Intel's uphill battle in the discrete GPU market is just beginning. Perhaps the driver developers are simply having too much fun with FromSoftware's critically acclaimed latest installment?

8-inch Wafer Capacity Remains Tight, Shortages Expected to Ease in 2H23, Says TrendForce

From 2020 to 2025, the compound annual growth rate (CAGR) of 12-inch equivalent wafer capacity at the world's top ten foundries will be approximately 10%, with the majority of these companies focusing on 12-inch capacity expansion, which will see a CAGR of approximately 13.2%, according to TrendForce's research. In terms of 8-inch wafers, due to factors such as difficult-to-obtain equipment and questions about whether capacity expansion is cost-effective, most fabs can only expand production slightly by means of capacity optimization, equating to a CAGR of only 3.3%. In terms of demand, the products primarily derived from 8-inch wafers, PMICs and power discretes, are driven by demand for electric vehicles, 5G smartphones, and servers. Stocking momentum has not fallen off, resulting in a serious shortage of 8-inch wafer production capacity that has festered since 2H19. Therefore, in order to mitigate competition for 8-inch capacity, a trend of shifting certain products to 12-inch production has gradually emerged. However, if shortages in overall 8-inch capacity are to be effectively alleviated, it is still necessary to wait for a large number of mainstream products to migrate to 12-inch production. The timeframe for this migration is estimated to be close to 2H23 into 2024.

Intel Adds Experimental Mesh Shader Support in DG2 GPU Vulkan Linux Drivers

Mesh shaders are a relatively new concept for a programmable geometry pipeline that promises to simplify the organization of the whole graphics rendering pipeline. NVIDIA introduced the concept with Turing back in 2018, and AMD joined with RDNA2. Today, thanks to findings from Phoronix, we have learned that Intel's DG2 GPUs will carry support for mesh shaders and expose it under the Vulkan API. For starters, the difference between the mesh/task pipeline and the traditional graphics rendering pipeline is that the mesh version is much simpler and offers higher scalability, reduced bandwidth, and greater flexibility in the design of mesh topology and graphics work. In Vulkan, mesh shading is currently exposed through NVIDIA's contribution, the VK_NV_mesh_shader extension. The docs below explain it in greater detail:
Vulkan API documentation: This extension provides a new mechanism allowing applications to generate collections of geometric primitives via programmable mesh shading. It is an alternative to the existing programmable primitive shading pipeline, which relied on generating input primitives by a fixed function assembler as well as fixed function vertex fetch.

There are new programmable shader types—the task and mesh shader—to generate these collections to be processed by fixed-function primitive assembly and rasterization logic. When task and mesh shaders are dispatched, they replace the core pre-rasterization stages, including vertex array attribute fetching, vertex shader processing, tessellation, and geometry shader processing.
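As a concrete illustration of where mesh shading currently surfaces in Vulkan, here is a minimal C sketch - not Intel's driver code, just a standalone probe against the standard Vulkan loader - that checks whether the first enumerated GPU advertises the VK_NV_mesh_shader extension described above.

/* mesh_check.c - report whether the first Vulkan device exposes
 * VK_NV_mesh_shader.  Build: cc mesh_check.c -o mesh_check -lvulkan
 */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <vulkan/vulkan.h>

int main(void)
{
    VkApplicationInfo app = {
        .sType = VK_STRUCTURE_TYPE_APPLICATION_INFO,
        .pApplicationName = "mesh_check",
        .apiVersion = VK_API_VERSION_1_1,
    };
    VkInstanceCreateInfo ici = {
        .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO,
        .pApplicationInfo = &app,
    };
    VkInstance instance;
    if (vkCreateInstance(&ici, NULL, &instance) != VK_SUCCESS) {
        fprintf(stderr, "vkCreateInstance failed\n");
        return 1;
    }

    uint32_t gpuCount = 0;
    vkEnumeratePhysicalDevices(instance, &gpuCount, NULL);
    if (gpuCount == 0) {
        fprintf(stderr, "no Vulkan devices found\n");
        return 1;
    }
    VkPhysicalDevice *gpus = malloc(gpuCount * sizeof(*gpus));
    vkEnumeratePhysicalDevices(instance, &gpuCount, gpus);

    /* Enumerate device extensions and look for VK_NV_mesh_shader */
    uint32_t extCount = 0;
    vkEnumerateDeviceExtensionProperties(gpus[0], NULL, &extCount, NULL);
    VkExtensionProperties *exts = malloc(extCount * sizeof(*exts));
    vkEnumerateDeviceExtensionProperties(gpus[0], NULL, &extCount, exts);

    int found = 0;
    for (uint32_t i = 0; i < extCount; i++)
        if (strcmp(exts[i].extensionName, VK_NV_MESH_SHADER_EXTENSION_NAME) == 0)
            found = 1;

    printf("VK_NV_mesh_shader %s on device 0\n",
           found ? "is available" : "is NOT available");

    free(exts);
    free(gpus);
    vkDestroyInstance(instance, NULL);
    return 0;
}

If the extension is present, an application enables it at device creation and records its geometry work with vkCmdDrawMeshTasksNV() in place of the classic vertex-pipeline draw calls.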

The Power of AI Arrives in Upcoming NVIDIA Game-Ready Driver Release with Deep Learning Dynamic Super Resolution (DLDSR)

Among the broad range of new game titles getting support, we are in for a surprise. NVIDIA yesterday announced the feature list of its upcoming Game Ready GeForce driver scheduled for public release on January 14th. According to a new blog post on NVIDIA's website, the forthcoming Game Ready driver will feature an AI-enhanced version of Dynamic Super Resolution (DSR), which has been available in GeForce drivers for a while. The new AI-powered tech is what the company calls Deep Learning Dynamic Super Resolution, or DLDSR for short. It uses a neural network that requires fewer input pixels and produces stunning image quality on your monitor.
NVIDIA: Our January 14th Game Ready Driver updates the NVIDIA DSR feature with AI. DLDSR (Deep Learning Dynamic Super Resolution) renders a game at higher, more detailed resolution before intelligently shrinking the result back down to the resolution of your monitor. This downsampling method improves image quality by enhancing detail, smoothing edges, and reducing shimmering.

DLDSR improves upon DSR by adding an AI network that requires fewer input pixels, making the image quality of DLDSR 2.25X comparable to that of DSR 4X, but with higher performance. DLDSR works in most games on GeForce RTX GPUs, thanks to their Tensor Cores.
NVIDIA Deep Learning Dynamic Super Resolution
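For those wondering what the 2.25X and 4X factors translate to on an actual display, here is a small C sketch of the arithmetic. It assumes, as with NVIDIA's existing DSR factors, that a factor is a total-pixel multiplier, so each axis scales by the square root of the factor; the 1920x1080 monitor in the example is just an illustration.

/* dsr_res.c - resolution arithmetic behind DSR/DLDSR factors, assuming a
 * factor is a total-pixel multiplier (per-axis scale = sqrt(factor)).
 * Build: cc dsr_res.c -o dsr_res -lm
 */
#include <math.h>
#include <stdio.h>

static void report(int w, int h, double factor, const char *label)
{
    double axis = sqrt(factor);              /* per-axis scale factor */
    int rw = (int)lround(w * axis);          /* internal render width  */
    int rh = (int)lround(h * axis);          /* internal render height */
    printf("%-6s %.2fx -> render %dx%d, downscale to %dx%d\n",
           label, factor, rw, rh, w, h);
}

int main(void)
{
    /* On a 1920x1080 monitor: DLDSR 2.25x -> 2880x1620, DSR 4x -> 3840x2160 */
    report(1920, 1080, 2.25, "DLDSR");
    report(1920, 1080, 4.0,  "DSR");
    return 0;
}

On that monitor, DLDSR 2.25X renders internally at 2880x1620 while DSR 4X renders at 3840x2160, which is the basis for NVIDIA's claim that DLDSR 2.25X approaches DSR 4X image quality at a lower rendering cost.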

FTC Sues to Block $40 Billion NVIDIA-Arm Semiconductor Chip Merger

The Federal Trade Commission today sued to block U.S. chip supplier Nvidia Corp.'s $40 billion acquisition of U.K. chip design provider Arm Ltd. Semiconductor chips power the computers and technologies that are essential to our modern economy and society. The proposed vertical deal would give one of the largest chip companies control over the computing technology and designs that rival firms rely on to develop their own competing chips. The FTC's complaint alleges that the combined firm would have the means and incentive to stifle innovative next-generation technologies, including those used to run datacenters and driver-assistance systems in cars.

"The FTC is suing to block the largest semiconductor chip merger in history to prevent a chip conglomerate from stifling the innovation pipeline for next-generation technologies," said FTC Bureau of Competition Director Holly Vedova. "Tomorrow's technologies depend on preserving today's competitive, cutting-edge chip markets. This proposed deal would distort Arm's incentives in chip markets and allow the combined firm to unfairly undermine Nvidia's rivals. The FTC's lawsuit should send a strong signal that we will act aggressively to protect our critical infrastructure markets from illegal vertical mergers that have far-reaching and damaging effects on future innovations."