News Posts matching #SLI


NVIDIA Prepares H100 NVL GPUs With More Memory and SLI-Like Capability

NVIDIA has killed SLI on its graphics cards, removing the ability to connect two or more GPUs and harness their combined power for gaming and other workloads. However, SLI makes a reincarnation of sorts today in the form of a new H100 GPU model that sports higher memory capacity and higher performance. Called the H100 NVL, the GPU is a unique design based on the regular H100 PCIe version. What makes the H100 NVL version so special is the boost in memory capacity, now up from 80 GB in the standard model to 94 GB in the NVL edition SKU, for a total of 188 GB of HBM3 memory, running on a 6144-bit bus. Being a special edition SKU, it is sold only in pairs, as these H100 NVL GPUs are connected to each other by three NVLink connectors on top. Installation requires two PCIe slots, separated by dual-slot spacing.
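As a quick sanity check on those figures, here is a minimal sketch of the memory math, assuming the 6144-bit bus figure is per card (six HBM3 stacks at 1024 bits each); NVIDIA has not stated the HBM3 data rate, so the bandwidth line uses a hypothetical 5.1 Gbps per pin purely for illustration:

```python
# Memory math for the H100 NVL pair, from the figures in this article.
per_card_mem_gb = 94          # up from 80 GB on the standard H100 PCIe
cards_per_pair = 2            # sold only in pairs, linked by three NVLink connectors
bus_width_bits = 6144         # assumed per card: six HBM3 stacks x 1024 bits

total_mem_gb = per_card_mem_gb * cards_per_pair
print(f"Total memory per pair: {total_mem_gb} GB")   # 188 GB

# Hypothetical per-pin data rate (not stated by NVIDIA), for illustration only.
data_rate_gbps = 5.1
bandwidth_gbs = bus_width_bits / 8 * data_rate_gbps
print(f"Per-card bandwidth at {data_rate_gbps} Gbps/pin: ~{bandwidth_gbs:.0f} GB/s")
```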

With the new H100 NVL, the performance gap between the H100 PCIe version and the H100 SXM version narrows, as the card features a boosted, configurable TDP of up to 400 Watts per card. The H100 NVL uses the same Tensor and CUDA core configuration as the SXM edition, except it sits in a PCIe slot and is connected to a second card. Being sold in pairs, OEMs can outfit their systems with either two or four pairs per certified system. You can see the specification table below, with information compiled by AnandTech. As NVIDIA says, the reason for this special edition SKU is the emergence of Large Language Models (LLMs) that require significant computational power to run. "Servers equipped with H100 NVL GPUs increase GPT-175B model performance up to 12X over NVIDIA DGX A100 systems while maintaining low latency in power-constrained data center environments," noted the company.

Jensen Confirms: NVLink Support in Ada Lovelace is Gone

NVIDIA CEO Jensen Huang, in a call with the press today, confirmed that Ada loses the NVLink connector. This marks the end of any possibility of explicit multi-GPU over a separate physical interface, and the complete demise of SLI. Jensen stated that the reason behind removing the NVLink connector was that they needed the I/O for "something else," and decided against spending the resources to wire out an NVLink interface. NVIDIA's engineers also wanted to make the most of the silicon area at their disposal to "cram in as much AI processing as we could." Jen-Hsun continued: "and also, because Ada is based on Gen 5, PCIe Gen 5, we now have the ability to do peer-to-peer cross-Gen 5 that's sufficiently fast that it was a better tradeoff." We reached out to NVIDIA to confirm, and their answer is:
NVIDIA: Ada does not support PCIe Gen 5, but the Gen 5 power connector is included.

PCIe Gen 4 provides plenty of bandwidth for graphics usages today, so we felt it wasn't necessary to implement Gen 5 for this generation of graphics cards. The large framebuffers and large L2 caches of Ada GPUs also reduce utilization of the PCIe interface.
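To put NVIDIA's bandwidth claim in perspective, here is a minimal sketch of the theoretical one-way x16 link bandwidth per PCIe generation, assuming the standard 128b/130b encoding used since Gen 3:

```python
# Theoretical one-way PCIe x16 bandwidth per generation (128b/130b encoding).
LANES = 16

def x16_bandwidth_gbs(transfer_rate_gt: float) -> float:
    """Raw GT/s per lane -> usable GB/s across an x16 link."""
    return transfer_rate_gt * LANES * (128 / 130) / 8

for gen, rate in {"Gen 3": 8.0, "Gen 4": 16.0, "Gen 5": 32.0}.items():
    print(f"{gen}: ~{x16_bandwidth_gbs(rate):.1f} GB/s")  # ~15.8 / ~31.5 / ~63.0
```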

AMD Radeon RX 6900 XT Scores Top Spot in 3DMark Fire Strike Hall of Fame with 3.1 GHz Overclock

3DMark Fire Strike Hall of Fame is where overclockers submit their best hardware benchmark runs and try to beat the very best. For years, one record managed to hold, and today it finally fell. Thanks to "biso biso," an extreme overclocker from South Korea and a member of the EVGA OC team, the top spot now belongs to the AMD Radeon RX 6900 XT graphics card. The previous 3DMark Fire Strike world record was set on April 22nd, 2020, when Vince Lucido, also known as K|NGP|N, set a record with four-way SLI of NVIDIA GeForce GTX 1080 Ti GPUs. However, that record is old news as of January 27th, when biso biso made history with the AMD Radeon RX 6900 XT GPU.

The overclocker scored 62,389 points, just 1,183 more than the previous record. He pushed the Navi 21 XTX silicon that powers the Radeon RX 6900 XT card to an impressive 3,147 MHz. Paired with a GPU memory clock of 2,370 MHz, the GPU was probably LN2-cooled to achieve these results. The overclocker used EVGA's Z690 DARK KINGPIN motherboard with an Intel Core i9-12900K processor as the platform of choice to achieve this record. You can check it out on the 3DMark Fire Strike Hall of Fame website to see for yourself.

COLORFUL Launches the First GPU History Museum

Colorful Technology Company Limited, a professional manufacturer of graphics cards, motherboards, all-in-one gaming and multimedia solutions, and high-performance storage, announces the launch of the GPU History Museum in partnership with NVIDIA. COLORFUL has recently relocated to Shenzhen New Generation Industrial Park. With that, COLORFUL is proud to announce the launch of the first GPU History Museum in China. The museum will showcase everything from the beginnings of the Graphics Processing Unit (GPU) to the development and evolution of graphics cards up to the present generation.

NVIDIA Will Stop Creating SLI Driver Profiles After January 2021

NVIDIA has been limiting SLI support recently, with only the RTX 3090 featuring the feature, and even then only through modern APIs such as DirectX 12 and Vulkan, meaning that games must explicitly support multi-GPU for SLI to work. NVIDIA will no longer add new SLI driver profiles for RTX 20 Series and earlier GPUs starting January 1st, 2021. The only way to use SLI going forward will be through native game integrations, which NVIDIA will focus on helping developers provide. NVIDIA also noted that various DirectX 12 and Vulkan games already feature native integrations, such as Shadow of the Tomb Raider, Civilization VI, Sniper Elite 4, Gears of War 4, and Red Dead Redemption 2. Creative and other non-gaming applications that support multi-GPU acceleration will continue to function across all supported GPUs.

NVIDIA Reserves NVLink Support For The RTX 3090

NVIDIA has issued another major blow to multi-GPU gaming with their recent RTX 30 series announcement. The only card to support NVLink SLI in this latest generation will be the RTX 3090, and it will require a new NVLink bridge which costs 79 USD. By comparison, NVIDIA offered NVLink support on the RTX 2070 Super, RTX 2080 Super, RTX 2080, and RTX 2080 Ti in their Turing range of graphics cards. The AMD CrossFire multi-GPU solution has also become irrelevant after support for it was dropped with RDNA. Developer support for the feature has likewise declined due to the high cost of implementation, small user base, and often poor performance improvements. With the NVIDIA RTX 3090 set to retail for 1499 USD, the cost of a multi-GPU setup will exceed 3077 USD, reserving the feature for only the wealthiest of gamers.
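A quick sketch of the arithmetic behind that figure, using the prices quoted above:

```python
# Minimum cost of an RTX 3090 NVLink SLI setup, from the prices in this article.
rtx_3090_usd = 1499
nvlink_bridge_usd = 79

total_usd = 2 * rtx_3090_usd + nvlink_bridge_usd
print(f"Two RTX 3090s + bridge: {total_usd} USD")  # 3077 USD, before the rest of the PC
```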

NVIDIA Releases GeForce Software 442.19, Game-ready for Zombie Army: Dead War 4

NVIDIA today posted the latest version of its GeForce software. Version 442.19 WHQL comes game-ready for "Zombie Army: Dead War 4," "Apex Legends Season 4," and "Metro Exodus: Sam's Story." With these drivers, NVIDIA improved the maximum frame-rate setting, with limits between 20 and 1,000 FPS. Find this setting in "NVIDIA Control Panel > 3D Settings > Maximum Frame Rate control." NVIDIA also introduced VRSS (VR super-sampling), an image quality enhancement that applies super-sampling selectively to the central region of a frame.

Among the bug fixes are a game crash during a cutscene in "The Witcher 3: Wild Hunt - Blood and Wine"; SETI@Home experiencing a driver crash (TDR) on "Maxwell" GPUs; random stoppage of OBS streaming with "Call of Duty: Modern Warfare"; BattlEye with NVIDIA low-latency mode causing a DWM crash and recovery; certain SLI+G-Sync stutter issues; a "Doom" (2016) game crash with "Kepler" GPUs; and an NVENC memory leak. Grab the drivers from the link below.

DOWNLOAD: NVIDIA GeForce Software v442.19 | Don't forget to check out TechPowerUp NVCleanstall 1.3.0

NVIDIA Develops Tile-based Multi-GPU Rendering Technique Called CFR

NVIDIA remains invested in the development of multi-GPU technology, specifically SLI over NVLink, and has developed a new multi-GPU rendering technique that appears to be inspired by tile-based rendering. Implemented at the single-GPU level, tile-based rendering has been one of NVIDIA's many secret sauces that improved performance since its "Maxwell" family of GPUs. 3DCenter.org discovered that NVIDIA is working on a multi-GPU adaptation of it, called CFR, which could be short for "checkerboard frame rendering" or "checkered frame rendering." The method is already secretly deployed in current NVIDIA drivers, although not documented for developers to implement.

In CFR, the frame is divided into tiny square tiles, like a checkerboard. Odd-numbered tiles are rendered by one GPU, and even-numbered ones by the other. Unlike AFR (alternate frame rendering), in which each GPU's dedicated memory holds a copy of all of the resources needed to render the frame, methods like CFR and SFR (split frame rendering) optimize resource allocation. CFR also purportedly exhibits less micro-stutter than AFR. 3DCenter also detailed the features and requirements of CFR. To begin with, the method is only compatible with DirectX (including DirectX 12, 11, and 10), not OpenGL or Vulkan. For now it's "Turing"-exclusive, since NVLink is required (probably because its bandwidth is needed to virtualize the tile buffer). Tools like NVIDIA Profile Inspector allow you to force CFR on, provided the other hardware and API requirements are met. It still has many compatibility problems, and remains practically undocumented by NVIDIA.
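As an illustration of the scheme described above, here is a minimal sketch of checkerboard tile assignment between two GPUs; the 64-pixel tile size is an arbitrary assumption, since NVIDIA has not documented the real dimensions:

```python
# Conceptual checkerboard (CFR-style) tile split across two GPUs.
from math import ceil

def assign_tiles(width: int, height: int, tile: int = 64) -> dict[int, list[tuple[int, int]]]:
    """Return {gpu_index: [(tile_x, tile_y), ...]} in a checkerboard pattern."""
    assignment = {0: [], 1: []}
    for ty in range(ceil(height / tile)):
        for tx in range(ceil(width / tile)):
            # Alternate tiles between the two GPUs, checkerboard-fashion.
            assignment[(tx + ty) % 2].append((tx, ty))
    return assignment

tiles = assign_tiles(3840, 2160)
print(len(tiles[0]), len(tiles[1]))  # each GPU renders ~half the tiles
```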

NVIDIA Releases GeForce 441.20 WHQL Drivers

NVIDIA today posted GeForce 441.20 WHQL software. The drivers come game-ready for "Star Wars Jedi: Fallen Order" and "Stormland," and include SLI support for the latest Star Wars game. NVIDIA also introduced support for the CUDA 10.2 compute API. The list of G-Sync compatible monitors has been expanded with three more displays. Among the issues addressed with this release are geometry corruption on GTX 900-series "Maxwell" GPUs with "Red Dead Redemption" in Vulkan, and G-Sync getting disengaged when V-Sync is disabled in "Red Dead Redemption." Vulkan-related errors with "The Surge" have also been fixed. There's also a surprising bug-fix: "Quake 3 Arena" appearing washed-out when color-depth is set to 16-bit. Performance drops with "CS: GO" have also been addressed. Grab the driver from the link below.
DOWNLOAD: NVIDIA GeForce 441.20 WHQL Drivers

AMD CEO Lisa Su: "CrossFire Isn't a Significant Focus"

AMD CEO Lisa Su answered some questions from the attending press at the Hot Chips conference. One of these regarded AMD's stance on CrossFire and whether or not it remains a focus for the company. CrossFire was once the poster child for a scalable consumer graphics future, with AMD even going as far as enabling mixed-GPU support (with debatable merits). Now, Lisa Su came out and said what we have all been seeing happening in the background: "To be honest, the software is going faster than the hardware. I would say that CrossFire isn't a significant focus."

There isn't anything really new here; we've all seen the consumer GPU trends of late, with CrossFire barely deserving a mention (and the NVIDIA camp doing the same for their SLI technology, which has been cut from all but the higher-tier graphics cards). Support seems to be enabled as more of an afterthought than a "focus," and that's just the way things are. It seems that the old practice of buying a lower-tier GPU at launch and then adding a second graphics processor further down the line to leapfrog the performance of higher-performance single-GPU solutions is going the way of the proverbial dodo - at least until an MCM (Multi-Chip-Module) approach sees the light of day, paired with a hardware syncing solution that does away with the software side of things. A true, integrated, software-blind multi-GPU solution comprised of two or more dies, each smaller than a single monolithic solution, seems to be the way to go. We'll see.

NVIDIA RTX 2060 Super and RTX 2070 Super Chips Come in Three Variants Each. Flashing Possible?

While working on GPU-Z support for NVIDIA's new GeForce RTX Super cards, I noticed something curious. Each of the RTX 2060 Super and RTX 2070 Super is listed with three independent device IDs in the driver: 1F06, 1F42, and 1F47 for the former, and 1E84, 1EC2, and 1EC7 for the latter. The GeForce RTX 2080 Super, on the other hand, like nearly every other NVIDIA SKU, uses only a single device ID (1E81). The PCI device ID uniquely identifies every GPU model, so the OS and driver can figure out what kind of device it is, what driver to use, and how to talk to it. I reached out to NVIDIA for clarification, and never heard back from them besides an "interesting, I'll check internally" comment.
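For readers unfamiliar with PCI enumeration, here is a minimal sketch of how that matching works, using the device IDs listed above (the mapping of each ID to a marketing name is mine, purely for illustration):

```python
# PCI IDs observed in the driver for the RTX Super cards (vendor 0x10DE = NVIDIA).
NVIDIA_VENDOR_ID = 0x10DE

DEVICE_IDS = {
    0x1F06: "GeForce RTX 2060 Super",
    0x1F42: "GeForce RTX 2060 Super",
    0x1F47: "GeForce RTX 2060 Super",
    0x1E84: "GeForce RTX 2070 Super",
    0x1EC2: "GeForce RTX 2070 Super",
    0x1EC7: "GeForce RTX 2070 Super",
    0x1E81: "GeForce RTX 2080 Super",  # single ID, like most NVIDIA SKUs
}

def identify(vendor_id: int, device_id: int) -> str:
    """Roughly what the OS/driver does: match the vendor, then the device ID."""
    if vendor_id != NVIDIA_VENDOR_ID:
        return "not an NVIDIA device"
    return DEVICE_IDS.get(device_id, "unknown NVIDIA device")

print(identify(0x10DE, 0x1F42))  # GeForce RTX 2060 Super
```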

With no official word, I took a closer look at the actual values and remembered our NVIDIA segregates Turing GPUs article, which was part of the launch coverage for the initial GeForce RTX unveil. In that article, we revealed that NVIDIA creates two models of each GPU that are identical in every regard except for name and price. If board partners want to build a factory-overclocked card, they have to buy the -A variant of the GPU, because only that variant is allowed to ship with an out-of-the-box overclock. Manual overclocking by users works exactly the same on both units.

AMD Radeon RX 5700 "Navi" Graphics Cards Lack CrossFire Support

In a comical turn of events, while NVIDIA restored NVLink SLI support for its RTX 2070 Super graphics card by virtue of it being based on the "TU104" silicon, AMD did the opposite with "Navi." The Radeon RX 5700 XT and RX 5700 lack support for AMD CrossFire. If you put two of these cards on, say, a swanky X570 motherboard that splits PCIe gen 4.0 into two x8 slots with bandwidth comparable to PCIe gen 3.0 x16, you won't see an option to enable CrossFire. AMD, responding to our question on CrossFire compatibility, clarified that it dropped CrossFire support for "Navi" in favor of the DirectX 12 and Vulkan "explicit" multi-GPU mode. The older "implicit" multi-GPU mode - CrossFire - used by DirectX 11, DirectX 9, and OpenGL games is not supported. The AMD statement follows.
Radeon RX 5700 Series GPU's support CrossFire in 'Explicit' multi-GPU mode when running a DX12 or Vulkan game that supports multiple GPU's. The older 'implicit' mode used by legacy DX9/11/OpenGL titles is not supported.

GIGABYTE Intros X299-WU8 Motherboard Capable of 4x PCIe x16

GIGABYTE introduced the X299-WU8, a high-end desktop motherboard sold as a quasi-workstation-class board in the CEB form-factor (305 mm x 267 mm). Based on the Intel X299 Express chipset, it features out-of-the-box support for Intel's socket LGA2066 Core X 9000-series processors, in addition to the existing Core X 7000-series. A design focus with this board is PCIe connectivity. The board employs two PLX PEX8747 PCIe gen 3.0 x48 bridge chips, which convert two gen 3.0 x16 links from the processor into four downstream x16 links, which can further be switched to x8. All seven expansion slots are PCI-Express 3.0 x16 physically, and are electrically either "x16/NC/x16/NC/x16/NC/x16" or "x16/x8/x8/x8/x8/x8/x8." The topmost slot stays x16, while the other six share three x16 links depending on how you populate them. The board has certifications for 4-way SLI and CrossFireX.
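Here is a minimal sketch of that slot-switching logic as I read it from the two electrical layouts above; the exact pairing of slots is my assumption, since GIGABYTE's manual would be the authoritative source:

```python
# Simplified model of the X299-WU8's electrical lane assignment.
# Slot 1 always gets a full x16 link; slots (2,3), (4,5), (6,7) are assumed
# to share one switchable x16 link per pair.
def electrical_layout(populated: set[int]) -> dict[int, str]:
    """Return per-slot electrical width for a set of populated slots (1-7)."""
    layout = {1: "x16"}
    for a, b in [(2, 3), (4, 5), (6, 7)]:
        if a in populated and b in populated:
            layout[a], layout[b] = "x8", "x8"   # link split between both slots
        elif a in populated:
            layout[a], layout[b] = "x16", "NC"
        else:
            layout[a], layout[b] = "NC", "x16" if b in populated else "NC"
    return layout

# Four GPUs in slots 1, 3, 5, 7 -> "x16/NC/x16/NC/x16/NC/x16"
print(electrical_layout({1, 3, 5, 7}))
```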

The GIGABYTE X299-WU8 draws power from a combination of one 24-pin ATX and two 8-pin EPS connectors, conditioning it for the CPU with an 8+1 phase VRM. An optional 6-pin PCIe power input stabilizes slot power delivery to the graphics cards. The CPU socket is flanked by eight DDR4 DIMM slots, supporting up to 128 GB of quad-channel DDR4 memory. Storage connectivity is surprisingly sparse, with just one M.2-2280 slot with PCIe 3.0 x4 wiring, and eight SATA 6 Gbps ports. Other connectivity includes USB 3.1 gen 2 (including a type-C port) and a number of USB 3.1 gen 1 ports, both on the rear panel and via headers; high-end onboard audio with an ALC1220 CODEC and headphones amp; and two 1 GbE networking interfaces. Expect this board to be priced around $600, given that the PEX8747 isn't cheap these days, and this board has two of them.

G-Sync and SLI Don't Get Along: Framerate Falls If You Combine Them

Several users have complained lately about performance issues on their SLI systems, and after some discussions in the NVIDIA subreddit and on NVIDIA's forums, the conclusion seems clear: performance drops when SLI and G-Sync work together. The folks over at ExtremeTech have done a good job exploring the issue and have confirmed that the frame rate falls if both features are enabled on the PC. According to their data, the problem seems related to timing, but there's no clear solution yet.

The problems are huge in titles such as Rising Storm 2 (206 vs. 75 fps with SLI and G-Sync enabled) and less significant in others like The Witcher 3 (113 vs. 98 fps). The test setup included two GTX 1080 GPUs and an Acer XB280HK monitor that supports 4K, but only at 60 Hz - a perfect choice to detect whether the problem was real or not. Their tests with several games confirmed the problem, but didn't find a defined pattern: "Turning G-Sync on and using SLI is not guaranteed to tank your frame rate. [...] Different games showed three different performance models". In Deus Ex: Mankind Divided the gap appeared only in DX11 mode. In Far Cry 5, the penalty grows as the frame rate rises, and in Hitman the results were even more confusing.
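To quantify the two data points quoted above, a quick sketch of the relative frame-rate loss:

```python
# Frame-rate penalty from enabling G-Sync alongside SLI, per ExtremeTech's numbers.
def drop_pct(sli_only_fps: float, sli_gsync_fps: float) -> float:
    return (sli_only_fps - sli_gsync_fps) / sli_only_fps * 100

print(f"Rising Storm 2: {drop_pct(206, 75):.0f}% slower")  # ~64%
print(f"The Witcher 3:  {drop_pct(113, 98):.0f}% slower")  # ~13%
```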

KFA2 Announces Its RTX 2080 Ti OC, RTX 2080 OC Graphics Cards

KFA2 has just announced its iteration of NVIDIA's RTX 20-series graphics cards, namely the RTX 2080 Ti OC and RTX 2080 OC. For now, there seem to be no blower-style coolers available - both cards carry dual-fan cooling solutions, much like NVIDIA's redesigned reference graphics cards. Both cards feature a twin 90 mm fan design with RGB lighting on their 11 fan blades, built with a three-phase, 6-slot design. The cooling solution is a two-slot one with vapor chamber technology, and KFA2 is touting its new "Xrystallic" design, which adds a transparent, glossy material surface to the shroud.

GIGABYTE Z390 Aorus Elite Pictured, New Round of Branding Chaos Incoming

GIGABYTE is ready with its Z390 Aorus Elite motherboard. The name might suggest that it's the company's flagship product based on the Z390 chipset, but in reality it's part of the company's new nomenclature that differentiates boards across price-points. The current "Gaming 3, Gaming 5, and Gaming 7" series is going away. The cheaper "Gaming 3" extension will be replaced by "Elite," and the slightly divergent "Ultra Gaming" extension by "PRO." The mid-tier "Gaming 5" extension, on the other hand, is replaced by "Ultra." The high-end "Gaming 7" tier is replaced by "Master," and the flagship "Gaming 9" extension by "XTREME." We've already seen one of these with the recent Aorus X399 XTREME TR4 motherboard.

Moving on to the product at hand, the Z390 Aorus Elite is a somewhat entry-tier product that's probably priced around the $150 mark, owing to the high chipset price and the higher CPU VRM requirements the platform mandates. We already count a 13-something-phase VRM (which could include phase-doubling). The "Gaming 3" extension historically lacked NVIDIA SLI support, and that carries over to the Z390 Aorus Elite: the second PCIe x16 slot on this board is electrically x4, and wired to the PCH. Storage connectivity, besides six SATA ports, includes two M.2 slots with x4 wiring, one of which includes a heatsink. GIGABYTE boards are known for good onboard audio implementations, and this board is no exception. It appears to have WIMA capacitors, ground-layer isolation, and an EMI shield over the CODEC, which very likely is an ALC1220. USB 3.1 connectivity, including a type-C port, and a 1 GbE interface driven by an i219-V could make up the rest of it. There's minimal RGB bling on board, but you get 2-3 aRGB headers.

Maxsun Teases Next-gen NVIDIA GeForce Product at ChinaJoy

Maxsun may not be much of a household name in territories outside China, as its western market penetration is among the lowest of NVIDIA's partners. That said, the company does enjoy NVIDIA AIB status, so they're privy to details on next-gen products - especially when those are, allegedly, so close to a reveal and launch.

Maxsun showcased their take on NVIDIA's next-generation products (you can find the ending of that very word in the photo of the presentation slide) with a 3D render of what could very well be the finalized looks of their next-gen graphics card - part of their premium iCraft segment. The ubiquitous RGB is there, as always, pandering to the majority of users' lighting requirements. The graphics card presents a dual-slot, triple-fan solution, and there doesn't seem to be a DVI connector, nor an SLI termination, for that matter. The card also seems to have a single 8-pin power connector, and the GeForce branding is clear. Sadly, the render doesn't specify the model it pertains to - it would be great to finally have some closure on the 1100 vs. 2000 series naming debate.

NVIDIA GeForce GTX 1180 Bare PCB Pictured

Here are some of the first pictures of the bare printed circuit board (PCB) of NVIDIA's upcoming GeForce GTX 1180 graphics card (dubbed PG180), referred to by the person who originally posted them as "GTX 2080" (it seems the jury is still out on the nomenclature). The PCB looks hot off the press, with its SMT points and vias still exposed. The GT104 GPU traces hint at a package that's about the size of a GP104 or its predecessors. It's wired to eight memory chips on three sides, confirming a 256-bit wide memory bus. Display outputs appear flexible, allowing either 2x DisplayPort + 2x HDMI, or 3x DisplayPort + 1x HDMI configurations.

The VRM setup is surprisingly powerful for a card that's supposed to succeed the ~180 W GeForce GTX 1080, which makes do with a single 8-pin PCIe power input. This card draws power from a combination of 6-pin and 8-pin PCIe power connectors. There is a purportedly 10-phase VCore side, which in all likelihood is a 5-phase setup with "dumb" phase-doubling; and similarly, a 2-phase memory power side (which could again be a doubled single phase). The SLI-HB fingers also make way for a new connector that looks like a single SLI finger and an NVLink finger arranged side-by-side, so NVIDIA still hasn't given up on multi-GPU. NVLink is a very broad interconnect in terms of bandwidth. NVIDIA probably needs that for multi-GPU setups to work not just with high resolutions (4K, 5K, or even 8K), but also higher bit-depths, higher refresh-rates, HDR, and other exotic data. The reverse side doesn't have much action other than traces for the VRM controllers, phase doublers, and an unusually large bank of SMT capacitors (the kind seen on AMD PCBs with MCM GPUs).
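To see why bandwidth matters here, a rough sketch of how much data a multi-GPU setup has to move per second just to shuttle completed frames between cards; the 30-bit HDR pixel format and the refresh rates are illustrative assumptions:

```python
# Rough frame-transfer bandwidth needed to move finished frames between GPUs.
def frame_traffic_gbs(width: int, height: int, bits_per_pixel: int, fps: int) -> float:
    return width * height * bits_per_pixel / 8 * fps / 1e9

# Illustrative scenarios: 10-bit-per-channel RGB (30 bpp) HDR output.
print(f"4K @ 120 Hz: ~{frame_traffic_gbs(3840, 2160, 30, 120):.1f} GB/s")  # ~3.7 GB/s
print(f"8K @  60 Hz: ~{frame_traffic_gbs(7680, 4320, 30, 60):.1f} GB/s")   # ~7.5 GB/s
```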

An Anthem for SLI: BioWare's New Universe in 60 FPS 4K Run on Two NVIDIA GeForce GTX 1080 Ti GPUs

Lo and behold: SLI working properly. This was my first reaction while reading up on this potential news piece (which, somewhat breaking the fourth wall, actually did end up as one). My reaction likely isn't unique; it's been a while since we heard of any relevant dual graphics card configuration and performance improvement, as developers seem to be throwing dreams of any "explicit multi-GPU" tech out the window. This slight deviation from the news story aside, though: Anthem needed two of the world's fastest GPUs running in tandem to deliver a 4K, 60 FPS experience.

NVIDIA GV102 Prototype Board With GDDR6 Spotted, Up to 525 W Power Delivery. GTX 1180 Ti?

Reddit user 'dustinbrooks' has posted a photo of a prototype graphics card design that is clearly made by NVIDIA and "tested by a buddy of his that works for a company that tests NVIDIA boards". Dustin asked the community what he was looking at, which of course got tech enthusiasts interested.

The card is clearly made by NVIDIA, as indicated by the markings near the PCI-Express x16 slot connector. Also visible are three PCI-Express 8-pin power inputs and a huge VRM setup with four fans. Unfortunately the GPU in the center of the board is missing, but it should be GV102, the successor to GP102, since GDDR6 support is needed. The twelve GDDR6 memory chips located around the GPU's solder balls are marked D9WCW, which decodes to MT61K256M32JE-14:A. These are Micron-made 8 Gbit GDDR6 chips, specified for 14 Gb/s data rate and operating at 1.35 V. With twelve chips, this board has a 384-bit memory bus and 12 GB of VRAM. The memory bandwidth at 14 Gbps data rate is a staggering 672 GB/s, which conclusively beats the 484 GB/s that Vega 64 and the GTX 1080 Ti offer.
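The bus width, capacity, and bandwidth all follow directly from the chip markings; a minimal sketch of the math:

```python
# Memory math from the Micron D9WCW (MT61K256M32JE-14:A) markings.
chips = 12
io_width_bits = 32      # the "M32" in the part number: 32-bit interface per chip
density_gbit = 8        # 256M x 32 = 8 Gbit per chip
data_rate_gbps = 14     # the "-14" speed grade

bus_width = chips * io_width_bits                # 384-bit
capacity_gb = chips * density_gbit // 8          # 12 GB
bandwidth_gbs = bus_width / 8 * data_rate_gbps   # 672 GB/s

print(bus_width, capacity_gb, bandwidth_gbs)
```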

Intel Expands 8th Gen. Core Desktop Processor Family, Introduces New Chipsets

Intel today expanded its 8th generation Core desktop processor family to include xx new models across its Core i7, Core i5, and Core i3 brand extensions. The company also introduced entry-level Pentium Gold and Celeron processors. The chips are based on the 14 nm "Coffee Lake" silicon, and are compatible with socket LGA1151 motherboards based on Intel 300-series chipsets. Intel has relegated dual-core to the Celeron and Pentium Gold brands. The Celeron series includes 2-core/2-thread chips with 3 MB of L3 cache, while the Pentium Gold series includes 2-core/4-thread chips with 4 MB of L3 cache.

The company is launching the 8th generation Celeron series with two models, the G4900 and the G4920, clocked at 3.10 GHz, and 3.20 GHz, respectively. The Pentium Gold family has three parts, the G5400, the G5500, and the G5600, clocked at 3.70 GHz, 3.80 GHz, and 3.90 GHz, respectively. The 8th generation Core i3 family of 4-core/4-thread parts receives a new member, the i3-8300. Endowed with 8 MB of L3 cache, this chip is clocked at 3.70 GHz, and sits between the i3-8100 and the i3-8350K, but lacks the unlocked multiplier of the latter.

NVIDIA Releases GeForce 391.35 WHQL Game Ready Drivers

NVIDIA today released the latest version of its GeForce software suite. Version 391.35 WHQL drivers are game-ready for "Far Cry 5," the week's biggest AAA game release. This includes game optimization, an SLI profile, and GeForce Experience optimal settings. The drivers also add SLI profiles for "WRC 7" and "GRIP."

The drivers also introduce security updates for a staggering seven CVEs, mostly relating to vulnerabilities discovered in the kernel-mode layers of the driver, and in user-mode drivers for DirectX and OpenGL. NVIDIA discovered that certain specially crafted pixel shaders can cause infinite loops and denial of service (system hang), or escalation of privileges. Besides these, the drivers patch a memory leak found in NVIDIA Freestyle (custom shader tool), bugs with 3D Profile Manager, and a game freeze noticed in "Diablo III" when rapidly switching tasks on SLI machines. Grab the drivers from the links below.
DOWNLOAD: NVIDIA GeForce 391.35 WHQL

The change-log follows.

ASUS ROG Crosshair VII Hero Motherboard Packaging Teased

Leaks surrounding motherboards based on AMD's upcoming X470 chipset, which will accompany its first wave of Ryzen 2000-series "Pinnacle Ridge" processors, are on the rise. We now have a fairly clear picture of the retail packaging of the ASUS Republic of Gamers (ROG) Crosshair VII Hero, specifically its "WiFi" sub-variant that includes a WLAN card. This board surfaced in benchmark database listings. There's no product image on the box, but the logos make things pretty cut and dried - AMD X470 chipset, socket AM4, support for NVIDIA SLI and AMD CrossFire, ASUS Aura Sync RGB LED management, and an unchanged Ryzen logo. We have confirmation of ASUS readying at least two ROG motherboards based on the X470 so far - the Crosshair VII Hero, and the Strix X470-F. It remains to be seen if the chipset also gets the company's coveted ROG "Extreme" treatment.

Colorful Announces iGame SLI HB Bridge

Colorful Technology Company Limited, a professional manufacturer of graphics cards, motherboards, and high-performance storage solutions, is proud to announce its iGame SLI HB Bridge, which fits perfectly with the aesthetics of COLORFUL graphics cards supporting SLI. The new iGame SLI HB bridge supports dual-link SLI mode, which improves performance for graphics cards that support NVIDIA SLI, including the latest NVIDIA 10-series graphics cards.

The new COLORFUL iGame SLI HB Bridge improves SLI performance when used with compatible NVIDIA 10-series Pascal GPUs that support High-Bandwidth SLI, by delivering double the bandwidth of traditional SLI bridges. The iGame SLI HB Bridge also supports traditional SLI-compatible graphics cards from NVIDIA. This increases overall performance from both GPUs and provides a smoother experience when playing demanding games that properly utilize NVIDIA SLI. Not only do you get improved overall performance at higher resolutions, you also get higher framerates, which complement today's high-performance monitors.

Easily Build a Quiet and Gorgeous System with NZXT's N7 Z370 Motherboard

NZXT today announces the N7 Z370, its first motherboard. Built around Intel's Z370 chipset, the N7 delivers everything you need to build a powerful, gorgeous gaming PC right out-of-the-box. All the essentials are included, along with a built-in digital fan controller and integrated RGB lighting channels. The all-metal motherboard cover perfectly matches the color and finish of your case, creating a visually seamless backdrop for your components. Using NZXT's CAM software, you have full control over your system's lighting, cooling, and performance.

"Leveraging our years of experience as professional PC builders, we've designed the N7 motherboard with a completely new approach. In the face of increasing complexity in the PC gaming market, we want to make building easier and more enjoyable, with the N7 as a cornerstone for this new experience. Everything you need--from easy layout and obvious connections to digital fan control and RGB lighting--is included. We've also designed a completely unique motherboard cover so it's both beautiful and powerful. You can't build a quieter, better-looking system as easily as you can with our new N7 motherboard." says Johnny Hou, NZXT's founder, and CEO.