News Posts matching #Turing


NVIDIA GeForce GTX 1650 Released: TU117, 896 Cores, 4 GB GDDR5, $150

NVIDIA today rolled out the GeForce GTX 1650 graphics card at USD $149.99. Like its other GeForce GTX 16-series siblings, the GTX 1650 is derived from the "Turing" architecture, but without RTX real-time raytracing hardware such as RT cores or tensor cores. The GTX 1650 is based on the 12 nm "TU117" silicon, the smallest implementation of "Turing." Measuring 200 mm² in die area, the TU117 crams in 4.7 billion transistors. It is equipped with 896 CUDA cores, 56 TMUs, 32 ROPs, and a 128-bit wide GDDR5 memory interface, holding 4 GB of memory clocked at 8 Gbps (128 GB/s bandwidth). The GPU is clocked at 1485 MHz, with a GPU Boost frequency of 1665 MHz.
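The 128 GB/s figure follows directly from the data rate and bus width; a quick sanity check in Python (the helper function is ours, purely illustrative):

```python
# Peak bandwidth (GB/s) = per-pin data rate (Gbps) x bus width (bits) / 8 bits-per-byte
def memory_bandwidth_gbps(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

# GTX 1650: 8 Gbps GDDR5 on a 128-bit bus
print(memory_bandwidth_gbps(8, 128))  # → 128.0, matching the quoted spec
```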

The GeForce GTX 1650, at its given price, is positioned competitively against AMD's Radeon RX 570 4 GB. NVIDIA has been surprisingly low-key about this launch, leaving it to its partners not only to drive the launch, but also to sample reviewers. NVIDIA provided no pre-launch reviewer drivers, and hence we don't have a launch-day review for you yet. We do have GTX 1650 graphics cards in hand, namely the Palit GTX 1650 StormX, MSI GTX 1650 Gaming X, and ASUS ROG GTX 1650 Strix OC.

Update: Catch our reviews of the ASUS ROG Strix GTX 1650 OC and MSI GTX 1650 Gaming X

Colorful Announces GeForce GTX 1650 4GB Ultra Graphics Card

Colorful Technology Company Limited, professional manufacturer of graphics cards, motherboards and high-performance storage solutions, is proud to announce the launch of its latest graphics card for the entry-level gaming market. The COLORFUL iGame GeForce GTX 1650 Ultra 4G brings NVIDIA's Turing graphics architecture to the masses, and COLORFUL brings the best out of the GPU thanks to its years of experience working with gamers.

For new gamers and those upgrading from integrated graphics who want a taste of what's to come, the COLORFUL iGame GeForce GTX 1650 Ultra 4G is a prime choice to start with. It features the latest NVIDIA GPU technology: the 12 nm Turing architecture, which brings with it the best of GeForce, including GeForce Experience, NVIDIA Ansel, G-Sync and G-Sync Compatible monitor support, Game Ready Drivers, and much more. COLORFUL has given the iGame GTX 1650 Ultra 4G a performance boost via a One-key OC button, so you can get extra performance without tinkering.

NVIDIA GeForce GTX 1650 Specifications and Price Revealed

NVIDIA is releasing its most affordable graphics card based on the "Turing" architecture, the GeForce GTX 1650, on the 23rd of April, starting at USD $149. There doesn't appear to be a reference-design (the GTX 1660 series lacked one, too), and so this GPU will be a partner-driven launch. Based on NVIDIA's smallest "Turing" silicon, the 12 nm "TU117," the GTX 1650 will pack 896 CUDA cores and will feature 4 GB of GDDR5 memory across a 128-bit wide memory interface.

The GPU is clocked at 1485 MHz with 1665 MHz GPU Boost, and the 8 Gbps memory produces 128 GB/s of memory bandwidth. With a TDP of just 75 Watts, most GTX 1650 cards will lack additional PCIe power inputs, relying entirely on the slot for power. Most entry-level implementations of the GTX 1650 feature very simple aluminium fan-heatsink coolers. VideoCardz compiled a number of leaked pictures of upcoming GTX 1650 graphics cards.

NVIDIA Also Releases Tech Demos for RTX: Star Wars, Atomic Heart, Justice Available for Download

We've seen NVIDIA's move to provide RTX effects on older, non-RT-capable hardware today being met with what the company was certainly expecting: a cry of dismay from users who now get to see exactly what their non-Turing NVIDIA hardware is capable of. The move from NVIDIA could be framed as a way to democratize access to RTX effects via Windows DXR, enabling users of its GTX 16-series and 10-series GPUs to take a look at the benefits of raytracing; but also as an upgrade incentive for those who now see how their performance lags without the new specialized Turing cores to handle the added burden.

Whatever your side of the fence on that issue, however, NVIDIA has provided users with one more raytraced joy today. Three of them, in fact, in the form of three previously-shown tech demos. The Star Wars tech demo (download) is certainly the best known, with its studies of reflections on Captain Phasma's breastplate. Atomic Heart (download) is another that makes use of RTX for reflections and shadows, while Justice (download) adds caustics to that equation. If you have a Turing graphics card, you can test these demos in their full glory, with added DLSS for improved performance. If you're on Pascal, you won't have that performance-enhancing mode available, and will have to slog through software computations. Follow the embedded links for our direct downloads of these tech demos.

NVIDIA Extends DirectX Raytracing (DXR) Support to Many GeForce GTX GPUs

NVIDIA today announced that it is extending DXR (DirectX Raytracing) support to several GeForce GTX graphics models beyond its GeForce RTX series. These include the GTX 1660 Ti, GTX 1660, GTX 1080 Ti, GTX 1080, GTX 1070 Ti, GTX 1070, and GTX 1060 6 GB. The GTX 1060 3 GB and lower "Pascal" models don't support DXR, nor do older generations of NVIDIA GPUs. NVIDIA has implemented real-time raytracing on GPUs without specialized components such as RT cores or tensor cores, by essentially implementing the rendering path through shaders, in this case, CUDA cores. DXR support will be added through a new GeForce graphics driver later today.

The GPU's CUDA cores now have to handle BVH traversal, intersection, reflection, and refraction. The GTX 16-series chips have an edge over "Pascal" despite lacking RT cores, as the "Turing" CUDA cores support concurrent INT and FP execution, allowing more work to be done per clock. NVIDIA, in a detailed presentation, listed out the kinds of real-time ray-tracing effects enabled by the DXR API, namely reflections, shadows, advanced reflections and shadows, ambient occlusion, global illumination (unbaked), and combinations of these. The company put out detailed performance numbers for a selection of GTX 10-series and GTX 16-series GPUs, and compared them to RTX 20-series SKUs that have specialized hardware for DXR.
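Without RT cores, every ray has to be tested against scene geometry in software on the CUDA cores. As a rough illustration of that per-ray work (a generic slab test for BVH traversal, not NVIDIA's implementation):

```python
# Illustrative sketch: ray vs. axis-aligned bounding box (AABB) "slab test",
# the basic intersection primitive used when walking a BVH in software.
def ray_hits_aabb(origin, inv_dir, box_min, box_max):
    """inv_dir holds 1/direction per axis (use float('inf') for zero components)."""
    tmin, tmax = 0.0, float("inf")
    for o, inv, lo, hi in zip(origin, inv_dir, box_min, box_max):
        t1 = (lo - o) * inv
        t2 = (hi - o) * inv
        tmin = max(tmin, min(t1, t2))  # latest entry across all slabs
        tmax = min(tmax, max(t1, t2))  # earliest exit across all slabs
    return tmin <= tmax

# A ray along +X from the origin hits a unit-ish box straight ahead of it:
inf = float("inf")
print(ray_hits_aabb((0, 0, 0), (1.0, inf, inf), (1, -0.5, -0.5), (2, 0.5, 0.5)))  # → True
```

An RT core performs this kind of traversal and triangle-intersection work in fixed-function hardware; on GTX cards the shader cores must do it instruction by instruction, which is where the performance gap comes from.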
Update: Article updated with additional test data from NVIDIA.

ZOTAC GeForce GTX 1650 Pictured: No Power Connector

Here are some of the first clear renders of an NVIDIA GeForce GTX 1650 graphics card, this particular one from ZOTAC. The GTX 1650, slated for April 22, will be the most affordable GPU based on the "Turing" architecture when launched. The box art confirms this card features 4 GB of GDDR5 memory. The ZOTAC card is compact and SFF-friendly, no longer than the PCIe slot itself, and two slots thick. Its cooler is a simple fan-heatsink, with an 80 mm fan ventilating an aluminium heatsink with radially-projecting fins. The card can make do with the 75 W of power drawn from the PCIe slot, and has no additional power connectors. Display outputs include one each of DisplayPort 1.4, HDMI 2.0b, and dual-link DVI-D.

NVIDIA RTX Logic Increases TPC Area by 22% Compared to Non-RTX Turing

Public perception of NVIDIA's new RTX series of graphics cards has sometimes been marred by an impression of misallocated resources on NVIDIA's part. The argument went that NVIDIA had greatly increased chip area by adding RTX functionality (in both its Tensor and RT cores) that could have been better spent on performance gains in shader-based, non-raytracing workloads. While the merits of ray tracing as it stands (in terms of uptake from developers) are certainly worthy of discussion, it seems that NVIDIA didn't dedicate that much die area to RTX functionality - at least not to the tune of public perception.

After analyzing full, high-res die shots of NVIDIA's TU106 and TU116 chips, reddit user @Qesa examined the TPC structure of NVIDIA's Turing chips and concluded that the difference between the RTX-capable TU106 and the RTX-stripped TU116 amounts to a mere 1.95 mm² of additional logic per TPC - a 22% area increase. Of this, 1.25 mm² is taken up by the Tensor logic (which accelerates both DLSS and de-noising of ray-traced workloads), while only 0.7 mm² goes to the RT cores.
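A back-of-the-envelope check of these numbers (the base TPC area is inferred from the 22% figure, not measured directly):

```python
# Per-TPC area figures from @Qesa's die-shot analysis
tensor_mm2 = 1.25   # Tensor-core logic per TPC
rt_mm2 = 0.70       # RT-core logic per TPC

extra_mm2 = tensor_mm2 + rt_mm2      # total added RTX logic per TPC
base_tpc_mm2 = extra_mm2 / 0.22      # implied area of a TPC without RTX logic

print(round(extra_mm2, 2), round(base_tpc_mm2, 2))  # → 1.95 8.86
```

So each RTX-less Turing TPC occupies roughly 8.9 mm², and the full RTX kit adds under 2 mm² on top of that.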

NVIDIA GeForce GTX 1650 Availability Revealed

NVIDIA is expected to launch its sub-$200 GeForce GTX 1650 graphics card on the 22nd of April, 2019. The card was earlier expected to launch towards the end of April. With it, NVIDIA will introduce the 12 nm "TU117," its smallest GPU based on the "Turing" architecture. The GTX 1650 could replace the current GTX 1060 3 GB, and may compete with AMD offerings in this segment, such as the Radeon RX 570 4 GB, by being Full HD-capable, even if it won't let you max out your game settings at that resolution. The card could ship with 4 GB of GDDR5 memory.

Anthem Gets NVIDIA DLSS and Highlights Support in Latest Update

Saying Anthem has had a rough start would be an understatement, but things can only get better with time (hopefully, anyway). This week saw an update to the PC version that brought with it support for NVIDIA's new DLSS (Deep Learning Super Sampling) technology, for use with the company's new Turing-microarchitecture GeForce RTX cards. NVIDIA's internal testing shows as much as a 40% improvement in average FPS with DLSS on relative to off, as seen in the image below, and there is also a video to help show the graphical changes, or lack thereof in this case. DLSS in Anthem is available on all RTX cards at 3840x2160 resolution, and on the RTX 2060, 2070, and 2080 at 2560x1440. No word on equivalent resolutions at non-16:9 aspect ratios, and presumably 1080p is a no-go, as we first discussed last month.

Note that we will NOT be able to test DLSS on Anthem, which is a result of the five activations limit as far as hardware configurations go. This prevented us from doing a full graphics card performance test, but our article on the VIP demo is still worth checking into if you were curious. In addition to DLSS, Anthem also has NVIDIA Highlights support for GeForce Experience users to automatically capture and save "best gameplay moments", with a toggle option to enable this setting in the driver. A highlight is generated for an apex kill, boss kill, legendary kill, multi kill, overlook interaction, or a tomb discovery. More on this in the source linked below in the full story.

NVIDIA Releases GeForce 419.67 WHQL Game Ready Drivers

After the oddity that was its GeForce 419.67 WHQL Creator Ready drivers, NVIDIA launched new GeForce drivers with the same 419.67 version number, but with "Game Ready" branding. It's now clear that Creator Ready is a fork of the GeForce software, released at a slightly lower frequency, targeting creativity and productivity software that doesn't quite need the Quadro feature-set or certifications. GeForce 419.67 WHQL Game Ready, on the other hand, adds day-one optimization for "Battlefield V: Firestorm," a new update that brings the highly addictive Battle Royale gameplay mode to the Battlefield franchise. Optimization is also added or refined for "Anthem," "Shadow of the Tomb Raider," and "Sekiro: Shadows Die Twice." NVIDIA also expanded the list of Adaptive Sync monitors that are G-Sync capable.

Among the bugs fixed are a performance drop noticed in DaVinci Resolve; overexposed brightness and color seen in "Far Cry: New Dawn" with HDR turned on; performance issues with "Total War: Warhammer 2" with AA turned on; artifacts seen in certain Adobe applications; screen corruption when switching display modes with HDR turned on in "Apex Legends"; FOV reduction when recording with GeForce Experience; flickering noticed in "Star Citizen" followed by a CTD on "Turing" GPUs; abnormal time taken by the GeForce GTX 980 to respond to NVAPI calls; TITAN RTX overheating when enabling TCC mode via NVLink; and second-monitor flickering with two monitors connected to an RTX 2070. Grab the driver from the link below.

DOWNLOAD: NVIDIA GeForce 419.67 WHQL Game Ready

The change-log follows.

NVIDIA: Turing Adoption Rate 45% Higher Than Pascal, 90% of Users Buying Upwards in Graphics Product Tiers

NVIDIA during its investor day revealed some interesting statistics on its Turing-based graphics cards. The company announced that revenue from Turing graphics card sales increased 45% over that generated when NVIDIA introduced its Pascal architecture - which does make sense when you consider how NVIDIA positioned the same product tiers (**60, **70, **80) in higher pricing brackets than before. NVIDIA's own slides showcase this better than anyone else could, with a clear indication of higher pricing for the same graphics tier. According to the company, 90% of users are buying pricier graphics cards this generation than they did in the previous one - hardly surprising, since a user buying a 1060 at launch paid $249, while the new RTX 2060 goes for $349.

Other interesting tidbits from NVIDIA's presentation at its investor day are that Pascal accounts for around 50% of the installed NVIDIA graphics card base, while Turing, for now, accounts for only 2%. This means 48% of users sporting an NVIDIA graphics card are using Maxwell or earlier designs, which NVIDIA says presents an incredible opportunity for increased sales as these users make the jump to the new Turing offerings - and their extended RTX feature set. NVIDIA's stock valuation grew by 5.82% today, likely on the back of this info.

Shadow of the Tomb Raider RTX Patch Now Available: RTX and DLSS Enabled

A new patch has become available for Shadow of the Tomb Raider, which updates the game with the latest graphical technologies in the form of RTX and DLSS. The PC port of the game has been handled by developer Nixxes, which partnered with NVIDIA to add ray-tracing-enabled shadows to the game (there's a thematic coherence there if I've ever seen one).

NVIDIA GeForce GTX 1650 Details Leak Thanks to EEC Filing

The GeForce GTX 1650 will be NVIDIA's smallest "Turing" based graphics card, and is slated for a late-April launch as NVIDIA waits on inventories of sub-$200 "Pascal" based graphics cards, such as the GTX 1050 series, to be digested by the retail channel. A Eurasian Economic Commission filing revealed many more details of this card, as an MSI Gaming X custom-design board was finding its way through the regulator. The filing confirms that the GTX 1650 will pack 4 GB of memory. The GPU will be based on the new 12 nm "TU117" silicon, which will be NVIDIA's smallest based on the "Turing" architecture. This card will likely target e-Sports gamers, giving them the ability to max out their online battle royale titles at 1080p. It will probably compete with AMD's Radeon RX 570.

NVIDIA to Enable DXR Ray Tracing on GTX (10- and 16-series) GPUs in April Drivers Update

NVIDIA had its customary GTC keynote ending mere minutes ago, and it was one of the longer keynotes, clocking in at nearly three hours. There were some fascinating demos and features shown off, especially in the realm of robotics and machine learning, as well as new hardware as it pertains to AI and cars with the all-new Jetson Nano. It would be fair to say, however, that the vast majority of the keynote targeted developers and researchers, as is usually the case at GTC. However, something came up in between that caught us by surprise, and it is no doubt a pleasant update for most of us here on TechPowerUp.

Following AMD's claims of software-based real-time ray tracing in games, and Crytek's Neon Noir real-time ray tracing demo for both AMD and NVIDIA GPUs, it makes sense in hindsight that NVIDIA would bring rudimentary DXR ray-tracing support to older hardware that lacks RT cores. In particular, a drivers update next month will enable DXR support on 10-series Pascal-microarchitecture graphics cards (GTX 1060 6 GB and higher), as well as the newly announced GTX 16-series Turing-microarchitecture GPUs (GTX 1660, GTX 1660 Ti). The announcement comes with a caveat letting people know not to expect RTX-level support (think a lower number of rays traced, and possibly no secondary/tertiary effects), and this DXR mode will only be supported in the Unity and Unreal game engines for now. More to come, with details past the break.

NVIDIA GTC 2019 Kicks Off Later Today, New GPU Architecture Tease Expected

NVIDIA will kick off the 2019 GPU Technology Conference later today, at 2 PM Pacific time. The company is expected to either tease or unveil a new graphics architecture succeeding "Volta" and "Turing." Not much is known about this architecture, but it's highly likely to be NVIDIA's first designed for the 7 nm silicon fabrication process. This unveiling could be the earliest stage of the architecture's launch cycle, and could see market availability only by late-2019 or mid-2020, if not later, given that the company's RTX 20-series and GTX 16-series have only recently been unveiled. NVIDIA could leverage 7 nm to increase transistor densities, and bring its RTX technology to even more affordable price-points.

Palit Unveils its GeForce GTX 1660 Lineup

Palit Microsystems Ltd, the leading graphics card manufacturer, releases the new NVIDIA Turing-architecture GeForce GTX 16 series in the Palit GeForce product line-up: the GeForce GTX 1660 Dual OC, Dual, StormX OC, and StormX.

The Palit GeForce GTX 1660 is built on the latest NVIDIA Turing architecture and performs great at 120 FPS, making it an ideal model for eSports gaming titles. It can also deliver amazing performance and image quality while livestreaming to Twitch or YouTube. Like its big brother, the GeForce GTX 1660 utilizes the "TU116" Turing GPU that's been carefully architected to balance performance, power, and cost. TU116 includes all of the new Turing Shader innovations that improve performance and efficiency, including support for Concurrent Floating Point and Integer Operations, a Unified Cache Architecture with larger L1 cache, and Adaptive Shading.

GIGABYTE Unveils its GeForce GTX 1660 Graphics Card Series

GIGABYTE, the world's leading premium gaming hardware manufacturer, today announced the latest GeForce GTX 1660 graphics cards powered by the NVIDIA Turing architecture. GIGABYTE launched 3 graphics cards - the GeForce GTX 1660 GAMING OC 6G, GeForce GTX 1660 GAMING 6G, and GeForce GTX 1660 OC 6G. These GeForce GTX 1660 graphics cards not only use overclocked GPUs certified by GIGABYTE but are also built with GIGABYTE cooling systems, for game enthusiasts pursuing extreme performance and the best gaming experience.

GIGABYTE releases the GeForce GTX 1660 GAMING OC 6G graphics card for users who prefer triple-fan solutions with the new GeForce GTX 1660. In addition to GIGABYTE's patented "Alternate Spinning" function, the unique blade fan design can effectively enhance the airflow, which is split by the triangular fan edge and guided smoothly through the 3D stripe curve on the fan surface. Equipped with 3 pure copper composite heat-pipes in direct touch with the GPU, it dissipates heat from the GPU quickly, keeping the graphics card working at a low temperature for higher performance and product stability. Moreover, the GAMING OC is built with an extremely durable overclocking design that provides over-temperature protection and load balancing for each MOSFET, plus Ultra Durable certified chokes and capacitors, for excellent performance and a longer system life.

MSI Reveals New GeForce GTX 1660 Series Graphics Cards

As the world's most popular GAMING graphics card vendor, MSI is proud to announce its new graphics card line-up based on the new GeForce GTX 1660 GPU, the latest addition to the NVIDIA Turing GTX family.

The GeForce GTX 1660 utilizes the "TU116" Turing GPU that's been carefully architected to balance performance, power, and cost. TU116 includes all of the new Turing Shader innovations that improve performance and efficiency, including support for Concurrent Floating Point and Integer Operations, a Unified Cache Architecture with larger L1 cache, and Adaptive Shading.

EVGA Rolls Out the GeForce GTX 1660 for Zero Compromise Gaming

The EVGA GeForce GTX 1660 graphics cards are powered by the all-new NVIDIA Turing architecture to give you incredible new levels of gaming realism, speed, power efficiency, and immersion. With the EVGA GeForce GTX 1660 graphics cards you get the best gaming experience, with next-generation graphics performance, ice-cold cooling, and advanced overclocking features via the all-new EVGA Precision X1 software.

The first HDB fan on an NVIDIA graphics card optimizes airflow, increases cooling performance, and reduces fan noise by 15% compared to the sleeve-bearing fans more commonly used on graphics cards. The special upraised 'E' pattern on the enlarged fan allows a further noise reduction of 4%. With a brand-new layout, a completely new codebase, new features and more, the new EVGA Precision X1 software is faster, easier, and better than ever.

NVIDIA Launches the GeForce GTX 1660 6GB Graphics Card

NVIDIA today launched the GeForce GTX 1660 6 GB graphics card, its successor to the immensely popular GTX 1060 6 GB. With prices starting at $219.99, the GTX 1660 is based on the same 12 nm "TU116" silicon as the GTX 1660 Ti launched last month, but with fewer CUDA cores and a slower memory interface. NVIDIA carved the GTX 1660 out by disabling 2 of the 24 "Turing" SMs on the TU116, resulting in 1,408 CUDA cores, 88 TMUs, and 48 ROPs. The company is using 8 Gbps GDDR5 memory instead of 12 Gbps GDDR6, which makes its memory sub-system 33 percent slower. The GPU is clocked at 1530 MHz, with 1785 MHz boost, both marginally higher than the GTX 1660 Ti's. The GeForce GTX 1660 is a partner-driven launch, meaning there won't be any reference-design cards, although NVIDIA made sure every AIC partner has at least one product selling at the baseline price of $219.99.
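These figures can be sanity-checked with a little arithmetic; the 64-cores-per-SM count comes from Turing's published SM layout rather than from this post:

```python
# How the GTX 1660 is carved out of TU116, per the specs above
CORES_PER_TURING_SM = 64                  # each Turing SM carries 64 CUDA cores
active_sms = 24 - 2                       # 2 of TU116's 24 SMs are disabled
print(active_sms * CORES_PER_TURING_SM)   # → 1408 CUDA cores

# 8 Gbps GDDR5 vs. the GTX 1660 Ti's 12 Gbps GDDR6: with the bus width
# unchanged, the data-rate ratio is the bandwidth ratio
slowdown = 1 - 8 / 12
print(round(slowdown * 100))              # → 33 (percent slower, as stated)
```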

Read TechPowerUp Reviews: Zotac GeForce GTX 1660 | EVGA GeForce GTX 1660 XC Ultra | Palit GeForce GTX 1660 StormX OC | MSI GTX 1660 Gaming X

Update: We have updated our GPU database with all GTX 1660 models announced today, so you can easily get an overview of what has been released.

NVIDIA GeForce GTX 1650 Memory Size Revealed

NVIDIA's upcoming entry-mainstream graphics card based on the "Turing" architecture, the GeForce GTX 1650, will feature 4 GB of GDDR5 memory, according to tech industry commentator Andreas Schilling. Schilling also put out NVIDIA's box-art for this SKU. The source does not mention memory bus width. In related news, Schilling also mentions NVIDIA going with 6 GB as the memory amount for the GTX 1660. NVIDIA is expected to launch the GTX 1660 in mid-March, and the GTX 1650 in late April.

NVIDIA GeForce GTX 1660 and GTX 1650 Pricing and Availability Revealed

(Update 1: Andreas Schilling, at Hardware Luxx, seems to have obtained confirmation that NVIDIA's GTX 1650 graphics cards will pack 4 GB of GDDR5 memory, and that the GTX 1660 will be offering a 6 GB GDDR5 framebuffer.)

NVIDIA recently launched its GeForce GTX 1660 Ti graphics card at USD $279, the most affordable desktop discrete graphics card based on the "Turing" architecture thus far. NVIDIA's GeForce 16-series GPUs are based on 12 nm "Turing" chips, but lack the RT cores for real-time ray-tracing and the tensor cores that accelerate AI. The company is making two affordable additions to the GTX 16-series in March and April, according to Taiwan-based PC industry observer DigiTimes.

The GTX 1660 Ti launch will be followed by that of the GeForce GTX 1660 (non-Ti) on 15th March, 2019. This SKU is likely based on the same "TU116" silicon as the GTX 1660 Ti, but with fewer CUDA cores and possibly slower memory or a smaller memory amount. NVIDIA is pricing the GTX 1660 at $229.99, a whole $50 cheaper than the GTX 1660 Ti. That's not all. We recently reported on the GeForce GTX 1650, which could quite possibly become NVIDIA's smallest "Turing" based desktop GPU. This product is real, and is bound for 30th April at $179.99, a further $50 cheaper than the GTX 1660. This SKU is expected to be based on the smaller "TU117" silicon. Much like the GTX 1660 Ti launch, these two launches could be entirely partner-driven, with no reference-design cards.

Manli Announces GeForce GTX 1660 Ti Series Graphics Cards

Manli Technology Group Limited, the major graphics card and components manufacturer, today announced its brand new 16-series graphics solution, the Manli GeForce GTX 1660 Ti, in two options: Single Fan and Blower Fan.

The Manli GeForce GTX 1660 Ti is equipped with the world's fastest memory, 6 GB of GDDR6, and a 192-bit memory controller. The base clock is 1500 MHz, which can dynamically boost up to 1770 MHz to deliver a smooth and fast gaming experience. Meanwhile, it is packed with the award-winning NVIDIA Turing architecture, adaptive shading technology, and NVIDIA Ansel, which delivers super-resolution image capture.
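The quoted 192-bit GDDR6 interface implies the following peak bandwidth, assuming the GTX 1660 Ti's standard 12 Gbps data rate:

```python
# Peak bandwidth of the GTX 1660 Ti's memory sub-system
bus_width_bits = 192    # 192-bit memory controller, as quoted
data_rate_gbps = 12     # 12 Gbps GDDR6 (the GTX 1660 Ti's standard rate)

print(bus_width_bits // 8 * data_rate_gbps)  # → 288 GB/s
```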

ASUS Unveils its GeForce GTX 1660 Ti Turing Graphics Cards

NVIDIA's Turing architecture offers a lot of cutting-edge functionality. Its highest-profile features, RT cores for real-time ray tracing and Tensor cores for Deep Learning Super Sampling (DLSS), are forward-looking in that they will become increasingly impactful as developers integrate their powerful functionality into more games. But beyond those two capabilities, the design is also optimized to help enthusiasts get more performance out of today's most popular titles, right out of the box.

The first Turing-based graphics cards were decidedly high-end, targeting price points and performance levels where ray tracing and DLSS would shine together. Now, it's time for Turing to strut its stuff in a more value-oriented context. With the NVIDIA GeForce GTX 1660 Ti, gamers gain access to the huge per-core performance lift of Turing, along with next-gen GDDR6 VRAM that offers much greater memory bandwidth than the previous generation.

GIGABYTE Launches its GeForce GTX 1660 Ti Graphics Card Family

GIGABYTE, the world's leading premium gaming hardware manufacturer, today announced the latest GeForce GTX 1660 Ti graphics cards powered by the NVIDIA Turing architecture. GIGABYTE launched 5 graphics cards - the AORUS GeForce GTX 1660 Ti 6G, GeForce GTX 1660 Ti GAMING OC 6G, GeForce GTX 1660 Ti WINDFORCE OC 6G, GeForce GTX 1660 Ti OC 6G, and GeForce GTX 1660 Ti MINI ITX OC 6G. These GeForce GTX 1660 Ti graphics cards not only use overclocked GPUs certified by GIGABYTE but are also built with GIGABYTE cooling systems, for game enthusiasts pursuing remarkable performance and the best gaming experience.

The top-of-the-line AORUS GeForce GTX 1660 Ti 6G graphics card, featuring ultra-luxury components, demonstrates AORUS's pursuit of perfection. AORUS provides an all-around cooling solution for all key components of the graphics card, taking care of not only the GPU but also the VRAM and MOSFETs, to ensure stable overclocked operation and a longer life. The WINDFORCE 3x 80 mm cooling system is an innovative solution that provides highly efficient thermal performance for the graphics card. With 5 composite heat-pipes directly touching the GPU and an excellent circuit design with top-grade materials, the thermal design not only helps maximize the performance of the GPU but also allows the card to maintain stable, long-life operation. In addition, the AORUS GeForce GTX 1660 Ti integrates a stylish metal back-plate with RGB illumination and comes with the latest built-in RGB Fusion 2.0, with which customers can synchronize various lighting effects across other AORUS devices.