News Posts matching #Turing

NVIDIA TU106 Chip Support Added to HWiNFO, Could Power GeForce RTX 2060

We are all still waiting to see how NVIDIA's RTX 2000 series of GPUs fares in independent reviews, but that has not stopped the rumor mill from extrapolating. There have been alleged leaks of the RTX 2080 Ti's performance, and now we see HWiNFO add support for an unannounced NVIDIA Turing microarchitecture chip, the TU106. As a reminder, the currently announced members of the RTX series are based on the TU102 (RTX 2080 Ti) and TU104 (RTX 2080, RTX 2070). Going by NVIDIA's history, it is logical to expect a smaller die for upcoming RTX cards, and we may well see an RTX 2060 using the TU106 chip.

This addition to HWiNFO is to be taken with a grain of salt, however, as the tool has been wrong before: it recently added support for what was, at the time, speculated to be the NVIDIA Volta microarchitecture, which we now know as Turing. That has not stopped others from speculating further, with 3DCenter.org giving its best estimates of how the TU106 may fare in terms of die size, shader and TMU counts, and more. Given that TSMC's 7 nm node will likely be preoccupied with Apple iPhone production through the end of this year, NVIDIA may well be using the same 12 nm FinFET process on which the TU102 and TU104 are manufactured. The mainstream GPU segment is NVIDIA's bread-and-butter for gross revenue, so it is possible we could see an announcement, and even retail availability, towards the end of Q4 2018 to target holiday shoppers.

GIGABYTE Unveils GeForce RTX 20-series Graphics Cards

GIGABYTE, the world's leading premium gaming hardware manufacturer, and NVIDIA, the leading GPU company, have released the latest GeForce RTX 20-series graphics cards powered by the NVIDIA Turing architecture. GIGABYTE is first launching five graphics cards: the GeForce RTX 2080 Ti GAMING OC 11G, GeForce RTX 2080 Ti WINDFORCE OC 11G, GeForce RTX 2080 GAMING OC 8G, GeForce RTX 2080 WINDFORCE OC 8G, and GeForce RTX 2070 GAMING OC 8G. All five cards feature the GIGABYTE WINDFORCE 3X cooling system with alternate-spinning fans, RGB Fusion, a protective metal backplate, GIGABYTE-certified Ultra Durable materials, and one-click overclocking, so gamers can enjoy the ultimate gaming experience with extreme performance.

The GIGABYTE WINDFORCE 3X cooling system takes care of every component on the graphics card and is equipped with three unique blade fans, high-efficiency pure-copper composite heat pipes that directly touch the GPU, and a semi-passive fan function. These cooling technologies keep the graphics card at low temperatures at all times, resulting in higher and more stable performance. The middle fan spins in reverse to optimize airflow for heat dissipation, enabling more efficient cooling at lower temperatures.

Introducing the EVGA GeForce RTX 20-Series Graphics Cards

The EVGA GeForce RTX 20-Series graphics cards are powered by the all-new NVIDIA Turing architecture to give you incredible new levels of gaming realism, speed, power efficiency, and immersion. With the EVGA GeForce RTX 20-Series gaming cards you get the best gaming experience, with next-generation graphics performance, ice-cold cooling with EVGA iCX2, and advanced overclocking features in the all-new EVGA Precision X1 software.

The new NVIDIA GeForce RTX GPUs have reinvented graphics and set a new bar for performance. Powered by the new NVIDIA Turing GPU architecture and the revolutionary NVIDIA RTX platform, the new graphics cards bring together real-time ray tracing, artificial intelligence, and programmable shading. This is not only a whole new way to experience games - this is the ultimate PC gaming experience.

The new GPUs were unveiled at a special NVIDIA two-day event called the "GeForce Gaming Celebration" which kicked off tonight at the Palladium in Cologne, Germany ahead of Gamescom 2018.

ZOTAC Announces its GeForce RTX 20-series

ZOTAC Technology, a global manufacturer of innovation, is pleased to change the playing field of graphics cards once more with ZOTAC GAMING GeForce RTX 20-series graphics cards. The new ZOTAC GAMING GeForce RTX 20-series will be available in twin fan and triple fan AMP models with all-new designs.

The new NVIDIA GeForce RTX GPUs have reinvented graphics and set a new bar for performance. Powered by the new NVIDIA Turing GPU architecture and the revolutionary NVIDIA RTX platform, the new graphics cards bring together real-time ray tracing, artificial intelligence, and programmable shading. This is not only a whole new way to experience games - this is the ultimate PC gaming experience.

MSI Unveils its GeForce RTX Series

As the leading brand in True Gaming hardware, MSI has sold over 8 million graphics cards in the last year alone. Today we are extremely proud to share with you our take on NVIDIA's exciting new GeForce RTX 20 series GPUs.

The new NVIDIA GeForce RTX GPUs have reinvented graphics and set a new bar for performance. Powered by the new NVIDIA Turing GPU architecture and the revolutionary NVIDIA RTX platform, the new graphics cards bring together real-time ray tracing, artificial intelligence, and programmable shading. This is not only a whole new way to experience games - this is the ultimate PC gaming experience.
The new GPUs were unveiled at a special NVIDIA two-day event called the "GeForce Gaming Celebration" which kicked off tonight at the Palladium in Cologne, Germany ahead of Gamescom 2018.

NVIDIA Turing has 18.9 Billion Transistors

NVIDIA revealed that "Turing," the chip powering its RTX 2080 series, has up to 18.9 billion transistors, making it the second-biggest chip ever made (after the NVIDIA V100). The Turing chip combines three key components: SM (CUDA) cores, RT cores, and Tensor cores. The CUDA cores offer 14 TFLOPS of compute power; the Tensor cores (which crunch 4x4x4 matrix multiplication) deliver 110 TFLOPS of FP16 throughput; and the RT cores process 10 giga-rays per second (10x over the predecessor).
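As a rough illustration of what those Tensor core numbers mean, here is a minimal sketch in plain Python/NumPy (purely illustrative, not NVIDIA code) of the 4x4x4 multiply-accumulate primitive described above, along with the back-of-envelope FLOP arithmetic behind the 110 TFLOPS FP16 figure.

```python
# Purely illustrative sketch (not NVIDIA code): the 4x4x4 multiply-accumulate
# primitive that Tensor cores execute in hardware, written out in NumPy.
import numpy as np

def mma_4x4x4(a: np.ndarray, b: np.ndarray, c: np.ndarray) -> np.ndarray:
    """Compute D = A x B + C for 4x4 matrices, the per-core operation described above."""
    assert a.shape == b.shape == c.shape == (4, 4)
    return a @ b + c

# One 4x4x4 MMA performs 4*4*4 = 64 multiplies plus 64 adds = 128 FLOPs.
FLOPS_PER_MMA = 2 * 4 * 4 * 4

a = np.random.rand(4, 4).astype(np.float16)
b = np.random.rand(4, 4).astype(np.float16)
c = np.zeros((4, 4), dtype=np.float16)
d = mma_4x4x4(a, b, c)

# Back-of-envelope check: 110 TFLOPS FP16 corresponds to roughly
# 110e12 / 128, or about 0.86 trillion of these 4x4x4 operations per second.
print(d.dtype, f"{110e12 / FLOPS_PER_MMA:.2e} MMAs per second")
```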

NVIDIA GeForce RTX 2080 Ti Reference Design Teased

Here's the first teaser picture of NVIDIA's upcoming super high-end GeForce RTX 2080 Ti graphics card, with its most prominent feature being a dual-fan reference-design cooler. Given that bare PCB pictures of the RTX 2080 reference board show two fan headers, it's possible that NVIDIA could make this dual-fan cooler common to both the RTX 2080 Ti and the RTX 2080+ (a premium 3,072-SP version of the RTX 2080, which could launch in September).

NVIDIA GeForce RTX 2000 Series Specifications Pieced Together

Later today (20th August), NVIDIA will formally unveil its GeForce RTX 2000 series consumer graphics cards. This marks a major change in the brand name, triggered by the introduction of the new RT Cores, specialized components that accelerate real-time ray-tracing, a task too taxing for conventional CUDA cores, while Tensor cores handle DNN acceleration by crunching 4x4x4 matrix multiplication. The chips still have CUDA cores for everything else. This generation also debuts the new GDDR6 memory standard, although unlike GeForce "Pascal," the new GeForce "Turing" won't see a doubling of memory sizes.

NVIDIA is expected to debut the generation with the new GeForce RTX 2080 later today, with market availability by the end of the month. Going by older rumors, the company could launch the lower-positioned RTX 2070 and higher-positioned RTX 2080+ by late September, and the mid-range RTX 2060 series in October. Apparently the high-end RTX 2080 Ti could come out sooner than expected, given that VideoCardz already has some of its specifications in hand. Not a lot is known about how "Turing" compares with "Volta" in performance, but given that the TITAN V comes with Tensor cores that can, in theory, be re-purposed as RT cores, it could continue as NVIDIA's halo SKU for the client segment.

NVIDIA Announces Financial Results for Second Quarter Fiscal 2019

NVIDIA today reported revenue for the second quarter ended July 29, 2018, of $3.12 billion, up 40 percent from $2.23 billion a year earlier, and down 3 percent from $3.21 billion in the previous quarter.

GAAP earnings per diluted share for the quarter were $1.76, up 91 percent from $0.92 a year ago and down 11 percent from $1.98 in the previous quarter. Non-GAAP earnings per diluted share were $1.94, up 92 percent from $1.01 a year earlier and down 5 percent from $2.05 in the previous quarter.

"Growth across every platform - AI, Gaming, Professional Visualization, self-driving cars - drove another great quarter," said Jensen Huang, founder and CEO of NVIDIA. "Fueling our growth is the widening gap between demand for computing across every industry and the limits reached by traditional computing. Developers are jumping on the GPU-accelerated computing model that we pioneered for the boost they need.

NVIDIA Does a TrueAudio: RT Cores Also Compute Sound Ray-tracing

Positional audio, like socialism, goes through a cycle of glamorization and investment every few years. Back in 2011-12, when AMD held a relatively stronger position in the discrete GPU market along with GPGPU superiority, it gave a lot of money to GenAudio and Tensilica to co-develop the TrueAudio technology, a GPU-accelerated positional-audio DSP that ended up with a whopping four game implementations, including and limited to "Thief," "Star Citizen," "Lichdom: Battlemage," and "Murdered: Soul Suspect." The TrueAudio Next DSP, which debuted with "Polaris," introduced GPU-accelerated "audio ray-casting" technology, which assumes that audio waves interact differently with different surfaces, much like light, and hence positional audio could be made more realistic. There were a grand total of zero takers for TrueAudio Next. Riding on the presumed success of its RTX technology, NVIDIA now wants to develop audio ray-tracing further.

A very curious sentence in NVIDIA's micro-site for Turing caught our eye. The description of RT cores reads that they are specialized components that "accelerate the computation of how light and sound travel in 3D environments at up to 10 Giga Rays per second." This is an ominous sign that NVIDIA is developing a full-blown positional-audio programming model as part of RTX, with an implementation through GameWorks. Such a technology, like TrueAudio Next, could improve positional-audio realism by treating sound waves like light and tracing their paths from their origin (think speech from an NPC in a game) to the listener, as the sound bounces off the various surfaces in the 3D scene. Real-time ray-tracing(-ish) has so captured the imagination of NVIDIA marketing that the company is allegedly willing to replace "GTX" with "RTX" in its GeForce GPU nomenclature. We don't mean to doomsay emerging technology, but 20 years of development in positional audio has shown that it's better left to game developers to create their own technology that sounds somewhat real, and that initiatives from makers of discrete sound cards (a device on the brink of extinction) and GPU makers have borne no fruit.
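To make the "sound bounces like light" idea concrete, here is a minimal, hypothetical sketch in plain Python/NumPy (not based on any NVIDIA or RTX API) of the mirror-reflection step an audio ray-caster would apply each time a sound path hits a surface.

```python
# Hypothetical illustration (not an NVIDIA/RTX API): reflect a sound "ray" off a
# surface the same way a light ray would be mirrored, using r = d - 2(d.n)n.
import numpy as np

def reflect(direction: np.ndarray, normal: np.ndarray) -> np.ndarray:
    """Mirror-reflect a ray direction about a surface normal."""
    normal = normal / np.linalg.norm(normal)  # ensure the normal is unit-length
    return direction - 2.0 * np.dot(direction, normal) * normal

# A sound ray travelling from an NPC toward a wall whose normal points along -x.
incoming = np.array([1.0, -1.0, 0.0])
wall_normal = np.array([-1.0, 0.0, 0.0])

bounced = reflect(incoming, wall_normal)
print(bounced)  # [-1. -1.  0.] - the path continues toward the listener after the bounce
```

A real audio ray-caster would also attenuate each bounce according to the surface material, which is the per-surface behavior the technologies described above are meant to model.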

NVIDIA Posts Cryptic #BeForTheGame Video Pointing at 20th August

When NVIDIA debuted its "Turing" GPU architecture with its recent Quadro RTX series, PC enthusiasts felt left high and dry. The occasion was SIGGRAPH, the biggest annual expo for digital content creators, so a Quadro unveiling felt fitting. Come 21st August, Gamescom will be almost upon us, and NVIDIA is planning its own event in host city Cologne a day earlier. The theme of the event is "Be For The Game."

NVIDIA has posted the mother of all teasers pointing to the August 20 event. It doesn't mention a new product launch, but there are enough hints, such as a backplate reminiscent of the TITAN V combined with glossy green and black surfaces that look similar to the Quadro RTX reference boards. The video winks at both gamers and PC enthusiasts, with the first half depicting a sick build being put together. We can't wait!

Samsung 16Gb GDDR6 Memory Powers Latest NVIDIA Quadro Professional Graphics Solution

Samsung Electronics Co., Ltd., a world leader in advanced semiconductor technology, today announced that its 16-gigabit (Gb) Graphics Double Data Rate 6 (GDDR6) memory is being used in NVIDIA's new Turing architecture-based Quadro RTX GPUs.

Thanks to Samsung's industry-leading 16Gb GDDR6 memory, end users can expect improved performance and energy efficiency in the widest array of graphics-intensive applications, including computer-aided design (CAD), digital content creation (DCC) and scientific visualization applications. Samsung's 16Gb GDDR6 can also be used in rapidly growing fields such as 8K Ultra HD video processing, virtual reality (VR), augmented reality (AR) and artificial intelligence (AI).

NVIDIA Announces Turing-based Quadro RTX 8000, Quadro RTX 6000 and Quadro RTX 5000

NVIDIA today reinvented computer graphics with the launch of the NVIDIA Turing GPU architecture. The greatest leap since the invention of the CUDA GPU in 2006, Turing features new RT Cores to accelerate ray tracing and new Tensor Cores for AI inferencing, which, together for the first time, make real-time ray tracing possible.

These two engines - along with more powerful compute for simulation and enhanced rasterization - usher in a new generation of hybrid rendering to address the $250 billion visual effects industry. Hybrid rendering enables cinematic-quality interactive experiences, amazing new effects powered by neural networks and fluid interactivity on highly complex models.

NVIDIA's Next Gen GPU Launch Held Back to Drain Excess, Costly Built-up Inventory?

We've previously touched upon whether or not NVIDIA should launch its 1100 or 2000 series of graphics cards ahead of any new product from AMD. At the time, I wrote that I only saw benefits to that approach: earlier time to market -> satisfaction of upgrade itches and entrenchment as the only latest-gen manufacturer -> raised prices in the absence of competition -> the ability to respond by lowering prices after amassing a war chest of profits. However, reports of a costly NVIDIA mistake in overestimating demand for its Pascal GPUs lend some other shades to the whole equation.

Write-offs in inventory are costly (just ask Microsoft), and apparently, NVIDIA has found itself on the wrong side of a miscalculation, having overestimated gamers' and miners' demand for its graphics cards. When it comes to gamers, NVIDIA's Pascal graphics cards have been available on the market for two years now - it's relatively safe to say that the majority of gamers who needed higher-performance graphics cards have already taken the plunge. As for miners, the cryptocurrency market contraction (among other factors) has led to a taper-off in graphics card demand for that particular workload. The result? According to Seeking Alpha, NVIDIA's demand overestimation has led a "top three" Taiwan OEM to return 300,000 GPUs to NVIDIA, while the company has "aggressively" increased its GDDR5 purchase orders, suggesting an excess stock of GPUs that still need to be made into boards.

NVIDIA's Next-Gen Graphics Cards to Launch in Q3 2018, Breadcrumb Trail Indicates

We in the media and you enthusiasts get a jolt every time a high-profile launch is announced - or even hinted at. And few product launches are as enthusing as those of new, refined graphics card architectures - the possibilities of extra performance, bang-for-buck improvements, and mid-tier performance that belonged in last generation's halo products are all a mix of merriment and expectation, even if it sometimes tastes a little sour.

Adding to the breadcrumbs already laid out by NVIDIA's Hot Chips presentation on a new "Next Generation mainstream GPU", yet another piece of bread that would make Gretel proud comes from Power Logic, a fan supplier for numerous AIB partners, which recently said it expects "Q3 orders to be through the roof". Such an increase in demand usually means AIB partners are stocking up on materials to build substantial inventory for new product launches, and it does fall in line with the NVIDIA Hot Chips presentation in August. Q3 starts in July, though, and while the supply-chain timings are unknown, it seems somewhat tight for a July product launch to coincide with the increased fan orders.

NVIDIA Briefs AIC Partners About Next-gen GeForce Series

NVIDIA has reportedly briefed its add-in card (AIC) partners about its upcoming GeForce product family, codenamed "Turing," which will bear a commercial nomenclature of either GeForce 11-series or GeForce 20-series. This sets in motion a 2-3 month process of rolling out new graphics cards by board partners, beginning with reference-design "Founders Edition" SKUs, followed by custom-design SKUs. Sources tell Tom's Hardware Germany that AIC partners have begun training their product development teams. NVIDIA has also released a BoM (bill of materials) to its partners so that, aside from the ASIC itself, they can begin sourcing the other components for their custom-design products (such as coolers, memory chips, VRM components, connectors, etc.).

The BoM also specifies a tentative timeline for each of the main stages of product development, leading up to mass production. It stipulates 11-12 weeks (nearly three months) from start to mass production and shipping, which could put the product launch some time in August (assuming the BoM was released some time in May-June). A separate table also provides a fascinating insight into the various stages of development of a custom-design NVIDIA graphics card.

NVIDIA GTX 1080-successor By Late-July

NVIDIA is reportedly putting the finishing touches on its first serious GeForce-branded GPU based on a next-generation NVIDIA architecture (nobody knows which), for a late-July product announcement. This involves a limited reference-design "Founders Edition" product launch in July, followed by custom-design graphics card launches in August and September. This chip could be the second-largest client-segment implementation of said architecture, succeeding the GP104, which powers the GTX 1080 and GTX 1070.

It's growing increasingly clear that the first product could be codenamed "Turing" after all, and that "Turing" may not be the codename of an architecture or a piece of silicon, but rather an SKU (likely named either GTX 1180 or GTX 2080). As with all previous NVIDIA product-stack roll-outs since the GTX 680, NVIDIA will initially position the GTX 1080-successor as a high-end product, as it will be faster than the GTX 1080 Ti, but the product will later play second fiddle to a GTX 1080 Ti-successor based on a bigger chip.

Next-Generation NVIDIA Mobile GPUs to Be Released Towards End of 2018

An official Gigabyte UK notebook representative, who goes by the name Atom80 over at the OverclockersUK forums, has confirmed that NVIDIA's next-generation mobile GPUs will launch towards the end of this year. When asked whether Gigabyte will be providing a GTX 1080 option for its Aero 15X V8-CF1 notebook, Atom80 stated that there are no plans to upgrade the Aorus notebook family until the next-generation GPUs are available. Since mobile variants usually launch a few months after their desktop counterparts, it's possible that we're looking at a summer launch for the desktop models.

NVIDIA Turing GPU to Start Mass Production in Q3 2018

Despite not being backed by an official statement, NVIDIA's next-generation crypto-mining "Turing" graphics cards are expected to be revealed at GTC 2018, held between March 26 and 29. According to DigiTimes' latest report, NVIDIA is expecting a drop in demand for graphics cards later this year. In an effort to prolong the lifecycle of its current graphics cards, mass production of Turing won't start until the third quarter of 2018. Industry sources have also revealed that NVIDIA had a sit-down with AIB partners to address the current situation; in short, NVIDIA partners are now forbidden from promoting cryptocurrency-mining activities and from selling consumer graphics cards in bulk to cryptominers.

Report: NVIDIA Not Unveiling 2018 Graphics Card Lineup at GDC, GTC After All

Tom's Hardware, citing industry sources, reports that NVIDIA isn't looking to expand its graphics card lineup at this year's GDC (Game Developers Conference) or GTC (GPU Technology Conference). Even as reports have hit the streets pointing towards NVIDIA announcing (if not launching) its two new product architectures as early as next month, it now seems that won't be the case after all. As a reminder, the architectures in question are Turing, reportedly for crypto-mining applications, and Ampere, the expected GeForce architecture leapfrogging the current top of the line - and absent from regular consumer shores - Volta.

There's really not much that can be gleaned from industry sources as of now, though. It's clear no one has received any kind of information from NVIDIA on either of its expected architectures, which means an impending announcement isn't likely. At the same time, NVIDIA has no real interest in pulling the trigger on new products - demand is fine, and competition from AMD is low. As such, reports of a June-or-later announcement/release gain credibility, as do reports that NVIDIA would hold back a consumer version of Ampere, use it to replace Volta in the professional and server segments, and instead launch Volta - finally - in the consumer segment. This would allow the company to cash in on its Volta architecture, this time in consumer products, for a full generation longer, while still innovating the market - of sorts. All scenarios are open right now, but one thing that seems clear is that there will be no announcements next month.

NVIDIA to Unveil "Ampere" Based GeForce Product Next Month

NVIDIA is preparing to make its annual tech expo, the 2018 GPU Technology Conference (GTC), action-packed. The company already surprised us with its next-generation "Volta" architecture based TITAN V graphics card priced at three grand, and is working to cash in on the crypto-currency wave and ease pressure on consumer graphics card inventories by designing highly optimized mining accelerators under the new Turing brand. There's now talk that NVIDIA could pole-vault over a consumer-space launch of the "Volta" architecture by unveiling a GeForce graphics card based on its succeeding architecture, "Ampere."

The oldest reports of NVIDIA unveiling "Ampere" date back to November 2017. At the time it was expected that NVIDIA would only share some PR blurbs on a few of the key features it brings to the table, or at best unveil a specialized (non-gaming) chip, such as a Drive or machine-learning product. An Expreview report now points to the possibility of a GeForce product - one you can buy in your friendly neighborhood PC store and play games with. The "Ampere"-based GPU will still be built on TSMC's 12 nanometer silicon fabrication process, and is unlikely to be a big halo chip with exotic HBM stacks. Why NVIDIA chose to leapfrog is uncertain. GTC gets underway in late March.

NVIDIA Turing is a Crypto-mining Chip Jen-Hsun Huang Made to Save PC Gaming

When Reuters reported Turing as NVIDIA's next gaming graphics card, we knew something was off about it; something like that would break many of NVIDIA's naming conventions. It now turns out that Turing, named after British scientist Alan Turing, who is credited with leading the team of mathematicians that broke the Nazi "Enigma" cryptography, is a crypto-mining and blockchain compute accelerator. It is being designed to be compact, efficient, and ready for large-scale deployment by amateur miners and crypto-mining firms alike, on a quasi-industrial scale.

NVIDIA Turing could be manufactured at a low enough cost relative to GeForce-branded products, and at high enough volumes, to help bring down their prices and save the PC gaming ecosystem. It could have an ASIC-like disruptive impact on the graphics card market, making mining with graphics cards less viable and, in turn, lowering graphics card prices. With performance-segment and high-end graphics cards seeing 200-400% price inflation in the wake of the crypto-currency mining wave, PC gaming is threatened as gamers are lured to the still-affordable new-generation console ecosystems, led by premium consoles such as the PlayStation 4 Pro and Xbox One X. There's no word on which GPU architecture Turing will be based on ("Pascal" or "Volta"). NVIDIA is expected to launch its entire family of next-generation GeForce GTX 2000-series "Volta" graphics cards in 2018.

NVIDIA to Unveil "Turing" Consumer Graphics GPU Next Month

NVIDIA is reportedly working on a TITAN V-esque surprise for March 2018. According to Reuters, which summarized the company's Q4-2017 results and outlook, the company is working on a new consumer-graphics GPU for launch next month, codenamed "Turing." This could be the codename of an ASIC or an SKU, and not of the architecture (which could be "Volta"). The Reuters report describes "Turing" as a "new GPU gaming chip," which unequivocally points to a consumer graphics (GeForce) product, and not a professional (Quadro) or HPC (Tesla) product.