News Posts matching #GeForce


Intel Arc Alchemist Xe-HPG Graphics Card with 512 EUs Outperforms NVIDIA GeForce RTX 3070 Ti

Intel's Arc Alchemist discrete lineup of graphics cards is scheduled for launch this quarter. We are now getting some performance benchmarks of the DG2-512EU silicon, which represents the top-end Xe-HPG configuration. Thanks to a discovery by the well-known hardware leaker TUM_APISAK, we have a benchmark entry in the SiSoftware database that shows Intel's Arc Alchemist GPU with 4096 cores and, according to the report, just 12.8 GB of GDDR6 VRAM. This is likely an error in the report, as this GPU SKU should be coupled with 16 GB of GDDR6 VRAM. The card was reportedly running at a 2.1 GHz frequency; however, we don't know whether this represents base or boost speeds.

When it comes to actual performance, the DG2-512EU GPU managed to score 9017.52 Mpix/s, while NVIDIA's GeForce RTX 3070 Ti managed 8369.51 Mpix/s in the same test group. Comparing the two cards across floating-point operations, Intel has the advantage in the half-float, double-float, and quad-float tests, while NVIDIA holds the single-float crown. Overall, this represents a roughly 7% advantage for Intel's GPU, meaning that Arc Alchemist has the potential to stand up against NVIDIA's offerings.

The Power of AI Arrives in Upcoming NVIDIA Game-Ready Driver Release with Deep Learning Dynamic Super Resolution (DLDSR)

NVIDIA yesterday announced the feature list of its upcoming game-ready GeForce driver, scheduled for public release on January 14th. Among the broad range of new game titles getting support, we are in for a surprise. According to a new blog post on NVIDIA's website, the forthcoming game-ready driver release will feature an AI-enhanced version of Dynamic Super Resolution (DSR), a feature that has been available in GeForce drivers for a while. The new AI-powered tech is what the company calls Deep Learning Dynamic Super Resolution, or DLDSR for short. It uses neural networks that require fewer input pixels while producing stunning image quality on your monitor.
NVIDIA explains:
Our January 14th Game Ready Driver updates the NVIDIA DSR feature with AI. DLDSR (Deep Learning Dynamic Super Resolution) renders a game at higher, more detailed resolution before intelligently shrinking the result back down to the resolution of your monitor. This downsampling method improves image quality by enhancing detail, smoothing edges, and reducing shimmering.

DLDSR improves upon DSR by adding an AI network that requires fewer input pixels, making the image quality of DLDSR 2.25X comparable to that of DSR 4X, but with higher performance. DLDSR works in most games on GeForce RTX GPUs, thanks to their Tensor Cores.
NVIDIA Deep Learning Dynamic Super Resolution
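The "2.25X comparable to 4X" claim is easier to parse once you note that DSR factors multiply the total pixel count, not each axis. A minimal sketch of that arithmetic (the helper name is ours, purely illustrative):

```python
import math

def dsr_render_resolution(width: int, height: int, factor: float) -> tuple[int, int]:
    """Internal render resolution for a DSR/DLDSR factor.

    Factors scale total pixel count, so each axis scales by sqrt(factor).
    """
    scale = math.sqrt(factor)
    return round(width * scale), round(height * scale)

print(dsr_render_resolution(1920, 1080, 2.25))  # (2880, 1620) - DLDSR 2.25X
print(dsr_render_resolution(1920, 1080, 4.0))   # (3840, 2160) - DSR 4X
```

So DLDSR 2.25X internally renders 1080p content at 1620p, yet NVIDIA claims image quality comparable to the 4K render of DSR 4X, which is where the performance advantage comes from.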

GIGABYTE Launches GeForce RTX 3080 Graphics Cards with 12 GB of VRAM

GIGABYTE TECHNOLOGY Co. Ltd, a leading manufacturer of premium gaming hardware, today announced upgraded versions of its GeForce RTX 3080 graphics cards with 12 GB of VRAM. GIGABYTE provides a variety of series to meet the needs of more customers. The AORUS series is recommended for enthusiasts who want ultimate performance and a colorful RGB appearance. The GAMING OC series is the best choice for performance-minded gamers. The EAGLE series is the best choice for those who desire a unique design. In addition to the excellent cooling system and distinctive appearance of each series, the cards have been upgraded to a larger 12 GB memory capacity to meet the needs of the latest graphics-intensive 4K games and heavy-duty 3D creation.

The AORUS GeForce RTX 3080 MASTER 12G graphics card is equipped with the MAX-Covered cooling system, which features three unique blade fans with a wind-claw design and alternate spinning to deliver optimal airflow across the entire heatsink. The large heatsink, large vapor chamber, and multiple composite heat pipes enable heat from the GPU and VRAM to dissipate quickly. Coupled with GIGABYTE's Screen Cooling technology, the extended heatsink fins allow air to pass through, forming an extremely efficient heat-dissipation system that ensures stable operation. A built-in LCD monitor and RGB Fusion 2.0 allow gamers to enjoy various display modes and lighting effects. The card is recommended for enthusiasts who want ultimate 4K gaming performance with colorful RGB lighting.

NVIDIA Launches GeForce 511.17 WHQL Drivers

NVIDIA today released the latest version of its GeForce driver software. Version 511.17 debuts the 510-series version numbering for the software. This driver introduces support for the GeForce RTX 3080 12 GB graphics card that the company stealthily launched today. The drivers also fix random stuttering observed in "Detroit: Become Human"; flickering text in Windows with 12-bpc color enabled; and the mouse pointer freezing when toggling HDR from within Windows Control Panel after enabling G-SYNC in NVIDIA Control Panel.
DOWNLOAD: NVIDIA GeForce 511.17 WHQL

NVIDIA GeForce RTX 3080 12 GB Edition Rumored to Launch on January 11th

During the CES 2022 keynote, we witnessed NVIDIA update its GeForce RTX 30 series family with the GeForce RTX 3050 and RTX 3090 Ti. However, this is not the end of NVIDIA's updates to the Ampere generation, as industry sources from Wccftech now suggest that we could see a GeForce RTX 3080 GPU with 12 GB of GDDR6X VRAM launched as a separate product. Compared to the regular RTX 3080, which carries only 10 GB of GDDR6X, the new 12 GB version is supposed to bring a slight bump to the specification list. The GA102-220 GPU SKU found inside the 12 GB variant will feature 70 SMs with 8960 CUDA cores, 70 RT cores, and 280 TMUs.

This represents a minor improvement over the regular GA102-200 silicon inside the 10 GB model. However, the significant difference is the memory organization. The new 12 GB model uses a 384-bit memory bus that allows the GDDR6X modules, running at 19 Gbps, to achieve a bandwidth of 912 GB/s. The overall TDP will also receive a bump to 350 Watts, compared to the 320 Watts of the regular RTX 3080 model. For more information regarding final clock speeds and pricing, we have to wait for the alleged launch date - January 11th.
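The quoted 912 GB/s follows directly from the bus width and per-pin data rate; as a quick sanity check (the helper function is our own illustration, and the 760 GB/s line assumes the regular RTX 3080's 320-bit bus at the same 19 Gbps):

```python
def memory_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth: (bus width in bits / 8 bits per byte) * Gbps per pin."""
    return bus_width_bits / 8 * data_rate_gbps

print(memory_bandwidth_gb_s(384, 19))  # 912.0 GB/s - RTX 3080 12 GB (384-bit)
print(memory_bandwidth_gb_s(320, 19))  # 760.0 GB/s - regular RTX 3080 (320-bit)
```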

Razer Announces All-New Blade Gaming Laptops at CES 2022

Razer, the leading global lifestyle brand for gamers (Hong Kong Stock Code: 1337), is kicking off 2022 with new Razer Blade gaming laptop models including the Razer Blade 14, Razer Blade 15, and Razer Blade 17. The world's fastest laptops for gamers and creators are equipped with the recently announced NVIDIA GeForce RTX 30 Series Laptop GPUs, up to an RTX 3080 Ti, making the new Blades better than ever, now shipping with Windows 11. All new Razer Blade gaming laptops now also include groundbreaking DDR5 memory, providing blistering clock speeds up to 4800 MHz, an increase in frequency by up to 50% compared to the previous generation.

"The Razer Blade series continues to be the best gaming laptop by providing desktop-class performance on-the-go," says Travis Furst, Senior Director of Razer's Systems business unit. "Additionally, we've enabled creators to work anywhere with gorgeous displays, available NVIDIA Studio drivers, and up to 14-Core CPUs. Users will have the ability to choose any model or configuration that best fits their gaming or creating needs, while getting the latest and greatest in graphics, memory and processing technology."

NVIDIA GeForce RTX 3080 Ti Mobile Brings 16 Gbps Memory and TGP of 175 Watts

NVIDIA is preparing to launch an ultimate solution for high-end gaming laptops and the gamers who benefit from high-performance graphics in mobile systems. Rumored to launch sometime in January, the GeForce RTX 3080 Ti mobile GPU SKU supposedly offers the highest performance in the Ampere mobile family. According to sources close to VideoCardz, team green is preparing to announce the RTX 3080 Ti mobile design with faster memory and a higher total graphics power (TGP). The memory speed gets an upgrade to 16 Gbps, compared to the 14 Gbps of the RTX 3080 mobile SKU.

Similarly, the TGP will receive a bump to 175 Watts, just a tad higher than the 165 Watt TGP of the RTX 3080 mobile. The Ti version will also upgrade the CUDA core count and other specifications, such as the TMU count, which remain undetermined. Currently, it is rumored that the Ti version could carry 7424 CUDA cores, an upgrade from the 6144 of the regular RTX 3080 mobile version.

Leaked Document Confirms That MSI GeForce RTX 3090 Ti SUPRIM X Graphics Card Launches January 27th

In the past few months, we have heard rumors of NVIDIA launching an upgraded version of the GA102 silicon called the GeForce RTX 3090 Ti. The upgraded version is supposed to max out the chip and bring additional performance to the table. According to anonymous sources of VideoCardz, MSI, one of NVIDIA's add-in board (AIB) partners, is preparing to update its SUPRIM X lineup of graphics cards with the MSI GeForce RTX 3090 Ti SUPRIM X, scheduled for a January 27th launch. This suggests that the official NDA for RTX 3090 Ti GPUs lifts on January 27th, meaning that we could see AIBs teasing their models very soon.

As a general reminder, the GeForce RTX 3090 Ti graphics card should use the GA102-350 silicon SKU with 84 SMs, 10752 CUDA cores, 336 TMUs, and 24 GB of GDDR6X memory running on a 384-bit bus at 21 Gbps for 1008 GB/s of bandwidth, all at a TBP of a whopping 450 Watts. If these specifications hold, the GPU could become the top contender in the market, albeit with the massive drawback of pulling nearly half a kilowatt of power.

NVIDIA Releases GeForce 497.29 WHQL Game Ready Drivers

NVIDIA today released the latest version of its GeForce Game Ready drivers. Version 497.29 WHQL comes game-ready for "GTFO," and with optimization for "Horizon Zero Dawn" and its newly added DLSS feature. Bugs fixed include a crash-to-desktop and visual artifacts in Microsoft Flight Simulator; a performance drop noticed in "Supreme Commander: Forged Alliance" and "Supreme Commander 2"; a stutter on the Windows desktop when the mouse is moved after an extended period of time; and some localization issues in NVIDIA Control Panel.

DOWNLOAD: NVIDIA GeForce 497.29 WHQL

LG's First Ever Ultragear Gaming Laptop Delivers Maximum Power and Convenience

LG Electronics USA today unveiled its first gaming laptop, expanding its premium UltraGear lineup and bringing exciting news to gamers worldwide. A powerful performer with a seriously sleek design, the CES 2022 Innovation Award-winning laptop delivers sublime gaming experiences that can be enjoyed anywhere, at any time. LG's take-anywhere gaming rig features an 11th Gen Intel Tiger Lake H processor, NVIDIA GeForce RTX 3080 Max-Q graphics card, dual-channel memory and an ultra-fast dual SSD setup. In addition to a 17-inch IPS panel with a 1 millisecond response time and a 300 Hz refresh rate, the LG UltraGear gaming laptop ensures immersive, fluid gameplay for even the most graphically demanding PC games thanks to the latest top-of-the-line hardware. Also, LG's cooling system with vapor chamber keeps the laptop running cool, even when pushed to the limits.

Sharing DNA with LG's lightweight gram laptops, the 17G90Q has a streamlined, highly portable design. The new, slim laptop features a large screen and a 93 Wh battery while maintaining a thickness of under 0.84 inches and a weight of less than 5.952 pounds. The LG UltraGear gaming laptop's aluminium casing offers style and durability, while the winged UltraGear badge on its exterior clearly communicates the power and quality for which LG's premium gaming brand is known.

Lightelligence's Optical Processor Outperforms GPUs by 100 Times in Some of The Hardest Math Problems

Optical computing has been a research topic for many startups and tech companies like Intel and IBM, all searching for a practical approach to a new way of computing. However, the most innovative solutions often come out of startups, and today is no exception. According to a report from EETimes, optical computing startup Lightelligence has developed a processor that outperforms regular GPUs by 100 times on some of the most challenging mathematical problems. As the report indicates, the Photonic Arithmetic Computing Engine (PACE) from Lightelligence manages to outperform regular GPUs, like NVIDIA's GeForce RTX 3080, by almost 100 times in the NP-complete class of problems.

More precisely, the PACE accelerator was tackling the Ising model, a model of a thermodynamic system used for understanding phase transitions, and achieved some impressive results: compared to the RTX 3080, it reached a 100-times speed-up. All of that was performed using 12,000 optical devices integrated onto a circuit and running at a 1 GHz frequency. Even compared to Toshiba's purpose-built simulated bifurcation machine based on FPGAs, which was designed specifically for Ising computations, the PACE still comes out ahead by 25 times. The PACE chip uses standard silicon photonics integration of Mach-Zehnder Interferometers (MZIs) for computing and MEMS to change the waveguide shape in the MZI.
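For context, the Ising model assigns an energy to a lattice of ±1 spins and asks for low-energy configurations, which is what makes general instances NP-hard. A toy nearest-neighbour version on a periodic 2D grid (illustrative only; PACE targets general coupling matrices, and all names here are ours):

```python
import math
import random

def ising_energy(spins):
    """Energy of a periodic 2D lattice with uniform coupling J = 1."""
    n = len(spins)
    energy = 0
    for i in range(n):
        for j in range(n):
            # Count each bond once via the right and down neighbours.
            energy -= spins[i][j] * (spins[i][(j + 1) % n] + spins[(i + 1) % n][j])
    return energy

def metropolis_step(spins, temperature):
    """Flip one random spin, accepted with the Metropolis criterion."""
    n = len(spins)
    i, j = random.randrange(n), random.randrange(n)
    neighbours = (spins[i][(j + 1) % n] + spins[i][(j - 1) % n]
                  + spins[(i + 1) % n][j] + spins[(i - 1) % n][j])
    delta = 2 * spins[i][j] * neighbours  # energy change if we flip (i, j)
    if delta <= 0 or random.random() < math.exp(-delta / temperature):
        spins[i][j] *= -1

# A fully aligned 4x4 lattice sits at the energy minimum of -2 per site.
print(ising_energy([[1] * 4 for _ in range(4)]))  # -32
```

Accelerators like PACE and Toshiba's bifurcation machine effectively race through an enormous number of such update steps in search of low-energy states.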
Lightelligence Photonic Arithmetic Computing Engine

NVIDIA Announces Three New Mobile GPUs With Spring 2022 Availability

NVIDIA has just announced three new mobile GPUs, although the question is how new any of them really are, as the model names suggest they're anything but. First up is the GeForce RTX 2050, which should be based on the Turing architecture. The other two GPUs are the GeForce MX550 and MX570, both presumably based on the Ampere architecture, although NVIDIA hasn't confirmed the specifics.

The GeForce RTX 2050 features 2048 CUDA cores (more than the mobile RTX 2060), but it has lower clock speeds and a vastly lower power draw of 30-45 Watts, depending on the notebook's design and cooling. It's also limited to 4 GB of GDDR6 memory on a 64-bit bus, which puts it in GeForce MX territory when it comes to memory bandwidth, as NVIDIA quotes a maximum of a mere 112 GB/s.

ASUS Prepares ROG Zephyrus Duo GX650 Laptop With Upcoming AMD Ryzen 9 6900HX and NVIDIA GeForce RTX 3080 Ti

Prominent chip designers like AMD and NVIDIA could bless consumers with a broader offering of new products as soon as CES 2022 arrives. AMD should present its rumored Rembrandt-H lineup of processors based on an enhanced Zen 3 core, sometimes referred to as Zen 3+. According to the latest report from MyLaptopsGuide, Bluetooth SIG has a data entry for ASUS's upcoming ROG Zephyrus Duo GX650 laptop, which integrates an AMD Rembrandt-H processor and NVIDIA GeForce RTX 30-series graphics. As the website claims, the heart of this laptop will be the AMD Ryzen 9 6900HX processor, built on TSMC's 6 nm manufacturing process. We don't know much about this model, but we expect it to be a refinement of the previous Ryzen 9 5900HX.

We again see the rumored mobile NVIDIA GeForce RTX 3080 Ti graphics card powering the graphics side of things. This model is supposedly based on the GA103S GPU SKU, which is likely tailor-made for, and exclusive to, laptops. ASUS has also paired 16 GB of DDR5-4800 RAM with the AMD Ryzen processor, suggesting that Rembrandt-H has a new memory controller in place. The laptop also has a 16-inch 300 Hz Full HD screen with an anti-glare coating; however, that is where the available information ends. We have to wait for the CES 2022 launch to find out more.

NVIDIA Announces Updated Open-Source Image Scaling SDK

For the past two years, NVIDIA has offered a driver-based spatial upscaler called NVIDIA Image Scaling and Sharpening for all your games, one that doesn't require game or SDK integrations to work. With the new November GeForce Game Ready Driver, NVIDIA has improved the scaling and sharpening algorithm, which now uses a 6-tap filter with 4 directional scaling and adaptive sharpening filters to boost performance. The company has also added an in-game sharpness slider, accessible via GeForce Experience, so you can make real-time adjustments to sharpness.

In contrast to NVIDIA DLSS, the algorithm is non-AI and non-temporal, using only information from the current low resolution image rendered by the game as an input. While the resulting image quality is best-in-class in comparison to scaling offered by monitors or other in-game scaling techniques, it lacks the temporal data and AI smarts of DLSS, which are required to deliver native resolution detail and robust frame-to-frame stability. By combining both NVIDIA DLSS and NVIDIA Image Scaling, the developer gets the best of both worlds: NVIDIA DLSS for the best image quality, and NVIDIA Image Scaling for cross-platform support. You can read how to enable the feature for any game down below.
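To make the non-AI, spatial-only nature of the algorithm concrete, here is a toy sharpening pass over a 2D grayscale array. This is our own plain unsharp mask, not NVIDIA's actual 6-tap directional filter, but it shows the same idea: each output pixel depends only on the current frame's local neighbourhood.

```python
def sharpen(image, strength=0.5):
    """Sharpen a 2D list of grayscale pixels; border pixels are left unchanged."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Unsharp mask: push the centre pixel away from its 4-neighbour mean.
            mean = (image[y - 1][x] + image[y + 1][x]
                    + image[y][x - 1] + image[y][x + 1]) / 4
            out[y][x] = image[y][x] + strength * (image[y][x] - mean)
    return out

# Flat regions pass through untouched; an isolated bright pixel gets boosted.
print(sharpen([[0, 0, 0], [0, 10, 0], [0, 0, 0]])[1][1])  # 15.0
```

DLSS, by contrast, additionally feeds previous frames and motion vectors through a neural network, which a purely spatial filter like this cannot replicate.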

Gigabyte Registers Four NVIDIA GeForce RTX 2060 12 GB Graphics Cards With the EEC

The on-again, off-again relationship between NVIDIA and its Turing-based RTX 2060 graphics card seems to be heading towards a new tipping point. As previously reported, NVIDIA is expected to be preparing another release cycle for its RTX 2060 graphics card - this time paired with an as-puzzling-as-it-is-gargantuan (for its shader performance) 12 GB of GDDR6 memory. Gigabyte has given us yet another hint at the card's expected launch by the end of this year or in early 2022 by registering four card models with the EEC (Eurasian Economic Commission). Gigabyte's four registered cards carry the model numbers GV-N2060OC-12GD, GV-N2060D6-12GD, GV-N2060WF2OC-12GD, and GV-N2060WF2-12GD. Do remember, however, that not all registered graphics cards actually make it to market.

NVIDIA's revival of the RTX 2060 under current market conditions speaks volumes. While NVIDIA is producing as many 8 nm cards as it can with foundry partner Samsung, the state of graphics card pricing leaves no doubt as to how well NVIDIA has been able to cope with the logistics and materials constraints currently affecting the semiconductor market. The 12 nm manufacturing process certainly has more available capacity than Samsung's 8 nm; at the same time, the RTX 2060's mining capabilities have been overtaken by graphics cards from the Ampere family, meaning that miners will most likely not look at these as viable options, thus improving availability for consumers as well - if the card does keep close to its expected $300 price point upon release, of course.

Truck Full of EVGA Graphics Cards Gets Stolen in California

Unexpected events tend to happen in the world of graphics cards, and today is no exception. According to a public announcement on the EVGA forums, a delivery truck full of the latest NVIDIA GeForce RTX 30 series graphics cards was stolen while en route from San Francisco to EVGA's Southern California distribution center. The vehicle carried countless GPUs ranging in MSRP from $329.99 up to $1959.99; however, the company doesn't specify how many GPUs are missing. It is important to note that any sale of these stolen GPUs is considered a felony, and if you have any information, please get in touch with EVGA at the stopRTX30theft@evga.com email address.
Here is the full announcement from EVGA's forum:

NVIDIA Releases GeForce 496.49 Game Ready Drivers

NVIDIA today released the latest version of its GeForce Game Ready drivers. Version 496.49 WHQL comes with optimization for "Marvel's Guardians of the Galaxy," including support for DLSS. It also features optimization for "Age of Empires IV," "Battlefield 2042" (Early Access), "Call of Duty Vanguard," "Chivalry 2," "GTA Definitive Edition Trilogy," "Forza Horizon 5," "Jurassic World Evolution," and "Riders Republic." The drivers also fix certain legacy issues with "DOOM 3: BFG Edition" and "Tom Clancy's The Division 2." A flickering issue noticed at 1080p on LG OLED C1 series monitors has been fixed. The drivers also add G-SYNC support for several new monitors, plus GeForce Experience optimizations. Grab the drivers from the link below.

DOWNLOAD: NVIDIA GeForce 496.49 WHQL

AxiomTek Introduces IPC970 4-slot IPC

Axiomtek - a world-renowned leader relentlessly devoted to the research, development, and manufacture of innovative, reliable, and highly efficient industrial computer products - is pleased to introduce the IPC970, its new feature-rich, expandable 4-slot industrial system. This intelligent industrial computer is powered by Intel Xeon or 10th generation Intel Core i7/i5/i3 processors (code name: Comet Lake-S) with the Intel W480E chipset. The powerful edge computing system supports an NVIDIA GeForce RTX 3090 graphics card with 10,496 CUDA cores and a whopping 24 GB of GDDR6X memory for powerful GPU computing capability. The ruggedized IPC970 enables simultaneous AI processing for intelligent AI computing at the edge.

Axiomtek's IPC970 provides flexible expansion options with one I/O module slot and three PCIe slots plus one blank expansion slot. In addition, it has a full-size PCIe Mini Card slot for Wi-Fi/Bluetooth/LTE modules, one M.2 Key B 3042/3050 socket for 5G wireless connections, and one M.2 Key E 2230 socket for Wi-Fi/Bluetooth modules. Its front-facing I/O design makes it easy to access and deploy. To ensure stable and reliable operation in harsh industrial environments, the IPC970 has a wide operating temperature range of -10°C to +70°C and a 24 V DC power input (uMin=19 V/uMax=30 V) with a power-on delay function, over-voltage protection, overcurrent protection, and reverse-voltage protection.

Apple M1 Max Beats GeForce RTX 3080 Laptop GPU in GFXBench 5.0, but Doesn't Shine in Geekbench

This should be taken with a fair helping of salt, considering GFXBench 5.0 is a mobile-device-focused benchmark, even though the company behind it claims it's platform independent. Regardless, it looks like the new 32-core GPU in Apple's M1 Max SoC offers some pretty competitive performance, as it manages to beat the GeForce RTX 3080 Laptop GPU in said test.

However, this is a median score for the GeForce RTX 3080 Laptop GPU, and many of the tests that make up GFXBench 5.0 aren't using DirectX, which is one likely reason for Apple's M1 Max GPU beating the NVIDIA card. On the other hand, all tests seem to support Metal, Apple's 3D API, whereas the NVIDIA card has to fall back to OpenGL, which tends to offer lower performance than DirectX in games. In most of the tests we're looking at an average performance advantage of less than 10 percent in favour of Apple, but it's nonetheless impressive considering that Apple hasn't been in the GPU business for very long.

Acer Announces Predator Orion 7000 Gaming PC Powered by Intel 12th Generation Core "Alder Lake" CPUs

Acer today announced the expansion of its Predator gaming portfolio with new Predator Orion 7000 series desktops, featuring powerful performance in a stunning design, and two smart 4K gaming projectors. Further enhancing the gaming experience is the Predator gaming desk, which offers two practical surface options and a convenient storage rack.

"Predator Orion 7000-series desktops are premium, powerful rigs for serious players who demand incredible performance from even the most demanding titles," said Jeff Lee, General Manager, Stationary Computing, IT Product Business, Acer Inc. "In order to offer that next-level performance, we're excited to be among the first companies bringing the new 12th Gen Intel Core CPUs to our product portfolio."

NVIDIA Releases Game Ready 496.13 WHQL GeForce Graphics Driver, Support Removed for Windows 8.1/8/7 & Kepler

NVIDIA has today launched its 496.13 game-ready WHQL GeForce graphics driver with many improvements and changes. Starting with the naming, the company has jumped from the 472.12 WHQL version released on September 20th to the 496.xx numbering released today. Such a significant jump in version numbering is uncommon and makes us wonder why the company decided on it - probably in preparation for the Windows 11 branch of its drivers, which uses version 500.

Starting from release 496.13, NVIDIA has also removed support for Windows 8.1, Windows 8, and Windows 7; the last driver to support these operating systems is 472.12. This makes some sense, since between that release and today, Microsoft launched its Windows 11 operating system. NVIDIA has also trimmed more fat by removing support for the Kepler architecture, launched in 2012, which included models like the GeForce GTX 780 Ti, GTX 780, GTX 770, GTX 760, GT 740, GT 730, GTX 690, GTX 680, GTX 670, GTX 660 Ti, GTX 660, GTX 650 Ti, and GT 630.

Update 15:57 UTC: Added confirmation from NVIDIA
Download NVIDIA GeForce Graphics Drivers 496.13 WHQL.

NVIDIA DLSS Gets Ported to 10 Additional Titles, Including the New Back 4 Blood Game

NVIDIA's Deep Learning Super Sampling (DLSS) technology has been one of the main selling points of GeForce RTX graphics cards. With the broad adoption of the technology among many popular game titles, the gaming community has enjoyed the AI-powered upscaling that boosts frame-rate output and delivers better overall performance. Today, the company announced that DLSS has arrived in 10 additional game titles, including today's release of Back 4 Blood, as well as Baldur's Gate 3, Chivalry 2, Crysis Remastered Trilogy, Rise of the Tomb Raider, Shadow of the Tomb Raider, Sword and Fairy 7, and Swords of Legends Online.

With so many titles receiving the DLSS update, NVIDIA recommends using the latest GeForce driver to achieve the best possible performance in the listed games. If you are wondering just how much DLSS adds, RTX GPUs see a 46% boost in FPS in the newest Back 4 Blood title. Similar performance gains translate to the other titles that received the DLSS patch. You can expect more than double the number of frames in titles like Alan Wake Remastered, the Tomb Raider saga, and FIST.
For more information about performance at 4K resolution, please see the slides supplied by NVIDIA below.

ASUS GeForce RTX 3070 With Noctua Cooling Appears

Back in August, we saw rumors of ASUS collaborating with Austrian cooling specialist Noctua to develop a set of graphics cards with custom cooling solutions provided by Noctua. Early EEC listings referenced a card based on NVIDIA's GeForce RTX 3070 GPU with custom Noctua cooling. We thought such a collaboration would open many doors for both companies. Today, the product finally looks like a genuine offering, and we have the first set of pictures thanks to ASUS Vietnam.

The card, pictured below, has dual 8-pin power connectors to supply the chip, and three DisplayPort connectors accompanied by two HDMI outputs. It has a three-slot body in Noctua's iconic brown theme, featuring two large fans to cool the heatsink. While we don't have any further information, the Vietnamese pricing is supposed to stand at 26 million VND, translating to around 1137.62 US Dollars at the time of writing.

NVIDIA Prepares to Deliver Deep Learning Anti-Aliasing Technology for Improved Visuals, Coming first to The Elder Scrolls Online

Some time ago, NVIDIA launched its Deep Learning Super Sampling (DLSS) technology to deliver AI-upscaled images in your favorite AAA titles. It uses proprietary algorithms developed by NVIDIA and relies on the computational power of the Tensor cores found in GeForce RTX graphics cards. In the early days of DLSS, NVIDIA talked about an additional technology called DLSS2X, which was supposed to be based on the same underlying techniques as DLSS but perform only image enhancement, without any upscaling. That technology got its official name today: Deep Learning Anti-Aliasing, or DLAA for short.

DLAA uses technology similar to DLSS and aims to bring NVIDIA's AI-based image enhancement to video games at native resolution. It uses the Tensor cores found in GeForce RTX graphics cards to provide much better visual quality without sacrificing performance, as it runs on dedicated cores. The technology will reportedly be offered alongside DLSS and other anti-aliasing options in in-game settings. The first game to support it is The Elder Scrolls Online, which now has it on the public beta test server; it will be available to the general public later on.

HYTE Unveils the new SFF Revolt 3 PC Case as its Premier Product

[Editor's note: We have published the review of HYTE Revolt 3 Case here.]

HYTE, the new PC components and lifestyle brand of iBUYPOWER, a leading manufacturer of high-performance custom gaming PCs, today released its premier product, the Revolt 3 Mini-ITX PC case. Previously announced during CES 2021 as the Revolt 3 MK3, the Revolt 3 was designed with careful consideration for DIY PC enthusiasts, gamers, and creators alike.

"iBUYPOWER is excited to introduce its new sub-brand, HYTE, to our community with its very first product, the Revolt 3" said Darren Su, Executive Vice President of iBUYPOWER. "With over 20 years of experience as a systems integrator we felt like we had a unique perspective to bring to the table when developing PC Components. We approached the Revolt 3 with the goal of designing a case with the freedom and flexibility that would allow the use of a wide range of components without imposing performance restrictions based on the size of the case."