News Posts matching #Graphics

Return to Keyword Browsing

The Curious Case of the 12-pin Power Connector: It's Real and Coming with NVIDIA Ampere GPUs

Over the past few days, we've heard chatter about a new 12-pin PCIe power connector for graphics cards being introduced, particularly from Chinese-language publication FCPowerUp, which included a picture of the connector itself. Igor's Lab also did an in-depth technical breakdown of the connector. TechPowerUp has some new information on this from a well-placed industry source. The connector is real, and will be introduced with NVIDIA's next-generation "Ampere" graphics cards. The connector appears to be NVIDIA's brainchild, and not that of any other IP or trade group, such as the PCI-SIG, Molex or Intel. The connector was designed in response to two market realities: high-end graphics cards inevitably need two power connectors, and a single cable is neater for consumers than wrestling with two; while lower-end (<225 W) graphics cards can make do with one 8-pin or 6-pin connector.

The new NVIDIA 12-pin connector has six 12 V and six ground pins. Its designers specify higher-quality contacts on both the male and female ends, which can handle higher current than the pins on 8-pin/6-pin PCIe power connectors. Depending on the PSU vendor, the 12-pin connector can even split in the middle into two 6-pin halves, and could be marketed as "6+6 pin." The point of contact between the two 6-pin halves is kept level so they align seamlessly.
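To see why six 12 V pins with better contacts matter, here is a rough power-budget sketch. The per-pin current ratings below are illustrative assumptions, not official specs — the article only says the 12-pin uses higher-quality, higher-current contacts than standard 8-pin/6-pin PCIe connectors.

```python
# Rough, hypothetical power-budget comparison for PCIe power connectors.
# Per-pin amp ratings are placeholder assumptions for illustration only.

def connector_power(live_pins, amps_per_pin, volts=12.0):
    """Theoretical maximum power delivered through a connector's 12 V pins."""
    return live_pins * amps_per_pin * volts

# 8-pin PCIe: three 12 V pins (hypothetical ~6 A contacts)
eight_pin = connector_power(live_pins=3, amps_per_pin=6.0)

# NVIDIA 12-pin: six 12 V pins (hypothetical ~8 A higher-current contacts)
twelve_pin = connector_power(live_pins=6, amps_per_pin=8.0)

print(f"8-pin:  {eight_pin:.0f} W theoretical")
print(f"12-pin: {twelve_pin:.0f} W theoretical")
```

Under these assumed ratings, a single 12-pin can out-deliver two 8-pin connectors, which is the point of replacing a dual-connector setup with one cable.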

ASRock Launches Radeon RX 5600 XT Challenger Pro 6G OC Graphics Card

The leading global motherboard, graphics card and mini PC manufacturer, ASRock, has launched the new Radeon RX 5600 XT Challenger Pro 6G OC three-fan graphics card. The Radeon RX 5600 XT Challenger Pro 6G OC features ASRock's newly styled shroud design with upgraded cooling fins, AMD's second-generation 7 nm Radeon RX 5600 XT GPU, plus 6 GB of 192-bit GDDR6 memory and a PCI Express 4.0 bus. The ASRock Radeon RX 5600 XT Challenger Pro 6G OC graphics card provides excellent overclocking settings, enabling users to enjoy a smooth 1080p gaming experience.

The ASRock Radeon RX 5600 XT Challenger Pro 6G OC adopts AMD's second-generation Radeon RX 5600 XT GPU. With factory-default settings, this new graphics card reaches base/game/boost clocks of 1420/1615/up to 1750 MHz respectively. The boost clock setting is 4% higher than AMD's reference settings. Furthermore, the GDDR6 memory clock is set at 1750 MHz, roughly 17% faster than AMD's default of 1500 MHz. The ASRock Radeon RX 5600 XT Challenger Pro 6G OC is equipped with a three-fan cooler, 6 GB of 192-bit GDDR6 memory and the latest PCI Express 4.0 bus standard, ideally partnering with AMD Ryzen 3000 CPU systems and ASRock B550 and X570 motherboards. These premium specifications allow the Radeon RX 5600 XT Challenger Pro 6G OC to deliver outstanding performance and bring users an excellent 1080p gaming experience.
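The quoted memory overclock percentage checks out with simple arithmetic (clock values taken from the press release above):

```python
# Verify the memory overclock percentage quoted for the
# ASRock RX 5600 XT Challenger Pro 6G OC.

def oc_percent(card_mhz, reference_mhz):
    """Percentage by which card_mhz exceeds reference_mhz."""
    return (card_mhz / reference_mhz - 1) * 100

# GDDR6 memory: 1750 MHz on the card vs. AMD's 1500 MHz default
mem_oc = oc_percent(1750, 1500)
print(f"Memory OC: {mem_oc:.1f}%")  # ~16.7%, rounded to 17% in the release
```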

Cyberpunk 2077 Graphics Comparison Video Between 2018 and 2020 Builds Shows Many Differences

Cyberpunk 2077 is the year's most awaited game release, and has been met with not one but two delays already. Originally expected to ship in April of this year, it has since been postponed to September, and now to November 19th, on account of extra optimization and bug quashing by developer CD Projekt Red. However, the recent gameplay videos released by the developer showcase the amount of work that has gone into the engine since 2018, when we were first treated to a gameplay video.

The video after the break comes courtesy of YouTube user 'Cycu1', who set up the 2018 and 2020 trailers side by side. In it, you can see substantial improvements to overall level and character detail (some of which can certainly be attributed to heavier video compression on the 2018 footage). However, the video also showcases some lighting differences (whether these work out for better or worse is subjective, but the new videos supposedly make use of ray tracing). Another point I'd like to call your attention to is that there seem to be some environment differences between the two versions - some environments appear simplified compared to their 2018 counterparts, such as in the "Going Pro" mission, where the chair and panels were removed from the environment and replaced by what looks like a garage door. Whether this was done to improve performance is within CD Projekt Red's purview.

New AMD Radeon Pro 5600M Mobile GPU Brings Desktop-Class Graphics Performance and Enhanced Power Efficiency to 16-inch MacBook Pro

AMD today announced availability of the new AMD Radeon Pro 5600M mobile GPU for the 16-inch MacBook Pro. Designed to deliver desktop-class graphics performance in an efficient mobile form factor, this new GPU powers computationally heavy workloads, enabling pro users to maximize productivity while on-the-go.

The AMD Radeon Pro 5600M GPU is built upon industry-leading 7 nm process technology and advanced AMD RDNA architecture to power a diverse range of pro applications, including video editing, color grading, application development, game creation and more. With 40 compute units and 8 GB of ultra-fast, low-power High Bandwidth Memory (HBM2), the AMD Radeon Pro 5600M GPU delivers superfast performance and excellent power efficiency in a single GPU package.

Another Nail in the Intel Kaby Lake-G Coffin as AMD Pulls Graphics Driver Support

Kaby Lake-G was the result of one of the strangest collaborations in the industry - though that may not be a fair way of looking at it. It made total sense at the time: a product that combined the world's best CPU design with one of the foremost graphics architectures seems a recipe for success. However, the Intel-AMD collaboration was an unexpected one, as these two rivals were never expected to see eye to eye in any meaningful way. Kaby Lake-G was revolutionary in how it combined both AMD and Intel IP in an EMIB-capable design, but it wasn't one built to last.

Now, after Intel announced a stop to product manufacturing and order capacity, the time has come for AMD to pull driver support. The company's latest Windows 10 version 2004-compatible drivers don't install on Kaby Lake-G powered systems, citing an unsupported hardware configuration. Tom's Hardware contacted Intel, who said they're working with AMD to bring back "Radeon graphics driver support to Intel NUC 8 Extreme Mini PCs (previously codenamed "Hades Canyon")." AMD, however, still hasn't commented on the story.

AMD Declares That The Era of 4GB Graphics Cards is Over

AMD has declared that the era of 4 GB graphics cards is over and that users should "Game Beyond 4 GB". AMD conducted testing of its 4 GB and 8 GB RX 5500 XT models to see how much of a difference VRAM can make on gaming performance. AMD tested the cards on a variety of games at 1080p high/ultra settings with a Ryzen 5 3600X and 16 GB of 3200 MHz RAM; on average, the 8 GB model performed ~19% better than its 4 GB counterpart. With next-gen consoles featuring 16 GB of combined memory and developers showing no sign of slowing down, it will be interesting to see what happens.
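The "~19% better on average" figure is the kind of number produced by averaging per-game uplifts. A minimal sketch of that calculation follows; the per-game FPS pairs below are hypothetical placeholders, since AMD only published the averaged result.

```python
# Sketch of averaging per-game uplift, as behind AMD's "~19%" claim.
# FPS pairs are hypothetical placeholders, not AMD's actual data.

def avg_uplift(results):
    """Mean percentage uplift of the second card over the first, per game."""
    uplifts = [(b / a - 1) * 100 for a, b in results]
    return sum(uplifts) / len(uplifts)

# (4 GB FPS, 8 GB FPS) per game -- invented example values
games = [(50, 60), (40, 47), (70, 83)]
print(f"{avg_uplift(games):.1f}% average uplift")
```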

Intel Jasper Lake CPU Appears with Gen11 Graphics

Intel is preparing to update its low-end segment designed for embedded solutions with a next-generation CPU codenamed Jasper Lake. The popular hardware sleuth and leaker _rogame has found a benchmark showing that Intel is about to bless the low end with a lot of neat stuff. The benchmark results show a four-core, four-thread CPU running at a 1.1 GHz base clock with a 1.12 GHz boost clock. Even though these clocks are low, this is only a sample; the actual frequency will be much higher, expected to be near 3 GHz. The CPU was spotted in a configuration rocking 32 GB of DDR4 SODIMM memory.

Jasper Lake is meant to be a successor to Gemini Lake, and it will use Intel's Tremont CPU architecture designed for low-power scenarios. Built on Intel's 10 nm manufacturing node, this CPU should bring x86 processors to a wide range of embedded systems. Although the benchmark didn't mention which graphics the CPU will be paired with, _rogame speculates that Intel will use its Gen11 graphics IP. That would be a nice update over Gemini Lake's Gen9.5 graphics, bringing better display output options and more speed. These CPUs are designed for the Atom/Pentium/Celeron lineup, just like Gemini Lake before them.

Update: Updated the article to reflect the targeted CPU category.

Intel Posts Windows 10 May 2020 Update-ready Graphics Drivers

Intel today released its first Graphics Drivers ready for the upcoming Windows 10 May 2020 Update (2004). Version 27.20.100.8187 of the Intel Graphics Drivers is WDDM 2.7 compliant, which means support for Shader Model 6.5 and Dolby Vision on Gen 9.5 or later iGPUs. The drivers also add readiness for oneAPI, Intel's ambitious unified programming model for x86 processors, iGPU execution units, and future Xe compute processors. For gamers, the latest drivers add optimizations for "Gears Tactics," "XCOM: Chimera Squad," and "Call of Duty: Modern Warfare 2 Campaign Remastered" on Iris Plus or later iGPUs. As with the previous drivers, these drivers are OEM-unlocked.
DOWNLOAD: Intel Graphics Drivers 27.20.100.8187

Intel Teases "Big Daddy" Xe-HP GPU

The Intel Graphics Twitter account was on fire today, posting an update on the development of the Xe graphics processor and mentioning that samples are ready and packed up in quite an interesting package. The processor in question was discovered to be an Xe-HP GPU variant with an estimated die size of 3700 mm², which means we are surely talking about a multi-chip package here. We concluded that it is the Xe-HP GPU from the words of Raja Koduri, senior vice president, chief architect, and general manager for Architecture, Graphics, and Software at Intel. He made a tweet, later deleted, saying this processor is the "baap of all", meaning "big daddy of them all" when translated from Hindi.

Mr. Koduri previously tweeted a photo of the Intel Graphics team in India, which has been working on the same "baap of all" GPU, suggesting this is an Xe-HP chip. This does not seem to be the version of the GPU made for HPC workloads (that role is reserved for the Xe-HPC GPU); this model could instead be a direct competitor to offerings like NVIDIA Quadro or AMD Radeon Pro. We can't wait to learn more about Intel's Xe GPUs, so stay tuned. Mr. Koduri has confirmed that this GPU will be used only for data-centric applications, as it is needed to "keep up with the data we are generating". He also added that the focus for gaming GPUs is to start off with better integrated GPUs, and low-power chips above that, which could reach millions of users. That will be a good beginning, as it will enable software preparation for possible high-performance GPUs in the future.

Update May 2: changed "father" to "big daddy", as that's the better translation for "baap".
Update 2, May 3rd: The GPU is confirmed to be a Data Center component.

AMD Reports First Quarter 2020 Financial Results

AMD today announced revenue for the first quarter of 2020 of $1.79 billion, operating income of $177 million, net income of $162 million and diluted earnings per share of $0.14. On a non-GAAP* basis, operating income was $236 million, net income was $222 million and diluted earnings per share was $0.18.

"We executed well in the first quarter, navigating the challenging environment to deliver 40 percent year-over-year revenue growth and significant gross margin expansion driven by our Ryzen and EPYC processors," said Dr. Lisa Su, AMD president and CEO. "While we expect some uncertainty in the near-term demand environment, our financial foundation is solid and our strong product portfolio positions us well across a diverse set of resilient end markets. We remain focused on strong business execution while ensuring the safety of our employees and supporting our customers, partners and communities. Our strategy and long-term growth plans are unchanged."

Khronos Group Releases OpenCL 3.0

Today, The Khronos Group, an open consortium of industry-leading companies creating advanced interoperability standards, publicly releases the OpenCL 3.0 Provisional Specifications. OpenCL 3.0 realigns the OpenCL roadmap to enable developer-requested functionality to be broadly deployed by hardware vendors, and it significantly increases deployment flexibility by empowering conformant OpenCL implementations to focus on functionality relevant to their target markets. OpenCL 3.0 also integrates subgroup functionality into the core specification, ships with a new OpenCL C 3.0 language specification, uses a new unified specification format, and introduces extensions for asynchronous data copies to enable a new class of embedded processors. The provisional OpenCL 3.0 specifications enable the developer community to provide feedback on GitHub before the specifications and conformance tests are finalized.

Intel iGPU+dGPU Multi-Adapter Tech Shows Promise Thanks to its Realistic Goals

Intel is revisiting the concept of asymmetric multi-GPU introduced with DirectX 12. The company posted an elaborate technical slide deck it originally planned to present to game developers at the now-cancelled GDC 2020. The technology shows promise because the company isn't insulting developers' intelligence by proposing that the dormant iGPU be made to shoulder the game's entire rendering pipeline for a single-digit percentage performance boost. Rather, it has come up with innovative augmentations to the rendering path such that only certain lightweight compute aspects of the game's rendering are passed on to the iGPU's execution units, so it makes a more meaningful contribution to overall performance. To that effect, Intel is working on an SDK that can be integrated with existing game engines.

Microsoft DirectX 12 introduced the holy grail of multi-GPU technology under its Explicit Multi-Adapter specification. This allows game engines to send rendering traffic to any combination or make of GPUs that support the API, to achieve a performance uplift over a single GPU. It was met with a lukewarm reception from AMD and NVIDIA, and far too few DirectX 12 games actually support it. Intel proposes a specialization of the explicit multi-adapter approach, in which the iGPU's execution units process various low-bandwidth elements during both the rendering and post-processing stages, such as occlusion culling, AI, and game physics. Intel's method leverages cross-adapter shared resources sitting in system memory (main memory), and D3D12 asynchronous compute, which creates separate processing queues for rendering and compute.

Intel Rocket Lake-S Platform Detailed, Features PCIe 4.0 and Xe Graphics

Intel's upcoming Rocket Lake-S desktop platform is expected to arrive sometime later this year; however, we didn't have any concrete details on what it will bring. Thanks to exclusive information obtained by VideoCardz's sources at Intel, there are now some more details regarding the RKL-S platform. To start, the RKL-S platform is based on a 500-series chipset. This is an iteration of the upcoming 400-series chipset, and it features many platform improvements. The 500-series chipset motherboards will supposedly have an LGA 1200 socket, an improvement in pin count compared to the LGA 1151 socket found on 300-series chipsets.

The main improvement is the CPU core itself, supposedly a 14 nm adaptation of Tiger Lake-U based on the Willow Cove core. This design represents a backport of IP to an older manufacturing node, which results in a bigger die due to the larger node used. When it comes to platform improvements, it will support the long-awaited PCIe 4.0 connection already present on competing platforms from AMD. This will enable much faster SSD speeds, as there are already PCIe 4.0 NVMe devices that run at 7 GB/s. With RKL-S, there will be 20 PCIe 4.0 lanes, of which four would go to the NVMe SSD and 16 to the PCIe slots for GPUs. Another interesting feature of RKL-S is the addition of Xe graphics on the CPU die, serving as the iGPU. Supposedly based on Gen12 graphics, it will bring support for HDMI 2.0b and DisplayPort 1.4a connectors.

Intel Xe Graphics to Feature MCM-like Configurations, up to 512 EU on 500 W TDP

A reportedly leaked Intel slide via DigitalTrends has given us a load of information on Intel's upcoming take on the high-performance graphics accelerator market, in both its server and consumer iterations. Intel's Xe has already been cause for much discussion in a market that has only really seen two real competitors for ages now - the arrival of a third player with muscle and brawn such as Intel against the already-established NVIDIA and AMD would surely spark competition in the segment - and competition is the lifeblood of advancement, as we've recently seen with AMD's Ryzen CPU line.

The leaked slide reveals that Intel will be looking to employ a Multi-Chip-Module (MCM) approach for its high-performance "Arctic Sound" graphics architecture. The GPUs will be available in up to 4-tile configurations (tile being the name Intel gives each module), joined via Foveros 3D stacking (first employed in Intel Lakefield). The leaked slide shows Intel's approach starting with a 1-tile GPU (with only 96 of its 128 total EUs active) for the entry-level market (at 75 W TDP), à la the DG1 SDV (Software Development Vehicle).

Intel DG1 Discrete GPU Shows Up with 96 Execution Units

As we approach 2020, when Intel is rumored to launch its discrete graphics cards into the hands of consumers around the world, the number of leaks about the upcoming products is ramping up. Thanks to Twitter user @KOMACHI_ENSAKA, who found the latest EEC listing, we have new information regarding Intel's upcoming DG1 discrete graphics solution.

In the leaked EEC listing, the DG1 GPU is presented as a GPU with 96 execution units, meaning that Intel is planning to take on entry-level graphics cards with this GPU. If the graphics unit follows the same design principle as previous-generation GPUs, there should be around 8 shading units per execution unit, totaling 768 shading units for the whole DG1 GPU. If the 12th Gen Xe design inside the DG1 follows a different approach, we could see double the number of shading units, meaning 1536 in total.
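The back-of-the-envelope estimate above boils down to one multiplication; the 8-ALUs-per-EU ratio is the assumption carried over from previous Intel generations, and the doubled figure covers the possibility of a changed Xe design:

```python
# Shading-unit estimate for Intel's DG1, following the article's reasoning.

EXECUTION_UNITS = 96
SHADERS_PER_EU = 8   # assumption carried over from previous Intel generations

baseline = EXECUTION_UNITS * SHADERS_PER_EU  # Gen9.5-style estimate
doubled = baseline * 2                       # if the Xe design changes the ratio

print(baseline, doubled)  # 768 1536
```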

MSI Prepares Another Version of AMD Radeon RX 580 Armor Graphics Card

While AMD is giving all signs of preparing to release its latest entries into the midrange graphics card market in the form of the RX 5500 and RX 5300 series based on Navi, AMD's AIB partners are slowly burning through existing inventories of AMD's Polaris graphics chips. MSI, in this case, seems to have bet on a slight redesign of its previously released RX 580 Armor and Armor MK2.

Changed is the color scheme - MSI went full black on this one. There's also a redesigned PCB, a redesigned I/O bracket (which keeps four display connectors), and a new cooler shroud. The heatsink's surface area also seems to have been increased, which should provide lower operating temperatures (anything beyond that, such as higher overclockability and longer lifespan, is speculation). The redesigned Armor keeps the single 8-pin PCIe power connector. No other details are available at the time of writing.

The End of a Collaboration: Intel Announces Discontinuation of Kaby Lake-G with AMD Radeon Vega Graphics

The marriage of Intel and AMD IP in the form of the Kaby Lake-G processors was met with both surprised grunts and a sense of bewilderment at what could come next. Well, we now know what came next: Intel hiring several high-level AMD employees in the graphics space and putting together its own motley crew of discrete GPU developers, who should be putting out Intel's next-gen high-performance graphics accelerators sometime next year.

The Kaby Lake-G processors, however, showed promise, pairing Intel's (at the time) IPC dominance with AMD's graphics IP performance and expertise in a single package, by placing the two components on the same substrate and connecting them via a PCIe link. A new and succinct Intel notice on the Kaby Lake-G page sets January 31, 2020, as the last date for orders and July 31, 2020, as the date of last shipments, and explains that product market shifts have moved demand from Kaby Lake-G products "to other Intel products". Uptake was always slow for this particular collaboration - most of it, we'd guess, because of the chips' strange footprint arrangement for embedding in systems, which required custom solutions designed from scratch. And with Intel investing in its own high-performance graphics, it seems clear there is just no need to flaunt previous collaborations with other companies in this field. Farewell, Intel-AMD Kaby Lake-G. We barely knew you.

Sapphire Teases RX 5700 XT NITRO Graphics Card

Sapphire China has let out a partial teaser of its upcoming RX 5700 XT NITRO graphics card, which is nearing release and has apparently been finalized. The design shows a slightly longer graphics card than we're used to from Sapphire in recent generations (judging by the 8-pin port placement and component ratio in the provided image).

The card will seemingly sport a silver-based finish (with what look like black details in the fan area), and features an RGB "SAPPHIRE" detail on the side, as well as what looks like an RGB strip along the PCB. From the image, it seems Sapphire will use a triple-fan cooling solution on this one. Being set in the NITRO series, it's currently unknown whether this will be a NITRO or NITRO+ product.

Intel Adds Integer Scaling Support to their Graphics Lineup

Intel's Lisa Pearce today announced on Twitter that the company has listened to user feedback from Reddit and will add nearest-neighbor integer scaling to its future graphics chips. Integer scaling is the holy grail for gamers using console emulators, because it gives them the ability to simply double, triple, or quadruple existing pixels without the loss in sharpness inherent to traditional upscaling algorithms like bilinear or bicubic. This approach also avoids the ringing artifacts that come with other, more advanced scaling methods.
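The idea is simple enough to sketch in a few lines: each source pixel is repeated into an s-by-s block, so no new colors are interpolated and pixel-art edges stay perfectly sharp. A minimal pure-Python illustration (operating on a nested list standing in for an image):

```python
def integer_scale(pixels, s):
    """Nearest-neighbor integer scaling of a 2-D pixel grid (list of rows):
    every source pixel becomes an s-by-s block of identical copies,
    so no interpolation occurs and edges remain sharp."""
    out = []
    for row in pixels:
        stretched = [p for p in row for _ in range(s)]    # widen each pixel
        out.extend([list(stretched) for _ in range(s)])   # repeat the row s times
    return out

# A 2x2 "image" scaled 3x becomes 6x6:
src = [[0, 1],
       [2, 3]]
for row in integer_scale(src, 3):
    print(row)
```

Bilinear or bicubic scaling would instead blend neighboring values, which is exactly the softness emulator users want to avoid.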

In her Twitter video, Lisa explained that this feature will only be available on upcoming Gen 11 graphics and beyond - previous GPUs lack the hardware required to implement integer scaling. In terms of timeline, she mentioned that it will be part of the driver "around end of August", which also puts some constraints on the launch date of Gen 11; based on that statement, it seems to be coming sooner rather than later.

ASRock Launches Radeon RX 5700 Performance Gaming GPU Series

The leading global motherboard, graphics card and mini PC manufacturer, ASRock, launches its flagship product: the Radeon RX 5700 series graphics cards, featuring AMD's latest Radeon RX 5700 gaming GPU and 8 GB of 256-bit GDDR6 memory. Great gaming experiences are created by bending the rules; take control and forge your own path with the Radeon RX 5700 series, and experience powerful accelerated gaming customized for you.

The Radeon RX 5700 series GPUs are powered by the new RDNA architecture - the heart of AMD's advanced 7 nm process technology. RDNA features up to 40 completely redesigned "Compute Units" delivering incredible performance and up to 4x IPC improvements, new instructions better suited for visual effects such as volumetric lighting, blur effects, and depth of field, and a multi-level cache hierarchy for greatly reduced latency and highly responsive gaming. The RDNA architecture enables DisplayPort 1.4 with Display Stream Compression for extreme refresh rates and resolutions on cutting-edge displays for insanely immersive gameplay.

AMD Releases Radeon Software Adrenalin 2019 19.6.2 Drivers

AMD today released the latest version of their Radeon Software Adrenalin 2019 Edition drivers. The 19.6.2 beta release offers no performance improvements; however, it does add support for various Vulkan extensions while solving a few problems. The fixed issues include: Crackdown 3 experiencing application hangs on Radeon R7 370 series products; wireless VR experiencing performance drops across various titles on some Radeon RX 400 and 500 series graphics products; and the Performance Metrics Overlay failing to enable or being disabled when toggling Radeon Overlay in-game, just to name a few. You can grab this latest driver release from the link below.

DOWNLOAD: AMD Radeon Software Adrenalin 19.6.2 beta
The change-log follows.

AMD and Samsung Announce Strategic Partnership in Ultra Low Power, High Performance Graphics Technologies with RDNA

AMD and Samsung today announced a key strategic partnership in high-performance graphics technologies for the mobile space. The agreement means that Samsung will license AMD's Radeon graphics IP - in its latest RDNA iteration, no less - for integration into smartphone graphics processing. Let me stress how impressive this is: AMD has developed a graphics architecture that can scale from a high-performance GPU down to a nimble, fast, power-sipping module for mobile graphics processing.

This is a huge strategic win for AMD, in that more and more products will be infused with its technology. As Lisa Su puts it, the Radeon user base and development ecosystem will be greatly expanded by this Samsung integration - as will AMD's revenue, for sure. Perhaps we'll see a "Powered by AMD Radeon" sticker or engraving on our future Samsung smartphones, as we do with Leica partnerships, for example.

Rumor: AMD Navi a Stopgap, Hybrid Design of RDNA and GraphicsCoreNext

The Wheel of Rumors turns, and assumptions come and pass, sometimes leaving unfulfilled hopes and dreams. In this case, the rumor mill, in what seems like a push from SweClockers, places Navi not as a built-from-the-ground-up architecture, but rather as a highly customized iteration of GCN - iterated, to be exact, in the parts where it actually implements AMD's RDNA architecture. And this makes sense for a number of reasons - it's certainly not anything to cry wolf about.

For one, AMD's GCN has been a mainstay in the graphics computing world since it was first introduced back in 2012, succeeding the company's TeraScale architecture. Game engines and assorted software have been well optimized already to take advantage of AMD's design - even with its two ISAs and assorted improvements over the years. One of the most important arguments is derived from this optimization effort: AMD's custom designs for the console market employ architectures that are GCN-based, and thus, any new architecture that would be used by both Microsoft and Sony for their next-generation consoles would have to be strictly backwards compatible.

ASRock Unveils World's First Thunderbolt 3 Graphics Card

ASRock will be showing the world's first Thin Mini-ITX form factor graphics card - the RX570TM-ITX/TBT - which adopts the AMD Radeon RX 570 GPU and supports Intel Thunderbolt 3 technology. The RX570TM-ITX/TBT is not only the first graphics card with a Thunderbolt 3 Type-C interface, supporting 40 Gb/s transfer rates and power delivery, but it also features four USB 3.1 Gen1 ports, Gigabit Ethernet, and a SATA 6 Gb/s interface.

The RX570TM-ITX/TBT graphics card is based on the Thin Mini-ITX concept. With this graphics card, all-in-one PC manufacturers can develop their own Thunderbolt monitors with 3D performance and power charging with less effort. Moreover, it is able to fit in a Mini-ITX chassis. By connecting a notebook through the Thunderbolt 3 Type-C interface, it increases the notebook's 3D computing power and charges the notebook at the same time.

Intel Introduces its New Graphics Command Center App, Paving the Way For Intel Xe

Intel has revealed the layout and overall look (as well as functionality, though that is always changing) of its new Graphics Command Center app, which showcases the company's vision for a graphics control hub. The design is thematically coherent (read: it's blue) and, for now, simply laid out. Enthusiast options are expected to be added closer to, or upon, the release of the company's discrete-level graphics architecture, Intel Xe, but the Command Center as it stands showcases Intel's overall approach to its graphics push. For now, the features keep the minimalist approach of Intel's integrated graphics - this is more a new coat of paint than a new enthusiast-grade command center.

Intel made a video available on its Twitter account, and announced an early access program for users who want to partake in the feature and usability development of the new Command Center. The new app is available through the Microsoft Store on Windows.