News Posts matching #TLB


Linux Kernel Patch Fixes Minutes-Long Boot Times on AMD "Zen 1" and "Zen 2" Processors

A significant fix has been submitted for the Linux 6.13-rc1 kernel that addresses prolonged boot times on older AMD processors, specifically the "Zen 1" and "Zen 2" architectures. The issue, which had been present for approximately 18 months, could cause boot delays ranging from several seconds to multiple minutes in extreme cases. The problem was discovered by a Nokia engineer who reported inconsistent boot delays across multiple AMD EPYC servers. The most severe instances showed the initial unpacking process taking several minutes longer than expected, though not every boot was affected. Investigation traced the root cause to a kernel modification made in June 2023, specifically related to CPU microcode update handling.

The technical issue was identified as a missing step in the boot process: Zen 1 and Zen 2 processors require the patch buffer mapping to be flushed from the Translation Lookaside Buffer (TLB) after applying CPU microcode updates during startup. The fix, submitted as part of the "x86/urgent" material ahead of the Linux 6.13-rc1 release, implements the necessary TLB flush for affected AMD Ryzen and EPYC systems. This addition eliminates what developers described as "unnecessary and unnatural delays" in the boot process. While the solution will be included in the upcoming Linux 6.13 kernel release, plans are in place to back-port the fix to stable kernel versions to help cover most Linux users on older Zen architectures.
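The core of the change is easy to illustrate. The sketch below is not the actual kernel patch, only a minimal illustration of the idea under the assumptions noted in the comments: the microcode image is written through a temporary kernel mapping, and the TLB entry covering that mapping is then explicitly invalidated, here with the privileged INVLPG instruction. The function name is made up for this example.

```c
/*
 * Illustrative sketch only -- not the actual kernel patch. The idea:
 * the microcode image is written through a temporary kernel mapping,
 * and on Zen 1/Zen 2 the stale TLB entry for that mapping must then be
 * invalidated so later accesses do not hit an outdated translation.
 * On x86, a single page can be flushed with the privileged INVLPG
 * instruction; this must run in kernel context.
 */
static inline void flush_patch_buffer_mapping(void *addr)
{
        /* Invalidate the TLB entry covering 'addr' on this CPU. */
        asm volatile("invlpg (%0)" : : "r" (addr) : "memory");
}
```

In the kernel, such a flush would run right after the microcode is applied on each core during early boot, which is the step the affected Zen 1 and Zen 2 paths were missing.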

Apple M1 Chips Affected by Unpatchable "PACMAN" Exploit

Apple M1 chips are part of the Apple Silicon family, which marks Apple's transition to Arm-based cores with new power and performance targets for its devices. Part of building a processor is designing its security features, and researchers have now demonstrated a new vulnerability in M1 processors. PACMAN is a hardware attack that can bypass Pointer Authentication (PAC) on M1 processors. Security researchers took the speculative-execution concepts behind Spectre, previously applied in the x86 realm, and adapted them to Arm-based Apple silicon. PACMAN leverages an existing software bug to perform a pointer authentication bypass, which can lead to arbitrary code execution.
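To make the mechanism concrete, here is a toy simulation, in plain C, of the sign-and-authenticate flow that PAC implements. Real pointer authentication is performed in hardware by Arm instructions such as PACIA and AUTIA using secret keys; the hash, key, and bit layout below are invented purely for illustration and have nothing to do with Apple's implementation.

```c
/* Toy simulation of pointer authentication (PAC). Real PAC is done in
 * hardware with secret per-process keys; this hash is NOT a real MAC. */
#include <stdint.h>
#include <stdio.h>

#define PAC_SHIFT 48                     /* upper, otherwise-unused VA bits */
#define PAC_MASK  (0xFFFFull << PAC_SHIFT)

static uint64_t toy_mac(uint64_t ptr, uint64_t context, uint64_t key)
{
    uint64_t x = ptr ^ context ^ key;    /* toy hash for illustration only */
    x *= 0x9E3779B97F4A7C15ull;
    return (x >> PAC_SHIFT) & 0xFFFF;
}

static uint64_t pac_sign(uint64_t ptr, uint64_t ctx, uint64_t key)
{
    /* Embed the authentication code in the unused top bits of the pointer. */
    return (ptr & ~PAC_MASK) | (toy_mac(ptr & ~PAC_MASK, ctx, key) << PAC_SHIFT);
}

static int pac_auth(uint64_t signed_ptr, uint64_t ctx, uint64_t key)
{
    /* Recompute the code and compare it against the embedded one. */
    uint64_t ptr = signed_ptr & ~PAC_MASK;
    return ((signed_ptr & PAC_MASK) >> PAC_SHIFT) == toy_mac(ptr, ctx, key);
}

int main(void)
{
    uint64_t key = 0xDEADBEEFCAFEF00Dull, ctx = 42;
    uint64_t p = 0x0000700012345678ull;           /* pretend code pointer */

    uint64_t signed_p = pac_sign(p, ctx, key);
    printf("auth ok:   %d\n", pac_auth(signed_p, ctx, key));            /* 1 */
    printf("forged ok: %d\n", pac_auth(signed_p ^ PAC_MASK, ctx, key)); /* 0 */
    return 0;
}
```

The point of the scheme is that a forged pointer fails authentication unless the attacker also guesses the PAC value, which lives in a handful of otherwise-unused upper address bits, and a wrong guess normally crashes the process. That crash is precisely the obstacle PACMAN works around.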

The attack is a hardware/software co-design exploit: it abuses microarchitectural behavior to enable arbitrary code execution. PACMAN constructs a PAC oracle that checks whether a specific pointer carries the correct authentication code, and that oracle must never crash the process when an incorrect guess is supplied, which lets the attack brute-force all possible PAC values. To suppress crashes, the oracle's guesses are executed speculatively, and a microarchitectural side channel is used to learn whether a guess was correct. That side channel lives in the CPU's translation lookaside buffers (TLBs): PACMAN speculatively loads through the guessed pointer and verifies success with a prime+probe technique. The targeted TLB set is first filled with the minimal number of addresses needed to occupy it; if one of those entries is later evicted, the speculative load most likely succeeded, meaning the guess was correct and the software bug can then be exploited with a forged pointer that nonetheless passes authentication.
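The probe side of that technique can be sketched in ordinary C. The snippet below is only a conceptual illustration of prime+probe, not the PACMAN proof of concept: the stride, associativity, and coarse clock_gettime() timing are assumptions chosen for readability, and a real attack would carefully target a specific TLB set on the M1 rather than whatever addresses the allocator happens to return.

```c
/* Conceptual prime+probe sketch (illustrative only, not the PACMAN PoC).
 * Prime: fill a target set with attacker-controlled addresses.
 * Trigger: let the victim's speculative load happen.
 * Probe: re-time each primed address; a slow access means eviction,
 * i.e. the speculative load landed in that set.
 */
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define WAYS   8             /* assumed associativity of the target set  */
#define STRIDE (1 << 21)     /* assumed stride mapping into the same set */

static uint64_t time_access(volatile uint8_t *p)
{
    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC_RAW, &t0);
    (void)*p;                               /* the probed access */
    clock_gettime(CLOCK_MONOTONIC_RAW, &t1);
    return (uint64_t)(t1.tv_sec - t0.tv_sec) * 1000000000ull
         + (uint64_t)(t1.tv_nsec - t0.tv_nsec);
}

int main(void)
{
    uint8_t *buf = malloc((size_t)WAYS * STRIDE);
    if (!buf) return 1;

    /* Prime: touch one address per way so the set is fully occupied. */
    for (int i = 0; i < WAYS; i++)
        (void)*(volatile uint8_t *)(buf + (size_t)i * STRIDE);

    /* ... the victim's speculative load of the guessed pointer goes here ... */

    /* Probe: re-time each primed address; an outlier suggests eviction. */
    for (int i = 0; i < WAYS; i++)
        printf("way %d: %llu ns\n", i,
               (unsigned long long)time_access(buf + (size_t)i * STRIDE));

    free(buf);
    return 0;
}
```

A real implementation would need a much finer-grained timing source than a syscall-backed clock; the sketch only conveys the structure of the measurement.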
Apple M1 PACMAN Attack

ETH Mining: Lower VRAM GPUs to be Rendered Unprofitable in Time

Hold on to your ETH hats: you will still be able to cash in on the ETH mining craze for a while. However, you should look at your 3 GB and 4 GB graphics cards with a little distrust, for reasons you should already know, since you have surely studied your mining cryptocurrency of choice. Examples are the GTX 1060 3 GB, or one of those shiny new 4 GB RX 480 / RX 580 cards that are going for ridiculous premiums right now. And as a side note, don't you just love the mechanics of pricing and demand?

The problem here stems from ETH's own design for its current PoW (Proof of Work) implementation, which is what allows you to mine the currency at all. In a bid to make ETH mining unwieldy for the specialized silicon that sent Bitcoin difficulty through the roof, Ethereum has your GPU work over a large data set as you mine, stored in the GPU's memory: the DAG, which stands for Directed Acyclic Graph. Crucially, the DAG grows a little with every epoch, so cards with 3 GB and later 4 GB of VRAM will eventually be unable to hold it and will drop out of ETH mining, which is what the headline refers to. This is one of the essential differences between Bitcoin mining and Ethereum mining: Ethereum mining was designed to be memory-intensive, so as to prevent the use of ASICs and other specialized hardware. As a side note, this also helps (at least theoretically) with ETH's decentralization, which Bitcoin sees more at risk because of the inherent centralization that results from the higher hardware costs associated with its mining.
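For a sense of scale, a rough DAG size can be estimated from the Ethash spec constants: the data set starts at about 1 GiB and grows by about 8 MiB per epoch, with one epoch every 30,000 blocks. The snippet below does that back-of-the-envelope math; it deliberately ignores the spec's final adjustment (which trims the size so the number of 128-byte rows is prime), and the example block number is arbitrary.

```c
/* Rough Ethash DAG size estimate (simplified; the real spec also trims
 * the size down to a prime number of 128-byte mix rows). */
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    const uint64_t DATASET_BYTES_INIT   = 1ull << 30;  /* ~1 GiB initial  */
    const uint64_t DATASET_BYTES_GROWTH = 1ull << 23;  /* ~8 MiB per epoch */
    const uint64_t EPOCH_LENGTH         = 30000;       /* blocks per epoch */

    uint64_t block = 4000000;                  /* example block number */
    uint64_t epoch = block / EPOCH_LENGTH;
    uint64_t approx_size = DATASET_BYTES_INIT + DATASET_BYTES_GROWTH * epoch;

    printf("epoch %llu: DAG ~ %.2f GiB\n",
           (unsigned long long)epoch,
           approx_size / (double)(1ull << 30));
    return 0;
}
```

By this rough estimate the DAG outgrows a 3 GB card after roughly 250 epochs and a 4 GB card after roughly 380 (in practice a little sooner, since the driver and the miner need VRAM of their own), which is the slow squeeze the headline describes.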
