News Posts matching #TLB


Apple M1 Chips Affected by Unpatchable "PACMAN" Exploit

Apple M1 chips are part of the Apple Silicon family, Apple's transition to Arm-based cores with new power and performance targets for its devices. Part of building a processor is designing its security mechanisms, and today we have evidence that M1 processors carry a new vulnerability. PACMAN is a hardware attack that can bypass Pointer Authentication (PAC) on M1 processors. Security researchers took a concept proven by Spectre in the x86 realm and applied it to Arm-based Apple silicon. PACMAN exploits an existing software bug to perform a pointer authentication bypass, which may lead to arbitrary code execution.
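For context, here is a minimal sketch of what pointer authentication looks like at the instruction level. It is purely illustrative and not code from the PACMAN paper; it assumes an AArch64 toolchain targeting ARMv8.3-A (e.g. clang -march=armv8.3-a) running on PAC-capable hardware such as the M1.

```c
#include <stdint.h>
#include <stdio.h>

/* Sign a pointer: PACIA computes a keyed MAC over (pointer, modifier)
 * with the per-process instruction key A and stores it in the unused
 * upper bits of the pointer. */
static inline uint64_t pac_sign(uint64_t ptr, uint64_t modifier) {
    __asm__("pacia %0, %1" : "+r"(ptr) : "r"(modifier));
    return ptr;
}

/* Authenticate: AUTIA recomputes the MAC and compares. On success the
 * PAC bits are stripped and the raw pointer comes back; on failure the
 * pointer is poisoned so that a later dereference faults. */
static inline uint64_t pac_auth(uint64_t ptr, uint64_t modifier) {
    __asm__("autia %0, %1" : "+r"(ptr) : "r"(modifier));
    return ptr;
}

int main(void) {
    uint64_t secret = 42;
    uint64_t signed_ptr = pac_sign((uint64_t)&secret, /*modifier=*/0);
    printf("signed:        %#llx\n", (unsigned long long)signed_ptr);
    printf("authenticated: %#llx\n",
           (unsigned long long)pac_auth(signed_ptr, 0));
    return 0;
}
```

The scheme's security rests on the attacker being unable to forge a matching PAC, with any wrong PAC crashing the victim on use; PACMAN's contribution is guessing the PAC without ever triggering that crash.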

The vulnerability stems from the hardware/software co-design: PACMAN abuses the microarchitecture to work toward arbitrary code execution. The attack constructs a PAC oracle, which checks whether a specific pointer matches its authentication code, and then brute-forces all possible PAC values against it. The oracle must never crash when an incorrect guess is supplied, so the checks are executed speculatively, and a microarchitectural side channel reveals whether a guessed PAC was correct. That side channel lives in the CPU's translation lookaside buffers (TLBs): PACMAN speculatively loads through the guessed pointer and verifies success with the prime+probe technique. The attacker first fills a chosen TLB set with the minimal number of addresses needed to occupy it; if one of those addresses is later evicted, the speculative load most likely succeeded, meaning the guessed PAC is valid and the underlying software bug can be exploited through the now-authenticated pointer.
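The measurement primitive behind that last step can be sketched in plain C. The snippet below is a hedged illustration of TLB prime+probe, not the researchers' code: the set count, the associativity, the indexing function (low virtual-page-number bits) and the miss threshold are all assumptions, and clock_gettime is far coarser than the timers a real attack would use.

```c
#include <stdint.h>
#include <stdio.h>
#include <time.h>
#include <sys/mman.h>

#define PAGE     4096UL
#define TLB_SETS 128UL   /* assumed number of TLB sets       */
#define WAYS     8       /* assumed TLB associativity (ways) */

static uint64_t now_ns(void) {
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return (uint64_t)ts.tv_sec * 1000000000ull + (uint64_t)ts.tv_nsec;
}

int main(void) {
    /* Pages spaced TLB_SETS * PAGE apart share a TLB set under the
     * assumed indexing, so WAYS of them exactly fill one set. */
    size_t stride = TLB_SETS * PAGE;
    uint8_t *buf = mmap(NULL, WAYS * stride, PROT_READ | PROT_WRITE,
                        MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (buf == MAP_FAILED) return 1;

    /* Prime: touch all WAYS colliding pages to occupy the target set. */
    for (int w = 0; w < WAYS; w++)
        buf[(size_t)w * stride] = 1;

    /* ... here the victim would speculatively load through the guessed
     * pointer; a successful load inserts a translation into this set,
     * evicting one of ours ... */

    /* Probe: re-touch each page and time it. A slow access means its
     * translation was evicted, i.e. the speculative load went through. */
    for (int w = 0; w < WAYS; w++) {
        uint64_t t0 = now_ns();
        *(volatile uint8_t *)(buf + (size_t)w * stride);
        uint64_t dt = now_ns() - t0;
        printf("way %d: %3llu ns%s\n", w, (unsigned long long)dt,
               dt > 60 ? "  <- possible TLB miss" : "");
    }
    return 0;
}
```

Repeated across PAC guesses, an observed eviction acts as the oracle bit: it signals that the guessed PAC authenticated and the speculative load reached the TLB.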
Apple M1 PACMAN Attack

ETH Mining: Lower VRAM GPUs to be Rendered Unprofitable in Time

Hold on to your ETH hats: you will still be able to cash in on the ETH mining craze for a while. However, you should look at your 3 GB and 4 GB graphics cards with some distrust, for reasons you should already know, having surely studied your mining cryptocurrency of choice. Examples are the GTX 1060 3 GB, or one of those shiny new 4 GB RX 480 / RX 580 cards going at ridiculous premiums right now. And as a side note, don't you love the mechanics of pricing and demand?

The problem stems from ETH's own design for its current PoW (Proof of Work) implementation (which is what allows you to mine the currency at all). In a bid to make ETH mining unwieldy for the kind of specialized silicon that sent Bitcoin difficulty through the roof, Ethereum has your GPU work over a large data set as you mine, the DAG (Directed Acyclic Graph), which must be stored in your GPU's memory. This is one of the essential differences between Bitcoin and Ethereum mining: Ethereum mining was designed to be memory-intensive precisely to prevent the use of ASICs and other specialized hardware. As a side note, this also helps (at least theoretically) with ETH's decentralization, which Bitcoin sees more at risk because of the inherent centralization that results from the higher hardware costs associated with its mining. The catch is that the DAG grows over time, and once it no longer fits in a card's VRAM, that card can no longer mine ETH effectively.
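The reason time works against smaller cards follows from the Ethash spec constants: a 2^30-byte initial dataset, 2^23 bytes of growth per epoch, and one epoch every 30,000 blocks. The back-of-the-envelope sketch below shows when 3 GB and 4 GB cards run out of room; it ignores the spec's rounding of the size down to a prime multiple of 128 bytes, so treat the figures as approximate.

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    const double INIT   = 1073741824.0; /* DATASET_BYTES_INIT   = 2^30 */
    const double GROWTH = 8388608.0;    /* DATASET_BYTES_GROWTH = 2^23 */
    const double GIB    = 1073741824.0;
    const uint64_t EPOCH_LENGTH = 30000; /* blocks per epoch */

    /* Approximate DAG size at a sampling of epochs. */
    for (uint64_t epoch = 0; epoch <= 448; epoch += 64) {
        double size = INIT + GROWTH * (double)epoch;
        printf("epoch %3llu (block ~%8llu): DAG ~ %.2f GiB\n",
               (unsigned long long)epoch,
               (unsigned long long)(epoch * EPOCH_LENGTH),
               size / GIB);
    }
    return 0;
}
```

By this estimate the DAG passes 3 GiB around epoch 256 (block ~7.7 million) and 4 GiB around epoch 384 (block ~11.5 million), which is why 3 GB and then 4 GB cards drop out of ETH mining well before higher-VRAM models do.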