News Posts matching #Apple Intelligence

Apple Reports Q3 Fiscal 2024 Results

Apple today announced financial results for its fiscal 2024 third quarter ended June 29, 2024. The Company posted quarterly revenue of $85.8 billion, up 5 percent year over year, and quarterly earnings per diluted share of $1.40, up 11 percent year over year. "Today Apple is reporting a new June quarter revenue record of $85.8 billion, up 5 percent from a year ago," said Tim Cook, Apple's CEO. "During the quarter, we were excited to announce incredible updates to our software platforms at our Worldwide Developers Conference, including Apple Intelligence, a breakthrough personal intelligence system that puts powerful, private generative AI models at the core of iPhone, iPad, and Mac. We very much look forward to sharing these tools with our users, and we continue to invest significantly in the innovations that will enrich our customers' lives, while leading with the values that drive our work."

"During the quarter, our record business performance generated EPS growth of 11 percent and nearly $29 billion in operating cash flow, allowing us to return over $32 billion to shareholders," said Luca Maestri, Apple's CFO. "We are also very pleased that our installed base of active devices reached a new all-time high in all geographic segments, thanks to very high levels of customer satisfaction and loyalty."

Apple Trained its Apple Intelligence Models on Google TPUs, Not NVIDIA GPUs

Apple has disclosed that its newly announced Apple Intelligence features were developed using Google's Tensor Processing Units (TPUs) rather than NVIDIA's widely adopted hardware accelerators such as the H100. This unexpected choice was detailed in an official Apple research paper, shedding light on the company's approach to AI development. The paper outlines how systems equipped with Google's TPUv4 and TPUv5 chips played a crucial role in creating the Apple Foundation Models (AFMs). These models, including AFM-server and AFM-on-device, are designed to power both online and offline Apple Intelligence features introduced at WWDC 2024. To train the 6.4-billion-parameter AFM-server, its largest language model, Apple used 8,192 TPUv4 chips provisioned as 8×1024-chip slices. The training process followed a three-stage approach and processed a total of 7.4 trillion tokens. Meanwhile, the more compact 3-billion-parameter AFM-on-device model, optimized for on-device processing, was trained on 2,048 TPUv5p chips.
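
Apple has not published its training code, but the scale described above maps onto the familiar pattern of sharding a training batch across a mesh of TPU chips. The JAX sketch below is a hypothetical illustration of that pattern on whatever TPU devices are visible to a single host; the toy model, shapes, learning rate, and sharding layout are assumptions made for this example, not details taken from Apple's paper.

```python
# Hypothetical data-parallel training step on TPU chips with JAX.
# This is an illustrative sketch, not Apple's AFM training code.
import jax
import jax.numpy as jnp
from jax.experimental import mesh_utils
from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

# One mesh axis ("data") spanning all visible chips; an AFM-scale run would
# span multiple 1024-chip slices and add model-parallel axes on top.
devices = mesh_utils.create_device_mesh((jax.device_count(),))
mesh = Mesh(devices, axis_names=("data",))

# Toy next-token model (embedding + projection) standing in for a transformer,
# purely to keep the example self-contained and runnable.
VOCAB, DIM = 32_000, 256

def init_params(key):
    k1, k2 = jax.random.split(key)
    return {
        "embed": jax.random.normal(k1, (VOCAB, DIM)) * 0.02,
        "proj": jax.random.normal(k2, (DIM, VOCAB)) * 0.02,
    }

def loss_fn(params, tokens, targets):
    hidden = params["embed"][tokens]          # (batch, seq, DIM)
    logits = hidden @ params["proj"]          # (batch, seq, VOCAB)
    logp = jax.nn.log_softmax(logits, axis=-1)
    nll = -jnp.take_along_axis(logp, targets[..., None], axis=-1)
    return nll.mean()

# Replicate parameters on every chip; shard the batch along the "data" axis.
replicated = NamedSharding(mesh, P())
batch_sharded = NamedSharding(mesh, P("data"))

@jax.jit
def train_step(params, tokens, targets, lr=1e-3):
    loss, grads = jax.value_and_grad(loss_fn)(params, tokens, targets)
    params = jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)
    return params, loss

params = jax.device_put(init_params(jax.random.PRNGKey(0)), replicated)
tokens = jax.device_put(
    jnp.zeros((jax.device_count() * 8, 128), jnp.int32), batch_sharded)
targets = jax.device_put(jnp.zeros_like(tokens), batch_sharded)

params, loss = train_step(params, tokens, targets)
```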

Apple's training data came from various sources, including the Applebot web crawler and licensed high-quality datasets. The company also incorporated carefully selected code, math, and public datasets to enhance the models' capabilities. Benchmark results shared in the paper suggest that both AFM-server and AFM-on-device excel in areas such as Instruction Following, Tool Use, and Writing, positioning Apple as a strong contender in the AI race despite its relatively late entry. However, Apple's approach to entering the AI market is more complex than that of most competitors. Given Apple's massive user base and the millions of devices compatible with Apple Intelligence, the AFMs have the potential to change how users interact with their devices, especially for everyday tasks, which makes refining the models for those tasks critical before deployment at scale. Another surprise is the degree of transparency from a company typically known for its secrecy; the AI boom is clearly changing some of Apple's ways, and revealing these inner workings makes for interesting reading.