Apple Intelligence requires modern hardware to run on-device generative AI and large language models, so here’s a list of compatible devices
Apple Intelligence is a marketing moniker for a set of generative AI features based on large language models that will be available on the iPhone, iPad and Mac starting this fall. Some features such as specific Siri improvements may arrive via later updates to iOS 18, iPadOS 18, and macOS Sequoia in late 2024 and early 2025.
Apple Intelligence system requirements: Is your hardware on the device compatibility list?
The Apple Intelligence device compatibility list is concise because the latest AI features require Apple silicon with at least 8GB of RAM and a fast Neural Engine:
- iPhone 15 Pro and later
- iPhone 15 Pro Max and later
- iPad with M1 and later
- Mac with M1 and later
Unless you have one of the latest iPhones, or an iPad or Mac with Apple silicon, you won’t be able to use generative AI in iOS 18, iPadOS 18, or macOS Sequoia. All iPhones released after the iPhone 15 Pro models are expected to meet the hardware requirements for Apple Intelligence, including the iPhone 16 family scheduled to arrive after iOS 18 ships.
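Apple doesn’t expose a public API that reports Apple Intelligence eligibility directly, but the 8GB RAM floor mentioned above can at least be sanity-checked in code. The sketch below is a rough heuristic only — a real gate would also need to verify the chip generation, which this check does not do.

```swift
import Foundation

// Heuristic sketch: check whether this device has at least the 8 GB
// of RAM that Apple Intelligence requires. This is NOT an official
// eligibility API; a device could pass this check and still be
// incompatible (e.g. an Intel Mac with plenty of RAM).
let requiredBytes: UInt64 = 8 * 1024 * 1024 * 1024
let installedBytes = ProcessInfo.processInfo.physicalMemory

if installedBytes >= requiredBytes {
    print("RAM meets the 8 GB Apple Intelligence minimum")
} else {
    print("RAM is below the 8 GB minimum; Apple Intelligence is unavailable")
}
```

`ProcessInfo.processInfo.physicalMemory` reports installed RAM in bytes on both iOS and macOS, which is why it works as a single cross-platform check here.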

Apple Intelligence depends on Neural Engine
The Neural Engine is an Apple-designed neural processing unit embedded in the main Apple silicon chip. It specializes in accelerating the neural network operations used in artificial intelligence and machine learning, such as convolutions and matrix multiplications, without taxing the CPU, the GPU, or the battery.
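Developers can nudge their own models onto the Neural Engine through Core ML’s compute-unit preference. The sketch below shows the real `MLModelConfiguration` API; the model name in the comment is a placeholder, not an actual model.

```swift
import CoreML

// Sketch: steer a Core ML model toward the Neural Engine so that
// convolutions and matrix multiplications run on the dedicated NPU
// rather than the GPU, saving power.
let config = MLModelConfiguration()

// Prefer the Neural Engine, falling back to the CPU when an operation
// isn't supported there. (.all would also allow the GPU.)
config.computeUnits = .cpuAndNeuralEngine

// "SomeModel" is a hypothetical compiled model class generated by Xcode:
// let model = try SomeModel(configuration: config)
```

Core ML decides per-layer where a model actually executes, so `computeUnits` is a preference rather than a guarantee — unsupported operations still fall back to the CPU.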
Apple explains the Apple Intelligence system requirements
During The Talk Show recorded live at WWDC24, Daring Fireball’s John Gruber asked Apple’s AI boss John Giannandrea and software engineering lead Craig Federighi whether this is a scheme to sell new iPhones.
“So these models, when you run them at runtime, it’s called inference, and the inference of large language models is incredibly computationally expensive,” Giannandrea explained. Inference lets a trained machine learning model draw conclusions from new data without requiring examples of the desired result.
“It’s a pretty extraordinary thing to run models of this power on an iPhone,” Federighi said. These models run on the Neural Engine. “It’s a combination of bandwidth in the device, it’s the size of the Apple Neural Engine, it’s the oomph in the device to actually do these models fast enough to be useful,” Giannandrea continued.

What’s stopping Apple from giving people a switch to enable Apple Intelligence on older hardware? “You could, in theory, run these models on a very old device, but it would be so slow that it would not be useful,” he explained.
