Apple Intelligence will be available in December for localised English in Australia. A software update in April will add support for more languages, with additional updates throughout the year. The feature is included in iOS 18.1, iPadOS 18.1, and macOS Sequoia 15.1, bringing advanced AI capabilities to iPhone, iPad, and Mac. Optimised for on-device processing, it offers a fast, personalised, and privacy-focused experience. This powerful new technology opens up a vast array of possibilities, not only for developers building apps for the Apple ecosystem but for any developer looking to leverage the new capabilities to enhance and super-power their workflow. Let's take an introductory look at this transformative technology, which is about to make waves in our daily lives.
What is Apple Intelligence?

Apple Intelligence is designed to integrate powerful AI tools directly into Apple’s ecosystem.
Unlike many other AI platforms that rely heavily on cloud computing, Apple Intelligence leverages on-device processing for most tasks, reducing the need for data to leave a user’s device.
This model not only enhances performance but also ensures that user data remains private. For tasks requiring significant processing power, Apple uses Private Cloud Compute, a secure server system where sensitive data is temporarily processed without being stored or shared with third parties.
Features and Tools for Developers
Apple Intelligence offers tools that developers can incorporate to enhance user experiences on Apple devices:
- Writing Tools: Embedded throughout the system, these tools allow users to proofread, rewrite, and summarise text in apps like Mail, Notes, and Pages. Developers can integrate them into their own apps with minimal setup, offering users a polished and intuitive writing experience (see the first sketch after this list).
- Image Playground API: With this API, developers can let users create unique visuals within apps. It is particularly useful in photo-editing contexts, allowing custom effects or personalised avatars to be generated without relying on external models. Image Playground runs entirely on-device, reducing resource costs and maintaining user privacy (see the second sketch after this list).
- Siri and App Intents: Siri has become even more versatile with Apple Intelligence, now able to perform more complex, contextually aware actions across apps. Using App Intents, developers can create commands for Siri that interact with their apps in a more conversational way, so users can perform actions hands-free or combine Siri's capabilities with other apps (see the App Intents sketch after this list).
- Enhanced Search in Photos: Apple Intelligence now allows users to find specific moments in videos or types of photos using natural language descriptions. This feature also lets developers add search functions that locate text or objects within images or videos in their apps.
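To make these items more concrete, here are a few hedged sketches. First, Writing Tools: standard text controls pick the feature up largely automatically, so the snippet below, which assumes iOS 18's writingToolsBehavior property on UITextView, is only about tuning how much of the experience a view opts into.

```swift
import UIKit

// A minimal sketch: controlling how the system Writing Tools behave in a
// UITextView. Assumes the iOS 18 writingToolsBehavior API; standard text
// views already get Writing Tools by default.
final class NoteEditorViewController: UIViewController {
    private let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        textView.frame = view.bounds
        textView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        textView.font = .preferredFont(forTextStyle: .body)

        // .complete opts into the full experience (proofread, rewrite, summarise);
        // .limited keeps suggestions inline only; .none opts the view out entirely.
        textView.writingToolsBehavior = .complete

        view.addSubview(textView)
    }
}
```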
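Next, Image Playground. The exact SwiftUI surface may differ by SDK version, so treat the modifier signature below as an assumption based on the ImagePlayground framework (iOS 18.2 and later); the view name and concept string are hypothetical.

```swift
import SwiftUI
import ImagePlayground

// A minimal sketch: presenting the system Image Playground sheet so a user
// can generate a personalised image on-device. The concept string seeds the
// generation; the completion handler receives a file URL to the result.
struct AvatarCreatorView: View {
    @State private var showPlayground = false
    @State private var generatedImageURL: URL?

    var body: some View {
        Button("Create avatar") { showPlayground = true }
            .imagePlaygroundSheet(
                isPresented: $showPlayground,
                concept: "friendly robot avatar",
                onCompletion: { url in
                    // Hypothetical handling: store the URL and display the image.
                    generatedImageURL = url
                }
            )
    }
}
```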
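Finally, Siri and App Intents. A single intent plus an App Shortcut is enough for Siri and the Shortcuts app to discover an action; the intent, its parameter, and the phrase below are hypothetical names for illustration.

```swift
import AppIntents

// A minimal sketch of an App Intent that Siri and Shortcuts can invoke.
struct AddNoteIntent: AppIntent {
    static var title: LocalizedStringResource = "Add Note"
    static var description = IntentDescription("Saves a short note in the app.")

    @Parameter(title: "Text")
    var text: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // In a real app this would call into your data layer, e.g.
        // NoteStore.shared.add(text)
        return .result(dialog: "Added your note.")
    }
}

// App Shortcuts surface the intent to Siri with natural spoken phrases.
struct MyAppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: AddNoteIntent(),
            phrases: ["Add a note in \(.applicationName)"],
            shortTitle: "Add Note",
            systemImageName: "note.text"
        )
    }
}
```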
Privacy-First AI
Apple’s approach to AI prioritises privacy, utilising on-device models for the majority of operations, including text, image, and language processing. Private Cloud Compute is engaged only for resource-intensive tasks, with each session secured by Apple’s privacy protocols and independent oversight. Apple has also incorporated privacy safeguards for users accessing OpenAI’s ChatGPT through Siri, with IP masking and no storage of user requests by Apple or OpenAI.
Benefits and Challenges for Developers
While Apple Intelligence offers a host of innovative tools, developers may encounter several challenges. Device requirements restrict some features to newer models, and Apple's AI tooling can have a learning curve for those new to on-device development. Still, the benefits of enhanced user experience, stronger ecosystem integration, and Apple's privacy-first design make Apple Intelligence a valuable opportunity for developers.
Apple Intelligence enables innovative app interactions, enhancing expression, customisation, and efficiency while prioritising privacy. Developers can explore integration options by visiting the Apple Developer page on Apple Intelligence.