Updated on April 27th: Article originally posted on April 25th.
How Apple will improve the upcoming iPhone 16 and iPhone 16 Pro with artificial intelligence is one of the big questions of 2024. We now know more about Apple's plans for AI on the iPhone, its approach, and how it will pitch the technology to consumers.
Apple has submitted eight large language models to the Hugging Face hub, an online resource for open-source AI implementations. An LLM is the trained model that a generative AI application uses to interpret input and generate output, iterating as many times as necessary to arrive at a good result.
The larger the LLM, the more capable it tends to be, but larger models also demand more memory and compute. It's no surprise that these models were originally built in the cloud and accessed as online services. Apple's challenge is to create LLMs with a footprint small enough to run on mobile devices.
This requires new software techniques, but it also places demands on hardware that allows for more efficient processing. Android-focused chipset manufacturers such as Qualcomm, Samsung, and MediaTek offer system-on-chip packages optimized for generative AI. Apple is expected to do something similar with its next-generation A-series chips, allowing more AI routines to run on this year's iPhone 16 family rather than in the cloud.
Running on the device means that user data never needs to leave the handset for processing. As the public becomes more aware of AI privacy concerns, this will become a key marketing point.
Update: Saturday, April 27: Apple isn't the only company working hard to develop a small but effective language model for mobile devices. This weekend, Microsoft published details and a developer guide for Phi-3. The smallest of the three generative AI models, Phi-3 Mini, is available through Microsoft's Azure AI Studio, Ollama, and Hugging Face. Phi-3 Small and Phi-3 Medium are still in development.
Phi-3 is a large language model that operates within a small footprint. Microsoft claims it can outperform models twice its size "on key benchmarks," making a direct and favorable comparison with GPT-3.5. Importantly, Phi-3 Mini runs comfortably on Apple's A16 Bionic chip, allowing third-party developers to target the iPhone 14 Pro and 14 Pro Max, the iPhone 15 family, and future models.
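To put that small footprint in perspective, a model's raw memory requirement is roughly its parameter count multiplied by the bytes stored per weight. A minimal back-of-the-envelope sketch (the 3.8-billion-parameter figure is Microsoft's published size for Phi-3 Mini; the byte widths shown are standard fp16 and 4-bit quantization, used here for illustration):

```python
def model_footprint_gb(num_params: float, bytes_per_param: float) -> float:
    """Rough memory needed just to hold the model weights, in gigabytes."""
    return num_params * bytes_per_param / 1e9

# Phi-3 Mini: roughly 3.8 billion parameters
phi3_mini_params = 3.8e9

# Weights stored at fp16 (2 bytes each) vs. 4-bit quantized (0.5 bytes each)
fp16_gb = model_footprint_gb(phi3_mini_params, 2.0)   # ~7.6 GB
int4_gb = model_footprint_gb(phi3_mini_params, 0.5)   # ~1.9 GB

print(f"fp16: {fp16_gb:.1f} GB, 4-bit: {int4_gb:.1f} GB")
```

At 4-bit precision the weights fit comfortably alongside the 6 GB of RAM in an iPhone 14 Pro, which helps explain why aggressive quantization is central to running these models on phones rather than in the cloud.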
2024 will see the launch of many LLMs, from hobbyist projects to the Silicon Valley (and Redmond) majors. Some will be licensed by their developers to hardware manufacturers. There's a real possibility that Apple will leverage Google's and Microsoft's AI models in iOS 18 and the next iPhone.
These models are readily available to third-party developers, who have a wide range of AI tools to choose from and are looking for cross-platform support to ease development. As manufacturers lean on AI for marketing and differentiation, the apps users crave will be able to join the AI revolution without being tied to any single manufacturer's choice.
Alongside the code for these open-source efficient language models, Apple has published the techniques used and the theory behind its choices, including the decision to open-source the training data, metrics, and checkpoints. A research paper (PDF link) covers the rationale and training framework.
This follows another LLM research paper published by Cornell University in collaboration with Apple's research and development team. That paper described Ferret-UI, an LLM that understands a device's user interface and what is happening on screen, enabling a range of interactions. Examples include using voice to navigate to well-hidden settings, or describing what is shown on the display for people with visual impairments.
Three weeks after Apple launched the iPhone 15 family in 2023, Google launched the Pixel 8 and Pixel 8 Pro, declaring them the first smartphones with AI built in and demonstrating the rush to use and promote the benefits of generative AI in mobile devices. Since then, Apple has been on the back foot, at least publicly.
With a steady stream of research papers on the new technology, Apple's AI plans are now visible to the industry, even if they aren't yet visible to consumers. By open-sourcing these efficient language models and emphasizing on-device processing, Apple is quietly signaling how it intends to gain an edge over the many AI-equipped Android devices, even as it reportedly talks with Google about licensing Gemini for some of the iPhone's AI features.
Now let's take a closer look at the leaked iPhone 16 and iPhone 16 Pro designs…