Apple iPhone 16 Visual Intelligence and Google Partnership for Enhanced AI Integration

September 11, 2024

Apple’s collaboration with Google is moving forward with the introduction of its new visual search feature, “Visual Intelligence,” unveiled at the company’s “It’s Glowtime” event. Alphabet currently pays Apple around $20 billion each year for Google to serve as the default search engine in Safari. Building on that partnership, iPhone 16 users now have access to Google’s search engine and visual search capabilities through the Camera Control button built into the device.

Additionally, OpenAI’s ChatGPT is being integrated with Siri, demonstrated by a feature that lets users point their phone’s camera at lecture notes and get explanations with a single click.

Apple put a spotlight on the functionality of the Camera Control button, highlighting its versatility for quickly capturing photos and videos and showing how users can conveniently adjust zoom and exposure settings by swiping across the button. Beyond its camera-related functions, the button serves as a gateway to Apple’s new visual search feature, which builds on its partnership with Google.

Initially, the Camera Control button appeared to be an enhanced shutter button, but Apple has clarified that it offers functions beyond photography. Through the Visual Intelligence feature, users can not only identify objects captured by the camera but also seamlessly access third-party services without needing to launch individual apps.

Apple's Visual Intelligence, comparable to Google Lens or Pinterest Lens, allows users to quickly gain information about objects they encounter. In its demo, Apple showed the feature's ability to retrieve details about a restaurant or identify the breed of a dog encountered during a walk. Additionally, the feature can turn a poster for an event into a calendar entry.

Craig Federighi, Apple’s senior vice president of software engineering, revealed during the event that Google Search can also be accessed through the Camera Control button. He explained that by pressing the button, consumers can instantly perform a Google search for products, such as bikes, they might be interested in purchasing. A demonstration showed a user tapping the button to see similar bikes for sale, with the option to explore more results directly from Google.

However, Apple did not specify when the Camera Control button would favor third-party services over Apple’s built-in features, such as Apple Maps, which was used in a restaurant search demo. The company also did not reveal how users can customize this behavior. Although his response lacked specificity, Federighi assured users that they would always have the discretion to decide when to employ third-party tools.

This feature comes at a time when the conventional App Store experience is starting to feel a little dated, offering a new option to interact with software and services outside of Apple’s built-in apps. With AI assistants, users can now ask questions, complete tasks, and express their creativity without relying on standalone apps.

Apple is presenting itself as a platform that links consumers to third-party services, such as AI and search providers, rather than building its own replacement for ChatGPT. By forming partnerships with companies like OpenAI, Apple can offer these capabilities without relying on in-app purchases for revenue. It also avoids taking responsibility for errors produced by third-party services like ChatGPT or Google Search.

Image Credits: Apple

Code Labs Academy © 2024 All rights reserved.