Apple’s Visual Intelligence and the iPhone 16 camera button: how they will work


Visual Intelligence is Apple’s built-in competitor to Google Lens.

On Monday, Apple unveiled its latest iPhone line, along with two updated Apple Watch models and new AirPods, at its pre-recorded “It’s Glowtime” event.

The main selling point of the new Apple devices is expected to be Apple Intelligence features, some of which will be rolled out to users starting in October.


Aside from AI, the biggest update to the new iPhone 16 is a Camera Control button located on the right side of the smartphone, designed to make photography more convenient.

Camera Control will have several functions.

Clicking it once opens the camera, while clicking and holding the button records a video.

Image by Apple.

“A high precision force sensor interacts with the Taptic Engine, ensuring responsive haptic feedback like in a mechanical camera shutter,” iPhone product manager Piyush Pratik said during the event.

Additionally, a small multi-pixel capacitive sensor and signal processor recognize touch gestures. This, according to Apple, enables Camera Control to distinguish between a full click and a lighter press.

When the camera is open, a lighter press on the button reveals a new preview that dissolves other UI elements to focus on framing the shot. Meanwhile, a double light press brings up an overlay that allows quick access to camera functions.

Image by Apple.

By sliding a finger along Camera Control, users will be able to zoom in and out.
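Taken together, the gestures described above amount to a small dispatch table: each input on the button maps to one camera action. The sketch below models that mapping in Python; the gesture names and action strings are illustrative assumptions, not Apple’s actual API.

```python
# Hypothetical model of the Camera Control gesture set described above.
# Gesture names and actions are illustrative; Apple's real framework differs.

GESTURE_ACTIONS = {
    "click": "open camera",
    "click_and_hold": "record video",
    "light_press": "show clean framing preview",
    "double_light_press": "show camera-function overlay",
    "slide": "adjust zoom",
}

def handle_gesture(gesture: str) -> str:
    """Return the camera action for a Camera Control gesture.

    Unknown gestures fall through to a no-op rather than raising,
    mirroring how unrecognized input on a hardware button is ignored.
    """
    return GESTURE_ACTIONS.get(gesture, "no-op")
```

For example, `handle_gesture("double_light_press")` yields the overlay action, while an unrecognized gesture is simply ignored.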

Certain apps, including Snapchat, will support integration with the new button, allowing users to share photos or videos more conveniently.

Camera Control is also integrated with a new feature called Visual Intelligence, which functions similarly to Google Lens.

Clicking and holding the Camera Control button while pointing the camera at an object, such as a restaurant, a dog, or a bike, will search Google to identify the object and provide additional information.

For example, pointing the camera at a restaurant will show its opening hours and ratings, along with quick options to check out the menu or make a reservation.

Similarly, pressing Camera Control while pointing at a dog will tell you what breed it is, while pointing the camera at an event flier and pressing the button will add the event to the user’s calendar along with its date and time.
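The examples above suggest that Visual Intelligence routes each recognized object category to a set of context-specific actions. A minimal Python sketch of that routing, under the assumption that recognition yields a category label; the categories and action strings are hypothetical, not Apple’s implementation:

```python
# Hypothetical routing of a recognized object category to the
# context-specific actions described in the article. Categories and
# action names are illustrative assumptions.

def actions_for(category: str) -> list[str]:
    """Return the follow-up actions offered for a recognized object."""
    routes = {
        "restaurant": [
            "show opening hours",
            "show ratings",
            "open menu",
            "make reservation",
        ],
        "dog": ["identify breed"],
        "event_flier": ["add event to calendar with date and time"],
    }
    # Unlisted categories fall back to a generic search, per the article's
    # description of identifying arbitrary objects.
    return routes.get(category, ["search the web for the object"])
```

So a flier would surface a calendar action, while an object with no dedicated route (say, a bike) would fall back to a plain search.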

Visual Intelligence will not be available at launch; it will be released later this year in an iOS 18 update.