
iPhone 8 to have 'Smartcamera' that knows what it is taking a picture of, leaked Apple code suggests

It's not clear whether the feature will be limited to the new iPhone, or will come to everyone with iOS 11

Andrew Griffin
Thursday 03 August 2017 10:41 BST
Apple CEO Steve Jobs watches a video of the new iPhone 3G as he delivers the keynote address at the Apple Worldwide Developers Conference on June 9, 2008 in San Francisco, California (Justin Sullivan/Getty Images)

Apple's new iPhone will feature a "smart camera" that can tell what it's looking at, according to leaked code.

Details found in files that Apple accidentally uploaded to the internet suggest the company is working on a feature that can recognise what is in a scene and adjust the camera's settings to take the best possible image.

The code includes references to "scenes", which include Fireworks, Foliage, Pet, BrightStage, Sport, Sky, Snow, and Sunset/Sunrise. It's presumed that the "smartcam" will be able to detect those things using artificial intelligence, and then alter settings like the exposure and shutter speed accordingly.

Some cameras, like those made by Canon, include different settings for things like sport. But they require their owners to choose what sort of thing they are taking a picture of – something the "smart" in the feature's name suggests will be done automatically.

There are also references to "freeze motion" features. Those could be similar to the options Apple is adding to Live Photos, which let people capture a range of frames and then choose the best one, allowing moving scenes to be caught at the best possible moment.

It's not clear whether the feature will be limited to the iPhone 8, which will be revealed next month, or whether it will come to existing phones through the iOS 11 software update.

Apple's iPhones can already spot objects and people in pictures, allowing people to search through them in the Photos app. If a person wants to find all their pictures of dogs, for instance, they can type dog in the search bar found in the corner and then see each of those images.

But the new code suggests similar capabilities will be making their way into the camera app, allowing it to spot objects as pictures are being taken. That could require the extra processing power and depth-sensing features expected to come with the iPhone 8, and so might be limited to those phones.

The code was found by Guilherme Rambo, who posted it online. It and a range of other leaks were uncovered after Apple uploaded software for its HomePod smart speaker to the internet, apparently by mistake – because it wasn't intended to be public, it hadn't been scrubbed of references to upcoming phones and features.

Mr Rambo and Steve Troughton-Smith are the two developers who have led the charge to pick through the HomePod files and work out what they mean for the iPhone 8 and other products.
