Google Lens Comes To iOS Via Photos

Google Lens is a feature that pulls up relevant information using visual analysis when you point a smartphone camera at an object. Now Google Lens is rolling out to iOS devices via Google Photos over the coming weeks. You will need to make sure the Google Photos app has been updated to version 3.15 for the feature to work, and Google Lens can only be used if the device language is set to English.

Google Lens can save the phone number or address to a contact when you take a photo of a business card. When you take a photo of a book, you can get reviews and other details about it. Google Lens also pulls up more details when you photograph a landmark, a restaurant, a building, a painting in a museum, a plant or an animal. And you can add an event to your calendar by taking a photo of a flyer or event billboard.

To access Google Lens on your iOS device, open the Google Photos app on your iPhone or iPad. Then select a photo and tap on the Google Lens icon. You can see a demonstration of how it works in the tweet from Google Photos below:

When Google Lens becomes available on your device, you will notice a square-shaped Lens icon in the bottom toolbar when you view an image. After you tap the icon, Lens takes a moment to analyze what is in the photo and provide contextual information.

Google Lens is based on artificial intelligence technology similar to what powers the Google Goggles app. Originally announced at the Google I/O conference in May 2017, Google Lens first started rolling out on October 4, 2017. The feature arrived in Google Assistant for the Pixel and Pixel 2 phones in November 2017, and Google Lens was then released in Google Photos for non-Pixel Android devices on March 5, 2018.

Apple’s own Photos app on iOS currently does not pull up contextual details about photos saved in the Camera Roll. However, you can use text searches to find specific people, content and locations in the Apple Photos app. Interestingly, Apple’s ARKit platform lets developers build apps featuring augmented reality. Some examples of what developers can build with ARKit include interactive artwork, movie posters and signs; a minimal sketch of what such an app involves appears below.
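For readers curious what a bare-bones ARKit app looks like, the Swift sketch below shows a minimal view controller that hosts an AR view and runs a world-tracking session while it is on screen. It is an illustrative example only, not Apple's sample code; the class name and setup details are placeholders, and a real project would typically configure the view in a storyboard and add its own virtual content.

import UIKit
import ARKit

// Minimal ARKit sketch: a view controller that hosts an ARSCNView
// and runs a world-tracking session while the view is visible.
class ARViewController: UIViewController {

    // Illustrative setup: the AR view is created in code here,
    // though many apps wire it up via a storyboard instead.
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking keeps virtual content (artwork, posters, signs)
        // anchored to real-world surfaces as the camera moves.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        // Pause the session when the view goes away to save battery.
        sceneView.session.pause()
    }
}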
