Google Lens Will Help You Identify Objects and More

By Evan Selleck

Published 17 May 2017

Today, during Google I/O, Google officially announced Google Lens, which draws on the company's vast knowledge graph to help users identify objects and much more.

With Google Lens, users will be able to identify objects using just the camera on their phone. Google's Sundar Pichai walked through a variety of examples, including pointing your phone's camera at a flower to identify it. That could let folks find out whether they're allergic to a flower before they try to touch it, or smell it.

With Google Lens, users will also be able to take a photo of a Wi-Fi router's SSID and password, and that information will be saved on their phone so the device can automatically connect to the Wi-Fi network.

Google went into more detail a bit later, showing how Google Lens, working with Assistant, will be able to recognize an event going on in your neighborhood just from a sign, and even let users buy tickets through an installed app like Ticketmaster. It will also immediately add the event to your calendar.

Inside Google Photos, Google Lens will identify noteworthy buildings in a new area, pull up directions to a location, and surface more about a painting and its artist when you come across one out and about. With Lens, users can also use the camera to look up the phone number for a business, even one they aren't near.

Google Lens in Photos will roll out later this year.

Google Lens will roll out to Google Assistant first, and then expand beyond the digital personal assistant at a later date.