Google Lens uses AI to understand the world better than humans can
The company is doubling down on machine learning

Google has announced a new tool called Lens at its I/O developer conference.
It isn't available yet, but it's coming soon to Google Photos and the Google Assistant.
With Google Lens, you'll be able to point your phone at real-world objects around you and instantly view useful information about them.
Google demonstrated a handful of impressive examples at the show.
In one, Lens identified the name of a flower, which could be handy with summer around the corner.
In another, jaw-dropping demo, a user pointed Lens at a Wi-Fi router, and it automatically read the network name and password using optical character recognition.
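Google hasn't explained how Lens's recognition pipeline works under the hood, but the core idea is simple enough to sketch. The snippet below is a minimal, hypothetical illustration in Python: it uses the open-source Tesseract engine (via pytesseract) as a stand-in for Google's own recognizer, and assumes a router label that prints its credentials on "Network:" and "Password:" lines.

```python
# A minimal sketch of the idea, not Google's actual pipeline: run OCR over a
# photo of a router's label and pull the credentials out with a regex.
# Assumes Tesseract is installed and the label format described above.
import re

import pytesseract
from PIL import Image

def read_wifi_label(image_path: str) -> dict:
    """Extract Wi-Fi credentials from a photo of a router label."""
    text = pytesseract.image_to_string(Image.open(image_path))
    network = re.search(r"Network(?: name)?:\s*(\S+)", text, re.IGNORECASE)
    password = re.search(r"Password:\s*(\S+)", text, re.IGNORECASE)
    return {
        "network": network.group(1) if network else None,
        "password": password.group(1) if password else None,
    }

if __name__ == "__main__":
    print(read_wifi_label("router_label.jpg"))
```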
It can also combine what the camera sees with other signals, such as GPS location data, to work out exactly where you are and pull up details for the specific branch of a restaurant or shop in front of you, including reviews and opening hours.
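Again, Google hasn't detailed this step, but the location half is easy to illustrate. The toy sketch below assumes a made-up list of branches for a recognised business and simply picks the one nearest the phone's GPS fix using the haversine distance; the names and coordinates are invented for the example.

```python
# A toy illustration of the location step: given the phone's GPS fix and a
# recognised business name, pick the nearest branch from a (made-up) list.
from math import asin, cos, radians, sin, sqrt

BRANCHES = {
    "Caffe Nero": [
        {"address": "1 High St", "lat": 51.5074, "lon": -0.1278},
        {"address": "22 Park Rd", "lat": 51.5155, "lon": -0.0922},
    ],
}

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def nearest_branch(name, lat, lon):
    """Return the closest known branch of `name` to the given GPS fix."""
    return min(
        BRANCHES[name],
        key=lambda b: haversine_km(lat, lon, b["lat"], b["lon"]),
    )

print(nearest_branch("Caffe Nero", 51.5080, -0.1280))  # -> the 1 High St branch
```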
Google CEO Sundar Pichai has made a big deal about the company's shift from a "mobile-first" to an "AI-first" approach.
"We are re-thinking all of our products” using machine-learning, he said.