Google Lens is an enormously useful image recognition tool that lets you identify plants, digitize notes, translate signs, investigate deepfakes, find a product online, get homework help, and more. Since 2022, its multisearch option has let you ask further questions about an image it identified. If you took a picture of a dress, for example, and the app found it for sale online, you could type “green” to find the dress in that color.
Now, Google Lens can listen in addition to seeing — you can ask for more info or add context to a search with your voice.
To use the new voice search, just hold the Lens shutter button down instead of tapping it.
Android journalist Mishaal Rahman demonstrated the feature in a post on X. In the demo, Rahman points his phone at a plate of blueberries and presses and holds the Lens shutter button. A prompt appears that reads, “Speak now to ask about this image.” When Rahman asks aloud how many blueberries are in the picture, a Google search appears with the correct answer.
According to Android Police, the feature has been in the works since earlier this year, so it appears Google is rolling it out to the public pretty quickly. If you’re not seeing it, make sure you have the latest version of Google Lens.
The option was already available on my Google Pixel 8, and it worked well when I tried it out. I pointed the camera at a pumpkin plant I have growing in my backyard for Halloween, held the Lens shutter button, and asked, “How long does it take this plant to grow fruit?”
Had I used Lens the traditional way, Google would have identified the plant as a pumpkin, but learning more would have required typing out a Google search. With this new feature, however, Gemini identified the pumpkin and correctly told me that it takes 90 to 120 days for the plant to produce fruit.
This update comes just a week after Google made several improvements to Google Search, including adding its “About this image” contextualizing feature to Circle to Search and Google Lens.