The new ‘Near me’ mode will let you match a photo to objects nearby. As Raghavan explained, you can take a picture of a dish and then look for restaurants that serve it; Google will show you a list of nearby restaurants that do. To power this feature, Google scans relevant photos from restaurant websites, as well as photos posted by reviewers, and compares them with the photo you uploaded. The ‘Near me’ feature is available only in English this year, but Google says it will expand to other languages soon.
Google is also introducing a feature it calls scene exploration.
This will let users pan the camera across a scene and then enter a search phrase about the objects in front of them. Explaining the feature, Raghavan gave the example of trying to find a chocolate bar without nuts in a supermarket: you will be able to scan an entire shelf of chocolate bars and then see overlaid labels with useful information, such as reviews of each product. Raghavan described it as ‘Ctrl+F for the world around you.’
The search giant first launched multisearch in April, but its purpose was largely limited to shopping and finding instructions. For example, you can take a screenshot or a photo of a dress you like and then enter a colour name, and multisearch will return a list of similar dresses in that colour. Or you can take a picture of a particular plant species and add the query ‘care instructions’ to find relevant information.
But as Google’s director of product management Lou Wang hinted at launch, multisearch could be used for much more, and that extra functionality is now starting to appear.
To use multisearch, open the Google app on Android or iOS and tap the Lens icon on the right side of the search bar. From there you can upload a photo or screenshot from your gallery, or take a picture of the object in front of you, to start a search. Swipe up on the results screen and tap ‘Add to search’ to add relevant words or phrases.
This new functionality rolls out today as part of a Google Lens update on iOS and Android. It is in beta, so it may not work perfectly at first; for example, it may return results for chocolate when you’re searching for information on an oil filter, because the boxes look similar. For now, multisearch is available only to users in the United States, in English. Google has yet to say when the feature will reach other countries or languages, though an expansion seems likely.