There were so many product announcements yesterday at Google I/O that one would be forgiven for thinking it was a hardware event instead of the software-focused developer conference it actually is. Still, one of the areas Google chose to spend quite a bit of time on was Search and how far it has come over the past year.
Google has been working to make search more natural, supporting queries that are closer to how people actually speak and ask for information — including combining two searches into one. To that end, Google recently launched “multisearch” in the Google app, which lets you search with images and text simultaneously. Later this year, Google will expand multisearch by letting you combine a picture or screenshot with the phrase “near me” to find local businesses that carry the item you’re looking for. The feature will launch globally in English and expand to more languages over time.
But that’s not all that’s coming to multisearch. A new feature called “scene exploration” will also debut in the future, letting you use Google Lens to surface information about multiple objects in a scene at once. The example shown at I/O featured shelves stocked with an assortment of candy bars: panning Lens across all of them revealed which were highly rated, dark, or nut-free. Imagine getting all that information just by sweeping your phone across several side-by-side items.
Other Search advancements discussed at the event already exist on the web but will now also be accessible via the Google app. One is a new tool, to be released soon, that lets you request the removal of personal results found in Search; another is a set of fact-checking tools that help you evaluate whether the information you’re getting can be trusted.