Google announced a new AI-based Android feature called “Circle to Search,” which allows users to search anything on their mobile phone by simply circling or highlighting it, without switching between applications.
People can also tap or scribble over photos or videos within apps to instantly search for information about the item. For example, if you see a pair of boots in a video, you can circle them, and a slew of similar options from retailers across the web will pop up.
“Natural, intuitive” search
To activate Circle to Search, long-press the home button or navigation bar on your Android device; a Google search bar will appear at the bottom of your screen.
Once you have done so, you can circle or scribble on an item, such as a pair of dog goggles, as seen in a demonstration video from Google, and the search bar expands to show search results for the object.
The scribble gesture can be used on both pictures and words, Google says. The company also suggested that users could circle the sunglasses a creator wore in their video to search for related items without leaving the same window.
Circle to Search will be available starting Jan. 31 on the just-released Samsung Galaxy S24 series and Google’s Pixel 8 and Pixel 8 Pro, in all languages and locations where those devices are available.
Google said the circling gesture is meant to make interacting with Google Search feel more “natural” whenever a question arises, so you don’t have to stop what you are doing and switch to a different app to search for information.
“Whether you’re texting friends, browsing social media, or watching a video, you can search for what’s on your screen right when your curiosity strikes,” Elizabeth Reid, Google VP of Search, said in a Jan. 17 blog post.
Google releases more search tools
In addition to Circle to Search, Google also announced a related generative AI search feature that combines images and text, which the company calls “multisearch.” The multimodal technology is reportedly similar to a feature OpenAI launched with ChatGPT.
Multisearch was introduced to Google Lens in 2022. Thanks to advances in artificial intelligence, the company said people can now point their camera at something, ask questions, and get AI-generated answers that go beyond identifying the object visually.
For example, when watching a food video featuring a Korean corn dog, you can circle the object and ask, “Why are these so popular?” The same applies to an unfamiliar board game you might find at a yard sale when you’re not sure how it’s played.
With multisearch, you can take a picture of the game and ask, “How do you play this?” and get a summary compiled by AI from several internet sources. Or, in another example from Google, you could scribble over the text “thrift flip” that appears in a YouTube Shorts video about thrifting.
The company said it has been testing multisearch for over a year through the Search Generative Experience (SGE) in Google’s Search Labs program to “see how generative AI can make Search radically more helpful.”
In October, Google said people who opted in to SGE could type a query into the Google search bar, and the engine would generate four image options to choose from and refine further.
SGE also allows users to write and customize drafts within the platform, with options to adjust the length and tone of the writing.