
Google’s new feature: Take a picture and add ‘Near Me’ to get local results

Google I/O 2022, the company’s annual developer event, has kicked off, and Google has announced a handful of updates to its most recent search innovation, Multisearch. For the uninitiated, Multisearch is a new way to search with text and images at the same time. With Multisearch in Lens, you can go beyond the search box and ask questions about what you see: in the Google app, you can pair an image with a text query, much as you might point at something and ask a friend about it.

How to Use Google Multisearch

Download the latest update for the Google app, then follow the steps below:

  • Open the Google app on Android or iOS
  • Tap the Lens camera icon
  • Upload a saved image or snap a photo of the world around you
  • Swipe up and tap the “+ Add to your search” button to add text

Things are going local with ‘Local Multisearch’

The company is adding a way to find local information with Multisearch, so users can surface what they need from the millions of local businesses on Google. A new mode, called “near me,” lets users take a photo of an object and then find results locally. You’ll be able to use a picture or screenshot, add “near me,” and see options for nearby restaurants or retailers that carry the apparel, home goods or food you’re looking for. For example, say you see a colorful dish online you’d like to try, but you don’t know what’s in it or what it’s called. When you use Multisearch to find it near you, Google scans millions of images and reviews posted on web pages, along with contributions from its community of Maps users, to surface nearby spots that offer the dish so you can go enjoy it for yourself.
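Multisearch’s “near me” mode lives inside the Google app and has no public API, but the general idea (recognize what’s in a photo, then query local listings for it) can be loosely approximated with Google’s public Cloud Vision and Places APIs. The sketch below is purely illustrative, not how Multisearch works internally; the API key, coordinates, and image path are placeholders.

```python
# Illustrative only: approximate "photo + near me" with two public Google APIs.
# Cloud Vision labels the image; the Places Nearby Search endpoint finds
# restaurants around a given location. Key, coordinates and file are placeholders.
import requests
from google.cloud import vision

PLACES_KEY = "YOUR_PLACES_API_KEY"   # placeholder
LAT, LNG = 37.7749, -122.4194        # placeholder coordinates

def identify_dish(image_path: str) -> str:
    """Return the most confident label Cloud Vision assigns to the photo."""
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    labels = client.label_detection(image=image).label_annotations
    return labels[0].description if labels else ""

def find_nearby(keyword: str) -> list[str]:
    """Ask the Places API for restaurants near the placeholder coordinates."""
    resp = requests.get(
        "https://maps.googleapis.com/maps/api/place/nearbysearch/json",
        params={
            "keyword": keyword,
            "location": f"{LAT},{LNG}",
            "radius": 2000,          # metres
            "type": "restaurant",
            "key": PLACES_KEY,
        },
        timeout=10,
    )
    resp.raise_for_status()
    return [place["name"] for place in resp.json().get("results", [])]

if __name__ == "__main__":
    dish = identify_dish("colorful_dish.jpg")   # placeholder file
    print(f"Looks like: {dish}")
    print("Nearby options:", find_nearby(dish))
```

The real feature goes well beyond this, of course: Google matches the photo against images and reviews on the web and in Maps, rather than relying on a single label.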

Local information in Multisearch will be available globally later this year in English, and will expand to more languages over time.

Explore your surroundings with ‘Scene Exploration’

Today, when you search visually with Google, Lens recognizes objects captured in a single frame; if you want to learn about several things in a scene, you have to scan them one by one. In the future, with an advancement called ‘Scene Exploration’, you’ll be able to use Multisearch to pan your camera across a wider scene and get context about multiple objects in the frame at once.

One example Google cites of where this could come in handy: you’re shopping for a nut-free chocolate bar for a friend and don’t know exactly which one to buy, so you scan the entire candy shelf to make sure you pick the right one.
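Scene Exploration is not something developers can call yet, but the capability it builds on, recognizing several objects in a single frame, can be approximated today with Cloud Vision’s object localization feature. This is a rough sketch of the idea rather than Google’s implementation; the image path is a placeholder.

```python
# Illustrative only: detect multiple objects in one photo with Cloud Vision's
# object localization, roughly mirroring the "scan the whole shelf" idea.
from google.cloud import vision

def explore_scene(image_path: str) -> None:
    """Print every object Cloud Vision localizes in a single photo."""
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    objects = client.object_localization(image=image).localized_object_annotations
    for obj in objects:
        # Each annotation carries a name, a confidence score and a bounding box.
        print(f"{obj.name}: {obj.score:.0%}")

if __name__ == "__main__":
    explore_scene("candy_shelf.jpg")  # placeholder file
```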

Unfortunately, Google didn’t offer a timeframe for when it expected to put the scene-scanning capability into the hands of users, as the feature is still in development.
