Google has announced a series of upcoming Search updates designed to simplify product discovery and use multiple input types to provide better contextual search results.
The main addition is the implementation of its Multitask Unified Model (MUM), which will allow people to search using variable inputs, including visuals, as search parameters, facilitating expanded discovery.
As Google explained:
“In the coming months, we’ll introduce a new way to search visually, with the ability to ask questions about what you see.”
As shown in Google’s example, this advanced search capability will soon enable users to employ a visual as a reference point – so if you want socks in a particular design pattern, you can use an image of that pattern as a prompt to search for it across different product categories.
The same capability could also be used in situations where you don’t know what something is called, or simply want to streamline the process (e.g. finding the right search term, reading up on bike parts, identifying the exact part you need, etc.).
This is a significant improvement in search capability, and could open up new considerations for discovery, and for how people arrive at your website as a result of their behaviors and queries.
Google’s MUM process will also facilitate broader contextual searches, based on advanced machine understanding, with Google also introducing a new element called ‘Things to Know’ to help steer searchers in the right direction.
“If you search for “acrylic painting,” Google understands how people typically explore this topic, and shows the aspects people are likely to look at first. For example, we can identify more than 350 topics related to acrylic painting, and help you find the right path to take.”
This, in turn, could become another SEO consideration, with more aspects of topics and related discovery paths being added to search results. It could be valuable to stay in touch with the latest trends, and to create website content aligned with these elements to maximize discovery potential.
Google is also adding a more visually tailored deep-dive search option for selected topics, as well as a new experience that identifies related topics within videos, with links to further information.
“With MUM, we can even show related topics that aren’t explicitly mentioned in the video, based on our advanced understanding of information in the video. In this example, even though the video doesn’t say the words “macaroni penguin’s life story,” our systems understand that the topics contained in the video relate to this theme, like how macaroni penguins find their family members and navigate predators.”
Google says the first version of this feature will roll out in the coming weeks, with more to come in the months ahead.
Visual search is also a key component of Google’s advancing Lens search process, which will make eCommerce discovery easier by enabling Google app users to search based on images, video and textual content on a page.
“iOS users will soon see a new button in the Google app to make all the images on a page searchable through Google Lens. Now finding that lamp or that shirt (and the like) is just a tap away.”
Google is also expanding its product listings in the main search results, drawing on the now 24 billion products listed in Google Shopping, while adding a new “in stock” filter based on local store listings, so it only shows nearby stores that have what you’re looking for.
Within each of these elements there are different considerations, and MUM’s advances are set to significantly change the way Google displays search results, which will have a big impact on discovery.
Exactly how this might change your SEO approach is less clear, but as these new processes roll out, we’ll gain better insight into their impact on SERPs, and subsequently on user behavior, which could prompt a rethink of some elements.
You can read more about Google’s search updates here.