The Future of Search Part 4: The Retail-Focused Evolution of Google’s SERP
This is part four of a five-part series detailing the near-term evolution and potential long-term future state of search. The first three parts dealt with Google reacting to and advancing beyond the cookie-less dilemma while also zeroing in on potential new profit centers. This fourth part details the in-progress reshaping of Google’s SERP in an attempt to make it more retail-focused.

Moving Retail Search Up a Level

As mentioned previously, Google has seen themselves become a conduit for other retailers. At best, they’re a place consumers visit prior to a transaction. But more often than not, they’re getting skipped over entirely, seeing as upwards of 74 percent of U.S. consumers begin their product searches on Amazon.com.

Google understands the inherent value in being a larger part of the retail ecosystem. Instead of remaining a site consumers use to connect to retailers, they want to be a one-stop shop for everything retail. Recent changes to their Search Engine Results Pages (SERPs) speak to bringing the retail experience up a level, eventually enabling consumers to complete their entire purchase without leaving Google.

The Evolution of Google’s SERP

Google’s SERP has evolved over recent years. There has been a shift in prioritization – focusing on pages that sell over pages that inform.

Not too long ago, a search for a product would produce a SERP composed mainly of pages from that brand’s site. But with the proliferation of retail and the evolution of product detail pages (from static, one-dimensional pages to product-specific content hubs), Google’s algorithm has shifted towards highlighting product detail pages (PDPs) and the information stored within. This transition from educational to transaction-focused content has led to a complete renovation of Google’s search results.

When searching for a product, the SERP might contain one brand.com listing while the rest is retail.com-focused, including the different promoted iterations (text ads, shopping ads, etc.). This shift is a direct result of PDP content representing the best answer for the query, leaving Google’s algorithm no choice but to highlight it.

Knowing this, Google has decided to pull that content up, in a way, by extracting it from the PDPs themselves. The goal is to house it all in one central location for consumers – Google’s SERP – rather than across multiple, disparate brand.com and retail.com sites.
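One concrete mechanism Google already leans on to lift PDP content is structured data markup embedded in the page itself. As a rough sketch – the product name, price, and rating below are entirely hypothetical – here is the kind of schema.org Product payload a PDP embeds so crawlers can extract price, availability, and review data directly, built as a Python dict for readability:

```python
import json

# Hypothetical schema.org Product markup of the kind a PDP embeds so that
# crawlers can lift price, availability, and review data off the page.
# All values below are made up; the vocabulary itself is schema.org's.
product_jsonld = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Trail Running Shoe",               # hypothetical product
    "brand": {"@type": "Brand", "name": "ExampleBrand"}, # hypothetical brand
    "offers": {
        "@type": "Offer",
        "price": "129.99",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "212",
    },
}

# A PDP would serve this inside a <script type="application/ld+json"> tag.
markup = json.dumps(product_jsonld, indent=2)
print(markup)
```

The richer and more complete this markup is, the more of the PDP Google can surface directly on its own SERP.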

Google’s recent partnership with Shopify, the introduction of Shopping Graph, their shift towards more visual elements, and the introduction of Buy on Google are just a few examples of this.

Google and Shopify’s Partnership

Last year, Google announced a partnership with Shopify, making it easier for Shopify users to create ads on Google.

It’s similar to Shopify’s partnership with TikTok, which made the process of creating a storefront on TikTok easier for Shopify users. Shopify merchants with TikTok for Business accounts can add a shopping tab to their TikTok profiles and then sync their product catalogs to those profiles.

Similarly, the Google partnership requires Shopify users to push their product catalogs into Google’s Merchant Center tool – a prerequisite for any advertiser using Shopping Ads. Linking the product catalogs lets Shopify users easily advertise their products on Google while also giving Google access to the product data of those very same users.
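Under the hood, that kind of catalog sync boils down to pushing product resources into Merchant Center, which Google exposes via the Content API for Shopping (v2.1). The sketch below shows the shape of a single product entry; the merchant ID, SKU, and product values are all placeholders, and a real call would additionally need OAuth credentials:

```python
import json

# Hedged sketch of a Content API for Shopping (v2.1) product resource.
# MERCHANT_ID and every product value below are hypothetical placeholders.
MERCHANT_ID = "1234567890"
ENDPOINT = (
    "https://shoppingcontent.googleapis.com/content/v2.1/"
    f"{MERCHANT_ID}/products"
)

product = {
    "offerId": "SKU-0001",          # hypothetical SKU
    "title": "Example Canvas Tote",
    "contentLanguage": "en",
    "targetCountry": "US",
    "channel": "online",
    "link": "https://example.com/products/canvas-tote",
    "imageLink": "https://example.com/images/canvas-tote.jpg",
    "availability": "in stock",
    "condition": "new",
    "price": {"value": "24.00", "currency": "USD"},
}

# An authenticated POST of this JSON body to ENDPOINT creates or updates
# the listing; repeating it per SKU keeps the catalog in sync with Google.
body = json.dumps(product)
print(ENDPOINT)
```

A platform-level integration like Shopify’s simply automates this loop across a merchant’s entire catalog, which is why the partnership removes so much friction for smaller sellers.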

Another interesting angle at play here is the casual introduction of Google Shopping Ads to lower-revenue merchants. Shopify has millions of users – a large chunk of which consists of smaller DIY shops, for whom Shopify is the preferred ecommerce platform. Google is likely betting on these smaller Shopify clients applying the same roll-up-your-sleeves approach to digital marketing that they’ve used with their commerce efforts.

In this case, the automation built into Google Shopping is a great fit. Set it, forget it, and let the algorithm take hold. It’s the perfect storm Google has been hoping for based on the recent evolution of their advertising products. They’ve introduced so much automation in the last few years it’s almost hard to keep track. Within Google Ads in particular, they’ve steadily developed features designed to push advertisers towards a more hands-off approach.

Oddly, there used to be a distrust of their algorithm. This distrust was rooted in the idea of one entity owning the product while also controlling how the money funneling through that product is spent. But at this point, it’s almost foolish to fight, given all of the user-based data signals they incorporate into each auction-specific decision. Algorithmic features like Smart Bidding, Dynamic Search Ads, Responsive Search Ads, etc. go far beyond what any manual optimization attempt could ever muster.

Google has been pointing towards mass consumption for a while. The Shopify integration is them officially putting a “no experience required” sign above their platforms. Just pump in your product data and they’ll do the rest. Whether it’s a mutually beneficial relationship is yet to be seen.

The Introduction of Shopping Graph

Google introduced the idea of Shopping Graph at the same time it revealed its Shopify integration. Shopping Graph can best be described as a commerce-focused variation on their Knowledge Graph. The goal of both features is the same – keep users from leaving Google.

Shopping Graph pulls together all forms of product information into a single location – Google’s SERP. It’s designed to give consumers all the information needed to make a purchase decision. Product details will be pulled from brand.com, retail.com, reviews, videos, and, as noted before, the product information brands themselves provide to Google via both Merchant and Manufacturer Center.

One of the main differences between Knowledge Graph and Shopping Graph is how they accumulate data. Knowledge Graph’s process is more a matter of collecting and then disseminating – pulling information in and then surfacing the best fit. Shopping Graph’s process is more dependent on merchants willingly providing information via their product catalogs.

Eventually, brands will need to provide this information to be successful on Google.com. It’ll be to every brand’s advantage to submit their product data, and it will be to Google’s advantage to absorb it.

But again, it’s Google wedging themselves into the retail conversation. If all goes right, the recent trend of Amazon being where most product-based searches begin could become a thing of the past.

As more and more retailers supercharge their selection, expand their omni-channel capabilities, and strengthen their logistical efforts, Amazon might soon find itself on less steady ground. If volume, logistics, preference, etc. begin to teeter across the online retail ecosystem, Amazon may soon be vying for supremacy instead of owning it. They could easily fall back to being a stop instead of the stop along the commerce journey.

If this happens, Shopping Graph puts Google in position to recapture the title of go-to product search engine and claim the title as the one-stop shop for any consumer product needs. This keeps users on Google’s site while providing more opportunities to capture consumer data and engage via advertising.

Now all Google needs is to find a way to get consumers to transact on their site as well.

Buy on Google … with Buy on Google

Buy on Google – which officially rolled out last year to all merchants – hasn’t quite taken off. Despite this, it’s easily their most ambitious attempt at injecting themselves into the traditional retail.com experience.

Despite being a no-brainer for Google, Buy on Google represents an obvious mixed bag for sellers. The plus for Google is the minus for sellers, in that it keeps consumers within Google’s domain for the entirety of the purchase process. Google’s offerings had previously linked out to sellers’ sites to complete the sale, which, in turn, allowed sellers to collect data, engage with consumers, etc. Buy on Google’s model puts a hard stop in place, preventing traditional, on-site DTC or retail.com transactions from occurring.

It also sits within an odd location. It’s a feature that isn’t always the easiest for consumers to reach, seeing as it currently requires a click on a paid ad to access. Despite Google’s recent introduction of free Shopping Ads and previously mentioned attempts to make their ads ecosystem more accessible, it’s still difficult for sellers to act on Buy on Google without … buying ads on Google.

The most recent numbers point to sellers feeling this pain, with just shy of 7,500 sellers having adopted Buy on Google as a transactional option at the end of 2020. That’s a little under one-tenth the number of sellers within Walmart’s marketplace, and nowhere near Amazon’s, whose vendor/seller count soars past two million. The data also pointed to the platform often accounting for negligible sales, which is not great.

The inconsistencies and gaps noted point to Google still needing to figure this piece out. It’s hard to imagine them becoming a legitimate part of the retail conversation without offering some level of marketplace for consumers to transact on – the assumption being that they’ll figure out a way to make themselves more relevant within this arena.

Activating on Visual Elements

The underlying connective tissue across these retail-based enhancements is the visual component. Making shopping ads more easily accessible, pulling PDP content upward via the Shopping Graph, giving consumers the ability to Buy on Google – they all require a shift to a more visual SERP.

Amazon is incredibly particular regarding its image requirements. Certain sizes, pixel counts, backgrounds, text limitations, and static vs. lifestyle are just a few that come to mind. Every image needs to fall within a set of guidelines. Why? Because imagery can make or break an online interaction. And the absence of imagery? Well, in retail it’s a death knell. Google knows this – thus the shift from their previously text-heavy ecosystem.

One of their more recent advancements is multi-search, which gives users the ability to search via an image in concert with a text-based query. Google then cross-references both queries and provides an output. This focus on visual inputs allows consumers to streamline the search interaction – quickly finding results instead of fumbling for ways to describe an item via form fill.

Imagine someone finding the exact dress they’ve been looking for within an image online, but a mention of the designer is nowhere to be found. Instead of multiple, exhaustive, text-based queries, they can use Google Lens to capture the image and then layer in the color variation they’re looking for in the form of text. The interaction with Google goes from a text-heavy, open-ended query capable of producing far-reaching, irrelevant results to an image- and color-specific, multi-modal search designed to produce tailored results (pun intended).

This is all by design, and a major component of Google’s most recent search algorithm updates.

The first such update was titled Bidirectional Encoder Representations from Transformers, or BERT for short. Then came the most recent iteration, known as the Multitask Unified Model, or MUM. Both were designed to accomplish two things:

  1. To make normal people feel like absolute idiots when trying to recite their names
  2. To push Google’s Natural Language Processing (NLP) capabilities forward

The second goal was centered around helping Google evolve by grasping not just the content of a query, but also its context – down to the role specific words play within a search query.

Speaking to Alexa is a great example of this. Ask Alexa a canned question it’s designed to answer (what’s the weather today?) and it will more than likely provide an adequate response. Attempt to engage in dialogue or some semblance of conversation, however, and its limitations arise rather quickly.

This is where NLP comes into play. A stronger NLP capability would enable Alexa to take in vocal commands and questions, process them, and return a relevant response. It’s the ability to understand complex questions and pathways that separates what we currently understand voice search to be from what it should be. Google – through this newest search algorithm – is striving for the latter.
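A toy example makes the content-versus-context distinction concrete. The travel-visa query below is a variation on the example Google highlighted when introducing BERT: a pure bag-of-words comparison (counting words while ignoring their order and function words like “to”) cannot tell two queries apart even when swapping the direction of travel flips the meaning entirely, whereas a contextual model like BERT embeds each word relative to its neighbors and can.

```python
from collections import Counter

def bag_of_words(query: str) -> Counter:
    """Naive representation: count words, discard order and context."""
    return Counter(query.lower().split())

# Same words, opposite intent: who is traveling where?
q1 = "brazil traveler to usa need a visa"
q2 = "usa traveler to brazil need a visa"

# The bags are identical, so an order-blind matcher treats these two
# queries as the same search - exactly the failure mode BERT addresses.
print(bag_of_words(q1) == bag_of_words(q2))  # prints True
```

That gap – identical surface content, different meaning – is precisely what Google means by understanding the context of specific words in search queries.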

BERT – with its focus on NLP – signified an initial shift towards more complex queries and, in turn, more exact results. For example, the tech behind BERT was put to work powering featured snippets. Although they weren’t perfect, they did represent a new output for certain queries. At a base level, this achieved Google’s goal of providing answers and not just results. And if Google can provide answers, then why can’t it anticipate them as well?

The idea of anticipatory search – a more future-focused version of the current iteration – incorporates historical, behavioral, environmental, and predictive signals. Even though the technology isn’t there yet to accomplish this type of interaction, the willingness on Google’s end is. They’re investing time and talent to get there.

The sphere of retail intersects the future state of search along multiple touchpoints. However, this is where they most violently collide. Predicting or anticipating a need before it’s fully realized – anticipating a purchase before consumers can themselves – ushers someone down the theoretical funnel in one fell swoop, flattening it in the process.

It’s guiding a lot of the products Google is developing (as mentioned) while also helping usher in the future iteration of their best-known product: Google Search.

Interested in learning about the future state of search, and all the various ways it’s connected to retail? Read Part 5.