Today’s search engines are one of the primary paths to content experiences on the web. Shopping, research, directions and questions all pass through Google (and to a lesser extent Bing and Yahoo) as the entry point to solutions-seeking. Generally, the experience is the destination, and search engines have done a fine job of channeling users to the precise experience they’re looking for.
What if, however, the end result could be reached earlier, circumventing today’s path? Changes in paths to consume content and experiences have emerged through technologies that offer a glimpse into what search may look like in the not-so-distant future.
Related: How Google Has Changed the World
Because Google Answers and Google My Business data now dominate search results, delivering answers to users directly after they search, the need to click through to a website is often eliminated. While this might not be a delightful or engaging content experience, a very distinct need is being addressed through answers and data delivered right in the search engine results pages. Consequently, disruptive technology companies (Google being one of them) have plans to delight in other ways, almost certainly changing the search landscape as we know it.
Human language search
We’re currently witnessing the rise of voice search on our personal devices. The results are simple answers and data, similar to what we see displayed visually via Answers boxes and Google My Business. An interesting trend that has emerged since Siri and Cortana came on the scene is our “human language search” approach to querying. We speak to our devices the same way we speak to a barista or store clerk, and that behavior has followed us back to our keyboards when we search on desktop and mobile. This suits search engines well: we tend to be clearer about our intent and context when we search this way than with the old-school Boolean method.
Anything more detailed than your device returning simple answers or asking for a next step, however, will require better familiarity between the user and the search engine. If we’re to expect the disappearance of the keyboard and rely on voice completely, machine learning and personalization will need to evolve. At the very least, we’ll be tethered to something that forces us to select the next step for the foreseeable future, even if we speak to our devices like fellow humans. With Google Lens visual search, however, all bets may be off.
Related: 4 Reasons Why Amazon’s Product-Search Dominance Matters
Digital personal assistants
Google Home, Amazon Alexa and others have taken search results a step further, neatly bypassing displayable data altogether: answers, weather and news are read aloud rather than requiring a query, click or scroll. Commands to purchase can also be made through a digital personal assistant. What this means, however, is that chasing ownership of the top three results in search engines will be a thing of the past. When digital personal assistants are fed search results from the top spot, position one will be the only relevant position to own.
Shopping may be a different story altogether if your device prioritizes results from its own ecosystem over rival search engines. Looking at how Amazon manages its Alexa devices today is just a preview into online shopping’s future.
Related: Everything You Can Do With Amazon Alexa
As Google has evolved to meet the needs of users by limiting options and providing quick rewards directly in results pages, so have other complementary (and at times competitive) platforms.
Facebook Instant Articles and Google AMP don’t take users too far away from the originating platform source, enabling them to return to whatever they were doing before something caught their eye. Solutions like Facebook Store integrate products for an in-platform shopping experience, tightening the gap between product discovery and purchase, while directing users away from Google’s fairly limitless shopping mall of possibilities.
WeChat takes it a few steps further. What began as a messaging platform in China has become a robust social, commerce and payments ecosystem without a U.S. equivalent. Taking a taxi to dinner via WeChat can include hailing a ride, route sharing, messaging a photo from the cab and splitting payments, all in a single platform. Meanwhile, the U.S. industry is still too fractured for full social and payments integration, meaning customers have limited social sharing in their ride-sharing apps. In Uber’s current state, it just doesn’t seem realistic to pull up a menu attached to your restaurant destination the way you can in WeChat.
But it’s remarkable to think that before Uber and Lyft, we would have Googled for a local taxi company, visited its website for a phone number and manually dialed to speak to a human. That shift has paved the way for introducing something like WeChat in the U.S., or at the very least, for a powerhouse like Facebook to integrate social media, services and payments in a way that limits the need for some types of search behavior in the future.
Related: Whether You Buy Online or in Stores, Google Will Know
If you’re interested in how organic search will be impacted next, look to other digital marketing channels for the blueprint. In paid media, we see search and display ads based on demographics and browsing history. Programmatic display, beacon technology and more advanced targeting get even creepier. In organic search today, we see results based on location and past purchases we may have made. Google’s search algorithm has brought us what the engines consider the most relevant results, but those results are rarely individualized in any meaningful way.
Moving past simple targeting, machine learning and platform integration can deliver results to users on an absolutely personal level. Imagine walled-in ecosystems like Facebook, along with news sites on the open web, feeding data into Google’s organic algorithm to address our queries personally. Marketers already use beacon technology to know we were just at Starbucks and location data to know it’s about to rain, all in order to serve up the perfect ad creative. It’s only a matter of time before similar collections of intelligence can answer the “what should I do today in New York?” query with a relevant result that won’t leave you over-caffeinated and caught in the rain.