What Google I/O 2023 reveals about the future of SEO
A CEO I advise recently told me about a conversation he had with Marissa Mayer during her time at Google. When talking about SEO, she dropped a comment that captures the current moment: “We’re not a search engine. We’re building an answer engine.” 15 years later, that vision has become reality.
The announcements at Google I/O 2023 herald the biggest changes yet to the most expensive real estate on the internet: Google Search. While a new paradigm for the tech industry started in November of last year (when ChatGPT came out), the new paradigm for SEO starts now.
In this Memo, I highlight 12 observations from the Google I/O 2023 keynote and surrounding material that can tell us what SEO might look like from here.
TL;DR:
The SEO playing field is splitting into 3 new ones: AI Snapshots, non-AI results, and Perspectives
SEOs need to learn a new set of skills, which is equally intimidating and exciting
Signals that might become more important: image optimization, brand combination searches, links/mentions, reviews
12 observations about the future of SEO from Google I/O
The prototypes introduced in the I/O 2023 keynote and blog articles Google published shortly after reveal a new set of skills and problems for SEOs.
Observation 1: The first thing that stuck out to me is the idea of “doing more with a single search”. Google Engineering VP Cathy Edwards repeated it many times throughout the Search portion of the I/O 2023 keynote.
The idea is simple: today, users need to perform many queries to get an answer to their overarching question or problem. Moving forward, Google wants to reduce the number of queries needed to get a good answer, ideally to just one.
In SEO, more queries mean more opportunities to get users onto your site, convert them into customers, or at least build brand recognition. Fewer searches mean less traffic and data, which isn’t just about conversion attribution but also user journey tracking. As I described before, less traffic doesn’t have to mean less business. It could also mean getting more qualified traffic. Overall, I expect some verticals, like e-commerce and publishing, to receive less traffic.
Observation 2: AI Snapshots provide a much better answer to longtail queries, which Google has traditionally been bad at. With all the SERP Features and ads Google shows, longtail queries have been an effective way to get high-quality traffic with good click-through rates. That’s going away.
AI Snapshots, Google’s product name for search results generated with PaLM 2, provide a direct answer and link to 3 web results. There’s a high chance SEO will be all about ranking in the AI Snapshot link carousel.
Did anyone notice that some SERPs show only 3 organic results under the AI Snapshot?
Observation 3: The SEO playing field breaks into 3 parts: AI Snapshots, non-AI results, and Perspectives.
The 3 new playing fields seem to be structured by EEAT: the AI Snapshot link carousel will likely get the most search traffic but be limited to high-authority, high-expertise web results. Below the AI Snapshot, we have “authentic, first-hand experience” results. Ranking here will likely take content that emphasizes the experience of individuals, most probably experts. And in the Perspectives tab, we have a new playing field of creator results. Ranking here might be much easier and based on a new set of signals.
Observation 4: Perspectives might be a valuable traffic source for affiliates and brands.
Perspectives is already live for some queries with high QDF (query deserves freshness), like “google io announcements” or “recession”. Right now, the results come from Twitter and web content. Pretty soon, I expect them to also come from YouTube, Reddit and TikTok.
Observation 5: Image thumbnail optimization will become a much stronger lever.
Across all of these new surfaces, images stand out as a way to get attention and clicks. We’ll find new ways to make thumbnails stand out in AI Snapshots, classic web results and Perspectives. YouTubers have played this game for years, and SEOs will need to catch up.
Observation 6: video continues to play a role, but I was disappointed by how little Google showed. I accidentally found videos as part of AI Snapshots hidden in the SGE trailer, but I was hoping for a Key Moments in Video feature in AI answers. I’m bullish that Google will bring more YouTube results into AI Snapshots.
Observation 7: Corroborated results break the AI Snapshot down into different components and might allow us to understand how Google’s AI answer comes together. They are built on the classic ranking algorithm, which makes sense: most search engines use a technique called Retrieval Augmented Generation [paper], which cross-references AI answers from LLMs (large language models) against classic search results to reduce hallucination. Google pointed out several times that the new experience is “rooted in the foundations of Search”, which alludes to the concept of Retrieval Augmented Generation.
Google understands all the implications of a longtail search query and breaks them down (corroborates them) into discrete pieces of information.
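To make the mechanics concrete, here is a minimal sketch of the RAG pattern, assuming a hypothetical retriever (`search_index`) and LLM call (`llm_generate`); Google’s actual pipeline is not public:

```python
# Minimal sketch of Retrieval Augmented Generation (RAG).
# `search_index` and `llm_generate` are hypothetical stand-ins, not Google APIs.

def answer_with_rag(query: str, search_index, llm_generate, k: int = 3) -> dict:
    # 1. Retrieve: run the query against the classic ranking algorithm.
    results = search_index.search(query, top_k=k)

    # 2. Augment: pass the retrieved snippets to the model as evidence,
    #    so the answer is grounded in indexed documents, not just model memory.
    evidence = "\n".join(
        f"[{i + 1}] {r.title}: {r.snippet}" for i, r in enumerate(results)
    )
    prompt = (
        "Answer the question using ONLY the sources below.\n"
        f"Sources:\n{evidence}\n\nQuestion: {query}\nAnswer:"
    )

    # 3. Generate: the LLM writes the answer; the same sources double as
    #    the corroborating links shown in the snapshot carousel.
    return {"answer": llm_generate(prompt), "corroborating_results": results}
```

Grounding generation in retrieved documents is what makes the answer checkable: each claim can be traced back to one of the corroborated results.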
Observation 8: SEOs will lose critical data if Google doesn’t build new ways to understand site performance.
Search volume and rankings take on totally different meanings and might be useless for AI Snapshots. SEOs already don’t get enough data from Google to understand the impact and performance of SERP Features. We need new data in this AI world to help us understand the cause and effect of our optimizations.
Observation 9: No concept for Google Ads yet.
Google still shows Shopping ads above AI answers, which helps the company monetize the verticals that matter most. But we don’t yet know what Google Ads will look like in AI Snapshots. Does Google even know? Pichai and other execs stressed that ads are part of the new ecosystem, but Google seems hesitant to commit to a design just yet.
Observation 10: content still matters for e-commerce.
The shopping results in Google’s I/O keynote seem to be accompanied by buyer’s guide content.
In some cases, Google even links to such guides in the Snapshot carousel. It will be quite interesting to figure out how to get into that spot and when Google links to guides.
Observation 11: For the first time ever, YMYL topics might have it easier in SEO because they don’t trigger an AI Snapshot (at least not always).
Google mentioned several times throughout the keynote that it is being very cautious with YMYL topics. Mind you, YMYL verticals like health, finance or legal are heavily regulated. If AI Snapshot answers look too much like recommendations, Google could get into serious trouble.
Observation 12: Google will watermark and tag AI content.
Google is adamant about doing all it can to detect AI content. Again, from an SEO perspective, AI content is only a problem when it’s low quality. But from a platform perspective, Google cannot tolerate fake news, images or video. Google is collaborating with Midjourney, Shutterstock and other generative AI tools on meta tags and watermarks, much like the shared standards all search engines already follow.
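As a rough illustration of what machine-readable tagging could enable: the IPTC metadata vocabulary already includes a digital source type value for AI-generated media, and because XMP metadata is embedded in image files as plain XML, even a naive scan can surface it. A toy check, not a robust parser and certainly not Google’s detection method:

```python
# Toy check: scan an image's embedded XMP metadata for the IPTC digital
# source type value that marks AI-generated media. XMP is stored inside
# the file as plain UTF-8 XML, so a substring search gives a rough signal;
# a real implementation would parse the XMP packet properly.

def flagged_as_ai_generated(path: str) -> bool:
    with open(path, "rb") as f:
        data = f.read()
    return b"trainedAlgorithmicMedia" in data

print(flagged_as_ai_generated("hero-image.jpg"))  # hypothetical file
```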
Google’s true advantage: data
Investors reacted well to Google’s presentation: Alphabet’s stock jumped by over 5.2% shortly after the I/O 2023 keynote, while Microsoft’s dropped slightly.
However, when you zoom out, you notice Microsoft is still ahead in percentage gains over the last 12 months.
To me, the fact that Microsoft is still ahead in stock gains is perfectly symbolic of Google not delivering an innovative concept for Search but copying Bing’s, Neeva’s or You.com’s approach to AI Search. Google is doing to Bing what Meta did to Snapchat with Stories: copying a new product feature to fence in a competitor’s growth. The cost of failing is much higher for Google than for Microsoft.
Google is doing the same with Bard Tools, a new marketplace for Bard integrations. Sounds a lot like ChatGPT plugins? That’s because it is the same move. Google has powerful partners like Kayak, Spotify, OpenTable, ZipRecruiter, Instacart, Wolfram and Khan Academy on board. Since every company will be an AI company, the sooner brands can train models on their own data and be visible in front of users, the better. For Google, it’s strategically important to own this marketplace.
Google’s biggest advantage is training AI models on all the data it has access to - not just the web but also user data. New AI tools like Help Me Write, the evolution of autocomplete, are hopefully trained on all my emails. Google has understood this assignment in Cloud: Thomas Kurian, CEO of Google Cloud, announced a PaLM API and a MakerSuite app that make it very easy to train AI models on different data, for example, your website! [link]
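For a sense of how low the barrier is, here is a minimal sketch along the lines of the PaLM API’s Python quickstart; the model name and parameters reflect the launch-era API, and grounding the prompt in your own page copy is my illustration, not a documented MakerSuite feature:

```python
import google.generativeai as palm

palm.configure(api_key="YOUR_API_KEY")  # key created in MakerSuite

# Hypothetical example: ground the model in your own site content
# by passing a page's text as context in the prompt.
page_text = "Acme Widgets ship within 2 days and carry a lifetime warranty."

completion = palm.generate_text(
    model="models/text-bison-001",  # PaLM 2 text model name at launch
    prompt=(
        f"Using only this page content:\n{page_text}\n\n"
        "Answer the question: How fast do Acme Widgets ship?"
    ),
    temperature=0.2,
)
print(completion.result)
```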
A long list of Billion-Dollar Questions
Google’s keynote intimidates and excites me. Of course, I had an “Oh sh!t” moment like so many other SEOs. It’s the biggest change we’ve seen to Search.
And yet, I’m excited because the cards are reshuffled. New surfaces mean new opportunities and skills to learn.
13 questions came to mind for me during the Google keynote:
Will Google keep classic organic results or phase them out when they see no one engages with them?
How many users will look at corroborated results?
Will users scroll down to the organic listings?
Will Google give us new data to understand how to optimize for SGE?
Is the information users receive in AI Snapshots sufficient, or do they want to hear from a specialist?
Will the top 3 results in the AI Snapshot get most of the clicks, or will the total number of clicks shrink (zero-click searches going up)?
How will Google keep the web ecosystem alive to get the content it needs for AI Snapshots?
How much will we need to lean on data from ads in AI answers?
What will the AI Snapshot look like on mobile? Or will mobile look even more extreme?
What will algorithm updates look like in the future of Search?
Will Google bring out a sidebar chatbot like Edge/Bing?
Can Google provide enough incentive for site owners to let Google index their sites, or will we see more sites opting out of Google Search if they see traffic eroding? (See the sketch after this list.)
Is the push from Microsoft a welcome change that masks something Google wanted to do all along?
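On the opting-out question: the mechanism already exists today in robots.txt. A minimal check of whether a site blocks Googlebot, using only Python’s standard library (example.com is a placeholder):

```python
from urllib.robotparser import RobotFileParser

# Fetch and parse a site's robots.txt (example.com is a placeholder).
rp = RobotFileParser("https://example.com/robots.txt")
rp.read()

# False would mean the site has opted this URL out of Googlebot's crawl.
print(rp.can_fetch("Googlebot", "https://example.com/some-article"))
```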
Many more questions will come up over the coming months, but it’s clear that Google has achieved its vision of an answer engine. What will the vision for the next 15 years look like?