This is my official eulogy for the SEO keyword, which died many years ago, but no one has noticed. As a result, many marketing teams make sub-par decisions, and decision-makers lose trust in SEO as a channel. Just look at the recent “SEO is dead” reactions to Hubspot’s traffic decline.
Google enforces JS for crawling
Since January 20th, Google requires Javascript for Search, which makes rank tracking more expensive. This is the latest move in a long-standing battle between SEOs and Google. Rank trackers (Semrush, Ahrefs, SEOmonitor, etc.) have been operating in a gray zone: Google tolerates them but officially doesn't allow scraping its results.
Now, with Javascript as a hard requirement, rank tracking needs more RAM, which increases the cost of every data point. The result is that the cost of doing SEO is rising. But rank tracking lost its value long before Google switched to Javascript crawling.
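To make the cost difference concrete, here is a minimal sketch contrasting a plain HTTP fetch with a headless-browser render, assuming Python with the requests and Playwright packages; the URL is a placeholder, not a real Google endpoint.

```python
# Minimal sketch (assumption: requests and Playwright are installed): the same
# page fetched without and with JS rendering. The rendered path launches a full
# Chromium process, which is where the extra RAM per data point comes from.
import requests
from playwright.sync_api import sync_playwright

URL = "https://example.com/serp-like-page"  # placeholder URL

def fetch_plain(url: str) -> str:
    # Plain GET: no rendering engine, a few MB of memory.
    return requests.get(url, timeout=10).text

def fetch_rendered(url: str) -> str:
    # Headless Chromium executes the page's JavaScript before we read the HTML.
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        page.goto(url, wait_until="networkidle")
        html = page.content()
        browser.close()
    return html

if __name__ == "__main__":
    print(len(fetch_plain(URL)), "bytes without rendering")
    print(len(fetch_rendered(URL)), "bytes with rendering")
```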
Focusing on single keywords hasn’t made sense in a while
With all the elements in the SERPs and the unrepresentative data we get, it’s hard to project impact and measure success purely based on keyword ranks.
In 2013, Google stopped sharing keyword referrers. The only way to understand what users searched to get to your site was and still is Google Search Console.
In 2014, Google started showing Featured Snippets, direct answers at the top of the search results, which created more winner-takes-all situations. Subsequent SERP features like People Also Asked or video carousels followed and climbed up the ranks. Today, over 30 known SERP Features compete for attention with classic search results. It's very hard to predict how many clicks you might get because there are so many combinations of SERP Features.1
Since last year, Google has been showing ads between organic results, breaking the traditional separation of organic and paid results.
And, of course, last year, Google launched AI Overviews. The AI answers are now available in over 100 countries and provide in-depth answers. Clicking through to search results is now redundant in some cases.
The data Google shares around these trends ranges from non-existent to bare-bones. AI Overviews and SERP Features are not included in Search Console data. And that's before considering that Google filters out about 50% of query data for "privacy reasons": Ahrefs looked at 150,000 websites and found that about 50% of keywords and clicks are hidden.2
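You can estimate the hidden share for your own site: the page report's totals include clicks from anonymized queries, while the query report only contains what Google chooses to show. A minimal sketch, assuming two standard CSV exports from Search Console (file names and column labels are mine):

```python
# Minimal sketch: estimate how much of your own click data sits behind
# anonymized queries. Assumes "queries.csv" and "pages.csv" exports from
# Google Search Console with a "Clicks" column in each.
import pandas as pd

queries = pd.read_csv("queries.csv")   # one row per visible query
pages = pd.read_csv("pages.csv")       # one row per page

query_clicks = queries["Clicks"].sum()   # clicks Google attributes to a visible query
total_clicks = pages["Clicks"].sum()     # page totals include anonymized queries

hidden = total_clicks - query_clicks
print(f"Clicks attributed to visible queries: {query_clicks}")
print(f"Total clicks (page report): {total_clicks}")
print(f"Estimated hidden share: {hidden / total_clicks:.1%}")
```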
On one hand, a single web page can rank for thousands of keywords as long as those keywords express the same intent and the page gives a good answer to all implied questions. This has been the case for many years now. On the other hand, more and more keywords don't deliver traffic because all clicks go to a SERP Feature that keeps people in the search results, or a click isn't necessary - searchers get the answer in the search results. Sparktoro found that more than 37% of searches end without a click, and more than 21% result in another search.3
A couple of months ago, I rewrote my guide to inhouse SEO and started ranking in position one. But the joke was on me. I didn’t get a single dirty click for that keyword. Over 200 people search for “inhouse seo” but not a single person clicks on a search result. By the way, Google Analytics only shows 10 clicks from organic search over the last 3 months. So, what’s going on? The 10 clicks I actually got are not reported in GSC (privacy… I guess?), but the majority of searchers likely click on one of the People Also Asked features that show up right below my search result.
The bigger picture is that the value of keywords and ranks has tanked.
Our response can be two-fold:
1st, while the overarching goal should still be to rank at the top, we need to target the element that’s most likely to get all the attention in the SERPs, like video carousels or AI answers. In some cases, that means expanding “SEO” to other platforms like Youtube.
2nd, we need to look at aggregate data.
Aggregate traffic > keywords
We're still operating with the old model of SEO, where we track a list of keywords to measure success and set targets. But how much sense does that make when pages rank for many keywords? And how much sense does it make as search moves from a list of results to LLM answers?
The keyword doesn't have a future in search. What does have one is intent, and LLMs are much better at understanding it.
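As an illustration of working with intent rather than individual keywords, here is a minimal sketch that groups queries by semantic similarity using sentence embeddings; the model name and the 0.75 similarity threshold are illustrative assumptions, not part of the original argument.

```python
# Minimal sketch: group keywords by intent instead of tracking them one by one.
# Assumes the sentence-transformers package; model and threshold are illustrative.
import numpy as np
from sentence_transformers import SentenceTransformer

keywords = [
    "inhouse seo",
    "in-house seo team",
    "how to build an seo team",
    "best crm software",
]

model = SentenceTransformer("all-MiniLM-L6-v2")
emb = model.encode(keywords, normalize_embeddings=True)

# Greedy clustering on cosine similarity: keywords above the threshold are
# treated as one intent and would be served by one page.
clusters = []
for i in range(len(keywords)):
    for cluster in clusters:
        if np.dot(emb[i], emb[cluster[0]]) >= 0.75:
            cluster.append(i)
            break
    else:
        clusters.append([i])

for cluster in clusters:
    print([keywords[i] for i in cluster])
```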
So, here is my suggestion: Instead of focusing on keywords, we should focus on organic traffic aggregated on the page or domain level.
Some traps to watch out for (see the sketch after this list):
- We still need keywords to model brand vs. non-brand traffic by page (still works because you should have enough keywords)
- Beware of seasonality
- Split organic traffic out by new vs. existing pages
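Here is a minimal sketch of that aggregate view, assuming a Search Console export with one row per date, page and query plus a page launch date; the brand pattern, file name and cutoff date are placeholders you would adapt.

```python
# Minimal sketch of the aggregate view (assumptions: a GSC export with one row
# per date/page/query and a "first_seen" launch date per page; brand pattern,
# file name and cutoff are placeholders).
import pandas as pd

df = pd.read_csv("gsc_export.csv", parse_dates=["date", "first_seen"])
# expected columns: date, page, query, clicks, first_seen

BRAND_PATTERN = r"acme"                        # placeholder brand term(s)
NEW_PAGE_CUTOFF = pd.Timestamp("2024-07-01")   # placeholder launch-date cutoff

is_brand = df["query"].str.contains(BRAND_PATTERN, case=False, na=False)
df["segment"] = is_brand.map({True: "brand", False: "non-brand"})
df["page_age"] = (df["first_seen"] >= NEW_PAGE_CUTOFF).map({True: "new", False: "existing"})

# Monthly organic clicks per segment -- compare the same month year-over-year
# to keep seasonality out of the read.
monthly = (
    df.groupby([pd.Grouper(key="date", freq="MS"), "segment", "page_age"])["clicks"]
      .sum()
      .unstack(["segment", "page_age"])
)
print(monthly)
```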
To track how well a domain or page is doing, we can still look at keywords, but the direction of aggregate organic traffic is more indicative of whether it's doing well or not.
One big issue I have with keywords is that search volume has many flaws and is so unrepresentative of what’s actually going on. The term “inhouse seo” has a reported search volume of 90-200 in the biggest rank trackers but doesn’t actually deliver any clicks.
To know what pages to create without keyword research, talk to customers and analyze what topics and questions they care about. Analyze platforms like Reddit and Youtube for engagement and reverse-engineer what topics work. And we can - and probably will have to - use paid search data to inform SEO, because the way Google matches (PMAX) ads to searches makes that data more reflective of actual user intent.
To project traffic, look at domains or pages that are already visible for topics we care about, just not on the keyword level. Clickstream data that reflects how users browse the web is much better because it doesn’t project potential traffic based on a keyword position.
Where keywords still make (some) sense is in analyzing historical search volume to project whether a topic is growing or shrinking - but I suggest doing that only with large sets of keywords.
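A minimal sketch of that idea, assuming a CSV with monthly volumes for a large keyword set exported from whichever tool you use; file name and columns are placeholders.

```python
# Minimal sketch (assumptions: file name and columns): is a topic growing or
# shrinking, judged on search volume aggregated over a large keyword set?
import pandas as pd

vol = pd.read_csv("topic_keywords_monthly.csv", parse_dates=["month"])
# expected columns: keyword, month, volume

topic = vol.groupby("month")["volume"].sum().sort_index()
yoy = topic.pct_change(12)  # year-over-year change of the aggregate, not of any single keyword

print(topic.tail(12))
print(f"Latest YoY change for the topic: {yoy.iloc[-1]:.1%}")
```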
A huge benefit of the aggregate traffic approach is that it transfers well to LLMs: they don't give us queries either, but we can track referral traffic on the domain and page level. ChatGPT even adds a URL parameter to outgoing clicks that makes tracking easier.
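A minimal sketch of page-level LLM referral tracking; the utm_source value and referrer domains below reflect what these tools send at the time of writing and may change.

```python
# Minimal sketch: classify LLM referral traffic at the page level from a
# landing-page URL and a referrer string.
from typing import Optional
from urllib.parse import urlparse, parse_qs

def classify_llm_referral(landing_url: str, referrer: str) -> Optional[str]:
    params = parse_qs(urlparse(landing_url).query)
    if "chatgpt.com" in params.get("utm_source", []):
        return "chatgpt"           # ChatGPT tags outgoing clicks via utm_source
    ref_host = urlparse(referrer).netloc
    if ref_host.endswith("perplexity.ai"):
        return "perplexity"
    if ref_host.endswith("chatgpt.com") or ref_host.endswith("chat.openai.com"):
        return "chatgpt"
    return None                    # not an LLM referral (or not detectable)

# Usage: aggregate the labels by landing page -- queries aren't available anyway.
print(classify_llm_referral("https://example.com/guide?utm_source=chatgpt.com", ""))
```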
LLM defense: activated
The real reason Google enforces JS crawling is not to hurt SEOs but Gen AI competitors. ChatGPT and Perplexity are gaining significant ground. ChatGPT has already surpassed the traffic of Bing and Google's Gemini. Perplexity is on the way there. However, LLM crawlers can't execute JS, which means they can no longer crawl Google's search results to ground their answers.
(BTW, you need to make sure your content is accessible without JS. Otherwise, LLMs can’t crawl it, and you can’t appear in their answers.)
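A quick way to sanity-check this is to fetch the raw HTML the way a non-rendering crawler would and look for a phrase that should sit in the main content; a minimal sketch (URL and phrase are placeholders):

```python
# Minimal sketch: is key content present in the raw HTML, i.e. without JS
# rendering -- roughly what an LLM crawler that can't execute JS would see?
import requests

def visible_without_js(url: str, must_contain: str) -> bool:
    # Fetch the raw HTML without executing any JavaScript.
    html = requests.get(url, timeout=10, headers={"User-Agent": "plain-html-check"}).text
    return must_contain.lower() in html.lower()

print(visible_without_js("https://example.com/guide", "in-house SEO"))
```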
The quality of some LLMs might decrease due to Google enforcing Javascript, but only temporarily. LLMs can still get SERP data in other ways. But one potential consequence of Google’s decision is that LLM developers build their own models or web indices to weigh answers and become independent of search engines. The second-order effect of that would be that it’s no longer enough to do good SEO to appear in LLMs. We would have to reverse engineer LLM results like we did with Google.
If SEO is a game, winning in SEO now requires adjusting based on the flow (intent) of the game instead of counting cards (keywords).
The future of search is not keywords but intentions. Let this be my official eulogy.
It's a new dawn. And we need to adjust or perish.