
Mandatory JavaScript: How Google is making life difficult for SEO tools and LLMs

Since January 15, Google has introduced a new requirement intended to limit scraping of its SERPs: JavaScript must be enabled to access its search engine. While this decision was presented as an improvement in services for users, its consequences go far beyond that, directly affecting SEO tools and LLMs such as those from OpenAI.

What to remember:

  • Google wants to make JavaScript activation mandatory for accessing its search engine, impacting SEO tools and LLMs.
  • This measure aims to strengthen protection against scraping, spam and abuse.
  • SEO tools must adapt, particularly via headless browsers, but this could increase costs and reduce speed.
  • The French SEO community quickly found solutions thanks to mutual assistance.

Google's stated objectives

According to a Google spokesperson interviewed by TechCrunch, the new requirement to enable JavaScript to access its search engine is motivated by the desire to:

  • Protect its services against abuse, such as intensive scraping and spam.
  • Offer more personalized and up-to-date search results.

JavaScript makes it possible to limit abusive requests thanks to techniques such as rate-limiting, and strengthens control mechanisms against bots. However, this measure affects a significant volume of users: approximately 8.5 billion searches per day are currently done without JavaScript, according to this spokesperson!
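As an illustration only (Google's actual anti-abuse mechanisms are not public), here is a minimal sketch of the rate-limiting idea in Python, using a token bucket whose capacity and refill rate are arbitrary values chosen for the example:

    import time

    class TokenBucket:
        # Minimal token-bucket rate limiter (illustrative only).
        def __init__(self, capacity=10, refill_per_sec=1.0):
            self.capacity = capacity              # maximum burst size (arbitrary value)
            self.refill_per_sec = refill_per_sec  # tokens restored per second (arbitrary value)
            self.tokens = float(capacity)
            self.last = time.monotonic()

        def allow(self):
            now = time.monotonic()
            # Refill in proportion to elapsed time, capped at capacity.
            self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.refill_per_sec)
            self.last = now
            if self.tokens >= 1:
                self.tokens -= 1
                return True
            return False  # the request would be throttled or rejected

    # A client sending a burst of requests quickly exhausts its budget.
    bucket = TokenBucket()
    results = [bucket.allow() for _ in range(15)]
    print(results.count(True), "allowed,", results.count(False), "throttled")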

SEO tools in turmoil

SEO tools, many of which rely on scraping results pages to collect essential data, were directly impacted by this change by Google, which blocked them entirely or in part. We had already reported last week on this “failure of SEO tools”, when the impact began to be felt, following Vincent Terrasi's post.

Adrian Delmarre, founder of the tool Haloscan, reported to us: “The changes on Thursday/Friday ultimately impacted us quite moderately. We lost around 50% of our scraping capacity on Thursday (I should point out that this in no way affected the operation of Haloscan, which remained 100% available and usable the whole time). We returned to full strength late Friday afternoon, at no extra cost, in particular thanks to a tip shared by Paul Sanches.”

Thanks to the responsiveness and sharing within the French SEO community, and hats off to Paul for finding the ideal solution, which helped many tools.

“Google changed something. It happens often, but in general we manage it without it being noticed,” explains Adrian, while emphasizing that there is no need to be alarmist. “This is not the end of SEO. There will be other changes. We will adapt. That's the principle of SEO, right?”

Although Google is communicating about mandatory JavaScript execution to access its SERPs, according to Adrian it is not “yet” necessary to enable JavaScript, and therefore to emulate a browser, for all calls. Perhaps this will be the case soon.


A measure primarily directed against LLMs?

For several experts, this decision by Google seems primarily to target language models like those developed by OpenAI or Perplexity. In a webinar organized on January 21 by Olivier Duffez, Fabien Faceries (MyRankingMetrics) and Vincent Terrasi (Draft'n'Goal), the experts explained that LLMs take the best content from Google and reuse it to answer user queries. This poses a double problem: on the one hand, Google loses control over this data and does not directly benefit from its exploitation; on the other, these tools make some users less dependent on Google.

This interpretation is shared by Adrian Delmarre: “Maybe Google is not targeting SEO tools, but rather its competitors, like LLMs. If so, that's good news for us, because these models adapt quickly and show Google that blocking them doesn't do any good.”

Consequences and solutions for professionals

These Google changes affect not only scrapers, but also tools for semantic optimization, rank tracking, AI-assisted writing and cannibalization analysis.

The solutions envisaged to adapt to this new reality, however, have limits:

  • Rendering JavaScript. This approach requires fully rendering pages, typically with a headless browser, which significantly increases technical costs for the tools (see the sketch after this list).
  • Reinforcement of constraints by Google. The potential addition of traps or new anti-scraping rules could make these solutions even more complex and increasingly difficult to circumvent.
  • Reduced execution speed. Vincent Terrasi notes that, as things stand, the tools should not see their costs increase in the short term. However, their real-time speed could be compromised, although this is not necessarily critical for SEO, where immediacy is not essential.
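To make the first point concrete, here is a minimal sketch of what “full rendering” of a JavaScript-dependent page involves, written with the Playwright library in Python. It is purely illustrative: the URL is hypothetical and none of the tools mentioned in this article necessarily work this way. Launching a complete headless browser for every fetch is precisely what drives up cost and latency compared with plain HTTP requests.

    # Requires: pip install playwright, then: playwright install chromium
    from playwright.sync_api import sync_playwright

    def fetch_rendered_html(url):
        # Illustrative sketch of headless rendering, not any specific tool's implementation.
        with sync_playwright() as p:
            browser = p.chromium.launch(headless=True)   # a full browser instance per fetch: the main cost driver
            page = browser.new_page()
            page.goto(url, wait_until="networkidle")     # wait for JavaScript-driven content to finish loading
            html = page.content()                        # DOM serialized after JavaScript execution
            browser.close()
            return html

    # Hypothetical usage on a page whose results only appear once JavaScript has run:
    # print(fetch_rendered_html("https://example.com/search?q=seo"))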

A supportive SEO community

The SEO community's response to this upheaval has highlighted a remarkable capacity for adaptation and collaboration (despite the SEO dramas!). The tip shared by Paul Sanches, which allowed several tools to quickly bypass the JavaScript constraint, is a striking, heart-warming example of this solidarity.

Thanks also to Olivier Duffez and Fabien Faceries for organizing a webinar so quickly to take stock of the situation, the context of the outage and the solutions found by the SEO tools!

Vincent Terrasi invites SEOs to develop plans B, C, and so on, so that they can cross-reference data without relying on a single tool and avoid being left without metrics in the event of a service outage. Lose a single tool, and the whole world feels empty!
