▷ How to live without Googlebot? A practical case

What if Googlebot suddenly went missing? Could you imagine living without it? Probably not, because it guarantees the discovery, crawling and correct indexing of websites. And yet, SEO professionals have recently had to resign themselves to suffering periodic bugs. Chronicle of a nightmare that, in the space of a moment, became a terrifying reality…

What does Googlebot “see”?

On Reddit, questions – and answers! – come fast. One user wondered exactly what information Googlebot perceives while crawling:

Does Googlebot’s screenshot give a complete image of what Google can see? (…) How can I know what Google sees in my article? … I want to know what Googlebot sees on my website.

John Mueller, from Google, replied:

In most cases, yes [Googlebot’s screenshot gives a complete picture of what Google can see]. But there are limits and temporal anomalies. Tell us more about what you are trying to check.

In other words, Googlebot’s screenshot is a faithful reproduction of what Google sees while crawling your pages. It uses images, JavaScript and CSS to faithfully render web pages.

However, Mueller cautiously mentioned temporal anomalies.

Temporal anomalies

Temporal anomalies are the irregularities or discrepancies between the actual state of a crawled page and the resources Googlebot downloaded at a given moment. In such cases, Googlebot’s screenshot does not faithfully reflect the crawled data. Several scenarios lead to temporal anomalies.

The causes of temporal anomalies

If you use outdated timestamps to cache your data, Googlebot makes errors.
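
To make this concrete, freshness timestamps should come from the resource itself rather than being hard-coded. A minimal Python sketch, in which the temporary file merely stands in for a page resource:

```python
import os
import tempfile
from email.utils import formatdate

# A throwaway file standing in for a page resource (illustrative only)
with tempfile.NamedTemporaryFile(delete=False, suffix=".html") as f:
    f.write(b"<html><body>Hello</body></html>")
    path = f.name

# Derive Last-Modified from the file's real modification time,
# formatted as an RFC 7231 HTTP-date, instead of a stale hard-coded value
last_modified = formatdate(os.path.getmtime(path), usegmt=True)
print("Last-Modified:", last_modified)
```

A header generated this way can never drift out of sync with the content it describes, which is exactly what stale hand-maintained timestamps fail to guarantee.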

Sometimes it’s not you but the server that is the culprit. It only has to deliver data to Googlebot with a delay for anomalies to appear.

If the server clock is incorrectly configured, Googlebot gets confused and loops: it perceives the crawl schedule only hazily.

Finally, if your HTTP headers are misconfigured, Googlebot is thrown off. It can no longer tell clearly whether your content has already been crawled recently.
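
As an illustration, here is a minimal, hypothetical sketch (names of my choosing, not a standard API) of how a server can answer a conditional GET correctly: compare the crawler’s If-Modified-Since header against the resource’s real modification time and return 304 Not Modified when nothing has changed.

```python
from email.utils import formatdate, parsedate_to_datetime


def respond(if_modified_since, resource_mtime):
    """Answer a conditional GET: 304 if the resource is unchanged.

    A sketch only: real servers also handle If-None-Match / ETag.
    """
    last_modified = formatdate(resource_mtime, usegmt=True)
    headers = {
        "Last-Modified": last_modified,
        "Cache-Control": "max-age=3600",
    }
    if if_modified_since:
        since = parsedate_to_datetime(if_modified_since).timestamp()
        if resource_mtime <= since:
            # Nothing changed since the crawler last saw it
            return 304, headers
    return 200, headers
```

Answering 304 instead of re-sending the full body is precisely what lets Googlebot trust your timestamps and spend its crawl budget elsewhere.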

The effects of temporal anomalies

In the wake of temporal anomalies, Googlebot can overload the server with repeated requests. In addition, your pages may be indexed late. Finally, your content appears online when it is already obsolete. So:

  • Wasted server resources
  • Content obsolescence
  • The risk of losing positions in the SERPs

are immediate consequences of temporal anomalies. Put end to end, these elements send your entire SEO ecosystem into a tailspin.

Fortunately, Google offers some useful tips for managing this situation without descending into drama.

What strategy to adopt?

Choose a server with optimal availability to prevent latency. Likewise, your hosting provider must deliver the optimization needed to facilitate crawling.

Take care to configure your servers and HTTP headers appropriately. Likewise, optimize your site technically and constantly monitor for the appearance of anomalies.

Finally, keep backups of your successive updates. That’s called being proactive.

Recap of good technical SEO practices to prevent Googlebot’s temporal anomalies

  • Accurate HTTP Cache-Control, Last-Modified and ETag headers
  • A server available 24/7
  • A server clock synchronized with a robust NTP (Network Time Protocol) server
  • Constant monitoring of exploration statistics
  • Regular audits
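
Of these, ETags are the easiest to get wrong. One simple approach is to derive a strong ETag from the content itself, as in this sketch (the `make_etag` helper is illustrative, not a standard API):

```python
import hashlib


def make_etag(body: bytes) -> str:
    # Strong ETag derived from the response body: it changes whenever
    # the content changes, so If-None-Match validation stays accurate
    return '"' + hashlib.sha256(body).hexdigest()[:16] + '"'


print(make_etag(b"<html>v1</html>"))
```

Because the tag is a pure function of the bytes served, it can never claim freshness for content that has actually changed.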

Some useful SEO tools for tracking down Googlebot’s temporal anomalies

Screaming Frog SEO Spider detects crawl errors caused by the server. Log analyzers such as the ELK stack are also effective.
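
If you prefer a quick script to a full log pipeline, a few lines of Python can already flag Googlebot requests that ended in server errors. The log line and regex below assume the combined log format and are purely illustrative:

```python
import re

# An illustrative access-log line in combined log format
LOG_LINE = ('66.249.66.1 - - [10/May/2024:13:55:36 +0000] '
            '"GET /page HTTP/1.1" 200 5120 "-" '
            '"Mozilla/5.0 (compatible; Googlebot/2.1; '
            '+http://www.google.com/bot.html)"')

# Extract the status code and user agent, then flag Googlebot
# requests that ended in a server error (5xx)
pattern = re.compile(r'" (\d{3}) \d+ "[^"]*" "([^"]*)"')
match = pattern.search(LOG_LINE)
status, agent = int(match.group(1)), match.group(2)
is_googlebot_error = "Googlebot" in agent and status >= 500
```

Run over a real access log, the same test would surface the repeated-request storms described above.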

And don’t forget the good old Google Search Console. An all-rounder, it provides precise crawl statistics and flags the problems to be resolved.

The equation of the day

Googlebot’s temporal anomalies = loss of visibility = drop in the SERPs

To be watched like milk 🍶 on the stove 🔥🔥.
