Search engines use "crawler robots" to index your web pages and their content. In some cases, the requests from these robots can overwhelm your website. Depending on how your website is built and optimised, and on the resources of the hosting plan you've purchased, a flood of requests could cause the website to temporarily exceed its available CPU/memory resources, or could slow the website down significantly for normal web visitors.

Many search engines, such as Bing and Yahoo!, support the Robots Exclusion Protocol (REP). As part of this protocol, crawler robots look for a file named robots.txt and follow any instructions within that file. Any website owner or web developer can easily create this file and place it in the main website directory (usually public_html). More detailed information about the robots.txt file can be found at The Web Robots Pages (robotstxt.org).

In the robots.txt you can add a simple rule to instruct all crawler robots to slow down (a Crawl-delay line must belong to a User-agent group, so the wildcard group is included):

User-agent: *
Crawl-delay: 1

Google Bot

The Googlebot ignores the "Crawl-delay" directive. Instead, it's recommended to register your website with Google Search Console, where you can adjust the crawl rate and other settings.

For more information, please read the official documentation: Change Googlebot crawl rate - Search Console Help

Bing Bot

You can instruct the Bing bot to crawl your website more slowly as follows:

User-agent: bingbot
Crawl-delay: 1

The Bing crawling robot accepts values of 1 (slow), 5 (very slow) and 10 (extremely slow).

Alternatively, a website owner can register for Bing Webmaster Tools and manage their website's crawl rate from there: Bing Webmaster Tools

Yandex

This search engine can crawl websites quite aggressively and is often responsible for causing website downtime. Thankfully, you can set a delay value in seconds, so the robot will pause for that long (for example 2, 4, 6 or 8 seconds) between requests. To enforce a 4-second pause:

User-agent: Yandex
Crawl-delay: 4

Most Yandex users are in Russia, so if your website does not have an audience in Russia, you could consider blocking the robot altogether, like this:

User-agent: Yandex
Disallow: /
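If you have Python installed, you can sanity-check a draft robots.txt before uploading it, using the standard-library urllib.robotparser module. This is an optional sketch, not something your hosting provider requires; the sample rules below simply combine the directives discussed above.

```python
# Sketch: verify how crawlers would read a draft robots.txt,
# using Python's standard-library robots.txt parser.
from urllib.robotparser import RobotFileParser

# Sample file combining the directives discussed above.
robots_txt = """\
User-agent: *
Crawl-delay: 1

User-agent: bingbot
Crawl-delay: 1

User-agent: Yandex
Crawl-delay: 4
"""

rp = RobotFileParser()
rp.modified()  # record a parse time so that queries are answered
rp.parse(robots_txt.splitlines())

print(rp.crawl_delay("bingbot"))    # delay from the bingbot group
print(rp.crawl_delay("Yandex"))     # delay from the Yandex group
print(rp.crawl_delay("Googlebot"))  # no matching group, so the wildcard (*) group applies
```

Note that this only checks how the file parses; as mentioned above, Googlebot ignores Crawl-delay regardless of what the file says.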

DISCLAIMER: The scripts provided in our knowledgebase are for informational purposes only. We do not provide any warranty or support. It is essential to review and modify the scripts to fit your site's specific needs. There may be unforeseen outcomes when adding new script code to an existing website. You should discuss it with your website manager and seek advice from an experienced website developer if you are unsure.

Updated by SP on 23/11/2022
