How to Set a Crawl Delay for Bots / Search Engines Using the Robots.txt File

Setting Crawl Delay in robots.txt

Learn how to set a crawl delay for bots or search engines using the robots.txt file. Follow the steps below to limit how often search engine bots request your web pages.
Step 1: Log in to cPanel
Step 2: Go to → File Manager
Step 3: Go to → public_html
Step 4: Select → robots.txt → Right Click → Edit. If you don't have one, create a plain text file named robots.txt and upload it.
Step 5: Add the following rules to set a crawl delay (in seconds) for search engine bots.
User-agent: bingbot
Crawl-delay: 60

User-agent: msnbot
Crawl-delay: 60

User-agent: msnbot-UDiscovery/2.0b
Crawl-delay: 60

User-agent: *
Crawl-delay: 60

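As a quick sanity check, the directives above can be parsed with Python's standard-library `urllib.robotparser`, which understands `Crawl-delay`. This is a minimal sketch; the bot name `SomeOtherBot` is just a made-up user agent to show that unlisted bots fall through to the `*` rule.

```python
from urllib.robotparser import RobotFileParser

# The same rules as in Step 5 above.
ROBOTS_TXT = """\
User-agent: bingbot
Crawl-delay: 60

User-agent: *
Crawl-delay: 60
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A listed bot gets its own rule; any other bot matches the "*" group.
print(rp.crawl_delay("bingbot"))       # delay for bingbot
print(rp.crawl_delay("SomeOtherBot"))  # falls back to the * rule
```

This only confirms the file is syntactically valid; whether a given crawler actually honors `Crawl-delay` is up to that crawler.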
Step 6: You can also block unwanted bots from crawling your website, as covered in the tutorials listed below.
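For example, a well-behaved bot can be blocked entirely with a `Disallow` rule in the same robots.txt file (AhrefsBot is used here only as an illustration; substitute the user agent you actually want to block):

```
User-agent: AhrefsBot
Disallow: /
```

Note that robots.txt is advisory: misbehaving bots may ignore it, in which case blocking at the server or firewall level is needed.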
Step 7: Google does not support the Crawl-delay directive; instead, you can limit Google's crawl rate in Google Search Console (formerly Google Webmaster Tools).
Step 8: If you still face load and traffic issues, consider moving to a content delivery network (CDN).
Step 9: Check your raw access logs or AWStats to see which bots are crawling your website and which ones affect performance.
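If you prefer to inspect the raw access log yourself, a short script can tally crawler hits by user agent. This is a rough sketch assuming the common Combined Log Format, where the user agent is the last quoted field; the sample lines below are fabricated for illustration.

```python
import re
from collections import Counter

# In Combined Log Format the user agent is the last double-quoted field.
UA_RE = re.compile(r'"([^"]*)"$')

def bot_hits(log_lines):
    """Count hits per user agent that looks like a crawler."""
    counts = Counter()
    for line in log_lines:
        m = UA_RE.search(line.strip())
        if not m:
            continue
        ua = m.group(1)
        # Crude heuristic: most crawlers identify as bot/crawler/spider.
        if any(word in ua.lower() for word in ("bot", "crawl", "spider")):
            counts[ua] += 1
    return counts

# Fabricated sample lines standing in for a real access log.
sample = [
    '1.2.3.4 - - [10/Oct/2023:13:55:36 +0000] "GET / HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"',
    '5.6.7.8 - - [10/Oct/2023:13:55:40 +0000] "GET /page HTTP/1.1" 200 1024 "-" '
    '"Mozilla/5.0 (Windows NT 10.0)"',
]
print(bot_hits(sample))
```

In practice you would read the lines from your raw access log file instead of the `sample` list.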