A short tutorial on crawl rate optimization to improve your website's performance.
Crawl rate is the number of requests per second that bots make while crawling your web pages. Crawling can be very performance intensive, so crawl optimization should be a priority for any large website. If you own a content-oriented website that is updated frequently, it is a good idea to control how often bots crawl it by adding a Crawl-delay rule to your robots.txt file. Crawl-delay asks a bot to wait between requests rather than crawling your pages in rapid succession. This reduces the crawl load on your server and can make your website faster for real visitors.
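As a sketch, a Crawl-delay rule in robots.txt might look like the following (the delay values and the bingbot entry are illustrative; support for and interpretation of Crawl-delay vary by search engine, so check each crawler's documentation):

```
# Ask compliant crawlers to wait 10 seconds between requests
User-agent: *
Crawl-delay: 10

# A stricter delay for one specific bot (bingbot used here only as an example)
User-agent: bingbot
Crawl-delay: 20
```

Place the file at the root of your site (e.g. https://example.com/robots.txt) so crawlers can find it.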
This short tutorial explains how to set a crawl delay for search engine bots using the robots.txt file and Google Webmaster Tools.
How to Set Crawl Delay For Bots / Search Engines Using Robots.txt File?
How to Set Crawl Delay in Google Webmaster Tools?