How to block unwanted bots with a robots.txt file

A short tutorial on how to block unwanted bots with a robots.txt file.

Block Bots in the Robots.txt File

In this article, we'll discuss how you can block unwanted bots from accessing your website via the robots.txt file. Follow the steps below.
Whenever a search engine crawls a website, it requests the robots.txt file first and then follows the rules within.
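For example, a crawler visiting example.com fetches https://example.com/robots.txt before requesting any other page; the file must sit in the site's root directory to be found.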
Step 1: Open the robots.txt file.
Step 2: If you are using cPanel, find the file under public_html.
Log in to cPanel → Files
public_html → robots.txt
Step 3: If you are using Plesk, find the file under httpdocs.
Log in to Plesk → Files

Go to httpdocs → robots.txt
If you don't have one, create robots.txt as a plain text file and upload it under public_html or httpdocs.
Step 4: By default, search engines should be able to crawl your website using the following rule.
User-agent: *
Disallow:

The User-agent: line specifies which user agent the rules apply to, and * is a wildcard matching any user agent. The Disallow: line sets the files or folders that are not allowed to be crawled; leaving it empty allows everything to be crawled.
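For instance, here is a minimal sketch of a rule that keeps all crawlers out of a directory (the /private/ path is just an example):

User-agent: *
Disallow: /private/

Compliant crawlers would then skip any URL whose path starts with /private/.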
Step 5: Use the following rule to block unwanted bots from crawling your website. This example would block a bot identifying itself as spambot from accessing any part of your site.
User-agent: spambot
Disallow: /
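To block several bots at once, repeat the pattern with one group per bot. A sketch, where BadBot and EvilScraper are placeholders for the user agents you actually want to block:

User-agent: BadBot
Disallow: /

User-agent: EvilScraper
Disallow: /

Keep in mind that robots.txt is advisory: well-behaved crawlers honor it, but malicious bots may simply ignore it.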
* Always make sure you edit and upload your files using FTP or SFTP, not the file manager.

