BAVOKO SEO Tools allows you to make changes and optimizations to the robots.txt file of your website in a dedicated editor. If you make an error within this file, you may, for example, accidentally exclude certain areas of your site from the index. We therefore recommend that you only edit your robots.txt if you know exactly what you are doing. Besides, it is usually not necessary to make changes to the robots.txt on WordPress websites.
BAVOKO SEO Tools also creates an automatic backup of the robots.txt file, which allows you to restore it at any time.
What is the robots.txt?
The robots.txt (Robots Exclusion Standard) is a text file that contains precise instructions for search engine crawlers about which of your pages may be crawled and which may not. Crawlers therefore look for the robots.txt file first when they visit a site, and then apply the specified rules while crawling its pages.
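A minimal robots.txt illustrates how these instructions look in practice. The directives below match the WordPress default; the paths are examples, not a recommendation for your specific site:

```
# Rules apply to all crawlers
User-agent: *
# Block the admin area from crawling ...
Disallow: /wp-admin/
# ... but keep admin-ajax.php accessible, as themes and plugins use it on the front end
Allow: /wp-admin/admin-ajax.php
```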
Editing the robots.txt in WordPress
The editor for the .htaccess and robots.txt files must first be activated in the BAVOKO SEO Tools settings before you can start editing.
- To do this, navigate to the “Settings” of BAVOKO SEO Tools.
- In the “General” tab, check the box “Unlock Editor”.
To start optimizing the robots.txt file after activating the editor, proceed as follows:
- Within the plugin, navigate to “Tools” > “.htaccess & robots.txt”.
ATTENTION: Excluding pages from indexing via robots.txt does not mean that your files are protected from access. Most bots stick to the standard, but since it is not mandatory, there are always exceptions. For sensitive data, we recommend that you protect these areas with the .htaccess file instead.
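As a sketch of what such .htaccess protection can look like: the fragment below enables HTTP basic authentication on an Apache server. It assumes a password file already created with the `htpasswd` utility; the file path is an example and must be adjusted to your server:

```
# Require a username/password for this directory (Apache basic auth)
AuthType Basic
AuthName "Restricted Area"
# Example path - point this at your actual .htpasswd file
AuthUserFile /etc/apache2/.htpasswd
Require valid-user
```

Placed in a directory's .htaccess file, this blocks all access without valid credentials, regardless of whether a bot respects robots.txt.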
Although it seems plausible that search engine crawlers will discover the sitemap of your website when you link it in the robots.txt, you should not rely on this. Be sure to submit your WordPress sitemap yourself in the Google Search Console, so that you can manage it properly and fix any problems.
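For reference, a sitemap is linked in the robots.txt with a single `Sitemap` line; the domain below is a placeholder for your own:

```
# Example - replace with the URL of your own sitemap
Sitemap: https://www.example.com/sitemap.xml
```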
Automatic Backup System for .htaccess & robots.txt
BAVOKO SEO Tools has its own backup system that automatically creates a backup copy every time changes are made. This matters because an incorrect change, especially within the .htaccess file, can cause serious errors on your website.
In the event that errors occur on your website after a change to the .htaccess or robots.txt, you can download the automatically generated backups of the two files via FTP from the following folder and simply copy them to the main directory of your WordPress installation: /wp-content/plugins/wp-seo-keyword-optimizer/backups/
BAVOKO SEO Tools: WordPress SEO Plugin with robots.txt editor
If you want to edit the robots.txt file as part of your WordPress SEO work, for example to prevent search engine crawlers from crawling certain URLs of your website, BAVOKO SEO Tools offers you an uncomplicated way to do this in a simple editor, safeguarded by automatic backups.