How to create and configure robots.txt files in WordPress – Complete Guide

For your website to rank well in search engine results pages (SERPs), you need search engine bots to crawl its most important pages. A well-structured WordPress robots.txt file helps direct those bots to the pages you want them to index.

In this article, you will learn about the following topics:

  1. What a robots.txt file is and why it matters.
  2. Where the WordPress robots.txt file is located.
  3. How to create a robots.txt file.
  4. Which rules to include in your WordPress robots.txt file.
  5. How to test your robots.txt file and submit it to Google Search Console.

By the end of this guide, you will have everything you need to create and configure a robots.txt file for your WordPress site, and with it, improve your chances of earning top placements in search results.

Let's get to work!

What is a WordPress robots.txt file (and why do you need one)?

An example of a WordPress robots.txt file.
The default WordPress robots.txt file is quite basic, but you can easily replace it with your own rules.

When you create a new website, search engines send crawler bots to explore it and map out all the pages it contains.

That way, they know which pages to show in search results when someone searches for keywords related to your site's topic. It's that simple.

The problem is that most modern websites contain more elements than just pages.

WordPress, for example, lets you install plugins, which often have their own directories. You do not want those showing up in search results, since they are not exactly relevant content.

What the robots.txt file does is provide a set of instructions for search engine bots. It essentially says: "Hey, you can look at these pages here, but do not go to those pages over there."

This file can be as detailed as you want it to be, and it is easy to create even if you do not have much technical knowledge.
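
For example, a tiny set of instructions along those lines might look like the sketch below. The /blog/ and /private/ directories are purely hypothetical placeholders, not a recommendation:

User-agent: *
Allow: /blog/
Disallow: /private/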

In practice, search engines will crawl your website even if you do not have a robots.txt file in place. However, not creating one can work against your site.

Without it, you are allowing bots to index the entire contents of your website. They are so thorough that they may end up surfacing parts of your site you do not want people to access.

More importantly, without a robots.txt file you will have lots of bots crawling all over your site. That can hurt its loading speed and put extra strain on your server.

Even if the impact seems insignificant, page load time should always be on your list of priorities. After all, a slow site is the kind of thing that makes people give up and never visit again.

Where is the WordPress robots.txt file located?

When you create a WordPress site, it automatically sets up a virtual robots.txt file located in your domain's main folder.

For example, if your site lives at the domain seusitefalso.com.br, you can visit the address seusitefalso.com.br/robots.txt and you will see a file like this appear:

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/

This is an example of a very basic robots.txt file. To put it plainly, the part right after User-agent: declares which bots the rules below it apply to.

An asterisk means the rules are universal and apply to all bots. In this specific case, the robots.txt file tells bots that they may not enter the wp-admin and wp-includes directories.

That makes a lot of sense, because those two directories contain many sensitive files, and it is not a good idea to let anyone other than the site owner access them.

However, you may want to add more rules to a robots.txt file of your own. Before doing that, you need to understand that the default file is a virtual one.

Typically, the WordPress robots.txt file lives in your site's root directory, which is often called public_html or www (or is named after your website).

The root folder of a WordPress site

The catch is that the robots.txt file WordPress sets up by default is not actually accessible from any directory. It works, but if you want to make changes to it, you need to create your own file and upload it to your root folder as a replacement.

We will show you several ways to create a new WordPress robots.txt file in a minute. It is quick and easy. For now, let's focus on determining which rules it should include.

Which rules should you include in your WordPress robots.txt file?

In the last section, you saw an example of a WordPress robots.txt file generated by the CMS itself. It contained only two very short rules.

Most sites today need more than that, though. Let's take a look at two robots.txt files and see how they differ from each other.

Here is our first example of a WordPress robots.txt file:

User-agent: *
Allow: /
# Disallowed Sub-Directories
Disallow: /checkout/
Disallow: /images/
Disallow: /forum/

This is a generic robots.txt file for a website with a forum. Search engines often index every topic within a forum, but depending on what your forum is like, you may not want to allow that.

That way, Google will not index hundreds of low-value threads. Instead, you can set rules so Google's bots index only the most important topics and stay away from the rest.

You may also have noticed the Allow: / line right at the start of the file. It tells bots they may crawl every page of the site, except for the ones specified in the rules below it.

Also, notice that we set these rules as universal (with an asterisk), just like the virtual WordPress robots.txt file does.

Now let's look at another example of a WordPress robots.txt file:

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
User-agent: Bingbot
Disallow: /

In this file, we include the same rules WordPress applies by default. However, we also add a new set of rules that blocks Bing's search bot, Bingbot, from crawling the site.

You can be very specific about which search engine bots get access to your site. In practice, of course, Bingbot is harmless (even if it is not as cool as Googlebot). Still, be warned that there are malicious bots out there.

The bad news is that bots do not always follow the rules or instructions you put in a robots.txt file (they can be a bit rebellious, so to speak).

Keep in mind that most bots will follow the instructions you set in a robots.txt file, but none of them are forced to do so. You are only asking them politely to respect your rules.

If you dig deeper into the subject, you will find plenty of suggestions about what to allow and what to block on your WordPress site.

However, in our experience, a few rules tend to work best. Below is an example of what to include in your first WordPress robots.txt file:

User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/

Traditionally, WordPress sites have liked to block access to the wp-admin and wp-includes directories. However, that is no longer considered good practice.

Also, if you add metadata to your images for SEO (Search Engine Optimization) purposes, it makes little sense to stop bots from finding that information.

Instead, the two rules above cover the basics most sites need in order to be found by search engines.
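
If you also want per-bot rules like the Bingbot example shown earlier, rule groups for different bots can simply sit in the same file. The rough sketch below only illustrates how such a combination is laid out; whether to block any particular bot is entirely up to you:

User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
User-agent: Bingbot
Disallow: /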

Creating a WordPress robots.txt file (3 methods)

Once you decide what to include in your robots.txt file, all that remains is learning how to create it in WordPress. You can do this with a dedicated plugin or manually.

Don't worry, it is not difficult. Below, we walk you step by step through using two of the most popular plugins to do the job for you, and we also teach you how to do it on your own, without relying on any tools. Let's go!

1. Use the Yoast SEO plugin

The Yoast SEO plugin for editing robots.txt files in WordPress

Yoast SEO hardly needs an introduction. It is the most popular SEO plugin for WordPress, and it lets you optimize your pages and posts to make better use of your chosen keywords.

Yoast SEO also helps you optimize your content for readability, so your readers can follow along more easily and get the most out of the topics you cover.

Personally, we are fans of Yoast SEO because of its ease of use, and that extends to the process of creating robots.txt files.

Once you install and activate the Yoast SEO plugin, navigate to SEO > Tools in your dashboard and look for the File editor option:

The file editor option in the Yoast SEO Tools section

Click it and you will be taken to a new page where you can edit your .htaccess file without leaving your WordPress dashboard. There is also a very handy button called Create robots.txt file, which does exactly what you would expect.

The button to create a robots.txt file in the Yoast SEO plugin

When you click this button, a new editor opens where you can modify your robots.txt file directly. Keep in mind that Yoast SEO applies its own default rules, which override your existing virtual robots.txt file.

Whenever you add or remove rules, remember to click Save changes to robots.txt so they take effect.

The button to save robots.txt changes in the Yoast SEO plugin

It’s that simple! Now let’s see how another popular plugin does exactly the same thing.

2. Use the All in One SEO Pack plugin

The All in One SEO Pack plugin for editing and configuring robots.txt files in WordPress

All in One SEO Pack is another big name in WordPress SEO. It offers most of the features Yoast SEO has, but some people prefer it because it is a more lightweight plugin.

When it comes to creating a WordPress robots.txt file, this plugin does exactly what Yoast SEO does, without losing any of the simplicity.

Once you have installed the plugin, navigate to All in One SEO > Feature Manager in your WordPress dashboard. There you will find an option called Robots.txt with an Activate button beneath it. Click the button.

The feature manager in the All in One SEO Pack plugin

A new Robots.txt tab will now appear under the All in One SEO menu. Click it and you will see options to add new rules to your file, save the changes you make, or delete them if you prefer.

The Robots.txt editing option in the All in One SEO Pack plugin

Just note that you cannot edit the robots.txt file directly with this plugin. In that respect it is a little different from Yoast SEO, which lets you write and edit whatever you want in the file.

Editing the robots.txt file in the All in One SEO Pack plugin

Still, adding a new rule is simple, so do not let that small detail put you off. More importantly, All in One SEO Pack also includes a feature to block malicious bots, which you can access from the All in One SEO tab.

Blocking malicious bots with the All in One SEO Pack plugin in WordPress

That's all you need to do if you choose this method. Now it's time to learn how to create a WordPress robots.txt file manually.

This option is great if you would rather not use a plugin to handle the process for you.

3. Create your robots.txt file and upload it to WordPress via FTP

Creating a txt file is extremely easy. All you need to do is open your favorite text editor (such as Notepad or TextEdit) and write a few lines.

After that, save the file as robots.txt, making sure it uses the txt file type. It literally takes seconds, so it is easy to see why you might want to create a WordPress robots.txt file without a plugin.

Here is a quick example of what a manually created file looks like:

An example robots.txt file in a text editor
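
For reference, here is a minimal sketch of what that manually created file could contain, based on the rules recommended earlier in this guide:

User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/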

For the purposes of this tutorial, we saved the text file to our computer. Once your file is created and saved, you need to connect to your site via FTP.

If you are not sure how to do that, we have a guide on configuring the FileZilla client that can help.

Once you connect to your site, navigate to the public_html folder. All you need to do now is upload the robots.txt file from your computer to your server.

You can do this in two ways: by right-clicking the file in your FTP client's local file browser, or by simply dragging the file over. Like this:

Uploading the robots.txt file via FTP

The upload should only take a few seconds to complete. As you can see, this method is almost as simple as using a plugin.
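
If you prefer a scripted upload over a GUI client, here is a minimal sketch using Python's built-in ftplib module. The FTP host, credentials, and remote folder are placeholders to replace with the details from your hosting account (and note that many hosts require FTPS or SFTP rather than plain FTP):

# Minimal sketch: upload a local robots.txt to the site's root folder over plain FTP.
# The host, credentials, and remote directory below are placeholders.
from ftplib import FTP

FTP_HOST = "ftp.example.com"    # placeholder: your host's FTP address
FTP_USER = "your-username"      # placeholder
FTP_PASSWORD = "your-password"  # placeholder

with FTP(FTP_HOST) as ftp:
    ftp.login(user=FTP_USER, passwd=FTP_PASSWORD)
    ftp.cwd("public_html")      # the WordPress root folder on many hosts
    with open("robots.txt", "rb") as local_file:
        # STOR writes the file into the current remote directory
        ftp.storbinary("STOR robots.txt", local_file)
    print("robots.txt uploaded")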

How to test your WordPress robots.txt file and submit it to Google Search Console

Once you have created your WordPress robots.txt file and uploaded it to your site, you can use Google Search Console to test it and catch possible errors.

Google Search Console is a collection of tools that help you monitor how your content appears in search engine results.

One of those tools is the robots.txt checker, which you can use by logging in to your console and navigating to the robots.txt Tester tab.

The robots.txt Tester tab in Google Search Console

Inside, you will find an edit field where you can add the code from your WordPress robots.txt file. Then click the Submit button, as shown below.

Google Search Console will ask whether you want to use that new code or pull the file from your website. Just click Ask Google to Update to submit it manually.

Submitting the robots.txt code manually in Google Search Console

The platform will then check your file for errors and, if it finds any, tell you exactly what they are.

Since you have seen several examples of WordPress robots.txt files in this article, though, chances are your code is free of problems.
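
If you would like an extra sanity check outside Search Console, you can also test your live rules with Python's standard urllib.robotparser module. The domain and sample paths below are placeholders drawn from this guide's examples:

# Minimal sketch: ask which paths your live robots.txt allows a generic crawler to fetch.
# The domain and test paths are placeholders; swap in your own site and URLs.
from urllib import robotparser

parser = robotparser.RobotFileParser()
parser.set_url("https://seusitefalso.com.br/robots.txt")  # placeholder domain used earlier
parser.read()  # downloads and parses the live robots.txt file

for path in ["/", "/wp-content/uploads/photo.jpg", "/wp-content/plugins/some-plugin/"]:
    allowed = parser.can_fetch("*", path)  # "*" stands for any user agent
    print(path, "->", "allowed" if allowed else "blocked")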

You have learned how to work with and set up WordPress robots.txt files

Conclusion

Who doesn't want to appear and be found on the internet? To improve your website's visibility on the web, you need to make sure search engine bots can find its most relevant information.

As we have seen in this article, a well-configured WordPress robots.txt file lets you determine exactly how those bots interact with your site. That way, you can make sure visitors are directed to your most useful and relevant content.

Do you have any questions about using, editing, or configuring robots.txt in WordPress? We are here to help! Let us know in the comments box below!