If you have read anything about Google and SEO, you have probably come across robots.txt. Most websites have a robots.txt file, but that does not mean most webmasters understand it. In this article we will explain what this file is and how to use it to optimize your SEO.
What Is A WordPress Robots.txt?
In a nutshell, and as the name suggests, robots.txt is a plain text file. It tells search engine robots how your site is structured and, consequently, the best way to crawl and index its pages.
But what are these robots? They are automated bots, such as search engine crawlers, that visit websites across the internet. These robots scour the web to help search engines like Google index and rank the sites they find.
A minimal robots.txt example:

```
User-agent: *
Disallow:
```
How To Create Your WordPress Robots.txt File
You can check your robots.txt by going to http://yoursite.com/robots.txt.
In the case of our website, the file Robots.txt is the following:
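The exact contents differ from site to site. As an illustration, a typical WordPress robots.txt might look like the following; the rules below are common WordPress defaults, and the sitemap URL is a made-up example, not taken from any particular site:

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://yoursite.com/sitemap_index.xml
```

The `Disallow` line keeps crawlers out of the admin area, while the `Allow` line leaves `admin-ajax.php` reachable, since some front-end features depend on it.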
Using Yoast SEO
If you are using the Yoast SEO plugin, you can create and edit the robots.txt file directly from the Yoast interface. Go to SEO → Tools → File editor:
If you don't have a robots.txt file yet, Yoast SEO gives you the option to create it.
It is as easy as clicking that button, and you can then edit the content of the robots.txt file directly from the Yoast SEO interface.
Using FTP
To create or edit the robots.txt file via FTP, you will need an FTP client. You can also use your hosting's cPanel file editor, but that is not recommended.
In this case we are going to use FileZilla.
To edit the robots.txt file, open the FTP client and go to the root directory of your website.
If you have not yet created the robots.txt file, you can create it in the root directory of your website.
Submit The WordPress Robots.txt File To Google
Once you have created the robots.txt file, you still have to check that it is configured correctly.
To check it, go to Google Search Console. Under "Crawl" you will find the robots.txt Tester; click on it. You should see a green Allowed if everything can be crawled.
It is also a good idea to test the URLs you have blocked, to make sure they are actually disallowed.
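If you prefer to check your rules locally, Python's standard library ships a robots.txt parser. Here is a small sketch; the rules and URLs below are made-up examples, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# Example rules: block the admin area for all crawlers (illustrative only)
RULES = """\
User-agent: *
Disallow: /wp-admin/
"""

parser = RobotFileParser()
parser.parse(RULES.splitlines())

# A blocked path and an allowed path (hypothetical URLs)
print(parser.can_fetch("*", "https://yoursite.com/wp-admin/options.php"))  # False
print(parser.can_fetch("*", "https://yoursite.com/blog/hello-world/"))     # True
```

This mirrors what the tester in Google Search Console does: any path matching a `Disallow` rule for the given user agent is reported as blocked, and everything else is allowed.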
We have briefly explained how to add a robots.txt file and how to use it to optimize your SEO.
robots.txt is one of the most important parts of a site's search engine optimization. It gives you the option of keeping search engines away from content that you don't want to show.
On the other hand, misusing it can seriously damage your search engine ranking. Be careful and plan the file thoroughly: although it is a good tool for SEO, it is also easy to get wrong.