How to Configure Robots.txt on WordPress in 2 Minutes for SEO!


This post may contain affiliate links. Read our policies.

In one of my discussions, somebody raised a query about the Robots.txt file we use on WordPress for proper SEO. When you're working on SEO for your blogging website, Robots.txt stands out as a powerful tool for higher rankings.

Robots.txt guides search engine crawlers through your WordPress site so they spend their time on the content that earns you organic traffic. Configuring Robots.txt on WordPress properly helps search engines index the pages that matter for your target keywords. So, I'll show you the exact steps to create a valid Robots.txt on WordPress following SEO best practices.

What is a Robots.txt File?


Robots.txt is a plain text file that controls how search engine crawlers behave on your blogging website. It tells search engines which parts of your site they may crawl and which parts to skip, so the right content can surface in SERP results.

You can create it with any simple text editor (Notepad on Windows) and store it in the root directory, which is the main folder of your WordPress website on your web server. Once it's there, anyone can view it at https://yourdomain.com/robots.txt.

In general terms, it contains simple text directives telling crawlers what to include and what to exclude while crawling your site for the search engine's index.

The key elements of a Robots.txt file are:

Sitemap: (Your XML Sitemap URL)

User-agent: (User Agent Name)

Disallow: (URL Directory Not to be Crawled)

Allow: (URL Directory to be Crawled)
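
Put together, a small file using these elements might look like the sketch below. The directory names here (/private-directory/ and so on) are placeholders for illustration, not recommendations:

# Rules for all crawlers
User-agent: *
Disallow: /private-directory/
Allow: /private-directory/public-page/

# Extra rule for one specific crawler
User-agent: Bingbot
Disallow: /drafts/

Sitemap: https://example.com/sitemap_index.xml

Each User-agent line starts a new group of rules for that crawler.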

The idea is simple: you write one directive per line, and you can group multiple directives in your Robots.txt file, as the sketch above shows. Now, let's take a look at what an ideal Robots.txt file looks like and why you need it for SEO and organic traffic.

Perfect Robots.txt File Format


I've analyzed some of the popular blogs on the Internet, and they all use a similar pattern of directives in their Robots.txt files. It's simple to create and update the file in the root folder of your WordPress server.

For your ease, I've created a template for your Robots.txt file. You can copy and paste this exact file using either of the following methods. I recommend using an SEO plugin to manage your Robots.txt on WordPress, as accessing it directly via File Manager can be risky if you're not sure what you're doing.

Note: Do not forget to replace the Sitemap URL with your own!

Here’s the perfect Robots.txt format:

Sitemap: https://example.com/sitemap_index.xml

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-content/plugins/

As you can see in the above Robots.txt file, we've disallowed crawlers from going through the plugins directory and the WordPress admin files. Along with that, we've also provided a valid Sitemap for our blog.

Letting Google's bots crawl through your WordPress admin panel wastes their time on pages that have no value in search and can surface URLs you never meant to show. Likewise, leaving your plugins directory exposed lets crawlers pick up plugin readme and asset files instead of your actual content.
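
One caveat worth knowing: some themes and plugins load front-end features through /wp-admin/admin-ajax.php. If yours does, a common variation of the format above re-allows just that one file while keeping the rest of the admin area blocked:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-content/plugins/

This works because Googlebot resolves conflicting rules by the most specific (longest) matching path, so the Allow line wins for admin-ajax.php.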

Let's see what happens when you have a proper Robots.txt on your WordPress blog…

Why is the Robots.txt File Important for a WordPress Blog?


Don't get confused about the role of the Robots.txt file on your WordPress blog. If you don't have a valid Robots.txt, crawlers will still crawl your site. They will go through every link and file on your server.

But… it becomes difficult for them to properly understand the structure of your blogging site. Imagine handing unwanted, useless tasks to Google when its real job is to show the best content from your site to your target audience.


That's why having a working, SEO-friendly Robots.txt on your WordPress blog is a must. It helps crawlers index the most important links on your blog faster than before.

In short, it is a booster for your On-page SEO Strategy.

It's time to dig into the step-by-step guide on how to create or update your Robots.txt file so that you can get organic traffic and income from your blogging website.

Steps to Create Robots.txt on WordPress

The general rule is that the Robots.txt file lives as plain text in the root folder on your server, but there are two ways to create a valid one on WordPress. Choose whichever makes you comfortable and you're good to go…

Method 1: Using the Yoast SEO Plugin (Editing Robots.txt – Easy!)

I'm sharing the Yoast SEO method first, as most bloggers use this plugin. Yoast SEO comes with a built-in Robots.txt generator that you can use to give your blog a valid file.

You can also edit your Robots.txt file directly on your WordPress dashboard with this plugin.

Just go to Yoast SEO > Tools and click on File Editor from the options.

Yoast SEO Robots.txt on WordPress by Bloggerable

If you don't have a Robots.txt file yet, it will show you an option to create a new one with one click.

User-agent: *
Disallow: /

Be careful: default text like the above is too restrictive; Disallow: / on its own blocks crawlers from your entire blogging website. Change it immediately to the format shared earlier in this post.

After pasting in the Robots.txt format shared above, click on 'Save robots.txt file' and refresh the page to verify your changes.

Method 2: Creating/Editing Robots.txt Manually with File Manager

In this method, we will create or edit your Robots.txt directly on the web server. Don't be afraid: it's a super easy method, and all it requires is access to the files on your server.

For that access, we'll use the Filester plugin on WordPress (easy).

Once you get inside, you'll be able to see the main folder of your files. If you see an existing robots.txt file, delete it, as we will create a fresh one.

Filester Robots.txt Tutorial by Bloggerable

Click on New File > Text File and name it robots.txt (the .txt extension is added automatically). Once created, go to Right Click > Edit file > TextArea and you'll end up with a text editing window.

Paste in the Robots.txt format shared above and click on the Save & Close button. You can refresh the page to confirm the file is still there.

In the next step, we'll verify your Robots.txt on WordPress with the help of a Robots.txt checker. It's a simple one-click process and easy to follow.

How to Check Robots.txt on WordPress?

See? Creating a powerful Robots.txt on WordPress is that easy. Your blog is now ready to steer the crawlers toward your best content. But it's always good to check your work before sleeping with a sigh of relief.

And it has become so easy to test the Robots.txt on your WordPress blog with different checking tools. I personally recommend using Merkle's Robots.txt checker.

Robots.txt on WordPress Online Checker

Just visit the site, type your domain name, and choose a preferred user agent. I use Googlebot, since Google is the search engine most people on the Internet rely on.

Click on Test and you’ll get your Robots.txt data in the text snippet below. Bingo!
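
If you'd rather test from your own machine, here's a minimal sketch using Python's built-in urllib.robotparser module; example.com and the post URL are placeholders for your own domain and pages:

from urllib.robotparser import RobotFileParser

# Load the live robots.txt from your site (placeholder domain)
rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()

# Ask whether Googlebot may fetch a given URL
print(rp.can_fetch("Googlebot", "https://example.com/wp-admin/"))       # expect: False
print(rp.can_fetch("Googlebot", "https://example.com/my-first-post/"))  # expect: True

If the admin check prints False and your post URLs print True, your file is doing its job.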

Conclusion

Proper configuration of your Robots.txt on a WordPress blog is required to stop crawlers from fetching the pages you don't want to show publicly. (Keep in mind that Robots.txt controls crawling, not indexing; truly private pages need password protection or a noindex tag.)

With these methods, you can easily create a Robots.txt file for your SEO purposes. Mind that a Robots.txt file alone won't fix your SEO rankings. I always encourage you to run the FREE Site Audit by Semrush, as it flags the errors and warnings that are harming your blog.

By the way, don't fall for the myth that blocking categories and tags in Robots.txt boosts how your website gets crawled; many people believe it, but it doesn't.

If you like the information, consider sharing it with your friends. Keep reading good stuff!

Good Luck,

Rajan Arora
