Create a Perfect Custom Robots.txt file for Blogger and improve SEO.

The robots.txt file controls how search engines crawl a website, so it plays a critical role in the search engine optimization of a Blogger blog. In this article, we'll work out the best implementation of the robots.txt file for a Blogger blog.

What is the function of the robots.txt file?

With the help of the robots.txt file, we tell search engines which pages they should and shouldn't crawl. Hence it allows us to control the behavior of search engine bots.


In the robots.txt file, we use the User-agent, Disallow, Allow, and Sitemap directives to declare which search engine bots the rules apply to, which pages they may crawl, which pages they may not, and where the sitemap is located.
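As a minimal generic illustration (the domain and paths here are placeholders, not taken from any real blog), these directives combine like this:

```
User-agent: *              # these rules apply to all bots
Disallow: /private         # do not crawl anything under /private
Allow: /                   # everything else may be crawled

Sitemap: https://www.example.com/sitemap.xml
```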

Usually, we use rules that let all search engine bots crawl and index pages throughout the site. But for finer control, you have to understand how the robots.txt file works on a Blogger blog.

The Best Robots.txt file for the Blogger Blog

To create a perfect custom robots.txt file for Blogspot, we first have to understand how the Blogger blog works. For this, let's analyze the default robots.txt file.

By default, this file looks like this:

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://example.blogspot.com/sitemap.xml

  • The first block declares a bot type: Mediapartners-Google is the Google AdSense bot, and it is disallowed nothing. That means AdSense ads can appear throughout the website.
  • The next user-agent is *, meaning all search engine bots, and they are disallowed from /search pages. Because search and label pages share the same URL structure, this blocks both.
  • The Allow: / rule defines that every page not covered by a Disallow rule is allowed to be crawled.
  • The Sitemap line declares the blog's post sitemap, which Blogger serves at /sitemap.xml.

This is an almost perfect file for controlling search engine bots and instructing them which pages to crawl and which to skip. Please note that allowing a page to be crawled does not guarantee that it will be indexed.

But this file still lets bots crawl the archive pages, which can cause a duplicate content issue: the same posts become reachable under multiple URLs, creating junk pages for the Blogger blog.

We have to prevent this duplicate content issue caused by the archive section. We can achieve that by stopping the bots from crawling the archive pages, which on Blogger all live under date-based URLs such as /2023/01/. So we add a Disallow: /20* rule to the robots.txt file. On its own, though, this rule would also block the posts, since their URLs start the same way (for example, /2023/01/post-title.html). To avoid this, we add an Allow: /*.html rule; because it is the longer, more specific match, it lets the bots crawl posts and pages again.

The default sitemap includes posts, but not static pages. So you also have to add the pages sitemap, which Blogger serves at /sitemap-pages.xml on both Blogspot and custom domains.

So the new, perfect robots.txt file for the Blogger blog will look like this:

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search*
Disallow: /20*
Allow: /*.html

Sitemap: https://example.blogspot.com/sitemap.xml
Sitemap: https://example.blogspot.com/sitemap-pages.xml


Please replace the domain in the Sitemap lines with your Blogspot subdomain or custom domain name. For example, if your custom domain is www.example.com, the post sitemap will be at https://www.example.com/sitemap.xml. In addition, you can check the current robots.txt file of any site at its root, e.g. https://www.example.com/robots.txt.
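To see why the Disallow: /20* and Allow: /*.html rules work together, here is a minimal Python sketch (illustrative only, not Blogger or Google code) of the longest-match rule that major crawlers apply: the most specific matching pattern wins, so Allow: /*.html (7 characters) overrides Disallow: /20* (4 characters) for post URLs.

```python
import re

# The crawl rules from the custom robots.txt above.
ALLOWS = ["/*.html"]
DISALLOWS = ["/search*", "/20*"]

def rule_matches(pattern: str, path: str) -> bool:
    """Check a robots.txt pattern against a URL path ('*' matches any characters)."""
    regex = re.escape(pattern).replace(r"\*", ".*")
    return re.match(regex, path) is not None

def is_allowed(path: str) -> bool:
    """Longest matching pattern wins; on a tie, Allow wins; no match means allowed."""
    best_len, allowed = -1, True
    for pattern in ALLOWS:
        if rule_matches(pattern, path) and len(pattern) > best_len:
            best_len, allowed = len(pattern), True
    for pattern in DISALLOWS:
        if rule_matches(pattern, path) and len(pattern) > best_len:
            best_len, allowed = len(pattern), False
    return allowed

print(is_allowed("/2023/01/"))              # archive page -> False (blocked)
print(is_allowed("/2023/01/my-post.html"))  # post -> True (Allow: /*.html is longer)
print(is_allowed("/search/label/seo"))      # label page -> False (blocked)
```

This confirms the intent of the file: archive and label pages are skipped, while posts and pages remain crawlable.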

The file above is the best robots.txt practice for SEO. It saves the website's crawl budget and helps the Blogger blog appear in the search results. Along with it, you have to write SEO-friendly content to appear in the search results.

For the best possible setup, combine this robots.txt file with advanced robots meta tags. The combination is one of the best practices to boost the SEO of a Blogger blog.
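For reference, the robots meta tag goes in the head section of a page; a common configuration (the exact values you choose depend on your needs) looks like this:

```html
<!-- Allow indexing and link following, and permit large image previews -->
<meta name="robots" content="index, follow, max-image-preview:large">
```

Unlike robots.txt, which controls crawling, the robots meta tag controls indexing of a page that has already been crawled.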

How to edit the robots.txt file of the Blogger blog?

The robots.txt file is always located at the root level of a website. But Blogger gives no access to the root, so how do we edit this file?

Blogger exposes all root-level file settings, such as robots.txt and ads.txt, inside its own Settings panel. You just have to log in to your Blogger account and edit the robots.txt file from there.

  1. Go to the Blogger Dashboard and click on the Settings option.
  2. Scroll down to the "Crawlers and indexing" section.
  3. Enable "Custom robots.txt" with the toggle switch.
  4. Click on "Custom robots.txt"; a window will open up. Paste the robots.txt file and save.

After updating the custom robots.txt file, check it by visiting yourdomain.com/robots.txt, where yourdomain.com should be replaced with your blog's address.

I hope you like this article. If you have any doubts or questions regarding Blogger or WordPress SEO, you can comment below.

Ashok Kumar

Ashok Kumar has been working in the search engine optimization field since 2015 and has completed many successful projects since then. He shares the real-life experience of the best SEO practices with his followers. You can also learn advanced-level SEO for WordPress, Blogger, or any other blogging platform. Stay tuned.
