
Blogger Custom robots.txt file to Boost SEO of the Blog in 2021

Every crawling bot first reads the robots.txt file to learn the crawling rules for a website. The robots.txt file therefore controls how search engines crawl a site, which means it plays a critical role in the search engine optimization of a Blogger blog. This article explains how to create a perfect custom robots.txt file for the Blogger blog.

What are the functions of the robots.txt file?

The robots.txt file tells search engines which pages should and shouldn't be crawled. Hence it allows us to control the behaviour of search engine bots.


In the robots.txt file, we declare user-agent, allow, disallow, and sitemap directives for search engines like Google, Bing, Yandex, etc. Let's understand the meaning of all these terms.

Usually, we rely on robots meta tags to let all search engine crawling bots index blog posts and pages across the web. But if you want to save crawling budget by blocking search engine bots from some sections of the website, you have to understand the robots.txt file of the Blogger blog.

Analyze the default Robots.txt file of the Blogger Blog

To create a perfect custom robots.txt file for the Blogger blog, we first have to understand the Blogger blog structure and analyze the default robots.txt file.

By default, this file looks like:

User-agent: Mediapartners-Google
Disallow: 

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://www.example.com/sitemap.xml
  • The first line of this file declares the bot type. Here it's Mediapartners-Google (the AdSense bot), which is disallowed from nothing. That means AdSense ads can appear throughout the website.
  • The next user-agent is *, which means all search engine bots are disallowed from /search pages, i.e. from all search and label pages (they share the same URL structure).
  • The Allow: / directive means every page outside the disallowed section is allowed to be crawled.
  • The last line declares the post sitemap of the Blogger blog.
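These default rules can also be checked programmatically. As a minimal sketch, Python's standard urllib.robotparser can parse the default file and report what a generic crawler may fetch (the example URLs below are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# The default Blogger robots.txt, parsed from a string
default_rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
"""

rp = RobotFileParser()
rp.parse(default_rules.splitlines())

# A generic crawler may fetch posts and pages...
print(rp.can_fetch("*", "https://www.example.com/2021/05/post.html"))  # True
# ...but not search or label pages
print(rp.can_fetch("*", "https://www.example.com/search/label/SEO"))   # False
```
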

This is an almost perfect file for controlling search engine bots and instructing them which pages to crawl and which to skip. Please note that allowing a page to be crawled does not guarantee that it will be indexed.

But this file allows the archive pages to be crawled and indexed, which can cause a duplicate content issue and clutter the Blogger blog's search presence.

Create a Perfect custom robots.txt file for the Blogger Blog

We have seen how the default robots.txt file works for the Blogger blog. Let's optimize it for the best SEO.

The default robots.txt allows the archive pages to be indexed, which causes the duplicate content issue. We can prevent this by stopping the bots from crawling the archive section. For this:

  • A Disallow rule /search* will disable crawling of all search and label pages.
  • A Disallow rule /20* will stop the crawling of the archive section (Blogger archive URLs start with the year, e.g. /2021/).
  • The /20* rule would also block crawling of all posts, so we add an Allow rule for /*.html, which lets the bots crawl posts and pages.
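Note that Python's standard robot parser does not understand Google's * wildcards, so as a rough illustration of how these three rules interact, here is a simplified sketch of Google-style matching (longest matching rule wins; on a tie, Allow beats Disallow):

```python
import re

# The three custom rules from this article, for the "*" user agent
DISALLOW = ["/search*", "/20*"]
ALLOW = ["/*.html"]

def to_regex(rule):
    # Google treats '*' as "any sequence of characters"
    return re.compile(re.escape(rule).replace(r"\*", ".*"))

def is_allowed(path):
    # Simplified Google semantics: the most specific (longest) matching
    # rule wins; on a tie, Allow beats Disallow.
    best_len, verdict = -1, True  # no matching rule means allowed
    for rules, allowed in ((DISALLOW, False), (ALLOW, True)):
        for rule in rules:
            if to_regex(rule).match(path) and (
                len(rule) > best_len or (len(rule) == best_len and allowed)
            ):
                best_len, verdict = len(rule), allowed
    return verdict

print(is_allowed("/2021/05/my-post.html"))  # True: /*.html outranks /20*
print(is_allowed("/2021/05/"))              # False: archive page blocked
print(is_allowed("/search/label/SEO"))      # False: label page blocked
```
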

The default sitemap includes posts but not pages. So you have to add a sitemap for pages, located at https://example.blogspot.com/sitemap-pages.xml, or https://www.example.com/sitemap-pages.xml for a custom domain.

So the new, perfect custom robots.txt file for the Blogger blog will look like this:

User-agent: Mediapartners-Google
Disallow: 

User-agent: *
Disallow: /search*
Disallow: /20*
Allow: /*.html

Sitemap: https://www.example.com/sitemap.xml
Sitemap: https://www.example.com/sitemap-pages.xml

You have to replace www.example.com with your Blogger subdomain or custom domain name. For example, if your custom domain is www.iashindu.com, then the sitemap will be at https://www.iashindu.com/sitemap.xml. In addition, you can check your current robots.txt at https://www.example.com/robots.txt.
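As a tiny illustrative helper (the function name and domain are hypothetical), all the URLs involved can be derived from the domain alone:

```python
def blog_urls(domain):
    # Build the robots.txt and sitemap URLs for a given Blogger domain
    base = f"https://{domain}"
    return {
        "robots": f"{base}/robots.txt",
        "posts_sitemap": f"{base}/sitemap.xml",
        "pages_sitemap": f"{base}/sitemap-pages.xml",
    }

for name, url in blog_urls("www.iashindu.com").items():
    print(name, url)
```
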

The file above is best practice for robots.txt as well as for SEO. It will save the website's crawling budget and help the Blogger blog appear in the search results. Along with it, you have to write SEO-friendly content to rank in the search results.

For the best possible setup, combine the advanced robots meta tags with this robots.txt file. The combination is one of the best practices to boost the SEO of the Blogger blog.

How to edit the robots.txt file of the Blogger blog?

The robots.txt file is located at the root level of a website. But Blogger gives no access to the root, so how do you edit this robots.txt file?

Blogger exposes all root-level file settings, such as robots.txt and ads.txt, in its settings. You have to log in to your Blogger account and edit the robots.txt file there.

  1. Go to the Blogger Dashboard and click on the Settings option.
  2. Scroll down to the "Crawlers and indexing" section.
  3. Enable "Custom robots.txt" with the toggle button.
  4. Click on "Custom robots.txt"; a window will open. Paste the robots.txt file and update.

After updating the custom robots.txt file for the Blogger blog, check it by visiting https://www.example.com/robots.txt, where www.example.com should be replaced with your domain address.

Conclusion

We have understood the function of the robots.txt file and created a perfect custom robots.txt file for the Blogger blog. Blogger users can set up the above robots.txt file for the best results.

In the default robots.txt file, the archive section is also allowed to be crawled, which causes duplicate content issues. The search engine then gets confused about what to display in the search results and may not consider your pages for them at all.

This shows that robots rules are essential for the SEO of a website. You can consider combining the robots.txt file and robots meta tags in the Blogger blog for the best results. You can also download responsive and SEO-friendly templates for the Blogger blog.

I hope you liked this article. If you have any doubts or questions regarding Blogger or WordPress SEO, you can comment below.

Ashok Kumar

Ashok Kumar has been working in the Search Engine Optimization field since 2015 and has completed many successful projects since then. He shares real-life experience of the best SEO practices with his followers on seoneurons.com. You can also learn advanced-level SEO for WordPress, Blogger, or any other blogging platform. Stay tuned.


30 Comments

  1. I have submitted the custom robots.txt file according to you, but there is still an error and the ranking of the site is going down continuously. If you have any solution, then please provide it.

  2. Thanks, bro. It was a very helpful blog, especially for new bloggers. I appreciate your hard work. Well done, bro.

    You have made this tutorial very easy to understand for your readers. As a blogger, I am impressed by your writing skills and your sound knowledge. Keep going, and best of luck with your future posts.

    1. sitemap.xml is the proper sitemap method. You don't need to update it after, say, 500 or 1000 posts; it contains the sitemap of all posts, and you can add the page sitemap too.

      The atom feed, on the other hand, is a kind of RSS feed. That's not a proper sitemap method.

          1. No, I don't know it properly; that's why I am asking you. The second one is generated through a famous sitemap generator tool from labnol.org. Finally, thank you so much for replying.

          2. The method discussed in this article is the right one; the other one is the wrong method, brother. You can try submitting both kinds of sitemaps to Google Search Console.
            You'll see that you have to add only 2 sitemaps: one for all pages, and one for all posts (no matter how many pages or posts you have).

            And for the atom feed (labnol) you have to add a sitemap for every 500 posts; if you have 3000 blog posts, then you have to submit 6 sitemaps, and there is no sitemap for pages at all.

  3. Thanks for explaining, but a question: why didn't you activate the "enable custom robots header tags" feature?

    1. We already explain this in the article. It blocks the crawling bots from all the archive sections to address the thin content issue. We can also perform this action by noindexing such content, but for that, read about the meta tag and robots.txt combination (link provided in the last paragraph of this article).

  4. Hi Ashok, how are you? First of all, I would like to appreciate that you are guiding bloggers on the best technical SEO settings. But I am facing a problem using the custom robots.txt file given above: when we use this format, it does not fetch all blog pages or posts in Bing or other search consoles. For example, if we have 100 posts in a blog, then Bing fetches only 2 or 3 posts using sitemap.xml, but if we use atom.xml, it fetches more, though for that we have to change your custom robots.txt format. Can you give us the best solution?

    One more thing: if we add the above custom robots.txt setting in the Blogger settings, is it then mandatory to also use this setting just below the theme HTML section? Many bloggers are using paid Blogger templates and have no idea whether their template providers have already installed such code, as not everyone is technically sound in coding; therefore your viewers will trust your call. Please guide everyone on what they should do so they can benefit from your SEO settings. Also guide viewers about the "Home page tags", "Archive and search page tags", and "Post and page tags" settings if they use the above custom robots.txt code. Is it mandatory to use both?

  5. Could you please reply to my last query? Since we updated the given custom robots.txt file on our blog, it is not fetching all our blog pages into Google Search Console or other search consoles. What is the solution to this?

    1. There are two solutions.
      The first is blocking search engines' access to all thin content and allowing indexing of pages and posts (this post explains that).
      And
      the second is allowing search engines access to all thin content but noindexing it via the robots meta tag, so only pages and posts get indexed (the combination of robots meta tag and robots.txt for Blogger; find the link in the last paragraph).
      You can follow either of these two. Thanks.

  6. Thank you very much, sir. You are great. I read many articles, so I comment here.
    1. Thanks for explaining schema [I applied it on Blogspot]; Webmaster Tools enhancements, yahoo!
    2. Thanks for explaining m=1 [in the Blogger market there are lots of fake posts about it], but you explained it clearly.
    3. Thanks for the custom robots.txt.

    Regards, Iliyas Shaikh
