The robots.txt file controls how search engine crawlers access a website, so it plays a critical role in the search engine optimization (SEO) of a Blogger blog. In this article, we’ll look at the best implementation of the robots.txt file for a Blogger blog.
How to set up the robots.txt file for Best Blogger SEO?
- Go to Blogger.com.
- Sign in to your Blogger account.
- From the dashboard, open the Settings tab.
- Under Crawlers and indexing, turn on the custom robots.txt option and click Custom robots.txt to edit it.
- Paste in the desired robots.txt file and save.
The Best Robots.txt file for the Blogger Blog
In Settings, find the Crawlers and indexing section, and then click Custom robots.txt to edit the file.
By default, the file looks like this:
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://site.blogspot.com/sitemap.xml
Replace the above file with a new version that also includes the pages sitemap, i.e.:
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://site.blogspot.com/sitemap.xml
Sitemap: https://site.blogspot.com/sitemap-pages.xml
This Blogger robots.txt file allows search engine bots to crawl your whole website but denies access to URLs that start with /search (Blogger’s label and search result pages). This is very useful for SEO because it keeps junk pages out of search results.
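To see how a crawler would interpret these rules, here is a minimal sketch using Python’s standard `urllib.robotparser` module. The blog URL `https://site.blogspot.com` is the placeholder from the article, and the Sitemap lines are omitted since they do not affect crawl permissions:

```python
# Sketch: simulate how a crawler reads the Blogger robots.txt rules.
# "https://site.blogspot.com" is a placeholder blog address.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Ordinary crawlers may fetch posts and pages...
print(parser.can_fetch("Googlebot", "https://site.blogspot.com/2024/01/my-post.html"))  # True
# ...but not the label/search result pages under /search.
print(parser.can_fetch("Googlebot", "https://site.blogspot.com/search/label/SEO"))  # False
# The AdSense bot (Mediapartners-Google) has an empty Disallow, so it can fetch everything.
print(parser.can_fetch("Mediapartners-Google", "https://site.blogspot.com/search/label/SEO"))  # True
```

This matches the behavior described above: everything is crawlable except the /search pages, while the AdSense crawler is unrestricted.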
The file above is good robots.txt practice, but on its own it is not the best possible setup for SEO. For the best results, try an advanced configuration that combines the robots meta tag with the robots.txt file; this combination is one of the best ways to boost a website’s SEO.
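As an illustration of the robots meta tag mentioned above, here is one example tag that could be placed in a page’s head section; the exact values (`noindex, follow`) are illustrative, not a recommendation from the original article:

```html
<!-- Illustrative robots meta tag: asks compliant crawlers not to index
     this particular page while still following the links on it. -->
<meta content='noindex, follow' name='robots'/>
```

Unlike robots.txt, which controls crawling, the robots meta tag controls indexing on a per-page basis, which is why the two work well together.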
I hope you liked this article. If you have any doubts or questions about Blogger or WordPress SEO, you can comment below.