How to configure a Blogger (Blogspot) robots.txt file for advanced SEO and add the posts and pages sitemaps to it.
How do you set up a robots.txt file for the best Blogger SEO?
- Go to Blogger.com and sign in to your Blogger account.
- From the dashboard, open Settings.
- Scroll down to the Crawlers and indexing section.
- Enable custom robots.txt, then click Custom robots.txt to edit it.
- Paste in the desired robots.txt content and save.
By default, this file looks like this:

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://site.blogspot.com/sitemap.xml
Replace the file above with a new version that also includes the pages sitemap:

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://site.blogspot.com/sitemap.xml
Sitemap: https://site.blogspot.com/sitemap-pages.xml
This Blogger robots.txt file allows search engine bots to crawl your whole website, but it denies access to pages whose URLs start with /search (Blogger's label and search-result pages). This is very useful for SEO because it keeps those low-value, duplicate-content pages from cluttering the search results.
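If you want to confirm the rules behave as described, you can parse the file with Python's standard-library robots.txt parser. This is a minimal sketch; the site.blogspot.com URLs are placeholders for your own blog address.

```python
# Check the robots.txt rules with Python's standard-library parser.
import urllib.robotparser

# The same rules as the custom robots.txt above (placeholder domain).
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://site.blogspot.com/sitemap.xml
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Ordinary crawlers are blocked from /search (label/search-result) pages...
print(rp.can_fetch("*", "https://site.blogspot.com/search/label/seo"))   # False
# ...but are free to crawl regular posts and pages.
print(rp.can_fetch("*", "https://site.blogspot.com/2023/01/post.html"))  # True
```

Note that the AdSense crawler (Mediapartners-Google) has its own entry with an empty Disallow, so it can still fetch every page to match ads.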
The file above follows good robots.txt practice, but it is not the whole story for SEO. For the best possible settings, combine it with an advanced configuration of the robots meta tags (Blogger's custom robots header tags).
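For illustration, robots meta tags are standard HTML directives that control indexing per page. The fragment below is a hedged example of the kinds of tags such a configuration produces; which directives you actually want depends on the page type and your settings.

```html
<!-- Illustrative robots meta tags (assumed examples, not Blogger's exact output). -->
<!-- Let search engines index a regular post and follow its links: -->
<meta name="robots" content="index, follow">
<!-- Keep archive or search pages out of the index while still following links: -->
<meta name="robots" content="noindex, follow">
```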
I hope you found this article helpful. If you have any doubts or questions about Blogger or WordPress SEO, leave a comment below.