Did you know that the default robots.txt file blocks crawlers from many pages of a Blogger blog or website? This tutorial explains how to use robots.txt and the robots meta tag together for the best SEO results on a Blogger blog, letting all crawlers crawl and index the whole site without duplicate-content errors.
For the best search-engine results, you should use the Blogger robots.txt file and the robots meta tag in combination, i.e., a perfectly optimized Blogger blog.
Perfect Combination of Blogger robots.txt and meta robots tag
Blogger now gives you more control over its features than before. It supports more advanced XML themes that let you control the SEO of your blog. You can add robots meta tags so that posts, pages, and the homepage are indexed, while the search and archive sections remain crawlable but carry a noindex value in their meta tags.
How does Google indexing work?
When a search engine's bot visits your blog, it first checks the rules in the robots.txt file. The robots.txt file contains the rules that a search engine should follow. Blogger's default robots.txt file only blocks the blog's search and label pages from appearing in Google's search results.
You can check that by visiting http://www.yourblog.com/robots.txt or https://example.blogspot.com/robots.txt
As we already discussed, this file contains the crawl rules for bots. Try visiting your own robots.txt file:

```
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://www.xyz.blogspot.com/sitemap.xml
```
What does each line mean? Let's break it down.
The first block, User-agent: Mediapartners-Google with an empty Disallow:, applies to Google's AdSense crawler and allows it to crawl the whole blog. The next declaration, User-agent: *, addresses the bots or spiders of all other search engines. The line after that,
Disallow: /search, means that no page whose URL matches the structure of https://xyz.blogspot.com/search may be crawled. The next line,
Allow: /, states that all other pages are allowed to be crawled. Read more about the robots.txt file here.
Sitemap: https://www.xyz.blogspot.com/sitemap.xml declares the XML sitemap that Google and other search engines use to index your website's pages.
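The behavior of these rules can be verified with Python's standard-library robots.txt parser. This is a small sketch using the default Blogger rules shown above, with example.blogspot.com as a placeholder domain:

```python
from urllib.robotparser import RobotFileParser

# The default Blogger-style robots.txt rules discussed above
# (example.blogspot.com is a placeholder domain).
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Search/label pages are blocked for ordinary crawlers...
print(rp.can_fetch("Googlebot", "https://example.blogspot.com/search/label/SEO"))  # False
# ...but regular post pages are allowed.
print(rp.can_fetch("Googlebot", "https://example.blogspot.com/2021/01/my-post.html"))  # True
# The AdSense crawler (Mediapartners-Google) may crawl everything.
print(rp.can_fetch("Mediapartners-Google", "https://example.blogspot.com/search/label/SEO"))  # True
```

This illustrates exactly why label and search pages never appear in search results under the default rules: the crawler is refused before indexing can even be considered.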
Blogger Robots.txt File Settings
Now that you understand how each rule in the robots.txt file works, let's make some changes so that search engines can crawl and index your web pages better.
- Visit the Blogger dashboard.
- Dashboard > Settings > Search Preferences.
- Find Custom robots.txt and enable it.
- Paste the code given below. Note: Replace the website name with your own.
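The original snippet is not reproduced here, so the following is a minimal sketch of what such a custom robots.txt could look like. It assumes Blogger's standard sitemap endpoints (/sitemap.xml for posts and /sitemap-pages.xml for static pages) and uses www.xyz.blogspot.com as a placeholder domain to replace with your own:

```
User-agent: *
Disallow:

# Post sitemap (replace the placeholder domain with yours)
Sitemap: https://www.xyz.blogspot.com/sitemap.xml
# Static-pages sitemap
Sitemap: https://www.xyz.blogspot.com/sitemap-pages.xml
```

Note that the Disallow: /search rule is dropped here on purpose: duplicate search and archive pages will instead be excluded via the noindex robots meta tag added in the next section, so crawlers can follow links through the whole site.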
With this, you have submitted sitemaps for both posts and pages, and the search engine's spider or bot can crawl your whole website. For SEO, keep in mind that only posts and pages should be submitted for the search engine index. Why? Read about it here.
Blogger Robots Meta Tag Settings
Using robots meta tags, we'll keep junk/duplicate content from being indexed by search engines. Here we're using the latest robots meta tags suggested by Google.
- Go to Theme
- Now click on Edit HTML
- Find the following line in the theme code:

<b:include data='blog' name='all-head-content'/>

- Paste the code given below directly under that line.
```
<b:if cond='!data:view.isHomepage and !data:view.isSingleItem'>
  <meta content='noindex,follow' name='robots'/>
<b:else/>
  <meta content='max-snippet:-1, max-image-preview:large, max-video-preview:-1' name='robots'/>
</b:if>
```

This keeps search, label, and archive pages out of the index (while still letting their links be followed), and applies Google's snippet-preview directives to the homepage, posts, and pages.
What does this code do? Final SEO results:
- Boosts SEO, as only the real content is indexed by Google and other search engines.
- Resolves the Google Search Console error "blocked by robots.txt" for all pages.
- The website gets interlinked, which boosts on-page SEO.
- The bot or spider can visit every page of your website, which results in more pages being indexed.
You can download our Blogger themes: SEO-optimized, fast, and user-friendly.
I hope you liked this article on Blogger SEO. If you have any questions, feel free to ask in the comments below. Thank you.