Custom robots.txt management – Blogger SEO
Robots.txt is a text file stored on your server that tells search engine bots which directories, pages, or links should be excluded from search results. In other words, it lets you restrict crawlers from visiting certain parts of your website or blog. Custom robots.txt is now available for Blogger. In Blogger, the search option is tied to labels, so if you are not using labels wisely on each post, you should disallow crawling of the search link; by default, Blogger already disallows it. In robots.txt you can also declare the location of your sitemap file. A sitemap is a file on the server that lists the permalinks of all posts on your website or blog, most often in XML format, i.e. sitemap.xml.
At the time of writing, Blogger is still working on sitemap.xml support; for now it reads sitemap entries through the blog feed, which submits the 25 most recent posts to search engines. If you want search engine bots to work only on the 25 most recent posts, use robots.txt type 1 given below. In this robots.txt, the Google AdSense bot is allowed to crawl the entire blog for the best AdSense performance.
Robots.txt Type 1
User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Disallow: /b
Allow: /
Sitemap: https://www.road2space.blogspot.com/feeds/posts/default?orderby=updated
Note: Don’t forget to replace https://www.road2space.blogspot.com with your blog address or custom domain. If you want search engine bots to crawl the 500 most recent posts, use robots.txt type 2 below. If your blog already has more than 500 posts, add one more Sitemap line following the pattern of the second Sitemap line.
Robots.txt Type 2
User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Disallow: /b
Allow: /
Sitemap: https://www.road2space.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
Sitemap: https://www.road2space.blogspot.com/atom.xml?redirect=false&start-index=501&max-results=500
Note: Don’t forget to replace https://www.road2space.blogspot.com with your blog address or custom domain.
Mathematical expression for Blogger robots.txt sitemap entries:
Sitemap: https://www.road2space.blogspot.com/atom.xml?redirect=false&start-index=(m*0)+1&max-results=m
Sitemap: https://www.road2space.blogspot.com/atom.xml?redirect=false&start-index=(m*1)+1&max-results=m
Sitemap: https://www.road2space.blogspot.com/atom.xml?redirect=false&start-index=(m*2)+1&max-results=m
Sitemap: https://www.road2space.blogspot.com/atom.xml?redirect=false&start-index=(m*3)+1&max-results=m
...
Sitemap: https://www.road2space.blogspot.com/atom.xml?redirect=false&start-index=(m*n)+1&max-results=m
Here m = 500 and n = 0, 1, 2, 3, …, so each Sitemap line covers one block of m posts. If you have organized your post labels well and have good search engine optimization (SEO) experience, you can remove the following line –
Disallow: /search
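The sitemap pattern above can be generated automatically. The following Python sketch implements the formula, with `example.blogspot.com` as a placeholder address and `sitemap_entries` as a hypothetical helper name; substitute your own blog address.

```python
def sitemap_entries(total_posts, m=500, base="https://example.blogspot.com"):
    """Return one 'Sitemap:' line per block of m posts.

    Each line uses start-index = (m*n) + 1 for n = 0, 1, 2, ...,
    continuing until every post is covered.
    """
    entries = []
    n = 0
    while n * m < total_posts:
        start = n * m + 1  # start-index = (m*n) + 1
        entries.append(
            f"Sitemap: {base}/atom.xml?redirect=false"
            f"&start-index={start}&max-results={m}"
        )
        n += 1
    return entries
```

For example, a blog with 1200 posts needs three Sitemap lines, with start-index values 1, 501, and 1001.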
Most importantly, if you don’t want a particular Blogger post or page submitted to search engines, you can disallow it individually. For a post, add a line like this –
Disallow: /yyyy/mm/post-name.html
For a page, add a line like this –
Disallow: /p/page-name.html
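You can check how these rules behave before publishing them. The sketch below uses Python's standard-library `urllib.robotparser` to test a robots.txt similar to the ones above; the post path `/2019/03/my-post.html` and the domain `example.blogspot.com` are hypothetical examples.

```python
from urllib import robotparser

# A robots.txt modelled on the examples above, with one post disallowed.
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Disallow: /b
Disallow: /2019/03/my-post.html
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Generic bots: the search link and the disallowed post are blocked,
# while other posts remain crawlable.
print(rp.can_fetch("*", "https://example.blogspot.com/search?q=seo"))
print(rp.can_fetch("*", "https://example.blogspot.com/2019/03/my-post.html"))
print(rp.can_fetch("*", "https://example.blogspot.com/2019/04/other-post.html"))
```

The empty `Disallow:` in the first group means the AdSense bot (Mediapartners-Google) may crawl everything, which matches the intent of the robots.txt types above.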
Manage Blogger custom robots.txt
To do this, follow these steps carefully: Dashboard ›› Blog’s Settings ›› Search Preferences ›› Crawlers and indexing ›› Custom robots.txt ›› Edit ›› Yes
I hope this post helps you gain a better search engine presence and ranking.
Reviewed by Ruchi on March 21, 2019