Wednesday, 29 January 2014

How to Set Up a Custom Robots.txt for Blogspot

Custom Robots.txt for Blogspot


A robots.txt file tells web crawlers (robots) which parts of a publicly viewable website they should not access. Essentially, the robots.txt file on a site acts as a request that compliant robots ignore the specified files or directories when crawling it.
If you use the wordpress.org platform, you can create the robots.txt file yourself via the file manager in your hosting panel. If you are a Blogspot user, however, there are a few steps to follow before you can activate a custom robots.txt (it is disabled by default).


STEPS :


1. Login to your Blogger account and go to your blog dashboard.

2. Click on Settings >> Search preferences

3. Under Custom robots.txt, click Edit and then choose Yes to enable custom robots.txt content.





4. Then fill it in with the following rules:


User-agent: Mediapartners-Google
Allow: /

User-agent: Googlebot
Disallow: /?m=1
Disallow: /?m=0
Disallow: /*?m=1
Disallow: /*?m=0

User-agent: Twitterbot
Allow: /

User-agent: *
Disallow: /search
Disallow: /p/*
Disallow: /view/
Allow: /

Sitemap: http://prabhurockstar.blogspot.com/feeds/posts/default?orderby=UPDATED




Note: replace the blog address (prabhurockstar.blogspot.com) with your own blog's address.



5. Save your changes.
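Before saving, you can sanity-check rules like the ones in step 4 with Python's standard-library `urllib.robotparser`. This is a minimal sketch, not part of the original post; note that `urllib.robotparser` matches paths by prefix and does not implement the `*` wildcards used above, so only prefix-style rules are shown here.

```python
# Sanity-check a subset of the robots.txt rules from step 4.
# urllib.robotparser matches by prefix, so wildcard rules are omitted.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: Googlebot
Disallow: /?m=1

User-agent: *
Disallow: /search
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Search result pages are blocked for generic crawlers ...
print(parser.can_fetch("*", "/search?q=test"))      # False
# ... but ordinary post pages remain crawlable.
print(parser.can_fetch("*", "/2014/01/post.html"))  # True
```

Running this confirms that `/search` URLs are blocked while normal post URLs stay open, which is the intent of the rules in step 4.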


For websites with multiple subdomains, each subdomain must have its own robots.txt file.
If example.com has a robots.txt file but sub.example.com does not, the rules that apply
to example.com do not apply to sub.example.com.
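In other words, a crawler resolves robots.txt per host. A small sketch (using the placeholder hosts from the paragraph above) shows how the robots.txt location is derived from a page URL:

```python
# Each host serves its own robots.txt; a crawler fetches it per subdomain.
from urllib.parse import urlsplit, urlunsplit

def robots_url(page_url: str) -> str:
    """Return the robots.txt URL for the host that serves page_url."""
    parts = urlsplit(page_url)
    return urlunsplit((parts.scheme, parts.netloc, "/robots.txt", "", ""))

print(robots_url("http://example.com/page.html"))      # http://example.com/robots.txt
print(robots_url("http://sub.example.com/page.html"))  # http://sub.example.com/robots.txt
```

Because the two hosts map to two different robots.txt URLs, rules placed on one never carry over to the other.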




CHEERS
