How To Add Custom Robots.txt File in Blogger (Blogspot)

The current Blogger interface is more SEO friendly. It offers many search engine friendly features, and among them is the custom robots.txt, a newer feature that lets you customize the file to your liking. The robots.txt file plays a vital role in the search engine optimization of your blog. In this post, let us discuss how to add a custom robots.txt file to a Blogspot blog.

What is meant by Robots.txt?

Robots.txt is a simple text file containing directives that the owner of a website writes for search engine spiders or crawlers. These directives tell crawlers which pages they may and may not index. To shed some extra light on this: whenever web crawlers visit your website, before indexing your pages they first look for the robots.txt file to check the exclusion rules. From it, crawlers learn which pages to index and which to skip. Used properly, it is a great tool to improve the ranking of your blog and bring in more traffic. Let us dig deeper and learn how to add a custom robots.txt file to a Blogspot blog.

Add a custom robots.txt file to your Blogger blog

1)      Go to www.blogger.com
2)      Log in to your account
3)      Select the blog if you have multiple blogs; otherwise skip this step
4)      Now go to Settings >> Search Preferences
5)      Under Search Preferences, find the Custom robots.txt option and click Edit
You will then see a text area which you can use to set exclusion rules for web crawlers or spiders.
Now enter the following code in that box

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://yourblog.blogspot.com/sitemap.xml

NOTE: Replace “yourblog.blogspot.com” with your own blog’s address.
You are now done adding a custom robots.txt file to your Blogspot blog.
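To see how a crawler interprets these rules, here is a minimal sketch using Python's standard urllib.robotparser module. The blog address and post URLs are placeholders, and the AdSense group is given an explicit empty Disallow (which means "allow everything"):

```python
from urllib import robotparser

# The rules discussed in this post; yourblog.blogspot.com is a placeholder.
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Search-result pages are blocked for ordinary crawlers...
print(rp.can_fetch("Googlebot", "http://yourblog.blogspot.com/search/label/SEO"))      # False
# ...but normal post pages stay crawlable.
print(rp.can_fetch("Googlebot", "http://yourblog.blogspot.com/2013/05/my-post.html"))  # True
# The AdSense bot has its own group with no restrictions.
print(rp.can_fetch("Mediapartners-Google", "http://yourblog.blogspot.com/search/label/SEO"))  # True
```

This is only an illustration of how the grouped rules behave; real crawlers fetch the live robots.txt from your domain instead of a string.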

Explanation of each directive

You may be wondering about the terms used above, such as User-agent, Disallow, Disallow: /search, Allow, and Sitemap. If you are not familiar with them, check the explanation of each one below.

User-agent: Mediapartners-Google

This directive is for Google AdSense. The AdSense bots crawl your blog so that they can serve better-targeted ads. This line is only useful if you use Google AdSense; if you do not, you can simply ignore it.


User-agent: *

The user agent marked with an asterisk applies to the bots or crawlers of all search engines, be it Yahoo, Bing, or any other, and allows them to crawl your blog.


Disallow

By adding this directive, you instruct crawlers not to crawl or index the pages you specify.


Disallow: /search

This means you are disallowing the search result pages of your blog by default. That is, you are not allowing crawlers to crawl the /search directory, which comes right after your domain name. Such a URL looks like yourblog.blogspot.com/search/label/SEO, and a page like this will not be crawled and never be indexed.


Allow: /

As its name states, this directive allows crawlers to crawl and index the home page of your blog.


Sitemap

This directive refers to the sitemap of our blog. It helps crawlers index all the web pages that can be accessed. Web crawlers will always find their way to the sitemap, because it makes their job easier: the sitemap contains the links of all our published posts.
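The idea that the sitemap lists every published post can be illustrated with a short Python sketch that parses a tiny, made-up sitemap (the post URLs below are placeholders, not real entries from Blogger):

```python
import xml.etree.ElementTree as ET

# A tiny inline sitemap, standing in for the one your blog serves.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://yourblog.blogspot.com/2013/01/first-post.html</loc></url>
  <url><loc>http://yourblog.blogspot.com/2013/02/second-post.html</loc></url>
</urlset>"""

# Sitemaps live in the sitemaps.org namespace, so register it for lookups.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap_xml)

# Collect every <loc> entry, i.e. the links to the published posts.
post_links = [loc.text for loc in root.findall("sm:url/sm:loc", ns)]
print(post_links)
```

A crawler does essentially the same thing at a larger scale: fetch the sitemap URL from robots.txt, read out the `<loc>` entries, and queue each one for crawling.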
Note that the XML sitemap covers only your first 500 posts.


If you have more than 500 posts, say 1,000, then you can add two sitemap entries, one for each batch of 500 posts.
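One commonly used pattern for this relies on the blog's Atom feed, fetched in batches of 500 posts; as before, yourblog.blogspot.com is a placeholder for your own blog's address:

Sitemap: http://yourblog.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
Sitemap: http://yourblog.blogspot.com/atom.xml?redirect=false&start-index=501&max-results=500

The first entry covers posts 1 to 500 and the second covers posts 501 to 1,000; for even more posts, add further entries and raise start-index by 500 each time.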

