
How To Add Custom Robots.txt File In Blogger

Have you heard about robots.txt, or used it on your Blogger blog? Today we will discuss the robots.txt file.

What Is Robots.txt

Robots.txt is a file on your blog that controls how the blog is crawled and indexed. Every search engine, including Google, sends its spiders (crawlers) to your blog or website. The spiders first fetch the robots.txt file and read the Robots Exclusion Protocol rules in it; only after that do they start crawling and indexing your site. So robots.txt controls the crawling and indexing of your blog and your blog posts. If you want to stop spiders from crawling a particular post, you can do that easily with the robots.txt file. Website and blog owners mainly use robots.txt to keep crawlers out of their admin areas; if you have a blog on WordPress, for example, you can easily keep the WP admin area out of search engines with a robots.txt rule.
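For instance, a minimal robots.txt protecting a WordPress admin area might look like this (the /wp-admin/ path is WordPress's standard admin location; this is an illustrative sketch, not Blogger's file):

```
User-agent: *
Disallow: /wp-admin/
```

The User-agent: * line addresses every crawler, and the Disallow line tells them not to fetch anything under /wp-admin/.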

Each Blogger blog has a default robots.txt file. You can check your blog's robots.txt file by adding /robots.txt after your blog's address.

For Example: http://yourblog.blogspot.com/robots.txt

The default file contains a few directives and looks like this:

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://yoursite.blogspot.com/feeds/posts/default?orderby=UPDATED
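These default rules can be checked programmatically with Python's standard-library urllib.robotparser. This is a minimal sketch; the blog address is just the placeholder used above:

```python
from urllib.robotparser import RobotFileParser

# The default Blogger rules shown above, parsed line by line.
rp = RobotFileParser()
rp.parse("""\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
""".splitlines())

# /search (label and search-result pages) is blocked for ordinary crawlers...
print(rp.can_fetch("*", "http://yourblog.blogspot.com/search/label/news"))  # False
# ...but regular posts stay crawlable,
print(rp.can_fetch("*", "http://yourblog.blogspot.com/2015/01/post.html"))  # True
# and the AdSense crawler (Mediapartners-Google) may fetch everything.
print(rp.can_fetch("Mediapartners-Google",
                   "http://yourblog.blogspot.com/search/label/news"))       # True
```

This is why Blogger's label and search pages do not appear in search results while normal posts do.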

How To Enable Robots.txt In Blogger

1. Go to your Blogger dashboard.

2. Click Settings >> Search Preferences.

3. You will see the Custom robots.txt section near the bottom. Click Edit >> Yes.

[Screenshot: add custom robots.txt in Blogger]

4. Copy the default robots.txt file from http://yourblog.blogspot.com/robots.txt (your own blog's address) and paste it into the box. Click Save.

Must Read: How To Create an Automatically Updating Sitemap in Blogger

5. Your robots.txt file is now enabled, and you can block any page or post by adding a command in the box.

How To Add a Command To Stop Crawling And Indexing Of a Specific Post Or Page

If you want to block a specific page or post from being crawled and indexed by search engine spiders, you need to add a command in the robots.txt box. As an example, here is how to stop crawling of a Contact Us page:

Disallow: /p/contact-us.html

Add this line below User-agent: *

Note: If your Contact Us page has a different permalink, use that permalink in the Disallow line instead, and always write one command per line. For more information about robots.txt commands, see the documentation of the Robots Exclusion Protocol.
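You can confirm what the extra Disallow line does with the same standard-library parser; the permalinks below are illustrative examples, not real pages:

```python
from urllib.robotparser import RobotFileParser

# The default "*" rules plus the extra Disallow line for the contact page.
rp = RobotFileParser()
rp.parse("""\
User-agent: *
Disallow: /search
Disallow: /p/contact-us.html
Allow: /
""".splitlines())

# The contact page is now blocked for all crawlers...
print(rp.can_fetch("*", "http://yourblog.blogspot.com/p/contact-us.html"))  # False
# ...while every other page is still allowed.
print(rp.can_fetch("*", "http://yourblog.blogspot.com/p/about-us.html"))    # True
```

Because rules are matched by path prefix, the Disallow line must match your page's permalink exactly from the first slash.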

