How to add a custom robots.txt file to a BlogSpot blog?

Recently I wrote about setting up Google Search Console, which is required for every website. From Search Console you can manage your site's coverage, URL indexing, and more.

Before proceeding, I recommend reading about Google Search Console configuration.

The robots.txt file contains a few lines of directives that tell search engines how to crawl and index the pages on your website. Search engines read your site's robots.txt file first and start indexing URLs accordingly.

In a robots.txt file you can define which pages should be crawled and which should not. You can also block or restrict a page from web crawlers entirely so that it does not appear in search results.

Here is a sample robots.txt file. You can change it as per your requirements.

User-agent: *
Disallow: /wp-admin/
Disallow: /recommended/
Disallow: /tag/

Sitemap: https://scrollbucks.com/post-sitemap.xml
Sitemap: https://scrollbucks.com/page-sitemap.xml

The asterisk after User-agent means the robots.txt file applies to all web robots.

Disallow tells web robots not to visit or crawl the listed pages.

The Sitemap line tells web robots where all of your web pages are listed so they can find and crawl them.
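Note that the sample above uses WordPress-style paths such as /wp-admin/. Since this tutorial is for BlogSpot, here is a minimal sketch of what a typical Blogger robots.txt looks like; the blog address is a placeholder, so replace it with your own. The /search rule keeps label and search result pages out of the index, while Allow: / leaves the rest of the blog crawlable.

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://yourblog.blogspot.com/sitemap.xml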

Add robots.txt to your BlogSpot blog

Log in to your BlogSpot blog and navigate to Settings -> Search preferences.

Under the Crawlers and indexing section, enable Custom robots.txt and paste the code above. Make sure you have updated all of the website details, such as the disallowed paths and sitemap URLs, in the robots.txt file.
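After saving, the file is served at yourblog.blogspot.com/robots.txt. As a quick sanity check, here is a small Python sketch, using only the standard library, that reads the live file and reports which URLs crawlers are allowed to fetch; the blog address and the test URLs are placeholders, so substitute your own.

import urllib.robotparser

# Point the parser at your blog's live robots.txt
# (replace the placeholder address with your own blog).
rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://yourblog.blogspot.com/robots.txt")
rp.read()

# With the Blogger-style rules above, a /search page should
# report False (blocked) and a normal post URL should report True.
print(rp.can_fetch("*", "https://yourblog.blogspot.com/search/label/seo"))
print(rp.can_fetch("*", "https://yourblog.blogspot.com/2020/01/sample-post.html"))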

Here are a few tutorials available for newbie bloggers.


