How to optimize Blogger with Robots.txt SEO standard

Hello everyone, it's been a long time. Today I sat down to optimize SEO for my blog. Besides placing keywords in your articles, many other elements matter, and robots.txt is one of them: the robots.txt file tells search engine crawlers which pages or files on your site they may or may not request. In this post I will show you how to optimize robots.txt for SEO on Blogspot.

What is Robots.txt?

Robots.txt is a file containing directives that guide how search engines crawl your site. It lets crawlers and indexers know which pages or files on your site they may or may not request.

As written above, robots.txt is great for SEO because it lets you prevent unwanted links from being indexed by search engines.
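For example, a very small robots.txt could look like the sketch below (www.example.com is only a placeholder domain, not a real blog):

 # Rules that apply to every crawler
 User-agent: *
 # Do not crawl anything under /search
 Disallow: /search
 # Placeholder sitemap address - replace with your own
 Sitemap: https://www.example.com/sitemap.xml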

Benefits of using robots.txt

Adding robots.txt is optional, but recommended because it lets you do the following (a sample sketch follows this list):

  • Block indexing of unnecessary resources (e.g. videos, PDF files, etc.).
  • Block indexing of unnecessary pages.
  • Declare the sitemap location.
  • Optimize crawling: Google only crawls a website at a certain maximum rate within a given period of time, so we should let it crawl the necessary pages and exclude the unnecessary ones to make the most of that crawl budget.
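A rough sketch of a robots.txt covering the points above (the PDF pattern, the /p/contact.html page and the www.example.com sitemap URL are placeholders used only for illustration):

 User-agent: *
 # Block an unnecessary resource type (PDF files)
 Disallow: /*.pdf$
 # Block an unnecessary page
 Disallow: /p/contact.html
 # Declare the sitemap so crawlers spend their budget on the pages that matter
 Sitemap: https://www.example.com/sitemap.xml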

Basic commands of robots.txt

Each command has its own function (a combined example follows this list):

  • User-agent: [Required, at least one per group] The name of the search engine crawler the rules apply to, for example Googlebot.
  • Allow: Allows search engine robots to crawl the matching path.
  • Disallow: Prevents search engine robots from crawling the matching path.
  • Crawl-delay: Defines how long (in seconds) a bot must wait before crawling the next page (this directive is rarely used).
  • Sitemap: Declares the sitemap location of the website.
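Putting the commands above together, one possible group could look like the sketch below (Bingbot and the 10-second delay are arbitrary values chosen only for illustration):

 # Group for one specific crawler
 User-agent: Bingbot
 # Allow crawling of the whole site
 Allow: /
 # But block crawling of search result pages
 Disallow: /search
 # Ask the bot to wait 10 seconds between requests (rarely used; Google ignores it)
 Crawl-delay: 10
 # Sitemap location (placeholder domain)
 Sitemap: https://www.example.com/sitemap.xml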

Edit robots.txt for Blogspot

Step 1: Go to the blog management page > Settings.

Step 2: Scroll down and find the Crawlers and indexing section.

Enable Custom robots.txt and click the field below it to edit.

Configuring standard robots.txt for Blogspot

Here is a standard robots.txt configuration for those of you using Blogspot:


 User-agent: *
 Allow: /
 User-agent: Googlebot
 Allow: /
 Allow: /search/label
 Disallow: /search
 Allow: /search$
 Allow: /search/$
 Disallow: *archive.html$
 Sitemap: https://www.fcthemes.eu.org/atom.xml?redirect=false&start-index=1&max-results=500
 

Note: Change www.fcthemes.eu.org to your own domain name, and if your blog has more than 500 articles, increase 500 to a larger number.

Explanation of this configuration

The first line is User-agent: *. It targets every bot (Google's, Bing's, and so on) and opens the group of rules that apply below it.

Allow: /. This line allows crawling of all URL prefixes, i.e. the whole site.

I want Google not to crawl some unnecessary pages that other bots can still crawl, so I write a separate group for Googlebot by adding the line User-agent: Googlebot.
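Keep in mind that a crawler obeys only the most specific group addressed to it, so once a User-agent: Googlebot group exists, Googlebot ignores the User-agent: * group. A simplified illustration:

 # Followed by every other crawler
 User-agent: *
 Allow: /
 # Googlebot follows only this group and skips the one above
 User-agent: Googlebot
 Disallow: /search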


Allow crawling of the label pages: Allow: /search/label.

Block crawling of search pages, which may have no content (Disallow: /search), but still allow the /search page itself to be crawled (Allow: /search$ and Allow: /search/$).

Disallow: *archive.html$ blocks crawling of URLs ending in archive.html. The ($) character matches the end of the URL.
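A few examples of how the wildcard (*) and the end-of-URL anchor ($) behave (the URLs here are hypothetical):

 # Blocks any URL ending in archive.html, e.g. a URL like /2024/01/archive.html
 Disallow: *archive.html$
 # Blocks any URL ending in .pdf, e.g. /files/guide.pdf
 Disallow: /*.pdf$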

Finally, Sitemap: https... declares the sitemap address of the blog.

Conclusion

That's how you create and edit a robots.txt file to optimize SEO for Blogspot. If you have any questions, please leave a comment below the article. I hope this article helps you.
