How to Create a Sitemap and Robots.txt File for Blogger | Solve the Problem of Excluded Pages

Knowing how to create a sitemap and a robots.txt file is one of the important things you must master if you work with Blogger, as both files play an important role in getting your articles indexed and shown in Google search results.

Creating a sitemap and a robots.txt file, and submitting them through Google Search Console, improves your SEO and your standing with search engines, which helps your pages rank in search results.


But keep in mind that when you create a sitemap or work with the robots.txt file, the files must be correct: any error, even a small and simple one, can disrupt your site or blog, get it penalized, or stop your articles from being indexed by search engines.


Given the importance of knowing how to create a sitemap and a robots.txt file, in this article we will look at how to create them, why each of them matters, and how to use and work with them.


What is a sitemap?

A sitemap is a file that represents a map of your blog or site. This map is sent to Google so that it automatically indexes the site's or blog's articles and makes them appear in the search engine.


A sitemap can be described as a roadmap for the site or blog: a list of all the pages that exist on the blog or site, with a description of what each page contains.
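
Concretely, a sitemap is an XML file that lists the URLs of your pages. Blogger generates it for you automatically, so you never have to write it by hand, but a minimal entry looks roughly like the sketch below (example.blogspot.com is just a placeholder for your own blog address):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <!-- the full address of one page on the blog -->
        <loc>https://example.blogspot.com/2024/01/sample-post.html</loc>
        <!-- when that page was last changed (optional) -->
        <lastmod>2024-01-15</lastmod>
      </url>
    </urlset>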


A sitemap can only be used after you register in Google Search Console and verify ownership of the Blogger blog; it is then sent to Google through the Sitemaps section found in the Search Console tools.


When you add the sitemap.xml file in the Search Console tools, do not add any other overlapping file with it: each article should be submitted for indexing only once, and duplicate submissions can delay the indexing of your articles or cause problems for your site.


How to create a sitemap for Blogger blogs:
A sitemap is already built into every Blogger blog or site, so you do not have to create it yourself. You can view it by adding /sitemap.xml (for posts) or /sitemap-pages.xml (for static pages) to the end of your blog's URL.


The idea of the sitemap file is that when you publish a new article on the site, that article's page is automatically added to the sitemap. The sitemap then signals to the search engine that a new article has been added and needs indexing, so the spiders crawl that page, recognize it, and it appears in the search results.


How to add a sitemap to Google Search Console


You can add a sitemap to Google Search Console (formerly Webmaster Tools) through a few simple steps:

  1. Log in to Google Search Console with the email account that owns the blog.
  2. Click Sitemaps in the side menu.
  3. In the "Add a new sitemap" field, type sitemap.xml after your site address (see the example after this list).
  4. Click Submit to add the sitemap.
  5. Do not add any other sitemap file, so you do not harm your site or delay the indexing of your articles; submitting more than one overlapping file is one of the factors that delays the indexing of Blogger blogs.
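
To illustrate step 3, if your blog address were example.blogspot.com (a placeholder), Search Console already shows the address, and you only type the file name next to it:

    Enter in the "Add a new sitemap" box:  sitemap.xml
    Resulting sitemap URL:                 https://example.blogspot.com/sitemap.xml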

What is a robots.txt file?

The robots.txt file is entered in a dedicated box in the Blogger blog's settings. It is a file you create to tell Google which pages of your blog and site it should crawl and index.


This file is written based on the structure of the site, and the site manager creates it, but he must keep in mind that it has to be created correctly so that it does not negatively affect the site or cause it not to be indexed, or to be indexed late.


After you create the robots.txt file, Google reads it automatically, and you can check it in Google Search Console to make sure it contains no errors and was created correctly.


How to create a robots.txt file for Blogger

One of the main tasks of a robots.txt file is to tell search engines which pages they should crawl and which pages they should not, allowing you to control the work of search engine bots.


In this way, it can prevent duplicate-content issues by stopping bots from crawling and indexing duplicate pages such as search and archive pages.


This is done by applying the Disallow: /20* rule in the robots.txt file, which stops crawling of the date-based archive pages. Because Blogger post URLs also start with the year, we add the Allow: /*.html rule so that the robots can still crawl the article pages themselves, and we add the sitemap we learned how to create earlier. Put together, this is a robots.txt file for Blogger, as shown in the example below.
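
A widely used custom robots.txt template for Blogger combines these rules; it looks something like the sketch below (example.blogspot.com is a placeholder for your own address, so adjust the Sitemap line before saving):

    User-agent: *
    # block Blogger search and label result pages (duplicate content)
    Disallow: /search
    # block date-based archive URLs such as /2024/ and /2024/01/
    Disallow: /20*
    # but still allow the article pages themselves, which end in .html
    Allow: /*.html
    # point the robots at the sitemap we created earlier
    Sitemap: https://example.blogspot.com/sitemap.xml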


By applying the above, the blog's search engine optimization starts to improve, which helps the Blogger blog appear in the first results of a search.


How to add a robots.txt file to Blogger

You can add a robots.txt file to your Blogger blog through the following simple steps:


  1. Log in to Blogger with the email account that owns your blog.
  2. Go to Settings, then to the Crawlers and indexing section.
  3. Turn on Enable custom robots.txt with the toggle button.
  4. Paste the robots.txt code (for example, the template shown earlier), then click Save.
  5. Check the robots.txt file after the update by visiting www.example.com/robots.txt, replacing example.com with your own blog address.


Now that we are familiar with the sitemap file and the robots.txt file and how to create each of them, we need to know how to use them to solve the problem of excluded pages, which is what we will talk about below.


How to solve the problem of excluded pages

When you add the robots.txt file and the sitemap, indexing of the blog starts to speed up, and so-called excluded pages appear. At this point many site owners believe there is a problem with the site because of the message "Indexed, though blocked by robots.txt".


In fact, this message is not really an error; it is the result of indexing being organized on the site after you create the sitemap and a correct robots.txt file.


The robots.txt file stops the search engine from indexing search and archive pages, and it can also be used to keep static pages such as the privacy policy and contact us pages out of the index; what matters to it is indexing your articles and the content you write.
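
As an example, the URLs that Search Console reports as excluded are usually the ones matched by the blocking rules in the template above (the addresses below are placeholders):

    Disallow: /search   blocks pages like https://example.blogspot.com/search/label/news
    Disallow: /20*      blocks archive pages like https://example.blogspot.com/2024/01/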


This helps a lot in organizing how the blog is indexed and in speeding up the indexing of your topics in the search engine.


What are custom robots header tags?

This is a Blogger feature that you can use to stop a specific part of the blog, or a specific article, from being indexed by controlling how the search spiders treat that part. The main directives, and an example of how they appear inside a page, are shown below.


all command:

This allows search spiders to access every part of the blog and index all of its content without any barriers.


noindex command:

This prevents search spiders from indexing the page, so it does not appear in search engines.


nofollow command:

This tells search spiders not to follow the links on that page.


none command:

This command is equivalent to using noindex and nofollow together.


noarchive command:

This prevents search engines from keeping a cached (archived) copy of that page.


nosnippet command:

This prevents a snippet (an excerpt from the article) from appearing under the title in search results.


noodp command:

This prevents search engines from taking the page description from the Open Directory (DMOZ).


notranslate command:

This prevents search engines from offering a translation of that page in search results.


noimageindex command:

This prevents spiders from indexing the images on this page.


unavailable_after command:

This tells the search engine not to show the page in its results after a specific date that you choose.
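
In Blogger you normally switch these directives on from the blog settings rather than writing them yourself, but for illustration they correspond to a robots meta tag in the page's head section. A rough sketch with one example combination of values:

    <head>
      <!-- keep this page out of the index and do not follow its links -->
      <meta name="robots" content="noindex, nofollow">
      <!-- stop showing the page in results after a chosen date -->
      <meta name="robots" content="unavailable_after: 31-Dec-2025 23:59:59 GMT">
    </head>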


With that, we have reached the end of this article explaining the sitemap and robots.txt files: how to create each of them, and how to add them to improve a site's SEO, rank higher in search engines, and get blogs and sites of all kinds indexed.
