What is robots.txt? Why is it important? How to use?


Do you know what robots.txt is? Today I am going to give you information about robots.txt. If you have a blog or website, information that you did not want public may become visible on the Internet, while a lot of good content may not get indexed at all.
If you want to know about all of this, you will learn it in this article on robots.txt.

The robots.txt file is a small text file that tells search engine bots which parts of the site to crawl and index.

Search engines read the robots.txt file first, and their bots index and crawl articles accordingly. But if a blog has no robots.txt file, search engine bots start indexing and crawling even the data that we do not want them to.

What is robots.txt

A robot is a machine that follows our commands, and that is exactly how the robots.txt file works. The bots crawl the blog's data according to the commands we give, so only what we want appears in the search results.

If you do not want the categories or tags of your blog to be indexed, you have to give a command for that inside the robots.txt file.

Then, whenever someone searches on a search engine like Google, Yahoo, or Bing, the blog's categories and tags do not show up.

If you want to see the robots.txt file of any site, you can type /robots.txt at the end of that site's URL in the browser.
In the same way, you can also check your own blog or website.
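The step above can be sketched in a few lines of Python; the `urljoin` helper from the standard library and the example.com address are just illustrative assumptions, not something Blogger requires:

```python
# Minimal sketch: build a site's robots.txt address from any of its page URLs.
# urljoin (Python standard library) resolves "/robots.txt" against the site
# root; "https://www.example.com/..." is only an illustrative address.
from urllib.parse import urljoin

def robots_url(page_url: str) -> str:
    """Return the robots.txt URL for the site that serves page_url."""
    return urljoin(page_url, "/robots.txt")

print(robots_url("https://www.example.com/2021/01/post.html"))
# https://www.example.com/robots.txt
```

Opening the printed address in a browser shows the site's robots.txt, exactly as described above.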

Advantages of the Robots.txt File

It is very important to add a robots.txt file to your blog, because doing so brings many advantages.

  • With robots.txt, you can keep any content of your blog hidden from search engines.
  • Data such as files, folders, images, PDFs, etc. can be prevented from being indexed by search engines.
  • Search engine crawlers would otherwise show not just your site's articles but everything on it; this can be stopped.
  • You can also keep sensitive information private.
  • With the help of robots.txt, crawlers are pointed to the blog's sitemap, which is considered good for indexing.
  • Blog categories, labels, tags, images, documents, etc. can be prevented from being indexed.
  • You can keep duplicate pages of your blog out of search engine results.

What is the Syntax of robots.txt

Now let us learn about the syntax below, so that it is easier to understand which directive is used for which purpose.

Allow: - Whatever you allow inside the file will appear in search engines. The Allow directive tells search engine bots which content of your blog they may crawl and index.

Disallow: - The path that comes after the blog address, such as /search or a keyword. Unwanted pages, posts, categories, labels, and tags that you do not want indexed can be blocked from search engine robots with this directive.

User-Agent: - The user-agent names the robot a command is meant for, such as Googlebot, Googlebot-Image, AdsBot-Google, Googlebot-Mobile, or Mediapartners-Google.

Write the name of the robot you want to command in front of the user-agent.

User-Agent: *: - To give the same command to all search engine robots, we use the * sign in front of User-Agent.

User-Agent: Mediapartners-Google: - If AdSense ads are shown on your blog, write Mediapartners-Google in front of the user-agent, so that the AdSense crawler is not blocked by the rules applied to other bots and can see the pages where ads appear.

Sitemap: - This directive points to an XML file. Through sitemap.xml, Google keeps information about all the posts on the blog, and search engines can easily index your blog posts.
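The directives above can be tried out with Python's standard urllib.robotparser module. This is just a minimal sketch with an assumed set of rules and an assumed example.com address:

```python
# Minimal sketch of how a crawler evaluates robots.txt directives, using
# Python's standard urllib.robotparser. The rules and URLs are assumed examples.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /search
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# /search pages are blocked for every bot; ordinary posts stay crawlable.
print(rp.can_fetch("Googlebot", "https://www.example.com/search?q=seo"))       # False
print(rp.can_fetch("Googlebot", "https://www.example.com/2021/01/post.html"))  # True
```

As the output shows, Disallow: /search hides search-result pages while Allow: / keeps normal posts open to bots.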


How to Add Robots.txt File

Follow all the steps given below to add a custom robots.txt to your Blogger blog.

Keep in mind that adding the wrong code in the robots.txt file can affect the ranking of your blog.

There are many free sites that can generate a robots.txt file for your blog.

We have given the robots.txt file code below, which you can copy and use for your blog. Just replace http://www.example.com with the URL address of your own blog.

Robots.txt file code


General: -

User-agent: *
Disallow: /search
Allow: /
Sitemap: http://www.example.com/atom.xml?redirect=false&start-index=1&max-results=500

or
User-agent: *
Disallow: /search
Allow: /
Sitemap: https://www.example.com/sitemap.xml


If you use Google ads then:

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
Sitemap: https://www.example.com/sitemap.xml

Not only this; there can be many other variations depending on the commands you give.
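One way to sanity-check a variant like the AdSense one above is with Python's standard urllib.robotparser (note that the crawler token is written Mediapartners-Google). This is only an illustrative sketch:

```python
# Sketch: verify that the AdSense crawler (token: Mediapartners-Google) keeps
# full access while other bots are blocked from /search. Uses Python's
# standard urllib.robotparser; rules and URLs are assumed examples.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# An empty Disallow: line means "nothing is disallowed" for that bot.
print(rp.can_fetch("Mediapartners-Google", "https://www.example.com/search?q=ads"))  # True
print(rp.can_fetch("Googlebot", "https://www.example.com/search?q=ads"))             # False
```

The empty Disallow: under Mediapartners-Google gives the AdSense bot full access, while the * group still blocks /search for everyone else.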

Add a Custom robots.txt File in Blogger
Now we will learn how to add the robots.txt file to your blog.

  • To do this, first go to the dashboard of your blog and click on Settings.

  • After that, scroll down the page to the section named Crawlers and indexing.
  • Turn on Enable custom robots.txt.
  • Then click on Custom robots.txt.
  • After that, a small box will pop up for adding the custom robots.txt. Paste your code there.
  • Finally, click on Save.
  • Now your custom robots.txt has been added to the blog.


We hope this information about the custom robots.txt for blogs helps you and that you are able to add it to your own blog. If you found this information useful, please share this post on social sites.
