What is robots.txt? Robots.txt (remember, it's always plural: "robots", not "robot"!) is simply a plain .txt (text) file that is usually placed in the root of a blog or website. Its function is to limit or block search engine spiders/robots from crawling certain files of a website; in simple terms, robots.txt tells the spiders/bots which files are allowed to be crawled and which are not. Is it important to have robots.txt for a website/blog? The answer is absolutely yes: the access limitation in robots.txt instructs the search engine bots to access and crawl only the important files, and to skip the files that should not be shown to the public.
It's just hard to imagine having thousands of files on your website and letting the spiders crawl them one by one, when we know that some files, such as .php, JavaScript (.js), .css, and .inc files, are not important to crawl. So, by adding a robots.txt file to your website root, the search engine spiders will be more focused on crawling only the important files of your website.
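To picture what that looks like in practice, here is a minimal sketch (Python, standard library only, not part of the rules below) of how a well-behaved crawler checks robots.txt before fetching a page; "example.com" and the path are placeholders:
from urllib import robotparser

# Download and parse the site's live robots.txt (placeholder domain).
rp = robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()

# Ask whether a given bot may crawl a given URL before requesting it.
url = "https://example.com/wp-admin/options.php"
if rp.can_fetch("Googlebot", url):
    print("Allowed to crawl:", url)
else:
    print("Blocked by robots.txt:", url)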

The principal rule of robots.txt is to disallow spiders from crawling, because the search engine robots' basic task is to crawl everything unless they are told not to. This is why we need a robots.txt file on our site. How do you create correct robots.txt syntax? Below are examples and a brief explanation of robots.txt rules:

To disallow all robots
User-agent: *
Disallow: /
Note: the "*" (wildcard) sign represents all user agents (Google, Bing, Yandex, etc.), and the "/" (slash) means your website root directory.
To allow all robots
User-agent: *
Disallow:
Note: since the Disallow parameter is left empty (blank), every user agent has full access to your site, in other words "Disallow: none". Alternatively, if you are sure you want the spiders to crawl your site entirely, you can create an empty robots.txt file or not create one at all (not recommended).
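If you want to see the difference between these two rules without uploading anything, here is a rough sketch using Python's built-in urllib.robotparser; the bot name "ExampleBot" and the URL are made-up placeholders:
from urllib import robotparser

block_all = ["User-agent: *", "Disallow: /"]
allow_all = ["User-agent: *", "Disallow:"]

rp = robotparser.RobotFileParser()
rp.parse(block_all)
print(rp.can_fetch("ExampleBot", "https://example.com/any-page.html"))  # False: everything is blocked

rp = robotparser.RobotFileParser()
rp.parse(allow_all)
print(rp.can_fetch("ExampleBot", "https://example.com/any-page.html"))  # True: empty Disallow means full access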
To allow only one robot (a single user agent)
User-agent: Googlebot
Disallow:
User-agent: *
Disallow: /
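A quick way to confirm that this really lets Googlebot in while blocking everyone else is the same urllib.robotparser sketch; "OtherBot" and the URL are placeholders:
from urllib import robotparser

rules = [
    "User-agent: Googlebot",
    "Disallow:",
    "",
    "User-agent: *",
    "Disallow: /",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)
print(rp.can_fetch("Googlebot", "https://example.com/post.html"))  # True: Googlebot is allowed
print(rp.can_fetch("OtherBot", "https://example.com/post.html"))   # False: all other bots are blocked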
To disallow certain directories and files (recommended)
User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /wp-admin/
Disallow: /wp-includes/
As I explained earlier, it is very important to limit spider crawling by creating a robots.txt file in our site root. I decided to block some unnecessary directories and files on my brother's WordPress site; here is my robots.txt:
User-agent: Googlebot
Disallow: /*.php$
Disallow: /*.js$
Disallow: /*.inc$
Disallow: /*.css$
Disallow: /*.swf$
Disallow: /*.zip$
Disallow: /*?*
Disallow: /*?
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/cache/
Disallow: /wp-content/themes/
Disallow: /wp-login.php
Disallow: /wp-register.php
Disallow: /trackback/
Disallow: /feed/
Disallow: /author/
Disallow: /comments/
Sitemap: https://yourblog.com/sitemap.xml
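As a rough local sanity check, you can feed the plain directory rules to Python's built-in urllib.robotparser. Keep in mind that this standard-library parser only does prefix matching and does not understand the "*" and "$" pattern extensions used in the Googlebot section, so treat this as a sketch rather than a full validator; "SomeBot" and the sample paths are made-up placeholders:
from urllib import robotparser

# The "User-agent: *" section above, shortened to a few representative rules.
wp_rules = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /feed/
Disallow: /author/
"""

rp = robotparser.RobotFileParser()
rp.parse(wp_rules.splitlines())

# Expected: the blocked directories print False, a normal post prints True.
for path in ("/wp-admin/options.php", "/feed/", "/2024/05/some-post.html"):
    print(path, "->", rp.can_fetch("SomeBot", "https://yourblog.com" + path))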
robots.txt for Blogger
User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Allow: /
Sitemap: https://yourblog.com/sitemap.xml
Note: for the Blogger sitemap URL, Blogger now supports the sitemap.xml file; read the Blogger sitemap explanation.
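To see how these Blogger rules behave before uploading them, here is the same kind of urllib.robotparser sketch (again, only a sanity check, not a full validator; the bot name "SomeOtherBot" and the sample URLs are placeholders):
from urllib import robotparser

blogger_rules = """\
User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Allow: /
Sitemap: https://yourblog.com/sitemap.xml
"""

rp = robotparser.RobotFileParser()
rp.parse(blogger_rules.splitlines())

print(rp.can_fetch("Mediapartners-Google", "https://yourblog.com/search/label/news"))  # True: the AdSense bot is allowed everywhere
print(rp.can_fetch("SomeOtherBot", "https://yourblog.com/search/label/news"))          # False: /search is blocked for other bots
print(rp.can_fetch("SomeOtherBot", "https://yourblog.com/2024/05/post.html"))          # True: normal posts stay crawlable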
Those are examples of correct robots.txt files for WordPress and Blogger. If you want to use the same robots.txt rules for your blog, you can copy mine. Just create a plain text file named robots.txt, put in the rules, and upload it to your site's root directory. To check whether your robots.txt syntax is correct and valid, you can check it here. Well, have a try and good luck!