The first question that arises whenever we talk about the robots.txt file is what it is and how to use it effectively. For the average person, understanding this file is not everybody's cup of tea. Keeping this in mind, Google has made testing it much simpler by introducing a robots.txt testing tool in Webmaster Tools. The tool can be found in the Crawl section of Google Webmaster Tools.
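For context, a robots.txt file is a plain-text file placed at the root of a website that tells crawlers which paths they may or may not visit. A minimal illustrative example (the paths and the second user-agent block are hypothetical):

```
User-agent: *
Disallow: /private/
Allow: /

User-agent: Googlebot
Disallow: /tmp/
```

A crawler reads the block matching its user-agent (falling back to `*`) and skips any path that a `Disallow` rule covers.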
In this tool you can test new URLs to see whether they are blocked by your robots.txt file, or in simple terms, whether they are banned from being crawled by spiders. To guide you through complicated directives, the tool highlights the specific rule that blocks a given URL; you can then edit the file accordingly and test again for any remaining errors. When you are done, all you have to do is upload the latest (corrected) version of the file to your server so that the changes take effect and your site's crawlability improves. An added benefit of this tool is that you can review older versions of your robots.txt file, as well as check when accessibility issues prevented Google from crawling your site.
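The tool's basic check (is a given URL allowed or blocked?) can also be sanity-checked locally before uploading. A minimal sketch using Python's standard `urllib.robotparser` module; the rules and URLs here are hypothetical, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents, parsed from a string
# (RobotFileParser.parse expects an iterable of lines).
rules = """\
User-agent: *
Disallow: /private/
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Ask whether a specific crawler may fetch a specific URL.
print(rp.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/public/page.html"))   # True
```

This only mirrors the allow/block decision; Google's tool additionally shows which line of the file caused the block, which is what makes debugging complicated files easier.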