
Google:

Search Console

Open the robots.txt Tester tool: https://www.google.com/webmasters/tools/robots-testing-tool?utm_source=support.google.com/webmasters/&utm_medium=referral&utm_campaign=6155685

Test your robots.txt file

  1. Open the tester tool for your site, and scroll through the robots.txt code to locate the highlighted syntax warnings and logic errors. The number of syntax warnings and logic errors is shown immediately below the editor.
  2. Type in the URL of a page on your site in the text box at the bottom of the page.
  3. Select the user-agent you want to simulate in the dropdown list to the right of the text box.
  4. Click the TEST button to test access.
  5. Check whether the TEST button now reads ACCEPTED or BLOCKED to find out whether the URL you entered is blocked from Google's web crawlers (a rough local approximation in Python is sketched after this list).
  6. Edit the file on the page and retest as necessary. Note that changes made in the page are not saved to your site! See the next step.
  7. Copy your changes to the robots.txt file on your site. The tool does not make changes to the actual file on your site; it only tests against the copy hosted in the tool.
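
If you want a quick offline sanity check before pasting rules into the tester, Python's standard-library urllib.robotparser can approximate steps 2 through 5. This is only a sketch: the robots.txt rules, the example.com URL, and the user-agent string below are placeholders, and the standard-library parser does not reproduce Googlebot's exact matching behaviour (wildcards and longest-match precedence, for example), so treat the tester itself as the authoritative result.

import urllib.robotparser

# Placeholder rules; paste the robots.txt content you are editing here.
robots_lines = """
User-agent: Googlebot
Disallow: /private/
""".strip().splitlines()

parser = urllib.robotparser.RobotFileParser()
parser.parse(robots_lines)

# Mirror steps 2-5: choose a URL and a user-agent, then check access.
url = "https://example.com/private/report.html"  # hypothetical page on your site
user_agent = "Googlebot"
print("ACCEPTED" if parser.can_fetch(user_agent, url) else "BLOCKED", url)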
Limitations of the robots.txt Tester tool:
  • Changes you make in the tool editor are not automatically saved to your web server. You need to copy and paste the content from the editor into the robots.txt file stored on your server.
  • The robots.txt Tester tool only tests your robots.txt against Google user-agents (web crawlers such as Googlebot). Google cannot predict how other web crawlers will interpret your robots.txt file (the sketch after this list shows how verdicts can differ by user-agent).
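
Because robots.txt rules are grouped per user-agent, the same file can give different verdicts for different crawlers. The sketch below, again using Python's urllib.robotparser with made-up agent names, paths, and rules, shows a rule aimed at Googlebot not applying to another crawler; real crawlers may also interpret edge cases differently than this parser does.

import urllib.robotparser

# Hypothetical robots.txt with a Googlebot-specific group and a catch-all group.
robots_lines = """
User-agent: Googlebot
Disallow: /drafts/

User-agent: *
Disallow: /admin/
""".strip().splitlines()

parser = urllib.robotparser.RobotFileParser()
parser.parse(robots_lines)

url = "https://example.com/drafts/post.html"  # placeholder URL
for agent in ("Googlebot", "SomeOtherBot"):
    verdict = "ACCEPTED" if parser.can_fetch(agent, url) else "BLOCKED"
    print(agent, verdict)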