This warning was issued inside Search Console and also sent to webmasters via email. Both notifications warned that Googlebot’s inability to crawl these files might result in “suboptimal rankings”.
They make it sound very dramatic, but I am here to bring some good news: a quick edit to your robots.txt file is likely to resolve this issue and get your site back to optimal rankings.
The full warning reads like this:

> Google systems have recently detected an issue with your homepage that affects how well our algorithms render and index your content. Specifically, Googlebot cannot access your JavaScript and/or CSS files because of restrictions in your robots.txt file. These files help Google understand that your website works properly so blocking access to these assets can result in suboptimal rankings.
If you received this warning because your site is blocking Googlebot from these files, be happy that you were notified. You know about the issue, and you can take the necessary action to resolve it.
The easy way to fix this is to open up your robots.txt file. (PLEASE NOTE: if you are even just a little uncomfortable working on your robots.txt file, I would suggest you ask your web designer or an SEO professional to make these changes for you.)
Search for any Disallow rules that block your CSS and JavaScript files, such as the example below:
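The exact rules vary from site to site, so treat this snippet as a typical example rather than a copy of your own file. On WordPress installs, these are the Disallow lines that most often block CSS and JavaScript:

```
User-agent: *
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/
```

Rules like these were often recommended in older WordPress guides, which is why so many sites received this warning.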
Once you have removed these lines, head on over to Google’s Fetch and Render tool in Search Console. This tool will confirm whether Googlebot can now crawl your site without hindrance, and if something else is still blocking Googlebot, it will tell you what needs to be fixed.
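If you have a reason to keep certain directories blocked, an alternative (again, adapt the paths to your own setup) is to explicitly allow the CSS and JavaScript files inside them. Googlebot honours the most specific (longest) matching rule, so these Allow lines override the broader Disallow rules in the same group:

```
User-agent: *
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Allow: /wp-includes/*.css
Allow: /wp-includes/*.js
Allow: /wp-content/plugins/*.css
Allow: /wp-content/plugins/*.js
```

Keep in mind that not every crawler understands wildcards or Allow directives, but Googlebot does, and that is the bot this warning is about.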
If all of this sounds G(r)eek to you, give me a call on 076 640 6339 or drop me an email here and I would gladly assist you in fixing this crawl error.