robots.txt preventing Google Site from being crawled

We encountered an error while trying to access your Sitemap. Please ensure your Sitemap follows our guidelines and can be accessed at the location you provided, and then resubmit.

Network unreachable: robots.txt unreachable. We were unable to crawl your Sitemap because we found a robots.txt file at the root of your site but were unable to download it. Please ensure that it is accessible or remove it completely.

Sep 11, 2016
This is the message I received, and since I have no access to my files through GoDaddy, please fix it ASAP.

3 REPLIES
Helper IV

Re: robots.txt preventing Google Site from being crawled

Hey @nothappy, sorry to hear you are not happy. What's the address of your site and what type of hosting account do you have? 

 

Cheers!

Doc
Need some help with WordPress? Check out Site Doctor 911
Worry-Free WordPress Support - So You Can Focus On The Important Stuff

Re: robots.txt preventing Google Site from being crawled

GoDaddy support told me I needed to contact my ISP. Can I fix this myself online? Please help!
Helper IV

Re: robots.txt preventing Google Site from being crawled

@Cheryl2, if you are using WordPress, then you can change this yourself. In your WordPress dashboard, go to Settings -> Reading and make sure the checkbox under Search Engine Visibility is not checked. That setting is what controls the robots.txt output in WordPress.
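For reference, when that Search Engine Visibility box is checked, WordPress serves a robots.txt that tells every crawler to stay out of the whole site. A sketch of what that blocking output roughly looks like (based on WordPress's default behavior, not your exact file):

```
User-agent: *
Disallow: /
```

Unchecking the box removes the `Disallow: /` rule, so Google can crawl again.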

 

If you're not using WordPress, then you need to look for the robots.txt file in your website files and make some changes there. Check out this article for a good explanation: https://moz.com/learn/seo/robotstxt
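To illustrate the difference between an allow-all and a block-all robots.txt, here's a minimal sketch using Python's standard-library `urllib.robotparser` (the URL `example.com` and the helper `googlebot_allowed` are hypothetical, just for demonstration):

```python
from urllib.robotparser import RobotFileParser

def googlebot_allowed(robots_txt: str, url: str) -> bool:
    """Check whether Googlebot may fetch `url` under the given robots.txt text."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch("Googlebot", url)

# An empty Disallow value permits every crawler to fetch everything.
ALLOW_ALL = "User-agent: *\nDisallow:"

# "Disallow: /" blocks every crawler from the entire site.
BLOCK_ALL = "User-agent: *\nDisallow: /"

print(googlebot_allowed(ALLOW_ALL, "https://example.com/page"))  # True
print(googlebot_allowed(BLOCK_ALL, "https://example.com/page"))  # False
```

If you can view your live file at `yourdomain.com/robots.txt`, comparing it against these two patterns is a quick way to see whether it's the thing blocking Google.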

 

Cheers!

Doc

 
