I was wondering why http://wiki.ironchariots.org rarely or never appears in Google searches. I enabled the Google webmaster console (Search Console) for the wiki site, and it shows that Googlebot hits many 404 errors. Occasionally pages are retrieved correctly, but the majority fail even though they definitely exist. I also tried fetching pages with http://web-sniffer.net/, which works with the default user agent but fails with a 404 if the user agent is set to Googlebot. It looks as if Googlebot's access to the wiki is being blocked by GoDaddy.
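To make the test easy to reproduce without web-sniffer, here is a minimal Python sketch that fetches the same URL twice, once with the default user agent and once claiming to be Googlebot, and compares the status codes. The article URL and the Googlebot user-agent string are assumptions for illustration; substitute any page that 404s for you.

```python
# Sketch: compare the HTTP status a server returns for a default client
# versus a request that claims to be Googlebot.
# The URL and user-agent string below are illustrative assumptions.
import urllib.request
import urllib.error

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def build_request(url, user_agent=None):
    """Build a GET request, optionally overriding the User-Agent header."""
    req = urllib.request.Request(url)
    if user_agent:
        req.add_header("User-Agent", user_agent)
    return req

def fetch_status(url, user_agent=None):
    """Return the HTTP status code for url (4xx/5xx included)."""
    try:
        with urllib.request.urlopen(build_request(url, user_agent)) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

if __name__ == "__main__":
    url = "http://wiki.ironchariots.org/index.php?title=Main_Page"
    print("default UA:", fetch_status(url))
    print("googlebot :", fetch_status(url, GOOGLEBOT_UA))
```

If the second call returns 404 while the first returns 200, the server is discriminating on the User-Agent header, which matches what web-sniffer shows.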
I first suspected that GoDaddy's request rate limits were the problem, so I reduced the crawl rate to the minimum Google allows. However, this has not improved the situation.
This problem is strikingly similar to https://uk.godaddy.com/community/Managing-Web-Hosting/Temporory-Unreachable-in-google-fetch/m-p/3185...
I can only see http://www.ironchariots.org/ in search results, not http://wiki.ironchariots.org/. Also, there are several hundred articles on the site, but Google reports only 15 pages in its index. There does seem to be a real problem here. If people are serious about looking into it, at least try http://web-sniffer.net/ with different user agents.
Is there any documentation of GoDaddy's request rate limits?
Perhaps you will find this useful: