Googlebot

  • 2
  • Question
  • Updated 8 years ago
  • Answered
When I try to use Googlebot for my website, it comes back as failed.
cmgarside
  • 1 Post
  • 0 Reply Likes

Posted 9 years ago

Sanja
  • 10698 Posts
  • 495 Reply Likes
Hello cmgarside,

In order to assist further, could you please share a bit more information about what you're seeing? Are you trying to verify your domain with Google Webmaster but it's not working, or is there something else you're trying to do?
justin popperwell

  • 1 Post
  • 0 Reply Likes
This is the problem I'm having. Any ideas, please?

Recommended action
If the site error rate is 100%:

Using a web browser, attempt to access your site's robots.txt file. If you are able to access it from your browser, then your site may be configured to deny access to Googlebot. Check the configuration of your firewall and site to ensure that you are not denying access to Googlebot.
If your robots.txt is a static page, verify that your web service has proper permissions to access the file.
If your robots.txt is dynamically generated, verify that the scripts that generate the robots.txt are properly configured and have permission to run. Check the logs for your website to see if your scripts are failing, and if so attempt to diagnose the cause of the failure.
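One way to test the firewall point above outside a browser is to request your robots.txt twice, once with a browser-style User-Agent and once with Googlebot's. This is a rough sketch, not an official tool: `fetch_status` is a helper name I made up, and the example.com URL is a placeholder for your own site. If the browser-style request succeeds but the Googlebot-style one fails, your server or firewall is probably blocking Googlebot.

```python
import urllib.request
import urllib.error

def fetch_status(url, user_agent):
    """Return the HTTP status code for url, or None on a network-level failure."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code   # the server answered, but with an error status
    except urllib.error.URLError:
        return None     # DNS failure, refused connection, timeout, etc.

# Placeholder URL -- substitute your own published site.
url = "https://example.com/robots.txt"
browser_ua = "Mozilla/5.0"
googlebot_ua = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

print("browser: ", fetch_status(url, browser_ua))
print("googlebot:", fetch_status(url, googlebot_ua))
```

A 200 for both means the server is answering both user agents; a status (or None) only for the Googlebot request points at user-agent filtering rather than a permissions problem.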

If the site error rate is less than 100%:

Using Webmaster Tools, find a day with a high error rate and examine the logs for your web server for that day. Look for errors accessing robots.txt in the logs for that day and fix the causes of those errors.
The most likely explanation is that your site is overloaded. Contact your hosting provider and discuss reconfiguring your web server or adding more resources to your website.
If your site redirects to another hostname, another possible explanation is that a URL on your site is redirecting to a hostname whose robots.txt file is exhibiting one or more of these issues.
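For the redirect case just described, you can check which hostname ends up serving your robots.txt after all redirects are followed. A minimal sketch, assuming Python's standard library and with `final_robots_host` as a made-up helper name; if it reports a different hostname than the one you submitted to Webmaster Tools, check that host's robots.txt for the issues above.

```python
import urllib.request
from urllib.parse import urlparse

def final_robots_host(url):
    """Follow redirects and return the hostname that finally serves the URL."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return urlparse(resp.geturl()).netloc

# Example usage (substitute your own site for the placeholder):
#   final_robots_host("https://example.com/robots.txt")
```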

After you think you've fixed the problem, use Fetch as Google to verify that Googlebot can properly access your site.
Stefan
  • 4448 Posts
  • 190 Reply Likes
Hello Justin,

You could first try to resubmit your website to Google Webmaster.

If you get a message saying that your URL is blocked because of a robots.txt file, please ignore this. Google has simply tried to index the preview URL of one of your pages.

A preview URL is what is used when you open your site in the Yola Sitebuilder and then preview it. Yola prevents Google from crawling these URLs as Google should not index unpublished sites. This message simply serves to indicate that everything is working correctly. It will not interfere with your search engine ranking or Google Webmaster's ability to return data about your site.

As long as your site is published and you have submitted the correct, published URL, Google will be able to index your site. If you get a "Page not Found" or 404 error, this probably means that there was a problem with your DNS server (the server of your domain name registrar), or that you made a DNS change around the time that Google tried to index your site. Google makes an attempt approximately every 5 days, so you simply need to wait for the Google spiders to come around again, and everything should be fine.
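Since DNS problems come up above, a quick way to confirm that your domain currently resolves at all is a lookup from Python. This is only a local sanity check, not a substitute for Google's crawler; `resolves` is a helper name I made up, and example.com is a placeholder for your published domain.

```python
import socket

def resolves(hostname):
    """Return True if the hostname currently resolves to an IP address."""
    try:
        socket.gethostbyname(hostname)
        return True
    except socket.gaierror:
        return False

# Placeholder -- substitute the published domain you submitted to Google.
print(resolves("example.com"))
```

If this returns False for your domain, the problem is on the DNS side (registrar or nameserver configuration), and no amount of resubmitting to Webmaster Tools will help until it resolves again.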

Please let me know if the problem persists.