Dragonbot respects robots.txt directives. If your site is not being crawled because robots.txt blocks it, you'll want to update the directives for the user agent "dragonbot".
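To check how a given robots.txt file treats the "dragonbot" user agent, you can use Python's standard-library `urllib.robotparser`. This is a minimal sketch; the rules and URL below are hypothetical examples, not your site's actual configuration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks dragonbot from the whole site
blocking_rules = """\
User-agent: dragonbot
Disallow: /
"""

# Hypothetical robots.txt that allows dragonbot everywhere
# (an empty Disallow value means nothing is disallowed)
allowing_rules = """\
User-agent: dragonbot
Disallow:
"""

blocked = RobotFileParser()
blocked.parse(blocking_rules.splitlines())

allowed = RobotFileParser()
allowed.parse(allowing_rules.splitlines())

print(blocked.can_fetch("dragonbot", "https://www.example.com/page"))  # False
print(allowed.can_fetch("dragonbot", "https://www.example.com/page"))  # True
```

Running this against a copy of your own robots.txt is a quick way to confirm whether the crawler is being blocked before looking at other server settings.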
To allow Dragonbot to crawl any page on your site, add the following code to the robots.txt file in the root domain (or any other subdomain Dragon Metrics is set to crawl):
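```
# Allow dragonbot to crawl all pages (an empty Disallow value disallows nothing)
User-agent: dragonbot
Disallow:
```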
If the problem is not caused by robots.txt, other server settings may be preventing Dragonbot from crawling your site. These issues will need to be resolved before Dragonbot can crawl it:
- The server could be blocking Dragonbot's IP address
- Although Dragonbot follows politeness rules (such as crawling at a reasonable speed) to respect webmasters' limited resources, the server may still be throttling or blocking Dragonbot for making too many requests in a short period of time
- The server could be using IP-based geolocation to redirect Dragonbot's requests