The Dragon Metrics crawler (Dragonbot) respects robots.txt directives by default. However, there may be situations where you want to ignore robots.txt directives or use a custom robots.txt instead. This can be particularly helpful when crawling a site in a testing or staging environment.
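
For example, staging sites often block all crawlers with a robots.txt like the following (contents are illustrative):

# Block every crawler from the entire staging site
User-agent: *
Disallow: /

By default, Dragonbot would honor these directives and crawl nothing, which is where the override comes in.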

Dragon Metrics allows you to override the site's robots.txt in either of these ways.

How to override the robots.txt

To get started, navigate to Campaign Settings > Crawler in the bottom left navigation.

Scroll down to the Robots.txt option.

To completely ignore robots.txt directives, choose Ignore robots.txt.

To use a custom robots.txt instead, choose that option and add the text of your custom file in the text box below it, as in the example that follows.
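
For instance, a minimal permissive file that lets Dragonbot crawl the entire site, regardless of what the live robots.txt says, might look like this (contents are illustrative):

# Allow all crawling; an empty Disallow value permits every path
User-agent: *
Disallow:

Because this custom file is only used for Dragonbot's crawl within Dragon Metrics, it has no effect on how other crawlers see your site.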

Note that Site verification is required to enable either setting.

Click Save Changes when finished.

The changes will take effect on the next crawl, but you can always initiate a manual crawl to apply them immediately.
