Dragonbot respects robots.txt directives. You can prevent Dragonbot from crawling specific areas of your site by writing directives for the user agent "Dragonbot". In the future, we plan to introduce additional features that will make it easier to do this in Dragon Metrics without altering the robots.txt file.

For example, to restrict Dragon Metrics from crawling any page on your site, add the following code to the robots.txt file at the root of your domain (or of any other subdomain Dragon Metrics is set to crawl):

User-agent: Dragonbot
Disallow: /
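If you want to sanity-check rules like these before deploying them, Python's standard-library urllib.robotparser can evaluate simple (non-wildcard) directives. This is just an illustrative sketch; "example.com" is a placeholder, not a real Dragon Metrics domain:

```python
from urllib import robotparser

# Parse the rules above and check what Dragonbot may fetch.
# "https://example.com" is a placeholder for your own domain.
parser = robotparser.RobotFileParser()
parser.parse([
    "User-agent: Dragonbot",
    "Disallow: /",
])

# With "Disallow: /", no URL on the site is crawlable by Dragonbot,
# while user agents not named in the file remain unaffected.
print(parser.can_fetch("Dragonbot", "https://example.com/any/page"))     # False
print(parser.can_fetch("SomeOtherBot", "https://example.com/any/page"))  # True
```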

To restrict Dragon Metrics from crawling a directory on your site called "private", use the following code:

User-agent: Dragonbot
Disallow: /private
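The same sketch works for the directory rule. Again, urllib.robotparser is only a convenient stand-in for checking the logic, and the URLs are placeholders:

```python
from urllib import robotparser

# Placeholder domain; the rules mirror the snippet above.
parser = robotparser.RobotFileParser()
parser.parse([
    "User-agent: Dragonbot",
    "Disallow: /private",
])

# "Disallow: /private" is a prefix match: it blocks /private and
# everything beneath it, but leaves the rest of the site open.
print(parser.can_fetch("Dragonbot", "https://example.com/private/report.html"))  # False
print(parser.can_fetch("Dragonbot", "https://example.com/blog/post"))            # True
```

Note that because this is a prefix match, it would also block a path like /private-archive; writing "Disallow: /private/" with a trailing slash limits the rule to the directory itself.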

Dragonbot also supports wildcard characters. For example, to restrict Dragonbot from crawling any page that contains the word "user" anywhere in the URL path, use the following code:

User-agent: Dragonbot
Disallow: /*user

To restrict Dragonbot from crawling any page that contains URL parameters, use the following code:

User-agent: Dragonbot
Disallow: /*?
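Python's urllib.robotparser does not understand the `*` wildcard, so to illustrate how the two wildcard rules above behave, here is a rough sketch of the matching convention most crawlers use, translating the pattern to a regular expression. This is an illustration of the convention, not Dragonbot's actual implementation:

```python
import re

def robots_match(pattern: str, path: str) -> bool:
    """Sketch of common robots.txt wildcard matching: '*' matches any
    run of characters, a trailing '$' anchors the pattern to the end
    of the URL, and everything else (including '?') is literal.
    'path' here is the URL path plus any query string."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape the literal pieces, then rejoin them with '.*' for each '*'.
    regex = ".*".join(re.escape(part) for part in pattern.split("*"))
    if anchored:
        regex += "$"
    return re.match(regex, path) is not None

# "Disallow: /*user" blocks any URL with "user" in its path...
print(robots_match("/*user", "/account/user/settings"))  # True
print(robots_match("/*user", "/about"))                  # False

# ...and "Disallow: /*?" blocks any URL carrying query parameters.
print(robots_match("/*?", "/search?q=dragon"))  # True
print(robots_match("/*?", "/search"))           # False
```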