Dragon Metrics crawls your site to look for over 70 onsite optimization issues and tracks the status of each issue over time. Details on your site's crawl performance can be found in the Site Auditor.

Site Auditor makes it easy to:

  • Monitor the results of a site crawl

  • Understand which technical or optimization issues are affecting your site

  • Prioritize onsite issues

  • Find which URLs are affected by each issue, and prioritize which should be fixed first

  • Get in-depth guides and instructions on how to fix each issue, along with the data you need to quickly fix each one

  • Track optimization issues over time


Getting Started

To get started, navigate to Site Auditor under Onsite in the left navigation.

After entering the Site Auditor, you will be presented with an overview page of all onsite issues. At the top of the page you can find some basic information about your site crawls.

A list of all optimization issues that Dragon Metrics checks can be found in the dropdown at the top of every page. The number of URLs affected by each issue is shown in parentheses next to each one. Click on an issue to see the full details.

There is a wide range of crawl options that can be modified for greater control over how your site is crawled. Click Crawl Options in the upper left of the report to configure any of them. Learn more about crawl options

The date of the most recent crawl is shown, along with the date of the next scheduled crawl. If you don't want to wait for the scheduled crawl, you can click re-crawl site now to start a new crawl immediately. Learn more about re-crawling sites

Next, the number of issues found on your site, grouped by priority, will be shown, along with how much each one has changed compared to the previous crawl.

URLs Crawled

A breakdown of the types of URLs that were included in the crawl is shown under URLs Crawled. This can help show how crawl budget was used, and whether you need to adjust your crawl settings in Dragon Metrics or make changes to your site's architecture. (A sketch of how a crawler might sort URLs into these categories follows the list below.)

  • URLs Crawled – The total number of URLs that were included in the crawl

  • Primary URLs – The total number of URLs in the crawl that were not duplicates, redirects, noindex, or errors.

  • Duplicate URLs – The total number of duplicate URLs among the URLs we crawled. A URL is considered a duplicate if one or more other crawled URLs have exactly the same content.

  • Redirected URLs – The total number of URLs included in the crawl that returned an HTTP 3xx status code.

  • Noindex URLs – The total number of URLs included in the crawl that contained a noindex directive in the meta robots tag, x-robots tag, or robots.txt.

  • Error URLs – The number of URLs included in the crawl that returned an HTTP 4xx or 5xx status code.
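
To make these categories more concrete, here is a minimal Python sketch of how a single crawled URL could be sorted into one of them based on its HTTP status code and robots directives. It is only an illustration (not Dragon Metrics' actual crawler), it assumes the requests library is available, and has_meta_noindex and is_duplicate_content are hypothetical placeholder helpers.

    # A minimal, illustrative sketch (not Dragon Metrics' actual crawler).
    # Requires the requests library; has_meta_noindex and is_duplicate_content
    # are hypothetical placeholder helpers.
    import requests

    def classify_crawled_url(url):
        response = requests.get(url, allow_redirects=False, timeout=10)
        status = response.status_code
        robots_header = response.headers.get("X-Robots-Tag", "").lower()

        if 300 <= status < 400:
            return "redirected"      # counted under Redirected URLs
        if status >= 400:
            return "error"           # counted under Error URLs
        if "noindex" in robots_header or has_meta_noindex(response.text):
            return "noindex"         # counted under Noindex URLs
        if is_duplicate_content(response.text):
            return "duplicate"       # counted under Duplicate URLs
        return "primary"             # everything else is a Primary URL

    def has_meta_noindex(html):
        # Placeholder check for <meta name="robots" content="...noindex...">.
        lowered = html.lower()
        return 'name="robots"' in lowered and "noindex" in lowered

    def is_duplicate_content(html):
        # Placeholder: a real crawler would compare a hash of the page content
        # against the content of every other URL seen in the crawl.
        return False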

URLs Not Crawled

Not all URLs that our crawler encountered were included in the crawl. For any URLs that were excluded, we break down the reasons why (a sketch of how robots.txt rules can be checked follows the list).

  • URLs Not Crawled – The total number of URLs we found but did not crawl

  • Disallowed (robots.txt) – The total number of URLs we found but aren't able to crawl because they are disallowed by a directive in the site's robots.txt file.

  • Disallowed (x-robots) – The total number of URLs we found but weren't included in the crawl due to the x-robots directive.

  • Excluded – The total number of URLs we found but aren't able to crawl because they were excluded in crawl options.

  • Crawl Errors – The total number of URLs we found but aren't able to crawl because of errors in receiving a valid response from the server. Please note that this is different from an HTTP 4xx or 5xx status code. These are URLs for which we could not receive a valid response from the server at all, due to issues such as an invalid HTTP response, a DNS error, or content encoding errors. A list of the most common errors we encounter can be found here.
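
As an illustration of the robots.txt case above, the following sketch uses Python's standard-library robots.txt parser to decide whether a crawler may fetch a given URL. The user agent name and URLs are placeholders, and this is not necessarily how Dragon Metrics performs the check.

    # A minimal sketch using Python's standard-library robots.txt parser.
    # The user agent and URLs below are placeholder examples.
    from urllib.robotparser import RobotFileParser

    parser = RobotFileParser("https://www.example.com/robots.txt")
    parser.read()  # fetch and parse the site's robots.txt

    for url in ["https://www.example.com/", "https://www.example.com/private/page"]:
        if parser.can_fetch("ExampleCrawler", url):
            print("Allowed:   ", url)
        else:
            print("Disallowed:", url)  # would count toward Disallowed (robots.txt)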

Trends

Below you can see a series of trend charts showing the number of issues at each priority level, the total number of issues, and the number of pages crawled over time.

Highest Priority Issues

In the Highest Priority Issues table, you will see the most important issues on your site, sorted by importance and occurrences.

  • Issue – The name of the issue. Clicking an issue will bring you to its detail page.

  • Priority – The priority of the issue: High, Medium, Low, or Notice. Learn more about Issue Priority.

  • Type – The type of the issue, such as Meta Data, Images, or Content. Learn more about Issue Type.

  • Count – The number of times this issue was detected on your site, and how it has changed compared with the previous crawl.

  • Trend – The number of times this issue is detected on your site over time.

Bar Charts

At the bottom of the page is a series of bar charts showing the most common on-page SEO data and issues we've detected on your website.

Issues by Priority

Distribution of issues we've detected on your site, grouped by priority.

Issues by Type

Distribution of issues we've detected on your site, grouped by type.

Crawl Depth

Distribution of URLs we've found on your site, grouped by crawl depth (the minimum number of steps each URL is away from the home page).
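
Crawl depth is a shortest-path measure, so one common way to compute it is a breadth-first walk starting from the home page. The sketch below only illustrates that idea; get_links is a hypothetical helper that returns the internal links found on a page, and it is not necessarily how Dragon Metrics calculates the metric.

    # A minimal sketch of computing crawl depth with a breadth-first walk.
    # get_links is a hypothetical helper that returns the internal links
    # found on a page; the home page itself is at depth 0.
    from collections import deque

    def crawl_depths(home_url, get_links):
        depths = {home_url: 0}
        queue = deque([home_url])
        while queue:
            url = queue.popleft()
            for link in get_links(url):
                if link not in depths:          # first visit = shortest path
                    depths[link] = depths[url] + 1
                    queue.append(link)
        return depths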

Scheme

Distribution of URLs we've found on your site, grouped by their HTTP scheme.

Redirect Types

Distribution of redirect types we've detected on your site.

Error Types

Distribution of error types we've found on your site.

Title Tag

Distribution of issues related to title tags we've found on your site.

Meta Description

Distribution of issues related to meta description tags we've found on your site.

H1 Tags

Distribution of issues related to H1 tags we've found on your site.

Meta Keywords

Distribution of issues related to meta keywords tags we've found on your site.
