How can I ignore traffic from bots and crawlers?

Occasionally, your website may be visited by bots and crawlers that are not real people. Some bots, such as Googlebot or Bingbot, scan the content of your website to find information for search results. Bots from Facebook enter your website to capture the page title, description, and preview of a page when somebody pastes a link to your website on Facebook. Other bots visit your site to scan the technical environment.

By default, we exclude traffic coming from the most popular bots and crawlers, but you may run into ones that are still being tracked. In that case, you can exclude those bots and crawlers from tracking. To do that, you need to exclude the user agents associated with them.
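
Each bot identifies itself with a user-agent string sent along with its requests. For example, at the time of writing, Bing's crawler and Facebook's link-preview bot announce themselves roughly like this (the exact values can change over time, so treat them only as illustrations):

    Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)
    facebookexternalhit/1.1 (+http://www.facebook.com/externalhit_uatext.php)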

In Piwik PRO, you can exclude user agents for each website or app, or for the whole account.

For a website or app

To ignore traffic from user agents associated with bots and crawlers, follow these steps:

  1. Go to Menu > Administration.
  2. Navigate to Websites & Apps.
  3. On the left, select the website or app you want to work with.
  4. In Settings, scroll down until you see Blacklisting.
  5. Type the user agent details. You can paste the entire user-agent string or just a distinctive part of it, as shown in the example after these steps.
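
For example, to exclude Googlebot you could paste its full user agent or only the part that identifies it (the full string below illustrates the format Googlebot typically sends; treat it as an example rather than an exact current value):

    Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
    Googlebot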

For the whole account

  1. Go to Menu > Account settings.
  2. Navigate to Settings.
  3. In Global list of user agents to exclude, type the user agent details. You can paste the entire user-agent string or just a distinctive part of it; see the example after these steps.
  4. Click Save.
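
If you want to sanity-check whether a fragment you plan to paste would cover the bots you have in mind, you can test it locally against a few sample user agents. The short Python sketch below uses a case-insensitive substring match, which is an assumption made for this illustration, not a description of how Piwik PRO matches excluded entries:

    # Illustrative only: check which sample user agents contain a given fragment.
    # Case-insensitive substring matching is an assumption for this sketch,
    # not Piwik PRO's actual matching logic.
    samples = [
        "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
        "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)",
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",  # a regular browser
    ]

    def matches(fragment, user_agent):
        return fragment.lower() in user_agent.lower()

    for ua in samples:
        print(matches("Googlebot", ua), ua)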

Technical Support

If you have any questions, drop us a line at support@piwik.pro.

We’re happy to help!