How can I ignore traffic from bots and crawlers?
Occasionally, your website may be visited by bots and crawlers rather than real people. Some bots, such as Googlebot or Bingbot, scan the content of your website to collect information for search results. Bots from Facebook enter your website to capture the page title, description, and preview shown when somebody pastes a link to your website on Facebook. Other bots visit your site to scan the technical environment.
By default, we exclude traffic coming from the most popular bots and crawlers, but you may run into ones that are still tracked. In that case, you can deliberately exclude those bots and crawlers from tracking. To do that, you need to exclude the user agents associated with them.
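To see why excluding a user agent works, it helps to know what one looks like. A minimal sketch below uses the user agent that Googlebot is documented to send; note that the short bot name appears inside the full string, which is why pasting just that part can be enough:

```python
# The full user agent Googlebot sends (documented by Google),
# and the short token that identifies the bot within it.
full_user_agent = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
token = "Googlebot"

# The token appears inside the full string, so matching on the
# token alone is sufficient to identify this bot.
print(token in full_user_agent)  # True
```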
In Piwik PRO, you can exclude user agents for each website or app, or for the whole account.
For a website or app
To ignore traffic from user agents associated with bots and crawlers, follow these steps:
- Go to Menu > Administration.
- Navigate to Websites & Apps.
- On the left, select the website you want to work with.
- In Settings, scroll down until you see Blacklisting.
- Type the user agent details. You can paste the entire user agent or just the user-agent string.
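The steps above can be sketched as a simple filter. This is a hypothetical illustration, not Piwik PRO's actual implementation: it assumes case-insensitive substring matching, and the `is_excluded` helper and the sample exclusion list are made up for this example.

```python
def is_excluded(user_agent: str, excluded: list[str]) -> bool:
    """Return True if any excluded entry appears in the user agent.

    Hypothetical helper: assumes case-insensitive substring matching,
    which may differ from Piwik PRO's real matching rules.
    """
    ua = user_agent.lower()
    return any(entry.lower() in ua for entry in excluded)

# Sample exclusion list: full user agents or just identifying parts.
excluded = ["Googlebot", "bingbot", "facebookexternalhit"]

print(is_excluded(
    "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)",
    excluded))  # True — matches the "bingbot" entry
print(is_excluded(
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Firefox/126.0",
    excluded))  # False — a regular browser is still tracked
```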
For the whole account
- Go to Menu > Account settings.
- Navigate to Settings.
- In Global list of user agents to exclude, type the user agent details. You can paste the entire user agent or just the user-agent string.
- Click Save.