Googlebot crawls the internet 24/7. While doing so, it also loads all ads it finds on your pages. You don't want to count impressions or clicks made by these bots, crawlers and spiders. Ad servers like AdGlare use bot filtering to remove invalid traffic from your reports.
Bot traffic accounts for more than 50% of all global internet traffic. An incredible number that can't be neglected. Unless you're running in-house campaigns, it's imperative that your ad server filters bot traffic to avoid report discrepancies and to keep the quality of your sold inventory high.
Although bot activity fluctuates over the years, we can't deny the huge impact that bots and spiders have on our analytical data. Back in 2016, Incapsula released a great infographic to give us an update on where we're heading: bots accounted for 51.8% of all internet traffic.
Genuine bots and crawlers tell us who they are via the User-Agent string that is passed along with each HTTP request. This string will be matched against IAB's list of known bots and spiders to determine if we're dealing with human or non-human traffic. For example, Googlebot uses the following user agent string:
Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)

The Media Rating Council (MRC) has set a standard for the detection and filtering of invalid traffic. If the ad serving engines receive a request from such a user agent, an advertisement will still be returned, but the impression or click simply won't be logged. AdGlare uses this method to make sure the page layout remains the same whether a bot or a human visits the page. This is important because it lets Google determine which content is above the fold - a significant factor in SEO.
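As a rough illustration of how user-agent matching works, here's a minimal sketch in Python. The patterns below are hypothetical examples; the actual IAB/ABC International Spiders & Bots List is licensed and far more extensive, and production ad servers match against the full list.

```python
import re

# Hypothetical excerpt of bot signatures; NOT the real IAB list,
# which is licensed and contains thousands of entries.
BOT_PATTERNS = [
    r"googlebot",
    r"bingbot",
    r"crawler",
    r"spider",
]
BOT_REGEX = re.compile("|".join(BOT_PATTERNS), re.IGNORECASE)

def is_bot(user_agent: str) -> bool:
    """Return True if the User-Agent matches a known bot pattern."""
    return bool(BOT_REGEX.search(user_agent))

ua = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
if is_bot(ua):
    # Serve the ad as usual, but skip logging the impression/click.
    pass
```

The key design point, as described above: the check only gates the logging step, not the ad response itself, so the page layout stays identical for bots and humans.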
In the online advertising industry, publishers are paid to show ads. Advertisers buy inventory to reach potential consumers - humans, not bots or automated scripts. As most inventory is sold on a CPM basis, it doesn't make sense to serve half of a campaign to non-human traffic.
If you don't filter for bots:
It's therefore common practice for advertisers to insist on filtering bots when closing a deal with a publisher or ad network.
The Interactive Advertising Bureau (IAB) maintains a list of all known bots and spiders. Ad Tech companies like AdGlare can subscribe to this list to make sure we're all filtering the same types of bots. We're filtering for the following:
It's highly recommended to enable bot filtering to minimize discrepancies with third-party ad servers, especially if you're a publisher. To do so, follow these steps:
In addition to filtering invalid traffic from bots, you may also want to consider filtering requests made from known malicious networks. A quick search on Google can provide you with a list of IP addresses (likely in CIDR notation) from networks known to be infected with software that automatically crawls pages to artificially inflate impressions. AdGlare can filter those impressions and clicks at two levels:
Note that IP filtering works slightly differently than described above. Instead of returning an ad, the engines will simply respond with 'no ads available' for requests made from those IP ranges. The end result is the same: these impressions and clicks are not logged whatsoever, keeping your statistical reports free of bot traffic.
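To illustrate the IP-range check itself, here's a minimal sketch using Python's standard `ipaddress` module. The CIDR ranges below are documentation-only example networks, not a real blocklist; in practice the ranges would come from a maintained threat-intelligence feed.

```python
import ipaddress

# Hypothetical blocklist using RFC 5737 documentation ranges;
# a real deployment would load CIDR ranges from a curated feed.
BLOCKED_NETWORKS = [
    ipaddress.ip_network("198.51.100.0/24"),
    ipaddress.ip_network("203.0.113.0/24"),
]

def is_blocked(ip: str) -> bool:
    """Return True if the request IP falls inside a blocked CIDR range."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in BLOCKED_NETWORKS)

if is_blocked("198.51.100.42"):
    # Respond with 'no ads available'; nothing is logged.
    pass
```

Note the difference from user-agent filtering described above: here the engines return no ad at all, rather than serving an ad and suppressing the log entry.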
Now that you're on track to improve your CTR and the quality of your inventory, it's worth considering the following practices as well.
Since 2013, AdGlare has powered the ad serving stack of hundreds of brands and publishers worldwide. With ideas and suggestions coming from Publishers, AdOps and Marketers like you, we're proud to offer one of the most up-to-date ad servers on the market. We adhere to IAB's LEAN Ads Program and Google's Coalition for Better Ads.
Are you a publisher or brand? Reach out to see how we can help you.