Bot Detection: Identifying the Good, the Bad, and the Ugly

“RoboCop,” “Ex Machina,” “I, Robot”: Hollywood loves to make movies about robots taking over the world, which thankfully is not yet a reality (…though it is 2020). On the internet, however, bots have truly achieved critical mass, representing an estimated 37.2% of all traffic. Bots are software programs that perform routine tasks and execute commands automatically. Not all bots are created equal—there are different types, some good, some bad:

  • Good bots: These include search engine crawlers, which can help drive traffic to a website, as well as tools such as virtual assistants and chatbots, which provide a quick and efficient means of customer engagement.
  • Bad bots: Bots are used for fraud and malicious disruption in many ways; in 2019, bad bots made up an estimated 24.1% of internet traffic. They power brute-force and credential stuffing attacks, card testing, and distributed denial-of-service (DDoS) attacks, to name just a few.
  • Questionable bots: There is a whole class of bots whose activity can be either good or bad, depending on a business’s goals. Scraper bots are a prime example: they can aggregate content that drives third-party sales channels (think of online travel agencies and airlines), but scrapers are also a key tool in promotional abuse—a rising problem. Spider bots are another example. Businesses can use them to index their sites and improve search engine optimization, while bad actors can use them to find vulnerabilities. (A first-pass classification sketch follows this list.)
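
To make the taxonomy concrete, below is a minimal sketch in Python of the kind of naive first-pass classification a site might apply to the self-declared User-Agent header (the bot-name patterns are illustrative, not exhaustive). Only well-behaved bots identify themselves this way; bad bots routinely spoof browser user agents, which is exactly why the richer signals discussed later are needed.

```python
import re

# Naive first-pass classifier based on the User-Agent header.
# Well-behaved bots (search engine crawlers and the like) identify
# themselves honestly; bad bots typically spoof a browser user agent,
# so this check is a starting point, not a defense.
KNOWN_GOOD_BOTS = re.compile(r"Googlebot|Bingbot|DuckDuckBot|Slurp", re.I)
GENERIC_BOT_HINTS = re.compile(r"bot|crawler|spider|scraper", re.I)

def classify_user_agent(user_agent: str) -> str:
    """Return a coarse label: 'good-bot', 'declared-bot', or 'unknown'."""
    if KNOWN_GOOD_BOTS.search(user_agent):
        return "good-bot"      # self-identified search engine crawler
    if GENERIC_BOT_HINTS.search(user_agent):
        return "declared-bot"  # declared automation; intent unclear
    return "unknown"           # a human, or a bad bot spoofing one

print(classify_user_agent("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # good-bot
```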

Bad bots and questionable bots can have a number of adverse impacts on a business:

  • Fraud losses: Fraud losses are an obvious impact of bad bot activities, such as credential stuffing and card testing. In a Kount research study from September 2020, 80% of the surveyed e-commerce merchants indicated that increasingly sophisticated bot attacks are contributing to rising losses.
  • Server capacity: Bot traffic creates server load, which can either slow site performance for good users or require a firm to invest in additional hardware to maintain desired response times.
  • Distorted performance analytics: Bot volume can skew website metrics, making it difficult to gauge the true performance of marketing campaigns (a filtering sketch follows this list).
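
As a concrete illustration of the analytics problem, here is a minimal sketch (in Python, with hypothetical record fields) that strips self-declared bot traffic before computing a conversion rate. Spoofed bots would still slip through, so this is a baseline rather than a cure; real pipelines layer in device and behavioral signals as well.

```python
BOT_HINTS = ("bot", "crawler", "spider", "scraper")

def is_declared_bot(user_agent: str) -> bool:
    """True if the User-Agent openly declares automation."""
    ua = user_agent.lower()
    return any(hint in ua for hint in BOT_HINTS)

def conversion_rate(page_views: list[dict]) -> float:
    """Conversion rate over human-looking traffic only."""
    human = [v for v in page_views if not is_declared_bot(v["user_agent"])]
    if not human:
        return 0.0
    return sum(1 for v in human if v["converted"]) / len(human)

views = [
    {"user_agent": "Mozilla/5.0 (compatible; Googlebot/2.1)", "converted": False},
    {"user_agent": "Mozilla/5.0 (Windows NT 10.0) Chrome/85.0", "converted": True},
    {"user_agent": "Mozilla/5.0 (Macintosh) Safari/13.1", "converted": False},
]
print(f"{conversion_rate(views):.0%}")  # 50%, once the crawler hit is excluded
```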

Historically, bot detection focused on the perimeter (e.g., web application firewalls, or WAFs). As bots have grown more sophisticated, they often dodge these defenses by mounting low-and-slow attacks that mimic human behavior, or by spreading attacks across multiple IP addresses and proxies to bypass velocity controls. To protect themselves, businesses now need a broader complement of identity data to detect and stop the bad bots while letting the good ones through, including consortium device data, behavioral biometrics, and behavioral analytics.
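
To see why perimeter-era controls fall short on their own, below is a minimal sketch of a classic per-IP sliding-window velocity check (the window and threshold are illustrative, not recommendations). A single IP hammering an endpoint trips it quickly, but the same volume spread across proxies passes untouched:

```python
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 60   # sliding window length
MAX_REQUESTS = 30     # per-IP ceiling inside the window

_hits: dict[str, deque] = defaultdict(deque)

def allow(ip: str, now: float | None = None) -> bool:
    """Return True if this request stays under the per-IP velocity limit."""
    now = time.monotonic() if now is None else now
    window = _hits[ip]
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()   # discard hits older than the window
    if len(window) >= MAX_REQUESTS:
        return False       # single-IP burst: blocked
    window.append(now)
    return True

# One IP making 40 requests in 40 seconds trips the control...
print(all(allow("203.0.113.7", now=float(t)) for t in range(40)))  # False
# ...but the same 40 requests from 40 proxies all sail through.
print(all(allow(f"10.0.0.{i}", now=0.0) for i in range(40)))       # True
```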

With little in the way of deterrents or consequences, bot-based attacks will only get more sophisticated. As such, it’s imperative that any firm with a digital presence deploy equally sophisticated defenses.
