Bot Management Flashcards

1
Q

Objection: Isn’t this the same as Rate Limiting? Why should I pay more for this? (2 bullet response)

A

Rate Limiting is an effective tool, but you have to write a rule for every combination of IPs, User Agents, hostnames, ASNs, etc.

Bot Management, once enabled, automatically determines which of those combinations should be acted on (no manual rule-writing)
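A minimal sketch of the difference (rule shapes and the score threshold are illustrative, not actual product syntax): rate limiting needs one hand-written rule per traffic combination, while Bot Management keys a single rule to a machine-learned score.

```python
# Sketch only: rule format and threshold are illustrative, not product syntax.

# Rate Limiting: one hand-written rule per combination you discover.
manual_rules = [
    {"ip": "203.0.113.0/24", "user_agent": "curl/*", "limit_per_min": 10},
    {"asn": 64496, "hostname": "shop.example.com", "limit_per_min": 50},
    # ...and another rule for every new IP / User-Agent / hostname / ASN mix
]

# Bot Management: a single rule keyed to a machine-learned bot score.
def should_challenge(bot_score: int, threshold: int = 30) -> bool:
    """Challenge any request the model scores as likely automated."""
    return bot_score < threshold
```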

2
Q

Objection: I only want to pay for the requests on the specific paths I am protecting

A

If only requests on a specific path counted toward pricing, it would discourage adding rules against new bot attacks on previously unprotected parts of the site. We want to offer protection that is as holistic as possible, but we also understand this might not be what everybody wants.

Thus we suggest two options:

1) We count only requests on the specific paths currently under attack.

2) We price based on 100% of the traffic your site gets.
We make the latter option more attractive by discounting the cost per request compared to the path-by-path approach.

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
3
Q

Objection: I don’t understand this concept of “good requests” for pricing. Can you help me understand it?

A

Traditionally, the security industry charges per request, but this ends up being bad for you.

When you get attacked, your request count goes up and the vendor charges you more.

We don’t think that is right.

We instead use the number of good requests: requests that successfully pass through Bot Management. We believe this number will scale nicely with how your business grows and how good a job we do blocking bots.

As your traffic increases, we will block more bots and see more good requests come through as well.

As your good requests grow, we’ll have a conversation about new pricing if you exceed your cap.
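A worked example with made-up numbers (volumes and rate are illustrative, not actual pricing) showing why good-request billing behaves better under attack:

```python
# Hypothetical month, illustrative numbers only.
total_requests = 10_000_000   # everything that hit the site
blocked_bots   = 4_000_000    # requests Bot Management stopped
good_requests  = total_requests - blocked_bots   # 6,000,000 billable

# Per-request pricing bills the attack itself; good-request pricing
# bills only the traffic that actually reached you.
rate_per_million = 1.0        # hypothetical price
bill = (good_requests / 1_000_000) * rate_per_million   # 6.0 vs. 10.0
```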

4
Q

Objection: I do not want a single false positive.

A

If you do not want a single false positive, we are not the best product for you, as we are not comfortable guaranteeing zero false positives.

Additionally, we believe that no product in the market will provide a zero false positive rate and any product that claims to is not being honest.

We do believe that our solution will give you the best possible results as they relate to your use case and we look forward to working with you on it.

We do help reduce the likelihood of false positives by letting you test rules in log mode.

Log mode won’t eliminate all false positives, but it can give you greater confidence before turning on actions that may impact legitimate users.

5
Q

ROI: Cost Reduction

A

8% savings on Data Transfer - ~40% of all requests tend to be automated traffic, and with an ~80% caching rate only the uncached share of that bot traffic generates transfer costs (see the arithmetic sketched after this list)

20% lower spend vs. competitors - PerimeterX and Distil charge on all requests (vs. good requests)

Spend on disputed transactions - payment processors charge $15 for every disputed transaction
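The 8% figure falls out of the first bullet’s two assumptions:

```python
# Back-of-envelope for the data-transfer saving, using the card's figures.
bot_share      = 0.40   # ~40% of all requests are automated
cache_hit_rate = 0.80   # ~80% of requests are served from cache

# Only uncached requests generate origin data transfer, so blocking bots
# saves the uncached slice of the bot traffic.
savings = bot_share * (1 - cache_hit_rate)
print(f"{savings:.0%}")   # -> 8%
```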

6
Q

ROI: Productivity Increase

A

$150-$250k/year - the cost of top security engineers spending 2 days/week playing bot whack-a-mole with home-grown solutions

$300-$500k/year - developer time spent building and then maintaining competitors’ JavaScript integrations

20%-30%+ of BDR/SDR leads - spam submitted through marketing and other CTA forms

7
Q

ROI: Risk Reduction

A

$$ on litigation from scraping - content scraping (loss of intellectual property), litigation over that lost IP, Trust & Safety responses

$$ spent on Support/PR - managing account takeovers and credential stuffing; what is the cost of a public complaint on Twitter?

Uptime SLAs - SREs / end customers have uptime SLAs

8
Q

ROI: Revenue Improvement

A

Improvement in performance - capacity spent serving bots slows or drops legitimate traffic, with a direct impact on website revenue

Inventory-based - inventory hoarded by bots goes unsold or is resold on the secondary market

9
Q

What are bots?

A

Automated programs that carry out certain tasks without human intervention

Mimic human behavior on the web

Scan content, interact with web pages, chat with users

10
Q

Good bot examples

A

Search engine crawlers, site monitoring bots, copyright bots, feed bots

11
Q

Bad bot examples

A

Scrapers, spam bots, click bots, fake Googlebots, botnets

12
Q

Evolution of bots

A

Basic (simply collect info, limited number of static IP addresses, repetitive attack pattern, easy to detect) –>

Mature (steal sensitive data, commit fraud, disrupt business, botnets, more difficult to counter) –>

Sophisticated (mimic human behavior or hijack a real customer’s browser and tokens; countering them requires threat intelligence, behavioral analysis, machine learning, and fingerprinting)

13
Q

How have you tried to stop bots to date?

A

Homegrown solutions, relying on hosting providers, rate limiting, WAF, multi-factor authentication, JavaScript-based bot detection

14
Q

Use cases (6)

A

Credential stuffing, inventory hoarding, credit card stuffing, content scraping, application DDoS, content spam

15
Q

Account Selection: Go After

A
  • Don’t have a bot solution
  • Under attack
  • Using a home-grown solution
  • Need an integrated solution (security/performance)
  • Looking for ease of use
  • Existing customer
  • Facing scraping and credential stuffing
16
Q

Account Selection: Caution

A
  • Bots with high-value targets (Inventory hoarding)
  • Displacing solutions whose analytics the customer uses heavily
  • Mixed traffic zones
  • Mobile Apps
  • Competitive displacement
17
Q

Account Selection: Avoid

A
  • Non-browser-facing APIs
  • Requires 100% accuracy
  • Automated partner requests
18
Q

Competitor: Akamai

A
  • Prohibitively expensive (and charges overages).
  • Offers two solutions: Bot Manager and Bot Manager Premier - the former roughly equivalent to our WAF, and the latter similar in nature to our Bot Management but difficult to implement.
19
Q

Competitor: PerimeterX

A

Very expensive; in addition, to inject the code, customers would likely also have to procure Cloudflare Workers (or a similar competing offering, e.g., from Fastly), which adds to the cost.

20
Q

Competitor: Distil

A

Offers a “land and expand” strategy, starting with low “requests per path” pricing; they will often undercut us for a low number of requests, only for the price to shoot up shortly thereafter.

They also charge substantial professional service fees.

*When Distil comes in much lower than us, the customer may not be seeing the full price.

21
Q

Can you support my API?

A
  • Our solution detects automated traffic; as such, it can protect APIs that are accessed via web browsers.
  • If both legitimate automated API traffic and malicious automated API traffic are present, we cannot currently help the customer, as we are unable to differentiate between the two.
  • If the API is only used through manual interaction (e.g., a user performs an action by clicking on something, which then triggers an API call), we can certainly help.
  • If there is minimal legitimate automation, we can help (e.g., an airline wants to be scraped by Skyscanner and Kayak, but that’s it). We would whitelist those few scrapers and block everything else; a minimal sketch of this allowlist logic follows. At some point, however, this whitelisting approach does not scale.
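A minimal sketch of that allowlist approach, reusing the hypothetical partners from the example above:

```python
# Sketch of the allowlist approach; partner names are just the example above.
ALLOWED_SCRAPERS = {"Skyscanner", "Kayak"}

def allow_api_request(client_name: str, is_automated: bool) -> bool:
    """Let humans through, plus the handful of approved partners."""
    if not is_automated:
        return True
    return client_name in ALLOWED_SCRAPERS
# Fine for a short list; maintaining it by hand is what fails to scale.
```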
22
Q

Machine learning

A

Our ML solution is based on all of our traffic, which is around 15% of global internet traffic. Every request we log has around 50 data points we analyze, including basics such as the IP and User-Agent, but also the SSL cipher used and much more.

Supervised machine learning takes in X variables (like gender and age) and predicts a Y variable (like income). Our X variables are request features such as IP, country, user agent, and request byte size. Our Y variable is the probability of solving a CAPTCHA. We train on 200M requests and retrain on a daily basis. You can explore this data in your own request logs via ELS, LogPush, and the Firewall API.
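A toy sketch of that supervised setup (the model choice and feature encoding here are assumptions for illustration, not the production pipeline):

```python
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression

# X: request features such as country, user agent, and request byte size.
requests = [
    {"country": "US", "user_agent": "Mozilla/5.0", "bytes": 512},
    {"country": "CN", "user_agent": "python-requests/2.28", "bytes": 64},
]
# Y: whether the client went on to solve a CAPTCHA (1 = likely human).
solved_captcha = [1, 0]

vec = DictVectorizer()
X = vec.fit_transform(requests)   # one-hot the strings, pass numbers through
model = LogisticRegression().fit(X, solved_captcha)

# Score a new request as a probability of being human; retrain daily.
p_human = model.predict_proba(vec.transform([requests[0]]))[0, 1]
```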

23
Q

Behavioral Analysis

A

We establish a baseline of the customer’s traffic. Any significant anomalies or spikes off that baseline have a negative impact on the score.
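A simple z-score stand-in for the baseline idea (the production behavioral models are more involved; the threshold here is illustrative):

```python
from statistics import mean, stdev

def is_spike(history: list, current: float, z_cutoff: float = 3.0) -> bool:
    """Flag request volume far above the customer's established baseline."""
    baseline, spread = mean(history), stdev(history)
    if spread == 0:
        return current > baseline
    return (current - baseline) / spread > z_cutoff

# e.g. requests/minute over the past hour vs. the current minute
history = [120, 115, 130, 118, 125, 122] * 10
print(is_spike(history, 900))   # True: a spike off the baseline hurts the score
```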

We have used a similar approach in Gatebot, our DDoS mitigation system, over the last 4 years.

If you have any concerns about the accuracy of our product, we encourage you to try it out in log mode to verify that we recognize attacks correctly.