After completing this unit, you’ll be able to:
- Describe what malicious bots want to do on a storefront.
- Explain why it’s difficult to know if an order is fraudulent.
- List bot mitigation strategies.
- Explain what you can learn about bots with the Reports & Dashboards tool.
- List the eCDN settings that help with bot mitigation.
- Describe how third-party tools and APIs help with bot mitigation.
What’s Happening in the Bot World?
Linda Rosenberg, Cloud Kicks admin, wants to learn as much as she can about bots. To mitigate bot attacks, she wants to know what they intend to do and how they plan to do it. A proactive approach will go a long way to keep her storefront safe.
There are two general types of bots.
- Good: Crawlers and spiders that scrape websites or index them for search engine optimization (SEO).
- Bad: Malicious bots used by bad actors to steal data, commit fraud, or bring down sites.
Good bots help shoppers find the storefront. Linda wants to welcome them with open arms, while blocking the bad ones. It’s a serious challenge!
Bad actors want to steal from a site or prevent it from selling. They often use lists of stolen credentials in brute-force attacks to gain access to user accounts and make unauthorized transactions. Most cyber attacks are simple and automated. Malicious bots test stolen credit card data on merchant storefronts, or collect information from targeted websites. Bad actors steal content and strain web infrastructure. This can reduce a website’s SEO ranking, slow system response times, or stop the site altogether.
Strategize Bot Mitigation
It’s hard to know if an order is fraudulent because there are so many ways that bad actors can operate, and they’re inventing new ways all the time. Account takeovers from password reuse, shoppers falling for phishing attacks, identity theft, and triangulation schemes are just the beginning.
To help reduce fraud and fraudulent orders, it’s a good idea to integrate the site with a fraud prevention service, or to contact partners from the LINK Technology Partner Programs, such as PerimeterX and DataDome, for help. Look for a service that can identify human users without negatively impacting the shopper experience. Advanced mitigation options, such as honeypots that serve huge downloads, increase the cost to the bot owner and make them think twice about their methods.
But there’s a lot that Linda can do on her own, including:
- Review traffic with the Reports & Dashboards tool.
- Configure eCDN settings such as rate limiting.
- Use third-party tools and APIs.
- Use OCAPI (CDN Zone API) for programmatic protection.
Here’s her strategy for special sales.
- Make the shopper click something. Linda asks the developer to create a page that requires shoppers to click something, such as completing a CAPTCHA. Placing the CAPTCHA before the add-to-cart operation controls the flow of requests to the add-to-cart pages.
- Require an account to make a purchase. Place limits on how many orders can be placed per second, per account, or how many credit cards are allowed per account.
- Require preregistration. This includes payment authorization to ensure real interest in the hype sale.
- Create a specific risk profile with strict purchasing rules. For example, configure order limits for each shipping address, IP address, or email. Block known fraudulent users based on zip code, email, and so on. Activate the risk profile for the duration of the sale to avoid impacting regular business.
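The strict purchasing rules above can be sketched in code. This is a minimal, hypothetical illustration of per-identifier order limits (the class name, limit value, and identifier fields are assumptions for illustration, not actual B2C Commerce configuration):

```python
from collections import defaultdict

# Hypothetical sketch: cap orders per shipping address, IP, and email
# during a hype sale. MAX_ORDERS and the identifiers are illustrative.
MAX_ORDERS = 2

class RiskProfile:
    def __init__(self, max_orders=MAX_ORDERS):
        self.max_orders = max_orders
        self.counts = defaultdict(int)  # orders seen per identifier

    def allow_order(self, shipping_address, ip_address, email):
        """Reject the order if any identifier has hit its limit."""
        identifiers = (shipping_address, ip_address, email)
        if any(self.counts[i] >= self.max_orders for i in identifiers):
            return False
        for i in identifiers:
            self.counts[i] += 1
        return True

profile = RiskProfile()
assert profile.allow_order("1 Main St", "203.0.113.7", "a@example.com")
assert profile.allow_order("1 Main St", "203.0.113.8", "b@example.com")
# A third order from the same shipping address is blocked.
assert not profile.allow_order("1 Main St", "203.0.113.9", "c@example.com")
```

In practice these limits are configured in the fraud prevention service or storefront logic rather than hand-coded, and the profile is activated only for the duration of the sale.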
Reports & Dashboards
Linda’s first act before a sale is to open the Traffic dashboard in the Reports & Dashboards tool to see what’s happening. In the heat of a bot attack (or a potential attack), she wants to identify who is attacking and immediately block them. Here’s what she can learn in the Traffic dashboard about visitors.
- Number of visits
- Average visit duration
- Devices used to access your site
- Source of referrers
- Top IP addresses
- Top user agents
- Top robots
The top user agents and top robots data is especially useful.
- The Top User Agent report lists the unique agents in an agent family by class. It shows the number of requests from each agent, including all browsers. It excludes known robots.
- The Top Robots report shows the number of unique robots, including known robots. It shows the robot class and family, and the total number of requests from robots. This report helps Linda identify which robots are crawling her storefront. The report is limited to the 10,000 most recently used entries.
Once Linda identifies individual pages or controllers under attack, she can raise the threat level of her entire site via enterprise content delivery network (eCDN) settings, while continuing to figure out which specific areas are being attacked. She can also enable firewall settings and the web application firewall (WAF).
Here are some eCDN settings she can configure in Business Manager.
| Setting | What it does |
| --- | --- |
| Disallow refinements using robots.txt | Honors good bots |
| DDoS protection (Layers 3, 4, and 7) | Thwarts volumetric and protocol attacks such as TCP SYN, UDP, and ICMP floods |
| Firewall: change security level (Low, Med, High, Under Attack) | Reduces the volume of an attack |
| WAF: change security level (Simulate, Challenge, or Block) | Reduces the volume of an attack |
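The "Disallow refinements using robots.txt" setting relies on the standard robots.txt convention, which well-behaved crawlers honor. A hedged sketch of what such a file can look like (the query parameters are illustrative examples of refinement URLs, not actual Cloud Kicks paths):

```
# Illustrative robots.txt: ask compliant crawlers to skip refinement URLs
User-agent: *
Disallow: /*?prefn1=
Disallow: /*?srule=
```

Note that robots.txt is advisory only; malicious bots ignore it, which is why the firewall, WAF, and DDoS settings above exist.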
Third-Party Tools and APIs
Linda uses third-party tools, such as webmaster tools, to review and control the bot crawl rate. She also uses the Salesforce Commerce API to:
- Allow only traffic from her third-party CDN to pass through.
- Deny specific IPs.
- Block by country where bad actors might be hitting the site.
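As a rough sketch, blocking a specific IP programmatically could look like the following. This is an assumption-laden illustration: the endpoint path and payload shape are modeled on the style of the B2C Commerce CDN Zones API, and `SHORT_CODE`, the organization ID, and zone ID are placeholders. Verify every detail against the current Salesforce Commerce API reference before use.

```python
import json

# Placeholder for the tenant-specific API short code (assumption).
SHORT_CODE = "your-short-code"

def build_block_ip_request(org_id, zone_id, ip_address):
    """Build the URL and JSON body for a hypothetical IP-block firewall rule.

    The path and payload are illustrative, not confirmed API contracts.
    """
    url = (
        f"https://{SHORT_CODE}.api.commercecloud.salesforce.com"
        f"/cdn/zones/v1/organizations/{org_id}/zones/{zone_id}/firewall/rules"
    )
    body = json.dumps({"action": "block", "type": "ip", "values": [ip_address]})
    return url, body

# Example: prepare (but don't send) a request to block one address.
url, body = build_block_ip_request("f_ecom_org", "zone123", "198.51.100.4")
```

The actual call would be an authenticated POST (bearer token) using this URL and body; separating request construction from transport makes the rule logic easy to review and test.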
Let’s Wrap It Up
In this unit, you learned about bots, both good and bad: what they intend and how they go about it. You reviewed bot mitigation strategies, and drilled down into a few of them. Now you can take the final quiz and earn a new badge.