Akamai announces Content Protector to stop scraping attacks

Akamai Technologies, Inc., the cloud company that powers and protects life online, announced the availability of Content Protector, a product that stops scraping attacks without blocking the good traffic that companies need to enhance their business.

Scraper bots are a critical and often productive part of the commerce ecosystem. These bots search for new content, highlight products on comparison sites, and gather updated product information to share with customers. Unfortunately, scrapers are also used for harmful purposes such as competitive undercutting, surveillance before inventory-hoarding attacks, and counterfeiting of goods and websites. Scrapers also ping sites 24/7 unless stopped, so they can degrade site performance, which in turn frustrates consumers and causes them to abandon their visits. In addition, scrapers have become much more evasive and sophisticated over the past few years.

Akamai Content Protector helps detect and mitigate evasive scrapers that steal content for malicious purposes. It facilitates significantly better detections and fewer false negatives without increasing the rate of false positives. The product is designed for companies that need to protect their intellectual property, reputation, and revenue potential. It offers tailored detections that include:

Protocol-level assessment: Protocol fingerprinting evaluates how the client establishes the connection with the server at the different layers of the Open Systems Interconnection (OSI) model, verifying that the negotiated parameters align with those expected from the most common web browsers and mobile applications.
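To illustrate the idea (this is a minimal sketch, not Akamai's implementation), a protocol-level check can compare a client's negotiated TLS parameters against a fingerprint table of common browsers. The fingerprint table, field names, and cipher values below are invented for illustration:

```python
# Hypothetical fingerprint table: TLS version and cipher-suite order
# typically observed for common browsers (values are illustrative).
KNOWN_CLIENT_FINGERPRINTS = {
    "chrome": {
        "tls_version": "1.3",
        "cipher_order": ("TLS_AES_128_GCM_SHA256", "TLS_AES_256_GCM_SHA384"),
    },
    "firefox": {
        "tls_version": "1.3",
        "cipher_order": ("TLS_AES_128_GCM_SHA256", "TLS_CHACHA20_POLY1305_SHA256"),
    },
}

def protocol_anomalies(client_hello: dict, claimed_agent: str) -> list[str]:
    """Return anomalies where the handshake diverges from the profile
    expected for the browser the client claims to be."""
    expected = KNOWN_CLIENT_FINGERPRINTS.get(claimed_agent)
    if expected is None:
        return ["unknown user agent"]
    anomalies = []
    if client_hello.get("tls_version") != expected["tls_version"]:
        anomalies.append("unexpected TLS version")
    if tuple(client_hello.get("cipher_order", ())) != expected["cipher_order"]:
        anomalies.append("cipher-suite order mismatch")
    return anomalies
```

A genuine browser produces an empty anomaly list, while an automation tool that claims a Chrome user agent but negotiates TLS differently is flagged.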

Application-level assessment: Evaluates whether the client can run business logic written in JavaScript. When the client runs the JavaScript, it collects device and browser characteristics and user preferences. These data points are compared and cross-checked against the protocol-level data to verify consistency.
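The cross-check described above can be sketched as follows (an illustrative example, not Akamai's code; the field names are assumptions): a mismatch between what client-side JavaScript reports and what the protocol layer observed, such as a "Chrome" browser paired with a non-Chrome TLS stack, is a common bot tell.

```python
def is_consistent(js_report: dict, protocol_view: dict) -> bool:
    """True if the browser family reported by client-side JavaScript
    matches the family inferred from the protocol fingerprint."""
    return js_report.get("browser_family") == protocol_view.get("inferred_family")
```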

User interaction: Analyzes human interaction with the client through standard peripherals like a touch screen, keyboard, and mouse. Lack of interaction or abnormal interaction is typically associated with bot traffic.

User behavior: Monitors the user journey through the website. Botnets typically go after specific content, resulting in significantly different behavior than legitimate traffic.

Risk classification: Provides a deterministic and actionable low-, medium-, or high-risk classification of the traffic, based on the anomalies found during the evaluation.
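A deterministic classification of this kind can be sketched as a simple scoring function over the anomaly categories named above. The severity weights and thresholds here are invented for illustration and are not Akamai's:

```python
# Assumed severity weights per anomaly category (illustrative only).
SEVERITY = {"protocol": 3, "application": 2, "interaction": 1, "behavior": 1}

def classify(anomaly_categories: list[str]) -> str:
    """Map the categories of detected anomalies to a deterministic
    low/medium/high risk label."""
    score = sum(SEVERITY.get(category, 0) for category in anomaly_categories)
    if score >= 4:
        return "high"
    if score >= 2:
        return "medium"
    return "low"
```

Because the mapping is deterministic, the same set of anomalies always yields the same label, which makes the output actionable for policy decisions such as allow, challenge, or block.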

“Content scraping brings serious harm to businesses,” explained Rupesh Chokshi, Senior Vice President and General Manager, Application Security, at Akamai. “This includes competitors undercutting your offers, slower sites that lead customers to get frustrated and leave, and brand damage from counterfeiters passing off subpar goods as your legitimate merchandise. Content Protector helps demonstrate the direct business value of security while enabling business leaders to grow their digital businesses.”
