Bots make up more than 75% of total traffic for some businesses, but one in three can't distinguish legitimate bots from malicious ones.

Dark Reading Staff, Dark Reading

November 18, 2017

1 Min Read

One in three organizations can't differentiate legitimate bots from malicious ones, a shortcoming that can affect application security.

Bots make up more than 75% of total traffic for some businesses, according to a Radware study on Web application security. The study found nearly half (45%) of businesses had been hit with a data breach in the past year, and 68% are not confident they can keep corporate information safe.

Malicious bots are a serious risk, as Web-scraping attacks can affect retailers by stealing intellectual property, undercutting prices, and holding mass inventory in limbo, the report states. In retail, 40% of businesses can't tell good bots from bad ones. The healthcare industry is also struggling: 42% of its traffic comes from bots, but only 20% of IT security execs can tell whether those bots are nefarious.

Researchers found gaps in DevOps security, which likely stem from the pressure to consistently deliver application services. Half (49%) of respondents use continuous delivery of application services, and 21% plan to adopt it in the next one to two years. Nearly two-thirds (62%) believe this increases the attack surface, and about half report they don't integrate security into continuous application delivery.

Read more details here.



About the Author(s)

Dark Reading Staff

Dark Reading

Dark Reading is a leading cybersecurity media site.

