Dark Reading is part of the Informa Tech Division of Informa PLC


Cloud Security

09:35 AM
Larry Loeb

US Government Leads World in Data Breaches

US government agencies are leading the world when it comes to data breaches, and the issue seems to be getting worse, according to a new report. However, a shift to the cloud may help alleviate some problems.

The US is leading the world in a dangerous way: The country's federal agencies suffer the most data breaches by volume compared to other governments worldwide.

A new study by Thales E-Security and 451 Research, based on responses from IT professionals in the federal sector, found the US is experiencing higher rates of data breaches than in the past, as well as higher rates than other governments. While 26% of non-US agencies reported a breach in the last year, 57% of US agencies did.

That is a sharp rise from the 34% rate reported in 2016, and more than three times the 18% reported in 2015.

In total, the report found that 71% of all federal agencies have been breached over the years.

These agencies have responded, notably in what they spend on. Of those surveyed, 93% reported that their agencies will increase IT security spending compared to last year, and 73% said that spending will be much higher.

Spending on encryption technologies designed to protect data is set to rise as well: 77% of respondents reported increases. Not only that, 88% of respondents said data and file encryption will be implemented this year, with 77% noting that application-level encryption would be deployed.

This compares to the 89% reporting that data masking would be implemented, as well as the 84% who told researchers that cloud-based encryption would be rolled out. That step is needed, since only 23% said encryption is currently being used in the cloud.

The changes that are underway show how important cloud computing has become to the federal government. The report states that 100% of all federal agencies have plans to adopt cloud technologies. But this sort of mass adoption brings security challenges with it.

For instance, once an agency has moved to the cloud, it may have little or no control over how data is actually stored or protected while at rest. Paradoxically, however, agencies may allow the cloud provider to control their encryption keys for their containers rather than owning and managing the keys themselves.

Showing the problem starkly, agencies were also worried about the custodianship of encryption keys in the cloud, with 69% reporting it as a concern.

They should be concerned. This kind of behavior could be a violation of NIST SP 800-53, FedRAMP and the federal Risk Management Framework, which require agencies to maintain control of access to their data.
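The alternative to handing keys to the provider is the envelope-encryption pattern many cloud key managers follow: the agency keeps a key-encryption key (KEK) on premises, and only a wrapped per-object data key travels to the cloud with the ciphertext. The sketch below illustrates that key-custody split; the hash-based keystream is a stand-in for real crypto (a production system would use a vetted cipher such as AES-GCM from a library like `cryptography` — an assumption, not something the report prescribes), and all function names here are hypothetical.

```python
import hashlib
import os

def keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """XOR data against a SHA-256 counter keystream.
    Toy construction for illustration only -- NOT secure crypto."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

def encrypt_for_cloud(kek: bytes, plaintext: bytes):
    """Envelope encryption: a fresh data-encryption key (DEK) protects the
    object; the DEK itself is wrapped under the agency-held KEK."""
    dek = os.urandom(32)                          # per-object data key
    nonce = os.urandom(16)
    ciphertext = keystream_xor(dek, nonce, plaintext)
    wrapped_dek = keystream_xor(kek, nonce, dek)  # DEK wrapped under the KEK
    # Only (nonce, wrapped_dek, ciphertext) is stored with the provider;
    # the KEK never leaves the agency.
    return nonce, wrapped_dek, ciphertext

def decrypt_from_cloud(kek: bytes, nonce: bytes,
                       wrapped_dek: bytes, ciphertext: bytes) -> bytes:
    """Unwrap the DEK with the agency's KEK, then recover the plaintext."""
    dek = keystream_xor(kek, nonce, wrapped_dek)
    return keystream_xor(dek, nonce, ciphertext)

kek = os.urandom(32)  # held on premises, satisfying the key-control intent
nonce, wdek, ct = encrypt_for_cloud(kek, b"controlled unclassified data")
assert decrypt_from_cloud(kek, nonce, wdek, ct) == b"controlled unclassified data"
```

The point of the pattern is that a provider holding only the wrapped DEK cannot read the data; whoever holds the KEK controls access, which is what the compliance frameworks above ask agencies to retain.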

The kinds of security tools being funded may not be the best fit for the situation. Respondents rated data-in-motion and data-at-rest defenses -- at 78% and 77%, respectively -- as the most effective tools for protecting data.


However, Garrett Bekker, one of the report's authors, writes: "The largest amount of respondents plan to increase spending on endpoint and mobile devices, despite ranking endpoint and mobile devices as least effective at protecting sensitive federal data -- a major disconnect."

The report suggests that this kind of disconnect may be due to previous experiences with legacy systems. In the report, 53% of respondents cited a lack of budget as a perceived barrier to security. Agencies may not realize that today's security tools can cost less than legacy tools and impose minimal overhead on existing systems.

It's clear from the report that the US will have to strengthen its adoption of encryption technologies to protect its data as it moves to the cloud. Fortunately, the plans to adopt such technologies, such as encryption gateways and third-party encryption key managers for cloud environments, are growing.


— Larry Loeb has written for many of the last century's major "dead tree" computer magazines, having been, among other things, a consulting editor for BYTE magazine and senior editor for the launch of WebWeek.
