Dark Reading is part of the Informa Tech Division of Informa PLC


Security Management

01:43 PM
Giorgio Regni
News Analysis-Security Now

Are You Ready for GDPR?

Multi-cloud and software-defined storage solutions may ease the way to GDPR compliance.

Cybersecurity is a hot topic. Most of the headlines focus on massive data breaches and foreign state-sponsored hacking. The true scope of damages caused by cyber incidents is starting to sink in for everyone from CEOs to casual consumers. The loss of control over one’s personal data is an alarming consequence, and governments are responding by holding organizations accountable for protecting their users’ private information. The EU’s General Data Protection Regulation (GDPR) is the most far-reaching of these efforts. Set to take effect in less than a year (May 25, 2018), the legislation is intended to fortify and harmonize data protection for all individuals in the European Union.

The global impact of GDPR
The impact will be felt far beyond the EU, however, and businesses must start preparing now if they hope to continue serving EU customers online. The requirements are extensive, and the potential penalties are heavy. The maximum fine for serious infringements is €20 million (approximately $22.3 million) or 4% of worldwide annual revenue, whichever is higher. For large multinational corporations, such penalties could reach billions of dollars. Fines are tiered, but even a 2% penalty for failure to conduct assessments, report breaches or keep sufficient records could be very costly.
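As a rough illustration of how the tiered caps scale with revenue, here is a minimal Python sketch. The function name and structure are my own; the thresholds follow GDPR Article 83's upper tier (€20 million or 4%, whichever is higher) and lower tier (€10 million or 2%):

```python
def gdpr_max_fine(worldwide_revenue_eur: float, tier: str = "upper") -> float:
    """Illustrative ceiling on a GDPR fine for a given annual revenue.

    Upper tier (serious infringements): EUR 20M or 4% of worldwide
    annual revenue, whichever is higher. Lower tier (e.g. record-keeping
    and breach-notification failures): EUR 10M or 2%.
    """
    if tier == "upper":
        return max(20_000_000, 0.04 * worldwide_revenue_eur)
    return max(10_000_000, 0.02 * worldwide_revenue_eur)

# A company with EUR 5B in worldwide revenue faces up to EUR 200M
# at the upper tier; for smaller companies the flat cap dominates.
print(gdpr_max_fine(5_000_000_000))   # 200000000.0
print(gdpr_max_fine(100_000_000))     # 20000000.0
```

This is why "4% of worldwide revenue" is the number that reaches billions for the largest multinationals, while the flat €20 million cap is what bites for everyone else.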

How far do you have to go?
The UK’s Information Commissioner’s Office has published a concise summary of the primary preparations companies need to make to be compliant.

Big Internet players have long understood that data privacy and cybersecurity are complementary efforts, and may only need to tweak their privacy policies and processes to be GDPR compliant. However, businesses of all sizes and types continue to neglect baseline security practices, and many do not have a firm grip on where all their user data resides, who can access it, and how it was obtained and processed. If you don’t know everything about your data assets, they are inherently vulnerable.

Moreover, failure to encrypt, monitor and defend data repositories is also rampant. This is evident in the rapid spread and alarming impact of ransomware like WannaCry, as well as the recent negligent exposure of the Republican National Committee’s mega database (nearly 200 million records of potential voters) by one of its contractors. A Veritas survey found that fewer than a third of companies worldwide are prepared to meet the minimum GDPR standards; there is clearly a lot of work to do.

The multi-cloud approach
Making diligent efforts to implement and enforce strong data security and privacy policies across the enterprise is a good starting point. Investing in next-generation software-defined data storage and data management technology is fundamental.

This is one of the reasons that multi-cloud strategies are becoming more common. Heeding the adage "don't put all your eggs in one basket," companies focused on improving data security, resiliency and availability across multiple locations are adopting multi-cloud infrastructure. This allows companies to use on-premises storage and public cloud services where each fits best, enhancing workflows by seamlessly moving data between clouds and leveraging the most effective services available within each cloud.
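To make the "eggs in multiple baskets" idea concrete, here is a toy Python sketch of a write-everywhere, read-anywhere policy across several storage locations. The in-memory backends and class names are hypothetical stand-ins for real on-premises and public-cloud object stores, not any vendor's API:

```python
class InMemoryBackend:
    """Stands in for one storage location (on-premises or a public cloud)."""
    def __init__(self, name):
        self.name = name
        self.objects = {}

    def put(self, key, data):
        self.objects[key] = data

    def get(self, key):
        return self.objects[key]  # raises KeyError if absent


class MultiCloudStore:
    """Replicate every write to all locations; read from whichever answers."""
    def __init__(self, backends):
        self.backends = backends

    def put(self, key, data):
        for b in self.backends:        # write-everywhere for resiliency
            b.put(key, data)

    def get(self, key):
        for b in self.backends:        # fall back if one location is down
            try:
                return b.get(key)
            except KeyError:
                continue
        raise KeyError(key)


store = MultiCloudStore([InMemoryBackend("on-prem"), InMemoryBackend("cloud-a")])
store.put("user/42/profile", b"...")
```

A real deployment would add placement rules (which data may go to which cloud) rather than blindly replicating everything, but the availability property is the same: losing one location does not lose the data.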

Storing and managing vast amounts of unstructured data has been made easier and even more secure by recent advances in software-defined file system and object storage technology and S3 data services. Implementing these distributed, ultra-secure storage solutions increases performance, extends capacity, enables unprecedented scalability and provides greater location control. Versioning and WORM (write once, read many) capabilities prevent accidental overwrites and enable tighter control over access rights and retention policies, both important for data integrity and GDPR compliance.
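As a toy model of why versioning plus WORM retention matters, the following hypothetical Python sketch shows overwrites becoming new versions and deletion being blocked until a retention hold expires. Real object stores implement this with bucket versioning and object-lock style features; this code is illustrative only:

```python
import time


class VersionedWormStore:
    """Toy model: an overwrite creates a new version instead of destroying
    data, and WORM retention blocks deletion until the hold expires."""

    def __init__(self):
        self._versions = {}  # key -> list of (data, retain_until_epoch)

    def put(self, key, data, retain_seconds=0):
        retain_until = time.time() + retain_seconds
        self._versions.setdefault(key, []).append((data, retain_until))

    def get(self, key, version=-1):
        """Default reads the latest version; older versions stay readable."""
        return self._versions[key][version][0]

    def delete_version(self, key, version):
        _, retain_until = self._versions[key][version]
        if time.time() < retain_until:
            raise PermissionError("WORM retention has not expired")
        self._versions[key].pop(version)


store = VersionedWormStore()
store.put("contract.pdf", b"v1", retain_seconds=3600)
store.put("contract.pdf", b"v2")        # does not destroy v1
```

The GDPR-relevant point is the combination: versioning protects integrity against accidental or malicious overwrites, while retention holds enforce a documented deletion policy rather than ad-hoc erasure.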


Moreover, multi-cloud can help with compliance because certain cloud vendors have already achieved relevant certifications, such as HIPAA and SEC compliance in the AWS cloud. In addition, multi-location (regional) data placement controls can ensure data sovereignty by keeping certain data "in country," which helps satisfy GDPR locality requirements.
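A minimal sketch of such a placement policy might look like the following. The routing table, region names and country set are illustrative examples I chose for the sketch, not a complete or authoritative mapping:

```python
# Hypothetical routing table: data-subject residency -> storage region.
REGION_POLICY = {
    "DE": "eu-central-1",
    "FR": "eu-west-3",
    "US": "us-east-1",
}
EU_FALLBACK = "eu-west-1"                      # any EU region keeps data in the EU
EU_COUNTRIES = {"DE", "FR", "IT", "ES", "NL"}  # abbreviated for the sketch


def placement_region(residency: str) -> str:
    """Pick the region where this subject's personal data may be stored."""
    if residency in REGION_POLICY:
        return REGION_POLICY[residency]
    if residency in EU_COUNTRIES:
        return EU_FALLBACK  # no country-specific rule, but stay inside the EU
    raise ValueError("no placement policy for " + residency)
```

The key design choice is to fail closed: data for a residency with no policy is rejected rather than written to a default location, so nothing silently leaves the jurisdiction it belongs in.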

An opportunity to do data (and business) the right way
In the end, while complying with GDPR requirements will challenge most companies and may prove seriously burdensome, it should be viewed as an opportunity. Some experts go so far as to say that the EU’s GDPR will save the Internet. The requirements essentially add up to best practices for data management that all controllers and processors of personal data should be adhering to already. The potential fines are serious enough to grab executive and board attention and garner more resources for IT security and data management teams. A recent PwC survey found that almost all companies polled (200 companies with more than 500 employees each) consider GDPR preparations a top priority, and a majority (77%) plan to spend at least $1 million on GDPR compliance.

GDPR presents the strongest justification in many years for upgrading data storage infrastructure and management technology. In her Internet trends report, Mary Meeker highlighted a Bain survey that found top concerns about cloud computing are shifting as the technology becomes more widely trusted. While data security is still a top concern, it is less so than in previous years; concerns about compliance/governance and vendor lock-in, however, are rising significantly (see slide 183).

Data is the lifeblood of modern business and government. Well-managed data protected by visibly conscientious data privacy programs has proven to be a competitive advantage and a differentiating factor. Compliant companies, especially big Internet players like Microsoft, Google, Cisco and Amazon, already know this, and it is evident in their growth and success. They understand the intersection of privacy and security: data is viewed as a critical asset, it is a constant board-level discussion, and they have many people working on it, from legal to engineering. They started several years ago, building in privacy controls, features and protocols.

The biggest Internet players have vast resources; smaller organizations playing catch-up have to be smart and deliberate about their next moves. With strategic investments in multi-cloud infrastructure and software-defined storage solutions, companies can grow their data-driven businesses and programs with confidence, knowing that essential data is securely encrypted, disaster-proof and highly available. They won't be locked in to vendors or proprietary hardware, and can procure the best solutions and services for diverse computing needs, partners and customers. With this approach and diligent attention to specific GDPR preparations on the business side, organizations will be ready for May 2018 and poised to reap the many rewards of robust and resilient data storage and management.


— Giorgio Regni is Scality's Co-Founder and Chief Technology Officer. He oversees the company's development, research and product management. He is a recognized expert in distributed infrastructure software at web scale, and has authored multiple US patents for distributed systems.
