Cloud Security Lessons from the Voter Data Leak
The massive data leak caused by a poorly configured Amazon S3 bucket could easily happen to any organization that has not adopted proper cloud security measures.
In addition to exposing personal data on 198 million American voters, Deep Root Analytics' data leak this week exposed dangerous cloud security missteps that should serve as a cautionary tale for businesses.
The compromised data consisted of millions of records containing personal information, including birthdates, phone numbers, self-reported racial background, home and mailing addresses, and party affiliation. It was stored in an Amazon Web Services S3 storage bucket owned by Deep Root Analytics, a data analytics firm working on behalf of the Republican National Committee (RNC).
Deep Root had set its S3 storage bucket files to public instead of private, a mistake that left them viewable on the open Internet.
Most of the records were configured to allow downloads, and the files could be accessed without a password, according to UpGuard, which discovered and reported the leak.
Deep Root's data leak can serve as a lesson to businesses planning to make a secure transition to the cloud.
"Amazon, and all cloud service models, are easy to deploy, set up, and manage, but out of the box, they’re not secure by default," says Chris Pierson, chief security officer for Viewpost. Engineers have to go in and choose the access control list for the S3 instance they're setting up, choose to turn on encryption, and select identity and management rights for the S3 bucket.
The incident highlights the hazard of outsourcing, he continues. Businesses planning to outsource services to third parties, as the RNC did with Deep Root, should set up an information assurance program to ensure the right data security policies are in place.
As part of this type of program, businesses vet potential third parties through audits, website scanning, and penetration and vulnerability tests. They should ensure the company storing their data has the right infrastructure, people, and policies in place to secure it. Who can access the data? Is it encrypted?
"The biggest thing the RNC could have done - and I don't know if they did - was ensure they have an information assurance program that is in place, operating, and reviewing the risk third parties have to their organization," he emphasizes. "It's all about risk."
Votiro CEO Itay Glick calls Deep Root's mistake "careless" and says any company providing consumer services needs to protect itself with basic security steps: properly setting default credentials, enabling two-factor authentication, and ensuring a vendor is using encryption.
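To make one item on Glick's checklist concrete, a minimal sketch (again boto3, making no claims about any particular vendor's environment) might audit which IAM users have no multi-factor authentication device enrolled:

```python
import boto3

iam = boto3.client("iam")

# Flag IAM users who have no MFA device enrolled.
users_without_mfa = []
for page in iam.get_paginator("list_users").paginate():
    for user in page["Users"]:
        devices = iam.list_mfa_devices(UserName=user["UserName"])["MFADevices"]
        if not devices:
            users_without_mfa.append(user["UserName"])

print("Users missing MFA:", users_without_mfa)
```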
The data leak could have broader implications if threat actors gain access to the information and use it for microtargeting, a common strategy used among political parties to define and appeal to voters.
"While this data leak is bad, what is worse is the potential of this data falling into the wrong hands," says Steve Malone, director of security product management at Mimecast."[Microtargeting] is an incredibly powerful tool when in the hands of a cybercriminal, who can use this data to implement very targeted spearphishing and social engineering attacks."
This could happen again
Experts agree this type of leak could be replicated. "UpGuard's capabilities can be used by nation-states, cybercriminals, anyone out there," notes Pierson. "As people move more to the cloud, as they don't implement the same security measures and don't implement the same types of controls, there will be data leakage and exposures like this. You can bet cybercriminals will try to expose that."
In general, corporate assets should have the same protection regardless of where they reside, says Anthony Giandomenico, senior security strategist and researcher at Fortinet FortiGuard Labs. Many errors made in the data center carry over to the cloud and are amplified there, often the result of an "out of sight, out of mind" mentality toward cloud storage.
"As assets move to the cloud, there is the potential to lose visibility," he says. "Also, sometimes, companies that initially move assets to the cloud leave the connections open to the Internet with just a simple password."
He advises that companies define a standard level of security configuration for all assets and use a monitoring process to ensure those assets stay within that security baseline.
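One way to put that advice into practice is a scheduled check against the baseline. The sketch below is a boto3 example built on the assumption that the baseline forbids any public bucket grants; it simply flags S3 buckets whose ACLs grant access to anonymous or all-authenticated users, rather than serving as a complete monitoring tool.

```python
import boto3

s3 = boto3.client("s3")

# Grantee URIs that indicate access for anonymous or all authenticated AWS users.
PUBLIC_GROUPS = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

# Report any bucket whose ACL grants access to one of the public groups.
for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    acl = s3.get_bucket_acl(Bucket=name)
    for grant in acl["Grants"]:
        grantee = grant.get("Grantee", {})
        if grantee.get("URI") in PUBLIC_GROUPS:
            print(f"{name}: public grant ({grant['Permission']})")
```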
Many businesses are still struggling to determine who is responsible for securing cloud-based data. A new survey from Barracuda Networks found 71% of IT decision-makers believe cloud providers are responsible for customer data in the public cloud, and 66% believe cloud providers are responsible for their applications in the public cloud. Under the shared responsibility model, however, customers remain responsible for securing their own data, applications, and access configurations.
Lack of skilled talent is part of the problem, notes Bufferzone CTO Eyal Dotan. Five years ago, a security engineer's worst fear was that a hostile employee might access resources from an internal server. Now the threats are much bigger.
"Now those engineers with that same training are taken into the cloud, and thus into a more hostile public, where your servers can be accessed both by your regular employees or some hacker on the other side of the world," he explains.
"Jumping into the cloud era, they need to be more trained and skilled as they are confronted with larger and more hostile potential threats."