
Cloud

4/12/2019
10:30 AM
Ronan David
Commentary

Cloudy with a Chance of Security Breach

Businesses must be aware of the security weaknesses of the public cloud and not assume that every angle is covered.

The forecast calls for clouds, and they show no signs of clearing up. Cloud platforms are everywhere. In fact, IDG's 2018 Cloud Computing Study showed 73% of all businesses have placed at least one application or part of their infrastructure in the cloud. The public cloud space is particularly hot at the moment as Google moves to aggressively compete with Microsoft and Amazon.

Digital transformation has been the primary catalyst for this explosion. Typically, cloud adoption starts when some workloads are moved to public providers — usually, development and test environments that are less critical. After a while, noncritical applications might migrate over, followed by storage — files and databases — and then, bigger deployments.

The public cloud is attractive because the underlying infrastructure and the orchestration procedures hosting the application components (network functions, Internet access, security, storage, servers, and compute virtualization) are fully managed and mostly hidden. Everything is easily configurable through a simple user interface or an API. Security is enforced by using private networks isolated from the Internet… or is it?

Information on private networks hosted in a public cloud is not automatically safe. Private networks, even without direct access to the Internet, can still communicate with it via DNS. Most of the time, no specific configuration is required to get full DNS access from the workloads pushed onto public cloud infrastructure. As a result, DNS tunneling, DNS file systems, and data exfiltration are possible on most public cloud providers by default. This is not a security flaw; it's a feature built in on purpose, mainly so that workloads can reach the provider's serverless services and ease the digital transformation. It also opens any business up to a wide range of possible data leaks.
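To see why DNS makes such an effective covert channel, consider a minimal sketch (in Python, using a hypothetical attacker-controlled domain) of how arbitrary data can be packed into ordinary-looking query names:

```python
import base64

# Hypothetical attacker-controlled domain; any zone the attacker
# can answer for authoritatively would work.
EXFIL_DOMAIN = "exfil-attacker.test"

def encode_chunks(data: bytes, domain: str = EXFIL_DOMAIN, max_label: int = 63):
    """Split data into base32 chunks that respect the DNS limit of 63 bytes
    per label and append the controlled domain, yielding query names that
    a resolver forwards like any other lookup."""
    b32 = base64.b32encode(data).decode("ascii").rstrip("=").lower()
    labels = [b32[i:i + max_label] for i in range(0, len(b32), max_label)]
    return [f"{label}.{domain}" for label in labels]
```

Resolving each of these names (for example, via socket.getaddrinfo) would deliver the payload to whichever authoritative server answers for that domain, which is why outbound DNS deserves the same scrutiny as any other egress path.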

Four of the most likely scenarios are as follows:

1. Malicious code is inserted into the back end to perform DNS resolution that extracts data. Typically, access is gained from outside the organization through standard methods such as SQL injection, a heap overflow, a known vulnerability, or an unsecured API.

2. Malicious code is inserted into a widely used library so that it affects all of the library's users (a supply chain attack), regardless of the language (Java, Python, Node.js, and so on).

3. Insiders with access to a host modify, install, or develop an application that uses DNS to perform a malicious operation: contacting command and control, pushing data out, or fetching malware.

4. A developer inserts code that requires no change to the infrastructure and uses DNS to extract production data, events, or account information. Such code passes the quality gates of continuous integration and testing and is automatically deployed to production.
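All four scenarios share a visible symptom: query names carrying encoded data tend to be long and random-looking. A rough detection heuristic, with illustrative (not tuned) thresholds, might look like:

```python
import math
from collections import Counter

def label_entropy(label: str) -> float:
    """Shannon entropy, in bits per character, of a DNS label."""
    counts = Counter(label)
    total = len(label)
    return -sum(c / total * math.log2(c / total) for c in counts.values())

def looks_suspicious(qname: str, max_len: int = 52,
                     entropy_cutoff: float = 3.8) -> bool:
    """Flag query names whose first label is unusually long or high-entropy,
    a common signature of tunneled or exfiltrated data. The thresholds here
    are illustrative, not tuned values."""
    first = qname.split(".")[0]
    if len(first) > max_len:
        return True
    return len(first) >= 16 and label_entropy(first) > entropy_cutoff
```

A heuristic like this only narrows the haystack; in practice it would feed a DNS security service that correlates query volume, timing, and destinations.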

So, what can an organization do — especially when it is deploying multitiered applications on multiple cloud services?

First, a private DNS service should be deployed for any business information stored on a private network hosted in a public cloud, even temporarily. A private DNS service lets you filter what is accessible and what should not be. It also supports regular audits of the cloud architecture, which are now essential in any public cloud environment; these audits mean identifying cloud-specific patterns that are new to most systems and network architects. Most workloads don't require full access to the Internet. But sometimes it's necessary, for example, when a DevOps team needs to update the infrastructure, installation packages, or dependencies, because that simplifies the deployment phase. Businesses can approach this by designing an "immutable infrastructure" with prebuilt images, private networks, and controlled inbound and outbound communication. They should also run testing phases, especially since cloud providers' options may change faster than they can be integrated.
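The filtering such a private DNS service performs can be as simple as an allowlist of approved zones; the zone names below are hypothetical:

```python
# Illustrative allowlist of zones a workload is permitted to resolve;
# both names are hypothetical examples.
ALLOWED_SUFFIXES = (
    "internal.mycorp.example",  # hypothetical private zone
    "amazonaws.com",            # provider endpoints the workload genuinely needs
)

def resolution_allowed(qname: str) -> bool:
    """Permit a query only when the name falls under an approved zone."""
    qname = qname.rstrip(".").lower()
    return any(qname == suffix or qname.endswith("." + suffix)
               for suffix in ALLOWED_SUFFIXES)
```

Everything else, including the encoded exfiltration names described above, simply never resolves.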

Second, cloud providers offer private networking solutions for deploying internal resources and back-end services (e.g., databases, file storage, specific computation, back-office management). These address security and regulatory concerns such as data protection (e.g., GDPR) and data encryption, or simply keep parts of an application from being exposed directly to the Internet. All good practice. An even better practice is to deploy computational back-end resources on subnets or networks that are not connected to the Internet and can be reached only by known sources. Then, filtering rules can be enforced to restrict access and comply with security policies.
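The "known sources" check can be expressed directly with Python's standard ipaddress module; the subnets below are hypothetical:

```python
import ipaddress

# Hypothetical internal subnets where the front-end tiers live;
# only these may reach the back end.
KNOWN_SOURCES = [ipaddress.ip_network(n)
                 for n in ("10.10.1.0/24", "10.10.2.0/24")]

def source_permitted(client_ip: str) -> bool:
    """Accept a connection only when it originates from a known subnet."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in KNOWN_SOURCES)
```

In production the same logic lives in security groups or network ACLs rather than application code, but the principle is identical: enumerate the legitimate sources and deny the rest.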

Finally, ensure there is a flexible DDI (DNS-DHCP-IPAM) solution integrated with the cloud orchestrator to simplify configuration. DDI will automatically push the appropriate records into the configuration once a service is enabled. This not only brings considerable time savings to your organization but also enforces policies that help secure public cloud deployments.
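As a sketch of the kind of record a deployment hook might hand to a DDI solution when a service comes up (the endpoint and field names here are hypothetical, not any vendor's actual API):

```python
import json

# Hypothetical DDI REST endpoint the orchestrator hook would POST to.
DDI_API = "https://ddi.mycorp.example/api/records"

def build_record(service: str, zone: str, address: str, ttl: int = 300) -> str:
    """Assemble the A record a deployment hook could push to the DDI
    solution; the field names are illustrative only."""
    record = {
        "type": "A",
        "name": f"{service}.{zone}",
        "address": address,
        "ttl": ttl,
    }
    return json.dumps(record)
```

Because the record is created and retired by the orchestrator rather than by hand, the DNS data stays consistent with what is actually deployed, which is exactly what the filtering and auditing steps above depend on.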

Moving applications to either a public or private cloud is inevitable. And as businesses continue to transform themselves, cloud usage will tag along on the journey. But businesses need to be aware that the public cloud isn't infallible when it comes to security, and must not assume that every angle is covered. Otherwise, the convenience of the cloud will turn into an inconvenience for your data.


Ronan David develops the strategic direction for EfficientIP, which delivers fully integrated network security and automated solutions for DDI (DNS-DHCP-IPAM). He oversees EfficientIP's customer and partner relationships, resulting in corporate growth and development within ...