
Commentary | Adrian Lane | 5/24/2011 11:41 AM

Oracle 11G Available On AWS

When testing Oracle on Amazon AWS, consider how you will secure your data

Amazon announced the availability of Oracle 11G machine images on AWS. It's fairly apparent that what's being provided is not really intended for production deployments at this time. Still, it's a neat way for customers to test out the elasticity of Platform as a Service and sample Oracle 11G capabilities without having to invest in database licenses and new hardware.

But managing an Amazon environment is very different from managing traditional IT. Sure, you can spin up a database really fast, but to do anything with it, you'll need to configure just about every aspect of your environment. And it's harder to inspect and monitor servers in the cloud. That applies to security as well as general administration.

In Amazon's announcement, security information is limited to automatic patching of the image files and the following statement: "You can also control access and security for your instance(s) and manage your database backups and snapshots." Automated patching is a great advantage.

Unfortunately, there is a lot more to Oracle security than patching. You'll have to figure out access control and archive encryption yourself, because Amazon's guidance leaves big gaps between theory and execution in both areas. From a security perspective, there are many steps you need to perform to secure the database before you can reliably store any data, much less sensitive information.

It's pretty clear that this service -- at least for the time being -- is not for production usage. Even if you are not planning on building a production database, it's beneficial to consider the security implications while you run your tests. Here is a quick rundown of what you need to consider for access controls and archive security:

Encryption: You are going to need to encrypt your archives and snapshots; Amazon S3 is simple but not necessarily secure. I'll jump right to the point and say you'll want transparent database encryption so that any archives or snapshots are automatically encrypted. You'll need to acquire the add-on Oracle package or install an OS-layer encryption product like Vormetric. In either case, since Amazon is patching the image files, you'll need to understand how this affects additional encryption features; most likely you will need to re-apply setup scripts or re-install products on the virtual image.
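
As a quick sanity check once transparent data encryption is in place, a script like the following can confirm the wallet is open and flag unencrypted tablespaces. This is a minimal sketch: the cx_Oracle driver, the credentials, and the host name are my own placeholders, not part of Amazon's offering.

    # Sketch: verify TDE status on an Oracle 11g instance.
    # Connection details are hypothetical placeholders.
    import cx_Oracle

    conn = cx_Oracle.connect("sec_admin", "change_me", "db-host.example.com/ORCL")
    cur = conn.cursor()

    # The wallet must be OPEN for transparent data encryption to function.
    cur.execute("SELECT wrl_type, status FROM v$encryption_wallet")
    for wrl_type, status in cur:
        print(f"Wallet ({wrl_type}): {status}")

    # List tablespaces that are not encrypted so they can be reviewed.
    cur.execute("SELECT tablespace_name FROM dba_tablespaces WHERE encrypted = 'NO'")
    for (name,) in cur:
        print("Unencrypted tablespace:", name)

    conn.close()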

Authentication: You need to determine how you are going to authenticate users, internally or externally. I recommend external, but that still leaves a couple of options as to how you do it. You can create and deploy an LDAP service in the cloud, leverage Amazon services for credentials, or link back to your existing IT services. Whichever way you go, you are responsible for user setup and for validating the security of what you deploy.
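
If you go the LDAP route, the check itself is a simple bind against the directory. Here is a minimal sketch using the ldap3 package; the server address and DN layout are hypothetical, and mapping the authenticated user to a database account is left out.

    # Sketch: validate corporate credentials against an external LDAP directory.
    from ldap3 import Server, Connection, ALL

    def ldap_authenticate(username: str, password: str) -> bool:
        server = Server("ldaps://ldap.example.com", get_info=ALL)
        user_dn = f"uid={username},ou=people,dc=example,dc=com"
        try:
            # auto_bind raises if the bind (the credential check) fails.
            conn = Connection(server, user=user_dn, password=password, auto_bind=True)
            conn.unbind()
            return True
        except Exception:
            return False

    if __name__ == "__main__":
        print(ldap_authenticate("adrian", "not-a-real-password"))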

If you plan on doing anything more than basic testing and proofs of concept -- and then totally dismantling the database afterward -- here are several other things you should consider:

Certificate Management: There is a lot to manage across the machine images, disk images, virtual networking, and data; you will be running scripts to auto-configure images at startup and link all of the resources together. You'll have certificates issued to validate connections and admin capabilities, so you need to keep these certificates in a secure location and distribute them to a select few. I don't recommend keeping public and private keys in the same directory, and I definitely don't recommend installing certificates on images where they can be compromised.
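
A small audit script can catch both of those habits before an image is published. This is only a sketch; the directory path and file-naming conventions are assumptions.

    # Sketch: flag private keys that are group/other readable or stored next to certificates.
    import os
    import stat

    def audit_keys(root: str) -> None:
        for dirpath, _dirs, files in os.walk(root):
            keys = [f for f in files if f.endswith(".key")]
            certs = [f for f in files if f.endswith((".crt", ".cer", ".pem"))]
            for key in keys:
                path = os.path.join(dirpath, key)
                mode = os.stat(path).st_mode
                if mode & (stat.S_IRGRP | stat.S_IROTH):
                    print("Key readable by group/other:", path)
                if certs:
                    print("Key stored alongside certificates:", path)

    if __name__ == "__main__":
        audit_keys("/etc/oracle/tls")   # hypothetical location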

Network Access: The database should only be indirectly accessible through your applications, or through a secured connection from your existing IT environment. You have the option of creating a virtual network with Amazon's Virtual Private Cloud and controlling how database connections can be made. You'll want to set up a VPN tunnel for management connections, and you'll want to give the database a private IP address so that it cannot be publicly addressed.
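
Both of those properties are easy to verify programmatically. The sketch below uses today's boto3 SDK, which postdates this column, to flag security groups that expose the Oracle listener port to the world and instances that carry a public IP; the instance ID is a placeholder.

    # Sketch: audit listener exposure and public addressability.
    import boto3

    ORACLE_PORT = 1521
    ec2 = boto3.client("ec2")

    # Flag security group rules that open the listener port to 0.0.0.0/0.
    for sg in ec2.describe_security_groups()["SecurityGroups"]:
        for perm in sg.get("IpPermissions", []):
            if perm.get("FromPort") == ORACLE_PORT:
                for ip_range in perm.get("IpRanges", []):
                    if ip_range.get("CidrIp") == "0.0.0.0/0":
                        print("Listener port open to the world in", sg["GroupId"])

    # Confirm the database instance has no public IP address.
    resp = ec2.describe_instances(InstanceIds=["i-0123456789abcdef0"])
    for reservation in resp["Reservations"]:
        for instance in reservation["Instances"]:
            if instance.get("PublicIpAddress"):
                print("Instance is publicly addressable:", instance["PublicIpAddress"])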

Assessment: Assess the database configuration to ensure unwanted services are turned off and default passwords are reset. If you are quickly spinning up and shutting down databases, it's easy to miss configuration details, so get an assessment tool to validate security settings. Consider creating a script file that runs prior to launching the images so your systems start from a secure baseline configuration.
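
Even without a commercial assessment tool, a couple of queries will catch the most common oversights. A minimal sketch, again assuming the cx_Oracle driver and placeholder connection details:

    # Sketch: baseline checks for default passwords and risky parameters.
    import cx_Oracle

    conn = cx_Oracle.connect("sec_admin", "change_me", "db-host.example.com/ORCL")
    cur = conn.cursor()

    # Oracle 11g ships a view listing accounts that still use default passwords.
    cur.execute("SELECT username FROM dba_users_with_defpwd")
    for (username,) in cur:
        print("Default password still set for:", username)

    # A couple of security-relevant parameters worth reviewing on a fresh image.
    cur.execute("""
        SELECT name, value
          FROM v$parameter
         WHERE name IN ('remote_os_authent', 'sec_case_sensitive_logon')
    """)
    for name, value in cur:
        print(f"{name} = {value}")

    conn.close()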

Key Management: Consider how you want to manage encryption keys for the database. Sure, you can install the keys on the disk image, but that's not very secure, as they can be read by attackers. You'll likely need a key server in the cloud or, once again, support for the cloud from your existing IT environment.
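
Whatever key server you choose, the pattern is the same: store only a wrapped key with the instance and unwrap it at runtime through a call to the external service. The sketch below uses AWS KMS purely as a present-day stand-in (it did not exist when this column was written); a key server in your own data center would play the same role.

    # Sketch: keep key material off the image; unwrap it at runtime.
    import base64
    import boto3

    kms = boto3.client("kms")

    def unwrap_data_key(wrapped_key_b64: str) -> bytes:
        # Exchange the wrapped key for plaintext key material.
        # Hold it in memory only; never write it back to disk.
        resp = kms.decrypt(CiphertextBlob=base64.b64decode(wrapped_key_b64))
        return resp["Plaintext"]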

Masking: If you're testing a new application, or the viability of an existing application in the cloud, you'll need test data. Any data that you put in the cloud should not be production data until you have audited database security. Do yourself a favor and get a masking tool that will auto-generate data for you, or obfuscate existing data, before moving it into the cloud database.
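
If a full masking tool is overkill for a quick test, even a simple deterministic obfuscation step run before the data leaves your environment beats copying production values into the cloud. A minimal sketch; the column names and salt are hypothetical.

    # Sketch: derive deterministic pseudonyms so joins still work
    # while the real values never leave your environment.
    import hashlib

    SALT = b"keep-this-secret-on-premises"

    def mask_value(value: str, prefix: str) -> str:
        digest = hashlib.sha256(SALT + value.encode("utf-8")).hexdigest()[:12]
        return f"{prefix}_{digest}"

    row = {"customer_name": "Jane Smith", "email": "jane@example.com"}
    masked = {
        "customer_name": mask_value(row["customer_name"], "name"),
        "email": mask_value(row["email"], "mail") + "@example.invalid",
    }
    print(masked)
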
Adrian Lane is an analyst/CTO with Securosis LLC, an independent security consulting practice. Special to Dark Reading.