Oracle 11G Available On AWS
When testing Oracle on Amazon AWS, consider how you will secure your data
Amazon announced availability of Oracle 11G machine images on Amazon AWS. It's fairly apparent that what's being provided is not really intended for production deployments at this time. Still, it's a neat way for customers to test out the elasticity of Platform as a Service and sample Oracle 11G capabilities without having to invest in database licenses and new hardware.
But managing an Amazon environment is very different from managing traditional IT. Sure, you can spin up a database really fast, but to do anything with it you'll need to configure just about every aspect of your environment. And it's harder to inspect and monitor servers in the cloud. That applies to security as well as general administration.
In Amazon's announcement, security information is limited to automatic patching of the image files, and the following: "You can also control access and security for your instance(s) and manage your database backups and snapshots." Automated patching is a great advantage.
Unfortunately, there is a lot more to Oracle security than patching. Amazon's guidance leaves big gaps between theory and execution for access controls and archive security, and closing those gaps is on you. There are a number of steps you need to perform to secure the database before you can reliably store any data, much less sensitive information.
It's pretty clear that this service -- at least for the time being -- is not for production usage. Even if you are not planning on building a production database, it's beneficial to consider the security implications while you run your tests. Here is a quick rundown of what you need to consider for access controls and archive security:
Encryption: You are going to need to encrypt your archives and snapshots; Amazon S3 is simple but not necessarily secure. I'll jump right to the point and say you'll want transparent database encryption so that any archives or snapshots are automatically encrypted. You'll need to acquire the add-on Oracle package or install an OS-layer encryption product like Vormetric. In either case, since Amazon is patching the image files, you'll need to understand how this affects the additional encryption features; most likely you will re-apply setup scripts or re-install products onto the virtual image.
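To make this concrete, here is a minimal sketch of the kind of check worth running before taking any snapshots -- it assumes the cx_Oracle Python driver and an account with DBA privileges, and the connection details are placeholders:

```python
# Sketch: verify TDE status before snapshotting. Assumes cx_Oracle is installed
# and the account can read DBA_TABLESPACES and V$ENCRYPTION_WALLET.
# Connection details below are placeholders.
import cx_Oracle

conn = cx_Oracle.connect("system", "change_me", "ec2-host.example.com:1521/ORCL")
cur = conn.cursor()

# Which tablespaces are encrypted? Anything holding sensitive data should say YES.
cur.execute("SELECT tablespace_name, encrypted FROM dba_tablespaces")
for name, encrypted in cur:
    print(f"{name}: encrypted={encrypted}")

# Is the wallet open? If not, TDE-protected data is unreadable and new
# encrypted tablespaces cannot be created.
cur.execute("SELECT wrl_type, status FROM v$encryption_wallet")
for wrl_type, status in cur:
    print(f"wallet ({wrl_type}): {status}")

cur.close()
conn.close()
```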
Authentication: You need to determine how you are going to authenticate users -- internally or externally. I recommend external, but that still leaves a couple of options for how you do it. You can create and deploy an LDAP service in the cloud, leverage Amazon services for credentials, or link back to your existing IT services. Whichever way you go, you are responsible for user setup and for validating the security of what you deploy.
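If you do stand up an LDAP service, a quick sanity check is to attempt a bind before wiring the database to the directory. A minimal sketch, assuming the ldap3 Python library; the server address and DNs are hypothetical:

```python
# Sketch: confirm an LDAP bind works before pointing the database at the
# directory. Host, user DN, and password below are hypothetical placeholders.
from ldap3 import Server, Connection, ALL

server = Server("ldaps://directory.example.com", get_info=ALL)
conn = Connection(
    server,
    user="cn=dbadmin,ou=users,dc=example,dc=com",
    password="change_me",
)

if conn.bind():
    print("Bind succeeded -- directory is reachable and credentials are valid.")
    conn.unbind()
else:
    print(f"Bind failed: {conn.result}")
```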
If you plan on doing anything more than basic testing and proof of concepts -- and then totally dismantling the database afterwards -- here are several other things you should consider:
Certificate Management: There is a lot to manage across machine images, disk images, virtual networking, and data, and you will be running scripts to auto-configure images at startup and link all of these resources together. Certificates are issued to validate connections and administrative capabilities, so you need to keep those certificates in a secure location and distribute them to a select few. I don't recommend keeping public and private keys in the same directory, and I definitely don't recommend installing certificates on images where they can be compromised.
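As a simple illustration, here is a sketch -- standard library only, with a placeholder scan path -- that walks a directory tree and flags locations where private keys sit next to certificates, exactly the arrangement you want to avoid on a machine image:

```python
# Sketch: flag directories where private keys are stored alongside certificates.
# Pure standard library; the scan path is a placeholder.
import os
from collections import defaultdict

SCAN_ROOT = "/opt/oracle/keys"  # placeholder path

findings = defaultdict(lambda: {"certs": [], "keys": []})

for dirpath, _, filenames in os.walk(SCAN_ROOT):
    for name in filenames:
        path = os.path.join(dirpath, name)
        try:
            with open(path, "r", errors="ignore") as fh:
                head = fh.read(4096)
        except OSError:
            continue
        if "BEGIN CERTIFICATE" in head:
            findings[dirpath]["certs"].append(name)
        if "PRIVATE KEY" in head:  # matches RSA/EC/PKCS#8 PEM headers
            findings[dirpath]["keys"].append(name)

for dirpath, found in findings.items():
    if found["certs"] and found["keys"]:
        print(f"WARNING: keys and certs co-located in {dirpath}: "
              f"{found['keys']} / {found['certs']}")
```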
Network Access: The database should only be indirectly accessible through your applications, or through a secured connection from your existing IT environment. You have the option of creating a virtual network with Amazon's Virtual Private Cloud (VPC) and controlling how database connections can be created. You'll want to set up a VPN tunnel for management connections, and you'll want to give the database a private IP address so that it cannot be publicly addressed.
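For illustration, a minimal sketch using the AWS boto3 SDK that creates a security group allowing the Oracle listener port only from an internal address range, with no public ingress -- the VPC ID, region, and CIDR range are placeholders:

```python
# Sketch: lock the Oracle listener port down to an internal CIDR range.
# Assumes boto3 with credentials configured; VPC ID and CIDR are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

sg = ec2.create_security_group(
    GroupName="oracle-db-private",
    Description="Oracle listener reachable only from the app/VPN subnet",
    VpcId="vpc-0123456789abcdef0",  # placeholder
)

ec2.authorize_security_group_ingress(
    GroupId=sg["GroupId"],
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 1521,
        "ToPort": 1521,
        "IpRanges": [{
            "CidrIp": "10.0.0.0/16",  # internal range only -- no 0.0.0.0/0
            "Description": "app tier / VPN subnet",
        }],
    }],
)
print(f"Created {sg['GroupId']} with listener access restricted to 10.0.0.0/16")
```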
Assessment: Assess the database configuration to ensure you have turned off unwanted services and reset default passwords. If you are quickly spinning up and shutting down databases, it's easy to miss configuration details, so get an assessment tool to validate security settings. Consider creating a script that runs prior to launching the images so your systems start from a secure baseline configuration.
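A quick sketch of the kind of check such a script might run -- again assuming cx_Oracle and DBA privileges, with placeholder connection details -- listing accounts still carrying Oracle-supplied default passwords and any accounts left open:

```python
# Sketch: flag accounts still using Oracle-supplied default passwords and list
# open accounts. Assumes cx_Oracle and DBA privileges; connection details are
# placeholders.
import cx_Oracle

conn = cx_Oracle.connect("system", "change_me", "ec2-host.example.com:1521/ORCL")
cur = conn.cursor()

# 11g ships the DBA_USERS_WITH_DEFPWD view for exactly this check.
cur.execute("SELECT username FROM dba_users_with_defpwd")
for (username,) in cur:
    print(f"DEFAULT PASSWORD still set for: {username}")

# Open accounts deserve a second look -- lock anything you don't need.
cur.execute("SELECT username FROM dba_users WHERE account_status = 'OPEN'")
for (username,) in cur:
    print(f"Open account: {username}")

cur.close()
conn.close()
```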
Key Management: Consider how you want to manage encryption keys for the database. Sure, you can install the keys on the disk image, but that's not very secure, as they can be read by attackers. You'll likely need a key server in the cloud or, once again, support from your existing IT environment.
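One pattern is to keep the wallet password off the image entirely and fetch it at startup from a key server you control, over the VPN tunnel described above. A rough sketch of that idea -- the key-server URL is hypothetical, and it assumes the requests and cx_Oracle libraries plus SYSDBA access:

```python
# Sketch: fetch the TDE wallet password from an external key server at startup
# instead of storing it on the disk image. The key-server URL is hypothetical;
# assumes requests and cx_Oracle, with the request made over a trusted channel
# (VPN/TLS).
import requests
import cx_Oracle

# Hypothetical internal key server, reachable only over the VPN.
resp = requests.get("https://keys.internal.example.com/oracle/wallet", timeout=10)
resp.raise_for_status()
wallet_password = resp.text.strip()

conn = cx_Oracle.connect("sys", "change_me", "ec2-host.example.com:1521/ORCL",
                         mode=cx_Oracle.SYSDBA)
cur = conn.cursor()
# Open the TDE wallet for this instance; the password never lands on the
# image's disk. (IDENTIFIED BY cannot take a bind variable, hence the literal.)
cur.execute(f'ALTER SYSTEM SET ENCRYPTION WALLET OPEN IDENTIFIED BY "{wallet_password}"')
cur.close()
conn.close()
```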
Masking: If you're testing a new application, or the viability of an existing application in the cloud, you'll need test data. Any data that you put in the cloud should not be production data until you have audited database security. Do yourself a favor and get a masking tool that will auto-generate data for you, or obfuscate existing data, before moving it into the cloud database.
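If all you need is plausible test data, even a small script can deterministically obfuscate identifying columns before an extract ever leaves your environment. A minimal sketch using only the standard library; the file and column names are hypothetical:

```python
# Sketch: obfuscate identifying columns in a CSV extract before loading it into
# the cloud database. Standard library only; file and column names are
# hypothetical.
import csv
import hashlib

def mask(value: str) -> str:
    """Replace a value with a short, consistent token so joins still line up."""
    return hashlib.sha256(value.encode("utf-8")).hexdigest()[:12]

SENSITIVE_COLUMNS = {"name", "email", "ssn"}  # hypothetical column names

with open("customers.csv", newline="") as src, \
     open("customers_masked.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        for col in SENSITIVE_COLUMNS & set(row):
            row[col] = mask(row[col])
        writer.writerow(row)
```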
About the Author
Adrian Lane is an analyst/CTO with Securosis LLC, an independent security consulting practice. Special to Dark Reading.