Security is a shared responsibility between the cloud provider and its customers, says an Amazon Web Services security architect.

Charles Babcock, Editor at Large, Cloud

November 4, 2010

8 Min Read

Is your data secure in the leading public cloud? Steve Riley, Amazon Web Services security architect, responds in no uncertain terms: it's more secure there than in your data center.

In a talk at the Cloud Computing Conference and Expo 2010, he cited several ways Amazon's Elastic Compute Cloud (EC2) operation protects users' data and applications. At the same time, he conceded that security in the cloud is "a shared responsibility." Amazon provides a secure infrastructure "from the concrete (of the data center floor) to the hypervisor," as one of his favorite expressions puts it. But an EC2 customer has to write a secure application, transport it to the cloud securely, and operate it in a secure fashion there.

AWS is working on an Internet protocol security (IPsec) tunnel connection between EC2 and a customer's data center to allow a direct management network link to EC2 virtual machines. Such a connection would let customers operate their servers in EC2 as if they were servers on the corporate network. And that allows EC2 customers "to imagine the day when they will have no more internal IT infrastructure," Riley asserted, trying to elicit a little shock and awe from his stolid listeners.

Riley addressed the Santa Clara, Calif., event in two separate sessions Tuesday, then gave a third talk Wednesday at a satellite RightScale user group meeting. In each session, he emphasized the security that Amazon Web Services currently supplies, plus what it will do in the near future to beef up its Virtual Private Cloud service. With a direct, frank style, the lanky, shaggy security guru said: "It's my job to help people get more comfortable with what makes people squirm."

For long-term storage, your data may be more secure in the cloud than in your data center because AWS' S3 storage service uses a standard cloud data preservation tactic, one also invoked by big data handlers such as Hadoop: it creates multiple copies of each data set, anticipating that a server or disk containing the data may fail. If one does, a new copy is automatically generated from a replicated set, and a new primary copy is designated from among at least three copies.

Cloud design assumes disk and server hardware failures will occur and works around them, he said. S3 also stores the copies across two availability zones in the EC2 cloud, meaning a complete power outage or catastrophe could take out one of its data centers and the data set would be reconstituted from another. Although Riley didn't say so, it appears that data loss could occur only if the hardware holding at least three copies of a particular data set failed simultaneously in disparate locations.
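A minimal sketch of that re-replication logic, in Python; the class, zone names, and replica count are illustrative assumptions, not Amazon's implementation. The point it demonstrates is that a single failure never reduces an object to zero copies, because a surviving replica immediately seeds a replacement; losing the object would require all copies to fail before any replacement could be made.

```python
import random

REPLICAS = 3  # keep at least three copies of every object

class ToyObjectStore:
    """Toy model of replicated storage (hypothetical, for illustration):
    copies are spread across zones, and a lost copy is automatically
    re-created from a surviving replica."""

    def __init__(self, zones):
        self.zones = zones       # e.g. ["zone-a", "zone-b"]
        self.placements = {}     # object key -> zones holding a copy

    def put(self, key):
        # Spread the copies across the zones, reusing zones as needed.
        self.placements[key] = [self.zones[i % len(self.zones)]
                                for i in range(REPLICAS)]

    def on_copy_lost(self, key, failed_zone):
        # A disk or server holding one copy failed: drop that copy,
        # then immediately re-replicate from any survivor.
        copies = self.placements[key]
        copies.remove(failed_zone)
        if not copies:  # would require REPLICAS simultaneous failures
            raise RuntimeError(f"object {key} lost")
        copies.append(random.choice(self.zones))

store = ToyObjectStore(["zone-a", "zone-b"])
store.put("records.dat")
store.on_copy_lost("records.dat", "zone-a")  # back to three copies
print(store.placements["records.dat"])
```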

Through this approach, S3 has stored 180 billion objects for customers over four years and "hasn't lost one of them," said Riley. It has achieved 99.999999999%, or eleven nines, of data durability, Riley claimed.

AWS allows only its own operators into its data centers. Riley isn't allowed inside, even though he is an AWS security expert. "Customers asked us if they can audit the data center. We decided it was better to keep everyone out," he said. No signs indicate to passers-by that they are next to an AWS data center; the buildings are designed to blend in with their surroundings and would be difficult to identify from a satellite photo, he said.

Data center administrators are kept out of customers' virtual machines. They need to know which operating system a customer is running in order to meet Windows licensing requirements; otherwise, they inspect no applications and touch no customer data, Riley said.

For that reason he was dismayed when a customer reported the AWS management console "had gone wild and eliminated both instances of his SQL Server database," he recounted. As Riley worked through server logs, troubleshooting the problem, he discovered that an employee of the company had created a SQL Server database table on the C drive of a virtual machine. Realizing that copy would be lost at the end of the application's run, the employee initiated a second copy on a second server. Since the backup would take some time, he went home to dinner.

In his absence, his boss discovered two now-idle Amazon Machine Images and terminated them both as wasteful, since they were running up charges of pennies per hour. In the process he deleted the only copy of the table holding the application's data. When he discovered the loss, he complained to AWS that the console had gone wild. Riley figured out that someone at the boss' home IP address had pressed the terminate button twice at the very time the man claimed the console had misbehaved.

Don't save data from an application instance to a virtual machine's C drive; that data disappears with the termination of the virtual machine, Riley said. To preserve data beyond an application's run, use AWS' Elastic Block Store (EBS) service, he advised.
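As an illustration of that advice, here is a brief sketch using the modern boto3 SDK, which postdates this article; the region, availability zone, and instance ID are placeholders. It creates a persistent EBS volume and attaches it to a running instance, so data written there survives the instance's termination:

```python
import boto3

# Placeholder region for illustration.
ec2 = boto3.client("ec2", region_name="us-east-1")

# Create a 10 GiB EBS volume; unlike an instance's ephemeral local disk,
# it persists independently of any virtual machine's lifetime.
volume = ec2.create_volume(AvailabilityZone="us-east-1a", Size=10, VolumeType="gp2")
ec2.get_waiter("volume_available").wait(VolumeIds=[volume["VolumeId"]])

# Attach the volume to the running instance as a secondary device;
# data written to it survives if the instance is later terminated.
ec2.attach_volume(
    VolumeId=volume["VolumeId"],
    InstanceId="i-0123456789abcdef0",  # placeholder instance ID
    Device="/dev/sdf",
)
```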

Riley cited the anecdote as an example of how AWS doesn't touch customer data and can't recreate it in the event of a procedural failure. The logical map between a virtual machine and a physical disk volume disappears with the decommissioning of the virtual machine, he said; Amazon EC2 administrators don't know where the data sits among thousands of servers and disks.

"We don't do any data mining, we have no derivative use. Your data belongs to you," he said.

At the same time, Riley said it is up to customers to write secure applications and send them to EC2 free of viruses, worms, and other malware. He cited an incident in which someone lodged an application inside EC2 to generate Zeus bots, malware planted on unsuspecting users' computers to harvest passwords and email addresses.

On its website, AWS states that it will inspect the virtual machines submitted to it before mounting them in its infrastructure, but the Zeus incident Riley cited indicates that the inspection is cursory at best. Riley went a step further, saying AWS can't inspect a customer's virtual machines if it is to maintain a hands-off stance on customer data. Instead, it provides security inside EC2 by watching network traffic and detecting abnormal activity; that is how it found the Zeus bot generator. It does not inspect the internal operation of customer virtual machines, he said.

Customers, as they configure their virtual machine instances, are required to define the source IP addresses and the protocols over which traffic may reach the application. Those rules become a virtual firewall on the hypervisor for that VM, protecting it from intruder traffic.
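A sketch of such a rule, again using today's boto3 SDK rather than anything shown in the talk; the group name, description, and source CIDR block are hypothetical. It admits HTTPS traffic only from one address range, and the resulting rule is enforced at the hypervisor, outside the guest VM:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Create a security group; its rules act as the hypervisor-level
# firewall described above. Name and description are placeholders.
group = ec2.create_security_group(
    GroupName="app-tier",
    Description="app servers reachable only from the corporate range",
)

# Allow inbound HTTPS (TCP 443) from one source CIDR block only;
# 203.0.113.0/24 is a documentation range standing in for a real network.
ec2.authorize_security_group_ingress(
    GroupId=group["GroupId"],
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 443,
        "ToPort": 443,
        "IpRanges": [{"CidrIp": "203.0.113.0/24"}],
    }],
)
```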

Any virtual machine generating communications traffic is forced to route that traffic off the host server and onto the data center's physical network, where it can be inspected. A virtual machine's attempt to communicate directly with another virtual machine on the same server is refused. "We prohibit instance-to-instance communication," Riley said, describing another security measure.

AWS has its web servers communicate with application security groups, and specific rules govern who may communicate with each group. The arrangement tends to keep any random traffickers who have found their way off the Internet and onto a web server from knowing what barriers stand between them and an application they might like to reach.

Riley urged his listeners to add an additional layer of security: an "inspection zone" of servers between their web server in EC2 and their executing virtual machine. All attempts to reach the virtual machine would then have to pass through the zone, where the customer could install his preferred intrusion detection, virus screening, and malware filtering, keeping those operations away from the application.

AWS launched an alternative inside the public cloud, the Virtual Private Cloud (VPC), which consists of servers dedicated to private and secure operations for VPC customers. It is designing a way to establish an IPsec tunnel between a router in an enterprise data center and a router in EC2. To do so, it will need to give end users guidance on configuring the Border Gateway Protocol (BGP), the network protocol used in inter-domain routing. "BGP is the challenging aspect of that configuration," he noted.
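For a sense of what such a configuration involves, here is an illustrative sketch using today's boto3 SDK, which reflects the VPN service as it later shipped and is not necessarily what Riley described in 2010; the router address and ASN are placeholders. A customer gateway represents the corporate router and the BGP autonomous system it advertises routes from, and the IPsec connection ties it to an Amazon-side gateway:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# The customer gateway describes the on-premises router: its public IP
# and the BGP autonomous system number it speaks from (both placeholders).
cgw = ec2.create_customer_gateway(
    Type="ipsec.1",
    PublicIp="203.0.113.12",
    BgpAsn=65000,
)

# The virtual private gateway is the Amazon-side end of the tunnel.
vgw = ec2.create_vpn_gateway(Type="ipsec.1")

# The VPN connection ties the two ends together as an IPsec tunnel,
# with BGP exchanging routes between the data center and the cloud.
vpn = ec2.create_vpn_connection(
    Type="ipsec.1",
    CustomerGatewayId=cgw["CustomerGateway"]["CustomerGatewayId"],
    VpnGatewayId=vgw["VpnGateway"]["VpnGatewayId"],
)
print(vpn["VpnConnection"]["VpnConnectionId"])
```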

Nevertheless, Riley said AWS will attempt to implement IPsec tunnel creation for beta VPC users late this year or early next, opening a new chapter of private cloud operation inside the public cloud infrastructure.

Riley said AWS has many customers eager to take advantage of such a capability and begin to shift more of their data center operation to EC2. "Think of this as an extension of your data center in our cloud," he said.

About the Author

Charles Babcock

Editor at Large, Cloud

Charles Babcock is an editor-at-large for InformationWeek and author of Management Strategies for the Cloud Revolution, a McGraw-Hill book. He is the former editor-in-chief of Digital News, former software editor of Computerworld, and former technology editor of Interactive Week. He is a graduate of Syracuse University, where he obtained a bachelor's degree in journalism. He joined InformationWeek in 2003.
