At the same time, Riley said it was up to customers to write secure applications and send them to EC2 free of viruses, worms, or other malware. He cited an incident in which someone uploaded an application that generated Zeus bots from inside EC2; Zeus plants malware on unsuspecting users' computers and then uses it to harvest passwords and email addresses.
On its website, AWS states that it inspects the virtual machines submitted to it before mounting them in its infrastructure, but the Zeus incident Riley cited suggests that inspection is cursory at best. Riley went a step further, saying AWS couldn't inspect a customer's virtual machines if it was to maintain a hands-off stance toward customer data. Instead, it provides security inside EC2 by watching network traffic and detecting abnormal activity; that is how it found the Zeus bot generator. It does not inspect the internal operation of customer virtual machines, he said.
As customers configure their virtual machine instances, they are required to define the IP addresses from which the application and its data will arrive and the destination protocol they will use. Those definitions become a virtual firewall on the hypervisor for that VM, protecting it from intruder traffic.
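Riley did not walk through the mechanics, but the kind of rule he described maps onto the EC2 security-group API. The following is a minimal sketch using the Python boto3 SDK; the group name, source address range, and port are hypothetical examples, not values from the talk.

```python
# Minimal sketch with the boto3 SDK; the group name, source CIDR, and
# port are hypothetical illustrations, not values from Riley's talk.
import boto3

ec2 = boto3.client("ec2")

# Create a security group to serve as the per-VM virtual firewall.
sg = ec2.create_security_group(
    GroupName="app-firewall-example",
    Description="Example per-instance firewall rules",
)

# Admit traffic only from a defined source address range on a defined
# protocol and port; anything else is dropped at the hypervisor.
ec2.authorize_security_group_ingress(
    GroupId=sg["GroupId"],
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 443,
        "ToPort": 443,
        "IpRanges": [{"CidrIp": "203.0.113.0/24"}],  # hypothetical source range
    }],
)
```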
Any virtual machine generating communications traffic is forced to route that traffic off the host server and onto the data center's physical network, where it can be inspected. A virtual machine's attempt to communicate directly with another virtual machine on the same server is refused. "We prohibit instance-to-instance communication," Riley said, describing it as another security measure.
AWS has its web servers communicate with application security groups, and specific rules govern who may communicate with each group. The arrangement keeps a random intruder who has found a way off the Internet and onto a web server from knowing what barriers lie between it and the application it might want to reach.
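Security-group rules can name another group, rather than an IP range, as the permitted source of traffic, which is one way to express the web-tier-to-application-tier restriction Riley described. A hedged sketch in boto3; the group IDs and port are hypothetical.

```python
import boto3

ec2 = boto3.client("ec2")

# Hypothetical group IDs for a web tier and an application tier.
WEB_SG = "sg-0aaa0000000000001"
APP_SG = "sg-0bbb0000000000002"

# Let the application group accept traffic only from members of the
# web-server group; stray Internet traffic that reaches a web server
# still cannot address the application tier directly.
ec2.authorize_security_group_ingress(
    GroupId=APP_SG,
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 8080,
        "ToPort": 8080,
        "UserIdGroupPairs": [{"GroupId": WEB_SG}],
    }],
)
```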
Riley urged his listeners to add another layer of security: an "inspection zone" of servers between their web server in EC2 and the virtual machine executing their application. With such a zone in place, all attempts to reach the virtual machine would have to pass through it, and the customer could install preferred intrusion detection, virus screening, and malware filtering there, keeping those operations away from the application itself.
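One way to realize such an inspection zone is to chain group-to-group rules so the application tier accepts traffic only from the inspection hosts, which in turn accept traffic only from the web tier. A sketch under the same hypothetical group IDs as above, with an added inspection group:

```python
import boto3

ec2 = boto3.client("ec2")

# Hypothetical group IDs: web tier, inspection zone, application tier.
WEB_SG = "sg-0aaa0000000000001"
INSPECT_SG = "sg-0ccc0000000000003"
APP_SG = "sg-0bbb0000000000002"

def allow_from(dest_sg, source_sg, port):
    """Permit TCP traffic into dest_sg only when it originates from source_sg."""
    ec2.authorize_security_group_ingress(
        GroupId=dest_sg,
        IpPermissions=[{
            "IpProtocol": "tcp",
            "FromPort": port,
            "ToPort": port,
            "UserIdGroupPairs": [{"GroupId": source_sg}],
        }],
    )

# Web servers may reach only the inspection zone, and the inspection
# zone (running intrusion detection, virus and malware screening) is
# the only path to the application tier.
allow_from(INSPECT_SG, WEB_SG, 8080)
allow_from(APP_SG, INSPECT_SG, 8080)
```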
AWS launched an alternative inside the public cloud, the Virtual Private Cloud (VPC), which consists of servers dedicated to private and secure operation for VPC customers. It is designing a way to establish an IPsec tunnel between a router in an enterprise data center and a router in EC2. To do so, it will need to give end users guidance on configuring the Border Gateway Protocol (BGP), the routing protocol used between network domains. "BGP is the challenging aspect of that configuration," he noted.
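That capability was still being designed when Riley spoke, but the arrangement he outlined, an IPsec tunnel whose routes are exchanged over BGP between a customer gateway and an AWS-side gateway, corresponds roughly to the VPN support EC2 later exposed. A hedged illustration with boto3; the router address and BGP ASN are placeholders.

```python
import boto3

ec2 = boto3.client("ec2")

# Placeholder values for the enterprise router's public IP and BGP ASN.
customer_gw = ec2.create_customer_gateway(
    Type="ipsec.1",
    PublicIp="198.51.100.10",   # enterprise data-center router (example)
    BgpAsn=65000,               # private ASN for the BGP session (example)
)

# Virtual private gateway on the AWS side of the tunnel.
vpn_gw = ec2.create_vpn_gateway(Type="ipsec.1")

# The VPN connection itself: an IPsec tunnel between the two gateways,
# with routes advertised over BGP.
vpn = ec2.create_vpn_connection(
    Type="ipsec.1",
    CustomerGatewayId=customer_gw["CustomerGateway"]["CustomerGatewayId"],
    VpnGatewayId=vpn_gw["VpnGateway"]["VpnGatewayId"],
)
```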
Nevertheless, Riley said AWS will attempt to make IPsec tunnels available to beta VPC users late this year or early next, opening a new chapter of private cloud operation inside the public cloud infrastructure.
Riley said AWS has many customers eager to take advantage of such a capability and begin to shift more of their data center operation to EC2. "Think of this as an extension of your data center in our cloud," he said.