The new division of responsibility moves some security concerns off a business's plate while changing priorities for other risks.

Guy Podjarny, CEO & Cofounder, Snyk

March 21, 2018


Serverless computing is an exciting new approach in the world of cloud infrastructure that lets developers write small code functions and publish them to the cloud, where the platform runs them on demand. This new model overhauls many aspects of operations and unlocks opportunities for cost reduction, and large and small organizations are trying it out.

Alongside its operations impact, serverless computing and its underlying function-as-a-service (FaaS) platform also carry significant security implications. The new division of responsibility moves some security concerns off a business's plate while simultaneously magnifying or shuffling priorities for other risks.

Risks You Can Worry About Less
First and foremost, serverless computing, as its name implies, lowers the risks involved with managing servers. While the servers clearly still exist, they are no longer managed by the application owner and are instead taken care of by the cloud platform operators, such as Google, Microsoft, or Amazon. Efficient and secure handling of servers is a core competency for these platforms, and so it's far more likely they will handle it well.

The biggest concern you can eliminate is vulnerable server dependencies. Patching a single server regularly is easy enough, but it is quite hard to achieve at scale. As an industry, we are notoriously bad at tracking vulnerable operating system binaries, leading to one breach after another. Stats from Gartner predict this trend will continue into and past 2020. With a serverless approach, patching servers is the platform's responsibility.

Beyond patching, serverless reduces the risk of a denial-of-service (DoS) attack. No server management also means no capacity management, as FaaS automatically provisions ad hoc servers to meet incoming demand. Such optimal scaling reduces the chance of an outage, including one attempted deliberately through a DoS attack. Attacks trying to take down a server will be stopped, as the platform kills the crippled server within seconds while launching new ones for new clients. Serverless computing won't help against a high-volume distributed DoS attack, but the risk and damage of a DoS attack are greatly diminished.

Lastly, FaaS offers an opportunity to apply fine-grained permission control. Each deployed function has to be granted explicit access to data, services, and other functions, and similar policies control who can invoke each function in the first place. Since functions are smaller than full applications, we can greatly reduce the number of code paths that access our sensitive data, as well as reduce the damage an attacker can do following a successful exploit. This granularity offers a great security opportunity, but it requires additional effort in configuring and maintaining such accurate policies.
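To make the idea concrete, here is a minimal sketch of what such a per-function policy can look like on AWS, as a standard IAM policy document scoped to read-only actions on a single table. The region, account ID, and table name are hypothetical placeholders; a real policy would use your own resource ARNs and would typically be attached to that one function's execution role only.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["dynamodb:GetItem", "dynamodb:Query"],
      "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/Orders"
    }
  ]
}
```

A function that only reads orders gets exactly these permissions and nothing else; a sibling function that writes orders would carry its own, separately scoped policy.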

The Risks That Bubble to the Top
Unfortunately, every architecture has its flaws, and serverless computing also triggers an increase in certain risks, caused by the statelessness and flexibility that also make it shine. In addition, by mitigating the above concerns, serverless computing draws attacker attention to other attack vectors, which remain open.

The first concern to grow with FaaS is the movement and storage of data. Since serverless forces all functions to be stateless, sensitive cached data, such as user sessions and negotiated keys, cannot be kept in memory and must be moved to and stored in an external location. Moving the data risks leaking it in the process, and storing the data elsewhere requires security controls on the new database, and may have compliance implications as well. These data concerns are not new, but since data is moved and stored more often, the risk of a security failure grows.
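The pattern can be sketched in a few lines of Python. This is an illustrative stand-in, not a production design: the dict simulates an external store such as Redis or DynamoDB, and the hard-coded signing key is a placeholder for one you would load from a secrets manager. The point is that every invocation round-trips state through the store, and the data is integrity-checked on the way back in.

```python
import hashlib
import hmac
import json

# Stand-in for an external session store (e.g., Redis or DynamoDB);
# in a real deployment these would be network calls, not dict lookups.
SESSION_STORE = {}

SIGNING_KEY = b"rotate-me"  # hypothetical; load from a secrets manager

def save_session(session_id, data):
    """Persist session state outside the function, since FaaS instances
    are stateless and may be recycled between invocations."""
    payload = json.dumps(data).encode()
    # Sign the payload so tampering in the external store is detectable.
    tag = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    SESSION_STORE[session_id] = (payload, tag)

def load_session(session_id):
    """Fetch and verify session state before trusting it."""
    payload, tag = SESSION_STORE[session_id]
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("session data was tampered with")
    return json.loads(payload)

def handler(event):
    """A stateless function: all state round-trips through the store."""
    session = load_session(event["session_id"])
    session["views"] = session.get("views", 0) + 1
    save_session(event["session_id"], session)
    return session["views"]
```

Note that the extra moves are exactly where the new risk lives: the store needs its own access controls, and the data needs protection in transit and at rest.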

Beyond data, serverless apps also make greater use of third-party services. Due to its event-driven nature, as well as the mentioned requirement that functions remain stateless, serverless applications rely more heavily on third-party services than typical apps. These services may be offered by the cloud platform itself or by external providers, and range from authentication to storage to email and messaging services. Each interaction with a third party needs multiple security controls, and the resulting dependency chain is only as strong as its weakest link.

This third-party risk also applies to software, in the form of vulnerable open source libraries. Similar in nature to server dependencies, vulnerable application dependencies can cause serious harm, as demonstrated when Equifax was breached through a vulnerable Java library. Functions make heavy use of these libraries, most commonly pulled from npm and PyPI, and many FaaS platforms fetch them as part of their built-in provisioning. The platforms do not, however, manage these dependencies, which means you must monitor for known vulnerabilities in application dependencies yourself to remain secure.
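At its core, that monitoring means comparing your pinned dependencies against an advisory database. The toy sketch below shows the shape of the check; real tooling consults live advisory feeds rather than a hard-coded dict, and the library name and CVE identifier here are made up for illustration.

```python
# Made-up advisory data for illustration; real tools query a live
# vulnerability database instead of a hard-coded mapping.
KNOWN_VULNERABLE = {
    ("example-lib", "1.2.0"): "CVE-XXXX-YYYY (hypothetical)",
}

def parse_requirements(text):
    """Parse 'name==version' lines from a requirements.txt-style string."""
    deps = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        name, _, version = line.partition("==")
        deps.append((name.lower(), version))
    return deps

def audit(text):
    """Return an advisory for every pinned dependency that matches one."""
    return [
        (name, version, KNOWN_VULNERABLE[(name, version)])
        for name, version in parse_requirements(text)
        if (name, version) in KNOWN_VULNERABLE
    ]
```

Because new vulnerabilities are disclosed in old versions all the time, the check has to run continuously, not just at deploy time.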

Last but not least, a serverless approach increases your attack surface. Breaking up your applications into small functions allows for great flexibility as you combine functions in different ways, but it also exposes greater risk. Functions may be invoked in many different execution sequences, and they can't rely on input validation, authorization, or similar controls having already happened. To properly secure a FaaS application, make sure each function maintains its own perimeter, and invest in security libraries and processes that help make such defense in depth easier.
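In practice, "maintains its own perimeter" means each handler re-validates its input instead of assuming an upstream function already did. A minimal sketch, with a hypothetical refund handler and field names chosen for illustration:

```python
def _validate(event):
    """Defense in depth: this function re-checks its own input rather
    than trusting whichever function or event source invoked it."""
    if not isinstance(event.get("user_id"), str) or not event["user_id"]:
        raise ValueError("missing or invalid user_id")
    if not isinstance(event.get("amount"), int) or event["amount"] <= 0:
        raise ValueError("amount must be a positive integer")
    return event

def refund_handler(event, context=None):
    """Entry point in the style of a FaaS handler."""
    event = _validate(event)  # never assume a caller validated already
    # ... authorization check and business logic would go here ...
    return {"status": "ok", "refunded": event["amount"]}
```

The same validation runs no matter which execution sequence reaches this function, which is exactly the property the flexible composition of functions would otherwise take away.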

Is Serverless Better for Security?
Serverless computing offers an incredible opportunity to accelerate the pace at which we develop applications while dramatically reducing the cost of operating them. With the powerful benefits it brings to the table, it seems clear it is here to stay, and its adoption will rapidly grow.

Like many things, a serverless approach doesn't clearly improve or worsen security; it simply changes priorities. Notably, it reduces the security concerns revolving around servers and raises the ones related to applications. In a serverless context, worry less about capacity planning and server dependencies and more about moving data, applying permissions, and managing vulnerable application dependencies.

By adjusting our priorities and focusing on these updated concerns, we can make the reduction of servers lead to a reduction of risk.


About the Author(s)

Guy Podjarny

CEO & Cofounder, Snyk

Guy Podjarny (@guypod) is a cofounder at Snyk.io focusing on securing open source code. Guy was previously CTO at Akamai following their acquisition of his startup, Blaze.io. Prior to that, Guy worked on the first web app firewall & security code analyzer, and dealt with security in the IDF. Guy is a frequent conference speaker, the author of O'Reilly's "Responsive & Fast," "High Performance Images" and an upcoming Open Source Security book.

