Serverless Security: Risks and Best Practices

Serverless computing is a paradigm in which application developers do not need to manage servers; instead, they deploy code as functions, and compute resources are assigned on demand. It uses the Function-as-a-Service (FaaS) model, a type of cloud computing that allows developers to easily package and deploy their code without having to manage the underlying server infrastructure. FaaS is an event-driven execution architecture in which developers write logic that is deployed in containers fully managed by the platform and executed on demand.

In serverless, the servers are abstracted away from application development: the cloud provider takes responsibility for provisioning, maintaining, and scaling the server infrastructure in response to the events that trigger the deployed code. Once launched, serverless programs respond to demand and autonomously scale up and down as needed. Public cloud providers' serverless products are typically metered on demand using an event-driven execution approach, so a serverless function costs nothing while it's not in use. An example of a serverless solution might consist of a database, user authentication, a web server, a security token service (STS), and Lambda functions. The most popular FaaS offerings include Google Cloud Functions, Microsoft Azure Functions, and AWS Lambda.
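
To make the FaaS model concrete, here is a minimal sketch of an AWS Lambda-style function in Python; the handler name and event shape are illustrative, and in practice the event format depends on the triggering service.

import json

def handler(event, context):
    """Minimal FaaS-style function: the platform invokes it once per event."""
    # The platform passes the triggering event (an HTTP request from an API
    # gateway, a storage notification, a queue message, etc.) as a dict.
    name = event.get("name", "world")

    # Return an HTTP-style response for use behind an API gateway.
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

The provider only provisions compute to run this handler when an event arrives, which is why an idle function incurs no cost.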

What you'll learn

  • What serverless computing is, and what kinds of security it needs

  • Potential risks and issues associated with serverless security

  • Techniques you can use to avoid these risks

Use of Serverless Technology

One of the main reasons the use of serverless technology has increased recently is its ability to expedite the software development process. It allows developers to outsource server infrastructure management to the Cloud Service Provider (CSP) and focus on their application's functions. The main challenge in serverless, however, is that the CSP is only responsible for the security of the cloud, not security in the cloud. This means that a serverless application is not only still exposed to the risks and vulnerabilities that traditional applications face, but also faces security challenges that are unique to the serverless architecture. Application developers therefore need to take responsibility for their serverless applications through identity and access management (IAM), resource configuration, and the protection of code functions and libraries.

In this guide, we will discuss serverless security, the risks associated with the architecture, and the best practices for mitigating such challenges.

What Is Serverless Security?

Traditionally, most applications face security risks such as cross-site scripting, broken access control, database injection, sensitive data exposure, and insecure deserialization, among many others. To mitigate these, teams rely on traditional methods such as installing and configuring firewalls, using IPS tools, or applying server-based protection. This will not work for serverless architecture, which is not suited to network inspection; rather, it focuses on behavioral protection, code protection, and security permissions on the client side. Serverless security, therefore, is the additional layer of protection applied directly to the application to secure its code and functions, giving developers a compliance and security posture over their applications.

Serverless Security Risks

1: Increased Attack Surfaces

Serverless functions consume input data from a variety of event sources, including HTTP APIs, cloud storage, IoT device connections, and message queues. This significantly increases the attack surface, since some of these sources deliver untrusted message formats that standard application-layer protections may not properly inspect. The connection links used to fetch input data (such as protocols, vectors, and functions) can also become points of attack if their own vulnerabilities are exposed.
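
Because the platform will hand your function whatever arrives from these event sources, input validation has to happen inside the function itself. Below is a minimal sketch, assuming an API Gateway-style proxy event and a hypothetical two-field payload; it is illustrative rather than a complete validation layer.

import json

ALLOWED_FIELDS = {"order_id", "quantity"}  # hypothetical schema for this function

def handler(event, context):
    """Validate untrusted event input before doing any real work."""
    try:
        # API Gateway-style proxy events deliver the request body as a string.
        payload = json.loads(event.get("body") or "{}")
    except json.JSONDecodeError:
        return {"statusCode": 400, "body": "Malformed JSON"}

    # Require exactly the expected fields and enforce basic type checks
    # instead of trusting whatever the event source delivered.
    if set(payload) != ALLOWED_FIELDS or not isinstance(payload.get("quantity"), int):
        return {"statusCode": 400, "body": "Invalid request payload"}

    # ...safe to process the validated payload here...
    return {"statusCode": 200, "body": json.dumps({"accepted": payload["order_id"]})}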

2: Security Misconfiguration

Serverless applications are prone to cyber attacks due to insecure configurations in the settings and features offered by the cloud service provider. For instance, Denial-of-Service (DoS) attacks often occur in serverless applications due to misconfigured timeout settings between the functions and the host, where low concurrency limits are used as points of attack against the application. Attackers can also exploit the links between functions by injecting calls that force function executions to run longer than expected, enabling Denial-of-Wallet (DoW) attacks that inflate the cost of running the serverless functions. Publishing functions to public repositories (like GitHub) or leaving storage such as S3 buckets unprotected can also lead to DoW attacks through the leakage of sensitive data, because attackers take advantage of exposed functions with unprotected secrets and keys hardcoded in the code.
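
One way to address the hardcoded-secret problem described above is to load credentials at runtime from a managed secrets store rather than embedding them in the code or repository. A minimal sketch using AWS Secrets Manager via boto3 follows; the secret name is a placeholder.

import json

import boto3

# Create the client once, outside the handler, so warm invocations reuse it.
secrets_client = boto3.client("secretsmanager")

def get_db_credentials(secret_name="prod/orders/db"):  # placeholder secret name
    """Fetch credentials at runtime instead of hardcoding them."""
    response = secrets_client.get_secret_value(SecretId=secret_name)
    return json.loads(response["SecretString"])

def handler(event, context):
    creds = get_db_credentials()
    # ...connect to the database with creds["username"] / creds["password"]...
    return {"statusCode": 200, "body": "ok"}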

3: Broken Authentication

Serverless applications are stateless, and their microservices-style architecture exposes many moving parts, each an independent function, to authentication failures. For instance, in an application with hundreds of serverless functions, mishandling the authentication of just one of them impacts the rest of the application. Attackers can focus on a single function to gain access to the system through methods such as dictionary attacks and automated brute force.
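
A common way to harden authentication on a per-function basis is to verify a signed token on every invocation. The sketch below uses the PyJWT library to reject requests whose bearer token is missing, expired, or signed with the wrong key; the JWKS URL and audience are placeholders for your identity provider's values.

import jwt  # PyJWT
from jwt import PyJWKClient

# Placeholder identity-provider settings; these come from your CSP's user-pool
# or identity service configuration in a real deployment.
JWKS_URL = "https://example-idp.com/.well-known/jwks.json"
AUDIENCE = "my-serverless-api"

jwks_client = PyJWKClient(JWKS_URL)

def handler(event, context):
    """Reject the request unless it carries a valid, correctly signed token."""
    auth_header = (event.get("headers") or {}).get("Authorization", "")
    if not auth_header.startswith("Bearer "):
        return {"statusCode": 401, "body": "Missing bearer token"}

    token = auth_header.split(" ", 1)[1]
    try:
        signing_key = jwks_client.get_signing_key_from_jwt(token)
        claims = jwt.decode(token, signing_key.key,
                            algorithms=["RS256"], audience=AUDIENCE)
    except jwt.PyJWTError:
        return {"statusCode": 401, "body": "Invalid or expired token"}

    return {"statusCode": 200, "body": f"Hello, {claims.get('sub', 'user')}"}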

4: The Threat of Over-Privileged Functions

The serverless ecosystem relies on many independent functions, and each function has its own roles and permissions. The large number of interactions between functions can lead to individual functions being granted more rights than they need. For instance, a function that constantly accesses the database and updates other functions is a significant risk, because its broad access makes it an attractive target for malicious actors.

Serverless Security Best Practices

1. Use API Gateways as Security Buffers

One way of preventing event-data injection in serverless applications is to separate data from functions by using API gateways with HTTPS endpoints. With data being fetched from a wide range of sources, an API gateway acts as a security buffer, creating a separation between app users on the client side and the serverless functions on the backend, and reducing the attack surface by performing several security checks as a reverse proxy. Using HTTPS endpoints also lets you leverage built-in security features, such as data encryption and your provider's key management, which help safeguard stored data, environment variables, and other sensitive information.
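
Most gateways also let you attach an authorizer so that requests are screened before they ever reach your business functions. Below is a minimal sketch of an API Gateway Lambda token authorizer; the token check is deliberately simplified for illustration and would use full JWT validation in practice.

import os

def authorizer_handler(event, context):
    """API Gateway Lambda authorizer: allow or deny before the backend runs."""
    token = event.get("authorizationToken", "")

    # Simplified check for illustration only; a real authorizer would validate
    # a signed token rather than compare against an environment variable.
    effect = "Allow" if token and token == os.environ.get("EXPECTED_TOKEN") else "Deny"

    # API Gateway expects an IAM policy document in the authorizer response.
    return {
        "principalId": "user",
        "policyDocument": {
            "Version": "2012-10-17",
            "Statement": [{
                "Action": "execute-api:Invoke",
                "Effect": effect,
                "Resource": event.get("methodArn", "*"),
            }],
        },
    }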

2. Data Separation and Secure Configurations

To avoid DoW attacks, you should put prevention mechanisms in place, such as code scanning, separating commands and queries, and identifying any exposed secret keys or unlinked triggers, then configuring them to adhere to the CSP's best practices for serverless applications. Function timeouts should also be set to the minimum the workload requires, so that hijacked or runaway executions cannot be drawn out by DoS and DoW attackers.
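
These settings can usually be applied in your deployment templates or directly through the provider's API. As a rough sketch, the boto3 calls below tighten an AWS Lambda function's timeout and cap its concurrency; the function name and values are illustrative.

import boto3

lambda_client = boto3.client("lambda")
FUNCTION_NAME = "orders-processor"  # placeholder function name

# Keep the timeout as low as the workload allows, so a hijacked or runaway
# invocation cannot run (and bill) for minutes at a time.
lambda_client.update_function_configuration(
    FunctionName=FUNCTION_NAME,
    Timeout=10,  # seconds
)

# Cap concurrent executions so a flood of malicious events cannot scale the
# function, and the bill, without bound.
lambda_client.put_function_concurrency(
    FunctionName=FUNCTION_NAME,
    ReservedConcurrentExecutions=50,
)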

3. Dealing with Insecure Authentication

To mitigate the risk of broken authentication, you need to implement multiple specialized access control and authentication services. To make authentication harder to break, you can use the CSP's access control solutions, including OAuth, SAML, OpenID Connect (OIDC), and multi-factor authentication (MFA). Additionally, you can enforce customized password complexity requirements and policies around length and character types, which makes passwords harder to crack.
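
As one example of enforcing such a policy, the sketch below uses boto3 to set a password policy on an Amazon Cognito user pool; the pool ID is a placeholder, and MFA can be enabled on the same pool once SMS or TOTP is configured.

import boto3

cognito = boto3.client("cognito-idp")
USER_POOL_ID = "us-east-1_EXAMPLE"  # placeholder user pool ID

# Enforce a customized password policy: long passwords with mixed character
# types are far more resistant to dictionary and brute-force attacks.
cognito.update_user_pool(
    UserPoolId=USER_POOL_ID,
    Policies={
        "PasswordPolicy": {
            "MinimumLength": 14,
            "RequireUppercase": True,
            "RequireLowercase": True,
            "RequireNumbers": True,
            "RequireSymbols": True,
        }
    },
)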

4. Sufficient Serverless Monitoring and Logging

To get in-depth visibility into all of the functions within a serverless application, you need to invest in an extensive observability and monitoring tool. Relying solely on the logging and monitoring tools provided by the CSP is not sufficient, since they do not cover the application layer. This is a significant risk: application-layer event data is exposed to security attacks and can act as an entry point for attackers if it is not monitored closely.
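
Whatever tool you choose, it helps if the functions themselves emit structured, per-invocation logs that the tool can index and alert on. A minimal sketch follows; the field names are illustrative.

import json
import logging
import time

logger = logging.getLogger()
logger.setLevel(logging.INFO)

def handler(event, context):
    """Emit one structured, application-layer log line per invocation."""
    start = time.time()
    source_ip = (event.get("requestContext", {})
                 .get("identity", {})
                 .get("sourceIp", "unknown"))

    # ...business logic would run here...

    # A single JSON line per invocation is easy for log pipelines to parse.
    logger.info(json.dumps({
        "function": getattr(context, "function_name", "unknown"),
        "source_ip": source_ip,
        "duration_ms": round((time.time() - start) * 1000, 2),
        "outcome": "success",
    }))
    return {"statusCode": 200, "body": "ok"}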

5. Minimize Privileges

The best practice for minimizing privileges in independent functions is to separate functions from one another and limit their interactions by provisioning each function with its own IAM role and only the rights it needs. This also means ensuring that the code runs with the fewest permissions required to perform its task successfully.
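
In practice this means giving each function its own narrowly scoped role. The sketch below attaches an inline policy that allows only two actions on a single table, using boto3; the role, table, and policy names are hypothetical.

import json

import boto3

iam = boto3.client("iam")

# Hypothetical names: one dedicated role per function, scoped to one resource.
ROLE_NAME = "orders-processor-role"
TABLE_ARN = "arn:aws:dynamodb:us-east-1:123456789012:table/orders"

least_privilege_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        # Only the actions this specific function needs, on one resource,
        # instead of a broad wildcard policy shared by every function.
        "Effect": "Allow",
        "Action": ["dynamodb:GetItem", "dynamodb:PutItem"],
        "Resource": TABLE_ARN,
    }],
}

iam.put_role_policy(
    RoleName=ROLE_NAME,
    PolicyName="orders-processor-least-privilege",
    PolicyDocument=json.dumps(least_privilege_policy),
)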

6. Separate Application Development Environments

One of the best development practices is to support continuous integration and deployment (CI/CD) by separating the development, staging, and production environments. This ensures that proper vulnerability management is prioritized at every development stage before a version of the code is promoted. It also enables continuous testing and improvement of the application through patch prioritization, safeguarding updates, and identifying vulnerabilities, which helps developers stay ahead of attackers.

Staying Ahead of Serverless Security Risks With Sysdig

With the increased uptake of cloud platforms and serverless architecture, abstraction has become the norm for how workloads are run. The use of containers and managed container services (such as Amazon EKS and ECS) as hosts is constantly increasing within the serverless ecosystem. The main challenge of containerization, however, is container security: the process of ensuring that security controls are in place to protect the underlying infrastructure, runtime, and data in containerized applications.

To expedite innovation and increase standardization in the Container-as-a-Service (CaaS) space, Sysdig created Falco, which helps detect threats across containers, cloud-native hosts, and Kubernetes. Sysdig has also built serverless agents for AWS Fargate to make the Container-as-a-Service model more convenient to use and easier to monitor for security events inside the containers. Sysdig's product line can help address most of the serverless security challenges discussed above and help you implement the best practices for a seamless serverless product.