Top 3 Serverless Mistakes
October 17, 2022

Tal Melamed
Contrast Security

Ever experience a serverless nightmare?

Hacker News contributor "huksley" has, and it's been a pricey wake-up call about the need to understand the complexities of parameters in a serverless environment.

According to the tale of woe they posted earlier this year, huksley wound up DDoSing themselves: They had accidentally created a serverless function that called itself in a recursive loop, which ran for 24 hours before it was caught, consumed over 70 million Gbps and ran up a "shocking" bill of $4,600.

That's just one of the top serverless mistakes, and one caused by not realizing that AWS Billing alerts don't work reliably for the AWS CloudFront content delivery network: CloudFront collects charge information from all regions, which takes time and thus delays the billing alert, as huksley detailed.
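To make that failure mode concrete, here is a minimal Python sketch (using boto3) of a Lambda handler that re-invokes itself to continue a job and caps the recursion with a depth counter. The guard is an illustrative suggestion, not the fix huksley described:

import json
import boto3

lambda_client = boto3.client("lambda")

MAX_DEPTH = 10  # hard stop so a bug cannot loop (and bill) forever


def handler(event, context):
    depth = event.get("depth", 0)
    if depth >= MAX_DEPTH:
        # Refuse to recurse further; log and stop instead of looping.
        print(f"Recursion limit reached at depth {depth}; stopping.")
        return {"status": "stopped", "depth": depth}

    # ... do one unit of work here ...

    # Re-invoke this same function asynchronously with the counter incremented.
    lambda_client.invoke(
        FunctionName=context.function_name,
        InvocationType="Event",
        Payload=json.dumps({"depth": depth + 1}),
    )
    return {"status": "continued", "depth": depth}

Without the counter, the first invocation fans out indefinitely, which is exactly the kind of loop that ran unchecked for 24 hours.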

Read on for what we see as the top three serverless mistakes that can similarly get you into trouble.

Serverless: The new buzzword

First, some background about why the word "serverless" is becoming a buzzword in the application world. The term "serverless" refers to a cloud-native development model that allows organizations to build and run their applications without having to manage the underlying server infrastructure.

Serverless applications offer instant scalability, high availability, greater business agility and improved cost efficiency. This dynamic flexibility helps save time and money across the entire software development life cycle (SDLC). An August 2022 report on the global serverless apps market forecasts that the market will record a compound annual growth rate (CAGR) of roughly 23% between 2022 and 2031.

Still, serverless application security (AppSec) remains a serious issue. As it is, traditional application security testing (AST) tools cannot provide adequate coverage, speed or accuracy to keep pace with the demands of serverless applications. In fact, as of April 2021, concerns about the dangers of misconfigured or quickly spun-up cloud-native (serverless or container-based) workloads had increased nearly 10% year-over-year.

There's good reason for the growing concern: For one, malicious actors are already targeting AWS. Here are what we see as the three biggest serverless mistakes:

Mistake No. 1: Not understanding security gaps

Organizations think that AWS manages security for them, but that is not fully true. Once you write your own code, that code, and everything you run on top of the AWS Lambda infrastructure, falls under your responsibility as a developer or organization.

As such, you have to consider both code and configuration, because the code is always the customer's responsibility.

Put plainly, under the AWS shared-responsibility model, organizations cannot simply rely on perimeter security. Rather, they need to protect themselves.

AWS is responsible for securing the underlying infrastructure, but developers must secure their serverless workloads or functions themselves, given that in serverless there's no perimeter to secure. Rather, each Lambda function must secure itself by following the "zero-trust" model, which entails:

1. Thorough, continuous authentication and authorization based on all available data (a rough code sketch of this principle follows the list below).

2. The use of least-privilege access.

3. The assumption that a breach has already occurred, which motivates the visibility provided by end-to-end encryption and usage analytics and, in turn, leads to improved defenses and threat detection.
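As a rough illustration of the first principle, a function can authenticate and authorize every single invocation itself rather than trusting whatever reached it. The sketch below assumes the PyJWT library and an illustrative "role" claim; the secret handling is deliberately simplified and would, in practice, come from a managed secret store:

import jwt  # pip install PyJWT

SIGNING_SECRET = "replace-with-a-managed-secret"  # illustrative only


def handler(event, context):
    # Authenticate: every request must carry a valid, signed token.
    auth_header = (event.get("headers") or {}).get("authorization", "")
    token = auth_header.removeprefix("Bearer ")
    try:
        claims = jwt.decode(token, SIGNING_SECRET, algorithms=["HS256"])
    except jwt.InvalidTokenError:
        return {"statusCode": 401, "body": "unauthorized"}

    # Authorize on every call: check what the caller may do, not just who they are.
    if claims.get("role") != "report-reader":
        return {"statusCode": 403, "body": "forbidden"}

    return {"statusCode": 200, "body": "ok"}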

Mistake No. 2: Using traditional tools

While serverless applications are gaining traction due to their benefits, traditional AST tools cause workflow inefficiencies that ultimately bottleneck serverless release cycles.

Traditional security tools, namely static application security testing (SAST) and dynamic application security testing (DAST), just aren't made to scan modern applications.

For example, the complexity of modern application programming interface (API) code, the frameworks that support it and the intricate interconnections between them are simply too much for static tools. Such tools produce an onslaught of false positives, and they miss serious vulnerabilities.

As well, in serverless-based applications, where the architecture is event-based as opposed to synchronous (as a monolithic application is), code can be executed via numerous types of events, such as files, logs, code commits, notifications and even voice commands. Traditional tools just aren't built for that and cannot see beyond a simple REST API.
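To see why, consider a single function wired to several triggers. The sketch below is a simplified Python handler with hypothetical downstream functions; a scanner that only probes the REST endpoint never exercises the S3 or SNS paths, so anything reachable through those events goes untested:

def handler(event, context):
    records = event.get("Records", [])

    # File-upload trigger (S3 event).
    if records and records[0].get("eventSource") == "aws:s3":
        key = records[0]["s3"]["object"]["key"]
        return process_uploaded_file(key)

    # Notification trigger (SNS event; note the capitalized "EventSource").
    if records and records[0].get("EventSource") == "aws:sns":
        return process_notification(records[0]["Sns"]["Message"])

    # REST trigger (API Gateway proxy event).
    if "httpMethod" in event:
        return handle_api_request(event)

    raise ValueError("Unrecognized event source")


# Hypothetical downstream handlers; each one is its own attack surface.
def process_uploaded_file(key): ...
def process_notification(message): ...
def handle_api_request(event): ...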

Given their lack of visibility and accuracy, legacy tools depend on expert staff to do manual security triage as they attempt to diagnose and interpret the results before handing recommendations (with limited context) back to developers to fix the problems. After weeding out the high number of false positives, security teams are left to figure out which vulnerabilities should be addressed first. This inefficiency inhibits SDLCs, increases costs and often fails to eliminate many vulnerabilities that can be exploited by cyberattacks.

Static and dynamic tools don't scale well, typically requiring experts to set up and run the tool as well as to interpret the results.

All these reasons are why organizations are opting instead for purpose-built, context-based solutions. Serverless applications are a mix of code and infrastructure, and it is therefore essential to understand both. Organizations need a serverless solution that understands both the functions' code and their configurations, such as entry points (i.e., triggers) and Identity and Access Management (IAM) policies, and that provides customers with context-based insight into serverless risks.

Mistake No. 3: The dangers of misconfigurations

As "huksley" found out, serverless presents the potential for large overages with incorrect parameters. Without setting a limit on the number of requests allowed by a serverless function, the code could accidentally rack up numerous requests and create a large AWS charge.

Function permission configuration is another major issue in serverless. Developers usually apply generic permission levels, which give functions far more permissions than they need. If a key is then exposed or stolen, the impact in cloud computing can be outsized: Malicious actors may be able to steal information from databases or buckets because the Lambda permissions were set too broadly. Developers must instead apply the least permissions needed.

That isn't an easy task. Or, to be more precise, it can be easy if you write only one function with 1,000 lines of code. But with dependencies, it becomes a little crazy: The code needs to run for you to understand what it is actually doing, and it must have just enough permission to do exactly that.
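For a sense of what that looks like in practice, here is a sketch contrasting an over-broad policy with a least-privilege one for a function that only reads a single DynamoDB table. The table, account and policy names are hypothetical; the policy is created with boto3's IAM API:

import json
import boto3

# Too broad: any DynamoDB action on any table. A stolen key attached to this
# policy lets an attacker read, modify or delete every table in the account.
overly_broad = {
    "Version": "2012-10-17",
    "Statement": [{"Effect": "Allow", "Action": "dynamodb:*", "Resource": "*"}],
}

# Least privilege: only the actions the function actually performs, on the
# one table it actually touches.
least_privilege = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["dynamodb:GetItem", "dynamodb:Query"],
        "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/orders",
    }],
}

iam = boto3.client("iam")
iam.create_policy(
    PolicyName="orders-read-only",               # hypothetical policy name
    PolicyDocument=json.dumps(least_privilege),
)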

Conclusion

In April 2022, Cado Security discovered Denonia, the first ever malware to specifically target AWS Lambda. More threats are sure to follow. Avoiding these top mistakes can help to secure your organization when they do.

To fend off such attacks, keep an eye out for free, open-source tools. They can help secure your serverless applications without breaking the bank.

Tal Melamed is Senior Director, Cloud-Native Security Research at Contrast Security