Security and the Twelve-Factor App - Step 8
A blog series by WhiteHat Security
May 13, 2019

Eric Sheridan
WhiteHat Security

In the previous blog of this WhiteHat Security series on the Twelve-Factor App, we looked at exporting services via port binding and offered advice on what to apply from a security point of view.

We now move on to Step 8 of the Twelve-Factor App, Concurrency, which recommends scaling out via the process model discussed in Step 7.

Start with Security and the Twelve-Factor App - Step 1
Start with Security and the Twelve-Factor App - Step 2
Start with Security and the Twelve-Factor App - Step 3
Start with Security and the Twelve-Factor App - Step 4
Start with Security and the Twelve-Factor App - Step 5
Start with Security and the Twelve-Factor App - Step 6
Start with Security and the Twelve-Factor App - Step 7

Defining Concurrency in the Twelve-Factor App

A simple explanation for this factor is to picture many small processes handling specific concerns, such as web requests, API calls, or sending tweets. Keeping these working independently means the application scales better and can handle more activity concurrently.

According to the Twelve-Factor App, processes are first-class citizens, and they take strong cues from the Unix process model for running service daemons. Twelve-Factor goes on to say that by using this model, the developer can architect the app to handle diverse workloads by assigning each type of work to a process type. For example, HTTP requests may be handled by a web process, and long-running background tasks by a worker process.
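Platform tooling commonly captures this process formation in a simple declaration file, one process type per line (Heroku's Procfile is a well-known example). A minimal sketch, assuming a hypothetical Java application packaged as app.jar with illustrative command-line flags:

    web: java -jar app.jar --http-port $PORT
    worker: java -jar app.jar --run-background-jobs

Scaling out then means running more instances of whichever process type is under load, rather than growing a single monolithic process.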

Applying Security to Step 8

The security challenge in this step is that the ability to scale requires paying attention to APIs that are known to introduce Denial of Service issues. One such API is "readLine". Implementations of this method are available on almost every software development platform, and yet they are subject to Denial of Service: "readLine" will continuously read bytes from a given input stream until a newline character is found. Assume the attacker controls that stream. What happens if the attacker never provides a newline character? More often than not, the result is errors and stability issues stemming from memory exhaustion.
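To make the risk concrete, here is a minimal Java sketch of a bounded alternative. The helper name and limit are illustrative, not from the original article: instead of calling readLine() on an attacker-controlled stream, cap how many characters may be buffered before a newline must appear.

    import java.io.IOException;
    import java.io.Reader;

    public class BoundedLineReader {

        // Hypothetical helper (illustrative, not a standard API): read one line
        // from an untrusted stream, but refuse to buffer more than maxChars
        // characters while waiting for the newline.
        public static String readLineBounded(Reader in, int maxChars) throws IOException {
            StringBuilder line = new StringBuilder();
            int ch;
            while ((ch = in.read()) != -1) {
                if (ch == '\n') {
                    return line.toString();          // normal end of line
                }
                if (ch == '\r') {
                    continue;                        // tolerate CRLF input
                }
                if (line.length() >= maxChars) {
                    // Fail fast instead of exhausting memory waiting for a '\n'
                    // that the attacker may never send.
                    throw new IOException("Line exceeds " + maxChars + " characters");
                }
                line.append((char) ch);
            }
            return line.length() > 0 ? line.toString() : null;  // end of stream
        }
    }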

Two simple practices can be adopted to strengthen the security posture of this step:

1. Ban DoS-able APIs, i.e. document the DoS-able APIs relevant to your platform (such as readLine) and ban them

2. Resource closure, i.e. expose simple patterns that facilitate closing of I/O resources (e.g. scoped resource handling), as shown in the sketch after this list
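The sketch below illustrates the second practice in Java using try-with-resources; the class name, the 8 KB cap, and the reuse of the earlier BoundedLineReader helper are illustrative assumptions rather than prescribed implementations.

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;
    import java.net.Socket;
    import java.nio.charset.StandardCharsets;

    public class RequestWorker {

        // Illustrative handler for a web or worker process: try-with-resources
        // guarantees the reader and the underlying socket stream are closed
        // even if the read fails or the length limit is exceeded.
        public static String readFirstLine(Socket client) throws IOException {
            try (BufferedReader reader = new BufferedReader(
                    new InputStreamReader(client.getInputStream(), StandardCharsets.UTF_8))) {
                // Use the bounded read from the earlier sketch rather than the
                // banned readLine(); 8 KB is an arbitrary illustrative cap.
                return BoundedLineReader.readLineBounded(reader, 8 * 1024);
            } // resources released here on every code path
        }
    }

Because the closure logic lives in one scoped construct rather than scattered finally blocks, processes can be started and stopped freely as the app scales out without leaking file handles or sockets.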

In the next blog we will cover Step 9, Disposability, which is all about maximizing robustness with fast startup and a graceful shutdown, and what this means from a security point of view.

Read Security and the Twelve-Factor App - Step 9

Eric Sheridan is Chief Scientist at WhiteHat Security