Cloud Workload Security - Improving Practices for Deployment and Run-Time
May 10, 2022

Yasser Fuentes
Bitdefender

DevOps is still relatively green when it comes to security practices. Developers are generally focused on the performance and deployment of solutions rather than on their protection. As cloud workload security (CWS) advances from deployment, to mainstream adoption, to run-time optimization, there are specific steps DevOps teams need to take to ensure they're properly protecting their projects.

Below, find three critical steps for DevOps teams to improve their CWS protections for application deployment and run-time.

Ensure a proper assessment

The first step to implementing proper security measures in DevOps pipelines is to make sure a thorough assessment is performed. It's critical for an organization to understand the risks associated with the migration and with the cloud solution provider's infrastructure. This assessment requires DevOps teams to ask multiple questions.

First, what is the shared responsibility of this project?

Consider all parties who will be utilizing this solution and who has a hand in keeping it running once live.

Second, which controls can be used with the current infrastructure, and which ones do you have to implement?

Once a CWS initiative is in active run-time, take note of which security capabilities you can implement immediately and which safeguards are still missing.

Lastly, which security controls are in line with risk management?

Once your initial assessment is complete, make sure that you are allocating adequate security controls to align with risk management initiatives. By performing this assessment, security teams and DevOps developers alike can better protect against cyberattacks before and during deployment, which is especially important in the modern DevOps environment.
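
To make the outcome of this assessment actionable, it can help to capture it in a simple, reviewable form. Below is a minimal sketch in Python; the questions, controls, and owners are illustrative assumptions rather than a prescribed checklist, and the script simply reports which safeguards are still missing so they can be prioritized.

```python
# Hypothetical assessment record: the controls and owners below are
# illustrative assumptions, not a prescribed framework or checklist.
ASSESSMENT = [
    # (question, control, owner, in_place_today)
    ("Shared responsibility", "Host OS patching", "cloud provider", True),
    ("Shared responsibility", "Workload hardening", "DevOps team", False),
    ("Available controls", "Image vulnerability scanning", "DevOps team", True),
    ("Available controls", "Run-time protection agent", "security team", False),
    ("Risk alignment", "Encryption of data at rest", "DevOps team", True),
]


def report_gaps(items):
    """Print every control that is not yet in place so it can be prioritized."""
    for question, control, owner, in_place in items:
        if not in_place:
            print(f"[GAP] {question}: '{control}' (owner: {owner}) is still missing")


if __name__ == "__main__":
    report_gaps(ASSESSMENT)
```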

Recognize the cybercriminal draw to cloud infrastructure environments

In the current DevOps landscape, there are a number of reasons cybercriminals are shifting their attacks to virtualized, and more specifically Linux, environments. First, more than 80 percent of workloads residing in cloud and hybrid cloud environments (both servers and containers) run on Linux-based distributions.

Why is this the case?

They're more efficient, easier to manage, consume fewer resources, and at their core, they're purpose-built to serve a specific goal. This also means they're more generically built and formulaic, making it easier for cybercriminals to mimic an environment.

Second, Linux-based workloads are the most overlooked component of any infrastructure: many teams believe that because Linux is open source, they're not responsible for securing it.

Lastly, most distributions live in the open-source realm, meaning there's no firm commitment to provide security updates and patches, leaving them vulnerable by nature. When deploying a solution in a Linux or open-source environment, DevOps teams should be hyper-aware of the security risks and of what this type of environment will mean for security in the long run.

Steps for protecting and deploying container-based applications

With this information in mind, there are key steps for building and deploying more secure virtualized environments. When developing a DevOps initiative, make sure that security is built into every stage of the deployment pipeline.

It's important to consider capabilities such as managed detection and response (MDR) and extended detection and response (XDR) as part of the pre-deployment assessment process, so that threats, misconfigurations, and vulnerabilities are assessed proactively.
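
As one concrete way to shift this assessment left, an image scan can be wired into the build pipeline so that vulnerable images never reach deployment. The sketch below assumes the open-source Trivy scanner is available on the build agent and uses a placeholder image reference; any equivalent scanner or commercial CWS tooling could fill the same role.

```python
# Minimal pre-deployment gate: fail the pipeline when the container image
# has unresolved high/critical vulnerabilities.
# Assumes the Trivy CLI is installed on the build agent (an assumption,
# not a requirement of any particular platform).
import subprocess
import sys

IMAGE = "registry.example.com/myapp:latest"  # placeholder image reference


def scan_image(image: str) -> int:
    """Run the scanner; --exit-code 1 makes it return non-zero on findings."""
    result = subprocess.run(
        ["trivy", "image", "--exit-code", "1", "--severity", "HIGH,CRITICAL", image]
    )
    return result.returncode


if __name__ == "__main__":
    code = scan_image(IMAGE)
    if code != 0:
        print(f"Blocking deployment: {IMAGE} has unresolved high/critical findings")
    sys.exit(code)
```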

Next, when your containers are ready for run-time, make sure you have safeguards in place for run-time protection. You can build container environments with protections baked in, but without actual run-time protection, those containers remain exposed once an attacker breaches the environment.
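
Run-time protection ultimately calls for a dedicated agent or CWS platform, but even a basic posture check on running containers illustrates the idea. The sketch below assumes the Docker SDK for Python and access to the local Docker socket, and it only flags a few risky host settings; it is not a substitute for true run-time protection.

```python
# Minimal run-time posture check, assuming the Docker SDK for Python
# ("pip install docker") and access to the local Docker socket.
# It only flags a few risky settings; it is not a replacement for a
# dedicated run-time protection / CWS agent.
import docker


def audit_running_containers() -> None:
    client = docker.from_env()
    for container in client.containers.list():
        host_cfg = container.attrs.get("HostConfig", {})
        findings = []
        if host_cfg.get("Privileged"):
            findings.append("runs in privileged mode")
        if not host_cfg.get("ReadonlyRootfs"):
            findings.append("root filesystem is writable")
        if host_cfg.get("CapAdd"):
            findings.append(f"extra Linux capabilities: {host_cfg['CapAdd']}")
        if findings:
            print(f"{container.name}: " + "; ".join(findings))


if __name__ == "__main__":
    audit_running_containers()
```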

Understand that you're responsible for the data your applications process in the cloud, whether or not you own that data. Once other users adopt your technology, securing the hosted data becomes a shared responsibility.

Each of these steps will help ensure more trustworthy, user-friendly environments.

Conclusion

Remember that security controls by themselves are just pieces of technology. In an ideal setting, security controls should be aligned with processes and product development. Gearing up for deployment should not mean sacrificing security controls to deploy faster or more efficiently.

Additionally, the DevOps engineers behind these controls must understand the technology, the protocols, and the risks, plus how to effectively take advantage of these technologies and use them to their full potential. By understanding how to protect DevOps initiatives to their fullest, dev teams will ultimately build better, security-hardened container environments.

Yasser Fuentes is Technical Product Manager (Cloud) at Bitdefender