Steps You Should Be Automating in the SDLC - Part 3
November 07, 2018

DEVOPSdigest asked experts from across the IT industry for their opinions on which steps in the SDLC should be automated. Part 3 covers the development environment and infrastructure.

Start with Steps You Should Be Automating in the SDLC - Part 1

Start with Steps You Should Be Automating in the SDLC - Part 2

DEVELOPMENT ENVIRONMENT

It is critical for any aspiring high-velocity engineering organization to automate the setup of development environments. Every manual step and every moment spent on development environment setup is a moment your engineers are not yet creating business value. A new contributor should have a complete environment set up and be able to code within one hour. Additionally, the development environment should match the production environment as much as possible — it is far better to discover an environmental bug in development than after a deploy to production. Fast, automated development environment setup means more engaged engineers, and more engaged engineers deliver more engineering business value at a faster pace.
Nell Shamrell-Harrington
Principal Software Development Engineer, Chef

Use containers to automate the creation and sharing of your developer environments. Today, too many developers still build code locally on their own machines and end up with hard-to-trace problems due to differences in dependencies and runtime package versions. That can be avoided by creating a developer container image for each project based on, and perhaps even identical to, the production container image. This gets rid of costly "but it works on my machine" problems and, if saved in the project repo, makes it easier for someone to quickly reproduce an older dev environment to address a legacy customer's bug.
Brad Micklea
Senior Director of Developer Experience and Programs, Red Hat
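
A minimal sketch of the container-based approach Micklea describes, assuming the project keeps a hypothetical Dockerfile.dev in its repo (built from, or identical to, the production image) and that Docker is available on the contributor's machine:

```python
# Hypothetical helper: build and tag a per-project developer image from a
# Dockerfile.dev kept in the repo, so every contributor gets the same
# dependencies and runtime versions as production.
import subprocess
import sys


def build_dev_image(project: str, version: str = "latest") -> str:
    """Build the developer image and return its tag."""
    tag = f"{project}-dev:{version}"
    # Dockerfile.dev is assumed to start FROM the production base image,
    # keeping the development environment as close to production as possible.
    subprocess.run(
        ["docker", "build", "-f", "Dockerfile.dev", "-t", tag, "."],
        check=True,
    )
    return tag


if __name__ == "__main__":
    print(build_dev_image(sys.argv[1] if len(sys.argv) > 1 else "myapp"))
```

Because the image definition lives alongside the code, checking out an old release and rebuilding its image is also how an older dev environment can be reproduced for a legacy bug.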

There is usually a disconnect between development and production environments, which adds friction when an application needs to be deployed. Using automation to make development look like production, by sharing tools and configurations, helps avoid this problem. For example, using a tool to codify and automate the setup and teardown of development environments so that they match production makes it easier to test applications and avoids breakage in production caused by subtle differences.
Armon Dadgar
Founder and Co-CTO, HashiCorp
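
One rough way to codify this, shown as an illustration rather than any specific HashiCorp tool: a single entry point for setup and teardown that reads the same service definitions production uses. The docker-compose.yml file and the use of Docker Compose are assumptions, not details from the quote:

```python
# Hypothetical wrapper that codifies setup and teardown of a local environment
# from a shared compose file describing the same services as production.
import argparse
import subprocess

COMPOSE_FILE = "docker-compose.yml"  # assumed shared service definition


def up() -> None:
    subprocess.run(
        ["docker", "compose", "-f", COMPOSE_FILE, "up", "-d"], check=True
    )


def down() -> None:
    subprocess.run(
        ["docker", "compose", "-f", COMPOSE_FILE, "down", "--volumes"], check=True
    )


if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Codified dev environment lifecycle")
    parser.add_argument("action", choices=["up", "down"])
    args = parser.parse_args()
    up() if args.action == "up" else down()
```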

INFRASTRUCTURE MANAGEMENT

DevOps groups should eliminate error-prone processes like infrastructure and tool upgrades and maintenance through infrastructure management automation and machine learning.
Steve Garrison
VP Marketing, ZeroStack

In terms of development lifecycle automation, the biggest bang-for-buck for most organizations is infrastructure. This entails codifying system configurations and providing infrastructure-on-demand, with infrastructure-as-a-service for application teams. For many organizations, standing up infrastructure for development is manual, and the entire lifecycle is bottlenecked waiting for infrastructure to be configured as needed to support internal development, QA and production team activities. Automating infrastructure enables other pipeline processes that depend on infrastructure. In addition, focusing on infrastructure automation is a great starting point for improving collaboration between Dev, QA and Ops as part of a DevOps transformation.
Marc Hornbeek
Principal Consultant – DevOps, Trace3
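
As a hedged illustration of infrastructure-on-demand, the sketch below assumes the system configurations are codified with Terraform (a tool the quote does not name) and provisions an isolated workspace whenever a team requests an environment:

```python
# Hypothetical infrastructure-on-demand entry point: an application team asks
# for an environment and the codified configuration is applied in a dedicated
# Terraform workspace for that team and stage.
import subprocess


def provision(team: str, stage: str = "dev") -> None:
    workspace = f"{team}-{stage}"
    subprocess.run(["terraform", "init", "-input=false"], check=True)
    # Create the workspace if needed, then select it (creation fails harmlessly
    # if the workspace already exists).
    subprocess.run(["terraform", "workspace", "new", workspace], check=False)
    subprocess.run(["terraform", "workspace", "select", workspace], check=True)
    subprocess.run(
        ["terraform", "apply", "-auto-approve",
         f"-var=team={team}", f"-var=stage={stage}"],
        check=True,
    )


if __name__ == "__main__":
    provision("payments", "qa")
```

Exposing a wrapper like this through a self-service portal or chat command is one way codified configuration becomes infrastructure-as-a-service for application teams.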

As part of your DevOps process, one area that would benefit from automation is Infrastructure as Code, which eliminates rework and nonstandard configurations and enforces company policies for better cost, compliance and agility.
Jeanne Morain
Author and Strategist, iSpeak Cloud
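
A minimal sketch of one form that policy enforcement could take: a CI gate over codified infrastructure definitions. The JSON export format and the policies themselves (required tags, an approved instance-size list) are illustrative assumptions, not taken from the quote:

```python
# Hypothetical policy gate for Infrastructure as Code: reject resource
# definitions that are missing required tags or use unapproved instance
# sizes. The policies shown here are examples only.
import json
import sys
from pathlib import Path

REQUIRED_TAGS = {"owner", "cost-center"}
ALLOWED_SIZES = {"t3.small", "t3.medium"}


def violations(resource: dict) -> list[str]:
    problems = []
    missing = REQUIRED_TAGS - set(resource.get("tags", {}))
    if missing:
        problems.append(f"{resource['name']}: missing tags {sorted(missing)}")
    size = resource.get("instance_type")
    if size and size not in ALLOWED_SIZES:
        problems.append(f"{resource['name']}: instance type {size} not approved")
    return problems


if __name__ == "__main__":
    resources = json.loads(Path(sys.argv[1]).read_text())
    problems = [p for r in resources for p in violations(r)]
    print("\n".join(problems) or "all resources compliant")
    sys.exit(1 if problems else 0)
```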

SCALABILITY

We should be automating scalability. Today’s cloud provides virtually limitless capacity, so by making automation part of cloud-native architectures, we can pre-authorize scalability, allowing it to happen automatically, without human intervention.
Sharan Gurunathan
EVP and Principal Solutions Architect, Coda Global
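
One way to read "pre-authorize scalability": declare the allowed capacity range and a target metric up front so the platform can scale on its own within those limits. The sketch below uses AWS Application Auto Scaling with an ECS service purely as an assumed example, since the quote names no specific cloud or service:

```python
# Hypothetical pre-authorization of scale-out: capacity bounds and a CPU
# target are registered once, and the platform then scales the service within
# those limits with no human in the loop.
import boto3


def preauthorize_scaling(cluster: str, service: str,
                         min_tasks: int = 2, max_tasks: int = 20) -> None:
    client = boto3.client("application-autoscaling")
    resource_id = f"service/{cluster}/{service}"
    client.register_scalable_target(
        ServiceNamespace="ecs",
        ResourceId=resource_id,
        ScalableDimension="ecs:service:DesiredCount",
        MinCapacity=min_tasks,
        MaxCapacity=max_tasks,
    )
    client.put_scaling_policy(
        PolicyName=f"{service}-cpu-target",
        ServiceNamespace="ecs",
        ResourceId=resource_id,
        ScalableDimension="ecs:service:DesiredCount",
        PolicyType="TargetTrackingScaling",
        TargetTrackingScalingPolicyConfiguration={
            "TargetValue": 60.0,  # keep average CPU utilization near 60%
            "PredefinedMetricSpecification": {
                "PredefinedMetricType": "ECSServiceAverageCPUUtilization"
            },
        },
    )


if __name__ == "__main__":
    preauthorize_scaling("prod-cluster", "web")
```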

DASHBOARDS

You should automate the way you build dashboards, and like other automation, dashboard configuration should follow best practices around software development, including versioning and change control. Automation lets you create standard patterns cheaply, which in turn facilitates communication across teams and helps new engineers get up to speed quickly.
Daniel "Spoons" Spoonhower
Co-Founder and CTO, LightStep
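
A rough dashboards-as-code sketch: a standard per-service pattern rendered to JSON files that are committed, reviewed and versioned like any other code. The panel names and schema are simplified assumptions rather than any particular vendor's format:

```python
# Hypothetical dashboards-as-code generator: each service gets the same
# standard panels, written out as JSON that lives under version control and
# is applied to the dashboard tool by CI.
import json
from pathlib import Path

STANDARD_PANELS = ["p95 latency", "error rate", "requests per second"]


def build_dashboard(service: str) -> dict:
    return {
        "title": f"{service} overview",
        "panels": [
            {"title": title, "query": f'metric("{title}", service="{service}")'}
            for title in STANDARD_PANELS
        ],
    }


if __name__ == "__main__":
    out_dir = Path("dashboards")
    out_dir.mkdir(exist_ok=True)
    for service in ["checkout", "search", "payments"]:
        path = out_dir / f"{service}.json"
        path.write_text(json.dumps(build_dashboard(service), indent=2) + "\n")
        print(f"wrote {path}")  # commit these files for review and change control
```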

DATA

Development teams are on-boarding site reliability engineers (SREs) to make applications more resilient. SREs feed off detailed telemetry data from the application to anticipate problems and get ahead of them. Thus, one of the most important automation steps is to embed a monitoring data collector into your application runtime early in the development phase. Later, when the application is in production, the data collector will send telemetry to your monitoring servers regardless of where your application ends up running, in a private or public cloud.
Mike Mallo
Offering Management Lead - Hybrid Cloud DevOps, IBM Cloud Unit, IBM Corporation
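
As a hedged illustration of embedding a collector in the application runtime, the sketch below flushes in-process counters to whatever monitoring endpoint the environment points at. The METRICS_URL variable, endpoint and metric names are assumptions for illustration, not IBM's collector:

```python
# Hypothetical embedded collector: a background thread periodically flushes
# in-process counters to a monitoring endpoint taken from the environment, so
# the same application image reports telemetry wherever it runs.
import json
import os
import threading
import time
import urllib.request
from collections import Counter

METRICS_URL = os.getenv("METRICS_URL", "http://localhost:9091/metrics")  # assumed endpoint
_counters = Counter()


def record(name: str, value: int = 1) -> None:
    """Called from application code, e.g. record('orders.created')."""
    _counters[name] += value


def _flush_forever(interval: float = 10.0) -> None:
    while True:
        time.sleep(interval)
        payload = json.dumps(dict(_counters)).encode()
        request = urllib.request.Request(
            METRICS_URL, data=payload, headers={"Content-Type": "application/json"}
        )
        try:
            urllib.request.urlopen(request, timeout=2)
        except OSError:
            pass  # telemetry must never take the application down


threading.Thread(target=_flush_forever, daemon=True).start()
```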

Any process or handover that requires duplicating data in two systems (tools) should be automated to remove the waste and overhead that slow software delivery teams down and take time away from value-adding work. For example, copying incident details from an ITSM tool used by the service desk to the Agile planning tool used by developers and then manually synchronizing the status of both, or copying requirements into Agile stories, or defects into an issue tracker. Automation can flow and synchronize the necessary data across the best-of-breed tools used for planning, design, development, testing, release and operation, providing every role with the information they need in near-real time, directly in their tool of choice. By standardizing and normalizing the data as it flows, software delivery organizations also gain visibility into their big-picture metrics — within a single product and across multiple products — and insight into their value creation process.
Naomi Lurie
Director of Product, Tasktop Technologies
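
A minimal sketch of the kind of tool-to-tool flow described above: new incidents pulled from an ITSM tool become stories in the Agile planning tool, with the status written back so both systems stay in step. Every URL and field name here is hypothetical, standing in for whichever best-of-breed tools are actually in use:

```python
# Hypothetical sync job between an ITSM tool and an Agile planning tool so
# nobody copies tickets by hand. All endpoints and fields are illustrative.
import os

import requests

ITSM_URL = os.environ["ITSM_URL"]          # e.g. https://itsm.example.com/api
PLANNING_URL = os.environ["PLANNING_URL"]  # e.g. https://planning.example.com/api


def sync_incidents() -> None:
    incidents = requests.get(f"{ITSM_URL}/incidents?status=new", timeout=10).json()
    for incident in incidents:
        story = {
            "title": f'[INC {incident["id"]}] {incident["summary"]}',
            "description": incident["description"],
            "labels": ["incident", incident["priority"]],
        }
        requests.post(f"{PLANNING_URL}/stories", json=story, timeout=10).raise_for_status()
        # Mark the incident so its status can be kept in sync from now on.
        requests.patch(
            f"{ITSM_URL}/incidents/{incident['id']}",
            json={"status": "triaged"},
            timeout=10,
        ).raise_for_status()


if __name__ == "__main__":
    sync_incidents()
```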

DATABASE

Automation in software development has traditionally been restricted to front-end applications. The database is now entering the picture because changes to applications often mean the code behind the database schema needs to be updated as well. Companies should therefore think about including the database in continuous integration. By automating the build and testing of code changes every time they are committed to version control, errors will be caught earlier in the development process and the database will no longer be a bottleneck during deployments.
Simon Galbraith
CEO & Co-Founder, Redgate
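
A small sketch of what putting the database into CI can look like, assuming schema changes live as versioned SQL migration scripts in the repo. SQLite keeps the example self-contained; a real pipeline would build against the same engine as production:

```python
# Hypothetical database CI step: replay every versioned migration script
# against a throwaway database on each commit, so schema errors surface long
# before a deployment.
import sqlite3
import sys
from pathlib import Path


def run_migrations(migrations_dir: str = "migrations") -> None:
    conn = sqlite3.connect(":memory:")  # throwaway database per CI run
    for script in sorted(Path(migrations_dir).glob("*.sql")):
        try:
            conn.executescript(script.read_text())
        except sqlite3.Error as exc:
            sys.exit(f"FAIL {script.name}: {exc}")  # fail the build
        print(f"OK   {script.name}")
    conn.close()


if __name__ == "__main__":
    run_migrations()
```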

Continuous Delivery is already widely adopted (83%) for application development. Yet, with a competitive landscape and a need to keep improving, companies need to identify the remaining bottlenecks in their application releases. Databases are an ideal candidate when it comes to accelerating application releases, and they are lagging behind application delivery with only 36% CI/CD adoption. In 2019, companies will turn to automating their database releases. While faster database releases are a goal, companies must remember the true goal is to go as fast as possible without the risk of costly rework and downtime. This can be achieved by adopting DevOps best practices for the database, setting up safety nets to negate database risk, and managing security so that short feedback loops are maintained across Dev, Sec and Ops.
Yaniv Yehuda
Co-Founder and CTO, DBmaestro
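
One example of what a "safety net" could look like, sketched under the assumption that pending changes are plain SQL scripts: block any release whose migration contains a destructive statement unless it carries an explicit approval marker. Both the keyword list and the marker are illustrative:

```python
# Hypothetical release safety net: scan pending migration scripts for
# destructive statements and block the release unless the script has been
# explicitly marked as reviewed.
import re
import sys
from pathlib import Path

DESTRUCTIVE = re.compile(r"\b(DROP\s+TABLE|TRUNCATE|DELETE\s+FROM)\b", re.IGNORECASE)
APPROVAL_MARKER = "-- approved-destructive-change"


def blocked_scripts(migrations_dir: str = "migrations") -> list[str]:
    blocked = []
    for script in sorted(Path(migrations_dir).glob("*.sql")):
        text = script.read_text()
        if DESTRUCTIVE.search(text) and APPROVAL_MARKER not in text:
            blocked.append(f"BLOCKED {script.name}: destructive statement without approval")
    return blocked


if __name__ == "__main__":
    problems = blocked_scripts()
    print("\n".join(problems) or "no unapproved destructive changes")
    sys.exit(1 if problems else 0)
```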

Time and time again, we hear technology leaders say they should have addressed the database release management problem before tackling the easier problem of application release automation. But the reality is that database deployments are often forgotten. Pushing out the application is the easy part of DevOps; managing and automating database changes is the real challenge. As development teams continue moving quickly, there is a need to support the high rate of change. Through automation, database administrators (DBAs) have an enormous impact on how quickly software gets into the hands of customers, shortening the time it takes to bring application innovation to market while eliminating the security vulnerabilities, costly errors, data loss and downtime often associated with manual database deployment methods.
Robert Reeves
CTO and Co-Founder, Datical

Read Steps You Should Be Automating in the SDLC - Part 4, all about security.
