2024 DevOps Predictions - Part 8
December 14, 2023

Industry experts offer thoughtful, insightful, and often controversial predictions on how DevOps and related technologies will evolve and impact business in 2024. Part 8 covers AI's impact on DevOps and development.

Start with: 2024 DevOps Predictions - Part 1

Go to: 2024 DevOps Predictions - Part 2

Go to: 2024 DevOps Predictions - Part 3

Go to: 2024 DevOps Predictions - Part 4

Go to: 2024 DevOps Predictions - Part 5

Go to: 2024 DevOps Predictions - Part 6

Go to: 2024 DevOps Predictions - Part 7

AI CHANGES EXPECTATIONS OF DEVELOPER PRODUCTIVITY

Automation tools will make a more visible impact on developer velocity and how developers' work is measured. This year's explosion of AI and ML is instigating an unparalleled transformation in business productivity expectations. In 2024, broader access to AI- and ML-driven automation tools will continue to raise the benchmarks for code quality, reliability, and security in response to the escalating demand for faster software delivery.
Sairaj Uddin
SVP of Technology, The Trade Desk

FRONTRUNNERS WILL CRACK THE PRODUCTIVITY CODE

We keep talking about first-mover advantage, but in 2024 we'll see that disappear as best practices emerge by mid-year and gain widespread adoption later in the year. By the end of 2024, companies that have adopted AI-assisted coding tools and cracked the code on how to use AI well will be outperforming those that have not. For DevOps teams already on this bandwagon, information sharing and increased productivity will become standard, enabling them to monetize and scale their successes.
Wing To
GM of Intelligent DevOps, Digital.ai

AI ADVANCES ABSTRACTION IN PROGRAMMING

We will see a significant leap in how AI advances abstraction for developers. As developers have looked to increase efficiencies, they have abstracted out the common and mundane tasks. Each new language, framework, and SDK that comes along abstracts another level of tasks that developers don't need to worry about. AI will take abstraction to the next level. AI-powered reference architectures will give developers a jump on starting new projects or lend a hand when solving complex problems. Developers will no longer begin with a blank slate. Instead, AI will help remove the intimidation of an empty page to jump start projects and streamline workflows.
Scott McAllister
Principal Developer Advocate, ngrok

AI TRANSFORMS KUBERNETES INTO THE AUTOMATIC TRANSMISSION OF THE CLOUD-NATIVE ERA

AI is poised to redefine how businesses utilize Kubernetes for application deployment and managing their infrastructure. Similar to how automatic transmissions streamlined driving, AI will become the automatic transmission for Kubernetes. AI will bridge the gap between Kubernetes' inherent complexity and the accessibility teams need, so that even entry-level team members will be able to efficiently navigate and manage Kubernetes environments. AI will act as an intelligent guide, simplifying intricate operations and offering real-time insights. It will not only automate issue detection but also empower less experienced staff to operate Kubernetes proficiently. This empowerment will optimize the workforce, reducing the need for extensive training or specialized knowledge. Consequently, businesses will be able to streamline their operations, reduce human intervention, and significantly cut operational costs, making the adoption of Kubernetes even more feasible and economical.
Mohan Atreya
SVP Product and Solutions, Rafay
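To make the "automatic transmission" idea concrete, here is a minimal, hypothetical sketch of the pattern: gather the same diagnostics an operator would normally read and let an LLM translate them into plain-language guidance. The `ask_llm` helper is a stand-in for whatever LLM client or internal service a team actually uses, not any specific product's API.

```python
# Illustrative sketch only: pipe Kubernetes diagnostics to an LLM so a less
# experienced operator gets a plain-language explanation and suggested next steps.
import subprocess


def ask_llm(prompt: str) -> str:
    """Hypothetical helper: send the prompt to an LLM and return its reply."""
    raise NotImplementedError("wire this to your LLM provider of choice")


def explain_pod_issue(namespace: str, pod: str) -> str:
    # Gather the same evidence an SRE would read manually.
    describe = subprocess.run(
        ["kubectl", "describe", "pod", pod, "-n", namespace],
        capture_output=True, text=True, check=True,
    ).stdout
    logs = subprocess.run(
        ["kubectl", "logs", pod, "-n", namespace, "--tail=50"],
        capture_output=True, text=True,
    ).stdout

    prompt = (
        "You are a Kubernetes assistant. Explain in plain language why this pod "
        "may be unhealthy and suggest next steps.\n\n"
        f"--- kubectl describe ---\n{describe}\n\n--- recent logs ---\n{logs}"
    )
    return ask_llm(prompt)
```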

APIS PLAY AN INCREASING ROLE IN AI

APIs will play an increasing role in the growth of AI. They will grow into the de facto mechanism that AI agents use to increase their access to the world — both to integrate data from myriad sources and to operate on the world around them.
Abhijit Kane
Co-Founder, Postman
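As a rough illustration of that pattern, the sketch below treats a REST API as a "tool" an agent can invoke. The weather endpoint, tool schema, and dispatch step are hypothetical placeholders; real agent frameworks differ in the details, but the API-as-capability idea is the same.

```python
# Illustrative sketch only: exposing an API to an AI agent as a callable tool.
import json

import requests


def get_weather(city: str) -> dict:
    """Tool: fetch current weather for a city from a (hypothetical) REST API."""
    resp = requests.get("https://api.example.com/v1/weather",
                        params={"city": city}, timeout=10)
    resp.raise_for_status()
    return resp.json()


# The tool registry an agent would be handed, describing what it may call.
TOOLS = {
    "get_weather": {
        "description": "Current weather for a city",
        "parameters": {"city": "string"},
        "handler": get_weather,
    },
}


def dispatch(tool_call: dict) -> str:
    """Run the tool the model asked for and return the result as text."""
    tool = TOOLS[tool_call["name"]]
    result = tool["handler"](**tool_call["arguments"])
    return json.dumps(result)


# e.g. the model emits {"name": "get_weather", "arguments": {"city": "Berlin"}}
# and dispatch() turns that into a real API call whose result is fed back to it.
```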

GENAI ACCELERATES OPEN SOURCE DEVELOPMENT

Generative AI will accelerate the impact of small, open source software teams. History shows that tiny teams — sometimes just one person! — can have outsized impact with open source. Generative AI is going to amplify this "open source impact effect" to incredible new levels. When we look at the cost of developing open source, actually writing the code itself isn't the expensive part. It's the documentation, bug handling, talking to people, responding to requests, checking examples of code on GitHub and more — all of which is very human-intensive. The open source community will benefit from generative AI for the same reason so many other efforts will: efficient elimination of tiresome human tasks. By helping with all of that, large language models will accelerate open source development this upcoming year, making smaller teams even more powerful.
Adrien Treuille
Director of Product Management and Head of Streamlit, Snowflake

AI ENABLES DEVOPS TO BE MORE PREDICTIVE

AI will enable DevOps and SecOps to become more predictive in nature. With the explosion of AI, DevOps will adjust proactively to release and update software. In particular, DevOps and SecOps will join more closely to anticipate and remediate threats through predictive DevOps.
Ed Frederici
CTO, Appfire

ESTABLISHED FRAMEWORKS HAVE THE EDGE

More established frameworks are going to have a leg up compared to newer frameworks on which AI hasn't been as deeply trained. Which frameworks frontend devs choose to use and the unique skills they bring to the table will impact the job market and how development processes change.
Rita Kozlov
Senior Product Director, Cloudflare

DEV-TO-OPS RATIO INCREASES

The dev-to-ops headcount ratio will increase in 2024. Thanks to advances in AI co-pilots, developers are more efficient than ever. However, the Ops side remains slow to adopt and benefit from GenAI technologies. Unlike development, Ops functions require accurate and predictable outcomes — not a point of strength for LLMs. Many organizations have worked to apply AI models to production troubleshooting. Yet none of these efforts have borne fruit, as production systems like Kubernetes and AWS are far too complex to be made fully autonomous. As development teams increasingly benefit from co-pilots and Ops teams become further mired in increasing complexity, the dev-to-ops headcount ratio has declined (i.e., we see fewer developers per Ops professional). This is highly problematic for the industry. Thankfully, we'll see a reversal of fortune in 2024 as AI models improve, producing greater efficiencies in Ops workflows, and as investment in DevOps platforms increases, creating better tooling and higher levels of abstraction.
Sheng Liang
Co-Founder and CEO, Acorn Labs

VECTOR DATABASES PLAY KEY ROLE IN TECH STACK

As new applications get built from the ground up with AI, and as LLMs become integrated into existing applications, I believe vector databases will play an increasingly important role in the tech stack, just as application databases have in the past. Teams will need scalable, easy-to-use, and operationally simple vector data storage as they seek to create AI-enabled products with new LLM-powered capabilities.
Avthar Sewrathan
GM for AI and Vector, Timescale
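For readers less familiar with the category, the following minimal sketch shows the core operation a vector database provides: store embeddings, then retrieve the nearest ones for a query. Production systems add approximate-nearest-neighbor indexing, filtering, and persistence; `embed` here is a hypothetical stand-in for a real embedding model.

```python
# Illustrative sketch only: a toy in-memory vector store with cosine search.
import numpy as np


def embed(text: str) -> np.ndarray:
    """Hypothetical embedding function; replace with a real embedding model."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.normal(size=384)
    return v / np.linalg.norm(v)


class TinyVectorStore:
    def __init__(self):
        self.texts: list[str] = []
        self.vectors: list[np.ndarray] = []

    def add(self, text: str) -> None:
        self.texts.append(text)
        self.vectors.append(embed(text))

    def search(self, query: str, k: int = 3) -> list[str]:
        q = embed(query)
        # Dot product equals cosine similarity because vectors are unit length.
        sims = np.array([float(q @ v) for v in self.vectors])
        top = np.argsort(-sims)[:k]
        return [self.texts[i] for i in top]


store = TinyVectorStore()
for doc in ["reset a password", "rotate API keys", "deploy to staging"]:
    store.add(doc)
print(store.search("how do I change my password?", k=1))
```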

MLOPS INTEGRATES WITH DEVOPS

In 2024, MLOps will increasingly integrate with DevOps to create more streamlined workflows for AI projects. The combination of MLOps and DevOps creates a set of processes and automated tools for managing data, code, and models to enhance the efficiency of machine learning platforms. Data scientists and software developers will be free to move on to high-value projects without needing to manually oversee models. The trend is driven by the push to streamline the delivery of models to production and reduce time-to-value.
Haoyuan Li
Founder and CEO, Alluxio
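One concrete shape this integration can take is a model promotion gate wired into the delivery pipeline, so models ship without manual oversight. The sketch below is illustrative only; the metric names, thresholds, and `EvalResult` fields are assumptions rather than any specific platform's API.

```python
# Illustrative sketch only: an automated promotion gate a CI/CD pipeline runs
# before rolling a new model to production.
from dataclasses import dataclass


@dataclass
class EvalResult:
    accuracy: float        # offline test-set accuracy
    p95_latency_ms: float  # serving latency measured in a shadow deployment


def should_promote(candidate: EvalResult, baseline: EvalResult,
                   min_gain: float = 0.005, max_latency_ms: float = 200.0) -> bool:
    """Promote only if the candidate beats the baseline and meets the latency budget."""
    return (candidate.accuracy >= baseline.accuracy + min_gain
            and candidate.p95_latency_ms <= max_latency_ms)


# Example pipeline step: fail the job (blocking the deploy) when the gate fails.
baseline = EvalResult(accuracy=0.91, p95_latency_ms=120.0)
candidate = EvalResult(accuracy=0.93, p95_latency_ms=110.0)
if not should_promote(candidate, baseline):
    raise SystemExit("Model gate failed: keeping the current production model")
print("Gate passed: promoting candidate model")
```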

THE RISE OF AI-AS-A-SERVICE

It's already possible to use OpenAI's ChatGPT in your own applications, but being able to model responses on your own proprietary datasets will bring much more value to businesses. This raises issues of data sovereignty and confidentiality, which will drive the rise of not just cloud-based AI services, but also the ability to run them in siloed cloud environments.
Ben Dechrai
Developer Advocate, Sonar

EVOLVING BEYOND THE CHATBOT

The breakout star of generative AI has been ChatGPT, so in 2023 most interfaces to generative AI were chat-based. As designers and developers work with the technology, and as more specialized LLMs are produced, we'll see AI fade into the background while more powerful applications are built upon it. Right now, chatbots are hammers and everything looks like a nail; to use AI to its full potential, we will need to move beyond that.
Phil Nash
Developer Advocate, Sonar
