2024 DevOps Predictions - Part 8
December 14, 2023

Industry experts offer thoughtful, insightful, and often controversial predictions on how DevOps and related technologies will evolve and impact business in 2024. Part 8 covers AI's impact on DevOps and development.

Start with: 2024 DevOps Predictions - Part 1

Go to: 2024 DevOps Predictions - Part 2

Go to: 2024 DevOps Predictions - Part 3

Go to: 2024 DevOps Predictions - Part 4

Go to: 2024 DevOps Predictions - Part 5

Go to: 2024 DevOps Predictions - Part 6

Go to: 2024 DevOps Predictions - Part 7

AI CHANGES EXPECTATIONS OF DEVELOPER PRODUCTIVITY

Automation tools will make a more visible impact on developer velocity and on how developers' work is measured. This year's explosion of AI and ML is driving an unparalleled transformation in business productivity expectations. In 2024, broader access to AI- and ML-driven automation tools will continue to raise the benchmarks for code quality, reliability, and security in response to the escalating demand for faster software delivery.
Sairaj Uddin
SVP of Technology, The Trade Desk

FRONTRUNNERS WILL CRACK THE PRODUCTIVITY CODE

We keep talking about first-mover advantage, but in 2024 we'll see it disappear as best practices emerge by mid-year and gain widespread adoption later in the year. By the end of 2024, those who have adopted AI-assisted coding tools and cracked the code on how to use AI well will be outperforming companies that have not. For DevOps teams already on board, information sharing and increased productivity will become standard, enabling them to monetize and scale their successes.
Wing To
GM of Intelligent DevOps, Digital.ai

AI ADVANCES ABSTRACTION IN PROGRAMMING

We will see a significant leap in how AI advances abstraction for developers. As developers have looked to increase efficiencies, they have abstracted out the common and mundane tasks. Each new language, framework, and SDK that comes along abstracts another level of tasks that developers don't need to worry about. AI will take abstraction to the next level. AI-powered reference architectures will give developers a jump on starting new projects or lend a hand when solving complex problems. Developers will no longer begin with a blank slate. Instead, AI will help remove the intimidation of an empty page to jump start projects and streamline workflows.
Scott McAllister
Principal Developer Advocate, ngrok

AI TRANSFORMS KUBERNETES INTO THE AUTOMATIC TRANSMISSION OF THE CLOUD-NATIVE ERA

AI is poised to redefine how businesses use Kubernetes to deploy applications and manage infrastructure. Just as automatic transmissions streamlined driving, AI will become the automatic transmission for Kubernetes, bridging the gap between Kubernetes' inherent complexity and accessibility so that even entry-level team members can efficiently navigate and manage Kubernetes environments. AI will act as an intelligent guide, simplifying intricate operations and offering real-time insights. It will not only automate issue detection but also empower less experienced staff to operate Kubernetes proficiently, reducing the need for extensive training or specialized knowledge. Consequently, businesses will be able to streamline operations, reduce human intervention, and significantly cut operational costs, making Kubernetes adoption even more feasible and economical.
Mohan Atreya
SVP Product and Solutions, Rafay
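
The "intelligent guide" idea above can be illustrated with a small sketch. A real assistant would feed pod events and logs to an LLM; here, a fixed lookup table (my own illustrative mapping, not any vendor's product) stands in for the model, translating raw Kubernetes pod states into plain-language guidance.

```python
# Illustrative sketch of AI-style guidance for Kubernetes operators.
# The GUIDANCE table stands in for an LLM that would normally analyze
# live pod events and logs; the state names are real kubectl statuses.

GUIDANCE = {
    "CrashLoopBackOff": "The container keeps crashing; check its logs and startup command.",
    "ImagePullBackOff": "The image could not be pulled; verify the image name and registry credentials.",
    "Pending": "The pod is unscheduled; check node capacity and resource requests.",
}

def explain(pod_status: str) -> str:
    """Translate a raw pod status into guidance an entry-level operator can act on."""
    return GUIDANCE.get(pod_status, "Status looks healthy; no action needed.")

print(explain("ImagePullBackOff"))
```

An AI-backed version would generalize past the table, but the interface — raw cluster state in, actionable explanation out — is the same.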

APIs PLAY AN INCREASING ROLE IN AI

APIs will play an increasing role in the growth of AI. They will become the de facto mechanism that AI agents use to extend their reach, both to integrate data from myriad sources and to act on the world around them.
Abhijit Kane
Co-Founder, Postman
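
The agent-to-API pattern described above can be sketched in a few lines. This is a minimal, framework-free illustration: the tool registry, the `get_weather` function, and the JSON call format are all hypothetical stand-ins for how an agent's structured tool call gets dispatched to a real API.

```python
# Minimal sketch of an AI agent using APIs as tools. A real agent would
# emit the JSON tool call from a model and the tool would hit an HTTP API;
# here both sides are stubbed so the dispatch pattern stands alone.
import json
from typing import Callable

TOOLS: dict[str, Callable[..., dict]] = {}

def tool(name: str):
    """Register a plain function as an API-style tool an agent can invoke."""
    def wrap(fn):
        TOOLS[name] = fn
        return fn
    return wrap

@tool("get_weather")
def get_weather(city: str) -> dict:
    # Stub: a real implementation would call an external weather API here.
    return {"city": city, "forecast": "sunny"}

def dispatch(agent_request: str) -> dict:
    """Parse a model's structured tool call (JSON) and invoke the matching API."""
    call = json.loads(agent_request)
    return TOOLS[call["tool"]](**call["args"])

result = dispatch('{"tool": "get_weather", "args": {"city": "Lisbon"}}')
```

The point of the sketch is the shape of the contract: the model only ever produces structured calls, and APIs remain the boundary through which the agent touches the world.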

GENAI ACCELERATES OPEN SOURCE DEVELOPMENT

Generative AI will accelerate the impact of small, open source software teams. History shows that tiny teams — sometimes just one person! — can have outsized impact with open source. Generative AI is going to amplify this "open source impact effect" to incredible new levels. When we look at the cost of developing open source, actually writing the code itself isn't the expensive part. It's the documentation, bug handling, talking to people, responding to requests, checking examples of code on GitHub and more — all of which is very human-intensive. The open source community will benefit from generative AI for the same reason so many other efforts will: efficient elimination of tiresome human tasks. By helping with all of that, large language models will accelerate open source development this upcoming year, making smaller teams even more powerful.
Adrien Treuille
Director of Product Management and Head of Streamlit, Snowflake

AI ENABLES DEVOPS TO BE MORE PREDICTIVE

AI will enable DevOps and SecOps to become more predictive. With the explosion of AI, DevOps teams will adjust proactively when releasing and updating software. In particular, DevOps and SecOps will join more closely to anticipate and remediate threats through predictive DevOps.
Ed Frederici
CTO, Appfire

ESTABLISHED FRAMEWORKS HAVE THE EDGE

More established frameworks are going to have a leg up on newer frameworks on which AI hasn't been as deeply trained. Which frameworks frontend devs choose to use, and the unique skills they bring to the table, will impact the job market and how development processes change.
Rita Kozlov
Senior Product Director, Cloudflare

DEV-TO-OPS RATIO INCREASES

The dev-to-ops headcount ratio will increase in 2024. Thanks to advances in AI co-pilots, developers are more efficient than ever. However, the Ops side remains slow to adopt and benefit from GenAI technologies. Unlike development, Ops functions require accurate and predictable outcomes — not a point of strength for LLMs. Many organizations have worked to apply AI models to production troubleshooting. Yet, none of these efforts have borne fruit, as production systems like Kubernetes and AWS are far too complex to be made fully autonomous. As development teams increasingly benefit from co-pilots and Ops teams become further mired in increasing complexity, the dev-to-ops headcount ratio has declined (i.e., fewer developers per Ops professional). This is highly problematic for the industry. Thankfully, we'll see a reversal of fortune in 2024 as AI models improve, producing greater efficiencies in Ops workflows, and as investment in DevOps platforms increases, creating better tooling and higher levels of abstraction.
Sheng Liang
Co-Founder and CEO, Acorn Labs

VECTOR DATABASES PLAY KEY ROLE IN TECH STACK

As new applications get built from the ground up with AI, and as LLMs become integrated into existing applications, I believe vector databases will play an increasingly important role in the tech stack, just as application databases have in the past. Teams will need scalable, easy-to-use, and operationally simple vector data storage as they seek to create AI-enabled products with new LLM-powered capabilities.
Avthar Sewrathan
GM for AI and Vector, Timescale
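
The core operation a vector database performs — nearest-neighbour search over embeddings — can be shown with a toy in-memory store. This is a teaching sketch only: production systems like the ones the prediction refers to add approximate indexes (e.g., HNSW), persistence, and filtering on top of this idea, and the embeddings here are tiny made-up vectors rather than real model output.

```python
# Toy in-memory vector store: brute-force cosine-similarity search.
# Real vector databases replace the linear scan with an approximate
# nearest-neighbour index so queries stay fast at scale.
import math

class VectorStore:
    def __init__(self):
        self.items: list[tuple[str, list[float]]] = []

    def add(self, doc_id: str, embedding: list[float]) -> None:
        self.items.append((doc_id, embedding))

    @staticmethod
    def _cosine(a: list[float], b: list[float]) -> float:
        dot = sum(x * y for x, y in zip(a, b))
        norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
        return dot / norm if norm else 0.0

    def query(self, embedding: list[float], k: int = 1) -> list[str]:
        """Return the ids of the k stored vectors most similar to the query."""
        scored = sorted(self.items, key=lambda it: self._cosine(it[1], embedding), reverse=True)
        return [doc_id for doc_id, _ in scored[:k]]

store = VectorStore()
store.add("doc-cats", [0.9, 0.1])   # hypothetical 2-d embeddings for illustration
store.add("doc-cars", [0.1, 0.9])
print(store.query([0.8, 0.2]))       # nearest neighbour of a cat-like query
```

The "operationally simple" part of the prediction is about hiding exactly this machinery behind a database-grade service.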

MLOPS INTEGRATES WITH DEVOPS

In 2024, MLOps will increasingly integrate with DevOps to create more streamlined workflows for AI projects. The combination of MLOps and DevOps creates a set of processes and automated tools for managing data, code, and models to enhance the efficiency of machine learning platforms. Data scientists and software developers will be free to move on to high-value projects without needing to manually oversee models. The trend is driven by the push to streamline delivering models to production and reduce time-to-value.
Haoyuan Li
Founder and CEO, Alluxio
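
One concrete form this MLOps/DevOps convergence takes is treating model evaluation like a test suite in CI: a deploy gate that promotes a candidate model only when its metrics clear a threshold, with no human in the loop. The function below is a hypothetical sketch of such a gate, not any particular platform's API.

```python
# Hypothetical CI/CD gate for models: promotion succeeds only when the
# candidate's evaluation metric clears a threshold, the same way a green
# test suite gates an application release.

def promote(metrics: dict, min_accuracy: float = 0.9) -> bool:
    """Return True if the candidate model may be deployed to production."""
    return metrics.get("accuracy", 0.0) >= min_accuracy

print(promote({"accuracy": 0.93}))  # candidate clears the gate
```

In a pipeline, a step like this replaces the manual "is the model good enough?" review that the prediction says data scientists will no longer need to perform.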

THE RISE OF AI-AS-A-SERVICE

It's already possible to use OpenAI's ChatGPT in your own applications, but being able to ground responses in your own proprietary datasets will bring much more value to businesses. This raises issues of data sovereignty and confidentiality, which will drive the rise not just of cloud-based AI services, but of the ability to run them in siloed cloud environments.
Ben Dechrai
Developer Advocate, Sonar
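
The usual pattern for grounding a hosted model in proprietary data is to retrieve the most relevant internal snippet and build the prompt around it, so the sensitive corpus never has to be baked into the model itself. The sketch below uses naive keyword overlap for retrieval and leaves the model call out entirely; the document contents and function names are invented for illustration.

```python
# Sketch of grounding an LLM in proprietary data: retrieve the most
# relevant private snippet, then construct the prompt from it. Retrieval
# here is naive word overlap; production systems use embeddings.

PRIVATE_DOCS = {
    "refunds": "Refunds are processed within 5 business days.",
    "shipping": "We ship to EU countries only.",
}

def retrieve(question: str) -> str:
    """Pick the private doc sharing the most words with the question."""
    q_words = set(question.lower().split())
    return max(PRIVATE_DOCS.values(),
               key=lambda d: len(q_words & set(d.lower().split())))

def build_prompt(question: str) -> str:
    """Assemble a prompt that confines the model to the retrieved context."""
    context = retrieve(question)
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

prompt = build_prompt("How long do refunds take?")
```

In the siloed-cloud scenario the prediction describes, both the retrieval store and the model serving this prompt run inside the customer's own environment.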

EVOLVING BEYOND THE CHATBOT

The breakout star of generative AI has been ChatGPT; consequently, in 2023 most interfaces to generative AI were chat. As designers and developers work with the technology, and as more specialized LLMs are produced, AI will fade into the background while more powerful applications are built upon it. Right now, chatbots are hammers and everything looks like a nail; to use AI to its full potential, we will need to move beyond that.
Phil Nash
Developer Advocate, Sonar
