Outcomes, Not Outputs: Making Measurement Meaningful
February 08, 2024

Dave Laribee
Nerd/Noir

During British colonial rule in India, authorities in the snake-ridden city of Delhi offered rewards for dead cobras. The offer backfired when people recognized the incentive as a business opportunity and started breeding cobras, resulting in overpopulation when the scheme ended. This came to be known as the "Cobra Effect."

I've seen a similar dynamic play out when leaders set core metrics to evaluate individuals and product and engineering teams. As Goodhart's Law states, when a measure becomes a target, it ceases to be a good measure.

Or the metrics are opaque and arbitrary because developers weren't involved in setting the measurement strategy. Or they're irrelevant, better suited to describing manufacturing work ("launch the new model by end of year") than software development, where a target like number of commits says little about the value delivered.

No matter the issue, metrics are rarely the best way to understand how the work developers are doing drives the business forward.

Of course, teams need some way to quantify progress. So how can leaders measure and understand how well individuals and teams are working?

Outcomes are a better way to gauge success than outputs. Outcomes shift the focus away from arbitrary numbers like lines of code and toward the real impact on customers and industries.

How Metrics Fall Short

Contrary to what McKinsey consultants claim, we have plenty of well-established metrics for software development.

Most of our established metrics measure productivity (how much a team or individual produces in a given timeframe) or performance (the degree of skill with which tasks are completed). We can use velocity, deployment frequency, or any number of other metrics to illustrate what our teams are doing.

But these metrics don't help us understand what's going wrong when the numbers aren't what we expect. Metrics don't tell us if certain code is difficult to work with or if clunky release processes impact productivity.

As the Cobra Effect demonstrates, metrics can also create perverse incentives. Developers may not be breeding snakes to earn a bounty, but they may game metrics if pressured to meet them, such as by taking on only simple tasks to pad velocity.

This is especially true if metrics are employed solely for the benefit of leadership. Individuals are expected to carry out a strategy with no insight into its development and no avenue to provide feedback.

Worst of all, metrics don't actually tell us whether developer work matters. A team might execute an activity repeatedly with proficiency, even excellence — but if that activity isn't relevant to organizational goals, then what's the point?

Work to Achieve Outcomes, Not to Juice Metrics

When I'm helping a large-scale software development organization transform how they work, one of my main goals is to shift their focus from metrics to outcomes. Leading with outcomes is a more effective way to assess performance and productivity.

An outcome is a result we want to bring about — not the lines of code a team will produce, but the effect that code will have on users or businesses.

Outcomes help developers understand their work in terms of value. Too often, developers don't understand exactly who they're building for or why. Outcomes contribute to a stronger mental model of a customer or user and show how technology is expected to drive business results.

We're not suggesting throwing out metrics altogether. Instead, we're using outcomes to determine which metrics we should look at based on what we're trying to achieve. Then we use those metrics — which may not be the same for every team in the organization — to gauge progress.

How to Start Measuring in Terms of Outcomes

Outcomes often begin as a hunch. We take a page from the product manager's discovery playbook to get from gut feeling to validated outcome: we ask questions, gather data, and sift through insights to make sure that our outcome is relevant and valuable. The ideal outcome is small and achievable, yielding a path toward larger ones.

For example, say an engineering manager thinks her team could deliver value at a more predictable pace. She validates her hunch with data from project tracking software, which shows a wide variance in story size.

Then she uses a developer experience platform to dig deeper into drivers that influence productivity and assess her team against industry benchmarks.

She discovers that her team has low satisfaction scores on requirements quality and batch size, indicating that they're struggling to release early and often. She scopes her original outcome to something more concrete and immediate: "We work on small stories to ensure a consistent pace of delivery."

Now she can work backward with her team to find the contextual metrics that indicate progress toward that outcome — like decreased average story size and increased iteration completion percentage.
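Contextual metrics like these are straightforward to compute from tracking data. A minimal sketch, using hypothetical story sizes and iteration counts (illustrative numbers, not real data), might look like this:

```python
from statistics import mean, pstdev

# Hypothetical story sizes (in points) exported from project
# tracking software -- illustrative numbers, not real data.
story_points = [1, 2, 13, 3, 8, 2, 21, 5]

# Iteration history: (stories planned, stories completed) per iteration.
iterations = [(10, 6), (8, 7), (12, 8)]

avg_story_size = mean(story_points)
story_size_spread = pstdev(story_points)  # a wide spread suggests an unpredictable pace

completion_pct = 100 * sum(done for _, done in iterations) / sum(
    planned for planned, _ in iterations
)

print(f"average story size: {avg_story_size:.1f} points")
print(f"story size spread (std dev): {story_size_spread:.1f}")
print(f"iteration completion: {completion_pct:.0f}%")
```

Tracked over time, a falling average story size and spread alongside a rising completion percentage would signal progress toward the "small stories, consistent pace" outcome.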

Once the initial outcome is achieved, the team can shift to a new outcome — which likely means new metrics. This iterative process facilitates continuous improvement.

A Consensus-Driven Approach to Delivering Real Value

Notice that in the example above, the engineering manager works with her team to develop their strategy for measuring productivity. The shift from metrics to outcomes isn't just about pointing a team toward results that meaningfully help users or businesses. It's about creating a culture of transparency by involving developers in the process of defining what success means for individuals and the team.

That includes educating developers on what outcomes are and why they matter, as well as actively soliciting and responding to feedback. Done right, this shift — from metrics to outcomes, and from top-down mandate to bottom-up empowerment — will give developers a new level of agency in innovating and solving problems.

When developers have the opportunity, resources, and motivation to improve, everyone wins: teams, leaders, and the people and companies who will eventually use our products.

Dave Laribee is Co-Founder and CEO of Nerd/Noir.
