28 March 2026

9 min Reading Time

66 percent of developers don’t believe their company’s productivity metrics reflect their actual work. Meanwhile, AI tools already write 41 percent of all code – and deployment stability has dropped by 7.2 percent, according to the Google DORA Report 2024. Developer Experience isn’t a “feel-good” topic. It’s the lever on which cloud teams’ productivity hinges – and most companies measure it incorrectly.

TL;DR

  • 66 percent of developers distrust their company’s productivity metrics (JetBrains State of Developer Ecosystem 2025).
  • 75 to 85 percent of developer time is spent waiting: industry-average Flow Efficiency stands at just 15 to 25 percent. Most developers wait – not develop.
  • AI writes 41 percent of code and saves 30 to 60 percent of time on routine tasks. Yet Code Churn (code overwritten shortly after being written) is projected to double by 2026. More output does not equal more value.
  • DORA alone is no longer enough: Deployment Frequency and Lead Time measure Delivery – not Experience. Frameworks like SPACE, DevEx, and DX Core 4 complement technical metrics with satisfaction and cognitive load.
  • 62 percent cite non-technical factors – such as communication, collaboration, and role clarity – as equally important for productivity as technical ones.

Why DORA Metrics Fall Short

For years, the four DORA metrics (Deployment Frequency, Lead Time for Changes, Change Failure Rate, Mean Time to Recovery) have been the gold standard for measuring DevOps performance. They assess how quickly and reliably a team delivers software. What they don’t measure: how the team feels while doing it.

The Google DORA Report 2024 reveals a troubling trend: delivery stability has declined by 7.2 percent – even as teams deploy more frequently than ever before. More deployments don’t automatically mean better software. The tight coupling between speed and stability that DORA champions begins to break down when Developer Experience is poor.

Google itself now states: No single metric fully captures developer productivity. The company tracks Speed, Ease, and Quality as distinct dimensions. That’s precisely where the expansion begins: DORA measures the machine – but not the human operating it.

66%
of developers distrust their company’s productivity metrics

Source: JetBrains State of Developer Ecosystem, 2025

What Developer Experience Really Is

DevEx describes the sum of all experiences a developer has while working – from onboarding and toolchain quality to architectural clarity. Authors of the SPACE framework (including Nicole Forsgren, co-author of the DORA Report) introduced the DevEx Framework in 2023, defining three core dimensions:

Feedback Loops: How quickly does a developer receive feedback on their code? CI/CD runtimes, code review wait times, and test results all matter. Long feedback loops kill productivity by disrupting flow.

Cognitive Load: How much mental capacity do the toolchain, architecture, and documentation consume? Every poorly integrated tool, undocumented API, or manual deployment step increases cognitive load – and reduces capacity for creative problem-solving.

Flow State: How often do developers achieve deep concentration? Meetings, Slack notifications, context switching between projects, and support tickets all prevent flow. Research shows it takes developers 23 minutes to regain focus after an interruption.

The Toolchain as a Productivity Killer

Industry-average Flow Efficiency sits at 15 to 25 percent. That means that in an eight-hour workday, a developer spends just one to two hours coding. The rest is waiting – for CI/CD pipelines, code reviews, Kubernetes deployments, approvals, and context from other teams.
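Flow Efficiency itself is straightforward to approximate from work-item state transitions, for example from a ticketing-system export. A minimal sketch in Python – the state names and timestamps here are hypothetical, not any real tool's API:

```python
from datetime import datetime

# Hypothetical state-transition log for one work item, e.g. exported
# from a ticketing system. State names are illustrative only.
ACTIVE_STATES = {"in_progress"}

transitions = [
    ("2026-03-02T09:00", "in_progress"),         # developer starts work
    ("2026-03-02T12:00", "waiting_for_review"),  # waits for a review
    ("2026-03-03T09:00", "in_progress"),         # addresses feedback
    ("2026-03-03T12:00", "done"),
]

def flow_efficiency(transitions, active_states):
    """Active time divided by total elapsed time, in percent."""
    parsed = [(datetime.fromisoformat(ts), state) for ts, state in transitions]
    active = total = 0.0
    for (start, state), (end, _) in zip(parsed, parsed[1:]):
        seconds = (end - start).total_seconds()
        total += seconds
        if state in active_states:
            active += seconds
    return 100 * active / total

print(f"Flow Efficiency: {flow_efficiency(transitions, ACTIVE_STATES):.0f}%")
# → Flow Efficiency: 22%
```

Six active hours out of 27 elapsed hours yields roughly 22 percent – squarely in the industry-average band described above.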

The toolchain is a key driver. A typical cloud team juggles 10 to 15 tools simultaneously: IDE, Git, CI/CD, container registry, Kubernetes, monitoring, logging, alerting, ticketing, documentation, and chat. Each tool switch is a context switch. Every login screen, sluggish UI, and missing integration costs minutes – adding up to hours per week.

Platform Engineering directly addresses this: An internal developer platform (IDP) consolidates tools, automates workflows, and reduces cognitive load. Gartner forecasts that by 2026, 80 percent of engineering organizations will operate Platform Engineering teams. Why? Because Developer Experience doesn’t scale with more tools – it scales with fewer.

“No single metric fully captures developer productivity. We track Speed, Ease, and Quality as separate dimensions.” Google, Developer Productivity Framework

AI as Accelerator – and Problem

84 percent of developers use – or plan to use – AI tools. AI already generates 41 percent of all code. According to JetBrains, nine out of ten developers save at least one hour per week using AI assistants. The productivity gains are real.

But they create a new challenge: Code Churn. The share of code written and then overwritten shortly thereafter is projected to double by 2026. AI generates code faster than humans can review it. The result: more pull requests, longer review queues, and declining code quality when reviews happen under time pressure.

For cloud teams, this means: AI tools improve Developer Experience only when embedded into existing workflows. An AI copilot that generates code in the IDE – but knows nothing about your internal APIs – creates technical debt, not productivity. The highest-performing teams feed AI tools with internal context: architecture documentation, API specifications, and SBOM data.
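A retrieval step of this kind can be prototyped in a few lines: rank internal snippets by similarity to the developer’s question and prepend the best matches to the prompt. The sketch below uses a simple bag-of-words cosine similarity for illustration; a production setup would use embedding models and a vector store, and all document contents here are invented:

```python
import math
import re
from collections import Counter

# Hypothetical internal knowledge snippets (contents are made up).
docs = {
    "payments-api": "The payments API requires an idempotency key header on every POST request.",
    "auth": "Service-to-service calls authenticate with short-lived JWT tokens issued by the gateway.",
    "deploy": "Deployments go through the staging pipeline; direct pushes to production are blocked.",
}

def vectorize(text):
    """Bag-of-words term counts, lowercased, punctuation stripped."""
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) \
         * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve_context(query, docs, k=1):
    """Return the names of the k snippets most similar to the query."""
    qv = vectorize(query)
    ranked = sorted(docs.items(), key=lambda kv: cosine(qv, vectorize(kv[1])), reverse=True)
    return [name for name, _ in ranked[:k]]

print(retrieve_context("How do I call the payments API?", docs))
# → ['payments-api']
```

The retrieved snippet would then be injected into the copilot’s prompt, so generated code respects the internal convention instead of guessing.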

Five Actions for Better Developer Experience

1. Measure and optimize Flow Efficiency. What percentage of developer time is active work versus waiting? Target: increase from 15 to 30 percent. Levers: accelerate CI/CD (builds under 10 minutes), cap code reviews at 24 hours, automate deployment pipelines.

2. Reduce Cognitive Load via Platform Engineering. Self-service infrastructure platforms, standardized templates for new services, and automated environment provisioning. Every manual step the platform handles lifts cognitive load from the developer.

3. Enforce a meeting diet for dev teams. Max two meeting days per week; the rest is Focus Time. No meetings before 11 a.m. Default to async-first communication for anything not requiring real-time alignment. This sounds cultural – but research is clear: every interruption costs 23 minutes to recover focus.

4. Track DevEx metrics alongside DORA. Quarterly Developer Satisfaction Surveys (measuring satisfaction, pain points, tool ratings); Flow Efficiency tracking; Cognitive Load assessments. Dropbox and Booking.com use the Developer Experience Index (DXI), linking DevEx directly to business outcomes. DX Core 4 unifies Speed, Effectiveness, Quality, and Impact in one framework.

5. Contextualize AI tools. Connect AI copilots to internal knowledge: architecture diagrams, API documentation, coding standards, security policies. A copilot that understands your company’s context produces less churn – and more usable code. That requires investment in Retrieval-Augmented Generation (RAG) and internal knowledge bases.
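Several of these actions can be monitored with small scripts against data teams already have. As one illustration, a sketch that flags pull requests breaching the 24-hour review cap from action 1 – the PR records are invented, and real timestamps would come from your Git hosting platform’s API:

```python
from datetime import datetime, timedelta

REVIEW_SLA = timedelta(hours=24)

# Hypothetical PR export: (id, opened_at, first_review_at).
prs = [
    (101, "2026-03-02T09:00", "2026-03-02T14:30"),
    (102, "2026-03-02T10:00", "2026-03-04T09:00"),
    (103, "2026-03-03T08:00", "2026-03-03T20:00"),
]

def sla_breaches(prs, sla=REVIEW_SLA):
    """Return IDs of PRs whose first review took longer than the SLA."""
    breaches = []
    for pr_id, opened, reviewed in prs:
        wait = datetime.fromisoformat(reviewed) - datetime.fromisoformat(opened)
        if wait > sla:
            breaches.append(pr_id)
    return breaches

print(sla_breaches(prs))  # → [102]  (47 hours until first review)
```

Run weekly, a report like this makes review wait time visible – the precondition for capping it.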

Conclusion

Developer Experience is not a wellness program. It’s the multiplier for cloud teams’ productivity. 66 percent of developers distrust current metrics. 75 to 85 percent of their time is spent waiting. And AI-generated code introduces new quality issues – if not integrated into the workflow. Companies taking DevEx seriously invest in Platform Engineering, measure Flow Efficiency, and create conditions for Flow State. Everyone else pays the productivity penalty in longer release cycles, higher turnover, and lower-quality code. DORA measures the machine. DevEx measures the human. Only together do they deliver the full picture.

Frequently Asked Questions

What’s the difference between DORA and DevEx?

DORA measures software delivery performance: how fast and reliably a team ships software (Deployment Frequency, Lead Time, Change Failure Rate, MTTR). DevEx measures the developer’s experience: how satisfied, productive, and focused they feel while working (Feedback Loops, Cognitive Load, Flow State). They’re complementary.

How do you measure Developer Experience?

Three approaches: quarterly Developer Satisfaction Surveys (covering satisfaction, pain points, tool ratings); Flow Efficiency tracking (active work time vs. wait time, derived from ticketing and CI/CD data); and Cognitive Load assessment (surveys on toolchain complexity, context switches, and interruptions). DX Core 4 and the Developer Experience Index (DXI) offer standardized frameworks.
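Survey results reduce to per-dimension scores in a few lines. A sketch assuming 1-to-5 Likert responses; the dimension names and the threshold are made up for illustration:

```python
from statistics import mean

# Hypothetical 1-5 Likert responses, one dict per developer.
responses = [
    {"satisfaction": 4, "toolchain": 2, "cognitive_load_ok": 2},
    {"satisfaction": 3, "toolchain": 3, "cognitive_load_ok": 2},
    {"satisfaction": 5, "toolchain": 2, "cognitive_load_ok": 3},
]

def dimension_scores(responses):
    """Mean score per survey dimension, rounded to two decimals."""
    dims = responses[0].keys()
    return {d: round(mean(r[d] for r in responses), 2) for d in dims}

def pain_points(scores, threshold=3.0):
    """Dimensions scoring below the threshold deserve attention first."""
    return [d for d, s in scores.items() if s < threshold]

scores = dimension_scores(responses)
print(scores)
print(pain_points(scores))  # → ['toolchain', 'cognitive_load_ok']
```

Tracked quarterly, the per-dimension trend matters more than any single absolute score.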

Does AI make developers more productive?

Yes – but with caveats. 84 percent use AI tools; nine out of ten save at least one hour weekly. Yet Code Churn is projected to double by 2026 because AI generates more code than teams can meaningfully review. The net effect depends on whether teams integrate AI-generated code into their quality processes – or simply produce more.

What is Flow Efficiency?

The share of time a developer spends on active work (writing code, solving problems) versus total elapsed time – including wait time (for builds, reviews, deployments, approvals). Industry average: 15 to 25 percent. Top performers reach 40 percent or more.

Is Platform Engineering worthwhile for small teams?

Yes – starting around five to ten developers. Below that, the overhead of a dedicated Platform team is too high. But even small teams can improve DevEx: standardized templates, automated CI/CD, and clear documentation require no Platform team – just discipline. Start with a shared-responsibility platform instead of a dedicated team.

Further Reading

Platform Engineering 2026: Internal Developer Platforms

Container Supply Chain Security: 87 Percent of Docker Images

Ingress-NGINX End-of-Life: Migrating to Gateway API


A magazine by Evernine Media GmbH