When Agile Needs to Inspect Itself

There is a pattern I have seen repeat across organizations of different sizes and industries. Teams adopt an Agile framework, populate their calendars with the right ceremonies, use the right vocabulary in standups, and then wonder why delivery feels just as slow and friction-filled as before. This is Agile Theatre: the performance of methodology without the substance of it. SAFe is perhaps the most visible example, a framework so dense with alignment meetings and coordination layers that the overhead of running it can quietly consume the productivity gains it was supposed to create. I have been thinking about this problem lately, and two frameworks have helped me sharpen my view of it: Agile² and DORA.

The piece "Agile²" makes an argument worth sitting with. Agile, in its original form, was a correction to a specific failure mode: process worship. The Agile Manifesto pushed back against the idea that following a detailed plan was the same as delivering value. The uncomfortable irony is that many teams today have rebuilt that same failure mode with new vocabulary. Scrum ceremonies, SAFe PI planning, story point velocity tracking -- these can become the thing teams optimize for, rather than the actual delivery of working software.

Agile² is the discipline of applying Agile's own logic to Agile itself. Every ceremony, every artifact, every process gets subjected to a single question: does this practice increase the rate at which we deliver correct, valuable software? If the answer is no, it gets cut. The framework commitments that follow from this are notably concrete. Velocity must be defined in measurable terms. Ownership must stay close to the work. Every ceremony should be treated as an experiment with a sunset clause. And "because Scrum says so" is never an acceptable justification for anything. That last one deserves to be posted on the wall of every planning room.

DORA metrics give that single question some teeth. Developed by the DevOps Research and Assessment program, the four key metrics are Deployment Frequency, Lead Time for Changes, Change Failure Rate, and Mean Time to Restore. Together they shift the measurement conversation away from outputs like story points and velocity, and toward outcomes: how fast does working software reach production, how often does it break, and how quickly does the team recover when it does. That shift in focus is significant. You can hit every sprint goal and still have a slow, fragile delivery pipeline. DORA makes that visible.

What I find valuable about pairing these two frameworks is how well they fit together philosophically. Agile² supplies the discipline of questioning every process. DORA supplies the objective data to make that questioning honest. Without measurement, process improvement is just opinion. Without a willingness to question your processes, metrics become another form of theatre. Used together, they push teams toward something the original Manifesto was actually after: continuous, real improvement grounded in outcomes rather than ceremonies.

The deeper lesson here is not about any specific framework. It is about the habit of turning Agile's own tools back on Agile. Inspect and adapt is supposed to apply to how you work, not just to what you build. If your retrospectives never result in a ceremony being eliminated, if your velocity number is stable but your delivery confidence is not, it may be time to ask the hard question. Does this practice make us faster, or does it just make us feel organized? That question, asked honestly and regularly, is what separates genuine agility from the performance of it.
