You Cannot Automate Your Way Out of Dysfunction

“The purpose of a system is what it does.” – Stafford Beer

A team drowning in slow approvals decides the problem is speed, so they add AI. Leadership announces a new assistant to unlock productivity. A chatbot is rolled out to close knowledge gaps and reduce internal friction. Demos look promising. Early outputs feel impressive. But weeks later, nothing fundamental has changed. Decisions are still unclear. Ownership is still fuzzy. Data is still inconsistent. If anything, the noise level has increased. AI doesn't repair broken systems; it accelerates their weaknesses.

AI multiplies what already exists. Unclear decision rights lead directly to faster confusion: if no one knows who owns a decision, AI generates more options, more stakeholders weigh in, and decision latency increases. AI expands the surface area of disagreement. Poor data produce confidently wrong outputs. If your data are incomplete, inconsistent, or politically curated, AI will generate plausible but incorrect conclusions, inflate executive overconfidence, and mask underlying data rot. Garbage in, authoritative garbage out.

Avoidance culture uses AI as a shield. In fragile cultures, leaders use AI outputs to avoid accountability. “The model recommended it,” they’ll say, as if probability were a substitute for judgment. Decisions that should require ownership get reframed as neutral, data-driven inevitabilities. When outcomes go poorly, responsibility dissolves into the abstraction of the algorithm. No one chose. The system did. Over time, this erodes trust. Teams learn that AI is not a tool to inform thinking but a buffer to absorb blame. Instead of sharpening strategy, it becomes a political instrument that protects decision-makers from the consequences of their own calls.

Hero culture finds itself mired in AI-induced chaos. If your organization rewards last-minute saves and visible urgency, AI becomes jet fuel poured onto an already unstable system. Prototypes multiply. Features ship faster. Slack channels light up with output. But velocity without guardrails is not progress. Documentation lags. Testing compresses. Dependencies get ignored. The same leaders who celebrate speed now preside over escalating instability. Firefighting becomes constant. You ship faster, but you also break faster, and the heroics become more frequent because the underlying system was never designed to sustain that pace.

AI can create the illusion of productivity: more drafts, more dashboards, more summaries, more analysis. But productivity is not output volume. Real productivity requires clear ownership, clean inputs, aligned incentives, and decision discipline. Without those, AI just creates more noise.

Used well, though, AI can serve as a culture diagnostic. Instead of treating it as a cure, treat it as a stress test, a mirror held up to the organization. Where AI fails, look at governance, data pipelines, decision rights, and incentives.
 
Before AI can create real leverage, certain foundations have to exist. Workflows must be documented so the system has something stable to enhance. Accountability must be defined so outputs translate into decisions rather than debate. Data hygiene must be taken seriously so confidence is earned, not assumed. Leadership must be willing to make calls instead of hiding behind probabilities. Incentives must reward long-term value instead of short-term optics. AI performs best in high-trust, well-run systems. Without that foundation, AI simply reflects the organization. 
