Why AI engineering needs old-school discipline

By The New Stack

Key Concepts

  • AI-Powered Software Engineering: The integration of generative AI and autonomous agents into the software development lifecycle.
  • Systems Thinking: An approach that views software development as a holistic system involving people, processes, and technology, rather than just a tool-based change.
  • Coding Agents: AI models capable of writing, testing, and managing code, often working in "swarms" or as part of a human-machine team.
  • Cognitive Load: The mental effort required by both humans and AI agents to process information, manage codebases, and maintain architectural integrity.
  • DORA Metrics: A set of performance metrics (Deployment Frequency, Lead Time for Changes, Change Failure Rate, Time to Restore Service) used to measure software delivery effectiveness.
  • Mutation Testing: A method to evaluate the quality of software tests by introducing small changes (mutations) to the code to see if tests catch the errors.
  • Progressive Context Disclosure: A technique to manage AI cognitive load by providing only the necessary, relevant information for a specific task rather than the entire codebase.

1. Main Topics and Key Points

The discussion centers on the transition from AI experimentation (proof-of-concepts) to production-ready software engineering. Nimisha Asthagiri emphasizes that while companies are eager to adopt AI, many struggle to demonstrate business ROI.

  • The "Why" vs. "How": Companies often focus on "how to go faster," but should instead ask "what can we build now that we couldn't before?"
  • The Failure Rate: Citing Gartner, Asthagiri notes that 40% of agentic projects are projected to be canceled by 2027, largely due to a lack of strategic alignment and systemic integration.
  • Shift in Metrics: Success should be measured by iteration cycles, interaction quality, and "first-pass acceptance rates" (how often AI-generated code is accepted without rework) rather than just raw output volume.
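The first-pass acceptance rate mentioned above is straightforward to compute from review data. The sketch below assumes a hypothetical review log whose field names (`change_id`, `accepted_first_pass`) are invented for illustration:

```python
# Hypothetical review log: each entry records whether an AI-generated change
# was accepted without rework. Field names are assumptions for this sketch.
reviews = [
    {"change_id": 1, "accepted_first_pass": True},
    {"change_id": 2, "accepted_first_pass": False},
    {"change_id": 3, "accepted_first_pass": True},
    {"change_id": 4, "accepted_first_pass": True},
]

def first_pass_acceptance_rate(reviews):
    """Fraction of AI-generated changes merged without human rework."""
    accepted = sum(1 for r in reviews if r["accepted_first_pass"])
    return accepted / len(reviews)

rate = first_pass_acceptance_rate(reviews)
print(f"{rate:.0%}")  # prints 75%
```

Tracked over time, a rising rate suggests the human-AI collaboration is improving; a high raw output volume with a low acceptance rate is a warning sign of rework hidden downstream.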

2. Systems Thinking and Engineering Discipline

Asthagiri argues that AI is not just a tool but a systemic change.

  • Paved Roads: Organizations should build technical platforms that act as "paved roads," providing developers with standardized, secure, and efficient pathways to production.
  • Reinforcing Fundamentals: Traditional engineering disciplines—such as Test-Driven Development (TDD), modularity, and Zero Trust security—are more critical than ever to prevent "AI slop" (low-quality, unverified code).
  • Human Judgment: The role of the human is shifting from repetitive coding to higher-order judgment, architectural oversight, and strategic decision-making.

3. Coding Agents and Architectural Integrity

The conversation highlights the evolution of coding agents and the risks of unmanaged automation.

  • Agentic Topologies: Teams are moving toward role-specific agents (e.g., front-end vs. back-end specialists).
  • Caution on Swarms: While individual agents are maturing, "coding agent swarms" (hundreds of agents working simultaneously) are still experimental and pose risks regarding conflict resolution and regulatory compliance.
  • Architectural Decision Records (ADRs): Agents should be tasked with documenting their decisions so humans can review and reference them, as manual code review of AI-generated output is becoming unsustainable.
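One way an agent could emit a reviewable ADR is to generate Markdown from a structured record. The sketch below is an assumption, loosely following the common ADR convention (context / decision / consequences); the class, fields, and example content are all hypothetical:

```python
# Hypothetical sketch: an agent records an architectural decision as a
# structured object and renders it to Markdown for human review.
from dataclasses import dataclass

@dataclass
class ADR:
    number: int
    title: str
    context: str
    decision: str
    consequences: str

    def to_markdown(self) -> str:
        """Render the record using the common context/decision/consequences layout."""
        return (
            f"# ADR {self.number}: {self.title}\n\n"
            f"## Context\n{self.context}\n\n"
            f"## Decision\n{self.decision}\n\n"
            f"## Consequences\n{self.consequences}\n"
        )

adr = ADR(
    number=7,
    title="Use event sourcing for the audit service",
    context="Regulators require a replayable history of state changes.",
    decision="Persist domain events instead of mutable rows.",
    consequences="Higher storage cost; full auditability.",
)
print(adr.to_markdown())
```

Because the record is structured, it can also be indexed and searched, which matters when humans are reviewing decisions rather than reading every line of generated code.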

4. Managing the "Dark Code" and Volatility

A significant challenge identified is the explosion of code volume.

  • Code as Commodity: Because AI makes code generation cheap, organizations must be more selective about what they build versus buy.
  • Ephemeral Code: Asthagiri suggests a shift in mindset where some code is treated as temporary or "ephemeral," generated for a specific purpose and then discarded, rather than maintained indefinitely.
  • Dark Code: Similar to "dark data," organizations risk accumulating vast amounts of unmaintained, low-value code generated by AI.

5. Notable Quotes

  • "It’s not just AI, but AI that works." — Nimisha Asthagiri, on the philosophy of ThoughtWorks.
  • "The question that we're hearing a lot from executives is 'how do we go faster?'... A better question here might be 'what do we build given the latest technology that we couldn't build before?'"
  • "The machine's cognitive load is as much important as our human ones."

6. Synthesis and Conclusion

The primary takeaway is that the "AI FOMO" (fear of missing out) phase is giving way to a need for disciplined, strategic engineering. To achieve ROI, organizations must stop treating AI as a magic button for speed and start treating it as a component of a larger, well-architected system. By focusing on modularity, rigorous feedback loops (like mutation testing), and strategic "human-in-the-loop" oversight, companies can navigate the current hype cycle and build sustainable, high-value software. The ThoughtWorks Technology Radar serves as a key resource for tracking the maturity of these evolving techniques.
