Adi Kuruganti, Chief AI and Development Officer at Automation Anywhere, joins Amir to break down what it actually takes to build agents for the enterprise, not in theory, but in environments where complexity, governance, observability, and real business outcomes matter.
This conversation gets into the part of enterprise AI that most people skip. Not just what agents can do, but what changes when you have to deploy them across regulated systems, measure performance in production, manage model drift, and rethink how product and engineering teams ship software. It is a smart look at where enterprise AI is going, and what technical leaders need to understand before the market catches up.
What stood out
• Enterprise agents are only as strong as their data, context, and deployment model. In large companies, that means dealing with hybrid environments, air-gapped systems, privacy controls, and process-level context, not just model quality.
• AI is changing more than coding. Adi explains how his team is using AI across the full software development lifecycle, from spec creation and test generation to production event triage and release workflows.
• The release process is shifting from periodic launches to continuous iteration. That puts more pressure on observability, because teams now have to track model behavior, latency, and runtime performance as features roll out.
• Security can no longer sit off to the side. Prompt injection, shared-tenant risk, and post-production anomaly detection all require security teams to work much more closely with AI and product teams.
• Mass adoption is not just a technology problem. The tools are improving fast, but enterprises still need change management, clear use cases, internal operating models, and people who know how to make AI part of daily work.
Timestamped Highlights
00:00 Adi Kuruganti joins the show to unpack what enterprise agent development really looks like today, from deployment models to governance to observability.
02:07 Why enterprise agents are different. Adi explains why context, data control, and environment complexity matter more in large organizations.
04:57 How AI is reshaping the software development lifecycle. From code suggestions to automated tests to incident triage, AI is moving deeper into product delivery.
10:13 The old handoff model is breaking. Product, design, and engineering are starting to work in a much more fluid, AI-assisted way.
12:22 What changes in release management when AI writes part of the code and teams ship continuously instead of waiting for big release cycles.
18:17 How enterprises should judge agent performance, from human review and exception handling to evals, runtime benchmarks, and model drift.
27:21 Adi on the real AI adoption curve, job disruption, and why the bigger shift is not replacement, but making AI part of how people actually work every day.
A line worth sitting with
“AI should be a core element of how they work.”
Worth applying
• If you are building with AI, evaluate more than accuracy. Cost, latency, and consistency matter too.
• If you are leading teams, do not treat observability as a nice-to-have. Runtime visibility is part of the product now.
• If you are thinking about adoption, start with a real business problem and scale from early wins instead of trying to automate everything at once.
Follow the show for more conversations with the builders, operators, and technology leaders shaping how modern companies are actually being built.