
The IT/OT Insider Podcast - Pioneers & Pathfinders

By David Ariens and Willem van Lammeren

Available Episodes

5 of 27
  • Connecting People, Parts and Processes with Tego’s Tim Butler
Welcome to another episode of the IT/OT Insider Podcast. Today, we’re diving into visibility, traceability, and real-time analytics with Tim Butler, CEO and founder of Tego. For the last 20 years, Tego has specialized in tracking and managing critical assets in industries like aerospace, pharmaceuticals, and energy. The company designed the world’s first rugged, high-memory passive UHF RFID chip, helping companies like Airbus and Boeing digitize lifecycle maintenance on their aircraft.

It’s a fascinating topic—how do you keep track of assets that move across the world every day? How do you embed intelligence directly into physical components? How does all of this connect to the broader challenge of IT and OT convergence? And… how do you create a unified view that connects people, parts, and processes to business outcomes? Let’s dive in!

Thanks for reading The IT/OT Insider! Subscribe for free to receive new posts and support our work.

From Serial Entrepreneur to Asset Intelligence

Tim’s journey into asset intelligence started 20 years ago, when he saw a major opportunity in industrial RFID technology.

"At the time, RFID chips had only 96 or 128 bits of storage. That was enough for a serial number, but not much else. We set out to design a chip that could hold thousands of times more memory—and that completely changed the game."

That chip became the foundation for Tego’s work in aerospace:
* Boeing and Airbus needed a better way to track assets on planes.
* Maintenance logs and compliance records needed to (virtually) move with the asset itself.
* Standard RFID solutions didn’t have enough memory or durability to survive extreme conditions.

By designing high-memory RFID chips, Tego helped digitize aircraft maintenance and inventory management. They co-authored the ATA Spec 2000 Chapter 9-5 standards that are now widely used in aerospace.

"The challenge was clear—planes fly all over the world, so the data needed to travel with them. We had to embed intelligence directly into the assets themselves."

A Real-World Use Case: Tracking Aircraft Components with RFID

One of the best examples of Tego’s impact is in the aerospace industry.

The Challenge:
* Aircraft components need regular maintenance and compliance tracking.
* Traditional tracking methods relied on centralized databases, which weren’t always accessible.
* When a plane lands, maintenance teams need instant access to accurate, up-to-date records.

The Solution:
* Every critical component (seats, life vests, oxygen generators, galley equipment, etc.) is tagged with a high-memory RFID chip (yes, the seat on your next flight probably has one 🙂).
* When a technician scans a tag, they instantly access the asset’s history.

The Impact:
* Reduced maintenance delays—Technicians no longer have to search for data across multiple systems.
* Improved traceability—Every asset has a digital history that travels with it.
* Compliance enforcement—Airlines can quickly verify whether components meet regulatory requirements.

"This isn’t just about making inventory tracking easier. It’s about ensuring safety, reducing downtime, and making compliance effortless."
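The episode keeps returning to one idea: the maintenance history travels on the tag itself rather than living only in a central database. As a rough, hypothetical illustration of what that means in practice, here is a minimal Python sketch of an on-tag maintenance record. The field names, memory size, and JSON encoding are assumptions for illustration; they do not reproduce the ATA Spec 2000 Chapter 9-5 layout or any Tego API.

```python
# Hypothetical sketch of a maintenance record carried in the user memory of a
# high-memory RFID tag. Field names and the 8 KB capacity are illustrative only.
from dataclasses import dataclass, asdict
import json

@dataclass
class MaintenanceRecord:
    part_number: str      # manufacturer part number
    serial_number: str    # serial of this specific component
    action: str           # e.g. "inspected", "repaired", "replaced"
    station: str          # where the work was performed
    timestamp: str        # ISO 8601, readable offline by any scanner
    technician_id: str    # who signed off

def encode_for_tag(history, capacity_bytes=8192):
    """Serialize the full history compactly and check it fits in tag memory."""
    payload = json.dumps([asdict(r) for r in history], separators=(",", ":")).encode("utf-8")
    if len(payload) > capacity_bytes:
        raise ValueError(f"history is {len(payload)} bytes, tag holds {capacity_bytes}")
    return payload

history = [MaintenanceRecord("OXY-GEN-114", "SN-004211", "inspected",
                             "FRA line maintenance", "2025-03-02T09:14:00Z", "T-1187")]
print(len(encode_for_tag(history)), "bytes to write to user memory")
```

Because the record lives on the component, a technician with a handheld reader gets the history even when the aircraft is parked somewhere with no link back to the airline’s central systems.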
The IT vs. OT Divide in Aerospace

A major theme of our podcast is the convergence of IT and OT—and in aerospace, that divide is particularly pronounced. Tim breaks it down:
* IT teams manage enterprise data—ERP systems, databases, and security.
* OT teams manage physical assets—maintenance operations, plant floors, and repair workflows.
* Both need access to the same data, but they use it differently.

"IT thinks in terms of databases and networks. OT thinks in terms of real-world processes. The goal isn’t just connecting IT and OT—it’s making sure they both get the data they need in a usable way."

The Future of AI and Asset Intelligence

With all the buzz around AI and Large Language Models (LLMs), we asked Tim how these technologies are impacting industrial asset intelligence. His take? AI is only as good as the data feeding it.

"If you don’t have structured, reliable data, AI can’t do much for you. That’s why asset intelligence matters—it gives AI the high-quality data it needs to make meaningful predictions."

Some of the key trends he sees:
* AI-powered maintenance recommendations—Analyzing historical asset data to predict failures before they happen.
* Automated compliance checks—Using AI to validate and flag compliance issues before inspections.
* Smart inventory optimization—Ensuring that spare parts are always available where they’re needed most.

But the biggest challenge? Data consistency.

"AI works best when it has standardized, structured data. That’s why using industry standards—like ATA Spec 2000 for aerospace—is so important."

Final Thoughts

Industrial asset intelligence is evolving rapidly, and Tego is leading the way in making assets smarter, more traceable, and more autonomous. From tracking aircraft components to ensuring regulatory compliance in pharma, Tego’s technology blends physical and digital worlds, making it easier for companies to manage assets at a global scale. Together with Tego, businesses create a single source of truth for people, processes, and parts that empowers operations with the vision to move forward.

If you’re interested in learning more about Tego and their approach to asset intelligence, visit www.tegoinc.com.

Stay Tuned for More!

Subscribe to our podcast and blog to stay updated on the latest trends in Industrial Data, AI, and IT/OT convergence.

🚀 See you in the next episode!

Youtube: https://www.youtube.com/@TheITOTInsider Apple Podcasts: Spotify Podcasts:

Disclaimer: The views and opinions expressed in this interview are those of the interviewee and do not necessarily reflect the official policy or position of The IT/OT Insider. This content is provided for informational purposes only and should not be seen as an endorsement by The IT/OT Insider of any products, services, or strategies discussed. We encourage our readers and listeners to consider the information presented and make their own informed decisions.

This is a public episode. If you would like to discuss this with other subscribers or get access to bonus episodes, visit itotinsider.substack.com
    --------  
    36:06
  • Industrial DataOps #12 with HiveMQ – Dominik Obermaier on MQTT, UNS and Massive Scale
Welcome to the final episode of our special Industrial DataOps podcast series. And what better way to close out the series than with Dominik Obermaier, CEO and co-founder of HiveMQ—one of the most recognized names when it comes to MQTT and Unified Namespace (UNS). Dominik has been at the heart of the MQTT story from the very beginning—contributing to the specification, building the company from the ground up, and helping some of the world’s largest manufacturers, energy providers, and logistics companies reimagine how they move and use industrial data.

Every Company is Becoming an IoT Company

Dominik opened with a striking analogy:

"Just like every company became a computer company in the ‘80s and an internet company in the ‘90s, we believe every company is becoming an IoT company."

And that belief underpins HiveMQ’s mission—to build the digital backbone for the Internet of Things, connecting physical assets to digital applications across the enterprise.

Subscribe for free to receive new posts and support our work.

Today, HiveMQ is used by companies like BMW, Mercedes-Benz, and Lilly to enable real-time data exchange from edge to cloud, using open standards that ensure long-term flexibility and interoperability.

What is MQTT?

For those new to MQTT, Dominik explains what it is: a lightweight, open protocol built for real-time, scalable, and decoupled communication. Originally developed in the late 1990s for oil pipeline monitoring, MQTT was designed to minimize bandwidth, maximize reliability, and function in unstable network conditions. It uses a publish-subscribe pattern, allowing producers and consumers of data to remain decoupled and highly scalable—ideal for IoT and OT environments, where devices range from PLCs to cloud applications.

"HTTP works for the internet of humans. MQTT is the protocol for the internet of things."

The real breakthrough came when MQTT became an open standard. HiveMQ has been a champion of MQTT ever since—helping manufacturers escape vendor lock-in and build interoperable data ecosystems.
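To make the publish-subscribe idea concrete, here is a minimal sketch that streams a single PLC tag to a UNS-style topic using the paho-mqtt client. The broker address and the ISA-95-flavoured topic hierarchy are assumptions for illustration, not a HiveMQ-specific API.

```python
# Minimal publish-subscribe sketch: one PLC tag streamed to a UNS-style MQTT topic.
# Broker address and topic hierarchy are illustrative assumptions.
import json, time
import paho.mqtt.publish as publish   # pip install paho-mqtt

BROKER = "broker.example.com"         # e.g. a HiveMQ Edge or HiveMQ Cloud endpoint
TOPIC = "acme/eindhoven/packaging/line3/filler/temperature"  # enterprise/site/area/line/cell/tag

payload = json.dumps({
    "value": 72.4,
    "unit": "degC",
    "timestamp": time.time(),
    "quality": "GOOD",
})

# QoS 1 for at-least-once delivery; retain=True so a subscriber that joins
# later immediately receives the last known value for this part of the namespace.
publish.single(TOPIC, payload, qos=1, retain=True, hostname=BROKER, port=1883)
```

Because publisher and subscribers only share the topic, a dashboard, a historian, and an analytics job can all consume the same tag without the PLC knowing any of them exist; that is the decoupling Dominik describes.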
From Broker to Backbone: Mapping HiveMQ to the Capability Model

HiveMQ is often described as an MQTT broker, but as Dominik made clear, it's far more than that. Let’s map their offerings to our Industrial DataOps Capability Map:

Connectivity & Edge Ingest →
* HiveMQ Edge: A free, open-source gateway to connect to OPC UA, Modbus, BACnet, and more.
* Converts proprietary protocols into MQTT, making data accessible and reusable.

Data Transport & Integration →
* HiveMQ Broker: The core engine that enables highly reliable, real-time data movement across millions of devices.
* Scales from single factories to hundreds of millions of data tags.

Contextualization & Governance →
* HiveMQ Data Hub and Pulse: Tools for data quality, permissions, history, and contextual metadata.
* Pulse enables distributed intelligence and manages the Unified Namespace across global sites.

UNS Management & Visualization →
* HiveMQ Pulse is a true UNS solution that provides structure, data models, and insights without relying on centralized historians.
* Allows tracing of process changes, root cause analysis, and real-time decision support.

Building the Foundation for Real-Time Enterprise Data

Few topics have gained as much traction recently as UNS (Unified Namespace). But as Dominik points out, UNS is not a product—it’s a pattern. And not all implementations are created equal.

"Some people claim a data lake is a UNS. Others say it’s OPC UA. It’s not. UNS is about having a shared, real-time data structure that’s accessible across the enterprise."

HiveMQ Pulse provides a managed, governed, and contextualized UNS, allowing companies to:
* Map their assets and processes into a structured namespace.
* Apply insights and rules at the edge—without waiting for data to reach the cloud.
* Retain historical context while staying close to real-time operations.

"A good data model will solve problems before you even need AI. You don’t need fancy tech—you need structured data and the ability to ask the right questions."

Fix the Org Before the Tech

One of the most important takeaways from this conversation was organizational readiness. Dominik was clear:

"You can’t fix an organizational problem with technology."

Successful projects often depend on having:
* A digital transformation bridge team between IT and OT.
* Clear ownership and budget—often driven by a C-level mandate.
* A shared vocabulary, so teams can align on definitions, expectations, and outcomes.

To help customers succeed, HiveMQ provides onboarding programs, certifications, and educational content to establish this common language.

Use Case

One specific use case we’d like to highlight is at Lilly, a pharmaceutical company.

Getting Started with HiveMQ & UNS

Dominik shared practical advice for companies just starting out:
* Begin with open-source HiveMQ Edge and Cloud—no license or sales team required.
* Start small—connect one PLC, stream one tag, and build from there.
* Demonstrate value quickly—show how a single insight (like predicting downtime from a temperature drift) can justify further investment.
* Then scale—build a sustainable, standards-based data architecture with the support of experienced partners.

Final Thoughts: A Fitting End to the Series

This episode was the perfect way to end our Industrial DataOps podcast series—a conversation that connected the dots between open standards, scalable data architecture, organizational design, and future-ready analytics (and don’t worry, we have lots of other podcast ideas for the months to come :)).

HiveMQ’s journey—from a small startup to powering the largest industrial IoT deployments in the world—is proof that open, scalable, and reliable infrastructure will be the foundation for the next generation of digital manufacturing.

If you want to learn more about MQTT, UNS, or HiveMQ Pulse, check out the excellent content at www.hivemq.com or their article on DataOps.

Stay Tuned for More!

Subscribe to our podcast and blog to stay updated on the latest trends in Industrial Data, AI, and IT/OT convergence.

🚀 See you in the next episode!

Youtube: https://www.youtube.com/@TheITOTInsider Apple Podcasts: Spotify Podcasts:

Disclaimer: The views and opinions expressed in this interview are those of the interviewee and do not necessarily reflect the official policy or position of The IT/OT Insider. This content is provided for informational purposes only and should not be seen as an endorsement by The IT/OT Insider of any products, services, or strategies discussed. We encourage our readers and listeners to consider the information presented and make their own informed decisions.

This is a public episode. If you would like to discuss this with other subscribers or get access to bonus episodes, visit itotinsider.substack.com
    --------  
    43:37
  • Industrial DataOps #11 with AVEVA – Clemens & Roberto on Unlocking the Value of Industrial Data
Welcome to Episode 11! As we get closer to Hannover Messe 2025, we’re also approaching the final episodes of this podcast series. Today we have two fantastic guests from AVEVA: Roberto Serrano Hernández, Technology Evangelist for the CONNECT industrial intelligence platform, and Clemens Schönlein, Technology Evangelist for AI and Analytics. Together, they bring a unique mix of deep technical insight, real-world project experience, and a passion for making industrial data usable, actionable, and valuable.

We cover a lot in this episode: from the evolution of AVEVA's CONNECT industrial intelligence platform, to real-world use cases, data science best practices, and the cloud vs. on-prem debate. It’s a powerful conversation on how to build scalable, trusted, and operator-driven data solutions.

Thanks for reading The IT/OT Insider! Subscribe for free to receive new posts.

What is CONNECT?

Let’s start with the big picture. What is the CONNECT industrial intelligence platform? As Roberto explains:

"CONNECT is an open and neutral industrial data platform. It brings together all the data from AVEVA systems—and beyond—and helps companies unlock value from their operational footprint."

This isn’t just another historian or dashboard tool. CONNECT is a cloud-native platform that allows manufacturers to:
* Connect to on-prem systems.
* Store, contextualize, and analyze data.
* Visualize it with built-in tools or share it with AI platforms like Databricks.
* Enable both data scientists and domain experts to collaborate on decision-making.

It’s also built to make the transition to cloud as seamless as possible—while preserving compatibility with legacy systems.

"CONNECT is for customers who want to do more – close the loop, enable AI, and future-proof their data strategy."

Where CONNECT Fits in the Industrial Data Capability Map

Roberto breaks it down neatly:
* Data Acquisition – Strong roots in industrial protocols and legacy system integration.
* Data Storage and Delivery – The core strength of CONNECT: clean, contextualized, and trusted data in the cloud.
* Self-Service Analytics & Visualization – Tools for both data scientists and OT operators to work directly with data.
* Ecosystem Integration – CONNECT plays well with Databricks, Snowflake, and other analytics platforms.

But Clemens adds an important point:

"The point isn’t just analytics—it’s about getting insights back to the operator. You can’t stop at a dashboard. Real value comes when change happens on the shop floor."

Use Case Spotlight: Stopping Downtime with Data Science at Amcor

One of the best examples of CONNECT in action is the case of Amcor, a global packaging manufacturer producing the plastic film used in things like chip bags and blister packs.

The Problem:
* Machines were stopping unpredictably, causing expensive downtime.
* Traditional monitoring couldn’t explain why.
* Root causes were hidden upstream in the process.

The Solution:
* CONNECT was used to combine MES data and historian data in one view.
* Using built-in analytics tools, the team found that a minor drift in a temperature setpoint upstream was causing the plastic’s viscosity to change—leading to stoppages further down the line.
* They created a correlation model, mapped it to ideal process parameters, and fed the insight back to operators.

"The cool part was the speed," said Clemens. "What used to take months of Excel wrangling and back-and-forth can now be done in minutes."
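For readers who want a feel for what such a correlation analysis looks like, here is an illustrative pandas sketch of the pattern described above, not the actual AVEVA CONNECT workflow. The file name, column names, process lag, and stoppage flag are all assumptions.

```python
# Illustrative sketch (not the AVEVA CONNECT API): correlating an upstream
# temperature drift with downstream stoppages. All names are assumptions.
import pandas as pd

df = pd.read_csv("line_history.csv", parse_dates=["timestamp"])  # assumed minute-level export

# Drift of the upstream extruder temperature away from its setpoint.
df["temp_drift"] = df["extruder_temp_actual"] - df["extruder_temp_setpoint"]

# Shift the drift forward to account for the time the film needs to reach
# the downstream station (assumed 12 minutes of process lag).
df["temp_drift_lagged"] = df["temp_drift"].shift(12)

# downstream_stop is 1 in minutes where the line stopped, 0 otherwise.
corr = df["temp_drift_lagged"].corr(df["downstream_stop"])
print(f"correlation between lagged temperature drift and stoppages: {corr:.2f}")
```

The point of the episode is less the statistics and more the turnaround time: with MES and historian data already combined and contextualized, this kind of question takes minutes instead of months of manual spreadsheet work.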
The Human Side of Industrial Data: Start with the Operator

One of the most powerful themes in this episode is the importance of human-centric design in analytics. Clemens shares from his own experience:

"I used to spend months building an advanced model—only to find out the data wasn't trusted or the operator didn’t care. Now I start by involving the operator from Day 1."

This isn’t just about better UX. It’s about:
* Getting faster buy-in.
* Shortening time-to-value.
* Ensuring that insights are actionable and respected.

Data Management and Scaling Excellence

We also touched on the age-old challenge of data management. AVEVA’s take? Don’t over-architect. Start delivering value.

"Standardization is important—but don’t wait five years to get it perfect. Show value early, and the standardization will follow."

And when it comes to building centers of excellence, Clemens offers a simple yet powerful principle:

"Talk to the people who press the button. If they don’t trust your model, they won’t use it."

Final Thoughts

As we edge closer to Hannover Messe, and to the close of this podcast series, this episode with Clemens and Roberto reminds us what Industrial DataOps is all about:
* Useful data
* Actionable insights
* Empowered people
* Scalable architecture

If you want to learn more about AVEVA's CONNECT industrial intelligence platform and their work in AI and ET/OT/IT convergence, visit: www.aveva.com

Stay Tuned for More!

Subscribe to our podcast and blog to stay updated on the latest trends in Industrial Data, AI, and IT/OT convergence.

🚀 See you in the next episode!

Youtube: https://www.youtube.com/@TheITOTInsider Apple Podcasts: Spotify Podcasts:

Disclaimer: The views and opinions expressed in this interview are those of the interviewee and do not necessarily reflect the official policy or position of The IT/OT Insider. This content is provided for informational purposes only and should not be seen as an endorsement by The IT/OT Insider of any products, services, or strategies discussed. We encourage our readers and listeners to consider the information presented and make their own informed decisions.

This is a public episode. If you would like to discuss this with other subscribers or get access to bonus episodes, visit itotinsider.substack.com
    --------  
    33:11
  • Industrial DataOps #10 with Celebal Technologies - Anupam Gupta on ERP, AI, and Lake Houses in Manufacturing
Welcome to Episode 10 of the IT/OT Insider Podcast. Today, we're pleased to feature Anupam Gupta, Co-Founder & President North Americas at Celebal Technologies, to discuss how enterprise systems, AI, and modern data architectures are converging in manufacturing.

Celebal Technologies is a key partner of SAP, Microsoft, and Databricks, specializing in bridging traditional enterprise IT systems with modern cloud data and AI innovations. Unlike many of our past guests who come from a manufacturing-first perspective, Celebal Technologies approaches the challenge from the enterprise side—starting with ERP and extending into industrial data, AI, and automation.

Anupam's journey began as a developer at SAP, later moving into consulting and enterprise data solutions. Now, with Celebal Technologies, he is helping manufacturers combine ERP data, OT data, and AI-driven insights into scalable Lakehouse architectures that support automation, analytics, and business transformation.

Thanks for reading The IT/OT Insider! Subscribe for free to receive new posts and support our work.

ERP as the Brain of the Enterprise

One of the most interesting points in our conversation was the role of ERP (Enterprise Resource Planning) systems in manufacturing.

"ERP is the brain of the enterprise. You can replace individual body parts, but you can't transplant the brain. The same applies to ERP—it integrates finance, logistics, inventory, HR, and supply chain into a single system of record."

While ERP is critical, it doesn't cover everything. The biggest gap? Manufacturing execution and OT data.
* ERP handles business transactions → orders, invoices, inventory, financials.
* MES and OT systems handle operations → machine status, process execution, real-time sensor data.

Traditionally, these two have been separated, but modern manufacturers need both worlds to work together. That's where integrated data platforms come in.

Bridging Enterprise IT and Manufacturing OT

Celebal Technologies specializes in merging enterprise and industrial data, bringing IT and OT together in a structured, scalable way. Anupam explains: "When we talk about Celebal Tech, we say we sit at the right intersection of traditional enterprise IT and modern cloud innovation. We understand ERP, but we also know how to integrate it with IoT, AI, and automation."

Key focus areas include:
* Unifying ERP, MES, and OT data into a central Lakehouse architecture.
* Applying AI to optimize operations, logistics, and supply chain decisions.
* Enabling real-time data processing at the edge while leveraging cloud for scalability.

This requires a shift from traditional data warehouses to modern Lakehouse architectures—which brings us to the next big topic.

What is a Lakehouse and Why Does It Matter?

Most people are familiar with data lakes and data warehouses, but a Lakehouse combines the best of both.

Traditional Approaches:
* Data warehouses → Structured, governed, and optimized for business analytics, but not flexible for AI or IoT data.
* Data lakes → Can store raw data from many sources but often become data swamps—difficult to manage and analyze.

Lakehouse Benefits:
* Combines structured and unstructured data → Supports ERP transactions, sensor data, IoT streams, and documents in a single system.
* High performance analytics → Real-time queries, machine learning, and AI workloads.
* Governance and security → Ensures data quality, lineage, and access control.

"A Lakehouse lets you store IoT and ERP data in the same environment while enabling AI and automation on top of it. That's a game-changer for manufacturing."
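To make "ERP transactions and sensor data in the same environment" concrete, here is a minimal PySpark sketch of that pattern. It is an illustrative sketch only; the catalog, table, and column names are assumptions, not Celebal's or SAP's actual schemas.

```python
# Illustrative Lakehouse sketch: join ERP production orders with OT telemetry
# stored as Delta tables in the same environment. All names are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("erp-ot-lakehouse").getOrCreate()

erp_orders = spark.read.table("lakehouse.erp.production_orders")   # structured ERP data
telemetry  = spark.read.table("lakehouse.ot.machine_telemetry")    # time-series OT data

# Attach the process conditions recorded while each order was running,
# so quality and delay analysis can see business and machine context together.
order_context = (
    erp_orders.join(
        telemetry,
        (erp_orders.line_id == telemetry.line_id)
        & telemetry.ts.between(erp_orders.start_ts, erp_orders.end_ts),
    )
    .groupBy(erp_orders.order_id, erp_orders.material)
    .agg(F.avg(telemetry.temperature).alias("avg_temp"),
         F.max(telemetry.vibration).alias("peak_vibration"))
)

order_context.write.format("delta").mode("overwrite").saveAsTable("lakehouse.analytics.order_context")
```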
That's a game-changer for manufacturing."Celebal Tech is a top partner for Databricks and Microsoft in this space, helping companies migrate from legacy ERP systems to modern AI-powered data platforms.There's More to AI Than GenAIWith all the hype around Generative AI (GenAI), it's important to remember that AI in manufacturing goes far beyond chatbots and text generation."Many companies are getting caught up in the GenAI hype, but the real value in manufacturing AI comes from structured, industrial data models and automation."Celebal Tech is seeing two major AI trends:* AI for predictive maintenance and real-time analytics → Using sensor and operational data to predict failures, optimize production, and automate decisions.* AI-driven automation with agent-based models → AI is moving from just providing recommendations to executing complex tasks in ERP and MES environments.GenAI has a role to play, but:* Many companies are converting structured data into unstructured text just to apply GenAI—which doesn't always make sense.* Enterprises need explainability and trust before AI can take over critical operations."Think of AI in manufacturing like self-driving cars—we're not fully autonomous yet, but we're moving toward AI-assisted automation."The key to success? Good data governance, well-structured industrial data, and AI models that operators can trust.Final Thoughts: Scaling DataOps and AI in ManufacturingFor manufacturers looking to modernize their data strategy, Anupam offers three key takeaways:* Unify ERP and OT data → AI and analytics only work when data is structured and connected across systems.* Invest in a Lakehouse approach → It's the best way to combine structured business data with real-time industrial data.* AI needs governance→ Without trust, transparency, and explainability, AI won't be adopted at scale."You don't have to replace your ERP or MES, but you do need a data strategy that enables AI, automation, and better decision-making."If you want to learn more about Celebal Technologies and how they're bridging AI, ERP, and manufacturing data, visit www.celebaltech.com.Stay Tuned for More!Subscribe to our podcast and blog to stay updated on the latest trends in Industrial Data, AI, and IT/OT convergence.🚀 See you in the next episode!Youtube: https://www.youtube.com/@TheITOTInsider Apple Podcasts: Spotify Podcasts: Disclaimer: The views and opinions expressed in this interview are those of the interviewee and do not necessarily reflect the official policy or position of The IT/OT Insider. This content is provided for informational purposes only and should not be seen as an endorsement by The IT/OT Insider of any products, services, or strategies discussed. We encourage our readers and listeners to consider the information presented and make their own informed decisions. This is a public episode. If you would like to discuss this with other subscribers or get access to bonus episodes, visit itotinsider.substack.com
    --------  
    35:48
  • Industrial DataOps #9 with Databricks - David Rogers on scaling AI
Welcome to Episode 9 in our Special DataOps series. We’re getting closer to Hannover Messe, and thus also the end of this series. We still have some great episodes ahead of us, with AVEVA, HiveMQ and Celebal Technologies joining us in the days to come (and don’t worry, this is not the end of our podcasts, many other great stories are already recorded and will be aired in April!)

In this episode, we’re joined by David Rogers, Senior Solutions Architect at Databricks, to explore how AI, data governance, and cloud-scale analytics are reshaping manufacturing. David has spent years at the intersection of manufacturing, AI, and enterprise data strategy, working at companies like Boeing and SightMachine before joining Databricks. Now, he’s leading the charge in helping manufacturers unlock value from their data—not just by dumping it into the cloud, but by structuring, governing, and applying AI effectively.

Databricks is one of the biggest names in the data and AI space, known for lakehouse architecture, AI workloads, and large-scale data processing. But how does that apply to the shop floor, supply chain, and industrial operations? That’s exactly what we’re unpacking today.

Join Our Community Today! Subscribe for free to receive all new posts.

What is Databricks and How Does It Fit into Manufacturing?

Databricks is a cloud-native data platform that runs on AWS, Azure, and Google Cloud, providing an integrated set of tools for ETL, AI, and analytics. David breaks it down:

"We provide a platform for any data and AI workload—whether it’s real-time streaming, predictive maintenance, or large-scale AI models."

In the manufacturing context, this means:
* Bringing factory data into the cloud to enable AI-driven decision-making.
* Unifying different data types—SCADA, MES, ERP, and even video data—to create a complete operational view.
* Applying AI models to optimize production, reduce downtime, and improve quality.

"Manufacturers deal with physical assets, which means their data comes from machines, sensors, and real-world processes. The challenge is structuring and governing that data so it’s usable at scale."

Why Data Governance Matters More Than Ever

Governance is becoming a critical challenge in AI-driven manufacturing. David explains why:

"AI is only as good as the data feeding it. If you don’t have structured, high-quality data, your AI models won’t deliver real value."

Some key challenges manufacturers face:
* Data silos → OT data (SCADA, historians) and IT data (ERP, MES) often remain disconnected.
* Lack of lineage → Companies struggle to track how data is transformed, making AI deployments unreliable.
* Access control issues → Manufacturers work with multiple vendors, suppliers, and partners, making data security and sharing complex.

Databricks addresses this through Unity Catalog, an open-source data governance framework that helps manufacturers:
* Control access → Manage who can see what data across the organization.
* Track data lineage → Ensure transparency in how data is processed and used.
* Enforce compliance → Automate data retention policies and regional data sovereignty rules.

"Data governance isn’t just about security—it’s about making sure the right people have access to the right data at the right time."
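As a small, hedged illustration of the access-control side of this, the following sketch issues Unity Catalog SQL grants from a Databricks notebook (where a SparkSession is available as spark). The catalog, schema, and group names are assumptions for illustration; this is not the specific setup discussed in the episode.

```python
# Illustrative Unity Catalog grants: plant engineers read raw historian data,
# an external supplier only sees a curated summary table. Names are assumptions.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # already provided in a Databricks notebook

spark.sql("GRANT USE CATALOG ON CATALOG plant_munich TO `line_engineers_munich`")
spark.sql("GRANT USE SCHEMA, SELECT ON SCHEMA plant_munich.historian TO `line_engineers_munich`")

# External partners get access only to the curated, aggregated layer.
spark.sql("GRANT SELECT ON TABLE plant_munich.shared.supplier_quality_summary TO `supplier_acme`")
```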
A Real-World Use Case: AI-Driven Quality Control in Automotive

One of the best examples of how Databricks is applied in manufacturing is in the automotive industry, where manufacturers are using AI and multimodal data to improve the yield of battery packs for EVs.

The Challenge:
* Traditional quality control relies heavily on human inspection, which is time-consuming and inconsistent.
* Sensor data alone isn’t enough—video, images, and even operator notes play a role in defect detection.
* AI models need massive, well-governed datasets to detect patterns and predict failures.

The Solution:
* The company ingested data from SCADA, MES, and video inspection cameras into Databricks.
* Using machine learning, they automatically detected defects in real time.
* AI models were trained on historical quality failures, allowing the system to predict when a defect might occur.
* All of this was done at cloud scale, using governed data pipelines to ensure traceability.

"Manufacturers need AI that works across multiple data types—time-series, video, sensor logs, and operator notes. That’s the future of AI in manufacturing."

Scaling AI in Manufacturing: What Works?

A big challenge for manufacturers is moving beyond proof-of-concepts and actually scaling AI deployments. David highlights some key lessons from successful projects:
* Start with the right use case → AI should be solving a high-value problem, not just running as an experiment.
* Ensure data quality from the beginning → Poor data leads to poor AI models. Structure and govern your data first.
* Make AI models explainable → Black-box AI models won’t gain operator trust. Make sure users can understand how predictions are made.
* Balance cloud and edge → Some AI workloads belong in the cloud, while others need to run at the edge for real-time decision-making.

"It’s not about collecting ALL the data—it’s about collecting the RIGHT data and applying AI where it actually makes a difference."

Unified Namespace (UNS) and Industrial DataOps

David also touches on the role of Unified Namespace (UNS) in structuring manufacturing data.

"If you don’t have UNS, your data will be an unstructured mess. You need context around what product was running, on what line, in what factory."

In Databricks, governance and UNS go hand in hand:
* UNS provides real-time context at the factory level.
* Databricks ensures governance and scalability at the enterprise level.

"You can’t build scalable AI without structured, contextualized data. That’s why UNS and governance matter."
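The "context around what product was running, on what line, in what factory" point is easy to show with a tiny, hypothetical example: the same raw reading before and after UNS-style contextualization. The field names and hierarchy are assumptions, not a HiveMQ or Databricks schema.

```python
# Illustrative only: a raw historian reading vs. the same reading once the
# namespace (or an asset model) has attached its context. Names are assumptions.
raw = {"tag": "TT_104", "value": 71.8, "ts": "2025-03-02T09:14:00Z"}

context = {
    "enterprise": "acme",
    "site": "detroit",
    "area": "battery_assembly",
    "line": "line2",
    "product": "PACK-48V-A3",       # what was running at that moment
    "work_order": "WO-202503-0042",
}

contextualized = {**raw, **context}
print(contextualized)
# With context attached, "average TT_104 per product across all sites" becomes
# a simple group-by instead of a data-archaeology exercise.
```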
Final Thoughts: Where is Industrial AI Heading?
* More real-time AI at the edge → AI models will increasingly run on local devices, reducing cloud dependencies.
* Multimodal AI will become standard → Combining sensor data, images, and operator inputs will drive more accurate predictions.
* AI-powered data governance → Automating data lineage, compliance, and access control will be a major focus.
* AI copilots for manufacturing teams → Expect more AI-driven assistants that help operators troubleshoot issues in real time.

"AI isn’t just about automating decisions—it’s about giving human operators better insights and recommendations."

Final Thoughts

AI in manufacturing is moving beyond hype and into real-world deployments—but the key to success is structured data, proper governance, and scalable architectures. Databricks is tackling these challenges by bringing AI and data governance together in a platform designed to handle industrial-scale workloads.

If you’re interested in learning more, check out www.databricks.com.

Stay Tuned for More!

Subscribe to our podcast and blog to stay updated on the latest trends in Industrial Data, AI, and IT/OT convergence.

🚀 See you in the next episode!

Youtube: https://www.youtube.com/@TheITOTInsider Apple Podcasts: Spotify Podcasts:

Disclaimer: The views and opinions expressed in this interview are those of the interviewee and do not necessarily reflect the official policy or position of The IT/OT Insider. This content is provided for informational purposes only and should not be seen as an endorsement by The IT/OT Insider of any products, services, or strategies discussed. We encourage our readers and listeners to consider the information presented and make their own informed decisions.

This is a public episode. If you would like to discuss this with other subscribers or get access to bonus episodes, visit itotinsider.substack.com
    --------  
    35:10


About The IT/OT Insider Podcast - Pioneers & Pathfinders

How can we really digitalize our Industry? Join us as we navigate through the innovations and challenges shaping the future of manufacturing and critical infrastructure. From insightful interviews with industry leaders to deep dives into transformative technologies, this podcast is your guide to understanding the digital revolution at the heart of the physical world. We talk about IT/OT Convergence and focus on People & Culture, not on the Buzzwords. To support the transformation, we discover which Technologies (AI! Cloud! IIoT!) can enable this transition. itotinsider.substack.com
