Podcast: Bridging the gap between shop floor reality and AI strategy in manufacturing
Key Highlights
- Retirements erode “tribal knowledge,” cutting efficiency; embed capture in daily work orders to preserve context and decision-making insights.
- AI fails when built for the boardroom; design from the shop floor to reduce friction and integrate directly into technician workflows.
- Disconnected systems block AI value; unify data across maintenance, inventory, and operations to enable actionable insights.
- AI readiness starts with digitization: move from paper to structured data first, then layer AI to avoid wasted spend and low adoption.
In this episode of Great Question: A Manufacturing Podcast, Nick Haase of MaintainX explores the challenges manufacturers face as workforce retirements and labor shortages collide with the push for AI adoption. He discusses how disconnects between leadership and the shop floor can lead to ineffective technology and stalled implementation. The conversation highlights the importance of capturing tribal knowledge, improving system interoperability, and prioritizing frontline-friendly tools. He also outlines practical steps for building a digital foundation that enables successful AI use in industrial operations.
Below is an excerpt from the podcast:
PS: Manufacturing is at an inflection point. Retirements and a persistent skilled labor shortage are converging at the worst possible time, just as the industry is being told to adopt AI and accelerate automation. However, most of these AI strategies are designed from the carpeted side of the building by executives, consultants, and vendors who rarely spend time on the plant floor. The result can often be tools that look great in the boardroom but fail at the point of execution.
I'm Tom Wilk, the chief editor of Plant Services. And in this episode, Nick Haase, the co-founder of MaintainX, and I are going to analyze a few reasons why AI adoption in a plant can stall, from additional data entry burdens to siloed organizations and tools that just don't talk to each other. Nick, thanks for being with us today.
Nick Haase: Tom, thanks so much for having me. Excited to be here.
PS: I'm excited for our next conversation here. AI has been a persistent topic. Just when I thought the marketing hype was going to die down, here we are in an age when AI has overwhelmed not just commercial life, but industrial life too.
NH: Yeah, you can't escape it. It's growing and evolving faster than anything else we've seen.
PS: Maybe we can start looking at a workforce question and ease into AI. It is true that we're facing a generational workforce transition. When I joined Plant Services 12 years ago, retirements were just on the immediate horizon. And here we are, we've got a whole second generation moving in to help cover for all the old guard that have retired. Beyond the hiring challenges associated with this kind of transition, what's actually at risk when experienced technicians retire like that?
NH: Yeah, I think the obvious answer is headcount, and the real answer is the context. A 30-year experienced technician doesn't just know how to fix a pump. They know that pump. They know it runs hot on Thursdays because of the upstream batch scheduling. They know the vibration that means replace this bearing this weekend versus we have two more months. And that's not something they can share in any manual.
It’s what I like to think of as tribal knowledge depreciation. It's an asset on everyone's balance sheet that you can't see, and it's losing value every day. Companies track equipment depreciation religiously, but they generally aren't tracking, and have very limited visibility into, the institutional knowledge that's walking out the door. And so you have this compounding problem when that person leaves. The next tech doesn't just lack their skill, they lack the decision-making shortcuts that kept the plant running efficiently. So you don't just lose the one person's productivity, you lose the multiplier effect they had on everyone around them.
The companies getting ahead of this aren't running knowledge capture projects. They're embedding capture into everyday work, like when a technician closes a work order and notes what they did and why, so that it can become a searchable, trainable record. But that only works if the system is making that easy and not burdensome.
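The idea of turning closed work orders into a searchable, trainable record can be sketched as a minimal data model plus a naive keyword search. This is purely illustrative: the record fields and function names are assumptions, not MaintainX's schema or API.

```python
from dataclasses import dataclass, field

@dataclass
class ClosedWorkOrder:
    # Hypothetical record shape; field names are illustrative only
    asset_id: str
    symptom: str
    action_taken: str   # what the technician did
    rationale: str      # why: the context that usually walks out the door
    tags: list = field(default_factory=list)

def search(history, query):
    """Naive keyword search across captured work-order notes."""
    q = query.lower()
    return [wo for wo in history
            if q in wo.symptom.lower()
            or q in wo.action_taken.lower()
            or q in wo.rationale.lower()]

history = [
    ClosedWorkOrder("PUMP-07", "runs hot on Thursdays",
                    "no repair needed",
                    "heat tracks upstream batch scheduling, not a fault",
                    tags=["thermal"]),
    ClosedWorkOrder("PUMP-07", "bearing vibration rising",
                    "replaced drive-end bearing",
                    "signature indicated weeks, not months, of remaining life"),
]

for wo in search(history, "vibration"):
    print(wo.asset_id, "->", wo.rationale)
```

The point of the sketch is the capture step, not the search: if the "why" field is filled in as part of closing the work order, the knowledge survives the retirement; a real system would layer full-text or semantic search on top.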
PS: There really is a multiplier effect, isn't there, after a certain period of time when you have experienced technicians who, as you say, know the assets inside and out.
NH: And it's really hard to measure what that's worth and how to capture it. It's something that a lot of folks take for granted, and not in the sense that they don't realize the value of those technicians. It's more that they don't think about the sustainability of the operation, about what happens when this person leaves, and they think maybe they can distill some of that through osmosis or shadowing by the next generation. But the challenge is that now we're running into this skilled labor shortage, and it's becoming harder to do that.
PS: Let's talk about how companies are approaching AI potentially to help them address this kind of skills gap and address this kind of labor shortage. At the outset, I mentioned there was a carpeted side and the concrete side of the building, and I like the way that you framed that out. What does that framing imply for how companies do or don't approach AI?
NH: It's informal slang I've heard referenced by both sides, so it doesn't seem too derogatory to one versus the other. But it's a simple observation that most technology decisions in manufacturing are made by folks who work on carpet: the executives, the IT folks, consultants, and vendors. The people who have to live with those decisions work on the concrete, on the shop floor – the technicians, the operators, the reliability engineers.
The gap isn't malicious. It's just structural. The carpeted side optimizes for dashboards, reporting, compliance, and the concrete side needs something that works with gloves on in a loud environment in the middle of a 2 a.m. breakdown. What I see is when AI strategies are designed strictly from that carpeted side, you get tools that are impressive in a demo, but not really practical in application in the field.
The classic example is that a predictive maintenance alert fires, but there's no connection to any other system, so there's no work order attached, no parts availability that's checked, no routing to the technicians that are supposed to do something about it. And so that alert becomes noise, technicians ignore it, and then leadership wonders why adoption is low and they're not getting the value from investments they're making.
So I've seen that the unlock is really, how do you build from that front line up? Start with, what does the technician need to do their job better today and reduce that friction? Then you can layer intelligence on top of the workflows they're already using. That's the only sequence I've seen work across the board.
PS: When you've toured plants and talked to the people on the shop floor, are they reporting that they've found uses for AI to help cover these gaps?
NH: I see a lot of folks trying to do that today, and I didn't see that same enthusiasm or energy around it, or investment, a couple years ago. So there's certainly no lack of awareness that there may be something to help in this broad spectrum. But one of the challenges I see is that a lot of leadership teams are getting nervous about being behind, so they're delegating AI projects downstream without a lot of tactical implementation or thinking about what exactly needs to be done. And then for every question, every challenge they have, they may ask, is there an AI solution that solves this? They're not necessarily thinking holistically about the operation and understanding where the technology can add value immediately, versus places where it's only going to be effective with the right foundation underneath it.
PS: I've heard that concern too, the sort of concern that centers on interoperability: can you actually get the tools to work together? What role does interoperability play in whether manufacturers can actually benefit from these technologies they invest in?
NH: Yeah, I mean, it's the structural bottleneck that nobody really wants to talk about because it's not sexy. Most plants run on this patchwork of systems: an asset management tool from 2008; an ERP that doesn't talk to it and might be equally outdated; sensor platforms bolted on top; OT systems that may be networked, but only to each other and to nothing else in their ecosystem; and then spreadsheets filling every gap in between.
None of it's connected, none of it's talking to each other. So when someone says, we want to use AI for predictive maintenance, the first question isn't about models or algorithms. It's: can your systems even share data with each other? And in most cases, the answer is just no. The result is that teams spend an enormous amount of energy and resources just getting a clear picture of what's happening. Maintenance is pulling data from one system, operations from another, and they're reconciling in a meeting room with printouts or jammed-together spreadsheets. And it's not a technology problem in the traditional sense; it's more of an architectural problem.
So the companies making real progress are the ones consolidating around fewer, more connected systems. Not necessarily ripping and replacing everything overnight, but being more intentional about making their core workflow platform the connective tissue, understanding where their data is going to live, which systems benefit from data from other systems, and how they can add more contextual components to it. When your work execution data, your asset history, your parts inventory, and your sensor feeds are all live and flow through one place, the AI has something to work with. Without that, you're feeding AI fragmented, contradictory, incomplete, uncontextualized data and wondering why it doesn't help solve your problems, because it only sees part of the picture.
PS: Is this the point where folks on the concrete side would reach out to, say, IT over in the carpeted building to either find an integrator partner or ask about that? Or is this the kind of thing where the folks on the concrete side can reach out and find their own integrator partners to work with? What would you recommend?
NH: I think it's a bit of both. And another part of it, for the folks on the carpeted side, is: listen to your concrete side when they say it's not working. This isn't a universal truth by any stretch, but they don't necessarily have experience with, or an understanding of, the architecture of how some of the IT systems are set up. So they may not ask the right questions or say things the right way. But if you're hearing that something's not working, it's generally a good idea to start investigating how the data flows, how it should flow, and how people think it's flowing.
And to that point, what I've seen work, especially in larger corporations, is that some leaders, not at every plant but at a couple of plants, do have a better foundational understanding of this. They're going out and getting their own integration partners and working with folks to build that “lighthouse experience,” a phrase I'm hearing more and more often, which is to say they show the rest of the company the art of the possible and then get everyone excited to bring that to their own plants, now that the initial testing phase has been done.
PS: That makes sense. It's something you hear a lot at conferences, and I think some plant practitioners are uncomfortable with it, but you have to socialize the wins. You have to be out there, and one of the consultants we work with always says you've got to speak three different languages: the boardroom, the plant floor, and the lower decks. You need a certain verbal dexterity to make that lighthouse effect real.
Let me ask you a couple of questions about AI. At the MARCON conference this past year, we heard about some AI agents that were being deployed on mobile phones. The goal of these agents was to help field technicians provide better data when they were doing repairs, whether through spoken word, like Siri, or by entering text. But one of the things the designers made a point of noting was that these apps and agents were optional. They didn't want to overburden the technicians with needing to use them.
So my question for you is this: I think there's a difference between AI that adds work for technicians and AI that removes friction. In your experience, have you come across examples of AI that helps, and AI that maybe doesn't?
NH: Every day. And I think that's where a lot of folks start to hit their heads against the wall with AI, thinking maybe it doesn't work for us, versus it's not all it’s hyped up to be. AI that adds work is, for example: you get a predictive alert, but now someone has to manually log into a different system, create a work order, check parts, and assign a tech. The AI detected something, but it's creating a new task instead of solving one.
Whereas AI that removes friction, that's where you get into some of these agentic workflows: that same alert fires, but a draft work order is auto-generated with the right asset, the right procedure, and the right parts list, pre-checked against inventory and routed to the technician who's on shift, qualified, and has worked on that asset before. That technician just has to open their phone and see an alert for “here's what's wrong, here's what to do, and here's what you need.” That's removing friction.
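The alert-to-draft-work-order flow described above can be sketched in a few lines. This is a hypothetical illustration under assumed data shapes: the inventory, technician roster, and `draft_work_order` function are invented for the example, not a real CMMS API.

```python
# Illustrative data; names and shapes are assumptions for the sketch
inventory = {"bearing-6205": 4, "seal-kit-12": 0}

technicians = [
    {"name": "Ana",  "on_shift": True,  "qualified_assets": {"PUMP-07"}},
    {"name": "Ben",  "on_shift": False, "qualified_assets": {"PUMP-07"}},
    {"name": "Cara", "on_shift": True,  "qualified_assets": {"CONV-02"}},
]

def draft_work_order(alert):
    """Turn a predictive alert into a pre-filled draft work order."""
    # Pre-check every required part against inventory
    parts_ok = all(inventory.get(p, 0) > 0 for p in alert["parts"])
    # Route to the first technician who is on shift and qualified for the asset
    assignee = next((t["name"] for t in technicians
                     if t["on_shift"] and alert["asset"] in t["qualified_assets"]),
                    None)
    return {
        "asset": alert["asset"],
        "procedure": alert["procedure"],
        "parts": alert["parts"],
        "parts_in_stock": parts_ok,
        "assigned_to": assignee,   # routed to one person, not broadcast as noise
        "status": "draft",         # the technician reviews and approves
    }

alert = {"asset": "PUMP-07",
         "procedure": "Replace drive-end bearing",
         "parts": ["bearing-6205"]}
wo = draft_work_order(alert)
```

The design choice worth noting is the `"draft"` status: the system does the clerical assembly, but a person still confirms the work, which is what keeps the alert from becoming either noise or an unreviewed action.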
Another example: technicians spend a surprising amount of time just figuring out what to do next. Which work orders are a priority? Where's the asset? What's its history? AI that can synthesize that context and put it in front of the technician, so they don't have to go hunting for it, is the kind people adopt, because it makes their day easier and not harder. You're not adding more to their work.
The litmus test is really simple: after you deploy your AI project or test, does that frontline worker's day get simpler or more complicated? If the answer is more complicated, you've built AI for the boardroom and not the floor. That's where I've seen the biggest difference.
PS: Are the people who work on the concrete side comfortable voicing their reservations, I guess, when it comes to AI getting in the way? Who do they talk to?
NH: I mean, they'll almost certainly complain to their manager. Will they complain three or four levels up, to an abstract e-mail name they see from corporate, or to someone who shows up once every couple quarters? I think there's a little more hesitancy to be overly candid there. And that's part of the responsibility of those leaders: make yourself a little more approachable, or really try to put those frontline folks into the process of designing these solutions, so you get buy-in up front instead of having to fight it on the back end and get surprised later. But ultimately, if you're seeing friction with adoption, you're probably running into one of those situations where it's more complicated than it's supposed to be.
PS: Given that MaintainX focuses on the CMMS side of the business, the work order management and job plan management, do you sense that plant workers can feel a real benefit to AI when it comes to, say, scheduling workflows and rescheduling workflows on the fly and reassigning resources?
NH: I do, but a lot of times, in my experience, they don't even think of it as AI. They just think of it as the tool making their job easier and giving them the information they need, and if that's AI, they're excited about it. If you just call it a feature, they're fine with it. I don't think “AI” when I don't have to dig through spam in my e-mail inbox, because that's just my spam folder and the spam filter. That's technically AI, but it has a feature name. So if we're doing something that makes their job easier, it's not that they mind learning about AI or don't want to understand it, but what they're interested in is: is this helping me? If it's helping me, call it whatever you want. I like it.
PS: What does AI readiness actually look like in this sense for a plant that may still be running on paper or spreadsheets?
NH: I've got to be a little bit honest here, because there's a lot of hype, and if you're still running paper work orders and spreadsheet-based schedules or other things, you're probably not ready for AI today. And that's okay. But you have to be honest about where you are.
AI readiness isn't about buying an AI product. It's about having structured digital work execution data. It's about having networked machines and OT data flowing into a centralized historian, a place you can pull data from. So if your work orders are digital, your asset hierarchy is clean, your procedures are documented, and your technicians are actually using a system day-to-day, that's the foundation that enables you to grow from there. The good news is that this isn't a five-year journey anymore. Mobile-first platforms have collapsed the implementation and adoption timeline dramatically. We see plants go from paper to fully digital work execution in weeks, not months or years. And once you're capturing structured data in the flow of work, you're building the beginnings of the data set that AI needs to be useful.
That sequence really matters. Step 1 is digitize your core workflows and make them easier than paper. Step 2, build a clean data history on your assets. Step 3, start layering intelligence on top of that, whether that's automated scheduling, sensor-based predictive maintenance, anomaly detection, or AI-assisted troubleshooting. If you try to skip straight to Step 3, you're going to waste not only money, time, and resources, but also credibility.
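The three-step gating can be made concrete as a small readiness check. The thresholds and field names here are assumptions for illustration, not industry benchmarks.

```python
def readiness(plant):
    """Report which foundation steps a plant has completed."""
    done = []
    if plant["digital_work_orders"]:               # Step 1: off paper
        done.append("digitized")
    if plant["clean_asset_history_months"] >= 6:   # Step 2: assumed minimum history window
        done.append("history")
    return done

def can_layer_ai(plant):
    """Only attempt Step 3 (intelligence) once Steps 1 and 2 are in place."""
    return {"digitized", "history"} <= set(readiness(plant))

# A plant midway through Step 2 isn't ready to layer AI yet
plant = {"digital_work_orders": True, "clean_asset_history_months": 2}
ready = can_layer_ai(plant)
```

The value of writing the gate down, even this crudely, is that it forces the "are we actually at Step 3?" conversation before budget is committed.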
So I tell any manager or leader listening, don't let the AI conversation paralyze you. The single best thing that you can do today is get as much of your work off of paper and into a system that your team will actually use, and everything else follows from that.
PS: It reminds me of a debate I'm sure you've heard at these maintenance conferences, about how much predictive work should be part of your average workflow. The marketing for these tools has a certain sizzle to it, when you get thermal camera images and motion amplification. But I sometimes have to remind myself that PdM should really be no more than about 25% of the work done, because at some point you hit diminishing returns. You don't need those kinds of tools to get the necessary work done.
NH: And I see folks over-engineer the process, where they'll have some team do something really smart and actually pretty impressive, but then they'll put an Andon light on it because it can't connect to the rest of the systems they're using. And that, again, reduces the likelihood that you'll get the adoption you need for those tools to be measured and evaluated on their merit, rather than leaving you wondering whether there's an underlying change management issue.
PS: We'll get you out of here on this question then, Nick. We started with workforce questions, and we kind of dove into AI for a little while. I want to close on a general workforce question, which might involve AI in it. When you visit plant floors, when you and your teams have gone out there in the field, what patterns do you see in companies or organizations that are successfully enabling the next-gen workforce vs. those that aren't? The ones that might be AI natives, like my kids, who can't imagine checking the weather without Siri.
NH: The reality is all this stuff goes back to the truism of people, process, technology in that order. The number one differentiator isn't about budget or technology sophistication. It's whether leadership treats frontline digitization as a strategic priority or as a back-office project.
Successful orgs have maintenance & reliability and frontline professionals with a seat at the table. There's a clear owner for the workflow and for its ongoing support. Technicians are involved in selecting and configuring tools. The system is mobile-first, because that's how work happens. And critically, the data flowing out of it is not just being captured but being used to make decisions, so technicians can see the impact of the work they're capturing. The struggling orgs are the ones that buy a platform because a consultant or some leader recommended it and an executive saw a demo. Nobody asked the techs what they needed, so adoption is low, your data is sparse, your dashboards are empty, and then leadership loses confidence and the whole thing stalls. It's a death spiral.
The pattern I see is that the best plants measure adoption, not just deployment. They don't say “we rolled out this tool to 200 users.” They say “85% of work orders are being completed digitally with full procedure documentation.” And that's the difference between checking a box and actually changing how work gets done.
To your point earlier on, the generational angle here, that really matters as well. The younger technicians are coming in expecting modern mobile solutions. They expect to look something up on their phone and not dig through binders or filing cabinets. And if you give them paper-based processes, you're not just being inefficient, you're making yourself a less attractive employer in a market where you can't afford to lose candidates. So those are the angles I've seen, what makes the difference between the successful ones and the struggling ones so far.
PS: I'm sure our listeners are going to take that to heart when they try to figure out how not only to attract the new workers, but retain them as well.
NH: That's right. That's the great new battle.
About the Podcast
Great Question: A Manufacturing Podcast offers news and information for the people who make, store and move things and those who manage and maintain the facilities where that work gets done. Manufacturers from chemical producers to automakers to machine shops can listen for critical insights into the technologies, economic conditions and best practices that can influence how to best run facilities to reach operational excellence.
Listen to another episode and subscribe on your favorite podcast app
About the Author

Thomas Wilk
editor in chief
Thomas Wilk joined Plant Services as editor in chief in 2014. Previously, Wilk was content strategist / mobile media manager at Panduit. Prior to Panduit, Tom was lead editor for Battelle Memorial Institute's Environmental Restoration team, and taught business and technical writing at Ohio State University for eight years. Tom holds a BA from the University of Illinois and an MA from Ohio State University.


