Note: The Q&A will be recorded separately, to be released as a podcast and a blog post.
Most continuous improvement programs don't fail because the methodology is wrong or because the team doesn't know the tools.
They fail through accumulated small neglect. A communication gap that doesn't get addressed. A metric that becomes vanity rather than insight. An improvement that gets implemented but never locked in. A relationship that stays transactional when it needed to become personal. A piece of language that ties the worker to the work in a way that makes accountability feel like blame. A piece of work that becomes invisible to the rest of the organization.
Chris Burnham's framing for these patterns: they're like splinters. Individually, each one is small. Left unattended, they fester. Other people watch what happens to the first splinter and decide not to risk picking up the next one. The whole program becomes more brittle than it should be.
This webinar walks through six specific pitfalls Chris has encountered across his career in manufacturing, logistics, and healthcare, along with practical countermeasures for each. The session is unusually structured for a CI talk -- it's essentially a diagnostic framework, six categories of failure with specific signals and prescriptions, organized for working CI leaders who want to know what to look for in their own programs.
Chris Burnham served as Continuous Improvement Program Manager at Wright Medical in Memphis at the time of this recording. His background includes work at Coca-Cola, XPO Logistics, and Celestica, and his degree is in criminal justice from Western Carolina University -- a credential he uses every day, he says, in the work of getting information from the source rather than through filters, reading situations from his own observation, and helping people from different backgrounds find common ground.
The session is hosted by Mark Graban, then VP of Improvement and Innovation Services at KaiNexus and the author of Lean Hospitals, Healthcare Kaizen, and The Mistakes That Make Us.
Before getting to the pitfalls, Chris names something most practitioners already do but rarely articulate: CI leaders move between four distinct roles throughout the day, and operating well requires knowing which role the moment calls for.
Leader -- the role where others are looking to you for direction, guidance, or advice on what to do next.
Learner -- the role where you're building skills, gaining knowledge, or trying to grasp a situation deeply enough to understand the challenge before acting on it.
Advocate -- the role of speaking for the process and for the people, even when it's uncomfortable. Chris defines advocacy as seeing the truth and speaking the truth, at every opportunity.
Coach -- the role of asking questions that change how people think, rather than telling them what to do. The best coaches Chris has encountered ask questions that provoke thought rather than questions that lead to predetermined answers.
The point isn't that these roles are separate jobs. It's that effective CI work requires fluency in switching between them based on what the moment requires. A leader who only operates as a coach won't make decisions when decisions need to be made. A leader who only operates as a leader will never develop the people around them. The work is in the switching.
The first pattern Chris names is the most common and the most underestimated. Karen Martin's work in The Outstanding Organization names it directly: lack of clarity breeds chaos and confusion. The lack of clarity usually doesn't show up as something completely wrong. It shows up as a small detail that's missing, an instruction that's slightly ambiguous, a handoff that leaves a cloud of unanswered questions about what's supposed to happen next.
Chris's diagnostic test is the sentence "somebody needs to take care of this problem." The word "somebody" is the warning sign: somebody doesn't work here. If you see the problem, you own it until you hand it to a specific, named person who can take it from you.
The countermeasures Chris recommends:
Practice asking open-ended questions. Yes/no questions don't surface thought; they surface choices. The skill is in asking questions that produce understanding rather than questions that produce compliance. He recommends finding a partner to practice with and getting honest feedback on whether the questions are actually open or just feel open.
Document tasks using a clearly defined system, whatever system fits your work. Wikis, OneNote, email summaries -- the specific tool matters less than the discipline of writing things down where they can be referenced again.
Write things down by hand when you can. Chris carries a paper-and-pen planner despite having access to every digital tool. The reason isn't nostalgia. It's that the act of writing slows the brain down enough to engage with what's being said, and the act of stopping to write something down in front of someone signals to them that what they're saying matters.
Repeat back for confirmation. "Just so I understand what you're saying, you indicated that..." Then wait for the yes. This is basic active listening, taught everywhere, skipped routinely. Chris's framing: it crystallizes the idea in your own mind and demonstrates to the other person that you were actually listening.
Assume there is no agreement until agreement is confirmed. Specifically and verbally. Walking out of a meeting where everyone nodded is not the same as walking out of a meeting where agreement was named, said out loud, and acknowledged.
Don't get information filtered through someone else. Get it from the source whenever possible. Chris attributes this discipline to his criminal justice training. The pace of modern work pushes everyone toward summarized, distilled information -- but the work of CI requires accurate primary information, not the version that's survived two rounds of telephone.
The second pattern is about what gets measured and how it gets displayed. Without clear targets and meaningful charts, teams are running without a map. Chris acknowledges that this is territory where Mark, the session's host, has more to say -- Mark's Measures of Success book is the deeper treatment of the topic. But the pitfall itself is worth naming.
Vanity metrics are the metrics that look good but don't drive decisions. Number of people trained. Number of near-misses without assignable cause. Activity counts that have no clear target. The diagnostic test: if the metric moves up or down, what action follows? If the answer is nothing, the metric isn't doing its job.
The countermeasures:
Use the right chart for the measurement. Chris references Dan Roam's The Back of the Napkin for the underlying principle: different kinds of information call for different kinds of visualization. A trend over time doesn't belong in a bar chart -- it belongs in a process behavior chart (also called an SPC chart or, as Mark teaches in Measures of Success, a control chart with statistically calculated limits).
Process behavior charts produce a specific kind of thinking that other charts can't. When a data point sits outside the control limits, the chart prompts the question: what happened here? When four consecutive points trend toward a limit, the chart prompts: what's changing? The chart turns the metric from a number to be reported into a question to be investigated.
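For readers who want to see the arithmetic behind those limits, here is a minimal sketch of the standard individuals (XmR) calculation: the limits sit at the mean plus or minus 2.66 times the average moving range, which approximates three-standard-deviation limits. This is the conventional formula, not something specific to Chris's talk, and the sample data and function names are illustrative.

```python
# Minimal XmR (individuals) process behavior chart calculation.
# Assumption: limits follow the conventional formula
#   center ± 2.66 × average moving range
# which approximates three-standard-deviation limits.

def xmr_limits(data):
    """Return (center, lower, upper) process limits for an XmR chart."""
    center = sum(data) / len(data)
    # Moving range: absolute difference between consecutive points.
    moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return center, center - 2.66 * mr_bar, center + 2.66 * mr_bar

def signals(data):
    """Flag indices of points outside the limits (one common signal rule)."""
    _, lower, upper = xmr_limits(data)
    return [i for i, x in enumerate(data) if x < lower or x > upper]

# Illustrative data: a stable weekly metric with one unusual value.
weekly_defects = [12, 14, 11, 13, 12, 15, 13, 12, 30, 13]
print(signals(weekly_defects))  # index 8 (the spike to 30) sits above the upper limit
```

A point the chart flags is an invitation to ask "what happened here?"; points inside the limits are the process being itself, and chasing them as if they were signals is the costly mistake the chart exists to prevent.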
Make the targets visible. Post the metrics where people can see them. The visibility does two distinct kinds of work. It tells the people doing the work what good looks like. It also reduces the volume of status-update interruptions -- when someone asks how things are going, you can point to the chart.
Avoid getting trapped in the appearance of metrics rather than their substance. Chris's diagnostic question: can a decision actually be made based on what this number shows? If not, replace it with something that can.
The third pattern is where improvement programs erode through the natural drift of processes back toward their previous states. The technical term Chris uses is entropy -- the tendency of any process to degrade into chaos unless actively maintained.
The pattern: a team makes a real improvement. The improvement works. But the new standard never gets documented, the people doing the work don't get retrained, and within weeks or months, performance has drifted back to where it was before.
The countermeasures:
Document the new standard. Chris references Training Within Industry (TWI) as the framework that handles this well -- Job Instruction breaks the work down into essential elements, and Job Methods challenges each element with questions about who, when, and where. The discipline of working through these questions either confirms the elements are right or unlocks an opportunity to put them somewhere better.
Confirm that the teacher has taught and the learner has learned. The test Chris uses (drawn from medical education, where his father was a vascular surgery professor) is see one, do one, teach one: learning is confirmed only when the learner can teach the work back to someone else. Until they can, the learning hasn't happened -- only the explaining has.
Connect the new standard to what matters to the individual and the team. People are naturally self-interested -- Chris is direct about this. The new way has to make sense for them specifically, not just for the abstract organization. This isn't manipulation. It's recognizing that sustained behavior change requires the person to understand why the change matters to them.
Don't become the process police. If people aren't following the new process, there's a training gap or a process problem -- not a discipline problem. The default response should be to investigate what's actually going wrong, not to enforce compliance with a standard that isn't working.
The fourth pattern is about the quality of relationships between CI leaders and the people whose work they're trying to improve. Chris is direct: you cannot challenge people and produce growth if they don't believe you have their best interests at heart.
The frame Chris uses comes from Kim Scott's Radical Candor. Scott's argument is that effective leadership requires both high personal care and high direct challenge. If you have high challenge without personal care, it lands as obnoxious aggression. If you have high care without challenge, it produces what Scott calls ruinous empathy -- relationships that feel good but don't develop anyone. The work is in doing both simultaneously.
The countermeasures Chris offers:
Build personal connection through time and shared rapport. You can't start with "what's going on with you today?" cold. The connection has to be built through small consistent moves over time.
Share first. If you want people to talk about what matters to them, talk about what matters to you first. Pictures of family on your desk, conversations about your goals, openness about the challenges you're facing -- these create the conditions where reciprocal openness becomes possible.
Chris references John Doerr's Measure What Matters for the idea of posting your own objectives and key results visibly. The reason isn't that people need to track you. It's that visible objectives let others see what you're working on, what's hard for you, and how their work could connect to yours.
The closing principle: you're not a stone pillar. You're part of a community, and communities are built on interdependence. The willingness to be visible and connected is what makes everything else possible.
The fifth pattern is about language. When CI leaders challenge results or outcomes, the natural default of English ties the individual to the result. "Your numbers are down." "Why didn't you hit the target?" Even when the intent is to discuss the process, the language often makes it about the person.
The countermeasure Chris recommends is being deliberate about word choice and physical positioning.
The maxim "easy on the people, hard on the process" is well-known in Lean circles and easy to violate. The practical move Chris describes: when discussing performance, get side by side with the person, looking at the chart or process map together. The body language matters. Two people standing next to each other looking at the same chart are working on the problem together. Two people on opposite sides of a table are confronting each other.
Use language of empowerment. Recognize people. Give kudos in public when the opportunity exists. The connection back to Pitfall 4: a high personal connection makes direct challenge possible. A low personal connection makes the same challenge feel like an attack.
This requires practice. Chris recommends rehearsing presentations and having a trusted colleague act as a second coach, observing you and watching for moments where the language slips from process to person.
The sixth pattern is about whether improvement work is visible to the people who could benefit from seeing it. If the work is invisible, the rest of the organization can't contribute, can't challenge, can't replicate, and can't draw inspiration from it.
The countermeasures:
Make problems visible through storyboards, huddles, and posted metrics. Visibility signals that the work is important and invites feedback.
Celebrate success when it happens. TWI Job Relations is direct about this -- give praise at the right time, intentionally and visibly. Chris's framing: be an ambassador of joy. Find opportunities to recognize people's contributions in every interaction. The behavior is contagious. When leaders do it, others do it.
Keep the main thing the main thing. Visible results help the organization stay focused on what matters rather than getting pulled into small subsets of problems that don't move the strategic needle.
For KaiNexus users specifically, Chris notes that he's found leadership dashboards with scheduled email subscriptions to drive engagement effectively. When senior leaders receive a weekly visual summary of what's happening, the attention follows the visibility.
The closing thought Chris offers ties the six pitfalls back to a single underlying principle. The continuous improvement program isn't a collection of projects, tasks, or metrics. It's a collection of people with different views, different experiences, different capabilities -- and the culture that emerges from how those people work together.
The people are the roots. They grow the program, sustain it, protect it. The countermeasure to all six pitfalls is the same: focus on engagement with the people. Operate in the right role at the right moment. Build the relationships that make direct challenge possible. Use the language that builds trust rather than damages it. Make the work visible. Lock in the gains.
The aphorism Chris attributes to a consulting group he encountered:
If you focus on the people, the numbers will come. If you focus on the numbers, the people will go.
This is the take-home. The numbers matter -- Chris is not suggesting they don't. But the numbers follow from the relationships, the culture, the engagement. Programs that lead with numbers and treat people as the means produce short-term results and long-term decay. Programs that lead with people and treat numbers as the natural consequence produce results that sustain.
Several elements of Chris's framework map directly to how the KaiNexus platform supports CI programs at scale.
The communication discipline benefits from documentation infrastructure that lets meeting notes, action items, and confirmations live in a searchable shared space rather than scattered across email and personal notes. The platform's meeting documentation, file attachments, and collaboration features support the "write it down, repeat back, confirm agreement" disciplines Chris describes.
The process behavior chart functionality is one of the more underused pieces of the platform. Most organizations track metrics in spreadsheets that produce simple line graphs or bar charts. Built-in control charts with statistically calculated limits let teams see special-cause variation versus common-cause variation -- which is what produces the kind of investigative conversations Chris describes.
The locking-in-improvements work is supported through the platform's project documentation, knowledge repository, and the searchable history of completed improvements that lets organizations capture and reference what worked. The TWI-style breakdown of standard work can live in the system rather than in binders nobody opens.
The visibility work -- making problems, progress, and results visible across the organization -- is supported by dashboards, scheduled email reports, and the cross-organizational view of improvement work that Chris mentions specifically in the session. Leaders who receive a weekly visual summary stay engaged in ways they wouldn't through ad-hoc reporting.
The personal connection and language work isn't directly supported by infrastructure -- that's leader behavior, not software. But the infrastructure can amplify the leader's behavior by making the work visible enough that recognition becomes easier and more frequent.
None of this substitutes for the human work Chris describes. The platform is a tool that makes the disciplines easier to sustain at scale. The disciplines themselves are what produce sustained improvement programs.
Chris Burnham served as Continuous Improvement Program Manager at Wright Medical in Memphis at the time of this recording. His career includes work at Coca-Cola, XPO Logistics, and Celestica before joining Wright Medical. He holds a BS in Criminal Justice from Western Carolina University -- a credential he applies every day in the work of gathering accurate information from primary sources, reading situations from his own observation, and finding common ground with people from different backgrounds. He created and hosted The Lean Leadership Podcast, which remains available in podcast directories.
Why do continuous improvement programs typically fade over time?
Not because the methodology is wrong or because the team doesn't know the tools. Programs fade through accumulated small neglect -- communication gaps that don't get addressed, vanity metrics that replace meaningful ones, improvements that don't get locked in through training, relationships that stay transactional, language that ties the worker to the work, and improvement work that becomes invisible to the rest of the organization. Chris's framing: these patterns are like splinters. Individually small. Cumulatively they degrade the program.
What is the difference between a process behavior chart and a regular line chart?
A process behavior chart includes statistically calculated control limits (set at roughly three standard deviations from the mean, conventionally computed from the average moving range) that distinguish normal variation from special-cause variation. A regular line chart just shows the data over time. The control limits matter because they let the team know which variation is worth investigating and which is just the process being itself. Reacting to noise as if it's a signal is one of the most costly mistakes in performance management. Mark Graban's book Measures of Success is the deeper treatment of why this distinction matters.
What does "easy on the people, hard on the process" mean in practice?
It means that when something goes wrong, the diagnostic question is what about the process allowed this to happen, not who screwed up. The language and body language both matter. Standing side by side with the person, looking at the chart or process map together, is the physical version of process focus. Sitting across the table directly facing the person is the physical version of person focus. The first produces collaborative problem-solving. The second produces defensiveness, regardless of intent.
What is "see one, do one, teach one" and where does it come from?
A pedagogical principle from medical education -- specifically, surgical training. The learner sees the procedure performed, then does it themselves under supervision, then teaches it to someone else. The test of whether learning has happened is whether the learner can teach the work back. Until they can, only the explaining has happened. Chris applies the same principle to standard work training in CI -- if the operator can't teach the new standard to someone else, the training hasn't fully landed.
Why does Chris recommend not becoming the "process police"?
Because if people aren't following a new process, the problem is almost always upstream of compliance. Either the training didn't work, the process doesn't actually fit the work, or there's a system constraint that makes following the process difficult. Enforcing compliance with a broken process produces resentment and workarounds. Investigating why the process isn't being followed produces information about what to fix. The first builds an adversarial relationship between CI and the workforce. The second builds the partnership that sustains the program.
What does "if you focus on the people, the numbers will come" actually mean in operational terms?
It means that engagement is the upstream variable and metrics are the downstream consequence. Programs that lead with metrics -- setting targets, demanding compliance, holding people accountable to numbers -- often produce short-term improvement and long-term decay. The decay happens because the people doing the work disengage from a system that treats them as means to ends. Programs that lead with engagement -- building relationships, making the work visible, recognizing contributions, focusing on culture -- produce metrics that improve and sustain because the people are invested in the improvements. The principle isn't an excuse to ignore numbers. It's a recognition that the order matters.