Lesson 6: Implementing Sustainable Improvement Work

“Lesson 6 Lectures”

Summaries

  • Lesson 6 - Implementing Sustainable Improvement Work > Lesson 6 Lectures > Moving to Reliability
  • Lesson 6 - Implementing Sustainable Improvement Work > Lesson 6 Lectures > Designing a Reliable System
  • Lesson 6 - Implementing Sustainable Improvement Work > Lesson 6 Lectures > Cautions about Reliability
  • Lesson 6 - Implementing Sustainable Improvement Work > Lesson 6 Lectures > Two Stories of Reliability
  • Lesson 6 - Implementing Sustainable Improvement Work > Lesson 6 Lectures > Faculty Footnotes

Lesson 6 – Implementing Sustainable Improvement Work > Lesson 6 Lectures > Moving to Reliability

  • Now, by this point in the course- and it’s hard to believe we’re already in Lesson 6- I’m sure that you appreciate that many aspects of the health care system don’t perform nearly reliably enough.
  • I could pick virtually any aspect of health care delivery and tell you a sorry tale.
  • Let’s talk about something really simple, like hand hygiene.
  • How often have we literally wrung our hands over data showing that health care providers practice hand hygiene only 40% or 50% of the time? Now, some of you may be arguing that hand hygiene performance is improving.
  • Now think about the entire set of processes that should be performed correctly and reliably when a patient with, say, influenza or an antibiotic resistant infection is on isolation precautions.
  • That’s gowning, gloving, removing the gown and gloves properly, and then performing hand hygiene, because it’s so easy to contaminate your hands when you take off your gloves.
  • Now think about this from the point of view of the patient and a family.
  • To them, failure to adhere to these basic patient safety practices represents what I’ll call defects in care.
  • Now, obviously, for such a screening strategy to be effective, staff would have to be highly reliable in performing all of the components of isolation precautions when they entered and left the room.
  • Typical rates of adherence were 82% for putting on gloves when entering the room, 77% for putting on a gown, and only 62% for performing hand hygiene after removing the gloves and gown.
  • I published a short piece in the New England Journal that contrasted what happens in a neonatal intensive care unit that cares for babies with birth weights as low as 750 grams- let me just emphasize, this is 750 grams- really, really small, fragile babies- contrasting that with what happens in a computer chip manufacturing facility.
  • At that time, we were consistently finding hand hygiene defect rates of 40% or greater, and nowhere were individuals who frequently failed to practice hand hygiene being held accountable.
  • I had to ask myself whether the stakes weren’t even higher in hospitals caring for tiny, vulnerable babies.
  • Yet there were no consequences for failure to perform hand hygiene in these neonatal intensive care units.
  • Reliability in health care systems isn’t anywhere near where it needs to be.
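The adherence figures above (82% for gloves, 77% for gowns, 62% for hand hygiene) make the point quantitatively: when every step of a process must be performed correctly, overall reliability is roughly the product of the step reliabilities. A minimal sketch, assuming the steps fail independently (real processes may not behave this way):

```python
# Step adherence rates for isolation precautions, from the lecture.
step_adherence = {
    "gloves on entering": 0.82,
    "gown on entering": 0.77,
    "hand hygiene after removing gown and gloves": 0.62,
}

def serial_reliability(rates):
    """Overall reliability of a process whose steps must ALL succeed,
    assuming the steps fail independently."""
    overall = 1.0
    for r in rates:
        overall *= r
    return overall

overall = serial_reliability(step_adherence.values())
print(f"Overall adherence: {overall:.0%}")  # prints "Overall adherence: 39%"
```

Even with each step performed correctly most of the time, the full sequence is done right for only about 39% of room entries, which is why a patient or family member would see "defects in care" far more often than any single adherence rate suggests.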

Lesson 6 – Implementing Sustainable Improvement Work > Lesson 6 Lectures > Designing a Reliable System

  • Instead, we say that it’s the system that let them down.
  • So let’s look at what a hospital might do to devise a system that would make it easy for people to perform these tasks well every time, a much more reliable system.
  • Not until such a system is in place can we start referring to errors by caregivers as violations and holding them accountable for their actions.
  • That way staff can practice the procedures and have their knowledge and competence verified.
  • No amount of training, vigilance, hard work, or personal heroism can make up for a poorly designed and supported system.
  • In other words, there should be a strong safety culture enabled by careful, reliable system design and function.
  • High-reliability industries such as aviation and nuclear power understand that a culture in which people are unwilling or afraid to report vulnerabilities in critical systems can lead to catastrophic failures, such as plane crashes and radiation releases.
  • Of course, even the best systems need to be monitored, what we refer to as quality control, or QC, of key components of the system.
  • Those of you who are familiar with patient safety methods and tools will be thinking that this is related to a Failure Mode and Effects Analysis, or FMEA, which helps identify the most vulnerable steps in a system where errors could have serious consequences and need to be monitored closely.
  • How likely is it that an error or defect will occur? How likely is it that, if an error occurs, it will be caught or mitigated? And how serious will the consequences be if the error is not caught or mitigated? Such analyses not only help design quality control procedures, they also help identify steps in the system that are so important that even one mistake would be catastrophic.
  • Perfection in equipment and humans is not achievable, so systems sometimes need what we call redundancy, or buffers, in case failures occur.
  • You may be encouraged to know that there are systems in health care that perform at very high levels of reliability, such as blood banking.
  • Transfusion reactions are very, very rare now because blood-banking systems have been refined over time, are monitored thoroughly and routinely through strict quality control checks, and follow standard practices outlined by regulators, such as the Food and Drug Administration in the United States.
  • Anesthetic delivery is another high-reliability system, the product of intensive systems improvement over many years enabled by equipment that makes it easy for those administering anesthesia to do the right thing.
  • It’s almost impossible to do the wrong thing because of the design of the system, what we call “forcing functions” in the safety literature.
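The three FMEA questions above (how likely is a failure, how likely is it to be caught, how serious are the consequences) are often turned into a numeric ranking. A common convention scores occurrence, detection, and severity each from 1 to 10 and multiplies them into a Risk Priority Number (RPN); the highest-RPN steps get the closest quality control. A minimal sketch with invented failure modes and ratings, not taken from the lecture:

```python
# Hypothetical failure modes for a surgical-antibiotic process.
# Each factor is rated 1-10:
#   occurrence: 10 = failure very likely to occur
#   detection:  10 = failure very unlikely to be caught or mitigated
#   severity:   10 = consequences catastrophic if not caught
failure_modes = [
    # (step, occurrence, detection, severity)
    ("antibiotic not prescribed", 7, 6, 8),
    ("antibiotic unavailable in pharmacy", 5, 4, 8),
    ("dose given after incision", 6, 7, 6),
]

def risk_priority(occurrence, detection, severity):
    # Risk Priority Number: higher RPN = monitor this step more closely.
    return occurrence * detection * severity

ranked = sorted(failure_modes, key=lambda m: risk_priority(*m[1:]), reverse=True)
for step, o, d, s in ranked:
    print(f"{step}: RPN = {risk_priority(o, d, s)}")
```

Steps whose severity rating is at the top of the scale deserve attention regardless of RPN; as the lecture notes, some mistakes are so catastrophic that even one occurrence is unacceptable.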

Lesson 6 – Implementing Sustainable Improvement Work > Lesson 6 Lectures > Cautions about Reliability

  • DON GOLDMANN: In the first three lectures of this lesson, I introduced reliability and showed how this concept is important in virtually every system, health care or otherwise.
  • Now I have to pause and sound cautionary notes about reliable systems.
  • First, blind autocratic efforts to create efficient, highly reliable systems are lethal to worker satisfaction and safety culture.
  • By the late 1930s, Taylorism was out of vogue because it was dehumanizing and had led to worker protests and strikes.
  • On the other hand, many of its principles, suitably modified to respect the workforce, can be found in modern industrial engineering and Lean, which strive to design reliable, efficient systems that incorporate standard work, reduce waste, and improve productivity.
  • The key is that they have the important added dimension of building a culture that values employees and seeks their continuous input into improving the systems in which they work.
  • I sometimes have observed managers and health care personnel so dedicated to reducing unwarranted variability in performance and improving reliability of their care systems that they just don’t see the forest for the trees.
  • They keep banging away at standard work and fine tuning elements of the system in an effort to boost reliability, and they miss the fact that the system they have designed is not achieving the results they want and need.
  • Now, I know that high-performing organizations are on the constant lookout for systems that need a complete overhaul, not just some tinkering.
  • Then it occurred to someone that this was a wasteful system, and it was decided to put all of these items together in one kit so they would be always ready when needed at the bedside.
  • It’s a bit like an impressionist painter stepping back from his detailed, perfect brushwork and having an aha moment that a picture of water lilies could be suffused with light and life by applying paint more freely- or the pointillist, or to say it in French, pointilliste, discovering that dots of color would come together harmoniously when seen from a few feet away.
  • It’s not bad advice, unless the standard work the general wants and gets from his army is the wrong standard work.
  • As you may know, countless cavalrymen and about 6 million horses died in World War I following the standard operating procedure of a cavalry charge against barbed wire and machine guns.
  • We’ve been talking mainly about systems that are designed to keep people safe.
  • You may recall from the course introduction that I emphasized how frequently our clinical systems let patients down.
  • Patients with diabetes too often do not have good glucose control, do not have their lipids and blood pressure checked and controlled, do not have their eyes examined, and do not have checks for foot ulcers and neurological deficits.
  • The patient’s preferences and the patient’s circumstances have to be factored in.

Lesson 6 – Implementing Sustainable Improvement Work > Lesson 6 Lectures > Two Stories of Reliability

  • Basically, we had enough money to go to Bogota, Colombia, and help two hospitals that deliver most of the poor children of Bogota reduce their infection rate following cesarean delivery.
  • They did not have a lot of time and luxury to do complicated quality improvement.
  • So we had to keep it simple, and we did our training, as we say, just in time- just the tools they needed, in real time, so that they could do this improvement project to try and reduce the rate of infection following cesarean delivery.
  • I’m not going to show you all the tools and things we did.
  • I’m just going to show you the system diagram that we drew with them, to show what the processes of care were like for a very simple part of care- getting antibiotics delivered to the mother in a timely fashion, to reduce the risk of infection.
  • There are many, many studies that show that timely delivery of antibiotics can reduce the risk of infection by at least two-fold.
  • The antibiotics were being given at the right time in 70% of cases in one hospital, and in 30% in the other.
  • So we wanted to see, what’s the system they have that is producing this very low level of reliability? And here’s the diagram.
  • Then the first decision, that diamond, is do I give antibiotic prophylaxis or not? And a lot of the time, as I’ve just told you, the answer was no- 30% in one hospital and 70% in the other, because they didn’t understand the evidence.
  • If they said yes, we want to give antibiotics, then the prescription went to the pharmacy.
  • Well this is a poor country and it turned out that every physician had their own favorite antibiotic.
  • Some were expensive drugs that had been donated by a drug company, to whet the appetite for these antibiotics.
  • So the reason there’s a decision node there: if the pharmacy didn’t have the antibiotic, then the prescription was given to the family, to go outside the hospital to a courtyard area, where vans were selling antibiotics.
  • So the family had to buy the antibiotic and somehow rush to the labor and delivery area, in the vain hope, often, that it would be given to the team, to give to the mother in a timely fashion.
  • There is a standard order set for the prescription for pre-operative care of this patient, including antibiotics.
  • The nurse then takes the antibiotic and puts it in the kit, the package, that’s going to the delivery room, the operating suite, with the mother.
  • Then the procedure starts, and the antibiotics are delivered, and that’s the end of it.
  • All you need to know is, those circles are the administration of antibiotics, yes or no.
  • You can see that it goes up to about 100% over time.
  • The squares are whether the antibiotics were being given at the right time.
  • It’s a little harder process, but still went up to almost 100%. And those diamonds are the rate of infection following cesarean delivery.
  • So that’s just an example to kind of whet your appetite for what can be done under adverse circumstances, simply by thinking about how to make a pathway that is reliable and standard.
  • I talked to the QI people in the various departments, the A&E emergency department, the wards, the intensive care unit, the clinics, and I got tons of material.
  • I got algorithms, and severity of illness scales, and computer screen shots- pages, and pages, and pages, more than I could digest in, literally, several hours of study.
  • No one had a simple visual- a high-level systems diagram that would clearly illustrate the pathway a child with asthma would ideally traverse from the time they were brought into emergency until the time they were sent home.
  • One of the greatest musicals of all time is by Stephen Sondheim, called Into the Woods.
  • It’s a story about ordinary people’s hopes and dreams, and the dangers they encounter when they journey into the woods to accomplish their individual goals.
  • Into the woods to get my wish, I don’t care how, the time is now.
  • There were dangers in the woods, and you can read this ambivalence in the lyrics that I just recited.
  • It’s your job, when thinking about a care pathway for a patient with, let’s say, asthma, to design it so that it won’t lead your patients into unexpected troubles- witches, wolves, giants, that were in the woods in the musical- or into care that is chaotic and potentially harmful.
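The improvement chart described earlier (circles for whether antibiotics were given, squares for whether they were given on time, diamonds for the infection rate) is a run chart. A common rule for judging whether a run chart shows real improvement rather than noise is a "shift": six or more consecutive points on the same side of the baseline median. A minimal sketch with invented data, not the actual Bogota measurements:

```python
# Monthly % of cesarean deliveries where the antibiotic was given on time.
# Invented numbers that mimic the pattern described in the lecture:
# a low, noisy baseline followed by sustained improvement toward 100%.
on_time_pct = [30, 35, 28, 40, 33, 38, 62, 71, 80, 85, 92, 97, 99]

def median(values):
    s = sorted(values)
    n = len(s)
    mid = n // 2
    return s[mid] if n % 2 else (s[mid - 1] + s[mid]) / 2

def has_shift(points, baseline_median, run_length=6):
    """True if `run_length` consecutive points fall on one side of the
    median (points exactly on the median are skipped, per the usual
    run-chart convention)."""
    run, side = 0, 0
    for p in points:
        if p == baseline_median:
            continue
        current = 1 if p > baseline_median else -1
        run = run + 1 if current == side else 1
        side = current
        if run >= run_length:
            return True
    return False

m = median(on_time_pct[:6])  # baseline median from the first six months
print("Baseline median:", m)
print("Improvement shift detected:", has_shift(on_time_pct[6:], m))
```

A rule like this is what lets a team say the system change, not random month-to-month variation, moved the measure.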

Lesson 6 – Implementing Sustainable Improvement Work > Lesson 6 Lectures > Faculty Footnotes

  • DAVE WILLIAMS: So, it’s hard to believe we’re coming to the end of our sixth lesson, and we wanted to take a little bit of time to wrap up and talk a little bit about some of the ideas that are involved in the lesson.
  • So one concept to appreciate is, as we’re thinking about standardization of care and trying to build these good systems, is that, you know, with every system there are going to be people that are involved and patients with different kinds of conditions and situations.
  • Sometimes there’s a concern or a conflict between focusing in on trying to create a high reliability system and how that fits with clinical judgment.
  • Can you talk a little bit about how clinical judgment and reliable design come together? DON GOLDMANN: This comes up all the time.
  • You know Atul Gawande- whom some of you may have heard of, as he’s quite famous- wrote about the Cheesecake Factory and how their standardized processes were so great.
  • That’s all true when you’re making cheesecakes and you want to make them the same way every time, and clinical people will go to the other extreme.
  • We’ll say, well, every patient’s different and my wisdom and clinical judgment and years of experience are absolutely important and don’t tell me what to do.
  • He said, I see you’re really anxious about this, and this is the evidence.
  • It’s going to take a lot of persuading for me to say, OK, your judgment overwhelms and is more important than all of that evidence in epidemiology. So I tend to look at this as: what is the degree of evidence? Not just degree of belief, as you’ve been talking about, but the degree of evidence.
  • The experience, the accumulated wisdom of people who’ve observed thousands of patients, and in fact, their learning is part of the practice.
  • DAVE WILLIAMS: So I loved your story about Colombia and trying to work with physicians there and the other health care providers to try to change the process.
  • I’m curious what were some of the things you learned as you had that experience and might help us as we think about improvement going forward.
  • In the United States, we tend to be embedded in tradition and the way we always do things, and a little suspicious of these new approaches.
  • You go to where they are and get their opinion on their terms and make it convenient for them.
  • I know you and I are always talking about, you know, Deming, Shewhart.
  • I’m always saying, well, how can we explain that without referencing Deming and Shewhart?
  • I see no evidence that bringing the staff together and giving them two days of training in a classroom about the principles we’re discussing here is effective.
