Reflections on Healthcare Safety: updated!

I have been discussing healthcare safety with Robert Centor since I started reading his blog, http://www.medrants.com/, specifically his post at http://is.gd/5xXBA discussing “hard data” on safety program effectiveness. While I have plenty of references for you, they come from my own experience and focus on safety engineering across industries rather than exclusively on the healthcare domain. Feel free to use the “grocery store approach”: take what you need and leave the rest on the shelf.

Personally, I have one example of cost savings, or “safety dividend,” from my time as Safety Officer of USS Halsey. Captain Bob Pacek credits my afloat safety awareness programs with $292,000 (2009 dollars) in accident avoidance, given the history of incidents that have occurred during other underway replenishment-at-sea evolutions.

Looking at healthcare, Michelle Mello reports that the Harvard School of Public Health found the average 2006 cost per hospital-acquired condition (HAC) to be $58,766 across all adverse events in Utah and Colorado [Mello, 2007]. I can report for certain that a major hospital system in the Austin area had seven HACs in 2008 and is trending upward, with eight already recorded as of July 2009.

If this year’s trend holds, the hospital (just one of the system’s facilities!) will spend over half a million dollars on events that the Joint Commission concurs are reasonably preventable. I know that Minnesota requires HACs to be publicly filed in a database and that the Leapfrog Group (leapfroggroup.org) has long advocated a nationwide data-collection effort.
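For readers who want the back-of-the-envelope arithmetic behind “over half a million,” here is a minimal sketch. The extrapolation method (eight HACs through July continuing at the same monthly rate) and the use of Mello’s 2006 average cost are my own assumptions, not figures reported by the hospital:

    # Rough HAC cost projection -- illustrative assumptions only.
    COST_PER_HAC = 58766        # Mello's average cost per adverse event (2006 dollars)
    hacs_through_july = 8       # HACs recorded January-July 2009 (assumed 7 months)
    months_elapsed = 7

    projected_hacs_2009 = hacs_through_july / months_elapsed * 12   # about 13.7 HACs
    projected_cost_2009 = projected_hacs_2009 * COST_PER_HAC        # about $806,000

    print(f"Projected 2009 HACs: {projected_hacs_2009:.1f}")
    print(f"Projected 2009 cost: ${projected_cost_2009:,.0f}")

Even the more conservative reading, eight HACs for the whole year, comes to roughly $470,000.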

If you need a medical figure to point to for authority, Dr. Philip G. Lewis, FACPM, FACOEM, corporate medical director for Rohm and Haas Company, teaches a module on environmental medicine to medical residents in which he points out, “you have to always be prepared to leave an enterprise when it persists in following a path that, while legal, is unsafe and unwise.” [Lewis, 2008]

In one of the broadest industry surveys of which I’m aware, researchers in the psychology of safe work cultures analyzed more than three billion (3×10^9) work hours from 1,750,000 staff across 297 companies, supplemented by 4,000 hours of confidential interviews with trained counselors. Their results give us some metrics: for every 600 near-misses, there are 30 property-damage incidents, 10 minor injuries, and one major injury (including fatalities). In 2003, ConocoPhillips Marine updated Bird and Germain’s original results in another large study, with the striking finding that for every fatality there are at least 300,000 at-risk behaviors (behaviors not consistent with safety programs, training, and machinery components) [Bird, Germain 1996], [Roughton, 2008].
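To make those pyramid ratios concrete, here is a minimal sketch (my own illustration, not taken from Bird and Germain or ConocoPhillips) that scales the 600 : 30 : 10 : 1 proportions to whatever near-miss count a site actually observes:

    # Accident-pyramid sketch: scale the 600:30:10:1 ratios
    # (near-misses : property damage : minor injuries : major injuries)
    # to an observed near-miss count. These are population averages,
    # not a prediction for any single facility.
    PYRAMID = {
        "near misses": 600,
        "property-damage incidents": 30,
        "minor injuries": 10,
        "major injuries (incl. fatalities)": 1,
    }

    def expected_incidents(near_misses_observed):
        """Scale each pyramid layer by the observed near-miss count."""
        scale = near_misses_observed / PYRAMID["near misses"]
        return {layer: count * scale for layer, count in PYRAMID.items()}

    for layer, count in expected_incidents(1200).items():
        print(f"{layer}: {count:.1f}")

With 1,200 reported near-misses, for example, the ratios suggest roughly 60 property-damage incidents, 20 minor injuries, and 2 major injuries.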

ConocoPhillips Marine is now PolarMarine, and Chris (Safety Director for the new firm) requires me to state that CP doesn’t advertise these numbers because they are so easy to distort into sound-bites. As you and I are both aware, to be meaningful the numbers have to be traceable, evidence-based, and auditable. All three words point to the same thing, just in different vernaculars or “tribal languages” depending on the industry.

From a risk management perspective, Gavin de Becker writes,

“People do things, we say, ‘out of the blue’, ‘all of a sudden’, ‘out of nowhere.’ These phrases support the popular myth that predicting human behavior isn’t possible. Yet to successfully navigate through morning traffic, we make amazingly accurate high-stakes predictions about the behavior of literally thousands of people … We expect all the drivers to act just as we would, but we still alertly detect those few who might not, so that we are also predicting their behavior, unpredictable though we may call it…

“We want to believe that human violence is somehow beyond our understanding, because as long as it remains a mystery, we have no duty to avoid it, explore it, or anticipate it.” [de Becker 1998]

Writing in Security Management magazine, Jeff Marquart, one of de Becker’s division VPs, explains, “I cringe every time I hear these words [nothing you can do] because they are so untrue and so damaging. … Despite much proof to the contrary, many people continue to believe that an assassin is like a force of nature…The reality, however, is that successful assassinations require that hundreds of factors play out a certain way. Even the smallest snag in an assassin’s plans will make the difference between success and failure.” [Marquart 1999]

Systems thinking IS the answer in all of these success stories, and the lack of it is a major culprit in the tragedies. Karl Weick gives one of the best examples of an industry practicing systems thinking without even knowing it: shipyards. His quote from “Making Sense of the Organization” [http://tinyurl.com/yklamss] is timeless.

I have previously proposed safety training agendas using extremely short (12-minute) lessons to fit into even the busiest schedules. That word, busy, is well on its way to becoming the latest opiate of the masses. A dear friend, who happens to be the VP of Human Resources for my hometown bank, asked me whether systems thinking and systems engineering weren’t just new names for doing more with less.

Actually the reverse is true: these disciplines are ways of dealing with less (often imposed on us from outside), by using more of the brain God gave us! By recognizing and adapting practices that have worked before to our unique needs and situations, we can start to overcome one of the most common modern myths: I’m too busy! [See http://is.gd/5Ab3b for alternatives]

By contrast, Edward Tufte, the renowned expert on communicating high-density data effectively, remarks that, in crucial matters,

“When someone says, ‘Just give me the highlights,’ what they really mean is, ‘I have no sense of what’s relevant. I’m not willing to probe. I don’t have any time to think seriously about this problem.’ But detail offers the impatient something wonderful: When you’re focused on data, not pitching, the meeting will be shorter. It’ll be more intense and you’ll have to work harder, but you’ll learn a lot more and you have a chance at making smarter decisions…Meetings run this way are more participative and shorter, but people have to work harder. They have to think. But who could ask for more: smarter and shorter meetings?” [Tufte, 2009]

Hope this helps, and have a very merry and blessed Christmas and a happy New Year!

Regards,

Matt

///

NOTES:

Bird, Frank E. and Germain, George L., Loss Control Management: Practical Loss Control Leadership, Revised Edition, Det Norske Veritas (U.S.A.), Inc. (1996): Figure 1-3, p. 5.

de Becker, Gavin. The Gift of Fear, New York, NY: Dell (1998): 15-16.

Mello, Michelle M. “Who Pays for Medical Errors? An Analysis of Adverse Event Costs, the Medical Liability System, and Incentives for Patient Safety Improvement.” Journal of Empirical Legal Studies Vol. 4, No. 4 (Dec. 2007): 835-60.

Marquart, Jeff. “Can a determined assassin be stopped?” Security Management. American Society for Industrial Security. April 1999. Retrieved July 17, 2009 from HighBeam Research: http://www.highbeam.com/doc/1G1-54482535.html.

Roughton, James. “The Accident Pyramid.” Safety Culture Plus, July 22, 2008. Retrieved July 31, 2009 from http://emeetingplace.com/safetyblog/?p=275.

Tufte, Edward. “How Facts Change Everything (If You Let Them).” Sloan Management Review. Retrieved July 30, 2009 from http://sloanreview.mit.edu/the-magazine/articles/2009/summer/50409/how-facts-change-everything-if-you-let-them/.

1 comment to Reflections on Healthcare Safety: updated!

  • Kelsi Slosser

    The Physician-Patient Alliance for Health & Safety (PPAHS) is an advocacy group dedicated to improving patient health and safety. Doing so involves many facets. Hospitals need to leverage technology to provide necessary monitoring of patient vital signs. Management needs to enable healthcare providers to reduce safety risks by implementing procedures centered on patient safety. Physicians and patients alike need access to information on patient safety and must take an active role in preventing adverse events.

