Indicators part 2: What does a comprehensive program contain?

We already know how to look at event rate by counting threshold events and dividing by worker hours, but that only tells us one third of the story. So, what's missing? OUTCOMES! How many times have we performed tasks successfully? This can be difficult to quantify at first, but it can be broken down into process parts: how many work orders were completed, separated into new installations, surveillance testing, preventative maintenance, corrective maintenance, and troubleshooting. This tells us a story: how many good outcomes versus bad ones we had over a period of time. A good indicator tells you that story.
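To make those first two pieces concrete, here is a minimal sketch in Python of one way you might compute them side by side. The record layout, the work-order data, and the threshold-event and worker-hour numbers are all made up for illustration; only the categories come from the list above.

```python
from collections import Counter

# Hypothetical work-order records: (category, outcome) - values are illustrative only
work_orders = [
    ("new installation", "success"),
    ("surveillance testing", "success"),
    ("preventative maintenance", "rework"),
    ("corrective maintenance", "success"),
    ("troubleshooting", "success"),
]

threshold_events = 4      # threshold events counted this period (assumed number)
worker_hours = 12_500     # worker hours for the same period (assumed number)

# Failure side of the story: threshold events divided by worker hours
event_rate = threshold_events / worker_hours

# Outcome side of the story: successful work orders, broken out by category
completed_by_category = Counter(
    category for category, outcome in work_orders if outcome == "success"
)

print(f"Event rate: {event_rate:.6f} events per worker-hour")
for category, count in completed_by_category.items():
    print(f"{category}: {count} completed")
```

Run over a real period's data, the same two numbers give you the failure rate and the success count that the rest of this post builds on.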

Bring it all together, please!

Okay, what's the third piece of the indicator puzzle? Not all outcomes are achieved in the desired fashion. Shortcuts, rushing the job, and a myriad of other things can result in a satisfactory job outcome, but certainly not a job well done. What were the behaviors that got you to that outcome? This is where a slick Observation program comes in: peers can comment on one another's work practices, and management can also weigh in on the behaviors used during the work.

Now this is a more comprehensive program - failure rate, success count, and behaviors that helped get desired outcomes.
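For the behavior piece, a sketch like the one below shows one simple way to tally observations so they can sit on the same report as the event rate and the success counts. The record fields, behavior names, and observer labels are assumptions for illustration, not a prescribed format.

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical observation record: who observed, which behavior, and whether it was positive
@dataclass
class Observation:
    observer: str   # e.g. "peer" or "management"
    behavior: str   # e.g. "pre-job brief", "procedure use", "rushing"
    positive: bool

observations = [
    Observation("peer", "pre-job brief", True),
    Observation("management", "procedure use", True),
    Observation("peer", "rushing", False),
]

# Tally positive versus at-risk behaviors for the reporting period
positive = Counter(o.behavior for o in observations if o.positive)
at_risk = Counter(o.behavior for o in observations if not o.positive)

print("Positive behaviors:", dict(positive))
print("At-risk behaviors:", dict(at_risk))
```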

In the dark room (where development happens)

Now that you have this information, the puzzle comes together to develop a picture. This picture should give you a fairly accurate depiction of where to focus your error prevention program. The problem remains: if you don't do anything with the data, why go through all the trouble of tracking, trending, and observing it in the first place?

A quick anecdote to drive this home

Years ago, a Maintenance Corrective Action Coordinator spelled out in detail the problems in the Maintenance department and three focus areas to turn performance around. His report was collected, reviewed, and largely ignored by management (he became highly frustrated and quit a very well-paying job shortly afterwards). No kidding: within two months, all three of his predictions came true. If you're not going to use this information to improve, don't bother capturing and analyzing it in the first place. You're just wasting everyone's time.

Some suggestions

Indicators must begin with the end in mind (thank you, Mr. Covey). What are you going to do with the information to improve performance?

Check out this TEDx Talk on starting with the end in mind! It's a great story about how sheep respond to carrots and sticks, and about "energy through hope!"

Sometimes a simple indicator is all that is necessary. See this excerpt from "The Sid Story" with Dennis Franz for a nice training example of a simple (but specific) efficiency indicator positively affecting department performance. I love this story about an indicator improving performance because people knew the score. Everyone who knows me knows I love "Gamification," and this is an excellent form of it.

Feedback

To everyone who discusses these posts on LinkedIn, and to everyone who has sent thoughtful emails - thank you for being part of this community and keeping these conversations on performance improvement going. More posts and new podcasts coming soon!
