Ensure CMMS data is good enough to trust
By David Berger, P.Eng., contributing editor
If you’ve ever been surprised by the results of a query or report your CMMS generated, there are several possible reasons. Perhaps you misread the numbers or pressed a wrong key. Maybe you weren’t aware of certain information, your expectations were unrealistic or you were in denial. Unfortunately and all too often, data quality is the culprit — “garbage in, garbage out.” If indeed that is the reason, your CMMS software is not worth the cost of the computer on which it runs.
It’s quite astonishing just how reliant management is on a CMMS to provide information such as budget variances, asset availability and performance, energy consumption, payroll hours consumed, work backlog and so on. Yet despite our thirst for information, there’s sometimes little thought as to where the data is coming from and whether it reflects reality. It’s our inexplicable blind faith in technology that is our weakness — as if anything the CMMS outputs to screen or paper must be accurate because a computer processed it. As many maintenance managers have discovered over the years, the quality of data input into the CMMS can be sadly lacking.
Outlined below is the definition of data quality, including examples of how it can be compromised. In next month’s column, I’ll explain what can be done to detect, prevent and rectify data quality problems.
There are many terms that can be grouped, in whole or in part, under the umbrella of data quality. Each term is defined below. In general, high-quality data is information that meets the expectations of its source and consumers.
Integrity ensures that data is preserved, from source to consumer, without distortion from accidental or malicious intervention. For example, a receiver of spare parts scans a barcode which the scanner reads incorrectly because the label is damaged, or someone steals a spare part, hacks into the CMMS and then falsifies the number of spare parts received.
Validity ensures data conforms to business rules such as format, range and logic. For example, when entering a meter reading to determine if a PM is due, it must be four numeric digits with a value greater than the last entry, but by no more than 1,000 miles.
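The meter-reading rules above can be expressed directly as validation logic. A minimal sketch in Python, assuming a hypothetical function name and that readings are whole miles (neither comes from any particular CMMS):

```python
# Hypothetical validity check for a meter-reading entry, applying the
# business rules described above: four numeric digits (format), greater
# than the last reading (logic), and no more than 1,000 miles beyond it
# (range).
def is_valid_meter_reading(entry: str, last_reading: int) -> bool:
    if len(entry) != 4 or not entry.isdigit():  # format rule
        return False
    value = int(entry)
    if value <= last_reading:                   # logic rule
        return False
    if value - last_reading > 1000:             # range rule
        return False
    return True
```

Checks like these are cheapest when enforced at the point of entry, before a bad reading ever reaches equipment history.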
Accuracy refers to the degree of data correctness and conformity to a given standard. For example, a technician works on a job that should take no more than three hours, but instead enters five hours on the work order because of human error or a deliberate distortion of the facts.
Precision refers to the level of granularity of the data. For example, when performing a maintenance inspection that requires measuring the amount of oil remaining, a technician records the level as “1/2” instead of “0.4532.”
Credibility refers to the reasonableness of data or how believable it is. For example, technicians have varying degrees of experience and diagnostic capability in determining a root cause of failure — data from one source might be more credible than another source.
Timeliness means that although some data might be perfect in every other way, unless it’s current or comes at the appropriate time, it might be of no value. For example, if work order information is entered at the end of each week instead of at the end of each job, any equipment history reports generated throughout the week might not reflect reality and lead to poor decision-making.
Completeness becomes a problem if users choose to use fewer fields or enter less data into the CMMS, either to save time or cost, but ultimately, decision-making capability can be compromised. For example, CMMS users don’t use standard job plans for repetitive work, including estimated hours to complete the job, safety procedures, standard parts required and so on. The lack of standardized processes results in poor productivity and an inability to track variance analysis on performance and quality standards.
Conciseness means higher-quality data is brief and succinct. For example, using a coded action field to select “replaced motor” is more concise than using a descriptive field to enter “I took out the old motor and put in a brand new one.”
Redundancy, making two or more entries of the same data, is a common problem with a large database — for example, entering a new vendor in the vendor master under the name of “ABC Bearings of Canada Ltd.,” when the existing CMMS entries “ABC Bearings” and “Ontario 1234567” already refer to the same vendor. This is also common in entering multiple part numbers referencing the identical part carried by different suppliers.
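One common way to catch redundant entries like the vendor example above is to normalize names before inserting a new record. A simple illustrative sketch in Python (the function names and suffix list are my own, not any CMMS vendor’s; a real system would also need fuzzy matching to catch aliases like a numbered-company name):

```python
import re

# Legal suffixes and filler words stripped during normalization
# (an assumed list for illustration only).
SUFFIXES = {"ltd", "inc", "corp", "co", "of", "canada"}

def normalize(name: str) -> str:
    """Lowercase, strip punctuation, and drop legal suffixes."""
    words = re.findall(r"[a-z0-9]+", name.lower())
    return " ".join(w for w in words if w not in SUFFIXES)

def likely_duplicate(new_name: str, existing: list[str]) -> bool:
    """Flag a new vendor whose normalized name matches an existing one."""
    key = normalize(new_name)
    return any(normalize(e) == key for e in existing)
```

Under this scheme, “ABC Bearings of Canada Ltd.” and “ABC Bearings” normalize to the same key and the second entry is flagged before it pollutes the vendor master.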
Consistency refers to the level of data repeatability: would someone enter the same data under the same circumstances in the future? For example, technicians aren’t always consistent when selecting problem, cause and action codes.
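One way to enforce consistency in problem, cause and action codes is to constrain entry to a closed code list rather than free text. A brief sketch in Python, using hypothetical code values for illustration:

```python
from enum import Enum

# Hypothetical closed action-code list; a real CMMS would define its own.
class ActionCode(Enum):
    REPLACED = "replaced"
    REPAIRED = "repaired"
    ADJUSTED = "adjusted"

def record_action(raw: str) -> ActionCode:
    # Map the entry onto a known code; unknown text raises ValueError
    # instead of creating yet another spelling in equipment history.
    return ActionCode(raw.strip().lower())
```

Because every entry must resolve to one of the defined codes, two technicians describing the same repair end up with the same value, which is what makes later reporting on failure history meaningful.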
Objectivity means data is free of bias. Because humans enter, approve and manipulate CMMS data, it’s possible to introduce distortions that might not be recognizable. For example, a technician who is anxious to replace a piece of equipment that’s perceived as problematic might color the description of a failure or exaggerate the downtime recorded.
Utility refers to the data’s applicability and usefulness. For example, there are hundreds of reports available on a CMMS, but not all reports have the same utility; companies find it difficult to prioritize and focus on a small set of measures that trade off, such as tracking asset availability, utilization, performance, reliability, quality of output and total cost of ownership for critical assets.
Accessibility means that CMMS data should be available and easy to obtain for those authorized to enter or retrieve it. For example, when using mobile devices in remote regions of the country, technicians should have some means of entering data and accessing work orders, GIS maps showing asset location, equipment history and so on.
Usability, one of the most critical aspects of data quality, refers to the intuitiveness and ease of use of data, including ease of learning and remembering. For example, technicians might use shortcuts to save time when entering work order or master file data; this is fine if data can be found afterward and understood by everyone.
Traceability is important when you need to know the source of the data to determine its credibility and to obtain further information. For example, a maintenance supervisor drills down on a major budget variance and discovers that a new technician is working on an unfamiliar piece of equipment, so more training is required.
Flexibility means higher-quality data won’t be compromised when the process or the CMMS database supporting it is changed. For example, upgrading your CMMS to a new and improved version shouldn’t compromise the accuracy or integrity of the data converted to the new system.

As you can see, there are many ways to corrupt or distort CMMS data, despite our growing trust in, and dependence on, computer systems. This underlines the importance of maintaining a healthy skepticism when making decisions based on data the CMMS presents. In next month’s column, I’ll examine ways to achieve and maintain higher levels of data quality.
E-mail Contributing Editor David Berger, P.Eng., partner, Western Management Consultants, at email@example.com.