More often than not, maintenance departments have become reliant on CMMS to provide a steady stream of asset data, spare parts inventory data, employee data or financial data. As our dependence on computers grows each year and our trust in technology increases, we must be very careful not to develop a false sense of security when basing key decisions on output from a CMMS, without first validating that data quality is sound. In my previous column, the concept of data quality was introduced and defined using words such as integrity, completeness, reliability, accuracy, consistency, accessibility and 10 other terms. CMMS-related examples were provided for each term.
In this column, I will examine ways to achieve and maintain higher levels of data quality. This includes various techniques such as proper data conversion, process design and system setup, as well as better use of the features and functions available on many CMMS packages.
When a new CMMS is purchased to replace an existing one, management must decide whether to transfer data from the old system to the new one. This is a key decision for data quality, because any pre-existing data problems, such as missing data, poorly worded descriptions and outright errors, risk being carried over to the new CMMS.
Furthermore, new unanticipated issues might surface, for example, when trying to translate a single 50-character vendor address field used by the old system into two 20-character address fields required in the new system without losing data integrity. Another typical data conversion issue is deciding how to convert a descriptive field on the old system into a coded field on the new system. A good example of this issue is translating a long description of the root cause of a problem on a given work order, into the most applicable symptom and failure codes.
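A minimal sketch of the field-length problem, assuming hypothetical field widths: the legacy 50-character address must be split across two 20-character fields, and records that cannot fit should be flagged for manual review rather than silently truncated.

```python
def split_address(old_address: str, width: int = 20) -> tuple[str, str]:
    """Split one legacy address field into two fixed-width fields,
    breaking on a word boundary where possible.

    The 20-character width is an assumption for illustration.
    """
    old_address = old_address.strip()
    if len(old_address) <= width:
        return old_address, ""
    # Prefer to break at the last space that still fits in the first field
    cut = old_address.rfind(" ", 0, width + 1)
    if cut == -1:
        cut = width  # no space found: hard break
    line1 = old_address[:cut].strip()
    line2 = old_address[cut:].strip()
    if len(line2) > width:
        # Flag for manual data scrubbing instead of losing characters
        raise ValueError(f"Address does not fit two fields: {old_address!r}")
    return line1, line2
```

A conversion script would run every legacy record through a check like this and route the failures to the data-scrubbing team.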
Thus, implementing a new CMMS is a great excuse for reviewing the existing data in detail and making sure the new CMMS is loaded with data that is accurate, valid, complete and concise. In my experience, about 50% of companies eventually abandon their plans to convert much of the data from existing systems electronically. Instead they scrub the data and re-enter it into the new system manually or use semi-automated tools such as spreadsheet templates the CMMS vendor provides.
For transactional data such as work order history and spare parts usage, many companies decide not to load the old data onto the new system. As a low-cost alternative, accessibility is provided by simply generating electronic or hardcopy reports from the old system and storing them in an easily retrievable location or allowing users to access the old system for a reasonable period of time.
To ensure data quality, processes must be designed to use and complement features and functions on the CMMS. For example, approval processes can be established using either simple approvals functionality on a CMMS or a workflow engine on the more sophisticated packages. Either way, supervisory staff should validate data in terms of reasonableness and spending limits, such as the total labor and materials charged to a work order. The key to good process design in terms of data quality is that adequate controls are in place.
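To illustrate the kind of control such an approval process enforces, here is a hypothetical spending-limit check in Python; the roles and dollar limits are invented for the example, not taken from any particular CMMS.

```python
# Illustrative approval limits by role (assumed values)
APPROVAL_LIMITS = {"supervisor": 5_000, "manager": 25_000}

def needs_escalation(labor_cost: float, material_cost: float, role: str) -> bool:
    """Return True if the work order's total charges exceed the
    approver's spending limit and must be escalated."""
    total = labor_cost + material_cost
    if total < 0:
        # Reasonableness check: negative charges indicate a data entry error
        raise ValueError("Charges cannot be negative")
    return total > APPROVAL_LIMITS[role]
```

A workflow engine applies rules of this kind automatically; on simpler packages, the supervisor applies them by inspection before approving the work order.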
Another key area of vulnerability regarding data quality is poor system design and setup. In conjunction with your CMMS vendor and IT department, think of the ways that data might be compromised and plan an approach that mitigates each risk. Consider the following risks, for example:
- Accidental or malicious alteration or destruction of data by users
- Accidental or malicious hacking or introduction of viruses or worms by external sources
- Hardware malfunction such as disk crashes or electrical surges
- Environmental influences such as excessive humidity, temperatures and dust
- Poor network administration, such as not knowing who is hosting the application or not vigilantly managing security profiles and passwords
- Poor emergency preparedness for earthquakes, floods or acts of war
- Errors occurring when data is transmitted from one computer to another
To prevent or reduce the effect of these risks to data quality, consider such measures as:
- Regular and multiple back-ups, including off-site storage
- System mirroring or redundancy such that critical hardware and software are duplicated in an alternate location; in the event of system failure, the mirrored/redundant system kicks in
- Environmental controls in place for hardware (e.g., climate-controlled data center) and user equipment (e.g., ruggedized laptops)
- Policies and procedures in place and enforced to ensure data quality (e.g., security management)
- Strict controls and detection software for monitoring system access and activity
- Disaster recovery and business continuity plans that are regularly practiced
- Error detection and correction software for data transmission
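To illustrate the last measure, a simple checksum lets a receiver detect corruption in transmitted data. This Python sketch uses a CRC-32, a common error-detection code; real networks and databases apply similar checks at lower layers, so this is a minimal illustration, not a recommended transport.

```python
import zlib

def with_checksum(payload: bytes) -> bytes:
    """Append a CRC-32 checksum so the receiver can detect corruption."""
    return payload + zlib.crc32(payload).to_bytes(4, "big")

def verify(message: bytes) -> bytes:
    """Return the payload if the checksum matches, else raise an error."""
    payload, received = message[:-4], int.from_bytes(message[-4:], "big")
    if zlib.crc32(payload) != received:
        raise ValueError("Transmission error detected")
    return payload
```

Error-*correcting* codes go one step further, adding enough redundancy to repair small errors rather than merely detect them.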
Before upgrading your existing CMMS or implementing a new one, conduct extensive testing to make certain there are no data quality issues at the unit, process or system levels. As well, test for data quality problems under peak loading (stress testing) and across application interfaces (integration testing). Testing can be time-consuming and expensive, but it's well worth the effort: it is an effective means of reducing data quality problems and assuring the CMMS works as intended.
To get the most out of the policies and procedures, process design, and the system setup, users must be properly trained both technically and in the softer skills. An example of technical skills required is how to input work order data properly. Softer skills would include escalation procedures if a security breach is suspected or data quality is compromised.
One of the most sophisticated groups of functions of a modern CMMS for preventing and detecting data quality problems falls under the banner of data security and integrity. Included in this category are such features as:
- Security access: provides password protection for legitimate users to login to the system and provides access to individual modules, menu items, screens, tabs, links, fields and even specific databases or data (e.g., limiting access to purchase orders that reference a given account number)
- Security profiles: allows users with administrator rights to build and select security profiles that can then be configured to the specific needs of individuals or groups
- Edit capability: restricts a user's add/change/delete capability anywhere in the database to a pre-determined combination, depending on the user's security profile
- Audit trail: tracks login and logout activity, changes to any database, and even keystrokes in some cases
- Error-checking: checks data entered for errors in format (e.g., part number must be an alpha character followed by four numeric digits); range (e.g., equipment code ranges from 1,500 to 3,000); and logic (e.g., wrong engine for a given vehicle, or too many or too few pumps in a given piece of equipment)
- Digital signatures: provides a secure process for identifying an electronic document as identical to the original (data integrity) and from the stated source (authentication).
These features not only provide data security, but they also provide improved productivity, flexibility and scalability, depending on how the user interface is configured for user group profiles such as planners and supervisors, as well as for individuals.
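The three kinds of error-checking named in the list can be sketched as follows; the part-number format, the equipment-code range and the vehicle/engine pairs are assumptions for illustration, not any particular CMMS's rules.

```python
import re

def check_format(part_number: str) -> bool:
    # Format check: an alpha character followed by four digits (assumed rule)
    return re.fullmatch(r"[A-Za-z]\d{4}", part_number) is not None

def check_range(equipment_code: int, low: int = 1500, high: int = 3000) -> bool:
    # Range check: equipment code must fall within the configured range
    return low <= equipment_code <= high

def check_logic(vehicle_model: str, engine_model: str,
                valid_engines: dict[str, set[str]]) -> bool:
    # Logic check: the engine must be one of the valid engines
    # for the given vehicle model
    return engine_model in valid_engines.get(vehicle_model, set())
```

In a real CMMS these rules live in the field definitions and validation tables set up during implementation, which is one more reason careful system setup pays off.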
E-mail Contributing Editor David Berger, P.Eng., partner, Western Management Consultants, at [email protected].