After celebrating the end of many long months, even years, of planning and executing an improvement project, a rehash of the gory project details is the last thing on the project team's mind. Nevertheless, a post-implementation review or audit can be an extremely valuable exercise, providing feedback on whether the plan was executed properly and the key benefits were achieved efficiently and effectively.
Avoid finger-pointing in the review. Instead, make it forward-looking, focusing on what went well and therefore should be repeated in the future, as well as what can be improved.
Many companies hire an independent expert to conduct the review, thereby ensuring a balanced report. The outsider may provide valuable insight into the cause of problems and corrective actions. For the purposes of reviewing software implementation, your experts can be either consultants or vendors.
The following summarizes the contents of a typical post-implementation review report, using a CMMS implementation as the example project. However, the approach proposed here is applicable to any maintenance improvement project under review.
The review document
The report's first section describes the methodology used for gathering information. Most reviews have two parts: extensive interviews and perusal of documentation. The people interviewed should represent each stakeholder group involved in the project, including maintenance management, maintenance technicians, operations management and workers, senior management, accounting, information technology, engineering, purchasing and the CMMS vendor. The report should list the people interviewed.
It's useful to have a list of standard questions to guide each interview. Some sample questions include:
What was the purpose of the project?
What did you like about the implementation process?
What things would you change if you had to do it again tomorrow?
Were your expectations met? Why or why not?
Was communication effective throughout the process?
It's important to reiterate the original purpose behind implementing the CMMS to ensure the original objectives were met. If the objectives were clearly stated at the project's start, copying that material from the original project documentation is sufficient. It's surprising how often objectives weren't clear, as evidenced by the range of responses to the first question above. Focusing on a few well-communicated objectives is key to ensuring a successful project.
Issues and recommendations
The bulk of the review should identify issues from each stage in the project, recommendations for eliminating outstanding issues and suggestions for preventing a recurrence. Consider some of the more common issues below.
The definition of requirements is the first potential trouble spot. Common complaints regarding development of the CMMS specifications include failure to:
Involve every key stakeholder.
Develop specific requirements that can differentiate among vendor offerings.
Prioritize the requirements.
Establish requirements for the next three to five years, rather than focusing on replacing the current system.
Determine the process changes required to enable the CMMS.
Vendor selection also has common issues, including:
Lack of a formal vendor selection committee.
Failure to follow a methodology mutually agreed to by all stakeholders.
Omission of critical steps in evaluating vendors, such as site visits and reference checks.
Failure to speak to someone other than the project champion during site visits and reference checks.
Improper or inconsistent documentation of vendor ratings, which makes vendors feel that they were treated unfairly.
Testing of candidate CMMS packages too often amounts to little more than a vendor demonstration. An effective evaluation, by contrast, requires you to provide the vendor with test scripts demonstrating that the CMMS can manipulate your own real data and produce reports in accordance with your specifications. The vendor should fully test customized software or configuration changes before delivery. Other forms of testing include hardware testing, stress testing (ensuring the system can handle peak volumes), integration testing (passing data to and from accounting, shop-floor data collection and ERP systems) and conducting a pilot.
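To make the idea of a test script concrete, here is a minimal sketch in Python. All field names, figures and tolerances are illustrative assumptions, not a real vendor API; the point is that you feed the candidate system a sample of your own work-order data and verify its report output against totals you have computed independently.

```python
# Sample of your own real work-order data, as handed to the vendor.
work_orders = [
    {"id": "WO-1001", "asset": "Pump-3", "hours": 4.5, "cost": 380.00},
    {"id": "WO-1002", "asset": "Pump-3", "hours": 2.0, "cost": 145.00},
    {"id": "WO-1003", "asset": "Fan-7",  "hours": 6.0, "cost": 510.00},
]

def summarize_by_asset(orders):
    """Stand-in for the CMMS report under test: total hours and cost per asset."""
    summary = {}
    for wo in orders:
        s = summary.setdefault(wo["asset"], {"hours": 0.0, "cost": 0.0})
        s["hours"] += wo["hours"]
        s["cost"] += wo["cost"]
    return summary

def run_test_script(report):
    """Compare the report against figures computed independently from the
    source data; any mismatch is a failed acceptance criterion."""
    expected = {
        "Pump-3": {"hours": 6.5, "cost": 525.00},
        "Fan-7":  {"hours": 6.0, "cost": 510.00},
    }
    failures = []
    for asset, exp in expected.items():
        got = report.get(asset)
        if got is None:
            failures.append(f"{asset}: missing from report")
        elif (abs(got["hours"] - exp["hours"]) > 1e-6
              or abs(got["cost"] - exp["cost"]) > 0.01):
            failures.append(f"{asset}: expected {exp}, got {got}")
    return failures

failures = run_test_script(summarize_by_asset(work_orders))
print("PASS" if not failures else failures)  # prints "PASS"
```

In practice the report would come from the candidate CMMS rather than a local function, but the structure is the same: known inputs, independently calculated expected outputs, and an automatic pass/fail comparison that every vendor is scored against identically.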