2009 CMMS/EAM Review: Power up a winner
By David Berger, P.Eng., Contributing Editor
So, you’re thinking about a new or replacement computerized maintenance management system or enterprise asset management system (CMMS/EAM). Have you thought about what could go wrong? I hate to be negative about it, but many studies over the years put the odds at more than 50% that your CMMS/EAM system implementation will end in failure — even if this isn’t your first time trying.
It’s therefore worth your while to try to understand the typical problems that companies face when selecting a CMMS. Choosing the right software is by no means a guarantee that you’ll be successful through to the end of implementation and beyond, but at least you’re off to a great start. The more time and effort you put into the initial planning, design and selection phases, the greater your prospects for success.
Some of the typical blunders made when selecting a CMMS/EAM are described in a recent white paper titled "10 Pitfalls to Avoid When Selecting a CMMS/EAM." Based on a reader survey Plant Services conducted in December 2008, the white paper details the top 10 selection mistakes.
Do any of these mistakes sound familiar? Your best defense is to allocate adequate time for the right resources in the early planning and selection stages. This can save you years of aggravation during and after implementation. Follow a solid step-by-step methodology for designing new processes, developing the system requirements that support them, and selecting the right combination of CMMS/EAM package and vendor. Thus, to increase the odds that your CMMS/EAM software implementation will be a success, consider following these steps:
1. Build process/system requirements: One of the most critical steps in planning for a new or replacement CMMS/EAM is to determine your needs. Procuring a CMMS/EAM system isn’t about finding the best software package on the market. The key to a successful implementation is selecting a CMMS/EAM package that best fits your requirements. There are many wonderful CMMS/EAM packages available today, but every one of them has its strengths and weaknesses. Your task is to determine user specifications based on the needs of stakeholders (e.g., maintenance, operations, engineering, IT, materials management, purchasing, finance), and then choose the combination of CMMS/EAM vendor and software package that can best deliver on those needs.
This is why it’s so critical to invest three to six months in the design of new processes and supporting system specifications using a participatory approach that involves key stakeholders. This must be done before the selection phase begins so you can filter the sales pitch from each vendor and steer them to exactly what you need to see demonstrated to determine the best fit. Many companies naively believe that there’s no point in spending time on process design until the system has been selected. Although it’s true that your process design can’t be finalized at a detailed level before knowing which software package you’ll purchase, 30 years of track records show that neglecting to engage stakeholders in process design before the selection increases the probability of failure.
The methodology you use to build process/system requirements should involve defining process flows that reflect the current state and the desired future state. The future state processes will be supported by best practices and enabling system specifications.
For example, perhaps the future state processes indicate stakeholder desire to move from the current state of firefighting to a more planned environment. Reducing the high percentage of emergency and reactive maintenance requires more preventive and condition-based maintenance, better planning and scheduling, and a variety of analysis tools for managing the transition. Addressing the gap between the current state and the desired state presents a number of challenges that should be discussed long before a system is selected.
Throughout the process/system design phase you can also identify Quick Wins, i.e., improvements that can be made immediately because they don’t rely on the new CMMS/EAM package being implemented. Quick Wins not only provide immediate savings, but they also generate excitement, build momentum and establish credibility for the project.
Table 1. On tap at www.PlantServices.com/CMMS_Review
Once the system requirements supporting the future state process flows have been established, the next step is to draft an RFP and send it to at least three CMMS/EAM vendors that are most likely to provide a best fit. If you’re unsure which vendors are good candidates, then conduct some research, such as examining the Plant Services CMMS/EAM Software Review (www.PlantServices.com/CMMS_Review) to get a feel for which packages offer which features and functions. The review site allows users to manipulate the weightings of about 350 criteria to better understand which packages fit with your requirements. As of this month, it includes vetted information on the capabilities of 19 packages (Table 1) including six that are new for 2009.
2. Develop a request for proposal (RFP): The RFP should outline some background on your company, including the organizational structure, your current technology environment, the reason for launching the CMMS/EAM project, and your definition of success for the project, both quantitative and qualitative. There also should be a section on your procurement policy, key steps in the procurement process, expected timeline, and any terms and conditions. Be aware that throwing in too many nonnegotiable constraints might scare away vendors because it’s not worth their while. Of course, the RFP provides technical user requirements, such as “ability to create a third-party invoice for labor and materials the technicians used,” and general requirements like “ability to define default values for any field.” The CMMS/EAM Software Review offers many more examples of what might become your user requirements.
3. Establish a vendor selection committee: To ensure the key stakeholders are well represented when selecting the CMMS/EAM vendor and software, it’s important to establish a vendor selection committee. This committee is best kept to approximately 7-12 people for greater efficiency and effectiveness. You always can supplement the process with additional stakeholders by inviting specialists to various activities during the selection phase. For example, you might want to have a few technicians and their supervisors attend when the vendors are demonstrating their mobile solution, or people from purchasing during demos of the spare parts procurement process.
4. Determine short-list evaluation criteria: In the first meeting of the vendor selection committee, consensus must be reached on what criteria will be used to short-list proposals. Obviously, it’s futile to evaluate proponents at a detailed level without using or at least seeing the package, and meeting the vendor. But just like short-listing a stack of resumes, more general criteria can be used to fairly evaluate the written material received.
The survey for the aforementioned “10 Pitfalls …” white paper asked users to rank 21 CMMS/EAM selection criteria in terms of importance to their organization, then in terms of how well their current CMMS performs them. The results, which are detailed in the paper, offer a perspective on evaluation criteria that may make sense for your organization.
Table 2. Sample Selection Criteria
Table 2 provides a sample breakdown of selection criteria and associated weightings that would reference applicable sections of the RFPs. With the addition of columns for each vendor that submits a proposal, the vendor selection committee can use such a table to rate, score and compare vendors.
Note that the weighting and scoring of the evaluation criteria is a subjective process. Different stakeholders have varying opinions on what is important, and this is why it’s critical to build consensus. Try to finalize the table before receiving any vendor responses to avoid any prejudice.
5. Read through the proposals: Each vendor selection committee member should read through the proposals to get an overall feel for how the vendors responded. Ratings shouldn’t be attempted on the first reading; however, notes should be made on each proposal, or you’ll forget your initial impressions.
6. Rate the proposals: Each vendor selection committee member should, on their own, attempt to short-list the proposals. This is best accomplished by completing the table with additional columns for rating each vendor. There are many ways to rate the vendors and all have a good deal of subjectivity.
One suggested approach is to read through only the sections relevant to a given selection criterion for the proposals. As you read through the relevant section of each proposal, physically rank the proposals. Thus, if there were 15 proposals submitted, at the end of the read-through on say, the first selection criterion (Company Profile), you’ll have a pile of 15 proposals sitting on your desk with the uppermost proposal being the best and the proposal at the bottom of the pile being the worst response to the first selection criterion.
Then, each proposal must be rated with the highest score possible being equal to the weighting, and the lowest score zero. Two or more proposals might receive the same score for a given selection criterion. You don’t have to give any proposal a perfect score (i.e., a rating equal to the weighting) if you feel none deserve it, because the rating process is strictly a means of determining relative scores. This physical ranking followed by rating of the proposals is repeated for each selection criterion.
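As a sketch of the constraints this rank-then-rate step imposes, the small helper below checks that a set of judgment scores respects both the physical ranking of the pile and the zero-to-weighting bounds. The proposal names and numbers are hypothetical; the actual scores remain a judgment call that no formula can replace.

```python
def validate_scores(scores, weighting, ranking):
    """Check that judgment scores respect the rules described in the text:
    every score lies between 0 and the criterion weighting, and the scores
    never contradict the best-to-worst physical ranking of the pile."""
    assert all(0 <= s <= weighting for s in scores.values()), "score out of bounds"
    ordered = [scores[p] for p in ranking]            # best pile first
    assert all(a >= b for a, b in zip(ordered, ordered[1:])), "score contradicts ranking"
    return True

# Hypothetical: three proposals ranked best-to-worst on one criterion,
# with a weighting of 10. Ties are allowed, and no proposal needs a
# perfect score if none deserves it.
ranking = ["Proposal B", "Proposal A", "Proposal C"]
scores = {"Proposal B": 9, "Proposal A": 9, "Proposal C": 4}
validate_scores(scores, weighting=10, ranking=ranking)
```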
When rating the selection criteria under the cost effectiveness category, an easy way to score each vendor is to assign the cheapest solution the highest rating and the most expensive solution the lowest rating. The difference is then apportioned for the balance of respondents.
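One way to read “the difference is then apportioned” is a linear scale between the cheapest and most expensive bids: the cheapest bid earns the full weighting, the most expensive earns zero, and the rest fall proportionally in between. The vendor names, bid amounts and the 20-point weighting below are purely illustrative.

```python
def cost_scores(costs, weighting):
    """Score each vendor's bid on a 0-to-weighting scale, assuming a
    linear apportionment between the cheapest and most expensive bids."""
    cheapest, dearest = min(costs.values()), max(costs.values())
    spread = dearest - cheapest
    return {
        vendor: weighting * (dearest - cost) / spread if spread else weighting
        for vendor, cost in costs.items()
    }

# Hypothetical bids scored against a cost-effectiveness weighting of 20 points
bids = {"Vendor A": 100_000, "Vendor B": 150_000, "Vendor C": 200_000}
print(cost_scores(bids, 20))   # Vendor A: 20.0, Vendor B: 10.0, Vendor C: 0.0
```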
7. Final rating of vendors by the vendor selection committee: At a meeting, the vendor selection committee reaches consensus on the ratings for each of the proposals. One way is to have the person best qualified to evaluate the proposals on a given criterion begin the discussion by stating the highest score and who received it. Discussion might ensue, but eventually the group will agree on a winner (or winners) and the score. This process is repeated for the lowest score on the same criterion, followed by the rest of the proposals. It’s critical to be accurate on the relative rating (i.e., the spread) and ranking of the proposals for each criterion, but not as critical to be accurate on the absolute value of any given score.
8. Short-list vendors: Eventually, the group will complete the table with agreed-upon scores and extract a short list. If at all possible, short-list no more than two vendors, and ideally just one, to maximize the opportunity to analyze their software solution(s) in detail before a firm contract is in place.
9. Follow up with vendors that didn’t make the cut: Some of the vendors that don’t make the short list might request a postmortem on their proposals. By retaining the rating table and backup notes from the vendor selection committee meeting, you can give each vendor fairly objective reasons why it wasn’t short-listed.
10. Individual committee member evaluation: Immediately following vendor demonstrations for each short-listed vendor, selection committee members must complete individual detailed ratings, line-by-line, on the specification. Note that any vendor involved in the demos should prove that its software works using your data and your processes. This is accomplished by supplying vendors, at least one week in advance of their demos, a series of test scripts or scenarios and accompanying data for your critical or complex processes. Other key inputs into the evaluation process are at least three reference checks, at least two site visits, and any other meetings where contact is made with the vendor, such as any preliminary meetings to negotiate terms.
11. Vendor selection committee evaluation: Detailed ratings are presented for committee discussion and approval, and a master evaluation rating sheet is prepared reflecting the consensus of the group. Ratings are multiplied by weightings and tabulated for all items. A selection matrix is prepared that compares vendors across the evaluation criteria, including the technical specification on a detailed, line-by-line basis. The committee then approves the final vendor selection and presents its recommendation for ratification by senior management.
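The rating-times-weighting tabulation on the master evaluation sheet can be sketched as follows. The spec line items, weightings and 0-to-10 consensus ratings below are invented for illustration; they are not taken from the article.

```python
# Hypothetical spec line items and their weightings (not from the article)
weightings = {"Work order creation": 5, "PM scheduling": 8, "Mobile access": 3}

# Hypothetical consensus committee ratings per vendor on a 0-10 scale
ratings = {
    "Vendor A": {"Work order creation": 9, "PM scheduling": 7, "Mobile access": 6},
    "Vendor B": {"Work order creation": 7, "PM scheduling": 9, "Mobile access": 8},
}

# Multiply each rating by its weighting and tabulate a total per vendor
totals = {vendor: sum(weightings[item] * score for item, score in r.items())
          for vendor, r in ratings.items()}
print(totals)   # {'Vendor A': 119, 'Vendor B': 131}
```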
Following this 11-step process will improve your chances of selecting and implementing the CMMS/EAM software that best fits your needs. The key is to maximize the work done up front defining new processes and supporting system requirements. Furthermore, build consensus across multiple stakeholders throughout the design and selection phases to ensure buy-in. Finally, develop a balanced set of evaluation criteria, and use a methodology that enables you to rigorously evaluate each vendor’s package against those criteria.