Testing 1, 2, 3: How to test your CMMS

David Berger says know what you're getting into to avoid costly CMMS problems down the line.

By David Berger

Whether you’re buying a new CMMS or upgrading an existing one, testing is one of the most important measures you can take to ensure a successful purchase, configuration, and implementation. This is especially true in today’s world of complex technology.

CMMS packages provide great flexibility and functionality, but our rising dependence on technology simultaneously raises the probability and impact of things going terribly wrong. This is especially true when integrating various software, hardware, and telecommunications solutions involving multiple vendors, platforms, and versions. Testing helps identify problems and risks long before you undertake the costly and time-consuming process of moving a given CMMS application into the production environment.

Testing will answer two key questions: 1) Does this CMMS package meet user requirements? 2) Does the package work as communicated by the CMMS vendor and understood by the users? The scope of testing extends beyond just testing the software. It can include deliverables such as detailed user requirements, technical specifications, designs, documentation, and procedures.

Following is a brief introduction to the complex, laborious world of testing, with details on the steps required to properly test a new or upgraded CMMS and the types and levels of testing available.

Key steps

Many people believe that testing a CMMS package means attending vendor demonstrations and “playing” with a demo version of the software before purchasing a new package or installing an upgrade. But proper testing requires at least as much work as writing a user requirements document. There is no limit as to how thorough your testing can be. The rule of thumb is that testing is cost-justified if the potential loss resulting from implementing a poor-quality CMMS exceeds the cost of testing.

Develop a test plan. A few hours spent up front developing a plan for how to approach the testing process will save considerable cost and aggravation. The test plan should encompass what needs to be tested, when, and to what level of detail.
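One way to make the "what, when, and how deep" of a test plan explicit is to capture it as structured data. The sketch below is purely illustrative; the field names, requirement IDs, and depth labels are my own, not from any particular CMMS:

```python
# Hypothetical sketch: a test plan captured as structured data, so that
# what is tested, in which phase, and to what depth is explicit and reviewable.
from dataclasses import dataclass

@dataclass
class TestPlanItem:
    requirement_id: str   # line item from the user requirements document
    phase: str            # "pre-selection" or "pre-implementation"
    category: int         # 1 = vendor demo, 2 = your procedures, 3 = your data and procedures
    depth: str            # e.g., "smoke", "full", "exhaustive"

plan = [
    TestPlanItem("REQ-014", "pre-selection", 2, "full"),
    TestPlanItem("REQ-022", "pre-implementation", 3, "exhaustive"),
]

# Simple coverage check: every Category 3 item will need a test script
# and sample data, so count them early to size the testing effort.
category3 = [item for item in plan if item.category == 3]
print(len(category3))
```

Even a spreadsheet with these same columns serves the purpose; the point is that each requirement line carries its own testing decision.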

When buying a new CMMS, two key milestones require testing. The first is before the selection of the CMMS, and the second is after a package is selected and before implementing a fully configured CMMS.

For testing prior to package selection, user requirements should be evaluated line by line to determine the appropriate testing required. There are three testing possibilities here: Category 1 testing, using the vendor’s data and procedures (i.e., a vendor demo); Category 2 testing, which uses the vendor’s data but your specified procedures; and Category 3 testing, where you provide your own data as well as the procedures. The third approach is by far the most costly.

Once a new CMMS package is selected or before upgrading your existing CMMS, a test plan that covers all processes and functionality supported by the software should be developed.

Prepare test scripts. When a user requirement calls for Category 3 testing, a test script must be produced describing the actions needed to execute the test and the expected results. Sample data needs to be supplied as well.
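A test script, at its simplest, pairs each action with the result the tester expects to see. The following sketch shows one possible shape for such a script; the IDs, file name, and step wording are invented for illustration:

```python
# Hypothetical sketch of a Category 3 test script: ordered actions with
# expected results, plus the sample data the tester must supply.
test_script = {
    "id": "TS-031",
    "requirement": "REQ-022",
    "sample_data": "equipment_history.csv",  # illustrative file name
    "steps": [
        {"action": "Import equipment history file",
         "expected": "All rows loaded, zero rejects"},
        {"action": "Run downtime-by-asset report",
         "expected": "Totals match source data"},
    ],
}

def run_step(step, actual_result):
    """Compare what actually happened against the scripted expectation."""
    return "pass" if actual_result == step["expected"] else "variance"

print(run_step(test_script["steps"][0], "All rows loaded, zero rejects"))
```

Any step whose actual result differs from the expectation is logged as a variance and fed into the entry/exit criteria discussed next.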

You may wish to provide the CMMS vendor with some recent equipment history data to see if the vendor can produce a certain report as specified in the requirements document. A sample report can be prepared showing what information is required for what purpose.

Generate entry/exit criteria. Before moving from one level of testing to the next, entry and exit criteria must be met. For example, a module will not go to pilot until all Severity 1 variances (i.e., deviations from user requirements) have been eliminated in user acceptance testing. Severity 1 variances can be defined as variances that have a significant negative impact on the ability of the maintenance department to meet its service-level commitments to the operations department.
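The exit criterion above amounts to a simple gate: no open Severity 1 variances, no pilot. A minimal sketch of that check, with invented variance records, might look like this:

```python
# Hypothetical sketch: an exit-criteria gate. A module cannot proceed to
# pilot while any Severity 1 variance (one with significant impact on the
# maintenance department's service-level commitments) remains open.
variances = [
    {"id": "VAR-07", "severity": 2, "status": "open"},
    {"id": "VAR-09", "severity": 1, "status": "closed"},
]

def may_enter_pilot(variance_log):
    """True only when no Severity 1 variance is still open."""
    return not any(v["severity"] == 1 and v["status"] == "open"
                   for v in variance_log)

print(may_enter_pilot(variances))  # True: the only Severity 1 item is closed
```

The same pattern extends to other gates, for example requiring that Severity 2 variances have an agreed workaround before go-live.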

Determine resource requirements. Human resources that may be required are testers, knowledgeable users, and technical experts. External resources required include representation from the CMMS vendor and any relevant partners. Other resources that need to be considered are automated testing tools, test equipment, a test lab (which might just be a conference room set up temporarily for testing), and vendor facilities.

Execute the test plan and analyze results. Make sure your testing provides genuine confidence that the CMMS will work in your live production environment(s) and will provide the information necessary to meet your needs. Results of each round of testing should be summarized, with the summary noting the severity of each variance, the follow-up actions taken (and who took them), the date the actions were completed, and sign-off on entry and exit criteria. The CMMS vendor and/or its partner organizations should be reporting regularly on their progress in fixing any issues assigned to them to ensure that there is no impact on the timeline.
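A round-of-testing summary can be derived directly from the variance log. This sketch uses invented records; the fields mirror the items listed above (severity, owner, completion date, sign-off):

```python
# Hypothetical sketch: summarizing one round of testing by variance
# severity, with follow-up owner, completion date, and sign-off per item.
from collections import Counter

results = [
    {"variance": "VAR-07", "severity": 2, "owner": "vendor",
     "completed": "2024-05-02", "signed_off": True},
    {"variance": "VAR-11", "severity": 3, "owner": "integration partner",
     "completed": None, "signed_off": False},
]

by_severity = Counter(r["severity"] for r in results)   # count per severity
outstanding = [r["variance"] for r in results if not r["signed_off"]]

print(dict(by_severity), outstanding)
```

The outstanding list doubles as the agenda for the vendor's regular progress reporting.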

Types of testing

Many types of tests can be conducted. One of the most critical tests for CMMS packages is performance testing. This test demonstrates how well the package operates in a production environment by tracking response time, throughput, and so on.

Another critical test is integration testing, which ensures data is passed correctly between applications. This is useful because most CMMS packages interface with numerous other systems, such as enterprise resource planning (ERP) systems, human resources programs, and shop-floor data collection systems.

Usability testing is near and dear to the user community. The software, documentation, workstation, and other deliverables from the CMMS implementation must be usable; that is, they must offer simple language and workflow, consistent use of terminology and keystrokes, effective training tools and help functionality, and safeguards that help users avoid errors, among other things.

Regression testing also is a key test. It verifies that no unwanted changes result from introducing change into another part of the system. This is done by comparing results before and after a change is made, using the same test scripts.
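The before-and-after comparison at the heart of regression testing can be sketched as a diff of two runs of the same scripts. The script IDs and results below are invented for illustration:

```python
# Hypothetical sketch of regression testing: rerun the same test scripts
# after a change and diff the outcomes against the baseline run.
baseline     = {"TS-031": "pass", "TS-032": "pass",     "TS-040": "pass"}
after_change = {"TS-031": "pass", "TS-032": "variance", "TS-040": "pass"}

# Any script whose result differs from baseline is a suspected regression.
regressions = [script for script in baseline
               if after_change.get(script) != baseline[script]]

print(regressions)
```

Because the scripts are identical in both runs, any difference in results can be attributed to the change itself rather than to the test procedure.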

If system vulnerability is a potential issue, security testing ensures confidential information is adequately protected against loss, corruption, and misuse. Special software tools can test how easy it is to break into your system.

You can conduct more than a dozen other tests, such as:

  • Stress or volume testing to see how the CMMS handles peak load conditions
  • Audit and control testing to test the adequacy and effectiveness of controls
  • Error-handling testing for detecting and responding to exceptions such as erroneous input
  • Transaction flow testing, which monitors the path of a transaction from data entry through processing to final output
  • Disaster recovery testing, which looks to ensure that a system can recover properly after a system failure.

Levels of testing

The CMMS package is staged through various levels of tests as described below. Within each level, the system undergoes as many passes as required to bring the number and severity of variances in line with the entry/exit criteria.

Unit testing by vendor. The CMMS vendor should have conducted exhaustive tests of new and modified code in a given module prior to releasing the software. Vendors that release software without adequate testing risk tremendous customer backlash or even legal action.

System testing. Before purchasing a new CMMS or accepting a new release, system testing will verify that the package operates the way it was designed to work, both in isolation and integrated with other applications. Testing is conducted primarily by technical people with some user representation.

User acceptance testing. This round of testing is performed by users to verify that the CMMS meets the needs of end users. The test must simulate the user environment and will therefore test user documentation, security access, and so on.

Operability testing. To demonstrate that the application can operate in a production environment, operability tests are conducted after, or concurrently with, user acceptance testing. Whether they call this testing a "pilot" or a "prototype," companies agree on the value of operability testing prior to full-blown implementation. Operability testing is conducted by front-line staff in consultation with the project team.