Selecting the right maintenance management system can improve your operations

Harry Lorick, P.E., President, and Lydia Cox, Senior Consultant, LA Consulting, Manhattan Beach, California

A properly implemented Computerized Maintenance Management System (CMMS) can serve as a powerful management tool: it documents work history and helps justify resources and activities. Public agencies may need to implement a CMMS for a number of reasons, including mandates by federal, state and local agencies, findings of internal or external audits, the need to justify resources, and the desire to establish an automated system.

A more difficult task is determining how to select and implement the right CMMS. With a system in place, an agency can document improvements in operations and use it to capture efficiency savings. The common approach currently taken by many agencies is to select software first and then change existing processes to match it. Although this approach initially appears less costly and time consuming, it often proves longer, more difficult and more expensive to implement. This article focuses on how to select a system that helps agencies manage limited resources.

Implementing a maintenance system involves four main phases: operations review, writing an RFP, software selection and implementation. The graphic below provides a basic flow chart of these management processes.

Operations Review
Recommendations made during the operations review are used to determine specific system needs and functions and can assist in the software selection process. The most useful aspect of a review is its ability to identify areas for improvement within an agency. These improvements facilitate process changes and bring a professional business approach to maintenance operations through the basic management functions of planning, organizing, directing and controlling. The information is also important because it provides the City Manager and/or City Council with the data needed to justify purchasing a CMMS.

The first step of an operations review is collecting initial information, including population, growth rate, inventories, asset condition, activities performed, resource information, processes and other relevant maintenance data. This information is supplemented with field observations of selected activities and interviews with employees.

Using the compiled information and field observations, comparisons and benchmarks can be made against similar agencies' activities, performance and quality. The comparison agencies should be of comparable size and perform the same functions. The next step is to develop recommendations based on these comparisons and findings. Information should be confirmed continuously through employee involvement at all organizational levels; this builds a relationship with employees for implementing later steps and ensures data accuracy and integrity.

The recommendations identify specific technology and business process improvements, a key goal of implementing a CMMS. Recommendations from the operations review can be used to make process enhancements first and then to select software that complements them. This allows an agency to follow best management practices with a maintenance system rather than letting the system dictate the agency's practices. Second, the recommendations assist in writing the Request for Proposal (RFP).

Preparing a Software Scope and Drafting an RFP
Preparing a software needs statement and drafting the RFP is challenging, but clearly defining needs facilitates the process. Key elements include an operations overview, the project purpose, and agency goals for items such as agency functions, automation and staff capability, technology integration, linkages, data storage and entry, and cost.

During the needs development process, carefully review every software requirement being considered, as not all maintenance software has the same capabilities; you do not want to exclude a vendor because of an unwarranted requirement. It is therefore important to research the various systems available.

The purpose of the project and scope of work must be clearly written. This will give the vendor an idea of exactly what related tasks may be necessary to implement their system, such as training for a specific number of individuals, database population and/or linkages to other systems.

While the purpose, scope and software needs will make up the largest share of the RFP's content, other details such as the submittal format and project timeline must also be included. Contact information and deadlines should be clearly set and a project timeline established. Vendors can be asked to e-mail a notice of their intent to submit; this allows an agency to research the submitters before proposals arrive and to share any one vendor's questions, along with the answers, among all prospective submitters. The proposal format should be well defined so that reviewers can easily compare vendor capabilities. Each vendor should be asked to submit references with proper contact information. Finally, the RFP should include a scoring sheet indicating the score and weight given to each category or task, so that each vendor can align its submittal with agency needs.

A notification should be sent to each known software vendor and posted on your agency's website and in local newspapers to reach as many vendors as possible during the RFP process. This helps ensure that all possible vendors are contacted and that a reasonable number will submit.

Once all of the proposals have been submitted, a "shortlist" of vendors must be developed. The purpose of shortlisting is to narrow the field to the most qualified vendors, not to make the final selection. The first step is an initial review that removes all proposals failing to meet the minimum requirements of the RFP.

The review committee assigned to evaluate the proposals should review the scoring criteria again before the meeting at which proposals are scored; this helps minimize personal bias and creates a fair scoring environment. Each committee member should fill out one sheet per submitted proposal. At the end of the meeting, each vendor will have a specific score based upon the outlined criteria. Vendors with high scores are the likely candidates for the shortlist. There is no limit to the number of vendors that can be invited to demonstrate their product, but only candidates with a real opportunity to be selected should be invited to present to your agency.
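The weighted scoring described above is simple arithmetic: each category score is multiplied by its published weight and the results are summed. A minimal sketch follows; the category names, weights and scores are illustrative assumptions, not values from any actual agency's RFP.

```python
# Hypothetical scoring sheet: categories, weights and scores are
# illustrative only, not drawn from any specific agency's RFP.
WEIGHTS = {
    "functionality": 0.35,
    "implementation_plan": 0.25,
    "vendor_experience": 0.20,
    "training_support": 0.20,
}

def weighted_score(raw_scores):
    """Combine one reviewer's raw 0-100 category scores into a weighted total."""
    return sum(WEIGHTS[cat] * score for cat, score in raw_scores.items())

# One reviewer's sheet for a single proposal
sheet = {"functionality": 80, "implementation_plan": 70,
         "vendor_experience": 90, "training_support": 60}
print(round(weighted_score(sheet), 1))  # 0.35*80 + 0.25*70 + 0.20*90 + 0.20*60
```

Because the weights are published in the RFP, every reviewer's sheet for every proposal is combined the same way, which keeps the comparison across vendors consistent.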

Once the shortlist process is complete, shortlisted vendors should be given a specific agenda and a presentation/interview time that conforms to the timeline and goals outlined in the RFP and allows enough time to demonstrate the system's technology. A sample of your agency's data should be provided along with the agenda, so the vendor can demonstrate the software's capabilities using understandable and relevant data.

Once all of the shortlisted vendors have been interviewed, each one should be scored again using the same criteria applied before shortlisting, with the scoring sheets updated to reflect any new information provided in the interviews. References should then be checked for the vendors most likely to be selected, asking specific questions that help determine whether previous customers are satisfied.

At the end of this process, if no single vendor outperforms the rest or if several vendors appear equally well suited, consider performing site visits at agencies that are using the proposed software. During a site visit, communicate with the actual users of the system, not just those involved in the selection and purchase. Observing a system in use can help determine whether it will meet your agency's needs. The key is to select a system that best serves your agency's needs, not one that meets someone else's requirements.

Software System Selection
Cost should not be the determining factor in whether a vendor is shortlisted, as it is difficult, even with standard formats, to obtain comparable cost submittals. Agencies often have a specific amount budgeted for software, which may not be consistent with the price of the software they would like. Once all of the vendors have been scored a second time, cost should then be considered. When evaluating, it is important to look at total cost, not just the price of the software. Training, supporting hardware and software, data population and annual upgrades can greatly increase the cost. The total cost of a system should be weighed along with all other relevant factors in the scoring.
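The total-cost point above can be illustrated with a quick calculation: one-time costs (license, training, hardware, data population) plus recurring fees over a planning horizon. All figures below are hypothetical, not vendor quotes.

```python
# Illustrative total-cost comparison over a planning horizon.
# All dollar figures are hypothetical examples, not actual vendor pricing.
def total_cost(license_fee, training, hardware, data_population,
               annual_upgrade, years=5):
    """One-time costs plus recurring upgrade/support fees over `years`."""
    one_time = license_fee + training + hardware + data_population
    return one_time + annual_upgrade * years

# Vendor A: higher license price, lower implementation and upgrade costs
vendor_a = total_cost(50_000, 10_000, 5_000, 15_000, 8_000)
# Vendor B: cheaper license, but costlier training, data work and upgrades
vendor_b = total_cost(35_000, 20_000, 10_000, 25_000, 12_000)
print(vendor_a, vendor_b)  # 120000 150000
```

In this sketch the vendor with the lower software price ends up costing more over five years, which is exactly why the article warns against comparing license prices alone.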

Selection of software must balance cost and capability. Even if the highest-scoring vendor's cost exceeds the others, it may be possible to negotiate a lower price while retaining the desired software capabilities. Do not exclude the highest-scoring vendor because the price appears excessive until you have had a chance to negotiate. If negotiations fail to produce a price acceptable to the agency, move to the second-highest-scoring vendor and begin negotiations there. When the balance of cost and software needs has been achieved, the selection process can be finalized.

After a contract has been signed, implementation of the new system can begin. Implementation is considerably more than installing software. It is common during this stage for several inventories to be integrated and for the database to be populated for ease of access and use.

Although a maintenance system can increase automation and give managers and employees easy access to a considerable amount of data, the key to making the system work is proper training. Initially, staff must be assisted in learning how to use the system for both reactive and proactive work. Maintenance crews must be taught to complete daily work reporting forms and data entry, while administrative personnel must learn to enter data, take work orders or work requests, close out files and maintain resource cost files. Management needs to be trained to interpret reports that provide valuable decision-making data.

Once the system is in place, the best management practices of planning, organizing, directing and controlling, applied to system data, should be institutionalized to facilitate continuous improvement. This allows the system to be updated annually and supports the creation of performance data and comparisons of planned versus actual work on a monthly and annual basis, enabling improvement through feedback, better planning and the use of good business practices.

Once the system is established, it can be linked to accounting systems, pavement management systems (PMS), GIS and/or wireless units, providing more ways to increase automation and improve operations.

The success of a CMMS lies in basic management concepts: senior management support, staff involvement and training, selection of the proper system to meet changing needs, and integration, not overlay, into daily operations. Many agencies have experienced dramatic results, with some achieving 25 to 30 percent efficiency savings in the first year and continued savings each year thereafter. Implementing a system is more than just buying software; it is an ongoing process that can provide valuable efficiency and improvement savings.

LA Consulting, established in 1993, provides a wide variety of planning, systems and technology services to public agencies and municipalities, with an emphasis on systems implementation and technical support for public works operations and maintenance. The firm's founder, Harry Lorick, has an extensive background in public works operations, management consulting, systems development, training and technology consulting. Lydia Cox is a Senior Consultant with the firm.