

Using Action Plans to Measure ROI

24 www.ispi.org • JANUARY 2003

The use of action planning to measure the return on investment (ROI) shows much promise for performance improvement (PI) interventions. Action planning is powerful, flexible, and efficient. With this approach, participants develop an action plan for improving performance during a training program or PI project. The plan is a step-by-step guide to drive application on the job. In this case study, the project added significant value to a restaurant chain and used minimal resources. The key to success in this approach is carefully planning the evaluation, building it into the performance improvement process, and using the data to help future participants succeed with the same performance improvement project.

Evaluation Challenges

In recent years, increased emphasis has been placed on measurement and evaluation, including the calculation of the ROI. Top executives, chief financial officers, and internal clients now ask for more accountability for training and development and performance improvement initiatives (Phillips, 2000). This challenge has professionals in our field searching for specific ways to increase accountability with minimum additional resources.

When more detailed measurement and evaluation are considered, a variety of barriers often surface. One major problem is that there is never enough time to collect, analyze, and present data in a meaningful way. Additional measurement and evaluation add cost to a process that is already too expensive in the minds of some executives. Furthermore, there is always the challenge of collecting adequate data, particularly when participants must supply the data. The quality and quantity of data suffer when participants are reluctant to allocate enough time.

Identifying appropriate measures to monitor is another challenge, particularly for projects that can drive a variety of measures. This is particularly true for PI initiatives that include leadership development, management training, team building, problem solving, and innovation.

These and other issues often create the need for an evaluation process that minimizes resources and time, enhances participants' role in the process, and provides sufficient quality and quantity of data for an appropriate ROI analysis. The action-planning tool described here can accomplish this feat. But first, a brief explanation of the actual ROI process.

Using Action Plans to Measure ROI
by Jack J. Phillips and Patricia P. Phillips


Performance Improvement • Volume 42 • Number 1 25

The ROI Process

This process is often labeled the ROI process and collects six types of data about a project, program, or solution. Figure 1 shows the six types of data collected, providing a balanced, credible approach to identifying the success of a project. As part of the definition, a specific method or technique must be implemented to isolate the impact of the project. The first four types of data are consistent with Kirkpatrick's traditional four levels (Kirkpatrick, 1994). At the heart of the process is a step-by-step model, presented in Figure 2, which shows how data are collected, processed, and analyzed. The process starts with evaluation planning, in which detailed objectives of the program or solution are developed. Evaluation plans are developed to make decisions regarding how the data are collected, processed, and analyzed. Two important outputs come through this process: a detailed data collection plan and an ROI analysis plan.

At the first level of data, reaction and planned actions are captured from participants who were involved in the program. Next, learning data are captured as specific improvements in skills, knowledge, and perceptions are measured. After the program is implemented, application and implementation data are collected to show the progress in the use and application of skills and knowledge. The corresponding business impact, which is directly linked to the project or solution, is measured. Together, these first four blocks in the process model comprise the key elements of the data-collection plan.

The remaining blocks are critical to the actual ROI analysis plan. The next step is to isolate the effects of the project from other influences. This process uses one or more methods to separate the influence of the PI project from other factors that had an influence on the business measure. In the next block, the business impact data are converted to monetary value and annualized to develop an annual value for the project. The first year of value is used for short-term solutions; longer periods are used for more extensive, long-range implementations. The fully loaded costs are captured to reflect both direct and indirect costs of the solution. Monetary benefits and costs are combined in the calculation of the ROI. The intangible benefits are identified throughout the process and tabulated after an attempt is made to convert them to monetary value. If an intangible item cannot be credibly converted to monetary value, it is left as an intangible measure and becomes the sixth type of data.

Figure 1. Types of Data Collected in Impact Studies.

Figure 2. The ROI Process Model.


The first four types of data are collected during and after the PI solution is implemented. Two types of data (ROI and intangible measures) are developed in the process. This comprehensive measurement process is now used by thousands of organizations in all types of settings, including public sector and nonprofit organizations.

How Action Plans Work

The action planning process can be traced to the 1930s, when the federal government used it as the participant action planning approach (PAPA) (Office of Personnel Management, 1980). In early use of the action plan, the primary focus was to develop specific plans for changing behavior after training. In this approach, participants actually complete their action plan, detailing how they will change their behavior or apply what they learned in the program. In recent years, the focus has extended beyond behavior change to include the anticipated impact driven by the behavior change (Phillips, 1997). For this new approach to be successful, there must be a clear linkage between the behavior change and a predetermined business measure. The anticipated impact can be developed and actual values placed on the unit of measure linked to the program.

Figure 3 shows the steps needed to integrate the action planning process into a PI initiative. It begins with an early announcement of the process and with appropriate time built into the program so that action planning becomes an integral part of the process. This way, action planning is perceived not as an add-on evaluation tool, but as an application tool.

Background of Case Study

Cracker Box, Inc. is a large, fast-growing restaurant chain located in major metro areas. In the past 10 years, Cracker Box has grown steadily and now has over 400 stores, with plans for continued growth. A store manager is responsible for the operation of each restaurant. Cracker Box must develop almost 150 new store managers per year to prepare for growth and turnover, which is lower than the industry average.

Store managers operate autonomously and are held accountable for store performance. Working with members of the store team, managers control expenses, monitor operating results, and take actions as needed to improve store performance. Each store records dozens of performance measures in a monthly operating report; other measures are reported weekly.

Cracker Box recruits managers both internally and externally and requires that they have restaurant experience. Many of them have college degrees, preferably in hospitality management. The training program for new managers usually lasts nine months. When selected, a store manager trainee reports directly to a store manager, who serves as his or her mentor. Trainees are usually assigned to a specific store location for the duration of training and preparation. During this period, the entire store team reports to the store manager trainee as the store manager (mentor) coaches the trainee. As part of the formal development process, each store manager trainee attends at least three one-week programs at the company's corporate university, which is located near company headquarters. These training sessions include the Performance Management Program.

Performance Management Program

The Performance Management Program teaches new store managers how to improve store performance. Program participants learn how to establish measurable goals for employees, provide performance feedback, measure progress toward goals, and take action to ensure that goals are met. Problem analysis and counseling skills are also covered. The program focuses on using the store team to solve problems and improve performance. The one-week program is residential and often includes evening assignments. Corporate university staff and operations managers facilitate the program, and they integrate skill practice sessions throughout the instruction.

Figure 3. Sequence of Activities for Action Planning.

Needs Assessment

The overall needs assessment for this program is in two parts. The first part is a macrolevel needs assessment for the store manager job, which is similar to assessments conducted for major job groups in other organizations. The corporate university's performance consultants have identified specific training and developmental needs for new managers, particularly with issues involving policy, practice, performance, and leadership. This needs assessment provided the basis for developing three programs for each new manager trainee.

The second part of the assessment is built into this program as the individual trainees provide input for a store-level needs assessment. The program coordinator asks participants (trainees) to provide limited needs assessment data prior to the program. Each participant is required to meet with the store manager (that is, his or her mentor) and identify at least three operating measures that, if improved, should enhance store performance. Each measure must focus on changes that both the store manager and manager trainee perceive as worthwhile. These business impact measures define the business need for the program and could include productivity, absenteeism, turnover, customer complaints, revenues, inventory control, accidents, or any other measure that needs improvement. Each participant in a specific manager trainee group could have different measures.

To ensure that job performance needs are met, each participant is asked to review the detailed objectives of the program and select only measures that could be improved by the efforts of the team using skills taught in the program. The important challenge in this step is to avoid selecting measures that cannot be enhanced through the input of the team and the skills and knowledge contained in the program. This step effectively completes the job performance needs assessment.

As participants register for the program, they are reminded of the requirement to complete an action plan as part of the application of the process. This requirement is presented as an integral part of the program, not as an add-on data-collection tool. Action planning is positioned as necessary for participants to see their actual improvements and the improvements generated from the entire group. Credit for the program is not granted until the action planning process is completed and data are reported.

Why Evaluate This Program?

The decision to conduct an ROI analysis for this program was reached through a methodical and planned approach. A corporate university team decided at the outset that business improvement data would be collected from this program for the following reasons:

• This project was designed to add value at the store level, and the outcome is expressed in store-level measures that are well known and respected by the management team. The evaluation should show the value of the PI in terms they understand and appreciate.

• This approach to evaluation shifts the data-collection process to an application perspective. Manager trainees did not necessarily perceive that the information they provided was for the purpose of evaluation, but saw it as more of an application tool to show the impact of their training. The monetary impact and ROI calculations help to achieve this perception and remove the resistance to providing the data.

• The application and impact data enabled the store team to make improvements and adjustments. The impact of the improvement would be communicated to all team members. The ROI data helped the corporate university team gain respect from operating executives as well as the store managers.

Therefore, the team built the evaluation into the program and included a requirement to develop the ROI.

Planning for the evaluation is critical to saving costs and improving the quality and quantity of data collection. The planning process also provides an opportunity to clarify expectations and responsibilities and show the client group (that is, the senior operating team) exactly how this program is evaluated. Two documents are created: the data collection plan and the ROI analysis plan.

Data-Collection Plan. Figure 4 shows the data-collection plan for this program. Broad objectives are detailed along the five levels of evaluation, which represent the first five types of data collected. As the plan illustrates, the facilitator collects typical reaction and satisfaction data at the end of the program. Learning objectives focus on the five major areas of the program: establishing employee goals, providing feedback and motivating employees, measuring employee performance, solving problems, and counseling employees. Learning measures are obtained through observations from the facilitator as participants practice the various skills.

Through application and implementation, participants focused on two primary areas. The first requirement was to apply the skills in appropriate situations; the second was to complete all steps in their action plan. For skill application, the evaluation team developed a follow-up questionnaire, which would be implemented three months after the program to measure the use of the skills along with certain other related issues. Six months after the program, the action plan data are provided to show the actual improvement in the measures planned.


Business impact objectives vary with the individual because each trainee identifies at least three measures needing improvement. These measures appear on the action plan and serve as the documents for the corporate university staff to tabulate the overall improvement.

The ROI objective for this program is 25%, which was the ROI standard established for internal programs at Cracker Box. This means that at least a 25% return on the funds invested is acceptable. (The ROI formula is discussed later.) This was slightly above the internal rate of return expected from other investments, such as the construction of a new restaurant.

ROI Analysis Plan. The ROI analysis plan, which appears in Figure 5, shows how the data are processed and reported. Business impact data, listed in the first column, form the basis for the remainder of the analysis. Business measures are Level 4 data items, identified on the action plans. The method for isolating the effects of the project at Cracker Box was participant estimation. The method to convert data to monetary values relied on three techniques: standard values (when they were available), internal expert input, or the participant's estimate. Cost categories represent a fully loaded profile of costs; anticipated intangibles are detailed, and the communication targets are outlined. The ROI analysis plan basically represents the approach used to process business impact data to develop the ROI and to capture the intangible data. Collectively, these two planning documents outline the approach for evaluating this project.

Developing the Action Plan: A Key to ROI Analysis

Figure 3 shows the sequence of activities from introduction of the action planning process to reinforcement during the program. The requirement for the action plan was communicated prior to the program, along with the request for needs assessment information. On Monday, the first day of the program, the program facilitator described the action planning process in a 15-minute discussion, setting the stage for the week. Participants received specially prepared notepads on which to capture specific action items throughout the program. They were instructed to make notes when they learned a technique or skill that could be useful in improving one of the measures on their list of three business measures needing improvement. In essence, this notepad became a rough draft of the action plan.

The action planning process was discussed in greater detail in a one-hour session on Thursday afternoon. This discussion included three parts: the actual forms; guidelines for developing action plans, including SMART (Specific, Measurable, Achievable, Realistic, and Time-based) requirements; and examples to illustrate what a complete action plan should look like.

Figure 4. Data-Collection Plan.

The program facilitator distributed the action planning forms in a booklet containing instructions, five blank action plans (only three are required, one for each measure), and examples of completed action plans. On Thursday evening, participants completed the booklets in a facilitated session lasting approximately one and a half hours. Participants worked in teams to complete all three action plans. Each plan took about 20-30 minutes to complete. Figure 6 shows a completed action plan. During the session, participants completed the top portion, the left column on which they list the action steps, and parts A, B, and C in the right column. (After six months, participants completed the remainder of the form: parts D, E, and F, as well as intangible benefits and comments.) The senior program facilitator monitored most of these sessions. Sometimes an operations executive was present to monitor the sessions and learn about the issues confronting store operations. The involvement of operations executives provided an additional benefit of keeping the participants focused on the task. These operating executives were impressed with the focus of the program and the quality of the action planning documents.

Figure 5. ROI Analysis Plan.

Figure 6. Action Plan Completed Six Months After Program.

By design, the action plan could focus on any specific steps as long as they were consistent with the skills required in the program and related to the business improvement measures. The most difficult part of developing the plan was to convert the business measure to a monetary value. Three approaches were offered to participants. First, standard values were used when they were available. Fortunately for Cracker Box, standard values are available for most of the operating measures. Operations managers and specialists had previously developed or assigned a cost (or value) to a particular measure for use in controlling costs and developing an appreciation for the impact of store-level performance measures. Second, when a standard value was not available, participants were encouraged to use expert input. This option involved contacting someone in the organization who might know the value of a particular item. The program facilitator encouraged participants to call the expert on Friday morning and add the value to the action plan. Third, when a standard value or expert input was not available, participants were asked to estimate the cost or value using all the knowledge and resources available to them. Fortunately, each measure was a concern to the trainee (participant) and the store manager (mentor), who had some appreciation for its actual value. Estimation was possible in every case when standard values and expert input were not available. It was important to require that this value be developed during the program, or at least soon after completion of the program.
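The three-tier conversion logic described above behaves like a simple fallback chain. The sketch below is illustrative only: the function name and lookup tables are hypothetical, and only the $41 absenteeism value comes from the case study.

```python
def monetary_value(measure, standard_values, expert_values, participant_estimate):
    """Return the unit value for a business measure using the
    three-tier hierarchy: standard value first, then expert
    input, then the participant's own estimate."""
    if measure in standard_values:   # tier 1: a standard value exists
        return standard_values[measure]
    if measure in expert_values:     # tier 2: an internal expert supplied a value
        return expert_values[measure]
    return participant_estimate      # tier 3: fall back to the estimate

# Hypothetical lookup tables; only the $41 absenteeism value is from the study
standard = {"absenteeism": 41.0}
expert = {"turnover": 3800.0}

print(monetary_value("absenteeism", standard, expert, 50.0))           # 41.0
print(monetary_value("turnover", standard, expert, 50.0))              # 3800.0
print(monetary_value("customer complaints", standard, expert, 120.0))  # 120.0
```

The fallback order matters: standard values are the most credible and the cheapest to obtain, so the estimate is used only as a last resort.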

The next day, Friday, the participants briefly reviewed the action planning process with the group. Each action plan took about five minutes. To save time, each team chose one action plan to present to the entire group to underscore the quality of the action planning process. The program facilitator explained the follow-up steps to the group. It was recommended that the manager trainee and the store manager discuss the document before sending a copy to the corporate university staff. Contact information was included in case a staff member had a question about the data.

Results

Reaction and Learning. Reaction data, collected at the end of the program using a standard questionnaire, focused on issues such as relevance of the material, the amount of new information, and intention to use the skills. The content, delivery, and facilitation were also evaluated. Figure 7 shows a summary of the reaction data on a rating scale, where 1 is unsatisfactory and 5 is exceptional.

Learning improvement is measured at the end of the program using a self-assessment and a facilitator assessment. Although these measures are subjective, they provide an indication of improvements in learning. Significant improvements in both the self-assessments and facilitator assessments are usually reported. In this specific study, the facilitator assessment data revealed that all participants had acquired the skills at a satisfactory level or better.

Application and Implementation. To determine the extent to which the skills are being used and to check progress of the action plan, participants received a questionnaire three months after the program. This two-page, user-friendly questionnaire covered the following topics:

• skill usage
• skill frequencies
• linkage to store measures
• barriers to implementation
• enablers for implementation
• progress with action plan
• quality of the support from the manager
• additional intangible benefits
• recommendations for program improvements

Participants reported progress in each of the areas and indicated that they had significantly used the skills, even beyond the projects involving action plans. Also, trainees indicated that this program links with many store measures beyond the three measures selected for action planning. Typical barriers to implementation included lack of time, understaffing, a changing culture, and lack of input from the team. Typical enablers were support from the store manager and early success with the application of the action plan. This follow-up questionnaire gave manager trainees an opportunity to summarize progress with the action plan. In essence, it served as a reminder to continue with the plan, as well as a process check to see if there were issues that should be explored. Manager trainees gave the store managers high marks for the support provided during the program. Participants suggested several improvements, all minor, which were implemented if they added value.

Business Impact. Participants collected business impact data for each plan. Although the action plan (Figure 6) contains some Level 3 application data (the left side of the form), the primary value of the action plan was business impact. In the six-month follow-up, participants were required to furnish the following five items:

• Change in Business Measures: The actual change in the business measure, on a monthly basis, is used to develop an annual, first-year improvement.

• Estimate of Percent of Improvement: The only feasible way to isolate the effects of this particular program is to obtain an estimate directly from the participants. Manager trainees monitored business measures and observed improvement. Realizing that other factors could have influenced the improvement, trainees were asked to estimate the percent of improvement resulting from the application of the action steps on the action plan. Realistically, they probably know the actual influences driving a particular measure, at least the portion of the improvement related directly to their actions. Each manager trainee was asked to be conservative with the estimate and express it as a percentage.

Figure 7. Reaction of Program Participants.

• Level of Confidence: Recognizing that the above value is an estimate, manager trainees were asked to indicate their level of confidence in their allocation of the contribution to this program, using a range of 0% (for no confidence) to 100% (for certainty). This is included as item F on the action plan. This number reflects the degree of uncertainty in the value and actually frames an error range for the estimate.

• Intangible Benefits: The participants were asked to provide input on intangible benefits observed or monitored during the six months that were directly linked to this program.

• Additional Comments: Participants were asked to provide additional comments, including explanations.

The example in Figure 6 focuses directly on absenteeism for participant number three. This participant has a weekly absenteeism rate of 8% and a goal to reduce it to 5%. Specific action steps appear on the left side of the form. The value per absence is $41, an amount that represents a standard value. The change on a monthly basis is 2.5 percentage points, slightly below the target. The manager trainee estimated that 65% of the change is directly attributable to this program and that he is 80% confident in this estimate. The 80% confidence estimate frames an error range for the 65% allocation, allowing for a possible ±20% adjustment in the estimate. The estimate is adjusted to the low side, bringing the contribution rate of this program to absenteeism reduction to 52% (65% x 80% = 52%), a conservative value. This particular store location, which is known because of the identity of the store manager trainee, has 40 employees. Employees work an average of 220 days per year. The annual improvement value for this example can be calculated as follows:

40 Employees x 220 Days x 2.5% x $41 = $9,020

This is the total first-year improvement before adjustments. Figure 8 shows the annual improvement values on the first measure for just the 14 participants in this group. (Note that participant number five did not return the action plan, so that person's data were omitted from the analysis.) A similar table is generated for the second and third measures. The values are adjusted by the contribution estimate and the confidence estimate. In the absenteeism example, the $9,020 is adjusted by 65% and 80% to yield $4,690 ($9,020 x 52%). This same adjustment is made for each of the values, with a total first-year adjusted value for the first measure of $68,240. The same process is followed for the second and third measures for the group, yielding totals of $61,525 and $58,713, respectively. The total first-year monetary benefits for this group are the sum of these three values.
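The arithmetic above can be reproduced in a few lines; every figure below comes from the absenteeism example in the text.

```python
# Unadjusted first-year improvement for the absenteeism example
employees = 40
workdays_per_year = 220
change = 0.025            # 2.5-percentage-point reduction in absenteeism
value_per_absence = 41.0  # standard value per absence

annual_improvement = employees * workdays_per_year * change * value_per_absence
print(round(annual_improvement))  # 9020

# Conservative adjustment: contribution estimate x confidence estimate
contribution = 0.65  # 65% of the change attributed to the program
confidence = 0.80    # 80% confidence in that allocation
adjusted_value = annual_improvement * contribution * confidence
print(round(adjusted_value))      # 4690
```

Multiplying by both the contribution and confidence estimates is what drives the value to the low side, so every reported benefit is deliberately understated.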

Figure 8. Business Impact Data.


Program Cost. Figure 9 details the program costs for a fully loaded cost profile. The cost of the needs assessment is prorated over the life of the program, which is estimated to be three years with 10 sessions per year (30 sessions total). The program development cost is prorated over the life of the program as well, using the same basis. The program materials and lodging costs are direct costs. Facilitation and coordination costs were estimated. Time away from work represents lost opportunity and is calculated by multiplying five days times daily salary costs adjusted for a 30% employee benefits factor (that is, the costs for employee benefits). Training and education overhead costs were estimated. Actual direct costs for the evaluation are included. These total costs of $47,242 represent a conservative approach to cost accumulation.
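A sketch of the fully loaded cost logic may help. Only the $47,242 total is reported in the article, so every line item below is a hypothetical placeholder; the proration basis (three years at 10 sessions per year) and the 30% benefits factor are from the text.

```python
sessions = 3 * 10  # program life: 3 years x 10 sessions per year

# Hypothetical line items (illustrative only; the article reports no breakdown)
needs_assessment_total = 15000.0   # prorated over all 30 sessions
development_total = 45000.0        # prorated on the same basis
materials_and_lodging = 9000.0     # direct costs, per session
facilitation_estimate = 4000.0     # estimated, per session

# Time away from work: lost opportunity, salary plus the 30% benefits factor
participants = 14
days_away = 5
daily_salary = 180.0               # hypothetical average daily salary
benefits_factor = 0.30
time_away = participants * days_away * daily_salary * (1 + benefits_factor)

session_cost = (needs_assessment_total / sessions
                + development_total / sessions
                + materials_and_lodging
                + facilitation_estimate
                + time_away)
print(round(session_cost, 2))  # 31380.0 with these placeholder figures
```

The key design point is proration: one-time costs (needs assessment, development) are spread across every session the program will ever run, rather than charged entirely to this group.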

ROI Analysis. The total monetary benefits are calculated by adding the values of the three measures, totaling $188,478. This yields a benefits-to-cost ratio (BCR) and ROI as follows:

BCR = $188,478 / $47,242 = 3.99

ROI = ($188,478 - $47,242) / $47,242 = 299% (approximately 300%)
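The two formulas can be expressed directly (a minimal sketch; the function names are ours, while the totals are from the study):

```python
def bcr(benefits, costs):
    """Benefits-to-cost ratio: total monetary benefits over fully loaded costs."""
    return benefits / costs

def roi_percent(benefits, costs):
    """ROI as a percentage: net benefits divided by fully loaded costs."""
    return (benefits - costs) / costs * 100

total_benefits = 188_478  # sum of the three adjusted first-year measures
total_costs = 47_242      # fully loaded program costs

print(round(bcr(total_benefits, total_costs), 2))       # 3.99
print(round(roi_percent(total_benefits, total_costs)))  # 299
```

Note the distinction: BCR includes the costs in the numerator's gross benefits, while ROI is computed on net benefits, which is why a BCR of about 4 corresponds to an ROI near 300%.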

This ROI value of almost 300% greatly exceeds the 25% target value. The target audience considered the ROI value credible, although extremely high. Its credibility rests on the following principles on which the study was based:

• The data are collected directly from the participants inconcert with their store manager.

• Most of the data could be audited in store operations toverify actual amounts.

• To be conservative, the data include only the first year ofimprovements. With the changes reported in the actionplans, there probably will be some second- and third-yearvalues, although they are omitted from the calculation.

• The monetary improvement has been discounted for the effect of other influences. In essence, the participants take credit only for the part of the improvement related to this project. This estimate of contribution to the program is adjusted for the error of the estimate, adding to the conservative approach.

• The costs are fully loaded to include both direct and indirect costs.

• The data are included only for those individuals who completed and returned the action plans (for example, no data appeared for participant number five in Figure 8, because that person did not return an action plan).

• The business impact does not include value obtained from using the skills to address other problems or to influence other measures. Only the values from the three measures taken from the action-planning projects were used in the analysis.

Consequently, the ROI process develops convincing data connected directly to improvements in store operations.

From the viewpoint of the chief financial officer, the data can be audited and monitored, and the improvements should be reflected as actual gains in the stores. Overall, the senior management team considered the results credible and fully supported the program.

Intangible Data. As a final part of the complete profile of data, the intangible benefits were itemized. The participants provided input on intangible measures at two points in time. The follow-up questionnaire provided an opportunity for manager trainees to indicate intangible measures they perceived to represent a benefit directly linked to this program. The action plan also provided an opportunity for trainees to add intangible benefits. Each of the following benefits was listed by at least two individuals:

• a sense of achievement
• increased confidence
• improved job satisfaction
• promotion to store manager
• stress reduction
• improved teamwork

To some executives, these intangible measures are just as important as the monetary payoff.

Communication Strategy. Figure 10 shows the strategy for communicating results from the study. All key stakeholders received the information. The communications were routine and convincing. The information provided to store managers and regional managers helped to build confidence in the program. The data provided to future participants were motivating and helped them select measures for their action plans.

Advantages of Action Planning

In this example, it was critical to build evaluation into the program, positioning the action plan as an application tool instead of a data-collection tool. This approach helped secure commitment and ownership for the process. It also shifted much of the responsibility for evaluation to the participants as they collected data, isolated the effects of the project, and converted data to monetary values, the three most critical steps in the ROI process. The costs were easy to capture, and the reports were easily generated (from the templates) and sent to the various target audiences.

Figure 9. Program Cost Summary.

Performance Improvement • Volume 42 • Number 1

This approach provides the additional advantage of evaluating programs when a variety of measures are influenced. Figure 11 lists typical applications of the action-planning approach to ROI (Phillips, 2002). The applications can vary considerably, and the actual business measure driven can vary with each participant. Improvements are integrated after they are converted to monetary value. Thus, the common denominator among measures is the monetary value of the improvement.

More importantly, this process drives six types of data: satisfaction, learning, application, business impact, ROI, and intangible benefits. Collectively, these six types of data provide a balanced, credible viewpoint of the success of the program and much-needed data for making improvements.

References

Kirkpatrick, D.L. (1994). Evaluating training programs: The four levels. San Francisco: Berrett-Koehler Publishers.

Office of Personnel Management. (1980). Assessing changes in job behavior due to training: A guide to the participant action plan approach. Washington, DC: US Government Printing Office.

Phillips, J.J. (1997). Return on investment in training and performance improvement programs. Woburn, MA: Butterworth-Heinemann.

Phillips, J.J. (Ed.). (2000). Performance analysis and consulting. Alexandria, VA: American Society for Training and Development.

Phillips, J.J., & Phillips, P.P. (Eds.). (2001). Measuring return on investment, Volume 3. Alexandria, VA: American Society for Training and Development.

Phillips, P.P. (2002). The bottom line on ROI. Atlanta, GA: CEP Press.

As a world-renowned expert on measurement and evaluation, Dr. Jack J. Phillips provides consulting services for Fortune 500 companies and workshops for major conference providers throughout the world. Jack is also the author or editor of more than 30 books, 10 about measurement and evaluation, and more than 100 articles.

His expertise in measurement and evaluation is based on extensive research and more than 27 years of corporate experience in five industries (that is, aerospace, textiles, metals, construction materials, and banking). Jack has served as training and development manager at two Fortune 500 firms, senior human resources (HR) officer at two firms, president of a regional federal savings bank, and management professor at a major state university.

In 1992, Jack founded Performance Resources Organization (PRO), an international consulting firm providing comprehensive assessment, measurement, and evaluation services for organizations. In 1999, PRO was acquired by the Franklin Covey Company and is now known as The Jack Phillips Center for Research. Today, The Center is an independent, leading provider of measurement and evaluation services to the global business community. Jack may be reached at [email protected].

Patricia Pulliam Phillips is Chairman & CEO of The Chelsea Group, a research and consulting company focused on accountability issues in training, HR, and PI. Patricia conducts research on accountability issues and works with clients to build accountability systems and processes in their organizations. She has helped organizations implement the ROI process in more than 20 countries around the world. She has been involved in hundreds of ROI impact studies in a variety of industries.

Patricia has an MA in Public and Private Management from Birmingham-Southern College. She is certified in ROI evaluation and serves as co-author on the subject in publications including Training Journal and What Smart Trainers Know, with Lorraine L. Ukens (Ed.), Jossey-Bass/Pfeiffer (2001), and Evaluating Training Programs (2nd ed.), by Donald L. Kirkpatrick, Berrett-Koehler Publishers (1998). Patricia has authored several issues of the ASTD Infoline series, including Mastering ROI (1998) and Managing Evaluation Shortcuts (2001). She served as issue editor for the ASTD In Action casebooks Measuring Return on Investment, Volume 3 (2001) and Measuring ROI in the Public Sector (2002). Patricia is co-author of The Human Resources Scorecard: Measuring Return on Investment, Butterworth-Heinemann (2001), and author of The Bottom Line on ROI, CEP Press (2002). Patricia may be reached at [email protected].

Figure 10. Communication Strategy.

Figure 11. Typical Programs Where Action Planning Can Be Used to Develop ROI.