McGraw-Hill/Irwin © 2005 The McGraw-Hill Companies, Inc. All rights reserved.
Module 5
Training Evaluation
Introduction (1 of 2)
Training effectiveness refers to the benefits that the company and the trainees receive from training
Training outcomes or criteria refer to measures that the trainer and the company use to evaluate training programs
Introduction (2 of 2)
Training evaluation refers to the process of collecting the outcomes needed to determine if training is effective
Evaluation design refers to from whom, what, when, and how the information needed to determine the training program's effectiveness will be collected
Reasons for Evaluating Training (1 of 2)
Companies are investing millions of dollars in training programs to help gain a competitive advantage
Training investment is increasing because learning creates knowledge, which differentiates companies and employees who are successful from those who are not
Reasons for Evaluating Training (2 of 2)
Because companies have made large dollar investments in training and education and view training as a strategy to be successful, they expect the outcomes or benefits related to training to be measurable.
Training evaluation provides the data needed to demonstrate that training does provide benefits to the company.
Formative Evaluation
Formative evaluation – evaluation conducted to improve the training process
Helps to ensure that:
the training program is well organized and runs smoothly
trainees learn and are satisfied with the program
Provides information about how to make the program better
Summative Evaluation
Summative evaluation – evaluation conducted to determine the extent to which trainees have changed as a result of participating in the training program
May also measure the return on investment (ROI) that the company receives from the training program
Why Should A Training Program Be Evaluated? (1 of 2)
To identify the program’s strengths and weaknesses
To assess whether content, organization, and administration of the program contribute to learning and the use of training content on the job
To identify which trainees benefited most or least from the program
Why Should A Training Program Be Evaluated? (2 of 2)
To gather data to assist in marketing training programs
To determine the financial benefits and costs of the programs
To compare the costs and benefits of training versus non-training investments
To compare the costs and benefits of different training programs to choose the best program
The Evaluation Process
Conduct a Needs Analysis
Develop Measurable Learning Outcomes and Analyze Transfer of Training
Develop Outcome Measures
Choose an Evaluation Strategy
Plan and Execute the Evaluation
Training Outcomes: Kirkpatrick’s Four-Level Framework of Evaluation Criteria
Level Criteria Focus
1 Reactions Trainee satisfaction
2 Learning Acquisition of knowledge, skills, attitudes, behavior
3 Behavior Improvement of behavior on the job
4 Results Business results achieved by trainees
Outcomes Used in Evaluating Training Programs: (1 of 4)
Cognitive Outcomes
Skill-Based Outcomes
Affective Outcomes
Results
Return on Investment
Outcomes Used in Evaluating Training Programs: (2 of 4)
Cognitive Outcomes
Determine the degree to which trainees are familiar with the principles, facts, techniques, procedures, or processes emphasized in the training program
Measure what knowledge trainees learned in the program
Skill-Based Outcomes
Assess the level of technical or motor skills
Include acquisition or learning of skills and use of skills on the job
Outcomes Used in Evaluating Training Programs: (3 of 4)
Affective Outcomes
Include attitudes and motivation
Trainees’ perceptions of the program, including the facilities, trainers, and content
Results
Determine the training program’s payoff for the company
Outcomes Used in Evaluating Training Programs: (4 of 4)
Return on Investment (ROI)
Comparing the training’s monetary benefits with the cost of the training
direct costs
indirect costs
benefits
How do you know if your outcomes are good?
Good training outcomes need to be:
Relevant
Reliable
Discriminative
Practical
Good Outcomes: Relevance
Criteria relevance – the extent to which training outcomes are related to the learned capabilities emphasized in the training program
Criterion contamination – extent that training outcomes measure inappropriate capabilities or are affected by extraneous conditions
Criterion deficiency – failure to measure training outcomes that were emphasized in the training objectives
Criterion deficiency, relevance, and contamination:
[Figure: two overlapping circles – outcomes measured in evaluation, and outcomes related to the training objectives (identified by the needs assessment and included in the training objectives). Measured outcomes unrelated to the objectives are contamination; the overlap is relevance; objective-related outcomes left unmeasured are deficiency.]
Good Outcomes (continued)
Reliability – degree to which outcomes can be measured consistently over time
Discrimination – degree to which trainees’ performance on the outcome actually reflects true differences in performance
Practicality – the ease with which the outcome measures can be collected
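Reliability is often estimated by administering the same measure twice and correlating the two sets of scores. A minimal sketch, using hypothetical test scores (all numbers invented for illustration):

```python
# Sketch: estimating the reliability of a training outcome measure with a
# test-retest correlation. Hypothetical data: the same ten trainees take
# the same knowledge test twice, two weeks apart.

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    var_x = sum((a - mean_x) ** 2 for a in x)
    var_y = sum((b - mean_y) ** 2 for b in y)
    return cov / (var_x ** 0.5 * var_y ** 0.5)

time_1 = [72, 85, 90, 64, 78, 88, 70, 95, 60, 82]   # first administration
time_2 = [75, 83, 92, 66, 80, 85, 72, 94, 63, 84]   # second administration

reliability = pearson_r(time_1, time_2)
print(f"Test-retest reliability: {reliability:.2f}")
```

A coefficient close to 1 suggests the outcome is measured consistently over time; a low coefficient would make pre/post comparisons hard to interpret.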
Training Evaluation Practices
[Bar chart: percentage of courses using each outcome – Reaction 79%, Cognitive 38%, Behavior 15%, Results 9%]
Training Program Objectives and Their Implications for Evaluation:

Objective: Learning – Outcomes:
Reactions: Did trainees like the program? Did the environment help learning? Was material meaningful?
Cognitive: Pencil-and-paper tests
Skill-Based: Performance on a work sample; performance on work equipment

Objective: Transfer – Outcomes:
Skill-Based: Ratings by peers or managers based on observation of behavior
Affective: Trainees’ motivation or job attitudes
Results: Did company benefit through sales, quality, productivity, reduced accidents, and complaints?
Evaluation Designs: Threats to Validity
Threats to validity refer to factors that lead one to question either:
The believability of the study results (internal validity), or
The extent to which the evaluation results are generalizable to other groups of trainees and situations (external validity)
Threats to Validity
Threats To Internal Validity
Company
Persons
Outcome Measures
Threats To External Validity
Reaction to pretest
Reaction to evaluation
Interaction of selection and training
Interaction of methods
Methods to Control for Threats to Validity
Pre- and Posttests
Use of Comparison Groups
Random Assignment
Types of Evaluation Designs
Posttest-only
Pretest/Posttest
Posttest-only with Comparison Group
Pretest/Posttest with Comparison Group
Time Series
Time Series with Comparison Group and Reversal
Solomon Four-Group
Comparison of Evaluation Designs (1 of 2)

Design | Groups | Pre-training measures | Post-training measures | Cost | Time | Strength
Posttest Only | Trainees | No | Yes | Low | Low | Low
Pretest/Posttest | Trainees | Yes | Yes | Low | Low | Medium
Posttest Only with Comparison Group | Trainees and Comparison | No | Yes | Medium | Medium | Medium
Pretest/Posttest with Comparison Group | Trainees and Comparison | Yes | Yes | Medium | Medium | High
Comparison of Evaluation Designs (2 of 2)

Design | Groups | Pre-training measures | Post-training measures | Cost | Time | Strength
Time Series | Trainees | Yes | Yes, several | Medium | Medium | Medium
Time Series with Comparison Group and Reversal | Trainees and Comparison | Yes | Yes, several | High | Medium | High
Solomon Four-Group | Trainees A, Trainees B, Comparison A, Comparison B | Yes / No / Yes / No | Yes / Yes / Yes / Yes | High | High | High
Example of a Pretest/Posttest Comparison Group Design:

Group | Pre-training | Training | Post-training Time 1 | Post-training Time 2
Lecture | Yes | Yes | Yes | Yes
Self-Paced | Yes | Yes | Yes | Yes
Behavior Modeling | Yes | Yes | Yes | Yes
No Training (Comparison) | Yes | No | Yes | Yes
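In a design like this, one common way to estimate each program's effect is to subtract the comparison group's pre-to-post gain from each trained group's gain, since the comparison group shows how much scores would have changed without training. A minimal sketch, with invented mean scores:

```python
# Sketch: analyzing a pretest/posttest comparison group design.
# Mean scores are hypothetical. The comparison group's gain estimates the
# change that would have occurred without training; subtracting it from a
# trained group's gain isolates the training effect.

groups = {
    # group: (mean pre-training score, mean post-training score)
    "Lecture":                  (60.0, 70.0),
    "Self-Paced":               (61.0, 74.0),
    "Behavior Modeling":        (59.0, 78.0),
    "No Training (Comparison)": (60.0, 63.0),
}

pre, post = groups["No Training (Comparison)"]
comparison_gain = post - pre   # change with no training at all

for name, (pre, post) in groups.items():
    if name == "No Training (Comparison)":
        continue
    effect = (post - pre) - comparison_gain
    print(f"{name}: estimated training effect = {effect:+.1f} points")
```

With these invented numbers, behavior modeling would show the largest adjusted gain; the same arithmetic applies at post-training Time 2 to check whether effects persist.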
Example of a Solomon Four-Group Design:

Group | Pretest | Training | Posttest
Group 1 | Yes | IL-based | Yes
Group 2 | Yes | Traditional | Yes
Group 3 | No | IL-based | Yes
Group 4 | No | Traditional | Yes
Factors That Influence the Type of Evaluation Design
Factor How Factor Influences Type of Evaluation Design
Change potential Can program be modified?
Importance Does ineffective training affect customer service, product development, or relationships between employees?
Scale How many trainees are involved?
Purpose of training Is training conducted for learning, results, or both?
Organization culture Is demonstrating results part of company norms and expectations?
Expertise Can a complex study be analyzed?
Cost Is evaluation too expensive?
Time frame When do we need the information?
Conditions for choosing a rigorous evaluation design: (1 of 2)
1. The evaluation results can be used to change the program
2. The training program is ongoing and has the potential to affect many employees (and customers)
3. The training program involves multiple classes and a large number of trainees
4. Cost justification for training is based on numerical indicators
Conditions for choosing a rigorous evaluation design: (2 of 2)
5. You or others have the expertise to design and evaluate the data collected from the evaluation study
6. The cost of training creates a need to show that it works
7. There is sufficient time for conducting an evaluation
8. There is interest in measuring change from pre-training levels or in comparing two or more different programs
Importance of Training Cost Information
To understand total expenditures for training, including direct and indirect costs
To compare costs of alternative training programs
To evaluate the proportion of money spent on training development, administration, and evaluation as well as to compare monies spent on training for different groups of employees
To control costs
To calculate return on investment (ROI), follow these steps: (1 of 2)
1. Identify outcome(s) (e.g., quality, accidents)
2. Place a value on the outcome(s)
3. Determine the change in performance after eliminating other potential influences on training results.
4. Obtain an annual amount of benefits (operational results) from training by comparing results after training to results before training (in dollars)
To calculate return on investment (ROI), follow these steps: (2 of 2)
5. Determine training costs (direct costs + indirect costs + development costs + overhead costs + compensation for trainees)
6. Calculate the total savings by subtracting the training costs from benefits (operational results)
7. Calculate the ROI by dividing benefits (operational results) by costs
The ROI gives you an estimate of the dollar return expected from each dollar invested in training.
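The arithmetic in steps 4 through 7 can be sketched as follows. All figures are invented for illustration; assume the valued outcome is annual scrap cost, and that other influences on the change have already been ruled out (step 3):

```python
# Sketch of ROI steps 4-7 with hypothetical figures.
# Outcome: annual scrap cost falls from $230,000 before training
# to $200,000 after training.

annual_benefit = 230_000 - 200_000         # step 4: yearly operational benefit

training_costs = {                         # step 5: sum all cost categories
    "direct": 6_000,
    "indirect": 1_500,
    "development": 4_000,
    "overhead": 1_000,
    "trainee compensation": 7_500,
}
total_cost = sum(training_costs.values())

net_benefit = annual_benefit - total_cost  # step 6: total savings
roi = annual_benefit / total_cost          # step 7: benefits divided by costs

print(f"Total cost:  ${total_cost:,}")
print(f"Net benefit: ${net_benefit:,}")
print(f"ROI: {roi:.1f}:1")
```

Here the program returns 1.5 dollars for every dollar invested; a ratio below 1 would mean the training cost more than the benefits it produced.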
Determining Costs for a Cost-Benefit Analysis:
Direct Costs
Indirect Costs
Development Costs
Overhead Costs
Compensation for Trainees
Example of Return on Investment

Industry | Training Program | ROI
Bottling company | Workshops on managers’ roles | 15:1
Large commercial bank | Sales training | 21:1
Electric & gas utility | Behavior modification | 5:1
Oil company | Customer service | 4.8:1
Health maintenance organization | Team training | 13.7:1