Evaluation of Training • Rationale for Evaluation • Types of Evaluation Data • Validity Issue • Evaluation Design

Upload: jack-gregory

Post on 17-Jan-2018


Page 1: Evaluation of Training Rationale for Evaluation Types of Evaluation Data Validity Issue Evaluation Design

Evaluation of Training

• Rationale for Evaluation

• Types of Evaluation Data

• Validity Issue

• Evaluation Design

Page 2

Rationale for Evaluation

• Organizational activities should be regularly examined to ensure they are occurring as planned and producing the intended results

• To correct things (people, processes, products or services) that deviate from their objectives

Page 3
Page 4

Evaluation Schedule - Sample

Page 5

Types of Evaluation Data

Two areas of evaluation:

• Process evaluation: how well the training was designed, developed, and implemented

• Outcome evaluation: how well the training achieves its objectives

Page 6

Process Data: Before Training

• Analyzing the processes used to develop the training:
– effectiveness of the needs analysis
– assessment of the training objectives
– evaluation of the design of the training
– assessment of evaluation tools
– examination of the training package

• Pre-training evaluation enables correction of errors or omissions

Page 7

Potential Questions for a Process Analysis: Pre-training

• Were needs diagnosed correctly? What data sources were used? Was a knowledge/skill deficiency identified? Were trainees assessed to determine their prerequisite KSAs?

• Were needs translated into training objectives? Were all objectives identified? Were the objectives written in a clear, appropriate manner?

• Was an evaluation system designed to measure accomplishment of the objectives?

• Was the training program designed to meet all the training objectives? Was previous learning that might either support or inhibit learning in training identified? Were individual differences assessed and taken into consideration in the training design? Was trainee motivation to learn assessed? What steps were taken to address trainee motivation to learn? Were processes built into the training to facilitate recall and transfer? What steps are included in the training to call attention to key learning events? What steps are included to aid trainees in symbolic coding and cognitive organization? What opportunities are included to provide symbolic and behavioural practice? What actions are included to ensure transfer of learning to the job?

• Are the training techniques to be used appropriate for each of the learning objectives of the training?

Page 8

Process Data: During Training

• Examining whether the implementation of the training program reflects what was proposed, designed and included in the training manual

• Assessing the appropriateness of training techniques and methodologies for achieving training objectives

Page 9

Potential Questions for Process Analysis – During Training

• Was there a match between trainer, training techniques, and training/learning objectives?

• Were lecture portions of the training effective? Was involvement encouraged/solicited? Were questions used effectively?

• Did the trainer appropriately conduct the various training methodologies (case, role play, etc.)? Were they explained well? Did the trainer use the allotted time for activities? Was enough time allotted? Did the trainees follow instructions? Was there effective debriefing following exercises?

• Did the trainer follow the training design and lesson plans? Was enough time given for each of the requirements? Was time allowed for questions?

Page 10

Uses of Process Data

• Trainer
– helps to determine what works well and what does not

• Other trainers
– may be able to apply the findings if the process is generalizable

• Training manager
– supports decision making when the training fails or when there is a problem with a particular trainer

Page 11

Outcome Data

• To determine how well training has met its goals

• Four types of outcomes are normally used:
– reaction
– learning
– behaviour
– organizational results

Page 12

Reaction Outcomes

• measures of trainees' perceptions, emotions, and subjective evaluations of the training experience

• the first level of evaluation

• favourable reactions are important in creating motivation to learn

Page 13

Reaction Outcomes (cont’d)

• the data are used to determine what the trainees thought about the training

• usually it is the only type of evaluation undertaken

• reaction questionnaires are divided into two types:
– affective: measures general feelings
– utility: measures beliefs about the value of the training

Page 14

Reaction Outcomes (cont’d)

• do not measure learning – only opinions and attitudes of trainees about the training

• categories normally included in developing a reaction questionnaire:
– relevance of training
– training content, materials and exercises
– trainer's behaviour
– facilities

Page 15

Reaction Outcomes (cont’d)

• Training Relevance
– provides an indication of the value of the training
– perceived value influences interest

• Training Content, Materials and Exercises
– includes written materials, videos, exercises and other tools of instruction
– modifications can be made based on trainees' feedback

Page 16

Reaction Outcomes (cont'd)

• Reactions to the Trainer(s)
– evaluations of the trainer's actions
– how well the trainer conducted the training programme

• Facilities
– items related to the facilities: noise, temperature, seating arrangements, etc.
– helps in determining whether the facilities should be used for future training programmes

Page 17

Sample Reaction Questionnaire – Trainer

Please circle the number that reflects the degree to which you agree or disagree with the following statements.
1 Strongly disagree  2 Disagree  3 Neither agree nor disagree  4 Agree  5 Strongly agree

1. The trainer did a good job of stating the objectives at the beginning of training  1 2 3 4 5

2. The trainer made good use of visual aids when making presentations  1 2 3 4 5

3. The trainer was good at keeping everyone interested in the topics  1 2 3 4 5

4. The trainer encouraged questions and participation from trainees  1 2 3 4 5

5. The trainer made sure everyone understood the concepts before moving on to the next topic  1 2 3 4 5

6. The trainer summarized important concepts before moving to the next module  1 2 3 4 5

7. Overall, how would you rate this trainer? (check one)
_____ 1. Poor; I would not recommend this trainer to others
_____ 2. Adequate; I would recommend this trainer only if no others were available
_____ 3. Average
_____ 4. Good; I would recommend this trainer above most others
_____ 5. Excellent; this trainer is among the very best I've ever worked with

Page 18

Sample Reaction Questionnaire – Instructor and Materials

Page 19

Learning Outcomes

• Learning objectives are developed from the TNA (training needs analysis)

• The gap between trainees' KSAs and the required KSAs defines the learning that must occur

• Three types of learning outcomes:
– knowledge
– skills
– attitudes

Page 20

Learning Outcomes - Knowledge

• There are three types of knowledge:
– declarative
– procedural
– strategic

Page 21

Knowledge Outcome - Declarative

• factual knowledge

• paper-and-pencil tests are often used to determine whether trainees have learned the knowledge
– easier to administer and score
– if properly developed, accurately measures most declarative knowledge

• the multiple-choice test is the most common format
– reliable
– covers a broader range of content

Page 22

Knowledge Outcome -Procedural

• organizing information into mental models

• one method commonly used is "paired comparisons"
– the trainee's answers are compared to an expert's answers
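The slides do not specify how a paired comparison is scored. One minimal sketch, assuming the trainee and an expert each rank the same set of items (the troubleshooting steps below are invented for illustration), is to count the fraction of item pairs the trainee orders the same way as the expert:

```python
from itertools import combinations

def pairwise_agreement(trainee_order, expert_order):
    """Fraction of item pairs the trainee orders the same way as the expert.

    Both arguments are lists of the same items, ranked first to last.
    1.0 means the trainee's mental model matches the expert's exactly;
    0.0 means every pair is ordered the opposite way.
    """
    pos_t = {item: i for i, item in enumerate(trainee_order)}
    pos_e = {item: i for i, item in enumerate(expert_order)}
    pairs = list(combinations(expert_order, 2))
    agree = sum(
        1 for a, b in pairs
        if (pos_t[a] < pos_t[b]) == (pos_e[a] < pos_e[b])
    )
    return agree / len(pairs)

# Hypothetical task: rank troubleshooting steps as an expert would.
expert  = ["check power", "check cable", "reboot", "replace unit"]
trainee = ["check cable", "check power", "reboot", "replace unit"]
print(pairwise_agreement(trainee, expert))  # one of six pairs is flipped
```

This is essentially a rank-correlation measure; a score near 1.0 suggests the trainee has organized the procedure into the same mental model as the expert.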

Page 23

Knowledge Outcome - Strategic

• deals with the ability to develop and apply cognitive strategies in problem solving

• assesses the level of understanding the trainee has about the decisions and choices he/she makes

• trainees are required to provide the rationale for decisions/choices made

Page 24

Learning Outcome Questionnaire - Example

Page 25

Skill-Based Outcomes

• to determine if a skill or set of behaviours has been learned

• measures the level of learning, not whether the skills are used on the job

• two levels of skill acquisition:
– compilation
– automaticity

Page 26

Skill-based Outcome - Compilation

• to determine the additional skills acquired by the trainee from training

• various methods may be used:
– structured scenarios
– multiple raters using standardized methods

Page 27

Skill-based Outcome - Automaticity

• the speed with which a skill is used

• one method is a speed test – the performance has to be completed within a certain time

• e.g. emergency procedures for pilots
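A speed test of this kind can be sketched in a few lines. The drill callable and the cutoff below are hypothetical placeholders, not something the slides prescribe:

```python
import time

def timed_drill(task, time_limit_s):
    """Run one pass of a drill and report whether it beat the cutoff.

    `task` is any callable representing the practised procedure (e.g. a
    simulated emergency checklist); `time_limit_s` is a hypothetical
    mastery criterion in seconds. Returns (elapsed seconds, passed?).
    """
    start = time.perf_counter()
    task()
    elapsed = time.perf_counter() - start
    return elapsed, elapsed <= time_limit_s

# Usage with a stand-in task: automaticity would mean repeatedly
# passing well under the limit.
elapsed, passed = timed_drill(lambda: sum(range(1000)), time_limit_s=5.0)
print(f"{elapsed:.4f}s, passed={passed}")
```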

Page 28

Attitudinal Outcomes

• assessment of changes in attitudes

• attitudinal scales are often used

• to measure change, the evaluation should use pre- and post-training measures of responses on the same scale

• caution: self-report measures have well-known weaknesses
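A pre/post comparison on the same attitudinal scale is commonly summarized with the mean change and a paired t statistic; the scores below are invented for illustration, and the slides do not prescribe any particular statistic:

```python
import math
import statistics

def paired_change(pre, post):
    """Mean attitude change and paired t statistic for pre/post scores.

    `pre` and `post` are same-length lists of one group's scores on the
    same attitudinal scale before and after training. A larger |t| gives
    more confidence the change is real rather than noise; compare it to
    a t table with n-1 degrees of freedom.
    """
    diffs = [b - a for a, b in zip(pre, post)]
    mean_d = statistics.mean(diffs)
    se = statistics.stdev(diffs) / math.sqrt(len(diffs))
    return mean_d, mean_d / se

# Hypothetical 1-5 scale scores for eight trainees.
pre  = [2, 3, 2, 4, 3, 2, 3, 2]
post = [4, 4, 3, 5, 4, 3, 4, 3]
mean_d, t = paired_change(pre, post)
print(f"mean change = {mean_d:.2f}, t = {t:.2f}")
```

Because the same people answer twice, the paired design removes stable individual differences from the comparison, which is exactly why the slide recommends pre/post measures on the same scale.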

Page 29

Questionnaire - Example

Page 30
Page 31

Ambiguity?

Page 32

Behaviour Outcomes

• to determine whether the training has transferred to the job

• primary sources of data:
– interviews
– questionnaires
– direct observation
– performance records

Page 33

Behaviour Outcomes (cont’d)

• questionnaires are often used, because:
– opinions can be obtained from a large number of employees
– the information can be tabulated to yield a numerical response
– respondents are anonymous
– a short amount of time is required
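Tabulating questionnaire responses into a numerical summary can be as simple as the sketch below; the items and scores are hypothetical, and a 1-5 Likert coding (1 = strongly disagree ... 5 = strongly agree) is assumed:

```python
from statistics import mean

# Hypothetical behaviour-transfer questionnaire: one list per item,
# one 1-5 response per (anonymous) employee.
responses = {
    "applied the skills on the job":       [4, 5, 3, 4, 4, 2, 5],
    "supervisor supports the new methods": [3, 2, 4, 3, 2, 3, 3],
}

# Tabulate each item into a mean score and the percentage of
# favourable (4 or 5) answers.
for item, scores in responses.items():
    favourable = 100 * sum(1 for s in scores if s >= 4) / len(scores)
    print(f"{item}: mean {mean(scores):.2f}, {favourable:.0f}% favourable")
```

In practice the same tabulation scales to hundreds of respondents, which is the point of the questionnaire approach over interviews or observation.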

Page 34

Organizational Results

• The objectives of training are developed to solve organizational problems

• Evaluation is conducted to assess whether the training has solved those problems

• Some aspects of evaluation may include:
– reduction in defects
– increase in productivity
– reduction in grievances
– increase in quality

Page 35

Validity Issue in Evaluation

• Internal validity:
– whether the change was a function of the training
– the confidence that the results of the evaluation are in fact correct

• External validity:
– whether the same results are generalizable to other groups of trainees

Page 36

Threats to Internal Validity

• History– other events that take place concurrently with training

• Maturation– changes that occur because of the passage of time

• Testing
– arises in pretest/post-test designs that use the same test: the pretest itself can affect post-test scores

Page 37

Threats to Internal Validity (cont’d)

• Instrumentation
– two different but supposedly equivalent tests may not really be equivalent
– this could cause differences between the two scores

• Statistical regression
– tendency for those who score either very high or very low on a test to regress toward the middle when taking the test again ("regression to the mean")
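Regression to the mean falls out of any measure that mixes stable ability with occasion-to-occasion noise. The simulation below (all numbers invented) selects the lowest scorers on a first test and shows their average moving back toward the group mean on a retest, with no training in between:

```python
import random

random.seed(42)

# Each observed score = stable "true" ability + independent measurement
# noise on each occasion. Group selected for extreme test-1 scores
# drifts back toward the mean on test 2 without any intervention.
true_ability = [random.gauss(50, 10) for _ in range(1000)]
test1 = [t + random.gauss(0, 10) for t in true_ability]
test2 = [t + random.gauss(0, 10) for t in true_ability]

# Select roughly the bottom 10% on test 1.
cutoff = sorted(test1)[100]
low = [i for i, s in enumerate(test1) if s <= cutoff]
m1 = sum(test1[i] for i in low) / len(low)
m2 = sum(test2[i] for i in low) / len(low)
print(f"low scorers: test1 mean {m1:.1f}, test2 mean {m2:.1f}")
```

If that low-scoring group had been put through training, the rebound from m1 to m2 could easily be mistaken for a training effect, which is why this threat matters for designs that select trainees by extreme pretest scores.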

Page 38

Threats to Internal Validity (cont’d)

• Initial Group Differences
– comparison between trainees and a similar group of employees (a control group who have not been trained)
– it is important that the control group be similar in every way to the training group

• Loss of Group Members
– poor scorers on the pretest may be demoralised and drop out of the training

Page 39

Threats to Internal Validity (cont’d)

• Diffusion of Training
– trainees may share the knowledge with the control group
– post-test scores may then not show differences between the training group and the control group

• Compensating Treatment
– the control group may get special assistance because they are not given training

Page 40

Threats to Internal Validity (cont’d)

• Compensatory Rivalry
– the control group may see the situation as a challenge and compete for better performance
– post-test scores may not show a difference in performance between the two groups

Page 41

Threats to Internal Validity (cont’d)

• Demoralised Control Group
– the control group may perceive that they were chosen as the control group because they are not as good as the training group
– they may give up and actually reduce productivity
– post-test scores would then show a difference, but not one due to the training

Page 42

External Validity

• The evaluation has to be internally valid before it can be externally valid

• If training is effective for one group, will it also be effective for other groups? i.e. is the evaluation generalizable?

Page 43

Threats to External Validity

• Testing
– subsequent training groups may not take pre-tests, so it is difficult to conclude that the training would be as effective for them
– those who took the pre-test may have focused on certain materials highlighted in the test

• Selection
– a training program with an identical design may produce different results for different categories of employees

Page 44

Threats to External Validity (cont’d)

• Reaction to Evaluation
– success with one group may make further evaluation seem unnecessary
– those who are evaluated may get more attention ("Hawthorne effect"):
• novelty
• feeling special
• received feedback
• know they are being observed
• inspired by the trainer

Page 45

Threats to External Validity (cont’d)

• Multiple Techniques
– the effectiveness could come from a combination of techniques:
• lecture – appears to have no effect
• video instruction – appears effective
– video instruction is then used alone for future training, but proves ineffective
– the effectiveness of training in the first group was the result of both lecture and video instruction together