

[Figure. Percentage of Agreement Between Participants' and Trained Observers' Data (y-axis, 0–100%) across Sessions (x-axis, 1–14), plotted as a multiple baseline across Participants 1–4. Phases: Baseline and Post-Video Training; booster sessions are marked for Participants 1 and 3, and novel Videos 1–5 are labeled for Participant 4. The ˄ symbol marks the performance criterion.]

Teaching Behavior Observation and Data Collection Skills through Video Training

Diana Perez¹, Candice Hansard¹, Ellie Kazemi¹, and Christina Saez²
¹California State University Northridge; ²Los Angeles Mission College

Introduction
o In applied behavior analysis (ABA), research on clinical skills training relies on consistent collection of procedural-integrity data by trained observers.
• Observer training requires substantial time and resources.
• Alternatively, video training has been shown to be efficacious for similar trainings (Dempsey et al., 2012; Field et al., 2015).
o Purpose. To evaluate the efficacy of a video training package for teaching direct observation of procedural integrity and data collection skills.

Acknowledgments
This poster was funded through a grant from the National Institutes of Health (NIH) Building Infrastructure Leading to Diversity (BUILD) # 5RL5MD009603

Discussion
o All participants met performance criteria
• One viewing of the video training package
§ Total training time: 42 min
§ 2–3 testing videos
• All said that they would recommend the training and felt confident
• Extra time from a trainer was not needed
o Limitations:
• Potential reactivity
• Some participants ran out of time

Social Validity Questionnaire

Statement                                                                    Participant Average Rating (Likert scale of 1–5)
Recommend this…to learn to collect direct observational data.                M = 4.75, SD = 0.5
Training package…can be used when a trainer is not present.                  M = 4.5, SD = 1
I feel confident that I correctly collect data on a preference assessment.   M = 4.5, SD = 0.58
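The M and SD values in the table are each consistent with one Likert rating per participant (n = 4). As an illustration only, the individual ratings below are hypothetical and not reported on the poster; the sketch shows how the first row's statistics would be computed with Python's standard library:

```python
import statistics

# Hypothetical ratings from the 4 participants (not reported on the poster),
# chosen only to show how M and SD in the first table row are computed.
ratings = [5, 5, 5, 4]

m = statistics.mean(ratings)
sd = statistics.stdev(ratings)  # sample standard deviation (n - 1 denominator)
print(f"M = {m}, SD = {sd}")  # → M = 4.75, SD = 0.5
```

Using the sample (n − 1) standard deviation reproduces the reported SD of 0.5 exactly for these assumed ratings.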


Implications / Future Directions
o Train research assistants in less time with little supervision
o Need to conduct maintenance probes
• Also generalization probes in vivo
o Test for observer effects

References
Dempsey, C. M., Iwata, B. A., Fritz, J. N., & Rolider, N. U. (2012). Observer training revisited: A comparison of in vivo and video instruction. Journal of Applied Behavior Analysis, 45(4), 827–832. doi:10.1901/jaba.2012.45-827
Field, S. P., Frieder, J. E., McGee, H. M., Peterson, S. M., & Duinkerken, A. (2015). Assessing observer effects on the fidelity of implementation of functional analysis procedures. Journal of Organizational Behavior Management, 35(3–4), 259–295. doi:10.1080/01608061.2015.1093058

Variables & Measures
Dependent Variable = % of agreement between participants' and trained observers' data
Performance Criteria = ≥ 90% agreement across 2 consecutive sessions without additional help (a Booster session)
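The dependent variable above can be sketched as a trial-by-trial percentage-agreement calculation. The function name, record format, and example data below are illustrative assumptions, not details taken from the poster:

```python
def percent_agreement(participant_record, observer_record):
    """Trial-by-trial percentage of agreement between a participant's
    data and a trained observer's data for the same session."""
    if not participant_record or len(participant_record) != len(observer_record):
        raise ValueError("records must be non-empty and the same length")
    agreements = sum(p == o for p, o in zip(participant_record, observer_record))
    return 100.0 * agreements / len(participant_record)

# Hypothetical 10-trial session: each entry is the stimulus scored
# as selected on one trial of a preference assessment.
participant = ["A", "B", "A", "C", "B", "A", "A", "C", "B", "B"]
observer    = ["A", "B", "A", "C", "B", "A", "A", "C", "B", "A"]
print(percent_agreement(participant, observer))  # → 90.0
```

One disagreement out of ten trials yields exactly the 90% boundary value of the performance criterion.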

Sample Characteristics
Participants = 4 undergraduate students with no experience collecting data on behavior or implementing preference assessments

Design & Procedures
Single Subject: Multiple Baseline Across Participants
Baseline = Collect data on preference assessments (30 min – 1 hr 15 min)
Video Training Package = Watch video (38 min)
Post-Video Training = Collect data on preference assessments, Booster, Novel Videos (30 min – 1 hr)

Social Validity Questionnaire: Likert scale of 1-5 (1 = strongly disagree, 5 = strongly agree)
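The performance criterion defined under Variables & Measures (≥ 90% agreement on 2 consecutive sessions without a booster) could be checked programmatically. This is a minimal sketch, assuming each session is recorded as an (agreement %, booster given?) pair; the function name and data are hypothetical:

```python
def met_performance_criteria(sessions, threshold=90.0):
    """True once agreement reaches the threshold on 2 consecutive
    sessions in which no additional help (booster) was given.
    `sessions` is a list of (percent_agreement, booster_given) pairs."""
    streak = 0
    for pct, booster_given in sessions:
        streak = streak + 1 if (pct >= threshold and not booster_given) else 0
        if streak == 2:
            return True
    return False

# Hypothetical run: low baseline, one boosted session, then two clean passes.
print(met_performance_criteria([(60.0, False), (95.0, True),
                                (92.0, False), (96.0, False)]))  # → True
```

Resetting the streak on any boosted session reflects the "without additional help" clause: a high-agreement session that required a booster does not count toward mastery.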
