TRANSCRIPT
Empowering Educators to be Innovators
706.721.7839 [email protected]
www.gru.edu/mcg/academic-affairs/eii
Jefferson Scale of Lifelong Learning-Health Professions Students Version (JeffSLL-HPS)
Malorie Kosht Novak, PT, Ph.D., DPT, Christie Lancaster Palladino, MD, MSc,
Brittany Layne Ange, MS, Deborah South Richardson, Ph.D.
Department of Physical Therapy and Educational Innovation Institute
Georgia Regents University, Augusta, Georgia
DESCRIPTION
DEVELOPMENT
THE JEFFSLL-HPS APPLICATION
CONCLUSIONS
REFERENCES
The Jefferson Scale of Lifelong Learning-Health
Professions Students Version (JeffSLL-HPS), an
adaptation of the Jefferson Scale of Physician
Lifelong Learning-Medical Students Version (JeffSPLL-MS)1
is an instrument that measures health professions
students' (HPS) orientation toward lifelong learning
(LLL). It has 14 items with response options
presented along a 4-point Likert scale (1= strongly
disagree; 4 = strongly agree). Higher JeffSLL-HPS
scores indicate a greater orientation toward lifelong
learning. The instrument may be administered
electronically or on paper.
The JeffSPLL-MS1 was modified for use with HPS
across different disciplines. Cognitive interviews
were used to assess content validity. A total of 180 senior
students in dental hygiene, dental medicine,
medicine, nursing, occupational therapy, physician
assistant, physical therapy, and respiratory therapy
completed the JeffSLL-HPS (out of 502 students
approached). Confirmatory factor analysis (CFA)
revealed a three-factor solution consistent with that
of the JeffSPLL-MS.1 The factors are named
“learning beliefs and motivation,” “skills in seeking
information,” and “attention to learning
opportunities.”1 Internal consistency for scores on
each of the three factors ranged from .62 to .78. CFA
of the subgroup of medical students yielded results
similar to those reported above, suggesting that the
JeffSLL-HPS may be appropriate to use with this
group. Students answered using the full range of
responses for each item on the JeffSLL-HPS,
suggesting that social desirability was not a major
factor in item response. We received IRB approval
for this study.
JeffSPLL-MS Compared to JeffSLL-HPS

Statistic            JeffSPLL-MS1 (N = 652)   JeffSLL-HPS (N = 180)
Total Mean Score     43.52                    43.06
Standard Deviation   4.65                     5.50
Range                27-56                    26-56
Cronbach's Alpha     .77                      .85
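The alpha coefficients above follow the standard formula: alpha = (k/(k-1)) * (1 - sum of item variances / variance of total scores), where k is the number of items. As an illustration only (the poster does not state which software was used), a minimal pure-Python sketch:

```python
from statistics import variance

def cronbach_alpha(items):
    """Cronbach's alpha for a scale.

    items: a list of k lists, each holding one item's scores across the
    same n respondents (e.g. 1-4 Likert responses). Uses sample variance.
    """
    k = len(items)
    n = len(items[0])
    # Sample variance of each item, and of each respondent's total score.
    item_vars = [variance(col) for col in items]
    totals = [sum(items[i][j] for i in range(k)) for j in range(n)]
    return (k / (k - 1)) * (1 - sum(item_vars) / variance(totals))
```

For the 14-item JeffSLL-HPS one would pass 14 columns of 1-4 responses; a value near the reported .85 indicates strong internal consistency.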
Our work with the JeffSLL-HPS demonstrates an
internal structure consistent with that of the
JeffSPLL-MS1 and suggests that the JeffSLL-HPS
may be used as a reliable assessment of orientation
toward LLL in students from multiple healthcare
disciplines. We are conducting longitudinal research
with the instrument to study whether students'
orientation toward LLL changes over time. Future
research should investigate additional forms of
validity evidence for scores on the JeffSLL-HPS and
whether scores on self-report measures of LLL
translate into behavior change and educational
outcomes. Additional research could also explore
whether the JeffSLL-HPS could be used to assess
the effectiveness of specific activities implemented
in a curriculum geared toward facilitating LLL.
ACKNOWLEDGEMENT
Jefferson Scale of Lifelong Learning (JeffSLL-Health Professions Students Version)
Instructions: Please indicate the extent of your agreement with each of
the following statements by circling the appropriate number.
1. Searching for the answer to a question is, in and by itself, rewarding. 1 2 3 4
2. Lifelong learning is a professional responsibility of all healthcare providers. 1 2 3 4
3. I enjoy reading articles in which issues of healthcare/medicine are discussed. 1 2 3 4
4. I routinely attend student study groups. 1 2 3 4
5. I read healthcare/medical literature in journals, websites or textbooks at least once every week. 1 2 3 4
6. I routinely search electronic resources to find out about new developments in healthcare/medicine. 1 2 3 4
7. I believe that I would fall behind if I stopped learning about new developments in healthcare/medicine. 1 2 3 4
8. One of the important goals of health professions' education is to develop students' lifelong learning skills. 1 2 3 4
9. Rapid changes in health science/medicine require constant updating of knowledge and development of new professional skills. 1 2 3 4
10. I always make time for learning on my own, even when I have a busy class schedule and other obligations. 1 2 3 4
11. I recognize my need to constantly acquire new professional knowledge. 1 2 3 4
12. I routinely attend optional sessions, such as professional meetings, guest lectures, or clinics where I can volunteer to improve my knowledge and clinical skills. 1 2 3 4
13. I take every opportunity to gain new knowledge/skills that are important to my discipline. 1 2 3 4
14. My preferred approach in finding an answer to a question is to consult a credible resource such as a textbook or electronic resource. 1 2 3 4
© 2007 Jefferson Medical College. All rights reserved. Adapted with permission for administration to health professions students at Georgia Regents University.
(Response anchors: 1 = Strongly Disagree, 2 = Disagree, 3 = Agree, 4 = Strongly Agree)
We wish to thank Mohammadreza Hojat, Ph.D. of
Jefferson Medical College for his consultation and
permission to adapt the JeffSPLL-MS.
1. Wetzel AP, Mazmanian PE, Hojat M, et al. Measuring medical
students' orientation toward lifelong learning: a psychometric
evaluation. Acad Med. 2012;85(10):S41-S44.
2. Slavin MD. Teaching evidence-based practice in physical
therapy: critical competencies and necessary conditions. JOPTE
2004;18(3):4-11.
We confirmed that in our sample, the JeffSLL-HPS
replicated the structure of the JeffSPLL-MS.1 This
tool may be valuable for faculty and administrators in
health professions programs for assessing whether
accreditation standards are being met, the effects of
curricular design and teaching strategies on LLL,2
and student attitudes toward LLL.2
Assessing Health Professions Students’ Orientation Toward Lifelong Learning
Malorie Kosht Novak, PT, Ph.D., DPT, Christie Lancaster Palladino, MD, MSc,
Brittany Layne Ange, MS, Deborah South Richardson, Ph.D.
Department of Physical Therapy and Educational Innovation Institute
Georgia Regents University, Augusta, Georgia
REFERENCE
INTRODUCTION
PURPOSE
METHODS
RESULTS DISCUSSION
CONCLUSION
REFERENCES
Lifelong learning is considered an
element of professionalism in many
disciplines,1 and accreditation
standards for healthcare
professions education require an
emphasis on lifelong learning
(LLL). However,
tools to assess LLL in healthcare
professions students (HPS) are
lacking,2 thus making it difficult to
assess whether these standards
have been met.
Health Professions Students’ Orientation
Toward Lifelong Learning
To examine the psychometric
properties of an adaptation of the
Jefferson Scale of Physician
Lifelong Learning-Medical
Students Version (JeffSPLL-MS)3
that was designed to assess LLL in
HPS (JeffSLL-HPS)
The JeffSPLL-MS was modified for
use with HPS across different
disciplines and administered to
students in their last year of study.
180/502 (35.9%) students
representing all programs
responded with useable surveys.
We received IRB approval for this
study.
1. Arnold L. Assessing professional behavior: yesterday, today, and
tomorrow. Acad Med. 2002;77:502-515.
2. Slavin MD. Teaching evidence-based practice in physical therapy:
critical competencies and necessary conditions. JOPTE.
2004;18(3):4-11.
3. Wetzel AP, Mazmanian PE, Hojat M, et al. Measuring medical
students' orientation toward lifelong learning: a psychometric evaluation. Acad Med. 2012;85(10):S41-S44.
College                      Program               N/Total (%) Completed*
Allied Health Sciences       Dental Hygiene        6/23 (26%)
                             Occupational Therapy  11/39 (28%)
                             Physician Assistant   18/37 (49%)
                             Physical Therapy      24/33 (73%)
                             Respiratory Therapy   6/22 (27%)
Dental Medicine              Dental Medicine       14/67 (21%)
Medical College of Georgia   Medicine              80/201 (40%)
Nursing                      Nursing               18/80 (23%)
*Three survey respondents did not indicate their program; we included their data in the analyses.
JeffSPLL-MS3 Compared to JeffSLL-HPS

Statistic            JeffSPLL-MS3 (N = 652)   JeffSLL-HPS (N = 180)
Total Mean Score     43.52                    43.06
Standard Deviation   4.65                     5.50
Range                27-56                    26-56
Cronbach's Alpha     .77                      .85
Confirmatory factor analysis revealed a
three-factor structure for the JeffSLL-HPS
that was consistent with the JeffSPLL-MS.3
The factors are named “learning beliefs and
motivation,” “skills in seeking information,”
and “attention to learning opportunities.”3
The JeffSLL-HPS may be valuable to use in
healthcare professions programs to assess
compliance with accreditation criteria
regarding LLL, effects of curricular design and
teaching strategies on LLL,2 and
student attitudes toward LLL.2
The internal structure of responses
of the JeffSLL-HPS is consistent
with the JeffSPLL-MS. As
healthcare professions education
becomes more interdisciplinary, it
is important to have tools to assess
elements of professionalism, such
as LLL, across disciplines. The
JeffSLL-HPS is currently being
used in a longitudinal study to
assess changes in orientation
toward LLL in Doctor of Physical
Therapy students from
matriculation to graduation.
How Do Engineering Majors Perform in the Pre-Clinical Years of Medical School Compared to Their Peers?
Brittany Ange, MS; Andria Thomas, PhD; and Paul Wallach, MD Medical College of Georgia at Georgia Regents University
Evaluation Services
• An Analysis of Covariance (ANCOVA) was calculated to determine the effect of undergraduate major on pre-clinical grade point average (GPA) after adjusting for undergraduate GPA.
• Similarly, ANCOVAs were performed to determine the effect of undergraduate major on Year 1 GPA and Year 2 GPA separately after adjusting for undergraduate GPA.
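The ANCOVA described above amounts to comparing two nested regression fits: pre-clinical GPA on undergraduate GPA alone versus undergraduate GPA plus dummy-coded major. A minimal numpy sketch of that comparison (illustrative only; the poster does not name its statistics package, and variable names are hypothetical):

```python
import numpy as np

def ancova_group_f(y, covariate, groups):
    """F statistic for a categorical group effect after adjusting for one
    covariate, from the residual sums of squares of nested OLS fits."""
    y = np.asarray(y, dtype=float)
    x = np.asarray(covariate, dtype=float)
    levels = sorted(set(groups))
    n = len(y)
    # Reduced model: intercept + covariate only.
    X_reduced = np.column_stack([np.ones(n), x])
    # Full model: add a dummy column for each group level beyond the first.
    dummies = np.column_stack(
        [[1.0 if g == lev else 0.0 for g in groups] for lev in levels[1:]]
    )
    X_full = np.column_stack([X_reduced, dummies])

    def rss(X):
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        return float(resid @ resid)

    df_group = len(levels) - 1       # numerator degrees of freedom
    df_resid = n - X_full.shape[1]   # denominator degrees of freedom
    return ((rss(X_reduced) - rss(X_full)) / df_group) / (rss(X_full) / df_resid)
```

The p-value then comes from the F distribution with (df_group, df_resid) degrees of freedom (e.g. scipy.stats.f.sf); in practice statsmodels' anova_lm on an ols("gpa ~ ugpa + C(major)") fit performs this comparison directly.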
• Students matriculate into medical school with varying undergraduate degrees and diverse backgrounds.
• An increasing number of students have degrees other than the traditional basic sciences.
• The curriculum of an undergraduate engineering degree is designed with an emphasis on developing problem-solving skills rather than on the basic sciences that dominate pre-clinical coursework. Thus, anecdotal evidence suggests that medical students with engineering degrees may struggle in the pre-clinical curriculum (first two years) of medical school.
• The purpose of this study was to examine how medical students with an undergraduate engineering degree perform in the first two years of medical school compared to their peers.
• ANCOVA results showed that the overall F-test was significant (p<.0001) indicating that either undergraduate GPA, major, or both had an effect on pre-clinical GPA.
• The p-value for major was not statistically significant (p=0.622) and the p-value for undergraduate GPA was statistically significant (p<.0001).
• Hence, after adjusting for undergraduate GPA, there was no statistically significant difference in pre-clinical GPA across the categories of major used in this study.
• Secondary analyses assessing Year 1 and Year 2 GPA separately yielded similar results. There was no statistically significant difference in Year 1 GPA (p=0.348) or Year 2 GPA (p=0.395) across majors after adjusting for undergraduate GPA.
• These results suggest that engineering majors do not perform differently from non-engineering majors in their pre-clinical performance in total.
• This study also revealed that, overall, engineering majors do not perform differently from their peers during the first or second year of medical school.
• This information is useful for: • Medical school admissions offices as they seek to determine the best performance indicators of medical school. • Undergraduate engineering programs as they advocate for the usefulness and applicability of their programs for future medical students.
• Future research will seek to examine if these results hold during the clinical component of their medical education.
Major Category     MCAT        UGPA        Year 1 GPA   Year 2 GPA   Pre-Clinical GPA
                   Mean (SD)   Mean (SD)   Mean (SD)    Mean (SD)    Mean (SD)
Engineering        31.3 (2.5)  3.6 (0.2)   3.4 (0.4)    3.5 (0.4)    3.4 (0.4)
Non-Eng. Science   30.4 (3.0)  3.7 (0.2)   3.3 (0.5)    3.5 (0.5)    3.4 (0.4)
Other              30.2 (3.1)  3.7 (0.2)   3.4 (0.4)    3.5 (0.4)    3.4 (0.4)
• Students in the Classes of 2011 – 2014 (n=721) were included.
• Undergraduate major was categorized as follows: engineering (n=56), non-engineering science (n=538), and other (n=127).
PROCESS ORIENTED GUIDED INQUIRY LEARNING: A NATURAL FIT FOR OCCUPATIONAL THERAPY EDUCATION
Evidence based practice (EBP) is the use of research, experience, and client/student
factors in the provision of health care and education. Learning and employing EBP
requires critical thinking and evaluation skills. Students learn best when they are actively
engaged in learning cycles of exploration, concept invention, and application.
David Hanson and colleagues, funded by the National Science Foundation, developed
Process-Oriented Guided-Inquiry Learning (POGIL) to enhance science education and
address the observed weaknesses of both lecture-based and problem-based learning.
Specifically, lecture-based learning is passive, solitary, and limits responsibility for learning.
Problem based learning has the weaknesses of 1) expecting content to be learned by
novices who may not recognize important content and 2) requiring extensive facilitation to
ensure that the problem-solving progresses effectively to desired outcomes (Gallow;
Problem-Based Learning).
POGILs were developed to facilitate the skills of information processing, critical and
analytical thinking, problem solving, communication, teamwork, management, and
assessment (Hanson, 2006). This strategy has been adapted in a variety of non-core
science areas, such as marketing, healthcare, and humanities (Hale & Mullen, 2009).
Evidence Based Practice was taught in the first semester of the Occupational Therapy Program for
6 years using a traditional lecture-based format. During that time both instructors and students were
dissatisfied with the course. To address this dissatisfaction and to increase active learning, we
implemented POGIL strategies in the Fall of 2010 (1x/week for 3 hours) and continued in Fall 2011
and 2012 (2x/week for 1 ½ hours). We developed 12 modules on topics including: an introduction to
evidence based practice, asking clinical questions, searching for evidence, the structure and use of
scientific writing, APA formatting, levels of evidence, and appraising evidence from descriptive,
quasi-experimental, RCTs, systematic reviews, and qualitative studies.
Each class had pre-assigned readings. In 2010 new groups were established weekly at the
beginning of each class; in 2011 and 2012 students remained in the same group throughout the
semester. POGIL sessions would begin with the distribution of the packets. Students were expected
to work through the instructions, moving from the basic content to the applied experiences by the
end of the session.
Over the course of the semesters, faculty's belief that the use of POGIL was an effective teaching
strategy was reinforced through observation of the active engagement and written reflections from
the students. Implementation in the second and third years was enhanced due to better structuring
of the activities; a clear introduction of the benefits of the process on the first day of class; and
clearer grading guidelines than during the first year.
The summary points listed in the next column would assist other faculty in using this method.
Gallow, D. What is Problem-based Learning? Problem-based Learning Faculty
Institute, University of California, Irvine. Retrieved 3/21/13 from
http://www.pbl.uci.edu/whatispbl.html
Hale, D., & Mullen, L.G. (2009). Designing process-oriented guided-inquiry activities:
A new innovation for marketing class.
Hanson, D.M. (2006). Instructor's guide to process-oriented guided-inquiry learning.
Lisle, IL: Pacific Crest.
Problem-based learning. Learning-Theories.com. Retrieved 3/21/13 from
http://www.learning-theories.com/problem-based-learning-pbl.html
Preparing, Planning & Developing
• It is important to experience and study the POGIL method before implementing
• Faculty need to be creative and write clear task instructions
• Development requires appreciable initial time and familiarity with the format
• Foster identification of essential elements of learning through graphics
• Introducing the benefits of this learning method to promote student buy-in is critical
Challenges
• Students' inexperience with critique and reflection
• Students' perception of teaching/learning: "I feel like I have to teach myself. I pay you to teach me."
• Environment not consistently conducive to small group work
• Process is multifactorial, thus difficult to draw causal relationships
• Examining effectiveness of the teaching strategy requires additional measures

Advantages
• Some students preferred this format to "Death by PowerPoint"
• Format appealed to adult preferences for active learning
• Self-assessment of knowledge trended higher across the semester
Suggested group size is 3-5 students with specific roles assigned to each member per
session:
Manager: Manages the group; time-keeper; ensures participation in roles; interfaces with
facilitator
Recorder: Records names/roles of group; note-taker
Presenter: Presents concise oral reports to class
Reflector: Observes/comments on the group‟s performance
Additional roles might include Technician, Encourager, Fact Checker.
POGILs can be employed as an adjunct or supplement to lectures, or as the primary method
of teaching/learning with some supplemental mini-lectures. Written materials guide the
process of the session, with in-class facilitation available. Presentations of group work and
discussion clarify content. Sessions end with shared critical thinking questions.
POGIL activities are done in structured, collaborative small groups. They
follow a defined learning cycle of exploration, concept invention, and application.
Lynn Jaffe, ScD, OTR/L; Robert Gibson, PhD, MSOTR/L; Mariana D’Amico, EdD, OTR/L, BCP
College of Allied Health Sciences at Georgia Regents University, Augusta, Georgia
POGIL FORMAT DESCRIPTION
OBJECTIVES
1. Describe POGIL, the collaborative learning format used in the Evidence-Based Practice class
2. Discuss pros and cons of using the POGIL format with graduate students
Overall, an increase in course satisfaction was evident between 2010 and 2011, but
remained static in 2012. Comments and ratings revealed there is still room for improvement.
Some students suggested having more explicit directions for the assignments and more
mini-lectures before doing the active learning.
Student self-reports on knowledge and skill development were comparable across years,
showing statistically significant improvement across the semester.
Student comments were more positive in 2011 & 2012 than in 2010. Examples:
“I think the POGILS were a good way of helping us to retain the information.” 2010
“It was interesting that when it came to the tests I felt like I knew the information well
because we had worked with the information so much in class.” 2011
“Prepared us to be better researchers. Group activities allowed us to work on collaboration
and team work.” 2011
“Having groups in class to work on assignments was a strength because it helped students
learn more by seeing other students' thought processes.” 2011
“…It also helped with teaching me how to operate effectively within a group.” 2011
“All of the hands-on work was useful in learning the material.” 2012
BACKGROUND COURSE DESCRIPTION
SELECTED REFERENCES
CLASS OUTCOMES
SUMMARY POINTS
The IPE Journal at GRU
Lori Anderson, Michael Brands, Miriam Cortez-Cooper, Matthew Diamond, Mahmood Mozaffari, Barbara Russell
Student Explanations for Declining Participation in an Educational Survey: A Qualitative Analysis
Lara Stepleman, Abbey Valvano, & Lauren Penwell-Waines
Educational Innovation Institute (EII), Georgia Regents University (GRU), Augusta, GA
Online administration of educational surveys is a common method for collecting critical health professions student data; yet, poor response rates frequently limit the utility and generalizability of findings.
Although numerous studies have explored approaches to increasing survey response, we know less about the motivations of students choosing not to participate.
Objective: To thematically describe anonymous, optional written feedback from health professions students declining participation in an online survey study about sexual health competencies.
Participants: 178 health professions students providing a written explanation for opting-out of an online survey regarding training issues in sexual health
• Approximately 1600 health professions students received the survey
• 683 acknowledged receipt of the survey
• 496 at least partially completed it
• 187 requested to opt out
• 178 provided written explanations
Explanations were independently analyzed by three raters using tenets of grounded theory. Our iterative analysis yielded six themes.
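The poster does not report an agreement statistic for its three raters, but when raters assign codes independently, a chance-corrected measure such as Cohen's kappa (computed pairwise) is a common check. A minimal pure-Python sketch with hypothetical theme labels:

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters who each assign
    one categorical code (e.g. a theme label) per response."""
    n = len(rater_a)
    # Proportion of responses the two raters coded identically.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Agreement expected by chance, given each rater's marginal frequencies.
    expected = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / n ** 2
    return (observed - expected) / (1 - expected)
```

With three raters, kappa can be reported for each of the three pairs, or a multi-rater statistic such as Fleiss' kappa can be used instead.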
Background
Methods
Conclusions
This study contributes to our knowledge about the contextual factors related to health professions students’ decisions to decline participation in educational survey research, especially around a topic perceived to be sensitive.
Both survey characteristics and the characteristics of those completing the survey should be considered when optimizing the instrument and survey procedures for maximum response.
Opt-out responses are unique, informative data that researchers may consider collecting, when feasible, to provide additional context to their results.
Limitations: Demographics for the opt-out sample are unknown, precluding additional analyses regarding characteristics of those who provided opt-out feedback.
Results
Table 1. Themes of opt-out explanations
We would like to acknowledge the GRU EII for their financial support toward the completion of this project.
MAJOR THEMES (with exemplar quotes and n (%) of codes)

Survey Structure — length; unsure how to answer item as worded — 45 (25%)
"Too long"
"Some of the questions are impossible to answer accurately."

Time — 33 (18%)
"MCG raised my tuition, I have to work to make an extra 1000 per semester. No time for surveys."

Content Relevancy — not seen as relevant, important, interesting, or applicable — 28 (15%)
"As a dental student, we rarely deal with sexual health."
"It doesn't apply to me."

Content Sensitivity — disclosure concerns, discomfort — 20 (11%)
"I do not wish to share this kind of information freely, even if anonymous."
"I do not feel comfortable taking a survey on this topic."

Personal Competency — not enough experience with sexual health topics, perceived competence in sexual health training — 19 (10%)
"I do not feel I am adequately educated about sexual health"
"I feel comfortable discussing sexual health with my patients and feel like I approach it in a non-threatening and non-judgmental way."

Personal Attributes — religion, marital status, sexual activity — 10 (5%)
"Personal" "Not sexually active" "Religious reasons" "I am married"
Figure 1. Model of opt-out response themes
N = 182 total codes (16 responses were given more than 1 code)
28 responses did not contribute to a meaningful theme or were too vague (e.g., “Choose not to respond”) and were included in an “Other” category.
10 nonsensical responses were excluded from thematic analyses.