TRANSCRIPT
New Directions in the Research of Technology‐Enhanced Education:
A Systematic Review and Research Synthesis
Chris Rakes, University of Louisville
Lauren Wagener, University of Tennessee – Knoxville
Robert Ronau, University of Louisville
Purpose of this Research
• Provide a comprehensive picture of educational technology literature
• Synthesize educational technology literature (in mathematics) to guide future practice and research
• Evaluate a set of frameworks as a lens for analyzing and synthesizing educational technology literature
Assumptions
• Quantitative and qualitative studies are equally important.
• Each design contributes something different to understanding problems, implying that mixed methodology studies are valuable.
• Technology is equally important across content areas.
• A large research base for educational technology exists.
Systematic Process
• Define “Educational Technology” as digital tools used for teaching
• Well‐defined search terms and parameters
• Broad range of electronic databases
• Inclusion of “gray” literature
• Attention to inter‐rater reliability
• Iterative process, (at least) 2 stages:
1. Random sample of 5 content areas
2. Focused sample: educational technology, mathematics only
Purpose of Stage 1
• Provide a systematic review of the impact of technology on teaching and learning
• Propose a framework for looking at teacher knowledge from which gaps in the literature can be addressed
Sample
• We located 1,785 published papers across five content areas (mathematics, social studies, science, language arts, career and technical education)
• Stratified random sample

Subject Matter                    Number of Published Papers   Final Sample Size
Mathematics                       309                          30
Science                           335                          29
Career and Technical Education    497                          28
Language Arts                     179                          30
Social Studies                    465                          28
Total                             1,785                        145
Structured Inquiry: Three Frameworks
1. Comprehensive Framework of Teacher Knowledge (CFTK)
   A. Field (e.g., An et al., 2004; Davis & Simmt, 2003; Shulman, 1986)
   B. Mode (e.g., Fennema & Franke, 1992; Philipp et al., 2007; Schön, 1995)
   C. Context (e.g., Bryk & Driscoll, 1988; Davis, 2007)
2. Technology, Pedagogy, and Content Knowledge (TPACK)
   A. Interactions of PCK with technology (e.g., Mishra & Koehler, 2006; Niess, 2005)
   B. Stages of Integration (e.g., Niess et al., 2009)
3. Research Design
   A. Analytic Methods (e.g., Tashakkori & Teddlie, 1998; Whitehurst, 2003)
   B. Threats to Validity (e.g., Shadish, Cook, & Campbell, 2002)
   C. Reliability (e.g., Furr & Bacharach, 2008; Urbina, 2004)
   D. Trustworthiness (e.g., Creswell, 2007; Patton, 2002)
Stage 1 Results: Research Design
• High proportion of scholarly articles that were not research (66%)
• Of the 50 studies, 35 used qualitative designs (70%)
• Of the 50 studies, only 4 used mixed method designs (8%)
• Of the 35 qualitative studies, 16 used case study methodology (46%)
Stage 1 Results: Teacher Knowledge
• Of the 95 studies, 48 addressed teacher knowledge (51%)
• Only 8 of the 48 addressed Knowledge of Individual Context (17%)
• 31 of the 48 addressed Pedagogical Knowledge (65%)
• 25 of the 53 studies addressing TPACK were in the mathematics field (60%)
Stage 1: Implications and Recommendations
1. Large percentage of non‐research papers (65%)
2. All types of evidence are not considered equal (e.g., Whitehurst, 2003)

Most compelling evidence (lowest emphasis in ed tech research):
1. Randomized trial (true experiment)
2. Comparison groups (quasi‐experiment)
3. Pre‐post comparison
4. Correlational studies
5. Case studies
6. Anecdotes
Least compelling evidence (highest emphasis in ed tech research)
Stage 2: Research Questions
1. What is the overall structure of research in mathematics instructional technology?
2. What is the overall nature of the research findings in mathematics instructional technology?
3. How can data sources used in mathematics instructional technology research be categorized?
4. What are the key outcomes from papers in mathematics instructional technology (organized by frameworks)?
5. How do data source categories align with study outcomes in mathematics instructional technology research?
Stage 2: Research Questions (continued)
6. How can teacher and student outcomes in mathematics instructional technology research be categorized?
7. What NCTM Principles are addressed in mathematics instructional technology research? To what degree, how, and implicitly or explicitly?
8. Which TPACK Standards are addressed in mathematics instructional technology research?
9. What aspects of teacher knowledge are addressed in mathematics instructional technology research?
10. To what degree do the six frameworks capture the scope of mathematics instructional technology research?
Addressing Stage 1 Limitations
• Focus on educational technology in mathematics: expand data search to find missing articles (e.g., calculator studies)
• Enhance coding structure to include 3 additional frameworks: NCTM Principles, Sources of Data, and Outcomes
• More detailed coding of TPACK: differentiation between Stages and Standards, implicit and explicit
• Coding of non‐research papers (e.g., research to practice)
• Development of a coding database, emphasis on inter‐rater reliability
Stage 2
• In Stage 1, we identified 309 manuscripts addressing educational technology in mathematics
• Using study bibliographies, we identified additional search terms; as a result, we have identified 1,015 manuscripts: 430 non‐research and 585 research.
Coding Sheet Categories for Papers & Studies
• Outcomes: Student, Teacher Knowledge (CFTK), Teacher Orientation, Teacher Behavior
• Sources of Data (e.g., Walker, 2010)
• TPACK (e.g., Mishra & Koehler, 2006; Niess, 2005)
• NCTM Principles (NCTM, 2000)
• Research Design (e.g., Creswell, 2007; Patton, 2002; Shadish et al., 2002; Tashakkori & Teddlie, 1998)
Outcomes Framework
Data Sources
TPACK
NCTM Principles
Types of Technology
Research Design, Quantitative: Sampling & Assignment
Research Design, Quantitative: Reliability & Validity
Research Design, Qualitative
Non‐Research Papers
Next Steps
• Code studies
– Calibration sample: 35 studies
– Inter‐rater reliability checks: 100 studies coded by 2 people
• Analyses addressing each research question
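The inter-rater reliability checks above require an agreement statistic for the double-coded studies. The slides do not name the statistic used, so as one plausible sketch, Cohen's kappa (chance-corrected agreement between two raters) can be computed as follows; the coder labels are hypothetical.

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    # Proportion of items on which the two raters agree.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's marginal proportions.
    expected = sum(
        (rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories
    )
    return (observed - expected) / (1 - expected)

# Two hypothetical coders labeling 10 papers as research (R) or non-research (N).
a = ["R", "R", "N", "R", "N", "N", "R", "R", "N", "R"]
b = ["R", "N", "N", "R", "N", "N", "R", "R", "R", "R"]
print(round(cohens_kappa(a, b), 2))  # 0.58
```

Raw percent agreement here is 80%, but kappa discounts the agreement the coders would reach by guessing from their label frequencies, which is why reliability checks typically report kappa rather than raw agreement.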
Questions
• Full article is in press:
Ronau, R. N., Rakes, C. R., Niess, M. L., Wagener, L., Pugalee, D., Browning, C., Driskell, S. O., & Mathews, S. M. (in press). New directions in the research of technology-enhanced education. In J. Yamamoto, C. Penny, J. Leight, & S. Winterton (Eds.), Technology leadership in teacher education: Integrated solutions and experiences. Hershey, PA: IGI Global.
References
An, S., Kulm, G., & Wu, Z. (2004). The pedagogical content knowledge of middle school mathematics teachers in China and the U.S. Journal of Mathematics Teacher Education, 7, 145-172.
Bryk, A. S., & Driscoll, M. E. (1988). The high school as community: contextual influences and consequences for students and teachers. National Center on Effective Secondary Schools, Madison, WI.
Creswell, J. W. (2007). Qualitative inquiry and research design (2nd ed.). Thousand Oaks, CA: Sage.
Davis, B. (2007). Learners within contexts that learn. In T. Lamberg & L. R. Wiest (Eds.), Proceedings of the 29th annual meeting of the North American Chapter of the International Group for the Psychology of Mathematics Education. Stateline (Lake Tahoe), NV: University of Nevada, Reno.
Davis, B., & Simmt, E. (2003). Understanding learning systems: Mathematics education and complexity science. Journal for Research in Mathematics Education, 34, 137-167.
Fennema, E., & Franke, M. L. (1992). Teachers' knowledge and its impact. In D.A.Grouws (Ed.), Handbook of research on mathematics teaching and learning (pp. 147-164). Reston, VA: National Council of Teachers of Mathematics.
Furr, R. M., & Bacharach, V. R. (2008). Psychometrics: An introduction. Los Angeles: Sage.
National Council of Teachers of Mathematics. (2000). Principles and standards for school mathematics. Reston, VA: Author.
Niess, M. L. (2005). Preparing teachers to teach science and mathematics with technology: Developing a Technology Pedagogical Content Knowledge. Teaching and Teacher Education, 21, 509-523.
Niess, M. , Ronau, R. N., Shafer, K. G., Driskell, S. O., Harper, S. R., Johnston, C., Browning, C., Özgün-Koca, S. A., & Kersaint, G. (2009). Mathematics teacher TPACK standards and development model. Contemporary Issues in Technology and Mathematics Teacher Education, 9, 4-24.
Patton, M. Q. (2002). Qualitative research and evaluation methods (3rd ed.). Thousand Oaks: Sage.
Philipp, R. A., Ambrose, R., Lamb, L. L. C., Sowder, J. T., Schappelle, B. P., Sowder, L. et al. (2007). Effects of early field experiences on the mathematical content knowledge and beliefs of prospective elementary school teachers: An experimental study. Journal for Research in Mathematics Education, 38, 438-476.
Schön, D. A. (1995). The new scholarship requires a new epistemology. Change, 27, 26-35.
Shadish, W. R., Cook, T. D., & Campbell, D. T. (2002). Experimental and quasi-experimental design. Boston: Houghton Mifflin Company.
Shulman, L. S. (1986). Those who understand: A conception of teacher knowledge. American Educator, 10, 9-15.
Tashakkori, A., & Teddlie, C. (1998). Mixed methodology: Combining qualitative and quantitative approaches. Thousand Oaks, CA: Sage Publications.
Urbina, S. (2004). Essentials of psychological testing. Hoboken, NJ: John Wiley & Sons.
Walker, T. L. (2010). Types of research data. Unpublished manuscript, Department of Education, Washington State University, Vancouver, WA. Retrieved from http://www.vancouver.wsu.edu/fac/walkert/types_of_research_data.htm
Whitehurst, G. J. (2003). Identifying and implementing educational practices supported by rigorous evidence: A user-friendly guide. Washington, D. C.: U. S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance. Retrieved June 1, 2009, from http://www.ed.gov/rschstat/research/pubs/rigorousevid/rigorousevid.pdf
Research design
a) Randomized experiment
b) Non-equivalent control group
c) Other ____________________________
d) Can’t tell

Group assignment mechanism
a) Random assignment
b) Haphazard assignment
c) Other nonrandom assignment
d) Can’t tell

Selection mechanism
a) Self-selected into groups
b) Selected into groups by others on a basis related to outcome (e.g., good readers placed in the expectancy group)
c) Selected into groups by others not known to be related to outcome (e.g., randomized experiment)
d) Can’t tell

If a validity estimate was given in the report, what was the source of the estimate?
a) Current sample
b) Citation from another study
c) Can’t tell
d) n/a, validity was not mentioned in the report

What was the nature of the validity estimate?
a) Concurrent validity
b) Convergent validity
c) Predictive validity
d) Other
e) Can’t tell (e.g., simply asserted that the measure is “valid”)
f) n/a, validity was not mentioned in the report

Metric for score reliability
a) Internal consistency
b) Split-half
c) Test-retest
d) Can’t tell
e) None given

Source of score reliability estimate
a) Current sample
b) Citation from another study
c) Can’t tell
d) None given

Type of study
a) Narrative/Historical
b) Biography
c) Design study
d) Phenomenology
e) Ethnography
f) Grounded theory
g) Case study

Describe the methodology. (Mark all that apply)
a) Covert observation
b) Overt observation
c) Interview
d) Focus group
e) Other________________________
f) Can’t tell

Was trustworthiness addressed?
a) No
b) Yes. Describe:
1) Persistent observation
2) Use of triangulation techniques
3) Peer debriefing
4) Negative case analysis
5) Referential adequacy
6) Member checks
7) Thick description
8) Dependability audit
9) Confirmability audit
10) Reflexive journal
11) Can’t tell
Research Design
• Type of Study
• Group Assignment
• Selection
• Validity/Trustworthiness
• Reliability
CFTK Aspects Explicitly Addressed for Teachers
a) Subject Matter
b) Pedagogical Knowledge
c) Orientation
d) Discernment
e) Individual Context
f) Environmental Context
g) Can’t Tell

CFTK Aspects Explicitly Addressed for Students
a) Subject Matter
b) Pedagogical Knowledge
c) Orientation
d) Discernment
e) Individual Context
f) Environmental Context
g) Can’t Tell

TPACK Stages Described (teachers only)
a) Recognizing
b) Accepting
c) Adapting
d) Exploring
e) Advancing
f) Can’t Tell