Advancing Assessment of Quantitative and Scientific Reasoning


Advancing Assessment of Quantitative and Scientific Reasoning
Donna L. Sundre
Amy D. Thelk
Center for Assessment and Research Studies (CARS)
James Madison University
www.jmu.edu/assessment/

Overview of talk
- Current NSF research project
- History of the test instrument
- Phase I: results from JMU
- Phase II: future directions
- Results from some of our partners: Michigan State, Truman State, Virginia State

Current NSF project
- 3-year grant funded by the National Science Foundation: Advancing Assessment of Scientific and Quantitative Reasoning
- Hersh & Benjamin (2002) listed four barriers to assessing general education learning outcomes: confusion, definitional drift, lack of adequate measures, and the misconception that general education cannot be measured
- This project addresses all of these concerns, with special emphasis on the dearth of adequate measures

Objective of NSF project
- Exploring the psychometric quality and generalizability of JMU's Quantitative and Scientific Reasoning instruments at institutions with diverse missions serving diverse populations

Partner institutions
- Virginia State University: state-supported; historically Black institution
- Michigan State University: state-supported; research institution
- Truman State University: state-supported; Midwestern liberal arts institution
- St. Mary's University (Texas): independent; Roman Catholic; Hispanic-serving institution

Project phases
- Phase I: first faculty institute (conducted July 2007 at JMU), followed by data collection, identification of barriers, and reporting of results
- Phase II: validity studies (to be developed and discussed during the second faculty institute, July 2008), dissemination of findings, and institutional reports

History of the instrument
- Natural World test, developed at JMU, currently in its 9th version
- Successfully used to assess General Education program effectiveness in scientific and quantitative reasoning
- Generates two subscores: SR and QR

Summary of results since 2001
- Table of Results -- 5 Test Versions.doc

Adaptation of an instrument
- The JMU instrument has been carefully scrutinized for over 10 years
- The QR and SR instrument is currently administered at over 25 institutions across the nation
- NSF funded this CCLI project to further study procedures for adoption and adaptation of instruments and assessment models

Evaluating the generalizability of the instrument

Step 1: Mapping items to objectives
- Relating test items to each institution's stated objectives
- In the past, a back-translation method was used (Dawis, 1987): ..\..\JMU\NSF Grant\Truman\Blank ObjectiveGrid_truman.doc
- Participants at the NSF Faculty Institute used a new content alignment method reported at NCME (Miller, Setzer, Sundre & Zeng, 2007)
- Forms were custom-made for each institution: Example Content Alignment form.doc

Early content validity evidence
- Results strongly support the generalizability of the test items:
  Truman State: 100% of items mapped to their objectives
  Michigan State: 98% (1 item not mapped)
  Virginia State: 97% (2 items not mapped)
  St. Mary's: 92% (5 items not mapped)
- Mapping of items alone is not sufficient; balance across objectives must also be obtained
- Teams then created additional items to cover identified gaps in content coverage: 14 for MSU; 11 for St. Mary's; 10 for Truman State; 4 for VSU

Step 2: Data collection and analysis
- During the Fall 2007 semester, the test was administered to students at 3 of the 4 partner institutions
- Spring 2008: data collection from students at the sophomore level or above

Results so far
- Means are not given: this activity is not intended to promote comparison of students across institutions
- At this stage, reliabilities provide the most compelling generalizability evidence; of course, the upcoming validity studies will be informative

Research at JMU
- Standard setting to aid in interpretation
- Validity evidence: the instrument aligns with the curriculum

Standard setting
- Used the Angoff method to set standards
- Our process was informal and unique
- Results look meaningful, but we'll reevaluate as we collect more data in upcoming administrations

Proportion of students meeting faculty objective standards

Objective     Faculty Standard*   Freshmen (no CL3 experience)   CL3 Package completers
Objective 1   0.80                0.203                          0.500
Objective 2   0.73                0.184                          0.365
Objective 3   0.76                0.187                          0.340
Objective 4   0.79                0.142                          0.256
Objective 5   0.75                0.147                          0.449
Objective 6   0.76                0.117                          0.263
Objective 7   0.78                0.199                          0.487
Objective 8   0.75                0.142                          0.340
QR-9          0.75                0.154                          0.436
NW-9 Total    0.76                0.135                          0.474

Validity evidence for instrument and curriculum at JMU
Validity evidence for instrument and curriculum at JMU -- 2

Phase II studies
Samples of upcoming studies:
- Correlational studies: Is there a relationship between scores on the QR/SR and other standardized tests? And other academic indicators?
- Comparison of means or models: Is there variation in the level of student achievement based on demographic variables? Is there a relationship between scores on the QR/SR and declared majors?
  Can this instrument be used as a predictor of success and/or retention for specific majors?
- Qualitative research: Will institutional differences be reflected in the results of a qualitative interview that accompanies the administration of the QR/SR?

References
Dawis, R. (1987). Scale construction. Journal of Counseling Psychology, 34, 481-489.
Hersh, R. H., & Benjamin, R. (2002). Assessing selected liberal education outcomes: A new approach. Peer Review, 4(2/3), 11-15.
Miller, B. J., Setzer, C., Sundre, D. L., & Zeng, X. (2007, April). Content validity: A comparison of two methods. Paper presented at the National Council on Measurement in Education, Chicago, IL.

Presenter notes
- History of back translation; problems
- Benefits of new method
- Show example form of each
- Keep in mind that although each school's objectives were related to general education science/math, each institution had a different set of objectives!
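The Angoff standard-setting results discussed in the talk lend themselves to a small numerical illustration. The sketch below shows the basic arithmetic behind an Angoff cut score (averaging judges' estimates of the probability that a minimally competent student answers each item correctly) and behind a "proportion of students meeting the standard" figure. All judge ratings and examinee scores here are hypothetical; JMU's actual process is described above as informal and unique, so this is only a generic Angoff sketch, not their procedure.

```python
# Illustrative Angoff-style calculation. All numbers are hypothetical,
# not data from the JMU study.

def angoff_cut_score(judge_ratings):
    """Average judges' item-level probability estimates into a cut score.

    judge_ratings[j][i] is judge j's estimate of the probability that a
    minimally competent student answers item i correctly. Each judge's
    ratings are averaged over items, then averaged over judges, giving a
    cut score on the proportion-correct scale.
    """
    per_judge = [sum(items) / len(items) for items in judge_ratings]
    return sum(per_judge) / len(per_judge)

def proportion_meeting(scores, cut):
    """Proportion of examinees whose proportion-correct score meets the cut."""
    return sum(1 for s in scores if s >= cut) / len(scores)

# Hypothetical ratings from three judges on a five-item objective:
ratings = [
    [0.90, 0.80, 0.70, 0.80, 0.80],
    [0.80, 0.70, 0.70, 0.90, 0.70],
    [0.85, 0.75, 0.80, 0.80, 0.75],
]
cut = angoff_cut_score(ratings)  # about 0.78, same scale as the table above

# Hypothetical examinee proportion-correct scores:
examinee_scores = [0.55, 0.82, 0.91, 0.60, 0.78, 0.88]
print(round(cut, 3), proportion_meeting(examinee_scores, cut))
```

The cut scores produced this way are on the same proportion-correct scale as the faculty standards in the table above, which is what makes the "freshmen" and "CL3 completers" columns directly comparable to the standard.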
