
TRANSCRIPT

  • TRIAL PERIODS & COMPLETION POLICIES: A COMPARATIVE STUDY OF VIRTUAL SCHOOLS IN THE UNITED STATES & CANADA

    Abigail Gonzales, Brigham Young University & University of Nevada, Las Vegas
    Dr. Michael K. Barbour, Wayne State University

  • Agenda

    Describe study
    Share findings
    Collectively discuss implications & future directions

  • State of Virtual Schools in U.S.

    Explosive growth
    Student population primarily supplementary
    Variety of types of virtual schools
      Statewide, virtual charter, multi-district/consortia, single-district, private, for profit, & university
    Geographic location
      High concentration in Western & Southeastern states
      Northeastern states slow adopters

  • State of Virtual Schools in Canada

    First virtual schools in 1993
    Some activity in all provinces and territories
      Most have extensive programs
      Only Prince Edward Island has very little activity
    Most have a combination of district-based and provincial programs

  • Challenges of virtual schooling

    Attrition is a significant problem (Carr, 2000; Lary, 2002; Rice, 2005)
    Multiple factors contribute to differences
      Non-learning related factors
      When we start counting students
      How we count them

  • Purpose of Study

    Examine variation in trial period policies in the US and Canada
      Variability across types of schools & geographic regions
    Examine variation in how US and Canadian schools define course completions
      Variability across types of schools & geographic regions

  • Significance of Study

    Is there a need to standardize?
      Cannot standardize metric without knowing current landscape
    Are policies adopted context specific?

  • Review of Literature

    Researchers call for standardizing performance measures (Smith et al., 2005; Pape et al., 2006; Watson et al., 2006)
    Limited research examining the two policies
      Pape et al. (2006) compared 3 virtual schools
        2 trial periods: 3 and 5 weeks
        2 defined completion as 60%, 1 used a qualitative tag
    Evidence trial periods can sift out weaker students (Ballas & Belyk, 2000; Cavanaugh, Gillan, Bosnick, Hess, & Scott, 2005; McLeod, Hughes, Brown, Choi, & Maeda, 2005)
    Course completion definitions affect retention rates (Pape et al., 2005; Roblyer, 2006)

  • Methods

    Sampling procedures
      159 US schools
      117 Canadian schools
    Email survey
      4 contact attempts (2 emails, fax, phone)

  • Methods

    Virtual school: a state approved / regionally accredited school offering credit through DL methods including the internet (Clark, 2001)
    School type taxonomy from Cavanaugh, Barbour, and Clark (2008)
    Regional divisions
      US: Watson & Ryan (2007)
      Canadian

  • US Geographical Regions

    Southeastern States
    Northeastern States
    Western States
    Central States

  • Canadian Geographical Regions

    Western Canada
    Central Canada
    Atlantic Canada

  • Sample by Region: US

    Region                 US Sample    US % of Sample
    Central States         41           25.5
    Northeastern States    18           11.2
    Southeastern States    33           20.5
    Western States         67           41.6
    Total                  159          100

  • Sample by Region: Canada

    Region           Canadian Sample    CA % of Sample
    Atlantic         9                  7.7
    Central          30                 25.6
    Western          77                 65.8
    Across regions   1                  .8
    Total            117                100%

  • Sample by School Type

    School type                        US     US %    Canada    Canada %
    Cyber Charter                      34     21.1    0         0
    For Profit                         9      5.6     0         0
    Multi-district                     11     6.8     4         3.4
    Private                            21     13      3         2.5
    Single district                    49     30.4    94        80.3
    State led                          24     14.9    4         3.4
    University led                     11     6.8     0         0
    Other (Aboriginal, Unknown, etc)   0      0       12        10.4
    Total                              159    100%    117       100%

  • Responses & Response Rates

    Total responses: 123
    US: 88 schools @ 55.3% response rate
    Canada: 35 schools @ 30% response rate

    [Chart: Response breakdown by country. United States: 86; Canada: 35]
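    As a quick arithmetic check on the response rates quoted above, a minimal Python sketch (the variable names are ours, not the study's):

```python
# Sketch: verify the response rates (responding schools / sampled schools).
us_sample, us_responses = 159, 88
ca_sample, ca_responses = 117, 35

print(f"US response rate: {us_responses / us_sample:.1%}")      # -> 55.3%
print(f"Canada response rate: {ca_responses / ca_sample:.1%}")  # -> 29.9% (~30%)
```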

  • Responses by School Type

    School type                   US    US %    Canada    Canada %
    Cyber Charter                 16    18.2    0         0
    For Profit                    1     1.1     0         0
    Multi-district                7     8.0     2         5.7
    Private                       13    14.8    2         5.7
    Single-district               26    29.5    28        80
    State led                     17    19.3    3         8.6
    University led                8     9.1     0         0
    Other (Aboriginal, unknown)   0     0       0         0
    Totals                        88    100%    35        100%

  • Representativeness by School Type

    School type                   US Sample %    US Response %    % Difference
    Cyber Charter                 21.1           18.2             2.9
    For Profit                    5.6            1.1              4.5
    Multi-district                6.8            8.0              -1.2
    Private                       13             14.8             -1.8
    Single-district               30             29.5             .5
    State led                     14.9           19.3             -4.4
    University led                6.8            9.1              -2.3
    Other (Aboriginal, unknown)   0              0                0

  • Representativeness by School Type

    School type                   Canadian Sample %    Canadian Response %    % Difference
    Cyber Charter                 0                    0                      0
    For Profit                    0                    0                      0
    Multi-district                3.4                  5.7                    -2.3
    Private                       2.5                  5.7                    -3.2
    Single-district               80.3                 80                     .3
    State led                     3.4                  8.6                    -5.2
    University led                0                    0                      0
    Other (Aboriginal, unknown)   10.4                 0                      10.4
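    The "% Difference" column in the two representativeness tables above is each school type's share of the sample minus its share of the responses. A minimal sketch of that calculation, using values transcribed from the US table (the dictionaries and loop are illustrative, not the study's code):

```python
# Sketch: representativeness = sample share (%) minus response share (%).
# Percentages are transcribed from the US table above.
us_sample_pct = {"Cyber Charter": 21.1, "For Profit": 5.6, "Multi-district": 6.8,
                 "Private": 13.0, "Single-district": 30.0, "State led": 14.9,
                 "University led": 6.8, "Other": 0.0}
us_response_pct = {"Cyber Charter": 18.2, "For Profit": 1.1, "Multi-district": 8.0,
                   "Private": 14.8, "Single-district": 29.5, "State led": 19.3,
                   "University led": 9.1, "Other": 0.0}

for school_type, sample_share in us_sample_pct.items():
    difference = sample_share - us_response_pct[school_type]
    print(f"{school_type:16} {difference:+.1f}")  # e.g. Cyber Charter +2.9
```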

  • Representativeness by Region

    Region                 US Sample %    US Response %    % Difference
    Central States         25.5           26.1             -.6
    Northeastern States    11.2           9.1              2.1
    Southeastern States    20.5           22.7             -2.2
    Western States         41.6           42               -.4

  • Responses by Region

    Region            Canada    Canada %
    Atlantic Canada   3         8.6
    Central Canada    11        31.4
    Western Canada    20        57.1
    Total             35        100%

  • Representativeness by Region

    Region            Canadian Sample %    Canadian Response %    % Difference
    Atlantic Canada   7.7                  8.6                    -.9
    Central Canada    25.6                 31.4                   -5.8
    Western Canada    65.8                 57.1                   8.7
    Across Regions    .8                   0                      .8

  • Trial Period Prevalence

    United States: Trial: 61, No trial: 27, Total: 88
    Canada: Trial: 12, No trial: 23, Total: 35

    [Charts: Trial period prevalence. United States: Yes 61 (69%), No 27 (31%). Canada: Yes 12 (34%), No 23 (66%)]

  • Trial Period Length

    United States: Range: 1-185, Mean: 19.59*
    Canada: Range: 3-112, Mean: 28.82*

    *Without extreme outliers. Difference significant @ p=.05
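    A brief sketch of the kind of comparison reported on this slide: a two-sample t-test on trial-period length (in days) for US vs. Canadian schools. The lists below are hypothetical placeholders rather than the study's raw data, and scipy is assumed; the original analysis may have used different software or settings.

```python
# Sketch: two-sample t-test on trial-period length (days), US vs. Canada.
# Data are illustrative placeholders, NOT the study's raw data.
from scipy import stats

us_trial_days = [1, 3, 7, 7, 10, 14, 14, 14, 21, 28, 30, 30]   # hypothetical
canada_trial_days = [3, 10, 14, 30, 30, 30, 60, 60]            # hypothetical

# Welch's t-test (does not assume equal variances between the groups)
t_stat, p_value = stats.ttest_ind(us_trial_days, canada_trial_days, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # difference significant if p < .05
```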

  • Trial period length in days (n=72)

    Days    US    Canada
    1       2     0
    3       1     1
    7       5     0
    8       1     0
    10      5     1
    14      12    2
    15      4     0
    20      1     0
    21      6     1
    28      8     0
    30      9     3
    35      2     0
    40      1     0
    45      0     1
    60      1     2
    >112    2     1

  • Trial period length variations by:

    School type:
      US: sig. @ p=.05, df(5), F=3.909
      Differences: private schools vs. state-led, cyber charter, and single-district
      Canada: no significant difference

    Geographical region:
      US & Canada: no significant difference
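    A minimal sketch of the school-type analysis this slide summarizes: a one-way ANOVA on trial-period length by school type, followed by a post-hoc Tukey HSD test. The data frame below is a made-up illustration, and pandas/scipy/statsmodels are assumed; the study itself may have used other tools.

```python
# Sketch: one-way ANOVA on trial length by school type, then Tukey HSD post hoc.
# Data are illustrative placeholders, NOT the study's raw data.
import pandas as pd
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

df = pd.DataFrame({
    "school_type": ["private"] * 4 + ["state_led"] * 4 + ["cyber_charter"] * 4,
    "trial_days":  [60, 45, 35, 30,  14, 14, 7, 10,  14, 21, 7, 10],
})

# One-way ANOVA across the school-type groups
groups = [g["trial_days"].to_numpy() for _, g in df.groupby("school_type")]
f_stat, p_value = stats.f_oneway(*groups)
print(f"F = {f_stat:.3f}, p = {p_value:.3f}")

# Tukey HSD identifies which pairs of school types differ significantly
tukey = pairwise_tukeyhsd(endog=df["trial_days"], groups=df["school_type"], alpha=0.05)
print(tukey.summary())
```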

  • Course Completion Definitions

    Grade irrelevant
    Grade relevant
    Other

  • Course Completion Definitions where Grade is Irrelevant

    Definitions                              US    US %     Canada    Canada %
    Remain in course 6 days beyond midterm   0     0        2         5.7
    Remain in course                         16    18.6     13        37.1
    Complete all/majority of coursework      11    12.8     8         22.9
    Totals                                   27    31.4%    23        65.7%

  • Course Completion Definitions where Grade is Relevant

    Definitions              US    US %     Canada    Canada %
    Pass the course (60%)    38    44.2     12        34.3
    Pass course & final      2     2.3      0         0
    Pass w/ D/64%            1     1.2      0         0
    Pass w/ C-/70%           6     7        0         0
    Pass w/ B-/80%           4     4.7      0         0
    Pass w/ A-/90%           1     1.2      0         0
    Totals                   52    60.6%    12        34.3%

  • Course Completion Definitions: Other

    Definitions                            US    US %    Canada    Canada %
    Mastery not defined by grade           1     1.2     0         0
    Individual schools define completion   4     4.7     0         0
    Totals                                 5     5.9%    0         0

  • Completion Definitions where Grade is Relevant vs. Irrelevant: United States vs. Canada

  • Course completion variations by:

    School type: US & Canada, no significant difference
    Geographical region: US & Canada, no significant difference

  • Findings Summary

    Trial Period Presence
      More prevalent in US

    Trial Period Length
      Canada had longer trial periods than US
      Most common lengths were 2 and 4 weeks
      Regional differences: not sig.
      School type: US sig. only (private schools)

  • Findings Summary

    Course completion definitions
      More stringent definitions in US
      US 66% grade relevant vs. Canada 34%
      US greater range in definitions than Canada

  • Implications: US and Canada

    What implications do you see this study having?
    Policy practices are inverse
    Future research: explore why and what drives policy adoption

  • Implications: United States

    Need common metrics for calculating attrition
      Best if same as bricks-and-mortar schools
    Gather data for internal and external reporting
      Internal = institutional metrics
      External = standardized metrics
    Determining metric easier since geography and school type factor little

  • Implications: Canada

    Small sample size = difficult to generalize
      Less variation, so less of a problem
    US implications may apply
      Internal/external reporting
      Geography and school type not significant

  • Participant Discussion

    How do you determine or set your trial period policies and completion definitions? What influences you?
    Should a common metric be established? Who would determine the standardized metric?
    What would be the optimal trial period / course completion policy?
    What other metrics / policies need standardization?
    Questions?

  • References

    Ballas, F. A., & Belyk, D. (2000). Student achievement and performance levels in online education research study. Red Deer, AB: Schollie Research & Consulting. Retrieved July 31, 2005, from http://www.ataoc.ca/files/pdf/AOCresearch_full_report.pdf
    Carr, S. (2000). As distance education comes of age, the challenge is keeping the students. The Chronicle of Higher Education, 46(23), A39-41.
    Cavanaugh, C., Gillan, K. J., Bosnick, J., Hess, M., & Scott, H. (2005). Succeeding at the gateway: Secondary algebra learning in the virtual school. Jacksonville, FL: University of North Florida.
    Cavanaugh, C., Barbour, M., & Clark, T. (2008, March). Research and practice in K-12 online learning: A review of literature. Paper presented at the annual meeting of the American Educational Research Association, New York.
    Clark, T. (2000). Virtual high schools: State of the states - A study of virtual high school planning and preparation in the United States. Center for the Application of Information Technologies, Western Illinois University. Retrieved July 4, 2005, from http://www.ctlt.iastate.edu/research/projects/tegivs/resources/stateofstates.pdf
    Lary, L. (2002). Online learning: Student and environmental factors and their relationship to secondary student school online learning success. Unpublished dissertation, University of Oregon.

  • References Continued

    McLeod, S., Hughes, J. E., Brown, R., Choi, J., & Maeda, Y. (2005). Algebra achievement in virtual and traditional schools. Naperville, IL: Learning Point Associates.
    Pape, L., Revenaugh, M., Watson, J., & Wicks, M. (2006). Measuring outcomes in K-12 online education programs: The need for common metrics. Distance Learning, 3(3), 51-59.
    Rice, K. L. (2006). A comprehensive look at distance education in the K-12 context. Journal of Research on Technology in Education, 38(4), 425-448.
    Roblyer, M. D. (2006). Virtually successful: Defeating the dropout problem through online school programs. Phi Delta Kappan, 88(1), 31-36.
    Smith, R., Clark, T., & Blomeyer, R. L. (2005). A synthesis of new research on K-12 online learning. Naperville, IL: Learning Point Associates.
    Tucker, B. (2007). Laboratories of reform: Virtual high schools and innovation in public education. Retrieved April 20, 2008, from http://www.educationsector.org/usr_doc/Virtual_Schools.pdf
    Watson, J. F., & Ryan, J. (2007). Keeping pace with K-12 online learning: A review of state-level policy and practice. Vienna, VA: North American Council for Online Learning. Retrieved September 23, 2007, from http://www.nacol.org/docs/KeepingPace07-color.pdf

    Ask the audience to introduce themselves & what they do. What interested them in attending? Does their institution have a trial period? How do they define course completions? Illustrate the wide variability even within the room.

    *From 2005 to 2007, state-led schools went from 21 to 42 according to Watson and Ryan's Keeping Pace with K-12 Online Learning. An estimated 700,000 (Tucker, 2005) to 1 million (Christensen & Horn, 2008) students participate in K-12 online learning. Michigan requires an e-learning component as a high school graduation requirement; other states may soon follow.

    The student population is primarily using online learning to supplement their brick-and-mortar courses, but there is a growing trend toward more full-time programs.

    *Reported attrition ranges: 12-40% (Lary, 2002); 50% (Rice, 2006); as low as 3% and as high as 70% (Roblyer, 2007).

    There is a wide range in what is reported. Retention rates are significant because they are a key metric in measuring the health and quality of a school.

    *Is there a need to standardize? Or have we already done this organically?

    There has been a call to standardize, but before you can do this it is important to know the current landscape. If there are policies institutions have gravitated to en masse, determining a standard metric becomes easier.

    Are these policies context specific?

    *In Pape et al.'s 2006 article, "Measuring outcomes in K-12 online education programs: The need for common metrics," she examined 3 virtual high schools: the Virtual High School Global Consortium (VHS), Illinois Virtual High School (IVHS), and Connections Academy (CA).

    The qualitative tag included intertwining metrics: attendance, participation, and performance (p. 55). These data were combined to calculate a qualitative tag depicting performance ranging from "Satisfactory" to "Alarm."

    Long trial periods can act as a sifting mechanism during which weaker students drop out, masking attrition rates for lower-performing students. In turn, virtual schools with generous trial periods would be able to report high retention rates because students who were having trouble, and would have likely struggled to complete the course, would have dropped out by the time the virtual school began counting them as students. This problem has been well documented in comparative studies examining the performance of virtual school students compared to brick-and-mortar students (Ballas & Belyk, 2000; Cavanaugh, Gillan, Bosnick, Hess, & Scott, 2005; McLeod, Hughes, Brown, Choi, & Maeda, 2005).

    *159 US schools located via the NACOL Clearinghouse list; state-led schools from Keeping Pace with K-12 Education 2007. Canadian schools selected based on… Email survey: 3 questions (2 open-ended), addressed to the principal / program director.

    Used Clark's 2001 definition of a virtual school: "a state approved and/or regionally accredited school that offers secondary credit courses through distance learning methods that include Internet-based delivery."

    *159 US schools located via the NACOL Clearinghouse list; state-led schools from Keeping Pace with K-12 Education 2007. Email survey: 3 questions (2 open-ended), addressed to the principal / program director.

    *Abby, for future reference:
      - Nunavut is NU
      - YT, NT, and NU are considered north of 60
      - Western Canada is BC, AB, SK, and MB
      - Central Canada is ON and QC
      - Atlantic Canada is NB, NS, PE, and NL (and you left NL off the map)
    -> don't worry about it for tomorrow, just for future reference (i.e., our eventual manuscript)

    *US: Single district, state led, and cyber charter schools accounted for 66.4% of virtual schools in our sample. *Canada: 29.9%

    Response breakdown by country: the majority of responses were from US schools.

    *Single district, cyber charter, and state led: 67%. Fairly representative set of responses compared to the sample set.

    *Exceptionally representative with regard to single district, less so with others.

    *Single district, cyber charter, and state led: 67%.

    Maybe overlay these on a map?

    *Single district, cyber charter, and state led: 67%.

    Maybe overlay these on a map?

    *US: Of the 88 schools surveyed, 27 schools had no trial period compared to 61 schools that had one. Canada: Of the 35 Canadian schools surveyed, 23 schools had no trial period compared to 12 schools that did.

    Trial periods were a much more common practice in the US than in Canada.

    However, Canada had several instances where the trial period was marked by an event, such as submitting the first assignment, taking the first quiz, or paying tuition, in contrast to the time-based trial periods common in the US.

    *

    A t-test comparing the two means indicated that there was a significant difference between US and Canadian trial lengths.

    With the extreme outliers present, the means shifted to 25.02 for the US and 35.75 for Canada; a t-test on these means found the difference was not significant.

    *US most common: 28-30 days (about 4 weeks), accounting for 28.3% of the sample. Next most common: 14-15 days (2 weeks, most commonly 14 days), accounting for 26.7% of the sample. Combined, those two accounted for over 50% of the sample (n = 33).

    Canada's most common was 30 days, or 4 weeks. Trial periods of 60, 30, and 14 days accounted for 7 of the 12 with trial periods (almost 60%); the rest were scattered throughout the range.

    *Ran one-way ANOVAs to see if there were any significant differences in trial length based on school type or region.

    For school type, found that the US was significant with an F value of 3.909.

    Did a post-hoc Tukey test to see which groups differed, and found the differences were between private schools and the state-led, cyber charter, and single-district schools.

    *Wide range, from remaining in the course past 6 days of the mid-term to passing the course at a mastery level of 90% or better.

    Combining the US and Canada: a total of 50 out of 112 use this approach (44.6%).


    Canada has significantly less variation in comparison with the US. This may be due to the fact that schools are organized and run at the national rather than state/regional level.

    *Policies in the US and Canada are the inverse in terms of trial period presence and completion definitions. With the exception of school type and region not being factors, Canada and the US are almost the reverse in terms of their policies in these two areas.

    *This study gives us evidence, beyond anecdote or guess, that variations are significant and that there is a need to standardize trial period policies and course completion definitions. We need to count students at the same time and in the same manner. Ideally, it would be best if we could align this with how brick-and-mortar schools calculate attrition/retention to allow for comparisons.

    *How do you determine or set your trial period policies and completion definitions? If a standardized metric were to be established, who should determine it?

    *