
Evaluating blended learning: a perplexing puzzle

Dr Victoria Daskalou, Department of Economics
Mr. George Lekatsas, Network Operation Center
University of Patras, Greece
Upatras Opatel Scientific Coordinator: Prof. N. Karacapilidis
TUMS workshop, Tehran, Iran, 2-5/12/2018

1

Scope of Presentation*

● Blended learning: overview of opportunities and challenges
● Evaluating blended learning
● Evaluation criteria
● Main evaluation frameworks
● Approaches for evaluating blended courses
● Quality in online and open education

*This presentation was inspired by the article Bowyer, J., & Chambers, L. (2017). Evaluating blended learning: Bringing the elements together. Research Matters, 23, 17-26.

2

The scope of this presentation is to present issues in evaluating blended learning. We start by answering “why” blended learning, giving an overview of opportunities and challenges. Then we address the “what” by explaining the dimensions of evaluating blended learning. First we present the evaluation criteria and briefly discuss the main evaluation frameworks, sharing some of our own experiences. Finally, we present the different methods for approaching “by whom” we should be evaluated.

By Giulia Forsythe @ Flickr, Public domain

3

Visual notes by Giulia Forsythe from an event held in the UK (#oucel12) try to define blended learning and present administrative challenges and sustainability issues. We can define blended learning as an effective integration of both online and face-to-face educational activities. Its main benefits include its active and collaborative learning nature, the strategic use of classroom time for both students and teachers, and improved learning outcomes for students. On the other hand, blended learning faces specific challenges, which can be summarized as policy issues related to the strategic planning of financial, technical and human resources, course scheduling and support. Moreover, blended courses need comprehensive instructor training and appropriate instructional design. Last but not least, they need an ongoing evaluation procedure.

Blended learning Opportunities

● Improved student outcomes
● Strategic use of classroom time
● Collaborative, active learning

(student, peers, instructors, community)

Challenges

● Policy issues: strategic planning of financial, technical and human resources, course scheduling, support

● Comprehensive instructor training + instructional design

● Ongoing evaluation

An effective integration of both online and face-to-face educational activities

4

These are all listed here.

Evaluating blended learning

Why? What is the purpose of evaluation? Improve student engagement, resources, or overall course quality?

Who should be involved? Lecturers, students, course leaders?

How and when should evaluation take place? Methods of data collection; during the course or at the end?

What should be evaluated? Teaching, learning, course outcomes, resources, quality of assessment?

Adapted from (Pombo & Moreira, 2012)

5

In this talk we will elaborate on evaluating blended learning. Four main questions arise during each evaluation process:
What is the purpose of evaluation? Do we want to improve student engagement, resources, or overall course quality?
What should be evaluated? Teaching, learning, course outcomes, resources, quality of assessment?
How and when should evaluation take place? Which are the methods of data collection? Will the evaluation be implemented during the course or at the end?
Who should be involved? Lecturers, students, course leaders?

Evaluation criteria

Combination of learning outcomes and measures of student satisfaction and student engagement

Measuring learning outcomes:

● Grades and marks, activity, attendance, and dropout rates. Caution with motivation!

Measuring student satisfaction:

● Self-report questionnaires to investigate students’ satisfaction based on personal experience

Measuring student engagement:

● “the active participation of students and staff and students working in partnership”, (SEHEJ). Behavioural, emotional, cognitive engagement (Trowler, 2010)

● Quantitative: measures of attendance and submission of work, Questionnaires

● Qualitative: interviews, focus groups, observations, etc.

6

But which are the evaluation criteria? Against what should we evaluate a blended course? In the related literature, evaluation criteria are a combination of learning outcomes and measures of student satisfaction and student engagement. For measuring learning outcomes we use grades and marks, activity, attendance, and dropout rates. Caution with motivation! We should take into account the learner's attitude towards learning when we consider the effectiveness of blended learning. Student satisfaction is important since it is based on the student's experiences during the blended course. We usually use self-report questionnaires to investigate students' satisfaction. For example: Does this course satisfy your learning needs? Finally, we measure student engagement. But what is student engagement really? Following the Student Engagement in Higher Education Journal (https://journals.gre.ac.uk/index.php/raise), we define student engagement as “the active participation of students and staff and students working in partnership”. We should pay attention to the following student engagement dimensions (Trowler, 2010):

1. Behavioural: relating to students’ actions. For example, class attendance, submission of work, contribution to class discussion,or participation in school-related activities (e.g., extra-curricular sports or school governance).

2. Emotional: relating to students’ affective reactions in relation to their learning. For example, an emotionally engaged student might report that they were interested in their course and that they enjoyed learning.

3. Cognitive: relating to students’ psychological investment in their learning. For example, the desire to go beyond the requirements of the class and the adoption of metacognitive learning strategies (thinking about my thinking/learning, self-regulated learning).

For measuring student engagement we use both:

● Quantitative: measures of attendance and submission of work, questionnaires
● Qualitative: unstructured interviews, focus groups, observations, etc.
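The quantitative engagement measures above can be computed directly from course records. Below is a minimal sketch of such a computation; the record field names (`sessions_attended`, `assignments_submitted`, `dropped_out`) are illustrative assumptions, not part of any framework discussed here.

```python
# Sketch: simple quantitative engagement measures for a blended course,
# computed from invented per-student records.

def engagement_summary(students, total_sessions, total_assignments):
    """Return attendance, submission and dropout rates (each in 0..1)."""
    n = len(students)
    attendance = sum(s["sessions_attended"] for s in students) / (n * total_sessions)
    submissions = sum(s["assignments_submitted"] for s in students) / (n * total_assignments)
    dropouts = sum(1 for s in students if s["dropped_out"]) / n
    return {"attendance_rate": attendance,
            "submission_rate": submissions,
            "dropout_rate": dropouts}

# Invented example data: three students, 10 sessions, 5 assignments.
students = [
    {"sessions_attended": 10, "assignments_submitted": 4, "dropped_out": False},
    {"sessions_attended": 8,  "assignments_submitted": 5, "dropped_out": False},
    {"sessions_attended": 2,  "assignments_submitted": 1, "dropped_out": True},
]
print(engagement_summary(students, total_sessions=10, total_assignments=5))
```

Such rates only complement, and never replace, the qualitative measures (interviews, focus groups, observations) listed above.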

Evaluation frameworks for blended learning

● Web-Based Learning Environment Instrument (WEBLEI)
● Hexagonal E-Learning Assessment Model (HELAM)
● E-Learning framework
● Technology Acceptance Model (TAM)
● Rubric-based frameworks
● Conceptual framework for evaluating blended learning

7

We will briefly describe the following main frameworks for evaluating blended learning:

● Web-Based Learning Environment Instrument (WEBLEI)
● Hexagonal E-Learning Assessment Model (HELAM)
● E-Learning framework
● Technology Acceptance Model (TAM)
● Rubric-based frameworks
● Conceptual framework for evaluating blended learning

Web-Based Learning Environment Instrument (WEBLEI) (Chang & Fisher, 2003)

● Questionnaire investigating students’ perceptions and experiences of online learning environments

● 4 scales:
1. Emancipatory activities [independence]: convenience, efficiency and autonomy
2. Co-participatory activities: flexibility, reflection, quality, interaction, collaboration and feedback
3. Qualia [experience]: looking at success, confidence, accomplishments and interest
4. Information structure and design: how well the course and learning materials are structured and designed

8

The WEBLEI framework, proposed by Chang & Fisher (2003), uses a questionnaire investigating students’ perceptions (beliefs) and experiences of online learning environments that consists of four scales. The first scale investigates student perceptions regarding emancipatory activities, i.e. student independence, related to convenience, efficiency and autonomy. Example: I can access the learning activities at times convenient to me (Likert scale: Always – Often – Sometimes – Seldom – Never). The second scale is related to co-participatory activities (flexibility, reflection, quality, interaction, collaboration and feedback). Example: The flexibility allows me to meet my learning goals. Qualia reflects the life experiences of individuals, looking at success, confidence, accomplishments and interest for learning in the blended environment. Example: I enjoy learning in this environment. Finally, the last scale of the questionnaire is about information structure and design and investigates how well the course and learning materials are structured and designed. Example: The subject content is appropriate for delivery on the Web.
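Scoring a WEBLEI-style questionnaire amounts to coding each Likert answer as a number and averaging the items that belong to each scale. The sketch below assumes a common Always=5 … Never=1 coding and an invented item-to-scale grouping; it is not the published instrument.

```python
# Sketch: scoring WEBLEI-style Likert responses per scale.
# The coding (Always=5 ... Never=1) and item groupings are assumptions.

LIKERT = {"Always": 5, "Often": 4, "Sometimes": 3, "Seldom": 2, "Never": 1}

def scale_scores(responses, scales):
    """responses: {item_id: answer}; scales: {scale_name: [item_ids]}.
    Returns the mean coded score per scale (only answered items count)."""
    out = {}
    for name, items in scales.items():
        vals = [LIKERT[responses[i]] for i in items if i in responses]
        out[name] = sum(vals) / len(vals)
    return out

# Invented grouping and answers for illustration.
scales = {"emancipatory": ["q1", "q2"], "co-participatory": ["q3"]}
responses = {"q1": "Always", "q2": "Sometimes", "q3": "Often"}
print(scale_scores(responses, scales))  # emancipatory mean = (5+3)/2 = 4.0
```

The same aggregation pattern works for the Qualia and information-design scales once their items are listed.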

Hexagonal E-Learning Assessment Model (Özkan & Koseler, 2009)

Source: (Özkan & Koseler, 2009) © 2009 Elsevier Ltd. Used for educational purposes under the fair use policy

● Perceived learner satisfaction

● Six dimensions
● Main instrument: questionnaire (+ focus groups for A1)

9

The HELAM model investigates perceived learner satisfaction in six dimensions. Its main instrument is a questionnaire, but a focus group was also used in order to investigate whether the system was promoted well.

HELAM: Six dimensions

1. Supportive issues: policy issues about the use of the elearning system. a. Example: If the use of U-Link was optional, I would still prefer to use U-Link as a supportive tool as it helps my performance in the module.

2. Learner’s attitudes: learners’ perspective towards elearning. a. Example: U-Link improves my success in the module.

3. Instructor’s attitudes: how students found instructors teaching. a. Example: The instructor follows up student problems and tries to find out solution via U-Link.

4. System quality: concerns system usability and technical characteristics. a. Example: The program directions and navigations are clear.

5. Information content quality: educational content. a. Example: Course content and presentation gain attention.

6. Service quality: issues about the service as a whole. a. Example: I do not experience any problems during registrations.

10

Six dimensions:

1. Supportive issues concern policy issues about the use of the elearning system. Example: If the use of U-Link was optional, I would still prefer to use U-Link as a supportive tool as it helps my performance in the module.

2. Learner’s attitudes concern the learners’ perspective towards elearning. Example: U-Link improves my success in the module.

3. Instructor’s attitudes explore how students found the instructor’s teaching. Example: The instructor follows up student problems and tries to find out solution via U-Link.

4. System quality concerns system usability and technical characteristics. Example: The program directions and navigations are clear.

5. Information content quality concerns the educational content. Example: Course content and presentation gain attention.

6. Service quality concerns issues about the service as a whole. Example: I do not experience any problems during registrations.

E-Learning framework

© B. Khan. Used for educational purposes under the fair use policy. Source: http://www.asianvu.com/bookstoread/framework/other.htm

● By Badrul Khan
● “provide guidance in the design, development, delivery and evaluation of open and distributed learning environments.”
● Eight dimensions

11

The E-learning framework, according to its creator Badrul Khan, is not restricted to evaluation. It “provide[s] guidance in the design, development, delivery and evaluation of open and distributed learning environments.” However, several authors use it for evaluation purposes. The framework consists of eight dimensions. Source: http://asianvu.com/bk/framework/?page_id=171

● The pedagogical dimension refers to teaching and learning. Issues: content analysis, audience analysis, goal analysis, media analysis, design approach, organization, and methods and strategies of e-learning environments.

● The technological dimension examines issues of technology infrastructure in e-learning environments: infrastructure planning, hardware and software.

● The interface design refers to the overall look and feel of e-learning programs. Interface design dimension encompasses page and site design, content design, navigation, and usability testing.

● The evaluation includes both assessment of learners and evaluation of the instruction and learning environment.

● The management refers to the maintenance of learning environment and distribution of information.

● The resource support dimension examines the online support and resources required to foster meaningful learning environments.

● The ethical considerations relate to social and political influence, cultural diversity, bias, geographical diversity, learner diversity, information accessibility, etiquette, and the legal issues.

● The institutional dimension is concerned with issues of administrative affairs, academic affairs and student services related to e-learning.

E-learning Framework: Dimensions

● Pedagogical: teaching & learning. Issues: analysis of content, audience, goals, media, design approach, organization, and methods and strategies.

● Technological: technology infrastructure. Infrastructure planning, hardware and software.

● Interface design: overall look and feel of e-learning programs. Page and site design, content design, navigation, and usability testing.

● Evaluation: both assessment of learners and evaluation of instruction and learning environment.

● Management: maintenance of learning environment and distribution of information.

● Resource support: online support and resources required to foster learning.

● Ethical considerations: social and political influence, cultural diversity, bias, geographical diversity, learner diversity, information accessibility, etiquette, and legal issues.

● Institutional: issues of administrative affairs, academic affairs and student services related to e-learning.

12

Technology Acceptance Model (Davis, 1993)

Questionnaire focusing solely on the technology aspects of blended learning and how they affect user satisfaction and course retention

Technology Acceptance Model (TAM). Adapted from (Davis, 1993)

13

[TAM diagram: external stimulus (system design features) → cognitive response (perceived usefulness, perceived ease of use) → affective response (attitude towards using) → behavioral response (actual system use)]

TAM, proposed by Davis (1993), specifies the causal relationships between system design features, perceived usefulness, perceived ease of use, attitude toward using, and actual usage behavior. Perceived usefulness (the degree to which a person believes that using a particular system would enhance their performance) and perceived ease of use (the degree to which a person believes that using a particular system would be free from effort) are two of the main predictors of system use. TAM uses a questionnaire and focuses solely on the technology aspects of blended learning and how they affect user satisfaction and course retention.
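A first, rough look at the TAM relationships on questionnaire data is to correlate the perceived-usefulness and perceived-ease-of-use items with attitude toward using. The sketch below uses invented 1-5 coded answers and a plain Pearson correlation; a real TAM study would use regression or structural equation modelling.

```python
# Sketch: Pearson correlation between TAM constructs, on invented data.
import statistics

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = statistics.mean(x), statistics.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) *
           sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

# Invented 1-5 coded questionnaire answers from five respondents.
usefulness  = [5, 4, 2, 5, 3]
ease_of_use = [4, 4, 2, 5, 3]
attitude    = [5, 4, 1, 5, 3]

print("usefulness vs attitude: ", round(pearson(usefulness, attitude), 2))
print("ease of use vs attitude:", round(pearson(ease_of_use, attitude), 2))
```

High positive correlations would be consistent with the model's claim that both perceptions drive attitude toward using the system.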

Rubric evaluation frameworks

“Rubrics usually contain evaluative criteria, quality definitions for those criteria at particular levels of achievement, and a scoring strategy”, Wikipedia

Criteria for elearning:
● learning, learner support, course organization, assessment, design and the use of technology

Rubric for Online Instruction, California State University, Used under Creative Commons Attribution 3.0 License

14

“Rubrics usually contain evaluative criteria, quality definitions for those criteria at particular levels of achievement, and a scoring strategy”, Wikipedia. Most rubric frameworks for evaluating elearning include criteria to evaluate learning, learner support, course organization, assessment, design and the use of technology. An example of the first part of such a rubric can be seen here.
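The three ingredients in that definition (criteria, quality levels, scoring strategy) map directly onto a small data structure. The criteria, level names and point values below are illustrative assumptions modelled on typical e-learning rubrics, not the California State University instrument itself.

```python
# Sketch: a minimal rubric scorer. Criteria, levels and points are invented.

RUBRIC = {  # criterion -> {achievement level: points}
    "learner support":     {"baseline": 1, "effective": 2, "exemplary": 3},
    "course organization": {"baseline": 1, "effective": 2, "exemplary": 3},
    "use of technology":   {"baseline": 1, "effective": 2, "exemplary": 3},
}

def score_course(ratings):
    """ratings: {criterion: chosen level}. Returns (total, maximum) points."""
    total = sum(RUBRIC[c][level] for c, level in ratings.items())
    maximum = sum(max(levels.values()) for levels in RUBRIC.values())
    return total, maximum

total, maximum = score_course({"learner support": "effective",
                               "course organization": "exemplary",
                               "use of technology": "baseline"})
print(f"{total}/{maximum}")  # 6/9
```

Real rubrics also attach a quality definition to each cell; the scoring strategy (here, a simple sum) is the part evaluators most often adapt.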

Conceptual framework for evaluating blended learning (1/2) (Bowyer & Chambers, 2017)

Source: (Bowyer & Chambers, 2017) © UCLES 2017. Used for educational purposes under the fair use policy.

Dependent variables: outcomes

Independent variables: in 3 spheres of influence:

1. Situation: wider context, institutional elements

2. Course organisation: design and planning, content, technology and assessment

3. Individual perspectives: learner and teacher elements + crucial features of communication, interaction and collaboration

15

The conceptual framework for evaluating blended learning of Bowyer & Chambers (2017) tries to bring all the elements together by proposing that we should see student outputs as dependent variables that are influenced by independent variables in 3 spheres:

1. Situation: wider context, institutional elements
2. Course organisation: design and planning, content, technology and assessment
3. Individual perspectives: learner and teacher elements + crucial features of communication, interaction and collaboration

Conceptual framework for evaluating blended learning (2/2) (Bowyer & Chambers, 2017)

Source: (Bowyer & Chambers, 2017) © UCLES 2017. Used for educational purposes under the fair use policy.

16

Here we can see their proposals for measuring student outcomes. For example, for measuring learner satisfaction we should measure emotional engagement using the questionnaire in Krause and Coates (2008), Intellectual engagement scale. E.g. I enjoy the intellectual challenge of subjects I am studying. Or the Upatras survey: How interesting do you find the course?

Upatras Student Survey

5 types of online questionnaires:

1. Undergraduate studies

2. Clinic

3. Laboratory

4. Upon graduation (alumni)

5. Postgraduate studies

Undergraduate Questionnaire: Dimensions

● Course attendance
● Textbooks
● Course notes
● Teaching
● Course difficulty
● Learning outcomes

Reported statistics: mean, standard deviation

17

Results from the Upatras Student Survey for undergraduate studies (Scale 1-5).

Source: Upatras information System for Quality Assurance

For example, using the Upatras student survey we would investigate student perspectives related to educational material and teaching.

One Minute Paper (OMP)*

Only 3 questions:

1. What was the most interesting thing you learned during this activity?

2. What questions remained unanswered?

3. Summarize the main point of today’s educational activity in one sentence.

An everyday tool for evaluating outcomes. Can be used in any type of educational activities for continuous evaluation.

*Thanks to my colleague M. Komninou for sharing course evaluation ideas

18

I enjoy implementing the One Minute Paper as an online questionnaire. It is called OMP because it takes only one minute for each student to complete. It consists of only 3 questions:

1. What was the most interesting thing you learned during this activity?
2. What questions remained unanswered?
3. Summarize the main point of today’s educational activity in one sentence.

I believe it is an everyday tool for any type of educational activity, facilitating a continuous evaluation procedure.

A tiny mid-term evaluation tool*

An easy tool to evaluate the instructor. Ask students to anonymously fill in their proposals right after these 3 words…

*Thanks to my colleague M. Komninou for sharing course evaluation ideas

Continue…..

Stop……

Start……..

19

Continuous evaluation is important. For example, in the middle of the semester we can activate on our elearning environment a tiny mid-term evaluation tool to capture student needs: keep doing the activities that learners like, stop doing things they do not like, and finally start doing new things that they would like to see from you.
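The anonymous continue/stop/start proposals only become useful once they are tallied for the instructor. A minimal sketch of that tally, on invented responses:

```python
# Sketch: tallying anonymous "continue / stop / start" mid-term responses
# into a short feedback summary. The responses below are invented.
from collections import Counter

responses = [
    ("continue", "weekly quizzes"),
    ("continue", "weekly quizzes"),
    ("stop", "long lectures"),
    ("start", "group projects"),
    ("start", "group projects"),
    ("start", "more examples"),
]

def summarize(responses):
    """Group proposals by category and count how often each was suggested."""
    summary = {"continue": Counter(), "stop": Counter(), "start": Counter()}
    for category, proposal in responses:
        summary[category][proposal] += 1
    return summary

for category, counts in summarize(responses).items():
    for proposal, n in counts.most_common():
        print(f"{category}: {proposal} ({n})")
```

Sorting each category by frequency surfaces the proposals most students agree on, which is exactly what a quick mid-term correction needs.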

Approaches for Evaluating Blended Courses (Savoie-Roskos et al., 2018)

● Combining different approaches:
○ Student evaluation and assessment
○ Peer evaluations
○ Instructional design evaluations

● Continuous evaluation: before, during, and after the course

20

According to Savoie-Roskos et al. (2018), student evaluation and assessment is not enough for blended courses. Instructors should also use peers to evaluate their course, and should ask for the guidance of instructional designers in a continuous evaluation procedure: before, during, and after the course.

Quality in online and open education

21

Three main areas related to quality in online learning, including e-learning

Source: (Ossiannilsson et al, 2015). Used under the Creative Commons-License

Very interesting Report:

Quality models in online and open education around the globe: State of the art and recommendations

https://www.icde.org/global-overview-of-quality-models

The review of international quality standard models illustrates that there are many existing schemes and models for quality assurance of open, distance, flexible and online education, including e-learning. They share many common features, and many are designed to offer flexibility for institutions to adapt to suit national and institutional contexts. The most common structure encountered presents criteria for performance in aspects of institutional management, curriculum design, student support and other elements of educational provision, with further subdivision into performance indicators and indications of sources of evidence. The most general categorisation of activities is Management (institutional strategy, visions, and resourcing), Products (processes of curriculum and module development) and Services (student and staff support, information resources, etc.). The report specifies that at least 17 models are commonly used.

Evaluation of training is and will always be a puzzle to solve...

Image by geralt@pixabay under CC0 Creative Commons

22

As you might have understood, the evaluation of training is and will always be a puzzle to solve. I would like to ask our colleagues from TUMS and any other partner to share their experience with us.

Questions?

23

Reference notes (1/2)

Bowyer, J., & Chambers, L. (2017). Evaluating blended learning: Bringing the elements together. Research Matters, 23, 17-26.

Chang, V., & Fisher, D. (2003). The validation and application of a new learning environment instrument for online learning in higher education. In Technology-rich learning environments: A future perspective (pp. 1-20).

Davis, F. D. (1993). User acceptance of information technology: system characteristics, user perceptions and behavioral impacts. International Journal of Man-Machine Studies, 38(3), 475-487.

Ossiannilsson, E., Williams, K., Camilleri, A. F., & Brown, M. (2015). Quality models in online and open education around the globe: State of the art and recommendations. Retrieved from https://www.pedocs.de/volltexte/2015/10879/pdf/Ossiannilsson_et_al_2015_Qualitymodels.pdf

25

Reference notes (2/2)

Özkan, S., & Koseler, R. (2009). Multi-dimensional students' evaluation of e-learning systems in the higher education context: An empirical investigation. Computers & Education, 53, 1285-1296.

Pombo, L., & Moreira, A. (2012). Evaluation framework for blended learning courses: A puzzle piece for the evaluation process. Contemporary Educational Technology, 3(3), 201-211.

Savoie-Roskos, M. R., Bevan, S., Charlton, R., & Israelsen Graf, M. (2018). Approaches to evaluating blended courses. Journal on Empowering Teaching Excellence, 2(1), Article 3. Retrieved from https://digitalcommons.usu.edu/jete/vol2/iss1/3

Trowler, V. (2010). Student engagement literature review. The Higher Education Academy, 11(1), 1-15.

26

Funding

● This educational material is developed within the project "OPATEL: Online Platform for Academic TEaching and Learning in Iraq and Iran", under the contract 73915-EEP-1-2016-1-DE-EPPKA2-CBHE-JP.

● The OPATEL project is funded by the Erasmus+ programme of the European Union.

● The European Commission support for the production of this material does not constitute an endorsement of the contents, which reflect the views only of the authors, and the Commission cannot be held responsible for any use which may be made of the information contained therein.

27

Note on History of Published Version

The present work is edition 1.0.

28

License Notes

The current material is available under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International license or later International Edition. The individual works of third parties are excluded, e.g. photographs, diagrams etc. They are contained therein and covered under their conditions of use in the section «Use of Third Parties Work Note».

[1] http://creativecommons.org/licenses/by-nc-sa/4.0/

As Non-Commercial is defined the use that:

● Does not involve direct or indirect financial benefits from the use of the work for the distributor of the work and the license holder.

● Does not include financial transaction as a condition for the use or access to the work.

● Does not confer to the distributor and license holder of the work indirect financial benefit (e.g. advertisements) from the viewing of the work on a website.

The copyright holder may give to the license holder a separate license to use the work for commercial use, if requested.

29

Preservation Notices

Any reproduction or adaptation of the material should include:

● the Reference Note,

● the Licensing Note,

● the declaration of Notices Preservation,

● the Use of Third Parties Work Note (if available), together with the accompanied URLs.

30