EVALUATING THE USE OF ONLINE
SYNCHRONOUS COMMUNICATION TO
ENHANCE LEARNING IN STATISTICS
Christine McDonald
BSc DipEd MLitt
A thesis submitted to fulfil the requirements for the degree of
Doctor of Education
in the
Faculty of Education
Queensland University of Technology
Brisbane
2013
Abstract
According to social constructivists, learners are active participants in constructing
new knowledge in a social process where they interact with others. In these social
settings teachers or more knowledgeable peers provide support. This research study
investigated the contribution that an online synchronous tutorial makes to support
teaching and learning of undergraduate introductory statistics offered by an
Australian regional university at a distance.
The introductory statistics course which served as a research setting in this study was
a requirement of a variety of programs at the University, including psychology,
business and science. Often students in these programs perceive this course to be
difficult and irrelevant to their programs of study. Negative attitudes and associated
anxiety mean that students often struggle with the content. While asynchronous
discussion forums have been shown to provide a level of interaction and support, it
was anticipated that online synchronous tutorials would offer immediate feedback to
move students forward through "stuck places". At the beginning of the semester the
researcher offered distance students in this course the opportunity to participate in a
weekly online synchronous tutorial which was an addition to the usual support
offered by the teaching team. This tutorial was restricted to 12 volunteers to allow
sufficient interaction to occur for each of the participants.
The researcher, as participant-observer, conducted the weekly tutorials using the
University's interactive online learning platform, Wimba Classroom, whereby
participants interacted using audio, text chat and a virtual whiteboard. Prior to the
start of semester, participants were surveyed about their previous mathematical
experiences, their perceptions of the introductory statistics course and why they
wanted to participate in the online tutorial. During the semester, they were regularly
asked pertinent research questions related to their personal outcomes from the
tutorial sessions. These sessions were recorded using screen capture software and the
participants were interviewed about their experiences at the end of the semester.
Analysis of these data indicated that the perceived value of the online synchronous
tutorial lies in the interaction with fellow students and a content expert, and in the
immediacy of the feedback given. The collaborative learning environment offered the
support required to maintain motivation, enhance confidence and develop problem-
solving skills in these distance students of introductory statistics. Based on these
findings a model of online synchronous learning is proposed.
Keywords
Community of Inquiry, Content Analysis, Distance Education, Online Learning,
Statistics Education, Synchronous Chat
Table of Contents
Abstract i
Keywords ii
Table of Contents iii
Declaration of Authorship vii
Acknowledgements viii
CHAPTER 1: INTRODUCTION 1
1.1 Background 2
1.2 Aims of the study 5
1.3 Research design 5
1.4 Significance of the study 6
1.5 Limitations of the study 7
1.6 Overview of the study 7
1.7 Role of the researcher 9
1.8 Overview of the thesis 9
1.9 Summary 10
CHAPTER 2: LITERATURE REVIEW 11
2.1 Theories of learning 11
2.1.1 Constructivism 11
2.1.2 Adult learning theories 14
2.2 Distance education and online learning 16
2.2.1 Theories of distance education 17
2.2.2 Technological affordances in distance education 18
2.2.3 Online learning in statistics 20
2.3 Computer-mediated communication 21
2.3.1 Online asynchronous communication 22
2.3.2 Online synchronous communication 23
2.4 Interactions in online learning 24
2.4.1 Social presence 28
2.4.2 Cognitive presence 28
2.4.3 Teaching presence 30
2.4.4 Applying the CoI framework in broader contexts 31
2.5 Summary of the literature review 32
CHAPTER 3: RESEARCH METHODOLOGY 33
3.1 Research design 34
3.1.1 Research setting 35
3.1.2 Participants 37
3.1.3 Sequence of the study 38
3.2 Research method 40
3.2.1 Data collection 40
3.2.1.1 Initial survey 41
3.2.1.2 Recordings of online tutorial sessions 41
3.2.1.3 Weekly research questions 42
3.2.1.4 Final interviews 43
3.2.2 Data analysis 45
3.2.2.1 Analysis of initial survey, weekly questions and final interviews 45
3.2.2.2 Analysis of interaction in the online synchronous tutorials 46
3.2.2.3 Community of Inquiry framework coding and analysis 46
3.2.2.4 Analysis using narrative description 48
3.2.3 Validity of the study 48
3.3 Summary of the research methodology 49
CHAPTER 4: SURVEY AND INTERVIEW FINDINGS 50
4.1 Initial Survey 50
4.1.1 Preconception of studying the introductory statistics course 51
4.1.2 Effect of prior experience of mathematics on learning of statistics 52
4.1.3 Previous experience of using discussion forums 53
4.1.4 Previous experience of online synchronous communication 54
4.1.5 Decision to participate in the online tutorial 55
4.2 Final Interviews 56
4.2.1 Participant demographics and reasons for participating 56
4.2.2 Interaction with other participants and the tutor 57
4.2.3 Perceived support of learning 58
4.2.4 Using the Wimba Classroom 59
4.2.5 Other interaction opportunities 61
4.2.6 Final comments 61
4.3 Summary of the survey and interview findings 62
CHAPTER 5: ONLINE SYNCHRONOUS TUTORIALS 63
5.1 Tutorial 1 66
5.1.1 Who attended Tutorial 1 66
5.1.2 Format and content of Tutorial 1 66
5.1.3 Interactions in Tutorial 1 69
5.1.4 Use of Tutorial 1 archive 72
5.2 Tutorial 2 72
5.2.1 Who attended Tutorial 2 72
5.2.2 Format and content of Tutorial 2 73
5.2.3 Interactions in Tutorial 2 75
5.2.4 Use of Tutorial 2 archive 76
5.3 Tutorial 3 76
5.3.1 Who attended Tutorial 3 77
5.3.2 Format and content of Tutorial 3 77
5.3.3 Interactions in Tutorial 3 79
5.3.4 Use of Tutorial 3 archive 81
5.4 Tutorial 4 81
5.4.1 Who attended Tutorial 4 81
5.4.2 Format and content of Tutorial 4 82
5.4.3 Interactions in Tutorial 4 84
5.4.4 Use of Tutorial 4 archive 85
5.5 Tutorial 5 86
5.5.1 Who attended Tutorial 5 86
5.5.2 Format and content of Tutorial 5 86
5.5.3 Interactions in Tutorial 5 89
5.5.4 Use of Tutorial 5 archive 92
5.6 Tutorial 6 92
5.6.1 Who attended Tutorial 6 92
5.6.2 Format and content of Tutorial 6 93
5.6.3 Interactions in Tutorial 6 95
5.6.4 Use of Tutorial 6 archive 98
5.7 Tutorial 7 98
5.7.1 Who attended Tutorial 7 98
5.7.2 Format and content of Tutorial 7 99
5.7.3 Interactions in Tutorial 7 101
5.7.4 Use of Tutorial 7 archive 103
5.8 Tutorial 8 103
5.8.1 Who attended Tutorial 8 103
5.8.2 Format and content of Tutorial 8 104
5.8.3 Interactions in Tutorial 8 107
5.8.4 Use of Tutorial 8 archive 109
5.9 Revision tutorial 109
5.9.1 Who attended the revision tutorial 109
5.9.2 Format and content of the revision tutorial 110
5.9.3 Interactions in the revision tutorial 113
5.9.4 Use of the revision tutorial archive 117
5.10 Content analysis using the Community of Inquiry framework 117
5.10.1 Tutorial 3 118
5.10.2 Tutorial 6 122
5.10.3 Tutorial 8 126
5.10.4 Summary of CoI analysis 130
5.11 Summary 131
CHAPTER 6: NARRATIVES 132
6.1 Sophie 133
6.2 Harry 137
6.3 Daniel 140
6.4 Jess 144
6.5 Summary of the narratives 149
CHAPTER 7: DISCUSSION AND CONCLUSIONS 150
7.1 The online synchronous environment 151
7.2 Describing interaction in the online tutorial 155
7.2.1 People interacting 155
7.2.2 Interaction with content 158
7.2.3 Technology affords interaction 159
7.3 Nature of the dialogue in the online tutorial 160
7.4 Perception of the value of the online tutorial 161
7.5 Teaching and learning statistical concepts online 162
7.5.1 A model of online synchronous learning 164
7.6 Future Directions 167
7.7 Postscript 167
REFERENCES 169
APPENDICES 179
A Initial survey questions 180
B Weekly research questions 181
C Final interview key questions 183
LIST OF FIGURES
2.1 Online interactions 25
2.2 Model of online learning showing types of interaction 26
2.3 Community of inquiry (CoI) framework 27
2.4 Practical inquiry model 29
3.1 Data collection process 39
5.1 Participants indicating answers on the virtual whiteboard 67
5.2 Writing on virtual whiteboard using tablet technology 68
5.3 Development of table of graphical and statistical summaries 69
5.4 Using diagrams to explain concepts 74
5.5 Interpreting SPSS output 79
5.6 Normal approximation to the binomial distribution 84
5.7 Guiding through the steps of the calculation 88
5.8 Working through a sign test informally 94
5.9 Formulae and calculations 102
5.10 Looking for keywords 106
5.11 Eliminating distractors 115
5.12 Participants contribute to writing formulae 116
5.13 Timeline of presences in Tutorial 3 120
5.14 Timeline of presences in Tutorial 6 124
5.15 Timeline of presences in Tutorial 8 128
7.1 Model of online synchronous learning 165
LIST OF TABLES
5.1 Student participation 65
5.2 Tutorial 3: Number of messages by participant and presence 119
5.3 Tutorial 3: Frequency (percent) of technology use by each participant 121
5.4 Relationship between presence and technology used during Tutorial 3 122
5.5 Tutorial 6: Number of messages by participant and presence 123
5.6 Tutorial 6: Frequency (percent) of technology use by each participant 125
5.7 Relationship between presence and technology used during Tutorial 6 126
5.8 Tutorial 8: Number of messages by participant and presence 127
5.9 Tutorial 8: Frequency (percent) of technology use by each participant 129
5.10 Relationship between presence and technology used during Tutorial 8 130
6.1 Details of narrative participants 132
Statement of original authorship
The work contained in this thesis has not been previously submitted to meet
requirements for an award at this or any other higher education institution. To the
best of my knowledge and belief, this document contains no material previously
published or written by another person except where due reference is made.
Signature:
Date:
Acknowledgements
As with all works of this nature there are a number of people to thank for their
invaluable assistance. First and foremost I would like to thank my principal
supervisor, Associate Professor Margaret Lloyd. Marg, you are absolutely
amazeballs! I truly appreciate your wise counsel and support through the rough
patches. And there were a few. Your constructive and instructive criticisms
throughout the whole process were irreplaceable. You knew just when to leave me be
and when to give me a prod. Your patience was ever-enduring.
I would also like to thank my associate supervisor, Dr Shaun Nykvist for stepping
into the role so late in proceedings. The teaching team in the Doctor of Education
programme in the Centre for Learning Innovation at QUT provided a strong
launching pad for my entry into the world of doctoral study. Thank you.
How do you thank those special people who were prepared to go under the
microscope of a case study? Words don't seem enough. To Participants S1 – S9 (you
know who you are), thank you for your generosity in participating in my study and
for your invaluable feedback on the process.
I wish to thank my colleagues at USQ who have been so generous with their support
of my endeavours, especially Rachel (and Ruby), Linda, Carola and Petrea. A very
special mention has to go to Associate Professor Birgit Loch, now of Swinburne
University, who started me on this journey so many years ago. Birgit, your
enthusiasm and persistence are monumental!
To my Mum and Dad, Thelma and Fred Argus, thank you for your support and
encouragement over many years. Dad, wish you were here. I know you would have
been so proud.
To my family and my inspiration, Nicole, Susan, Katharine and Cameron, thank you
for putting up with a Mum who is a perennial student.
Finally to my long suffering, patient husband, Peter, no longer will you hear "I just
need to finish this bit and then we can …"
Chapter 1: Introduction
Studying at a distance can give a student the freedom to study anywhere at any time;
it can, however, be a lonely and isolating experience. While adults are expected to be
independent learners, studying unfamiliar content that is also perceived to be
difficult can be quite daunting. This is exacerbated when they are required to study
particular content rather than choosing to do so. This research study investigated the
nature and impact of online synchronous communication in the teaching and learning
of one such course, namely, an undergraduate introductory statistics course,
comprising a one semester unit of study at an Australian regional university.
Contemporary distance courses are no longer fully reliant on printed study
packages. They generally have online components which range from printed study
materials available in an electronic form to be downloaded from a course website to
fully functional learning management systems that include multimedia materials and
provide opportunities for students to interact with their teachers and other students.
Some of these online materials and interaction opportunities provide additional
support while others are tied to assessment requiring students to participate in
mandatory asynchronous discussion forums.
Formalised online interaction with teachers and other students can negate the
freedom afforded by the distance study mode. It can, however, also reduce the
isolation felt by some students and support them when they are in most need. When
this interaction is not tied to assessment, students are free to choose to participate if
and when the need arises. When the course content is perceived to be difficult, as in
the instance described in this thesis, the immediacy of online synchronous interaction
can offer opportunities for students to collaborate in "meaning making" and in
moving forward from "stuck places" (Meyer & Land, 2005).
The undergraduate introductory statistics course investigated in this study is a
compulsory course in programs other than those in the statistics discipline area, for
example, psychology, business and science. Students in these programs often
consider statistics to be irrelevant to their programs of study. Past experience has
shown that studying statistics online tends to create a considerable degree of angst.
This negative attitude, coupled with anxiety about the perceived difficulty of this
course, means that a number of students struggle with it. While students who study
on campus have regular face-to-face assistance from teaching staff, those studying by
distance frequently attempt to work on their own. By offering these students the
opportunity to participate in an online synchronous tutorial, the researcher wanted to
provide greater support for their learning. In addition, conducting these tutorials
would allow the researcher to gain greater insights into the affordances provided by
this online medium in assisting distance students to successfully engage in and
complete a course of study in a quantitative discipline.
The purpose of this chapter is to introduce the study. First, the background to
the study is described by setting the scene and demonstrating the need for research in
the area of online synchronous support for learning quantitative content (Section
1.1). This is followed by the aims of the study including the research question
(Section 1.2) and research design (Section 1.3). Next, the significance of the study
and its contribution to the body of knowledge in this area are offered (Section 1.4),
while its limitations are outlined in Section 1.5. An overview of the study is
presented in Section 1.6 and the role of the researcher in Section 1.7. Lastly, an
overview of this document is provided in Section 1.8. The chapter concludes with a
short summary (Section 1.9).
1.1 Background
Teaching undergraduate introductory statistics to distance students at a regional
university in Australia since the early 1990s, the researcher has seen many changes
in the way that distance education has been delivered. The motivation for this
research developed in part from an enduring frustration with the level of support, or
lack thereof, given to distance students studying quantitative courses in a tertiary
education setting. Additional motivation came from a wish to formally investigate
the researcher's previous experiments in using online synchronous communication
tools in teaching to support the learning of students in statistical methods (Loch &
McDonald, 2007).
When the researcher began work at the University more than twenty years
ago, distance students were sent a study package consisting of:
(i) an introductory book outlining administrative procedures within a course and
detailing the assessment requirements; and,
(ii) a printed study book that referred to sections in a prescribed textbook and
sometimes elaborated on the content of the textbook.
Interaction with these students consisted of: written feedback in marked assignments
which were returned through the post; exchange of correspondence responding to
questions posed about the content; and, the occasional telephone call. Students were
left largely to their own devices to work through the materials. In this, distance
learning was reliant on the postal service for the distribution of printed material,
provision of assessments that required completed assignments to be posted to the
tutor and finally feedback to be posted back to the student. The time lapse between
these activities and the perceived difficulty of studying quantitative content meant
that students received limited and sporadic support to engage fully with and succeed
in acquiring the necessary skills to master the content.
With the advent and widespread use of email in tertiary institutions, students
gained an avenue for receiving more timely feedback from tutors. However, this was
still problematic because of the difficulties of including mathematical symbols and
diagrams in electronic form. With the introduction of teleconferencing, telephone
tutorials were conducted by teaching staff at the University. Through this medium,
students were able to converse with one another and the tutor via a telephone
"hookup" between several study centres across Queensland. However, interstate and
overseas students were excluded as costs were considered prohibitive. Despite the
capacity to do so, there was very little student-student interaction and "classes" were
typically instructor-led. The tutor often struggled while discussing the quantitative
content on the phone as much of it is visual in nature. These two digital forms of
communication had advantages over print and postal delivery, but were still
unsatisfactory in the teaching of statistical methods to distance students.
Learning management systems, such as WebCT, Moodle and Blackboard,
have led to the use of asynchronous discussion forums to enhance interaction in a
wide variety of discipline contexts. While discussion forums have been available for
a number of years now, their use in quantitative disciplines has been hampered by the
ongoing difficulties associated with writing mathematical symbols easily in a text
environment (Smith & Ferguson, 2005). This has generally been overcome by
attaching scanned documents of handwritten mathematics to a forum posting,
however this is cumbersome and time consuming. Discussion forums are largely
asynchronous in nature and, while allowing students flexibility to study where and
when they want, they lack the immediacy of synchronous communication, such as
teleconferencing, where feedback can be given and acted upon in "real time"
(Brannon & Essex, 2001).
The development of tablet technology, whereby it is possible to write rather
than type electronically in documents, has revealed a myriad of possibilities in
providing support for learning and teaching in the quantitative disciplines (Galligan,
Hobohm & Loch, 2012). Mathematical symbols can be directly written into emails
and electronic handwriting can be used in synchronous chat and videoconferencing
software. This enhances the opportunities for showing distance students how
mathematical reasoning is developed step-by-step in the moment and describing
statistical concepts using diagrams and graphs. As a result, it is now feasible to
scaffold the development of statistical concepts in real time in an online synchronous
environment.
In recent years, the researcher has been complementing the learning of
introductory statistics by distance students with online tutorials, that is, instructor-led
synchronous text chats. These tutorials were initially offered in an attempt to
overcome the barriers of unfamiliar language (terminology) and perceptions of
difficulty of this content. They used technologies that support the ability to
electronically "write" on the computer screen rather than just type, thus allowing
greater control over the use of mathematical symbols and diagrams. Their perceived
effectiveness is one of the main motivators for the researcher to formalise her
understandings of the role of synchronous technology in supporting student learning
in this field.
Offering greater support to distance students in quantitative disciplines
involves making available a variety of opportunities for students to engage with the
content, the tutor and one another. However, it is not sufficient to just provide
opportunities. As proposed through this study, a greater understanding of how these
opportunities operate and contribute to student learning of statistical concepts
through the use of these technologies is necessary to develop them to full capacity.
Through the potential of technology, that is, computer-mediated communication,
support for learning in quantitative disciplines using scaffolding within a
constructivist paradigm is now possible.
1.2 Aims of the study
The purpose of this study is to describe the affordances of an online synchronous
learning environment and to investigate the impact it may have on the teaching and
learning of quantitative content, in particular, in undergraduate introductory statistics
by distance learners. The research question posed for this study is
How does an online synchronous environment contribute to the teaching and
learning of statistical concepts by distance learners?
The aims of this study of an online synchronous learning environment are:
1. To describe the student-teacher, student-student and student-content
interaction in the teaching and learning of statistical concepts in this
environment;
2. To investigate the nature of the dialogue in this environment for this
discipline context;
3. To examine the student perceptions of the value of this environment to
learning of statistical concepts; and,
4. To formulate theoretical constructs pertaining to the teaching and learning of
statistical concepts in this environment.
1.3 Research design
Using a case study methodology, this research study explored the phenomenon of
supporting distance students learning undergraduate introductory statistics in the
context of an online synchronous environment (Baxter & Jack, 2008; see Section 3.1
for more details). Consistent with naturalistic inquiry, the researcher did not impose a
priori expectations on the outcome of the study (Lincoln & Guba, 1985).
Multiple methods of data collection and analysis were employed in order to
identify how the online synchronous tutorials supported learning in this quantitative
discipline context (Section 3.2). The data sources included: an initial survey, weekly
research questions, final interviews and online tutorial recordings including text chat
(Section 3.2.1).
Using the lens of the Community of Inquiry framework (Garrison, Anderson
& Archer, 2000; see Sections 2.4, 3.2.2.3 and 5.10), this study examined the
interactions taking place in online synchronous tutorials in the pursuit of knowledge
construction from a socio-cultural perspective. This model, consisting of the
elements of social presence, cognitive presence and teaching presence, provided a
more in-depth understanding of the learning process in the online synchronous
environment.
Analysis of the tutorial recordings provided insights into the interactions
taking place (see Chapter 5). In addition, thematic coding of initial survey responses,
weekly feedback from the participants, and final interviews was used to gain a
deeper understanding of the role that this online environment can play in supporting
distance students in their learning of introductory statistics (Section 3.2.2).
1.4 Significance of the study
Where this research study differs from other computer-mediated communications
studies in tertiary education settings is in: (i) the technology used, and (ii) the context
to which it is applied. Online synchronous tutorials and discussions, where
communication is primarily in text, have been used effectively in other disciplines
such as Teacher Education (Burnett, 2003). However, tutors and students have
struggled to effectively communicate mathematically in the online environment
(Smith & Ferguson, 2005). Although it is possible to type correct mathematical
symbols in Microsoft Word documents using Equation Editor, most students resort to
handwriting on paper and faxing through questions, or scanning a handwritten
document and emailing it to the tutor as an attachment unless it is required for a
formal assessment. In using a primarily text-based medium, a type of shorthand
notation has to be used in order to communicate mathematically. While this works to
some degree, it is far from ideal and can in fact be misleading. Even though
synchronous tools, particularly online text chat, have been available for some time,
inclusion of an electronic writing facility within the online chat environment,
pertinent to quantitative disciplines, is fairly recent (Loch & McDonald, 2007).
Further to this, with the advent of electronic handwriting in Chat software
such as Windows Live Messenger and interactive online learning platforms, such as
Elluminate or Wimba, it is now possible to engage students more interactively in
quantitatively-based content in an online synchronous environment. From the
researcher's perspective and through the experience documented in this section, there
is a clear role for this type of technology in the teaching of introductory statistics.
The study described in this thesis will contribute to the investigation into its most
appropriate use and the value it holds for students.
1.5 Limitations of the study
While there is potential for this study to inform practice, it needs to be acknowledged
that it is limited by the need to use volunteers and by the size of the tutorial group
enlisted. Its small scale reduces the generalisability of its findings. In light of these
issues, this study was restricted to
focussing on student perceptions of and contributions to the interaction taking place
in the online tutorials rather than attempting to evaluate learning and teaching
outcomes based on formal assessment of content knowledge. Other variables specific
to this study include the capacity and technical competence of the participants. That
participation in the online tutorials was voluntary may also bring a "halo effect" to
the findings of the study. The experience of the researcher (as participant-observer)
may also not be replicable in other studies.
1.6 Overview of the study
The study consisted of four stages: preparation, data collection, data analysis and
presentation of results. Even though these four stages required several years to
complete, data collection was conducted over a one semester offering of the
introductory statistics course. Although these four stages appear as distinct time
periods in the progress of the study, there was some overlap in when they occurred.
The preparation stage of the study included negotiation of appropriate work
conditions for the researcher to conduct the study during the teaching semester of the
data collection stage, obtaining ethics clearance, introduction of the researcher to
potential participants by the Course Leader, and enlistment of participants to the
study including formal consent to participate (Section 3.1). To prepare participants
for operating effectively within the synchronous online environment an initial
tutorial was scheduled to introduce the participants to the functionality of the
interactive online learning platform used in this study, Wimba Classroom.
For the researcher, data collection involved administering the initial survey,
teaching in and recording the online synchronous tutorials across a period of nine
weeks of the semester, posing the weekly research questions and conducting the final
interviews with individual participants (Section 3.2.1). The initial survey elicited
attitudes to the prospect of studying introductory statistics and expectations of the
contribution that the online synchronous tutorial would make to their
studies. The weekly research questions included student perceptions of what was
personally achieved by participating in the previous week's tutorial and gauged how
they were feeling about the course as the semester progressed. The final interview
investigated participants' reasons for participating in the online synchronous tutorials
and their views on the contribution that the tutorials made to their learning of
introductory statistics.
In the data analysis stage of the study, thematic coding of the initial survey
and the final interviews informed interpretation of student perceptions of the value of
the online synchronous tutorials (Section 3.2.2). Applying "thick description" to
these tutorials, through a detailed account of the activity taking place and the context
in which it occurred, and incorporating the perceptions of the "actors", informed
understanding of the contribution of these tutorials to the teaching and learning of
introductory statistics by distance students (Stake, 1995). This description included
details of student-student, student-teacher and student-content interactions in the
building of a community of inquiry to foster collaborative learning consistent with
that proposed by Garrison, Anderson and Archer (2000). Vignettes of several
participants‘ individual experiences of the tutorials added further to understanding of
the case.
As the final stage of the study, results were collated into three findings
chapters (Chapters 4, 5 and 6). These chapters addressed the first three aims of the
study related to the types of interaction, the nature of the dialogue and student
perceptions of the value of the online synchronous tutorial to learning of statistical
concepts. The final aim of the study was addressed in the discussion chapter where a
model of online synchronous learning was proposed (Chapter 7).
1.7 Role of the researcher
As noted, the researcher has taught in the course investigated by this study for more
than two decades and is therefore thoroughly experienced in both face-to-face and
traditional asynchronous distance teaching of the course content. However, during
the conduct of this study, she was not a member of the official teaching team nor did
she take a role in the marking of student work, so that participants could be assured
that their involvement in the study would in no way negatively influence their end-
of-semester results for the course.
The researcher, as participant-observer, facilitated voluntary weekly online
synchronous tutorials, administered the initial survey, the weekly research questions
and final interviews. As participant-observer, the researcher had the advantageous
position of being "inside" the case to maximise what could be learned, but had to be
careful not to unduly influence the outcome of the study. Students were fully aware
of her role as researcher, since this information was made known in the initial call for
volunteers to participate.
1.8 Overview of the thesis
This chapter (Chapter 1) has introduced the study. The following chapter, Chapter 2,
will present a review of literature relevant to the study. Since the focus of this study
is on interaction in the learning of introductory statistics in a higher education setting
using an online synchronous environment, literature pertaining to learning theory, in
particular constructivism and adult learning (Section 2.1), distance education
(Section 2.2), computer-mediated communication (Section 2.3) and interaction in
online learning (Section 2.4) forms the main components of this review.
The methodology and research design for the study are elaborated in Chapter
3. As noted in Section 1.3, a case study approach was employed to inform the
strategies of inquiry including the types of data collected (Section 3.2.1), the analysis
used (Section 3.2.2) and the interpretation of the data. The research design, including
the research setting (Section 3.1.1), the enlistment of participants (Section 3.1.2) and
the sequence of the study (Section 3.1.3), is described in detail. The validity of the study
is addressed in Section 3.2.3.
The findings of the study are presented in three chapters. Chapter 4 presents
outcomes from the initial survey and final interviews. The initial survey included
preconceptions of studying the introductory statistics course (Section 4.1.1),
students' prior experience with mathematics (Section 4.1.2), their involvement in
online discussion forums (Section 4.1.3), their prior experience of online
conferencing in other courses (Section 4.1.4), and their reasons for participating in
the online tutorial in introductory statistics (Section 4.1.5). The final interviews
included demographics of the participants and why they chose to participate in the
online tutorial (Section 4.2.1), participant views on interaction with other participants
and the tutor/teacher (Section 4.2.2), their perceptions of how the online tutorial
supported their learning (Section 4.2.3), using the interactive online learning
platform, Wimba Classroom (Section 4.2.4) and interaction outside this Classroom
(Section 4.2.5).
Following this, Chapter 5 gives detailed description of each online
synchronous tutorial in terms of who attended, format and content, interactions
taking place and use made of the archives (Sections 5.1 – 5.9), with further insights
using the lens of the Community of Inquiry framework (Section 5.10).
Individual student stories in relation to their contributions to and experience
of the online synchronous tutorials are developed in Chapter 6. The thesis concludes
with a discussion of the outcomes of the study, conclusions derived from the study, a
proposed model of online synchronous learning and recommendations for further
research (Chapter 7).
1.9 Summary
This study sets out to explore and describe the use of the online synchronous learning
environment and the contribution that it may make to the teaching and learning of
quantitative content by distance learners, in particular undergraduate introductory
statistics. It is intended that the outcomes of this study will inform practice in the
provision of effective support to distance students of quantitative content, in
particular undergraduate introductory statistics, with the view to improve the student
experience.
This chapter has served to briefly introduce the motivation, setting and
context of this research study. The following chapter provides a review of the
research literature that has informed the research design and analysis of the study
described in this thesis.
Chapter 2: Literature review
The literature review presented in this chapter highlights the key elements that
inform the teaching and learning context of this study. It begins with a broad
overview of the established theories of learning (Section 2.1) generally associated
with online learning in the tertiary sector, that is, constructivism (Section 2.1.1) and
adult learning (Section 2.1.2). The chapter then moves to a detailed examination of
distance education, with particular reference to online statistics education (Section
2.2), and computer-mediated communication (Section 2.3). The final section of the
chapter focuses on interaction in online learning (Section 2.4). A short summary
concludes the chapter (Section 2.5).
2.1 Theories of learning
There is a range of contrasting learning theories and philosophical approaches to
education (Bigge & Shermis, 1999). Which learning theory to adopt ultimately
depends on the context in which the learning is taking place – the teacher, the
learners and the content to be introduced. In the context of this study, that is, of adult
learners engaged in tertiary distance studies, the most relevant theories are
constructivism (Section 2.1.1) and adult learning or andragogy (Section 2.1.2).
Constructivism has been selected because of (i) the researcher's emphasis in her own
teaching on the active engagement of learners in creating new knowledge; and, (ii)
the affordance for the development of community made possible by the use of online
technologies (to be explored later in this chapter). The attention to adult learning is
justified by the age of students in the cohort under investigation in this study, that is,
students enrolled in university courses.
2.1.1 Constructivism
According to constructivists, learners are active participants in constructing new
knowledge rather than passive acquirers of knowledge through memorising or having
it simply presented to them (Larochelle, Bednarz & Garrison, 1998). Learners
construct their knowledge by reflecting on and making sense of their own
experience, and this requires online instructors to be aware of the strengths of the
online synchronous medium by being proactive in enabling rather than directing
learning (Burnett, 2003). Further to this, social constructivists believe that learning
is a social process, where learners interact with others through testing and
challenging their understandings (Bates & Poole, 2003).
One approach to justify the use of a primarily constructivist approach to
learning in the context of this study is to contrast the two major families of learning
theories (Bigge & Shermis, 1999), namely: (i) S-R (Stimulus-Response) conditioning
theories of the behaviourist family, including theorists such as Watson, Thorndike
and Skinner; and, (ii) interactionist theories of the cognitive family into which
constructivist learning theory falls and includes theorists such as Piaget, Dewey and
Vygotsky.
S-R (Stimulus-Response) conditioning theories view learning as a change in
observable behaviour caused by environmental influences, that is, a change in the
strength of S-R connections (Bigge & Shermis, 1999). The association of a stimulus
with a response depends on an appropriate method of reinforcement through reward
or punishment. Behaviourists take a more scientific stance, maintaining a high degree
of objectivity and rejecting references to unobservable states such as feelings,
attitudes and consciousness (Bates & Poole, 2003). For example, Watson, using a
strongly scientific approach, suggested that learning was a process of building
conditioned reflexes through the substitution of one stimulus for another (Bigge &
Shermis, 1999). Similarly, Thorndike's "connectionism" linked mental functions to
biological connections (Thorndike, 1931/1968), wherein S-R connections were made
through random trial-and-error and more pleasurable responses would become fixed
(Bigge & Shermis, 1999). Learning was believed to take place through repetition and
reward to correct responses and the gradual elimination of inappropriate responses
(Knowles, Holton III & Swanson, 2005; Smith, 2001). Lastly and perhaps best
known in the behaviourist family is Skinner's theory of operant conditioning. This
theory describes learning as a process in which a response is made more frequent by
arranging reinforcing consequences (Skinner, 1976).
In diametric contrast to behaviourism, cognitive interactionist theories point
to learning as a process of gaining or changing insights, outlooks, expectations, or
thought patterns (Bigge & Shermis, 1999), that is, gaining understandings. Piaget's
studies of the nature of children of different ages related their innate developmental
stages to their acquisition of knowledge (genetic epistemology). Children start at the
sensorimotor stage, where they interact only directly with their environment, and
progress through several stages until, in the formal operations stage, they are able
to apply thought and reason (Piaget, 1969/1971). Each stage builds on the
previous stage, reconstructing cognition at the more advanced level. This involves
the key processes of assimilation, where new knowledge meshes with existing
insights, and accommodation, where these internal insights are reconstructed to
reflect or accommodate the change (Bigge & Shermis, 1999). Dewey similarly
proposed the idea of growth through learning experiences suggesting that learners,
facilitated by a teacher, have an active role in the learning process by incorporating
the exchange of ideas in a social context (Dewey, 1940/1969). Such processes were
seen to lead to a growing ability to organise, analyse and synthesise (Cross-Durant,
2001; Knowles et al., 2005).
However, the learning theory of most interest to this study is that of
Vygotsky. As a social constructivist, Vygotsky advocated that social interaction was
fundamental to the development of cognition. He proposed that a child learns
through shared problem-solving experiences. Vygotsky's Zone of Proximal
Development (ZPD) can be defined as the distance between a child‘s actual level of
development and some higher level of potential development which can be attained
under adult guidance or in collaboration with more capable peers (Hung & Chen,
2001; Maurino, 2007). In other words, Vygotsky's ZPD links "what is" to "what can
be" – challenging the learner (with help) to progress beyond where the learner feels
comfortable, thus allowing learning and cognitive development to take place. The
size of the zone set by the teacher is vital to the progress of the learner across the
ZPD (Maurino, 2007). The social environment that is created provides the necessary
scaffolding for the learner to progress. In other words, scaffolding is created by the
adult (teacher) to help the learner to move across the ZPD. Bigge and Shermis (1999)
suggested that "a first component of scaffolding is the engagement of children in
interesting, culturally meaningful collaborative problem-solving activities" (p. 130).
There is a need to create a common ground for communication which then leads to
mutual support. While Vygotsky was primarily referring to the learning of children,
this approach can be equally applied to novice adult learners, particularly those who,
as in this study, are anxious about the content with which they are required to
engage. This is discussed in greater detail in the following subsection.
2.1.2 Adult learning theories
While the theories introduced in the previous subsection (Section 2.1.1) were
developed to describe the learning of children, they are equally applicable to adult
learning particularly where the tutor and more capable peers mentor the learning of
novice adult learners. It remains important, however, to consider understandings,
however disputed, in and around how adult learning differs from that of children.
Foremost in theories of adult learning is andragogy. Although its origins are
ascribed to the 19th century, Knowles (1950) advanced the concept of andragogy to
describe the learning of adults and to contrast it with "pedagogy" as the learning of
children. It is based on the understanding that:
Andragogy assumes that the point at which an individual achieves a self-
concept of essential self-direction is the point at which he [/she]
psychologically becomes adult. A very critical thing happens when this
occurs: the individual develops a deep psychological need to be perceived
by others as being self-directing. Thus, when he [/she] finds him[/her]self in
a situation in which he [/she] is not allowed to be self-directing, he [/she]
experiences a tension between that situation and his[/her] self-concept. His
[/her] reaction is bound to be tainted with resentment and resistance.
(Knowles, 1978, p. 56)
Elsewhere, Knowles (1990) described the distinguishing characteristics of adult
learners. These are:
1. The need to know — adult learners need to know why they need to learn
something before undertaking to learn it.
2. Learner self-concept — adults need to be responsible for their own decisions
and to be treated as capable of self-direction.
3. Role of learners' experience — adult learners have a variety of experiences of
life which represent the richest resource for learning. These experiences are
however imbued with bias and presupposition.
4. Readiness to learn — adults are ready to learn those things they need to know
in order to cope effectively with life situations.
5. Orientation to learning — adults are motivated to learn to the extent that they
perceive that it will help them perform tasks they confront in their life
situations.
While supported by anecdotal experience, andragogy and Knowles' (1950,
1978, 1990) assumptions are disputed because they define adulthood as a social
construction rather than in terms of physical or cognitive development (Pogson &
Tennant, 1995). The extent to which these assumptions are
characteristic of adult learners only is still debated (Merriam, 2001). Adult
experience is a far less predictable indicator of learning than a child‘s chronological
age or stage of development. Notably, an adult's "more and deeper life experiences
may or may not function positively in a learning situation" (Merriam, 2001, p. 5).
However, there has been a move from a dichotomous view of andragogy and
pedagogy to more of a continuum from teacher-directed to student-directed learning
(Merriam, 2004).
The theory of adult learning has been expanded to include the concepts of
transformative learning and self-directed learning. Mezirow's theory of
transformative learning is considered to be a theory of adult learning where prior
interpretation is transformed into a new interpretation via the learning process
(Mezirow, 2003; Taylor, 2008). Independent thinking can be considered as the goal
of transformative learning with critical reflection by the learner being central to the
process (Merriam, 2004). However, as understanding of what constitutes
transformative learning has developed, its emphasis has broadened to include
context and not just individuation. A more holistic approach indicates an appreciation of the role of
relationship with others in the process of transformative learning (Taylor, 2008).
Similarly, a self-directed learning experience can be described by a number of
models (Merriam, 2001). Self-directed learning can be defined by the learner's
capacity to be self-directed, the transformational nature of the learning and the
promotion of socio-political action. In formal educational contexts it is often aligned
with a shift from a teacher-centred to a learner-centred approach (Garrison, 2003). In
the distance education context, the theory of self-directed learning was originally
consistent with the concept of independent study and autonomy as defined by Moore
(2007) in his theory of transactional distance (as discussed further in Section 2.2.1).
From a collaborative constructivist perspective, Garrison (2003) identified three core
components of self-directed learning as self-management, self-monitoring and
motivation. However, he qualified this by pointing out that "without appropriate
support and guidance, learners may not persist or achieve the desired educational
outcomes" (p. 165). With this in mind, and with the advancement of computer-mediated
communication, Garrison noted that the concept of self-directed learning needs to
inform the teaching and learning transaction within a critical community of inquiry
(see Section 2.4 for more details).
While there is no one all-encompassing model of adult learning, each of these
theories has contributed to an understanding of the characteristics which define
adult learning and, as such, can be used to inform decisions made about how best to
support adult learners in formal educational environments. In particular, aspects of
these theories will contribute to the discussion of issues related to the teaching and
learning in the distance education setting of this study.
2.2 Distance education and online learning
Distance education can be defined as any form of structured learning in which the
student and the teacher are in separate physical locations (Bates & Poole, 2003). In
some cases, the teaching and learning can also be taking place at separate times.
Gunawardena and McIsaac (2004) pointed out that "such programs are particularly
beneficial for the many people who are not financially, physically or geographically
able to obtain traditional education" (p. 356). Bates (2005) similarly suggested that
distance education is ideal for learners who travel frequently or who have erratic
schedules.
Distance education in the form of correspondence study was developed in the
1800s as an alternative to traditional education where teacher and students would
meet at the one location. Although viewed as a poor substitute, it provided
educational opportunities for those who were not counted among the elite of that era.
Such education, in the form of guided independent study supported by print-based
materials with minimal student-teacher interaction, suffered from low completion
rates. In more recent times, distance education is seen to be filling the needs of
continuing education and lifelong learning by providing more flexible learning
environments through the affordances of digital technologies (Gunawardena &
McIsaac, 2004).
2.2.1 Theories of distance education
Theories of distance education are strongly allied to the differing models that it has
adopted over time. These models are, in turn, a product of both the dominant
pedagogies and the technologies of their era. Identified theories of distance education
include: (i) an industrialised perspective (Peters, 2007); (ii) teaching-learning
conversations (Holmberg, 2007); and (iii) transactional distance (Moore, 2007).
Industrialised perspective: The industrialisation of education (Peters, 2007) is more
an organisational theory to inform the mass production and distribution of
instructional packages, the division of labour, and extensive use of technical media,
making it possible to instruct large numbers of students where and when they want it.
Self-direction and self-motivation were core to the success of this model of distance
education (Section 2.1.2). Garrison (2000) noted that this dominance of
organisational issues over teaching and learning highlighted the need to choose
between independence and interaction. However, he added that this has become less
of an issue with the advent of computer-mediated communication which has made
both possible.
Teaching-learning conversation: In contrast to, though not opposed to, the industrial
model is an empathetic approach to distance learning (Holmberg, 2007). This
approach is more strongly aligned with the teaching-learning conversation whereby
pre-produced correspondence study guides are written in a conversational style in
order to engage and motivate the learner. These teaching-learning conversations
make the information in complicated or formal academic texts more accessible. Real
interaction in the form of written feedback to assignments completes a
"conversation". This approach focuses on the importance of discourse as opposed to
an emphasis on independence and autonomy of the distance learner.
Transactional distance: Transactional distance theory provides a framework for
defining distance education in pedagogical terms, where teaching and learning in
separate locations can be better understood (Moore, 2007). The transaction in
distance education relates to the connection between teachers and learners in a
situation where they are in geographically different locations. However, transactional
distance is a relative measure that describes the degree of connectedness between the
teacher and learner unrelated to geographical distance.
This degree of connectedness is measured by the balance between three sets
of "macro factors": structure, dialogue and autonomy (Moore, 2007, p. 90). The
structure relates to the nature of the teaching program being presented including
learning objectives, presentation of information, activities, exercises and tests. A
highly structured program would be one where there is little allowance for deviation
from the designated pathway through the content and educational process, thus
reducing the level of self-direction, minimising self-motivation and so not allowing
for individual differences in student needs. Dialogue refers to the interchange
between teachers and learners for the creation of knowledge. While the extent and
nature of the dialogue is influenced by a number of factors, one overarching
contributing factor is the structure that has been applied to the program of study
(Moore, 2007). Other factors include the medium of communication, the subject
matter, personalities of participants, culture and language. Learner autonomy,
consistent with the notions of andragogy and self-directed learning (Section 2.1.2),
refers to the level of self-management that a learner is expected or permitted to
exercise in the learning context in relation to motivation, setting goals, the learning
process and the evaluation of the learning that has taken place. Moore proposed that
transactional distance is determined by the interaction of these three macro factors.
However, Gorsky and Caspi (2005) asserted that empirical studies that attempted to
support or to validate transactional distance theory only partially did so and lacked
reliability and/or construct validity. Further, they suggested that the theory could be
reduced to a single proposition – as the amount of dialogue increases, transactional
distance decreases.
These theories provide an informative starting point for the study described in
this thesis. Through its use of synchronous technologies it eschews the industrialised
approach and attempts to establish a teaching-learning conversation more akin to the
social constructivist teaching approach framed by Vygotsky (Section 2.1.1), adopted
by the researcher. In this, it will attempt to map the transactional distance between
learners and teachers. This notion of transaction is examined in greater detail through
other models of interaction (Section 2.4).
2.2.2 Technological affordances in distance education
Distance education has been driven over time by the invention of new technologies
described as successive generations: from the provision of a reliable postal service
(1st generation), to the use of broadcast radio and television (2nd generation), to the
development of "open" universities (3rd generation), to teleconferencing (4th
generation) and the opening up of the Internet (5th generation) (Moore & Kearsley,
2005). Bates (2005), however, compacted these into three generations by combining
the 2nd, 3rd and 4th generations into one that is "industrial in nature" (p. 7).
Characteristics that define this generation include print and integrated multi-media
with highly centralised methods of mass production and delivery where students are
supported by a tutor rather than the producer of the materials (typical of the Open
Universities' model) (Peters, 2007).
A different generational perspective yet still closely aligned with the
affordances of technology is one which considers distance education pedagogy
(Anderson & Dron, 2011). Anderson and Dron proposed three generations of
pedagogy, namely cognitive-behaviourist, social constructivist and connectivist.
Even though the focus is on the structure of learning, it appears that the development
of these distance education pedagogies has been constrained by the technology
available at the time. Regardless of which generational view is held, it seems
that with the development of each new technology, new models of distance education
have followed, even though the processes of the earlier generations have not been
completely displaced by the new ones (Anderson, 2008). What is of interest to this
study is that each generation brings differing affordances for the process of learning
with newer synchronous technologies allowing heightened and more spontaneous
communication between teacher and student.
Along with these technological developments come new ways of defining
distance education, for example, distance learning, open learning, networked
learning, flexible learning, distributed learning, online learning and e-learning (Bates,
2005; Guri-Rosenblit, 2009). Regardless of the term used, there is an expectation that
distance education will still be knowledge-, community-, assessment- and learner-
centred, just like all forms of quality learning (Anderson, 2008). However, some may
use the term learning-centred rather than learner-centred where the educational
processes focus on the learning and interaction taking place rather than the learner
(Anderson, 2008). This will be discussed in more detail in Section 2.4.
It is important to note, however, that distance education in contemporary
times does not necessarily incorporate online learning, nor does online learning
imply distance learning, although there is some overlap (Guri-Rosenblit, 2009). Online
learning or e-learning includes the use of new electronic media for a variety of
learning purposes from enhancing the experiences of students in otherwise
conventional classrooms to providing greater interactional or transactional
opportunities for students studying at a distance. With the current rapid development
of computing and communications technologies, a number of delivery media are
presently available, ranging from old-style printed materials through to television,
email and computer conferencing. These have the potential to provide increased
interactivity between teachers and students (Gunawardena & McIsaac, 2004). Hara,
Bonk and Angeli (2000) noted that "students have greater opportunities with
electronic collaboration tools to solicit and share knowledge while developing
common ground or intersubjectivity with their peers and teachers" (p. 140).
While developments in technology open up greater possibilities for teaching
and learning, these need to be balanced by the costs to the provider and the benefits
to learners (Bates, 2005). In addition, the affordances supplied by each technological
advancement should be balanced by creative pedagogy (Twomey, 2009). The
technologies themselves are merely the means to allow communication between
teachers and learners. The creative pedagogy enacted is influenced by the discipline
of study, in this instance statistics.
2.2.3 Online learning in statistics
In their review of statistical education, Tishkovskaya and Lancaster (2012)
acknowledged that "teaching statistical courses is challenging because they serve
students with varying backgrounds and abilities, many of whom have had negative
experiences with statistics and mathematics" (p. 2). They further noted that even
though there is a critical need for statistically educated citizens, students at all levels
lack interest when taking introductory statistics courses. With this in mind, in recent
years there has been an explosion of technology-enhanced teaching materials to
improve understanding of key concepts in statistics (Rubin, 2007). Many of these
technological enhancements and web-based resources have taken the form of
educational Java applets to visualise statistical concepts, video lectures, repositories
of "pedagogically rich" data sets, simulations and interactive graphical displays
(Tishkovskaya & Lancaster, 2012). These resources have been used to supplement
teaching in face-to-face classes as well as fully online classes.
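The simulation resources described above typically let students watch a sampling distribution take shape. As a hedged illustration only (the literature cited refers to Java applets and web-based tools, not this code), a minimal Python sketch of such a simulation for one key introductory concept, the Central Limit Theorem, might look like:

```python
# Illustrative sketch: simulating the sampling distribution of the sample mean,
# the kind of demonstration the applets and simulations above provide.
# All names here are hypothetical, not drawn from any cited resource.
import random
import statistics

random.seed(42)  # reproducible demonstration

def sampling_distribution(pop_draw, n, reps):
    """Draw `reps` samples of size `n` from a population and return the sample means."""
    return [statistics.mean(pop_draw() for _ in range(n)) for _ in range(reps)]

# A deliberately skewed population: exponential with mean 1/0.5 = 2.0.
draw = lambda: random.expovariate(0.5)

means = sampling_distribution(draw, n=30, reps=2000)

# Despite the skewed population, the sample means cluster near the population
# mean (2.0), with spread close to sigma/sqrt(n) = 2/sqrt(30), roughly 0.37.
print(round(statistics.mean(means), 2))
print(round(statistics.stdev(means), 2))
```

A student can vary `n` and re-run the sketch to see the spread of the sample means shrink, which is the visual insight such resources aim to convey.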
Teaching statistics online has taken a variety of forms. Online statistics
courses have been taught completely online or in hybrid/blended mode (mixture of
online and face-to-face), asynchronously and synchronously, to graduate and
undergraduate students, and include varying amounts of interaction amongst
participants (Mills & Raju, 2011). It was recognised that active discussions in online
statistics courses can help motivate and engage students in the learning process,
increasing feelings of connectedness (Everson & Garfield, 2008). Following their
study comparing a traditional on campus class with a class taught in hybrid mode,
Utts, Sommer, Acredolo, Maher and Matthews (2003) recommended that students
need to interact with a "knowledgeable instructor" and should participate in weekly
meetings face-to-face or by online discussion forums. In online courses most
interaction has taken place asynchronously via discussion forums or email, or
synchronously in what could be described as virtual office hours where students'
questions could be answered (Suanpang, Petocz & Kalceff, 2004; Kreiner, 2006).
Even though there has been a growing body of research into online teaching
of statistics, "there is much still to learn about how to effectively implement these
courses and what practices are best" (Mills & Raju, 2011, p. 2). With the recognition
that interaction is important in online learning, the affordances of the technology to
support communication and interaction among students are an essential
consideration.
2.3 Computer-mediated communication
Computer-mediated communication (CMC) refers to communication between
individuals and among groups via networked computers (Naidu & Jarvela, 2006).
With the shift from a behaviourist to a constructivist view of learning comes an
increased emphasis on collaborative learning strategies for the construction of
knowledge (Romiszowski & Mason, 2004). This means that CMC becomes a critical
agent in online distance teaching and learning, particularly where a constructivist
approach is adopted (Section 2.1.1). Gaining insights from analysing the discourse
that takes place in computer-supported collaborative learning environments is
integral to understanding the learning and knowledge construction afforded by these
environments (De Wever, Schellens, Valcke, & Van Keer, 2006).
CMC can take on numerous forms: email, bulletin boards, wikis, blogs,
podcasts, discussion groups, chat, and videoconferencing. These can be grouped into
two modes: asynchronous and synchronous. In the asynchronous mode (discussed in
further detail in Section 2.3.1), communicating individuals are not logged on to a
computer network or the Internet at the same time. An asynchronous message is sent
or posted to be viewed at a later time, whereas in the synchronous mode (Section
2.3.2), individuals are able to communicate in real time. Synchronous
communication allows almost instantaneous responses to messages that have been
sent or posted. Mason (2003) suggested that "a mix of real-time and asynchronous
opportunities for interaction is increasingly also assumed in best practice guidelines
for on-line delivery" (p. 97). Bates (2005) added that "distance learners not only
benefit from a combination of synchronous and asynchronous technology, but they
also prefer this approach" (p. 189). Interaction in online learning has been
investigated largely in asynchronous environments such as email and discussion
forums, rather than in synchronous communication situations such as text chat and
videoconferencing (Hrastinski, Keller & Carlson, 2010). This study endeavours to
fill a gap in the literature in relation to the use of online synchronous communication
to support distance learning in a quantitative discipline.
2.3.1 Online asynchronous communication
Asynchronous learning tools can foster deep learning as they provide a timeframe for
a learner to read and reflect upon contributions to a discussion made by other
participants before posting a response. Further to this, asynchronous discussion
forums provide flexibility in online learning by allowing learners time to read,
reflect, formulate and respond to postings, thus offering more time on task and so
more opportunity for knowledge construction (Bates, 2005). Learners may also
simply read and reflect without leaving a response, a practice termed "pedagogical
lurking" (Dennen, 2008).
Even though learners can theoretically participate in discussion forums at any
time and in any location, Burr and Spennemann (2004), using a large-scale
multi-year sample, found that "available technology does not influence study habits,
but that work and study habits influence when the technology is being accessed" (p. 26).
They argued that, with knowledge of patterns of use, teachers can better target their
support of student learning. For example, the question that initiates an asynchronous
discussion influences the level of the response from learners (Meyer, 2004).
Further, teacher intervention in discussion forums has the power to push a
discussion forward by being selective in resolving an issue, for example, by
answering a direct question, validating a response to provide clarification or
encouragement, redirecting the discussion to keep it on task or alleviating
misconceptions, expanding to move the discussion to a new level, or withholding
advice or information to allow learner discussion to flow (Simonsen & Banfield,
2006). Care needs to be taken when using "withholding" as this may be perceived as
disinterest on the part of the teacher. To overcome potential pitfalls such as lack of
immediate feedback, lack of timely contribution and lack of time for development of
thoughtful discussion, Gilbert and Dabbagh (2005) suggested a structured approach
including facilitator guidelines, posting protocols and evaluation criteria. In a case
study they found that facilitator guidelines, and evaluation criteria which gave grade
weightings to timely and consistent contributions, had a positive impact on
promoting meaningful discussion, that is, they reduced transactional distance
(Section 2.2.1); posting protocols that restricted the length of postings, by contrast,
increased transactional distance and had a negative impact.
2.3.2 Online synchronous communication
While there is a growing body of literature investigating synchronous forms of online
communication (Cox, Carr & Hall, 2004; Hrastinski, 2006; Hrastinski et al., 2010;
Stein et al., 2007), these have been primarily in disciplines more closely aligned to
the social sciences such as Teacher Education (Burnett, 2003; Stein et al., 2007),
African Studies and Economics (Cox et al., 2004) and Business English (Hrastinski,
2006), rather than in the quantitative disciplines such as investigated in this study.
Online chats for teaching university students have been used in a range of learning
environments. The reasons range from giving an added dimension to the learning
experience of on-campus students (Cox et al., 2004) to providing a supportive
learning environment to fully distance students (Myers, Bishop, Rajaman & Kelly,
2004).
In online student-centred synchronous learning environments, students are
able, and seem more willing, to interact and take turns in constructing their own
knowledge than in the traditional face-to-face classroom (Cox et al., 2004). Online
chatting can provide equal opportunities, breaking down personal, cultural and
social barriers and enabling inhibited students to contribute their own ideas rather
than acquiesce to a more dominant peer, thus fostering a more collaborative learning
environment (Freiermuth, 2002). In highly visual disciplines such as mathematics
and statistics, online tutors and students need to be able to view, edit and post
diagrams and formulae directly in online postings. As noted in Section 1.4, with the
advent of chat clients that allow handwriting as well as typing of text, posting of
diagrams and formulae is possible (Loch & McDonald, 2007).
The real power of synchronous communication in enhancing teaching and
learning lies in its allowance "for immediate and timely feedback and creat[ing] a
strong social presence more easily than asynchronous online environments" (Bates,
2005, p. 188). It has been noted that "video-conferencing can support interactivity
and collaborative work" (Bates, 2005, p. 188). Synchronous communication, through
its immediacy and capacity for spontaneity, can replicate a "real" classroom
irrespective of distance and thus reduce the transactional distance between teacher
and learner (Section 2.2.1). However, it requires considerable prior planning
including preparedness on the part of the tutor for real-time improvisation and an
awareness of the impact of possible cognitive overload (Kear, Chetwynd, Williams
& Donelan, 2012). It would appear that finding the right balance is imperative to the
effective use of this medium.
The following section is dedicated to interaction in online learning. Here,
interaction is seen as an extension of the communication, discussed in this
subsection, into online learning contexts.
2.4 Interactions in online learning
Interaction is defined as "reciprocal events that require at least two objects and two
actions. Interactions occur when these objects or events mutually influence one
another" (Wagner, 1994, p. 8). While it may be fitting to encourage autonomy and
independent thought associated with self-directed learning (Section 2.1.2), "even the
innermost learning activities occur in a social context mediated by communicative
action" (Garrison, 2003, p. 164). Any educational experience presupposes some sort
of interaction. The question is what types of interaction should be enlisted and how
much of each type should be used.
Moore (1989) subdivided the "what types" of interaction in online learning
into three distinct categories: learner-content (learner with content), learner-teacher
(learner with teacher) and learner-learner (learner with other learners). Some aspects
of learner-content interaction relate back to the traditional views of distance
education typical of andragogy and self-directed learning (Section 2.1.2), where
content was written material and learners were studying largely independently of the
teacher and institution. However, these days, this also includes multimedia
presentations and "adaptive" systems that current technology affords and which
perform some of the formerly learner-teacher interaction (Anderson & Kuskis,
2007). Learner-teacher interaction, which can occur in both synchronous and
asynchronous forms, is where the teacher assists the students in interacting with the
content (Moore & Kearsley, 2005). This includes stimulating interest and motivation
to learn, providing additional explanation and providing counsel and support, where
the learner can draw on the knowledge and experience of the discipline expert,
typical of constructivism (Section 2.1.1). Learner-learner interaction goes beyond the
social component of support and motivation to a point where "the act of engaging in
learner-learner interaction forces learners to construct or formulate ideas in a deep
learning sense" (Anderson & Kuskis, 2007, p. 297).
Expanding on Moore's model, Anderson (2003a) included three extra
categories: teacher-content, teacher-teacher, and content-content interaction. Figure 2.1
represents these combined categories. The categories identified by Anderson (2003a)
add an interesting complexity to the previously cited interactions (from Moore,
1989).
Figure 2.1. Online interactions (after Anderson, 2003a; Moore, 1989)
Teacher-content interaction is focused on instructional design marrying the
technology to the pedagogy to enhance student-content interaction. Teacher-teacher
interaction through networked scholarly communities of practice fosters sharing of
resources and experiences. Content-content interaction in the form of computer
programs which can "retrieve information, operate other programs, make decisions,
and monitor resources on networks" is in the early stages of development and relies
upon the work of multidisciplinary teams which include computer scientists, linguists
and educators (Anderson & Kuskis, 2007, p. 304). This type of interaction could
provide more learner autonomy in the form of increased self-management and self-
monitoring, consistent with self-directed learning (Section 2.1.2).
Incorporating the social constructivist perspective, the challenge for educators
is to match a combination of these types of interactions to the types of learners, the
subject matter and the learning objectives to build a productive community of
learners. However, there may be some flexibility in that it could be possible through
instructional design to substitute one type of interaction for one of the others with
little loss in educational effectiveness (Anderson, 2003b). These sets of interactions
are represented in Figure 2.2 (reproduced here with the permission of T. Anderson).
According to this model, asynchronous and synchronous forms of
communication are not distinguished from one another. Both forms of
communication can be used to foster collaborative learning and communities of
inquiry for student-student-teacher interaction. However, synchronous
communication allows immediate and more timely feedback to be given (Section
2.3.2).
Figure 2.2. Model of online learning showing types of interaction (Anderson, 2008)
While there is much of interest in Figure 2.2, a particular component which
will inform the methodology of the proposed study is the "Community of inquiry" or
CoI, which has been expanded into a framework for use in analysing online teaching
and learning (Garrison et al., 2000). The CoI framework (Figure 2.3) has its roots in
computer conferencing, specifically in the context of asynchronous, text-based group
discussions rather than from a traditional distance education theoretical perspective
that assumed that students worked independently from each other (Garrison,
Anderson & Archer, 2010). This framework offers a lens by which the nature of the
educational transactions taking place in an online learning context can be
investigated (Akyol, Arbaugh et al., 2009). Through the interactions of the three
presences (social, cognitive and teaching presence) within a community of inquiry
supported by computer-mediated communication (Figure 2.3, reproduced here with
the permission of D. R. Garrison), knowledge is socially constructed (Section 2.1.1).
Figure 2.3. Community of inquiry (CoI) framework (Garrison et al., 2000)
A criticism of the CoI framework is that it does not address issues of deep and
meaningful learning (Rourke & Kanuka, 2009). However, this framework adopts a
constructivist view, focusing on the nature of the educational transaction and
seeking to understand how knowledge is constructed, rather than taking an
objectivist focus on learning outcomes (Akyol, Arbaugh et al., 2009). While the framework was
designed for exploratory and descriptive studies, there have been a number of studies
that have investigated the perception of learners, their experiences of the three
presences and inter-relationships among the presences over time (Arbaugh et al.,
2008; Akyol & Garrison, 2008). It has been argued that the inter-relationship of
social, cognitive, and teaching presence is required for quality interaction promoting
discourse for deep and meaningful learning to be achieved (Garrison & Cleveland-
Innes, 2005). The following discussion will provide further information on the three
core elements of the framework (Figure 2.3), namely, (i) social presence (Section
2.4.1), (ii) cognitive presence (Section 2.4.2), and (iii) teaching presence (Section
2.4.3).
2.4.1 Social presence
Social presence requires learners to establish personal and purposeful relationships
to foster effective communication and group cohesion (Garrison, 2007). By projecting
their personal characteristics into the community of inquiry, participants view one
another as "real" people (Garrison & Archer, 2007). Garrison et al. (2000) proposed
that social presence is established through emotional expression, open
communication and group cohesion whereby participants create a supportive
environment where critical thinking and inquiry are fostered. Since visual cues are
not possible, emotional expression can be developed through the use of emoticons.
Open communication develops from an initial stage of students and teacher getting to
know one another, where understanding of expectations in the online community is
established and a level of comfort in communicating openly online follows (Garrison
& Arbaugh, 2007). Group cohesion develops around common goals based on
collaborative activity in the community of inquiry (Garrison, 2007).
While social presence does not guarantee that critical discourse will develop
in an online environment, it is difficult for such discourse to develop without the
establishment of social presence (Garrison & Cleveland-Innes, 2005). In addition,
Hwang and Arbaugh (2006) showed a strong relationship between social presence
and learning outcomes. Social presence intersects with cognitive presence in an
educational context through collaborative activity with a common intellectual focus
when students recognise that they are not there just for social reasons (Garrison,
2007).
2.4.2 Cognitive presence
Cognitive presence is characterised by "exploration, construction, resolution and
confirmation of understanding" (Garrison, 2007, p. 65). This presence develops
through practical inquiry. The practical inquiry model of Garrison et al. (2000) is
based upon the foundational ideas of Dewey (1933) (see Figure 2.4 and Section
2.1.1).
The two axes represent reflection on practice, and the assimilation of information
and construction of meaning, incorporating both the shared and personal worlds.
Cognitive presence is initiated with a state of dissonance described as a triggering
event. This is followed by exploration which involves searching for information that
gives greater understanding of the problem. Students put forward suggestions for
consideration and brainstorm about the problem (Akyol & Garrison, 2010).
Integration involves combining or rejecting the ideas generated by this information
until a coherent concept is formed. Finally, resolution of the problem signifies
formulation of a solution or application of an idea. If this resolution is not successful,
the process of inquiry continues. Students have great difficulty in progressing from
the exploratory phase to integration and resolution (Vaughan & Garrison, 2005).
However, the topics being discussed and questions being posed have an impact on
the level of cognitive activity (Arnold & Ducate, 2006).
Figure 2.4. Practical inquiry model (Garrison et al., 2000)
Making students aware of the phases of inquiry, and of how they relate to the
prescribed task, is one way of progressing the discussion to higher levels of response
(Garrison & Arbaugh, 2007). To ensure that students move through the phases of the
practical inquiry model efficiently, teaching presence provides the necessary
guidance (Shea & Bidjerano, 2009; Garrison et al., 2010).
2.4.3 Teaching presence
Anderson, Rourke, Garrison and Archer (2001) defined teaching presence as "the
design, facilitation, and direction of cognitive and social processes for the purpose of
realizing personally meaningful and educationally worthwhile learning outcomes" (p.
5). Teaching presence balances cognitive and social issues, facilitating discourse and
direct instruction (Garrison et al., 2000; Garrison & Arbaugh, 2007). There seems to
be a consensus that teaching presence is a significant contributor to student
satisfaction, perceived learning and a sense of community (Garrison, 2007). The CoI
framework defines teaching presence by three categories: design and organisation,
facilitating discourse, and direct instruction (Garrison et al., 2000; Anderson et al.,
2001).
Design and organisation involves planning the educational experience both
before and during the progression of an online course of study. In building the
curriculum, the teacher adapts course content for online delivery by redesigning
lecture notes to include additional "insights and other customised views of course
content" (Anderson et al., 2001, p. 6). Other elements include designing the methods
and assessments appropriate to the online environment, negotiating timelines and
providing guidance for students to make effective use of the online medium. This
instructional management in turn leads to a greater sense of community and higher
levels of learning on the part of the students (Shea, Li & Pickett, 2006), although
these conclusions are based on self-reported data. It would also appear that too
much structure may inhibit students from engaging in meaningful online discourse
(Gilbert & Dabbagh, 2005).
In facilitating discourse to build understanding, the teacher engages and
motivates students to interact with one another and with the course content,
whereby meaning is shared, areas of agreement and disagreement are identified, and
a collaborative community of learners works towards consensus and understanding
(Garrison et al., 2000; Garrison & Arbaugh, 2007). Teaching presence is not
necessarily only the realm of the teacher. Fellow students can play this very
important role in facilitating discourse (Garrison et al., 2000). However, an
advantage of an instructor providing the teaching presence is that this subject expert
is in a position to give feedback on student responses and keep discussion moving in
the most efficient and effective manner (Garrison & Arbaugh, 2007). The teacher is
responsible for keeping a balance in the discourse by encouraging students who are
less active in the discussions and curbing the contributions of overly dominant
students (Anderson et al., 2001). Interaction and discourse are crucial contributors to
higher order learning but not without structure and direction (Garrison, 2007).
Direct instruction is described as the teacher's provision of intellectual and
scholarly leadership by sharing with students their subject matter knowledge,
scaffolding learner knowledge to reach a higher level of understanding (Garrison &
Arbaugh, 2007). However, the pedagogical expertise of the teacher is also vital.
Awareness of the developmental progression of the inquiry process provided by the
teacher influences the quality of the discourse and the depth of the learning
(Anderson et al., 2001).
Even though there has been some debate over whether teaching presence is
three-dimensional (design and organisation, facilitating discourse, and direct
instruction) or two-dimensional (design and organisation, and directed facilitation),
there is merit in keeping the three-dimensional construct of teaching presence, as the
difference may simply be an artifact of the context being studied (Arbaugh &
Hwang, 2006; Garrison et al., 2010; Shea et al., 2006). With a greater understanding
of the role that teaching presence plays in online education, more purposeful
guidelines can be developed to help support online educators (Anderson et al., 2001).
Despite the fact that digital technologies provide opportunities for engaging a variety
of resources, teachers remain central in any learning situation (Guri-Rosenblit, 2009).
2.4.4 Applying the CoI framework in broader contexts
While the CoI framework was developed to describe asynchronous online
interaction, it may also be applicable to synchronous interaction, as it has been
shown to be relevant in a blended learning environment involving a mixture of
face-to-face and online interaction (Akyol, Garrison & Ozden, 2009). The
CoI framework is relevant in learning contexts where the focus is on "how we
construct knowledge as opposed to an objectivist focus on learning outcomes"
(Akyol, Arbaugh et al., 2009, p. 124). However, to gain further insights into the
design of collaborative constructivist learning environments that provide deep and
meaningful learning experiences, the dynamics of the relationship amongst the three
CoI presences needs to be understood (Akyol & Garrison, 2010).
2.5 Summary of the literature review
The literature review undertaken in this chapter has presented details on (i) theories
of learning, (ii) distance education and online learning, (iii) computer-mediated
communication, and (iv) interactions in online learning. Each of these has informed
the research study described in this document. In particular, the Community of
Inquiry (CoI) model (Garrison et al., 2000) has provided a useful framework for
analysing the "nature of the educational transaction" that takes place in the online
synchronous tutorial by considering the "interaction" amongst the three core
elements: (i) social, (ii) cognitive, and (iii) teaching presence. By taking into account
student perceptions of the value of the online synchronous tutorial and unpacking the
content of the dialogue used to develop understanding of concepts in introductory
statistics in the online tutorials, it is believed that greater insights into the nature of
the educational transaction, and thus the teaching and learning of statistical concepts,
can be achieved. The following chapter (Chapter 3) presents the methodology of the
research study.
Chapter 3: Research methodology
This chapter describes the research design and specific methodology adopted by this
study. As noted in the literature review, most studies related to online synchronous
environments are in social science contexts such as Teacher Education (Section
2.3.2). There are few studies that specifically investigate the contributions that online
synchronous environments make to interaction and learning in quantitative
disciplines such as introductory statistics (Section 2.2.3). Since the online
synchronous environment being studied includes human activity located in the real
world where it can only be understood in that specific context, a case study
methodology is considered appropriate (Gillham, 2000; Yin, 2009).
By examining the processes of an online synchronous tutorial in introductory
statistics at a regional university in Australia, this case study seeks to answer the
following question:
How does an online synchronous environment contribute to the learning of
statistical concepts by distance learners?
As introduced in Chapter 1 of this document, the aims of this study are:
1. To describe the student-teacher, student-student and student-content
interaction in the learning of statistical concepts in this environment;
2. To investigate the nature of the dialogue in this environment for this
discipline context;
3. To examine the student perceptions of the value of this environment to
learning of statistical concepts; and,
4. To formulate theoretical constructs pertaining to the teaching and learning of
statistical concepts in this environment.
Firstly, the research design, including the research setting (Section 3.1.1), the
participants enlisted (Section 3.1.2), and the sequence of the study (Section 3.1.3), is
explained. The research method, including data collection (Section 3.2.1), data
analysis (Section 3.2.2), and validity of the study (Section 3.2.3), is then elaborated.
Finally, the chapter will conclude with a brief summary (Section 3.3).
3.1 Research design
A qualitative paradigm was adopted for this study, whereby the research was
conducted in a natural setting, using methods that are interactive and humanistic
(Creswell, 2003; Gillham, 2000). The methodological approach is interpretive with a
focus on constructivist investigation (Guba & Lincoln, 1989). The setting (online
synchronous environment), the actors (volunteer participants studying introductory
statistics), the events (online tutorial) and the process (interactions within the online
tutorial) are examined (Miles & Huberman, 1994). The researcher built rapport and
credibility with the participants while also reflecting on her role as participant-
observer (Creswell, 2003; see Section 1.7 for more detail on the role of the
researcher in this study). Multiple methods of data collection have been enlisted
(Creswell, 2003; see Section 3.2.1 for more details). Themes and issues are identified
(Stake, 1995).
Of the qualitative designs available, case study is the
most appropriate for this research, as the focus is on a "how" type question, the
behaviour of the participants was not manipulated, and the phenomenon sits within a
context from which it cannot be separated and so both aspects needed to be explored
(Yin, 2009). Case study methodology determined the strategies of inquiry, data
collection, analysis, and interpretation, since the focus of the study was on an in-
depth exploration of the interactions that took place in a particular location at a
particular time (Creswell, 2003). The "case" in this study is defined by the following
"bounded" conditions: an online tutorial group of up to twelve volunteer participants
from the group of distance students studying an introductory statistics course at a
regional university in Australia. In keeping with the qualitative paradigm, purposeful
sampling was used to best understand the interactions taking place in online tutorials
(Creswell, 2003; Yin, 2009). Consistent with case study design and as previously
noted, multiple sources of data were employed, including surveys, observations,
interviews and audio-visual recordings. While survey data are more commonly used
in quantitative studies, they were used here only as a precursor to more detailed
observation and interview data.
Case study methodology has been developed by two main proponents, Robert
Yin and Robert Stake, and both viewpoints will inform this research study. For
example, Yin (2009) contended that case studies can be categorised as explanatory,
exploratory or descriptive. An explanatory case study is used to explain causal links,
whereas an exploratory study is appropriate when exploring a situation where there
are no clear outcomes, and a descriptive study is used to describe an intervention or
phenomenon (Yin, 2009). However, case studies can also be distinguished as
intrinsic, instrumental or collective (Stake, 1995). While Yin (2009) and Stake
(1995) used different terminology to define variation amongst case studies, they
essentially make similar distinctions. This research study falls somewhere
between Yin's exploratory and Stake's instrumental definitions of case study
methodology; however, because it sought to understand the nature of the interaction
taking place in the online synchronous tutorial, where the researcher was a
participant-observer, it aligns more closely with Stake's methodology. In an
instrumental case study the issues predominate and "an ongoing interpretative role
of the researcher is prominent" (Stake, 1995, p. 43). In summary, this study
explored "how" an online
synchronous tutorial might contribute to the teaching and learning of introductory
statistics with the hope that insights gained will be useful to other educators as they
extend the boundaries of support for distance students studying quantitative content.
3.1.1 Research setting
The specific context for this investigation was an introductory undergraduate
statistics course taught across one semester by statistics specialists as a service to
other disciplines, namely, Psychology, Business, Commerce, Biology, Biomedical
Science, Chemistry and Physics. This course had an approximate enrolment of 400
students who participated in either internal (25%) or external (75%) mode.
While the course is core to these programs at the University, students embarking
on it do not generally understand at first why they have to study it. Anecdotally, it
would seem that these students see the content as being quite different from that of
their mainstream discipline courses and they frequently question its relevance. For
many students, it is the first quantitative course that they are required to study at
university, and, as with many quantitative courses, it comes with more than its fair
share of "bad press," indicated by such comments as "it's a really hard subject" and
"lots of people fail." This is despite the evidence from course leader reports over
many semesters that failure rates for students who submit all of the assessment items
are quite low. It should be noted, however, that this course generally has relatively
high attrition rates, as evidenced by the number of students who fail to submit one or
more assignments and/or do not sit the final examination, which may mask
difficulties in understanding the content.
It would also seem that students' prior perceptions are that this course is a
difficult mathematics course even though there is very little formal mathematical
content other than inserting numbers into a simple linear equation or constructing a
graph of a linear function (mathematics that is taught in lower secondary school). As
a result, students are often fearful of the content being studied and, in distance
learning mode, this can then be compounded by the isolation that can be felt in the
distance learning situation.
For the purposes of this study, customised weekly online synchronous
tutorials were offered during eight of the eleven weeks of the Southern Summer
semester of 2010 (November 2010 to February 2011), with a ninth (revision) tutorial
being held during the examination period at the end of the semester. The tutorials did
not start until the fourth week of the semester so that students could settle into their
studies and to allow time for participants to be recruited. Participation was voluntary
and the online tutorials were conducted as a complement to, rather than a
replacement for, the standard University offering of online materials and
asynchronous support. The timing of these tutorials was negotiated with the students
enlisted to participate in the study and it was agreed to hold them for one hour on a
Tuesday evening. The researcher conducted the tutorials from her computer at home.
Students were similarly participating via an Internet connection which was
convenient to them, but, typically because of the "after-hours" scheduling, was from
their homes.
At that time the University had a licence to use the interactive learning
platform Wimba Classroom and this was used to conduct the online tutorials. To
minimise the impact of technical issues with connecting to Wimba it was decided to
share alternative contact information for Windows Live Messenger and Skype, both
freely available Internet communications software. Participants were introduced to
the functionality of the Wimba Classroom in the first online tutorial.
To enhance the online experience, participants were requested, where
possible, to use a headset with microphone to reduce the impact of background noise.
However, participants were still able to contribute by entering text through the Chat
window in Wimba if they were unable to use a microphone. As previously noted, this
online synchronous tutorial support was an addition to the usual asynchronous
discussion forum traditionally offered in this introductory statistics course.
3.1.2 Participants
This study enlisted the participation of a group of 12 volunteers from the
approximately 300 distance students initially enrolled in the introductory statistics
course in a weekly online synchronous tutorial for eight weeks of the semester.
While the final group of study participants was less than 5% of the original cohort
enrolled at the beginning of the semester, at least 30 students expressed interest in
either being in the study or participating in an online tutorial that was not related to
the research study. A separate online synchronous tutorial was conducted for the
latter group of students. Participation in this study was additional to
minimum course participation requirements and, because not all students taking
the course would be available, or willing, to take part in an online activity at
a specific time, volunteers with flexible time commitments were needed. As a
result this case study required purposeful selection of distance
students. It was expected that volunteers would see potential benefits in participating
in an online synchronous tutorial and thus be prepared to commit to a weekly virtual
meeting at a specified time. It should be noted that volunteers are not necessarily
representative of all students studying the introductory statistics course. Further to
this, it was anticipated that by selecting willing participants, understanding of how an
online synchronous environment contributed to the learning of statistical concepts by
distance learners could be maximised (Stake, 1995).
The size of the tutorial group was originally restricted to 10 participants as
this number was perceived to give all participants a reasonable chance to participate
actively in discussions (Loch & McDonald, 2007). However, more than 10 students
volunteered. It had initially been thought that, in that event, more than one
group would be formed to provide a "literal replication" of the case study
(Yin, 2009). Using multiple cases would have allowed greater understanding of the
original case by providing an opportunity to investigate similarities and differences
between the cases (Baxter & Jack, 2008; Yin, 2009). However, with only 12
volunteers this was not deemed possible. It was decided to form one tutorial group of
12 participants.
The researcher acted as facilitator of the online tutorial. As participant-
observer, the researcher facilitated and observed the interaction of the learners in the
online tutorial. Being "inside" the case provided invaluable opportunities for
producing an "accurate" view of the case study phenomenon (Yin, 2009).
Awareness of possible pitfalls was essential: while a participant-observer may
be in a position to gain greater insights into interpersonal behaviours and
motives, she may also be accused of bias through manipulation of events (Yin,
2009). Being open about actions taken by the participant-observer, and
corroborating them using multiple sources of data, aimed to minimise both the
impact and the perception of bias while enhancing the value of the "thick
description" that resulted (Stake, 1995). This is
discussed further in Section 3.2.3.
3.1.3 Sequence of the study
A brief overview of the four stages of the study (preparation, data collection, data
analysis and presentation of results) was given in Chapter 1 (Section 1.6). This will
now be further elaborated.
Prior to enlistment of participants to the study, the Data Analysis Course
Leader posted a message on the Course Discussion Forum introducing the researcher
and her research project to the students enrolled in the course. She briefly explained
the researcher's role in the Course Teaching Team (online student academic support)
and reassured students that their decision on whether to participate or not in the study
would have no impact on their results or the support that they would receive in the
course. She added that the researcher had no part in the marking of their assessments
or the finalising of their grades.
Following this introduction, the researcher posted an invitation to participate
in the study on the Course Discussion Forum. This invitation contained information
about the purpose of the study and expectations of participation. The researcher
asked interested students to email her for further information. It was expected that
students who had the time and commitment would offer to participate. Students who
responded to this invitation were contacted by the researcher and sent a copy of the
approved Participant Information Form which included a description of the project,
what was expected of participants (their role in the study and the researcher's
role as participant-observer), expected benefits
of participating in the study, potential risks of participating, assurances of
confidentiality, and avenues for obtaining further information or expressing any
concerns that they might have with the conduct of the study.
Students were also asked to answer five questions (see the initial survey
questions in Section 4.1 and Appendix A). By responding to this email and
answering the five questions of the initial survey students were formally giving their
consent to participate in the study. This email explicitly stated what they were
required to do beyond participation in the online tutorial (answering survey questions
and participating in an interview at the end of the semester). Once participants were
enlisted a suitable time for the tutorials was negotiated, participants were given
information about running the Wimba Setup Wizard and a date was set to conduct an
introductory tutorial to orient participants to the functionality of Wimba Classroom.
The data collection stage of the study included administering the initial
survey (Appendix A), teaching in and recording the online synchronous tutorials
across a period of nine weeks, posing the weekly research questions related to
student perceptions of the tutorials (Appendix B) and conducting the final interviews
with individual participants (Appendix C). This stage of the study is summarised in
Figure 3.1. This is discussed in detail in Section 3.2.1.
Figure 3.1. Data collection process
At all times during the data collection process, the researcher was open and
honest with participants and attempted not to bias outcomes by commenting on
expectations of outcomes of the study (Gillham, 2000). Participants were assured of
confidentiality of the information collected and the anonymity of any responses used
in subsequent publication of the research.
Typical of case study methodology, the data collection stage of the study
blended into the data analysis stage of the study which included thematic coding of
responses to the initial survey and final interviews and detailed description of the
online synchronous tutorials (Stake, 1995). Synthesis of this coding and description
into the stories of four participants' individual experiences of the online synchronous
tutorials expanded the analysis and interpretation of the interaction taking place.
Content analysis of tutorial recordings using the Community of Inquiry framework
was instrumental to the interpretation of the interactions within the tutorials. This is
discussed in detail in Section 3.2.2.
As the final stage of the study, results are presented in three findings chapters
followed by discussion and conclusions. Chapter 4 presents outcomes from the initial
survey and the final interviews and identifies common themes related to the first
three aims of the study. Following this, Chapter 5 provides a detailed description of
each online synchronous tutorial with further insights using the lens of the
Community of Inquiry framework. Individual student stories in relation to their
experiences of and contributions to the online synchronous tutorials, as interpreted
by the researcher, are documented in Chapter 6. Chapter 7 completes the study with a
discussion of the findings and conclusions.
3.2 Research method
Consistent with case study methodology, multiple methods of data collection (as
described in Section 3.2.1) and analysis (as described in Section 3.2.2) were used to
investigate the interactions taking place in an online synchronous tutorial in a
quantitative discipline context. Triangulation of evidence through the analysis and
interpretation of these different types of data adds strength to the findings through
greater understanding of the research participants and the context (Baxter & Jack,
2008; Stake, 1995).
3.2.1 Data collection
Different types of data were collected in order to corroborate what was happening in
the online synchronous environment (Yin, 2009). The primary data source was the
recordings of the weekly online tutorials. Secondary sources of data included the
initial survey, key research questions posed at the beginning of each tutorial, and the
final interviews. By gathering different types of data it was possible to look for
convergence, whereby different kinds of evidence bear on the same point, and
identify possible conflicts, thus giving a more holistic view of the phenomenon
(Gillham, 2000; Yin, 2009).
Data collection was conducted entirely by electronic means. The initial survey
questions were emailed to participants (see Sections 3.2.1.1 and 4.1 for details).
Screen capture software was used to record interactions taking place within the
online tutorials (Section 3.2.1.2). Research questions asked at the beginning of each
tutorial were answered using the Polling function within Wimba Classroom (Section
3.2.1.3). The recording capability within Wimba Classroom was used to record the
audio of the final interviews (Section 3.2.1.4).
3.2.1.1 Initial survey
While collection of survey data is not common within qualitative studies, case study
researchers can use survey data to support a holistic understanding of the
phenomenon being investigated (Baxter & Jack, 2008). Surveys allow the researcher
to ask questions in a more structured format (Silverman, 2006).
A survey was conducted prior to the commencement of the online
synchronous tutorials to gauge students' preparedness for studying Data Analysis
and their previous knowledge and experience of online learning. This initial survey
collected information on student preconceptions of studying the introductory
statistics course, their prior experience with mathematics, their previous involvement
in online discussion forums, their prior experience of online conferencing in other
courses and their reasons for participating in the online tutorial in introductory
statistics (see Appendix A for details). It was not unlike a structured interview where
there is a list of set questions in a particular order and the agenda is totally
predetermined by the researcher (Tellis, 1997). Analysis of these data is discussed in
Section 3.2.2.1 and results presented in Section 4.1.
3.2.1.2 Recordings of online tutorial sessions
Within Wimba Classroom it was possible to record some of the interaction taking
place in each online synchronous tutorial. These recordings (or archives) included
voice, text chat and screen capture of actions taking place on the virtual whiteboard,
including polling questions.
Since Wimba Classroom did not provide one integrated recording but rather two
separate records – a file containing a transcript of the text chat and another
containing the audio and screen view of the virtual whiteboard – the researcher
also recorded the tutorials using specialist screen capture software. With this
software it was possible to capture all the action taking place in the tutorial
in one complete representation. In order to do this it was
necessary to set up a second computer, other than the one being used by the
researcher to facilitate the tutorials. The researcher entered the Wimba Classroom
from this second computer using the name "Moderator" so that the Classroom would
be visible on the screen of this second computer. Participants were made aware of
this prior to the start of the tutorials.
This study used these recordings and the comprehensive notes made while
listening to the recordings as documentation to support observations made by the
researcher, as participant-observer, in her understanding of the interactions occurring
in the online synchronous tutorial (Stake, 1995). Analysis of these data is discussed
in Section 3.2.2.2 and results of this analysis are presented in Chapter 5.
3.2.1.3 Weekly research questions
Following the introductory tutorial, at the beginning of each subsequent tutorial, a
number of key research questions were asked using the Polling function within
Wimba Classroom. These questions were a mixture of multiple-choice questions
allowing more than one response and open-ended questions (see Appendix B for
details).
Beginning in Tutorial 2, and then weekly, participants were asked about the
level of the content covered in the previous week's tutorial to ascertain if the material
was delivered at an appropriate level so that adjustments could be made as necessary.
They were required to select amongst several options: too easy; nothing new; nothing
new but helpful; thought provoking; confusing; just right; too hard. They were
allowed to select more than one option. Another question that was asked each week
was the open-ended question: "What was the most important thing you got out of last
week's tutorial?" which allowed participants the freedom to answer as they pleased,
so as to obtain the broadest possible view.
Several times across the semester (during Tutorials 3, 6, 7, 8 and the
Revision Tutorial) participants were asked: "How are you feeling about Data
Analysis at this point in time?" Choices given for this question included: still
worried; getting on top of it; struggling; enjoying it; hating it; all good. Again they
were given the opportunity to choose more than one option. On two occasions
(during Tutorials 4 and 8), participants were asked to identify: "Which topic have
you found most challenging?" from a list of topics that had been covered to that date.
They were allowed to choose two from the list, again to obtain the broadest possible
view as typically most students struggle with a number of topics.
In Tutorial 2 participants were asked: "How do you feel about the online
tutorial environment?" Choices included: too impersonal; a bit daunting; a bit
clunky; easy to use; a bit tricky to navigate; just like being in a normal classroom; a
bit intimidating; quite comfortable. For this question participants were only allowed
one choice to focus attention on the most predominant aspect of the online
experience.
In the final tutorial (Revision Tutorial) two extra questions were added.
Both were open-ended questions. The first of these asked: "Why did you choose to
participate in the online tutorial?" This question had also been asked in the initial
survey. The other asked: "In what way did the online tutorial contribute to your
learning of Data Analysis?" This question was again posed in the final interview.
These questions were repeated on these occasions to check consistency of response
from the participants. Triangulation of evidence obtained from these differing times
across the semester added to the strength of the study by "corroborating the same fact
or phenomenon" (Yin, 2009, p. 116). Analysis of these data is discussed in Section
3.2.2.1 and responses to these questions in each tutorial are given in Chapter 5.
3.2.1.4 Final interviews
Interviews allow researchers to gain insights into multiple realities (Stake, 1995). In
case studies, interviews are "guided conversations rather than structured queries"
(Yin, 2009, p. 106). Because most case studies relate to issues of human beliefs,
feelings, perceptions and behaviours, interviews are seen as essential sources of
evidence (Yin, 2009). However, interview questions can lead to alternative
interpretations by the interviewer and respondent that may need to be resolved
through further discourse (Mishler, 1991).
Silverman (2006) proposed three models relevant to interview data:
positivism, emotionalism and constructionism. Which approach is followed depends
upon the purpose of the interview and the status attached to the data collected. Since
this was an instrumental case study incorporating a need for general understanding, it
used an unstructured interview consistent with emotionalism, to elaborate on shared
authentic experiences (Stake, 1995). Even though open-ended questions were used
to obtain self-reported unique descriptions and interpretations of how the online
tutorials contributed to participants' learning of introductory statistics, the researcher was
mindful to keep participants focused on the issues while maintaining a friendly, non-
threatening manner during the interviews (Yin, 2009). It was also essential to
recognise that individuals make sense of their experiences through telling their
stories. The researcher assisted this process by using a conversational style,
reducing the chance of a power differential in the interview situation and
encouraging the participant to become a collaborator in interpreting the course
of events (Mishler, 1991).
With a list of questions related to the aims of the research study and aspects
of the Community of Inquiry framework described in Section 2.4, the researcher
aimed to corroborate, or clarify further, the evidence gathered from the "thick"
description and content analysis of the online tutorial recordings. Though some of
these questions were based on the Community of Inquiry Survey Instrument
developed by Arbaugh, Cleveland-Innes, Diaz, Garrison, Ice, Richardson, Shea and
Swan (2008) and aimed to elicit similar information, the interview questions were
not as explicit as the CoI Survey Instrument, thus allowing participants to make
sense of their own experiences (see Appendix C for details). The intention was to ask
questions in an objective manner, actively listen, take notes and then write a report
for each interview in order to capture the context, nuance and innuendo (Stake, 1995;
Yin, 2009). Further to this, a recording of their particular interview was made
available to each participant to check the content (member checking) and to add
anything that may have been missed (Stake, 1995). The final interviews primarily
investigated participants' reasons for participating in the online synchronous tutorials
and their views on the contribution that the tutorials made to their learning of
introductory statistics. Analysis of these data is discussed in Section 3.2.2 and results
are presented in Section 4.2.
3.2.2 Data analysis
Analysis of data in a case study is informed by "the researcher's presence, the nature
of the interaction between researcher and participants, the triangulation of the data,
the interpretation of perceptions, and rich thick description" (Merriam, 1988, p. 120).
Consistent with case study methodology, a variety of data analysis techniques were
used in this study (Yin, 2009). These techniques can broadly be classified as thematic
analysis and content analysis. Thematic analysis involves the emergence of codes
from the data whereas content analysis uses codes which have been determined prior
to searching for these in the data (Liamputtong & Ezzy, 2005).
Thematic analysis was conducted using data collected from the initial survey,
weekly research questions, recordings of online tutorials and final interviews.
As material from each of these data sources was noted and categorised, guided
initially by the study's purpose, a number of themes emerged (Merriam, 1988).
These are discussed in more detail in Sections 3.2.2.1 and 3.2.2.2.
Content analysis informed by the elements of the Community of Inquiry
framework (Garrison et al., 2000; see Section 2.4) was used to investigate the
dynamics of the interactions taking place in the online synchronous tutorials. When
performing content analysis, consideration needs to be given to the theoretical base
of the instrument used (in this case the CoI framework), the unit of analysis (in this
case a message) and the inter-rater reliability (De Wever et al., 2006). These are
discussed in more detail in Section 3.2.2.3.
3.2.2.1 Analysis of initial survey, weekly questions and final interviews
As outlined in Section 3.2.1.1, data were collected on student preconceptions of
studying the introductory statistics course, their prior experience of the online
environment, their reasons for participating in the online synchronous tutorials and
their perceptions of the value of these tutorials to learning quantitative content.
Comparison of the expectations identified in the initial survey with attitudes revealed
by the weekly research questions (Section 3.2.1.3) and perceptions of the success or
otherwise of the tutorials explored in the final interviews (Section 3.2.1.4) produced
themes to be examined to gain insights into the contribution that the online
synchronous tutorials made to student learning. Emergent themes included types of
interactions amongst the participants (students and teacher), the impact of immediacy
in synchronous communication and the affordances of the technology. These themes
are discussed in Chapter 4.
3.2.2.2 Analysis of interaction in the online synchronous tutorials
As indicated in Section 3.2.1.2, interaction in the online synchronous tutorials was
recorded using screen capture software which included voice, text chat and action
taking place on the virtual whiteboard. Detailed notes were taken from the integrated
recording of each tutorial in order to identify common themes.
These themes highlighted any aspect that impacted on the conduct of the tutorials,
including which participants were present, the format of and content discussed in the
tutorials and the nature of the interactions taking place within the tutorials. The
initial analysis of the interactions was based on the three distinct categories
of Moore (1989): learner-content, learner-teacher and learner-learner (see Section
2.4), subsequently referred to in this document as student-to-content, student-to-
teacher and student-to-student. These themes are discussed in Chapter 5.
3.2.2.3 Community of Inquiry framework coding and analysis
A number of models have been proposed for analysing interaction with most having
been influenced by the Henri (1992) model which consisted of five dimensions,
namely, participation, interaction, social, cognitive and metacognitive (Gunawardena
& McIsaac, 2004). Gunawardena, Lowe and Anderson (1997) used Henri's model to
develop a five-phase model of interaction analysis. Subsequently, Garrison,
Anderson and Archer (2000) developed an interaction model to describe the nature
and quality of critical discourse in an online community of inquiry. In this research
study the interactions taking place in the online synchronous tutorial were further
analysed using coding based on the Community of Inquiry (CoI) framework
(Garrison et al., 2000; see Section 2.4 for details). The CoI framework with its three
elements of social presence, teaching presence and cognitive presence provided the
theoretical base for the analysis of the dynamics within the tutorial, as captured in the
recordings, and the changes in these dynamics over time across the semester. Content
analysis informed by the CoI framework is a powerful method that can be used to
understand text-based educational conferencing and discourse (Garrison, Cleveland-
Innes, Koole & Kappelman, 2006). However, this framework is applied here to
recordings that include both text-based and voice interaction. Rourke and Anderson
(2004) suggested that instead of developing new coding schemes for
content/transcript analysis, researchers should use schemes that have been developed
and used in previous research thus contributing to the validity of an existing
procedure. Garrison et al. (2006) noted that ―a sound theoretical framework and
model is essential to address validity issues‖ (p. 2) – coding schemes must be both
reliable and efficient. Consistent with these views content analysis using the coding
schemes adopted by the CoI framework were used to analyse the recordings of three
tutorials (Tutorials 3, 6 and 8) to highlight changes in dynamics amongst CoI
presences within the tutorials across the semester of the data collection stage of the
study (see Section 5.10).
Beyond the theoretical base of content analysis, De Wever et al. (2006)
identified two important issues: the unit of analysis and inter-rater reliability. They
concluded that the unit of analysis, ranging from an individual sentence to a
complete message, is dependent on the context. For the online synchronous tutorials
in this study, the unit of analysis was a single message on a single topic or statistical
concept, as it could be defined without ambiguity. The simplest and most
popular way of measuring inter-rater reliability is the "percent agreement."
Alternatively, Garrison et al. (2006) suggested a negotiated approach to coding
transcripts. After independently coding transcripts, the coders discuss their codes and
work towards a consensus. This negotiated approach is particularly useful in
exploratory research where the main focus is on gaining a deeper understanding of
the learning taking place. Due to limitations in the conduct of this study, coding of
the data could not be completed by two independent raters to either ascertain
inter-rater reliability or perform a negotiated coding. As a compromise, the
researcher coded the tutorial recordings chosen for this analysis on two occasions
separated by approximately two months and used a "negotiated approach" to arrive
at a "consensus" on the coding to enhance the reliability of the analysis. As interest
in this study was in the dynamic relationship amongst the three presences of the CoI,
coding was carried out using the elements of social, teaching and cognitive presence
informed by the categories and indicators of each element (Garrison, 2007). In
addition to this, messages were coded in terms of the medium used to communicate,
namely text chat, microphone or emoticon/icon. The outcomes of this analysis are
presented in Section 5.10.
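The "percent agreement" measure mentioned above is simply the proportion of coded units to which two coding passes assign the same category. As an illustrative sketch only (the codes below are hypothetical examples, not data from this study), it could be computed as:

```python
# Illustrative sketch only: "percent agreement" between two coding passes,
# the simple inter-rater reliability measure referred to in the text.
# The codes below are hypothetical examples, not the study's actual data.

def percent_agreement(pass_a, pass_b):
    """Proportion of units assigned the same code in both passes."""
    if len(pass_a) != len(pass_b):
        raise ValueError("Both passes must code the same units")
    matches = sum(a == b for a, b in zip(pass_a, pass_b))
    return matches / len(pass_a)

# Hypothetical codes for ten messages: S = social, T = teaching, C = cognitive
first_pass = ["S", "T", "C", "C", "S", "T", "T", "C", "S", "C"]
second_pass = ["S", "T", "C", "S", "S", "T", "C", "C", "S", "C"]

print(percent_agreement(first_pass, second_pass))  # 0.8
```

A value of 0.8, for instance, would mean the two passes agreed on 8 of 10 coded messages. Percent agreement is easy to compute but, unlike chance-corrected measures such as Cohen's kappa, does not account for agreement occurring by chance.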
3.2.2.4 Analysis using narrative description
In a socio-cultural approach, four narratives have been developed to provide more
detailed insights into individuals' expectations, perceptions and experiences of the
online synchronous tutorial (see Chapter 6). Data from the initial survey (Section
4.1), the research questions and interaction in the tutorials (Sections 5.1 – 5.9), and
the final interviews (Section 4.2) have been used to produce these "descriptive portraits"
(Merriam, 1988, p. 127). For these narratives four participants were chosen to
represent the diversity in the research group in terms of age, gender, discipline of
study, geographic location and personal circumstance.
3.2.3 Validity of the study
In qualitative studies, validity is thought of in terms of "trustworthiness" or
"authenticity" (Creswell, 2003). Some protocols to "get it right" come under the
name "triangulation" (Stake, 1995), which involves using information from different
data sources to build a coherent justification for the themes identified (Creswell,
2003). Construct validity is addressed by using multiple sources of evidence,
establishing a "chain of evidence" whereby the link between evidence and
conclusions can be easily followed (Yin, 2009). Bias that the researcher brings to the
study needs to be clarified in an open and honest narrative. In addition, the researcher
needs to include "negative or discrepant information that runs counter to the themes"
to add credibility (Creswell, 2003, p. 196).
Some of the strategies that were used to strengthen the validity of this study
included triangulation of different data sources, using rich, thick description and
clarifying any bias brought to the study by the researcher (Stake, 1995; Yin, 2009).
Triangulation also included analysing the data using different approaches. The
recordings of the online tutorials were analysed using the interaction model of Moore
(1989) and then further analysed using the Community of Inquiry framework of
Garrison, Anderson and Archer (2000). Answers to the weekly research questions
were examined for consistency with themes identified from the initial survey data.
These were then compared with themes identified in the final interviews. This
triangulation of evidence added to the reliability of the content analysis of tutorial
recordings and the overall conclusions drawn.
3.3 Summary of the research methodology
This study employed a case study methodology to examine the interactions taking
place in online synchronous tutorials in a quantitative discipline and to ascertain the
contribution that this activity makes to student learning. Consistent with case study
methodology, multiple methods of data collection and analysis were employed.
Outcomes from these are presented in the following three chapters. Chapter 4
presents the findings from the initial survey and final interviews. The interaction
taking place within the tutorials is described in detail in Chapter 5. The findings of
the study conclude with the presentation of four narratives in Chapter 6.
Chapter 4: Survey and interview findings
Applying case study methodology, multiple methods of data collection (as described
in Section 3.2.1) were used to investigate the interactions taking place in an online
synchronous tutorial in a quantitative discipline context. This chapter, along with
Chapters 5 and 6, describes and summarises the data collected to address the
following aims (as introduced in Chapter 1):
1. To describe the student-teacher, student-student and student-content
interaction in the learning of statistical concepts in this environment;
2. To investigate the nature of the dialogue in this environment for this
discipline context;
3. To examine the student perceptions of the value of this environment to
learning statistical concepts; and,
4. To formulate theoretical constructs pertaining to the teaching and learning of
statistical concepts in this environment.
This chapter, with particular attention to the third aim of the study, will focus
on the findings from (a) the initial survey (Section 4.1); and (b) the final interviews
(Section 4.2). Chapter 5 will present findings from the main data source of the study,
namely, the online synchronous tutorial, while Chapter 6 will present four narratives
which document the experiences of particular participants.
4.1 Initial Survey
Distance students enrolled in the introductory statistics course were invited to
participate in the research study on interactive online learning support (Section
3.1.3). Twenty-one students responded to this invitation. Seventeen of these students
were emailed the participant information form and the initial survey of five questions
(Appendix A). The remaining four students had delayed their response to the
initial invitation to participate and, after an email exchange which included
information about the timing of the online tutorial, indicated that they would not be
able to participate at this time due to other commitments. Twelve students,
subsequently referred to as S1-S12, responded to the initial survey (see Section 4.2.1
for details).
Response to these survey questions, as agreed in the study's ethics approval,
was confirmation of a student‘s consent to participate in the study (Section 3.1.3).
This initial survey collected information on student preconceptions of studying the
introductory statistics course, their prior experience with mathematics, their
involvement in online discussion forums, their prior experience of online
conferencing in other courses and their reasons for participating in the online tutorial
in introductory statistics (Section 3.2.1.1).
4.1.1 Preconception of studying the introductory statistics course
The first question of the initial survey, "Write a few words expressing how you feel
about studying Data Analysis?" was designed to elicit participants' initial feelings
about studying the course. Due to the prior teaching experiences of the researcher in
this course over a period of more than twenty years and the ongoing "bad press" that
the course is known to receive, it was anticipated that there would be predominantly
negative feelings amongst participants about studying the introductory statistics
course.
Contrary to expectations, participants expressed feelings that ranged from
excitement and confidence to apprehension and panic at the thought of studying Data
Analysis. Four of the twelve participants (S1, S2, S6 and S8) expressed positive
feelings towards the prospect of studying the course. Two, S1 and S2, who were
studying mathematics degrees were looking forward to studying the introductory
statistics course. One of the two studying mathematics (S1) noted the usefulness of
studying statistics in today's society. A non-mathematician (S6) felt confident because
of the support available, while another non-mathematician (S8), having been a good
mathematics student at school (even though that was many years ago), was excited by
the challenge.
Six of the twelve participants (S3, S5, S9, S10, S11 and S12) expressed
negative feelings towards the prospect of studying statistics. Some expressed feelings
of apprehension because of the perception that the content would be difficult (S3, S9
and S10). Others used words such as "scared" (S5), "daunted" (S12) and "panicked"
(S11).
The remaining two participants were repeating the course. One (S7) was
noncommittal about the prospect of repeating the course, but recognised its value to
his discipline studies of biology. The other participant (S4) was nervous about taking
the course again, but noted better preparedness this time, commenting ―more familiar
with the software this time around and the language which I think was the problem
last time‖ and ―motivated to learn and get it right.‖ At the last attempt, circumstances
had not been conducive to studying as S4 did not have reliable Internet access and
was using borrowed materials.
Feelings were not overwhelmingly negative as had been anticipated by the
researcher. While about half of the participants expressed concerns at the prospect of
studying Data Analysis, the other half generally reported being confident and
excited.
4.1.2 Effect of prior experience of mathematics on learning of statistics
The second question of the initial survey, "How do you think your past experiences
in mathematics might affect your learning in this subject?" was designed to obtain
any evidence of mathematics anxiety amongst the participants, as in past cohorts
many students from non-mathematical backgrounds had expressed concern about
their lack of mathematical knowledge, initially believing it to be a disadvantage
when studying this course.
Opinions about the impact of previous experiences of mathematics on
learning statistics were generally positive where participants had favourable
experiences when learning mathematics or statistics in the past, either in high school
or tertiary studies (S1, S2, S3, S4, S7, S8 and S12). Understandably, the participants
who were currently studying mathematics at the tertiary level (S1 and S2) identified
that they had good mathematics skills and that this might be advantageous; however,
one (S2) recognised that while basic mathematics skills were needed, introductory
statistics was quite different from mathematics. Another participant (S3) also felt that
past experiences in mathematics would assist, but recognised the difference between
mathematics and statistics: "the interpretation and writing up of 'findings' is
something I will have to concentrate and work at."
Where participants had less favourable past experiences or their exposure to
mathematics learning had been some time ago, concern was expressed. However,
generally this concern was tempered by optimism towards their studies in the coming
semester. Two participants (S5 and S6) commented on poor past performance in
statistics, but were still hopeful of doing better in this introductory statistics course.
In particular, S5 "hated stats at high school," while S6 attributed the past poor
performance to lack of motivation and poor work ethic, not to a lack of mathematical
ability. Several participants (S9, S10 and S11) were a little concerned because it had
been some time since they had studied mathematics. In spite of this, two (S10 and
S11) felt that they could overcome any anticipated disadvantage.
While the research group had diverse past mathematical experiences, none of
the participants emphasised their mathematics background as being a cause for
concern in their studies of introductory statistics.
4.1.3 Previous experience of using discussion forums
The third question of the initial survey, "Do you use the discussion group facility in
Moodle? If so, in what courses have you used it?" was posed to gauge previous
uptake by participants of existing avenues for students to engage online with
teaching staff and other students. Engagement, in this instance, could have been
simply reading the discussion posts of others (lurking), posting questions to the
forum about course content and/or course organisation, or answering questions posed
by others. The distinction in the type of engagement was not necessarily gleaned
from responses to this question. However, the responses did indicate whether
participants had, in the past, been self-motivated to use such online resources to
support their studies.
About half of the group reported that they had participated in discussion
forums to some degree in their previous studies, engaging with this asynchronous
form of support. In addition, it needs to be acknowledged that participants were at
different stages in their programs of study and that not all courses supported active
discussion forums.
Seven participants (S1, S2, S6, S7, S8, S9 and S12) indicated no prior
experience of discussion forums. On the other hand, five participants (S3, S4, S5,
S10 and S11) reported varying levels of involvement in discussion forums – S3, S5,
S10 and S11 having used discussion forums in all previous courses and S4 having
used discussion forums "a fair bit this year," but with no mention of previous years.
Again, the research participants were a diverse group with respect to engagement
with online support generally offered through the learning management system used
by the University.
4.1.4 Previous experience of online synchronous communication
The fourth question of the initial survey asked, "Have you participated in
communicating with tutors and/or students online in any other courses (e.g. MSN
Messenger, Skype, Wimba, etc.), where all of you have been connected to the
Internet at the same time? If so, which courses, who was involved (a group of
students only, a group of students and a tutor, or just you and a tutor), what software
was used and when did this happen?" This question gauged the potential need for
training in the use of Wimba Classroom prior to the commencement of the online
tutorials. Answers to this question could also
indicate how comfortable or otherwise the participants were with synchronous online
communication in comparison with the asynchronous form of communication
investigated by the previous question in the survey.
Five participants (S1, S2, S7, S11 and S12) acknowledged that they had no
experience of synchronous online communication with students and/or teaching staff;
however, S12 noted "I have used MSN and Skype privately." This indicated that
some training in the use of online conferencing software, in this case Wimba
Classroom, might be necessary.
The remaining seven participants had used a variety of forms of synchronous
communication in groups and/or one-to-one. Two participants (S4 and S5) had used
Wimba Classroom in group sessions with a tutor for a number of their courses,
though not in the same discipline areas (S4 in psychology and S5 in a tertiary
preparation program). Both participants were very positive about the experience. The
psychology student (S4) had not only used the synchronous online medium in class
discussions with a tutor, but also in individual and group text chats with other
students where a tutor was not present. One participant (S6) had used synchronous
online communication in the workplace rather than to support university studies.
Three participants (S3, S8 and S10) had each used Wimba Classroom in only one
course with a group of students and a tutor present, while another participant (S9) did
"communicate with one other student only via MSN Messenger and also Skype
occasionally."
From this, it can be seen that four of the five participants who had accessed
discussion forums (S3, S4, S5 and S10) had also taken advantage of synchronous
online communication opportunities (Section 4.1.3). It was anticipated that these
participants might be somewhat more comfortable with interacting in the online
tutorials. However, this proved not necessarily to be the case, as S4 was the only one
of these to be consistently active in the tutorials throughout the semester (Section
5.10).
4.1.5 Decision to participate in the online tutorial
The fifth question of the initial survey, "What made you decide to participate in the
online tutorial in Data Analysis?" elicited initial student expectations of what
participating in the online tutorial might offer them. The intention was to ask this
question again in the final interviews and compare responses.
Seven participants (S3, S5, S7, S8, S9, S10 and S11) commented on the
potential benefits of interacting with others in a "classroom" community. These
benefits included support for learning (S5), learning from others' points of view (S3),
repetition to help things "sink in" (S8 and S9), being able to listen (auditory learner;
S10), the value of immediacy of answers to questions (S11) and learning different
ways to solve problems (S7). Related to this, two participants (S1 and S4) viewed the
online tutorial as a regular opportunity amongst their normal daily activities to "keep
on track" (S4), and felt that time specifically set aside to devote to the study of the
statistics course would make it easier (S1).
Of the remaining three participants, one of the mathematics majors (S2) was
interested in the research aspect of the project and saw involvement as a way of
gaining a greater understanding of how research worked, while still noting that "it
may be worthwhile to help with learning in the subject." Another (S12) was looking
for involvement in the project to increase chances of success in the course, but did
not comment on how the online tutorial might help this. The final participant (S6)
expressed interest in the technological aspect of being involved in the online tutorial,
gaining a greater understanding of how Wimba worked, while still showing an
appreciation of the benefits for learning: "can kill two birds with one stone."
4.2 Final interviews
Final interviews were conducted with participants using the functionality of Wimba
Classroom to converse with each participant individually and record the conversation
(Section 3.2.1.4). With a list of questions related to the aims of the research study
and aspects of the Community of Inquiry framework described in Section 2.4, the
researcher aimed to corroborate or clarify further the evidence gathered from the
analysis of the recordings of the online tutorials and the initial survey. The researcher
endeavoured to ask the questions in an objective manner by not leading the
participants in their responses. While a full transcript of each interview was not
produced, the researcher listened to each interview several times, took notes and then
wrote a report for each interview. Further to this, each participant was given access
to his or her recorded interview and so each had an opportunity to check what was
said and add anything that may have been missed.
4.2.1 Participant demographics and reasons for participating
The participants (n=9) who attended at least three of the nine online synchronous
tutorials across the semester (S1, S2, S3, S4, S5, S6, S7, S8 and S9) were
interviewed at the end of semester, after the Data Analysis examination but prior to
results being released (see Table 5.1 for attendance details). The participants came
from a variety of backgrounds and life experiences. The median age of
the nine participants was 34 years, with ages ranging from 25 to 69 years. Five of the
participants (S3, S4, S6, S8, and S9) were studying psychology, two (S1 and S2)
were studying mathematics/statistics, one (S5) was studying business and one (S7)
was studying biology. Four of the participants (S1, S2, S5 and S9) were working full-
time, one (S3) was working part-time, one (S6) was studying full-time, one (S4) was
an at-home mother of a young child and two (S7 and S8) were retired.
Each participant was asked why they had decided to participate in the online
tutorial. Most comments related to some form of interaction (S1 and S3) within a
community of learners (S1). By being part of a community (S1 and S4), participants
thought that they would feel less isolated (S4, S5 and S6), it would help them to keep
up (S4), they would get to know others in the course (S9) and they could build a
relationship with the tutor (S2). It was believed that the online tutorial would provide
them with more motivation (S4) and it would help them with their learning (S2, S7,
S8 and S9), in particular hearing what others were thinking (S3). S6 summarised this
succinctly, "the more contact you have with people about your work, the more you
can discuss it, the better your understanding. It opens up so much more than if you
are by yourself and isolated." Some participants (S2, S4 and S6) also commented on
their involvement in the study as providing an opportunity to gain experience of
being in a research project, as they were anticipating doing a research project of their
own at some stage in their studies. These responses were consistent with those given
in the initial survey at the beginning of the research study but somewhat more
expansive.
4.2.2 Interaction with other participants and the tutor
When asked to reflect on how the participants in the study contributed to the process
and experience of the online tutorial, some comments related to the idea of a shared
responsibility in working together (S2, S5, S6, S8 and S9), where they could "learn a
lot from each other" (S6). Participants commented that others would often ask
questions that they had wanted to ask (S4, S7 and S8), but also questions that they
had not thought to ask but realised that they would like to know about (S8). It was
noted that, in discussions in the tutorials, participants offered differing viewpoints,
different ways of thinking about things and different ways of saying things (S2, S3,
S4 and S7), to which S9 added that she "hadn't thought about presenting it in that
way" and S5 further added "if you were too embarrassed to ask a question and
somebody asked a similar question you didn't feel so stupid" and would think "I am
not the only one struggling." Participants found this type of support for one another
reassuring (S1), to the point of feeling comfortable with being corrected by other
participants (S4). S3 summed it up as "they added the life to it," to which S7 agreed:
"with the tutor and other students it brings it much more to life to what you are trying
to study."
There was a general feeling that the presence of a tutor in the online tutorial
was needed to "keep everyone on track" (S1, S2, S3, S4 and S5). Having the tutor
present made the online tutorial "like being in a [real] classroom" (S7). One of the
benefits that the tutor brought to the situation was content knowledge (S6), which
meant that the tutor could help clarify concepts (S1 and S9) and give a broader
picture of how the topics fitted together (S3 and S8). S4 commented that the tutor
was "giving us questions and making us think" and that "it wasn't even lecture form,
actual questions and then we would discuss the questions, so then understanding
would happen.‖ While all believed that the tutor was an essential element in the
online tutorial, some (S2, S4 and S5) also considered that it would be useful to have
some sessions without a tutor present, where students could "nut things out on our
own" (S9). Two participants (S7 and S8) thought that it would be good to have a
Wimba Classroom where students could meet with other "study buddies" to work
together one-on-one. S2 suggested that he would like to use breakout rooms where
students could be set questions to discuss in small groups and then report back to
the whole group for further discussion.
4.2.3 Perceived support of learning
When asked to reflect on how the online tutorial helped in their learning of Data
Analysis, participants identified things that could be summarised as interaction and
immediacy. Interaction in the tutorial produced opportunities for clarification of the
concepts and participants‘ understanding of the content (S2, S3 and S4). This
clarification came from the tutor and other participants by working through problems
together: identifying the type of problem, setting out the working, using correct
terminology and finding the final solution – "all of the finer details" (S3). "Like
being in a classroom" (S7), participants were able to hear the examples and processes
explained rather than just reading them (S5). The weekly routine of committing to
participating meant that the tutorial was used as a way of consolidating the learning
of material already learned in private study or as an introduction to a new topic
making it easier to then learn the material in private study (S1). The immediacy of an
online synchronous tutorial meant that participants were able to "bounce ideas
around with the tutor" (S2), valued "being able to talk about it" (S9) and found it
"beneficial to see problems worked out live and be able to ask questions for
immediate feedback" (S1). Being able to ask questions in the moment and
"physically doing questions" facilitated the learning (S2, S4, S8 and S9); "reading it
is one thing but seeing it worked out on a whiteboard gives you an extra element of
learning" (S8). As S2 explained it, "I think I would have found it a lot more difficult
to realise that I was having problems if I had actually just got on the discussion list
and actually just entered something there." Interacting with the content in a supported environment
such as the online tutorial, where tutor and participants were willing to share their
questions, concerns and explanations, was perceived as beneficial to the learning
process.
As a bonus to the online interaction, participants had access to recordings
(archives) of the online tutorials. While all participants acknowledged the usefulness
of this resource, not all viewed the archives. Lack of time was a contributing factor.
Archives were considered to be useful for catching up on the content covered in the
tutorials that were missed, particularly in preparation for the tutorial that followed.
They were also seen as a useful resource to review what was said and to pick up on
things that may have been missed, "good to listen over and over until you got a
concept" (S5), or when technical issues were experienced and the Internet connection
was lost during the tutorial (S8 and S1). The recording could be stopped and started
at will and replayed as necessary (S3), with only the relevant parts being replayed
(S4). As participants joined the tutorial from their homes, distractions from family
members happened from time to time. The archives allowed participants to fill in the
gaps in the discussion that occurred for them, thus not disrupting the flow for the
others. At times the pace of the tutorial was a bit fast for some (S3, S7 and S8). The
archives provided an additional resource for making sense of everything that was
discussed in the tutorial.
4.2.4 Using the Wimba Classroom
Three of the participants (S4, S5 and S8) had used Wimba Classroom in other
courses, but the remaining five had no experience of this online learning platform.
However, one participant (S6) had used other videoconferencing software in her
workplace. While most participants felt reasonably comfortable with the Wimba
Classroom environment, technical issues did occur from time to time, but these could
mainly be explained in terms of Internet connectivity and speed. However, for some
it did take a few sessions to become familiar with the functionality: "not used to
speaking online but just a matter of practice" (S7). Even with this practice, one
participant still found it a little daunting (S8). In addition to this, it was noted that
some aspects of functionality could be improved. Some participants found using the
Talk button inconvenient (S5) as it had to be held down while talking and there was a
slight delay between clicking on the Talk button and voice being heard. Another
participant (S2) pointed to issues with the Erase function within Wimba. When the
Erase icon was clicked, all annotations on the whiteboard were erased. This created
some interesting moments when, early in the semester, a participant unknowingly
erased all annotations on a slide and the others had to redo them. It did have a
positive aspect in that participants were able to self-correct and improve their
contributions. Some participants found Wimba easy to use and acknowledged that it
was very helpful to their learning style: "I like being able to do things with other
people and to interact. It helps to reinforce things in my mind" (S9). A number of
participants commented that they would like to have Wimba sessions in other
courses (S1, S2, S3, S5, S6 and S7) or had good experiences of Wimba in other
courses (S4). However, S6 said that it would depend on the content topic. In
comparing a Wimba session with face-to-face classes, S5 mentioned that "it was
definitely no imposition at all and in fact if anything it was more convenient because
I sat in my pyjamas doing it." S7 described working in Wimba as "definitely most
brilliant things ever," adding that it felt like "one-on-one straight away."
One aspect of interaction within Wimba was the dilemma of whether to use a
microphone or text chat to communicate. Attitudes to this issue varied amongst the
participants. For some there were technical issues, such as the speed of the Internet
connection and associated lag time in the voice being heard (S2) or the
inconvenience of having to hold the Talk button down while talking (S5), which
impacted the decision on which medium to use. For others it was lack of confidence
at speaking into a microphone, "the text gives people a chance to say stuff if they're
a bit too shy to talk" (S9), but "just a matter of practice" (S7). There were also
concerns about background noise in individual households when using the
microphone (S4). While each participant had their own preference, all participants
used a mixture of both. Most found that a mix of both text chat and microphone
happening simultaneously added to the dynamics of the online tutorial, "while
someone is speaking someone else can actually be writing something" (S6).
However, for some this was a bit off-putting as it was difficult to keep up with the
pace of the conversation at times (S3 and S8). Occasionally there were issues with
two people speaking on microphones at the same time: "bit hard if two trying to
speak at the same time" (S5). Use of the Hand-up icon minimised this possibility,
particularly "because we don't have any social cues" (S2). Using text chat meant that
participants could add to a conversation without interrupting the person speaking on
the microphone (S2). However, one participant (S9) commented on the more
personal nature of using the microphone: "feel like I am getting to know people a bit
more." In all, most participants used both text chat and microphone as it suited.
4.2.5 Other interaction opportunities
Participants were asked if they had used the connection to others that had been
fostered in the online tutorial to arrange to communicate outside the tutorial. There
was only one instance of communication between research study participants outside
the tutorial. Participants S6 and S1 met face-to-face when S1 was passing through
the town where S6 lived. A number of participants commented that they would have
liked to have made contact with others. S2 had tried to form a study group with
others studying the course, not necessarily research study participants, but without
success. Another participant (S9) was in text chat contact with another student in the
course who she knew from studying other courses together. S5 suggested that it
would be useful to have a Wimba Classroom set up for students to meet with other
students online for informal conversation about the course content. Two participants
(S6 and S8) said that they would have liked to meet and talk with others, but S6
simply did not have the time, while S8 saw no need in this instance as she already
met with another student, not a research study participant, "her study buddy,"
face-to-face. S7 felt that he should have made the effort to be in contact with others
outside the tutorial, but didn't. S3 saw no need to meet with other students outside
the tutorial, as she believed that she had enough interaction with other students
through her involvement in the online tutorials.
4.2.6 Final comments
At the end of the interview, participants were given the opportunity to comment on
anything about the online tutorial that had not been specifically asked. Most made
comments of appreciation for the support that the tutorial gave them in their studies.
One participant (S1), for example, commented that the online tutorial "makes us
external students feel valued by this support." He further added that the nature of the
content meant that it was difficult to write complex questions and ideas in an email,
so the online tutorial helped in dealing with these issues. S2 observed that even
though the discussion forums meant that students had more time to think and reflect
and find information to answer questions, the interaction in the online tutorial was fun
and students had an opportunity to ask questions for clarification. Another participant
(S3) added that interaction in the online tutorial "makes names into people." The
general feeling was that more courses should offer this type of support.
4.3 Summary of the survey and interview findings
With reference to the third aim of this research study, this chapter examined student
perceptions of the value of the online synchronous environment to the learning of
content in a quantitative discipline context. Perceptions following participation in the
online synchronous tutorials were consistent with initial perceptions expressed before
the tutorials were conducted. Most comments related in some way to the value of the
immediacy of the interactions with the tutor and other students and the support that
this provided to the learning process.
Recordings of the online synchronous tutorials provided further data to gain a
deeper understanding of the contribution that these tutorials made to student learning.
Analysis of the content of these recordings is presented in the following two
chapters. In Chapter 5 each tutorial is described in terms of who attended, format and
content of the tutorials, interactions taking place and use of the archives. Interactions
taking place in three of the tutorials were investigated in greater detail using the lens
of the Community of Inquiry framework (introduced in Sections 2.4 and 3.2.2.3) to
provide further insights into the types of interactions and the distribution of
interactions over time (Section 5.10). In Chapter 6 the stories of four participants,
based on information obtained from the initial survey, observations from the
recordings of the tutorials and answers offered in the final interviews are presented.
Chapter 5: Online Synchronous Tutorials
Online synchronous tutorials were conducted each week across nine weeks of the
Southern Summer semester of 2010 (November to February) using Wimba
Classroom, an interactive online learning platform. During this particular semester,
the introductory statistics course, Data Analysis, was offered primarily in distance
mode, with a smaller cohort of face-to-face students studying at one of the
University‘s three campuses (Section 3.1). Distance-only students were enlisted into
the research project.
This chapter primarily addresses the first two aims of the study related to the
types of interactions taking place and the nature of the dialogue in the online
synchronous environment. Each tutorial is described in terms of four main aspects,
namely, who attended, format and content, interactions taking place and use made of
the archives. In addition to this, three of the tutorials (Tutorials 3, 6 and 8) were
analysed using content analysis based on the Community of Inquiry framework to
provide further insights into the types of interactions and the distribution of
interactions taking place over time within the online synchronous tutorials, with
results presented in Section 5.10.
The first tutorial (Tutorial 1) was conducted in the fourth week of the eleven-
week semester, allowing time at the beginning of the semester for students to become
established in the course before being approached about participating in the research
project. Following this introductory session, weekly tutorials (Tutorials 2-8) were
conducted across the remaining seven teaching weeks of the semester. A revision
tutorial coded as R (Table 5.1) was held in the first week of the two-week
examination period, since the examination for this course was timetabled for the
second week. As all participants were located in the same time zone, negotiations on
a suitable time for the tutorial were completed quickly. It was decided to hold the
tutorials between 7 pm and 8 pm on Tuesday evenings. Prior to the first tutorial, participants
were emailed instructions on running the Wimba Setup Wizard and alternative online
contact information (Windows Live Messenger and Skype), in case there were
technical issues connecting to the designated Wimba Classroom.
64
The Wimba Classroom allowed participants to interact using audio (voice
with a microphone), text chat, virtual whiteboard and emoticons/icons. Which of
these were used more frequently by participants is investigated in Section 5.10.
While it is possible to record online tutorials in Wimba, these recordings (or
archives) only capture voice and video of the virtual whiteboard content, namely,
PowerPoint slides and any writing or typing on the whiteboard (Section 3.2.1.2).
Even though text chat takes place in the Chat window at the same time that voice and
writing on the whiteboard are occurring, Wimba records it in a separate file. To
capture the flow of the tutorial and interaction taking place, it was necessary to
obtain an integrated recording of voice, text chat and actions taking place on the
virtual whiteboard. To achieve this, the researcher logged in to the Wimba
Classroom from a second computer, so as to record the participant view of the
screen. On this second computer, all interactions (voice, text chat and whiteboard)
could be assembled in the one recording using screen capture software. Participants
were made aware that these recordings were being made. They were also given
access to the archive recordings following each tutorial. Further, these archives were
available up until the final examination.
The number of participants attending the online tutorials ranged from four in
Tutorial 7 up to the full complement of active participants (n=9) in the Revision
Tutorial (R). As researcher, participant-observer and facilitator, I generally chose the
material to be discussed in the tutorials. Nevertheless, participants were encouraged
to email questions which could then be incorporated into the content of the next
tutorial. The principal activities being undertaken in the tutorials included discussion
of key concepts and content questions, as well as social conversation and technical
assistance with working within Wimba. The type of interaction taking place, that is,
student-to-teacher, student-to-student, or student-to-content, depended on the nature
of the activity undertaken during the tutorial (Sections 2.4 and 3.2.2.2). Effective
interaction within the online synchronous environment did take some time to
establish because of the lack of visual cues.
Of the 12 students (coded as S1 to S12) who agreed to participate in the
research study, 11 took part in at least one online tutorial. Table 5.1 summarises
student participation in the online tutorials and their access to the recordings
(archives) of the tutorial sessions.
Table 5.1
Student participation

Student  Tutorial sessions attended  Tutorial sessions absent   Archives watched
S1       2, 3, 4, 5, 6, 8, R         1, 7                       7, R
S2       1, 2, 3, 5, 6, 8, R         4, 7
S3       1, 3, 6, 8, R               2, 4, 5, 7                 5, 7
S4       1, 2, 3, 4, 5, 6, 8, R      7                          7
S5       1, 2, 5, 7, R               3, 4, 6, 8                 3, 4, 5, 6, 7, 8, R
S6       1, 2, 4, 5, 6, 7, 8, R      3
S7       1, 2, 3, 4, 5, 6, 7, 8, R                              1, 2, 3, 4, 5, 6, 7, 8, R
S8       3, 4, 5, 7, 8, R            1, 2, 6                    1, 2, 3, 4, 5, 6
S9       3, 8, R                     1, 2, 4, 5, 6, 7           4, 6
S10      1                           2, 3, 4, 5, 6, 7, 8, R
S11¹     2, 3                        1, 4, 5, 6, 7, 8, R
S12²                                 1, 2, 3, 4, 5, 6, 7, 8, R
Notes to Table 5.1
1. S11 withdrew from the course prior to Tutorial 4.
2. S12 did not participate in any tutorials and subsequently withdrew from the course.
Analysis of the data will primarily include participants S1 – S9 as:
(i) S10 did not respond to any of several follow-up emails after attendance at the
first tutorial;
(ii) S11 indicated intention to withdraw from the course just after the third
tutorial; and,
(iii) S12 withdrew from the course before the online tutorials started.
It should be noted that, as an active participant in the online tutorials (Section 1.7), the researcher will often use the first person to refer to her interactions with the participants.
5.1 Tutorial 1
Tutorial 1, as an introductory tutorial, was conducted in Week 4 of the semester. Up
until this point in the semester, students should have covered topics on descriptive
statistics, including types of variables, graphical displays and numerical summaries,
and the normal model. As the first tutorial, there was a need to familiarise the
participants with the workings of Wimba Classroom and facilitate introductions to
the group. This tutorial took more than the designated one hour, lasting for
approximately 70 minutes.
5.1.1 Who attended Tutorial 1
Seven participants attended the first tutorial – S2, S3, S4, S5, S6, S7 and S10. One
study participant, S1, had emailed an apology prior to the first tutorial citing a long-standing work commitment that could not be rescheduled. A further three, S8, S9 and S11, did not join the study until after the first tutorial.
Four of the attendees indicated that they were studying psychology, with the remaining three studying biology, mathematics/statistics and business. During the
introductions, it was found that five were from the capital city and two were from
coastal centres.
5.1.2 Format and content of Tutorial 1
After orientation to the Wimba Classroom environment and introductions to the
members of the group, five multiple-choice questions were used as an icebreaker to
encourage students to think about the topic and participate in the discussion. As
researcher, participant-observer and tutorial facilitator, I chose to present these
questions using the Polling feature within Wimba. Feedback on the answers given by
participants was in the form of a summary of responses with no specific
identification of the respondents.
The five questions examined key concepts covered in the course content over
the previous few weeks, namely: types of variables, appropriate use of graphs,
interpreting graphs, and numerical summaries of data. Even though all participants
answered the first two questions on types of variables correctly, I used this
opportunity to emphasise that this concept was pivotal to deciding how to analyse
data. This was achieved by exposition and also by asking participants to elaborate on
their thinking when they were deciding which option to choose in the multiple-
choice question.
The next two (of the five) questions were on graphing. Not everyone answered these questions correctly: only S6 and S10 answered the third question correctly, while S3, S6 and S7 answered the fourth correctly. The final question, on numerical summaries, was answered correctly by only one participant (S4).
When questions were answered incorrectly by any one of the participants,
this presented an opportunity to discuss the multiple-choice options one by one and
why some distractors could be quickly eliminated as not feasible. Participants were
asked to indicate understanding by using the Tick/Cross icon in Wimba. In this
instance, all participants indicated understanding of all five questions once they had
been fully discussed. In all, answering and discussion of these five questions lasted
32 minutes.
Following the poll questions, I presented two questions that required
participants to do some numerical calculations before arriving at an answer. These
questions were displayed on the virtual whiteboard which allowed participants to
write collaboratively on the screen. The first question was a multiple-choice question
which could be answered once the median of a dataset was calculated (Figure 5.1).
Figure 5.1. Participants indicating answers on the virtual whiteboard
Everyone was asked to work out an answer to the problem presented in
Figure 5.1, but only four (of seven) participants initially indicated a response. In
Wimba, it is not possible to distinguish who is doing the writing on the virtual
whiteboard. Two of the participants chose the strongest distractor as the answer.
From this, it was a natural progression to lead the group through the steps required to
work out the correct answer and consider why the distractor was an attractive
response.
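The reasoning led through here can be sketched in a few lines. The dataset below is hypothetical, standing in for the tutorial question, which is not reproduced in the text; the point is that the median requires sorting first, and that a plausible distractor arises from skipping that step.

```python
# Hypothetical dataset standing in for the tutorial question.
data = [12, 7, 9, 15, 7, 11, 20, 9, 10]

# The median requires sorting first, then taking the middle value
# (or the mean of the two middle values for an even-sized dataset).
ordered = sorted(data)               # [7, 7, 9, 9, 10, 11, 12, 15, 20]
n = len(ordered)
if n % 2 == 1:
    median = ordered[n // 2]
else:
    median = (ordered[n // 2 - 1] + ordered[n // 2]) / 2

# A strong distractor often comes from reading the middle of the
# raw, unsorted data instead:
middle_of_unsorted = data[n // 2]    # 7, not the median (10)
```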
The second of these questions applied z-scores to a contextual question. This
required participants to understand the context, select the appropriate method and
calculate the z-scores for comparison. Again, participants were encouraged to
contribute to the solution on the whiteboard (Figure 5.2).
Figure 5.2. Writing on virtual whiteboard using tablet technology
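The z-score comparison described above can be illustrated with a minimal sketch; the marks, means and standard deviations below are hypothetical, not taken from the tutorial question.

```python
# Standardising lets us compare observations drawn from different
# normal distributions: z = (x - mean) / sd.
def z_score(x, mean, sd):
    return (x - mean) / sd

# Hypothetical context: which mark is relatively better?
z1 = z_score(75, 65, 10)   # 1.0 standard deviations above its mean
z2 = z_score(82, 72, 8)    # 1.25 standard deviations above its mean
better = "second" if z2 > z1 else "first"
```

Both marks are above their respective means, but the larger z-score identifies the relatively stronger result, which is the comparison the tutorial question required.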
One student typed the answer on the whiteboard with no reference to working, so I acknowledged the participant's contribution, but then led the students through the steps to the solution, including writing the numerical calculations on the whiteboard using the capabilities of my tablet PC rather than just describing them.
These two questions took approximately 10 minutes to complete. I had an additional normal distribution question prepared for the tutorial, but decided to keep it for the next tutorial and instead elected to finish with the collaborative development of a table of graphical and numerical summaries of data.
The final activity in Tutorial 1 started with an empty table on the whiteboard.
Row and column headings indicated what type of information needed to be added to
the table, that is, the rows indicating graphical or statistical summaries and the
columns indicating the type of variable (categorical or quantitative).
Instead of asking generally for assistance with working through the activity, I asked specific people to insert specific information where appropriate; for example, S7 was asked to place "pie graph" into the appropriate cell of the table (Figure 5.3).
Figure 5.3. Development of table of graphical and statistical summaries
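The table being built can be sketched as a nested structure, with rows for the kind of summary and columns for the type of variable. The cell entries below are standard introductory-statistics choices, not a verbatim copy of the tutorial's table.

```python
# Rows: kind of summary; columns: type of variable.
summaries = {
    "graphical": {
        "categorical": ["bar chart", "pie graph"],
        "quantitative": ["histogram", "boxplot", "stem-and-leaf plot"],
    },
    "numerical": {
        "categorical": ["counts", "percentages"],
        "quantitative": ["mean and standard deviation", "median and IQR"],
    },
}
# The cell S7 was asked to fill:
cell = summaries["graphical"]["categorical"]   # contains "pie graph"
```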
This activity took somewhat longer than expected as we had a few issues due to limited functionality in Wimba; for example, clicking on the Erase button removed all writing on the slide, not just the writing of one person (Section 4.2.4). Once the
information about graphs was entered into the table, discussion followed, for
example, when it was better to select a particular type of graph over another. Because
the tutorial was running past the designated one hour, I decided to postpone
completion of this activity until the next tutorial.
Some brief discussion and organisation around future topics, availability of
recordings, provision of tutorials during the festive season and specific planning for
the next week‘s tutorial followed. In conclusion, everyone contributed some form of
thank you and good night.
5.1.3 Interactions in Tutorial 1
During the first few minutes of the tutorial, participants were given an opportunity to
try the interaction features within Wimba. They were asked to write or type their
name and geographical location on the whiteboard. Everyone participated in this
student-to-group and student-to-content interaction; however, S7 had some initial technical difficulties, and S4 typed into the Chat window instead. Participants tested
the emoticons (Tick/Cross, Smiley-face, Clapping, Hand-up) and some tried using a
microphone to communicate during this initial period. I had recommended that they purchase a headset with a built-in microphone, emphasising that it was useful but not essential to do so. Since everyone had microphone capability, I asked them, in turn
around the group, to introduce themselves by telling us what degree they were
studying and indicate any topics they would like covered.
Despite having used the microphone without difficulty during introductions, most participants used text chat as their primary means of communication throughout Tutorial 1. This allowed
more than one student to answer a particular question. As the text chat response does
not appear until the Send button is clicked, students were formulating and sending
responses in quick succession before reading the responses of others. Early in this
tutorial, I realised that I would need to have a list of participants on a piece of paper beside me, as the Participant Window in Wimba was not large enough to display the full list of participants at one time. However,
when participants used the Hand-up icon to indicate that they wanted to use the
microphone, their name would appear at the top of the list in the Participant Window
until they finished talking. To use the microphone, participants were required to click
and hold down the Talk button in Wimba. They had to consciously remember to do
this to be heard. If they did not wait a couple of seconds after clicking the Talk
button before speaking, their first few words were lost. This made using the
microphone a little less straightforward. Because of these issues, I decided to lock
my Talk button in the On position. It was not practical to have everyone lock their
Talk buttons as this led to electronic interference and background noise.
For the multiple-choice questions in this tutorial, using the Poll function
within Wimba allowed me to track how many had contributed an answer. I allowed
time for everyone to submit an answer and expressed an expectation that all would
do so. The responses were displayed in a summarised form with no way to identify
who selected which response. When a variety of responses were given, this led to
discussion of the choices. When asked to elaborate on their thinking, most of the
participants contributed responses using text chat rather than the microphone,
although S2, S4 and S5 occasionally used the microphone. In this tutorial, the
participants generally only used the microphone when a question was directed to
them in particular, whereas I used the microphone throughout (teacher-to-student
interaction). When encouraged to use the microphone more, one participant (S5)
commented "we are all shy," while another (S4) pointed out that she had her very
young child with her and that his noise might be distracting. However, participants
did turn to the microphone when they wanted to ask a question that was too complex
to type in text chat quickly. While all participants contributed answers to the
multiple-choice questions and to the Tick/Cross confirmation of understanding, four
of the participants (S2, S4, S5, and S6) tended to be the most involved in the
conversation (both in text chat and microphone, as indicated by the number of
contributions made to discussion), whereas contributions made by the other three
participants (S3, S7 and S10) were more measured.
For the remainder of the tutorial (as noted in Section 5.1.2), we worked on
problems and a summary table on the virtual whiteboard. For these more complex
questions, participants interacted with the content in two ways: (a) overtly, by writing on the virtual whiteboard (underlining keywords in the question and typing information onto the whiteboard); and (b) privately, by doing calculations with calculator, pen and paper beside them at their computers (student-to-content interaction).
Social interaction occurred occasionally during the course of the tutorial.
Early in the tutorial, two participants (S6 and S10) communicated student-to-student
in text chat to determine where each was geographically located. This was the only
instance of a participant "talking" directly to another participant. Generally, social comments were used to lighten the moment, a bit like an aside that a person might make behind their hand to the person sitting next to them in a "real" classroom. Of
course, these comments were made to the whole group as they were made in the Chat
window. Unintentional misuse of the Wimba functionality provided some relief from intense concentration on the content when one participant clicked on the Erase button and deleted everyone's contributions on the whiteboard. It was not initially obvious that the Erase button deleted all contributions, not just the last one. She was very apologetic, but it actually
gave participants an opportunity to self-correct when I asked them to re-enter
contributions. Comments such as "Darn, got that one wrong," "Oh, that's right … woops" and "Kool, got it!" helped to lighten the intensity of the discussion and made participants seem more real to one another.
Interaction in this tutorial was primarily teacher-driven. While comments
from participants were directed to the teacher, they were, in practical terms, made to
the whole group. Very little student-to-student interaction occurred.
5.1.4 Use of the Tutorial 1 archive
The link to the archive was emailed to all study participants (S1-S12) not just those
who had attended the tutorial. A report within Wimba was obtained to ascertain who
had accessed the archive and when it was viewed. Two participants, S7 (who had
attended) and S8 (who had not attended), used the archive of this tutorial 2-3 weeks
after the tutorial and then again a few weeks prior to the examination.
5.2 Tutorial 2
The second tutorial, Tutorial 2, was conducted in Week 5 of the semester. Although
the first tutorial (Tutorial 1) had been concerned with orientation to the format of the
tutorials and the features of Wimba, there was a need to repeat some of this for the
two first-timers, S1 and S11. One of the new attendees was studying psychology,
while the other was studying mathematics. One was located in a nearby non-coastal
regional centre while the other was in the capital city. Again, this tutorial took more
than the designated one hour, lasting for approximately 75 minutes. In addition, there
was some social interaction and testing of technology amongst the early arrivals
during the 10 minutes prior to the start of the tutorial.
5.2.1 Who attended Tutorial 2
Seven participants attended the second tutorial, namely, S1, S2, S4, S5, S6, S7 and
S11. Four participants (S1, S2, S5 and S7) entered the room early with three (S1, S2
and S5) posting greetings in the Chat window. I had placed a "Welcome to Data Analysis" message on the virtual whiteboard prior to the session so that participants
would know that they were in the right place. Following greetings from the early
arrivals, I also typed a greeting in text chat so that participants would be reassured
that we would be starting the tutorial on time. Two more participants, S6 and S11,
arrived during this waiting period and posted greetings to the other participants. One
participant, S4, arrived late due to a mix-up with links to the Wimba Classroom.
Of the remaining participants, S3 sent an apology due to uncertain Internet access while travelling interstate; S8 had indicated prior to entering the study that she would not be able to attend this tutorial; and S9 did not enter the study until
after this tutorial. As indicated earlier in the introduction to this chapter, S10 did not
respond to any of several follow-up emails after attendance at the first tutorial and
will not be included in any further discussion of findings.
5.2.2 Format and content of Tutorial 2
Tutorial 2 started in earnest with a formal spoken welcome followed by a quick
"technology check" and introductions from the two new attendees, S1 and S11. This
revealed that S11 did not have access to a microphone. I assured her that
communication using text chat was acceptable and that she would not be
disadvantaged by not having a microphone. A brief question and answer segment
using the Poll function within Wimba was used to obtain information from
participants related to the research questions of the study (Section 5.2.2.1) including
feedback on material related to the previous week's course content (Tutorial 1).
5.2.2.1 Reflection on Tutorial 1
The research poll questions started with a simple icebreaker, followed by multiple-choice questions related to attitudes to the level of the content covered in the previous week and perceptions of the online environment, and an open-ended question identifying the most important thing gained from the previous week's tutorial (see Section 3.2.1.3 and Appendix B for details). All of those present
who had attended the previous tutorial indicated positive attitudes to the material
covered in the previous week, with responses such as "just right" (S5 and S7), "nothing new but helpful" (S2), and "thought provoking" (S6). All participants
indicated that they were comfortable with the online environment. Most participants
(S2, S5 and S7) indicated a particular content topic as the most important thing
gained from the previous week's tutorial, with graphing and types of variables
predominating. However, one participant (S6) indicated that it made her realise that
she was already behind in the work and that it gave her the incentive to catch up.
After each question, I displayed the summary of the anonymous results on the screen
emphasising the importance of being completely honest as their answers were an
essential component of the research study. This segment lasted approximately 10
minutes.
5.2.2.2 Tutorial 2 content
The course content covered in Tutorial 2 comprised contingency tables including
joint, marginal and conditional distributions, and applications of the normal model.
Prior to this tutorial, I emailed information about the questions we were to discuss to
all participants. The question on contingency tables used one data set to cover most
aspects of descriptive statistics for examining the relationship between two
categorical variables. Participants contributed to the calculation of row and column
totals, used the table to find specific percentages related to joint, marginal and
conditional distributions, and used conditional distributions to discuss whether the
information in the table indicated that the variables were related. I used
question/answer facilitation techniques to "tease out" the key concepts that needed to be covered to answer this type of question. During the discussion, I seized on opportunities as they presented themselves to discuss statistical jargon and compare it with
everyday language. Describing conditional distributions and using them to answer
related questions were problematic at first, but, with practice, the participants
improved in expressing their observations in appropriate statistical language. This
part of the discussion continued for approximately 40 minutes.
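The joint, marginal and conditional distributions discussed here can be sketched numerically. The counts below are invented for illustration; the tutorial's actual dataset is not reproduced in the text.

```python
# Hypothetical 2x2 contingency table: counts by gender and course result.
counts = {("female", "pass"): 30, ("female", "fail"): 10,
          ("male", "pass"): 24, ("male", "fail"): 16}
total = sum(counts.values())                                      # 80

# Joint distribution: each cell as a proportion of the grand total.
joint = {cell: n / total for cell, n in counts.items()}

# Marginal totals for the row variable (gender).
female_total = counts[("female", "pass")] + counts[("female", "fail")]
male_total = counts[("male", "pass")] + counts[("male", "fail")]

# Conditional distribution of result given gender.
p_pass_given_female = counts[("female", "pass")] / female_total   # 0.75
p_pass_given_male = counts[("male", "pass")] / male_total         # 0.60
# Differing conditional distributions suggest the two variables are related.
```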
From there, we moved on to discuss the normal model question that we did
not have time to complete in Tutorial 1. This was a typical normal model question
that involved a variable that followed a normal distribution and required finding the
proportion of values in a particular part of the distribution (Figure 5.4).
Figure 5.4. Using diagrams to explain concepts
For this we looked at a normal curve representing the particular context, related it to
the standard normal curve by calculating a z-score and then used the standard normal
table to find the required proportion. We did not have time to discuss the complementary problem, where we were given a proportion and had to find the corresponding value of the variable. This question was held over until the following week.
Discussion of this question lasted about 18 minutes.
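Both the forward problem worked through here and the held-over inverse problem can be sketched with Python's statistics module; the mean and standard deviation below are hypothetical, standing in for the tutorial's context.

```python
from statistics import NormalDist

# Hypothetical context: a variable that is Normal(mean=100, sd=15).
X = NormalDist(mu=100, sigma=15)

# Forward problem: proportion of values above 120, found by relating
# the value to the standard normal curve via a z-score.
z = (120 - X.mean) / X.stdev                 # z = 1.33
proportion_above = 1 - NormalDist().cdf(z)   # about 0.091

# Inverse (held-over) problem: the value below which 90% of the
# distribution lies, i.e. undoing the z-score calculation.
x_90 = X.inv_cdf(0.90)                       # about 119.2
```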
Again, the final four minutes allowed time for general organisational issues:
questions about archives, plans for the following week, making a start on the
assignment and final farewells.
5.2.3 Interactions in Tutorial 2
Prior to the start of Tutorial 2, the text chat provided a useful means to communicate
with participants as they entered the Wimba Classroom at a variety of times. Wimba
polling was used to elicit feedback related to the research questions. An icebreaker
question was used to instigate involvement early in the tutorial and make participants
comfortable with answering questions using the Poll function. During this part of the
tutorial, most students used text chat rather than the microphone to communicate
even though all but one (S11) had microphone capability and I encouraged them to
use the microphone. Apart from a few social comments, most of the interaction at
this time was student-to-teacher and teacher-to-student related to technical issues.
The content questions were presented on the virtual whiteboard as it was
necessary for participants to do some calculations and share their results with the
group. Working through the questions required direction from me in the form of
question and answer facilitation (teacher-to-group interaction), leading the
participants step-by-step through the process. Most participants, with the exception
of S7, were actively involved in answering questions, asking questions and entering
relevant information on the whiteboard. Interactions took place using a combination
of microphone and text chat, mostly with me on the microphone and participants
using the text chat. However, as the tutorial progressed, some participants (S1, S2
and S6 in particular) used the microphone more often, especially if a question or
answer was more complex to communicate such as when S1 described how to find a
conditional distribution.
Participants interacted with the content by calculating answers to specific
questions and presenting answers on the whiteboard or in the Chat window. Using
the whiteboard or text chat rather than using the microphone gave everyone an
opportunity to contribute an answer spontaneously, without interruption. Because of
the difficulty of the concepts being studied and the wording needed to communicate
those concepts, the teacher-to-group interactions were primarily content-driven. This
was then followed by student-to-teacher interaction which, by virtue of the
technology, could be interpreted as student-to-group as well. In this tutorial a major
part of the interaction was in the form of exposition by me; however, some participants (mainly S1 and S6) were comfortable with interrupting in order to clarify points of confusion (student-to-teacher interaction). In this environment, handwriting on the whiteboard allowed me to draw diagrams, which helped immeasurably in explaining the concepts and organising participants' thinking (teacher-to-content and teacher-to-group interaction).
Social interaction occurred briefly at the beginning as participants greeted the
group and then again at the end as participants said thank you and goodbye. This
type of interaction only occurred using text chat. In short, the complexity of the
concepts being discussed dictated that the interactions in Tutorial 2 were mostly
teacher-to-group.
5.2.4 Use of Tutorial 2 archive
As for Tutorial 1, the link to the archive was emailed to all study participants not just
those who had attended the tutorial. Similarly, as for Tutorial 1, S7 (who had
attended) and S8 (who had not attended), accessed the archive of this tutorial 2-3
weeks after the tutorial and then again a few weeks prior to the examination.
5.3 Tutorial 3
The third tutorial, Tutorial 3, was conducted in Week 6 of the semester, which coincided with Christmas week. By this stage, most participants were comfortable
with the technology, although some were still experiencing some issues with Internet
connectivity. So as to be ready to start the tutorial on time, I set up the second
computer for the research recording (as noted previously in this chapter) and
uploaded the PowerPoint slides about 10 minutes prior to the advertised starting
time. I also typed "Welcome to Data Analysis, I will be with you soon" on the whiteboard so that participants would know that I was there, but would not be tempted to ask questions before recording began, as these would not be captured in the archive. Yet again, this tutorial took more than the designated one hour, lasting for approximately 75 minutes.
5.3.1 Who attended Tutorial 3
Despite commitments around Christmas, eight participants attended the third tutorial
(S1, S2, S3, S4, S7, S8, S9 and S11). Two participants (S5 and S6) had apologised
prior to the tutorial with S5 unable to attend due to work commitments and S6 due to
Internet connection issues while moving house. Two participants (S1 and S2) had
entered the room early and were soon joined by S7. After a further five minutes, S9
joined the group for her first tutorial. I had a brief text chat conversation with S9 to
check if the technical issues she experienced when trying to join earlier tutorials had
been resolved.
I started proceedings with "Hi everyone" and a special welcome to S9. In the
meantime, S1 had exited and re-entered the room due to technical issues with audio.
S9 also experienced sound issues, so I suggested that she exit and re-enter to see if
that fixed the problem. While this was happening, S3 entered the room. The
remaining three participants joined the tutorial after the start: S4 part way through the research poll questions, S8 during discussion of the fourth content slide, and S11 towards the end of the tutorial, when fewer than ten minutes remained.
Approximately 10 minutes after S9 exited the Wimba Classroom, I had an email exchange with her in which she commented that she could not get back into the room and needed to follow up on an error message that recommended she contact Wimba
support. She indicated that she would try to get it sorted out during the week and join
the group in the next tutorial.
5.3.2 Format and content of Tutorial 3
After a quick sound check, I started the tutorial with a number of research questions
relevant to the study. This was followed by discussion of material related to the
previous week's course content, according to the course schedule.
5.3.2.1 Reflection on Tutorial 2
As in the previous tutorial, the research poll questions started with a simple
icebreaker followed by multiple-choice questions related to key questions including
attitude to the level of the content covered in the previous week and how they were
feeling about the course at this point in time, and an open-ended question identifying
the most important thing gained from the previous week's tutorial (see Section
3.2.1.3 and Appendix B for details). All of those present who had attended the
previous tutorial (Tutorial 2) indicated positive attitudes to the material covered in
the previous week, with responses such as "just right" (S1, S2 and S7) and "nothing new but helpful" (S2). However, participants also described it as "thought provoking" (S4 and S7). Two participants (S2 and S7) indicated a particular content topic, namely, distributions, as the most important thing gained from the previous week's tutorial. Reassuringly, two participants indicated that "I wasn't as far behind as I thought I was" (S1) and "…clarification of some questions. Hopefully am getting there and switching on!!" (S4). With reference to their feelings about the course, four out of five chose "getting on top of it" (S1, S3, S4 and S7). The Mathematics students (S1 and S2) both chose "enjoying it." None of the negative
options were chosen. As before, after each question, I displayed and discussed,
where necessary, the summary of the results on the screen. To fill in the time while
waiting for answers to be submitted, I discussed organisational issues and overall
course structure, that is, how topics fitted together. This segment lasted
approximately 10 minutes.
5.3.2.2 Tutorial 3 content
The course content covered in this tutorial included discussion questions, SPSS
(statistical software) output and multiple-choice questions relevant to regression and
correlation. Prior to the tutorial, I emailed the discussion question and associated
SPSS data file to all participants. The first question presented a scenario and data that
required analysis using regression and correlation (Figure 5.5). To add structure to
the discussion, I asked questions related to the most important words in the question
to highlight key concepts and understanding of the terminology that needed to be
used.
After initial discussion about the context of the question which yielded the variables
of interest and identified dependent and independent variables, we worked through
the analysis looking at the data in SPSS, the scatterplot of the data, outliers, the
equation of the line of best fit, residuals and residual plot, coefficient of
determination, correlation, lurking variables and interpretation of the corresponding
SPSS output (Figure 5.5). Discussion of this question set the scene for working
through the multiple-choice questions as we had reviewed key concepts of the topic
ready to test understanding. This part of the tutorial lasted for about 45 minutes.
Figure 5.5. Interpreting SPSS output
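The regression quantities worked through above (line of best fit, residuals, coefficient of determination, correlation) can be reproduced by hand. The paired data below are invented, standing in for the SPSS data file, which is not included in the text.

```python
# Hypothetical paired data in place of the tutorial's SPSS file.
x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 8.0, 9.8]
n = len(x)
mean_x, mean_y = sum(x) / n, sum(y) / n

# Least-squares line: y-hat = a + b*x.
sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
sxx = sum((xi - mean_x) ** 2 for xi in x)
b = sxy / sxx              # slope
a = mean_y - b * mean_x    # intercept

# Residuals: observed minus predicted values.
residuals = [yi - (a + b * xi) for xi, yi in zip(x, y)]

# Correlation r and the coefficient of determination r^2 (the
# proportion of variation in y explained by the linear model).
syy = sum((yi - mean_y) ** 2 for yi in y)
r = sxy / (sxx * syy) ** 0.5
r_squared = r ** 2
```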
Working through the multiple-choice questions presented opportunities for
clearing up misunderstandings and reinforcing concepts that were already
understood. Participants were given an opportunity to select an answer and then
justify their choice. The six multiple-choice questions were chosen to emphasise the
distinction between explanatory and response variables, quantitative variables,
prediction using the regression equation, the need for the relationship to be linear,
direction of the relationship and interpretation of correlation. Since the final
examination was to include 20 multiple-choice questions, this was an ideal time to
discuss examination technique in relation to this type of question, for example,
underlining keywords and reducing choices by eliminating more obvious incorrect
answers. These questions took approximately 15 minutes to complete.
In the final five minutes of the tutorial, some specific questions that had not
been resolved earlier were answered, plans for the following week were discussed,
Christmas wishes expressed and the usual farewells typed into the Chat window.
5.3.3 Interactions in Tutorial 3
While working through the research questions at the beginning of the tutorial, only
minimal interaction took place. The extent of the interaction was: (i) student-to-content, in answering questions using the Poll function; and (ii) teacher-to-group, in talking about the structure of the course. However, when the first content
question was displayed on the whiteboard, I directed a question to the group and one
participant used the Hand-up icon to respond. Once this initiative had been taken to
ask questions and contribute ideas in this way, some of the others followed. The
Hand-up icon and microphone were used to ask questions to a greater extent than
in the previous two tutorials.
The context of the principal question used in this tutorial to prompt
discussion on the topic was somewhat confusing and in some ways a poor choice on
my part. However, it did have a positive impact in that it prompted participants to
become actively involved in collectively arriving at an understanding of the context
before we were able to move on to the statistical implications of the question. The
main contributors to the discussion of this question were S1, S2, S7 (using the
microphone) and S4 (using text chat); see Tables 5.2 and 5.3.
As we worked through the content slides, I used a combination of direct
instruction and question/answer facilitation to engage the students. From my
perspective, interaction took the form of teacher-to-group instruction and
questioning, teacher-to-student interaction in responding to specific questions and
teacher-to-group interaction in posing questions to the group to move the discussion
forward. Participant contributions included student-to-teacher interaction when
asking a direct question or clarifying understanding and student-to-group interaction
when offering answers to my questions to further discussion.
To encourage more interaction in this tutorial, I decided to use more multiple-
choice questions rather than predominantly open-ended questions as was the case in
Tutorial 2. With multiple-choice questions, participants could offer a response even
if they were unsure of the answer. By comparison, short-answer questions can be
more confronting in that most people would want to have a reasonable idea of the
answer before committing to a response.
There were six participants in the Wimba Classroom during the multiple-
choice questions but not all contributed answers. One was having technical
difficulties and could not contribute. Of the remaining five, all five contributed to
two of the six questions, four contributed to another two questions and three
contributed to the remaining two. During this period, only three participants (S1, S2 and
S4) contributed to the discussion using text chat and microphone. Even though S7
had contributed to discussion in the earlier part of the tutorial, he did not contribute
to either text chat or microphone discussion during the multiple-choice questions. S3 and
S8, who was experiencing technical difficulties, contributed very little to the
discussion throughout the tutorial (Table 5.2). Towards the end of the tutorial, there
was one instance of student-to-student interaction between S1 and S2, with S1
helping S2 with an SPSS issue that I had not been able to resolve.
Tutorial 3 had less social interaction than the previous tutorials and was
somewhat disjointed because of staggered entries and exits to the Wimba Classroom.
There was some social greeting at the beginning of the tutorial and farewells at the
end of the tutorial as usual (Figure 5.13). Some people experienced difficulties with
the technology, requiring some teacher-to-student interaction to rectify.
5.3.4 Use of Tutorial 3 archive
As was to become a regular practice, the link to the archive was emailed to all study
participants, not just those who had attended the tutorial. Three participants (S5, S7
and S8) accessed the archive of this tutorial. Two of these participants, S7 and S8,
had attended the tutorial while S5 had not. However, S5 accessed the archive prior to
the following tutorial and then again prior to Tutorial 5. Similar to the previous
week, S7 and S8 each accessed the archive a couple of weeks after the tutorial and
then again prior to the examination.
5.4 Tutorial 4
The fourth tutorial, Tutorial 4, was conducted in Week 7 of the semester. This week
was usually taken as a semester break because it fell between Christmas and New
Year and the University was closed except for essential services. However, when
asked by a participant if we would be having a tutorial, I had agreed to conduct one if
participants wanted me to do so. Most of the participants voted in favour of holding a
tutorial. This tutorial took even more time than the previous three, lasting
approximately 90 minutes.
5.4.1 Who attended Tutorial 4
Five participants attended the fourth tutorial (S1, S4, S6, S7 and S8). Since this
tutorial was viewed as an additional tutorial because of the time of the year, I did not
expect apologies for non-attendance as it was mutually understood that not all would
be able to attend. Despite this, several participants had contacted me prior to the
tutorial with an apology, namely, S3 due to an illness in the family, S5 due to illness
and S9 for non-specific reasons.
Two participants, S7 and S8, had entered the room early. I had, as usual,
posted the message "Welcome to Data Analysis, I will be with you soon" on the
whiteboard. After a couple of minutes, I posted a text chat message to say "Hello"
and to advise that the tutorial would start in five minutes. I also asked the participants
to think about keywords related to the binomial distribution while they were waiting.
A few minutes later, S1 entered the room while I was busy communicating with S9
outside the room trying unsuccessfully to assist her with technical issues.
I started the tutorial by saying "Hi everyone," enabled all participants to write
on the whiteboard and then introduced the topic. Within five minutes of starting the
tutorial, the remaining participants, S4 and S6, arrived in quick succession.
5.4.2 Format and content of Tutorial 4
The format for this tutorial was not completely consistent with previous tutorials.
While waiting for more participants to arrive, I asked those already present to write
keywords on the whiteboard related to the binomial model which was the main topic
for the session. Following a sound check, S4 entered the room and we continued
discussing the SPIN acronym (Success, Probability of success, Independence,
Number of trials) and how it assisted in determining when the binomial model was
appropriate. Once the final participant for this tutorial (S6) entered the room, we
moved on to the regular research poll questions.
5.4.2.1 Reflection on Tutorial 3
The icebreaker question for this week referred to New Year's resolutions since this
time was fast approaching. The key research questions covering the level of content
and most important skill or knowledge gained from the previous tutorial were posed
(Appendix B). The four participants who had attended the previous tutorial indicated
that the content was "just right" (S1, S4, S7 and S8) with three adding that it was
"thought provoking" (S1, S4 and S8) and one noting that there was "nothing new but
helpful" (S1). Again, there were two aspects to responses about the most important
understandings gained. Two participants commented on specific content, namely:
"more experience working with scatterplot and R0 and R1 values from SPSS output"
(S1) and "regression and correlation and unstandardised residuals" (S7). The other
two participants commented on less tangible things such as "reassurance about what
I have been studying" (S8) and "got me back on track again clearing some
confusion" (S4). S6 had not been present at the previous tutorial.
A third research question, enquiring about which topic had been most
challenging so far, was added for the first time. The topics most often mentioned
were "contingency tables and conditional distributions" which were covered in
Tutorial 2 and "experiments, observational studies and sampling" which we had not
yet covered in the online tutorials. A summary of responses to each question was
displayed after each question, the final one prompting me to add that we would look
at experiments, observational studies and sampling next week. This segment lasted
for about six minutes.
5.4.2.2 Tutorial 4 content
Following the research questions, we returned to discussion about probability
distributions, in particular the binomial distribution. This time I did not email the
questions prior to the tutorial as they were all multiple-choice questions and I wanted
the group to share their thinking processes in working out the answers. Three of the
twelve questions covered basic probability rules.
The remaining nine questions tested knowledge on identifying binomial situations
(using the SPIN acronym), defining the binomial model parameters in specific
contexts, finding the mean and standard deviation of a binomial random variable,
calculating binomial probabilities using binomial tables and the normal
approximation to the binomial distribution (Figure 5.6). Everyone was encouraged to
do the calculations, use the tables as necessary and then to contribute an answer by
putting a mark on the whiteboard next to their choice.
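The binomial facts rehearsed in these questions (the mean and standard deviation of a binomial random variable, an exact binomial probability, and the normal approximation) can be sketched as follows; the values n = 20 and p = 0.3 are illustrative assumptions, not taken from the tutorial's questions.

```python
import math
from statistics import NormalDist

# Illustrative parameters only; the tutorial's actual questions are not reproduced here.
n, p = 20, 0.3

# Mean and standard deviation of a binomial random variable X ~ B(n, p)
mean = n * p                       # n*p
sd = math.sqrt(n * p * (1 - p))    # sqrt(n*p*(1-p))

# Exact P(X <= 4) from the binomial probability mass function
exact = sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(5))

# Normal approximation with continuity correction:
# P(X <= 4) is approximated by P(Z <= (4.5 - mean)/sd)
approx = NormalDist(mean, sd).cdf(4.5)

print(round(exact, 4), round(approx, 4))
```

Comparing `exact` and `approx` makes visible what the tutorial questions tested: how close the normal approximation is, and why the continuity correction is applied.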
From there, we discussed reasons for choices and where misconceptions
could arise. Following the discussion of each question, I used the Tick/Cross icon to
gauge understanding. Throughout the discussion, I highlighted keywords, correct use
of terminology and examination techniques. Incorrect answers were used as an
opportunity to expand the discussion beyond the direct concept being tested. In this
way participants were encouraged to "have a go" even if they were unsure of the
answer to a question. These questions and associated discussion lasted for about an
hour.
Figure 5.6. Normal approximation to the binomial distribution
Because it had been a long session, we concluded the tutorial in the final
three minutes with some suggestions from participants for the next tutorial. The usual
farewells typed into the Chat window followed.
5.4.3 Interactions in Tutorial 4
After the usual greetings and sound check, this tutorial started with content
discussion on terminology associated with the binomial model. Interaction was
primarily student-to-group and teacher-to-group as keywords were discussed and
elaborated. However, there was one instance of student-to-student interaction in text
chat when S4 assisted S8 with instructions on how to write text on the whiteboard.
Minimal personal interaction took place during the research questions as participants
concentrated on the content of the questions and providing answers.
Tutorial 4 consisted of working through 12 multiple-choice questions related
to probability, in particular the binomial model. This gave ample opportunity for
participants to interact with the content by indicating an option. However, only one
of the twelve questions was answered by all five participants. This may have been
because I specifically asked that everyone should contribute an answer for that
question. For the other eleven questions, four participants contributed answers to one
question, three participants contributed answers to each of five questions and two
participants contributed answers to each of the remaining five questions. Following
my lead, participants also interacted with the content by underlining keywords in the
questions. I also interacted directly with the content and indirectly with the students
by using electronic handwriting on the whiteboard to develop solutions and enhance
explanations using diagrams (Figure 5.6). Teacher-to-student interaction was
focussed on checking understanding of the content through the use of the Tick/Cross
icon or Smiley Face emoticon. Most student-to-group interaction came from S1 and
S6 (using text chat or microphone) and S4 (using text chat). S1 was a little slow to
contribute at the start and he admitted that he had not looked at this content prior to
the tutorial. S7 contributed occasionally (using text chat or microphone) but this
lessened as the tutorial proceeded. While S8 contributed to discussion at the
beginning of the tutorial, she stopped once work started on the multiple-choice
questions. Teacher-to-group interaction was a mixture of: (i) facilitating discussion
by asking questions that led to the next step in solving a problem; and (ii) direct
instruction on content that needed to be understood to answer a question.
There were social greetings at the beginning of the tutorial and farewells at
the end as usual. There was some further personal interaction when S6 arrived late.
There were noticeably more student-to-student interactions using text chat in this
tutorial, particularly in the form of community building comments. When S6 picked
up an error in an answer given by S1, S1 responded "Good catch," to which S6
responded "Thanks." When S4 offered a correct answer and commented "woo hoo,"
S1 responded by using the Clapping emoticon. Participants were also showing more
confidence within the group, for example, S6 responded to a question from me to the
group with a candid "Got me, can't remember" and when S6 offered an incorrect
answer, S1 admitted "that's what I thought." These were the types of interactions
more typically associated with a face-to-face class.
5.4.4 Use of Tutorial 4 archive
Four participants (S5, S7, S8 and S9) accessed the Tutorial 4 archive. Two of these,
S7 and S8, had attended the tutorial while S5 had been unable to attend due to illness
and S9 continued to have technical issues with connecting to Wimba Classroom. The
timing of their access indicates that three (S5, S7 and S8) used the archive during the
revision period just prior to the examination whereas S9 tried to access it just prior to
the following tutorial. One participant, S7, reviewed the archive twice during the
revision period.
5.5 Tutorial 5
The fifth tutorial, Tutorial 5, was conducted in Week 8 of the semester, the day after
the New Year's public holiday. In Australia, this particular year proved to be unusual
with respect to the weather. At this point in time, all localities where participants
lived were experiencing much higher than normal rainfall, to the extent that some
had experienced localised flooding. This topic dominated text chat conversation
amongst the participants as they arrived in the Wimba Classroom. This tutorial lasted
approximately 75 minutes.
5.5.1 Who attended Tutorial 5
Seven participants attended Tutorial 5 (S1, S2, S4, S5, S6, S7 and S8). This group
included the five most regular attendees, namely S1, S2, S4, S6 and S7. One
participant, S3, was still unable to attend due to ongoing family issues while S9
continued to have technical problems which could not be resolved. All
participants were present in the room five minutes before the start. Because of an
error in uploading content for the tutorial, I had to exit the Wimba Classroom to
upload the correct content. During my absence, participants conversed about the
weather and flooding using text chat. Once I re-entered, there was no further delay.
5.5.2 Format and content of Tutorial 5
Tutorial 5 started with the regular research poll questions, followed by content
questions related to experiments, observational studies and sampling. According to
the course study schedule, this material should have been covered a couple of weeks
prior to this tutorial. I had delayed this content in order to cover the binomial
distribution, a topic often perceived as more difficult, in the previous tutorial. This
was designed to give the participants more time to assimilate the more difficult concepts in
preparation for their second assignment. The plan for this tutorial was to also cover
material on sampling distributions, if time allowed.
5.5.2.1 Reflection on Tutorial 4
As was becoming usual practice, I started the research poll questions with an
icebreaker. Appropriately, this question related to the weather, a topic that had been
dominating the social conversation. Key questions covering the level of content and
most important thing gained from the previous week‘s tutorial were the only research
questions posed in this tutorial (Appendix B). Four of the five participants who had
attended the previous tutorial indicated that the content was "just right" (S1, S4, S6
and S7) with two also adding that it was "thought provoking" (S1 and S4). The other
participant, who had been experiencing some health problems, indicated that she
thought that it was "a bit confusing" but also "thought provoking" (S8).
Responses to the usual question about the most important thing gained from
the previous week's tutorial included comments such as: "that I needed to catch up"
(S1), "helpful embedding the knowledge" (S8), "allowed me to tie things together
better – big picture stuff" (S6), and "learn to read the questions twice or more!!"
(S4). Two participants commented on specific content: "using the normal to
approximate the binomial was useful" (S1), and "binomial mean and standard
deviation" (S7). I did not publish a summary of responses to the research questions
this time because I wanted to start discussion of the content questions without delay.
During this period, there was some further social conversation about the flooding, in
particular, S1's chances of being able to travel north to take up new employment.
This segment lasted for about six minutes.
5.5.2.2 Tutorial 5 content
The course content covered in this tutorial included multiple-choice questions on
experiments, observational studies and sampling. As I had not emailed the questions
to participants prior to the tutorial, they did not have an opportunity to think about
the questions beforehand. It was not anticipated that this would be an issue as the
participants should have already covered this content in their private study.
I was hoping to complete the ten questions fairly quickly. The first two
afforded an opportunity to look for keywords, clarify terminology and discuss
definitions related to the distinction between observational studies and experiments.
As expected, a number of questions proved to be easy and straightforward, requiring
little discussion as all responses were correct. However, this changed
when participants were asked to justify their answers and explain why the other
options were not chosen. The next few questions tested knowledge of sampling
including population of interest and types of sampling.
Following this, questions on experimental design, including the statistical
concept of confounding, were posed. The final question prompted much discussion
because of unintentional poor wording. These ten questions took approximately 30
minutes to complete leaving time for discussion of some sampling distribution
questions.
The topic of sampling distributions, in particular the distribution of the
sample mean, was reviewed using five multiple-choice questions. Some questions
involved simple definitions but others required a deeper understanding of sampling
variability. For this latter type of question, understanding was developed by
identifying keywords in the question and talking through the thinking process,
particularly consideration of the meaning of each of the options in the multiple
choices. Participants were encouraged to explain their choice and what led them to
that answer.
Questions involving mathematical calculations were treated slightly
differently. After participants had committed to an answer, the group were guided
through the steps of the calculation so they could identify where they may have made
errors (Figure 5.7). In particular, we discussed the relationship between the standard
deviation and the standard error, an aspect that seemed to be poorly understood.
Several participants enquired about the archive of the tutorial. There were actually
nine questions prepared for this topic but these five questions took almost 20 minutes
to discuss.
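The distinction discussed here, between the standard deviation of individual observations and the standard error of the sample mean, can be sketched numerically; the values σ = 15 and n = 36 are illustrative assumptions, not figures from the tutorial questions.

```python
import math

# Illustrative values only (not from the tutorial questions):
# a population with standard deviation sigma, sampled with sample size n.
sigma = 15.0
n = 36

# The standard deviation describes the spread of individual observations;
# the standard error describes the spread of the sample mean across
# repeated samples of size n: SE = sigma / sqrt(n).
standard_error = sigma / math.sqrt(n)

print(standard_error)
```

The point the participants found difficult follows directly from the formula: the sample mean varies far less than individual observations, and increasing the sample size shrinks the standard error further.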
Figure 5.7. Guiding through the steps of the calculation
The last ten minutes of Tutorial 5 were used to reassure participants that they
were progressing well. This was necessary as it became apparent from comments
such as "a little more practice methinks!!" (S2) and "arrrggghhh" (S5) that this
tutorial had dented their confidence. I commented that we would be revisiting this
content over the next few weeks and answered lingering questions about sampling
distributions. I briefly mentioned the specific content to be covered in the next
tutorial. We also discussed the possibility of having a revision tutorial in the week
before the examination. Finally, we all wished S1 a safe journey to his new
employment and the participants typed their usual farewells into the Chat window.
5.5.3 Interactions in Tutorial 5
Tutorial 5 evidenced a mix of student-to-content, teacher-to-content, student-to-
group, teacher-to-group, student-to-teacher, teacher-to-student, and student-to-
student interactions. During the false start to the tutorial, when I needed to leave the
Classroom to upload the correct content, some participants (S1, S4, S6 and S8) took
advantage of the opportunity to interact socially using text chat. Some of their
comments were directed to specific individuals but most were directed to the
group. When directing a comment or question to a specific person, the participant
generally named the person to whom the comment was directed. With the Chat
window visible to all, some comments may have been to specific individuals but in
this environment, it was not always possible to distinguish.
During the period when participants were answering the research questions
and waiting for others to lodge responses, some additional social interaction
occurred. This interaction consisted of both teacher-to-student (to S1) and student-to-
student (S2 and S6 to S1) as we expressed our well wishes on S1‘s move to new
employment in another location. Teacher-to-group interaction was needed to
encourage everyone to answer these questions promptly.
Interaction with content included working as a group on ten multiple-choice
questions on the topic of experiments, observational studies and sampling. Similar to
the previous tutorial, this required participants to visibly interact with the content by
indicating their favoured option in a multiple-choice question on the virtual
whiteboard. Not all participants indicated responses on the whiteboard. For the first
five questions, between three and five of the six participants contributed responses.
However, the remaining five questions prompted much discussion resulting in a full
response rate by the last question. As previously noted, the technology did not
provide information on who was writing on the whiteboard so it was not possible to
determine who was not contributing a response.
Interaction with content also included working as a group on five multiple-
choice questions on the topic of sampling distributions. In general, there was a drop
in the number of responses observed on the whiteboard compared with the full
response rate observed in the last question of the earlier topic. This was not
unexpected due to the higher level of difficulty of this topic.
The last question in this topic required a number of steps and some
calculation to complete. Through facilitation, this prompted more contributions from
the participants thus actively engaging everyone in the group. By drawing a normal
distribution diagram and writing the calculations on the virtual whiteboard, this
question gave me an opportunity to demonstrate a step-by-step problem-solving
strategy to develop the answer (Figure 5.7). The complexity of this topic also gave
participants many opportunities to add to the discussion. As a result, there was
considerable student-to-group interaction. This was supported as necessary by
teacher-to-group interaction when I explained the more difficult concepts. At times, I
asked specific participants to expand on comments that they had made in the Chat
window (teacher-to-student). Usually they used the microphone to do this as it was
quicker for lengthier elaborations (student-to-teacher interaction, but with the group
benefitting from the explanations given). There was noticeably more use of the
microphone in this tutorial with all but one participant (S5) using the microphone at
least once during the tutorial.
Throughout the tutorial, my interaction was a mixture of teacher-to-student,
that is, with an individual, and teacher-to-group, with the majority of my
comments, explanations and questions purposely directed to the whole group.
With no visual cues, when I asked a question of, or supplied an explanation to, a
specific student, it was necessary to address them by name. In particular, teacher-to-
student interaction occurred whenever a specific student was asked to elaborate on
their response to a multiple-choice question. The elaboration of their response was
generally directed towards the teacher but the whole group benefitted from it. Further
to this, student-to-teacher interaction involved asking questions to clarify
understanding as the discussion of a multiple-choice question progressed. The
answer to any individual student's question was directed from teacher to that student
but, again, the whole group benefitted from the explanation.
During this question-answer interaction, the microphone was the main means
of communication. When specific participants were directly asked questions, they
responded using the microphone. However, other participants would also comment
in the Chat window in quick succession to add to that participant‘s response. When
using the microphone to ask a question, participants were now in the habit of using
the Hand-up icon so as not to interrupt the flow of the conversation.
While the questions on the whiteboard were the motivators of any interaction,
the facilitation role of the teacher was necessary to move the discussion forward and
clarify points that were obstructing progress. In the facilitator role, I provided the
content that prompted the discussion. I interacted with the content by modelling the
problem-solving strategies required and I confirmed student understanding
throughout the discussion of the key concepts required to formulate an answer to a
question. In modelling problem-solving processes, I encouraged students to underline
keywords, but they were slow on the uptake of this strategy at this point in time.
There were three main types of distinguishable student-to-student interaction
in this tutorial.
The first involved comments used to support a specific answer given by another
student, for example, when S6 said "Good point S2," referring to a pertinent
comment made by S2 in the discussion of a question.
The second type of student-to-student interaction occurred when one offered
emotional support to another. S4 prefaced a question directed to me with "I hope
this isn't a silly question" and S6 responded "no such thing as a silly q'n S4" in
the Chat window while I was responding to the question.
The third type of student-to-student interaction occurred at the end of the tutorial
when S2, S4, S6 and S7 posted specific well wishes to S1 on his upcoming move
to a new town, made challenging given the flooding occurring across the country
at that point in time. Comments included "Pack your floaties, S1!" "Take a GPS,
S1," and "Be careful, be safe!" to which he responded "Yep, gonna convert the
Kia to a Duck."
S1, S2, S4, S5 and S6 contributed most throughout the discussion. S7 and S8
contributed towards the end of the tutorial.
5.5.4 Use of Tutorial 5 archive
Four participants (S3, S5, S7 and S8) accessed the archive, with S7 reviewing it
twice. One of the participants, S3, had not attended the tutorial due to ongoing family
issues. The timing of access indicates that in all cases the archive was reviewed
during the revision period just prior to the examination.
5.6 Tutorial 6
The sixth tutorial, Tutorial 6, was conducted in Week 9 of the semester, the day after
an unprecedented flood event occurred in our region. All participants in the study
were affected to some extent. I sent an email on the evening of the flood event to
inform participants that I would conduct the tutorial the following evening provided I
had an Internet connection. Several participants responded the next day expressing
support for me, my family and my community. This was despite the fact that many of
them were facing flood concerns of their own as, in coming days, the flood waters in
our major river systems moved inevitably towards the coast and, in particular, into
the capital city. Initially, I had problems with my Internet connection but finally
succeeded in connecting just before the tutorial was due to start. It was not unexpected that the first
few minutes of the tutorial were devoted to finding out how everyone was faring.
This tutorial lasted approximately 75 minutes.
5.6.1 Who attended Tutorial 6
Six participants attended Tutorial 6 (S1, S2, S3, S4, S6 and S7). This was an
excellent attendance considering the circumstances. Most of those who attended were
affected by the flood, were about to be affected by the flood or had family and
friends affected by the flood. S1, who had been farewelled in Tutorial 5, was caught
in transit while moving with his family to a new town. Even though he experienced
technical problems due to an unstable Internet connection at his accommodation, he
was able to participate in most of the tutorial. Two study participants emailed an
apology for not attending: S5 was faced with a chaotic household as some of her
extended family needed to be evacuated from the flood zone and S9 had other
unspecified duties. It was assumed that S8 was facing similar difficulties as her
district was also in major flood.
5.6.2 Format and content of Tutorial 6
To establish some sense of normality, I proceeded to the usual research poll
questions fairly quickly. These were followed by an introduction to hypothesis
testing using the sign test. I had intended to offer some multiple-choice questions on
hypothesis testing for a proportion but there was insufficient time. I also did not have
time to revisit sampling distributions, deciding instead to email answers to questions
not covered in the previous tutorial, advising participants to email me if they had any
difficulties.
5.6.2.1 Reflection on Tutorial 5
The first research question (icebreaker) asked how everyone was faring in the floods.
This sparked some social conversation about the severity of the floods with S7
commenting that he was "just putting more sand bags out the back!" The flood
actually reached his locality two days later. There was further conversation with S1
about his travels north. I repeated the three research questions that were asked in
Tutorial 3, checking on their attitude to the level of the content covered in the
previous tutorial, asking an open-ended question identifying the most important thing
gained from the previous week‘s tutorial and enquiring how they were feeling about
the course at this point in time (Appendix B).
Four of the five participants who had attended the previous tutorial indicated
that the content was "just right" (S1, S2, S4 and S7) with three adding that it was
"thought provoking" (S1, S4 and S7) and the other commenting "nothing new but
helpful" (S2). The fifth participant indicated that she thought that it was "a bit
confusing" but also "thought provoking" (S6).
Responses to the question about the most important skills or knowledge
gained from Tutorial 5 included: "just generally helpful" (S2), "that I was behind and
need to catch up" (S6), and "learning to read questions properly!" (S4). Two
participants commented on specific content: "brushing up on z scores" (S4) and
"using the different formulas for finding the SD and Mean for proportion and mean
of a sampling distribution" (S1). Another participant commented "all very important"
(S7). With reference to feelings about the course, all participants said that they were
"enjoying it," with four of the six adding that they were "getting on top of it" (S1,
S3, S4 and S7). Of the remaining two, one added that she was "struggling" (S6) and
the other added "all good" (S2). It should be noted that the participant who was
struggling and found it a bit confusing had indicated that she was behind in her study
because of moving house. During this time, there was further social conversation
about the impacts of the flooding. This segment lasted for about 16 minutes.
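The formulas S1 mentions are the standard results for the centre and spread of a sampling distribution: a sample proportion has mean p and standard deviation sqrt(p(1-p)/n), while a sample mean has mean mu and standard deviation sigma/sqrt(n). A minimal sketch (in Python, with purely illustrative numbers; the course itself worked with formula sheets, calculators and SPSS rather than code):

```python
from math import sqrt

def sampling_dist_of_proportion(p, n):
    """Mean and standard deviation of the sampling distribution of a sample proportion."""
    return p, sqrt(p * (1 - p) / n)

def sampling_dist_of_mean(mu, sigma, n):
    """Mean and standard error of the sampling distribution of a sample mean."""
    return mu, sigma / sqrt(n)

# Illustrative numbers only (not from the course materials):
print(sampling_dist_of_proportion(0.4, 100))  # (0.4, 0.0489897...)
print(sampling_dist_of_mean(50, 10, 25))      # (50, 2.0)
```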
5.6.2.2 Tutorial 6 content
The tutorial could have moved in a number of directions from here as I had prepared
material for more than one topic. The participants took a vote and decided on the
sign test. I had not been able to email the questions prior to the tutorial so
participants came into the discussion relatively unprepared. As well as being a simple
introduction to hypothesis testing, the sign test applied previous knowledge of the
binomial distribution.
As a group, we worked through a sign test example highlighting the four
steps to a hypothesis test: developing the hypothesis statements; formulating a test
statistic; finding the p-value; and drawing a conclusion in the context of the question.
We did this informally (Figure 5.8), discussing the key concepts, and then more
formally with appropriate setting out and terminology.
Figure 5.8. Working through a sign test informally
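As a hedged illustration of those four steps, the sketch below works a hypothetical sign test (9 positive signs out of 10 non-tied pairs, numbers chosen for illustration rather than taken from the tutorial) using only the binomial distribution with p = 0.5, which is exactly the prior knowledge the sign test draws on:

```python
from math import comb

def sign_test_p_value(n_plus, n, alternative="greater"):
    """Sign test p-value: X = number of positive signs, X ~ Binomial(n, 0.5) under H0."""
    p_upper = sum(comb(n, k) for k in range(n_plus, n + 1)) / 2 ** n
    if alternative == "greater":
        return p_upper
    if alternative == "two-sided":
        return min(1.0, 2 * p_upper)
    raise ValueError(alternative)

# Step 1: H0: p = 0.5 (no effect) versus Ha: p > 0.5 (improvement).
# Step 2: test statistic = number of positive signs, here 9 out of n = 10.
# Step 3: p-value = P(X >= 9) = (C(10,9) + C(10,10)) / 2**10.
print(round(sign_test_p_value(9, 10), 4))  # 0.0107
# Step 4: 0.0107 < 0.05, so reject H0, stating the conclusion in context.
```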
I talked about the research process of formulating the research question and
gathering evidence to support it. From the discussion, we were able to clarify such
things as the distinction between null hypothesis and alternative hypothesis, and one-
tailed and two-tailed tests. We also looked at how to state the conclusion in
appropriate language, balancing statistical language with everyday language.
Calculating the test statistic drew on and tested their knowledge of the binomial
distribution. Some people had tried examples from the course study book so this
discussion also gave them an opportunity to clarify understanding in other contexts.
The discussion included level of significance, Type I error and Type II error. The
content of this topic was very dense with new concepts and terminology, prompting
one participant to comment that "[my] brain hurts but I'm sure I'll be fine" (S6). This
question on the sign test, in all of its facets, took us 45 minutes to complete.
Again, the last ten minutes of the tutorial were used to reassure participants
that we would be revisiting these concepts and processes many times over the
coming weeks. I answered a few questions from participants wanting to confirm their
understanding. I could "see" that they were all reaching saturation point with so
many new and complex ideas being discussed. It was necessary to give them time to
review what we had discussed and let the key ideas come together into coherent
thoughts. Because we did not cover as much content as I had hoped we might, I
proposed to email everyone the answers to the sampling distribution questions and
the multiple-choice questions on hypothesis testing with proportion. I requested that
participants try these multiple-choice questions in their own time, email me any
queries, and that we would look at them briefly at the beginning of the next tutorial.
Further, I recommended that we move on to hypothesis testing about a mean in the
next tutorial as we still had a considerable amount of content to cover before the end
of semester. Finally, we all implored each other to stay safe and then finished with
our usual farewells.
5.6.3 Interactions in Tutorial 6
Interactions in Tutorial 6 consisted of a mix of student-to-content, teacher-to-content,
student-to-teacher, teacher-to-student, student-to-group, and teacher-to-group.
During the first few minutes, as students arrived in the Wimba Classroom, social
interaction using text chat was informally initiated amongst participants.
The first question in the research question segment of the tutorial was an
icebreaker which prompted discussion about the impact of the floods. This resulted
in student-to-group and teacher-to-group interaction with the occasional teacher-to-
student and student-to-student social conversation using both text chat and
microphone (Figure 5.14). This continued throughout this segment of the tutorial as
we waited for participants to lodge responses to the four research questions. In
particular, teacher-to-student and student-to-teacher interaction using the microphone
involved an exchange between S1 and me, referring to his unexpected attendance at
the tutorial considering that he was in transit to his new location. When he revealed
his current location, S6 saw an opportunity to arrange a face-to-face study session
with him as he was staying in her town until floodwaters subsided. Student-to-
student interaction using text chat ensued, swapping contact information and
arranging to meet.
Some group-to-teacher interaction was afforded indirectly by the technology.
Functional icons (Ticks/Crosses) were used to gauge agreement with the choice of
topic to be discussed (Table 5.6). The topic agreed upon was hypothesis testing, in
particular the sign test. While only one question was completed during the tutorial,
the topic was sufficiently complex to promote considerable discussion. This
complexity required the teacher-to-group interaction to be a mixture of facilitation
and direct instruction. Facilitation questioning elicited responses from individual
participants, using both text chat and microphone. When opting to use the
microphone, participants generally used the Hand-up icon to which I responded in
the designated order. Again, while responses from participants appeared to be
student-to-teacher interaction, it was in effect student-to-group interaction. Due to
lack of visual cues, emoticons (for example, smiling or clapping) were used by
participants during this time to break down social barriers.
Even though most participants had covered the content of this tutorial in their
private study, gaps in their understanding of key concepts soon came to light. This
required more direct instruction than had previously been required in the earlier
tutorials. During the periods of direct instruction, participants did not just sit and
listen but contributed to explanations (student-to-group interaction). Some also
contributed statements in the form of questions (student-to-teacher interaction) to
indicate they were unsure and wanted to confirm their understanding, such as when
S4 asked in text chat, "So, is it fair to say, alt hyp = will improve the quality; null
hyp = did not improve or stayed the same?" Sometimes the direct instruction
required a lengthy explanation of a concept (teacher-to-group interaction). During
such instruction, participants posted questions using text chat (student-to-group
interaction), thus not interrupting the flow of my explanation on the microphone.
This enabled participants to immediately lodge their concerns or add to the
explanation before their thoughts could be forgotten.
Teacher-to-student interaction, that is responding specifically to individual
questions, led on to teacher-to-group interaction by facilitating the next step in the
solution to a problem or explaining a misunderstood concept in order to move the
discussion forward. A diagram was drawn on the whiteboard to assist in an
explanation (Figure 5.8). Use of keywords was discussed and students were
requested to interact with the content on the whiteboard by identifying the keywords
in the question. Further teacher-to-student interaction was required to overcome
some technical issues experienced by some participants during this tutorial.
Student-to-content interaction was visible in their contributions on the
whiteboard although it was not possible to distinguish who was contributing (Table
5.5). In addition, this type of interaction was visible as responses to questions using
text chat. However, it was also evident, yet not visible, in the calculations that the
participants needed to do on paper and in looking up tables. Student-to-group
interactions not only included their questions and responses to questions, but also
their use of the Tick/Cross icon to register their agreement with a statement made or
to confirm understanding of a concept explained. One student-to-teacher interaction
was a confirmation to the teacher that the explanation was valued when S2 used text
chat to say "I was wondering about that."
While most student-to-student interaction was social in nature, there were a
couple of exchanges about content using text chat, in particular S2 with S4 and S6
with S4. For example, S2 answered a couple of questions posed by S4, while I was
using the microphone to explain another aspect of the problem to S6:
S4: Is that because we are giving it a 50/50 chance therefore need to look
at the 0.5 column, S2?
S2: Yeah I think so
S4: Add 9 and 10 together then as you said because 9 or 10?
S2: Yep
S4: Ta
S2: Welcome
Again, while this was a student-to-student exchange, all participants benefitted from
the comments because of the online classroom environment.
5.6.4 Use of Tutorial 6 archive
Four participants (S5, S7, S8 and S9) accessed the archive, with S7 (the only one of
these who had attended the tutorial) reviewing it twice, once prior to the seventh
tutorial and then again just prior to the examination. Two participants (S5 and S8)
accessed the archive just prior to the examination, whereas S9 accessed it for only a
few minutes just prior to the seventh tutorial.
5.7 Tutorial 7
The seventh tutorial, Tutorial 7, was conducted in Week 10 of the semester. This
coincided with the massive clean-up required after major flooding in the capital city
and nearby coastal regions where a number of the study participants lived and
worked. Flooding had been widespread in the state over that summer and all
participants in the study were still being affected in some way. This explained why
the predominant topic of social conversation at the beginning of the tutorial
continued to be about the weather, the floods and how people were coping. This
tutorial lasted approximately 80 minutes.
5.7.1 Who attended Tutorial 7
Four participants attended the seventh tutorial (S5, S6, S7 and S8). The low
attendance was not surprising given that everyone was still experiencing the
aftermath of the flood and accompanying power blackouts. In particular, S1, S2 and
S3 emailed their apologies on the day following the tutorial, explaining that they had
been unable to attend due to power blackouts and/or Internet connectivity issues.
Three participants (S5, S7 and S8) were already in the Wimba Classroom
when I typed onto the whiteboard "Welcome to Data Analysis, I will be with you at
7". During this waiting period, S7 typed into the Chat window "how did you all
survive the flood in Brisbane." When there was no immediate response, I struck up a
conversation with S7 about his experiences during the flood. Following this, S8
described her circumstances and, after a slight technical problem, S5 joined the
conversation with a description of her circumstances. The conversation turned to
how everyone was progressing with their studies during which S6 entered the room.
After checking how S6 was faring in the floods, we proceeded to the tutorial proper.
5.7.2 Format and content of Tutorial 7
The tutorial started as usual with the research questions. This was followed by
discussion on how to use the decision flow chart on hypothesis testing which I had
emailed to the participants prior to the tutorial. We also discussed the formula sheet
that would be provided in the examination. From there, we proceeded to work
through ten multiple-choice questions on hypothesis testing and confidence intervals
for proportion. It was the intention to do this quickly and move on to hypothesis
testing and confidence intervals for means, but again we had insufficient time to
discuss this content in this tutorial.
5.7.2.1 Reflection on Tutorial 6
At this stage of the semester, I decided to forego the icebreaker question and progress
immediately through the key research questions: attitude to the level of the content
covered in the previous tutorial, the open-ended question identifying the most
important thing gained from the previous week's tutorial, how they were feeling
about the course at this point in time, and which topics they found most challenging
so far (Appendix B).
Only two of the four present had also attended the previous tutorial (Tutorial
6), so the first two questions were only answered by these two participants. While
one thought that the content was "just right" (S7), the other thought that it was
"thought provoking" but "a bit confusing" (S6). Responses to the question about the
most important understanding gained from the previous week's tutorial included
comments such as "success and fail, Type 1 errors and Type 2 errors" (S7), and "it
helped me sort the logic behind the sign test, but I'm still confused about some
things" (S6). On feelings about the course, three of the four participants said that
they were "enjoying it" (S5, S7 and S8). However, one of these added that she was
"still worried" and "struggling" but was "getting on top of it" (S5). Another also
added that she was "still worried" and "struggling" (S8). The fourth participant
indicated that she was "struggling" but "getting on top of it" (S6). As for the most
challenging topics, three of the four participants chose "hypothesis testing and
confidence intervals" (S6, S7 and S8), with two adding "sampling distributions" (S6
and S8), and the third adding "binomial models and probability" (S7). The fourth
person (S5) chose "experiments, observational studies and sampling" along with
"sampling distributions." These choices could not be related to tutorials missed as
most of the participants had been in attendance at the tutorials where their choices
were discussed. The only exception to this was when S8 chose hypothesis testing and
confidence intervals (Tutorial 6). Because I did not display summaries of the
responses to the questions, this segment was completed in less than seven minutes.
5.7.2.2 Tutorial 7 content
Now that we had discussed aspects of hypothesis testing, it was an opportune time to
discuss the decision flow chart on this topic and the related topic of confidence
intervals. This allowed us to talk more about the ―bigger picture‖ of inferential
statistics to try to alleviate confusion. Students of this course were expected to do
simple hypothesis tests and confidence intervals by hand and by interpreting SPSS
output. To assist with this, students were provided with a formula sheet in the
examination. In this tutorial, we discussed this formula sheet and particular formulae
related to hypothesis testing, confidence intervals and sample size. We examined
each of the formulae in turn and I suggested key points to annotate on the sheet. This
discussion lasted for about 20 minutes.
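As a rough sketch of how the formulae being annotated fit together for a proportion (the numbers and function names below are my own illustrations, not the course's examples): the confidence interval is the sample proportion plus or minus a margin of error, and the sample-size formula inverts that margin of error.

```python
from math import sqrt, ceil

def proportion_ci(successes, n, z=1.96):
    """95% confidence interval for a population proportion (normal approximation)."""
    p_hat = successes / n
    se = sqrt(p_hat * (1 - p_hat) / n)  # standard error
    moe = z * se                        # margin of error
    return p_hat - moe, p_hat + moe

def sample_size(moe, z=1.96, p=0.5):
    """Smallest n achieving the requested margin of error (p = 0.5 is most conservative)."""
    return ceil((z / moe) ** 2 * p * (1 - p))

lo, hi = proportion_ci(120, 400)  # hypothetical: 120 successes in a sample of 400
print(round(lo, 3), round(hi, 3))  # 0.255 0.345
print(sample_size(0.03))           # 1068 -> n needed for a 3% margin of error
```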
Multiple-choice questions on hypothesis testing for proportion had been
emailed to participants immediately following the previous tutorial (Tutorial 6). I
expected that we would complete discussion of these questions fairly quickly as
participants should have had an opportunity to attempt them in their own time.
Despite this, these ten questions required approximately 45 minutes to complete.
Terminology and statistical language appeared to create a barrier to understanding.
The questions covered key concepts including population of interest, sample
statistics, test statistics, critical values, standard errors, level of confidence, margin of
error, p-values, sample size and test assumptions. In the process of discussing this
content, we practised useful problem-solving strategies such as underlining
keywords, summarising the given information, drawing a diagram, checking the
formula sheet, and eliminating obviously incorrect options.
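Several of the quantities these questions probe (sample statistic, standard error, test statistic, p-value) can be traced in one short calculation. This is a sketch under the usual normal approximation, with hypothetical numbers rather than anything from the tutorial:

```python
from math import erf, sqrt

def one_proportion_z_test(successes, n, p0):
    """z test statistic and two-sided p-value for H0: p = p0 (normal approximation)."""
    p_hat = successes / n               # sample statistic
    se = sqrt(p0 * (1 - p0) / n)        # standard error under H0
    z = (p_hat - p0) / se               # test statistic
    p_value = 1 - erf(abs(z) / sqrt(2)) # equals 2 * P(Z > |z|)
    return z, p_value

# Hypothetical: 260 successes in a sample of 400, testing H0: p = 0.6.
z, p = one_proportion_z_test(260, 400, 0.6)
print(round(z, 2))  # 2.04
print(p < 0.05)     # True -> reject H0 at the 5% level
```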
Since we had not covered as much as I had planned, I suggested that we have
a longer than usual tutorial in the following week, so that we could look at
hypothesis testing for a mean and the chi-square test. This was greeted by "Yes
please" (S6) and "I'm good for a longer sess [session]!" (S5). I encouraged
participants to email or phone if they wanted to discuss anything from the tutorial. As
there were no further questions, we finished with our usual farewells.
5.7.3 Interactions in Tutorial 7
Interactions in Tutorial 7 again involved a mix of students, content and
myself (as teacher). Social interaction at the beginning of the tutorial was initially
instigated in text chat but quickly progressed to using the microphone, as there were
only two participants (S7 and S8) present at the time. In the first instance, there was
teacher-to-student interaction as I asked each participant how they fared in the
floods. This was followed in turn by student-to-teacher interaction as each responded
by describing their situation. This continued when the third and fourth participants
(S5, returning to the classroom after experiencing technical issues; and S6) joined the
group.
During the routine research questions, interaction was kept to a minimum to
reduce the time devoted to this segment. However, there were some teacher-to-group
comments to encourage quick responses to the questions. In addition, there were
some comments from me forewarning of possible background noises during the
tutorial due to a birthday party in progress in my household at the time. S5 quickly
responded "And you are doing a tute? Where's our cake??" This helped to lighten
the atmosphere created by the earlier conversation about the seriousness of the
floods.
Content for this tutorial consisted of discussion of the decision flow chart for
inference, formula sheet and ten multiple-choice questions on hypothesis testing and
confidence intervals for proportion. Participants had been sent a copy of the flow
chart and multiple-choice questions prior to the tutorial. Most of the interaction when
discussing how to use the flow chart was teacher-to-group with the occasional
student-to-teacher comment and consequential teacher-to-student response about the
content. As teacher, I explained the information summarised by the flow chart and
how to apply it when answering questions requiring hypothesis testing and
confidence intervals. During this discussion, teacher-to-content interaction assisted
understanding by referring to and annotating the flowchart displayed on the
whiteboard. Student-to-teacher interaction using the Tick/Cross icon confirmed
understanding. Similar interactions (teacher-to-group, student-to-teacher, teacher-to-
group and teacher-to-content) were observed during the discussion of the content and
function of the formula sheet. During these discussions, most of the interaction was
teacher-to-group in the form of direct instruction.
Working through the multiple-choice questions followed a similar format to
previous tutorials. Participants were encouraged to interact with the content by
underlining keywords on the whiteboard. Participant involvement in underlining of
keywords increased as the tutorial progressed. After teacher-to-group elaboration of
the context related to the keywords in the question, participants were asked to
interact with the content by indicating their favoured option on the whiteboard. The
teacher-to-group interaction included a mixture of both facilitation questions to lead
the participants through the problem-solving steps to a solution and direct instruction
exposition to clarify some of the concepts. Both of these were enhanced by teacher-
to-content interaction in the form of writing on the virtual whiteboard (Figure 5.9).
Figure 5.9. Formulae and calculations
This writing of formulae and calculations visually supported the verbal explanations.
A number of the multiple-choice questions were numerical in nature and required
participants to do calculations with the aid of a calculator and look up statistical tables
in a textbook. This instigated student-to-content interaction on paper for each
individual participant in their own time. Not all participants shared the results of their
calculations. Two participants, S5 and S6, contributed the majority of the student-to-
group discussion using text chat, although S7 did contribute using both the
microphone and text chat on a couple of occasions.
Overall, there were very few student contributions using the microphone,
even though this would have been more manageable compared with other tutorials,
given the small number of participants attending. No direct student-to-student
interaction was observed. While social interaction occurred during the welcome and
farewell part of the tutorial, no social interaction occurred during the content part of
the tutorial.
5.7.4 Use of Tutorial 7 archive
With so many absent from this tutorial, and given the content covered, it was
not surprising that five participants accessed the archive. Three who had
missed the tutorial (S2, S3 and S4) accessed the archive during the week following
the tutorial to be prepared for Tutorial 8. The remaining two (S5 and S7) had been
present at the tutorial and accessed the archive during the revision period prior to the
examination (S5 doing so twice). When asked during the tutorial if they had been
using the archives, S5 and S7 confirmed that they did, with S5 further commenting in
text chat "love it! It's great for revision." One participant (S3) in an unsolicited
personal email commented "I have to say that I really love the archive tutorials (as
well as the tutorial) because if during class the group move on while I'm mulling
something over, I can go and replay, pause, replay, pause the sections of the tutorial I
was only half listening to."
5.8 Tutorial 8
The eighth and final tutorial on new content was conducted in Week 11, the last
week of the semester. The two-week examination period followed and the Data
Analysis examination was timetabled to be held on the Wednesday of the second
week. I opened the Wimba Classroom early with a message on the whiteboard: "I
will be with you in a few minutes." We had decided beforehand that this tutorial was
likely to be extended as we needed to cover a considerable amount of content on
inferential statistics. As a result, this tutorial lasted approximately 125 minutes.
5.8.1 Who attended Tutorial 8
Eight participants attended the eighth tutorial (S1, S2, S3, S4, S6, S7, S8 and S9). S5
sent an apology. She had to finish an assignment for another course and wanted to
devote her time to that task. Four participants (S2, S6, S7 and S8) were already in the
Wimba Classroom when I entered. While we waited for the others to arrive, we did a
sound check and I took questions from those present. In the meantime, three more
participants (S1, S3 and S4) entered the room in quick succession. The final
participant (S9) was still having technical issues and finally joined the group 20
minutes later. Another two participants (S1 and S8) also had technical issues with
seeing the content on the whiteboard during the course of the evening, with S8
deciding to leave at about 40 minutes into the session and S1 leaving after 90
minutes. S9 decided to leave early at 80 minutes into the session, but returned
approximately 20 minutes later and remained for the final 20 minutes of the session.
5.8.2 Format and content of Tutorial 8
Once everyone was settled and initial greetings were completed, I posed the four key
research questions and finished this segment with a light-hearted question to relieve
the tension. From there we worked through a chi-square test using a combination of
calculations, multiple-choice questions and SPSS output. Following this, we
discussed six questions on confidence intervals and hypothesis testing with means.
These questions included a mix of multiple-choice questions, calculations using a
formula and calculator, and interpretation of SPSS output. Both sets of questions had
been emailed to participants prior to the tutorial.
5.8.2.1 Reflection on Tutorial 7
Similar to the previous week, the four key research questions were asked: attitude to
the level of the content covered in the previous week, the open-ended question
identifying the most important thing gained from the previous week's tutorial, how
they were feeling about the course at this point in time and which topics they found
most challenging so far (Appendix B). Because only four participants had attended
the last tutorial I invited those who had watched the archive of Tutorial 7 to
contribute answers to the research questions as well.
On the subject of content from the previous week, only three of the six who
answered chose "just right" (S1, S3 and S7). One of these added that it was also
"thought provoking" (S7). The other three participants selected "thought provoking"
(S4, S6 and S8), with two of these adding that it was "a bit confusing" (S6 and S8).
Both of the participants who commented that it was "a bit confusing" had attended
the previous tutorial but had not reviewed the archive before this tutorial. Responses
to the question about the most important thing gained from last week's tutorial
included comments such as "persist, persist, persist! A few things were clarified.
Good for reinforcement." (S6), and "everything!" (S4). Others commented on
specific content such as "hypothesis and proportion" (S7), "rules for hypothesis
testing" (S3), and "differences between means and proportion stats tests, t tables, all
of it" (S4). On feelings about the course, six of the seven participants said that they
were "enjoying it" (S1, S2, S3, S4, S7 and S8). However, four of these added that
they were "struggling" (S1, S3, S4 and S8). The other participant (S6) indicated that
she was "struggling" and "still worried" but "getting on top of it." In all, five out of
the seven felt that they were "getting on top of it" (S3, S4, S6, S7 and S8). S9 did not
contribute to these questions as she arrived late.
With reference to the most challenging topics, six of the seven participants
chose "hypothesis testing and confidence intervals" (S1, S3, S4, S6, S7 and S8), with
two adding "binomial models and probability" (S4 and S7), and one adding
"regression and correlation" (S3). The other participant (S2) had found that "nothing
has been particularly challenging so far," but if he had to pick any it would be
"sampling distributions."
Instead of having the icebreaker at the beginning, I chose to finish with a
light-hearted moment by asking: "What are you going to do when the exam is over?"
Five out of the seven chose "feel a sense of achievement" (S1, S2, S3, S4 and S7),
whereas the other two chose "take a holiday" and "get some sleep" (S6 and S8).
Because I did not display summaries of the responses to the questions, this segment
was completed in a little over five minutes.
5.8.2.2 Tutorial 8 content
Even though the chi-square test was the last topic in the course, I decided to work
through it first in case we did not have sufficient time to complete everything on the
agenda for this tutorial. I referred back to Tutorial 2 where we had first talked about
contingency tables and quickly summarised the important points so far. From there,
we worked on a chi-square scenario, initially with multiple-choice questions on
calculating expected counts and degrees of freedom. We proceeded to calculate all of
the expected counts for the table, with all participants contributing. We discussed the
hypothesis statements, calculated the chi-square statistic, degrees of freedom,
p-value and conclusion in context. Finally, we discussed the SPSS output of
conditional distributions and the chi-square test, relating this back to the calculations
we had completed earlier. We also discussed interpretation of the standardised
residuals. Again there was a lot of statistical terminology to absorb. This chi-square
scenario and all of its related analysis took approximately 50 minutes to complete.
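The chain of calculations described (expected counts from row and column totals, degrees of freedom, the chi-square statistic) can be sketched as follows; the 2x2 table of counts is hypothetical, not the tutorial's data:

```python
def chi_square_test(observed):
    """Chi-square statistic, degrees of freedom and expected counts for a contingency table."""
    row_totals = [sum(row) for row in observed]
    col_totals = [sum(col) for col in zip(*observed)]
    total = sum(row_totals)
    # Expected count = (row total * column total) / grand total.
    expected = [[rt * ct / total for ct in col_totals] for rt in row_totals]
    stat = sum((o - e) ** 2 / e
               for obs_row, exp_row in zip(observed, expected)
               for o, e in zip(obs_row, exp_row))
    df = (len(row_totals) - 1) * (len(col_totals) - 1)
    return stat, df, expected

stat, df, expected = chi_square_test([[30, 20], [10, 40]])
print(round(stat, 2), df)  # 16.67 1
# Compare with the critical value 3.841 (chi-square table, df = 1, alpha = 0.05):
print(stat > 3.841)        # True -> reject H0 of no association
```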
Hypothesis testing and confidence intervals for means were discussed using
six multiple-choice questions followed by working through a hypothesis test using a
single sample test of a mean and then a two related samples t-test which included
interpreting SPSS output. The multiple-choice questions tested concepts such as
standard error, margin of error, critical value, confidence interval, test statistic and p-
value related to hypothesis testing, and confidence intervals for a mean, all of which
had been covered previously, but in a slightly different context using a proportion
instead of a mean. From there we discussed, in more detail, hypothesis testing for a
mean using a single sample, reinforcing the four steps to a hypothesis test. Included
in this was some discussion on using the t-table. In the final question we identified a
scenario as being a related-samples context (Figure 5.10), and then proceeded to do
the corresponding hypothesis test by interpreting SPSS output. Throughout
discussion of this content, we again applied problem-solving techniques such as
underlining keywords, summarising the given information, drawing a diagram and
referring to the formula sheet. This segment took another 50 minutes.
Figure 5.10. Looking for keywords
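A related-samples t-test works on the differences between paired observations. The sketch below uses hypothetical before/after scores (the tutorial instead interpreted SPSS output); the quoted critical value 2.571 is the standard t-table entry for df = 5 at the 5% two-tailed level:

```python
from math import sqrt

def paired_t_statistic(before, after):
    """t statistic and degrees of freedom for a related-samples (paired) t-test."""
    diffs = [a - b for b, a in zip(before, after)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)  # sample variance of differences
    t = mean_d / sqrt(var_d / n)  # mean difference over its standard error
    return t, n - 1

# Hypothetical before/after scores for six subjects (not the tutorial's data).
before = [12, 15, 11, 14, 13, 16]
after = [14, 18, 12, 15, 17, 18]
t, df = paired_t_statistic(before, after)
print(round(t, 2), df)  # 4.54 5
# |t| exceeds the critical value 2.571 (t-table, df = 5, alpha = 0.05, two-tailed),
# so the mean difference is statistically significant.
```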
Having finished discussion of the new content I asked for an indication of
who wanted a revision tutorial using the Tick/Cross icon. With everyone present
indicating a tick, I agreed to organise a tutorial for the following week at the usual
time, but asked that people email me any special requests of content that they wanted
revised. With such a long session behind us, farewells were very brief, finishing with
my concluding comment "give ourselves all a pat on the back for hanging in there so
long."
5.8.3 Interactions in Tutorial 8
While this tutorial included the expected interactions, there was a higher proportion
of teacher-to-group interaction which could be classified as direct instruction. With
examinations looming, there was initially minimal social interaction beyond teacher-
to-student acknowledgement of individual participants being present. However, some
technical and organisational issues needed to be addressed before progressing to
discussion of content. A sound check (teacher-to-group interaction) was facilitated
using the Tick/Cross icon. Organisational issues related to finalising the research
project interviews required teacher-to-student interactions followed by student-to-
teacher responses. Resolving technical issues required teacher-to-student and
student-to-teacher interaction using both microphone and text chat. General content
questions were addressed briefly in specific student-to-teacher interactions with
teacher-to-group responses using the microphone. While this was happening, a social
conversation between S2 and S6 unfolded in the Chat window.
Once the tutorial moved on to the research questions, some teacher-to-student
and student-to-teacher interaction on organisational issues persisted. With these
resolved, conversation subsided while participants concentrated on answering the
research questions as quickly as possible. To lighten the atmosphere, the last research
question was not a serious one. This prompted a little social interaction (student-to-
teacher, student-to-group and teacher-to-group) before proceeding to the course
content.
The discussion of course content was in two parts: a chi-square question and
some questions on hypothesis testing and confidence intervals for means. Initial
discussion of the chi-square question included teacher-to-group interaction in which I
referred back to content from an earlier tutorial and posed leading questions to
facilitate student contributions. Student-to-group interaction occurred mainly in text
chat. Using text chat allowed more than one student to respond to any one question.
Though there were eight participants present, most of the contributions came from
S1, S2, S4, S6 and S9 (Table 5.8). Student-to-content interaction was visible on the
whiteboard when participants entered missing values in tables of observed and
expected counts and answered multiple-choice questions. It was also assumed to be
present (but was not visible) when they drew diagrams of the chi-square distribution,
looked up statistical tables and did calculations on paper. Teacher-to-content
interaction was used to visually support explanations and demonstrate calculations
using writing on the whiteboard (Figure 5.10).
While participants were encouraged to use the microphone, they seemed to
prefer text chat as the primary means of communication (Table 5.9). If a participant
used the microphone in this part of the discussion, the Hand-up icon was also
generally used. Teacher-to-group interaction was a mix of facilitation and direct
instruction. Student-to-teacher interaction using the Tick/Cross icon to confirm
understanding of the content was regularly employed during direct instruction
throughout the tutorial (Table 5.9). Because participants were still assimilating unfamiliar content, and there were no visual cues from which to gauge understanding quickly, the icon allowed individuals to respond student-to-teacher.
Occasionally there were student-to-group interactions with comments much like what a person might say to the person next to them in a face-to-face tutorial, for example, when S2 realised his mistake in an answer: "bugger – of course." Some student-to-teacher interactions were in the form of a comment finishing with a question mark, that is, offering a comment but indicating that the student was unsure. One example of this occurred when S1 posted in text chat "it would be an equal value amongst the choices?" indicating that his answer was tentative. On occasion, S4 would restate a concept in her own words (using the microphone) to confirm that she had understood the teacher's explanation.
When changing to a new topic, teacher-to-group interaction included some
general organisational points and words of encouragement. Participants were now
practising problem-solving processes that I had modelled in previous tutorials. As
each new question on hypothesis testing was displayed on the screen, participants
immediately interacted with the content by underlining keywords. Participants also
interacted with the content by doing calculations associated with an analysis and
offering answers using text chat, often in quick succession. I responded (teacher-to-
group interaction) with demonstrations of calculations on the whiteboard. Although
teacher-to-group interaction predominated, there were occasional teacher-to-student
responses acknowledging a student by name for their contribution. In the student-to-
group interaction, two participants, S2 and S9, were quick to respond with answers.
However, use of text chat again meant that more than one participant was able to offer an answer to a question and check their answer against others' contributions.
One notable student-to-student interaction occurred in the Chat window when S2
assisted S9 with an answer to her content question.
The length of the tutorial was taking its toll on everyone's ability to concentrate, so a few student-to-group and teacher-to-group social comments helped to break the
tension throughout the tutorial (Figure 5.15). Even though participants were
operating as individuals, comments and answers using text chat produced an
environment of collaboration and cooperation. This was evidenced by participants
being prepared to contribute to the discussion as we worked through the solutions
step-by-step.
5.8.4 Use of Tutorial 8 archive
As was expected, since she was the only participant who missed the tutorial, S5
accessed the archive. In fact, she accessed it twice in the revision period leading up to the examination. Only one other participant (S7) accessed the archive during this period.
5.9 Revision Tutorial
Since the Data Analysis examination was timetabled in the second week of the
examination period, I offered to hold the revision tutorial in the first week.
Participants had been asked to submit requests for topics to be covered. Because this was a revision tutorial in which we would discuss an example examination paper, I anticipated that it would run longer than usual. I was prepared to stay online as long as necessary, based on participant needs. This tutorial lasted approximately 150 minutes.
5.9.1 Who attended the revision tutorial
All nine active research study participants attended the revision tutorial (S1, S2, S3,
S4, S5, S6, S7, S8 and S9). It should be noted that some participants had other
examinations to prepare for during this period. Four participants (S1, S7, S8 and S9)
had entered the room early. I noticed that although S9 dropped out she was able to
re-enter successfully. I greeted everyone in the Chat window and reassured them that "I will be with you in a minute" as I finished final preparations. Those present greeted
the group, after which we conducted a sound check. In the meantime, S3 entered the
room. While we waited for others to arrive, I took general questions from the group
and organised a time frame for conducting final interviews for the study. After S4
arrived, conversation turned to the weather and the impending cyclone hovering off
the coast. Soon after starting the research poll questions, S5 entered. The final two
participants were quite late joining the group; S6 arrived thirty minutes into the
tutorial when we were discussing the third multiple-choice content question and S1
arrived fifty-five minutes into the tutorial when we were discussing the twelfth
multiple-choice question, apologising that he had forgotten the time while bathing
the children.
5.9.2 Format and content of the revision tutorial
With everyone settled, I posed five research questions, three of which had been asked
regularly throughout the semester. From there, we worked through an example
examination paper consisting of multiple-choice questions and written responses to
questions. A copy of the example examination had been emailed to participants prior
to the tutorial.
5.9.2.1 Reflection on Tutorial 8
Using the Poll function, I asked the three key research questions and two extra questions, one of which related back to the initial survey, while the other would be asked again in the final interview after the examination. The key research questions were
about attitude to the level of the content covered in the previous week, identifying the
most important thing gained from the previous week‘s tutorial and feelings about the
course at this point in time (Appendix B). Five of the six participants present had
attended the previous tutorial (Tutorial 8). Because I wanted to obtain a full picture
of the attitudes of participants, I asked the two (S1 and S6) who arrived quite late and
missed this segment, but were present at the previous tutorial, to email me responses
to the questions. On the topic of attitude to content from the previous week (drawn from the collated responses), three of the eight chose "just right" (S2, S3 and S9). Three participants selected "thought provoking" (S4, S7 and S8), while two chose both "just right" and "thought provoking" (S1 and S6). Responses to the question about the most important thing gained from last week's tutorial included comments such as "it's not as difficult as I thought" (S3), "learning differences between tests" (S4), and "confirming that I was on the right track with chi squares tests and resids [residuals]" (S9). Others commented on specific content such as "a bit of revision of contingency tables" (S2), "chi square test distribution and statistical value" (S7), and "very helpful as gave me insight into chi square test which I hadn't done at that stage" (S6). On feelings about the course, seven of the nine participants said that they were "enjoying it" (S1, S2, S3, S4, S7, S8 and S9). Four of these added that they were "getting on top of it" (S3, S4, S7 and S8), with two of these further admitting that they were "still struggling" (S4 and S8). Three participants indicated that they were "still worried" (S3, S5 and S6), with one adding that she was "getting on top of it" (S6).
The main reason for including the additional two research questions was to
compare the responses with those given in the initial survey and the final interviews
to check consistency. The first of the additional questions asked why they had chosen
to participate in the online tutorial. This question was from the initial survey before
the online tutorials commenced. Indicative responses from the group included "I need all the help I can get!!!" (S5, with S8 concurring), "interaction with other students and tutor as well as a guide to keeping up with the workload" (S3, with S9 concurring), "to be more informed and to learn how other students study" (S7), "beneficial to hear different discussions and perceptions of the material being taught" (S2), "to help motivate me to keep up to date" (S1), "I am so sick of playing it safe and not making the most of opportunities – nothing like risk-taking and getting in your discomfort zone" (S6), and "feel more a part of uni as an external student" (S4).
The second additional question asked participants how the online tutorial
contributed to their learning. Indicative responses included:
if I had a wrong idea about something it was more likely to be brought out
during the online tutes (S2);
it helped to reinforce my learning and helped clarify concepts (S9);
… by either introducing me to a topic … or by consolidating my knowledge
on the topics (S1);
felt more in touch (S4);
practical examples have been very helpful (S8);
good sounding board esp being external (S5);
my learning style incorporates auditory, and visual can become very tiring
(S3);
helped in understanding the language of DA [Data Analysis] by listening to
how it was used while interacting (S6); and,
very good as I can revise with the recording tute (S7).
Because I did not display summaries of the responses to the questions, this segment
was completed in 10 minutes.
5.9.2.2 Revision Tutorial content
I had asked participants to email me any requests for topics to cover in this revision
tutorial. As I had only received one request for some discussion on probability, I
decided it would be productive to work through one of the example examinations
that were made available to all students for revision purposes. This example
examination included some questions on probability. Prior to the tutorial, I emailed
everyone a copy of the example examination paper that we would discuss. The
examination was in two parts – Part A consisting of 20 multiple-choice questions for
20 marks and Part B consisting of 5 questions (with multiple parts) requiring written
answers for 30 marks. These questions covered all aspects of the course. While
working through the multiple-choice questions, participants contributed by
underlining keywords, committing to a choice and justifying their answer when
requested to do so. Participants were also given opportunities to ask for clarification of their understanding. Responding to their questions was not just left to me, as other
participants contributed explanations and examples to assist. In working through
these questions I was able to point out some of the traps in the wording of questions, highlight commonly made errors and reinforce the need to read questions carefully.
Part A of the examination paper took us approximately one hour.
The written part of the paper was a little more difficult to structure in an
online environment. As time was short, I decided that we would concentrate on identifying the keywords in the questions and discussing strategies for working out what each question required (which formulae to use and how to go about answering), leaving everyone to do the calculations later in their own time. Brief answers to these
questions were available to all students, so participants would be able to check their
working and were encouraged to contact me should they have any queries or
concerns. Even without actually doing all of the calculations, Part B of the
examination paper took us approximately one hour to complete.
With this tutorial being such a long one, it was not surprising that a couple of
participants were not able to stay for the whole time. One participant (S9) stepped
out for twenty minutes to have something to eat, but returned and stayed for the rest
of the tutorial. Another (S5) had to leave about half an hour before the end to attend
to her children, saying "I'm sorry gang but I have to scoot, a 5 year old and 15 year old argument that I have to referee, wish me luck!! ciao (and good luck with the exams!)." In the last ten minutes we discussed organisational issues related to the
examination. Participants offered words of advice on their examination strategies. I
reassured those with last minute nerves and offered to assist with any last minute
questions by email. I commented that I would wait until after the examination to
contact them for their individual interviews for the research study. Finally,
participants departed with many messages of "good luck" and "thank you."
5.9.3 Interactions in the revision tutorial
While interaction in this tutorial consisted of the usual mix, there seemed to be a
much more even distribution of these interactions across the tutorial. With no new
content being introduced, less direct instruction was required. There was noticeably more student-to-student interaction than in earlier tutorials, with participants showing their support for one another, both emotionally and intellectually, such as when S6 commented "good work S2" in response to an answer given by S2, and when S9 offered her advice on exam technique: "read the questions slowly!!!!"
After teacher-to-group and teacher-to-student greetings as each person
arrived, participants were encouraged yet again to use the microphone to interact.
Even though one participant (S2) commented "it's just easier to type," some participants used the microphone more, and not just when they needed to give longer, more complex answers. Teacher-to-group interaction was necessary to organise
processes related to the research project interviews and the final research questions.
During the research questions segment of the tutorial, teacher-to-group social
comments filled in the quiet times as participants lodged their answers. Some
student-to-student social conversation occurred between S5 and S9 in text chat.
While this conversation was specifically between these two people, it was visible to
all. Some teacher-to-group social conversation about my dog going crazy in the
background prompted some student-to-group social chat about pets in general. Talk
about the impending cyclone prompted light-hearted comments (student-to-group)
about the abundance of snakes and spiders due to the floods. During this social
conversation, participants used the microphone to respond when a comment or
question was directed to them by name.
Content discussions contained two overall components: multiple-choice
questions from Part A of the example examination and short answer questions from
Part B of the examination. Teacher-to-group interactions during Part A were very
similar to previous tutorials where multiple-choice questions had been used. These
included advice on problem-solving techniques, advice on examination techniques,
questioning to facilitate student involvement in answering the questions and use of
the Tick/Cross icon to confirm understanding. This interaction was interspersed with
teacher-to-student interaction which attempted to involve participants who had not
been very active in discussion in past tutorials. This was initiated by asking specific
participants to explain the reasoning behind their answers to the multiple-choice
questions. This prompted student-to-teacher responses using the microphone rather
than text chat. In addition, other participants supported a respondent by adding
student-to-group comments in text chat to keep the discussion flowing. To reduce
any possibility of confusion, teacher-to-group interaction facilitated the discussion with additional explanation, deepening understanding by modelling the thinking processes needed to solve a problem. Some teacher-to-group direct
instruction mingled with the facilitation to highlight important concepts in the topic
being examined.
Student-to-content interaction involved underlining of keywords, crossing out
obvious distractor options (Figure 5.11) and indicating the chosen answer on the
whiteboard. However, in contrast to previous tutorials, participants joined in these
activities with very little prompting from me. As in past tutorials, teacher-to-content
interaction included drawing diagrams and doing calculations on the whiteboard.
Figure 5.11. Eliminating distractors
With the examination fast approaching, participants were keen to stay
focussed on the content. However, occasional student-to-group social chat, sprinkled
throughout the tutorial, helped to ease the tension. This created an atmosphere of
student-to-student mutual support which was extended to the teacher as participants
posted messages such as "Ahh, got it" (S5) and "OK, I think I have it" (S9) to
acknowledge that they had understood my explanation of a key concept or thinking
process. While most student-to-group discussion was again taking place in text chat,
there was more use of the microphone in this tutorial compared with past tutorials.
As in previous tutorials, short comments were generally given in text chat and longer
comments given using the microphone. However, there was one exception when S4 posted a much longer and more involved comment than usual in text chat, possibly due to the presence of her baby in the background.
With more familiarity with the content, student-to-group interactions included
more suggestions of different ways to approach a problem and how to avoid some of
the pitfalls in the wording of questions. Where, in the past, teacher-to-group
interactions included warnings on where thinking could go awry, these were now
being offered student-to-group, such as when S4 commented "almost went 50-50 chance", which was a common error in binomial questions. Student-to-group interactions also included acknowledgments of when their thinking had been flawed, such as when S1 commented "Sorry, you guys are right." At one stage a flurry of student-to-group discussion on technical issues occurred using both the microphone and text chat when I accidentally unlocked the Talk button on my computer and could not be heard for a few minutes. This episode was quickly laughed off with a comical comment from S7, who posted "I thought that the dog might have chewed your mic [microphone]", which prompted a few "LOL" comments from other participants.
During discussion of Part B of the examination, participants were quick to
underline keywords identifying the relevant information contained in a question.
Since the questions were short answer rather than multiple-choice, participants had to
be prepared to put forward their own ideas and not be guided by presented choices.
Participants mostly interacted student-to-group using text chat by offering
suggestions as I posed questions to uncover the key concepts that needed to be
understood to solve the problem. S1, S2, S4 and S9 were the main contributors to
this discussion with occasional contributions from the others in attendance. While
student-to-content interaction included underlining keywords in a question and
highlighting relevant information in SPSS output on the whiteboard, towards the end
of the tutorial a couple of participants even wrote relevant equations on the
whiteboard (Figure 5.12). Because of time constraints, we did not do all calculations
but rather concentrated on the method to solve each problem. Teacher-to-content
interaction, as earlier, involved writing formulae and drawing diagrams on the
whiteboard to support explanations and summarise contributions made by
participants. Teacher-to-group interaction included questioning to draw out
intermediate steps in a solution, confirming student responses as correct, clarifying
deficient thinking, and modelling the thinking processes in formulating a solution.
The Tick/Cross icon was also used to confirm understanding.
Figure 5.12. Participants contribute to writing formulae
Student-to-student interaction was used to agree with another's response, generally identifying the other by name. It was also evident when S1 and S4 posted good luck messages, in text chat, to S5 when she had to leave the tutorial early. Student-to-student interaction provided support, "Good work, S1", and acknowledged an error, "Oops, yeah" (S2). These interactions replaced visual cues such as a smile of support or a nod of agreement.
The last few minutes of the tutorial were primarily teacher-to-group words of
advice on examination preparation and techniques. However, a number of
participants added their examination preparation suggestions as well. The tutorial
concluded with well wishes all-round.
5.9.4 Use of the revision tutorial archive
Three participants (S1, S5 and S7) accessed the archive of the revision tutorial in the
days prior to the examination. It was not expected that many would access this
archive as the tutorial was held just a little over a week before the examination and
everyone had attended the tutorial.
5.10 Content analysis using the Community of Inquiry framework
The Community of Inquiry (CoI) framework provided a different lens with which to
investigate and understand the interactions taking place in the online tutorial and the
affordances of the medium. The unit of analysis, defined as a "message," was determined as a complete comment by one individual on one particular point, topic
or statistical concept. This was generally one voice comment or one text message as
participants tended to keep their messages short and to the point. Each message was
numbered in order through the tutorial. While listening to the recordings of the
online tutorials and being able to refer to the transcript of text chat contributions,
messages were coded as social presence, teaching presence or cognitive presence
(Sections 2.4 and 3.2.2.3). The messages were also categorised in terms of medium
of delivery, that is, text chat, microphone (voice) or emoticon/icon, and by the participant contributing the message.
Three tutorials (n=9) were chosen for this analysis. These tutorials (Tutorial
3, Tutorial 6 and Tutorial 8) spanned the semester so as to give an indication of any
changes in the dynamics of the interactions over time.
i. Tutorial 3 (Week 6) was chosen for analysis to allow sufficient time for
participants to settle into the online environment and to feel comfortable
with me and their fellow students (Sections 5.3 and 5.10.1)
ii. Tutorial 8 (Week 11) was chosen as it was the last tutorial that introduced
new content to the group (Sections 5.8 and 5.10.3)
iii. The third tutorial needed to be one in between these two, and Tutorial 6 (Week 9) was randomly chosen to fulfil this role (Sections 5.6 and 5.10.2)
The CoI analysis identified 1424 messages across these three tutorials.
5.10.1 Tutorial 3
As previously stated, Tutorial 3, conducted in Week 6, was attended by eight
participants and lasted for 75 minutes (Section 5.3). In this tutorial 333 messages
were identified. The topic for discussion was regression and correlation.
In this tutorial, considerable teacher facilitation was required to maintain the
flow of discussion (Section 5.3.3). This is evidenced by the proportion of messages
(n=167, 50%) contributed by the teacher (Table 5.2). The participants contributed to
varying degrees with S1 and S2 contributing by far the most (n=55, 17% and n=46,
14% respectively), followed by S4 and S7 (n=20, 6% and n=18, 5% respectively). It
should be noted that S9 had ongoing technical issues and did not stay beyond the first
few minutes of the tutorial (n=3, 1%); S8 experienced technical issues with
communication but was able to listen (n=3, 1%); and S11 arrived in the closing few
minutes of the tutorial (n=4, 1%). On the other hand, S3 stayed throughout the
tutorial but chose to listen rather than speak (n=3, 1%). Since the contributions to
discussion by writing on the virtual whiteboard could not be specifically associated
with any particular participant within the Wimba Classroom but were a notable part
of the conversation, these contributions have been collectively classed as ND (not
determined). To omit these contributions would understate the interaction in the emerging community and exclude critical data on the presences, specifically cognitive presence.
The Community of Inquiry framework was used to code each message
(N=333) in the online synchronous tutorial as one of social presence, teaching
presence or cognitive presence (Table 5.2). Teaching presence, principally by the
teacher, dominates the tutorial (n=175, 53%). With reference to all messages,
teaching presence on the part of the teacher (n=152, 46%) was used to foster
cognitive presence amongst the students (n=113, 34%). On the other hand, teaching
presence attributed to students (n=23, 7%) concerned solving technical issues with the online medium and was therefore deemed teaching presence, as it related to the design and organisation of the learning experience (Section 2.4.3).
Social presence had a minor part to play in this tutorial (n=43, 13%).
Table 5.2
Tutorial 3: Number of messages by participant and presence (N=333)

Participant   Social Presence   Teaching Presence   Cognitive Presence   Frequency (% of total)
ND            0                 0                   14                   14 (4%)
S1            8                 5                   42                   55 (17%)
S2            11                7                   28                   46 (14%)
S3            1                 2                   0                    3 (1%)
S4            3                 2                   15                   20 (6%)
S7            3                 3                   12                   18 (5%)
S8            0                 1                   2                    3 (1%)
S9            0                 3                   0                    3 (1%)
S11           4                 0                   0                    4 (1%)
Teacher       13                152                 2                    167 (50%)
Total         43 (13%)          175 (53%)           115 (35%)            333 (100%)
The role of social presence was more obvious when the timeline of the
occurrence of the presences was investigated (Figure 5.13). Messages were
numbered in order of occurrence and plotted against the type of presence represented
by the message. This graph presents the changes in presence across time, indicating
when each presence predominated. As previously mentioned (Section 5.3) and
consistent with Figure 5.13, social presence occurred at the beginning when
participants greeted one another upon entering the Wimba Classroom and at the end
when they said thank you and farewell. The relatively low incidence of social
presence that occurred in between these times included a thank you to the teacher for
answering a question and explaining a concept, a word of encouragement from me
and a welcome to a latecomer. Interaction primarily involved an oscillation between
teaching presence and cognitive presence as the teacher facilitated discussion which
prompted cognitive responses from the participants.
Figure 5.13. Timeline of presences in Tutorial 3
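A timeline of this kind reduces to a mapping from message number to presence category. The sketch below is illustrative only (the coded sequence is invented, and this is not the procedure actually used to produce Figure 5.13); it shows the (message number, presence level) pairs that such a plot displays.

```python
# Invented example: presences of the first eight messages, in order.
coded = ["social", "social", "teaching", "cognitive",
         "teaching", "cognitive", "teaching", "social"]

# Assign each presence a level on the vertical axis of the timeline.
levels = {"social": 0, "teaching": 1, "cognitive": 2}

# Pair each message number with its presence level for plotting.
timeline = [(i + 1, levels[p]) for i, p in enumerate(coded)]

print(timeline[:4])  # [(1, 0), (2, 0), (3, 1), (4, 2)]
```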
As noted previously, in the online synchronous tutorial there were four principal
avenues for communication: text chat, voice using a microphone, emoticons used to
compensate for the lack of visual cues and icons to acknowledge understanding or
agreement (Tick/Cross). The degree to which each was used is indicated in Table
5.3, with use of emoticons and icons being combined as one category. Most of the
communication involved the microphone (n=182, 55%), although there was a
substantial use of text chat (n=113, 34%).
Participants who contributed most to discussion in the tutorial primarily used
text chat – S1 (n=39, 71%), S2 (n=37, 80%) and S4 (n=18, 90%) – although the
exception was S7, who mostly used the microphone (n=12, 67%), commenting in the final interview that it was "just a matter of practice" (Table 5.3). With half of the
contributions to the discussion being made by the teacher (n=167, 50%; Table 5.2)
and the fact that the teacher primarily used the microphone as her means of
communication (n=149, 89%; Table 5.3), it was not unexpected that most of the
communication occurred using the microphone (n=182, 55%; Table 5.3).
Table 5.3
Tutorial 3: Frequency (percent) of technology use by each participant (N=333)

Participant   Using text chat   Using microphone   Using emoticon/icon   Total
ND            0 (0%)            0 (0%)             14 (100%)             14 (100%)
S1            39 (71%)          12 (22%)           4 (7%)                55 (100%)
S2            37 (80%)          5 (11%)            4 (9%)                46 (100%)
S3            1 (33%)           1 (33%)            1 (33%)               3 (100%)
S4            18 (90%)          1 (5%)             1 (5%)                20 (100%)
S7            3 (17%)           12 (67%)           3 (17%)               18 (100%)
S8            0 (0%)            2 (67%)            1 (33%)               3 (100%)
S9            3 (100%)          0 (0%)             0 (0%)                3 (100%)
S11           4 (100%)          0 (0%)             0 (0%)                4 (100%)
Teacher       8 (5%)            149 (89%)          10 (6%)               167 (100%)
Total         113 (34%)         182 (55%)          38 (11%)              333 (100%)
There appears to be a relationship between presence and the type of
technology used (Table 5.4). Teaching presence was predominantly achieved by way
of the microphone (n=142, 81%), whereas social presence occurred mainly through
text chat (n=32, 74%). With cognitive presence, the distinction is not as strong, with
text chat having the higher proportion (n=62, 54%) compared with using a
microphone (n=31, 27%). As previously mentioned, the technology used tended to
be influenced by the complexity of the comment that was made as it was easier to
verbally express a long and complex comment than write it in text chat.
Table 5.4
Relationship between presence and technology used during Tutorial 3 (N=333)

                     Using text chat   Using microphone   Using emoticon/icon   Total
Social Presence      32 (74%)          9 (21%)            2 (5%)                43 (100%)
Teaching Presence    19 (11%)          142 (81%)          14 (8%)               175 (100%)
Cognitive Presence   62 (54%)          31 (27%)           22 (19%)              115 (100%)
Total                113 (34%)         182 (55%)          38 (11%)              333 (100%)
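The row percentages in Table 5.4 follow directly from the raw counts: each cell is expressed as a share of its presence-row total. The snippet below (an illustration, not part of the original analysis) recomputes them from the counts in the table.

```python
# Counts from Table 5.4: presence (rows) by technology used (columns).
counts = {
    "Social":    {"text chat": 32, "microphone": 9,   "emoticon/icon": 2},
    "Teaching":  {"text chat": 19, "microphone": 142, "emoticon/icon": 14},
    "Cognitive": {"text chat": 62, "microphone": 31,  "emoticon/icon": 22},
}

# Express each cell as a rounded percentage of its row total.
row_pcts = {
    presence: {m: round(100 * n / sum(row.values())) for m, n in row.items()}
    for presence, row in counts.items()
}

print(row_pcts["Teaching"]["microphone"])  # 81
```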
5.10.2 Tutorial 6
With six participants in attendance, Tutorial 6 was conducted in Week 9 of the
semester and lasted for approximately 75 minutes (see Section 5.6). In this tutorial
373 messages were identified. The topic for discussion was hypothesis testing using
the sign test.
Tutorial 6 had a relatively high level of teacher involvement, as evidenced by
the proportion of messages (n=151, 40%) contributed by the teacher (see Table 5.5).
However, this was not as high as in Tutorial 3 (n=167, 50%; Table 5.2). The
participants contributed to varying degrees with S6 (n=54, 15%), S4 (n=44, 12%), S1
(n=40, 11%) and S2 (n=38, 10%) contributing by far the most. This represented a
more even distribution of involvement by these four participants than previously. As
in the past, S3 and S7 were relatively quiet (n=15, 4% and n=16, 4% respectively).
As indicated earlier (Section 5.10.1), ND contributions were anonymous entries on
the virtual whiteboard.
Table 5.5
Tutorial 6: Number of messages by participant and presence (N=373)

Participant   Social Presence   Teaching Presence   Cognitive Presence   Frequency (% of total)
ND            0                 0                   15                   15 (4%)
S1            17                9                   14                   40 (11%)
S2            10                3                   25                   38 (10%)
S3            7                 2                   6                    15 (4%)
S4            13                3                   28                   44 (12%)
S6            23                6                   25                   54 (15%)
S7            9                 2                   5                    16 (4%)
Teacher       32                118                 1                    151 (40%)
Total         111 (29%)         143 (38%)           119 (33%)            373 (100%)
The Community of Inquiry framework was again used to code the
contributions to discussion in the tutorial in terms of the three presences (Table 5.5).
With reference to all messages, teaching presence principally associated with the
teacher (n=118, 32%) balanced cognitive presence amongst the students (n=118,
32%).
Social presence in this tutorial showed a marked increase (n=111, 29%; Table
5.5) compared with Tutorial 3 (n=43, 13%; Table 5.2). From Figure 5.14, it can be
seen that social presence occurred not only at the beginning and end of the tutorial
with the usual greetings and farewells but noticeably more so during the content
discussion part in the middle of the tutorial.
There was considerable social conversation at the beginning of the tutorial
(Figure 5.14) as the group had been dealing with the impact of the tragic flood event
that had occurred in the region at that time (Section 5.6). Social presence through the
content discussion part of this tutorial consisted of the usual appreciation shown to
the teacher and fellow participants for answering questions and a thank you to the
teacher for explaining a concept. It also included the occasional amusing comment
by a participant to relieve the tension of maintaining a high level of concentration,
such as when S6 commented ―[my] brain hurts but I‘m sure I‘ll be fine!‖
Figure 5.14. Timeline of presences in Tutorial 6
As well as being coded by presence, messages were also categorised by the
type of technology used (Table 5.6). Although most of the messages were delivered
using the microphone (n=179, 48%), it should be acknowledged that the majority of
these (n=145) were made by the teacher (Table 5.6).
Table 5.6
Tutorial 6: Frequency (percent) of technology use by each participant (N=373)
Student       Using       Using        Using           Total
              text chat   microphone   emoticon/icon
ND              0 (0%)      0 (0%)      15 (100%)     15 (100%)
S1             26 (65%)     6 (15%)      8 (20%)      40 (100%)
S2             24 (63%)     4 (11%)     10 (26%)      38 (100%)
S3              6 (40%)     4 (27%)      5 (33%)      15 (100%)
S4             31 (70%)     6 (14%)      7 (16%)      44 (100%)
S6             35 (65%)    10 (19%)      9 (17%)      54 (100%)
S7              5 (31%)     4 (25%)      7 (44%)      16 (100%)
Teacher         2 (1%)    145 (96%)      4 (3%)      151 (100%)
Total         129 (35%)  179 (48%)     65 (17%)     373 (100%)
All of the participants who were active in discussion preferred text chat
(Table 5.6): S1 (n=26, 65%), S2 (n=24, 63%), S4 (n=31, 70%) and S6 (n=35, 65%).
Emoticons/icons were mainly used to indicate an answer on the whiteboard or a
Tick/Cross to indicate understanding. Occasionally an emoticon such as smiling or
laughing was used to lighten the moment.
There appears to be a similar relationship between presence and the type of
technology used as was found in Tutorial 3 (Tables 5.4 and 5.7). Teaching presence
was predominantly achieved by way of the microphone (n=120, 84%), whereas social
presence occurred mainly through text chat (n=51, 46%). However, the preference for
this particular technology for social presence was not as strong as that seen in
Tutorial 3. Similar to Tutorial 3, cognitive presence was mainly manifested through
text chat (n=69, 58%) rather than the microphone (n=18, 15%). As previously
mentioned, the choice of technology tended to be influenced by the complexity of the
comment: it was easier to speak a long and complex comment than to type it in text
chat.
Table 5.7
Relationship between presence and technology used in Tutorial 6 (N=373)
Presence             Using       Using        Using           Total
                     text chat   microphone   emoticon/icon
Social Presence       51 (46%)    41 (37%)     19 (17%)     111 (100%)
Teaching Presence      9 (6%)    120 (84%)     14 (10%)     143 (100%)
Cognitive Presence    69 (58%)    18 (15%)     32 (27%)     119 (100%)
Total                129 (35%)  179 (48%)     65 (17%)     373 (100%)
5.10.3 Tutorial 8
The eighth and final tutorial on new content was conducted in Week 11 of the
semester with eight participants in attendance. It lasted for approximately 125
minutes (Section 5.8) and yielded 718 messages. The topics for discussion included
the chi-square test of independence, and hypothesis testing and confidence intervals
for means.
In this tutorial there was a higher level of teacher involvement, as evidenced
by the proportion of messages contributed by the teacher (n=346, 48%; Table 5.8):
higher than in Tutorial 6 (Table 5.5) and closer to that in Tutorial 3 (Table 5.2). The
participants contributed to varying degrees, with S2 and S4 contributing the most
(n=106, 15% and n=61, 9% respectively), closely followed by S1, S9 and S6 (n=49,
7%; n=40, 6% and n=33, 5% respectively). As mentioned before, ND denotes
anonymous writing by participants on the whiteboard, which included underlining
keywords and indicating answers to multiple-choice questions. This type of
interaction contributed to a small extent overall (n=47, 7%; Table 5.8), but was
considerably more than in previous tutorials, proportionally almost twice as much
as in Tutorial 3 (n=14, 4%; Table 5.2) and Tutorial 6 (n=15, 4%; Table 5.5). This could
be explained by the increased involvement of participants in the problem-solving
process by writing on the whiteboard during Tutorial 8.
Table 5.8
Tutorial 8: Number of messages by participant and presence (N=718)
Participant   Social      Teaching    Cognitive   Frequency
              Presence    Presence    Presence    (% of total)
ND              0           0           47          47 (7%)
S1             15           1           33          49 (7%)
S2             35           3           68         106 (15%)
S3              3           0           11          14 (2%)
S4             18           4           39          61 (9%)
S6             10           2           21          33 (5%)
S7              3           0           11          14 (2%)
S8              4           4            0           8 (1%)
S9              9           1           30          40 (6%)
Teacher        33         313            0         346 (48%)
Total         130 (18%)   328 (46%)   260 (36%)   718 (100%)
From the CoI coding of the messages into the three presences (Table 5.8), it
can be seen that teaching presence again dominated (n=328, 46%) over cognitive
presence (n=260, 36%) and social presence (n=130, 18%). Teaching presence
increased from Tutorial 6 (n=143, 38%) to Tutorial 8 (n=328, 46%) due to the higher
proportion of direct instruction necessitated by the complexity of the content covered
(Section 5.8).
From Figure 5.15, it can be seen that social presence occurred throughout this
tutorial and was not restricted to the greetings at the beginning and the farewells at
the end (n=130, 18%; Table 5.8). Even though social presence was proportionally
less frequent than in Tutorial 6 (n=111, 29%; Table 5.5), it took on a more focussed
form of mutual support and encouragement amongst participants.
Figure 5.15. Timeline of presences in Tutorial 8
As well as being identified by the presence represented, messages were also
categorised by the type of technology used (Table 5.9). Although more than half of
the messages were delivered using the microphone (n=381, 53%), it should be
acknowledged that nearly half of all contributions (n=346, 48%) were made by the
teacher and that nearly all of the teacher's contributions (n=344, 99%) were made
using the microphone.
Participants who were active in discussion preferred text chat (Table 5.9): S1
(n=29, 59%), S2 (n=76, 72%), S4 (n=37, 61%), S6 (n=23, 70%), and S9 (n=36,
90%). Emoticons/icons were mainly used to indicate a keyword or an answer on the
whiteboard (47 out of 133) or a Tick/Cross to indicate understanding.
There appears to be a similar relationship between presence and the type of
technology used as was found in Tutorials 3 and 6 (Tables 5.4, 5.7 and 5.10). As
shown in Table 5.10, teaching presence was predominantly achieved by way of the
microphone (n=316, 96%), whereas social presence occurred mainly through text
chat (n=59, 45%). However, similar to Tutorial 6, the preference for a particular
technology was not as strong for social presence as that seen in Tutorial 3.
Table 5.9
Tutorial 8: Frequency (percent) of technology use by each participant (N=718)
Student       Using       Using        Using           Total
              text chat   microphone   emoticon/icon
ND              0 (0%)      0 (0%)      47 (100%)     47 (100%)
S1             29 (59%)    10 (20%)     10 (20%)      49 (100%)
S2             76 (72%)    10 (9%)      20 (19%)     106 (100%)
S3              0 (0%)      1 (7%)      13 (93%)      14 (100%)
S4             37 (61%)     6 (10%)     18 (30%)      61 (100%)
S6             23 (70%)     3 (9%)       7 (21%)      33 (100%)
S7              1 (7%)      0 (0%)      13 (93%)      14 (100%)
S8              0 (0%)      5 (63%)      3 (38%)       8 (100%)
S9             36 (90%)     2 (5%)       2 (5%)       40 (100%)
Teacher         2 (1%)    344 (99%)      0 (0%)      346 (100%)
Total         204 (28%)  381 (53%)    133 (19%)     718 (100%)
Similar to Tutorials 3 and 6, cognitive presence was mainly evident in text
chat (n=134, 52%) rather than via the microphone (n=21, 8%; Tables 5.4, 5.7 and
5.10). As previously mentioned, the complexity of the comment seemed to drive
which technology was preferred.
Table 5.10
Relationship between presence and technology used in Tutorial 8 (N=718)
Presence             Using       Using        Using           Total
                     text chat   microphone   emoticon/icon
Social Presence       59 (45%)    44 (34%)     27 (21%)     130 (100%)
Teaching Presence     11 (3%)    316 (96%)      1 (<1%)     328 (100%)
Cognitive Presence   134 (52%)    21 (8%)     105 (40%)     260 (100%)
Total                204 (28%)  381 (53%)    133 (19%)     718 (100%)
5.10.4 Summary of CoI analysis
Analysing the recordings of the online tutorials using the Community of Inquiry
framework uncovered not only more about the nature of the interaction but also when
each type of interaction occurred in each tutorial. As an aside to this analysis, the
affordances of the technology used were examined. The contribution to each tutorial
by each participant was also revealed.
Of all the presences, teaching presence was the most prevalent in all three
tutorials, followed by cognitive presence and then social presence (compare Tables
5.2, 5.5 and 5.8). Tutorials 3 and 8 had similar distributions of presence, with
approximately 15% social presence, 50% teaching presence and 35% cognitive
presence. A more even distribution of the three presences was noted in Tutorial 6.
While cognitive presence stayed consistent across all three tutorials at about 35%,
social presence was markedly higher in Tutorial 6. The difference in the distribution
of presences in Tutorial 6 compared with the other two tutorials could be explained
by the exceptional circumstances at the time that this tutorial was conducted. As
previously explained (Section 5.6), the timing of Tutorial 6 coincided with an
unprecedented flood event that all participants experienced to some degree. This
generated more involvement by the participants, in particular more social
engagement.
When comparing the timelines of presence (Figures 5.13, 5.14 and 5.15) the
distribution of social presence across a tutorial is most notable. Across the semester
there appears to be a gradual shift in this distribution from the incidence of social
presence mainly appearing at the beginning and end of the tutorial in Tutorial 3 to a
more even distribution across the whole tutorial in Tutorial 8. This could be partly
explained by the increase in a sense of community that developed across the
semester.
Which technology was used depended upon the participant and the purpose.
Teaching presence was mainly afforded by using the microphone, whereas social
presence and cognitive presence were primarily afforded by text chat.
5.11 Summary
This chapter explored issues related to the types of interaction taking place in the
online synchronous tutorial to support the learning of statistical concepts, and
investigated the nature of the dialogue in this environment for this discipline context,
thus addressing the first two aims of this research study (Section 1.2). Thick
description (Sections 5.1 – 5.9), supported by analysis using the Community of
Inquiry framework (Section 5.10), was used to gain a greater understanding of the
role of synchronous technology in supporting student learning in a quantitative
discipline context. The researcher, as participant-observer, was in a unique position
to maximise this understanding by being "inside" the online tutorial. Recordings of
the tutorials were used to uncover the nature of the interaction amongst the
participants, between the participants and the teacher and between the participants
and the content discussed. Exploring the nature of these interactions further through
the three presences of the CoI framework added to the understanding of the part that
this online environment has to play in supporting distance students when learning
quantitative content.
The outcomes from these analyses are further enhanced by information
gleaned from the responses to the initial survey and in the final interviews (see
Chapter 4). Narratives describing the perceptions of and contributions to the online
synchronous tutorials of four participants who represent the diversity within the
group are presented in Chapter 6.
Chapter 6: Narratives
Four narratives have been collated to provide insights into individuals' expectations,
perceptions and experiences of the online synchronous tutorial. Data from the initial
survey (Section 4.1), the weekly research questions and interaction in the tutorials
(Sections 5.1 – 5.9), and the final interviews (Section 4.2) have been used to inform
the stories that follow.
From the nine participants who were active in the research study, four
participants, S1, S4, S6 and S7, were selected for these narratives as they represent
the diversity of the research group on a number of key factors: gender (two males
and two females), age (ranging from 31 years to 67 years), discipline (psychology
and non-psychology, psychology being the largest single cohort of students in the
Data Analysis course), location (two in the capital city and two in regional centres)
and circumstance (a variety of other commitments, prior online experiences and
experience of statistics). These details are summarised in Table 6.1.
Table 6.1
Details of narrative participants
Participant     Gender   Age      Discipline    Location         Circumstance
Code  Alias     (M/F)    (~yrs)
S4    Sophie      F       ~31     Psychology    Coastal city     At-home mother
S7    Harry       M       ~67     Biology       Capital city     Retiree
S1    Daniel      M       ~34     Mathematics   Capital city,    Full-time teacher
                                                relocated to
                                                country town
S6    Jess        F       ~48     Psychology    Coastal town     Teacher on leave
For ease of reading and to personalise their stories, the following aliases were
chosen for the participants: S4 will be called Sophie (Section 6.1), S7 Harry
(Section 6.2), S1 Daniel (Section 6.3) and S6 Jess (Section 6.4). Again, it should be
noted that, as an active participant in the online tutorials (Section 1.7), the researcher
will often use the first person to refer to her interactions with the participants.
6.1 Sophie
At the time of the research study, Sophie was a 31 year old at-home mother of one
young child, taking almost full responsibility for the household as her partner was
frequently working away from home. Although she was not in paid employment, she
participated in voluntary work a couple of days a week. She was repeating the
introductory statistics course, Data Analysis, towards the end of her studies in a
Bachelor of Science degree majoring in psychology. Sophie admitted to being
somewhat nervous about studying Data Analysis because of the content and her past
experience studying this course. She believed that she had been unsuccessful at her
previous attempt due to her circumstances: namely, at that time it was her first course
on returning to university studies after a break; she did not have a reliable Internet
connection; and, she was borrowing the textbook from the library because she did
not have a textbook of her own. Sophie felt that she would have a much greater
chance of passing this time as she: had overcome the disadvantages from her
previous attempt; was more familiar with the software used in the course; was more
familiar with statistical language; and, was "motivated to learn and get it right"
(Initial survey).
Sophie had encountered fairly positive mathematical experiences, saying that
she "used to enjoy maths," but she did admit that she had some problems in the past
with higher-level formulae. Further, in her working life she "worked with numbers a
lot and learnt how to understand formulas a bit better" (Initial survey). She had a
diverse range of positive experiences of the online environment, having participated
in group text chats, structured discussions, and online live lectures in Wimba in a
number of her psychology courses. She had not used the asynchronous discussion
forums in her studies in the semester of the research study due to lack of time. She
had, however, used them and the online text chat facilities within the University's
learning management system to make contact with other students in other courses in
the past. These experiences motivated Sophie to offer to participate in the study as
she found that "online live lectures or tutes make me feel more involved and in
touch, and help me learn and either keep me on track or get me back on track and
help a lot with direction for study" (Initial survey). She did not interact with any of
the other participants outside of the online tutorials, although she indicated that she
would have liked to, but time did not permit.
Sophie was a very active participant in the online tutorials (see Section 5.10,
in particular Tables 5.5 and 5.8 for relative frequencies of messages). She attended
all but one tutorial (Table 5.1). The tutorial she missed was the one immediately after
the major flood event when there were issues with Internet connectivity and power
blackouts. She accessed the archive of this tutorial in preparation for the following
tutorial. Despite this, she still felt behind to some extent during the next tutorial.
Sophie contributed during tutorials using both text chat and the microphone. She was
comfortable with either medium. However, she preferred to use text chat as she did
not want the background sounds of her baby to distract others in the tutorial. She also
commented that one of the advantages of using text chat was that she could be typing
a question while I was talking and then adjust it, depending on what I had said,
before hitting the Enter key to send it. Her circumstances were not always ideal for
participating in the online tutorial, as she had her baby with her each time, as
evidenced by his chatter in the background on the occasions when she did use the
microphone. Despite this, she stayed dedicated to the task: "will have to relisten to
last 5 or 10 mins cause bub 'singing loudly'. Will email if any questions after
relisten" (Tutorial 8).
Even though Sophie was not particularly confident in her knowledge of the
subject, she was not intimidated by the group discussions, being prepared to ask
questions ("I hope this isn't a silly question"; Tutorial 5), offer answers, acknowledge
her mistakes ("Oh whoops"; Tutorial 4), and share her successes ("Ah something just
clicked from last week"; Tutorial 8). She often apologised to the group, believing
that her questions might confuse people, but, in my role as tutor, I found them to be
useful for clarifying things that were commonly misunderstood. The other
participants seemed to appreciate them as well (see the comment made by Jess in
Section 6.4). Occasionally Sophie would explain a concept in her own words (using
the microphone) to confirm that she had understood my explanation. With reference
to the CoI framework (Sections 2.4, 3.2.2.3 and 5.10), Sophie's contributions to the
online tutorial were both social and cognitive, adding substantially to the sense of
community that developed within the tutorial across the semester.
From the research questions asked during the tutorials (Section 3.2.1.3),
Sophie indicated that she found the content to be "just right" and "thought
provoking." When asked each week to indicate the most important thing that she had
obtained from the previous week's tutorial, she generally commented on issues that
had been resolved, such as "consolidation of some home learning and clarification of
some question. Hopefully am getting there and switching on!!" and "Got me back on
track again clearing some confusion and explanations made easy to help me
remember and explain things," rather than identifying specific topics. In response to
how she felt about the Data Analysis course, she consistently said that she was
"enjoying it" and "getting on top of it." However, towards the end of the semester
she added that she was also "struggling." When asked which topics she found most
challenging, she indicated "binomial and probability, and hypothesis testing and
confidence intervals." Even though the binomial model and probability are covered
early in the course, as tutor/teacher I was aware that this topic often continues to
plague students throughout the course. Hypothesis testing and confidence intervals
are covered in the last few weeks of semester and link together a number of concepts
from earlier in the course. As such, they too are traditionally challenging to most
students. Sophie acknowledged that she would "feel a sense of achievement" when
she finished the semester, but added that she would "feel relieved" as well. In the
final tutorial she was again asked why she had chosen to participate and how the
online tutorial contributed to her learning of Data Analysis. She said that she had
participated "to help me learn more thoroughly and feel more a part of uni as an
external student and to take part in a research project for 'good karma' LOL!" She
felt that the tutorial "contributed greatly!! Was very helpful with learning and
making me feel a part of something. Felt more in touch and very thankful for it!!"
In the final interview, a number of things came to light about Sophie and her
expectations of herself, what she believed she had gained from participating in the
research study, and what she experienced from being involved. One factor in her
decision to participate in the online tutorial was her perception of the benefits she
had experienced from participating in Wimba sessions in the past. She had been
hoping "something like this" would happen in Data Analysis. She admitted that she
felt guilty when she fell behind in her studies and that these feelings actually made
her fall behind even more. She decided to join the research study mainly for her own
benefit, but added that it was also because it was a research project and, since I had
helped her in the past with her previous attempt at Data Analysis, she wanted to help
me. She expanded on this by noting that, within some of her key psychology courses,
students were expected to participate in research projects undertaken by Honours
students. She had participated in some of these projects and was hoping that, by
participating in the research projects of others, one day others would do the same for
her.
Sophie displayed a lack of confidence in her knowledge and in the likely
outcome of the examination during the final interview. However, she acknowledged
that she was more dedicated and had more interest in studying statistics this time.
She realised that putting too much pressure on herself to obtain the best marks was
counterproductive and that, once she took this pressure off, she actually performed
better. In answer to a question on what was the most difficult thing about the content
and the course, she commented that it was not so much learning the statistical
techniques but rather deciding which analysis method to use in a specific context.
When asked how the other participants contributed to her experience in the
online tutorial (Final interview), Sophie said that at first she had been concerned
about asking questions, thinking that "somebody is sitting there rolling their eyes –
can't believe they don't know that, but it wasn't like that at all." She felt that the
other participants actually "enabled" and "facilitated" her by asking questions about
things that she had been wondering about, or the answers to their questions
reinforced things that she had been thinking. She felt that it was "good that [there
were] real people out there." She was motivated by them and felt supported by them,
as noted in her remark that "no one made me feel silly." Because of this, she felt
comfortable with other participants confirming or correcting her understanding.
On the subject of how the online tutorial helped her with learning in Data
Analysis, she admitted that she did not know if she would have been "lost" without it,
but she thought that this may have been the case. She saw participation in the online
tutorial as her:

… learning tool even before the lecture notes and worksheets and stuff like
that. Hearing stuff, being able to do stuff, and asking questions being thrown
around, and actually physically doing questions that you asked of people,
definitely helped the thinking rather than sitting there and seeing all these
formulas and everything and just panicking.
(Final interview)
When questioned about the need or otherwise for a tutor to be present (Final
interview), Sophie responded that the tutor was "definitely needed" to keep the
students "on track." She acknowledged that she saw no problem in having text chat
sessions with only students present but, from past experience in another course, she
had noticed that on one or two occasions they did "get off track" and "we all learnt
the wrong thing." She did not believe that she could have come as far without the
tutor "teaching us things and going through examples, giving us questions and
making us think. It wasn't even lecture form – actual questions and then we would
discuss the questions, so then understanding would happen." Sophie felt that it was
really helpful to have the commitment of attending the online tutorial each week to
keep her up-to-date and to motivate her. She admitted, "I am always the one that says
this is probably a silly question but, and just to have other people say 'no, it is not a
silly question' and you to say that; it makes you a bit more motivated and, like OK,
just get on with it. Good" (Final interview).
6.2 Harry
Harry was a 67 year old retiree with no family commitments to impact on his studies.
At the time of the research study, he had only just started his undergraduate studies
towards a Bachelor of Science majoring in biology, having completed one course
towards his degree. He had completed the tertiary preparatory program prior to
enrolling in the undergraduate degree. Having failed at his first attempt, he was
repeating Data Analysis and had decided to undertake only one course in the
semester of the research study. He realised the need to understand statistics better to
further his studies in biology. Since Harry had recently completed Tertiary
Preparatory Mathematics courses, he believed that this would help him with the
formulae needed for Data Analysis. He indicated that he had no prior experience of
using asynchronous discussion forums. Even though Wimba sessions were available
in the tertiary preparatory program, he had not participated at that time and regretted
not doing so. Consequently, he decided to join the research study as he thought it
would be to his advantage to learn different ways to solve problems. His sister, who
had previously studied at the University, advised that he should take advantage of it,
telling him that "it will be better for you" (Final interview).
In an unsolicited email about halfway through the semester, Harry
commented, "I really enjoy the online Wimba as I have not done it before and get
great feedback not only from you but the other students." He felt that even though
some people had technical difficulties and it was a bit difficult to start with, it was a
good concept, "like being in a [real] classroom" (Final interview). When prompted to
compare the online situation further with a face-to-face one, and to comment on
whether anything about it bothered him, he replied that it was a bit difficult in the
beginning because he had not done it before, but that not being face-to-face did not
matter because "you had your nice face on the front anyway." By this, Harry was
alluding to a small photo that I had uploaded to one corner of the main screen in
Wimba to make the online environment less impersonal. To achieve a similar effect,
other instructors have used a video feed of themselves in the corner of the main
screen in Wimba. I did not do this as it used up too much bandwidth and my home
Internet connection was not fast enough to include video.
Harry was the most reliable participant in the research study, having attended
all nine tutorials (Table 5.1). He did not see it as an imposition to commit to the one
hour each week. In fact, in the final interview, he said that he had enjoyed it. Though
he could be colloquially described as a "man of few words" (Tables 5.2, 5.5 and 5.8),
he occasionally contributed to the discussion using both text chat and the
microphone. In particular, if he was specifically asked a question, he would always
offer an answer using the microphone. It was noticeable that he contributed more to
discussion in Tutorial 7, when there were only four participants present (Section
5.7.3). He contributed most frequently to questions that needed calculations to be
completed and when statistical tables needed to be used. Even in the tutorials where
he only contributed a couple of comments or questions, his presence throughout was
noticed, as he acknowledged his understanding or otherwise using the Tick/Cross
icon when asked. He commented that he was not used to speaking in the online
environment and felt that people were "a bit shy" of using the microphone. He added
that it was a matter of practice and that "after the first three or four sessions we all
got pretty helpful to each other" (Final interview). He joined in the social text chat at
times, bringing his dry sense of humour to the group. For example, he commented
"Take a GPS" when Daniel (S1) mentioned his impending road trip through the
floods to his new home in the central-north of the state (Tutorial 5), and, to me, that
"I thought that the dog might have chewed your mic" (Tutorial 8) when I had
inadvertently turned off my microphone for a few minutes in the middle of an
explanation.
Similar to responses given by Sophie (S4; Section 6.1) to a research question
during the tutorials, Harry indicated that he found the content to be "just right" and
"thought provoking." When asked each week to indicate the most important thing
that he had obtained from the previous week's tutorial, he generally commented on
some aspect of the topic covered in that tutorial, from graphs and variables through
to hypotheses and proportion. In response to how he felt about the course he mostly
chose "getting on top of it," but in the final few tutorials he added that he was
"enjoying it." Asked which topics he found most challenging, Harry again gave a
response similar to that given by Sophie (S4) when he indicated "binomial and
probability and hypothesis testing and confidence intervals." However, when asked
in the final interview what he thought was most difficult about the content and the
course, Harry referred to the language of statistics, how it was different from
mathematics and that it was "not that difficult, but just had different concepts." In
addition, Harry mentioned that the subject matter was very important to his future
studies in biology and that it was "relevant to everything."
Despite any difficulties with the content, he acknowledged that he would
"feel a sense of achievement" when he finished the semester. In the final tutorial, he
was also asked why he had chosen to participate and how the online tutorial
contributed to his learning of Data Analysis. He participated "to be more informed
and to learn how other students study." He felt that the tutorial was "very good as I
can revise with the recording tute [archive]."
When asked if he would like to have Wimba sessions in other courses in the
future, Harry responded "definitely, most brilliant things ever" (Final interview). He
believed that it was good to use Wimba to interact with other students and he was
"pretty sure everyone got something out of it." He felt that all students should use
Wimba more often. Harry did not make contact with other participants outside the
designated tutorial time but admitted that he "should have done" (Final interview).
This course had very active asynchronous discussion forums. Harry
commented that he read forum posts, but did not use them as much as he believed he
should have. He added that the online tutorials were better because of their
immediacy, which he explained as being "one-on-one straight away." He felt that it
was useful "sitting in your own room reading a book and you can go back and
forward [in the book]," but that with the tutor and other students the online tutorial
"brings much more to life what you are actually trying to study" (Final interview).
Harry was the only participant in the research study who used all the archives
of the online tutorials. In fact, he watched most of them at least twice, often in the
couple of weeks following the tutorial and then again during the revision period just
prior to the examination. When asked about this, he commented that he could replay
everything several times to interact with and hear what everyone said.
When asked how the other participants contributed to his experience in the
online tutorial, he pointed out that different people asked different questions and
brought lots of different viewpoints to the discussion. He also added that the tutor
moderated the process and explained "where we all went wrong or right," and that it
was "like being in a classroom using the blackboard" (Final interview). He felt that
the tutorial would not have been as effective without the tutor.
Unfortunately, Harry was a victim of the post-flood disruptions on the
morning of the examination. The ferries and trains to his suburb were disrupted and
despite allowing twice the usual amount of time for bus transport, he was unable to
arrive at the examination centre on time. As a result, he was given a supplementary
examination which he subsequently sat and passed at the end of the following
semester.
6.3 Daniel
Daniel was a 34-year-old full-time high school teacher who was completing the
Graduate Certificate in Science (Mathematics) in order to teach the higher levels of
the secondary mathematics curriculum. He had two young children under the age of
three and, as a result, found that mornings and nights were very busy times of the
day. He was excited by the prospect of studying Data Analysis as he saw it as "a
useful tool in today's data-driven workplace" (Initial survey). He felt that his past
experiences of mathematics, especially his recent solid passes in two undergraduate
mathematics courses, would stand him in good stead for his current studies in
statistics. Daniel had experienced online communication in the form of asynchronous
discussion forums in his previous courses but had no experience of synchronous
communication, such as in Wimba, with tutors or other students. He felt that
participation in the research study "might make me set aside some definite weekly
time to study and think about the subject matter" (Initial survey).
Daniel was a very active participant in the online tutorials (Tables 5.2, 5.5
and 5.8). He attended all but two of the nine tutorials (Table 5.1). He missed Tutorial
1 because he had committed to a school awards night prior to agreeing to participate
in the research study. He missed Tutorial 7 because he did not have an Internet
connection for that week as he had very recently arrived in the relatively small
country town where he was taking up a new teaching position. However, he did
watch the archive of Tutorial 7 before the tutorial that followed.
Even though Daniel had experienced some technical difficulties at times, he
felt comfortable in the Wimba environment. He used a mixture of text chat and
microphone to communicate during the tutorials, "as children of the digital age we're
kind of able to do both" (Final interview). Having to use the Hand-up icon to ask a
question using the microphone was not seen as an inconvenience to him, as he
recognised that there were no visual cues to do this otherwise. Daniel was
comfortable with using the microphone or text chat, but regularly used text chat so as
not to interrupt the flow of the conversation. He commented that he did not like
interrupting anyone.
Daniel added value to the tutorial discussions in a number of ways. He was
confident with giving his version of an explanation to add clarity. Even when he had
not covered content prior to the tutorial and he was unsure, he worked in the moment
and was prepared to offer answers to my questions (Tutorial 8). He offered support
and encouragement to his fellow participants. When Jess (S6) picked up on an error
in an answer that Daniel had offered, he thanked her by saying "Good catch Jess"
(Section 5.4.3). When Sophie (S4) was excited at getting something correct that she
had struggled with earlier, Daniel clicked on the Clapping-hands icon to show
support (Tutorial 4). He also added to the comfort of others by freely admitting when
he was having trouble - "I got confused with the wording here. My mind doesn't like
the negative phrasing" (Tutorial 6) - when he could just as easily have said nothing.
Daniel added depth to the discussion by looking beyond the topic and
trying to link different concepts together to ask a more complex question or make a
perceptive comment. This was evidenced when he was looking for a connection
between the normal approximation to the binomial distribution and the central limit
theorem (Tutorial 5).
Even though Daniel had experienced some difficult circumstances and
technical problems through the semester, he persisted with attending the tutorials. He
unexpectedly managed to participate successfully in Tutorial 6, despite the fact that
he was in a hotel room with his family (which included two small children) using an
unstable mobile Internet connection. This became necessary as he was delayed in his
journey to his new employment due to the widespread flooding that was occurring
that summer. A happy consequence of this delay was that he was able to meet with
Jess (S6) face-to-face for a study session during this stopover in her hometown
(Tutorial 6, Section 5.6.3).
From the research questions asked during the tutorials, Daniel initially
indicated that he found the content to be "nothing new but helpful." For the later
tutorials, similar to the comments of Sophie (S4) and Harry (S7), he responded that
they were "just right" and "thought provoking." In each tutorial, when asked to
indicate the most important thing that he got out of the previous week's tutorial, he
commented on issues such as "that I wasn't as far behind as I thought I was. That I'm
on the right track and it's not as hard as it looks" (Tutorial 3), as well as particular
topics such as "using different formulas for finding the SD and mean for proportion
and mean of a sampling distribution" (Tutorial 6). In response to how he felt about
the course, his choices included "enjoying it" and "getting on top of it." When asked
which topics he found most challenging, he indicated "contingency tables and
conditional distributions" and, in agreement with Sophie (S4) and Harry (S7),
"hypothesis testing and confidence intervals." He acknowledged that he would "feel
a sense of achievement" when he finished the semester, but added that he would also
"feel relieved."
In the final tutorial, when asked why he had chosen to participate in the
online tutorial, he responded that he participated "to help motivate me to keep up to
date and to give me an opportunity to see more examples of the types of problems we
read about in the book." With reference to the question on how the tutorial
contributed to his learning, he felt that it "helped me by either introducing me to the
topic (if I was running behind on the study schedule) or by consolidating my
knowledge on the topics."
When asked at the final interview why he had decided to participate in the
online tutorial, he replied that his main motivation was "keeping me involved with
things, trying to feel more a part of a learner's community as opposed to just logging
on and viewing lectures." He further clarified that he "was after the interactivity and
maybe the community of learners."
If he had been given the opportunity to be involved in Wimba sessions in
other courses, Daniel indicated that he would have appreciated it, particularly with
his higher-level mathematics course. He identified seeing problems worked out in
real time and being able to ask questions for immediate feedback as benefits of the
online tutorial in Wimba. He added that, while viewing recordings of campus
lectures in his mathematics course, he had felt "if only I could ask or make a point or
if I could only say something, then I think that it would have helped my
understanding of the content" (Final interview).
As briefly mentioned earlier, Daniel had the unique experience of meeting
face-to-face with one of the other study participants. Even though some participants
lived in the same city as one another, none of them arranged to meet outside the
tutorial. While travelling with his family to their new location, Daniel was delayed in
Jess's home town for a week due to the widespread flooding happening throughout
the state at that time. During the online tutorial that week, Jess realised that Daniel
was in town, so they swapped contact information and arranged to meet on two
occasions. As Daniel put it in the final interview:
… we met to just look over some of the stuff and make sure we were on the
right page, and I guess it just really turned into us reassuring each other
that we were sort of doing the right thing, making sure we were touching all
the bases, just because there was a lot of different avenues of data coming at
us, from the book, to the study book, to the website.
In the final interview, when asked about the contributions of other
participants to the tutorial, Daniel commented that it was good to see other people's
ideas and questions - "there were some that thought of things that I hadn't thought
of." However, he felt that the tutor was an important part, being the leader and
keeping everyone on track: "groups tend to lose focus unless they have a strong
leader." While he acknowledged that the leader did not have to be the tutor and this
role could be filled by a student, he admitted that he liked having a person there who
knew the content "down pat," a content expert. He did not feel that the tutorial would
have been as effective if the tutor had not been present. In fact, he commented that he
probably would not have participated if that had been the case. Even though Daniel
found the archives useful and had accessed them twice to cover material he had
missed, he felt that it was much better being a participant and having a tutor present.
In assisting his learning of the Data Analysis course, he felt that the online
tutorial was often a consolidation tool. However, once, when he had not prepared
before one of the tutorials, he found that the tutorial was a valuable tool to introduce
the topic to him. He found the examples discussed in the tutorial were easy to follow.
He felt that this made studying that particular topic easier. He viewed the tutorial as
providing a "safety net" above and beyond any other support provided in the course,
with a weekly routine where he could talk to an expert. He noted that the topic that
he had most difficulty with was hypothesis testing, in particular, notation. He
commented that his "understanding went back and forth between feeling confident
and not feeling so confident, especially with notation between a statistic and a
parameter" (Final interview).
Overall, as an external student, he appreciated being able to participate in the
online tutorial. It made him set aside a little time to work on the course. He added
that it "makes us external students feel valued by this support" (Final interview).
6.4 Jess
Jess was a 48-year-old former high school teacher studying a Bachelor of Science
majoring in psychology with plans to transfer to the Graduate Diploma in
Psychological Studies when she had sufficient prerequisite courses completed. She
had been working full-time but found combining work and study to be quite
stressful. At the beginning of the summer semester, with two courses already
completed towards her degree, she decided to take a break from teaching and
concentrate on her studies. Jess was feeling confident about studying Data Analysis,
especially with the support that was available (Initial survey). She had studied
statistics at tertiary level many years ago and attributed her underperformance at that
time to a lack of motivation and poor work ethic rather than a lack of mathematical
ability. In recent times, she had successfully taught some statistics as part of the
secondary school curriculum, so she did not feel that her past experiences in
mathematics would have any impact on her current studies in Data Analysis.
Jess used the asynchronous discussion forums in the Data Analysis course.
Although she had not previously used synchronous communication such as Wimba
in her studies, she did have experience of this type of online communication in
professional development activities in her employment. When asked in the initial
survey why she had decided to participate in the research study, her response was
"Why not?" She thought that it would be a "great way" to develop her technological
skills while studying Data Analysis. She added "I am a strong believer in learning in
context and these tutorials will be a great way to do this." Jess also thought that she
might have to use Wimba in future studies, so she might as well take this opportunity
to develop her confidence and competence in this area. When asked if she would like
to have online tutorials in Wimba in any of her other courses, she indicated that it
would depend on the content of the course (Final interview).
As with Sophie (S4), Jess was interested in the research aspect of the project.
She was hopeful of undertaking some research of her own in postgraduate studies in
the future - "good to be on the receiving end sometimes so it gives you a better idea
of what is going on" (Final interview). She also appreciated the interaction that the
online tutorial could afford, offering that "the more contact you have with people
about your work, the more you can discuss it, the better your understanding. It opens
up so much more than if you are by yourself and isolated" (Final interview). Because
she had previous experience of video conferencing through her work, she was
confident using either text chat or microphone to communicate in Wimba. She could
see advantages in someone being able to write while someone else was speaking,
saying that it "probably mimics the actual real life classroom better if you are
actually also able to write while someone's speaking" (Final interview).
As a result of an unexpected opportunity, Jess met face-to-face with one of
her fellow participants in the research study (Section 6.3). When Jess realised that
Daniel (S1) was going to be staying in her town because of a delay in his travels due
to flooding in the region, she approached him about meeting for a study session by
addressing him directly using text chat during Tutorial 6. In the final interview, Jess
mentioned that they met at the local library, "which was really good. I enjoyed
meeting someone else who was actually in the tutorial group." She added that she
would "love to" have talked to the other students face-to-face.
Jess was an active contributor to all aspects of the online tutorials (Table 5.5).
She attended all but one of the tutorials (Table 5.1). She missed Tutorial 3 because of
a lack of Internet connectivity while moving house. Jess did not access any of the
archives. Moving house during the semester, looking after her niece for part of the
semester, participating in the research study, and studying another course in addition
to Data Analysis over the shorter than usual Summer semester all contributed to
reducing the time she had available to access the archives. She added that she "would
love to have used them, but did not physically have the time." Despite this, she
thought that "they were really valuable and probably would have spent a fair bit of
time thinking about the questions in terms of revision" (Final interview).
Jess was comfortable using both the microphone and text chat. She was
prepared to ask questions when she did not understand. She was supportive of other
students, noticeably when she commented "No such thing as a silly q'n Sophie,"
when Sophie (S4) prefaced a question with "I hope this isn't a silly question"
(Section 5.5.3). On another occasion, she supported Sophie's questioning by saying
"Thanks Sophie, I was wondering this as well" (Tutorial 6). Jess did not hide her
worries and concerns, saying "Good, I was getting nervous" (Tutorial 5). This was in
response to my comment to the group that some of the questions we had been
discussing were somewhat more difficult than the ones they would need to answer in
the examination. She was noticeably quieter in Tutorial 8 but later acknowledged
that she had not covered the content before the tutorial. At one stage in Tutorial 7,
she became a little overwhelmed, commenting that "I'm confused as there's so
much." However, she did not let that stop her from participating actively in the
discussions.
From the research questions asked during the tutorials, Jess, similarly to
Sophie (S4), Harry (S7) and, to some extent, Daniel (S1), indicated that she found
the content to be mostly "just right" and "thought provoking." In the latter part of the
semester, when the material became somewhat more complex, she also added "a bit
confusing." When asked each week to indicate the most important thing that she
obtained from the previous week's tutorial, Jess commented on issues such as "that I
was behind and this gives me more incentive to catch up despite trying to finish my
job and move house - busy time!" and "confirmed what I knew but was a good
reminder and allowed me to tie things together better - big picture stuff." She also
mentioned particular topics such as "it helped me sort out the logic behind the sign
test but I'm still confused about some things, probably minor but will ask if Wimba
session on this week." In response to how she felt about the course, her initial choices
included "enjoying it," but towards the end of the semester there were more of
"getting on top of it," "struggling" and "still worried." When asked which topics she
found most challenging, she indicated "sampling distributions" and, similar to
Sophie (S4), Harry (S7) and Daniel (S1), "hypothesis testing and confidence
intervals." She was looking forward to "getting some sleep" and "having a party"
when she finished the semester, but added that she would also like to "take a
holiday" (Final tutorial).
Because Jess was late to the Revision Tutorial (final tutorial of the semester),
I asked her to email me her responses to the research questions that had been asked.
When Jess was asked why she had chosen to participate in the online tutorial and
how the tutorial contributed to her learning of Data Analysis, she indicated that she
had participated because "I'm all for getting as much help as I can get - from you and
other students." She added that "if I am also able to do something to help someone
else along the way or other students now and in the future; all the better! A win-win
for all concerned!" She felt that she needed to boost her confidence with regards to
using technology, "using Wimba was a good-sized leap for me at the time. I also
knew that I would be using Wimba later this year, so it just made sense to make the
most of this opportunity." She indicated that she missed the interaction that occurs in
classes on campus, commenting that "distance study is extremely flexible but
face-to-face chats about work and hearing how other people think are very powerful
when learning new work." Jess added, "I am so sick of playing it safe and not making the
most of opportunities - nothing like risk-taking and getting in your discomfort zone"
(Final tutorial).
When asked to comment on how the tutorial contributed to her learning, Jess
said that it clarified the content, including links between topics. For her it "verified
what I was thinking or corrected my thinking, highlighted what I didn't understand so
I could alter my thinking, etc. (there is no way I'm going to pretend I understood
everything)" (Final tutorial). She added that it "helped in understanding the language
of DA [Data Analysis] by listening to how it was used while interacting." She found
drawings very helpful when trying to solve a problem or to understand a
concept. She also felt that involvement in the online tutorial was "applying pressure
to get up to date when behind."
When talking about what went on in the online tutorials, Jess commented that
some people seemed hesitant and did not participate as well as they might have if
they had had the opportunity to meet others in the group face-to-face. She thought
that perhaps some of the participants felt unsure of themselves because of the subject
matter and did not want to reveal how much they may not have known. On the other
hand, she felt that there were some who obviously enjoyed it and "that really stood
out" (Final interview). She enjoyed the interaction and support that the tutorial
offered, saying that it was "nice to share the responsibility." She felt that it was
important to assist others to feel comfortable - "where you can support people and
do what you can. I think we learn a lot from each other." On the subject of whether a
tutor needed to be present, Jess noted that, for some people, it would have been
difficult without the tutor's input. She added that even though they were all adult
learners and there was an expectation that they were independent, some needed to be
directed on where to go so that they did not "get stuck on something and stay there
for too long" (Final interview).
On the whole, Jess did not really find the course that difficult although she
mentioned that she could understand why some may have had problems. She thought
that the initial few weeks were not that troublesome but the second half of the course
"was really heavy going" (Final interview). She felt that the terminology was not an
issue and that familiarity, hearing it and seeing it, was the key. Where she had most
difficulty was with the amount of time, or rather the lack of it, she had to devote to
the course. She was disappointed that she did not have time to revise for the
examination in her usual manner. At times the online tutorial provided clarity in
some of the content for her, "someone would say something and the light bulb would
come on and you think 'Ah,' because you thought you understood it but you didn't
totally understand something" (Final interview). She felt that talking to people in the
online tutorial format really helped, offering that "I think I would have found it a lot
more difficult to realise that I was having problems if I had actually just got on the
discussion list and actually just entered something there. I don't think that it would
have been the same" (Final interview).
6.5 Summary of the narratives
Each narrative provided insights into the individual journeys of a variety of
participants through the semester of Data Analysis. Each participant recognised the
beneficial impact on motivation and "keeping on track" that the tutorials provided.
By being "forced" to set aside specific time each week to participate in the tutorial,
they were able to consolidate their learning and clarify any misunderstandings on a
regular basis. The immediacy of the online synchronous tutorial allowed them to hear
different viewpoints, learn from each other and obtain timely feedback from the
tutor/teacher, like being in a "real" classroom. While they all indicated that they
enjoyed the tutorials, they added that the tutorials were also thought provoking.
Through their social presence they each contributed to the sense of community which
in turn gave participants the confidence to contribute without fear of ridicule.
The common themes to emerge from the findings of Chapters 4, 5 and 6 are
discussed in Chapter 7. This discussion relates to the aims of the study through the
affordances of the technology used, the interaction that resulted in the online
synchronous tutorials, the nature of the dialogue that developed and the participant
perceptions of the value of these tutorials to their learning. Chapter 7 concludes with
some suggestions for further research.
Chapter 7: Discussion and conclusions
By examining the processes of an online synchronous tutorial in introductory
statistics at a regional university in Australia, this case study sought to answer the
following question:
How does an online synchronous environment contribute to the learning of statistical
concepts by distance learners?
As introduced in Chapter 1 of this document, the aims of this study were:
1. To describe the student-teacher, student-student and student-content
interaction in the learning of statistical concepts in this environment;
2. To investigate the nature of the dialogue in this environment for this
discipline context;
3. To examine the student perceptions of the value of this environment to
learning of statistical concepts; and,
4. To formulate theoretical constructs pertaining to the teaching and learning of
statistical concepts in this environment.
The research question will be addressed by considering each of these aims in turn.
The "case" in this study was explored by describing the interactions amongst a
community of learners that occurred in a series of online synchronous tutorials over a
period of one semester (as introduced in Section 1.6).
In accordance with case study methodology, multiple methods of data
collection (as described in Section 3.2.1) and analysis (as described in Section 3.2.2
and presented in Chapters 4, 5 and 6) were used to investigate the interactions and
the nature of the dialogue taking place in online synchronous tutorials in a
quantitative discipline context. Discussion of these findings will focus on four
elements: the affordances of the online synchronous environment (Section 7.1) and
description of the interaction that occurred in this environment (Section 7.2) to
address the first aim, the nature of the dialogue in this environment (Section 7.3) to
address the second aim, and student perceptions of the value of this environment
(Section 7.4) to address the third aim. The integration of these elements will be
discussed in the context of teaching and learning of statistical concepts in an online
synchronous tutorial leading to a proposed model of online synchronous learning in
this context (Section 7.5), addressing the fourth aim of the study. Proposed future
research directions are offered (Section 7.6) and a postscript (Section 7.7) will
complete this thesis.
7.1 The online synchronous environment
The nature of distance education has been moulded over time by technological
advances and the availability of the technology to the masses (Anderson, 2008;
Moore & Kearsley, 2005). The rapid development of communications technologies
and associated potential to provide increased interactivity between teachers and
students are finally being realised (Gunawardena & McIsaac, 2004). Through the
affordances of such technologies more flexible learning environments have
developed (Gunawardena & McIsaac, 2004). In order to understand the learning
taking place in a computer-supported collaborative environment, interaction and
discourse afforded by such environments need to be described and analysed in detail
(De Wever et al., 2006; see Section 2.3). While asynchronous technologies offer
time to read and reflect before formulating a response and may offer more time on
task and the flexibility to construct knowledge at any time (Section 2.3.1), they can
be lacking in the area of immediate and timely feedback, spontaneity and replication
of a "real" classroom to reduce transactional distance (Bates, 2005; see Section
2.3.2). With greater understanding of the affordances provided by technology, the
teacher has a pivotal role to play in exploiting the benefits and minimising the
technical difficulties of the online synchronous environment.
Within the interactive online learning platform used in this study, voice
(using a microphone), text chat, writing on a virtual whiteboard and emoticons/icons
were the means of communication. Within this online synchronous environment, the
context of the communication (subject matter and people discussing it) and the
subsequent combinations of these four forms of communication influenced the
dynamics of the interaction taking place (see Chapter 5, particularly Section 5.10).
With the lack of visual cues, the teacher was necessarily the instigator and
facilitator of any interaction in the initial stages of the online synchronous tutorials in
this study, be that social engagement or content discussion. Since it was much easier
for the teacher to express complex thoughts using voice rather than typing (as in text
chat), much of the conversation in the online tutorial was initiated using voice
(Section 5.10, Tables 5.3, 5.6 and 5.9). Even though voice may be considered easier
to use, the participants were reticent to use it, favouring instead text chat (Section
5.1.3). This did vary amongst participants and across the semester (Section 5.10).
The use of voice seemed to be related to the level of complexity of what needed to be
said and what was happening in the tutorial at the time. Specifically, if a participant
was addressed by name with the teacher either posing a question or asking for
comment, the participant invariably answered using the microphone. It was
perceived by some participants that using text chat allowed them to contribute to the
conversation without interrupting the flow of proceedings (Sections 4.2.4, 5.6.3 and
5.9.3). It also meant that background noise in the various locations of the participants
did not impinge (Sections 4.2.4 and 6.1).
While teachers need to exploit the affordances of the technology to engage
and motivate all students to actively interact with the content, they also need to
temper this with an appreciation of the diversity within the group of students in
relation to the level of familiarity with the terminology required by the content,
degree of difficulty of the content and attitude of students towards the content being
discussed; this is particularly pertinent to the study of statistics. This becomes even more
critical in an online synchronous environment because of the immediacy of the
interaction in combination with the lack of visual cues to indicate how comfortable
students are with what is being asked of them. This was evident when participants
were being encouraged to use the microphone, to which the response from one was
"we are all shy" (Section 5.1.3). It needs to be acknowledged that active involvement
could mean sitting quietly during the tutorial taking it all in, similar to "pedagogical
lurking" as described in the asynchronous environment (Dennen, 2008).
three participants were not as active in their contributions (Tables 5.3, 5.6 and 5.9),
but were nonetheless involved in the learning process, indicating a level of
understanding when requested using the Tick/Cross icon and then reviewing the
content later by listening to an archived recording and replaying the parts that needed
to be reinforced (Section 6.2).
While the online tutorial could be likened to a face-to-face classroom
because of synchronicity, one advantage of this online medium was that several
participants could be answering a question or making a comment at the same time
through the Chat window, whereas in a face-to-face situation this would be seen as
talking over one another (Sections 4.2.4 and 5.5.3). In the online environment this
simultaneous communication could also be happening while the teacher was
explaining a concept or discussing an example using voice and the virtual
whiteboard. This had both positive and negative consequences. It meant that
participants did not have the frustration of waiting to take their turn. They could post
a question or response whenever they felt the need. It also meant that the teacher did
not need to control the discussion as tightly as in a face-to-face classroom, or as in
the virtual alternative of requiring the Hand-up icon to be used to request an
opportunity to speak. This resulted in a free-flowing conversation developing with
considerable participant involvement (Cox et al., 2004).
other hand, for some (the teacher included), it could be chaotic and troublesome
keeping up with the bombardment of information and consequent cognitive overload
(Kear et al., 2012). This could explain in some way the use of the archives to
reinforce learning by participants who had actually attended the tutorial (Section
4.2.3 and Table 5.1). In addition, this could partly explain the reticence of some to
actively engage in the conversation and be satisfied with listening and reading the
contributions of others (Section 4.2.4).
Prior experience of the online synchronous environment did not seem to
make any difference to how well participants adapted to the environment, although it
probably helped that some in the group were familiar with it (Sections 4.1.4 and 6.2).
After an initial orientation to the interactive online learning platform and some
practice at using features within it, such as the Tick/Cross icon to confirm
understanding and the Hand-up icon to signify wanting to say something using a
microphone, participants became comfortable with interacting (Section 4.2.4).
Occasionally there were technical issues to do with functionality within the
interactive online learning platform and individual Internet connection speeds, for
instance the inconvenience of having to hold down the Talk button to speak, but
these were far outweighed by the benefits of interaction with other participants and
the tutor/teacher and the convenience of being able to join the "class" from home
(Section 4.2.4). With time and experience, the technology afforded a heightened
sense of community which in turn meant that the technology could be exploited in a
number of different ways to accommodate different learning styles. The active
learners could contribute openly in the Chat window or using voice. The passive
learners (pedagogical lurkers) could listen and observe yet still participate by
indicating understanding using the Tick/Cross icon and feel part of the community
(Cox et al., 2004; Dennen, 2008). Even though these passive learners could obtain
the relevant information from the archives, they still chose to attend the tutorials on a
regular basis (Table 5.1). Over time as participants became more comfortable with
the technology and the fact that it was not possible to distinguish who was writing on
the virtual whiteboard, collaborative problem-solving developed more fully (Section
5.9). Anonymity may have assisted in this.
How the affordances of the technology were maximised was influenced
strongly by the teacher. The teacher set the tone and expectations for the class. This
was achieved in the first instance by social engagement early in the teaching period
and at the beginning of each class, as this was a non-threatening way to encourage
participants to use the technology to greatest advantage (Section 5.1.3). The use of
"ice-breaker" questions at the beginning of each tutorial encouraged participants to
become active and prompted social conversation. The teacher tried to be inclusive in
fostering the online communication by asking specific participants by name to
contribute. However, it was difficult for the teacher to have a clear indication of what
participants were comfortable with in relation to using the technology, especially as
there were no visual cues. In this there needed to be an element of compromise to
allow participants to ascertain for themselves how they wanted to use the technology,
for example, whether to use voice or text chat, and not try to force this issue. The
teacher needed to find a balance between being inclusive by asking specific
participants to respond and potentially alienating a reluctant participant.
The intention of the online synchronous tutorial in this context was to
actively engage students in a community of learners studying the quantitative content
of introductory statistics by emulating a "real" classroom (Anderson, 2003b;
Garrison et al., 2000; Garrison & Arbaugh, 2007). The affordances of the interactive
online learning platform, Wimba Classroom, were exploited in different ways by
different people. While some participants primarily used text chat others were
willing to use a combination of text chat and voice. All participants made use of the
Tick/Cross icon, but it was not possible to distinguish who was writing on the virtual
whiteboard. The operational characteristics of voice (using a microphone), text chat
and writing on a virtual whiteboard and their relative ease of use within Wimba were
the enablers behind the interaction observed in the online tutorial. However,
affordances of technology need to be balanced by creative pedagogy (Twomey,
2009). In this lies the challenge for the teacher to provide not only the physical
environment (albeit virtual in this case), but also the necessary learning environment
to engage the students interactively.
7.2 Describing interaction in the online tutorial
With the widespread uptake of distance education, a long-held view was that students
would need to choose between independence and interaction, but this is no longer
believed to be the case; both can coexist (Garrison, 2000). Interaction involves a
myriad of connections (Anderson, 2008; Anderson & Dron, 2011) in order to reduce
transactional distance (Moore, 2007) and build a collaborative community of learners
(Garrison et al., 2000; Garrison & Arbaugh, 2007; see Figure 2.2). In addressing the
first aim, this study investigated the dynamics of interaction taking place in the
online tutorials in terms of three components: the people involved, the content being
discussed and the technology being utilised. It further attempted to understand how
these three components came together to form a community of learners by analysing
these interactions in terms of the three presences of the Community of Inquiry
framework (Garrison et al., 2000).
7.2.1 People interacting
While participants in this study were all adults and as such were independent learners
capable of self-direction (Garrison, 2003), they nonetheless appreciated the support
given by their involvement in the online synchronous tutorials, particularly given the
perceived difficulty of the content being studied (Merriam, 2001; see Section 4.2.3).
With the lack of visual cues in the tutorial, participants were not influenced by the
physical appearance of fellow participants. While the sound of a voice and a name
generally indicated gender, age could not necessarily be determined and was
not directly revealed in social conversation across the semester. Considering the
diversity in age of the participants (Table 6.1), age bias was a potential inhibitor to
the development of the sense of community amongst this group. With this in mind, the
lack of visual cues could be considered as a positive factor in the promotion of
collaboration amongst the participants (Freiermuth, 2002).
Social presence was an important component in the development of
community amongst the group (Garrison & Arbaugh, 2007). Establishment of social
presence was facilitated and modelled by the teacher from the very beginning of the
semester and then fostered throughout at the beginning and conclusion of each
tutorial. Inadvertently, people‘s personal lives impinged on the interaction taking
place in the online tutorial. Since each one of the research participants was sitting in
front of a computer in their own home, everyday life, such as a baby gurgling or a
dog going crazy, was happening in the background and this added to the connection
felt within the group, "making names into people" (Sections 4.2.6, 5.9.3 and 6.1).
Social presence interacted with teaching presence to establish the
connectedness amongst the group needed to encourage the collaborative problem-
solving activities of cognitive presence (Shea & Bidjerano, 2009; see Section 2.4.2).
With a mixture of facilitation, direct instruction and timely feedback the teacher was
able to support participants to contribute questions, explain their reasoning and offer
answers (Anderson et al., 2001). Hearing the contributions of others helped
participants to view problems from a number of different perspectives which led to
greater understanding (Section 4.2.2).
As the semester progressed, the oscillation between teaching presence and
cognitive presence became more interspersed with social presence (Figure 5.15).
Further to this, social presence in the form of emotional and social support,
evidenced by comments such as "good point" and "no such thing as a silly question,"
became an integral part of the interactions within the group (Sections 5.5.3 and
5.8.3). Comments providing this type of support were not directly related to
cognitive presence and, as such, could be seen as unnecessary, yet at opportune
moments they provided the incentive to keep cognitive interaction moving forward.
In a social sense they also provided lighter moments to bring some relief from the
intensity and immediacy of content discussion and thus reduced cognitive overload
(Kear et al., 2012; Section 5.9.3).
The establishment of community took time and commitment from
participants to attend on a regular basis. By almost half way through the semester
participants appeared to be showing more confidence within the group. This was
evidenced by participants being able to publicly admit when they were wrong in their
thinking, such as one participant commenting "that's what I thought," even though
they could just as easily have said nothing at all (Section 5.4.3). This sharing of
their thinking and feelings further supported the
growth of the sense of community. It also allowed others to see that they were not the
only one struggling (Section 4.2.2).
The level of social interaction was dependent on who was in attendance and
not on how many were there. This was particularly noticeable in Tutorial 4 where
there was an increase in student-to-student interaction in social support, though there
were only five in attendance (Section 5.4.3). Three of the five in attendance at that
tutorial were amongst the four most active contributors in the group overall (Section
5.10, in particular Tables 5.2, 5.5 and 5.8). It should also be noted that the four
participants who added most to overall interaction contributed prominently to both
social presence and cognitive presence throughout the semester (Section 5.10).
The combination of social interaction and teaching presence (facilitation)
increased engagement within the group (Section 5.4.3). As noted in Section 2.4.3,
teaching presence is a combination of design and organisation, facilitation and direct
instruction. Initially this involved setting up expectations of how the tutorial would
be managed, that is the expectation that participants would ask questions and offer
answers (Garrison & Arbaugh, 2007; see Section 5.2.3). Once participants became
more comfortable with the technology and realised its potential for interaction, they
exploited the affordances it provided for moving their learning forward. The teacher
facilitated a supportive atmosphere by answering questions seriously regardless of
the simplicity or complexity of the questions – all questions were valued to
encourage participants to engage. Even when someone was on the wrong track it was
important to be supportive in bringing them back in the right direction – facilitation
rather than direct instruction as much as possible. A supportive approach was even
more acutely necessary due to the lack of visual cues. It was also important for the
teacher to quickly acknowledge contributions in the Chat window, so that
participants would keep contributing in that way thus maximising the interaction and
collaboration taking place.
Interaction within the online tutorial was a shared responsibility which took
time to develop. Each participant contributed something to the tutorial such that they
were able to support one another and learn from one another (Section 6.4).
7.2.2 Interaction with content
The conscious act of setting aside time to participate in the tutorial each week meant
that participants were connecting with the content on a regular basis. Interaction with
the content required primarily a mix of teaching presence and cognitive presence.
Teaching presence in terms of design and organisation was necessary in the early
stages in selecting the content and method of presenting the content to maximise
interaction by all participants with the content (Shea et al., 2006). Content selected
for discussion in the online tutorials was closely aligned with content being taught in
regular classes during the semester of the study. Presenting the content mostly in the
form of multiple-choice questions allowed all participants to contribute to discussion
and the development of fully-formed answers regardless of whether they felt
confident about the topic or knew how to answer the question. The multiple-choice
options provided the prompts for discussion of probable answers as well as opportunities for
identifying common pitfalls in thinking by considering the relative merits of the
distractors in such questions. This enhanced the collaborative problem-solving
process.
One aspect of understanding the content in an introductory statistics course
is familiarity with the terminology used – the language of statistics. This can present
quite a barrier to engaging with the content. The online synchronous tutorial gave
participants the opportunity to hear the terminology used correctly and to be guided
by the teacher in using the terminology correctly themselves (Sections 4.2.3 and
5.3.2).
Statistical content, as with most quantitative content, is hierarchical – to
understand material later in the course it is important to understand the earlier
content. This can also create barriers. Learning statistical content in a community of
learners with a content expert (teacher) allowed participants to have concepts
clarified and relationships amongst concepts explained as needed (Section 4.2.2). It
also allowed participants to verbalise their knowledge and seek confirmation that
their understanding was sound. The teacher fostered discussion using
question/answer facilitation. By modelling the thinking processes and problem-solving
techniques and, where necessary through direct instruction, the teacher supported
participants in their assimilation of the content (Section 2.4.3).
7.2.3 Technology affords interaction
It has been established that advances in communication technologies have changed
the ways in which teachers exploit technology in their teaching (Moore & Kearsley,
2005; also see Section 7.1). While the teacher in this study realised that using
multiple-choice questions to engage participants in interacting with the content
would be less threatening, it was not realised initially that using the Polling function
within Wimba would actually limit the interaction (Section 5.1.3). Using polling
meant that participants chose an option and discussion followed once the summary of
answers was displayed. By projecting multiple-choice questions on the virtual
whiteboard instead, participants were able to physically interact with the multiple-
choice questions, such as underlining keywords, and illustrate the problem-solving
process needed to arrive at an answer. In this way the answer was not paramount but
the process was. During this interaction the technology supported the phases of
cognitive presence (Garrison, 2007): the multiple-choice question was the triggering
event, underlining keywords signified exploration, discussing how those keywords
combined to lead to a solution denoted integration and selecting which option to pick
in the multiple choices demonstrated resolution (Section 2.4.2). The immediacy of
this collaborative environment, along with facilitation from the teacher (teaching
presence), allowed the participants to overcome the difficulties commonly faced in
asynchronous environments of progressing from the exploratory phase to integration
and resolution (Vaughan & Garrison, 2005).
Another aspect of the technology that maximised interaction was the
multiple inputs happening synchronously: voice, text chat and writing on the
whiteboard (Section 5.6.3). When one person was answering a question with voice,
others were adding supporting comments to the explanation in text chat, while another was
writing on the whiteboard – collaborative activity increasing participation (Sections
5.6.3 and 5.8.3). In addition, when several people gave the same answer to a question
in the Chat window, it gave people the opportunity to see that others were thinking
the same way they were. Unlike asynchronous environments this mixture of voice,
text chat and writing on the virtual whiteboard added to the dynamics of the online
tutorial (Section 4.2.2). The virtual whiteboard added further to this process by
making it possible to value-add visually, with diagrams and formulae, to any
explanations using electronic handwriting (Figures 5.1 to 5.12). The availability of
the Chat window also added to building a sense of community as participants could
engage socially before the tutorial started while waiting for the teacher to complete
preliminary organisation (Section 5.5.1).
With no opportunity to "see" if students were confused or comprehending
the explanation of a concept by the teacher, the Tick/Cross icon within Wimba
allowed a quick confirmation of understanding. This meant that the flow of
discussion was not interrupted and yet the teacher was still able to ascertain that the
discussion was producing the desired outcomes (Section 5.1.3).
7.3 Nature of the dialogue in the online tutorial
In addressing the second aim of the study, it was found that the main contributing
factors to the nature of the dialogue in the online tutorial included the affordances of
the technology, the presentation of the content and the development of the
community of learners across time. The interaction amongst these factors determined
how the dialogue developed, while the interplay amongst the three presences (social,
cognitive and teaching) informed understanding of this process.
In the online synchronous tutorial where there were no visual cues, dialogue
primarily developed between the teacher and participants at the instigation of the
teacher (teaching presence). Even though the dialogue was often one-to-one, in
essence it became one-to-many. The teacher, in responding to a specific participant's
enquiry shared the response with the whole group. This dialogue was usually enacted
with the teacher using the microphone and the participant using text chat (primarily
cognitive presence). As a result, teacher utterances could be as long and involved as
necessary, but participants generally delivered short, to-the-point messages. If and
when a participant had a longer, more involved comment or question, they would
generally use the microphone (Section 7.1). In this environment, with no visual
prompt to project uncertainty, participants would indicate that they were unsure of
their contribution to furthering the discussion by finishing a comment with a question
mark. This type of comment was not a direct question but a shorthand way of saying
"I think it might be this" and indicated the desire to have their understanding
confirmed or their misunderstanding addressed (Section 5.6.3).
The content was mostly presented as multiple-choice questions on the
virtual whiteboard. In this way participants were prompted with possible responses
and thus in a position to enter into the conversation. By breaking down barriers, this
question format fostered discussion about key concepts and
highlighted nuances in statistical language while minimising the possibility of
participants feeling uncomfortable about asking and answering questions. Scaffolding
the learning in this way assisted the students to be active participants in the learning
process (Bigge & Shermis, 1999).
In the early stages of community development social dialogue was
instigated by the teacher to "break the ice" (Section 5.1.2). This social dialogue was
generally between the teacher and the group but over time as the sense of community
developed social dialogue from one participant to another occurred (social presence).
Since these interactions were in text chat, they were of necessity short and generally
indications of support for one another. Initially, participants were reluctant to
offer opinions of a cognitive nature and would only offer answers to questions when
asked directly. However, over time and with encouragement and support from the
teacher and the inclusive nature of the multiple-choice questions on the virtual
whiteboard, they gained in confidence to respond to the teacher and the group. This
confidence developed even further to a point where participants supported other
participants directly with added explanations and assistance using the Chat window,
while the teacher was otherwise occupied using the virtual whiteboard (Section
5.6.3).
7.4 Perception of the value of the online tutorial
Studying at a distance can be an isolating experience, even though there are benefits
such as the independence to study where and when one wants (Section 2.2). Even
though adult learners are viewed as being capable of self-directed learning which
includes self-management, self-monitoring and motivation, this may not produce the
desired outcomes without support and guidance (Garrison, 2003). In addressing the
third aim of this study, the perceived value of the online synchronous tutorial can be
summarised as providing this support and guidance.
Participants expressed preconceptions of the tutorial providing the
opportunity to be part of a community of learners where they would feel less
isolated. By making the commitment to connect to the interactive learning platform
weekly, it was believed that motivation would be maintained and it would be easier
to "keep on track" and not fall behind (Section 4.1.5). Through shared responsibility,
participants believed that the online tutorial would bring with it differing viewpoints
and demonstrate different ways of thinking. The prospect of hearing a variety of
questions asked and answered was viewed as a benefit of participation – questions
that were of concern to them but asked by others as well as questions that they had
not even thought to ask. In this, an opportunity to build a relationship with the
teacher within a collaborative learning environment was exploited (Section 4.1.5). In
addition to this came the belief that the more content was discussed, the more
understanding would follow. These preconceptions persisted, and by the end of the
semester they remained contributors to the participants' perceived value of the online
tutorial (Section 4.2.1).
Participants felt that it was beneficial to see the solutions to problems
developed in "real time" and be able to ask questions for immediate feedback (the
teacher modelling the problem-solving process) and confirm their understanding
(Section 4.2.3). Being involved in the online tutorial brought life to their learning. In
the tutorial they could ascertain how they were progressing in their understanding
compared with others – the shared struggle. Hearing the teacher use the correct
terminology in context also contributed to greater understanding (Section 4.2.3).
Because of the high level of interaction and the immediacy of the
situation, the relationship with the teacher in the online synchronous tutorial was
perceived as one-on-one (Section 6.2). The teacher was seen as being an
important motivator, clarifying concepts and giving the big picture of how the topics
studied fitted together.
7.5 Teaching and learning statistical concepts online
Teaching and learning of statistical concepts involves the development of problem-
solving skills to apply these concepts appropriately in a variety of contexts. As a
social constructivist, Vygotsky advocated that social interaction was fundamental to
learning through shared problem-solving experiences (Section 2.1.1). The learner in
progressing from what is known to what needs to be known crosses what Vygotsky
called the Zone of Proximal Development (ZPD). The teacher's role in this process
was to define the size of the ZPD and thus support successful transition by
scaffolding the learning in a social environment of collaboration. In this social
environment learners interacted with others through testing and challenging their
understandings (Bates & Poole, 2003).
This research study applied social constructivist principles to online
synchronous tutorials to provide a common ground for communication to enhance
mutual support amongst a group of students. Integral to this was support provided by
the teacher (teaching presence) through provision of content in a form that
encouraged interaction, facilitation of the interaction and direct instruction related to
key concepts required to progress understanding. To this end, participants were given
opportunities to participate in "meaning making" in a collaborative non-threatening
learning environment where they were encouraged to ask questions, clarify concepts
taught, express ideas in their own words, and confirm understanding (Meyer & Land,
2005). Indicative of this type of support, participants were assured that no question
was a silly question (Section 5.5.3).
Once a relationship was established within the group and the participants felt
confident and supported by the positive attitude of the teacher – that they were
encouraged and valued for asking questions – they demonstrated their preparedness
to contribute to discussion (Sections 4.2.2 and 4.2.3). Participants were encouraged
by comments from the teacher that getting an incorrect answer could be seen as a
positive outcome as it prompted discussion of the finer nuances in the wording of a
question. This attitude encouraged participants to contribute even if they didn‘t feel
confident with the answer that they had formulated (Section 5.4.2).
The learning of statistics required engagement with the language of statistics.
To assist with this, the teacher modelled the problem-solving process using
appropriate terminology. As a consequence participants were encouraged to engage
with this process in solving problems collaboratively on the virtual whiteboard. By
the end of the semester this activity amongst participants occurred with little
prompting from the teacher. The fostering of this team approach led to an
environment where participants supported one another both emotionally and
cognitively to achieve greater understanding.
Personal circumstances impacted on the level of commitment individual
participants could make to the online tutorials. During the semester a number of
participants experienced circumstances which prevented them from attending all of
the tutorials. It appeared that those who could make the commitment to participate
most of the time were those who engaged the most, both socially and cognitively.
However, the direction of the relationship between commitment to participate and
level of engagement in class is unclear. It may be that the commitment to participate
over a sustained period of time built the trust that led to a sense of community which
resulted in a community of inquiry. On the other hand it could be that participants
who were more outgoing in nature were also more prepared to commit to the weekly
appointment. Regardless, the development of social presence early in the semester
created the group cohesion which built confidence and in conjunction with the
immediacy of the synchronous situation fostered the collaborative learning
environment for cognitive presence to develop (Section 2.4).
7.5.1 A model of online synchronous learning
Interaction in distance learning has been modelled for a combination of
asynchronous and synchronous communication (Figure 2.2) and asynchronous online
communication (Figure 2.3). In addressing the fourth aim of this study, formulating
theoretical constructs pertaining to the teaching and learning of statistical concepts in
this environment, a model of online synchronous learning is proposed. Any model
incorporating the contribution that an online synchronous tutorial makes to the
learning of statistics should be learning-centred rather than learner-centred, one
where the focus is on learning and interaction (Anderson, 2008). At the centre of the
online synchronous tutorial is a dynamic process, the act of learning (see Figure 7.1).
The overarching feature of this model of online synchronous learning is the
affordance of the interactive online learning platform which creates an environment
where the action can take place (Section 2.3.2). This environment in turn interacts
with the act of learning through the participants (teacher and students) and their
engagement with the content to be learned. Within this environment the key elements
(student, teacher and content) interact with one another and the technology to
produce the learning in "real time" by crossing the Zone of Proximal Development
(ZPD). This is enabled by the mode of communication chosen – a mixture of voice,
text chat, writing on the virtual whiteboard and emoticons/icons – which allows the
teacher to scaffold the learning. In addition to this, the Community of Inquiry (CoI)
which includes teaching presence, social presence and cognitive presence is the
vehicle by which the act of learning can occur.
Figure 7.1. Model of online synchronous learning
Viewing the model as layers in an interactive process, the act of learning is at
the core of the process. The act of learning consists of collaboration amongst the
participants interacting with the content to produce the desired learning outcomes.
The inner layer surrounding the act of learning represents the ZPD which according
to the social constructivist theory of Vygotsky is traversed to allow the students to
move from a position of current knowledge to the formation of new knowledge at a
higher level (Section 2.1.1). This requires two-way interaction amongst the teacher,
the students and the content. This interaction forms the boundary between the inner
and outer layers of the model.
To arrive at the act of learning by passing through the ZPD a number of
factors come into play: mode of communication, scaffolding and the elements of the
Community of Inquiry. This outer layer of the model represents the components that
combine together to produce the interaction. Scaffolding by the teacher is closely
related to the teaching presence of the CoI. This scaffolding includes the selection of
the content, the form in which the content is presented and the facilitation of
discussion amongst the participants by the teacher. The interplay amongst the
elements of the CoI (teaching presence and social presence) provides the supportive
environment for productive discussion (cognitive presence) to take place (Section
2.4). In other words, teaching presence provides the scaffolding through the design
and organisation of the content, direct instruction where necessary, and facilitation of
the collaboration and interaction with students and content which moves students
through the ZPD to the cognitive presence of the act of learning. Social presence
influenced by the mode of communication interacts with the other presences to
initiate and maintain this act of learning. The synchronicity provided by technology
through the mode of communication (voice, text chat, writing on the virtual
whiteboard and emoticons/icons) promotes the dynamic nature of the act of learning
where several modes of communication can be occurring simultaneously.
In the specific context of this study, namely teaching introductory statistics in
an online synchronous tutorial, the application of this model is exemplified. The
affordance of the synchronous technology allowed the teacher, students and content
to interact within a community of inquiry in "real time," providing the benefits of
immediacy of explanation and feedback (Section 2.3.2). Using several modes of
communication (voice, text chat, writing on a virtual whiteboard and
emoticons/icons), often occurring at the same time, added to the dynamics of the
learning experience. Organising content in the form of multiple-choice questions
on the virtual whiteboard, and then facilitating discussion around them, scaffolded
movement across the ZPD to the act of learning.
An integral part of this act of learning in relation to quantitative disciplines
such as statistics, is the notion of problem-solving. The act of pulling the problem
apart, finding the keywords, linking these words to key concepts and then combining
those concepts in a way that leads to a solution lends itself well to
collaborative interaction in a community of inquiry. The immediacy of the online
synchronous environment means that the focus can be on action. This act of learning
is not only the construction of knowledge but also the development of the
problem-solving skills needed to apply that knowledge.
7.6 Future directions
While this research study into the "how" of the support provided by an online
synchronous tutorial to distance students' learning of statistical concepts lacks
generalisability, due to the specific circumstances surrounding the "case" studied, it
nonetheless offers some insights for practice. It also raises the question: would
this approach provide similar outcomes in other quantitative courses, and at levels
beyond the introductory?
The role of the tutor/teacher in this environment needs to be investigated in
more depth. Some participants in this study suggested the use of breakout rooms
where students in the online tutorial could break into smaller groups, discuss specific
content and then feed back to the whole group. This would take more time (for
people who already have limited time available) and would probably need to be quite
targeted. The organisational role of the tutor/teacher in such a scenario could provide
further insights into the contribution that synchronous communication makes to the
learning of quantitative content. How to optimise the impact of this type of
interaction needs further examination.
It could be argued that the level of support offered by an online synchronous
tutorial is unsustainable in the current economic climate within universities.
Nevertheless, could this type of support succeed with a larger group of students?
Could lessons learned from this "case" be applied to the way asynchronous discussion
forums are operationalised? While asynchronous discussion forums lack immediacy, it
would be useful to investigate the impact of differing response times to different
types of questions posed by distance students in these forums and how this may
affect their motivation, engagement and satisfaction with a course of study.
With rapid advances in communication technologies, the humanisation of
technology offers numerous avenues for further research, with online synchronous
interaction central to them.
7.7 Postscript
Distance students often feel isolated and overwhelmed. Because most are working
full-time and have family commitments, they find it difficult to find the right balance
between these and the requirements of their studies. When they are studying
unfamiliar content that they may not initially perceive as relevant to their program of
study, such as introductory statistics, it is not easy for them to stay motivated and "on
track." Course resources in the form of static content, such as text supported by
multimedia presentations, provide the core study materials. These are not always
satisfactory, as they give students no opportunity to ask questions or to verbalise
their understanding and elicit confirmation. Asynchronous discussion forums
can fill this need, but they do not provide the immediacy needed to move
students quickly past "stuck places" (Meyer & Land, 2005). In my experience,
quantitative content, such as introductory statistics, has a number of potential "stuck
places." Providing distance students with an opportunity to be involved in an online
synchronous tutorial, where they can interact with the content supported by a teacher
and fellow students in "real time," could provide the impetus to keep them engaged
with the content and the university. In my experience of this online synchronous
tutorial, the sense of community took time to develop, and the impact of the weekly
online tutorials was not fully realised until the last tutorial of the semester. It became
obvious in the revision tutorial that, once the pressure of learning new content was
removed and confidence had grown, participants entered enthusiastically into the
spirit of collaborative learning, with much less facilitation needed from me
(Section 5.9.3).
In recent years I have seen an escalation of the massification of
education and, with it, the potential loss of purposeful and personal engagement with
our distance students. Given our ever-busier lives, I understand why
distance students struggle to maintain the motivation to keep up with the requirements
of their programs of study. The affordances of technology can dehumanise the
education of our distance students by merely providing limitless and sophisticated
resources for them to use, or they can allow us to forge connections and produce
interactive and collaborative learning environments. While we have students who are
prepared to set aside one hour a week to participate in an online synchronous
tutorial, I believe that we should take the opportunity to enrich their learning
experience.
References
Akyol, Z., Arbaugh, J. B., Cleveland-Innes, M., Garrison, D. R., Ice, P., Richardson,
J. C., & Swan, K. (2009). A response to the review of the community of
inquiry framework. Journal of Distance Education, 23(2), 123-136.
Akyol, Z., & Garrison, D. R. (2008). The development of a community of inquiry
over time in an online course: Understanding the progression and
integration of social, cognitive and teaching presence. Journal of
Asynchronous Learning Networks, 12, 3-22.
Akyol, Z., & Garrison, D. R. (2010). Understanding cognitive presence in an online
and blended community of inquiry: Assessing outcomes and processes for
deep approaches to learning. British Journal of Educational Technology.
doi: 10.1111/j.1467-8535.2009.01029.x
Akyol, Z., Garrison, D. R., & Ozden, M. Y. (2009). Development of a community of
inquiry in online and blended learning contexts. Procedia - Social and
Behavioral Sciences, 1, 1834-1838.
Anderson, T. (2003a). Modes of interaction in distance education: Recent
developments and research questions. In M. G. Moore & W. G. Anderson
(Eds.), Handbook of distance education (pp. 129-144). London: Lawrence
Erlbaum.
Anderson, T. (2003b). Getting the mix right again: An updated and theoretical
rationale for interaction. International Review of Research in Open and
Distance Learning, 4(2). Retrieved from
http://www.irrodl.org/index.php/irrodl/article/view/149/230
Anderson, T. (2008). Towards a theory of online learning. In T. Anderson (Ed.), The
theory and practice of online learning (2nd ed., pp. 45-74). Athabasca, AB:
Athabasca University Press. Retrieved from
http://www.aupress.ca/index.php/books/120146
Anderson, T., & Dron, J. (2011). Three generations of distance education pedagogy.
International Review of Research in Open and Distance Learning, 12(3), 80-
97.
Anderson, T., & Kuskis, A. (2007). Modes of interaction. In M. G. Moore (Ed.),
Handbook of distance education (2nd ed., pp. 295-309). Mahwah, NJ:
Lawrence Erlbaum.
Anderson, T., Rourke, L., Garrison, D. R., & Archer, W. (2001). Assessing teaching
presence in a computer conferencing context. Journal of Asynchronous
Learning Networks, 5(2), 1-17.
Arbaugh, J. B., Cleveland-Innes, M., Diaz, S. R., Garrison, D. R., Ice, P.,
Richardson, J. C., et al. (2008). Developing a community of inquiry
instrument: Testing a measure of the community of inquiry framework using
a multi-institutional sample. Internet and Higher Education, 11(3-4), 133-
136.
Arbaugh, J. B., & Hwang, A. (2006). Does "teaching presence" exist in online MBA
courses? Internet and Higher Education, 9(1), 9-21.
Arnold, N., & Ducate, L. (2006). Future foreign language teachers' social and
cognitive collaboration in an online environment. Language Learning &
Technology, 10(1), 42-66.
Bates, A. W. (2005). Technology, e-learning and distance education (2nd ed.). New
York: Routledge.
Bates, A. W., & Poole, G. (2003). Effective teaching with technology in higher
education. San Francisco, CA: Jossey-Bass.
Baxter, P., & Jack, S. (2008). Qualitative case study methodology: Study design and
implementation for novice researchers. The Qualitative Report, 13(4), 544-
559.
Bigge, M. L., & Shermis, S. S. (1999). Learning theories for teachers (6th ed.). New
York: Longman.
Brannon, R. F., & Essex, C. (2001). Synchronous and asynchronous communication
tools in distance education. TechTrends, 45(1), 36-42.
Burnett, C. (2003). Learning to chat: Tutor participation in synchronous online chat.
Teaching in Higher Education, 8(2), 247-261.
Burr, L., & Spennemann, D. H. R. (2004). Patterns of user behavior in university
online forums. International Journal of Instructional Technology and
Distance Learning, 1(10), 11-18.
Cox, G., Carr, T., & Hall, M. (2004). Evaluating the use of synchronous
communication in two blended courses. Journal of Computer Assisted
Learning, 20, 183-193.
Creswell, J. W. (2003). Research design: Qualitative, quantitative, and mixed
methods approaches (2nd ed.). Thousand Oaks, CA: Sage.
Cross-Durant, A. (2001). John Dewey and lifelong education. In P. Jarvis (Ed.),
Twentieth century thinkers in adult & continuing education (2nd ed.).
London: Kogan Page.
Dennen, V. (2008). Pedagogical lurking: Student engagement in non-posting
discussion behavior. Computers in Human Behavior, 24, 1624-1633.
De Wever, B., Schellens, T., Valcke, M., & Van Keer, H. (2006). Content analysis
schemes to analyze transcripts of online asynchronous discussion groups: A
review. Computers & Education, 46(1), 6-28.
Dewey, J. (1933). How we think (rev. ed.). Boston, MA: D.C. Heath.
Dewey, J. (1940/1969). Education today. New York: Greenwood Press.
Everson, M. G., & Garfield, J. (2008). An innovative approach to teaching online
statistics courses. Technology Innovations in Statistics Education, 2(1).
Retrieved from http://escholarship.org/uc/item/2v6124xr
Freiermuth, M. R. (2002). Online chatting: An alternative approach to simulations.
Simulation & Gaming, 33(2), 187-195.
Galligan, L., Hobohm, C., & Loch, B. (2012). Tablet technology to facilitate
improved interaction and communication with students studying
mathematics at a distance. Journal of Computers in Mathematics and
Science Teaching, 31(4), 363-385.
Garrison, D. R. (2000). Theoretical challenges for distance education in the 21st
century: A shift from structural to transactional issues. International Review
of Research in Open and Distance Learning, 1(1). Retrieved from
http://www.irrodl.org/index.php/irrodl/article/view/2
Garrison, D. R. (2003). Self-directed learning and distance education. In M. G.
Moore & W. G. Anderson (Eds.), Handbook of distance education (pp.
161-168). London: Lawrence Erlbaum.
Garrison, D. R. (2007). Online community of inquiry review: Social, cognitive, and
teaching presence issues. Journal of Asynchronous Learning Networks, 11(1),
61-72.
Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based
environment: Computer conferencing in higher education. Internet and
Higher Education, 2(2-3), 87-105.
Garrison, D. R., Anderson, T., & Archer, W. (2010). The first decade of the
community of inquiry framework: A retrospective. Internet and Higher
Education, 13(1-2), 5-9.
Garrison, D. R., & Arbaugh, J. B. (2007). Researching the community of inquiry
framework: Review, issues, and future directions. Internet and Higher
Education, 10(3), 157-172.
Garrison, D. R., & Archer, W. (2007). A theory of community of inquiry. In M. G.
Moore (Ed.), Handbook of distance education (2nd ed., pp. 77-88).
Mahwah, NJ: Lawrence Erlbaum.
Garrison, D. R., & Cleveland-Innes, M. (2005). Facilitating cognitive presence in
online learning: Interaction is not enough. American Journal of Distance
Education, 19(3), 133-148.
Garrison, D. R., Cleveland-Innes, M., Koole, M., & Kappelman, J. (2006). Revisiting
methodological issues in transcript analysis: Negotiated coding and
reliability. Internet and Higher Education, 9(1), 1-8.
Gilbert, P. K., & Dabbagh, N. (2005). How to structure online discussions for
meaningful discourse: a case study. British Journal of Educational
Technology, 36(1), 5-18.
Gillham, B. (2000). Case study research methods. London: Continuum.
Gorsky, P., & Caspi, A. (2005). A critical analysis of transactional distance theory.
Quarterly Review of Distance Education, 6(1), 1-11.
Guba, E. G., & Lincoln, Y. S. (1989). Fourth generation evaluation. Newbury Park,
CA: Sage.
Gunawardena, C. N., Lowe, C. A., & Anderson, T. (1997). Analysis of a global
online debate and development of an interaction analysis model for
examining social construction of knowledge in computer conferencing.
Journal of Educational Computing Research, 17(4), 397-431.
Gunawardena, C. N., & McIssac, M. S. (2004). Distance Education. In D. H.
Jonassen (Ed.), Handbook of research on educational communications and
technology (2nd ed., pp. 355-395). Mahwah, NJ: Lawrence Erlbaum.
Guri-Rosenblit, S. (2009). Distance education in the digital age: Common
misconceptions and challenging tasks. Journal of Distance Education,
23(2), 105-122.
Hara, N., Bonk, C. J., & Angeli, C. (2000). Content analysis of online discussion in
an applied educational psychology course. Instructional Science, 28, 115-
152.
Henri, F. (1992). Computer conferencing and content analysis. In A. R. Kaye (Ed.),
Collaborative learning through computer conferencing: The Najaden
papers (pp. 117-136). Berlin: Springer-Verlag.
Holmberg, B. (2007). A theory of teaching-learning conversations. In M. G. Moore
(Ed.), Handbook of distance education (2nd ed., pp. 69-75). Mahwah, NJ:
Lawrence Erlbaum.
Hrastinski, S. (2006). Introducing an informal synchronous medium in a distance
learning course: How is participation affected? Internet and Higher
Education, 9(2), 117-131.
Hrastinski, S., Keller, C., & Carlsson, S. A. (2010). Design exemplars for
synchronous e-learning: A design theory approach. Computers & Education,
55(2), 652-662.
Hung, D. W. L., & Chen, D.-T. (2001). Situated cognition, Vygotskian thought and
learning from the communities of practice perspective: Implications for the
design of web-based e-learning. Educational Media International, 38(1), 3-
12.
Hwang, A., & Arbaugh, J. B. (2006). Virtual and traditional feedback-seeking
behaviors: Underlying competitive attitudes and consequent grade
performance. Decision Sciences Journal of Innovative Education, 4(1), 1-
28.
Kear, K., Chetwynd, F., Williams, J., & Donelan, H. (2012). Web conferencing for
synchronous online tutorials: Perspectives of tutors using a new medium.
Computers & Education, 58, 953-963. doi: 10.1016/j.compedu.2011.10.015
Knowles, M. S. (1950). Informal adult education. Chicago, IL: Association Press.
Knowles, M. S. (1978). The adult learner: A neglected species (2nd ed.). Houston,
TX: Gulf Publishing.
Knowles, M. S. (1990). The adult learner: A neglected species (4th ed.). Houston,
TX: Gulf Publishing.
Knowles, M. S., Holton III, E. F., & Swanson, R. A. (2005). The adult learner: The
definitive classic in adult education and human resource development (6th
ed.). Burlington, MA: Elsevier.
Kreiner, D. S. (2006). A mastery-based approach to teaching statistics online.
International Journal of Instructional Media, 33, 73-79.
Larochelle, M., Bednarz, N., & Garrison, J. (Eds.). (1998). Constructivism and
education. Cambridge, UK: Cambridge University Press.
Liamputtong, P., & Ezzy, D. (2006). Qualitative research methods (2nd ed.).
Melbourne: Oxford University Press.
Lincoln, Y. S., & Guba, E. G. (1985). Naturalistic inquiry. Newbury Park, CA: Sage.
Loch, B., & McDonald, C. (2007). Synchronous chat and electronic ink for distance
support in mathematics. Innovate - Journal of Online Education, 3(3).
Retrieved from http://innovateonline.info/
Mason, R. (2003). On-line learning and supporting students: New possibilities. In A.
Tait & R. Mills (Eds.), Rethinking learner support in distance education:
Change and continuity in an international context (pp. 90-101). London:
RoutledgeFalmer.
Maurino, P. S. M. (2007). Online asynchronous threaded discussions: Good enough
to advance students through the proximal zone of Activity Theory?
TechTrends: Linking Research & Practice to Improve Learning, 51(2), 46-
49.
Merriam, S. B. (1988). Case study research in education: A qualitative approach.
San Francisco, CA: Jossey-Bass.
Merriam, S. B. (2001). Andragogy and self-directed learning: Pillars of adult
learning theory. New Directions for Adult and Continuing Education, 89, 3-
14.
Merriam, S. B. (2004). The role of cognitive development in Mezirow's
transformational learning theory. Adult Education Quarterly, 55(1), 60-68.
Meyer, K. A. (2004). Evaluating online discussions: Four different frames of
analysis. Journal of Asynchronous Learning Networks, 8(2), 101-114.
Meyer, J. H. F., & Land, R. (2005). Threshold concepts and troublesome knowledge
(2): Epistemological considerations and a conceptual framework for
teaching and learning. Higher Education, 49(3), 373-388.
Mezirow, J. (2003). Transformative learning as discourse. Journal of Transformative
Education, 1(1), 58-63.
Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded
source book (2nd ed.). Thousand Oaks, CA: Sage.
Mills, J. D., & Raju, D. (2011). Teaching statistics online: A decade's review of the
literature about what works. Journal of Statistics Education, 19(2).
Retrieved from http://www.amstat.org/publications/jse/v19n2/mills.pdf
Mishler, E. G. (1991). Research interviewing: Context and narrative. Cambridge,
MA: Harvard University Press.
Moore, M. G. (1989). Three types of interaction. American Journal of Distance
Education, 3(2), 1-6.
Moore, M. G. (2007). The theory of transactional distance. In M. G. Moore (Ed.),
Handbook of distance education (2nd ed., pp. 89-105). Mahwah, NJ:
Lawrence Erlbaum.
Moore, M. G., & Kearsley, G. (2005). Distance education: A systems view (2nd ed.).
Belmont, CA: Thomson Wadsworth.
Myers, S. C., Bishop, D., Rajaman, S. S., & Kelly, J. (2004). Virtual office hours:
Tutoring distance students in statistics and economics. Paper presented at
the OCDE Convergence of Libraries, Learning and Technology conference,
Columbus, OH, March 2004.
Naidu, S., & Jarvela, S. (2006). Analyzing CMC content for what? Computers &
Education, 46(1), 96-103.
Peters, O. (2007). The most industrialized form of education. In M. G. Moore (Ed.),
Handbook of distance education (2nd ed., pp. 57-68). Mahwah, NJ:
Lawrence Erlbaum.
Piaget, J. (1969/1971). Science of education and the psychology of the child (D.
Coltman, Trans.). London: Longman.
Pogson, P., & Tennant, M. (1995). Understanding adults. In G. Foley (Ed.),
Understanding adult education and training (pp. 20-30). St Leonards,
Australia: Allen & Unwin.
Romiszowski, A., & Mason, R. (2004). Computer-mediated communication. In D. H.
Jonassen (Ed.), Handbook of research on educational communications and
technology (2nd ed., pp. 397-431). Mahwah, NJ: Lawrence Erlbaum.
Rourke, L., & Anderson, T. (2004). Validity in quantitative content analysis.
Educational Technology Research & Development, 52(1), 5-18.
Rourke, L., & Kanuka, H. (2009). Learning in communities: A review of the
literature. Journal of Distance Education, 23(1), 19-48.
Rubin, A. (2007). Much has changed; little has changed: Revisiting the role of
technology in statistics education 1992-2007. Technology Innovations in
Statistics Education, 1(1). Retrieved from
http://escholarship.org/uc/item/833239sw
Shea, P., & Bidjerano, T. (2009). Community of inquiry as a theoretical framework
to foster "epistemic engagement" and "cognitive presence" in online
education. Computers & Education, 52(3), 543-553.
Shea, P., Li, C. S., & Pickett, A. (2006). A study of teaching presence and student
sense of learning community in fully online and web-enhanced college
courses. Internet and Higher Education, 9(3), 175-190.
Silverman, D. (2006). Interpreting qualitative data (3rd ed.). London: Sage.
Simonsen, L., & Banfield, J. (2006). Fostering mathematical discourse in online
asynchronous discussions: An analysis of instructor interventions. Journal
of Computers in Mathematics and Science Teaching, 25(1), 41-75.
Skinner, B. F. (1976). About behaviorism. New York: Random House.
Smith, G. G., & Ferguson, D. (2005). Student attrition in mathematics e-learning:
Growing pains of a new generation. Australasian Journal of Educational
Technology, 21(3), 323-334.
Smith, W. A. (2001). E L Thorndike. In P. Jarvis (Ed.), Twentieth century thinkers in
adult & continuing education (2nd ed., pp. 77-93). London: Kogan Page.
Stake, R. E. (1995). The art of case study research. Thousand Oaks, CA: Sage.
Stein, D. S., Wanstreet, C. E., Glazer, H. R., Engle, C. L., Harris, R. A., Johnston, S.
M., et al. (2007). Creating shared understanding through chats in a
community of inquiry. Internet and Higher Education, 10(2), 103-115.
Suanpang, P., Petocz, P., & Kalceff, W. (2004). Student attitudes to learning business
statistics: Comparison of online and traditional methods. Educational
Technology & Society, 7(3), 9-20.
Taylor, E. W. (2008). Transformative learning theory. New Directions for Adult and
Continuing Education, 119, 5-15.
Tellis, W. (1997). Introduction to case study. The Qualitative Report, 3(2). Retrieved
from http://www.nova.edu/ssss/QR/QR3-2/tellis1.html
Tishkovskaya, S., & Lancaster, G. A. (2012). Statistical education in the 21st century:
A review of challenges, teaching innovations and strategies for reform.
Journal of Statistics Education, 20(2), 1-55.
Thorndike, E. L. (1931/1968). Human learning. Cambridge, MA: The M.I.T. Press.
Twomey, P. (2009). Summit on the global agenda: On technology and innovation at
the world economic forum (2009). Retrieved from
http://www.weforum.org/pdf/globalagenda.pdf
Utts, J., Sommer, B., Acredolo, C., Maher, M. W., & Matthews, H. R. (2003). A
study comparing traditional and hybrid internet-based instruction in introductory
statistics classes. Journal of Statistics Education, 11(3). Retrieved from
http://www.amstat.org/publications/jse/v11n3/utts.html
Vaughan, N., & Garrison, D. R. (2005). Creating cognitive presence in a blended
faculty development community. Internet and Higher Education, 8(1), 1-12.
Wagner, E. D. (1994). In support of a functional definition of interaction. American
Journal of Distance Education, 8(2), 6-26.
Yin, R. K. (2009). Case study research: Design and methods (4th ed.). Thousand
Oaks, CA: Sage.
Appendices
Appendix A: Initial survey questions
Appendix B: Weekly research questions
Appendix C: Final interview questions
Appendix A
Initial survey questions:
1. Write a few words expressing how you feel about studying Data Analysis.
2. How do you think your past experiences in mathematics might affect your
learning in this subject?
3. Do you use the discussion group facility in Moodle? If so, in what courses
have you used it?
4. Have you participated in communicating with tutors and/or students online in
any other courses (e.g. MSN Messenger, Skype, Wimba, etc.), where all of
you have been connected to the internet at the same time? If so, which
courses, who was involved (a group of students only, a group of students and
a tutor, or just you and a tutor), what software was used and when did this
happen?
5. What made you decide to participate in the online tutorial in Data Analysis?
Appendix B
Weekly research questions (tutorial in which question was asked):
1. The content covered last week was (all tutorials)
a. too easy
b. nothing new
c. nothing new but helpful
d. thought provoking
e. confusing
f. just right
g. too hard
2. What was the most important thing you got out of last week's tutorial? (all
tutorials)
3. How do you feel about the online tutorial environment? (Tutorial 2)
4. How are you feeling about Data Analysis at this point in time? (Tutorials 3,
6, 7, 8, R)
a. still worried
b. getting on top of it
c. struggling
d. enjoying it
e. hating it
f. all good
5. Which topic have you found most challenging so far? (Tutorials 4, 7, 8)
a. Types of variables and graphs
b. Contingency tables and conditional distributions
c. Normal models
d. Experiments, observational studies and sampling
e. Regression and correlation
f. Binomial models and probability
g. Sampling distributions
h. Hypothesis testing and confidence intervals
i. Nothing has been particularly challenging so far
6. Why did you choose to participate in the online tute? (Tutorial R)
7. In what way did the online tute contribute to your learning of Data Analysis?
(R)
Appendix C
Final interview key questions and prompts:
A. Demographic information
1. What is your approximate age? (a five-year range was requested; however,
all participants gave an exact age in years).
2. What is your program of study?
3. How many courses have you completed prior to this semester?
4. How far along are you in your program of study?
5. What is your employment/study status? (full-time, part-time, etc.)
6. What sort of family commitments do you have that may impact your
studies?
B. Questions related to research question and aims
1. Did you have any online experiences with your previous courses and what
sort of experiences were they? Discussion forums? Wimba?
2. Would you like to have a Wimba class in any future courses that you
undertake?
3. Why did you decide to participate in this study?
4. Which medium did you prefer for communications, text chat or
microphone? What were the advantages/disadvantages of this medium?
Using icons?
5. How did you feel about communicating in the online environment,
Wimba?
6. Did you interact with any other students in the research group outside the
online tutorial? If you didn't, would you have liked to?
7. Did you use the archives of the online sessions? If so, how did you use
them?
8. How do you think the other participants contributed to the process and
your experience of the online tutorial?
9. Do you think that the tutorial discussion could have proceeded without
the tutor being present? In what way?
10. How did the online tutorial help you with your learning of Data Analysis?
11. Do you have anything to add?
The final interviews were semi-structured and conducted in a conversational style to
maintain rapport between the researcher and the participant. As such, questions were
not always asked in the order given above. It should be noted that participants were
initially reminded that they could pass on a question at any stage in the interview.