First Steps in Understanding Engineering Students' Growth of Conceptual and Procedural Knowledge in an Interactive Learning Context
January 2007 Journal of Engineering Education 57
ROMAN TARABAN
Department of Psychology, Texas Tech University
EDWARD E. ANDERSON
Department of Mechanical Engineering, Texas Tech University
ALLI DEFINIS
Department of Psychology, Texas Tech University
ASHLEE G. BROWN
Department of Psychology, Texas Tech University
ARNE WEIGOLD
Department of Psychology, Texas Tech University
M.P. SHARMA
Department of Chemical and Petroleum Engineering, University of Wyoming
ABSTRACT
The development of procedural knowledge in students, i.e., the ability to effectively solve domain problems, is the goal of many instructional initiatives in engineering education. The present study examined learning in a rich learning environment in which students read text, listened to narrations, interacted with simulations, and solved problems using instructional software for thermodynamics. Twenty-three engineering and science majors who had not taken a thermodynamics course provided verbal protocol data as they used this software. The data were analyzed for cognitive processes. There were three major findings: (1) students expressed significantly more cognitive activity on computer screens requiring interaction compared to text-based screens; (2) there were striking individual differences in the extent to which students employed the materials; and (3) verbalizations revealed that students applied predominantly lower-level cognitive processes when engaging these materials, and they failed to connect the conceptual and procedural knowledge in ways that would lead to deeper understanding. The results provide a baseline for additional studies of more advanced students in order to gain insight into how students develop skill in engineering.
Keywords: cognitive processing, instructional software, skill
development
I. INTRODUCTION
A. Cognitive Influence on Engineering Education Research
Some recent initiatives in engineering education research have
adopted a cognitive framework for designing and implementing
studies of student learning behaviors and outcomes [1]. This is an
encouraging development because it allows engineering education
reform to benefit from the basic research on cognition and learning
that has been going on since the 1970s. According to cognitive the-
ories, learning results in changes in mental representations and
processes and depends critically on learners’ prior knowledge and
their ability to effectively synthesize and store what they gained
from problem-solving episodes [2]. Indeed, curriculum reform ef-
forts affirm the centrality of students’ experiences and attempt to
identify those experiences that maximize student gains within the
available time for learning.
As one significant manifestation of the cognitive approach,
there are several ongoing efforts to advance classroom practice and
outcomes by identifying and understanding misconceptions held by
engineering students regarding basic engineering concepts, like rate and energy [3]. One way this research is being pursued is through
the development and implementation of concept inventories [4, 5].
In recent work, Streveler et al. [6] have expanded this effort and
have proposed to identify misconceptions that occur across multiple
curricular areas and to repair these misconceptions through remedi-
ation involving changes in mental schemas.
Misconceptions are part of a learner’s prior knowledge—more
technically, declarative knowledge—in a specific knowledge domain.
In cognitive theories of skilled problem solving [2, 7], acquiring and
using declarative knowledge is crucial for effective performance.
However, problem solving is largely procedural, i.e., action-oriented
knowledge, and draws on a distinct form of memory that stores pro-
cedural knowledge [2].
The development of procedural knowledge in students, i.e., the
ability to effectively solve domain problems, is the goal of many in-
structional initiatives in engineering education. Gray et al. [8] out-
line five steps for solving equilibrium and kinetics problems that
they assert should be followed “without exception.” These steps are guided largely by the problem statement and include (1) stating
what needs to be found, (2) considering the problem assumptions,
(3) finding the equations needed for solution, (4) computing the so-
lution, and (5) verifying the solution. Litzinger et al. [7] also stress
the structured nature of setting up, solving, and checking the equa-
tions involved in problem solving. Importantly, their Integrated
Problem Solving Model explicitly combines declarative knowledge
with procedural knowledge by treating the activation and applica-
tion of prior knowledge as a definite step at each stage of problem
solving. Other research, like the work of Zywno and Stewart [9],
has incorporated additional cognitive factors. They investigated
computer-based learning using measures of cognitive complexity
based on Bloom’s taxonomy [10], and learning styles based on the
Felder-Solomon Index of Learning Styles [11].
B. Theory Behind the Practice
Structured approaches to problem solving, like the Integrated
Problem Solving Model [7], are well-aligned with cognitive theo-
ries of the development of expertise at the postsecondary level and
beyond [12, 13, 14, 15]. These theories assert that skill is the result
of declarative knowledge—facts associated with the domain—
being integrated and transformed into procedures in a continuous
process of refinement over time, as the result of deliberate practice
on the part of the learner [14]. It needs to be emphasized that from
a cognitive theoretical perspective, one’s skill is not fully character-
ized by problem solving performance in one’s area of training, but
also by how one’s conceptual knowledge expresses itself in problem
solving, and ultimately by one’s grasp of the deep principles govern-
ing the domain [16, 17].
Some of the most compelling research on the development of
skilled performance has involved comparisons between novice and
expert problem solvers. This research has shown how novice prob-
lem solvers are guided by superficial problem features (e.g., involving an inclined plane), whereas experts draw on theoretical principles
(e.g., conservation of energy) when asked to sort problems into open-
ended categories [18]. Other findings have identified characteristic
differences in constructing mental representations and solutions for
problems, with experts constructing a representation of the problem
as they read through it and reasoning forward to the desired quantity
(e.g., velocity v). Conversely, novices reason backward from the de-
sired quantity, trying to solve for the desired variable value (e.g., v)
by patching together knowledge of equations [19; see also, 20].
This research has been successful in identifying critical differences
between experts and novices in terms of conceptual knowledge and
procedural skill by examining differences in how individuals ap-
proach and solve discrete problems. It is less clear how students de-
velop from novices to experts through engagement with the learn-
ing resources in their training programs.
C. Rationale for the Present Study
This study builds on prior research examining students' use of
learning resources in thermodynamics courses [21, 22] that showed
that students devoted the majority of their study time to developing
problem-solving skill and less time to reading their textbooks, sug-
gesting a bias for developing procedural skill and less inclination to-
ward increasing conceptual knowledge. Other research has suggest-
ed that students strive to develop conceptual knowledge, but do so
at lower cognitive levels. Zywno and Stewart [9] studied the learn-
ing effects of a computer-based module on the topic of control sys-
tems. Comparing pre-test to post-test results, they found greater
gains at lower cognitive levels of Bloom’s taxonomy [10] (Level 2:
Comprehension; Level 3: Application) rather than at higher levels
(Level 4: Analysis; Level 6: Evaluation). Streveler et al. [6] used a
Delphi method with experienced engineering faculty in order to
identify concepts in engineering mechanics and electric circuits that
were difficult and poorly understood by students. The researchers
found that faculty underestimated the difficulty that students faced
in understanding many of the concepts. This research and related
work by Miller and colleagues [5, 23] shows that academically suc-
cessful engineering students often lack deep understanding of the
concepts and principles that underlie their areas of training.
The present study is concerned with the early stages of learning
in an engineering area. The analyses address what students know in
terms of definitions, facts, and concepts (i.e., declarative knowl-
edge), how they use that knowledge, and how they solve problems
(i.e., procedural knowledge). The data were collected while students
were learning. A coding system was developed for capturing what
students know and do. These were the questions that were ad-
dressed:
● What cognitive processes do students use when compre-
hending text and solving problems in thermodynamics?
● Are there differences in the cognitive processes that students
apply in different learning contexts, specifically, those involv-
ing text, those involving student interactions, and those re-
quiring problem solving?
● Do students’ cognitive processes help to distinguish between
relatively good and weak learners?
● Are some learning contexts relatively more evocative of higher-
order cognitive processes?
II. CASE STUDY
The study examined learning in a rich visual, auditory, and print
environment in which students read text, listened to narrations, in-
teracted with simulations, and solved problems using instructional
software. This software implemented active-learning methods and
exploited state-of-the-art technology and authoring tools for learn-
ing. It was important to explore learning in this context because it
represented a major trend in contemporary teaching using technol-
ogy to provide students with engaging and evocative learning mate-
rials and aids. Critically, these kinds of learning materials were con-
sistent with theories of skill development, which demand that
students be provided with relevant factual knowledge and the
means to transform that knowledge into skill through applications
to problems. These materials differed from traditional lecture and
textbook learning resources in their ability to engage students’ sens-
es and cognitive faculties more fully and provide immediate and
constructive feedback to student inputs.
The goals of the study were to apply a method for describing
and understanding the cognitive processing of students as they in-
teracted with introductory engineering learning materials and to
provide exploratory data and analyses that yielded insights into
the general nature of these students’ cognitive processing, as well
as the capacity of this method to characterize individual differ-
ences between students. The method, known as verbal protocol analysis, is a data-collection method used in human factors usabil-
ity studies and in psychological studies, primarily in the fields of
expert/novice research and text comprehension research [24, 25].
Verbal protocols are open-ended think-aloud reports, through
which participants are asked to verbalize what they are thinking as
they complete a task, without attempting to interpret or summa-
rize the materials for the experimenter, unless those interpreta-
tions or summaries are a natural part of their thought processes.
Atman and Bursic [26] showed how the method could be used to
document design processes applied by engineering students, and
how to use those detailed descriptions in order to evaluate the
quality of students’ solutions, to look for growth in students’ per-
formance over time, and to evaluate whether engineering curricula
were meeting their stated objectives. As another example,
Litzinger et al. [7] collected verbal protocol data from engineer-
ing majors in order to confirm elements of the Integrated Problem
Solving Model.
A. Participants, Materials, and Procedure
Twenty-five undergraduate students at Texas Tech University
were recruited through General Psychology classes. Science and en-
gineering students often enroll in General Psychology to fulfill a
general education requirement, and they participate in experiments
to earn extra credit in the course. All volunteers for this study were
science or engineering majors who had not taken an introductory
thermodynamics course. Two participants were eliminated due to
low audibility in the tape recordings that provided the main data for
this study. Basic demographic data for the remaining participants
were as follows. Eighteen participants were male and five were female. Their mean age was 19.61 years (standard deviation, SD = 1.88), the mean number of self-reported completed college credits was 37.65 (SD = 30.11), and self-reported science and engineering credits was 10.26 (SD = 11.04). These participants were considered
appropriate for this study because they had academic backgrounds
typical of students who would be required to learn the concepts of
introductory thermodynamics at an early point in their academic
training.
The materials used in this study were computer-based instruc-
tional supplements authored by E.E. Anderson for the textbook
Thermodynamics: An Engineering Approach, 4th Edition [27]. The
computer screens present students with text content, tables, figures,
and graphs. They also include active-learning screens with interac-
tive exercises, graphical modeling, physical world simulations, ex-
ploration, and quiz screens [cf., 28]. Each content screen includes a
voice narration related to the subject matter on the screen. The ma-
terials were reviewed by engineering faculty during development for
the textbook and after publication, and data were collected from
students on usability and comparisons to other media [29]. Figure 1
provides an example of an interactive exercise screen and Figure 2
provides an example of a quiz screen.
Chapter 1—Introduction to Thermodynamics—and Chapter 2—
Thermodynamic Properties—were selected for this study, in order
that the participants, who had not had the thermodynamics
course, would find the material comprehensible. Roughly 43 per-
cent of the total number of screens in these chapters contained an
interaction of some sort. Due to the relatively high proportion of
screens requiring student interaction, these materials provided
many opportunities for active learning and problem solving, which
are considered essential to the development of conceptual knowl-
edge and procedural skill.
Figure 1. Example of an interactive screen.
Participants took part in the experiment through individual
meetings in a quiet room with the experimenter. They were given
detailed instructions for the verbal protocol (“think aloud”) task and
were (falsely) informed that they would take a short test on the cov-
ered material after working through it, in order to ensure that they
applied themselves to learning the material as if studying for an
exam. They then proceeded to complete the think-aloud task,
which took between 30 and 60 minutes. Participants worked with
one half of one chapter. Eleven participants completed the first half
of Chapter 1 and twelve completed the first half of Chapter 2. The
data were tape-recorded for later transcription, with the permission
of participants. During data collection, the primary role of the ex-
perimenter was to prompt participants regularly to continue to ver-
balize their thoughts.
B. Results
The twenty-three protocols were transcribed by two of the exper-
imenters and an assistant. The initial phase of the analysis consisted
of establishing a rubric for parsing participants’ utterances into seg-
ments for coding and then developing the codes in Table 1. The
convention adopted for parsing, i.e., for segmenting the protocols,
was to code idea segments, which were often indicated by noticeable
pauses in a participant’s speech pattern. The parsed segments were
typically clauses or sentences. The second task in analyzing the pro-
tocols was to develop a set of codes. This was accomplished through
individual coding and group discussion among three of the experi-
menters using the two longest protocols. The task of developing
codes was approached in a “bottom-up” fashion [30], with two goals
in mind that emerged through the preliminary coding and discus-
sion. One goal was to distinguish the context of an utterance, as indi-
cated in the columns of Table 1. The six contexts were these:
● Comments made that were associated with the process of
moving between computer screens (Navigation).
● Comments made in response to the spoken narration that
initiated content screens (Narration).
● Comments made on screens that contained only text (Text Only Screen).
● Comments made on screens that contained text and a related
table, figure, or graphic, but that did not allow for student in-
teraction (Text plus Table, Figure, or Graphic Screen).
● Comments made on screens with interactive graphics or sim-
ulations that allowed or required student interactions (Interactive Screen).
● Comments made on screens that required students to process
text and tables and answer one or more questions related to
the preceding content screens (Quiz Screen).
The other goal was to identify descriptive labels at appropriate
levels of abstraction for the participants’ verbalizations, as indicated
by the rows in Table 1. Selection of these labels was guided by prior
research on comprehension strategies [31, 32, 33], but many of the
labels were composed in a bottom-up manner [30] in direct re-
sponse to the data. The twenty-three protocols were then coded in-
dependently by two experimenters, with the understanding that
codes could be added to the table as needed. After coding began,
only one major code was added, which was the “Vague” code in
Table 1 (#8a).
Figure 2. Example of a quiz screen.

The process of transcription and coding of the 23 protocols was time-intensive and took approximately 900 person-hours. The data set consisted of 3501 coded utterances. In the initial coding, the
raters agreed on parsing decisions 92.35 percent of the time. That
means that one rater coded a piece of text while the other rater com-
bined the text with a contiguous piece of text less than 8 percent of
the time. For the 92.35 percent of the cases in which both raters
assigned a code, the raters agreed on codes 79.13 percent of the
time, which was a moderately high level of agreement. To further
analyze the raters’ level of agreement, a Kappa statistic [34] was cal-
culated. The use of Kappa is often advocated by researchers because
it adjusts the agreement measure for chance. The Kappa statistic for these data was equal to 0.77. This was in the range of Substantial Agreement (0.61–0.80) [34]. Discrepancies in parsing and coding were resolved through discussion among the raters, followed by mutual agreement. These final codes were used in subsequent analyses. Representative examples of verbalizations associated with codes can be found in the Appendix. The complete transcript and a more comprehensive list of examples can be obtained from the first author.

Table 1. Frequencies of each code, summed across the 23 participants, for each type of context (see Appendix for example statements).
The development of the codes and a coding rubric address the first
question in this study and make explicit the cognitive processes that
students use when comprehending text and solving problems.
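The chance-corrected agreement statistic used in the coding analysis can be reproduced in a few lines. The sketch below is illustrative only (the function name and the toy code lists are assumptions, not the study's actual coded segments); it computes Cohen's Kappa from two raters' code assignments:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's Kappa for two raters' categorical codes (chance-corrected agreement)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: proportion of segments given the same code.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, from each rater's marginal code frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)
```

Raw percent agreement (the 79.13 percent reported above) corresponds to `p_o`; Kappa discounts it by the agreement two raters would reach by coding at random with the same marginal frequencies.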
The raw frequencies of the final codes are shown in Table 1,
with column sums indicating the frequencies of codes for the six
contexts in which an utterance might have been made, and row
sums indicating the frequencies of specific kinds of utterances,
summed across contexts. An examination of column sums shows
that about 13 percent of the comments related to navigation—the
transitions of the user from screen to screen. Only about 30 percent
of the Navigation comments (n � 126) signaled some difficulty on
the part of the user with screen controls, indicating that, in general,
the software was quite useable. The comments related to difficulty
in navigating sometimes indicated potential improvements to the
software that are discussed elsewhere [35] and will not be consid-
ered further here. Less than 1 percent (n � 30) of the total com-
ments indicated difficulty using or comprehending the narration
provided on content screens, indicating that audio information, as
opposed to the more familiar visual information in computer-based
sources, was a workable element of the software [cf., 36].
Narration comments were related to the content of computer
screens and could be made on any type of screen (e.g., text screen).
Navigation comments related to the process of movement from
screen to screen and could occur in transitioning from one type of
screen to another (e.g., from a text screen to an interactive screen).
In order to further analyze the comments made on the four types of
content screens (Text, Text-Table-Figure-Graphic, Interactive, and
Quiz) the Narration comments, coded for type of comment, were
incorporated into the counts for those screens. The Navigation
comments were not analyzed further. These modified data (which
now included Narration comments) were used to calculate the aver-
age number of comments that each participant made on the four
types of screens. The means are displayed in Table 2. Participants
made fewest comments on Text-Table-Figure-Graphic screens and
on Text screens. The number of comments increases substantially
on Interactive and Quiz screens. These data were subjected to fur-
ther statistical tests by first confirming that the data for each of the
screen types were normally distributed, using the Kolmogorov-
Smirnov test [37]. The data were then submitted to a Repeated
Measures Analysis of Variance (ANOVA), and showed a
significant effect for type of screen [F(3, 63) = 15.08, p < 0.001].
Pairwise comparisons of means, using a Bonferroni adjustment for
the number of tests [38], showed that the mean number of com-
ments on Text-Table-Figure-Graphic screens did not differ signifi-
cantly from Text screens, but significantly more comments were
made on Interactive and Quiz screens compared to Text and Text-Table-Figure-Graphic screens. Means for Interactive and Quiz screens were not significantly different from each other. As a fol-
low-up to these statistical tests, an examination of individual means
showed that for 21 of the 23 participants, the number of comments
made on Interactive and Quiz screens combined exceeded the num-
ber of comments made on Text and Text-Table-Figure-Graphic screens combined. Addressing the second question in this study,
there were significant differences in the number of verbalizations
that were associated with differences in kinds of screens. Screens
that allowed or required students to do something evoked more
overt cognitive activity. Participants produced more verbalizations
and were presumably more cognitively active on screens that re-
quired interaction compared to text-based screens.
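The Repeated Measures ANOVA reported above partitions the total variance in comment counts into condition (screen type), subject, and residual components, with the subject component removed from the error term. A minimal sketch of that partitioning, assuming a complete subjects-by-conditions table (the function name and the sample numbers are illustrative, not the study's data):

```python
def repeated_measures_anova(scores):
    """One-way repeated-measures ANOVA.
    scores: rows = subjects, columns = conditions (e.g., screen types).
    Returns (F, df_conditions, df_error)."""
    n = len(scores)        # number of subjects
    k = len(scores[0])     # number of conditions
    grand = sum(map(sum, scores)) / (n * k)
    cond_means = [sum(row[c] for row in scores) / n for c in range(k)]
    subj_means = [sum(row) / k for row in scores]
    ss_cond = n * sum((m - grand) ** 2 for m in cond_means)
    ss_subj = k * sum((m - grand) ** 2 for m in subj_means)
    ss_total = sum((x - grand) ** 2 for row in scores for x in row)
    ss_error = ss_total - ss_cond - ss_subj   # condition-by-subject residual
    df_cond, df_error = k - 1, (k - 1) * (n - 1)
    f_stat = (ss_cond / df_cond) / (ss_error / df_error)
    return f_stat, df_cond, df_error
```

The follow-up pairwise comparisons would then use a Bonferroni adjustment, i.e., testing each of the m pairs of screen types at alpha/m rather than alpha.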
In a second analysis, the codes were divided into those associated
with lower-level cognitive processes (Codes 1–3) and higher-level
cognitive processes (Codes 4–5), based on a search of cognitive re-
search on comprehension and problem solving [18, 25, 31–33, 39,
40]. Summing the total number of the two types of cognitions by
participant revealed substantial individual differences in the types of
cognitions utilized, as shown in Figure 3. Individuals differed in
two ways. First, some participants were more active than others in
processing the materials, as indicated by their overall number of
comments. Second, individuals also differed in the relative frequen-
cy of lower-level versus higher-level cognitive comments. Address-
ing the third question in this study, the results showed that an
analysis of cognitive processes on an individual basis has the poten-
tial to distinguish between good and weak learners, based on the
frequency with which they engage in lower-order and higher-order
cognitive processes. Predominant use of lower-order processes is
suggestive of a weak learner. This point will be discussed further in
Section III.
Table 3 compiles the data in Figure 3 into the average number of
lower- and higher-level cognitive comments made on each of the
four screen types. An examination of the means shows that, overall,
students made few higher-level cognitive comments and many
more lower-level comments. Kolmogorov-Smirnov tests [37]
showed that the data for lower-level cognitive codes, averaged
across the four screen types, were normally distributed, but not the
data for higher-level cognitive codes; therefore, Wilcoxon rank
order tests were used to compare the average number of lower- and
higher-level cognitive comments. Rank-order tests are appropriate
for data that are not normally distributed. The statistical results
showed that students made significantly more lower-level than
higher-level cognitive verbalizations [Z = −3.68, p < 0.001], sug-
gesting that they were processing the material in a shallow fashion.
To the extent that higher-level comments were made, they were es-
pecially more frequent on quiz screens, and somewhat more fre-
quent on interactive screens, compared to text-based screens. This
latter observation is consistent with the analyses for the data sum-
marized in Table 2. Addressing the fourth question in this study,
the data in Table 3 suggest that screens requiring student interac-
tions are relatively more effective in evoking higher-order cognitive
processes than are text screens.
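The Wilcoxon comparison of lower- versus higher-level comment counts can be sketched as a signed-rank computation with a normal approximation for the z value, which is reasonable at this sample size (23 participants). The function and the paired data in the test are illustrative assumptions, not the study's actual counts:

```python
import math

def wilcoxon_signed_rank(x, y):
    """Wilcoxon signed-rank test for paired samples (e.g., each participant's
    lower- vs. higher-level comment counts). Returns (W, z), where W is the
    smaller signed-rank sum and z its normal approximation."""
    diffs = [a - b for a, b in zip(x, y) if a != b]  # drop zero differences
    n = len(diffs)
    # Rank |differences|, averaging ranks within tie groups.
    ordered = sorted(range(n), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(diffs[ordered[j + 1]]) == abs(diffs[ordered[i]]):
            j += 1
        avg = (i + j) / 2 + 1            # average rank for the tie group
        for m in range(i, j + 1):
            ranks[ordered[m]] = avg
        i = j + 1
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    w_minus = sum(r for d, r in zip(diffs, ranks) if d < 0)
    w = min(w_plus, w_minus)
    # Normal approximation to the null distribution of W.
    mu = n * (n + 1) / 4
    sigma = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    return w, (w - mu) / sigma
```

Because W is compared against a rank-based null distribution rather than a normal population model, the test remains valid for the non-normally distributed higher-level code counts.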
Table 2. Mean number of comments made by each participant per screen on each screen type. (This Table excludes Navigation comments in Table 1. *Text Plus is Text with a Table, Figure, or Graphic. **The mean number of comments is based on the number of screens of each type that each participant viewed.)
III. DISCUSSION AND CONCLUSIONS
The present research is part of new initiatives in engineering ed-
ucation to incorporate pedagogical innovations into the classroom,
and to do so with the benefit of cognitive theories and methodolo-
gies in order to advance instructional practice and effectiveness in
engineering education. An assumption of the present research was
that engineering education takes place through multiple media and
through a variety of opportunities to interact with learning tools.
There are strong precedents for using students’ overt verbalizations
to identify the cognitive representations that they construct to com-
plete a task [7, 19, 20, 24–26, 41], as in this study. Sampling stu-
dents’ verbalizations while they complete academic tasks in engi-
neering holds strong promise for providing powerful and directive
insights for curricular activities and learning objectives.
The present analyses revealed significant differences in the num-
ber and kind of cognitions students engage in, depending on the
nature of the materials. The situations demanding overt actions
from students—interactive exercises and quiz problems—were also
the ones that evoked the larger measure of cognitive activity as evi-
denced through more verbal activity. These preliminary findings
provide an argument for further investigation into the quantity and
quality of learning interactions that come about due to faculty
choices and provision of activities for students.
The results also revealed clear individual differences in respond-
ing. An examination of student learning behaviors showed that the
majority of students approached the materials using very simple
cognitive strategies—pulling words from the screens and construct-
ing rudimentary paraphrases of the materials. Chi et al. [39] have
described students like these as “poor learners.” The present set of
verbalizations revealed little use of metacognitive reading strategies,
like activating background knowledge to increase comprehension,
or revising background knowledge based on new information [42].
The most striking finding in these data was the virtual absence of
inference, explanation, and drawing conclusions from the informa-
tion provided in the text, interactive, and problem-solving formats.
These missing elements warrant a more comprehensive investiga-
tion in future studies, as these are the kinds of cognitive processes that underpin scientific practice across all disciplines [43], which are necessary to build deep conceptual understanding and the ability to solve novel problems in a domain [2].

Table 3. Mean number of lower-level and higher-level cognitive comments made by each participant per screen on each screen type. (This Table excludes Navigation comments and comment types 6, 7, and 8 in Table 1.)

Figure 3. Sums of coded statements by participant. Lower-Level Cognition includes statement types 1–3 from Table 1; Higher-Level Cognition includes statement types 4–5 from Table 1 (“Navigation” statements are excluded from sums).
An alternative explanation of students’ reliance on lower-level
processes is that these are beginning students and therefore lower-
level processing is most appropriate. They strive to comprehend the
content, but are not yet cognitively prepared for finding connections
between pieces of information, making inferences, posing ques-
tions, self-explaining, drawing conclusions, and ultimately con-
structing rich cognitive representations. From a cognitive perspec-
tive, this alternative explanation is not credible. Good learners are
metacognitive even when the material is unfamiliar. That is, they
activate background knowledge that may be relevant, seek to infer
definitions for unknown words from context, paraphrase, self-
explain, and generally try to maintain a coherent representation of
the material. In future studies, it will be important to examine cog-
nitive processes for students who are more advanced in their engi-
neering major than the students in this study to ascertain whether
the good-versus-weak-learner distinction persists for those stu-
dents. If it does, there will be a strong incentive for searching for
ways to better identify processing weaknesses in students—even
advanced students—and to more deliberately assist them to become
more effective engineering learners.
There are clear limitations to the conclusions that can be drawn
from the present data. This study was not meant to be definitive in
any sense, but rather a starting point in a promising direction of re-
search that examines the cognitive practices of students presented
with rich and complex learning materials. Nevertheless, the present
study invites further probing into the complex cognitive landscape
that underpins the emergence of professional skill and expertise
over the course of many years of training and practical experience.
The next steps in this line of research should include sampling of a
wider range of student abilities and the examination of the connec-
tions between the breadth and depth of students’ cognitive engage-
ment with course materials and their performance in the class, as in-
dicated by more conventional measures like tests, homework, and
project grades.
The claim that “engineering is a profoundly creative process”
[44] seems entirely apt as a description of the nature of professional
engineering. It also conveys a sense of the mindset and skill levels
that are set as goals for advanced students in engineering through
the ABET engineering standards. But how does a student become
a reflective thinker and effective problem solver? How does a stu-
dent get started on the path to becoming facile at tackling problems
in his or her domain, and what are the signs of advancement? This
paper applies a cognitive theory of the development of expertise [2,
13, 14] and a bottom-up approach [30] in an attempt to address
what the early stages of development might be like for engineering
students. The knowledge that we gain as researchers about the early
stages of knowledge and skill development in engineering students
will be important to the development of teaching methodologies
and learning aids and to curriculum reform initiatives. Formidable
challenges still lie ahead, as researchers pursue better theoretical
and pedagogical knowledge of students’ misconceptions [3–6],
how students acquire and develop the rhetorical structures and
schemas for scientific comprehension [5–7, 44, 45], and how they
develop the problem-solving skills necessary to become skilled
engineers [7, 8].
ACKNOWLEDGMENTS
This research was supported, in part, by a grant from the
National Science Foundation (NSF-CCLI 0088947) and a grant
from the Texas Tech University Graduate School. We would like
to thank Krystal Blankenship for assistance in transcribing the
data.
REFERENCES
[1] DiGregorio, J., “Advancing Scholarship in Engineering Educa-
tion: Launching a Year of Dialogue,” Proceedings, American Society for Engi-
neering Education Annual Conference and Exposition, Chicago, IL, 2006.
[2] VanLehn, K., “Cognitive Skill Acquisition,” Annual Review of
Psychology, Vol. 47, 1996, pp. 513–539.
[3] Prince, M., and M. Vigeant, “Using Inquiry-Based Activities to
Promote Understanding of Critical Engineering Concepts,” Proceedings,
American Society for Engineering Education Annual Conference and
Exposition, Chicago, IL, 2006.
[4] Steif, P.S., “An Articulation of Concepts and Skills Which Under-
lie Engineering Statics,” Proceedings, 34th Frontiers in Education Conference,
Savannah, GA, 2004.
[5] Miller, R.L., R.A. Streveler, B.M. Olds, M.A. Nelson, and M.R.
Geist, “Concept Inventories Meet Cognitive Psychology: Using Beta Test-
ing as a Mechanism for Identifying Engineering Student Misconceptions,”
Proceedings, American Society for Engineering Education Annual Conference
and Exposition, Portland, OR, 2005.
[6] Streveler, R., M.R. Geist, R. Ammerman, C. Suizbach, R.L.
Miller, B.M. Olds, and M. Nelson, “Identifying and Investigating Difficult
Concepts in Engineering Mechanics and Electric Circuits,” Proceedings,
American Society for Engineering Education Annual Conference and
Exposition, Chicago, IL, 2006.
[7] Litzinger, T., P. Van Meter, M. Wright, and J. Kulikowich, “A
Cognitive Study of Modeling During Problem Solving,” Proceedings,
American Society for Engineering Education Annual Conference and
Exposition, Chicago, IL, 2006.
[8] Gray, G.L., F. Costanzo, and M.E. Plesha, “Problem Solving in
Statics and Dynamics: A Proposal for a Structured Approach,” Proceedings,
American Society for Engineering Education Annual Conference and
Exposition, Portland, OR, 2005.
[9] Zywno, M.S., and M.F. Stewart, “Learning Styles of Engineering
Students, Online Learning Objects and Achievement,” Proceedings, Ameri-
can Society for Engineering Education Annual Conference and Exposition,
Portland, OR, 2005.
[10] Bloom, B.S., and D.R. Krathwohl, Taxonomy of Educational Ob-
jectives: The Classification of Educational Goals, New York, NY: Longmans,
Green, and Co., 1956.
[11] Felder, R.M., and B.A. Solomon, “Index of Learning and Teaching
Styles in Engineering Education,” accessed at http://www2.ncsu.edu/unity/lockers/users/f/felder/public/ILSdir/ILS-a.htm.
[12] Anderson, J. R., Cognitive Psychology and Its Implications (6th ed.),
New York, NY: Worth Publishers, 2005.
[13] Bedard, J., and M.T.H. Chi, “Expertise,” Current Directions in
Psychological Science, Vol. 1, 1992, pp. 135–139.
[14] Ericsson, K., and A. Lehmann, “Expert and Exceptional Perfor-
mance: Evidence of Maximal Adaptation to Task Constraints,” Annual
Review of Psychology, Vol. 47, 1996, pp. 273–305.
[15] Sahdra, B., and P. Thagard, “Procedural Knowledge in Molecular
Biology,” Philosophical Psychology, Vol. 18, No. 4, 2003, pp. 477–498.
[16] National Research Council, How People Learn, Washington,
D.C.: National Academy Press, 2000.
[17] Chi, M.T.H., “Common Sense Conceptions of Emergent
Processes: Why Some Misconceptions Are Robust,” Journal of the Learning
Sciences, Vol. 14, 2005, pp. 161–199.
[18] Chi, M.T.H., P.J. Feltovich, and R. Glaser, “Categorization and
Representation of Physics Problems by Experts and Novices,” Cognitive
Science, Vol. 5, 1981, pp. 121–152.
[19] Larkin, J.H., “Enriching Formal Knowledge: A Model for Learn-
ing to Solve Textbook Physics Problems,” in J.R. Anderson (Ed.), Cogni-
tive Skills and Their Acquisition (pp. 321–335), Hillsdale, NJ: Erlbaum
Associates, 1981.
[20] Priest, A.G., and R.O. Lindsay, “New Light on Novice-Expert
Differences in Physics Problem Solving,” British Journal of Psychology, Vol.
83, 1992, pp. 389–405.
[21] Taraban, R., M.W. Hayes, E.E. Anderson, and M.P. Sharma,
“Giving Students Time for the Academic Resources That Work,” Journal
of Engineering Education, Vol. 93, No. 3, 2004, pp. 205–210.
[22] Taraban, R., E.E. Anderson, M.W. Hayes, and M.P. Sharma,
“Developing On-Line Homework for Introductory Thermodynamics,”
Journal of Engineering Education, Vol. 94, No. 3, 2005, pp. 339–342.
[23] Miller, R.L., R.A. Streveler, B.M. Olds, M.T.H. Chi, M.A.
Nelson, and M.R. Geist, “Misconceptions about Rate Processes: Prelimi-
nary Evidence for the Importance of Emergent Conceptual Schemas in
Thermal and Transport Sciences,” Proceedings, American Society for Engi-
neering Education Annual Conference and Exposition, Chicago, IL, 2006.
[24] Ericsson, K.A., and H.A. Simon, Protocol Analysis: Verbal Reports
as Data, Cambridge, MA: MIT Press, 1984.
[25] Pressley, M., and P. Afflerbach, Verbal Protocols of Reading: The
Nature of Constructively Responsive Reading, Hillsdale, NJ: Erlbaum Asso-
ciates, 1995.
[26] Atman, C.J., and K.M. Bursic, “Verbal Protocol Analysis as a
Method to Document Engineering Student Design Processes,” Journal of
Engineering Education, Vol. 87, No. 2, 1998, pp. 121–132.
[27] Cengel, Y.A., and M.A. Boles, Thermodynamics: An Engineering
Approach, 4th Edition, Boston, MA: McGraw-Hill, 2001.
[28] Anderson, E.E., R. Taraban, and M.P. Sharma, “Implementing
and Assessing Computer-Based Active Learning Materials in Introductory
Thermodynamics,” International Journal of Engineering Education, Vol. 21,
No. 6, 2006, pp. 1168–1176.
[29] Taraban, R., E.E. Anderson, M.P. Sharma, and A. Weigold,
“Developing a Model of Students’ Navigations in Computer Modules for
Introductory Thermodynamics,” Proceedings, American Society for Engineer-
ing Education Annual Conference and Exposition, Nashville, TN, 2003.
[30] Strauss, A.L., and J.M. Corbin, Basics of Qualitative Research,
Newbury Park, CA: Sage Publications, 1990.
[31] Taraban, R., K. Rynearson, and M. Kerr, “College Students’ Aca-
demic Performance and Self-Reports of Comprehension Strategy Use,”
Reading Psychology, Vol. 21, 2000, pp. 283–308.
[32] Saumell, L., M. Hughes, and K. Lopate, “Underprepared College
Students’ Perceptions of Reading: Are Their Perceptions Different Than
Other Students’?” Journal of College Reading and Learning, Vol. 29, 1999,
pp. 123–135.
[33] Nist, S.L., and J.L. Holschuh, “Comprehension Strategies at the
College Level,” In R. Flippo and D. Caverly (Eds.), Handbook of College
Reading and Study Strategy Research (pp. 75–104), Mahwah, NJ: Erlbaum
Associates, 2000.
[34] Landis, J.R., and G.G. Koch, “The Measurement of Observer
Agreement for Categorical Data,” Biometrics, Vol. 33, 1977, pp. 159–174.
[35] Taraban, R., A. Weigold, E.E. Anderson, and M.P. Sharma,
“Students’ Cognitions When Using an Instructional CD for Introductory
Thermodynamics,” Proceedings, American Society for Engineering
Education Annual Conference and Exposition, Portland, OR, 2005.
[36] Mayer, R.E., Multimedia Learning, Cambridge, UK: Cam-
bridge University Press, 2001.
[37] SPSS Inc., SPSS Advanced Statistics User’s Guide, Chicago, IL:
Author, 1990.
[38] Box, G., H. Hunter, and J. Hunter, Statistics for Experimenters,
New York, NY: John Wiley & Sons, 1978.
[39] Chi, M.T.H., M. Bassok, M. Lewis, P. Reimann, and R. Glaser,
“Self-Explanations: How Students Study and Use Examples in Learning
to Solve Problems,” Cognitive Science, Vol. 18, 1989, pp. 145–182.
[40] Kintsch, W., Comprehension, New York, NY: Cambridge Univer-
sity Press, 1998.
[41] Hmelo-Silver, C.E., and M.G. Pfeffer, “Comparing Expert and
Novice Understanding of a Complex System from the Perspective of
Structures, Behaviors, and Functions,” Cognitive Science, Vol. 28, 2004, pp.
127–138.
[42] Taraban, R., “The Growth of Text Literacy in Engineering Un-
dergraduates,” Proceedings, American Society for Engineering Education An-
nual Conference and Exposition, Chicago, IL, 2006.
[43] Taraban, R., A. Pietan, and R. Myers, “Discourse Functions in
Student Research Reports: What Can We Say About What Students
Know and Learn Through Research Experiences,” Paper presented at To
Think and Act Like a Scientist, a conference at Texas Tech University, Lub-
bock, TX, 2006.
[44] National Academy of Engineering, The Engineer of 2020, Wash-
ington, D.C.: National Academies Press, 2004.
[45] Otero, J., J. León, and A. Graesser, (Eds.), The Psychology of Sci-
ence Text Comprehension, Mahwah, NJ: Erlbaum Associates, 2002.
AUTHORS’ BIOGRAPHIES
Roman Taraban is associate professor in the Department of Psy-
chology at Texas Tech University. He received his Ph.D. in cogni-
tive psychology from Carnegie Mellon University. His interests are
in how undergraduate students learn, and especially, how they draw
meaningful connections in traditional college content materials.
Address: Department of Psychology, Mail Stop 2051, Texas
Tech University, Lubbock, TX, 79409; telephone: (+1)
806.742.3711 ext. 247; fax: (+1) 806.742.0818; e-mail:
Edward E. Anderson is professor of Mechanical Engineering at
Texas Tech University. He received his B.S. and M.S. degrees in
Mechanical Engineering from Iowa State University and Ph.D. de-
gree from Purdue University. His research interests are in applying
technology to teaching.
Address: Department of Mechanical Engineering, Mail Stop
1021, Texas Tech University, Lubbock, TX, 79409; telephone:
(+1) 806.742.0133; fax: (+1) 806.742.0134; e-mail: ed.
Alli DeFinis is a graduate student in the Psychology program at
Texas Tech University.
Address: Department of Psychology, Mail Stop 2051, Texas Tech
University, Lubbock, TX, 79409; telephone: (+1) 806.742.3711;
fax: (+1) 806.742.0818; e-mail: [email protected].
Ashlee G. Brown is a graduate student in the Psychology pro-
gram at Texas Tech University.
Address: Department of Psychology, Mail Stop 2051, Texas
Tech University, Lubbock, TX, 79409; telephone: (+1)
806.742.3711; fax: (+1) 806.742.0818; e-mail: ashlee.g.
Arne Weigold is a graduate student in the Psychology program
at Texas Tech University.
Address: Department of Psychology, Mail Stop 2051, Texas Tech
University, Lubbock, TX, 79409; telephone: (+1) 806.742.3711;
fax: (+1) 806.742.0818; e-mail: [email protected].
M.P. Sharma is professor of Chemical and Petroleum Engi-
neering at the University of Wyoming. He received his Ph.D. de-
gree in Mechanical Engineering from Washington State Universi-
ty. A current area of interest is conducting research on teaching and
learning methods, especially on the use of synchronous and asyn-
chronous tools using Web technology.
Address: 1000 E. University Avenue, Department of Chemical
and Petroleum Engineering, University of Wyoming, Laramie,
WY 82071; telephone: (+1) 307.766.6317; fax: (+1)
307.766.6777; e-mail: [email protected].
APPENDIX
Examples of Participant Statements Associated with Codes
1b Navigation—Simply Describes Action (Neither the
statement nor context suggests a specific purpose.)
“…started it over.”
3a Navigation—Signals Comprehension
“Ok, back to Chapter 2.”
4a Navigation—Makes Inference About Content
“So, it says TOC I’m assuming that it’s Table of Contents.”
4g Navigation—Expresses Comprehension Strategy
“I think I may go back because I don’t remember the equation
right now.”
7c Navigation—Signals Difficulty Using
“Ok now I don’t even think I’m in the right place anymore.”
3a Narration—Signals Comprehension
“Well now that makes a little more sense.”
4g Narration—Expresses Comprehension Strategy
“so now I’m going to play page three over to hear the description
of the graph again”
7b Narration—Signals Difficulty Comprehending
“Ok, I don’t get that one.”
1b Text Only—Simply Describes Action (Neither the state-
ment nor context suggests a specific purpose.)
“I am reading the paragraph.”
2a Text Only—Reads/Repeats Verbatim
“When a system consisting of a chemically homogenous sub-
stance is divided into…”
2b Text Only—Shows Early Comprehension
“So there are solid, liquid, and gaseous phases.”
2c Text Only—Paraphrases
“Ok pure substance is when it’s the same substance all the way
through.”
3a Text Only—Signals Comprehension
“It looks like I am going to learn about chemicals called pure
substances.”
4a Text Only—Makes Inference About Content
“…and the 1 minus x is to compensate for the m sub f…”
4b Text Only—Connects Information in Current Screen
“The mass of everything is m sub f plus m sub g.”
4c Text Only—Connects To Previous Screens
“x is the other equation that they used.”
8a Text Only—Vague
“Let’s see, hmm…”
1a Text with Table, Figure, or Graphic—Orients
“Ok and there’s a little graph…”
2c Text with Table, Figure, or Graphic—Paraphrases
“The graph is explaining that water will flat line at different
levels of pressure…”
4a Text with Table, Figure, or Graphic—Makes Inference
About Content
“…but it looks like we combined a few of those equations.”
4c Text with Table, Figure, or Graphic—Connects To Previous
Screens
“And now they’re doing the same thing with the log specific
volume graph.”
4e Text with Table, Figure, or Graphic—Explains Content
“by looking at the diagram, by comparing the pressure and
specific volume you can tell states at which the substance is both
liquid and vapor form or when it’s in neither”
4f Text with Table, Figure, or Graphic—Draws Conclusion
“they all rise about to, uh at the same slope.”
7b Text with Table, Figure, or Graphic—Signals Difficulty
Comprehending
“I’m not really sure what z sub f and z sub fg stand for.”
4a Interactive Screen—Makes Inference About Content
“…but since there’s not as much pressure I guess it allows the
temperature to increase more than it did with the 50kPa.”
4b Interactive Screen—Connects Information in Current
Screen
“…and the log volume behaves similarly to before but it’s a lot
smaller now.”
4d Interactive Screen—Connects To Outside Knowledge
“I think I remember the triple point from high school.”
4e Interactive Screen—Explains Content
“…the temperature is staying the same because the pressure of
the gas coming up from the water kept on pushing the
pressure”
4f Interactive Screen—Draws Conclusion
“so basically, the lower the pressure, the least likely it’s going to
be any kind of liquid”
5a Interactive Screen—Anticipates/Predicts
“I guess it’s going to the left of the critical point.”
5b Interactive Screen—Checks/Confirms Prediction
“Let’s see if I was right…alright got it!”
6a Interactive Screen—Makes Metacognitive Comment
“That’s kind of weird.”
4c Quiz Screen—Connects To Previous Screens
“If I remember right, the s’s are enthalpy.”
5a Quiz Screen—Anticipates/Predicts
“I’m going to say 1.312 because that’s what it’s lined up with, so
I’m going to try it.”
5c Quiz Screen—Applies Mental Math
“so I got 600, um, 2133 times .2…2 times .1 would be uh 213,
213 times 2 is 426 plus 604 is about 1009.”
6a Quiz Screen—Makes Metacognitive Comment
“Ok, well I guess that shouldn’t be too hard if that’s how the
questions are going to be asked.”