

Active Learning Power Electronics: A New Assessment Methodology

Zoja Raud
Tallinn University of Technology, Tallinn, Estonia
e-mail: [email protected]

Abstract—A new student assessment methodology in the context of active learning of Power Electronics is presented. The study summarizes the benefits and drawbacks of different qualitative and quantitative assessment practices. Throughout the paper, examples of assessments from Tallinn University of Technology are highlighted.

Keywords—assessment, education methodology, teaching, power electronics.

I. INTRODUCTION

Contemporary changes in the business area, the responses of companies to these changes, and modern technologies pose a number of challenges to present and future engineers as well as to educational institutions. The engineering education community has responded quickly to these requirements. This has resulted in further pressure on institutions to establish and implement effective assessment within their curricula. Assessment has moved from a side topic to the center of engineering education discussions.

Since the advancement of engineering training depends in many ways on assessment, high-quality assessments may provide educators with information they can use to move the field forward. Conversely, inadequate or poorly constructed assessments may lead instructors down ineffective paths, resulting in a loss of time, money, and energy. The infusion of accepted principles and practices of student assessment has a significant impact on the development of engineering curricula and on the evaluation of student performance.

Following [1], the term assessment describes the act of collecting data or evidence that can be used to answer classroom, curricular, or research questions. It has a broader sense than measuring an individual student’s competencies, such as scores on a classroom exam or homework assignments. Since a major goal of engineering education is the understanding of physical processes, the information studied is descriptive in nature. Therefore, qualitative evaluations are often based on various descriptive methods. The methodologies repeatedly used in descriptive knowledge assessment are surveys (abstracts) and conversational analysis [2]. On the other hand, different quantitative techniques are often used for ongoing assessment procedures. This type of evaluation is appropriate when an intervention such as a new topic has been implemented and there is a desire to compare the effectiveness of the new contents with previously given material. To use quantitative techniques, a teacher needs to find the most effective tools and to seek or develop assessment instruments that will draw out the actual information [3].

Tests and quizzes are among the most popular quantitative tools [4], [5].

This study examines a new approach to assessment used in electrical engineering education at Tallinn University of Technology. A brief review of recent developments in assessment is given. Depending on the setting and the purpose of assessment, different methods are discussed. The majority of the paper is dedicated to a description of the assessment tools used in the author’s practice in the Power Electronics course.

II. ASSESSMENT IN ACTIVE LEARNING CONTEXT

One significant peculiarity of traditional engineering education is the difficulty of applying the theoretical knowledge base in practice. The transfer of knowledge from the classroom to new situations and contexts may not occur spontaneously. In most cases, deliberate teaching interventions are needed to increase the probability of such a transfer occurring [6]. In particular, electrical engineering curricula have solid foundations in science and mathematics, with the expectation that students connect mathematical and scientific concepts to the engineering practice of design and maintenance. However, it appears that the relationships between mathematics and engineering have not been clearly communicated to students through the curricula, resulting in a high dropout rate and low retention of engineering students [7]. Students’ perceptions of this issue demonstrate the lack of clear communication. Accordingly, one of the main goals of teaching is to impart to students the ability to apply knowledge in different life situations [8]. Therefore, instead of giving instructions on how to create a knowledge framework, it is argued that an instructor should give guidance and support and thus help learners become actively involved in the skill acquisition process. Hence, intensive and direct teaching of learning strategies in electrical engineering departments, including practice and training in the use of those tools, should help students to succeed in their learning.

Traditionally, the grading and assessment schemes are largely prescribed by the host university. The division of evaluation into examinations and practical credits is usually given in the curricula. The students are required to take theory exams, as these exams qualify them for the next semester. The inability of such assessment to measure higher-order cognitive understanding and affective attributes is often cited [9], [10]. Similarly, in the assessment of practical work, the questions posed to students regarding important aspects of their work typically yield a subjective and narrow mark.




Such traditional “paper and pencil” assessment methods are widely criticized as being too oriented towards exams, with very few other forms of evaluation and feedback being used [11]. When the sole purpose of assessment is to measure the ability of students to respond to questions asked in the form of credits and examinations, it does not show whether the students can apply that knowledge and use it in the real world [12]. Here, assessment is not considered part of the learning process, but rather something that takes place at a fixed time during the academic year.

Meanwhile, at Tallinn University of Technology an active learning methodology has been introduced into the Power Electronics course. This approach, first popularized in [13], focuses on the students’ desire to learn and shifts the responsibility for learning onto the learners. A well-established precept of educational theory is that people are most strongly motivated to learn things they clearly perceive a need to know. Examples of active learning include inquiry learning, case-based teaching, discovery learning, and just-in-time teaching. Many studies, such as [14], support the finding that students who learn how to apply learning strategies reach higher achievements than those who do not acquire them.

The objective of introducing active learning into the Power Electronics syllabus was to expand significantly the learning opportunities for different groups of students, both the strong and the weak ones. The following aims of active learning were announced:

• motivating students to study contemporary electronic concepts
• engaging students with the materials they study
• developing reasoning skills and an understanding of electronic processes
• articulating and testing student ideas through experimentation and discussion

To achieve these aims, the course format was rebuilt [15] in line with the assessment methodology. As a result, active learning became a way to overcome the barrier between practical application and theoretical knowledge. Since traditional assessment practice tends to focus only on the evaluation of learning and largely fails to consider assessment as a tool for improving learning, an effort was made to move from assessment of learning towards a strategy of assessment for learning [16], [17]. Assessment in the context of active learning, as introduced into the Power Electronics course, promotes learning and ultimately students’ progress and achievement, and has a major influence on what learners learn, how effectively they learn, and consequently on the quality of their learning.

Assessment does not take place only at a fixed time during the year. The offered assessment procedure was built into the teaching process to monitor the progress of students regularly and to serve as a guideline of the students’ achievements. This approach is used as a means of reflection and feedback for instructors in gauging problem areas, identifying weaknesses, and addressing issues in order to see where students are in their learning progress. Moreover, the new assessment also changed the students’ attitude, so that assessment is used as a learning tool and not just as a means to pass examinations. It favours the integration of evaluation, teaching, and learning tasks, which became authentic, meaningful, and engaging.

The key feature of this approach is the active participation of learners as informed participants in the assessment of their own performance and in the development of reflective thinking [18]. Therefore, the evaluation strategy was redefined and reformulated for the goals of active learning, to stimulate the learner by assessment and to receive timely, actual feedback.

III. ASSESSMENT THROUGHOUT LECTURES

As a first step of assessment in the active learning context, the methodology of theoretical training has been improved so that the course contents are deployed in two layers:

• mandatory material as a minimum knowledge base for the course (approximately 20% of the study volume)
• optional material, which covers theoretical, practical, computing, simulation, and assessment areas (the remaining 80% of the course)

Each lecture now consists of the following four parts:
• pre-lecture discussion of the previous material
• pre-lecture or post-lecture quiz
• teaching the new topic of the mandatory material, which may involve in-lecture discussion of correct and incorrect quiz responses
• post-lecture summing-up discussion

To stimulate the learners’ activity, the assessment criteria are introduced at the beginning of the course and are mutually agreed between the students and the lecturer. At the first lesson, all tasks are described and their influence on the final grade is explained and justified. The main components of the final grade are clarified here. This information is also available on the course Internet pages, where all other relevant materials are gradually posted as the course progresses.
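The exact components and weights of the final grade are fixed in the course documents rather than in this paper; the minimal sketch below only illustrates, under assumed component names and weights, how such a composite grade could be computed.

    # Hypothetical sketch of a composite final grade; the component names
    # and weights are assumptions for illustration, not the actual course rules.
    def final_grade(quiz, exercises, labs, exam, weights=(0.2, 0.2, 0.2, 0.4)):
        """Each component score is normalized to 0..1; returns a 0..100 grade."""
        parts = (quiz, exercises, labs, exam)
        return 100 * sum(w * s for w, s in zip(weights, parts))

    # Example: strong quiz and laboratory results, average exam
    print(final_grade(quiz=0.8, exercises=0.6, labs=0.9, exam=0.7))  # -> 74.0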

As an assessment instrument, the pre-lecture talk and the summing-up discussion that finalizes each lecture promote active learning. In addition, they give constructive feedback to the students involved. Students who are not active enough also receive feedback, in the form of a request to participate more.

Because of the pre-lecture and summing-up discussions and the quizzing, the time available for lecturing itself shrinks; therefore, additional educational resources are recommended to the students. The full scope of active learning resources is given in Fig. 1. All resources are divided into internal and external ones, as well as into real materials and virtual ones.

[Fig. 1. Resources of active learning: internal resources (departmental traditional teaching, institutional virtual resources) and external resources (partners’ traditional teaching, partners’ virtual resources, world-wide resources), linked by direct student and teacher exchange.]



The real group comprises the faculty, the printed textbooks and manuals for lectures, laboratory work, exercises, tests, and examinations; equipment, buildings, rooms, and laboratories are also needed. The virtual group includes web textbooks, software, e-manuals, and databases. In addition to the traditional internal institutional resources, the educational tools of the partner universities and enterprises are involved, and the worldwide open Internet resources become accessible for deep learning. Among them are the Virtual Lab of Riga Technical University, Latvia, and eDrive of St. Petersburg Electrotechnical University, Russia.

Student and teacher exchange within the framework of partner agreements and international programmes stimulates student activity as well. The main partners for such exchange are as follows:

• Kempten and Giessen-Friedberg Universities of Applied Sciences, Germany
• Riga Technical University, Latvia
• Lappeenranta and Helsinki Universities of Technology, Finland
• Vilnius Gediminas and Kaunas Technical Universities, Lithuania
• University of Ljubljana, Slovenia

A remarkable feature of the discussed approach is that a significant volume of the recommended resources is optional. This serves to stimulate the strong students in their success in active learning, while the weak students acquire mainly the mandatory information presented in the internal textbooks and manuals.

In-class quizzes with selected-response questions are used regularly as an important tool of learning monitoring and student assessment. They serve as the main means of reflection and feedback for diagnosing and gauging problem areas, identifying weaknesses, and addressing issues in order to see where students are in their learning progress, to find ways of solving problems, and to promote further learning. For the lecturers, the quiz results become the basis for the next pre-lecture discussion. They obtain useful information that helps to find the points of weakness in the course, showing which topics or chapters have to be reviewed to meet the students’ needs and to make these topics more understandable. Using such feedback in the active learning classroom, an instructor adjusts the lecture contents based on student responses to warm-up assignments and comes to the lesson with proper knowledge of student questions and concerns. For students, the quiz results act as assessment scores. Students are more attentive in class because the lecture contents are adjusted to their level of understanding and student concerns are addressed through properly designed classroom discussions.

To make lectures more appealing to students, an instructor usually gives 8 to 12 assignments/quizzes per semester, each containing 15 to 25 questions. Students are asked to mark the right answers to each question on personal multiple-choice question forms. The number of correct answers per question may range from zero to four.

As a rule, a question form includes several groups of problems. The main group concerns issues from the preceding lesson. The second group relates to past lectures. Historical facts and the names of famous scientists are typical questions of all quizzes. Some questions are devoted to common system theory and the basics of electrical engineering, such as Ohm’s and Kirchhoff’s circuit laws. Calculations based on mental arithmetic also accompany each quiz. As the courses are given in English, language competence undoubtedly helps to achieve higher scores.

The scoring principle is simple: +1 is given for each correct answer and −1 for each incorrect answer. The penalty for wrong answers thus discourages unprepared participation. The subject area of the forthcoming quiz is announced in advance, so the students can read up for it in time.
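A minimal sketch of this scoring rule, assuming that each option a student marks is compared with an answer key, is given below; the data structures and the example key are hypothetical and only illustrate the +1/−1 principle for questions with zero to four correct options.

    # Hypothetical sketch of the +1/-1 quiz scoring principle described above.
    # The answer key and the student's marks are illustrative only.
    def score_quiz(answer_key, student_marks):
        """Both arguments map a question number to a set of marked options."""
        score = 0
        for question, correct in answer_key.items():
            marked = student_marks.get(question, set())
            score += len(marked & correct)   # +1 for each correct option marked
            score -= len(marked - correct)   # -1 for each incorrect option marked
        return score

    key = {1: {"a", "c"}, 2: set(), 3: {"b"}}   # question 2 has no correct option
    marks = {1: {"a", "b"}, 2: {"d"}, 3: {"b"}}
    print(score_quiz(key, marks))               # (+1 -1) + (-1) + (+1) = 0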

Immediately afterwards, or at the beginning of the next lecture, the lecturer discusses the test problems. This discussion touches on both the correct and the incorrect responses to the alternatives of the multiple-choice questions and gives a thorough rationale and justification.

To improve the quiz results and to help students in their assessment, the learners are asked about the reasons for their failure right after the first assessment. Typical results of such a survey are given in Table I. They concern both the students and the instructors.

TABLE I. TYPICAL REASONS OF STUDENT FAILURE IN QUIZZES

# | Reason | Improvement by students | Improvement by teachers
1 | Low knowledge of the material | Preparation for discussion | Questions to be published in manuals and on the Internet
2 | Weak English leading to misunderstanding | Upgrading English skills | Translation of material into Estonian
3 | Weak discussion experience | Regular attendance of discussions | Improvement of the questionnaire
4 | Insufficient information about discussion | More activity in classes and at home | Citing results on the Internet

During the following quizzes, the results stabilize. As a rule, most of the students respond correctly to approximately half of the questions, and only a few of them know all the answers. Undoubtedly, the results depend on the knowledge level established by both the mandatory and the optional learning resources.

The overall results obtained in the 2009/2010 academic year are given in the top diagram of Fig. 2. The gray sector (S) indicates the share of students who declined active learning. The middle-scoring students are shown in green (M), the high-scoring ones in blue (L), and the excellent-scoring ones in red (X).
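The score thresholds behind the M/L/X categories are not stated in the paper; the sketch below only shows, under assumed cut-off values, how averaged quiz scores could be binned into the categories of Fig. 2.

    # Hypothetical binning of averaged quiz scores (in %) into the categories
    # of Fig. 2; the thresholds are assumptions chosen for illustration.
    from collections import Counter

    def categorize(score, declined):
        if declined:
            return "S"      # declined active learning
        if score >= 85:
            return "X"      # excellent scoring
        if score >= 60:
            return "L"      # large (high) scoring
        return "M"          # middle scoring

    students = [(90, False), (72, False), (40, False), (0, True)]
    shares = Counter(categorize(s, d) for s, d in students)
    print({c: 100 * n / len(students) for c, n in shares.items()})
    # e.g. {'X': 25.0, 'L': 25.0, 'M': 25.0, 'S': 25.0}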

To help the students improve their assessment results, Internet Assessment Homepages have been developed, which are updated along with the current quizzing. As a result, a strong dependence between the quiz scores and the examination grades has been found.
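The paper does not state how this dependence was quantified; as one possible illustration, the sketch below computes a Pearson correlation coefficient between quiz scores and examination grades on invented data.

    # Hypothetical illustration: measuring the dependence between averaged quiz
    # scores and examination grades with a Pearson correlation coefficient.
    # The numbers below are invented and are not results from the course.
    from math import sqrt

    def pearson(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sqrt(sum((x - mx) ** 2 for x in xs))
        sy = sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    quiz_scores = [35, 48, 60, 72, 90]   # averaged quiz scores, %
    exam_grades = [1, 2, 3, 4, 5]        # final examination grades
    print(round(pearson(quiz_scores, exam_grades), 2))   # -> 1.0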

IV. ASSESSMENT THROUGHOUT EXERCISES AND LABORATORY PRACTICE

In the scope of the described educational environment, the theoretical studies do not always have to be passed before the practical ones. For some students, it is better to pass the theoretical studies first; for others, it is better to pass the practices first, because these can provide a good basis for the theory by helping students to understand some of the theoretical concepts.

The structure of the new manuals for both the exercises and the laboratory practices repeats the structure of the lecturing described above. The practice contents cover two layers: the mandatory material and the optional one.

The mandatory part of these lessons is executed in accordance with traditional step-by-step instructions, resulting in a standard report with circuit diagrams, calculations, experimental traces, measurement tables, and conclusions. The new optional part of the practice is processed using informal instructions and fresh ideas of the teachers. Here, the learning assessment is invoked to evaluate:

• clarity of the experiment statement
• understanding of the learning objectives and the methodology used
• evaluation of the problem solving under the practical headings
• rate of calculations, simulation performance, and the software selected
• practical experience and qualification obtained from the exercises
• nature and appropriateness of student collaboration and group working potential

To provide effective skill acquisition, the existing classes were redesigned to center teaching and assessment on the needs and abilities of the learners. The emphasis was on generating student interest in engineering methods and tools. New hands-on and interactive learning modules were incorporated into the course. To enhance understanding by linking theory and practice, applicable tasks were delivered in the classes. Electrical design focused on real-world engineering products was taught using popular toolboxes such as Multisim and Matlab. As a rule, analytical problems are explained and collaboratively solved in the first lesson of each practice module. Here, an instructor encourages active participation and facilitates understanding. Students are trained in technical report writing, which helps them to produce a high-quality explanation and presentation of their study.

The principles of assessment throughout the exercises on calculations and simulations were developed in the context of active learning. Although the full number of lessons comes to eight per semester, only five of them are mandatory, whereas the rest are optional. The average number of problems to solve in a lesson is usually 4 to 6. Again, solving only one problem in a lesson is mandatory, whereas the other problems are optional. The number of problem variants is equal to the number of students. The scoring system assigns one score for each solved problem.

The overall set of laboratory works accessible to the students (their number now comes to 10) has been grouped into three thematic blocks. Each block includes mandatory and optional works; therefore, everybody should perform at least six works. In turn, each work contains both mandatory and optional experiments to be conducted. All laboratory works incorporate five basic activities that promote active learning and give constructive feedback between the students and instructors involved:

• off-site preparation
• in-class pre-work talk
• performance of the laboratory work
• in-class summing-up discussion
• off-site report generation and defense

The classroom talks and discussions are used regularly as an important tool of learning monitoring and student assessment.

[Fig. 2. Results of active assessment. Averaged scoring summary, % of attendees: quizzes (active learning 77%): X 12%, L 38%, M 27%, S 23%; exercises (active learning 66%): X 34%, L 4%, M 27%, S 35%; labs (active learning 47%): X 8%, L 12%, M 27%, S 53%.]



To ensure the students’ readiness for experimentation, an instructor usually asks 10 to 20 questions before, during, and after the work. Students are asked to find answers to the previously cited questions. Right answers increase the student’s personal rating. According to the simple scoring principle, a student wins a score for each correct answer. Here, the learning assessment evaluates features similar to those of the exercises.

Immediately after the experiments, the instructor and the students discuss the results obtained. This discussion touches on both the correct and the incorrect responses and gives their justification. Again, the Internet assessment homepages help the students to improve their assessment results.

The particular principles of the practice arrangement were developed in the context of active learning. Each student of a working team has his or her own responsibility in the work. One team member is responsible for circuit assembly: he or she has to develop the circuit diagram and to lead the team during the assembly. If the circuit is prepared without the instructor’s help, this member obtains a personal score. The second team member is responsible for the calculations made during and after the experiments. If the calculated results match the experimental ones, he or she obtains a personal score. The third team member keeps the minutes and plots the diagrams along with the experimentation. If the diagrams are ready within the lesson, he or she obtains a score as well. These roles change weekly, so everybody learns to play all the roles.
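As a small, purely illustrative sketch of this weekly rotation (the role names follow the description above; the team list and the rotation function are invented for the example):

    # Hypothetical sketch of the weekly rotation of laboratory team roles.
    ROLES = ["circuit assembly", "calculations", "minutes and diagrams"]

    def weekly_roles(team, week):
        """Rotate the three roles among the team members from week to week."""
        return {member: ROLES[(i + week) % len(ROLES)]
                for i, member in enumerate(team)}

    team = ["student A", "student B", "student C"]
    for week in range(3):
        print(week, weekly_roles(team, week))
    # Over three weeks, every team member has played every role once.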

Every laboratory work involves both mandatory and optional items. Each team member may obtain additional scores by executing the optional items. By answering the questions given at the end of a laboratory block, the students obtain more scores. The average number of problems to solve in a block usually comes to 10. Again, solving only one problem per lesson is mandatory, whereas the others are optional. The number of problem variants is equal to the number of students. The scoring principle assigns one score for each solved problem.

The overall results of the assessment of exercises and labs are given in the bottom diagrams of Fig. 2. The gray sectors (S) indicate the share of students who declined active learning. The middle-scoring students are shown in green (M), the high-scoring ones in blue (L), and the excellent-scoring ones in red (X).

V. CONCLUSION

The study shows that the ultimate purpose of assessment (or any type of evaluation) is to improve student learning, which begins with setting objectives and renews itself with each assessment activity. It is shown that assessment is an integrated and important component that should be considered throughout the educational process. Different qualitative and quantitative assessment practices in combination provide Power Electronics educators with rigorous and sound assessment tools for lecturing, exercises, and laboratory works.

ACKNOWLEDGMENT
This paper was supported by the Project ETF8020.

The author expresses her gratitude to her supervisor, Professor Valery Vodovozov, for his essential contribution to this work.

REFERENCES
[1] B. M. Olds, B. M. Moskal, and R. L. Miller, “Assessment in engineering education: Evolution, approaches and future collaborations,” Journal of Engineering Education, 2005, 1, pp. 13–25.
[2] C. Haller, V. Gallagher, T. Weldon, and R. Felder, “Dynamics of peer education in cooperative learning workgroups,” Journal of Engineering Education, vol. 89(3), 2000, pp. 285–293.
[3] T. Ellis, “Animating to build higher cognitive understanding: A model for studying multimedia effectiveness in education,” Journal of Engineering Education, vol. 93(1), 2004, pp. 59–64.
[4] L. A. Suskie, Questionnaire Survey Research: What Works, 2nd ed., Florida State University: Association for Institutional Research, 1996.
[5] M. Laeser, B. Moskal, R. Knecht, and D. Lasich, “Engineering design: Examining the impact of gender and the team’s gender composition,” Journal of Engineering Education, vol. 92(1), 2003, pp. 49–56.
[6] A. Raviv, “Academic skills: The key to meaningful learning in the higher education system – An action research conducted at Tel-Hai Academic College,” The 7th International Conference on Education and Information Systems, Technologies and Applications (EISTA 2009), Orlando, Florida, 2009, pp. 241–246.
[7] J. F. Froyd and M. W. Ohland, “Integrated engineering curricula,” Journal of Engineering Education, 2005, 1, pp. 147–164.
[8] R. Lidor, “The influence of imparting learning strategies on the performance of motor skills,” Hebetim Behinuch (Aspects of Education), vol. 1(1), 1996, pp. 63–88.
[9] H. Virolainen, “Digital portfolio as a learning tool,” The 7th International Conference on Education and Information Systems, Technologies and Applications (EISTA 2009), Orlando, Florida, 2009, pp. 248–252.
[10] A. M. Rashad, A. A. Youssif, R. A. Abdel-Ghafar, and A. E. Labib, “E-assessment tool: A course assessment tool integrated into knowledge assessment,” in M. Iskander (Ed.), Innovative Techniques in Instruction Technology, E-Learning, E-Assessment, and Education, New York: Springer, 2008, pp. 7–11.
[11] N. J. Powell, P. J. Hicks, W. S. Truscott, P. R. Green, A. R. Peaker, A. Renfrew, and B. Canavan, “Four case studies of adapting enquiry-based learning (EBL) in electrical and electronic engineering,” International Journal of Electrical Engineering Education, vol. 45(2), 2008, pp. 121–130.
[12] Y. E. Woyessa, S. P. Van Tonder, and D. Van Jaarsveldt, “Alternative student assessment in engineering education: Lecturers’ perceptions and practices,” The 2nd International Multi-Conference on Engineering and Technological Innovation (IMETI 2009), Orlando, Florida, 2009, pp. 224–229.
[13] C. Bonwell and J. Eison, “Active learning: Creating excitement in the classroom,” ASHE-ERIC Higher Education Report No. 1, Washington, D.C.: Jossey-Bass, 1991.
[14] C. E. Weinstein and R. E. Mayer, “The teaching of learning strategies,” in M. Wittrock (Ed.), Handbook of Research on Teaching, New York: Macmillan, 1986, pp. 315–327.
[15] Z. Raud and V. Vodovozov, “Active learning in Electric Drives and Power Electronics,” 7th International Symposium on Topical Problems in the Field of Electrical and Power Engineering, Doctoral School of Energy and Geotechnology, Narva-Joesuu, Estonia, 2009, pp. 63–68.
[16] D. T. Rover, N. G. Santiago, and M. M. Tsai, “Active learning in an electronic design automation course,” IEEE International Conference on Microelectronic Systems Education, 1999, pp. 78–79.
[17] H. Geyser, “Learning from assessment,” in S. Gravett and H. Geyser (Eds.), Teaching and Learning in Higher Education, Pretoria: Van Schaik, 2004, pp. 90–109.
[18] C. Savander-Ranne, O. P. Lunden, and S. Kolari, “An alternative teaching method for electrical engineering courses,” IEEE Transactions on Education, vol. 51(4), 2008, pp. 423–431.
