

Quality Evaluation of the Higher Education Programmes

David Bañeres, Laura Porta, Teresa Romeu, Montse Serra
IT, Multimedia and Telecommunications Department
Universitat Oberta de Catalunya

Universities in the European Higher Education Area (EHEA) must fulfil several directives in order to offer a higher education degree. A degree is evaluated throughout its whole life cycle: from its design, when it is approved, during its monitoring, until its verification. Thus, each country has a quality agency that articulates all these evaluation processes following the policies recommended by the European Association for Quality Assurance in Higher Education (ENQA), in order to guarantee the quality required for a specific degree [1].

As part of the monitoring process of a degree, an annual report evaluating several indicators has to be delivered to the agency. More precisely, the report describes information such as enrolment statistics, student performance, satisfaction results, and the type of learning resources. It also contains a summary of the year composed of the detected issues, a set of enhancement proposals and the planned corrective measures. Subsequently, this report is analyzed by the agency, which returns feedback to the university together with the issues that should be corrected and improved during the next year.

Based on this annual report, the monitoring process consists of the steps described below:

- The degree coordinator analyzes the remarks made by the agency on the previous report in order to make the suitable adjustments to the degree.
- The issues that should be corrected in a subject are delivered to the teacher in charge of that subject. The issues are also shared with the teaching staff of the same knowledge area in order to find solutions and implement the corresponding corrections and improvements.
- The teachers collect evidence throughout the course in order to point out its weaknesses and strengths for the next annual report.
- At the end of the course, the evidence and the analysis of the course are delivered to the degree coordinator and to the teaching staff of the same knowledge area.
- The degree coordinator collects all the evidence and, at the same time, analyzes and synthesizes the results of the evaluation in the annual report, in a continuous process of improvement.

As can be observed in the previous description, the monitoring process involves the degree coordinator and all the teaching staff of the degree. It tends to be time-consuming, since all the teachers have to work collaboratively to meet the quality standards. Currently, in our university this collaborative work is performed manually; that is, the information is transferred by arranging meetings between the coordinator and the teachers. Additionally, the conclusions of these meetings are delivered to the coordinator as unformatted documents, which increases the analysis time. This methodology is not efficient and becomes even more impractical in a degree with a large teaching staff.

The present proposal focuses on evaluating the quality of a concrete degree by using techniques commonly applied in e-assessment [2], since some of the assessment techniques used to evaluate students can be extrapolated to other areas. Specifically, we propose to use rubric-based assessment to evaluate the quality of a degree.

The rubric helps to self-evaluate a specific subject by measuring a set of indicators. Indicators regarding the learning resources, the tools, the evaluation, the satisfaction, the quality of the teaching plan, and even the performance of the teaching staff are assessed. Notice that this rubric approach guarantees the same criteria for every subject. The quality gradation ranges from deficient through acceptable to excellent. Moreover, each indicator also accepts information in natural language describing issues and enhancement tasks scheduled for the next semester.
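A minimal sketch of how such a rubric could be represented, assuming a simple Python data model (the class names, field names and example subject are illustrative and do not describe the actual AVALA implementation):

```python
from dataclasses import dataclass, field
from enum import Enum

class Grade(Enum):
    # Three-level gradation described above.
    DEFICIENT = 1
    ACCEPTABLE = 2
    EXCELLENT = 3

@dataclass
class Indicator:
    name: str                 # e.g. "learning resources", "satisfaction"
    grade: Grade
    issues: str = ""          # free-text description of detected issues
    enhancements: str = ""    # tasks scheduled for the next semester

@dataclass
class SubjectRubric:
    subject: str
    semester: str
    indicators: list[Indicator] = field(default_factory=list)

    def critical_indicators(self) -> list[Indicator]:
        """Indicators graded as deficient, i.e. candidates for corrective measures."""
        return [i for i in self.indicators if i.grade is Grade.DEFICIENT]

# Example: a teacher self-evaluates one (hypothetical) subject.
rubric = SubjectRubric(
    subject="Programming Fundamentals",
    semester="2013-14 / 1",
    indicators=[
        Indicator("learning resources", Grade.ACCEPTABLE),
        Indicator("satisfaction", Grade.DEFICIENT,
                  issues="Survey results below the degree average",
                  enhancements="Revise the continuous assessment activities"),
    ],
)
print([i.name for i in rubric.critical_indicators()])  # ['satisfaction']
```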

In order to improve the efficiency of the evaluation process, the technique has been implemented in a tool called AVALA. The tool collects all the evidence related to the operation of a concrete degree over a semester. Furthermore, it has been designed as a collaborative framework where all the teaching staff and the coordinator can share all the information related to quality assurance.

The tool has a twofold objective. On the one hand, as described above, it helps a teacher to collect and analyze all the evidence related to the performance of the teaching process during a semester. At the same time, the teacher can analyze the progress of this evidence by comparing the current semester with the previous ones. On the other hand, the degree coordinator can consult all the collected evidence in a single environment; that is, the coordinator has a view of the underlying information of the current semester and knows at a glance which issues need to be addressed.
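Reusing the rubric sketch above, the semester-over-semester comparison a teacher performs could be expressed roughly as follows (again an illustrative sketch, not the AVALA code):

```python
def semester_progress(current: SubjectRubric, previous: SubjectRubric) -> dict[str, int]:
    """Per-indicator change in grade between two semesters of the same subject.
    +1 means one step up (e.g. deficient -> acceptable); negative values mean a regression."""
    prev = {i.name: i.grade.value for i in previous.indicators}
    return {
        i.name: i.grade.value - prev[i.name]
        for i in current.indicators
        if i.name in prev
    }
```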

Additionally, the tool has other features that help in some steps of the monitoring process:

- The comments of the quality agency are stored per subject. This feature lets the instructors automatically know the issues that should be corrected during the current semester.
- The system synthesizes the subjects' evaluations in order to automatically deliver reports about the critical issues of the degree. This information helps the coordinator to prepare the annual reports; a sketch of this synthesis step is shown after the list.
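Continuing the rubric sketch above, the synthesis step could look roughly like this (purely illustrative; the actual reports produced by AVALA are richer):

```python
from collections import defaultdict

def synthesize_report(rubrics: list[SubjectRubric]) -> dict[str, list[str]]:
    """Group deficient indicators by subject so the coordinator can see,
    at a glance, where corrective measures are needed."""
    report: dict[str, list[str]] = defaultdict(list)
    for r in rubrics:
        for indicator in r.critical_indicators():
            report[r.subject].append(
                f"{indicator.name}: {indicator.issues or 'no description provided'}"
            )
    return dict(report)

# Usage: feed all the rubrics collected during the semester.
print(synthesize_report([rubric]))
# {'Programming Fundamentals': ['satisfaction: Survey results below the degree average']}
```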

The AVALA tool has been used in the IT, Multimedia and Telecommunications Department of the Universitat Oberta de Catalunya during the academic years 2012-13 and 2013-14 (both divided into two semesters). Three bachelor's degrees and six master's degrees have been managed with the tool, and around sixty professors have been involved in the evaluation process.

The experimental results have been satisfactory. The coordinators agree that the tool is useful for reviewing and centralizing the information cited above, and that critical problems can be easily detected. The system still lacks a control panel with alerts, for example for subjects with low performance or satisfaction. For their part, the professors agree on the improved efficiency of the information transfer and on the structured way the data is stored. However, they complain about the time invested in the evaluation of all the subjects.

The tool has some deficiencies that should be corrected in the immediate future, and some of the users' demands will be taken into account in order to improve the framework. Nevertheless, the main goals of the tool have been achieved, reducing the time invested in the quality monitoring of the degree.

As a final conclusion, the AVALA tool provides added value, not only with regard to the efficiency of the teachers' work, but also by establishing a better way to manage all the monitoring processes of a degree, thus contributing to guaranteeing its quality.

REFERENCES

[1] Standards and Guidelines for Quality Assurance in the European Higher Education Area. http://www.enqa.eu/wp-content/uploads/2013/06/ESG_3edition-2.pdf. Last accessed: 21/10/2014.

[2] Crisp, G. T. (2012). Integrative assessment: reframing assessment practice for current and future learning. Assessment & Evaluation in Higher Education, 37(1), 33-43.