
Continuous Quality Improvement in Higher Education

A case study in Engineering School of Boras University

Ahoo Shokraiefard

This thesis comprises 15 ECTS credits and is a compulsory part in the Master of Science with a Major in Quality and Environmental Management, Thesis-number 9/2011


Continuous Quality Improvement in Higher Education

A case study in Engineering School of Boras University

Ahoo Shokraiefard

Master thesis

Subject Category: Quality in Higher Education

University of Borås School of Engineering SE-501 90 BORÅS Telephone +46 033 435 4640

Examiner: Roy Andersson, University of Borås, School of Engineering, SE-501 90 Borås

Supervisor: Maria Fredrikson, University of Borås, Department of Quality


To my mother, who supports me with her smile, to my father, who has always been there for me, and many thanks to Mohammad Saeid Zandi, who is a true friend.


Abstract:

This thesis considers “Quality in Higher Education” from different points of view. The aim is to achieve continuous quality improvement in the Engineering School of Borås University, which is used as a case study. In order to improve quality, the quality criteria and definitions in higher education are discussed. Different improvement methods that have been used successfully to improve quality in educational systems, such as PDCA (the Deming wheel) and EFQM (European Foundation for Quality Management), are briefly presented. These methods are applied in the Borås University Engineering School to find the root problems and the main barriers to improving quality, and several solutions are suggested for implementation in order to achieve continuous quality improvement in this particular department. Although the case is focused on Borås University, the writer believes that the same methods and assessments can be used in any kind of educational organization.

Key words: Quality in higher education, Quality improvement methods, PDCA, EFQM, self-assessment


Table of Contents

Abstract

1. Introduction
    1.1. Purpose
    1.2. Methodology
    1.3. Borås University, Engineering School
2. Quality in Higher Education
    2.1. Stakeholders
    2.2. Students' Role in Higher Education
    2.3. Different Perspectives on Quality Definition in Higher Education
    2.4. Quality in Engineering Education
        2.4.1. CDIO INITIATIVE Framework
        2.4.2. Engineering Programs Evaluation in Sweden and CDIO INITIATIVE
3. Theories and Methodologies
    3.1. PDCA Method
    3.2. EFQM Self-assessment
        3.2.1. Theory and Methodology of EFQM Method
        3.2.2. The EFQM Model Definitions and Applications
    3.3. Student Feedback
        3.3.1. How to Transfer Student Voice to Quality Index?
        3.3.2. Validity and Reliability of the Survey
4. PDCA Steps Implemented in Borås Engineering School
    PDCA implementation using the assessment results
5. EFQM Methodology Implemented in Borås Engineering School
    5.1. Self-assessment
    5.2. Results
6. Student Satisfaction Survey Implemented in Borås Engineering School
    6.1. Survey
    6.2. Results
7. Conclusion

References

Appendix A: EFQM Questionnaire
Appendix B: Faculty Student Satisfaction Survey
Appendix C: EFQM Self-assessment Results
Appendix D: Faculty Student Satisfaction Results


1. Introduction

Quality in higher education is a challenge.

Answering the question “what is quality of education?” is like answering the question “what is life?” (Kemenade et al. 2008, p.175)

Defining quality has always been a big challenge in the field of services, and education is among the most challenging areas. There are more tools for improving quality when considering products and evaluating a product line, and it is also easier to assess the extent of a product's improvement. In services it is more difficult to define quality, and even harder to assess the results of improvement; customer needs are more ambiguous in many fields, and their satisfaction is difficult to assess. Within services, education is one of the most complicated subjects to evaluate and improve. Defining quality in education depends on many factors: societal beliefs, regional conditions, industry expectations and much more. It is very difficult to identify all of them and to find exact ways to achieve them. “There is an obvious lack in the provision of an education compatible with real needs in the context of continuous adaptation to change” (Logothetis, 1995).

This research provides different definitions of quality in higher education, and some different ways of classifying the criteria and factors of quality in higher education are introduced. But as John Dewey said: “The object of education lies not in communicating the values of the past but in creating new values for the future” (John Dewey, as cited in Dew, 2009, p.4). So the research focuses on quality improvement tools which can be used in higher education. The two simple but very practical and useful evaluation tools chosen here are EFQM and a faculty satisfaction survey. These two tools are described and used at Borås University College as a case study, and some solutions are suggested, based on the research, to improve quality in the Engineering School of Borås University College.

These two assessment instruments are meant to be used within a comprehensive improvement cycle, and the PDCA cycle is chosen for this purpose. Using the assessment tools within a PDCA cycle has two advantages: first, the two assessments can check each other's reliability within the same cycle; second, they can be compared with the corresponding assessments from the previous cycle, so that the rate of improvement can be measured.

1.1 Purpose

In educational institutions with a high growth rate in the number of students, especially at the international level, and given the organizational goals, developing a system for continuous improvement seems to be essential. This project offers an efficient way to implement and develop continuous improvement while recognizing current problems.

Given the growth in the number of students over the last few years, and the aim of this university college to become a university in the near future, we feel that the current system may not be the best one in the long term.

1.2. Methodology

The improvement methodology in this research is based on the Deming Wheel continuous improvement method, which is also known as the PDCA cycle. As will be described later, this cycle has four main steps: Plan, Do, Check and Act. In each step, different management methods should be used to fulfil the step, based on academic research and reliable data.

The methodologies used and described in this research are mainly assessment and evaluation methods; the detailed executive planning should be worked out in the practical steps. The methodologies suggested for achieving the best result in the PDCA cycle are assessment methods such as EFQM, which focuses on the people who work in the organization and finds some of the root problems from their point of view, and the NSS assessment method, which evaluates the students' point of view and the degree of success of the improvement steps. This clarifies the Act step of the PDCA cycle.

1.3. An Introduction to the Borås Engineering School

Borås is a city located in south-west Sweden.

The University College of Borås has six different educational departments. The School of Engineering was established in 1990. By 2007 the number of study programs offered was 19, including 5 master programs, and the total number of students in the department was 1,420. The School of Engineering has around 70 employees and 11 professors.

2. Quality in Higher Education

In order to define and then improve quality in education systems, we need to translate the educational needs into clearer concepts and classify the quality factors. But the first step is to consider who the stakeholders of higher education organizations are.

2.1. Stakeholders

The term stakeholder was first defined by the Stanford Research Institute as “those groups without whose support the organization would cease to exist”; since then, the term has been taken to mean someone who has an interest in a deliverable or outcome (Stanford, 1963).

A more modern definition of stakeholders might be the one from Tonnquist, which divides stakeholders into three different types according to their priority: core, primary and secondary stakeholders. Core stakeholders, according to Tonnquist's definition, are decision makers; primary stakeholders are those who are particularly influenced and want to be considered in the decisions made by the core stakeholders; and secondary stakeholders have low interest and do not benefit much from the organization (Tonnquist, 2008). A university's stakeholders can be identified as students, employees, society, industry, government, media and the environment, but it is still complicated to define the university's core stakeholders. According to Tonnquist's definition, students are not core stakeholders: although students play the main role in the university, they are not decision makers, and their satisfaction alone would not be enough when evaluating the university's outcome. Society cannot make decisions about university policies and strategies either, although it can be very influential in the decision-making process. The core stakeholders of a university are the sponsors; the sponsor can be the government, or one or more companies, and in the case of private universities the core stakeholders are the owners. The university should be directed in such a way that the core stakeholders achieve their main goals. For instance, if Volvo sponsors Chalmers University, graduates should be able to fulfil the company's requirements; in a public university for teacher education, graduates should be able to satisfy the needs of the public schools.

2.2. Students’ role in higher education

“Employees and management are internal stakeholders, as is the student. Others rather call the student a participant in the learning process,” says Everard van Kemenade et al. in the article “More Value in Defining Quality”. He continues: “if we are not clear from whose perspective we are discussing, our discussion on quality will get us nowhere.” (Kemenade et al., 2008, p.177)

Juran and Gryna (1980), as cited in Stensaasen's article (1995, p. 582), describe the educational process as follows:

“... service (education). Schools start with a raw material (students), apply a process (teaching) and turn out a finished product (graduates), although there may be some rejects. There are raw material specifications (minimum entrance requirements) and incoming inspections (entrance examinations). There is a process specification (curriculum, course outline), process facilities (faculty, laboratories, textbooks), process controls (reports, recitations, quizzes) and final product testing (examination).”

Students are one of the main stakeholders; they may be considered customers of universities, and this is partly true. There have to be students who choose a university and study there, just as there have to be customers for any kind of product or service. But at the same time, students may also be considered the product of universities: the main aim of higher education is to take students and, within a few years, prepare them to be useful to their society and/or the industry that needs them. There are actually other customers waiting outside the university to fulfil their requirements with the graduates' help, and most often, during their education, students do not have a clear idea of what they are expected to do after graduation. That is the main reason why students' satisfaction is not enough to rank a university and why they are not the core stakeholders. Students cannot be decision makers (this is why, in the EFQM method, the questionnaire is designed for employees).

2.3. Different Perspectives on Quality Definition in Higher Education

Much recent research has been done to define quality in higher education and to explain the main issues. Harvey and Green (1993) suggest five notions of quality: exceptional, perfection, fitness for purpose, value for money and transformative. Van Kemenade suggests that “A new definition of quality is needed to explain recent quality issues in higher education” (Kemenade et al. 2008). He describes “a quality concept with four constituents: object, standard, subject and values” and tries to fit quality in higher education into these four factors.

John Dew, in his article “Quality Issues in Higher Education”, frames the quality concept in higher education in five different ways: quality as endurance, quality as luxury and prestige, quality as conformance to requirements, quality as continuous improvement, and quality as value added (Dew, 2009). It is clear that the definition of quality differs from one concept to another, and so does the improvement.

The first concept, endurance, mostly a European point of view, holds that the age of the institution is the most important factor; the reasoning is that if an organization or a university can last for a long time, it has the ability to ensure quality. Universities which have stood for more than a century have a very high rank in European ranking systems. Quality as luxury and prestige provides students with a comfortable environment and up-to-date research facilities; scholarships are one of the commitments in this aspect. In the concept of quality as conformance to requirements, the requirements can be defined as learning results, services, resources, planning and improvement programs. Quality as continuous improvement ensures that the requirements in the previous concept stay up to date and that the institution can conform to recent requirements and has enough tools and assessments to evaluate this. Quality as value added is a modern concept which focuses on adding value to society through education. As Dew defines it: “Completing a college degree should mean some measurable improvement in student learning, social skills, social contacts, writing skills, reading skills, critical thinking, or other attributes that are consistent with the mission of an institution, such as the ability to dance, speak another language, or plan how to construct a building” (Dew, 2009, p.4).

The main issue in higher education is measuring the quality factors. It is easy to measure the age of a university and rather easy to measure luxury, but it is more difficult to assess conformance to requirements, and more complicated still to assess value added. If we can find suitable tools to assess conformance to requirements, we can use continuous improvement tools to improve it and add value according to society's values and expectations.

A simple review of how international institutes rank top universities is given by QS (the world's leading network for top careers and education), which ranks universities according to different measures: academic peer review, employer review, faculty-student ratio, citations per faculty, international faculty and international students. Table 1 below, from QS, shows this institute's ranking methodology:

Indicator | Explanation | Weighting
Academic Peer Review | Composite score drawn from peer review survey (which is divided into five subject areas); 9,386 responses in 2009 (6,354 in 2008) | 40%
Employer Review | Score based on responses to employer survey; 3,281 responses in 2009 (2,339 in 2008) | 10%
Faculty Student Ratio | Score based on student/faculty ratio | 20%
Citations per Faculty | Score based on research performance factored against the size of the research body | 20%
International Faculty | Score based on proportion of international faculty | 5%
International Students | Score based on proportion of international students | 5%

http://www.topuniversities.com/university-rankings/world-university-rankings/methodology/simple-overview
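To make the weighting mechanism concrete, the short sketch below combines six indicator scores into a single ranking score using the weights from Table 1. The individual indicator scores (assumed to be already normalized to a 0-100 scale) are invented for illustration; only the weights come from the table.

```python
# Sketch: combining normalized indicator scores (0-100) with the QS weights
# from Table 1. The example scores are invented for illustration.

weights = {
    "academic_peer_review": 0.40,
    "employer_review": 0.10,
    "faculty_student_ratio": 0.20,
    "citations_per_faculty": 0.20,
    "international_faculty": 0.05,
    "international_students": 0.05,
}

# Hypothetical normalized scores for one university.
scores = {
    "academic_peer_review": 78.0,
    "employer_review": 65.0,
    "faculty_student_ratio": 81.0,
    "citations_per_faculty": 54.0,
    "international_faculty": 70.0,
    "international_students": 62.0,
}

overall = sum(weights[k] * scores[k] for k in weights)
print(f"Weighted overall score: {overall:.1f}")  # about 71.3 for these example figures
```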

Dervitsiotis (1986) uses the Objectives Matrix to provide a framework for evaluating educational institutes. The main task of the Objectives Matrix (OMAX) is to translate strategic objectives into success factors and to weight them so that they can be prioritized. He defines six main factors for evaluating education: “(1) teaching ability, (2) research and publication output, (3) student contact effort, (4) joint projects effort with business and industry, (5) community service, (6) student placement effectiveness.” (Dervitsiotis, 1995, p566)

He defines these six criteria as follows:

1. Teaching ability: the ability to run an effective process for admitting and recruiting students; this factor is assessed by a questionnaire score.

2. Research and publication output: the number of papers published in journals and other official databases; it is weighted by the number of papers.

3. Student contact effort: the total hours spent with students outside lecture time.

4. Joint projects effort: calculated from the number of projects and the budget that firms spend on supervised projects.

5. Community service: the total hours spent with various organizations assisting with community activities.

6. Student placement effectiveness: the percentage of students who found satisfactory related jobs within six months.

The definition of quality can vary according to the institute's aims and visions. The important thing is to break the general vision and aims down into measurable criteria and to have practical plans for achieving them. A small numerical sketch of the OMAX idea is given below.
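As a rough, hypothetical sketch of the OMAX idea described above, the snippet maps heterogeneous raw measures onto a common 0-10 scale between a "worst acceptable" and a "target" level and then combines them with priority weights. All raw values, targets and weights are invented assumptions, not figures from Dervitsiotis.

```python
# Sketch of an Objectives Matrix (OMAX) style evaluation: each raw measure is
# mapped onto a common 0-10 scale between a "worst acceptable" and a "target"
# level, then combined with weights that reflect its priority.
# All numbers below are illustrative assumptions.

def omax_level(value, worst, target):
    """Linearly map a raw value onto the 0-10 OMAX scale and clamp it."""
    level = 10 * (value - worst) / (target - worst)
    return max(0.0, min(10.0, level))

# (raw value, worst acceptable, target, weight)
factors = {
    "teaching ability (survey score)":       (4.1, 3.0, 5.0, 0.30),
    "research output (papers/year)":         (35,  10,  60,  0.25),
    "student contact (hours/week)":          (6,   2,   10,  0.15),
    "joint project budget (kSEK/year)":      (400, 100, 800, 0.10),
    "community service (hours/year)":        (120, 40,  200, 0.10),
    "student placement within 6 months (%)": (72,  40,  95,  0.10),
}

total = sum(w * omax_level(v, worst, target)
            for v, worst, target, w in factors.values())
print(f"Weighted OMAX index (0-10): {total:.2f}")
```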


2.4. Quality in Engineering Education

”Universities can be very different, not only from one country to the next, but also among different scientific sectors within the same country.” (Gola, 2005, p423)

This statement confirms that, in order to improve quality, it is not enough to define the concept of quality in higher education and follow general principles. Although that can be part of the process, it is also important to define the requirements and the essential changes according to the field of education. Needs may vary considerably from a literature department to a medical faculty, from natural science to engineering. The general needs might be the same, but it is helpful to look at the higher education institute in detail, according to the field of study, as well as at the general educational and social requirements.

Recently, some research has been done on engineering education. Engineering is a special field of study which requires academic knowledge as well as practical capabilities, and there are specific requirements for engineering education which can guarantee its quality. Examining quality indices in higher education regardless of the field of education is not useless, but it leaves many ambiguities. Gola (2005), defining quality in engineering education on a European scale, describes a “minimum set” of evaluation requirements. He divides the evaluation items into four main dimensions: requirements and objectives; teaching and learning; learning resources; and analysis and review. Each item breaks down into more detailed elements which should be considered separately. A short description of each factor follows:

Requirements, Objectives: this aspect is related to external parties. The requirements of any party concerned with the education should be considered; for instance, employment opportunities for graduates can be one of the requirements/objectives factors.

Teaching, Learning, Assessment: this aspect can be broken down into the content of the program, teaching material and methods, assessment methods, etc. It is important to make sure that teachers use up-to-date methods and are informed about new strategies and policies in their field of teaching.

Learning Resources: this factor is not limited to the facilities required for education, such as laboratories, libraries and online resources. Learning resources in this definition also include the educational standard of the staff and teachers; academic support, guidance and counselling, and even welfare services are counted under learning resources.

Monitoring, Analysis, Improvement: a list of data should be collected and analyzed to support the improvement process; this list contains student progression, graduates' opinions, the rating of graduates on the job market, general student satisfaction, etc.

Any assessment which is supposed to lead to improvement should contain these aspects. So it is very important to design and manage the assessment and improvement process considering all of the above items.

2.4.1. CDIO INITIATIVE Framework

The CDIO INITIATIVE is a framework for reforming engineering education. It was first launched with the support of universities in Sweden and the USA with the goal of improving engineering education in those countries; however, it is now a worldwide framework. The framework was intended to solve the problems graduate engineers face in their careers: “Industry in recent years has found that graduating students, while technically adept, lack many abilities required in real-world engineering situations” (cdio.org). Reflecting the concept of engineering education, CDIO is an abbreviation of Conceiving-Designing-Implementing-Operating. The framework has 12 standards to be followed, and these 12 standards are focused on these four main contexts. The philosophy of the program, syllabus outcomes and curriculum, integrated learning experiences and design-build experiences are some of the standards, all of which focus on the content of the education.

However, Berggren et al. divide the main strategies for implementing CDIO into four themes:

• Curriculum reform: This reform should ensure the development of knowledge, design ability, and skills

• Teaching and learning improvement: Special improvement is essential in technical skills and active learning experiences

• Experiential learning environment: The syllabus should be developed so that it includes enough laboratory work and workshops

• Assessment methods improvement: Assessment methods should be qualified to assess the quality and the improvement (Berggren et al. 2003, p 49)

There is some advice on the official website, http://www.cdio.org, for institutes that want to start a CDIO program. This advice suggests five main keys to starting the organizational change: “Five of the items are commonly thought to be key in bringing about any organizational change: rationale, committed leadership, interested early adopters, appeal to professionalism, and resources and incentives. These should be very carefully considered and incorporated into program plans.”

The CDIO framework is focused on the program and the content of engineering education. There should be complementary instruments to evaluate and improve the whole institute besides CDIO.

2.4.2. Engineering Programs Evaluation in Sweden and CDIO INITIATIVE

The Swedish National Agency for Higher Education (Högskoleverket, HSV) is a public authority responsible for the quality of higher education in Sweden. HSV has a qualitative evaluation program which takes place every six years. The evaluation has different purposes; some of the main ones are to review quality and monitor the development process, to achieve higher standards in global higher education, and to provide prospective students with sufficient information.

The evaluation has three steps. The first step is a self-assessment to assess and analyze the program; the questionnaire in this assessment has a common base which is stated by HSV. The second step is site visits, a face-to-face evaluation to confirm the previous results from the self-assessment. The third and last step is a follow-up, which is made to ensure the effectiveness of the evaluation and its recommendations.

HSV added an external evaluation tool when evaluating civil engineering programs in 2005. It decided that the CDIO self-assessment could be used as an overall evaluation tool at the program level and as an external instrument for the second step, the external review. The CDIO standards can also be an appropriate basis for the follow-up step. (Malmqvist et al. 2005, p4)

The CDIO framework was modified by a team at HSV; for instance, all the standards and assessments were translated into Swedish. This survey, focusing on the implementation of CDIO in civil engineering programs, demonstrated the framework's capability for improvement: “Survey and interview results indicate that the standards are relevant and applicable for a wider range of programs than have earlier used the standards and, that making changes towards implementing the standards would improve program quality.” (Malmqvist et al. 2005, p13)


3. Theories and Methodologies

3.1. Theory and Methodology of the PDCA Method

The Deming Wheel can be used as a basic tool or methodology for any system or organization with a plan for continuous improvement.

According to the Deming Wheel cycle, there are four main steps to improve quality:

Plan: the first step is to make a plan for improvement by recognizing the problems and offering ideas for solutions.

Do: the changes should be implemented to check whether or not they will work.

Check: the results should be checked continuously to examine the effects of the changes on quality and also to identify new problems.

Act: implement the changes according to the previous results, involving everyone who is affected by the changes. (Deming, 1986)

This project is planned on the basis of the Deming Wheel cycle. In the Plan and Check steps, assessment methods are essential to evaluate the improvement and achievement; the next section introduces the assessment tools on which the university assessments are based. A schematic sketch of one PDCA iteration is given below.
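Purely as a schematic illustration (not part of the original methodology description), one PDCA iteration can be pictured as a loop in which each Check feeds its findings back into the next Plan; all function names and data below are placeholders.

```python
# Schematic sketch of the PDCA loop; every function is a placeholder for the
# activities described in the text, not an actual implementation.

def plan(previous_findings):
    """Recognize problems (e.g. via an EFQM self-assessment) and propose solutions."""
    return {"actions": ["improve leader/coordinator communication"], "based_on": previous_findings}

def do(planned):
    """Implement the proposed changes on a trial basis."""
    return {"implemented": planned["actions"]}

def check(done):
    """Evaluate the effects of the changes (e.g. via a student survey)."""
    return {"worked": done["implemented"], "new_problems": ["laboratory resources"]}

def act(checked):
    """Standardize what worked and carry the open problems into the next cycle."""
    return checked["new_problems"]

findings = []
for cycle in range(2):          # e.g. two consecutive academic years
    findings = act(check(do(plan(findings))))
    print(f"Cycle {cycle + 1}: carried forward {findings}")
```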

3.2. EFQM Self-Assessment

3.2.1. EFQM History and Background

A clear description of an excellent organization is “the 4P” defined by Dahlgaard & Dahlgaard (1999). The 4P consists of (1) people, (2) partnerships, (3) processes and work, and (4) products. The 4P model says that excellence is achievable if we pay attention to these factors in the organization, and that the most important of them is people.

One way for organizations to meet the challenge of achieving excellence and high performance is the framework of the EFQM. The European Foundation for Quality Management model was introduced as the framework for the European Quality Award in 1992 (Chee Hee Ew et al., 2002); “The Foundation is in the tradition of the American Malcolm Baldrige Award and was initiated by the European Commission and 14 European multi-national organizations in 1988” (Nabitz, 2000). Comparing EFQM with other quality improvement models, we find many similarities between EFQM and other quality awards and models, including the Deming Prize in Japan and the assessment model of the American Malcolm Baldrige National Quality Award (1999). The EFQM model was first introduced in 1992 and has been reviewed annually; the principles have remained the same, but some changes have been made.

3.2.2. The EFQM Model Definitions and Applications

EFQM is a model which can be applied to any type of organization, regardless of sector, size, maturity, etc. The model consists of nine criteria; five of the nine are enablers and four are results (Dahlgaard & Dahlgaard, 2004). The enablers are leadership, policy and strategy, people, partnerships and resources, and processes. The model presents the results of excellence in terms of people results, customer results, society results and key performance results. It is obvious that this model pays attention to all stakeholders of the organization. Excellence is not just a theory or a concept; excellence is an achievement of an organization through paying attention to all its stakeholders. Excellence not only leads to continuous improvement but also creates a good atmosphere in the organization.

The model can be difficult to understand and is also complex in practice (Dahlgaard & Dahlgaard, 2004). Another application of EFQM is self-assessment; in other words, an organization that does not aim to win the quality award can still benefit from the self-assessment functionality of EFQM.

Self-assessment is a different approach from traditional management assessments: in self-assessment the company identifies its competitive capabilities, while traditional approaches evaluate the conformity of the company's performance with its rules (Conti, 2001; Karapetrovic and Willborn, 2001). Self-assessment also identifies the areas in which there are possibilities for improvement (Van der Wiele et al., 1996). The European Foundation for Quality Management (EFQM, 1999) defined self-assessment as “A comprehensive, systematic and regular review of an organization’s activities and results against a model of business excellence”.

Zink & Schmidt give a very clear and useful definition of the EFQM criteria. The table below describes these criteria and the maximum points for each criterion.

Description of the EQA criteria (Zink & Schmidt, Total Quality Management, Vol. 6, Nos 5&6, 1995, p. 551)

Group | Criterion | Specification | Max. points
Enablers | Leadership | The behaviour of all managers in driving the organization towards total quality. | 100
Enablers | Policy and strategy | The organization's mission, values, vision and strategic direction and the ways in which the organization achieves them. | 80
Enablers | People management | The management of the organization's people. | 90
Enablers | Resources | The management, utilization and preservation of resources. | 90
Enablers | Processes | The management of all the value-adding activities within the organization. | 140
Results | Customer satisfaction | What the perception of external customers is of the organization and of its products and services. | 200
Results | People satisfaction | What the people's feelings are about their organization. | 90
Results | Impact on society | What the perception of the organization is among society at large, including views of the organization's approach to quality of life, the environment and the preservation of global resources. | 60
Results | Business results | What the organization is achieving in relation to its planned business performance. | 150
Total | | | 1000
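To illustrate how the point weighting works, the sketch below turns assumed per-criterion achievement levels (as fractions of “excellent”) into an overall score out of the 1000 points listed in the table. The achievement fractions are invented for illustration; only the maximum points come from the table.

```python
# Sketch: turning assumed per-criterion achievement fractions into an overall
# EFQM/EQA score out of 1000, using the maximum points from the table above.
# The achievement fractions are invented for illustration.

max_points = {
    "Leadership": 100, "Policy and strategy": 80, "People management": 90,
    "Resources": 90, "Processes": 140, "Customer satisfaction": 200,
    "People satisfaction": 90, "Impact on society": 60, "Business results": 150,
}
assert sum(max_points.values()) == 1000

achievement = {  # hypothetical self-assessment outcome, as a fraction of "excellent"
    "Leadership": 0.45, "Policy and strategy": 0.60, "People management": 0.55,
    "Resources": 0.50, "Processes": 0.65, "Customer satisfaction": 0.55,
    "People satisfaction": 0.60, "Impact on society": 0.70, "Business results": 0.50,
}

total = sum(max_points[c] * achievement[c] for c in max_points)
print(f"Overall score: {total:.0f} / 1000")
```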


The main question regarding EFQM here is: can we use this tool in universities? Zink and Schmidt believe that EFQM can be used to improve quality in higher education: “true, a university is different from a ‘normal’ business, but that has not too much impact on the application of TQM” (Zink & Schmidt, 1995, p.547). EFQM has been used as one of the main quality tools in many European universities. “In the Netherlands and many other European countries we have seen a growing interest in the Excellence Model as developed by the European Foundation for Quality Management (EFQM)… Seven institutes for higher education in the Netherlands called the HBO-Expert group, developed a version of the model for higher education that has been translated into English, French, German, Spanish, Latvian, Czech and even into Vietnamese” (Van Kemenade, 1999, 2004, p.547).

In this research, EFQM is applied at Borås University as a tool for self-assessment to achieve higher performance.

3.3. Student Feedback

Nowadays the feedback of students, as one of the main stakeholder groups, is one of the most important considerations in higher education quality management. There are different ways to get feedback from students, and most higher education institutions have their own and/or national ways of collecting it. Lee Harvey describes student feedback as follows:

“Feedback in this sense refers to the expressed opinions of students about the service they receive as students. This may include perceptions about the learning and teaching, the learning support facilities, the learning environment, support facilities and external aspects of being a student.” (Harvey, 2003)

There are different ways to collect student feedback: formal and informal feedback, discussion or questionnaire forms, agreement or satisfaction rankings, etc. The most common form of feedback collection that provides reliable data for quality improvement is the questionnaire. Questionnaires can use open or multiple-choice questions and can be online or paper-based; the choice between these forms depends on the purpose of the questionnaire. Harvey believes that “Questionnaire-based feedback is usually in the form of ‘satisfaction’ surveys” (Harvey, 2003, p3). He divides satisfaction surveys according to the institutional level at which the survey takes place: institution level, faculty level, program level, module level and teacher appraisal.

Although most higher education institutes have one or more kinds of student survey, not all of them seem to be heading in the right direction for improvement or problem solving.

“It is not enough to collect data. We need to examine data, share them and act accordingly, with the purpose of improving the quality of teaching and learning.” (Gaspar et al., 2008, p. 446)

According to Gaspar et al., there are three different levels in the process: collecting data, examining data, and acting accordingly. A lack or failure at any of these levels may make the whole attempt useless. In many institutions the student survey is part of the routine bureaucracy; they have collected feedback in the same old way for years. A wrong way of collecting data, whether it is a wrong questionnaire, a wrong level, or a lack of commitment in gathering the surveys, is the main reason that student feedback is not taken into consideration. In some institutes there is a proper way of collecting feedback, but the results are not analyzed; it is not enough to gather the data, it is also important to analyze them, find the main gaps, and identify the first and most important actions that should take place. Some institutes have problems turning the results into action: the right decision should be made at the right time and prompt action taken according to the results. The latter is one of the main problems in many institutes: “Most institutions indicate that it is difficult to pinpoint specific action resulting from the survey findings.” (Harvey, 2003, p11)

3.3.1. How to transfer Student Voice to Quality index?

“Proper Student Satisfaction Survey”

“Collecting feedback from students on their experiences of higher education has become one of the central pillars of the quality process. Many surveys are being carried out but it is not always clear how fit for purpose they are.” (Williams & Cappuccini, 2007, p159)

When talking about the student voice, it is important to have a proper tool to translate this voice into a quality index. Otherwise, collecting feedback is a waste of time, money and energy and can also disappoint students, which may prevent them from answering the questionnaires properly or lead them to ignore them altogether. It is very important to collect this feedback in a way that suits the purpose, and even more important to respond to it: students should be able to see that the time they spend on the questionnaire affects the improvement process at their institute.

In the case of the Borås Engineering School, the questionnaire should be discussed in two different dimensions: one is the level at which the survey should take place, and the other is the form of the questionnaire.

Although it might seem helpful to run surveys at the institution, faculty and program levels in this case, doing so would demand extra time and energy both from the students who fill in the questionnaires and from those who analyze them. Besides, a well-structured and fit-for-purpose questionnaire at one level can make surveys at other levels unnecessary. In this case the focus is on the engineering faculty of the institute, so a faculty-level survey seems to fit the purpose. It is, however, important to consider that an institution-wide survey might clash with the faculty survey and lower its efficiency. Harvey clarifies: “where both co-exist, it is probably better to attempt to collect faculty data through qualitative means, focusing on faculty-specific issues untouched by institution-wide survey.” (Harvey 2003, p13)

The form of the survey is also very important and should support the quality improvement process; it can vary depending on the purpose. Below is a comparison of two main forms of survey which are regarded as standard forms. Although the NSS is a British national questionnaire, it claims to meet all questionnaire standards, and many surveys all over the world follow its design as a basis for their own questionnaires. Comparing these two main forms of questionnaire can lead us to a proper form for this case.

NSS vs SSS Questionnaire

The Higher Education Funding Council for England (HEFCE) decided to collect national feedback on the quality of higher education for two main purposes: the first was to get sufficient data for quality improvement, and the second was to publish the results for prospective students and other stakeholders. The survey was designed on the basis of an Australian national survey called the CEQ. The National Student Survey (NSS) in England was first carried out in 2003, and the first official survey was done in 2005. The NSS is an annual survey aimed only at final-year students. (Williams & Cappuccini, 2007, p160)

Although the NSS is meant to be the main framework for quality improvement at the national level, institutes still prefer to have their own local surveys and to plan their own improvement principles.

“One of the more popular approaches to collecting feedback from students within individual institutions is the Student Satisfaction Approach… The questionnaire used in this approach is known as the Student Satisfaction Survey (SSS)” (Williams & Cappuccini, 2007, p160)

While the SSS questionnaire covers a wide range of quality indices, the NSS focuses on a limited set of issues based on its purpose. Although both of these surveys are widely used in England and even in other countries, each has its own incompatibilities with the faculty satisfaction survey which is the aim of this research. The NSS survey focuses on seven main areas: management, assessment, teaching, support, personal development, resources, and overall satisfaction. The advantage is that the questionnaire is short and summarized: the number of questions is limited to 22 and the answers are on a five-point agreement scale, so it does not take much time and effort for students to fill in. At the same time, there are many issues that are not considered or not well covered in the questionnaire; for instance, the learning resources section is limited to three general questions, and job opportunities for graduates are not considered at all.

The SSS questionnaire covers more issues and is more detailed on each issue. With almost 177 questions (depending on odd/even years), it covers almost all of the areas needed for quality improvement locally or institutionally. The SSS breaks higher education issues up into 14 different sections, including overall satisfaction, and a seven-point scale answer table is filled in for both importance and satisfaction. The SSS is apparently more compatible with a local quality improvement purpose, but in some areas it is more detailed than is needed for a faculty survey. Also, being so long and having so many questions makes it less likely to be answered by many students. In general it is better suited to long-term quality improvement at the institutional level than to realistic feedback collection in a part of an institution which needs rapid changes for improvement.

As described before, both questionnaires have their own advantages and weaknesses. To choose or design a proper questionnaire, it is important to focus on the purpose of the survey. Some of the main purposes of designing a proper questionnaire for the Borås Engineering School are listed below:

• To find out what students value most in their education
• To recognize the main improvements needed and the areas that can be postponed because of lower priority
• To rate student satisfaction, especially in the last year of their education
• To find out what students, as one of the main stakeholder groups, need and to what degree those needs are fulfilled
• And finally: to determine what the next step in the improvement process should be

It seems that these purposes are closer to those of the SSS, but the SSS questionnaire still needs to be adapted to the requirements. Some of its issues are not useful in a faculty-level survey, the number of questions should be reduced so that more students are encouraged to answer, and finally it should be fitted to this particular faculty, which is an engineering faculty. In 2006/2007 the University of Essex in the UK designed a combined NSS-SSS questionnaire; it includes 50 questions but only covers the agreement dimension. Based on this combined questionnaire, a questionnaire has been designed to fit the Borås Engineering School. The institution-level questions are reduced and some questions are replaced with more important issues such as careers. Both importance and agreement are considered in the questionnaire. The questionnaire is attached in Appendix B.

3.3.2. Validity and Reliability of the Survey

Although a well-designed questionnaire can be a useful tool for collecting acceptable data, it cannot guarantee the validity and reliability of the survey. To explain the validity and reliability of the survey, it makes sense to define these two words first. The Oxford online dictionary defines them as:

Validity: “The quality of being logically or factually sound, soundness or cogency”

Reliability: noun form of reliable “Consistently good in quality or performance, able to be trusted”


According to the Oxford online dictionary definitions, we can summarize validity as soundness in quality and reliability as trustworthiness in quantity.

“Validity is established if an instrument actually provides a measure of what it purports to measure” (Kember & Leung, 2008, p342). It is difficult to measure a concept that is not defined clearly. As explained before, there are many definitions of higher education quality criteria and some concepts are still ambiguous. In order to have a valid questionnaire, the principles should be defined, the field of measurement should be clarified and the subjects to be focused on must be described. The clearer the terms in the questionnaire, and the more the subjects relate to the main purpose, the more validity is earned.

Reliability is a more quantitative value. It can be considered both in terms of the number of questions in the questionnaire and the number of responses available at the end of the survey. A more detailed questionnaire, with more items and more scale options in the answers, makes the survey more reliable, but it is obvious that a very long questionnaire repels more than it attracts. At the same time, more responses mean more feedback, which is essential for reliability and for the quality improvement process. So a balance between length and reliability should be maintained. One common numerical indicator of reliability is sketched below.
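As an aside not taken from the sources cited above, one common way to quantify the internal-consistency aspect of reliability is Cronbach's alpha. The minimal sketch below computes it for an invented matrix of five-point responses (rows are respondents, columns are questionnaire items).

```python
# Minimal sketch of Cronbach's alpha as one indicator of questionnaire
# reliability (internal consistency). The response matrix is invented:
# rows are respondents, columns are items on a 1-5 scale.
import numpy as np

responses = np.array([
    [4, 5, 4, 3],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 2],
    [4, 4, 3, 4],
])

k = responses.shape[1]                          # number of items
item_vars = responses.var(axis=0, ddof=1)       # variance of each item
total_var = responses.sum(axis=1).var(ddof=1)   # variance of respondents' totals

alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)
print(f"Cronbach's alpha: {alpha:.2f}")         # values near 1 suggest consistent items
```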

4. PDCA Cycle Implemented in the Borås Engineering School

“What an organization needs is not just good people; it needs people who are improving with education” (Deming, 1986)

To implement the PDCA cycle in the University College of Borås, there should be increased obligation and commitment to follow the improvement steps. The best way to implement the PDCA cycle is to fit it into a definite period of time. The whole cycle can fit an academic year, a semester or any other kind of academic period; an academic year is a logical period for taking all the steps of the PDCA cycle. But before deciding where to start and when to end the cycle, we should define the steps for the Engineering School of Borås.

The Plan step, as defined before, is to make a plan for improvement by recognizing problems and offering ideas for solutions. This means that in order to plan, we need to recognize the problems first, so an assessment is essential to evaluate the current situation and identify the first change or improvement that should be made in the system. An EFQM self-assessment in the Plan step can give us a general view of the system's current situation. This point of view, which is focused on the people responsible for quality, can lead us to the main root problems in the system. Although the result of the EFQM assessment may be quite general, it still gives us some hints about where to start.

The next point is WHEN to do this self-assessment. An EFQM assessment can be done at any time in any institute, but to get the best results here, the best time during the academic year is the end of the spring semester. Some of the reasons are, first, that new personnel have had enough time (the whole academic year) to evaluate the system, and second, that in the continuous improvement process the leaders have enough time during the summer to plan for the next year and a whole semester to try and test their solutions and/or new strategies.

The second step of the PDCA cycle is Do. The Do step is defined as 'changes should be implemented to check whether or not they will work'. The first actions should be planned according to the EFQM self-assessment results. Top leaders and faculty managers can set their principles together with the quality team who assess the EFQM results. Though EFQM is a simple assessment, it is important to analyze the results in a proper way. As will be explained later, the main gaps should be determined in the first year, but these gaps may remain in the main areas, which are usually leadership and resources. So in the following years of the continuous improvement process it is not enough to analyze the annual results; it is important to compare them with the results from previous years and review the rise and fall of the results in the different areas.


The first semester or autumn semester is the best time for the action to take place. Leaders have had the whole summer to prepare for the changes. There can be sessions and even courses if needed to share new principles with staff. During the first semester in the academic year, the changes can be implemented and the results can be examined by the leaders and managers.

At the end of the Do step, there should be a checking process. The first assessment was at the employee level, and the plan was designed according to that assessment. But if the assessment goes well, and the plan is followed according to the assessment, the results should appear at all levels of the institute, so we can check the system at another level. A student assessment may be useful to check the results. There are two benefits in having a student assessment at this step. The first is to check the results of the previous plans from another stakeholder's (the students') point of view. The second is that hearing the student voice can help us find other gaps and problems which were missed in the previous survey. “As paying ‘customers’, students may expect to be asked their opinion of the varying aspects of their chosen higher education institution, as well as to be informed what actions have resulted from the collection of their views.” (Williams & Cappuccini, 2007, p167) It is helpful to have a tool to examine the changes from the point of view of the institute's customers (the students), to make sure that the improvement process is going in the right direction.

This step can be done at the end of the first semester of the academic year. There are some advantages in taking the student assessment at the end of the first semester. The first is that students are still more involved with their education compared with the second semester. The second is that course managers, resource managers and other managers who may be involved with the results of the assessment will have enough time to plan and prepare improvements for the next academic year.

The Act step can be designed according to the second assessment, that is, the student assessment. While the leaders are trying to continue the planned changes at the employee level, there can be other changes at the academic level. These changes can go on in parallel with those made in the Plan step. They may overlap in some fields, but they can still go on without causing problems in the improvement process.

The figure below shows the PDCA cycle specialized for the Borås Engineering School:

[Figure: the PDCA cycle for the Borås Engineering School. Plan: EFQM self-assessment with an employee focus. Do: improvement in progress, focused on the EFQM results. Check: student voice, at the end of the first semester. Act: improvement in progress focused on the student voice, with direct actions taken based on the assessment at the end of the second semester. The results may occur in the next academic year.]


5. EFQM Methodology Implemented in the Borås Engineering School

As noted in section 3.2.2, EFQM has been used as one of the main quality tools in many European universities; seven institutes for higher education in the Netherlands, the HBO Expert Group, developed a version of the Excellence Model for higher education that has been translated into English, French, German, Spanish, Latvian, Czech and even Vietnamese (Van Kemenade, 1999, 2004).

5.1. Self-Assessment

In order to do the self-assessment, we chose a right-to-left approach. The most important result was the customer results, but customer satisfaction is achieved through the people results, so we took the people results into consideration; the students, as the most important customers, have also been involved in the research. Of course the university has more customers, but they are outside the scope of this research. In other words, the people working at the university and the students studying there have been the most important sources of input to the self-assessment in this study.

In order to obtain good and appropriate input to the results, we designed a questionnaire. It consisted of five major parts, each assigned to one or two enablers of the model.

For each enabler we formulated a number of statements. Special care was taken in wording them, because it is easy to write a statement that creates ambiguity, both for the person filling out the questionnaire and for the person analyzing the results.

Each statement had two dimensions, as follows:

Importance: how important the statement is from the perspective of the person filling out the form.

Agreement: to what extent the statement is true at the university.

Both dimensions were ranked from one to five, where 1 was the lowest value and 5 the highest.

Consider the following statement, which belongs to the Processes field:

“Processes are improved continuously”

Imagine that someone gives the value 4 as the importance degree and 2 as the agreement degree. This means that the person regards continuous improvement of the processes as a very important issue, but, in that person's view, it is not being done in practice as much as it should be.
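The gap analysis used here is simple enough to express in a few lines of code. The sketch below is illustrative only (it is not the analysis tool actually used in this study); the respondent data and criterion names are made-up assumptions, but the calculation follows the I − A logic described above.

```python
from statistics import mean

# Illustrative (made-up) responses: criterion -> list of (importance, agreement)
# pairs, one pair per statement answered by one respondent.
responses = {
    "Processes":  [(4, 2), (5, 4), (3, 3)],
    "Leadership": [(5, 3), (4, 2), (5, 2)],
}

def gap_table(responses):
    """Average gap G = I - A per criterion, as in the result tables below."""
    return {
        criterion: round(mean(i - a for i, a in pairs), 2)
        for criterion, pairs in responses.items()
    }

print(gap_table(responses))   # e.g. {'Processes': 1.0, 'Leadership': 2.33}
```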

The questionnaires were prepared and sent to the staff of the Engineering School of Boras University College in order to obtain a statistically reliable result (Appendix A).

5.2. Results

The final result of the research is supposed to lead us to the main gaps and give some clues about where to start problem solving and/or improvement. In this particular study, the questionnaire results are shown in the table below (the complete result chart is available in Appendix C):

I: Importance A: Agreement G: gap (I-A)


Subject                                  G = I - A
Leadership                               1.8
Policy and strategy                      0.8
People                                   0.7
Partnership and Resources                1.2
Processes                                0.4
Student and People results               0.8
Society and Key performance results      0.25

It is clear that the main gaps lie in two fields: Leadership, and Partnership and Resources.

Using the EFQM tool clarifies the main root problems in the organization. Still, to confirm the findings and establish a starting point, some interviews were carried out with programme coordinators and lecturers. Mrs. M, a programme coordinator, believes that the main problems of her programme could be solved if the coordinators could contact their managers regularly. She points out that they are especially disconnected from the top leaders: there are no meetings or other forms of communication where problems can be raised and discussed, and being in contact with only one level of management does not always seem to be enough. Mr. Ali complains about the lack of human resources: “There are some plans for improving the courses, but there is no professor or qualified lecturer to provide them,” he says. Another lecturer said that the lack of resources is his main problem; he needs both financial support and the leaders' support to achieve the course goals, and he has had many difficulties contacting companies and factories to arrange guest lectures. Contact with some of the top leaders convinced me that there really are communication problems in the system: one of them had no idea about the course evaluation system or how it is used for improvement, believing that it is handled automatically.

The interview results confirmed the outcome of the EFQM method. Improving leadership will have the first effect on the quality of the whole system. Improving leadership strategies, and educating leaders to follow those strategies, is the most important factor in improving quality in all sectors of the organization. Improving leadership raises quality not only because leadership is itself one of the quality factors, but also because it gives people a genuine wish and motivation to participate in the improvement.

These changes and improvements may seem clear, understandable and easy to carry out in practice, but it is very important to grasp the real concept of EFQM. EFQM, both in the self-assessment process and in implementing changes, is not only an improvement process; it should be treated as a culture change in the whole system. Both leaders and staff need a strong commitment to complete the assessment and to stick to the essential changes until the organization's goals are achieved.

“It is clear that the implementation of a TQM initiative such as the EFQM Excellence Model involves a culture change and the culture realities of an organization need to be understood” (Davies et al., 2007, p. 382).

(Please note that this questionnaire was answered by only 7 employees, so the results may not be statistically reliable.)


6. Student Satisfaction Survey Implemented in Boras Engineering School

6.1. Assessment

As explained before, the student assessment is intended to take place at the end of the first semester. This timing is suitable because students feel more responsibility while they are still closely involved with their education, compared with the end of the second semester.

The more responses we collect, the more reliable the assessment becomes. To get the best result for the improvement process, it is therefore wise to inform students about the assessment, either face to face in class or by email. The advantage of talking to students face to face in class is that it becomes clear to them how to fill in the assessment, and they also feel more responsible for completing it.

The questionnaire designed for this purpose contains 10 different issues, each with a number of statements that explore the issue in depth, plus a final statement about overall satisfaction. Each statement has the same two dimensions as the EFQM assessment, importance and agreement: after reading each statement, students rate their agreement with it and then the degree of importance it has from their point of view.

Both dimensions were ranked from one to five, where 1 was the lowest value and 5 the highest.

6.2. Results

The final result of the student assessment points to the main quality gaps from the students' point of view. To achieve the best results, we can focus on the two largest gaps when starting the improvement.

To identify the main gaps, that is, the main fields that need improvement, it is enough to apply the formula below and average the results from all students.

Gap = Importance − Agreement
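For example, using the averages later reported in Appendix D, the 'Careers' statements have an average importance of 4.9 and an average agreement of 3.4, so the gap is 4.9 − 3.4 = 1.5, which is the value shown for Careers in the table below.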

The table below shows the results in this case study. According to it, we can highlight the main issues that need immediate improvement. Here the two main issues are, first, careers and, second, course delivery. The third issue, organisation and management, is almost in the same range; although we may not choose it as a main focus of improvement, we can still hope to see some improvement there, since another improvement process aimed at the organization's people is already running through the EFQM assessment. The next large gap is degree content. This issue is closely related to course delivery, so improving course delivery can be expected to affect it as well.

The table shows that the students' main issue is careers. They need more support and advice from the institute to find the right career or the opportunity to continue their studies. This need typically appears in the last year of study, when students are hopeful and at the same time stressed about finding a suitable job in which they can use what they have learned. Improving this issue does not only increase current student satisfaction; it can also have a positive effect on prospective students.


Subject                          G = I - A
Degree content                   0.9
The teaching on my degree        0.5
Assessment and feedback          0.3
Academic support                 0.7
Organisation and management      1.0
Learning resources               0.2
Personal development             0.3
Course Delivery                  1.2
Learning Community               0.2
Careers                          1.5
Overall Satisfaction             0.9

It is interesting to compare this table with the one obtained from the staff self-assessment. 'Partnership and resources', which was one of the main issues in the people assessment, is one of the strengths from the students' point of view. Resources may be defined differently in the two assessments, but the contrast shows how essential it is to have both views.

The main gap in the people assessment was leadership, which can be related to one of the main gaps here, organisation and management. It shows that changes are needed in leadership and management at all levels of the institute. If the right improvements are implemented in the leadership strategies, we can expect the organisation and management issue to improve as well.

(Please note that this questionnaire was answered by only 30 students, so the results may not be statistically reliable.)

7. Conclusion

Improving quality in higher education is one of the most complicated issues in the quality improvement field. In this research the concept of quality in higher education was defined and quality improvement tools were introduced. Self-assessment is one of the main instruments that can help leaders find the root problems and get an idea of where and how to start. Two assessments were used as evaluation instruments in this research: the EFQM self-assessment and the student assessment.

This research also suggests an expanded PDCA cycle adapted to continuous improvement in higher education. The cycle can be repeated every academic year; the assessment results of each cycle are therefore not judged only on their own, but can also be compared with previous cycles, which helps in evaluating the continuous improvement process.
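As a rough illustration of how results could be compared across cycles, the sketch below takes three of the gap values reported in section 5.2 together with hypothetical follow-up values for a second cycle; the follow-up numbers are assumptions made up for the example.

```python
# Gap tables (G = I - A) from two successive PDCA cycles.
cycle_1 = {"Leadership": 1.8, "Partnership and Resources": 1.2, "Processes": 0.4}  # from section 5.2
cycle_2 = {"Leadership": 1.1, "Partnership and Resources": 0.9, "Processes": 0.5}  # assumed follow-up values

for criterion, old_gap in cycle_1.items():
    new_gap = cycle_2[criterion]
    trend = "smaller gap (improved)" if new_gap < old_gap else "no improvement"
    print(f"{criterion}: {old_gap} -> {new_gap}  {trend}")
```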

Although finding the root problems and suggesting how to solve or improve them is always helpful, it can never by itself guarantee continuous improvement. Continuous improvement requires a strong commitment to the quality improvement process, which can include annual self-assessment, periodic meetings, motivating the stakeholders and, most importantly, building the vision of becoming the best. Committed, open-minded leaders, who are always open to positive change and improvement, have the main effect on continuous quality improvement in higher education organizations.


References

Nabitz, U., Klazinga, N. & Walburg, J. (2000) 'The EFQM excellence model: European and Dutch experiences with the EFQM approach in health care', International Journal for Quality in Health Care, 12, pp. 191-202.

Malcolm Baldrige National Quality Award (1999) Health Care Criteria for Performance Excellence. Gaithersburg: NIST.

Harvey, L. & Green, D. (1993) 'Defining quality', Assessment & Evaluation in Higher Education, 18, pp. 9-34.

Conti, T. (2001) 'Why Most Companies Do Not Get the Most Out of Their Self-Assessment?', American Quality Conference, 2001.

Karapetrovic, S. & Willborn, W. (2001) 'Audit and Self-assessment in Quality Management: Comparison and Compatibility', Managerial Auditing Journal, Vol. 16, No. 6, pp. 366-377.

Van der Wiele et al. (1996) 'Self-assessment: a Study of Progress in Europe's Leading Organization in Quality Management Practices', The International Journal of Quality and Reliability Management, Vol. 13, No. 1, pp. 84-104.

Dahlgaard, S.M.P. & Dahlgaard, J.J. (2004) 'The 4P Quality Strategy for Breakthrough and Sustainable Development', European Quality, Vol. 10, No. 4, UK.

Dahlgaard, J.J. & Dahlgaard, S.M.P. (1999) 'Core Value and Core Competence Development – a Pre-condition for achieving Business Excellence', Proceedings of the International Conference on TQM and Human Factors. Linköping: Linköping University, Sweden.

Wong, C.Y., Dahlgaard, J.J. & Kindlihagen, A. (2002) 'Self-Assessment on Hydro Extrusion's Continuous Improvement Process within the EFQM Excellence Model', Linköping University.

Van Kemenade, E., Pupius, M. & Hardjono, T.W. (2008) 'More Value to Defining Quality', Quality in Higher Education, Vol. 14, No. 2, pp. 175-185.

Logothetis, N. (1995) 'Towards a quality management of education', Total Quality Management & Business Excellence, Vol. 6, No. 5, pp. 479-486.

Zink, K.J. & Schmidt, A. (1995) 'Measuring universities against the European Quality Award criteria', Total Quality Management, Vol. 6, Nos. 5 & 6, p. 547.

Dervitsiotis, K.N. (1986) 'Applications of the Objectives Matrix for Productivity Improvement and Extensions', International Pacific Rim Conference, Institute of Industrial Engineers, February, Maui, Hawaii.

Dervitsiotis, K.N. (1995) 'The objectives matrix as a facilitating framework for quality assessment and improvement in education', Total Quality Management, Vol. 6, Nos. 5 & 6, p. 563.

Dew, J. (2009) 'Quality Issues in Higher Education', Journal for Quality & Participation, Vol. 32, No. 1, pp. 4-9.

Tonnquist, B. (2008) Project Management. Bonnier Utbildning.

Deming, W.E. (1986) Out of the Crisis. Cambridge, MA: Massachusetts Institute of Technology, Center for Advanced Engineering Study.


Stensaasen, S. (1995) 'The application of Deming's theory of total quality management to achieve continuous improvements in education', Total Quality Management, Vol. 6, Nos. 5 & 6, p. 579.

EFQM (1999a) Assessing for Excellence. Brussels: European Foundation for Quality Management.

Juran, J. (1989) Juran on Leadership for Quality. Macmillan.

Gaspar, M.F., Pinto, A.M., da Conceição, H.C.F. & Pereira da Silva, J.A. (2008) 'A questionnaire for listening to students' voices in the assessment of teaching quality in a classical medical school', Assessment & Evaluation in Higher Education, Vol. 33, No. 4, pp. 445-453.

Williams, J. & Cappuccini-Ansfield, G. (2007) 'Fitness for Purpose? National and Institutional Approaches to Publicising the Student Voice', Quality in Higher Education, Vol. 13, No. 2, pp. 159-172.

Davies, J., Douglas, A. & Douglas, J. (2007) 'The effect of academic culture on the implementation of the EFQM Excellence Model in UK universities', Quality Assurance in Education, Vol. 15, No. 4, pp. 382-401.

Kember, D. & Leung, D.Y.P. (2008) 'Establishing the validity and reliability of course evaluation questionnaires', Assessment & Evaluation in Higher Education, Vol. 33, No. 4, pp. 341-353.

Gola, M.M. (2005) 'Quality assurance in engineering education on a national and European scale', European Journal of Engineering Education, Vol. 30, No. 4, pp. 423-430.

Berggren, K.F., Brodeur, D., Crawley, E.F., Ingemarsson, I., Litant, W.T.G., Malmqvist, J. & Östlund, S. (2003) 'CDIO: An international initiative for reforming engineering education', World Transactions on Engineering and Technology Education, Vol. 2, No. 1, p. 49-.

Malmqvist, J., Edström, K., Gunnarsson, S. & Östlund, S. (2005) 'Use of CDIO Standards in Swedish National Evaluation of Engineering Educational Programs', 1st Annual CDIO Conference, June 2005.

http://www.hci.com.au/hcisite2/toolkit/pdcacycl.htm, HCI professional service, 2005.

http://hb.se/wps/portal, Högskolan i Borås, 5.6.2008.

http://www.topuniversities.com/university-rankings/world-university-rankings/methodology/simple-overview, Top Universities, World University Rankings, Methodology, 2009.

http://www.my-project-management-expert.com/who-is-a-stakeholder.html, Who is a stakeholder?, 2009.

http://www.hsv.se/qualityassurance/qualitysurveysamongstudentsandteachers.4.36556afd1261af125707ffe870.html, Swedish National Agency for Higher Education, 2010.

http://www.hefce.ac.uk/learning/nss/, HEFCE, National Student Survey, 2010.

http://oxforddictionaries.com/, 2010.

http://www.cdio.org/cdio-history, Worldwide CDIO Initiative, CDIO History, 2010.

http://www.cdio.org/implementing-cdio-your-institution/startup-advice, Worldwide CDIO Initiative, Expert advice on starting a CDIO program at your institution, 2010.

http://www.cdio.org/implementing-cdio/standards/12-cdio-standards, Worldwide CDIO Initiative, 12 CDIO Standards, 2010.


APPENDIX A

EFQM Questionnaire

Each statement (potential area to address) was rated on two five-point scales, Agreement and Importance, where 1 was the lowest value and 5 the highest; for Agreement, respondents could also mark "really don't know". The statements for each criterion are listed below.

Leadership

1. As an active stakeholder, I'm informed about the vision and mission of my division and of the whole system, and it has been clarified how to move to achieve them.
2. I'm asked to participate pro-actively to help my department in the development of our vision and mission.
3. As a stakeholder, I'm involved in contributing to develop the mission, vision, values, and goals.
4. I'm encouraged to act creatively to help my division to achieve the goals and visions.
5. The school's top leaders stimulate and encourage empowerment by giving division leaders the authority to make their own decisions for improvements.
6. The Engineering School's top leaders develop the organization and processes in accordance with the mission of the department.
7. There are yearly meetings with the leaders and top leaders to review the mission and vision of my division and the results achieved for improvement.
8. Top leaders encourage me to fill in the self-assessment questionnaire, which is done at least once a year to evaluate the development.


Policy and Strategy

1. The Engineering School establishes objectives and strategies to support the vision and mission.
2. There are short-term and long-term actions in this division to achieve the values and goals.
3. When reviewing the strategies and outlines, my division uses the students' and stakeholders' feedback and needs.
4. When reviewing the strategies and outlines, my division considers the current strengths and weaknesses and the staff's needs.
5. In the process of developing strategies, my division considers the needs of all staff, students, managers, and other stakeholders.
6. There are annual assessments to give an overview of the strategies and policies, and they are used to develop the policies.
7. I feel free to communicate with my managers to give them unofficial feedback on the processes and strategies.
8. The Engineering School continuously reviews the effectiveness of the key processes of its policies and strategies.


People

1. The Engineering School continuously evaluates whether the human resource plan is in line with the objectives and strategies established.
2. My managers motivate me to develop and use my full potential in achieving the best results on my objectives, according to my division's mission and vision.
3. The working conditions, including health, safety, environment, etc., are being improved continuously.
4. I take part in working condition improvements, such as health, safety, and ergonomics.
5. Managers and leaders in my division carry out periodic surveys to confirm staff satisfaction and motivation.
6. I continuously promote a culture of open communication and dialogue, to share knowledge and skills between individuals and divisions.
7. Staff satisfaction and people's participation in the objectives and goal-achieving process are considered among the main criteria for achieving the goals and values, and are determined by direct and indirect managers.


Partnership and Resources

1. My division establishes partnerships with students and other divisions to have up-to-date knowledge about their needs.
2. Different communication channels are promoted to share knowledge between the customers, partners, and leaders.
3. There are surveys on customer needs (students and other stakeholders) which are carried out periodically.
4. Surveys on customers' needs are used to improve the division's goals, values, and processes.
5. Surveys on future customers' (especially students') needs are used to improve educational programs and services.
6. I'm encouraged to have open relationships and communication to get feedback and knowledge from students and other colleagues.
7. I feel free to let my manager know about my needs, and I'm provided with the essential resources.
8. Resources are being improved in my division according to new internal and external customers' needs (staff and students).


Processes

1. Surveys are carried out periodically to make sure that the processes achieve student satisfaction.
2. Surveys are carried out periodically to make sure that the processes achieve staff and partner satisfaction.
3. The Engineering School is committed to using surveys to improve processes.
4. Processes are improved continuously to meet customers' satisfaction.
5. The education processes are managed and improved together with professors, managers, staff and students.
6. Enrolment and administrative processes are improved to meet staff and students' satisfaction.
7. I'm encouraged to give feedback and use my creativity to help process improvements.


Student and People Results

1. Student satisfaction regarding the processes and services has increased during the last year.
2. Student satisfaction regarding the educational process and courses given in the Engineering School has increased during the last year.
3. Student satisfaction regarding the resources provided by the Engineering School has increased during the last year.
4. The number of student complaints regarding the educational system and courses has decreased during the last year.
5. Communication with students has improved and the feedback process has become easier during the last year.
6. The number of students dropping courses has decreased during the last year.
7. Working conditions and services have improved for staff during the last year.
8. My satisfaction and motivation have increased during the last year.


Society and Key Performance (Business) Results

1. Partnerships with external stakeholders have increased during the last year.
2. External stakeholder satisfaction has increased during the last year.
3. The student graduation rate and on-time graduation have increased during the last year.
4. Student grades have increased in the courses given by the Engineering School.
5. The number of students who give up studying or don't graduate from their programs has decreased during the last year.
6. The number of externally available resources, such as libraries, labs, and other sources, has increased during the last year.
7. The accessibility of the available resources (the opening hours during the day and opening days in national holidays) has increased during the last year.
8. The availability of services and help desks has increased during the last year.


APPENDIX B

Faculty Student Satisfaction Survey

Each statement was rated on two five-point scales: Agreement (1 = disagree very much, 5 = agree very much) and Importance (1 = not important, 5 = very important).

Degree content
1. The content of my degree matches my expectations.
2. The content of my degree enables me to acquire knowledge and understanding of the subject.
3. The learning and teaching methods used in my degree are appropriate.
4. The workload for my degree is appropriate.

The teaching on my degree
5. Staff are good at explaining things.
6. Teachers have made the subject interesting.
7. Teachers are enthusiastic about what they are teaching.

Assessment and feedback
8. Information about what is required to pass courses/modules and obtain a particular class of degree is clear.
9. The criteria used for marking have been clear in advance.
10. Assessment arrangements and marking have been fair.
11. Feedback on my work has been prompt.


12. Feedback on my work has helped me to clarify things I did not understand.

Academic support
13. I have received sufficient advice and support with my studies.
14. I have been able to contact staff when I needed to.
15. Good advice was available from my department when I had questions about my studies.
16. The support and supervision I receive from academic staff enable me to study independently.

Organisation and management
17. The timetable works efficiently as far as my studies are concerned.
18. Any changes in the degree or teaching have been communicated effectively.
19. The degree is well-organized and running smoothly.

Learning resources
20. The library resources and services are good enough for my needs.
21. I have been able to access general IT resources when I needed to.
22. The printed materials and online documentation on my courses/modules give me the information I need.
23. I have been able to access specialized equipment, facilities or rooms when I needed to.

Personal development
24. The degree has enabled me to develop transferable skills, such as communication, group work, and IT skills.
25. My degree will support me in my prospective career, further study or other individual goals.
26. The degree has helped me present myself with confidence.


Course Delivery
27. Learning materials available on my course have enhanced my learning.
28. The range and balance of approaches to teaching has helped me to learn.
29. The delivery of my course has been stimulating.
30. My learning has benefited from modules that are informed by current research.
31. Practical activities on my course have helped me to learn.

Learning Community
32. I feel part of a group of students committed to learning.
33. I have been able to explore academic interests with other students.
34. I have learned to explore ideas confidently.
35. Within my course, I feel my suggestions and ideas are valued.
36. I feel part of an academic community in my university.

Careers
37. As a result of my course, I believe that I have improved my career prospects.
38. Good advice is available for making career choices.
39. Good advice is available on further study opportunities.

Overall Satisfaction
40. Overall I am satisfied with my experience as a student in the Engineering Faculty.

Looking back on your experience of studying, are there any particularly positive or negative aspects you would like to highlight?


Positive:

Negative:

Acknowledgement: This questionnaire is based on a combined SSS/NSS questionnaire designed by the University of Essex, UK (http://www2.essex.ac.uk/academic/offices/smo/).


Appendix C

EFQM Self-assessment Results

Leadership          Agreement   Importance   I-A
Question 1          3.1         5            1.9
Question 2          1.9         4.2          2.3
Question 3          3.5         3.8          0.3
Question 4          1.8         4.1          2.3
Question 5          2.3         3.9          1.6
Question 6          2.4         5            2.6
Question 7          3.4         5            1.6
Question 8          2.8         4.4          1.6
Average of Importance - Agreement: 1.775

Policy and Strategy Agreement   Importance   I-A
Question 1          3.2         4.1          0.9
Question 2          4.1         3.9          0.2
Question 3          3.2         3.4          0.2
Question 4          2.4         3.2          0.8
Question 5          2.4         4.1          1.7
Question 6          2.9         3.2          0.3
Question 7          2.4         4.8          2.4
Question 8          2.9         3.2          0.3
Average of Importance - Agreement: 0.8


People              Agreement   Importance   I-A
Question 1          3.2         3.2          0
Question 2          3.2         4.6          1.4
Question 3          4.2         4.6          0.4
Question 4          2.6         2.9          0.3
Question 5          2.4         3.6          1.2
Question 6          2.4         3.2          0.8
Question 7          2.4         4.2          1.8
Average of Importance - Agreement: 0.7375

Partnership and Resources  Agreement   Importance   I-A
Question 1          3.2         4.2          1
Question 2          3.4         4.2          0.8
Question 3          3.6         4.8          1.2
Question 4          2.8         4.8          2
Question 5          2.6         4.2          1.6
Question 6          4.2         4.8          0.6
Question 7          3.8         4.8          1
Question 8          3.2         4.8          1.6
Average of Importance - Agreement: 1.225

Processes           Agreement   Importance   I-A
Question 1          2.6         3.2          0.6
Question 2          2.4         3.2          0.8
Question 3          3.4         4.1          0.7
Question 4          4.2         4.6          0.4
Question 5          3.6         4.4          0.8
Question 6          4.2         4.4          0.2
Question 7          4.2         3.9          -0.3
Average of Importance - Agreement: 0.4


Students and People Results  Agreement   Importance   I-A
Question 1          3.8         4.6          0.8
Question 2          3.4         4.8          1.4
Question 3          4.2         4.8          0.6
Question 4          3.8         4.2          0.4
Question 5          3.2         4.6          1.4
Question 6          3.6         4.2          0.6
Question 7          4.2         4.4          0.2
Question 8          3.2         4.4          1.2
Average of Importance - Agreement: 0.825

Society and Key Performance Results  Agreement   Importance   I-A
Question 1          4.2         4.6          0.4
Question 2          3.8         4.2          0.4
Question 3          4.4         4.6          0.2
Question 4          4.2         4.4          0.2
Question 5          4.6         4.8          0.2
Question 6          4.2         4.4          0.2
Question 7          3.6         3.8          0.2
Question 8          4.4         4.6          0.2
Average of Importance - Agreement: 0.25


Appendix D

Faculty Student Satisfaction Results

STATEMENTS (A = Agreement, I = Importance)

Degree content  A 3.5  I 4.4  I-A 0.9
1. The content of my degree matches my expectations.  3.4  4.6  1.2
2. The content of my degree enables me to acquire knowledge and understanding of the subject.  3.6  4.8  1.2
3. The learning and teaching methods used in my degree are appropriate.  3.1  4.6  1.5
4. The workload for my degree is appropriate.  3.8  3.9  0.1

The teaching on my degree  A 3.5  I 4.4  I-A 0.9
5. Staff are good at explaining things.  4.0  4.2  0.2
6. Teachers have made the subject interesting.  3.9  4.5  0.6
7. Teachers are enthusiastic about what they are teaching.  3.8  4.4  0.6

Assessment and feedback  A 3.9  I 4.2  I-A 0.3
8. Information about what is required to pass courses/modules and obtain a particular class of degree is clear.  3.6  3.8  0.2
9. The criteria used for marking have been clear in advance.  4.2  4.6  0.4
10. Assessment arrangements and marking have been fair.  4.1  4.2  0.1
11. Feedback on my work has been prompt.  3.6  3.9  0.3
12. Feedback on my work has helped me to clarify things I did not understand.  4.1  4.4  0.3

Academic support  A 4.0  I 4.7  I-A 0.7
13. I have received sufficient advice and support with my studies.  3.9  4.8  0.9
14. I have been able to contact staff when I needed to.  4.1  4.8  0.7
15. Good advice was available from my department when I had questions about my studies.  4.2  4.7  0.5
16. The support and supervision I receive from academic staff enable me to study independently.  3.8  4.6  0.8

Organisation and management  A 3.3  I 4.3  I-A 1.0
17. The timetable works efficiently as far as my studies are concerned.  3.1  4.2  1.1
18. Any changes in the degree or teaching have been communicated effectively.  3.2  4.1  0.9
19. The degree is well-organized and running smoothly.  3.5  4.6  1.1

Learning resources  A 4.4  I 4.6  I-A 0.2
20. The library resources and services are good enough for my needs.  4.8  5  0.2
21. I have been able to access general IT resources when I needed to.  4.6  4.8  0.2
22. The printed materials and online documentation on my courses/modules give me the information I need.  4.2  4.6  0.4
23. I have been able to access specialised equipment, facilities or rooms when I needed to.  3.9  4.6  0.7

Personal development  A 4.1  I 4.4  I-A 0.3
24. The degree has enabled me to develop transferable skills, such as communication, group work, and IT skills.  4.1  4.2  0.1
25. My degree will support me in my prospective career, further study or other individual goals.  4.2  4.6  0.4


26. The degree has helped me present myself with confidence.  4.0  4.4  0.4

Course Delivery  A 3.0  I 4.2  I-A 1.2
27. Learning materials available on my course have enhanced my learning.  2.6  4.2  1.6
28. The range and balance of approaches to teaching has helped me to learn.  2.8  4.1  1.3
29. The delivery of my course has been stimulating.  3.2  4.4  1.2
30. My learning has benefited from modules that are informed by current research.  3.6  4.0  1.4
31. Practical activities on my course have helped me to learn.  3.1  4.2  1.1

Learning Community  A 3.4  I 3.6  I-A 0.2
32. I feel part of a group of students committed to learning.  3.6  3.8  0.2
33. I have been able to explore academic interests with other students.  3.6  3.6  0
34. I have learned to explore ideas confidently.  3.0  3.2  0.2
35. Within my course, I feel my suggestions and ideas are valued.  3.6  3.6  0
36. I feel part of an academic community in my university.  3.4  3.8  0.4

Careers  A 3.4  I 4.9  I-A 1.5
37. As a result of my course, I believe that I have improved my career prospects.  3.6  4.8  1.2
38. Good advice is available for making career choices.  3.2  5  1.8
39. Good advice is available on further study opportunities.  3.4  4.8  1.4

Overall Satisfaction  A 4.1  I 4.8  I-A 0.9
40. Overall I am satisfied with my experience as a student in the Engineering Faculty.  4.1  4.8  0.9