

Course Experience Questionnaire

2001

Graduate Careers Council of Australia

and

Australian Council for Educational Research


Acknowledgments

The Project Director for this report was Roger Bartley (Executive Director, GCCA).

The principal researchers were:

Dr John Ainley (Australian Council for Educational Research)

Dr Trevor Johnson

Dr Gerald Elsworth (RMIT University)

Bruce Guthrie (Research Manager, GCCA)

The Graduate Careers Council of Australia (GCCA) managed the Course Experience

Questionnaire, and worked with Australian universities to co-ordinate the collection of data

from recent graduates. The Australian Council for Educational Research (ACER) undertook

the analyses of the data and drafted this report on behalf of the GCCA.

The GCCA and ACER wish to acknowledge the role of the participating universities and in

particular research directors, survey managers and careers service staff who provided valuable

support to the project. To the graduates who completed survey forms we express our sincere

appreciation.

We also wish to acknowledge the support of the GCCA’s Survey Reference Group. The

advice and comment provided by staff from the Commonwealth Department of Education,

Science and Training (DEST) is also greatly appreciated.

The GCCA also wishes to thank DEST for funding the 2001 Course Experience

Questionnaire.

© 2002 Graduate Careers Council of Australia Ltd.

All rights reserved. No part of this publication may be copied or reproduced, stored in a

retrieval system or transmitted in any form or by any means electronic, mechanical,

photocopy, recording or otherwise without the prior written permission of the publishers.

Published by the Graduate Careers Council of Australia Ltd.

PO Box 28, Parkville, Victoria 3052

GCCA Switchboard: 03 8344 9333

Gradlink Helpdesk: 03 9349 4300

Facsimile: 03 9347 7298

Email: [email protected]

Web: www.gradlink.edu.au


Contents

Executive Summary ........................................................................................................ viii

Introduction ................................................................................................................ 1

Background ................................................................................................................. 1

Data ............................................................................................................................. 2

Issues in the Interpretation of CEQ Data .................................................................... 5

Comprehensiveness ................................................................................................ 6

Variability within Courses ...................................................................................... 6

Graduate Respondents ............................................................................................ 6

Response Scale ....................................................................................................... 6

Response Rates ....................................................................................................... 6

Summary ..................................................................................................................... 7

Patterns and Trends ....................................................................................................... 8

Responses to CEQ Items ............................................................................................ 8

Groups of Items or Scales ........................................................................................... 11

Trends ......................................................................................................................... 13

Summary ..................................................................................................................... 18

Influence of Graduate and Course Characteristics on the Good Teaching Scale and Overall Satisfaction ................................ 19

Characteristics of graduates ........................................................................................ 19

Fields of Study ............................................................................................................ 22

Universities ................................................................................................................. 24

The Good Teaching Scale ....................................................................................... 24

The Overall Satisfaction item ................................................................................. 26

Institutional Differences within Fields of Study ......................................................... 28

Initial Primary Teacher Education .......................................................................... 28

Psychology .............................................................................................................. 29

Summary ..................................................................................................................... 31

Generic Skills ................................................................................................................ 32

Background ................................................................................................................. 32

Generic Skills in the CEQ .......................................................................................... 33

Differences Among Fields of Study ........................................................................... 33

Differences Among Institutions .................................................................................. 38

Nursing ................................................................................................................... 38

Accounting .............................................................................................................. 40

Dimensions of the Generic Skills Scale ..................................................................... 42

Summary ..................................................................................................................... 42


Properties of the CEQ ..................................................................................................... 43

The Scales .................................................................................................................. 43

Reliabilities of the Scales ........................................................................................... 43

Structure of the CEQ .................................................................................................. 44

Exploratory Factor Analysis ................................................................................... 44

Confirmatory Factor Analysis ................................................................................ 46

Summary .................................................................................................................... 48

References ................................................................................................................. 49

Appendix A: The Course Experience Questionnaire ............................................... 51

Appendix B: The AVCC Code of Practice ................................................................ 52

Appendix C: Response Rates of Institutions Participating in GDS 2001 ............... 58

Appendix D: Comparison of Characteristics of CEQ 2001 Respondents and the Population of Bachelor Degree Graduates from 2000 ............... 59


Tables

Table 1.1 Respondent Numbers and Response Characteristics for CEQ 2001 ........................ 2

Table 1.2 CEQ Respondents by Qualification and CEQ Scales ..................................... 4

Table 2.1 CEQ 2001 Item Response Percentages: Bachelor Degree Graduates ....................... 9

Table 2.2 Summary Statistics for Scales by Qualification: CEQ 2001 ............................ 12

Table 3.1a Percentage Agreement with the Good Teaching Scale and the Overall Satisfaction Item by Selected Graduate and Course Characteristics: Bachelor Degree Respondents, 2001 ...... 20

Table 3.1b Percentage Agreement with the Good Teaching Scale and the Overall Satisfaction Item by Selected Graduate and Course Characteristics: Bachelor Degree Respondents, 2001 ...... 21

Table 3.2 Percentage of Variance in the Good Teaching Scale Explained by Selected Graduate and Course Characteristics: Ten Specific Fields of Study, Bachelor Degree Respondents, CEQ 2001 ........ 25

Table 3.3 Percentage of Variance in the Overall Satisfaction Item Explained by Selected Graduate and Course Characteristics: Ten Minor Fields of Study, Bachelor Degree Respondents, CEQ 2001 ........ 27

Table 4.1 Mean Percentage Agreements for Generic Skills by Broad Field of Study: CEQ 1995, 1998 and 2001 (Bachelor Degree Graduates) ........ 34

Table 4.2 Percentage Agreement for Generic Skills Items by Broad Field of Study: CEQ 2001 ..... 36

Table 4.3 Percentage Agreement for Generic Skills Items by Specific Field of Study: CEQ 2001 .. 37

Table 4.4 Results of Principal Component Analysis of the GSS for Each Broad Field of Study .... 41

Table 5.1 Reliability of the CEQ Scales: Bachelor Degree Graduates, CEQ 2001 .................. 43

Table 5.2 Factor Loadings Derived from Exploratory Factor Analysis of CEQ Items: Bachelor Degree Graduates, CEQ 2001 ........ 45

Table 5.3 Confirmatory Factor Analyses, Bachelor Degree Graduates, CEQ 2001 ................... 47


Figures

Figure 1.1 Institutional Response Rates to the CEQ and GDS .................................... 3

Figure 1.2 CEQ Bachelor Degree Respondent Numbers, 1993-2001 .................................. 5

Figure 2.1 Percentage Agreement with CEQ 2001 Items (Bachelor Degree Respondents) ............. 10

Figure 2.2 Trends in CEQ Indicators 1993-2001 (Bachelor Degree Respondents) ................... 14

Figure 2.3a Percentage Agreement with Items in the Good Teaching Scale: Bachelor Degree Graduates, 1993-2001 ........ 15

Figure 2.3b Percentage Agreement with Items in the Clear Goals and Standards Scale: Bachelor Degree Graduates, 1993-2001 ........ 15

Figure 2.3c Percentage Agreement with Items in the Appropriate Assessment Scale: Bachelor Degree Graduates, 1993-2001 ........ 16

Figure 2.3d Percentage Agreement with Items in the Appropriate Workload Scale: Bachelor Degree Graduates, 1993-2001 ........ 16

Figure 2.3e Percentage Agreement with Items in the Generic Skills Scale: Bachelor Degree Graduates, 1993-2001 ........ 17

Figure 2.3f Percentage Agreement with the Overall Satisfaction Item: Bachelor Degree Graduates, 1993-2001 ........ 17

Figure 2.4 Recent Trends in Mean Percentage Agreement with CEQ Scales and the Overall Satisfaction Item (Bachelor Degree Respondents) ........ 18

Figure 3.1 Percentage Agreement with the Good Teaching Scale by Selected Fields of Study: Bachelor Degree Graduates, CEQ 2001 ........ 23

Figure 3.2 Percentage Agreement with the Overall Satisfaction Item by Selected Fields of Study: Bachelor Degree Graduates, CEQ 2001 ........ 24

Figure 3.3 Mean Percentage Agreement with the Good Teaching Scale by University: Bachelor Degree Graduates in Initial Primary Teacher Education, CEQ 2001 ........ 28

Figure 3.4 Percentage Agreement with the Good Teaching Scale for Initial Primary Teacher Education Graduates: CEQ 2001 ........ 29

Figure 3.5 Mean Percentage Agreement with the Good Teaching Scale by University: Bachelor Degree Psychology Graduates, CEQ 2001 ........ 30

Figure 3.6 Mean Percentage Agreement with the Overall Satisfaction Item by University: Bachelor Degree Psychology Graduates, CEQ 2001 ........ 31

Figure 4.1 Trends in Percentage Agreement for Generic Skills for Broad Fields of Study: CEQ 1995, 1998 and 2001 (Bachelor Degree Graduates) ........ 34

Figure 4.2 Patterns of Agreement with Items on the Generic Skills Scale for Broad Fields of Study ........ 35

Figure 4.3 Percentage Agreement for Generic Skills Items by Specific Field of Study: CEQ 2001 ........ 38

Figure 4.4 Mean Percentage Agreement on Generic Skills for Initial Nursing by Institution ...... 39

Figure 4.5 Mean Percentage Agreement on Generic Skills for Accounting by Institution ........... 40

Figure 5.1 Scales and Items of the Course Experience Questionnaire ............................. 44


Executive Summary

This report describes the views of graduates from Australian universities regarding the courses

that they completed. It focuses specifically on graduates who completed their courses of study

in 2000 but also references previous cohorts of graduates. The data on which the report is

based are taken from the Course Experience Questionnaire (CEQ) that was administered

during the year 2001 as part of the 2001 Graduate Destination Survey (GDS).

Each year since 1993, approximately four months after they have completed a course of study,

all graduates of Australian universities are invited to respond to the 25-item CEQ. In that

questionnaire graduates are able to express their degree of agreement or disagreement on a

five-point scale with 24 statements about five facets of their course experience:

· the quality of teaching;

· the clarity of goals and standards;

· the nature of the assessment;

· the level of the workload; and

· the enhancement of their generic skills.

A final item asks graduates to indicate their overall level of satisfaction with the course on the

same five-point scale.

This report focuses on the responses by bachelor degree graduates: those students who have

recently completed pass bachelor degrees, honours bachelor degrees or three-year

undergraduate diplomas. There were 50,103 bachelor degree respondents to the 2001 CEQ

and 8,141 of these provided additional information about a second major so that a maximum

of 58,244 response sets were available for course experience analysis. These respondents

had similar characteristics (in terms of gender, age, field of study, and nature of qualifications)

to the population of graduates who completed a course in 2000.

As was done in the reports of the 1999 and 2000 CEQ surveys, this report focuses on the six

items that form the Good Teaching Scale and the Overall Satisfaction item. In addition this

report of the 2001 CEQ survey provides an analysis of the items concerned with “generic

skills”. Generic skills are those skills that are not specific to a field of study and are intended for

application in a range of contexts outside those where they are learned.

Nationally 68 per cent of these bachelor degree graduates expressed agreement (combining the

percentages in the two top categories of a five-point scale) with the statement Overall, I was

satisfied with the quality of this course. As shown in Figure 2.2 the trend in this level of

agreement has been generally positive since 1993 when the percentage agreement was 62 per

cent. A measure called “broad satisfaction” is sometimes used to refer to the overall

percentage in the top three response categories. In 2001, 89 per cent of bachelor degree

graduates were “broadly satisfied” with the overall quality of their courses, a modest increase

from the 87 per cent recorded in 1993.
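The two summary measures used throughout the report can be computed directly from a five-point response distribution. The sketch below is illustrative only: the response counts are hypothetical, chosen so that the two measures reproduce the national figures quoted above.

```python
# "Percentage agreement" combines the two top categories of the five-point
# scale; "broad satisfaction" combines the top three.
# The response counts below are hypothetical, not actual CEQ data.

def percentage_agreement(counts):
    # counts[0] = strongest disagreement ... counts[4] = strongest agreement
    return 100.0 * (counts[3] + counts[4]) / sum(counts)

def broad_satisfaction(counts):
    return 100.0 * (counts[2] + counts[3] + counts[4]) / sum(counts)

# Hypothetical distribution for the Overall Satisfaction item (per 100 graduates)
overall = [5, 6, 21, 40, 28]

print(round(percentage_agreement(overall)))  # 68
print(round(broad_satisfaction(overall)))    # 89
```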

The Good Teaching Scale consists of six items on which the average agreement was 43 per

cent. Agreement ranged from 50 per cent for the item the teaching staff worked hard to make


their subjects interesting to 33 per cent for the item the staff put a lot of time into commenting

on my work. As for the Overall Satisfaction item, there has been a trend for levels of

agreement on the Good Teaching Scale to increase over time from 35 to 43 per cent. The level

of “broad satisfaction” with good teaching in 2001 was 77 per cent, and has shown an increase

from the 72 per cent recorded in 1993.
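Scale-level agreement is simply the mean of the item-level agreement percentages. In the sketch below only the highest and lowest values (50 and 33 per cent) come from the report; the middle four item values are hypothetical, chosen so the mean matches the reported 43 per cent.

```python
# Scale-level agreement = mean of item-level percentage agreements.
# Only 50 and 33 are from the report; the middle four values are hypothetical.
good_teaching_items = [50, 48, 45, 42, 40, 33]

scale_agreement = sum(good_teaching_items) / len(good_teaching_items)
print(round(scale_agreement))  # 43
```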

The trends for the Clear Goals and Standards and Generic Skills scales are similar to those

for the Good Teaching scale. Over the period from 1993 to 2001 the mean percentage

agreement on the Clear Goals and Standards scale has risen from 43 to 52 per cent and the

level of “broad satisfaction” has increased from 77 to 82 per cent. Similarly, for the Generic

Skills scale there has been an increase (but a smaller increase) in mean percentage agreement

from 60 to 63 per cent and an increase in broad satisfaction from 84 to 87 per cent.

On the Appropriate Assessment scale the trend has been in the other direction. From 1993 to

2001 the mean percentage agreement declined from 59 to 56 per cent and “broad satisfaction”

declined to a smaller extent from 85 to 84 per cent. It could be inferred from this trend that

over the period under consideration there has been a shift in assessment in higher education

towards factual content and knowledge rather than thinking skills.

There has been very little change, between 1993 and 2001, in the mean percentage agreement

on the Appropriate Workload scale and just a small shift (upwards) in the “broad satisfaction”

measure for that scale.

Relatively little of the variation in responses to the Overall Satisfaction item or the Good

Teaching Scale can be attributed to the characteristics of graduates: their sex, ethnic

background, mode of attendance or employment status. Some evidence exists of an influence

of age with graduates older than 40 years expressing greater satisfaction than younger

graduates. There is substantial variation in the responses of graduates who were enrolled in

the different fields of study such as accounting, biology, nursing and so on. Further, within

fields of study there are sometimes substantial differences among the responses of graduates

from different universities. These differences are larger for the Good Teaching Scale than for

the Overall Satisfaction item. For both measures, however, the between university differences

are markedly larger than any differences attributable to the characteristics of graduates.


Introduction

The Course Experience Questionnaire (CEQ) has been used in development, evaluation and

research extending over 20 years and has been used in annual national surveys of Australian

graduates since 1993. It has been shown to have a stable and reliable structure, and to

discriminate between different learning environments.

This report is concerned with the views of graduates from Australian universities about their experience of

the courses from which they graduated in 2000. Data are derived from the Course Experience

Questionnaire (CEQ) that forms part of the Graduate Destination Survey (GDS) administered

in the year 2001. The GDS is mailed some four months after graduates have completed their

course. The CEQ asks graduates to rate their agreement with each of 25 items. Items cover

the quality of teaching, the clarity of the goals and standards, the level of the workload, the

nature of the assessment and the extent to which generic skills are embedded in the course.

There is also a single item that measures overall satisfaction with the course. A copy of the

questionnaire is included with this report as Appendix A.

CEQ 2001 collected data from 73,408 graduates of whom 50,103 were Bachelor degree

graduates. The number of respondents to the survey has declined a little since 1997 and this

appears to be a consequence of declining response rates and, since 1999, more precise CEQ

selection procedures. Although the response rate is a little better than that of many comparable

surveys, a high response rate is crucial to any survey, and this aspect of the CEQ may need

attention. The overall response rate to the GDS in 2001 was 57 per cent and that for the CEQ

was 52 per cent.1

Background

The purpose of the CEQ is to assemble data about graduates’ perceptions of the quality of the

courses that they completed in the previous year. Students’ views have long been recognised

as relevant to the evaluation of courses. The CEQ focuses on graduates’ perceptions of their

courses rather than on students’ evaluations of particular subjects or instructors. It is a step

towards providing universities with system-wide information that can be used to make

informed judgements about aspects of the courses that they provide. Interest in the

development of an instrument like the CEQ was stimulated by observations about the absence

of systematic information about the quality of teaching universities in several discipline

reviews and in the recommendations of the Performance Indicators Research Group (Linke,

1991).

The original form of the CEQ was used in studies of undergraduate students in the United

Kingdom (Ramsden & Entwistle, 1981; Entwistle & Ramsden, 1983). Ramsden and

colleagues tested a later version in Australian universities during 1989 (Ramsden, Martin &

Bowden, 1989; Ramsden, 1991a; 1991b). Wilson, Lizzio and Ramsden (1996) reported on the

validity and usefulness of the CEQ as a performance indicator of the perceived quality of

university teaching.

1 Based on the number of questionnaires returned as a proportion of those mailed out.


Data

The overall response rate to the GDS was 57.2 per cent (survey questionnaires were mailed

to 158,153 graduates and 90,889 were returned; of these, 90,410 contained valid

GDS data for analysis). Among Australian permanent residents the response rate was slightly

higher at 61 per cent (80,462 questionnaires were returned out of 131,533 that had been

distributed). These overall response rates are lower than those obtained in the 2000 survey.

As shown in Figure 1.1 institutional response rates to the GDS ranged from a high of 87 per

cent to a low of 40 per cent. GDS data are from all universities and from an average of about

57 per cent of the graduates from those universities.
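The response rates quoted above follow the footnoted definition: returns (here, returns containing valid data) as a proportion of questionnaires mailed out. A minimal check, using only the counts given in the text:

```python
# Response-rate arithmetic using the counts quoted in the text.
# The overall GDS rate is based on returns containing valid data.

mailed_all, valid_all = 158153, 90410        # all graduates surveyed
mailed_res, returned_res = 131533, 80462     # Australian permanent residents

gds_rate = 100 * valid_all / mailed_all
resident_rate = 100 * returned_res / mailed_res

print(round(gds_rate, 1))    # 57.2
print(round(resident_rate))  # 61
```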

CEQ response rates are lower than GDS response rates. It can be seen from Table 1.1 that

73,562 respondents across all levels of qualification provided information about a main course

of study and answered at least one CEQ item. Institutional response rates to the CEQ ranged

from 83 per cent to 30 per cent, as shown in Figure 1.1. Institutional response rates to the CEQ

were nine percentage points lower than for the GDS, and there were five institutions where the

gap was over 20 percentage points. The average response rate to the CEQ was 47 per cent.

This report focuses on the views of a subset of the CEQ respondents – the bachelor degree

graduates defined as those who have recently completed pass bachelor degrees, honours

bachelor degrees or three-year undergraduate diplomas. There were 50,207 bachelor degree

respondents to the CEQ and 8,147 of these provided additional CEQ information about a

second major2.

Table 1.1 Respondent Numbers and Response Characteristics for CEQ 2001

Course of study identified /                                        Bachelor   Higher &        Total
CEQ response information                                            Degrees    other Degrees   Respondents
First major identified and CEQ responses to first major provided      40647      22601         63248
Two majors identified and CEQ responses to both majors provided        8147        484          8631
Both majors identified and CEQ responses to first major provided        736         83           819
First major identified and CEQ responses to both majors provided        532        131           663
First major identified but CEQ responses to second major provided(a)     23         10            33
Second major identified and CEQ responses to second major provided       36          7            43
Second major identified and CEQ responses to both majors provided        70         32           102
Both majors identified and CEQ responses to second major provided        16          7            23
Total CEQ respondents                                                50,207     23,355        73,562
Total CEQ opinions(b)                                                58,354     23,839        82,193

Notes: (a) These CEQ responses were treated as being related to the identified first major.
       (b) 8661 of the 73,562 respondents provided a second CEQ response.

2 The CEQ allows graduates completing double majors to register their opinions of both courses.


Figure 1.1 Institutional Response Rates to the CEQ and GDS

[Figure: horizontal bar chart comparing GDS and CEQ response rates (%) for each of the 42 participating institutions, with CEQ rates ranging from 30 per cent to 83 per cent. Axis: Response Rate (%), 0-100.]


Table 1.2 CEQ Respondents by Qualification and CEQ Scales

                                                    Course Experience Questionnaire Scales
Qualification           GDS     %    CEQ(a)    %      GTS    CGS    AWS    AAS    GSS    OSI
Doctorate              1707   1.9    1179    1.4     1121   1145   1089   1108   1159   1124
Masters Research        892   1.0     653    0.8      639    641    631    633    645    638
Masters Other         10881  12.0    9363   11.4     9297   9318   9297   9300   9317   9311
G/PG Diploma           8672   9.6    7189    8.7     7149   7158   7151   7153   7154   7138
Graduate Certificate   3721   4.1    3049    3.7     3013   3037   3027   3027   3033   3028
Graduate Bachelor      1996   2.2    1738    2.1     1719   1721   1718   1718   1721   1727
Honours Bachelor       5852   6.5    5444    6.6     5417   5420   5416   5420   5419   5413
Pass Bachelor         54931  60.8   52165   63.5    51863  51917  51890  51902  51938  51754
3 Yr U/G Diploma       1022   1.1     745    0.9      739    744    741    743    744    737
Associate Diploma       429   0.5     387    0.5      384    386    387    386    387    382
Certificate             307   0.3     281    0.3      281    281    281    281    281    281
Total                 90410   100   82193    100    81622  81768  81628  81671  81798  81533

Note: (a) Where provided, respondent opinions to both majors counted.

In practice the effective number of bachelor degree responses differs for different items and

scales on the questionnaire. Table 1.2 details the number of responses to CEQ 2001 (and GDS

2001) by qualification and the number of responses to each of the CEQ Scales.

Table 1.3 provides information about respondent numbers, to both the GDS and the CEQ, for

each broad field of study as well as the proportion of GDS respondents who completed the

CEQ. On average 90 per cent of bachelor degree respondents to the GDS completed the CEQ

but the percentage ranged from 81 per cent in Veterinary Science to 96 per cent in

Agriculture.

Table 1.3 CEQ 2001 Bachelor Degree Respondent Numbers by Broad Field of Study

Broad Field of Study                        GDS Total      CEQ Total      GDS respondents
                                            Respondents    Respondents    to CEQ (%)
Agriculture                                      927            886            96
Architecture                                    1191           1056            89
Arts, Humanities & the Social Sciences         13709          12025            88
Business Studies                               12763          11836            93
Education                                       5162           4841            94
Engineering                                     3058           2725            89
Health                                          7521           6896            92
Law                                             2054           1915            93
Science                                         9035           7923            88
Veterinary Science                               129            104            81
Total Respondents                              55549          50207            90

Note: Table based on CEQ responses to first major only.
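The completion percentages in the right-hand column of Table 1.3 are each field's CEQ respondents as a proportion of its GDS respondents. A brief sketch using three of the rows above:

```python
# CEQ completion rate per broad field of study, from the Table 1.3 counts.
gds = {"Agriculture": 927, "Veterinary Science": 129, "Total": 55549}
ceq = {"Agriculture": 886, "Veterinary Science": 104, "Total": 50207}

completion = {field: round(100 * ceq[field] / gds[field]) for field in gds}
print(completion)  # {'Agriculture': 96, 'Veterinary Science': 81, 'Total': 90}
```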


In previous reports of the CEQ the decline in numbers of respondents since 1996 has been

noted. This decline has been detailed in Figure 1.2. Changes in the number of respondents

reflect both changes in the number of graduates and changes in response rates. There is

evidence of a general but uneven decline in response rates since 1996. Some of the recent

decline can be attributed to the application of more rigorous specifications for inclusion in

CEQ analyses. Respondents to the CEQ 2001 survey had similar characteristics (in terms of

gender, age, field of study, and nature of qualifications) to the population of graduates who

completed a course in 2000 but under-represented graduates who were not Australian

residents. Details are contained in Appendix D. Despite this, the decline in response rates over

time is a cause for concern. Low response rates mean that there is a potential for bias in ways

that cannot be predicted. It would be valuable to find out more about the procedures at

some institutions that result in high response rates and to explore the extent to which those

might be adapted for use in other institutions.

Figure 1.2 CEQ Bachelor Degree Respondent Numbers, 1993-2001

Note: This chart is based on responses to the first major only, to avoid double counting since 1997. The numbers

are an aggregate of those honours bachelor degree, pass bachelor degree and three-year undergraduate

diploma graduates who identified their course of study and who responded to at least one item in the CEQ.

Issues in the Interpretation of CEQ Data

Previous reports have outlined some of the caveats to be attached to the interpretation of CEQ

results. These caveats concern comprehensiveness, within-course variation, respondent

characteristics, response scales and response rates. CEQ scale scores are relative indicators

and informed judgements must always incorporate relevant local knowledge.

[Figure 1.2 data: respondent numbers by year of survey were 42,255 (1993); 55,879 (1994); 63,042 (1995); 66,350 (1996); 61,924 (1997); 60,145 (1998); 57,698 (1999); 50,455 (2000); and 50,207 (2001).]


Comprehensiveness

The CEQ cannot and does not encompass all the dimensions on which students could evaluate

their courses. Other dimensions might reflect objectives developed for particular courses at

individual institutions, specific aspects of particular fields of study or be based on other

general characteristics of teaching and learning. The CEQ focuses on parameters central to

teaching and learning in most fields of study within universities. It captures information about

these dimensions, and provides a basis for comparisons within fields of study.

Variability within Courses

Graduates may find it difficult to condense their experiences of an entire course into the single

response required for each item. In addition, if the results are averages of experience there is

the real possibility that the items will fail to discriminate between courses. Although this

results in a broad-brush measure, it does not appear to be a problem for respondents.

Graduate Respondents

Since the CEQ only gathers data from those who have successfully completed a course of

study, students who do not graduate are excluded. Although graduates are better placed to

evaluate a course than those who have not graduated, there is a possibility that the CEQ scale

scores are biased towards more favourable assessments. Although many decisions to withdraw

may be based on factors unrelated to the course, it would be surprising if there were no

correlation between course experiences and the decision to discontinue. A somewhat different

proposition is the claim that student evaluations are suspect because students are not in a

position to correctly evaluate a course until they have either graduated or applied their

knowledge in the workplace.

Response Scale

When responding, graduates were required to circle the number 1, 2, 3, 4 or 5 next to each item,

where ‘1’ represented strong disagreement and ‘5’ strong agreement (or,

in some forms, to tick a box corresponding to the response). It is assumed that respondents

would consider the intervening values of 2, 3 and 4 part of the five-point scale ranging from

strong disagreement to strong agreement. This type of scale provides a common basis for

responses to items concerned with different aspects of course experience. Analyses by Long

and Hillman (2000) have shown consistent and well-spaced thresholds for these categories on

all of the items (including the middle category) indicating that graduates interpret them as

intended. Eley (2001) has suggested that a different response format based on frequency of

occurrence would be more sensitive.

Response Rates

Given that there has been a decline in response rates to the CEQ, consideration of the possible

effects of partial response is appropriate. The general issue is whether those who did not

respond to the survey might have answered differently from those who did. One important

aspect of the survey was the differential non-response between fields of study and between

institutions. A small-scale investigation of non-respondents to the 1996 CEQ found that they

did not differ greatly from respondents at a macro level, such as field of study, but there were


discrepancies between the two groups of graduates in terms of sex and age group (Guthrie &

Johnson, 1997). Long and Hillman (1999) examined the effect of non-response on CEQ

scores more recently and concluded that the effect is small.

Summary

This report is based on the CEQ 2001 survey data from approximately 73,400 graduates of

Australian universities: a response rate of approximately 60 per cent. It focuses particularly on

the views of more than 50,000 Bachelor degree graduates about their experience of the

courses from which they graduated in 2000. The report is organised around a series of

chapters. The second chapter examines patterns and trends in CEQ data at a national level.

The following chapter examines the influence of graduate and course characteristics on CEQ

responses. Of the graduate characteristics examined, only age has an influence on graduate

perceptions of their course but there are noteworthy differences among institutions within

fields of study. Chapter 4 provides a discussion of the generic skills scale of the CEQ and

how that scale might inform debate about the role of universities in developing generic or

transferable skills. Chapter 5 discusses some of the psychometric and statistical properties of

the CEQ.


Patterns and Trends

As in previous years graduates responding to the 2001 CEQ were asked to record responses to

each item on a five-point scale ranging from strongly disagree to strongly agree. From these

responses a variety of summary statistics can be generated to indicate graduates’ views of their

course experiences at university. This report focuses on those respondents who completed

pass bachelor degrees, honours bachelor degrees or three-year undergraduate diplomas -- a

group collectively referred to as bachelor degree graduates. Although most attention is given

to those items that are indicators of good teaching and overall satisfaction, the responses to all

the items by bachelor degree graduates who completed their course in 2000 are presented in

Table 2.1.³

Responses to CEQ Items

Table 2.1 contains the wording of each of the items on the questionnaire, together with the

percentages of bachelor degree graduates responding to each category. As an example,

consider the most general item: Overall, I was satisfied with the quality of this course. The

data indicate that 3 per cent of bachelor degree graduates strongly disagreed with this

statement and 21 per cent strongly agreed with it. The percentages responding with the

intervening categories from the disagreement (lower value) to the agreement (higher value)

end of the scale were 8, 21 and 47 per cent respectively. Although the intervening response

points were not labelled on the questionnaire it is reasonable to interpret them as disagree,

uncertain (neither disagree nor agree) and agree (Long & Hillman, 2000).

Two summary statistics for items are recorded in Table 2.1. The first is the percentage

agreement. By combining the two agreement categories, it can be concluded that 68 per cent

of bachelor degree graduates agreed with the expression of overall satisfaction with the quality

of their course. Percentage agreement values for each of the items are illustrated in Figure 2.1.

A measure called “broad agreement” is sometimes used to refer to the overall percentage in

the top three categories.

The second is the mean score. Item means are calculated after recoding the responses 1, 2, 3,

4 and 5 to -100, -50, 0, 50 and 100 respectively. Where the wording of an item had a sense

opposite to the meaning of the scale (items 4, 8, 12, 13, 19, 21 and 23) the scoring is reversed.

Percentage agreement is more easily understood than the mean, is equally useful in monitoring

change and can be directly compared across scales. On the other hand the mean scores

incorporate information from all response categories.
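The item-level calculations described above are simple enough to sketch in code. The following Python fragment is illustrative only (the function names are not from the report); it reproduces the published figures for the Overall Satisfaction item from its category percentages of 3, 8, 21, 47 and 21 per cent.

```python
# Recoding of response categories 1-5, as defined in the report.
RECODE = {1: -100, 2: -50, 3: 0, 4: 50, 5: 100}

def item_mean(category_pcts):
    """Item mean score from the percentages responding in categories 1-5."""
    return sum(RECODE[cat] * pct
               for cat, pct in zip(range(1, 6), category_pcts)) / sum(category_pcts)

def pct_agreement(category_pcts):
    """Percentage agreement: responses in the top two categories (4 and 5)."""
    return sum(category_pcts[3:])

def pct_broad_agreement(category_pcts):
    """Broad agreement: responses in the top three categories (3, 4 and 5)."""
    return sum(category_pcts[2:])

# Item 25: 'Overall, I was satisfied with the quality of this course.'
item_25 = [3, 8, 21, 47, 21]
print(pct_agreement(item_25))        # 68 per cent, as in Table 2.1
print(round(item_mean(item_25)))     # 38, as in Table 2.1
print(pct_broad_agreement(item_25))  # 89
```

For reverse-scored items (4, 8, 12, 13, 19, 21 and 23) the raw category percentages would first be flipped end-for-end before applying the same calculations.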

Figure 2.1 displays the percentage agreement for each of the CEQ items diagrammatically. It

indicates the range of agreement amongst the items in the questionnaire within each set of

items.

³ Graduates completing a double major recorded their opinions about both courses of study and 8,141 of the

50,103 bachelor degree respondents (16.2 per cent) provided two sets of course experiences. Analyses of

opinions about courses treat those additional responses from the various institutions and fields of study as part

of the descriptive analysis. Additional responses are not included in analyses of background data (such as

gender, non-English speaking background, and state of origin) or the structure of the questionnaire.


Table 2.1 CEQ 2001 Item Response Percentages: Bachelor Degree Graduates

Responses in each category (%): strongly disagree (1) to strongly agree (5)

No. CEQ Scale/Item 1 2 3 4 5 % Agree M SD N

Good Teaching

3 The teaching staff of this course motivated me to do my best work. 4 14 33 34 14 49 20 52 57944

7 The staff put a lot of time into commenting on my work. 10 24 33 25 8 33 -1 55 57996

15 The staff made a real effort to understand difficulties I might be

having with my work 7 18 35 29 11 40 10 53 57727

17 The teaching staff normally gave me helpful feedback on how I was

going. 6 18 31 35 10 45 13 53 57926

18 My lecturers were extremely good at explaining things. 4 15 38 33 10 43 15 49 57683

20 The teaching staff worked hard to make their subjects interesting. 4 13 33 38 12 50 21 50 57760

Clear Goals & Standards

1 It was always easy to know the standard of work expected 3 13 31 41 12 53 23 48 57929

6 I usually had a clear idea of where I was going and what was

expected of me in this course. 3 13 26 43 14 57 26 50 58027

13.r It was often hard to discover what was expected of me in this

course. 4 15 31 37 12 50 19 51 57951

24 The staff made it clear right from the start what they expected from

students. 4 15 35 35 11 46 17 50 57842

Appropriate Workload

4.r The workload was too heavy. 5 16 39 32 9 41 12 49 57977

14 I was generally given enough time to understand the things I had

to learn. 3 14 32 41 9 50 19 47 57902

21.r There was a lot of pressure on me to do well in this course. 11 26 33 23 7 30 -6 55 57875

23.r The sheer volume of work to be got through in this course meant it

couldn’t all be thoroughly comprehended. 11 23 31 27 9 35 0 56 57837

Appropriate Assessment

8.r To do well in this course all you really needed was a good memory. 6 14 21 33 27 60 31 59 58003

12.r The staff seemed more interested in testing what I had memorised

than what I had understood. 5 13 27 34 21 55 26 56 57941

19.r Too many staff asked me questions just about facts. 2 8 36 38 17 55 30 46 57545

Generic Skills

2 The course developed my problem-solving skills. 2 8 23 45 22 67 39 47 58027

5 The course sharpened my analytic skills. 1 7 22 46 23 69 41 45 57898

9 The course helped me develop my ability to work as a team

member. 7 17 26 34 15 50 17 57 57966

10 As a result of my course, I feel confident about tackling unfamiliar

problems. 3 10 31 42 14 56 27 48 58002

11 The course improved my skills in written communication. 3 9 19 40 29 69 41 52 57951

22 My course helped me to develop the ability to plan my own work. 2 7 23 45 22 67 39 48 57717

Overall Satisfaction

25 Overall, I was satisfied with the quality of this course. 3 8 21 47 21 68 38 48 57904

Ungrouped Item

16.n The assessment methods employed in this course required an in-

depth understanding of the course content. 3 11 29 42 15 57 27 48 57859

Notes a. Graduates with pass bachelor degrees, honours bachelor degrees or three-year undergraduate diplomas.

b. Means are calculated after recoding the responses 1, 2, 3, 4 and 5 to -100, -50, 0, 50 and 100 respectively

c. Items marked with an r are reverse-scored in analyses to allow for their negative phrasing.

d. Item 16 does not fit statistically in any of the scales.

e. The full wording of the items and the format of the questionnaire is shown in Appendix 1.


Figure 2.1 Percentage Agreement with CEQ 2001 Items (Bachelor Degree Respondents)

[Figure 2.1 is a bar chart of the percentage agreement with each item, grouped by scale (Good Teaching; Clear Goals & Standards; Appropriate Workload; Appropriate Assessment; Generic Skills; Overall Satisfaction). The values plotted correspond to the % Agree column of Table 2.1.]


The number of valid responses to each of the items and the standard deviations of the

responses to the items are also shown in Table 2.1. The standard deviation indicates the

spread of the responses to an item, with a larger standard deviation corresponding to a wider

range of responses.⁴

Groups of Items or Scales

CEQ items form groups that represent underlying dimensions of course experiences.

Summary statistics can be used to represent responses to groups of related items. Such

summaries provide parsimony in analysis and reporting. Rather than reporting on 24 different

items it becomes possible to report patterns for five dimensions. Such a reduction of the data

can be of considerable assistance in the process of making inferences about trends and

patterns. Summary statistics use the relationships between items to confirm the meaning of

the item and reduce the effect of any idiosyncrasies.

Two summary statistics for groups of items are used in CEQ reports: the CEQ scale mean and

the mean percentage agreement. It can be seen that they relate to the two summary statistics

for items discussed in the previous section. The scale mean is useful in forms of analysis

where continuous measures are important (eg. in various correlation and multivariate

analyses) and percentage agreement is useful in representing differences between groups.

Scale means are the average of the item ratings for the group of items making up the scale.

The scale means provide the most reliable indicator for the group of items because they make

use of the full distribution of responses to each item.⁵

Mean percentage agreement refers to the average across the items in a group of the percentage

of respondents in a group agreeing or strongly agreeing with the item. It follows that this is

computed for groups of respondents rather than for individual respondents. The corresponding

measure for an individual would be the number of items with which they were in agreement.

This measure at individual level is a little less reliable than the scale means for scales

containing only a few items because the distribution of responses has not been utilised.
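Under the simplifying assumption of equal respondent numbers per item, both scale-level summaries can be approximated directly from the published item statistics. A minimal Python sketch using the bachelor degree Good Teaching Scale values from Table 2.1 (function names are illustrative, not from the report):

```python
# Good Teaching Scale items for bachelor degree graduates (Table 2.1):
# item number -> (item mean score, % agreement)
good_teaching = {
    3: (20, 49), 7: (-1, 33), 15: (10, 40),
    17: (13, 45), 18: (15, 43), 20: (21, 50),
}

def scale_mean(item_stats):
    """Scale mean: average of the item mean scores in the group."""
    means = [mean for mean, _ in item_stats.values()]
    return sum(means) / len(means)

def mean_pct_agreement(item_stats):
    """Mean percentage agreement: average of the items' % agreement."""
    agrees = [agree for _, agree in item_stats.values()]
    return sum(agrees) / len(agrees)

print(round(scale_mean(good_teaching)))          # 13, as in Table 2.2
print(round(mean_pct_agreement(good_teaching)))  # 43, as in Table 2.2
```

Note that this averages the published item-level summaries; the report computes the scale mean from each respondent's item ratings, which coincides with the calculation above only when every item has the same number of valid responses.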

Table 2.2 shows the mean, percentage agreement, and percentage broad agreement for each of

the five CEQ scales and the Overall Satisfaction item by the level of the course of the graduate.

The values for postgraduate research students are usually higher than for bachelor students on

all measures, and sometimes markedly higher. For instance, on the Good Teaching Scale, 59%

of graduates of research degree courses agreed with the Good Teaching Scale items, compared

with 43% of bachelor degree graduates. Differences are even larger on the Appropriate

Assessment Scale but minimal on Clear Goals and Standards.

⁴ By way of illustration, responses to item 8 were spread more widely than responses for other items. The

standard deviation was 59, with the percentages in each category from strongly disagree to strongly agree

being approximately 6, 14, 21, 33 and 27 per cent respectively. In contrast, responses to item 19 had a

smaller standard deviation of 46. The responses to this item were clustered together at one end of the scale,

being 2, 8, 36, 38 and 17 per cent in each of the categories from strongly disagree to strongly agree

respectively.

⁵ It is possible to introduce greater precision by weighting each item in the scale according to the extent to

which it contributes to the underlying dimension. This has not been done because the weights would be

different for each set of data that was used.


Table 2.2 Summary Statistics for Scales by Qualification: CEQ 2001

Qualification and CEQ Scale | Mean Scale Score | Standard Deviation | Mean Agreement (%) | Broad Agreement (%) | Respondent Numbers

Postgraduate Research

Good Teaching Scale 32 47 59 84 1760

Clear Goals & Standards Scale 24 47 54 79 1786

Appropriate Workload Scale 15 36 45 77 1720

Appropriate Assessment Scale 59 44 76 92 1741

Generic Skills Scale 46 34 72 88 1804

Overall Satisfaction Item 48 51 75 91 1762

Coursework Postgraduate

Good Teaching Scale 19 40 49 81 21178

Clear Goals & Standards Scale 23 40 54 82 21234

Appropriate Workload Scale 10 38 42 75 21193

Appropriate Assessment Scale 45 41 70 90 21198

Generic Skills Scale 28 35 58 85 21225

Overall Satisfaction Item 39 49 70 89 21204

Bachelor Degrees

Good Teaching Scale 13 41 43 77 58019

Clear Goals & Standards Scale 21 38 52 82 58081

Appropriate Workload Scale 6 38 39 73 58047

Appropriate Assessment Scale 29 43 56 84 58065

Generic Skills Scale 34 34 63 87 58101

Overall Satisfaction Item 38 48 68 89 57904

Other Qualifications

Good Teaching Scale 13 40 43 78 665

Clear Goals & Standards Scale 17 37 46 80 667

Appropriate Workload Scale 10 36 40 78 668

Appropriate Assessment Scale 36 41 63 89 667

Generic Skills Scale 25 33 56 85 668

Overall Satisfaction Item 39 47 70 90 663

All Qualifications

Good Teaching Scale 15 41 45 78 81622

Clear Goals & Standards Scale 22 39 52 82 81768

Appropriate Workload Scale 7 38 40 73 81628

Appropriate Assessment Scale 34 43 60 86 81671

Generic Skills Scale 33 34 62 87 81798

Overall Satisfaction Item 38 49 69 89 81533

Notes

a. Based on graduates who responded to at least one CEQ item, and whose level of qualification was known.

b. Information on both majors included where available.

c. Mean agreement percentages are based on Agree and Strongly Agree categories.

d. Broad agreement percentages are based on the top three categories.

e. Scale means are calculated after recoding the responses 1, 2, 3, 4 and 5 to -100, -50, 0, 50 and 100

respectively.


The CEQ was developed primarily for use with students undertaking studies for an initial

qualification based on coursework. A number of the items assume that the respondents have

completed a qualification by meeting the requirements of a ‘course’. The notion of a ‘course’

varies between disciplines. Within some fields of study (e.g. an Arts degree) students design

their own course by choosing a sequence of units of study. Other areas (e.g. professional

courses such as Engineering and Medicine) tend to be more prescriptive, and this may

influence students’ opinions.

Trends

The CEQ provides the opportunity to examine the way in which the views of bachelor degree

graduates have changed over time. Figure 2.2 provides an overview using summary statistics

(mean percentage agreement and mean broad percentage agreement⁶). Figures 2.3a to 2.3f

show the changes in the percentage agreement with selected CEQ items for bachelor degree

graduates over the nine-year period 1993-2001.

Although the wording of the items has remained relatively unchanged, there is a discontinuity

in the way respondents answered. Until 1997 graduates were provided with the opportunity to

comment on only one course. From 1997 onwards the questionnaire provided graduates with

the opportunity to comment on two courses. The values from 1997 onwards incorporate

responses for a second course. Our analysis suggests that this has had the effect of changing

the level of agreement slightly.

There have also been slight changes in the coverage of the survey over the period 1993-2001

and in response rates. Several universities did not participate in the 1993 survey, but for the

remainder of the period higher education institutions covering the overwhelming majority of

graduates have participated in the survey. There are substantial differences in responses to the

CEQ items across the different fields of study. Hence changes in the enrolment pattern by

field of study may potentially affect aggregate responses by higher education graduates.

For the Good Teaching Scale the story is one of stability with a slight increase. The mean

percentage agreement had risen slightly in 1996 compared with previous years and increased

further from 1997 to 2000. Agreement levels in 2001 were almost the same as for 2000. The

story of a small increase over time is similar for the mean percentage agreement with the

Clear Goals and Standards Scale. The values for the 2001 survey are marginally higher

overall than in 2000. Mean percentage agreement for the Appropriate Assessment Scale

showed a decline over the course of the survey. The effect of the change in the

questionnaire structure in 1997 appears to have been to increase the level of percentage

agreement. Even so, the values in 1997, 1998, 1999 and 2000 are below those in the earlier

years of the survey. However, from 1999 to 2001 the decline has been small. There has been

a small increase in the level of agreement with items that form part of the Appropriate

Workload Scale. With regard to the Generic Skills Scale the picture is one of a small and

gradual increase both in the period 1993-1996 and 1997-2000 but with the result for 2001

being almost the same as for 2000 (in fact a small decline). The biggest increases appear to

have been for item 9 (working in a team) and this continued to increase from 2000 to 2001.

⁶ Mean percentage agreement is based on respondents who answered either agree or strongly agree to each item

(the top two categories on the response scale). Broad percentage agreement is based on respondents who

answered in one of the top three categories of the response set.


Percentage agreement with the Overall Satisfaction item by graduates of bachelor degree

courses has increased in the period 1993 to 2000 but in 2001 the result on either percentage

agreement or broad percentage agreement shows that increase had stopped (and had declined

just a little).

Figure 2.2 Trends in CEQ Indicators 1993-2001 (Bachelor Degree Respondents) (Data shown as mean percentage agreement and mean percentage broad agreement)

[Figure 2.2 comprises trend panels for 1993-2001, each plotting % Agreement and % Broad Agreement by year of survey: the Good Teaching, Appropriate Assessment and Generic Skills Scales (All Respondents); and the Good Teaching, Clear Goals & Standards, Appropriate Workload, Appropriate Assessment and Generic Skills Scales and the Overall Satisfaction Index (Bachelor Degree Respondents).]


Figure 2.3a Percentage Agreement with Items in the Good Teaching Scale: Bachelor Degree Graduates, 1993-2001

Mean Agreement (%)  1993  1994  1995  1996  1997  1998  1999  2000  2001
Item 3              38.2  38.3  38.9  39.9  44.1  46.2  47.7  48.7  48.6
Item 7              25.2  25.8  26.2  26.5  30.0  31.8  32.6  33.5  33.1
Item 15             35.8  34.8  35.2  34.8  37.4  38.6  39.2  40.1  40.3
Item 17             37.0  36.8  34.1  34.7  36.8  42.9  43.8  44.9  45.0
Item 18             31.8  31.9  31.7  32.9  36.9  40.4  42.2  43.0  43.3
Item 20             39.4  40.3  40.5  41.8  45.5  46.5  49.0  49.7  50.1

Figure 2.3b Percentage Agreement with Items in the Clear Goals and Standards Scale: Bachelor Degree Graduates, 1993-2001

Mean Agreement (%)  1993  1994  1995  1996  1997  1998  1999  2000  2001
Item 1              41.8  42.2  42.4  43.0  48.2  50.3  51.6  52.1  52.8
Item 6              51.0  50.8  51.0  52.1  54.9  55.8  56.3  57.3  57.3
Item 13r            46.3  45.9  45.6  45.6  47.4  47.1  48.2  48.9  49.5
Item 24             34.5  35.3  35.4  36.6  40.2  42.9  44.5  45.4  46.5


Figure 2.3c Percentage Agreement with Items in the Appropriate Assessment Scale: Bachelor Degree Graduates, 1993-2001

Mean Agreement (%)  1993  1994  1995  1996  1997  1998  1999  2000  2001
Item 8r             63.9  62.9  63.2  61.6  60.9  59.5  59.8  60.4  59.7
Item 12r            55.5  54.8  55.2  53.9  54.6  54.1  55.2  54.7  54.5
Item 19r            57.9  57.0  54.7  52.3  53.9  53.7  55.1  54.6  54.8

Figure 2.3d Percentage Agreement with Items in the Appropriate Workload Scale: Bachelor Degree Graduates, 1993-2001

Mean Agreement (%)  1993  1994  1995  1996  1997  1998  1999  2000  2001
Item 4r             36.5  36.5  36.1  35.9  36.8  37.1  38.0  39.7  40.9
Item 14             43.8  42.8  43.3  44.1  46.7  48.0  48.8  49.7  50.2
Item 21r            26.8  27.0  26.2  26.4  26.7  26.4  28.3  29.0  29.7
Item 23r            33.0  32.7  32.1  32.2  33.1  33.8  34.0  35.2  35.2


Figure 2.3e Percentage Agreement with Items in the Generic Skills Scale: Bachelor Degree Graduates, 1993-2001

Mean Agreement (%)  1993  1994  1995  1996  1997  1998  1999  2000  2001
Item 2              65.9  65.3  65.8  66.3  66.9  66.9  67.4  67.8  67.2
Item 5              68.4  67.9  67.7  68.5  69.0  69.8  69.9  69.7  69.3
Item 9              41.4  42.9  43.4  45.2  45.9  46.6  47.6  49.2  49.9
Item 10             52.9  52.7  53.4  54.1  55.1  55.6  56.1  56.8  56.3
Item 11             64.9  65.7  66.1  66.7  68.0  69.1  69.0  69.8  69.1
Item 22             66.2  65.7  66.2  66.2  67.2  66.9  67.0  67.6  67.4

Figure 2.3f Percentage Agreement with the Overall Satisfaction Item: Bachelor Degree Graduates, 1993-2001

Mean Agreement (%)  1993  1994  1995  1996  1997  1998  1999  2000  2001
Item 25             61.6  61.6  61.0  62.6  64.9  66.2  67.3  68.3  68.1


Figure 2.4 Recent trends in Mean Percentage Agreement with CEQ Scales and the

Overall Satisfaction Item (Bachelor Degree Respondents)

Summary

The pattern of bachelor degree graduate responses to CEQ 2001 items was similar to that

identified in previous surveys. Approximately two thirds (in fact 68 per cent) of bachelor

degree graduates agreed or strongly agreed that overall, they were satisfied with their course.

Responses can be summarised as either means (on a defined scale) or as percentage agreement

measures for both individual items and groups of items. Differences in summary statistics for

each scale between course levels are in the expected direction of higher scores for graduates of

higher degrees than bachelor degrees. On the Good Teaching Scale, 59% of graduates of

research degree courses agreed with the Good Teaching Scale items, compared with 43% of

bachelor degree graduates. In CEQ 2000 trends in CEQ indicators were highlighted. In

general, from CEQ 2000 to CEQ 2001 the trend lines flattened. For example, the increases

previously evident in the Overall Satisfaction item ceased.

Trends over the past three survey years for bachelor degree graduates are shown in Figure 2.4.

[Figure 2.4 plots mean percentage agreement for the GTS, CGS, AAS, AWS, GSS and OSI measures for bachelor degree respondents in 1999, 2000 and 2001.]


Influence of Graduate and Course Characteristics on the

Good Teaching Scale and Overall Satisfaction

This chapter explores aspects of the variation in graduate responses to the Good Teaching

Scale and the Overall Satisfaction item. As reported for CEQ 2000 there is relatively little

variation in responses to the Good Teaching Scale and the Overall Satisfaction item that can

be attributed to the characteristics of graduates: their sex, age, ethnic background, mode of

attendance or employment status. However, there is substantial variation in the responses of

graduates who were enrolled in different fields of study: accounting, biology, nursing and so

on. Further, within fields of study there are sometimes differences among the responses of

graduates from different universities. These differences are larger for the Good Teaching

Scale than for the Overall Satisfaction item. For both measures, however, the differences

among universities are larger than the differences attributable to the characteristics of

graduates.

Characteristics of graduates

It is important to know about differences in CEQ scores among various categories of

graduates in order to be able to assess the extent to which differences among courses may be

confounded by differences among their enrolment profiles. Such information is also of interest

by itself.

Table 3.1 shows the mean percentage agreement for the Good Teaching Scale and the Overall Satisfaction item for a number of characteristics of the graduates. The size of the differences among the various categories is usually rather small. A difference of one percentage point or less has been considered to represent no substantive difference; a difference of more than one but less than five percentage points has been described as a very small difference; a difference of more than five but less than ten percentage points has been described as a small difference; and a difference of ten percentage points or more has been described as a moderate difference.
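The rubric above can be expressed as a small helper function. This is a minimal sketch; the report does not say how a difference of exactly five points is treated, so it is grouped with "small" here as an assumption:

```python
def describe_difference(points):
    """Classify a difference in mean percentage agreement using the report's rubric.

    <= 1 point: no substantive difference; 1-5: very small; 5-10: small;
    >= 10: moderate. Treating exactly 5 as "small" is an assumption; the
    report leaves that boundary unspecified.
    """
    d = abs(points)
    if d <= 1:
        return "no substantive difference"
    if d < 5:
        return "very small"
    if d < 10:
        return "small"
    return "moderate"

# e.g. the 15-point age gap on the Good Teaching Scale counts as "moderate"
print(describe_difference(15))  # moderate
```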

One of the few large differences associated with graduate background concerns age. Older graduates (over age 40) record higher levels of agreement with items on the Good Teaching Scale (by 15 percentage points) and with the Overall Satisfaction item (by nine percentage points) than younger graduates. However, only about 10 per cent of graduates are over the age of 40. The question remains as to whether older graduates experience different teaching or whether they are better able to appreciate the approaches adopted by their teachers.

Graduates of a non-English speaking background recorded slightly lower agreement on each of the CEQ measures, but the differences were very small (four percentage points on the Good Teaching Scale and under four on the Overall Satisfaction item).

Page 30: Course Experience Questionnaire 2001

CEQ 2001

20

Table 3.1a Percentage Agreement with the Good Teaching Scale and the Overall Satisfaction Item by Selected Graduate and Course Characteristics: Bachelor Degree Respondents, 2001

Characteristic                      Category              % of Respondents   GTS Agreement (%)   OSI Agreement (%)
All persons (N = 50,207)                                        100.0              42.6                68.1
Sex (N = 50,000)                    Male                         37.3              41.8                68.4
                                    Female                       62.7              42.9                67.8
Age (N = 49,969)                    24 & under                   64.2              40.7                67.8
                                    25-29                        14.2              41.9                66.1
                                    30-39                        12.1              45.4                68.8
                                    40-54                         8.5              50.2                70.3
                                    55 & over                     1.0              65.8                83.5
Non-English-speaking background     NESB                         20.9              39.3                65.3
(N = 47,789)                        ESB                          79.1              43.3                68.9
Disability (N = 43,635)             Disability                    3.5              46.0                64.1
                                    No disability                96.5              42.8                69.0
Level of previous qualification     Post-graduate                 2.9              45.7                68.7
(N = 47,507)                        Bachelor                     13.9              46.5                68.1
                                    Sub-bachelor                 11.0              42.6                67.5
                                    High school                  61.4              41.3                68.4
                                    Other                         7.7              45.7                68.1
                                    No previous qual.             3.1              46.7                68.7
Mobility (N = 32,109)a              No movement                  63.7              42.6                67.5
                                    Changed city                 23.5              43.2                67.6
                                    Changed state                12.7              43.4                68.4
Fee payment (N = 43,179)            HECS                         85.9              42.8                68.7
                                    Aust fee-paying               5.3              42.2                67.7
                                    Overseas fee-paying           7.7              41.0                66.3
                                    Other                         1.0              44.9                69.4
Attendance (N = 48,917)b            Full-time internal           68.9              42.4                67.6
                                    Part-time internal           18.8              41.4                67.2
                                    External                     12.3              44.0                70.7

Note: Table based on CEQ responses to first major only.
a Based on the subset of bachelor degree respondents who provided valid responses to both mobility variables.
b Based on the subset of bachelor degree respondents who provided valid responses to both attendance variables.


Table 3.1b Percentage Agreement with the Good Teaching Scale and the Overall Satisfaction Item by Selected Graduate and Course Characteristics: Bachelor Degree Respondents, 2001

Characteristic              Category                      % of Respondents   GTS Agreement (%)   OSI Agreement (%)
Employment in final year    None                               22.1               44.5                68.7
(N = 49,384)e               Full-time                          18.0               39.3                67.2
                            Part-time                          59.9               42.7                68.0
Field of study              Agriculture                         1.8               47.2                72.8
(N = 50,207)                Architecture, building              2.1               37.3                55.9
                            Arts, Hum. & Soc. Sci.             24.0               52.6                71.5
                              Comm. & journalism                                  48.6                63.9
                              Psychology                                          40.4                68.9
                            Bus., admin., eco.                 23.6               36.3                69.0
                              Business admin.                                     35.9                67.6
                              Accounting                                          31.9                66.6
                              Marketing & distribution                            40.6                73.4
                            Education                           9.6               43.3                63.2
                              Teacher edn - primary                               40.3                61.6
                            Engineering, surveying              5.4               32.4                65.6
                            Health                             13.7               39.6                62.6
                              Nursing - initial                                   37.2                55.7
                            Law, legal studies                  3.8               35.9                65.2
                              Law                                                 33.9                69.9
                            Science                            15.8               44.1                71.6
                              Computer science                                    33.5                63.5
                              Biology                                             46.3                77.1
                            Veterinary Science                  0.2               41.9                72.1
Activity in 2001            Work, full-time                    53.6               39.7                67.9
(N = 50,207)                  seeking other job                 7.7               40.3                64.9
                            Work, part-time                    14.1               48.9                71.7
                              seeking full-time                 8.1               44.5                66.4
                            Unemployed                          8.8               43.9                64.1
                            Not in labour force, studying       1.1               45.8                71.2
                            Not in labour force                 6.5               50.3                72.9

Note: Table based on responses to the first major only to avoid double counting of background variables.
e Based on the subset of bachelor degree respondents who provided valid responses to both employment variables.


There are no substantial differences between male and female graduates in their views of Good Teaching and Overall Satisfaction. Females are a little more likely to report having experienced aspects of Good Teaching than males, while males are a little more likely to report being satisfied overall with their course than are females.

Graduates who reported having a disability of some sort (motor, sensory or other) had a marginally higher level of percentage agreement for the Good Teaching Scale than other graduates, but a somewhat lower level of Overall Satisfaction (a difference of about five percentage points).

Graduates with a previous bachelor degree had higher scores on the Good Teaching Scale than those with only Year 12 entry (the difference was five percentage points), but there was no difference in terms of overall satisfaction.

Those who had moved between States during their course had very slightly higher scores on the Good Teaching Scale and the Overall Satisfaction item, but the difference was very small: less than three percentage points.

There was no real difference in either Good Teaching or Overall Satisfaction scores between graduates who paid in different ways (HECS or fee paying) for their course, although there was a very small difference between overseas fee-paying students and other students (less than three percentage points).

Graduates who had attended mainly as part-time on-campus students were less likely to report having experienced good teaching, and less likely to be satisfied, than other graduates. Graduates who had enrolled externally, however, had higher scores on both CEQ measures than did other graduates, although external graduates are also likely to have been older.

Graduates who had been employed full-time during the final year of their course had marginally lower results for the Good Teaching Scale and the Overall Satisfaction item. The difference between full-time work and no work was five percentage points on the Good Teaching Scale and two points on the Overall Satisfaction measure.

Higher levels of labour market participation after completion of a course did not appear to be related to more positive assessments of the course. Instead, graduates who were either not in the labour force (and not studying) or employed part-time but seeking full-time work had somewhat higher scores (by between five and ten percentage points on the Good Teaching Scale) than did other graduates.

Fields of Study

Field of study is related to CEQ scores for several possible reasons. First, disciplines may have their own cultures and approaches to teaching. Second, subject matter may lend itself to different forms of exposition. Third, different types of students may be attracted to different fields of study. In addition, there may be differences in the demands placed on students.

Table 3.1b shows that there are sometimes quite large differences in the mean percentage agreement with the Good Teaching Scale and the Overall Satisfaction item for graduates of courses in various fields of study. The mean percentage of graduates agreeing with items on the Good Teaching Scale differs substantially among the broad fields of study. For instance, there is a 16-percentage point difference between the value for Business courses (36.3%) and that for Arts courses (52.6%). Interestingly, however, the corresponding values for Overall Satisfaction (69.0% and 71.5%) show a much smaller difference.

Figure 3.1 Percentage Agreement with the Good Teaching Scale by Selected Fields of Study: Bachelor Degree Graduates, CEQ 2001

[Bar chart. Mean GTS agreement (%): Accounting 32; Computer Science - general 34; Law, Legal Studies 34; Business Administration 36; Nursing - initial 37; Tchr Edn - primary, initial 40; Psychology 40; Marketing & Distribution 41; Biology 46; Communications & Journalism 49.]

Although there are sometimes large differences between broad fields of study, there are also often large differences between minor fields of study within those broad fields of study. For instance, within Arts, the mean agreement for Good Teaching for graduates of courses in Communication and Journalism is eight percentage points higher than for graduates of Psychology courses; within Science, the value for Good Teaching is 12 percentage points higher for graduates of Biology courses than for graduates of Computer Science courses. Again, however, the differences in Overall Satisfaction are usually somewhat less marked.

Figures 3.1 and 3.2 present the values for the 10 minor fields of study from Table 2.1. These fields of study were selected because they contained substantial numbers of graduates, were taught in many universities, and covered a diversity of subject matter. The graphical presentation in Figure 3.1 highlights the extent of the variation in mean agreement with the Good Teaching Scale across minor fields of study. The difference between Biology and Accounting, for instance, is 14 percentage points. Similarly, Figure 3.2 shows there is a difference of 21 percentage points in Overall Satisfaction between graduates of courses in Biology and Initial Nursing.

Comparisons of different courses within a particular university are unlikely to be comparisons of like with like, because there are substantial differences in graduates' experiences of courses that appear to be linked to the subject matter of the course itself. Comparisons are more likely to be fair if they are made between universities within similar courses. The next section explores such differences for each of the 10 minor fields of study presented in Figures 3.1 and 3.2.


Figure 3.2 Percentage Agreement with the Overall Satisfaction Item by Selected Fields of Study: Bachelor Degree Graduates, CEQ 2001

[Bar chart. Mean OSI agreement (%): Nursing - initial 56; Tchr Edn - primary, initial 62; Computer Science - general 63; Communications & Journalism 64; Accounting 67; Business Administration 68; Psychology 69; Law, Legal Studies 70; Marketing & Distribution 73; Biology 77.]

Universities

The purpose of the CEQ is to capture graduates' experiences of their courses. For that reason it is important that variation in scores be explained by aspects of the course (eg. field of study or institution) rather than by personal characteristics of the graduate. Certainly some of the variation in responses to the Good Teaching Scale and the Overall Satisfaction item could be explained by the background characteristics (sex, age etc.) of the respondents. The results in Table 3.1 indicated that the effects of most background characteristics (with the exception of age) are small compared with those associated with field of study. For 10 minor fields of study, Tables 3.2 and 3.3 show the percentage of the variance in the Good Teaching Scale and the Overall Satisfaction item explained by selected background variables, field of study and institution. It is not possible from the CEQ data to link scores to particular courses (although that is possible within some universities by the institutions themselves). The analyses in Tables 3.2 and 3.3 therefore refer to fields of study rather than courses.

The Good Teaching Scale

Table 3.2 records the percentage of the variance in individual scores on the Good Teaching Scale "explained by" various factors for each of ten fields of study. Two values are shown for each field of study. The value in the first row for a given field of study shows the percentage of variation explained by the university a graduate attended. The value in the second row shows the percentage of variation explained by the university a graduate attended, adjusted for the influence of the other background characteristics.


Table 3.2 Percentage of Variance in the Good Teaching Scale Explained by Selected Graduate and Course Characteristics: Ten Specific Fields of Study, Bachelor Degree Respondents, CEQ 2001

Field of Study            Sex    Age   NESB  Attendance  Fees  Activity  University
                           %      %     %       %         %       %          %
Accounting                0.0    1.3   0.0     0.1       0.3     0.4        5.4
  (29/2925)               0.1    1.4   0.0     0.1       0.3     0.4        5.2
Biology                   0.5    4.4   0.3     0.5       1.5     1.4        5.3
  (13/674)                0.6    4.7   0.5     0.7       1.5     1.6        5.2
Business admin            0.3    4.1   0.0     0.5       0.1     1.0        6.1
  (19/1654)               0.2    4.1   0.0     0.4       0.1     1.0        5.5
Comm. & journalism        0.3    6.9   0.0     0.3       0.8     1.2        4.4
  (18/1124)               0.1    6.3   0.0     0.1       0.7     0.8        2.9
Computer science          0.3    2.9   0.0     0.0       0.3     0.9        7.6
  (23/1680)               0.3    2.9   0.1     0.1       0.3     0.8        7.3
Law                       0.3    5.5   0.0     0.6       0.2     0.7        4.7
  (13/1018)               0.5    5.3   0.0     0.6       0.2     0.7        4.7
Marketing & distribn      0.3    1.8   0.0     0.3       0.1     0.5        3.2
  (22/1670)               0.2    2.0   0.0     0.5       0.2     0.6        2.9
Nursing - initial         0.0    3.8   0.5     0.8       1.3     0.5        6.6
  (23/1855)               0.0    3.2   0.2     0.3       0.8     0.5        5.4
Psychology                0.2    3.2   0.0     0.1       0.2     0.9        3.6
  (26/1860)               0.1    3.2   0.0     0.2       0.2     0.8        3.6
Teacher edn - primary     0.2    3.1   0.7     0.6       0.4     1.3        9.0
  (20/1325)               0.2    2.5   0.5     0.6       0.4     1.4        7.5

Notes
a. The first row for each field of study contains the percentage of variation in percentage agreement for the Good Teaching Scale that can be attributed to the corresponding variable (sex, age, etc) when that variable is considered by itself. All values are corrected for degrees of freedom.
b. The second row for each field of study contains the unique percentage of variation in percentage agreement for the Good Teaching Scale that can be attributed to the corresponding variable after the variation associated with the other variables was removed. Nested Ordinary Least Squares equations were used. All values are corrected for degrees of freedom. Negative values were converted to zero.
c. The values in parentheses are first the number of universities and then the number of respondents for each field of study.
d. Responses to both first and second major are used as appropriate.
e. Universities with fewer than 20 responses for the relevant field of study were omitted.
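The "unique" values described in note b can be computed as the drop in adjusted R-squared when one variable is removed from the full model. A minimal sketch with simulated data (the variable names, effect sizes and sample size are illustrative assumptions, not survey values):

```python
import numpy as np

def adj_r2(y, X):
    """Adjusted R^2 from an ordinary least squares fit (corrected for df)."""
    n, k = X.shape
    Xd = np.column_stack([np.ones(n), X])      # add an intercept column
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    r2 = 1 - (resid @ resid) / ((y - y.mean()) ** 2).sum()
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

rng = np.random.default_rng(0)
n = 2000
age = rng.normal(size=n)          # illustrative graduate characteristic
university = rng.normal(size=n)   # illustrative stand-in for university dummies
gts = 0.2 * age + 0.25 * university + rng.normal(size=n)

full = adj_r2(gts, np.column_stack([age, university]))
reduced = adj_r2(gts, age.reshape(-1, 1))
# unique variance attributable to "university"; negatives set to zero,
# as in the table notes
unique_university = max(full - reduced, 0.0)
```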


Those results indicate that the university the graduate attended explains an average of six per cent of the variance in scores. The effect of university attended is greatest for primary teacher education (8.9% unadjusted, or 7.3% after adjusting for differences in other factors) and least for Marketing and Psychology (less than 4%). It is interesting that the percentage of variance explained is highest where the alignment between field of study and course is probably greatest. In general the percentage of the variance associated with institution appeared to be a little less than in CEQ 2000.

Although 7.3 per cent of the variance does not seem large, it corresponds to a correlation coefficient of 0.27. The unadjusted percentage of the variance associated with institution, 8.9 per cent, would correspond to a correlation coefficient of 0.30. In most survey research this would be considered a moderately high correlation. In fact the value probably understates the strength of the association because of the measurement errors involved.
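The conversion used above is simply the square root of the proportion of variance explained (the multiple correlation). For example:

```python
import math

def variance_to_correlation(pct_variance):
    """Correlation coefficient implied by a percentage of variance explained."""
    return math.sqrt(pct_variance / 100.0)

print(round(variance_to_correlation(7.3), 2))  # 0.27
print(round(variance_to_correlation(8.9), 2))  # 0.3
```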

Among the other factors in the analysis only age "explains" a significant amount of variance (typically 3.5%). Factors such as gender, ethnic background, mode of attendance, fee status or post-graduation labour market participation had a very small impact. Hence, for Primary Teacher Education the university attended by a graduate explains a substantial amount of the variation in responses to the Good Teaching Scale, graduate characteristics explain a relatively small amount of this variation, and the amount of variation explained is relatively unaffected by differences among universities in the characteristics of their graduates.

The Overall Satisfaction item

Table 3.3 presents the corresponding results for the Overall Satisfaction item. The major difference between Tables 3.2 and 3.3 is the size of the values: in almost every case, the absolute sizes of the values in Table 3.3 are less than the corresponding values in Table 3.2. Typically the percentage of variance explained by institution is 3.6 per cent. This means that responses to the Overall Satisfaction item are less likely to be explained by the individual characteristics of graduates or by the university they attended. The smaller values probably in part reflect the lower reliability of a single-item measure as well as the summative (and less focused) nature of that item.

The adjusted percentage of variation in responses to the Overall Satisfaction item explained by the university a graduate attended is greatest for Initial Nursing (7.3%) and least for Biology (1.5%). For Primary Teacher Education, Communication and Journalism, Marketing, and Computer Science, the percentage of variation in overall course satisfaction explained by differences between universities remains moderate, at about four per cent.


Table 3.3 Percentage of Variance in the Overall Satisfaction Item Explained by Selected Graduate and Course Characteristics: Ten Minor Fields of Study, Bachelor Degree Respondents, CEQ 2001

Field of Study            Sex    Age   NESB  Attendance  Fees  Activity  University
                           %      %     %       %         %       %          %
Accounting                0.0    1.8   0.1     0.1       0.1     0.6        2.1
  (29/2917)               0.0    1.8   0.0     0.1       0.0     0.6        2.0
Biology                   0.1    5.0   0.2     0.1       0.6     1.9        1.2
  (13/671)                0.2    5.3   0.2     0.1       0.7     1.9        1.0
Business admin            0.1    3.1   0.5     0.1       0.6     0.7        3.2
  (19/1651)               0.1    2.8   0.2     0.1       0.2     0.6        2.9
Comm. & journalism        0.0    3.7   0.0     0.4       0.0     0.5        4.4
  (18/1121)               0.0    3.5   0.0     0.2       0.0     0.4        4.1
Computer science          0.0    2.4   0.1     0.0       0.1     1.3        3.6
  (23/1673)               0.0    2.4   0.1     0.0       0.0     1.2        3.5
Law                       0.1    5.3   0.0     0.2       0.3     0.7        1.4
  (13/1016)               0.1    5.5   0.0     0.4       0.3     0.9        1.2
Marketing & distribn      0.0    2.0   0.1     0.3       0.2     0.6        3.2
  (22/1658)               0.0    2.0   0.1     0.3       0.1     0.6        3.6
Nursing - initial         0.0    2.8   0.2     0.6       0.5     0.4        7.5
  (23/1852)               0.0    2.6   0.1     0.3       0.3     0.4        6.8
Psychology                0.0    2.8   0.0     0.2       0.0     1.4        3.2
  (26/1851)               0.0    2.6   0.0     0.1       0.0     1.4        3.2
Teacher edn - primary     0.0    2.3   0.4     0.3       0.1     0.5        5.6
  (23/1319)               0.0    2.3   0.4     0.2       0.1     0.4        4.8

Notes
a. The first row for each field of study contains the percentage of variation in percentage agreement for the Overall Satisfaction Item that can be attributed to the corresponding variable (sex, age, etc) when that variable is considered by itself. All values are corrected for degrees of freedom.
b. The second row for each field of study contains the unique percentage of variation in percentage agreement for the Overall Satisfaction Item that can be attributed to the corresponding variable after the variation associated with the other variables was removed. Nested Ordinary Least Squares equations were used. All values are corrected for degrees of freedom. Negative values were converted to zero.
c. The values in parentheses are first the number of universities and then the number of respondents for each field of study in the analysis.
d. Responses to both first and second major are used as appropriate.
e. Universities with fewer than 20 responses for the relevant field of study were omitted.


Institutional Differences within Fields of Study

Initial Primary Teacher Education

Initial primary teacher education was a field in which institutional differences contributed a relatively large percentage of the variation in scores on the Good Teaching Scale. Figure 3.3 represents the distribution of institutional scores within initial primary teacher education. It can be seen in Figure 3.3 that there is a considerable range in institutional means on the Good Teaching Scale: from 27 to 68 (institutions with fewer than 20 graduates have been excluded).

Figure 3.3 Mean Percentage Agreement with the Good Teaching Scale by University: Bachelor Degree Graduates in Initial Primary Teacher Education, CEQ 2001

[Bar chart: mean GTS agreement (%) for 26 institutions, labelled A to Z (N from 21 to 260); institution means range from 27 to 68.]


Figure 3.4 also shows the mean percentage agreement on the Good Teaching Scale from CEQ 2001 for initial primary teacher education graduates (note 7). For each mean it also shows plus and minus one standard error as an error bar (note 8). Although few of the differences between institutions are statistically significant, there are some substantial differences between the ends of the distribution. Institutions A, B and C, the top three institutions, could be considered on the basis of these data to represent "good practice": for these there can be some confidence that more than 50 per cent of graduates agree with the items of the Good Teaching Scale. Figure 3.4 also shows a number of institutions that are significantly below this level. There are other comparisons that could be made -- perhaps of institutions that have similar characteristics in other respects. The important point is that the patterns highlight instances where further investigation appears warranted, followed by reflection and review.

Figure 3.4 Percentage Agreement with the Good Teaching Scale for Initial Primary Teacher Education Graduates: CEQ 2001

Psychology

Psychology was a field of study in which the amount of variance attributable to the university attended was about average. Figure 3.5 shows the distribution among universities of the mean percentage agreement for Psychology graduates on the Good Teaching Scale. Leaving aside five institutions with 20 or fewer respondents, the institution means for the percentage agreement score range from 25 to 60. In other words, the mean percentage agreement for the top institution is more than twice that for the bottom institution. It is clear from Figure 3.5 that there are quite large differences among universities in the extent to which their graduates agree with items on the Good Teaching Scale.

Note 7: The analysis has been restricted to institutions with at least 20 respondents to this scale of CEQ 2001.
Note 8: There is some debate in the literature regarding the estimation of the appropriate within-institution, and within-field-of-study, standard errors. For this exercise the standard errors have been estimated under the assumption that the samples within each institution can be treated as simple random samples.
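Under that simple-random-sample assumption, a one-standard-error bar can be sketched as below. This treats each graduate's response as agree/disagree (a binomial approximation; the report does not state its exact estimator for scale scores, so this is an illustration only):

```python
import math

def agreement_se(p_agree, n):
    """Standard error (in percentage points) of a percentage agreement,
    treating the n respondents as a simple random sample of agree/disagree
    outcomes (an assumption matching note 8's SRS treatment)."""
    p = p_agree / 100.0
    return 100.0 * math.sqrt(p * (1 - p) / n)

# hypothetical institution with mean agreement 68% and N = 50
se = agreement_se(68, 50)
lower, upper = 68 - se, 68 + se   # the +/- one-SE error bar plotted in Figure 3.4
```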



Figure 3.5 Mean Percentage Agreement with the Good Teaching Scale by University: Bachelor Degree Psychology Graduates, CEQ 2001

Figure 3.6 displays the distribution of mean percentage agreement scores for psychology graduates on the Overall Satisfaction item, but in a different format: the data are categorised in five percentage point intervals. Again, the range of percentage agreement scores (leaving aside five universities with fewer than 20 respondents) is considerable.

[Bar chart: mean GTS agreement (%) for 30 institutions, labelled A to AD (N from 25 to 151); institution means range from 25 to 60.]


Figure 3.6 Mean Percentage Agreement with the Overall Satisfaction Item by University: Bachelor Degree Psychology Graduates, CEQ 2001

[Histogram: number of institutions in each five-percentage-point band of mean Overall Satisfaction agreement, with bands running from 45 - 49 to 95 - 99.]
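The five-point banding used in Figure 3.6 can be sketched as follows (the band boundaries are an assumption read off the figure's axis labels):

```python
def band_counts(means, start=45, stop=100, width=5):
    """Count how many institution means fall in each five-point band,
    as in the Figure 3.6 histogram. Bands are half-open: [lo, lo + width)."""
    counts = {}
    for lo in range(start, stop, width):
        label = f"{lo} - {lo + width - 1}"
        counts[label] = sum(1 for m in means if lo <= m < lo + width)
    return counts

# hypothetical institution means for one field of study
print(band_counts([56, 57, 62, 71, 73, 74])["70 - 74"])  # 3
```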

Summary

Three factors were associated with differences in CEQ scores for the Good Teaching Scale and the Overall Satisfaction item. First, older graduates rated their courses more favourably than did younger graduates, so courses with substantial proportions of older graduates will tend to have higher levels of satisfaction. Second, there were differences among fields of study. Generally, graduates from the humanities and social science fields of study recorded higher scores on the Good Teaching Scale and the Overall Satisfaction item than those from science fields of study, and there were substantial differences between specific fields of study within broad fields. There may be differences in approaches to teaching among fields of study that could usefully be evaluated, and comparisons between institutions need to make allowance for differences in the enrolment profiles of the institutions. Finally, within fields of study there are substantial differences between institutions in terms of Good Teaching Scale scores and Overall Satisfaction. These differences invite further investigation.


Generic Skills

For some time there has been interest in the knowledge and skills learned through formal education programs that have applicability to fields other than the specific domain of instruction. "Generic skills" is a term that is used in at least two senses. In one sense generic skills are skills that are not domain specific and can be acquired through various fields of study. In another, related sense generic skills are skills learned in one context that can be applied in other contexts (Curtis & McKenzie, 2002). Often these generic skills are learned in educational settings and applied to the world of work, and are then referred to as generic employability skills. Over the past decade generic skills concerned with the world of work have received considerable attention. However, even though the idea of generic skills has become widely accepted, it is much less clear what those skills are and how they relate to domain-specific learning.

Background

A 1992 study based on an analysis of requirements specified in job advertisements, and a survey of employers, concluded that university graduates lacked skills in written and oral communication, an ability to apply academic learning to practical situations, general knowledge, and commercial awareness (NBEET, 1992). Shortly afterwards the Australian Education Council review of young people's participation in post-compulsory education and training argued for a convergence of general and vocational education so that educational programs became more focussed on issues of employability (Finn, 1991). Finn focussed on young people completing school and argued that they should be competent in six areas by the time they leave school: language and communication, mathematics, scientific and technological understanding, cultural understanding, problem solving, and personal and interpersonal skills. The Mayer Committee further developed this general orientation in its report on Key Competencies: Putting general education to work (Mayer, 1992). That report proposed a set of seven key competencies that it regarded as generic to all kinds of work: collecting, analysing and organising information; communicating ideas and information; planning and organising; working with others and in teams; using mathematical ideas and techniques; solving problems; and using technology. A subsequent meeting of the Ministers of Education in 1993 added an eighth competency concerned with cultural understanding.

A number of universities have identified graduate qualities that are generic in applying across a range of fields of study and are important for participation in work and society. These typically refer to problem-solving capacity, commitment to further learning, effective communication, working both collaboratively and autonomously, and social responsibility. One project centred on universities with a technology focus argued for a set of principles around which institutions could frame statements of generic capabilities. These saw generic capabilities being defined at institutional or course levels but implemented and assessed within disciplines and fostered through a variety of approaches to teaching and learning (Bowden et al., 2000). In 2000 ACER was commissioned to produce a test of generic skills for graduates known as the Graduate Skills Assessment. Four domains were identified: written communication, critical thinking, problem solving and interpersonal understanding. Experience has shown that it is possible to measure competence in these domains and report performance along scales that distinguish levels of performance.


A number of other OECD countries developed frameworks for generic skills during the 1990s. For example, in the United States of America, the SCANS report sought to identify employment-related skills, to specify levels of proficiency, and to suggest assessment methods (SCANS, 1991). A review of this and other approaches to generic skills in the United States identified common features such as a core of academic skills; higher-order thinking skills (adapting to change, problem-solving, creativity, decision-making, learning how to learn); and interpersonal and team skills (O'Neil, Allred & Baker, 1997). The Conference Board of Canada (1992), through an employability skills profile, identified the generic academic, personal management, and teamwork skills that are required to varying degrees in most employment situations. In the United Kingdom a variety of approaches emerged through the decade (Moser, 1999). Across the OECD the Definition and Selection of Competencies (DESECO) project seeks to assess the effectiveness of education systems using a broader range of indicators than is available from subject-specific assessments (Salganik, Rychen, Moser, & Konstant, 1999).

Generic Skills in the CEQ

From the earliest of the national CEQ surveys (conducted in 1993) the GCCA added a set of items concerned with generic skills to the national survey. The Generic Skills items were added in response to an interest in the broader skills (beyond the discipline-specific skills and knowledge) that are developed through university study (NBEET, 1992). The Generic Skills scale consists of six items:

  The course developed my problem-solving skills.
  The course sharpened my analytic skills.
  The course helped me develop my ability to work as a team member.
  As a result of my course, I feel confident about tackling unfamiliar problems.
  The course improved my skills in written communication.
  My course helped me to develop the ability to plan my own work.
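The "percentage agreement" reported for this scale can be sketched as the share of responses falling in the agree categories. A minimal illustration; the coding of 4 = agree and 5 = strongly agree on a five-point response scale is an assumption, not stated in this report:

```python
def percent_agreement(responses, agree_codes=(4, 5)):
    """Percentage of Likert responses in the 'agree' categories.

    The five-point coding (4 = agree, 5 = strongly agree) is assumed here
    for illustration only.
    """
    agree = sum(1 for r in responses if r in agree_codes)
    return 100.0 * agree / len(responses)

# hypothetical responses to one Generic Skills item
print(percent_agreement([5, 4, 4, 3, 2, 1, 4, 5]))  # 62.5
```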

Differences Among Fields of Study

The differences in generic skills scores between broad fields of study were not large. Table 4.1

provides mean percentage agreement indicators for ten broad fields of study for three survey

years: 1995, 1998 and 2001. The same data are displayed in Figure 4.1. In CEQ 2001 the

range in scores was approximately ten percentage points: from 70 per cent for Engineering to

60 per cent for Architecture. Overall the highest scores were recorded for science-oriented

fields (Engineering, Agriculture, Science) and the lowest scores were recorded for architecture

and education.

On average there has been a small increase of three percentage points from 1995 to 2001 but a

substantial increase for Veterinary Science (a field with few graduates each year). In health

and education there were also large increases of nearly five percentage points over the six-year

period. In engineering, science and the humanities there were increases of two to three

percentage points. There were rather smaller increases for architecture (which increased from

1995 to 1998 and then declined in 2001), business and law, and a drop of three percentage

points for agriculture.

Page 44: Course Experience Questionnaire 2001


Table 4.1 Mean Percentage Agreements for Generic Skills by Broad Field of Study:

CEQ 1995, 1998 and 2001 (Bachelor Degree Graduates).

Year of Survey

Broad Field of Study 2001 1998 1995

Agriculture 68.1 68.2 71.3

Architecture 59.5 62.0 58.4

Humanities and Social Science 63.8 64.3 61.9

Business Studies 62.1 62.6 60.9

Education 60.5 58.8 55.5

Engineering 69.5 67.6 66.7

Health 62.1 59.9 57.2

Law 62.0 64.4 60.6

Science 64.7 64.1 62.6

Veterinary Science 62.1 59.6 46.5

All broad fields 63.2 62.9 60.4

Figure 4.1 Trends in Percentage Agreement for Generic Skills for Broad Fields of

Study: CEQ 1995, 1998 and 2001 (Bachelor Degree Graduates).



Table 4.2 provides the percentage agreement for each item of the Generic Skills scale by

broad field of study for the CEQ 2001 survey. There is some evidence of different patterns

among fields of study for different items and differences in the range among fields. These

have been illustrated in Figure 4.2.

Figure 4.2 Patterns of Agreement with Items on the Generic Skills Scale for Broad

Fields of Study

The two items on which there was a wide dispersion in responses from different broad fields

of study were those concerned with the ability to work as a member of a team and skills in

written communication. For the item concerned with written communication the range was

from the lowest of Veterinary Science at 27 per cent (followed by Architecture at 49 per

cent) to the group of three highest: Agriculture, Humanities and Law (at 81, 80 and 78 per

cent respectively). On the item concerned with working as a member of a team the range was

from a high of 64 per cent for Engineering to a low of 27 per cent for Law (and 37

per cent for the Humanities).

The items concerned with analytic skills and problem solving skills had ranges from top to

bottom of 23 and 26 percentage points respectively. Engineering and Veterinary Science

featured at the top for both these items and Education featured at the bottom of each. For

written communication skills, where the range was 54 percentage points, the highest ratings

came from graduates in the humanities and agriculture and the lowest from graduates in

veterinary science, architecture and engineering. The range was least wide for confidence in

tackling unfamiliar problems (13 percentage points) and planning one's own work (14

percentage points).

Table 4.3 records the percentage agreement for each item and the mean percentage agreement

for the generic skills scale for selected specific fields of study. The same data are displayed in

Figure 4.3.



Table 4.2 Percentage Agreement for Generic Skills Items by Broad Field of Study: CEQ2001.

Field of Study (columns, left to right): Agriculture, Architecture, Humanities, Business,

Education, Engineering, Health, Law, Science, Vet Science

The course developed my problem-solving skills. 71 68 66 65 57 80 68 72 73 83

The course sharpened my analytic skills. 71 62 74 66 56 78 64 76 74 79

The course helped me develop my ability to work as a team member. 55 57 37 56 56 64 55 27 54 63

As a result of my course, I feel confident about tackling unfamiliar problems. 60 54 56 54 54 67 55 54 60 64

The course improved my skills in written communication. 81 49 80 67 71 57 64 78 61 27

My course helped me to develop the ability to plan my own work. 71 68 71 65 69 71 65 65 67 57

Mean Percentage Agreement 68 60 64 62 61 70 62 62 65 62


Table 4.3 Percentage Agreement for Generic Skills Items by Specific Field of Study: CEQ2001.

Specific Field of Study (columns, left to right): Communication, Psychology, Business Admin,

Accounting, Marketing, Primary Teacher Ed., Nursing - initial, Law, Computer Science, Biology

The course developed my problem-solving skills. 60 76 63 61 71 56 64 79 73 73

The course sharpened my analytic skills. 65 85 66 62 72 52 58 83 70 77

The course helped me develop my ability to work as a team member. 49 40 50 48 72 65 56 24 60 52

As a result of my course, I feel confident about tackling unfamiliar problems. 56 57 53 49 65 53 52 59 60 59

The course improved my skills in written communication. 82 85 66 58 76 69 69 79 48 69

My course helped me to develop the ability to plan my own work. 67 74 66 60 71 69 63 67 64 67

Mean Percentage Agreement 63 69 61 56 71 61 60 65 63 66


Figure 4.3 Percentage Agreement for Generic Skills Items by Specific Field of Study:

CEQ2001.

The data in Table 4.3 and Figure 4.3 refer to selected specific fields of study. It can be seen

that the differences among fields are greatest for the items referring to being a team member,

written communication and analytical skills. In terms of being a team member the gap is

between marketing (72 per cent) and law (24 per cent). For written communication the gap is

between psychology (85 per cent) and computer science (48 per cent). With respect to

analytical skills the gap is between psychology (85 per cent) and primary teaching (52 per

cent).

A conclusion that can be drawn from these data is that, although these skills may be generic to

a range of fields of study, they are achieved (and possibly emphasised) to widely varying

extents.

Differences Among Institutions

It is possible to imagine that there could be differences among institutions on the generic skills

scale that could reflect the extent to which courses provide opportunities for and emphasise

underlying skills and the application of those skills to contexts other than the particular course

of study. In order to explore institutional differences in generic skills investigations were

made of initial nursing and accounting.

Nursing

Figure 4.4 shows the mean percentage agreement on the generic skills scale for nursing

graduates from 26 institutions in 2001.


Figure 4.4 Mean Percentage Agreement on Generic Skills for Initial Nursing by

Institution.

The overall pattern is that typically 60 per cent of nursing graduates from an institution are in

agreement with the set of items concerned with the development of generic skills. The range

in the institutional values is from 45 per cent to 78 per cent. It could be of considerable

interest to investigate the characteristics of nursing courses where the mean percentage

agreement was 70 per cent or higher and compare those characteristics to courses where the

mean percentage agreement is less than 55 per cent.

[Horizontal bar chart: mean agreement (%) for each institution, labelled A to AA, with values ranging from 45 to 78 per cent and institutional sample sizes from N=28 to N=249.]


Accounting

Figure 4.5 shows differences among accounting graduates from 25 institutions on the generic

skills scale. The average value across institutions was 56 per cent and the range was from 44

to 69 per cent. As was noted for nursing graduates it would be of interest to explore the

characteristics of courses where the graduates recorded high scores on the generic skills scale

and compare those courses with similar courses on which accounting graduates scored lower

on the scale.

Figure 4.5 Mean Percentage Agreement on Generic Skills for Accounting by

Institution

[Horizontal bar chart: mean agreement (%) for each institution, labelled A to AG, with values ranging from 44 to 69 per cent and institutional sample sizes from N=22 to N=230.]


Table 4.4 Results of Principal Component Analysis of the GSS for Each Broad Field of Study

Factor Loadings for Each Broad Field of Study and Total

Generic Skills Scale Items

Columns (left to right): Agriculture; Architecture; Arts & Humanities (F1, F2); Business

Studies; Education; Engineering; Health; Law; Science; Vet Science; All Fields

The course developed my problem solving skills 78 73 71 30 76 75 79 77 81 76 80 77

The course sharpened my analytic skills 78 74 82 76 73 79 74 82 76 77 76

The course helped me develop my ability to work as a team member 51 56 94 58 57 62 61 39 52 46 48

As a result of my course, I feel confident about tackling unfamiliar problems 77 76 59 54 77 78 76 77 77 76 71 77

The course improved my skills in written communication 71 58 76 69 70 53 63 74 62 44 64

My course helped me to develop the ability to plan my own work 71 67 67 70 71 69 72 72 70 76 70

Percentage variance explained by principal component 51% 46% 47% 17% 51% 50% 50% 50% 52% 48% 45%

Eigen values for components with eigen values greater than 1. 3.1 2.8 2.8 1.0 3.0 3.0 3.0 3.0 3.1 2.9 2.7


Dimensions of the Generic Skills Scale

One of the issues in using the Generic Skills Scale is the extent to which it functions as a scale

in different broad fields of study. In order to examine this question a series of principal

components analyses was conducted for each of the ten broad fields of study. Results are

reported in Table 4.4. For each field of study, except arts, humanities and social sciences,

one principal component was identified, explaining approximately 50 per cent of the variance in responses

to the items. In these fields the item that fitted the structure least well was the item concerned

with developing “my ability to work as a team member”. This item fitted the structure best in

the fields of engineering and health. However, even this item still fitted the scale satisfactorily

in all nine fields.
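An analysis of this kind can be sketched as follows. The data here are synthetic, generated to mimic six items driven by one latent factor (they are not the CEQ responses), and the sketch applies the eigenvalue-greater-than-one rule used in Table 4.4:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for one broad field: 200 graduates, six Generic
# Skills items driven largely by a single latent "generic skills" factor.
latent = rng.normal(size=(200, 1))
items = 0.8 * latent + 0.6 * rng.normal(size=(200, 6))

# Principal components of the six-item correlation matrix.
corr = np.corrcoef(items, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]           # largest eigenvalue first
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Loadings on the first principal component and its share of variance.
loadings = eigvecs[:, 0] * np.sqrt(eigvals[0])
explained = eigvals[0] / eigvals.sum()

print(int(np.sum(eigvals > 1.0)))  # components with eigenvalue above one
print(round(explained, 2))         # proportion of variance, first component
```

With a single dominant latent factor, only one eigenvalue of the correlation matrix exceeds one and the first component accounts for well over half of the variance, mirroring the pattern reported for most broad fields.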

For graduates from the humanities arts and social sciences the item concerned with teamwork

appeared to capture a dimension that was a little different from the other items in the scale9.

For students in this field of study, working as a member of a team was partly linked to feeling

confident about tackling unfamiliar problems. It is possible that this dimension was reflecting

aspects of the development of personal confidence for graduates in the humanities, arts and

social sciences. This issue deserves further exploration in other studies. This does not mean

that the item cannot be used as part of the scale but simply that it is capturing a different facet of

working as a member of a team.

Summary

The items that were constructed in the early 1990s to obtain information about the emerging

concepts of generic skills have formed part of the CEQ since the 1993 survey. Analysis of

responses to those items in CEQ 2001 indicates that they do provide information about these

aspects of the course experience of graduates. It is important that the emphasis on generic

skills in bachelor degree programs be examined as part of the CEQ, just as good teaching is.

There is a long tradition that argues that a primary function of university education is to

develop knowledge, skills and understanding that extend beyond particular disciplines and

finds application in the wider world. Analyses of CEQ 2001 indicate that there are differences

between fields of study in graduates’ opinions about the emphasis placed on generic skills and

more importantly differences in the emphasis placed on different aspects of generic skills.

Although the skills may be generic in a broad sense they are not equally represented in

different fields. Within two specific fields that were examined it appeared that there were

differences among institutions that deserved further exploration. Finally analyses of the

structure of the scale indicated that it constituted a coherent set of items in almost all broad

fields of study but that the item concerned with working as a member of a team may have

involved other issues, particularly for graduates in the humanities, arts and social sciences.

9 The first principal component still explained 47 per cent of the variance and the second component added an additional 16 per cent of variance.


Properties of the CEQ

The Scales

Five scales have been presumed to underlie responses to 23 of the items in the CEQ. The 25th

item has always been treated separately as an overall measure of satisfaction, and the 16th item

has never been included in any of the scales and its content has changed over time.

The five scales are:

The Good Teaching Scale (GTS);

The Clear Goals and Standards Scale (CGS);

The Appropriate Workload Scale (AWS);

The Appropriate Assessment Scale (AAS); and

The Generic Skills Scale (GSS).

The allocation of the items to the scales is shown in Figure 5.1.

The use of scales instead of individual items is intended to both simplify the presentation of

results and to improve the robustness of the measures. Simplification is achieved by

combining results for several closely related items in a single statistic. Robustness is

enhanced because what is being measured does not depend on the particular wording of one

item but draws strength from the group of items.

Reliabilities of the Scales

The extent to which scales measure the construct behind the items reliably is indicated by a

reliability coefficient. There are many forms of reliability coefficient but they all can have

values ranging from 0 (unreliable) to 1 (completely reliable). Reliability coefficients indicate

the extent to which one could expect to obtain the same score on several different

administrations of the scale and the consistency between the items making up the scale. Table

5.1 records two reliability coefficients for each of the CEQ scales.

Table 5.1 Reliability of the CEQ Scales: Bachelor Degree Graduates, CEQ 2001

CEQ Scale | Cronbach Alpha | Composite Scale Reliability

Good Teaching Scale 0.87 0.91

Clear Goals and Standards 0.78 0.81

Appropriate Workload Scale 0.71 0.75

Appropriate Assessment Scale 0.71 0.77

Generic Skills Scale 0.77 0.84
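Cronbach's alpha can be computed directly from an item-score matrix. The following is a minimal sketch using the standard formula on invented scores (the composite scale reliabilities in the table come from a separate, model-based calculation not shown here):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Illustrative data: three respondents' scores on a four-item scale.
scores = np.array([
    [4, 4, 3, 4],
    [2, 3, 2, 3],
    [5, 4, 5, 5],
], dtype=float)

print(round(cronbach_alpha(scores), 2))  # prints 0.94
```

Higher values indicate that the items vary together, which is the sense in which the scale score "draws strength from the group of items".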


Good Teaching Scale (six items)

3. The teaching staff of this course motivated me to do my best work.

7. The staff put a lot of time into commenting on my work.

15. The staff made a real effort to understand difficulties I might be having with my work.

17. The teaching staff normally gave me helpful feedback on how I was going.

18. My lecturers were extremely good at explaining things.

20. The teaching staff worked hard to make their subjects interesting.

Clear Goals and Standards Scale (four items)

1. It was always easy to know the standard of work expected.

6. I usually had a clear idea of where I was going and what was expected of me in this course.

13. It was often hard to discover what was expected of me in this course.

24. The staff made it clear right from the start what they expected from students.

Appropriate Workload Scale (four items)

4. The workload was too heavy.

14. I was generally given enough time to understand the things I had to learn.

21. There was a lot of pressure on me to do well in this course.

23. The sheer volume of work to be got through in this course meant it couldn't all be thoroughly

comprehended.

Appropriate Assessment Scale (three items)

8. To do well in this course all you really needed was a good memory.

12. The staff seemed more interested in testing what I had memorised than what I had understood.

19. Too many staff asked me questions just about facts.

Generic Skills Scale (six items)

2. The course developed my problem-solving skills.

5. The course sharpened my analytic skills.

9. The course helped me develop my ability to work as a team member.

10. As a result of my course, I feel confident about tackling unfamiliar problems.

11. The course improved my skills in written communication.

22. My course helped me to develop the ability to plan my own work.

Figure 5.1 Scales and Items of the Course Experience Questionnaire

Structure of the CEQ

The existence of groups of items, and the composition of the groups, relating to common

underlying dimensions in the CEQ had been established through successive exploratory factor

analyses reported in previous reports of the CEQ (Johnson, 1999). This structure was

confirmed by analyses of the CEQ 1999 data (Long & Hillman, 2000), in which confirmatory

factor analysis was used to establish that the data

provided a good fit to the model underlying the CEQ.

Exploratory Factor Analysis

Table 5.2 shows the pattern of factor loadings from which the structure of the instrument can

be inferred. Factor analysis is used to explore the pattern of correlations between item

responses. Through an analysis of these patterns it seeks to make inferences about the

underlying latent factors that might explain the patterns of variation in items.


Table 5.2 Factor Loadings Derived from Exploratory Factor Analysis of CEQ Items:

Bachelor Degree Graduates: CEQ 2001

No. | CEQ Item | (Scale) | Loadings

17. The teaching staff normally gave me helpful feedback on how I was going. (GTS) 79
15. The staff made a real effort to understand difficulties I might be having with my work. (GTS) 76
7. The staff put a lot of time into commenting on my work. (GTS) 76
18. My lecturers were extremely good at explaining things. (GTS) 69
20. The teaching staff worked hard to make their subjects interesting. (GTS) 67
3. The teaching staff of this course motivated me to do my best work. (GTS) 65, 30
16. n The assessment methods employed in this course required an in-depth understanding of the course content. (GTS) 37, 32
10. As a result of my course, I feel confident about tackling unfamiliar problems. (GSS) 74
2. The course developed my problem-solving skills. (GSS) 72
5. The course sharpened my analytic skills. (GSS) 71
22. My course helped me to develop the ability to plan my own work. (GSS) 65
11. The course improved my skills in written communication. (GSS) 58
9. The course helped me develop my ability to work as a team member. (GSS) 53
1. It was always easy to know the standard of work expected. (CGS) 76
6. I usually had a clear idea of where I was going and what was expected of me in this course. (CGS) 71
13. r It was often hard to discover what was expected of me in this course. (CGS) 70
24. The staff made it clear right from the start what they expected from students. (CGS) 45, 59
21. r There was a lot of pressure on me to do well in this course. (AWS) 76
4. r The workload was too heavy. (AWS) 75
23. r The sheer volume of work to be got through in this course meant it couldn’t all be thoroughly comprehended. (AWS) 73
14. I was generally given enough time to understand the things I had to learn. (AWS) 35, 55
8. r To do well in this course all you really needed was a good memory. (AAS) 77
12. r The staff seemed more interested in testing what I had memorised than what I had understood. (AAS) 76
19. r Too many staff asked me questions just about facts. (AAS) 71

Notes: r = a reversed item; n = not used in subsequent analyses. Items are grouped by the factor on which they loaded most strongly (Factors 1 to 5); where a second value is shown the item also loaded above 0.30 on another factor.

GTS = Good Teaching Scale; GSS = Generic Skills Scale; CGS = Clear Goals and Standards Scale; AWS = Appropriate Workload Scale; AAS = Appropriate Assessment Scale


Tables of results from factor analysis record, among other statistics, factor loadings, which are

the correlations between the item score and the underlying factor. Factor analysis tables also

record how well a set of underlying factors accounts for the pattern of item responses (as a

percentage of variance and a statistic known as an eigen value).

Table 5.2 shows the group or scale to which the item was assigned based on theory and the

results of analyses survey data from previous years. Five factors had eigen values greater than

one and accounted for 58 per cent of the variance in item responses. Table 5.2 also records

the factor loadings.10 Those factor loadings closely matched the theoretical structure of the

questionnaire and the corresponding statistics in analyses from previous years.

The factor analyses confirmed previous findings that the items could be grouped into five

scales. The Overall Satisfaction item (question 25) was kept separate. In its current version

item 16 was designed to strengthen the Appropriate Assessment Scale. However, it is

apparent from Table 5.2 that item 16 groups with the Good Teaching Scale items. It is

excluded from the current CEQ analyses of scales. In summary, analyses of data from the

CEQ survey of 2001 replicated the exploratory factor analyses of previous years.

Confirmatory Factor Analysis

Confirmatory factor analysis11 was conducted in order to investigate the extent to which

measures based on the five clusters of items identified separately identifiable constructs.

Confirmatory factor analysis (CFA) differs from exploratory factor analysis (EFA) in that

CFA leads to a single ‘identified’ solution that can be tested for ‘goodness-of-fit’ against the

data. A CFA thus tests a ‘measurement theory’ against an available data set and assesses the

goodness of fit of the measurement theory to the observed data.

Table 5.3 presents the results of three sets of confirmatory factor analyses of responses to the

CEQ by graduates of bachelor-level courses. The coefficients associated with each of the

items are shown in Table 5.3 together with a variety of ‘goodness-of-fit’ measures.12 The

first model is the theoretical set of five factors with each item loading on only one factor. It

specified that each item was associated with one and only one latent variable or factor. All

other factor loadings were fixed to zero, as were all correlations among the ‘errors’ of the

items. It was assumed that factors could be correlated.

10 In accordance with convention, factor loadings less than 0.30 have been omitted and decimal points have been dropped.

11 Dr Gerald Elsworth of RMIT University conducted this part of the analysis. The confirmatory factor analyses were carried out with the structural equation modelling (SEM) program LISREL.

12 In analyses with large numbers of factors and items, and a moderately large sample such as this, chi-square is regarded as an index that is excessively sensitive to lack of fit. A number of ‘comparative fit’ (or ‘lack-of-fit’) indices have accordingly been developed which, in various ways, compare the chi-square of the fitted model to that of a baseline or ‘null’ model. For the present analysis, the ‘comparative fit index’ (CFI) was 0.97, the ‘root mean square residual’ (RMSR) was 0.101, and the ‘root mean square error of approximation’ (RMSEA) was 0.038. The CFI can be thought of as a proportional measure of goodness of fit (maximum 1.0), while the RMSR and RMSEA are measures of ‘lack of fit’. Values of standard goodness-of-fit indices over 0.9 are frequently regarded as satisfactory, as are values of RMSEA below 0.05. Hence there was a close fit of the model to the observed data.


Table 5.3 Confirmatory Factor Analyses, Bachelor Degree Graduates, CEQ 2001

No. | CEQ Item | Factor Loading, Models (1) (2) (3) | Squared Multiple Correlation, Models (1) (2) (3)

Good Teaching Scale

3. The teaching staff of this course motivated me to do my best work. 83 78 84 69 61 71

7. The staff put a lot of time into commenting on my work. 79 80 79 63 64 62

15. The staff made a real effort to understand difficulties I might be having . . . 77 77 79 60 59 62

17. The teaching staff normally gave me helpful feedback on how I was going. 84 85 85 71 72 72

18. My lecturers were extremely good at explaining things. 80 79 82 64 62 68

20. The teaching staff worked hard to make their subjects interesting. 75 74 78 56 55 61

Clear Goals and Standards Scale

1. It was always easy to know the standard of work expected. 76 75 73 57 56 53

6. I usually had a clear idea of . . . what was expected of me in this course. 80 80 78 64 61 61

13.* It was often hard to discover what was expected of me in this course. 80 67 75 64 46 56

24. The staff made it clear right from the start what they expected from students. 75 67 73 56 45 54

Appropriate Workload Scale

4.* The workload was too heavy. 66 70 48 43 48 24

14. I was generally given enough time to understand the things I had to learn. 79 53 72 62 28 51

21.* There was a lot of pressure on me to do well in this course. 58 66 27 33 43 07

23.* The sheer volume of work . . . couldn’t all be thoroughly comprehended 73 72 58 53 52 33

Appropriate Assessment Scale

8.* To do well in this course all you really needed was a good memory. 63 68 54 39 46 30

12.* The staff seemed more interested in testing what I had memorised . . . 84 82 74 71 68 55

19.* Too many staff asked me questions just about facts. 69 66 64 48 44 41

Generic Skills Scale

2. The course developed my problem-solving skills. 78 79 75 61 63 56

5. The course sharpened my analytic skills. 77 81 75 60 65 56

9. The course helped me develop my ability to work as a team member. 42 44 37 17 19 14

10. As a result of my course, I feel confident about tackling unfamiliar problems. 75 77 74 57 59 54

11. The course improved my skills in written communication. 66 62 67 44 39 45

22. My course helped me to develop the ability to plan my own work. 66 67 67 43 45 45

Notes to Table 5.3

a. Models: (1) is a first-order confirmatory factor analysis (CFA) model with five factors, each item loading on only

one factor, and no additional correlations between the unexplained variance in the items. (2) considers five separate

one-factor ‘congeneric’ measurement models. (3) is a first-order CFA model with one factor only and no additional correlations between the unexplained variance in the items.

b. All models are calculated from a polychoric correlation matrix and a corresponding asymptotic covariance matrix with

weighted least squares.

c. N of cases, 49,773.

d. Fit Statistics: Model 1 – ChiSq=25,828.4 (d.f. 220); RMSEA=0.049; SRMR=0.100; AGFI=0.98; CFI=0.88. Model 3 -

ChiSq=45,867.1 (d.f. 230); RMSEA=0.062; SRMR=0.150; AGFI=0.96; CFI=0.95.

Model 2 - Good Teaching Scale - ChiSq=2,503.9 (d.f. 9); RMSEA=0.075; SRMR=0.048; AGFI=0.99; CFI=0.97. Clear

Goals and Standards - ChiSq=15.00 (d.f. 2); RMSEA=0.011; SRMR=0.004; AGFI=1.00; CFI=1.00. Appropriate

Workload Scale - ChiSq=663.0 (d.f. 2); RMSEA=0.080; SRMR=0.029; AGFI=0.99; CFI=0.97. Appropriate Assessment

Scale – Saturated model, no fit statistics calculated. Generic Skills Scale – ChiSq=3391.2 (d.f. 9); RMSEA=0.087;

SRMR=0.059; AGFI=0.98; CFI=0.93.

e. Correlations between the factors in Model 1: GT/G&S = .76; GT/AW = .52; GT/Ass = .59; GT/GSk = .64; G&S/AW =

.57; G&S/Ass = .51; G&S/GSk = .58; AW/Ass = .47; AW/GSk = .33; Ass/GSk = .52.

f. * denotes a reversed item.


Model 2 tested the separate one-factor models for each of the presumed scales separately. It is

really a set of models consisting of separate analyses: first the six items of the Good Teaching

Scale predicted by one latent variable, then the four items of the Clear Goals and Standards

Scale, and so on. Model 3 assumed that all items reflected a single underlying trait,

perhaps ‘satisfaction with course’.

Values for two statistics are presented for each item for each model. The factor loading

reflects the effect the underlying dimension has on responses to the item and the squared

multiple correlations show the extent to which the model explains variance in the item. A

number of goodness-of-fit statistics are presented for each model in the notes.

In the first model the factor loadings are generally high. Item 9 (The course helped me

develop my ability to work as a team member) has a lower factor loading than desirable, and this is

consistent with the discussion in the previous chapter. Item 21 from the Appropriate

Workload Scale (There was a lot of pressure on me to do well in this course) also had a low

factor loading. The squared multiple correlations are similarly overall fairly high, except for

these items. The measures of fit that are unaffected by sample size show very good levels of

fit. It can be concluded that the CEQ model fits the pattern of responses to the items well.

Moreover the results are similar to those reported for CEQ 1999 and CEQ 2000. The third

model tests the result of assuming a single factor (perhaps overall satisfaction) could explain

the variation in CEQ responses. As would be expected, the fit of most items is worse than for

either of the other two models. However, a single trait model does fit the responses to the

CEQ to a reasonably good extent.
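As a rough check on the tabulated fit statistics, the RMSEA can be approximated from a model's chi-square, degrees of freedom and sample size. The sketch below uses one common formula (an assumption on our part; LISREL's exact computation for weighted least squares may differ slightly) with Model 1's values from the notes to Table 5.3:

```python
from math import sqrt

def rmsea(chi_sq: float, df: int, n: int) -> float:
    # Root mean square error of approximation, using the common formula
    # sqrt(max(chi2 - df, 0) / (df * (N - 1))).
    return sqrt(max(chi_sq - df, 0.0) / (df * (n - 1)))

# Model 1: ChiSq = 25,828.4 with 220 d.f. and N = 49,773 cases.
print(round(rmsea(25828.4, 220, 49773), 3))  # prints 0.048
```

This gives 0.048, close to the reported 0.049; the small gap presumably reflects differences in the exact estimator used by the software.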

Summary

The investigation of the structure of the CEQ from the 2001 survey provides results that are consistent with those from previous years. The structure of the measurement model in the CEQ fits the pattern of responses in the survey data, and the scales have satisfactory reliability. The results also suggest that it may be possible to identify a dimension of general course satisfaction that influences many of the separate scale scores, implying a common element of general satisfaction underpinning graduate responses. However, for exploring features of good practice so as to improve the quality of courses, it is probably more fruitful to use the scores on the separate scales.
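Scale reliability of the kind referred to above is conventionally summarised by coefficient (Cronbach's) alpha. The sketch below shows the standard computation on simulated data; the data, the single-factor structure, and the function name are illustrative assumptions, not the survey data or the report's exact method.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Coefficient alpha for an (n_respondents, n_items) score matrix."""
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)
    total_variance = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_variances.sum() / total_variance)

# Simulated 5-item scale driven by one common factor (illustrative data,
# not the survey responses); the population alpha here is about 0.89.
rng = np.random.default_rng(0)
factor = rng.normal(size=(1000, 1))
scores = factor + 0.8 * rng.normal(size=(1000, 5))
print(round(cronbach_alpha(scores), 2))
```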

Page 59: Course Experience Questionnaire 2001

Appendices

References

AVCC (1995). The AVCC Code of Practice. Canberra: Australian Vice-Chancellors’

Committee.

Australian Education Council. Finn Review Committee. (1991). Young People's Participation

in Post-Compulsory Education and Training. Report of the Australian Education

Council Review Committee. Canberra: AGPS.

Australian Education Council. Mayer Committee. (1992a). Key Competencies. Report of the

Committee to advise the Australian Education Council and Ministers of Vocational

Education, Employment and Training on employment-related Key Competencies for

postcompulsory education and training. Canberra: Australian Education Council and

Ministers of Vocational Education, Employment, and Training.

Australian Education Council. Mayer Committee. (1992b). Putting General Education to

Work. The key competencies report. Canberra: AGPS.

Bowden, J., Hart, G., King, B., Trigwell, K., & Watts, O. (2000). Generic Capabilities of ATN

University Graduates, [Web document]. Available:

http://www.clt.uts.edu.au/ATN.grad.cap.project.index.html [2000, 28 August].

Conference Board of Canada. (1992). Employability Skills Profile: What are Employers

looking for? Ottawa: Conference Board of Canada.

Curtis, D. & McKenzie, P. (2002). Employability Skills for Australian Industry: Literature

Review and Framework Development. Report to the Business Council of Australia and

the Australian Chamber of Commerce and Industry. Melbourne: ACER.

Eley, M.G. (2001). The Course Experience Questionnaire: altering question format and phrasing could improve the CEQ's effectiveness. Higher Education Research and Development, 20(3), 293-312.

Entwistle, N.J. and Ramsden, P. (1983). Understanding Student Learning. London: Croom

Helm.

Guthrie, B. and Johnson, T.G. (1997). Study of Non-Response to the 1996 Graduate

Destination Survey. Canberra: AGPS.

Hambur, S., & Glickman, H. (2001). Summary Report: GSA Exit 2000. Melbourne: ACER.

Johnson, T. (1999). The 1998 Course Experience Questionnaire. Melbourne: GCCA.

Linke, R. (1991). Performance Indicators in Higher Education, Vols 1 and 2. Canberra:

AGPS.

Long, M. & Hillman, K. (2000). 1999 Course Experience Questionnaire. Melbourne: GCCA.

Moser, C. (1999). A Fresh Start – Improving Literacy And Numeracy. UK, Department of

Education and Science. Available:

http://www.lifelonglearning.co.uk/mosergroup/index.htm [2001, 21 May].


National Board of Employment, Education and Training (NBEET), Employment and Skills

Formation Council. (1992). The Australian Vocational Certificate Training System

(Carmichael Report). Canberra: National Board of Employment, Education and

Training.

O'Neil, H. F., Allred, K., & Baker, E. L. (1997). Review of Workforce Readiness Theoretical

Frameworks. In H. F. O'Neil (Ed.), Workforce readiness. Competencies and assessment

(pp. 3-25). Mahwah, NJ: Lawrence Erlbaum.

Ramsden, P. (1991a). Report on the Course Experience Questionnaire trial. In R. Linke (Ed.), Performance Indicators in Higher Education, Vol 2. Canberra: AGPS.

Ramsden, P. (1991b). A performance indicator of teaching quality in higher education: The Course Experience Questionnaire. Studies in Higher Education, 16(2), 129-150.

Ramsden, P. and Entwistle, N.J. (1981). Effects of academic departments on students’

approaches to studying. The British Journal of Educational Psychology, 51, 368-383.

Salganik, L. H., Rychen, D. S., Moser, U. & Konstant, J. W. (1999). Definition and Selection

of Competencies. Projects on Competencies in the OECD Context. Analysis of

Theoretical and Conceptual Foundations. Neuchatel, Switzerland: OECD.

SCANS. (1991). What Work Requires of Schools. A SCANS Report for America 2000.

Washington, DC: U.S. Department of Labor.

Wilson, K., Lizzio, A. & Ramsden, P. (1997). The development, validation and application of the Course Experience Questionnaire. Studies in Higher Education, 22(1), 33-53.


Appendix A: The Course Experience Questionnaire


Appendix B: The AVCC Code of Practice


Appendix C: Response Rates of Institutions Participating in GDS 2001

Institution Npopln GDS Nresps GDS RR (%) CEQ Nresps CEQ RR (%)

Australian Catholic University 2678 1877 70.1 1812 67.7

Australian Maritime College 137 65 47.4 52 38.0

Australian National University 2223 1239 55.7 994 44.7

Avondale College 190 166 87.4 157 82.6

Bond University 688 385 56.0 312 45.3

Central Queensland University 3096 1537 49.6 1176 38.0

Charles Sturt University 6291 3165 50.3 2710 43.1

Curtin University of Technology 5064 2340 46.2 2052 40.5

Deakin University 6085 3510 57.7 3466 57.0

Edith Cowan University 4205 2140 50.9 2109 50.2

Flinders University of South Australia 3074 1986 64.6 1819 59.2

Griffith University 5678 3897 68.6 2356 41.5

James Cook University 1449 758 52.3 651 44.9

La Trobe University 5685 3841 67.6 3711 65.3

Macquarie University 5269 2925 55.5 2412 45.8

Marcus Oldham College 40 16 40.0 14 35.0

Monash University 6799 3655 53.8 3134 46.1

Murdoch University 2390 1332 55.7 1305 54.6

Northern Territory University 795 426 53.6 388 48.8

Queensland University of Technology 7584 5107 67.3 4478 59.0

RMIT 6880 3771 54.8 2982 43.3

Southern Cross University 2045 1019 49.8 990 48.4

Swinburne University of Technology 2314 1373 59.3 1267 54.8

University of Adelaide 3131 1669 53.3 1220 39.0

University of Ballarat 1303 776 59.6 530 40.7

University of Canberra 1929 993 51.5 918 47.6

University of Melbourne 8897 5320 59.8 2755 31.0

University of New England 2802 1835 65.5 1769 63.1

University of New South Wales 7543 3894 51.6 2233 29.6

University of Newcastle 3734 2203 59.0 2000 53.6

University of Notre Dame 202 128 63.4 116 57.4

University of Queensland 7157 4096 57.2 2484 34.7

University of South Australia 5398 2897 53.7 2709 50.2

University of Southern Queensland 2772 1626 58.7 1547 55.8

University of Sydney 7568 4015 53.1 3161 41.8

University of Tasmania 2766 1732 62.6 1410 51.0

University of Technology, Sydney 6617 3397 51.3 3050 46.1

University of the Sunshine Coast 341 254 74.5 250 73.3

University of Western Australia 3465 1909 55.1 1394 40.2

University of Western Sydney 5223 3355 64.2 3093 59.2

University of Wollongong 3150 1668 53.0 1176 37.3

Victoria University 3496 2113 60.4 1400 40.0

Total 158153 90410 57.2 73562 46.5

Note: Response rate calculations are based on the number of survey forms returned. Nresps = the number of survey forms containing sufficient GDS background information to process. A return to the CEQ is defined as a graduate who has a valid score for at least one of the CEQ scales or the Overall Satisfaction item for either the first or second course on the questionnaire.
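The response rates in the table above follow directly from the population and return counts. A minimal sketch of the calculation, checked against the Australian Catholic University row (the function name is illustrative, not from the report):

```python
# Response rate as used in Appendix C:
# RR (%) = 100 * number of returned forms / population surveyed,
# reported to one decimal place.
def response_rate(n_returns: int, n_population: int) -> float:
    return round(100.0 * n_returns / n_population, 1)

# Australian Catholic University row from the table:
print(response_rate(1877, 2678))  # GDS: 70.1
print(response_rate(1812, 2678))  # CEQ: 67.7
```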


Appendix D: Comparison of Characteristics of CEQ 2001 Respondents and

the Population of Bachelor Degree Graduates from 2000

Percentage Distribution

CEQ 2001 Respondents Bachelor Degree Graduates 2000

Sex

Male 37.3 41.3

Female 62.7 58.7

Age

24 and younger 64.3 68.9

25-29 years 14.1 13.4

30-39 years 12.1 10.8

40 and older 9.5 6.9

Field of Study

Agriculture 1.8 1.4

Architecture 2.1 2.2

Arts, Humanities & Soc. Sc. 24.0 24.0

Business Studies 23.6 26.1

Education 9.6 8.3

Engineering 5.4 5.9

Health 13.7 11.9

Law 3.8 3.7

Science 15.8 16.2

Veterinary Science 0.2 0.3

Residency

Permanent resident 91.2 82.2

Overseas resident 8.8 17.8

Level of Course

Bachelor honours 9.6 7.5

Bachelor pass 88.9 91.3

Undergraduate diploma 1.4 1.2

Notes: Table based on responses to first major to avoid double counting.

Population values are derived from DETYA (2002) Students 2001: Selected Higher Education Statistics.

Canberra: Department of Education, Science and Training. (http://www.dest.gov.au/highered/statpubs)