

Appendix 1 Page 1 of 4

Appendix 1

OLAAF Partner Site Summary

The 7 Consortium Partners are representatives of the following institutions:

Birkbeck College (Lead Site)

Brunel University

London Metropolitan University

University of Birmingham

University of Brighton

University of Plymouth

University of Wales College of Medicine

1. Birkbeck College

Biological and Chemical Sciences

In Biology at Birkbeck, we have long experience in teaching mature students from non-standard backgrounds, at undergraduate and postgraduate level. Non-native English speakers form an increasing proportion of our student body, a pan-European trend that may well be the future for the HE sector. One solution has been to embed web-delivered CBAF in foundation-year and second-year undergraduate units. This has enhanced student learning by providing unrestricted access to focused tutorial materials, on site and from home.

Dr Rayne led the above-mentioned project, and Dr Baggott was an evaluator for the TLTP project QUERCUS. Birkbeck was an evaluation site for the FDTL1 project 34/96, TRIADS (Tripartite Assessment Delivery System).

A priority area for embedding CBAF within programmes is support for quantitative biology at all levels, together with integration with CD-based pedagogy.

Birkbeck College

Earth Sciences

Over the last few years, the Earth Sciences department has developed CD- and web-based learning materials for a number of Geology and Earth Science courses, enabling distance learning and significantly increasing student intake. Involvement in the OLAAF project will lead to the integration of computer-based assessment and feedback (CBAF) within some of these courses.


The Certificate/Diploma in Geochemistry, currently being developed for CD-ROM delivery in October 2004, will incorporate elements of CBAF.

2. Brunel University

Biological Sciences

The Biological Sciences Department at Brunel University is still relatively new to CBL and related teaching methods. We hope that, through research from the E-Learning group within the Department, we will be able to increase the use of the technology available to us to improve student learning.

The eLearning section of the Cell and Chromosome Biology Group at Brunel University specialises in virtual lectures, simulated practical classes and computer-based assessment of biological subject matter. Currently we incorporate CBL in several of our modules and run a number of simulated practical classes in place of 'wet' laboratories. Research in our group provides evidence that the CBL approach can significantly improve the student learning experience and provide added benefits for the lecturer.

3. London Metropolitan University

Human and Health Science

E-learning at London Metropolitan received a boost in 1999, with substantial investment in the New Tower IT facility providing 600 computers for students. Most lecturers now put some or all of their learning material online.

Several courses in the Department of Human and Health Science will participate in the OLAAF project, with the aim of introducing CBAF to complement the existing use of learning technology.

The initial phase of the project will focus on first-year Cell Biology (170 students); MSc Wound Healing (20 students); second-year Bioanalytical Techniques (50 students); and second-year Experimental Biochemistry (50 students).

4. University of Birmingham

Civil Engineering

The team at Birmingham have been involved in Computer Based Assessment and Feedback (CBAF) for many years. They were an evaluation site for the TRIADS project, and the system continues to be used at postgraduate level. Questionmark Perception is also used at Birmingham, and a comparison between the two systems has been identified as an area of interest.


The use of CBAF within the School of Engineering is entirely formative, with high-quality on-screen feedback provided for each question. It has proved an invaluable tool for increasing student understanding of the subject matter.

5. University of Brighton

Institute of Nursing and Midwifery (INaM)

Computer Based Assessment and Feedback (CBAF) will initially be used as a formative tool to facilitate the acquisition of competency in numeracy by pre-registration students of Nursing and Midwifery. The online assessment and feedback will use the TRIADS software to deliver drug dosage calculations to first-year students on the Pre-Registration Diploma in Nursing (PRDN) and the BSc(Hons) Nursing and Midwifery courses.

6. University of Plymouth

Biological Sciences

All of us at Plymouth are using, in one form or another, problem-based learning (PBL) approaches:

Les Jervis uses a PBL approach in the laboratory component of a second-year hybrid PBL course in environmental and ecological biochemistry. Simon Belt has developed a series of PBL case studies (with funding from the Royal Society of Chemistry) that are used in the seminar component of a final-year analytical chemistry course. Neil Witt uses a PBL approach with final-year Navigation students using an advanced marine navigation simulator.

Although our approaches to PBL differ, we all have similar reasons for using this pedagogy. We want students to think through more or less complex problems that have several possible solutions. That is, we want students to be able to work in situations where there is not necessarily any one right answer, and to start exercising judgements based on pre-existing knowledge and on knowledge gained in the process of working towards a solution of the problem.

Our contribution to OLAAF will be to develop assessments designed to determine whether or not students are meeting our intended learning objectives. We are under no illusion that designing either the assessments or the feedback will be easy. Problem-based learning has at least as many critics as supporters. It presents difficulties in assessment design, and many of the assessment approaches developed over the past 30 years appear to have little connection to the specific learning objectives that may seem suitable for PBL approaches. OLAAF will give us an opportunity to build on good examples and our own experiences.

7. University of Wales College of Medicine

Dental School

The Dental School at the University of Wales College of Medicine has a long history of using Computer Based Assessment with Feedback (CBAF), with over 12 years' expertise in e-learning and assessment.

As a TRIADS project evaluation site, they have incorporated the system into several modules, and are now exploring innovative ways of using CBAF to mimic microscopy classes and to emulate medical practice.


Appendix 2 Page 1 of 8

Appendix 2

OLAAF Interest Group Summary

The 7 OLAAF Interest Group members are representatives of the following institutions:

British School of Osteopathy

Edge Hill College of Higher Education

Keele University

Manchester Metropolitan University

Warwickshire College

University of Ulster

Kingston University

1. British School of Osteopathy

As members of the OLAAF Interest Group, the team at the BSO will use TRIADS to develop computer-based assessment for the Bachelor of Osteopathy degree. They will focus on the Level 3 unit Professional Capability, which usually involves 70-80 students. The assessments will be used formatively to support clinic-based learning. Students will be provided with information about a clinical situation and asked to demonstrate their understanding and knowledge, or to choose a course of action, including requesting additional information, which involves moving through an algorithmic series of 'stations'. The aim is to assess decision-making and cognitive knowledge. The team also hope to pilot a simple MCQ test of basic science knowledge as part of their selection procedure.

In the long term, ways of integrating CBAF into the new Virtual Learning Environment will be explored.

2. Edge Hill College of Higher Education

The current initiative will build on 15 years' experience of CBA at Edge Hill, using QM DOS. The main subject areas involved are Geology, Geography and Biology, where staff will be trained in the use of the TRIADS system. Both formative and summative assessments will be designed for undergraduate modules, and the following areas of development have been identified:

• development of knowledge and understanding from a theory course in physical geography

• supporting practical-based learning in geology and environmental science

• supporting graphical learning in both geology and geography


It is expected that CBA will form a significant component of a new Level 1 geography course running from September 2003. This course (a revamped Year 1) integrates theoretical knowledge and practical skills. CBA will also be used in other modules, and all initiatives will be placed in a proper experimental context so that we can report on its utility.

TRIADS will be the main authoring tool.

3. Keele University

We will be implementing computer-based assessment within the subject area of Animal Physiology, initially on a Level 2 module with 40-80 students. Students will be directed to specified reading (the programmed learning) on a weekly basis throughout the module. Each block of reading will be formatively assessed by objective questions available online, with feedback. Approximately six times during the semester, summative tests using questions of the same type as the formative tests (and including some identical questions) will be given, also with feedback.

Until confidence in the CAA software and our ability to deliver it is established, the summative tests will be paper-based. We intend to move to online assessment for this stage too, as soon as possible. A conventional essay-style end-of-module examination will also be given.

Previous studies supported by an LTSN TDF grant have shown that this style of teaching (but using paper-based assessment throughout) produces an improvement in examination performance over delivery by lectures and essay examinations. A further comparison following the work described above is intended, aiming to compare the effectiveness of online formative assessment with paper-based delivery.

Questionmark Perception will be the main authoring tool.

4. Manchester Metropolitan University

I am one of 18 university senior learning & teaching fellows with a special interest in e-learning. There are other staff within the department involved in the general field of computer-assisted learning and assessment. Although we still offer relatively few 'pure' distance courses, we have plans to extend the provision, particularly for masters students.

Online assessment, using WebCT, is being increasingly adopted, and staff in Biological Sciences are beginning to make more extensive use of online formative tests as one means of supporting our students and improving our teaching. We are particularly interested in exploring the most efficient and effective methods of providing feedback to students who make use of the formative assessment opportunities.


5. Warwickshire College

Formative CBAF will be developed using TRIADS. It will be integrated into the Professional Development module (Level 1) on the Foundation Degree in Equine Studies, which involves about 70 students each year.

As part of the Professional Development module, students must evaluate CAL and CBAF as a means of delivering health & safety training within the equine industry. The students are currently supplied with a CD-ROM featuring a safety training program. The CAL program is also included in a training package which can be delivered through yard-based distance learning directly into the workplace. To complete the package for evaluation and training purposes, it is proposed to develop CBAF materials to form part of the program assessment.

6. University of Ulster

We have been using CAA for around 10 years. The initial driving force was the need to assess large numbers of students in the health sciences in an introductory science module. Over the years we have tried to refine our approach, making much more use of CAA for formative assessment. This required setting up a large bank of questions, all of which contained substantial feedback. Our early efforts were supported by a University grant (Enterprise in Higher Education) of £5K, which allowed us to develop resource materials using QuestionMark for Windows. Subsequently we embarked on a bigger project (£27K) to set up web-based assessment using QuestionMark Perception. In 2001/2 we received LTSN Physical Science funding (£5K) to develop question banks in Chemistry. Currently (2003/4) we are funded (£10K) by the LTSN to undertake a project with Manchester Metropolitan University to develop questions associated with laboratory work. KRA is also on the editorial board of SPQR, a National Teaching Fellowship project being undertaken by Bob Rotherham at Nottingham Trent University; the focus of this project is the development of question banks in Sociology.

Our interest in OLAAF is not only to share good practice but also to explore the use of innovative question types aimed at testing higher-order skills. In QM Perception, the hotspot or drag-and-drop question template is the ideal vehicle for this. We would also be very interested in establishing a database of research publications on fixed-response questions and in conducting collaborative research in this field.

7. Kingston University

Computer-assisted assessment based on an Optical Mark Reader (OMR) was introduced in the Faculty of Science at Kingston University in 1992-3. With the increase in student numbers and the introduction of a modular degree scheme, the OMR system is now used across the University to administer more than 30,000 formative and summative tests a year.


A strong emphasis has been given to question design for computer-assisted assessment, and much time has been invested in devising challenging questions that can be adapted and re-used. A question bank currently holding some 3,000 questions has been compiled, which includes item statistics against which to evaluate question performance.

As an extension to this initiative, web-based assessment is now being adopted in a range of subject areas across the University, including Chemical and Pharmaceutical Science, Computing, Earth Sciences and Geography, Economics, Engineering, Life Sciences, Psychology and Mathematics. Currently, questions, which may include feedback, are being delivered via the assessment module in Blackboard. From September 2003, the primary authoring tool will be Questionmark Perception, and staff from various subject areas will contribute to the aims of the OLAAF project.

Details of intended case studies

a. Biology of Disease

b. Web Technologies

c. Inorganic Chemistry

d. Modern Techniques for Mathematics

e. Foundation Medicinal and Pharmaceutical Chemistry

f. Entry to Year 1: Chemistry, Pharmaceutical Science, Medicinal Chemistry

7a. Biology of Disease

Subject area: Life Sciences

Programme of study: Biomedical Science

Course/Module: LS2070A Biology of Disease

Level: 2

Estimated number of students involved: 100

Is the assessment summative/formative or both? Formative

Will computer-based assessment and feedback (CBAF) be used to support a particular pedagogy? No

Please give a brief description of how you will use CBAF:


Provide students with an opportunity to practise assessments and receive automated feedback prior to formative assessment.

Details of any previous experience of CBAF (include OMR if you give feedback other than marks): Used OMR and provided students with oral feedback.

7b. Web Technologies

Subject area: Computing (within the School of Maths)

Programme of study: Computing with Business Management; various Science Joint Honours programmes; Media Technology; Communications Systems.

Course/Module: CO2013/3013A Web Technologies

Level: Level 2/3

Estimated number of students involved: 2002/3: 112 (actual); 2003/4: 200 (estimate)

Is the assessment summative/formative or both? Formative and summative.

Will computer-based assessment and feedback (CBAF) be used to support a particular pedagogy? Not specifically in 2002/3. In 2003/4 the module will include formative assessment to reinforce exercise-based learning ('practice-based'?).

Please give a brief description of how you will use CBAF: Regular (weekly/fortnightly) formative assessments to allow students to identify weaknesses not highlighted by the weekly exercises (which are textbook-based programming tasks).

Details of any previous experience of CBAF: In CO2013/3013A (2002/3 autumn term) I used the Blackboard (Bb) environment to give one 'formative' assessment (quiz). This was unsuccessful, as various enrolment issues (late option choices and SITS/Bb problems) meant the quiz did not tie in with the material it was designed to reinforce. I also gave two in-class summative tests using Bb, which ran satisfactorily, but the lack of feedback statistics in Bb has precluded any proper development or analysis of the tests. I used Bb's pools to randomise the answers to multiple choice/answer questions and therefore have no summary statistics from the tests. In the past I have used the OMR; our online testing environment would benefit greatly from the kind of feedback statistics one receives from that.


7c. Inorganic Chemistry

Subject area: Chemistry

Programme of study: BSc Chemistry

Course/Module: CH2010A Inorganic Chemistry I; CH2020B Inorganic Chemistry II

Level: Second year

Estimated number of students involved: 25

Is the assessment summative/formative or both? Both summative tests and formative

Will computer-based assessment and feedback (CBAF) be used to support a particular pedagogy? Problem-based learning

Please give a brief description of how you will use CBAF:

Progress/revision tests to be delivered on-line.

Details of any previous experience of CBAF: None

7d. Modern Techniques for Mathematics

Subject area: Mathematics

Programme of study: Mathematics Joint BSc and Mathematical Sciences BSc

Course/Module: MA1050 Modern Techniques for Mathematics

Level: One

Estimated number of students involved: 15

Is the assessment summative/formative or both? Both formative and summative

Will computer-based assessment and feedback (CBAF) be used to support a particular pedagogy? No pedagogy in particular is used, but there are aspects of problem-based learning, and with the aid of the Blackboard VLE some students treat the module almost as if it were designed for distance learning (which it is not).


Please give a brief description of how you will use CBAF: The module has a set of laboratory exercises bound into a hardcopy study guide, which aims to guide students towards self-discovery of mathematical material using the symbolic algebra package Derive. Although largely self-paced, each week concludes with a feedback lecture clarifying results students should have obtained in the computing laboratories. The main component of feedback is a full Derive solution file for each week, released in Blackboard at the start of the feedback lecture. This formative assessment feedback also serves as a resource for students to use in the summative assessments, if required. Full Derive solutions for the in-class test from a previous year are also released on Blackboard. Coursework solutions are sometimes released in a similar way, but more often the fully annotated hardcopies are returned to the students and key features of the assessment (including common errors) are discussed in a feedback lecture.

Details of any previous experience of CBAF: I have used OMR summative assessment for several years on a basic mathematics and statistics module for non-mathematical scientists (approximately 400 students).

7e. Foundation Medicinal and Pharmaceutical Chemistry

Subject area: Assessment of Practical Experiments and Experimental Skills

Programme of study: MPHS/Medicinal Science (perhaps Pharmacy from 2004)

Course/Module: CH1741A Foundation Medicinal and Pharmaceutical Chemistry I; CH1751B Foundation Medicinal and Pharmaceutical Chemistry II

Level: First Year

Estimated number of students involved: 60-95

Is the assessment summative/formative or both? Summative (part of the continuous assessment)

Will computer-based assessment and feedback (CBAF) be used to support a particular pedagogy? Yes: we anticipate that students will benefit not only from receiving their online test results rapidly but also from being able to access valuable feedback from other sources, and hence gain a wider learning experience.

Please give a brief description of how you will use CBAF:


We will provide an "open-book" online assessment for the student to participate in. Upon completion, the student (and the tutor) will be provided with an assessment mark. The student will then be able to access additional information sources online. These sources will not only provide correct data for erroneous or inaccurate responses but also provide enhanced tutorial support. It is anticipated that participation in these activities will be widened to other institutions, and further support will be sought.

Details of any previous experience of CBAF (include OMR if you give feedback other than marks): Our project has arisen from our extensive use of OMR for previous summative assessments.

7f. Entry to Year 1: Chemistry, Pharmaceutical Science, Medicinal Chemistry

Subject area: Chemical and Pharmaceutical Sciences

Programme of study: BSc and Masters programmes in Chemistry, Pharmaceutical Science, Medicinal Chemistry

Course/Module: Entry to year 1

Level: 1

Estimated number of students involved: 100

Is the assessment summative/formative or both? Formative

Please give a brief description of how you will use CBAF:

Diagnostic assessment of first-year students during induction, before they enter the courses listed above.

Details of any previous experience of CBAF: OMR used in tests/exams/practicals


Appendix 3

Appendix 3 is the Dissemination Strategy. Because the latest version of this document can be found on the OLAAF web site (See About OLAAF> Dissemination), it is not reproduced here.

Appendix 4

Appendix 4 is a photocopy of a small feature article about the OLAAF Project written for the “BBK Magazine”. It has not been possible to reproduce this here.


Appendix 5 Page 1 of 2

Appendix 5

Programme for Plymouth Event

LTSN Bioscience, LTSN Geography, Earth and Environmental Sciences, the OLAAF Project and the University of Plymouth: On-line Assessment and Feedback

Venue: The Robbins Conference Centre, University of Plymouth (Plymouth Campus)

Date: Tuesday 21st October, 2003

Why is this meeting important?

Assessment needs to be effective and efficient, and to support student learning. The use of on-line assessment, linked to effective immediate feedback, offers the prospect of more efficient and effective use of staff time AND improved feedback to students on how to improve future learning. In addition, on-line assessment can inform staff about common student errors and misconceptions, thereby improving future teaching. Recent developments in on-line assessment authorware improve the flexibility of question types and the ability to assess higher-level learning. New authorware is easier to use, allowing staff to focus on assessment and feedback design. This meeting will examine general issues of assessment and feedback and look in more depth at experiences of using two on-line authorware systems, QM Perception and TRIADS. Opportunities will be available for delegates to present posters on their own work and to try out examples of both authorware systems. Staff experienced in the use of both systems will be available throughout the day to discuss your individual issues about implementing on-line assessment and feedback. The meeting is sponsored by the LTSN Centre for Bioscience and the LTSN Centre for Geography, Earth and Environmental Sciences.

How will you benefit from attending?

By attending this conference delegates should be able to:

• Hear national speakers presenting on cutting edge topics (Graham Gibbs, Joanna Bull)

• Explore new ideas about their assessment and feedback issues

• Present posters on their own work

• View the demonstrations from FDTL4 and other projects

• Network with staff involved in the on-line assessment and feedback projects

• Get involved in the Panel Discussion

Who should attend?

This conference will be relevant to all staff involved in assessment, feedback and improving student learning.


Programme

10.00 Registration and Coffee

10.30 Welcome and Introduction

Ivan Sidgreaves, Pro Vice Chancellor and Dean of Students

Learning and Teaching Support Networks

Professor Brian Chalkley (LTSN GEES, University of Plymouth)

Dr Heather Sears (LTSN Bioscience)

11.00 Keynote Presentation: Computer Based Assessment & Feedback

Joanna Bull (Eduology)

11.45 The Question Mark Perception Experience

Chris Ricketts, Roy Lowry, Sally Wilks (University of Plymouth)

12.45 Lunch, Posters and Demonstrations (QM Perception and TRIADS)

Matt Newcombe and Janet Corboy (University of Plymouth)

Prof. Don Mackenzie (University of Derby)

14.00 The TRIADS Experience and OLAAF

Professor Don Mackenzie (University of Derby) - Introduction

Glenn Baggott (Birkbeck), Richard Rayne (Birkbeck), Les Jervis (Plymouth), Simon Belt

(Plymouth), Neil Witt (Plymouth)

15.15 Keynote Presentation: Does Your Assessment Support Your Students' Learning?

Professor Graham Gibbs (Open University)

16.00 Panel Discussion

Sue Burkill (Chair), Joanna Bull, Professor Graham Gibbs, Professor Don Mackenzie, Glenn

Baggott, Chris Ricketts, Les Jervis


Core Team (Birkbeck College)

Dr Richard Rayne [email protected] Director 0207 631 6253

Jenny Phillips [email protected] Manager 0207 700 2258

Ellen McCarthy [email protected] Expert 0207 079 0713

Kavita Shah [email protected] Expert 01895 274 000 x 4727

Consortium Site Leaders

Dr Glenn Baggott [email protected] Birkbeck College 0207 631 6244

Prof Chris Branford-White [email protected] London Metropolitan 020 7753 5125

Dr Richard Freer-Hewish [email protected] University of Birmingham 0121 414 5149

Dr Darren Griffin [email protected] Brunel University 01895 274 000

Dr Les Jervis [email protected] University of Plymouth 01752 232 929

Dr John Potts [email protected] University of Wales College of Medicine 029 2074 4239

Mr Patrick Saintas [email protected] University of Brighton 01273 644 077

One-day Event

On-Line Assessment and Feedback

Tuesday October 21st 2003, The Robbins Conference Centre, University of Plymouth

• Explore current issues in assessment and feedback

• Hear experiences of using different on-line authoring systems: QM Perception and TRIADS.

• Exchange ideas with other enthusiasts.

• Pose your questions to a panel of experts.

• Present a poster on your own work.

• Keynote presentations from Dr Joanna Bull (Eduology) and Professor Graham Gibbs (Open University).

This one-day event is sponsored by LTSN Bioscience, LTSN Geography, Earth and Environmental Sciences, the OLAAF Project and the University of Plymouth.

Registration, refreshments and buffet lunch are free of charge.

Register online at: http://bio.ltsn.ac.uk/events/registration/olaaf.htm

OnLine Assessment And Feedback

A HEFCE Fund for the Development of Teaching and Learning Project 2002-2005

OLAAF

www.bbk.ac.uk/olaaf

OLAAF DL LEAFLET 19/8/03 9:18 pm Page 1


OLAAF Network

Outcomes of the OLAAF project will arise through the energy and efforts of a Network of contributors.

The project was initiated by a group of Consortium Partners and is led by a Core Team at Birkbeck College, University of London. The Network also includes an Interest Group, recruited from around the UK, which adds breadth to the range of subject areas and pedagogical approaches represented within the project.

Consortium Partners

Birkbeck College, University of London: Biological & Chemical Sciences; Earth Sciences

University of Birmingham: Civil Engineering

University of Brighton: Institute of Nursing & Midwifery

Brunel University: Biological Sciences

London Metropolitan University: Human and Health Science

University of Plymouth: Biological Sciences

University of Wales College of Medicine: Dental School

OLAAF Interest Group

Members of the Interest Group will contribute to the evolution of the project’s Assessment Construction Resources by undertaking case studies, participating in workshops, and by sharing their work at annual project conferences. Interest Group members include:

British School of Osteopathy (TRIADS): Osteopathy

Edgehill College of Higher Education (TRIADS): Geology, Geography

Keele University (QM PERCEPTION): Life Sciences

Kingston University (QM PERCEPTION/BLACKBOARD): Life Sciences, Computing, Chemistry, Mathematics, Pharmaceutical Science, Medicinal Chemistry

Manchester Metropolitan (WEB CT): Biological Sciences

University of Ulster (QM PERCEPTION): Chemistry and Biochemistry

Warwickshire College (TRIADS): Equine Studies (via distance learning)

Curious…? …about how you can get involved? Please visit our web site and/or contact the OLAAF Project Director, Dr Richard Rayne, School of Biological & Chemical Sciences, Birkbeck College, Malet Street, London WC1E 7HX. [direct line 020 7631-6253]

http://www.bbk.ac.uk/olaaf

Project Overview

The primary aim of the OLAAF project is to develop and disseminate resources to support authors in the design, delivery and evaluation of Computer-Based Assessment with Feedback (CBAF).

Focusing on fundamentals of assessment construction, these resources will be relevant to authors of computer-based assessments irrespective of the preferred CBAF authoring and delivery system. These materials – which we aim to make available via the OLAAF web site from Spring 2004 – will provide practical guidance on how to strategically combine questions within assessments, how to enhance feedback to students, how to ensure usability, and how to evaluate the impact of CBAF on the student experience.

The main assessment authoring tool used by the OLAAF Consortium Partners is TRIADS, a product of the Centre for Interactive Assessment Development at the University of Derby. Based on Macromedia Authorware, TRIADS was chosen for its ability to produce and deliver highly interactive computer-based assessments capable of testing higher order learning. Members of an OLAAF Interest Group will bring to the project their experiences of other CBAF authoring/delivery tools, such as WebCT and Questionmark Perception.

The OLAAF project will run from November 2002 to September 2005. It has been made possible by a grant from HEFCE under the Fund for Development of Teaching and Learning phase 4, and by the support of the Consortium Partner Institutions.



Page 1 of 4

Appendix 7 External Assessment Evaluation Report

OLAAF Project Dr D. O’Hare August 2003

Areas of Evaluation

The scope of the evaluation was limited as specified in the OLAAF Evaluation Strategy

(9/07/2003 version). Throughout, the following 2 questions were considered against the

accompanying criteria:

1. Have the assessments designed as part of the OLAAF project been designed to prioritise student learning?

Criteria Assessments have been constructed according to principles outlined in the

Assessment Construction Resources and Context Analysis; Evaluation of the

assessments suggests that the impact of CBAF on student learning is positive.

2. Are the materials produced by the OLAAF project of high quality and do users

find the materials useful? Are the materials accessible?

Criteria The materials are well-researched and evidence-based where possible. The

resources address an identified gap in available information on CBAF and therefore

provide a key service to users. Feedback on the resource materials from both academic and support staff is positive; uptake of materials by staff occurs at

participating institutions and beyond. Accessibility is prioritised as far as possible

throughout the project (web site, printed materials, question authoring advice).

Methods of Evaluation

• Review of existing Assessment Construction Resources (ACR) on the OLAAF website.

• Review of existing assessments produced by the OLAAF team (Field Biology module) according to ACR principles.

• Site visit to review additional materials and to discuss progress with the project team.

The meeting with the OLAAF team (G Baggott, E McCarthy, J Phillips, R Rayne, K Shah) concentrated initially on an overview of the project aims and re-statement of how the work

underway would meet these aims.

Each of the areas was then considered in further detail and, for the purpose of this report, related back to the above questions.



EVALUATION 1: Construction of Field Biology TRIADS assessments

The Birkbeck site leader and Field Biology module leader (G Baggott) explained the

genesis and nature of the assessments which were of two main types: 1) formative

exercises (with full feedback & tutorial support, but no explicit scoring) and 2) self-tests (with no feedback, but with scoring). Documentation was provided which described the

structure of the module, the assessment strategy for the module as a whole, and the

rationale underlying the use of TRIADS CBA as an assessment method. Documentation

further outlined steps undertaken in authoring the TRIADS assessments, specified the schemes for classifying test items by cognitive type, and specified categories describing

the general types of feedback employed. Finally, for each assessment, a complete

inventory was provided, including question title, style of question (e.g. MCQ, multiple response, sequence, etc.), cognitive type, and type of feedback provided. All of this

followed the procedures outlined in the ACR and included, for example, completed

proformas (included in the ACR) for various aspects of context-setting and planning the assessments.

Each assessment consisted of a range of items (around 10) that tested a given area of

the syllabus:

• Definitions

• Association analysis

• Energetics

• Community analysis

• Tides and cycles

• Hypotheses

• Food webs and communities

• Sampling

• Data analysis

• Report writing

The assessments (both tutorial & self-test) consisted of a number of items (mostly MCQ, sequencing & label diagram) in the TRIADS system. However, the Tides and Cycles

assessments made a greater use of animation than many of the other assessments,

which related to the complex nature of the topic matter and the need for a highly visual

presentation.

The assessments were all linked to specific learning outcomes which were stated at the

beginning of the test, and in the formative exercises, prior to each question; thus students had a real understanding of how these activities fitted in with the learning outcomes of the

course as a whole and could build their learning around these smaller outcomes. The

assessment context proformas provided a sound means of classifying the items in terms of the cognitive types. This allowed a clear mapping of the items to the learning outcome

of the assessment.

The formative (tutorial) assessments contained a high level of feedback to the students on an item-by-item basis. The feedback was response-sensitive and thus informed

students of the state of their knowledge. In addition, the use of questions in the feedback to incorrect

responses provided useful prompts for the students, allowing them to be led in constructing their understanding. However, it would have been very useful to have

provided synoptic feedback regarding the attainment of the learning outcomes by the

students at the end of each assessment; this would have provided additional useful information to the students on their learning. It may also be useful to include feedback on

the relative success of the students in both the particular areas of the curriculum on each

test and in relation to the cognitive taxonomies of the items. Thus they could leave the

assessment with a ‘shopping list’ of areas of their learning that they need to make



particular effort in (e.g. “my basic knowledge is good, but I need to work on application of that knowledge…” or “I’m OK on most of the topics, but need to work more on the Data

Analysis material”).

Regarding the operation of the tests (in formative mode only), it might be useful to allow the students multiple attempts at each item (rather than having to choose to go back).

Perhaps this would allow them more immediate feedback as their thinking /

understanding develops on an item.

Evaluation (by the authors) of Field Biology TRIADS assessments

A preliminary evaluation strategy and methodology were described (in conversation with K Shah). Evaluation of the assessments to-date was limited mainly owing to the fact that

prior to the date of this report only one meeting with the students had been possible.

Further limiting the outcome of evaluation to-date was the fact that the students had spent only a limited time with the materials before they were asked to comment on them.

However, some useful initial data has been collected which, together with the focus groups that will be convened in September, will provide the basis for the fine-tuning of the evaluation strategy that will be disseminated to the partner sites.

The assessments were provided to students on CD and thus it was not possible at

present to obtain psychometric data from the items (unlike the case for assessments provided via the web, where the data can easily be captured and recorded on the web

server). Possible strategies to collect data from CD-delivered TRIADS materials were

discussed (e.g. requiring capture of data to diskette which could be returned for analysis).

Recommendations

1. It would be worthwhile to expand the provision of feedback to include end-of-

test synoptic feedback which relates the results of the assessment to (a) the

cognitive levels being assessed and (b) the wider learning outcomes of the unit under test. This will give the students greater ownership and a sense of control

over their learning.

2. It will be important to expand the capability to capture evaluative data from the assessments themselves. Together with information gleaned from

questionnaires and focus groups, this will allow for a very thorough analysis of

the impact of the assessments.

EVALUATION 2: The Assessment Construction Resources (ACR)

The project team demonstrated the ACR, which is presently available on an internal

Birkbeck College website. Although not all of the ACR materials are publicly available at this stage, it is envisaged that they will be disseminated at forthcoming workshops (with

partner sites) and this will provide a basis for partner sites to contribute. Piloting of the

ACR at Birkbeck is now well underway (being used to support the production of the Field Biology assessments) and useful feedback from staff is allowing the iterative

improvement of these resources.

There is much to be commended in the resources here. The addition of carefully chosen literature to support each of the steps in assessment production provides useful

pedagogic & academic support on which to base assessment strategy and construction.

The inclusion of a number of proformas, downloadable as MS-Word documents, will be



invaluable in assisting busy staff in various aspects of planning and record-keeping that are essential to the assessment construction process.

In the main, good use is already being made of other FDTL project resources as well as a

range of other web-based materials. It was suggested, however, that in some cases, there could be more extensive reference to existing materials that support CBA (e.g.

Elicit). This would allow the team to concentrate its efforts on filling gaps in existing

resources, rather than replicating or repackaging those that already exist.

The presentation of the resources and, in particular, the format of the context analysis

tool were discussed at length. Some changes to this tool are planned in the light of this

feedback. It was also suggested that existing case studies (e.g. the materials produced in documenting the Field Biology assessments) could be supplied to exemplify each stage

of the assessment construction process. The inclusion of such “real world” examples

would be extremely valuable, especially to those new to CBA and who are using the approaches outlined in the ACR for the first time. Suggestions were also made regarding

methods of navigation/access to the resources. It was agreed that maximum utility is best

served by providing a variety of access methods (e.g. browsing of a static web site, automated/targeted access via structured questionnaire, Boolean searching via text

queries).

In summary, the OLAAF ACR, by providing a range of up-to-date, practically-focused resources on CBA, will be a valuable resource to practitioners in HE. The basis for a

sound collection of resources that will aid staff in the production of CBAF has been

established.

Recommendations

1. A clear strategy for evaluation of these resources by end users should be

produced, and a protocol established for amending the resources in light of end

user feedback.

2. Care should be taken in organising/presenting the resources to ensure that end

users gain maximum benefit from them.

3. Ensure the usability of the ACR by including explanations of all acronyms, and

providing a glossary of terms.

4. Concentrate on areas where there are gaps in existing resources, and make full

reference to existing materials where appropriate.

5. Continue to expand the resource with examples, case studies and completed

proformas drawn from the project members, in a variety of subject areas.

Include examples which demonstrate poor practice in assessment design.

Signed Date

Dr David O’Hare, CIAD


Appendix 8

Appendix 8 is the Evaluation Strategy. Because the latest version of this document can be found on the OLAAF web site (See About OLAAF> Evaluation), it is not reproduced here.


Appendix 9 Key: PD = Project Director, PM = Project Manager, SL = Site Leaders, IG = Interest Group, ST = Site Teams = milestone Page 1 of 7

Appendix 9 OnLine Assessment and Feedback (OLAAF) Project Plan Preparatory Phase (Nov – Dec 02)

Who Start Finish

Establish Working Procedures of Project Team

• Appoint Project Manager PD Oct 02 Nov 02

• Finalise agreements between Partner Sites PD/ PM/ SL Dec 02 Jan 03

• Refine project plan in line with feedback from exploratory contacts PD/ PM/

G Baggott

Dec 02

• Appoint CBAF support officer PD/ PM/

G Baggott

Nov 02 Feb 03

Communication and Dissemination

• Present at LTSN and Other Events PD/ PM/ SL Oct 02 Ongoing

• Web site live; follows W3C guidelines PM Dec 02

• Organise web-based discussion forum PD/ PM Dec 02

Initiate OLAAF Interest Group

• Identify participants and survey their requirements PD/ PM Dec 02

• Plan interactions of Partner Sites PD/ PM/ SL Dec 02



Phase 1 (Jan – Sep 03)

Key Phase 1 Outcomes

• Working procedures established.

• ACR (assessments, benchmarks, CBAF Toolkit, and evaluation guide) specified and piloted.

• Target sites identified and OLAAF Interest Group established.

Who Start Finish

Project Team Activities

• Refinement of Partner Site and User Group objectives SL/ IG Jan 03 Feb 03

• Preparation of Assessment Benchmarks PM/ PD/ SL Jan 03 Apr 03

• Accessibility audit at main sites; contact Disability Officers SL Feb 03 Mar 03

• Production of draft CBAF Toolkit for use by partner sites PM/ PD/ SL Feb 03 Apr 03

• Production of Evaluation Guide for use by partner sites K MacKenzie-D Feb 03 Apr 03

• ACR piloted at BBK G Baggott Jun 03 Sep 03

• 1st Annual Progress Report to NCT PM 30 Sep 03

Dissemination

• Site Leaders establish Site Groups; web fora initiated, facilitated by Site Leaders SL Feb 03 Apr 03

• Distribute software and funding to Target Sites PM/ PD Jan 03 Jul 03

• Training sessions at Partner Sites to embed ACR SL Sep 03 Dec 03

• Awareness workshops at Partner and Target Sites to advertise OLAAF to potential end-users

and establish links with Learning & Teaching networks

SL/ T&L Reps May 03 Jun 03

Evaluation

• 1st External Assessment Evaluation of ACR and CBAFs D O’Hare Aug 03

Steering Group

1st Meeting SG Jan 03

2nd Meeting SG Sep 03



Phase 2 (May 03 – Feb 05)

Key Phase 2 Outcomes

• ACR refined.

• CBAFs authored, embedded, and evaluated at Partner Sites.

• Dissemination of activities to Target Sites and other OLAAF users.

• Evaluations—external, internal, end-user—inform project development.

Who Start Finish

Project Team Activities

• Authoring of CBAF at sites (for semesters 1 & 2, first iteration) SL/ ST May 03 Nov 03

• Refinement and final implementation of ACR PM/ PD/ SL Sep 03 Oct 03

• ACR piloted within OLAAF Network PD/ PM/ SL/ IG Aug 03 Feb 04

• Roll-out of CBAF’s in modules SL/ ST Oct 03 May 04

• Draft ACR available publicly on web site for use and evaluation PD/ PM/ SL Apr 04 May 04

• Develop publication strategy PD/ PM/ SL Jun 03 Sep 03

• 2nd Annual Progress Report to NCT PM 30 Sep 04

Dissemination

• Workshops at Target Sites on implementation of ACR PD/ PM/ SL Jul 03 Dec 03

• OLAAF Interest Group

o Meeting of OLAAF Users and Project Team to monitor cascading to Target Sites IG/ PM/ PD/ SL Oct 03

o National conference OLAAF1 and Report IG/ PM/ PD/ SL Jan 04

• FDTL4 Assessment Conference (LTSN Generic Centre) TBA

• Reporting of developments at International CAA Conference Jul 04

• ALT-C presentation Sep 04

• Refine continuation plans PD/ PM/ SL Oct 04



Evaluation

• 1st Internal Project Evaluation: to review delivery of outcomes to-date PD/PM/SL Jan 04

• 2nd External Assessment Evaluation: to assess the appropriateness of the assessments constructed to-date D O’Hare Apr 04

• 1st External Evaluation: to evaluate the project management and assess the educational value of the outputs; to review dissemination activities. J Bull May 04

Steering Group

• 3rd meeting SG Mar 04

• 4th meeting SG Sep 04



Phase 3 (Oct 04 – Sep 05)

Key Phase 3 Outcomes

• Develop materials for Resources CD derived from Partner and Target Sites.

o case studies

o question/assessment banks

o image banks

• Dissemination cascade accelerates to fuel continuation.

Who Start Finish

Project Team Activities

• Embedding and student evaluation of refined CBAF’s at sites (semesters 1 & 2, second iteration) PD/ PM/ SL Oct 04 Mar 05

• Alpha-stage specification of Resources CD PD/ PM/ SL Oct 04 Jan 05

• Processing of student evaluations PD/ PM/ SL

• Preparations for further publication PD/ PM/ SL

• Finalising continuation plans PD/ PM/ SL

• Final Annual Progress Report to NCT PM 30 Sep 05

Dissemination

• OLAAF Interest Group

o Target and Partner Sites define CD content and structure IG Oct 04 Dec 04

o National conference: OLAAF2 (and report) IG/ PM/ PD/ SL Jan 05

• Beta stage Resources CD materials available on web site PM Jan 05 May 05

• Release version of Resource CD PD/ PM/ SL Jun 05

• Report results at International CAA Conference (satellite meeting) Jul 05

Evaluation

• 2nd Internal Project Evaluation: to review delivery of outcomes and to finalise continuation plans Internal Evaluator Mar 05



• 3rd External Assessment Evaluation: to assess the functionality of the final assessments; to evaluate the beta version of the Resources CD D O’Hare Apr 05

• 2nd External Evaluation: to evaluate project management; to assess the quality of the deliverables; to review the publication activities; to review the relationship between dissemination and continuation activities J Bull Jun 05

Steering Group

• 5th meeting SG Mar 05

• 6th meeting SG Sep 05



Continuation Phase Who Start Finish

Dissemination

• Workshops by former Project Team and OLAAF Users

• Website with up-to-date contacts

• Publications

Evaluation

• Post-project survey of utility of Resource CD and website Jul 06


Appendix 10: Planned and currently projected expenditure over the 3-year term of the OLAAF project


                    |         Year 1         |         Year 2         |         Year 3         |    Cumulative Totals
                    | Original Current  Diff | Original Current  Diff | Original Current  Diff | Original Current  Diff
Staff               |  63990   51969   12021 |  76153   77312   -1159 |  38413   48000   -9587 | 178556  177281    1275
Travel/Subsistence  |   2700    2370     330 |   5800    6250    -450 |   5300    5680    -380 |  13800   14300    -500
Dissemination       |  14050    8530    5520 |   9300   13020   -3720 |   8700   10500   -1800 |  32050   32050       0
Evaluation          |    700     840    -140 |   2800    3150    -350 |   2800    3150    -350 |   6300    7140    -840
Equipment           |   9700    9635      65 |      0       0       0 |      0       0       0 |   9700    9635      65
Other Costs         |   4950    5205    -255 |   2500    3650   -1150 |   2100     695    1405 |   9550    9550       0
Total               |  96090   78549   17541 |  96553  103382   -6829 |  57313   68025  -10712 | 249956  249956       0

“Original” refers to the sum specified in the OLAAF Project bid. “Current” refers to the actual expenditure (year 1) or projected expenditure (years 2 and 3).

“Diff” is the difference between original and current; positive numbers represent an underspend (“surplus”) relative to the budgeted sum, while negative numbers indicate an

overspend. Despite changes to the rate of expenditure vs. the original plan, the cumulative difference is 0, indicating that the overall expenditure is unchanged.
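The convention above (Diff = Original − Current, with positive values an underspend and a zero cumulative difference overall) can be checked directly from the per-year figures. The sketch below is illustrative only, not part of the project documents; the dictionaries simply transcribe the per-year values from the Appendix 10 table.

```python
# Illustrative check of the Appendix 10 budget arithmetic.
# Per-category values are (Year 1, Year 2, Year 3), transcribed from the table.
original = {
    "Staff": (63990, 76153, 38413),
    "Travel/Subsistence": (2700, 5800, 5300),
    "Dissemination": (14050, 9300, 8700),
    "Evaluation": (700, 2800, 2800),
    "Equipment": (9700, 0, 0),
    "Other Costs": (4950, 2500, 2100),
}
current = {
    "Staff": (51969, 77312, 48000),
    "Travel/Subsistence": (2370, 6250, 5680),
    "Dissemination": (8530, 13020, 10500),
    "Evaluation": (840, 3150, 3150),
    "Equipment": (9635, 0, 0),
    "Other Costs": (5205, 3650, 695),
}

def diffs(cat):
    """Year-by-year Diff = Original - Current (positive = underspend)."""
    return tuple(o - c for o, c in zip(original[cat], current[cat]))

grand_original = sum(sum(v) for v in original.values())
grand_current = sum(sum(v) for v in current.values())

print(diffs("Staff"))                 # (12021, -1159, -9587), as tabulated
print(grand_original, grand_current)  # both 249956, so cumulative Diff is 0
```

Summing the per-year Current figures also confirms the Equipment cumulative Current total is 9635 (Diff 65), consistent with the grand total of 249956.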