
Facilitate the preparation and presentation of evidence for assessment

Sunette Bosch | US 12544 | May 1, 2008

SAQA US ID: 12544
UNIT STANDARD TITLE: Facilitate the preparation and presentation of evidence for assessment
ORIGINATOR: SGB Assessor Standards
FIELD: Field 05 - Education, Training and Development
SUBFIELD: Adult Learning
ABET BAND: Undefined
UNIT STANDARD TYPE: Regular
NQF LEVEL: Level 4
CREDITS: 4
REGISTRATION STATUS: Reregistered
REGISTRATION START DATE: 2006-03-14
REGISTRATION END DATE: 2009-03-14
SAQA DECISION NUMBER: SAQA 0160/05
LAST DATE FOR ENROLMENT: 2010-03-14
LAST DATE FOR ACHIEVEMENT: 2013-03-14

Participant’s Guide

Unit Standard 12544


CONTENT

Ser nr | Topic | Page
1 | Foreword / Introduction | 3
2 | NLRD ID 12544 – Unit Standard | 4-8
3 | Learning Unit 1 – Specific Outcome 1 | 14-66
4 | Learning Unit 2 – Specific Outcome 2 | 67-78
5 | Learning Unit 3 – Specific Outcome 3 | 79-97
19 | List of Sources | 98-99


FOREWORD

At dawn, an elderly couple were walking on the beach, hand in hand. They noticed a young man ahead of them, picking up starfish and hurling them into the sea.

Curious, when they caught up with him they asked him why he was doing this. He answered that if the starfish were stranded when the morning sun came up, they would die.

'But the beach goes on for miles and there are hundreds of starfish,' countered the elderly woman. 'How can your efforts make any difference?'

The young man looked at the starfish in his hand and then he threw it safely into the water.

'It made a difference to that one,' he said…



All qualifications and unit standards registered on the National Qualifications Framework are public

property. Thus the only payment that can be made for them is for service and reproduction. It is

illegal to sell this material for profit. If the material is reproduced or quoted, the South African

Qualifications Authority (SAQA) should be acknowledged as the source.

SOUTH AFRICAN QUALIFICATIONS AUTHORITY

REGISTERED UNIT STANDARD:

Facilitate the preparation and presentation of evidence for assessment

SAQA US ID: 12544
UNIT STANDARD TITLE: Facilitate the preparation and presentation of evidence for assessment
ORIGINATOR: SGB Assessor Standards
FIELD: Field 05 - Education, Training and Development
SUBFIELD: Adult Learning
ABET BAND: Undefined
UNIT STANDARD TYPE: Regular
NQF LEVEL: Level 4
CREDITS: 4
REGISTRATION STATUS: Reregistered
REGISTRATION START DATE: 2006-03-14
REGISTRATION END DATE: 2009-03-14
SAQA DECISION NUMBER: SAQA 0160/05
LAST DATE FOR ENROLMENT: 2010-03-14
LAST DATE FOR ACHIEVEMENT: 2013-03-14

This unit standard replaces:

US ID: 9927 | Unit Standard Title: Conduct an assessment | NQF Level: Level 4 | Credits: 12 | Replacement Status: Complete


PURPOSE OF THE UNIT STANDARD

This unit standard will be useful to candidates who assist others who wish to be assessed to prepare and

present evidence for assessment. Such evidence facilitators will add value to the assessment process by

ensuring candidates are ready to present well organised and complete evidence to registered assessors.

Their value will be particularly felt when assisting candidates who are competent in their field, but who may

be unable to present coherent evidence of that fact for reasons unrelated to their skill area. People credited

with this unit standard are able to:

• Provide information to candidates about outcomes-based assessment in general and their assessment in particular.

• Advise and support candidates to prepare, organise and present evidence.

• Check and give feedback on evidence.

LEARNING ASSUMED TO BE IN PLACE AND RECOGNITION OF PRIOR LEARNING

The credit value is based on the assumption that people learning towards this unit standard already

understand the basic principles of an outcomes-based system, and seek to apply the assessment facilitation

skills within the context of their given area of expertise.

UNIT STANDARD RANGE

References to "evidence facilitator" concern the person who wishes to achieve this unit standard. References

to "the candidate" in this unit standard concern the person who the evidence facilitator is assisting in

preparing for assessment, and do not refer to the evidence facilitator.

Assessment of the evidence facilitator against this unit standard is to take place within the context of given

organisational assessment policies and procedures, using given assessment instruments that are fully

designed in relation to registered unit standards. This means that the evidence facilitators will not be

required to design assessments.

This unit standard does not distinguish between "RPL assessment" and any other form of assessment. The

reason for this is that all assessment involves gathering, evaluating and giving feedback on evidence in

relation to agreed criteria. Therefore, it does not matter whether the evidence facilitator is assisting a

candidate to prepare and present existing evidence in the RPL sense, or whether the evidence facilitator is

assisting candidates to produce evidence after having recently attended a course. It is likely, however,

that evidence facilitators will most frequently assist those seeking RPL.

Specific Outcomes and Assessment Criteria:

SPECIFIC OUTCOME 1

Provide information to candidates about assessment.

OUTCOME RANGE

The information provided to candidates is to include:

• General principles and procedures concerning outcomes-based assessments

• Organisational assessment policies and procedures

• The requirements of the particular assessment at hand.


ASSESSMENT CRITERIA

ASSESSMENT CRITERION 1

Basic information is provided about key concepts and principles concerning the outcomes- based system of

learning and assessment, within the context of the National Qualifications Framework.

ASSESSMENT CRITERION NOTES

Explanations of these key concepts promote understanding of the purpose of assessment and possible

implications for the candidates at individual, organisational, industry and national levels.

ASSESSMENT CRITERION RANGE

The proposals could be made to candidates and/or assessors and other role-players.

ASSESSMENT CRITERION 2

Interactions with candidates help to set them at ease and promote understanding of the assessment.

ASSESSMENT CRITERION RANGE

Understanding of the specific assessment process, the expectations of the candidate, the organisational

assessment policy, moderation and the appeals procedures.

ASSESSMENT CRITERION 3

Information to candidates is clear, precise and in line with instructions provided in the assessment

instruments, and opportunities are provided for clarification concerning the process and the expectations.

ASSESSMENT CRITERION 4

The information helps candidates to identify possible sources of evidence and the most appropriate and

effective means for producing evidence for the assessment.

SPECIFIC OUTCOME 2

Advise and support candidates to prepare, organise and present evidence.

ASSESSMENT CRITERIA

ASSESSMENT CRITERION 1

Potential barriers to gathering evidence and special needs of candidates are identified, and appropriate

guidance is given to overcome such barriers and to address special needs.

ASSESSMENT CRITERION RANGE

ASSESSMENT CRITERION 2


The advice and support helps candidates to identify appropriate, effective and efficient ways of producing

evidence of their competence.

ASSESSMENT CRITERION 3

The advice and support is given in a way that promotes the candidates' ability to present valid, relevant,

authentic and sufficient evidence of current competence.

ASSESSMENT CRITERION 4

Interactions with candidates enable them to organise and present evidence in a manner that contributes to

the overall efficiency and effectiveness of the assessment, but without compromising the reliability and

validity of the assessment.

ASSESSMENT CRITERION 5

The nature and manner of advice and support takes into account lessons learnt from previous such

interactions as well as information from assessors.

ASSESSMENT CRITERION 6

Support is given in a way that strengthens candidates' ability to engage more independently in future

assessments.

SPECIFIC OUTCOME 3

Check and give feedback on evidence.

OUTCOME RANGE

This is limited mainly to checking the completeness and appropriateness of the evidence, and is not

expected to amount to an assessment judgement as would be appropriate for an assessor.

ASSESSMENT CRITERIA

ASSESSMENT CRITERION 1

Checks establish the validity, authenticity, relevance and sufficiency of evidence.

ASSESSMENT CRITERION 2

Decisions are made concerning the readiness of the evidence for presentation to registered assessors, and

recommendations contribute to the efficiency and effectiveness of the assessment process.

ASSESSMENT CRITERION RANGE

Recommendations to candidates and/or to registered assessors and/or to supervisors or managers.

ASSESSMENT CRITERION 3

Gaps in the evidence are identified and dealt with appropriately.


ASSESSMENT CRITERION RANGE

"Appropriate" means advice or coaching is only given in cases where the gaps do not reflect a lack of

competence on the part of the candidate. In cases where a lack of competence is discerned, feedback is

provided in such a way that directs the candidate to further learning and/or practice, and in accordance with

organisational policies and procedures.

ASSESSMENT CRITERION 4

Feedback about the evidence is communicated to assessors where required, and to candidates in a culturally

sensitive manner and in a way that promotes positive action by the candidate.

ASSESSMENT CRITERION 5

Key lessons from the facilitation process are identified and recorded for integration into future interactions

with candidates.

UNIT STANDARD ACCREDITATION AND MODERATION OPTIONS

An individual wishing to be assessed, including through RPL, against this unit standard may apply to an

assessment agency, assessor or provider institution accredited by the relevant ETQA.

Anyone assessing an evidence facilitator against this unit standard must be registered as an assessor with

the relevant ETQA.

Any institution offering learning that will enable achievement of this unit standard must be accredited as a

provider with the relevant ETQA.

External moderation of assessment will be conducted by the relevant ETQA according to an agreed

Moderation Action Plan.

UNIT STANDARD ESSENTIAL EMBEDDED KNOWLEDGE

The following knowledge is embedded within the unit standard, and will be assessed directly or implicitly

through assessment of the specific outcomes in terms of the assessment criteria:

• Principles of assessment

• Principles and practices of RPL

• Methods for gathering evidence

• Potential barriers to assessment

• Feedback techniques

• The principles and mechanisms of the NQF

• Assessment policies and ETQA requirements

UNIT STANDARD DEVELOPMENTAL OUTCOME

UNIT STANDARD LINKAGES

N/A


Critical Cross-field Outcomes (CCFO):

UNIT STANDARD CCFO IDENTIFYING

Identify and solve problems using critical and creative thinking: planning for contingencies, candidates with

special needs, predicting problems that could arise during the gathering of evidence, and offering guidance

to address difficulties.

UNIT STANDARD CCFO WORKING

Work effectively in a team, using critical and creative thinking: working with candidates and other relevant parties prior to, during and after evidence gathering.

UNIT STANDARD CCFO ORGANISING

Organise and manage oneself and one's activities: planning, preparing, conducting and recording the evidence gathering.

UNIT STANDARD CCFO COLLECTING

Collect, analyse, organise and critically evaluate information: gather and evaluate evidence and the facilitation process.

UNIT STANDARD CCFO COMMUNICATING

Communicate effectively: inform candidates about assessment, communicate during evidence gathering and provide feedback.

UNIT STANDARD CCFO DEMONSTRATING

Demonstrate an understanding of the world as a set of related systems: understanding the impact of assessment on individuals and organisations.

UNIT STANDARD CCFO CONTRIBUTING

Be culturally and aesthetically sensitive across a range of social contexts: work with candidates and give feedback in a culturally sensitive manner.


COURSE OVERVIEW

PURPOSE OF THIS LEARNING PROGRAMME

This unit standard will be useful to candidates who assist others who wish to be assessed to prepare and

present evidence for assessment. Such evidence facilitators will add value to the assessment process by

ensuring candidates are ready to present well organised and complete evidence to registered assessors.

Their value will be particularly felt when assisting candidates who are competent in their field, but who may

be unable to present coherent evidence of that fact for reasons unrelated to their skill area. People

credited with this unit standard are able to:

• Provide information to candidates about outcomes-based assessment in general and their

assessment in particular.

• Advise and support candidates to prepare, organise and present evidence.

• Check and give feedback on evidence.

LEARNING ASSUMED TO BE IN PLACE AND RECOGNITION OF PRIOR LEARNING

See unit standard above.

UNIT STANDARD RANGE

See unit standard above.

UNIT STANDARD ESSENTIAL EMBEDDED KNOWLEDGE

See unit standard above.

HOW IS THIS LEARNING PROGRAMME COMPILED?

Contact Learning Phase

You must attend a one-day formal outcomes-based facilitation session of the learning material. This

session is critical, as the outcomes will be discussed and you will be guided on how to successfully achieve

competence against this unit standard.

Application Phase

On completion of the Contact Learning Phase you will immediately enter the Application Phase of the

learning programme, during which you must complete your Portfolio of Evidence (PoE). The PoE consists

mainly of a Theoretical and a Practical Assessment. You are allowed one month to finalise and submit your

portfolio. The Portfolio of Evidence is probably the most important document of this entire learning

programme, as it provides the essential evidence against which a decision of competence will be made,

resulting in the achievement of credits on the NQF if successful.


Assessment Phase

The ETD Provider will assess your portfolio. If successful, you will receive the credit value of this learning

programme, four (4) credits on NQF level 4. The entire assessment process is explained in the Assessment

Guide and you are urged to read this guide as soon as possible as it explains the assessment process in

detail and clarifies your rights and responsibilities to ensure that the assessment is fair, valid and reliable.

If you are not successful, you will receive all the guidance needed to resubmit your PoE within two months.

Thereafter, an additional fee has to be paid for assessment should you wish to submit again.

Learning Facilitation

All the presentations, group discussions, role-plays and other learning opportunities will be based on this

learning guide. You are encouraged to make use of the references and other sources listed in the References

Section of this guide to read more about this Unit Standard and to get other people's views and

perspectives on this learning topic.

LEARNING MATERIAL

This learning guide belongs to you. It is designed to serve as a guide for the duration of your learning

programme and as the main source document for the transfer of learning. It contains readings, activities, and

application aids that will assist you in developing the knowledge and skills stipulated in the specific

outcomes and assessment criteria.

Follow along in the guide as the facilitator takes you through the material, and feel free to make notes and

diagrams that will help you to clarify or retain information. Jot down things that work well or ideas that

come from the group. Also note any points you need to clarify and explore further.

You are once again encouraged to share your own expertise and experiences with the facilitator and your

fellow candidates to enhance the overall learning achievement of this course and to allow other candidates

to learn from you too.

Participate actively in the activities, as they will give you an opportunity to gain insights from other people's

experiences and to practise your newfound knowledge and skills. Each Unit will be preceded by

outcomes and assessment criteria, taken from the Unit Standards. These will describe what you must know

and be able to do in order to successfully achieve competence in this unit standard.


LEARNING TIPS AND TECHNIQUES

THE UNIT STANDARD

This learning programme is designed around the outcomes stipulated in unit standard 12544, available at the beginning of this Guide. Remember, decisions of competence will be based on the following question: "Did the candidate master, and provide evidence that she or he is competent in, all the specific outcomes and assessment criteria stated in the unit standard?"

Ask immediately if you have questions.

Complete all tasks immediately.

Pass your tests, first time.

Read through the day's work every night. It will stimulate questions and help you to participate the next day.

Get a "Study Buddy". Learning together is twice as much fun, and you can check your understanding against each other.

Use colours and drawings to help you to internalise and remember what you have learned.


Icons used in this Guide

The following icons are used throughout this guide to mark the different types of content:

• Learning Units & Specific Outcomes

• Assessment Criteria

• Range Statement

• Learning Tips and Techniques

• Learning Material / Sources

• Group Activity

• Learning Summaries


Unit 1: Provide information to candidates about assessment

OUTCOME RANGE

The information provided to candidates is to include:

• General principles and procedures concerning outcomes-based assessments

• Organisational assessment policies and procedures

• The requirements of the particular assessment at hand.

At the end of this learning unit the candidate will be able to provide information to candidates

about assessment against the following criteria:

• Basic information is provided about key concepts and principles concerning the outcomes- based

system of learning and assessment, within the context of the National Qualifications Framework.

• Interactions with candidates help to set them at ease and promote understanding of the

assessment.

• Information to candidates is clear, precise and in line with instructions provided in the assessment

instruments, and opportunities are provided for clarification concerning the process and the

expectations.

• The information helps candidates to identify possible sources of evidence and the most appropriate

and effective means for producing evidence for the assessment.


OBE EVIDENCE FACILITATORS

Evidence facilitators will need a clear understanding of the Outcomes-Based Education and Training

System if they are to help candidates produce the evidence needed to achieve competence, and to

communicate any "evidence gaps" to registered assessors. In the following unit we will therefore spend

some time ensuring that you, as future OBE Evidence Facilitators as defined by this unit standard, have an

appropriate and sufficient understanding of OBE Learning and Assessment in the context of the National

Qualifications Framework to guide and support candidates and assessors efficiently and effectively.

THE NATIONAL QUALIFICATIONS FRAMEWORK (NQF)

The NQF is a framework i.e. it sets the boundaries - a set of principles and guidelines which provide a vision,

a philosophical base and an organisational structure - for construction, in this case, of a qualifications

system. Detailed development and implementation is carried out within these boundaries. It is national

because it is a national resource, representing a national effort at integrating education and training into a

unified structure of recognised qualifications. It is a framework of qualifications i.e. records of candidate

achievement.

In short, the NQF is the set of principles and guidelines by which records of candidate achievement are

registered to enable national recognition of acquired skills and knowledge, thereby ensuring an integrated

system that encourages life-long learning.

PURPOSE OF THE NATIONAL QUALIFICATIONS FRAMEWORK

The purposes of the NQF are:

• To establish a learning environment which enables people to realize their full social and economic

potential in the modern world.

• To produce educated people who are independent problem solvers and reflective candidates and

who have learned how to learn.

• To provide a learning environment with the proper integration of academic abilities and workplace

skills, in order to produce qualifications which not only meet needs, but have appropriate

intellectual content - thus removing the artificial distinctions between academic and vocational

training.

• To establish an enabling framework for the many who have been marginalised from formal

education and/or workplace opportunities.

• To remove the existing artificial learning ceilings and to provide the pathways of continuous

learning toward meaningful qualifications.


• In essence, to establish the framework for a nation of life-long learners who are able to realise

their full potential through flexible curricula and opportunity structures which enable movement

between various levels of achievement.

OBJECTIVES OF THE NQF

The objectives of the NQF are:

• To create an integrated national framework for learning achievements.

• To facilitate access to, and mobility and progression within, education, training and career paths.

• To enhance the quality of education and training.

• To accelerate the redress of past unfair discrimination in education, training and employment

opportunities, and thereby contribute to the full personal development of each candidate and the

social and economic development of the nation at large.

THE PRINCIPLES FOR THE DEVELOPMENT AND IMPLEMENTATION OF THE NQF.

The principles for the development and implementation of the NQF are:

• Integration (of education and training, of mental and manual labour/theory and practice/academic

and vocational).

• Articulation (linkage of different curricula, qualifications and institutions).

• Flexibility (different options for entering and progressing through learning and career paths).

• Access (ease of access to appropriate learning and career paths).

• Progression (movement through learning and career paths).

• Coherence (the paths should all “hang together” in the overall framework).

• Portability (candidates should be able to “carry” appropriate knowledge and skills from one

learning programme or context to another, with the knowledge and skills being recognised in the

new context).

• Recognition of prior learning (linking informally acquired or unaccredited knowledge and skills to

formal provision and accreditation).

• Guidance of candidates (to assist candidates to understand and make decisions about entry into

and progression through the education and training system).

• Equality of opportunity (the same standards for entry and progression should apply to all

candidates).

• Relevance (of education and training to social, economic and political developments and candidate

needs).


• Quality (the nature of learning to be actually achieved when education and training is provided,

expressed in the standards).

• Credibility (the standards should enjoy national and international value and acceptance).

• Legitimacy (to be achieved through all national stakeholders participating in the planning and

coordination of the framework).

• Democratic participation (through which legitimacy will be achieved).

In summary, the objectives of the NQF are to create an integrated national framework for learning. Access,

mobility and progression are key objectives, as is the need for enhancing quality in education and training.

Attention must be given to the speedy redress of past discrimination in education, training and

employment. Through these objectives, the NQF contributes to the full personal development of each

candidate and the social and economic development of the nation at large.

THE ETD REGULATORY FRAMEWORK AND OTHER STATUTORY BODIES RELATING TO THE NQF

Before we get down to the actual business of evidence facilitation, we need to take a closer look at the

regulatory framework for Education, Training and Development (ETD).

We saw that the introduction and implementation of new policies and legislation is guided by the White

Paper on transformation in the public sector. Since 1995, a number of pieces of legislation have emerged.

Amongst these are various acts and regulations that collectively provide a new regulatory framework for

improving ETD. Those we will be looking at in particular are:

• SAQA Act, 1995 and Regulations.

• Labour Relations Act, 1995.

• Basic Conditions of Employment Act, 1997.

• Skills Development Act, 1998 and Regulations.

• Employment Equity Act, 1998.

• Skills Development Levies Act, 1999 and Regulations.

Two departments that play a key role in overseeing, administering and implementing legislation with

respect to ETD are Education and Labour.

The key challenge for Government lies in expanding the role of educationally sound and sustainable private

higher education institutions in terms of the applicable South African legislation, and in rooting out poor


quality, unsustainable, “fly-by-night” operators in the higher education band (paragraph 2.55, Education

White Paper 3, Government Gazette No 18207 dated 15 August 1997).

THE DEPARTMENT OF EDUCATION

There is a legal obligation on the national Department of Education to protect the South African public and

act as a watchdog over private higher education institutions operating in South Africa. (It is not concerned

about, or responsible for, South African institutions’ activities abroad.) The Department states to the public

that it believes that the private higher education institutions which are registered by it, are financially

viable, and that their qualifications are recognised as being of at least comparable quality to those offered

by South African universities. The Department claims that these institutions exercise a level of performance, integrity

and quality that entitles them to the confidence of the higher educational community and the public they

serve.

SAQA falls under the Department of Education, while SETAs fall under the Department of Labour.

Legislation referred to above has provided for the establishment of various bodies. The most important of

these are the following:

• National Skills Authority (NSA).

• South African Qualifications Authority (SAQA).

• National Standards Bodies (NSBs).

• Standards Generating Bodies (SGBs).

• Education and Training Quality Assurance bodies (ETQAs).

• Sector Education and Training Authorities (SETAs).

• More recently, the Council on Higher Education (CHE), and its quality assurance body, the Higher

Education Quality Committee (HEQC).

• Umalusi, the quality assurance body for the General and Further Education and Training bands of the

National Qualifications Framework (NQF).

• Other important role-players in the ETD framework are training providers, professional bodies and,

of course, the candidates. We will discuss the Internal Training Committee, as representing training

providers, here.

SAQA and its associated bodies are responsible for the professional implementation of the National

Learning System. The approach selected for the National Learning System is “outcomes-based” education

and training.


The SAQA Act says that standards must be agreed in a democratic way. Everyone with a direct interest in

skills development in South Africa does this. These role players can be grouped into four groups according

to the primary roles they play, namely:

• The protector of the NQF, which is SAQA.

• The national standards bodies (NSBs) are made up of government, organised business, organised

labour, education and training providers, community and candidate organisations and any other

groups who might have an interest in education and training.

• The Standards Generating Bodies (SGBs). Everyone with a direct interest in a standard gets

together in an SGB to agree on what the learning outcomes should be.

• Those who are responsible for management (SETAs) and quality assurance (ETQAs) in training.

Especially here the demarcation becomes somewhat artificial, in that SETAs are also involved in the

first three groupings.

THE NATIONAL SKILLS AUTHORITY

In April 1999 the National Skills Authority (NSA) was set up to advise the Minister of Labour on policies and

strategies for the new skills system. The NSA has 24 members who can vote, three members who attend

meetings but do not vote, and an executive officer who also attends but does not vote. The members of the

NSA represent organised labour, organised business, the community, government, education and training

providers, experts on employment services, and SAQA. The community representatives include people who

represent women, youth, civics, rural groups and people with disabilities. The NSA works closely with the

Chief Directorate: Employment and Skills Development Services of the Department of Labour.

The NSA is one of four structures created by the promulgation of the South African Skills Development Act

with the purpose of:

• Developing a national skills development policy and national skills development strategy.

• Liaising with SETAs on the national skills development strategy, and

• Reporting to the Minister of the Department of Labour (DoL) on the progress made with the

implementation of the national skills development strategy.

THE SOUTH AFRICAN QUALIFICATIONS AUTHORITY


The South African Qualifications Authority (SAQA) consists of a Chairperson and members nominated from

a diversity of interests including education, labour, business, the universities, the teaching profession and

special education needs.

The functions of SAQA are as follows:

• To oversee the development of the NQF, formulate and publish policies and criteria both for the

registration of bodies responsible for establishing educational and training standards, and for the

accreditation of bodies responsible for monitoring and auditing achievements.

• To oversee the implementation of the NQF. It must ensure the registration, accreditation and

assignment of functions to the bodies referred to above, as well as the registration of national

standards and qualifications. It must also take steps to ensure that provisions for accreditation are

complied with and that standards and registered qualifications are internationally comparable.

• SAQA must advise the Ministers of Education and Labour.

• SAQA must consult with all affected parties. It must also comply with the various rights and powers of

bodies in terms of the Constitution and Acts of Parliament.

STANDARDS GENERATING BODIES (SGBs)

SGBs are composed of key education and learning stakeholders in the sub-field, drawn from interest groups

and specialists who will have been identified by the NSB in accordance with the requirements of SAQA.

The people who sit on the SGBs are presumed to have an excellent working knowledge of the

competencies needed by the particular sector. Stakeholders in the sector nominate these individuals and

their names are published in a government gazette.

Their functions are to:

• Generate unit standards and qualifications in accordance with SAQA requirements in identified sub-

fields and levels.

• Update and review unit standards.

• Recommend unit standards and qualifications to NSBs.


The position of the SGBs in the new regulatory framework can be illustrated as follows:

[Diagram: The position of SGBs. SAQA oversees the NQF; the NSBs fall under SAQA, and the SGBs operate under the NSBs.]

SECTOR EDUCATION AND TRAINING AUTHORITIES (SETAs)

SETAs are statutory bodies that were created by the promulgation of the South African Skills Development

Act. Twenty-seven SETAs were established for the various business sectors with similar products, materials,

business processes and technologies. (Only 25 SETAs were activated. It looks like some SETAs might

integrate during 2005, so that probably some 23 will still function autonomously by the end of the year.)

A SETA is a body consisting of representatives from labour, employers, key government departments, any

professional body with a reason to be there, and any bargaining council from the sector or industry

involved.

The Minister of Labour is responsible for creating SETAs for every clearly definable and reasonably distinct

national economic sector.

SETAs are responsible for ensuring that effective learning and education for that particular sector is being

implemented by government bodies, companies, and unions in accordance with market driven needs and

in the best long-term interests of the country. Their main function is to contribute to the raising of skills –

to bring skills to the employed, or those wanting to be employed, in their sector. This involves the following

specific tasks:


• Creating a skills development plan for that industry sector within the framework of the national skills

development strategy.

• Implementing its sector skills plan by:

- Establishing learnerships;

- Approving workplace skills plans;

- Allocating skills grants to employers, learning providers and workers; and

- Monitoring education and learning in the sector.

• Promoting learnerships by:

- Identifying workplaces for experiential learning;

- Supporting the development of learning materials;

- Improving the facilitation of learning;

- Assisting in the conclusion of learnership agreements; and

- Registering learnership agreements.

• Receiving and paying out the skills development levies in its sector.

• Liaising with the National Skills Authority on the national skills development strategy, and its sector

skills plan.

• Reporting to the Director-General on its income and expenditure; and the implementation of its sector

skills plan.

• Improving information about employment opportunities, learning providers and the labour market.

• Operating as a learning and education quality assurance body in some cases.

SETAs invite private learning institutions, private companies and public sector organisations who are

involved in learning to accredit as service providers and to have their learning accredited as full national

qualifications, part-qualifications, learnerships or unit standards.

The position of the SETAs in the new Regulatory Framework can be illustrated as follows:


[Diagram: The position of SETAs in the new Regulatory Framework, showing the Minister of Labour, the DG of the Department of Labour, the Department of Labour, the National Skills Authority and the Sector Education and Training Authorities.]

The main tasks of the SETAs are:

• To develop a sector skills plan that indicates who is employed where in the sector, and what the

strengths, weaknesses, opportunities and threats in the sector are. Liaise with the NSA in this regard.

• To see where learnerships are needed, design the learnerships, market them, and register them.

• To act as an Education and Training Quality Assurance body (ETQA) for standards and qualifications in

the sector.

• To disburse money from the National Skills Development Levy.

• Report to the Director-General of the Department of Labour (DoL) on its income and expenditure and

the implementation of the sector skills development plan.

Especially in the case of SETAs the need for the co-ordination of the communication structure entails much

more than merely the establishment thereof – it also includes managing the functioning of the

communication process. The issue that requires most communication is the development and running of

learnerships. Therefore we will discuss the co-ordination of the communication structure in terms of

how it impacts on learnerships.

EDUCATION AND TRAINING QUALITY ASSURANCE BODIES (ETQAs)

ETQA bodies have been established under SAQA to accredit providers, to be responsible for quality

assurance and to monitor and audit the achievement of standards and qualifications. To get the quality

assurance process going, SAQA accredits ETQAs. There are three kinds of ETQAs:

• The economic sector ETQAs. In the economic sector there are three models for ETQA. The first model is

of each SETA functioning as an ETQA for that economic sector. The second model is of (statutory and



voluntary) professional bodies that have been set up under different Acts, and which have

accreditation and quality assurance responsibilities. These bodies may apply for accreditation as

economic sector ETQAs for standards and qualifications related to the particular profession (for

example, Pharmacy Council, South African Institute for Chartered Accountants, Bar Councils, etc.) The

third model is of the profession-related institute, for example the Institute of Marketing Management

being given quality assurance responsibilities. It is possible that ETQAs may be established under

different models. They will then have to negotiate how to deal with an overlap in their areas of

responsibility.

• The social sector ETQAs. In the social sector it is anticipated that there will be different ETQAs, but it is

not yet known which bodies will perform this function, or at what levels.

• The Education and Learning sub-system sector ETQAs, or band ETQAs as they are commonly known. In

the Education and Learning sub-system sector, the band ETQAs are the Council on Higher Education

(CHE) and the General and Further Education and Training Quality Assurer (GENFETQA).

The observant candidate will notice that we are actually skipping a step in the structure of the new

Regulatory Framework by discussing ETQAs here. The reason for this is that we need to focus on the

information that is necessary at the level at which you, as Evidence Facilitators, will function.

Nevertheless, in order to ensure that you keep track of the progress in this manual, we illustrate the

position of the ETQAs in the new Regulatory Framework here:

[Diagram: The position of the ETQAs in the new Regulatory Framework, showing the SAQA Act (1995), the Skills Development Act (1998) and the Skills Development Levies Act (1999); the Ministers and Departments of Education and Labour; SAQA and the NQF; the NSBs; the DG of the Department of Labour; the National Skills Authority; the SETAs and ETQAs; the Labour Centres; and the Skills Development Planning Unit.]


ETQAs could be established on the basis of:

• Social sectors.

• Economic sectors.

• Education and training sub-systems.

The functions of the ETQAs are:

• To promote quality amongst constituent providers (i.e. the registered constituency of the ETQA).

• Accredit providers in terms of quality management.

• Facilitate or ensure moderation across constituent providers.

• Cooperate with relevant NSBs for the purpose of moderation across ETQAs.

• Register constituent assessors.

• Evaluate assessment.

• Certificate candidates.

• Maintain an acceptable database.

• Submit reports to SAQA.

• Recommend unit standards to SGBs and qualifications to NSBs as appropriate.

• Monitor provision.

• Undertake quality systems audits.

An ETQA may, with the approval of SAQA, delegate selected functions to a provider or other body, but may

not delegate its accountability to SAQA.

THE COUNCIL ON HIGHER EDUCATION (CHE)

The Council on Higher Education (CHE) was established in May 1998 in terms of the Higher Education Act of

1997. Its mission is to contribute to the development of an HE system characterised by equity, quality,

responsiveness to economic and social development needs, and effective and efficient provision and

management.

The CHE seeks to make this contribution:

• By providing informed, considered, independent and strategic advice on HE issues to the Minister of

Education.

• Through the quality assurance activities of its sub-committee, the HEQC.


• Through various activities that include the dissemination of knowledge and information on HE through

publications and conferences.

The key responsibilities of the CHE revolve around:

• Advising the Minister of Education, at his request or proactively, on all matters related to HE – including

active information gathering and research to sensitise government and stakeholders to immediate and

long-term challenges and issues.

• Assuming executive responsibility for quality assurance within higher education and training – including

decision-making related to programme accreditation.

• Monitoring and evaluating whether, how, to what extent and with what consequences the vision,

policy goals and objectives for HE are being realised.

• Contributing to developing HE – giving leadership around key national and systemic issues, promoting

quality in learning and teaching, capacity-building around quality assurance, producing publications,

convening conferences, etc.

• Consulting with stakeholders around HE and convening an annual consultative conference.

The Higher Education Quality Committee (HEQC) is the permanent committee of the CHE through which

the CHE’s quality assurance mandate is conducted. The HEQC has the statutory responsibility to carry out

audits of higher education institutions and accredit programmes of higher education. Furthermore, the

HEQC is an independent statutory body that was established in terms of the Higher Education Act, No 101

of 1997. The Higher Education Act and Education White Paper 3 of 1997 (A Programme for the

Transformation of Higher Education) define the mandate and responsibilities of the HEQC. The mission of

the HEQC is to contribute to the development of a higher education system characterised by quality and

excellence, equity, responsiveness to economic and social development needs and effective and efficient

provision, governance and management. As such the HEQC has a mandate to appoint an independent

evaluation panel from which the Minister of Education is able to appoint verifiers to conduct investigations

into particular issues at Private Higher Education institutions. The HEQC can thus be regarded as the

ultimate moderator for Private Higher Education institutions in South Africa.

The specific functions of the HEQC are to:

• Promote quality assurance in higher education.

• Audit the quality assurance mechanisms of institutions of higher education.

• Accredit programmes of higher education.


UMALUSI

UMALUSI is the quality assurer in the general and further education and training bands of the national

qualifications framework (NQF). The Council ensures that the providers of education and training have the

capacity to deliver and assess qualifications and learning programmes and are doing so to expected

standards of quality.

UMALUSI is guided by the General and Further Education and Training Act, Act 58 of 2001, published in

December of that year. The functions of the South African Certification Council (SAFCERT) were

incorporated into those of the new Council, constituted in June 2002. SAFCERT concentrated on quality

assuring the Senior Certificate.

OUTCOMES BASED EDUCATION AND TRAINING (OBET)

It is critical that candidates understand the concepts of OBET before they embark on their learning journey.

This will allow them to form a holistic picture of what will be expected of them and will ultimately assist

in the transfer of learning.

The historically separated worlds of ‘work’ and ‘learning’ are no longer to be seen as separate if you want

to understand ‘knowledge’ in our present situation.

WHAT IS OUTCOMES-BASED ALL ABOUT?

Outcomes-based education and training (OBET) is defined as a candidate-centred approach that is primarily

characterised by a focus on results and outputs as opposed to inputs and syllabi or curriculum. Outcomes-

based systems describe the learning outcomes that candidates are expected to achieve on completion of a

programme at a given level. Outcomes based systems do not prescribe any particular syllabi. Achievement

in such a system is defined in terms of criteria rather than normatively in terms of a given percentage of

candidates that are expected to reach a given level. In theory therefore, all candidates can pass if they

provide evidence of fulfilling the criteria.

OUTCOMES

Outcomes are the crucial determinants of the nature and quality of an OBET system. These are sometimes

referred to as objectives. Candidates in OBET are required to provide evidence that learning, whether

formal or informal, did take place and resulted in achievement. Outcomes are defined as ‘the results of

learning processes and refer to knowledge, skills, attitudes and values within particular contexts.’ Learning


area outcomes are outcomes related to specific learning areas. Specific outcomes are contextually

demonstrated knowledge, skills and values reflecting critical cross-field outcomes.

TRAINING SYSTEMS THAT ARE OUTCOMES-BASED

OBET systems are also transparent in that they make clear what candidates have to achieve and the criteria

against which that achievement will be determined (measured). By so doing candidates know “upfront”

what is expected of them in order to achieve a qualification and employers, parents, further and higher

education officials and the general public are provided with clear and accurate information on what a

candidate has achieved.

OBET creates a “multiple opportunity system of instruction and evaluation” that undermines the potential

use of evaluation (testing and grades) as a mechanism for the control of candidate behaviour. The focus is

on learning and acquiring competence rather than “managing by fear”, which was often the case in the

previous (before 1992) South African learning dispensation.

COMPETENCE

One of the key features of OBET is the notion of competence. Competence is about demonstration of

ability, performing or acting, demonstration of understanding of the knowledge underpinning performance

or action, and demonstration of the ability to integrate understanding of underpinning knowledge and

performance or action.

Competence refers to three inter-connected kinds of competence: practical competence, foundational

competence and reflexive competence.

• Practical competence is the demonstrated ability, in an authentic context, to consider a range of

possibilities for action, make considered decisions about which action to follow and to perform the

chosen action. Practical competence is grounded in what is termed,

• foundational competence where the candidate demonstrates an understanding of the knowledge

and thinking which underpins the action taken; and it is integrated through

• reflexive competence, in which the candidate demonstrates ability to integrate or connect

performances and decision making with understanding and with the ability to adapt to change in

unforeseen circumstances and explain the reason behind these adaptations.

SAQA’s standard setting regulations refer to applied competence, which is defined as “the ability to put into

practice in the relevant context the learning outcomes acquired in obtaining a qualification.” Applied

competence manifests in what is called “critical cross-field education and training outcomes” or “generic

abilities”.


CRITICAL OUTCOMES (CRITICAL CROSS-FIELD EDUCATION AND TRAINING OUTCOMES)

All unit standards, and, therefore, learning interventions, should contribute to the personal development of

candidates. This is achieved by incorporating critical cross-field education and training outcomes, popularly

known as critical outcomes, in the learning content and process. The incorporation of at least some of the

following critical outcomes is regarded as mandatory by SAQA:

• Solving problems. The ability to identify and solve problems in a way that demonstrates that

responsible decisions have been made, using critical and creative thinking. Problem solving

approaches must be embedded within the content which is to be learned.

• Working effectively. The ability to work effectively with others as a member of a team, group,

organisation, community. Social interaction, properly executed, becomes an integral part of the

definition of good training and education. For this critical outcome to be realised, the unit standard

developer must at least know:

- the principle of group dynamics,

- the methodology of learning within a group setting,

- the optimum conditions under which learning occurs in groups,

- how each individual develops this Critical Outcome within the group, and probably a number of

additional considerations.

• Organising. Organise and manage oneself and one’s activities responsibly and effectively.

• Analysing data. Collect, analyse, organise and critically evaluate information.

• Communicating. Communicate effectively using visual, mathematical and/or language skills in the

modes of oral and/or written presentation.

• Using technology. Use science and technology effectively and critically, showing responsibility

towards the environment and health of others.

• Recognising systems. Demonstrate an understanding of the world as a set of related systems by

recognising that problem-solving contexts do not exist in isolation.

In order to contribute to the full personal development of each candidate and the social and economic

development of the society at large, it must be the intention underlying any programme of learning to

make an individual aware of the importance of:

• Reflecting on and exploring a variety of strategies to learn more effectively.

• Participating as responsible citizens in the life of local, national and global communities.

• Being culturally and aesthetically sensitive across a range of social contexts.

• Exploring education and career opportunities.


• Developing entrepreneurial opportunities.

In outcomes-based education and training it is not only knowledge that is assessed, but also how

candidates integrate generic capabilities to demonstrate achievement.

WHAT IS A UNIT STANDARD?

A unit standard can be described as a set of registered statements of desired education and training outcomes and

their associated assessment criteria, together with administrative and other information. In other words, a

unit standard is an end-statement of the achievement of a certain competence, as well as being a building

block for possible qualifications.

DEFINITION

A unit standard is a document that describes:

• a coherent and meaningful outcome of learning (title) that we want recognised nationally,

• the smaller more manageable outcomes that make up the main outcome (specific outcomes),

• the standards of performance required as proof of competence (assessment criteria), and

• the scope and contexts within which competence is to be judged.

PARTS OF A UNIT STANDARD

The following details must be specified for every unit standard:

1. Unit standard title

• The title of the unit standard is unique; that is, the title is different from any other title registered on the NQF.

• The title provides a concise yet comprehensive and pointed indication of the contents of the unit standard.

• The title contains a maximum of 100 characters, including spaces and punctuation.

2. Unit standard level

• The level assigned to the unit standard is appropriate in terms of the complexity of learning

required to achieve the standard (as described in SAQA’s Level Descriptors).

• The level is appropriate in relation to the learning pathway/s within which the unit standard is

located.

Note: Fundamental or Core standards in particular may form part of many different learning pathways.

3. Credit attached to the unit standard

• The definition of a credit is that 1 credit = 10 notional (assumed) hours of learning.

• The credit assignment reflects the average length of time the average candidate might take to

complete the learning leading to the achievement of the standard.
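For example, this unit standard carries 4 credits, which corresponds to roughly 4 x 10 = 40 notional hours of learning for the average candidate.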


4. Field and sub-field of the unit standard

The Fields of Learning have been indicated in Unit 3

Unit standards must be located within the sub-field and organising field.

Where there is more than one sub-field or organising field to which the standard might apply, this must

be clearly indicated and justified, either here or in the brief of the SGB that generated the standard.

5. Purpose of the unit standard

The format of entries under the heading Purpose follows on from the statement: ‘Persons credited with this

unit standard are able to...’

The Purpose of a unit standard includes its specific outcomes together with a concise statement of the

contextualised purpose of the unit standard and what its usage is intended to achieve for:

– the individual

– the field or sub-field

– social and economic transformation

These entries are phrased as: Verb + object + modifying phrase(s) (if required)

The purpose statement succinctly captures what the candidate will know and be able to do on the

achievement of the unit standard.

The sub-outcome entries are ‘bulleted’ for easy reading purposes.

6. Learning assumed to be in place

There is a clear relationship between the credit value of the standard and the learning assumptions.

[This is the learning assumed to be in place if the learning required to achieve the standard is to be

completed in the assigned credit time]

The statement captures and reflects the knowledge, skill and understanding ‘building blocks’ which are

assumed to be in place and which support the learning towards the achievement of the unit standard under

consideration.

7. Specific Outcomes

The format of entries under the heading Specific Outcomes follows on from the statement: Persons

credited with this unit standard are able to: and these entries are phrased as: Verb + noun + modifying

phrase(s)

There are usually between 4 and 6 specific outcomes. [More than six may indicate that there is more than

one purpose that the standard is trying to address. Fewer than four may indicate that the purpose of the

unit standard is too narrow].


The specific outcomes together reflect and capture the purpose of the unit standard in ways that are

measurable and verifiable.

The specific outcome statements focus on competence outcomes and avoid describing specific procedures

or methods used in the demonstration of competence. This ensures that unit standards:

have broad and inclusive applicability

avoid frequent review and overhaul because of procedural or methodological shifts in tendencies

focus on competence outcomes for learning and performance, not descriptions of tasks or jobs

The specific outcomes avoid evaluative statements where possible. Statements reflecting the quality of

performance are located in the assessment criteria.

8. Assessment criteria

The format of entries under the heading Assessment criteria follows on from the statement: We will know

that you are competent to... [insert specific outcome] if or when... [insert assessment criterion]

Where there is a product, the assessable or measurable criteria for the product may include:

accuracy

finish / presentation

completeness (written information)

legibility (written information)

clarity (written / spoken information)

availability for use / location

Where work organisation / work role is critical the assessable or measurable criteria for the way work is

carried out may include:

time / speed / rate
schedule
procedures involving processes or methods
cost effectiveness
user specifications or needs
optimisation of resources
health and safety
hygiene
confidentiality / security
dress / appearance
language and behaviour
creation and maintenance of effective relationships

The criterion statement sets the guidelines for developing particular assessment tasks at learning

programme or services level rather than reflecting check lists for one or more assessment instruments.

The criteria capture the requirements for fair, valid and reliable assessment procedures that make use

of tools and methods appropriate to the organising field, sub-field, level, category and the unit

standard being registered.

The assessment criteria capture the underlying and embedded knowledge base that allows the

candidate to reflect achievement of the unit standard (through the reflective and repetitive application

of that knowledge, skill, ability and value achievement within a range of contexts).


The assessment criteria must be sufficiently transparent to ensure ease of understanding across a

range of learning providers, learning services and candidates.

9. Range statements

The range statements relate directly to specific outcomes, assessment criteria or even the standard.

Note: Not all specific outcomes or assessment criteria require range statements.

There must be a clear relationship between range statements, the specific outcomes, the purpose of the

unit standard, and the assessment criteria delineated for the unit standard.

10. Notes

This category contains:

General Notes

Critical cross-field outcomes as well as

Embedded knowledge.

Embedded Knowledge:

The format of entries follows on from the statements:

I/Candidates can understand and explain...

I/Candidates can apply... and these entries are phrased as Noun + modifying phrase(s)

Where there is an embedded knowledge section it comprises a statement of the knowledge base required

for competent performance and achievement of the unit standard, representing what the candidate has to

understand and be able to explain in the area (sub-field) at the particular level.

The embedded knowledge statement includes demonstrations of knowledge of the classificatory systems

operating in the area and at the level of the unit standard.

Critical Cross-Field Outcomes

Critical Cross-Field Outcomes are in a ‘matrix’ format that indicates how each outcome is addressed in the

standard. The matrix captures the relationship of the purpose, specific outcomes, and embedded

knowledge to the critical cross-field outcomes.


ASSESSMENT IN OUTCOMES-BASED EDUCATION AND TRAINING

THE PURPOSE OF ASSESSMENT

The notion of ‘outcome’ seems to relate to results. These results are seen to relate to consequences within

a person, in economic contexts or in societal context.

Assessment is ‘a way of measuring progress’. Performance criteria feature prominently in OBET systems. As

a substantially assessment-driven system, OBET requires clear and transparent articulation of criteria

against which successful (or unsuccessful) performance is assessed. The criteria should specify the

knowledge, understanding, performance(s), action(s) and roles that a candidate needs to show in order to

provide evidence that outcomes, standards and competence have been achieved. The criteria should also

state the level of complexity and quality of these. Context of and conditions under which demonstrations

occur should be indicated.

In general, assessment in education and training is about making judgements about the results of learning

so that decisions can be made. These decisions may have to do with the candidate - Is the candidate able to

do a certain job? Is the candidate able to embark on a particular course of study? What other learning does the candidate need in order to be deemed qualified? They may also have to do with the learning programme -

What is the quality of the programme? What improvements or changes are needed? Decisions may need to

be made about the education and training system itself, and judgements made in the assessment process

can inform such decisions.

Outcomes based assessment is therefore defined as:

A process of collecting sufficient evidence demonstrating that individuals can perform to pre-defined

performance standards.

Note: The evidence collected includes not only behavioural evidence, but also the

knowledge and attitudinal evidence underpinning performance.

What is the difference between traditional assessment and outcomes – based assessment?

The difference between the traditional assessment to which we are all accustomed (e.g. examinations and tests) and competence-based assessment is demonstrated in the

following diagram (adapted from Fletcher, 1997, p. 25):


Traditional View:

• If you do well in this course…
• … you will do well in this exam…
• … and you will get this certificate.

Outcomes-based View:

• These functions must be performed well…
• This course is designed to get you there…
• You can learn at work or by any other means…
• The assessment finds out whether you can perform to the predefined standards…
• … and leads to (this certificate or this company recognition) which confirms that you can perform to those standards.

PRINCIPLES OF OBE ASSESSMENT

All planning, design and development of assessment are based on the principles of assessment.

Evidence Facilitators should understand these principles, as this will help them to provide effective

guidance to candidates and help them to highlight possible “gaps” in assessment activities to

registered assessors.

The principles of assessment are:

Assessment must be authentic, continuous, multi-dimensional, varied and balanced.

Assessment is an on-going integral part of the learning process.

Assessment must be accurate, objective, valid, fair, manageable and time-efficient. Assessment takes

many forms, gathers information from several contexts, and uses a variety of methods according to

what is being assessed and according to the needs of the candidate.

Assessment methods and techniques must be appropriate to the knowledge, skills, or attitudes to be

assessed as well as to the age and developmental level of the candidate.

Assessment must be bias free and sensitive to gender, race, cultural background and abilities.

Assessment results must be communicated clearly, accurately, timeously and meaningfully.

Progression should be linked to the achievement of the specific outcomes and should not be rigidly

time bound. Evidence of progress in achieving outcomes shall be used to identify areas where

candidates need support and remedial intervention.

The principles of fairness, validity, reliability and practicability are probably the most important, because

the other principles will in all likelihood be met if those four are adhered to.

Fairness relates mainly to the assessment process, validity relates to the assessment design, reliability

relates mainly to the conduct of the assessment and practicability relates mainly to the financial and time


implications of assessment. (ETDQA, June 2004: 7.) The principles of fairness, validity and reliability imply

that some form of moderation practices (both internal and external) need to be applied to assessments.

Fairness. A fair assessment should not in any way hinder or advantage a candidate. Fairness is the

overarching principle for good assessment practices, but the other principles help to clarify exactly what we

mean by a fair assessment. Examples of unfairness might include:

Unequal opportunities or resources.

Biased assessment (e.g. in relation to ethnicity, gender, age, disability, social class, language).

Unethical behaviour by the assessor, candidate or other person involved (threats, bribes, copying,

leaking of confidential information, etc.)

Any irregularities in the conduct of the assessment.

A lack of transparency about the assessment process.

Ambiguous or unclear assessment instructions.

Validity. A valid assessment really assesses what it claims to assess. In order to achieve validity in the

assessment, assessors should:

Check that the selected assessment instruments really target the selected outcomes/unit standards.

Check that the assessment method is ‘fit for purpose’.

Ensure that the evidence is authentic (it was generated by the candidate in an appropriate context).

Ensure that the evidence is current.

Ensure that the evidence is sufficient to show competence and covers the range given in the range

statement, where it exists.

Reliability. A reliable assessment is one that is in line with other assessments made by the same and other

assessors in relation to the same standard or qualification. Reliability in assessment is about consistency.

Consistency means that comparable judgements are made in the same (or similar) contexts each time a

particular assessment is conducted. Assessment judgements should also be comparable between different

assessors. Assessment results should not be perceived to have been influenced by variables such as:

Assessor bias.

Different assessors interpreting the standards or qualifications differently.

Assessor stress and fatigue.

Assessor assumptions about the candidate, based on previous performance.

Practicability. A practicable assessment is effective without placing unreasonable demands on the relevant

role-players. Assessment should be designed to be as effective as possible in the context of what is feasible


and efficient in a particular learning programme or RPL process. It should try to avoid unreasonable

demands in relation to:

The time commitments required for the generation, collection, presentation and assessment of evidence

involving:

The candidate.

The assessor.

Third party witnesses (mentors, line-managers, coaches, etc.)

Evidence facilitators, RPL advisors and others involved in advice and support.

Financial implications for the employer or provider in relation to, for example, releasing personnel

listed above for lengthy periods.

Financial implications for the employer or provider in relation to suspending or slowing the

effectiveness of the normal use of machinery, tools, facilities and human resources.

ASSESSMENT METHODS

Assessment in OBET lays emphasis on the assessment of outputs and end-products as opposed to inputs.

This feature draws on the broader concept of criterion-referenced assessment (as opposed to norm-referenced assessment). Norm-referenced assessment is associated with the grading and ranking of

candidates, comparing candidates, averaging scores or grades of candidates. Norm-referenced assessment

is defined as making judgements about candidates by comparing them to each other. Criterion-referenced

assessment, on the other hand, is viewed as making judgements about candidates by measuring

candidate’s work against set criteria that are independent of the work of other candidates. Even if grades

are given, candidates are graded in terms of whether they have satisfied criteria set for assessment.

OBET assessment is not solely focused on assessing what candidates can do but also what they know and

how they integrate critical cross-field outcomes (generic abilities) to demonstrate achievement. The term

“generic abilities” refers to elements such as problem-solving, decision-making, analysing, etc. In NQF

terminology these are called critical cross-field outcomes. Critical cross-field outcomes have already been

discussed. To refresh your memory - Critical outcomes include the ability to identify and solve problems in

a way that demonstrates that responsible decisions have been made.

OBET as envisaged, will produce candidates who are active, thoughtful, reflective users of knowledge as

well as being generators of new knowledge, both in personal and broader societal situations.


OBET MAKES USE OF FORMATIVE AND SUMMATIVE ASSESSMENT

• Formative assessment refers to assessment that takes place during the process of learning and

teaching. Its purposes may be to diagnose candidate strengths and weaknesses, provide feedback to

candidates on their progress (or lack of), assist both the candidate and facilitator/assessor to plan

future learning, assist the candidate and facilitator/assessor in making decisions regarding the

readiness of the candidate to do summative assessment. This assessment is not intended for

assessing whether the candidate has successfully achieved or not. When formative assessment

results are recorded and used to make judgements about achievement, they then fall into the

category of summative assessment. When results initially collected as results for formative

assessment purposes are used for summative assessment purposes, the candidate should be

informed. The assessor should also indicate to the candidate which outcomes are being recorded as

having been achieved and the criteria used. The assessor would have to ensure that these outcomes

are not assessed again.

• Summative assessment is assessment for making judgement about achievement. This is carried out

when a candidate is ready to do assessment, which may come at the end of learning. In knowledge

and inputs systems this is usually done after a specified time spent on learning. In OBET emphasis is

on candidate readiness.

Now that you understand the meaning and implications of OBET, it should be easier to understand the

organizational policies and procedures governing assessment practices.

END OF SECTION 1

SELF ASSESSMENT AND GROUP ACTIVITIES

Do Exercise one (1) in your

workbook now


UNDERSTANDING OF THE SPECIFIC ASSESSMENT PROCESS, THE EXPECTATIONS OF THE

CANDIDATE, THE ORGANISATIONAL ASSESSMENT POLICY, MODERATION AND THE APPEALS

PROCEDURES

This unit will help you as the Evidence Facilitator to explain to candidates (in other words candidates who

are to be assessed or preparing for assessment) the assessment process and all assessment policies

impacting on assessment. In addition you must also be able to explain to them their rights in terms of

assessment. Candidates will be comfortable and at ease when they know that they are entitled to an appeal and that they have the assurance that the evidence they submit will be checked by another person, called the “Moderator”, to ensure fairness, validity and reliability.

But before we can get to this in detail it would be appropriate for you to know how to interact and

communicate with candidates to ensure you set them at ease and that your guidance and advice will

promote understanding of assessment. We will provide you as the Evidence Facilitator with specific

methods to facilitate and communicate the understanding of assessment to candidates. This will include

explaining how to communicate effectively, setting up an orientation session, and some role-play

techniques to ensure that candidates understand assessment and participate effectively in the process.

COMMUNICATION IN THE ASSESSMENT PROCESS

Communication is the sending and receiving of messages. It is concerned with sending (conveying)

knowledge or information from one person to another. Communication should be clear, understandable

and unambiguous. The purpose of communication is to establish understanding and to convey a message.

• By the end of this unit you will be able to interact with candidates to set them at ease and promote understanding of the assessment.

Section 2 - Interaction with candidates

• Understanding of the specific assessment process, the expectations of the candidate, the organisational assessment policy, moderation and the appeals procedures.

ASSESSMENT CRITERION RANGE


DEFINITIONS OF COMMUNICATION

Webster defines communication as an “Interchange of thoughts and opinions.” MacKenzie defines it as “Communications is ensuring understanding”; “Communications is the process of sending and receiving information in order to achieve an objective”. We all interact with the printed word as though it has a personality and that personality makes positive and negative impressions upon us. Without immediate feedback your interaction effort can easily be misinterpreted by the candidate, so it is crucial that you follow the basic rules of etiquette to construct an appropriate communication tone.

The following can be used with both written and oral communication:

Four Critical Steps of Communication

Clarify the message.

Test the receptivity of the message.

Convey or transmit the message.

Check the feedback.

Six Barriers to Communication

Distortion

Stating inferences as facts

Jumping to conclusions

Confusion over the meaning of words

Experiences, value systems, and prejudices

Failure to listen

THE PRINCIPLES OF COMMUNICATING EFFECTIVELY

Seek to clarify your ideas before communication.

Examine the true purpose of each communication.

Consider the total physical and human setting whenever you communicate.

Consult with others, where appropriate, in planning communication.

Be mindful, while you communicate, of the overtones as well as the basic content of your

message.

Take the opportunity, when it arises, to convey something of help or value to the receiver.

Follow up your communication.

Communicate for tomorrow as well as today.

Be sure your actions support your communications.

Last, by no means least: Seek not only to be understood but to understand — be a good

listener.


Let us discuss the principles.

SEEK TO CLARIFY YOUR IDEAS BEFORE COMMUNICATION.

The more systematically we analyze the problem or idea to be communicated, the clearer

it becomes. This is the first step toward effective communication. Many communications fail because of

inadequate planning. Good planning must consider the goals and attitudes of those who will receive the

communication and those who will be affected by it.

EXAMINE THE TRUE PURPOSE OF EACH COMMUNICATION

Before you communicate, ask yourself what you really want to accomplish with your message—obtain

information, initiate action, change another person’s attitude? Identify your most important goal and then

adapt your language, tone and total approach to serve that specific objective. Do not try to accomplish too

much with each communication. The sharper the focus of your message, the greater its chances of success.

CONSIDER THE TOTAL PHYSICAL AND HUMAN SETTING WHENEVER YOU COMMUNICATE

Meaning and intent are conveyed by more than words alone. Many other factors influence the overall

impact of a communication, and the manager must be sensitive to the total setting in which he

communicates. Consider, for example, your sense of timing (i.e., the circumstances under which you make

an announcement or render a decision); the physical setting—whether you communicate in private, for

example or otherwise; the social climate that pervades work relationships within the agency or a

department and sets the tone of its communications; and custom and past practice—the degree to which

your communication conforms to, or departs from, the expectations of your audience. Be constantly aware

of the total setting in which you communicate. Like all living things, communication must be capable of

adapting to its environment.

CONSULT WITH OTHERS, WHERE APPROPRIATE, IN PLANNING COMMUNICATION

Frequently it is desirable or necessary to seek the participation of others in planning a communication or

developing the facts on which to base it. Such consultation often helps to lend additional insight and

objectivity to your message. Moreover, those who have helped you plan your communication will give it

their active support.


BE MINDFUL, WHILE YOU COMMUNICATE, OF THE OVERTONES AS WELL AS THE BASIC CONTENT

OF YOUR MESSAGE

Your tone of voice, your expression, or apparent receptiveness to the responses of others—all have

tremendous impact on those you wish to reach. Frequently overlooked, these subtleties of communication

often affect a listener’s reaction to a message even more than its basic content. Similarly, your choice of

language—particularly your awareness of the fine shades of meaning and emotion in the words you use—

predetermines in a large part the reactions of your listeners.

TAKE THE OPPORTUNITY, WHEN IT ARISES, TO CONVEY SOMETHING OF HELP OR VALUE TO THE

RECEIVER

Consideration of the other person’s interests and needs—the habit of trying to look at things from his point

of view—will frequently point up opportunities to convey something of immediate benefit or long-range

value to him. People on the job are most responsive to the managers whose messages take their own

interests into account.

FOLLOW UP YOUR COMMUNICATION

Our best efforts at communication may be wasted, and we may never know whether we have succeeded in

expressing our true meaning and intent, if we do not follow up to see how well we have put our message

across. This you can do by asking questions, by encouraging the receiver to express his reactions, by

following up contacts, by subsequent review of performance. Make certain that every important

communication has a “feedback” so that complete understanding and appropriate action result.

COMMUNICATE FOR TOMORROW AS WELL AS TODAY

While communications may be aimed primarily at meeting the demands of an immediate situation, they

must be planned with the past in mind if they are to maintain consistency in the receiver’s view; but, most

importantly, they must all be consistent with long-range interests and goals. For example, it is not easy to

communicate frankly on such matters as poor performance or poor test results but postponing

disagreeable communications makes them more difficult in the long run and is actually unfair to your

candidates.

BE SURE YOUR ACTIONS SUPPORT YOUR COMMUNICATIONS

In the final analysis, the most persuasive kind of communication is not what you say but what you do.

When a person’s actions or attitudes contradict his words, we tend to discount what he has said. For every

manager this means that good supervisory practice—such as clear assignments or responsibility and


authority, fair regards for effort, and sound policy enforcement—serve to communicate more than all the

gifts of oratory.

LAST, BUT BY NO MEANS LEAST: SEEK NOT ONLY TO BE UNDERSTOOD BUT TO UNDERSTAND—

BE A GOOD LISTENER

When we start talking we often cease to listen—in that larger sense of being attuned to the other persons

unspoken reactions and attitudes. Even more serious is the fact that we are all guilty, at times, of

inattentiveness when others are attempting to communicate with us. Listening is one of the most

important, most difficult—and most neglected—skills in communication. It demands that we concentrate

not only on the explicit meanings, but also on the implicit meanings, unspoken words and undertones that may be far more significant.

SELF ASSESSMENT AND GROUP ACTIVITIES

Effective communication can modify behaviour, effect changes, make information productive and achieve

goals. It is critical to each of us in work and play. Interpersonal communications are not confined to any

single aspect of our lives but with each and every time that we interact with others.

Successful communication can only take place if the people involved share the same meanings for words

and ideas. Communicating with another person is not a science; there are specific sound principles but

thousands of variations.

Now that you have some understanding, it is a good time to introduce you to the OBE assessment practices and methods that you need to understand before you can engage in an evidence facilitation session with your candidate(s). Remember the purpose of this unit standard: to “Facilitate the preparation and presentation of evidence for assessment”. All learning activities discussed in this guide are directed at achieving this.

Do Exercise two (2) in your

workbook now

This brings us to the next unit and that is to ensure that your interaction with

candidates helps to set them at ease and promotes understanding of the assessment.

You should have a clear understanding of Outcomes-Based Assessment in the context

of the National Qualification Framework.


THE PURPOSE OF INTERVIEWS/ONE-ON-ONE ORIENTATION SESSIONS

The purpose of an orientation session is to explain and clarify the details of assessment to candidates (purpose, process, expectations, roles and responsibilities). It is important that candidates understand from the outset what their role and responsibilities are regarding their own assessment. The candidates need to understand what the process is and why it is so. The candidates need to know what to expect from the assessor. The Evidence Facilitator needs to explain to the candidates what the assessor will expect from them.

We will provide you with sound principles for setting up one-on-one sessions with your candidates later in this section, but we first need to look at what you should understand and discuss during these orientation sessions.

THE ASSESSMENT PROCESS

All assessments, regardless of the subject matter, follow the same basic procedure, i.e. the planning of the

assessment with the candidate, the conducting of the assessment and on completion of the assessment,

the feedback to the candidate. However, before the assessment can take place, the assessor has to plan,

design and prepare assessments. This includes making decisions about the method of assessment, the

instruments to be used, the activities to be structured and the extent to which more than one

learning outcome can be assessed simultaneously.

STEPS IN THE ASSESSMENT PROCESS

The Steps in the Assessment Process are broadly divided into three areas:

The Preparatory Phase

The Assessment

Record and Review

The process is described below.

• Information to candidates is clear, precise and in line with instructions provided in the assessment instruments, and opportunities are provided for clarification concerning the process and the expectations.

Section 3 - Clear and Precise Information


Planning

Identify need for assessment.

Determine & state required criteria for performance.

Determine the assessment strategy.

Select or design assessment procedure.

Consult the candidates on the process of the assessment. Clarify

requirements, standards and expectations.

(Internal moderation applies throughout all three phases: planning, implementation, and record & review.)

Implementation

Explain the assessment procedure.

Gather the evidence.

Match the evidence against the required criteria.

Make an assessment decision (allocate competent/not yet

competent rating)

Provide feedback.

Implement assessment result/decide on development plan.

Record & Review

Record assessment results.

Record development plans (if any).

Handle assessment appeals (if any).

Review the assessment procedure.

MODERATION OF OBE ASSESSMENT

Moderation systems combine moderation and verification. Both moderation and verification

systems must ensure that all assessors produce assessments that meet the requirements for

assessment (fair, valid, reliable, etc.). This implies that the methods of assessment can be used for

moderation as well. The same evidence of assessment information thus needs to be gathered.

Candidates must be informed by Evidence Facilitators that assessments are moderated and that

they have the right to submit queries to the moderator where feedback from assessors was not

sufficient.


PROCESS OF MODERATION

Moderation ensures that assessment that is conducted by a single learning provider is consistent,

accurate and well designed. The three main stages to moderation are:

Design: Ensuring that the choice and design of assessment methods and instruments are

appropriate to the unit standards and qualifications being assessed. Design can also be

used as the check that assessors carry out when selecting assessment instruments.

Implementation: Ensuring that assessment is appropriately conducted and matches the

specifications of unit standards and qualifications – it involves making sure that

appropriate arrangements have been made and having regular discussions between

assessors.

Review: Ensuring that any lessons learnt from the other two stages are considered and the

necessary changes required are made.

Accredited providers should have individuals to manage their moderation systems. Providers’

moderation should:

Establish systems to standardise assessment including plans for moderation.

Monitor consistency of assessment records.

Through sampling, check the design of assessment materials for appropriateness before they are

used, monitor assessment processes, check candidates’ evidence, check the results and decisions of

assessors for consistency.

Co-ordinate assessor meetings.

Liaise with verifiers.

Provide appropriate and necessary support, advice and guidance to assessors.

Moderate assessment practices.

Maintain and monitor arrangements for processing assessment information.

RECOGNITION OF PRIOR LEARNING

It would most certainly be to your advantage to attend the course based on the unit standard:

Develop, Support and Promote RPL Practices if you wish to gain more information and knowledge

on recognition of prior learning (RPL). Here we will only discuss the basics of RPL, which will at

least give you a good idea of what it entails.


Purpose. The purpose of RPL can include access to and appropriate placement at a particular level

at a learning institution; granting advanced status, advanced standing, crediting and certifying

candidates for the parts of the qualification where all the requirements have been met. A

candidate can achieve a qualification either in part (e.g. where a candidate is granted credits for

some unit standards) or wholly through the process of RPL. (Criteria and Guidelines for the

Implementation of the RPL: 28.)

The following descriptions for the abovementioned options may be helpful:

Access. To provide ease of entry to appropriate level of education and training for all

prospective candidates in a manner that facilitates progression.

Placement. To determine the appropriate level for candidates wanting to enter education

and training through a diagnostic assessment.

Advanced status. To grant access to a level of a qualification higher than the logical next

level following on the preceding qualification.

Credit. To award formal, transferable credits to the learning that meets the requirements

of the part or full qualification.

Certification. To certify credits attained for the purposes of the qualification.

Processes. The process by means of which RPL is implemented must meet the SAQA requirements

in this regard. The following generic RPL process is suggested by SAQA (Criteria and Guidelines for

the Implementation of the RPL: 32).


The RPL process

Pre-screening: The RPL evidence facilitator meets the candidate to conduct pre-screening to ascertain the viability of pursuing the RPL option.

• If not viable, i.e. the candidate will clearly not meet the minimum requirements in terms of language or numeracy and/or other competencies, the candidate is referred for further advice on alternative pathways.

• If viable, then pre-assessment takes place. The RPL evidence facilitator takes the candidate(s) through preparation for assessment:

o Portfolio development and related workshops, and/or
o One-on-one advising.
o Assessment approaches, tools, mechanisms.
o Guidance on collecting evidence, which the candidate then follows.

Assessment planning: The assessor (preferably with the facilitator present) and candidate develop the assessment plan:

o Review unit standard(s) and requirements.
o Type and sources of evidence.
o Assessment tools to be used in the assessment.
o Dates and time for assessment.

Assessment stage:

o The candidate undergoes practical assessment, and/or
o The candidate sits knowledge tests, and/or
o The candidate goes through pre- and post-interviews, etc.

Judgement stage: The evidence is assessed by the assessor.

Moderation stage.

Feedback stage: Credit is either awarded or not awarded.

• Credit not awarded: the appeal process may be initiated.
• Credit awarded: post-assessment support and certification follow, if applicable.

Note: ‘Credit awarded’ could be replaced with ‘access’, ‘advanced status’, etc., depending on the context and purpose of RPL within the institution.

RELATED ASPECTS ASSUMED TO BE IN PLACE

1. RPL policies, procedures and systems must be in place; information on RPL must be readily available.

2. The provider has developed a criteria framework within which pre-screening takes place; pre-screening criteria are readily available to the candidates.

3. Assessment instruments have been developed and moderated.

4. Alternative pathways/options as well as additional counseling services are available.

5. Where no facilitators are available, assessors will undertake all functions.


Benefits. RPL holds benefits for the candidate, the employer of the candidate and the institution

who offers RPL. Many such benefits will be discussed with you once you do the RPL course, based

on the unit standard Develop, support and promote RPL practices. Generically speaking, RPL offers

the following benefits:

Costs can be kept down by co-operation between the employer and the providers of RPL.

New skills can be built.

New and dormant learning pathways can be opened.

Credit transfer is facilitated, provided that other learning providers recognize credits obtained through

RPL.

Consensus on the level(s) and the minimum requirements for candidates seeking credits for particular

qualifications or entry to further study supports the objectives of the NQF.

Learning institutions maintain their independence, since RPL procedures are generic and not

dependent on specific learning content or curriculums.

Challenges. Successful implementation of an RPL service is hampered by the following challenges:

Not all learning institutions use or recognize unit standard-based learning.

There are few assessors specifically trained in assessing RPL, and no registration option as RPL assessors

is currently available.

Not many providers of learning have staff who are trained in the RPL processes.

RPL is often seen as a shortcut towards obtaining qualifications. Candidates do not always realize that

RPL also requires evidence of competence.

RPL ASSESSMENT COMPARED TO “OTHER FORMS” OF ASSESSMENT

It is obvious that this is a totally different process from the traditional teach-and-test environment

of the pre-outcomes-based learning era. It is not advisable to approach RPL as something

different and divorced from formal learning. The contents of RPL must be the same as that for

formal learning. The RPL process does not require highly specialized training skills in test design

and statistics on the part of the assessor, although a holistic assessment approach with high levels

of flexibility, specialized activities, functions and procedures has to be conformed to. The RPL


process is non-competitive as each portfolio is distinct and personal. RPL assessment documents

are guided by the types of evidence the candidate provides.

In formal learning the assessor is limited to one or two generic assessment documents. The RPL

candidate is in control of the RPL process and has to show a high level of commitment, self-

analysis, reflection and objectivity. The focus is on what the individual has achieved and not the

time, the place or the method used. RPL can be utilized and applied in a number of different

contexts. This reflects a shift in learning from a lecturer-centered approach to a candidate-

centered approach and makes the RPL candidate a stakeholder in the RPL process. The candidate

is consulted in determining a timeframe for portfolio development but the time allowed for

completion should not exceed the time required to attend formal tuition. In RPL the candidate

should be fully aware of all assessment criteria against which suitable evidence of competence

has been prepared and matched. The candidate should be aware of and have access to the

assessment activities used by the assessor. The evidence of learning in RPL can take many

different forms, whereas in formal assessment, it is mostly limited to specific outcomes-based

questions and assessment criteria as specified in the unit standard.

The essential reference point for “marking” or “grading” an RPL candidate as competent is the lowest mark which enables a formal learning candidate to “pass”. The learning is therefore assessed in terms of whether competence has been achieved or not. An RPL assessment

framework provides a template from which various assessment instruments can originate in

order to match the evidence provided by the RPL candidate. The RPL portfolio should always

achieve what it sets out to do and the duplication of learning is reduced. The evidence provided

by the candidate must meet the requirements set by the relevant ETQA. In RPL administrative

records need to be carefully completed to provide an accurate historical perspective of the

candidate’s assessment.

RPL SUPPORT

It is important to realize that the reality in South Africa is different from that in first-world countries in particular, and that our candidates therefore have different needs. To avoid candidates feeling

alienated, RPL needs to be integrated with other learning and assessment services and

opportunities.


The biggest misconception that RPL candidates hold is probably about what “life experience” entails. “Life experience” evidence must be output-driven and linked to

specific assessment criteria. Candidates should be assisted to reflect on their experiences and

identify their strengths, weaknesses and accomplishments and link them to the specific

assessment criteria – where and if possible.

RPL candidates need to be informed of the dynamics of the RPL process and they should be

encouraged to clarify their goals, overcome obstacles and develop strategies to overcome them.

Candidates should be assisted to match claims of competence with qualifications, unit standards

and assessment criteria. Candidates need correct and adequate information which will enable

them to make sound decisions and listen attentively. They need information, advice, guidance and

tutorial support that will hopefully empower them to make good decisions.

The support services should consciously address the invisible barriers (lack of access to learning and non-exposure to formal learning) to successful assessment. These (social, cultural, economic, educational and psychological) barriers can be overcome by adapting assessment and accreditation practices to suit the unique South African context. The inclusion of advising and counselling services to complement evidence facilitation and assessment should meet the needs

of the candidates.

The candidates need relevant support in a well-managed, cost-effective process with post-

assessment care. This could be achieved through well-structured and well-managed portfolio development workshops. Candidates need access to portfolio development workshops in order to select or develop the most appropriate evidence, construct their portfolios and prepare themselves for assessment. During portfolio preparation the RPL facilitator needs to provide

information, respond to questions, review candidates’ work, encourage candidates, liaise with

assessors and give candidates fair and accurate feedback.

ASSESSMENT PRACTICES

In this section we will discuss the practices that should be associated with assessment to ensure

that what we do is in line with quality assurance requirements, recognized codes of practice and

learning-site or work-site standard operating procedures. You should not read this in isolation, nor

should you see it as separate from the rest of the procedures of conducting assessment. What I


am trying to do here is to give you a summary of the quality assurance activities that should

accompany conducting outcomes-based assessment. Let us start off by exploring what the assessment process entails and then discuss the policies and instructions that govern and regulate the actual assessment.

THE ASSESSMENT POLICY

The assessment policy should specify the following:

Time spent on contact learning and experiential learning, as well as how much time is allowed for

preparing practical portfolios of evidence.

How much time candidates should spend on contact learning before they can be assessed.

Assessment instruments in use (theoretical, practical, types of questions, criterion-referenced, norm-referenced).

How formative assessment will be used and if it will count towards the candidate’s final

mark, i.e. if it will also have a summative value.

Types of competence to be assessed. The following types of competence should be

considered for inclusion:

o Foundational competence. If a unit standard requires that embedded knowledge

and theory must be assessed separately, it has to be adhered to in addition to the

practical summative application-type assessments.

o Practical competence. This evidence must be as holistic, direct, integrated and

naturally occurring as possible. If simulations and scenario-type application is not

allowed for assessment purposes according to the unit standard notes, this must be

adhered to.

o Reflexive competence. Reflexive competence is the ability of a candidate to

critically reflect on his or her ability to apply the knowledge gained. Qualitative type

questioning should be used and should include reflection on the critical outcomes.

The sources of reflective competence can include self- and peer assessment as well

as third party evidence of these particular skills.

THE ASSESSMENT GUIDE

One of the key documents in the design of assessment is the compilation of an assessment guide.

This document should not be confused with the assessment plan. The assessment plan is an


organising and management tool which links activities to a time scale and has the candidate’s

support in mind. Much of the information in an assessment plan could be explained further in the

assessment guide. The assessment guide gives direction, provides more information regarding the unit standard itself and has the quality of assessment in mind. The content

of an assessment guide is much more comprehensive and could replace a curriculum, provided it

contains all the required information and is based on a proper assessment analysis and design.

The assessment guide is a separate document that accompanies a course manual. The assessment

plan can form part of it, because cross-references are made to the guide. The assessment guide

should contain the following:

The guide contains all the details needed by assessors to conduct assessments in line with

defined assessment principles.

The guide provides clear details of the assessment activities in line with the assessment

design, so as to facilitate fair, reliable and consistent assessments by assessors. The

activities are presented in a form that allows for efficient communication of requirements.

The structure of the guide promotes efficient and effective assessment. It further facilitates

the recording of data before, during and after the assessment for purposes of record

keeping, assessment judgement and moderation of assessment.

The guide includes all support material and/or references to support material, including

observations sheets, checklists, possible or required sources of evidence and guidance on

expected quality of evidence including exemplars, memoranda or rubrics as applicable.

The guide makes provision for review of the assessment design, and is presented in a

format consistent with organisational quality assurance requirements.

If there is any reason that the candidates should NOT get parts of the assessment guide, this can

be deleted from their version and only given to the assessors. This includes information such as:

Memoranda.

Assessment instruments.

Instructions to assessors and support personnel.

Administrative procedures not meant for the candidates e.g. scheduling of assessments,

quality assurance meetings, etc.

Assessment review information.


THE ASSESSMENT PLAN

One of the important documents used during assessment, is the assessment plan. It is the

agreement between the assessor and the candidates on what the plan is that will be followed to

gather evidence against a unit standard. For a group, one plan will be compiled and discussed in a

group. The essential parts preceding the plan will be signed and returned to the facilitator or

assessor to keep as evidence. (The ideal is that the candidates keep a copy and the provider or

assessor the original.)

An individual plan can be compiled if only an individual candidate is involved. This is especially the

case in RPL where the detail on gathering evidence could differ from that of a group. This planning

will form part of a counselling/interview session to prepare for RPL.

The Evidence Facilitator or assessor must discuss the assessment plan with all candidates.

Both the assessor/facilitator and candidate/learner must sign the assessment plan once it has been

discussed.

The assessment plan becomes a learning contract once it is signed. It has to be stored with the

assessment evidence for verification. The crux of an assessment plan is that it should contain all the

information that tells the candidates:

o What exactly will be assessed (links with the unit standard)

o How assessment will be done (methods and instruments used,

how the grading is done, whether assessments are formal or

informal, etc.).

o Where and when the assessment will be conducted.

o What their responsibility is.

o What and how must they learn? Is it an open-book assessment? Is it part of a

cumulative mark? Will candidates be allowed to conduct re-assessment and under

what conditions?

o What if a candidate falls ill and cannot participate in the assessment?

o Can the candidates work in a group? How will the group effort be scored?

o Must they submit a portfolio to the assessor?

All these questions should be encouraged and recorded as part of the assessment plan.


THE MEMORANDUM

The memorandum (containing model answers) is another important document used during the

assessment process.

A model answer is required for factual and calculated answers.

In examination papers or practical exercises where creativity is an element, there may be more

than one possible correct solution or answer.

Where more answers are possible than the allocation of marks indicates, all possibilities must

appear in the memorandum, if possible. (In the case of creative problem solving, a mark out of

100, based on certain elements of the problem, should be used as yardstick.)

With discussion and opinion-type questions the main points should be listed in order of

importance and a broad explanation of that which is expected must be provided. The assessor

will mark such question papers, so that they can be subject to moderation.

There must be no confusion in respect of the question that was asked and the expected

answer.

The memorandum must be a comprehensive set of answers covering all subsections of all

questions and phrased in the exact manner the assessor expects the candidates to answer the

questions.

In the case of question papers where no model answer can be provided and where the

assessor must evaluate the answer at his or her own discretion, no prototype can be expected.

The marks awarded for each answer or subsection of an answer must be shown clearly.

The answers must be in the same sequence as the questions in the question paper.

References to textbooks and lectures are not acceptable.

Where annotated drawings are required, the complete drawings with annotations must

appear in the memorandum.

THE ASSESSMENT APPEALS PROCESS AND RE-ASSESSMENTS

Candidates must have the security of knowing that, if they feel that unfairness, invalidity,

unreliability, impracticability, inadequacy of experience or expertise, and unethical practices were

present in assessment, they may appeal against the results of assessment. Appeals are normally

lodged in writing and the merits of all appeals must be investigated. A learning or assessment


provider must have a clear and fair appeals procedure and this must be brought to the attention

of all candidates

An appeal against an assessment decision or the manner in which the assessment was conducted

may be lodged by any of the role players in the assessment process. This is the proposed

procedure to follow in the event of an appeal:

Assessment conducted.

Feedback given to candidate.

Appeal lodged within 3 working days.

Internal moderator.

Education and Training Committee.

Top management for final decision

The assessment process has to have built in a process for re-assessment. When a candidate has to

undergo re-assessment they have to be given feedback so as to concentrate on areas of weakness.

Ideally continuously conducting formative assessment should minimise the need for reassessment

as the assessor and candidate will decide on carrying out summative assessment when both have

agreed that the candidate is ready for it. Re-assessment should comply with the following

conditions: (SAQA Guidelines, 1999: 29.)

The reassessment should take place in the same situation or context and under the same

conditions.

The same method and instrument may be used, but the task and materials should be

different – the task and materials should, however, be of the same complexity and level as

the previous ones – in case the methods and instruments are changed it must be ensured

that they are appropriate for the outcomes specified.

Care should be taken regarding how often re-assessment can be taken and the length of time

between the original assessment and the re-assessment. Limits should be set to the number of

times a candidate can undergo reassessment and for the length of time between assessments. A

candidate who is repeatedly unsuccessful should be given guidance on other possible and more

suitable learning avenues.


SELF ASSESSMENT AND GROUP ACTIVITIES

Finally we return to the point where we started at the beginning of this section…

SCHEDULING ORIENTATION SESSIONS (INTERVIEWS/ONE-ON-ONE SESSIONS)

One of the critical success factors of planning for success is to undertake regular interviews/one-to-one orientation sessions to review and evaluate the assessment processes, activities and instructions. At this orientation session all assessment activities must be made clear and the Evidence Facilitator must ensure that the candidate understands and is at ease with the process. The registered assessor depends on the outcome of this, as the candidate should be declared ready for assessment on completion of this orientation session. If the candidate is not ready this must be made known to the assessor and immediate action must be taken to get the candidate ready and prepared.

This process needs to:

■ encourage the candidate to self-reflect with the view to clarifying where and how progress has

been made,

■ identify, recognise and celebrate success with the purpose of promoting personal confidence and

growth,

■ set high standards and create an environment of high expectations, which are challenging but

realistic,

■ provide encouragement and support, but leave the responsibility of making progress with the

candidate, to encourage independent candidates;

■ help candidates prioritise, especially where there are conflicting priorities,

■ help candidates set goals and solutions.

Do Exercise THREE (3) in your

workbook now

CREATING A CLIMATE CONDUCIVE TO ASSESSMENT, REFLECTION AND ACHIEVEMENT

It takes time to build a rapport to promote trust and openness with candidates. This is necessary to

promote a climate of confidentiality in this important professional relationship where clear boundaries are

agreed from the outset about what can be discussed within the planning for success context. It is crucial

that Evidence Facilitators truly believe that all candidates can learn and achieve, and that they

communicate this view effectively to candidates. When this climate has been created, candidates’ self-

esteem can be built. Many candidates who have experienced difficulties at summative assessment may feel

that they do not have the necessary potential to succeed. Planning for success can overcome these fears

and prevent candidates withdrawing from assessment.

Start building a professional relationship by:

■ showing genuine interest in candidates;

■ demonstrating respect for the candidate;

■ establishing effective channels of communication;

■ promoting effective open dialogue.

■ Set the timeframe for the orientation session. Candidates might get the impression that

when they raise difficult issues the Evidence Facilitator will bring the session to a close

because they don’t want to address these concerns. Setting the parameters for discussion

and defining the time available is an important part of the session and encourages open dialogue. Then, critically:

put the candidate in the driving seat within an agenda agreed at the outset of the

orientation session;

allow the candidate to communicate their aspirations, successes, needs and concerns,

and encourage them to put forward their own solutions to problems;

don’t give advice while the candidate is still talking.

Listening to feedback from candidates. Just because Evidence Facilitators are listening to what the

candidate is saying, it does not necessarily mean that they are hearing what is being said. Being able to

clarify what a candidate is saying at strategic points is an important skill.

Active Listening skills involve:

■ valuing the candidate’s contribution;

■ looking interested in what the candidate is saying;

■ minimising distractions;

■ demonstrating patience;

■ communicating empathy;

■ maintaining eye contact (in certain cultures this is not considered to be good practice);

■ refraining from interrupting;

■ making some non-verbal signals to indicate encouragement and agreement such as

nodding to indicate you are listening to what is being said;

■ not looking at your watch or a clock (although it is useful to have a clock in the room as part of time management);

■ avoiding making early judgements.

IDENTIFY SOURCES OF EVIDENCE

The purpose of assessment is to gather evidence of performance against predefined criteria against which a

judgement or decision of competence can be made. You as the Evidence Facilitator should memorise this and measure all your actions against the purpose of assessment. All information and advice that you provide must be aligned with this purpose. Always ask this question: “Does the information that I provide help candidates to identify the appropriate and essential evidence necessary against which a decision of competence can be made?”

If your answer to this question is “No”, or if you are in doubt, you must surely rethink your evidence facilitation strategy. Strategy means, in short, “the ways and means to achieve the end”! The most important activity when informing candidates is providing information on how to identify sources of evidence. The candidates should know:

• The kinds of assessment activities that they could be asked to perform.

• The standard and level of performance expected.

• The type and amount of evidence to be collected.

• Their responsibility regarding the collection of evidence.

• The purpose of assessment.

• Assessment methods including purpose of unit standards and evidence guides.

• Assessment procedures.

• The NQF (in relation to assessment) and the national certification bodies.

• The role and responsibilities of the role-players in assessment.

• The information helps candidates to identify possible sources of evidence and the most appropriate and effective means for producing evidence for the assessment.

• The opportunities available to the candidates during and after the assessment process, including the appeal procedure.

Section 4 - Producing evidence for assessment

IN ADDITION - ALLOWING FOR INPUT (FEEDBACK) BY THE CANDIDATE

The candidates and the Evidence Facilitator should be satisfied that the timing of the assessment, the

opportunities identified, and the place of assessment are suitable.

Opportunities should be provided for input from the candidate/learner on possible sources of evidence

that could contribute to valid assessment. This is especially important in adult and occupationally directed

learning, because candidates often have valuable experience, so that they can make good suggestions of

what assessment methods should be most effective and efficient. Modifications made on the basis of the

inputs maintain and/or improve the validity of the assessment. Informing the candidates about their

assessment is thus important in that there may be cases where, because of the maturity and experience of

the candidates, they may be in a position to alert the assessor to other opportunities that the assessor may

not have been aware of in planning. Inputs obtained from candidates might lead to the assessor having to

modify or redesign the assessment. This includes questioning the candidates on their prior experience to

assist in determining their knowledge and skills to identify relevant unit standards according to assessment

requirements. (Design should, nevertheless, take place before candidates can be informed.) Depending on

the language policy of the learning institution, candidates can also be consulted about the language in which the assessment will be conducted and about their readiness for assessment.

ASSESSMENT METHODS, INSTRUMENTS AND RESOURCES

This topic was discussed in more detail earlier in this unit. Assessment methods refer to the

activity that an assessor engages in, as he or she assesses a candidate and the candidate’s work.

Many different methods can be used to gather evidence of candidates’ knowledge, skills and

understanding. The assessment activities, instruments and resources selected must be appropriate

to the outcomes to be assessed and the assessment candidates, and have the potential to enable

valid and sufficient evidence to be collected.

EVIDENCE AND ASSESSMENT

What Evidence Facilitators must know and understand regarding the amount of evidence

required.

The amount of evidence required. There is quite some disagreement between providers of

learning and assessment about exactly how much evidence is required to justify finding a

candidate competent. To some extent this will depend on the assessment method used. However,

the following general guidelines should help you decide if the evidence is sufficient:

The candidate must show competence in executing the specific outcomes and critical cross-

field outcomes of the unit standard and the unit standards of a qualification.

The candidate must meet the requirements of the assessment criteria linked to each specific

outcome. This can be rather subjective, so that the assessor might need to motivate his or her

judgement of “meeting the requirements” or not. That is why it is important to keep evidence

of assessment for a reasonable period of time, at least until after verification and until the

candidate has had reasonable time to appeal against the results of the assessment, should he or

she wish to appeal.

The candidate must show an understanding of the essential embedded knowledge that forms

part of the unit standard. Rote learning is seldom necessary, but the candidate must

demonstrate the ability to find and use theoretical knowledge. This can often be achieved by

means of an open-book theoretical examination.

At least two different assessment instruments should be used, so that a measure of

corroboration can be achieved. The principle of triangulation is, however, always a safe one –

where three assessment instruments are used.

The candidate must demonstrate added value in a workplace setting that is relevant to his or

her newly acquired skills. A workplace setting can, of course, also be simulated.

The candidate must show competence at the level of the unit standard and, ultimately the level of the qualification or learning programme.

GUIDING CANDIDATES ON EVIDENCE

The quality of outcomes-based assessment relies on the quality of the evidence that you record in the

assessment process. Assessors collect the evidence using the different methods according to the outcomes

being assessed, the purpose and the characteristics of the candidates. You will need to make a judgement

concerning the quality of the evidence available.

Types of evidence: Fletcher [1997, p. 67; 1992, pp. 82-84] identifies different types of evidence that may be

considered (discussed hereafter).

PERFORMANCE EVIDENCE

This is any evidence which is direct and gives clear information about the candidate’s performance.

Example:

Performance evidence may take on the following forms:

- Actual products of performance.

- Results of observation of performance.

- Results of questioning by the assessor.

- Assessor’s own notes/records.

- Video recording.

- Tape recordings.

- Records and documents which are themselves evidence of the required outcome.

- Signed written evidence by a recognised expert.

- Signed written evidence by specialist assessor.

SUPPLEMENTARY EVIDENCE

This is any evidence for which you rely on the judgement of other assessors [excluding specialist or recognised expert assessors] or on documentation that supports the performance required but is not, in itself, evidence of performance.

Examples:

- Letter [e.g. written by a customer of a candidate’s performance].

- Signed evidence by peers against a standard or qualification.

- Signed evidence by an assessee against a standard.

- Signed evidence by a customer against a standard.

- Verbal customer reference, not obtained or asked for by the assessor, for judgement against a standard.

- Records and documents which are not performance evidence on their own.

KNOWLEDGE EVIDENCE

This is any evidence on which you make a judgement as to the knowledge and application of the knowledge

of the candidate.

Example:

- Information gained from questioning.

- Written tests and the answers.

PAST ACHIEVEMENT EVIDENCE

This includes any performance, supplementary or knowledge evidence of performance relevant to the

standards, and which occurred at a time before the current assessment.

Examples:

- Qualifications

- Customer references

- Reports

You would focus attention on performance evidence obtained directly to make a decision. All the evidence

would still need to comply with the assessment criteria.

SUMMARY OF UNIT 1

Outcomes-based assessment entails the gathering of evidence of performance based on predefined criteria against which a judgement or decision of competence can be made – whether it is a decision confirming competence or a decision of “not yet competent”. Ultimately, registered assessors make this decision based on the evidence provided by the candidate. YOUR job is to ensure that candidates know:

WHAT Evidence they must submit;

HOW they must submit the Evidence;

WHEN they must submit the Evidence;

WHY they must submit the Evidence;

WHERE they must submit the Evidence;

To achieve the above it is evident that you “as the Evidence Facilitator” must have sufficient

knowledge and understanding of all Policies, Procedures, Documentation, Methods and Activities

involved in the Assessment Process. Not only do you need to understand this, but you must

also have the appropriate skills in place to facilitate this understanding to the candidate, who is

preparing for assessment.

Now you may ask, “Why all this fuss and emphasis on assessment?” The answer is simple: to enhance the achievement of learning. Without achievement of learning there will be no competence in place, meaning no competence in South Africa’s labour force, meaning no foreign investment (as we will not have the skills to attract it), meaning no jobs, and meaning no social growth and upliftment of each and every citizen of this country.

RPL is a specialized form of assessment in that candidates must provide evidence of previous

learning without having to attend formal training. It can be used for certification, placement,

granting advanced status and standing, crediting and access to further learning. RPL candidates

still need to provide evidence of competence – crediting is not an automatic process, based on

life-skills. RPL should not be managed separately from other learning and assessment services.

The differences between assessment procedures, approaches, strategies and methods are not clear

– different sources use the terminology differently, so that I felt it necessary to illustrate how we

will use them in this manual.

The following table should give you a good idea of our understanding of assessment procedures,

approaches, strategies and methods:

PROCEDURES

Continuous assessment.
Formative assessment.
Summative assessment.
RPL.

METHODS

Observation in the workplace.
Simulation.
Practical demonstration.
Presentation.
Self-assessment.
Products of candidate’s activities.
Projects, assignments, tasks.
Documents linked to an activity.
Oral discussion or interview.
Written questions or essays.
Witness testimony/peer reports.
Photographs, videos.
Case studies and role plays.
Journals or logbooks.
Tests or examinations.

APPROACHES

Norm-referenced assessment.
Criterion-referenced assessment.

STRATEGIES

Performance-based assessment.
Outcomes-based assessment.

The principles of fairness, validity, reliability and practicability are probably the most important,

because the other principles will in all likelihood be met if those four are adhered to.

SELF ASSESSMENT AND GROUP ACTIVITIES

Space for your notes

Do Exercise Four (4) in your

workbook now

Advise and support candidates to prepare, organise and present evidence

Unit 2

PROVIDING GUIDANCE AND SUPPORT

One of the aims of outcomes-based learning is to make the entire process, including assessment,

more candidate-focused. Some of the ways suggested to achieve this are for Evidence Facilitators

and Assessors to acknowledge the needs, interests and abilities of candidates; to actively involve

candidates in all phases of the learning and assessment process.

•Potential barriers to gathering evidence and special needs of candidates are identified, and appropriate guidance is given to overcome such barriers and to address special needs.

•The advice and support helps candidates to identify appropriate, effective and efficient ways of producing evidence of their competence.

•The advice and support is given in a way that promotes the candidates` ability to present valid, relevant, authentic and sufficient evidence of current competence.

• Interactions with candidates enable them to organise and present evidence in a manner that contributes to the overall efficiency and effectiveness of the assessment, but without compromising the reliability and validity of the assessment.

•The nature and manner of advice and support takes into account lessons learnt from previous such interactions as well as information from assessors.

•Support is given in a way that strengthens candidates` ability to engage more independently in future assessments.

Sections 5 to 11: All Assessment Criteria relating to SO2 will be discussed in this unit

ASSESSMENT CRITERION RANGE

• The proposals could be made to candidates and/or assessors and other role-players.

Effective assessment, like good facilitation, should foster a positive relationship between

candidates, Evidence Facilitators and Assessors. It should encourage candidates to take

responsibility for their own learning, to develop the confidence necessary to cope with increasing

challenges, to reflect on their own abilities and progress and to be actively involved in improving

themselves. Assessment should help candidates who experience learning problems by prompting a reconsideration of the work that is being done, of the facilitation procedures which are being followed, or of the learning strategies which are being applied.

It is clear, therefore, that assessment should form an integral part of the learning process. As such,

it should be planned, implemented, recorded and reported in systematic and comprehensive

ways.

It is important that the Evidence Facilitator, in cooperation with the Assessor, plans the assessment so

as to ensure that the candidate will be assessed in a fair and effective manner, and that the

assessment will be credible, that is fair, valid, reliable and practicable. The Evidence Facilitator

would need to know from the onset the type and amount of evidence that will need to be

generated. During this stage the candidates will in all probability also be registered for assessment,

and the assessment methods will be chosen. The assessment must be carried out in accordance

with the assessment plan. Therefore the Evidence Facilitator must understand the assessment

plan in such a manner that it will facilitate execution. The assessment approach should, however,

be adapted as required by the situation, and unforeseen events are addressed without

compromising the validity or fairness of the assessment. Planning the assessment involves the activities described below.

The Evidence Facilitator can obtain information on the assessment from a number of sources, for example:

Discussion with assessors. Evidence Facilitators will need to meet with Assessors of the same unit

standards or qualifications from time to time to discuss assessment issues. This could be part of the

moderation process. They can review each other’s plans and materials and discuss the assessment

strategies, evaluation of past assessment materials, new approaches and strategies, possibilities for the

use of more than one assessor, type and amount of evidence required.

Discussion with the learning facilitator. In instances where the Evidence Facilitator is not the

facilitator, he/ she would need to know from the facilitator about the learning programme that the

candidates have been through, the results of the formative assessment that the candidates have been

through, the equipment and material that the candidates are familiar with and to ascertain candidate

readiness for summative assessment.

Decide on the timing of assessment. The Evidence Facilitator together with the assessor would need to

plan for time during the learning programme for both formative and summative assessments. He or she

would need to identify opportunities for carrying out assessment. In the case of the candidate being

employed, they would need to find out when the candidate will be engaging in certain activities in

the workplace that relate to the outcomes specified in the unit standard or qualification. In the case of

learning programmes that include a practical component like teaching, they would need to see if such

opportunities could be used for assessment of some of the outcomes. It is also important to plan the

timing of assessment in cases where facilities have to be shared by a number of providers and,

therefore, time slots have to be booked for the use of facilities.

Modifications made on the basis of the inputs maintain and/or improve the validity of the

assessment. Informing the candidates about their assessment is thus important in that there may

be cases where, because of the maturity and experience of the candidates, they may be in a

position to alert the assessor to other opportunities that the assessor may not have been aware of

in planning. Inputs obtained from candidates might lead to the assessor having to modify or

redesign the assessment. This includes questioning the candidates on their prior experience to

assist in determining their knowledge and skills to identify relevant unit standards according to

assessment requirements. (Design should, nevertheless, take place before candidates can be

informed.)

Depending on the language policy of the learning institution, candidates can also be consulted about the language in which the assessment will be conducted and about their readiness for assessment.

LANGUAGE AND ASSESSMENT

Candidates should, generally, be able to be assessed in a language that they are most proficient

in. In South Africa in particular, this is an important factor. In Chapter 1 Section 6 of the

Constitution it is stipulated that:

…the official languages of South Africa are Sepedi, Sesotho, Setswana, siSwati, Tshivenda, Xitsonga, Afrikaans, English, isiNdebele, isiXhosa and isiZulu… measures must be taken to elevate the status and advance the use of the historically diminished indigenous languages… all languages must enjoy parity of esteem and must be treated equitably.

Furthermore, in Chapter 2, the Bill of Rights, Section 29, it is stipulated that:

…everyone has the right to receive education in the official language or languages of their choice in

public educational institutions, taking into account equity, practicability and the need

for redress…

These constitutional provisions give candidates the right to determine their language/s of learning

and teaching. Assessment policies, therefore, should ensure, as far as possible, that this right is

upheld for all candidates.

Language and expressions used should be at a level appropriate to the candidate and provide for

clear understanding of what is required without leading candidates. It is the responsibility of the

assessor to ensure that the level of the course and the level of the language used are as far as

possible the same. It is better to err on the side of using language that is too simple rather than the other way around.

BARRIERS TO ASSESSMENT AND SPECIAL NEEDS OF CANDIDATES

Potential unfair barriers to achievement by candidates must be identified and plans made to

address such barriers without compromising the validity of the assessment. (Unfair barriers could

relate to issues such as language or disabilities.)

Disabled or impaired candidates are not only people in wheelchairs. Any disability that can make it

difficult or even impossible for a candidate to write (theoretical) or do (practical) assessment

qualifies as a special need. It is the constitutional right of such individuals to be assessed as one

would assess any other candidate. A disability can be physical, mental or emotional. In fact,

language may be a barrier requiring special arrangements.

Often the first barrier to overcome is the physical barrier to access. The Evidence Facilitator should

consider and evaluate the venue for learning and assessment to ensure that it will be accessible to people

with disabilities. Transport arrangements can be made, although it is normally the responsibility of

the candidate.

If language is a barrier to writing an examination, documentation can be translated into the language

of the candidate with a language problem. This, however, can make learning very expensive, so

that this will probably only be done with critical material. Sign language interpreters can be

arranged for deaf people. Childcare arrangements should also receive attention if there are

candidates who need such services - mostly single/working mothers who do not have anybody

else to look after the child while they attend learning or write examinations. Examination papers

can be prepared in extra large font type for candidates with poor vision.

The only instance where a candidate can be refused assessment is where his or her writing or

doing the assessment will endanger the safety of others or themselves. For example, a blind

person will probably not be allowed to do an examination which will qualify him or her to work on

high tension electrical cables.

The basic principle is that candidates with special needs must as far as possible be allowed and

enabled to write examinations and do practical assignments. This, however, does not mean that

standards should be compromised.

OTHER BARRIERS TO LEARNING AND ASSESSMENT

We have used evidence from our literature review to create a series of typologies of factors,

barriers and triggers that affect engagement in learning. The following categories encompass, in

an extremely broad sense, the key components of those typologies:

• personal – family commitments, personal interest and motivation, financial commitments

and barriers, individual income

• institutional – employment and unemployment factors, availability of institutional finance or

funding, tax breaks and benefits, information, advice and guidance, the nature and quality of

provision itself

• systemic or external – transport, proximity to institutions, regional and local characteristics,

such as deprivation, nature of the labour market and availability of learning opportunities.

The most commonly found barriers derived from the abovementioned typologies are listed below. Evidence Facilitators must ensure that they identify these in advance during the identification of needs and ensure that the necessary action is taken to reduce or remove these barriers. Guidance must be provided in line with the organisational policy, and the assistance of the applicable SETA should be sought where the learning opportunity was sponsored or funded with a specific intent by that SETA.

The most commonly found barriers include, but are not limited to:

• Financial difficulties with course costs.

• Financial difficulties resulting from studying instead of working.

• Transport too expensive.

• Transport not available or difficult.

• Place of learning too far away to commute comfortably.

• Childcare too expensive.

• Childcare not available.

• Lack of time to study.

• Lack of time to attend.

• Caring responsibilities.

• Religious/cultural constraints.

• Lack of support from the institution.

• Lack of support from family or friends.

• A change in circumstances made continuing impossible.

• Wrong course enrolled.

Evidence Facilitators who show a keen interest in resolving these barriers with the candidates involved will find renewed willingness and positive participation from those candidates.

SUPPORT IS GIVEN IN A WAY THAT STRENGTHENS CANDIDATES’ ABILITY TO ENGAGE MORE INDEPENDENTLY IN FUTURE ASSESSMENTS – DEVELOPING A FRAMEWORK FOR ASSESSMENT – BUILDING A COMPETENCY PORTFOLIO OF EVIDENCE

This framework is not prescriptive and can take various forms. It can take the form of the personal

development plan, a learning journal, or a simple table as depicted below. We suggest, however, that learners start developing Competency Portfolios. If the term causes confusion with the formal Portfolio of Evidence (PoE) submitted as part of summative assessment, rather refer to it as a “file in which candidates start to collect, chronologically compile and prepare evidence for assessment”. This is not a PoE that the candidate will submit as part of summative assessment, but a “personal” file in which they start to prepare, plan and gather evidence to get them ready for assessment. For all purposes, let us refer to it as a Competency Portfolio.

Example: “Preparing Evidence”

Name of Plan: Learning Strategy for the Learnership in ETD
Name of Learner: P. Learnmore

Ser nr | Outcome to be achieved | Evidence Facilitator | Degree of excellence to be achieved | Target Date | Resources Required
1 | Improve communication skills to assist with learning | Mr Malone | Must be competent at least to NQF 4 | 10/05/08 | Prescribed learning material – will be provided by the facilitator
2 | Re-submit portfolio on OBE Assessment | Mr Nel | 60% | 01/07/08 | None
etc.

DEVELOPING A PORTFOLIO OF COMPETENCIES

We strongly recommend that you advise your learner to construct some kind of portfolio - either

paper-based or electronic - to provide evidence of the learner’s competencies and performance. The

advantage of a portfolio is that it is:

Holistic – it allows you to present an overview of your professional path and competencies in

one place

Job-related – it is based on the demands of your current job and prospective development

opportunities

Appropriate – you select the evidence to illustrate achievements you think are relevant

Historic – it contains evidence of past achievements

Prospective – it details future plans

WHAT WILL WE FIND IN A COMPETENCY PORTFOLIO?

As a portfolio is a record of learning and development, it will probably contain:

Details of Unit Standard or the Unit Standard itself.

The Assessment Guide or Curriculum.

Workbooks and practical activities that they completed during the learning programme.

The Assessment Plan.

Detail regarding the Assessment Methods that will be used during summative assessment.

Competencies they need to develop per specific outcome and identified assessment criteria.

Methods to demonstrate competence against the assessment criteria and possible questions that

might be asked by the Assessor.

Learning about which they are still in doubt and aspects that they must clarify with you, the Evidence Facilitator.

Evidence of competence (cross-referenced to the competencies contained in the framework).

Learning plan and target time frames in the form of a plan to learn and master the competencies

necessary to produce as evidence of competence during Summative Assessment.

In order to create the portfolio, they will need to look at their own strengths and weaknesses by comparing their personal profile against the requirements of the Assessment Activities listed in the Assessment Guide and Assessment Plan. This will help them to identify the essential evidence necessary for assessment, which they can then discuss with you, the Evidence Facilitator. They will then be able to compile their own personal learning development plan based on the criteria of the assessment.

The development plan should reflect not only what is desirable, but what is possible and achievable, and

should include:

What is to be achieved

How they will know if they have achieved it

When - what is the required timescale

EVIDENCE TO BE COLLECTED WITHIN THE COMPETENCY PORTFOLIO

The evidence they collect in the portfolio must be aligned clearly with the assessment criteria of the unit

standard against which the learning programme was designed and developed.

In addition, this will allow candidates to carry out a self-assessment, thereby providing scope for candidates to prepare more independently for assessment. Following this, they may wish to involve you, the Evidence Facilitator, to discuss those assessment activities still causing doubt or concern.

Evidence compiled should ultimately result in:

Suggested sources of evidence:

Products – the tangible evidence/ activities necessary for Assessment.

Process – information about the methodology and rationale of the assessment approach.

Suggested types of evidence:

Research and continuing professional development activities.

Teamwork activities: co-operation with colleagues and with other departments.

Samples of work produced, e.g.

o Learning programme submissions, etc.

o Assessment reports.

Other feedback reports from other assessors, learning facilitators, fellow learners, etc.

Analyses and summaries of these reports and of feedback received, for example on formative assessments.

The format of the portfolio will depend on whether they have chosen to work with a paper-based or

electronic portfolio (ePortfolio). A paper-based portfolio is a perfectly adequate tool for presenting written

documentation. It can also have certain artefacts attached to the written document and contain references

to digital and online sources. The ePortfolio, however, will give you more opportunities to present a varied

range of evidence, including:

Audio files

Graphics

Video clips

HTML files

PPT presentations

Digitized photographs

HOW WILL THE PORTFOLIO PROGRESS BE MEASURED?

The criteria for evaluation include that the evidence must:

Be based on actual performance and not just theoretical knowledge.

Demonstrate an acceptable level of competence against performance indicators (e.g. the Unit

Standards and NQF Level Descriptors)

Demonstrate an ability to transfer competence to work situations (i.e. if they are working in a

different environment, could they still perform to the same level?)

Be reliable and valid.

Be sufficient to prove competence.

THE END-RESULT OF ALL INFORMATION AND ORIENTATION SESSIONS - CONFIRMING

CANDIDATES’ READINESS FOR ASSESSMENT

The Evidence Facilitator must make sure that the candidates are ready and well-prepared for the

assessment. This can be achieved by reviewing the results of formative assessment, but the

candidates must also be given an opportunity to declare if they are ready or not. It is important

that the candidates’ confirmation that they are ready be documented. This is done by taking

minutes during the orientation of the candidates on the assessment. However, we tend to

generate way too much paperwork when preparing for assessment. Consequently the same

results can be achieved by having the candidates sign a declaration that they are ready for the

assessment. This can take the following format:

“START of EXAMPLE”

CANDIDATES MUST COMPLETE THIS FORM AFTER HAVING READ THE ASSESSMENT GUIDELINES AND SUBMIT THE COMPLETED FORM PRIOR TO WRITING THE THEORETICAL EXAMINATION. IT ALSO APPLIES TO THE PRACTICAL EXAMINATION.

THE FOLLOWING QUESTIONS MUST BE COMPLETED BY THE CANDIDATE:

Do you have any special requests or requirements now that we have discussed the assessment plan? Please list them:

1. ……………………………………………………………………………………..

2. ……………………………………………………………………………………..

3. ……………………………………………………………………………………..

Do you have any questions about the examination? (Please be sure to pose your questions to the

facilitator/Assessor before commencement of the examination.)

1. …………………………………………………………………………………….

2. …………………………………………………………………………………….

3. …………………………………………………………………………………….

I hereby confirm that: (Yes / No)

1. I understand the contents of this assessment plan.

2. The contents have been explained to me.

3. I have been given an opportunity to ask questions.

4. I have been involved and changes have been made to suit my reasonable objections or suggestions.

5. This is the plan that we will follow to gather the evidence needed.

………………………………………. ………………………………………

SIGNATURE: CANDIDATE SIGNATURE: ASSESSOR

Date: …………………………. Date: ……………………..

“END of the EXAMPLE”

THE ROLE OF THE CANDIDATES IN ASSESSMENT

It is important that the candidate understands from the outset what his or her role and

responsibilities are regarding the assessment, as well as the evidence that is required for

assessment. The candidate needs to understand what the process is and why it is so. The

candidate also needs to know what to expect from the assessor. The Evidence Facilitator or

assessor needs to explain to them what he/she expects from them. This should be contained in

the assessment guide and the assessment plan. The assessment guide must provide all the

information listed in the unit standard.

SUMMARY OF UNIT 2

Candidates must be properly informed about all the details of the assessment and they should be

given an opportunity to suggest alternatives and raise concerns or objections. The Evidence Facilitator, in conjunction with the assessor, must also confirm that the candidates are ready to be assessed, and the candidates must confirm their readiness in writing. Once the assessor has briefed the candidates on the assessment and given them an opportunity to air their views, reservations and suggestions on the assessment, everything should be in place for conducting the assessment.

SELF ASSESSMENT AND GROUP ACTIVITIES

Do Exercise Five (5) in your

workbook now

Check and give feedback on evidence

Unit 3

OUTCOME RANGE

This is limited mainly to checking the completeness and

appropriateness of the evidence, and is not expected to amount to

an assessment judgement as would be appropriate for an assessor.

On completion of this unit the candidate will be able to check and give feedback against the following

criteria:

• Checks establish the validity, authenticity, relevance and sufficiency of evidence.

• Decisions are made concerning the readiness of the evidence for presentation to registered assessors, and recommendations contribute to the efficiency and effectiveness of the assessment process.

• Gaps in the evidence are identified and dealt with appropriately.

• Feedback about the evidence is communicated to assessors where required, and to candidates in a culturally sensitive manner and in a way that promotes positive action by the candidate.

• Key lessons from the facilitation process are identified and recorded for integration into future interactions with candidates.

In this unit, Assessment Criteria 1 to 6 of Specific Outcome 3 will be discussed.

CHECKS ESTABLISH THE VALIDITY, AUTHENTICITY, RELEVANCE AND SUFFICIENCY OF EVIDENCE

Candidates must receive specific feedback on both their work and their self-assessment of that

work, if applicable. By reporting results facilitators/assessors could provide candidates with more

than mere feedback on their progress. They could also be offering helpful suggestions on how

candidates could go about improving their performance and how they should, in future, tackle

assessment tasks. As such, reporting becomes an integral part of facilitating, i.e. it serves a

formative purpose.

Feedback to candidates can take place verbally or in writing. Facilitators, or peers, could respond

to a particular candidate’s contribution to classroom activities, or a piece of work submitted after

experiential learning. Recording candidate performances electronically, for example, is particularly

helpful for discussion purposes and for self-assessment. Spoken feedback should be constructive,

informal and instantaneous, giving credit for effort, encouraging perseverance and offering

suggestions for overcoming difficulties. Written comments should be clear, specific, supportive

and reader-friendly.

GIVING FEEDBACK ON DIFFERENT RESULTS

There are primarily four “types” of results on which feedback must be given, and each of them

should be approached differently.

The easiest type is where the candidate meets all the requirements to be found competent. In this

instance the candidate will be given the results, receive credits and a certificate, if applicable. No

further action is necessary.

The second situation would be where the candidate clearly does not meet any of the criteria for a

particular outcome. This must be communicated to the candidate in writing and the candidate

must be informed what he or she needs to do to be found competent. If the candidate cannot be found competent against the purpose of the unit standard, the assessor or facilitator will have to decide whether the candidate has the ability to master the material or not. It might be necessary

to suggest to the candidate that he or she should rather follow a different learning pathway or

switch to a lower level course. Remember, however, that in outcomes-based learning the candidate

who is willing to work should be given an opportunity to do so.

The third situation is where the candidate meets some of the criteria. In this instance the candidate

may be allowed to resubmit questions or exercises on the assessment criteria that he or she does

not meet yet. If the candidate clearly needs further guidance, the assessor or facilitator will ask the

candidate to visit him or her to discuss the problem and to provide the necessary guidance and

support.

The fourth situation is where more evidence is required in order to make a judgement of

competence. In this instance the candidate may be allowed to submit outstanding questions or

exercises on the assessment criteria that he or she did not provide evidence of competence against

yet. If the candidate clearly needs further guidance, the assessor or facilitator will ask the candidate

to visit him or her to discuss the problem and to provide the necessary guidance and support.

GIVING FEEDBACK ON ASSESSMENT RESULTS

The Evidence Facilitator together with assessor has to decide when to say that the candidate has

given enough evidence of appropriate quality to confirm that he/she is capable of performing the

outcome/s consistently and to the required standard. Sufficient evidence can include evidence

generated over time, to enable valid, consistent and fair assessment judgements to be made. This

judgement is made against the outcomes and assessment criteria in the unit standards and

evidence guides, taking the range (scope, context, underpinning knowledge and any other relevant

information) into account. Supplementary evidence may be used when necessary, for example

results of formative assessment, class participation, etc.

The assessor decides on the competence of the candidate once sufficient evidence has been

collected; the Evidence Facilitator can assist with this process by providing appropriate feedback

and input to the assessor. The ability to make assessment judgements must be demonstrated

using diverse sources of evidence and in situations where: special needs of candidates need to be

considered, candidates meet all criteria, candidates clearly do not meet the criteria, candidates

meet some, but not all criteria, and more evidence is required in order to make a judgment. The

quality and type of evidence can be assessed in terms of the assessment outcomes, against all the

assessment criteria in the relevant unit standard or qualifications. Assessment judgements are

always justified by the quality and sufficiency of the evidence. Judgements should be

substantiated in terms of the consistency and repeatability of the candidate’s performance and

evidence from various sources and time periods.

The candidate has to be informed what was correctly done and achieved, and if necessary what

was not correctly done and achieved. They should be told that they are deemed competent or not

yet competent. In formative assessment the candidate should be told what their strengths and

weaknesses are, why they have the strengths and weaknesses, and what they need to do to deal

with the weaknesses. In summative assessment the candidate has to know that they are

competent or not and why. If judged not competent they should be told what steps could be taken

to get them to competency.

Not all candidates going through the same learning programme will reach competency at the same

time and the assessors should keep this in mind. Furthermore, some candidates may never be able

to achieve competence based on a number of factors and the reasons for this have to be

explained to them and advice given on other avenues which may be available, e.g. changing to other learning programmes.

Check that the evidence meets the criteria spelt out when the assessment methods were chosen.

Check that sufficient evidence is collected on which to base a decision, that the evidence is accurate and that the evidence is up to date.

When reviewing assessor decisions to provide feedback to candidates:

• Compare the evidence with the requirements of the assessment criteria.

• Provide feedback each decision point made by the assessor.

• Provide overall feedback against the whole assessment decision of the Assessor. To do this you will

need to discuss this in detail with the assessor. Do not make any assumptions about the assessors

decision clarify and discuss it with the assessor before giving feedback.

• Be prepared to explain and justify every decision made.

Beware of:

• Identifying with the candidate.

• Being influenced by the candidate’s past performance.

• Making assumptions, or being discriminatory.

• Being overly influenced by one particular thing the candidate does, especially if it happens early on in

the process.

The ability to give feedback must be demonstrated in situations where:

Special needs of candidates need to be considered, candidates meet all criteria, candidates clearly

do not meet the criteria, candidates meet some, but not all criteria, and more evidence is required

before a judgment is possible. Feedback should be given to relevant parties in accordance with

confidentiality requirements, in an appropriate sequence and within agreed timeframes.

Feedback should focus on the quality and sufficiency of the candidate’s performance in relation to

the agreed outcomes and criteria. The type of feedback and manner of giving feedback should be

constructive and related to the relevant party’s needs.

Sufficient information must be provided to enable the purpose of the assessment to be met, and

to enable parties to make further decisions. (Further decisions include awarding of credits and

redirecting candidates to alternative learning or re-assessment.) Feedback processes and models

should be described in terms of the potential impact on candidates and on further learning and assessment.

Feedback about the assessment should be given:

As soon as possible.

In an appropriate place.

In a constructive and affirming way.

In a manner based on facts and the evidence collected in the assessment.

The candidate needs:

Time to discuss the assessment.

To be able to ask questions.

To ask the moderator for feedback on their methods and approach and their use of

different types of evidence.

To ask for advice on further steps to take in terms of training and assessment.

Where appropriate, to ask for advice on the appeal procedure.

To provide their own comments on the process.

Do’s on giving feedback:

Try to give feedback as soon as possible after the evidence-collection process.

Always say something positive first.

Be specific in your praise – there must be something the candidate did right! Name it clearly. Don’t

be vague or make generalisations.

Be sensitive and tactful.

Be reassuring and constructive.

Be helpful and encouraging.

Give reasons.

Describe rather than judge.

Be professional. Focus on the competency not the personality.

End on a positive note.

Don’ts on giving feedback:

• Don’t be vague.

• Don’t make generalised comments like “Your problem solving skills are not acceptable”.

• Don’t be judgemental or evaluative. Don’t say “The way you did that was good” but rather “The

way you did that matched exactly the requirements of the assessment criteria”.

• Don’t blame, or behave as if problems are the candidate’s fault.

• Don’t end off without making suggestions on how the problems can be addressed.

PROVIDING FEEDBACK TO THE ASSESSOR

The assessor may need accurate advice and support to enable him/her to identify and meet the

candidate’s training and development needs. This is why it is advisable to make use of assessment

panels, including other assessors, facilitator(s) and moderator(s).

The assessor must be objective and fair. Agreements reached and key elements of the feedback

must be recorded in line with the organisational quality assurance system.

Feedback is not one-way traffic. The candidate should also be given an opportunity to give

feedback on how he or she experienced the assessment, and opportunities must be provided for

clarification and explanations concerning the entire assessment.

FEEDBACK ON CRITICAL CROSS-FIELD OUTCOMES

The Evidence Facilitator also wants to know what candidates have learned at the end of a learning

programme in terms of social skills. What knowledge and awareness have been gained? What

questions or tensions remain or were newly created? What type of support system is available to

candidates as they apply what they have learned?

Written statements, action plans, and presentations synthesising a candidate’s learning present

various methods for assessing the impact of the course. The facilitator may also ask direct

questions such as: What have you learned? What has changed in your understanding of these

issues? What next steps would you like to take to continue to learn about and address these

issues? How will the learning influence your social and work-life?

Particularly with social justice content, application to real world contexts is an important goal for

learning. Depending on the duration of a course, candidates may choose to implement these

strategies and report the results back to the class. When time is more limited, a written or verbal

description of the proposed action plan helps transfer the learning from classroom to daily life.

FURTHER DECISIONS – WHAT NEXT

The summative assessment results should not be the end of the road for most candidates. It is the

responsibility of the learning provider, facilitator and/or assessor to motivate candidates to

continue learning. The following will typically happen after assessment and the completion of a

particular learning intervention, be it a unit standard-based course, learnership, learning

programme or national qualification.

Awarding of credits. Credits can only be awarded once the verifier or verifiers have endorsed the decision of the moderator. Awarding of credits is done by reading the successful candidate’s credits into the National Learners’ Records Database (NLRD). The learning

provider forwards the results to the ETQA, where the person responsible will read the

credits into the NLRD.

Redirecting candidates for further learning. In a strategy of lifelong learning, successful

candidates should be urged to enroll for learning on a higher level or the same level,

perhaps in a different field. This also applies to unsuccessful candidates, since they will often

perform better in a different field of learning or at a lower level.

Guiding candidates for further application or re-assessment. Candidate guidance and

support must still be available to candidates after the assessments. Learning providers

should create a spirit of continuously upward spiraling growth in knowledge, skills and

attitudes. Outcomes-based learning should always improve the ability of the candidates to

work more efficiently. This is a continuous process that should carry on for the working

years of all adult candidates. Candidates who were not successful should be encouraged to

either try again or to try a different field of learning. Candidates should be helped to

discover their strong points and the fields of learning in which they will have the best

chance of success.

ASSESSMENT DOCUMENTATION

Assessment documentation is prepared to facilitate efficient and effective assessment and

recording of information. The documentation provides all details of the assessment process

needed to ensure fair, open, reliable and consistent assessment. Details include instructions to

candidates, assessors and other relevant parties.

REVIEWING OF ASSESSMENT

Reviewing assessment is a process of quality assurance and should probably be seen as a research

process rather than assessment as such. The existing assessment instruments are evaluated and,

based on the results of the evaluation, improved, changed or replaced.

Assessment instruments and procedures should be reviewed on a regular basis, in the light of

ETQA feedback, SAQA feedback regarding development of new standards, and client/candidate

feedback. Reviewing assessment and moderation systems should be coordinated by a person

specifically tasked with the responsibility. This is done prior to and also after tests or examinations

have been taken to identify good and bad practices in assessment design and processes, and to

incorporate these in the assessment redesign. Changes to assessment can take place at different levels,

i.e. at the level of the individual facilitator, course team or the training/ assessment institution.

Weaknesses in the assessment design and processes that could have compromised the fairness,

consistency and reliability of the assessment should be identified and corrected. Weaknesses

arising from poor quality unit standards or qualifications may also be identified and brought to the

attention of the relevant bodies (probably SGBs).


QUALITY OF THE ASSESSMENT INSTRUMENTS

There is no such thing as a perfect assessment instrument. All assessment instruments have flaws,

simply because they were designed and developed by human beings, and we all know how fallible

human beings can be. Nevertheless, this does not mean that we should accept less than the best

quality instruments. It is in the interest of our candidates to ensure that they receive a fair and

equal chance of achieving success in assessment.

REVIEWING THE ASSESSMENT PROCESS

The assessment process should be reviewed each time the instruments are used. The

moderator will have evaluated the assessment instrument before it was used, but even this

does not mean that the instrument is perfect. Actual use is probably the best acid

test for any assessment instrument. The following is a list of items that should be reviewed:

Was the assessment instrument designed in accordance with the quality assurance policy?

Are instructions to the candidates clear and unambiguous?

Was the assessment instrument sufficient to protect the integrity of standards and qualifications?

Is each assessment task clearly described and outlined?

Is the purpose of each task clear and clearly linked to the purpose of the learning programme?

Are the tasks relevant to the candidate’s context?

Are the assessment methods and tasks fit for purpose?

Is the evidence collection integrated into the workplace where appropriate?

Was the choice and design of assessment methods and instruments appropriate to the unit

standards and qualifications being assessed?

Is the assessment instrument consistent, accurate and well designed?

Does the assessment instrument make provision for reassessment?

Will it be necessary to redesign the assessment instrument?

Has the memorandum been prepared according to the quality assurance policy?

If annotated drawings are required, do complete drawings with annotations appear in the

memorandum?

Is the design of the assessment instrument linked to an assessment strategy? (Environmental

analysis to find the best assessment opportunities and approach.)

Is the grading design compatible with the assessment instrument? (assessment criteria, weighting,

format for judgements, etc.)


Is the assessment instrument implementable within reasonable site costs and time

requirements?

Are marks for sections and subsections shown clearly?

Did the assessment instrument make provision for special needs without compromising the validity

of the assessment?

Does the assessment instrument endeavour to determine the candidate's attitude towards, as well

as sense of responsibility for, his or her vocation?

Was the assessment instrument career- and practice-oriented?

Are critical cross-field outcomes also assessed?

Does the recording format clearly state criteria and evidence requirements?

Does the recording format allow for third party testimony/witness statement? (Especially relevant

to RPL.)

Does the recording format allow for levels of performance to be recorded?

Does the recording format enable accurate recording of administrative information?

REPORTING AND HIGHLIGHTING ASSESSMENT GAPS

Weaknesses in the assessment design and process that could have compromised the fairness of

assessment must be identified and dealt with in accordance with the assessment policy.

Weaknesses arising from poor quality of unit standards or qualifications need to be identified, and

effective steps taken to inform relevant bodies.

The pool of items for a particular test can be reviewed by the individual who constructed them or

by a colleague. In either case it is helpful for the reviewer to read and answer each item as if taking

the test. This provides a check on the correct answer and a means of spotting any obvious defects.

A more careful evaluation of the items can be made by considering them in the light of each of the

following questions. (Gronlund, 1998: 114.)

Does each test item measure an important learning outcome included in the test specification? Each

test item should relate to one of the outcomes in the unit standards, since each item is designed to

measure one aspect of the subject matter and candidate performance specified there. If the outcome

to which the item refers was noted on the card at the time the item was constructed, the task is simply

to read the item and recheck its appropriateness. Essay questions and complex objective items may

have to be checked against several outcomes in the unit standards. In the final analysis each item


should be related directly to the type of performance specified by the learning outcome(s) to be

measured.

Is each item appropriate for the particular learning outcome to be measured? Some learning

outcomes can be measured by any of the common item types. In such cases the multiple-choice item

should be favoured. However, if the learning outcome calls for supplying the answer, the completion or

essay test must be used. If only two alternatives are plausible, the true-false item might be the most

useful, and if the outcome calls for relating a series of homogeneous elements, the matching item

might be more efficient. Reviewing the items provides for a second check on the appropriateness of

each item type for the outcomes to be measured.

Does each item present a clearly formulated task? The problem presented by a test item, regardless of

item type, should be so clear and unambiguous that all candidates understand the task they are being

called on to perform. Those who fail an item should do so only because they lack the knowledge or

intellectual skill called for by the item and not because of poorly formulated tasks or questions.

Although ambiguity is a major problem in test construction, it is fortunately a flaw that becomes more

apparent during a follow-up review of the items.

Is the item stated in simple, clear language? This point is obviously related to the previous one, but

here we are concerned more with the appropriateness of the reading level of the item for the NQF level

to be tested. Except for technical terms that are a necessary part of the problem, the vocabulary should

be simple. Similarly, short and simple sentences are to be favoured over long and complex ones.

Meeting these two standards is likely to help remove ambiguity but, equally important, it also enables

poor readers to demonstrate their levels of achievement more adequately. Reading ability is well worth

measuring in its own right, but attempts should be made to keep it from interfering with the

measurement of the learning outcomes. Ideally the reading level of the items should be adapted to the

least able reader in the group to be tested.

Is the item free from extraneous clues? Although we do not want candidates to fail an item if they

have achieved the outcome being measured, neither do we want them to answer an item correctly

when they have not achieved the intended outcome. Thus, the review of items provides another

opportunity to ferret out clues that might lead the uninformed to the correct answer. Verbal

associations, grammatical inconsistencies, and other clues that are easily overlooked during the

construction of the items frequently become obvious during review.

Is the difficulty of the items appropriate? The difficulty of the items in a criterion-referenced test

should match the difficulty of the learning task set forth in the specific learning outcomes. No attempt

should be made to alter item difficulty simply to obtain a spread of test scores. The important question

here becomes: Is the difficulty of the test item the same as that of the specified learning task? We


assume, of course, that the appropriateness of the learning task for the group to be tested (NQF level)

was checked at the time the list of learning outcomes was prepared.

In evaluating the difficulty of the items in a norm-referenced test, we shift our focus to the

question: How effectively will this item discriminate among candidates? Recall that the purpose

of a norm-referenced test is to obtain a dependable ranking of candidates, and to do this we

need items that discriminate. (Norm-referenced assessment is not favoured by an outcomes-

based approach to learning, but one cannot rule out the possibility that it might be the best

option under certain conditions.) Item analysis can be used to determine the discriminating

power of test items. Items that are difficult enough to discriminate between high and low

achievers are to be favoured; a brief worked example of such an item analysis appears at the end of this set of review questions.

Is each test item independent, and are the items as a group free from overlapping? Knowing the

answer to one item should not depend on knowing the answer to another item. Thus, each item should

be a separate scorable unit. Interlocking items are especially likely to occur when several items are

based on common introductory material. A closely related problem occurs when information in one

item helps the candidate determine the answer to another item. This is most common in tests that

include both selection and supply items. Frequently the information given in selection items is useful in

answering the supply items. These defects can easily be remedied by an overall review of the items

during the final selection of the items to be included in the test.

Do the items to be included in the test provide adequate coverage of the test specifications? The

review, elimination, and revision of test items may result in a pool of items that deviates somewhat

from the set of specifications. Thus, it may be necessary to further revise some of the items or to

construct new ones. In any event the final selection of items for the test must be made in light of the

test specifications in order to assure adequate sampling of the intended learning outcomes.

In addition to these general questions that apply to all item types, the rules for

constructing each specific type of item provide further criteria for item evaluation. In the review of

multiple-choice items, for example, the completeness of the problem given in the stem,

the inclusion of one clearly best answer, and the plausibility of the distracters all warrant

special attention. Just before reviewing a pool of items, one should consult the checklist for

evaluating each item type.
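
To make the item analysis mentioned above more concrete, the following is a minimal illustrative sketch in Python (it is not part of the unit standard; the function name and the use of roughly the top and bottom 27% of candidates are assumptions drawn from classical test theory). It estimates how well a single item separates high and low achievers:

    def discrimination_index(item_scores, total_scores, group_fraction=0.27):
        # item_scores:  1 if the candidate answered the item correctly, else 0
        # total_scores: each candidate's total test score, in the same order
        # Rank candidates by total score, then compare how the top and bottom
        # groups performed on this particular item.
        ranked = sorted(zip(total_scores, item_scores))
        n = max(1, int(len(ranked) * group_fraction))
        lower = [item for _, item in ranked[:n]]
        upper = [item for _, item in ranked[-n:]]
        return sum(upper) / len(upper) - sum(lower) / len(lower)

    # Example: ten candidates; the item is answered correctly mainly by high scorers.
    item   = [1, 0, 1, 0, 1, 1, 0, 0, 1, 1]
    totals = [78, 40, 65, 35, 80, 72, 42, 30, 69, 85]
    print(discrimination_index(item, totals))   # prints 1.0 for this data

A value close to +1 suggests the item discriminates well between high and low achievers, while a value near zero or below flags an item worth revising. Such a calculation supplements, but does not replace, the qualitative review questions above.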


CHANGING ASSESSMENT PRACTICES

It is necessary to review assessment even after tests or examinations to identify good and bad

practice in assessment design and process, and to incorporate these lessons in the assessment redesign.

Changes to assessment can take place at different levels – the individual facilitator, course team,

department and learning institution. At any of these, it is possible to make a change. For example,

an individual facilitator may wish to introduce a new assessment method and may be able to do

this without affecting other people. At the other end of the scale, the learning institution may

formulate a mission whose achievement requires changes in the design and implementation

of assessment – for example, producing more independent candidates or achieving better results.

Some changes require a concerted effort at more than one level. For example, to achieve a

consistent approach to assessment in financial management would require agreement by

individual facilitators, course teams and the department. (Freeman & Lewis, 1998: 311.)

Candidates may also be involved in the review process.

Weaknesses in the assessment design and processes that could have compromised the fairness of

the assessment should be identified and corrected in accordance with the institution's assessment

policy. Weaknesses in the assessment arising from poor quality of unit standards or qualifications

should also be identified, and the relevant bodies informed if changes call for their participation.

The Evidence Facilitator can also suggest how the assessment instrument can or should be

reviewed. The following is an example of a form which the candidate and assessor should

complete to facilitate review:

ASSESSMENT REVIEW

For each review item below, both the candidate and the assessor record a Yes or No answer, and a

Remarks column is provided for comments on each item:

Were the assessment procedures clear?

Were the instructions and directions clearly specified?

Were all the specific outcomes tested?


Was guidance offered to the assessors to help them collect and judge evidence?

Does the assessment material encourage candidate success, rather than failure?

Does the material encourage candidate self-evaluation?

Does the assessment material motivate the candidate?

Does the assessment material assess various thinking and communication skills?

Were the principles/criteria for good assessment achieved?

Is the RPL assessment process work-related?

Did the assessment material provide for special needs of candidates?

Was feedback given constructively against the evidence required?

Was feedback given in a positive manner?

Was an opportunity to appeal given?

Was the evidence recorded?
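
Where a learning provider captures completed review forms electronically, the candidate and assessor columns can be compared automatically to highlight points for discussion. The sketch below is purely illustrative; the class and field names are hypothetical and are not prescribed by the unit standard:

    from dataclasses import dataclass

    @dataclass
    class ReviewItem:
        question: str          # the review question as it appears on the form
        candidate_yes: bool    # candidate's Yes/No answer
        assessor_yes: bool     # assessor's Yes/No answer
        remarks: str = ""      # free-text remarks column

    def disagreements(items):
        # Items where the candidate and assessor answered differently are usually
        # the first points to discuss when reviewing the assessment instrument.
        return [i for i in items if i.candidate_yes != i.assessor_yes]

    form = [
        ReviewItem("Were the assessment procedures clear?", True, True),
        ReviewItem("Was an opportunity to appeal given?", False, True,
                   remarks="Candidate was unaware of the appeals procedure"),
    ]
    for entry in disagreements(form):
        print(entry.question, "-", entry.remarks)

Flagging only the items on which the candidate and assessor disagree keeps the review discussion focused on the questions that matter most.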

SUMMARY OF UNIT 3

Assessment is part of the learning process, and all the role-players, including the candidate,

Evidence Facilitator, Learning Facilitator, Assessor, Moderator and Training Manager, should use it

as an opportunity to provide further training.

Feedback can be given verbally or in writing. It is important to keep written records of at least the

fact that feedback was given.


The results of the assessment do not determine whether feedback should be given or not, only how much

feedback should be given and on what it should focus.

Feedback should include guidance and support on how to correct the problem, should there be

one. Special needs of the candidate should also be taken into consideration.

Feedback must always be the product of sound quality assurance. In this respect it must be fair,

honest, positive, given in a mature manner and well motivated. It should not only highlight

weak points, but should also include what the candidate did well. The purpose of feedback must

always be to improve the competence of the candidate, never to break down or offend. Feedback

must be given as soon as possible after assessment.

The candidate must be given an opportunity to respond to feedback and to appeal if he or she

feels that the assessment was not fair. Reassessment should always be an option, within realistic

limits, of course.

Confidentiality of information should always be maintained when giving feedback. The first person

who is entitled to assessment results is the candidate, and it is also the candidate who may decide

who else can have access to the results.

Feedback is not the end of the learning process. The following may take place after assessment and

feedback:

Awarding of credits.

Redirecting candidates for further learning.

Guiding candidates for further application or re-assessment.

SELF ASSESSMENT AND GROUP ACTIVITIES

Do Exercises Six, Seven and Eight

in your workbook now


REFERENCES

1. Adams, M., Bell, L.A. & Griffin, P. 1997. Teaching for Diversity and Social Justice. Routledge,

New York & London.

2. Assessors Course. MEIETB Candidate’s Manual. 1999.

3. Bellis, I. June 1997. Equity Issues in Education and Assessment. Outcomes-based Education:

Issues of Competence and Equity in Curriculum and Assessment. South African Certification

Council.

4. Chase, C.I., 1999. Contemporary Assessment for Educators. Longman, New York.

5. Cotton, J., 1995. The Theory of Assessment. Kogan Page. London.

6. Craig, R.L., 1993. Training Development Handbook. A Guide to Human Resource

Development. Third Edition. McGraw-Hill Book Company.

7. Centre for Educational Research, Evaluation & Policy (CEREP). October 1998. Outcomes-

based Education: Perspectives, Policy, Practice and Possibilities. University of Durban-

Westville.

8. Desmond, C.T., 1996. Shaping the Culture of Schooling. State University of New York Press.

9. ETDQA. June 2004. Guidelines for assessment. ETDP SETA Publication.

10. Freeman, R. and Lewis, R., 1998. Planning and Implementing Assessment. Kogan Page.

London.

11. Government Gazette. Act no. 97 of 1998: Skills Development Act, 1998.

12. Gronlund, N.E., 1998. Assessment of Student Achievement. Allyn and Bacon. Boston.

13. Le Grange, L. & Reddy, C. 2000. Continuous Assessment. An Introduction and Guidelines to

Implementation. Juta. Kenwyn.

14. Malan, B., 1997. Excellence through outcomes. Kagiso Publishers.

15. Spady, W. & Schwahn, C. October 1999. The Operating Essentials and Indicators of Total

Learning Communities. A Concrete Vision for Education in the Information Age.

Breakthrough Learning Systems.

16. The National Qualifications Framework: An Overview. February 2000. SAQA Publication.

17. Marzano, R.J. March 1994. Lessons from the Field About Outcomes-Based Performance

Assessment. Educational Leadership.

18. Olivier, C. 1999. Let's educate, train and learn OUTCOMES-BASED. A 3D Experience in Creativity.

NQF Based Design Book.


19. Pahad, M. 1998. Outcomes-based Assessment: The Need for a Common Vision of What

Counts and How to Count It. University of Durban-Westville.

20. Raggatt, P. & Lockwood, F. (Eds.) 1994. Materials Production in Open and Distance Learning.

Paul Chapman Publishing.

21. South African Qualifications Authority Bulletin. Volume 2, Number 31. August 1998 –

January 1999.