
1

Indicators to measure the effectiveness

of the implementation of the UNECE Strategy for ESD

Expert group Indicators for ESD: Our methods, discussions and results

Steering Committee ESD, Geneva, 4-5 December 2006

drs. R.M. van Raaij, Ministry of LNV, the Netherlands, Dutch NAP “Learning for SD”

2

Outline:

- the context, mandate and limits of our work
- the process, approach and methodology
- the questions, discussions and struggles
- the reflection on comments and other ESD processes
- the set of indicators
- working with this set of indicators
- further reflections and developments

3

The Strategy:

Strategy for ESD (Vilnius, March 2005): To facilitate the introduction and promotion of education for sustainable development (ESD) in the UNECE region, contributing to the realization of our common vision.

The aim of the Strategy: To encourage UNECE member States to develop and incorporate ESD into their formal education systems, in all relevant subjects, and in non-formal and informal education.

4

Objectives:

1) Ensure that policy, regulatory and operational frameworks support ESD

2) Promote SD through formal, non-formal and informal learning

3) Equip educators with the competences to include SD in their teaching

4) Ensure that adequate tools and materials for ESD are available and accessible

5) Promote research on and development of ESD

6) Strengthen cooperation on ESD at all levels within the UNECE region.

5

Establishment of Expert group following Vilnius

Mandate: “to develop indicators to measure the effectiveness of the implementation of the strategy”

- Setting up a “framework”

- Translating objectives into questions: what do we need to know, what do we want to know, which data are available, what methodology is available

- Then: constructing indicators out of these questions, as far as aggregation is possible and qualitative/quantitative data and methods are available

6

Establishment of Expert group: Representatives from Austria, Canada (by e-mail), France, Germany, Greece, Italy, Lithuania, the Netherlands, the Russian Federation, Slovenia, Sweden and the United Kingdom; the United Nations Educational, Scientific and Cultural Organization (UNESCO), the Intergovernmental Central Asia Working Group on Environmental Education and Education for Sustainable Development, the Environment and Schools Initiative Network (ENSI) and European ECO-Forum, a coalition of citizens’ environmental organizations; UNICEF (guest) and IUCN-CEC (guest); supported by UNECE

Four meetings of three days’ work: September 2005 Ede, October 2005 Geneva, March 2006 Vienna, May 2006 Den Haag

Policymakers, scientists, educators, evaluators, statisticians, researchers, practitioners; NGOs, governments, universities, …

7

We look for an evaluation model that covers:

a) the process of implementation

b) the effectiveness of the implementation (as a qualitative feature of both the process and the long-term effects of ESD)

Several considerations were given:
- operate within the mandate of the expert group and the Strategy
- stick to the text and objectives of the Strategy as such
- our work has to be understood in different countries, cultures, educational systems, political systems and languages (!)
- mostly based on existing data and methodology (Vilnius)
- not too many questions; keep it simple, look for aggregation
- UNECE is a political and policy-oriented (international) process, so this is the main audience

8

Developing the concepts, take into account:

- Sustainable development is not a fixed goal but a developing, process-oriented concept.

- “Learning” is a broad concept, especially in ESD with regard to non-formal and informal education. (For most non-educators it is “a black-box process”.)

- Education refers to knowledge, but also to attitudes, values, skills, competences and behaviour. How to capture all of these?

- Methodology may be demanding in some respects.

- Data are not always available, apart from some in the field of formal education.

- We trust that education contributes to SD, and this can be described in a logical chain, but we cannot “prove” SD as a result of education alone.

9

Background for our work: Evaluation model

[Slide diagram: an evaluation chain running from the current situation at T=0 (baseline, Type 0), through a yes/no checklist (Type 1) and the policy framework (input, Type 2), via throughput activities (output, Type 3), to direct and indirect effects and impact on SD (outcome, Type 4), on a timeline up to 2015.]
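As an illustration only (neither the Strategy nor the expert group prescribes any data format), the five indicator types of this evaluation model could be encoded along the following lines; the class and member names are hypothetical.

```python
from enum import Enum

class IndicatorType(Enum):
    """Hypothetical encoding of the indicator types in the evaluation model."""
    BASELINE = 0    # Type 0: current situation at T=0
    CHECKLIST = 1   # Type 1: yes/no questions
    INPUT = 2       # Type 2: policy framework
    OUTPUT = 3      # Type 3: throughput activities
    OUTCOME = 4     # Type 4: direct/indirect effects, impact on SD
```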

10

The expert group gives the following into consideration to countries:

- With respect to differences between States, each State should consider making a baseline report of the situation in 2005, which is useful to measure progress.

- In each country the possibility of a general ESD monitoring system should be determined. In national action plans the objectives should be formulated as operational goals.

- Observation: owing to the nature of the Strategy itself, most of the indicators seem to be of the “checklist” or “input” type! We are aware of this, but it is not in our mandate to change that.

- This system of evaluation is not for benchmarking or reporting only: in line with the spirit of the Strategy we should celebrate progress, share good practices and LEARN!

11

Some reflections of the Steering Committee 2005:

- Too many questions; please aggregate or skip, so as not to end up with too many indicators

- Discussion on the relation between ESD and SD and how “measurable” this is

- Based on existing data

- Understandable in different educational systems and cultures, dealing with different (federal) state systems

-Check with UNESCO process

12

Relevant discussions in our Expert Group, some also discussed in Bath (Anglo-German research group), in Hiroshima (UNESCO expert group) and in Paris (UNESCO panel):

- the causality of ESD and SD
- different views on education (instrumental – emancipatory)
- scale/level of use (international reporting – national – project – …)
- who wants to know and for what: who asks for indicators and how the results will be (mis)used
- availability of data, methodology, evaluation systems, efforts to collect data
- quantitative vs. qualitative: in what parameters can we express SD and ESD, and the use of scales to express indicators
- focus on formal education, but what about non-formal and informal learning?
- “learning as a black-box system” (esp. for people without educational experience)
- differences in educational systems, e.g. the role of the curriculum, the power of governmental influences, …
- differences in State organization, the relation GO – NGO

13

Causality of ESD and SD (a)

We are convinced that ESD is crucial for changes towards sustainable development, but can we prove this?

Chain: awareness → knowledge → attitude → behaviour → changes in life?

Of course a good model, but in practice changes towards SD are also driven by economy, feelings, culture, technical development, political urgency, peers/family/groups, … More education does not equal more sustainability, but NO education is a disaster and leads to ignorance.

Our choice in UNECE is NOT to get mixed up in SD-indicator discussions; the outcome of ESD is learning and the development of competences.

Compare: Panel for ESD (UK): ESD enables people to develop the knowledge, values and skills to participate in decisions about the way we do things individually and collectively (and locally and globally) that will improve the quality of life now, without damaging the planet for the future.

14

Causality of ESD and SD (b)

Who wants to know, and what for? To justify expenses on ESD?

As causality is difficult, and very tricky, never promise too much, because there are many influences other than education that shape behaviour, and your NAP is not in charge of those! We notice differences in approach between ministries of education and ministries of environment, and differences between stakeholders! There is always the danger of “wrong accountability”.

Different views on the role of education:

From instrumental (to support environmental policy/measures by government) to emancipatory (to empower people with competences so they can participate). Make this as explicit as you can in your NAP! Different approaches ask for different types of indicators.

15

Scale / level of monitoring:

- International reporting? (mainly UNECE, but also EU, OECD, UNESCO, …) (for benchmarking or for exchange of good practice?)

- Monitoring progress in National Action Plans (for accountability or for learning?)

- Reviewing ESD programmes or projects (to justify the subsidies or to evaluate the methods and results?)

- Looking at the effectiveness of the learning process itself (not only WHAT we learned but also HOW we learned in the BEST WAY?)

- Checking competence development (knowledge, skills, changing attitudes) of individuals or groups? (story writing, self-assessment, other more qualitative methods, …)

16

Who asks for indicators and who is working with results?

- International organisations such as UN, UNECE, EU, IUCN, UNESCO, …
- National governments
- Politicians: ministers, parliamentarians (cynically: to prove they did as they promised at election time / looking for short-term success)
- Policymakers (at all levels)
  - to show improvement as a result of their policy
  - to justify the efforts and money spent
  - to prove that ESD is an “effective” instrument
- Educators and professionals in (E)SD
  - developing ESD further (process-oriented, double-loop learning)
  - practitioners in ESD (to improve their performance)
- Participants in learning processes
  - teachers and learners/students; NGOs / private sector / other organisations; individuals / civil society (all with different reasons)
- RESEARCHERS (they always want to do more research!)

17

Availability of data, methodology, resources for evaluation and monitoring

1) Not everything that counts can be counted (Einstein): don’t be afraid of uncertainties and non-countable issues. Look for innovative ways.

2) Be creative with existing data: there is a lot in economic reporting, social reports, environmental reports, even educational reports, but the INTERDEPENDENT nature of SD and ESD is a NEW phenomenon! If no new data are collected and no new research is done, this is the main difficulty.

3) Formulate the objectives in your NAP as SMART as possible; the more you do, the better the chance of finding quantitative output.

4) Differences in educational systems: ISCED is approved by the ministries of Education.

5) Start monitoring NOW, make a baseline report, and reserve 10-25% for E and M (because that’s also LEARNING).

6) Set up an E+M group (participative approach, innovative, collection of new data, esp. for non-formal and informal learning!).

18

Quantitative and Qualitative indicators:

- We found very few quantitative indicators.
- Questions asking “to what extent …” are difficult to answer (objectively).
- Descriptive indicators are also indicators (but the sentiment for “sound numbers” is always there!); don’t let this hold you back!

The use of “scales” (only a few of our indicators were agreed to have a scale):
- linear scale for countable numbers
- relative scale in the case of percentages
- non-linear scale if the optimum is not the maximum
- descriptive scales are difficult to interpret and debatable (see the sketch after this list)! For example:
  - not at all – seldom – some – mostly – complete
  - not started – beginning – some experience – good practice – expert
  - increasing – stable phase – decreasing of … (trends)
  - (++; +; 0; −; −−) or “smileys” or “traffic lights”
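As a minimal sketch only, and not part of the UNECE material: one way a descriptive scale such as “not started … expert” could be handled in practice, with an answer collapsed into a coarse traffic-light summary. The names (PROGRESS_SCALE, traffic_light) are hypothetical.

```python
# Illustrative sketch: an ordinal descriptive scale and a coarse summary of it.
PROGRESS_SCALE = ["not started", "beginning", "some experience",
                  "good practice", "expert"]

def scale_position(answer: str, scale: list[str]) -> int:
    """Return the 0-based position of a descriptive answer on an ordinal scale."""
    return scale.index(answer.lower())

def traffic_light(answer: str, scale: list[str]) -> str:
    """Collapse an ordinal position into a red/amber/green summary."""
    pos = scale_position(answer, scale)
    third = len(scale) / 3
    if pos < third:
        return "red"
    if pos < 2 * third:
        return "amber"
    return "green"

# Example: "some experience" sits at position 2 of 5, i.e. amber.
print(traffic_light("some experience", PROGRESS_SCALE))
```

Note that a non-linear scale, where the optimum is not the maximum, would need a different mapping than this monotonic one.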

19

Focus on formal education?!

- This we know best: it is in the formal system, there is the role of the curriculum, there are existing data, …

- But for ESD, non-formal and informal learning are just as important!

- Input indicators are part of all actions under your NAP (Dutch example: each project has to fill in questionnaires at the beginning and end of the project as part of the subsidy deal)

- Knowledge management / good practices are systematically collected

- Case writing and storytelling are acceptable as evaluation

- Social learning requires self-evaluation! Participation is essential.

20

Final results:

Products:
- 2 reports of the Expert group
- List of indicators, with reference to the character and sources of each indicator
- Draft format for reporting + Addendum (= the set of indicators)
- Informal guidance to support the use of the indicators

6 objectives, 18 indicators, 48 sub-indicators/questions

Three levels of reflection (a hypothetical sketch of this structure follows below):
• Yes/No mode
• Descriptive part (mostly qualitative, where possible some quantitative), with template tables
• Summary and self-assessment
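Purely as an illustration of how the reporting structure above hangs together (the official instrument is the UNECE draft reporting format, not code), a hypothetical in-memory representation might look like this; all class and field names are invented for the sketch.

```python
from dataclasses import dataclass, field

@dataclass
class SubIndicator:
    question: str
    answer_yes_no: bool | None = None   # level 1: Yes/No mode
    description: str = ""               # level 2: descriptive part

@dataclass
class Indicator:
    name: str
    sub_indicators: list[SubIndicator] = field(default_factory=list)

@dataclass
class Objective:
    number: int                          # 1-6, the objectives of the Strategy
    title: str
    indicators: list[Indicator] = field(default_factory=list)
    self_assessment: str = ""            # level 3: summary and self-assessment

# Example: objective 3 with one (hypothetical) indicator and sub-question.
report = [Objective(3, "Equip educators with competences to include SD in their teaching",
                    [Indicator("Educator competences",
                               [SubIndicator("Is ESD part of initial teacher training?")])])]
```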

21

Remarks 1) content and character:

• This is a SET of indicators.

• The use of these indicators and the preparation of a report should be a participatory process.

• This is the best possible result within the mandate and the limitations of existing data, without big efforts in research and extra monitoring.

• Own experience: filling in Yes/No + a simple description takes about 3 hours; interactive feedback from a panel, 1 day; extra research for the descriptive part, 2 days.

• Each country could set up more detailed indicators, but that requires a further evaluation and monitoring system, new research and data collection. We recommend countries to do so as feasible in their National Action Plan for ESD.

• This set of indicators is not for measuring but for looking for progress and LEARNING.

22

Remarks 2) comparison with other processes

• This set of indicators has been discussed in the Asia-Pacific expert group on indicators of UNESCO and IUCN, and in other discourses.

• The nature of the UNECE process (more at policy level) differs from the UNESCO draft plan for implementation, which calls on UNESCO national focal points, NGOs, schools and governments.

• Another difference is that the work of UNESCO/IUCN will not deliver an indicator system for international use, but helpful guidance, with descriptions of different kinds of indicators, for setting up indicator systems at the national level.

•It is recommended to consider ESD in the specific context and needs of the region. The concept of ESD is the same, but priorities can and will be different in parts of the world. Diversity should be respected.

23

The use of the UNECE set of indicators:

- 2007: all countries Yes/No mode; pilot countries the full set of descriptions; recommendation to provide a baseline report/data and first analyses of ESD implementation for Belgrade, autumn 2007
- 2010: all countries full report
- 2015: all countries full report

Recommendation from the Expert group to extend its mandate:
- to reflect on the first round of pilot reporting and improve the templates for indicators and the draft format for reporting if necessary
- to respond to the resolution of EECA: to develop criteria to assess successful examples of implementation

24

Special thanks:

-To all members of the expert group

-To all critics from outside this group

-To UNESCO for cooperation

-To Austria, UNECE and the Netherlands for hosting the meetings

-To the UNECE secretariat for supporting the group’s work, esp. Ms. Ella Behlyarova and Ms. Angela Sochirca

25

Thank you for your attention,

Lots of success to us all!