TRANSCRIPT
Data Quality Assessment Framework
(DQAF)
KENYA
19 – 30 July 2010
Marc BERNAL, UIS Regional Advisor for Sub-Saharan Africa
Moritz BILAGHER, UIS Cluster Advisor for the Nairobi Office
Chris VAN WYK, University of Stellenbosch, South Africa
Version 3.0
List of acronyms

CHE Commission for Higher Education
DEB District Education Board
DEMMIS District Education Management & Monitoring Information System
DEO District Education Officer
DFID Department for International Development
DQAF Data Quality Assessment Framework
DQASO District Quality Assurance and Standards Officer
ECD Early Childhood Development
EFA Education For All
EMIS Education Management Information System
FPE Free Primary Education
FTSE Free Tuition Secondary Education
GER Gross Enrolment Ratios
IP Investment Programme
ISCED International Standard Classification of Education
KCPE Kenya Certificate of Primary Education
KCSE Kenya Certificate of Secondary Education
KESSP Kenya Education Sector Support Programme
KNBS Kenya National Bureau of Statistics
KNEC Kenya National Examinations Council
LANs Local Area Networks
MDG Millennium Development Goals
MoE Ministry of Education
MoHEST Ministry of Higher Education, Science and Technology
MoY Ministry of Youth
NER Net Enrolment Ratio
PDE Provincial Director of Education
PEB Provincial Education Board
PS Permanent Secretary
PTA Parent Teachers Associations
QASO Quality Assurance and Standards Officer
RDMS Relational Database Management System
SADC South African Development Community
SAGA Semi Autonomous Government Agency
SACMEQ Southern Africa Consortium for Monitoring Education Quality
SMC School Management Committees
SWAP Sector Wide Approach to Programme Planning
TIVET Technical Industrial, Vocational and Entrepreneurship Training
TMU Textbook Monitoring Unit
TSC Teachers Service Commission
TTC Teachers Training Colleges
UIS UNESCO Institute for Statistics
UNESCO United Nations Educational, Scientific and Cultural Organisation
USAID United States Agency for International Development
UWEZO Meaning 'capability' in Kiswahili; a four-year initiative to improve competencies in numeracy and literacy among children aged 6-16 years
WFP World Food Programme
ZQASO Zonal Quality Assurance and Standards Officer
Table of Contents

Introduction
Data Collection Method
Goals and objectives of DQAF assessment
An Overview of the Organisational Landscape of education in Kenya
Design of the assessment
1. Pre-requisites of quality
1.1. Legal and institutional environment
1.2. Resources: Resources are commensurate with needs of statistical programs
1.3. Quality awareness: Quality is a cornerstone of statistical work
1.4. Synthesis and score
1.5. Recommendations
2. Integrity: The principle of objectivity in the collection, processing, and dissemination of statistics is firmly adhered to
2.1. Professionalism: Statistical policies and practices are guided by professional principles
2.2. Transparency: Statistical policies and practices are transparent
2.3. Ethical standards: Policies and practices are guided by ethical standards
2.4. Synthesis and score
2.5. Recommendations
3. Methodological soundness: The methodological basis for the statistics follows internationally accepted standards, guidelines, or good practices
3.1. Concepts and definitions: Concepts and definitions used are in accord with standard statistical frameworks
3.2. Scope: The scope is in accord with internationally accepted standards, guidelines, or good practices
3.3. Classification / sectorisation: Classification and sectorisation systems are in accord with internationally accepted standards, guidelines, or good practices
3.4. Basis for recording: Data are recorded according to internationally accepted standards, guidelines, or good practices
3.5. Database Structure
3.6. Synthesis and score
4. Accuracy and reliability: Source data and statistical techniques are sound and statistical outputs sufficiently portray reality
4.1. Source data available provide an adequate basis to compile statistics
4.1.1. Statistics collected through a regular administrative school census program
4.1.2. Statistics on demand for education collected through household surveys and population censuses
4.1.3. Statistics on the quality of learning outcomes collected through assessments of student achievement
4.2. Assessment of source data: Source data are regularly assessed and validated
4.3. Statistical techniques: Statistical techniques employed conform to sound statistical procedures, and are documented
4.4. Revision studies: Revisions, as a gauge of reliability, are tracked and mined for the information they may provide
4.5. Synthesis and score
4.6. Recommendations
5. Serviceability: Statistics are relevant, timely, consistent, and follow a predictable revisions policy
5.1. Periodicity and timeliness: Periodicity and timeliness follow internationally accepted dissemination standards
5.2. Consistency: Statistics are consistent within a dataset and over time, and with other major data sets
5.3. Revision policy and practice: Data revisions follow a regular and publicized procedure
5.4. Synthesis and score
5.5. Recommendations
6. Accessibility: Data and metadata are easily available and assistance to users is adequate
6.1. Data accessibility: Statistics are presented in a clear and understandable manner, forms of dissemination are adequate, and statistics are made available on an impartial basis
6.2. Metadata accessibility: Up-to-date and pertinent metadata are made available
6.3. Assistance to users: Prompt knowledge support service is available
6.4. Synthesis and score
6.5. Recommendations
7. Conclusion and overall recommendations
APPENDIX A: List of relevant references and documents
APPENDIX B: Kenya DQAF – schedule
APPENDIX C: List of Persons Met
List of Figures

Figure 1.1: Results of pre-requisites of quality
Figure 2.1: Results of integrity
Figure 3.1: Structure of the age table in TSC database
Figure 3.2: The structure of the Institutions table (master list) in TSC database
Figure 3.3: Structure of enrolment table in the Foundation database
Figure 3.4: Results of methodological soundness
Figure 4.1: Tables in the TSC database
Figure 4.2: Students and teachers by gender for public primary schools in the TSC database
Figure 4.3: Enrolment figures by grade for 2008 and 2009 (data received via a spreadsheet from the MoE)
Figure 4.4: Duplicates in the institutions table in TSC database
Figure 4.5: Primary school age table for 2010 in TSC database
Figure 4.7: Population projections for primary and secondary school going age (Source: Revised Population Projections for Kenya: 2000-2020, CBS/MoPND, August 2006)
Figure 4.8: Population Census for age groups 5-15 and 15-19 by year
Figure 4.9: KCPE Examination overall performance per subject (Source: KNEC, 2009)
Figure 4.10: Mean pupil reading and maths scores by province (Source: SACMEQ)
Figure 4.11: Mean pupil reading and maths scores by country
Figure 4.12: Results of accuracy and reliability
Figure 5.1: Primary school enrolment data from TSC database for 2009 and 2010 by province (Source: TSC database)
Figure 5.2: Comparison of primary school enrolment figures for 2009 and 2010 by province (Source: TSC database)
Figure 5.3: Results of serviceability
Figure 6.1: Distribution of 2010 enrolment by province (Source: TSC database)
Figure 6.2: Results of accessibility
Figure 7.1: Overall results
List of tables

Table 1.1: Overview of education data collection processes
Table 3.1: Primary schools by province and district (Source: Educational Booklet 2003-2007)
Table 3.2: Institution table structure in the TSC database (Source: TSC database)
Table 3.3: Overview of data collection processes
Table 4.1: 2008 & 2007 KCPE Examination overall candidates performance per subject by gender (Source: KNEC 2009)
Table 5.1: TSC primary school enrolment for consecutive years 2009 and 2010 by province
Introduction

Education Management Information Systems (EMIS) were established as one of the priorities of the African Union's action plan for the Second Decade of Education for Africa, and of the South African Development Community (SADC) education programme. Their inception and development aim at providing education statistics that are complete, relevant, accurate, timely, accessible and efficiently managed. This in turn makes effective decision-making possible. In this vein, the Ministry of Education (MoE) of the Republic of Kenya recognises the significant role that an effective EMIS can play in the provision of timely, reliable and accurate data for the education sector. In particular, EMIS is the key platform that provides the necessary performance data for monitoring and evaluation of the Kenya Education Sector Support Programme (KESSP) investment programmes (IPs).

KESSP, a sector-wide approach to programme planning, was adopted by the MoE and comprises twenty-three investment programmes focusing on the sector as a whole in order to improve access to and quality of education in Kenya. KESSP provides the framework for support to the education sector covering the period 2005-2010, and its investment programmes are prioritized and costed (refer to Kenya Education Sector Support Programme 2005-2010, July 2005). It is important to note the pivotal role that monitoring and evaluation play as one of the investment programmes of KESSP. Firstly, under KESSP, monitoring will entail the collection of information and its analysis to report on the progress of the overall performance of the KESSP programme. Secondly, it will assess the impact of KESSP on learners. Thirdly, it will entail collecting data and information for the general oversight and management of the programme. The monitoring plan of KESSP will rely heavily on the continuous or periodic collection of data by those implementing activities. Further, the process will focus on tracking quantitative performance data generated from the EMIS, as well as complementary reporting systems within each investment programme (refer to Kenya Education Sector Support Programme 2005-2010, July 2005).

In order to support the development of an effective EMIS across sub-Saharan Africa, in line with its Mid-Term Strategy, the UNESCO Institute for Statistics (UIS) is in the process of conducting diagnostic assessments of national education statistics systems within the global context of UNESCO's support to the African Union Second Decade for Education. The ensuing report summarises the findings of a situation analysis of education data quality in Kenya, while proposing recommendations for improvement.

Data Collection Method

The assessment process was guided by the Data Quality Assessment Framework (DQAF). This methodology was originally developed by the International Monetary Fund (IMF), and later further developed by UIS and the World Bank for an education context. The UIS is
currently developing the framework further into a complete methodology for assessing national education statistics systems.
The DQAF for Kenya was planned in coordination between UIS and the MoE in Kenya. A consultant from the University of Stellenbosch, South Africa, was contracted by the United Kingdom Department for International Development (DFID) to assist with the diagnostic assessment. The fieldwork and on-site situation analysis for Kenya was carried out from 19-30 July 2010. It consisted of:
• Field visits during which interviews and meetings were held with key stakeholders, such as the Ministries responsible for education and training, their agencies at national and regional level, other government agencies in charge of data, educational institutions, as well as development partners and national non-governmental organisations (NGOs) involved in data collection, processing and use (refer to appendix C for a list of persons met). The interview sessions were the largest and single most valuable source of qualitative information.
• A Round Table meeting with the Permanent Secretary of MoE, on 28 July 2010, during which the preliminary findings of the country visit were presented and discussed.
• Archival Analysis: This observational method was used to examine the accumulated documents as part of the research method to enhance the report. The documents included, but were not limited to: promulgated Acts, policy documents, official publications, reports on EMIS in Kenya, strategic plans of the agencies and questionnaires used to collect data. A list of the documents that formed the basis of the analysis is attached as appendix A.
• Database Analysis: The analysis of databases of key stakeholders, such as the Teachers Service Commission (TSC) and EMIS / MoE, was undertaken to inform the investigation.
The mission schedule and list of individuals met during the consultation sessions are attached as appendices B and C, respectively. The mission was conducted by Marc Bernal, UIS Regional Advisor, Moritz Bilagher, UIS Cluster Advisor, Chris Van Wyk, private consultant, University of Stellenbosch, and Charles Obiero from the EMIS Unit / MoE. The analysis and this report were contingent on the cooperation, consultation and input from national and sub‐national staff of the key government departments and agencies, as well as school staff and other partners. The relevant stakeholders for this exercise were identified in coordination between Mr. Charles Obiero and the UIS advisors. They included, among others, units within the Ministry of Education, Ministry of Youth, Ministry of Higher Education, Science and Technology and their agencies such as TSC, Kenya National Examinations Council (KNEC), Kenya National Bureau of Statistics (KNBS) and Commission for Higher Education (CHE) and partners such as WFP, USAID and UWEZO. Representatives from these organisations have been more than helpful, and thanks are due for their time. Special gratitude is also due to the Head of EMIS Mr. Charles Obiero for his important support to this mission.
Goals and objectives of DQAF assessment

The situational analysis was not intended as an academic research paper or an audit. The intention was rather to create a realistic picture of the practical state of education statistics in Kenya at local, provincial and national level, from an implementation and application point of view, and of how consistent it is with what the DQAF proposes. The aim is to inform the recommendations made towards improving education statistics in Kenya. As such, the objectives of the situational analysis are the following:
• To develop an accurate picture of the availability, level and extent of the use of education statistics in Kenya
• To identify and understand the challenges that provinces and districts face in their drive to optimally implement EMIS in their respective areas
• To identify gaps in the current situation and key priorities for future development through the DQAF
• To put forward recommendations to the MoE on ways to improve education statistics in Kenya
An Overview of the Organisational Landscape of education in Kenya

A general overview of the education landscape in the country is outlined in the following paragraphs.

The formal education system in Kenya comprises early childhood education; 8 years of compulsory schooling in primary education; 4 years in secondary education; and a minimum of four years at university, depending on the degree pursued. This is widely referred to as the 8-4-4 system. Other education and training programmes include Technical Industrial, Vocational and Entrepreneurship Training (TIVET), Special Needs Education, Adult and Continuing Education, Non-formal Education and Youth Polytechnics. Progression from primary to secondary school, and from secondary to university, is through selection on the basis of performance in the national examinations for the Kenya Certificate of Primary Education (KCPE) and the Kenya Certificate of Secondary Education (KCSE) respectively, which are administered by the Kenya National Examinations Council.

Education data cut across, mainly, the three central government Ministries responsible for the education and training sector, namely the Ministry of Education (MoE), the Ministry of Higher Education, Science and Technology (MoHEST) and the Ministry of Youth (MoY), each with their own mandates and responsibilities. The three ministries were identified as relevant to the DQAF exercise. Their roles within the national statistical system, and their interrelations, will be discussed in greater detail in the relevant sections of this report. The legal framework for the educational and statistical functions of government bodies is laid down, mainly, in the Statistics Act, the Education Act and the University Act.
The management and implementation of the education and training sector is decentralized both institutionally and in terms of decision making, with four broad levels (refer to Kenya: ICT in Education Situational Analysis, September 2009):
At the central level, the ministries responsible for education and training (the MoE, MoHEST and MoY) are responsible primarily for policy and strategy.
At the provincial level, there is a Provincial Director of Education (PDE) and a Provincial Education Board (PEB) in each of the 8 provinces. They are responsible for monitoring and coordinating all education activities in the province and for supervision of all district education programs.
At the district level, there is a District Education Officer (DEO) and a District Education Board (DEB) responsible for the management of education services, teachers, schools, funds and quality assurance. Quality Assurance and Standards Officer (QASO) is a recent term coined to refer to the education officer responsible for supervision of curriculum implementation in schools.
At the zonal level (a zone is a sub-division of a district), the Zonal Quality Assurance and Standards Officers (ZQASOs) are the link pin between the district and the education institutions (see KESSP revised report: EMIS Chapter, May 2010). These officers are also responsible for data quality through their verification function.
At institutional level, there are Boards of Governors responsible for policy and strategy, School Management Committees (SMCs) responsible for developing and implementing school plans and implementing education and training policies, and Parent Teachers Associations (PTAs) responsible for monitoring school activities and mobilizing additional resources. Each school has a head teacher, who is the secretary to each institutional management board, a deputy head teacher and several heads of departments.
The Kenya National Bureau of Statistics (KNBS), recently established as a Semi Autonomous Government Agency (SAGA) after previously being a component of the Ministry of Planning, is mandated by law to collect, analyze and disseminate the socio-economic statistics needed for planning and policy formulation in the country. The functions of the KNBS fall into the following four categories:

Data collection
Data analysis and production of official statistics
Dissemination of results to users and producers of data
Archiving of survey and census results data

KNBS is ideally situated to fulfil the data coordinating function between all the data producing agencies in the country. KNBS is, among others, responsible for the population census in Kenya, and this dataset is also relevant to education, particularly in developing education indicators, such as enrolment ratios.
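As an illustration of how the census population figures feed into such indicators, the standard definitions of the gross and net enrolment ratios (GER and NER, see the list of acronyms) can be written as below; the notation is generic and is not taken from a specific Kenyan publication.

\[
\mathrm{GER}_{l,t} = \frac{E_{l,t}}{P_{a(l),t}} \times 100
\qquad\qquad
\mathrm{NER}_{l,t} = \frac{E^{a(l)}_{l,t}}{P_{a(l),t}} \times 100
\]

where, for education level l in year t, E_{l,t} is total enrolment regardless of age, E^{a(l)}_{l,t} is enrolment of pupils within the official age range a(l) for that level, and P_{a(l),t} is the population of that official age range, taken from the census or from population projections.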
Design of the assessment

The principal objective of the DQAF evaluation is to produce a qualitative assessment of the statistics used and produced by the education sector in Kenya. As indicated, the tools and methodologies applied have been adapted from earlier evaluations undertaken by the International Monetary Fund. Further adaptations by the UIS and World Bank sought to ensure a comprehensive evaluation, specifically focused on the quality of education statistics. To underpin the DQAF evaluation, a participatory and needs-based approach was adopted. The diagnostic consisted of two phases: 1) data collection and 2) data analysis. The team could conduct all required procedures as planned, for which the relevant authorities should be commended.

The evaluation framework covers the different steps included in the statistical business process model at the national and sub-national levels, and assesses the strengths and weaknesses of the available structures, based on the six DQAF dimensions:

pre-requisites of quality;
integrity;
methodological soundness;
accuracy and reliability;
serviceability; and
accessibility.

Narrative descriptions are given of the state of the system in Kenya as per these dimensions and sub-dimensions, in addition to scores on each of the sub-dimensions. Scores are attributed according to international norms pertaining to the functions of the different elements of the statistical system. The scores on the sub-dimensions are then aggregated so as to arrive at scores on each of the dimensions. They should be indicative of where efforts for improvement of the statistical function could focus. In order to make this more explicit, specific recommendations are made. This diagnostic is intended to provide inputs for an action plan to improve the system of educational statistics in Kenya.
1. requisites of quality Pre
Data quality is regulated by a framework of statistical laws, policies, standards and practices, and technical and human resources. This framework cannot exist in a vacuum. Pre-requisites of quality, as one of the dimensions of data quality, do not comprise a qualitative dimension in themselves, but refer to the evaluation and understanding of the institutional context in which the statistical processes exist and which is essential to the other dimensions. This dimension presents the integrated nature in which available statistical laws, as well as essential human and technical resources, impact on the other quality dimensions.
1.1. Legal and institutional environment
1. Education and training in Kenya is governed by the Education Act of 1968 and other related Acts of Parliament, including the TSC Act, the KNEC Act, the Adult Education Act, the University Act, and various Acts and Charters for universities. The Education Act gives the MoE the mandate to manage the provision of education to all Kenyan children. To fulfill this mandate, the MoE needs quality data, that is, data that are complete, relevant, accurate, timely and accessible. The Teachers Service Commission (TSC), the Kenya National Examinations Council (KNEC) and the Textbook Monitoring Unit (TMU) are examples of key data collection centers (TMU is integral to the MoE, while TSC and KNEC are SAGAs) that assist in realising the goal of obtaining quality data. Additional data on education are obtained through the Ministry of Higher Education (and CHE) and the Ministry of Youth Affairs. However, this does not seem to be underpinned legally by the Education Act. It seemed during the visits that the MoE, through the EMIS Unit, attempts to consolidate these education data sources and acts as a link to the Kenya National Bureau of Statistics (KNBS). KNBS has primary responsibility for census and household surveys (which is underpinned by the Statistics Act). The authority of the Commission for Higher Education is underpinned by the University Act, Chapter 210B, which includes among its mandate and functions the collection, examination and publication of information relating to university education. These Acts of Parliament seem to provide an adequate legal framework. However, on a more critical note, it should be mentioned that the Education Act, which defines the work of the MoE, does not, in any way, mention statistics or any data collection activity.
2. The division of responsibility seems largely to follow the stipulations in the above-mentioned Acts. However, some key arrangements have been operationalized in an informal manner. For example, the incumbent in the position of Head of the EMIS has been seconded from the KNBS.
3. A critical need for a proper institutional framework was identified, and the strategic involvement of a central agency such as KNBS in the data-related work of the Ministries is essential. KNBS indicated that they are, at present, not satisfied with their strategic involvement in education data affairs. It appeared during the mission that data flows are largely dependent on informal arrangements.
4. During the consultation process, it appeared as if no systematic and formal (official) arrangements are in place to ensure consistency with regard to data management and data sharing between KNBS, MoE and even TSC.
5. The investigation could not find evidence of regular formal coordination meetings and workshops between the data producing agencies with the specific aim of establishing the different data requirements and avoiding duplication.
6. In collecting data, the Statistics Act, Part III, clearly states that individual responses are to be treated as confidential, and shall not be disclosed or used for other than statistical purposes unless disclosure is agreed to in writing by the respondent. This is in accordance with the Statistics Act under section 22: restriction on disclosure of information. There does not seem to be a similar section in the Education Act.
7. Respondents were not always adequately informed of their rights and obligations with regard to the provision of information. For example, the EMIS / TSC survey instruments include such information whilst the CHE instrument does not.
8. Statutory statements and compliance contingencies: there appears to be variation in how clearly it is stated under which Acts information is being collected. For example, the survey instruments of the MoE and TSC clearly state that the statistics are collected under the Statistics Act, Cap 112, the regulations of the Ministry of Education, the Teachers Service Commission and the regulations of the Department of Adult Education. Such a clause is not included on the survey instrument of the Commission for Higher Education. Consequently, respondents are not always reminded or informed of their legal responsibility to provide information accurately, nor of the contingencies for non-compliance. For example, the survey instruments of the MoE and TSC clearly state that the information provided in these forms must be correct, and that the supply of any false information will lead to disciplinary action. Furthermore, these forms also specify a line function for verifying information once completed: the form must be certified by the head teacher, the zonal quality assurance and standards officer (ZQASO) and the district education officer (DEO) or municipal education officer, or director of city education or district adult education officer, upon completion. However, during the site visits and interviews with participants, limited evidence was found of the execution of the prescribed line function to ensure the veracity of the information obtained.
9. Provisions for contingencies in the event of unlawful disclosure exist in the Public Officer Ethics Act, but they do not refer to disclosure of data per se. The Statistics Act, in Part III – Statistical Information, stipulates a few regulations to prevent disclosure of confidential information, under which any person who does so commits an offence and shall be liable on conviction to a fine or to imprisonment (number 26: Offences). Similar provisions exist in the Public Officers Ethics Act, specifically in Part VI – General, under number 41 (obstructing or hindering; divulging information acquired under the Act), where it states: a person who, without lawful excuse, divulges information acquired in the course of acting under this Act is guilty of an offence and is liable, on conviction, to a fine not exceeding five million shillings or to imprisonment for a term not exceeding five years or to both. This, however, has no direct reference to disclosure of statistical data.
10. Access to data at KNEC seems highly secured, with solid provisions, as is the case at TSC; good efforts at MoE / EMIS were noted, but the provisions found were not of the same level.
11. During the investigation, the premises seemed to be sufficiently secured. There are security gates that are locked after hours. The physical space within which the EMIS Unit is situated is not conducive to receiving members of the public and dealing with information of such high significance for the MoE.
12. Specific regulations governing the storage and back-up of data do not seem to exist across the board, although, in practice, this seems to be reasonably well taken care of. However, there are some doubts about the rules for data destruction. At the MoE, however, the Human Resources section has to be informed when documents are disposed of.
13. There appears to be no strong evidence for the legal underpinning of the work of the EMIS Unit. Specifically, it is not clear which agency or unit has been legally mandated to produce or collect data for the purposes of compiling the statistical data for use in education. Findings from the investigation indicate that EMIS often produced and published the relevant education data. However, data was also collected by other agencies, such as TSC, KNEC, the Ministry of Higher Education, CHE and the Ministry of Youth.
14. The team did not find confirmation of formal timelines for publications (for the EMIS Unit), nor of penalties for non-compliance with such timelines.
15. The three most important central government Ministries responsible for the education and training sector, i.e. the Ministry of Education (MoE), the Ministry of Higher Education, Science and Technology (MoHEST) and the Ministry of Youth (MoY), each with their own distinct mandates, roles and responsibilities, have devoted considerable effort to addressing the major challenges facing the sector. It is important to note that until recently the MoE and MoHEST were one Ministry. There are nine core EMIS data collection instruments attempting to cover all levels of education (see Table 1.1 below). Materials such as a data collection manual and data entry guides provide respondents with good paper-based assistance. However, three data collection cycles per year for the MoE and TSC imply that respondent burden is not always optimally considered.
1.2. Resources: Resources are commensurate with needs of statistical programs
16. The number of staff and the capacity to perform data management functions differ
from agency to agency. While the overall impression is that staffing in TSC, KNEC and CHE seems adequate, this is certainly not the case for the EMIS Unit at the MoE. The EMIS Unit seems to rely heavily on interns and external consultants, a situation that is hardly sustainable. In addition, the estimated six clerks attached to the EMIS Unit are said not to be particularly effective, specifically with regard to data-related functions. As a consequence, TSC sometimes has to lend clerks to the EMIS Unit.
17. The lack of data analysis at national and district level was raised as a concern in many of the interviews, and training is required in this regard.
18. Some efforts are being made at a decentralised level to ensure the retention of staff, but these efforts need to be increased at the EMIS Unit.
19. There is an ICT Unit in the MoE which takes care of individual computers and local area networks (LANs) and intends to exploit the national fibre optic backbone. As far as could be ascertained, the LAN is operational at provincial level and being implemented at district level, and benefits the data capturing systems.
20. The quality of the software for data capturing and reporting purposes varies across organisational units. The school census data returns process has been based on two independent data capturing systems, one for the TSC and one for EMIS (MoE). The EMIS data capture system developed by COPY CAT has specific weaknesses in meeting the user requirements at district and national level. The TSC system is not robust enough to cater for dynamic data needs. Due to these weaknesses it is not possible to have one common data capturing system at district level, and the system requires new reports in each cycle (refer to KESSP revised report: EMIS Chapter, May 2010). The software used by the EMIS Unit for the capturing of survey data is currently being discarded: it is not well designed and is difficult to use, and with no direct support available, this results in non-completion of data capturing exercises. The software of the TSC currently makes it possible to complete data capturing, although it needs to be reviewed. The CHE seems to use data analysis software (SPSS) as a database system, which is not an ideal solution, while the MoY uses spreadsheets, such as Excel, to capture data, which is far from ideal. The development of the Foundation Database in the MoE seems to be a step in the right direction. The greatest hurdle is having ICT-skilled staff to handle data entry and related data management issues at district and institutional level.
21. There appear to be sufficient computers provided to districts, with laptops provided to districts without electricity. According to the EMIS Unit, each district has at least one computer and one printer. An inventory of computers, printers and LANs in districts exists and is maintained. In some of the districts that were visited, the officials use their personal modems to communicate via e-mail with headquarters.
22. While EMIS is an Investment Programme (IP) under KESSP (refer to "Introduction" for more detail) in its own right, there are no dedicated budget lines for data management within the various Ministries. Despite the efforts made in rolling it out, the EMIS IP is still faced with challenges, including weak processing and low data sharing and utilization.
23. The allocation of budgetary resources for future statistical development seems to be a weak point due to a lack of clarity about the nature of KESSP, which is treated as a programme but seems to be a project. Mainstreaming of KESSP, and the EMIS component within it, would help increase the sustainability of the EMIS Unit's function.
24. The fact that the timeliness of data delivery has lagged for years seems to indicate that education statistics have not received enough attention from the educational leadership so far. However, a new Director of Policy and Planning was recently recruited, and this may change the importance attributed to educational data use. A new Permanent Secretary (PS) was also retained, and during a consultation meeting with him it was communicated that this will take priority.
25. A data collection manual was prepared in an attempt to encourage consistency in concepts, definitions and methodologies across the different units within the data-producing agency. Concerns are being raised, especially at district level, about the number of requests for the same information from different directorates at headquarters. It is also important to note that education data straddle three Ministries (MoE, MoHEST and MoYS), with the leading data producers being MoE, TSC and CHE. There is a need for a more coordinated approach to data collection, building on the level of coordination that already exists between the MoE and the TSC. This lends itself to the harmonization of concepts and the elimination of unnecessary duplication and repetition, thus increasing the efficiency of data collection and improving the comparability of the different surveys.
26. Education Management Information Systems (EMIS) is an investment programme in the Kenya Education Sector Support Programme (KESSP), with the aim of harmonising the collection, processing, analysis and dissemination of data. Although KESSP places great emphasis on the collection, collation and production of quality data, in practice there seems to be weak collaboration between data collection agencies. Furthermore, the lack of capacity within the MoE makes the efficient and effective management of these survey processes very difficult. The investigation indicated that the following data collection instruments exist for education institutions in the various ministries, such as MoE, MoHEST and the Ministry of Youth. The surveys for 1) ECD, 2) primary schools and 3) secondary schools are conducted termly, to meet TSC teacher management data needs, with reference months of March, July and October. The first survey is a short one, whilst the second updates the changes in terms of teachers and enrolment; the third survey for these sectors is a comprehensive data collection survey. There are also other surveys, such as 4) an adult education survey, which is quarterly (August, December, March and June). Surveys are also conducted for tertiary institutions, namely 5) Teacher Training Colleges, 6) Technical Industrial, Vocational and Entrepreneurship Training (TIVET), and 7) universities. There is a survey instrument for 8) Non-Formal Centres and 9) a survey for Education Officers. All these surveys have an annual data collection cycle. During the interviews it was established that the same information is collected by different agencies; for example, enrolment is collected by both TSC and TMU. Table 1.1 below outlines the various data collection instruments that were encountered during the consultation interviews and points to the need for harmonization of some of these processes.
Table 1.1: Overview of education data collection processes

Unit | Target | Frequency | Objective
MoE: EMIS | Primary and secondary schools (public and private) | 3 | To obtain national education data (enrolment, teachers, pupil ages, repeaters, dropouts, classes, special education)
EMIS | Early Childhood Development (ECD) | 1 | To collect statistical data on ECD
EMIS | Adult education | 1 | To collect data on adult learners by age, teachers
EMIS | Non-formal (public, private) | 1 | To collect statistical data on non-formal institutions
MoE: TMU | Primary (public) | 2 | For budgeting and accounting purposes
TSC | Primary, secondary (public) | 2 | To obtain teacher-related information?
USAID: DEMMIS | Primary (public, private) – not national; a pilot project | Monthly | For school-level management capacity development
CHE | Tertiary (public, private) | 1 | To obtain national statistical data (universities only)
MoHEST | TIVET (public) | ? | To obtain national statistical data (TIVET)
WFP | School feeding – not national (covering schools participating in the programme) | Monthly | For planning and allocation purposes
MoY | Youth polytechnics | 1 | To obtain national statistical data (YP only)
KNEC | Primary, secondary (public, private) | 1 | To obtain school data and achievement data
UWEZO | Primary schools (public) | – | To document the competencies of children in literacy and numeracy
Others (incl. KNBS) | Census; household surveys | – | –

Summary of the data collection instruments indicating the responsible unit, the target population, the frequency of the survey and the aim of the survey questionnaire.
27. Comprehensive guidelines exist in the form of EMIS manuals (a data collection manual, a data analysis and reporting manual, and a facilitator's guide). However, it appears as if these guidelines have not all been properly implemented. An attempt is being made to standardise the technical processes through the implementation of the Foundation Database, which is in development.
28. The team did not find evidence of periodic review of work processes, but the Needs Assessment Report for the Establishment of Education Management Information Systems (EMIS), funded by the World Bank, and the support for the current DQAF exercise can be seen as attempts to review existing work processes.
1.3. Quality awareness: Quality is a cornerstone of statistical work
29. There is a general concern about the quality of education data at management level.
30. It was apparent during the investigation that KNBS as the umbrella body has no formal
arrangement with other data collection agencies in terms of data processes and procedures or any data audit practices to ensure the production and dissemination of quality data. There is also no formal process and procedure in place to determine when data become official. However, during the visits it was observed that a Quality Policy Statement of the MoE was displayed at headquarters and district offices. The policy, although not with specific reference to data, reads that the MoE is "committed to operate a quality management system in line with ISO 9001 International Standards aimed at providing high quality products".
31. Although the MoE (EMIS) consolidates education data from various sources, such as the Teachers Service Commission (TSC), the Kenya National Examinations Council (KNEC), the Ministry of Higher Education and the Ministry of Youth, and acts as a link to the Kenya National Bureau of Statistics (KNBS), there is no formal agreement in place to support this arrangement.
32. Several checks of data quality are conducted at various levels and at various stages of the data collection process. District staff and DQASOs, in particular, were found to be the first 'line of defence' for EMIS. During the site visits, the participants at district offices explained the data collection process in detail. It appeared that a good data collection flow structure is in place, and managers and officials are aware of these data collection processes. The survey instruments are disseminated by the district education officers (DEOs) via the zones, through the zonal quality assurance and standards officers (ZQASOs), to the schools. The district education officer disseminates the survey forms according to an official list of schools, and the forms are returned via the same channels. The data entry takes place at school level and is checked and verified at the zones by the ZQASO, then sent to the district office and to headquarters, where further checks are conducted. Similarly, the data in the Textbook Monitoring Unit (TMU) in the Directorate of Basic Education are checked to ensure quality data is provided for capitation grants. This Unit has a sound verification process in place to check the existence of an institution, as well as a validation process to check the enrolment totals.
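To make the distinction between the verification and validation steps described above concrete, a minimal Python sketch is given below. The table layouts, institution codes and field names are hypothetical and are not taken from the actual TMU or EMIS systems; the sketch only illustrates the two kinds of check.

# Illustrative sketch only: the layouts and codes below are hypothetical,
# not the actual TMU / EMIS structures.

# Hypothetical master list of institutions (as kept at headquarters)
master_list = {"P-0001", "P-0002", "P-0003"}

# Hypothetical school returns: (institution code, enrolment by grade, reported total)
returns = [
    ("P-0001", [55, 48, 50, 47, 44, 41, 39, 36], 360),
    ("P-0002", [60, 58, 55, 52, 50, 47, 45, 40], 400),   # reported total does not match
    ("P-9999", [30, 28, 27, 25, 24, 22, 20, 18], 194),   # not on the master list
]

for code, by_grade, reported_total in returns:
    # Verification: does the institution exist on the official master list?
    if code not in master_list:
        print(f"{code}: failed verification (not on master list)")
        continue
    # Validation: do the grade-level figures add up to the reported enrolment total?
    if sum(by_grade) != reported_total:
        print(f"{code}: failed validation (grades sum to {sum(by_grade)}, "
              f"reported {reported_total})")
    else:
        print(f"{code}: passed checks")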
33. There is no evidence that a coordinating entity or group exists to provide guidance on the quality of statistical data. This function is mostly performed by Quality Assurance and Standards Officers at district and zonal level who, seemingly, have little or no data quality skills.
34. Although the KESSP revised report (see EMIS Chapter, May 2010) refers to a statistical quality assurance framework (SQAF) as one of its implementation strategies in an attempt to get feedback from users, no evidence was found of periodic user surveys or other systematic processes to obtain feedback from users on data quality issues. This strategy envisages monitoring the data collection procedures, data validity and support systems at all levels. It seems as if it is merely a reference to the current DQAF exercise.
1.4. Synthesis and score

Based on an assessment of all the DQAF sub-dimensions, a global score of 42% has been assigned for the pre-requisites of quality dimension.

Figure 1.1: Results of pre-requisites of quality (bar chart comparing Kenya with the international norms for the sub-dimensions legal and institutional environment, resources and quality awareness; values shown: 0.46, 0.41 and 0.39)
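As a reading aid for how the sub-dimension scores shown in Figure 1.1 could relate to the 42% global score, a minimal Python sketch follows. It assumes the dimension score is an unweighted average of the sub-dimension scores (the report does not state the aggregation rule), and the mapping of the plotted values to the sub-dimension labels is assumed from the order in which they appear in the figure.

from statistics import mean

# Sub-dimension values as they appear in Figure 1.1
# (mapping to labels is assumed from their order; not confirmed in the text)
sub_scores = {
    "Legal and institutional environment": 0.46,
    "Resources": 0.41,
    "Quality awareness": 0.39,
}

# Assumed aggregation rule: unweighted mean of the sub-dimension scores
dimension_score = mean(sub_scores.values())
print(f"Pre-requisites of quality: {dimension_score:.0%}")  # prints 42%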
1.5. Recommendations
The experience in the field revealed that there were some positive aspects regarding data collection processes and awareness. It also seemed as if officials at the institutional level are not as overburdened as one might have thought, taking into account all the data collection processes that take place during one calendar year in these institutions.
Nr. | Recommendation | Priority
1.1.1 | The arrangement between MoE and KNBS, in relation to the Head of the EMIS Unit, needs to be clarified. A Memorandum of Understanding (MoU) is said to have been drafted in this regard, but this now appears to be outdated. A new MoU regulating this arrangement seems a minimum pre-requisite, but more rigorous discussion may be useful to clarify respective roles. | High
1.1.2 | An institutional framework, relating to and describing the national statistical system for education, should be established. This should indicate the approximate functions of all actors in the system; promote the strategic involvement of KNBS; and formalise data-sharing / coordination arrangements between the different organisational elements / units of the system. Regular coordination meetings between relevant actors are strongly advised as a first step in this direction. | High
1.1.3 | Kenya is encouraged to include a statistical element in the Education Act. | Medium
1.1.4 | Survey and census instruments in general, and for example that of the Commission for Higher Education (CHE), should remind respondents of their rights and obligations. They should indicate that information is to be treated confidentially, but also inform respondents of their duty to comply, where applicable. | Medium
1.1.5 | The (physical) space for the EMIS Unit should urgently be improved, to adequately house the staff working there, as well as provide storage for critical files. | High
1.1.6 | A policy outlining guidelines for storage / archiving and destruction of data and documents should be formulated, and, where possible, widely adopted. | Low
1.1.7 | Data collection schedules and procedures are to be optimised, with a suggested reduction from three to two cycles for the EMIS Unit / TSC surveys and integration of the Textbook Monitoring Unit (TMU) survey into the mainstream data collection cycle. | High
1.2.1 | At least one to two mid-level officers (planners) should be recruited for the EMIS Unit, to ensure continuity and enable succession planning. These may be sourced from the Provincial Officers, for example. The Head of the EMIS Unit should train these officers. | High
1.2.2 | The efforts in provision of information and communication technology (ICT) resources should be realigned, focusing on doing the necessary and realistic first, rather than attempting such projects as connecting every school. Functioning local area networks (LANs) in districts and provinces should be considered as critical tools to enable an effective EMIS. Prioritisation is highly recommended. | High
1.2.3 | In terms of software, unification of the different systems under one software application is recommended. The present application is not working well, and a sustainable solution should be sought. Harmonised databases across responsible bodies should underpin the whole education statistics system. | High
1.2.4 | Training in generic ICT skills, as well as in (using software for) data analysis, is urgently required, in particular at the district level. This should target staff critical to data processing related to EMIS, for example, secretarial staff. It should be targeted in such a way as to ensure capabilities are durably developed. | High
1.2.5 | While at present there seem to be sufficient computers at district level, even in areas with limited electricity (laptops), connectivity should be provided as a priority. This will become of particular importance once the EMIS application (see 1.2.3) is accessible online. | Medium
1.2.6 | The exact status of the EMIS Unit, apart from the EMIS IP, is to be clarified, mainstreamed and provided with adequate funding. | High
1.2.7 | Harmonisation of concepts, methodologies and surveys / data collection cycles should be sought, where possible; this drive should be led by a central coordinating body such as a data quality group (see 1.1.2). KNBS should probably play a key leadership role in such a group (see 1.3.1). | Medium
1.2.8 | Regular training in completing survey instruments, for both school-based and district-based staff, as well as regular data quality checks and audits of completed instruments, should be implemented. This is of particular importance at recruitment. | Medium
1.3.1 | KNBS should, to a greater extent, fulfill a role of guardian of quality at relevant line Ministries and SAGAs. It should attain a leadership role in coordination activities (see 1.2.7). | Medium
1.3.2 | Surveys of user satisfaction with education statistics (where users may be defined as intra-Ministerial clients or external individuals or entities, such as development partners) should be conducted to assist a greater understanding of whether relevant data provision is timely, accurate and accessible. | Low
2. Integrity: The principle of objectivity in the collection, processing, and dissemination of statistics is firmly adhered to
This dimension captures the notion that statistical systems should be based on adherence to the principle of objectivity in the collection, compilation, and dissemination of statistics. The dimension encompasses institutional arrangements that ensure professionalism in statistical policies and practices, transparency, and ethical standards. The three elements for this dimension of quality are:
professionalism,
transparency, and
ethical standards.
2.1. Professionalism: Statistical policies and practices are guided by professional principles
35. The MoE recognizes the critical role of an effective Education Management Information System (EMIS) in the provision of timely, reliable and accurate data for the education sector, as is acknowledged in KESSP. It should be mentioned that the Education Act, which defines the work of MoE, does not, in any way, mention statistics or any data collection activity, while it could be the vehicle to address the general need for the professional independence of the data producing agency. As mentioned before, KNBS has the primary responsibility for censuses and household surveys (which is underpinned by the Statistics Act), and there is an attempt by MoE, through the EMIS Unit, to consolidate the education data sources and act as a formal link to the Kenya National Bureau of Statistics (KNBS).
36. The recent establishment of KNBS as a Semi Autonomous Government Agency (SAGA), from formerly being a component of the Ministry of Planning, indicates recognition of the importance of its independence, with full authority to compile and disseminate statistical information. This function is underpinned by the Statistics Act, 2006, which clearly states that it is an Act of Parliament to provide for the establishment of the Kenya National Bureau of Statistics for the collection, compilation, analysis, publication and dissemination of statistical information, and the co‐ordination of the national statistical system, and for connected purposes.
37. KNBS as the umbrella agency is supportive of the work of the EMIS Unit, with no interference in the compilation and dissemination of statistical data. EMIS consolidates the education data for the Kenya National Bureau of Statistics (KNBS) for purposes of the publication of the Statistical Abstract and Economic Surveys. In fact, from the discussions with participants it was apparent that an informal collaboration exists between these two data producing agencies.
2.2. Transparency: Statistical policies and practices are transparent
38. Based on evidence from the analysis of current documentation (see KESSP revised report: EMIS Chapter, May 2010) with regard to recruitment and promotion practices, it is stated that a systematic and aggressive capacity building programme is in place and will be continuously carried out to equip the staff with skills and competencies necessary to support electronic‐based systems, particularly at the district, province and headquarters levels.
39. It does not seem that professionalism is promoted through the publication of methodological papers or by encouraging participation in, or the organisation of, lectures and conferences; constraints due to lack of capacity are a very real inhibitor at this level. However, it is a necessity that should be emphasised.
40. In discussions with interview participants it was indicated that the core data survey instruments of the MoE (refer to Table 1.1 above) and the population census are considered for the development of education indicators such as Gross Enrolment Ratios (GER), Net Enrolment Ratios (NER), repeaters, dropouts, transition rates, completion rates, gender parity, and the tracking of the MDGs and EFA progress. The publications Education Facts and Figures and the Education Statistical Booklet include these education indicators. When the team visited one of the schools, the head‐teacher indicated that the source data for age is obtained from the birth certificates of learners; in fact, she kept a file with all these documents. In our discussion with participants it was apparent that there is a drive in the country to make sure that all learners get their birth certificates. Furthermore, there is an awareness of the statistical chain (instrument design and dissemination, data cleaning and reporting), with some limitations in data validation and data analysis. There is also an attempt to validate and compare certain data elements on the survey instrument; for example, comparing enrolment figures with age totals on the survey instrument will increase the quality of the data. However, it must be said that the experience during these few discussion sessions is that there is a definite need to improve data processing (data collection activities and scheduling) and data validation procedures.
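To make the indicator calculations referred to above concrete, the following minimal Python sketch shows how GER and NER are conventionally derived from enrolment returns and census-based population projections. The sketch is illustrative only; the district figures used are invented and do not come from the Kenyan data.

    # Illustrative sketch (not from the report): how GER and NER are typically derived
    # from enrolment returns and census-based population projections.

    def gross_enrolment_ratio(total_enrolment: int, school_age_population: int) -> float:
        """GER = enrolment at a level, regardless of age, / official school-age population * 100."""
        return 100.0 * total_enrolment / school_age_population

    def net_enrolment_ratio(enrolment_of_official_age: int, school_age_population: int) -> float:
        """NER = enrolment of the official age group only / official school-age population * 100."""
        return 100.0 * enrolment_of_official_age / school_age_population

    # Hypothetical figures for a single district, for illustration only.
    total_primary_enrolment = 120_000       # all pupils enrolled in primary, any age
    primary_enrolment_aged_6_13 = 101_000   # pupils of official primary age
    population_aged_6_13 = 110_000          # projected school-age population

    print(f"GER: {gross_enrolment_ratio(total_primary_enrolment, population_aged_6_13):.1f}%")
    print(f"NER: {net_enrolment_ratio(primary_enrolment_aged_6_13, population_aged_6_13):.1f}%")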
41. It was ascertained during the investigation that members are generally not encouraged to present their reasoning for the choice of methodologies in documents that are made public.
42. There seems to be very little evidence of the data‐producing agency commenting publicly on erroneous interpretations or misuse of the statistical data.
43. A representative of KNBS indicated that there is media interest in the leading economic indicators published on a monthly basis, and the KNBS website is therefore an important link in this regard. On the website of KNEC there is a speech by the chief executive officer during the release of the 2009 KCSE results, and a speech by the Minister of Education during the release of the 2009 KCPE results. A report on the 2008 KCPE examination, with question papers and answers to the objective questions, is also available. There is no clipping service or media division in the MoE.
44. The Kenya National Examinations Council has a service charter as a commitment to provide quality service to its clients, and KNBS has a data access and dissemination policy which clearly specifies that the objective of the micro‐data release policy is to define the nature of the anonymised micro‐data files that will be released, the intended use of these files, and the conditions under which these files will be released. The policy further outlines terms and conditions of use of public data files. We could not find similar documentation for the MoE.
45. There does not seem to be a public and official schedule for the reporting of statistics at MoE. However, KNBS has a catalogue (39 pages), an ad‐hoc publication which presents descriptive information on KNBS publications and includes the following details: the title; a short summary of each publication; the unit price / method of distribution; the number of pages; whether available in other formats; the International Standard Book Number where necessary; the KNBS website, through which all other KNBS publications can be accessed; and a list of geographic information maps of the 1999 Census.
46. In the case of MoE, the public is not made aware of the approval process for official statistics.
47. Most publications that are available have been clearly identified, and the data producing agency is usually acknowledged as the source. Refer to publications of MoE, such as the Education Statistical Booklet and Education Facts and Figures, and publications of KNBS, such as the Statistical Abstract 2007 and Economic Survey 2009, as examples.
48. In most cases, it appears that, in the case of joint publications, the part attributable to the data‐producing agency is identified. However, the KNBS logo seems not to have been indicated on the MoE publications.
49. Oversights were noted during the visits with regard to requests by the data agency for attribution when its statistics are used or reproduced. It seems that the MoY has not requested attribution for publication of their data. Data from KNEC is available for outside users, but the obligations and responsibilities of their customers are clearly outlined in their Service Charter (June 2006). It is stated that they should “recognize, acknowledge and respect intellectual property rights and any materials acquired in the course of duty with KNEC”.
50. The team did not find any evidence that advance notice is given when major changes in methodology, sources, and statistical techniques are introduced.
2.3. Ethical standards: Policies and practices are guided by ethical standards
51. Clear guidelines outlining correct staff behaviour exist, for example in the Public Officer Ethics Act, but these are not focused on statistics and tend to be on the generic side.
52. Management acknowledges its status as a role model and is vigilant in following the guidelines.
53. Most staff members in the EMIS Unit are interns or consultants. There does not seem to be an induction process in place for new staff members.
54. No specific mechanisms seem to exist to remind staff members periodically of the guidelines (ethical standards).
2.4. Synthesis and score
Based on an assessment of all the DQAF sub‐dimensions, a global score of 31% has been assigned for the integrity dimension.
Figure 2.1: Results of integrity (sub-dimension scores for Kenya against international norms: professionalism 0.32, transparency 0.32, ethical standards 0.28)
2.5. Recommendations
Nr. Recommendation Priority
2.1.1 Further participation in technical conferences and publication of (methodological) papers, by concerned EMIS staff, is encouraged.
Medium
2.1.2 More can be done in terms of preventing erroneous interpretation of statistical data in the press, through e.g. briefings and a ‘clipping service’.
Low
2.2.1 The KNBS’ contribution to MoE publications should be made explicit with a logo, wherever possible.
Medium
2.2.2 Advance notice needs to be given when major changes in methodology, sources, and statistical techniques are introduced. Prior to that, however, the current way of calculating data should be clarified publicly.
Medium
2.3.1 More explicit guidelines on ethical behaviour of relevant staff would be helpful. This should be an important element of any induction process (see 1.2.8 and 2.3.2).
Medium
2.3.2 The introduction of an induction process for new staff members, targeting all staff involved in the statistical data chain, seems called for.
Medium
3. Methodological soundness: The methodological basis for the statistics follows internationally accepted standards, guidelines, or good practices
This dimension covers the idea that the methodological basis for the production of statistics should be sound and that this can be attained by following internationally accepted standards, guidelines, or good practices. This dimension is necessarily dataset‐specific, reflecting different methodologies for different datasets. This dimension has four elements, namely:
concepts and definitions,
scope,
classification/sectorization, and
basis for recording.
3.1. Concepts and definitions: Concepts and definitions used are in accord with standard statistical frameworks
55. At the time of the fact-finding mission, the mapping of the education system in Kenya did not completely align with the International Standard Classification of Education (ISCED) of UNESCO. Kenya is one of the countries that have submitted a Questionnaire on National Education Programmes (UIS/ISCED) to the UIS and whose mappings are in progress (available at: http://www.uis.unesco.org).
56. No important deviations from international standards and concepts were found.
57. Neither the Education Act nor the available publications describe the compulsory beginning and end ages of education.
58. Data is collected for repeaters via the primary and secondary school survey instruments, indicating that there is no automatic promotion in Kenya. Furthermore, progression from primary to secondary school, and from secondary school to university, is through selection on the basis of performance in the national examinations for the Kenya Certificate of Primary Education (KCPE) and the Kenya Certificate of Secondary Education (KCSE), respectively.
3.2. Scope: The scope is in accord with internationally accepted standards, guidelines, or good practices
59. The formal education system in Kenya comprises Early Childhood Education, 8 years of compulsory schooling in primary education, 4 years in secondary education and a minimum of four years at university, depending on the degree pursued. This is widely referred to as the 8‐4‐4 system, which has been operational since 1985. Data collection instruments were harmonized into nine core EMIS data collection instruments in an attempt to cover all levels of education. The data collection for ECD, primary and secondary is termly, to meet the TSC’s teacher management data needs, with reference months of March, July and October. Tertiary institutions have an annual data collection cycle, while adult education is quarterly (refer to EMIS documentation).
60. Analysis of the core annual survey instruments indicates that data on teaching staff is mostly collected by the MoE, in collaboration with TSC. The Textbook Monitoring Unit, within the MoE, also collects data (enrolment) for the capitation grants and keeps records of the current textbooks in institutions. The Free Primary Education return was designed in 2003 to support the Free Primary Education programme, and its target is public primary schools. The key data collected is enrolment, which is used as a basis for disbursements of Free Primary Support Funds. The enrolment statistics are compiled at the school and forwarded to the District Education Office for further tabulation, and then transmitted to the Textbook Monitoring Unit, under the Directorate of Basic Education at Ministry headquarters, for the allocation of funds.
61. The Kenya National Examinations Council (KNEC) is responsible for examination data in the country, with one of its core functions being the co‐ordination of examination data. The only performance data that is collected appeared to be through the Teachers Training Colleges annual survey instrument, which asks for the institution’s mean score. The Southern and Eastern Africa Consortium for Monitoring Educational Quality (SACMEQ) maintains a dataset that contains information on school‐based surveys, including students’ tests, from the 15 countries that belong to SACMEQ, Kenya being one of them.
62. The data collection process is coordinated by provincial, district and zonal officials. The datasets make provision for these geographical boundaries and are published as such, namely per institution by district and province. Table 3.1 below indicates the geographical boundaries (Education Statistical Booklet 2003‐2007) by province and district. Table 3.2 below shows that the structure of the TSC database also makes provision for geographical boundaries (refer to the provincial and district codes in the table) and is used at institutional level (see school code) to capture the survey data.
Table 3.1: Primary schools by province and district (Source: Education Statistical Booklet 2003-2007; Coast Province extract of Table A3.1, Number of Primary Schools by Type and District, 2003-2007)

District        2003 (Pub/Priv/Total)    2004                   2005                   2006                   2007
Taita Taveta    177 / 7 / 184            178 / 7 / 185          179 / 9 / 188          179 / 9 / 188          179 / 27 / 206
Kilifi          227 / 29 / 256           230 / 33 / 263         229 / 25 / 254         235 / 50 / 285         235 / 98 / 333
Tana River      114 / 3 / 117            114 / 3 / 117          121 / 7 / 128          121 / 8 / 129          121 / 15 / 136
Lamu            67 / 4 / 71              65 / 2 / 67            66 / 3 / 69            69 / 4 / 73            69 / 1 / 70
Kwale           272 / 9 / 281            271 / 9 / 280          271 / 12 / 283         269 / 18 / 287         269 / 80 / 349
Mombasa         83 / 51 / 134            81 / 64 / 145          85 / 87 / 172          91 / 104 / 195         91 / 341 / 432
Malindi         95 / 17 / 112            100 / 20 / 120         97 / 40 / 137          105 / 43 / 148         105 / 67 / 172
Total (Coast)   1,035 / 120 / 1,155      1,039 / 138 / 1,177    1,048 / 183 / 1,231    1,069 / 236 / 1,305    1,069 / 629 / 1,698

(Geographical boundaries, such as province and district, are indicated in the table above. Only the Coast Province block of the source table is reproduced here; the source table continues with Central Province and the other provinces.)
Table 3.2: Institution table structure in the TSC database (Source: TSC database)

SchoolCode   Scode     schoolname                     Provcode  Distcode  DISTNAME
800104013    0104013   A. K. MAGUGU PRIMARY SCHOOL    2         0201      KIAMBU
900033034    0033034   A.B.C ACADEMY                  7         0733      KISII
900006019    0006019   A.B.C ACADEMY GIRLS            3         0306      MACHAKOS
820611072    0611072   A.B.C GIRLS ACADEMY ST MARY    3         0768      NYANDO
820612056    0612056   A.B.C KATHEKA PRI SCH          3         0306      MACHAKOS
000009140    0009140   A.B.C KISOVO                   3         0348      MWINGI
824301154    4301154   A.B.C THWAKE PRI               3         0343      MAKUENI
830905017    0905017   A.C KIGARI SCH                 3         0309      EMBU
830905025    0905025   A.C KIRIGI PRI SCH             3         0309      EMBU
830902039    0902039   A.C NDUMARI PRI SCH            3         0309      EMBU
810506009    0506009   A.C OLKALOU PRI SCH            2         0205      NYANDARUA

(The fields in the institutions table of TSC, such as Provcode, Distcode and SchoolCode, indicate that provision is made for geographical boundaries in the data collection process.)
63. The data collection instruments, as listed in Table 3.3 below, include groups such as male, female, private and public. Table 3.1 above is an example of data for public and private institutions. The enrolment and teacher data can also be categorized in groups such as gender and sector (public and private).
3.3. Classification / sectorisation: Classification and sectorisation systems are in accord with internationally accepted standards, guidelines, or good practices
64. Statistics, in general, do not seem to be underpinned by an overall classification framework or system. For example, data on Youth Polytechnics are collected, processed and published separately from TIVET data, while these concern, in essence, the same sub‐sector. This is due to historical and organisational factors, rather than statistical ones. In addition, some databases are recreated on a yearly basis, without conversion keys, leading to a lack of consistency of classifications over time, as well as to challenges to compatibility and trend analyses.
65. The recording of educational expenditure from public sources (for example, data collected by the Textbook Monitoring Unit) allows for disaggregation by ISCED level without significant challenges. As submissions of international questionnaires even reflect household expenditures by ISCED levels, the situation regarding private contributions is also satisfactory.
66. There are nine core data collection instruments in an attempt to cover the entire education spectrum (ECD, primary, secondary, TIVET, TTCs, university) in terms of students (enrolment), teachers, and educational institutions.
67. Documentation is available for respondents who need assistance, in the form of a Data Collection Manual (DCM) that covers all levels of education, stating the data collection procedures, definitions of terms and terminology, data collection reference dates, and detailed explanations of the instruments. However, it appeared that the manual had not yet been implemented.
3.4. Basis for recording: Data are recorded according to internationally accepted standards, guidelines, or good practices
68. Although not providing all elements to identify the full set of ISCED levels, the questionnaires for data collection, on the whole, provide adequate data for submission of the international questionnaires. The nine core data collection instruments attempt to cover all levels of the education spectrum, including the Youth Polytechnics.
69. Although not strictly according to the ISCED classification, the data is categorized by public and private, and by ECD, primary, secondary and tertiary education. The data collection instruments for universities are in the process of being developed and implemented; in this sub-sector there is a strong willingness to implement the data collection activity. The data collection instruments for primary and secondary education also collect data by gender, age and class.
70. Table 3.3 below presents a detailed analysis of all the data collection instruments that the team could obtain during its investigation. It must be noted that, although some of the instruments exist, the datasets are not necessarily complete or available. For example, the 3rd term data instrument for primary and secondary schools is a comprehensive survey, but the datasets for this survey were not available. From the table below it is evident that education finance data was not readily available, and the investigation confirmed this.
Table 3.3: Overview of data collection processes

(The columns of the table are the survey instruments, with the term and year in which each survey was conducted: ECD – 3rd Term 2008; Primary – 1st Term 2010; Secondary – 1st Term 2010; Non-Formal – 1st Term 2010; Teacher Training Colleges – 2010; TIVET – 2008; University (CHE) – 2010; Adult Education – 2010; TSC – 2nd Term 2010; and DEMMIS – ECD & Primary. The rows of the table are the elements included in each questionnaire – enrolment, repeaters, graduates, enrolment by age, field of study, teachers (level of education, length of service, full-time/part-time, full-time equivalents, trained/untrained), non-teaching staff (3rd Term 2008 where applicable), education institutions and education finances (3rd Term 2008 where applicable) – disaggregated, where applicable, by gender, by private and public, and by grade/level of education.)
3.5. Database structure
71. Prior to the analysis of the different database systems currently used for data collection, a few general statements about Relational Database Management Systems (RDMS) are offered. A Relational Database Management System is a system in which data is stored in the form of tables and the relationships among the tables are also stored in the form of tables. The relational structure makes it easy to query the database and to integrate large datasets from multiple sources. Data integration generally means linking different data sources through a common field across a collection of data sources; to be able to do this, unique identifier codes must be assigned to the datasets that are used for the integration. Another key concept in RDMS is referential integrity, which ensures that relationships between tables remain consistent: when one table has a foreign key to another table, referential integrity states that one may not add a record to the table that contains the foreign key unless there is a corresponding record in the linked table (it includes concepts such as cascading delete and cascading update). Normalization is another important concept, which ensures that the data in the database is efficiently organized, namely by eliminating redundant data and ensuring that dependencies make sense.
With the above in mind, we can now look at the databases encountered during the visits. The COPY CAT database, designed and developed to capture the data for MoE, was complex and difficult to access and could not meet user needs at the district and national levels. No historical data can be stored in the database, and for every data collection process a new database has to be physically installed. This seems to be the main reason why the MoE is planning to discard it in favour of a more stable and standardized system, such as StatEduc2 on top of the Foundation Database. StatEduc is a statistical data entry and processing software generator that was developed by the UNESCO Institute for Statistics to answer member states’ needs; the system has been thoroughly tested and is currently working in several countries in Sub‐Saharan Africa.
The database used by TSC was available and seems to work for the purpose for which it was developed. However, it was not designed for statistical needs, including efficient storage and retrieval of historical data: a new database has to be installed every time a data collection process starts. The structure of the database also does not adhere to the principles of RDMS. Figure 3.1 below is an example of some of the fields in the age table, which is in the form of a flat file, making data querying difficult. There is also no primary key assigned, and no naming conventions are followed, such as to indicate the primary key or foreign key fields in the table. Figure 3.2 is the table with all the institutions (master list). There is a unique identifier for each institution, but no link to any other dataset. Remember that there are three unique identifiers for each institution, namely a code for TSC, MoE and KNEC.
Figure 3.1: Structure of the age table in TSC database
(The table indicates the fields in the TSC database with a flat file structure. The fields, such as std1m5 and std1f5 indicate the age 5 by standard and gender).
Figure 3.2: The structure of the Institutions table (master list) in TSC database
(The master list table in TSC database indicates the unique identifier for institutions with the field name schoolcode)
The structure of the Foundation database, although not yet fully implemented, is in line with RDMS principles. The principles of referential integrity were evident in the design of the Foundation database, and specific naming conventions are followed, as indicated in Figure 3.3 below. Specific documentation for the Foundation Database was also available.
Figure 3.3: Structure of enrolment table in the Foundation database
(Referential integrity is applied in the Foundation database. Notice how field fk_age_group in the enrolment table refers to the age_group table)
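The referential-integrity principle illustrated by Figure 3.3 can be demonstrated with a minimal sketch. The example below uses SQLite; the table and field names age_group, enrolment and fk_age_group follow the Foundation database convention noted above, while the primary-key field names and the inserted rows are assumptions made purely for illustration.

    # Minimal sketch of the referential-integrity principle described above, using SQLite.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces foreign keys only when enabled

    conn.execute("CREATE TABLE age_group (pk_age_group INTEGER PRIMARY KEY, label TEXT)")
    conn.execute("""
        CREATE TABLE enrolment (
            pk_enrolment INTEGER PRIMARY KEY,
            fk_age_group INTEGER NOT NULL REFERENCES age_group(pk_age_group),
            pupils INTEGER NOT NULL
        )
    """)

    conn.execute("INSERT INTO age_group VALUES (1, 'age 6')")
    conn.execute("INSERT INTO enrolment VALUES (1, 1, 250)")       # valid: age group 1 exists

    try:
        conn.execute("INSERT INTO enrolment VALUES (2, 99, 180)")  # invalid: no age group 99
    except sqlite3.IntegrityError as err:
        print("Rejected by referential integrity:", err)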
3.6. Synthesis and score
Based on an assessment of all the DQAF sub‐dimensions, a global score of 49% has been assigned for the methodological soundness dimension.
Figure 3.4: Results of methodological soundness (sub-dimension scores for Kenya against international norms: concepts and definitions 0.33, scope 0.67, classification/sectorization 0.28, basis for recording 0.46, database structure 0.06)
3.7. Recommendations
An overall view of methodological soundness indicates that many data collection activities are initiated and implemented throughout the system. Considering the outcomes of the evaluation of this section, it did not appear as if there was an endemic problem with the collection of data: questionnaires are developed, designed, printed and disseminated, and survey instruments are distributed, captured and returned. However, when one delves deeper, it was evident during our investigation that response rates need to be increased and data collection processes need to be improved and completed.
Nr. Recommendation Priority
3.1.1 It is recommended that the main characteristics of the Kenya education system, an understanding of which will assist the interpretation of the data, are briefly explained in important statistical publications.
Low
3.2.1 Several improvements can be made in terms of the scope of the collected data. In terms of sectors, large parts of the TIVET establishments do not seem to be covered by any surveys – i.e., the private TIVET institutions. The separation of Youth Polytechnics, in terms of EMIS, from general TIVET, while understandable from an organisational and historical point of view, creates inefficiencies too large to ignore.
Medium
3.3.1 Some documentation of a mapping of Kenya’s education system to the International Standard Classification of Education (ISCED), at KNBS and MoE, is highly recommended. (Kenya’s ISCED mapping is published on the UIS website.) This will not only aid consistency and coverage in future submissions of international data, but also international comparisons of education indicators.
Low
3.3.2 Implementation of the data collection manual, as developed, is encouraged, but this should preferably be harmonised with the possible changes following a revision of data collection instruments and procedures, underpinned by a general review of information requirements (see 3.5.3).
Low
3.4.1 It is recommended to optimise the data collection procedures for education statistics, e.g. by reducing EMIS’ data collections to two cycles a year (see 1.1.7). A snapshot of data at the beginning of the year (1), to provide teacher and pupil enrolment data for education planning purposes, and a more comprehensive survey later in the year, is considered the ideal model.
High
3.5.1 Unique identifiers for educational establishments across organisational units involved in educational data collection and processing, are highly recommended (note: steps in this direction have been taken through the ‘MoE‐number’). These should be sourced from a Master List that is used across organisational units, e.g. such as that of the MoE / Registration Department.
High
3.5.2 A full redesign and integration of existing database systems, according to current standards, is advised, involving the application of referential integrity. The Foundation Database aims to realise this.
Medium / High
3.5.3 A global inventory of information / data requirements is highly recommended, to underpin the redesign mentioned under 3.5.2.
High
(1) Kenya’s academic year follows the calendar year.
4. Accuracy and reliability: Source data and statistical techniques are sound and statistical outputs sufficiently portray reality
This dimension covers the idea that statistical outputs sufficiently portray the reality of the economy. It relates to the notion that source data provide an adequate basis to compile statistics, that statistical techniques are sound, and that source data, intermediate data, and statistical outputs are regularly assessed and validated, inclusive of revision studies. The five elements of this dimension cover:
source data,
statistical techniques,
assessment and validation of source data,
assessment and validation of intermediate data and statistical outputs, and
revision studies.
4.1. Source data: Data available provide an adequate basis to compile statistics
4.1.1. Statistics collected through a regular administrative school census program
A thorough analysis of the databases of TSC and the datasets of MoE has shown the following:
72. The structure of the educational system, students and teachers: The nine core data survey instruments cover the education system (ECD, primary, secondary and tertiary) in terms of data such as enrolment and teachers. School data on educational expenditure was not always available. The database of TSC that we obtained, although not well designed, included the tables for secondary and primary education for three consecutive years (2008, 2009, 2010), as presented in Figure 4.1 below. The database of EMIS (MoE) was not operational at the time of the investigation. The Foundation database in EMIS (MoE) is an attempt to have a functional system in place that contains adequate source data on the education system, students, teachers, and educational expenditure to compile statistics.
Figure 4.1: Tables in the TSC database
(The names of the tables in the TSC database are displayed in the figure above, such as PriStudentsEnrolment20101, indicating the enrolment table for primary school students in 2010 for the first term data collection)
73. Geographic areas (local, regional, central): The structure of the MoE is based on clearly defined geographical areas at local (institution), regional (districts, divisions and zones) and central (national) levels. The MoE consists of 8 provinces, and within each province are a number of districts; currently there are 272 districts in the country. Within each district there are a number of divisions (600 in the country), and within each division there are a number of zones. Each zone consists of about 23 institutions. The data collection process (questionnaire dissemination, collection and capturing) is managed along these geographical areas. The institution table in the TSC database has fields for institutions, provinces and districts to accommodate the geographical areas of the education system, as indicated in Figure 3.2 above. The master list of institutions with unique identifiers for MoE was in the process of development during our consultation interviews.
74. Coverage of relevant subgroups of units of collection (e.g. male and female students and teachers): Student and teacher data in the TSC database are disaggregated by gender. The data was available only for public institutions, as presented in Figure 4.2 below. Figure 4.3 shows a large inconsistency between the enrolment by grade for the years 2008 and 2009 of the MoE (the team received a spreadsheet from MoE with enrolment figures for 2008 and 2009).
Figure 4.2: Students and teachers by gender for public primary schools in the TSC database
Figure 4.3: Enrolment figures by grade for 2008 and 2009 (data received via a spreadsheet from the MoE)
75. School list reliability in the database: There is a unique identifier for each institution in the TSC database, which is used in all correspondence and processes for that institution. When we analysed the TSC database that we received, there were some duplicates in the institution table, as indicated in Figure 4.4 below: institutions have the same TSC unique number but the name of the school differs.
Figure 4.4: Duplicates in the institutions table in the TSC database

SchoolCode   schoolname
942903084    NAKWAPUA PRY SCH
942903084    NGOTUT PRY SCH
942903081    NASAL PRY SCH
942903081    KASAKA PRY SCH
942903083    KOSIA PRY SCH
942903083    OROLWO PRY SCH
942903085    KOTOPOTON PRY SCH
942903085    KATOPOTON PRY SCH
942903086    CHEPCHIKARAR PRY SCH
942903086    MBARU PRY SCH
942903082    LOCHERIAMONYANG PRY SCH
942903082    KAPKEWA PRY SCH
810501079    Grace Primary
810501079    KOINANGE PRY SCH
810501078    GRACE PRY
810501078    KANYUGI PRY SCH
000009238    KIVANI S.S.
000009238    YIKITAA SEC SCHOOL
820612016    KIKAMBUANI PRI SCH
820612016    kalacha nomadic
000008350    NUU S.S.
000008350    KITOO SEC SCHOOL
933801117    BIRKAN PRY SCH
933801117    DARIKA PRY SCH

(Same school code, but the name of the school is different.)
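A check of the kind that produced Figure 4.4 can be expressed as a short script: flag any school code that appears with more than one distinct name. The sketch below is illustrative only and uses a handful of the code/name pairs shown in the figure.

    # Sketch of a duplicate-code check on the institutions table.
    from collections import defaultdict

    # Sample rows taken from Figure 4.4 (TSC institutions table).
    rows = [
        ("942903084", "NAKWAPUA PRY SCH"),
        ("942903084", "NGOTUT PRY SCH"),
        ("942903081", "NASAL PRY SCH"),
        ("942903081", "KASAKA PRY SCH"),
        ("810501079", "Grace Primary"),
        ("810501079", "KOINANGE PRY SCH"),
    ]

    names_by_code = defaultdict(set)
    for code, name in rows:
        names_by_code[code].add(name)

    for code, names in names_by_code.items():
        if len(names) > 1:
            print(f"Duplicate code {code}: {sorted(names)}")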
76. Reliability of age distribution: The age table in the TSC database for primary schools (Figure 4.5 below) has a flat file format, not complying with relational database structure standards; this makes querying the data extremely difficult. When we compared the totals of the enrolment figures as collected in the 2010 survey with the age totals in the same survey, large discrepancies appeared. The scatter plot in Figure 4.6 below shows the discrepancies between these figures.
Figure 4.5: Primary school age table for 2010 in the TSC database
Figure 4.6: Scatterplot for age totals and enrolment collected in the 2010 survey instrument
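The consistency check behind Figure 4.6 can be sketched as follows: for each institution, the total enrolment reported on the instrument is compared with the sum of the age-distribution columns from the same instrument, and any non-zero difference is flagged for follow-up. The records used in the sketch are invented for illustration.

    # Minimal sketch of the enrolment-versus-age-total consistency check.
    records = [
        # (school_code, enrolment_total, sum_of_age_distribution) - invented examples
        ("800104013", 512, 512),
        ("900033034", 348, 291),   # discrepancy: age breakdown does not add up
        ("830905017", 275, 410),   # discrepancy
    ]

    for school_code, enrolment_total, age_total in records:
        difference = enrolment_total - age_total
        if difference != 0:
            print(f"{school_code}: enrolment {enrolment_total} vs age total {age_total} "
                  f"(difference {difference:+d}) - flag for follow-up")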
4.1.2. Statistics on demand for education collected through household surveys and population censuses
77. School enrolment, educational attainment (literacy level and highest educational level attained) and attendance: The Kenya National Bureau of Statistics has two publications, the “Economic Survey 2009” and the “Statistical Abstract”, wherein the enrolment figures for ECD, primary and secondary schools are published. These figures are obtained from the MoE and not through household surveys and population censuses.
The population projections by age range (ref. Revised Population Projections for Kenya: 2000–2020, CBS / MoPND, August 2006) are an important dataset for EMIS and are presented in Figure 4.7 below. The Kenya Integrated Household Budget Survey (KIHBS), 2004/05 (available from: http://www.knbs.or.ke/, 2010) includes important enrolment elements that could allow for triangulation of EMIS data. This Household Questionnaire has 23 sections and aims to provide data on socio‐economic aspects of the Kenyan population, including education, health, energy, housing, water and sanitation. Important enrolment items included in the questionnaire are province, district, division, name of school, gender and age. The education section in the Household Questionnaire also has questions on literacy levels, such as whether the respondent ever attended school and the highest school grade completed and in which year, and questions on the highest educational level attained, such as the highest vocational training completed and the highest educational qualification acquired. The education section of the KIHBS also has questions on attendance, such as how many days the school was in session over the past 2 weeks and how many days were spent attending school in the past 2 weeks.
Figure 4.7: Population projections for primary and secondary school-going age (Source: Revised Population Projections for Kenya: 2000–2020, CBS / MoPND, August 2006)
78. Educational expenditure: The above-mentioned household survey (KIHBS) also has, in its education section, detailed information on how much was spent on a particular child’s education in the last 12 months by members of the household. Expenditure includes items such as tuition fees, books and other fees, uniforms, boarding fees, transport costs, contributions for school building or maintenance, extra tuition fees, examination fees, PTA and other related fees, pocket money and shopping.
79. Population analysis: From Figure 4.8 below, projections by age group (5‐15 and 15‐19) seem to be coherent (ref. Revised Population Projections for Kenya: 2000–2020, CBS / MoPND, August 2006). However, we could not be provided with projections by single year of age, which are used for the calculation of education indicators. In addition, it is to be noted that the recent census (2009, published 2010) revealed that the projections were underestimated by around 3 million, as presented in Figure 4.8 below.
Figure 4.8: Population census for age groups 5-15 and 15-19 by year
4.1.3. Statistics on the quality of learning outcomes collected through assessments of student achievement
80. Regular programme assessment of student achievement: The Kenya National Examinations Council (KNEC) is responsible for examinations at the end of each school cycle. The Kenya Certificate of Primary Education (KCPE) and the Kenya Certificate of Secondary Education (KCSE) are the main assessments for measuring the level of attainment at the end of the primary and secondary cycles. The Council administers examinations for over 1,000,000 students annually, of which around 650,000 are for primary education and around 350,000 for post‐primary education.
81. Assessments national or international: The Service Charter (2006) of the Kenya National Examinations Council (KNEC) states that the Council was established by the government in 1980 through an Act of Parliament (CAP 225A) as a non‐profit making institution to conduct school and post‐school national examinations, except university examinations. Kenya is one of the countries in the Southern and Eastern Africa Consortium for Monitoring Educational Quality (SACMEQ) survey. SACMEQ has a dataset that contains information on school‐based surveys, including students’ tests, from the 15 countries that belong to the Consortium.
82. Assessments cover key competencies in areas of learning such as reading and writing, mathematics, and science: The published report on the year 2008 Kenya Certificate of Primary Education (KCPE) examination confirms that assessments cover key competencies in learning areas such as reading, writing, mathematics and science. Table 4.1 and Figure 4.9 indicate overall performance per subject, by gender, for the 2007 and 2008 KCPE examinations. SACMEQ also covers the reading and mathematics competencies of pupils, as presented in Figure 4.10 below. Figure 4.11 indicates the performance of Kenya relative to other countries in the mathematics and reading scores of the SACMEQ II results. There is also an annual learning assessment by UWEZO; the aim of the assessment is to document and improve competencies in numeracy and literacy among children aged 6‐16 years in Kenya (UWEZO, Annual Learning Assessment Report 2010, Summary and Key Findings, 2010).
Table 4.1: 2008 & 2007 KCPE Examination Overall Candidates Performance Per Subject By Gender (Source: KNEC 2009)

                            Mean performance (%), 2007       Mean performance (%), 2008
Subject                     All      Female   Male           All      Female   Male
1 English Objective         41.58    41.40    41.72          47.02    46.72    47.28
2 English Composition       40.48    42.15    38.98          39.68    41.11    38.40
3 Kiswahili Objective       56.60    56.56    56.66          51.02    50.07    51.30
4 Kiswahili Composition     46.00    47.75    44.45          42.45    44.33    40.78
5 Mathematics               47.16    44.44    49.58          49.24    46.00    52.14
6 Science                   55.24    52.16    58.00          59.44    55.52    62.92
7 Social Studies            61.35    58.48    63.92          60.13    56.58    63.28
8 Religious Education       60.41    58.90    61.56          58.83    58.43    59.17
Figure 4.9: KCPE Examination overall performance per subject (Source: KNEC, 2009)
Figure 4.10: Mean pupil reading and maths scores by province (Source: SACMEQ)
Figure 4.11: Mean pupil reading and maths scores by country, SACMEQ II (Source: SACMEQ)
4.2. Assessment of source data: Source data are regularly assessed and validated
83. Administrative and survey data are audited: During the field visits the interview participants confirmed that there are data quality checks by the Quality Assurance and Standards Officers at the zones. However, the regularity of these checks could not be determined. In other words, there is no certainty that administrative and survey data are regularly audited to check the accuracy of source data (e.g., through inspection of field collections, random post‐enumeration checks).
84. Training is provided: Training to improve accuracy has been insufficient over the years.
85. Students dropping out are removed from the register: Although school registers seem to constitute an accurate source of data for completing the survey instruments, issues such as birth certificates still need to be addressed (they are not compulsory and not freely available). It was confirmed by the head teachers visited that students dropping out are removed from the register.
86. Students moving or changing schools are removed from the register: It was confirmed by head teachers, and by the registers of the schools, that students moving schools are removed from the register. There is an official form used to transfer students.
87. The register includes all students currently enrolled: The school register is the official document as a record of all the students at the school.
4.3. Statistical techniques: Statistical techniques employed conform to sound statistical procedures, and are documented
88. Data compilation procedures minimize processing errors: Some cross‐table checks and data entry formats are integrated in the system. The enrolment data collected can be validated against the age data on the same survey instrument. When we compared these totals, large discrepancies were evident, as presented in Figure 4.6 above.
89. The report forms are designed in a way that makes them easy to complete: Consultation participants generally felt that the data collection instruments are relatively easy to complete and that they are not at all burdened by these processes. However, there was a concern amongst participants that insufficient training is provided on how to complete these survey forms. In addition, during the investigation it was found that key officials misinterpreted some of the tables in the survey questionnaire.
90. Enrolment data: School enrolment is collected through different survey instruments by different data producing agencies for different purposes. Both TSC and EMIS collect school enrolment data with the same survey instrument; this is an attempt to harmonise some of the many data collections. TMU also collects enrolment data, for a different purpose, namely the capitation grants.
91. A register (master list) of all schools exists: The unique identifiers for TSC and MoE are different. The database from TSC contains a table that covers all the public institutions, each with a unique identifier; Figure 3.2 above shows the fields in this table. We also obtained from the EMIS (MoE) a table of the master list with the following fields: MoE code, TSC code, KNEC code, province, district, division and zone, although the list is not yet fully functional. During the visits to district offices, participants indicated that they use the list of schools to disseminate and collect survey instruments. At this juncture it is appropriate to mention that the School Registration Unit in the Directorate of Basic Education is the most logical place to maintain and update the master list of institutions; procedures for opening and closing of institutions already exist within this unit. Currently it is a manual process, which should be changed to an electronic system. The purpose of such a system is to assign a unique identifier to every institution in the country. The basic functioning of the Institutional Unique Identifier System works in the following way (a minimal sketch of the cross-agency matching it enables is given after paragraph 92 below):
The School Registration Unit assigns each institution a unique national institutional identifier (code) that can be used to match records accurately across years.
The School Registration Unit develops procedures to ensure that two institutions are not assigned the same identifier.
The School Registration Unit, in conjunction with the national EMIS, develops a policy with procedures for the closing and opening of institutions.
The master table has a specific number of key data fields, such as a field to link EMIS data with other data, such as TSC, KNEC and census data. Other fields that could be included in this table are geographical areas (local, regional and national) and groups such as public and private.
92. The master list of TSC covers only public institutions, while the master list of institutions for MoE covers both public and private schools. It seems as if this list has not been kept as up‐to‐date as it could have been.
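As referenced above, the following minimal sketch illustrates how a master list carrying the MoE, TSC and KNEC codes for each institution would allow records from the separate agency databases to be matched. All codes and values below are invented for the example.

    # Illustrative sketch only: joining agency datasets through a master list of institutions.
    master_list = [
        {"moe_code": "MOE001", "tsc_code": "800104013", "knec_code": "K-1001"},
        {"moe_code": "MOE002", "tsc_code": "900033034", "knec_code": "K-1002"},
    ]

    tsc_enrolment = {"800104013": 512, "900033034": 348}      # keyed by TSC code
    knec_mean_score = {"K-1001": 251.4, "K-1002": 239.8}      # keyed by KNEC code

    # Join the two agency datasets through the master list.
    for school in master_list:
        enrolment = tsc_enrolment.get(school["tsc_code"])
        mean_score = knec_mean_score.get(school["knec_code"])
        print(f"{school['moe_code']}: enrolment={enrolment}, KCPE mean score={mean_score}")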
4.4. Revision studies: Revisions, as a gauge of reliability, are tracked and mined for the information they may provide
93. In general, monitoring of the agencies’ own performance and methodologies does not seem to take place to a large extent in the concerned agencies and organisations (KNBS, TSC, MoE, KNEC, CHE and TMU). Little to no evidence was found of revision studies, or of general reviews of the methodologies used. For example, there is an inconsistency in the reported number of private schools: in Nairobi this went from 154 in 2006 to 1,044 in 2007 (refer to the Education Statistical Booklet 2003‐2007), showing, first, that revisions are not tracked and, secondly, that the master list of schools is not maintained.
4.5. Synthesis and score
Based on an assessment of all the DQAF sub‐dimensions, a global score of 27% has been assigned for the accuracy and reliability dimension.
Figure 4.12: Results of accuracy and reliability (sub-dimension scores for Kenya against international norms: adequate source data 0.52, assessment of source data 0.27, statistical techniques 0.17, revision studies 0.22)
4.6. Recommendations
Nr. Recommendation Priority
4.1.1 Accuracy and reliability should be improved through improved communication between organisational units in MoE (in particular, the EMIS Unit, TMU and School Mapping), and between MoE and external bodies, such as the Kenya National Examination Council (KNEC), MoHEST and MoYS, but also UWEZO and partners such as USAID, who are running the district‐based EMIS (DEMMIS), and the World Food Programme.
High
4.1.2 Education indicators should be recalculated on the basis of the 2009 census findings, within the greater framework of the revision studies to be undertaken. It is likely that KNBS, rather than the EMIS Unit / MoE, would lead on such studies. Correct information on where Kenya stands in terms of its education indicators, such as enrolment rates, is a civic requirement for the general public and indispensable information for decision‐makers.
Medium
4.2.1 Random field checks to audit the reliability of source data are recommended, although these are already happening, to an extent, through the District Quality Assurance and Standards Officers (DQASOs). Such checks will, to an extent at least, avert the risk of over‐reporting, which leads to inflated enrolment figures as well as unjustified capitation grant expenditures (cf. 1.2.8).
Medium
4.2.2 The timeframe for considering learners early school leavers in schools (i.e. after two weeks of non‐attendance) seems slightly on the short side. A revision of this may improve the accuracy of the reporting of early school leavers.
Low
4.2.3 Greater triangulation of findings and datasets, including comparisons of MoE data with census or survey data, is highly recommended to ensure accuracy.
High
4.3.1 Pilot testing of survey instruments (questionnaires) is highly recommended to minimise errors. This should enhance the intelligibility of the existing instruments, leading to the revision of questions that are not considered clear.
High
4.3.2 Greater transparency concerning imputation methods is highly recommended (see 2.2.2).
Medium
4.4.1 Reflection on methodological approaches is recommended, so as to promote a statistical culture within relevant units at the concerned Ministries and SAGAs. One can imagine that KNBS should play a lead role in establishing this.
Medium
5. Serviceability: Statistics are relevant, timely, consistent, and follow a predictable revisions policy
The quality dimension of serviceability looks at the extent to which statistics are useful for planning or policy purposes. It refers, mainly, to the dimension of time. Data is timely when it is current or up to date as defined by the owner of the data. Data must be on time and available when it is required, otherwise the credibility of the information system diminishes. Given that data are actually accurate, it looks at the extent to which they reflect a reality either of the moment or of the past. With relevance to this, this dimension also verifies whether data are subject to a revision policy and practice.
5.1. Periodicity and timeliness: Periodicity and timeliness follow internationally accepted dissemination standards
94. Respondents are made aware of the deadlines set for reporting: During the field visits to schools and district offices there was not a schedule of the deadlines set for reporting on data collection processes (dates for the completion and return of forms, capturing of the data and verification of the data).
95. The producing agency employs rigorous follow‐up procedures to ensure the timely receipt of respondents’ data: During the consultation process, numerous examples were found showing that the data collection process is well established from institutional up to national level. Head teachers are aware of the importance of data for decision‐making. The management and implementation of the education sector is decentralised, both institutionally and in terms of decision‐making, at the following broad levels: national, provincial, district, divisional, zonal and institutional. There appears to be good collaboration and relationships between officials at provincial, district and zonal level, which contributes to the timely receipt of data. The same Quality Assurance and Standards Officers are used to check the data of EMIS and TSC, and at some institutions the same officials capture the data for EMIS and TSC. This collaboration also speeds up the data collection process and decreases response time. The communication between institutions, zones and districts seems to work well, and there seems to be good collaboration between staff of EMIS and TSC at institutional, district and zone level. The general concern among interviewed participants was the difficulty of the data capturing tool of MoE, which is in most cases the cause of delays. Capacity at national level is also a problem, in that follow‐up, meeting deadlines and controlling the data collection process are made difficult.
96. If respondents fail to submit, appropriate adjustments are made: In MoE an attempt is made to complement datasets with other sources, such as enrolment figures from TMU and population census data. However, even though this is more relevant to the integrity dimension, the way these adjustments are made is not documented.
97. Enrolments and teachers are provided no later than 4 months after the beginning of the school year: TMU is the unit responsible for the collection of the enrolment data used for the allocation of capitation grants, and here good practices already exist to obtain the school enrolment on time, so that the required figures are provided no later than 4 months after the beginning of the school year. TSC also has well established processes and procedures in place to receive the enrolment data in time for the allocation of teachers to schools; the data provided by TSC is available even less than 4 months after the beginning of the school year. The team is not sure whether the same applies to MoE: during the consultation in July the EMIS data was not available. The real problem lies with the reorganisation of the data collection process.
98. Source data on educational expenditures: Source data on educational expenditures are collected in the comprehensive annual survey questionnaire during the 3rd term collection process. Expenditure items such as textbooks, water and sanitation, salaries, electricity and water, etc. are included.
99. Preliminary datasets: It appeared that such a practice does not exist and that there are no preliminary datasets.
100. Final publications: There is an attempt to maintain periodicity of publications within the MoE. Two MoE publications could be obtained, namely Education Facts and Figures (2002–2008) and the Education Statistical Booklet (2003–2007). However, the same publications were not available for 2009, an indication that periodicity was not good. KNBS annually publishes statistics in the Statistical Abstract and the Economic Survey, which indicates periodicity that follows accepted good practices.
101. Timeliness of preliminary publications: No preliminary datasets exist. The TSC datasets were available during the field visits, while the datasets for EMIS were not. The reason for this is that there is a difficulty with the capturing tool at school and district level.
102. Timeliness of final publication: The enrolment by TSC for the allocation of teachers and enrolment by TMU for capitation grants are available in time. It is not always the case for the enrolment provided by EMIS.
103. Timeliness of international publication: Kenya’s EMIS Unit is consistently delayed by some months in terms of submission of data for international publications. This, however, can be attributed at least in part to the fact that the school year in Kenya runs from January to November, with the deadline for the UIS questionnaire being 31 March. This timeframe is usually not sufficient for data to have been finalised by the country for submission.
5.2. Consistency: Statistics are consistent within a dataset and over time, and with other major data sets
104. It largely seems to be the case that accounting identities between aggregates and their components are observed for all involved data.
105. It seems to be largely the case, in general, that accounting identities between enrolments, repeaters, drop‐outs, and demographic data are observed.
106. It also seems to be largely the case, in general, that statistics are cross‐checked across geographical areas and sub‐groups of population. However, when we analysed the age data and the data collected for gender, large discrepancies are observed as described in number 76 above.
107. Education expenditure data are collected from the same institutions for which the enrolment and teacher data are reported. However, the data collection agencies and the accompanying databases used to store these datasets are different for TSC, TMU and MoE. For example, TMU collects enrolment data for capitation grants and stores it in a different database from TSC, which collects the same data from the same institutions for the allocation of teachers. MoE collects the expenditure data from the same institutions via the comprehensive annual survey during the third term.
108. Data longitudinal coherency: The team could only find data sets for the last 3 years (2008, 2009 and 2010) in the TSC database. Although data is published for 2002 to 2008 in Education Facts and Figures, the team could not find any datasets for MoE. Their database was also difficult to access, because it was not developed according to sound relational database principles. From an analysis of the TSC primary school enrolment data for the same institutions for the years 2009 and 2010 it can be derived that consistent time data are available for this period, as presented in Table 5.1 and the graph in Figure 5.1 below.
Table 5.1: TSC primary school enrolment for consecutive years 2009 and 2010 by province (comparison of primary school enrolment for the years 2009 and 2010 for each of the provinces in Kenya)
PROVINCE          2010      2009
CENTRAL          624298    638340
COAST            581330    587334
EASTERN         1283126   1307598
NAIROBI          180763    183533
NORTH-EASTERN    114283    112471
NYANZA          1172644   1205070
RIFT VALLEY     1757492   1764783
WESTERN         1079077   1078939
TOTAL           6793013   6878068
Figure 5.1: Primary School Enrolment data from TSC database for 2009 and 2010 by Province (Source: TSC database)
Yet, when we delve deeper and compare the primary school enrolment from the TSC database for two consecutive years (2009 and 2010) by province, as indicated by the scatter plots in Figure 5.2 below, some large variances are observed which should be investigated in greater detail. Refer also to number 93 above for an example of the inconsistency in the reported number of private schools.
Figure 5.2: Comparison of primary school enrolment figures for 2009 and 2010 by province (Source: TSC database)
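To illustrate the kind of year-on-year check referred to above, the following sketch (illustrative only; it uses the provincial totals from Table 5.1, and the 5% flagging threshold is our own assumption, not a DQAF requirement) computes the relative change in enrolment between 2009 and 2010 for each province and flags large movements. Applied per institution rather than per province, the same logic would surface the school-level variances visible in the scatter plots.

    # Illustrative sketch: year-on-year consistency check on the provincial
    # enrolment totals from Table 5.1 (TSC database). The 5% threshold is an
    # assumption, not a DQAF requirement.
    enrolment = {
        # province: (2010, 2009)
        "CENTRAL":       (624298, 638340),
        "COAST":         (581330, 587334),
        "EASTERN":       (1283126, 1307598),
        "NAIROBI":       (180763, 183533),
        "NORTH-EASTERN": (114283, 112471),
        "NYANZA":        (1172644, 1205070),
        "RIFT VALLEY":   (1757492, 1764783),
        "WESTERN":       (1079077, 1078939),
    }

    THRESHOLD = 0.05  # flag provinces whose enrolment changed by more than 5%

    for province, (y2010, y2009) in enrolment.items():
        change = (y2010 - y2009) / y2009
        flag = "CHECK" if abs(change) > THRESHOLD else "ok"
        print(f"{province:<15} {y2009:>9} {y2010:>9} {change:+7.1%}  {flag}")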
109. Consistency over time: The team could only find data sets for the last 3 years (2008, 2009 and 2010) in the TSC database. Although data is published for 2002 to 2008 in Education Facts and Figures, the team could not find any datasets for MoE. Their database was also difficult to access, because it was not developed according to sound relational database principles.
110. There were no indications (refer to the example of the inconsistency in the number of private schools in number 93 above) that, when changes in source data, methodology, and statistical techniques are introduced, historical data were reconstructed as far back as reasonably possible.
111. No detailed methodological notes were found that explain main breaks and discontinuities in time data.
112. No evidence of reconciliation was found. The different existing databases seem not to be interlinked. It is hoped that this will improve with the introduction of the Foundation database. There is a need to reconcile databases both across and within data collection agencies. Each data-producing agency (MoE, TSC, CHE, TMU, MoY) has its own database, and agencies such as TSC, MoE and KNEC also each assign their own institution code.
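To indicate what such reconciliation could build on, the sketch below shows a simple school-code crosswalk table linking the separate institution codes assigned by TSC, MoE and KNEC to a single master identifier. This is a minimal sketch with hypothetical table and column names, not an existing MoE or KNBS structure.

    # Hypothetical sketch of a school-code crosswalk supporting reconciliation
    # across the TSC, MoE/EMIS and KNEC databases. All names are illustrative.
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("""
        CREATE TABLE school_crosswalk (
            master_id   INTEGER PRIMARY KEY,
            school_name TEXT NOT NULL,
            tsc_code    TEXT UNIQUE,   -- code used in the TSC database
            moe_code    TEXT UNIQUE,   -- code used in MoE / EMIS returns
            knec_code   TEXT UNIQUE    -- examination centre code used by KNEC
        )
    """)
    con.execute(
        "INSERT INTO school_crosswalk VALUES "
        "(1, 'Example Primary School', 'TSC-0001', 'MOE-0001', 'KNEC-0001')"
    )

    # A dataset keyed on one agency's code can then be joined to data keyed on
    # another agency's code via master_id.
    print(con.execute(
        "SELECT master_id, tsc_code FROM school_crosswalk WHERE moe_code = ?",
        ("MOE-0001",),
    ).fetchone())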
5.3. Revision policy and practice: Data revisions follow a regular and publicized
procedure
113. Given that there are or have been very little, if any, revisions, it is not surprising that
no documentation to this end is included in the publication of the statistical data.
5.4. Synthesis and score
Based on an assessment of all the DQAF sub‐dimensions, a global score of 22% has been assigned for the serviceability dimension.
Figure 5.3: Results of serviceability (Kenya against international norms: Periodicity and timeliness 0.22; Consistency 0.26; Revision policy and practice 0.17)
5.5. Recommendations
Nr. Recommendation Priority
5.1.1 The issuance, dissemination and promotion of a data schedule for relevant surveys is highly recommended to define expectations (i.e. deadlines for the submission and processing of relevant data) and to improve the timely returns of data.
High
5.1.2 Integration of the TMU survey with the general EMIS is recommended, so as to incentivise timeliness of the EMIS survey (see 1.1.7 and 4.1.1).
Medium
5.1.3 While there are some education statistics for Kenya publicly available, there remains much to be improved in terms of the periodicity of relevant publications. For example, an Education Facts and Figures publication is available for the period 2002 – 2008, as well as an Education Statistical Booklet for the period 2003 – 2007, but no such publication is available for 2009. It is highly recommended that an annual ‘paper’ publication be instituted.
High
5.2.1 Harmonisation of existing databases and procedures, initiated by improved coordination and communication between the different bodies involved, should help improve the internal consistency of data across organisational units (see 1.2.3).
High
5.2.2 Recalculations of historical data (at least five years back) should be conducted, in particular in light of the new population data provided by the 2009 census (see 4.1.2), and the methodologies underpinning these should be documented.
Medium
6. Accessibility: Data and metadata are easily available and assistance to users is adequate
This dimension is based on the principle that data and metadata should be presented in a clear and understandable way and should be easily available to users. Metadata should also be relevant and regularly updated. In addition, assistance to users should be available, efficient and performed in a reasonable time frame.
6.1. Data accessibility: Statistics are presented in a clear and understandable manner, forms of dissemination are adequate, and statistics are made available on an impartial basis
114. There is a positive attempt to publish the education data with accompanying charts and tables. The following statistical publications were obtained: Education Statistical Booklet 2003‐2007 (graphs and tables); Education Facts and Figures 2002‐2008 (tables only); The Little Fact Book of the Socio‐Economic and Political Profiles of Kenya's Districts, Institute of Economic Affairs (tables only); Statistical Abstract 2007, 2008, 2009, KNBS (tables only); Economic Survey, KNBS (tables only).
115. The publications as indicated above provide education data per province and district by gender and year and the data seem relatively well‐presented.
116. It seemed that there is no analysis of current period estimates available. The data are mostly used for budget and reporting purposes. Little data analysis was evident during the visits and consultation participants generally expressed a need for capacity development in data analysis. Figure 6.1 indicates the distribution of enrolment per province. This is an example of how analysis can be used to increase the value of the data collected through the survey instruments.
Figure 6.1: Distribution of 2010 enrolment by province (Source: TSC database)
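As a simple illustration of the kind of analysis meant here, the sketch below derives the provincial distribution of enrolment from the 2010 column of Table 5.1. It is an illustrative calculation only, not the output of any existing MoE or TSC tool.

    # Illustrative sketch: share of total 2010 primary enrolment by province,
    # computed from the 2010 column of Table 5.1 (TSC database).
    enrolment_2010 = {
        "CENTRAL": 624298, "COAST": 581330, "EASTERN": 1283126,
        "NAIROBI": 180763, "NORTH-EASTERN": 114283, "NYANZA": 1172644,
        "RIFT VALLEY": 1757492, "WESTERN": 1079077,
    }

    total = sum(enrolment_2010.values())
    for province, pupils in sorted(enrolment_2010.items(), key=lambda kv: -kv[1]):
        print(f"{province:<15} {pupils:>9} {pupils / total:6.1%}")
    print(f"{'TOTAL':<15} {total:>9}")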
117. Data is collected per school by different sub‐components (e.g. by gender, by level of education, by age, private and public, full‐time and part‐time) as indicated by the data collection processes in Table 3.3 above. The data is available in the same format for the intended users and purposes. There is even an attempt to make it available to the public via the education website in this format.
118. Although KNBS has an advance data release calendar on their website, the Education sector is not included in this calendar. KNBS also has a catalogue of publications. During the field visits the team did not encounter anything that indicated that this practice might exist for education data.
119. Data on enrolment and teachers could be obtained from the TSC database for 2008, 2009 and 2010. The EMIS system is not user friendly, which makes data validation and verification processes difficult and is often the reason that data capturing is not completed at institutional level. EMIS data could not be accessed via an electronic database. During our visit to a school there was difficulty in accessing the EMIS database. Even the data on the education website could not be accessed; it displayed the error message “File or directory not found”. The Foundation database of MoE is an attempt to make data accessible in a structured way, through self-selection queries, for key users in future. In a discussion with ICT staff of KNBS they also indicated that, as the custodian of the data in the country, they have the same vision of making data electronically available to the public. In both cases the system is not yet operational or implemented.
120. Education data are published in the Education Statistical Booklet and Education Facts and Figures, but this is hampered by the fact that periodicity is not good. Education data are also published yearly in publications such as the Statistical Abstract and the Economic Survey.
121. The statistical data is not released according to a pre‐announced schedule.
122. There is no pre‐announced schedule available with dates indicating the release of the statistical data to users.
123. The press is not briefed in advance of publication releases.
124. In discussion with interview participants it was ascertained that non‐published (but non‐confidential) datasets are made available upon request; however, in education no records are kept of these requests.
125. The practice of making non‐confidential micro‐data files (e.g. with information permitting the identification of individual respondents removed) available seems to exist in KNBS and MoE. KNBS has a data release and dissemination policy (Data Access and Dissemination Microdata Release Policy). The aim of such a policy is to define the nature of the anonymized micro‐data files that will be released (refer to number 44 above for further detail).
126. The terms and conditions under which non‐published statistics and data are made available are not published.
6.2. Metadata accessibility: Up‐to‐date and pertinent metadata are made available
127. There is a data dictionary available for the Foundation database, and EMIS has a comprehensive data collection manual in which the operational terms are defined. However, we are not sure how widely this document is distributed or implemented.
128. No other metadata, such as information related to biases in the data, response rates, or comparisons with other data sources, seem to be disseminated.
129. The General Data Dissemination System (GDDS) summary methodologies and other related metadata have not been identified.
130. A Data Analysis and Reporting Training Manual exists, prepared by the MoE. We are not sure how widely this manual is used by, or distributed to, education data users. KNBS has a catalogue (39 pages), an ad‐hoc publication which presents descriptive information on KNBS publications, including the following details: the title; a short summary of each publication; the unit price/method of distribution; the number of pages; whether it is available in other formats; the international standard book number where necessary; the KNBS website link through which all other KNBS publications can be accessed; and a list of geographic information maps of the 1999 Census.
6.3. Assistance with the users: Prompt knowledge support service is available
131. All publications (print and website) provide contact details (mail, email, facsimile or telephone).
132. The EMIS analysis and reporting manual compiled by the EMIS unit is an attempt to educate users in the use of education datasets.
133. While MoE recognised the need, in principle, to monitor support to users, for example by means of a survey of users, this was indicated, for the moment, not to be a priority.
6.4. Synthesis and score
Based on an assessment of all the DQAF sub‐dimensions, a global score of 28% has been assigned for the accessibility dimension.
Figure 6.2: Results of accessibility (Kenya against international norms: Data accessibility 0.28; Metadata accessibility 0.22; Assistance with the users 0.33)
6.5. Recommendations
Nr. Recommendation Priority
6.1.1 The publication of an education statistics database online (e.g., for indicators, KenInfo) is encouraged. Such an online database could contain a Master List of schools, as mentioned under 3.5.1, which may greatly improve transparency as to which schools are known, and registered, in the country.
Low
6.1.2 Greater coordination between KNBS and line Ministries as well as SAGAs, concerning the publication of education statistics (MoE, MoHEST, MoYS, CHE, TSC, KNEC), with a leadership role for KNBS, is recommended.
High
6.1.3 A schedule for data release by KNBS should integrate, where feasible, MoE statistical publications.
Medium
6.2.1 Improvements can be made in the publication of metadata across the board. While KNBS publishes metadata, MoE does not. A brochure aiding analysts, and other users of data, should help in understanding the status of published figures.
Low
6.3.1 While there exists a service at the EMIS Unit to assist users of statistics, the existing capacity is clearly not sufficient to deal with requests satisfactorily, leaving even less time for professional development. Provisions should be implemented to ensure capacity in assisting users at EMIS Unit.
Medium
7. Conclusion and overall recommendations
Figure 7.1: Overall results (Kenya against international norms across the six DQAF dimensions: Pre‐requisites of quality, Integrity, Methodological soundness, Accuracy and reliability, Accessibility, Serviceability)
a) Based on an assessment of all the six DQAF dimensions, a global score of 33% has been assigned for the overall system currently in place for the collection, processing, analysis and dissemination of Kenyan education statistical data at the national and sub‐national levels.
b) While this figure suggests that the statistical system for education data in Kenya is not, at the moment, in an ideal state, it should be emphasised that strengths as well as possible points for improvement were encountered. These were discussed in the report.
c) Specific recommendations, pertaining to the different dimensions and sub‐dimensions of the DQAF, were given at the end of each chapter. These have the potential to result in a concrete action plan, to be developed in coordination with the Kenyan development partners and national authorities, aiming to address existing points for improvement.
The following points highlight the overall findings and accompanying high priority recommendations (with reference to more specific recommendations) for improving the Kenyan system for educational statistics.
Statistics is not only about computers and processes. We want to emphasize that statistics is more than just a technical solution: the important aspects of legislation, governance, regulation, resources, people, systems and processes should all be part of such a process. The recommendations are based on these aspects, in support of government decision making and service delivery.
Nr. Recommendation Ref.
O.1 There exist different data collection cycles and databases in the country, both within government (within the Ministry of Education (MoE), and in other Ministries and Semi‐Autonomous Government Agencies (SAGAs)) and among Non‐Governmental Organisations (NGOs) and other partners. Greater harmonisation between these systems should be sought. For this, the establishment of a solid main system is a pre‐requisite. Data collection cycles should be reviewed with a view to enhancing timeliness, in particular.
1.1.7
3.4.1
4.1.1
5.1.1
6.1.2
O.2 Greater organisational clarity should be sought in terms of education statistics. At the moment, the Education Management Information Systems (EMIS) Unit is placed within the Central Planning Unit (CPU) of the MoE, which is itself an extension of the Ministry of Planning (MoP). The Head of Unit, however, is affiliated with the Kenya National Bureau of Statistics (KNBS), which has recently detached itself from MoP. These arrangements can be improved in terms of clarity. The KNBS could head a data quality group to coordinate the implementation of at least some of these recommendations.
1.1.1
1.1.2
O.3 The Kenya Education Sector Support Programme (KESSP) is understood to underpin the EMIS, but this is not itself a line‐function. For example, there exists an EMIS Investment Programme (IP) within KESSP, while there exists a Monitoring & Evaluation (M&E) Unit within the CPU as well. These entities are headed by different persons and arrangements for coordination between them seem informal. Streamlining should take place at this level. Adequate funding needs to be addressed as well.
1.2.6
O.4 The current EMIS software application is dysfunctional. To solve this, a Foundation Database is being elaborated, with the support of DFID. The UNESCO Institute for Statistics system (StatEduc) is intended to be customized and deployed at the decentralised level. Now that official approval has been requested, and granted, it is recommended that work in this direction be continued. It is nevertheless essential that basic pre‐requisites related to unique school identification be addressed (there are currently three unique identifiers for each institution, namely a code for TSC, for MoE and for KNEC).
1.2.3
3.5.1
5.2.1
O.5 The EMIS Unit in MoE is insufficiently staffed for its tasks. It is recommended that at least one to two more planners join this Unit, possibly from the Provincial Offices. Improved office space is also essential for the effective functioning of the Unit.
1.1.5
1.2.1
O.6 A global inventory of information/data requirements across relevant bodies and KESSP IPs is highly recommended. This should be led by either the M&E Unit at the CPU or by the M&E IP in KESSP.
3.5.3
O.7 Functioning connectivity at district and province levels should be considered critical to enabling an effective EMIS.
1.2.2
O.8 Training in generic ICT skills, as well as in data analysis is urgently required, in particular at the district level. It should be targeted in such a way as to ensure capabilities are durably developed.
1.2.4
If Kenya can address the aforementioned points, the country will be able to make great strides towards an effective and efficient system for producing high quality education statistics, pertaining to all dimensions of the DQAF. An effective and efficient statistics system and the availability of quality statistics will support strategic planning and decision‐making.
APPENDIX A: List of relevant references and documents
Doc. Nr. Description Medium Year Issuer
1 Strategic Plan (2010‐2015) Document Jul 2010 CHE
2 The Little Fact Book: The Socio‐Economic and Political Profiles of Kenya’s Districts Publication Apr 2002 Institute of Economic Affairs
3 Kenya Education Sector Support Programme 2005‐2010: Delivering Quality Education and Training to All Kenyans Report Jul 2005 MoE
4 Draft Kenya Education Sector Support Programme (KESSP) 2005‐2010: Investment Programme Targets 2007‐2010 Report Jan 2008 MoE
5 Revised Youth Polytechnics Curriculum Report Jul 2010 MoY
6 The Little Fact Book: The Socio‐Economic and Political Profiles of Kenya’s Districts Publication Apr 2002 Institute of Economic Affairs
7 Education Statistical Booklet 2003‐2007 Publication ? MoE
8 Education Facts and Figures 2002‐2008 Publication Jun 2009 MoE
9 Economic Survey 2009 Publication 2009 KNBS
10 Statistical Abstract 2007 Publication 2007 KNBS
11 Are our Children Learning? Annual Learning Assessment Report Kenya 2010 Publication 2010 UWEZO
12 The Year 2008 KCPE Examination Report with Questions & Answers to the Objective Questions Publication 2009 KNEC
13 TSC Database Database 2010 TSC, MoE
14 Foundation Database Database 2010 EMIS, MoE
15 The Kenya National Examinations Council: Service Charter Publication Jun 2006 KNEC
16 World Bank Needs Assessment Report for the Establishment of Education Management Information Systems (EMIS) Report Mar 2005 MoE
17 Data Access And Dissemination Policy Policy Nov 2008 KNBS
18 Catalogue of KNBS publications Publication Jan 2009 KNBS
19 Statistical Act Act Aug 2006 Parliament
20 Education Act Act Parliament
21 Public Officer Ethics Act Act
22 KESSP: EMIS‐Revised‐Chapter 22nd (V2) Report May 2010 EMIS, MoE
23 Primary Schools Data Returns Form 1st Term 2010 Questionnaire 2010 EMIS, MoE
24 Secondary Schools Data Returns Form 1st Term 2010 Questionnaire 2010 EMIS, MoE
25 Non‐Formal Data Returns Form 1st Term 2010 Questionnaire 2010 EMIS, MoE
26 Adult Education Data Returns Form 1st Term 2010 Questionnaire 2010 EMIS, MoE
27 Teacher Training Colleges Annual Data Returns Form 2010 Questionnaire 2010 EMIS, MoE
28 Revised Primary Schools Data Returns Form 2nd Term 2010 Questionnaire 2010 TSC, EMIS
29 Form A: DEMMIS Registration – Primary Schools and ECD Units Questionnaire 2010 DEMMIS, MoE
30 Commission for Higher Education University Statistics Questionnaire Questionnaire 2010 CHE
31 Southern Africa Consortium for Monitoring Education Quality (SACMEQ) II Database SACMEQ II
32 Kenya: ICT in Education Situational Analysis, Dr. Patti Swarts and Esther Mwiyeria Wachira Report Sep 2009 GeSCI
33 The 2005/06 Kenya Integrated Household Budget Survey (KIHBS) First Quarterly Report Report Aug 2005 KNBS
34 Kenya ISCED Mapping Report 2007 UIS
35 Are Our Children Learning? Annual Learning Assessment Kenya 2010, Report: Summary and Key Findings 2010 UWEZO
36 Are Our Children Learning? Annual Learning Assessment Kenya 2010, Report 2010 UWEZO
37 http://www.uis.unesco.org Website 2010 UIS
38 www.education.go.ke Website 2010 EMIS
39 www.examscouncil.or.ke Website 2010 KNEC
40 www.knbs.or.ke Website 2010 KNBS
41 www.sacmeq.org/education‐kenya.htm Website SACMEQ
APPENDIX B: Kenya DQAF – schedule
Morning Afternoon
Mon - 19 Jul 09:45 EMIS Unit: Charles, Tony, Anne
14:00 MoE: DPP, Mr. Magochi
16:00 MoE: DCE, Ms. Koori
Tue - 20 Jul
09:00 TSC: Eva and others
12:30 ICT: Ndoria
14:30 MoHEST & Directorate of Research: Nyangate, Mavisi
16:45 MoHEST: Waweru (TIVET)
Wed - 21 Jul
10:00 KNEC: ICT & National Assessment Team, Mr. Keith Maleche / Ogle
12:00 MoE: Grace (KESSP) & Peris (M&E) Review
Thu - 22 Jul
09:00 MoY: Directorate Youth Training: Ms. Mary Cherono
12:00 Directorate for Population and Social Statistics, KNBS: Dr Opiyo 14:30 Grace Igweta / Rene McGuffin (WFP)
Fri - 23 Jul 9:00 CHE (Ndoria), Ms. Beatrice Muganda Review
Mon - 26 Jul
9:00 Directorate of ICT (KNBS): Mr. Kiio, Kipruto, Mumo
10:30 KNBS: Mr. Kilele (DG) Naivasha: DO, Pri School (Gituamba Co-Ed)
Tue - 27 Jul Kakamega: PO, DO, Sec School (Matende Girls)
Wed - 28 Jul
10:00 Sara (WERK / UWEZO)
12:00 MoE: DPP + team 14:00 Senior official MoE: PS
Thu - 29 Jul
10:00 MoE: Registration Dept.
12:00 MoE: TMU 15:30 MoE: EMIS Unit
Fri - 30 Jul 8:00 Gitonga (USAID) Wrap-up / initial analysis
APPENDIX C: List of Persons Met
Name Org./Div./Post Contact Remarks
Charles Obiero EMIS [email protected] Head of EMIS
Anne Nduku EMIS [email protected] EMIS Consultant
Tony Shikali ICT [email protected]
Areba Nyangate MoHEST [email protected] SDDE
Richard Mavisi Liahona MoHEST [email protected] Principal Research Officer
Samuel K. Waweru MoHEST [email protected] Chief Technical Education Officer
Ivy Obonyo TSC [email protected]
Florence Mwende TSC [email protected]
Eva Msagha TSC [email protected] Ag. DS (ICT)
Mr Ndoria TSC ICT
Paul Wasanga KNEC CEO
Joyce Sabari KNEC [email protected] Exams, Research Division
Grace Ngaca MoE
Dinah Mwinzi MoY [email protected] Director
Maria Cherono MoY [email protected] Deputy Director
Dr Collins Opiyo KNBS [email protected] Director Population & Social Stats
Grace Tgweta WFP
Beatrice Muganda CHE [email protected] Research coordinator
Ndoria Ngari CHE [email protected] CHE
Cleophas Kiio KNBS [email protected] Director: ICT
Matthew Aboka MoE [email protected] DEO: Naivasha District
Kaman Njenga MoE [email protected] E.O: Naivasha District
Mary Mukundi MoE Head Teacher: Gituamba Primary
J. Ochongo MoE [email protected] PQASO: Western Province
George Lutomis MoE [email protected] DPQASO: Western Province
Immaculate Obari MoE [email protected] Kakamega Central District: DQASO
Biece Kavere Mugita MoE Kakamega Central District: Secretary
Karen R. Sisia‐Mayabi MoE [email protected] Principal: Matende Girls Secondary
Dr Sara Ruto UWEZO [email protected] Country coordinator
Conrad Watola UWEZO [email protected] Data Analyst
Daniel Wesonga UWEZO [email protected]
Nesmus M. Kiminza MoE [email protected] Senior Deputy Director
Charles Kataka MoE [email protected] School Registration: Assistant Director
Karami Buyabo Nyukuri MoE [email protected] Senior Deputy Director: Records Management
Charity Nyaga MoE [email protected] TMU: Senior Assistant Director of Educ
Tony Shikali MoE [email protected] ICT database manager
Christine Obester USAID [email protected] Education Officer
Francis Gitanga USAID