
UNICEF
USSC Somalia Support Centre

Primary Education Survey
Evaluation Report
Somalia

Supported by the Strategic Partnership for Recovery and Development in Education in Somalia
UNICEF-DFID-UNESCO

August 2008

Dr Edward Redden, Consultant


PES Evaluation Report, 2008

ACKNOWLEDGEMENTS

The nature of this assignment required assistance from many people involved in the education sectors of Somalia. This assistance was always willingly given and I am grateful for that support.

Minister Hassan Haji Mohmoud of Somaliland and Minister Mohamud Bile Dubbe of Puntland both gave freely of their time and provided the support of their senior officers. I would also like to express gratitude to the Directors General of the two northern Zones and the various directors whom I have had the pleasure of interviewing and who shared their experiences of the Primary Education Survey.

The regional education officers and enumerators who attended a workshop in support of this evaluation provided wonderful insight into the education processes in their regions and demonstrated great understanding of the challenges ahead. They provided a number of very useful recommendations that have been embedded in this report.

The UNICEF officers in both Hargeisa and Garowe provided access to a range of officials within the Ministries and also gave valuable descriptions of their field experiences during implementation of the survey project. I offer my sincere thanks to these staff members. The UNICEF officers in the central south zone provided information via email owing to the insecurity in their areas. Their promptness in replying to my inquiries was always much appreciated.

Members of NGOs, UN agencies and donors gave their time both to contribute data for the evaluation and to comment on the draft report. Their contributions were significant and valued.

Finally, I am indebted to the staff of UNICEF Somalia in Nairobi, who provided the necessary infrastructure support and technical and cultural guidance. Woki Munyui, Maulid Warfa and Catherine Remmelzwaal also provided critical comment on early drafts of the Report that contributed significantly to the development of the evaluation.

Dr Edward Redden
25/8/08


ACRONYMS

AIR      Apparent Intake Rates
ECD      Early Childhood Development
ECCE     Early Childhood Care and Education
EFA      Education for All
EMIS     Education Management Information System
EU       European Union
FTI      Fast Track Initiative
GER      Gross Enrolment Ratio
G1ECD    Percentage of Grade 1 students who have attended Early Childhood Education
MDG      Millennium Development Goals
MoE      Ministry of Education
NER      Net Enrolment Ratio
NIR      Net Intake Rate
NGO      Non-Governmental Organisation
PAE      Primary Alternative Education
PES      Primary Education Survey
REO      Regional Education Officer
SPSS     Statistical Package for the Social Sciences
UNESCO   United Nations Educational, Scientific and Cultural Organization
UNICEF   United Nations Children’s Fund
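The enrolment-ratio acronyms above (GER, NER) follow the standard EFA definitions. As a point of reference, a minimal sketch of the two calculations, using invented figures rather than PES data, might look like this:

```python
def gross_enrolment_ratio(total_enrolment, official_age_population):
    # GER counts all enrolled pupils, regardless of age, as a percentage of
    # the population of official primary-school age. It can exceed 100%
    # where over-age or under-age pupils are enrolled.
    return 100.0 * total_enrolment / official_age_population

def net_enrolment_ratio(official_age_enrolment, official_age_population):
    # NER counts only pupils of official primary-school age, so it is
    # capped at 100% and is always less than or equal to the GER.
    return 100.0 * official_age_enrolment / official_age_population

# Invented illustrative figures for one region:
print(gross_enrolment_ratio(12_000, 40_000))  # 30.0
print(net_enrolment_ratio(9_000, 40_000))     # 22.5
```

The gap between the two ratios is itself informative: a GER well above the NER signals large numbers of over-age or under-age enrolments.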


CONTENTS

EXECUTIVE SUMMARY
THE PRIMARY EDUCATION SURVEY
EVALUATION OBJECTIVES
CONCEPTUAL FRAMEWORK
RESEARCH QUESTIONS AND METHODOLOGY
RESULTS
A. EFFICIENCY
   1 Validity
   2 Training
   3 Enumerators and Supervisors
   4 Timing
   5 Data Duplication
B. EFFECTIVENESS
   1 Utility: Current and Potential
   2 EMIS Tools
   3 Ownership
C. RELEVANCE AND APPROPRIATENESS
   1 Data Required for an EMIS
   2 Reporting
D. IMPACT
E. SUSTAINABILITY
F. COVERAGE
G. COORDINATION
H. CONCLUSION: FUTURE STEPS TOWARDS AN EMIS
REFERENCES
APPENDICES
   Appendix 1 Summary of Recommendations
   Appendix 2 Terms of Reference
   Appendix 3 EMIS Conceptual Framework
   Appendix 4 Research Questions and Data Collection Planner
   Appendix 5 List of Interviewees
   Appendix 6 EMIS Development Planning Matrix
   Appendix 7 Evidence Based Planning
   Appendix 8 Glossary



EXECUTIVE SUMMARY

1. The Primary Education Survey represents a significant achievement for the education sector of Somalia. The cooperation between three Ministries of Education, UNICEF, DFID and UNESCO, demonstrated in the instigation and continued development of the Primary Education Survey (PES) process, reflects a commitment and dedication to the children of Somalia.

2. The PES reports are well respected within the education sector and the data is widely used in a range of ways by agencies active in the sector. The MoEs have expressed interest in taking greater control over the process and in expanding and improving planning processes using evidence-based approaches.

3. The census approach to the Survey should be continued and developed over time into a full EMIS. The EMIS aspired to is "an organized group of information and documentation services that collects, stores, processes, analyses and disseminates information for educational planning and management." As such it is more than a data storage system: it integrates system components of inputs, processes, outputs and reporting in such a way that accuracy, flexibility and adaptability, and efficiency are maximized.

4. Views received from the field indicate that the validity of the data is quite sound, although some recommendations have been made to improve data collection processes. These are based upon developing an audit trail so that greater accountability can be achieved.

5. Enumerators and supervisors should be assessed during training, and only those demonstrating competency in the required skills and appropriate ethical values should be employed. Additionally, an appraisal of field performance should be undertaken following the data collection and cleaning process.

6. The training for enumerators and supervisors should be expanded to include a number of new training modules and methodologies. The training should allow the demonstration of competence and include a greater emphasis on understanding the technical skills being developed.

7. Enumerators and supervisors should be subject to a transparent selection system conducted jointly by implementing organizations. Rotation of staff supervision arrangements across regions would improve supervision objectivity.

8. Once an effective quality assurance process is in place, responsibility for data collection should be shifted to REOs and head teachers following adequate training.

9. The timing of the data collection should be shifted to the earliest practical time in the school year; late September or early October should be considered. This would allow some data reports to be available to MoEs, REOs and school communities during December, and would help avoid the current need for data duplication. As the system develops there will be a need for supplementary data to be gathered on a term or semester basis.

10. EMIS tools are only marginally useful under current practice and do not appear to be linked to the PES data collection process. They should be developed as documentary evidence for the data being collected, and hence contribute to the validity of the system by allowing systematic checking of summary data. They will also contribute to the supplementary data collection process when it is developed. More training and supervision in the tools' use, and simplification of the tools, are recommended.

11. The plan in 1997 was to develop the PES processes and transfer responsibility for implementation to the Ministries. Apart from the data collection process, this has not been achieved.


12. Capacity building needs to be seen as more than training. At least three components are required: training, experience and ongoing support. A model needs to be built to ensure these three components are in place as MoEs accept responsibility for additional components of the PES process.

13. The current PES reporting process does not make full use of the flexibility offered by storing the data in an Access database. It also targets a very small range of stakeholders, which leaves the REOs and school communities unable to use the data it provides.

14. At a minimum, the following set of reports should be developed and distributed:

a. A school report, provided within 3 months of data collection.
b. District reports that reflect education conditions within the district generally and allow schools and CECs to benchmark themselves, and district officers to determine priorities.
c. Regional reports that reflect education conditions across regions and allow for comparison of progress between regions.
d. Zone and Somalia compact reports that provide a summary of major EFA indicators and trends disaggregated by region, gender and rural/urban.
e. A set of thematic reports that would change from year to year and would provide deeper analysis of EMIS data on issues of specific relevance.
f. The capacity to interrogate the database using the query module of Access, available at Zone level for generating specialized reports for stakeholders.
g. A software-based front end that allows exploration of the database by non-technical people.

Some suggestions for detailed reporting are included in Appendix 7 of the report.
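To illustrate the kind of ad hoc interrogation envisaged for the Zone level, the sketch below runs an Access-style SQL query against an in-memory SQLite stand-in for the PES database. The table layout, region names and figures are invented for the example and are not the actual PES schema:

```python
import sqlite3

# In-memory stand-in for the PES Access database; schema is hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE enrolment (
    region TEXT, gender TEXT, pupils INTEGER)""")
conn.executemany(
    "INSERT INTO enrolment VALUES (?, ?, ?)",
    [("Awdal", "F", 3200), ("Awdal", "M", 4100),
     ("Bari",  "F", 2700), ("Bari",  "M", 3500)])

# A typical specialized report: enrolment by region, disaggregated by gender,
# of the kind the Access query module could generate for stakeholders.
rows = conn.execute("""
    SELECT region, gender, SUM(pupils) AS total
    FROM enrolment
    GROUP BY region, gender
    ORDER BY region, gender""").fetchall()
for region, gender, total in rows:
    print(region, gender, total)
```

The same grouped query pattern extends directly to the rural/urban and trend disaggregations listed under item (d).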

15. A set of steps toward the development of a full EMIS is outlined, with a time frame of at least 5 years indicated. The development of the EMIS is seen as a transition process and is based on using current PES processes as a starting point. Some key elements of the plan are listed here:

a. An EMIS unit with adequate staffing levels, including technical assistance, needs to be established. The MoEs must demonstrate commitment to the process by allocating some resources to the unit, with a plan to increase this allocation as resources become available. The salary and conditions of employment must be such that staff are retained as their capacity is developed.
b. A set of nine skill areas required for EMIS implementation is identified that provides a basis for a capacity development plan.
c. A set of EMIS components is outlined that would be developed within each ministry sequentially. An immediate start with data cleaning and entry by MoEs is recommended.
d. With the ongoing support of technical assistance, a rolling plan should be developed for EMIS development, subject to an annual review focused on redefining objectives in light of the previous year's progress.
e. The expansion of data is recommended in line with EFA and MDG reporting requirements and in regard to local contextual needs. A three-stage data expansion model is suggested. This is reproduced below under the headings of Immediate, Next phase and Aspirational:

Immediate
• Enrolment by age, grade and gender
• Drop-out and repetition
• Secondary school data
• Condition of buildings/rehabilitation needs

Next phase
• Background ECD of Grade 1 enrolments
• Non-formal education
• Qur’anic schools/early childhood
• Textbook numbers

Aspirational
• Emergency preparedness
• Outcome/student achievement data
• Child friendliness of schools
• Teaching and learning resources

f. Data from the EMIS should be systematically integrated into mapping software. Such a synthesis will also facilitate cross-cutting views by allowing assessment to include education conditions alongside emergency, WASH and health conditions. The DEVINFO format of reporting can also be supported.
g. A number of possible threats to sustainability have been listed. Careful planning will be required to reduce the risk these threats pose.



THE PRIMARY EDUCATION SURVEY

The Primary Education Survey (PES) has been conducted almost every year for the last nine years. However, no formal evaluation of the Survey has ever been carried out. The original purpose of the survey was to fill a critical data gap in the education sector and to meet the information needs of the Ministries of Education (MoEs), national and international development stakeholders, as well as donors, with an independent, reliable and routine source of data for trend analysis and programme planning. The PES has evolved over the years, with additional data elements added to meet the growing needs of education stakeholders. From the beginning, UNICEF clearly stated its intention to build the capacity of MoE officials so that the MoEs could take full responsibility for the design and management of the PES. The PES was in many ways only meant to be an interim step on the road to establishing a fully operational Education Management Information System (EMIS). (PES Evaluation TOR 2008)

The ultimate goal of building such an effective EMIS is the establishment of a reputable system, and of expertly trained education officials at central, regional and district levels, to ensure the efficient flow of data in order to facilitate the monitoring of service provision as well as providing critical information to educational planners and policy-makers.

After nine years, the Primary Education Survey remains the most up-to-date and reliable source of data on the status of education in Somalia. However, for the PES to maintain credibility and to ensure the validity of the data collected, an evaluation of the current and past processes involved in this important routine education survey, as well as an assessment of its utility, is considered necessary. The evaluation also examines the relationship between the need for a sustainable EMIS and the future role of the PES.

The Strategic Partnership in Education (DFID-UNICEF-UNESCO) intends to fund the next annual education survey. However, prior to doing so, it is deemed important to evaluate the current annual process of data collection and dissemination. Such an evaluation is fully justified in view of the fact that without a fully functional and efficient management information system, it will be difficult to bring about any reform in basic education in Somalia. To date, the PES has played a crucial role in maintaining the availability of a reliable data source. How the effectiveness of this role can be strengthened, and its relationship to the establishment of a broader EMIS, will depend on the outcomes and recommendations of this independent evaluation.

EVALUATION OBJECTIVES

The scope of the proposed evaluation covers the two routine annual surveys of primary education in Somalia. These are known as the Primary Formal Education (PFE) survey and the Primary Alternative Education (PAE) survey. Together they are referred to as the Primary Education Survey (PES). The evaluation focuses on the main aspects of the surveys, i.e. data collection and management as well as information dissemination and utility. In addition, it looks at the performance of the agency in supporting this recurring activity and the collaboration with its partners.

The overall objectives of the evaluation are:

• To assess the current and past design, planning and management of the key processes and methods that have been employed to conduct the PES since its inception
• To assess the degree to which the PES meets the real data and analysis needs of the range of education stakeholders in Somalia
• To assess the role of the PES in relation to the nascent EMIS and review the current linkages
• To assess the need for expanding the breadth and depth of the information gathered and, in particular, to review the idea of integrating the collection of secondary school data
• To assess the mechanisms for ensuring internal validity and reliability of the data presented in the PES
• To assess the degree to which training of MoE officials in data collection and management has been successfully and appropriately implemented
• To assess the utility of the PES in Somalia, in particular the benefits of its results and outputs to the intended users
• To identify best practice, innovative interventions and shortcomings in the current process of planning and implementing the annual survey
• To make recommendations and suggestions on possible improvements related to all aspects of the routine survey, including, if necessary, ways of increasing levels of participation, ownership and utility of the survey/data by the MoE authorities and other stakeholders
• To make recommendations with respect to the evolving relationship between the current/future PES and the nascent EMIS

CONCEPTUAL FRAMEWORK

The PES as currently conceived has many of the classic attributes of a survey, in that it is a set of data collected by independent enumerators at a single point in time. The data is aggregated centrally, subjected to some analysis and placed in a single report for distribution. The implementation of the survey on an annual basis has allowed trend analysis to be undertaken. However, these annual implementations are a set of independent events, not an annual upgrading of a central database. Importantly, the PES has not used the common strategy of sampling from the population; rather, it is a census that collects data from all schools in the targeted population. This element, along with the centralised aggregation of data, is a feature that the PES has in common with an EMIS. However, an EMIS has some important features that distinguish it from a survey. A full description of an EMIS is set out in Appendix 3. Here, a brief extract from that appendix is presented to define what is understood to be an EMIS in this report.

To clarify understanding of what an EMIS is, and how it can assist in developing an education infrastructure, the definition and conceptual framework of an EMIS have been adapted from work undertaken by UNESCO (1998, Bangkok). This conceptualization defines an EMIS to be "an organized group of information and documentation services that collects, stores, processes, analyses and disseminates information for educational planning and management" (p. 2). As such it is more than a data storage system. Rather, it integrates system components of inputs, processes, outputs and reporting in such a way that accuracy, flexibility and adaptability, and efficiency are maximized. The elements that need to be integrated are the:

• Needs of data producers and users
• Data
• Information handling
• Storage of data
• Retrieval of data
• Data analysis
• Computer and manual procedures
• Networking among EMIS centers (UNESCO 1998, p. 4)

It also has a dynamic process orientation that focuses on the two-way flow of information, so that all levels of the organization can make effective use of the data and the reports generated from it. This flow of information needs to be efficient by avoiding duplication of effort, made accurate by incorporating validation procedures and adequate training of personnel at all levels, and responsive to the needs of all stakeholders in the system. This complex interaction of components is represented in Figure 1, which has been adapted from the UNESCO (1998) report to reflect some aspects of the Somalia context. These ideas are more fully developed in Appendix 3.


Figure 1: Information Flow (adapted from UNESCO 1998). The figure links three columns:

• EMIS data: students, curriculum, personnel, finance, facilities, other
• Levels of decision making: national, zones, regions, districts, villages/schools; covering early childhood, primary, high school, PAE, NFE, and nomadic and Qur’anic schools
• Information products: reports, plans, products, data/statistics bulletins, directories, physical plant and facilities design, budget estimates, others

RESEARCH QUESTIONS AND METHODOLOGY

These objectives will be addressed by considering a set of 40 research questions that have been organised into seven areas. The detailed questions are set out in Appendix 4. The evaluation will explore the following areas in relation to the survey:

• Efficiency
• Effectiveness
• Relevance/Appropriateness
• Impact
• Sustainability
• Coverage
• Coordination

These are explored by using the detailed research questions to structure interview schedules for a wide variety of interviewees.


PES <strong>Evaluation</strong> <strong>Report</strong>, 2008<br />

Study Components<br />

The principal source of data will be interviews with respective stakeholders. Respondents<br />

will include MoE officials at central, regional and district levels, NGO officials, head<br />

teachers, teachers, survey enumerators, supervisors, UNICEF/UNESCO staff at Nairobi<br />

and zonal level as well as local and international NGOs. The methods followed would be<br />

largely qualitative and include key informant semi-structured interviews, community level<br />

interviews and focus group discussions. Questionnaires for guiding the interviews will be<br />

developed for review in the inception phase. (PES <strong>Evaluation</strong> TOR 2008)<br />

The task of this methodology section is to coordinate these process components with the 7 areas identified<br />

above. The 7 areas are further clarified and designed in Appendix 4 by the use of 40 research questions<br />

mapped onto the 7 areas. A research matrix was developed that linked relevant data sources<br />

(stakeholders) with the research questions. This appendix is referred to as the Data Collection Planner.<br />

This matrix was used to ensure that each research question had an associated data source to inform it, to<br />

identify people who would reasonably be expected to have relevant knowledge and experience to inform<br />

each research question, and to guide the development of the interview guides that would assist the<br />

researcher in generating the conversation in the semi structured interview environments. These processes<br />

were validated by consultation with a range of stakeholders who have personal and detailed knowledge of<br />

the context of the research.<br />

The Data Collection planner was used to develop seven Interview guides. These guides targeted UNICEF<br />

officers, Directors General of <strong>Education</strong>, planning officers in each Ministry, Regional and District<br />

<strong>Education</strong> Officers, School and community leaders, <strong>Survey</strong> team members, and NGOs/international<br />

organisations. In addition, questions that could be informed by the desk review were identified.<br />

Appendix 5 indicates the set of interviews and focus groups conducted in the data collection process.<br />

The semi-structured interviews with individuals and focus groups were recorded and salient information<br />

mapped onto research questions. These data were synthesized and interpreted by the consultant prior to<br />

drawing conclusions and making recommendations.<br />

To enhance the validity of the study a number of strategies were used that were drawn from the<br />

qualitative research literature (LeCompte and Preissle, 1993). These involve enhancing the<br />

“trustworthiness” of the research. The strategies included triangulation, building a comfortable non-threatening<br />

relationship with interviewees, creating an audit trail of data using a digital recorder in<br />

interviews, seeking the interviewee’s permission to record the interview with an offer to turn the recorder off<br />

if requested, seeking specific examples in support of claims being made, asking to see documentary<br />

evidence of claims and comments, and providing a summary of conversations by listing major points and<br />

seeking concurrence from interviewees.<br />

A few further words on triangulation are warranted. This is a process where the researcher tries to get<br />

multiple sources of evidence to assist in drawing conclusions in relation to the study. To this end, after<br />

studying the Data Collection Planner (Appendix 4) it can be seen that each research question is to be<br />

investigated with several stakeholders. Additionally, when claims are made attempts will be made to<br />

triangulate the claim with documentary, or physical evidence. In general, matters of fact should be agreed<br />

upon by all data sources. Some variability will be expected in views, or attitudes of a more subjective<br />

nature. Often it can be expected that this variability is a function of varying positions in the system. For<br />

example the Director General might see data as a valuable help to decision making and resource<br />

allocation, whereas teachers might see the collection of the same data as an imposition that reduces the<br />

time they have for lesson preparation.<br />

There were a number of threats and limitations to the study that should be enumerated. During the period<br />

of the study (July/August) schools were closed for the summer break and hence were not available for<br />

observation in a natural setting. Attempts were made to contact some local principals and school<br />

representatives for interview.<br />



PES Evaluation Report, 2008<br />

Only two zones (Somaliland and Puntland) could be visited, as Central South was considered<br />

unstable by security authorities. While findings will be generalised from the two zones visited, these<br />

findings need qualification when applied to Central South Zone, due to particular difficulties of security,<br />

accessibility due to climatic factors, and the general level of development and competence of the Ministry<br />

of Education in the Zone.<br />

The evaluation took two months as per the time-line (Appendix 1).<br />

RESULTS<br />

The results of the evaluation are reported in the seven areas indicated in the evaluation design, namely:<br />

efficiency, effectiveness, relevance/appropriateness, impact, sustainability, coverage, and coordination.<br />

The first three areas are divided into a number of sections to enhance clarity. Recommendations are then<br />

gathered at the end of each section for ease of access, and to provide a summary of the section. All<br />

recommendations are collected and numbered in Appendix 1.<br />

A. EFFICIENCY<br />

In this section the PES data collection process is briefly described and then some issues associated with<br />

that process are explored. The issues explored are: data validation strategies, timing of data collection,<br />

training and recruitment of enumerators and supervisors, and data duplication.<br />

A brief outline of the PES data collection strategies as explained by a number of UNICEF and Ministry<br />

staff is presented prior to the major components of the section.<br />

1. The process begins with a consultant working in Nairobi to design the questionnaire.<br />

2. A PES planning meeting involving UNICEF staff from the three zones is conducted. Quality of<br />

questions is discussed to eliminate ambiguity and to ensure that the questions will achieve what<br />

they are designed to achieve. In addition, a detailed training strategy is designed.<br />

3. UNICEF staff go back to the Zones, where the enumerators and supervisors are identified. This<br />

identification differs across the three Zones: in one Zone the MoE does it independently of UNICEF,<br />

in the second Zone the MoE and UNICEF cooperate in the process of identification and appointment,<br />

and in the third Zone UNICEF takes full control of the process.<br />

4. UNICEF staff undertake training of enumerators and supervisors. (Field testing happened in two<br />

zones and not in a third zone.)<br />

5. Training includes a module of ‘micro-planning’ during which enumerators and supervisors are<br />

allocated to a region and a set of schools within the region. Additionally, UNICEF staff are also<br />

allocated to regions in a supervisory capacity.<br />

6. Enumerators then visit their schools over a two week period to collect data. Supervisors are<br />

expected to conduct follow up visits to schools to ensure they have been visited. If the school has<br />

not in fact been visited the enumerator is sent to the school again.<br />

7. UNICEF staff and supervisors also visit a sample of schools to check data that has been collected<br />

and on occasions to collect the missing data.<br />

8. Enumerators, Supervisors, REOs and UNICEF staff come together to review data forms one by<br />

one to check and clean data.<br />

9. Forms are then photocopied and the originals sent to Nairobi for the data entry process. This<br />

process involves double entry and comparison, along with internal consistency checks<br />

in the database entry forms.<br />

10. Finally, a report is prepared and distributed to MoEs and NGOs.<br />


1 Validity<br />

1.1 Findings and Analysis<br />

Attempts were made to assess data quality. Unfortunately, a full field assessment, in which a sample of<br />

schools would be selected and data recollected for comparison, could not be conducted. This<br />

full validation exercise was impossible because schools were closed for the summer holidays. Such an<br />

evaluation was also outside the terms of reference for this study. Instead, a range of participants in the<br />

process were interviewed and an assessment made of documentary evidence, of processes in the project<br />

design, and of comparisons with stakeholders’ perceptions based on field experience.<br />

There has been some questioning of the validity of some of the data that appears on the data forms and in<br />

the draft report of the 2006/7 survey. Comment has been made on the draft report in the reporting section<br />

of this evaluation. There it is argued that the inaccuracies and inconsistencies identified seem to be a<br />

function of the subsequent analysis of the data and careless editing rather than a consequence of the lack<br />

of validity in the original data set. That is not to imply that there are no limitations on the validity of the<br />

data in the data base. The issue is to assess the size and nature of the limitations and to recommend<br />

strategies to address the threats to validity. As one Director General interviewed observed:<br />

All measurement has an error. But it is important to improve results.<br />

In attempting to reduce the size of the error, the issue of diminishing returns for effort arises. By this it is<br />

meant that it is relatively cheap to move from 90% accuracy to 91%, but very expensive to move<br />

from 98% to 99%, with 100% accuracy being unattainable. This is especially so in a<br />

context of political instability, capacity constraints, rainy seasons, droughts and a mobile population.<br />

There are a number of reported threats to the validity of the data, some of which are of the nature of<br />

hunches, and some of which are a result of direct observation. An interview with a director general<br />

identified these issues:<br />

A number of schools were not visited and some documents lost, but these matters have<br />

been discussed and follow up data collection conducted to remedy the situation.<br />

The recording of GPS coordinates is a problem and I suggest that additional training on<br />

the use of a GPS is necessary.<br />

The process of transferring raw data to UNICEF for data entry caused some problems,<br />

and to ensure supervisors and enumerators completed forms correctly we need close<br />

monitoring. (He was unable to tell us about spot checking).<br />

In one zone UNICEF officers make up to 150 check visits to schools (out of approximately 600<br />

schools). One officer reported that only 1 of his 30 schools had not been visited. In another zone the<br />

UNICEF officer said that in 4 years he had only found one school that had not been visited.<br />

The Zone that had trouble with the use of the GPS and the recording of coordinates was also the Zone<br />

that did not conduct field testing during the training of enumerators. In another zone, which had conducted<br />

field testing, staff further reported:<br />

Only one enumerator had trouble with GPS. The problem was identified and resurveyed.<br />

We had a problem with School mapping data last year. The coordinates collected did not<br />

match the ten year old village location coordinates. We recollected 10 Bossaso school<br />

coordinates and they still did not match. Was the problem with us or the village maps?<br />


While UNICEF staff are actively engaged in ‘spot checks’ of schools and the associated data, such<br />

consistent and diligent behaviour among supervisors and REOs was difficult to verify. A meeting of<br />

REOs indicated that:<br />

There is no follow up by REOs in the field due to lack of transport. No systematic data<br />

verification procedures at the field level are in place. The so-called spot checks are<br />

haphazard at best and non-existent at worst.<br />

Further, REOs have no records of data collected. The process of checking by REOs and supervisors<br />

seems to be inconsistently applied across zones and regions. Some REOs described in detail how the<br />

checking process is undertaken and how they use independently collected data to verify the data forms.<br />

Two REOs were vague about the verification process and implied that it was left to supervisors and<br />

UNICEF staff.<br />

One enumerator indicated that:<br />

We always find 1 to 3 schools that have never been recorded.<br />

suggesting that there is a continuous process of seeking a complete data set.<br />

All stakeholders interviewed with experience in the field estimated that over 95% of schools were visited<br />

by enumerators. This is consistent with UNICEF reports of follow up visits. Further, one UNICEF staff<br />

member said that in his region staff consistently take the report with them into the field and check on<br />

data. It is always very consistent with field observation. The same practice is followed by a donor, who<br />

pays special attention to gender issues in the field. The PES report is always an accurate reflection of the<br />

data he finds during his visit.<br />

In a focus group session with approximately 8 NGOs who regularly used the data from the PES, no one<br />

indicated that they had found significant inaccuracies.<br />

While there are constraints to validity identified above, they do not seem to be very great challenges to<br />

the quality of the data. Nevertheless, the reasons behind the shortcomings that have been identified were sought,<br />

and strategies for improving the process of data collection are suggested.<br />

There appear to be three sets of reasons for schools not being visited.<br />

• The first group are obstacles that cannot be overcome in the time frame of the survey. These<br />

include floods (especially during the rainy season) and security concerns.<br />

• The second group are those obstacles such as distance and transport that need careful planning to<br />

overcome.<br />

• The third set are those reflecting a lack of ethical commitment to the work, with an underlying<br />

attitude of ‘they won’t know’ or ‘I won’t be caught.’<br />

Common responses to these constraints include the enumerator:<br />

• Contacting the Head Teacher at his/her home to conduct the interview with the result that some<br />

details are estimates rather than validated information from source documents.<br />

• Completing the forms in the REO’s office using data collected by the REO, or last year’s data.<br />

• Making up data from local knowledge.<br />

• Recording GPS coordinates from the location of the interview rather than the location of the Head<br />

Teacher’s office.<br />

These together with some limited problems of using a GPS to collect school location data,<br />

misrepresentation by school head teachers, and carelessness in recording data on the forms encompass the<br />

set of constraints to validity that have been found in this evaluation. However, it is emphasised that, while<br />

these issues exist, they do not appear to be widespread, and while they will cause distortions at the micro<br />

level, they will have little impact on macro indicators.<br />


To address these issues a number of recommendations are made. Some are described here in full and<br />

others related to issues of training are mentioned here and developed further in the timing, training and<br />

enumerator sections below.<br />

REOs, Supervisors and UNICEF staff are encouraged to develop a ‘culture of support’ for enumerators.<br />

In such an environment enumerators are encouraged to report difficulties in visiting schools and to record<br />

limitations to data so that an accurate picture of data quality can be gained. Such a culture can be<br />

contrasted with a culture of punishment and inadequacy when enumerators have not been able to visit a<br />

school; the latter will result in attempts to hide constraints to data quality. In one zone the supportive<br />

approach was reflected in comments by a UNICEF officer:<br />

Enumerators are told that if they cannot find or get to a school they are to report it. It is<br />

not a problem. A solution will be found. The only punishment is if you lie to us about<br />

claiming you have been there and you have not.<br />

Such a culture should be encouraged by a module on ethical behaviour being included in the training<br />

exercises for enumerators and supervisors.<br />

Enumerators should be evaluated both during training and during the field work. If they do not demonstrate<br />

adequate competency they should be replaced or not re-employed. An effective monitoring system needs<br />

to be established that will avoid issues of nepotism influencing performance quality.<br />

GPS training should be supported by field work and mapping exercises to encourage understanding of the<br />

implications and importance of the school mapping exercise.<br />

The timing of the major data collecting exercise should be adjusted to avoid rainy seasons and end of<br />

school year closures.<br />

An audit trail needs to be established to facilitate effective tracking of school visits and data forms.<br />

Evidence of visits, checking, recovery of missing data and reasons for non-visits was almost impossible<br />

to find in the field. There was a case of missing forms, requiring a recollection of the data, which resulted<br />

in intense debate as to the cause of the problem. This was a result of lack of effective tracking strategies.<br />

Such a system should be transparent and available to all participants to understand, monitor and use.<br />

Performance indicators can be derived from such systems and used to encourage performance<br />

improvement, and an understanding of ethical approaches to the work being undertaken.<br />

As a component of the audit trail, enumerators should be required to record school locations as ‘way<br />

points’ in their GPS. These way points should be shared with supervisors/REOs on return from the field<br />

and checked against a list of schools and previously recorded locations. Such a system will provide<br />

validating evidence of the school visit process.<br />
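The way-point check described above can be sketched as follows. The data structures and the 500-metre tolerance are illustrative assumptions, not part of the PES specification; the idea is simply to flag any school whose newly recorded way point sits too far from the previously recorded location:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS points."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def flag_suspect_waypoints(waypoints, reference, tolerance_m=500):
    """Compare enumerator way points against previously recorded school
    locations; return school IDs whose new reading differs by more than
    tolerance_m metres (a threshold chosen here purely for illustration)."""
    suspect = []
    for school_id, (lat, lon) in waypoints.items():
        if school_id in reference:
            ref_lat, ref_lon = reference[school_id]
            if haversine_m(lat, lon, ref_lat, ref_lon) > tolerance_m:
                suspect.append(school_id)
    return suspect
```

A flagged school would then be queried with the enumerator, distinguishing a genuinely relocated school from a coordinate recorded at the wrong place, the problem reported with the old village maps.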

Enumerators should be required to validate data in the school from source documents where possible.<br />

This is already a component of the data collection process for the assessment of the physical resources of<br />

the school where the enumerator is required to conduct an inspection of the school and its surroundings to<br />

gather the necessary data. This should also be encouraged where possible in the earlier part of the data<br />

collection process. Effective use of the EMIS tools would assist in this process so that school enrolment<br />

could be checked for example. A common practice elsewhere is to choose a few data elements each year<br />

that will be subject to a validation check at the school level. These data elements are commonly rotated<br />

each year.<br />

Data entry has thus far taken place in Nairobi. While this has been efficient, it has some drawbacks in<br />

terms of capacity development and efficient interpretation of <strong>Somali</strong> language issues. This matter has<br />

been discussed extensively elsewhere in this report in the section on ownership.<br />

The data is entered into an Access database. The mechanisms used in the data entry process are very<br />

impressive and reflect state-of-the-art practice. These include validity checks at the point of data entry, the<br />


double entry of data and subsequent comparison, and data cleaning. All this should ensure that the data<br />

entered into the database is an accurate reflection of what is recorded on the survey forms. The designers<br />

of these processes should be congratulated.<br />
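The double-entry idea can be illustrated with a minimal sketch. The field names and values are invented; the actual Access implementation is not documented here. Each form is keyed twice by independent operators and the two records compared field by field:

```python
def double_entry_mismatches(entry_a, entry_b):
    """Compare two independent keyings of the same survey form and return
    the fields where they disagree, for manual resolution against the
    paper form. Field names below are illustrative placeholders."""
    return {
        field: (entry_a.get(field), entry_b.get(field))
        for field in set(entry_a) | set(entry_b)
        if entry_a.get(field) != entry_b.get(field)
    }

first_keying = {"school_id": "S014", "boys_enrolled": 182, "girls_enrolled": 141}
second_keying = {"school_id": "S014", "boys_enrolled": 182, "girls_enrolled": 114}

print(double_entry_mismatches(first_keying, second_keying))
# {'girls_enrolled': (141, 114)} — a transposition caught by double entry
```

Any non-empty result sends the operators back to the original paper form, which is why double entry is effective against keying errors that internal consistency checks alone would miss.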

A common practice in these exercises is, following data entry, to produce a report of the data for each<br />

school and to have the school check the accuracy of the data. This is not currently a component of the PES.<br />

This validation of school data is an important and valuable component of the EMIS process. Additionally,<br />

it provides feedback to schools and communities with regard to the value and importance of accurate data.<br />

Such a process is consistent with the EMIS conceptual framework. With the data entered into an Access<br />

database, the production of such a school report becomes an easy process that could be done at zone level<br />

at minimum cost. The challenge is not the report itself but the distribution of the reports to schools and<br />

the receiving of feedback from the schools. This becomes a challenge due to a lack of well-established<br />

and efficient communications systems in many places and a lack of sophisticated monitoring systems<br />

within the ministries/zones. There are however a number of emerging strategies that reflect improved<br />

contact and communication with schools. These include the mentoring and mobilising systems<br />

established in rural areas, the regular data collection at regional level that duplicates the PES in some<br />

regards, and the initiatives of examination and curriculum directorates in one region that collect reports as<br />

frequently as monthly. Such resources need to be integrated and harnessed into an effective unit that can<br />

facilitate regular and economic contact with schools. The establishment of effective Quality Assurance<br />

mechanisms that would establish regular contact with schools is a necessary prerequisite for this<br />

validation process.<br />

An option that should be considered in future data collection processes is for the annual data census to be<br />

viewed as an update of data rather than the current collection of all data on an annual basis. Now that data<br />

is entered into a database, much will remain constant or relatively constant from year to year. Hence, the<br />

annual data collection process could be seen as an update on data that changes regularly, such as student<br />

enrolments, and a validation exercise for data that does not change regularly, such as number of<br />

classrooms and provision of water and sanitation services. To facilitate this approach, the enumerator<br />

would take a report for each school to the field that would provide for checks on current data that has not<br />

changed and the updating of data that has changed. An added advantage of this process would be to reduce<br />

the volume of data entry undertaken annually, and hence to speed up the reporting time frame.<br />

A useful addition here would be a missing data report to allow follow up in the field. Such a report can<br />

easily be generated from the database, although it is recognised that missing data should be minimal due<br />

to the data checking and cleaning exercises conducted before data forms are delivered from the regions.<br />
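Such a report is a simple query over the school records. The sketch below uses invented field names and records, not the actual PES schema, to show how the gaps per school would be listed for targeted follow-up:

```python
# Illustrative school records; field names are assumptions, not the PES schema.
schools = [
    {"school_id": "S001", "name": "Example Primary A", "classrooms": 6,
     "water_source": "well", "enrolment": 240},
    {"school_id": "S002", "name": "Example Primary B", "classrooms": None,
     "water_source": None, "enrolment": 180},
]

REQUIRED_FIELDS = ["classrooms", "water_source", "enrolment"]

def missing_data_report(schools, required=REQUIRED_FIELDS):
    """List, per school, the required fields still unrecorded, so that
    field follow-up can be targeted rather than general."""
    report = {}
    for record in schools:
        gaps = [field for field in required if record.get(field) is None]
        if gaps:
            report[record["school_id"]] = gaps
    return report

print(missing_data_report(schools))
# {'S002': ['classrooms', 'water_source']}
```

The same records could drive the pre-filled update sheets suggested above: fields already on file are printed for checking, and only the gaps and the regularly changing data need fresh collection.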

1.2 Recommendations<br />

a. That REOs, Supervisors and UNICEF staff are encouraged to develop a ‘culture of support’ for<br />

enumerators to encourage transparent and open work practices.<br />

b. That such a culture should be encouraged by a module on ethical behaviour being included in the<br />

training exercises.<br />

c. That enumerators should be evaluated both during training and during the field work, and<br />

employment in the PES process be dependent on satisfactory appraisal.<br />

d. That GPS training be supported by field work and mapping exercises to ensure an understanding<br />

of the implications of incorrect location data.<br />

e. That an audit trail be established to facilitate effective tracking of school visits and data forms.<br />

f. That enumerators be required to record school locations as ‘way points’ in their GPS as part of<br />

the audit trail.<br />

g. That enumerators be required to validate data in the school from source documents on selected<br />

indicators.<br />

h. That effective Quality Assurance mechanisms be developed that would establish regular contact<br />

with schools. These are seen as a necessary prerequisite for the validation process.<br />

i. That in future data collection processes, the data census become an ‘update’ of data rather than<br />

the current collection of all data on an annual basis.<br />

j. That a missing data report be provided at the completion of data entry to allow follow up in the<br />

field.<br />


2 Training<br />

2.1 Findings and Analysis<br />

There have been a number of issues identified in this evaluation report that point to the importance of<br />

effective training of enumerators and supervisors, and to the consequences when training has not been<br />

undertaken with rigour and dedication. All of the issues discussed here have been mentioned in other<br />

sections of the report and are being readdressed for purposes of thoroughness and completeness. It is<br />

observed in the ownership section that ‘class training’ in itself is not an adequate preparation for the<br />

undertaking of complex skills. It is further emphasised that experience and ongoing support are also<br />

necessary. Thus, it is emphasised that there is a necessity for practical components, undertaken with<br />

appropriate guidance and opportunity to reflect on the experience, to be central to the training process.<br />

There are two major components to these discussions of training processes. The first is a certification<br />

process for enumerators and supervisors and the second is a set of additions to the current training<br />

programme that are based on experiences of people who have been engaged in the current process.<br />

Readers may react to some of these issues with the view that they are already included in the training<br />

process. What has become clear is that the training is not delivered using exactly the same process in all<br />

zones, consequently, some procedures are being emphasised for inclusion.<br />

Reports were received during the evaluation that some enumerators and supervisors did not take the<br />

training seriously. This is in spite of the recent initiative to develop a TOR for the employment of these<br />

staff members. The lack of commitment was reflected in behaviour such as constantly absenting themselves<br />

from training sessions, constantly using mobile phones during training, and failing to take an active part<br />

in class sessions and activities. One MoE official reported that:<br />

Some people are only here for the money. We need to be more careful in their selection.<br />

It is clear that mere attendance at the training is not enough to be deemed adequate preparation;<br />

an evaluation of performance leading to a certification process should be embedded to ensure<br />

commitment and competence. A set of competencies should be designed and shared with trainees prior to<br />

the training workshop. Certification will only be granted when trainees achieve all competencies. The<br />

competencies might include:<br />

Trainees must:<br />

• Achieve satisfactory attendance<br />

• Be able to actively engage in class activities<br />

• Be able to use a GPS and identify way points<br />

• Be able to use mapping skills to plot location coordinates<br />

• Demonstrate an ability to analyse data forms for inconsistencies<br />

• Be able to use audit trail instruments<br />

• Be able to discuss implications of inaccurate data<br />

• Demonstrate commitment to the project and understand its importance.<br />

• Be able to make ethical judgements appropriately<br />
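The proposed certification gate, under which certification is granted only when all competencies are demonstrated, can be sketched as a simple checklist. The abbreviated labels paraphrase the list above; the data structure is an illustrative assumption:

```python
# Abbreviated competency labels paraphrasing the list above (illustrative).
COMPETENCIES = [
    "attendance", "class_engagement", "gps_and_waypoints", "mapping",
    "form_analysis", "audit_trail", "data_quality_discussion",
    "commitment", "ethical_judgement",
]

def is_certified(demonstrated):
    """A trainee is certified only when every listed competency
    has been demonstrated."""
    return all(c in demonstrated for c in COMPETENCIES)

def outstanding(demonstrated):
    """Competencies still to be demonstrated before certification,
    useful for targeting revision and retesting."""
    return [c for c in COMPETENCIES if c not in demonstrated]
```

The `outstanding` list is what makes the revision-and-retesting cycle workable: each trainee can be shown exactly which gates remain, rather than being passed or failed on attendance alone.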

To help participants achieve these competencies it will be necessary to adjust course content to provide<br />

opportunities for participants to develop the necessary skills and attitudes. The following modules are<br />

suggestions derived from the study:<br />

• Ethics/Values module. This would aim to develop an appreciation of the importance of accurate data<br />

collection and of the opportunity to be a participant in the process. This might be achieved<br />

through a set of small case studies that reflect common dilemmas faced in the field, and using<br />

class or group discussion to resolve the dilemmas. An issue that might be included here is how to<br />

deal with obviously exaggerated data, or major changes in data from previous years.<br />

• Field Testing and Modelling of Monitoring Process module.<br />


• Data Checking and Cleaning module. A module of training for data sheet checking, monitoring<br />

and supervision should be developed and all REOs, supervisors and enumerators be encouraged<br />

to reflect on the impact of poor data pick-up procedures.<br />

• Use of audit trail records and way points to track activity.<br />

• GPS and Mapping Exercise module. Use of a GPS, and consideration of the implications of school<br />

coordinates missing village locations. This should include the use of way points to monitor school<br />

visits.<br />

• Analysis of Forms Module. Forms containing common problems and errors analysed, and<br />

consideration of causes and correction strategies.<br />

• Addressing Common Problems module. Issues such as the spelling of village and school names,<br />

changing school names, new schools emerging and old schools disappearing need discussion, and<br />

strategies should be developed for avoiding confusion.<br />

These additions have two obvious consequences for the training course. One is that the course will have to be<br />

lengthened to include the extra content. The second is that it will not be enough to merely present the<br />

course material. Participants will have to be given the opportunity to demonstrate the competencies. This<br />

has implications for teaching methodology and for revision and retesting opportunities.<br />

2.2 Recommendations<br />

a. That a certification process for enumerators and supervisors be used to judge competence for the<br />

work tasks.<br />

b. That the certification process be based on explicit competencies that must be demonstrated by<br />

enumerators and supervisors.<br />

c. That a set of additional modules be added to the current training course.<br />

d. That the course be lengthened to include the extra content.<br />

e. That participants be able to revisit opportunities to learn and demonstrate the competencies.<br />

3 Enumerators and Supervisors<br />

3.1 Findings and Analysis<br />

Enumerators and supervisors are at the centre of the current data collection system for the PES. The<br />

quality and dedication of these people is important to the validity of the data collected. It is clear from the<br />

above discussion that there are areas of challenge in their selection, training, supervision and<br />

commitment. Some doubts have also been expressed in regard to the ethical practice of some of the<br />

enumerators. It must be emphasised that inadequate performance was not found to be widespread and<br />

most of the enumerators spoken to as part of this evaluation reflected competency and commitment to the<br />

tasks at hand. Once again there was variability across zones: the zone that seemed to have the most<br />

transparent recruitment practice, and that had addressed some supervision issues, had fewer difficulties<br />

in this regard.<br />

It has been noted that a TOR has been developed for enumerators and supervisors. This is a useful<br />

beginning. One zone seemed to make the recruitment process a ‘private’ affair, with little clarity about<br />

whether recruits had demonstrated the potential to commit adequately to the task. One senior ministry official observed<br />

that:<br />

We have been recruiting old people as enumerators and we need to start recruiting<br />

younger people with greater enthusiasm and commitment.<br />

A more transparent process of recruitment, using a variety of stakeholders would assist in improving the<br />

quality of recruited staff.<br />

In the above section on training it has been recommended that enumerators and supervisors demonstrate<br />

competency and commitment during the training process. If they fail to demonstrate the competencies<br />

they should be replaced. It would be advisable to include more than the required number in the training<br />

process and to retain only the best for the data collection phase.<br />


In addition, the performance of enumerators and supervisors should be evaluated during the data<br />

collection phase by independent assessors, a report given to the staff member, and this used as a basis for<br />

re-employment in the process the following year. One zone moved data collecting teams across regions<br />

where they were supervised by a different REO each year. In this way a more independent system of<br />

monitoring was created, since over-familiarity and old friendships could not influence work practice.<br />

Such a system could be considered by other zones.<br />

In undertaking the above recommendations it is necessary to remember the earlier recommendation<br />

that data collection be conducted in a supportive environment. It needs to be remembered that these<br />

people are working in an environment where capacity is low and a major focus of the work is to develop<br />

that capacity. Hence, in implementing these proposals it is necessary to ensure that all staff are supported<br />

to develop the necessary skills and attitudes, are given support when needed and are encouraged to seek<br />

advice and support without fear of dismissal. Thus, it is imperative that a fine balance is recognised<br />

between commitment to quality and commitment to capacity development.<br />

It is worth noting the future use of enumerators and supervisors as they are currently conceived. The more standard process for collecting data in an EMIS is for head teachers to take responsibility for completing data forms and returning them to the district or regional office. At present this would be difficult to achieve, given the poor communication and transport infrastructure in many areas of Somalia. As noted in an earlier section, a quality assurance and systematic monitoring structure has to be established before such data collection strategies can be used effectively. As capacity is developed and infrastructure built, it is suggested that a more economical data collection system be developed: the enumerator/supervisor structure could be abolished and replaced with an EMIS officer located in each regional office, with head teachers empowered to complete and update data forms and to undertake subsequent data collection in the form of term or semester reports.

3.2 Recommendations<br />

a. That the enumerator/supervisor positions be filled competitively. It would be advisable to include more than the required number of candidates in the training and to carry only the best performers into the data collection phase.

b. That the performance of enumerators and supervisors be evaluated during the data collection<br />

phase by independent assessors using predefined competencies.<br />

c. That REOs be rotated across data collection teams, creating a more independent system of monitoring.

d. That a fine balance is recognised between commitment to quality and commitment to capacity<br />

development.<br />

e. That, as capacity is developed and infrastructure built, the enumerator/supervisor structure be abolished and replaced with an EMIS officer located in each regional office, with head teachers empowered to complete and update data forms and to undertake subsequent data collection in the form of term or semester reports.

4 Timing<br />

4.1 Findings and Analysis<br />

Comments on the timing of data collection need to address two areas. The first is the impact of timing on the data collection process itself. The second is whether the timing allows data to be collected and reports provided early enough for maximum use to be made of the data, and to help avoid the need for data duplication. By data duplication is meant the need for people to collect, in a parallel system, data that is also collected in the PES. While it may not be possible to avoid all duplication, a well designed EMIS process will attempt to minimise it. The impact of timing on the data collection process is considered first.

The current data collection process is undertaken in late April and May. The plan is for data to be collected over a two-week period, but delays and disruptions often result in data collection coinciding with end-of-year activities such as exams and exam marking, making it difficult to find head teachers in their offices.

A more significant constraint is that the current data collection period falls partly within the rainy season. The rainy season has been identified above as a threat to the validity of some data, since enumerators are unable to reach some schools.

The Vice Minister of one zone observed:<br />

We are collecting data once a year. It is very rushed and not a stable process close to the<br />

end of the school year.<br />

Another consequence of collecting the data so late in the school year is that any report is inevitably a historical document; it can never describe the current year. Such a reporting structure suits agencies developing proposals, but not monitoring school performance, improving planning, or facilitating system activities such as recording Year 8 exam results and attendance data. The lack of timeliness, together with an inflexible reporting process, has made a number of data duplication practices necessary.

A common practice in EMIS data collection is to conduct a school census (similar to the current PES) early in the school year, with subsequent data pick-ups on a term or semester basis. These later pick-ups commonly cover drop-out, exam, attendance and finance data. With careful planning and coordination with other ministry directorates, reports can be produced in time for other MoE activities. Such practice has the added advantage of spreading EMIS activity across the year, ensuring ongoing work for specialist data entry staff and database managers.

Both of the above issues can be addressed by moving the school census to early in the school year. If data were collected in late September or early October (or as soon as possible after the year begins), the rainy season and the disruption caused by end-of-year examinations could both be avoided. A report was received indicating some variability in when the school year begins for some schools, and some late enrolments, either of which may constrain the start of the data pick-up. Such matters would require the advice of REOs to ensure the most suitable time is chosen.

Timing the census at the beginning of the school year would also improve the usability of the data for regions, districts and school communities, since it would be current data. If data were collected in early October and entered into the database in November, preliminary reports could begin to flow to stakeholders, in particular REOs and schools, in December. A full set of reports for the year would be available before the end of the school year in May/June.

4.2 Recommendations<br />

a. That data be collected in late September or early October (or as early in the school year as can be achieved).

b. That data be entered into the database in November.

c. That preliminary reports begin to flow to stakeholders, in particular REOs and schools, in December.


5 Data Duplication<br />

5.1 Findings and Analysis<br />

In the effectiveness section of this evaluation report the view is expressed that an EMIS unit should be established with the vision of being a service unit for other directorates within the ministry and for other stakeholders supporting the education of children. Data and the associated reports need to be provided on a timely basis; when they are not, the data collection process is duplicated. If data collection needs are not well coordinated, parallel systems are at times set up to gather additional data.

Evidence was found of quite extensive data collection and data entry that is not part of the PES process.<br />

Some examples are listed below.<br />

In one zone UNICEF entered a subset of PES data into an Excel spreadsheet because the PES report was provided so late. The data entered allowed them to assess progress and distribute resources on an equitable basis.

In another zone REOs reported that they all had their own data collection systems. One set of documents provided indicated that school enrolment by grade/gender/district/school, drop-outs, school furniture and equipment, health hazards, general condition of the school, number of textbooks per student, extra-curricular activities and other data are systematically collected. Some of this is collated into a regional report. While all REOs indicated they collect data independently of the PES, it is not clear just how extensive each set of data is.

In Puntland the curriculum division collects monthly reports from rural schools and conducts some analysis of the data, using mentors as collection agents. The reports include student enrolment figures and observations of the mentoring process. Some may have reservations about this arrangement, as mentors are intended to have a professional development role rather than an enumerator role. If the process were to continue, it would have to be made explicit how the data collected relate to the mentors' professional development function.

In the same zone a set of data for Grade 8 examination results is collected and collated. This consists of full details (including date of birth) of every student taking the examination, to which the exam results are added when available. It is an example of data being required to supplement the PES, which does not gather student names and details as records. As EMIS units develop it will be necessary to integrate these functions into their annual work plans.

Much of this section supports the observations and recommendations made in the previous section on the timeliness of the PES, and indicates how the current structure and operational mechanisms are producing duplication of effort and a consequent waste of resources.

However, there are some positive aspects to these observations. They indicate an emerging capacity to communicate with schools and to transfer forms within a rudimentary monitoring system, an emerging understanding of the important role of data in planning, and an emergent capacity to process and analyse data. These efforts need to be recognised and encouraged so that formal quality assurance procedures can grow to replace the relatively ad hoc systems currently in use.

5.2 Recommendation<br />

a. That careful planning take place to provide data on a timely basis for the variety of MoE directorates and stakeholders.


B. EFFECTIVENESS<br />

This section discusses effectiveness in the context of the use made of the current survey data, the use and<br />

potential of documents provided to schools and referred to as ‘EMIS tools,’ ownership of the PES and the<br />

progress made towards MoEs accepting responsibility for the implementation of the PES.<br />

1 Utility: Current and Potential

1.1 Findings and Analysis<br />

Inquiries as to how the PES was used drew a range of responses. One Vice Minister said:<br />

We don’t make use of the PES.<br />

One Minister was more positive, implying it is used to provide data for supporting organisations:
It is very important. We need it to submit to other supporting organisations. It is the only<br />

way to know what is going on.<br />

A Director of Planning was more aspirational:<br />

We are committed to build the capacity of a Department of Planning, Research and<br />

Coordination to have accurate data and proper data recording.<br />

We want more capacity building in planning.<br />

We want:<br />

To provide a yardstick for assessing and evaluating development in primary education.<br />

To provide the latest data on key aspects of primary education for use by organisations<br />

involved in provision and delivery of education.<br />

To take further the development of an EMIS for our Zone by collecting and analyzing<br />

data on a questionnaire that has continued to improve since 1997.<br />

He also indicated that he used the data to check submissions from REOs when they requested support. However, he was unable to provide an example of such activity, and his data entry clerks were unable to use the query function in Access. The conclusion is that these views are more aspirational than real.

There was some criticism of the distribution of the PES report, implying a constraint on its possible use. An REO/Enumerator workshop reported:

Reports not received for last two years. Last report received 2004/05. Schools never receive reports. MOE should ensure wide distribution of the report. UNICEF ensures adequate copies of the report reach the MOE, including electronic format. Majority of REOs have computers in their offices.

Even when received, the report is never properly used by schools, due to lack of trained staff for its use in planning. Capacity development for the use of such a report is needed.

In a different zone one REO indicated:<br />

I make regular use of the report for planning purposes. I look at enrolment increases and<br />

decreases, physical condition of schools, how many are operating.<br />

This seemed to be the exception rather than the rule.


While it is clear that the survey reports have had very little impact on the MoEs, they are still of major importance to the various agencies working in the education sector. Staff from Save the Children indicated that they:

Use data for baseline data, enrolment figures, number of teachers, school environment, fees of students, community contributions and gender numbers.

Data collected targets the ministry, but they (the Ministries) don't use it.

A UNESCO representative supported this position and added that the PES constitutes the only significant data for the sector.

A wide range of uses of the PES reports was identified by NGOs. These included:

• Data used in proposal preparation.
• Establishing baselines for activities and monitoring progress to assess the impact of activities.
• Identifying and prioritising schools in need of water and sanitation support.
• Planning the distribution of supplies.
• Planning teacher training (with a special focus on women, given the very small number of women teachers identified in the survey).
• Planning textbook production and distribution.
• Use as a sampling frame.
• Predicting the number of students progressing to secondary school.
• Comparing regions for patterns of enrolment across the country.
• Identifying schools for rehabilitation interventions.

From the above it can be concluded that PES data is playing an important role in the development of the sector from an agency perspective. The MoEs seem to be making very limited use of the data at present, but have indicated a desire to do so. To facilitate "evidence-based planning" a number of issues need to be addressed: timeliness, capacity to use the data, the format in which data is reported, and the development of a systematised and institutionalised planning process. This institutionalisation needs to take place at central, regional and school/community levels. These issues are discussed more fully later in this report.

1.2 Recommendations

a. That timeliness of data reporting be improved to suit institutional needs.<br />

b. That capacity to use the data be developed among stakeholders.<br />

c. That alternative formats of reporting of data be developed to differentiate between stakeholders.<br />

d. That a systematised and institutionalised planning process be developed, with institutionalisation at central, regional and school/community levels.

2 EMIS Tools<br />

2.1 Findings and Analysis<br />

This evaluation includes an exploration of the use schools make of a set of tools intended to assist in managing schools. These tools, somewhat inappropriately referred to as 'EMIS tools', consist of a class register, a school register, a student record card and a set of detailed instructions on their use. Unfortunately, the visit to the zones coincided with school holidays, so their use could not be discussed with teachers or principals; it was discussed instead with REOs and ministry staff. They estimated that class registers are used by 80% of schools and school registers by 20% of schools. The 80% may be optimistic, since the 2006/07 draft report indicated that only 50% of schools had received the tools. The pupil record cards are very infrequently used, with one REO saying:

They are only used when a student changes schools.<br />

One Zone indicated that these tools were to be a major monitoring focus in the new school year:<br />


The next school year is to be ‘the year of using EMIS tools properly.’ We will evaluate<br />

mentors and teachers on this basis. All Ministry staff will visit schools to monitor the<br />

EMIS tools (this includes the Minister).<br />

Reports were received indicating that the school register is very difficult for head teachers to use. This difficulty was attributed both to the perceived complexity of the tools and to the capacity of head teachers to undertake such clerical duties. Further investigation of these issues is needed, followed by adjustments to the tools, to the training associated with them, or both. An appropriate strategy would be to conduct a workshop on the evaluation and use of the tools with a random sample of head teachers, in order to evaluate the issue more fully and develop remedial strategies.

What remains unclear is how these tools are to be used. Is it expected that they will provide summary data on school enrolments, external exam candidature, absenteeism, drop-out rates, school resources and so on? These would be desirable supplementary data sets that should be routinely collected in a full EMIS process; however, no evidence could be found that this is in fact the case. The use of the tools needs clarification, and appropriate mentoring should be provided for head teachers on their use and value. The tools are potentially valuable aids for providing evidence for the PES data collection process. Unless these functions are understood, head teachers will see the tools as "busy work" without value or impact.

2.2 Recommendations<br />

a. That the use of the 'EMIS tools' be clarified and appropriate mentoring provided for head teachers on the use and value of these tools.

b. That a review of the design and use of the tools be undertaken.<br />

c. That plans be developed to integrate this data with the EMIS in the form of supplementary data<br />

collection and as a contributor to data verification.<br />

d. That further distribution of the EMIS tools be supported by developing understanding of their<br />

importance and use.<br />

3 Ownership<br />

3.1 Findings and Analysis<br />

There is considerable commitment to the importance of the survey:<br />

It is very important. We need it to submit to other supporting organisations. It is the only way to know what is going on. It is important to conduct it annually.

A Director General<br />

However, the MoEs seem to be genuinely involved only in the data collection process; the design of forms, data entry, analysis and reporting all remain the domain of UNICEF and the technical assistance it provides, with some consultative processes:

The operation of the project is done by MoE by filling in the forms. Funds are transferred<br />

to Ministry by UNICEF to allow this process. Calculations are done in Nairobi and<br />

results sent back to Zones …<br />

A Director General<br />

Even in this data collection process, UNICEF staff are central to a validation process of spot-checking schools to confirm data accuracy.

To broaden the "ownership" of the project from its present narrow base, strategies need to be developed to shift responsibility for the other major components of the PES to the MoEs. As this shift takes place it should also be conceived as a shift from a survey to an EMIS. In addition to data collection, the major components include:


• Validating data collection and data cleaning;
• Data entry processes;
• Data analysis;
• Report generation;
• The design of future data needs; and
• The integration of the various data collection strategies currently undertaken.

These components should be addressed as capacity and resources are developed. It would not be<br />

appropriate for capacity in all these areas to be developed simultaneously. An immediate focus should be<br />

placed on improving data collection processes to ensure greater accuracy, and on shifting data entry responsibility to the zones over a two- to three-year period.

The issue of data entry is clearly the next phase in developing ownership of the PES, and it drew much comment from ministry officials. One observed:

The process of data entry has to be owned by the ministry. The computers are available, but human resources are the problem. People are available but need some support. UNICEF will have to pay for the EMIS counterpart.

Further:

Those who enter data are not Somali and therefore they do not know local language and conditions. Need to involve local people in data entry. Can they enter once and the Nairobi people enter once. Then compare. Do this to build experience. Even take Somali data entry people to Nairobi for the process.

It was noted above that data collecting, processing and analysis skills are emerging in Puntland; however, this is not consistent across the three zones. Some capacity building in data entry and data access has been attempted: computers have been supplied and staff identified to undertake the tasks. Nonetheless, constraints in the system are preventing the transition of responsibility from UNICEF to the MoEs. Both the education literature and the change literature indicate that a brief workshop on a complex set of skills is not enough to enable people to operate independently in a new and challenging area. In addition to the training component, the skills need to be practised in a supportive environment. While such experience and support have been provided in data collection, they have not been provided in the other component areas of EMIS development, and they are essential if capacity is to be developed. The current double-entry data system offers an opportunity to develop data entry capacity without substantial risk to the project: MoE data entry staff could enter the data once and the Nairobi data entry staff enter it a second time, followed by the usual comparison check and corrections. The precise location of the data entry function would need to be decided, but the process would give data entry staff experience in a supportive environment. As capacity is demonstrated, responsibility for both data entry passes should be shifted to the zones.
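The double-entry comparison step described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the actual PES software: the school IDs and field names are invented, and each entry pass is assumed to produce a simple mapping from (school, field) to the value keyed in.

```python
# Sketch of a double-entry verification check. Assumes each independent entry
# pass yields a dict keyed by (school_id, field_name); both names are
# hypothetical, not the real PES schema.

def compare_entries(first_pass, second_pass):
    """Return (key, first_value, second_value) for every disagreement."""
    discrepancies = []
    for key in sorted(set(first_pass) | set(second_pass)):
        a = first_pass.get(key)
        b = second_pass.get(key)
        if a != b:
            discrepancies.append((key, a, b))
    return discrepancies

# Hypothetical example: MoE staff enter the form first, Nairobi staff second.
moe = {("SCH-001", "enrolment_boys"): 120, ("SCH-001", "enrolment_girls"): 98}
nbo = {("SCH-001", "enrolment_boys"): 120, ("SCH-001", "enrolment_girls"): 89}

mismatches = compare_entries(moe, nbo)
for key, a, b in mismatches:
    print(key, "MoE:", a, "Nairobi:", b)
```

In practice each flagged discrepancy would be checked against the original paper form before a correction is entered, which is the "usual comparison check and corrections" referred to above.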

3.2 Recommendations<br />

a. That responsibility for EMIS components be transferred to the MoEs one at a time, as capacity and resources are developed. In this way a smooth transition from a centrally managed survey to a decentralised EMIS can be achieved.

b. That responsibility for data entry be shifted to the zones over a two- to three-year period.

c. That initially MoE data entry staff be used to enter the data once and that the Nairobi data entry<br />

staff enter it the second time.<br />


C. RELEVANCE AND APPROPRIATENESS<br />

Relevance and appropriateness are discussed under two subheadings: the data required for an EMIS, and the suitability of the reporting processes that follow data collection.

1 Data Required for an EMIS<br />

1.1 Findings and Analysis<br />

The data collected in the PES has been modified over the ten years of the survey's operation. The major modification has been the PAE survey, added to capture this large (some 69,000 students) section of the education sector. This component is of particular importance for girls, who are often required to perform duties in the home before attending school; street children are also targeted within this system. In addition, rural/urban data and GPS coordinates of schools have recently been added. The survey forms have evolved continually over the ten years on the basis of experience and evaluations.

The range of data collected is relatively extensive. It includes educational components, infrastructure data that should assist planning activities, and some limited school management data such as staff meetings, CECs and associated training. It also allows disaggregation of the data by gender, district, region and zone and, in future, by rural/urban location.

There is still a need for some expansion of the data collected if the aspiration is for a full set of EFA indicators to be derived from it. This would be consistent with the conceptual framework outlined in Appendix 3, and would be necessary for Fast Track Initiative (FTI) involvement and Millennium Development Goal (MDG) reporting. The indicators not currently reported are listed in Table 1.

The additional data that needs to be collected includes: grade enrolment by age, for the calculation of net enrolment ratios; ECD/Qur'anic enrolment (the G1ECD indicator calculates the percentage of Grade 1 enrolments with previous early childhood education experience); repetition and drop-out data; and student achievement data, including student literacy rates. In addition, more effective use needs to be made of population data for the calculation of standard gender parity indices. What is currently reported is the percentage of girls in the school population, not the percentage of the school-age female population that attends school; interpreting the current statistic requires assuming equal numbers of boys and girls in the school-age population. Much of this data would be available from an enhanced use of the 'EMIS tools', which would in turn demonstrate the value and relevance of the tools.
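The difference between the two statistics can be shown with a small worked example. The figures below are entirely made up for illustration; no real Somali enrolment or population data is implied.

```python
# Illustrates why "% girls in enrolment" is only a proxy: the standard Gender
# Parity Index (GPI) divides the girls' gross enrolment ratio by the boys',
# which requires school-age population denominators. All numbers hypothetical.

girls_enrolled, boys_enrolled = 40_000, 60_000
girls_population, boys_population = 90_000, 100_000  # school-age population

pct_girls = girls_enrolled / (girls_enrolled + boys_enrolled)  # share of girls, 0.40
ger_girls = girls_enrolled / girls_population                  # girls' enrolment ratio
ger_boys = boys_enrolled / boys_population                     # boys' enrolment ratio
gpi = ger_girls / ger_boys                                     # standard GPI

# When the school-age population is not evenly split between boys and girls,
# the proxy and the GPI tell different stories.
print(f"% girls: {pct_girls:.2f}, GPI: {gpi:.2f}")
```

Here the proxy says girls are 40% of enrolment, while the GPI of about 0.74 measures how girls' participation compares with boys' relative to each group's population, which is the quantity the EFA framework asks for.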

It is recognised that student age information may be difficult to acquire initially. It was noted that the Grade 8 exam project was collecting student dates of birth, which may or may not be accurate. However, at least an approximation of student age at the time of school enrolment is a necessary component if EFA indicators are to be calculated. This data is provided for in the EMIS tools which, as indicated, are not used systematically. This is further evidence of the importance of the review of EMIS tools recommended above.


Table 1: EFA Indicators not Reported

ECD (Indicator 1): Gross enrolment ratio in Early Childhood Development (ECD) [1]
G1ECD (Indicator 2): Percentage of new entrants to Grade 1 with ECD experience
AIR/NIR (Indicators 3, 4): Apparent Intake Rate (AIR) and Net Intake Rate (NIR) in primary education [2]
NER (Indicator 6): Net Enrolment Ratio (NER) in primary education
EXP (Indicators 7, 8): (a) Public current expenditure on primary education as a % of GNP [3]; (b) public current expenditure per pupil on primary education as a % of GNP per capita; and public current expenditure on primary education as a % of total public current expenditure on education
REPETITION (Indicator 12): Repetition rates by grade in primary education
SURVIVAL (Indicator 13): Survival rate to Grade 5 [4]; (Indicator 14): Coefficient of efficiency at Grade 5 and at the final grade [5]
ACHIEVEMENT (Indicator 15): Percentage of pupils who master basic learning competencies
LITERACY (Indicators 16, 17): Adult literacy rates, 15-24 years and 15 years and over; (Indicator 18): Literacy Gender Parity Index (GPI)

Note: current gender indicators are proxies for the standard gender parity index due to the lack of age cohort data.
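Two of the unreported indicators in Table 1 can be illustrated with a short sketch. The numbers are invented; as the table's footnotes note, the PES cannot compute these today because age/grade enrolment is not collected.

```python
# Sketch of two unreported EFA indicators, using hypothetical figures.
# Both follow the standard definitions: an enrolment count divided by the
# corresponding school-age population.

def net_enrolment_ratio(enrolled_of_official_age, population_of_official_age):
    """NER: pupils of official primary age enrolled / population of that age."""
    return enrolled_of_official_age / population_of_official_age

def apparent_intake_rate(new_grade1_entrants_any_age, population_entrance_age):
    """AIR: new Grade 1 entrants of any age / population of official entrance age."""
    return new_grade1_entrants_any_age / population_entrance_age

ner = net_enrolment_ratio(45_000, 150_000)
air = apparent_intake_rate(22_000, 20_000)
# AIR can exceed 1.0 because over-age and under-age entrants are counted in
# the numerator but only one age cohort appears in the denominator.
```

Both calculations need the age data whose collection is recommended above, which is why these indicators remain unreported.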

With the exception of the proxy for the survival rate, no outcome data is reported that would allow judgements about the quality of the education process. One zone shared its examination system and the data collection associated with it. That system has the potential to provide information on the "percentage of pupils who master basic learning competencies", provided suitable analysis of the examination data were undertaken. Unfortunately, the examination data was not linked to the PES; in fact a parallel database was being established, reflecting duplication of effort and inefficient use of scarce resources. Systematic recording of school outcome data should be undertaken to facilitate judgements about school quality. Such analysis is particularly important in the rapidly expanding context of Somali primary education, since there is considerable international evidence that such circumstances lead to declining education standards. A challenge will be to identify useful and meaningful indicators of student achievement. The common norm-referenced approach of reporting a mean score for a school is not very useful, as means cannot be compared across time. A more useful approach is criterion referencing, which records the percentage of students who have achieved a set of defined outcomes. The existing examination projects should be encouraged to provide such information from the analysis and recording of exam data.
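The criterion-referenced approach suggested above might be sketched as follows. The competency domains and cut-off scores are purely illustrative assumptions, not actual Somali exam standards.

```python
# Sketch of criterion-referenced exam reporting: instead of a school mean
# score, record the percentage of candidates reaching each fixed competency
# threshold. Domain names and cut-offs are hypothetical.

COMPETENCY_THRESHOLDS = {"basic_literacy": 40, "basic_numeracy": 45}

def competency_rates(scores_by_domain):
    """scores_by_domain maps a domain name to one school's exam scores."""
    rates = {}
    for domain, cutoff in COMPETENCY_THRESHOLDS.items():
        scores = scores_by_domain.get(domain, [])
        passed = sum(1 for s in scores if s >= cutoff)
        rates[domain] = passed / len(scores) if scores else None
    return rates

school_scores = {"basic_literacy": [35, 50, 62, 41],
                 "basic_numeracy": [30, 47, 55, 44]}
rates = competency_rates(school_scores)
```

Because the criteria stay fixed from year to year, these percentages can be compared across time in a way that school mean scores cannot.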

The "inclusion" of Qur'anic and Nomadic schools in the PES is considered here. Qur'anic schools seem to operate in two different ways. The majority are reported to operate by exposing children to the Qur'an before they progress to primary school; in this sense they provide some educational experience for some children prior to entry into the formal schooling system. It would thus be useful eventually to include Qur'anic institutions in a systematic data collection process, in a similar manner to the way early childhood institutions are commonly monitored and recorded. However, advice has been received that the Qur'anic schools are too numerous to record and that there would be resistance to 'government interference' in their operation.

1 This could be defined to include attendance at Qur’anic school prior to <strong>Primary</strong> school or Qur’anic<br />

school experience could be collected as separate data<br />

2 Due to not collecting age/grade enrolment<br />

3 Not a function of EMIS necessarily<br />

4 Current estimate is a proxy due to lack of repetition data<br />

5 Due to lack of repetition data<br />


It would be useful to record the number of grade 1 enrolments who have attended Qur’anic school in a<br />

way similar to the G1ECD indicator. Such data allows investigation into the impact of the Qur’anic<br />

school experience on dropout, survival and performance in Grade 1. International experience indicates<br />

that ECCE has a very positive impact on these indicators. It would be interesting to make similar<br />

investigations into the impact of Qur’anic experience. This may lead to discussions as to how to adjust<br />

Qur’anic school experiences to increase positive impact on these indicators.<br />
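Such an indicator could be computed as sketched below. The record layout and field names are hypothetical; the PES form would need a question capturing prior Qur’anic attendance for each grade 1 pupil.<br />

```python
# Sketch: a grade 1 Qur'anic-background indicator, analogous to G1ECD.
# The record layout and field names are hypothetical, not actual PES fields.

def g1_quranic_pct(grade1_records):
    """% of grade 1 enrolments with prior Qur'anic school attendance."""
    total = len(grade1_records)
    with_quranic = sum(1 for r in grade1_records if r["prior_quranic"])
    return 100.0 * with_quranic / total if total else 0.0

records = [
    {"pupil_id": 1, "prior_quranic": True},
    {"pupil_id": 2, "prior_quranic": False},
    {"pupil_id": 3, "prior_quranic": True},
    {"pupil_id": 4, "prior_quranic": True},
]
print(g1_quranic_pct(records))  # 75.0
```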

The second way in which Qur’anic schools operate is that some have started to provide instruction in the<br />

formal primary school curriculum. As such, they are indigenous schools with a hybrid curriculum and are<br />

providing alternative pathways to formal education. Two DGs indicated that such schools are already<br />

included in the PES. This is a satisfactory situation except that the current PES form does not allow<br />

recording of Qur’anic status. Such school status should be recorded. It might be provided for in question 1<br />

under “school type”, in question 9a under school “owner”, or in question 9b under school “manager” of the<br />

current PES data collection form.<br />

There seem to be no nomadic schools included in the current survey; if any are, they are not recorded as<br />

such. Nomadic children are likely to be among the most vulnerable and marginalised children, and<br />

therefore need particular attention. However, if such children are not in school, they will not be included<br />

in the survey. A more precise analysis of out of school children is needed to investigate the underlying<br />

issues that keep them out of school. The variability of the distribution of these children needs to be<br />

understood to allow strategies to be developed to address their needs. However, such a study is<br />

considered to be outside the normal ambit of an EMIS, unless such children can be identified by a<br />

disaggregation of the GER. It needs to be remembered that the GER uses population data as its<br />

denominator, and such data will not in itself identify nomadic children as distinct from other out of school<br />

children.<br />
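To make the GER point concrete, a minimal sketch is given below. The enrolment figure is the corrected total cited elsewhere in this report; the school-age population denominator is purely illustrative.<br />

```python
# Sketch: Gross Enrolment Ratio. The population denominator counts all
# children of official primary age, so it cannot by itself distinguish
# nomadic out-of-school children from other out-of-school children.
# The denominator below is illustrative, not a real population estimate.

def ger(total_enrolment, school_age_population):
    """GER = total enrolment (any age) / official school-age population * 100."""
    return 100.0 * total_enrolment / school_age_population

enrolled = 383983            # total enrolment figure cited in this report
school_age_pop = 1_500_000   # illustrative denominator only
print(round(ger(enrolled, school_age_pop), 1))  # 25.6
```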

In the extensive range of interviews undertaken there were many other suggestions for additional data<br />

which people argued would assist in their delivery of support programmes, or in the allocation of<br />

resources and in general school management. The sources of these suggestions included NGOs and<br />

REOs. Nearly all the suggestions have value. However, some would not normally be included in an<br />

EMIS.<br />

As an example, let us consider the rehabilitation needs of school buildings. A limiting factor is the ability<br />

of a wide range of people to make a consistent or reliable evaluation of the condition of a school. Head<br />

teachers and enumerators could be asked to make a global judgement about the building condition such as<br />

“new, adequate, dilapidated, unsafe”, but will the data recorders be qualified to make such a judgement,<br />

and could adequate descriptors be developed for each word on the scale to be consistently used across a<br />

large number of schools? This data remains important for planners and, in the absence of a formal school<br />

building survey by qualified experts, it should be collected with the caveat that it will need to be<br />

interpreted with caution.<br />

A number of contributors to the study asked for data about the “child friendliness” of a school that would<br />

include child centred learning, violence, child participation, clubs, participation of girls in forums,<br />

protection, quality, access, etc. Measuring all of these issues reliably creates a<br />

challenge, which may be beyond the current state of development of the PES/EMIS. It may be possible to<br />

use some proxy indicators, such as lesson types used once a week or use of corporal punishment in the last<br />

month that reflect the degree of child friendliness.<br />

The suggestion of recording learning materials also has merit, but may create difficulties because of the<br />

volume of information involved. Short of conducting a complete stocktake of the school, full coverage of such information<br />

may be impractical. The PES already records receipt of text books, but not the adequacy or otherwise of<br />

supply. It may be practical to record some key resources that are commonly available and for which there<br />

is money or a donor ready to provide.<br />


A request was made for issues relating to <strong>Education</strong> Management to be recorded in the PES. Some<br />

aspects of this are already included, such as staff meetings, functioning of the CEC, use of EMIS tools, and in-service<br />

training experiences.<br />

Two areas that have been recommended for inclusion and should receive support are non-formal<br />

education and secondary education. Both these areas are significant contributors to development and<br />

should be systematically recorded. UNESCO is currently undertaking development of a secondary school<br />

EMIS. These issues are further developed below in the sections on Coverage and Coordination.<br />

The PES currently records data from non-formal institutions, but only in association with the formal<br />

curriculum that some of these institutions provide. It would be a sensible development to expand this<br />

component of data collection to include a broader definition of non-formal curriculum components. This<br />

issue is further developed below in the coverage section.<br />

A final suggestion that has been received is the inclusion of hazards and risks that might lead to an<br />

emergency situation for education. Given the current context this would be of great benefit for planners<br />

and experts in the area. A programme to develop an emergency preparedness plan is currently<br />

emerging and consideration should be given to including relevant data and indicators in the EMIS.<br />

In this section a large number of suggestions with regard to possible additional data needs have been<br />

made that could be useful in assessing the current state of primary education and in planning for its future<br />

development. However, it would not be practical to try to expand the system to include all of these<br />

possibilities immediately. Further, the priority for expansion of the system will be more dependent on the<br />

aspirations of the MoEs than on outsiders’ views and values. It should also be recognised that the ability<br />

to expand the range of data collected will depend in part on the success or otherwise of attempts to shift<br />

further functions to the control of the MoEs. With these caveats, suggested data expansion has been<br />

organised into three groups implying a priority. The groups are called Immediate, Next Phase, and finally<br />

Aspirational and are set out in Table 2.<br />

Table 2: Suggested Development of Targeted Data<br />

Immediate: Enrolment by age, grade and gender; Condition of buildings/rehabilitation needs; Qur’anic schools/Early childhood; Outcome/student achievement data<br />

Next phase: Drop out and repetition; Background ECD of grade 1 enrolments; Text book numbers; Child friendliness of schools<br />

Aspirational: Secondary School data; Expanded Non-formal education data; Emergency preparedness; Teaching and learning resources<br />

The enthusiasm and encouragement with which NGOs and other agencies have supported this review<br />

process, and the range of suggestions provided for additional data to be included in the PES should be<br />


formally recognised and systems put in place to provide regular opportunities to contribute to the data<br />

design process.<br />


1.2 Recommendations<br />

a. That expansion of the data collected take place to provide for a full set of EFA and MDG<br />

indicators.<br />

b. That systematic recording of school outcome data should be undertaken to facilitate judgments<br />

about school quality.<br />

c. That a criterion referenced approach to school outcomes be adopted that allows the recording of<br />

the percentage of students who have achieved a set of defined outcomes.<br />

d. That the number of grade 1 enrolments who have attended Qur’anic school and ECD be recorded.<br />

e. That the Qur’anic status of schools offering an integrated hybrid curriculum be recorded.<br />

f. That a more precise analysis of out of school children be undertaken to investigate the underlying<br />

issues that keep them out of school. This should be a study separate from the EMIS.<br />

g. That school condition data be collected with the caveat that it will need to be interpreted with<br />

caution.<br />

h. That proxy indicators that reflect the degree of child friendliness, such as lesson types used regularly<br />

or use of corporal punishment in the last month, be considered.<br />

i. That key resources that are commonly available, and for which funding or a donor is ready to<br />

provide, be recorded.<br />

j. That the data collected include an expanded view of non-formal curriculum components.<br />

k. That data on emergency risk and emergency preparedness be considered.<br />

l. That the priority list for data expansion be adopted.<br />

m. That NGOs and other agencies be given regular opportunities to contribute to the data design<br />

process.<br />

2 <strong>Report</strong>ing<br />

2.1 Findings and Analysis<br />

In this section comment will be made with regard to two separate aspects of reporting. The first is<br />

comment with regard to the Draft report from the 2006/2007 survey, which has not as yet been published.<br />

The second draws upon findings with regard to the use of the Data and the EMIS conceptual framework<br />

that indicated the EMIS is a process that requires movement of information both up and down within the<br />

system. A significant development this year has been the use of an Access data base to store the data. This<br />

will improve flexibility and accessibility for investigating emerging issues and challenges. More comment<br />

on the potential of this data base is made below, but first the draft 2006/7 report is considered.<br />

The nature of reporting from the survey has been explored. The draft PES report that has been analysed<br />

seems to be opaque, poorly constructed and contains many inaccuracies. These inaccuracies have<br />

occurred in the extraction of data from the data base and in the subsequent analysis. A few of the<br />

apparent errors are listed here to illustrate the point.<br />

• Pagination is not consistent with the contents table<br />

• Headings of unnumbered tables on pages 19 to 23 refer to the wrong year<br />

• Total enrolment figure for North West region on page 19 is 124117. It should be 115999. 115999<br />

is used elsewhere in the report and is consistent with the data in the Access data base.<br />

• The difference above leads to inconsistent reporting of total enrolment figures across the report<br />

(392101 versus 383983).<br />

• Annex 7 uses a total enrolment of 318159, when a check reveals it should be 383983. (An earlier<br />

version of annex 7 which had some major inaccuracies has, fortunately, been extensively<br />

modified in the current draft of the report)<br />

• Percentage change figures are sometimes incorrect or do not use a negative sign to differentiate<br />

between directions of change<br />

• Gross Intake Ratio is inappropriately defined<br />

• Survival rates are not consistent with EFA definitions<br />


This list is not claimed to be exhaustive but does create a sense of unease in the reader about the<br />

confidence that can be placed in the data of the report, especially when there are glaring errors in the<br />

opening tables.<br />

It would seem that the errors do not arise from the data collection or data entry processes but rather from<br />

the subsequent analysis and documentation process. A more carefully compiled report, with more attention<br />

paid to checking, would have been useful.<br />
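Many of these are internal-consistency failures that simple automated checks could catch before publication. A minimal sketch follows; the “Elsewhere” figure is a placeholder derived from the totals cited above, and the checks themselves are a suggestion, not an existing part of the PES process.<br />

```python
# Sketch: automated consistency checks that could catch errors like those
# listed above before publication. "Elsewhere" is a placeholder aggregate
# derived from the totals cited in this report; the checks are a suggestion.

def check_total(regional_enrolments, reported_total):
    """Flag a table whose regional figures do not sum to its stated total."""
    computed = sum(regional_enrolments.values())
    return computed == reported_total, computed

def pct_change(old, new):
    """Signed percentage change: a negative result marks a decline."""
    return 100.0 * (new - old) / old

# The inflated North West figure (124117 instead of 115999) makes the
# regional breakdown inconsistent with the corrected total of 383983.
regions = {"North West": 124117, "Elsewhere": 267984}
ok, computed = check_total(regions, 383983)
print(ok, computed)          # False 392101
print(pct_change(400, 380))  # -5.0
```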

The question arises of what to do with the Draft 2006/7 report. It should not be published in its current<br />

form. However, some report needs to be made available. The options suggested here are:<br />

a. Ask the writer to carefully review the draft and to remove errors<br />

b. Ask a new analyst to review the current report with a view to eliminating all identifiable errors,<br />

but not attempting to change structure or readability (say 3 weeks).<br />

c. Ask a new analyst to start a new report preparation process, that would lead to a user friendly (or<br />

at least an improvement on the current model) product with the same scope as the current report<br />

(say 2 months). It would be feasible to ask the analyst to extract a small set of summary data for<br />

publication in a small “<strong><strong>Somali</strong>a</strong> <strong>Education</strong> Facts” book that presented key indicators and some<br />

time series data disaggregate by Gender and Zone. This summary data could then be given to a<br />

designer to be prepared for publication.<br />

d. If all of the above seem out of the question, a simpler option would be to prepare just a “Somalia<br />

Education Facts” book and distribute it with a disc copy of the Access data base with a set of<br />

queries embedded to facilitate more detailed inquiry when necessary (say 1 month).<br />

It was said earlier that the report seems opaque. By this it is meant that it is difficult to access and find<br />

data in it, and that it lacks a coherent design and organisational layout. A conceptual framework guiding the<br />

construction of the report would be a useful addition. Part of this problem is due to the vast amount of<br />

data being reported, and it should be reported. It is clear that this report is intended for relatively few<br />

people in the Development sector who require base line data for the development of proposals and<br />

submissions.<br />

It has been noted in this evaluation that the MoE makes very little use of the data in the current report<br />

format. This is in part due to a lack of systematic planning and monitoring processes. However, NGOs<br />

have reported making extensive use of the report in its current format. If there is to be a shift from a<br />

‘survey’ to an ‘EMIS’ a more effective way of sharing the data will have to be found.<br />

In the conceptual framework for an EMIS the need for a system that allows for dissemination of material<br />

in a flexible way was outlined, with emphasis on the need for a two-way flow of information “so that all<br />

levels of the organisation can make effective use of the data and reports generated from the data.” This<br />

can never be achieved with a single report format such as the PES is currently using. It is necessary to<br />

clearly identify target groups, consider their needs and capacities and to design report formats to suit. It is<br />

axiomatic that UNICEF/UNESCO’s data needs are different from those of a news agency, regional office or local<br />

school/community. The current model of reporting is one where information flows one way,<br />

preventing lower levels of the MoE organisation from benefiting from the data collection and analysis<br />

process to which they were so fundamental.<br />

A set of target groups needs to be identified and reports designed specifically for each group. This process<br />

should also be supported by a front end module built for the Access data base that can be flexibly accessed<br />

by a variety of users without the technical capacity to use the query module of Access. A front end<br />

module is software specially developed to provide a facility that allows data users to access information<br />

by making selections from a set of parameters of interest. An example of this has already been developed<br />

for the Health Management Information System for <strong><strong>Somali</strong>a</strong>.<br />
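The idea can be illustrated with a small sketch. Here sqlite3 stands in for the MS Access data base, and the table layout, region names and figures are hypothetical.<br />

```python
# Sketch of a "front end module": users pick parameters (here, a region)
# and get a summary without writing queries themselves. sqlite3 stands in
# for the MS Access data base; table layout and figures are hypothetical.
import sqlite3

def enrolment_summary(conn, region=None):
    """Return total enrolment, optionally filtered by region."""
    sql = "SELECT SUM(boys + girls) FROM enrolment"
    params = []
    if region is not None:
        sql += " WHERE region = ?"
        params.append(region)
    return conn.execute(sql, params).fetchone()[0]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE enrolment (region TEXT, boys INTEGER, girls INTEGER)")
conn.executemany("INSERT INTO enrolment VALUES (?, ?, ?)",
                 [("Awdal", 300, 200), ("Bari", 400, 250)])  # illustrative rows
print(enrolment_summary(conn))            # 1150
print(enrolment_summary(conn, "Awdal"))   # 500
```

A real front end would wrap such parameterised queries behind menus or forms, so that users never write SQL or Access queries themselves.<br />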

At a minimum the following set of reports should be developed and distributed:<br />


• A school report that should be provided within 3 months of data collection. This report can also<br />

be used as a data validation check. In addition it can be used by schools and CECs for planning<br />

processes and development of school improvement plans.<br />

• District reports that reflect education conditions within the district generally and allow schools<br />

and CECs to benchmark themselves and district officers to determine priorities<br />

• Regional reports that reflect education conditions across regions and allow for comparison of<br />

progress between regions.<br />

• Zone and Somalia reports that provide a summary of major EFA indicators and trends<br />

disaggregated by Region, gender and Rural/Urban. The tables in the current report would be<br />

available via the front end module to the data base.<br />

• A set of thematic reports that would change from year to year and would provide deeper analysis<br />

of EMIS data of issues of specific relevance. This might include gender, vulnerable and<br />

marginalised children, water and sanitation, condition of building or impact of specific<br />

programmes. It is expected that specialist technical assistance would be required to support this<br />

work. This technical assistance would be supported by a variety of interested agencies.<br />

• The capacity to interrogate the data base using the query module of Access would be developed<br />

at Zone level and would be available for generating specialised reports for stakeholders<br />

In preparing these reports specific attention should be paid to the technical capacity of the target group to<br />

interpret and use the reports. It has been suggested that a glossary of terms be included in each report to<br />

enhance readability.<br />

To make more explicit the thoughts behind the need for alternative reporting formats, a set of indicative<br />

data needs and how they might be used have been developed and included as Appendix 7. They address a<br />

range of specific school, district and regional issues that are common in emerging education<br />

systems. They provide some direction as to how evidence based planning might begin.<br />

An emerging form of reporting is in the form of Maps. The EU has funded the Food and Agriculture<br />

Organisation to develop a <strong><strong>Somali</strong>a</strong> Dynamic Atlas. Data from the EMIS should systematically be<br />

integrated into a comparable atlas via GIS software. This format provides a useful form of collating data<br />

sets and providing easily accessed information in map form with associated attachments and tables. The<br />

GPS coordinates that have been collected in the PES will facilitate this synthesis. Such a synthesis is<br />

discussed further in the coordination section of this report.<br />
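As an illustration of how the GPS coordinates could feed a map-based atlas, the sketch below converts school records to GeoJSON, a format most GIS tools can ingest. The field names and the example school are hypothetical, not drawn from the PES.<br />

```python
# Sketch: exporting school records (with their GPS coordinates) to GeoJSON
# for use in GIS/atlas software. Field names and data are hypothetical.
import json

def schools_to_geojson(schools):
    """Build a GeoJSON FeatureCollection of school point locations."""
    features = [
        {
            "type": "Feature",
            "geometry": {"type": "Point",
                         "coordinates": [s["lon"], s["lat"]]},  # GeoJSON order is [lon, lat]
            "properties": {"name": s["name"], "enrolment": s["enrolment"]},
        }
        for s in schools
    ]
    return {"type": "FeatureCollection", "features": features}

schools = [{"name": "Example Primary", "lat": 9.56, "lon": 44.07, "enrolment": 420}]
geojson = schools_to_geojson(schools)
as_text = json.dumps(geojson)  # ready to write to a .geojson file
print(geojson["type"], len(geojson["features"]))
```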

DEVINFO format of reporting should also be considered. DEVINFO is a data base format commonly<br />

used in developing countries for providing access to key data and indicators across a wide range of areas,<br />

including education, health, population, WASH, etc. It is not country specific and hence the data of many<br />

countries can be accessed using this format.<br />

It has become evident that there is little capacity in some zones in the areas of evidence based planning,<br />

monitoring of progress at various levels of the system, or school improvement planning. It will not be<br />

adequate to merely produce reports. It will therefore be necessary for support and training to be provided<br />

for many levels of the MoE, on how to use the reports to improve the education outcomes for the children<br />

of <strong><strong>Somali</strong>a</strong>.<br />

There already exist Community Education Committees (CECs) that provide support for education at the<br />

local level. Additionally, UNICEF is supporting a Community Development Project that aims to<br />

empower local communities to harness resources through identifying local needs and priorities, and<br />

proposal writing. These structures can be used to enhance education ownership at the local level. By<br />

linking to Community Development Committees (CDCs) this process can be used to mobilise resources.<br />

This process could be strengthened in the field of education by providing school/community reports that<br />

focus on reporting local conditions and facilities together with relevant comparisons. Such reports could<br />

be used to underpin local proposal development.<br />


2.2 Recommendations<br />

a. That the draft 2006/2007 report should not be published in its current form.<br />

b. That a set of target groups are identified and reports designed specifically for each group. This<br />

process should also be supported by a front end module built for the access data base that can be<br />

flexibly accessed by a variety of users without the technical capacity to use the query module of<br />

Access.<br />

c. That reports include a school/CEC report, District reports, Regional reports, Zone and <strong><strong>Somali</strong>a</strong><br />

compact reports that provide a summary of major EFA indicators and trends disaggregated by<br />

Region, gender and Rural/Urban, and a set of thematic reports that would change from year to<br />

year and would provide deeper analysis of EMIS data of issues of specific relevance.<br />

d. That the capacity to extract information from the Access data base using the Query module of<br />

Access be developed at Zone level among those responsible for data base management.<br />

e. That data from the EMIS be systematically integrated into an atlas via GIS software.<br />

f. That necessary support and training be provided for many levels of the MoE, on how to use and<br />

interpret the reports.<br />

D. IMPACT<br />

Findings and Analysis<br />

The impact of survey instruments such as the PES is difficult to measure in objective terms. This study has<br />

attempted to assess impact by making inquiries with regard to the use and status of the PES report.<br />

Use of the PES by NGO and UN agencies has been reported earlier in this report, and indicates extensive<br />

use of the data and indicators by all agencies contacted. In particular it has allowed participation in EFA<br />

reporting and the setting of appropriate Millennium Development Goals. The general view is that the PES<br />

is the only significant information source in the sector for <strong><strong>Somali</strong>a</strong>. These positive responses tend to<br />

indicate the perception that the PES reports are a valued and reliable instrument for planning in the sector.<br />

The extent to which the PES has an impact on or “contribute[s] to increased enrolment rates, greater<br />

opportunities for girls, increased numbers of teachers and a larger number of quality schools” (Appendix<br />

2) can only be measured by the success of the programmes that are underpinned by the survey. The way in<br />

which the survey data underpinned proposals and programme development described by agencies earlier<br />

is evidence of this critical link. It is therefore necessary to look at the trend studies to quantify the impact<br />

of the efforts in the sector. One example is provided here to make the point. The number of children in<br />

schools has increased from approximately 150000 in 1999 to just under 400000 in 2007. This trend<br />

has been consistent over the last ten years and applies to all three zones, with the exception of a decline in<br />

enrolments in the CSZ in the last year of measurement. The Gross Enrolment Ratio (GER) has also<br />

increased across all three zones in the last ten years. The juxtaposition of the PES and this increase in<br />

enrolment and GER does not establish cause and effect directly, but is indicative of some success in the<br />

sector when viewed holistically.<br />

While these judgements of impact can be made, there have been identified in this report a number of EFA<br />

indicators that cannot be calculated due to the data not being collected. Examples of these are net<br />

enrolment ratios, and survival and efficiency indicators. The impact of the PES would be enhanced by<br />

the inclusion of these data and indicators.<br />
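The survival-rate gap can be made concrete. The EFA definition relies on the reconstructed cohort method, which requires repetition data; without it, only a crude apparent-cohort proxy such as the one sketched below is possible. The enrolment figures are illustrative, not PES data.<br />

```python
# Sketch: why missing repetition data matters for survival rates. Without
# repeaters by grade and year, the EFA reconstructed-cohort method cannot
# be applied, and only the crude apparent-cohort proxy below is possible.
# Enrolment figures are illustrative, not PES data.

def apparent_survival_to_grade5(g1_enrolment_year_t, g5_enrolment_year_t4):
    """Proxy: grade 5 enrolment four years later over grade 1 enrolment.

    Biased by repetition and by enrolment growth, hence only a proxy.
    """
    return 100.0 * g5_enrolment_year_t4 / g1_enrolment_year_t

print(round(apparent_survival_to_grade5(10000, 5200), 1))  # 52.0
```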

While there is evidence of impact at the macro level of agency planning, there is little evidence of a<br />

significant impact of the survey at either ministry level or at regional, school or community level. This is<br />

probably a result of capacity constraints and a lack of a wide ranging and flexible reporting structure.<br />

Strategies have been identified above to improve performance and impact in these areas. These include<br />

the decentralization of survey strategies and data use to enhance ownership of the processes involved. The<br />

emergence of evidence driven planning is one of these processes that will be associated with improved<br />

impact.<br />


Currently common practice in agencies and NGOs is to sustain separateness between sectors such as<br />

education, health, water and sanitation, child protection, etc. While attempts are made to have a more<br />

integrated approach to programme planning through constructs such as focusing on common regions or<br />

the identification of ‘cross-cutting’ issues in an attempt to achieve synergies of programming effort, such<br />

attempts are not often reflected in systematic programme infrastructure and often have to be achieved<br />

through an additional layer of arrangements internal to an organisation. One approach to build a more<br />

systematic infrastructure to achieve cross sector cooperation is the DevInfo data base that stores key<br />

indices for all sectors across many countries to facilitate more holistic analyses. DevInfo tends to focus on<br />

macro level data with some disaggregation.<br />

There does appear to be an opportunity to further enhance impact through the programme infrastructure<br />

by allowing cross sector investigations to be made by linking the EMIS to the Health IMS at district, and<br />

community level to allow systematic investigations to be made in support of holistic programming<br />

activities. This together with a mapping process facilitated by the GPS coordinates being collected will<br />

allow for some detailed and holistic micro level reporting that will be particularly valuable in emergency<br />

situations, hence enhancing the impact of the existing data collection and storage processes across sectors.<br />

Recommendation<br />

a. That impact of the PES be improved by implementation of the recommendations of this report in<br />

the areas of diversified reporting, capacity building on data use, broadening the scope of data<br />

collected, decentralisation of PES processes and linking PES to other data systems.<br />

E. SUSTAINABILITY<br />

Findings and Analysis<br />

If sustainability of the process means that the MoEs can undertake the PES independently of outside<br />

technical assistance and funding support, then it is abundantly evident that the process is not sustainable<br />

now. Ministries are dependent on UNICEF for funding, organisation, conceptual overview, monitoring<br />

and much of the motivation for undertaking the annual survey. One Director General, while<br />

understanding the notion of and need for sustainability, sees UNICEF as part of that sustainability:<br />

UNICEF is not like a small NGO, it will be here in the future.<br />

Clearly the DG has a different concept of sustainability than that suggested above. The need has already<br />

been discussed for carefully designed structures and capacity-building procedures at ministry level.<br />

These are emerging, and UNICEF’s contributions of planning and EMIS technical assistance, together with the<br />

establishment of an EMIS unit for each of the three ministries, will be important contributions to<br />

sustainability. The extent to which they are successful will obviously impact on sustainability.<br />

To the extent that lack of quality staff is seen as a threat to sustainability, it may be necessary to look for<br />

different arrangements than the traditional employment of staff in permanent positions within the MoEs to<br />

meet capacity demands of the EMIS. One possible alternative is to issue tenders outsourcing<br />

the work to suitable institutions that can demonstrate the appropriate management and technical<br />

skills. Such an institution may be the local institutes of higher learning which might be expected to have<br />

suitable research, data collection and IT skills available. Such ‘outsourcing’ strategies would have the<br />

advantage of providing funding support for students and could assist in developing genuine research<br />

interest in the data to link with the concept of generating specialist reports on particular areas of the<br />

education sector.<br />

In addition to the above general concepts there are a number of perceived threats to sustainability specific<br />

to the local contexts that are listed here.<br />

1. The EMIS units will have to be staffed with capable and professional local people. It was often<br />

said that trained local staff are hard to retain due to the demand for trained staff from other<br />


organisations. This will be a threat to sustainability and reflects the need to provide<br />

competitive salaries and incentives for the new recruits. There will be a need for assistance until<br />

such time as financial support for education, from the government, is adequate. In the short term,<br />

MoEs should provide evidence of their commitment to the EMIS process by accepting at least<br />

partial responsibility for staff salaries and have a plan to accept full responsibility in the<br />

foreseeable future.<br />

2. The development of clear ethical standards and effective monitoring systems of staff will be<br />

required. These may include defined work hours and conditions, commitment to achievement of<br />

targets in a timely fashion, performance appraisal, and adequate reward for effort.<br />

3. Transparent selection procedures of staff to ensure nepotism is avoided and that appropriate skill<br />

levels and potential are reflected in employees.<br />

4. Coordination of EMIS across three ministries, especially as capacity grows within zones, will<br />

need to be carefully managed. Along with decentralisation, it is to be expected that greater<br />

autonomy of action will result, and hence differentiated EMIS practice may follow.<br />

5. Establishment of communication and travel capacity is necessary for transfer of the PES to more<br />

usual structures of EMIS operations. While these are beginning to emerge, they need to be<br />

consolidated and encouraged. Regional level monitoring and quality assurance systems will be<br />

reflected in these developments.<br />

6. In Puntland the restriction of movement for the technical assistance staff and current restrictions<br />

to compounds will impact adversely on capacity building plans.<br />

7. In central south the current instability threatens sustainability. The technical assistance will<br />

probably need to be located in Nairobi. This will create challenges for the development of<br />

capacity in local staff, without which sustainability in the longer term becomes difficult.<br />

8. Maintenance of computing facilities. Access to internet will be necessary for virus control, plus a<br />

regular maintenance contract and updating plan for computing equipment.<br />

Recommendations

a. That employment conditions be such as to ensure the retention of professional staff
b. That outsourcing of some EMIS activities be considered in areas of staff shortage
c. That merit-based employment procedures be defined and implemented
d. That effective work practices and standards be identified and made explicit
e. That a mechanism to coordinate EMIS activities across the three zones be established
f. That an effective quality assurance system be established that includes communication and travel arrangements
g. That systems of EMIS implementation be differentiated across regions to reflect differences in local conditions
h. That a computer maintenance and replacement policy be developed and implemented

F. COVERAGE

Findings and Analysis

Since the PES is a census, there can be no threats to validity derived from sampling bias. There have been some questions as to the suitability of conducting the survey on a sample basis rather than the current census. This would not be consistent with the conceptual framework for an EMIS outlined here, and would severely limit the potential use of the EMIS as it grows into a complete system.

While the priority for development in the recent past has been primary education, there have been several requests for the PES/EMIS to be expanded to include secondary education. While the secondary education system is currently relatively small, it still needs to be assessed and planned for; hence the necessity for a systematic data collection procedure. UNESCO is currently beginning that implementation. Further comment on this issue is offered in the coordination section.

One planning officer indicated that nomadic schools are included in the current PES, while another in a different zone indicated that there were no nomadic schools. The Save the Children NGO has established a number of models to target nomadic children in a special project involving the establishment of 44 nomadic schools with 5,300 learners. It is important that these can be identified in the PES/EMIS process to allow monitoring of the impact of the project over time. Nomadic children should obviously be included in any data collection process because of their perceived vulnerability and marginalisation. Currently there is no system for identifying nomadic school status in the PES. This should be addressed, and REOs asked to identify the locations of such schools where they exist.

The PES covers primary schools that target children of normal school age who study the formal primary curriculum. In addition to primary schools there are Primary Alternative Education Centres (PAEs) that target youth who have either not attended school or whose schooling was disrupted by social instability. In addition to the youth, a significant number of primary-school-aged children attend these centres by accompanying older siblings. Both these groups at PAEs have a significant enrolment of girls. To serve these differentiated groups, two curricula are offered. One is the standard formal primary curriculum offered to children of normal primary school age; the other is referred to as the 'Non-Formal Curriculum', which consists of the first five years of the formal primary curriculum compressed into a four-year programme to better suit the older youth it targets. The PES currently records data on students undertaking the formal primary curriculum in alternative education institutions, in addition to the number of students undertaking the non-formal curriculum. Unfortunately, it does not record the ages of these students, and age would seem to be a critical issue when considering the diversity among the groups catered for at these alternative centres.

There is no provision for other forms of non-formal education to be included in the survey. These might include intensive adult literacy classes, traditional crafts, peace education, child-rearing practices, etc. Some requests were received for these to be included in the PAE form of the PES. This would be suitable if they were supervised and monitored by the MoEs. It becomes very difficult to adequately define some of these activities, as they are often transient. If they were to be included in the PES/EMIS, they would need to address a policy position in which the MoEs or agencies were interested. An example of such a policy position might be improving adult literacy; if this were the case, it would be relevant to include non-formal adult literacy classes in the survey.

Recommendations

a. That data collection systems be expanded to include all areas that have policy implications for MoEs. This would include ECCE, secondary education and non-formal education.
b. That nomadic schools be clearly identified in the PES

G. COORDINATION

Findings and Analysis

UNICEF appears to work very effectively with other stakeholders. There is an annual consultation process, and the work of UNICEF in coordinating the PES is well respected and valued.

The PES is well known in the education sector and, as detailed earlier in this report, the data it provides is widely used by NGOs, UN agencies and Ministry authorities. Of all the agencies that contributed to this evaluation, only one indicated that it did not have a copy of the reports made available by UNICEF.

Further issues that will require effort to ensure adequate coordination are the development of the secondary school EMIS by UNESCO, and coordination across the three zones of Somalia as responsibility shifts to the MoEs with growing capacity and ownership. The UNESCO secondary survey has some design elements in common with the PES, in that enumerators collect data and it is entered into computers in Nairobi. However, the data is entered into SPSS rather than a relational database. The data is entered twice as a validity check and the two passes are compared. The current reporting system is very rudimentary and is based on the 'tables' capacity of SPSS.
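The double-entry validity check described above can be illustrated with a short sketch. This is not the UNESCO implementation, just a minimal illustration of the technique; the record key and field names are hypothetical.

```python
# Sketch of a double-entry comparison: the same forms are keyed in
# twice, and the two passes are compared record by record so that
# discrepancies can be resolved against the paper forms.
# Field names ("school_code", "enrolment_girls", ...) are invented.

def compare_entries(first_pass, second_pass, key="school_code"):
    """Return a list of (key, field, pass1_value, pass2_value) discrepancies."""
    second = {rec[key]: rec for rec in second_pass}
    discrepancies = []
    for rec in first_pass:
        other = second.get(rec[key])
        if other is None:
            # Record present in the first pass but missing from the second.
            discrepancies.append((rec[key], key, rec[key], None))
            continue
        for field, value in rec.items():
            if other.get(field) != value:
                discrepancies.append((rec[key], field, value, other.get(field)))
    return discrepancies

pass1 = [{"school_code": "S001", "enrolment_boys": 120, "enrolment_girls": 95}]
pass2 = [{"school_code": "S001", "enrolment_boys": 120, "enrolment_girls": 59}]
print(compare_entries(pass1, pass2))
# -> [('S001', 'enrolment_girls', 95, 59)]
```

A transposition error such as 95 keyed as 59 would pass any single-entry range check, which is why the comparison of two independent passes is valuable.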


The lack of codes matching those the PES uses for zone, region, district and settlement, together with the lack of location coordinates, makes it impossible to bring the two data systems together. This would be necessary if the Ministry wanted to compare available secondary capacity with the demand for secondary school places from graduating primary students.
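To make the point concrete, the following sketch shows the kind of join that shared location codes would permit, setting district-level secondary capacity against demand from graduating primary pupils. The district codes, field names and figures are all invented for illustration.

```python
# Sketch: if the PES and the secondary survey shared district codes,
# secondary places could be set against demand from primary graduates.
# Codes and figures below are hypothetical.

primary = [  # from the PES: final-grade enrolment per district
    {"district": "AW-01", "grade8_enrolment": 410},
    {"district": "AW-02", "grade8_enrolment": 150},
]
secondary = [  # from the secondary survey: Form 1 places per district
    {"district": "AW-01", "form1_places": 300},
]

# Index secondary capacity by the shared district code, then join.
places = {rec["district"]: rec["form1_places"] for rec in secondary}
for rec in primary:
    available = places.get(rec["district"], 0)  # no record means no places
    shortfall = max(rec["grade8_enrolment"] - available, 0)
    print(rec["district"], "shortfall:", shortfall)
# -> AW-01 shortfall: 110
# -> AW-02 shortfall: 150
```

Without a common coding scheme there is no key on which such a join can be made, which is the practical cost of the divergence noted above.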

It was also noted that data entry training within zones is being duplicated, and that data entry computers are being provided by both UNICEF and UNESCO.

Discussions with UNESCO staff indicate that they have a philosophy for an EMIS similar to the conceptual framework presented here, and that they aspire to transfer responsibility for the survey and the emerging EMIS to the MoEs. The UNESCO spokesman spoke passionately about the need to harmonise the primary and secondary systems, but agreed that this is not yet happening. A structure needs to be found that will bring the two systems together in a rational way. One suggestion is that both the PES and the secondary survey be run through the EMIS unit once it is established, and that the EMIS coordinator and the associated technical assistant be responsible for harmonising the two systems.

Exciting opportunities for cross-cutting appraisal of conditions exist if the links to WASH, health and emergency data, via a school mapping exercise, are well coordinated. This can be done by using location coordinates and existing zone, regional and district boundaries, in addition to accurate settlement locations and up-to-date population data. An earlier consultant's report on school mapping recommended that UNICEF engage a GIS specialist to facilitate this work across many areas of UNICEF activity. This would be a useful way of developing mapping processes for education and would facilitate the integration of several areas of UNICEF activity.
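As an illustration of what shared GPS coordinates make possible, the sketch below links each school to its nearest water point using the standard haversine distance. All names and coordinates are invented; a real exercise would use proper GIS tooling as recommended above.

```python
import math

# Sketch: with coordinates recorded for both schools and WASH
# facilities, each school can be linked to its nearest water point.
# Names and coordinates are invented for illustration.

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

schools = [("School A", 9.56, 44.06)]
water_points = [("WP-1", 9.60, 44.10), ("WP-2", 9.40, 44.00)]

for name, lat, lon in schools:
    nearest = min(water_points, key=lambda w: haversine_km(lat, lon, w[1], w[2]))
    print(name, "-> nearest water point:", nearest[0])
# -> School A -> nearest water point: WP-1
```

The same nearest-neighbour idea extends to health posts or emergency sites, which is why recording accurate coordinates in the PES pays off well beyond education reporting.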

In addition to the PES, other surveys have been conducted for a variety of purposes and by a variety of agencies. A recent example is the CARE Baseline Survey of the Integrated Support for Primary and Alternative Basic Education (ISPABE) Programme in Central South Somalia. This study is restricted to just three regions in one zone and covers many of the areas studied in the PES. However, a significant additional contribution is that it adds a qualitative dimension to the data: instead of asking just what or how many, it asks why. In this way it supplements the PES material. Additionally, the two studies could be used to validate the data they have in common.

Another study of some significance is the Multiple Indicator Cluster Survey (MICS) undertaken by UNICEF in 2005. This survey follows a model used in many developing countries and is based on a standard 'core' with adaptations to investigate special contextual issues in Somalia. It used a sample approach to explore a range of indicators at zonal level and allows for disaggregation by gender and rural/urban location. It provides a range of indicators that are not currently available in the PES, such as net enrolment ratios and survival-to-grade-five indices. It also covers a range of areas outside education, such as nutrition, child health, environment, child protection and HIV/AIDS. It is thus a much more broad-brush survey that provides a wide range of data on a national scale, but it would not be suitable for many, indeed most, of the functions expected of an EMIS.
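The two indicators mentioned above can be sketched from their usual definitions: the net enrolment ratio relates pupils of official primary age who are enrolled to the population of that age group, and survival to grade five tracks what share of a grade-one cohort reaches grade five. The figures below are invented, and the survival calculation is a deliberately simplified cohort view, not the full reconstructed-cohort method.

```python
# Sketch of two indicators the MICS provides that the PES currently
# cannot: net enrolment ratio (NER) and survival to grade five.
# All figures are invented for illustration.

def net_enrolment_ratio(enrolled_official_age, population_official_age):
    """Pupils of official primary age enrolled, as a percentage of that age group."""
    return 100.0 * enrolled_official_age / population_official_age

def survival_to_grade(cohort_size, promoted_counts):
    """Percentage of a grade-1 cohort reaching a later grade, given the
    numbers promoted at each step (a simplified cohort view)."""
    surviving = cohort_size
    for promoted in promoted_counts:
        surviving = min(surviving, promoted)
    return 100.0 * surviving / cohort_size

print(round(net_enrolment_ratio(34000, 120000), 1))              # -> 28.3
print(round(survival_to_grade(1000, [850, 760, 700, 640]), 1))   # -> 64.0
```

The point for the PES is the data requirement: the NER needs age-disaggregated enrolment plus population estimates, and survival indices need cohort tracking across years, neither of which a single annual head count supplies.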

Other studies conducted in Somalia, such as From Perception to Reality: A Study on Child Protection in Somalia, do not directly record EMIS-type data. However, they should be drawn on in programme development in the education context, since they include data on issues such as violence against children, children with special needs and other significant matters that impact directly on formal school activity.

Recommendations

a. That the PES and secondary school data systems be coordinated to ensure consistency between them
b. That mapping tools be used as a coordinating device across sectors
c. That survey and study design be undertaken to complement the PES/EMIS


H. CONCLUSION: FUTURE STEPS TOWARDS AN EMIS

It has been noted that UNICEF is currently committed to providing technical support in the areas of planning and EMIS. These resources need to be integrated closely with the experience and support mechanisms recommended here.

It is suggested that an EMIS unit be established within a Directorate of Planning and Statistics of each zone Ministry. The EMIS unit would be staffed by:

• an EMIS coordinator
• an EMIS database manager
• five data entry and retrieval staff
• two secretarial and administrative assistants

The unit would have the assistance of technical support responsible for helping the EMIS coordinator and database manager plan the development of the unit and the capacity development of its staff.

Capacity development would have to take place in the areas of:

• relational database design and construction
• data form design in Access
• data entry, cleaning and validation
• design and use of Access queries
• design and generation of Access reports
• use of front-end software to generate flexible reports and provide specialist information
• use of mapping software
• design, management and implementation of the EMIS process
• linking of the EMIS unit with the planning arms of the Directorate

From this list it can be seen that the technical assistance will need to be in place for a considerable length of time. While accurate estimates of this time frame are difficult, a minimum of five years should be planned for.

While it has been recommended that capacity building structures be improved to include, along with training, the opportunity to gain experience in complex tasks within a supportive environment, there is a need to emphasise that the Ministries have some additional responsibilities. It is essential that they identify suitably qualified focal points and the necessary support staff to undertake these duties. Current practices of people working without salary will threaten sustainability. In addition, the appointed staff must be adequately supervised to ensure the achievement of set goals and targets and a professional work ethic in terms of hours worked and commitment to the organisation. Without these qualities in the EMIS staff, no capacity building exercise can be expected to have a significant impact on the organisation.

The approach assumed here is that the PES is an early step towards the development of an EMIS supervised and administered by the MoEs. This was the understanding that emerged in 1997, and there have been some tentative steps towards this autonomy since, but a comprehensive transition plan does not appear to have been developed. The tentative steps include the provision of computing hardware, the training of enumerators and the use of an Access database. More recently, plans have emerged for technical assistance for further capacity development, together with the provision of buildings. The challenge will be to adequately assess the rate of development of MoE capacity to ensure the sustainability of the desired autonomous EMIS.

There needs to be agreement as to precisely what the aspirations are in developing an EMIS. The language of EMIS is being used in a number of ways that seem at odds with international standard use of the term. To address this issue, a conceptual framework (see Appendix 3) has been developed that assists in defining what is meant by EMIS and, if agreed to, will assist in designing a development pathway. An aim of this review is to present options for moving from where the PES is now to where stakeholders aspire to be in, say, five years' time. The conceptual framework has been used extensively in developing countries in Africa and Asia. It is accepted as an aspirational model for an EMIS for the component zones of Somalia. The current PES and its associated Access database can be referred to as the initial component parts of an EMIS. The current 'EMIS tools' should be referred to as 'School Records'.

Earlier in this report a set of components was identified that need to be developed as part of the transition from a survey to an EMIS. Central to this transition is the notion that the PES will not be replaced, but rather further developed to form an EMIS consistent with the conceptual framework. Throughout this report are sets of recommendations relating to each of the components in the list below. A brief summary of these recommendations is provided here to clarify the development directions being recommended.

• Identification of data to be collected and design of data forms. A prioritised set of additional data is listed in the relevance and appropriateness section. This reflects the EFA model included in the conceptual framework (Appendix 3), together with some context-specific data recommendations gleaned from field observations.

• Data collection. This is currently undertaken by enumerators. Suggestions are made about developing conditions that would allow head teachers to communicate completed data sheets to REOs.

• Validation of data collection and data cleaning. A set of recommendations is provided in the efficiency section. Central to this is the development of an audit trail for more efficient monitoring of data collection, the improvement of training leading to a certification process, the transparency of employment processes and the identification of a set of ethical considerations.

• Data entry processes. This should be the next component to be transferred to the MoEs. Suggestions are made in the ownership section about how this might be done and how the training component of capacity building needs to be developed.

• Data analysis and report generation. In the reporting section a set of recommendations is made that will expand and enhance the use of the data. It is envisaged that many of these will be produced by a 'front end' software component to the existing Access database. The emphasis here is on flexibility and a variety of report formats designed to suit a variety of stakeholders. Central to this concept is reporting to regions, districts, schools and their communities to facilitate planning and school improvement.

• Integration of the various data collection strategies currently undertaken. In the section on data duplication, the implications of some current planning and timing issues are discussed and suggestions made for reducing the impact on resource use caused by duplication of data collection by different sections of the Ministries.

The EMIS focal point and the EMIS technical assistance should develop a 'rolling' transition plan for the development of the components of the EMIS and the shift of responsibility to the MoEs. An EMIS Development Planning Matrix, provided as Appendix 6, might offer a useful starting point for the rolling plan. Such a plan should include a 'vision' for the EMIS unit as a service unit to other units within the Ministry, such as regional offices, schools and their communities, planning, examinations, curriculum and teacher development. A precise timeline is not provided here, as it will be subject to successful implementation of the capacity building in earlier components; hence the notion of a 'rolling' plan that is updated annually in light of an evaluation of previous targets.

It is clear that in two of the zones the development of an EMIS/Planning Directorate would take place within the current structures of the MoE. These two ministries expressed enthusiasm for and commitment to undertaking this capacity building process and have identified staff to begin the task. However, there are challenges to be overcome in the Central South zone, where there is no functioning Ministry.

Currently in Central South there are umbrella organisations and an emerging community care centre that act as coordinating agencies across 5 of the 10 regions. These umbrella organisations have formed a coordination body to assist education development. It might be possible to form a subgroup of this coordination body that would act as an EMIS committee to supervise the development of an EMIS/Planning unit in lieu of a functioning ministry. This unit would eventually fit into the Ministry when it emerges. The obvious threat to sustainability with this model is the willingness of the new ministry to accept the unit members as employees.

Alternatively, there may exist a local NGO that is capable of managing the process and of overseeing the development of the EMIS/Planning unit. This approach would carry a higher risk of failure when the time came to hand the unit over to a ministry structure, since the employees would see themselves as employees of the NGO and not of the ministry.

In addition to these challenges is the issue of the location of the technical assistance. Currently, international staff are forbidden to enter the Central South zone because of the lack of security in the area. While this situation persists, it will be impossible to develop the EMIS/Planning unit as in the other two zones. In that case it would be preferable to use the current PES structures and procedures until the security situation improves; it is difficult to develop capacity in a ministry when there is no functioning Ministry. Advice from UNICEF officers in CSZ favours this option of continuing with current PES procedures until such time as greater stability returns to the zone and a functioning Ministry emerges.

An issue that will emerge as a possible constraint, without careful planning, is coordination across the three zones as ownership develops within the zones and local issues are addressed. There is a likelihood that the data collection forms of the three zones will begin to diverge as each zone exercises greater control over the process. If this happens, it may become difficult to bring the three databases together into a single Somalia database. It will be necessary to use procedures such as those that currently exist to ensure the future harmonisation of the three databases.

Recommendations

a. That the UNICEF planning and EMIS technical assistance be integrated closely with an EMIS and Planning Directorate
b. That MoEs identify suitably qualified focal points and the necessary support staff to undertake the duties associated with EMIS development
c. That the appointed staff be adequately supervised to ensure the achievement of set goals and targets
d. That the conceptual framework be accepted as an aspirational model for an EMIS for the component zones of Somalia
e. That the current PES and its associated Access database be referred to as the initial component parts of an EMIS, and that the current 'EMIS tools' be referred to as School Records
f. That the EMIS focal point and the EMIS technical assistance develop a 'rolling' transition plan for the development of the components of the EMIS and the shift of responsibility to the MoEs
g. That the rolling transition plan incorporate the changes in data collection, data cleaning and validation, and reporting recommended in this evaluation


REFERENCES

Anon. (2007). Summary Field Report – Somalia, 27–31 March 2007: Training of Trainers, Primary Education Survey Micro Planning Workshop, Bossaso.

Anon. (no date). End of Consultancy Report: Primary Education Survey.

Anon. (no date). Data Analysis and Data Verification Process.

Anon. (1999). Notes on the Use of the Class Register. UNICEF.

Anon. (2002). A Trip Report on the Education Management Information System (EMIS): Software Training.

Boveington, J. T. (2007). UNICEF Assessment-Capacity Development CA-CD Assignment. Ministry of Education, Somalia.

Brentall, C., Peart, E., Carr-Hill, R., and Cox, A. (2000). Thematic Education For All Study: Funding Agency Contributions to Education For All. Overseas Development Institute, London.

LeCompte, M. D., & Preissle, J. (1993). Ethnography and Qualitative Design in Educational Research (2nd ed.). Academic Press, San Diego.

May, C. and Vine, K. (1997). Indicators of Educational Efficiency and Effectiveness: A Data Dictionary. UNESCO PROAP, Bangkok.

Mendelsohn, J. (2007). Further Development of the School Mapping Programme in Somalia. UNICEF.

Ministry of Education. (no date). EMIS Registers Instruction Manual. UNICEF.

Ministry of Education. (no date). School Register. UNICEF.

Ministry of Education. (no date). Class Register. UNICEF.

Ministry of Education. (no date). Pupil Information Card. UNICEF.

Mulinge, L. (2007). EMIS: Version 1 – User Guide. UNICEF, Nairobi.

UNESCO (1998). Training Package: Educational Management Information System. Principal Regional Office for Asia and the Pacific, Bangkok.

UNESCO (2000). Support for Education for All 2000 Assessment. UNESCO Principal Regional Office, Bangkok. (CD)

UNESCO (2000). http://portal.unesco.org/education/en/ev.php-URL_ID=43385&URL_DO=DO_TOPIC&URL_SECTION=201.html

UNICEF (2000). Primary School Education Survey Report 1999/2000. UNICEF.

UNICEF (2001). Primary School Education Survey Report 2000/2001. UNICEF.

UNICEF (2002). Primary School Education Survey Report 2001/2002. UNICEF.

UNICEF (2003). Primary School Education Survey Report 2002/2003. UNICEF.

UNICEF (2004). Primary School Education Survey Report 2003/2004. UNICEF.

UNICEF (2005). Primary School Education Survey Report 2004/2005. UNICEF.

UNICEF (2006). Primary School Education Survey Report 2005/2006. UNICEF.

UNICEF (2007). Draft Primary School Education Survey Report 2006/2007. UNICEF.

Windham (1988). Improving the Efficiency of Educational Systems: Indicators of Educational Efficiency and Effectiveness.


APPENDICES

Appendix 1 Summary of Recommendations

A. Efficiency

Validity

1. That REOs, supervisors and UNICEF staff be encouraged to develop a 'culture of support' for enumerators to encourage transparent and open work practices.
2. That such a culture be encouraged by including a module on ethical behaviour in the training exercises.
3. That enumerators be evaluated both during training and during the field work, and that employment in the PES process be dependent on satisfactory appraisal.
4. That GPS training be supported by field work and mapping exercises to ensure an understanding of the implications of incorrect location data.
5. That an audit trail be established to facilitate effective tracking of school visits and data forms.
6. That enumerators be required to record school locations as 'waypoints' in their GPS units as part of the audit trail.
7. That enumerators be required to validate data in the school from source documents on selected indicators.
8. That effective quality assurance mechanisms be developed to establish regular contact with schools. These are seen as a necessary prerequisite for the validation process.
9. That in future data collection processes the census become an 'update' of data, rather than the current collection of all data on an annual basis.
10. That a missing data report be provided at the completion of data entry to allow follow-up in the field.

Training

11. That a certification process for enumerators and supervisors be used to judge competence for the work tasks.
12. That the certification process be based on explicit competencies that must be demonstrated by enumerators and supervisors.
13. That a set of additional modules be included in the current training course.
14. That the course be lengthened to accommodate the extra content.
15. That participants have repeated opportunities to learn and demonstrate the competencies.

Enumerators and Supervisors<br />

16. That the enumerators/supervisor positions be competitively attained. It would be<br />

advisable to include more than the required number in the training process and only the<br />

best included in the data collection phase.<br />

17. That the performance of enumerators and supervisors be evaluated during the data<br />

collection phase by independent assessors using predefined competencies.<br />

18. That REOs be rotated across data collecting teams and thus create a more independent<br />

system of monitoring.<br />

19. That a fine balance be recognised between commitment to quality and commitment to<br />

capacity development.<br />

20. That as capacity is developed and infrastructure built the enumerator/supervisor structure<br />

be abolished and replaced with an EMIS officer located in each regional office and head<br />

teachers be empowered to complete update of data forms along with subsequent data<br />

collection processes in the form of term or semester reports.<br />


Timing<br />

21. That data be collected in late September or early October (or as early in the school year<br />

as can be achieved)<br />

22. That data be entered into the database in November<br />

23. That preliminary reports begin to flow to stakeholders, in particular REOs and schools, in<br />

December.<br />

Data Duplication<br />

24. That careful planning take place to provide data on a timely basis for the variety of MoE<br />

directorates and stakeholders<br />

B<br />

Effectiveness<br />

Utility-Current and Potential<br />

25. That timeliness of data reporting be improved to suit institutional needs<br />

26. That capacity to use the data be developed among stakeholders<br />

27. That alternative formats of reporting of data be developed to differentiate between<br />

stakeholders<br />

28. That a systematised and institutionalised planning process be developed. This<br />

institutionalisation needs to take place at central, regional and school/community levels<br />

EMIS Tools<br />

29. That the use of the ‘EMIS tools’ be clarified and appropriate mentoring provided for head<br />

teachers on the use and value of these tools<br />

30. That a review of the design and use of the tools be undertaken.<br />

31. That plans be developed to integrate this data with the EMIS in the form of<br />

supplementary data collection and as a contributor to data verification<br />

32. That further distribution of the EMIS tools be supported by developing understanding of<br />

their importance and use<br />

Ownership<br />

33. That the responsibility for transferring EMIS components to MoEs be considered one<br />

at a time as capacity and resources are developed. In this way a smooth transition from<br />

the centrally managed survey to a decentralised EMIS can be achieved<br />

34. That responsibility for data entry be shifted to Zones over a two to three year period.<br />

35. That initially MoE data entry staff be used to enter the data once and that the Nairobi data<br />

entry staff enter it the second time<br />
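Double entry of this kind only pays off if the two passes are systematically compared. A minimal comparison sketch follows; the field names are invented for illustration, not taken from the PES data dictionary.<br />

```python
# Sketch: compare two independent data-entry passes of the same forms
# and list every field where the recorded values disagree, keyed by school.
def double_entry_discrepancies(pass1, pass2):
    """pass1, pass2: {school_id: {field: value}}.
    Return a sorted list of (school_id, field, value1, value2) tuples."""
    diffs = []
    for school_id in sorted(set(pass1) | set(pass2)):
        rec1 = pass1.get(school_id, {})
        rec2 = pass2.get(school_id, {})
        for field in sorted(set(rec1) | set(rec2)):
            if rec1.get(field) != rec2.get(field):
                diffs.append((school_id, field, rec1.get(field), rec2.get(field)))
    return diffs

# Invented example: one keying error (110 entered as 101).
moe_pass = {"S001": {"boys": 130, "girls": 110}}
nairobi_pass = {"S001": {"boys": 130, "girls": 101}}
print(double_entry_discrepancies(moe_pass, nairobi_pass))
```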

C<br />

Relevance / Appropriateness<br />

Data Required for an EMIS<br />

36. That expansion of the data collected take place to provide for a full set of EFA and MDG<br />

indicators<br />

37. That systematic recording of school outcome data should be undertaken to facilitate<br />

judgments about school quality.<br />

38. That a criterion referenced approach to school outcomes be adopted that allows the<br />

recording of the percentage of students who have achieved a set of defined outcomes<br />

39. That the number of grade 1 enrolments who have attended Qur’anic school and ECD be<br />

recorded.<br />

40. That the status of Qur’anic schools offering an integrated hybrid curriculum be<br />

recorded.<br />

41. That a more precise analysis of out of school children be undertaken to investigate the<br />

underlying issues of out of school children. This should be a study separate from the<br />

EMIS<br />

42. That school condition data be collected with the caveat that it will need to be interpreted<br />

with caution<br />

43. That proxy indicators that reflect degree of child friendliness be considered, such as<br />

lesson types used regularly, or use of corporal punishment in last month<br />


44. That key resources that are commonly available, and for which there is funding or a<br />

donor ready to provide them, be recorded<br />

45. That the data collected include an expanded view of non-formal curriculum components.<br />

46. That data on emergency risk and emergency preparedness be considered<br />

47. That the priority list for data expansion be adopted<br />

D<br />

<strong>Report</strong>ing<br />

48. That the draft 2006/2007 report should not be published in its current form.<br />

49. That a set of target groups be identified and reports designed specifically for each group.<br />

This process should also be supported by a front-end module built for the Access<br />

database that can be flexibly used by a variety of users without the technical capacity to<br />

use the query module of Access.<br />

50. That reports include a school/CEC report, District reports, Regional reports, Zone and<br />

<strong><strong>Somali</strong>a</strong> compact reports that provide a summary of major EFA indicators and trends<br />

disaggregated by Region, gender and Rural/Urban, and a set of thematic reports that<br />

would change from year to year and would provide deeper analysis of EMIS data of<br />

issues of specific relevance.<br />

51. That the capacity to extract information from the Access database using the Query<br />

module of Access be developed at Zone level among those responsible for database<br />

management.<br />

52. That data from the EMIS be systematically integrated into an atlas via GIS software<br />

53. That the necessary support and training be provided at the many levels of the MoE on how to<br />

use and interpret the data.<br />
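One low-cost route from EMIS records into GIS software is to export the schools, with their recorded GPS locations, as GeoJSON, a format most GIS packages import directly. The sketch below is illustrative only; the school records, field names and coordinates are invented.<br />

```python
import json

def schools_to_geojson(schools):
    """Convert school records with lon/lat fields into a GeoJSON FeatureCollection."""
    features = []
    for s in schools:
        features.append({
            "type": "Feature",
            # GeoJSON orders coordinates as [longitude, latitude].
            "geometry": {"type": "Point", "coordinates": [s["lon"], s["lat"]]},
            "properties": {"school_id": s["school_id"], "enrolment": s["enrolment"]},
        })
    return {"type": "FeatureCollection", "features": features}

# Invented example school with a GPS waypoint.
schools = [{"school_id": "S001", "lon": 44.065, "lat": 9.56, "enrolment": 240}]
print(json.dumps(schools_to_geojson(schools), indent=2))
```

The resulting file can be loaded directly into most desktop GIS tools for atlas production.<br />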

E<br />

Impact<br />

54. That impact of the PES be improved by implementation of the recommendations of this<br />

report in the areas of diversified reporting, capacity building on data use, broadening the<br />

scope of data collected, decentralisation of PES processes and linking PES to other data<br />

systems<br />

F<br />

Sustainability<br />

55. That employment conditions be such as to ensure the retention of professional staff<br />

56. That outsourcing some EMIS activities be considered in areas of staff shortage<br />

57. That merit based employment procedures be defined and implemented<br />

58. That effective work practices and standards be identified and made explicit<br />

59. That a mechanism to coordinate EMIS activities across the three zones be established<br />

60. That an effective quality assurance system be established that includes communication<br />

and travel arrangements<br />

61. That systems of EMIS implementation be differentiated across regions to reflect the<br />

differences in local conditions<br />

62. That a computer maintenance and replacement policy be developed and implemented<br />

G<br />

Coverage<br />

63. That data collection systems be expanded to include all areas that have policy<br />

implications for MoEs. This would include ECD, secondary education and non formal<br />

education.<br />

64. That nomadic schools be clearly identified in the PES<br />

H<br />

Coordination<br />

65. That the PES and Secondary School data systems be coordinated to ensure consistency<br />

between them.<br />

66. That mapping tools be used as a coordinating device across sectors<br />

67. That survey and study design be undertaken to complement the PES/EMIS.<br />

Conclusion: Future steps towards an EMIS<br />

68. That the UNICEF Planning and EMIS technical assistance be integrated closely with an<br />

EMIS and Planning Directorate.<br />


69. That MoEs identify suitable qualified focal points and the necessary support staff to<br />

undertake the duties associated with EMIS development.<br />

70. That the appointed staff be adequately supervised to ensure the achievement of set goals<br />

and targets<br />

71. That the conceptual framework be accepted as an aspirational model for an EMIS for the<br />

component Zones of <strong><strong>Somali</strong>a</strong>.<br />

72. That the current PES and its associated Access database be referred to as the initial<br />

components of an EMIS, and that the current “EMIS tools” be referred to as school<br />

records.<br />

73. That the EMIS focal point and the EMIS technical assistance develop a ‘rolling’<br />

transition plan for the development of components of the EMIS and the shift of<br />

responsibility to the MoEs<br />

74. That the rolling transition plan incorporate changes in data collection, data cleaning and<br />

validation, and changes in reporting recommended in this evaluation<br />


Appendix 2 Terms of Reference<br />

Terms of Reference for the Engagement of an <strong>Evaluation</strong> Expert<br />

<strong>Evaluation</strong> of the Annual <strong>Primary</strong> <strong>Education</strong> <strong>Survey</strong> in <strong><strong>Somali</strong>a</strong><br />

1. BACKGROUND<br />

1.1 Country Background<br />

With an overall Human Development ranking of 161 out of 163 countries on the human development<br />

index, available data on <strong><strong>Somali</strong>a</strong> points towards enormous gaps in meeting the needs of women and<br />

children in the social sector. The country has not had a de facto central government authority, or any other<br />

feature associated with an established independent state, for the last 15 years. Functioning regional<br />

governments have been formed in the Northwest (<strong>Somali</strong>land) and Northeast (Puntland), and more<br />

recently a Transitional Federal Government was formed in 2004 but with very limited de facto authority.<br />

1.2 UNICEF Country Programme<br />

UNICEF has had a large field and programme presence in <strong><strong>Somali</strong>a</strong> over the last 15 years, with three zonal<br />

offices. A new country programme, within the framework of the United Nations Transition Plan, will aim<br />

to accelerate progress towards targets 3-8 of the Millennium Development Goals by further increasing<br />

access to basic services for accelerated child survival and development through humanitarian assistance,<br />

strengthening the institutional capacity of government as a duty bearer and further enabling children and<br />

women to claim their rights.<br />

1.3 UNICEF <strong>Education</strong> Programme<br />

Notwithstanding the work done by international development and relief organisations, access to basic<br />

education and gender equity in education in <strong><strong>Somali</strong>a</strong> remains a real concern. With a Gross Enrolment<br />

Ratio of 30 per cent, overall primary school enrolment ranks among the lowest in the world. In terms of<br />

gender equity, the <strong>Somali</strong> girl child has even less chance of accessing basic education. After 17 years of<br />

an emergency-driven response in <strong><strong>Somali</strong>a</strong>, the absence of a viable education system continues to hinder<br />

<strong><strong>Somali</strong>a</strong>’s progress towards meeting the Millennium Development Goals (MDGs) and <strong>Education</strong> For All<br />

(EFA) goals.<br />

In this context, the new UNICEF “Go to School” programme will aim to improve access to and quality of<br />

basic education with an emphasis on developing mechanisms to increase girls’ enrolment. In addition, an<br />

institutional development component will provide financial support and expertise to the existing<br />

ministries of education for the development and implementation of gender-sensitive education policies<br />

and standards, including the development of an effective <strong>Education</strong> Management Information System.<br />

1.4 The <strong>Primary</strong> <strong>Education</strong> <strong>Survey</strong><br />

The <strong>Primary</strong> <strong>Education</strong> <strong>Survey</strong> (PES) has been conducted almost every year for the last nine years.<br />

However, no formal evaluation of the <strong>Survey</strong> has ever been carried out. The original purpose of the<br />

survey was to fill a critical data gap in the education sector and to meet the information needs of the<br />

Ministries of <strong>Education</strong> (MoEs), national and international development stakeholders as well as donors,<br />

with an independent, reliable and routine source of data for trend analysis and programme planning. The<br />

PES has evolved over the years with additional data elements being added to meet the growing needs of<br />


education stakeholders. From the beginning, UNICEF clearly stated its intention to build capacity of<br />

MoE officials in order for the MoEs to take full responsibility for the design and management of the PES.<br />

The PES was in many ways only meant to be an interim step on the road to establishing a fully operational<br />

<strong>Education</strong> Management Information System (EMIS).<br />

The ultimate goal of building such an effective EMIS was the establishment of a reputable system and<br />

expertly trained education officials at central, regional and district levels to ensure the efficient flow of<br />

data in order to facilitate the monitoring of service provision as well as providing critical information to<br />

educational planners and policy-makers. In reality, this goal remains far from being realized. While some<br />

EMIS tools were developed and distributed, they have not been institutionalized. This is partly due to the<br />

absence of an overall EMIS framework. Efforts have been made, with limited success, to train MoE<br />

officials in basic data management using the data collected during the PES. Attempts to involve MoE<br />

officials in data collection for the PES have also had mixed and sometimes questionable results.<br />

Furthermore, there is little evidence that any of the data from the PES is being translated into information<br />

that is used for educational planning and policy-making purposes.<br />

2. EVALUATION PURPOSE<br />

2.1 Reasons for this evaluation<br />

Availability of reliable and up-to-date educational statistics/data is essential not only for formulation of<br />

policies and educational planning, but also for making information-based decisions on problematic issues in<br />

the day-to-day administration of the educational system.<br />

After nine years, the <strong>Primary</strong> <strong>Education</strong> <strong>Survey</strong> remains the most up-to-date and reliable source of data on<br />

the status of education in <strong><strong>Somali</strong>a</strong>. However, for the PES to maintain its credibility and to ensure the<br />

validity of the data collected, an evaluation of the current and past processes involved in this important<br />

routine education survey, as well as an assessment of its utility levels, is considered necessary. The<br />

evaluation will also examine the relationship between the need for a sustainable EMIS and the future role<br />

of the PES.<br />

The Strategic Partnership in <strong>Education</strong> (DFID-UNICEF-UNESCO), intends to fund the next annual<br />

education survey. However, prior to doing so, it is deemed important to evaluate the current annual<br />

process of data collection and data dissemination. Such an evaluation is fully justified in view of the fact<br />

that without a fully functional and efficient management information system, it will be difficult to bring<br />

about any reform in basic education in <strong><strong>Somali</strong>a</strong>. To date, the PES has played a crucial role in maintaining<br />

the availability of a reliable data source. How the effectiveness of this role can be strengthened and its<br />

relationship to the establishment of a broader system for EMIS, will depend on the outcomes and<br />

recommendations of an independent evaluation.<br />

2.2 Target audience and proposed use of results<br />

The results, conclusions and recommendations emanating from this evaluation will be primarily of use to<br />

the Ministries of <strong>Education</strong>, the UNICEF <strong><strong>Somali</strong>a</strong> country office and UNESCO. The secondary target<br />

audience will be UNICEF/UNESCO field offices, INGOs and LNGOs.<br />

2.3 Timing of the <strong>Evaluation</strong><br />

The final report from this evaluation is expected by XX<br />


3. SCOPE AND FOCUS<br />

3.1 <strong>Evaluation</strong> Objectives<br />

The scope of the proposed evaluation will cover the routine annual surveys of primary education in<br />

<strong><strong>Somali</strong>a</strong>. The evaluation will focus on the main aspects of the surveys, i.e. data collection and<br />

management as well as information dissemination and utility. In addition, it will look at the<br />

performance of the agency in supporting this recurring activity and the collaboration with its partners.<br />

The overall objectives of the evaluation are:<br />

• To assess the current and past design, planning and management of the key processes and<br />

methods that have been employed to conduct the PES since its inception<br />

• To assess the degree to which the PES meets the real data needs/analysis of the range of<br />

education stakeholders in <strong><strong>Somali</strong>a</strong><br />

• To assess the role of the PES in relation to the nascent EMIS and review the current linkages<br />

• To assess the need for expanding the breadth and depth of the information gathered and in<br />

particular, to review the idea of integrating the collection of Secondary School data<br />

• To assess the mechanisms for ensuring internal validity and reliability of the data presented in<br />

the PES<br />

• To assess the degree to which training of MoE officials in data collection and management<br />

has been successfully and appropriately implemented<br />

• To assess the utility level of the PES in <strong><strong>Somali</strong>a</strong>, in particular the benefits of its results and<br />

outputs to the intended users<br />

• To identify best practice, innovative interventions and shortcomings in the current process of<br />

planning and implementing the annual survey<br />

• To make recommendations and suggestions on possible improvements related to all aspects<br />

of the routine survey, including, if necessary, ways of increasing levels of participation and<br />

ownership and utility of the survey/data by the MoE authorities and other stakeholders<br />

• To make recommendations with respect to the evolving relationship between the<br />

current/future PES and the nascent EMIS system<br />

3.2 Major Questions<br />

The evaluation will address questions which have been organized according to generally applicable<br />

evaluation criteria.<br />

• Effectiveness<br />

• Relevance / Appropriateness<br />

• Efficiency<br />

• Impact<br />

• Sustainability<br />

• Coverage<br />

• Coordination<br />

3.3 Performance Standards<br />

Prevailing norms (for cost, time and service quality) for the UNICEF programme in <strong><strong>Somali</strong>a</strong> shall be used as<br />

benchmarks for addressing the above listed evaluation questions.<br />


4. EVALUATION PROCESS AND METHODS<br />

4.1 <strong>Evaluation</strong> Stages<br />

The evaluation shall span the following main stages:<br />

i. Initial desk review of all relevant documents; annual PES documents, databases, country<br />

programme documents, consultancy reports, project documents, progress reports, training<br />

programme reports and minutes of meetings among others<br />

ii. Submission of an Inception <strong>Report</strong> mapping out the evaluation plan, questionnaires and<br />

highlighting key initial findings<br />

iii. Consultations with individuals and organisations concerned, including field visits (see details<br />

below)<br />

iv. Preparation of draft report and circulation among individuals and organisations concerned<br />

v. Submission of the draft report to UNICEF USSC, highlighting the main conclusions and<br />

lessons learned from the evaluation<br />

vi. Production and submission of final report<br />

4.2 <strong>Evaluation</strong> Data Collection Methods<br />

The principal source of data will be interviews with respective stakeholders. Respondents will<br />

include MoE officials at central, regional and district levels, NGO officials, head teachers, teachers,<br />

survey enumerators, supervisors, UNICEF/UNESCO staff at Nairobi and zonal level as well as local<br />

and international NGOs.<br />

The methods followed would be largely qualitative and include key informant semi-structured<br />

interviews, community level interviews and focus group discussions. Questionnaires for guiding the<br />

interviews will be developed for review in the inception phase. Should it be necessary, the expert will<br />

hire local translators.<br />

5. Accountabilities<br />

The evaluation expert shall be professionally independent to carry out a critical assessment to enable<br />

him or her to come up with candid answers to the questions posed above. The expert shall, for<br />

administrative purposes, be managed by UNICEF’s USSC <strong>Education</strong> Section - XX, who will also<br />

provide technical advice for the task. The evaluation expert shall be responsible for efficient and<br />

timely performance and the quality of the reports. The expert will be bound by normal UNICEF rules<br />

of confidentiality.<br />

6. <strong>Evaluation</strong> Expert<br />

The evaluation expert shall be responsible for scrutiny of data sources, design of questionnaires,<br />

conducting interviews, data analysis and report writing.<br />

The ideal candidate would have:<br />

• A Master’s degree in the social sciences, education and/or development studies<br />

• Extensive experience (8-10 years minimum) in the evaluation of social sector programmes,<br />

particularly education programmes<br />

• Experience in design, management and evaluation of education information systems<br />

• Expertise in qualitative and quantitative methods of evaluation<br />

• Expertise in ICT, including spreadsheets and database management<br />

• Excellent knowledge of evaluation norms, standards and approaches<br />

• Proven communication, facilitation and writing skills<br />


• Excellent knowledge of English (oral and in writing)<br />

• Familiarity with the <strong>Somali</strong> context<br />

• Ability to work independently<br />

A candidate with the following would be preferred:<br />

• Field experience in post-conflict situations<br />

• Knowledge of national/local language(s)<br />

• Expertise in evaluating social sector projects<br />

The evaluation expert shall arrange his or her own office equipment (such as a laptop) and interview<br />

material (such as tape recorders). While no specific work schedule is being suggested, he or she would be<br />

required to deliver the agreed products within the specified time.<br />

7. Procedures and Logistics<br />

The evaluation is expected to take two months as per the following time-line after the engagement of<br />

the expert in early January 2008 .<br />

Table 3 Timeline for <strong>Evaluation</strong><br />

Activity | Location | Duration (Working Days)<br />
Briefing at UNICEF USSC | USSC, Nairobi | 2<br />
Desk Study | USSC, Nairobi | 5<br />
Interviews: Stakeholders in Nairobi | Nairobi | 5<br />
Preparation of an Inception <strong>Report</strong>, listing methodology, means of data collection and questionnaires | | 2<br />
Presentation of Inception <strong>Report</strong>, feedback and finalization of methods | USSC, Nairobi | 1<br />
Visit to <strong>Somali</strong>land, interview UNICEF/UNESCO staff, consult records and meet/interview MoE authorities at all levels | <strong>Somali</strong>land | 5<br />
Visit to Puntland, interview UNICEF/UNESCO staff, consult records and meet/interview MoE authorities at all levels | Puntland | 5<br />
Visit to Central Southern Zone, interview UNICEF/UNESCO staff, consult records and meet/interview MoE authorities at all levels | Central South | 5<br />
Preparation and presentation of preliminary findings and recommendations | USSC, Nairobi | 5<br />
Preparation of draft report, presentation & feedback | USSC, Nairobi | 7<br />
Final report submission | USSC, Nairobi | 3<br />
Total Working Days | | 45<br />


Appendix 3 EMIS Conceptual framework<br />

To clarify the understanding about what an EMIS is, and how it can be of assistance in developing an<br />

education infrastructure, the definition and conceptual framework of an EMIS has been adopted from the<br />

work undertaken by UNESCO (1998) Bangkok. This conceptualisation defines an EMIS to be “an<br />

organised group of information and documentation services that collects, stores, processes, analyses and<br />

disseminates information for educational planning and management” (p.2). As such it is more than a data<br />

storage system. Rather it integrates system components of inputs, processes, outputs and reporting in such<br />

a way that accuracy, flexibility and adaptability, and efficiency are maximised. The elements that need to<br />

be integrated are the:<br />

• Needs of data producers and users<br />

• Data<br />

• Information handling<br />

• Storage of data<br />

• Retrieval of data<br />

• Data analysis<br />

• Computer and manual procedures<br />

• Networking among EMIS centres<br />

(UNESCO 1998, p. 4)<br />

It also has a dynamic process orientation that focuses on the two-way flow of information so that all<br />

levels of the organisation can make effective use of the data and reports generated from the data. This<br />

flow of information needs to be efficient by avoiding duplication of effort, made accurate by<br />

incorporating validation procedures and adequate training of personnel at all levels, and must meet the<br />

needs of all stakeholders in the system. This complex interaction of components is represented in Figure 1<br />

and has been adapted from the UNESCO (1998) report to reflect some aspects of the <strong><strong>Somali</strong>a</strong> context.<br />

The main goal is to develop approaches which contribute to the systematic incorporation of performance<br />

data into the policy design and implementation cycle, not the enhancement of data on an individual<br />

agency to support intervention (Brental, Peart, Carr-Hill, and Cox 2000). More specifically, the UNESCO<br />

(1998) report argues that a well designed EMIS has the ability to:<br />

• To improve capacities in data processing, storage, analysis and supply of educational<br />

management information so that educational planners and administrators can avail<br />

themselves of reliable and timely data.<br />

• To co-ordinate and further improve dispersed efforts in the acquisition, processing, storage,<br />

transmission, analysis, repackaging, dissemination and use of educational management<br />

information.<br />

• To facilitate and promote the use of relevant information by various agencies and individuals<br />

at all levels for more effective educational planning, implementation and management.<br />

• To streamline the flow of information for decision-making by reducing and eliminating<br />

duplications as well as filling information gaps.<br />

• To provide information for policy dialogue and scenarios for development of the education<br />

system.<br />

(UNESCO 1998, p.3)<br />


[Figure 1 summarises the two-way flow between: EMIS DATA (Students; Curriculum; Personnel; Finance; Facilities; Other), LEVELS OF DECISION MAKING (National; Zones; Regions; Districts; Villages/Schools), the education sub-sectors (Early childhood; <strong>Primary</strong>; High school; PAE; NFE; Nomadic and Qur’anic Schools) and INFORMATION PRODUCTS (<strong>Report</strong>s; Plans; Products; Data/statistics bulletins; Directories; Physical plant and facilities design; Budget estimates; Others).]<br />

Figure 1: Information Flow (adapted from UNESCO 1998)<br />

The importance of an EMIS is supported by the increasing tendency of donor agencies to call for<br />

in-country capacity for evaluation. This is reflected in donor agencies shifting their focus from specific<br />

projects to system level indicators (Brental et al. 2000). This demand can best be clarified by taking a<br />

systems approach to the education system. May and Vine (1997) described such an approach that, it<br />

was argued, identified the “major factors in an educational production system.” The model used has been<br />

used extensively elsewhere (Windham, 1988) and is predicated on a systems approach that involves<br />

inputs, processes and outputs as its major components (Table 4).<br />

Table 4 Major Factors in the <strong>Education</strong> Production System<br />

Determinants:<br />
Inputs: Student characteristics; Teacher characteristics; School characteristics; Instructional materials and equipment characteristics; Facilities characteristics<br />
Process: Forms of instructional organisation; Classroom technologies; Teacher and student time allocation<br />
Effects:<br />
Outputs: Cognitive achievement; Improved manual skills; Attitudinal changes; Behavioral changes<br />
Outcomes: Employment; Earnings; Status; Attitudes; Behaviors<br />


The significance of the model is that it can provide guidance as to how to monitor and evaluate the<br />

efficiency and effectiveness of the education system. To achieve this, it has been suggested that a range of<br />

indicators can be calculated that reflect the status of each component of the system (May and Vine<br />

1997). Such a model has been extensively developed in the <strong>Education</strong> for All (EFA) project established<br />

by UNESCO following the Jomtien agreement of 1990 (UNESCO 2000). This model defined 18<br />

indicators that serve three significant functions. First, they can be used to describe the health of an education<br />

system. Secondly, by using trend data they can be used to evaluate progress towards a set of key<br />

educational goals, and finally, by disaggregating the data across gender or regional groups, they can be<br />

used to assess the equity of the system. The abbreviated name and definition of the 18 core indicators are<br />

presented in Table 5.<br />

Table 5 The 18 Core ‘<strong>Education</strong> For All’ Indicators<br />

Abbreviation | Indicator N° | Indicator<br />
ECD | 1 | Gross enrolment ratio in Early Childhood Development (ECD)<br />
G1ECD | 2 | Percentage of new entrants to grade 1 with ECD experience<br />
AIR-NIR | 3, 4 | Intake rates (AIR and NIR) in primary education<br />
GER-NER | 5, 6 | Enrolment ratios (GER and NER) in primary education<br />
EXP | 7, 8 | a) Public current expenditure on primary education as a % of GNP; b) Public current expenditure per pupil on primary education as a % of GNP per capita; and Public current expenditure on primary education as a % of total public current expenditure on education<br />
TEACHERS | 9 | % of primary teachers with required academic qualifications<br />
TEACHERS | 10 | % of primary teachers who are certified to teach<br />
P-T RATIO | 11 | Pupil-teacher ratio in primary education<br />
REPETITION | 12 | Repetition rates by grade in primary education<br />
SURVIVAL | 13 | Survival rate to grade 5<br />
SURVIVAL | 14 | Coefficient of efficiency at grade 5 and at the final grade<br />
ACHIEVEMENT | 15 | % of pupils who master basic learning competencies<br />
LITERACY | 16, 17 | Adult literacy rates, 15-24 years and 15 years and over<br />
LITERACY | 18 | Literacy Gender Parity Index (GPI)<br />

From these indicators, the data required for their calculation can be deduced. For the trend analysis to be<br />

undertaken a longitudinal set of data needs to be compiled with the implication that the data collection<br />

needs to be updated annually. Finally, if full use of the data is to be made the variability of the data needs<br />

to be investigated. Possible sources of this variability are gender, Zone, Region, District, Village,<br />

socioeconomic profiles, school type etc. Hence, the need to be able to disaggregate the data across these<br />

variables.<br />
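As an illustration of how such indicators and their disaggregation might be scripted, the sketch below computes gross and net enrolment ratios and a gender parity index; all figures and names are illustrative assumptions, not PES data.<br />

```python
# A minimal sketch of calculating EFA indicators 5 and 6 (GER, NER) and a
# gender parity index from enrolment data disaggregated by sex.
# All figures below are hypothetical, for illustration only.

def gross_enrolment_ratio(enrolled_all_ages, school_age_population):
    """GER: total enrolment of any age over the official school-age population."""
    return 100.0 * enrolled_all_ages / school_age_population

def net_enrolment_ratio(enrolled_school_age, school_age_population):
    """NER: enrolment of official school-age children only."""
    return 100.0 * enrolled_school_age / school_age_population

# Hypothetical figures for one region, disaggregated by sex.
region = {
    "male":   {"enrolled": 5200, "enrolled_school_age": 4400, "population": 9000},
    "female": {"enrolled": 3900, "enrolled_school_age": 3300, "population": 8800},
}

ger = {sex: gross_enrolment_ratio(d["enrolled"], d["population"])
       for sex, d in region.items()}
# An equity check across the gender disaggregation: 1.0 indicates parity.
gender_parity_index = ger["female"] / ger["male"]
```

The same pattern extends to disaggregation by Zone, Region, District or school type: each grouping variable simply becomes another key of the input dictionary.<br />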

While these indicators are useful for providing general information about the efficiency and effectiveness of<br />
the education system, it is worth considering what data can inform us about many other<br />
aspects of the system. If systems developed elsewhere for developing countries are examined (e.g., the<br />
UNESCO model, Timor Leste, Myanmar), there appears to be considerable uniformity as to the general<br />
nature of EMIS data, given that they always need to be adapted to local structures, conditions and<br />
contexts.<br />


We can think of these data as a relational database containing a number of foundation tables. These<br />
tables cover 16 domains of data, namely:<br />

• School Infrastructure<br />

• School Management<br />

• School Pedagogy<br />

• School Curriculum<br />

• School Budget<br />

• School Inspection<br />

• Student Admissions<br />

• Student Grade/Class Assignment<br />

• Student Attendance<br />

• Student Learning Achievement<br />

• Teachers Employed<br />

• Teacher Grade/Class Assignment<br />

• Teacher Attendance<br />

• Administrative Staff<br />

• General Staff<br />

• Population<br />

The EMIS would also contain similar domains of data for the multiple levels of education institutions<br />

including early childhood centres, primary, secondary, non-formal education institutions and tertiary<br />

institutions. It is a common model to gradually move to this complete system, by initially building the<br />

technical capacity required for effective EMIS implementation in one sector and then expanding to other<br />

sectors.<br />

Each domain of data may be represented by more than one table. For example, school infrastructure<br />

would have a separate table for each of the following aspects of infrastructure:<br />

• Buildings and Classrooms<br />

• Furniture<br />

• Teaching and Learning Resources<br />

• Water and Sanitation<br />

• Utilities (e.g. mains electricity supply)<br />

• Communications<br />

The School Management domain would have separate tables for:<br />

• School Management Committee<br />

• School CECs<br />

• Special school development projects<br />

• Catchment Area<br />

• Affiliated Schools<br />

Similarly, the School Curriculum domain would have separate tables for:<br />

• <strong>Primary</strong> School Subjects<br />

• Middle School Subjects<br />

• High School Streams and Subjects<br />
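The foundation tables described above can be sketched as a small relational schema. The example below uses Python's standard sqlite3 module; the table and column names are illustrative assumptions, not the schema actually used by the PES or any existing EMIS.<br />

```python
# Sketch of a few EMIS foundation tables as a relational schema, using the
# standard-library sqlite3 module. Names and rows are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE school (
        school_id   INTEGER PRIMARY KEY,
        name        TEXT,
        district    TEXT,
        region      TEXT
    );
    -- Each data domain becomes one or more tables keyed to the school.
    CREATE TABLE student_admission (
        student_id  INTEGER PRIMARY KEY,
        school_id   INTEGER REFERENCES school(school_id),
        sex         TEXT,
        age         INTEGER
    );
    CREATE TABLE teacher (
        teacher_id  INTEGER PRIMARY KEY,
        school_id   INTEGER REFERENCES school(school_id),
        certified   INTEGER      -- 1 if certified to teach (cf. EFA indicator 10)
    );
""")
conn.execute("INSERT INTO school VALUES (1, 'Example Primary', 'D1', 'R1')")
conn.execute("INSERT INTO student_admission VALUES (1, 1, 'F', 7)")

# Enrolment counts (and hence pupil-teacher ratios) become simple aggregates:
enrolled = conn.execute(
    "SELECT COUNT(*) FROM student_admission WHERE school_id = 1").fetchone()[0]
```

Keeping each domain in its own table, linked by a school identifier, is what makes the later aggregation and disaggregation steps straightforward queries rather than manual tabulation.<br />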

All of the data referred to above originates from within the education system, with the exception of<br />
population data, which needs to be derived from census data and embedded in the EMIS at village level by<br />
single year of age and sex. Typically, the education data are collected at school level and aggregated at district,<br />
regional and national levels. The precise organisation of this aggregation depends upon many factors<br />
including personnel technical capacity, hardware availability, remoteness and financial resources. A<br />
highly developed system would generally provide for data entry to be undertaken at a decentralised level<br />
and aggregated as it is sent forward to regional and national levels.<br />
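The upward aggregation path just described can be sketched as follows; the school names, districts and enrolment figures are hypothetical.<br />

```python
# Sketch of school-level records being rolled up to district, regional and
# national totals, mirroring the decentralised-entry model described above.
from collections import defaultdict

school_records = [
    {"school": "A", "district": "D1", "region": "R1", "enrolment": 240},
    {"school": "B", "district": "D1", "region": "R1", "enrolment": 310},
    {"school": "C", "district": "D2", "region": "R1", "enrolment": 180},
]

district_totals = defaultdict(int)
region_totals = defaultdict(int)
for rec in school_records:                      # data entered at school level...
    district_totals[rec["district"]] += rec["enrolment"]
    region_totals[rec["region"]] += rec["enrolment"]  # ...aggregated upward
national_total = sum(region_totals.values())
```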


At each level of the EMIS there would be an associated Decision Support System (DSS) tool set appropriate for<br />
that level. For example, regional/district data could be interfaced with UNESCO’s decentralized<br />
education planning models, and with Geographic Information Systems (GIS) software for mapping school<br />
catchment areas. Similarly, higher levels would include in their DSS tools such as SPSS (a statistical<br />
analysis package) and data transfer programmes for exporting database tables to it. Further, the national<br />
level could be interfaced with UNESCO’s national education planning model and the UN’s DevInfo<br />
software for thematic mapping, on a national scale, of crucial education indicators such as those that<br />
measure progress against the <strong>Education</strong> for All (EFA) goals and the Millennium Development Goals (MDGs).<br />
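Statistical packages such as SPSS can import delimited text files, so a simple data transfer step from the EMIS database might look like the sketch below; the indicator names and values are illustrative assumptions.<br />

```python
# Sketch of exporting an indicator table to CSV for import into a DSS tool
# (SPSS and similar packages accept delimited text). Figures are illustrative.
import csv
import io

indicators = [
    {"region": "R1", "indicator": "GER", "value": 57.8},
    {"region": "R1", "indicator": "NER", "value": 48.9},
]

buffer = io.StringIO()                      # a real export would open a file
writer = csv.DictWriter(buffer, fieldnames=["region", "indicator", "value"])
writer.writeheader()
writer.writerows(indicators)
exported = buffer.getvalue()
```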

EMIS has the potential to lift management knowledge and skills across the Ministry and thus increase the<br />
effectiveness of education management across the country. Importantly, the process involves a two-way<br />
flow of information: when data are aggregated centrally, reports need to be provided to stakeholders at all<br />
levels.<br />

More broadly, the village/district component of EMIS could be conceptualized as a social sector local<br />

management and planning information system by ensuring some compatibility between the health and<br />

education information systems. As a minimum that would ensure that education and health managers and<br />

planners use the same basic population data. Such arrangements will greatly assist responses to<br />

emergency situations in particular, and will support integrated programming by development agencies.<br />

This broader integrated approach would have substantial resource sharing benefits. For example, there<br />

would need to be only one set of hardware and one management structure in each township covering both<br />

EMIS and HMIS, one set of DSS software tools, and one GIS with education and health data at village<br />

level. Given the scarcity of trained personnel this would have obvious advantages.<br />

These conceptualisations will be used as a guide to the evaluation of the information systems in use in<br />

<strong><strong>Somali</strong>a</strong>.<br />


Appendix 4 Research Questions and Data Collection Planner<br />

Research Question<br />

Data Source<br />

Effectiveness:<br />

1. To what extent does the data/information<br />

generated from the survey, meet the needs of<br />

the primary stakeholders: MoE officials at<br />

central, regional and district level?<br />

2. Have adequate efforts been made to strengthen<br />

and empower the educational authorities at all<br />

levels in terms of increasing their capacity to<br />

take full ownership of the process of planning<br />

and implementing the annual survey?<br />

3. Is the annual survey embedded/<br />

institutionalised in the <strong>Education</strong>al Policies and<br />

Strategies of the MoEs?<br />

4. What are the lessons learned as a result of the<br />

annual survey, in terms of building capacities<br />

among district authorities, communities, local<br />

government and local educational institutions?<br />

5. In terms of planning and implementing, to<br />

what degree have the annual surveys provided<br />

appropriate monitoring systems to help<br />

education stakeholders (UNICEF and<br />

UNESCO included) measure impact and<br />

effectiveness, and improve accountability?<br />

6. How are the “EMIS documents” of school<br />

register, student register, Pupil record card and<br />

instructions for use, used?<br />

7. What other data are systematically collected,<br />

how, when and by whom?<br />

Relevance / Appropriateness:<br />

8. In designing the survey, was an assessment of<br />

the needs of the educational institutions,<br />

district authorities and MoEs ever taken into<br />

consideration?<br />

9. Are the questionnaires well designed and<br />

appropriate?<br />

Director General, Planning officer<br />

Regional office, district office, schools<br />

Ask for examples. Map to common EMIS<br />

models<br />

UNICEF, survey team, DG, Planning<br />

Annual reports<br />

Who would do it? What resources are needed,<br />

how can responsibility be shifted?<br />

DG, Planning, regional, district, school<br />

UNICEF, UNESCO. EFA FTI<br />

Link to 1. How is data used? Specific examples<br />

UNICEF, <strong>Survey</strong> team, Planning regional,<br />

district, school, annual reports<br />

Link to 2<br />

UNICEF, UNESCO, EFA, FTI<br />

LINK to 1, examples<br />

UNICEF, DG, Planning, Region, district<br />

school.<br />

How long have they been in use, is data<br />

centrally collected? Is this done electronically?<br />

DG, planning, region, district, schools.<br />

Is there duplication?<br />

<strong>Survey</strong> design team, UNICEF, DG, Planning,<br />

regions, districts, annual reports<br />

Has Ministry ever suggested additions to<br />

surveys? LINKS to 1 and 5<br />

<strong>Survey</strong> team, consultant, Schools<br />

Problems with completion?<br />

10. Are the questionnaires really gender sensitive? Consultant. Disaggregation where appropriate?<br />

11. Do they provide appropriate data to highlight<br />

the needs of vulnerable and marginalised<br />

children?<br />

UNICEF, consultant<br />


Research Question<br />

12. Are the survey results used for MoE planning<br />

and management purposes and if so, to what<br />

extent?<br />

13. Are the survey results documented and<br />

presented fully in line with the requirements<br />

and wishes of the MoE authorities?<br />

14. Is feedback given to the end-users, in other<br />

words, are districts and educational institutions<br />

provided with the documented and user-friendly<br />

survey results?<br />

15. Are educational institutions and districts<br />

motivated to carry out and collaborate with the<br />

annual surveys? Do the results from the survey<br />

motivate districts and educational institutions<br />

to improve their performance?<br />

16. Is the planning and implementation of the<br />

survey guided by cross-cutting principles of<br />

gender sensitivity, partnerships, national<br />

ownership, capacity development, bottom-up<br />

approach to programming, and participation of<br />

children and young people?<br />

17. Does the data collected properly match the data<br />

requirements of the EMIS information system?<br />

Efficiency:<br />

18. Are the annual surveys carried out in a timely<br />

manner, taking into account climatic, logistic<br />

and security constraints?<br />

19. To what extent are the communities and<br />

educational institutions involved in planning,<br />

timing and carrying out the routine surveys?<br />

20. Is the total enumeration approach employed the<br />

preferred method for carrying out the survey,<br />

or would a stratified sampling method be<br />

feasible?<br />

21. Is the timing/content and duration of<br />

enumerator training appropriate?<br />

22. To what extent is pilot testing used to improve<br />

skills and monitor training outcomes?<br />

23. Does the payment of incentives skew outcomes<br />

or promote rent-seeking attitudes?<br />

24. In collecting and collating the data, do<br />

adequate validity checks take place at the point<br />

of data collection, data cleaning and data entry<br />

into the information system?<br />

Data Source<br />

DG, planning, region, district<br />

DG, Planning, region, district schools<br />

What reports do you get? How are the reports<br />

presented (paper, Book, electronically). What<br />

would your preference be?<br />

DG, Planning, region, district schools<br />

Link to 13<br />

Regions, districts, schools, survey teams<br />

Is there resistance to the process? If so, why?<br />

Are they perceived to be valuable?<br />

UNICEF,<br />

How significant is this?<br />

Consultant, UNESCO documents. EFA<br />

indicators. Link to 1<br />

UNICEF, annual reports, survey team<br />

Seek general reply from DG and Planning<br />

UNICEF, survey teams<br />

Consultant<br />

UNICEF, survey team, evaluation reports<br />

What training?<br />

UNICEF, survey teams design reports<br />

UNICEF, Planning, annual reports<br />

Evidence or opinion? Survey team<br />

UNICEF, Annual <strong>Report</strong>s, <strong>Survey</strong> teams.<br />


Research Question<br />

25. Are data integrity and validity checks carried<br />

out, in terms of random visits to educational<br />

institutions?<br />

26. Has the cost of carrying out the survey been<br />

reasonable in relation to other parts of the<br />

country programme?<br />

27. Can recommendations be made as to cost<br />

saving elements in terms of funding the routine<br />

annual survey?<br />

Impact:<br />

28. Do the annual survey and the published results<br />

contribute to increased enrolment rates, greater<br />

opportunities for girls, increased numbers of<br />

teachers and a larger number of quality<br />

schools?<br />

29. Have there been unintended positive or<br />

negative consequences, as a result of the<br />

annual survey taking place?<br />

Sustainability:<br />

30. Is the recurring process of conducting the PES<br />

likely to continue if/when UNICEF withdraws<br />

its funding support?<br />

31. Can recommendations be made about shifting<br />

the responsibility for data collection to the<br />

districts and educational facilities, as opposed<br />

to a donor-driven and donor-coordinated<br />

method of data collection while maintaining its<br />

validity?<br />

Coverage:<br />

32. Does the PES equally cover urban and rural<br />

areas?<br />

33. Did UNICEF’s assistance in any way influence<br />

the existing inequities between urban and rural<br />

educational institutions, provided these exist?<br />

34. Does the PES provide sufficient data on<br />

alternative education pathways such as<br />

Nomadic and Qur’anic Schools?<br />

Coordination:<br />

35. Does UNICEF effectively work with other<br />

stakeholders (local, district, regional<br />

authorities, NGOs, UN agencies) during all<br />

stages of the survey process?<br />

36. Are all international development partners and<br />

NGOs fully aware of the routine survey of<br />

primary education funded by UNICEF and are<br />

Data Source<br />

DG, Planning, Region, district, UNICEF,<br />

annual reports<br />

UNICEF, Annual work plan, budget<br />

Consultant, UNICEF<br />

Discuss in relation to question 3, development of<br />

EMIS<br />

UNICEF, FTI, DG, Planning<br />

Link to effectiveness<br />

DG, Planning,<br />

Link to 2, 3<br />

DG, Planning, Region, district<br />

If an EMIS system owned by the MoE were established, how would it be structured and decentralised?<br />
What resources would be needed?<br />
Where would you start?<br />

UNICEF, annual reports,<br />

Is this relevant here?<br />

UNICEF, annual reports<br />

UNESCO. Relevant NGOs, district, regions,<br />

Planning, DG<br />

Relevant NGOs, UNESCO, UNDP…????<br />


Research Question<br />

Data Source<br />

these partners routinely supplied with the<br />

survey results?<br />

37. Does any duplication of activities take place, in<br />

terms of surveys of primary education in<br />

<strong><strong>Somali</strong>a</strong>?<br />

38. How can coordination be improved for better<br />

efficiency and sustainability?<br />

Coherence:<br />

39. Does the type of data collected match the<br />

reporting requirements of national and global<br />

frameworks: the Reconstruction and<br />

Development Framework; the Joint Needs<br />

Assessment (<strong>JNA</strong>), the UNTP IMEP, the MDG<br />

targets and the EFA goals?<br />

40. How consistent has the planning and<br />

implementation of the PES been within a<br />

human-rights based and gender sensitive<br />

approach to programming?<br />

Schools, district, regions, planning<br />

What other data is collected?<br />

Link to 6, 7<br />

Consultant,<br />

recommendations<br />

DG, Planning, EFA, FTI Reconstruction and<br />

development framework<br />

Link to 1 and 5<br />

UNICEF, consultant, annual reports, annual<br />

work plans<br />


Appendix 5 List of Interviewees<br />

Names of Interviewees/Focus Groups Organisational Status Date<br />
Maurice Robson UNICEF <strong>Education</strong> Officer 1/7/08<br />
Edith Mururu <strong>Education</strong> Specialist UNICEF 3/7/08<br />
Lawrence Mulinge Computer Programming Consultant 4/7/08<br />
Woki Munyui <strong>Education</strong> Specialist UNICEF 7/7/08<br />
<strong>Education</strong> Sector Meeting Nairobi 8/7/08<br />
Catherine Remmelzwaal <strong>Education</strong> Specialist UNICEF 8/7/08<br />

Maulid Warfa <strong>Education</strong> Specialist UNICEF 13/7/08<br />

Safia Jibril Abdi Project officer UNICEF 12/7/08<br />

Rashid Hassan Muse Project Officer UNICEF 14/7/08<br />

Hassan Haji Mohmoud Minister of <strong>Education</strong> <strong>Somali</strong>land 12/7/08<br />

Ali Abdi Odowa Director General of <strong>Education</strong> <strong>Somali</strong>land 12/7/08<br />

Abdi Abdulahi Director of Planning <strong>Somali</strong>land 14/7/08<br />

James Wamwangi Field Coordinator UNESCO 15/7/08<br />

Mohamud Bile Dubbe Minister of <strong>Education</strong> Puntland 19/7/08<br />

Abdi Mohamed Gobbe Vice Minister of <strong>Education</strong> Puntland 19/7/08<br />

Mohamed Jama Director of Examination Puntland 19/7/08<br />

Said Farah Mohd Director of Curriculum and Teacher Training Puntland 19/7/08<br />

Khalif Yusuf Muse Regional <strong>Education</strong> Officer Nugal 19/7/08<br />

Abdulkadir Mohamed Enumerator 19/7/08<br />

Said Mohamed Hassan UNICEF Programme Assistant 19/7/08<br />

Abdirashid Ismail OIC Save the Children Puntland 19/7/08<br />

Ahmed Ali Shire SCOTT coordinator Save the Children Puntland 19/7/08<br />

Ahmed Abbas Ali Basic <strong>Education</strong> Manager Save the Children Puntland 19/7/08<br />
Representatives of 11 NGOs AET, SCUK, Care <strong><strong>Somali</strong>a</strong>, SC-Denmark, CfBT <strong>Education</strong> Trust, Caritas <strong>Somali</strong>land, UNESCO, Trocaire, Diaz, WFL 25/7/08 and 18/8/08<br />

Christophe Mononye UNESCO Program Specialist 13/8/08<br />

Kimanzi Muthengi Secondary <strong>Survey</strong> Manager UNESCO 13/8/08<br />


Appendix 6 EMIS Development Planning Matrix<br />

Activity<br />

Year<br />

Assumptions Threats Capacity Development<br />

1 2 3 4 5<br />

Vision and Rationale for EMIS accepted X A clear vision of role of the PES and Lack of commitment to evidence based Senior MoE Officials<br />

EMIS is enunciated by senior staff planning and monitoring<br />

Relational data base design and<br />

construction<br />

X X Current data base design is used and<br />

updated as necessary<br />

Secondary and Non formal data bases not<br />

consistent<br />

EMIS coordinator<br />

EMIS manager<br />

Form design<br />

Use Current model with added data X X Current forms updated to include new<br />

data<br />

Consistency across zones as EMIS units<br />

develop independence<br />

Update of previous years report X School reports generated and used as <strong>Report</strong>ing procedures not developed EMIS staff<br />

basis for following years data collection<br />

Increase data set with<br />

Phase 1 additions (EFA data and<br />

school types)<br />

X Schools given adequate warning on<br />

preparation of age/grade data<br />

EMIS tools not being used<br />

Head teachers<br />

Enumerators<br />

Phase2 additions (ECD background,<br />

X Resources available Information is not known Heads of relevant sections of Ministry<br />

NFE, etc)<br />

Phase 3 additions (qualitative<br />

aspects)<br />

X Demand warrants this data Properties<br />

can be quantified objectively<br />

Data becomes too complex for collection. EMIS coordinator<br />

EMIS manager<br />

Area specialists<br />

Data Collection, Cleaning and Validation X<br />

Audit trail X<br />

Documentary evidence collected of Too much reliance on UNICEF staff REOs, EMIS coordinator<br />

school visits and data checking<br />

GPS Way points X GPS are available Technical competence not developed Enumerators<br />

Replace Enumerators with Head<br />

teachers<br />

X Quality assurance system developed to<br />

allow regular contact with schools<br />

Isolated schools missed<br />

Nomadic schools not found<br />

REOs<br />

Head Teachers<br />

Conduct independent validation<br />

X There is still a need to confirm data Cost<br />

EMIS coordinator<br />

survey<br />

collection processes are accurate<br />

Check data from EMIS Tools X Schools understand the need to provide<br />

documentary evidence for data<br />

provided<br />

Select different key data elements for<br />

validation each year<br />

X<br />

X X X X Physical check of some data elements<br />

each year will improve validity<br />

Timing<br />

Conduct Census in February X Planning time is not available to<br />

commence in September<br />

Ethical issues of false data are not<br />

understood<br />

Deliberate misrepresentation of the<br />

physical reality can still take place<br />

Early rains, continued insecurity in some<br />

areas<br />

Conduct Census in September X Implies two data collections in 2009 Resources are available<br />

REOs, Head Teachers,<br />

Enumerators<br />

REOs, Head Teachers<br />

Data Entry, Pre-coding and cleaning<br />

One entry in Zone, one in Nairobi X Adequate training and support is Need to communicate between two teams. EMIS unit data entry staff<br />

available to data entry staff.<br />

Both entries in Zone X Successful data entry in year 1 Professional qualities of staff Data base manager<br />


Activity<br />

Year<br />

Assumptions Threats Capacity Development<br />

1 2 3 4 5<br />

Enumerators and supervisors<br />

Employment X Sound and transparent processes are in Lack of professional commitment MoE senior staff<br />

place. Performance reviews take place Sound<br />

Training<br />

X<br />

Greater emphasis on understanding. Lack of effective performance<br />

Enumerators<br />

Competencies demonstrated to qualify.<br />

New modules developed<br />

monitoring. Inadequate numbers trained<br />

to allow rejecting ‘failures’<br />

<strong>Report</strong>ing and analysis<br />

School report X Training of Head teachers and CEC on Proposals developed are ignored Head teachers, CEC<br />

use<br />

District reports<br />

X<br />

Training on use for planning and<br />

DEOs<br />

monitoring<br />

Regional reports<br />

X<br />

Training on use for planning and Quality assurance procedures not in place REOs<br />

monitoring<br />

Zone and <strong><strong>Somali</strong>a</strong><br />

X<br />

Current model of report reduced to key MoE does not embrace evidence based Ministry senior staff<br />

tables, indicators and trends<br />

planning<br />

Thematic reports<br />

X<br />

Special interests are identified and<br />

technical assistance employed<br />

Interrogate Access data base X Undertaken at zone level Capacity is not available in EMIS Unit Data base manager<br />

Front end reporting software<br />

X X X X Develops over time. Specialist<br />

All stakeholders,<br />

developed<br />

programmer employed<br />

Indicators transported to DEVINFO X UNICEF staff responsible for this task MoE won’t release data NGO staff<br />

School Mapping<br />

School location data validated<br />

X All GPS location data collected Understanding of mapping processes Enumerators<br />

EMIS Unit staff<br />

Data Transported to appropriate<br />

mapping software<br />

X Specialist Mapping expertise employed Poor match between school location and<br />

settlement location<br />

<strong>Education</strong> data linked with health,<br />

X Useful model for linking data sets A perceived irrelevance of crosscutting EMIS staff<br />

emergency data<br />

developed<br />

approach<br />

Establish EMIS Unit Staff of EMIS Coordinator<br />

Quality staff not available All EMIS staff<br />

<strong>Somali</strong>land X EMIS data base manager<br />

Staff leave after capacity built All EMIS staff<br />

Puntland X Five data entry and retrieval staff Ministry don’t make financial<br />

All EMIS staff<br />

South Central Zone<br />

X<br />

Two secretarial and administrative<br />

Assistants Ministries financial<br />

commitment<br />

Continued instability<br />

All EMIS staff<br />

commitment grows over time<br />

Lack of functioning ministry<br />

Maintenance and update plan for X<br />

Internet, anti virus software available IT maintenance contract not issued EMIS manager and data entry staff<br />

computing equipment<br />

Equipment not maintained<br />

Technical Assistance needed in each X X X X X UNICEF ongoing support Availability of resources<br />

zone<br />

PES International consultant<br />

available<br />

Provides overall coordination<br />

Does not coordinate across zones and<br />

EMIS Units<br />


Appendix 7 Evidence Based Planning<br />

A major thrust of this evaluation has been to suggest the development of systems and reporting<br />
techniques that will allow evidence-based planning processes to be used in the education sector of<br />
<strong><strong>Somali</strong>a</strong>. It has been seen in the text that while agencies and NGOs have been making systematic use of<br />
the PES report, MoEs, regions, districts and school communities have not been making use of the data. It<br />
has also been argued that this is in large part due to the lack of suitable reporting strategies to make<br />
the necessary data available to these stakeholders.<br />

There is considerable evidence in support of the proposition that decentralising decision making and<br />

responsibility can lead to enhanced education opportunities for children. This process is explained<br />

succinctly in a UNICEF paper studied in the desk review for this work. It said in part:<br />

The overall goal is to bring interested people together, to encourage them to identify and<br />

prioritize key improvements that need to be made, and then to plan how the<br />

improvements can be made. In a sense, this would amount to a process of diagnosis of<br />

challenges faced by school-aged children in the areas, from which should emerge an<br />

interest and commitment to improve schooling. Much of this would be built upon the<br />

processes that have been established by Community <strong>Education</strong> Committees (CECs), the<br />

assumption being that the most effective agents of change are likely to be parents and<br />

those that represent their vested interests.<br />

Mendelsohn 2007<br />

Schools can be provided with simple forms of data that allow them to compare their own school’s<br />
status with that of other comparable schools. These data can describe infrastructure quality (inputs)<br />
or student achievement (outputs). The existence of such data allows and encourages the<br />
generation of discussion and debate about the key issues. More importantly, it provides objective evidence<br />
for such debate. However, this evidence is often only the ‘what’ component, and local input<br />
needs to supply the ‘why’. For example, if a school finds that it has a low gross enrolment ratio<br />
in comparison with other schools in its district, it can conduct its own investigation into why that is the<br />
case and then identify possible solutions that would work in that specific context. In this way, generic<br />
overall policy, which may lack applicability in some areas, is replaced with differentiated strategies that are<br />
designed around local issues and so gain greater ownership and support.<br />
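The kind of simple comparison such a school report could carry might be sketched as follows, with hypothetical school names and figures: one school's gross enrolment ratio is set against the median of comparable schools in its district.<br />

```python
# Sketch of benchmarking one school against comparable schools in its
# district. School names and GER figures are hypothetical.
from statistics import median

district_ger = {"School A": 62.0, "School B": 48.5,
                "School C": 71.3, "School D": 55.0}
own_school = "School B"

benchmark = median(district_ger.values())
gap = district_ger[own_school] - benchmark
# The comparison supplies the 'what'; the school investigates the 'why' locally.
below_benchmark = gap < 0
```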

It has also been suggested that school communities will need specialist assistance in developing<br />

submissions and proposals to address the key issues when there is need for outside assistance. This<br />

process has begun through mobilization agents and community development initiatives.<br />

When districts and regions receive requests for assistance they will need to be able to prioritize<br />
the requests effectively. Hence they will need district and regional overviews of school status and performance to<br />
allow the most rational decisions to be made. Such reports are not complex to generate once the data are<br />
held in an Access database and the requisite reports have been designed using Access’s reporting<br />
facilities.<br />

The need for an effective quality assurance system has been discussed in this report in several places.<br />

Such a system can be in part supported by effective monitoring of region, district and school<br />

performance. Supervisors are provided with objective data, which can form the basis of mentoring<br />

discussion and the formation of school improvement processes.<br />

Mendelsohn (2007) provided a matrix of possible ideas for mobilizing school improvement activity that<br />

was mainly orientated around a school mapping process. In the matrix below these ideas have been<br />

adjusted and added to so that they can better reflect the EMIS context. These are presented as examples of<br />

possible activities that can be generated from reports and thus provide indicators as to the possible nature<br />

of those reports. This is not intended to be exhaustive, but rather an indicative list.<br />


Issue | Indicator | Goals and challenges<br />
School enrolment: identify groups of people and areas where few children are in school | GER, NER; local knowledge of the location of out-of-school children | Plan interventions based on information collected locally<br />
School infrastructure: need for new schools | Average distance to nearest school | Identify construction sources<br />
Need for additional classrooms | Number of pupils per room | Identify construction sources<br />
Condition of classrooms | Construction materials; general condition descriptor; expert assessment of condition | Plan maintenance programme<br />
Latrines | Number of pupils per latrine | Priority to schools with no latrines<br />
Water | Quantity/cost per pupil | Priority to schools with no/inadequate water supplies<br />
Drop out: identify reasons for high drop-out rates when they occur | Drop-out rate (reasons?); survival rate to grade 5; comparison with other schools, districts and regions | How is a high drop-out rate at the end of lower primary addressed?<br />
Teaching resources: need to identify a basic list of resources that all schools should have | Number of text books per pupil; comparison with other schools |<br />
Student achievement | Mean scores on external exams; student literacy rate; percentage of students mastering a set of defined basic skills | Need to be careful to benchmark schools against suitable comparable schools<br />
Student attendance; teacher attendance | Attendance rate; comparison with other comparable schools | Need good records using EMIS tools<br />

Appendix 8 Glossary<br />

Achievement. Performance on standardized tests or examinations that measure knowledge or competence<br />

in a specific subject area. The term is sometimes used as an indication of education quality in an<br />

education system or when comparing a group of schools.<br />

Adult literacy rate. Number of literate persons aged 15 and above, expressed as a percentage of the total<br />

population in that age group. Different ways of defining and assessing literacy yield different results<br />

regarding the number of persons designated as literate.<br />

Apparent Intake Rate. The total number of new entrants in the first grade of primary school regardless<br />

of age, expressed as a percentage of the population of the official primary school entrance age.<br />

Basic education. The whole range of educational activities, taking place in various settings (formal, non-formal<br />

and informal), that aim to meet basic learning needs. It has considerable overlap with the earlier<br />

concept ‘fundamental education’.<br />

Basic skills. Usually refers to some minimum competence in reading, writing and calculating (using<br />

numbers). The term is synonymous in many uses with basic learning needs.<br />

Constant prices. A way of expressing values in real terms, enabling comparisons across a period of<br />

years. To measure changes in real national income or product, economists value total production in each<br />

year at constant prices using a set of prices that applied in a chosen base year.<br />

Continuing (or further) education. A general term referring to a wide range of educational activities<br />

designed to meet the basic learning needs of adults. See also Adult education and Lifelong learning.<br />

Criterion Referenced Assessment. Assessment is based on observing previously defined competencies<br />

and is reported as a set of competencies (see Norm Referenced Assessment).<br />

Curriculum. A course of study pursued in educational institutions. It consists of select bodies of<br />

knowledge, organized into a planned sequence, that are conveyed by educational institutions, primarily<br />

schools, to facilitate the interaction of educators and learners. When applied to adult, non-formal and<br />

literacy programmes, the term often implies a less formalized organization of learning materials and<br />

methods than in schools and tertiary institutions. Indeed, in programmes aimed at individual<br />

empowerment and social transformation, the curriculum may be developed as a dialogue with and<br />

between learners.<br />

DevInfo. DevInfo is a powerful database system that is used to compile and disseminate data on human<br />

development. The software package has evolved from a decade of innovations in database systems that<br />

support informed decision making and promote the use of data to advocate for human development. The<br />

DevInfo project is an interagency initiative managed by UNICEF on behalf of the United Nations (UN)<br />

System.<br />

Drop-out rate by grade. Percentage of pupils or students who drop out from a given grade in a given<br />

school year. It is the difference between 100% and the sum of the promotion and repetition rates.<br />
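The residual relationship in this definition can be checked with a line of arithmetic; the promotion and repetition figures below are invented for illustration.

```python
# Invented flow rates for one grade in one school year (percentages).
promotion_rate = 78.0   # pupils promoted to the next grade
repetition_rate = 12.0  # pupils repeating the same grade

# Drop-out rate is the residual: 100% minus promotion and repetition.
dropout_rate = 100.0 - (promotion_rate + repetition_rate)
print(dropout_rate)  # 10.0
```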

Early childhood care and education (ECCE). Programmes that, in addition to providing children with<br />

care, offer a structured and purposeful set of learning activities either in a formal institution or as part of a<br />

non-formal child development programme. ECCE programmes are normally designed for children from<br />

age 3 and include organized learning activities that constitute, on average, the equivalent of at least 2<br />

hours per day and 100 days per year.<br />

<strong>Education</strong> for All Development Index (EDI). Composite index aimed at measuring overall progress<br />

towards EFA. At present, the EDI incorporates four of the most easily quantifiable EFA goals – universal<br />


primary education as measured by the net enrolment ratio, adult literacy as measured by the adult literacy<br />

rate, gender as measured by the gender-specific EFA index, and quality of education as measured by the<br />

survival rate to Grade 5. Its value is the arithmetical mean of the observed values of these four indicators.<br />
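A worked example of the arithmetical mean in this definition, using invented values for the four component indicators (each expressed as a proportion between 0 and 1):

```python
# Invented indicator values, each expressed as a proportion (0-1).
ner = 0.40               # net enrolment ratio, primary
adult_literacy = 0.35    # adult literacy rate
gei = 0.60               # gender-specific EFA index
survival_grade5 = 0.80   # survival rate to Grade 5

# The EDI is the arithmetic mean of the four component indicators.
edi = (ner + adult_literacy + gei + survival_grade5) / 4
print(round(edi, 4))  # 0.5375
```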

<strong>Education</strong> Management Information System (EMIS). An EMIS is an organized group of information<br />

and documentation services that collects, stores, processes, analyses and disseminates information for<br />

educational planning and management. As such it is more than a data storage system. Rather it integrates<br />

system components of inputs, processes, outputs and reporting in such a way that accuracy, flexibility and<br />

adaptability, and efficiency are maximized.<br />

Enrolment. Number of pupils or students enrolled at a given level of education, regardless of age. See<br />

also gross enrolment ratio and net enrolment ratio.<br />

Entrance age (official). Age at which pupils or students would enter a given programme or level of<br />

education assuming they had started at the official entrance age for the lowest level, studied full-time<br />

throughout and progressed through the system without repeating or skipping a grade. The theoretical<br />

entrance age to a given programme or level may be very different from the actual or even the most<br />

common entrance age.<br />

Fast Track Initiative (FTI). The <strong>Education</strong> for All-Fast-Track Initiative, launched by the World Bank, is<br />

a global partnership between developed and developing countries to promote free, universal basic<br />

education by 2015. The initiative seeks to ensure that no country that has demonstrated its commitment to<br />

education will fail to meet this goal for lack of resources or technical capacity. In addition to mobilizing<br />

funds, the initiative supports the design of comprehensive sector-wide education plans and fills gaps in<br />

policy, capacity and data.<br />

Functional literacy/illiteracy. A person is functionally literate/illiterate who can/cannot engage in all<br />

those activities in which literacy is required for effective functioning of his or her group and community<br />

and also for enabling him or her to continue to use reading, writing and calculation for his or her own and<br />

the community’s development.<br />

Gender parity index (GPI). Ratio of female to male values (or male to female, in certain cases) of a<br />

given indicator. A GPI of 1 indicates parity between sexes; a GPI above or below 1 indicates a disparity<br />

in favour of one sex over the other.<br />

Gender-specific EFA index (GEI). Composite index measuring relative achievement in gender parity in<br />

total participation in primary and secondary education as well as gender parity in adult literacy. The GEI<br />

is calculated as an arithmetical mean of the gender parity indices of the primary and secondary gross<br />

enrolment ratios and of the adult literacy rate.<br />
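The two composite calculations above can be sketched together; the enrolment and literacy figures below are invented for illustration.

```python
def gpi(female, male):
    """Gender parity index: ratio of the female to the male value."""
    return female / male

# Invented gross enrolment ratios and literacy rates (percentages).
gpi_primary = gpi(44.0, 55.0)    # primary GER, girls vs boys
gpi_secondary = gpi(20.0, 32.0)  # secondary GER
gpi_literacy = gpi(28.0, 42.0)   # adult literacy rate

# The GEI is the arithmetic mean of the three gender parity indices.
gei = (gpi_primary + gpi_secondary + gpi_literacy) / 3
print(round(gei, 3))  # 0.697
```

All three parity indices here fall below 1, indicating disparity in favour of boys and men, and the GEI summarizes that in a single figure.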

Grade. Stage of instruction usually equivalent to one complete school year.<br />

Graduate. A person who has successfully completed the final year of a level or sublevel of education. In<br />

some countries completion occurs as a result of passing an examination or a series of examinations. In<br />

other countries it occurs after a requisite number of course hours have been accumulated. Sometimes both<br />

types of completion occur within a country.<br />

Gross enrolment ratio (GER). Total enrolment in a specific level of education, regardless of age,<br />

expressed as a percentage of the population in the official age group corresponding to this level of<br />

education. The GER can exceed 100% due to early or late entry and/or grade repetition.<br />
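The possibility of a GER above 100% follows directly from the formula, as this small example with invented figures shows:

```python
# Invented figures: primary enrolment regardless of age, and the
# population of official primary-school age.
enrolment = 9200
school_age_population = 8000

# GER: enrolment as a percentage of the official-age population.
ger = 100.0 * enrolment / school_age_population
print(ger)  # 115.0 -- above 100% because over- and under-age pupils
            # are counted in the numerator but not the denominator
```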

Gross intake rate (GIR). Total number of new entrants in the first grade of primary education,<br />

regardless of age, expressed as a percentage of the population at the official primary-school entrance age.<br />

Gross domestic product (GDP). Sum of gross value added by all resident producers in the economy,<br />

including distributive trades and transport, plus any product taxes and minus any subsidies not included in<br />


the value of the products.<br />

Gross national product (GNP). Gross domestic product plus net receipts of income from abroad. As<br />

these receipts may be positive or negative, GNP may be greater or smaller than GDP.<br />

Gross national product per capita. GNP divided by the total population.<br />

Language (or medium) of instruction. Language(s) used<br />

to convey a specified curriculum in a formal or non-formal educational setting.<br />

Literacy. According to UNESCO’s 1958 definition, it is the ability of an individual to read and write<br />

with understanding a simple short statement related to his/her everyday life. The concept of literacy has<br />

since evolved to embrace multiple skill domains, each conceived on a scale of different mastery levels<br />

and serving different purposes.<br />

Literacy projects/programmes. Limited-duration initiatives designed to impart initial or ongoing basic<br />

reading, writing and/or numeracy skills.<br />

Literate/Illiterate. The term refers to a person who can/cannot read and<br />

write with understanding a simple statement related to her/his everyday life.<br />

Millennium Development Goals (MDG). In September 2000, 189 countries signed the United Nations<br />

Millennium Declaration committing themselves to eradicating extreme poverty in all its forms by 2015.<br />

To help track progress toward these commitments, a set of time-bound and quantified goals and targets,<br />

called the Millennium Development Goals, was developed for combating poverty in its many<br />

dimensions - including reducing income poverty, hunger, disease, environmental degradation and gender<br />

discrimination.<br />

Multiple Indicator Cluster <strong>Survey</strong> (MICS). The MICS programme developed by UNICEF assists<br />

countries in filling data gaps for monitoring the situation of children and women through statistically<br />

sound, internationally comparable estimates of socioeconomic and health indicators. The household<br />

survey programme is the largest source of statistical information on children.<br />

Net attendance rate (NAR). Number of pupils in the official age group for a given level of education<br />

who attend school in that level, expressed as a percentage of the population in that age group.<br />

Net enrolment ratio (NER). Enrolment of the official age group for a given level of education,<br />

expressed as a percentage of the population in that age group.<br />

Net intake rate (NIR). New entrants to the first grade of primary education who are of the official<br />

primary school entrance age, expressed as a percentage of the population of that age.<br />

New entrants. Pupils entering a given level of education for the first time; the difference between<br />

enrolment and repeaters in the first grade of the level.<br />

Non-formal education. Learning activities typically organized outside the formal education system. The<br />

term is generally contrasted with formal and informal education. In different contexts, non-formal<br />

education covers educational activities aimed at imparting adult literacy, basic education for out-of-school<br />

children and youth, life skills, work skills, and general culture. Such activities usually have clear learning<br />

objectives, but vary in duration, in conferring certification for acquired learning, and in organizational<br />

structure.<br />

Norm Referenced Assessment. A form of assessment that ranks and compares students. Commonly<br />

reported as a percentage score (see Criterion Referenced Assessment).<br />


Out-of-primary-school children. Children in the official primary school age range who are not enrolled<br />

in primary school.<br />

Percentage of new entrants to the first grade of primary education with ECCE experience. Number<br />

of new entrants to the first grade of primary school who have attended the equivalent of at least 200 hours<br />

of organized ECCE programmes, expressed as a percentage of the total number of new entrants to the<br />

first grade.<br />

Percentage of repeaters. Number of pupils enrolled in the same grade or level as the previous year,<br />

expressed as a percentage of the total enrolment in that grade or level.<br />

<strong>Primary</strong> <strong>Education</strong>. Programmes normally designed on a unit or project basis to give pupils a sound<br />

basic education in reading, writing and mathematics and an elementary understanding of subjects such as<br />

history, geography, natural sciences, social sciences, art and music. Religious instruction may also be<br />

featured. These subjects serve to develop pupils’ ability to obtain and use information they need about<br />

their home, community, country, etc. Also<br />

known as elementary education.<br />

Public current expenditure on education as percentage of total public expenditure on education.<br />

Recurrent public expenditure on education expressed as a percentage of total public expenditure on<br />

education (current and capital). It covers public expenditure for both public and private institutions.<br />

Current expenditure includes expenditure for goods and services that are consumed within a given year<br />

and have to be renewed the following year, such as staff salaries and benefits; contracted or purchased<br />

services; other resources, including books and teaching materials; welfare services and items such as<br />

furniture and equipment, minor repairs, fuel, telecommunications, travel, insurance and rent. Capital<br />

expenditure includes expenditure for construction, renovation and major repairs of buildings and the<br />

purchase of heavy equipment or vehicles.<br />

Public expenditure on education. Total public finance, devoted to education by local, regional and<br />

national governments, including municipalities. Household contributions are excluded. Includes both<br />

current and capital expenditure.<br />

Public expenditure on education as percentage of total government expenditure. Total current and<br />

capital expenditure on education at every level of administration, i.e. central, regional and local<br />

authorities, expressed as a percentage of total government expenditure (on health, education, social<br />

services, etc.).<br />

Pupil. A child enrolled in pre-primary or primary education. Youth and adults enrolled at more advanced<br />

levels are often referred to as students.<br />

Pupil/teacher ratio (PTR). Average number of pupils per teacher at a specific level of education, based<br />

on headcounts for both pupils and teachers.<br />

Repetition rate by grade. Number of repeaters in a given grade in a given school year, expressed as a<br />

percentage of enrolment in that grade the previous school year.<br />

School life expectancy (SLE). Number of years a child of school entrance age is expected to spend at<br />

school, including years spent on repetition. It is the sum of the<br />

age-specific enrolment ratios for primary, secondary, post-secondary non-tertiary and tertiary education.<br />

School-age population. Population of the age group officially corresponding to a given level of<br />

education, whether enrolled in school or not.<br />

Secondary education. Lower secondary education is generally designed to continue the basic<br />

programmes of the primary level but the teaching is typically more subject-focused, requiring more<br />

specialized teachers for each subject area. The end of this level often coincides with the end of<br />


compulsory education.<br />

In upper secondary education, the final stage of secondary education in most countries, instruction is often<br />

organized even more along subject lines, and teachers typically need a higher or more subject-specific<br />

qualification than at the lower secondary level.<br />

Survival rate by grade. Percentage of a cohort of pupils or students who are enrolled in the first grade of<br />

an education cycle in a given school year and are expected to reach a specified grade, regardless of<br />

repetition.<br />
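The calculation behind this rate is a straightforward cohort proportion; the figures below are invented for illustration.

```python
# Invented cohort figures: pupils who entered Grade 1 together, and how
# many of that cohort reach Grade 5 (repeaters still count as surviving).
cohort_grade1 = 1200
reached_grade5 = 780

# Survival rate to Grade 5 as a percentage of the entry cohort.
survival_rate_g5 = 100.0 * reached_grade5 / cohort_grade1
print(survival_rate_g5)  # 65.0
```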

Teachers or teaching staff. Number of persons employed full time or part time in an official capacity to<br />

guide and direct the learning experience of pupils and students, irrespective of their qualifications or the<br />

delivery mechanism, i.e. face-to-face and/or at a distance. Excludes educational personnel who have no<br />

active teaching duties (e.g. headmasters, headmistresses or principals who do not teach) and persons who<br />

work occasionally or in a voluntary capacity.<br />

Trained teacher. Teacher who has received the minimum organized teacher training (pre-service or in-service)<br />

normally required for teaching at the relevant level in a given country.<br />

Trainer. In the context of adult education, someone who trains literacy educators, providing pre-service<br />

or in-service training in adult literacy teaching methods.<br />

Transition rate to secondary education. New entrants to the first grade of secondary education in a<br />

given year, expressed as a percentage of the number of pupils enrolled in the final grade of primary<br />

education in the previous year.<br />

Triangulation. A concept borrowed from surveying. In a research context it refers to the process of using<br />

two or more data sources to confirm an observation or the acceptance of a hypothesis.<br />

Definitions based on those from<br />

http://portal.unesco.org/education/en/ev.php-URL_ID=43385&URL_DO=DO_TOPIC&URL_SECTION=201.html<br />

