
MONITORING AND EVALUATION FRAMEWORK
(Implications for Government)

Presented at the Senior Management Service Conference
by Kefiloe Masiteng
21 September 2004


INTRODUCTION

MANDATE
• Monitoring and evaluation of government performance has been identified as a responsibility of the PCAS.
• The Presidency may evaluate the performance of government against set goals and targets, the equitableness of resource allocation, and effectiveness and efficiency in service delivery across all levels.

OBJECTIVE
To inform the steps to be undertaken in creating a conducive environment that enables all levels of government, the private sector, communities and individuals to achieve their respective goals in service delivery and to improve performance.

Hence the need to assess essential capacity in government.


Monitoring versus Evaluation

• MONITORING: tracking changes in programme performance over time.
• EVALUATION: attributing programme outcomes to their causes.


MONITORING

WHY MONITOR
• Monitoring in the government-wide framework refers to a set of activity and milestone tracking techniques, all of which measure some aspect of government performance, including measurement of the current status and change over time (trend analysis) in any of the initiatives. (A minimal trend-analysis sketch follows below.)
• Monitoring tracks changes in services provided (outputs) and the desired results (outcomes), providing the basis for accountability in the utilisation of resources.

BENEFITS
• Monitoring can be put in place as a management tool that may be sustained over time. It can be used to improve initiatives by identifying aspects that are working according to plan and yielding positive results, while also identifying those initiatives that need mid-course corrections.

POA: monitoring of the progress made in attaining the goals set in the SONA and Makgotla.
Cluster POAs: bi-monthly reporting to Cabinet.
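As a purely illustrative sketch (not part of the original presentation), the trend analysis mentioned above can be as simple as fitting a straight line to an indicator's time series; the indicator and its values below are invented.

```python
# A minimal sketch of indicator trend analysis: fit a straight line to a
# time series and read off the average annual change. Figures are invented.
import numpy as np

years = np.array([2000, 2001, 2002, 2003, 2004])
pct_served = np.array([61.0, 63.5, 66.0, 69.2, 71.8])  # hypothetical % of households served

slope, intercept = np.polyfit(years, pct_served, deg=1)
print(f"Average change: {slope:+.2f} percentage points per year")
```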


MONITORING COMPONENTS

Monitoring processes:
• Development and definition of indicators to measure the progress made towards meeting relevant objectives
• Data collection mechanisms for, and monitoring systems to collate data on, indicators
• Data verification, validation and systems clean-up
• Data analysis to determine outputs, outcomes and trends
• Report writing on the progress made on implementation
• Distribution and feedback mechanisms across the entire spectrum of relevant stakeholders

Capacity needed for government:
• Understanding of the POA on the government website
• Ability to develop relevant indicators for the initiatives and interventions arising from the cluster POA
• An information collection strategy for the developed indicators
• Analysis and verification of collected information
• Report writing
• A communication link with GCIS


DEVELOPMENT AND DEFINITION OF RELEVANT INDICATORS

• Indicator development is based on goals and objectives for government.
• These indicators may be calculated on the basis of descriptions and formulae allocated to measure progress made (monitoring) or to determine causality (impact assessment). Each indicator specifies:
  – the method for acquiring information on the indicator
  – responsibility for collection
  – the information/data source
  – the frequency of updating
• Agreement on evaluation methods

Role of the Presidency:
• Spearheading indicator development based on the POA
• Putting in place collection, collation and report-back/feedback mechanisms

(One possible shape for such an indicator record is sketched below.)
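As a minimal sketch, assuming no particular system, the indicator metadata listed above could be captured in a record like the following; the field names and example values are hypothetical.

```python
# A hypothetical indicator record covering the metadata the slide lists:
# formula, acquisition method, responsibility, data source, update frequency.
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str                # what is being measured
    formula: str             # how the value is calculated
    collection_method: str   # survey, administrative records, census, ...
    responsible_party: str   # department or unit accountable for collection
    data_source: str         # where the raw data come from
    update_frequency: str    # how often the value is refreshed

water_access = Indicator(
    name="Households with access to piped water (%)",
    formula="households_with_piped_water / total_households * 100",
    collection_method="household survey",
    responsible_party="sector department (hypothetical)",
    data_source="Stats SA General Household Survey",
    update_frequency="annual",
)
print(water_access)
```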


IMPACT ASSESSMENT

• The purpose of impact assessment is primarily to measure the degree of change attributable to a particular initiative or intervention.
• Impact assessment addresses the question of causality.
• What differentiates the two processes are the evaluation techniques, which might involve no more than trend analysis in the case of monitoring, as against the analytic techniques used in impact assessment.
• It determines how much of the observed change in the outcome (quality of life, access to services, etc.) at the population level can be attributed directly to the implementation of government policies and programmes, and not to other factors.
• The level of analysis for assessing the impact of government policies and programmes is the population (beneficiaries).

• Lessons from the Ten Year Review are crucial
• Planning for future government reviews based on TYR indicators
• Development of mid-term review indicators based on the MTSF
• Review and refinement of current TYR indicators


IMPACT ASSESSMENT PROCESSES

• Development of assessment frameworks (modelling)
• Collection and collation of data from different sources in relation to the developed models
• Regression analysis (logistic, multivariate, etc.) on dependent and independent variables (a minimal sketch follows below)
• Interpretation of results/findings to determine relationships
• Report writing on the impact of government interventions on the population
• Distribution of reports to relevant stakeholders

Implications for the Presidency:
• Advanced policy analysis skills
• Advanced data analysis skills
• Basic data mining
• Basic statistical modelling skills
• Econometrics (work with Treasury on economic models)
• Demographic modelling (work with Stats SA on demographic/population dynamics and the NSS)
• Working with departments to compile a compendium of indicators
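As an illustrative sketch only, and not the Presidency's actual method, a logistic regression of the kind named above might relate a binary outcome indicator to programme exposure plus a control variable; all variable names and data here are invented.

```python
# Illustrative only: logistic regression of a binary outcome (e.g. access
# to a service) on programme exposure and a control. Data are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1_000
exposed = rng.integers(0, 2, n)   # 1 if household is in a programme area (hypothetical)
income = rng.normal(0, 1, n)      # standardised control variable
# Simulated outcome: exposure raises the log-odds of access by 0.8.
log_odds = -0.5 + 0.8 * exposed + 0.4 * income
has_access = (rng.random(n) < 1 / (1 + np.exp(-log_odds))).astype(int)

X = sm.add_constant(np.column_stack([exposed, income]))
result = sm.Logit(has_access, X).fit(disp=False)
print(result.summary())           # the coefficient on 'exposed' estimates the programme effect
```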


DEVELOPMENT OF DATA COLLECTION MECHANISMS FOR INDICATORS

• To enable comparisons (demographic, social, economic, financial and corporate governance) across provinces, population groups, gender and age groups around government sectors over time and space.
• The data collected on indicators will thus have to accommodate such comparisons and be disaggregated within the developed systems and databases according to the above-mentioned categories, especially the GDC. (A small disaggregation sketch follows below.)

Role of Government:
• Data collection
• Verification
• Validation
• Report writing
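As a small illustrative sketch (not from the presentation), disaggregating a collected indicator by the categories named above can be a simple grouped aggregation; the data frame and column names are hypothetical.

```python
# Hypothetical example: disaggregate an access indicator by province and
# gender so the same measure can be compared across categories.
import pandas as pd

records = pd.DataFrame({
    "province":   ["Gauteng", "Gauteng", "Limpopo", "Limpopo"],
    "gender":     ["F", "M", "F", "M"],
    "has_access": [1, 1, 0, 1],
})

# Per-category access rates support the comparisons the slide describes.
by_group = records.groupby(["province", "gender"])["has_access"].mean()
print(by_group)
```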


INFORMATION MANAGEMENT

• The use of information systems in monitoring provides a reliable flow of information that allows management to keep abreast of progress in the implementation of policy thrusts, programmes and activities, based on decisions made in different gatherings.
• Information systems facilitate assessment of the quality, quantity and timeliness of policy and programme inputs, while operational constraints on programme and policy effectiveness are identified so that gaps may be addressed.
• They may further provide contextual information for evaluation processes.

Role of the Presidency:
• Reporting formats from FOSAD (the project card)
• EIMS: roll-out, training, commitment
• Integration with the NSS (urgent to review)


Three models may be applied in monitoring and evaluation activities in Government:

• High-level tertiary model: this model can be informed by the State of the Nation Address, Cabinet decisions and cluster priorities.
• Government-level monitoring and evaluation (PCAS): this model measures the progress made by government as a whole in addressing its objectives and implementing priority programmes.
• Departmental monitoring and evaluation initiatives: this level addresses the progress made by individual departments in implementing their programmes in line with government priorities. These include indicators to measure programme-level objectives (outputs), developed within each department and informed by its strategic framework.


Government model for systems integration at national level

[Diagram: the Government POA linked to executive information (CabEnet), planning, government statistics (departmental routine systems and the National Statistical System), M&E, programme-level statistics/information (departmental information systems), and provincial and local government M&E.]


OPERATIONALISING GOVERNMENT-WIDE MONITORING AND EVALUATION (PHASE 1)

Five results, each of which will be delivered as a report:
• A review of existing public service monitoring and evaluation systems. An early step in creating a national monitoring and evaluation system for government will involve reviewing existing departmental M&E systems so that existing capacity and capability are properly drawn upon.
• A review of government reporting requirements, procedures and needs.
• A review of progress in the development and implementation of government-wide M&E systems by central or coordinating departments.
• Results of consultations with all provincial administrations and FOSAD clusters on their performance indicators.
• A logic model and framework architecture for a national M&E framework (including a dashboard-style presentation of a national scorecard).


Monitoring and Evaluation

Refining the monitoring and evaluation system:
• Improve the quality of our outputs
• Provide an early warning system
• Improve the statistical and information base

[Diagram: reporting flows linking Cabinet, clusters and departments, FOSAD and MANCO(?), project teams and the Presidency.]


Monitoring and Evaluation

Government's Programme of Action: Economic Cluster (FIRST ECONOMY)

| Actions | Departments | Timeframe | Progress |
|---|---|---|---|
| 1.00 BROAD ECONOMIC STANCE | | | |
| 1.01 No significant changes to macroeconomic management foreseen | NT | Ongoing | Continuing |
| 1.02 Inflation targeting to remain in place | NT | Ongoing | Continuing |
| 1.03 The Executive should work with the monetary authorities to ensure that the objective of a stable and competitive exchange rate and appropriate inflation range is attained | NT | Ongoing | Continuing through ordinary channels |
| 1.04 Shift to microeconomic reforms | dti, NT, Presidency | Ongoing | Ongoing |
| 2.00 INCREASED LEVELS OF INVESTMENT IN PUBLIC INFRASTRUCTURE | | | |
| 2.01 Develop sector-specific infrastructure investment plans | DPE, DME, dti, NT, DOT | Nov-04 | Progressing |
| 2.02 The development of financing strategies to implement plans, including Public Private Partnerships and similar mechanisms | DPE, DME, dti, NT, DOT | Sep-04 | Progressing |
| 2.03 Implementation of an appropriate regulatory framework to stimulate new State Owned Enterprise investment in infrastructure | DPE, NT, Presidency | Sep-04 | Progressing |
| 2.04 Strengthening of sector regulators | DPE, DME, DOC | Continuous | |
| 2.05 Strengthening of the Government capacity to oversee the State Owned Enterprises | DPE, NT, Presidency | Continuous | Progressing |
| 2.06 Finalise CAPEX financing strategy | DPE et al | Sep-04 | Progressing |
| 2.07 Finalise Government-wide review of the State Owned Enterprises performance practices | DPE, NT, DPSA | Sep-04 | Draft policy to be completed by November 2004 |
| 2.08 Better utilisation of Public Investment Commission's Isibaya fund | NT, GDC | Nov-04 | Progressing |
| 2.09 Expedite port restructuring | DOT, DPE | a.s.a.p. | Bill to be tabled in NCOP by September 2004 |


Monitoring and Evaluation

PROGRAMME OF ACTION
Governance and Administration (G&A) Cluster

Policy Objective 4.00 - MACRO-ORGANISATION OF THE STATE
Programme Activity 4.04.2 - Build 60 Multi-Purpose Community Centres (MPCCs) and finalise plans to have at least one of these in each of our 284 municipal areas

Project Team: GCIS
People's Contract/Partnerships:
Timeframes: Dec-04

WORK IN PROGRESS
Strategic Policy Issues:
Performance Indicators:
Level of Progress:
Fast-Tracking Implementation:
Challenges:
Comments:


ROLE OF COORDINATING DEPARTMENTS

Institutions at the centre of government need to take the initiative in designing performance assessment systems for the whole of government, i.e. PCAS, OPSC, National Treasury, DPSA and DPLG.

These should link clearly into the Medium Term Expenditure and Strategic Frameworks and should show how assessments and evaluations will deliver useful information with practical recommendations.

Such transversal systems could include:
• Good governance (OPSC/Presidency)
• Value for money (National Treasury)
• Service delivery (DPLG/OPSC)
• Human resource utilisation (DPSA)
• An early warning system (DPSA/Presidency)


ROLE OF SECTOR DEPARTMENTS

The government-wide M&E system will be operationalised on the understanding that each individual department will take responsibility for its own monitoring and evaluation processes according to the guidelines and standards mentioned above.

• Monitoring is meant to take place at three different levels:
  ‣ Implementation monitoring, evaluation, early warning and data collection – at all three spheres of government, using input, output and outcome indicators
  ‣ Monitoring of national departmental inputs, outputs and outcomes – by the coordinating departments (PCAS, OPSC, National Treasury, DPSA, DPLG)
  ‣ Monitoring of process inputs, outputs and outcomes – by the departments themselves
• Evaluation will also take place at these three levels but will be restricted to process and impact analysis.


ROLE OF PROVINCIAL AND LOCAL GOVERNMENT

• Operationalisation of the framework will comprise provincial and departmental systems and the government-wide supplementary systems listed above, some of which still need to be developed. Work on such development should be considered a priority.
• The government-wide M&E system will be operationalised on the understanding that individual provinces will take responsibility for their own monitoring and evaluation processes according to the guidelines and standards mentioned above.
• The role of Premiers' Offices in driving provincial M&E will also need a special focus. This highlights the need for the offices of the Premiers in all provinces to establish monitoring and evaluation processes and apply them to local government.


Components of Programme Monitoring and Impact Assessment

INPUTS (programme level)
• Resources: personnel, equipment, finance

PROCESSES (programme level): project cycle phases
1. Housing development process: access to land; land availability agreement
2. Planning process: layout; civil engineering design
3. Township establishment process: installation of civil engineering services; construction of units
4. Hand-over process: keys to beneficiaries

OUTPUTS (programme level): deliverables
• Serviced sites; subsidies approved; units completed; units under construction; projects approved; female-headed households; budget and expenditure

OUTCOMES (population level): impact
• Housing access; better lives; % beneficiaries; objectives met

Monitoring covers the programme-level components (inputs, processes, outputs); impact assessment covers the population-level outcomes.


RELEVANT SKILLS AND RESOURCES REQUIRED

• Research
• Statistical/data analysis
• Specialised software to perform modelling and other evaluation techniques
• Research design for evaluation, which may include population surveys, community surveys or forums, focus groups, as well as randomised experiments
• Policy analysis and report writing


Key Questions for Program M&E

What is a programme?
• A nationally organised, often publicly sponsored, effort to deliver socio-economic services to target populations in need
• Organisational systems activated for service delivery
• Indefinite lifetime
• An institutional host that is organic, of known size, adaptive, and operates in a changing environment

Key questions:
• Did the programme achieve its objectives?
• Were the results attributable to programme efforts?
• Which programme activities were more or less important/effective?
• Did the intended beneficiaries benefit from the programme?
• At what cost?


Scope of Program M&E

• What level of programme evaluation: national, subnational, or a specific site?
• Implications for M&E design
• Inference of results
• Relevant time frame?
• Relevant units of action?




Illustration of Program Monitoring

[Figure: a programme outcome indicator plotted against time, from programme start to programme end.]


Illustration of Program Monitoring (continued)

[Figure: the same programme outcome indicator over time, with an "Actual?" trajectory marked between programme start and programme end.]


Illustration of Program Impact

[Figure: change in programme outcome over time, showing a "with program" curve and a "without program" curve between programme start and programme end.]


Illustration of Program Impact (continued)

[Figure: the same "with program" and "without program" curves, with the gap between them at programme end labelled as the programme impact.]
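As an illustrative sketch with invented numbers (not from the presentation), the arithmetic behind the figure reduces to subtracting the estimated "without programme" counterfactual from the observed "with programme" outcome at programme end:

```python
# Programme impact as depicted in the figure: observed outcome minus the
# estimated counterfactual outcome at programme end. Numbers are invented.
outcome_with_programme = 71.8      # observed indicator value at programme end
outcome_without_programme = 66.0   # estimated value had the programme not run

programme_impact = outcome_with_programme - outcome_without_programme
print(f"Estimated programme impact: {programme_impact:.1f} points")
```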


The Role of the Logical/Strategic/Conceptual Framework

• Logical vs strategic vs conceptual
• Clarify the programme objective/strategic outcome/dependent variable
• Interrelate units, levels and directions of action
• Allow for consensus-building around a common paradigm


Example of a Strategic Framework

[Diagram: a strategic objective/priority branching into Objective 1, measured by Indicators 1 and 2, and Objective 2, measured by Indicators 1, 2 and 3.]


Example of a Conceptual Framework for a Structural Model

[Diagram: individual demand and programme supply feed service utilisation, which leads to output of delivery and adequacy of delivery.]


Example of a Conceptual Framework for a Structural Model (continued)

[Diagram: individual demand, programme supply, technical inputs and institutional capacity feed service utilisation and housing delivery, leading to adequate housing and self-sufficiency.]


Monitoring versus Evaluation

Can good monitoring lead to good evaluation?
• Indicators = significant and influential factors
• Framework = theoretically sound model
• Directionality = temporally correct causal flow
• Levels = appropriate hierarchy of units
• Coupling quantitative and qualitative assessment methods
