APPENDIX I: M&E METHODOLOGY

Methodology

ANNEXES
Monitoring and evaluation of the 2009 DRGs-MDGs projects and programmes was carried out by a fully independent National Monitoring and Evaluation Team (NMET).
Qualitative data was collected through in-depth interviews and focus group discussions (FGDs) with stakeholders in beneficiary communities. Observations were recorded using a set of templates and benchmark criteria agreed at the inaugural national briefing workshop in Kaduna. Benchmarking measured progress at project sites by apportioning a percentage of the total work to each stage.
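The stage-weighting approach described above can be illustrated with a minimal sketch. The stage names, weights and partial-credit rule here are hypothetical; the report does not publish the NMET benchmark templates themselves.

```python
# Hypothetical illustration of stage-based benchmarking: each work
# stage carries a weight (its share of the whole project), and overall
# progress is the sum of weights for completed stages plus partial
# credit for the stage currently in progress. Stage names and weights
# are illustrative, not taken from the NMET templates.

WORK_STAGES = {          # weight as a fraction of total project value
    "foundation": 0.20,
    "walls": 0.30,
    "roofing": 0.25,
    "finishing": 0.25,
}

def progress_percent(completed, in_progress=None, fraction_done=0.0):
    """Return overall completion as a percentage (0-100)."""
    total = sum(WORK_STAGES[stage] for stage in completed)
    if in_progress is not None:
        # Give partial credit for the stage currently underway.
        total += WORK_STAGES[in_progress] * fraction_done
    return round(total * 100, 1)
```

For example, a site with the foundation and walls complete would be benchmarked at 50% under these illustrative weights.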
State M&E teams (SMETs) assessed DRGs-MDGs projects and programmes in the field. NMET visited states to verify a sample of SMET reports. During these visits, interviews and FGDs were held with beneficiaries, and NMET met with the directors of the state MDGs offices and MDA desk officers. Meetings of both the state and national teams with MDAs were facilitated by OSSAP-MDGs through letters of introduction to state coordinators.
All data and information from field M&E were fed into a web-based portal created by NMET for real-time data feeds from all the teams. The portal provided seamless communication between the state and national teams and allowed basic information for the various sectors to be shared. It also eliminated duplicate records, as all project sites were coded according to budget line items. In addition, the portal generated reports on the status of projects (Abandoned, Completed, Ongoing and Not Started) and on the quality of project execution (Good, Average and Poor). A project was categorised as 'Abandoned' if it had started but the contractor had not been to the site for six months. If site visits showed that work had not begun, the project was categorised as 'Not Started'.
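The four status labels and the six-month abandonment rule can be expressed as a simple decision procedure. This is a sketch only: the field names and the day-count used to encode "six months" are assumptions, as the portal's actual data schema is not documented in this report.

```python
from datetime import date, timedelta

# Hypothetical encoding of the portal's status rules as described in
# the text. The ~six-month threshold (183 days) and the input fields
# are assumptions; the real portal schema is not published here.

ABANDONMENT_THRESHOLD = timedelta(days=183)  # roughly six months

def project_status(started, completed, last_contractor_visit, as_of):
    """Classify a project site into one of the portal's four statuses."""
    if not started:
        return "Not Started"
    if completed:
        return "Completed"
    # Started but not completed: abandoned if the contractor has been
    # absent from the site for six months or more.
    if as_of - last_contractor_visit >= ABANDONMENT_THRESHOLD:
        return "Abandoned"
    return "Ongoing"
```

For instance, a site last visited by the contractor in January and inspected the following December would be classified as Abandoned under this rule.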
Of particular interest to the NMET were the 'Outputs' (level and quality of completion) and the 'Immediate Outcomes' of projects and programmes. Completion referred to the actual physical finishing of a project, as evaluated by M&E consultants, and the quality of execution was assessed against the agreed benchmarks. The 'Immediate Outcome' was a composite evaluation made by CSOs, who assessed the degree to which projects or programmes served their intended purpose, taking into account community participation, relevance to end users, access for end users, branding and sustainability.
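One simple way to picture the composite 'Immediate Outcome' evaluation is as an aggregate over per-criterion ratings. The 1-5 scale and equal weighting below are assumptions for illustration; the report does not state how the CSO assessments were actually combined.

```python
# Hypothetical sketch of a composite 'Immediate Outcome' score: each
# of the five criteria named in the text receives a rating, and the
# composite is their mean. The 1-5 scale and equal weights are
# illustrative assumptions, not the CSOs' documented method.

CRITERIA = (
    "community_participation",
    "relevance_to_end_users",
    "access_for_end_users",
    "branding",
    "sustainability",
)

def immediate_outcome(ratings):
    """Average the per-criterion ratings (1 = poor ... 5 = excellent)."""
    missing = set(CRITERIA) - set(ratings)
    if missing:
        raise ValueError(f"unrated criteria: {sorted(missing)}")
    return sum(ratings[c] for c in CRITERIA) / len(CRITERIA)
```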
National M&E Team Portal

The NMET Portal is the website where data from the state M&E teams was assembled. The portal was developed to streamline the data collected by the state teams so that it could be stored in a centralised databank, with the aim of improving collection methods, data analysis and the integrity of data from the field.