Design, Monitoring, and Evaluation – Capacity Assessment
• On one occasion, respondent-generated codes had to be altered to fit the codes used by other Country Offices. Likewise, the ordering of some responses had to be modified prior to data entry.
• The sample is too small for some of the analysis; hence only numbers, and not percentages, are presented by country.
• The spread of projects by region / Country Office tends to weight the results toward select regions / Country Offices.
• Sector information for projects is currently unavailable to use as an analysis filter; consequently, all the projects are viewed as a whole.
• The tremendous spread in beneficiaries reached across projects, and the lack of this information for all projects, does not allow the assignment of weights.
• It appears that at times the questionnaire was filled out by individuals rather than as part of a group exercise. (See section 3.6: numbers ranged from one individual to 50 project staff participating in the DME CA.)
• Only definitive response categories were considered, i.e., responses that were an unqualified "Yes". This may tend to present a pessimistic picture by not counting "possibly" or "to some extent".
• Tremendous variation among projects within a country in terms of their Design, Monitoring, and Evaluation makes it difficult to paint a true Country Office-specific picture.
• These were self-assessments by project staff. The answers represent their own perceptions. In some cases, outside experts facilitated DME CA sessions. It is hoped that they helped to provide some objectivity. Nevertheless, all of these results need to be interpreted as "according to those participating in these DME CA exercises".
• It is apparent that CARE staff have a variety of interpretations of some DME terms. One example is "registry" for counting participants/beneficiaries. While many projects say their M&E MIS uses a "registry", it is doubtful that they all keep track of individuals, especially given the size of some projects (with as many as 16.5 million beneficiaries).
• Narrative summaries were available for a sub-sample of the 23 Country Offices.
3. Results<br />
Some of the key results of this analysis were that 84% of projects have a log frame; 20% of projects were based on a full Household Livelihood Security (HLS) assessment; 73% of projects say their final goal is addressed at the household impact level; 45% of projects have some form of evaluation plan; 43% of projects had a quantitative survey as a baseline; and in 65% of projects, staff were measuring or processing outcome data. It is evident that the DME CA has led to the systematization of reflection and varying levels of analysis of the state of DME in the Country Offices.
Summary results were reported in the Executive Summary. Detailed results are divided into the following sub-sections: Concept for Projects (3.1); Diagnosis, Design, Goals, and Indicators (3.2); Monitoring and Evaluation (3.3); Information Processing and Use (3.4); Training and Use of DME Skills (3.5); and DME CA Process (3.6).
DME CA Global Synthesis 3