
Regional Workshop on the UNDP Evaluation Policy Arab States


there should be a mechanism to alert the CO to use that amount. In addition, the time and resources devoted to evaluation should also be factored in when developing the country programme.

• Explore co-financing of outcome and other evaluations undertaken by the CO with government and donors.

• Specific guidance/feedback from the EO on the human and financial resources, as well as the range of timeframes (with options), required to improve the quality of outcome and other evaluations commissioned by the CO.

• Formulate a strategic evaluation plan with the UNCT on UNDAF evaluations and explore additional resources for such evaluations.

2.5 Quality Enhancement and Assurance

Relevant sections of the Evaluation Policy

All evaluations should meet minimum quality standards defined by the Evaluation Office. To ensure that the information generated is accurate and reliable, evaluation design, data collection and analysis should reflect professional standards, with due regard for any special circumstances or limitations reflecting the context of the evaluation. To ensure this, the professionalism of evaluators and their intellectual integrity in applying standard evaluation methods is critical.

The Evaluation Office is responsible for setting evaluation standards, developing and disseminating methodology, and establishing the institutional mechanisms for applying the standards; for assuring the quality of mandatory decentralized evaluations; and for supporting the quality assurance of the evaluations conducted by the associated funds and programmes.

Key issues and implications

Outcome-level evaluation expertise is lacking in the region as a whole, and in practice most outcome evaluations are actually 'output evaluations'. The lack of basic monitoring data and the lack of institutional capacity to retain and provide credible data remain key challenges that make such evaluations more time-consuming and undermine their quality. It is critical that the experience so far is properly documented and shared to feed into the future M&E strategies of UNDP and government. Experience from Egypt, for instance, shows that the majority of unsuccessful projects had "faults in design". It is therefore important to identify measurable objectives and a few good indicators, to ensure that the data to measure them exists, and, if they can be measured, to ask what they contribute to measuring the outcome. Capacity in the region for setting proper indicators is weak.

There is a need to systematically share information and best practices (e.g. the UN Evaluation Group Norms and Standards for Evaluation in the UN System) and to build on lessons learned from previous programmes and the results of evaluations.

Recommended actions

• Support and feedback on quality assurance issues from the EO to COs need to be systematized rather than provided on an ad hoc basis.

• Undertake a situation analysis of existing programmes and projects with respect to quality assurance, including a focus on gender sensitivity, audit reports, etc.

• The time allocated for outcome evaluations in the current guidelines ("Yellowbook") should be clarified.

