INASP newsletter 49.pdf

Practising Development
An interview with Clara Richards

Clara Richards is a development professional who has worked for CIPPEC and the Overseas Development Institute. She is the current coordinator of the Evidence-Based Policy in Development Network. Earlier this year, Clara and Vanessa Weyrauch (from CIPPEC) carried out an evaluation of <strong>INASP</strong>'s programme to teach pedagogy skills to those who train African policy makers in the use of evidence. Following this evaluation, Clara took part as a participant in a similar training programme in Kuala Lumpur, this time for trainers of Asian policy makers. To complete the cycle, she recently took on the role of co-facilitator at a further workshop: pedagogy for trainers of policy makers in Latin America. Since she has experienced this approach as an evaluator, a participant and an implementer, we thought it would be interesting to find out Clara's perspective on <strong>INASP</strong>'s approach.

Q1. Thanks very much to Clara for agreeing to take part in this interview. Could you briefly describe the programme that you and Vanessa were asked to evaluate?

The programme we evaluated aimed to build the training abilities of a group of African trainers so that they would be better able to train policy makers on the use of research for policy. It consisted of three phases: the pre-workshop, the workshop and the post-workshop.

In the pre-workshop phase, participants were asked to write a reflective essay focussing on teaching and learning. The participants were then brought together for a five-day workshop to build training skills. Throughout the workshop, the examples and practical activities concerned relevant training topics. Following the workshop, participants were expected to deliver training to policy makers in the skills needed to access and use research evidence and, where possible, to engage in peer mentoring relationships with other participants.

The objective of carrying out an evaluation was to determine the nature and magnitude of the impact of this training programme, in order to help <strong>INASP</strong> and the Institute of Development Studies (IDS) understand whether this was a cost-effective approach for building the capacity of policy makers to access and use research.

Q2. Why did you decide to get involved in carrying out the evaluation? What about it interested you?

What interested me most was the programme's approach. Training policy makers is key to helping them improve skills such as using evidence for policymaking, and to encouraging the development of more informed policies, so trainers should be experts in the topics they teach. However, there is also a great need to engage better with this (often challenging) audience. Although expertise in topics such as information literacy, writing skills and research communication is growing fast, trainers still feel they lack the skills to engage participants and to make their training interesting and appealing. In Africa especially, it was found that the learning approach is usually very "lecturer centred", creating a large gap between trainer and participants which hinders learning.

<strong>INASP</strong> and IDS have introduced a very different approach that bridges this distance and makes training much more effective. Based on constructivist theory, the training builds on the idea that people learn best when they 'co-construct' knowledge. The constructivist trainer facilitates the acquisition of new knowledge by acknowledging the wealth of experience in the room, encouraging participants to reflect on previous experiences and, through questioning, eliciting the gaps in their knowledge. The trainer then co-constructs the 'new' knowledge by filling in those gaps and asking participants to consider how the new theory alters their perception of, or approaches to, their current life experiences.

I think this approach should be used more often in capacity-building activities in general. This evaluation was an excellent opportunity to see how it can be applied and the kind of results it yields.

Q3. What would you say were the main lessons from the evaluation? Are there lessons which are of relevance to others working in this field?

I think the main lesson was to realise that, although immediate results were very positive (in the sense that the participants' reaction to the programme was enthusiastic and they felt they had acquired new skills and improved their training abilities), as evaluators we need to understand that these immediate reactions are not enough to assess long-term change. For example, although some participants said they acquired new skills during the workshop and [as mentioned by trainers] performed very well during the teaching sessions, they performed less well when observed after the workshop. This shows that knowing how to deliver good training may not be enough to effect behaviour change in real-world settings. There is a need to keep supporting participants through diverse channels (for example, an e-platform where they can share progress) and to monitor long-term progress (or regression) in order to address attitudinal and behavioural change.

10 <strong>INASP</strong> Newsletter 49 December 2012
