Improve learning practices across the community

7. Encourage grantees to develop learning agendas based on how target communities of practice actually operate

"Contributing to learning" can take many forms, and organizations often go about it in ways that are familiar to them. Common outputs include online repositories of case studies, academic white papers shared at 'brown bag lunches' or more formal 'launch' events, and webinars to present findings to an interested group of viewers. These approaches are the norm for most organizations, but they may overlook actors who are influential in their respective operating contexts yet are not engaged in these communities of practice.

To address these challenges, donors can encourage grantees to first pinpoint the sources and channels their target audiences use to get inspiration, design projects, and troubleshoot implementations. Then, grantees can interview audiences to understand what makes these channels appealing: is it the selection of topics, the editorial tone, the frequency of new content, or other factors? From there, they can develop learning products and outreach strategies based on audience preferences, not organizational habit.

8. Set specific indicators and targets for tracking a grantee's contributions to broader community learning and process change

Although it will be difficult to determine correlation and causation between a grantee's activities and wider system impact, setting targets and tracking progress will help grantees prioritize otherwise 'fuzzy' learning activities. While indicators will, in most instances, track outputs rather than outcomes, donors can invest resources in analyzing longitudinal changes in thought and practice among target communities of practice. This measurement exercise can take the form of a process evaluation in which incremental steps are defined and used for reflection at multiple points over time.

Take, for example, an online community of practice. Donors can specify ways of measuring 'community' activity, such as relationships formed and offline engagements spurred between online participants. This will help drive the initiative to tie immediate outputs (e.g. platform and social media creation) to long-term behavior change.

9. Steer learning efforts toward community-wide process change (beyond organizational learning and peer exchange)

Often, learning for organizations is confined to internal parties or to exchange with peer groups who are familiar (at times, only because they are geographically convenient). There are challenges to influencing beyond one's known sphere of influence, including the heavier investment needed to form new relationships and the trade-offs that arise when allocating limited resources. Credibility in the eyes of the wider community of practice requires an ability to communicate how one has changed the status quo, but this takes time and incremental steps.
Donors can help grantees change broader processes in the following ways:

Target support toward learning activities and deliverables in scopes of work. Doing so would be especially important in issue areas where organizations seem to be particularly siloed, where there are few other incentives or channels for knowledge exchange, and/or where lessons from a specific organization would greatly benefit a larger community of practice. In the communities studied, learning efforts are central to their agendas; however, they lack incentives to share concrete learning with the broader community in a significant way. The rewards of doing so are ambiguous, whereas the cost is clear: precious time diverted from activities (e.g. fundraising) with more immediate, direct benefits. Donors have an opportunity to prioritize effective learning by supporting these activities in current and core strategic grantee workplans.

Introduce new ways of directing precision in understanding process change. Quantitative metrics and targets are often used to assess process change. For example, to understand the success of community-building efforts, donors may select indicators relating to the volume of social media engagement, of website visitors, or of case studies downloaded. Qualitative metrics, such as participant responses at a multi-stakeholder workshop, are also used. While these metrics are easy for grantees to report on and for donors to review, they provide limited insight into whether and how different activities are contributing to larger process change.

Donors can encourage greater ambition in communications activities aimed at enabling wider process change. They can start, for example, by assessing whether grantees prioritize audiences because they are an 'easy fit' or because they understand user behaviors and the considerations of greatest impact. A grantee might target its webinar at the global open government community, a seemingly natural audience for its work. However, donors can ask for more granularity in audience selection based on the grantee's larger impact goals. For example, a government ministry that reserves a bi-weekly timeslot to discuss capacity building is more likely to integrate new learning into its workflow. If learning is to drive impact, not just stimulate conversation, this ministry may be the most suitable target audience.

LOOKING FORWARD

The study highlighted the need for governance data communities to better understand the influential actors and political contexts they seek to influence. Doing so may require wider adoption of strategic planning and program design approaches that appropriately account for the complexity of governance ecosystems and processes.

There is interest and appetite among governance data actors to build upon the strong work they are currently doing, and to try and test new design approaches. Mechanisms for doing so include demonstration projects that support organizations in applying user-centered and politically grounded design approaches in product development and implementation.

Implementing such projects can happen through eager, influential multi-stakeholder initiatives that are invested in broader sector learning. This will help ensure that lessons from the demonstration projects are shared among wider communities of practice.