Toward a Metrics Implementation Plan
Based on the UCAR Strategic Plan

Metrics Action Learning Team
UCAR 2006 Leadership Academy

Jothiram Vivekanandan
Mike Turpin
Dave Maddy
Jeff Lazo
Deirdre Garvey
Alex Eschenbaum
Clara Deser
Bob Barron

June 8, 2006


Executive Summary

Metrics are an increasingly important topic at the National Science Foundation (NSF) and among scientific researchers. With the yearly budget for NSF research currently at over five billion dollars, there is keen interest in having a robust method for measuring the products of NSF research. The University Corporation for Atmospheric Research (UCAR) has studied which types of metrics are appropriate for its activities and is in the process of completing a new strategic plan. This report provides the UCAR President's Council with suggestions for developing a UCAR Metrics Implementation Plan tied to the UCAR Strategic Plan.

There are many important considerations for UCAR as it implements metrics. UCAR will need to manage both the positive effects of metrics (e.g., better information on which to base decisions) and the negative effects (e.g., increased overhead associated with collecting and reporting metrics). UCAR will also need to consider the potential impacts of metrics on the personal needs and desires of UCAR staff. UCAR could address these impacts by informing and involving staff in the metrics implementation process as early and as widely as feasible.

Three key pieces of material provide an excellent background on metrics for UCAR. The 2005 UCAR Metrics Committee Report gives a careful treatment of the types of metrics appropriate for use at UCAR. The 2006 draft UCAR Strategic Plan provides the most current source for UCAR's goals. The 2005 NRC report on the appropriate use of metrics for the Climate Change Science Program is a comprehensive look at metrics aimed at activities similar to those of UCAR and provides a useful cross-check on UCAR's work on metrics. Most of the metrics from the UCAR Metrics Survey appear to map readily onto the UCAR strategic goals, but more work may be needed to define UCAR's process and input metrics.

The process for developing the UCAR Metrics Implementation Plan (MIP) might consist of three phases: (1) initial institutional commitment to an MIP and formation of metrics committees, (2) development of the MIP, and (3) ongoing implementation, review, and modification of the MIP. Each of these phases contains several steps and should include frequent outreach and education (O&E) to UCAR staff. Successful completion of this process will require a significant commitment of staff resources over the one year estimated for phases 1 and 2, and an additional ongoing commitment of staff resources will be needed for implementation and review of the MIP after it has been established.

Implementation of metrics at UCAR appears to be feasible. Once the UCAR Strategic Plan is finalized, UCAR will be positioned to create a Metrics Implementation Plan tied to that plan. The first step in this process could be for UCAR to make an institutional commitment to the development of a Metrics Implementation Plan.


1 Introduction

This document's objective is to offer the UCAR President's Council suggestions for developing a Metrics Implementation Plan (MIP) tied to the UCAR Strategic Plan. This document is the outcome of a UCAR 2006 Leadership Academy Action Learning Team project. The Action Learning Team (subsequently referred to as "the Team") chose a topic suggested by Jack Fellows, vice president of Corporate Affairs and director of the UCAR Office of Programs. Fellows' initial charge to the Team states:

    What metrics (process, input, output, outcome, and impact) should we use to monitor progress toward our mission and goals? The National Academy released a report recently entitled "The Appropriate Use of Metrics for the Climate Change Science Program" that explored metrics in our world. UCAR also developed a "survey" of metrics recently. Along with the NA report, it might be quite useful to take that survey the next step and develop a metrics implementation plan that could be part of the proposal (i.e., how we expect to measure progress toward our goals).

Fellows elaborated on this initial charge in a meeting with the Team at the outset of this project. In that meeting, Fellows relayed that Cliff Jacobs (NSF Atmospheric Sciences Division section head) would like to see the UCAR metrics reflect NSF's metrics. Further, Jacobs wanted the UCAR metrics to be tied to the upcoming UCAR*2020 Strategic Plan.

In 2002, an NSF-appointed review panel noted that "UCAR/NCAR/UOP should develop a robust set of metrics (beyond citation analyses) that characterize the quality, effectiveness and efficiency of both science programs and service functions…. A comprehensive set of performance metrics would help UCAR and NSF with strategic planning and aid comparisons with other organizations" ("Measuring the Productivity, Quality, and Impact of UCAR Programs," UCAR 2005).

In addition to appreciative interviews,[1] we referred to three key documents for information. Section 3 describes these documents in more detail.

(1) "UCAR*2020: A Strategic Outlook for the University Corporation for Atmospheric Research" (UCAR 2006, draft), subsequently referred to as the "UCAR Strategic Plan." This document describes the goals of the organization. The metrics implementation plan is intended to measure progress toward these goals.

(2) National Research Council's "Thinking Strategically: The Appropriate Use of Metrics for the Climate Change Science Program," Committee on Metrics for Global Change Research, Climate Research Committee, National Academy of Sciences, subsequently called the "NRC document." This document gives an organization-independent view of metrics and offers suggestions for metrics implementation methods.

[1] "Appreciative [interview] questions are written as affirmative probes into an organization's positive core, in the topic areas selected. They are written to generate stories, to enrich the images and inner dialogue within the organization, and to bring the positive core more fully into focus." http://www.positivechange.org/appreciativeinquiry.html, accessed June 7, 2006.


(3) "Measuring the Productivity, Quality, and Impact of UCAR Programs" (UCAR 2005), subsequently referred to as the "UCAR Metrics Committee Report." This document focuses on determining a robust set of metrics that may be used to measure the organization's quality, effectiveness, and efficiency.

Section 2 of this report summarizes some considerations we identified in undertaking this project; we feel these are important considerations in developing a UCAR MIP. Section 3 summarizes the three primary documents we used in developing the recommended process, and Section 4 outlines a proposed process for developing an MIP.

2 Metrics Considerations

2.1 Pluses and Minuses of Implementing Metrics at UCAR/NCAR/UOP

Pros and cons are associated with implementing metrics at UCAR/NCAR/UOP. On the positive side, metrics implementation would yield information on which the organization's leaders could base sound decisions and changes, and accountability at all organizational levels could increase. In addition, metrics could be used as a tool for implementing and evaluating UCAR's ability to meet its strategic planning goals. Metrics implementation could also result in improved continuity and increased confidence as goals are reached. The Team acknowledged a current societal trend that favors numbers and metrics (e.g., the No Child Left Behind Act).

Conversely, team members identified concerns that staff would resist metrics implementation because of their potential use in performance appraisals and the associated setting of salaries. The metrics might also be seen as an additional burden of bureaucracy and paperwork. In addition, there is the perception that people perform to the metric; as a result, individuality, flexibility, and creativity might diminish. In other words, metrics implementation might cause a loss of the "big picture" and a "dumbing down" of the organization, and long-term research might suffer in a metrics-driven organization. Finally, there are concerns that inappropriate or irrelevant metrics could be included in the MIP.

The Team also engaged in a polarity discussion of "metrics" versus "no metrics." To recap, major benefits of implementing metrics include providing the institution with information on which to base sound decisions and changes and holding people accountable for their work. Benefits of not implementing metrics include increased flexibility and freedom, leading possibly to higher levels of creativity and individuality, and perhaps to unexpected scientific or technical breakthroughs.

During our discussions of the benefits and risks of metrics implementation, we discovered that the psychology of metrics implementation within the organization is an important component of the metrics implementation process and should be taken into consideration in any move forward.

2.2 Human Factors Associated with Implementing Metrics at UCAR/NCAR/UOP

The psychology of metrics implementation must be considered if an MIP is to be implemented successfully within an organization. These psychological or "human" factors, which may be similar to those associated with implementing employee benefits plans within UCAR, include the following:

• Management "buy-in"
UCAR/NCAR/UOP management needs to be strongly committed to the need for and value of implementing metrics within the organization.

• Management communication of purpose and value of metrics to staff
UCAR/NCAR/UOP management needs to be able to articulate the overall purpose and value of metrics and metrics implementation to the entire staff. This can be done, for example, through various types of informational meetings, following the model that the Human Resources (HR) Department used when implementing employee benefits plans. In particular, an initial informational meeting is needed to explain the positive aspects and outcomes of a successful metrics implementation plan and to hear employees' questions and concerns. In follow-up meetings, management should continue to explain the MIP's positive aspects, receive employee feedback, and address questions or concerns. These follow-up meetings are needed to improve the MIP, solidify employee buy-in, and respond to employee resistance.

• Avoidance of direct impacts on performance appraisals
Development of metrics may be perceived by employees as a tool to be used in performance appraisals and annual reviews. Although employee performance may ultimately be best tied to organizational objectives as developed in the UCAR Strategic Plan, the purpose of metrics should be to evaluate the organization's progress in achieving organizational objectives, not individual performance.

• Use of metrics to inform rather than motivate
Metrics should be used to inform the organization of its progress toward the goals of the UCAR Strategic Plan, rather than to motivate it.

• Human factors: Outreach and education (O&E)
Significant ongoing O&E is needed to implement metrics successfully. Again, we can draw on HR's experience in implementing employee benefits policy changes as a useful model for these activities. The model stresses the need for two-way communication between management and staff throughout the implementation process, so that the process is dynamic, self-correcting, and continuously improving. We give specific recommendations for O&E activities relevant to the MIP in Section 4.2.


3 Metrics Background Material

The definition of content for the UCAR metrics should integrate the ideas and content of three pieces of background material: the UCAR Strategic Plan, the NRC document, and the UCAR Metrics Committee Report. In the sections that follow, we briefly discuss each of these important pieces of background material.

3.1 UCAR*2020: A Strategic Outlook for the University Corporation for Atmospheric Research (1 May 2006 Draft)

With the support of NSF, universities created UCAR for the purpose of developing and managing NCAR. UCAR's core mission is to engage and represent the universities and the broader atmospheric and geoscience community in research and education. UCAR achieves its mission through many programs, facilities, and services. Since its inception in 1960, UCAR has increased its membership by a factor of five. Even though the core mission of UCAR has remained the same, its scope of activities has expanded by an order of magnitude.

The UCAR Strategic Plan (UCAR*2020) that is currently being developed describes UCAR's seven primary goals.[2] These goals support its primary mission (research), along with the development of observational facilities, information technology, leadership and management, O&E, and technology transfer. Each goal is stated briefly below.

UCAR Strategic Goals

Goal 1: Enable, facilitate and conduct a broad, high quality research program in the atmospheric and related sciences to address present and future needs of society.

Goal 2: Invent, develop and employ increasingly capable observing systems that will fuel discovery and understanding.

Goal 3: Further the understanding and prediction of earth and Sun systems by advancing and sustaining information technology and data services.

Goal 4: Cultivate an environment of excellence in leadership and management, where science and science support thrive.

Goal 5: Engage people with global and diverse perspectives in the science of the systems that support life on the earth.

Goal 6: Extend the educational capabilities of the academic community and society using the people, ideas, and tools of our national center.

Goal 7: Transfer of knowledge and innovative technologies to the public, government and private sectors for the benefit of the society.

[2] At the time of this report we understand that the UCAR Strategic Plan has seven goals but that this may eventually be revised to identify eight goals. If this occurs, the rest of this document would be revised to reflect that change.


3.2 National Research Council's "Thinking Strategically: The Appropriate Use of Metrics for the Climate Change Science Program"

At the request of the U.S. Climate Change Science Program (CCSP), the National Research Council (NRC) established a committee (the Committee on Metrics for Global Change Research, under the Climate Research Committee of the National Academy of Sciences) to develop quantitative metrics and performance measures for documenting progress and evaluating performance of global change and climate change research. The resulting NRC report lays out the framework for creating and implementing metrics for the CCSP and outlines the general principles for developing metrics that strongly advance discovery and promote continuous improvement in the program.

Drawing from the NRC report, the five types of metrics, all of which should be included to result in a robust and complete metric set, are:

Process Metrics: Measure a course of action taken to achieve a goal.
Input Metrics: Measure tangible quantities put into a process to achieve a goal.
Output Metrics: Measure the products and services delivered.
Outcome Metrics: Measure results that stem from use of the outputs and influence stakeholders outside the program.
Impact Metrics: Measure the long-term societal, economic, or environmental consequences of an outcome.

A bad metric, which may stifle growth, overburden employees, and cause organizational stress, obviously should be avoided. A good set of metrics should assess process as well as progress. In particular, the implementation and success of a metric set depend on three factors: (1) strong leadership to guide the program and apply metrics, (2) a strategic plan against which metrics are applied, and (3) the ability to guide strategic planning and foster growth in the organization. Furthermore, the metrics should be easily understood and broadly accepted by stakeholders. It is important to choose metrics and maintain a vibrant strategic plan so that both the metrics and the plan evolve in pace with scientific progress and program objectives.

3.3 UCAR Metrics Committee Report

In December 2002, a review panel appointed by NSF to evaluate UCAR's proposal for the management of NCAR recommended that "UCAR/NCAR should develop a robust set of metrics (beyond citation analyses) that characterize the quality, effectiveness and efficiency of both science programs and service functions." UCAR management agreed with this recommendation and subsequently appointed the UCAR Metrics Committee to conduct a study and evaluation of possible indicators that would meet the challenge posed by the review panel. The "UCAR Metrics Committee Report" is the result of the committee's work.


Table 3 of the UCAR Metrics Committee Report, reproduced in part below, shows general metrics that apply to more than one functional category of UCAR activities and some key metrics used in the ten functional categories of UCAR activities. All 11 major metrics are listed in the UCAR Metrics Committee Report's Table 3.

Table 3: Metrics to characterize the productivity, quality, impact, and efficiency of UCAR programs*

General: apply to more than one category of UCAR activities

Key metrics used in ten categories of UCAR activities:
1.0 Science
2.0 Scientific visitors and postdoctoral fellows
3.0 Computing facilities
4.0 Community models
5.0 Observing facilities
6.0 Data services
7.0 Intellectual property and technology transfer
8.0 Administrative support
9.0 Education, outreach, and training
10.0 Communications and advocacy

*Taken directly (as Table 3) from "Measuring the Productivity, Quality, and Impact of UCAR Programs" (UCAR 2005).

The UCAR Metrics Committee recommended that both metrics and independent, knowledgeable peer reviews be used to evaluate UCAR's process, as well as its progress toward its goals.

3.4 Notes on the UCAR Metrics Committee Report

When UCAR's Metrics Committee members wrote the UCAR Metrics Committee Report, they did so without the benefit of the UCAR*2020 Strategic Plan, which had not yet been written. As a result, the UCAR Metrics Committee Report and the UCAR Strategic Plan are two independent documents that stand alone.

As discussed in Section 3.2, the NRC document strongly recommends including five types of metrics in a robust set: (1) process metrics, (2) input metrics, (3) output metrics, (4) outcome metrics, and (5) impact metrics. The Team performed an exercise to classify the metrics identified in the UCAR Metrics Committee Report into the five types of metrics. These assignments are shown in Appendix A.

The UCAR Metrics Committee Report determined which metrics should be applied to UCAR's ten functional categories. We matched those metric categories to the UCAR Strategic Plan's seven strategic goals. Multiple metrics could apply to multiple goals; we handled this challenge by selecting a "primary" metric category, or categories, to assign to each goal. This approach reduces the complexity of measuring progress toward a specific goal.

Appendix B, Table B-1 matches functional metric categories to UCAR strategic goals. The individual metrics of each category are then listed showing the item to measure, the measuring units, and the focus, or importance, of the measure. The focus of a measure indicates whether the metric should be used for measuring UCAR's production levels, quality of work, or impact on external entities or events. Some metrics have more than one focus.

The current list of UCAR metrics can be grouped into three of the five categories: Output Metrics, Outcome Metrics, and Impact Metrics.
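The Team's classification exercise lends itself to a simple tabulation. The following minimal Python sketch is our illustration only (the metric names and type assignments shown are examples rather than the Team's full mapping, which appears in Appendix A); it shows how tallying metrics by NRC type makes the missing types visible:

    from collections import Counter
    from enum import Enum

    class MetricType(Enum):
        """The five NRC metric types (see Section 3.2)."""
        PROCESS = "process"   # course of action taken to achieve a goal
        INPUT = "input"       # tangible quantities put into a process
        OUTPUT = "output"     # products and services delivered
        OUTCOME = "outcome"   # results stemming from use of the outputs
        IMPACT = "impact"     # long-term societal, economic, or environmental consequences

    # Illustrative assignments only; Appendix A holds the Team's actual mapping.
    ucar_metrics = {
        "Peer-reviewed publications": MetricType.OUTPUT,
        "Citation rates": MetricType.OUTCOME,
        "Inclusion in major scientific assessments (e.g., IPCC)": MetricType.IMPACT,
        "Number of community-model users": MetricType.OUTCOME,
        "Revenue generated by licensing activities": MetricType.OUTPUT,
    }

    # Tallying by type exposes the gaps: no process or input metrics appear.
    tally = Counter(ucar_metrics.values())
    for metric_type in MetricType:
        print(f"{metric_type.value:>8}: {tally[metric_type]} metric(s)")

Run over the complete Appendix A mapping rather than this sample, the same tabulation would also quantify how heavily the current set leans on Output Metrics, the point developed in the next paragraphs.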


Strong leadership in UCAR management, and consequently in the organization's strategic plans, partially fulfills the requirements of the Process Metrics. Additional Process Metrics for setting priorities and allocating resources among elements of the program need to be created.

It is interesting to note that the proposed UCAR metrics are dominated by Output Metrics, which measure products and services.

Input Metrics, which measure tangible quantities needed to achieve a goal, are not considered in the current list. Input Metrics include resources to implement and sustain research, human capital, measurement systems, prediction models, data services, and infrastructure. This may be an inherent weakness in the UCAR Metrics Committee Report: the results captured by Output Metrics (good, bad, or ugly) cannot be explained if there is no way of knowing whether enough resources were devoted to achieving a desired goal. The lack of Input Metrics thus limits the ability to interpret the Output Metrics.

We can see, then, that the current list of UCAR metrics needs to be augmented to fill the gaps in Process and Input Metrics.

4 Outline of Metrics Implementation Plan Process for UCAR

4.1 Process Introduction

In this section, we outline a potential process for UCAR to follow in developing an MIP. We developed this process based on our experience with and knowledge of UCAR/NCAR/UOP and our vision of what the MIP should include at the end of the process. We offer it as a foundation on which to build a final MIP development process that can be altered, supplemented, and revised as needed based on the organization's objectives and goals and the resources available. We are as explicit as possible in stating the assumptions and decisions that need to be made in developing such a process.

The NRC document[3] states that "A good strategic plan must precede the development of metrics." Our process assumes the UCAR Strategic Plan, currently under development, as its foundation. Integration with the completion and adoption of the UCAR Strategic Plan is a prerequisite for final deployment of the MIP.

In suggesting this process, we thus view the MIP as an integral component of overall UCAR strategic planning and of UCAR/NCAR/UOP management focus. We feel that these metrics will become a crucial tool in management, program evaluation, and budget decision making, as well as in institutional culture.

As we developed this process, we assumed a "top-down" approach. By this we mean that UCAR management will strongly support this effort, adopt an MIP development process, and entrain other units (NCAR, UOP) into this commitment. If this effort were to involve only development of metrics for UCAR, the approach and effort would likely differ significantly from what we suggest here. We also suspect that, if NCAR and UOP are not fully on board with this effort, the exercise will fail to meet stakeholder requirements (e.g., NSF's desire to see metrics indicative of the return on its investment).

[3] NRC document, Executive Summary, p. 3.


Although we assume a top-down approach, this effort must empower all levels of UCAR/NCAR/UOP to develop and design the MIP, ensuring that the final product becomes a part of institutional culture. The full potential of integrating strategic planning and metrics cannot be realized without a truly "bottom-up" understanding of and involvement in the process.

We also present rough estimates of the resources and time needed to complete this process. These are truly only ballpark estimates that may well be off by an order of magnitude. Although these estimates have not been developed in significant detail, we offer them to encourage that any decisions made in designing and adopting a process be made in full awareness of the resources necessary to ensure success. Allotting inadequate resources to this effort may lead to failure to achieve the necessary or desired results.

4.2 Process Outline

In this section we outline a process by which UCAR could develop and implement strong performance metrics, and we present a sequential flow chart of these steps.

An important cross-cutting aspect of the MIP development process should be an ongoing, focused effort to involve the stakeholder communities, including UCAR/NCAR/UOP employees. These stakeholders must be engaged: it is critical to elicit their input and guidance in MIP development, ensure their buy-in to the process and the product, and maintain transparency and enhance trust in this effort. Various approaches and activities could be used: (1) stakeholder meetings such as presentations, focus groups, and working groups; (2) outreach efforts such as informational e-mails, newsletters or articles in existing communication channels, and Web sites detailing the process and status of ongoing efforts; and (3) educational efforts such as training, classes, or workshops on metrics and their implementation, measurement, and use. To ensure that this aspect of the process takes place, we recommend appointing a specific individual or group to oversee communications as part of the Metrics Implementation Committee (MIC) described later. Although we have not identified specific steps for outreach and education in the process described here, we have indicated along the phases of the process where such efforts may fit well.

We also encourage the use of innovative techniques for undertaking the various tasks in the MIP development process. Rather than relying on a standard set of working groups (based largely on extensive meetings), we encourage unconventional approaches to this work. We also suggest considering external resources to augment this process, including experts in meeting facilitation, strategic planning, or metrics development and measurement. Although there may be a feeling that UCAR/NCAR/UOP personnel are capable of completing this process, there are likely experts with experience in implementing such efforts who could enhance and expedite the work and help avoid potential roadblocks.

Figure 1 is a flow chart for the proposed MIP development process, showing the steps that are described in the sections that follow. Because this will ultimately be a dynamic and iterative process, we recognize that certain opportunities for nonlinear approaches are not adequately captured in such a diagram. We have attempted to indicate some specific points where iteration would be most useful or necessary.


Step 1: Commitment by UCAR/NCAR/UOP Management
Step 2: Form Metrics Implementation Committee (MIC)
    (Outreach and Education 1)
Step 3: Creation of Metrics Development Teams (MDTs)
Step 4: Workshop on UCAR Metrics
Step 5: MDTs Create Metrics Implementation Plan (MIP)
    (process iteration point 1)
Step 6: MIP Internal Review
    (Outreach and Education 2; process iteration point 2)
Step 7: Adoption of MIP by MIC
Step 8: MIP External Review
Step 9: Revision and Completion of MIP
Step 10: Implementation of MIP
    (Outreach and Education 3; process iteration point 3)
Step 11: Review of Results and Metrics

Figure 1: Proposed MIP development process


• MDT members should be knowledgeable about UCAR activities relevant to their particular goal and related functional categories.

• MDT members should be committed to developing strong performance metrics for UCAR within their sphere of expertise.

• MDT members should be capable of identifying and overseeing the work of broad teams of stakeholders as these teams review and comment on the MDT's suggested metrics.

Although we are unable to comment on the specific composition of the MDTs, we do advise that the MIC focus on providing each team with the broadest possible representation from the UCAR functional categories relevant to each team's Strategic Plan goal. To be effective in developing UCAR-wide metrics, many of the teams will require representation from multiple organizational entities within UCAR (laboratories, institutes, divisions, etc.) and will be required to analyze the activities of multiple organizational entities.

Step 4: Workshop on UCAR Metrics Development

Once the MIC and MDTs have been created, but before work begins, we suggest holding a UCAR Metrics Workshop. This workshop would be designed to present the overall vision and mission of the MIP development process to all involved and interested parties. This could include appropriate statements of institutional support for this effort; clarification of the roles of the various teams (MIC, MDTs); and statements about and open discussions of the goals, objectives, process, timelines, expected outcomes, rights, and responsibilities of the various actors in the process. This workshop could also benefit from strong involvement of stakeholders as well as guidance or input from external and internal experts on the theories and methods of metrics. It would also benefit from an overview of the relevant strategic plans.

Step 5: UCAR MDT Activities

The activities of each MDT will include the following (in roughly sequential order):

• Each MDT will review and analyze the activities of the UCAR functional categories directly related to its particular strategic plan goal. (See Appendix B for a mapping of the ten functional categories to the seven UCAR Strategic Plan goals.)

• Each MDT will review and analyze the available literature on metrics development in complex, interdisciplinary research environments, with a particular focus on the UCAR Metrics Committee Report and the NRC document.

• Based on the guidelines and best practices found in these documents, and on their own analysis and judgment, each MDT will develop a draft set of metrics for activities relevant to its particular goal. We advise the MDTs to pay particular attention to the five "General Metrics for the CCSP,"[6] which are Process, Input, Output, Outcome, and Impact Metrics.

• After developing the draft metrics, each MDT will oversee a formal review of the draft metrics by other stakeholders familiar with activities relevant to its strategic plan goal. The review committee(s) for each MDT should be directly charged with watching for the pitfalls outlined under "caveats" in the UCAR Metrics Committee Report and under "Characteristics of Good Metrics" in the NRC document. We advise the MDTs to consider including stakeholders from outside UCAR in this process, possibly including representatives from UCAR member universities or other entities closely affiliated with UCAR.

• Once the stakeholders have reviewed the draft metrics, each MDT will submit them to the MIC for review and comment. Included with this submission should be the intended audience for each draft metric and preliminary information on the difficulty and cost, among other factors, of computing each metric.

• Once the draft metrics are reviewed and accepted by the MIC, the MDTs should complete a first-pass computation of all approved metrics and submit the results to the MIC with detailed plans for implementing each metric for the long term. (Note that computation of some metrics may require resources beyond those available to the MDTs. In this case the MIC will allocate the resources necessary to complete the computation.) Included with this submission should be information on the frequency with which the metric should be calculated, estimated resources and costs of maintaining the metric over the long term, and insights into the ways in which the metric may have to change in the future to remain useful. (A sketch of such a submission record appears after Step 7 below.)

Step 6: MIP Internal Review

Once each of the MDTs completes its work, the MIC will compile the seven MDT products into a coordinated and integrated MIP. The MDTs may need to revise their products based on input from the MIC.

Step 7: UCAR/NCAR/UOP Management Adoption of MIP

Once the MIC has completed the integrated MIP, the committee will submit the document to UCAR/NCAR/UOP management for review and comment. UCAR management may provide feedback requiring substantial revision and thus some iteration of prior steps in the MIP development process. Once the MIP has been completed to the satisfaction of UCAR management, it will be accepted as an internal document and sent for more formal external review.

[6] NRC document, Executive Summary, pages 6–7.
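The per-metric submissions described in Step 5 amount to a structured record for each draft metric. The following Python sketch is our illustration of what such a record might capture; the field names are our own and are not prescribed by any of the source documents:

    from dataclasses import dataclass

    @dataclass
    class MetricSubmission:
        """One draft metric as an MDT might submit it to the MIC (Step 5)."""
        name: str                   # e.g., "Number of community-model users"
        strategic_goal: int         # UCAR Strategic Plan goal the metric tracks (1-7)
        nrc_type: str               # process | input | output | outcome | impact
        intended_audience: str      # who the metric is meant to inform
        difficulty_and_cost: str    # preliminary estimate of computing the metric
        frequency: str              # how often the metric should be calculated
        long_term_plan: str = ""    # maintenance costs and how the metric may evolve

    example = MetricSubmission(
        name="Number of community-model users",
        strategic_goal=3,
        nrc_type="outcome",
        intended_audience="UCAR management and NSF",
        difficulty_and_cost="low; derived from existing download and registration records",
        frequency="annual",
    )

A uniform record along these lines would also simplify the MIC's compilation task in Step 6.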


Step 8: MIP External Review

An external group of experts and stakeholders should thoroughly review and comment on the compiled MIP to evaluate the transparency of the metrics process and to ensure that it meets the needs and expectations of stakeholders. Feedback from this external review will be directed to UCAR/NCAR/UOP management and the MIC.

Step 9: MIP Revision and Completion

Based on the external review results, the MIC will develop a response plan; if significant revision is required, additional internal and external review may be appropriate. UCAR management would then accept the final MIP as formal UCAR policy.

Step 10: MIP Implementation

All relevant units and entities within the organization would then implement the final MIP.

Step 11: Ongoing Development and Revision

As with any complex and dynamic process, this plan would be regularly reviewed and revised to meet the objectives of the process and to respond to new needs and opportunities. We suggest that a formal, long-term MIC undertake and oversee this process and ensure the appropriate integration of this effort with other UCAR/NCAR/UOP activities, including strategic planning.

4.3 Estimate of Required Resources

In the NRC document, the NRC Committee on Metrics for Global Change Research quite clearly states that developing and implementing strong metrics requires allocating significant resources. The size, scope, complexity, and interdisciplinary nature of UCAR and its activities will make developing and implementing UCAR-wide performance metrics a resource-intensive undertaking. We cannot estimate the cost of this endeavor with any accuracy, but we underscore the importance of sufficient resources to see this process through to a successful conclusion. To this end we offer the following brief analysis of required resources.

• UCAR/NCAR/UOP Management Resources (MIC)
The MIC is likely to include high-level UCAR management, and its duties will require a significant time commitment. If the MIC includes representatives from outside UCAR, compensation for travel and possibly other costs may be required.


• UCAR/NCAR/UOP Management/Staff Resources (MDTs)
The seven MDTs will do much of the "heavy lifting" of the process. As with the MIC, all MDT members will be required to make a significant time commitment, at significant cost to the organization. To ensure commitment to this process, MDT (and MIC) members must be adequately compensated or have duties otherwise offset. In addition, they must be assured of professional recognition and support equivalent to what they will lose by not undertaking their usual professional duties.

• Administrative Resources
We advise the MIC to consider hiring or allocating dedicated administrative support for the metrics development mission. This would involve at least one FTE for the duration of the project, estimated below to be approximately one to two years.

• Stakeholder Resources
The stakeholder review panels will contribute significant time to reviewing draft metrics. If any review panels include representatives from outside UCAR, travel costs may be incurred.

4.4 Timeline Estimates

Though it is beyond the scope and ability of the Team to develop any firm schedules or timelines for this process, we offer the following general analysis and rough timeline for review.

• Completion of the UCAR Strategic Plan
Because a complete Strategic Plan is a prerequisite for the Metrics Implementation Plan, the Strategic Plan must be completed before metrics can be formally adopted and implemented. However, both the MIC and the MDTs could bring useful insight to the final stages of the development of the UCAR Strategic Plan, and could complete productive advance work before the completion of that plan. The UCAR Strategic Plan is in review and is scheduled for publication in January 2007.

• MIC Appointment
The MIC should be selected immediately. The NSF directive calling for strong metrics at UCAR suggests that this process should begin in earnest as soon as possible, and appointing the MIC is the necessary first step. The MIC could form the MDTs before the Strategic Plan is completed. We estimate that MIC selection will take approximately one month.

• MDT Selection and Appointment
The MIC should select the MDTs as soon as possible after it is appointed. By reviewing the UCAR Strategic Plan goals from a metrics perspective, the MDTs could help ensure that each goal is compatible with strong metrics and measures from the outset. This could require adding a preliminary task to the MDT charge: monitor and provide feedback on each Strategic Plan goal from a metrics perspective. A thorough review of the available "best practices" literature on metrics, including the documents cited in this report, will take some time and can begin before the UCAR Strategic Plan is completed. We estimate that MDT selection will take approximately one month.

• MDT Metrics Development
As noted above, developing strong metrics for each Strategic Plan goal will take a significant amount of time, effort, and resources. Without knowing what resources will be allocated to this endeavor, we are unable to estimate the required time with high confidence. Note that adjusting any one of the three essential components of a project (time, effort, or resources) will necessarily force changes in the other two. If necessary, the MIC could shorten the required time for the process by increasing the allocated resources or by limiting the scope to fewer or less complicated metrics. Assuming a relatively high level of resource availability, we estimate that this process will take six months.

• Internal and External Review, MIP Revision, O&E
Once the MDTs have completed the first draft metrics, the internal and external review and revision processes will take some time. We expect the O&E efforts, both internally with staff and externally with various UCAR partners, to proceed in parallel with the other process steps. However, we suggest including two specific O&E steps, as noted in the flow chart. We estimate that review, revision, and outreach and education will take four months.

• Total Duration of MIP Process
We estimate that completing the process outlined in this document, to the point of computing and publishing the first complete set of metrics, will require approximately one year. Some organizational, planning, and preparatory work can begin before the UCAR Strategic Plan is completed, but many of the steps require a completed plan. Figure 2 shows a preliminary overall project schedule.


Figure 2. Proposed MIP process timeline
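The phase estimates from Section 4.4 can be tallied into this overall schedule. A minimal Python sketch, under the simplifying assumption that the phases run strictly in sequence (in practice, O&E and some preparatory work proceed in parallel):

    # Rough MIP schedule from the Section 4.4 estimates; durations in months.
    phases = [
        ("MIC selection", 1),
        ("MDT selection and appointment", 1),
        ("MDT metrics development", 6),
        ("Internal and external review, MIP revision, O&E", 4),
    ]

    start = 0
    for name, months in phases:
        print(f"months {start:2d}-{start + months:2d}: {name}")
        start += months
    print(f"total: {start} months (approximately one year)")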


Appendix A: Mapping of UCAR Metrics to NRC Metric Types

Table A-1. Metrics to Characterize the Productivity, Quality, Impact, and Efficiency of UCAR Programs

General: apply to more than one category of UCAR activities
1. External awards (e.g., AMS, AGU, AAAS)
2. Number and dollar value of proposals written and funded
3. Number of peer-reviewed papers and conference presentations
4. Invitations
   a. To chair or serve on external committees, boards, etc.
   b. To present invited papers or talks
5. Leadership and participation in community activities such as NRC committees, IPCC, WMO, AMS activities, review committees, etc., especially when they are by invitation only
6. Election to prestigious positions (such as president of professional societies)
7. Number of external collaborators and coauthored papers
8. Number of workshops, tutorials, and seminars
9. Degree of satisfaction of users (e.g., through surveys)
10. Administrative overhead for programs and facilities

Key metrics used in ten categories of UCAR activities:

1.0 Science
1. Peer-reviewed publications
2. Citation rates
3. Reference to or inclusion in major scientific assessments (e.g., IPCC)
4. Coauthored papers
5. Number of prestigious awards or positions such as NAE and NAS memberships
6. Quality of letters of recommendation for promotion
7. Specific examples of outstanding discoveries or contributions

2.0 Scientific visitors and postdoctoral fellows
1. Number of visitors and visitor-days per year
2. Number of collaborations with UCAR scientists
3. Amount of visitor funding
4. Feedback from visitors and postdoctoral fellows

3.0 Computing facilities
1. Power and quality of hardware and software (e.g., computational power [computing cycles], visualization platforms, data archival and delivery systems, and size of data set archived and used)
2. Number of software and other applications developed and used by community
3. Number of users

4.0 Community models
1. Number of users (individuals and institutions) of models or output of models
2. Number of workshops and tutorials
3. Number of responses to enquiries for technical or other help
4. Computation efficiency, usability on different platforms


5.0 Observing facilities
1. Number of users of instruments and platforms
2. Number of field projects supported
3. Number of days a facility is used per year
4. Number of new facilities developed
5. Number of requests for use
6. PI assessments

6.0 Data services
1. Number of users (individual and institutional)
2. Data volumes archived and accessed by users or delivered in real time
3. Number of data services and data software packages used
4. Reliability (e.g., up time) of data services

7.0 Intellectual property and technology transfer
1. Number of patents and copyrights awarded
2. Number of licenses granted
3. Number of products or systems developed by UCAR and used operationally
4. Revenue generated by commercialization and licensing activities

8.0 Administrative support
1. Number of financial reports prepared, paychecks processed, equipment tracked, travel authorizations prepared, contracts and sponsored agreements supported, purchase orders processed, computer and Web applications produced
2. Number of high-quality applicants for positions and ability to "hire the best"
3. Number of reportable accidents, days of sick leave, lost work days
4. Turnover rate of employees, promotion rate, job mobility
5. Mentoring and training opportunities
6. Demographics, salaries and benefits, ethnic and gender diversity of staff compared to peer organizations
7. Number of workshops and events supported
8. Investment performance, number and dollar amount of bond agreements, success in obtaining favorable bond ratings
9. Indirect costs and indirect cost rate

9.0 Education, outreach, and training
1. Number of people directly served (e.g., SOARS protégés, ASP postdoctoral fellows, graduate assistants, weather service forecasters, teachers trained, workshop and residence course participants, exhibit visitors)
2. Number of people accessing distance learning materials and Web sites
3. Surveys of participants in courses and workshops, visitors to Web sites, and stakeholders such as federal agencies, educators, students, and the public
4. Number of holdings in library or other collections (e.g., in digital libraries or other educational resource centers)
5. Educational materials produced for use in classrooms (e.g., COMET materials)
6. Summative evaluations

10.0 Communications and advocacy
1. Number of Capitol Hill briefings and visits to UCAR by congressional or administration delegations, Action Alerts, Washington Updates, testimonies
2. Number and length of publications, press releases, other documents, and Web pages
3. Use of materials by others (e.g., hits on Web sites, faculty responses to Action Alerts, requests for information from Congress, press clippings covering UCAR work and/or quoting UCAR staff, readership and circulation statistics)
4. Specific actions that occur as a result of advocacy efforts (e.g., funding for programs, incorporation of UCAR-suggested language)
5. Feedback from universities, agency people, knowledgeable people in Washington


Appendix B: Mapping of UCAR Metrics to UCAR Strategic Plan Goals

Metrics are used to measure progress toward goals. The UCAR Metrics Committee Report determined what metrics apply to UCAR's ten functional categories. We matched those metric categories to the UCAR Strategic Plan's seven strategic goals. Multiple metrics could apply to multiple goals; we handled this challenge by selecting a "primary" metric category, or categories, to assign to each goal. This approach reduces the complexity of measuring progress toward a specific goal.

Table B-1 matches functional metric categories to UCAR strategic goals. The individual metrics of each category are then listed showing the item to measure, the measuring units, and the focus, or importance, of the measure. The focus of a measure indicates whether the metric should be used for measuring UCAR's production levels, quality of work, or impact on external entities or events. Some metrics have more than one focus.
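Under the "primary category" rule, the many-to-many relationship between metrics and goals reduces to a simple lookup from each goal to its primary metric categories. A minimal Python sketch of the Table B-1 assignments (our illustration; the data follow the table):

    # Primary metric categories assigned to each UCAR Strategic Plan goal,
    # as matched in Table B-1.
    primary_categories = {
        1: ["Science"],
        2: ["Observing Facilities"],
        3: ["Computing Facilities", "Community Models", "Data Services"],
        4: ["Administrative Support", "Communications & Advocacy"],
        5: ["Education, Outreach, and Training"],
        6: ["Visitors and Postdoctoral Fellows"],
        7: ["Intellectual Property and Technology Transfer"],
    }

    def categories_for_goal(goal: int) -> list[str]:
        """Return the primary metric categories used to track progress on one goal."""
        return primary_categories[goal]

    print(categories_for_goal(3))  # ['Computing Facilities', 'Community Models', 'Data Services']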


Table B-1. How UCAR Metrics Map to UCAR Strategic Plan Goals

Each metric is listed as: item | unit | focus of measure. Focus marks (X) fall under the Output, Outcome, and Impact columns of the table.

Goal 1 (Science): Enable, facilitate and conduct a broad, high quality research program in the atmospheric and related sciences to address present and future needs of society.
Primary metric category: Science
  Peer-reviewed original scientific publications | # / period & evaluation | X X
  Citation rates | # / period | X X
  Coauthored papers | # / period | X
  Reference to or inclusion in major scientific assessments | # in each assessment / period | X
  Number of prestigious awards or positions such as NAE and NAS memberships | # / period | X X
  Specific examples of outstanding discoveries or contributions | # / period | X
  Proposals: number submitted | # / period | X
  Proposals: success rate | % / period | X
  Proposals: resources generated | $ / period | X X
  Number of scientists | #, (metric) / scientist | X X
  Amount of funding & fiscal management | Total $, $ / scientist, (metric) / $ | X X
  Scientist assessments | evaluation | X X

Goal 2 (Observing Facilities and Systems): Invent, develop and employ increasingly capable observing systems that will fuel discovery and understanding.
Primary metric category: Observing Facilities (two areas: (i) deployment, (ii) development)
  (i) Number of field projects supported | # / period | X
  (i) Number of days a facility is used | # / period | X
  (ii) Number of new facilities developed | # / period | X
  (ii) Number of patents | # / period | X
  (ii) Number of non-NSF and joint developments | # / period | X
  Number of requests for use or development | # / period | X
  Number of papers, conferences, and citations based on deployments or developments | # / period | X X
  Number of users, total and of each item | # / period, (metric) / user | X X
  Amount of funding & fiscal management | Total $, $ / user, (metric) / $ | X X
  User (PI) assessments | evaluation | X X

Goal 3 (Information Technology): Further the understanding and prediction of earth and Sun systems by advancing and sustaining information technology and data services.
Primary metric categories: Computing Facilities (two areas: (i) computational resource deliverables, (ii) software development), Community Models, and Data Services (for the community)

Computing Facilities:
  (i) High-end computing capability and performance | # of systems, cycles / period, wait times, Top 500 ranking, comparison with Moore's Law | X
  (i) Real-time or post-production data analysis and visualization | # of platforms available, capability growth | X
  (i) High-performance, high-availability data archival, access, and delivery | # of systems, capacity (storage & transport), used (storage & transport), utilization (%), access times, comparison with peer centers | X X
  (ii) Number of software and other applications developed and used by community | # developed / period, # downloaded / period, # application areas (i.e., sponsors) | X X
  Number of users, total and of each item | # / period, (metric) / user | X X
  Amount of funding & fiscal management | Total $, $ / user, (metric) / $ | X X
  User assessments | evaluation | X X

Community Models:
  Number of widely-used community models | # | X
  Number of workshops and tutorials | # / period | X
  Number of responses to inquiries for technical or other help | # / period | X X X
  Computational efficiency, accuracy, usability on different platforms | X
  Number of papers, conferences, and citations based on models | # / period | X X
  Number of users, total and of each item | # / period, (metric) / user | X X
  Amount of funding & fiscal management | Total $, $ / user, (metric) / $ | X X
  Community user assessments | evaluation | X X

Data Services (for the community):
  Volume of data available to community | # bytes | X
  Amount of data distributed to or accessed by the community | # bytes | X
  Number of data services and software packages used | # / period | X X
  Reliability (e.g., uptime) of data services | %
  Number of external collaborations | # / period | X
  Number of users, total and of each item | # / period, (metric) / user | X X
  Amount of funding & fiscal management | Total $, $ / user, (metric) / $ | X X
  Community user assessments | evaluation | X X

Goal 4 (Organizational Excellence): Cultivate an environment of excellence in leadership and management, where science and science support thrive.
Primary metric categories: Administrative Support and Communications & Advocacy

Administrative Support:
  Number of financial reports prepared, paychecks processed, equipment tracked, travel authorizations prepared, contracts & sponsored agreements supported, purchase orders processed, computer and web applications produced | # / period | X
  Number of high-quality applicants for positions and ability to hire the best | # / period | X X
  Number of reportable accidents, days of sick leave, lost work days | # / period | X
  Turnover rate of employees, promotion rate, job mobility | # / period | X
  Mentoring and training opportunities | # / period | X
  Demographics, salaries and benefits, ethnic & gender diversity compared to peer organizations | # / period | X
  Number of workshops and events supported | # / period | X
  Investment performance, number and dollar amount of bond agreements, success in obtaining favorable bond ratings | # / period | X
  Indirect costs and indirect cost rate | % | X X
  Number of customers, total and of each item | # / period, (metric) / customer | X X
  Amount of funding & fiscal management | Total $, $ / customer, (metric) / $ | X X
  Customer assessments | evaluation | X X

Communications & Advocacy:
  Number of Capitol Hill briefings and visits to UCAR by congressional or administration delegations, action alerts, Washington updates, testimonies | # / period | X
  Number and length of publications, press releases, other documents, and web pages | # / period | X X
  Specific actions that occur as a result of advocacy efforts | # / efforts in period | X X
  Number of users of materials, total and of each item | # / period, (metric) / user | X X
  Amount of funding & fiscal management | Total $, $ / user, (metric) / $ | X X
  Stakeholder assessments | evaluation | X X

Goal 5 (Education and Outreach): Engage people with global and diverse perspectives in the science of the systems that support life on the earth.
Primary metric category: Education, Outreach, and Training
  Number of publications, presentations, courses, workshops, and services | # / period | X X
  Number of awards | # / period | X X
  Number of degrees earned | # / period | X X
  Number of holdings in library or other collections | # | X
  Number of users/participants, total and of each program/service | # / period, (metric) / user | X X
  Amount of funding & fiscal management | Total $, $ / user, (metric) / $ | X X
  User/participant assessments | evaluation | X X

Goal 6 (Building and Sustaining Community): Extend the educational capabilities of the academic community and society using the people, ideas, and tools of our national center.
Primary metric category: Visitors and Postdoctoral Fellows (manage all visitor metrics to targets: not too high, not too low)
  Number of collaborations with UCAR scientists | # / period | X X
  Number of visitor-days | # / period | X X
  Number of visitors, total and each program | # / period, (metric) / visitor | X X
  Amount of funding & fiscal management | Total $, $ / visitor, (metric) / $ | X X
  Visitor assessments | evaluation | X X

Goal 7 (Serving Society): Transfer of knowledge and innovative technologies to the public, government and private sectors for the benefit of the society.
Primary metric category: Intellectual Property and Technology Transfer
  Number of patents and copyrights awarded | # / period | X
  Number of licenses granted | # / period | X
  Number of products or systems developed by UCAR and used operationally | # / period | X X
  Number of transfers, total and of each type | # / period, (metric) / transfer | X X
  Amount of funding/revenue & fiscal management | Total $, $ / transfer, (metric) / $ | X X
  Sponsor/customer assessments | evaluation | X X
