PERFORMANCE MEASURES TO IMPROVE TRANSPORTATION SYSTEMS AND AGENCY OPERATIONS

IMPLEMENTING PERFORMANCE MEASUREMENT IN TRANSPORTATION AGENCIES

...ence, what gets measured is viewed as being more important, and what is not measured as less important. For performance measurement to thrive, indicators of what is important in an organization (goals, strategies, policies, programs, and projects) must be aligned in all directions: vertically, spanning the hierarchy; horizontally, spanning the functional specialties and geographic turf; and diagonally, spanning the horizontal and vertical dimensions simultaneously.

It is surprising how often this alignment simply does not occur. It is not good for headquarters to focus on key measures of critical importance to the leadership while the field, where much of the work gets done, is on a different page. It does not help if one or more of the essential organizational units responsible for working together on a particular initiative fails to understand the effort required and the results expected. Yet how common is this occurrence in our collective experience?

At one end of an organization, policies, strategies, and goals are articulated. At the other end, operational activities produce outcomes. Defining those outcomes in terms of performance objectives and measures that align with policies, strategies, and goals is fundamental. Clear and focused communication and teamwork among all affected organizational units, as well as strong and sustained leadership and commitment, are the keys to achieving omnidirectional alignment. It is a never-ending process of sensing differences, responding constructively, and developing a clear consensus.

Survival

One of the major questions we face in introducing organizational change is whether the changes will be sustained over the long run, whether they will be unceremoniously dropped at the next change in leadership, or whether they will simply fade away over time. The common goal of champions of change is to institutionalize what they perceive as new positive practice.
Certainly, this is what would be expected among the sponsors of performance measurement. Barring a conscious effort to the contrary, the hope would be that performance measurement would become so deeply ingrained in the culture of the organization, and the benefits so apparent, that it would be unthinkable to stop. The tendency at all levels would be to continue the practice because it works well.

The key is to address the issue of ingraining such changes in the fabric of an organization at the very outset. The manner in which the first seeds of performance measurement are sown will have a strong bearing on whether the underlying philosophy and day-to-day practice become firmly rooted. Much depends on whether there is a well-thought-out strategy for implementation, developed with the involvement of the people who will be affected, from data collectors to decision makers.

• Is there a well-thought-through plan of attack?
• Is the plan being discussed openly and often?
• Is the plan well understood?
• Is the plan widely supported?
• Is the plan taking root in a systematic and deliberate manner?
• Does the plan provide for refinement and continuous improvement?

Implementing performance measurement is not a cakewalk and is not to be taken lightly. It demands strong leadership coupled with sensitivity, skill, goodwill, and intuitive common sense. If it seems right, it very likely is; if it seems wrong, it very likely is as well.

U.S. Department of Transportation Experience

The new administration that took office in 1993 initiated the National Performance Review under the leadership of Vice President Al Gore. That same year, the U.S. Congress enacted the Government Performance and Results Act. The legislation required each agency to develop a 5-year strategic plan; define its mission, goals, and objectives; establish annual performance plans and budgets; identify performance measures; and prepare an annual performance report.

Within the federal establishment, the U.S.
Department of Transportation (DOT) has been praised for its work in establishing national performance objectives and measures. The response from state and local transportation officials, however, has been quite different. From their perspective, federally developed measures applied to highway and transit systems owned and operated by the states and localities have been inappropriate and intrusive.

This dilemma is fraught with difficulty. The federal government finances a significant share of capital investment in the nation's arterial highways and transit systems. From that perspective, it makes sense for Congress and U.S. DOT to seek ways to measure the effectiveness of these investments in terms of the safety, service levels, and structural integrity of the nation's principal transportation infrastructure. However, the states and local areas own, operate, and maintain these arterial highways and transit systems. Many of
them recoil at the idea that national performance standards could be imposed that might not be relevant to their unique situations but that could be perceived by some as reflecting unfavorably on them. They argue that decisions on which performance objectives to adopt, which investments to make, and which projects to pursue are the responsibility of state and local governments, responding to circumstances that the federal government is incapable of dealing with and that are neither consistent nor comparable from location to location. They point out that local differences across the country are so great that any attempt to measure performance on a national scale is inappropriate. They cite the largely discredited Hartgen report as evidence of the difficulty of drawing legitimate comparisons among transportation agencies with divergent characteristics and goals.

The dilemma has not yet been resolved. In theory, a compromise might involve dividing performance objectives and measures into national, regional, and local categories. National standards could apply to universally accepted measures, such as federally mandated bridge criteria or adherence to uniform traffic control standards. Regional criteria might apply where sufficient consistency exists among such factors as climate, terrain, traffic characteristics, materials, and subsurface conditions; regional comparisons could be valid for pavement and bridge performance, for example, although their validity would clearly depend on accounting for the underlying variation in factors that are beyond the control of the agency yet have a significant impact on condition. Some measures, such as the level of congestion considered acceptable and the levels of transit service provided, might be relevant strictly in relation to individual state and local area policies and objectives.
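The tiered scheme just described can be made concrete with a small sketch. The measure names, regions, and comparison rule below are hypothetical illustrations, not anything drawn from actual federal or state practice:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Scope(Enum):
    NATIONAL = "national"   # universally accepted criteria
    REGIONAL = "regional"   # comparable only among similar regions
    LOCAL = "local"         # meaningful only against an agency's own objectives

@dataclass(frozen=True)
class Measure:
    name: str
    scope: Scope
    region: Optional[str] = None  # context required for regional measures

def comparable(a: Measure, b: Measure) -> bool:
    """Two observations of a measure are comparable only within the
    measure's scope: always for national measures, within the same
    region for regional ones, never across agencies for local ones."""
    if a.name != b.name or a.scope is not b.scope:
        return False
    if a.scope is Scope.NATIONAL:
        return True
    if a.scope is Scope.REGIONAL:
        return a.region is not None and a.region == b.region
    return False  # Scope.LOCAL: no cross-agency comparison

# Hypothetical measures for two agencies
bridge_fl = Measure("bridge_criteria", Scope.NATIONAL)
bridge_mn = Measure("bridge_criteria", Scope.NATIONAL)
pavement_fl = Measure("pavement_condition", Scope.REGIONAL, region="southeast")
pavement_mn = Measure("pavement_condition", Scope.REGIONAL, region="upper_midwest")

print(comparable(bridge_fl, bridge_mn))      # True: national standard
print(comparable(pavement_fl, pavement_mn))  # False: different regions
```

The point of the sketch is simply that a comparison's validity is a property of the measure's category, which is the essence of the compromise outlined above.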
Whether such a hybrid approach can be developed to the general satisfaction of federal, state, and local officials remains to be seen.

State DOT Experiences

Most of the state DOTs have initiated or experimented with performance measures to some degree. A few states, including Florida, Minnesota, and Oregon, have been at it for some time. No two state DOTs have undertaken performance measures for identical reasons or implemented them in the same way.

Florida

Florida is a particularly good long-standing example of the power of performance measurement. In 1984, after a revenue increase, the legislature made its policy direction to the Florida Department of Transportation (FDOT) unmistakably clear: it was not satisfied with the level of maintenance statewide, and it wanted system preservation to take priority over system expansion. For the first time, FDOT defined performance standards for bridges, pavements, and overall quality of maintenance. The measurement systems were easily understood by practitioners and politicians alike, and they turned out to be effective.

Annually, using clear and comprehensible charts, FDOT graphically displays its adopted performance objectives and its progress, for the current year and for preceding years, in relation to each standard. It is therefore possible to ascertain at a glance what the objective is, whether it is being achieved, and what the year-to-year trends are. It also is possible for the legislature, the governor, the Commission on Government Accountability to the People (the so-called GAP Commission), the Florida Transportation Commission, MPOs throughout the state, and FDOT staff from senior levels to the front lines to confirm that the legislature's mandate is being heeded, with performance standards set and steady progress being made.
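The at-a-glance comparison of adopted objectives against annual results can be sketched in a few lines. The measure names, targets, and yearly values below are fabricated purely for illustration; they are not FDOT's actual standards or results:

```python
# Adopted performance standards (illustrative targets only)
standards = {
    "pavement_pct_rated_good": 80.0,
    "bridge_pct_meeting_standard": 90.0,
}

# Annual results, oldest to newest (illustrative values only)
history = {
    "pavement_pct_rated_good": [74.0, 77.5, 81.2],
    "bridge_pct_meeting_standard": [88.0, 89.1, 89.5],
}

def report(standards, history):
    """One line per measure: latest value vs. target, whether the
    objective is met, and the direction of the year-to-year trend."""
    lines = []
    for name, target in standards.items():
        values = history[name]
        latest = values[-1]
        met = "met" if latest >= target else "NOT met"
        trend = "improving" if len(values) > 1 and values[-1] > values[0] else "flat/declining"
        lines.append(f"{name}: {latest:.1f} vs target {target:.1f} ({met}, {trend})")
    return lines

for line in report(standards, history):
    print(line)
```

Even this toy version shows the two things the charts are meant to communicate at a glance: whether the standard is being achieved, and whether the trend is in the right direction.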
The value of such an approach to an agency's credibility is incalculable, particularly when continuous improvement can be easily demonstrated.

Minnesota

Minnesota's initiative, known in its early days as the "Family of Measures," has received a great deal of national attention. In contrast to Florida's, its roots lie within the state DOT itself, although the legislature later began requiring it. It also was a broader, cross-cutting approach, embracing about 40 measures in three general categories: system performance, organizational performance, and societal values (e.g., social and economic factors, the environment).

More recently, the focus in Minnesota has moved to an emphasis on business planning, using measures to assess performance against strategic objectives drawn from the agency's strategic plan. Many of the original measures remain in use, so it seems fair to say that the Minnesota DOT performance initiative continues to refine and adjust on a solid foundation. The four key categories of strategic objectives are

• Level of service in interregional corridors (i.e., a specified percentage of miles achieving a threshold average travel speed),
• Multimodal options,
• Program delivery, and