78 PERFORMANCE MEASURES TO IMPROVE TRANSPORTATION SYSTEMS AND AGENCY OPERATIONS

ignored the distribution of pavement conditions, that the issue was really the number of pavements toward the bad end of the scale. A statistician might suggest a skewness statistic as a method of measuring which way the distribution of pavement conditions is leaning. A decrease in the skewness coefficient from one period to the next would indicate that the distribution of conditions was moving toward lower (i.e., better) scores. Reporting a decrease in skewness to the public and to agency management, however, would not elicit the same level of understanding as reporting that the number of bad pavements decreased. The latter is a concept that can be easily understood, making it a very powerful measure.

Collecting Data

The examples presented in the previous section demonstrate that a direct relationship exists between the performance measures selected and the data needed in the performance measurement process. The most common data problems are in ascertaining the quality of the data and in acquiring it in the exact form desired.

The "garbage in, garbage out" concept applies to the data used in a performance measurement system. If the data gathered are highly uncertain, then the conclusions drawn by converting those data into performance measures also will be highly uncertain and will have reduced value in managing the agency. For this reason, great care needs to be taken in data collection. Investments in accurate, high-quality data-collection systems are essential to successful performance measurement and, by extension, to achieving the overall strategic goals of the agency. In reality, however, some things are important and either cannot be measured accurately or cannot be measured accurately at an acceptable cost. Transportation agencies need to consider the uncertainty introduced by inaccurate data when taking action based on their system of performance measures.
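The point that uncertain data produce uncertain measures can be made concrete with a short simulation. The sketch below uses entirely hypothetical pavement roughness scores, an assumed "bad" threshold of 6, and an assumed measurement noise of 1.5 points; it shows how much a reported "percent bad pavement" figure can swing purely from data error, before any real change in conditions.

```python
import random
import statistics

random.seed(0)

# Hypothetical true roughness scores (higher = worse); "bad" means > 6.
true_scores = [2, 3, 3, 4, 5, 5, 6, 7, 8, 9]
THRESHOLD = 6

def percent_bad(scores):
    """Share of sections rated above the 'bad' threshold, in percent."""
    return 100 * sum(s > THRESHOLD for s in scores) / len(scores)

# Simulate repeated surveys of the same pavements with measurement
# noise (assumed sd = 1.5) and record the measure each survey reports.
estimates = []
for _ in range(1000):
    noisy = [s + random.gauss(0, 1.5) for s in true_scores]
    estimates.append(percent_bad(noisy))

print(f"true: {percent_bad(true_scores):.0f}% bad")
print(f"reported: mean {statistics.fmean(estimates):.1f}%, "
      f"survey-to-survey sd {statistics.stdev(estimates):.1f} points")
```

The survey-to-survey spread is the uncertainty the agency would be acting on; if it is comparable to the year-over-year change being reported, the measure cannot support the conclusion being drawn from it.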
More specific issues related to data collection and manipulation are discussed below.

Analyzing and Reporting Results

Once the desired data are in hand, the focus shifts to the analysis and reporting of results. In this stage, the most challenging problem is often separating the impact of the activities of the transportation agency from the impacts generated from beyond those activities. For example, highway crashes are influenced by many factors besides highway design. If an agency uses the total number of highway crashes as a performance measure, does an increase in crashes indicate that the agency's safety programs are ineffective? Before that conclusion is drawn, the impact of changes in the weather and other factors clearly needs to be understood.

The necessity of separating the impacts of external factors has direct implications for data collection, another of the important feedback loops in developing a performance measurement system. Even though statistical techniques might be available to allow the impacts of several factors to be isolated, the techniques require large numbers of observations to be used reliably. Thus, it is necessary to have a data-collection system that increases the number of observations by maintaining data with some degree of disaggregation in both time and space. It also is necessary to gather data on relevant factors outside the agency's control. For example, if highway crashes are a performance measure and are influenced by severe weather conditions, then data need to be collected on severe weather across the agency's jurisdiction. It is also necessary to record crashes on an hourly or daily basis by location to determine how many occurred during periods of good versus bad weather.

Another aspect of the analysis of performance measures with a direct impact on data collection is the frequency with which the analysis is needed. The time period covered by an agency's goals and the time period for which current data are maintained must be consistent.
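The crash-and-weather example above can be sketched with a few hypothetical daily records. Keeping the data disaggregated by day and location is exactly what makes the good-versus-bad-weather comparison possible; once crashes are rolled up to an annual total, the weather effect can no longer be separated out.

```python
from collections import defaultdict

# Hypothetical daily crash records: (date, location, weather, crashes).
records = [
    ("2024-01-03", "I-80 mp 12", "severe", 4),
    ("2024-01-04", "I-80 mp 12", "good",   1),
    ("2024-01-05", "US-50 mp 3", "good",   0),
    ("2024-01-06", "US-50 mp 3", "severe", 3),
    ("2024-01-07", "I-80 mp 12", "good",   2),
]

# Tally crashes and exposure (location-days) under each weather condition.
crashes_by_weather = defaultdict(int)
days_by_weather = defaultdict(int)
for _, _, weather, crashes in records:
    crashes_by_weather[weather] += crashes
    days_by_weather[weather] += 1

for weather in crashes_by_weather:
    rate = crashes_by_weather[weather] / days_by_weather[weather]
    print(f"{weather}: {rate:.1f} crashes per location-day")
```

If the crash rate under good weather is flat while the severe-weather rate drives the annual total, the increase says more about the winter than about the agency's safety programs.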
In determining frequency, the agency should consider the nature of the processes underlying its activities. Consider pavement roughness, for example. Highway construction takes place over several months, and the schedule of work over the course of the year varies for many reasons. In this case, it would be of little use to measure, analyze, and report changes in pavement conditions less than annually. Poorer conditions early in the year do not necessarily imply the agency will end up with poorer conditions after all construction work is complete. In other cases, the underlying process may be much shorter than the frequency of analysis and reporting. If the process can be redirected on short notice, it may be useful to monitor the results of the ongoing process so that midterm corrections can be made if it appears that the agency's goal might not be reached.

As mentioned in the discussion of data collection, performance analysis results are often uncertain because data are difficult to collect accurately. This uncertainty often can be addressed in the analysis phase. One approach is to disaggregate the performance data and determine whether all levels of aggregation perform similarly. This might be done by
looking at conditions in varying geographical areas within the jurisdiction of the agency. If all areas perform similarly, the result conveys more certainty. If only one or two areas have poor results, then additional analysis can focus on those areas to determine whether there is reason to believe data accuracy issues are causing them to stand apart. Another approach is to look at related measures, which the underlying process suggests should be correlated with performance in areas prone to inaccurate data. If each measure points in the same direction, then the agency can be more confident of the results.

Analysis of performance also should consider combining feedback and performance data for a more complete picture. Data on changes in miles of bad pavement, for example, could be combined with customer feedback gained through pavement satisfaction surveys. One result can help verify and explain the other, and when results vary, it can point to the need to reevaluate the measures used.

Finally, analysis must consider the impact that the measures have on each other. Three goals have already been suggested for a highway organization: smooth pavement, reduced congestion, and fewer crashes. Success in increasing the smoothness of pavements may encourage higher speeds, which will increase crashes. A heavy commitment of resources to capacity projects may reduce resources available to pavement renewal or to safety improvements. An analytic process must be sufficiently complex to allow the policy choices to be highlighted and the relative impact of each to be understood. If competing goals cannot be analyzed, the results achieved will be haphazard.

Managers of highway systems are not alone in facing such challenges. Transit operators usually are forced to balance the need for efficiency with the need to provide mobility for people in low-density areas. Efficiency measures would tend to lead the operator to discontinue less-used routes.
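One way to keep a networkwide efficiency measure from automatically condemning coverage service is to report riders per service hour within route types rather than across the whole system. The sketch below uses hypothetical route data and hypothetical type labels; it is one possible structure for such a measure, not a prescribed method.

```python
from collections import defaultdict

# Hypothetical routes: (route, service type, monthly riders, service hours).
routes = [
    ("10", "urban trunk",       4200, 120),
    ("22", "urban trunk",       3900, 110),
    ("51", "suburban coverage",  300,  60),
    ("52", "suburban coverage",  260,  55),
]

# Aggregate riders and hours by service type so coverage routes are
# compared against their peers, not against trunk routes.
totals = defaultdict(lambda: [0, 0])
for _, rtype, riders, hours in routes:
    totals[rtype][0] += riders
    totals[rtype][1] += hours

for rtype, (riders, hours) in totals.items():
    print(f"{rtype}: {riders / hours:.1f} riders per service hour")
```

Reported this way, a coverage route is judged on whether it serves its mobility purpose efficiently for its class, while the tension between the two route types remains visible to policy makers.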
However, the demands for access to jobs in less-dense suburban locations might lead the operator to add more such routes. Policy makers and managers must be able to understand the interaction of these two goals that may be polar opposites in terms of their implementation. If policy makers determine greater mobility to be the primary goal, they must either accept a reduced emphasis on efficiency or adopt a system of performance measurement that is sufficiently complex to differentiate the efficiency of various types of services or routes.

Both of these examples of competing goals require reasonably sophisticated analytic processes that allow for various policy options to be considered in iterations, so that the interplay of those options can be understood.

Accepting Performance Measures

As transportation agencies move through the stages of the performance measurement process, it is important for them to keep in mind that a system will fail unless it has buy-in from customers, stakeholders, and employees. Agencies should view the development of a performance measurement system as an art, not a science. If performance measurement were a science, there would be one best way to do it. There is not. Given that performance measurement is an art, an agency's top managers must view themselves as artists who find creative ways to bring the brush strokes of all interest groups into a coherent form. Top management needs to set the agency's strategic direction and goals as well as broaden involvement in developing the performance measures that the agency uses. If done successfully, each group will believe in the results and be willing to act on them to achieve real improvement.

To ensure buy-in, an agency must consider not only what it does but also how it is done. Many of the points made in discussing the performance measurement process bear repeating because ignoring them will hurt the buy-in process. First, management must keep the measures few and simple.
Second, management needs to ensure that the measures are directly related to agency strategic goals and directly influenced by agency activities. Third, performance measures must be developed and used as tools for improving critical processes, not as report cards. Finally, management must invest staff and resources in reliable data-collection systems and in the analytic methods required for timely analysis and reporting of results. A significant breakdown on any of these points will lessen the effectiveness of the performance measurement process and reduce the ability of the agency to successfully accomplish true process improvements.

CUSTOMER IDENTIFICATION

The earlier discussion focuses largely on measures that come from a transportation agency's standard data systems. Pavement quality, congestion, and crashes can be reduced to hard numbers and are routinely reported in most agencies. These are the traditional transportation measures. Customer measures provide another view of many of these traditional measures; they may provide a subjective overall as-