Panel Discussion, Part 1: Selecting Measures, Data Needs, and Analytical Issues

Anita Vandervalk, Florida Department of Transportation
Paul O'Brien, Utah Transit Authority
Joel Pfundt, Puget Sound Regional Council
Tom Brigham, Alaska Department of Transportation, Moderator

There was no question-and-answer period for this session. The summaries were prepared by Jonette Kreidweis, Minnesota Department of Transportation.

TRB AND FLORIDA EXAMPLES

Anita Vandervalk

The goal of this presentation is to confirm many of the points in the resource paper by comparing them with the outcome of the Transportation Research Board (TRB) data committee peer exchange and with the Florida experience.

At the end of July, the TRB data committee held a peer exchange in which nine states gathered along with staff from TRB and the Federal Highway Administration (FHWA). We had a good exchange about the latest developments and performance measures, and we focused on data issues. As a result, much of the outcome of that peer exchange is similar to the points made in the resource paper. (Note that a copy of the report on the peer exchange is found in Appendix B.)

I will not go into too much detail about Florida's performance measures, because we have been doing this for 10 years and much has been written about our program. I want to draw on some examples to validate the points. One key feature of our performance measure program is that it is closely linked to our planning process, which I will demonstrate in a moment. Then I am going to propose some additional areas of study, most of which came from the peer exchange.

I decided to break the points of the paper into three main areas: agency performance measures, customer needs and data, and, most important, data for performance measures. For each area, I have chosen a couple of points to highlight and discuss in detail.

The paper pointed out that agency performance measures should be focused on three groups: customers, stakeholders, and employees.
I would like to take that focus a little further and emphasize that we need to look at how performance measures are used. This is something we discussed at length at the peer exchange. We all realize that there is a flurry of activity. We all think we are doing the right thing in developing these performance measures. But we need some examples of how performance measures have contributed to decision making in an agency, to organizational and institutional changes, and to operations. We also talked about that in our breakout group.

A second point on agency performance measures is the number of measures. This is one point where I disagree with the paper's authors. They indicated that there should be only a few. In Florida and Minnesota, where there are hundreds of measures, it is important to have a lot of measures. I do agree that you need to be able to boil them down to a few key ones for reporting, but to get buy-in, you need measures that cover every area of the agency so that everybody is involved.

When we were mandated by our legislature in 1994 to have measures in place, we looked out into the agency to determine how we could report to the legislature on this. We found that a lot of the areas
(pavement maintenance, for example) had been doing performance measurement for years; that is how they operated their business. That is why it is important to have hundreds of measures. Again, the reporting focus should be on a few that we can keep our eyes on.

The third point under agency performance measures is to link to agency goals. This point is absolutely critical, and the goals should be aligned. Figure 9 shows Florida's method for linking. One thing we do differently is to link back. First, we establish our policies and plans. We just completed the development of our 2020 Florida Transportation Plan. The plan took 18 months to develop, and we involved several hundred individuals throughout the state: MPOs, county government officials, and the general public. We held several brainstorming groups. It was a huge effort, but we are finishing what I think is a fairly well-supported plan.

On the basis of that plan, we developed our financial priorities. Some of them are based on statute and regulation. For example, in Florida, maintenance dollars are taken right off the top: a certain percentage goes directly to facilities maintenance. Of the remainder, 50 percent of all the capacity funds automatically goes to supporting Florida's Interstate highway system, our key corridors within the state. The other 50 percent of the capacity dollars is spent based on the policies and plans that we put forth in our 2020 long-range plan. We then implement our adopted work program and measure the performance, that is, how well we are serving the customers in the areas of pavement maintenance and capacity improvements. The measures link directly back to the Florida Transportation Plan. We have to report annually to the legislature, in the form of an agency strategic plan (a kind of short-range component of our plan), on how we are doing on each of our hundreds of measures.
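The funding-allocation sequence described above can be sketched in a few lines. This is an illustration only: the 50/50 split of capacity funds comes from the discussion, but the maintenance share is not specified in the talk, so the 10 percent figure below is a placeholder assumption, as are the dollar amounts.

```python
def allocate_budget(total, maintenance_share=0.10):
    """Sketch of the allocation sequence described above.

    Only the 50/50 split of capacity funds is from the discussion;
    the maintenance share is a placeholder assumption, since the
    actual percentage is not given.
    """
    maintenance = total * maintenance_share  # taken right off the top
    capacity = total - maintenance           # remainder = capacity funds
    interstate = capacity * 0.50             # 50% automatically to the Interstate system
    plan_based = capacity * 0.50             # 50% allocated per the long-range plan
    return {"maintenance": maintenance,
            "interstate": interstate,
            "plan_based": plan_based}

shares = allocate_budget(1_000_000)
print(shares)  # maintenance: 100000.0, interstate: 450000.0, plan_based: 450000.0
```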
FIGURE 9 Agency performance measures: link to planning process.

We report how well we are doing based on our goal, and what we plan to do if we are not meeting that goal. So we have to demonstrate continuous improvement based on those measures.

The fourth point is that measures should change slowly. From a data standpoint, I cannot emphasize this issue enough. Trends need to be established before you start changing the measures. In fact, for the mobility area in Florida, we decided not even to set goals and objectives until we have enough data to back us up so that we can set appropriate objectives. That is a key point.

Fifth, continuous improvement, as I demonstrated, is critical.

The second major area is customer needs and data. The resource paper touched on the need for surveys and what they should look like. One of the things we talked about at the peer exchange was the need to use existing market research, survey tools, and so on. We should not reinvent the wheel in this area. A lot of agencies are already doing market research, and we should latch on to existing methods, maybe tweaking them for our needs.

The third and most important area, of course, is the data. One issue touched on in the paper is modeled versus actual data collection. This is something we struggled with quite a bit, especially in our mobility measure area. We have determined that model data, when validated with actual data, are an excellent way to get a consistent data source from a network standpoint. We are big on making sure that the data we report are consistent across the whole network, to the point that we will not even map our mobility performance measures, because we are concerned about the comparisons that might be made from one urban area to another. Model data have been the answer for us because they provide consistent results. For example, speed is the one thing that we model. We have an extensive program for traffic data collection that gives us our volumes and so on.
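The modeled-versus-actual validation idea can be illustrated with a small sketch. Everything here is hypothetical: the segment names, speeds, 15 percent tolerance, and function name are assumptions for illustration, not Florida DOT's actual procedure. The point is simply to compare modeled speeds against observed speeds and flag segments where the model disagrees too much to be trusted as a consistent data source.

```python
# Hypothetical sketch of validating modeled speeds against observed speeds.
# Segment data and the 15% tolerance are illustrative assumptions only.

modeled_mph = {"I-95 seg 1": 52.0, "I-95 seg 2": 38.0, "SR-50 seg 1": 41.0}
observed_mph = {"I-95 seg 1": 49.5, "I-95 seg 2": 30.0, "SR-50 seg 1": 43.0}

def flag_segments(modeled, observed, tolerance=0.15):
    """Return segments whose modeled speed deviates from observation beyond tolerance."""
    flagged = []
    for seg, obs in observed.items():
        rel_err = abs(modeled[seg] - obs) / obs  # relative error vs. observed speed
        if rel_err > tolerance:
            flagged.append(seg)
    return flagged

print(flag_segments(modeled_mph, observed_mph))  # ['I-95 seg 2']
```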
We use a combination of modeled and existing data.

We used our existing data-collection program extensively. The approach should be based on what you already have. We had to tweak our programs. For example, one decision we had to make was what to choose as a peak hour. We chose 5 to 6 p.m. because it seemed to be the peak hour that most closely matched the peak time for both transit and roadway. Then we had to tweak our data-collection processes to give us data in that time frame.

Quality control, integration, and data sharing are key to what we do, and I have linked them together. When we discussed these issues in our peer exchange, we linked them because if you are trying to get qual-