
final report

CDOT Performance Data Business Plan

prepared for
Colorado Department of Transportation

prepared by
Cambridge Systematics, Inc.
1566 Village Square Boulevard, Suite 2
Tallahassee, FL 32309

date
December 31, 2011


Table of Contents

Executive Summary
1.0 Introduction
    1.1 Purpose
2.0 Stakeholder Outreach
3.0 Performance Measures
    3.1 Performance Reporting Context
    3.2 Benefits
    3.3 Assessing CDOT Current Practices
    3.4 Recommended Measures
    3.5 Data to Support Measures
    3.6 Initial Calculations
    3.7 Data Inventory
    3.8 Alignment with Goals
    3.9 Potential Enhancements
4.0 Data Governance Plan
    4.1 Best Practices
    4.2 State of Data Governance at CDOT
    4.3 Recommendations and Timeline
5.0 External Reporting Dashboard
    5.1 Best Practices
    5.2 Current CDOT Process and Web Reporting Capability
    5.3 Recommendations for CDOT Dashboard Report Format
6.0 Cost/Benefit Curves for Safety and Mobility
    6.1 Safety
    6.2 Mobility
Appendix A Additional Data Collection Recommendations
Appendix B CDOT Supporting Measures
Appendix C References
Appendix D Stakeholder Interview Summary
Appendix E CDOT Goals and Objectives
Appendix F Sample Data Governance Work Team Charter
Appendix G Congestion
Appendix H Calculation Spreadsheet Pages


List of Tables

Table 3-1 Assessing CDOT Practices
Table 3-2 Summary of Measures
Table 3-3 Summary of Results for 2010
Table 3-4 Data Sources to Support Performance Measures
Table 3-5 Report Sources
Table 3-6 Data Catalog
Table 3-7 Data Quality Issues for Measures
Table 4-1 Data Management Maturity Model Matrix
Table 4-2 CDOT Multi-Asset Management Self-Assessment
Table 5-1 Hennepin County Sample Balanced Scorecard
Table 5-2 Dashboard Types
Table 5-3 A Comparison of Operational and Tactical Dashboards and Strategic Scorecards


List of Figures

Figure 3.1 Performance Management Framework
Figure 3.2 Data Flow
Figure 3.3 CDOT Mission
Figure 4.1 Data Management, Data Governance, and Data Stewardship
Figure 4.2 Self-Assessment Results
Figure 4.3 Timeline
Figure 5.1 Georgia DOT Performance Management Dashboard
Figure 5.2 Georgia DOT Bridge Maintenance Measures
Figure 5.3 VDOT Main Dashboard
Figure 5.4 Detailed View for Project Delivery
Figure 5.5 Virginia Governor’s Scorecard
Figure 5.6 Washington State Transportation Improvement Board Dashboard
Figure 5.7 Washington State Performance Management Dashboard
Figure 5.8 Washington State Key Performance Indicators
Figure 5.9 NCDOT Organization Performance Dashboard
Figure 5.10 Detailed View for Infrastructure Health
Figure 5.11 NCDOT Quarterly Scorecard
Figure 5.12 District Transportation Access Portal (Beta 2.0)
Figure 5.13 Detailed View for Safety
Figure 5.14 Capital Bikeshare Performance Dashboard
Figure 5.15 Capital Bikeshare Dashboard Drill Down
Figure 5.16 MnDOT Performance Results Scorecard
Figure 5.17 Dashboard Design Elements
Figure 5.18 CDOT Dashboard Example
Figure 5.19 CDOT Performance Measures Site Organization
Figure 6.1 Example Cost/Benefit Curve – Interstate Bridges in the Atlanta Region




Executive Summary

Like many transportation agencies, the Colorado Department of Transportation (CDOT) collects, maintains, and reports on a wide variety of internal and external performance areas. CDOT has been collecting and using performance measures data to support long-range planning, policy, and investment analysis since the early 2000s. Several internal offices are directly involved with the collection of data and the maintenance of the systems used to store and analyze the information that supports the measures. CDOT performance data is reported regularly, both internally and externally, in the Annual Performance Report, Annual Report, Transportation Deficit Report, Strategic Plan, FHWA-CDOT Stewardship Agreement, and CDOT Fact Book. Several internal offices also report on various indicators, including the regions, the bridge office, contracts and market analysis, project development, and maintenance. In addition, several related initiatives under way within CDOT are aimed at improving access to data and information.

CDOT, like many other state DOTs, realizes that the highway construction era is changing and that the Department’s focus needs to shift from increasing capacity to managing and operating the existing system. Managing and operating the system requires detailed information about current and past performance, as well as predictions of future performance. Mobility, safety, asset management, and data collection/management are critical to the success of any CDOT performance-based planning process.

The overall objective for this project was to develop a Performance Data Business Plan for CDOT. The report comes at an opportune time for CDOT to prepare for potentially changing Federal requirements and to support enhanced data management, performance reporting, and decision-making within the agency. The project was sponsored by the Performance and Policy Analysis Unit within CDOT’s Division of Transportation Development. This report documents the results of this effort. It recommends the following nine core performance measures:

1. Number of fatalities;
2. Bridge condition;
3. Pavement condition;
4. Roadside condition;
5. Snow and ice control;
6. Roadway congestion;
7. On time construction;
8. On budget construction; and
9. Strategic action item implementation.

It also addresses data management methodologies to support these measures, and details best practices and recommendations related to data governance, performance measures, and dashboard development. The products were generated with extensive internal stakeholder input.

1.0 Introduction

1.1 PURPOSE

There are two main types of challenges associated with performance measurement: 1) making effective use of performance data for decision-making, and 2) ensuring that the critical processes and responsibilities for data processing, analysis, and distribution work as effectively as possible. CDOT, like other agencies, is facing both types. Each aspect of performance measurement – data quality, data management, analysis tools and methods, dissemination, and use in business processes – is important to the ultimate success of the effort. In general, agencies should determine which element(s) need more attention and develop a balanced strategy for improvement. Typically, this strategy will require efforts on multiple fronts within CDOT, including:

- Measuring the right things, at a level of detail appropriate to how they will be used;
- Taking advantage of current technologies and tools for data collection, processing, and analysis;
- Making the best possible use of existing data and legacy systems;
- Enhancing tools over time to provide better decision support; and
- Building the staff capability and commitment required to ensure quality information and analyses that are used to make decisions.

The purpose of this project was to develop and deliver a Plan to improve the Department’s management of performance information in the following areas: identification of priority performance measures, identification of critical data for these measures, data quality assurance and control, interfacing with a data management system, ensuring consistency, and minimizing the burden of reporting.

This report documents current practices and next steps for CDOT with regard to performance measures, data governance, and dashboard development. The remainder of this report is organized in the following sections:

2.0 Stakeholder Outreach – Summary of the outreach efforts.

3.0 Performance Measures – Details current practices, best practices, and recommended practices related to performance measures, including a detailed discussion of the data needed to support the recommended measures.

4.0 Data Governance – Details best practices and specific recommendations for CDOT regarding the development of a Data Governance strategy.

5.0 External Dashboard – Documents best practices and examples from other states and recommends a specific dashboard structure for CDOT.

6.0 Cost/Benefit Curves for Safety and Mobility – Addresses options for evaluating the relationship between funding level and future safety and mobility performance.


2.0 Stakeholder Outreach

A key component of this project was coordination with the large number of stakeholders within CDOT. The input received from these staff was integral to developing the recommendations. Following is a summary of the outreach efforts.

1. A list of internal stakeholders was developed in coordination with CDOT. Stakeholders included both performance measures program and data program owners and users. They represent offices that collect, own, maintain, use, interface with, access, benefit from, or are otherwise affected by the data and performance measures. The stakeholders were organized into a Core Team and a Stakeholder Group. The Core Team’s role was to provide guidance to the project team by reviewing the recommended approach and products, assist with stakeholder coordination and reconciliation of issues, and help ensure that the products met all stakeholder needs. The Stakeholder Group’s role was to assist with interviews, provide feedback on the recommended performance measures, contribute to populating the data matrix, and provide CDOT best practices.

2. Kickoff meetings of the Core Team and Stakeholder Group were held on May 2, 2011. Individual interviews were also held with the stakeholders to discuss the following topics: Business Area Mandates and Performance Measures (SAP role; what, who, when, and how); Tools and Dashboards (existing and new tools; data sources, ownership, and data quality); and Data Governance (methods and readiness). A summary of the kickoff meetings and interviews is included in Appendix D.

3. Additional meetings were held by telephone throughout the summer of 2011 with stakeholders who were not available for the face-to-face meetings in May.

4. A Core Team meeting was held by teleconference on July 25, 2011. The draft performance measures matrix was reviewed by the team at that time.

5. A final set of meetings was held with stakeholders on September 6 and 7, 2011 to review and validate the recommendations to date. Large meetings were held with a management team and with the broader Stakeholder Group. Individual meetings were also held with each data owner to discuss data needs for the nine recommended measures.

3.0 Performance Measures

3.1 PERFORMANCE REPORTING CONTEXT

This section reviews CDOT’s performance reporting processes, compares them to best practices nationally, and recommends opportunities for improvement. To help structure the review of current practices, the study team used a performance management framework developed through a National Cooperative Highway Research Program (NCHRP) study. This framework, illustrated in Figure 3.1, shows performance reporting as one of three related DOT management processes, alongside strategic planning and performance management. Together, these three components establish the foundation for a continuous cycle of identifying priorities, allocating resources, refining agency practices, managing staff, and maintaining accountability to the public. Each process has its own timeframe and cycle, and though they are all closely related, each is characterized by distinct activities conducted by various parties within an agency.

Figure 3.1 Performance Management Framework
Source: Cambridge Systematics, NCHRP 8-62 Final Report, Transportation Performance Management Programs: Insight from Practitioners, April 2009.

Each component illustrated in Figure 3.1 is described below. The descriptions define the terms and characteristics used for the assessment of CDOT practices presented in Section 3.3. This information (along with the benefits of performance reporting provided in Section 3.2) draws heavily from the NCHRP 8-62 work, but has been tailored for this report.

Strategic Planning

The strategic planning process typically includes the following elements:

- Vision or Mission. An agency’s mission or vision is the set of foundational principles that guide all of its policies and business decisions. A mission should be broad enough to encompass the agency’s entire breadth of responsibilities, while specific enough to suggest actionable goals and objectives.

- Goals. While an agency’s mission can remain the same for a long period (although it does not have to), its goals should change as necessary in response to new or evolving challenges. Performance measures can directly inform the development of goals by highlighting troubling trends and particular agency challenges. Goals can address a variety of facets of the agency’s performance, including external performance (state of the system, project delivery, customer satisfaction, etc.) and internal performance (human resources, communication, employee satisfaction, etc.).

- Objectives. One of the characteristics of an effective strategic plan is that it contains a limited number of measurable objectives that can be achieved within a few years. Strategic objectives will ultimately determine the measures that are used to gauge success.

- Initiatives. Strategic initiatives and implementation strategies are used to help orient an agency toward achieving the desired outcomes and fostering informed and responsive decision-making. Once goals are in place and specific measurable objectives have been set, agencies can identify policies and procedures that ensure success.

Performance Management

Figure 3.1 presents performance management as a continuous cycle consisting of the following four activities:

Selecting measures. Performance measures provide a means for connecting decisions to agency goals and objectives. The best measures reflect an agency’s priorities, are feasible to calculate given existing data, and can be influenced by an agency’s actions.

Setting targets. Performance targets are the gauges of success that support and advance an agency’s strategic plan. Without them, objectives may represent abstract concepts. The most useful targets are ambitious but achievable.

Making decisions. Ideally, performance management begins at the strategic level, where the setting of agency goals and objectives implies certain resource allocation priorities right from the start. However, DOT decision-making is far more nuanced than simply deciding that a particular function or outcome, such as pavement condition, is where the agency will invest its resources. There are numerous business decisions made within individual divisions or business functions that can benefit from a performance-based approach.

Evaluating the system. One key objective of performance management is to discourage complacency in an agency that might otherwise be slow to embrace and adopt change. Likewise, the performance system itself should be regularly evaluated and updated as necessary. The most important area of evaluation is the selection of the measures. Although measures should be updated periodically to ensure consistency with agency priorities and strategic plans, there are significant benefits associated with maintaining a stable collection of measures. Internally, consistently collecting and reporting the same measure for several years enables the in-depth analysis of long-term trends. Externally, consistent measures can make it easier for stakeholders to fully appreciate progress that is being made or new challenges that arise.

Performance Reporting

Performance reporting is an essential component of any performance management program. Therefore, the form and frequency of performance reports should not be an afterthought. Reporting performance is in itself a key component in developing a culture of performance throughout a DOT. Performance reports should reflect agency priorities, assess progress towards goals and objectives, be understandable by target audiences, be easily accessible, and be updated regularly.

3.2 BENEFITS

Strategic planning, performance management, and performance reporting can help agencies make difficult decisions about longer-term policy priorities (“doing the right things”) as well as where and how to apply day-to-day staff and capital resources (“doing the right things well”). Additional benefits include:

- Helping agency leaders set a strategic agenda and motivate staff – Effective leaders keep their organizations focused on the highest business priorities. Performance data can help them understand challenges and set appropriate policy priorities. At a transportation agency, for example, analysis of data can reveal where performance is inadequate in key focus areas like pavement condition, fatalities, congestion, project delivery, or maintenance, and this information can be used to set a strategic agenda. Armed with a performance-driven strategic direction, leaders can energize staff and focus resources around key policy priorities – such as reducing fatalities or alleviating congestion – to maximum effect.

- Helping agency managers improve business processes – Strong performance emerges when day-to-day business processes are aligned with agency-wide strategic priorities. In large organizations, business practices that have neutral or even adverse impacts on performance can easily become routine. Careful scrutiny of performance indicators can help managers make better day-to-day decisions about how to direct staff and resources to achieve outcomes that are more closely aligned with an agency’s overall strategic agenda for achieving improved performance.

- Helping agencies improve accountability to external stakeholders – The need for improved accountability is increasingly becoming a fact of life for public agencies. Transportation agencies that ignore the expectations of elected officials, advocacy organizations, or the public run the risk of stimulating adversarial relationships that drive up the risk of negative policy mandates and reductions in funding. In contrast, agencies able to provide stakeholders with timely and compelling performance-based information about important issues can increase credibility and establish a positive environment for setting policy and funding direction.

3.3 ASSESSING CDOT CURRENT PRACTICES

Table 3-1 provides an assessment of CDOT’s current practices in the areas described above. It summarizes CDOT practices and indicates the study team’s assessment of how closely they align with best practices.

These findings are based on the results of interviews with CDOT staff and a review of several CDOT documents, including:

- Amendment to the 2035 Revenue Forecast and Resource Allocation (June 2010);
- CDOT 2010 Transportation Deficit Report;
- CDOT 2011 Chief Engineer’s Objectives;
- CDOT 2010-2011 Fact Book;
- CDOT/FHWA 2010 Stewardship Agreement;
- CDOT Fiscal Year 2010 Annual Performance Report;
- CDOT Project Priority Programming Process (4P) and STIP Development Guidelines (September 2009);
- CDOT Strategic Plan, 2011-2012 Budget; and
- Transportation Commission Policy Directive 14 (December 2006).

Table 3-1 Assessing CDOT Practices

Strategic Planning

- Vision/mission (alignment with best practices: High) – Vision and mission statements are documented in Policy Directive 14, the Strategic Plan, and the Statewide Transportation Plan (long-range plan).

- Goals and objectives (High) – Policy Directive 14 provides goals and objectives by investment category, and the objectives are referenced in subsequent performance reports.

- Initiatives (Low) – The Statewide Transportation Plan (long-range plan) provides investment strategies by corridor. The Strategic Plan provides general strategies by investment category. However, their treatment in the Strategic Plan is inconsistent, with some of the strategies lacking the detail required to track progress toward them. For example, the strategy for mobility includes “changing traveler behavior,” although this term is not defined. The Annual Report alludes to some strategies throughout the document. Overall, the study team found no consolidated list of CDOT’s priority strategic initiatives.

Performance Management

- Measures (Med) – CDOT tracks and reports a wide variety of measures. An annotated inventory of existing measures can be found in Appendix B. Overall, the measures tend to support the documented goals and objectives, reflect a mixture of system measures and organizational measures, and are actionable. In terms of coverage, CDOT’s vision and mission contain a multimodal aspect, which does not appear to be accounted for in the measures. The vision also highlights enhancing quality of life, which is not addressed.

  As illustrated in Appendix B, the number of measures tracked and reported by CDOT is quite large. Most agencies calculate more measures than they report in formal performance reports. However, maintaining a large volume of measures can make it difficult for internal staff and external audiences to understand agency priorities. In addition, superfluous measures can unnecessarily burden data collection and IT efforts. Measures that are not reported externally or used internally to influence decision-making should be removed.

- Targets (Med) – CDOT has two sets of targets: annual targets and longer-term, aspirational targets. Although the longer-term goals are described in the executive summary of the Annual Report, the use of annual targets throughout the report tends to mask overall progress (or lack of progress) toward them. This is because green lights are assigned to areas in which annual targets have been attained, regardless of the relationship between current performance and the longer-term targets.

  The Deficit Report addresses this disconnect directly. It indicates a significant gap between aspirational targets and current performance, and explains that significant additional funding would be required to achieve these targets. It appears as though financial constraints were not a major consideration when the aspirational targets were initially set. This decreases their strategic value because they do not represent true relative priorities (e.g., if CDOT cannot afford to do everything, these areas are its priorities), and because the department’s day-to-day decisions cannot influence their attainment. In addition, tracking chronically unmet targets may eventually desensitize external decision-makers to the need for transportation funding.

- Using measures in decision-making (Low) – The study team found little documentation on the use of performance measures to influence decision-making. In some areas, CDOT staff explained how performance data is used to evaluate potential projects (for example, which safety countermeasures have the most promise, and which pavements and bridges to work on). However, discussions frequently focused on the mechanics of performance reporting (data collection, processing, and report development) rather than on their use.

- Evaluate system (High) – CDOT has an organizational unit responsible for managing its performance management efforts.

Reporting

- Required IT infrastructure (High) – CDOT has made significant progress on the underlying IT infrastructure required for performance reporting. Ongoing efforts regarding a performance dashboard and PBF implementation will further enhance these capabilities.

- Organizational responsibility (High) – CDOT has an organization structure that clearly defines a performance management champion responsible for establishing, compiling, and reporting performance measures.

- Reporting format and content (Med) – CDOT maintains a variety of performance measure reports. Each report has its specific purpose (e.g., fulfill a reporting mandate, highlight internal measures, communicate funding gaps, etc.), and therefore contains a custom set of measures. Similar to the discussion above regarding the use of multiple measures, the use of multiple reports can make it difficult for internal and external audiences to clearly understand agency priorities, what success looks like, and how CDOT is progressing toward success.

  CDOT provides on-line access to many of its performance reports in PDF format. The reports are not located in a central location, and users must sift through the reports if they are interested in a specific performance area.


3.4 RECOMMENDED MEASURES

This section recommends a set of priority performance measures. The measures were chosen based on stakeholder input, public priorities, and alignment with goals. Taken collectively, they can provide a comprehensive status report for the public, help to communicate agency priorities and progress, and enhance transparency and accountability.

Table 3-2 summarizes the measures and categorizes them into tiers. Tier 1 measures require no further work, while Tier 2 measures have issues that need to be resolved before implementation. In most cases, the issues reflect options and suggestions by CDOT staff that are not consistent with the recommendations presented below.

Table 3-2 Summary of Measures

1. Number of fatalities (Tier 2)
   Issues: Determine whether the measure should reflect a rate (fatalities per 100 million VMT) or a count; whether it should be reported for the entire state or for CDOT roadways; whether it should reflect injuries in addition to fatalities; and whether it should reflect an annual value or a 5-year average.
   Recommendations: Report a count rather than a rate; report the measure for CDOT roadways; focus the measure on fatalities; use a 5-year average.

2. Bridge condition (Tier 1)

3. Pavement condition (Tier 2)
   Issue: Determine if the measure should include an IRI component.
   Recommendation: Combine remaining service life with IRI.

4. Roadside condition (Tier 2)
   Issue: Determine the preferred scope of this measure – the entire maintenance program, the Commission’s priorities, or roadside condition.
   Recommendation: Focus the measure on roadside conditions.

5. Snow and ice control (Tier 1)

6. Roadway congestion (Tier 2)
   Recommendation: Add measures on delay and travel time reliability.

7. On time construction (Tier 1)

8. On budget construction (Tier 2)
   Issue: Determine if this measure should be considered a priority measure and reported externally.
   Recommendation: Include the measure in external reports.

9. Strategic action item implementation (Tier 2)
   Issue: Identify action items to monitor.
   Recommendation: Tie this measure to actions identified in CDOT’s strategic plan and/or long-range plan.

In addition to the issues identified in Table 3-2, a global issue is to finalize the use of a grading scale. The following sections recommend a grading scale from A to F that could be applied consistently to all measures except the number of fatalities. Other options identified throughout the course of this project include a variable grading scale that reflects differences in the measures (for example, an A for bridges might represent a higher percent good/fair than an A for pavements), and the assignment of grades based on achievement of a long-term target value.
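To make the uniform scale concrete, the following is a minimal Python sketch of such a conversion. The helper is hypothetical (not part of CDOT’s systems); the 90/80/70/60 breakpoints match the thresholds used for the bridge, pavement, and congestion measures below.

```python
def letter_grade(percent: float) -> str:
    """Map a 0-100 'percent good or fair' value to a uniform A-F grade
    using the 90/80/70/60 breakpoints recommended in this section."""
    if percent >= 90:
        return "A"
    if percent >= 80:
        return "B"
    if percent >= 70:
        return "C"
    if percent >= 60:
        return "D"
    return "F"
```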

The following sections describe the recommended measures. Details related to data and calculations are contained in Section 3.6.

System Performance

1. Number of Fatalities (5-year moving average)

Existing Measure: The number of fatalities per 100 million vehicle miles traveled is currently tracked and reported by CDOT.

Recommendation: The recommended modification to this measure is to report the number of fatalities rather than the fatality rate, and to report a five-year moving average rather than a single-year value.

Current Value: 1.01 fatalities per 100 million VMT

Value Based on Recommendation: The current annual average of fatalities on state highways from 2005 to 2009 is approximately 542 (based on NHTSA Fatality Analysis Reporting System (FARS) data).

Owner: Rahim Marandi

Required Data Item: Fatalities by year for the past five years. Source: TRAFDA. Owner: Safety and Traffic Engineering.

Unit of Measure: Count

Calculation:

   5-year moving average = (K_n + K_(n-1) + K_(n-2) + K_(n-3) + K_(n-4)) / 5

Where:
- K = number of fatalities
- n = year of calculation
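As an illustrative sketch of this calculation (the function and the sample counts are hypothetical, not CDOT data):

```python
def five_year_moving_average(fatalities_by_year: dict, n: int) -> float:
    """Average the annual fatality counts K for year n and the four prior years."""
    window = [fatalities_by_year[year] for year in range(n - 4, n + 1)]
    return sum(window) / len(window)

# Hypothetical annual counts, chosen only to illustrate the arithmetic:
counts = {2005: 556, 2006: 548, 2007: 540, 2008: 534, 2009: 532}
print(five_year_moving_average(counts, 2009))  # (556+548+540+534+532)/5 = 542.0
```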

Update Frequency: Annually

Additional Notes: A random element occurs in the timing and location of serious crashes. Therefore, the annual number of fatalities may fluctuate a great deal from one year to the next. Relying on a moving average number of fatalities over a five-year period provides a more stable picture of crash occurrence, and will make it easier to identify trends and establish a correlation between CDOT actions and observed performance.

A “fatality” is defined as a death that occurs within 30 days of a crash.

This is the only measure not converted to a grading scale. If a grading scale is preferred, a different type of safety measure is probably necessary.

2. Bridge Condition

Existing Measure: The percent of deck area on bridges and major culverts on the CDOT system that are classified as “good” or “fair” is currently tracked and reported by CDOT. Good/fair/poor are based on the following criteria:

- Poor – Sufficiency rating less than 50 and classified as SD (structurally deficient) and/or FO (functionally obsolete).
- Fair – Sufficiency rating between 50 and 80 and classified as SD or FO.
- Good – Do not meet the criteria for poor or fair.

Letter grades are based on the following thresholds for percent good or fair:

Grade  Threshold
A      90 – 100
B      80 – 89
C      70 – 79
D      60 – 69
F      < 60

Recommendation: The recommended modification to this measure is to convert the existing results to a letter grade.

Current Value: 94.5%

Value Based on Recommendation: A

Owner: Mark Nord

Required Data Items (Source: Pontis; Owner: Staff Bridge Branch):
- Structure ID
- Bridge region
- Bridge length
- Bridge width
- Bridge sufficiency rating
- Bridge SD/FO status

Unit of Measure: Grade

Calculation:

1. Calculate the deck area of each structure as follows:

   deck area = bridge length × bridge width

2. Determine which bridges are “good or fair” using the criteria described above.

3. Calculate percent good or fair as follows:

   percent good or fair = (Σ deck area of good or fair bridges / Σ deck area of all bridges) × 100

4. Convert the result to a letter grade using the thresholds defined above.
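A minimal sketch of steps 1 through 4, assuming each structure is represented by its deck dimensions and a good-or-fair flag derived from the criteria above (the data layout and sample inventory are hypothetical):

```python
def percent_deck_area_good_or_fair(bridges) -> float:
    """bridges: iterable of (length, width, good_or_fair) tuples."""
    total_area = sum(length * width for length, width, _ in bridges)
    good_area = sum(length * width for length, width, ok in bridges if ok)
    return 100.0 * good_area / total_area

# Hypothetical inventory: two good/fair structures and one poor one.
sample = [(120.0, 12.0, True), (80.0, 10.0, True), (60.0, 9.0, False)]
pct = percent_deck_area_good_or_fair(sample)
grade = "A" if pct >= 90 else "B" if pct >= 80 else "C" if pct >= 70 else "D" if pct >= 60 else "F"
print(round(pct, 1), grade)  # 80.6 B
```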

Update Frequency: Annual. (Data are collected biennially for each bridge; however, inspections are performed on a rolling basis, so it is recommended that the measure be reported annually.)

Additional Notes: The criteria for defining good/fair/poor are based on national standards used to determine eligibility for Federal bridge funds. Bridges in poor condition are eligible for Federal reconstruction funds. Bridges in fair condition are eligible for Federal rehabilitation funds. Bridges in good condition are not eligible for Federal funds.

Sufficiency rating is reported on a scale of 1 to 100. It is based on condition data and bridge characteristics related to serviceability and essentiality for public use. Structurally deficient (SD) status indicates that a bridge has a significant condition defect. Functionally obsolete (FO) status indicates a design issue, such as insufficient shoulder width. All three metrics are based on data collected biennially.

3. Pavement Condition

Existing Measure: The percent of miles of pavement classified as “good” or “fair” based on remaining service life (RSL) is currently tracked and reported by CDOT. RSL thresholds are as follows:

- Poor – RSL < 6
- Fair – 6 ≤ RSL ≤ 10
- Good – RSL ≥ 11

Recommendation: The recommended modification to this measure is to combine RSL with the percent of miles of pavement classified as “good” or “fair” based on IRI (International Roughness Index), and then convert the result to a letter grade. The recommended IRI thresholds are as follows:

- Poor – IRI > 170
- Fair – 95 ≤ IRI ≤ 170
- Good – IRI < 95

Letter grades are based on the following thresholds for percent good or fair:

Grade  Threshold
A      90 – 100
B      80 – 89
C      70 – 79
D      60 – 69
F      < 60

Current Value: 48% based on RSL

Value Based on Recommendation: F (at the time of this writing, IRI data were not available)

Owner: Stephen Henry

Required Data Items (Owner: Pavement Design Unit):
- Pavement segment identifier (Source: ADLP)
- Pavement segment region (Source: dTIMS)
- Pavement segment number of miles (Source: dTIMS)
- Pavement segment RSL (Source: ADLP)
- Pavement segment IRI (Source: Surface Condition Database)

Unit of Measure: Grade

Calculation:

1. Classify each segment as good, fair, or poor using the RSL thresholds defined above.

2. Calculate percent good or fair based on RSL as follows:

   percent RSL = (Σ miles good or fair based on RSL / Σ all miles) × 100

3. Classify each segment as good, fair, or poor using the IRI thresholds defined above.

4. Calculate percent good or fair based on IRI as follows:

   percent IRI = (Σ miles good or fair based on IRI / Σ all miles) × 100

5. Combine the results of step 2 and step 4, weighting each equally, as follows:

   percent total = 0.5 × percent RSL + 0.5 × percent IRI

6. Convert percent total to a letter grade using the thresholds defined above.
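A minimal sketch of steps 1 through 6, assuming each segment carries its mileage, RSL, and IRI values (the data layout and sample segments are hypothetical; the equal 50/50 weighting follows the proposal described in the notes below):

```python
def pavement_percent_good_or_fair(segments, w_rsl: float = 0.5, w_iri: float = 0.5) -> float:
    """segments: iterable of (miles, rsl, iri). Good or fair means
    RSL >= 6 or IRI <= 170, per the thresholds above."""
    total_miles = sum(miles for miles, _, _ in segments)
    rsl_ok = sum(miles for miles, rsl, _ in segments if rsl >= 6)
    iri_ok = sum(miles for miles, _, iri in segments if iri <= 170)
    return 100.0 * (w_rsl * rsl_ok + w_iri * iri_ok) / total_miles

# Hypothetical segments: (miles, RSL in years, IRI)
segs = [(10.0, 12, 80), (5.0, 4, 120), (5.0, 8, 200)]
print(pavement_percent_good_or_fair(segs))  # 75.0, which converts to a C
```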

Update Frequency: Annual

Additional Notes: RSL is defined as the number of years remaining until reconstruction is necessary. A pavement segment’s RSL is estimated based on pavement age, traffic volumes, climate zone, cracking distress, rut depth, and smoothness as measured by IRI.

IRI is a measurement of pavement smoothness, and reflects how the traveling public experiences a roadway. The recommended good/fair/poor thresholds for IRI are based on thresholds being considered by AASHTO and FHWA as part of a national performance monitoring program.

RSL is a holistic representation of pavement condition based upon age, environment, distress deterioration rates, and the user’s perspective as represented by IRI. Combining RSL with IRI elevates the importance of the user’s perspective. However, CDOT may want to adjust the weight associated with each component. The current proposal is to weight them both equally. If public surveys indicate a willingness to accept less-than-perfect pavement, RSL should be weighted higher than IRI.

ADLP is a tool maintained by the pavement group. Output from the ADLP is stored in an Access database and/or spreadsheet, and imported into the pavement management system.

4. Roadside Condition

Existing Measure: Nine measures are tracked and combined to provide an overall Maintenance Level of Service (LOS) grade at CDOT.

Recommendation: The recommended modification to the existing measure is to combine the LOS grades for the Roadside Facilities and Roadside Appearance maintenance programs into a new measure of roadside condition.

Current Value: Roadside Facilities B+ / Roadside Appearance B

Value Based on Recommendation: B+

Owner: B.J. McElroy

Required Data Items (Owner: Operations and Maintenance Division):
- Roadside Facilities LOS (Source: BPS module of SAP; source will be PBF, when complete)
- Roadside Appearance LOS (Source: BPS module of SAP; source will be PBF, when complete)
- Roadside Facilities expenditures from previous year (Source: BPS module of SAP)
- Roadside Appearance expenditures from previous year (Source: BPS module of SAP)

Unit of Measure: Grade

Calculation:

1. Convert the two LOS values from a grade to a numeric value using the following table:

Grade  Value
A+     98
A      95
A-     91
B+     88
B      85
B-     81
C+     78
C      75
C-     71
D+     68
D      65
D-     61
F+     58
F      55
F-     51

2. Calculate a weighted average of the two numeric values based on the expenditures in the previous year, as follows:

   weighted value = (Facilities value × Facilities expenditures + Appearance value × Appearance expenditures) / (Facilities expenditures + Appearance expenditures)

3. Convert the resulting numeric value to a letter grade using the following table:

Grade  Threshold
A+     ≥ 97
A      93 – 96
A-     90 – 92
B+     87 – 89
B      83 – 86
B-     80 – 82
C+     77 – 79
C      73 – 76
C-     70 – 72
D+     67 – 69
D      63 – 66
D-     60 – 62
F+     57 – 59
F      53 – 56
F-     < 53
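A minimal sketch of the three steps, assuming the two LOS grades and prior-year expenditures are available (the expenditure figures in the example are hypothetical, illustrative values):

```python
GRADE_TO_VALUE = {"A+": 98, "A": 95, "A-": 91, "B+": 88, "B": 85, "B-": 81,
                  "C+": 78, "C": 75, "C-": 71, "D+": 68, "D": 65, "D-": 61,
                  "F+": 58, "F": 55, "F-": 51}

# Floors for converting a numeric value back to a letter (step 3 table).
VALUE_TO_GRADE = [("A+", 97), ("A", 93), ("A-", 90), ("B+", 87), ("B", 83),
                  ("B-", 80), ("C+", 77), ("C", 73), ("C-", 70), ("D+", 67),
                  ("D", 63), ("D-", 60), ("F+", 57), ("F", 53)]

def roadside_condition(facilities_los, appearance_los, facilities_exp, appearance_exp):
    """Expenditure-weighted average of the two maintenance LOS grades."""
    value = (GRADE_TO_VALUE[facilities_los] * facilities_exp +
             GRADE_TO_VALUE[appearance_los] * appearance_exp) / (facilities_exp + appearance_exp)
    for grade, floor in VALUE_TO_GRADE:
        if value >= floor:
            return grade
    return "F-"

# Hypothetical expenditures, illustrative only:
print(roadside_condition("B+", "B", 7_000_000, 3_000_000))  # 88*0.7 + 85*0.3 = 87.1 -> B+
```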

Update Frequency: Annual

Additional Notes: LOS is based on the results of an annual visual inspection. Data is collected on a sampling basis, so that statewide results can be reported with a reasonable level of statistical confidence.

The LOS for the Roadside Facilities program accounts for the following items: drainage inlets and structures, drainage ditches, slopes, fencing, sound barriers, litter, debris, and sand on shoulders. The LOS for the Roadside Appearance program accounts for the following items: grass mowing, vegetation control, and landscape appearance.

CDOT currently reports maintenance LOS values using a grading scale that includes plusses and minuses. Although plusses and minuses are not included for the other measures, it is recommended that they be used for this measure to maintain consistency with current reporting practices.

5. Snow and Ice Control

Existing Measure: A letter grade reflecting the level of service (LOS) for snow and ice control is currently tracked and reported by CDOT.

Recommendation: No modification to this measure is recommended.

Current Value: C+

Value Based on Recommendation: C+

Owner: B.J. McElroy

Status: This measure is currently reported by CDOT.

Required Data Item: Snow and Ice Control LOS. Source: BPS module of SAP (source will be PBF, when complete). Owner: Operations and Maintenance Division.

Unit of Measure: Grade

Calculation: No additional calculation is necessary. The measure can be reported directly from SAP.

Update Frequency: Annual

Additional Notes: See the notes on LOS above.

6. Roadway Congestion

Existing Measure: The minutes of delay per traveler on congested state highway segments is currently tracked and reported by CDOT.

Recommendation: It is recommended that three congestion measures be reported, as follows:

1. Extent of Congestion – Assign a letter grade based on the percent of person miles traveled (PMT) on all corridors that are considered congested/uncongested. Letter grades are based on the thresholds below for percent uncongested.

2. Delay Per Traveler (annual hours) – Report the annual hours of delay per traveler, where delay is the difference between travel time at prevailing speeds and travel time at a reference speed, i.e., the free flow speed (FFS) or the posted speed limit (PSL).

3. Travel Time Reliability – Begin tracking a travel time reliability metric in coordination with the ITS Office. The recommended measure is the planning time index (the 95th percentile travel time divided by the free flow travel time).

For example, the extent-of-congestion measure would be reported as an A if more than 90% of all person miles traveled occur on uncongested roadways, and as an F if less than 60% of all person miles traveled occur on uncongested roadways. The measure is to be reported separately for urban and rural segments of the state.

The following details relate to the first measure; measures two and three need further development and coordination regarding data availability.

Letter grades are based on the following thresholds for percent uncongested:

Grade  Threshold
A      90 – 100
B      80 – 89
C      70 – 79
D      60 – 69
F      < 60

Current Value: Rural 8.4% congested / Urban 45.5% congested

Value Based on Recommendation: Rural – A / Urban – F

Owner: Mehdi Baziar

Required Data Items (Owner: Information Management Branch):
- Roadway segment ID (Source: TRAFFON)
- Roadway segment region/MPO area (Source: TRAFFON)
- Roadway segment vehicle miles traveled (VMT) (Source: TRAFFON)
- Average vehicle occupancy rate by region/MPO area (Source: TRAFFON)
- Roadway segment volume/capacity (V/C) ratio (Source: TRAFFON)
- Roadway segment urban/rural designation (Source: IRIS)

Unit of Measure: Grade

Calculation:

1. Determine urban and rural VMT and perform the remainder of the steps for each group separately.

2. Calculate PMT for each segment as follows: PMT = VMT × average vehicle occupancy rate.

3. Calculate percent PMT on uncongested rural roads as follows:

Percent uncongested (rural) = (Σ PMT on uncongested rural segments ÷ Σ PMT on all rural segments) × 100

Calculate percent PMT on uncongested urban roads in the same way:

Percent uncongested (urban) = (Σ PMT on uncongested urban segments ÷ Σ PMT on all urban segments) × 100

4. Convert the results to a letter grade using the thresholds defined above.

Update Frequency: Annual
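To make steps 2 through 4 concrete, the following is a minimal Python sketch of the Extent of Congestion calculation. The segment records and the vmt, occupancy, and congested field names are hypothetical illustrations, not the actual TRAFFON schema; the grade thresholds are those defined above.

```python
# Minimal sketch of the Extent of Congestion calculation (steps 2-4).
# Segment records and field names are hypothetical, not the TRAFFON schema.

def letter_grade(percent_uncongested: float) -> str:
    """Convert percent uncongested PMT to a letter grade (report thresholds)."""
    if percent_uncongested >= 90:
        return "A"
    if percent_uncongested >= 80:
        return "B"
    if percent_uncongested >= 70:
        return "C"
    if percent_uncongested >= 60:
        return "D"
    return "F"

def percent_pmt_uncongested(segments) -> float:
    """PMT = VMT x average vehicle occupancy; return the uncongested share of PMT."""
    total = sum(s["vmt"] * s["occupancy"] for s in segments)
    uncongested = sum(s["vmt"] * s["occupancy"] for s in segments if not s["congested"])
    return 100.0 * uncongested / total

rural_segments = [
    {"vmt": 900_000, "occupancy": 1.4, "congested": False},
    {"vmt": 100_000, "occupancy": 1.4, "congested": True},
]
pct = percent_pmt_uncongested(rural_segments)
print(f"{pct:.1f}% uncongested -> grade {letter_grade(pct)}")  # 90.0% -> A
```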


Additional Notes: In locations with permanent traffic counters, V/C is based on the 30th highest hour. In other locations, it is based on the highest peak hour.

CDOT needs to begin including system operation improvements such as managed lanes, variable speed limits, and adaptive traffic signalization in the planning process. Appropriate performance measures need to be adopted to monitor, measure, and track the success of these projects in terms of improving mobility and reducing congestion. The most appropriate measure to accomplish this is travel time reliability.

In a broader sense, reliability is a dimension or attribute of mobility and congestion. Traditionally, the dimensions of congestion have been spatial (how much of the system is congested? – represented by measure one above), temporal (how long does congestion last?), and severity-related (how much delay is there, or how low are travel speeds? – represented by measure two above). Reliability adds a fourth dimension: how does congestion change from day to day?

The recommended measure is the Planning Time Index: the 95th percentile Travel Time Index, that is, the 95th percentile travel time divided by the free flow travel time.

In general, reliability is defined as the variation in a performance measure over a specified period of time, whether speaking in terms of travel times, delay, stops, queues, speeds, or any other transportation performance measure. "Variability" and "reliability" are interchangeable terms and refer to performance, regardless of how performance is measured or predicted. More specifically, travel time reliability relates to how travel times for a given trip and time period perform over time. For measuring reliability, a "trip" can occur on a specific highway section or any subset of the transportation network, or can be broadened to include a traveler's initial origin and final destination.

Reliability (more appropriately, unreliability) is caused by the interaction of the factors that influence travel times: fluctuations in demand, traffic control devices, traffic incidents, inclement weather, work zones, and physical capacity (based on prevailing geometrics and traffic patterns). From a measurement perspective, reliability is quantified from the distribution of travel times, for a given facility/trip and time slice, that occurs over a significant span of time; one year is generally long enough to capture nearly all of the variability caused by disruptions. A variety of different metrics can be computed once the travel time distribution has been established.
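As an illustration of how a Planning Time Index could be computed once the travel time distribution is established, the following is a minimal Python sketch; the travel time observations and the 30-minute free flow time are invented for the example.

```python
# Minimal sketch: Planning Time Index = 95th percentile travel time / free flow travel time.
# The observed travel times below are invented for illustration.
import numpy as np

def planning_time_index(travel_times_min, free_flow_min):
    """PTI for one facility and time slice from a distribution of travel times."""
    return float(np.percentile(travel_times_min, 95)) / free_flow_min

# e.g., one year of weekday peak-hour runs on a corridor with a 30-minute free flow time
rng = np.random.default_rng(0)
observed = 30 + rng.gamma(shape=2.0, scale=6.0, size=250)  # incident/weather delays skew the tail
pti = planning_time_index(observed, free_flow_min=30.0)
print(f"Planning Time Index: {pti:.2f}")  # roughly 1.9: plan ~90% extra time over free flow
```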

The Planning Time Index was calculated from actual data for the period from November 1, 2010 through October 31, 2011, covering I-70 from Denver International Airport to Vail in both directions. The values and maps showing peak hour Planning Time Indices are included in Appendix G.


Organizational Performance

7. On Time Construction

Existing Measure: The percent of construction projects delivered on time is currently tracked and reported by CDOT and included in the Chief Engineer's Objectives and the FHWA-CDOT stewardship agreement. CDOT is moving towards defining "on time" delivery as occurring when a project is completed within the period specified in the contract after the pre-construction conference.

Recommendation: The recommended modification to this measure is to assign a letter grade based on the thresholds below.

A potential alternative for this measure is to base it on information provided in the Statewide Transportation Improvement Program (STIP). Measuring against the STIP would enable CDOT to assess performance against the public's expectations, which are often established by the project timing information provided in the STIP. Options include using a project's original completion year, as first specified in the STIP, as the definition of on time; and assessing the degree to which progress has been made on a project in each year specified in the STIP.

Letter grades are based on the following thresholds for percent of on time projects:

Grade | Threshold
A | 90 – 100
B | 80 – 89
C | 70 – 79
D | 60 – 69
F | < 60

Current Value: 72.2%
Value Based on Recommendation: C
Owner: Scott McDaniel

Required Data Item | Source | Owner
Project identifier for projects completed in the previous quarter | SAP Projects System Module | TBD
Completion date for each project | SAP Projects System Module | TBD
Project completion date specified in the contract (period after the pre-construction conference) for each project | SAP Projects System Module | TBD

Unit of Measure: Grade


Calculation:

1. For each project completed in the previous quarter, flag it as "on time" if it was completed within the period specified in the contract after the pre-construction conference.

2. Calculate percent on time as follows:

Percent on time = (number of on-time projects ÷ number of projects completed) × 100

3. Convert the result to a letter grade using the thresholds defined above.

Update Frequency: Quarterly

Additional Notes: None

8. On Budget Construction

Existing Measure: The percent of construction projects delivered on budget is currently tracked and reported by CDOT and included in the Chief Engineer's Objectives and the FHWA-CDOT stewardship agreement. A project is considered to be "on budget" if it is completed within the project commitment amount.

Recommendation: The recommended modification to this measure is to weight the measure by project cost and assign a letter grade based on the thresholds below.

Letter grades are based on the following thresholds for percent of on budget projects:

Grade | Threshold
A | 90 – 100
B | 80 – 89
C | 70 – 79
D | 60 – 69
F | < 60

Current Value: 92.9%
Value Based on Recommendation: A
Owner: Scott McDaniel

Required Data Item | Source | Owner
Project identifier for projects completed in the previous quarter | SAP Projects System Module | TBD
Final construction cost for each project | SAP Projects System Module | TBD
Project commitment cost for each project | SAP Projects System Module | TBD

Unit of Measure: Grade

Calculation:

1. For each project completed in the previous quarter, flag it as "on budget" if the final cost was less than the project commitment amount.

2. Calculate the cost-weighted on budget percentage as follows:

Percent on budget = (Σ project costs of "on budget" projects ÷ Σ project costs of all completed projects) × 100

3. Convert the result to a letter grade using the thresholds defined above.

Update Frequency: Quarterly

Additional Notes: None
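The following is a minimal Python sketch of the cost-weighted calculation. The project records are invented, and using the commitment amount as the weighting cost is an assumption; the recommendation leaves the exact cost basis open.

```python
# Minimal sketch of the cost-weighted On Budget measure. Project records are
# invented; weighting by the commitment amount is an illustrative assumption.

def percent_on_budget_weighted(projects) -> float:
    """Share of completed-project cost delivered within the commitment amount."""
    total_cost = sum(p["commitment"] for p in projects)
    on_budget_cost = sum(p["commitment"] for p in projects
                         if p["final_cost"] <= p["commitment"])
    return 100.0 * on_budget_cost / total_cost

projects = [
    {"commitment": 10_000_000, "final_cost": 9_500_000},  # on budget
    {"commitment": 2_000_000, "final_cost": 2_300_000},   # over budget
    {"commitment": 8_000_000, "final_cost": 7_900_000},   # on budget
]
print(f"{percent_on_budget_weighted(projects):.1f}% on budget (cost-weighted)")  # 90.0%
```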

9. Strategic Action Item Implementation

Existing Measure: This is a new measure.

Recommendation: Assign a letter grade based on CDOT's progress implementing the policies and/or action items defined in its strategic plan or similar type of document. The implementation status for each item could be categorized based on the following scale:

• Complete – CDOT has sufficiently addressed the action item
• Advancing – Work on the action item is proceeding as planned
• Delayed – Work on the action item is delayed
• Dropped – The action item has been dropped and will no longer be implemented

Letter grades are based on the following thresholds for percent of items complete or advancing:

Grade | Threshold
A | 90 – 100
B | 80 – 89
C | 70 – 79
D | 60 – 69
F | < 60


Current Value: NA
Owner: Gary Vansuch

Required Data Item | Source | Owner
TBD | TBD | TBD

Unit of Measure: Grade

Calculation:

1. Identify priority policies and/or action items that should be tracked.

2. Categorize each item using the scale above.

3. Calculate the measure as follows:

Percent complete or advancing = (number of items complete or advancing ÷ total number of items tracked) × 100

4. Convert the result to a letter grade using the thresholds defined above.

Update Frequency: Quarterly

The following table summarizes the results of all measures.

Table 3-3 Summary of Results for 2010

# | Measure | Current Value | Value Based on Recommendation
1 | Number of fatalities | 1.01 per 100 million VMT | 542
2 | Bridge condition | 94.5% | A
3 | Pavement condition | 48% | F
4 | Roadside condition | B+ / B | B+
5 | Snow and ice control | C+ | C+
6 | Roadway congestion | Rural 8.4% congested / Urban 45.5% congested | A / F
7 | On time construction | 72.2% | C
8 | On budget construction | 92.9% | A
9 | Strategic action item implementation | NA | NA


3.5 DATA TO SUPPORT MEASURES

This section summarizes the data sources for the recommended measures. It also documents the approach used to determine initial values for each measure.

Table 3-4 Data Sources to Support Performance Measures

Measure | Database Source(s)
Number of Fatalities | TRAFDA
Bridge Condition | Pontis
Pavement Condition | ADLP, dTIMS, Surface Condition Database
Roadside Condition | BPS Module of SAP
Snow and Ice Control | BPS Module of SAP
Roadway Congestion (a) | TRAFFON, IRIS
On Time Construction | SAP Projects System Module
On Budget Construction | SAP Projects System Module
Strategic Action Item Implementation | TBD

a Measure one only.


Table 3-5 describes the data sources (reports) that were referenced to calculate the measures. CDOT typically reports measure results in an annual performance report.

Table 3-5 Report Sources

Measure | Data Source | Location
Number of Fatalities | FARS website | —
Bridge Condition | % Good/Fair available in CDOT Annual Performance Report | Reports for 2007 through 2010 available on CDOT website as PDF
Pavement Condition | % Good/Fair based on RSL available in CDOT Annual Performance Report; IRI not available | Reports for 2007 through 2010 available on CDOT website as PDF
Roadside Condition | Available in CDOT Annual Performance Report | Reports for 2007 through 2010 available on CDOT website as PDF
Snow and Ice Control | Available in CDOT Annual Performance Report | Reports for 2007 through 2010 available on CDOT website as PDF
Roadway Congestion | 2010-Urban-Rural-summary.xlsx | Provided by CDOT
On Time Construction | Chief Engineer's Objectives Final Report | Provided by CDOT
On Budget Construction | Chief Engineer's Objectives Final Report | Provided by CDOT
Strategic Action Item Implementation | TBD | TBD

The following diagram shows the general flow of data among CDOT systems and databases to generate the performance results.

Figure 3.2 Data Flow
[Figure: general flow of data among CDOT systems and databases]

3.6 INITIAL CALCULATIONS

This section describes the details involved in calculating an initial value for each measure. An accompanying spreadsheet was created and delivered to CDOT as part of this project. Figure 5.18 is a recommended dashboard design with measures calculated as described in this section. The dashboard illustrates the department's performance status at a glance.

Number of Fatalities

Gather Data

Fatality data was gathered from the NHTSA Fatality Analysis Reporting System (FARS) Encyclopedia at http://www-fars.nhtsa.dot.gov/Main/index.aspx.

Enter Into Spreadsheet

The number of fatalities was entered into an Excel spreadsheet along with the reporting year. Data was available from 1994 to 2009 at the time of this report.

Year | Number of Fatalities | Five-Year Average (Period)
1994 | 586 | –
1995 | 645 | –
1996 | 617 | –
1997 | 613 | –
1998 | 628 | 617.8 (1994–1998)
1999 | 626 | 625.8 (1995–1999)
2000 | 681 | 633.0 (1996–2000)
2001 | 741 | 657.8 (1997–2001)
2002 | 743 | 683.8 (1998–2002)
2003 | 642 | 686.6 (1999–2003)
2004 | 667 | 694.8 (2000–2004)
2005 | 606 | 679.8 (2001–2005)
2006 | 535 | 638.6 (2002–2006)
2007 | 554 | 600.8 (2003–2007)
2008 | 548 | 582.0 (2004–2008)
2009 | 465 | 541.6 (2005–2009)
2010 | – | – (2006–2010 pending)

Build Formula

A formula was built to average the first five years of fatality data. This formula was used to calculate rolling averages for all data.
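The rolling-average formula is straightforward to reproduce outside the spreadsheet. The following minimal Python sketch recomputes the five-year averages from the FARS counts tabulated above.

```python
# Minimal sketch: five-year rolling average of annual fatality counts.
# Values are the 1994-2009 FARS counts tabulated above.
fatalities = {
    1994: 586, 1995: 645, 1996: 617, 1997: 613, 1998: 628, 1999: 626,
    2000: 681, 2001: 741, 2002: 743, 2003: 642, 2004: 667, 2005: 606,
    2006: 535, 2007: 554, 2008: 548, 2009: 465,
}

def five_year_average(counts, end_year):
    """Average of the five years ending in end_year (e.g., 1998 -> 1994-1998)."""
    return sum(counts[y] for y in range(end_year - 4, end_year + 1)) / 5.0

for year in range(1998, 2010):
    print(f"{year - 4}-{year}: {five_year_average(fatalities, year):.1f}")
# 1994-1998: 617.8 ... 2005-2009: 541.6
```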

Plot Results

A line chart was created from the calculated data. This type of chart was chosen to display trends over time.

Bridge Condition

Gather Data

Bridge condition data was collected from the CDOT Annual Performance Reports for fiscal years 2007 through 2010. The data is available as a percentage of bridge deck area in good or fair condition.

Enter Into Spreadsheet

Percentages were entered into the spreadsheet along with the year of reporting.

Year | 2007 | 2008 | 2009 | 2010
Percent in "good" or "fair" condition | 94.8% | 93.8% | 94.4% | 94.5%
Percent in "poor" condition | 5.2% | 6.2% | 5.6% | 5.5%
Grade | A | A | A | A

Build Formula

A formula was built to calculate and return the letter grade associated with the percentage. The grades were based on the ten-point scale previously defined.

Plot Results

Pie charts were created from the percentages. This type of chart was chosen for comparison of parts to the whole.

Pavement Condition

Gather Data

The percentage of pavement in good or fair condition is available in the CDOT Annual Performance Reports. At the time of this writing, IRI data was not available; therefore, only the RSL data was used to generate the graph.

Enter Into Spreadsheet

Percentages were entered into the spreadsheet along with the year of reporting.

Build Formula

A formula was built to calculate and return the letter grade associated with the percentage. The grades were based on the ten-point scale previously defined.

Plot Results

Line charts were created from the percentages. This type of chart was chosen to display trends over time.

Roadside Condition

Gather Data

Roadside condition data was collected from the CDOT Annual Performance Reports for fiscal years 2007 through 2010. The data consists of grades and budgets for the roadside appearance and roadside facilities programs.

Enter Into Spreadsheet

Annual grades and amounts spent were entered into the spreadsheet. Grades were manually converted into percentages based on the scale in Section 3.4.

Year | 2007 | 2008 | 2009 | 2010
Roadside Facilities grade | B | A− | A− | B+
Roadside Appearance grade | C+ | B | B | B
Roadside Facilities $ | 19,600,000 | 18,500,000 | 22,300,000 | 19,000,000
Roadside Appearance $ | 7,500,000 | 7,400,000 | 8,300,000 | 8,000,000
Total $ | 27,100,000 | 25,900,000 | 30,600,000 | 27,000,000
Roadside Facilities % | 85% | 91% | 91% | 88%
Roadside Appearance % | 78% | 85% | 85% | 85%
Weighted average % | 83% | 89% | 89% | 87%
Grade | B− | B+ | B+ | B+

Build Formula

The calculation to arrive at the budget-weighted average was entered. Then the letter grade was assigned based on the scale.
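A minimal Python sketch of the budget-weighted calculation, using the FY 2007 percentages and budgets from the table above:

```python
# Minimal sketch: budget-weighted roadside grade for FY 2007, using the
# facilities/appearance percentages and budgets tabulated above.

def weighted_percent(components):
    """Budget-weighted average of program percentages."""
    total_budget = sum(budget for _, budget in components)
    return sum(pct * budget for pct, budget in components) / total_budget

fy2007 = [
    (85.0, 19_600_000),  # Roadside Facilities: 85%, $19.6M
    (78.0, 7_500_000),   # Roadside Appearance: 78%, $7.5M
]
print(f"{weighted_percent(fy2007):.0f}%")  # 83% -> B- on the scale in Section 3.4
```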

Plot Results

A bar chart was created from the average percentage.

Snow and Ice Control

Gather Data

Snow and ice control grades were collected from the CDOT Annual Performance Reports for fiscal years 2007 through 2010.

Roadway Congestion

Gather Data

Data was collected from 2010-Urban-Rural-summary.xlsx, which was provided by the Office of Transportation Development (Mehdi Baziar).

Enter Into Spreadsheet

Person miles traveled on corridors considered congested and uncongested was entered for urban and rural roads separately.

Year: 2010

URBAN
uncongested PMT | 32,591,598
congested PMT | 27,228,080
total PMT | 59,819,678
% uncongested | 54.5%
% congested | 45.5%
Grade | F

RURAL
uncongested PMT | 37,047,182
congested PMT | 3,403,623
total PMT | 40,450,805
% uncongested | 91.6%
% congested | 8.4%
Grade | A

Build Formula

The calculation to determine the congested and uncongested percentages was entered, and a letter grade was assigned based on the ten-point scale outlined in Section 3.4.

Plot Results

Pie charts were created from the percentages. This type of chart was chosen for comparison of parts to the whole.

On Time Construction

Gather Data

Data was gathered from the Chief Engineer's Objectives, FY 2011 Final Report.

Enter Into Spreadsheet

The percentage of on time construction was entered into the spreadsheet.

FY 2011

Quarter | Q1 | Q2 | Q3 | Q4 | Year to Date
Number of projects completed on time | 21 | 38 | 11 | 13 | 83
Number of projects completed | 27 | 46 | 14 | 18 | 105
On Time | 77.78% | 82.61% | 78.57% | 72.22% | 79.05%
Late | 22.22% | 17.39% | 21.43% | 27.78% | 20.95%
Grade | C | B | C | C | C


Build Formula

The formula to determine the percentage of on time construction was entered, along with the formula to assign grades.

Plot Results

Pie charts were developed to show the percentages of on time and late construction.

On Budget Construction

Gather Data

Data was gathered from the Chief Engineer's Objectives, FY 2011 Final Report. Only the fourth quarter and year-to-date data was available from this source.

Enter Into Spreadsheet

The percentage of under budget construction was entered into the spreadsheet.

Quarter | Q4 | Year to Date
Number of projects completed on budget | 13 | 96
Number of projects completed | 14 | 119
Under Budget | 92.9% | 80.7%
Over Budget | 7.1% | 19.3%
Grade | A | B

Build Formula

A formula to calculate the percentage of construction delivered on or under budget was entered. A grade was determined from the resulting percentage.

Plot Results

Pie charts were developed to show the percentages of under budget and over budget construction.

Strategic Action Item Implementation

Gather Data

This is a new measure, so mock data was used to illustrate results.

Enter Into Spreadsheet

Percentages were entered for action items complete, advancing, delayed, and dropped, totaling 100% for each reporting year.

Year | 2007 | 2008 | 2009 | 2010
Percent of strategic plan policies "complete" or "advancing" | 60% | 75% | 80% | 95%
Grade | D | C | B | A
Complete | 30% | 35% | 40% | 50%
Advancing | 30% | 40% | 40% | 45%
Delayed | 20% | 15% | 10% | 5%
Dropped | 20% | 10% | 10% | 0%
Total | 100% | 100% | 100% | 100%

Build Formula

A letter grade was assigned based on the sum of the complete and advancing percentages.

Plot Results

Pie charts were developed showing the percentages of action item implementation.

3.7 DATA INVENTORY

A data inventory assessment was completed as part of this project for the data to support the recommended measures. The recommended steps to complete a data inventory are listed in Section 4.3 (Priority 3). They are summarized below with notes describing how each step was accomplished specifically for the recommended performance measures.

1. All priority data sets to support key department business – Key business needs include performance reporting and asset management.

The data sets were identified by referencing the performance measures matrix in Appendix B and by verifying data availability and quality through interviews with data owners.

2. Data owners, stewards, stakeholders, communities of interest, working groups, etc., and roles/responsibilities for all.

Data owners were identified based on the matrix in Appendix B and as suggested by CDOT.

3. A data catalog detailing all data programs, sources, business owners, requestors of data, data definitions, data standards, metadata standards, formats, data models, and identification of IT and business subject matter experts who may be contacted regarding information about the data programs, along with instructions for accessing the data standards and definitions used with each data program. The catalog should be updated regularly (every two to three years).

A data catalog was created for the nine recommended core performance measures. It is contained in Section 3.4. The following summarizes the data catalog specifically for the nine core measures:

Table 3-6 Data Catalog

1. Number of Fatalities
Sources: TRAFDA
Business Owner: Rahim Marandi
Requestors of Data: TBD
Data Definitions: "Fatalities" – a fatality that occurs within 30 days of a crash.
Data Standards: TBD | Metadata Standards: TBD | Format: Count | Data Models: TBD
IT & Business Subject Matter Experts: Safety and Traffic Engineering

2. Bridge Condition
Sources: Pontis
Business Owner: Mark Nord
Requestors of Data: TBD
Data Definitions: Poor – sufficiency rating less than 50 and classified as structurally deficient or functionally obsolete. Fair – sufficiency rating between 50 and 80 and classified as structurally deficient or functionally obsolete. Good – does not meet the criteria for poor or fair.
Data Standards: TBD | Metadata Standards: TBD | Format: Grade | Data Models: TBD
IT & Business Subject Matter Experts: Staff Bridge Branch

3. Pavement Condition
Sources: ADLP, dTIMS, Surface Condition Database
Business Owner: Stephen Henry
Requestors of Data: TBD
Data Definitions: RSL – the number of years remaining until reconstruction is necessary. IRI – a measurement of pavement smoothness, which reflects how the traveling public experiences a roadway.
Data Standards: TBD | Metadata Standards: TBD | Format: Grade | Data Models: TBD
IT & Business Subject Matter Experts: Pavement Design Unit

4. Roadside Condition
Sources: BPS Module of SAP (source will be PBF, when complete)
Business Owner: B.J. McElroy
Requestors of Data: TBD
Data Definitions: LOS is based on the results of an annual visual inspection. The LOS for the Roadside Facilities program accounts for the following items: drainage inlets and structures, drainage ditches, slopes, fencing, sound barriers, litter, debris, and sand on shoulders. The LOS for the Roadside Appearance program accounts for the following items: grass mowing, vegetation control, and landscape appearance.
Data Standards: TBD | Metadata Standards: TBD | Format: Grade | Data Models: TBD
IT & Business Subject Matter Experts: Operations and Maintenance Division

5. Snow and Ice Control
Sources: BPS Module of SAP (source will be PBF, when complete)
Business Owner: B.J. McElroy
Requestors of Data: TBD
Data Definitions: LOS is based on the results of an annual visual inspection.
Data Standards: TBD | Metadata Standards: TBD | Format: Grade | Data Models: TBD
IT & Business Subject Matter Experts: Operations and Maintenance Division

6. Roadway Congestion
Sources: TRAFFON, IRIS
Business Owner: Mehdi Baziar
Requestors of Data: TBD
Data Definitions: Reliability – the variation in travel time, delay, stops, queues, speeds, or any other transportation performance measure over a specified period of time.
Data Standards: TBD | Metadata Standards: TBD | Format: Grade | Data Models: TBD
IT & Business Subject Matter Experts: Information Management Branch

7. On Time Construction
Sources: SAP Projects System Module
Business Owner: Scott McDaniel
Requestors of Data: TBD
Data Definitions: A project is considered to be "on time" if it is completed within the period specified in the contract after the pre-construction conference.
Data Standards: TBD | Metadata Standards: TBD | Format: Grade | Data Models: TBD
IT & Business Subject Matter Experts: TBD

8. On Budget Construction
Sources: SAP Projects System Module
Business Owner: Scott McDaniel
Requestors of Data: TBD
Data Definitions: A project is considered to be "on budget" if it is completed within the project commitment amount.
Data Standards: TBD | Metadata Standards: TBD | Format: Grade | Data Models: TBD
IT & Business Subject Matter Experts: TBD

9. Strategic Action Item Implementation
Sources: TBD
Business Owner: Gary Vansuch
Requestors of Data: TBD
Data Definitions: Complete – CDOT has sufficiently addressed the action item. Advancing – Work on the action item is proceeding as planned. Delayed – Work on the action item is delayed. Dropped – The action item has been dropped and will no longer be implemented.
Data Standards: TBD | Metadata Standards: TBD | Format: Grade | Data Models: TBD
IT & Business Subject Matter Experts: TBD


A data inventory should also include an assessment of quality. There are many issues surrounding data quality, including accuracy, completeness, validity, availability, timeliness, and coverage. Other considerations for converting the data into information and performance measures include archiving/aggregation, analysis, delivery, and access. The table below documents a review of the quality issues for each measure. Yes (Y) indicates that the data meet the criteria for that measure; for example, the accuracy/validity and availability of the data to support the number of fatalities measure are satisfactory. No (N) indicates that more work is required to bring the data up to the standard needed to report on the measure. Each No (N) is accompanied by a numbered note; the notes are included beneath the table.

Table 3-7 Data Quality Issues for Measures

Characteristic | Number of Fatalities | Bridge Condition | Pavement Condition | Roadside Condition | Snow and Ice Removal | Roadway Congestion | On Time Construction | On Budget Construction | Strategic Plan Implementation
Accuracy, Validity, Availability | Y | Y | Y | Y | Y | N (1) | N (2) | N (3) | N (4)
Completeness | Y | Y | Y | Y | Y | N (5) | N (6) | N (7) | N (8)
Timeliness | N (9) | Y | Y | Y | Y | N (10) | Y | Y | N (11)
Coverage | Y | Y | Y | Y | Y | N (12) | Y | Y | N (13)
Archiving/Aggregation | Y | Y | Y | Y | Y | N (14) | Y | Y | N (15)
Delivery/Access | Y | Y | Y | Y | Y | N (16) | Y | Y | Y

Notes:

1. More work is required to verify the availability and accuracy of the data (speeds and delay on arterials and freeways) to support the delay and travel time reliability measures.
2. There were questions among data providers regarding a consistent source for this measure.
3. There were questions among data providers regarding a consistent source for this measure.
4. Data is not yet available for this measure.
5. Data for arterials for percent congested is required. Data is still needed for the other congestion measures.
6. Data may not be available for all projects.
7. Data may not be available for all projects.
8. Data is not yet available for this measure.
9. Data on fatalities could be processed in a more timely manner.
10. Details regarding how often data is to be reported need to be discussed and agreed upon.
11. Data is not yet available for this measure.
12. Details regarding coverage (which facilities are to be reported) still need to be discussed and agreed upon.
13. Data is still needed to support these measures.
14. Archiving of speed data should be accomplished by the ITS/Operations office.
15. Data is not yet available for this measure.
16. Access should be through the dashboard.

4. A business case for every critical data program.

This step was not accomplished for the core measures because the business case for the data is essentially to report on performance.

In general, data issues should be addressed with a data governance plan, which is discussed in more detail in Section 4.


3.8 ALIGNMENT WITH GOALS

It is critical that the measures align closely with CDOT's vision, mission, values, and goals/objectives. The following figure shows that connection.

Figure 3.3 CDOT Mission
[Figure: alignment of the recommended measures with CDOT's vision, mission, values, and goals/objectives]

The goals and objectives for each investment category are listed in Appendix E. The supporting measures are contained in Appendix B, and they are color-coded to coincide with the investment categories.


The figure above could be expanded to also include the public budget categories: maintain, maximize, expand, deliver, pass-through/multimodal, and Transportation Commission contingency/debt.

The recommended measures could be mapped to these categories as follows:

Maintain – bridge condition; pavement condition; roadside condition; snow and ice control; roadway congestion
Maximize – roadway congestion
Expand – roadway congestion; on time construction; on budget construction; strategic action item implementation
Deliver – on time construction; on budget construction; strategic action item implementation
Pass-through/multimodal – none
Transportation Commission contingency/debt – on time construction; on budget construction

3.9 POTENTIAL ENHANCEMENTS

This section presents opportunities for improving the CDOT practices categorized in Table 3.1 as low or medium relative to best practices.

Initiatives

1. Clearly define the terms "strategic initiatives" and "strategies." Develop or select a reporting mechanism for communicating CDOT's strategic initiatives and tracking their implementation. The initiatives should support the vision, mission, goals, and objectives, and represent the specific policies or procedures required to achieve them. The Strategic Plan currently contains strategies, but it is recommended that CDOT update them to ensure that they are concrete, actionable, and trackable.

Measures

2. Continue to explore options for measuring progress in the multimodal and quality of life aspects of the vision and mission. Of these two, the multimodal area lends itself better to quantitative measures. Developing these measures would require clarification of the role of CDOT within each mode, and of which aspects of multimodalism to track. Example measures include access to other modes (e.g., percent of population within a half mile of a bike/ped facility or transit service), coverage (e.g., pairs of employment centers connected by a bike/ped facility or transit service), operational performance (e.g., transit delay), and improvements to modal connectors (e.g., access to airports or freight rail facilities).


It also is possible to address quality of life through specific measures. However, this approach can be difficult given the subjective nature of the term. Another approach is to document the relationship between this priority and existing measures. For example, CDOT could explain the role of mobility in improving quality of life.

3. Building from the existing inventory of performance measures, look for measures that are not used or no longer used, and remove them from the performance measurement systems.

4. Begin reporting on travel time reliability as an additional mobility measure. Example results and maps are included in Appendix G.

Targets

5. Either 1) revisit the long-term, aspirational goals to ensure that they reflect relative priorities between competing needs and that they are attainable; or 2) develop a new set of mid-range targets that meet these criteria and provide a link between year-to-year performance progress and the aspirational goals. The target-setting exercise could be conducted by CDOT's Statewide Planning Unit as part of a long range planning or strategic planning process, to help ensure that short-term resource allocation decisions support the longer term view of the transportation system defined in the long range plan.

6. Add an indication of movement towards the long-term targets in the Annual Performance Report. For example, combine the current red/yellow/green scale with an arrow indicating whether the measure is trending towards the long-term target. A green arrow pointing upwards could indicate that the annual target was achieved and that performance is improving from previous years.

Using Measures in Decision-making

7. Evaluate opportunities to further integrate performance measures (current and future projections) into decision-making processes. Examples include: 1) using measures to support the allocation of funds across budget categories and/or regions (building from current practices of evaluating funding and performance scenarios for the pavement, bridge, and maintenance programs); 2) using measures to support the prioritization of specific projects within categories; and 3) using performance results to identify internal strategic action items.

8. Document the basis for resource allocation decisions in the Long Range Plan, the Revenue Forecast and Resource Allocation document, and/or the STIP Development Guidelines.

9. Work with senior management to further promote a culture of performance. Examples include: consistently communicate a strategic direction for the agency; explain how a small set of measures reflects this direction; refer to performance reports in management meetings; identify which performance targets each CDOT branch plays a role in achieving; and, in addition to identifying a reporting owner for each measure, identify an owner responsible for achieving the target value. Research has shown that when agencies (such as CDOT) identify a performance management champion, staff in other offices may feel that they are off the hook because someone else is responsible for performance management. In reality, the performance champion is responsible for ensuring that accurate and current performance data is provided in a manner that enables other offices to use it to make better decisions.

Report Format and Content

10. Work to improve consistency among the various CDOT reports that contain performance information. For example, develop boilerplate text that defines the purpose, content, and use of each report, and include this text on the CDOT web site and at the front of each document. Other options for improving consistency include:

a. Add a discussion of CDOT's vision, mission, goals, and objectives to each document.

b. Establish an overall message for the agency (for example, a message that reflects agency priorities, what is being done to address these priorities, and what needs to be done) and ensure that this message is woven throughout each document.

c. To the extent possible, modify the content of the reports so that they use the same measures. When the content is mandated, work with the mandating organizations to update the list of required measures based on the agency's current priorities. For example, the Annual Performance Report contains a measure called "on-time performance for buses on U.S. 36." Ensure that this measure still reflects a current priority; if not, remove it from the report.

11. Identify a "performance report of record" that represents CDOT's vision of what an annual performance report should look like and contain. Ideally, CDOT could work to turn one of the existing reports into this document, rather than developing a new one. (Based solely on document names, the Annual Performance Report appears to serve this role now.) Once established, refer to this report as much as possible during internal and external discussions. While other reports will likely still be required, they could be described in terms of addressing a particular mandate, rather than reflecting CDOT's strategic direction.

12. Provide on-line access to the information in the "performance report of record" identified in item 11. On-line dashboards provide easy access to key performance data and enable users to drill down into specific performance areas. Agencies have found that dashboards can increase the visibility of performance efforts, which in turn helps to integrate measures into an agency's culture (see item 9 above), increase accountability to external partners, and ensure the longevity of performance efforts. For more information regarding effective dashboard design, refer to Section 5 of this report.


4.0 Data Governance Plan

This section recommends an approach for data governance within CDOT. It discusses opportunities for data governance at CDOT and documents recommendations concerning how a data governance committee could best assist the department in reporting on performance.

In evaluating data issues surrounding performance reporting, it quickly becomes apparent that CDOT could benefit from a broader-scale data governance approach. This section documents the observations and recommendations made related to data management and governance at CDOT. The recommendations cover data governance to support the performance measurement/reporting process and CDOT business in general.

This section is organized as follows:

• 4.1 Best Practices – Review of other existing Data Governance Plans
• 4.2 Assessment of CDOT – State of Data Governance at CDOT
• 4.3 Recommendations

4.1 BEST PRACTICES

The application of data governance in a data management plan is critical, and the benefits are numerous. From a policy standpoint, data governance promotes the understanding of data as a valuable asset to the organization and encourages the understanding and management of data from both a technical and a business perspective. On a practical level, data governance provides for enterprise access to data standards and metadata. It provides a central focus for identifying and controlling the collection, storage, and sharing of data. Improved sharing of data within CDOT can result in cost savings associated with data collection and integration. Data governance also is very important from an IT perspective. It can reduce redundancy in the maintenance of data systems, ensure that data quality is addressed closest to the source of data collection, and provide opportunities to implement new and improved technologies for use in data programs. It also provides flexibility in responding to changes in reporting requirements.

A data governance framework documents key business programs and data systems, and defines roles and responsibilities for data owners, data stewards, and data custodians. Organizations that have successfully implemented a data governance framework tend to have the following factors in common:

• Strong executive leadership;
• Partnership between the IT and business units of the organization;
• Effective communication between the various communities of interest (COI) regarding the data and associated application systems that are used to collect, maintain, and report information;
• Published definitions and standards for source data, metadata, and data used in the data marts for creating reports from the various data systems; and
• Use of a knowledge management system to document work processes, data dictionaries, data models, etc.

There also are many documented obstacles to implementing successful data governance. They include:

• The culture change required to adapt the organization;
• Resistance to migration of data from silos to an enterprise management system; and
• Lack of funding or available resources.

State DOTs face many challenges to establishing formal data governance policies and procedures. Data governance models for State DOTs are relatively new, emerging in response to improved practices for collecting, analyzing, sharing, and disseminating data for the purposes of asset management, performance reporting, resource allocation, and decision-making.

The following definitions were developed previously by Cambridge Systematics for inclusion in Target-Setting Methods and Data Management to Support Performance-Based Resource Allocation by Transportation Agencies (NCHRP Report 666).

Data management is defined as the development, execution, and oversight of architectures, policies, practices, and procedures to manage the information life cycle needs of an enterprise in an effective manner as it pertains to data collection, storage, security, data inventory, analysis, quality control, reporting, and visualization.

Data governance is defined as the execution and enforcement of authority over the management of data assets and the performance of data functions.

Organizations have different strategies for their approach to data governance. Data governance defines how organizations coordinate the strategic management of their data and information resources. This includes establishing clear roles, responsibilities, and authorities through various committees and work structures. These committees may range from an executive steering group to operating unit information systems strategy groups, IT strategy groups, application or technical management groups, and service management groups. Data governance steers the organization and defines the role that top management plays in information management planning, ensuring a fit between information management and the strategy of the organization, improving communication between top management and middle management, and influencing user attitudes about information management practices.


Data stewardship is defined as the formalization of accountability for the management of data resources. Data stewardship is a role performed by individuals within an organization known as data stewards.

A data program refers to specific data systems that support a business area of the organization. The "program" usually includes the functions of data collection, analysis, and reporting. In the case of a DOT, examples of these programs include traffic, roadway inventory, safety, and pavement data.

A hierarchical relationship exists between data management, data governance, and data stewardship, as illustrated in Figure 4.1.

Figure 4.1 Data Management, Data Governance, and Data Stewardship
[Figure: nested hierarchy – Data Management contains Data Governance (DG Board, Stakeholders, DG Maturity Model), which contains Data Stewardship (Stewards, Owners, Custodians)]

Source: Modified from Figure 1, Data Governance Team, The Data Governance Maturity Model, White Paper, RCG Information Technology, 2008.

In addition to those who collect and provide data, there are users of the data, known as stakeholders. These stakeholders form a Community of Interest (COI) for the data system. The COIs serve a vital role by identifying needs for data and information and helping to determine where gaps exist in data programs. Data/information "owners" who create the information do not necessarily have an interest in (and, in fact, may actively resist) data sharing and dissemination. Information managers, by contrast, understand that: 1) their information only has value if it is accessible; and 2) accessibility results from syndication through the richest possible variety of information streams.


Several state DOTs have embarked on data management plans, and data governance strategies in particular. They include Washington State, Minnesota, Alaska, Virginia, Georgia, California, and Colorado.

Best practices that are particularly relevant to CDOT are described briefly below.

Virginia DOT – Communities of Interest

One of the key first steps in data governance is to establish clear roles for stakeholders. Virginia DOT assigned the following roles in the development of its data business plan:

• Data Steward – Owns the data business plan and its associated processes
• Data Architect and Data Coordinator – Designated by the data steward to carry out the directorate-wide business data stewardship functions
• Business Owner – Responsible for a data product
• Data Custodian – Assigned by the business owner to ensure that data services are provided in the most effective way
• Community of Interest – The COI at Virginia DOT is the group of people who either use the product directly or depend on its results

More information related to VDOT can be found in the NCHRP Report 666 and in Alaska DOT&PF Data Governance, Standards and Knowledge Management (September 2009).

Minnesota DOT – User Survey to Prioritize Data Needs

A web survey could be used as a precursor to the more detailed interviews to prioritize data needs. Minnesota DOT used a web survey; examples of the questions used are provided below.

MNDOT User Survey of Data and Information Priorities

Purpose

The Minnesota Department of Transportation (MNDOT) is using a multi-step approach for implementation of a strategic data business plan to guide the management of data programs at the department. This plan will be used to meet strategic business objectives and provide information in four core business emphasis areas: safety, mobility, preservation, and support.

In support of the data business plan, this survey instrument is used to gain insight into the data and priority information needs as identified by the users of the core data programs. These users are members of the Communities of Interest (COIs) representing the core programs from the following data emphasis areas:

• Planning and Project Development
• Produce
• Operate/Maintain
• Support

The results of this survey will help MNDOT identify priorities for addressing the gaps and for improving current data programs in support of the Department's Data Business Plan.

Business Areas

1. What business area are you primarily associated with? (Safety, Mobility, Preservation, Support)
2. Are you also associated with other business areas? Which ones?
3. What are the business objectives for your business area?
4. What are the priority business decisions made in your area?

Role of Data

1. What data systems are used to support those business decisions?
2. For each of the data systems previously identified, please rate the system's ability to meet each of the following criteria as High (meets the criterion most of the time), Medium (meets the criterion some of the time), or Low (does not meet the criterion):
   a) Accuracy – degree to which data is free from error
   b) Completeness – degree to which data values exist in the data system
   c) Timeliness – degree to which data is available when required
   d) Validity – degree to which data is in the domain of acceptable data values
   e) Coverage – degree to which sample data accurately represents the entire set of data
   f) Accessibility – degree to which data is easily retrievable
   g) Overall – degree to which the data system meets all of the criteria listed above
3. For the systems that do not meet the Overall criterion for supporting information and business decisions, what recommendations do you have to improve the system?
4. Is the data for the system collected by the DOT, by a contractor, or in partnership with other agencies?
5. Do quality assurance procedures exist for the collection of data for the respective systems? (Assessment of validity, accuracy, timeliness, completeness, etc.)
6. How is the quality of the data affecting your ability to make business decisions?
7. Do you know whether the data systems you identified have metadata available to you that explains what the data is used for?
8. Does the data system have established targets and measures as part of a performance management process?
9. How is the data accessed? (application systems, intranet, web, other)
10. How is the data reported? (published reports, intranet, web, other)
11. How is the data stored? (PC, data warehouse, legacy system)

Gaps in Data and Information

1. Have you identified any areas where data is lacking to support business decisions?
2. Are you aware of any planned changes to existing data systems, or plans to develop new data systems, that will address the "gaps in data" issues?
3. Do you have any additional recommendations to improve the data systems that support business operations?
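Ratings on the High/Medium/Low scale lend themselves to simple tallies that reveal which criteria a data system fails most often, which is typically how gaps get surfaced for prioritization. The sketch below shows one plausible way to aggregate such responses; the system name, ratings, and record layout are illustrative assumptions, not actual MNDOT survey data.

```python
from collections import Counter

# Criteria from question 2 of the survey above.
CRITERIA = ["Accuracy", "Completeness", "Timeliness",
            "Validity", "Coverage", "Accessibility"]

# Hypothetical responses; each dict is one respondent's rating of one system.
responses = [
    {"system": "Crash Records", "Accuracy": "High", "Completeness": "Medium",
     "Timeliness": "Low", "Validity": "High", "Coverage": "Medium",
     "Accessibility": "Low"},
    {"system": "Crash Records", "Accuracy": "Medium", "Completeness": "Medium",
     "Timeliness": "Low", "Validity": "High", "Coverage": "High",
     "Accessibility": "Medium"},
]

def tally(responses, system):
    """Count High/Medium/Low ratings per criterion for one data system."""
    counts = {c: Counter() for c in CRITERIA}
    for r in responses:
        if r["system"] == system:
            for c in CRITERIA:
                counts[c][r[c]] += 1
    return counts

for criterion, counter in tally(responses, "Crash Records").items():
    print(f"{criterion:14s} {dict(counter)}")
```

Criteria whose tallies are dominated by "Low" (Timeliness in this invented example) become the natural candidates for the follow-up interviews.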

Minnesota DOT – Assessing State of Data Programs

As part of developing its data business plan, Minnesota DOT sponsored a white paper recommending best practices for assessing the state of its data programs. The white paper documents methods, tools, and procedures that can be used to 1) assess the ability of existing data and information programs to meet user business needs, 2) determine existing and anticipated gaps in the programs, and 3) examine how access to information can be enhanced. Additional information can be obtained from MNDOT.

Alaska DOT – Data Governance Manual

Alaska DOT created a data governance manual to provide guidance in implementing data governance structures. The manual contains the roles and responsibilities of all stakeholders and a data governance charter. Additional information can be found in NCHRP Report 666.

District of Columbia – Data Catalog

The District of Columbia developed a comprehensive catalog of operational data with a browsing feature for searching types of data. For more information, see http://data.octo.dc.gov/.


Washington State Department of Transportation – Data Stewardship Council

WSDOT has a data council and a data stewardship council to help support data governance at the agency. WSDOT further defines two categories of data stewardship: business stewardship and technical stewardship. The business stewards are executive, managerial, and operational stewards, while the technical stewards include the more traditional roles of system architects and database administrators. For more information, see reference 1.

Data Management Principles from Michigan, Virginia, and Alaska DOTs

Following is a summary of key data management principles identified by these states during their data governance/management initiatives:

- Create data once, store it once, and use it many times.
- Define data from an enterprise perspective, so that it is sharable across the department.
- Make data available, allowing access to those who need it, and create a "published" data dictionary available through a web site or other means.
- Establish data standards to reduce the time and cost of maintaining redundant data sources.
- Establish a data governance structure for the organization, with data stewardship roles that serve as liaisons with user groups, executives, IT, and COIs on data needs and emerging issues for programs such as safety, mobility, and preservation.
- Recognize that data is best managed by the people who use it; business people must define the data and services they need at all levels of the organization.
- Establish a formal process for communicating business needs and managing changes to existing data systems, to ensure that data programs continue to support business needs.

4.2 STATE OF DATA GOVERNANCE AT CDOT

This project included an assessment of the needs and gaps related to data governance to support performance reporting. The assessment was conducted through interviews and meetings with relevant stakeholders throughout CDOT. The stakeholder process is described in more detail in Section 2.

In addition to the stakeholder outreach, two informal yet revealing surveys produced observations that led to the recommendations included in Section 4.3.

The first informal survey was conducted with the Division of Transportation Development. The group evaluated the data management structure at CDOT using the Data Management Maturity Model Matrix from NCHRP Report 666. A maturity model is a framework describing aspects of an organization's development with respect to a certain process. It is a helpful tool for assessing where an organization stands in implementing certain processes, and it can also be used for benchmarking or to help an agency understand common concepts related to an issue or process. The maturity model matrix is shown below (Table 4-1). It can be used to assess an agency's status and to identify the next steps toward an ultimate goal state.


Table 4-1 Data Management Maturity Model Matrix

Level 0 – Ad Hoc
- Technology/Tools: No tools in place.
- People/Awareness: Not aware of the need for improved data management to support performance measurement processes.
- Institutional/Governance: No data governance in place.

Level 1 – Aware
- Technology/Tools: Planning for tools to support data management in some offices.
- People/Awareness: Aware of the need for improved data management to support performance measurement processes; no action has been taken.
- Institutional/Governance: Agency is discussing needs/plans for data governance.

Level 2 – Planning
- Technology/Tools: Planning for tools to support data management across the agency or for a specific office.
- People/Awareness: Aware of the need for improved data management; some steps have been taken within the agency to improve the technology or institutional setting to support data management in at least one office.
- Institutional/Governance: Some level of data program assessment and formulation of roles for data managers is underway in one or more offices of the agency.

Level 3 – Defined
- Technology/Tools: Some tools implemented to support data management, but not widespread across the agency.
- People/Awareness: Aware of the need for improved data management; some steps have been taken within the agency to improve both the technology and the institutional setting to support data management in more than one office.
- Institutional/Governance: Data business planning underway, including development of a governance model for multiple offices in the agency.

Level 4 – Managed
- Technology/Tools: Widespread implementation of tools to support data management, but not integrated.
- People/Awareness: Aware of the need for improved data management; improvements are under way to improve both the technology and the institutional setting to support data management across the agency.
- Institutional/Governance: Data Business Plan developed, with the data assessment complete and the data governance structure defined.

Level 5 – Integrated
- Technology/Tools: Integrated, widespread implementation of tools to support data management and performance measurement.
- People/Awareness: Aware of the need for improved data management; technology and institutional processes are in place to support data management for performance measures.
- Institutional/Governance: Fully operational data governance structure in place.

Level 6 – Continuous Improvement
- Technology/Tools: Ongoing assessment of new technology to support and improve data management and performance measurement.
- People/Awareness: Agency is able to develop performance measures and predict outcomes for programs based on success with other programs.
- Institutional/Governance: Data governance structure fully supports data management activities across the agency.

Note: In the original matrix, a red line indicates CDOT's level in each category, based on the informal assessment.


The group generally agreed that CDOT scores as follows in each category:

- Technology/Tools – between Levels 3 and 4.
- People/Awareness – Level 2.
- Institutional/Governance – Level 1, which equates to "Agency is discussing needs/plans for data governance." The ultimate state for governance is Level 6, where the data governance structure fully supports data management activities across the agency.

The scores are shown in the table with a red solid line.
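The informal scoring can be captured as simple category-to-level data, which makes the distance to the goal state explicit. The sketch below encodes the scores above against the Level 6 goal; representing "between Levels 3 and 4" as 3.5 is our own convention, not part of the matrix.

```python
# Levels follow Table 4-1 (0 = Ad Hoc ... 6 = Continuous Improvement).
GOAL_LEVEL = 6  # ultimate state: continuous improvement

cdot_scores = {
    "Technology/Tools": 3.5,          # between Levels 3 and 4
    "People/Awareness": 2,
    "Institutional/Governance": 1,
}

# List categories from weakest to strongest, with the gap to the goal state.
for category, level in sorted(cdot_scores.items(), key=lambda kv: kv[1]):
    print(f"{category:26s} level {level}  gap to goal: {GOAL_LEVEL - level}")
```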

A second informal survey was conducted by the Performance and Policy Analysis Unit. The following questions were asked of eleven key stakeholders selected from the Planning, Finance, and Engineering Divisions.

Table 4-2 CDOT Multi-Asset Management Self-Assessment

Respondents rated each statement on a five-point scale: Strongly agree, Agree, Neutral, Disagree, Strongly disagree.

Policy Guidance
1. Policy goals and objectives reflect a holistic, long-term view of asset performance and cost.
2. The agency proactively helps to formulate effective asset management policy by working with elected officials and interest groups.
3. Policy formulation allows the agency latitude in arriving at performance-driven decisions on resource allocation.

Planning and Programming
4. Capital versus maintenance expenditure tradeoffs are explicitly considered in the preservation of assets like pavement and bridges.
5. Our agency periodically updates its planning and programming methods to keep abreast of current policy guidance, customer expectations, and critical performance criteria.
6. Criteria used to set program priorities, select projects, and allocate resources are consistent with stated policy objectives and defined performance measures.

Program Delivery
7. Our agency uses well-defined program delivery measures to track adherence to project scope, schedule, and budget.
8. When adding projects or changing project schedules, our agency considers effects on the delivery of other projects in the program or other potentially impacted programs.

Information and Analysis
9. Our agency has a complete and up-to-date inventory of our assets.
10. Our agency regularly collects information on our asset condition.
11. Null. Programmer error. (This item was not rated and is omitted from Figure 4.2.)
12. Our agency regularly collects information on the performance of our assets.
13. Agency managers and staff at different levels can quickly and conveniently obtain information they need about asset characteristics, location, usage, condition, or performance.
14. Our agency has established data standards to promote consistent treatment of existing asset-related data and guide development of future applications.
15. Information on changes in asset condition over time is used to improve forecasts of asset life and deterioration in our asset management systems.

Source: Questions selected from AASHTO's Transportation Asset Management Guide.

The respondents were asked to answer the questions in the context of both the current and the desired status at CDOT, where "1" corresponds to "strongly disagree" and "5" corresponds to "strongly agree." The results are shown in Figure 4.2.


Figure 4.2 Self-Assessment Results

[Bar chart comparing "Actual" and "Desired" ratings, on the 0–5 scale, for questions 1–10 and 12–15.]
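The actual-versus-desired format of Figure 4.2 lends itself to a simple gap ranking: subtract each actual rating from its desired rating and sort. The sketch below illustrates the computation with made-up ratings; the real per-question values appear only in Figure 4.2.

```python
# Hypothetical mean ratings on the 1-5 scale; keys are question numbers.
actual  = {1: 3.2, 2: 2.8, 9: 3.9, 14: 2.1}
actual_desired = {1: 4.5, 2: 4.2, 9: 4.8, 14: 4.4}

gaps = {q: actual_desired[q] - actual[q] for q in actual}

# Largest gaps first: the questions where current practice falls
# furthest short of the desired state.
for q, gap in sorted(gaps.items(), key=lambda kv: -kv[1]):
    print(f"Q{q:2d}: actual {actual[q]:.1f}, "
          f"desired {actual_desired[q]:.1f}, gap {gap:.1f}")
```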

The CS team agrees with the survey results presented above, although the surveys may be biased because they were based on feedback from a select group of CDOT employees. While CDOT management and staff clearly recognize the need for governance related to data collection and management, there is no consistent understanding of what is needed or how it would affect the department. The following gaps were identified through stakeholder interviews during this project:

- No single office is championing data management or data governance overall at CDOT.
- Terms are defined inconsistently, which results in misleading and inconsistent data reporting (a rate-normalization sketch follows this list). For example:
  – The definition of "injury" has changed over time, and "crash rates" are not calculated consistently (e.g., crashes per 100 million VMT and crashes per million VMT are both used).
  – "Fatalities" and "fatal crashes" have been used interchangeably by decision-makers, though they are tracked separately.
  – The term "lane miles" means at least four different things to four different units across CDOT. It is also calculated differently; for example, some units divide by twelve (assuming 12-foot lanes) and others divide by fourteen.
- Communication regarding data availability, sharing, and sources is not consistent.


- Methods of enacting and enforcing policy and procedure are not consistent at CDOT (and are possibly even ignored).
- There is a great deal of data at CDOT. The agency needs to report it more efficiently, using more standard, universal methods to avoid variations.
- There are opportunities to reduce redundancy in data collection (e.g., both Maintenance and Pavement collect the same data items). In some cases, several offices collect the same data; for example, it was recently discovered that several units were all going into the field to collect the same asset data (culverts and fences).
- CDOT needs a repository/library for information to reside in, along with standard ways to report.
- Data ownership is fragmented: different branches own different data, which makes it difficult to establish performance measures.
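As an illustration of the crash-rate inconsistency noted above, the sketch below computes the same underlying data under both conventions; the crash count and VMT figure are invented for the example, not CDOT statistics.

```python
def crash_rate(crashes: int, vmt: float, per: float = 1e8) -> float:
    """Crashes per `per` vehicle-miles traveled (default: per 100 million VMT)."""
    return crashes * per / vmt

crashes, vmt = 350, 4.2e9   # illustrative counts only

print(crash_rate(crashes, vmt))           # ~8.33 per 100 million VMT
print(crash_rate(crashes, vmt, per=1e6))  # ~0.0833 per million VMT
# Identical data, but the two conventions yield numbers that differ by a
# factor of 100 -- which is why an agreed definition matters.
```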

In summary, CDOT clearly needs an improved structure to reduce redundancy in data collection, clarify and standardize the terms and definitions used, clarify roles and responsibilities, and identify and prioritize stakeholder needs for data.

Improved governance will result in improved sharing of data and information; improved input to the performance reporting process; an improved ability to fund and prioritize new data programs; the ability to measure the right things at the appropriate level of detail; the capacity to take advantage of newer technology and tools; easier integration of data; and quality decisions based on informative data.

4.3 RECOMMENDATIONS AND TIMELINE

This project provides CDOT with a unique opportunity to recommend specific steps for carrying out successful data management through a data governance approach. The timing is ideal: the information technology management team (ITMT) has been engaged in recent discussions regarding improvements in data, including data governance and document management systems.

As indicated above, data governance is a method by which successful data management is accomplished; data governance essentially falls under the broader concept of data management. Some of the techniques being discussed by the ITMT (such as document management systems), for example, fall more into the realm of data management.

The recommendations in this section pertain to CDOT data management needs, with a focus on data governance. Additional recommendations related to data collection and data assessment are located in Appendix A.


Data Governance

The following recommendations are in priority order. A timeline and responsibility chart (Figure 4.3) provides more detail.

Priority 1 – Create a Formal Data Governance Structure

It is recommended that CDOT create a formal Data Governance Structure that includes:

- A high-level Data Governance Oversight Committee, which should be part of, or a reconfiguration of, the ITMT; and
- A Data Governance Working Group comprising data owners from all offices within CDOT.

The Committee and Working Group need authority and resources from the top of the organization, and a champion/leader for each group needs to be identified. Ideally, the leader of each group would be at a middle-management level and would be given at least 50 percent of his or her time to work on this initiative over the next year. Each leader would also need support staff to assist with meetings and coordination. The Working Group should comprise one representative from each business area, with at least one data owner assigned to each data program (e.g., safety, pavement, mobility, bridge, maintenance).

It is recommended that the ITMT take the lead in this task (with authority from the Executive Director of CDOT). The work should be accomplished in close coordination with the Performance and Policy Analysis Unit and other units in the Division of Transportation Development.

To have data management in place within the next two years, this task should be accomplished in months 1 and 2.

Priority 2 – Develop a Data Governance Charter

The Committee and Working Group need to work closely together to agree on a vision, mission, and objectives for data governance within CDOT, documented in a charter. Key considerations are the needs for data to support asset management and performance reporting. A sample charter from Minnesota is included in Appendix F.

An example mission could be: "The Data Governance Oversight Committee shall implement policies, procedures, and standards to be used in the management of data within CDOT in order to support the agency mission and goals."

Sample objectives include:

- To oversee the development and implementation of a Data Business Plan (the Plan would encompass Priorities 3 through 6, described below).
- To review and prioritize all new data collection/management efforts within CDOT.

The charter may also need to clarify the need for data management/governance. This could be accomplished with a more thorough application of the Data Management Maturity Model described in Section 4.2.

This task could be accomplished by the newly formed Data Governance Oversight Committee; it should take the Committee no more than two meetings, in month 2.

Priority 3 – Complete a Data Inventory/Assessment

A data inventory assessment for the data supporting the recommended measures has been completed as part of this project; it is described in Section 3.7. The tasks include the identification of:

- All priority data sets that support key department business. Key business needs include performance reporting and asset management.
- Data owners, stewards, stakeholders, communities of interest, working groups, etc., and the roles and responsibilities of each.
- A data catalog detailing all data programs, sources, business owners, requestors of data, data definitions, data standards, metadata standards, formats, data models, the IT and business subject matter experts who may be contacted for information about the data programs, and instructions for accessing the data standards and definitions used with each data program (a sketch of one catalog entry follows this list). The catalog should be updated regularly (every two to three years). A data catalog was created for the nine recommended core performance measures; it is documented in Section 3.7.
- A business terminology dictionary that aligns the use of business terms commonly used throughout the organization. This is particularly helpful to staff such as IT professionals, who are often responsible for developing applications to meet business needs.
- A business case for every critical data program.
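As referenced in the catalog bullet above, a catalog entry can be modeled as a simple structured record. The sketch below is a minimal illustration of such a record; the field names and the sample entry are assumptions for illustration, not a CDOT schema.

```python
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    """One data-program record; fields mirror the catalog items listed above."""
    program: str
    sources: list[str]
    business_owner: str
    definitions_url: str          # where data definitions/standards live
    metadata_standard: str
    data_format: str
    subject_matter_experts: list[str] = field(default_factory=list)
    last_reviewed: str = ""       # catalog should be refreshed every 2-3 years

# Illustrative entry only -- not an actual CDOT data program record.
pavement = CatalogEntry(
    program="Pavement Condition",
    sources=["annual condition survey"],
    business_owner="Materials Branch",
    definitions_url="intranet://data-catalog/pavement",
    metadata_standard="ISO 19115",
    data_format="Oracle tables",
)
print(pavement.program, "-", pavement.business_owner)
```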

This task could be accomplished methodically through the following steps:

Step 1 – Identify the business objectives of the agency (CDOT and business units).

Step 2 – Identify the business functions or services of the agency that support the business objectives.

Step 3 – Identify which business functions are supported by which data programs.

Step 4 – Establish policies, standards, and procedures that mandate how data is to be collected and used within the agency.


Step 5 – Establish data action plans, at both the data-program and enterprise levels, to address needs and gaps in data and information across the agency. This step is described in more detail in Appendix A, related to data assessment.

It is recommended that the Oversight Committee and Working Group assign a group to perform this analysis; outside assistance in the form of consulting services may be warranted. This step should take place from month 2 through month 5.
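Steps 1 through 3 amount to building a traceability mapping from business objectives through business functions to data programs. The sketch below expresses such a mapping as plain data structures; all objective, function, and program names are illustrative placeholders.

```python
# Step 1 output: objectives -> Step 2 output: supporting business functions.
objectives_to_functions = {
    "Improve safety": ["crash analysis", "HSIP project selection"],
    "Preserve assets": ["pavement management", "bridge inspection"],
}

# Step 3 output: business functions -> supporting data programs.
functions_to_programs = {
    "crash analysis": ["crash records"],
    "HSIP project selection": ["crash records", "roadway inventory"],
    "pavement management": ["pavement condition"],
    "bridge inspection": ["bridge inventory"],
}

# Trace each objective down to the data programs that ultimately support it.
for objective, functions in objectives_to_functions.items():
    programs = sorted({p for f in functions for p in functions_to_programs[f]})
    print(f"{objective}: {', '.join(programs)}")
```

A mapping like this makes it immediately visible which data programs carry the most business weight, and which objectives lack supporting data.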

Priority 4 – Develop and Adopt a Data Governance Procedure

The recommended procedure would include: a glossary of terms; a framework showing the relationship among the Data Governance Oversight Committee, the Working Group, and data owners within CDOT; a clear description of the roles and responsibilities of the Oversight Committee, the Working Group, data owners, data custodians, IT, data stakeholders, and data working groups (if necessary); and a process for making decisions related to the investment of resources in data/information projects within CDOT. For example, the procedure should require that the Procurement Office route all requests for new data collection efforts through the Data Governance Oversight Committee.

The procedure could be developed by the Working Group and approved by the Oversight Committee, and could be accomplished by month 4.

Priority 5 – Perform a Risk Assessment of Data Programs

Although this is listed as Priority 5, it is nevertheless an important component of the data governance effort. A risk assessment of data programs would help to demonstrate the value of data programs in terms of CDOT business needs. It would also make it possible to set priorities for addressing any negative impacts to the division, the agency, or external customers (including the public) due to the temporary unavailability of data systems or the catastrophic loss of strategic systems.

It is recommended that the risk management plan include the following (a sketch of a simple risk register follows this list):

- A list of risks associated with reductions in the quality, availability, timeliness, and coverage of the data that support business functions. The data assessment conducted under Priority 3 could serve as the basis.
- A risk value (low, medium, high) for each risk, evaluating the impact to the overall program if the data is no longer available or if access to the data is interrupted for a period of time.
- Costs, benefits, and return on investment (ROI) for each of the data program components.
- A consistent method for conducting such risk/benefit-cost/ROI analyses on a regular basis.
- A person or office responsible for handling each risk (the same as the data owner).
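A risk register implementing the elements above can start as a small structured list sorted by risk value for review. The sketch below is one minimal way to express it; the programs, risks, and owners shown are invented examples.

```python
from dataclasses import dataclass

IMPACT = {"low": 1, "medium": 2, "high": 3}

@dataclass
class DataProgramRisk:
    program: str
    risk: str       # what could degrade: quality, availability, timeliness...
    impact: str     # low / medium / high, per the plan above
    owner: str      # person/office responsible (same as the data owner)

register = [  # illustrative entries only
    DataProgramRisk("crash records", "collection backlog delays reporting",
                    "high", "Safety Office"),
    DataProgramRisk("pavement condition", "vendor contract lapse",
                    "medium", "Materials Branch"),
]

# Review the highest-impact risks first.
for r in sorted(register, key=lambda r: IMPACT[r.impact], reverse=True):
    print(f"[{r.impact.upper():6s}] {r.program}: {r.risk} (owner: {r.owner})")
```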


A demonstration project could be useful for showing the value of data and the possibilities that more data would open up (e.g., "if you had this data, this is what you could do"). This would help make the data program more of a priority at the top level.

Three excellent references for guidance in this area are Target-Setting Methods and Data Management to Support Performance-Based Resource Allocation by Transportation Agencies (NCHRP Report 666), Uses of Risk Management and Data Management to Support Target-Setting for Performance-Based Resource Allocation by Transportation Agencies (NCHRP Report 706), and the Alaska DOT&PF Data Governance White Paper.

It is recommended that the Oversight Committee and Working Group assign a group to perform this analysis; outside assistance in the form of consulting services may be warranted. This task should take place immediately following Priority 3 (the data assessment) and could be completed by month 10.

Priority 6 – Implement Ongoing Data Management and Governance at CDOT

This task includes developing training and manuals and implementing the process throughout CDOT. It is recommended that CDOT work to accomplish the following:

- Develop a Data Governance (DG) Manual that includes:
  – An introduction explaining the role of data governance at CDOT, including a defined policy for how data is to be collected, managed, and used at the agency.
  – The goals and objectives pertaining to the collection and use of data at CDOT.
  – A process to address non-compliance with the goals of data collection and use.
  – A plan for communicating with stakeholders to sustain support for the various programs, with continued outreach to all communities of interest to ensure that all needs are addressed.
  – Any new standards, policies, and procedures that will be enacted as a result of data governance at CDOT and in the regions.
  – An annual review, once the governance model is established, of the data programs for any needed enhancements or replacements with newer systems and technologies.
- Use data standards to:
  – Facilitate establishing targets and measures that meet agency goals.
  – Reduce the cost of multiple data collection efforts and the maintenance of duplicate databases; strive to collect data once and use it many times.


  – Facilitate consistent reporting of information.
- Develop a communication plan to market the impact and benefits of data governance to the Department. (This may be needed earlier in the process.)

There are clearly some risks to overcome in developing and implementing a data governance approach. These risks include:

- The initiative represents a culture change for CDOT, and there may be resistance to changing the way data is currently managed.
- Roles for the regions may be ill-defined.
- A better way to enact and enforce policy from the top down is needed; if the policy is tied directly to performance management, it will be more successful.
- Ongoing CDOT/OIT support of SAP's performance database must be ensured.

The above steps assume that CDOT managers have been convinced of the need for data management/governance; however, some work may still be necessary on this vital precondition. NCHRP Report 666 contains an entire section on establishing the need for data management/governance. Following are some excerpts relevant for CDOT to consider:

"The need and urgency for data management improvements are not always shared across all levels of an agency. In some cases, a senior manager within the agency identifies the need and in other cases individuals at lower levels recognize the value of improved data management. Nevertheless, a clear case must be established to secure resources and commitment to proceed with a data management improvement strategy."

A key success factor related to establishing the need for data governance is:

Demonstrate the Return on Investment (ROI) to the organization regarding the use of data management and data governance in order to gain buy-in from executives and decision-makers. Demonstrate with specific examples how the use of data governance can meet the goals and targets most important to executives.

ROI can be determined in many ways and on many levels within an organization. For instance, in a Highway Safety Improvement Program (HSIP): a) from the perspective of the HSIP statewide coordinator, an investment in more resources (e.g., people, technology, tools) may lead to the ROI of an improved HSIP strategic plan; b) for traffic and safety engineers, an investment in Global Positioning System (GPS) field inventory projects may lead to the ROI of improved crash locations; and c) for the highway safety planning agency, an investment in electronic data collection may lead to the ROI of improved quality of crash records.

ROI also can be realized across business functional areas within an agency or across agency boundaries. In the highway crash safety example, a) for law enforcement personnel, an investment in electronic crash data collection and submittal may lead to the ROI of reduced time to complete the accident investigation and review; b) for maintenance and operations personnel, an investment in digital imaging capabilities may lead to the ROI of quicker and less costly asset management inventory and a reduced cost to prepare HSIP projects for the traffic and safety engineers; and c) for executive management, investment in an enterprise Geographic Information System (GIS) deployment may lead to the ROI of improved tradeoff analysis on project selection by visualizing the crash history, traffic, and pavement condition.

A data governance framework, implemented on an enterprise level, supports ROI by providing a means of monitoring and tracking the progress of various business programs for executives as well as for data stewards, stakeholders, and users of the source data. Data governance provides methods, tools, and processes for:

- Traceability – aligning data programs with the agency's business needs. Establishing data-area communities of interest and working groups that examine needs in common areas on a regular basis is essential.
- Performance Measures – these should be reflective of the business needs identified in the traceability exercise.
- Risk Assessment – requires the agency to assess 1) how much data is needed; 2) how accurate the data should be; 3) what the refresh rate of the data should be; 4) who should have access to the data; and many other questions that help to assess the risks associated with a particular data program.
- Value of Data Programs – needs to be demonstrated to users and to those who authorize investments in the data programs. This can be done effectively through the use of visualization tools and enterprise GIS systems, by collecting data once and using it for many purposes, and through demonstrated improvements in business operations resulting from quality, accurate, timely, easily accessible data and information.
- Knowledge Management – must become part of the data governance framework in order to ensure that lessons learned and experiences pertaining to business operations within the organization are not lost. This will help to increase the ROI for the time and resources committed to supporting data programs.

Institutional challenges may include: centralized policy-making with decentralized execution of those policies; limited appreciation by decision-makers of the role of data systems in supporting business operations; and a lack of formal policies and standards to guide the collection, processing, and use of data within the organization. It is particularly critical to have standardized policies and procedures for the management of data and information when that information is the foundation of an agency's performance measurement and target-setting programs. A data management program is used to coordinate the establishment and enforcement of data policies and standards for the organization.

One of the ways to address these and other data-related needs is through the establishment of a structured data management program and data governance framework. Data management and data governance can help the agency to prioritize the most critical data needs and identify the resources available to address those needs in a timely manner."

Timeline

Figure 4.3 provides a suggested timeline for the activities described above.



Figure 4.3 Timeline
Colorado DOT Data Governance Recommendations

[Gantt-style chart over a 12-month horizon. The schedule, recoverable from the chart and the recommendations above, is approximately: Priority 1 – Create a Formal Data Governance Structure (DG Oversight Committee and DG Working Group), months 1–2; Priority 2 – Develop a Data Governance Charter (DG Committee and DG Working Group), month 2; Priority 3 – Complete a Data Inventory/Assessment (Data Assessment Group), months 2–5; Priority 4 – Develop and Adopt a Data Governance Procedure (DG Working Group), by month 4; Priority 5 – Perform a Risk Assessment of Data Programs (Risk Assessment Group), months 5–10; Priority 6 – Implement Ongoing Data Management and Governance at CDOT, ongoing. The chart distinguishes task durations from Data Governance Oversight Committee meetings.]


5.0 External Reporting Dashboard

This section addresses methods of externally reporting performance measures and offers recommendations for formatting dashboards and balanced scorecards.

Information technology tools such as executive dashboards and balanced scorecards can serve multiple purposes, from tracking transportation projects and project delivery to demonstrating how the agency's business programs are performing against established performance goals and targets.

Dashboards simplify performance reporting. Tools for visualizing or graphically displaying data and information come in many forms, such as tables, charts (pie charts, bar charts, histograms, function graphs, scatter plots, etc.), graphs, maps, and Venn diagrams. A balanced scorecard is one of the components that can be displayed on a dashboard.

The following section includes analysis and best practices from state DOTs and other public agencies, and recommendations for CDOT regarding the display of performance measures in a dashboard for public communication.

5.1 BEST PRACTICES

Georgia DOT

The Georgia DOT performance management dashboard is easily accessible from the GDOT home page. It clearly demonstrates the relationship between the department's strategic goals and its performance measures. The main categories are safety, maintenance, and planning, with individual measures represented as gauges.


Figure 5.1 Georgia DOT Performance Management Dashboard
http://www.dot.state.ga.us/statistics/performance/Pages/default.aspx

Selecting one of the gauges on the dashboard brings up more detailed information. An example of a drill-down is shown in Figure 5.2; the detail page describes the goal, the strategic objective, and historical values of the measure.


Figure 5.2 Georgia DOT Bridge Maintenance Measures
http://www.dot.state.ga.us/statistics/performance/Pages/Bridges.aspx

Virginia DOT

The Virginia DOT performance management dashboard is accessible from the VDOT home page via a link called "VDOT Dashboard" (it would be easier to find with a descriptive caption). Selecting the link opens the dashboard screen, which depicts a series of gauges showing key performance indicators for roadway performance, safety, pavement condition, finance, VDOT management, citizen survey results, and project delivery (Figure 5.3).

According to VDOT, the dashboard has served since early 2003 as an early-warning system that helps project managers get their projects back on track. As a result, VDOT's performance in delivering projects on time and on budget has greatly improved.


Figure 5.3 VDOT Main Dashboard
http://dashboard.virginiadot.org/default.aspx

Within the main dashboard, users can select individual gauges to view additional related performance measures within each area of performance. For example, Figure 5.4 shows detailed performance measures for project delivery. Drop-down menus allow users to apply filters such as district, geographic area, roadway system, date range, funding type, contract type, or type of work. However, no link to department goals is readily accessible from the dashboard.


Figure 5.4 Detailed View for Project Delivery
http://dashboard.virginiadot.org/Pages/Projects/ConstructionOriginal.aspx

Users can also access the Governor's Scorecard (Figure 5.5) by selecting the "Virginia Performs" tab on the main dashboard. The Governor's Scorecard includes administrative measures that track the effectiveness of state agency management in five critical categories: emergency preparedness, financial management, government procurement, human resources, and information technology. Based on its performance in each category, every agency is assigned a color-coded rating indicating whether expectations are being met (see the legend in Figure 5.5).

Agency heads rate their agency's performance according to the criteria; the cabinet, central agencies, and the governor then review the ratings annually. Citizens can view the criteria and track, year by year, how state agencies are performing in the critical management categories.


Figure 5.5 Virginia Governor's Scorecard
http://www.vaperforms.virginia.gov/agencylevel/src/ScoreCardResults.cfm

Virginia DOT is using dashboards and scorecards very effectively to transmit information to department staff and managers, policy-makers, and the public. It is an outstanding example of how these methods may be applied to business areas at other DOTs.

Washington State Transportation Improvement Board (TIB)

The Washington State Transportation Improvement Board (TIB) uses a real-time dashboard to monitor active grant projects awarded to local agencies to fund road repairs and new construction. The dashboard was developed as an internal oversight tool during a period of fiscal crisis, and it has consistently improved business processes and grant project performance since its implementation in 2003. The time for a local government to receive payment dropped from five months in 2001 to just seventeen days. Delayed projects dropped seventy percent, saving millions in public funds that would otherwise have been lost to construction cost inflation. Grant projects from the TIB's safety program averaged nineteen percent fewer accidents and thirty percent fewer injuries two years after construction.

The dashboard link is easily found on the TIB home page. Selecting it brings up a web page with information about the dashboard as well as related links, awards, and what others are saying about the TIB dashboard. The user must scroll the page to reach the actual dashboard.


Figure 5.6 Washington State Transportation Improvement Board Dashboard
http://www.tib.wa.gov/performance/Performance.cfm

This real-time dashboard application is updated every time the page is loaded. It shows the balance that can be struck between design principles and real-world constraints. The choice of Xcelsius lends Flash-based interactivity; while some charts and graphs could be better designed from an information-visualization point of view, the level of utility is high. Most pages do not fit on one screen unless the menu is minimized.

Project-level information even includes pictures from the job site, and the dashboard also includes a balanced scorecard. Quarterly financial reporting includes sparklines. A sparkline is a type of information graphic characterized by its small size and high data density; sparklines present trends and variations associated with some measurement, such as average temperature or stock market activity, in a simple and condensed way. The overview displays the status of projects in each county. A link to the TIB Strategic Plan and the department's mission and values is easily accessible from the dashboard.
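To make the sparkline idea concrete, the sketch below renders a numeric series as a one-line unicode sparkline, the same compact trend display TIB embeds in its quarterly financial reporting. TIB's actual implementation (in Xcelsius) is of course different; this is only an illustration with invented numbers.

```python
BARS = "▁▂▃▄▅▆▇█"

def sparkline(values):
    """Render a numeric series as a compact one-line unicode sparkline."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1  # avoid division by zero on a flat series
    return "".join(BARS[int((v - lo) / span * (len(BARS) - 1))] for v in values)

quarterly_figures = [4.1, 4.3, 3.9, 4.8, 5.2, 5.0, 5.6]  # illustrative
print(sparkline(quarterly_figures))  # e.g. ▂▃▁▅▇▆█
```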


Figure 5.7 Washington State Performance Management Dashboard
http://www.tib.wa.gov/TIBDashboard/

Figure 5.8 Washington State Key Performance Indicators
http://www.tib.wa.gov/TIBDashboard/


North Carolina DOT

North Carolina DOT's Organization Performance Dashboard is easily accessible from NCDOT's home page. It displays one primary executive performance measure for each of the department's five goals:

- Make our transportation network safer (fatality rate);
- Make our transportation network move people and goods more efficiently (incident duration);
- Make our infrastructure last longer (infrastructure health);
- Make our organization a place that works well (project delivery rate); and
- Make our organization a great place to work (employee engagement).

The main dashboard, shown in Figure 5.9, depicts a series of tabs and gauges showing key performance indicators for fatality rate, incident duration, infrastructure health, delivery rate, and employee engagement.

Figure 5.9 NCDOT Organization Performance Dashboard
https://apps.dot.state.nc.us/dot/dashboard/

Users can select links to view additional performance measures within each area of performance. For example, Figure 5.10 shows detailed performance measures for infrastructure health. A drop-down menu allows users to filter results by county.


Figure 5.10 Detailed View for Infrastructure Health

Each quarter, the department publishes a performance scorecard depicting performance over the previous three months. The scorecard lets users see how the agency is performing against the executive performance measures in each area of performance; an example covering July 1, 2010 through September 30, 2010 is shown in Figure 5.11. Results are color-coded based on whether current results are exceeding, meeting, or not meeting annual targets. The scorecard is available as a PDF from the Performance Reports web page.
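The exceeding/meeting/not-meeting color coding implies a simple thresholding rule around each annual target. The sketch below shows one way such a rule might look; the tolerance band and direction handling are our assumptions, not NCDOT's published methodology.

```python
def scorecard_status(actual: float, target: float,
                     higher_is_better: bool = True,
                     tolerance: float = 0.05) -> str:
    """Map a result against its annual target to a color-coded status,
    in the spirit of NCDOT's exceeding/meeting/not-meeting scheme."""
    ratio = actual / target if higher_is_better else target / actual
    if ratio >= 1 + tolerance:
        return "green (exceeding)"
    if ratio >= 1 - tolerance:
        return "yellow (meeting)"
    return "red (not meeting)"

# Illustrative figures only.
print(scorecard_status(actual=92, target=85))    # green (exceeding)
print(scorecard_status(actual=1.25, target=1.10,
                       higher_is_better=False))  # a rate above its target:
                                                 # red (not meeting)
```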


Figure 5.11 NCDOT Quarterly Scorecard
http://www.ncdot.gov/performance/reports/

District of Columbia

The DC Department of Transportation's dashboard is labeled as a Beta release and is not easily accessed.

The main dashboard, shown in Figure 5.12, depicts a series of gauges showing key performance indicators for safety, roadway condition, projects, transit, finance, and customer service.


Figure 5.12 District Transportation Access Portal (Beta 2.0)
http://dashboard.ddot.dc.gov/ddotdashboard/#Home

Within the main dashboard, users can select individual gauges to view additional related performance measures within each area of performance. For example, Figure 5.13 shows detailed performance measures for safety.


Figure 5.13 Detailed View for Safety
http://dashboard.ddot.dc.gov/DTAPDOC/Other/Safety276544_2011CrashReport_1%20test.pdf

Capital Bikeshare

The Capital Bikeshare Dashboard is the product of a public/private partnership among the District Department of Transportation (DDOT), Arlington County, and Alta Bicycle Share. According to the website, the dashboard was created to increase government transparency, accountability, and communication with Capital Bikeshare members and the public, as well as to facilitate decision-making about the program.

As shown in Figure 5.14, the following performance metrics are updated monthly, and historical data is also available:

- Ridership;
- Fleet Performance and Safety;
- Customer Service; and
- Membership.


Figure 5.14 Capital Bikeshare Performance Dashboard
http://cabidashboard.ddot.dc.gov/CaBiDashboard/

Figure 5.15 shows more detail on ridership and includes options for customizing the report display.


Figure 5.15 Capital Bikeshare Dashboard Drill Down
http://cabidashboard.ddot.dc.gov/CaBiDashboard/#Ridership/StartDate=4/30/2011EndDate=9/30/2011PubDate=9/30/2011


Minnesota Department of Transportation

MNDOT's performance report is readily accessible from the home page. Several options are available, such as a PDF of the scorecard or annual performance report, or an interactive version of the latest annual report.

While not confined to one page, the Minnesota annual scorecard is a visually effective means of providing at-a-glance performance results (Figure 5.16).

Figure 5.16 MNDOT Performance Results Scorecard
http://www.dot.state.mn.us/measures/pdf/2010%20SCORECARD.pdf


Hennepin County, Minnesota

An example of the effective use of scorecards comes from Hennepin County, Minnesota. The Balanced Scorecard (BSC) helps the County align its daily work with its vision and strategic goals.

As shown in Table 5.1, the BSC in Hennepin County is viewed from four perspectives:

- Customer – What results do we need to produce for our customers to fulfill our mission and achieve our vision?
- Finance – What financial objectives must we meet in order to produce the desired results for our customers?
- Internal Process – What processes must we excel at in order to attain the financial objectives and desired results for the customer?
- Learning and Growth – How do we develop our internal resources to refine the necessary processes that will allow us to attain our financial objectives and desired results for the customer?

The balanced scorecard forms the basis for discussion between supervisors and managers, managers and directors, directors and administration, and administration and the board about progress towards achieving desired results.


Table 5-1 Hennepin County Sample Balanced Scorecard

Perspective | Strategic Objective | Measure | Target | Actual | Comment
Customer | Achieve customer outcomes | Number of high priority issues resolved | 60 | 30 | Need improvement; investigate process for resolving high priority issues
Customer | Improve customer satisfaction | Percent of customers rating service very good or excellent | 80% | 80% | Right on target
Finance | Manage expenses | Percent increase/decrease in annual budget | 1.5 | 5% | Reduced expenses due to budget cuts
Finance | Maximize revenue | Percent increase/decrease in revenue derived from grants | 5% | 13% | Good progress
Internal Process | Build effective partnerships | Number of projects involving one or more partners | 25 | 10 | Based on the number of projects to date with one or more partners
Learning and Growth | Retain knowledgeable staff | Employee retention rate | 95% | 75% | Need to monitor

Lessons Learned

Dashboards are becoming a commonplace way for public agencies to present their performance results. Both real-time and static data are often featured. Designs vary, but some similarities stand out. Best practices include:

- Initial view is a single page/screen snapshot of performance measure results;
- Drill-down to more detail with filter options;
- Data export availability; and
- Responsible departments and directors displayed.

Other features that present opportunities for improvement for some dashboards are:

- Easier-to-find links from the agency home page, with descriptive text;
- More use of maps and corridors to illustrate congestion; and
- Link to agency goals and strategies.


5.2 CURRENT CDOT PROCESS AND WEB REPORTING CAPABILITY

The objective of this report is to assist CDOT with communicating performance measures to the public. Currently, annual performance measure reports are available to the public via the CDOT Home Page by selecting Annual Reports from the Quick Links menu on the left of the page. However, performance measure reporting is not mentioned anywhere on the home page itself. The Annual Reports page lists all available reports for the last several years.

The stakeholder process revealed the following needs regarding dashboards:

- Need to be able to drill down on measures so a person can see how their performance objectives link to those of the work unit, division, and Department.
- Reporting needs to be more outcome-oriented (e.g., rather than focusing on whether a bid was on time, what was the quality of the work?).
- It is a good idea to break down the measures by region; the more information, the better for the decision-makers. However, CDOT may wish to show regional breakdowns only for internal reporting.
- If measures are reported by region, it will be important to provide annotations. For example, urban regions may have more difficulty with the on-time and on-budget measures than more rural regions.
- Dashboard design should not be constrained by the capabilities of Xcelsius.

These recommendations need to be coordinated with the Public Relations Office for consistency with current processes at CDOT.

5.3 RECOMMENDATIONS FOR CDOT DASHBOARD REPORT FORMAT

What Should Be Reported In A Dashboard?

Dashboards should contain summary statistics at-a-glance, with drill-down capabilities to other dashboards for specific areas of interest to the dashboard user. Best practice indicates that dashboards should include only a few critical measures.

How Often It Should Be Updated

Dashboards are only tools; their effectiveness depends on use. Update frequency should be based on the individual measures and is discussed in more detail in Section 3.4 of this report.


Visualization Tools to be Used

As the primary purpose of the CDOT dashboard is to communicate timely performance information about the department's activities, it is important that this information be displayed in a way that is understandable to the public and simple to interpret.

Stephen Few (author of Information Dashboard Design and considered a world leader in data visualization) defines a dashboard as "a visual display of the most important information needed to achieve one or more objectives; consolidated and arranged on a single screen so the information can be monitored at a glance."

There are three types of performance-based dashboards: 1) operational, 2) tactical, and 3) strategic.

Operational dashboards focus on exception alerting or detailed metrics related to daily operations, and are based on real-time or transactional data. These dashboards tend to be more volatile, as data changes frequently throughout the day. However, they provide an accurate snapshot of what is happening right now.

Tactical dashboards display data that is not as real-time as operational dashboards. The tactical dashboard contains an aggregated, summarized, or averaged view of data, which allows comparison against historical values, benchmarks, and goals.

The strategic dashboard tracks performance against high-level objectives. These dashboards tend to summarize performance over the past month, quarter, or year.

Characteristics for each dashboard type are summarized in Table 5.2.

Table 5-2 Dashboard Types

Type of Dashboard | Audience | PM Type
Operational – for monitoring in real time | Front-line personnel dealing with day-to-day activities of the organization | Detailed metrics related to daily operations
Tactical – for analysis and benchmarking | Executives | Comparative metrics to review and benchmark data of the departments
Strategic – for tracking achievement of strategic objectives | Organizational leaders | Performance indicators with respect to their goals

Source: Ganapati, Sukumar. Use of Dashboards in Government. IBM Center for the Business of Government, 2011.

Scorecards

The balanced scorecard (BSC) is one of the components that can be displayed on a dashboard. The BSC is used to translate business mission accomplishment into a critical set of performance measures distributed among an equally critical and focused set of business perspectives. The BSC reports how well specific programs are performing based on established targets and goals that are linked to strategic business objectives. The purpose of the BSC is to:

- Align all members of an organization around common goals and strategies;
- Link initiatives to the strategy, making prioritization easier;
- Provide feedback to people on key issues – notably, areas where they can have an impact; and
- Be an essential decision-making tool.

The BSC builds on cross-functional cause-and-effect relationships. Processes that contribute to desired results are viewed cross-functionally. Measures that make one function look good while deflating another are avoided, thus minimizing negative competition between individuals and functions; testing measures ahead of time helps to avoid such conflicts.

Scorecards include two key components:

- A balanced set of measures; and
- A set of strategically focused business perspectives.

The Kaplan/Norton Balanced Scorecard looks at four interconnected business perspectives:

- Financial – How do we look to our stakeholders?
- Customer – How well do we satisfy our internal and external customers' needs?
- Internal Business Process – How well do we perform at key internal business processes?
- Learning and Growth – Are we able to sustain innovation, change, and continuous improvement?

A comparison of dashboards and scorecards is provided in Table 5.3.

Table 5-3 A Comparison of Operational and Tactical Dashboards and Strategic Scorecards

 | Operational | Tactical | Strategic
Type | Dashboard | Dashboard | Balanced Scorecard
Users | Managers, supervisors, operators | Managers | Executives
Information | Detailed | Detailed/Summary | Summary
Usage | Monitor daily production and operation | Monitor progress on an initiative | Monitor alignment and success of strategic objectives
Organizational Level | Work Unit | Department | Enterprise or Strategic Business Unit
Updated | Intra-Day | Daily or Weekly | Monthly

Source: Person, Ron. Balanced Scorecards & Operational Dashboards with MS Excel. Wiley Publishing, 2009.

Dashboard Design Elements

Dashboard design is not only a matter of aesthetics; it is also about making actionable data and information easy to grasp. If a dashboard is poorly designed, it could lead the user to erroneous conclusions or time-consuming misinterpretation. So many options exist for building dashboards that it is easy to employ so many of them that the data get lost in the design. Too much clutter leads to confusion.

Experts cite three core principles of design:

- Dashboards should fit on a single page;
- Dashboards should be simple; and
- Dashboards should use the best display medium for communicating data.

Agencies have different design approaches to dashboards. Whereas some dashboards are visually rich, other dashboards are essentially tables. A dashboard may be designed to display any combination of data summaries, charts (e.g., bar charts, pie charts, histograms, function graphs, scatter plots, "sparklines," etc.), graphs (tree diagram, network diagram, flowchart, etc.), gauges, maps, or Venn diagrams.

Charts

Depending on the type of data to be presented, numerous chart types are available. It is important to consider the audience as well as the data when choosing the visualizations for a dashboard.

Gauges, Menus and Sliders

Just as in effective website design, dashboards should give readers the information they need without making them search for it. Undue visual "noise" should be avoided, and design elements should enhance rather than clutter the dashboard.

Many transportation-related dashboards are created using elements such as gauges, meters, and other dashboard images borrowed from cars. However, these are not always the best method for representing data. Gauges use a lot of space unnecessarily. Gauges and meters typically display a single key measure, sometimes compared to a related measure such as a target, and sometimes in the context of quantitative ranges with qualitative labels that declare the measure's state (such as good or bad). The bullet graph achieves the same communication objective without the problems that usually plague gauges and meters. Bullet graphs are presented in more detail in Figure 5.17.

Menus are a necessary tool for on-line dashboards in order to drill down to more detailed and expansive data.
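As an illustration of the bullet graph concept, the following minimal matplotlib sketch draws qualitative background bands, a bar for the measured value, and a tick for the target; all values and shading choices are hypothetical:

```python
import matplotlib.pyplot as plt

bands = [50, 75, 100]            # qualitative range boundaries (percent)
shades = ["0.6", "0.75", "0.9"]  # darker band = worse range (a design choice)
actual, target = 68, 80          # hypothetical measure and target

fig, ax = plt.subplots(figsize=(6, 1.2))
left = 0
for bound, shade in zip(bands, shades):
    ax.barh(0, bound - left, left=left, height=0.8, color=shade)
    left = bound
ax.barh(0, actual, height=0.3, color="black")                  # measured value
ax.axvline(target, ymin=0.15, ymax=0.85, color="black", lw=2)  # target marker
ax.set_xlim(0, 100)
ax.set_yticks([])
ax.set_xlabel("Percent of goal")
plt.tight_layout()
plt.show()
```

Note how the same information a gauge would occupy a square of screen space to show fits in a single narrow strip.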


Figure 5.17 illustrates several charting options and their common uses.


Figure 5.17 Dashboard Design Elements

Function: Compare a Set of Values
- Bar Chart – For numerical comparisons showing one or more sets of variables.
- Block Histogram – For visualization of the distribution of numeric values in a data set.
- Bubble Chart – Displays a set of numeric values as circles; especially useful for data sets with dozens to hundreds of values or with values that differ by several orders of magnitude.
- Candlestick Chart – A combination of bar and line chart representing the range of movement of the measure over a given time interval.
- Radar Chart – For visualization of multivariate data in the form of a two-dimensional chart of three or more quantitative variables represented on axes starting from the same point.
- Stacked Bar Chart – For cases where the values of each category add up to 100% and this is important.

Function: Display Key Measures With Comparative Measure and Qualitative Ranges
- Bullet Graph


Function: Track Rises and Falls Over Time
- Line Graph – For visualizing continuous change.
- Stack Graph – For visualizing change in a set of items, where the sum of the values is as important as the individual items.
- Stack Graph for Categories – For visualizing the total change over time of a group of quantities.
- Sparkline – A graphic designed for visualizing trends and variations associated with high-density data, displayed in a simple and condensed way.

Function: See the Parts of a Whole
- Pie Chart
- Treemap
- Treemap for Comparisons

[The example chart thumbnails shown in the original figure are not reproduced here.]

Recommended Dashboard Design for CDOT

Figure 5.18 is a recommended dashboard design with measures calculated as described in Section 3.6. The dashboard illustrates the department's performance status at a glance.

Each of the nine recommended measures is shown along with actual results for 2007 through 2010. Raw values were obtained from the Annual Performance Reports, and the measures were calculated from them. The letter grades are easily viewed, as are trends and issues needing attention (such as the red "F" in urban congestion). It is envisioned that the dashboard be updated annually, or as the measures are calculated. The dashboard should be displayed prominently on CDOT's website. Each measure would have drill-down capability to show more detail and specific values for the measure.

All of the measures except those indicated with an asterisk (*) are based on actual values. The new recommended measures (congestion map and strategic action item implementation) are populated with data for demonstration purposes.

The congestion map section will drill down into corridors with travel time reliability information. The calculations and maps showing the Planning Time Index are shown in Appendix G. The construction section will show graduated shading indicating the letter grades reached by the results.
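For reference, the Planning Time Index is commonly computed as the 95th percentile travel time divided by the free-flow travel time. A minimal sketch, using hypothetical corridor observations:

```python
import numpy as np

def planning_time_index(travel_times, free_flow):
    """95th percentile travel time divided by free-flow travel time."""
    return np.percentile(travel_times, 95) / free_flow

# Hypothetical peak-period travel times (minutes) on one corridor.
observed = [12.0, 12.5, 13.1, 14.0, 15.2, 18.4, 22.7, 13.8, 12.9, 16.5]
print(round(planning_time_index(observed, free_flow=11.0), 2))
```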

A simple palette, with color used strategically for emphasis and white used to segregate areas, should be considered in deciding the final layout and theme. The purpose is to show all measures fitting on an 8½ x 11 inch printed page or one standard 880 x 660 computer screen.



Figure 5.18 CDOT Dashboard Example

[Figure 5.18 presents the example dashboard as a single screen of panels – Pavement Condition, Bridge Condition, Roadside Condition, Snow and Ice Control, Number of Fatalities (five-year rolling average), Roadway Congestion (rural and urban percent congested/uncongested), Congestion Map, Strategic Action Item Implementation, and On-Time and On-Budget Construction – each graded on an A–F scale for 2007 through 2010. The graphic itself is not reproduced here.]

The dashboard should be easily accessed from the CDOT Home Page, and it can serve as the highest tier of an on-line application with drill-down capabilities for each measure, as shown in Figure 5.19. As learned from the best practices of other agencies, the following features should be incorporated in the performance measure detail pages:

- simple definitions;
- how often the measure is updated, and why;
- sources of data;
- easy navigation using a navigation bar, drop-down menus, and tabs;
- filters so users can access data of interest;
- methods and calculations used to assign grades;
- comparisons to national averages (for selected measures, if desired);
- drill-down to specific department goals and strategies;
- issues mitigating results, such as budget limitations;
- branding consistent with the rest of the CDOT website; and
- export options for tabular data.


Figure 5.19 CDOT Performance Measures Site Organization

CDOT Home Page
 → CDOT's Performance Results Dashboard – summary of all measures on one screen
   → Performance Measure Detail Pages: Fatalities; Bridge Condition; Pavement Condition; Roadside Condition; Snow and Ice; Roadway Congestion; On-Time Construction; On-Budget Construction; Strategic Action Item Implementation

Deployment can be accomplished in phases, beginning with adding a link to the CDOT Home Page, such as a button or logo labeled "CDOT's Performance Dashboard". This can link to the dashboard as in the example. Hyperlinks from each graph to the appropriate section of the current Performance Report can then be added. Further expansion can be achieved by creating a web page for each measure with interactive material and links.


6.0 Cost/Benefit Curves for Safety and Mobility

In the context of performance management, understanding the relationship between funding and future performance levels can help to inform resource allocation decisions. These relationships can be communicated through a cost/benefit curve, as illustrated in Figure 6.1. This figure represents a cost/benefit curve for interstate bridge preservation funding in the Atlanta region. It shows the expected performance (expressed as a percent of bridges in good condition) in 2040 for various funding levels. For example, the graph indicates that maintaining current conditions would cost nearly $2 billion over the next 30 years, and that achieving an 80 percent condition level would cost around $1.1 billion.

Figure 6.1 Example Cost/Benefit Curve – Interstate Bridges in the Atlanta Region

These types of curves can help agencies answer the following types of questions:

- How much money would it cost to maintain existing performance levels over the next X years?
- How much would it cost to improve existing performance levels to Y?


- What performance can be achieved with the existing budget?
- What would be the impact on performance if the existing budget were increased or decreased by ten percent?

Understanding the consequences of potential funding levels in terms of expected future performance can help decision-makers evaluate tradeoffs between competing needs, and allocate funds in a manner that reflects agency priorities. CDOT's Transportation Deficit Report provides this type of information for three program areas – pavement preservation, bridge preservation, and maintenance. The Deficit Report indicates the costs of achieving various performance levels and compares these costs to existing funding levels. This section examines the potential to expand this analysis to two additional program areas – safety and mobility.

6.1 SAFETY

CDOT Policy Directive 14 identifies two goals related to safety – "to create, promote and maintain a safe and secure transportation system and work environment," and to "increase investment in safety and strategic projects." In general, CDOT works to address the system components of these goals through two types of projects:

- Safety motivated projects – projects identified to address a specific safety deficiency; or
- Non-safety motivated projects – projects identified to address another type of deficiency, such as a preservation or mobility need.

Safety motivated projects. CDOT's 2010 Amendment to 2035 Revenue Forecast and Resource Allocation document indicates that over the next five years, approximately seven percent of agency funds will be allocated to safety motivated projects. These funds will address a wide variety of project types, including rock fall mitigation, safety-related pavement surface treatments, maintenance activities, railroad crossing programs, education programs, etc. One option for developing a safety cost/benefit curve is to focus on this bucket of funds. However, it can be very difficult to estimate the expected impact on fatalities and injuries of some of these types of projects, and to apply this process systematically at the network level. The ability to estimate future impacts is a requirement for developing the type of cost/benefit curves described above. Given the wide range of project types associated with this program and the difficulties in modeling them, CDOT has focused much of its safety analysis over the past twelve years on non-safety motivated projects.

Non-safety motivated projects. CDOT also works to incorporate safety features into non-safety motivated projects through the project development process. This process is based on the application of Safety Performance Functions (SPF), which provide an estimate of a location's expected crash frequency and severity relative to similar facilities. CDOT incorporates safety components into a project's design when its location has been identified as a viable safety candidate through the SPF analysis. This targeted approach and the leveraging of projects implemented through non-safety programs have enabled CDOT to achieve significant safety improvements over the past decade. Given the success of this approach, it would be beneficial to understand the relationship between additional funding and expected future performance. However, a cost/benefit curve for these funds would be very difficult to develop because the projects implemented with them are driven by prioritization processes that largely consider non-safety factors, and the expected impacts of the safety strategies are largely location and project specific.
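For illustration only, roadway-segment SPFs often take a form similar to the Highway Safety Manual's models, predicting crashes per year from traffic volume and segment length. The coefficients below are hypothetical placeholders, not CDOT's calibrated values:

```python
import math

def spf_expected_crashes(aadt, length_miles, a=-8.5, b=1.0):
    """Illustrative SPF of the form N = exp(a) * AADT**b * L,
    where N is the predicted crash frequency per year on the segment."""
    return math.exp(a) * aadt ** b * length_miles

# Hypothetical two-mile segment carrying 12,000 vehicles per day.
print(round(spf_expected_crashes(aadt=12000, length_miles=2.0), 2))
```

A location whose observed crash history substantially exceeds the SPF prediction for similar facilities would be flagged as a safety candidate.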

Given the focus on incorporating safety into non-safety motivated projects and the difficulty in modeling the implications of this strategy at the network level, it is recommended that CDOT not develop the type of cost/benefit curve illustrated in Figure 6.1 for safety. Instead, it is recommended that CDOT assess safety costs and benefits at the project level – for example, by evaluating the relationship between the incremental costs of modifying the scope of a non-safety motivated project to address safety and the expected safety impact of the modification. In addition, if it has not done so already, CDOT should consider systematically inflating unit costs for preservation projects to account for safety strategies and updating existing preservation cost/benefit curves based on these updated costs. This would require developing historic estimates for the cost of safety-related scope changes. Updating the costs would change the shape of the cost/benefit curves because a portion of the overall funding would go to safety-related work rather than to preservation activities.

6.2 MOBILITY

CDOT's Fiscal Year 2010 Performance Report indicates that approximately eight percent of Colorado's lane miles are considered highly congested. Congestion is defined as peak traffic exceeding eighty-five percent of a highway's design capacity. The Annual Performance Report provides a map that shows that the congested segments are largely concentrated in the Denver region. In major urban areas such as Denver, increasing highway capacity through capital projects (for example, adding lanes to an existing highway or building a new corridor) can be cost prohibitive, leading agencies to focus on a relatively small set of strategic expansion projects. In these cases, highway operations strategies provide an alternative means for achieving significant performance improvements.
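A minimal sketch of this volume-to-capacity screening, using hypothetical segment data:

```python
def percent_lane_miles_congested(segments, threshold=0.85):
    """Share of lane miles where peak volume exceeds 85 percent of
    design capacity. Each segment is (peak_volume, capacity, lane_miles);
    the sample data below are hypothetical."""
    congested = sum(lm for v, c, lm in segments if v / c > threshold)
    total = sum(lm for _, _, lm in segments)
    return 100.0 * congested / total

segments = [(1700, 2000, 12.0), (900, 2000, 40.0), (1950, 2000, 6.5)]
print(f"{percent_lane_miles_congested(segments):.1f}% of lane miles congested")
```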

Throughout the course of this project, CDOT staff confirmed that the situation described above holds true for Colorado. Achieving significant mobility gains through capital improvements does not appear to be a viable option. Therefore, a meaningful cost/benefit curve would need to focus on potential operations strategies.


Section 3 of this report recommends three measures related to congestion – one based on the percent of person miles traveled (PMT) on congested corridors, one based on person delay, and a third based on travel time reliability. The first measure is designed to work with CDOT's existing data, and CDOT is working to develop the second and third. The first measure is based on comparing a highway segment's traffic volume to its capacity. Developing a cost/benefit curve would require estimating the change in a highway segment's capacity that could be achieved by implementing a project. This calculation is relatively straightforward for capacity expansion projects but not technically feasible for operations projects.
operations projects.<br />

Given that the mobility curve should focus on operations projects and that it is<br />

not possible to relate operations projects directly to a highway’s capacity, it is<br />

recommended that <strong>CDOT</strong> hold off on developing a mobility cost/benefit curve<br />

until a measure of reliability is developed.<br />

When the details of delay and reliability measures have been resolved, a<br />

cost/benefit curve could be developed using a project-level approach. This<br />

approach would require <strong>CDOT</strong> to define a list of potential operations projects<br />

and to estimate the following information for each:<br />

<br />

<br />

<br />

Project cost;<br />

Value of the reliability measure on the project segment without the project;<br />

and<br />

Value of the reliability measure on the project segment with the project.<br />

In addition, the projects would need to be ranked from highest to lowest priority. The prioritization could be done based on cost-effectiveness (e.g., change in reliability per dollar) or using a more comprehensive approach. Once the list has been prioritized, a cost/benefit curve could be created as follows (a minimal code sketch of this procedure appears after the list):

- Assume a funding level;
- Work down the list of prioritized projects until these funds are expended;
- Calculate the cumulative change in reliability expected for the funded set of projects;
- Plot the resulting value; and
- Repeat this process for additional funding levels.
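A minimal sketch of this procedure, with hypothetical project costs and reliability changes; the ranked list and the simple fund-if-affordable rule are assumptions:

```python
def cost_benefit_curve(projects, funding_levels):
    """Build (budget, cumulative reliability change) points as described
    above. Each project is a (cost, reliability_change) tuple, already
    sorted from highest to lowest priority; all numbers are hypothetical."""
    curve = []
    for budget in funding_levels:
        remaining, benefit = budget, 0.0
        for cost, delta in projects:      # work down the ranked list
            if cost <= remaining:         # fund the project if affordable
                remaining -= cost
                benefit += delta          # accumulate the reliability gain
        curve.append((budget, benefit))   # one curve point per funding level
    return curve

# Ranked operations projects: (cost in $M, improvement in reliability index).
ranked = [(5.0, 0.08), (3.0, 0.05), (8.0, 0.06), (2.0, 0.02)]
for budget, gain in cost_benefit_curve(ranked, [5, 10, 20]):
    print(f"${budget}M -> cumulative reliability change {gain:.2f}")
```

Plotting these points yields the same kind of curve shown in Figure 6.1, with reliability in place of bridge condition on the vertical axis.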



APPENDICES

A. Additional Data Collection Recommendations
B. CDOT Supporting Measures
C. References
D. Stakeholder Interview Summary
E. CDOT Goals and Objectives
F. Example of a Data Business Plan Charter
G. Congestion
H. Calculation Spreadsheet Pages



A. Additional Data Collection Recommendations

During the data collection phase within a data program, it is important to take into account the following considerations:

- Collect Accurate and Consistent Location Reference Data. Make sure that performance data is collected with location referencing information that allows it to be mapped and integrated with other data sources. There are many efforts underway within CDOT to address this. Further standardization in this area will help CDOT to better integrate and share business data within the agency.
- Put Data Quality Controls in Place. Location and temporal validity and integrity control systems for data elements must be compatible. For example, when collecting data from multiple pieces of equipment, by way of multiple methods, or from multiple sources, consistency of the measurement must be assured. Data from surveillance systems are often faulty or missing because of errors in the surveillance system (e.g., loop detector errors, communication drops between loops and traffic management centers (TMC), etc.). It is important that reliable detector diagnostic tools be in place to check the data for accuracy and also to fill missing data based on reliable statistical procedures. Data quality also is an area of focus for CDOT. For example, the Division of Transportation Development (DTD) Traffic Analysis Unit (TAU) has very specific data quality controls in place for the collection, analysis, and reporting of traffic data. Similar/related quality assurance could be more standardized within CDOT.
- Data Should Facilitate Integration. Data should be used to facilitate integration of information available from multiple systems. This includes collecting and formatting data at an acceptable precision level that allows for easier integration of data across systems. The current lack of standards for data management within CDOT may be leading to redundancies in data collection and in fact may be hindering integration.
- Data Formats. It is essential that all data are provided in electronic format. It is quite common that static data pertaining to network characteristics are in non-electronic form (maps, drawings, as-built plans, or aerial photos), which makes them difficult and time consuming to code electronically. Such data should ideally be available electronically in previously developed simulation tools, GIS tools, or other software packages.
- Open Data and Open Standards. The availability of privately owned, continuously collected data has the potential to improve evaluations but is likely to increase the complexity of conducting evaluations. This issue calls for the development of data standards for data definitions, means of collection, and format.

Data Assessment

In order to support performance measurement efforts effectively, data programs must be carefully evaluated in terms of their ability to meet overall agency and stakeholder goals. For example, traffic and safety data programs must produce quality data to support decision-making regarding safety and mobility projects. This section assists in performing a health assessment of data systems to determine where the most critical deficiencies exist and to develop a strategy for addressing those deficiencies.

In general, criteria must be developed to assess the data programs. Examples of the types of criteria that could be used were initially identified for use with FHWA's Traffic Data Quality Management Report and are applicable as well for assessing the quality of data used for performance measurement. These criteria, illustrated with a brief sketch after the list, include the following:

- Accuracy. The measure of the degree of agreement between data values or sets of values and a source assumed to be correct. It is also defined as a qualitative assessment of freedom from error, with a high assessment corresponding to a small error.
- Completeness (also referred to as availability). The degree to which data values are present in the attributes (e.g., volume and speed are attributes of traffic) that require them. Completeness is typically described in terms of percentages or number of data values and measures how much data is available compared to how much data should be available.
- Validity. The degree to which data values satisfy acceptance requirements of the validation criteria or fall within the respective domain of acceptable values. Data validity can be expressed in numerous ways. One common way is to indicate the percentage of data values that either pass or fail data validity checks.
- Timeliness. The degree to which data values or a set of values are provided at the time required or specified. Timeliness can be expressed in absolute or relative terms. This also can be referred to as latency.
- Coverage. The degree to which data values in a sample accurately represent the whole of that which is to be measured. As with other measures, coverage can be expressed in absolute or relative units.
- Accessibility (also referred to as usability). The relative ease with which data can be retrieved and manipulated by data consumers to meet their needs. Accessibility can be expressed in qualitative or quantitative terms.
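As a small illustration, two of these criteria – completeness and validity – might be computed for an hourly traffic-count series as follows (the acceptance range and data are assumed):

```python
# Hypothetical hourly traffic counts; None marks a missing record.
counts = [430, 512, None, 488, 2950, 501, None, 476]

expected = len(counts)                          # records that should exist
present = [c for c in counts if c is not None]
valid = [c for c in present if 0 <= c <= 2500]  # assumed acceptance range

completeness = 100.0 * len(present) / expected
validity = 100.0 * len(valid) / len(present)
print(f"Completeness: {completeness:.0f}%  Validity: {validity:.0f}%")
```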

Data Management

While managing data within data programs, it is important to do the following:


- Recognize and Plan for Data Management Costs. Adequate resources must be provided to collect, store, archive, analyze, and disseminate critical data elements.

- Manage Data as an Asset. Performance data needs to be acquired and managed as an enterprise asset. If a data element is judged to be a critical input for the performance measurement process, it needs to have a data owner, a data element definition, a schedule for updating, and a fixed amount of precision. The definition must be clear to end users and decision-makers and applied consistently throughout the agency. Enterprise-level data elements must be accessible throughout the data-owning agency and for authorized uses among business process partners in cooperating local, state, and Federal agencies.

- Require Metadata for the Data. When transmitting data from one group to another, the use of metadata greatly enhances the information provided about a set of data. There are different formats used for creating metadata, including geo-spatial and non-geo-spatial formats. The purpose of the metadata is to provide more detailed information about how the data are defined and their intended use, and to warn users of data limitations and variability. The use of metadata and metadata standards is extremely important in ensuring that data are used appropriately for making business decisions. Another critical piece of information provided in metadata is contact information about who can provide additional information about the data when needed. This can eliminate confusion on the part of users when trying to determine the best source of data for addressing a business-related question. (A minimal example metadata record appears after this list.)

- Metadata Standards and Formats. The use of metadata standards and formats helps to facilitate the understanding, characteristics, and usage of data. Metadata provides such information as data name, size, data type, where data is located, how it is associated, and data ownership (source: http://en.wikipedia.org/wiki/Metadata). Listed below are some traditional needs for metadata standards as identified by the Metadata Subcommittee of the TRB, which identified "Priority Needs for Metadata in Transportation" in its "Working Document – Research Agenda," Section 4, January 15, 2007. Metadata are needed:
  – Where data serve a critical function and impact key decisions;
  – Where different data sources need to be combined;
  – Where data are published to a large population of users with different needs;
  – Where data value depends on end-user understanding of data quality; and
  – Where data value depends on the effectiveness of automated discovery tools.

- Data Dictionaries. Data dictionaries contain information about the physical database tables, such as the names of the tables, the names and attributes of the data fields in each table, and data owner identification (i.e., which level of user has read/write capabilities to the database).
- Clearly State Data Definitions. Particularly where data from secondary sources are being used to derive performance measures, it is important to obtain and document precise data definitions.
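A minimal, hypothetical metadata record capturing the elements discussed above – definition, ownership, update schedule, precision, limitations, and a contact – might look like the following (all field values are placeholders):

```python
# Hypothetical metadata record for a performance data element.
metadata = {
    "name": "state_highway_vmt",
    "definition": "Annual vehicle miles traveled on the state highway system",
    "data_type": "float, millions of miles",
    "owner": "Division of Transportation Development",   # data owner
    "update_schedule": "annual",                         # when it is refreshed
    "precision": "one decimal place",                    # fixed precision
    "limitations": "Estimated from sampled count stations",
    "contact": "data.steward@example.org",               # placeholder address
}
for field, value in metadata.items():
    print(f"{field:16s} {value}")
```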

Research has indicated that, as data is transformed into performance measures, it is important to do the following:

- Focus on Essential Measures. Collect and store only those data that are essential to the purposes of data integration, strategic decision-making, and accountability. CS will address this focus during Task 1.1.

- Ensure Accuracy and Consistency of Fundamental Measures. Inconsistencies in the types of measures across data sets and analysis tools can sometimes arise due to the use of different data sources and data estimation methods. Sometimes the problem relates to data definitions; other times, the problem is due to the lack of data integration and updating procedures, with the result that some systems do not have the most up-to-date information. The implementation of the Performance Data Business Plan described below will ensure this issue is resolved.

- Avoid Linear Referencing Pitfalls. Trying to join aged, linearly referenced highway attribute data to an up-to-date cartographic model of highways is a sure formula for loss of data integrity. The real-world highway system, and the current cartographic model of it, is changed frequently by route retirements, route additions, and route re-measurements that occur whenever geometric changes are included in a project. Archived linearly referenced highway attributes can only be mapped correctly in a GIS application either by joining them to the matching archived cartographic model or by spatially transforming the archived attributes to be measured in the current linear referencing datum. Failure of GIS users to account for this temporal aspect of linear referencing systems is currently a major data integrity issue. One solution is to establish a business rule that requires all linearly referenced data enterprise-wide to be transformed to the current cartographic model, and to enforce the rule each time the cartographic model is updated. As previously mentioned, CDOT is working to address spatial relationship issues; the Data Plan will assist greatly by identifying roles, responsibilities, and rules to ensure data can be integrated spatially. (A simplified sketch of this transformation follows the list.)
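A simplified sketch of the spatial transformation described in the last bullet, assuming a crosswalk of matched control points between the archived and current datums exists for the route (real migrations must also handle route retirements and additions):

```python
import bisect

# Hypothetical crosswalk of matched control points: (old_mp, new_mp).
crosswalk = [(0.0, 0.0), (5.0, 5.2), (10.0, 10.1), (15.0, 15.6)]

def to_current_datum(old_mp):
    """Linearly interpolate an archived milepoint between matched
    control points to express it in the current referencing datum."""
    olds = [o for o, _ in crosswalk]
    i = bisect.bisect_right(olds, old_mp)
    if i == 0 or i == len(crosswalk):
        raise ValueError("milepoint outside calibrated range")
    (o0, n0), (o1, n1) = crosswalk[i - 1], crosswalk[i]
    return n0 + (old_mp - o0) / (o1 - o0) * (n1 - n0)

print(round(to_current_datum(7.5), 3))  # archived MP 7.5 -> current datum
```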

Successful implementation of a performance-based system will continue to result in the following for CDOT:

- Improved system and organizational performance;
- Greater results with constrained resources, and fewer investments with low performance benefits;
- Strengthened accountability with elected officials and stakeholder groups; and
- Improved communication with the full range of stakeholders.


B. CDOT Supporting Measures

The following table summarizes all of the measures and supporting data currently reported by CDOT. The measures are organized (and color coded) by Functional Category and Investment Category (Mobility, System Quality, Safety, Program Development, or Non, for measures that do not fit a category). Raw data are also indicated. The chart could be further stratified to show the relationship to the Public Budget categories; however, this was not accomplished for this project.

Several of these measures also serve as supporting measures for the nine recommended measures. For example, the supporting measures for mobility include minutes of delay, congested corridors, volume-to-capacity ratio, etc. It should be noted that this list of data is for performance measures only – the Data Governance section refers to ALL data collected at CDOT.

Table B-1<br />

<strong>CDOT</strong> Measures<br />

<strong>Plan</strong>ning Ext<br />

<strong>Plan</strong>ning<br />

Int<br />

PR<br />

Gov<br />

Rela FHWA<br />

Functional Category Investment Category <strong>Data</strong>/Measure Owner<br />

Source<br />

(System, File)<br />

2035 Statewide <strong>Plan</strong><br />

Annual Perf Report<br />

Deficit Report<br />

Strategic <strong>Plan</strong><br />

Chief Eng Obj (Internal)<br />

Contracting Improv Report<br />

Fact Book<br />

Annual Report<br />

Elected Officials Guide<br />

Transition Report<br />

FHWA-<strong>CDOT</strong> Stewardship<br />

Count<br />

Economic/Freight MOB Avg. time to cross borders 0<br />

Economic/Freight MOB Travel Time Saved ($/resident) x 1<br />

Economic/Freight MOB Health/Service Accessibility x 1<br />

Economic/Freight MOB Job Accessibility 0<br />

Economic/Freight MOB Labor Force Accessibility 0<br />

Economic/Freight MOB Market Accessibility 0<br />

Economic/Freight MOB Destination Accessibility x 1<br />

Economic/Freight MOB Non-Work Accessibility 0<br />

Economic/Freight MOB Network Utility / Connect 0<br />

<strong>Cambridge</strong> <strong>Systematics</strong>, Inc. B-1


Appendices<br />

<strong>Plan</strong>ning Ext<br />

<strong>Plan</strong>ning<br />

Int<br />

PR<br />

Gov<br />

Rela FHWA<br />

2035 Statewide <strong>Plan</strong><br />

Annual Perf Report<br />

Deficit Report<br />

Strategic <strong>Plan</strong><br />

Chief Eng Obj (Internal)<br />

Contracting Improv Report<br />

Fact Book<br />

Annual Report<br />

Elected Officials Guide<br />

Transition Report<br />

Functional Category Investment Category <strong>Data</strong>/Measure Owner<br />

Source<br />

(System, File)<br />

Mobility MOB Inches of snow fall by region x 1<br />

Mobility MOB Minutes of Delay on Congested Corridor (Vehicle) IMB x x x x 4<br />

Mobility MOB Minutes of Delay on Congested Corridor (Vehicle) in next LRTP year IMB x x x 3<br />

Mobility MOB On-Time Buses on US 36 R6 - Stephen Sperry RTD Denver x 1<br />

Mobility MOB Congested Corridors IMB x x 2<br />

Mobility MOB Congested Corridors where ITS implemented ITS - Bruce Coltharp x 1<br />

Mobility MOB Congested Corridors where incident management plans implemented ITS - Bruce Coltharp x 1<br />

Mobility MOB Congested Corridors where ramp metering implemented ITS - Bruce Coltharp x 1<br />

Mobility MOB Ramp Metering Travel Time Benefits ITS - Bruce Coltharp x 1<br />

Mobility MOB Incident Clearance Time ITS - Bruce Coltharp x 1<br />

Mobility MOB Average incident clearance time along I-70 ITS - Bruce Coltharp x 1<br />

Mobility MOB Avg lgth of winter closures due to inc on I-70 (Heavy Tow Program) ITS - Bruce Coltharp x 1<br />

Mobility MOB Number of vehicles served by Courtesy Patrol Program ITS - Bruce Coltharp x 1<br />

Mobility MOB Minutes of Delay (Individual) IMB 0<br />

Mobility MOB Minutes of Delay per VMT IMB 0<br />

Mobility MOB Volume to Capacity Ratio IMB 0<br />

Mobility MOB Weighted Avg. AADT per Lane IMB 0<br />

Mobility MOB %Lane Miles congested IMB x x 2<br />

Mobility MOB %Urban Lane Miles congested IMB x x 2<br />

Mobility MOB #Miles congested<br />

Mobility MOB Comparison of Growth in Vol. & Loadings on Interstate IMB 0<br />

Mobility MOB % Distribution of Traffic Vol. & Loadings on Rural Interstate IMB 0<br />

Mobility MOB Length by Volume-Service Flow Ratio IMB 0<br />

Mobility MOB <strong>Plan</strong>ning Time Index (reliability) IMB 0<br />

Mobility MOB Trip Travel Time IMB 0<br />

Mobility MOB Travel Time Index IMB 0<br />

Mobility MOB Travel Time Saved (hours/resident) IMB x 1<br />

Mobility MOB Travel Time Variation (DRCOG: ratio of peak hour to non-peak hour) IMB 0<br />

Mobility MOB Congested center line miles IMB 0<br />

Mobility MOB Congestion severity (DRCOG: % of corridors delayed during peak travel) IMB 0<br />

Mobility MOB Mobility Level of Service IMB 0<br />

FHWA-<strong>CDOT</strong> Stewardship<br />

Count<br />

<strong>Cambridge</strong> <strong>Systematics</strong>, Inc. B-2


Appendices<br />

<strong>Plan</strong>ning Ext<br />

<strong>Plan</strong>ning<br />

Int<br />

PR<br />

Gov<br />

Rela FHWA<br />

2035 Statewide <strong>Plan</strong><br />

Annual Perf Report<br />

Deficit Report<br />

Strategic <strong>Plan</strong><br />

Chief Eng Obj (Internal)<br />

Contracting Improv Report<br />

Fact Book<br />

Annual Report<br />

Elected Officials Guide<br />

Transition Report<br />

Functional Category Investment Category <strong>Data</strong>/Measure Owner<br />

Source<br />

(System, File)<br />

Mobility | MOB | %Travelers using non-SOV | IMB | 0
Mobility | MOB | Worker Travel Modes | IMB | x | 1
Mobility | MOB | Avg. speed on Interstate/NHS | IMB | 0
Mobility | MOB | Avg. Time for Different Segments | IMB | x | 1
Mobility | MOB | Lane Closures (lost lane-hours) | IMB | 0
Mobility | MOB | % corridors near full speed (cotrip.org) | ITS - Bruce Coltharp | 0
Mobility | MOB | I-25 HOT Lane Usage | HPTE? | x | 1
Mobility | MOB | Transit Trip Demand | DTR | x | 1
Mobility | MOB | Transit Ridership | DTR | x x | 2
Mobility | MOB | Dollar value of total FTA funding for grant programs | DTR | x | 1
Mobility | MOB | #Miles snowplowed, sanded or de-iced | M&O - B.J. McElroy | SAP | x | 1
Mobility | MOB | Average # of vehicles passing through Eisenhower-Johnson daily | IMB | x | 1
Mobility | MOB | Average # of vehicles passing through Eisenhower-Johnson annually | IMB | x | 1
Raw Data | MOB | VMT - CO | IMB | x x | 2
Raw Data | MOB | VMT - State Hwy | IMB | x x | 2
Raw Data | MOB | %∆ in VMT on State Hwy over previous year | IMB | x | 1
Raw Data | MOB | DVMT by CDOT Region | IMB | x x | 2
Raw Data | MOB | Truck DVMT by CDOT Region | IMB | x | 1
Raw Data | MOB | VMT/Capita | IMB | 0
Economic/Freight | NON | Freight volume by mode | x | 1
Economic/Freight | NON | Freight volume by corridor | 0
Economic/Freight | NON | Energy development by corridor | x | 1
Economic/Freight | NON | #Airports | x | 1
Economic/Freight | NON | Travel Spending | x | 1
Economic/Freight | NON | Vehicle Maint Saved ($/resident) | x | 1
Economic/Freight | NON | Long-Term Job Creation | x | 1
Economic/Freight | NON | %State gaming revenue from Central City and Black Hawk | x | 1
Environment | NON | Wetland impact and replacement ratios | EPB - Tom Boyce | x | 1
Environment | NON | Water Quality Measure | EPB - Tom Boyce | x | 1
Environment | NON | Water Quality Violations | EPB | x | 1
Environment | NON | %RECAT findings addressed within 48 hrs | EPB | x | 1




Environment | NON | Trans-related Air Quality Emissions | EPB | 0
Environment | NON | CDOT use of Stormwater Practices | EPB | 0
Environment | NON | CDOT energy, recycle, carbon | EPB | 0
Environment | NON | Annual highway greenhouse gas emissions | EPB | x | 1
Environment | NON | Environmental Justice Impact | EPB | 0
Environment | NON | Fuel consumption saved/resident | EPB | x | 1
Environment | NON | Petroleum consumption | EPB | 0
Mobility | NON | MLOS - Snow & Ice Control | M&O - B.J. McElroy | SAP | x x x x x x | 6
Mobility | NON | E-470 Toll Revenue | HPTE | 0
Mobility | NON | #Tons of solid de-icer | M&O - B.J. McElroy | SAP | x | 1
Mobility | NON | #Gallons of liquid de-icer | M&O - B.J. McElroy | SAP | x | 1
Mobility | NON | #Feet of snow fence repaired and installed | M&O - B.J. McElroy | SAP | x | 1
Safety | NON | Workers' Comp Claims | Risk Mgt - Tracie Smith | x x | 2
Safety | NON | Workers' Comp Claim $ | Risk Mgt - Tracie Smith | x | 1
PD - | PD - | Transit Contracts Processed | TDR | 0
PD - DTD Planning | PD - DTD | CPG and Rural PO | PPB | x | 1
PD - DTD Planning | PD - DTD | #HPMS and other transportation data re-submittals required | PPB | x | 1
PD - DTD Planning | PD - DTD | Transit Contracts - Avg. Days | TDR | 0
PD - DTD Planning | PD - DTD | CMAQ Contracts Processed | PPB | 0
PD - DTD Planning | PD - DTD | CMAQ Contracts - Avg. Days | PPB | 0
PD - DTD Planning | PD - DTD | CPG Contracts Processed | PPB | 0
PD - DTD Planning | PD - DTD | CPG Contracts - Avg. Days | PPB | 0
PD - DTD Planning | PD - DTD | Total value of federal grants administered by DTD | PPB | x | 1
PD - Engineering | PD - ENG | Region Allocation % budget advertised/encumbered/obligated | x | 1
PD - Engineering | PD - ENG | % FASTER Safety budget encumbered | x | 1
PD - Engineering | PD - ENG | % ITS capital budget advertised | x | 1
PD - Engineering | PD - ENG | % Rock fall mitigation budget encumbered | x | 1
PD - Engineering | PD - ENG | % Regional priority program budget encumbered | x | 1
PD - Engineering | PD - ENG | % Surface treatment budget obligated | x | 1
PD - Engineering | PD - ENG | % Surface treatment projects match PMS sw plan | x | 1
PD - Engineering | PD - ENG | % MLOS budget expended | x | 1




PD - Engineering | PD - ENG | % of engineering estimates within +/- 10% of low bid on all projects | x | 1
PD - Engineering | PD - ENG | $ for projects on the shelf per region | x | 1
PD - Finance | PD - FIN | Projects executed by local agencies or sub-grantees as a % of projects authorized for construction | OFMB | x | 1
PD - Finance | PD - FIN | %STIP projects advanced in year promised | OFMB | x | 1
PD - Finance | PD - FIN | 7th Pot Progress | OFMB - Pat Girten | x | 1
PD - Finance | PD - FIN | ARRA Obligations | CE - Charles Meyer | 0
PD - Finance | PD - FIN | FASTER-Safety Projects | OFMB | 0
PD - Finance | PD - FIN | Admin Expenses % of Total Budget | OFMB | 0
PD - HR | PD - HR | %Employee Turnover | CHRM - Deb Haglund | x | 1
PD - ITS | PD - ITS | Cotrip use | ITS | 0
PD - ITS | PD - ITS | Concern by Investment Category | x | 1
PD - ITS | PD - ITS | Web hits to cotrip.org | ITS | x | 1
PD - ITS | PD - ITS | Employee engagement | 0
PD - ITS | PD - ITS | Cotrip satisfaction | ITS | 0
PD - ITS | PD - ITS | Calls to road hotline | ITS | x | 1
PD - MLOS | PD - MLOS | MLOS - Planning & Training | M&O - B.J. McElroy | SAP | x x x | 3
PD - Operations | PD - OPS | %Design on Time | PPB - JoAnn Mattson | SAP | x x x x | 4
PD - Operations | PD - OPS | %DBE participating in FHWA program | CEO - Debra Gallegos | x x x x | 4
PD - Operations | PD - OPS | %Construction Projects on Time | PPB - JoAnn Mattson | SAP | x x | 2
PD - Operations | PD - OPS | %Design/ROW paid w/Fed funds not advanced to construction | x | 1
PD - Operations | PD - OPS | #DBEs using CDC and DBE supportive services | CEO - Debra Gallegos | x | 1
PD - Operations | PD - OPS | %DBE payments equal goal as awarded at project completion/final invoice | CEO - Debra Gallegos | x | 1
PD - Operations | PD - OPS | Cumulative DBE goal set by region | CEO - Debra Gallegos | x | 1
PD - Operations | PD - OPS | #Payment complaints by DBEs/ESBs and % compared to non-DBE/ESBs | CEO - Debra Gallegos | x | 1
PD - Operations | PD - OPS | #Days to investigate Title VI complaints | CEO - Debra Gallegos | x | 1
PD - Operations | PD - OPS | #ADA investigations and days to investigate | CEO - Debra Gallegos | x | 1
PD - Operations | PD - OPS | Dollar value of total amount of purchases and contracts processed | Procurement | x | 1
PD - Operations | PD - OPS | #Purchase orders and contracts processed | Procurement | x | 1
PD - Operations | PD - OPS | #Price agreements | Procurement | x | 1
PD - Operations | PD - OPS | #Projects sent to bid | Contracts & Mkt Analysis | x | 1
PD - Operations | PD - OPS | Total dollar value of all projects sent to bid | Contracts & Mkt Analysis | x | 1




PD - Operations | PD - OPS | % Low bids within +/- 15% of Engineer's Estimate on projects over $250K | Contracts & Mkt Analysis | x x | 2
PD - Operations | PD - OPS | %Projects completed within commitment amount | Contracts & Mkt Analysis | x | 1
PD - Operations | PD - OPS | #Major (>$250K) Change Orders | Contracts & Mkt Analysis | x | 1
PD - Operations | PD - OPS | #Change orders for time extensions | Contracts & Mkt Analysis | x | 1
PD - Operations | PD - OPS | #Change Orders (<$250K) | Contracts & Mkt Analysis



Raw Data | RAW | Forecast Revenue by Source | OFMB | x x x x x x | 6
Raw Data | RAW | Investment by Category | OFMB | x x x x | 4
Raw Data | RAW | Budget at program level | OFMB | x | 1
Raw Data | RAW | Actual Expenditures | OFMB | x | 1
Raw Data | RAW | VMT Forecast through next LRTP Year | IMB | x x | 2
Raw Data | RAW | CO Population | U.S. Census Bur. | x x | 2
Raw Data | RAW | CO Population Growth thru next LRTP | U.S. Census Bur. | x x | 2
Raw Data | RAW | Population by Region in current year and next LRTP Year | U.S. Census Bur. | x | 1
Raw Data | RAW | CO Employment | x | 1
Raw Data | RAW | Gasoline Price | OFMB | 0
Raw Data | RAW | Relative (to 1992 $) value of motor fuel tax | OFMB | x | 1
Raw Data | RAW | CO HUTF Distribution | OFMB | x | 1
Raw Data | RAW | Forecast Spend by Mode | OFMB | x | 1
Raw Data | RAW | Forecast Spend by Investment Category | OFMB | x | 1
Raw Data | RAW | FASTER Bridge Safety Surcharge Forecasts | OFMB | x | 1
Raw Data | RAW | FASTER Road Safety Surcharge Forecasts | OFMB | x | 1
Raw Data | RAW | FASTER Daily Rental Car Fee Forecast | OFMB | x | 1
Raw Data | RAW | FASTER Oversize/Overweight Vehicle Charge Revenue Forecast | OFMB | x | 1
Raw Data | RAW | CO (state and local) FTA funds by program area | OFMB | x | 1
Raw Data | RAW | CO (state and local) FHWA transit funds | OFMB | x | 1
Raw Data | RAW | CDOT federal transit funds (FTA and FHWA) | OFMB | x | 1
Raw Data | RAW | CO Gaming Funds Revenue | OFMB | x | 1
Raw Data | RAW | CO Capital Construction Funds Revenue | OFMB | x | 1
Raw Data | RAW | CCI | Contracts & Mkt Analysis | x | 1
Raw Data | RAW | Authorized FTE | CHRM - Deb Haglund | x | 1
Raw Data | RAW | Actual FTE Filled | CHRM - Deb Haglund | x | 1
Raw Data | RAW | Authorized FTE and vacancies by division/region | CHRM - Deb Haglund | x | 1
Raw Data | RAW | %Full service employees eligible for retirement | CHRM - Deb Haglund | x | 1
Raw Data | RAW | %Reduced service employees eligible for retirement | CHRM - Deb Haglund | x | 1
Raw Data | RAW | #Copies printed per month | Facilities | x | 1
Raw Data | RAW | Average daily value of Bid Plans Unit sales | Facilities | x | 1




Raw Data | RAW | Roadway Center Line Miles by State, County, City and Other | IMB - Lou Henefeld | x x x | 3
Raw Data | RAW | %∆ in State Hwy Center Line Miles over previous year | IMB - Lou Henefeld | x | 1
Raw Data | RAW | #Bridges by State, County, City and Other | Bridge - Mark Nord | PONTIS | x x x | 3
Raw Data | RAW | Vehicle Lane Miles - CO or State Hwy | IMB - Lou Henefeld | x x x x | 4
Raw Data | RAW | Center Line Miles by Region | IMB - Lou Henefeld | x | 1
Raw Data | RAW | %∆ in State Hwy Lane Miles over previous year | IMB - Lou Henefeld | 0
Raw Data | RAW | Lane Miles Growth Forecast thru next LRTP year | IMB - Lou Henefeld | x | 1
Raw Data | RAW | Miles of striping | M&O | SAP | x | 1
Raw Data | RAW | Miles of guardrail | M&O | SAP | x | 1
Raw Data | RAW | Miles of ditches | M&O | SAP | x | 1
Raw Data | RAW | #CDOT Signalized intersections total and by region | M&O | SAP | x x | 2
Raw Data | RAW | #CDOT signs total and by region | M&O | SAP | x x | 2
Raw Data | RAW | #CDOT sign structures total and by region | M&O | SAP | x | 1
Raw Data | RAW | #CDOT Regional maintenance positions filled | CHRM - Deb Haglund | x | 1
Raw Data | RAW | #CDOT HQ maintenance positions filled | CHRM - Deb Haglund | x | 1
Raw Data | RAW | #Maintenance patrols by region | M&O | SAP | x | 1
Raw Data | RAW | Average Cost of New Construction Project | Contracts & Mkt Analysis | x | 1
Raw Data | RAW | Average Cost of a Widening Project | Contracts & Mkt Analysis | x | 1
Raw Data | RAW | Average Cost of a Reconstruction Project | Contracts & Mkt Analysis | x | 1
Raw Data | RAW | Average Cost of a New Interchange | Contracts & Mkt Analysis | x | 1
Raw Data | RAW | Average Cost of a Resurfacing Project | Contracts & Mkt Analysis | x | 1
Raw Data | RAW | Average Cost to Maintain one lane mile | M&O - B.J. McElroy | SAP | x | 1
Raw Data | RAW | Average cost per plow mile | M&O - B.J. McElroy | SAP | x x | 2
Raw Data | RAW | #Plow Miles | M&O - B.J. McElroy | SAP | x | 1
Raw Data | RAW | Registered Vehicles in CO by Auto, Bus, Truck, Motorcycle and Other | CO Dept. of Rev. | x | 1
Raw Data | RAW | Total Licensed Drivers in CO and by Gender | CO Dept. of Rev. | x x | 2
Raw Data | RAW | Hits to coloradodot.info | Public Information | x | 1
Raw Data | RAW | Twitter followers | Public Information | x | 1
Raw Data | RAW | GovDelivery recipients | Public Information | x | 1
Safety | SAF | Fatality Rate (per 100M VMT) | STE - Rahim Marandi | x x x x x x | 6
Safety | SAF | Fatal Crash Rate (per 100M VMT) | STE - Rahim Marandi | x | 1




Safety | SAF | Serious Injury Crash Rate (per 100M VMT) | STE - Rahim Marandi | x | 1
Safety | SAF | % Seat Belt Use | STE - Ilana Erez | x x x x x | 5
Safety | SAF | Total Crash Rate (per 100M VMT) | STE - Rahim Marandi | x x | 2
Safety | SAF | Fatal Crashes - Alcohol (% of all crashes) | STE - Rahim Marandi | x | 1
Safety | SAF | CDOT Vehicle Accidents | Risk Mgt - Tracie Smith | x x | 2
Safety | SAF | MLOS - Traffic Services | M&O - B.J. McElroy | SAP | x x x | 3
Safety | SAF | #Crashes in CO | STE - Rahim Marandi | x | 1
Safety | SAF | %Crashes occurring on CO state hwys | STE - Rahim Marandi | x | 1
Safety | SAF | Fatalities - Alcohol | STE - Rahim Marandi | x x x x | 4
Safety | SAF | Crash data processing time | x | 1
Safety | SAF | #Traffic Fatalities | STE - Rahim Marandi | x x x x | 4
Safety | SAF | Number of Serious Injuries | STE - Rahim Marandi | 0
Safety | SAF | Injuries | STE - Rahim Marandi | x | 1
Safety | SAF | Fatalities - Pedestrian | STE - Rahim Marandi | x x | 2
Safety | SAF | Fatalities - Speed | STE - Rahim Marandi | x | 1
Safety | SAF | Crash Reduction Factor | 0
Safety | SAF | Fatalities - No Seat Belt | STE - Rahim Marandi | x x x | 3
Safety | SAF | Fatalities - Drivers and passengers | STE - Rahim Marandi | x x | 2
Safety | SAF | Fatalities - Drivers and passengers other vehicle | STE - Rahim Marandi | x | 1
Safety | SAF | Fatalities - Motorcycles | STE - Rahim Marandi | x x x | 3
Safety | SAF | Fatalities - Motorcycles no helmet | STE - Rahim Marandi | x | 1
Safety | SAF | Fatalities - Drivers and passengers <



Safety | SAF | Car seat use <



Sys Qual | SYS | #AAH volunteers | M&O | SAP | x | 1
Sys Qual | SYS | #Bags of trash cleared with corporate sponsors | M&O | SAP | x | 1
Sys Qual | SYS | #Corporate sponsors of ROW hwy trash clearing | M&O | SAP | x | 1
Sys Qual | SYS | #State bridges by region | Bridge - Mark Nord | PONTIS | x | 1
Sys Qual | SYS | %OnSys Bridge Deck P | Bridge - Mark Nord | PONTIS | x | 1
Sys Qual | SYS | %OnSys Bridge Deck SD | Bridge - Mark Nord | PONTIS | x | 1
Sys Qual | SYS | %OnSys Bridge Deck FO | Bridge - Mark Nord | PONTIS | x | 1
Sys Qual | SYS | #BE Bridges by region and project status | Bridge - Mark Nord | PONTIS | x | 1
Sys Qual | SYS | #NHS Bridge SD | Bridge - Mark Nord | PONTIS | x | 1
Sys Qual | SYS | #Signs and sign posts replaced | M&O | SAP | x | 1
Sys Qual | SYS | #Linear feet of fencing replaced, installed or repaired | M&O | SAP | x | 1
Sys Qual | SYS | #Known avalanche paths | M&O | SAP | x | 1
Sys Qual | SYS | #Avalanche paths monitored and controlled | M&O | SAP | x | 1
Sys Qual | SYS | #Avalanches triggered by explosives | M&O | SAP | x | 1
Sys Qual | SYS | #Hwys impacted by triggered avalanches | M&O | SAP | x | 1
Sys Qual | SYS | #Hours of hwy closures due to avalanche control | M&O | SAP | x | 1
Sys Qual | SYS | #Hours spent performing avalanche control activities | M&O | SAP | x | 1
Sys Qual | SYS | % Scour critical bridges that have had plans of action updated after 2008 | Bridge - Mark Nord | PONTIS | x | 1
Sys Qual | SYS | Linear feet of bridge expansion joint in condition state 1 | Bridge - Mark Nord | PONTIS | x | 1
Sys Qual | SYS | #Bridges G/F/P by Functional Classification | Bridge - Mark Nord | PONTIS | x | 1
Sys Qual | SYS | %Resurfacing projects recommended in PMS annual review | Materials & Geotech - S. Henry | Deighton | x | 1
Sys Qual | SYS | % Surface treatment funds expended on pavement preservation by region | Materials & Geotech - S. Henry | Deighton | x | 1
Sys Qual | SYS | %NHS pavement with IRI <



Sys Qual | SYS | #Culverts total and by region | Bridge - Mark Nord | PONTIS | x | 1
Sys Qual | SYS | %Culverts G/F/P total and by region | Bridge - Mark Nord | PONTIS | x | 1
Sys Qual | SYS | %Pavement G/F Forecast | Materials & Geotech - S. Henry | Deighton | 0
Sys Qual | SYS | %Pavement Avg. IRI | Materials & Geotech - S. Henry | Deighton | 0
Sys Qual | SYS | Length by Measured Pavement Roughness | Materials & Geotech - S. Henry | Deighton | 0
Sys Qual | SYS | MLOS - Overall Forecast | M&O - B.J. McElroy | SAP | x x | 2
Sys Qual | SYS | Cost to Meet MLOS at x level | M&O - B.J. McElroy | SAP | x x | 2



C. References

Albir, S., UML In A Nutshell – A Desktop Quick Reference. Sebastopol, CA: O'Reilly and Associates, Inc., 1998.

Cambridge Systematics, Inc., Data Business Plan Performance Measures High Level System Design. Alaska Department of Transportation and Public Facilities, 2007.

Cambridge Systematics, Inc., Alaska Data Business Plan – Data Sharing and Data Delivery Methods and Tools. Alaska Department of Transportation and Public Facilities, September 2009.

Cambridge Systematics, Inc., Data Business Plan Performance Measures Concept of Operations. Alaska Department of Transportation and Public Facilities, September 2005.

Cambridge Systematics, Inc., Data Business Plan Performance Measures Summary of Stakeholders Interviews. Alaska Department of Transportation and Public Facilities, July 2007.

Cambridge Systematics, Inc., Data Business Plan Performance Measures Data Governance, Standards and Knowledge Management. Alaska Department of Transportation and Public Facilities, September 2009.

Cambridge Systematics, Inc., Data Business Plan Performance Measures Technical Information Sources – White Paper. Alaska Department of Transportation and Public Facilities, July 2007.

Few, Stephen, Information Dashboard Design: The Effective Visual Communication of Data. O'Reilly, 2006.

FHWA, National ITS Architecture, Version 5.0. Washington: U.S. DOT, October 2003.

FHWA, Traffic Data Quality Measurement Report. September 2004.

Ganapati, Sukumar, Use of Dashboards in Government. IBM Center for The Business of Government, 2011.

Grant, et al., Congestion Management Process: A Guidebook. FHWA, 2011.

Institute of Electrical and Electronics Engineers (IEEE), Recommended Practice for Software Requirements Specifications. IEEE Std 830-1998, June 25, 1998.

NCHRP, Target-Setting Methods and Data Management to Support Performance-Based Resource Allocation by Transportation Agencies, Volumes I and II, Project 08-70, NCHRP Reports 666 and 706, 2011.

NCHRP, Transportation Data Self Assessment Guide. Project 08-36, Task 100, 2011.

North Carolina Department of Transportation, Performance Dashboard Documentation – Definitions, Rationale and Supporting Information for the Performance Dashboard. Revised September 2010.

Person, Ron, Balanced Scorecards & Operational Dashboards with MS Excel. Wiley Publishing, 2009.

State of Washington Transportation Improvement Board website, http://www.tib.wa.gov/performance/Performance.cfm (accessed October 19, 2011).

Virginia Department of Transportation, Project Dashboard Release 3.0 Business Rules and User's Information. Updated August 2011.

Western Transportation Institute, RWIS Usage Study. Alaska Department of Transportation and Public Facilities, October 2007.

White, Allan, "Operational, tactical, strategic… what kind of dashboard do you have?" Klipfolio blog, http://www.klipfolio.com/blog/entry/347 (accessed April 7, 2011).

Wikipedia, Dashboard (Management Information Systems). November 23, 2011. http://en.wikipedia.org/wiki/Dashboard_%28management_information_systems%29 (accessed December 20, 2011).


D. Stakeholder Interview Summary

Memorandum

TO: Cathy Cole, Colorado DOT
FROM: Anita Vandervalk and Joe Guerre
DATE: May 19, 2011
RE: CDOT Performance Data Business Plan Interviews

Core Team Meeting

Attendance:
Cathy Cole – Planning and Performance Branch
Kate Dill – Planning and Performance Branch
Joe Guerre – Cambridge Systematics
Danny Herrman – Region 6 Planner
Myron Hora – Region 4 Planning and Environmental Manager
William Johnson – Information Management Branch
Sandi Kohrs – Planning and Performance Branch
JoAnn Mattson – Planning and Performance Branch
Scott Richrath – Performance and Policy Unit
Anita Vandervalk – Cambridge Systematics

One of the objectives of this project is to identify the 5 to 8 highest priority measures and provide data governance details for each. The measures should expand beyond pavements and bridges. Information to be provided for each measure includes definition, units, timing, data items, data sources, calculations, etc. CDOT would like to be able to drill down into more detail on these top measures.

Project Considerations:
- Change management challenges associated with using performance measures to inform decisions.
- CDOT is required to report to the governor on the implementation of CDOT's strategic plan (2-7-202(13)(a), Colorado Revised Statutes).
- Public survey results will be available in approximately August. Results can help to identify public priorities and help to frame performance results for public consumption.
- CDOT would like to analyze future performance scenarios (10-year time horizon).
- There is a need to connect CDOT's measures with the measures and decisions made within individual departments and regions.
- The performance framework should allow for goals, objectives and measures to evolve over time.
- Work is underway to implement the Public Budget Formulation (PBF) module of SAP. This module is emerging as the performance measure data warehouse. To ensure consistency with these and other CDOT efforts, this project should be technology-neutral and not address system architecture issues.
- Data governance recommendations need to identify data owners and formalize their role in the data QA/QC process.
- Dashboard recommendations should be developed from the perspective of the public.

Kick Off Meeting

Attendance:
Mehdi Baziar – DTD
Tony Bemelen – SAP Support
Dave Bouwman – DTS
Patrick Byrne – OFMB
Eric Chavez – Pavement Management
Cathy Cole – DTD
Kathleen Collins – Statewide Planning
Darrell Johnson – OFMB/CDOT
Glenn Davis – OTS
Kate Dill – DTD-PPA
Deb Haglund – CHRM
Stephen Henry – Pavement Management
Dan Herrmann – CDOT-R6 Planning
Myron Hora – R4 CDOT
William Johnson – DTD IMB
Sandi Kohrs – Planning and Performance
Tammy Lang – DTD/IMB
Mark Leonard – Staff Bridge
Jeff Loeper – Staff Branches
JoAnn Mattson – DTD Performance
B.J. McElroy – M&O
Gregg Miller – SAP
Mark Nord – Staff Bridge
Steve Olson – SM&G Branch
Darius Pakbaz – SAP Support
Adrianne Raiche – EO
Karen Rowe – SAP Support
Pat Saffo – OFMB
Tracie Smith – Risk Management
Darin Stavish – Region 1 Planning
Jeff Sudmeier – DTD Planning
Casey Tighe – Audit
Will Ware – CFMB
David Wieder – Maintenance & OPS Branch
Beverly Wyatt – CHRM
Scott Young – Staff Branches

The following question was presented to the group, and their responses follow: "What do you hope to gain from the study and what can you contribute?"

- To unify data collection across CDOT
- Bring connection to SAP
- Revenue forecasting and resource allocation
- Would like to see higher-level organizational measures and results for management
- Quality of measures across the stewardship agreement
- Chief Objectives reports
- Mentioned Transport software
- Interested in mobility, congestion, and speed data
- Would like to see delay added as a measure
- Track risk management associated with human resources
- Have information on audit compliance reviews and the Efficiency and Accountability Committee
- Interested in data governance (who owns the data?)
- Roles and responsibilities; collect once using a linear referencing system to share information; and mobility performance measures
- Benefit/cost measures in the region
- Use performance measures to see where we are and where we are going
- Impacts of investments and outcomes (10 and 20 years in the future)
- A vital few measures are desirable; however, need to be able to drill down to ones that are more detailed
- Would like to see concrete measures – objective rather than subjective
- Would like measures in real time for making decisions
- Would like better coordination and streamlining of data and performance measures, to be accessible for long range planning and sharing with all parties
- Maintenance performance-based budget
- Would like to see the Department be more efficient in reporting data, with more standard/universal methods to report and avoid variations
- CDOT spends a lot of time on PMs and needs to be more efficient
- Measures and reports frequently; would like to see a reduction in redundancy of data (i.e., both Maintenance and Pavement collecting the same data items)
- Not every lane mile may be necessary for pavement data collection
- Should optimize among departments (i.e., run the same data collection van)
- Would like to figure out the best method to obtain information from engineers
- Use for formulating budgets, analysis, refining strategic plans
- Would like a repository/library for information to reside in and standard ways to report
- Would like to see performance measures drill down from the highest level (Strategic Plan) to the IPO level so employees have a "line of sight" between them
- Need performance measures to help track what we should do in 2013, 2015, etc.
- Would like to see all reports in SAP
- An external style guide would be appropriate
- Agreement on valued measures


- How and when they are defined (quarterly? annually?)
- Connection of macro and micro levels
- Allow flexibility to change measures
- SAP will help CDOT avoid silos and better integrate performance reporting. This project will feed into requirements for the dashboard.

Interviews

Twenty-four interviews were conducted with a wide variety of headquarters and region CDOT staff. Comments were received related to CDOT's performance measurement process, data management/governance, and specific recommended high-level performance measures. The recommendations from the interviews were used to create the first version of the framework. The comments will also be used in the remainder of the project. The following summarizes the main take-away points related to the project.

CDOT Annual Report: The report may be sending the wrong message, because too many of the measures are shown as a green light. This indicates that a target has been met, but does not communicate whether the target is a "good" target or not. The report could be improved by providing trend information. For example, alcohol fatalities have gone down (a favorable trend); however, the measure is still reported as a red light in the performance report.

Data Governance: Data governance standards should clearly define measures. For example, the definition of "injury" has changed over time, and "crash rates" are not calculated consistently (e.g., crashes per 100 million VMT and crashes per million VMT are both used).
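As a minimal illustration of why the convention matters (hypothetical counts, not CDOT data), the two normalizations differ by a factor of 100 for identical underlying data:

```python
# Hypothetical illustration: the same crash and exposure data yield rates
# that differ by a factor of 100 under the two conventions in use.
crashes = 3_071                # assumed annual crash count (illustrative)
vmt = 1_000_000_000            # assumed annual vehicle miles traveled

per_100m_vmt = crashes / vmt * 100_000_000   # 307.1 crashes per 100M VMT
per_1m_vmt = crashes / vmt * 1_000_000       # 3.071 crashes per 1M VMT

# A data governance standard should fix one convention (and the VMT
# source) so rates are comparable across reports and over time.
print(per_100m_vmt, per_1m_vmt)
```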

Possible obstacles to performance measures and data governance include change management (attitudes and behavior/culture change).

There is a disconnect with respect to data; i.e., different branches owning different data makes it difficult to establish performance measures.

Specific Measures: Safety should not be included in overall program-level tradeoffs, because much of the work is done (and the benefits achieved) through other programs.

The mobility measure should be based on delay on a subset of urban corridors, which would be more meaningful than a statewide number that includes a significant rural component.

Reliability is a good measure, but very difficult to explain to the public.

A recommended pavement measure is "Percent pavement in good/fair condition (based on RSL)."

There is an opportunity for tighter coordination between the pavement surface LOS and the capital pavement measure (RSL). This connection is tighter for bridges because the data used for bridge LOS is provided by the bridge management system.

The recommended bridge measure is "Percent of deck area on bridges in good/fair condition" (includes bridges and major culverts).


The two most important delivery measures are on-time and on-budget:

- Options for on-time include obligating money (the same as advertising the project), starting the project, and completing the project. The highest priority for CDOT is the advertisement date (consistent with a previous interview).
- The public is most interested in: contract estimated completion, actual completion, and the reason for any difference. The following should be measured: 1. How well did CDOT do the estimate? 2. Was it met? 3. Reasons if not.
- Good milestones to track would be: 1. Is the budget set up before going to bid? 2. Was there a change in budget after bid? 3. If changes occurred, are they justified? 4. What was the final budget? The fundamental data is already in SAP to track these four; the dates just need to be tightened.
- The ideal on-time measure would have thresholds for "on-time" that vary by project type. For example, a resurfacing project could be within 1 month of the ad date, while a bigger project could be within 2-3 months.
- The target for an on-time measure based on ad date should not be 100 percent; that would indicate that staff is too conservative in their estimates. For this measure, a tight definition and process for when/how to calculate it are critical. For example, on-time could be defined as project ad within 30 days of the current ad date. An ideal measure might be project-based, e.g., percent of projects on-time, or percent of projects on-time weighted by project cost (see the sketch below).
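As a minimal sketch (assumed data and threshold, not CDOT's adopted definition), the two project-based variants suggested above could be computed as follows:

```python
# Sketch of the suggested delivery measures: percent of projects on time,
# and the same measure weighted by project cost. "On time" here uses the
# illustrative 30-days-from-current-ad-date threshold mentioned above.
from datetime import date

# (planned ad date, actual ad date, project cost) - hypothetical projects
projects = [
    (date(2011, 3, 1), date(2011, 3, 15), 2_500_000),
    (date(2011, 4, 1), date(2011, 6, 10), 800_000),
    (date(2011, 5, 1), date(2011, 5, 20), 12_000_000),
]

def on_time(planned, actual, threshold_days=30):
    # A project counts as on time if it advertised within the threshold.
    return (actual - planned).days <= threshold_days

pct_on_time = 100 * sum(on_time(p, a) for p, a, _ in projects) / len(projects)
pct_weighted = (100 * sum(c for p, a, c in projects if on_time(p, a))
                / sum(c for _, _, c in projects))
print(f"{pct_on_time:.0f}% of projects on time; "
      f"{pct_weighted:.0f}% on time weighted by cost")
```

The cost-weighted variant keeps one late mega-project from being masked by many small on-time projects.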

General

An audit of PMs was conducted, and it was found that the data was not checked.

CDOT needs to do better at measuring at the program level and at connecting federal funding with financing (e.g., how was money spent and what did we get for it?).

Set rules are needed for commonly reported data; e.g., there are different ways of reporting mileage (centerline miles, lane miles, two directions, etc.), so a 10-mile, four-lane highway can be reported as 10 centerline miles, 20 directional miles, or 40 lane miles.

An annual "publication" from IRIS to send to SAP is needed.

Need to be able to drill down on a measure so a person can see how their performance objectives link to those of the work unit, division and Department.

Reporting needs to be more outcome-oriented (e.g., rather than focusing on whether a bid was on time, what was the quality of the work?).

Regions

There appears to be a disconnect between project decisions made in regions and the performance measures reported by CDOT. It is important for regions to buy into future measures and feel a sense of accountability for them.

It is a good idea to break down the measures by region. The more information, the better for the decision makers.

If measures are reported by region, it will be important to provide annotations. For example, urban regions may have more difficulty with the on-time and on-budget measures than more rural regions.

There is less concern about variability between the regions on the network-level measures than on the delivery measures. Ideally the measures would help estimate the needs in the regions and identify which regions are being more or less efficient with their resources.

Dashboard

Dashboard design should not be constrained by the capabilities of Xcelsius. All external-facing information, including dashboards, needs to be reviewed by the Public Relations office.


E. Goals and Objectives

The Public Budget Categories are maintain, maximize, expand, deliver, pass-through/multimodal, and Transportation Commission Contingency/Debt.

The following Vision, Mission and Goals were obtained from CDOT in a spreadsheet, "Performance Measure Tracking.xlsx," dated April 6, 2011.

CDOT'S VISION

To enhance the quality of life and the environment of the citizens of Colorado by creating an integrated transportation system that focuses on moving people and goods and by offering convenient linkages among modal choices.

CDOT'S MISSION

To provide the best multi-modal transportation system for Colorado that most effectively moves people, goods and information.

CDOT'S TRANSPORTATION OPERATING PRINCIPLES

CUSTOMER FOCUS

CDOT will strengthen its relationships with the increasingly informed and interested citizenry by reinforcing the public participation process to include outreach, early involvement and review, candid and understandable presentations, and consistency in follow-up. The process must include local governments, interest groups, and formal organizations, along with methods to solicit and respond to the views of all those impacted by transportation performance, improvements and financing.

LEADERSHIP

CDOT will bring together varied interests to address the transportation needs and issues of Colorado's ever-changing social and physical environment. With a commitment to its vision, CDOT will utilize its unique statewide perspective and range of expertise in reaching optimal transportation solutions with its broad customer base.

PARTNERSHIP

CDOT will develop, support and/or participate in the formation of formal and informal partnerships for the quality development and implementation of Colorado's transportation goals. Through cooperative efforts and shared responsibilities, these partnerships will help to leverage the limited resources available, and tap new sources of support for transportation development in Colorado.

CDOT will solicit, support and/or participate in formal arrangements that further its Vision, Investment Strategy and Statewide Plan.

Unsolicited proposals made to CDOT should be consistent with the transportation planning process.


INTEGRATE REGIONAL AND STATEWIDE PRIORITIES

CDOT and the Transportation Commission recognize and support the various roles of our planning partners, and of transportation providers, in coordinating an integrated, intermodal transportation system for Colorado. CDOT will collaborate with our partners to build consensus for the integration of local, regional and statewide transportation priorities. In order to optimize a limited resource base, effective integration requires mutual respect while addressing the issues and priorities of competing interests.

FINANCIAL RESPONSIBILITIES

CDOT will pursue diverse and cooperative funding options to reflect the interrelated nature of all modes within the transportation system. Public understanding of the financial requirements of the transportation system is a prerequisite for developing additional funding options that are reliable, equitable, flexible, adequate and acceptable. In an increasingly competitive environment for already limited resources, CDOT acknowledges and shares the public's concern over the cost and efficiency of government services. CDOT will continue to enhance its financial management practices to demonstrate accountability toward achieving established benchmarks.

BALANCE QUALITY OF LIFE FACTORS

CDOT recognizes the complex interrelationship of the environment, economic vitality and mobility, and is committed to balancing these factors in the development and implementation of the statewide transportation plan. By working with local, regional and state interests, CDOT will advocate the development of a coordinated decision-making process that balances the long-range transportation, land use and quality of life needs in Colorado. It is not the intent of the Commission or CDOT to prohibit or interfere with local land use decisions.

ENVIRONMENT

CDOT will support and enhance efforts to protect the environment and quality of life for all its citizens in the pursuit of providing the best transportation systems and services possible. CDOT will:
- promote a transportation system that is environmentally responsible and encourages preservation of the natural and enhancement of the created environment for current and future generations;
- incorporate social, economic, and environmental concerns into the planning, design, construction, maintenance, and operations of the state's existing and future transportation system;
- through the active participation of the general public and federal, state and local agencies, objectively consider all reasonable alternatives to avoid or minimize adverse impacts;
- ensure that measures are taken to avoid and minimize the environmental impacts of construction and maintenance of the transportation system, that all activities are in compliance with all environmental statutes and regulations, and that mitigation commitments are implemented and maintained;
- plan, design, construct, maintain and operate the transportation system in a manner which helps preserve Colorado's historic and natural heritage and fits harmoniously into the community, local culture and the natural environment; and
- promote a sense of environmental responsibility for all employees in the course of all CDOT activities, going beyond environmental compliance and striving for environmental excellence.

ACCESSIBILITY, CONNECTIVITY, AND MODAL CHOICES

CDOT will promote a transportation system that is reliable and accessible to potential users, including the transportation disadvantaged. Accessibility includes the availability of modal choices and connectivity, ease of use, relative cost, proximity to service and frequency of service. CDOT will go beyond traditional single-occupancy vehicle highway improvements by emphasizing an approach to transportation planning, development, and maintenance that takes advantage of the inherent efficiencies of each mode. Such an approach is necessary to respond to the diverse needs of both urban and rural customers, to preserve and improve the environment, and to ensure the connectivity and interaction of modes.

SOCIAL RESPONSIBILITY

CDOT recognizes the value of human capital in achieving state goals, and maintains a commitment to fostering nondiscrimination practices in a safe and healthy work environment. Our commitment to fair and equitable business practices encompasses the interests of all of our customers. Overall, the general welfare of the total public will be reflected in CDOT's decision-making processes.

In everything we do, CDOT will be guided by certain values. We will:
- Take pride in our work and ourselves.
- Demand quality of ourselves.
- Strive to improve our personal skills and talents.
- Use resources wisely.

INVESTMENT CATEGORIES

SYSTEM QUALITY: Activities, programs and projects that maintain the (physical integrity/condition) function and aesthetics of the existing transportation infrastructure.

GOALS
- Cost-effectively maintain or improve the quality and serviceability of the physical transportation infrastructure
- Increase investment in system quality and in strategic projects

OBJECTIVES


- Maintain or improve the CY 2003 projected good/fair condition of the state highways through 2010
- Maintain or improve the CY 2003 projected good/fair condition of the major structures through CY 2010
- Maintain or improve the transportation system at the adopted annual maintenance level of service grade (in the System Quality program areas)
- Maintain or improve the average external customer satisfaction survey grade for state highways drivability
- Maintain or improve the customer satisfaction grade of the state highway system's appearance

SAFETY: Services, programs and projects that reduce fatalities, injuries and property damage for all users and providers of the system.

GOALS
- To create, promote and maintain a safe and secure transportation system and work environment
- Increase investment in safety and strategic projects

OBJECTIVES
- By CY 2010, reduce by 4% the total motor vehicle crash rate from the CY 2002 level (the CY 2002 rate is 307.1 crashes per 100 million vehicle miles of travel)
- By CY 2010, reduce by 20% the severity and economic loss of transportation-related motor vehicle crashes from CY 2002 levels
- By FY 2007, develop a CDOT homeland security plan
- Within 5 years, reduce the CDOT employee injury rate by 50% and reduce construction contractor employee fatalities (based on an average of three years of specific CDOT OSHA recordable claims data: from an average worker injury rate in FY 2004 of 9.9 to 5.0 injury accidents per 100 employees by FY 2009)
- Over the next 5 years, reduce worker accidents by 15% per year (base year is FY 2004)
- Over the next 5 years, reduce the number of CDOT vehicle accidents by 10% per year (base year is FY 2004)
- Maintain or improve the 2006 customer rating of safety-related programs and service delivery

MOBILITY: Programs, services, and projects that provide for the movement of people, goods and information.

GOALS
- Maintain or improve the operational capacity of the transportation system
- Increase integration of the transportation system modal choices


- Increase investment in mobility and strategic projects

OBJECTIVES
- Maintain or improve the 2006 customer satisfaction rating of operational services delivery
- Reduce the growth rate of person miles traveled in congestion through CY 2010 below the projected growth rate
- Increase infrastructure to improve mobility for the user, or increase investment to improve mobility for the user
- Maintain the snow and ice maintenance level of service grade at the adopted annual grade
- Maintain or improve the 2006 customer satisfaction rating of transportation choices as a part of an integrated statewide transportation system

PROGRAM DELIVERY: Functions that enable the successful delivery of CDOT's programs, projects and services.

GOALS
- Deliver high-quality programs, projects and services in an effective and efficient manner
- Accelerate completion of the remaining strategic projects
- Increase investment in strategic projects

OBJECTIVES
- Encumber all program funds within the planned quarter
- Deliver all programs and projects on time and within budget
- Maintain or improve the customer satisfaction rating of project quality
- Maintain or improve a diverse and qualified workforce that supports CDOT values
- Meet or exceed the Department's annual Disadvantaged Business Enterprise (DBE) goals
- Protect the environment by mitigating adverse environmental impacts while providing effective transportation systems

For the strategic projects:
- Promote partnerships with all governments to enhance working relationships
- Accelerate strategic project delivery while minimizing the impact to all other objectives
- Preserve options to anticipate Colorado's future transportation needs in major corridors
- Ensure CDOT's bonding eligibility to secure future funding levels


F. Sample Data Governance Work Team Charter

Data Governance Work Team Charter
June 2009

Purpose

The Data Governance Work Team is established to support a data governance framework for the Minnesota Department of Transportation (Mn/DOT). The work team, under the guidance of the Business Information Council, supports standardizing procedures for the identification and management of enterprise data. The Data Governance Work Team supports the department vision and mission for management of enterprise data.

Responsibilities of the Data Governance Work Team:
- Develop a strategy and process for implementing data governance throughout the organization, which will include:
  – Guidance on priorities for implementing data governance on enterprise data
  – Recommended next steps in data governance (development of a data dictionary, review of the current inventory of data systems, etc.)
  – A Mn/DOT data governance manual documenting the framework
  – A communication plan for reaching the entire enterprise
- Develop and recommend for implementation a detailed data governance framework to:
  – Define roles and responsibilities in data governance (board/council, steward, custodian, stakeholder, etc.)
  – Define goals/objectives pertaining to the use of data
  – Define goals/objectives pertaining to data collection
  – Identify a process for non-compliance with goals/objectives of data collection and use
  – Identify data values/principles/guidelines which support data's value as an asset


- Define the need and what should be included in a data catalog for the enterprise.

Scope

The scope of the Data Governance Work Team is to provide recommendations to the Business Information Council for implementation of standards, procedures and work products for the enterprise as defined above, in coordination with business emphasis area work teams.

Deliverables

The deliverables of the Data Governance Work Team include the following:

1. Data governance structure including roles and responsibilities of current and future data-related boards.
2. Action plan for implementing data governance at Mn/DOT.
3. Data governance manual for the enterprise and each business area to include methods and processes for reviewing, approving, and implementing data governance standards, procedures, and work products.
4. Communication plan to market the impact and benefits of data governance.
5. Development plan for a data catalog including structure and content (a sketch of a possible entry follows).

Organization of Data Governance Work Team

Tim Henkel, Modal Planning & Program Management Division Director, Co-Champion
Pam Tschida, Employee & Corporate Services Division Director, Co-Champion
Bob Brown, Office of Land Management, Co-Chair
Kathy Hofstedt, Office of Information & Technology Services, Co-Chair
Lee Berget, District 4
Sue Dwight, Office of Financial Management
Cassandra Isackson, Office of Traffic, Safety, & Technology
Matt Koukol, Office of Transportation Data & Analysis
Jonette Kreideweis, Office of Transportation Data & Analysis
Bill Roen, Office of Financial Management
Susan Walto, Office of Transportation Data & Analysis (staff)

Expert Resources to the Work Team

Michele Bliss, Mn/DOT Records Manager
Sheila Hatchell, Mn/DOT Library Director

Schedule

The Data Governance Work Team will meet at regular intervals (to be scheduled) based on the ongoing development and implementation of the Data Business Plan for Mn/DOT, as managed by the Business Information Council.


G. Congestion

The report recommends adopting a travel time reliability measure. This appendix documents the calculations and results of both the travel time index (TTI) and the planning time index (PTI) for I-70 from Denver International Airport to Vail for the period of November 1, 2010 through October 31, 2011.

Definition of Travel Time Reliability

There are two widely held ways that reliability can be defined. Each is valid and leads to a set of reliability performance measures that capture the nature of travel time reliability. Reliability can be defined as:

1. The variability in travel times that occur on a facility or a trip over the course of time; and
2. The number of times (trips) that either “fail” or “succeed” in accordance with a predetermined performance standard.

In both cases, reliability (more appropriately, unreliability) is caused by the interaction of the factors that influence travel times: fluctuations in demand, traffic control devices, traffic incidents, inclement weather, work zones, and physical capacity (based on prevailing geometrics and traffic patterns). These factors produce travel times that differ from day to day for the same trip. The reliability of a facility or trip can be reported for different time slices, e.g., weekday peak hour, weekday peak period, and weekend.

From a measurement perspective, reliability is quantified from the distribution of travel times, for a given facility/trip and time slice, that occurs over a significant span of time; one year is generally long enough to capture nearly all of the variability caused by disruptions. A variety of metrics can be computed once the travel time distribution has been established, including standard statistical measures (e.g., standard deviation, kurtosis), percentile-based measures (e.g., 95th percentile travel time, Buffer Index), on-time measures (e.g., percent of trips completed within a travel time threshold), and failure measures (e.g., percent of trips that exceed a travel time threshold).
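To make these families of measures concrete, the following minimal Python sketch (illustrative only; the sample data and the 40-minute threshold are assumptions, not values from the I-70 analysis) computes one measure of each type from a year of travel times for a single facility and time slice.

```python
import statistics

def reliability_metrics(travel_times_min, threshold_min):
    """Compute the reliability measures named above from a one-year
    distribution of travel times (one value per day, in minutes)."""
    times = sorted(travel_times_min)
    mean = statistics.mean(times)
    std_dev = statistics.stdev(times)                 # statistical measure
    p95 = times[int(0.95 * (len(times) - 1))]         # 95th percentile travel time
    buffer_index = (p95 - mean) / mean                # percentile-based measure
    on_time = 100 * sum(t <= threshold_min for t in times) / len(times)
    return {
        "std_dev": std_dev,
        "p95_travel_time": p95,
        "buffer_index": buffer_index,
        "pct_on_time": on_time,                       # on-time measure
        "pct_failed": 100 - on_time,                  # failure measure
    }

# Hypothetical daily weekday-peak travel times (minutes) over one year.
sample = [32 + (day % 7) * 1.5 + (10 if day % 30 == 0 else 0) for day in range(365)]
print(reliability_metrics(sample, threshold_min=40))
```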

Definition of Measures Calculated

The speed and segment data were obtained from the CDOT ITS Office in a flat, comma-delimited file organized by month. CS calculated the TTI and PTI for the AM and PM peaks on I-70 for the year beginning November 1, 2010, using the following formulas.

PTI was calculated as the 95th percentile travel time divided by the free-flow travel time.


TTI was calculated as the ratio of average peak travel time to an off-peak (free-flow) standard. For example, a value of 1.20 means that average peak travel times are 20 percent longer than off-peak travel times.
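A minimal sketch of how both indices can be derived from a monthly speed file of this kind follows; the file name and the column names used below (segment_id, length_mi, speed_mph, free_flow_mph, peak) are hypothetical placeholders rather than the actual CDOT ITS schema.

```python
import pandas as pd

# Hypothetical monthly speed file; actual CDOT ITS column names may differ.
# Expected columns: segment_id, length_mi, speed_mph, free_flow_mph, peak ("AM"/"PM")
df = pd.read_csv("i70_speeds_2010_11.csv")

# Convert each speed observation to a segment travel time in minutes.
df["tt_min"] = df["length_mi"] / df["speed_mph"] * 60
df["ff_tt_min"] = df["length_mi"] / df["free_flow_mph"] * 60

def indices(group):
    free_flow = group["ff_tt_min"].iloc[0]
    return pd.Series({
        "TTI": group["tt_min"].mean() / free_flow,          # average peak / free flow
        "PTI": group["tt_min"].quantile(0.95) / free_flow,  # 95th percentile / free flow
    })

# One TTI/PTI pair per segment and peak period, as in Table G-2.
results = df.groupby(["segment_id", "peak"]).apply(indices).round(2)
print(results)
```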

The segments were revised as shown in the following table for the purposes of calculating the results; a sketch of the aggregation logic follows the table.


Table G-1 Revised Segments

(The ORIGINAL and REVISED columns are independent lists presented side by side; rows do not correspond one-to-one.)

Original Segment Name | Length (mi) | Seg. ID | Num Seg | Rev. ID | Revised Segment Name | Length (mi)
070E177 ‐ East Vail to Vail Pass | 12.4 | E1 | 1 | E1 | East Vail to Vail Pass | 12.4
070E190 ‐ Vail Pass to Copper | 5.8 | E2 | 1 | E2 | Vail Pass to Copper | 5.8
070E196 ‐ Copper to Frisco (Main St) | 5.7 | E3 | 1 | E3 | Copper to Frisco (Main St) | 5.7
070E201 ‐ Frisco (Main St) to Silverthorne | 4 | E4 | 1 | E4 | Frisco (Main St) to Silverthorne | 4
070E206 ‐ Silverthorne to MM 211 | 5.62 | E5 | 2 | E5 | Silverthorne to Eisenhower Tunnel West Entrance | 5.62
070E211 ‐ MM 211 to Eisenhower Tunnel West Entrance | 2.68 | E5 | 2 | E6 | Eisenhower Tunnel West Entrance to Loveland | 2.7
070E214 ‐ Eisenhower Tunnel West Entrance to Loveland | 2.7 | E6 | 1 | E7 | Loveland to Bakerville | 5
070E217 ‐ Loveland to Bakerville | 5 | E7 | 1 | E8 | Bakerville to Georgetown | 6.6
070E222 ‐ Bakerville to Georgetown | 6.6 | E8 | 1 | E9 | Georgetown to US 40 Empire | 4.6
070E228 ‐ Georgetown to US 40 Empire | 4.6 | E9 | 1 | E10 | US 40 Empire to CO 103 Idaho Springs / Mt Evans | 7.2
070E233 ‐ US 40 Empire to CO 103 Idaho Springs / Mt Evans | 7.2 | E10 | 1 | E11 | CO 103 Idaho Springs / Mt Evans to US 6 | 4.6
070E240 ‐ CO 103 Idaho Springs / Mt Evans to US 6 | 4.6 | E11 | 1 | E12 | US 6 to Evergreen Pkwy | 3.51
070E245 ‐ US 6 to Beaver Brook | 3.51 | E12 | 2 | E13 | Evergreen Pkwy to Lookout Mountain | 4.58
070E248 ‐ Beaver Brook to Evergreen Pkwy | 3.92 | E12 | 2 | E14 | Lookout Mountain to C‐470 | 2.69
070E252 ‐ Evergreen Pkwy to Lookout Mountain | 4.58 | E13 | 1 | E15 | C‐470 to US 6 | 1.1
070E256 ‐ Lookout Mountain to Morrison / Heritage | 2.69 | E14 | 2 | E16 | US 6 to CO 58 | 3.49
070E259 ‐ Morrison / Heritage to C‐470 | 1 | E14 | 2 | E17 | CO 58 to Kipling St | 1.98
070E260 ‐ C‐470 to US 6 | 1.1 | E15 | 1 | E18 | Kipling St to Sheridan Blvd | 1.57
070E261 ‐ US 6 to 32nd Ave | 3.49 | E16 | 2 | E19 | Sheridan Blvd to Pecos St | 2.59
070E265 ‐ 32nd Ave to CO 58 | 0.98 | E16 | 2 | E20 | Pecos St to I‐25 | 0.99
070E266 ‐ CO 58 to Kipling St | 1.98 | E17 | 1 | E21 | I‐25 to Colorado Blvd | 1.21
070E268 ‐ Kipling St to I‐76 | 1.57 | E18 | 2 | E22 | Colorado Blvd to I‐270 | 2.48
070E270 ‐ I‐76 to Sheridan Blvd | 1.59 | E18 | 2 | E23 | I‐270 to I‐225 | 1.61
070E271 ‐ Sheridan Blvd to Pecos St | 2.59 | E19 | 1 | E24 | I‐225 to Pena Blvd | 0.69
070E274 ‐ Pecos St to I‐25 | 0.99 | E20 | 1 | E25 | Pena Blvd to Tower Rd | 2.61
070E275 ‐ I‐25 to Brighton Blvd | 1.21 | E21 | 2 | W1 | Vail Pass to East Vail | 12.6
070E276 ‐ Brighton Blvd to Colorado Blvd | 1.44 | E21 | 2 | W2 | Copper to Vail Pass | 5.6
070E277 ‐ Colorado Blvd to I‐270 | 2.48 | E22 | 1 | W3 | Frisco (Main St) to Copper | 5.1
070E280 ‐ I‐270 to Havana St | 1.61 | E23 | 3 | W4 | Silverthorne to Frisco (Main St) | 4.7
070E281 ‐ Havana St to Peoria St | 1.02 | E23 | 3 | W5 | Eisenhower Tunnel West to Silverthorne | 5.57
070E282 ‐ Peoria St to I‐225 | 0.95 | E23 | 3 | W6 | Loveland to Eisenhower Tunnel West Entrance | 2.6
070E283 ‐ I‐225 to Pena Blvd | 0.69 | E24 | 1 | W7 | Bakerville to Loveland | 5.1
070E284 ‐ Pena Blvd to Tower Rd | 2.61 | E25 | 1 | W8 | Georgetown to Bakerville | 6.6


070W189 ‐ Vail Pass to East Vail | 12.6 | W1 | 1 | W9 | US 40 Empire to Georgetown | 4.7
070W195 ‐ Copper to Vail Pass | 5.6 | W2 | 1 | W10 | CO 103 Idaho Springs / Mt Evans to US 40 Empire | 7.2
070W200 ‐ Frisco (Main St) to Copper | 5.1 | W3 | 1 | W11 | US 6 to CO 103 Idaho Springs / Mt Evans | 4.4
070W205 ‐ Silverthorne to Frisco (Main St) | 4.7 | W4 | 1 | W12 | Evergreen Pkwy to US 6 | 3.29
070W211 ‐ MM 211 to Silverthorne | 5.57 | W5 | 2 | W13 | Lookout Mountain to Evergreen Pkwy | 4.75
070W213 ‐ Eisenhower Tunnel West Entrance to MM 211 | 2.63 | W5 | 2 | W14 | C‐470 to Lookout Mountain | 2.85
070W216 ‐ Loveland to Eisenhower Tunnel West Entrance | 2.6 | W6 | 1 | W15 | US 6 to C‐470 | 0.89
070W221 ‐ Bakerville to Loveland | 5.1 | W7 | 1 | W16 | CO 58 to US 6 | 3.4
070W227 ‐ Georgetown to Bakerville | 6.6 | W8 | 1 | W17 | Kipling St to CO 58 | 1.86
070W232 ‐ US 40 Empire to Georgetown | 4.7 | W9 | 1 | W18 | Sheridan Blvd to Kipling St | 1.42
070W239 ‐ CO 103 Idaho Springs / Mt Evans to US 40 Empire | 7.2 | W10 | 1 | W19 | I‐25 to Pecos St | 0.88
070W244 ‐ US 6 to CO 103 Idaho Springs / Mt Evans | 4.4 | W11 | 1 | W20 | Pecos St to Sheridan Blvd | 2.39
070W247 ‐ Beaver Brook to US 6 | 3.29 | W12 | 2 | W21 | Colorado Blvd to Brighton Blvd | 1.1
070W251 ‐ Evergreen Pkwy to Beaver Brook | 4.01 | W12 | 2 | W22 | I‐270 to Colorado Blvd | 2.27
070W256 ‐ Lookout Mountain to Evergreen Pkwy | 4.75 | W13 | 1 | W23 | Havana St to Peoria | 1.46
070W258 ‐ Morrison / Heritage to Lookout Mountain | 2.85 | W14 | 2 | W24 | Pena Blvd to I‐225 | 1.12
070W259 ‐ C‐470 to Morrison / Heritage | 1 | W14 | 2 | W25 | Tower Rd to Pena Blvd | 1.82
070W260 ‐ US 6 to C‐470 | 0.89 | W15 | 1 | | |
070W264 ‐ 32nd Ave to US 6 | 3.4 | W16 | 2 | | |
070W265 ‐ CO 58 to 32nd Ave | 0.89 | W16 | 2 | | |
070W267 ‐ Kipling St to CO 58 | 1.86 | W17 | 1 | | |
070W269 ‐ I‐76 to Kipling St | 1.42 | W18 | 2 | | |
070W270 ‐ Sheridan Blvd to I‐76 | 1.51 | W18 | 2 | | |
070W273 ‐ I‐25 to Pecos St | 0.88 | W19 | 1 | | |
070W273 ‐ Pecos St to Sheridan Blvd | 2.39 | W20 | 1 | | |
070W275 ‐ Brighton Blvd to I‐25 | 1.1 | W21 | 2 | | |
070W276 ‐ Colorado Blvd to Brighton Blvd | 1.33 | W21 | 2 | | |
070W278 ‐ I‐270 to Colorado Blvd | 2.27 | W22 | 1 | | |
070W280 ‐ Havana St to I‐270 | 1.46 | W23 | 1 | | |
070W281 ‐ Peoria St to Havana St | 0.94 | W24 | 1 | | |
070W282 ‐ I‐225 to Peoria St | 0.94 | W25 | 1 | | |
070W283 ‐ Pena Blvd to I‐225 | 1.12 | W26 | 1 | | |
070W285 ‐ Tower Rd to Pena Blvd | 1.82 | W27 | 1 | | |
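The Num Seg column records how many original detector segments were combined into a single revised segment. A minimal sketch of that aggregation, under the assumption (not stated in the appendix) that lengths and travel times simply sum across member segments, with the combined speed re-derived from the totals; the member values below are invented, not drawn from the table:

```python
# Hypothetical members of one revised segment: (length_mi, observed_speed_mph).
members = [(5.0, 60.0), (2.5, 45.0)]

# Lengths add; travel times add; combined speed = total length / total time.
total_length = sum(length for length, _ in members)
total_time_hr = sum(length / speed for length, speed in members)
combined_speed = total_length / total_time_hr  # length-weighted harmonic mean

print(f"{total_length:.2f} mi at {combined_speed:.1f} mph")
```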


Results

The following table shows the segment IDs, segment names, and segment speeds, with the TTI and PTI for both the AM and PM peak hours.

Table G-2 Detailed Summary of Results

Segment ID | Segment Name | AM Speed (mph) | AM TTI | AM PTI | PM Speed (mph) | PM TTI | PM PTI

E1 | East Vail to Vail Pass | 56 | 1.04 | 1.20 | 54 | 1.10 | 1.63
E2 | Vail Pass to Copper | 56 | 1.05 | 1.22 | 54 | 1.10 | 1.65
E3 | Copper to Frisco (Main St) | 60 | 1.03 | 1.15 | 63 | 1.01 | 1.06
E4 | Frisco (Main St) to Silverthorne | 62 | 1.02 | 1.10 | 62 | 1.02 | 1.16
E5 | Silverthorne to Eisenhower Tunnel West Entrance | 56 | 1.02 | 1.11 | 55 | 1.06 | 1.27
E6 | Eisenhower Tunnel West Entrance to Loveland | 47 | 1.03 | 1.15 | 46 | 1.06 | 1.44
E7 | Loveland to Bakerville | 60 | 1.03 | 1.16 | 59 | 1.07 | 1.55
E8 | Bakerville to Georgetown | 62 | 1.02 | 1.10 | 60 | 1.06 | 1.32
E9 | Georgetown to US 40 Empire | 64 | 1.01 | 1.06 | 60 | 1.08 | 1.56
E10 | US 40 Empire to CO 103 Idaho Springs / Mt Evans | 64 | 1.01 | 1.07 | 61 | 1.06 | 1.33
E11 | CO 103 Idaho Springs / Mt Evans to US 6 | 57 | 1.02 | 1.08 | 57 | 1.03 | 1.16
E12 | US 6 to Evergreen Pkwy | 63 | 1.02 | 1.11 | 64 | 1.01 | 1.08
E13 | Evergreen Pkwy to Lookout Mountain | 63 | 1.02 | 1.12 | 62 | 1.03 | 1.10
E14 | Lookout Mountain to C‐470 | 56 | 1.01 | 1.10 | 57 | 1.01 | 1.08
E15 | C‐470 to US 6 | 64 | 1.02 | 1.13 | 64 | 1.02 | 1.10
E16 | US 6 to CO 58 | 64 | 1.02 | 1.15 | 57 | 1.13 | 1.52
E17 | CO 58 to Kipling St | 61 | 1.05 | 1.21 | 48 | 1.33 | 2.10
E18 | Kipling St to Sheridan Blvd | 56 | 1.06 | 1.25 | 55 | 1.07 | 1.21
E19 | Sheridan Blvd to Pecos St | 47 | 1.16 | 1.81 | 54 | 1.02 | 1.08
E20 | Pecos St to I‐25 | 39 | 1.39 | 2.45 | 48 | 1.15 | 1.70
E21 | I‐25 to Colorado Blvd | 48 | 1.15 | 1.52 | 44 | 1.25 | 1.88
E22 | Colorado Blvd to I‐270 | 50 | 1.09 | 1.38 | 36 | 1.51 | 2.51
E23 | I‐270 to I‐225 | 48 | 1.10 | 1.31 | 41 | 1.28 | 1.78
E24 | I‐225 to Pena Blvd | 55 | 1.00 | 1.04 | 55 | 1.01 | 1.06
E25 | Pena Blvd to Tower Rd | 55 | 1.01 | 1.07 | 55 | 1.01 | 1.06
W1 | Vail Pass to East Vail | 56 | 1.04 | 1.20 | 57 | 1.05 | 1.34
W2 | Copper to Vail Pass | 56 | 1.04 | 1.21 | 57 | 1.05 | 1.34
W3 | Frisco (Main St) to Copper | 61 | 1.02 | 1.12 | 62 | 1.01 | 1.07
W4 | Silverthorne to Frisco (Main St) | 61 | 1.02 | 1.11 | 63 | 1.01 | 1.06
W5 | Eisenhower Tunnel West to Silverthorne | 55 | 1.03 | 1.16 | 58 | 1.01 | 1.07
W6 | Loveland to Eisenhower Tunnel West Entrance | 46 | 1.03 | 1.14 | 47 | 1.01 | 1.08
W7 | Bakerville to Loveland | 59 | 1.02 | 1.11 | 61 | 1.02 | 1.01


W8 | Georgetown to Bakerville | 61 | 1.02 | 1.05 | 63 | 1.01 | 1.00
W9 | US 40 Empire to Georgetown | 63 | 1.01 | 1.01 | 63 | 1.01 | 1.00
W10 | CO 103 Idaho Springs / Mt Evans to US 40 Empire | 63 | 1.01 | 1.00 | 64 | 1.00 | 1.00
W11 | US 6 to CO 103 Idaho Springs / Mt Evans | 54 | 1.00 | 1.01 | 54 | 1.00 | 1.00
W12 | Evergreen Pkwy to US 6 | 58 | 1.00 | 1.00 | 57 | 1.01 | 1.04
W13 | Lookout Mountain to Evergreen Pkwy | 64 | 1.01 | 1.10 | 64 | 1.01 | 1.06
W14 | C‐470 to Lookout Mountain | 60 | 1.02 | 1.12 | 59 | 1.04 | 1.15
W15 | US 6 to C‐470 | 55 | 1.05 | 1.15 | 51 | 1.13 | 1.33
W16 | CO 58 to US 6 | 63 | 1.03 | 1.17 | 62 | 1.03 | 1.15
W17 | Kipling St to CO 58 | 62 | 1.04 | 1.19 | 61 | 1.06 | 1.23
W18 | Sheridan Blvd to Kipling St | 56 | 1.02 | 1.16 | 51 | 1.09 | 1.58
W19 | I‐25 to Pecos St | 54 | 1.01 | 1.09 | 42 | 1.30 | 1.87
W20 | Pecos St to Sheridan Blvd | 55 | 1.01 | 1.07 | 52 | 1.05 | 1.16
W21 | Colorado Blvd to Brighton Blvd | 51 | 1.06 | 1.26 | 37 | 1.47 | 2.07
W22 | I‐270 to Colorado Blvd | 51 | 1.06 | 1.26 | 32 | 1.68 | 2.74
W23 | Havana St to Peoria | 40 | 1.35 | 2.20 | 39 | 1.40 | 2.30
W24 | Pena Blvd to I‐225 | 53 | 1.05 | 1.21 | 54 | 1.02 | 1.09
W25 | Tower Rd to Pena Blvd | 54 | 1.02 | 1.15 | 55 | 1.00 | 1.02

The following table summarizes the results.

Table G-3 Summary

    | AM Speed (mph) | AM TTI | AM PTI | PM Speed (mph) | PM TTI | PM PTI
Min | 39.4 | 1.00 | 1.00 | 32.4 | 1.00 | 1.00
Avg | 56.5 | 1.05 | 1.21 | 54.7 | 1.10 | 1.36
Max | 63.9 | 1.39 | 2.45 | 64.0 | 1.68 | 2.74

The results can also be viewed in the graphical formats shown below.


Figure G-1 Speed Eastbound AM Peak Period
[chart omitted: speed (mph) by segment, eastbound AM peak period]

Figure G-2 Speed Eastbound PM Peak Period
[chart omitted: speed (mph) by segment, eastbound PM peak period]


Figure G-3 TTI Eastbound AM Peak Period
[chart omitted: TTI by segment, eastbound AM peak period]

Figure G-4 TTI Eastbound PM Peak Period
[chart omitted: TTI by segment, eastbound PM peak period]


Figure G-5 Speed Westbound AM Peak Period
[chart omitted: speed (mph) by segment, westbound AM peak period]

Figure G-6 Speed Westbound PM Peak Period
[chart omitted: speed (mph) by segment, westbound PM peak period]


Figure G-7 TTI Westbound AM Peak Period
[chart omitted: TTI by segment, westbound AM peak period]

Figure G-8 TTI Westbound PM Peak Period
[chart omitted: TTI by segment, westbound PM peak period]


A useful format for the dashboard would be map-based, as shown in the figures below.
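Map displays of this kind typically color-code each segment by its PTI range. A minimal sketch of that binning logic (the thresholds below are illustrative assumptions, not values taken from the report):

```python
# Hypothetical PTI-to-color binning for a map-based dashboard display.
def pti_color(pti):
    """Assign a display color to a segment from its planning time index."""
    if pti < 1.25:
        return "green"    # reliable
    if pti < 1.75:
        return "yellow"   # moderately unreliable
    return "red"          # unreliable

# Example using two PM-peak PTI values from Table G-2.
for segment, pti in [("E17 CO 58 to Kipling St", 2.10),
                     ("E24 I-225 to Pena Blvd", 1.06)]:
    print(segment, "->", pti_color(pti))
```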

Figure G-9 AM Peak PTI 1 [map omitted]

Figure G-10 PM Peak PTI 1 [map omitted]


Figure G-11 AM Peak PTI 2 [map omitted]

Figure G-12 PM Peak PTI 2 [map omitted]


Figure G-13 AM Peak PTI 3 [map omitted]

Figure G-14 PM Peak PTI 3 [map omitted]


Figure G-15 AM Peak PTI 4 [map omitted]

Figure G-16 PM Peak PTI 4 [map omitted]


H. Calculation Spreadsheet Pages (CDOT Data and Graphs for Dashboard.pdf)

[spreadsheet pages omitted: reproduced as images in the source document]

