2010 Traffic Incident Management Self-Assessment (TIM SA) National Analysis Report
Executive Summary
Background
The Traffic Incident Management Self-Assessment (TIM SA) was developed by the Federal Highway Administration (FHWA) as a benchmarking tool for evaluating TIM program components and overall TIM program success. Development of the TIM SA began in 2002, and the first assessments were conducted in 2003. The TIM SA serves several functions. Through the TIM SA, State and local TIM program managers are able to assess progress and identify areas for improvement at the State and local levels. Similarly, analysis of the aggregated TIM SA results allows FHWA to identify program gaps and better target TIM program resources.
There are 80 FHWA-defined operational areas (States, regions, localities) in the annual TIM SA process. The original design was for half (40) of the operational areas to complete a re-assessment in 2004 and the remaining 40 to do so in 2005. In 2006, FHWA amended the process so that all 80 areas were asked to complete the TIM SA on an annual basis. Since the inaugural TIM SA in 2003, additional TIM programs beyond the original 80 have completed and submitted the TIM SA for inclusion in the national analysis. The 2010 TIM SA had a record number of assessments submitted; a total of 92 locations completed a TIM SA for inclusion in the national analysis. Table ES1 shows the total number of new and re-assessments each year.
Year | New Assessments | Re-Assessments | Total Completed |
---|---|---|---|
2003 | 70 | -- | 70 |
2004 | 7 | 25 | 32 |
2005 | 1 | 41 | 42 |
2006 | 3 | 67 | 70 |
2007 | 5 | 62 | 67 |
2008 | 2 | 74 | 76 |
2009 | 6 | 80 | 86 |
2010 | 6 | 86 | 92 |
In 2007, FHWA initiated a revision process to more closely align the TIM SA with the current state of TIM practice. Although the revision process was completed in 2008, the revised TIM SA was not deployed until the 2009 TIM SA cycle. Among other changes, the TIM SA Revision reduced the number of questions from 34 to 31, the result of combining some questions, eliminating others, and adding several new questions.
The 31 questions are grouped into three sections: Strategic, Tactical, and Support. To benchmark progress for each question and each of the three sections over time, the initial assessments completed in 2003, 2004, and one in 2005 (78 in total) have been used each year as the Baseline. Due to the changes resulting from the TIM SA Revision, the Baseline data were recalculated in 2010 to reflect the combined, eliminated, and new questions. This recalculation was particularly necessary for the new questions, which had no established baseline scores prior to the 2009 assessment. The score achieved for each new question in 2009 now serves as its baseline and is part of the overall baseline calculation for each section.
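As an illustration of the recalculation described above, the sketch below merges baselines for retained questions with 2009 scores for the new questions and computes a section baseline as a percentage of the section maximum. The question subsets, scores, and the assumption of equal question weights within a section are illustrative only; the report does not publish the exact procedure.

```python
# Illustrative sketch of the Baseline recalculation; the scores and
# question subsets are placeholders, not actual TIM SA data.

# Baseline mean scores (0-4 scale) for questions retained from the
# original 2003-2005 Baseline assessments.
retained_baselines = {"4.2.2.1": 3.20, "4.1.3.1": 0.64}

# 2009 scores for questions introduced by the TIM SA Revision; each
# new question's 2009 score becomes its baseline.
new_question_2009 = {"4.1.3.5": 1.03}

# Questions belonging to each section in the revised instrument
# (illustrative subset of the 31 questions).
sections = {"Strategic": ["4.1.3.1", "4.1.3.5"], "Tactical": ["4.2.2.1"]}

# Retained questions keep their original baseline; new questions take
# their 2009 score as the baseline.
baseline = {**retained_baselines, **new_question_2009}

# Express each section's baseline as a percentage of its maximum,
# assuming equal question weights and a 0-4 scale.
for name, qids in sections.items():
    scores = [baseline[q] for q in qids]
    pct = sum(scores) / (4.0 * len(scores)) * 100
    print(f"{name} baseline: {pct:.1f}%")
```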
Table ES2 shows the average score for each of the three TIM SA sections from the Baseline and 2010, along with the percentage change from the Baseline. The table also shows the high score achieved in each of the three program areas.
Section | # of Questions | Baseline Mean Score | 2010 Mean Score | 2010 High Score (Possible) | % Change from Baseline | Section Weight |
---|---|---|---|---|---|---|
Strategic | 11 | 33.00% | 55.20% | 29.8 (30) | 67.20% | 30% |
Tactical | 13 | 60.90% | 71.50% | 40.0 (40) | 17.40% | 40% |
Support | 7 | 40.50% | 62.60% | 30.0 (30) | 54.60% | 30% |
Overall Total | 31 | 46.40% | 63.90% | 98.8 (100) | 37.80% | 100% |
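The overall figures in Table ES2 are consistent with a weighted sum of the section percentages using the section weights in the last column. The short calculation below, with illustrative variable names, reproduces the 2010 overall score and the percent change from the Baseline.

```python
# Weighted overall score from the section percentages in Table ES2
# (weights: Strategic 30%, Tactical 40%, Support 30%).
weights = {"Strategic": 0.30, "Tactical": 0.40, "Support": 0.30}
scores_2010 = {"Strategic": 55.2, "Tactical": 71.5, "Support": 62.6}
scores_base = {"Strategic": 33.0, "Tactical": 60.9, "Support": 40.5}

overall_2010 = sum(scores_2010[s] * weights[s] for s in weights)  # ~63.9
overall_base = sum(scores_base[s] * weights[s] for s in weights)  # ~46.4

# Percent change of the overall score relative to the Baseline.
pct_change = (overall_2010 - overall_base) / overall_base * 100   # ~37.8
print(f"Overall 2010: {overall_2010:.1f}%  Change from Baseline: {pct_change:.1f}%")
```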
Strategic
The questions in the Strategic section ask respondents to rate progress in how the TIM program is organized, resourced, supported and sustained. The Strategic questions also cover TIM performance measures. The Strategic questions have realized a 67.2 percent increase over the Baseline, the largest increase of the three sections.
Despite progress in the Strategic section, four of the five questions receiving the lowest mean scores in the TIM SA are in this section, with most coming from the subsection on TIM Performance Measurement (Table ES3).
Mean Score Rank (2010/Baseline) | Question Number | Question | 2010 Mean Score (n=92) | % Scoring 3 or Higher (2010) | % Change in Mean Score from Baseline |
---|---|---|---|---|---|
31/27 | 4.1.3.5 Strategic | Track performance in reducing secondary incidents? | 1.27 | 11% | 23.50% |
30/29 | 4.1.3.4 Strategic | Routinely review whether progress is made in achieving the targets? | 1.83 | 30% | 146.80% |
29/30 | 4.1.3.1 Strategic | Have multi-agency agreement on the two performance measures being tracked (roadway clearance time and incident clearance time)? | 1.87 | 35% | 192.10% |
28/24 | 4.1.1.2 Strategic | Is there a process in place to ensure the continuity of these agreements / memoranda of understanding through integrated planning and budgeting across and among participating agencies? | 1.92 | 35% | 42.50% |
27/16 | 4.3.1.2 Strategic | Is public safety co-located with transportation in the TMC/TOC? | 1.95 | 45% | 3.50% |
The questions in TIM Performance Measurement are also among the questions that achieved the largest increase from the Baseline. Table ES4 shows that scores for three of the TIM Performance Measurement questions have more than doubled since the Baseline.
Mean Score Rank (2010/Baseline) | Question Number | Question | 2010 Mean Score (n=92) | % Scoring 3 or Higher (2010) | % Change in Mean Score from Baseline |
---|---|---|---|---|---|
22/30 | 4.1.3.2 Strategic | Has the TIM program established methods to collect and analyze the data necessary to measure performance in reduced roadway clearance time and reduced incident clearance time? | 2.28 | 45% | 256.70% |
29/30 | 4.1.3.1 Strategic | Have multi-agency agreement on the two performance measures being tracked (roadway clearance time and incident clearance time)? | 1.87 | 35% | 192.10% |
17/28 | 4.3.2.2 Support | Are motorists provided with travel time estimates for route segments? | 2.5 | 54% | 152.50% |
30/29 | 4.1.3.4 Strategic | Routinely review whether progress is made in achieving the targets? | 1.83 | 30% | 146.80% |
20/25 | 4.1.2.2 Strategic | Conduct training? | 2.37 | 61% | 88.10% |
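Because Tables ES3 and ES4 report each question's 2010 mean score alongside its percent change from the Baseline, the corresponding Baseline mean can be recovered directly, as in the brief calculation below (the function name is illustrative).

```python
# Recover a question's Baseline mean score from its 2010 mean score
# and the reported percent change from the Baseline.
def baseline_mean(mean_2010: float, pct_change: float) -> float:
    return mean_2010 / (1 + pct_change / 100)

# Example: question 4.1.3.2 (Table ES4) has a 2010 mean of 2.28 and a
# 256.7% change, implying a Baseline mean of roughly 0.64 on the 0-4 scale.
print(round(baseline_mean(2.28, 256.7), 2))
```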
Tactical
The questions in the Tactical section focus on the policies and procedures used by field personnel when responding to incidents, including the policies and procedures specifically targeting motorist and responder safety. Collectively, these questions consistently score among the highest in the TIM SA, and in 2010 this section achieved an overall score of 71.5 percent. Three of the six questions achieving the highest mean scores are in the Tactical section (Table ES5).
The highest scoring question in the 2010 TIM SA was on "move over" laws. With 80 percent of the assessments scoring this question a 3 or higher, and with "move over" laws already in place in 47 States, the expectation is that this question will remain among the top five scoring questions in subsequent analyses. The question about driver removal laws also ranked in the top five, highlighting efforts across the country to pass safe, quick clearance laws.
Mean Score Rank (2010/Baseline) | Question Number | Question | 2010 Mean Score (n=92) | % Scoring 3 or Higher (2010) | % Change in Mean Score from Baseline |
---|---|---|---|---|---|
1/1 | 4.2.2.1 Tactical | Have "move over" laws which require drivers to slow down and if possible move over to the adjacent lane when approaching workers or responders and equipment in the roadway? | 3.27 | 80% | 2.20% |
2/11 | 4.3.1.1 Support | Use a Traffic Management Center/Traffic Operations Center to coordinate incident detection, notification and response? | 3.22 | 80% | 62.50% |
3/9 | 4.1.2.4 Strategic | Conduct planning for special events? | 3.18 | 86% | 28.90% |
4/2 | 4.2.1.2 Tactical | Have "driver removal" laws which require drivers involved in minor crashes to move vehicles out of the travel lanes? | 3.16 | 76% | 5.10% |
5/14 | 4.3.2.1 Support | Have a real-time motorist information system providing incident-specific information? | 3.15 | 84% | 65.90% |
5/8 | 4.2.1.4 Tactical | Utilize the Incident Command System? | 3.15 | 76% | 23.60% |
Support
The questions in the Support section focus on the tools and technologies enabling improved incident detection, response, and clearance. Despite a slight decline in mean score from 2008 to 2009, the overall mean score for the Support section rebounded to 62.6 percent in 2010.
In the Data subsection, the highest scoring question is 4.3.1.1 on the use of a Traffic Management Center/Traffic Operations Center (TMC/TOC) to coordinate incident detection, notification and response. However, lower scores throughout this subsection indicate that the potential of TMCs/TOCs is not yet being fully realized due to several factors including limited co-location of public safety and transportation in the centers.
Summary
The 2010 TIM SA is the first completed following the establishment of several new benchmarks in 2009, which resulted from the TIM SA Revision completed in 2008. The revision made several key changes to the TIM SA:
- The three sections were renamed.
- The total number of questions was reduced from 34 to 31.
- A new scoring approach was instituted that asks respondents to rate progress as High, Medium, or Low rather than on the numeric 0-4 scale.
- An online TIM SA was introduced to make it easier for participants to respond to the questions.
With a record 92 TIM SAs completed in 2010, it appears that the TIM SA continues to be seen as a beneficial tool by State and local TIM program managers. The 92 assessments comprise 86 re-assessments and six locations submitting an assessment for the first time. An overall score of 63.9 percent was achieved, representing a 37.8 percent increase over the Baseline. The highest scores continue to be in the Tactical section, and the largest percentage increase over the Baseline was in the Strategic section.
Low-scoring questions, and those with the least improvement over the Baseline, indicate specific program areas where additional guidance from FHWA is warranted. These include TIM Performance Measurement and, in particular, additional guidance on secondary incident definitions and technical direction on tracking reductions in the occurrence of secondary incidents.