
TROUBLESHOOTING ASSESSMENT AND ENHANCEMENT (TAE) PROGRAM:
TEST AND EVALUATION RESULTS *

Paper Presented by Dr. Harry B. Conner,
Navy Personnel Research and Development Center, San Diego, CA 92152-6800

32nd ANNUAL MILITARY TESTING ASSOCIATION CONFERENCE
November 1990, Orange Beach, Alabama

Nauta (1984) reported on a number of difficulties associated with the U.S. Navy's ability to maintain its weapons systems. He reported the costs of poor performance by maintenance personnel and recommended areas requiring investigation if the performance of these personnel was to improve. At about the same time, at the Navy Personnel Research and Development Center (NPRDC), we determined that one of the difficulties we had encountered in the test and evaluation of an ongoing project (the Enlisted Personnel Individualized Career System, EPICS) was that we had no way of comparing maintenance personnel on the most important aspect of their performance: troubleshooting of the hardware system. We realized that we needed an objective way to evaluate personnel performance in the skill of troubleshooting. A literature search supported the contention that most research and development efforts in this area start with the premise of a known expert, journeyman/master, or experienced troubleshooter, when in fact these are defined rather than empirically determined. Therefore, we concluded that efforts to improve the troubleshooting performance of maintenance personnel were futile until we could empirically and objectively define how a good troubleshooter performs.

Approach. We addressed this evaluation issue first with a feasibility study (Conner, 1987, 1988), followed by a more structured investigation, the Troubleshooting Assessment and Enhancement (TAE) program. The TAE objective was to design, develop, test, and evaluate a low-cost troubleshooting evaluation capability. The model (Figure 1) we used in our investigation shows that maintenance is just one of a number of activities associated with a hardware system. Within the area of maintenance, one can perform preventive or corrective maintenance. Within corrective maintenance, one troubleshoots or repairs. Specifically, we focused on the skill of troubleshooting, which we considered to be a skill of problem solving requiring abstract conceptualization capabilities.

HARDWARE SYSTEM INTERACTIONS
    CONSTRUCT
    INSTALL
    OPERATE
    MAINTAIN
        PREVENTIVE MAINTENANCE
        CORRECTIVE MAINTENANCE
            TROUBLESHOOTING
            REPAIR

Figure 1. Hardware Activity to Troubleshooting
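
The Figure 1 hierarchy can also be expressed as a small data structure. The structure below is taken directly from the figure, but the nested-mapping representation and the path-lookup helper are a minimal illustrative sketch in Python, not part of the TAE program itself.

# The Figure 1 hierarchy as a nested mapping; leaf activities map to None.
HARDWARE_ACTIVITIES = {
    "construct": None,
    "install": None,
    "operate": None,
    "maintain": {
        "preventive maintenance": None,
        "corrective maintenance": {
            "troubleshooting": None,  # the skill the TAE program evaluates
            "repair": None,
        },
    },
}

def path_to(tree, target, trail=()):
    """Depth-first search; returns the activity path to target, or None."""
    for name, subtree in tree.items():
        if name == target:
            return trail + (name,)
        if isinstance(subtree, dict):
            found = path_to(subtree, target, trail + (name,))
            if found:
                return found
    return None

print(path_to(HARDWARE_ACTIVITIES, "troubleshooting"))
# ('maintain', 'corrective maintenance', 'troubleshooting')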

With 25 subject matter experts, we developed a list of factors to be used to evaluate the proficiency of a troubleshooting technician in a high-tech environment; that is, on systems having state-of-the-art electronics and computers that require troubleshooting. Next, we sent our initial factor list with definitions (shown in Table 1) to 1,200 operational high-tech personnel for ranking. The results were then weighted by a jury of experts (on the system under investigation). Once the factors were weighted, a scoring methodology was developed. Table 2 provides the results of the factor development, weighting, and the TAE scoring scheme. Our literature search caused us to add a tenth factor: redundant checks.
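
The paper does not spell out the aggregation mechanics, so the following Python sketch shows one plausible pipeline from survey rankings to expert-adjusted weights. The mean-rank aggregation, the inverse-rank conversion, and the jury multipliers are all illustrative assumptions, not the procedure NPRDC actually used.

# Hypothetical sketch: aggregate survey rankings into factor weights.

def aggregate_ranks(surveys):
    """Average each factor's rank across all returned surveys.
    surveys: list of dicts mapping factor name -> rank (1 = most important).
    """
    totals = {}
    for survey in surveys:
        for factor, rank in survey.items():
            totals[factor] = totals.get(factor, 0) + rank
    return {f: total / len(surveys) for f, total in totals.items()}

def apply_jury_weights(mean_ranks, jury_multipliers):
    """Convert mean ranks to weights (lower rank -> higher weight),
    then scale by system-specific expert-jury multipliers."""
    return {f: jury_multipliers.get(f, 1.0) / mean_ranks[f]
            for f in mean_ranks}

# Illustrative example: three survey returns and one jury adjustment.
surveys = [
    {"Solution": 1, "Time": 2, "Cost": 3},
    {"Solution": 1, "Time": 3, "Cost": 2},
    {"Solution": 2, "Time": 1, "Cost": 3},
]
weights = apply_jury_weights(aggregate_ranks(surveys), {"Cost": 1.5})
print(weights)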

TABLE 1. Factor Definitions

Rank  Factor                       Definition
 1.   Solution                     Problem is correctly solved; fault is identified.
 2.   Cost (Incorrect Solutions)   Number of Lowest Replaceable Units (LRUs) incorrectly identified as faulty.
 3.   Time                         Total minutes from login to logout taken to find the fault.
 4.   Proof Points                 Test points that positively identify LRUs as faulty.
 5.   Illogical Approaches         Inappropriate equipment selection.
 6.   Invalid Checks               Inappropriate test at an appropriate test point.
 7.   Out-of-Bounds                Inappropriate test point was selected.
 8.   Test Points                  Total number of valid reference designator tests.
 9.   Checks                       Total number of tests performed at all test points.
10.   Redundant Checks             Same test performed at the same point during the episode.
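
Table 2, which contained the actual weights, is not reproduced in this excerpt. Absent those values, the following is a minimal Python sketch of how a weighted TAE score could combine the ten Table 1 factors for one troubleshooting episode; the factor names come from Table 1, while the weight values, their signs, and the linear combination are illustrative assumptions.

# Hypothetical sketch of a weighted troubleshooting score.
# Assumed weights: positive factors add credit, error factors subtract.
WEIGHTS = {
    "solution": 30.0,              # fault correctly identified (0 or 1)
    "cost": -5.0,                  # per LRU incorrectly called faulty
    "time": -0.1,                  # per minute from login to logout
    "proof_points": 2.0,           # per test point confirming a faulty LRU
    "illogical_approaches": -3.0,  # per inappropriate equipment selection
    "invalid_checks": -2.0,        # per inappropriate test at a valid point
    "out_of_bounds": -2.0,         # per inappropriate test point selected
    "test_points": 1.0,            # per valid reference-designator test
    "checks": -0.5,                # per test performed (efficiency penalty)
    "redundant_checks": -1.0,      # per repeated test at the same point
}

def tae_score(episode):
    """Combine one episode's factor counts into a single weighted score."""
    return sum(WEIGHTS[factor] * episode.get(factor, 0)
               for factor in WEIGHTS)

# Example episode: fault found in 42 minutes with one incorrect LRU pull.
episode = {
    "solution": 1, "cost": 1, "time": 42, "proof_points": 2,
    "illogical_approaches": 0, "invalid_checks": 1, "out_of_bounds": 0,
    "test_points": 8, "checks": 11, "redundant_checks": 1,
}
print(f"TAE score: {tae_score(episode):.1f}")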

* The opinions expressed in this paper are those of the author, are not official, and do not necessarily reflect the views of the Navy Department.
