International Military Testing Association

The goal in TAE testing is to find and replace the LRU. Subjects begin TAE testing by reviewing a series of menus of symptoms, panels, and diagnostic information; next they select equipment to be tested and conduct tests or replace an LRU.

Research Hypotheses. The 20 hypotheses for the TAE Test and Evaluation were organized into seven categories: experience, electronics knowledge, electronics performance proficiency, difficulty level, time, complex test equipment, and ranking. The hypotheses in each category, and the method of testing each, are described in the following sections.

METHOD

Test Administration Procedure. Testing was conducted by NPRDC personnel in a classroom at the Advanced Electronics School Department (AESD), Service Schools Command, San Diego, California. Testing was on the Zenith 248 microcomputer. Technical documentation for the hardware system was in the classroom. Subjects were assigned randomized test sequences to protect against test-order effects. Sixteen episodes were administered to each subject, and each episode required about an hour to complete, but subjects had no specific time limit. Subjects completed all episodes in two to three days. The administrator was present in the classroom during testing. Subjects listened to an introduction to the TAE study and the technical documentation available; read and signed a Privacy Act release statement; and completed a computerized Learn Program, 2 practice episodes, and 14 test troubleshooting episodes. After testing, subjects received test performance feedback and completed a critique.
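The randomized test sequences mentioned above amount to giving each subject an independent ordering of the 14 test episodes. A minimal Python sketch of such an assignment; this is an illustration, not the original NPRDC software, and the function name and per-subject seeding scheme are assumptions:

```python
import random

def assign_episode_order(subject_id, n_episodes=14):
    """Return a randomized ordering of test-episode numbers for one subject.

    Seeding with the subject ID makes each subject's sequence reproducible
    while still varying across subjects, guarding against test-order effects.
    """
    rng = random.Random(subject_id)
    order = list(range(1, n_episodes + 1))
    rng.shuffle(order)
    return order

# Each subject receives an independent permutation of episodes 1-14.
print(assign_episode_order(subject_id=1))
```

Any counterbalancing scheme (e.g., a Latin square) would serve the same purpose; simple per-subject randomization is the least structured option.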

Subjects. Subjects for the TAE test and evaluation were students in the "system" phase of the maintenance course and the system-qualified instructors. All subjects were required to have school training on the subsystems.

Data. Data were collected for 53 students and 13 instructors in two databases, using a standard statistical package for analysis. The first contained demographic data; the second, performance data. Data were collected for seven classes of students between April and September 1989. Demographic data for each student included: SSN, time in service, Armed Services Vocational Aptitude Battery (ASVAB) scores, school subsystem scores, school comprehensive score, school final score, class ranking, TAE ranking, and instructor ranking. Demographic and TAE performance data for instructors were collected during September 1989. The demographic data for instructors included SSN, rate/rating, time in service, paygrade, time system qualified, and time working on the system in the fleet and as a system instructor. The TAE program data for both students and instructors consisted of scores for 16 episodes encompassing 673 variables. Table A-1 describes the variables for each episode (Episode 1 is presented).

Data files were refined and evaluated. Data for five students were dropped due to missing data, and for two instructors due to lack of system qualification. Thus, the data of 59 subjects were used for this study: 48 students and 11 instructors. The resultant database was used to create files for testing the study hypotheses. The master file was used to create files with the variables specifically required to test each hypothesis. The methods for testing the hypotheses are described in the following subsections.
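The screening step described above (dropping five students for missing data and two instructors for lack of system qualification, leaving 59 of 66 subjects) is plain record filtering. A stdlib Python sketch with illustrative field names and placeholder scores, not the study's actual file layout:

```python
# Each record is one subject; field names and score values are illustrative.
subjects = (
    [{"group": "student", "tae_score": None} for _ in range(5)]        # missing data
    + [{"group": "student", "tae_score": 70.0} for _ in range(48)]
    + [{"group": "instructor", "tae_score": 72.0, "qualified": True} for _ in range(11)]
    + [{"group": "instructor", "tae_score": 72.0, "qualified": False} for _ in range(2)]
)

# Keep only subjects with complete scores who meet the qualification rule
# (students have no qualification field, so they default to retained).
retained = [
    s for s in subjects
    if s["tae_score"] is not None and s.get("qualified", True)
]
print(len(retained))  # → 59: 48 students and 11 instructors
```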

RESULTS AND DISCUSSION

Results of the data analyses are presented in Appendix A, and the specific areas investigated are discussed in the following sections:

Demographic Data. For the 48 students, the average time in service was 2.23 years. For the 11 instructors, 9 had a rate of electronics technician first class (ET1) and 2 a rate of ET2; the average paygrade was 5.82. The average time in service for instructors was 10.41 years, and the average time in paygrade was 3.64 years. Instructors were system qualified for an average of 4.67 years and had worked on the system hardware in the fleet an average of 2.94 years. In addition, they averaged 16.18 months as instructors.


Experience (Table A-2). Hypothesis 1. Instructors (experts) will score significantly higher on the TAE test than students (novices). A one-way analysis of variance (ANOVA) was performed to test hypothesis 1. The F ratio value is not significant.
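A one-way ANOVA compares group means by the ratio of between-group to within-group variance. A self-contained Python sketch of the computation (the scores below are fabricated placeholders, not the study data, and the significance decision would still require comparing F against a critical value for the degrees of freedom):

```python
from statistics import mean

def one_way_anova(*groups):
    """One-way ANOVA: return the F ratio and its degrees of freedom.

    F = MS_between / MS_within, where MS = sum of squares / df.
    """
    all_scores = [x for g in groups for x in g]
    grand_mean = mean(all_scores)
    k = len(groups)                 # number of groups
    n = len(all_scores)             # total observations
    ss_between = sum(len(g) * (mean(g) - grand_mean) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    f_ratio = (ss_between / (k - 1)) / (ss_within / (n - k))
    return f_ratio, (k - 1, n - k)

# Placeholder TAE scores for two groups (illustrative only).
students = [68, 72, 75, 70, 71, 69, 74, 73]
instructors = [72, 74, 70, 75]
f_ratio, (df1, df2) = one_way_anova(students, instructors)
print(f"F({df1}, {df2}) = {f_ratio:.2f}")
```

With only two groups, this F test is equivalent to an independent-samples t test (F = t²).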

Hypothesis 2. Subjects with a longer time in the electronics rate (i.e., Time in Service, TIS) will score significantly higher on the TAE test than subjects with less time in that rate.

Generally, the relationship between experience and TAE performance was not statistically significant. This apparent anomaly may be explained by the fact that instructors of the course are not required to be system qualified, whereas students must prove their system qualification to graduate.
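Hypothesis 2 is correlational: it asks whether time in service tracks TAE score. A stdlib Python sketch of the Pearson correlation such a test rests on (the paired values are fabricated placeholders, not the study data):

```python
from statistics import mean, pstdev

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length samples."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)
    return cov / (pstdev(x) * pstdev(y))

# Hypothetical pairs: time in service (years) vs. TAE score (illustrative only).
tis = [1.5, 2.0, 2.5, 3.0, 9.0, 11.0]
score = [70, 74, 69, 73, 72, 71]
r = pearson_r(tis, score)
print(f"r = {r:.2f}")  # near zero by construction in this toy sample
```

A near-zero r with a nonsignificant test is exactly the pattern the paragraph above reports for experience and TAE performance.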

The lack of a significant relationship between experience and troubleshooting performance causes one to question whether the experience measures were appropriate, whether an appropriate set of subjects was tested, whether the TAE delivery and evaluation systems are valid, or whether there is actually no difference due to experience. Given the face validity of TAE and the high level of expectation by subject matter experts of a relationship between experience and performance, further testing is needed to resolve this issue.
