
EVALUATION MODEL

The evaluation system we recommend approaches evaluation on two levels. At the more immediate level, we attempt to determine effects on student performance in the specific training areas modified. For example, changes in test scores or training time in those specific content areas might be analyzed. The second level attempts to determine how the training program as a whole was affected by the modifications. Such measures as course attrition, total training time, performance in other areas of the program, etc., would be considered. Inferring cause and effect relationships becomes riskier as one moves to these more general measures of effect -- measures further removed from the proposed cause. However, modification in one area of the training program should ultimately affect the program as a whole and become manifested in these general measures. In reality, it is this broader impact which serves as the bottom-line point of interest for most of our clients.
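To make the two levels concrete, the short Python sketch below (not part of the original paper) compares a hypothetical baseline cohort with a hypothetical post-modification cohort: mean test scores in the modified content area at the first level, and attrition rate and total training time at the second. All figures and variable names are illustrative assumptions.

    # Minimal sketch of the two-level comparison described above,
    # using hypothetical pre- and post-modification cohort data.
    from statistics import mean

    # Level 1: performance in the specific content area that was modified
    # (hypothetical test scores for the affected block of instruction).
    baseline_scores = [72, 78, 81, 69, 75, 80]
    post_change_scores = [79, 84, 82, 77, 81, 88]

    # Level 2: program-wide measures (hypothetical attrition counts and
    # total training days per cohort).
    baseline_attrition, baseline_cohort = 14, 120   # 14 losses out of 120 students
    post_attrition, post_cohort = 9, 118
    baseline_training_days = 61.0
    post_training_days = 58.5

    # Specific-level effect: change in mean test score for the modified area.
    score_effect = mean(post_change_scores) - mean(baseline_scores)

    # General-level effects: change in attrition rate and total training time.
    attrition_effect = post_attrition / post_cohort - baseline_attrition / baseline_cohort
    training_time_effect = post_training_days - baseline_training_days

    print(f"Mean score change in modified area: {score_effect:+.1f} points")
    print(f"Attrition rate change:              {attrition_effect:+.3f}")
    print(f"Total training time change:         {training_time_effect:+.1f} days")

As the paper notes, attributing the general-level differences to the modification is riskier than attributing the specific-level difference; the sketch only quantifies the changes, it does not establish cause.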

The evaluation model we use can be condensed to a six-step process described below:

1. Begin evaluation planning early, before implementation of the program change if possible. The evaluator should be involved as soon as possible, ideally during the modification planning stage -- certainly prior to modification implementation. Many threats to validity can be anticipated and controlled if the evaluator is involved in this manner. Realistically, however, we know that this scenario seldom occurs. More often, the evaluator is called in after the modification has been implemented. For this reason, we usually find ourselves beginning with step two.

2. Know the program you are about to evaluate. A thorough understanding of the nature of the program change and its impact on the general operation of the training program is critical to good evaluation. The evaluator must understand the program's objectives, the anticipated impact of the change on these objectives, and the methods used to accomplish them. In addition, the evaluator must determine what data is currently being collected to evaluate program performance and whether this data might be useful in evaluating the program change. Most importantly, a definitive statement of how the change is intended to affect the program (that is, the goal of the program change) must be formulated.

3. Determine data collection procedures and gather baseline data. The purpose of baseline data is to develop a snapshot of how well the program is performing in the area to be modified prior to the change. Often you will find existing measures of performance, such as test scores, which directly address this question. In other

