
No training needs assessments have been identified, and no documentation is available that identifies what user groups actually need, a major omission. The plan simply outlines various course topics that address different aspects of DCIPS processes and tools; it lacks any user considerations. Had a training needs assessment been performed, OUSD(I) and the components would have been well positioned to identify user requirements and skills gaps (particularly related to soft skills training), support a more informed and thorough approach to training, and address constraints on delivery (such as bandwidth limitations for web-based training and other technical challenges). As written, the training is DCIPS-centric, not requirements-centric.

Specific training is also needed to ensure rating consistency and fairness. Reports of actual behavior have fed a perception that ratings are being forced to conform to predetermined distributions or specific quotas. There is the further perception, supported by actual NGA data, that administrative support staff (who primarily reside in Pay Band 2) are consistently given lower ratings overall because their work is less directly connected to the agency mission.

The use of performance ratings is new to most supervisors, and the guidance for ensuring objectivity and fairness must be thorough and consistent. Few supervisors have previously used a rating system tied to performance objectives; the concepts behind the system and the actual practices must be communicated, trained, and reinforced.

Raters must be more fully trained on how to apply a consistent approach to rating against the individual objectives and performance elements for each job, without bias against certain functions or forcing a distribution of ratings to a pre-set quota. Data suggest that more thorough training is needed across the DoD intelligence components to educate raters on how to prepare fair ratings.126

In addition, the DCIPS training evaluation approach relies on end-of-course evaluations. Monthly reports indicate the number of people trained, ratings of satisfaction, and other summary outputs. Missing is any discussion of how the learning will be measured or applied in the workplace (that is, the actual outcome of the training). Although end-of-course ratings generally have been favorable (participants liked the training and thought it would be useful), online dialogue and focus group input suggest that training has sometimes been ineffective, particularly when the content or tools were subject to change.

Finding 4-8

Key planning documents, such as a training design document, are lacking, and training courses have focused on DCIPS’ technical features rather than the broader behavioral changes needed to support the transformation.

Delivery

Delivery refers to how well a training strategy is implemented. Comprehensive in content, contextually relevant, and tailored to a specific audience, effective training employs appropriate

126. Academy online dialogue and open forum data.

