
of operators with relevant experience results in fewer false starts, better insight into how and why the mission is performed, and considerable savings in time, as well as money, in the later steps of the process. By getting the operator involved from the beginning, the costly problem of making design changes further down the road is avoided.

The dividing line between problem definition and solution development is often vague. Specific designs may affect task sequencing during the mission profile. This change in sequencing can reveal workload problems within the crew station. Because of this overtasking, the operator may shed tasks, which in turn alters the mission profile. Once the profile has changed, the designs may affect the tasks in a different way, and thus the cycle continues. The design process is indeed an iterative one.

15.2.1.1.4 Prototype Evaluation, Simulation Evaluation/Validation, Flight Test

The last three steps are interdependent and critical to the successful completion of an effective and proven crew station design. These three steps work synergistically to "prove the solution." Prototype evaluation marks the initial introduction of the implemented design concepts to the user. Although users should be involved in the preliminary design step, the actual implementation in a prototype design will show the design in a whole new light. The design concepts are evaluated in a limited context, and the user suggests which designs should move forward to simulation. This step weeds out infeasible design concepts. Human-in-the-loop simulation evaluation provides more realistic and robust testing of the design concepts. In simulation evaluation, it is recommended that the new design concept be compared with an existing design in order to measure the "goodness" of the design concept. This step provides the final recommendation of a design concept for flight test.
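To make the comparison of a new design concept against an existing baseline more concrete, the following sketch (in Python) shows one simple way such a comparison might be scored from paired human-in-the-loop trials. The metric, trial values, and effect-size calculation are hypothetical illustrations, not taken from the handbook or from any particular program.

from statistics import mean, stdev

# Hypothetical per-trial task-completion times (seconds) for the same mission
# segment flown with the existing (baseline) and new (candidate) crew stations.
baseline_times  = [38.2, 41.5, 39.8, 44.0, 40.3, 42.1]
candidate_times = [34.9, 37.2, 36.5, 39.8, 35.4, 38.0]

# Paired differences, assuming the same pilots and scenarios in both conditions.
diffs = [b - c for b, c in zip(baseline_times, candidate_times)]
mean_gain = mean(diffs)
effect_size = mean_gain / stdev(diffs)  # paired-samples effect size (Cohen's d)

print(f"Mean improvement: {mean_gain:.1f} s; paired effect size: {effect_size:.2f}")

A consistent improvement with a large paired effect size would support recommending the candidate concept for flight test, whereas a small or inconsistent difference would argue for further refinement in simulation.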

Traditionally, this process involved human-in-the-loop simulations, or virtual simulations as they are referred to today. At present, constructive simulation, which involves the use of models in simulated environments, is becoming a required part of the evaluation process as a low-cost alternative for conducting trade studies. Modeling specific systems, such as structures, engines, and sensors, for use in constructive simulation has been very successful (Aviation Week and Space Technology, 2003). However, one of the current challenges is modeling human behavior. To determine the benefits of different technologies in this step of the design process, the simulation must model not only the technology but also how the operator interacts with it. The Combat Automation Requirements Testbed (CART) program is developing an architecture that allows human behavior/performance models to interface with various constructive simulation environments to determine the "goodness" of various cockpit designs and how the operator interfaces with them.
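The toy sketch below (in Python) illustrates the general idea of coupling a human performance model to a constructive simulation, with sampled operator task times gating a simulated mission timeline. It is only a conceptual illustration, not the CART architecture; all task names and timing parameters are invented.

import random

# Hypothetical operator task-time model: (mean, spread) in seconds per task.
HUMAN_TASK_MODEL = {
    "acquire_target": (4.0, 1.0),
    "designate_target": (2.5, 0.6),
    "consent_to_release": (1.5, 0.4),
}

def operator_task_time(task):
    """Sample an operator task-completion time from the assumed model."""
    mean, sd = HUMAN_TASK_MODEL[task]
    return max(0.1, random.gauss(mean, sd))

def run_mission_segment(events):
    """Advance a toy mission timeline; operator responses delay each event."""
    clock = 0.0
    log = []
    for event_time, task in events:
        clock = max(clock, event_time)     # wait for the simulated mission event
        clock += operator_task_time(task)  # operator response gates progress
        log.append((task, round(clock, 1)))
    return log

# Hypothetical air-to-ground segment: (event time in seconds, operator task).
segment = [(10.0, "acquire_target"), (12.0, "designate_target"),
           (14.0, "consent_to_release")]
for task, finish in run_mission_segment(segment):
    print(f"{task:>20s} completed at t = {finish:5.1f} s")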

CART has been used to integrate such models successfully (Martin, Barbato, & Doyal, 2004). In one example, CART was used to model human tasks performed during an air-to-ground segment of a strike-fighter mission, using a human performance model integrated with the Joint Integrated Mission Model aircraft model. Once the integrated model was run, results from the constructive simulation were compared with pilot performance from a virtual simulation in which real pilots performed the same tasks as the model. The human performance model was shown to predict pilot performance with fairly high accuracy (a correlation of 0.78 between the model-dependent measures and the pilot-dependent measures) (Brett et al., 2002). Once the human performance models are validated, using constructive simulation prior to virtual simulation can save time and money by providing a quick way of thoroughly testing design concepts and advancing only the most promising one(s) to virtual simulation studies.
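As a concrete illustration of this kind of validation, the short Python sketch below correlates model-predicted dependent measures with measures collected from real pilots. The chapter reports a correlation of 0.78 for the CART example; the paired values used here are invented for illustration only.

import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical dependent measures (e.g., task times in seconds): model vs. pilots.
model_measures = [12.1, 8.4, 15.3, 9.9, 11.2, 14.0]
pilot_measures = [11.5, 9.0, 16.1, 9.2, 12.0, 13.4]

print(f"Model-vs-pilot correlation: {pearson_r(model_measures, pilot_measures):.2f}")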

Flight testing often involves only one design being tested in operational use; however, in the case of the F-16, F-22, and the F-35 JSF, two prototypes were involved in a "fly-off." For the purpose of this discussion, these final steps are combined to provide "Solution Evaluation." Once again, there may not be a clear break between the solution evaluation and the solution definition steps. It has been observed that most designers design, evaluate, redesign, and so on as they go. The transition from solution definition to solution evaluation occurs when formal, total-mission, total-system, human-in-the-loop evaluations begin.
