
…reference books, but it is as fundamental for test data or for any test. It is therefore necessary to document all test conditions, as well as all decisions that led to freezing this or that degree of freedom within the scenario or the environment.
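As an illustration of such documentation, the following is a minimal sketch (not taken from the paper) of a test-conditions record; all field names and values are hypothetical.

```python
# Hypothetical record documenting a test run so that its data can later be
# reused in a coherent manner. All names and values are illustrative only.
test_record = {
    "test_id": "TRIAL-0042",
    "date": "2003-05-12",
    "facility": "open-air range A",
    "scenario": {
        "target_speed_m_s": 250.0,   # frozen degree of freedom
        "altitude_m": 1500.0,        # frozen degree of freedom
        "weather": "clear, wind < 5 m/s",
    },
    "frozen_degrees_of_freedom": [
        # each entry records *what* was frozen and *why*
        {"parameter": "target_speed_m_s",
         "rationale": "instrumentation limit of the tracking radar"},
        {"parameter": "altitude_m",
         "rationale": "range safety constraint"},
    ],
    "instrumentation": ["tracking radar", "telemetry receiver"],
    "data_files": ["trial_0042_telemetry.csv"],
}
```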

This last remark is important, as the VV&A and VV&C processes are very often described as costly, or as a necessary evil that generates initial overhead costs and becomes cost-efficient only through subsequent reuse. However, the same argument applies stricto sensu to tests and test data, which is rarely underlined, all the more since in the past decades, when budgets were high, it was possible to conduct almost any desired test: we now have a great deal of test data that is totally useless because it cannot be reused in a coherent manner.

As a first idea of the costs involved, we refer to the report "Prefeasibility Study on Simulation Based Design and Virtual Prototyping", published by the NATO Industrial Advisory Group in September 2000, reference NIAG-D(2000)9 AC/141(NG-6)D/25: the VV&A cost is estimated at 15% of the global acquisition cost, and at between 2 and 6% for a reused simulation.
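To put these figures in perspective, the short calculation below applies the reported percentages to a purely hypothetical acquisition budget; the 10 M€ figure is an assumption for illustration only, not from the NIAG report.

```python
# Hypothetical example: apply the NIAG cost estimates to an assumed budget.
acquisition_cost = 10_000_000  # EUR, illustrative value (not from the report)

vva_new = 0.15 * acquisition_cost           # VV&A for a newly developed simulation
vva_reused_low = 0.02 * acquisition_cost    # lower bound for a reused simulation
vva_reused_high = 0.06 * acquisition_cost   # upper bound for a reused simulation

print(f"VV&A, new simulation:    {vva_new:,.0f} EUR")
print(f"VV&A, reused simulation: {vva_reused_low:,.0f} - {vva_reused_high:,.0f} EUR")
```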

Returning to verification and validation, it is difficult to evaluate a priori the necessary validation level, even if it is possible to define the effort necessary to reach a given credibility level. This comes from the fact that validation is performed with respect to an intended use of the product (data, model or simulation), and this intended use cannot be made explicit from the beginning of the life cycle. Such a difficulty should not be seen as a tremendous obstacle, but as the need to conduct VV&A and VV&C within an iterative and incremental approach, throughout the whole system life cycle.

Technologies: solutions, constraints and necessary effort

Virtual proving grounds, with a tight coupling between simulations and test data within virtual and digital synthetic environments, must satisfy a few constraints in order to be viable. There is a need for interoperability between all simulations, and even with systems which rely heavily on software, such as C4ISR systems. This will become ever more frequent with the development of systems-of-systems and of network-centric warfare or cooperative engagement concepts.

It is not reasonable to develop specific interfaces each time, as a trivial cardinality analysis shows: the number of pairwise interfaces is one order of magnitude larger than the number of systems involved, and an iterative evolution of one system then implies an evolution of all its interfaces, i.e. of the order of the number of connected systems considered.
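A small illustration of this cardinality argument; the system count of 20 is an arbitrary assumption, not a figure from the paper.

```python
# Point-to-point interfaces between N systems versus one interface per system
# to a common reference architecture. N = 20 is an arbitrary illustrative value.
n_systems = 20

pairwise_interfaces = n_systems * (n_systems - 1) // 2   # specific interfaces
reference_interfaces = n_systems                         # one adapter per system

print(f"systems:                  {n_systems}")
print(f"pairwise interfaces:      {pairwise_interfaces}")   # 190, roughly 10x more
print(f"interfaces to reference:  {reference_interfaces}")

# If one system evolves:
print(f"interfaces to update (pairwise):  {n_systems - 1}")  # all of its links
print(f"interfaces to update (reference): 1")                # only its own adapter
```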

The only cost-efficient way is to adopt reference architectures at the various levels, usually known as meta-models, from which the conceptual models of the systems can easily be established. Following these reference architectures is indeed a constraint, but a much smaller one than the savings it generates, since the evolution of a system requires a priori only the modification of its interface with the reference model.
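As a minimal sketch of this idea (all class and field names are hypothetical and are not taken from any of the standards cited below), each system maps its native representation once onto a shared reference model; when a system evolves, only its own adapter has to change.

```python
from dataclasses import dataclass

# Hypothetical reference (meta-)model shared by all participating systems.
@dataclass
class ReferenceEntityState:
    entity_id: str
    position_m: tuple[float, float, float]    # common coordinate convention
    velocity_m_s: tuple[float, float, float]

class RadarSimAdapter:
    """Maps a radar simulation's native track format onto the reference model.
    If the radar simulation evolves, only this adapter needs to change."""
    def to_reference(self, native_track: dict) -> ReferenceEntityState:
        return ReferenceEntityState(
            entity_id=native_track["track_id"],
            position_m=tuple(native_track["pos"]),
            velocity_m_s=tuple(native_track["vel"]),
        )
```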

Such reference architectures exist: HLA for simulations; others are under development for C4ISR systems, or are foreseen for robotic systems (cf. JAUGS, the "Joint Architecture for Unmanned Ground Systems", promoted by the US Army)…

In a similar fashion, at the data level, exchange standards are essential. As there is a priori no unique format (SEDRIS should become a major one in the coming years), there is a need for meta-data standards. This is under study, for instance by the OMG, the Object Management Group.

A quick review of the main efforts that should be investigated in order to optimize the collaboration between tests and simulations yields the following:
