
The methodology allows for a breakdown and prioritization of the goals and objectives in the problem space into requirements and metrics for potential solution architectures. It also allows for the systematic aggregation of performance assessments and analyses into an overall picture of monitoring and verification architecture performance.
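To make the aggregation idea concrete, the following minimal sketch (in Python) rolls weighted component assessments up into a single architecture-level score. The component names, weights, and 0–1 scoring scale are assumptions made for illustration; the report does not prescribe a particular aggregation rule.

```python
# Illustrative sketch only: the report does not prescribe an aggregation
# rule. Component names, weights, and the 0-1 scoring scale are assumed.

def aggregate_score(assessments: dict[str, float],
                    weights: dict[str, float]) -> float:
    """Weighted average of component assessments (scores assumed in [0, 1])."""
    total = sum(weights[name] for name in assessments)
    return sum(score * weights[name]
               for name, score in assessments.items()) / total

# Hypothetical assessments for a candidate monitoring architecture.
assessments = {"detection": 0.8, "attribution": 0.6, "characterization": 0.7}
weights = {"detection": 0.5, "attribution": 0.3, "characterization": 0.2}

print(f"Overall architecture score: {aggregate_score(assessments, weights):.2f}")
# -> Overall architecture score: 0.72
```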

A.5. Proposed Decomposition Map Approach

The bridging method proposed by the Task Force is a decomposition approach that maps between problem-space descriptions and prospective solution architecture elements. This subsection describes the decomposition approach in general; the next one provides an example to illustrate its application.

As envisioned, the proposed approach begins with the selection of any node in the scenario framework discussed in Section A.2. The selected node is decomposed into any sub-nodes required to add appropriate fidelity or resolution to the analysis. A system of decomposition layers is then constructed beneath the scenario nodes. Those four layers, illustrated in the sketch following the list, are:

1. Strategic Capability Areas – This layer, while arbitrary, provides a convenient organizational structure when considering the universe of potential capability investments. As defined in this report, the strategic capability areas center on core elements of the mission space associated with reducing risk.

2. Functional Objectives – Within each Strategic Capability Area, several functional objectives can be articulated. These capture the high-level operational objectives that must be achieved. The articulation of these objectives must be performed by the decision maker, as there is no universal set. The metrics used to assess performance against those objectives must be derived by the analysts from the articulation of risk in the problem space.

3. Tasks – Each functional objective can be further decomposed into a set of tasks. The tasks themselves are part of a prospective solution architecture – i.e., tasks, like objectives, are not universally defined but are proposed as part of a solution option. Each task defines a specific component of a functional objective, to be accomplished through the application of assets. The measures used to assess the performance of a task are derived from the metrics used to assess performance against the functional objectives.

4. Assets – Each task is accomplished through the employment of assets. Assets can include hardware, platforms, people, training, concepts of operations, and programs – essentially any capability that can be specifically invested in. Requirements and metrics for assets are derived from the tasks. The mix of assets appropriate for consideration depends on the tasks proposed as part of prospective solution architectures.
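
The sketch below illustrates one way the four layers might be represented in code. It is a hedged illustration, not part of the report: the class and field names are assumptions, and each layer's performance is reduced to a simple mean of its children so that the direction of derivation (requirements flow down, assessments roll up) is visible.

```python
from dataclasses import dataclass, field

# Illustrative only: layer names follow the report, but the fields and the
# bottom-up scoring rule (a plain mean, assuming non-empty children) are
# assumptions made for this sketch.

@dataclass
class Asset:
    name: str
    score: float  # assessed performance against task-derived requirements

@dataclass
class Task:
    name: str
    assets: list[Asset] = field(default_factory=list)

    def score(self) -> float:
        # Measures of task performance are built up from asset assessments.
        return sum(a.score for a in self.assets) / len(self.assets)

@dataclass
class FunctionalObjective:
    name: str
    tasks: list[Task] = field(default_factory=list)

    def score(self) -> float:
        return sum(t.score() for t in self.tasks) / len(self.tasks)

@dataclass
class StrategicCapabilityArea:
    name: str
    objectives: list[FunctionalObjective] = field(default_factory=list)

    def score(self) -> float:
        return sum(o.score() for o in self.objectives) / len(self.objectives)

# Hypothetical example: one objective, one task, two assets.
area = StrategicCapabilityArea(
    name="Detect undeclared activity",
    objectives=[FunctionalObjective(
        name="Monitor declared sites",
        tasks=[Task(name="Collect imagery",
                    assets=[Asset("Satellite", 0.9), Asset("Analysts", 0.7)])])])
print(f"Area score: {area.score():.2f}")  # -> Area score: 0.80
```

In practice each layer would carry its own derived metrics and requirements rather than a single scalar; the scalar here only marks where assessments enter (assets) and where they aggregate (upward through tasks and objectives).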

As assessments of the performance of existing or proposed solution architectures and components are completed through analytical work, the results must first be cast in the framework of the decomposition map.

