TEST AND EVALUATION MANAGEMENT GUIDE

JANUARY 2005

FIFTH EDITION

PUBLISHED BY
THE DEFENSE ACQUISITION UNIVERSITY PRESS
FORT BELVOIR, VA 22060-5565


ACKNOWLEDGMENTS

This January 2005 update to the Defense Acquisition University's Test and Evaluation Management Guide includes updates from the Military Services, Defense Agencies, and other organizations, as well as changes in the May 2003 DoD 5000 series. Inputs were collated and finalized by DAU Program Director for Test and Evaluation Dr. John Claxton. Freelance writer Christina Cavoli and Defense AT&L editor-in-chief Collie Johnson edited the final document. Freelance designer and cartoonist Jim Elmore designed the cover, and final desktop publishing and layout was accomplished by Bartlett Communications and DAU Visual Information Specialist Deborah Gonzalez.


FOREWORD

This book is one of many technical management educational guides written from a Department of Defense perspective; i.e., non-Service specific. They are intended primarily for use in courses at the Defense Acquisition University and secondarily as a desk reference for program and project management personnel. These guidebooks are written for current and potential acquisition management personnel who are familiar with basic terms and definitions employed in program offices. The guidebooks are designed to assist government and industry personnel in executing their management responsibilities relative to the acquisition and support of defense systems.

The objective of a well-managed test and evaluation program is to provide timely and accurate information. This guide has been developed to assist the acquisition community in obtaining a better understanding of who the decision makers are and of how and when to plan test and evaluation events.

Dr. John D. Claxton
Program Director, T&E
Defense Acquisition University


CONTENTS

Acknowledgments
Foreword

MODULE I – Management of Test and Evaluation

Chapter 1 – Importance of Test and Evaluation
    1.1 Introduction
    1.2 Testing as a Risk Management Tool
    1.3 The T&E Contribution at Major Milestones
    1.4 Summary

Chapter 2 – The Test and Evaluation Process
    2.1 Introduction
    2.2 Defense System Acquisition Process
    2.3 T&E and the Systems Engineering Process (SEP)
    2.4 Definitions
    2.5 The DoD T&E Process
    2.6 Summary

Chapter 3 – T&E Policy Structure and Oversight Mechanism
    3.1 Introduction
    3.2 The Congress
    3.3 OSD Oversight Structure
    3.4 Service T&E Management Structures
    3.5 The T&E Executive Agent Structure
    3.6 Summary

Chapter 4 – Program Office Responsibilities for Test and Evaluation
    4.1 Introduction
    4.2 Relationship to the Program Manager
    4.3 Early Program Stages
    4.4 PMO/Contractor Test Management


    4.5 Integrated Product Teams for Test and Evaluation
    4.6 Test Program Funding/Budgeting
    4.7 Technical Reviews, Design Reviews, and Audits
    4.8 Contractor Testing
    4.9 Specifications
    4.10 Independent Test and Evaluation Agencies
    4.11 PMO Responsibilities for Operational Test and Evaluation
    4.12 Summary

Chapter 5 – Test-Related Documentation
    5.1 Introduction
    5.2 Requirements Documentation
    5.3 Program Decision Documentation
    5.4 Program Management Documentation
    5.5 Test Program Documentation
    5.6 Other Test-Related Status Reports
    5.7 Summary

Chapter 6 – Types of Test and Evaluation
    6.1 Introduction
    6.2 Development Test and Evaluation
    6.3 Operational Test and Evaluation
    6.4 Multi-Service Test and Evaluation
    6.5 Joint Test and Evaluation
    6.6 Live Fire Testing
    6.7 Nuclear, Biological, and Chemical (NBC) Weapons Testing
    6.8 Nuclear Hardness and Survivability (NHS) Testing
    6.9 Summary

MODULE II – Development Test and Evaluation

Chapter 7 – Introduction to Development Test and Evaluation
    7.1 Introduction
    7.2 DT&E and the System Acquisition Cycle
    7.3 DT&E Responsibilities
    7.4 Test Program Integration
    7.5 Areas of DT&E Focus
    7.6 System Design for Testing


    7.7 Impact of Warranties on T&E
    7.8 DT&E of Limited Procurement Quantity Programs
    7.9 Summary

Chapter 8 – DT&E Support of Technical Reviews and Milestone Decisions
    8.1 Introduction
    8.2 DT&E and the Review Process
    8.3 Configuration Change Control
    8.4 Summary

Chapter 9 – Combined and Concurrent Testing
    9.1 Introduction
    9.2 Combining Development Test and Operational Test
    9.3 Concurrent Testing
    9.4 Advantages and Limitations
    9.5 Summary

Chapter 10 – Production-Related Testing Activities
    10.1 Introduction
    10.2 Production Management
    10.3 Production Readiness Review (PRR)
    10.4 Qualification Testing
    10.5 Transition to Production
    10.6 Low Rate Initial Production (LRIP)
    10.7 Production Acceptance Test and Evaluation (PAT&E)
    10.8 Summary

MODULE III – Operational Test and Evaluation

Chapter 11 – Introduction to Operational Test and Evaluation
    11.1 Introduction
    11.2 Purpose of OT&E
    11.3 Test Participants
    11.4 Types of OT&E
    11.5 Test Planning
    11.6 Test Execution
    11.7 Test Reporting
    11.8 Summary


Chapter 12 – OT&E to Support Decision Reviews
    12.1 Introduction
    12.2 Concept Refinement (CR) and Technology Development (TD)
    12.3 System Development and Demonstration (SDD)
    12.4 Production and Deployment (P&D)
    12.5 Operations and Support (O&S)
    12.6 Summary

MODULE IV – Test and Evaluation Planning

Chapter 13 – Evaluation
    13.1 Introduction
    13.2 Difference Between "Test" and "Evaluation"
    13.3 The Evaluation Process
    13.4 Issues and Criteria
    13.5 Measures of Effectiveness (MOEs)
    13.6 Evaluation Planning
    13.7 Evaluating Development and Operational Tests
    13.8 Summary

Chapter 14 – Modeling and Simulation Support to T&E
    14.1 Introduction
    14.2 Types of Models and Simulations
    14.3 Validity of Modeling and Simulation
    14.4 Support to Test Design and Planning
    14.5 Support to Test Execution
    14.6 Support to Analysis and Test Reporting
    14.7 Simulation Integration
    14.8 Simulation Planning
    14.9 Summary

Chapter 15 – Test Resources
    15.1 Introduction
    15.2 Obtaining Test Resources
    15.3 Test Resource Planning
    15.4 Test Resource Funding
    15.5 Summary


Chapter 16 – Test and Evaluation Master Plan
    16.1 Introduction
    16.2 TEMP Development
    16.3 TEMP Format
    16.4 Summary

MODULE V – Specialized Testing

Chapter 17 – Software Systems Testing
    17.1 Introduction
    17.2 Definitions
    17.3 Purpose of Software Test and Evaluation
    17.4 Software Development Process
    17.5 T&E in the Software Life Cycle
    17.6 Summary

Chapter 18 – Testing for Vulnerability and Lethality
    18.1 Introduction
    18.2 Live Fire Testing
    18.3 Testing for Nuclear Hardness and Survivability
    18.4 Summary

Chapter 19 – Logistics Infrastructure T&E
    19.1 Introduction
    19.2 Planning for Logistics Support System T&E
    19.3 Conducting Logistics Support System T&E
    19.4 Limitations to Logistics Support System T&E
    19.5 Summary

Chapter 20 – EC/C4ISR Test and Evaluation
    20.1 Introduction
    20.2 Testing EC Systems
    20.3 Testing of C4ISR Systems
    20.4 Trends in Testing C4I Systems
    20.5 T&E of Surveillance and Reconnaissance Systems
    20.6 Summary


Chapter 21 – Multi-Service Tests
    21.1 Introduction
    21.2 Background
    21.3 Test Program Responsibilities
    21.4 Test Team Structure
    21.5 Test Planning
    21.6 Discrepancy Reporting
    21.7 Test Reporting
    21.8 Summary

Chapter 22 – International Test and Evaluation Programs
    22.1 Introduction
    22.2 Foreign Comparative Test Program
    22.3 NATO Comparative Test Program
    22.4 T&E Management in Multinational Programs
    22.5 U.S. and NATO Acquisition Programs
    22.6 Summary

Chapter 23 – Commercial and Non-Development Items
    23.1 Introduction
    23.2 Market Investigation and Procurement
    23.3 Commercial Item and NDI Testing
    23.4 Resources and Funding
    23.5 Summary

Chapter 24 – Testing the Special Cases
    24.1 Introduction
    24.2 Testing with Limitations
    24.3 Space System Testing
    24.4 Operations Security and T&E
    24.5 Advanced Concept Technology Demonstrations
    24.6 Summary

Chapter 25 – Best Practices in T&E of Weapon Systems
    25.1 Introduction
    25.2 Specific Weapon Systems Testing Checklist


APPENDICES

Appendix A – Acronyms and Their Meanings
Appendix B – DoD Glossary of Test Terminology
Appendix C – Test-Related Data Item Descriptions
Appendix D – Bibliography
Appendix E – Index
Appendix F – Points of Contact for Service Test and Evaluation Courses

LIST OF FIGURES

Figure 1-1. The 5000 Model
Figure 1-2. Life Cycle Cost Decision Impact and Expenditures
Figure 2-1. Testing and the Acquisition Process
Figure 2-2. Requirements to Design
Figure 2-3. The Systems Engineering Process
Figure 2-4. Systems Engineering Process and Test and Evaluation
Figure 2-5. Design Reviews
Figure 2-6. DoD Test and Evaluation Process
Figure 3-1. DoD Test and Evaluation Organization
Figure 3-2. Army Test and Evaluation Organization
Figure 3-3. Navy Test and Evaluation Organization
Figure 3-4. Air Force Test and Evaluation Organization
Figure 3-5. Test and Evaluation Executive Agent Structure
Figure 4-1. Lessons Learned from OT&E for the PM
Figure 5-1. Test-Related Documentation
Figure 5-2. Requirements Definition Process
Figure 5-3. Test Program Documentation
Figure 6-1. Testing During Acquisition
Figure 6-2. Sample Hierarchy of Questions Addressed in OT&E
Figure 7-1. Contractor's Integrated Test Requirements
Figure 7-2. Design for Testing Procedures
Figure 8-1. Relationship of DT&E to the Acquisition Process
Figure 8-2. Technical Review Timeline
Figure 8-3. Specifications Summary
Figure 11-1. Organizational Breakdown of the I-81mm Mortar Operational Test Directorate
Figure 12-1. OT&E Related to the Milestone Process
Figure 12-2. Sources of Data


Figure 13-1. Functional Block Diagram of the Evaluation Process
Figure 13-2. Dendritic Approach to Test and Evaluation
Figure 14-1. The Simulation Spectrum
Figure 14-2. Values of Selected Criteria Conducive to Modeling and Simulation
Figure 14-3. Modeling and Simulation Application in Test and Evaluation
Figure 14-4. STEP Process
Figure 15-1. DoD Major Range and Test Facility Base
Figure 17-1. The Error Avalanche
Figure 17-2. System Development Process
Figure 17-3. Spiral Model of AIS Development Process
Figure 18-1. Live Fire Test and Evaluation Planning Guide
Figure 19-1. Logistics Supportability Objectives in the T&E Program
Figure 20-1. Integrated EC Testing Approach
Figure 20-2. EC Test Process Concept
Figure 20-3. EC Test Resource Categories
Figure 20-4. The Evolutionary Acquisition Process
Figure 21-1. Simple Multi-Service OT&E Test Team Composition
Figure 23-1. The Spectrum of Technology Maturity

LIST OF TABLES

Table 5-1. Sample Test Plan Contents
Table 6-1. Differences Between DT&E and IOT&E
Table 8-1. Technical Reviews and Audits
Table 9-1. Combined vs. Concurrent Testing: Advantages and Limitations
Table 10-1. PRR Guidelines Checklist
Table 13-1. Sample Evaluation Plan
Table 15-1. TEMP Test Resource Summary Section
Table 16-1. Test and Evaluation Master Plan Format
Table 18-1. Relationships Between Key Concepts
Table 18-2. Types of Live Fire Testing
Table 18-3. Nuclear Hardness and Survivability Assessment Activities


MODULE I
MANAGEMENT OF TEST AND EVALUATION

Test and Evaluation is a management tool and an integral part of the development process. This module addresses the policy structure and oversight mechanisms in place for test and evaluation.


CHAPTER 1
IMPORTANCE OF TEST AND EVALUATION

1.1 INTRODUCTION

The Test and Evaluation (T&E) process is an integral part of the Systems Engineering Process (SEP), which identifies levels of performance and assists the developer in correcting deficiencies. It is a significant element in the decision-making process, providing data that support trade-off analysis, risk reduction, and requirements refinement. Program decisions on system performance maturity and readiness to advance to the next phase of development take demonstrated performance into consideration. The issue of paramount importance to the servicemember user is system performance; i.e., will it fulfill the mission? The T&E process provides data that tell the user how well the system is performing during development and whether it is ready for fielding. The Program Manager (PM) must balance the risks of cost, schedule, and performance to keep the program on track to production and fielding. The responsibility of decision-making authorities centers on assessing risk tradeoffs. As stated in Department of Defense Directive (DoDD) 5000.1, The Defense Acquisition System, "Test and evaluation shall be integrated throughout the defense acquisition process. Test and evaluation shall be structured to provide essential information to decision makers, assess attainment of technical performance parameters, and determine whether systems are operationally effective, suitable, survivable, and safe for intended use. The conduct of test and evaluation, integrated with modeling and simulation, shall facilitate learning, assess technology maturity and interoperability, facilitate integration into fielded forces, and confirm performance against documented capability needs and adversary capabilities as described in the system threat assessment."[1]

1.2 TESTING AS A RISK MANAGEMENT TOOL

Correcting defects in weapons has been estimated to add from 10 percent to 30 percent to the cost of each item.[2] Such costly redesign and modification efforts can be reduced if carefully planned and executed T&E programs are used to detect and fix system deficiencies early in the acquisition process (Figure 1-1). Fixes instituted during early work efforts (Systems Integration (SI)) in the System Development and Demonstration (SDD) Phase cost significantly less than those implemented after the Critical Design Review (CDR), when most design decisions have already been made.

T&E results figure prominently in the decisions reached at design and milestone reviews. However, the fact that T&E results are required at major decision points does not presuppose that T&E results must always be favorable. The final decision responsibility lies with the decision maker, who must examine the critical issues and weigh the facts. Only the decision maker can determine the weight and importance to be attributed to a system's capabilities and shortcomings and the degree of risk that can be accepted. The decision-making authority will be unable to make this judgment without a solid base of information provided by T&E. Figure 1-2 illustrates the Life Cycle Cost (LCC) of a system and how decisions impact program expenditures.
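To put the 10 to 30 percent estimate above in concrete terms, a rough worked illustration follows. The unit cost and quantity are assumed purely for illustration; they are not drawn from this guide or its sources.

    Assumed average unit cost: $2.0 million; assumed production quantity: 500 items.
    Low estimate:  0.10 x $2.0M = $0.2M of defect-correction cost per item.
    High estimate: 0.30 x $2.0M = $0.6M of defect-correction cost per item.
    Across the buy: 500 x ($0.2M to $0.6M) = roughly $100M to $300M in added cost.

Even at the low end of such an illustration, the potential added cost shows why the guide argues for detecting and fixing deficiencies early, before most design decisions are locked in at the Critical Design Review.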


Figure 1-1. The 5000 Model. The figure depicts the defense acquisition framework: Concept Refinement (paper studies of alternative concepts for meeting a mission), Technology Development (development and demonstration of the subsystems, components, and new system concepts that must be demonstrated before integration), System Development and Demonstration (System Integration, the Design Readiness Review, and System Demonstration, including combined DT/OT), Production and Deployment (LRIP with IOT&E and LFT&E of production-representative articles, then full rate production and deployment), and Operations and Support. Milestone A begins Technology Development, Milestone B begins System Development, and Milestone C commits to production; multiple entry points are possible depending on technical and concept maturity, with continuous communication with users and early, continuous testing throughout.


A 1999 Defense Science Board (DSB) Task Force focused on a broad overview of the state of T&E within the Department of Defense (DoD).[3] The group made the following observations about the T&E process:

• The focus of T&E should be on how best to support the acquisition process;
• T&E planning with Operational Test (OT) personnel should start early in the acquisition cycle;
• Distrust remains between the development and test communities;
• Contractor testing, developmental testing, and operational testing have some overlapping functions;
• Ensuring that test data are independently evaluated is the essential element, not the taking of the data itself;
• Responses to perceived test "failures" are often inappropriate and counterproductive.

An earlier DSB Task Force (1983) developed a set of templates for use in establishing and maintaining low-risk programs. Each template described an area of risk and then specified technical methods for reducing that risk. PMs and test managers may wish to consult these templates for guidance in reducing the risks frequently associated with test programs. The DoD manual Transition from Development to Production contains sample risk management templates.[4]

Figure 1-2. Life Cycle Cost Decision Impact and Expenditures (Source: Defense Acquisition University). The figure contrasts the percentage of life cycle cost committed by decisions with total expenditures for RDT&E, production, and operations and support across the program milestones (A, B, C, and FRPDR) and phases (CR, TD, SDD, P&D, and Operations and Support).


1.3 THE T&E CONTRIBUTION AT MAJOR MILESTONES

T&E progress is monitored by the Office of the Secretary of Defense (OSD) throughout the acquisition process. Its oversight extends to Major Defense Acquisition Programs (MDAPs) or designated acquisitions. T&E officials within OSD render independent assessments to the Defense Acquisition Board (DAB), the Defense Acquisition Executive (DAE), and the Secretary of Defense (SECDEF) at each system milestone review. These assessments are based on the following T&E information:

• The Test and Evaluation Master Plan (TEMP) and more detailed supporting documents developed by responsible Service activities;
• Service test agency reports and briefings;
• T&E, Modeling and Simulation (M&S), and data from other sources such as Service PMs, laboratories, industry developers, studies, and analyses.

At Milestone B, the OSD T&E assessments reflect an evaluation of system concepts and technology alternatives using early performance parameter objectives and thresholds found in an approved preliminary TEMP. At Milestone C, assessments include an evaluation of previously executed test plans and test results. At the Full Rate Production Decision Review (FRPDR), assessments include consideration of the operational effectiveness and suitability evaluations of weapon systems.

A primary contribution made by T&E is the detection and reporting of deficiencies that may adversely impact the performance capability or availability/supportability of a system. A deficiency reporting process is used throughout the acquisition process to report, evaluate, and track system deficiencies and to provide the impetus for corrective actions that improve performance to desired levels.

1.3.1 T&E Contributions Prior to Milestone B

During Concept Refinement (CR) and Technology Development (TD) activities prior to Milestone B, laboratory testing and M&S are conducted by the contractors and the development agency to demonstrate and assess the capabilities of key subsystems and components. The test and simulation designs are based on the operational needs documented in the Initial Capabilities Document (ICD). Studies, analyses, simulation, and test data are used by the development agency to explore and evaluate alternative concepts proposed to satisfy the user's needs. Also during this period, the Operational Test Agency (OTA) monitors CR and TD activities to gather information for future T&E planning and to provide the effectiveness and suitability input desired by the PM. The OTA also conducts Early Operational Assessments (EOAs), as feasible, to assess the operational impact of candidate technical approaches and to assist in selecting preferred alternative system concepts.

The development agency prepares the Technology Development Strategy (TDS) with its early T&E strategy for Development Test and Evaluation (DT&E) and M&S. The TDS presents T&E plans for system design(s) engineering and performance evaluations. The OTA may provide an EOA. This information is incorporated into the PM's TEMP, which documents the program's T&E strategy in support of the Milestone B decision to proceed to the next phase.

1.3.2 T&E Contributions Prior to Milestone C

During the SDD Phase, concepts approved for prototyping form the baseline that is used for detailed test planning. The design is matured into an Engineering Development Model (EDM), which is tested in its intended environment prior to Milestone C.

In SI, the development agency conducts DT&E to assist with engineering design, system development, and risk identification, and to evaluate the contractor's ability to attain desired technical performance in system specifications and achieve program objectives. The DT&E includes T&E of components, subsystems, and prototype development models. T&E of functional compatibility, interoperability, and integration with fielded and developing equipment and systems is also included. During this phase of testing, adequate DT&E is accomplished to ensure engineering is reasonably complete (including survivability/vulnerability, compatibility, transportability, interoperability, reliability, maintainability, safety, human factors, and logistics supportability). This phase also confirms that all significant design problems have been identified and that solutions to these problems are in hand.

The Service Operational Test and Evaluation (OT&E) agency should conduct an EOA for the Design Readiness Review (DRR) to estimate the system's potential to be operationally effective and suitable; identify needed modifications; and provide information on tactics, doctrine, organization, and personnel requirements. The early OT&E program is accomplished in an environment containing limited operational realism. Typical operational and support personnel are used to obtain early estimates of the user's capability to operate and maintain the system. Some of the most important products of user assessments of system maintainability and supportability are human factors and safety issues.

In System Demonstration, the objective is to design, fabricate, and test a preproduction system that closely approximates the final product. T&E activities on the EDM during this period yield much useful information. For example, data obtained during EDM T&E can be used to assist in evaluating the system's maintenance training requirements and the proposed training program. Test results generated during EDM T&E also support the user in refining and updating employment doctrine and tactics.

During System Demonstration, T&E is conducted to satisfy the following objectives:

(1) As specified in program documents, assess the critical technical issues:
    (a) Determine how well the development contract specifications have been met;
    (b) Identify system technical deficiencies and focus on areas for corrective actions;
    (c) Determine whether the system is compatible, interoperable, and can be integrated with existing and planned equipment or systems;
    (d) Estimate the Reliability, Availability, and Maintainability (RAM) of the system after it is deployed;
    (e) Determine whether the system is safe and ready for Low Rate Initial Production (LRIP);
    (f) Evaluate effects on performance of any configuration changes caused by correcting deficiencies, modifications, or Product Improvements (PIs);
    (g) Assess human factors and identify limiting factors.
(2) Assess the technical risk and evaluate the tradeoffs among specifications, operational requirements, LCCs, and schedules;
(3) Assess the survivability, vulnerability, and logistics supportability of the system;
(4) Verify the accuracy and completeness of the technical documentation developed to maintain and operate the weapon system;
(5) Gather information for training programs and technical training materials needed to support the weapon system;
(6) Provide information on environmental issues for use in preparing environmental impact assessments;
(7) Determine system performance limitations and safe operating parameters.

Thus, T&E activities intensify during this phase and make significant contributions to the overall acquisition decision process.

The development agency evaluates the results of T&E for review by the Service headquarters and the Service acquisition review council prior to system acquisition review by the Milestone Decision Authority (MDA). The evaluation includes the results of testing and supporting information, conclusions, and recommendations for further engineering development. At the same time, the OT&E agency prepares an Independent Operational Assessment (IOA), which contains estimates of the system's potential operational effectiveness and suitability. The Operational Assessment (OA) provides a permanent record of OT&E events, an audit trail of OT&E data, test results, conclusions, and recommendations. This information is used to prepare for Milestone C and supports a recommendation of whether the design and performance of the system in development justify proceeding into LRIP.

1.3.3 T&E Contributions Prior to Full Rate Production Decision Review

The development agency transitions the final design to LRIP while fixing and verifying any technical problems discovered during the final testing of the EDM in its intended environment. The maturity of the hardware and software configurations and the logistics support system available from LRIP are assessed when the development agency considers certifying the system's readiness for Initial Operational Test and Evaluation (IOT&E).

IOT&E is conducted prior to the production decision at the FRPDR to:

(1) Estimate the operational effectiveness and suitability of the system;
(2) Identify operational deficiencies;
(3) Evaluate changes in production configuration;
(4) Provide information for developing and refining logistics support requirements for the system and training, tactics, techniques, and doctrine;
(5) Provide information to refine Operations and Support (O&S) cost estimates and identify system characteristics or deficiencies that can significantly impact O&S costs;
(6) Determine whether the technical publications and support equipment are adequate in the operational environment.

Before the certification of readiness for IOT&E, the developer should have obtained the Joint Interoperability Test Command's (JITC) certification of interoperability for the system components. In parallel with IOT&E, Live Fire Test and Evaluation (LFT&E) may be used to evaluate the vulnerability or lethality of a weapon system, as appropriate and as required by public law. The PM's briefing and the Beyond Low Rate Initial Production (BLRIP) report address the risks of proceeding into Full Rate Production (FRP).


1.3.4 T&E Contributions After the Full RateProduction Decision ReviewAfter FRPDR, when the FRP decision is normallymade, T&E activities continue to provide importantinsights. <strong>Test</strong>s described in the TEMP butnot conducted during earlier phases are <strong>com</strong>pleted.The residual DT&E may include extremeweather testing <strong>and</strong> testing corrected deficiencies.System elements are integrated into the finaloperational configuration, <strong>and</strong> development testingis <strong>com</strong>pleted when all system performancerequirements are met. During FRP, governmentrepresentatives normally monitor or conduct theProduction Acceptance <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong>(PAT&E). Each system is verified by PAT&E for<strong>com</strong>pliance with the requirements <strong>and</strong>specifications of the contract.Post-production testing requirements may resultfrom an acquisition strategy calling for incrementchanges to ac<strong>com</strong>modate accumulated engineeringchanges or the application of PreplannedProduct Improvements (P 3 Is). This will allowparallel development of high-risk technology <strong>and</strong>modular insertion of system upgrades into productionequipment. Technology breakthroughs<strong>and</strong> significant threat changes may require systemmodifications. The development of the modificationswill require development testing; if systemperformance is significantly changed, somelevel of operational testing may be appropriate.OT&E activities continue after the FRP decisionin the form of Follow-on Operational <strong>Test</strong> <strong>and</strong><strong>Evaluation</strong> (FOT&E). The initial phase of FOT&Emay be conducted by either the OT&E agency oruser <strong>com</strong>m<strong>and</strong>s, depending on Service directives.This verifies the operational effectiveness <strong>and</strong>suitability of the production system, determinesif deficiencies identified during the IOT&E havebeen corrected, <strong>and</strong> evaluates areas not tested duringIOT&E due to system limitations. AdditionalFOT&E may be conducted over the life of thesystem to refine doctrine, tactics, techniques, <strong>and</strong>training programs, <strong>and</strong> to evaluate future increments,modifications, <strong>and</strong> upgrades.The OT&E agency prepares an OA report at theconclusion of each FOT&E. This report recordstest results, describes the evaluation ac<strong>com</strong>plishedto satisfy critical issues <strong>and</strong> objectives establishedfor FOT&E, <strong>and</strong> documents its assessment ofdeficiencies resolved after SDD. 
Deficiencies thatare not corrected are recorded.A final report on FOT&E may also be preparedby the using <strong>com</strong>m<strong>and</strong> test team, emphasizingthe operational utility of the system when operated,maintained, <strong>and</strong> supported by operationalpersonnel using the concepts specified for the system.Specific attention is devoted to thefollowing:(1) The degree to which the system ac<strong>com</strong>plishesits missions when employed byoperational personnel in a realistic scenariowith the appropriate organization, doctrine,threat (including countermeasures <strong>and</strong>nuclear threats), environment, <strong>and</strong> usingtactics <strong>and</strong> techniques developed duringearlier FOT&E;(2) The degree to which the system can beplaced in operational field use, with specificevaluations of availability, <strong>com</strong>patibility,transportability, interoperability, reliability,wartime usage rates, maintainability,safety, human factors, manpower supportability,logistics supportability, <strong>and</strong> trainingrequirements;(3) The conditions under which the system wastested including the natural weather <strong>and</strong>climatic conditions, terrain effects, battlefielddisturbances, <strong>and</strong> enemy threatconditions;(4) The ability of the system to perform itsrequired functions for the duration of aspecified mission profile;1-7


(5) System weaknesses, such as the vulnerability of the system to exploitation by countermeasure techniques and the practicality and probability of an adversary exploiting the susceptibility of a system in combat.

A specific evaluation of the personnel and logistics changes needed for the effective integration of the system into the user's inventory is also made. These assessments provide essential input for the later acquisition phases of the system development cycle.

1.4 SUMMARY

"Risk management," according to Transition from Development to Production, "is the means by which the program areas of vulnerability and concern are identified and managed."5 T&E is the discipline that helps to illuminate those areas of vulnerability. The importance of T&E in the acquisition process is summarized well in a July 2000 General Accounting Office (GAO) Report, NSIAD-00-199, Best Practices: A More Constructive Test Approach is Key to Better Weapon System Outcomes.6 The summary serves to underscore the importance of the T&E process as a whole:

• Problems found late in development signal weaknesses in testing and evaluation;
• Early testing to validate product knowledge is a best practice;
• Different incentives make testing a more constructive factor in commercial programs than in weapon system programs.

"To lessen the dependence on testing late in development and to foster a more constructive relationship between program managers and testers, GAO recommends that the Secretary of Defense instruct acquisition managers to structure test plans around the attainment of increasing levels of product maturity, orchestrate the right mix of tools to validate these maturity levels, and build and resource acquisition strategies around this approach. GAO also recommends that validation of lower levels of product maturity not be deferred to the third level. Finally, GAO recommends that the Secretary require that weapon systems demonstrate a specified level of product maturity before major programmatic approvals."7


ENDNOTES

1. DoDD 5000.1, The Defense Acquisition System, May 12, 2003, p. 8.
2. BDM Corporation, Functional Description of the Acquisition Test and Evaluation Process, July 8, 1983.
3. Defense Science Board Task Force Report, Solving the Risk Equation in Transitioning from Development to Production, May 25, 1983 (later published as DoD Manual 4245.7).
4. DoD 4245.7-M, Transition from Development to Production, September 1985.
5. Ibid.
6. GAO/NSIAD-00-199, Best Practices: A More Constructive Test Approach is Key to Better Weapon System Outcomes, July 2000, p. 11.
7. Ibid.




2
THE TEST AND EVALUATION PROCESS

2.1 INTRODUCTION

The fundamental purpose of Test and Evaluation (T&E) in a defense system's development and acquisition program is to identify the areas of risk to be reduced or eliminated. During the early phases of development, T&E is conducted to demonstrate the feasibility of conceptual approaches, evaluate design risk, identify design alternatives, compare and analyze tradeoffs, and estimate satisfaction of operational requirements. As a system undergoes design and development, the iterative process of testing moves gradually from a concentration on Development Test and Evaluation (DT&E), which is concerned chiefly with attainment of engineering design goals, to increasingly comprehensive Operational Test and Evaluation (OT&E), which focuses on questions of operational effectiveness, suitability, and survivability. Although there are usually separate Development Test (DT) and Operational Test (OT) events, DT&E and OT&E are not necessarily serial phases in the evolution of a weapon system development. Combined or concurrent DT and OT are encouraged when appropriate, i.e., when they offer possible cost or time savings.1

T&E has its origins in the testing of hardware, and this tradition is heavily embedded in its vocabulary and procedures. The advent of software-intensive systems has brought new challenges to testing; new approaches are discussed in Chapter 17 of this guide. Remaining constant throughout the T&E process, whether testing hardware or software, is the need for thorough, logical, systematic, and early test planning, including feedback of well-documented and unbiased T&E results to system developers, users, and decision makers.

T&E has many useful functions and provides information to many customers. T&E gives information to developers for identifying and resolving technical difficulties; to decision makers responsible for procuring a new system and for the best use of limited resources; and to operational users for refining requirements and supporting development of effective tactics, doctrine, and procedures.

2.2 DEFENSE SYSTEM ACQUISITION PROCESS

The defense system acquisition process was revised significantly in 2003 to change its primary objective to the acquisition of quality products that satisfy user needs with measurable improvements to mission capability and operational support, in a timely manner, and at a fair and reasonable price. As it is now structured, the defense system life cycle is to replicate the preferred acquisition strategy of an Evolutionary Acquisition (EA) process that uses either incremental or spiral development. The three major elements (pre-system acquisition, system acquisition, and sustainment) may include the following five phases:

(1) Concept Refinement (CR)
(2) Technology Development (TD)
(3) System Development and Demonstration (SDD)


(4) Production and Deployment (P&D)
(5) Operations and Support (O&S).

Figure 2-1. Testing and the Acquisition Process

As Figure 2-1 shows, these phases are separated by key decision points at which a Milestone Decision Authority (MDA) reviews a program and authorizes advancement to the next phase in the cycle. Thus T&E planning and test results play an important part in the MDA review process. The following description of the defense system acquisition process, summarized from Department of Defense Instruction (DoDI) 5000.2, Operation of the Defense Acquisition System,2 shows how T&E fits within the context of the larger process.

2.2.1 Concept Refinement and Technology Development

The defense pre-system acquisition process begins with the requirements process: identification of a materiel need and development of the Initial Capabilities Document (ICD). A decision is made to start CR, and the Technology Development Strategy (TDS) evolves with the initial test planning concepts. CR ends when the MDA approves the preferred solution resulting from the Analysis of Alternatives (AoA), identifying Measures of Effectiveness (MOEs), and approves the associated TDS. The TD Phase follows a Milestone A decision, during which the risks of the relevant technologies of the alternative selected for satisfying the user's needs are investigated.


Shortly after the milestone decision, an integrated team begins transitioning the test planning in the TDS into the evaluation strategy for formulation of a Test and Evaluation Master Plan (TEMP). The TD effort concludes with a decision review at Milestone B when an affordable increment of militarily useful capability has been identified, the technology for that increment has been demonstrated in a relevant environment, and a system can be developed for production within a short time frame (normally less than 5 years), or when the MDA decides to terminate the effort. Typical T&E-related documents at the Milestone B review are the Acquisition Decision Memorandum (ADM) (exit criteria), ICD, Capability Development Document (CDD) (performance parameters), Acquisition Strategy, System Threat Assessment (STA), an Early Operational Assessment (EOA), and the TEMP. Additional program management documents prepared before Milestone B include the AoA, Independent Cost Estimate (ICE), and the concept baseline version of the Acquisition Program Baseline (APB), which summarizes the weapon's functional specifications, performance parameters, and cost and schedule objectives.

The program office for major programs (those under Office of the Secretary of Defense (OSD) oversight) must give consideration to requesting a waiver for full-up system-level Live Fire Testing (LFT) and to identification of Low Rate Initial Production (LRIP) quantities for Initial Operational Test and Evaluation (IOT&E).

2.2.2 System Development and Demonstration

The Milestone B decision is program initiation for systems acquisition and establishes broad objectives for program cost, schedule, and technical performance. After the Milestone B decision for a program start, the Systems Integration (SI) work effort begins, during which a selected concept, typically a brassboard or early prototype, is refined through systems engineering, analysis, and design. Systems engineering must manage all requirements as an integrated set of design constraints that are allocated down through the various levels of design (Figure 2-2). This work effort ends when the integration of the system components has been demonstrated through adequate developmental testing of prototypes. The Design Readiness Review (DRR) decision evaluates design maturity and readiness to either enter into System Demonstration or make a change to the acquisition strategy. The System Demonstration work effort is intended to demonstrate the ability of the system to operate in a useful way consistent with the approved Key Performance Parameters (KPPs). Work advances the design to an Engineering Development Model (EDM) that is evaluated for readiness to enter LRIP. "Successful development test and evaluation to assess technical progress against critical technical parameters, early operational assessments, and, where proven capabilities exist, the use of modeling and simulation to demonstrate system integration are critical during this effort.
Deficiencies encountered in testing prior to Milestone C shall be resolved prior to proceeding beyond LRIP."3 The EDM should have demonstrated acceptable performance in DT&E and the Operational Assessment (OA), with acceptable interoperability and operational supportability. Preparation for the Milestone C decision establishes more refined cost, schedule, and performance objectives and thresholds, and the Capability Production Document (CPD) establishes the criteria for testing of LRIP items. Documents of interest to the T&E manager at the time of the Milestone C review include the ADM (exit criteria), updated TEMP, updated STA, AoA, Development Baseline, development testing results, and the OA.

2.2.3 Production and Deployment

The purpose of the LRIP work effort is to achieve an operational capability that satisfies mission needs.


Figure 2-2. Requirements to Design (important systems engineering design considerations, "the fishbone"). Source: Robert J. Skalamera, "Implementing OSD Systems Engineering Policy," briefing to the USD(AT&L), November 2004.


The selected system design and its principal items of support are fabricated as production configuration models. Test articles normally are subjected to qualification testing, full-up LFT, and IOT&E. This work effort ends with the Full Rate Production Decision Review (FRPDR), marking entry into Full Rate Production (FRP) and deployment of the system for Initial Operational Capability (IOC). Key documents for the T&E manager at the time of the FRPDR are the updated TEMP, development testing results, the Service IOT&E report, and the Live Fire Test Report.

For Acquisition Category (ACAT) I and designated oversight programs, the Director of Operational Test and Evaluation (DOT&E) is required by law to document the assessment of the adequacy of IOT&E and the reported operational effectiveness and suitability of the system. This is done in the Beyond LRIP (BLRIP) Report. Also mandated by law is the requirement for the DOT&E to submit the Live Fire Test Report prior to the program proceeding beyond LRIP. These DOT&E reports may be submitted as a single document.

2.2.4 Operations and Support

Production continues at full rate, allowing continued deployment of the system to operating locations and achievement of Full Operational Capability (FOC). This phase may include major modifications to the production configuration, increment upgrades, and related Follow-on Operational Test and Evaluation (FOT&E) (OA). Approval for major modifications should identify the actions and resources needed to achieve and maintain operational readiness and support objectives. The high cost of changes may require initiation of the modification as a new program. To determine whether major upgrades/modifications are necessary or deficiencies warrant consideration of replacement, the MDA may review the impact of proposed changes on system operational effectiveness, suitability, and readiness.

2.3 T&E AND THE SYSTEMS ENGINEERING PROCESS (SEP)

In the early 1970s, Department of Defense (DoD) test policy became more formalized and placed greater emphasis on T&E as a continuing function throughout the acquisition cycle. These policies stressed the use of T&E to reduce acquisition risk and provide early and continuing estimates of system operational effectiveness and operational suitability. To meet these objectives, appropriate test activities had to be fully integrated into the overall development process. From a systems engineering perspective, test planning, testing, and analysis of test results are integral parts of the basic product definition process.

Systems engineering, as defined in the DoD context, is an interdisciplinary approach to evolve and verify an integrated and optimally balanced set of product and process designs that satisfy user needs and provide information for management decision making (Figure 2-3).

A system's life cycle begins with the user's needs, which are expressed as constraints and the required capabilities needed to satisfy mission objectives.
Systems engineering is essential in the earliest planning period, in conceiving the system concept and defining performance requirements for system elements. As the detailed design is prepared, systems engineers ensure balanced influence of all required design specialties, including testability. They resolve interface problems, perform design reviews and trade-off analyses, and assist in verifying performance.

The days when one or two individuals could design a complex system, especially a large, modern weapon system, are in the past. Modern systems are too complex for a small number of generalists to manage; they require in-depth knowledge of a broad range of areas and technical disciplines. Systems engineers coordinate the many specialized engineers involved in the concurrent engineering process through Integrated Product and Process Development (IPPD).


Figure 2-3. The Systems Engineering Process. Source: Robert J. Skalamera, "Implementing OSD Systems Engineering Policy," briefing to the USD(AT&L), November 2004.


Integrated Product Teams (IPTs) are responsible for the integration of the components into a system. Through interdisciplinary integration, a systems engineer manages the progress of product definition from the system level to the configuration-item level, the detailed level, deficiency correction, and modifications/Product Improvements (PIs). Test results provide feedback to analyze the design progress toward performance goals. Tools of systems engineering include design reviews, configuration management, simulation, Technical Performance Measurement (TPM), trade-off analysis, and specifications.

What happens during systems engineering? The process determines what specialists are required, what segments and Non-Developmental Items (NDIs) are used, design performance limits, trade-off criteria, how to test, when to test, how to document (specifications), and what management controls to apply (TPM and design reviews).

Development and operational testing support the technical reviews by providing feedback to the SEP. More information on the reviews is contained in Chapter 8.

2.3.1 The Systems Engineering Process

The SEP is the iterative, logical sequence of analysis, design, test, and decision activities that transforms an operational need into the descriptions required for production and fielding of all operational and support system elements. This process consists of four activities: requirements analysis, functional analysis and allocation, synthesis, and verification of performance (T&E). These activities support decisions on tradeoffs and formalize the description of system elements. The systems engineering plan is described in Chapter 16 of the Defense Acquisition University guide Systems Engineering Fundamentals, January 2001.

The requirements analysis activity is a process used by the program office, in concert with the user, to establish and refine operational and design requirements that result in the proper balance between performance and cost within affordability constraints. Requirements analysis shall be conducted iteratively with Functional Analysis/Allocation (FA/A) to develop and refine system-level functional and performance requirements and external interfaces, and to provide traceability among user requirements and design requirements.

The FA/A activity identifies what the system, component, or part must do. It normally works from the top downward, ensuring requirements traceability and examining alternative concepts. This is done without assuming how functions will be accomplished. The product is a series of alternative Functional Flow Block Diagrams (FFBDs). A functional analysis can be applied at every level of development; at the system level, it may be a contractor or Service effort. During the CR and TD phases, developmental testers assist the functional analysis activity to help determine what each component's role will be as part of the system being developed. Performance requirements are allocated to system components.
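To make the idea of top-down functional decomposition and requirement allocation concrete, the brief Python sketch below shows one way the traceability thread from functional blocks to allocated performance requirements could be represented. It is illustrative only; the function names, requirement values, and structure are invented for this example and are not prescribed by this guide or by DoD policy.

    # Hypothetical sketch: functional blocks (FFBD nodes) with allocated requirements.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Requirement:
        req_id: str
        text: str
        verification: str  # e.g., "test", "analysis", "demonstration", "inspection"

    @dataclass
    class FunctionBlock:
        func_id: str
        name: str
        allocated: List[Requirement] = field(default_factory=list)
        children: List["FunctionBlock"] = field(default_factory=list)

    def unallocated_leaves(block: FunctionBlock) -> List[str]:
        """Walk the functional decomposition top-down and flag leaf-level
        functions that have no performance requirement allocated to them."""
        if not block.children:
            return [] if block.allocated else [block.func_id]
        gaps: List[str] = []
        for child in block.children:
            gaps.extend(unallocated_leaves(child))
        return gaps

    # Invented example: one sub-function has a testable requirement, one does not yet.
    detect = FunctionBlock("F1.1", "Detect target",
                           allocated=[Requirement("PR-001", "Detection range >= 20 km", "test")])
    track = FunctionBlock("F1.2", "Track target")
    system = FunctionBlock("F1", "Engage target", children=[detect, track])

    print(unallocated_leaves(system))   # -> ['F1.2']

In practice this traceability is maintained in systems engineering documentation (the FFBDs above and the Requirements Allocation Sheets introduced below), not in code; the sketch only illustrates the underlying decomposition and allocation relationships.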
The synthesis activity involves invention (conceiving ways to do each FFBD task) to answer the "how" question. Next, the physical interfaces implied by the "how" answers are carefully identified (topological or temporal). The answers must reflect all technology selection factors. Synthesis tools include Requirements Allocation Sheets (RASs), which translate functional statements into design requirements and permit a long and complex interactive invention process with control, visibility, and requirements traceability. Developmental testers conduct prototype testing to determine how the components will perform their assigned functions, which assists this synthesis activity.

The verification loop and decision activity allows tradeoff of alternative approaches to "how."


This activity is conducted in accordance with decision criteria set by higher-level technical requirements for such things as Life Cycle Costs (LCCs); effectiveness; Reliability, Availability, and Maintainability (RAM); risk limits; and schedule. It is repeated at each level of development. The verification and decision activity is assisted by developmental testers during the later SDD phase, when competitive testing between alternative approaches is performed.

The final activity is the description of system elements. Developing as the result of the previous activities and as the final system design is determined, this activity takes form when specifications are verified through testing and when they are reviewed in the Physical Configuration Audit (PCA) and Functional Configuration Audit (FCA). Operational testers may assist in this activity. They conduct operational testing of the test items/systems to help determine the personnel, equipment, facilities, software, and technical data requirements of the new system when used by typical military personnel. Figure 2-4, Systems Engineering Process and Test and Evaluation, depicts the activities and their interactions.

2.3.2 Technical Management Planning

Technical management planning incorporates top-level management planning for the integration of all system design activities. Its purpose is to develop the organizational mechanisms for direction and control, and to identify personnel for the attainment of cost, performance, and schedule objectives. Planning defines and describes the type and degree of systems engineering management, the SEP, and the integration of related engineering programs. The design evolution process forms the basis for comprehensive T&E planning.

The TEMP must be consistent with technical management planning. The testing program outlined in the TEMP must provide the TPM data required for all design decision points, audits, and reviews that are a part of the systems engineering process.

Figure 2-4. Systems Engineering Process and Test and Evaluation


<strong>and</strong> reviews that are a part of the system’sengineering process. The configuration managementprocess controls the baseline for the testprograms <strong>and</strong> incorporates design modificationsto the baseline determined to be necessary byT&E.The TEMP <strong>and</strong> technical management planningmust be traceable to each other. The systemdescription in the TEMP must be traceable tosystems engineering documentation such as theFFBDs, the RASs, <strong>and</strong> the <strong>Test</strong> RequirementsSheets (TRSs). Key functions <strong>and</strong> interfaces ofthe system with other systems must be described<strong>and</strong> correlated with the systems engineering documentation<strong>and</strong> the system specification. Technicalthresholds <strong>and</strong> objectives include specific performancerequirements that be<strong>com</strong>e test planninglimits. They must be traceable through theplanned systems engineering documentation <strong>and</strong>can be correlated to the content of the TPM Program.For example, failure criteria for reliabilitythresholds during OT&E testing must be delineated<strong>and</strong> agreed upon by the Program Manager(PM) <strong>and</strong> the Operational <strong>Test</strong> Director (OTD),<strong>and</strong> reflected in the TEMP.2.3.3 Technical Performance MeasurementTPM identifies critical technical parameters thatare at a higher level of risk during design. It tracksT&E data, makes predictions about whether theparameter can achieve final technical successwithin the allocated resources, <strong>and</strong> assists inmanaging the technical program.The TPM Program is an integral part of the T&Eprogram. The TPM is defined as product designassessment <strong>and</strong> forms the backbone of the developmenttesting program. It estimates, throughengineering analyses <strong>and</strong> tests, the values ofessential performance parameters of the currentprogram design. It serves as a major input in thecontinuous overall evaluation of operational effectiveness<strong>and</strong> suitability. Design reviews areconducted to measure the systems engineeringprogress. For more information, see Chapter 8.Figure 2-5 depicts the technical reviews thatusually take place during the SEP <strong>and</strong> the relatedspecification documents.2.3.4 System Baselining <strong>and</strong> T&EThe SEP establishes phase baselines throughoutthe acquisition cycle. These baselines (functional,allocated, product) can be modified with the resultsof engineering <strong>and</strong> testing. The testing usedto prove the technical baselines is rarely the sameas the operational testing of requirements.Related to the baseline is the process of configurationmanagement. Configuration managementbenefits the T&E <strong>com</strong>munity in two ways.Through configuration management, the baselineto be used for testing is determined. 
2.3.4 System Baselining and T&E

The SEP establishes phase baselines throughout the acquisition cycle. These baselines (functional, allocated, product) can be modified with the results of engineering and testing. The testing used to prove the technical baselines is rarely the same as the operational testing of requirements.

Related to the baselines is the process of configuration management, which benefits the T&E community in two ways. Through configuration management, the baseline to be used for testing is determined. Also, changes that occur to the baseline as a result of testing and design reviews are incorporated into the test article before the new phase of testing (to prevent retest of a bad design).

2.4 DEFINITIONS

T&E is the deliberate and rational generation of performance data that describes the nature of the emerging system, and the transformation of those data into information useful to the technical and managerial personnel controlling its development. In the broad sense, T&E may be defined as all physical testing, modeling, simulation, experimentation, and related analyses performed during research, development, introduction, and employment of a weapon system or subsystem. The Glossary of Defense Acquisition Acronyms and Terms, produced by the Defense Acquisition University, defines "Test" and "Test and Evaluation" as follows:4

A "test" is any program or procedure that is designed to obtain, verify, or provide data for the evaluation of any of the following: 1) progress in accomplishing developmental objectives; 2) the performance, operational capability, and suitability of systems, subsystems, components, and equipment items; and 3) the vulnerability and lethality of systems, subsystems, components, and equipment items.


Figure 2-5. Design Reviews


"Test and Evaluation" is the process by which a system or components are exercised and results analyzed to provide performance-related information. The information has many uses, including risk identification and risk mitigation, and empirical data to validate models and simulations. T&E enables an assessment of the attainment of technical performance, specifications, and system maturity to determine whether systems are operationally effective, suitable, and survivable for their intended use, and/or lethal.

2.5 THE DOD T&E PROCESS

The DoD Test and Evaluation Process (Figure 2-6) is an iterative five-step process that provides answers to critical T&E questions for decision makers at various times during a system acquisition. The T&E process begins during the formative stages of the program with the T&E Coordination Function, in which the information needs of the various decision makers are formulated in conjunction with the development of the program requirements, acquisition strategy, and AoAs.

Given certain foundation documentation, Step 1 is the identification of T&E information required by the decision maker. The required information usually centers on the current system under test, which may be in the form of concepts, prototypes, EDMs, or production-representative/production systems, depending on the acquisition phase. The required information consists of performance evaluations of effectiveness and suitability, providing insights into how well the system meets the user's needs at a point in time.

Figure 2-6. DoD Test and Evaluation Process
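Before walking through Steps 2 through 5 below, it may help to picture the loop as a comparison of expected outcomes (from pre-test analysis) against measured outcomes (from test). The Python sketch below is illustrative only; the measures, predicted values, and tolerance are invented and are not drawn from this guide or from DoD policy.

    # Illustrative sketch of the pre-test prediction vs. post-test comparison at the
    # heart of the five-step loop (Steps 2 and 4, described below). All values are invented.

    def pre_test_analysis():
        """Step 2 (sketch): record the expected outcome for each measure of interest."""
        return {"probability_of_hit": 0.80, "mean_time_between_failures_hr": 100.0}

    def post_test_synthesis(expected, measured, tolerance=0.10):
        """Step 4 (sketch): compare measured outcomes with expectations and flag
        deviations that would prompt re-examination of test conditions and design."""
        findings = {}
        for measure, predicted in expected.items():
            deviation = (measured[measure] - predicted) / predicted
            if abs(deviation) <= tolerance:
                findings[measure] = "within expectations"
            else:
                findings[measure] = f"deviation of {deviation:+.1%}; re-examine conditions and design"
        return findings

    expected = pre_test_analysis()
    measured = {"probability_of_hit": 0.70, "mean_time_between_failures_hr": 104.0}  # Step 3 data (invented)
    for measure, verdict in post_test_synthesis(expected, measured).items():
        print(measure, "->", verdict)

In the actual process the comparison is tempered by technical and operational judgment and supported by modeling and simulation, as Step 4 below explains; the sketch captures only the mechanical part of the comparison.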


Step 2 is the pre-test analysis of the evaluation objectives from Step 1 to determine the types and quantities of data needed, the results expected or anticipated from the tests, and the analytical tools needed to conduct the tests and evaluations. The use of validated models and simulations during pre-test analysis can aid in determining how to design test scenarios, how to set up the test environment, how to properly instrument the test, how to staff and control test resources, how best to sequence the test trials, and how to estimate outcomes.

Step 3, test activity and data management, is the actual test activity: tests are planned and conducted, and data are managed against the data requirements identified in Step 2. T&E managers determine what valid data exist in historical files that can be applied and what new data must be developed through testing. The necessary tests are planned and executed to accumulate sufficient data to support analysis. Data are screened for completeness, accuracy, and validity before being used in Step 4.

Step 4, post-test synthesis and evaluation, is the comparison of the measured outcomes (test data) from Step 3 with the expected outcomes from Step 2, tempered with technical and operational judgment. This is where data are synthesized into information. When the measured outcomes differ from the expected outcomes, the test conditions and procedures must be reexamined to determine whether the performance deviations are real or were the result of test conditions, such as lack of fidelity in computer simulation, insufficient or incorrect test support assets, instrumentation error, or faulty test processes. The assumptions of tactics, operational environment, system performance parameters, and logistics support must have been carefully chosen, fully described, and documented prior to test. Modeling and Simulation (M&S) may be used during the data analysis to extend the evaluation of performance effectiveness and suitability.

Step 5 is when the decision maker weighs the T&E information against other programmatic information to decide a proper course of action. This process may identify additional requirements for test data and iterate the DoD T&E process again.

2.6 SUMMARY

T&E is an engineering tool used to identify technical risk throughout the defense system acquisition cycle and a process for verifying performance. The iterative acquisition cycle consists of phases separated by discrete milestones. The DoD T&E process consists of developmental and operational testing that is used to support engineering design and programmatic reviews. This T&E process forms an important part of the SEP used by system developers and aids the decision process used by senior decision authorities in DoD.


ENDNOTES

1. DoDI 5000.2, Operation of the Defense Acquisition System, May 12, 2003.
2. Ibid.
3. Ibid.
4. Defense Acquisition University Press, Glossary of Defense Acquisition Acronyms and Terms, Vol. 11, September 2003.




3
T&E POLICY STRUCTURE AND OVERSIGHT MECHANISM

3.1 INTRODUCTION

This chapter provides an overview of the policy and organizations that govern the conduct of Test and Evaluation (T&E) activities within the Department of Defense (DoD) and discusses congressional legislation and activities for compliance by DoD. It outlines the responsibilities of DoD test organizations at the Office of the Secretary of Defense (OSD) and Service levels, and describes related T&E policy.

3.2 THE CONGRESS

Congress has shown a longstanding interest in influencing the DoD acquisition process. During the early 1970s, in response to urging by the Congress and recommendations by a Presidential Blue Ribbon Panel on Defense Management, Deputy Secretary of Defense David Packard promulgated a package of policy initiatives that established the Defense Systems Acquisition Review Council (DSARC). The DSARC was organized to resolve acquisition issues, whenever possible, and to provide recommendations to the Secretary of Defense (SECDEF) on the acquisition of major weapon systems. Also as a result of the congressional directives, the Army and Air Force established independent Operational Test Agencies (OTAs); the Navy Operational Test and Evaluation Force (OPTEVFOR) had been established in the late 1960s. In 1983, similar concerns led Congress to direct the establishment of the independent Office of the Director, Operational Test and Evaluation (DOT&E) within OSD.

In 1985, a report released by another President's Blue Ribbon Commission on Defense Management, this time chaired by David Packard, made significant recommendations on the management and oversight of DoD's acquisition process, specifically T&E. Not all of the Commission's recommendations have been implemented, and the full impact of these recommendations is not yet realized. The Fiscal Year (FY) 1987 Defense Authorization Act required Live Fire Testing (LFT) of weapon systems before the production phase begins. The earmarking of authorizations and appropriations for DoD funding, specific testing requirements, and acquisition reform legislation continue to indicate the will of Congress for DoD implementation.

Congress requires DoD to provide the following reports that include information on T&E:

• Selected Acquisition Report (SAR). Within its cost, schedule, and performance data, the SAR describes required Acquisition Category (ACAT) I system characteristics and outlines significant progress and problems encountered. It lists tests completed and issues identified during testing. The Program Manager (PM) uses the Consolidated Acquisition Reporting System (CARS) software to prepare the SAR.

• Annual System Operational Test Report. This report is provided by the DOT&E to the SECDEF and the committees on Armed Services, National Security, and Appropriations.
The report provides a narrative and resource summary of all Operational Test and Evaluation (OT&E) and related issues, activities, and assessments. When oversight of LFT was moved to DOT&E, this issue was added to the report.


• Beyond Low Rate Initial Production (BLRIP) Report. Before proceeding to BLRIP for each Major Defense Acquisition Program (MDAP), DOT&E must report to the SECDEF and the Congress. This report addresses the adequacy of OT&E and whether the T&E results confirm that the tested item or component is effective and suitable for combat. When oversight of LFT was moved to the DOT&E, the Live Fire Test Report was added to the BLRIP Report content.

• Foreign Comparative Test (FCT) Report. The Under Secretary of Defense for Acquisition, Technology, and Logistics (USD(AT&L)) should notify Congress a minimum of 30 days prior to the commitment of funds for initiation of new FCT evaluations of equipment produced by selected allied and friendly foreign countries.

3.3 OSD OVERSIGHT STRUCTURE

The DoD organization for the oversight of T&E is illustrated in Figure 3-1. Within USD(AT&L), Development Test and Evaluation (DT&E) oversight is performed by the Deputy Director, Developmental Test and Evaluation, Defense Systems (DT&E/DS), and the DOT&E provides OT&E oversight for the SECDEF. The management of MDAPs in OSD is performed by the Defense Acquisition Executive (DAE), who uses the Defense Acquisition Board (DAB) and Overarching Integrated Product Teams (OIPTs) to process information for decisions.

3.3.1 Defense Acquisition Executive (DAE)

The DAE position, established in September 1986, is held by the USD(AT&L). As the DAE, the USD(AT&L) uses the DAB and its OIPTs to provide the senior-level decision process for the acquisition of weapon systems. The DAE's responsibilities include establishing policies for acquisition (including procurement, Research and Development (R&D), logistics, development testing, and contracts administration) for all elements of DoD. The DAE's charter includes authority over the Services and defense agencies on policy, procedure, and execution of the weapon systems acquisition process.

3.3.2 Defense Acquisition Board (DAB)

The DAB is the primary forum used by OSD to provide advice, assistance, and recommendations, and to resolve issues regarding all operating and policy aspects of the DoD acquisition system. The DAB is the senior management acquisition board, chaired by the DAE and co-chaired by the Vice Chairman of the Joint Chiefs of Staff (VCJCS). The DAB is composed of the department's senior acquisition officials, including the DOT&E. The DAB conducts business through OIPTs and provides decisions on ACAT ID programs.1

3.3.3 Deputy Director, Developmental Test and Evaluation, Defense Systems (DT&E/DS)

The DT&E/DS serves as the principal staff assistant and advisor to the USD(AT&L) for T&E matters.
The Director, Defense Systems works for the Deputy Under Secretary of Defense for Acquisition and Technology (DUSD(A&T)) and has authority and responsibility for all DT&E conducted on designated MDAPs and for FCT. The position of the DT&E/DS is illustrated in Figure 3-1.

3.3.3.1 Duties of the DT&E/DS

Within the acquisition community, the DT&E/DS:

• Serves as the focal point for coordination of all MDAP Test and Evaluation Master Plans (TEMPs), and recommends approval by the OIPT lead of the DT&E portion of TEMPs;

• Reviews MDAP documentation for DT&E implications and resource requirements to provide comments to the OIPT, the USD(AT&L)/DAE, or the DAB;


Figure 3-1. DoD Test and Evaluation Organization


• Monitors DT&E to ensure the adequacy of testing and to assess test results;

• Provides assessments of the Defense Acquisition Executive Summary (DAES) and technical assessments of DT&E and Systems Engineering Processes (SEPs) conducted on a weapon system;

• Provides advice and makes recommendations to OSD senior leadership, and issues guidance to the Component Acquisition Executives (CAEs) with respect to DT&E;

• Leads the defense systems efforts for joint testing and is a member of the Joint T&E Program's Technical Advisory Board for Joint Development Test and Evaluation programs.

3.3.3.2 DT&E/DS and Service Reports

During the testing of ACAT I and designated weapon systems, the interaction between the DT&E/DS and the Services includes the following reporting requirements:

• A TEMP (either preliminary or updated, as appropriate) must be provided for consideration and approval before each milestone review, starting with Milestone B.

• A technical assessment of developmental T&E is provided to the DS and DOT&E, listing the T&E results, conclusions, and recommendations prior to a milestone decision or the final decision to proceed to BLRIP.

3.3.4 Director, Operational Test and Evaluation (DOT&E)

As illustrated in Figure 3-1, the director reports directly to the SECDEF and has special reporting requirements to Congress. The DOT&E's responsibility to Congress is to provide an unbiased window of insight into the operational effectiveness, suitability, and survivability of new weapon systems.

3.3.4.1 Duties and Functions of the DOT&E

The specific duties of the DOT&E, outlined in DoD Directive (DoDD) 5141.2, Director of Operational Test and Evaluation, include:2

• Obtaining reports, information, advice, and assistance as necessary to carry out assigned functions (DOT&E has access to all records and data in DoD on acquisition programs);

• Signing the ACAT I and oversight TEMPs for approval of OT&E and approving the OT&E funding for major systems acquisition;

• Approving operational test plans on all major defense acquisition systems and designated oversight programs prior to the start of initial operational testing (approval in writing is required before operational testing may begin); oversight extends into Follow-on Operational Test and Evaluation (FOT&E);

• Providing observers during preparation and conduct of OT&E;

• Analyzing results of OT&E conducted for each major or designated defense acquisition program and submitting a report to the SECDEF and the Congress on the adequacy of the OT&E performed;

• Presenting the BLRIP report to the SECDEF and to the congressional Committees on Armed Services and Appropriations on the adequacy of Live Fire Test and Evaluation (LFT&E) and OT&E, and analyzing whether results confirm the system's operational effectiveness and suitability;

• Providing oversight and approval of major program LFT.


3.3.4.2 DOT&E and Service Interactions

For DoD and DOT&E-designated oversight acquisition programs, the Service provides the DOT&E the following:

• A draft copy of the Operational Test Plan concept for review;

• Significant test plan changes;

• The final Service IOT&E report, which must be submitted to DOT&E before the Full Rate Production Decision Review (FRPDR);

• The LFT&E plan for approval, and the Service Live Fire Test Report for review.

3.4 SERVICE T&E MANAGEMENT STRUCTURES

3.4.1 Army T&E Organizational Relationship

The Army management structure for T&E is illustrated in Figure 3-2.

3.4.1.1 Army Acquisition Executive

The Assistant Secretary of the Army for Acquisition, Logistics, and Technology (ASA(ALT)) has the principal responsibility for all Department of the Army (DA) matters and policy related to acquisition, logistics, technology, procurement, the industrial base, and security cooperation. Additionally, the ASA(ALT) serves as the Army Acquisition Executive (AAE).

Figure 3-2. Army Test and Evaluation Organization


The AAE administers acquisition programs by developing/promulgating acquisition policies and procedures as well as by appointing, supervising, and evaluating assigned Program Executive Officers (PEOs) and direct-reporting PMs.

3.4.1.2 Army T&E Executive

The Deputy Under Secretary of the Army for Operations Research (DUSA(OR)) establishes, reviews, enforces, and supervises Army T&E policy and procedures, including overseeing all Army T&E associated with the system research, development, and acquisition of all materiel and Command, Control, Communications, Computers and Intelligence (C4I)/Information Technology (IT) systems. The Test and Evaluation Management Agency (TEMA), within the Office of the Chief of Staff, Army, provides the DUSA(OR) with oversight support. As delegated by the AAE, the DUSA(OR) is the sole Headquarters, Department of the Army (HQDA) approval authority for TEMPs.

3.4.1.3 Army Test and Evaluation Command

The U.S. Army Test and Evaluation Command (ATEC) supports the systems acquisition, force development, and experimentation processes through overall management of the Army's T&E programs. In this role, ATEC manages the Army's developmental and operational testing, all system evaluation, and management of joint T&E. ATEC is the Army's independent Operational Test Activity (OTA), reporting directly to the Vice Chief of Staff, Army (VCSA) through the Director of the Army Staff (DAS).

3.4.1.3.1 Army Developmental Testers

The U.S. Army Developmental Test Command (DTC) within ATEC has the primary responsibility for conducting Developmental Tests (DTs) for the Army.
DTC responsibilities include:

• Perform the duties of governmental developmental tester for all Army systems except for those systems assigned to the Communications-Electronics Research, Development, and Engineering Center (CERDEC) of the Army Materiel Command (AMC) Research, Development and Engineering Command (RDECOM) by HQDA (Deputy Chief of Staff, G-6), the Medical Command (MEDCOM), the Intelligence and Security Command (INSCOM), the Space and Missile Defense Command (SMDC), and the Army Corps of Engineers (ACE);

• Provide test facilities and testing expertise in support of the acquisition of Army and other defense systems;

• Operate and maintain the Army's portion of the Major Range and Test Facility Base (MRTFB) (except for the United States Army Kwajalein Atoll (USAKA) and the High Energy Laser System Test Facility (HELSTF)) in accordance with DoDD 3200.11, Major Range and Test Facility Base (MRTFB), May 1, 2002;

• Provide testers with a safety release for systems before the start of pretest training for tests that use soldiers as test participants;

• Provide safety confirmations for milestone acquisition decision reviews and the materiel release decision;

• Manage the Army Test Incident Reporting System (ATIRS).

3.4.1.3.2 Army Operational Testers

The U.S. Army Operational Test Command (OTC) within ATEC has the primary responsibility for conducting Operational Tests (OTs) for the Army and supporting Army participation in joint T&E. OTC responsibilities include:

• Perform the duties of operational tester for all Army systems except for those systems assigned to MEDCOM, INSCOM, SMDC, and ACE;


• Perform the duties of operational tester for assigned multi-Service OTs and (on a customer-service basis) for Training and Doctrine Command (TRADOC) Force Development Tests and Experimentation (FDTE);

• Provide test facilities and testing expertise in support of the acquisition of Army and other defense systems, and for other customers on a cost-reimbursable and as-available basis;

• Program and budget the funds to support OT, except out-of-cycle tests (which are usually paid for by the proponent);

• Develop and submit OT and FDTE Outline Test Plans (OTPs) to the Army's Test Schedule and Review Committee (TSARC).

3.4.1.3.3 Army Evaluation Center

The U.S. Army Evaluation Center (AEC) is an independent subordinate element within ATEC that has the primary responsibility for conducting Army system evaluations and system assessments in support of the systems acquisition process. Decision makers use AEC's independent report addressing an Army system's operational effectiveness, suitability, and survivability. AEC responsibilities include:

• Perform the duties of system evaluator for all Army systems except for those systems assigned to MEDCOM, INSCOM, and the commercial items assigned to ACE; conduct Continuous Evaluation (CE) on all assigned systems;

• Develop and promulgate evaluation capabilities and methodologies;

• Coordinate system evaluation resources through the TSARC;

• Preview programmed system evaluation requirements for possible use of Modeling and Simulation (M&S) to enhance evaluation and reduce costs; perform manpower and personnel integration (MANPRINT) assessments in coordination with the Deputy Chief of Staff (DCS), G-1 Army Research Laboratory-Human Research and Engineering Division/Directorate (ARL-HRED);

• Perform Integrated Logistics Support (ILS) program surveillance for Army systems; perform independent logistics supportability assessments and report them to the Army Logistician and other interested members of the acquisition community.

3.4.2 Navy T&E Organizational Relationship

The organizational structure for T&E in the Navy is illustrated in Figure 3-3. Within the Navy Secretariat, the Secretary of the Navy (SECNAV) has assigned general and specific Research, Development, Test, and Evaluation (RDT&E) responsibilities to the Assistant Secretary of the Navy (Research, Development, and Acquisition) (ASN(RDA)) and to the Chief of Naval Operations (CNO). The CNO has responsibility for ensuring the adequacy of the Navy's overall T&E program. T&E policy and guidance are exercised through the Director of Navy T&E and Technology Requirements (N-91).
Staffsupport is provided by the <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong>Division (N-912), which has cognizance over planning,conducting, <strong>and</strong> reporting all T&E associatedwith development of systems.3.4.2.1 Navy DT&E OrganizationsThe Navy’s senior systems development authorityis divided among the <strong>com</strong>m<strong>and</strong>ers of the system<strong>com</strong>m<strong>and</strong>s with Naval Air Systems Comm<strong>and</strong>(NAVAIR) developing <strong>and</strong> performing DT&E onaircraft <strong>and</strong> their essential weapon systems; NavalSea Systems Comm<strong>and</strong> (NAVSEA) developing<strong>and</strong> performing DT&E on ships, submarines, <strong>and</strong>their associated weapon systems; <strong>and</strong> Space <strong>and</strong>Naval Warfare Systems Comm<strong>and</strong> (SPAWAR)developing <strong>and</strong> performing DT&E on all othersystems. System acquisition is controlled by a3-7


ASST SEC NAVY(RD&A)PEOPMDIRECTORATE OF NAVYT&E AND TECHNOLOGYREQUIREMENTS (N-91)SECRETARY OF THE NAVYCHIEF OFNAVAL OPERATIONSCOMMANDANT OF THEMARINE CORPSNAVALSYSTEMSCOMMANDOPERATIONALTEST ANDEVALUATIONFORCEMARINE CORPSSYSTEMSCOMMANDSMARINECORPSOPERATIONALTEST ANDEVALUATIONACTIVITYNAVAIRSPAWARNAVSEANAVAL AIR WARFARE CENTERNAVAL COMMAND, CONTROL AND OCEAN SURVEILLANCE CENTERNAVAL SURFACE WARFARE CENTERNAVAL UNDERSEA WARFARE CENTERFigure 3-3. Navy <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong> Organizationchartered PM or by the <strong>com</strong>m<strong>and</strong>er of a systems<strong>com</strong>m<strong>and</strong>. In both cases, the designated developingagency is responsible for DT&E <strong>and</strong> for thecoordination of all T&E planning in the TEMP.Developing agencies are responsible for:• Developing test issues based on the thresholdsestablished by the user in the requirementsdocuments;• Identifying the testing facilities <strong>and</strong> resourcesrequired to conduct the DT&E;• Developing the DT&E test reports <strong>and</strong> quicklookreports.3.4.2.2 Navy Operational <strong>Test</strong> <strong>and</strong><strong>Evaluation</strong> ForceThe Comm<strong>and</strong>er, Operational <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong>Force (COMOPTEVFOR) <strong>com</strong>m<strong>and</strong>s theNavy’s independent OT&E activity <strong>and</strong> reportsdirectly to the CNO. The functions of theCOMOPTEVFOR include:• Establishing early liaison with the developingagency to ensure an underst<strong>and</strong>ing of the requirements<strong>and</strong> plans;• Reviewing acquisition program documentationto ensure that documents are adequate tosupport a meaningful T&E program;3-8• Planning <strong>and</strong> conducting realistic OT&E;


• Developing tactics and procedures for the employment of systems that undergo OT&E (as directed by the CNO);

• Providing recommendations to the CNO for the development of new capabilities or the upgrade of ranges;

• Reporting directly to the CNO, the President of the Board of Inspection and Survey (PRESINSURV) is responsible for conducting acceptance trials of new ships and aircraft acquisitions and is the primary Navy authority for Production Acceptance Test and Evaluation (PAT&E) of these systems;

• Conducting OT&E on aviation systems in conjunction with the Marine Corps Operational Test and Evaluation Activity (MCOTEA).

3.4.3 Air Force Organizational Relationships

3.4.3.1 Air Force Acquisition Executive

The Assistant Secretary of the Air Force for Acquisition (SAF/AQ) is the senior-level authority for Research, Development, and Acquisition (RD&A) within the Air Force. The Director of Space Acquisition (SAF/USA) obtains space assets for all Services. As illustrated in Figure 3-4, the SAF/AQ is an advisor to the Secretary of the Air Force and interfaces directly with the DT&E and DOT&E. The SAF/AQ receives DT&E and OT&E results as a part of the acquisition decision process. Within the SAF/AQ structure, there is a military deputy (acquisition) who is the Air Force primary staff officer with responsibility for RD&A. This staff officer is the chief advocate of Air Force acquisition programs and develops the RDT&E budget. Air Force policy and oversight for T&E is provided by a staff element under the Chief of Staff, Test and Evaluation (AF/TE), which processes test documentation for DT&E and OT&E and manages the review of the TEMP.

Figure 3-4. Air Force Test and Evaluation Organization

3.4.3.2 Air Force DT&E Organization

The Air Force Materiel Command (AFMC) is the primary DT&E and acquisition manager. The AFMC performs all levels of research; develops weapon systems, support systems, and equipment; and conducts all DT&E. The acquisition PMs are under the Commander, AFMC. Within the AFMC, there are major product divisions, test centers, and laboratories as well as missile, aircraft, and munitions test ranges.

Once the weapon system is fielded, AFMC retains management responsibility for developing and testing system improvements, enhancements, or upgrades. The Air Force Space Command (AFSC) is responsible for DT&E of space and missile systems.

3.4.3.3 Air Force OT&E Organization

The AF/TE is responsible for supporting and coordinating the OT&E activities of the Air Force Operational Test and Evaluation Center (AFOTEC).

The Commander, AFOTEC, is responsible to the Secretary of the Air Force and the Chief of Staff for the independent T&E of all major and nonmajor systems acquisitions. The Commander is supported by the operational commands and others in planning and conducting OT&E.

The AFOTEC reviews operational requirements, employment concepts, tactics, maintenance concepts, and training requirements before conducting OT&E. The operational commands provide operational concepts, personnel, and resources to assist AFOTEC in performing OT&E.

3.4.4 Marine Corps Organizational Relationship

3.4.4.1 Marine Corps Acquisition Executive

The Deputy Chief of Staff for Research and Development (DCS/R&D), Headquarters Marine Corps, directs the total Marine Corps RDT&E effort to support the acquisition of new systems. The DCS/R&D’s position within the General Staff is analogous to that of the Director of Navy T&E and Technology Requirements (N-91) in the Navy structure. The DCS/R&D also reports directly to the Assistant Secretary of the Navy/Research, Engineering, and Science (ASN/RE&S) in the Navy Secretariat. Figure 3-3 illustrates the Marine Corps organization for T&E management.

3.4.4.2 Marine Corps DT&E Organizations

The Commanding General, Marine Corps Systems Command (CG MCSC), is the Marine Corps materiel developing agent and directly interfaces with the Navy Systems Commands. The CG MCSC implements policies, procedures, and requirements for DT&E of all systems acquired by the Marine Corps. The Marine Corps also uses DT&E and OT&E performed by other Services, which may develop systems of interest to the Corps.

3.4.4.3 Marine Corps Operational Test and Evaluation Activity

The Marine Corps Operational Test and Evaluation Activity (MCOTEA) is the independent OT&E activity maintained by the Marine Corps. Its function is analogous to that performed by OPTEVFOR in the Navy. The CG MCSC provides direct assistance to MCOTEA in the planning, conducting, and reporting of OT&E. Marine Corps aviation systems are tested by OPTEVFOR. The Fleet Marine Force performs troop T&E of materiel development in an operational environment.


Figure 3-5. Test and Evaluation Executive Agent Structure

3.5 THE T&E EXECUTIVE AGENT STRUCTURE

In 1993, the USD(AT&L) approved a T&E Executive Agent structure to provide the Services with more corporate responsibility for the management and policies that influence the availability of test resources for the evaluation of DoD systems in acquisition (Figure 3-5). The DT&E/DS has functional responsibility for the execution of the processes necessary to assure the T&E Executive Agent structure functions effectively. The DT&E/DS also participates in the Operational Test and Evaluation Coordinating Committee, chaired by the DOT&E. This committee manages the OT&E Resources Enhancement Project, and the DT&E/DS draws input to the T&E Executive Agent structure for coordination of all T&E resource requirements.

The Board of Directors (BoD) (Service Vice Chiefs) is assisted by an Executive Secretariat consisting of the Army DUSA(OR), the Navy N-91, and the Air Force AF/TE. The BoD provides guidance and decisions on policy and resource allocation to its subordinate element, the Board of Operating Directors (BoOD) (Army DTC, NAVAIR 5.0, and AFMC Director of Operations). The BoD also provides program review and advocacy support of the T&E infrastructure to OSD and Congress.

The BoOD is supported by a Secretariat and the Defense Test and Training Steering Group (DTTSG). The DTTSG manages the T&E Resources Committee (TERC), the Training Instrumentation Resource Investment Committee (TIRIC), and the CROSSBOW Committee. The DTTSG is instrumental in achieving efficient acquisition and integration of all training and associated test range instrumentation and the development of acquisition policy for embedded weapon system training and testing capabilities. The TERC supports the DTTSG in overseeing infrastructure requirements development from a T&E community perspective, both development testing and operational testing, and manages OSD funding in executing the Central T&E Investment Program (CTEIP). The TIRIC is chartered to ensure the efficient acquisition of common and interoperable range instrumentation systems. The CROSSBOW Committee provides technical and management oversight of the Services’ development and acquisition programs for threat and threat-related hardware simulators, emitters, software simulations, hybrid representations, and surrogates.

3.6 SUMMARY

An increased emphasis on T&E has placed greater demands on OSD and the DoD components to carefully structure organizations and resources to ensure maximum effectiveness. Renewed interest by Congress in testing as a way of assessing system utility and effectiveness, the report by the President’s Blue Ribbon Panel on Acquisition Management, and acquisition reform initiatives have resulted in major reorganizations within the Services. These policy changes and reorganizations will be ongoing for several years to improve the management of T&E resources in support of acquisition programs.


ENDNOTES

1. DoD Instruction (DoDI) 5000.2, Operation of the Defense Acquisition System, May 12, 2003.

2. DoDD 5141.2, Director of Operational Test and Evaluation (DOT&E), May 25, 2000.




4
PROGRAM OFFICE RESPONSIBILITIES FOR TEST AND EVALUATION

4.1 INTRODUCTION

In government acquisition programs, there should be an element dedicated to the management of Test and Evaluation (T&E). This element has the overall test program responsibility for all phases of the acquisition process. T&E expertise may be available through matrix support or reside in the Program Management Office (PMO) engineering department during the program’s early phases. By System Demonstration, the PMO should have a dedicated T&E manager. In the PMO, the Deputy for T&E would be responsible for defining the scope and concept of the test program, establishing the overall program test objectives, and managing test program funds and coordination. The Deputy for T&E should provide test directors (such as a joint test director) as required, and coordinate the test resources, facilities, and support required for each phase of testing. In addition, the Deputy for T&E, or a staff member, will be responsible for managing the Test and Evaluation Master Plan (TEMP) and for planning and managing any special test programs required for the program. The Deputy for T&E will also review, evaluate, approve, and release for distribution contractor-prepared test plans and reports, and review and coordinate all appropriate government test plans. After the system is produced, the Deputy for T&E will be responsible for supporting production acceptance testing and the test portions of later increments, Preplanned Product Improvement (P3I) upgrades, or enhancements to the weapon system/acquisition. If the program is large enough, the Deputy for T&E will be responsible for all T&E direction and guidance for that program.

4.2 RELATIONSHIP TO THE PROGRAM MANAGER

The Program Manager (PM) is ultimately responsible for all aspects of the system development, including testing. The Deputy for T&E is normally authorized by the PM to conduct all duties in the area of T&E. The input of the Deputy for T&E to the contract, engineering specifications, budget, program schedule, etc., is essential for the PM to manage the program efficiently.

4.3 EARLY PROGRAM STAGES

In the early stages of the program, the T&E function is often handled by matrix support from the materiel command. Matrix T&E support or the Deputy for T&E should be responsible for development of the T&E sections of the Request for Proposal (RFP). Although the ultimate responsibility for the RFP rests with the PM and the Primary Contracting Officer (PCO), the Deputy for T&E is responsible for creating several sections: the test schedule, test program funding (projections), test data requirements for the program (test reports, plans, procedures, quick-look reports, etc.), the test section of the Statement of Work (SOW), portions of the Acquisition Plan, Information for Proposal Preparation (IFPP), and, if a joint acquisition program, feedback on the Capability Development Document (CDD) integrating multiple Services’ requirements.

4.3.1 Memorandums

Early in the program, another task of the Deputy for T&E is the arrangement of any Memorandums of Agreement or Understanding (MOAs/MOUs) between Services, North Atlantic Treaty Organization (NATO) countries, test organizations, etc., which outline the responsibilities of each organization. The RFP and SOW outline contractor/government obligations and arrangements on the access and use of test facilities (contractor or government owned).

4.3.2 Test Data Management

The Deputy for T&E may have approval authority for all contractor-created test plans, procedures, and reports. The Deputy for T&E must have access to all contractor testing and test results, and is responsible for disseminating the results to government agencies that need this data. Additionally, the Deputy for T&E creates report formats and time lines for contractor submittal, government approval, etc.

The data requirements for the entire test program are outlined in the Contract Data Requirements List (CDRL). The Deputy for T&E should review the Acquisition Management Systems and Data Requirements Control List 1 for relevant test Data Item Descriptions (DIDs). (Examples can be found in Appendix C.) The Deputy for T&E provides input to this section of the RFP early in the program. The Deputy for T&E ensures that the office and all associated test organizations requiring the information receive the test documentation on time. Usually, the contractor sends the data packages directly to the Deputy for T&E, who, in turn, maintains a distribution list trimmed to the minimum number of copies for agencies needing that information to perform their mission and oversight responsibilities. It is important for the Deputy for T&E to use an integrated test program and request contractor test plans and procedures well in advance of the actual test performance to ensure that the office of the Deputy for T&E has time to approve the procedures or implement modifications.

The Deputy for T&E must receive the test results and reports on time to enable the office of the Deputy for T&E, the PM, and higher authorities to make program decisions. The data received should be tailored to provide the minimum information needed. The Deputy for T&E must be aware that data requirements in excess of the minimum needed may lead to an unnecessary increase in overall program cost. For data that are needed quickly and informally (at least initially), the Deputy for T&E can request Quick-Look Reports that give test results immediately after test performance. The Deputy for T&E is also responsible for coordinating with the contractor on all report formats (the in-house contractor format is acceptable in most cases).

The contract must specify the data the contractor will supply the Operational Test Agency (OTA). Unlike Development Test and Evaluation (DT&E), the contractor will not be making the Operational Test and Evaluation (OT&E) plans, procedures, or reports. These documents are the responsibility of the OTA. The PMO Deputy for T&E should include the OTA on the distribution list for all test documents that are of concern during the DT&E phase of testing so the OTA will be informed of test item progress and previous testing. In this way, the OTA will be informed when developing its own test plans and procedures for OT&E. In fact, OTA representatives should attend the CDRL Review Board and provide the PMO with a list of the types of documents the OTA will need. The Deputy for T&E should coordinate the test sections of this data list with the OTA and indicate concerns at that meeting. All contractor test reports should be made available to the OTA. In return, the Deputy for T&E must stay informed of all OTA activities, understand the test procedures, and plan for and receive the test reports. Unlike DT&E and contractor documentation, the PMO Deputy for T&E will not have report or document approval authority for OT&E items. The Deputy for T&E is responsible for keeping the PM informed of OT&E results.
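The data-management duties described above amount to a simple tracking problem: which CDRL items are due, when, and to whom each must be distributed. The following Python sketch is purely notional — the item numbers, titles, dates, and distribution lists are invented for illustration and are not drawn from this guide or any contract — but it shows one way a T&E office might flag late contractor submittals against a trimmed distribution list.

from __future__ import annotations

from dataclasses import dataclass, field
from datetime import date

@dataclass
class CdrlItem:
    """One contract data requirement (e.g., a test plan or quick-look report)."""
    number: str                                              # CDRL sequence number (illustrative)
    title: str
    due: date                                                # contractual due date
    distribution: list[str] = field(default_factory=list)   # minimum distribution list
    received: date | None = None                             # date the data package arrived, if it has

    def is_late(self, today: date) -> bool:
        return self.received is None and today > self.due

# Notional CDRL entries; a real program would pull these from the contract CDRL and the relevant DIDs.
cdrl = [
    CdrlItem("A001", "Contractor Test Plan", date(2005, 3, 1),
             ["Deputy for T&E", "DT&E agency", "OTA"]),
    CdrlItem("A002", "Quick-Look Report", date(2005, 6, 15),
             ["Deputy for T&E", "PM", "OTA"]),
]

today = date(2005, 6, 20)
for item in cdrl:
    status = "LATE" if item.is_late(today) else "on track"
    print(f"{item.number} {item.title}: due {item.due}, {status}; "
          f"distribute to: {', '.join(item.distribution)}")

In practice this information lives in whatever tracking system the program office uses; the point of the sketch is only the minimum data each item needs — a due date, a receipt date, and a distribution list held to the minimum number of copies.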


4.3.3 Test Schedule Formulation

An important task the Deputy for T&E has during the creation of the RFP is the test program schedule. Initially, the PM will need contractor predictions of the hardware (and software in some cases) availability dates for models, prototypes, mock-ups, full-scale models, etc., once the contract is awarded. The Deputy for T&E uses this information and the early test planning in the Technology Development Strategy (TDS) document to create a realistic front-end schedule of the in-house testing the contractor will conduct before government testing (Development Testing (DT) and Operational Testing (OT)). Then, a draft integrated schedule (Part II, TEMP) is developed upon which the government DT and OT schedules can be formulated and contractor support requirements determined. The Deputy for T&E can use past experience in testing similar weapon systems/acquisition items or contract test organizations that have the required experience to complete the entire test schedule. Since the test schedule is a critical contractual item, contractor input is very important. The test schedule will normally become an item for negotiation once the RFP is released and the contractor’s proposal is received. Attention must be given to ensure that the test schedule is not so success-oriented that retesting of failures will cause serious program delays for either the government test agencies or the contractor.

Another important early activity to be performed by the Deputy for T&E is to coordinate the OT&E test schedule. Since the contractor may be required to provide support, the OT&E test support may need to be contractually agreed upon before contract award. Sometimes, the Deputy for T&E can formulate a straw-man schedule (based on previous experience) and present this schedule to the operational test representative at the initial T&E Integrated Product Team (IPT) meeting for review, or the Deputy for T&E can contact the OTA and arrange a meeting to discuss the new program. In the meeting, the time requirements envisioned by the OTA can be discussed. Input from that meeting then goes into the RFP and to the PM. The test schedule must allow time for DT&E testing and OT&E testing when testing is not combined or test assets are limited. Before setup of Initial Operational Test and Evaluation (IOT&E), certification of readiness for IOT&E may require a time gap for review of DT&E test results and refurbishment or correction of deficiencies discovered during DT&E, etc. The DT&E schedule should not be allowed to overrun to the point that the IOT&E schedule is adversely impacted, leaving inadequate time for operational testing or rushing the reporting of IOT&E results. For example, if the DT&E schedule slips 6 months, the OT&E schedule and milestone decision should slip also. The IOT&E should not be shortened just to make a milestone decision date.
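The slip-propagation rule at the end of Section 4.3.3 can be pictured as a simple dependency calculation. The Python sketch below is illustrative only — the event names, dates, and the 30-day month approximation are assumptions made for the example, not guidance from this document — and shows a DT&E slip being pushed through the downstream readiness certification, IOT&E, and milestone dates rather than compressing IOT&E.

from __future__ import annotations

from datetime import date, timedelta

# Notional schedule: each event depends on the one before it (all dates are invented).
schedule = {
    "DT&E complete": date(2006, 1, 15),
    "Certification of Readiness for IOT&E": date(2006, 3, 1),
    "IOT&E complete": date(2006, 9, 1),
    "Milestone decision": date(2006, 11, 1),
}

def apply_slip(plan: dict[str, date], slipped_event: str, slip_months: int) -> dict[str, date]:
    """Slip the named event and every event after it by the same amount,
    preserving the planned intervals so IOT&E is not compressed."""
    slip = timedelta(days=30 * slip_months)   # rough month approximation
    start = list(plan).index(slipped_event)
    return {name: (d + slip if i >= start else d)
            for i, (name, d) in enumerate(plan.items())}

revised = apply_slip(schedule, "DT&E complete", slip_months=6)
for name, d in revised.items():
    print(f"{name}: {schedule[name]} -> {d}")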
4.3.4 Programmatic Environmental Analysis

PMO personnel should be sensitive to the potential environmental consequences of system materials, operations, and disposal requirements. Public Laws (Title 40, Code of Federal Regulations (CFR), Parts 1500-1508; National Environmental Policy Act (NEPA) Regulations; Executive Order 12114, Environmental Effects Abroad of Major Federal Actions; DoD 5000 Series; etc.) require analysis of hazardous materials and appropriate mitigation measures during each acquisition phase. As stated in DoD 5000.2-R, “Planning shall consider the potential testing impacts on the environment.”

Litigation resulting in personal fines and imprisonment against government employees has raised environmental awareness at test ranges and facilities. Environmental Impact Statements (EISs) (supported by long, thorough studies and public testimony) or Environmental Analysis and Assessments are generally required before any system testing can be initiated.


4.4 PMO/CONTRACTOR TEST MANAGEMENT

The PMO will, in most cases, have a contractor test section counterpart. With this counterpart, the Deputy for T&E works out the detailed test planning, creation of schedules, etc., for the entire test program. The PMO uses input from all sources (contracts, development test agencies, OTAs, higher headquarters, etc.) to formulate the test program’s length, scope, and necessary details. The Deputy for T&E ensures that the RFP reflects the test program envisioned and the contractor’s role in the acquisition process. The Deputy for T&E also ensures the RFP includes provisions for government attendance at contractors’ tests and that all contractor test results are provided to the government.

After the RFP has been issued and the contractor has responded, the proposal is reviewed by the PMO. The Deputy for T&E is responsible for performing a technical evaluation on the test portions of the proposal. In this technical evaluation, the Deputy for T&E compares the proposal to the SOW, test schedule, IFPP, etc., and reviews the contractor’s cost of each testing item. This is an iterative process of refining, clarifying, and modifying that will ensure the final contract between the PMO and the prime contractor (and subcontractors) contains all test-related tasks and is priced within scope of the proposed test program. Once agreement on the contractor’s technical approach is reached, the Deputy for T&E is responsible for giving inputs to the government contracting officer during contract negotiations. The contracting officer-requested contract deliverables are assigned Contract Line Item Numbers (CLINs), which are created by the Deputy for T&E. This will ensure the contractor delivers the required performances at specified intervals during the life of the contract. Usually, there will be separate contracts for development and production of the acquisition item. For each type of contract, the Deputy for T&E has the responsibility to provide the PCO and PM with the T&E input.

4.5 INTEGRATED PRODUCT TEAMS FOR TEST AND EVALUATION

Before the final version of the RFP is created, the Deputy for T&E should form an IPT, a test planning/integration working group. This group includes the OTA, development test agency, organizations that may be jointly acquiring the same system, the test supporting agencies, operational users, and any other organizations that will be involved in the test program by providing test support or by conducting, evaluating, or reporting on testing. The functions of the group are to: facilitate the use of testing expertise, instrumentation, facilities, simulations, and models; integrate test requirements; accelerate the TEMP coordination process; resolve test cost and scheduling problems; and provide a forum to ensure T&E of the system is coordinated. The existence of a test coordinating group does not alter the responsibilities of any command or headquarters; in the event of disagreement within a group, the issue is resolved through the normal command/staff channels. In later meetings, the contractor participates in this test planning group; however, the contractor may not be selected by the time the first meetings are held.

The purposes of these meetings are to review and assist in the development of early test documentation, including the TEMP, and to agree on basic test program schedules, scope, support, etc. The TEMP serves as the top-level test management document for the acquisition program, being updated as the changing program dictates.

4.6 TEST PROGRAM FUNDING/BUDGETING

The PMO must identify funds for testing very early so that test resources can be obtained. The Deputy for T&E uses the acquisition schedule, TEMP, and other program and test documentation to identify test resource requirements. The Deputy for T&E coordinates these requirements with the contractor and government organizations that have the test facilities to ensure their availability for testing.

The Deputy for T&E ensures that test costs include contractor and government test costs. The contractor’s test costs are normally outlined adequately in the proposal; however, the government test range, instrumentation, and test-support resource costs must be determined by other means. Usually, the Deputy for T&E contacts the test organization and outlines the test program requirements, and the test organization sends the program office an estimate of the test program costs. The Deputy for T&E then obtains cost estimates from all test sources that the Deputy for T&E anticipates using and supplies this information to the PM. The Deputy for T&E must also ensure that any program funding reductions are not absorbed entirely by the test program. Some cutbacks may be necessary and allowable, but the test program must supply the PM, other defense decision-making authorities, and Congress with enough information to make program milestone decisions.
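As a rough illustration of the budgeting arithmetic in Section 4.6, the notional Python sketch below (the organization names and dollar amounts are invented, not taken from this guide) rolls the contractor and government test cost estimates into a single figure for the PM and computes a proportional share of a program-wide funding cut, consistent with the caution that the test program should not absorb a reduction entirely.

# Notional cost estimates gathered by the Deputy for T&E (all figures are illustrative).
contractor_estimate = 4_200_000                    # from the contractor's proposal
government_estimates = {                           # from the test organizations contacted
    "Test range support": 1_500_000,
    "Instrumentation": 650_000,
    "Targets and threat surrogates": 900_000,
}

test_program_total = contractor_estimate + sum(government_estimates.values())
print(f"Total test program estimate for the PM: ${test_program_total:,}")

# If the overall program takes a funding cut, a proportional share keeps the
# test program from absorbing the entire reduction.
program_total = 60_000_000
program_cut = 3_000_000
test_share_of_cut = program_cut * (test_program_total / program_total)
print(f"Proportional test-program share of a ${program_cut:,} cut: ${test_share_of_cut:,.0f}")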


4.7 TECHNICAL REVIEWS, DESIGN REVIEWS, AND AUDITS

The role of the Deputy for T&E changes slightly during the contractor’s technical reviews, design reviews, physical and functional configuration audits, etc. Usually, the Deputy for T&E plans, directs, or monitors government testing; however, in the reviews and audits, the Deputy examines the contractor’s approach to the test problem and evaluates the validity of the process and the accuracy of the contractor’s results. The Deputy for T&E uses personal experience and background in T&E to assess whether the contractor did enough or too little testing; whether the tests were biased in any way; and whether they followed a logical progression using the minimum of time, effort, and funds. If the Deputy for T&E finds any discrepancies, the Deputy must inform the contractor, the PM, and the PCO to validate the conclusions before effecting corrections. Each type of review or audit will have a different focus/orientation, but the Deputy for T&E will always be concerned with the testing process and how it is carried out. After each review, the Deputy for T&E should always document all observations for future reference.

4.8 CONTRACTOR TESTING

The Deputy for T&E is responsible for ensuring that contractor-conducted tests are monitored by the government. The Deputy for T&E must also be given access to all contractor internal data, test results, and test reports related to the acquisition program. Usually, the contract requires that government representatives be informed ahead of time of any (significant or otherwise) testing the contractor conducts so the government can arrange to witness certain testing or receive results of the tests. Further, the contractor’s internal data should be available as a contract provision. The Deputy for T&E must ensure that government test personnel (DT&E/OT&E) have access to contractor test results. It would be desirable to have all testers observe some contractor tests to help develop confidence in the results and identify areas of risk.

4.9 SPECIFICATIONS

Within the program office, the engineering section is usually tasked to create the system performance specifications for release of the RFP. The contractor is then tasked with creating the specification documentation called out by the contract, which will be delivered once the item/system design is formalized for production. The Deputy for T&E performs an important function in specification formulation by reviewing the specifications to determine if performance parameters are testable; if current, state-of-the-art technology can determine (during the DT&E test phase) whether the performance specifications are being met by the acquisition item; or if the specified parameters are too “tight.” A specification is too “tight” if the requirements (Section 3) are impossible to meet or demonstrate, or if the specification has no impact on the form, fit, or function of the end-item, the system it will become a part of, or the system with which it will interact. The Deputy for T&E must determine if test objectives can be adequately formulated from those specifications that will provide thresholds of performance, minimum and maximum standards, and reasonable operating conditions for the end-item’s final mission and operating environment. The specifications (Section 4) shape the DT&E testing scenario, test ranges, test support, targets, etc., and are very important to the Deputy for T&E.


4.10 INDEPENDENT TEST AND EVALUATION AGENCIES

The PMO Deputy for T&E does not have direct control over government-owned test resources, test facilities, test ranges, test personnel, etc. Therefore, the Deputy for T&E must depend on the DT or OT organizations controlling them and stay involved with the test agency activities. The amount of involvement depends on the item being tested; its complexity, cost, and characteristics; the length of time for testing; the amount of test funds; etc. Usually, the “nuts and bolts” detailed test plans and procedures are written by the test organizations controlling the test resources with input and guidance from the Program Office Deputy for T&E. The Deputy for T&E is responsible for ensuring that the tests (DT&E) are performed using test objectives based on the specifications and that the requirements of timeliness, accuracy, and minimal costs are met by the test program design. During the testing, the Deputy for T&E monitors test results. The test agencies (DT/OT) submit a copy of their report to the Program Office at the end of testing, usually to the Office of the Deputy for T&E. The Army is the only Service to have a designated independent evaluation agency, which provides feedback to the program office.

4.11 PMO RESPONSIBILITIES FOR OPERATIONAL TEST AND EVALUATION

In the government PMO, there should be a section responsible for T&E. Besides being responsible for DT&E support to the PM, this section should be responsible for program coordination with the OT&E agency (Figure 4-1). The offices of the systems engineer or the Deputy for T&E may be designated to provide this support to the PM. In some Services, responsibilities of the Deputy for T&E include coordination of test resources for all phases of OT&E.

• UNDERSTAND THE POLICIES
• ORGANIZE FOR T&E
• KEEP SYSTEM REQUIREMENTS DOCUMENTS CURRENT
• AGONIZE OVER SYSTEM THRESHOLDS
• WORK CLOSELY WITH THE OPERATIONAL TEST DIRECTOR
• DON’T FORGET ABOUT OPERATIONAL SUITABILITY
• MAKE FINAL DT&E A REHEARSAL FOR IOT&E
• PREPARE INTERFACING SYSTEMS FOR YOUR IOT&E
• MANAGE SOFTWARE TESTING CLOSELY
• TRACK AVAILABILITY OF TEST RESOURCES AND TEST SUPPORT PERSONNEL/FACILITIES

SOURCE: Matt Reynolds, NAVSEA.

Figure 4-1. Lessons Learned from OT&E for the PM


4.11.1 Contract Responsibilities

The Deputy for T&E or a T&E representative ensures that certain sections of the RFP contain sufficient allowance for T&E support by contractors. This applies whether the contract is for a development item, a production item (limited production, such as Low Rate Initial Production (LRIP), or Full Rate Production (FRP)), or the enhancement/upgrade of portions of a weapons system. Where allowed within the law, contractor support for OT&E should be considered to help resolve basic issues such as data collection requirements, test resources, contractor test support, and funding.

In the overall portion of the RFP, government personnel, especially those in the OTAs, must be guaranteed access to the contractor’s development facilities, particularly during the DT&E phase. Government representatives must be allowed to observe all contractor in-house testing and have access to test data and reports.

4.11.2 Data Requirements

The contract must specify the data the contractor will supply the OTA. Unlike DT&E, the contractor will not be making the OT&E plans, procedures, or reports. These documents are the responsibility of the OTA. The PMO Deputy for T&E should include the OTA on the distribution list for all test documents that are of concern during the DT&E phase of testing so the OTA will be informed of test item progress and previous testing. In this way, the OTA will be informed when developing its own test plans and procedures for OT&E. In fact, OTA representatives should attend the CDRL Review Board and provide the PMO with a list of the types of documents the OTA will need. The Deputy for T&E should coordinate the test sections of this data list with the OTA and indicate concerns at that meeting. All contractor test reports should be made available to the OTA. In return, the Deputy for T&E must stay informed of all OTA activities, understand the test procedures and plans, and receive the test reports. Unlike DT&E, the PMO Deputy for T&E will not have report or document approval authority over OT&E documents as the Deputy has over contractor documentation. The Deputy for T&E is always responsible for keeping the PM informed of OT&E results.

4.11.3 Test Schedule

Another important early activity the Deputy for T&E must accomplish is to coordinate the OT&E test schedule. Since the contractor may be required to provide support, the OT&E test support may need to be contractually agreed upon before contract award. Sometimes, the Deputy for T&E can formulate a draft schedule (based on previous experience) and present this schedule to the operational test representative at the initial Test Planning Working Group (TPWG) for review; or the Deputy for T&E can contact the OTA and arrange a meeting to discuss the new program. In the meeting, the time requirements envisioned by the OTA can be discussed. Input from that meeting then goes into the RFP and to the PM. The test schedule must allow time for DT&E testing and OT&E testing if testing is not combined or test assets are limited. Before setup of IOT&E, the Certification of Readiness for IOT&E process may require a time gap for review of DT&E test results and refurbishment or correction of deficiencies discovered during DT&E, etc. The test schedule for DT&E should not be so “success-oriented” that the IOT&E test schedule is adversely impacted, not allowing enough time for adequate operational testing or the reporting of IOT&E results.


4.11.4 Contractor Support

The Deputy for T&E provides all T&E input to the RFP/SOW. The Deputy for T&E must determine, before the beginning of the program acquisition phase, whether the contractor will be involved in supporting IOT&E and, if so, to what extent. According to Title 10, United States Code (U.S.C.), the system contractor can only be involved in the conduct of IOT&E if, once the item is fielded, tactics and doctrine say the contractor will be providing support or operating that item during combat. If not, no system contractor support is allowed during IOT&E. Before IOT&E, however, the contractor may be tasked with providing training, training aids, and handbooks to the Service training cadre so they can train the IOT&E users and maintenance personnel. In addition, the contractor must be required to provide sufficient spare parts for the operational maintenance personnel to maintain the test item while undergoing operational testing. These support items must be agreed upon by the PMO and OTA and must contractually bind the contractor. If, however, the contractor will be required to provide higher-level maintenance of the item for the duration of the IOT&E, data collection on those functions will be delayed until a subsequent Follow-on Operational Test and Evaluation (FOT&E).

4.11.5 Statement of Work

One of the most important documents receiving input from the Deputy for T&E is the SOW. The Deputy for T&E must outline all required or anticipated contractor support for DT&E and OT&E. This document outlines data requirements, contractor-conducted or supported testing, government involvement (access to contractor data, tests, and results), operational test support, and any other specific test requirements the contractor will be tasked to perform during the duration of the contract.

4.11.6 OT&E Funding

The Deputy for T&E provides the PM estimates of PMO test program costs to conduct OT&E. This funding includes contractor and government test support for which the program office directly or indirectly will be responsible. Since Service OTAs fund differently, program office funding for conducting OT&E varies. The Deputy for T&E must determine these costs and inform the PM.

4.11.7 Test and Evaluation Master Plan

The Deputy for T&E is responsible for managing the TEMP throughout the test program. The OTA usually is tasked to complete the OT section of the TEMP and outline its proposed test program through all phases of OT&E. The resources required for OT&E should also be entered in Part V. It is important to keep the TEMP updated regularly so that test organizations involved in OT&E understand the scope of their test support. The TEMP should be updated regularly by the OTA. Further, if any upgrades, improvements, or enhancements to the fielded weapon system occur, the TEMP must be updated or a new one created to outline new DT and OT requirements.

4.11.8 Program Management Office Support for OT&E

Even though operational testing is performed by an independent organization, the PM plays an important role in its planning, reporting, and funding. The PM must coordinate program activities with the test community in the T&E Working-level Integrated Product Team (WIPT), especially the OTAs. The PM ensures that testing can address the critical issues and provides feedback from OT&E testing activities to contractors.

At each milestone review, the PM is required to brief the decision authority on the testing planned and completed on the program.


It is important that PMO personnel have a good understanding of the test program and that they work with the operational test community. This will ensure OT&E is well planned and adequate resources are available. The PMO should involve the test community by organizing test coordinating groups at program initiation and by establishing channels of communication between the PMO and the key test organizations. The PMO can often avoid misunderstandings by aggressively monitoring the system testing and providing up-to-date information to key personnel in the Office of the Secretary of Defense and the Services. The PMO staff should keep appropriate members of the test community well informed concerning system problems and the actions taken by the PMO to correct them. The PMO must assure that contractor and government DT&E supports the decision to certify the system’s readiness for IOT&E.

4.11.9 Support for IOT&E

The Certification of Readiness for IOT&E process provides a forum for a final review of test results and support required by the IOT&E. The Deputy for T&E must ensure the contract portions adequately cover the scope of testing as outlined by the OTA. The program office may want to provide an observer to represent the Deputy for T&E during the actual testing. The Deputy for T&E involvement in IOT&E will be to monitor and coordinate; the Deputy for T&E will keep the PM informed of progress and problems that arise during testing and will monitor required PMO support to the test organization. Sufficient LRIP items must be manufactured to run a complete and adequate OT&E program. The DOT&E determines the number of LRIP items for IOT&E on oversight programs, and the OTA makes this determination for all others. 2 For problems requiring program office action, the Deputy for T&E will be the point of contact.

The Deputy for T&E will be concerned with IOT&E of the LRIP units after a limited number are produced. The IOT&E must be closely monitored so that an FRP decision can be made. As in the operational assessments, the Deputy for T&E will be monitoring test procedures and results and keeping the PM informed. If the item does not succeed during IOT&E, a new process of DT&E or a modification may result, and the Deputy for T&E will be involved (as at any new program’s inception). If the item passes IOT&E and is produced at full rate, the Deputy for T&E will be responsible for ensuring that testing of those production items is adequate to ensure that the end items physically and functionally resemble the development items.

4.11.10 FOT&E and Modifications, Upgrades, Enhancements, or Additions

During FOT&E, the Deputy for T&E monitors the testing, and the level of contractor involvement is usually negotiated. The Deputy for T&E should receive any reports generated by the operational testers during this time. Any deficiencies noted during FOT&E should be evaluated by the PMO, which may decide to incorporate upgrades, enhancements, or additions to the current system or in future increments. If the PM and the engineering section of the program office design or develop modifications that are incorporated into the weapon system design, additional FOT&E may be required.

Once a weapon system is fielded, portions of that system may become obsolete, ineffective, or deficient and may need replacing, upgrading, or enhancing to ensure the weapon system meets current and future requirements. The Deputy for T&E plays a vital role in this process. Modifications to existing weapon systems may be managed as an entirely new weapon system acquisition. However, since these are changes to existing systems, the Deputy for T&E is responsible for determining if these enhancements degrade the existing system, whether they are compatible with its interfaces and functions, and whether Non-Developmental Items (NDIs) require retest or the entire weapon system needs reverification.


The Deputy for T&E must plan the test program’s funding, schedule, test program, and contract provisions with these items in mind. A new TEMP may have to be generated, or the original weapon system TEMP modified and re-coordinated with the test organizations. The design of the DT&E and FOT&E program usually requires coordination with the engineering, contracting, and program management IPTs of the program office.

4.11.11 Test Resources

During all phases of OT, the Deputy for T&E must coordinate with the operational testers to ensure they have the test articles needed to accomplish their mission. Test resources will be either contractor-provided or government-provided. The contractor resources must be covered in the contract, whether in the development contract or the production contract. Government test resources needed are determined by the operational testers, who usually coordinate the test ranges, test support, and the user personnel for testing. The PM programs funding for support of OT. Funding for OT&E is identified in the TEMP and funded in the PMO’s budget. The Services’ policy specifies how the OTAs develop and manage their own budgets for operational testing.

4.12 SUMMARY

Staffing requirements in the PMO vary with the program phase and the T&E workload. T&E expertise is essential in the early planning stages but can be provided through matrix support. The Deputy for T&E may be subordinate to the chief engineer in early phases but should become a separate staff element after prototype testing. Changing of critical players can destroy established working relationships and abrogate prior agreements if continuity is not maintained. The PMO management of T&E must provide for an integrated focus and a smooth transition from one staff-support mode to the next.

The PMO should be proactive in its relations with the Service OTA. There are many opportunities to educate the OTA on system characteristics and expected performance. Early OTA input to design considerations and requirements clarification can reduce surprises downstream. Operational testing is an essential component of the system development and decision-making process. It can be used to facilitate system development or may become an impediment. In many cases, the PMO attitude toward operational testing and the OTA will influence which role the OTA assumes.


ENDNOTES

1. DoD 5010.12-L, Acquisition Management Systems and Data Requirements Control List (AMSDL), April 2001.

2. DoD Instruction (DoDI) 5000.2, Operation of the Defense Acquisition System, May 12, 2003.




5
TEST-RELATED DOCUMENTATION

5.1 INTRODUCTION

During the course of a defense acquisition program, many documents are developed that have significance for those responsible for testing and evaluating the system. This chapter is designed to provide background on some of these documents.

As Figure 5-1 shows, test-related documentation spans a broad range of materials. It includes requirements documentation such as the Capability Development Document (CDD) and Capability Production Document (CPD); program decision documentation such as the Acquisition Decision Memorandum (ADM) with exit criteria; and program management documentation such as the Acquisition Strategy, Baseline documentation, the Systems Engineering Plan (SEP), the logistics support planning, and the Test and Evaluation Master Plan (TEMP). Of importance to Program Managers (PMs) and to Test and Evaluation (T&E) managers are additional test program documents such as specific test designs, test plans, Outline Test Plans/Test Program Outlines (OTPs/TPOs), evaluation plans, and test reports. This chapter concludes with a description of the End-of-Test Phase and Beyond Low Rate Initial Production (BLRIP) Reports, and two special-purpose T&E status reports that are used to support the milestone decision process.

5.2 REQUIREMENTS DOCUMENTATION

5.2.1 Continuing Mission Capability Analyses

The Joint Chiefs, via the Joint Requirements Oversight Council (JROC), are required to conduct continuing mission analyses of their assigned areas of responsibility. 1 These Functional Analyses (Area, Needs, Solutions) may result in recommendations to initiate new acquisition programs to reduce or eliminate operational deficiencies. If a need cannot be met through changes in Doctrine, Organization, Training, Leadership and Education, Personnel and Facilities (DOTLPF), and a Materiel (M) solution is required, the needed capability is described first in an Initial Capabilities Document (ICD) and then in the CDD. When the cost of a proposed acquisition program is estimated to exceed limits specified in Operation of the Defense Acquisition System, the proposed program is considered a Major Defense Acquisition Program (MDAP) and requires JROC approval (JROC Interest). 2 Lesser programs for Service action are designated Joint Integration or Independent programs. The ICD is completed at the beginning of the analyses for a mission area; the CDD and CPD are program-/increment-specific and are reviewed periodically to evaluate necessary system modifications.

5.2.2 Initial Capabilities Document

The ICD makes the case to establish the need for a materiel approach to resolve a specific capability gap. The ICD supports the Analysis of Alternatives (AoA), development of the Technology Development Strategy (TDS), and various milestone decisions. The ICD defines the capability gap in terms of the functional area(s), relevant range of military operations, time, obstacles to overcome, and key attributes with appropriate Measures of Effectiveness (MOEs). Once approved, the ICD is not normally updated. Format for the ICD is found in CJCSM [Chairman of the Joint Chiefs of Staff Manual] 3170.01A. 3


Figure 5-1. Test-Related Documentation

5.2.3 Capability Development Document

The CDD outlines an affordable increment of militarily useful and supportable operational capability that can be effectively developed, produced or acquired, deployed, and sustained. Each increment of capability will have its own set of attributes and associated performance values, with thresholds and objectives established by the sponsor (Service) with input from the user. The CDD performance attributes apply to only one increment and include Key Performance Parameters (KPPs) and other parameters that will guide the development, demonstration, and testing of the current increment. The CDD will outline the overall strategy to develop the full or complete capability by describing all increments (Figure 5-2). The CDD will be approved at Milestone B and not normally updated. The format is found in CJCSM 3170.01A. 4

5.2.4 Capability Production Document

The CPD addresses the production attributes and quantities specific to a single increment and is finalized by the sponsor after the Design Readiness Review (DRR), when projected capabilities of the increment in development have been specified with sufficient accuracy to begin production. Design trades during System Development and Demonstration (SDD) should have led to the final production design needed for Milestone C. Initial Operational Test and Evaluation (IOT&E) will normally test to the values in the CPD. The threshold and objective values from the CDD are, therefore, superseded by the specific production values detailed in the CPD. Reduction in any KPP threshold value will require reassessment of the military utility of the reduced capability. The CPD will always reference the originating ICD. The format is found in CJCSM 3170.01A. 5
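The interplay among CDD threshold/objective values, the superseding CPD production values, and demonstrated IOT&E results can be illustrated with a small Python sketch. Everything in it is hypothetical — the KPP names, numbers, and the “larger is better” convention are assumptions made only for the example, not content from the JCIDS documents themselves.

from dataclasses import dataclass

@dataclass
class Kpp:
    """One Key Performance Parameter; larger values are assumed to be better here."""
    name: str
    cdd_threshold: float      # threshold from the CDD increment
    cpd_threshold: float      # production value from the CPD (supersedes the CDD value)
    demonstrated: float       # value demonstrated in IOT&E

# Invented example KPPs, not drawn from any real program.
kpps = [
    Kpp("Range (km)", cdd_threshold=300.0, cpd_threshold=300.0, demonstrated=325.0),
    Kpp("Sortie rate (per day)", cdd_threshold=3.0, cpd_threshold=2.5, demonstrated=2.8),
]

for k in kpps:
    if k.cpd_threshold < k.cdd_threshold:
        print(f"{k.name}: CPD threshold reduced from CDD "
              f"({k.cdd_threshold} -> {k.cpd_threshold}); reassess military utility.")
    verdict = "meets" if k.demonstrated >= k.cpd_threshold else "does not meet"
    print(f"{k.name}: demonstrated {k.demonstrated}, {verdict} the CPD threshold of {k.cpd_threshold}.")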


[Figure 5-2. Requirements Definition Process: illustrates how a very broad statement of mission need (e.g., "destroy a hardened target in Central Asia that is protected by a 21st century air defense system") is refined across milestones from candidate concepts (ICBM, cruise missile, manned bomber, attack aircraft) into CDD characteristics and performance parameters (e.g., warhead size, CEP, survivability, speed, range, RCS, altitude) and, ultimately, final system characteristics with threshold values.]

Initial Operational Test and Evaluation (IOT&E) will normally test to the values in the CPD. The threshold and objective values from the CDD are, therefore, superseded by the specific production values detailed in the CPD. Reduction in any KPP threshold value will require reassessment of the military utility of the reduced capability. The CPD will always reference the originating ICD. The format is found in CJCSM 3170.01A.5

5.2.5 System Threat Assessment (STA)

An STA is prepared, starting at Milestone B, by the Department of Defense (DoD) Component Intelligence Command or Agency and, for Acquisition Category (ACAT) ID programs, is validated by the Defense Intelligence Agency (DIA).6 The STA, for Defense Acquisition Board (DAB) programs, will contain a concise description of the projected future operational threat environment, the system-specific threat, the reactive threat that could affect program decisions, and, when appropriate, the results of interactive analysis obtained by the Service PM when evaluating the program against the threat. Threat projections start at the Initial Operational Capability (IOC) and extend over the following 10 years. The STA provides the basis for the test design of threat scenarios and the acquisition of appropriate threat targets, equipment, or surrogates. It provides threat data for Development Test and Evaluation (DT&E) and Operational Test and Evaluation (OT&E). Vulnerability and lethality analyses during Live Fire Testing (LFT) of ACAT I and II systems are contingent on valid threat descriptions. A summary of the STA is included in Part 1 of the TEMP.


5.3 PROGRAM DECISION DOCUMENTATION

5.3.1 Acquisition Decision Memorandum

Under Secretary of Defense for Acquisition, Technology, and Logistics (USD(AT&L)) decisions at major defense ACAT ID milestones are recorded in a document known as an ADM. The ADM documents a USD(AT&L) decision on program progress and on the Acquisition Program Baseline (APB) at milestones and decision reviews. In conjunction with an ADM, and its included exit criteria for the next phase, the APB is a primary program guidance document providing KPP thresholds/objectives for system cost, schedule, and performance.

5.3.2 Analysis of Alternatives

An AoA is normally prepared by a DoD Component agency (or Principal Staff Assistant (PSA) for ACAT IA programs) other than the Program Management Office (PMO) for each milestone review beginning at Milestone A. The AoA aids decision makers by examining the relative advantages and disadvantages of program alternatives, shows the sensitivity of each alternative to possible changes in key assumptions, and provides the rationale for each option. The guidance in the Defense Acquisition Guidebook, Chapter 4, requires a clear linkage between the AoA, system requirements, and system evaluation MOEs. The driving factor behind this linkage is the decision maker's reluctance to accept modeling or simulation projections of future system performance without actual test data that validate AoA results.

5.4 PROGRAM MANAGEMENT DOCUMENTATION

5.4.1 Acquisition Strategy

An event-based acquisition strategy must be formulated at the start of a development program. An event-driven acquisition strategy explicitly links program decisions to demonstrated accomplishments in development, testing, and initial production. The strategy constitutes a broad set of concepts that provide direction and control for the overall development and production effort. The acquisition strategy is updated at each milestone and decision review using a Working-level Integrated Product Team (WIPT) structure throughout the life of a program. The level of detail reflected in the acquisition strategy can be expected to increase as a program matures. The acquisition strategy serves as a tailored conceptual basis for formulating other program functional plans such as the TEMP.

It is important that T&E interests be represented as the acquisition strategy is formulated because the acquisition strategy should:

• Provide an overview of the T&E planned for the program, ensuring that adequate T&E is conducted prior to the production decision;
• Discuss plans for providing adequate quantities of test hardware;
• Describe levels of concurrence and combined Development Test/Operational Test (DT/OT).

5.4.2 Baseline Documentation

The APB will initially be developed before Milestone B or program initiation and revised for each subsequent milestone. Baseline parameters represent the cost, schedule, and performance objectives and thresholds for the system in a production configuration. Each baseline influences the T&E activities in the succeeding phases. MOEs or Measures of Performance (MOPs) shall be used in describing needed capabilities early in a program.
Guidance on the formulation of baselines is found in the Defense Acquisition Guidebook.7 Performance demonstrated during T&E of production systems must meet or exceed the thresholds. The thresholds establish deviation limits (an actual or anticipated breach triggers reports) for KPPs beyond which the PM may not trade off cost, schedule, or performance without authorization by the Milestone Decision Authority (MDA). Baseline and test documentation must reflect the same expectations for system performance. The total number of performance parameters shall be the minimum number needed to characterize the major drivers of operational effectiveness and suitability, schedule, technical progress, and cost for a production system intended for deployment. The performance parameters may not completely define operational effectiveness or suitability.
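To make the threshold/objective mechanics concrete, the following is a minimal, hypothetical sketch (not part of this guide or any program's actual tooling) of how demonstrated T&E results might be compared against APB-style KPP values to flag an actual or anticipated breach; the parameter names and numbers are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class KPP:
    """A single Key Performance Parameter from an APB-style baseline (illustrative)."""
    name: str
    threshold: float           # minimum (or maximum) acceptable value
    objective: float           # desired value
    higher_is_better: bool = True

def assess(kpp: KPP, demonstrated: float) -> str:
    """Compare a demonstrated T&E result against the baseline values."""
    meets_threshold = ((demonstrated >= kpp.threshold) if kpp.higher_is_better
                       else (demonstrated <= kpp.threshold))
    meets_objective = ((demonstrated >= kpp.objective) if kpp.higher_is_better
                       else (demonstrated <= kpp.objective))
    if not meets_threshold:
        return f"{kpp.name}: BREACH - report; no trade-off without MDA authorization"
    if meets_objective:
        return f"{kpp.name}: meets objective"
    return f"{kpp.name}: meets threshold (below objective)"

# Hypothetical baseline values and demonstrated test results.
baseline = [KPP("Range (km)", threshold=100, objective=150),
            KPP("CEP (m)", threshold=10, objective=5, higher_is_better=False)]
results = {"Range (km)": 120, "CEP (m)": 12}

for kpp in baseline:
    print(assess(kpp, results[kpp.name]))
```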


5.4.3 Acquisition Logistics Planning

Supportability analyses are a composite of all support considerations necessary to ensure the effective and economical support of a system at all levels of maintenance for its programmed life cycle. Support concepts describe the overall logistics support program and include logistics requirements, tasks, and milestones for the current and succeeding phases of the program. The analyses serve as the source document for logistics support testing requirements.

Guidelines for logistics support analyses are documented in Logistics Support Analysis.8 This standard identifies how T&E programs should be planned to serve the following three logistics supportability objectives:

(1) Provide measured data for input into system-level estimates of readiness, operational costs, and logistics support resource requirements;
(2) Expose supportability problems so they can be corrected prior to deployment;
(3) Demonstrate contractor compliance with quantitative supportability-related design requirements.

Development of an effective T&E program requires close coordination of efforts among all systems engineering disciplines, especially those involved in logistics support analyses. Areas of particular interest include Reliability, Availability, and Maintainability (RAM); Human Systems Integration (HSI); Environmental Safety and Occupational Health (ESOH); and post-deployment evaluations. The support analyses should be drafted shortly before program start to provide a skeletal framework for Logistics Support Analysis (LSA), to identify initial logistics testing requirements that can be used as input to the TEMP, and to provide test feedback to support Integrated Logistics Support (ILS) development. Test resources will be limited early in the program.

5.4.4 Specification

The system specification is a document used in development and procurement that describes the technical performance requirements for items, materials, and services, including the procedures by which it will be determined that the requirements have been met. The specification evolves over the developmental phases of the program with increasing levels of detail: system, item performance, item detail, process, and material. Section 4 of the specification identifies what procedures (inspection, demonstration, analysis, and test) will be used to verify the performance parameters listed in Section 3.
Further details may be found in Military Standard (MIL-STD)-961D, DoD Standard Practice for Defense Specifications.9

5.4.5 Work Breakdown Structure (WBS)

A program WBS shall be established that provides a framework for program and technical planning, cost estimating, resource allocations, performance measurements, and status reporting. Program offices shall tailor a program WBS for each program using the guidance in the Military Handbook, Work Breakdown Structure.10 Level 2 of the WBS hierarchical structure addresses system-level T&E, with sublevels for DT&E and OT&E. Additionally, each configuration item structure includes details of the integration and test requirements.
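As an illustration of the hierarchy just described, the sketch below shows one way a program WBS fragment might be represented and printed. The element names and numbering are notional examples chosen for illustration, not an extract from the Military Handbook.

```python
# Illustrative program WBS fragment; numbering and element names are notional.
wbs = {
    "1.0 Air Vehicle System": {
        "1.1 Air Vehicle": {},
        "1.4 System Test and Evaluation": {              # Level 2 T&E element
            "1.4.1 Development Test and Evaluation": {},
            "1.4.2 Operational Test and Evaluation": {},
            "1.4.3 Mock-ups": {},
            "1.4.4 Test Facilities": {},
        },
    }
}

def print_wbs(node: dict, indent: int = 0) -> None:
    """Print the WBS as an indented tree, one line per element."""
    for element, children in node.items():
        print("  " * indent + element)
        print_wbs(children, indent + 1)

print_wbs(wbs)
```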


5.5 TEST PROGRAM DOCUMENTATION

5.5.1 Test and Evaluation Master Plan

An evaluation strategy is developed by the early concept team that describes how the capabilities in the CDD will be evaluated once the system is developed. The evaluation strategy, part of the TDS, provides the foundation for development of the program TEMP at the milestone supporting program start. The TEMP is the basic planning document for T&E related to a DoD system acquisition (Figure 5-3). It is prepared by the PMO, with the OT information provided by the Service Operational Test Agency (OTA). It is used by the Office of the Secretary of Defense (OSD) and the Services for planning, reviewing, and approving T&E programs, and it provides the basis and authority for all other detailed T&E planning documents. The TEMP identifies Critical Technical Parameters (CTPs), performance characteristics (MOEs), and Critical Operational Issues (COIs); and describes the objectives, responsibilities, resources, and schedules for all completed and planned T&E. The TEMP, in the Defense Acquisition Guidebook specified format, is required by DoD Instruction (DoDI) 5000.2 for ACAT I, IA, and designated oversight programs (see enclosure 5 to DoDI 5000.2 for more information regarding the TEMP). Format is at Service discretion for ACAT II and III programs.

[Figure 5-3. Test Program Documentation: shows the flow from government documents (ICD, CDD, STA, APB, system specifications, program schedule, and TEMP) through the system and integrated test plans and contract schedule to contractor detailed test plans (hardware, software, system), test procedures, mission planning documents, test cards/scripts, and quick-look, interim, and final test reports.]


For example, the Army requires that each TEMP contain the full set of approved Critical Operational Issues and Criteria (COIC), with associated scope and rationale, forming the basis for planning the system's evaluations.11

5.5.2 Evaluation Plan

Evaluation planning is usually included within the test plan. Evaluation planning considers the evaluation and analysis techniques that will be required once the test data have been collected and processed. Evaluation is linked closely to the test design, especially the statistical models on which the test design is built.

The Army requires a System Evaluation Plan (SEP) describing the evaluation approach that addresses the technical and operational aspects necessary to assess the system's operational effectiveness, suitability, and survivability.

The objective of the Army's emphasis on evaluation is to address the issues; describe the evaluation of issues that require data from sources other than test; state the technical or operational issues and criteria; identify data sources (data source matrix); state the approach to the independent evaluation; and specify the analytical plan and identify program constraints.12

Evaluation plans are prepared for all systems in development by the independent evaluators during concept exploration and in coordination with the system developer. The Army SEP complements the TEMP and is developed in support of the initial TEMP. It identifies each evaluation issue and the methodology to be used to assess it, and specifies requirements for exchange of information between the development/operational testers and materiel developers.

5.5.3 Test Design

Test designers need to ensure that the test is constructed to provide useful information in all areas/aspects that will lead to an assessment of the system performance. For example, a complicated, even ingenious, test that does not provide the information required by the decision makers is, in many respects, a failed endeavor. Therefore, part of the process of developing a test concept or test design (the distinction between these varies from organization to organization) should be to consider whether the test will provide the information required by the decision makers. In other words, "Are we testing the right things in the right way? Are our evaluations meaningful?"
The test design is statistical and analytical in nature and should perform the following functions:

(1) Structure and organize the approach to testing in terms of specific test objectives;
(2) Identify key MOEs and MOPs;
(3) Identify the required data and demonstrate how the data will be gathered, stored, analyzed, and used to evaluate MOEs;
(4) Indicate what part Modeling and Simulation (M&S) will play in meeting test objectives;
(5) Identify the number and type of test events and required resources.

The test design may serve as a foundation for the more detailed test plan and specifies the test objectives, events, instrumentation, methodology, data requirements, data management needs, and analysis requirements.
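The functions listed above can be thought of as fields of a structured record. The sketch below is a hypothetical illustration of such a record linking a test objective to its MOEs/MOPs, data requirements, M&S role, and test events and resources; every entry shown is invented and is not drawn from any actual test design.

```python
# A notional test-design record capturing the elements listed above.
test_design = {
    "objective": "Assess target detection performance in clutter",
    "measures": {
        "MOE": ["Probability of detection by range band"],
        "MOP": ["False-alarm rate per hour", "Time to classify (s)"],
    },
    "data_requirements": {
        "sources": ["onboard instrumentation", "range telemetry"],
        "storage": "test data management system",
        "analysis": "detection curves fit per range band",
    },
    "modeling_and_simulation": "extend measured results to untested ranges",
    "events": [
        {"event": "captive-carry flight", "trials": 12,
         "resources": ["instrumented aircraft"]},
        {"event": "open-air range test", "trials": 6,
         "resources": ["threat surrogate targets"]},
    ],
}

# Print the record one field at a time for review.
for key, value in test_design.items():
    print(f"{key}: {value}")
```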


5.5.4 Test Plan

The test plan is the vehicle that translates a test concept and statistical/analytical test design into concrete resources, procedures, and responsibilities. The size and complexity of a test program and its associated test plan are determined by the nature of the system being tested and the type of testing that is to be accomplished. Some major weapons systems may require large numbers of separate tests to satisfy test objectives and, thus, require a multi-volume test plan; other testing may be well defined by a relatively brief test plan. The test plan also provides a description of the equipment configuration and known limitations to the scope of testing. The type of information typically included in a test plan is shown in Table 5-1.

Table 5-1. Sample Test Plan Contents

Preliminary Pages*
i. Title Page
ii. Approval Page
iii. Conditions for Release of Technical Data
iv. Preface (Abstract)
v. Executive Summary
vi. Table of Contents
*The actual number of these pages will be determined by the length of the preliminary elements (e.g., table of contents, terms and abbreviations, etc.).

Main Body
1. Introduction
2. Background
3. Test Item Description
4. Overall Test Objectives
5. Constraints and Limitations
6. Test Resources
7. Safety Requirements
8. Security Requirements
9. Test Project Management
10. Test Procedures
11. Test Reporting
12. Logistics
13. Environmental Protection

Appendices
A. Test Condition Matrix
B. Requirements Traceability
C. Test Information Sheets
D. Parameters List
E. Data Analysis Plan
F. Instrumentation Plan
G. Logistics Support Plan
H-Z. As Required

Distribution
List of Abbreviations

Source: AFFTC Test Plan Preparation Guide, May 1994.

5.5.5 Outline Test Plan/Resources Plan

The Army's Outline Test Plan (OTP) and the Air Force's Test Resources Plan (TRP) are essential test planning documents. They are formal resource documents specifying the resources required to support the test. Since the OTP or TRP provides the basis for fiscal programming and for coordinating the necessary resources, it is important that these documents be developed in advance and kept current to reflect maturing resource requirements as the test program develops. The Navy makes extensive use of the TEMP to document T&E resource requirements. Each Service has periodic meetings designed to review resource requirements and resolve problems with test support.

5.5.6 Test Reports

5.5.6.1 Quick-Look Reports

Quick-look analyses are expeditious analyses performed during testing using limited amounts of the database. Such analyses often are used to assist in managing test operations. Quick-look reports are used occasionally to inform higher authorities of test results. Quick-look reports may have associated briefings that present T&E results and substantiate conclusions or recommendations. Quick-look reports may be generated by the contractor or a government agency. They are of particular interest for high-visibility systems that may be experiencing some development difficulties. Techniques and formats should be determined before the start of testing. They may be exercised during pretest trials.
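Because quick-look analyses work from a limited slice of the test database, they usually reduce to a few expedient summary statistics. The sketch below is a hypothetical example of that kind of computation; the trial data are invented and are shown only to illustrate the idea.

```python
import statistics

# Hypothetical results from the first few trials of an ongoing test.
trials = [
    {"trial": 1, "hit": True,  "miss_distance_m": 2.1},
    {"trial": 2, "hit": False, "miss_distance_m": 14.8},
    {"trial": 3, "hit": True,  "miss_distance_m": 3.4},
    {"trial": 4, "hit": True,  "miss_distance_m": 1.9},
]

# Expedient summaries of the limited data set collected so far.
hit_rate = sum(t["hit"] for t in trials) / len(trials)
median_miss = statistics.median(t["miss_distance_m"] for t in trials)

print(f"Quick look after {len(trials)} trials:")
print(f"  hit rate        : {hit_rate:.0%}")
print(f"  median miss (m) : {median_miss:.1f}")
```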


5.5.6.2 Final Test Report

The final test report disseminates the test information to decision authorities, program office staff, and the acquisition community. It provides a permanent record of the execution of the test and its results. The final test report should relate the test results to the critical issues and address the objectives stated in the test design and test plan. A final test report may be separated into two sections: a main section providing the essential information about test methods and results, and a second section consisting of supporting appendices that provide details and supplemental information. Generally, the following topics are included in the main body of the report:

(1) Test purpose
(2) Issues and objectives
(3) Method of accomplishment
(4) Results (keyed to the objectives and issues)
(5) Discussion, conclusions, and recommendations.

Appendices of the final test report may address the following topics:

(1) Detailed test description
(2) Test environment
(3) Test organization and operation
(4) Instrumentation
(5) Data collection and management
(6) Test data
(7) Data analysis
(8) M&S
(9) RAM information
(10) Personnel
(11) Training
(12) Safety
(13) Security
(14) Funding
(15) Asset disposition.

The final test report may contain an evaluation and analysis of the results, or the evaluation may be issued separately. The analysis tells what the results are, whereas an evaluation tells what the results mean. The evaluation builds on the analysis and generalizes from it, showing how the results apply outside the test arena. It shows what the implications of the test are and may provide recommendations. The evaluation may make use of independent analyses of all or part of the data; it may employ data from other sources and may use M&S to generalize the results and extrapolate to other conditions. In the case of the Army, a separate System Evaluation Report is prepared by independent evaluators within the Army Evaluation Center (AEC).

5.6 OTHER TEST-RELATED STATUS REPORTS

5.6.1 End of Test Reports

The Services are required by DoDI 5000.2, Operation of the Defense Acquisition System, to submit to OSD T&E offices copies of their formal DT&E, OT&E, and Live Fire Test and Evaluation (LFT&E) reports that are prepared at the end of each period of testing for ACAT I, IA, and oversight programs.13 These reports will generally be submitted in a timely manner to permit OSD review.


5.6.2 Beyond Low-Rate Initial Production (BLRIP) Report

Before an ACAT I or Director, Operational Test and Evaluation (DOT&E)-designated program can proceed beyond Low Rate Initial Production (LRIP), the DOT&E must submit a BLRIP report to the Secretary of Defense and the Senate and House of Representatives Committees on Armed Services, National Security, and Appropriations. This report addresses whether the OT&E performed was adequate and whether the IOT&E results confirm that the items or components tested are effective and suitable for use in combat by typical military users. The report may include information on the results of LFT&E for applicable major systems.

5.7 SUMMARY

A wide range of documentation is available to the test manager and should be used to develop T&E programs that address all relevant issues. The PM must work to ensure that T&E requirements are considered at the outset, when the acquisition strategy is formulated. The PM must also require early, close coordination and a continuing dialogue among those responsible for integration of functional area planning and the TEMP.


ENDNOTES

1. CJCS Instruction (CJCSI) 3170.01D, Joint Capabilities Integration and Development System (JCIDS), March 12, 2004; and CJCSM 3170.01A, Operation of the Joint Capabilities Integration and Development System, March 12, 2004.
2. DoDI 5000.2, Operation of the Defense Acquisition System, May 12, 2003.
3. CJCSM 3170.01A, Appendix A, enclosure D.
4. CJCSM 3170.01A, Appendix A, enclosure E.
5. CJCSM 3170.01A, Appendix A, enclosure F.
6. DoD Directive (DoDD) 5105.21, Defense Intelligence Agency, May 19, 1977. Cancelled.
7. Defense Acquisition Guidebook, http://akss.dau.mil/DAG, Chapter 9.
8. MIL-STD-1388-1A, Logistics Support Analysis, April 11, 1983.
9. MIL-STD-961D, DoD Standard Practice for Defense Specifications, March 22, 1995. Military Defense Specification Standard Practices incorporated portions of MIL-STD-490, Specification Practices, which is fully exempt from the MIL-STD-491D waiver process because it is a "Standard Practice."
10. Military Handbook (MIL-HDBK)-881, Work Breakdown Structure, January 2, 1998.
11. Army Regulation (AR) 73-1, Test and Evaluation Policy, January 7, 2002.
12. Ibid.
13. DoDI 5000.2, enclosure 5.




6
TYPES OF TEST AND EVALUATION

6.1 INTRODUCTION

This chapter provides a brief introduction to Development Test and Evaluation (DT&E) and Operational Test and Evaluation (OT&E), the two principal types of Test and Evaluation (T&E); it also discusses the role of qualification testing as a sub-element of development testing. Other important types of T&E are introduced. They include multi-Service testing; joint T&E; Live Fire Testing (LFT); Nuclear, Biological, and Chemical (NBC) testing; and Nuclear Hardness and Survivability (NH&S) testing. As Figure 6-1 illustrates, DT&E and OT&E are performed throughout the acquisition process and are identified by nomenclature that may change with the phase of the acquisition cycle in which they occur.

6.2 DEVELOPMENT TEST AND EVALUATION

DT&E is T&E conducted throughout the acquisition process to assist in engineering design and development and to verify that technical performance specifications have been met. The DT&E is planned and monitored by the developing agency and is normally conducted by the contractor. However, the development agency may perform technical compliance tests before OT&E. It includes the T&E of components, subsystems, Preplanned Product Improvement (P3I) changes, hardware/software integration, and production qualification testing. It encompasses the use of models, simulations, test beds, and prototypes or full-scale engineering development models of the system. DT&E may involve a wide degree of test complexity, depending upon the type of system or test article under development; e.g., tests of electronic breadboards or brassboards, components, subsystems, or experimental prototypes.

The DT&E may support the system design process through iterative Modeling and Simulation (M&S), such as the Simulation, Test, and Evaluation Process (STEP), that involves both contractor and government personnel. Because contractor testing plays a pivotal role in the total test program, it is important that the contractor establish an integrated test plan early to ensure that the scope of the contractor's test program satisfies government and contractor test objectives.

Anti-tamper component-level verification testing shall take place prior to production as a function of Development Test/Operational Test (DT/OT). Component-level testing shall not assess the strength of the anti-tamper mechanism provided, but instead shall verify that anti-tamper performs as specified by the source contractor or government agency.1
The Program Manager (PM) remains responsible for the ultimate success of the overall program. The PM and the test specialists on the PM's staff must foster an environment that provides the contractor with sufficient latitude to pursue innovative solutions to technical problems and, at the same time, provides the data needed to make rational trade-off decisions between cost, schedule, and performance as the program progresses.

6.2.1 Production Qualification Test (PQT)

Qualification testing is a form of development testing that verifies the design and manufacturing process.


[Figure 6-1. Testing During Acquisition: maps DT&E activities (environmental testing, reliability development test-analyze-fix testing, life and design limit testing, design-for-testability, qualification testing, combined/concurrent DT&E/OT&E, BIT/BITE/ATE, production acceptance T&E, and modification testing) and OT&E activities (Early Operational Assessments, Operational Assessments, IOT&E, LFT&E, and FOT&E), along with TEMP approvals and updates, hardware configurations from mock-up and breadboard through full rate production article, and OSD reports, across the acquisition phases from Milestone A through the Full Rate Production Decision Review and Operations and Support.]


PQTs are formal contractual tests that confirm the integrity of the system design over the operational and environmental range in the specification. These tests usually use production hardware fabricated to the proposed production design specifications and drawings. Such tests include contractual Reliability and Maintainability (R&M) demonstration tests required before production release. PQT must be completed before Full Rate Production (FRP) in accordance with Operation of the Defense Acquisition System.2

PQTs may be conducted on Low Rate Initial Production (LRIP) items to ensure the maturity of the manufacturing process, equipment, and procedures. These tests are conducted on each item or a sample lot taken at random from the first production lot and are repeated if the process or design is changed significantly or a second or alternative source is brought online. These tests are also conducted against contractual design and performance requirements.

6.3 OPERATIONAL TEST AND EVALUATION

6.3.1 The Difference Between Development and Operational Testing

The 1979 Management of Operational Test and Evaluation contained the following account of the first OT&E. This anecdote serves as an excellent illustration of the difference between development and operational testing:

The test and evaluation of aircraft and air weapon systems started with the contract awarded to the Wright brothers in 1908. This contract specified a craft that would lift two men with a total weight of 350 pounds, carry enough fuel for a flight of 125 miles, and fly 40 miles per hour in still air. The contract also required that testing be conducted to assure this capability. What we now call Development Test and Evaluation (DT&E) was satisfied when the Wright brothers (the developer) demonstrated that their airplane could meet those first contract specifications. However, no immediate military mission had been conceived for the Wright Flyer. It was shipped to Fort Sam Houston, Texas, where Captain Benjamin D. Foulois, the pilot, had orders to "teach himself to fly." He had to determine the airplane's performance, how to maintain it, and the kind of organization that would use it. Cavalry wagon masters had to be trained as airplane mechanics, and Captain Foulois was his own instructor pilot.

In the process, Captain Foulois subjected the Wright Flyer to test and evaluation under operational conditions. Foulois soon discovered operational deficiencies. For example, there was no seat on the airplane. During hard landings, Foulois' 130-pound frame usually parted company from the airplane. To correct the problem, Foulois bolted an iron tractor seat to the airplane. The seat helped, but Foulois still toppled from his perch on occasion. As a further improvement, Foulois looped his Sam Browne belt through the seat and strapped himself in. Ever since then, contoured seats and safety belts, a product of this earliest "operational" test and evaluation, have been part of the military airplane.3
Captain Foulois' experience may seem humorous now, but it illustrates the need for operational testing. It also shows that operational testing has been going on for a long time.

As shown in Table 6-1, where development testing is focused on meeting detailed technical specifications, OT focuses on the actual functioning of the equipment in a realistic combat environment in which the equipment must interact with humans and peripheral equipment.


Table 6-1. Differences Between DT&E and IOT&E

DT&E:
• Controlled by PM
• One-on-one tests
• Controlled environment
• Contractor involvement
• Trained, experienced operators
• Precise performance objectives and threshold measurement
• Test to specification
• Development test article

IOT&E:
• Controlled by independent agency
• Many-on-many tests
• Realistic/tactical environment with operational scenario
• Restricted system contractor involvement
• User troops recently trained on equipment
• Performance measurement of operational effectiveness and suitability
• Test to requirements
• Production representative test article

While DT&E and OT&E are separate activities and are conducted by different test communities, the communities must interact frequently and are generally complementary. The DT&E provides a view of the potential to reach technical objectives, and OT&E provides an assessment of the system's potential to satisfy user requirements.

6.3.2 The Purpose of OT&E

Operational Test and Evaluation is defined in Title 10, United States Code (U.S.C.), Sections 139 and 2399:

The field test, under realistic combat conditions, of any item of (or key component of) weapons, equipment, or munitions for the purposes of determining the effectiveness and suitability of the weapons, equipment, or munitions for use in combat by typical military users; and the evaluation of the results of such test. This term does not include an operational assessment based exclusively on computer modeling, simulation, or an analysis of system requirements, engineering proposals, design specifications, or any other information contained in program documents.4

Definitions of operational effectiveness and operational suitability are listed below:

Operational Effectiveness: Measure of the overall ability of a system to accomplish a mission when used by representative personnel in the environment planned or expected for operational employment of the system, considering organization, doctrine, tactics, supportability, survivability, vulnerability, and threat.5

Operational Suitability: The degree to which a system can be placed and sustained satisfactorily in field use with consideration being given to availability, compatibility, transportability, interoperability, reliability, wartime usage rates, maintainability, safety, human factors, habitability, manpower, logistics supportability, natural environmental effects and impacts, documentation, and training requirements.6

In each of the Services, operational testing is conducted under the auspices of an organization that is independent of the development agency, in environments as operationally realistic as possible, with hostile forces representative of the anticipated threat and with typical users operating and maintaining the system. In other words, OT&E is conducted to ensure that new systems meet the user's requirements, operate satisfactorily, and are supportable under actual field conditions. The major questions addressed in OT&E are shown in Figure 6-2.

[Figure 6-2. Sample Hierarchy of Questions Addressed in OT&E: OT&E addresses operational effectiveness (Does it perform as intended? Mission accomplishment, organization, doctrine, tactics, survivability, vulnerability, susceptibility, threat) and operational suitability (Is it satisfactory for field use? Reliability, availability, maintainability, supportability, compatibility, transportability, interoperability, technical data, support/test equipment).]
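The reliability and availability questions in Figure 6-2 are often summarized with simple aggregate measures. The sketch below computes two common textbook illustrations, mean time between failures with an exponential reliability estimate and availability as uptime over total time, from invented observations; these formulas are offered only as an illustration, not as measures prescribed by this guide.

```python
import math

# Invented OT&E observations: operating hours and downtime for a few test articles.
operating_hours = [120.0, 95.0, 140.0]   # cumulative operating time per article
failures = 4                              # total failures observed
downtime_hours = 22.0                     # total time down for maintenance

total_operating = sum(operating_hours)
mtbf = total_operating / failures                     # mean time between failures
mission_time = 10.0                                   # hypothetical mission length (hours)
reliability = math.exp(-mission_time / mtbf)          # assumes exponential failure times
availability = total_operating / (total_operating + downtime_hours)

print(f"MTBF               : {mtbf:.1f} h")
print(f"R({mission_time:.0f} h mission)   : {reliability:.2f}")
print(f"Availability       : {availability:.2f}")
```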


Operational Assessment (EOA, OA): An evaluation of operational effectiveness and operational suitability made by an independent OT activity, with user support as required, on other than production systems. The focus of an OA is on significant trends noted in development efforts, programmatic voids, risk areas, adequacy of requirements, and the ability of the program to support adequate operational testing. An OA may be conducted at any time using technology demonstrators, prototypes, mock-ups, Engineering Development Models (EDMs), or simulators, but will not substitute for the Initial Operational T&E (IOT&E) necessary to support FRP decisions. Normally conducted prior to Milestone C.7

The OA normally takes place during each phase of the acquisition process prior to IOT&E. It is used to provide an early assessment of potential operational effectiveness and suitability for decision makers at decision points. These assessments attempt to project the system's potential to meet the user's requirements. Assessments conducted early in the program development process may be called Early Operational Assessments (EOAs). When using an evolutionary acquisition strategy, an OA is required for all increments following the one experiencing an IOT&E before the increment is released to the user.8

6.3.3 Initial Operational Test and Evaluation

The OT&E performed in support of the FRP decision is generally known as IOT&E. The Navy calls this event OPEVAL (Operational Evaluation).


The IOT&E occurs during LRIP and must be completed before the Full Rate Production Decision Review (FRPDR). More than one IOT&E may be conducted on the system if there are system performance problems requiring retest, the system is decertified, or a need exists to test in different environments. The IOT&E is conducted on a production or production-representative system using typical operational personnel in a realistic combat scenario.

6.3.4 Follow-on Operational Test and Evaluation (FOT&E)

FOT&E is the OT&E that may be necessary after the FRPDR to refine the estimates made during IOT&E, to evaluate changes, and to reevaluate the system to ensure that it continues to meet operational needs and retains its effectiveness in a new environment or against a new threat.9 FOT&E is conducted during fielding/deployment and operational support, and sometimes may be divided into two separate activities. Preliminary FOT&E is normally conducted after the Initial Operational Capability (IOC) is attained to assess full system capability. It is conducted by the OTA or a designated organization to verify the correction of deficiencies, if required, and to assess system training and logistics status not evaluated during IOT&E. Subsequent FOT&E is conducted on production items throughout the life of a system. The results are used to refine estimates of operational effectiveness and suitability; to update training, tactics, techniques, and doctrine; and to identify operational deficiencies and evaluate modifications. This later FOT&E often is conducted by the operating command.

6.4 MULTI-SERVICE TEST AND EVALUATION

Multi-Service Test and Evaluation is T&E conducted by two or more Department of Defense (DoD) Components for systems to be acquired by more than one DoD Component, or for one DoD Component's systems that have interfaces with equipment of another DoD Component.10 All affected Services and their respective Operational Test Agencies (OTAs) participate in planning, conducting, reporting, and evaluating the multi-Service test program. One Service is designated the lead Service and is responsible for the management of the program. The lead Service is charged with the preparation and coordination of a single report that reflects the system's operational effectiveness and suitability for each Service.

The management challenge in a joint acquisition program conducting multi-Service T&E stems from the fact that the items undergoing testing will not necessarily be used by each of the Services for identical purposes. Differences among the Services usually exist in performance criteria, tactics, doctrine, configuration of armament or electronics, and the operating environment. As a result, a deficiency or discrepancy considered disqualifying by one Service is not necessarily disqualifying for all Services. It is incumbent upon the lead Service to establish a discrepancy reporting system that permits each participating Service to document all discrepancies noted.
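Because a discrepancy judged disqualifying by one Service may be acceptable to another, a lead-Service discrepancy reporting system has to carry a separate disposition for each participating Service. The sketch below is a hypothetical illustration of such a record; the fields, Services, and dispositions shown are examples only and do not represent any actual reporting system.

```python
from dataclasses import dataclass, field

@dataclass
class Discrepancy:
    """A single discrepancy logged during multi-Service T&E (illustrative)."""
    report_id: str
    description: str
    reporting_service: str
    # Each participating Service records its own disposition of the discrepancy.
    dispositions: dict = field(default_factory=dict)

dr = Discrepancy(
    report_id="DR-0042",
    description="Radio fails to net with legacy ground terminals",
    reporting_service="Army",
)
dr.dispositions["Army"] = "Disqualifying - fix required before fielding"
dr.dispositions["Air Force"] = "Not disqualifying - legacy terminal not used"

for service, disposition in dr.dispositions.items():
    print(f"{dr.report_id} [{service}]: {disposition}")
```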
At the conclusion of a multi-Service T&E, each participating OT&E agency may prepare an Independent Evaluation Report (IER) in its own format and submit that report through its normal Service channels. The lead Service OT&E agency may prepare the documentation that goes forward to the Milestone Decision Authority (MDA). This documentation is coordinated with all participating OT&E agencies.

6.5 JOINT TEST AND EVALUATION

Joint Test and Evaluation (JT&E) is not the same as multi-Service T&E. JT&E is a specific program activity sponsored and largely funded by the Office of the Secretary of Defense (OSD), specifically the Director, Operational Test and Evaluation (DOT&E). JT&E programs are not acquisition-oriented but are a means of examining joint-Service tactics and doctrine. Past joint-test programs have been conducted to provide information required by the Congress, the OSD, the commanders of the Unified Commands, and the Services.


The overarching policies and details of formulating a JT&E program are set forth in DoD Directive (DoDD) 5010.41, Joint Test and Evaluation (JT&E) Program.11

Responsibility for management of the JT&E program was transferred from the Under Secretary of Defense for Acquisition, Technology, and Logistics (USD(AT&L)) DT&E office to the Office of the DOT&E in December 2002. Each year, nominations are submitted by Combatant Commands, DoD agencies, and Services to the JT&E Program Office within DOT&E for review and processing. The JT&E Program Office identifies those to be funded for a feasibility study, the DOT&E assigns lead Service responsibility for a Quick Reaction Test (6 to 12 months) or a full JT&E (up to three years), and a Joint Test Force Office is formed to conduct JT&E activities consistent with the chartered objectives from the participating Services. The lead and participating Services provide personnel, infrastructure support, and test resources for JT&E activities.

The JT&E program's primary objectives include: assess Service system interoperability in joint operations; evaluate joint technical and operational concepts and recommend improvements; validate testing methodologies that have joint applications; improve M&S validity with field exercise data; increase joint mission capability using quantitative data for analysis; provide feedback to the acquisition and joint operations communities; and improve joint Tactics, Techniques, and Procedures (TTP).

Each JT&E typically produces one or more test products that provide continuing benefits to DoD, such as improvements in: joint warfighting capabilities; joint tactics, techniques, and procedures; joint and individual Service training programs; new testing methodologies; joint application of M&S that can support acquisition, TTP, and Concept of Operations (CONOPS) development; and universal task list updates and input.

6.6 LIVE FIRE TESTING

The Live Fire Test (LFT) Program was mandated by the Congress in the National Defense Authorization Act (NDAA) for Fiscal Year 1987 (Public Law 99-661), passed in November 1986. Specifically, this law stipulated that a major [Acquisition Category (ACAT) I and II] program development may not proceed Beyond Low Rate Initial Production (BLRIP) until realistic survivability or (in the case of missiles and munitions) lethality testing has been completed.

In 1984, before the passage of this legislation, the OSD had chartered a joint test program designed to address similar questions relative to systems already in field use. This program, Joint LFT, was initially divided into two distinct parts: Armor/Anti-armor and Aircraft. The program's objectives are to:

• Gather empirical data on the vulnerability of existing U.S. systems to Soviet weapons;
• Gather empirical data on the lethality of existing U.S. weapons against Soviet systems;
• Provide insights into the design changes necessary to reduce vulnerabilities and improve lethalities of existing U.S. weapon systems;
• Calibrate current vulnerability and lethality models.

The legislated LFT Program complements the older Joint Live Fire (JLF) Program. While the JLF Program was designed to test systems that were fielded before being completely tested, the spirit and intent of the LFT legislation is to avoid the need to play "catch-up." This program requires the Services to test their weapon systems against the expected combat threat not only as early as possible, but also before FRP, to identify design characteristics that cause undue combat damage or to measure munitions lethality. Remedies for deficiencies can entail required retrofits, production stoppages, or other more time-consuming solutions.


The essential feature of live fire testing is that appropriate threat munitions are fired against a major U.S. system configured for combat to test its vulnerability, and/or that a major U.S. munition or missile is fired against a threat target configured for combat to test the lethality of the munition or missile.

Live Fire Test and Evaluation (LFT&E) Guidelines were first issued by the Deputy Director, T&E (Live Fire Testing) in May 1987 to supplement DoD Test and Evaluation Master Plan (TEMP) guidelines in areas pertaining to live fire testing.12 These guidelines encompass all Major Defense Acquisition Programs (MDAPs) and define LFT requirements. In 1994, Public Law 103-355 directed that oversight of Live Fire Testing be moved within DoD to the DOT&E. Guidelines for this program are now found in DoD Instruction (DoDI) 5000.2 and the Defense Acquisition Guidebook.

6.7 NUCLEAR, BIOLOGICAL, AND CHEMICAL (NBC) WEAPONS TESTING

The testing of NBC weapons is highly specialized and regulated. PMs involved in these areas are advised to consult authorities within their chain of command for the specific directives, instructions, and regulations that apply to their individual situations. Nuclear weapons tests are divided into categories in which the responsibilities of the Department of Energy (DOE), the Defense Nuclear Agency (DNA), and the military Services are clearly assigned. The DOE is responsible for nuclear warhead technical tests; the DNA is responsible for nuclear weapons effects tests. The Services are responsible for the testing of Service-developed components of nuclear subsystems. All nuclear tests are conducted within the provisions of the Limited Test Ban Treaty (LTBT), which generally restricts nuclear detonations to the underground environment. Nuclear weapons testing requires extensive coordination between Service and DOE test personnel.13

Since the United States signed and ratified the Geneva Protocol of 1925, U.S. policy has been never to be the first to use lethal chemical weapons; it may, however, retaliate with chemical weapons if so attacked. With the signing and ratification of the 1972 Biological and Toxin Weapons Convention, the United States formally adopted the position that it would not employ biological or toxin weapons under any circumstances. All such weapons were reported destroyed in the early 1970s.14

Regarding retaliatory capability against chemical weapons, the Service Secretaries are responsible for ensuring that their organizations establish requirements and determine the military characteristics of chemical deterrent items and chemical defense items. The Army has been designated the DoD executive agent for DoD chemical warfare research, development, and acquisition programs.15

United States policy on chemical warfare seeks to:

• Deter the use of chemical warfare weapons by other nations;
• Provide the capability to retaliate if deterrence fails;
• Achieve the early termination of chemical warfare at the lowest possible intensity.16
In addition to the customary development tests (conducted to determine if a weapon meets technical specifications) and OTs (conducted to determine if a weapon will be useful in combat), chemical weapons testing involves two types of chemical tests: chemical mixing and biotoxicity. Chemical-mixing tests are conducted to obtain information on the binary chemical reaction. Biotoxicity tests are performed to assess the potency of the agent generated. Chemical weapons testing, of necessity, relies heavily on the use of nontoxic simulants, since such substances are more economical and less hazardous, and open-air testing of live agents has been restricted since 1969.17


6.8 NUCLEAR HARDNESS AND SURVIVABILITY (NH&S) TESTING

Nuclear hardness is a quantitative description of the physical attributes of a system or component that will allow it to survive in a given nuclear environment. Nuclear survivability is the capability of a system to survive in a nuclear environment and to accomplish a mission. DoD policy requires the incorporation of NH&S features in the design, acquisition, and operation of major and nonmajor systems that must perform critical missions in nuclear conflicts. Nuclear hardness levels must be quantified and validated.18

The T&E techniques used to assess nuclear hardness and survivability include nuclear testing, physical testing in a simulated environment, modeling, simulation, and analysis. Although nuclear tests provide a high degree of fidelity and valid results for survivability evaluation, they are not practical for most systems due to cost, long lead times, and international treaty constraints. Underground testing is available only on a prioritized basis for critical equipment and components, and is subject to a frequently changing test schedule. Physical testing provides an opportunity to observe personnel and equipment in a simulated nuclear environment. Modeling, simulation, and analysis are particularly useful in the early stages of development to provide early projections before system hardware is available. These methods are also used to furnish assessments in areas that, because of safety or testing limitations, cannot be directly observed through nuclear or physical testing.

6.9 SUMMARY

T&E is a spectrum of techniques used to address questions about critical performance parameters during system development. These questions may involve several issues, including: technical performance and survivability (development testing); effectiveness and suitability (operational testing); issues affecting more than one Service (multi-Service and joint testing); vulnerability and lethality (LFT); nuclear survivability; and the use of other than conventional weapons (i.e., NBC).


ENDNOTES

1. Defense Acquisition Guidebook, Chapter 3, Test and Evaluation. http://akss.dau.mil/DAG.
2. DoDI 5000.2, Operation of the Defense Acquisition System, May 12, 2003.
3. Air Force Manual (AFM) 55-43, Management of Operational Test and Evaluation, June 1979.
4. Title 10, U.S.C., Section 139, Director of Operational Test and Evaluation, and Section 2399, Operational Test and Evaluation of Defense Acquisition Programs, March 18, 2004.
5. Chairman of the Joint Chiefs of Staff Instruction (CJCSI) 3170.01D, Joint Capabilities Integration and Development System (JCIDS), March 12, 2004.
6. Ibid.
7. Defense Acquisition University Press, Glossary of Defense Acquisition Acronyms & Terms, September 2003.
8. DoDI 5000.2.
9. Defense Acquisition University Press.
10. Ibid.
11. DoDD 5010.41, Joint Test and Evaluation (JT&E) Program, February 23, 1998.
12. Statement by Mr. James F. O'Bryon, Assistant Deputy Under Secretary of Defense (Live Fire Test), before the Acquisition Policy Panel of the House Armed Services Committee, September 10, 1987.
13. DoDI 5030.55, Joint AEC-DoD Nuclear Weapons Development Procedures, January 25, 2001.
14. DoDD 5160.5, Responsibilities for Research, Development, and Acquisition of Chemical Weapons and Chemical and Biological Defense, May 1, 1985.
15. Ibid.
16. Ibid.
17. Ibid.
18. DoDI 4245.4, Acquisition of Nuclear-Survivable Systems, July 25, 1988. Cancelled.


MODULE II
DEVELOPMENT TEST AND EVALUATION

Materiel acquisition is an iterative process of designing, building, testing, identifying deficiencies, fixing, retesting, and repeating. Development Test and Evaluation (DT&E) is an important aspect of this process. DT&E is performed in the factory, in the laboratory, and on the proving ground. It is conducted by subcontractors as they develop the components and subassemblies; by the prime contractor as the components are assembled and integration of the system is ensured; and by the government to demonstrate how well the weapon system meets its technical and operational requirements. This module describes development testing and the various types of activities it involves. The module also discusses how development testing is used to support the technical review process.


7

INTRODUCTION TO DEVELOPMENT TEST AND EVALUATION

7.1 INTRODUCTION

Development Test and Evaluation (DT&E) is conducted to demonstrate that the engineering design and development process is complete. It is used by the contractor to reduce risk, validate and qualify the design, and ensure that the product is ready for government acceptance. The DT&E results are evaluated to ensure that design risks have been minimized and that the system will meet specifications. The results are also used to estimate the system's military utility when it is introduced into service. DT&E serves a critical purpose in reducing the risks of development by testing selected high-risk components or subsystems. Finally, DT&E is the tool the government developing agency uses to confirm that the system performs as technically specified and that the system is ready for field testing. This chapter provides a general discussion of contractor and government DT&E activities, stresses the need for an integrated test program, describes some special-purpose Development Tests (DTs), and discusses several factors that may influence the extent and scope of the DT&E program.

7.2 DT&E AND THE SYSTEM ACQUISITION CYCLE

As illustrated in Figure 7-1, DT&E is conducted throughout the system life cycle. DT&E may begin before program initiation with the evaluation of evolving technology, and it continues after the system is in the field.

7.2.1 DT&E Prior to Program Initiation

Prior to program initiation, modeling, simulation, and technology feasibility testing are conducted to confirm that the technology considered for the proposed weapon development is the most advanced available and that it is technically feasible.

7.2.2 DT&E During Concept Refinement (CR) and Technology Development (TD)

Development testing in these phases is conducted by a contractor or the government to assist in selecting preferred alternative system concepts, technologies, and designs. The testing conducted depends on the technical maturity of the test article's design. Government test evaluators participate in this testing because the information obtained can be used to support a System Requirements Review (SRR), early test planning in the Technology Development Strategy (TDS), and development of the Capability Development Document (CDD) and the Request for Proposal (RFP) for Milestone B. The information obtained from these tests may also be used to support a program start decision by the Services or the Office of the Secretary of Defense (OSD).

7.2.3 DT&E During System Development and Demonstration (SDD)

Development testing is used to demonstrate that: technical risk areas have been identified and can be reduced to acceptable levels; the best technical approach can be identified; and, from this point on, engineering rather than experimental efforts will be required. It supports the decision review that considers transition from the prototype design into advanced engineering and construction of the Engineering Development Model (EDM).


This DT&E includes contractor/government integrated testing, engineering design testing, and advanced development verification testing.

Figure 7-1. Contractor's Integrated Test Requirements. (The figure shows subcontractor, prime contractor, and government test activities, from subsystem development, qualification, and acceptance through system qualification, acceptance, manufacturing screening, and final DT, together with government DT assessments, government-conducted tests, and government participation in contractor-conducted tests leading into IOT&E and FOT&E. Source: Adapted and modified from "Solving the Risk Equation in Transitioning from Development to Production," January 19, 1984.)

Development testing during systems integration is most often conducted at the contractor's facility. It is conducted on components, subsystems, brassboard configurations, or advanced development prototypes to evaluate the potential application of technology and related design approaches before system demonstration. Component interface problems and equipment performance capabilities are evaluated. The use of properly validated analysis and Modeling and Simulation (M&S) is encouraged, especially during the early phases, to assess those areas that, because of safety or test capability limitations, cannot be observed directly through testing. M&S can provide early projections of system performance, effectiveness, and suitability, and can reduce testing costs. This Test and Evaluation (T&E) may also include initial environmental assessments.

Army testing of the Advanced Attack Helicopter (AAH) provides an example of the type of activities that occur during development testing/technical testing. The early DT&E of the AAH was conducted by the Army Engineering Flight Activity (EFA). The test was conducted in conjunction with an Early Operational Assessment (EOA), and candidate designs were flown more


than 90 hours to evaluate flight handling qualities and aircraft performance. This test also included firing of the 30 millimeter cannon and the 2.75-inch rockets. Reliability, Availability, and Maintainability (RAM) data were obtained throughout the test program; these data, along with RAM data provided from early contractor testing, became part of the system's RAM database. After evaluating the results, the Army selected a contractor to proceed with the next development phase of the AAH.

DT&E conducted during system demonstration provides the final technical data for determining a system's readiness to transition into Low Rate Initial Production (LRIP). It is conducted using advanced EDMs and is characterized by engineering and scientific approaches under controlled conditions. The qualification testing provides quantitative and qualitative data for use in the system's evaluation. The evaluation results are used by the development community and are also provided to Service and OSD decision authorities. These tests measure technical performance including effectiveness, RAM, compatibility, interoperability, safety, and supportability. They include tests of the human engineering and technical aspects of the system. Demonstrations indicating that engineering is reasonably complete and that solutions to all significant design problems are in hand are also included.

7.2.4 DT&E During Production and Deployment (P&D)

7.2.4.1 Low Rate Initial Production

DT&E may be conducted on EDMs or LRIP articles as a prelude to certifying the system ready for Initial Operational Test and Evaluation (IOT&E). Each Service has different and specific processes incorporated in the certification for IOT&E documentation. The Navy conducts additional DT&E for certification called TECHEVAL (Technical Evaluation). This is a DT&E event controlled by the program office that is conducted in a more operationally realistic test environment. The Air Force has developed a guide with a structured process using templates to assist the Program Manager (PM) in assessing the program's readiness for IOT&E.

As an example of testing done during this phase, the Army AAH was flown in a series of Engineering Design Tests (EDTs). EDT-1, -2, and -4 were flown at the contractor's facility. (The EDT-3 requirement was deleted during program restructuring.) The objectives of these flight tests were to evaluate the handling characteristics of the aircraft, check significant performance parameters, and confirm the correction of deficiencies noted during earlier testing. EDT-5 was conducted at an Army test facility, Yuma Proving Ground, Arizona. The objectives of this test were the same as those of the earlier EDTs; however, the testers were required to ensure that all discrepancies were resolved before the aircraft entered operational testing. During the EDTs, Operational Test (OT) personnel were completing the OT design, bringing together test resources, and observing the DT&E tests.
Additionally, OT personnel were compiling test data, such as the system contractor's test results, from other sources. The evolving DT results and contractor data were made available to the Critical Design Review (CDR) members to ensure that each configuration item design was essentially complete. The Army conducted a Physical Configuration Audit (PCA) to provide a technical examination verifying that each item "as built" conformed to the technical documentation defining that item.

7.2.4.2 Post Full Rate Production Decision Review (FRPDR)

Development testing may be necessary after the Full-Rate Production (FRP) decision is made. This testing is normally tailored to verify correction of identified design problems and to demonstrate the system modification's readiness for production. It is conducted under controlled conditions and provides quantitative and qualitative data.


This testing is conducted on production items delivered from either the pilot or initial production runs. To ensure that the items are produced according to contract specification, limited-quantity production sampling processes are used. This testing determines whether the system has successfully transitioned from engineering development prototype to production, and whether it meets design specifications.

7.2.5 Operations and Support (O&S)

The DT that occurs soon after Initial Operating Capability (IOC), or initial deployment, assesses the deployed system's operational readiness and supportability. It ensures that all deficiencies noted during previous testing have been corrected, evaluates proposed Product Improvements (PIs) and block upgrades, and ensures that Integrated Logistics Support (ILS) is complete. It also evaluates the resources on hand and determines whether the plans to ensure operational phase readiness and support objectives are sufficient to maintain the system for the remainder of its acquisition life cycle. For mature systems, DT&E is performed to assist in modifying the system to meet new threats, add new technologies, or extend service life.

Once a system approaches the end of its usefulness, the DT conducted is concerned with monitoring the system's current state of operational effectiveness, suitability, and readiness to determine whether major upgrades are necessary or whether deficiencies warrant consideration of a replacement system. These tests are normally conducted by the operational testing community; however, the DT&E community may be required to assess the technical aspects of the system.

7.3 DT&E RESPONSIBILITIES

As illustrated in Figure 7-1, the primary participants in testing are the prime contractor, subcontractors, and the Service materiel developer or developing agency; the Operational Test and Evaluation (OT&E) agency acts as an observer. In some Services, there are also independent evaluation organizations that assist the testing organization in designing and evaluating development tests. As the figure shows, system development testing is performed principally by contractors during the early development stages of the acquisition cycle and by government test and evaluation organizations during the later phases.

Army testing of the AAH illustrates the type of DT performed by contractors and the relationship of this type of testing to government DT&E activities. During the contractor competitive testing of the Army AAH, prime contractor and subcontractor testing included design support tests, testing of individual components, establishing fatigue limits, and bench testing of dynamic components to demonstrate sufficient structural integrity to conduct the Army competitive flight test program. Complete dynamic system testing was conducted utilizing ground test vehicles.
Besides supporting the contractor's development effort, these tests provided information for the Army technical review process as the system-level reviews, Preliminary Design Reviews (PDRs), and Critical Design Reviews (CDRs) were conducted.

Following successful completion of the ground test vehicle qualification testing, first flights were conducted on the two types of competing helicopters. Each aircraft was flown 300 hours before delivery of two of each competing aircraft to the Army. The contractor flight testing was oriented toward flight-envelope development, demonstration of structural integrity, and evaluation and verification of aircraft flight handling qualities. Some weapons system testing was conducted during this phase. Government testers used much of the contractor's test data to develop the test data matrices as part of the government's DT and OT planning efforts. The use of contractor test data reduced the testing required by the government and added validity to the systems already tested and to data received from other sources.


7.3.1 Contractor Testing

Materiel DT&E is an iterative process in which a contractor designs hardware and software, evaluates performance, makes necessary changes, and retests for performance and technical compliance. Contractor testing plays a primary role in the total test program, and the results of contractor tests are useful to the government evaluator in supporting government test objectives. It is important that government evaluators oversee contractor system tests and use the test data from them to address government testing issues. It is not uncommon for contractor testing to be conducted at government test facilities, since contractors often do not have the required specialized facilities (e.g., for testing hazardous components or for missile flight tests). This enables government evaluators to monitor the tests more readily and increases government confidence in the test results.

Normally, an RFP requires that the winning contractor submit an Integrated Engineering Design Test Plan within a short period after contract initiation for coordination with government test agencies and approval. This test plan should include testing required by the Statement of Work (SOW) and specifications, as well as testing expected as part of the engineering development and integration process. When approved, the contractor's test program automatically becomes part of the development agency's Integrated Test Plan (ITP). If the contractor has misinterpreted the RFP requirements and the Integrated Engineering Design Test Plan does not satisfy government test objectives, the iterative process of amending the contractor's test program begins. This iterative process must be accomplished within limited bounds so the contractor can meet the test objectives without significant effects on contract cost, schedule, or scope.

7.3.2 Government Testing

Government testing is performed to demonstrate how well the materiel system meets its technical compliance requirements; provide data to assess developmental risk for decision making; verify that the technical and support problems identified in previous testing have been corrected; and ensure that all critical issues to be resolved by testing have been adequately considered. All previous testing, from the contractor's bench testing through development agency testing of representative prototypes, is considered during government evaluation.

Government materiel development organizations include major materiel acquisition commands and, in some cases, operational commands. The materiel acquisition commands have T&E organizations that conduct government development testing. In addition to monitoring and participating in contractor testing, these organizations conduct development testing on selected high-concern areas to evaluate the adequacy of systems engineering, design, development, and performance to specifications.
The Program Management Office (PMO) must be involved in all stages of testing that these organizations perform.

In turn, the materiel development/T&E agencies conduct T&E of systems in the development stage to ensure they meet technical and operational requirements. These organizations operate government proving grounds, test facilities, and laboratories. They must be responsive to the needs of the PM by providing test facilities, personnel, and data acquisition services, as required.

7.4 TEST PROGRAM INTEGRATION

During the development of a weapon system, a number of tests are conducted by subcontractors, the prime contractor, and the government. To ensure these tests are properly timed, that adequate resources are available, and that unnecessary testing is minimized, a coordinated test program must be developed and followed.


The Test and Evaluation Master Plan (TEMP) normally does not provide a sufficient level of detail concerning contractor or subcontractor tests. A contractor or PMO ITP must also be developed to describe these tests. The PM is responsible for coordinating the total T&E program. The PM performs this task with the assistance of the T&E Integrated Product Team (IPT), whose members are assembled from the development agency, user, technical and operational T&E, logistics, and training organizations. The PM must remain active in all aspects of testing, including planning, funding, resourcing, execution, and reporting. The PM plays an important role as the interface between the contractor and the government testing community. Recent emphasis on early T&E highlights a need for early government tester involvement in contractor testing. For example, during the AAH test program, it was found that having program management personnel at the test sites improved test continuity, facilitated the flow of spare and repair parts, provided a method of monitoring contractor performance, and kept the Service headquarters informed with timely status reports.

7.4.1 Integrated Test Plan

The ITP is used to record the individual test plans for the subcontractor, prime contractor, and government. The prime contractor should be contractually responsible for preparing and updating the ITP, and the contractor and Service developing agency should ensure that it remains current. The ITP includes all developmental tests that will be performed by the prime contractor and the subcontractors at both the system and subsystem levels. It is a detailed, working-level document that assists in identifying risk as well as duplicative or missing test activities. A well-maintained ITP facilitates the most efficient use of test resources.

7.4.2 Single Integrated Test Policy

Most Services have adopted a single integrated contractor/government test policy, thereby reducing much of the government testing requirement. This policy stresses independent government evaluation and permits an evaluator to monitor contractor and government test programs and evaluate the system from an independent perspective. The policy stresses the use of data from all sources for system evaluation.

7.5 AREAS OF DT&E FOCUS

7.5.1 Life Testing

Life testing is performed to assess the effects of long-term exposure to various portions of the anticipated environment. These tests are used to ensure the system will not fail prematurely due to metal fatigue, component aging, or other problems caused by long-term exposure to environmental effects. It is important that the requirements for life testing are identified early and integrated into the system test plan. Life tests are time-consuming and costly; therefore, life testing requirements and life characteristics must be carefully analyzed concurrently with the initial test design. Aging failure data must be collected early and analyzed throughout the testing cycle. If life characteristics are ignored until the test results are available, extensive redesign and project delays may be required. Accelerated life testing techniques are available and may be used whenever applicable.
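Accelerated life results must eventually be related back to expected field conditions. As a minimal illustration only, the following Python sketch applies the commonly used Arrhenius temperature-acceleration model to convert stress-test hours into equivalent field hours; the activation energy, temperatures, and test duration are notional assumptions, not values from any program or specification.

    import math

    BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

    def arrhenius_acceleration_factor(use_temp_c, stress_temp_c, activation_energy_ev=0.7):
        """Acceleration factor between a stress (test) temperature and the use temperature."""
        t_use = use_temp_c + 273.15      # convert to kelvin
        t_stress = stress_temp_c + 273.15
        return math.exp((activation_energy_ev / BOLTZMANN_EV) * (1 / t_use - 1 / t_stress))

    # Notional example: 1,000 hours of testing at 85 C for equipment fielded at 30 C
    af = arrhenius_acceleration_factor(use_temp_c=30, stress_temp_c=85)
    equivalent_field_hours = 1_000 * af
    print(f"Acceleration factor: {af:.1f}")
    print(f"1,000 stress-hours correspond to roughly {equivalent_field_hours:,.0f} field hours")

Analogous scaling arguments, using different stress models, apply to vibration, humidity, and thermal-cycling exposures.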
7.5.2 Design Evaluation/Verification Testing

Design evaluation and verification testing are conducted by the contractor and/or the development agency with the primary objective of influencing system design. Design evaluation is fully integrated into the DT cycle. Its purposes are to:

(1) Determine if critical system technical characteristics are achievable;

(2) Provide data for refining the hardware and making it more rugged to comply with technical specification requirements;

(3) Eliminate as many technical and design risks as possible, or determine the extent to which they are manageable;

(4) Provide for evolution of the design and verification of the adequacy of design changes;

(5) Provide information in support of development efforts;

(6) Ensure components, subsystems, and systems are adequately developed before beginning OTs.


7.5.3 Design Limit Testing

Design Limit Tests (DLTs) are integrated into the test program to ensure the system will provide adequate performance when operated at its outer performance limits and when exposed to the environmental conditions expected at the extremes of the operating envelope. The tests are based on mission profile data. Care must be taken to ensure all systems and subsystems are exposed to the worst-case environments, with adjustments made for stress amplification factors and cooling problems. Care must also be taken to ensure that the system is not operated beyond its specified design limits; for example, an aircraft component may have to be tested at temperature extremes ranging from an Arctic environment to a desert environment.

7.5.4 Reliability Development Testing (RDT)

RDT, or Reliability Growth Testing (RGT), is a planned Test, Analyze, Fix, and Test (TAFT) process in which development items are tested under actual or simulated mission-profile environments to disclose design deficiencies and to provide engineering information on failure modes and mechanisms. The purpose of RDT is to provide a basis for early incorporation of corrective actions and verification of their effectiveness in improving the reliability of equipment. RDT is conducted under controlled conditions with simulated operational mission and environmental profiles to determine design and manufacturing process weaknesses. The RDT process emphasizes reliability growth rather than a numerical measurement. Reliability growth during RDT is the result of an iterative design process: as failures occur, the problems are identified, solutions are proposed, the redesign is accomplished, and the RDT continues. A substantial reliability growth TAFT effort was conducted during the F-18 DT&E for selected avionics and mechanical systems. Although the TAFT effort added $100 million to the Research, Development, Test and Evaluation (RDT&E) Program, it is estimated that many times that amount will be saved through lower operational and maintenance costs over the system's life.
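Reliability growth achieved through a TAFT program is commonly tracked against a growth curve. The sketch below, offered only as an illustration, uses the classic Duane postulate (cumulative MTBF growing as a power of cumulative test time) to project MTBF at several points in a notional test program; the initial MTBF, growth rate, and test hours are assumed values, not data from the F-18 or any other program.

    def duane_mtbf(cumulative_test_hours, initial_mtbf, initial_hours, growth_rate):
        """Cumulative MTBF under the Duane postulate: MTBF_c(T) = MTBF_1 * (T / T_1) ** alpha."""
        return initial_mtbf * (cumulative_test_hours / initial_hours) ** growth_rate

    # Notional program: 40-hour MTBF observed after the first 100 test hours, with an
    # assumed growth rate (alpha) of 0.3 sustained by aggressive corrective action.
    for hours in (100, 500, 1_000, 5_000):
        mtbf = duane_mtbf(hours, initial_mtbf=40, initial_hours=100, growth_rate=0.3)
        print(f"{hours:>5} cumulative test hours -> projected cumulative MTBF ~ {mtbf:5.1f} hours")
    # Under this model the instantaneous MTBF is the cumulative MTBF divided by (1 - alpha).

Plotting observed cumulative MTBF against such a projected curve gives the test manager an early indication of whether corrective actions are producing the planned growth.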
7.5.5 Reliability, Availability, and Maintainability

The RAM requirements are assessed during all contractor and government testing. The data are collected from each test event and placed in a RAM database, which is managed by the development agency. Contractor and government DTs provide a common measure of the system's RAM performance against stated specifications in a controlled environment. The primary emphasis of RAM data collection during DT is to provide an assessment of the growth of the system's RAM parameters and a basis for assessing the consequences of any differences anticipated during field operations. Early projections of RAM are important to logistics support planners. The test data facilitate determination of spares quantities, maintenance procedures, and support equipment.
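To illustrate how DT-derived RAM figures feed such planning, the short sketch below computes inherent availability from MTBF and MTTR and makes a first-cut estimate of expected fleet failures, one driver of spares quantities; all numbers are notional assumptions used only for illustration.

    def inherent_availability(mtbf_hours, mttr_hours):
        """Inherent (design) availability: Ai = MTBF / (MTBF + MTTR)."""
        return mtbf_hours / (mtbf_hours + mttr_hours)

    def expected_failures(fleet_size, operating_hours_per_system, mtbf_hours):
        """Expected failures across the fleet over the stated operating period."""
        return fleet_size * operating_hours_per_system / mtbf_hours

    # Notional DT results for a line-replaceable unit
    mtbf, mttr = 250.0, 4.0  # hours
    ai = inherent_availability(mtbf, mttr)
    failures = expected_failures(fleet_size=100, operating_hours_per_system=600, mtbf_hours=mtbf)

    print(f"Inherent availability: {ai:.3f}")                    # about 0.984
    print(f"Expected fleet failures over the period: {failures:.0f}")  # about 240, a first cut at spares demand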


7.6 SYSTEM DESIGN FOR TESTING

Built-in Test (BIT), Built-in Test Equipment (BITE), and Automated Test Equipment (ATE) are major areas that must be considered from the start of the design effort. Design for testing (Figure 7-2) addresses the need to: (1) collect data during the development process concerning particular performance characteristics; (2) enable efficient and economical production by providing ready access to and measurement of appropriate acceptance parameters; and (3) enable rapid and accurate assessment of the status of the product down to the lowest repairable element when deployed.

Figure 7-2. Design for Testing Procedures. (The figure depicts an iterative flow: define testability objectives such as isolation, accessibility, observability, human factors, completeness, standardization of and interface with test equipment, fault coverage, reliability, and operation under failure; select testability techniques to influence the design, such as built-in test, end-to-end testing, test points and control points, partitioning, and initialization; conduct trade-off studies; check system testability through testability demonstrations and analysis; and implement testability once the needs are satisfied.)

Many hardware systems have testing circuits designed and built in. This early planning by design engineers allows easy testing for fault isolation of circuits, both in the system development phases and during operational testing and deployment. There are computer chips in which more than half of the circuits serve test and circuit-check functions. This type of circuit design requires early planning by the PM to ensure the RFP includes the requirement for designed-in/BIT capability. Evaluation of these BIT/BITE/ATE systems must be included in the test program.
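As a simple illustration of how BIT performance might be scored during DT, the sketch below computes fault-detection and fault-isolation coverage from notional fault-insertion results and compares them against an assumed testability requirement; the data, field names, and threshold are illustrative only and do not come from any program.

    # Notional fault-insertion results: (fault_id, detected_by_bit, isolated_to_lowest_repairable_element)
    FAULT_INSERTION_RESULTS = [
        ("F-001", True, True),
        ("F-002", True, False),
        ("F-003", False, False),
        ("F-004", True, True),
        ("F-005", True, True),
    ]

    def bit_coverage(results):
        """Return (detection_coverage, isolation_coverage) as fractions of inserted faults."""
        total = len(results)
        detected = sum(1 for _, det, _ in results if det)
        isolated = sum(1 for _, det, iso in results if det and iso)
        return detected / total, isolated / total

    detection, isolation = bit_coverage(FAULT_INSERTION_RESULTS)
    print(f"BIT detection coverage: {detection:.0%}")   # 80% for the notional data
    print(f"Fault isolation coverage: {isolation:.0%}")  # 60% for the notional data

    REQUIRED_DETECTION = 0.90  # assumed testability requirement
    print("Meets requirement" if detection >= REQUIRED_DETECTION else "Below requirement")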


7.7 IMPACT OF WARRANTIES ON T&E

A warranty or guarantee is a commitment provided by a supplier to deliver a product that meets specified standards for a specified time. With a properly structured warranty, the contractor must meet technical and operational requirements. If the product fails during the warranty period, the contractor must replace or repair it at no additional cost to the government. The Defense Appropriations Act of 1984 requires warranties or guarantees on all weapon systems procurement. This Act makes warranties a standard item on most fixed-price production contracts. Incentives are the main thrust of warranties, and the government will perform a reliability demonstration test on the system to determine these incentives. Although warranties offer advantages to the government during the early years of the contract, they do not affect the types of testing performed to ensure the system meets technical specifications and satisfies operational effectiveness and suitability requirements. Warranties do, however, affect the amount of testing required to establish reliability. Because the standard item is warranted, less emphasis on that portion of the item can allow additional emphasis on other aspects of the item not covered under the warranty. Further, the government may tend to have more confidence in contractor test results and may therefore be able to avoid some duplication of test effort. The warranty essentially shifts the burden of risk from the government to the contractor. Warranties can significantly increase the price of the contract, especially if high-cost components are involved.

7.8 DT&E OF LIMITED PROCUREMENT QUANTITY PROGRAMS

Programs that involve the procurement of relatively few items, such as satellites, some large missiles, and unique intelligence equipment, typically over an extended period, are normally subjected to modified DT&E. Occasionally, a unique test approach that deviates from the standard timing and reporting schedule will be used. The DT&E principle of iterative testing starting with components, subsystems, prototypes, and first-production models of the system is normally applied to limited procurements. It is important that DT&E and OT&E organizations work together to ensure that integrated T&E plans are adapted and tailored to the overall acquisition strategy.

7.9 SUMMARY

DT&E is an iterative process of designing, building, testing, identifying deficiencies, fixing, retesting, and repeating. It is performed in the factory, in the laboratory, and on the proving ground by the contractors and the government. Contractor and government testing is combined into one integrated test program and conducted to determine whether the performance requirements have been met and to provide data to the decision authority.




8

DT&E SUPPORT OF TECHNICAL REVIEWS AND MILESTONE DECISIONS

8.1 INTRODUCTION

Throughout the acquisition process, Development Test and Evaluation (DT&E) is oriented toward the demonstration of specifications, showing the completeness and adequacy of systems engineering, design, development, and performance. A critical purpose of DT&E is to identify the risks of development by testing and evaluating selected high-risk components or subsystems. DT&E is the developer's tool to show that the system performs as specified or that deficiencies have been corrected and the system is ready for operational testing and fielding (Figure 8-1). The DT&E results are used throughout the Systems Engineering Process (SEP) to provide valuable data in support of formal design reviews. This chapter describes the relationship of testing to the Formal Design Reviews (FDRs) essential to the SEP.

Figure 8-1. Relationship of DT&E to the Acquisition Process. (The figure maps DT&E activities, including technology feasibility studies and analysis, simulation, prototype DT&E, qualification and technical compliance testing, and production acceptance DT&E, to the acquisition phases and decision points from Milestone A through Milestones B and C, the FRPDR, and operations, support, and modifications.)

8.2 DT&E AND THE REVIEW PROCESS

8.2.1 The Technical Review Process

Technical reviews and audits are conducted by the government and the contractor as part of the SEP to ensure the design meets the system, subsystem, and software specifications. Each review is unique in its timing and orientation. Some reviews build on previous reviews and take the design and testing effort one step closer to the final system design that satisfies the operational concept and purpose for the weapon system. Table 8-1 illustrates the sequencing of the technical reviews in relation to the Test and Evaluation (T&E) phases.


Table 8-1. Technical Reviews and Audits

• Initial Technical Review (ITR): early Concept Refinement. Purpose: evaluate the initial program technical baseline, assess initial program cost estimates, and assess technical and cost risks. Documentation: Cost Analysis Requirements Description (CARD)-like document.
• Alternative System Review (ASR): late Concept Refinement (pre-Milestone A). Purpose: assess requirements against customer needs and assess the readiness of the preferred alternative to enter Technology Development. Documentation: trade studies, preliminary system specification, input to the TDS, input to the AoA, Systems Engineering Plan.
• System Requirements Review (SRR): early Systems Integration. Purpose: evaluate system functional requirements. Documentation: preliminary performance specification; preliminary planning documentation; FFBD, RAS, MBN analysis.
• System Functional Review (SFR): early Systems Integration. Purpose: evaluate the system design, validate the system specification, and establish the system-level functional baseline. Documentation: performance specification, preliminary item (performance) specifications, design documents, RAS, TLS.
• Software Specification Review (SSR): mid Systems Integration. Purpose: evaluate software performance requirements, validate software specifications, and establish the software specification baseline. Documentation: software specifications (SRS and IRS), operations concept document.
• Preliminary Design Review (PDR): late Systems Integration or early Systems Demonstration. Purpose: validate item (performance) specifications, establish the hardware allocated baseline, and evaluate the preliminary hardware and software design. Documentation: item (performance) specification; design documents and test plan; ICD, engineering drawings; preliminary SDD and IDD.
• Critical Design Review (CDR): early Systems Demonstration. Purpose: evaluate CI design and determine readiness for fabrication of the EDM. Documentation: preliminary item (detail), process, and material specifications; detail design documents, including SDD and IDD.
• Test Readiness Review (TRR): mid/late Systems Demonstration. Purpose: approve software test procedures and determine readiness for formal test. Documentation: software test plan/procedures, informal software test results.
• Functional Configuration Audit (FCA): late Systems Demonstration or early Production and Deployment. Purpose: verify that CI actual performance complies with the hardware development specification or the SRS and IRS. Documentation: test plans and descriptions, software test reports.
• Formal Qualification Review (FQR): late LRIP. Purpose: verify that CIs perform in the system environment. Documentation: test reports, specifications, O&S documents.
• Production Readiness Review (PRR): incremental, during Systems Demonstration. Purpose: assess risk for the production go-ahead. Documentation: production planning documents.
• Physical Configuration Audit (PCA): LRIP/early FRP. Purpose: formal examination of the as-built configuration. Documentation: final item (detail) specification, listings, Level II and III drawings.
• In-Service Review (ISR): after the system is in operational use (post-FRPDR). Purpose: assess system operations in field conditions and assess support of the fielded system. Documentation: system hazard risk assessment, operational readiness assessment, discrepancy reports.

SOURCE: Defense Acquisition Guidebook, September 2004.


The review process was established to ensure that the system under development will meet government requirements. The reviews evaluate data from contractor and government testing, engineering analysis, and models to determine if the system or its components will eventually meet all functional and physical specifications, and to determine the final system design (Figure 8-2). The system specification is very important in this process. It is the document used as a benchmark to compare contractor progress in designing and developing the desired product. Guidelines for these formal technical reviews and audits can be found in Processes for Engineering a System and Application and Management of the Systems Engineering Process.1

Figure 8-2. Technical Review Timeline. (The figure arrays the technical reviews, ITR, ASR, SRR, IBR, SFR, PDR, CDR, TRR, SVR (FCA)/PRR, OTRR, PCA, and ISR, along with the Technology Readiness Assessments and program reviews, across the acquisition timeline from Concept Refinement and Technology Development through System Development and Demonstration, Production and Deployment, and Operations and Support. Source: Skalamera, Robert J., "Implementing OSD Systems Engineering Policy," briefing to the USD(AT&L), November 2004.)

8.2.2 Testing in Support of Technical Reviews

The testing community must be continually involved in supporting the technical reviews of their systems. The timing of these events will vary somewhat from program to program, based upon the explicit and unique needs of the acquisition strategy. Lower-level design reviews lead to system-level reviews. Decisions made at these reviews have major impacts on the system test design, the resources required to test, and the development of the Test and Evaluation Master Plan (TEMP) and other documentation. A more detailed discussion of testing to support the technical reviews is provided in the Systems Engineering Fundamentals Guide.2 The reviews focus primarily on the government technical specifications for the system. Figure 8-3 illustrates the program specifications and how they are developed over the system life cycle.

Figure 8-3. Specifications Summary:
• System/segment specification (old Type A): prepared early in Systems Integration by the development/program manager with industry; approved by the development/program manager and user. Defines mission and technical requirements, allocates requirements to functional areas, documents design constraints, and defines interfaces. Establishes the functional baseline.
• Item performance specification (old Type B): prepared in mid System Demonstration by industry; approved by the development/program manager. Details design requirements, describes the performance characteristics of each CI, and differentiates requirements according to complexity and discipline sets. Establishes the allocated ("design to") baseline.
• Item detail specification (old Type C): prepared late in System Demonstration by industry; approved by the development/program manager. Defines form, fit, function, performance, and test requirements for acceptance. Part of the product baseline.
• Process specification (old Type D): prepared late in System Demonstration by industry; approved by the development/program manager. Defines processes performed during fabrication. Part of the product baseline.
• Material specification (old Type E): prepared late in System Demonstration by industry; approved by the development/program manager. Defines production of raw or semi-fabricated material used in fabrication. Part of the product baseline.

8.2.3 Design Reviews and Audits

8.2.3.1 Concept Refinement (CR) and Technology Development (TD)

The Alternative Systems Review (ASR) is conducted to demonstrate the preferred system concept(s). Technical reviews conducted during the early stages of development tend to focus on top-level concepts and system definitions, clarifying the requirements on which subsequent design and development activities will be based.


8.2.3.2 System Development and Demonstration (SDD)

The System Requirements Review (SRR) is normally conducted late in the system concept evaluation or shortly after program initiation. It consists of a review of the system/system segment specifications, previously known as the "A" specifications,3 and is conducted after the accomplishment of functional analysis and preliminary requirements allocation. During this review, the systems engineering management activity and its output are reviewed for responsiveness to the Statement of Work (SOW) requirements. The primary function of the SRR is to ensure that the system's requirements have been completely and properly identified and that there is a mutual understanding between the contractor and the government. During the review, the contractor describes progress and any problems in risk identification and ranking, risk avoidance and reduction, trade-off analysis, producibility and manufacturing considerations, and hazards considerations. The results of integrated test planning are reviewed to ensure the adequacy of planning to assess the design and to identify risks. Issues of testability of requirements should be discussed.

The System Functional Review (SFR) is conducted as a final review before submittal of the prototype design products. The system specification is validated to ensure that the most current specification is included in the system functional baseline and that it is adequate and cost effective to satisfy validated mission requirements. The SFR encompasses the total system requirement of operations, maintenance, test, training, computers, facilities, personnel, and logistics considerations. A technical understanding should be reached during this review on the validity and degree of completeness of the specifications, design, Operational Concept Documentation (OCD), Software Requirements Specifications (SRSs), and Interface Requirements Specifications (IRSs).


The Software Specification Review (SSR) is a formal review of the Computer Software Configuration Item (CSCI) requirements, normally held after the SFR but before the start of CSCI preliminary design. Its purpose is to validate the allocated baseline for preliminary CSCI design by demonstrating to the government the adequacy of the SRSs, IRSs, and OCDs.

The Preliminary Design Review (PDR) is a formal technical review of the basic design approach for a Configuration Item (CI). It is conducted at the CI and system level early in the SDD phase to confirm that the preliminary design logically follows the SFR findings and meets the system requirements. The review results in an approval to begin the detailed design. The draft item (performance) specifications are reviewed during the PDR. The purpose of the PDR is to: evaluate the progress, technical adequacy, and risk resolution (on a technical, cost, and schedule basis) of the CI design approach; review Development Test (DT) and Operational Test (OT) activities to measure the performance of each CI; and establish the existence and compatibility of the physical and functional interfaces among the CI and other equipment.

The Critical Design Review (CDR) may be conducted on each CI and/or at the system level. It is conducted on the Engineering Development Model (EDM) design when the detailed design is essentially complete and prior to the Functional Configuration Audit (FCA). During the CDR, the overall technical program risks associated with each CI are also reviewed on a technical, cost, and schedule basis. The CDR includes a review of the item (detail) specifications and the status of both the system's hardware and software. Input from qualification testing should assist in determining readiness for design freeze and Low Rate Initial Production (LRIP). Test plans are reviewed to assess whether test efforts are being developed sufficiently to indicate that a Test Readiness Review (TRR) would be successful. The CDR should precede the program Design Readiness Review (DRR).

The TRR was originally a formal review of the contractor's readiness to begin CSCI testing, but it has evolved into a review of both hardware and software. It is conducted after the software test procedures are available and computer software component testing has been completed. A government witness will observe the system demonstration to verify that the system is ready to proceed with CSCI testing.
The purpose of the TRR is for the Program Management Office (PMO) to determine whether the test procedures are complete and to verify their compliance with the test plans and descriptions.

8.2.3.3 Production and Deployment (P&D)

The FCA is a formal review to verify that the CI's performance complies with its system specification. The item specifications are derived from the system requirements and baseline documentation. During the FCA, all relevant test data are reviewed to verify that the item has performed as required by its functional and/or allocated configuration identification. The audit is conducted on the item representative (prototype or production) of the configuration to be released for production. The audit consists of a review of the contractor's test procedures and results. The information provided will be used during the FCA to determine the status of planned tests.

The Physical Configuration Audit (PCA) is a formal review that establishes the product baseline as reflected in an early production CI. It is the examination of the as-built version of hardware and software CIs against their technical documentation. The PCA also determines that the acceptance testing requirements prescribed by the documentation are adequate for acceptance of production units of a CI by Quality Assurance (QA) activities. It includes a detailed audit of engineering drawings, final Part II item (detail) specifications, technical data, and plans for testing that will be utilized during production. The PCA is performed on all first articles and on the first CIs delivered by a new contractor.


The System Verification Review (SVR) is a systems-level configuration audit that may be conducted after system testing is completed. The objective is to verify that the actual performance of the CI (the production configuration), as determined through testing, complies with its item (performance) specifications, and to document the results of the qualification tests. The SVR and FCA are often performed at the same time; however, if sufficient test results are not available at the FCA to ensure the CI will perform in its operational environment, the SVR can be scheduled for a later time.

The Production Readiness Review (PRR) is an assessment of the contractor's ability to produce the items on contract. It is usually a series of reviews conducted before an LRIP or FRP decision. For more information, see Chapter 10, Production-Related Testing Activities.

8.3 CONFIGURATION CHANGE CONTROL

Configuration change control reviews assess the impact of engineering or design changes. They are conducted by the engineering, T&E, and Program Manager (PM) elements of the PMO. Most approved Class I Engineering Change Proposals (ECPs) will require additional testing, and the test manager must accommodate the new schedules and resource requirements. Adequate testing must be accomplished to ensure integration and compatibility of these changes. For example, an Engineering Change Review (ECR) was conducted to replace the black and white monitors and integrate color monitors into the Airborne Warning and Control System (AWACS). Further, the AWACS operating software had to be upgraded to handle color enhancement. The review was conducted by the government PMO, and sections of the PMO were tasked to contract, test, engineer, logistically support, control, cost, and finance the change to completion. Guidelines for configuration control and engineering changes are discussed in Configuration Control – Engineering Changes, Deviations, and Waivers.4

8.4 SUMMARY

Design reviews are an integral and essential part of the SEP. The meetings range from very formal reviews by government and contractor PMs to informal technical reviews concerned with product or task elements of the Work Breakdown Structure (WBS). Reviews may be conducted in increments over time. All reviews share the common objective of determining the technical adequacy of the existing design to meet technical requirements. The DT/OT assessments and test results are made available to the reviews, and it is important that the test community be involved.


ENDNOTES

1. EIA Standard 632, Processes for Engineering a System, January 1999; or IEEE 1220-1994, Application and Management of the Systems Engineering Process, 1994.
2. Defense Systems Management College, Systems Engineering Fundamentals Guide, January 2001.
3. Military Standard (MIL-STD)-1512A, Technical Reviews and Audits for Systems, Equipment, and Computer Programs, System Functional Block Diagram, Chapter 12, September 30, 1997.
4. EIA Standard 649, MIL-STD-480, Configuration Control – Engineering Changes, Deviations, and Waivers, August 1, 1998.




9

COMBINED AND CONCURRENT TESTING

9.1 INTRODUCTION

The terms "concurrency," "concurrent testing," and "combined testing" are sometimes subject to misinterpretation. Concurrency is defined as an approach to system development and acquisition in which phases of the acquisition process that normally occur sequentially overlap to some extent; for example, a weapon system enters the production phase while development efforts are still underway.

Concurrent testing refers to circumstances in which development testing and operational testing take place at the same time as two parallel, but separate and distinct, activities. In contrast, combined/integrated testing refers to a single test program conducted to support both Development Test (DT) and Operational Test (OT) objectives. This chapter discusses the use of combined testing and concurrent testing, and highlights some of the advantages and disadvantages associated with these approaches (Table 9-1).

Table 9-1. Combined vs. Concurrent Testing: Advantages and Limitations

Combined testing advantages:
• Shortens the time required for testing and, thus, the acquisition cycle.
• Achieves cost savings by eliminating redundant activities.
• Early involvement of OT&E personnel during system development increases their familiarity with the system.
• Early involvement of OT&E personnel permits communication of operational concerns to the developer in time to allow changes in system design.

Combined testing limitations:
• Requires extensive early coordination.
• Test objectives may be compromised.
• Requires development of a DT/OT common test database.
• Combined testing programs are often conducted in a development environment.
• Tests will be difficult to design to meet both DT and OT requirements.
• The system contractor is prohibited by law from participating in IOT&E.
• Time constraints may result in less coverage than planned for OT&E objectives.

Concurrent testing advantages:
• Shortens the time required for testing and, thus, the acquisition cycle.
• Achieves cost savings by overlapping redundant activities.
• Provides earlier feedback to the development process.

Concurrent testing limitations:
• Requires extensive coordination of test assets.
• If the system design is unstable and far-reaching modifications are made, OT&E must be repeated.
• Concurrent testing programs often do not have DT data available for OT&E planning and evaluation.
• Contractor personnel frequently perform maintenance functions in DT&E; logistics support by the user must be available earlier for IOT&E.
• Limited test assets may result in less coverage than planned for OT&E objectives.

9.2 COMBINING DEVELOPMENT TEST AND OPERATIONAL TEST

Certain test events can be organized to provide information useful to both development testers and operational testers. For example, a prototype free-fall munition could be released from a fighter aircraft at operational employment conditions instead of from a static stand to satisfy both DT and OT objectives. Such instances need to be identified to prevent unnecessary duplication of effort and to control costs. A combined testing approach is also appropriate for certain specialized types of testing. For example, in the case of Nuclear Hardness and Survivability (NHS) testing, systems cannot be tested in a totally realistic operational environment; therefore, a single test program is often used to meet both DT and OT objectives.

A combined/integrated Development Test and Evaluation (DT&E) and Operational Test and Evaluation (OT&E) approach should be considered when there are time and cost savings.1 The combined approach must not compromise either DT or OT objectives. If this approach is elected, planning efforts must be carefully coordinated early in the program to ensure data are obtained to satisfy the needs of both the developing agency and the independent operational tester. Care must also be exercised to ensure a combined test program contains dedicated OT events to satisfy the requirement for an independent evaluation. A final independent phase of OT&E shall be required for Beyond Low Rate Initial Production (BLRIP) decisions. In all combined test programs, provisions should be made for separate, independent development and operational evaluations of test results.

Service regulations describe the sequence of activities in a combined testing program as follows:

Although OT&E is separate and distinct from DT&E, most of the generated data are mutually beneficial and freely shared. Similarly, the resources needed to conduct and support both test efforts are often the same or very similar.
Thus, when sequential DT&E and OT&E efforts would cause delay or increase the acquisition cost of the system, DT&E and OT&E are combined.


When combined testing is planned, the necessary test conditions and data required by both the DT&E and OT&E organizations must be integrated. Combined testing can normally be divided into three segments.

In the first segment, DT&E event[s] usually assume priority because critical technical and engineering tests must be accomplished to continue the engineering and development process. During this early period, OT&E personnel participate to gain familiarity with the system and to gain access to any test data that can support OT&E. Next, the combined portion of the testing frequently includes shared objectives or joint data requirements. The last segment normally contains the dedicated OT&E or separate OT&E events to be conducted by the OT&E agency. The OT&E agency and implementing command must ensure the combined test is planned and executed to provide the necessary OT information. The OT&E agency provides an independent evaluation of the OT&E portion and is ultimately responsible for achieving OT&E objectives.


The testing of the Navy's F-14 aircraft has been cited as an example of a successful combined Test and Evaluation (T&E) program.² A key factor in the success of the F-14 approach was the selection of a T&E coordinator responsible for supervising the generation of test plans that integrated the technical requirements of the developers with the operational requirements of the users. The T&E coordinator was also responsible for the allocation of test resources and the overall management of the test. In a paper for the Defense Systems Management College, Mr. Thomas Hoivik describes the successful F-14 test program as follows:

The majority of the Navy developmental and operational testing took place during the same period and even on the same flights. Maximum use was made of contractor demonstrations witnessed by the Navy testing activities to obviate the retesting of a technical point already demonstrated by the contractor. Witnessing by testing activities was crucially important and allowed the contractor's data to be readily accepted by the testing activities. This approach also helped to eliminate redundancy in testing, i.e., the testing of the same performance parameter by several different activities, which has been a consistent and wasteful feature of Navy testing in the past.³

This approach placed a great deal of responsibility directly on the shoulders of the T&E coordinator and required the T&E coordinator's staff to deal knowledgeably with a wide-ranging and complex test plan.

9.3 CONCURRENT TESTING

In 1983, a senior Department of Defense (DoD) T&E official testified that a concurrent testing approach is usually not an effective strategy.⁴ He acknowledged, however, that certain test events may provide information useful to both development and operational testers, and test planners must be alert to identify those events. His testimony included the following examples of situations in which a concurrent testing approach was unsuccessful:

(1) During Advanced Attack Helicopter (AAH) testing in 1981, the Target Acquisition Designation System (TADS) was undergoing developmental and operational testing at the same time. The schedule did not allow enough time for qualification testing (a development test activity) of the TADS prototype prior to a full field test of the total aircraft system, nor was there time to introduce changes to correct TADS problems discovered in tests. As a result, the TADS performed poorly and was unreliable during the operational test. The resulting DSARC [Defense Systems Acquisition Review Council] action required the Army to fix and retest the TADS prior to release of second-year and subsequent production funds.

(2) When the AIM-7 Sparrow air-to-air missile was tested, an attempt was made to move into operational testing while developmental reliability testing was still under way. The operational test was suspended after less than 2 weeks because of poor reliability of the test missiles. The program then concentrated on an intensive reliability improvement effort. A year after the initial false start, a full operational test was conducted and completed successfully.
(3) The Maverick missile had a similar experience of being tested in an operational environment before component reliability testing was completed. As a result, reliability failures had a major impact on the operational testers and resulted in the program being extended.

9.4 ADVANTAGES AND LIMITATIONS

Before adopting a combined or concurrent testing approach, program and test managers are advised to consider the advantages and disadvantages summarized in Table 9-1.

9.5 SUMMARY

A combined or concurrent testing approach may offer an effective means of shortening the time required for testing and achieving cost savings. If such an approach is used, extensive coordination is required to ensure the development and operational requirements are addressed.

It is possible to have combined test teams, consisting of DT&E, OT&E, and contractor personnel, involved throughout the testing process. The teams can provide mutual support and share mutually beneficial data as long as the test program is carefully planned and executed and reporting activities are conducted separately.
ENDNOTES

1. As suggested by DoD Instruction (DoDI) 5000.2, Operation of the Defense Acquisition System, May 12, 2003.
2. Hoivik, Thomas H., The Navy Test and Evaluation Process in Major Systems Acquisition, DTIC [Defense Technical Information Center] Accession Number AD A035935, November 1976.
3. Ibid.
4. Hearing before the Senate Committee on Governmental Affairs, June 23, 1983.
10
PRODUCTION-RELATED TESTING ACTIVITIES

10.1 INTRODUCTION

Most of the Test and Evaluation (T&E) discussed in this guide concerns the testing of the actual weapon or system being developed, but the Program Manager (PM) must also evaluate production-related test activities and the production process. This chapter describes production management and the production-process testing required to ensure the effectiveness of the manufacturing process and the producibility of the system's design.

Normally, the Development Test (DT) and Operational Test (OT) organizations are not involved directly in this process. Usually, the manufacturing and Quality Assurance (QA) sections of the program office and a representative of the government Defense Contract Management Agency (DCMA) oversee or perform many of these functions.

10.2 PRODUCTION MANAGEMENT

Production (manufacturing) management is the effective use of resources to produce, on schedule, the required number of end items that meet specified quality, performance, and cost. Production management includes, but is not limited to, industrial resource analysis, producibility assessment, producibility engineering and planning, production engineering, industrial preparedness planning, post-production planning, and productivity enhancement. Production management begins early in the acquisition process, as early as the concept assessments, and is specifically addressed at each program milestone decision point. For instance, before program initiation, production feasibility, costs, and risks should be addressed. The PM must conduct an Industrial Resource Analysis (IRA) to determine the availability of production resources (e.g., capital, material, manpower) required to support the production of the weapon system. On the basis of the results of the IRA, critical materials, deficiencies in the U.S. industrial base, and requirements for new or updated manufacturing technology can be identified. Analysis of the industrial-base capacity is one of the considerations in preparing for the program start decision. As development proceeds, the manufacturing strategy is developed, and detailed plans are made for production.

Independent producibility assessments, conducted in preparation for the transition from development to production, are reviewed before entering Low Rate Initial Production (LRIP). Once production starts, the producibility of the system design concept is evaluated to verify that the system can be manufactured in compliance with the production-cost and industrial-base goals and thresholds.

The LRIP decision is supported by an assessment of the readiness of the system to enter production. The system cannot enter production until it is determined the principal contractors have the necessary resources (i.e., physical, financial, and managerial capacity) to achieve the cost and schedule commitments and to meet peacetime and mobilization requirements for production of the system. The method of assessing production readiness is the Production Readiness Review (PRR), which is conducted by the PM and staff.
10.3 PRODUCTION READINESS REVIEW

The following is an overview of PRRs:

This review is intended to determine the status of completion of the specific actions that must be satisfactorily accomplished prior to executing a production go-ahead decision. The review is accomplished in an incremental fashion before commencing production. Usually two initial reviews and one final review are conducted to assess the risk in exercising the production go-ahead decision. In its earlier stages, the PRR concerns itself with gross-level manufacturing concerns, such as the need to identify high-risk/low-yield manufacturing processes or materials, or the requirement for a manufacturing development effort to satisfy design requirements. Timing of the incremental PRRs is a function of program posture and is not specifically locked into other reviews.

The conduct of a PRR (Table 10-1) is the responsibility of the PM, who usually appoints a director. The director assembles a team composed of individuals in the disciplines of design, industry, manufacturing, procurement, inventory control, contracts, engineering, and quality training. The PRR director organizes and manages the team effort and supervises preparation of the findings.

Table 10-1. PRR Guidelines Checklist

PRODUCT DESIGN
• Producible at low risk
• Stabilized at a low rate of change
• Validated
• Reliability, maintainability, and performance demonstrated
• Components engineering has approved all parts selections

INDUSTRIAL RESOURCES
• Adequate plant capacity (peacetime and wartime demands)
• Facilities, special production and test equipment, and tooling identified
• Needed plant modernization (CAD/CAM, other automation) accomplished, which produces an invested capital payback in 2 to 5 years
• Associated computer software developed
• Skilled personnel and training programs available

PRODUCTION ENGINEERING AND PLANNING (PEP)
• Production plan developed*
• Production schedules compatible with delivery requirements
• Manufacturing methods and processes integrated with facilities, equipment, tooling, and plant layout
• Value engineering applied
• Alternate production approaches available
• Drawings, standards, and shop instructions are explicit
• Configuration management adequate
• Production policies and procedures documented
• Management information system adequate
• Contractor's management structure is acceptable to the PMO
• The PEP checklist has been reviewed

MATERIALS
• All selected materials approved by contractor's material engineers
• Bill of materials prepared
• Make-or-buy decisions complete
• Procurement of long lead-time items identified
• Sole-source and government-furnished items identified
• Contractor's inventory-control system adequate
• Contractor's material cost procurement plan complete

QUALITY ASSURANCE (QA)
• Quality plan in accordance with contract requirements
• Quality control procedures and acceptance criteria established
• QA organization participates in production planning effort

LOGISTICS
• Operational support, test, and diagnostic equipment available at system deployment
• Training aids, simulators, and other devices ready at system deployment
• Spares integrated into production lot flow

*Military Standard (MIL-STD)-1528, Manufacturing Master Plan.

10.4 QUALIFICATION TESTING

Qualification testing is performed to verify the design and manufacturing process, and it provides a baseline for subsequent acceptance tests. Production qualification testing is conducted at the unit, subsystem, and system level on production items and is completed before the production decision. The results of these tests are a critical factor in assessing the system's readiness for production. Down-line Production Qualification Tests (PQTs) are performed to verify process control and may be performed on selected parameters rather than at the levels originally selected for qualification.

10.4.1 Production Qualification Tests

PQTs are a series of formal contractual tests conducted to ensure design integrity over the specified operational and environmental range. The tests are conducted on pre-Full Rate Production (FRP) items fabricated to the proposed production design drawings and specifications. The PQTs include all contractual Reliability and Maintainability (R&M) demonstration tests required prior to production release. For volume acquisitions, these tests are a constraint to production release.

10.4.2 First Article Tests (FATs)

FATs consist of a series of formal contractual tests conducted to ensure the effectiveness of the manufacturing process, equipment, and procedures. These tests are conducted on a random sample from the first production lot. The series of tests is repeated if the manufacturing process, equipment, or procedure is changed significantly and when a second or alternative source of manufacturing is brought online.¹

10.5 TRANSITION TO PRODUCTION

In an acquisition process, often the first indication that a system will experience problems is during the transition from engineering design to LRIP. This transition continues over an extended period, often months or years, and during this period the system is undergoing stringent contractor and government testing. There may be unexpected failures requiring significant design changes, which affect quality, producibility, and supportability and may require program schedule slippage. Long periods of transition usually indicate that insufficient attention was given to design or producibility early in the program's acquisition process.
10.5.1 Transition Planning

Producibility Engineering and Planning (PEP) is the common thread that guides a system from early concept to production. Planning is a management tool used to ensure that adequate risk-handling measures have been taken to transition from development to production. It contains a checklist to be used during the readiness reviews. Planning should tie together the designing, testing, and manufacturing activities to reduce data requirements, duplication of effort, costs, and schedule, and to ensure early success of the first LRIP production article.

10.5.2 Testing During the Transition

Testing accomplished during the transition from development to production will include acceptance testing, manufacturing screening, and final testing. These technical tests are performed by the contractor to ensure the system will transition smoothly and that test and manufacturing issues affecting the design are addressed. During this same period, the government will be using the latest available Configuration Item (CI) to conduct the Initial Operational Test and Evaluation (IOT&E). The impact of these tests may overwhelm the configuration management of the system unless careful planning is accomplished to handle the resulting changes.

10.6 LOW RATE INITIAL PRODUCTION

LRIP is the production of a system in limited quantity to provide articles for IOT&E and to demonstrate production capability. It also permits an orderly increase in the production rate sufficient to lead to FRP upon successful completion of operational testing. The decision to have an LRIP is made at the Milestone C approval of the program acquisition strategy. At that time, the PM must identify the quantity to be produced during LRIP and validate the quantity of LRIP articles to be used for IOT&E [for Acquisition Category (ACAT) I programs, approved by the Director, Operational Test and Evaluation (DOT&E); for ACAT II and III programs, approved by the Service Operational Test Agency (OTA)].² When the decision authority thinks the system will not perform to expectation, the PM may direct that the program not proceed into LRIP until there is a program review. The DOT&E submits a Beyond LRIP (BLRIP) Report on all oversight systems to the congressional committees before the FRP decision, which approves the system to proceed beyond LRIP, is made.

10.7 PRODUCTION ACCEPTANCE TEST AND EVALUATION (PAT&E)

The PAT&E ensures that production items demonstrate the fulfillment of the requirements and specifications of the procuring contract or agreements. The testing also ensures the system being produced demonstrates the same performance as the pre-FRP models. The procured items or system must operate in accordance with system and item specifications. The PAT&E is usually conducted by the program office QA section at the contractor's plant and may involve operational users.

For example, for the Rockwell B-1B bomber production acceptance, Rockwell and Air Force QA inspectors reviewed all manufacturing and ground-testing results for each aircraft. In addition, a flight test team, composed of contractor and Air Force test pilots, flew each aircraft a minimum of 10 hours, demonstrating all on-board aircraft systems while in flight. Any discrepancies in flight were noted, corrected, and tested on the ground; they were then retested on subsequent checkout and acceptance flights. Once each aircraft had passed all tests and all systems were fully operational, Air Force authorities accepted the aircraft. The test documentation also became part of the delivered package. During this test period, the program office monitored each aircraft's daily progress.
10.8 SUMMARY

A primary purpose of production-related testing is to lower the production risk in a Major Defense Acquisition Program (MDAP). The PM must ensure the contractor's manufacturing strategy and capabilities will result in the desired product within acceptable cost. The LRIP and PAT&E also play major roles in ensuring that the production unit is identical to the design drawings and conforms to the specifications of the contract, that the IOT&E is conducted with representative system configurations, and that the system fielded meets the user's needs.

ENDNOTES

1. Federal Acquisition Regulation (FAR) Part 9.3, First Article Testing and Approval, September 2001.
2. DoD Instruction (DoDI) 5000.2, Operation of the Defense Acquisition System, May 12, 2003.
MODULE III

OPERATIONAL TEST AND EVALUATION

Operational Test and Evaluation (OT&E) is conducted to ensure that a weapon system meets the validated requirements of the user in a realistic scenario. Operational tests are focused on operational requirements, effectiveness, and suitability, not on the proof of engineering specifications, as is the case with development testing. This module provides an overview of OT&E and discusses how OT&E results provide essential information for milestone decisions.
11
INTRODUCTION TO OPERATIONAL TEST AND EVALUATION

11.1 INTRODUCTION

This chapter provides an introduction to the concept of Operational Test and Evaluation (OT&E). It outlines the purpose of OT&E, discusses the primary participants in the OT&E process, describes several types of OT&E, and includes some general guidelines for the successful planning, execution, and reporting of OT&E programs.

11.2 PURPOSE OF OT&E

OT&E is conducted for major programs by an organization that is independent of the developing, procuring, and using commands. Some form of operational assessment is normally conducted in each acquisition phase. Each assessment should be keyed to a decision review in the materiel acquisition process. It should include typical user operators, crews, or units in realistic combat simulations of operational environments. The OT&E provides the decision authority with an estimate of:

(1) The degree of satisfaction of the user's requirements, expressed as operational effectiveness and operational suitability of the new system;

(2) The system's desirability, considering equipment already available, and the operational benefits or burdens associated with the new system;

(3) The need for further development of the new system to correct performance deficiencies;

(4) The adequacy of doctrine, organizations, operating techniques, tactics, and training for employment of the system; of maintenance support for the system; and of the system's performance in the countermeasures environment.

11.3 TEST PARTICIPANTS

The OT&E of developing systems is managed by an independent Operational Test Agency (OTA), which each Service is required to maintain. It is accomplished under conditions of operational realism whenever possible. Personnel who operate, maintain, and support the system during OT&E are trained to a level commensurate with that of the personnel who will perform these functions under peacetime and wartime conditions. Also, Program Management Office (PMO) personnel, the Integrated Product Teams (IPTs), and test coordinating groups play important parts in the overall OT&E planning and execution process.

11.3.1 Service Operational Test Agencies

The OT&E agencies should become involved early in the system's life cycle, usually during the program's evaluation of concepts. At this time, they can begin to develop strategies for conducting operational tests (OT&E). As test planning continues, a more detailed Test and Evaluation Master Plan (TEMP), Part IV (OT&E), is developed, and the test resources are identified and scheduled. During the early stages, the OTAs structure an OT&E program consistent with the approved acquisition strategy for the system, identify critical Operational Test (OT) issues, and assess the adequacy of candidate systems.
As the program moves into advanced planning, OT&E efforts are directed toward becoming familiar with the system, encouraging interface between the user and developer, and further refining the Critical Operational Issues (COIs). The OTA test directors, analysts, and evaluators design the OT&E so that the data collected will support answering the COIs. Each Service has an independent organization dedicated to planning, executing, and reporting the results of that Service's OT&E activities. These organizations are the Army Test and Evaluation Command (ATEC), the Navy Operational Test and Evaluation Force (OPTEVFOR), the Air Force Operational Test and Evaluation Center (AFOTEC), and the Marine Corps Operational Test and Evaluation Activity (MCOTEA). Although policy only requires OTAs within the Services, the Defense Information Systems Agency (DISA) has designated the Joint Interoperability Test Command (JITC) as the OTA for its Department of Defense (DoD) information systems, and a Special Operations Command (SOCOM) element conducts much of its own IOT&E.

11.3.2 Test Personnel

Operational testing is conducted on materiel systems with "typical" user organizational units in a realistic operational environment. It uses personnel with the same Military Occupational Specialties (MOSs) as those who will operate, maintain, and support the system when fielded. Participants are trained in the system's operation based on the Service's operational mission profiles. Because some OT&E consists of force-on-force tests, the forces opposing the tested system must also be trained in the use of threat equipment, tactics, and doctrine. For operational testing conducted before Initial Operational Test and Evaluation (IOT&E), most system training is conducted by the system's contractor. For IOT&E, the contractor trains the Service school cadre, who then train the participating organizational units. Once the system has entered Full-Rate Production (FRP), the Service will normally assume training responsibilities. Operational testing often requires a large support staff of data collectors and scenario controllers operating in the field with the user test forces and opposing forces.

11.4 TYPES OF OT&E

The OT&E can be subdivided into two phases: operational testing performed before FRP and operational testing performed after the FRP decision. The pre-FRP OT&E includes Early Operational Assessments (EOAs), Operational Assessments (OAs), and IOT&E. OAs begin early in the program, frequently before program start, and continue until the system is certified as ready for IOT&E. The IOT&E is conducted late during low rate production, when test assets are available, in support of the full-rate decision review. The Navy uses the term "OPEVAL" (Operational Evaluation) for IOT&E. After transition to FRP, all subsequent operational testing is usually referred to as Follow-on Operational Test and Evaluation (FOT&E). In the Air Force, if no Research and Development (R&D) funding is committed to a system, Qualification Operational Test and Evaluation (QOT&E) may be performed in lieu of IOT&E.

11.4.1 Early Operational Assessments

The EOAs are conducted primarily to forecast and evaluate the potential operational effectiveness and suitability of the weapon system during development. EOAs start during Concept Refinement (CR) and/or Technology Development (TD), may continue into system integration, and are conducted on prototypes of the developing design.

11.4.1.1 Operational Assessments

The OAs begin when the OTAs start their evaluations of system-level performance. The OTA uses any testing results, Modeling and Simulation (M&S), and data from other sources during an assessment. These data are evaluated by the OTA from an operational point of view. As the program matures, these OAs of performance requirements are conducted on Engineering Development Models (EDMs) or pre-production articles until the system performance is considered mature. The mature system can then be certified ready for its IOT&E (OPEVAL in the Navy). When an evolutionary acquisition strategy is used, each subsequent increment must have at least an OA before the new increment is issued to the user.
11.4.1.2 Initial Operational Test and Evaluation (Navy OPEVAL)

The IOT&E is the final dedicated phase of OT&E preceding an FRP decision. It is the final evaluation that entails dedicated operational testing of production-representative test articles and uses typical operational personnel in a scenario that is as realistic as possible, in compliance with United States Code (U.S.C.).¹ The IOT&E is conducted by an OT&E agency independent of the contractor, PMO, or developing agency. The test has been described as follows:

All OT&E is conducted on production or production-representative articles, to support the decision to proceed beyond Low Rate Initial Production [LRIP]. It is conducted to provide a valid estimate of expected system operational effectiveness and operational suitability. The definition of "OT&E" as spelled out in congressional legislation (see Glossary) is generally considered applicable only to Initial Operational Test and Evaluation (IOT&E). Further, IOT&E must be conducted without system contractor personnel participation in any capacity other than that stipulated by in-service wartime tactics and doctrine, as set forth in Public Law 99-661² by Congress. The results from this test are evaluated and presented to the Milestone Decision Authority (MDA) to support the Beyond Low Rate Initial Production (BLRIP) decision (i.e., the decision to enter FRP). This phase of OT&E addresses the Key Performance Parameters (KPPs) identified in the Capability Production Document (CPD) and the Critical Operational Issues (COIs) in the TEMP. IOT&E test plans for ACAT I and IA and other designated programs must be approved by the Office of the Secretary of Defense (OSD) Director, Operational Test and Evaluation (DOT&E). Service IOT&E test reports provide the foundation for the DOT&E BLRIP Report.

11.4.2 Follow-On Operational Test and Evaluation (FOT&E)

The FOT&E is conducted after the FRP decision. The tests are conducted in a realistic tactical environment similar to that used in IOT&E, but fewer test items may be used. Normally, FOT&E is conducted using fielded production systems with appropriate modifications, upgrades, or increments. Specific objectives of FOT&E include testing modifications that are to be incorporated into production systems, completing any deferred or incomplete IOT&E, evaluating correction of deficiencies found during IOT&E, and assessing reliability, including spares support, on deployed systems. The tests are also used to evaluate the system in a different platform application, for new tactical applications, or against new threats.

11.4.3 Qualification Operational Test and Evaluation (QOT&E) (USAF)

Air Force QOT&E may be performed by the major command, the user, or AFOTEC and is conducted on minor modifications or new applications of existing equipment when no R&D funding is required. An example of a program in which QOT&E was performed by the Air Force is the A-10 Air-to-Air Self Defense Program. In this program, the mission of the A-10 was expanded from strictly ground support to include an air-to-air defense role. To accomplish this, the A-10 aircraft was modified with off-the-shelf AIM-9 air-to-air missiles; QOT&E was performed on the system to evaluate its operational effectiveness and suitability.
11.5 TEST PLANNING

Operational Test (OT) planning is one of the most important parts of the OT&E process. Proper planning facilitates the acquisition of data to support the determination of the weapon system's operational effectiveness and suitability. Planning must be pursued in a deliberate, comprehensive, and structured manner. Careful and complete planning may not guarantee a successful test program, but inadequate planning can result in significant test problems, poor system performance, and cost overruns. OT planning is conducted by the OTA before program start, and more detailed planning usually starts about 2 years before each OT event.

Operational planning can be divided into three phases: early planning, advanced planning, and detailed planning. Early planning entails developing COIs, formulating a plan for evaluations, determining the Concept of Operation (COO), envisioning the operational environment, and developing mission scenarios and resource requirements. Advanced planning encompasses the determination of the purpose and scope of testing and the identification of Measures of Effectiveness (MOEs) for critical issues. It includes developing test objectives, establishing a test approach, and estimating test resource requirements. Detailed planning involves developing step-by-step procedures to be followed, as well as the final coordination of resource requirements necessary to carry out OT&E.

11.5.1 Testing Critical Operational Issues

The COIs have been described as:

A key operational effectiveness or operational suitability issue that must be examined in OT&E to determine the system's capability to perform its mission. A COI is normally phrased as a question to be answered in evaluating a system's operational effectiveness and/or operational suitability.

One of the purposes of OT&E is to resolve COIs about the system. The first step in an OT&E program is to identify these critical issues, some of which are explicit in the Operational Requirements Document (ORD). Examples can be found in questions such as: "How well does the system perform a particular aspect of its mission?" "Can the system be supported logistically in the field?" Other issues arise from questions asked about system performance or how it will affect other systems with which it must operate. Critical issues provide focus and direction for the OT. Identifying the issues is analogous to the first step in the Systems Engineering Process: defining the problem. When COIs are properly addressed, deficiencies in the system can be uncovered. They form the basis for a structured technique of analysis by which detailed sub-objectives or MOEs can be established. During the OT, each sub-objective is addressed by an actual test measurement (Measure of Performance (MOP)). After these issues are identified, the evaluation plans and test design are developed for test execution. (For more information, see Chapter 13.)
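The decomposition from COI to MOEs to MOPs can be pictured as a simple hierarchy in which each measurement is compared against its evaluation criterion. The short sketch below is purely illustrative and is not drawn from this guide; the issue wording, measure names, thresholds, and measured values are hypothetical.

```python
# Illustrative only: a hypothetical COI decomposed into MOEs and MOPs,
# with each measured MOP compared against its evaluation criterion.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class MOP:
    """Measure of Performance: an actual test measurement with a criterion."""
    name: str
    threshold: float
    measured: Optional[float] = None

    def met(self) -> Optional[bool]:
        # None means the test has not yet produced a measurement.
        return None if self.measured is None else self.measured >= self.threshold

@dataclass
class MOE:
    """Measure of Effectiveness: a sub-objective supporting a COI."""
    name: str
    mops: List[MOP] = field(default_factory=list)

@dataclass
class COI:
    """Critical Operational Issue, phrased as a question to be answered."""
    question: str
    moes: List[MOE] = field(default_factory=list)

coi = COI(
    question="Can the system detect and engage targets in its intended environment?",
    moes=[
        MOE("Target detection", [MOP("Detection range (km)", threshold=20.0)]),
        MOE("Mission reliability", [MOP("Mean time between failures (hr)", threshold=100.0)]),
    ],
)

# Measurements recorded during the operational test (hypothetical values).
coi.moes[0].mops[0].measured = 23.5
coi.moes[1].mops[0].measured = 87.0

for moe in coi.moes:
    for mop in moe.mops:
        print(f"{moe.name} / {mop.name}: criterion met = {mop.met()}")
```

Structuring the issues this way preserves the audit trail discussed in Chapter 13: every reported result can be traced from a measurement, through its MOE, back to the critical issue it supports.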
11.5.2 Test Realism

Test realism for OT&E will vary directly with the degree of system maturity. Efforts early in the acquisition program should focus on active involvement of users and operationally oriented environments. Fidelity of the "combat environment" should peak during the IOT&E, when force-on-force testing of the production-representative system is conducted. The degree of success in replicating a realistic operational environment has a direct impact on the credibility of the IOT&E test report. Areas of primary concern for the test planner can be derived from the legislated definition of OT&E:
(1) A field test includes all of the elements normally expected to be encountered in the operational arena, such as appropriate size and type of maneuver terrain, environmental factors, day/night operations, austere living conditions, etc.

(2) Realistic combat should be replicated using appropriate tactics and doctrine, representative threat forces properly trained in the employment of threat equipment, free-play responses to test stimuli, stress, a "dirty" battle area (fire, smoke, Nuclear, Biological and Chemical (NBC); Electronic Countermeasures (ECM); etc.), a wartime tempo of operations, real-time casualty assessment, and forces requiring interoperability.

(3) Any item means the production-representative configuration of the system at that point in time, including the appropriate logistics tail.

(4) Typical military users are obtained by taking a cross section of adequately trained skill levels and ranks of the intended operational force. Selection of "golden crews," or the best of the best, does not provide test data reflecting the successes or problems of the "Murphy and gang" of typical units.

In his book, Operational Test and Evaluation: A Systems Engineering Process, Roger Stevens stated, "In order to achieve realism effectively in an OT&E program, a concern for realism must pervade the entire test program from the very beginning of test planning to the time when the very last test iteration is run." Realism is a significant issue during both the planning and the execution of OT&E.³

11.5.3 Selection of a Test Concept

An important step in the development of an OT&E program is to develop an overall test program concept. Determinations must be made regarding when OT&E will be conducted during system development, what testing is to be done on production equipment, how the testing will be evolutionary, and what testing will have to wait until all system capabilities are developed. This concept can best be developed by considering a number of aspects such as test information requirements, system availability for test periods, and the demonstration of system capabilities. The test concept is driven by the acquisition strategy and is a roadmap used for planning T&E events. The DOT&E is briefed on test concepts for oversight programs and approves the test plan before IOT&E starts.

11.6 TEST EXECUTION

An OT plan is only as good as the execution of that plan. Execution is the essential bridge between test planning and test reporting. The test is executed through the OTA test director's efforts and the actions of the test team. For successful execution of the OT&E plan, the test director must direct and control the test resources and collect the data required for presentation to the decision authority. The test director must prepare for testing; activate and train the test team; develop test procedures and operating instructions; control data management; create OT&E plan revisions; and manage each of the test trials. The test director's data management duties encompass collecting raw data, creating a data status matrix, and ensuring data quality by processing and reducing, verifying, filing, storing, retrieving, and analyzing collected data. Once all tests have been completed and the data are reduced and analyzed, the results must be reported. A sample test organization used for an Army OT&E is illustrated in Figure 11-1. (In the Army, the Deputy Test Director comes from the OTA and controls the daily OT activity.)

Figure 11-1. Organizational Breakdown of the I-81mm Mortar Operational Test Directorate
[Organization chart: a Test Director supported by a Deputy Test Director (OTA), a Deputy Test Director (user representative), and a system representative from the development command, with subordinate elements for data collection, data management, test control, human factors, site support, player units, and performance/RAM data centers.]
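As one way to picture the data-management duties described above, the sketch below tracks each test trial's data through collection, reduction, and verification. It is a minimal, hypothetical illustration rather than a format prescribed by this guide; the trial names and status values are invented.

```python
# Hypothetical data status matrix for OT&E trials: each trial's data moves
# through collection, reduction, and verification. Trial names are invented.

from enum import Enum

class Status(Enum):
    NOT_COLLECTED = "not collected"
    COLLECTED = "collected"
    REDUCED = "reduced"
    VERIFIED = "verified"

data_status = {
    "Trial 01 (day, clear)": Status.VERIFIED,
    "Trial 02 (night, ECM)": Status.REDUCED,
    "Trial 03 (day, smoke)": Status.COLLECTED,
    "Trial 04 (night, NBC)": Status.NOT_COLLECTED,
}

def report(matrix):
    """Print the kind of status summary a test director might review daily."""
    for trial, status in matrix.items():
        print(f"{trial:<24} {status.value}")
    outstanding = [t for t, s in matrix.items() if s is not Status.VERIFIED]
    print(f"Trials not yet verified: {len(outstanding)}")

report(data_status)
```

A running status matrix of this kind makes it easy to see, before reporting begins, which trials still require data reduction or verification.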


11.7 TEST REPORTING

The IOT&E test report is a very important document. It must communicate the results of completed tests to decision authorities in a timely, factual, concise, comprehensive, and accurate manner. The report must present a balanced view of the weapon system's successes and problems during testing, illuminating both the positive aspects and the system deficiencies discovered. Analysis of test data and their evaluation may be in one report (Air Force, Navy) or in separate documents (Army, Marine Corps). The Army's Independent Evaluation Report (IER) advises the decision review principals and the MDA concerning the adequacy of testing and the system's operational effectiveness, suitability, and survivability, as well as providing recommendations for future T&E and system improvements.

There are four types of reports most frequently used in reporting OT&E results: status, interim, quick-look, and final reports. The status report gives periodic updates (e.g., monthly, quarterly) and reports recent test findings (discrete events such as missile firings). The interim report provides a summary of the cumulative test results to date when there is an extended period of testing. Quick-look reports provide preliminary test results, are usually prepared immediately after a test event (in less than 7 days), and have been used to support program decision milestones. The final T&E report (Air Force, Navy) or IER (Army, Marine Corps) presents the conclusions and recommendations, including all supporting data and covering the entire IOT&E program.
The Service OTAs are required to provide reports of OT&E results in support of Milestones B and C, in addition to the IOT&E report required for the Full Rate Production Decision Review (FRPDR).⁴

11.8 SUMMARY

The purpose of OT&E is to assess operational effectiveness and suitability at each stage in the acquisition process. Operational effectiveness is a measure of the contribution of the system to mission accomplishment under actual conditions of employment. Operational suitability is a measure of the Reliability and Maintainability (R&M) of the system; the effort and level of training required to maintain, support, and operate it; and any unique logistics or training requirements it may have. The OT&E may provide information on tactics, doctrine, organization, and personnel requirements and may be used to assist in the preparation of operating and maintenance instructions and other publications. One of the most important aspects is that OT&E provides an independent evaluation of the degree of progress made toward satisfying the user's requirements during the system development process.
ENDNOTES

1. Title 10, U.S.C., Section 2399, Operational Test and Evaluation of Defense Acquisition Programs.
2. Public Law 99-661, National Defense Authorization Act (NDAA), Section 1207.
3. Stevens, Roger T., Operational Test and Evaluation: A Systems Engineering Process, John Wiley & Sons, New York, 1979, pp. 49-56.
4. As required by DoD Instruction (DoDI) 5000.2, Operation of the Defense Acquisition System, May 12, 2003.
12
OT&E TO SUPPORT DECISION REVIEWS

12.1 INTRODUCTION

Mindful of the principles of objectivity and impartial evaluation, Operational Test and Evaluation (OT&E) may be conducted before each decision review to provide the decision authority with the latest results from testing of Critical Operational Issues (COIs). The philosophy of OT&E has been related to three terms: adequacy, quality, and credibility.

Adequacy: The amount of data and the realism of test conditions must be sufficient to support the evaluation of the COIs.

Quality: The test planning, control of test events, and treatment of data must provide clear and accurate test reports.

Credibility: The conduct of the test and the handling of data must be separated from external influence and personal biases.

Operational testing is conducted to provide information to support Department of Defense (DoD) executive-level management decisions on major acquisition programs. OT&E is accomplished using a test cycle of successive actions and documents. During the early stages of the program, the process is informal and modified as necessary. As programs mature, documentation for major systems and those designated by the Director, Operational Test and Evaluation (DOT&E) for oversight must be sent to the Office of the Secretary of Defense (OSD) for approval before the testing can be conducted or the systems can be cleared to proceed Beyond Low Rate Initial Production (BLRIP). Figure 12-1 illustrates how OT&E relates to the acquisition process.

Figure 12-1. OT&E Related to the Milestone Process
[Chart mapping OT&E activities (EOAs, OAs, IOT&E, FOT&E) to the acquisition phases (Concept Refinement, Technology Development, System Development & Demonstration, Production & Deployment, Operations & Support) and to Milestones A, B, C, and the FRPDR.]

12.2 CONCEPT REFINEMENT (CR) AND TECHNOLOGY DEVELOPMENT (TD)

The OT&E conducted during the first phase may be an Early Operational Assessment (EOA) focused on investigating the deficiencies identified during the mission area analysis. Operational testers participate in these evaluations to validate the OT&E requirements for future testing and to identify issues and criteria that can only be resolved through OT&E, so that early test resource planning can begin.

Before program initiation, the OT&E objectives are to assist in evaluating alternative concepts to solve the mission capability deficiencies and to assess the operational impact of the system. An early assessment also may provide data to support a decision on whether the technology is sufficiently mature to support entry into the next development phase. The OT&E conducted during this phase supports developing estimates of:

(1) The military need for the proposed system;

(2) A demonstration that there is a sound physical basis for a new system;

(3) An analysis of concepts, based on demonstrated physical phenomena, for satisfying the military need;

(4) The system's affordability and Life Cycle Cost (LCC);

(5) The ability of a modification to an existing U.S. or allied system to provide the needed capability;

(6) An operational utility assessment;

(7) The impact of the system on the force structure.
During concept assessment, there is normally no hardware available for the operational tester. Therefore, the EOA is conducted from surrogate test and experiment data, breadboard models, factory user trials, mock-ups/simulators, modeling and simulation, and user demonstrations (Figure 12-2). This makes early assessments difficult, and some areas cannot be covered in depth. However, these assessments provide vital introductory information on the system's potential operational utility.

Figure 12-2. Sources of Data
[Chart listing typical data sources feeding EOAs, OAs, IOT&E, and FOT&E across the milestones: laboratory reports, test cells, models and simulations, engineering analyses, mock-ups, historical data, DT&E and combined DT/OT results, demonstrations, contractor and government data, integration and quality tests, simulators, prior assessments, field data, exercises, and war games.]

The OT&E products from this phase of testing include the information provided to the decision authority, data collected for further evaluation, input to the test planning in the Technology Development Strategy (TDS) that will later evolve into the Test and Evaluation Master Plan (TEMP), and early Test and Evaluation (T&E) planning. Special logistics problems, program objectives, program plans, performance parameters, and acquisition strategy are areas of primary influence for the operational tester during this phase and must be carefully evaluated to project the system's operational effectiveness and suitability.

12.3 SYSTEM DEVELOPMENT AND DEMONSTRATION (SDD)

After program initiation, combined DT/OT&E or an EOA may be conducted to support the evaluation of prototype development and a decision regarding the system's readiness to move into the development of the Engineering Development Model (EDM). In all cases, appropriate T&E must be conducted on a mature system configuration, i.e., the EDM, before the production decision, thereby providing data for identification of risk before more resources are committed. As appropriate, Low Rate Initial Production (LRIP) will follow this phase of testing to verify system-level performance capability and to provide insight into the test resources needed to conduct future interoperability, live fire, or operational testing.
12.3.1 Objectives of Operational Assessments

Operational Assessments (EOAs, OAs) are conducted to facilitate identification of the best design, indicate the risk level of performance for this phase of the development, examine operational aspects of the system's development, and estimate potential operational effectiveness and suitability. Additionally, an analysis of the planning for the transition from development to production is initiated. EOAs supporting decision reviews are intended to:

(1) Assess the potential of the new system in relation to existing capabilities;

(2) Assess system effectiveness and suitability so that affordability can be evaluated in terms of program cost versus military utility;

(3) Assess the adequacy of the concept for employment, supportability, and organization; doctrinal, tactical, and training requirements; and related critical issues;

(4) Estimate the need for the selected system in consideration of the threat and system alternatives, based on military utility;

(5) Assess the validity of the operational concept;

(6) List the key risk areas and COIs that need to be resolved before construction of EDMs is initiated;

(7) Assess the need during LRIP for long lead hardware to support Initial Operational Test and Evaluation (IOT&E) prior to the Full-Rate Production (FRP) decision;

(8) Provide data to support test planning for this phase.

During this phase, OT&E may be conducted on brassboard configurations, experimental prototypes, or advanced development prototypes. Dedicated test time may be made available for the operational tester. However, the OT&E assessments may also make use of many other data sources. Examples of additional sources often used by the Army during this phase include Concept Experimentation Program (CEP) tests, innovative testing, Force Development Tests and Experimentation (FDT&E), source selection tests, user participation in Development Test and Evaluation (DT&E), and operational feasibility tests. The results from this testing, analysis, and evaluation are documented in an Operational Assessment (EOA) or end-of-phase OT&E report. These data, along with the mission needs and requirements documentation and the TEMP, assist in the review of performance for the next decision review.
The OAs during the system demonstration are conducted on EDMs. These operational evaluations estimate the operational effectiveness and suitability and provide data on whether the system meets minimum operational thresholds.

12.4 PRODUCTION AND DEPLOYMENT (P&D)

Just before the FRP decision, the dedicated T&E is conducted on equipment that has been formally certified by the Program Manager (PM) as being ready for the "final OT&E." This dedicated IOT&E is conducted in a test environment as operationally realistic as possible.

12.4.1 OT&E Objectives

The IOT&E is characterized by testing performed by user organizations in a field exercise to examine the organization and doctrine, Integrated Logistics Support (ILS), threat, communications, Command and Control (C2), and tactics associated with the operational employment of the unit during tactical operations. This includes estimates that:

(1) Assess operational effectiveness and suitability;

(2) Assess the survivability of the system;

(3) Assess the system's reliability and maintainability and the plans for ILS;

(4) Evaluate manpower, personnel, training, and safety requirements;

(5) Validate organizational and employment concepts;

(6) Determine training and logistics requirements deficiencies;

(7) Assess the system's readiness to enter FRP.

12.5 OPERATIONS AND SUPPORT (O&S)

After the FRP decision and deployment, the emphasis shifts toward procuring production quantities, repairing hardware deficiencies, managing changes, and phasing in full logistics support. During initial deployment of the system, the OT&E agency and/or the user may perform Follow-on Operational Test and Evaluation (FOT&E) to refine the effectiveness and suitability estimates made during earlier OT&E, assess performance not evaluated during IOT&E, evaluate new tactics and doctrine, and assess the impacts of system modifications or upgrades.

The FOT&E is performed with production articles in operational organizations. It is normally funded with Operations and Maintenance (O&M) funds. The first FOT&E conducted during this phase may be used to:

(1) Ensure that the production system performs as well as reported at the Milestone III review;

(2) Demonstrate expected performance and reliability improvements;

(3) Ensure that the correction of deficiencies identified during earlier testing has been completed;

(4) Evaluate performance not tested during IOT&E.

Additional objectives of FOT&E are to validate the operational effectiveness and suitability of a modified system during an OA of the system in new environments. The FOT&E may look at different platform applications, new tactical applications, or the impact of new threats.
12.5.1 FOT&E of Logistics Support Systems

The testing objectives used to evaluate post-production logistics readiness and support are to:

(1) Assess logistics readiness and sustainability;

(2) Evaluate the weapon support objectives;

(3) Assess the implementation of logistics support planning;

(4) Evaluate the capability of the logistics support activities;

(5) Determine the disposition of displaced equipment;

(6) Evaluate the affordability and LCC of the system.

12.6 SUMMARY

The OT&E is that T&E (OAs, IOT&E, or FOT&E) conducted to provide feedback on system design and its potential to be operationally effective and operationally suitable. OT&E events in each phase of development will identify needed modifications; provide information on tactics, doctrine, organizations, and personnel requirements; and evaluate the system's logistics supportability. The acquisition program structure should include feedback from OAs or evaluations at critical decision points/technical reviews, beginning early in the development cycle and continuing throughout the system's life cycle.


MODULE IV

TEST AND EVALUATION PLANNING

Many program managers face several T&E issues that must be resolved to get their particular weapon system tested and ultimately fielded. These issues may include modeling and simulation support, combined and concurrent testing, test resources, survivability and lethality testing, multi-Service testing, or international T&E. Each issue presents a unique set of challenges for program managers when they develop the integrated strategy for the T&E program.


13

EVALUATION

13.1 INTRODUCTION

This chapter describes the evaluation portion of the Test and Evaluation (T&E) process. It stresses the importance of establishing and maintaining a clear audit trail from system requirements through critical issues, evaluation criteria, test objectives, and Measures of Effectiveness (MOEs) to the evaluation. The importance of the use of data from all sources is discussed, as are the differences in approaches to evaluating technical and operational data.

13.2 DIFFERENCE BETWEEN "TEST" AND "EVALUATION"

The following distinction has been made between the functions of "test" and "evaluation":

While the terms "test" and "evaluation" are most often found together, they actually denote clearly distinguishable functions in the RDT&E [Research, Development, Test and Evaluation] process.

"Test" denotes any "program or procedure that is designed to obtain, verify, or provide data for the evaluation of any of the following: 1) progress in accomplishing developmental objectives; 2) the performance, operational capability, and suitability of systems, subsystems, components, and equipment items; and 3) the vulnerability and lethality of systems, subsystems, components, and equipment items." 1

"Evaluation" denotes the process whereby data are logically assembled, analyzed, and compared to expected performance to aid in making systematic decisions.

To summarize, evaluation is the process for review and analysis of qualitative or quantitative data obtained from design review, hardware inspection, Modeling and Simulation (M&S), testing, or operational usage of equipment.

13.3 THE EVALUATION PROCESS

The evaluation process requires a broad analytical approach with careful focus on the development of an overall T&E plan that will provide timely answers to critical issues and questions required by decision authorities throughout all the acquisition phases. (Table 13-1) Evaluations should focus on Key Performance Parameters (KPPs); i.e., those "minimum attributes or characteristics considered most essential for an effective military capability." An attribute is a testable or measurable characteristic that describes an aspect of a system or capability. 2

A functional block diagram of a generic (i.e., not Service-specific) evaluation process is shown in Figure 13-1. The Army's Independent Evaluation Plan (IEP) is written very early and applies to the system and not a specific test event, so it is not as detailed as the format provided in Table 13-1. The process begins with the identification of a deficiency or need and the documentation of an operational requirement. It continues with the identification of critical issues that must be addressed to determine the degree to which the system meets user requirements.


Table 13-1. Sample Evaluation Plan

CHAPTER 1 INTRODUCTION
1.1 PURPOSE
1.2 SCOPE
1.3 BACKGROUND
1.4 SYSTEM DESCRIPTION
1.5 CRITICAL OPERATIONAL ISSUES AND CRITERIA (COIC)
1.6 PROJECTED THREAT
1.7 TEST AND EVALUATION MILESTONES

CHAPTER 2 EVALUATION STRATEGY
2.1 EVALUATION CONCEPT
2.2 OPERATIONAL EFFECTIVENESS
2.2.1 ISSUE 1
2.2.1.1 SCOPE
2.2.1.2 CRITERIA
2.2.1.3 RATIONALE
2.2.1.4 EVALUATION APPROACH
2.2.1.5 ANALYSIS OF MOPS AND DATA PRESENTATIONS
2.2.1.5.1 MOP 1
THROUGH
2.2.1.5.1.X MOP x
2.2.2 ISSUE 2
THROUGH
2.2.m ISSUE n
2.3 OPERATIONAL SUITABILITY
2.3.1 ISSUE n+1
THROUGH
2.3.n ISSUE n+x
2.4 DATA SOURCE MATRIX
2.5 DESCRIPTION OF OTHER PRIMARY DATA SOURCES
2.6 TEST APPROACH
2.6.1 TEST SCOPE
2.6.2 FACTORS AND CONDITIONS
2.6.3 SAMPLE SIZE AND OTHER TEST DESIGN CONSIDERATIONS
2.6.4 DATA AUTHENTICATION GROUP (DAG)
2.7 EVALUATION DATABASE STRUCTURE
2.7.1 IDENTIFICATION OF REQUIRED FILES
2.7.2 DESCRIPTION OF FILE RELATIONSHIPS
2.7.3 DATA ELEMENTS DEFINITIONS

APPENDICES:
APPENDIX A IOT&E RESOURCE PLAN
APPENDIX B PATTERN OF ANALYSIS
APPENDIX C CONTROL CONCEPT
APPENDIX D DATA COLLECTION CONCEPT
APPENDIX E DATA REDUCTION CONCEPT
APPENDIX F QUALITY CONTROL CONCEPT
APPENDIX G DAG CHARTER AND SOP
APPENDIX H TRAINING CONCEPT
APPENDIX I TEST ENVIRONMENTAL ASSESSMENT AND ENVIRONMENTAL IMPACT STATEMENT
APPENDIX J STATUS OF SUPPORT DOCUMENTS
APPENDIX K SYSTEM DESCRIPTION
APPENDIX L SCENARIO
APPENDIX M INSTRUMENTATION
APPENDIX N BASELINE CORRELATION MATRIX
APPENDIX O STRAWMAN INDEPENDENT EVALUATION REPORT
APPENDIX P GLOSSARY
APPENDIX Q ABBREVIATIONS

SOURCE: OT&E Methodology Guide, Department of the Army (DA) Pamphlet 71-3.


Figure 13-1. Functional Block Diagram of the Evaluation Process (identify deficiency/need; document requirement; formulate critical issues; develop evaluation criteria; identify data requirements; select measures of performance; select measures of effectiveness; conduct test; analyze test results; compare test results with evaluation criteria; prepare evaluation report)

Objectives and thresholds must then be established to define required performance or supportability parameters and to evaluate progress in reaching them. T&E analysts then decompose the issues into measurable test elements, conduct the necessary testing, review and analyze the test data, weigh the test results against the evaluation criteria, and prepare an evaluation report for the decision authorities.

13.4 ISSUES AND CRITERIA

Issues are questions regarding a system that require answers during the acquisition process. Those answers may be needed to aid in the development of an acquisition strategy, to refine performance requirements and designs, or to support Milestone Decision Reviews (MDRs). Evaluation criteria are the standards by which accomplishments of required technical and operational effectiveness and/or suitability characteristics or resolution of operational issues may be assessed. The evaluation program may be constructed using a structured approach identifying each issue:

(1) Issue: a statement of the question to be answered;
(2) Scope: detailed conditions and range of conditions that will guide the T&E process for this issue;
(3) Criteria: quantitative or qualitative standards that will answer the issue;
(4) Rationale: full justification to support the selected criteria.

13.4.1 Key Performance Parameters (KPPs)/Critical Issues

The KPPs often can support the development of a hierarchy of critical issues and less significant issues. Critical issues are those questions relating to a system's operational, technical, support, or other capability. These issues must be answered before the system's overall worth can be estimated/evaluated, and they are of primary importance to the decision authority in allowing the system to advance to the next acquisition phase. Critical issues in the Test and Evaluation Master Plan (TEMP) may be derived from the KPPs found in the capability requirements documents—Capability Development Document (CDD) and


Capability Production Document (CPD). The system requirements and baseline documentation will provide many of the performance parameters required to develop the hierarchy of issues.

13.4.2 Evaluation Issues

Evaluation issues are those addressed in technical or operational evaluations during the acquisition process. Evaluation issues can be separated into technical or operational issues and addressed in the TEMP.

Technical issues primarily concern technical parameters or characteristics and engineering specifications normally assessed in development testing. Operational issues concern effectiveness and suitability characteristics for functions to be performed by equipment/personnel. They address the system's operational performance when examined in a realistic operational mission environment. Evaluation issues are answered by whatever means necessary (analysis/survey, modeling, simulation, inspection, demonstration, or testing) to resolve the issue. Issues requiring test data are further referred to as test issues.

13.4.3 Test Issues

Test issues are a subset of evaluation issues and address areas of uncertainty that require test data to resolve the issue adequately. Test issues may be partitioned into technical issues that are addressed by the Development Test and Evaluation (DT&E) community (contractors and government) and operational issues that are addressed by the Operational Test and Evaluation (OT&E) community. Test issues may be divided into critical and noncritical categories. All critical T&E issues, objectives, methodologies, and evaluation criteria should be defined during the initial establishment of an acquisition program. Critical issues are documented in the TEMP. These evaluation issues serve to define the testing required for each phase of the acquisition process and serve as the structure to guide the testing program so these data may be compared against performance criteria.

13.4.4 Criteria

Criteria are statements of a system's required technical performance and operational effectiveness, suitability, and supportability. Criteria are often expressed as "objectives and thresholds." (Some Services, however, specify performance and supportability requirements exclusively in terms of thresholds and avoid the use of the concept of objectives.) These performance measurements provide the basis for collecting data used to evaluate/answer test issues.

Criteria must be unambiguous and assessable whether stated qualitatively or quantitatively. They may compare the mission performance of the new system to the one being replaced, compare the new system to a predetermined standard, or compare mission performance results using the new system to not having the system. Criteria are the final values deemed necessary by the user. As such, they should be developed in close coordination with the system user, other testers, and specialists in all other areas of operational effectiveness and suitability.
These values may be changed as systems develop and associated testing and evaluation proceed. Every issue should have at least one criterion that is a concise measure of the function. Values must be realistic and achievable within the state of the art of engineering technology. A quantitative or qualitative criterion should have a clear definition, free of ambiguous or imprecise terminology such as "adequate," "sufficient," or "acceptable."

13.4.4.1 Test of Thresholds and Objectives

A threshold for a performance parameter lists a minimum acceptable operational value below which the utility of the system becomes questionable. 3 Thresholds are stated quantitatively whenever possible. Specification of minimally acceptable performance in measurable parameters


is essential to selecting appropriate MOEs, which, in turn, heavily influence test design. Thresholds are of value only when they are testable; i.e., actual performance can be measured against them. The function of T&E is to verify the attainment of required thresholds. The Program Manager (PM) must budget for the attainment of threshold values.

Objectives are "the desired operational goal associated with a performance attribute, beyond which any gain in utility does not warrant additional expenditure. The objective value is an operationally significant increment above the threshold. An objective value may be the same as the threshold when an operationally significant increment above the threshold is not significant or useful." 4 Objectives are not normally addressed by the operational tester, whose primary concern is to "determine if thresholds in the approved CPD and critical operational issues have been satisfied." 5

Going into system demonstration, thresholds and objectives are expanded along with the identification of more detailed and refined performance capabilities and characteristics resulting from trade-off studies and testing conducted during the evaluation of Engineering Development Models (EDMs). Along with the CPD, they should remain relatively stable through production.

13.5 MEASURES OF EFFECTIVENESS (MOEs)

Requirements, thresholds, and objectives established in early program documentation form the basis for evaluation criteria. If program documentation is incomplete, the tester may have to develop evaluation criteria in the absence of specific requirements. Evaluation criteria are associated with objectives, sub-objectives, and MOEs (sometimes partitioned into MOEs and measures of suitability). For example, an MOE (airspeed) may have an associated evaluation criterion (450 knots) against which the actual performance (425 knots) is compared to arrive at a rating. An MOE of a system is a parameter that evaluates the capacity of the system to accomplish its assigned missions under a given set of conditions. MOEs are important because they determine how test results will be judged; and, since test planning is directed toward obtaining these measures, it is important that they be defined early. Generally, the resolution of each critical issue is in terms of the evaluation of some MOE. In this case, the operating, implementing, and supporting commands must agree with the criteria before the test organization makes use of them in assessing test results. Ensuring that MOEs can be related to the user's operational requirements is an important consideration when identifying and establishing evaluation criteria. Testers must ensure that evaluation criteria and MOEs are updated if requirements change. The MOEs should be so specific that the system's effectiveness during developmental and operational testing can be assessed using some of the same effectiveness criteria as the Analysis of Alternatives [AoAs]. 6
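To make the comparison concrete, the following minimal Python sketch (not taken from the guide; the 475-knot objective value and the rating logic are purely illustrative assumptions) rates a measured value against a threshold and an objective in the way described above, using the airspeed example.

# Illustrative sketch only: rate a measured MOE/MOP value against
# threshold and objective criteria, as in the airspeed example above.

def rate_measure(name, measured, threshold, objective, higher_is_better=True):
    """Return a simple rating string for one measure."""
    meets_threshold = measured >= threshold if higher_is_better else measured <= threshold
    meets_objective = measured >= objective if higher_is_better else measured <= objective
    if meets_objective:
        rating = "meets objective"
    elif meets_threshold:
        rating = "meets threshold"
    else:
        rating = "below threshold"
    return f"{name}: measured {measured}, threshold {threshold}, objective {objective} -> {rating}"

# Hypothetical criteria: a 450-knot threshold with a notional 475-knot
# objective; measured performance in the example above is 425 knots.
print(rate_measure("Airspeed (knots)", 425, 450, 475))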
13.6 EVALUATION PLANNING

13.6.1 Evaluation Planning Techniques

Evaluation planning is an iterative process that requires formal and informal analyses of system operation (e.g., threat environment, system design, tactics, and interoperability). Techniques that have been proven effective in evaluation planning include: process analysis, design or engineering analysis, matrix analysis, and dendritic analysis. 7

13.6.1.1 Process Analysis Techniques

Process analysis techniques consist of thinking through how the system will be used in a variety of environments, threats, missions, and scenarios in order to understand the events, actions, situations, and results that are expected to occur. This technique aids in the identification and clarification of appropriate MOEs, test conditions, and data requirements.


13.6.1.2 Design/Engineering Analysis Techniques

Design or engineering analysis techniques are used to examine all mechanical or functional operations that the system has been designed to perform. These techniques involve a systematic exploration of the system's hardware and software components, purpose, performance bounds, manpower and personnel considerations, known problem areas, and impact on other components. Exploration of the way a system operates compared to intended performance functions often identifies issues, MOEs, specific data, test events, and required instrumentation.

13.6.1.3 Matrix Analysis Techniques

Matrix analysis techniques are useful for analyzing any situation where two classifications must be cross-referenced. For example, a matrix of "types of data" versus "means of data collection" can reveal not only types of data having no planned means of collection but also redundant or backup collection systems. Matrix techniques are useful as checklists, as organizational tools, or as a way of identifying and characterizing problem areas. Matrix techniques are effective for tracing a system's operational requirements through contractual specification documents, issues, and criteria to sources of individual data or specific test events.

13.6.1.4 Dendritic Analysis Techniques

Dendritic analysis techniques are an effective way of decomposing critical issues to the point where actual data requirements and test measurements can be identified. In these techniques, issues are successively broken down into objectives, MOEs, Measures of Performance (MOPs), and data requirements in a rootlike structure as depicted in Figure 13-2. In this approach, objectives are used to clearly express the broad aspects of T&E related to the critical issues and the overall purpose of the test. The MOEs are developed as subsets of the objectives and are designed to treat specific and addressable parts of the objectives. Each MOE is traceable as a direct contributor to one objective and, through it, is identifiable as a direct contributor to addressing one or more critical issues. 8 Each test objective and MOE is also linked to one or more MOPs (quantitative or qualitative measures of system performance under specified conditions) that, in turn, are tied to specific data elements. The dendritic approach has become a standard military planning technique.

Figure 13-2. Dendritic Approach to Test and Evaluation (example: a critical issue, the capability of candidate radar "X," decomposes into test objectives such as determine detection capability and determine tracking capability; MOEs such as mean/median detection range, ease of operation of controls/displays, and false target rates; MOPs such as mean and variance of detection range, percent of time operators judged operation of controls and displays satisfactory, and mean and variance of the number of false targets per scan; and data elements such as measured/observed detection range, operator comments, observation of scans, and observation of false alarms)
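As an illustration of the dendritic structure just described, the short Python sketch below (not from the guide; the dictionary layout and function name are illustrative choices) encodes a fragment of the Figure 13-2 radar example and traces each data element back to its MOE, objective, and critical issue.

# Illustrative sketch only (not from the source): a dendritic decomposition,
# critical issue -> test objectives -> MOEs -> MOPs -> data elements,
# for the candidate radar "X" example shown in Figure 13-2.

dendrite = {
    "critical_issue": 'Capability of candidate radar "X"',
    "objectives": [
        {
            "objective": "Determine detection capability",
            "moes": [
                {
                    "moe": "Mean/median detection range",
                    "mops": ["Mean and variance of detection range"],
                    "data_elements": ["Measured/observed detection range"],
                },
                {
                    "moe": "False target rates",
                    "mops": ["Mean and variance of false targets per scan"],
                    "data_elements": ["Observation of false alarms"],
                },
            ],
        },
    ],
}

def data_requirements(tree):
    """List every data element so collection planning can be traced back to the issue."""
    for obj in tree["objectives"]:
        for moe in obj["moes"]:
            for element in moe["data_elements"]:
                yield (tree["critical_issue"], obj["objective"], moe["moe"], element)

for row in data_requirements(dendrite):
    print(" -> ".join(row))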
13.6.2 Sources of Data

As evaluation and analysis planning matures, focus turns toward identifying data sources as a means for obtaining each data element. Initial identification tends to be generic, such as: engineering study, simulation, modeling, or contractor test. Later identification reflects specific studies, models, and/or tests. A data source matrix is a useful planning tool to show where data are expected to be obtained during the T&E of the system.

There are many sources of data that can contribute to the evaluation. Principal sources include: studies and analyses, models, simulations, war games, contractor testing, Development Test (DT), Operational Test (OT), and comparable systems.

13.7 EVALUATING DEVELOPMENT AND OPERATIONAL TESTS

Technical and operational evaluations employ different techniques and have different evaluation criteria. The DT&E is often considered technical evaluation while OT&E addresses the operational aspects of a system. Technical evaluation deals primarily with instrumented tests and statistically valid data. An operational evaluation deals with operational realism and the combat uncertainties. 9

The DT&E uses technical criteria for evaluating system performance. These criteria are usually parameters that can be measured during controlled DT&E tests. They are particularly important to


the developing organization and the contractor but are of less interest to the independent operational tester. The operational tester focuses on issues such as demonstrating target acquisition at useful ranges, air superiority in combat, or the probability of accomplishing a given mission. For example, during DT&E, firing may be conducted on a round-by-round basis with each shot designed to test an individual specification or parameter with other parameters held constant. Such testing is designed to measure the technical performance of the system. In contrast, in OT&E, proper technical performance regarding individual specifications/parameters is de-emphasized and the environment is less controlled. The OT&E authority must assess whether, given this technical performance, the weapon system is operationally effective and operationally suitable when employed under realistic combat (with opposing force) and environmental conditions by typical personnel.

The emphasis in DT is strictly on the use of quantitative data to verify attainment of technical specifications. Quantitative data are usually analyzed using some form of statistics. Qualitative data take on increasing importance in OT&E when effectiveness and suitability issues are being explored. Many techniques are used to analyze qualitative data. They range from converting expressions of preference or opinion into numerical values to establishing a consensus by committee. For example, a committee may assign values to parameters such as "feel," "ease of use," "friendliness to the user," and "will the user want to use it," on a scale of 1 to 10. Care should be exercised in the interpretation of the results of qualitative evaluations. For instance, when numbers are assigned to average


evaluations and their standard deviations, meanings will differ from quantitative data averages and standard deviations.
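A minimal sketch of the caution described above (not from the guide; the ratings and summary choices are illustrative): ordinal scores on a 1-to-10 scale can be summarized, but their mean and standard deviation should be read differently from averages of physical measurements, so the median and full distribution are reported alongside them.

# Illustrative sketch only: summarizing committee ratings ("ease of use" on a
# 1-to-10 scale). The mean and standard deviation of ordinal scores are shown
# alongside the median and the distribution, which are usually safer summaries
# for qualitative data.

from statistics import mean, median, stdev
from collections import Counter

ratings = [7, 8, 6, 9, 7, 5, 8, 7]   # hypothetical operator scores for "ease of use"

print(f"mean={mean(ratings):.2f}  stdev={stdev(ratings):.2f}  median={median(ratings)}")
print("distribution:", dict(sorted(Counter(ratings).items())))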


13.7.1 Technical Evaluation

The Services' materiel development organizations are usually responsible for oversight of all aspects of DT&E, including the technical evaluation. The objectives of a technical evaluation are:

• To assist the developers by providing information relative to technical performance; qualification of components; compatibility, interoperability, vulnerability, lethality, transportability, Reliability, Availability, and Maintainability (RAM); manpower and personnel; system safety; Integrated Logistics Support (ILS); correction of deficiencies; accuracy of environmental documentation; and refinement of requirements;

• To ensure the effectiveness of the manufacturing process of equipment and procedures through production qualification T&E;

• To confirm readiness for OT by ensuring that the system is stressed beyond the levels expected in the OT environment;

• To provide information to the decision authority at each decision point regarding a system's technical performance and readiness to proceed to the next phase of acquisition;

• To determine the system's operability in the required climatic and realistic battlefield environments to include natural, induced, and countermeasure environments. 10

13.7.2 Operational Evaluation

The independent OT&E authority is responsible for the operational evaluation. The objectives of an operational evaluation are:

• To assist the developers by providing information relative to operational performance; doctrine, tactics, training, logistics; safety; survivability; manpower; technical publications; RAM; correction of deficiencies; accuracy of environmental documentation; and refinement of requirements;

• To assist decision makers in ensuring that only systems that are operationally effective and suitable are delivered to the operating forces;

• To provide information to the decision authority at each decision point as to a system's operational effectiveness, suitability, and readiness to proceed to the next phase of acquisition;

• To assess, from the user's viewpoint, a system's desirability, considering systems already fielded, and the benefits or burdens associated with the system. 11

13.8 SUMMARY

A primary consideration in identifying information to be generated by an evaluation program is having a clear understanding of the decisions the information will support. The importance of structuring the T&E program to support the resolution of critical issues cannot be overemphasized. It is the responsibility of those involved in the evaluation process to ensure that the proper focus is maintained on key issues, the T&E program yields information on critical technical and operational issues, all data sources necessary for a thorough evaluation are tapped, and evaluation results are communicated in an effective and timely manner. The evaluation process should be evolutionary throughout the acquisition phases.

ENDNOTES

1. Defense Acquisition University Press, Glossary of Defense Acquisition Acronyms & Terms, 11th Edition, September 2003.
2. Chairman of the Joint Chiefs of Staff Instruction (CJCSI) 3170.01D, Joint Capabilities Integration and Development System (JCIDS), March 12, 2004, GL (glossary)-9.
3. Ibid.
4. Ibid., GL-11.
5. Department of Defense Instruction (DoDI) 5000.2, Operation of the Defense Acquisition System, May 12, 2003, p. 37.
6. Ibid.
7. Department of the Army (DA) Pamphlet 73-1, Test and Evaluation in Support of Systems Acquisition, May 30, 2003.
8. Air Force Regulation (AFR) 80-38, Management of the Air Force Systems Survivability Programs, May 27, 1994.
9. NAVSO-P-2457, RDT&E/Acquisition Management Guide, 10th Edition.
10. Army Regulation (AR) 73-1, Test and Evaluation Policy, January 7, 2002.
11. AFR 800-2, Acquisition Program Management, July 21, 1994.




14

MODELING AND SIMULATION SUPPORT TO T&E

14.1 INTRODUCTION

This chapter discusses the applications of Modeling and Simulation (M&S) in Test and Evaluation (T&E). The need for M&S has long been recognized, as evidenced by this quotation from the USAF Scientific Advisory Board in June 1965:

Prediction of combat effectiveness can only be, and therefore must be, made by using the test data in analytical procedures. This analysis usually involves some type of model, simulation, or game (i.e., the tools of operations or research analysis). It is the exception, and rarely, that the "end result," i.e., combat effectiveness, can be deduced directly from test measurements.

In mandating T&E early in the acquisition process, Department of Defense Instruction (DoDI) 5000.2 encourages the use of M&S as a source of T&E data. 1 For instance, the Armored Family of Vehicles (AFV) program used more than 60 models, simulations, and other test data to support system concept exploration. The reliance on M&S by this and other acquisition programs provides the T&E community with valuable information that can increase confidence levels, decrease field test time and costs, and provide data for pre-test prediction and post-test validation.

The Defense Modeling and Simulation Office (DMSO), working for the Director, Defense Research and Engineering (DDR&E), is developing Office of the Secretary of Defense (OSD) guidance on the application of M&S to the acquisition process. The DMSO operates the Modeling and Simulation Information Analysis Center (MSIAC) to provide assistance to all Department of Defense (DoD) M&S users, including program offices and the acquisition community at large.

This chapter discusses using M&S to increase the efficiency of the T&E process, reduce time and cost, provide otherwise unattainable and immeasurable data, and provide more timely and valid results.

14.2 TYPES OF MODELS AND SIMULATIONS

The term "modeling and simulation" is often associated with huge digital computer simulations, but it also includes manual and man-in-the-loop war games, Hardware-in-the-Loop (HWIL) simulations, test beds, hybrid laboratory simulators, and prototypes.

A mathematical model is an abstract representation of a system that provides a means of developing quantitative performance requirements from which candidate designs can be developed. Static models are those that depict conditions of state, while dynamic models depict conditions that vary with time, such as the action of an autopilot in controlling an aircraft. Simple dynamic models can be solved analytically, and the results represented graphically.
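As a minimal, hedged illustration of the static/dynamic distinction above (not drawn from the guide), the Python sketch below integrates a simple first-order dynamic model, a notional autopilot-like lag response, and compares the numerical result with the analytical solution; all parameter values are invented for the example.

# Illustrative sketch only (not from the source): a first-order dynamic model,
# x'(t) = (command - x(t)) / tau, integrated numerically with Euler steps and
# compared against its closed-form analytical solution. Values are notional.

import math

def simulate(command, x0, tau, dt, t_end):
    """Euler integration of the first-order lag model."""
    t, x, history = 0.0, x0, []
    while t <= t_end:
        history.append((t, x))
        x += dt * (command - x) / tau
        t += dt
    return history

def analytic(command, x0, tau, t):
    """Closed-form solution of the same model."""
    return command + (x0 - command) * math.exp(-t / tau)

for t, x in simulate(command=10.0, x0=0.0, tau=2.0, dt=0.1, t_end=10.0)[::20]:
    print(f"t={t:4.1f}  simulated={x:6.3f}  analytic={analytic(10.0, 0.0, 2.0, t):6.3f}")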


According to a former Director, Defense Test and Evaluation (DDT&E), 2 simulations used in T&E can be divided into three categories:

Constructive Simulations: Computer simulations are strictly mathematical representations of systems and do not employ any actual hardware. They may, however, incorporate some of the actual software that might be used in a system. Early in a system's life cycle, computer simulations can be expected to provide the most system evaluation information. In many cases, computer simulations can be readily developed as modifications of existing simulations for similar systems. For example, successive generations of AIM-7 missile simulations have been effectively used in T&E.

Virtual Simulations: A system test bed usually differs from a computer simulation in that it contains some, but not necessarily all, of the actual hardware that will be a part of the system. Other elements of the system are either not incorporated or, if they are incorporated, are in the form of computer simulations. The system operating environment (including threat) may either be physically simulated, as in the case of a flying test bed, or computer simulated, as in the case of a laboratory test bed. Aircraft cockpit simulators used to evaluate pilot performance are good examples of system test beds. As development of a system progresses, more subsystems become available in hardware form. These subsystems can be incorporated into system test beds that typically provide a great deal of the system evaluation information used during the middle part of a system's development cycle.

Another type of virtual simulation used in T&E is the system prototype. Unlike the system test bed, all subsystems are physically incorporated in a system prototype. The system prototype may closely represent the final system configuration, depending on the state of development of the various subsystems that compose it. Preproduction prototype missiles and aircraft used in operational testing by the Services are examples of this class of simulation. As system development proceeds, eventually all subsystems will become available for incorporation in one or more system prototypes. HWIL simulators or full-up man-in-the-loop system simulators may provide the foundation for continuous system testing and improvement. These simulators can provide the basis for transitioning hardware and software into operationally realistic training devices with mission rehearsal capabilities. Operational testing of these prototypes frequently provides much of the system evaluation information needed for a decision on full-scale production and deployment.

Live Simulations: Some say that everything except global combat is a simulation, even limited regional engagements. Live exercises where troops use equipment under actual environmental conditions approach real life in combat while conducting peacetime operations. Training exercises and other live simulations provide a testing ground with real data on actual hardware, software, and human performance when subjected to stressful conditions. These data can be used to validate the models and simulations used in an acquisition program.

As illustrated in Figure 14-1, there is a continuous spectrum of simulation types, with the pure computer simulation at one end and the pure hardware prototype at the other end.
As you rely less on live forces in exercises, it becomes harder to judge how representative the exercise is of real operations. At the other end of the spectrum, as you rely more on virtual simulations (software-generated activities), the models' internal complexity increases, and that complexity affects the accuracy of the results. Examples of uncertainty would be defining why a battle was lost in a war


game simulation, or why a system failed to perform as expected in a prototype or full system simulation, or why, in a flight simulator, the aircraft seemed to become inverted when crossing the equator. Determining accuracy is a factor of the technical level of confidence or credibility of the simulation.

Figure 14-1. The Simulation Spectrum (computer simulations, test-beds, and prototypes span a continuum from less credibility, more software, and more complexity of forces to greater credibility, more hardware, and fewer forces as operational realism increases)

14.3 VALIDITY OF MODELING AND SIMULATION

Simulations are not a substitute for live testing. There are many things that cannot be adequately simulated by computer programs, including the decision process and the proficiency of personnel in the performance of their functions. Therefore, models and simulations are not a total substitution for physical T&Es. Simulations, manual and computer-designed, can complement and increase the validity of live T&Es by proper selection and application. Figure 14-2 contrasts the test criteria that are conducive to M&S versus physical testing. Careful selection of the simulation, knowledge of its application and operation, and meticulous selection of input data will produce representative and valid results.

Figure 14-2. Values of Selected Criteria Conducive to Modeling and Simulation
Criteria: value conducive to physical testing / value conducive to modeling and simulation
Test sample size/number of variables: small/few / large/many
Status of variables/unknowns: controllable / uncontrollable
Physical size of problem: small area, few players / large area, many players
Availability of test equipment: available / unavailable
Availability of test facilities: ranges and other test facilities available / benchmarked, validated computer models available
Types of variables/unknowns: spatial/terrain / low importance of spatial/terrain
Diplomatic/political factors: conventional conflicts / nuclear or chemical conflicts

Conversely, certain highly complex simulations can be more effective analytical tools than limited live testing, generally due to the significance of the external constraints. For example, nuclear weapons effects testing can only be done in simulation because of international treaties. Testing depleted uranium ammunition is very expensive, with its related environmental considerations. Finally, many confidence-building iterations of multimillion-dollar missile defense tests can only be executed using simulated events in conjunction with a few instances of real firings.

The important element in using a simulation is to select one that is representative and either addresses, or is capable of being modified to address, the level of detail (issues, thresholds, and objectives) under investigation. Models and simulations must be approved for use through


verification, validation, and accreditation processes. 3 Verification is the process of determining that a model implementation accurately represents the developer's conceptual description and specifications. Validation is the process of determining: (a) the manner and degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model; and (b) the confidence that should be placed on this assessment. Accreditation is the official certification that a model or simulation is acceptable for use for a specific purpose.

14.4 SUPPORT TO TEST DESIGN AND PLANNING

14.4.1 Modeling and Simulation in T&E Planning

The M&S can assist in the T&E planning process and can reduce the cost of testing. In Figure 14-3, areas of particular application include scenario development and the timing of test events; the development of objectives, essential elements of analysis, and MOEs; the identification of variables for control and measurement; and the development of data collection, instrumentation, and data analysis plans. For example, using simulation, the test designer can examine system sensitivities to changes in variables to determine the critical variables and their ranges of values to be tested. The test designer can also predict the effects of various assumptions and constraints and evaluate candidate MOEs to help formulate the test design. In the live fire vulnerability testing of the F/A-22 wing panel, a mechanical deformation model was used extensively in pre-test predictions. The pre-test analysis was used to design the experiment, assisted in shot-line selection, and allowed elimination of the aft boom testing, saving schedule time and about $100,000. Real data in post-test analysis verified the capability to predict impulse damage within acceptable limits, resulting in greater use of the model in future applications.

Figure 14-3. Modeling and Simulation Application in Test and Evaluation (maps campaign/theater, mission/battle, engagement, and engineering-level models, virtual and physical prototypes, threat simulators, stimulators, and live testing to program activities and T&E documents across the acquisition phases, from concept refinement and technology development through system development and demonstration, production and deployment, and operations and support)

Caution must be exercised when planning to rely on simulations to obtain test data, as they tend to be expensive to develop or modify, difficult to integrate with data from other sources, and often do not provide the level of realism required for operational tests. Although simulations are not a


MODELS &SIMULATIONSCONCEPT REFINEMSAMSBMSCFRPDRTECH DEV SYSTEM DEV & DEMO PRODUCTION & DEPLOY OPERATIONS & SUPPORTANALYSISOFALTERNATIVESCAMPAIGN/THEATER(OUTCOMES)MISSION/BATTLE(EFFECTIVENESS)UPDATESUPDATESUPDATESAS REQDUPDATESAS REQDENGAGEMENT(EFFECTIVENESS)VIRTUAL PROTOTYPESENGINEERING(PERFORMANCE)PHYSICAL MODELSLIVE PROTOTYPETESTINGTHREAT SIMULATORSSURVIVABILITY/VULNERABILITYPHYSICAL MODELSLIVE PROTOTYPE TESTINGENGINEERING (E.G.,HW/SWIL)STIMULATORSLIVE TESTINGENGINEERING (E.G.,HW/SWIL TO SUPPORTPAT&E)STIMULATORSLIVE TESTINGENGINEERING (TOSUPPORT MODIFICATIONTESTING)DISTRIBUTED SIMULATIONPROGRAM ACTIVITIES/DOCUMENTS• TECHNOLOGYDEVELOPMENTSTRATEGY• PRELIMINARYTEMP•TEMP•TEMP• DOT&E REPORT• LFT REPORT• IDENTIFY MOEs,MOPs• COMPONENT DT&E• SUBSYSTEM &SYSTEM DT&E•DT&E• PAT&E• MODIFICATION TESTING• CRITICAL SYSTEMCHARACTERISTICS•LFT&E• ID MODELS• EARLY OPERATIONALASSESSMENT• OPERATIONALASSESSMENT•IOT&E •FOT&E •FOT&EFigure 14-3. Modeling <strong>and</strong> Simulation Application in <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong>“cure-all,” they should be explored <strong>and</strong> usedwhenever feasible as another source of data forthe evaluator to consider during the test evaluation.Simulation fidelity is improving rapidly withenhancements to <strong>com</strong>puter technology, <strong>and</strong> it isno longer cost prohibitive to develop credibleM&S applications for future weapon systems.Computer simulations may be used to test theplanning for an exercise. By setting up <strong>and</strong>running the test exercise in a simulation, the timing<strong>and</strong> scenario may be tested <strong>and</strong> validated.Critical events may include interaction of variousforces that test the MOEs <strong>and</strong>, in turn, test objectives.Further, the simulation may be used to verify thestatistical test design <strong>and</strong> the instrumentation, datacollection, <strong>and</strong> data analysis plans. Essentially,the purpose of <strong>com</strong>puter simulation in pre-testplanning is to preview the test to evaluate waysto make test results more effective. Pre-testingattempts to optimize test results by pointing outpotential trouble spots. It constitutes a test setupanalysis, which can en<strong>com</strong>pass a multitude ofareas. The model-test-model process is an integratedapproach to using models <strong>and</strong> simulationsin support of pre-test analysis <strong>and</strong> planning;conducting the actual test <strong>and</strong> collecting data; <strong>and</strong>post-test analysis of test results along with furthervalidation of the models using the test data.As an example of simulations used in testplanning, consider a model that portrays aircraftversus air defenses. The model can be used toreplicate typical scenarios <strong>and</strong> provide data on14-5


As an example of simulations used in test planning, consider a model that portrays aircraft versus air defenses. The model can be used to replicate typical scenarios and provide data on the number of engagements, air defense systems involved, aircraft target, length and quality of the engagement, and a rough approximation of the success of the mission (i.e., if the aircraft made it to the target). With such data available, a data collection plan can be developed to specify, in more detail, when and where data should be collected, from which systems, and in what quantity. The results of this analysis impact heavily on long lead-time items such as data collection devices and data processing systems. The more specificity available, the fewer the number of surprises that will occur downstream. As tactics are decided upon and typical flight paths are generated for the scenario, an analysis can be prepared on the flight paths over the terrain in question, and a determination can be made regarding whether the existing instrumentation can track the numbers of aircraft involved in their maneuvering envelopes. Alternative site arrangements can be examined, and tradeoffs can be made between the amount of equipment to be purchased and the types of profiles that can be tracked for this particular test. Use of such a model can also highlight numerous choices available to the threat air defense system in terms of opportunities for engagement and practical applications of doctrine to the specific situations.

14.4.2 Simulation, Test and Evaluation Process (STEP)

The STEP has been proposed in DoD guidance of the recent past and is still a valid concept. In STEP, simulation and test are integrated, each depending on the other to be effective and efficient. Simulations provide predictions of the system's performance and effectiveness, while tests are part of a strategy to provide information regarding risk and risk mitigation, to provide empirical data to validate models and simulations, and to determine whether systems are operationally effective, suitable, and survivable for intended use. A by-product of this process is a set of models and simulations with a known degree of credibility, providing the potential for reuse in other efforts (Figure 14-4).

Figure 14-4. STEP Process (an iterative cycle of system design, design development, testing, recurring assessments, and through-life-cycle testing, linking system design, force-on-force simulation, mission capabilities, and the fielded product)


STEP is driven by mission and system requirements. The product of STEP is information. The information supports acquisition program decisions regarding technical risk, performance, system maturity, operational effectiveness, suitability, and survivability. STEP applies to all acquisition programs, especially Major Defense Acquisition Programs (MDAPs) and Major Automated Information Systems (MAISs). Throughout STEP, tests are conducted to collect data for evaluating the system and refining and validating models. Through the model-test-model iterative approach, the sets of models mature, culminating in accurate representations of the system with appropriate fidelity, which can be used to predict system performance and to support the acquisition and potentially the training communities.

1. STEP begins with the Mission Needs Analysis and continues through the life cycle. Top-level requirements are used to develop alternative concepts and select/develop digital models that are used to evaluate theater-/campaign- and mission-/battle-level simulations. Mission-/battle-level models are used to evaluate the ability of a multiple-platform force package to perform a specific mission. Mission and functional requirements continue to be refined, and the system reaches the preliminary design stage.

2. M&S is used both as a predictive tool and with test in an iterative process to evaluate the system design. The consequences of design changes are evaluated and help translate the most promising design approach into a stable, interoperable, and cost-effective design.

3. System components and subsystems are tested in a laboratory environment. Data from this hardware are employed in the model-test-model process. M&S is used in the planning of tests to support a more efficient use of resources. Simulated tests can be run on virtual ranges to conduct rehearsals and determine if test limitations can be resolved. STEP tools are used to provide data for determining the real component or subsystem's performance and interaction with other components. M&S is used during both Development Testing (DT) and Operational Testing (OT) to increase the amount of data and supplement the live test events that are needed to meet test objectives.

4. Periodically throughout the acquisition process, the current version of the system under development should be reexamined in a synthetic operational context to reassess its military worth. This is one of the significant aspects of STEP: understanding the answer to the question, What difference does this change make in the system's performance?

5. STEP does not end with fielding and deployment of a system, but continues to the end of the system's life cycle. STEP results in a thoroughly tested system with performance and suitability risks identified. A by-product is a set of models and simulations with a known degree of credibility with the potential for reuse in other efforts.
New test data can be applied to the models to incorporate any system enhancements and further validate them.

14.5 SUPPORT TO TEST EXECUTION

Simulations can be useful in test execution and dynamic planning. With funds and other restrictions limiting the number of times that a test may be repeated, and each test conducted over several days, it is mandatory that the test director exercise close control over the conduct of the test to ensure the specific types and quantities of data needed to meet the test objectives are being gathered and to ensure adequate safety. The test


director must be able to make minor modifications to the test plan and scenario to force achievement of these goals. This calls for a dynamic (quick-look) analysis capability and a dynamic planning capability. Simulations may contribute to this capability. For example, using the same simulation(s) as used in pre-test planning, the tester could input data gathered during the first day of the exercise to determine the adequacy of the data to fulfill the test objectives. Using this data, the entire test could be simulated. Projected inadequacies could be isolated, and the test plans could be modified to minimize the deficiencies.

Simulations may also be used to support test control and to ensure safety. For example, during missile test firings at White Sands Missile Range (WSMR), New Mexico, aerodynamic simulations of the proposed test were run on a computer during actual firings so that real-time missile position data could be compared continuously to the simulated missile position data. If any significant variations occurred and if the range safety officer was too slow (both types of position data were displayed on plotting boards), the computer issued a destruct command.
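A minimal sketch of the kind of real-time comparison described above (illustrative only; the positions, the 500-meter threshold, and the decision logic are assumptions, not the WSMR implementation):

# Illustrative sketch only: compare telemetry positions against simulated
# (predicted) positions in real time and flag a destruct recommendation when
# the deviation exceeds a safety threshold. All values are hypothetical.

import math

SAFETY_THRESHOLD_M = 500.0   # notional allowable deviation

def deviation(telemetry, predicted):
    """Straight-line distance between measured and predicted position (meters)."""
    return math.dist(telemetry, predicted)

def monitor(track):
    """track is a sequence of (time_s, telemetry_xyz, predicted_xyz) samples."""
    for t, telemetry, predicted in track:
        dev = deviation(telemetry, predicted)
        status = "DESTRUCT RECOMMENDED" if dev > SAFETY_THRESHOLD_M else "nominal"
        print(f"t={t:5.1f}s  deviation={dev:7.1f} m  {status}")
        if dev > SAFETY_THRESHOLD_M:
            break

# Hypothetical samples: (time, measured position, simulated position), meters.
monitor([
    (1.0, (100.0, 0.0, 250.0), (105.0, 0.0, 245.0)),
    (2.0, (400.0, 30.0, 900.0), (390.0, 0.0, 910.0)),
    (3.0, (900.0, 700.0, 1500.0), (880.0, 10.0, 1600.0)),
])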
Simulations can be used to augment tests by simulating non-testable events and scenarios. Although operational testing should be accomplished in as realistic an operational environment as possible, pragmatically some environments are impossible to simulate for safety or other reasons. Some of these include the environment of a nuclear battlefield, to include the effects of nuclear bursts on friendly and enemy elements. Others include two-sided live firings and adequate representation of other forces to ascertain compatibility and interoperability data. Instrumentation, data collection, and data reduction of large combined armed forces (e.g., brigade, division, and larger-sized forces) become extremely difficult and costly. Simulations are not restricted by safety factors and can realistically replicate many environments that are otherwise unachievable in an Operational Test and Evaluation (OT&E)—nuclear effects, large combined forces, Electronic Countermeasures (ECM), Electronic Counter-Countermeasures (ECCM), and many engagements. These effects can be simulated repeatedly with no environmental impacts, costing only for the simulation operations.

Usually, insufficient units are available to simulate the organizational relationships and interaction of the equipment with its operational environment, particularly during the early OT&E conducted using prototype or pilot production-type equipment. Simulations are not constrained by these limitations. Data obtained from a limited test can be plugged into a simulation that is capable of handling many of the types of equipment being tested. It can interface them with other elements of the blue forces and operate them against large elements of the red forces to obtain interactions.

End-item simulators can be used to evaluate design characteristics of equipment and can be used to augment the results obtained using prototype or pilot production-type equipment that is representative of the final item. The simulator may be used to expand test data to obtain the required iterations or to indicate that the human interface with the prototype equipment will not satisfy the design requirements.

It is often necessary to use substitute or surrogate equipment in testing; e.g., American equipment is used to represent threat-force equipment. In some cases the substitute equipment may have greater capabilities than the real equipment; in other cases it may have less. Simulations are capable of representing the real characteristics of equipment and, therefore, can be used as a means of modifying raw data collected during the test to reflect real characteristics.

As an example, if the substitute equipment is an AAA gun with a tracking rate of 30 degrees per second and the equipment for which it is substituted has a tracking rate of 45 degrees per second,


the computer simulation could be used to augment the collected, measured data by determining how many rounds could have been fired against each target, or whether targets that were missed because the tracking rate was too slow could have been engaged by the actual equipment. Simultaneous consideration of other differing factors could have a positive or negative synergistic effect on test results, and a credible simulation designed with the flexibility to replicate operational variations could allow the evaluation to be completed more quickly.

14.6 SUPPORT TO ANALYSIS AND TEST REPORTING

M&S may be used in post-test analysis to extend and generalize results and to extrapolate to other conditions. The difficulty of instrumenting and controlling large exercises and of collecting and reducing the data, together with resource costs, limits the size of T&E to some degree. This makes determining the suitability of equipment, including compatibility, interoperability, organization, etc., a difficult process. To a large degree, the limited view of the interactions, interrelationships, and compatibility of large forces may be supplemented by using actual data collected during the test and applying it in the simulation.

Simulations can be used to extend test results, save considerable energy (fuel and manpower), and save money by reducing the need to repeat data points to improve the statistical sample or to determine overlooked or directly unmeasured parameters. Sensitivity analyses can be run using simulations to evaluate the robustness of the design.

In analyzing the test results, data can be compared to the results predicted by the simulations used early in the planning process. Thus, the simulation is validated by the actual live test results, but the test results are also validated by the simulation.

14.7 SIMULATION INTEGRATION

Simulations have become so capable and widespread in their application, and the ability to network dissimilar and/or geographically separate simulators in real time has grown so much, that whole new applications are being discovered. Simulations are no longer stove-piped tools in distinct areas but are increasingly crossing disciplines and uses. The F-35 Joint Strike Fighter (JSF) program uses nearly 200 different models and simulations throughout its many development and testing activities, in both government centers and contractor facilities. Increasingly, operational issues are being determined through operational analysis simulations, which will subsequently impact OT much later in a program. Elements of the T&E community should be involved increasingly early in every program to ensure such results are compatible with the OT objectives and requirements. For example, the F-35 program networked eight different war game simulations together to develop the initial operational requirements that subsequently became documented in their Operational Requirements Document (ORD) for the many different required Service applications, i.e., Navy, Air Force, Marine Corps, and British Royal Navy.
Only through this extensive "Virtual Strike Warfare Environment" could such a broad and concurrent range of operational requirements be evaluated and established. New capabilities were achieved, but the extensive multi-simulation network demanded new techniques for establishing technical confidence in the results.

14.8 SIMULATION PLANNING

With M&S becoming increasingly complex, expensive, and extensive, testers must be thorough in planning both its use and the subsequent technical confidence in its results. Testers, including the Operational Test Agencies (OTAs), are expected to be involved increasingly early if the subsequent T&E results are to be accepted. Simulation Support Plans (SSPs) are program documents that span the many simulations, their purposes, and their expected credibility. The Army, Air Force, and Marine Corps have policies requiring the early establishment of an SSP process and stand-alone SSP documents. Usually this starts with a program office-level simulation support group of Service M&S experts who advise the program on M&S opportunities, establish the Verification, Validation, and Accreditation (VV&A) process, and document these procedures in the SSP, which extends across the life cycle of system development, testing, and employment. It is vital that the SSP be fully coordinated with the Test and Evaluation Master Plan (TEMP). Without early planning of each model's or simulation's purpose, cost, and impact on system design as well as testing, earlier programs have ended up with volumes of data collected for different purposes that could not be analyzed, were subsequently found not to be credible, or delayed testing or program development.
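The guide does not prescribe a format for the SSP's inventory of models and simulations. As a purely illustrative sketch, a program office might track each entry with a record along the following lines; every field name and example value here is a hypothetical assumption, not an SSP requirement.

```python
# Hypothetical sketch of an SSP model/simulation inventory record.
# Field names and example values are illustrative assumptions only,
# not a format required by the SSP policies discussed above.
from dataclasses import dataclass, field
from typing import List

@dataclass
class SimulationEntry:
    name: str                     # model or simulation identifier
    purpose: str                  # the question it supports (design, DT, OT, ...)
    owner: str                    # developing/maintaining organization
    vva_status: str               # verification/validation/accreditation state
    accreditation_authority: str  # who must accredit it for the intended use
    temp_reference: str           # where the TEMP cites this use
    validation_data: List[str] = field(default_factory=list)  # live data needed

ssp_inventory = [
    SimulationEntry(
        name="Engagement-level effectiveness model",
        purpose="Extend live-fire results to untested threat presentations",
        owner="Program office simulation support group",
        vva_status="validated against DT data; accreditation pending",
        accreditation_authority="Operational Test Agency",
        temp_reference="TEMP Part V, simulators/models/test-beds",
        validation_data=["tracking-rate measurements", "miss-distance data"],
    ),
]

# Simple planning check: flag entries not yet accredited for their intended use.
for entry in ssp_inventory:
    if not entry.vva_status.lower().startswith("accredited"):
        print(f"VV&A action required before use: {entry.name} "
              f"(current status: {entry.vva_status})")
```

Whatever form it takes, the record serves the same purpose as the SSP itself: each model's purpose, owner, VV&A status, and tie to the TEMP is written down early enough to act on.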
14.9 SUMMARY

M&S in T&E can be used for concept evaluation, extrapolation, isolation of design effects, efficiency, representation of complex environments, and overcoming inherent limitations in actual testing. The use of M&S can validate test results, increase confidence levels, reduce test costs, and provide opportunities to shorten the overall acquisition cycle by providing more data earlier to the decision maker. It takes appropriate time and funding to bring models and simulations along to the point that they are useful during an acquisition.
ENDNOTES

1. DoDI 5000.2, Operation of the Defense Acquisition System, May 12, 2003.
2. Watt, Charles K. The Role of Test Beds in C3I, Armed Forces Communications and Electronics Association (AFCEA) 35th Annual Convention, June 15-17, 1982.
3. DoD Directive (DoDD) 5000.59, DoD Modeling and Simulation Management, January 20, 1998, and DoDI 5000.61, DoD Modeling and Simulation (M&S) Verification, Validation and Accreditation (VV&A), May 13, 2003.
15
TEST RESOURCES

15.1 INTRODUCTION

This chapter describes the various types of resources available for testing, explains test resource planning in the Services, and discusses the ways in which test resources are funded. According to the Defense Acquisition Guidebook, the term "test resources" is a collective term that encompasses elements necessary to plan, conduct, collect, and analyze data from a test event or program.1 These elements include: funding (to develop new resources or use existing ones), manpower for test conduct and support, test articles, models, simulations, threat simulators, surrogates, replicas, test-beds, special instrumentation, test sites, targets, tracking and data acquisition instrumentation, equipment (for data reduction, communications, meteorology, utilities, photography, calibration, security, recovery, maintenance, and repair), frequency management and control, and base/facility support services. "Test planning and conduct shall take full advantage of existing investment in DoD [Department of Defense] ranges, facilities, and other resources, wherever practical, unless otherwise justified in the Test and Evaluation Master Plan [TEMP]."

Key DoD test resources are in great demand by competing acquisition programs. Often special, unique, or one-of-a-kind test resources must be developed specifically for the test program. It is imperative that the requirements for these test resources be identified early in the acquisition process so adequate funding can be allotted for their development and they will be available when the test is scheduled.

15.2 OBTAINING TEST RESOURCES

15.2.1 Identify Test Resources and Instrumentation

As early as possible, but not later than program start, the test facilities and instrumentation requirements for conducting program Test and Evaluation (T&E) should be identified and a tentative schedule of test activities prepared. This information is recorded in the TEMP and Service test resource documentation.

15.2.2 Require Multi-Service OT&E

Multi-Service Operational Test and Evaluation (MOT&E) should be considered for weapon systems requiring new operational concepts involving other Services. If multi-Service testing is used, an analysis of the impact of the demonstration on the time and resources needed to execute the multi-Service tests should be conducted before the low rate production decision.

15.2.3 Military Construction Program Facilities

Some programs cannot be tested without Military Construction Program (MCP) facilities. Constructing these facilities requires long lead times; therefore, early planning must be done to ensure that the facilities will be ready when required.

15.2.4 Test Sample Size

The test-sample size is usually based on one or more of the following (a sizing sketch follows the list):

• Analysis of test objectives;
• Statistical significance of test results at some specified confidence level;
• Availability of test vehicles, items, etc.;
• Support resources or facilities available;
• Time available for the test program.
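As a concrete illustration of the second bullet (statistical significance at a specified confidence level), the sketch below computes the classic zero-failure demonstration size: the number of independent trials that must all succeed to demonstrate a stated reliability at a stated confidence. The reliability and confidence values are hypothetical planning inputs, and real test programs normally involve considerably more elaborate statistical design.

```python
# Minimal sketch: zero-failure (success-run) test sizing.
# If all n trials succeed, reliability R is demonstrated at confidence C when
# R**n <= 1 - C, i.e., n >= ln(1 - C) / ln(R).
# The R and C values below are hypothetical planning inputs.
import math

def zero_failure_sample_size(reliability: float, confidence: float) -> int:
    """Smallest n such that n consecutive successes demonstrate
    the stated reliability at the stated confidence level."""
    return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

for r, c in [(0.90, 0.80), (0.90, 0.90), (0.95, 0.90)]:
    n = zero_failure_sample_size(r, c)
    print(f"Demonstrate R={r:.2f} at {c:.0%} confidence: {n} failure-free trials")
```

Even this simple relationship shows why sample size interacts with the other bullets: raising the demonstrated reliability or the confidence level quickly drives up the number of test articles, range time, and funding required.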
15.2.5 Test Termination

Testers should not hesitate to terminate a test before its completion if it becomes clear that the main objective of the test has already been achieved, is unachievable (due to hardware failure, unavailability of resources, etc.), or if additional samples will not change the outcome and conclusions of the test.

15.2.6 Budget for Test

The Acquisition Strategy, TEMP, and budgeting documents should be reviewed regularly to ensure that identified testing funds are adequate relative to development and fabrication funds. These documents also need careful scrutiny to ensure that there are adequate contingency funds to cover correction of difficulties at a level that matches industry/government experience on the contract. (Correcting deficiencies found during testing without sufficient funding for proper correction results in Band-Aid® approaches, which require corrections at a later and more expensive time.)

15.2.7 Test Articles

A summary of important test planning items identified by the Defense Science Board (DSB) is provided below:

• Ensure that the whole system, including the system user personnel, is tested. Realistically test the complete system, including hardware, software, people, and all interfaces. Get the user involved from the start and understand user limitations;
• Ascertain that sufficient time and test articles are planned. When the technology is stressed, the higher risks require more test articles and time;
• In general, parts, subsystems, and systems should be proven in that order before incorporating them into the next higher assembly for more complete tests. The instrumentation should be planned to permit diagnosis of trouble;
• Major tests should never be repeated without an analysis of failure and corrective action. Allow for delays of this nature.

15.2.8 Major Range and Test Facility Base (MRTFB)

The National Defense Authorization Act (NDAA) of Fiscal Year (FY) 2003 directed that DoD establish a Test Resource Management Center (DTRMC), with the director (three-star or senior civilian) reporting directly to the Under Secretary of Defense for Acquisition, Technology, and Logistics (USD(AT&L)). The DTRMC provides oversight for T&E strategic planning and budgets for the Department's T&E activities, including the MRTFB, the Central Test and Evaluation Investment Program (CTEIP), and the T&E Science and Technology (S&T) Program.

All Services operate ranges and test facilities for test, evaluation, and training purposes. These activities constitute the DoD Major Range and Test Facility Base.2
This MRTFB is described as "a national asset which shall be sized, operated, and maintained primarily for DoD T&E support missions, but also is available to all users having a valid requirement for its capabilities. The MRTFB consists of a broad base of T&E activities managed and operated under uniform guidelines to provide T&E support to DoD Components responsible for developing or operating materiel and weapon systems."3 The list of MRTFB activities and their locations is shown in Figure 15-1. Summaries of the capabilities of each of these activities (with points of contact listed for further information) may be found in Major Range and Test Facility Base Summary of Capabilities.4

[Figure 15-1. DoD Major Range and Test Facility Base (source: DoD 3200.11-D). The figure maps the MRTFB activities: Utah Test and Training Range; Dugway Proving Ground, Utah; Nevada Test and Training Range; Pacific Missile Range Facility (PMRF), Hawaii; Naval Air Warfare Center Weapons Division, China Lake, California; 30th Space Wing, California; Naval Air Warfare Center Weapons Division, Point Mugu, California; Air Force Flight Test Center, California; U.S. Army Kwajalein Atoll; Yuma Proving Ground, Arizona; Electronic Proving Ground, Fort Huachuca, Arizona; Joint Interoperability Test Command, Arizona; White Sands Missile Range, New Mexico; Air Force Development Test Center, Florida; 45th Space Wing, Florida; Atlantic Undersea Test and Evaluation Center; Aberdeen Proving Ground, Maryland; Naval Air Warfare Center Aircraft Division, Patuxent River, Maryland; Arnold Engineering Development Center, Tennessee.]

The MRTFB facilities are available for use by all the Services, other U.S. government agencies and, in certain cases, allied foreign governments and contractor organizations. Scheduling is based on a priority system, and costs for usage are billed uniformly.5 The Director, Test Resource Management Center, sets policy for the composition, use, and test program assignments of the MRTFB. In turn, the individual Services must fund, manage, and operate their activities. They are reimbursed for direct costs by each user of the activity. The DTRMC sponsors a Joint Test Assets Database, which lists MRTFB and Operational Test Agency (OTA) test facilities, test area and range data, instrumentation, and test systems.

DoD Components wishing to use an MRTFB activity must provide timely and complete notification of their requirements, such as special instrumentation or ground-support equipment requirements, to the particular activity, using the documentation formats prescribed by the Universal Documentation System Handbook.6 The requirements must be stated in the TEMP, discussed below. Personnel at the MRTFB activity will coordinate with and assist prospective users with their T&E planning, to include conducting trade-off analyses and test scenario optimization based on test objectives and test support capabilities.

15.2.9 Central Test and Evaluation Investment Program (CTEIP)
In 1994 the Principal Deputy Under Secretary of Defense for Acquisition and Technology (PDUSD(A&T)) chartered the Director, Test, Systems Engineering, and Evaluation (DTSE&E) to establish and chair a steering group that would oversee the acquisition and integration of all training and associated test range instrumentation and develop related policy. As a result of the reorganization of the Office of the Secretary of Defense (OSD) T&E functions, the Director, Operational Test and Evaluation (DOT&E), and subsequently the DTRMC, manages the implementation of the Joint Training and Test Range Roadmap (JTTR) and executes the CTEIP. The CTEIP provides OSD funding and a mechanism for the development and acquisition of new test capabilities to satisfy multi-Service testing requirements.

15.2.10 Service Test Facilities

There are other test resources available besides the MRTFB. Frequently, National Guard or Reserve units, commercial or international test facilities, and war reserve assets are available to support DoD T&E. Testers can determine the resources available by contacting their Service headquarters staff element. Within the Army, documents to consult include: the Army Test and Evaluation Command (ATEC) database on existing Army major test facilities, major instrumentation, and test equipment; the Project Manager for Instrumentation, Targets, and Threat Simulators (PM ITTS) Targets Information Manual, a descriptive catalogue of Army targets and foreign ground assets available (or in development) to support T&E and training; and the PM ITTS Threat Inventory Database of available assets for both hardware simulators and software simulation systems. Information on specific Navy test resources is found in user manuals published by each range and in the Commander, Operational Test and Evaluation Force (COMOPTEVFOR) catalog of available support.

15.3 TEST RESOURCE PLANNING

The development of special test resources to support a weapon system test can be costly and time-consuming. This, coupled with the competition for existing test resources and facilities, requires that early planning be accomplished to determine all test resource requirements for weapon system T&E. The tester must use government facilities whenever possible instead of funding construction of contractor test capabilities. Problems associated with range and facility planning are that major systems (e.g., the B-1B and M-1) tend to get top priority; range schedules are often in conflict because system problems cause schedule delays during testing; and there is often a shortage of funds to complete testing.

15.3.1 TEMP Resource Requirements

The Program Manager (PM) must state all key test resource requirements in the TEMP, including items such as unique instrumentation, threat simulators, surrogates, targets, and test articles.
Included in the TEMP are a critical analysis of anticipated resource shortfalls, their effect on system T&E, and plans to correct resource deficiencies. Because the first TEMP must be prepared for program initiation, initial test resource planning must be accomplished very early. Refinements and reassessments of test resource requirements are included in each TEMP update. The guidance for the content of the test resource summary (Part V) of the TEMP is in Appendix 2, Test and Evaluation Master Plan, of the Defense Acquisition Guidebook (see Table 15-1). Once the requirements are identified, the PM must work within the Service headquarters and range management structure to assure the assets are available when needed.

15.3.2 Service Test Resource Planning

More detailed listings of required test resources are generated in conjunction with the detailed test plans written by the materiel developer and operational tester. These test plans describe test objectives, Measures of Effectiveness (MOEs), test scenarios, and specific test resource requirements.
Table 15-1. TEMP Test Resource Summary Section

PART V – TEST AND EVALUATION RESOURCE SUMMARY

Provide a summary (preferably in a table or matrix format) of all key test and evaluation resources, both government and contractor, that will be used during the course of the acquisition program.

The TEMP should project the key resources necessary to accomplish developmental and validation testing and operational test and evaluation. The TEMP should estimate, to the degree known at Milestone B, the key resources necessary to accomplish developmental test and evaluation, live fire test and evaluation, and all operational test and evaluation. These should include elements of the national test facilities base (which incorporates the Major Range and Test Facility Base (MRTFB), capabilities designated by industry and academia, and MRTFB test equipment and facilities), unique instrumentation, threat simulators, and targets. As system acquisition progresses, the preliminary test resource requirements shall be reassessed and refined, and subsequent TEMP updates shall reflect any changed system concepts, resource requirements, or updated threat assessment. Any resource shortfalls that introduce significant test limitations should be discussed, with planned corrective action outlined. Specifically, identify the following test resources:

• Test articles
• Test sites and instrumentation
• Test support equipment
• Threat representation
• Test targets and expendables
• Operational force test support
• Simulators, models, and test-beds
• Special requirements
• Test and evaluation funding requirements
• Manpower/personnel training

Source: Defense Acquisition Guidebook, September 2004.

15.3.2.1 Army Test Resource Planning

In the Army, the independent evaluator, with the assistance of the developmental and operational testers, prepares input to the TEMP and develops the System Evaluation Plan (SEP), the primary planning documents for integrated T&E of the system. These documents should be prepared early in the acquisition cycle (at the beginning of system acquisition activities). They describe the entire T&E strategy, including critical issues, test methodology, MOEs, and all significant test resources. The TEMP and SEP provide the primary input to the Outline Test Plan (OTP), which contains a detailed description of each identified required test resource, where and when it is to be provided, and the providing organization.

The tester must coordinate the OTP with all major commands or agencies expected to provide test resources. Then the OTP is submitted to ATEC for review by the Test Schedule and Review Committee (TSARC) and for incorporation into the Army's Five-Year Test Program (FYTP). The initial OTP for each test should be submitted to the TSARC as soon as testing is identified in the TEMP.
Revised OTPs are submitted as more information becomes available or requirements change, but a final comprehensive version of the OTP should be submitted at least 18 months before the resources are required.

The TSARC is responsible for providing high-level, centralized management of T&E resource planning. The TSARC is chaired by the Commanding General, ATEC, and consists of general officer or equivalent representatives from the Army staff and major commands. The TSARC meets semiannually to review all OTPs, resolve conflicts, and coordinate all identified test resource requirements for inclusion in the FYTP. The FYTP is a formal resource tasking document for current and near-term tests and a planning document for tests scheduled for the out-years. All OTPs are reviewed during the semiannual reviews to ensure that any refinements or revisions are approved by the TSARC and reflected in the FYTP.

The TSARC-approved OTP is a tasking document by which the tester requests Army test resources. The TSARC coordinates resource requests, sets priorities, resolves conflicts, and schedules resources. The resultant FYTP, when approved by the Headquarters, Department of the Army (HQDA) Deputy Chief of Staff, G-3 (DCS G-3), is a formal tasking document that reflects the agreements made by the resource providers (Army Materiel Command (AMC), Training and Doctrine Command (TRADOC), Forces Command (FORSCOM), etc.) to make the required test resources available to the designated tests. If test resources from another Service, a non-DoD governmental agency (such as the Department of Energy (DOE) or the National Aeronautics and Space Administration (NASA)), or a contractor are required, the request is coordinated by ATEC. For example, a request for a range must be made at least 2 years in advance to ensure availability. However, due to the long lead time required to schedule these non-Army resources, their availability cannot be guaranteed if testing is delayed or retesting is required. The use of resources outside the United States, such as in Canada, Germany, or other North Atlantic Treaty Organization (NATO) countries, is also handled by ATEC.

15.3.2.2 Navy Test Resource Planning

In the Navy, the developing agency and the operational tester are responsible for identifying the specific test resources required to test the weapon system. In developing requirements for test resources, the PM and Operational Test Director (OTD) refer to documents such as the Mission Need Statement (MNS), Acquisition Strategy, Navy Decision Coordinating Paper (NDCP), Operational Requirements Document (ORD), threat assessments, Secretary of the Navy Instruction (SECNAVINST) 5000.2B, and the Operational Test Director's Guide (Commander, Operational Test and Evaluation Force (COMOPTEVFOR) Instruction 3960.1D). Upon Chief of Naval Operations (CNO) approval, the TEMP becomes the controlling management document for all T&E of the weapon system.
It constitutes direction by the CNO to conduct the T&E program defined in the TEMP, including the commitment of Research, Development, Test and Evaluation (RDT&E) financial support and of fleet units and schedules. It is prepared by the PM, who is provided OT&E input by the COMOPTEVFOR OTD. The TEMP defines all T&E (DT&E, OT&E, and Production Acceptance Test and Evaluation (PAT&E)) to be conducted for the system and describes, in as much detail as possible, the test resources required.

The Navy uses its operational naval forces to provide realistic T&E of new weapon systems. Each year, the CNO (N-091) compiles all Fleet support requirements for RDT&E program support from the TEMPs and publishes the CNO Long-Range RDT&E Support Requirements document for the budget and out-years. In addition, a quarterly forecast of support requirements is published approximately 5 months before the Fleet Employment Scheduling Conference for the quarter in which the support is required.
These documents summarize OT&E requirements for Fleet services and are used by the Fleet for scheduling services and for out-year budget projections.

Requests for use of range assets are usually initiated informally, with a phone call from the PM and/or OTD to the range manager, followed by formal documentation. Requests for Fleet support are usually more formal. The COMOPTEVFOR, in coordination with the PM, forwards the TEMP and a Fleet RDT&E Support Request to the CNO. Upon approval of the request, the CNO tasks the Fleet Commander by letter or message to coordinate with the Operational Test and Evaluation Force (OPTEVFOR) to provide the requested support.

Use of most Navy ranges must be scheduled at least a year in advance. Each range consolidates and prioritizes user requests, negotiates conflicts, and attempts to schedule range services to satisfy all requests. If the desired range services cannot be made available when required, the test must wait or the CNO resolves the conflict. Because ranges are fully scheduled in advance, it is difficult to accommodate a test that is delayed or requires additional range time beyond that originally scheduled. Again, the CNO can examine the effects of delays or retest requirements and issue revised priorities, as required.

Requests for use of non-Navy OT&E resources are initiated by COMOPTEVFOR. The OPTEVFOR is authorized direct liaison with the other Services' independent OTAs to obtain OTA-controlled resources. Requests for other government-owned resources are forwarded to the CNO (N-091) for formal submission to the Service Chief (for Service assets) or to the appropriate government agency (e.g., DOE or NASA).
Use of contractor resources is usually handled by the PM, although contractor assets are seldom required in OT&E, since the Fleet is used to provide an operational environment. Requests for use of foreign ranges are handled by the N-091 Assistant for International Research and Development (IR&D).

15.3.2.3 Air Force Test Resource Planning

The test resources required for OT&E of an Air Force weapon system are identified in detail in the Test Resources Plan (TRP), which is prepared by the responsible Air Force OT&E organization. In general, the Air Force Operational Test and Evaluation Center (AFOTEC) is the test organization for OT&E programs; for non-major programs it obtains support from a Service major command test agency, with AFOTEC directing and providing assistance, as required.

During the Advanced Planning Phase of a weapon system acquisition (5 to 6 years before OT&E), AFOTEC prepares the OT&E section of the first full TRP, coordinates the TRP with all supporting organizations, and assists the Resource Manager (RM) in programming required resources. The resource requirements listed in the Resource Information Network TRP are developed by the test manager, RM, and test support group, using sources such as the ORD and threat assessments. The TRP should specify, in detail, all the resources necessary to successfully conduct a test once it is entered in the Test Resource Information Management System (TRIMS).

The TRP is the formal means by which test resource requirements are communicated to the Air Staff and to the appropriate commands and agencies tasked to supply the needed resources. Hence, if a required resource is not specified in the TRP, it is likely the resource will not be available for the test. The TRP is revised and updated on a continuous basis, since the test resource requirements become better defined as the OT&E plans mature. The initial TRP serves as a baseline for comparison of planned OT&E resources with actual expenditures. Comparisons of the initial TRP with subsequent updates provide an audit trail of changes in the test program and its testing requirements. AFOTEC maintains all TRPs in TRIMS; this permits immediate response to all queries regarding test resource requirements.
The AFOTEC/RM consolidates the resource requirements from all TRPs, coordinating with participating and supporting organizations and agencies outside AFOTEC. Twice yearly, the RM office prepares a draft of the USAF Program for Operational Test (PO). The PO is a master planning and programming document for resource requirements for all HQ USAF-directed OT&E and is distributed to all concerned commands, agencies, and organizations for review and coordination. It is then submitted to the Air Staff for review.

All requests for test resources are coordinated by HQ AFOTEC as part of the TRP preparation process. When a new weapon system development is first identified, AFOTEC provides a Test Manager (TM) who begins long-term OT&E planning. The TM begins identifying needed test resources, such as instrumentation, simulators, and models, and works with the resources directorate to obtain them. If the required resource does not belong to AFOTEC, it will negotiate with the commands having the resource. In the case of models and simulators, AFOTEC surveys what is available, assesses credibility, and then coordinates with the owner or developer to use it. The Joint Technical Coordinating Group (JTCG) publishes a document on Electronic Warfare (EW) models.

Range scheduling should be done early. At least a year is required, but often a test can be accommodated with a few months' notice if there is no requirement for special equipment or modifications to be provided at the range. Some of the Air Force ranges are scheduled well in advance and cannot accommodate tests that encounter delays or retest requirements.

The RM attempts to resolve conflicts among the various systems competing for scarce test resources and elevates the request to the Commander, AFOTEC, if necessary. Decisions on resource utilization and scheduling are based on the weapon system's assigned priority.

The RM and the TM also arrange for the use of the resources of other Services, non-DoD government agencies, and contractors. Use of non-U.S. resources, such as a Canadian range, is coordinated by the Air Force Chief of Staff/Directorate of Test and Evaluation (AF/TE) and based on formal Memoranda of Understanding (MOUs). The U.S. Air Forces in Europe/Directorate of Operations (USAFE/DOQ) handles requests for European ranges. Use of a contractor-owned resource, such as a model, is often obtained through the System Program Office (SPO) or a general support contract.

15.4 TEST RESOURCE FUNDING

The Future Years Defense Program (FYDP), incorporating a biennial budgeting process, is the basic DoD programming document that records, summarizes, and displays Secretary of Defense (SECDEF) decisions. In the FYDP, costs are divided into three categories for each acquisition Program Element (PE): R&D costs, investment costs, and operating costs.
The Congress appropriates funds to the Office of Management and Budget (OMB), and OMB apportions funding through the SECDEF to the Services and to other defense agencies. The Services and defense agencies then allocate funds to others (claimants, sub-claimants, administering offices, commanding generals, etc.). (See DoD 7000.14-R, Financial Management Regulation, Vol. 2A.)

The Planning, Programming, Budgeting and Execution (PPBE) system is a DoD internal system used to develop input to the Congress for each year's budget while developing future-year budgets. The PPBE is calendar oriented, with concurrent 2-year cycles (planning, programming, budgeting, and execution) ongoing at any one time. At any one time there are three budgets being worked by the Services: the current 2-year budget is being executed, the next 6 years of defense planning are being programmed, and long-range program plans and planning guidance are being reviewed for updating.
There are various types of funding in the PPBE: R&D funding for maintaining the technology base; exploratory development funding for conducting concept assessments; advanced development funding for conducting both concept development and early prototyping; engineering development funding for demonstrating the Engineering Development Model (EDM); and procurement funding for conducting LRIP, Full Rate Production (FRP), system deployment, and operational support. RDT&E management and support funding is used throughout the development cycle until the system is operationally deployed, when Operations and Maintenance (O&M) funding is used. The RDT&E appropriation funds the costs associated with R&D intended to improve performance, including test items, DT&E, and test support of OT&E of the system or equipment test items.

Funding that is planned, programmed, and budgeted through the PPBE cycle is not always the amount that the Congress appropriates or the PM receives. If the required funding for a test program is not authorized by the Congress, the PM has four ways to react: submit a supplemental budget (for unfunded portions of the program); request deficiency funding (for unforeseen program problems); use transfer authority (from other programs within the Service); or try to reprogram the needed funds (to restructure the program).

Generally, testing that is accomplished for a specific system before the production decision is funded from RDT&E appropriations, and testing that is accomplished after the production decision is funded from other procurement or O&M appropriations. Testing of Product Improvements (PIs), block upgrades, and major modifications is funded from the same appropriations as the program development. Follow-on Test and Evaluations (FOT&Es) are normally funded from O&M funds.

Funding associated with T&E (including instrumentation, targets, and simulations) is identified in the system acquisition cost estimates, Service acquisition plans, and the TEMP. General funding information for development and operational tests follows:

Development Test (DT) Funding. Funds required for the conduct of engineering and development tests are programmed and budgeted by the materiel developer, based upon the requirements of the TEMP. These costs may include, but are not limited to, procuring test samples/prototypes; support equipment; transportation costs; technical data; training of test personnel; repair parts; and test-specific instrumentation, equipment, and facilities. The DT&E funds are expended for contractor and government developmental test activities.

The Service PM may be required to pay for the use of test resources, such as the MRTFB, and for the development of specialized resources needed specifically for testing the weapon system being developed.

Operational Test (OT) Funding. Funds required to conduct OT are usually programmed and budgeted by the Service PM organization. The funds are programmed in the Service's long-range test program, and the funding requirements are obtained from the test resourcing documentation and TEMP.
The Air Force funds OT&E separately from the program office, through a dedicated PE for AFOTEC-conducted operational testing.

15.4.1 Army Funding

Test resources are developed and funded under various Army appropriations. The AMC and its commodity commands provide test items, spare parts, support items (such as diagnostic equipment), and ammunition. Soldiers, ranges, fuel, test support personnel, and maneuver areas are provided by TRADOC or FORSCOM.
Weapon system PMs use RDT&E funds to reimburse these supporting commands for costs directly related to their tests. Weapon system materiel developers are also responsible for funding the development of new test resources specifically needed to test the weapon system. Examples of such special-purpose resources include models, simulations, special instrumentation and test equipment, range modifications, EW simulators and, sometimes, threat simulators. Although the Army has a separate budget and development plan for threat simulators (the PM ITTS threat simulators program), many weapon system developers still have to fund the cost of new threat systems that are specifically needed to test their weapon system. Funding for Army operational testing is through the PM's Program Element (PE) and is given to ATEC for direct control of funds for each program. Funding requirements are developed in consonance with the Outline Test Plan (OTP).

15.4.2 Navy Funding

In the Navy, the weapon system PM is responsible for funding the development of all required test-specific resources from the program's RDT&E funds. These resources include test articles, expendables, one-of-a-kind targets, data collection/reduction, and instrumentation. The development of generic test resources that can be used in OT&E of multiple weapon systems, such as targets, threat simulators, and range capabilities, is funded from Office of the Chief of Naval Operations (OPNAV) generic accounts (such as target development) and not from weapon system RDT&E. The PM's RDT&E funds pay for all DT and OT through OPEVAL. The PM pays for all post-production OT with program funds.

15.4.3 Air Force Funding

In the Air Force, direct-cost funding requires that test-peculiar (direct) costs associated with a particular test program be reimbursed by the SPO to the designated test agency. The RDT&E appropriation funds the costs associated with R&D, including test items, DT&E, and Air Force Materiel Command (AFMC) support of OT&E of the system or equipment and the test items. Costs associated with Initial Operational Test and Evaluation (IOT&E) are RDT&E-funded, and costs of Qualification Operational Test and Evaluation (QOT&E) are O&M-funded. AFOTEC is funded through its own PE and has direct control of OT&E funds for all programs. The IOT&E manager prepares a TRP that summarizes the resource requirements for IOT&E and related test support. All pre-test IOT&E planning is budgeted through and paid out of the O&M appropriation. The FOT&E costs are paid by AFOTEC and/or the MAJCOM operating the system and funded by the O&M appropriation.

15.5 SUMMARY

Test resources are subject to many conflicting demands, and their use must be scheduled well in advance of a test. Resources specific to a particular test must often be developed and funded from the PM's own RDT&E budget.
Thus, the PM and testers must ensure that test resource requirements are identified early in the acquisition cycle, that they are documented in the initial TEMP, and that modifications and refinements are reported in the TEMP updates.

Funds for testing are provided by congressional appropriation to the OMB, which apportions the funds to the Services through the SECDEF. The PPBE is the DoD process used to formulate budget requests to the Congress. Requests by PMs for test resources are usually outlined in the TEMP. Generally, system development is funded from RDT&E funds until the system is operationally deployed and maintained. O&M funds are used for FOT&E and system maintenance. The weapon system materiel developer is also responsible for funding the development of new test resources specifically needed to test the weapon system. The Air Force OTA develops and directly controls OT&E funds.
ENDNOTES

1. Defense Acquisition Guidebook, http://akss.dau.mil/DAG/.
2. Department of Defense Directive (DoDD) 3200.11, DoD Major Range and Test Facility Base, May 1, 2002.
3. DoD 5000.3-M-1, Test and Evaluation Master Plan (TEMP) Guidelines, October 1986. Cancelled.
4. DoD 3200.11-D, Major Range and Test Facility Base Summary of Capabilities, June 1983.
5. Ibid.
6. Range Commanders Council (RCC), Universal Documentation System Handbook, Document 501-84, Vol. 3, August 1989.
16
TEST AND EVALUATION MASTER PLAN

16.1 INTRODUCTION

Guidance contained in Operation of the Defense Acquisition System and the Defense Acquisition Guidebook stipulates that a Test and Evaluation Master Plan (TEMP) format shall be used for all Acquisition Category (ACAT) I and IA or Office of the Secretary of Defense (OSD)-designated oversight acquisition programs.1 This reinforces the philosophy that good planning supports good operations. For effective engineering development and decision-making processes, an early Test and Evaluation (T&E) strategy in the Technology Development Strategy (TDS) must be evolved into an overall integrating master plan detailing the collection and evaluation of test data on required performance parameters. Programs below ACAT I are encouraged to tailor their T&E strategy using the TEMP format as a guide.

The TEMP relates program schedule, test management strategy and structure, and required resources to: Critical Operational Issues (COIs); Critical Technical Parameters (CTPs); Minimum Acceptable Values (MAVs) (thresholds); the acquisition strategy; and milestone decision points. Feedback about the degree of system performance maturity and its operational effectiveness and suitability during each phase is essential to the successful fielding of equipment that satisfies user requirements.

16.2 TEMP DEVELOPMENT

The development of the system T&E strategy begins during the Technology Development (TD) effort with the test planning in the TDS document. This evolves as the system is better defined, and T&E events are consolidated in the TEMP. Effective management of the various test processes is a primary function of a Program Management Office (PMO) T&E Working-level Integrated Product Team (WIPT). The T&E strategy is highly contingent on the early system concept(s) deemed appropriate for satisfying user requirements. As outlined in The Defense Acquisition System,2 the priority for selecting a solution is:

(1) A non-materiel solution, such as changes to Doctrine, Organization, Training, Leadership and education, Personnel, and Facilities (DOT_LPF);

(2) A Materiel (M) alternative, chosen according to the following sequence:

(a) The procurement or modification of commercially available products, services, and technologies from domestic or international systems, or the development of dual-use technologies.
(b) The additional production or modification of previously developed U.S. and/or allied military systems or equipment.
(c) A cooperative development program with one or more allied nations.
(d) A new, joint, Department of Defense (DoD) Component or government agency development program.
(e) A new DoD Component-unique development program.
The quality of the test program may directly reflect the level of effort expended in its development and execution. This varies in direct relationship to the management imposed by the Program Manager (PM) and, to some extent, by the system engineer. The PM must evaluate the utility of a dedicated T&E staff versus matrix support from the development command. The levels of intensity for planning and executing T&E fluctuate with changes in phases of the acquisition process and in T&E staff support, as appropriate. Early planning of long-range strategies can be supported with knowledgeable planning teams (T&E Integrated Product Teams (IPTs)) and reviews by panels of senior T&E management officials ("gray beards"). As the tempo of actual test activities builds, from concept to prototype to Engineering Development Model (EDM) to pre-Low Rate Initial Production (LRIP), internal T&E management staff are needed to control the processes and evaluate results.

16.2.1 Program Management Office Responsibilities

The PMO is the focal point of the development, review, and approval process for the program TEMP. The DoD acquisition process requires a TEMP as one of the primary management strategy documents supporting the decision to start or terminate development efforts. This task is a "difficult do" prior to program start, since some Services do not formulate or staff a PMO until formal program initiation. An additional complicating factor is the nebulous condition of other program source documents (Capability Development Document (CDD), Technical Management Plan (TMP), Acquisition Strategy, System Threat Assessment (STA), Logistics Support Plan (LSP), etc.) that are also in early stages of development or updating for the milestone review. Since the TEMP must conform to the acquisition strategy and other program management documents, it frequently lags in the development process and does not receive the attention needed from PMO or matrix support personnel. PMO emphasis on early formulation of the test planning teams (T&E WIPT) is critical to the successful development of the program TEMP. These teams should consist of the requisite players so that a comprehensive and integrated strategy, compatible with other engineering and decision-making processes, is developed. The PMO will find that the number of parties desiring coordination on the TEMP far exceeds the "streamlined" approval process signatories; however, it must be coordinated. An early start in getting Service-level concurrence is important so the milestone decision document submission schedule can be supported with the draft and final versions of the TEMP.
Subsequent updates do not become easier, as each acquisition phase brings new planning, coordination, and testing requirements.

16.2.2 T&E Planning

Developing an overall strategy provides the framework for incorporating phase-oriented T&E activities that will facilitate the acquisition process. The T&E strategy should be consistent with the program acquisition strategy, identifying requirements for contractor and government Development Test and Evaluation (DT&E), interactions between DT&E and Operational Test and Evaluation (OT&E), and provisions for the separate Initial Operational Test and Evaluation (IOT&E).

An evolutionary acquisition strategy will generally include an incremental or spiral approach with moderate- to low-risk technologies, which should reduce the intensity and duration of the T&E program. It does, however, include a requirement for post-production test activities as the system is modified to accommodate previously unknown new technologies, new threats, or other performance enhancements.
A revolutionary acquisition strategy (single step) incorporates all the latest technologies in the final production configuration and is generally a higher-risk approach. As the contractor works on maturing emerging technologies, the T&E workload increases in direct proportion to the technology risk and the difficulty of fixing problems. There is a much higher potential for extended schedules with iterative test-fix-test cycles.

16.2.3 General T&E Planning Issues

The Defense Science Board (DSB) report presented guidance on T&E at two levels.3 On a general level it discussed a number of issues that were appropriate to all weapon acquisition programs. These issues, along with a summary discussion, are given below.

16.2.3.1 Effects of Test Requirements on System Acquisition

The acquisition strategy for the system should allow sufficient time between the end of demonstration testing and procurement, as contrasted with limited production decisions, to allow flexibility for the modification of plans that will be required. It should ensure that sufficient dollars are available not only to conduct T&E, but also to allow for the additional T&E that is always required due to failures, design changes, etc. It should be evaluated relative to constraints imposed by:

• The level of system testing at various stages of the Research, Development, Test, and Evaluation (RDT&E) cycle;
• The number of test items available and the schedule interface with other systems needed in the tests, such as aircraft, electronics, etc.;
• The support required to assist in preparing for and conducting tests and analyzing the test results;
• The need to minimize the so-called T&E "gap" caused by lack of hardware during the test phase.

16.2.3.2 Test Requirements and Restrictions

Tests should:

• Have specific objectives;
• List, in advance, actions to be taken as a consequence of the test results;
• Be instrumented to permit diagnosis of the cause of any lack of performance, including random, design-induced, wear-out, and operator-error failures;
• Not be repeated if failures occur without first conducting a detailed analysis of the failure.

16.2.3.3 Trouble Indicators

Establish an early detection scheme to identify program illness. When a program begins to have trouble, there are indicators that will show up during testing. Some of these indicators are:

• A test failure;
• Any repetitive failure;
• A revision of schedule or incremental funding that exceeds the original plan;
• Any relaxation of the basic requirements, such as lower performance.

16.2.3.4 Requirement for Test Rehearsals

Test rehearsals should be conducted for each new phase of testing.

16.2.4 Scheduling

Specific issues associated with test scheduling are listed below.
16.2.4.1 Building Block Test Scheduling

A set of tests designed to demonstrate feasibility prior to testing the system-level EDM should be used. This allows early testing of high-technical-risk items, and subsequent tests can be incorporated into the hardware once the system concept has been demonstrated to be feasible.

16.2.4.2 Component and Subsystem Test Plans

Ensure a viable component and subsystem test plan. Studies show that almost all component failures will be of the kind that cannot be easily detected or prevented in full-system testing. System failure must be detected and fixed at the component/subsystem stage, as detecting and correcting failure only at the operational test level results in high cost.

16.2.4.3 Phasing of DT&E and IOT&E

Problems that become apparent in operational testing can often be evaluated faster with the instrumented DT&E hardware. The Integrated Test Plan (ITP) should provide time and money to investigate test failures and eliminate causes of failures before other, similar tests take place.

16.2.4.4 Schedule IOT&E to Include System Interfaces with Other Systems

Whenever possible, the IOT&E/FOT&E (Follow-on Operational Test and Evaluation) of a weapon system should be planned to include other systems that must have a technical interface with the new system. For example, missiles should be tested on most of the platforms for which they are programmed.

A Preplanned Product Improvements (P3I) or increment strategy is a variant of the evolutionary development process in which PMs and other acquisition managers recognize the high-risk technologies/subsystems and put them on parallel development tracks. The testing strategy should anticipate the requirements to evaluate P3I item technology maturity and then test the system during the integration of the additional capability.

Advanced Technology Demonstrations (ATDs) or Advanced Concept Technology Demonstrations (ACTDs) may provide early insights into available technologies for incorporation into developmental or mature, post-prototype systems. Using proven, mature technology provides a lower-risk strategy and may significantly reduce the development testing workload. To assess and manage risk, PMs and other acquisition managers should use a variety of techniques, including technology demonstrations, prototyping, and T&E. The process for verifying contract performance and item specifications, T&E of threshold performance requirements in the Operational Requirements Document (ORD), exit criteria, or Acquisition Program Baseline (APB) performance should be addressed in the DT&E strategy. The DT&E is an iterative process starting at the configuration item/software module level and continuing through component integration into subassemblies and, finally, system-level performance evaluations. OT&E is interwoven into early DT&E to maximize the efficient use of test articles and test schedules. However, OT&E must remain a distinct thread of activity that does not lose its identity in the tapestry of test events.
Planning for test resources is driven by the sequence and intensity of Development Test (DT) and Operational Test (OT) events. Resource coordination is an equally arduous task, which frequently involves lead times equal to major program development activities. Included in the program T&E strategy should be an overarching evaluation plan outlining the methodologies, models, simulations, and test data required at periodic decision points. The TEMP should: (a) address critical human issues to provide data to validate the results of human factors engineering analyses; and (b) require identification of mission critical operational and maintenance tasks.
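To make the lead-time point concrete, the following back-planning sketch computes the date by which each long-lead test resource must be requested. The resource names, lead times, and dates are hypothetical, not taken from this guide.

```python
from datetime import date, timedelta

# Hypothetical long-lead test resources and lead times in days; illustrative only.
test_event_start = date(2026, 10, 1)   # first test event that needs these resources
resource_lead_times = {
    "instrumented range time": 540,
    "threat-representative targets": 720,
    "operational test troop support": 365,
}

# Work backward from the test event: request date = event date minus lead time.
for resource, lead_days in sorted(resource_lead_times.items(), key=lambda kv: -kv[1]):
    request_by = test_event_start - timedelta(days=lead_days)
    print(f"{resource:32s} request by {request_by.isoformat()}")
```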


A reliability growth (Test, Analyze, Fix, and Test (TAFT)) program should be developed to satisfy the reliability levels required at Full Rate Production (FRP). Reliability tests and demonstrations will be based on actual or simulated operational conditions.4

Maintainability will be verified with a maintainability demonstration before FRP.5

As early as practicable, developers and test agencies will assess survivability and validate critical survivability characteristics at as high a system level as possible. The TEMP will identify the means by which the survivability objective will be validated.

Field engineering test facilities and testing in the intended operational environments are required to: (1) verify electric or electronic systems' predicted performance; (2) establish confidence in electromagnetic compatibility design based on standards and specifications; and (3) validate electromagnetic compatibility analysis methodology.

The TEMP will address health hazard and safety critical issues to provide data to validate the results of system safety analyses.

The TEMP strategy should directly support the development of the more detailed planning and resource documents needed to execute the actual test events and subsequent evaluations.

The TEMP shall provide a roadmap for the integrated simulation, test, and evaluation plans, schedules, and resource requirements necessary to accomplish the T&E program. T&E planning should address Measures of Effectiveness/Suitability (MOEs/MOSs) with appropriate quantitative criteria, test event or scenario descriptions, resource requirements, and test limitations. Test planning, at a minimum, must address all system components that are critical to the achievement and demonstration of contract technical performance specifications and the threshold values specified in the Capability Production Document (CPD).
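The reliability growth planning called for at the start of this discussion can be illustrated with a simple projection. The sketch below uses the Duane growth relationship, one commonly used reliability growth model; the model choice, growth slope, and test hours are assumptions for illustration, not requirements of this guide or of DoD 3235.1-H.

```python
# Illustrative reliability growth projection (Duane postulate):
#   cumulative MTBF(T) = MTBF_1 * (T / T_1) ** alpha
# The starting MTBF, test hours, and growth slope are hypothetical values.
def projected_mtbf(initial_mtbf, initial_hours, total_hours, growth_slope):
    cumulative = initial_mtbf * (total_hours / initial_hours) ** growth_slope
    instantaneous = cumulative / (1.0 - growth_slope)  # current MTBF at end of test
    return cumulative, instantaneous

# Hypothetical TAFT program: 50-hour MTBF after the first 500 test hours,
# growth slope 0.3, and 5,000 total test hours planned before FRP.
cumulative_mtbf, instantaneous_mtbf = projected_mtbf(50.0, 500.0, 5000.0, 0.3)
print(f"Projected cumulative MTBF:    {cumulative_mtbf:6.1f} hours")
print(f"Projected instantaneous MTBF: {instantaneous_mtbf:6.1f} hours")
```

A projection of this kind lets the program office check whether the planned test hours are sufficient to reach the reliability threshold required at FRP, or whether additional test-fix-test cycles must be budgeted.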
16.3 TEMP FORMAT

The format specified in the Defense Acquisition Guidebook, Appendix 2, is required for all Acquisition Category I, IA, and OSD-designated oversight programs (Table 16-1). It may be tailored as needed for lesser category acquisition programs at the discretion of the milestone decision authority. The TEMP is intended to be a summary document outlining DT&E and OT&E management responsibilities across all phases of the acquisition process. When the development is a multi-Service or joint acquisition program, one integrated TEMP is developed with Service annexes, as required. A Capstone TEMP may not be appropriate for a single major weapon platform but could be used to encompass testing of a collection of individual systems, each with its own annex (e.g., Missile Defense Agency (MDA), Family of Tactical Vehicles, Future Combat Systems (FCSs)). A program TEMP is updated at milestones, upon program baseline breach, and for other significant program changes. Updates may consist of page changes and are no longer required when a program has no further development activities.

The TEMP is a living document that must address changes to critical issues associated with an acquisition program. Major changes in program requirements, schedule, or funding usually result in a change in the test program. Thus, the TEMP must be reviewed and updated on program change, on baseline breach, and before each milestone decision to ensure that T&E requirements are current. As the primary document used in the OSD review and milestone decision process to assess the adequacy of planned T&E, the TEMP must be of sufficient scope and content to explain the entire T&E program. The key topics in the TEMP are shown in Table 16-1.


Table 16-1. Test and Evaluation Master Plan Format

PART I      SYSTEM INTRODUCTION
            MISSION DESCRIPTION
            SYSTEM DESCRIPTION
            SYSTEM THREAT ASSESSMENT
            MEASURES OF EFFECTIVENESS AND SUITABILITY
            CRITICAL TECHNICAL PARAMETERS
PART II     INTEGRATED TEST PROGRAM SUMMARY
            INTEGRATED TEST PROGRAM SCHEDULE
            MANAGEMENT
PART III    DEVELOPMENT TEST AND EVALUATION OUTLINE
            DEVELOPMENT TEST AND EVALUATION OVERVIEW
            FUTURE DEVELOPMENTAL TEST AND EVALUATION LIMITATIONS
PART IV     OPERATIONAL TEST AND EVALUATION OUTLINE
            OPERATIONAL TEST AND EVALUATION OVERVIEW
            CRITICAL OPERATIONAL ISSUES
            FUTURE OPERATIONAL TEST AND EVALUATION LIMITATIONS
            LIVE FIRE TEST AND EVALUATION
PART V      TEST AND EVALUATION RESOURCE SUMMARY
            TEST ARTICLES
            TEST SITES AND INSTRUMENTATION
            TEST SUPPORT EQUIPMENT
            THREAT REPRESENTATION
            TEST TARGETS AND EXPENDABLES
            OPERATIONAL FORCE TEST SUPPORT
            SIMULATIONS, MODELS, AND TEST BEDS
            SPECIAL REQUIREMENTS
            TEST AND EVALUATION FUNDING REQUIREMENTS
            MANPOWER/PERSONNEL TRAINING
APPENDIX A  BIBLIOGRAPHY
APPENDIX B  ACRONYMS
APPENDIX C  POINTS OF CONTACT
ATTACHMENTS (AS APPROPRIATE)

SOURCE: Defense Acquisition Guidebook, September 2004.

Each TEMP submitted to OSD should be a summary document, detailed only to the extent necessary to show the rationale for the type, amount, and schedules of the testing planned. It must relate the T&E effort clearly to technical risks, operational issues and concepts, system performance, Reliability, Availability, and Maintainability (RAM), logistics objectives and requirements, and major decision points. It should summarize the testing accomplished to date and explain the relationship of the various models and simulations, subsystem tests, integrated system development tests, and initial operational tests that, when analyzed in combination, provide confidence in the system's readiness to proceed into the next acquisition phase.


The TEMP must address the T&E to be accomplished in each program phase, with the next phase addressed in the most detail. The TEMP is also used as a coordination document to outline each test and support organization's role in the T&E program and to identify major test facilities and resources. The TEMP supporting the production and initial deployment decision must include the T&E planned to verify the correction of deficiencies and to complete production qualification testing and FOT&E.

The objective of the OSD TEMP review process is to ensure successful T&E programs that will support decisions to commit resources at major milestones. Some of the T&E issues considered during the TEMP review process include:

(1) Are DT&E and OT&E initiated early to assess performance, identify risks, and estimate operational potential?
(2) Are critical issues, test directives, and evaluation criteria related to mission need and operational requirements established well before testing begins?
(3) Are provisions made for collecting sufficient test data, with appropriate test instrumentation, to minimize subjective judgment?
(4) Is OT&E conducted by an organization independent of the developer and user?
(5) Do the test methodology and instrumentation provide a mature and flexible network of resources that stresses (as early as possible) the weapon system in a variety of realistic environments?

16.4 SUMMARY

The PMO is directly responsible for the content and quality of the test strategy and planning document. The TEMP, as an integrated summary management tool, requires an extensive commitment of manhours and PM guidance. The interactions of the various T&E players and support agencies in the T&E WIPT must be woven into the fabric of the total system acquisition strategy. Cost and schedule implications must be negotiated to ensure a viable T&E program that provides timely and accurate data to the engineering and management decision makers.


ENDNOTES

1. DoD Instruction (DoDI) 5000.2, Operation of the Defense Acquisition System, May 12, 2003, and Defense Acquisition Guidebook, http://akss.dau.mil/DAG.
2. DoD Directive (DoDD) 5000.1, The Defense Acquisition System, May 12, 2003.
3. Defense Science Board, Report of Task Force on Test and Evaluation, April 1, 1974.
4. DoD 3235.1-H, Test and Evaluation of System Reliability, Availability and Maintainability – A Primer, March 1982.
5. Ibid.


MODULE V

SPECIALIZED TESTING

Many program managers face several T&E issues that must be resolved to get their particular weapon system tested and ultimately fielded. These issues may include modeling and simulation support, combined and concurrent testing, test resources, survivability and lethality testing, multi-Service testing, or international T&E. Each issue presents a unique set of challenges for program managers when they develop the integrated strategy for the T&E program.


17

SOFTWARE SYSTEMS TESTING

17.1 INTRODUCTION

Software development presents a major development risk for military weapons and National Security Systems (NSS). Software is found in Major Automated Information Systems (MAISs) and in weapon system software. High-cost software systems, such as personnel records management systems, financial accounting systems, or logistics records systems, that are the end-item solution to user requirements fall in the MAIS category. Performance requirements for a MAIS typically drive the host hardware configurations and are managed by the Information Technology Acquisition Board (ITAB), chaired by the Assistant Secretary of Defense (Networks and Information Integration) (ASD(NII))/Chief Information Officer (CIO). The Director, Operational Test and Evaluation (DOT&E) is a principal member of the ITAB.

Software developments, such as avionics systems, weapons targeting and control, and navigation computers, that are a subset of the hardware solution to user requirements fall in the weapon system software category. Performance requirements for the system hardware are flowed down to drive the functionality of the software resident in onboard computers. The effectiveness of the weapon system software is reviewed as part of the overall system review by the Defense Acquisition Board (DAB), chaired by the Under Secretary of Defense for Acquisition, Technology, and Logistics (USD(AT&L)). The DOT&E is a principal member, and the Deputy Director, Development, Test and Evaluation (DT&E) is an advisor to the DAB. No Milestone A, B, or Full Rate Production (FRP) decision (or their equivalent) shall be granted for a MAIS until the Department of Defense (DoD) CIO certifies that the MAIS program is being developed in accordance with the Clinger-Cohen Act provisions.1 For MDAP and MAIS programs, the DoD Component CIO's confirmation (for Major Defense Acquisition Programs (MDAPs)) and certification (for MAISs) shall be provided to both the DoD CIO and the Milestone Decision Authority (MDA).

Software development historically has escalated the cost and reduced the reliability of weapon systems. Embedded computer systems, which are physically incorporated into larger weapon systems, have a major data processing function. Typically, the output of these systems is information, control signals, or computer data required by the host system to complete its mission. Although hardware and software often contribute in equal measure to successful implementation of system functions, there have been relative imbalances in their treatment during system development.

Automated Information Systems (AISs), once developed, integrated, and tested in the host hardware, are essentially ready for production. Software in weapon systems, once integrated in the host hardware, continues to be tested as a component of the total system and is not ready for production until the total system has successfully demonstrated required performance. Any changes to the weapon system hardware configuration may stimulate changes to the software.
The development of all software systems involves a series of activities in which there are frequent opportunities for errors. Errors may occur at the inception of the process, when the requirements may be erroneously specified, or later in the development cycle, when Systems Integration (SI) is implemented. This chapter addresses the use of testing to obtain insights into the development risk of AIS and weapon system software, particularly as it pertains to the software development processes.


17.2 DEFINITIONS

The term Automated Information System (AIS) is defined as a combination of computer hardware and software, data, or telecommunications that performs functions such as collecting, processing, transmitting, and displaying information.2 Excluded are computer resources, both hardware and software, that are physically part of, dedicated to, or essential in real time to the mission performance of weapon systems.3

The term weapon system software includes Automated Data Processing Equipment (ADPE), software, or services when the function, operation, or use of the equipment, software, or services involves:

(1) Intelligence activities;
(2) Cryptologic activities related to national security;
(3) Command and control of military forces;
(4) Equipment that is an integral part of a weapon system;
(5) Critical, direct fulfillment of military or intelligence missions.

Acquisition of software for DoD is described in Military Standard (MIL-STD)-498, Software Development and Documentation, which, although rescinded, has been waived for use until commercial standards such as EIA 640 or J-Std-016 (Software Life Cycle Processes, Software Development, September 1995) become the guidance for software development.4 Guidance may also be found in the Defense Acquisition Guidebook.

17.3 PURPOSE OF SOFTWARE TEST AND EVALUATION

A major problem in software development is a lack of well-defined requirements. If requirements are not well-defined, errors can multiply throughout the development process. As illustrated in Figure 17-1, errors may occur at the inception of the process. These errors may occur during requirements definition, when objectives may be erroneously or imperfectly specified; during the later design and development stages, when these objectives are implemented; and during the software maintenance and operational phases, when software changes are needed to eliminate errors or enhance performance. Estimates of increased software costs arising from incomplete testing help to illustrate the dimension of software Life-Cycle Costs (LCCs). Averaged over the operational life cycle of a computer system, development costs encompass approximately 30 percent of total system costs. The remaining 70 percent of LCCs are associated with maintenance, which includes system enhancements and error correction. Complete testing during earlier development phases may have detected these errors. The relative cost of error correction increases as a function of time from the start of the development process. Relative costs of error correction rise dramatically between the requirements and design phases, and more dramatically during code implementation.

Previous research in the area of software Test and Evaluation (T&E) reveals that half of all maintenance costs are incurred in the correction of previously undetected errors.
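A small arithmetic sketch makes these proportions concrete; the dollar figure is hypothetical, and the percentages are simply those cited above.

```python
# Illustrative only: apply the percentages cited above to a hypothetical
# $100M software life-cycle cost.
total_lcc_m = 100.0                          # $M, hypothetical program
development_share = 0.30                     # ~30 percent of LCC is development
maintenance_share = 1.0 - development_share  # ~70 percent is maintenance/operations
error_correction_share = 0.50                # ~half of maintenance corrects escaped errors

error_correction_m = total_lcc_m * maintenance_share * error_correction_share
print(f"Development:                    ${total_lcc_m * development_share:5.1f}M")
print(f"Maintenance:                    ${total_lcc_m * maintenance_share:5.1f}M")
print(f"  correction of escaped errors: ${error_correction_m:5.1f}M "
      f"({error_correction_m / total_lcc_m:.0%} of total LCC)")
```

Under these figures, about half of the post-development (operational) cost, and roughly a third of the total life-cycle cost, is spent correcting errors that earlier, more complete testing might have caught.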
Approximately one-half of the operational LCCs can be traced directly to inadequate or incomplete testing activities. In addition to increasing cost, software errors in weapon systems can result in mission critical software failures that may impact mission success and personnel safety. A more systematic and rigorous approach to software testing is required.


Figure 17-1. The Error Avalanche (requirement, design, and coding errors propagate and accumulate through requirements definition, design, build, and test and integration, producing a system with known and unknown deficiencies)

To be effective, this approach must be applied to all phases of the development process in a planned and coordinated manner, beginning at the earliest design stages and proceeding through operational testing of the integrated system. Early, detailed software T&E planning is critical to the successful development of a computer system.

17.4 SOFTWARE DEVELOPMENT PROCESS

Software engineering technologies used to produce operational software are key risk factors in a development program. The T&E program should help determine which of these technologies increase risk and have a life-cycle impact. A principal source of risk is the support software required to develop operational software. In terms of life-cycle impact, operational software problems are commonly associated with the difficulty in maintaining and supporting the software once deployed. Software assessment requires an analysis of the life-cycle impact, which varies depending on the technology used to design and implement the software. One approach to reducing long-term life-cycle risks is to use a commercial language and common hardware throughout the development and operation of the software. These life-cycle characteristics that affect operational capabilities must be addressed in the Test and Evaluation Master Plan (TEMP), and tests should be developed to identify problems caused by these characteristics.


The technology used to design and implement the software may significantly affect software supportability and maintainability.

The TEMP must sufficiently describe the acceptance criteria or software maturity metrics, drawn from the written specifications, that will lead to operational effectiveness and suitability. The specifications must define the required software metrics to set objectives and thresholds for mission critical functions. Additionally, these metrics should be evaluated at the appropriate stage of system development rather than at some arbitrarily imposed milestone.

17.5 T&E IN THE SOFTWARE LIFE CYCLE

Software testing is an iterative process executed at all development stages to examine program design and code to expose errors. Software test planning should be described in the TEMP with the same care as test planning for other system components (Figures 17-2, 17-3).

Figure 17-2. System Development Process (parallel hardware and software development tracks, from system requirements analysis and design reviews through fabrication, coding, and CI/CSU/CSC/CSCI testing to system integration and system DT&E)

17.5.1 Testing Approach

The integration of software development into the overall acquisition process dictates a testing process consistent with the bottom-up approach taken with hardware development. The earliest stage of software testing is characterized by heavy human involvement in basic design and coding processes. Thus, human testing is defined as informal, non-computer-based methods of evaluating architectures, designs, and interfaces. It can consist of:

• Inspections: Programmers explain their work to a small group of peers with discussion and direct feedback on errors, inconsistencies, and omissions.
• Walk-through: A group of peers develops test cases to evaluate work to date and gives direct feedback to the programmer.
• Desk Checking: A self-evaluation is made by programmers of their own work. There is a low probability of programmers identifying their own errors of logic or coding.


Figure 17-3. Spiral Model for AIS Development Process (repeated cycles of objective setting, risk analysis, prototyping, development, and planning for the next phase, leading through qualification and user acceptance testing to IOC and FOC deliveries)


• Peer Ratings: Mutually supportive, anonymous reviews are performed by groups of peers with collaborative evaluations and feedback.
• Design Reviews: Preliminary Design Reviews (PDRs) and Critical Design Reviews (CDRs) provide milestones in the development efforts that review development and evaluations to date. An Independent Verification and Validation (IV&V) contractor may facilitate the government's ability to give meaningful feedback.

Once the development effort has matured beyond the benefits of human testing, computerized software-only testing may be appropriate. It is performed to determine the functionality of the software when tested as an entity or "build." Documentation control is essential so that test results are correlated with the appropriate version of the build. Software testing is usually conducted using some combination of "black box" and "white box" testing.

• Black Box: Functional testing of a software unit without knowledge of how the internal structure or logic will process the input to obtain the specified output. Within-boundary and out-of-boundary stimulants test the software's ability to handle abnormal events. Most likely cases are tested to provide a reasonable assurance that the software will demonstrate specified performance. Even the simplest software designs rapidly exceed the capacity to test all alternatives.
• White Box: Structural testing of the internal logic and software structure provides an opportunity for more extensive identification and testing of critical paths. The process and objectives are otherwise very similar to black box testing.

Testing should be performed from the bottom up. The smallest controlled software modules—computer software units—are tested individually. They are then combined or integrated and tested in larger aggregate groups or builds. When this process is complete, the software system is tested in its entirety. Obviously, as errors are found in the latter stages of the test program, a return to earlier portions of the development program to provide corrections is required. The cost impact of error detection and correction can be diminished using the bottom-up testing approach.
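A minimal sketch of a unit-level, black-box test illustrates the bottom-up approach; the module, function, and threshold values are hypothetical and are not taken from any particular system or from this guide.

```python
import unittest

def normalize_heading(bearing_deg: float) -> float:
    """Hypothetical software unit: normalize a commanded bearing to [0, 360)."""
    if not -720.0 <= bearing_deg <= 720.0:
        # Out-of-boundary inputs are rejected explicitly rather than silently wrapped.
        raise ValueError(f"bearing out of accepted range: {bearing_deg}")
    return bearing_deg % 360.0

class TestNormalizeHeading(unittest.TestCase):
    # Black-box, within-boundary cases: only the specified input/output behavior
    # is checked, with no knowledge of the unit's internal logic.
    def test_nominal_inputs(self):
        self.assertEqual(normalize_heading(0.0), 0.0)
        self.assertEqual(normalize_heading(370.0), 10.0)
        self.assertEqual(normalize_heading(-90.0), 270.0)

    # Out-of-boundary stimulus: the unit must fail loudly, not corrupt the build.
    def test_out_of_boundary_rejected(self):
        with self.assertRaises(ValueError):
            normalize_heading(10_000.0)

if __name__ == "__main__":
    unittest.main()
```

Units tested this way are then integrated into builds, where the same discipline is repeated at successively higher levels before system-level testing begins.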
System-level testing can begin once all modules in the Computer Software Configuration Item (CSCI) have been coded and individually tested. A Software Integration Lab (SIL), with adequate machine time and appropriate simulations, will facilitate simulation/emulation of the host hardware and the operating environment. If data analysis indicates proper software functioning, it is time to advance to a more complex and realistic test environment.

• Hot Bench Testing: Integration of the software released from the SIL for full-up testing with actual system hardware in a Hardware-in-the-Loop (HWIL) facility marks a significant advance in the development process. Close approximation of the actual operating environment should provide the test sequences and stress needed to evaluate the effectiveness of the software system(s). Problems stimulated by the "noisy environment," interface problems, Electromagnetic Interference (EMI), and different electrical transients should surface. Good hardware and software test programs leading up to HWIL testing should aid in isolating problems to the hardware or software side of the system. Caution should be taken to avoid any outside stimuli that might trigger unrealistic responses.

• Field Testing: Development Test and Evaluation (DT&E) and Operational Test and Evaluation (Operational Assessment (OA), IOT&E) events must be designed to provide for data collection processes and instrumentation that will measure system responses and allow data analysts to identify the appropriate causes of malfunctions.


Field testing should be rigorous, providing the environmental stresses and mission profiles likely to be encountered in operational scenarios. Government software support facility personnel tasked with future maintenance of the software system should be brought on board to familiarize them with the system operating characteristics and documentation. Their resultant expertise should be included in the software T&E process to assist in the selection of stimuli likely to expose software problems.

It is critical that adequate software T&E information be contained in documents such as TEMPs and test plans. The TEMP must define characteristics of critical software components that effectively address objectives and thresholds for mission critical functions. The Measures of Effectiveness (MOEs) must support the critical software issues. The test plan should specify the test methodologies that will be applied. Test methodologies consist of two components: the test strategy that guides the overall testing effort, and the testing technique that is applied within the framework of a test strategy. Effective test methodologies require realistic software test environments and scenarios. The test scenarios must be appropriate for the test objectives; i.e., the test results must be interpretable in terms of the software test objectives. The test scenarios and analysis should actually verify and validate accomplishment of requirements. Information assurance testing must be conducted on any system that collects, stores, transmits, or processes classified or unclassified information. In the case of Information Technology (IT) systems, including NSS, DT&E should support the DoD Information Technology Security Certification and Accreditation Process (DITSCAP) and Joint Interoperability Certification (JIC) processes.5 The test environments must be chosen based on a careful analysis of the characteristics to be demonstrated and their relationship to the development, operational, and support environments. In addition, environments must be representative of those in which the software will be maintained. At Milestone C, for a MAIS, the MDA shall approve, in coordination with the DOT&E, the quantity and location of sites for a limited deployment for IOT&E.6

17.5.2 Independent Verification and Validation

IV&V are risk-reducing techniques that are applied to major software development efforts. The primary purpose of IV&V is to ensure that software meets requirements and is reliable and maintainable. IV&V is effective only if implemented early in the software development schedule. Requirements analysis and risk assessment are the most critical activities performed by IV&V organizations; their effectiveness is limited if they are brought on board a project after the fact. Often, there is a reluctance to implement IV&V because of the costs involved, but early implementation of IV&V will result in lower overall costs of error correction and software maintenance. As development efforts progress, IV&V involvement typically decreases. This is due more to the expense of continued involvement than to a lack of need.
For an IV&V program to be effective, it must be the responsibility of an individual or organization external to the software development Program Manager (PM).

The application of the IV&V process to software development maximizes the maintainability of the fielded software system while minimizing the cost of developing and fielding it. Maintenance of a software system falls into several major categories: corrective maintenance, modifying software to correct errors in operation; adaptive maintenance, modifying the software to meet changing requirements; and perfective maintenance, modifying the software to incorporate new features or improvements.

The IV&V process maximizes the reliability of the software product, which eases the performance of and minimizes the need for corrective maintenance.


It attempts to maximize the flexibility of the software product, which eases the performance of adaptive and perfective maintenance. These goals are achieved primarily by determining, at each step of the software development process, that the software product completely and correctly meets the specific requirements determined at the previous step of development. This step-by-step, iterative process continues from the initial definition of system performance requirements through final acceptance testing.

The review of software documentation at each stage of development is a major portion of the verification process. The current documentation is a description of the software product at the present stage of development and will define the requirements laid on the software product at the following stage. Careful examination and analysis of the development documentation ensure that each step in the software design process is consistent with the previous step. Omissions, inconsistencies, or design errors can then be identified and corrected early in the development process.

Continuing participation in formal and informal design reviews by the IV&V organization maintains the communication flow between software system "customers" and developers, ensuring that software design and production proceed with minimal delays and misunderstandings. Frequent informal reviews, design and code walk-throughs, and audits ensure that the programming standards, software engineering standards, Software Quality Assurance (SQA), and configuration management procedures designed to produce a reliable, maintainable operational software system are followed throughout the process. Continuous monitoring of computer hardware resource allocation throughout the software development process also ensures that the fielded system has adequate capacity to meet operation and maintainability requirements.

The entire testing process, from the planning stage through final acceptance test, is also approached in a step-by-step manner by the IV&V process. At each stage of development, the functional requirements determine test criteria as well as design criteria for the next stage. An important function of the IV&V process is to ensure that the test requirements are derived directly from the performance requirements and are independent of design implementation. Monitoring of, participation in, and performance of the various testing and inspection activities by the IV&V contractor ensure that the developed software meets requirements at each stage of development.

Throughout the software development process, the IV&V contractor reviews any proposals for software enhancement or change, proposed changes in development baselines, and proposed solutions to design or implementation problems to ensure that the original performance requirements are not forgotten.
An important facet of the IV&V contractor's role is to act as the objective third party, continuously maintaining the "audit trail" from the initial performance requirements to the final operational system.

17.6 SUMMARY

There is a useful body of software testing technologies that can be applied to testing of AIS and weapon system software. As a technical discipline, though, software testing is still maturing. There is a growing foundation of guidance documents to guide the PM in choosing one testing technique over another. One example is the U.S. Air Force Software Technology Support Center's (STSC's) Guidelines for Successful Acquisition and Management of Software-Intensive Systems. The Air Force Operational Test and Evaluation Center (AFOTEC) has also developed a course on software OT&E. It is apparent that systematic T&E techniques are far superior to ad hoc testing techniques. Implementation of an effective software T&E plan requires a set of strong technical and management controls. Given the increasing amount of AIS and weapon system software being acquired, there will be an increased emphasis on tools and techniques for software T&E.


Another resource is the Software Program Manager's Network, which publishes guides on Best Practices in Software T&E, Vol. I & II (http://www.spmn.com).


ENDNOTES

1. DoD Instruction (DoDI) 5000.2, Operation of the Defense Acquisition System, May 12, 2003.
2. DoD 5000.2-R, Mandatory Procedures for Major Defense Acquisition Programs (MDAPs) and Major Automated Information System (MAIS) Acquisition Programs, June 2001.
3. There is some indication that DoD Directive (DoDD) 8000.1, Management of DoD Information Resources and Information Technology, February 27, 2002, Defense Information Management Program, which provides guidance on AIS development, will be incorporated in a future change to the 5000 Series on acquisition management.
4. MIL-STD-498, Software Development and Documentation, December 4, 1994, rescinded; EIA 640 or J-Std-016 (Software Life Cycle Processes, Software Development, September 1995).
5. DoDD 4630.5, Interoperability and Supportability of Information Technology (IT) and National Security Systems (NSS), May 5, 2004, and Chairman of the Joint Chiefs of Staff Instruction (CJCSI) 6212.01C, Interoperability and Supportability of Information Technology and National Security Systems, November 20, 2003.
6. DoDI 5000.2.


18

TESTING FOR VULNERABILITY AND LETHALITY

18.1 INTRODUCTION

This chapter addresses the need to explore the vulnerability and lethality aspects of a system through Test and Evaluation (T&E) practices and procedures. In particular, this chapter describes the legislatively mandated Live Fire Test Program, which has been established to evaluate the survivability and lethality of developing systems (Table 18-1). It also discusses the role of T&E in assessing a system's ability to perform in a nuclear combat environment. The discussion of testing for nuclear survivability is based primarily on information contained in the Nuclear Survivability Handbook for OT&E.1

18.2 LIVE FIRE TESTING

18.2.1 Background

In March 1984, the Office of the Secretary of Defense (OSD) chartered a joint T&E program designated "The Joint Live Fire Program." This program was to assess the vulnerabilities and lethality of selected U.S. and threat systems already fielded. The controversy over joint Live Fire Testing (LFT) of the Army's Bradley Fighting Vehicle System, subsequent congressional hearings, and media exposure resulted in provisions being incorporated in the National Defense Authorization Act (NDAA) of Fiscal Year (FY) 87. This act required an OSD-managed LFT program for major acquisition programs fitting certain criteria. Subsequent amendments to legislative guidance have dictated the current program. The DoD implementation of congressional guidance in the Defense Acquisition Guidebook requires that "covered systems, major munitions programs, missile programs, or Product Improvements (PIs) to these" (i.e., Acquisition Category (ACAT) I and II programs) must execute survivability and lethality testing before Full Rate Production (FRP).

Table 18-1. Relationships Between Key Concepts

                        PERSPECTIVE
TERMINOLOGY        DEFENSIVE   OFFENSIVE   MEANING
SURVIVABILITY          X                   PROBABILITY OF ENGAGEMENT
EFFECTIVENESS                      X
VULNERABILITY          X                   PROBABILITY OF KILL GIVEN A HIT
LETHALITY                          X
SUSCEPTIBILITY         X                   PROBABILITY OF ENGAGEMENT

Source: Adapted from Live Fire Testing: Evaluating DoD's Programs, U.S. General Accounting Office, GAO/PEMD-87-17, August 1987, page 15.
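As a worked illustration of how the defensive quantities in Table 18-1 interact, the sketch below multiplies a probability of engagement (hit) by a probability of kill given a hit. The combination rule and the numbers are assumptions drawn from conventional survivability analysis, not values or rules taken from this guide.

```python
# Illustrative numbers only; combining the probability of engagement (hit) with
# the probability of kill given a hit in this way is conventional survivability
# practice, not a prescription of this guide.
p_hit = 0.30             # susceptibility-related: probability the threat engages and hits
p_kill_given_hit = 0.40  # vulnerability: probability a hit results in a kill

p_kill = p_hit * p_kill_given_hit
print(f"P(kill) = P(hit) * P(kill | hit) = {p_kill:.2f}")
print("Reducing either factor (signature management or hardening) improves survivability.")
```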


The legislation dealing with the term "covered system" has been clarified as a system "that DOT&E, acting for the Secretary of Defense [SECDEF], has determined to be: a major system within the meaning of that term in Title 10, United States Code (U.S.C.) 2302(5) that is (1) user-occupied and designed to provide some degree of protection to its occupants in combat; or (2) a conventional munitions program or missile program; or (3) a conventional munitions program for which more than 1,000,000 rounds are planned to be acquired; or (4) a modification to a covered system that is likely to affect significantly the survivability or lethality of such a system."

The Secretary of Defense (SECDEF) has delegated the authority to waive requirements for full-up, system-level Live Fire Test and Evaluation (LFT&E) before the system passes the program initiation milestone, when such testing would be unreasonably expensive and impractical, to the Under Secretary of Defense (Acquisition, Technology, and Logistics) (USD(AT&L)) for ACAT ID programs and to the CAE for ACAT II programs. An alternative vulnerability and lethality T&E program must still be accomplished. Programs subject to LFT or designated for oversight are listed on the OSD annual T&E oversight list. The DoD agent for management of the Live Fire Test program is the Director, Operational Test and Evaluation (DOT&E). This type of Development Test and Evaluation (DT&E) must be planned to start early enough in the development process to impact design and to provide timely test data for the OSD Live Fire Test Report required for the Full Rate Production Decision Review (FRPDR) and congressional committees. The Service-detailed Live Fire Test Plan must be reviewed and approved by the DOT&E, and LFT must be addressed in Part IV of the program Test and Evaluation Master Plan (TEMP). The OSD had previously published guidelines, elements of which have subsequently been incorporated into the latest revision of the 5000 Series Defense Acquisition Guidebook.

18.2.2 Live Fire Tests

There are varying types and degrees of live fire tests. The matrix in Table 18-2 illustrates the various possible combinations. Full-scale, full-up testing is usually considered to be the most realistic. The importance of full-scale testing has been well demonstrated by the Joint Live Fire (JLF) tests. In one case, these tests contradicted earlier conclusions concerning the flammability of a new hydraulic fluid used in F-15 and F-16 aircraft. Laboratory tests had demonstrated that the new fluid was less flammable than the standard fluid. However, during the JLF tests, 30 percent of the shots on the new fluid resulted in fires, contrasted with 15 percent of the shots on the standard fluid.2 While much insight and valuable wisdom are to be obtained through the testing of components or subsystems, some phenomena are only observable when full-up systems are tested. The interaction of such phenomena has been termed "cascading damage."

Table 18-2. Types of Live Fire Testing

FULL SCALE, FULL-UP LOADING: Complete system with combustibles (e.g., Bradley Phase II tests, aircraft "proof" tests).
FULL SCALE, INERT(a): Complete system, no combustibles (e.g., tests of new armor on actual tanks, aircraft flight control tests).
SUB SCALE, FULL-UP LOADING: Components and subcomponents with combustibles (e.g., fuel cell tests, behind armor, mock-up aircraft, engine fire tests).
SUB SCALE, INERT(a): Components and subcomponents; structures, terminal ballistics, munitions performance, behind-armor tests, warhead characterization (e.g., armor/warhead interaction tests, aircraft component structural tests).

(a) In some cases, targets are "semi-inert," meaning some combustibles are on board, but not all (example: tests of complete tanks with fuel and hydraulic fluid, but dummy ammunition).

Source: Live Fire Testing: Evaluating DoD's Programs, General Accounting Office, GAO/PEMD-87-17, August 1987.


Such damage is a result of the synergistic damage mechanisms that are at work in the "real world" and likely to be found during actual combat. LFT provides a way of examining the damage inflicted not only on materiel but also on personnel. The crew casualty problem is an important issue that the LFT program is addressing. The program provides an opportunity to assess the effects of the complex environments that crews are likely to encounter in combat (e.g., fire, toxic fumes, blunt injury shock, and acoustic injuries).3

18.2.3 Use of Modeling and Simulation (M&S)

Survivability and lethality assessments have traditionally relied largely on the use of modeling and simulation techniques. The LFT Program does not replace the need for such techniques; in fact, the LFT Guidelines issued by OSD in May 1987 (Figure 18-1) required that no shots be conducted until pre-shot model predictions were made concerning the expected damage.

Figure 18-1. Live Fire Test and Evaluation Planning Guide (flow from identification as an LFT&E candidate, through development and OSD approval of the LFT&E strategy in the TEMP and of the detailed LFT&E plan, to test conduct, test reporting, OSD assessment, and reporting to the SECDEF and congressional committees)


Such predictions are useful for several reasons. First, they assist in the test planning process. If a model predicts that no damage will be inflicted, test designers and planners should reexamine the selection of the shot lines and/or reassess the accuracy of the threat representation. Second, pre-shot model predictions provide the Services with the opportunity to validate the accuracy of the models by comparing them with actual LFT results. At the same time, the LFT program reveals areas of damage that may be absent from existing models and simulations. Third, pre-shot model predictions can be used to help conserve scarce target resources. For example, models can be used to determine a sequence of shots that provides for the less damaging shots to be conducted first, followed by the more catastrophic shots resulting in maximum target damage.

18.2.4 Live Fire Test Best Practices

The Defense Acquisition Guidebook states that plans for LFT must be included in the TEMP. Key points covered in the LFT guidelines include the following:

• The LFT&E Detailed T&E Plan is the basic planning document used by OSD and the Services to plan, review, and approve LFT&E. Services will submit the plan to the DOT&E for comment at least 30 days prior to test initiation.
• The LFT&E plan must contain general information on the system's required performance, operational and technical characteristics, critical test objectives, and the evaluation process.
• Each LFT&E plan must include testing of complete systems. A limited set of live fire tests may involve production components configured as a subsystem before full-up testing.
• A Service report must be submitted within 120 days of the completion of the live fire test. The report must include the firing results, test conditions, limitations, and conclusions, and be submitted in classified and unclassified form.
• Within 45 days of receipt of the Service report, a separate Live Fire Test Report4 will be produced by the DOT&E, approved by the Secretary of Defense, and transmitted to Congress. The conclusions of the report will be independent of the conclusions of the Service report. Reporting on LFT&E may be included in the weapon system's Beyond Low Rate Initial Production (BLRIP) Report completed by the DOT&E.
• The Congress shall have access to all live fire test data and all live fire test reports held by or produced by the Secretary of the concerned Service or by OSD.
• The costs of all live fire tests shall be paid from funding for the system being tested. In some instances, the Deputy DOT&E-Live Fire may elect to supplement such funds for the acquisition of targets or target simulators, although the ultimate responsibility rests with the concerned Service.

The Survivability/Vulnerability Information Analysis Center (SURVIAC) is the DoD focal point for nonnuclear survivability/vulnerability data, information, methodologies, models, and analysis relating to U.S. and foreign aeronautical and surface systems. Data on file for conventional missiles and guns includes information on acquisition, detection, tracking, launch, fly-out and fuzing characteristics, countermeasures and counter-countermeasures, and terminal effects.
Other weapons systems information includes physical and functional characteristics; design, performance, and operational information; acoustic, infrared, optical, electro-optical, and radar signatures; combat damage and repair; and system, subsystem, and component probability of kill given a hit (pk/h) functions. SURVIAC is sponsored by the Joint Logistics Commanders' Joint Aircraft Survivability Program Office (http://jas.jcs.mil) and the Joint Technical Coordinating Group on Munitions Effectiveness (http://jtcg.amsaa.army.mil/index.html).


18.3 TESTING FOR NUCLEAR HARDNESS AND SURVIVABILITY

18.3.1 Background

Nuclear survivability must be incorporated into the design, acquisition, and operation of all systems that must perform critical missions in a nuclear environment. Nuclear survivability is achieved through a combination of four methods: hardness, avoidance, proliferation, and reconstitution. Hardness allows a system to physically withstand a nuclear attack. Avoidance encompasses measures taken to avoid encountering a nuclear environment. Proliferation involves having sufficient systems to compensate for probable losses. Reconstitution includes the actions taken to repair or resupply damaged units in time to complete a mission satisfactorily.

There is a wide variety of possible effects from a nuclear detonation. They include Electromagnetic Pulse (EMP), ionizing radiation, thermal radiation, blast, shock, dust, debris, blackout, and scintillation. Each weapon system is susceptible to some but not all of these effects. Program Managers (PMs) and their staffs must identify the effects that may have an impact on the system under development and manage the design, development, and testing of the system in a manner that minimizes degradation. The variety of possible nuclear effects is described more fully in the Nuclear Survivability Handbook for Air Force OT&E.5

18.3.2 Assessing Nuclear Survivability Throughout the System Acquisition Cycle

The PM must ensure that nuclear survivability issues are addressed throughout the system acquisition cycle. During assessment of concepts, the survivability requirements stated in the Service requirements document should be verified, refined, or further defined. During the system's early design stages, tradeoffs between hardness levels and other system characteristics (such as weight, decontaminability, and compatibility) should be described quantitatively. Tradeoffs between hardness, avoidance, proliferation, and reconstitution as methods for achieving survivability should also be considered at this time. During advanced engineering development, the system must be adequately tested to confirm that hardness objectives, criteria, requirements, and specifications are met. Plans for NH&S testing should be addressed in the TEMP. The appropriate commands must make provision for test and hardness surveillance equipment and procedures so that required hardness levels can be maintained once the system is operational.

During FRP, deployment, and operational support, system hardness is maintained through an active hardness assurance program. Such a program ensures that the end product conforms to hardness design specifications and that hardness aspects are reevaluated before any retrofit changes are made to existing systems.

Once a system is operational, a hardness surveillance program may be implemented to maintain system hardness and to identify any further evaluation, testing, or retrofit changes required to ensure survivability.
A hardness surveillance program consists of a set of scheduled tests and inspections to ensure that a system's designed hardness is not degraded through operational use, logistics support, maintenance actions, or natural causes.

18.3.3 Test Planning

The Nuclear Survivability Handbook for Air Force OT&E describes the following challenges associated with NH&S testing:


Table 18-3. Nuclear Hardness and Survivability Assessment Activities

CONCEPT ASSESSMENT
• PREPARATION OF TEST AND EVALUATION MASTER PLAN (TEMP) THAT INCLUDES INITIAL PLANS FOR NUCLEAR HARDNESS AND SURVIVABILITY (NH&S) TESTS
  – IDENTIFICATION OF NH&S REQUIREMENTS IN VERIFIABLE TERMS
  – IDENTIFICATION OF SPECIAL NH&S TEST FACILITY REQUIREMENTS WITH EMPHASIS ON LONG LEAD TIME ITEMS
• DEVELOPMENT OF NUCLEAR CRITERIA

PROTOTYPE TESTING
• INCREASE TEST PLANNING
• TEMP UPDATE
• CONDUCT OF NH&S TRADE STUDIES
  – NH&S REQUIREMENTS VERSUS OTHER SYSTEM REQUIREMENTS
  – ALTERNATE METHODS FOR ACHIEVING NH&S
• CONDUCT OF LIMITED TESTING
  – PIECE-PART HARDNESS TESTING
  – DESIGN CONCEPT TRADE-OFF TESTING
  – TECHNOLOGY DEMONSTRATION TESTING
• DEVELOPMENT OF PERFORMANCE SPECIFICATIONS THAT INCLUDE QUANTITATIVE HARDNESS LEVELS

ENGINEERING DEVELOPMENT MODEL (EDM)
• FIRST OPPORTUNITY TO TEST PROTOTYPE HARDWARE
• TEMP UPDATE
• DEVELOPMENT OF NUCLEAR HARDNESS DESIGN HANDBOOK
  – PRIOR TO PRELIMINARY DESIGN REVIEW
  – USUALLY PREPARED BY NUCLEAR EFFECTS SPECIALTY CONTRACTOR
• CONDUCT OF TESTING
  – PRE-CRITICAL DESIGN REVIEW (CDR) DEVELOPMENT AND QUALIFICATION TEST
  – DEVELOPMENTAL TESTING ON NUCLEAR-HARDENED PIECE PARTS, MATERIALS, CABLING, AND CIRCUITS
  – NH&S BOX AND SUBSYSTEM QUALIFICATION TESTS (POST-CDR)
  – ACCEPTANCE TESTS TO VERIFY HARDWARE MEETS SPECIFICATIONS (POST-CDR, PRIOR TO FIRST DELIVERY)
  – SYSTEM-LEVEL HARDNESS ANALYSIS (USING BOX AND SUBSYSTEM TEST RESULTS)
  – SYSTEM-LEVEL NH&S TEST

PRODUCTION (LRIP, FULL RATE), DEPLOYMENT AND OPERATIONAL SUPPORT
• IMPLEMENTATION OF PROGRAM TO ENSURE SYSTEM RETAINS ITS NH&S PROPERTIES
• SCREENING OF PRODUCTION HARDWARE FOR HARDNESS
• IMPLEMENTATION BY USER OF PROCEDURES TO ENSURE SYSTEM'S OPERATION, LOGISTICS SUPPORT AND MAINTENANCE DO NOT DEGRADE HARDNESS FEATURES
• REASSESSMENT OF SURVIVABILITY THROUGHOUT SYSTEM LIFE CYCLE


used to simulate nuclear bursts. Nucleardetonations have effects not found in conventionalexplosions. The intense nuclearradiation, blast, shock, thermal, <strong>and</strong> EMPfields are difficult to simulate. In addition,systems are often tested at stress levels thatare either lower than those established bythe criteria or lower than the level neededto cause damage to the system.(2) The yields <strong>and</strong> configurations for undergroundtesting are limited. It is generallynot possible to test all relevant effectssimultaneously or to observe possiblyimportant synergism between effects.(3) System-level testing for nuclear effects isnormally expensive, takes years to plan <strong>and</strong>conduct, <strong>and</strong> requires specialized expertise.Often, classes of tests conducted early inthe program are not repeated later. Therefore,operational requirements should befolded into these tests from the start, oftenearly in the acquisition process. This m<strong>and</strong>atesa more extensive, <strong>com</strong>bined DT&E/OT&E (Operational <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong>)test program than normally found in othertypes of testing.PMs <strong>and</strong> <strong>Test</strong> Managers (TMs) must remain sensitiveto the ambiguities involved in testing fornuclear survivability. For example, there is nouniversal quantitative measure of survivability;<strong>and</strong> statements of survivability may lend themselvesto a variety of interpretations. Moreover,it can be difficult to <strong>com</strong>bine system vulnerabilityestimates for various nuclear effects into an assessmentof overall survivability. As a result, PMs/TMs must exercise caution when developing testobjectives <strong>and</strong> specifying measures of meritrelated to nuclear survivability.because it is not possible to test in an operationalnuclear environment. The use of an integrated DT/OT program requires early <strong>and</strong> continuous dialoguebetween the two test <strong>com</strong>munities so eachunderst<strong>and</strong>s the needs of the other <strong>and</strong> maximumcooperation in meeting objectives is obtained.T&E techniques available to validate the nuclearsurvivability aspects of systems <strong>and</strong> subsystemsinclude underground nuclear testing, environmentalsimulation (system level, subsystem level, <strong>and</strong><strong>com</strong>ponent level), <strong>and</strong> analytical simulation. Table18-3 outlines the major activities relevant to theassessment of NH&S <strong>and</strong> the phases of the acquisitioncycle in which they occur.18.4 SUMMARYThe survivability <strong>and</strong> lethality aspects of certainsystems must be evaluated through live fire tests.These tests are used to provide insights into theweapon system’s ability to continue to operate/fight after being hit by enemy threat systems. Itprovides a way of examining the damages inflictednot only on materiel but also on personnel.LFT also provides an opportunity to assessthe effects of <strong>com</strong>plex environments that crewsare likely to encounter in <strong>com</strong>bat.Nuclear survivability must be carefully evaluatedduring the system acquisition cycle. Tradeoffsbetween hardness levels <strong>and</strong> other system characteristics,such as weight, speed, range, cost, etc.,must be evaluated. Nuclear survivability testingis difficult, <strong>and</strong> the evaluation of test results mayresulted in a variety of interpretations. 
Therefore,PMs must exercise caution when developing testobjectives related to nuclear survivability.18.3.4 <strong>Test</strong> ExecutionFor NH&S testing, Development <strong>Test</strong> (DT) <strong>and</strong>Operational <strong>Test</strong> (OT) efforts are often <strong>com</strong>bined18-7


ENDNOTES

1. Air Force Operational Test and Evaluation Center (AFOTEC), Nuclear Survivability Handbook for Air Force OT&E.
2. DoD Instruction (DoDI) 5030.55, Joint AEC-DoD Nuclear Weapons Development Procedures, January 4, 1974.
3. Office of the Deputy Director, Test and Evaluation (Live Fire Test), Live Fire Test and Evaluation Planning Guide, June 1989.
4. Defense Acquisition Guidebook, Appendix 3. http://akss.dau.mil/DAG/.
5. AFOTEC.


19
LOGISTICS INFRASTRUCTURE T&E

19.1 INTRODUCTION

In all materiel acquisition programs, the logistics support effort begins in the capabilities assessments of mission needs, leads to the development of the Initial Capabilities Document (ICD) before program initiation, continues throughout the acquisition cycle, and extends past the deployment phase. Logistics support system testing must, therefore, extend over the entire acquisition cycle of the system and be carefully planned and executed to ensure the readiness and supportability of the system. Discussion of Performance Based Logistics (PBL) can be found at the Defense Acquisition University (DAU) Web site (http://www.dau.mil), Acquisition, Technology, and Logistics (AT&L) Knowledge Sharing Site (AKSS), for the Logistics Community of Practice (CoP). This chapter covers the development of logistics support test requirements and the conduct of supportability assessments to ensure that readiness and supportability objectives are identified and achieved. The importance of the logistics manager's participation in the Test and Evaluation Master Plan (TEMP) development process should be stressed. The logistics manager must ensure the logistics system Test and Evaluation (T&E) objectives are considered and that adequate resources are available for logistics support system T&E.

Logistics system support planning is a disciplined, unified, and iterative approach to the management and technical activities necessary to integrate support considerations into system and equipment design; develop support requirements that are related consistently to readiness objectives, design, and each other; acquire the required support; and provide the required support during deployment and operational support at affordable cost.

Logistics support systems are usually categorized into 10 specific components, or elements:

(1) Maintenance planning
(2) Manpower and personnel
(3) Supply support
(4) Support equipment
(5) Technical data
(6) Training and training support
(7) Computer resources support
(8) Facilities
(9) Packaging, handling, storage, and transportation
(10) Design interface.

19.2 PLANNING FOR LOGISTICS SUPPORT SYSTEM T&E

19.2.1 Objectives of Logistics System T&E

The main objective of logistics system T&E is to verify that the logistics support being developed for the materiel system is capable of meeting the required objectives for peacetime and wartime employment. This T&E consists of the usual Development Test and Evaluation (DT&E) and Operational Test and Evaluation (OT&E) but also includes post-deployment supportability assessments.


Figure 19-1. Logistics Supportability Objectives in the T&E Program

(Figure 19-1 is a matrix of developmental T&E and operational T&E logistics supportability objectives for each acquisition phase: Concept Refinement, Technology Development, System Development and Demonstration, Production and Deployment, and Operations and Support.)


The formal DT&E and OT&E begin with subcomponent assembly and prototype testing, and continue throughout advanced engineering development, production, deployment, and operational support. Figure 19-1 describes the specific Development Test (DT), Operational Test (OT), and supportability assessment objectives for each acquisition phase. "The PM shall apply Human Systems Integration (HSI) to optimize total system performance (hardware, software, and human), operational effectiveness, and suitability, survivability, safety, and affordability." 1

19.2.2 Planning for Logistics Support System T&E

19.2.2.1 Logistics Support Planning

"The PM shall have a comprehensive plan for HSI in place early in the acquisition process to optimize total system performance, minimize Total Ownership Costs [TOCs], and ensure that the system is built to accommodate the characteristics of the user population that will operate, maintain, and support the system." HSI includes design for: human factors engineering; personnel; habitability; manpower; training; Environment, Safety, and Occupational Health (ESOH); and survivability. 2 The logistics support manager for a materiel acquisition system is responsible for developing the logistics support planning, which documents planning for and implementing the support of the fielded system. It is initially prepared during preparation for program initiation, and progressively developed in more detail as the system moves through the acquisition phases. Identification of the specific logistics support test issues related to the individual logistics elements and the overall system support and readiness objectives is included.

The logistics support manager is assisted throughout the system's development by a Logistics Support Integrated Product Team (IPT), which is formed early in the acquisition cycle. The Logistics Support IPT is a coordination/advisory group composed of personnel from the Program Management Office (PMO), the using command, and other commands concerned with acquisition activities such as logistics, testing, and training.

19.2.2.2 Supportability Assessment Planning

Based upon suitability objectives, the logistics support manager, in conjunction with the system's Test Manager (TM), develops suitability assessment planning. This planning identifies the testing approach and the evaluation criteria that will be used to assess the supportability-related design requirements (e.g., Reliability and Maintainability (R&M)) and the adequacy of the planned logistics support resources for the materiel system. Development of the suitability T&E planning begins during concept refinement; the planning is then updated and refined in each successive acquisition phase. The logistics support manager may apply the best practices of logistics support analysis as described in Military Standard (MIL-STD)-1388-1A. 3
The T&E strategy is formulated, T&E program objectives and criteria are established, and required test resources are identified. The logistics support manager ensures that the T&E strategy is based upon quantified supportability requirements and addresses supportability issues, including those with a high degree of associated risk. Also, the logistics support manager ensures that the necessary quantities and types of data will be collected during system development and after deployment of the system to validate the various T&E objectives. The T&E objectives and criteria must provide a basis that ensures critical supportability issues and requirements are resolved or achieved within acceptable confidence levels.

19.2.2.3 Test and Evaluation Master Plan

The Program Manager (PM) must include suitability T&E information in the TEMP as specified in the Defense Acquisition Guidebook.


The input, derived from logistics supportability planning with the assistance of the logistics support manager and the tester, includes descriptions of required operational suitability, specific plans for testing logistics supportability, and required testing resources. It is of critical importance that all key test resources required for ILS testing (DT, OT, and post-deployment supportability) be identified in the TEMP because the TEMP provides a long-range alert upon which test resources are budgeted and obtained for testing.

19.2.3 Planning Guidelines for Logistics Support System T&E

The following guidelines were selected from those listed in the Acquisition Logistics Guide: 4

(1) Develop a test strategy for each logistics support-related objective. Ensure that OT&E planning encompasses all logistics support elements. The general objectives shown in Figure 19-1 must be translated into detailed quantitative and qualitative requirements for each acquisition phase and each T&E program.

(2) Incorporate logistics support testing requirements (where feasible) into the formal DT&E/OT&E plans.

(3) Identify logistics support T&E that will be performed outside of the normal DT&E and OT&E. Include subsystems that require off-system evaluation.

(4) Identify all required resources, including test articles and logistics support items for formal DT/OT and separate logistics support system testing (participate with the test planner).

(5) Ensure establishment of an operationally realistic test environment, to include personnel representative of those who will eventually operate and maintain the fielded system. These personnel should be trained for the test using prototypes of the actual training courses and devices. They should be supplied with drafts of all technical manuals and documentation that will be used with the fielded system.

(6) Ensure planned OT&E will provide sufficient data on high-cost and high-maintenance-burden items (e.g., for high-cost critical spares, early test results can be used to reevaluate selection).

(7) Participate early and effectively in the TEMP development process to ensure the TEMP includes critical logistics T&E-designated test funds from program and budget documents.

(8) Identify the planned utilization of all data collected during the assessments to avoid mismatching of data collection and information requirements.

Additional guidance may be found in the Logistics Test and Evaluation Handbook developed by the 412th Logistics Group, Edwards Air Force Base, California.

19.3 CONDUCTING LOGISTICS SUPPORT SYSTEM T&E

19.3.1 The Process

The purposes of logistics support system T&E are to measure the supportability of a developing system throughout the acquisition process, to identify supportability deficiencies and potential corrections/improvements as test data become available, and to assess the operational suitability of the planned support system. It also evaluates the system's ability to achieve planned readiness objectives for the system/equipment being developed. Specific logistics support system T&E tasks (guidance in MIL-STD-1388-1A) include: 5


• Analysis of test results to verify achievement of specified supportability requirements;

• Determination of improvements in supportability and supportability-related design parameters needed for the system to meet established goals and thresholds;

• Identification of areas where established goals and thresholds have not been demonstrated within acceptable confidence levels;

• Development of corrections for identified supportability problems such as modifications to hardware, software, support plans, logistics support resources, or operational tactics;

• Projection of changes in costs, readiness, and logistics support resources due to implementation of corrections; and

• Analysis of supportability data from the deployed system to verify achievement of the established goals and thresholds and, where operational results deviate from projections, determination of the causes and corrective actions.

Logistics support system T&E may consist of a series of logistics demonstrations and assessments that are usually conducted as part of system performance tests. Special end-item equipment tests are rarely conducted solely for logistics parameter evaluation.

19.3.2 Reliability and Maintainability

System availability is generally considered to be composed of two major system characteristics: Reliability and Maintainability (R&M). Reliability, Availability, and Maintainability (RAM) requirements should be based on operational requirements and Life Cycle Cost (LCC) considerations; stated in quantifiable, operational terms; measurable during developmental and operational test and evaluation; and defined for all elements of the system, including support and training equipment.

Relevant definitions from the DAU Glossary: 6

Reliability (R) is the ability of a system and its parts to perform its mission without failure, degradation, or demand on the support system. A common Measure of Effectiveness (MOE) has been Mean Time Between Failure (MTBF).

Maintainability (M) is the ability of an item to be retained in or restored to a specified condition when maintenance is performed by personnel having specific skill levels, using prescribed procedures and resources, at each prescribed level of maintenance and repair. A common MOE has been Mean Time To Repair (MTTR).

Operational Reliability and Maintainability Value is any measure of reliability or maintainability that includes the combined effects of item design, quality, installation, environment, operation, maintenance, and repair.

The R and M program objectives are to be defined as system parameters early in the development process. They will be used as evaluation criteria throughout the design, development, and production processes. R&M objectives should be translated into quantifiable contractual terms and allocated through the system design hierarchy. An understanding of how this allocation affects testing operating characteristics below system level can be found in T&E of System Reliability, Availability and Maintainability. 7 This is especially important to testing organizations expected to make early predictions of system performance. Guidance on testing reliability may also be found in MIL-STD-781. 8
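Allocation of an R&M objective through the design hierarchy can be illustrated with simple arithmetic. The sketch below is a minimal, hypothetical example only (the guide does not prescribe a particular allocation method); it assumes a purely series configuration with constant (exponential) failure rates, and the subsystem names, weights, and the 500-hour objective are illustrative assumptions.

# Minimal sketch: allocating a system MTBF objective across subsystems,
# assuming a series configuration with constant (exponential) failure rates.
# Subsystem names, weights, and the 500-hour objective are hypothetical.

system_mtbf_objective = 500.0  # hours (hypothetical contractual objective)

# Relative complexity weights used to apportion the allowed failure rate.
subsystem_weights = {"radar": 0.5, "processor": 0.3, "display": 0.2}

# In a series system with exponential failures, failure rates add:
# lambda_system = sum(lambda_i), so each subsystem gets a share of the budget.
system_failure_rate = 1.0 / system_mtbf_objective

for name, weight in subsystem_weights.items():
    allocated_rate = weight * system_failure_rate
    print(f"{name}: allocated MTBF >= {1.0 / allocated_rate:.0f} hours")

# Check: the series combination of the allocations meets the system objective.
combined_mtbf = 1.0 / sum(w * system_failure_rate for w in subsystem_weights.values())
print(f"combined system MTBF: {combined_mtbf:.0f} hours")

Testing organizations can run the same arithmetic in reverse, rolling demonstrated subsystem values up into an early prediction of system-level reliability.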
Reliability, Availability, and Maintainability (also known as RAM) are requirements imposed on acquisition systems to insure they are operationally ready for use when needed, will successfully perform assigned functions, and can be economically operated and maintained within the scope of logistics concepts and policies. RAM development programs are applicable to materiel systems, test measurement and diagnostic equipment, training devices, and facilities supporting weapon systems.

Availability is a measure of the degree to which an item is in an operable state and can be committed at the start of a mission when the mission is called for at an unknown (random) point in time.

Availability commonly uses different MOEs based on the maturity of the weapon system technology. Achieved Availability (Aa) is measured for the early prototype and EDM systems developed by the system contractor when the system is not operating in its normal environment. Inherent Availability (Ai) is measured with respect only to operating time and corrective maintenance, ignoring standby/delay time and mean logistics delay time. It may be measured during DT&E or when testing in the contractor's facility under controlled conditions. Operational Availability (Ao), measured for mature systems that have been deployed in realistic operational environments, is the degree to which a piece of equipment works properly when it is required for a mission. It is calculated by dividing uptime by uptime plus all downtime. The calculation is sensitive to low utilization rates for equipment. Ao is the quantitative link between readiness objectives and supportability. 9
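The availability measures above reduce to simple arithmetic. The following sketch, using hypothetical numbers, computes Inherent Availability from MTBF and MTTR (operating time and corrective maintenance only, consistent with the description of Ai above) and Operational Availability as uptime divided by uptime plus all downtime, as described for Ao; it is an illustration, not a prescribed formula set.

# Hypothetical field data for one reporting period (hours).
mtbf = 400.0          # Mean Time Between Failure
mttr = 4.0            # Mean Time To Repair (corrective maintenance only)

# Inherent Availability: operating time and corrective maintenance only,
# ignoring standby/delay time and logistics delay time.
ai = mtbf / (mtbf + mttr)

# Operational Availability: uptime divided by uptime plus all downtime,
# where downtime includes maintenance, supply, and administrative delays.
uptime = 1800.0       # hours the system was mission-capable
downtime = 200.0      # all downtime, whatever the cause
ao = uptime / (uptime + downtime)

print(f"Inherent Availability (Ai):    {ai:.3f}")
print(f"Operational Availability (Ao): {ao:.3f}")

In such a calculation the gap between Ai and Ao is driven largely by standby and logistics delay time, which is why Ao, rather than Ai, is the quantitative link between readiness objectives and supportability.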


19.3.2.1 Reliability

Guidelines for reliability evaluation are to be published in a non-government standard to replace MIL-STD-785:

• Environmental Stress Screening (ESS) is a test, or series of tests during engineering development, specifically designed to identify weak parts or manufacturing defects. Test conditions should simulate failures typical of early field service rather than provide an operational life profile.

• Reliability Development/Growth Testing (RDT/RGT) is a systematic engineering process of Test-Analyze-Fix-Test (TAFT) where equipment is tested under actual, simulated, or accelerated environments. It is an iterative methodology intended to rapidly and steadily improve reliability. Test articles are usually subjected to ESS prior to beginning RDT/RGT to eliminate those with manufacturing defects.

• Reliability Qualification Test (RQT) is conducted to verify that threshold reliability requirements have been met before items are committed to production. A statistical test plan is used to predefine criteria, which will limit government risk (a simplified illustration of such a plan follows this list). Test conditions must be operationally realistic.

• Production Reliability Acceptance Test (PRAT) is intended to simulate in-service use of the delivered item or production lot. "Because it must provide a basis for determining contractual compliance, and because it applies to the items actually delivered to operational forces, PRAT must be independent of the supplier if at all possible." PRAT may require expensive test facilities, so 100 percent sampling is not recommended.

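As an illustration of how a statistical test plan predefines accept/reject criteria and bounds risk before an RQT or PRAT begins, the sketch below computes the probability of passing a fixed-duration test for several assumed true MTBF values, under the common assumption that failure times are exponential so the failure count is Poisson-distributed. The test length, accept number, and MTBF values are hypothetical, and this is only one of several ways such plans are constructed.

import math

def probability_of_acceptance(true_mtbf: float, test_hours: float, accept_failures: int) -> float:
    """P(pass) = P(observed failures <= accept_failures) for a fixed-length test,
    assuming exponential failure times (failure count is Poisson with mean T/MTBF)."""
    expected_failures = test_hours / true_mtbf
    return sum(math.exp(-expected_failures) * expected_failures**k / math.factorial(k)
               for k in range(accept_failures + 1))

# Hypothetical plan: 2,000 test hours, accept if 3 or fewer relevant failures occur.
test_hours, accept_failures = 2000.0, 3

for true_mtbf in (400.0, 600.0, 800.0, 1000.0):  # hypothetical true MTBF values
    p_accept = probability_of_acceptance(true_mtbf, test_hours, accept_failures)
    print(f"true MTBF {true_mtbf:5.0f} h -> probability of acceptance {p_accept:.2f}")

Tabulating the probability of acceptance against the true MTBF in this way makes the producer's and consumer's risks explicit, which is the sense in which predefined criteria limit government risk.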

19.3.2.2 Maintainability

Maintainability design factors and test/demonstration requirements used to evaluate maintainability characteristics must be based on program objectives and thresholds. Areas for evaluation might include: 10

• Accessibility: Assess how easily the item can be repaired or adjusted.

• Visibility: Assess the ability/need to see the item being repaired.

• Testability: Assess the ability to detect and isolate system faults to the faulty replaceable assembly level.

• Complexity: Assess the impact of the number, location, and characteristics (standard or special purpose) of components on system maintenance.

• Interchangeability: Assess the level of difficulty encountered when failed or malfunctioning parts are removed and replaced with an identical part not requiring recalibration.

A true assessment of system maintainability generally must be developed at the system level under operating conditions and using production configuration hardware. Therefore, a maintainability demonstration should be conducted prior to Full Rate Production (FRP). 11

19.3.3 T&E of System Support Package

The T&E of the support for a materiel system requires a system support package consisting of spares, support equipment, technical documents and publications, representative personnel, any peculiar support requirements, and the test article itself; in short, all of the items that would eventually be required when the system is operational. This complete support package must be at the test site before the test is scheduled to begin. Delays in the availability of certain support items could prevent the test from proceeding on schedule. This could be costly, with onsite support personnel on hold or tightly scheduled ranges and expensive test resources not being properly utilized. It could also result in the test proceeding without a complete evaluation of the support system. The logistics support test planner must ensure that the required personnel are trained and available, the test facility scheduling is flexible enough to permit normal delays, and the test support package is on site, on time.

19.3.4 Data Collection and Analysis

The logistics support manager must coordinate with the testers to ensure that the methods used for collection, storage, and extraction of logistics support system T&E data are compatible with those used in testing the materiel system. As with any testing, the test planning must ensure that all required data are identified and are sufficient to evaluate the system's readiness and supportability, and that plans are made for a data management system capable of the data classification, storage, retrieval, and reduction necessary for statistical analysis. Large statistical sample sizes may require a common database that integrates contractor, DT&E, and OT&E data so required performance parameters can be demonstrated.
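Where a common database integrates contractor, DT&E, and OT&E results, the pooled records can be reduced to a single demonstrated value with a larger effective sample size. The sketch below is a minimal illustration with hypothetical sources and numbers; it simply pools operating hours and relevant failures to form a combined point estimate of MTBF.

# Hypothetical RAM records drawn from an integrated contractor/DT&E/OT&E database.
records = [
    {"source": "contractor", "operating_hours": 1200.0, "relevant_failures": 3},
    {"source": "DT&E",       "operating_hours":  800.0, "relevant_failures": 2},
    {"source": "OT&E",       "operating_hours":  600.0, "relevant_failures": 2},
]

total_hours = sum(r["operating_hours"] for r in records)
total_failures = sum(r["relevant_failures"] for r in records)

# Pooled point estimate of MTBF (hours per relevant failure across all sources).
pooled_mtbf = total_hours / total_failures
print(f"pooled exposure: {total_hours:.0f} h, failures: {total_failures}")
print(f"pooled MTBF point estimate: {pooled_mtbf:.0f} h")

Pooling of this kind is only meaningful if failure definitions and scoring rules are consistent across the sources, which is one reason the collection methods must be compatible.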
19.3.5 Use of Logistics Support System Test Results

The emphasis on the use of the results of testing changes as the program moves from the concept assessments to post-deployment. During early phases of a program, the evaluation results are used primarily to verify analysis and develop future projections. As the program moves into advanced engineering development and hardware becomes available, the evaluation addresses design, particularly the R&M aspects; training programs; support equipment adequacy; personnel skills and availability; and technical publications.

The logistics support system manager must make the PM aware of the impact on the program of logistical shortcomings that are identified during the T&E process. The PM, in turn, must ensure that the solutions to any shortcomings are identified and reflected in the revised specifications and that the revised test requirements are included in the updated TEMP as the program proceeds through the various acquisition stages.


19.4 LIMITATIONS TO LOGISTICS SUPPORT SYSTEM T&E

Concurrent testing or tests that have accelerated schedules frequently do not have sufficient test articles, equipment, or hardware to achieve statistical confidence in the testing conducted. An acquisition strategy may stipulate that support resources such as operator and maintenance manuals, tools, support equipment, training devices, etc., for major weapon system components will not be procured before the weapon system/component hardware and software design stabilizes. This shortage of equipment is often the reason that shelf-life and service-life testing is incomplete, leaving the logistics support system evaluator with insufficient data to predict future performance of the test item. Some evaluations must measure performance against a point on the parameter's growth curve. The logistics support system testing will continue post-production to obtain required sample sizes for verifying performance criteria. Many aspects of the logistics support system may not be available for Initial Operational Test and Evaluation (IOT&E) and become testing limitations. The PMO must develop enough logistics support to ensure the user can maintain the system during IOT&E without requiring system contractor involvement (legislated constraints). Any logistics support system limitations upon IOT&E will likely be evaluated during Follow-on Operational Test and Evaluation (FOT&E).

19.5 SUMMARY

T&E are the logisticians' tools for measuring the ability of the planned support system to fulfill the materiel system's readiness and supportability objectives. The effectiveness of logistics support system T&E is based upon the completeness and timeliness of the planning effort.

The logistics support system T&E requirements must be an integral part of the TEMP to ensure budgeting and scheduling of required test resources. Data requirements must be completely identified, with adequate plans made for collection, storage, retrieval, and reduction of test data. At the Full Rate Production Decision Review (FRPDR), decision makers can expect that some logistics system performance parameters will not have finished testing because of the large sample sizes required for high confidence levels in statistical analysis.


ENDNOTES

1. DoD Directive (DoDD) 5000.1, The Defense Acquisition System, May 12, 2003, p. 11.
2. DoD Instruction (DoDI) 5000.2, Operation of the Defense Acquisition System, May 12, 2003, p. 43.
3. MIL-STD-1388-1A, Logistics Support Analysis. Cancelled.
4. Defense Systems Management College, Acquisition Logistics Guide, December 1997.
5. MIL-STD-1388-1A.
6. Defense Acquisition University, Glossary of Defense Acquisition Acronyms & Terms, 11th edition, September 2003.
7. DoD 3235.1-H, Department of Defense T&E of System Reliability, Availability and Maintainability, March 1982.
8. MIL-STD-781, Reliability Testing for Engineering Development, Qualification, and Production. Cancelled.
9. DoD 3235.1-H.
10. Ibid.
11. Guidelines in MIL-HDBK-470, Designing and Developing Maintainable Products and Systems, 1998.




20
EC/C4ISR TEST AND EVALUATION

20.1 INTRODUCTION

Testing of Electronic Combat (EC) and Command, Control, Communications, Computers, Intelligence, Surveillance, and Reconnaissance (C4ISR) systems poses unique problems for testers because of the difficulty in measuring their operational performance. Compatibility, Interoperability, and Integration (CII) are key performance areas for these systems. Special testing techniques and facilities are normally required in EC and C4ISR testing. This chapter discusses the problems associated with EC and C4ISR testing and presents methodologies the tester can consider using to overcome the problems.

20.2 TESTING EC SYSTEMS

20.2.1 Special Considerations When Testing EC Systems

Electronic combat systems operate across the electromagnetic spectrum, performing offensive and defensive support roles. Configurations vary from subsystem components to full-up independent systems. The EC systems are used to increase survivability, degrade enemy capability, and contribute to the overall success of the combat mission. Decision makers want to know the incremental contribution to total force effectiveness made by a new EC system when measured in a force-on-force engagement. However, the contractual specifications for EC systems are usually stated in terms of engineering parameters such as effective radiated power, reduction in communications intelligibility, and jamming-to-signal ratio. These measures are of little use by themselves in assessing contribution to mission success.
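To make the limitation of such engineering parameters concrete, the sketch below converts effective radiated power and range figures into a jamming-to-signal ratio for a single link. It is a deliberately simplified, hypothetical calculation (free-space propagation, a co-channel jammer, and equal receive-antenna gain toward both emitters are assumed, and every number is notional); a J/S figure of this kind says nothing by itself about mission success in a force-on-force context, which is precisely the tester's extrapolation problem.

import math

def js_ratio_db(jammer_erp_w: float, signal_erp_w: float,
                signal_range_km: float, jammer_range_km: float) -> float:
    """Simplified free-space jamming-to-signal ratio at a victim receiver, in dB.
    Assumes a co-channel jammer and equal receive-antenna gain toward both emitters."""
    power_ratio = jammer_erp_w / signal_erp_w
    range_ratio = (signal_range_km / jammer_range_km) ** 2
    return 10.0 * math.log10(power_ratio * range_ratio)

# Hypothetical one-on-one geometry: 100 W signal at 20 km, 500 W jammer at 60 km.
js_db = js_ratio_db(jammer_erp_w=500.0, signal_erp_w=100.0,
                    signal_range_km=20.0, jammer_range_km=60.0)
print(f"J/S at the receiver: {js_db:.1f} dB")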
The decision makers require that testing be conducted under realistic operational conditions; but the major field test ranges, such as the shoreline at Eglin Air Force Base (AFB), Florida, or the desert at Nellis AFB, Nevada, cannot provide the signal density or realism of threats that would be presented by regional conflicts in central Europe. In field testing, the tester can achieve one-on-one or, at best, few-on-few testing conditions. The tester therefore needs a methodology that will permit extrapolation of engineering measurements and one-on-one test events to create more operationally meaningful measures of mission success in a force-on-force context, usually under simulated conditions.

20.2.2 Integrated Test Approach

An integrated approach to EC testing using a combination of large-scale models, computer simulations, hybrid man-in-the-loop simulators, and field test ranges is a solution for the EC tester. No tool by itself is adequate to provide a comprehensive evaluation. Simulation, both digital and hybrid, can provide a means for efficient test execution. Computer models can be used to simulate many different test cases to aid the tester in assessing the critical test issues (i.e., sensitivity analysis) and produce a comprehensive set of predicted results. As digital simulation models are validated with empirical data from testing, they can be used to evaluate the system under test in a more dense and complex threat environment and at expected wartime levels. In addition, the field test results are used to validate the model, and the model is used to validate the field tests, thus lending more credibility to both results.


Hybrid man-in-the-loop simulators, such as the Real-Time Electromagnetic Digitally Controlled Analyzer and Processor (REDCAP) and the Air Force Electronic Warfare Evaluation Simulator (AFEWES), can provide a capability to test against new threats. Hybrid simulators cost less and are safer than field testing. The field test ranges are used when a wider range of actions and reactions by aircraft and ground threat system operations is required.

Where one tool is weak, another may be strong. By using all the tools, an EC tester can do a complete job of testing. An example of an integrated methodology is shown in Figure 20-1. The EC integrated testing can be summarized as:

(1) Initial modeling phase for sensitivity analysis and test planning;

(2) Active test phases at hybrid laboratory simulator and field range facilities;

(3) Test data reduction and analysis;

(4) Post-test modeling phase repeating the first step using test data for extrapolation;

(5) Force effectiveness modeling and analysis phase to determine the incremental contribution of the new system to total force effectiveness.

Figure 20-1. Integrated EC Testing Approach

(Figure 20-1 shows the flow from the system description, assumptions, and test design through a many-on-many model that identifies critical test issues and expected results; test conduct in the hybrid laboratory simulator and in field testing, with feedback and validation between them; data reduction and analysis; evaluation of test results in the same many-on-many model for system effectiveness; and a force-on-force model for force effectiveness.)


Another alternative is the electronic combat test process. 1 The six-step process described here is graphically represented by Figure 20-2:

(1) Deriving test requirements;

(2) Conducting pre-test analysis to predict EC system performance;

(3) Conducting test sequences under progressively more rigorous ground- and flight-test conditions;

(4) Processing test data;

(5) Conducting post-test analysis and evaluation of operational effectiveness and suitability;

(6) Feeding results back to the system development and employment process.

Figure 20-2. EC Test Process Concept

(Figure 20-2 relates system development activities (requirements, design, specification, production, and operations) to computer-simulation-aided analysis, which structures the test, predicts results before test, and after test correlates predicted and measured values of Technical Performance Parameters (TPPs) and Measures of Effectiveness (MOEs), updates models and test design, and extrapolates; and to testing in measurement facilities, integration laboratories, hardware-in-the-loop test facilities, installed test facilities, and flight test ranges. Source: USAF Electronic Combat Development Test Process Guide.)

As can be seen from Figure 20-3, assuming a limited budget, with field testing the most expensive per trial, the cost of test trials forces the developer and tester to make tradeoffs to obtain the necessary test data. Many more iterations of a computer simulation can be run for the cost of an open-air test.

Figure 20-3. EC Test Resource Categories

(Figure 20-3 ranks the EC test resource categories (digital computer simulations, hybrid simulators, integration laboratories, hardware-in-the-loop test facilities, free-space simulators, installed test facilities, and open-air measurement facilities) against the number of trials that can typically be afforded as a program progresses over time. Source: USAF Electronic Combat Development Test Process Guide.)
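The cost-per-trial tradeoff in Figure 20-3 can be put in rough numbers. The sketch below is purely notional (every cost and the budget are hypothetical assumptions, not data from the figure); it simply shows how many trials of each resource category a fixed budget would buy.

# Notional cost per trial for several EC test resource categories (all values hypothetical).
cost_per_trial = {
    "digital computer simulation": 2_000,
    "hardware-in-the-loop facility": 50_000,
    "installed test facility": 150_000,
    "open-air range trial": 600_000,
}

budget = 3_000_000  # hypothetical T&E budget share for one test objective

for category, cost in cost_per_trial.items():
    print(f"{category:32s}: {budget // cost:5d} trials")

Arithmetic like this is why the bulk of the trial count in an integrated test program is carried by simulation, with open-air events reserved for validation.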


20.3 TESTING OF C4ISR SYSTEMS

20.3.1 Special Considerations When Testing C4ISR Systems

The purpose of a C4ISR system is to provide a commander with timely and relevant information to support sound warfighting decisionmaking. A variety of problems faces the C4ISR system tester. In evaluating command effectiveness, it is difficult to separate the contribution made by the C4ISR system from the contribution made by the commander's innate cognitive processes. To assess a C4ISR system in its operational environment, it must be connected to the other systems with which it would normally operate, making traceability of test results difficult. Additionally, modern C4ISR systems are software-intensive and highly interactive, with complex man-machine interfaces. Measuring C4ISR system effectiveness thus requires the tester to use properly trained user troops during the test and to closely monitor software Test and Evaluation (T&E). The C4ISR systems of the defense agencies and the Services (Army, Navy, Air Force, and Marine Corps) are expected to interoperate with each other and with those of the North Atlantic Treaty Organization (NATO) forces; hence, the tester must also ensure inter-Service and NATO CII.

20.3.2 C4I Test Facilities

Testing of Command, Control, Communications, Computers and Intelligence (C4I) systems will have to rely more on the use of computer simulations and C4I test-beds to assess their overall effectiveness. The Defense Information Systems Agency (DISA), which is responsible for ensuring interoperability among all U.S. tactical C4I systems that would be used in joint or combined operations, directs the Joint Interoperability Test Command (JITC) at Ft. Huachuca, Arizona. The JITC is a test-bed for C4I systems CII.


20.4 TRENDS IN TESTING C4I SYSTEMS

20.4.1 Evolutionary Acquisition of C4I Systems

Evolutionary Acquisition (EA) is a strategy designed to provide an early, useful capability even though detailed overall system requirements cannot be fully defined at the program's inception. The EA strategy contributes to a reduction in the risks involved in system acquisition, since the system is developed and tested in manageable increments. The C4I systems are likely candidates for EA because they are characterized by system requirements that are difficult to quantify or even articulate and that are expected to change as a function of scenario, mission, theater, threat, and emerging technology. Therefore, the risk associated with developing these systems can be very great.

Studies by the Defense Systems Management College 2 and the International Test and Evaluation Association (ITEA) have addressed the issues involved in the EA and testing of Command, Control, and Communications (C3) systems. The ITEA study illustrated EA in Figure 20-4 and stated that:

With regard to the tester's role in EA, the study group concluded that iterative test and evaluation is essential for success in an evolutionary acquisition. The tester must become involved early in the acquisition process and contribute throughout the development and fielding of the core and the subsequent increments. The testers contribute to the requirements process through feedback of test results to the user [and] must judge the ability of the system to evolve. 3

The testing of EA systems presents the tester with a unique challenge: the core system must be tested during fielding, and the first increment must be tested before the core testing is completed. This could lead to a situation where the tester has three or four tests ongoing on various increments of the same system. The PM must insist that the testing for EA systems be carefully planned to ensure the test data are shared by all and there is a minimum of repetition or duplication in testing.

20.4.2 Radio Vulnerability

The Radio Vulnerability Analysis (RVAN) methodology is used to assess the anti-jam capability and limitations of Radio Frequency (RF) data links operating in a hostile Electronic Countermeasures (ECM) environment. The RVAN evolved from the test methodologies developed for an Office of the Secretary of Defense (OSD)-chartered Joint Test on Data Link Vulnerability Analysis (DVAL). In 1983, OSD directed the Services to apply the DVAL methodology to all new data links being developed.

The purpose of the DVAL methodology is to identify and quantify the anti-jam capabilities and vulnerabilities of an RF data link operating in a hostile ECM environment.
The methodology is applied throughout the acquisition process and permits early identification of needed design modifications to reduce identified ECM vulnerabilities. The following four components determine a data link's Electronic Warfare (EW) vulnerability:

(1) The susceptibility of the data link, i.e., the receiver's performance when subjected to intentional threat ECM;

(2) The interceptibility of the data link, i.e., the degree to which the transmitter could be intercepted by enemy intercept equipment;

(3) The accessibility of the data link, i.e., the likelihood that a threat jammer could degrade the data link's performance;

(4) The feasibility that the enemy would intercept and jam the data link and successfully degrade its performance.


Figure 20-4. The Evolutionary Acquisition Process

(Figure 20-4 depicts the evolutionary acquisition flow: a capability need statement and R&D lead to system definition of an architecture and core, and the core and subsequent increments (Increment I, Increment II) each proceed through their own milestones, DT/OT, and field test, with requirements supported by brassboard, rapid prototyping, system simulation, etc. Source: ITEA Study, "Test & Evaluation of C2 Systems Developed by Evolutionary Acquisition.")

The analyst applying the DVAL methodology will require test data, and the Test Manager (TM) of the C3I system, of which the data link is a component, will be required to provide these data. The DVAL joint test methodologies and test results are on file as part of the Joint Test Library maintained by the Air Force Operational Test and Evaluation Center (AFOTEC), Kirtland AFB, New Mexico.

20.5 T&E OF SURVEILLANCE AND RECONNAISSANCE SYSTEMS

Intelligence, Surveillance, and Reconnaissance (ISR) capabilities provide the requisite battlespace awareness tools for U.S. Forces to take and hold the initiative, increase operating tempo, and concentrate power at the time and place of their choosing. These vital capabilities are achieved through highly classified sensor systems, ranging from satellites, aircraft, maritime vessels, electronic intercept, and the soldier in the field, to the systems required to analyze that data, synthesize it into usable information, and disseminate that information in a timely fashion to the warfighter. As a general rule, ISR systems are considered to be Joint Systems.


Because of the multifaceted nature of ISR programs and the classified nature of how data are gathered in the operational element, test planning to verify effectiveness and suitability can be complex. Adding to that inherent complexity is the variable nature of organizational guidance directives upon the tester. While the broad management principles enunciated by Department of Defense Directive (DoDD) 5000.1, The Defense Acquisition System, May 12, 2003, will apply to highly sensitive classified systems and cryptographic and intelligence programs, the detailed guidance contained in the Defense Acquisition Guidebook applies to Major Defense Acquisition Programs (MDAPs) and Major Automated Information Systems (MAISs). Many ISR programs fall below this threshold, and the wise TM should anticipate that several agencies will have taken advantage of this opening to tailor organizational guidance.

Key issues to consider for the T&E of ISR systems include: verification of compliance with the CII requirements contained in DoDD 4630.5, Interoperability and Supportability of Information Technology (IT) and National Security Systems (NSS), Chairman of the Joint Chiefs of Staff Instruction (CJCSI) 3170.01D, Joint Capabilities Integration and Development System, and CJCSI 6212.01C, Interoperability and Supportability of Information Technology and National Security Systems, 4 as certified by the JITC; completion of the system security certification required prior to processing classified or sensitive material; verification and documentation, in the Information Support Plan (ISP), of required support from interfaced C4ISR systems to ensure the availability and quality of required input data; characterization of the maturity of mission-critical software; finalization of the range of human factors analysis; and consideration of information operations vulnerabilities/capabilities. In addition to this partial listing, many of these systems will operate inside a matrix of ISR system architectures, which must be carefully considered for test planning purposes. As a final issue, Advanced Concept Technology Demonstration (ACTD) programs are being used to quickly deliver capability to the user because of the critical and time-sensitive nature of many ISR requirements. The TM must carefully consider structuring the T&E effort in light of the inherent nature of ACTD programs.

20.6 SUMMARY

The EC systems must be tested under conditions representative of the dense threat signal environments in which they will operate. The C4ISR systems must be tested in representative environments where their interaction and responsiveness can be demonstrated. The solution for the tester is an integrated approach using a combination of analytical models, computer simulations, hybrid laboratory simulators and test beds, and actual field testing. The tester must understand these test techniques and resources and apply them in EC and C4ISR T&E.


ENDNOTES

1. Proposed in the Air Force Electronic Combat Development Test Process Guide, May 1991, issued by what is now the Air Staff T&E Element, AF/TE.
2. Defense Systems Management College, Joint Logistics Commanders Guidance for the Use of an Evolutionary Acquisition (EA) Strategy in Acquiring Command and Control (C2) Systems, March 1987.
3. International Test and Evaluation Association (ITEA), Test and Evaluation of Command and Control Systems Developed by Evolutionary Acquisition, May 1987.
4. DoDD 4630.5, Interoperability and Supportability of Information Technology (IT) and National Security Systems (NSS), May 5, 2004; CJCSI 3170.01D, Joint Capabilities Integration and Development System, May 12, 2004; and CJCSI 6212.01C, Interoperability and Supportability of Information Technology and National Security Systems, November 20, 2003.


21
MULTI-SERVICE TESTS

21.1 INTRODUCTION

This chapter discusses the planning and management of a multi-Service test program. A multi-Service test program is conducted when a system is to be acquired for use by more than one Service or when a system must interface with equipment of another Service (Joint program). A multi-Service test program should not be confused with the Office of the Secretary of Defense (OSD)-sponsored, non-acquisition-oriented Joint Test and Evaluation (JT&E) program. A brief description of the JT&E program is provided in Chapter 6. The Defense Acquisition Guidebook provides guidance on management of joint acquisition programs. 1

21.2 BACKGROUND

The Defense Acquisition Guidebook states: the Component Acquisition Executive (CAE) of a designated acquisition agent given acquisition responsibilities shall utilize the acquisition and test organizations and facilities of the Military Departments to the maximum extent practicable; a designated joint program shall have…one integrated test program and one set of documentation (TEMP [Test and Evaluation Master Plan]) and reports; the MDA [Milestone Decision Authority] shall designate a lead OTA [Operational Test Agency] to coordinate all OT&E [Operational Test and Evaluation]; the lead OTA shall produce a single operational effectiveness and suitability report for the program; unless statute, the MDA, or a MOA [Memorandum of Agreement] signed by all DoD [Department of Defense] components directs otherwise, the lead executive component shall budget for and manage the common RDT&E [Research, Development, Test and Evaluation] funds for assigned joint programs; and individual components shall budget for their unique requirements.

Formulation of a multi-Service Test and Evaluation (T&E) program designates the participants in the program and gives a Lead Service responsibility for preparing a single report concerning a system's operational effectiveness and suitability. (The Lead Service is the Service responsible for the overall management of a joint acquisition program. A "Supporting Service" is a Service designated as a participant in the system acquisition.)

A multi-Service T&E program may include either Development Test and Evaluation (DT&E) or Operational Test and Evaluation (OT&E) or both. The Services' OTAs have executed a formal MOA on multi-Service OT&E that provides a framework for the conduct of a multi-Service Operational Test (OT) program. 2 The Defense Information Systems Agency (DISA) has signed the MOA in relation to the Joint Interoperability Test and certification for multi-Service T&E. The MOA includes a glossary of common and Service-peculiar terms used in multi-Service T&E. 3

Generally, the process includes:
(1) In a multi-Service acquisition program, T&E is planned and conducted according to Lead Service regulations. The designated Lead Service will have the overall responsibility for management of the multi-Service program and will ensure that supporting Service requirements are included. If another Service has certain unique T&E requirements, testing for these unique requirements may be planned, funded, and conducted according to that Service's regulations.


(2) Participating Services will prepare reports in accordance with their respective regulations. The Lead Service will prepare and coordinate a single DT&E report and a single OT&E report, which will summarize the conclusions and recommendations of each Service's reports. Rationale will be provided to explain any significant differences. The individual Service reports may be attached to this single report.

(3) Deviations from the Lead Service T&E regulations may be accommodated by mutual agreement among the Services involved.

21.3 TEST PROGRAM RESPONSIBILITIES

The Lead Service has overall management responsibility for the program. It must ensure that supporting Service requirements are included in the formulation of the basic resource and planning documents.

A multi-Service T&E Integrated Product Team (IPT) should be established for each multi-Service test program. Its membership consists of one senior representative from each participating Service or agency headquarters. The T&E IPT works closely with the Program Management Office (PMO) and is responsible for arbitrating all disagreements among Services that cannot be resolved at the working level.

Figure 21-1. Simple Multi-Service OT&E Test Team Composition

(Figure 21-1 shows a representative multi-Service OT&E test team: under the lead OT&E agency and a test management council, an OT&E test director from the Lead Service, supported by an assistant for plans and test operations, oversees OT&E deputy test directors from AFOTEC, ATEC, MCOTEA, OPTEVFOR, and, for complex programs with many participants, other agencies, each with its own Service test team structure. Source: Memorandum of Agreement on Multi-Service OT&E and Joint T&E, August 2003.)


Resource requirements are documented in the TEMP. Each participating Service is directed to budget for the testing necessary to accomplish its assigned test objectives and for the participation of its personnel and equipment in the entire test program. Separate annexes may be used to address each Service's test requirements. Funding will be in accordance with Public Law and DoD 7000.14-R.4

21.4 TEST TEAM STRUCTURE

A sample test team structure is shown in Figure 21-1. As shown in the figure, Service test teams work through a Service deputy test director or senior representative. The test director exercises test management authority, but not operational control, over the test teams; the test director's responsibilities include integration of test requirements and efficient scheduling of test events. The deputy test directors exercise operational control or test management authority over their Service test teams in accordance with their Service directives. Additionally, they act as advisers to the test director; represent their Service's interests; and are responsible, at least administratively, for resources and personnel provided by their Services.

21.5 TEST PLANNING

Test planning for multi-Service T&E is accomplished in the manner prescribed by Lead Service directions and in accordance with the following general procedures extracted from the "Memorandum of Agreement on Multi-Service OT&E [MOT&E] and Joint T&E":5

(1) The Lead Service T&E agency begins the planning process by issuing a call to the supporting Service T&E agencies for critical issues and test objectives.

(2) The Lead Service T&E agency consolidates the objectives into a list and coordinates the list with the supporting Service T&E agencies.

(3) The Lead Service T&E agency accommodates supporting Service T&E requirements and input in the formal coordination action of the TEMP.

(4) Participating T&E agency project officers assign responsibility for the accomplishment of test objectives (from the consolidated list) to each T&E agency. These assignments are made in a mutually agreeable manner. Each agency is then responsible for resource identification and accomplishment of its assigned test objectives under the direction of the Lead Service T&E agency.

(5) Each participating agency prepares the portion of the overall test plan(s) for its assigned objectives, in the Lead Service test plan(s) format, and identifies its data needs.

(6) The Lead Service T&E agency prepares the multi-Service T&E test plan(s), consolidating the input from all participating agencies.

21.6 DEFICIENCY REPORTING

In a multi-Service T&E program, a deficiency report is a report of any condition that reflects adversely on the item being tested and that must be reported outside the test team for corrective action. The deficiency reporting system of the Lead Service is normally used. All members of the multi-Service test team will report deficiencies through their Service's system.

Items undergoing test will not necessarily be used by each of the Services for identical purposes. As a result, a deficiency considered disqualifying by one Service is not necessarily disqualifying for all Services. Deficiency reports of a disqualifying nature must include a statement by the concerned Service of why the deficiency has been so classified, as well as statements by the other Services as to whether or not the deficiency affects them significantly.
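The multi-Service character of deficiency classification can be pictured as a simple record in which the Lead Service's report carries one assessment per participating Service. The following sketch is purely illustrative; the field names and the Python representation are editorial assumptions and are not drawn from the MOA or from any Service's deficiency reporting system.

from dataclasses import dataclass, field

@dataclass
class ServiceAssessment:
    """One Service's required statement on a reported deficiency."""
    service: str          # e.g., "Army", "Navy", "Air Force", "Marine Corps"
    disqualifying: bool   # does this Service consider the deficiency disqualifying?
    rationale: str        # why it is (or is not) significant for this Service

@dataclass
class DeficiencyReport:
    """A deficiency raised by the multi-Service test team and reported
    through the Lead Service's deficiency reporting system."""
    item: str
    description: str
    reported_by: str
    assessments: list[ServiceAssessment] = field(default_factory=list)

    def disqualifying_for(self) -> list[str]:
        """Services for which the deficiency has been classified as disqualifying."""
        return [a.service for a in self.assessments if a.disqualifying]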


If one of the participating Services identifies a deficiency that it considers as warranting termination of the test, the circumstances are reported immediately to the test director.

21.7 TEST REPORTING

The following test-reporting policy applies to multi-Service IOT&E:

(1) The Lead OTA will prepare and coordinate the report reflecting the system's operational effectiveness and suitability for each Service. Interim reports will not normally be prepared.

(2) The report will integrate the different operational requirements and operational environments of the involved Services.

(3) It will state findings, put those findings into perspective, and present the rationale for why there is or is not consensus on the utility of the system.

(4) All participating OTAs will sign the report.

(5) Each participating OTA may prepare an Independent Evaluation Report (IER) or final test report, as required, in its own format and process that report through normal Service channels.

(6) The Lead OTA will ensure that all separate participating Service IERs/test reports are appended to the overall final report prepared by the Lead OTA for submission to the MDA.

(7) A single integrated multi-Service report will be submitted 90 calendar days after the official end of test is declared, but not later than 45 days prior to a milestone decision (an illustrative date calculation for this timeline follows Section 21.8).

21.8 SUMMARY

Multi-Service test programs are conducted by two or more Services when a system is to be acquired for employment by more than one Service or when a system must interface with equipment of another Service. Test procedures for multi-Service T&E follow those of the designated Lead Service, with mutual agreements resolving areas where deviations are necessary. The MOA on OT&E procedures provides guidance to the OTAs. Care must be exercised when integrating test results and reporting discrepancies, since items undergoing testing may be used for different purposes in different Services. Close coordination is required to ensure that an accurate summary of the developing system's capabilities is provided to Service and DoD decision authorities.
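The reporting timeline in item (7) of Section 21.7 can be expressed as a short date calculation. The function below is an illustrative sketch only; it assumes simple calendar-day arithmetic and does not reflect any Service-specific scheduling or staffing rules.

from datetime import date, timedelta

def report_due_date(end_of_test: date, milestone_decision: date) -> date:
    """Latest acceptable submission date for the single integrated
    multi-Service report: 90 calendar days after the official end of
    test, but not later than 45 days prior to the milestone decision."""
    ninety_days_after_test = end_of_test + timedelta(days=90)
    forty_five_days_before_milestone = milestone_decision - timedelta(days=45)
    return min(ninety_days_after_test, forty_five_days_before_milestone)

# Example: test ends 1 March 2005 and the milestone review is 1 July 2005.
print(report_due_date(date(2005, 3, 1), date(2005, 7, 1)))
# 2005-05-17 -- the 45-days-before-milestone constraint governs in this case.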


ENDNOTES

1. Defense Acquisition Guidebook, Chapter 7, "Acquiring Information Technology and National Security Systems," http://akss.dau.mil/DAG.

2. Memorandum of Agreement on Multi-Service OT&E, with changes, August 2003.

3. AFOTEC 99-103, Capabilities Test and Evaluation, October 22, 2001, and Department of the Army Pamphlet (DA PAM) 73-1, Test and Evaluation Policy, January 7, 2002, provide guidance for procedures followed in a multi-Service T&E program.

4. DoD Financial Management Regulation (FMR) 7000.14-R, Vol. 02B, Budget Formulation and Presentation, Chapter 5, "Research, Development, Test, and Evaluation Appropriations," June 2004.

5. Memorandum of Agreement on Multi-Service OT&E and Joint T&E, August 2003.




22
INTERNATIONAL TEST AND EVALUATION PROGRAMS

22.1 INTRODUCTION

This chapter discusses Test and Evaluation (T&E) from an international perspective. It describes the Office of the Secretary of Defense (OSD)-sponsored Foreign Comparative Test (FCT) Program1 and the International Test Operations Procedures (ITOPs). Factors that bear on the T&E of multinational acquisition programs are also discussed.

22.2 FOREIGN COMPARATIVE TEST PROGRAM

22.2.1 Program Objective

The FCT Program is designed to support the evaluation of a foreign nation's weapon system, equipment, or technology in terms of its potential to meet a valid requirement of one or more of the U.S. Armed Forces. Additional goals of the FCT Program include avoiding unnecessary duplication in development, enhancing standardization and interoperability, and promoting international technology exchanges. The FCT Program is not intended for use in exploiting threat systems or for intelligence gathering. The primary objective of the program is to support the U.S. national policy of encouraging international armaments cooperation and to reduce the costs of Research and Development (R&D). Policy and procedures for the execution of the program were originally documented in DoD 5000.3-M-2; they can now be found in DoD Instruction (DoDI) 5000.2 and the Foreign Comparative Test Handbook.2

22.2.2 Program Administration

Foreign Weapons Evaluation (FWE) activities and responsibilities are assigned to the Director, Comparative Test Office (CTO), Director of Defense Research and Engineering (DDR&E), OSD. Each year, sponsoring military Services forward Candidate Nomination Proposals (CNPs) for systems to be evaluated under the FCT Program to the Director, CTO. The Services are encouraged to prepare and submit a CNP whenever they find a promising candidate that appears to satisfy a current or potential Service requirement. A CNP must contain the information required by the FCT Handbook.3

The fundamental criterion for FCT program selection is the candidate system's potential to satisfy existing or projected operational or training requirements. Its possible contribution to the U.S. technology base is also considered. Additional factors influencing candidate selection include: candidate maturity, available test data, multi-Service interest, existence of a statement of operational requirement or need, potential for subsequent procurement, sponsorship by a U.S.-based licensee, realistic evaluation schedule and cost, a DoD component/OSD evaluation cost-sharing proposal, and preprogrammed procurement funds. For technology evaluation programs within the FCT Program, the candidate nomination proposal must address the specific arrangements under which the United States and foreign participants (governments, armed forces, corporations) will operate. These may include government-to-government Memoranda of Agreement (MOA), private industry licensing agreements, data exchange agreements, and/or cooperative technology exchange programs.
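The selection factors listed above amount to a screening checklist for a Candidate Nomination Proposal. The fragment below is a notional illustration of such a screen; the factor names are taken from the text, but the dictionary representation, the hard gate, and the function itself are editorial assumptions and do not appear in the FCT Handbook.

# Notional CNP screen; the structure and logic are illustrative only.
SELECTION_FACTORS = [
    "candidate_maturity",
    "test_data_available",
    "multi_service_interest",
    "statement_of_operational_need",
    "potential_for_subsequent_procurement",
    "us_based_licensee_sponsorship",
    "realistic_evaluation_schedule_and_cost",
    "component_osd_cost_sharing_proposal",
    "preprogrammed_procurement_funds",
]

def screen_cnp(cnp: dict) -> tuple:
    """Return (meets the fundamental criterion?, factors still unaddressed).

    The fundamental criterion -- potential to satisfy an existing or
    projected operational or training requirement -- is treated as a
    hard gate; the remaining factors simply inform the nomination.
    """
    meets_gate = cnp.get("satisfies_operational_or_training_requirement", False)
    missing = [f for f in SELECTION_FACTORS if not cnp.get(f, False)]
    return meets_gate, missing

# Hypothetical candidate:
ok, gaps = screen_cnp({"satisfies_operational_or_training_requirement": True,
                       "candidate_maturity": True, "test_data_available": True})
print(ok, gaps)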


FWE activities are funded by OSD and executed by the Service with the potential need for the system. Points of contact at the headquarters level in each Service monitor the conduct of the programs. Work is performed in laboratories and test centers throughout the country. Systems evaluated recently under the FCT Program include millimeter wave communications equipment, chemical defense equipment, gunnery devices, maritime decoys, and navigational systems. The Under Secretary of Defense for Acquisition, Technology, and Logistics (USD(AT&L)) shall notify Congress a minimum of 30 days prior to the commitment of funds for initiation of new FCT evaluations.4

22.3 NATO COMPARATIVE TEST PROGRAM

The North Atlantic Treaty Organization (NATO) Comparative Test Program has been integrated with the FCT Program. It was created by an act of Congress in the Fiscal Year (FY) 86 Defense Authorization Act. The program supported the evaluation of NATO nations' weapon systems, equipment, and technology, and assessed their suitability for use by U.S. forces. The selection criteria for the NATO Comparative Test Program were essentially the same as for the FCT Program. The exception was that the equipment had to be produced by a NATO member nation and be considered as an alternative to a system that was either in a late stage of development in the United States or that offered a cost, schedule, or performance advantage over U.S. equipment. In addition, the NATO Comparative Test Program required that notification be sent to the Armed Services and Appropriations Committees of the House of Representatives and Senate before funds were obligated. With this exception, the NATO Comparative Test Program followed the same nomination process and administrative procedures. Guidelines for the program were also contained in the FCT Handbook.

Examples of proposals funded under the NATO Comparative Test Program included T&E of a German mine reconnaissance and detection system for the Army, a United Kingdom-designed mine sweeper for the Navy, and the Norwegian Penguin missile system for the Air Force. According to the FY88 Report of the Secretary of Defense (SECDEF) to the Congress, the program generated considerable interest among NATO allied nations and became a primary way of promoting armaments cooperation within NATO.

Problems associated with testing foreign weapons normally stem from politics, national pride, and a lack of previous test data. When foreign companies introduce weapon systems for testing, they often attempt to align U.S. military and congressional organizations with their systems. For example, when a foreign nation introduced an antitank weapon to the Army, it did so by having a U.S. Senator write the Army stating a need for the system; attached to the letter was a document containing doctrine to employ the system and a test concept to use when evaluating the system. Systems tested in the NATO Comparative Test Program often become matters of national pride, and the test community must be careful not to allow national pride to be a driving force in the evaluation. At times, the 9mm pistol competition in NATO resembled an international soccer match, with each competing nation cheering for its pistol and many other nations selecting sides; evaluating the 9mm pistol was difficult because of these forces. Thus, U.S. testers must make every effort to obtain all available test data on foreign systems. These data can be used to help validate the evolving test data and additional test data gathered during the evaluation.

22.4 T&E MANAGEMENT IN MULTINATIONAL PROGRAMS

22.4.1 Compatibility With Allies

Rationalization, Standardization, and Interoperability (RSI) have become increasingly important elements in the materiel acquisition process. Public Law 94-361 requires that "equipment for use of personnel of the Armed Forces of the United States stationed in Europe under the terms of the North Atlantic Treaty should be standardized or at least interoperable with equipment of other members of the North Atlantic Treaty Organization."5 Program Managers (PMs) and Test Managers (TMs) must, therefore, be fully aware of any potential international applications of the systems for which they are responsible. Two guidebooks published by the Defense Acquisition University (DAU) are valuable compendiums of information for the PM of a developing system with potential multinational applications.6 Additionally, on file at the DAU David D. Acker Library is the research report "Operational Test and Evaluation in the IDEA [International Defense Education Association] Nations" (1999), comparing the Operational Test and Evaluation (OT&E) processes of the United States with those of Great Britain, Germany, and France. This report concluded that there were significant differences in what was considered OT&E by the four countries. The Defense Acquisition Guidebook states that Foreign Military Sales (FMS) of U.S. major systems (ACAT [Acquisition Category] I, II) prior to IOT&E must be approved by the USD(AT&L).7

Representatives of the United States, United Kingdom, France, and Germany have signed a MOA concerning the mutual acceptability of each country's T&E data. This agreement seeks to avoid redundant testing by documenting the extent of understanding among the involved governments concerning mutual acceptability of respective T&E procedures for systems that are developed in one country and are candidates for procurement by one or more of the other countries. Focal points for development and operational testing in each of the countries are identified, and procedures governing generation and release of T&E data are described in the Memorandum of Understanding (MOU). The European concept of OT&E is significantly different from that used by the U.S. DoD.

Early and thorough planning is an important element of any successful T&E program but is even more critical in a multinational program. Agreement must be reached concerning T&E procedures, data requirements, and methodology. Differences in tactics, battlefield representations, and military organizations may make it difficult for one nation to accept another's test data. Therefore, agreement must be reached in advance concerning the Operational Test (OT) scenario and battlefield representation that will be used.

22.4.2 International Test Operations Procedures

The Defense Acquisition Guidebook8 states that "results of T&E of systems using approved International Test Operations Procedures (ITOPs) may be accepted without repeating the testing." The ITOPs are documents containing standardized, state-of-the-art test procedures prepared through the cooperative efforts of France, Germany, the United Kingdom, and the United States. Their use assures high-quality, efficient, accurate, and cost-effective testing.
The Director, Operational Test and Evaluation (DOT&E) is the OSD sponsor providing basic guidance and direction to the ITOPs processes. The ITOPs program is intended to shorten and reduce the costs of the materiel development and acquisition cycle, minimize duplicate testing, improve interoperability of U.S. and allied equipment, promote the cooperative development and exchange of advanced test technology, and expand the customer base. Each Service has designated an ITOPs point of contact: the Army uses the Test and Evaluation Management Agency (TEMA); in the Navy it is the Director, Navy T&E Division (N-912); and in the Air Force it is the Chief, Policy and Program Division (AF/TEP). The Army, which initiated the program in 1979, is the lead Service. A total of 75 ITOPs have been completed and published in six technical areas under the Four-Nation Test and Evaluation MOU, and additional ITOPs are under development by active working committees.9 Completed documents are submitted to the Defense Technical Information Center (DTIC) for official distribution.

22.5 U.S. AND NATO ACQUISITION PROGRAMS

Some test programs involve combined development and test of new weapon systems for the U.S. and other NATO countries. In these programs, some differences from the regular "way of doing things" occur. For example, the formulation of the Request for Proposal (RFP) must be coordinated with the North Atlantic Program Management Agency (NAPMA), and their input to the Statement of Work (SOW), data requirements, operational test planning, and test schedule formulation must be included. Also, the U.S. Army operational user, Forces Command (FORSCOM), must be involved in the OT program. Usually, a multinational Memorandum of Understanding is created concerning test program and production funding, test resources, test team composition, use of national assets for testing, etc.

Nations are encouraged to use the data that another nation has gathered on similar test programs to avoid duplication of effort. For example, during the U.S. and NATO Airborne Warning and Control System (AWACS) Electronic Support Measures (ESM) Program, both U.S. and NATO E-3As will be used as test aircraft in combined Development Test and Evaluation (DT&E) and subsequent OT&E. Testing will be conducted in the United States and Europe and will involve the Program Management Office (PMO), contractor, U.S. operational users, Air Force Operational Test and Evaluation Center (AFOTEC), Force Command (NATO users), and logistics personnel for this program. A multinational Memorandum of Agreement for this program was created. The U.S. program is managed by the AWACS Program Office, and the NATO program is managed by the NAPMA.

22.6 SUMMARY

The procurement of weapon systems from foreign nations for use by U.S. Armed Forces can provide the following advantages: reduced Research and Development (R&D) costs, faster initial operational capability, improved interoperability with friendly nations, and lower procurement costs because of economies of scale. Such procurement is normally a preferred solution to user requirements before attempting to start a new development. Testing such systems presents specific challenges in accommodating the needs of all users. OT&E conducted by allied and friendly nations is not as rigorous as that required by U.S. public law and DoD acquisition regulations. Such testing therefore requires careful advance planning and systematic execution. Expectations and understandings must be well documented at an early stage to ensure that the test results have utility for all concerned.


ENDNOTES

1. Title 10, United States Code (U.S.C.), Section 2350A, Cooperative research and development agreements: NATO organizations; allied and friendly foreign countries, March 18, 2004.

2. DoD 5000.3-M-2, Foreign Comparative Testing Program Procedures Manual, January 1994; DoDI 5000.2, Operation of the Defense Acquisition System, May 12, 2003, Enclosure 5.

3. Office of the Under Secretary of Defense (Acquisition, Technology and Logistics), Director, Strategic & Tactical Systems, The Foreign Comparative Testing (FCT) Handbook, March 2001, http://www.acq.osd.mil/cto.

4. DoDI 5000.2.

5. Public Law 94-361, "DoD Authorization Act, 1977" (Title 10, U.S.C. 2457), July 14, 1976.

6. Kausal, Tony, editor. Comparison of the Defense Acquisition Systems of France, Great Britain, Germany, and the United States, Defense Acquisition University, September 1999; Kausal, Tony, editor, and Stefan Markowski. Comparison of the Defense Acquisition Systems of Australia, Japan, South Korea, Singapore, and the United States, Defense Acquisition University, July 2000.

7. Defense Acquisition Guidebook, http://akss.dau.mil/DAG.

8. Ibid., Chapter 2, C2.9.2, "International Cooperation."

9. Check the Developmental Test Command (DTC) Web site at http://www.dtc.army.mil/publications/topsindex.aspx for updates.




23
COMMERCIAL AND NON-DEVELOPMENT ITEMS

23.1 INTRODUCTION

When an acquisition strategy calls for a materiel solution, many options are available before electing to develop a new system. They range from the preferred solution of using commercially available products from domestic or international sources to the least preferred option of a traditional, single-Service new Research and Development (R&D) program. Between these two extremes are other acquisition strategies that call for using modified or unmodified systems at different component levels, ruggedization, and cooperative developments to various extents.1 Figure 23-1 shows the broad spectrum of approaches that can be taken in a system acquisition and provides examples of systems that have been developed using each approach. The PM [Program Manager] is required to consider this spectrum of options for all levels of the system design.2

[Figure 23-1 depicts the spectrum of acquisition approaches (off the shelf, ruggedization, militarization, standard subsystem assembly, development with standard subsystems, development with standard components, and full development), with cost and development time increasing across the spectrum. Illustrative examples range from the Beretta 9mm handgun and the commercial utility cargo vehicle to the counter obstacle vehicle and the Apache helicopter.]

Source: Army Materiel Command Pamphlet 70-2, AMC-TRADOC Materiel Acquisition Handbook, March 26, 1987.

Figure 23-1. The Spectrum of Technology Maturity


23.1.1 Definitions

A commercial item is generally defined as any item, other than real property, that is of a type customarily used for non-governmental purposes and that: (1) has been sold, leased, or licensed to the general public; (2) has been offered for sale, lease, or license to the general public; or any item that has evolved through advances in technology or performance, that is not yet available in the commercial marketplace, but that will be available in the commercial marketplace in time to satisfy the delivery requirements under a government solicitation. Also included in the definition are services in support of a commercial item, of a type offered and sold competitively in substantial quantities in the commercial marketplace based on established catalog or market prices for specific tasks performed under standard commercial terms and conditions; this does not include services that are sold based on hourly rates without an established catalog or market price for a specific service performed.

A Non-Developmental Item (NDI) is considered: (1) any previously developed item of supply used exclusively for governmental purposes by a federal agency, a state or local government, or a foreign government with which the United States has a mutual defense cooperation agreement; (2) any item described in (1) that requires only minor modification, or modifications of the type customarily available in the commercial marketplace, in order to meet the requirements of the procuring department or agency; or (3) any item being produced that does not meet the requirements of (1) or (2) solely because the item is not yet in use.

All such systems are required to undergo technical and Operational Test and Evaluation (OT&E) before the procurement decision, unless the decision authority makes a definitive decision that previous testing or other data (such as user/market investigations) provide sufficient evidence of acceptability.3 See Federal Acquisition Regulation (FAR), Section 2.101, for more precise definitions of commercial items and NDIs; and SD-2, Buying Commercial and Nondevelopmental Items: A Handbook, April 1, 1996.

23.1.2 Advantages and Disadvantages of Commercial and NDI Approaches

The use of commercial items and NDIs offers the following advantages:

• The time to field a system can be greatly reduced, providing a quicker response to the user's needs;

• R&D costs or Total Ownership Costs (TOCs) may be reduced;

• State-of-the-art technology may be available sooner.

Commercial items and NDIs have the following disadvantages:

• Acquisitions are difficult to standardize/integrate with current fleet equipment;

• Acquisitions create logistics support difficulties;

• Acquisitions tend not to have comparable competition; therefore, fewer second sources are available;

• With commercial and NDI acquisitions, engineering and test data often are not available.

23.1.3 Application of Commercial and NDIs

Commercial items or NDIs may be used in the same environment for which the items were designed. Such items normally do not require development testing prior to the Production Qualification Test (PQT), except in those cases where a contract may be awarded to a contractor who has not previously produced an acceptable finished product and the item is assessed as high risk; in that case, preproduction qualification testing would be required.4 An operational assessment or some more rigorous level of OT&E might be appropriate.

Commercial items or NDIs may also be used in an environment other than that for which the items were designed. Such items may require modifications in hardware and/or software, and they require testing in an operational environment, preproduction qualification testing (if previous testing resulted in item redesign), and production qualification testing.

Integration of commercial items or NDIs into a new development system may require some regression testing. These efforts require more extensive research, development, and testing to achieve effective operation of the desired system configuration. Testing required includes: feasibility testing in a military environment, preproduction qualification testing, hardware/software integration testing, operational testing, and production qualification testing. (An illustrative decision sketch covering these combinations follows Section 23.5.) Given the variety of approaches that may be employed, it is imperative that the acquisition strategy clearly specify, with the agreement of the testing authority, the level of testing that will be performed on commercial items and NDI systems and the environment in which those systems will be tested.

23.2 MARKET INVESTIGATION AND PROCUREMENT

A market investigation is the central activity leading to the program initiation decision regarding the use of a commercial item or NDI acquisition strategy. The purpose of the market investigation is to determine the nature of available products and the number of potential vendors. Market investigations may vary from informal telephone inquiries to comprehensive industry-wide reviews. During the market investigation, sufficient data must be gathered to support a definitive decision, to finalize the requirements, and to develop an acquisition strategy that is responsive to these requirements. Included in the search would be dual-use technologies that meet military needs yet have sufficient commercial application to support a viable production base. An assessment of Technology Readiness Level will provide the program office with insights into the readiness of the commercial item technologies for military application.5

During the market investigation, a formal "request for information" process may be followed wherein a brief narrative description of the requirement is published and interested vendors are invited to respond. Test samples or test items may be leased or purchased at this time to support the conduct of operational suitability tests, to evaluate the ability of the equipment to satisfy the requirements, and to help build the functional purchase description or system specification. This type of preliminary testing should not be used to select or eliminate any particular vendor or product unless it is preceded by competitive contracting procedures.6

It is imperative that technical and operational evaluators become involved during this early stage of any commercial item or NDI procurement and that they perform an early assessment of the initial issues.
The evaluators must also relate these issues to Test and Evaluation (T&E) criteria and provide their Independent Evaluation Plans (IEPs) and reports to the decision authorities before the Milestone B decision review.

23.3 COMMERCIAL ITEM AND NDI TESTING

23.3.1 General Considerations

Test planning for commercial items and NDIs shall recognize commercial testing and experience, but nonetheless determine the appropriate DT&E, OT&E, and LFT&E [Live Fire Test and Evaluation] needed to ensure effective performance in the intended operational environment.7 The Defense Acquisition Guidebook suggests that "the PM shall develop an appropriate T&E strategy for commercial items to include evaluating commercial items in a system test bed, when practical; focusing test beds on high-risk items; and testing commercial item upgrades for unanticipated side effects in areas such as security, safety, reliability, and performance." T&E must be considered throughout the acquisition of a system that involves commercial items and NDI.

Program Managers (PMs) and their staffs must ensure that the testing community is fully involved in the acquisition from the start. The amount and level of testing required depend on the nature of the commercial item or NDI and its anticipated use; testing should be planned to support the design and decision process. At a minimum, T&E will be conducted to verify integration and interoperability with other system elements. All commercial item and NDI modifications necessary to adapt them to the weapon system environment will also be subject to T&E. Available test results from all commercial and government sources will determine the actual extent of testing necessary. For example, a commercial item or NDI usually encompasses a mature design. The availability of this mature design contributes to the rapid development of the logistics support system that will be needed. In addition, there are more "production" items available for use in a test program. PMs and their staffs must remember that these systems also require activity in areas associated with traditional development and acquisition programs. For example, training and maintenance programs and manuals must be developed, and sufficient time should be allowed for their preparation.

When the solicitation package for a commercial item or NDI acquisition is assembled, PMs must ensure that it includes the following T&E-related items:

(1) Approved T&E issues and criteria;

(2) A requirement that the contractor provide a description of the testing previously performed on the system, including test procedures followed, data, and results achieved;

(3) PQT and quality conformance requirements;

(4) Acceptance test plans for the system and its components.

23.3.2 Testing Before Program Initiation

Since an important advantage of using a commercial item or NDI acquisition strategy is reduced acquisition time, it is important that testing not be redundant and that it be limited to the minimum effort necessary to obtain the required data.
Testing can be minimized by:

(1) Obtaining and assessing contractor test results;

(2) Obtaining usage/failure data from other customers;

(3) Observing contractor testing;

(4) Obtaining test results from independent test organizations (e.g., Underwriters Laboratories);

(5) Verifying selected contractor test data.

If it is determined that more information is needed after the initial data collection from the above sources, commercial item or NDI candidates may be bought or leased, and technical and operational tests may be conducted.

23.3.3 Testing After Program Initiation

All testing to be conducted after the program initiation milestone decision to proceed with the commercial item or NDI acquisition should be described in the Acquisition Strategy and the Test and Evaluation Master Plan (TEMP). Development testing is conducted only if specific information is needed that cannot be satisfied by contractor or other test data sources. Operational testing is conducted as needed. The Independent Operational Test and Evaluation (IOT&E) agency should concur in any decision to limit or eliminate operational testing.

T&E continues even after the system has been fielded. This testing takes the form of a follow-on evaluation to validate and refine: operating and support cost data; Reliability, Availability, and Maintainability (RAM) characteristics; logistics support plans; and training requirements, doctrine, and tactics.

23.4 RESOURCES AND FUNDING

Programming and budgeting for a commercial item or NDI acquisition present a special challenge. Because of the short duration of the acquisition process, the standard lead times required in the normal Planning, Programming, Budgeting and Execution (PPBE) cycle may be unduly restrictive. This situation can be minimized through careful, advance planning and, in the case of urgent requirements, reprogramming/supplemental funding techniques.

Research, Development, Test, and Evaluation (RDT&E) funds are normally used to support the conduct of the market investigation phase and the purchase or lease of candidate systems/components required for T&E purposes. RDT&E funds are also used to support T&E activities such as: modification of the test article; purchase of specifications, manufacturer's publications, repair parts, and special tools and equipment; transportation of the test article to and from the test site; and training, salaries, and temporary duty costs of T&E personnel. Procurement, operations, and maintenance funds are usually used to support Production and Deployment (P&D) costs.

One chief reason for using a commercial item or NDI acquisition strategy is reduced overall TOC. Additional cost savings can be achieved after a contract has been awarded if the PM ensures that incentives are provided for contractors to submit Value Engineering Change Proposals (VECPs) to the government when unnecessary costs are identified.

23.5 SUMMARY

The use of commercial items and NDIs in a system acquisition can provide considerable time and cost savings. The testing approach used must be carefully tailored to the type of system, the levels of modification, the technology risk areas, and the amount of test data already available. The T&E community must get involved early in the process so that all test issues are adequately addressed and timely, comprehensive evaluations are provided to decision authorities.
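Sections 23.1.3 and 23.3 describe, in effect, a decision table: the environment in which a commercial item or NDI will be used, and the degree of modification or integration involved, determine which test events are appropriate. The sketch below restates that logic for illustration only; the function, its parameters, and the exact strings are editorial assumptions rather than an official decision rule.

def required_testing(environment: str, high_risk_producer: bool = False,
                     redesigned: bool = False) -> list[str]:
    """Illustrative mapping from how a commercial/NDI item is used to the
    test events discussed in Sections 23.1.3 and 23.3.

    environment: "same"       - used in the environment it was designed for
                 "different"  - used outside its design environment
                 "integrated" - embedded in a new development system
    """
    if environment == "same":
        tests = ["production qualification test", "operational assessment"]
        if high_risk_producer:
            tests.insert(0, "preproduction qualification test")
    elif environment == "different":
        tests = ["operational environment testing",
                 "production qualification test"]
        if redesigned:
            tests.insert(0, "preproduction qualification test")
    elif environment == "integrated":
        tests = ["feasibility testing in a military environment",
                 "preproduction qualification test",
                 "hardware/software integration testing",
                 "operational testing",
                 "production qualification test"]
    else:
        raise ValueError(f"unknown environment: {environment!r}")
    return tests

print(required_testing("integrated"))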


ENDNOTES

1. Department of Defense Directive (DoDD) 5000.1, The Defense Acquisition System, May 12, 2003.

2. DoD Instruction (DoDI) 5000.2, Operation of the Defense Acquisition System, May 12, 2003.

3. Ibid.

4. Ibid.

5. Defense Acquisition Guidebook, Appendix 6, http://akss.dau.mil/DAG/.

6. Department of the Army Pamphlet (DA PAM) 73-1, Test and Evaluation in Support of Systems Acquisition, February 1997.

7. DoDI 5000.2.


24
TESTING THE SPECIAL CASES

24.1 INTRODUCTION

This chapter covers the special factors and alternative test strategies the tester must consider in testing dangerous or lethal weapons, one-of-a-kind or limited production systems, Advanced Concept Technology Demonstrations (ACTDs), and systems with high cost and/or special security considerations. Examples include chemical and laser weapons, ships, space weapons, and missile systems.

24.2 TESTING WITH LIMITATIONS

Certain types of systems cannot be tested using relatively standard Test and Evaluation (T&E) approaches for reasons such as a nonstandard acquisition strategy, resource limitations, cost, safety, or security constraints. The Test and Evaluation Master Plan (TEMP) must contain a statement that identifies "those factors that will preclude a full and completely realistic operational test (IOT&E [Initial Operational Test and Evaluation] and FOT&E [Follow-on Operational Test and Evaluation])," such as inability to realistically portray the entire threat, limited resources or locations, safety, and system maturity. The impact of these limitations on the test's Critical Operational Issues (COIs) must also be addressed in the TEMP.

Nonstandard acquisition strategies are often used for one-of-a-kind or limited production systems such as space systems, missiles, and ships. For one-of-a-kind systems, the production decision is often made prior to system design; hence, testing does not support the traditional decision process. In limited production systems, there are often no prototypes available for test; consequently, the tester must develop innovative test strategies.

The tester of dangerous or lethal systems, such as chemical and laser weapons, must consider various safety, health, and medical factors in developing test plans, including:

(1) Provision of medical facilities for pre- and post-test checkups and emergency treatment;

(2) Need for protective gear for participating and observer personnel;

(3) Approval of the test plan by the Surgeon General;

(4) Restrictions in selection of test participants (e.g., medical criteria or use of only volunteer troops);

(5) Restricted test locations;

(6) Environmental Impact Statements (EISs).

The tester must also allow for additional planning time, test funds, and test resources to accommodate such factors.

24.2.1 Chemical Weapons Testing

The testing of chemical weapons poses unique problems because the tester cannot perform actual open-air field testing with real nerve agents or other toxic chemicals. Since the United States signed and ratified the Geneva Protocol of 1925, U.S. policy has been that the United States will never be the first to use lethal chemical weapons; it may, however, retaliate with chemical weapons if so attacked. In addition to the health and safety factors discussed above, test issues the chemical weapons tester must address include:

(1) All possible chemical reactions due to variations such as moisture, temperature, pressure, and contamination;

(2) Physical behavior of the chemical, i.e., droplet size, dispersion density, and ground contamination pattern when used operationally;

(3) Toxicity of the chemical, i.e., lethality and duration of contamination when used operationally;

(4) Safety of the chemical weapon during storage, handling, and delivery;

(5) The decontamination process.

Addressing all of these issues requires a combination of laboratory toxic-chamber tests and open-air field testing. The latter must be performed using "simulants," which are substances that replicate the physical and chemical properties of the agent but have no toxicity.

The development and use of simulants for testing will require increased attention as more chemical weapons are developed. Chemical agents can demonstrate a wide variety of effects depending on such factors as moisture, temperature, and contamination. Consequently, the simulants must be able to replicate all possible agent reactions; it is likely that several simulants would have to be used in a test to produce all predicted agent behaviors. In developing and selecting simulants, the tester must thoroughly understand all chemical and physical properties and possible reactions of the agent.

Studies of the anticipated reactions can be performed in toxic-chamber tests using the real agent. Here, factors such as changes in moisture, temperature, pressure, and levels of impurity can be controlled to assess the agent's behavior. The tester must think through all possible environmental conditions in which the weapon could operate so that all cases can be tested in the laboratory chamber with the real agent. For example, during development testing of the BIGEYE chemical weapon, it was found that higher-than-expected temperatures due to aerodynamic heating caused pressure buildup in the bomb body that resulted in the bomb exploding. This caused the operational concept for the BIGEYE to be changed from on-board mixing of the two chemicals to mixing after release of the bomb.

Tests to confirm toxicity must be conducted using simulants in the actual environment. Since the agent's toxicity is dependent on factors such as droplet size, dispersion density, ground contamination pattern, and degradation rate, a simulant that behaves as the agent does must be used in actual field testing. Agent toxicity itself is determined in the laboratory.

The Services publish a variety of technical documents on specific chemical test procedures. Documents such as the Index of Test Operations Logistics Support: Developmental Supportability Test and Evaluation Guide, a bibliography that includes numerous reports on chemical testing issues and procedures, can be consulted for specific documentation on chemical testing.1
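The discussion above notes that several simulants may be needed to reproduce all predicted agent behaviors. Choosing an adequate set of simulants can be thought of as a coverage problem; the greedy sketch below is one illustrative way to reason about it. The simulant names and behavior labels are invented for the example and are not drawn from any chemical test procedure.

def choose_simulants(required_behaviors: set, simulants: dict) -> list:
    """Greedy selection of simulants until every predicted agent behavior
    (droplet size, dispersion density, ground pattern, degradation rate,
    etc.) is reproduced by at least one chosen simulant."""
    remaining = set(required_behaviors)
    chosen = []
    while remaining:
        # Pick the simulant that reproduces the most still-uncovered behaviors.
        best = max(simulants, key=lambda s: len(simulants[s] & remaining))
        if not simulants[best] & remaining:
            raise ValueError(f"no simulant reproduces: {remaining}")
        chosen.append(best)
        remaining -= simulants[best]
    return chosen

# Hypothetical data for illustration only.
behaviors = {"droplet size", "dispersion density", "ground pattern", "degradation rate"}
candidates = {
    "simulant A": {"droplet size", "dispersion density"},
    "simulant B": {"ground pattern"},
    "simulant C": {"degradation rate", "droplet size"},
}
print(choose_simulants(behaviors, candidates))
# ['simulant A', 'simulant B', 'simulant C'] -- all four behaviors covered.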
24.2.2 Laser Weapons Testing

Many new weapon systems are being designed with embedded laser range finders and laser designators. Because of the danger lasers pose to the human eye, the tester must adhere to special safety requirements and use specially configured geographic locations during T&E. For instance, a program seeking to free-play non-eye-safe airborne or ground lasers during tests must take significant precautions: the airspace must be restricted from overflight during active testing, and guards must be posted to prevent anyone (hikers, bikers, off-road vehicle riders, equestrians) from accidentally venturing into the area. A potential solution to the safety issue is to develop and use an "eye-safe" laser for testing. The tester must ensure that eye-safe lasers produce the same laser energy as the real laser system.

Another concern of the laser/directed energy weapons tester is the accurate determination of energy levels and location on the target. Measurements of the energy on the target are usually conducted in the laboratory as part of Development Test (DT). In the field, video cameras are often used to verify that a laser designator did indeed illuminate the target. Such determinations are important when the tester is trying to attribute weapon performance to behavior of the directed energy weapon, behavior of the guidance system, or some other factor.

A bibliography of Army test procedures, Test and Evaluation Command (ATEC) Pamphlet 310-4, lists several documents that cover the special issues associated with laser testing.

24.3 SPACE SYSTEM TESTING

From a historical perspective, space system acquisition has posed several unique problems to the test process (especially the Operational Test (OT) process) that generally fall into four categories: limited quantities/high cost, the "incremental upgrade" approach to acquisition, the operating environment (peacetime and wartime), and the test environment. Expanded Air Force guidance can be found in Space Systems Test and Evaluation Process.2 More generic guidance is in the National Security Space (NSS) Acquisition Policy, which outlines a streamlined acquisition framework.3

(1) Limited quantities/high cost: Space systems have traditionally involved the acquisition of relatively few (historically, fewer than 20) systems at extremely high per-unit costs in comparison with more traditional military systems. The high per-unit costs are driven by a combination of high transportation costs (launch to orbit); high life-cycle reliability requirements and associated costs because of the lack of an on-orbit maintenance capability; and the high costs associated with the "leading edge" technologies that tend to be part of spacecraft design. From a test perspective, this drives space system acquisition strategy into the "nonstandard" approach addressed below. The problem is compounded by the "incremental upgrade" approach to acquisition.

(2) Incremental approach to acquisition: Due to the "limited buy" and "high per-unit cost" nature of spacecraft acquisition, these systems tend to be procured using an individual increment acquisition strategy.
Under this concept, the "decision to deploy" is often made at the front end of the acquisition cycle, and the first prototype to be placed in orbit becomes the first operational asset. As early and follow-on systems undergo ground and on-orbit testing [either Development Test and Evaluation (DT&E) or Operational Test and Evaluation (OT&E)], discrepancies are corrected by "increment changes" to the next system in the pipeline. This approach to acquisition can perturb the test process, as the tester may have no formal milestone decisions to test toward. The focus must change toward being able to influence the design of (and block changes to) systems further downstream in the pipeline. Because the first on-orbit asset usually becomes the first operational asset, pressure is created from the operational community to expedite (and sometimes limit) testing so a limited operational capability can be declared and the system can begin fulfilling mission requirements. Once the asset "goes operational," any use of it for testing must compete with operational mission needs, a situation potentially placing the tester in a position of relatively low priority. Recognition of these realities and careful early test planning can overcome many of these problems, but the tester needs to be involved and ready much earlier in the cycle than with traditional systems.

(3) Operating environment (peacetime and wartime): Most currently deployed space systems and near-term future space systems operate in the military support arena, such as tactical warning/attack assessment, communications, navigation, weather, and intelligence; and their day-to-day peacetime operating environment is not much different from the wartime operating environment except for activity level (i.e., message throughput, more objects to track/see, etc.). Historically, space has been a relatively benign battlefield environment because technology limited the capability of potential adversaries to reach into space with weapons, but this is no longer valid. This combination of support-type missions and a battlefield environment that is not much different from the peacetime environment has played a definite role in allowing systems to reach limited operational capability without as much dedicated prototype system-level testing as is seen on other systems. This situation is changing with the advent of concepts like the Missile Defense Agency (MDA) system, where actual weapon systems (impact anti-satellite and laser) may be in operation and day-to-day peacetime operations will not mirror the anticipated battlefield environment as closely. Likewise, the elevation of the battlefield into space and the advancing technologies that allow potential adversaries to reach into space are changing the thrust of how space systems need to be tested in space. The Department of Defense (DoD) should anticipate an increased need for dedicated on-orbit testing on a type of space range where the anticipated battlefield environment can be replicated, a situation similar to the dedicated testing done today on test ranges with Army, Navy, and Air Force weapons.

(4) Test environment: The location of space assets in remote orbits also compounds the test problem. Space systems do not offer the ready access (as with ground or aircraft systems) needed to correct deficiencies identified during testing. This situation has driven the main thrust of testing into the prelaunch ground simulation environment, where discrepancies can be corrected before the system becomes inaccessible. However, as mentioned previously, when space system missions change from a war-support focus to a warfighting focus, and the number of systems required to do the mission increases from the "high reliability/limited number" mode to a more traditional "fairly large number buy" mode, future space system testing could be expected to become more like the testing associated with current ground, sea, and air systems. From a test perspective, this could also create unique test technology requirements; i.e., with these systems we will have to bring the test range to the operating system as opposed to bringing the system to the range.
Also, because the space environment tends to be visible to the world (others can observe our tests as readily as we can), unique test operations security methodologies may be required to achieve test realism without giving away system vulnerabilities.

In summary, current and near-term future space systems require unique test methodologies. In the future, however, space operations might entail development and deployment of weapon platforms on orbit with lower design-life reliability (because of cost), and day-to-day peacetime operations will not mirror the wartime environment. Thus, space system testing requirements may begin to more closely parallel those of traditional weapon systems.


24.4 OPERATIONS SECURITY AND T&E

Operations Security (OPSEC) issues must be considered in all test planning. Security regulations and contracting documents require the protection of "sensitive design information and test data" throughout the acquisition cycle by:

(1) Protecting sensitive technology;

(2) Eliminating nonsecure transmittal of data on and from test ranges;

(3) Providing secure communications linking DoD agencies to each other and to their contractors.

Such protection is costly and will require additional planning time, test resources, and test constraints. The test planner must determine all possible ways in which the system could be susceptible to hostile exploitation during testing. For example, announcement of test schedule and location could allow monitoring by unauthorized persons. Knowledge of the locations of systems and instrumentation, or of test concepts, could reveal classified system capabilities or military concepts. Compilations of unclassified data could, as a whole, reveal classified information, as could surveillance (electronic or photographic) of test activities or interception of unencrypted transmissions. The T&E regulations of each Service require an operational security plan for a test. A detailed list of questions the test planner can use to identify the potential threat of exploitation is provided in Management of Operational Test and Evaluation.4

24.5 ADVANCED CONCEPT TECHNOLOGY DEMONSTRATIONS

Systems with potential utility for the user and relatively mature technology may be evaluated by a user in an operational field environment. These programs are not acquisition programs and therefore are not subject to the normal acquisition T&E processes. A favorable evaluation may result in a decision to acquire additional systems for Service use, bypassing a number of the normal acquisition phases. The Services have been using their operational test agencies to assist field commanders in structuring an evaluation process that provides the documented data necessary for an informed acquisition decision.

24.6 SUMMARY

All weapon system tests are limited to some degree, but certain systems face major limitations that could preclude a comprehensive and realistic evaluation. The test planners of these special systems must allow additional planning time, budget for extra test resources, and devise alternative test strategies to work around testing limitations caused by such factors as security restrictions, resource availability, environmental safety factors, and nonstandard acquisition strategies.


ENDNOTES

1. U.S. Army Test and Evaluation Command (ATEC) Pamphlet 310-4, Index of Test Operations Logistics Support: Developmental Supportability Test and Evaluation Guide.
2. Air Force Manual (AFM) 99-113, Space Systems Test and Evaluation Process, May 1, 1996.
3. National Security Space (NSS) Acquisition Policy 03-01, October 3, 2003.
4. Air Force Regulation (AFR) 55-43, Management of Operational Test and Evaluation.


25
BEST PRACTICES IN T&E OF WEAPON SYSTEMS

25.1 INTRODUCTION

In the years leading up to 2000, numerous studies conducted by various agencies highlighted different perspectives on best practices for Test and Evaluation (T&E). The Office of the Secretary of Defense (OSD) published its study, Best Practices Applicable to DoD Developmental Test and Evaluation.1 The Executive Summary stated, “While the study team found no ‘Silver Bullets,’ it did identify some 20 practices used by commercial enterprises that are relevant to ODTSE&E [Office of the Director, Test Systems Engineering and Evaluation] business practices. These practices have been grouped under the categories ‘Policy,’ ‘Planning,’ ‘Test Conduct,’ and ‘Test Analysis’.”

Shortly thereafter, in September 1999, the Defense Science Board (DSB) Task Force released its report on a broad review of the entire range of activities relating to T&E. Its summary recommendations were: start T&E early—very early; make T&E part of the acquisition process—not adversarial to it; consolidate Development Test (DT) and Operational Test (OT); provide joint test leadership; fund Modeling and Simulation (M&S) support of T&E in program budgets; maintain the independence of the evaluation process while integrating all other activities; and establish a range ownership and operation structure separate from the Service DT/OT organizations. A follow-on DSB report addressed the value of testing, management of T&E resources, quality of testing, specific T&E investments, and the use of training facilities and exercises for T&E events.2

In the same time frame, A. Lee Battershell released a study for the National Defense University (NDU) comparing the acquisition practices of the Boeing 777 and the Air Force C-17. Her most interesting, yet not surprising, conclusion was that some commercial best practices do not transfer to government.

This was followed by the publication of the General Accounting Office (GAO) report Best Practices: A More Constructive Test Approach is Key to Better Weapon System Outcomes.3 After comparing commercial and defense system development practices, its three main findings were: problems found late in development signal weaknesses in testing and evaluation; testing early to validate product knowledge is a best practice; and different incentives make testing a more constructive factor in commercial programs than in weapon system programs.

The following information offers guidance to Department of Defense (DoD) personnel who plan, monitor, and execute T&E. Checklists found in the remainder of the chapter were obtained from the Report of the Task Force on Test and Evaluation.4 This excellent study is highly regarded in the T&E community but has become somewhat dated; consequently, the Defense Acquisition University (DAU) decided to update the study findings and include those findings and summary checklists in this guide. This discussion follows the system from early concept through the various stages of technology maturation, demonstrated in different system/test article configurations.


25.2 SPECIFIC WEAPON SYSTEMS TESTING CHECKLIST

The DSB report is the result of a study of past major weapon system acquisitions. It was hoped that this study would enhance the testing community’s understanding of the role that T&E has had in identifying system problems during the acquisition process. In the foreword of the DSB study, the authors made this statement about including even the obvious testing activity in their checklist:

The T&E expert in reading this volume will find many precepts which will strike him as of this type. These items are included because examples were found where even the obvious has been neglected, not because of incompetence or lack of personal dedication by the people in charge of the program, but because of financial and temporal pressures which forced competent managers to compromise on their principles. It is hoped that the inclusion of the obvious will prevent repetition of the serious errors which have been made in the past when such political, economical, and temporal pressures have forced project managers to depart from the rules of sound engineering practices....In the long run, taking short cuts during T&E to save time and money will result in significant increases in the overall costs of the programs and in a delay of delivery of the corresponding weapon systems to combatant forces.

25.2.1 Aircraft Systems

25.2.1.1 Concept Assessment

• Test Program/Total Costs. Prior to program initiation, all phases of the aircraft test program should be considered so that the total costs and the development schedules include consideration of all likely activities in the overall program.
• Test Facilities and Instrumentation. The test facilities and instrumentation requirements to conduct tests should be generally identified, along with a tentative schedule of test activities.
• Test Resources and Failures. Ensure that there are adequate funds, reasonable amounts of time, and acceptable numbers of aircraft planned for the various test program phases, and that provisions are made for the occurrence of failures.
• System Interfaces. Consider all aircraft system interfaces, their test requirements, and probable costs at the outset of the concept assessment.
• Major Weapon Subsystems. If the aircraft system relies on the successful development of a specific and separately funded major weapon (such as a gun or missile) in order to accomplish its primary mission, this major subsystem should be developed and tested concurrently with, or prior to, the aircraft.
• Propulsion System. If the aircraft program is paced by the propulsion system development, an early advanced-development project for the propulsion may be appropriate for a new concept.
• Operational Scenario. A conceptual operational scenario for operation and use of the aircraft should be developed so that general test plans can be designed. This should include purpose, roles and missions, threats, operating environments, logistics and maintenance, and basing characteristics. The potential range of values for these aspects should be stated.
• Evaluation Criteria. Develop evaluation criteria to be used for selecting the final aircraft system design.
• Untried Elements. The aircraft development program should include conclusive testing to eliminate uncertainties about the untried elements.


• Brassboard Avionics Tests. The use of brassboard or modified existing hardware to “prove” the concept will work should be seriously scrutinized to ensure that the demonstrations and tests are applicable.
• Nuclear Weapons Effects. The subject of nuclear weapons effects should be addressed in the test concept for all aircraft weapon systems where operational suitability dictates that survivable exposure to nuclear weapons effects is a requirement.

25.2.1.2 Prototype Development

• T&E Strategy. T&E plans and test criteria should be established so there is no question about what constitutes a successful test and what performance is required.
• Milestones and Goals. Ensure an integrated system test plan that preestablishes milestones and goals for easy measurement of program progress at a later time.
• Operating Concept and Environment. The operational concept and the environments in which the aircraft will be expected to operate, and in which it will be tested during Operational Test and Evaluation (OT&E), should be specified.
• Test Program Building Blocks. In testing the prototype, demonstrate that high-risk technology is in hand. In planning the system-level test program, ensure that components and subsystems are adequately qualified for incorporation into the system tests.
• Technology Concepts. Each concept to be used in the aircraft system (e.g., aerodynamics, structures, propulsion) should be identified and coded according to prior application, before further research. Tests for each concept should be specified, with the effect of failure identified.
• Development Test and Evaluation (DT&E)/OT&E Plan. The aircraft DT&E/OT&E test plan should be reviewed to ensure it includes the ground and flight tests necessary to safely and effectively develop the system.
• Test Failures. The T&E plans should be made assuming there will be failures; they are inevitable.
• Multi-Service Testing. When a new aircraft development program requires multi-Service testing during OT&E and prior to Low Rate Initial Production (LRIP), the test plan should include the types of tests and resources required from other activities and Services.
• Traceability. The aircraft development and test program should be designed and scheduled so that, if trouble arises, its source can be traced back through the lab tests and the analytical studies.
• Competitive Prototype Tests. When a competitive prototype test program using test and operational crews is employed, the aircraft should be compared on the basis of the performance of critical missions.
• Prototype Similarity to Development and Production Aircraft. A firm determination should be made of the degree of similarity of the winning prototype (in a competitive prototype program) to the Engineering Development Model (EDM) and production aircraft, so that test results derived from the prototype in the interim period prior to availability of the EDM aircraft can be used effectively.
• Prototype Tests. The prototype aircraft test data should be used to determine where emphasis should be placed in the engineering development program.
• Inlet/Engine/Nozzle Match. The aircraft test program should provide for an early and adequate inlet/engine/nozzle match through a well-planned test program, and time should be programmed for corrections.


• Subsystem Tests. There should be a balanced program for the aircraft subsystem tests.
• Propulsion System. If the aircraft is paced by the propulsion system development, an early advanced-development project for the propulsion may be appropriate for a new concept.
• Electromagnetic Interference (EMI) Testing. Full-scale aircraft system tests in an anechoic chamber are desirable for some aircraft.
• Parts Interchange. Early plans should provide for tests in which theoretically identical parts, particularly in avionics, are interchanged to ensure that the aircraft systems can be maintained in readiness.
• Human Factors Demonstration. Ensure adequate demonstration of human factors is considered in test planning.
• Logistics T&E. Adequate resources should be scheduled for the aircraft logistics system T&E, and a positive program should exist for the utilization of this information at the time of OT&E.
• User Participation. It is imperative that the operational command actively participate in the DT&E phase to ensure that user needs are represented in the development of the system.

25.2.1.3 Engineering Development Model

• Test Design. Test programs should be designed to have a high probability of early identification of major deficiencies during DT&E and Initial Operational Test and Evaluation (IOT&E).
• Data for Alternate Scenarios. By careful attention to testing techniques, maximize the utility of the test data gathered from aircraft instrumentation; range instrumentation; and data collection, reduction, and storage.
• Test Milestones. Development programs should be built around testing milestones, not calendar dates.
• Production Engineering Influence on Research and Development (R&D) Hardware. Encourage that production philosophy and production techniques be brought, to the maximum practicable extent, into an early phase of the design process for R&D hardware.
• Running Evaluation of Tests. Ensure that running evaluations of tests are conducted. If it becomes clear that test objectives are unattainable or that additional samples will not change the test outcome, ensure that procedures are established for terminating the test.
• Simulation. Analysis and simulation should be conducted, where practicable, before each phase of development flight testing.
• Avionics Mock-up. Encourage use of a complete avionics system installed in a mock-up of the appropriate section or sections of the aircraft.
• Escape Systems Testing. Ensure the aircrew escape system is thoroughly tested, with particular attention to redundant features such as pyrotechnic firing channels.
• Structural Testing. Ensure that fatigue testing is conducted on early production airframes. Airframe production should be held to a low rate until satisfactory progress is shown in these tests.
• Gun Firing Tests. All forms of ordnance, especially those that create gases, must be fired from the aircraft to evaluate external effects (blast and debris), internal effects (shock), and effects on the propulsion (inlet composition or distribution).
• Post-Stall Characteristics. Special attention is warranted on the post-stall test plans for DT&E and OT&E.
• Subsystem Performance History. During DT&E and IOT&E of aircraft, ensure that a performance history of each aircraft subsystem is kept.
• Flight Deficiency Reporting. The composition of flight deficiency reports by aircrews, particularly those pertaining to avionics, should be given special attention.
• Crew Limitations. Ensure aircrew limitations are included in the tests.
• Use of Operational Personnel. Recommend that experienced operational personnel help in establishing Measures of Effectiveness (MOEs) and in other operational test planning. In conducting OT&E, use typical operational aircrews and support personnel.
• Role of the User. Ensure that users participate in the T&E phase so their needs are represented in the development of the system concept and hardware.
• Crew Fatigue and System Effectiveness. In attack aircraft operational testing, and particularly in attack helicopter tests where vibration is a fatiguing factor, ascertain that the tests include a measure of degradation over time.
• Time Constraints on Crews. Detailed OT plans should be evaluated to determine that the test-imposed conditions on the crew do not invalidate the applicability of the collected data.
• Maintenance and Training Publications. The aircraft development program should provide for concurrent training of crews and preparation of draft technical manuals to be used by IOT&E maintenance and operating crews.
• R&D Completion Prior to IOT&E. The testing plans should ensure that, before an aircraft system is subjected to IOT&E, the subsystems essential to the basic mission have completed R&D.
• Complete Basic DT&E Before Starting OT&E. Before the weapon system is subjected to IOT&E, all critical subsystems should have completed basic DT&E and significant problems should be solved.
• Realism in Testing. Ascertain that final DT&E system tests and IOT&E flight tests are representative of operational conditions.
• Test All Profiles and Modes. Tests should be conducted to evaluate all planned operational flight profiles and all primary, backup, and degraded operating modes.
• Update of OT Plans. Ensure that OT plans are reviewed and updated, as needed, to make them relevant to evolving concepts.
• Plan OT&E Early. Ensure that operational suitability tests are planned to identify operational deficiencies of new systems quickly, so fixes can be developed and tested before large-scale production.
• Missile Launch Tests. Review the final position fix planned before launching inertially guided air-to-surface missiles.
• Mission Completion Success Probability. Mission completion success probability factors should be used to measure progress in the aircraft test program, as illustrated in the sketch following this list.
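As a hedged illustration of the mission completion success probability item above, the short Python sketch below treats the mission as a series of phases and computes the probability that all phases succeed, then compares that figure across two test periods. The phase names and probabilities are invented for illustration only; they are not data from this guide or from any program.

```python
from math import prod

# Notional mission phases and their demonstrated success probabilities,
# estimated from test results at two points in a flight test program.
# All names and values are illustrative assumptions, not program data.
phase_reliability = {
    "Test Period 1": {"takeoff": 0.99, "cruise": 0.97, "weapon delivery": 0.90, "recovery": 0.98},
    "Test Period 2": {"takeoff": 0.995, "cruise": 0.98, "weapon delivery": 0.95, "recovery": 0.99},
}

def mission_completion_probability(phases: dict) -> float:
    """Series model: the mission succeeds only if every phase succeeds."""
    return prod(phases.values())

for period, phases in phase_reliability.items():
    print(f"{period}: mission completion success probability = "
          f"{mission_completion_probability(phases):.3f}")
```

Tracking a single figure of merit of this kind across successive test periods gives program management a compact indicator of whether demonstrated reliability growth is keeping pace with test milestones.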


25.2.1.4 Production (LRIP and Full Rate), Deployment and Operational Support

• OT Realism. Assure that IOT&E and Follow-on Test and Evaluation (FOT&E) are conducted under realistic conditions.
• Design FOT&E for Less-Than-Optimal Conditions. Structure the FOT&E logistical support for simulated combat conditions.
• New Threat. Be alert to the need to extend the IOT&E if a new threat appears. Address IOT&E limitations in FOT&E.
• Certification of Ordnance. Ensure that ordnance to be delivered by an aircraft is certified for the aircraft.
• Inadvertent Influence of Test. The IOT&E/FOT&E plans should provide measures for ensuring that actions by observers and umpires do not unwittingly influence trial outcomes.
• Deficiencies Discovered In-Service. Be aware that in-Service operations of an aircraft system will surface deficiencies that extensive IOT&E/FOT&E probably would not uncover.
• Lead the Fleet. Accelerated Service testing of a small quantity of early production aircraft is advisable, and periodically thereafter during FOT&E.

25.2.2 Missile Systems

25.2.2.1 Concept Assessment

• Weapon System Interfaces. Consider significant weapon system interfaces, their test requirements, and probable costs at the outset of the concept assessment. Ensure that the program plan assembled before program start includes an understanding of the basic test criteria and broad test plans for the whole program.
• Number of Test Missiles. Ensure that there is sufficient time and a sufficient number of test articles to support the program through its various phases. Compare the program requirements with past missile programs of generic similarity; if there is a substantial difference, adequate justification should be provided. The DT&E period on many programs has had to be extended by as much as 50 percent. (A simple planning-margin sketch follows this list.)
• T&E Gap. A T&E gap has been experienced in some missile programs between the time when testing with R&D hardware was completed and the time when follow-on operational suitability testing was initiated with production hardware.
• Feasibility Tests. Ensure experimental test evidence is available to indicate the feasibility of the concept and the availability of the technology for the system development.
• Evaluation of Component Tests. Results of tests conducted during the concept assessment and the prototype testing, which most likely have been conducted on avionics brassboards, breadboards, or modified existing hardware, should be evaluated with special attention.
• Multi-Service Testing Plans. When a new missile development program requires multi-Service testing during OT&E, the early Test and Evaluation Master Plan (TEMP) should include the types of tests and resources required from other activities and Services.
• Test Facilities and Instrumentation Requirements. The test facilities and instrumentation requirements to conduct tests should be generally identified early, along with a tentative schedule of test activities.
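To make the schedule and test-article margins discussed above concrete, the following Python sketch applies the historical growth experience cited in the checklist and an assumed flight-success rate to a notional baseline. Every number in it is an illustrative assumption, not guidance from this guide.

```python
import math

# Illustrative baseline plan (assumed values, not from any real program).
baseline_dte_months = 24          # planned DT&E duration
required_successful_flights = 20  # flights needed to satisfy test objectives

# The checklist notes that DT&E has grown by as much as 50 percent on some
# programs; an assumed flight-success rate drives the extra test missiles
# that should be budgeted to cover failures.
schedule_growth_factor = 1.5
assumed_flight_success_rate = 0.8

dte_with_margin = baseline_dte_months * schedule_growth_factor
test_missiles_needed = math.ceil(required_successful_flights / assumed_flight_success_rate)

print(f"DT&E duration with 50 percent growth margin: {dte_with_margin:.0f} months")
print(f"Test missiles to budget for {required_successful_flights} successful flights: {test_missiles_needed}")
```

Comparing margins computed this way against past programs of generic similarity is one way to test whether the planned time and quantity of test articles are defensible.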


25.2.2.2 Prototype Testing

• Establish Test Criteria. By the end of prototype testing, test criteria should be established so there is no question about what constitutes a successful test and what performance is expected.
• Human Factors. Ensure that the TEMP includes adequate demonstration of human factors considerations.
• Instrumentation Diagnostic Capability and Compatibility. Instrumentation design, with adequate diagnostic capability and compatibility in the DT&E and IOT&E phases, is essential.
• Provisions for Test Failures. The DT&E and OT&E plans should include provisions for the occurrence of failures.
• Integrated Test Plan (ITP). Ensure development of an integrated system test plan that preestablishes milestones and goals for easy measurement of program progress at a later time.
• T&E Requirements. Ensure that the T&E program requirements are firm before approving an R&D test program; many missile programs have suffered severe cost impacts as a result of this deficiency. The test plan must include provisions to adequately test those portions of the operational envelope that stress the system, including backup and degraded operational modes.
• Personnel Training Plans. Ensure that adequate training and certification plans for test personnel have been developed.
• Test and Engineering Reporting Format. Include a T&E reporting format in the program plan. Attention must be given to the reporting format in order to provide a consistent basis for T&E throughout the program life cycle. (A sketch of one possible record format follows this list.)
• Program-to-Program Cross Talk. Encourage program-to-program T&E cross talk. T&E problems and their solutions on one program provide a valuable index of lessons learned and techniques for problem resolution on other programs.
• Status of T&E Offices. Ensure that T&E offices reporting to the Program Manager (PM) or director have the same stature as other major elements. It is important that the T&E component of the system program office have organizational status and authority equal to configuration management, program control, system engineering, etc.
• Measurement of Actual Environments. Thorough measurements should be made to define and understand the actual environment in which the system components must live during the captive, launch, and in-flight phases.
• Thoroughness of Laboratory Testing. Significant time and money will be saved if each component, each subsystem, and the full system are all tested as thoroughly as possible in the laboratory.
• Contract Form. The contract form can be extremely important to the T&E aspects. In one program, the contract gave the contractor full authority to determine the number of test missiles; in another, the contract incentive resulted in the contractor concentrating tests on one optimum profile to satisfy the incentive instead of developing the performance throughout important areas of the envelope.
• Participation of Operational Command. It is imperative that the operational command actively participate in the DT&E phase to ensure that user needs are represented in the development of the system.
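To make the reporting-format item above concrete, here is a minimal Python sketch of a structured test-event record. The field names are assumptions chosen for illustration; they are not a format prescribed by this guide or by any Service regulation.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class TestEventReport:
    """Illustrative, consistent record for reporting a single T&E event."""
    event_id: str                  # unique identifier for the test event
    test_phase: str                # e.g., "DT&E", "IOT&E", "FOT&E"
    test_date: date
    objective: str                 # what the event was intended to demonstrate
    success_criteria: str          # pre-established pass/fail criteria
    result: str                    # "pass", "fail", or "inconclusive"
    anomalies: List[str] = field(default_factory=list)
    corrective_actions: List[str] = field(default_factory=list)

# Example usage with invented data.
report = TestEventReport(
    event_id="MSL-042",
    test_phase="DT&E",
    test_date=date(2004, 6, 15),
    objective="Captive-carry navigation accuracy",
    success_criteria="Position error under threshold for 95 percent of run",
    result="pass",
)
print(report)
```

Keeping one record structure of this kind from prototype testing through FOT&E is one way to provide the consistent reporting basis the checklist calls for.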


25.2.2.3 Engineering Development Model

• Production Philosophy and Techniques. Encourage that production philosophy and production techniques be brought, to the maximum practicable extent, into an early phase of the design process for R&D hardware. There are many missile programs in which the components were not qualified until the missile was well into production.
• Operational Flight Profiles. Tests should be conducted to evaluate all planned operational flight profiles and all primary, backup, and degraded operating modes.
• Failure Isolation and Responsive Action. Does the system test plan provide for adequate instrumentation so that missile failures can be isolated and fixed before the next flight?
• Responsive Actions for Test Failures. Encourage a closed-loop reporting and resolution process that ensures each test failure at every level is closed out by appropriate action; i.e., redesign, procurement, retest, etc.
• Plan Tests of Whole System. Plan tests of the whole system, including proper phasing of the platform and supporting gear, the launcher, the missile, and user participation.
• Determination of Component Configuration. Conditions and component configuration during Development Tests (DTs) should be determined by the primary objectives of that test. Whenever a non-operational configuration is dictated by early test requirements, tests should not be challenged by the fact that the configuration is not operational.
• Testing of Software. T&E should ensure that software products are tested appropriately during each phase. Software often has been developed more as an add-on than as an integral part of the overall system. Software requirements need the same consideration as hardware requirements in the prototype development.
• Range Safety Dry Runs. Ensure the test plan includes adequate test program/range safety dry runs. The government test ranges have to provide facilities to safely test many different projects:
— Assemblies/subsystems with special requirements,
— Seekers and tracking devices,
— Propulsion subsystems,
— Connectors and their related hardware,
— Lanyard assemblies,
— Safing, arming, fuzing, and other ordnance devices.
• Review of Air-to-Surface Missile (ASM) Test Position Fixes. Review the final position fix planned before launching ASMs. There are instances in which the OT of air-launched missiles utilized artificial position fixes just prior to missile launch.
• Operator Limitations. Ensure operator limitations are included in the tests. Most tactical missiles, especially those used in close support, require visual acquisition of the target by the missile operator and/or an air/ground controller.
• Test Simulations and Dry Runs. Plan and use test simulations and dry runs. Dry runs should be conducted for each new phase of testing. Simulation and other laboratory or ground testing should be conducted to predict the specific test outcome. The “wet run” test should finally be run to verify the test objectives. Evaluation of the simulation versus the actual test results will help to refine the understanding of the system.
• Component Performance Records. Keep performance records on components. There are many examples of missile programs that have required parts stock sweeps associated with flight failures and component aging test programs.
• Tracking Test Data. Ensure the test program tracks data in a readily usable manner. Reliability and performance evaluations of a missile system should break down the missile’s activity into at least the following phases:
— Prelaunch, including captive-carry reliability,
— Launch,
— In-flight,
— Accuracy/fuzing.
• Updating IOT&E Planning. Periodically update Production Qualification Testing (PQT) and IOT&E planning during the early R&D phase. Few missile system programs have had adequate user participation with the desired continuity of personnel to minimize the problems of transition from DT&E to OT&E to deployment/utilization.
• Instrumentation Provisions in Production Missiles. Encourage built-in instrumentation provisions in production missiles.
• Constraints on Missile Operator. Detailed test plans should be evaluated to determine that the test-imposed constraints on the missile operator do not invalidate the applicability of the data so collected.
• Problem Fixes Before Production. Ensure that operational suitability tests identify operational deficiencies of new systems quickly, so fixes can be developed and tested before production.
• Flight Tests Representative of Operations. Ascertain that final DT&E system tests and planned IOT&E flight tests are representative of operational flights. Some ballistic missile R&D programs have shown high success rates in R&D flight tests; however, when the early production systems were deployed, they exhibited a number of unsatisfactory characteristics, such as poor alert reliability and poor operational test-flight reliability.
• System Interfaces in OT. Ensure the primary objective of OT planning is to obtain measurements of the overall performance of the weapon system when it is interfaced with the systems required to use the weapon system operationally.

25.2.2.4 Production (LRIP, Full Rate), Deployment and Operational Support

• Realistic Conditions for Operational Testing. Ascertain that operational testing is conducted under realistic combat conditions. This means that the offense/defense battle needs to be simulated in some way before the weapon system evaluation can be considered complete. Whether this exercise is conducted within a single Service (as in the test of a surface-to-surface antitank missile against tanks) or among Services (as in the test of an ASM against tanks with antiaircraft protection), the plans for such testing should be formulated as part of the system development plan.
• Testing All Operational Modes. Ensure the FOT&E plan includes tests of any operational modes not previously tested in IOT&E. All launch modes, including degraded and backup modes, should be tested in the FOT&E because the software interface with the production hardware system should be evaluated thoroughly; otherwise, small, easy-to-fix problems might preclude launch.
• Extension of the OT&E for New Threats. Be alert to the need to extend the IOT&E/FOT&E if a new threat arises. Few missile programs perform any kind of testing relatable to evaluating system performance against current or new threats.


• “Lead-the-Fleet” Production Scheduling. Lead-the-Fleet missile scheduling and tests should be considered.
• Test Fixes. Test fixes result from earlier operational testing. After IOT&E has identified problem areas in missiles, FOT&E should evaluate these areas primarily to determine the adequacy of the incorporated fixes, particularly if the IOT&E did not run long enough to test the fixes.
• FOT&E Feedback to Acceptance Testing. Ensure that FOT&E results are quickly fed back to influence early production acceptance testing. Production acceptance testing is probably the final means the government normally will have to ensure the product meets specifications. Early acceptance testing could be influenced favorably by quick feedback from FOT&E to acceptance testing. This is exemplified by a current ASM program where production has reached peak rates and the IOT&E has not been completed.

25.2.3 Command and Control (C2) Systems

25.2.3.1 Concept Assessment

• Concept Test Philosophy. The T&E planners must understand the nature of Command and Control (C2) systems early in the concept assessment. In a complex C2 system, a total systems concept must be developed initially. The total system life cycle must be analyzed so the necessary requirements for the design can be established.
• The Importance of Software Testing. Testers should recognize that software is a pacing item in C2 systems development.
• Software Test Scheduling – Contractors’ Facilities. Provision should be made for including software T&E during each phase of C2 systems’ acquisition. Availability of contractors’ facilities should be considered.
• Evaluation of Exploratory Development Tests. Care should be exercised in evaluating results of tests conducted during exploratory development of C2 systems. These tests, which most likely have been conducted on brassboard, breadboard, or modified existing hardware, should be evaluated with special attention.
• Feasibility Testing for Field Compilers. Early test planning should allow for simulating the computer system to test for field use of compilers, where applicable.
• Evaluation of Test Plan Scheduling. Milestones should be event-oriented, not calendar-oriented.
• Type Personnel Needs – Effects on T&E. A mix of personnel with different backgrounds affecting T&E is required.
• Planning for Joint-Service OT&E Before Program Start. A Joint-Service (multi-Service) OT&E strategy should be considered for C2 systems.

25.2.3.2 Prototype Testing

• Test Prototypes. In C2 systems, prototypes must reasonably resemble the final hardware configuration from a functional-use standpoint. When high technical risk is present, development should be structured around the use of one or more test prototypes designed to prove the system concept under realistic operational conditions before proceeding to engineering development.
• Test Objectives: Critical Issues. In addition to addressing critical technical issues, T&E objectives during prototype testing should address the functional issues of a C2 system.


• Real-Time Software: Demonstration of “Application Patches.” Tests of real-time C2 systems should include demonstrations of the interfaces whereby locally generated application patches are brought into being.
• Independent Software Test-User Group. An independent test-user software group is needed during early software qualification testing.
• System Interfaces. Critical attention should be devoted to testing interfaces with other C2 systems and to interfaces between subsystems, particularly the interfaces between sensors (e.g., radar units), communications systems (e.g., modems), and the specific processors (e.g., Central Processing Units (CPUs)). Interfaces with information-processing C2 systems must also address data-element and code-standardization problems if data is to be processed online.
• Human Factors. In a C2 system, human factors must be considered, and testing provided, from the earliest prototype designs. Testing should be conducted to determine the most efficient arrangement of equipment from the human factors standpoint; e.g., displays should be arranged for viewing from an optimum angle whenever possible; adequate maneuvering room within installation constraints should be allowed, considering the number of personnel normally manning the facility; and console-mounted controls should be designed and located to facilitate operation, minimize fatigue, and avoid confusion.
• Degraded Operations Testing. When the expected operational environment of a C2 system suggests that the system may be operated under less-than-finely-tuned conditions, tests should be designed to allow for performance measurements under degraded conditions.
• Test-Bed. The use of a test-bed for study and experimentation with new C2 systems is needed early in the prototype development.
• Software-Hardware Interfaces. The software-hardware interfaces, with all operational backup modes of a new C2 system, should be tested early in the program.
• Reproducible Tests. Test plans should contain a method for allowing full-load message input while maintaining reproducible test conditions.
• Cost-Effectiveness. Field-test data are needed during the prototype development for input to cost-effectiveness analyses of C2 systems.

25.2.3.3 Engineering Development Model

• Acquisition Strategy. The acquisition strategy for the system should:
— Allow sufficient time between the planned end of demonstration testing and major procurement (as opposed to limited procurement) decisions. This provides flexibility for modifying plans, which may be required during the test phases of the program. For instance, because insufficient time was allowed for testing one recent C2 system, the program and the contract had to be modified and renegotiated;
— Be evaluated relative to the constraints imposed;
— Ensure that sufficient dollars are available, not only to conduct the planned T&E but to allow for the additional T&E that is always required due to failures, design changes, etc.
• Problem Indications. It is important to establish an early detection scheme so management can determine when a program is becoming “ill.”
• Impact of Software Failures. Prior to any production release, the impact of software failures on overall system performance parameters must be considered.


• Displays. The display subsystems of a C2 system provide an essential function to the user: they are the link that couples the operator to the rest of the system and are, therefore, often critical to its success.
• Pilot Test. A pilot test should be conducted before IOT&E so sufficient time is available for necessary changes.
• Publications and Manuals. It is imperative that all system publications and manuals be completed, reviewed, and selectively tested under operational conditions before beginning overall system suitability testing.
• Power Sources. Mobile prime power sources are usually provided as Government-Furnished Equipment (GFE) and can be a problem area in testing C2 systems.
• Subsystem Tests. Every major subsystem of a C2 system should have a successful DT&E before beginning overall system operational testing.
• Communications. The C2 system must be tested in the appropriate electromagnetic environment to determine the performance of its communications system.
• Demonstration of Procedures. Test plans should include a procedural demonstration whereby the tested C2 system works in conjunction with other systems.
• GFE and Facilities. T&E should be concerned about the availability of GFE as specified in the proposed contract.
• User Participation in T&E. The varying needs of the user for a C2 system make participation in all phases of T&E mandatory.

25.2.3.4 Production (LRIP, Full Rate), Deployment and Operational Support

• Critical Issues. IOT&E should be designed during early planning to provide the answers to some critical issues peculiar to C2 systems. Some of the critical issues that IOT&E of C2 systems should be planned to answer are:
— Is system mission reaction time a significant improvement over present systems?
— Is a backup mode provided for use when either the airborne or ground system exhibits a failure?
— Can the system be transported as operationally required by organic transport? (Consider ground, air, and amphibious requirements.)
— Is there a special requirement for site preparation? (For example, survey and antenna location.)
— Can the system be erected and dismantled in the times specified? Are these times realistic?
— Does relocation affect system alignment?
— Does the system provide for operation during maintenance?
— Can on-site maintenance be performed on shelterless subsystems (e.g., radar units) during adverse weather conditions?
• IOT&E Reliability Data. The IOT&E can provide valuable data on the operational reliability of a C2 system; this data cannot be obtained through DT&E.
• Maintenance. In IOT&E, maintenance should include: a measurement of the adequacy of the maintenance levels and the maintenance practices; an assessment of the impact that the maintenance plan has on operational reliability; the accessibility of the major components of the system for field maintenance (e.g., cables and connectors installed to facilitate access); and verification that the software design for maintenance and diagnostic routines and procedures is adequate and that the software can be modified to accommodate functional changes.
• Continuity of Operations. The IOT&E should provide for an assessment of the impact of the failure of any subsystem element of a C2 system on overall mission effectiveness.
• Imitative Deception. The IOT&E should provide for tests to assess the susceptibility of the data links of a C2 system to imitative deception.
• First Article Testing (FAT). The pre-production FAT and evaluation should be designed and conducted to: (1) confirm the adequacy of the equipment to meet specified performance requirements; (2) confirm the adequacy of the software not only to meet current user needs but to accommodate changing needs; and (3) determine failure modes and rates of the total integrated system. This activity should be followed by FOT&E.
• Test Planners and Evaluators. Use the IOT&E personnel in the FOT&E program. The planners and evaluators for the FOT&E of the production system can do a better job if they are involved initially in planning and conducting the IOT&E.

25.2.4 Ship Systems

25.2.4.1 Concept Assessment

• TEMP. Prior to program initiation, sufficient material should be generated to allow for evaluating the overall T&E program.
• Test Objectives and Critical Issues. In evaluating the initial test concept, it is important that the test objectives during prototype test and evaluation address the major critical issues, especially technological issues.
• Test Facilities and Instrumentation Required. The test facilities and instrumentation requirements to conduct developmental and operational tests, and a tentative schedule of test activities, should be identified.
• Multiple Approach to Weapon System Development. Whenever possible, the weapon system concept should not be predicated on the successful development of a single hardware or software approach in the various critical subsystems (unless it has been previously demonstrated adequately).
• Comparison of New versus Old System. The procedure for examining the relative performance of new or modified systems versus old should be indicated in the T&E plan.
• Test Support Facilities. The phasing of test support facilities must be planned carefully, with some schedule flexibility to cover late delivery and other unforeseen problems.
• Fleet Operating Force Requirements. The requirement for fleet operating forces for DT&E or OT&E should be assessed early in the program, and a specific commitment made as to the types of units to be employed.
• Mission-Related MOEs. During the concept assessment of the acquisition of a new class of ship, a study effort should be commenced jointly by the Chief of Naval Operations (CNO) and the Commander, Operational Test and Evaluation Force (COMOPTEVFOR). This effort is to establish mission-related MOEs, which may be expressed in numerical fashion and may later be made the subject of OT&E to determine how closely the new ship system meets the operational need for which it was conceived.
• Ship T&E Management. The management of ship T&E should ensure that test requirements are necessary and consistent relative to system/subsystem aspects, and that the necessary testing is coordinated so that test redundancy does not become a problem.
• T&E of Large, Integrally-Constructed Systems. Major subsystems should be proven feasible before firm commitment to a detailed hull design.

25.2.4.2 Prototype Testing

• Authentication of Human Factors Concepts. T&E should authenticate the human factors concepts embodied in the proposed system design, examining questions of safety, comfort, and appropriateness of man-machine interfaces, as well as the numbers and skill levels of the personnel required.
• Acquisition Strategy. The acquisition strategy for a ship and its subsystems should allow sufficient time between the planned end of demonstration testing and major procurement decisions of GFE, for flexibility to modify plans (which may be required during the test phases of the program).
• Evaluation of Results of Exploratory Testing. Results of tests conducted during exploratory development, most likely conducted on brassboards, breadboards, or modified existing hardware, should be evaluated carefully.
• Software Testing. In view of the increased dependence upon computers in ship management and tactical operation, software testing must be exceptionally thorough, and integrated software testing must begin as early as possible.
• New Hull Forms. When a new type of ship involves a radical departure from the conventional hull form, extensive prototype testing should be required prior to further commitment to the new hull form.
• Effects of Hull and Propulsion on Mission Capability. The predicted effects of the proven hull and propulsion system design on the performance of the ship’s critical systems should be determined.
• Advances in Propulsion. Demonstration of the use of new propulsion systems should be conducted prior to making the decision to commit the propulsion systems to the ship in question.
• Propulsion Systems in Other Classes. When an engine to be used in the propulsion system of a new ship is already performing satisfactorily in another ship, this is not to be taken as an indication that shortcuts can be taken in propulsion system DT&E, or that no problems will be encountered.
• Waivers to T&E of Ship Systems. Waivers to T&E of pre-production models of a system in order to speed up production and delivery should be made only after considering all costs and benefits of the waiver, including those not associated with the contract.
• Environment Effects on Sonar Domes. Environmental effects on sonar domes and their self-noise should be tested and evaluated before the domes are accepted as part of the sonar system.
• Hull/Machinery Testing by Computer Simulation. In the DT&E of ships, there will be cases where the best means to conduct evaluations of particular hull and machinery capabilities is through dynamic analysis using computer simulation, with later validation of the simulation by actual test.

25.2.4.3 Engineering Development Model

• Initial or Pilot Phase of IOT&E. Before any OTs to demonstrate operational suitability and effectiveness are conducted, an initial or pilot test should be conducted (Technical Evaluation (TECHEVAL)).


• Identify Critical Subsystems. In planning for the IOT&E of a ship system, the critical subsystems, with respect to mission performance, should be identified.
• Reliability of Critical Systems. T&E should determine the expected reliability at sea of systems critical to the ship’s mobility and to the primary and major secondary tasks.
• Consistency in Test Objectives. There are various phases in testing a ship system. One should ensure the objectives of one phase are not inconsistent with the objectives of the other phases.
• Single Screw Ships. T&E of the propulsion systems of ships with a single screw should be especially rigorous to determine failure rates, maintenance, and repair alternatives.
• Problems Associated With New Hulls. Whenever a new hull is incorporated into a ship design, T&E of this hull should be conducted prior to Full-Rate Production (FRP) and incorporation of the major weapon subsystems.

25.2.4.4 Production (LRIP, Full Rate), Deployment and Operational Support

• The OT&E of Shipboard Gun Systems. The OTs of shipboard gun systems should simulate the stress, exposure time, and other conditions of battle so that the suitability of the weapon can be evaluated in total.
• Operational Reliability. The OT&E should provide valuable data on the operational reliability of ship weapon systems that cannot be obtained through DT&E.
• Targets for Antiaircraft Warfare (AAW) OT&E. The OT of shipboard AAW weapons demands the use of targets that realistically simulate the present-day threat.
• Design of Ship FOT&E. In the testing program of a ship system, it should be recognized that, although it may be designated as a special-purpose ship, in most cases it will be used in a general-purpose role as well. This will drive post-deployment FOT&E.
• Operational Testing During Shakedown Periods. The time period for FOT&E of a ship can be used more efficiently if full advantage is taken of the periods immediately after the ship is delivered to the Navy.
• Fleet Operations in FOT&E. A great deal of information on the operational effectiveness of a ship can be obtained from standard fleet operations through well-designed information collection, processing, and analysis procedures.
• Ship Antisubmarine Warfare (ASW) FOT&E Planning. In planning FOT&E of shipboard systems, it is important to recognize the difficulty of achieving realism, perhaps more so than in other areas of naval warfare.
• Variable Depth Sonar FOT&E. The behavior of towed bodies of variable depth sonar systems and towed arrays should be tested and evaluated under all ship maneuvers and speeds likely to be encountered in combat.
• Ship Self-Noise Tests. The magnetic and acoustic signatures of a ship can be tested accurately only after the ship is completed.
• Effect of Major Electronic Countermeasures (ECM) on Ship Capability. The FOT&E of a ship should include tests of the effectiveness of the ship when subjected to major ECM.


• Ship System Survivability. FOT&E of modern ships should provide for the assessment of their ability to survive and continue to fight when subjected to battle damage.
• Interlocks. Shipboard electronic systems are designed with interlock switches that open electrical circuits for safety reasons when the equipment cabinets are opened. The FOT&E should be able to detect over-design as well as minimum design adequacy of the interlock systems.
• Intra-ship Communication. In conducting lead ship trials and evaluations, particular attention should be given to the operational impact resulting from the absence, by design, of intra-ship communications circuits and stations from important operating locations.

25.2.5 Surface Vehicle Systems

25.2.5.1 Concept Assessment

• Preparing Test Plans. It is necessary that detailed evaluation criteria be established that include all items to be tested.
• Test Plans. Prior to program initiation, a plan should be prepared for evaluating the overall T&E program. As part of this, a detailed T&E plan should be developed for those tests to be conducted before advanced engineering development to validate the concept and hardware approach to the vehicle system. The objective of the validation test plan is to fully evaluate the performance characteristics of the new concept vehicle. This test plan cannot be developed, of course, until the performance characteristics are defined.
• Performance Characteristics Range. Stated performance characteristics derived from studies should be measured early in the program. Unrealistic performance requirements can lead to false starts and costly delays.
• Operating Degradation. System performance degrades under field conditions. Anticipated degradation must be considered during T&E. When a system must operate at peak performance during DT/OT to meet the specified requirements, it will likely perform at a lesser level when operated in the field.
• Test Personnel. The test director and/or key members of the test planning group within the project office should have significant T&E experience.
• Design Reviews. T&E factors and experience must influence the system design. The application of knowledge derived from past experience can be a major asset in arriving at a sound system design.
• Surrogate Vehicles. When high technical risk is present, development should be structured around the use of one or more surrogate vehicles designed to prove the system concept under realistic operational conditions before proceeding with further development.
• Test Facilities and Scheduling. Test range and resource requirements to conduct validation tests, and a tentative schedule of test activities, should be identified.

25.2.5.2 Prototype Testing

• Vulnerability. The vulnerability of vehicles should be estimated on the basis of testing.
• Gun and Ammunition Performance. Gun and ammunition development should be considered a part of overall tank system development. When a new gun tube, or one that has not previously been mounted on a tank chassis, is being evaluated, all ammunition types (including missiles) planned for use in that system should be test fired under simulated operational conditions.


• Increased Complexity. The addition of new capabilities to an existing system or system type will generally increase the complexity of the system and, therefore, increase the types and amount of testing required and the time to perform these tests.
• Component Interfaces. Prior to assembly in a prototype system, component subsystems should be assembled in a mock-up and verified for physical fit, human factors considerations, interface compatibility, and electrical and mechanical compatibility.
• Determining Test Conditions. During validation, test conditions should be determined by the primary objectives of that test rather than by more general considerations of realism.
• Test Plan Development. The test plan developed by this point should be in nearly final form and include, as a minimum:
— A description of requirements,
— The facilities needed to make evaluations,
— The schedule of evaluations and facilities,
— The reporting procedure, the objective being to communicate test results in an understandable format to all program echelons,
— The T&E guidelines,
— A further refinement of the cost estimates that were initiated during the Concept Evaluation Phase.
• Prototype Tests. Prototype tests should demonstrate that success criteria meaningful in terms of operational usage have been met. In designing contractually required tests, upon whose outcome large incentive payments or even program continuation may depend, it is essential to specify broader success criteria than simply hit or miss in a single given scenario.
• Reliability Testing. Reliability testing should be performed on component and subsystem assemblies before testing of the complete vehicle system; viable component and subsystem tests should precede full system testing.
• Human Factors. In evaluating ground vehicles, human factors should be considered at all stages, starting with the design of the prototype.
• Test Plan Scheduling. Test plan scheduling should be tied to event milestones rather than to the calendar. In evaluating the adequacy of the scheduling given by test plans, it is important that milestones be tied to the major events of the weapon system (meeting stated requirements) and not to the calendar.
• Test Failures. The T&E schedule should be sufficiently flexible to accommodate failures and correction of identified problems.

25.2.5.3 Engineering Development Model

• Pilot and Dry-Run Tests. A scheduled series of tests should be preceded by a dry run, which verifies that the desired data will be obtained.
• Comparison Testing. The test program should include a detailed comparison of the characteristics of a new vehicle system with those of existing systems, alternate vehicle system concepts (if applicable), and any system(s) being replaced.
• Simulation. Simulation techniques and equipment should be utilized to enhance data collection. Creation of histograms for each test course provides a record of conditions experienced by the vehicle during testing. Use of a chassis dynamometer can produce additional driveline endurance testing with more complete instrumentation coverage.
• Environmental Testing. Ground vehicles should be tested in environmental conditions and situations comparable to those in which they will be expected to perform.


• System Vulnerability. For combat vehicles, some estimate of vulnerability to battle damage should be made.
• Design Criteria Verification. Subsystem design criteria should be compared with actual characteristics.
• Electromagnetic Testing. Vehicle testing should include electromagnetic testing.
• System Strength Testing. In evaluating ground vehicles, early testing should verify intrinsic strength. This implies operation with maximum anticipated loading, including trailed loads, at maximum speeds and over the worst-case grades, secondary roads, and cross-country conditions for which the vehicle was developed or procured. This test is intended to identify deficient areas of design, not to break the machinery.
• Component Compatibility. Component compatibility should be checked through the duration of the test sequence.
• Human Interface. Critiques of good and bad features of the vehicle should be made early in the prototype stage while adequate time remains to make any indicated changes.
• Serviceability Testing. Ground vehicles should be tested and evaluated to determine the relative ease of serviceability, particularly for high-frequency operations.
• Experienced User Critique. Ground vehicle user opinions should be obtained early in the development program.
• Troubleshooting During Tests. Subsystems may exhibit failures during testing; adequate provisions should be made to permit troubleshooting and identification of defective components and inadequate design.

25.2.5.4 Production (LRIP, Full Rate), Deployment and Operational Support

• Planning the IOT&E. The IOT&E should be planned to be cost-effective and provide meaningful results.
• Performance and Reliability Testing. The production FAT should verify the performance of the vehicle system and determine the degradation, failure modes, and failure rates (a minimal estimation sketch follows this list).
• Lead-the-Fleet Testing. At least one production prototype or initial production model vehicle should be allocated to intensive testing to accumulate high operating time in a short period.
• User Evaluation. User-reported shortcomings should be followed up to determine problem areas requiring correction. Fixes should be evaluated during an FOT&E.
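The failure rates and degradation measures noted above are usually reported as a point estimate with a one-sided confidence bound. The fragment below is a minimal illustration only, not a procedure drawn from this guide: it assumes chargeable failures and operating hours are logged per test vehicle (the variable names are hypothetical), requires the SciPy library, and applies the standard chi-square lower bound for a time-terminated reliability test.

    # Minimal sketch: MTBF point estimate and one-sided lower confidence bound
    # from time-terminated test data (e.g., production FAT or lead-the-fleet hours).
    from scipy.stats import chi2

    def mtbf_estimate(total_hours, failures, confidence=0.80):
        """Return (point_estimate, lower_bound) in operating hours."""
        point = total_hours / failures if failures else float("inf")
        # Standard time-terminated bound: 2T divided by the chi-square quantile
        # with 2r + 2 degrees of freedom.
        lower = 2.0 * total_hours / chi2.ppf(confidence, 2 * failures + 2)
        return point, lower

    # Illustrative data: operating hours and chargeable failures for three test vehicles.
    hours = [410.0, 385.0, 512.0]
    failures = [2, 1, 3]
    point, lower = mtbf_estimate(sum(hours), sum(failures))
    print(f"Observed MTBF {point:.0f} h; 80% lower bound {lower:.0f} h")

Recomputing the bound over successive test blocks gives a simple view of whether reliability is degrading or growing as operating time accumulates.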


ENDNOTES

1. Office of the Secretary of Defense (OSD), Best Practices Applicable to DoD [Department of Defense] Developmental Test and Evaluation. Conducted by the Science Applications International Corporation (SAIC), June 1999.
2. Defense Science Board Report, Test and Evaluation Capabilities, December 2000.
3. GAO/NSIAD-00-199, Best Practices: A More Constructive Test Approach is Key to Better Weapon System Outcomes, July 2000.
4. Defense Science Board Study, Report of Task Force on Test and Evaluation, April 2, 1974.




APPENDICES


APPENDIX AACRONYMS AND THEIR MEANINGSAaAAEAAHAAWACATACEACTDADMADPADPEAECAFAFCEAAFEWESAFFTCAFMAFMCAFOTECAFRAFSCAF/TEAFVAiAISALCMAMCAMSDLAoAchieved availabilityArmy Acquisition ExecutiveAdvanced Attack HelicopterAntiaircraft WarfareAcquisition CategoryArmy Corps of EngineersAdvanced Concept Technology DemonstrationAcquisition Decision Memor<strong>and</strong>umAutomated Data ProcessingAutomated Data Processing EquipmentArmy <strong>Evaluation</strong> CenterAir ForceArmed Forces Communications <strong>and</strong> Electronics AssociationAir Force Electronic Warfare <strong>Evaluation</strong> SimulatorAir Force Flight <strong>Test</strong> CenterAir Force ManualAir Force Materiel Comm<strong>and</strong>Air Force Operational <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong> CenterAir Force RegulationAir Force Space Comm<strong>and</strong>Chief of Staff, <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong> (Air Force)Armored Family of VehiclesInherent AvailabilityAutomated Information SystemAir-Launched Cruise MissileArmy Materiel Comm<strong>and</strong>Acquisition <strong>Management</strong> Systems <strong>and</strong> Data Requirements Control ListOperational AvailabilityA-1


AoAAPAPBARARL-HREDASA(ALT)ASD(NII)ASMASN(RD&A)ASN/RE&SASRASWATDATEATECATIRSATPATPSATRAWACSBABITBITEBLRIPBoDBoODC 2C 3Analysis of AlternativesAcquisition PlanAcquisition Program BaselineArmy RegulationArmy Research Laboratory-Human Research <strong>and</strong> EngineeringDivision/DirectorateAssistant Secretary of the Army for Acquisition, Logistics, <strong>and</strong>TechnologyAssistant Secretary of Defense for Networks <strong>and</strong> InformationIntegrationAir-to-Surface MissileAssistant Secretary of the Navy for Research, Development, <strong>and</strong>AcquisitionAssistant Secretary of the Navy/Research, Engineering, <strong>and</strong> ScienceAlternative System ReviewAntisubmarine WarfareAdvanced Technology Demonstration (or Development)Automatic <strong>Test</strong> EquipmentArmy <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong> Comm<strong>and</strong>Army <strong>Test</strong> Incident Reporting SystemAutomated <strong>Test</strong> PlanAutomated <strong>Test</strong> Planning SystemAcceptance <strong>Test</strong> ReportAirborne Warning <strong>and</strong> Control SystemBudget Activity; Budget AuthorityBuilt-in <strong>Test</strong>Built-in <strong>Test</strong> EquipmentBeyond Low Rate Initial ProductionBoard of DirectorsBoard of Operating DirectorsComm<strong>and</strong> <strong>and</strong> ControlComm<strong>and</strong>, Control, <strong>and</strong> CommunicationsA-2


C 3 IC 4C 4 IC 4 ISRCADCAECAIVCAMCARSCDDCDRCDRLCECEPCERDECCFRCG MCSCCICIICIOCJCSICJCSMCLINCNOCNPCOCOMCOEACOICOICComm<strong>and</strong>, Control, Communications, <strong>and</strong> IntelligenceComm<strong>and</strong>, Control, Communications, <strong>and</strong> ComputersComm<strong>and</strong>, Control, Communications, Computers, <strong>and</strong> IntelligenceComm<strong>and</strong>, Control, Communications, Computers, Intelligence,Surveillance, <strong>and</strong> ReconnaissanceComputer-Aided DesignComponent Acquisition ExecutiveCost as an Independent VariableComputer-Aided ManufacturingConsolidated Acquisition Reporting SystemCapability Development DocumentCritical Design ReviewContract Data Requirements ListContinuous Education; Concept ExplorationCircle Error Probability; Concept Experimentation ProgramCommunications-Electronics Research Development <strong>and</strong> EngineeringCenterCode of Federal RegulationsComm<strong>and</strong>ing General, Marine Corps Systems Comm<strong>and</strong>Configuration ItemCompatibility, Interoperability, <strong>and</strong> IntegrationChief Information OfficerChairman of the Joint Chiefs of Staff InstructionChairman of the Joint Chiefs of Staff ManualContract Line Item NumberChief of Naval OperationsC<strong>and</strong>idate Nomination ProposalCombatant Comm<strong>and</strong>erCost <strong>and</strong> Operational Effectiveness AnalysisCritical Operational IssueCritical Operational Issues <strong>and</strong> CriteriaA-3


COMOPTEVFORCOOCPDCPSCPUCRCSCICTEIPCTODADABDAEDAESDAGDASDAUDBDDDCMADCSDCS/R&DDCSOPSDDR&EDIADIDDISADLTDMSODNADoDDoDDDoDIComm<strong>and</strong>er, Operational <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong> Force (Navy)Concept of OperationsCapability Production DocumentCompetitive Prototyping StrategyCentral Processing UnitConcept RefinementComputer Software Configuration ItemCentral <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong> Investment ProgramComparative <strong>Test</strong> OfficeDepartment of the ArmyDefense Acquisition BoardDefense Acquisition ExecutiveDefense Acquisition Executive SummaryData Authentication GroupDirector of the Army StaffDefense Acquisition UniversityData Base Design DocumentDefense Contract <strong>Management</strong> AgencyDeputy Chief of StaffDeputy Chief of Staff for Research <strong>and</strong> DevelopmentDeputy Chief of Staff for OperationsDirector of Defense Research <strong>and</strong> EngineeringDefense Intelligence AgencyData Item DescriptionDefense Information Systems AgencyDesign Limit <strong>Test</strong>Defense Modeling <strong>and</strong> Simulation OfficeDefense Nuclear AgencyDepartment of DefenseDepartment of Defense DirectiveDepartment of Defense InstructionA-4


DOEDOT&EDOTLPFDPRODRRDSDSARCDSBDSMCDTDT&EDT&E/DSDTCDTICDTRMCDTSE&EDTTSGDUSA(OR)DUSD(A&T)DVALEAECECCMECMECPECREDMEDPEDTEFADepartment of EnergyDirector, Operational <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong>Doctrine, Organization, Training, Leadership, Personnel, <strong>and</strong> FacilitiesDefense Plant Representative OfficeDesign Readiness ReviewDefense SystemsDefense Systems Acquisition Review Council (now the DefenseAcquisition Board (DAB))Defense Science BoardDefense Systems <strong>Management</strong> CollegeDevelopment <strong>Test</strong>Development <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong>Development <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong>/Defense SystemsDevelopmental <strong>Test</strong> Comm<strong>and</strong> (Army)Defense Technical Information CenterDefense <strong>Test</strong> Resource <strong>Management</strong> CenterDirector, <strong>Test</strong> Systems Engineering <strong>and</strong> <strong>Evaluation</strong>Defense <strong>Test</strong> <strong>and</strong> Training Steering GroupDeputy Under Secretary of the Army (Operations Research)Deputy Under Secretary of Defense for Acquisition <strong>and</strong> TechnologyData Link Vulnerability AnalysisEvolutionary AcquisitionElectronic CombatElectronic Counter-CountermeasuresElectronic CountermeasuresEngineering Change ProposalEngineering Change ReviewEngineering Development ModelEvent Design Plan (Army)Engineering Design <strong>Test</strong>Engineering Flight ActivityA-5


EISEMCSEMIEMPEOAERAMESMESOHESSEWFA/AFACAFARFATFCAFCTFDTEFFBDFMECAFOCFORSCOMFOT&EFQRFQTFRPFRPDRFWEFYFYDPFYTPGAOEnvironmental Impact StatementElectromagnetic Control StudyElectromagnetic InterferenceElectromagnetic PulseEarly Operational AssessmentExtended Range Anti-armor MunitionsElectronic Support MeasuresEnvironmental Safety <strong>and</strong> Occupational HealthEnvironmental Stress ScreeningElectronic WarfareFunctional Analysis/AllocationFederal Advisory Committee ActFederal Acquisition RegulationFirst Article <strong>Test</strong>ingFunctional Configuration AuditForeign Comparative <strong>Test</strong>ingForce Development <strong>Test</strong>s <strong>and</strong> ExperimentationFunctional Flow Block DiagramFailure Mode, Effects, <strong>and</strong> Criticality AnalysisFull Operational CapabilityForces Comm<strong>and</strong> (Army)Follow-on Operational <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong>Formal Qualification ReviewFormal Qualification <strong>Test</strong>/<strong>Test</strong>ingFull Rate ProductionFull Rate Production Decision ReviewForeign Weapons <strong>Evaluation</strong>Fiscal YearFuture Years Defense ProgramFuture Years <strong>Test</strong> Program; Five Year <strong>Test</strong> Program (Army)General Accounting Office (now Government Accountability Office)A-6


GFEHELSTFHQDAHSIHWHWCIHWILICBMICDICEIDDIEPIERIFPPILSINSCOMIOAIOCIOT&EIPPDIPTIR&DIRAIRSITITEAITPITOPITSCAPITTSGovernment Furnished EquipmentHigh Energy Laser System <strong>Test</strong> FacilityHeadquarters, Department of the ArmyHuman Systems IntegrationHardwareHardware Configuration ItemHardware-in-the-LoopIntercontinental Ballistic MissileInitial Capabilities DocumentIndependent Cost EstimateInterface Decision DocumentIndependent <strong>Evaluation</strong> PlanIndependent <strong>Evaluation</strong> ReportInformation for Proposal PreparationIntegrated Logistics SupportIntelligence <strong>and</strong> Security Comm<strong>and</strong>Independent Operational AssessmentInitial Operating CapabilityInitial Operational <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong>Integrated Product <strong>and</strong> Process DevelopmentIntegrated Product TeamInternational Research <strong>and</strong> DevelopmentIndustrial Resource AnalysisInterface Requirements SpecificationInformation TechnologyInternational <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong> AssociationIntegrated <strong>Test</strong> PlanInternational <strong>Test</strong> Operating ProcedureInformation Technology Security Certification <strong>and</strong> AccreditationProgramInstrumentation, Targets, <strong>and</strong> Threat SimulatorsA-7


IV&VJCIDSJITCJLFJROCJTCGJT&EJTTRKPPKrLCCLCCELFTLFT&ELRIPLSLSALSPLTBTM&SMAISMAJCOMMANPRINTMAVMCOTEAMCPMCSCMDAMDAPMEDCOMMIL-HDBKIndependent Verification <strong>and</strong> ValidationJoint Capabilities Integration <strong>and</strong> Development SystemJoint Interoperability <strong>Test</strong> Comm<strong>and</strong>Joint Live FireJoint Requirements Oversight CouncilJoint Technical Coordinating GroupJoint <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong>Joint Training <strong>and</strong> <strong>Test</strong> Range RoadmapKey Performance ParameterContractorLife Cycle CostLife Cycle Cost EstimateLive Fire <strong>Test</strong>ingLive Fire <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong>Low Rate Initial ProductionLogistics SupportLogistics Support AnalysisLogistics Support PlanLimited <strong>Test</strong> Ban TreatyModeling <strong>and</strong> SimulationMajor Automated Information SystemMajor Comm<strong>and</strong>sManpower <strong>and</strong> Personnel IntegrationMinimum Acceptable ValueMarine Corps Operational <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong> ActivityMilitary Construction ProgramMarine Corps Systems Comm<strong>and</strong>Milestone Decision Authority; Missile Defense AgencyMajor Defense Acquisition ProgramMedical Comm<strong>and</strong> (Army)Military H<strong>and</strong>bookA-8


MIL-SPECMIL-STDMOAMOEMOPMOSMOT&EMOUMPEMRTFBMSMSIACMTBFMTTRNAPMANATONAVAIRNAVSEANAWCNBCNBCCNDAANDCPNDINDUNEPANH&SNSIADNSSO&MO&SMilitary SpecificationMilitary St<strong>and</strong>ardMemor<strong>and</strong>um of AgreementMeasure of EffectivenessMeasure of PerformanceMilitary Occupational SpecialtyMulti-Service Operational <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong>Memor<strong>and</strong>um of Underst<strong>and</strong>ingMilitary Preliminary <strong>Evaluation</strong>Major Range <strong>and</strong> <strong>Test</strong> Facility BaseMilestoneModeling <strong>and</strong> Simulation Information Analysis CenterMean Time Between FailureMean Time To RepairNorth Atlantic Program <strong>Management</strong> AgencyNorth Atlantic Treaty OrganizationNaval Air Systems Comm<strong>and</strong>Naval Sea Systems Comm<strong>and</strong>Naval Air Warfare CenterNuclear, Biological, <strong>and</strong> ChemicalNuclear, Biological, <strong>and</strong> Chemical ContaminationNational Defense Authorization ActNavy Decision Coordinating PaperNondevelopment(al) ItemNational Defense UniversityNational Environmental Policy ActNuclear Hardness <strong>and</strong> SurvivabilityNational Security <strong>and</strong> International Affairs DivisionNational Security SystemsOperations <strong>and</strong> MaintenanceOperations <strong>and</strong> SupportA-9


OAOCDOIPTOMBOPEVALOPNAVOPNAVISTOPSECOPTEVFORORMAS/TEOSDOTOT&EOTAOTCOTDOTRROTPP 3 IP&DPAT&EPCAPCOPDRPDRRPDUSD(A&T)PEPEOPEPPF/DOSOperational AssessmentOperational Concept DocumentOverarching Integrated Product TeamOffice of <strong>Management</strong> <strong>and</strong> BudgetOperational <strong>Evaluation</strong>Operational NavyOperational Navy InstructionOperations SecurityOperational <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong> Force (Navy)Operational Resource <strong>Management</strong> Assessment Systems for <strong>Test</strong> <strong>and</strong><strong>Evaluation</strong>Office of the Secretary of DefenseOperational <strong>Test</strong>Operational <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong>Operational <strong>Test</strong> AgencyOperational <strong>Test</strong> Comm<strong>and</strong> (Army)Operational <strong>Test</strong> DirectorOperational <strong>Test</strong> Readiness ReviewOutline <strong>Test</strong> PlanPreplanned Product ImprovementsProduction <strong>and</strong> DeploymentProduction Acceptance <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong>Physical Configuration AuditPrimary Contracting OfficerPreliminary Design ReviewProgram Definition <strong>and</strong> Risk ReductionPrincipal Deputy Under Secretary of Defense for Acquisition <strong>and</strong>TechnologyProgram ElementProgram Executive OfficerProduction Engineering <strong>and</strong> PlanningProduction, Fielding/Deployment, <strong>and</strong> Operational SupportA-10


PIPkPk/hPMPMOPOPOMPPBEPQTPRATPRESINSURVPRRPSAQAQOT&ER&DR&ER&MRAMRASRCCRCSRD&ARDECOMRDTRDT&ERFPRGTRMRQTRSIProduct ImprovementProbability of killProbability of kill given a hitProgram Manager; Product ManagerProgram <strong>Management</strong> OfficeProgram Office, Purchase OrderProgram Objectives Memor<strong>and</strong>umPlanning, Programming, Budgeting <strong>and</strong> ExecutionProduction Qualification <strong>Test</strong>Production Reliability Acceptance <strong>Test</strong>President of the Board of Inspection <strong>and</strong> SurveyProduction Readiness ReviewPrincipal Staff AssistantQuality AssuranceQualification Operational <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong>Research <strong>and</strong> DevelopmentResearch <strong>and</strong> EngineeringReliability <strong>and</strong> MaintainabilityReliability, Availability, <strong>and</strong> MaintainabilityRequirements Allocation SheetRange Comm<strong>and</strong>ers CouncilRadar Cross SectionResearch, Development, <strong>and</strong> AcquisitionResearch, Development <strong>and</strong> Engineering Comm<strong>and</strong> (Army)Reliability Development <strong>Test</strong>ingResearch, Development, <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong>Request for ProposalReliability Growth <strong>Test</strong>Resource ManagerReliability Qualification <strong>Test</strong>Rationalization, St<strong>and</strong>ardization, <strong>and</strong> InteroperabilityA-11


RSMS&TSSAF/AQSAF/USASARSATSDDSECARMYSECDEFSECNAVSEPSFRSISILSMDCSOCOMSOOSOWSPAWARSPECSPOSRRSRSSSRSTASTEPSTPSTRISQASVRRadar Spectrum <strong>Management</strong>Strategic <strong>and</strong> Tactical SystemsSecretary of the Air Force for AcquisitionDirector of Space AcquisitionSelected Acquisition ReportShip Acceptance <strong>Test</strong>Software Design Document; System Development <strong>and</strong> DemonstrationSecretary of the ArmySecretary of the DefenseSecretary of the NavySystems Engineering Process; System Engineering Plan; System<strong>Evaluation</strong> Plan (Army)System Functional ReviewSystems IntegrationSoftware Integration LaboratorySpace <strong>and</strong> Missile Defense Comm<strong>and</strong>Special Operations Comm<strong>and</strong>Statement of ObjectivesStatement of WorkSpace <strong>and</strong> Naval Warfare Systems Comm<strong>and</strong>SpecificationSystem Program OfficeSystems Requirements ReviewSoftware Requirements SpecificationSoftware Specification ReviewSystem Threat AssessmentSimulation, <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong> ProcessSoftware <strong>Test</strong> PlanSimulation, Training, <strong>and</strong> Instrumentation (Army)Software Quality AssuranceSystem Verification ReviewA-12


SWSWILT&ETAAFTADSTAFTTDTDSTDYTEAMTECHEVALTEMATEMPTERCTFRDTIRICTLSTMTMCTMPTPDTPOTPMTPWGTRADOCTRIMSTRPTRRTRSTSARCUNKSoftwareSoftware in the Loop<strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong><strong>Test</strong>, Analyze, <strong>and</strong> FixTarget Acquisition Designation System; Theater Air Defense System<strong>Test</strong>, Analyze, Fix, <strong>and</strong> <strong>Test</strong>Technology DevelopmentTechnology Development StrategyTemporary Duty<strong>Test</strong>, <strong>Evaluation</strong>, Analysis, <strong>and</strong> ModelingTechnical <strong>Evaluation</strong> (Navy Term)<strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong> <strong>Management</strong> Agency<strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong> Master Plan<strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong> Resources Committee<strong>Test</strong> Facility Requirements DocumentTraining Instrumentation Resource Investment CommitteeTime Line SheetTechnical Manual, <strong>Test</strong> Manager<strong>Test</strong> <strong>Management</strong> CouncilTechnical <strong>Management</strong> Plan<strong>Test</strong> Program Documentation<strong>Test</strong> Program OutlineTechnical Performance Measurement<strong>Test</strong> Planning Working GroupTraining <strong>and</strong> Doctrine Comm<strong>and</strong><strong>Test</strong> Resource Information <strong>Management</strong> System<strong>Test</strong> Resources Plan<strong>Test</strong> Readiness Review<strong>Test</strong> Requirements Sheet<strong>Test</strong> Schedule <strong>and</strong> Review CommitteeUnknownA-13


U.S.USAUSAFUSAFE/DOQUSAKAUSD(AT&L)USMCUSNU.S.C.VCJCSVCSAVECPVV&AWBSWIPTWSMRUnited StatesUnited States ArmyUnited States Air ForceU.S. Air Forces in Europe/Directorate of Operations-OperationsUnited States Army Kwajalein AtollUnder Secretary of Defense for Acquisition, Technology, <strong>and</strong> LogisticsUnited States Marine CorpsUnited States NavyUnited States CodeVice Chairman, Joint Chiefs of StaffVice Chief of Staff, ArmyValue Engineering Change ProposalVerification, Validation <strong>and</strong> AccreditationWork Breakdown StructureWorking-level Integrated Product TeamWhite S<strong>and</strong>s Missile RangeA-14


APPENDIX BDOD GLOSSARY OFTEST TERMINOLOGYACCEPTANCE TRIALS — Trials <strong>and</strong> material inspection conducted underway by the trail boardfor ships constructed in a private shipyard to determine suitability for acceptance of a ship.ACQUISITION — The conceptualization, initiation, design, development, test, contracting, production,deployment, Logistics Support (LS), modification, <strong>and</strong> disposal of weapons <strong>and</strong> other systems,supplies, or services (including construction) to satisfy Department of Defense (DoD) needs,intended for use in or in support of military missions.ACQUISITION CATEGORY (ACAT) — ACAT I programs are Major Defense Acquisition Programs(MDAPs). An MDAP is defined as a program that is not highly classified <strong>and</strong> is designatedby the Under Secretary of Defense (Acquisition, Technology, <strong>and</strong> Logistics) (USD(AT&L))as an MDAP or is estimated by USD(AT&L) to require eventual total expenditure for research,development, test <strong>and</strong> evaluation of more than $365 million Fiscal Year (FY) 2000 constant dollars,or for procurement of more than $2.190 billion in FY2000 constant dollars.*1. ACAT ID for which the Milestone Decision Authority (MDA) is USD(AT&L). The “D”refers to the Defense Acquisition Board (DAB), which advises the USD(AT&L) at majordecision points.2. ACAT IC for which the MDA is the DoD Component Head or, if delegated, the DoD ComponentAcquisition Executive (CAE). The “C” refers to Component.The USD(AT&L) designates programs as ACAT ID or ACAT IC.ACAT IA programs are Major Automated Information Systems (MAISs). A MAIS is any programdesignated by the Assistant Secretary of Defense for Comm<strong>and</strong>, Control, Communications,<strong>and</strong> Intelligence (ASD(C 3 I)) as an MAIS or is estimated to require program costs for any singleyear in excess of $32 million in Fiscal Year (FY) 2000 constant dollars, total program in excess of$126 million in FY2000 constant dollars, or total life cycle costs in excess of $378 million FY2000constant dollars.MAIS do not include highly sensitive classified programs or tactical <strong>com</strong>munications systems.ACAT II program is a major system that is a <strong>com</strong>bination of elements that shall function togetherto produce the capabilities required to fulfill a mission need, excluding construction or otherimprovements to real property. It is estimated by the DoD Component head to require eventualtotal expenditure for Research, Development, <strong>Test</strong>, <strong>and</strong> <strong>Evaluation</strong> (RDT&E) of more than $140million in FY2000 constant dollars, or procurement of more than $660 million in FY2000 constantdollars, or is designated as major by the DoD Component head.B-1


ACAT III programs are defined as those acquisition programs that do not meet the criteria for anACAT I, an ACAT IA, or an ACAT II. The MDA is designated by the CAE <strong>and</strong> shall be at thelowest appropriate level. This category includes less-than-major AISs.ACQUISITION DECISION MEMORANDUM (ADM) — A memor<strong>and</strong>um signed by the MilestoneDecision Authority (MDA) that documents decisions made as the result of a milestonedecision review or in-process review.ACQUISITION LIFE CYCLE — The life of an acquisition program consists of phases, each precededby a milestone or other decision point, during which a system goes through research,development, test <strong>and</strong> evaluation, <strong>and</strong> production. Currently, the four phases are: (1) ConceptExploration (CE) (Phase 0); Program Definition <strong>and</strong> Risk Reduction (PDRR) (Phase I); (3) Engineering<strong>and</strong> Manufacturing Development (EMD) (Phase II); <strong>and</strong> (4) Production, Fielding/Deployment,<strong>and</strong> Operational Support (PF/DOS) (Phase III).ACQUISITION PHASE — All the tasks <strong>and</strong> activities needed to bring a program to the next majormilestone occur during an acquisition phase. Phases provide a logical means of progressivelytranslating broadly stated mission needs into well-defined system-specific requirements <strong>and</strong> ultimatelyinto operationally effective, suitable, <strong>and</strong> survivable systems.ACQUISITION PROGRAM BASELINE (APB) — A document that contains the most importantcost, schedule, <strong>and</strong> performance parameters (both objectives <strong>and</strong> thresholds) for the program. It isapproved by the Milestone Decision Authority (MDA), <strong>and</strong> signed by the Program Manager (PM)<strong>and</strong> their direct chain of supervision, e.g., for Acquisition Category (ACAT) ID programs it issigned by the PM, Program Executive Officer (PEO), Component Acquisition Executive (CAE),<strong>and</strong> Defense Acquisition Executive (DAE).ACQUISITION STRATEGY — A business <strong>and</strong> technical management approach designed to achieveprogram objectives within the resource constraints imposed. It is the framework for planning,directing, contracting for, <strong>and</strong> managing a program. It provides a master schedule for research,development, test, production, fielding, modification, postproduction management, <strong>and</strong> other activitiesessential for program success. Acquisition strategy is the basis for formulating functionalplans <strong>and</strong> strategies (e.g., <strong>Test</strong> And <strong>Evaluation</strong> Master Plan (TEMP), Acquisition Plan (AP), <strong>com</strong>petition,prototyping, etc.).ACQUISITION RISK — See Risk.ADVANCED CONCEPT TECHNOLOGY DEMONSTRATION (ACTD) — Shall be used todetermine military utility of proven technology <strong>and</strong> to develop the concept of operations that willoptimize effectiveness. ACTDs themselves are not acquisition programs, although they are designedto provide a residual, usable capability upon <strong>com</strong>pletion. Funding is programmed to support2 years in the field. ACTDs are funded with 6.3a (Advanced Technology Development (orDemonstration) (ATD)) funds.B-2


ADVANCED TECHNOLOGY DEVELOPMENT (OR DEMONSTRATION) (ATD) (BudgetActivity 6.3) — Projects within the 6.3a (ATD) program that are used to demonstrate the maturity<strong>and</strong> potential of advanced technologies for enhanced military operational capability or cost effectiveness.It is intended to reduce technical risks <strong>and</strong> uncertainties at the relatively low costs ofinformal processes.AGENCY COMPONENT — A major organizational subdivision of an agency. For example: theArmy, Navy, Air Force, <strong>and</strong> Defense Supply Agency are agency <strong>com</strong>ponents of the Department ofDefense. The Federal Aviation, Urban Mass Transportation, <strong>and</strong> the Federal Highway Administrationsare agency <strong>com</strong>ponents of the Department of Transportation.ANALYSIS OF ALTERNATIVES (AoA) — An analysis of the estimated costs <strong>and</strong> operationaleffectiveness of alternative materiel systems to meet a mission need <strong>and</strong> the associated programfor acquiring each alternative. Formerly known as Cost <strong>and</strong> Operational Effectiveness Analysis(COEA).AUTOMATED INFORMATION SYSTEM (AIS) — A <strong>com</strong>bination of <strong>com</strong>puter hardware <strong>and</strong>software, data, or tele<strong>com</strong>munications that performs functions such as collecting, processing,transmitting, <strong>and</strong> displaying information. Excluded are <strong>com</strong>puter resources, both hardware <strong>and</strong>software, that are physically part of, dedicated to, or essential in real time to the mission performanceof weapon systems.AUTOMATIC TEST EQUIPMENT (ATE) — Equipment that is designed to automatically conductanalysis of functional or static parameters <strong>and</strong> to evaluate the degree of performance degradation<strong>and</strong> perform fault isolation of unit malfunctions.BASELINE — Defined quantity or quality used as starting point for subsequent efforts <strong>and</strong> progressmeasurement that can be a technical cost or schedule baseline.BRASSBOARD CONFIGURATION — An experimental device (or group of devices) used to determinefeasibility <strong>and</strong> to develop technical <strong>and</strong> operational data. It will normally be a modelsufficiently hardened for use outside of laboratory environments to demonstrate the technical <strong>and</strong>operational principles of immediate interest. It may resemble the end item but is not intended foruse as the end item.BREADBOARD CONFIGURATION — An experimental device (or group of devices) used todetermine feasibility <strong>and</strong> to develop technical data. It will normally be configured only for laboratoryuse to demonstrate the technical principles of immediate interest. It may not resemble theend item <strong>and</strong> is not intended for use as the projected end item.CAPSTONE TEST AND EVALUATION MASTER PLAN (TEMP) — A TEMP which addressesthe testing <strong>and</strong> evaluation of a program consisting of a collection of individual systems (family ofsystems, system of systems) which function collectively. Individual system-unique content requirementsare addressed in an annex to the basic Capstone TEMP.B-3


CERTIFICATION FOR INITIAL OPERATIONAL TEST AND EVALUATION (IOT&E) — Aservice process undertaken in the advanced stages of system development, normally during LowRate Initial Production (LRIP), resulting in the announcement of a system’s readiness to undergoIOT&E. The process varies with each Service.COMPETITIVE PROTOTYPING STRATEGY (CPS) — Prototype <strong>com</strong>petition between two ormore contractors in a <strong>com</strong>parative side-by-side test.CONCEPT EXPERIMENTATION PROGRAM (CEP) — The CEP is an annual program executedby the Training <strong>and</strong> Doctrine Comm<strong>and</strong> (TRADOC) for <strong>com</strong>m<strong>and</strong>ers who have requirementsdetermination <strong>and</strong> force development missions for specific warfighting experiments. Experimentsare discrete events investigating either a material concept or a warfighting idea. TheCEP procedures are provided in TRADOC Pamphlet 71-9.CONCURRENCY — Part of an acquisition strategy that would <strong>com</strong>bine or overlap life cycle phases(such as Engineering <strong>and</strong> Manufacturing Development (EMD), <strong>and</strong> production), or activities (suchas development <strong>and</strong> operational testing).CONTINGENCY TESTING — Additional testing required to support a decision to <strong>com</strong>mit addedresources to a program, when significant test objectives have not been met during planned tests.CONTINUOUS EVALUATION (CE) — A continuous process, extending from concept definitionthrough deployment, that evaluates the operational effectiveness <strong>and</strong> suitability of a system byanalysis of all available data.COMBAT SYSTEM — The equipment, <strong>com</strong>puter programs, people, <strong>and</strong> documentation organic tothe ac<strong>com</strong>plishment of the mission of an aircraft, surface ship, or submarine; it excludes thestructure, material, propulsion, power <strong>and</strong> auxiliary equipment, transmissions <strong>and</strong> propulsion,fuels <strong>and</strong> control systems, <strong>and</strong> silencing inherent in the construction <strong>and</strong> operation of aircraft,surface ships, <strong>and</strong> submarines.CONFIGURATION MANAGEMENT — The technical <strong>and</strong> administrative direction <strong>and</strong> surveillanceactions taken to identify <strong>and</strong> document the functional <strong>and</strong> physical characteristics of aConfiguration Item (CI), to control changes to a CI <strong>and</strong> its characteristics, <strong>and</strong> to record <strong>and</strong>report change processing <strong>and</strong> implementation status. It provides a <strong>com</strong>plete audit trail of decisions<strong>and</strong> design modifications.CONTRACT — An agreement between two or more legally <strong>com</strong>petent parties, in the proper form,on a legal subject matter or purpose <strong>and</strong> for legal consideration.CONTRACTOR LOGISTICS SUPPORT — The performance of maintenance <strong>and</strong>/or materialmanagement functions for a Department of Defense (DoD) system by a <strong>com</strong>mercial activity.Historically done on an interim basis until systems support could be transitioned to a DoD organiccapability. Current policy now allows for the provision of system support by contractors ona long-term basis. Also called Long-Term Contractor Logistics Support.B-4


COOPERATIVE PROGRAMS — Programs that <strong>com</strong>prise one or more specific cooperative projectswhose arrangements are defined in a written agreement between the parties <strong>and</strong> which are conductedin the following general areas:1. Research, Development, <strong>Test</strong>, <strong>and</strong> <strong>Evaluation</strong> (RDT&E) of defense articles (including cooperativeupgrade or other modification of a U.S.-developed system), joint production (includingfollow-on support) of a defense article that was developed by one or more of the participants,<strong>and</strong> procurement by the United States of a foreign defense article (including software),technology (including manufacturing rights), or service (including logistics support) that areimplemented under Title 22 U.S.C. §2767, Reference (c), to promote the Rationalization,St<strong>and</strong>ardization, <strong>and</strong> Interoperability (RSI) of North Atlantic Treaty Organization (NATO)armed forces or to enhance the ongoing efforts of non-NATO countries to improve their conventionaldefense capabilities.2. Cooperative Research <strong>and</strong> Development (R&D) program with NATO <strong>and</strong> major non-NATOallies implemented under Title 10 U.S.C. §2350a, to improve the conventional defense capabilitiesof NATO <strong>and</strong> enhance RSI.3. Data, information, <strong>and</strong> personnel exchange activities conducted under approved DoD programs.4. <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong> (T&E) of conventional defense equipment, munitions, <strong>and</strong> technologiesdeveloped by allied <strong>and</strong> friendly nations to meet valid existing U.S. military requirements.COST AS AN INDEPENDENT VARIABLE (CAIV) — Methodologies used to acquire <strong>and</strong> operateaffordable DoD systems by setting aggressive, achievable Life Cycle Cost (LCC) objectives,<strong>and</strong> managing achievement of these objectives by trading off performance <strong>and</strong> schedule, as necessary.Cost objectives balance mission needs with projected out-year resources, taking into accountanticipated process improvements in both DoD <strong>and</strong> industry. CAIV has brought attention tothe government’s responsibilities for setting/adjusting LCC objectives <strong>and</strong> for evaluating requirementsin terms of overall cost consequences.CRITICAL ISSUES — Those aspects of a system’s capability, either operational, technical, or other,that must be questioned before a system’s overall suitability can be known. Critical issues are ofprimary importance to the decision authority in reaching a decision to allow the system to advanceinto the next phase of development.CRITICAL OPERATIONAL ISSUE (COI) — Operational effectiveness <strong>and</strong> operational suitabilityissues (not parameters, objectives, or thresholds) that must be examined in Operational <strong>Test</strong><strong>and</strong> <strong>Evaluation</strong> (OT&E) to determine the system’s capability to perform its mission. A COI isnormally phrased as a question that must be answered in order to properly evaluate operationaleffectiveness (e.g., “Will the system detect the threat in a <strong>com</strong>bat environment at adequate rangeto allow successful engagement?”) or operational suitability (e.g., “Will the system be safe tooperate in a <strong>com</strong>bat environment?”).B-5


DATA SYSTEM — Combinations of personnel efforts, forms, formats, instructions, procedures,data elements <strong>and</strong> related data codes, <strong>com</strong>munications facilities, <strong>and</strong> Automated Data ProcessingEquipment (ADPE) that provide an organized <strong>and</strong> interconnected means—either automated, manual,or a mixture of these—for recording, collecting, processing, <strong>and</strong> <strong>com</strong>municating data.DEFENSE ACQUISITION EXECUTIVE (DAE) — The individual responsible for all acquisitionmatters within the DoD. (See DoD Directive (DoDD) 5000.1.)DEPARTMENT OF DEFENSE ACQUISITION SYSTEM — A single uniform system wherebyall equipment, facilities, <strong>and</strong> services are planned, designed, developed, acquired, maintained, <strong>and</strong>disposed of within the DoD. The system en<strong>com</strong>passes establishing <strong>and</strong> enforcing policies <strong>and</strong>practices that govern acquisitions, to include documenting mission needs <strong>and</strong> establishing performancegoals <strong>and</strong> baselines; determining <strong>and</strong> prioritizing resource requirements for acquisitionprograms; planning <strong>and</strong> executing acquisition programs; directing <strong>and</strong> controlling the acquisitionreview process; developing <strong>and</strong> assessing logistics implications; contracting; monitoring theexecution status of approved programs; <strong>and</strong> reporting to the Congress.DESIGNATED ACQUISITION PROGRAM — Program designated by the Director, Operational<strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong> (DOT&E) or the Deputy Director, Developmental <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong> (S&TS)for OSD oversight of test <strong>and</strong> evaluation.DEVELOPING AGENCY (DA) — The Systems Comm<strong>and</strong> or Chief of Naval Operations (CNO)-designated project manager assigned responsibility for the development, test <strong>and</strong> evaluation of aweapon system, subsystem, or item of equipment.DEVELOPMENT TEST AND EVALUATION (DT&E) — T&E conducted throughout the lifecycle to identify potential operational <strong>and</strong> technological capabilities <strong>and</strong> limitations of the alternativeconcepts <strong>and</strong> design options being pursued; support the identification of cost-performancetradeoffs by providing analyses of the capabilities <strong>and</strong> limitations of alternatives; support theidentification <strong>and</strong> description of design technical risks; assess progress toward meeting CriticalOperational Issues (COIs), mitigation of acquisition technical risk, achievement of manufacturingprocess requirements <strong>and</strong> system maturity; assess validity of assumptions <strong>and</strong> conclusions fromthe Analysis of Alternatives (AoAs); provide data <strong>and</strong> analysis in support of the decision to certifythe system ready for Operational <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong> (OT&E); <strong>and</strong> in the case of AutomatedInformation Systems (AISs), support an information systems security certification prior to processingclassified or sensitive data <strong>and</strong> ensure a st<strong>and</strong>ards conformance certification.DEVELOPMENTAL TESTER — The <strong>com</strong>m<strong>and</strong> or agency that plans, conducts, <strong>and</strong> reports theresults of developmental testing. 
Associated contractors may perform developmental testing onbehalf of the <strong>com</strong>m<strong>and</strong> or agency.EARLY OPERATIONAL ASSESSMENT (EOA) — An operational assessment conducted prior toor in support of prototype testing.EFFECTIVENESS — The extent to which the goals of the system are attained, or the degree towhich a system can be elected to achieve a set of specific mission requirements.B-6


ENGINEERING CHANGE — An alteration in the physical or functional characteristics of a systemor item delivered, to be delivered or under development, after establishment of such characteristics.ENGINEERING CHANGE PROPOSAL (ECP) — A proposal to the responsible authority re<strong>com</strong>mendingthat a change to an original item of equipment be considered, <strong>and</strong> the design or engineeringchange be incorporated into the article to modify, add to, delete, or supersede originalparts.ENGINEERING DEVELOPMENT — The RDT&E funding category that includes developmentprograms being engineered for Service use but not yet approved for procurement or operation.Budget Category 6.4 includes those projects in full-scale development of Service use; but theyhave not yet received approval for production or had production funds included in the DoD budgetsubmission for the budget or subsequent fiscal year.EVALUATION CRITERIA — St<strong>and</strong>ards by which achievement of required technical <strong>and</strong> operationaleffectiveness/suitability characteristics or resolution of technical or operational issues maybe evaluated. <strong>Evaluation</strong> criteria should include quantitative thresholds for the Initial OperatingCapability (IOC) system. If parameter maturity grows beyond IOC, intermediate evaluation criteria,appropriately time-lined, must also be provided.FIRST ARTICLE — First article includes pre-production models, initial production samples, testsamples, first lots, pilot models, <strong>and</strong> pilot lots; <strong>and</strong> approval involves testing <strong>and</strong> evaluating thefirst article for conformance with specified contract requirements before or in the initial stage ofproduction under a contract.FIRST ARTICLE TESTING (FAT) — Production testing that is planned, conducted, <strong>and</strong> monitoredby the materiel developer. FAT includes pre-production <strong>and</strong> initial production testing conductedto ensure that the contractor can furnish a product that meets the established technicalcriteria.FOLLOW-ON OPERATIONAL TEST AND EVALUATION (FOT&E) — That test <strong>and</strong> evaluationthat may be necessary after system deployment to refine the estimates made during operationaltest <strong>and</strong> evaluation, to evaluate changes, <strong>and</strong> to re-evaluate the system to ensure that itcontinues to meet operational needs <strong>and</strong> retains its effectiveness in a new environment or againsta new threat.FOLLOW-ON PRODUCTION TEST — A technical test conducted subsequent to a full productiondecision on initial production <strong>and</strong> mass production models to determine production conformancefor quality assurance purposes. Program funding category — Procurement.FOREIGN COMPARATIVE TESTING (FCT) — A DoD T&E program that is prescribed in Title10 U.S.C. §2350a(g), <strong>and</strong> is centrally managed by the Director, Foreign Comparative <strong>Test</strong> (Strategic<strong>and</strong> Tactical Systems (S&TS)). It provides funding for U.S. T&E of selected equipment items<strong>and</strong> technologies developed by allied countries when such items <strong>and</strong> technologies are identifiedas having good potential to satisfy valid DoD requirements.B-7


FUTURE-YEAR DEFENSE PROGRAM (FYDP) —(Formerly the Five Year Defense Program).The official DoD document that summarizes forces <strong>and</strong> resources associated with programs approvedby the Secretary of Defense (SECDEF). Its three parts are the organizations affected,appropriations accounts (Research, Development, <strong>Test</strong>, <strong>and</strong> <strong>Evaluation</strong> (RDT&E), Operations <strong>and</strong>Maintenance (O&M), etc.), <strong>and</strong> the 11 major force programs (strategic forces, airlift/sealift,Research <strong>and</strong> Development (R&D), etc.). R&D is Program 06. Under the current Planning, Programming,Budgeting <strong>and</strong> Execution (PPBE) cycle, the FYDP is updated when the Servicessubmit their Program Objective Memor<strong>and</strong>ums (POMs) to the Office of the Secretary of Defense(OSD) (May/June), when the Services submit their budgets to OSD (September), <strong>and</strong> when thePresident submits the national budget to the Congress (February). The primary data element inthe FYDP is the Program Element (PE).HARMONIZATION — Refers to the process, or results, of adjusting differences or inconsistenciesin the qualitative basic military requirements of the United States, its allies, <strong>and</strong> other friendlycountries. It implies that significant features will be brought into line so as to make possiblesubstantial gains in terms of the overall objectives of cooperation (e.g., enhanced utilization ofresources, st<strong>and</strong>ardization, <strong>and</strong> <strong>com</strong>patibility of equipment). It implies especially that <strong>com</strong>parativelyminor differences in “requirements” should not be permitted to serve as a basis for thesupport of slightly different duplicative programs <strong>and</strong> projects.HUMAN SYSTEMS INTEGRATION (HSI) — A disciplined, unified, <strong>and</strong> interactive approach tointegrate human considerations into system design to improve total system performance <strong>and</strong> reducecosts of ownership. 
The major categories of human considerations are manpower, personnel,training, human factors engineering, safety, <strong>and</strong> health.INDEPENDENT EVALUATION REPORT (IER) — A report that provides an assessment of itemor system operational effectiveness <strong>and</strong> operational suitability versus critical issues as well as theadequacy of testing to that point in the development of item or system.INDEPENDENT OPERATIONAL TEST AGENCY — The Army <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong> Comm<strong>and</strong>(ATEC), the Navy Operational <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong> Force (OPTEVFOR), the Air Force Operational<strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong> Center (AFOTEC), the Marine Corps Operational <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong>Activity (MCOTEA), <strong>and</strong> for the Defense Information Systems Agency (DISA) – the Joint Interoperability<strong>Test</strong> Comm<strong>and</strong> (JITC).INDEPENDENT VERIFICATION AND VALIDATION (IV&V) — An independent review of thesoftware product for functional effectiveness <strong>and</strong> technical sufficiency.INITIAL OPERATIONAL CAPABILITY (IOC) — The first attainment of the capability to employeffectively a weapon, item of equipment, or system of approved specific characteristics with theappropriate number, type, <strong>and</strong> mix of trained <strong>and</strong> equipped personnel necessary to operate, maintain,<strong>and</strong> support the system. It is normally defined in the Operational Requirements Document(ORD).B-8


INITIAL OPERATIONAL TEST AND EVALUATION (IOT&E) — Operational test and evaluation conducted on production or production-representative articles to determine whether systems are operationally effective and suitable for intended use by representative users, to support the decision to proceed beyond Low Rate Initial Production (LRIP). For the Navy, see Operational Evaluation (OPEVAL).
IN-PROCESS REVIEW — Review of a project or program at critical points to evaluate status and make recommendations to the decision authority.
INSPECTION — Visual examination of the item (hardware and software) and associated descriptive documentation, which compares appropriate characteristics with predetermined standards to determine conformance to requirements without the use of special laboratory equipment or procedures.
INTEGRATED PRODUCT AND PROCESS DEVELOPMENT (IPPD) — A management technique that simultaneously integrates all essential acquisition activities through the use of multidisciplinary teams to optimize the design, manufacturing, and supportability processes. IPPD facilitates meeting cost and performance objectives from product concept through production, including field support. One of the key IPPD tenets is multidisciplinary teamwork through Integrated Product Teams (IPTs).
INTEGRATED PRODUCT TEAM (IPT) — Team composed of representatives from all appropriate functional disciplines working together to build successful programs, identify and resolve issues, and make sound and timely recommendations to facilitate decision making. There are three types of IPTs: Overarching IPTs (OIPTs) focus on strategic guidance, program assessment, and issue resolution; Working-level IPTs (WIPTs) identify and resolve program issues, determine program status, and seek opportunities for acquisition reform; and program-level IPTs focus on program execution and may include representatives from both government and, after contract award, industry.
INTEROPERABILITY — The ability of systems, units, or forces to provide services to or accept services from other systems, units, or forces and to use the services so exchanged to operate effectively together. The conditions achieved among communications-electronics systems or items of communications-electronics equipment when information or services can be exchanged directly and satisfactorily between them and/or their users. Designated an element of the Net-Ready Key Performance Parameter (KPP) by the Joint Chiefs of Staff (JCS).
ISSUES — Any aspect of the system's capability, either operational, technical, or other, that must be questioned before the system's overall military utility can be known. Operational issues are issues that must be evaluated considering the soldier and the machine as an entity to estimate the operational effectiveness and operational suitability of the system in its complete user environment.
KEY PERFORMANCE PARAMETERS (KPPs) — Those minimum attributes or characteristics considered most essential for an effective military capability. The Capability Development Document (CDD) and the Capability Production Document (CPD) KPPs are included verbatim in the Acquisition Program Baseline (APB).


LIVE FIRE TEST AND EVALUATION (LFT&E) — A test process, defined in Title 10 U.S.C. §2366, that must be conducted on a covered system, major munition program, missile program, or Product Improvement (PI) to a covered system, major munition program, or missile program before it can proceed beyond Low Rate Initial Production (LRIP). A covered system is a major system (Acquisition Category (ACAT) I, II) that is: 1) user-occupied and designed to provide some degree of protection to its occupants in combat; or 2) a conventional munitions program or missile program; or 3) a conventional munitions program for which more than 1,000,000,000 rounds are planned to be acquired; or 4) a modification to a covered system that is likely to affect significantly the survivability or lethality of such a system.
LIVE FIRE TEST AND EVALUATION (LFT&E) REPORT — Report prepared by the Director, Operational Test and Evaluation (DOT&E) on survivability and lethality testing. Submitted to the Congress for covered systems prior to the decision to proceed beyond Low Rate Initial Production (LRIP).
LETHALITY — The probability that weapon effects will destroy the target or render it neutral.
LIFE CYCLE COST (LCC) — The total cost to the government for the development, acquisition, operation, and logistics support of a system or set of forces over a defined life span.
LOGISTICS SUPPORTABILITY — The degree of ease to which system design characteristics and planned logistics resources (including the Logistics Support (LS) elements) allow for the meeting of system availability and wartime usage requirements.
LONG LEAD ITEMS — Those components of a system or piece of equipment that take the longest time to procure and, therefore, may require an early commitment of funds in order to meet acquisition program schedules.
LOT ACCEPTANCE — A test based on a sampling procedure to ensure that the product retains its quality. No acceptance or installation should be permitted until this test for the lot has been successfully completed.
LOW RATE INITIAL PRODUCTION (LRIP) — The minimum number of systems (other than ships and satellites) to provide production-representative articles for Operational Test and Evaluation (OT&E), to establish an initial production base, and to permit an orderly increase in the production rate sufficient to lead to Full Rate Production (FRP) upon successful completion of operational testing. For Major Defense Acquisition Programs (MDAPs), LRIP quantities in excess of 10 percent of the acquisition objective must be reported in the Selected Acquisition Report (SAR). For ships and satellites, LRIP is the minimum quantity and rate that preserves mobilization.
MAINTAINABILITY — The ability of an item to be retained in, or restored to, a specified condition when maintenance is performed by personnel having specified skill levels, using prescribed procedures and resources, at each prescribed level of maintenance and repair. (See Mean Time To Repair (MTTR).)
MAJOR DEFENSE ACQUISITION PROGRAM (MDAP) — See "Acquisition Category."


MAJOR SYSTEM (DoD) — See "Acquisition Category."
MAJOR RANGE AND TEST FACILITY BASE (MRTFB) — The complex of major DoD ranges and test facilities managed according to DoD 3200.11 by the Director, Test Resource Management Center, reporting to the USD(AT&L).
MEAN TIME BETWEEN FAILURE (MTBF) — For a particular interval, the total functional life of a population of an item divided by the total number of failures within the population. The definition holds for time, rounds, miles, events, or other measures of life units. A basic technical measure of reliability.
MEAN TIME TO REPAIR (MTTR) — The total elapsed time (clock hours) for corrective maintenance divided by the total number of corrective maintenance actions during a given period of time. A basic technical measure of maintainability.
MEASURE OF EFFECTIVENESS (MOE) — A measure of operational performance success that must be closely related to the objective of the mission or operation being evaluated; for example, kills per shot, probability of kill, effective range, etc. Linkage shall exist among the various MOEs used in the Analysis of Alternatives (AoAs), Operational Requirements Document (ORD), and Test and Evaluation (T&E); in particular, the MOEs, Measures of Performance (MOPs), criteria in the ORD, the AoAs, the Test and Evaluation Master Plan (TEMP), and the Acquisition Program Baseline (APB) shall be consistent. A meaningful MOE must be quantifiable and a measure of the degree to which the real objective is achieved.
MEASURE OF PERFORMANCE (MOP) — Measure of a lower level of performance representing subsets of Measures of Effectiveness (MOEs). Examples are speed, payload, range, time on station, frequency, or other distinctly quantifiable performance features.
MILESTONE — The point when a recommendation is made and approval sought regarding starting or continuing (proceeding to the next phase of) an acquisition program.
MILESTONE DECISION AUTHORITY (MDA) — The individual designated in accordance with criteria established by the Under Secretary of Defense (Acquisition, Technology, and Logistics) (USD(AT&L)), or by the Assistant Secretary of Defense (Command, Control, Communications, and Intelligence) (ASD(C3I)) for Automated Information System (AIS) acquisition programs (DoD 5000.2-R (Reference C)), to approve entry of an acquisition program into the next acquisition phase.
MILITARY OPERATIONAL REQUIREMENT — The formal expression of a military need, responses to which result in development or acquisition of item(s), equipment, or systems.
MISSION RELIABILITY — The probability that a system will perform mission-essential functions for a given period of time under conditions stated in the mission profile.
MODEL — A model is a representation of an actual or conceptual system that involves mathematics, logical expressions, or computer simulations that can be used to predict how the system might perform or survive under various conditions or in a range of hostile environments.
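The MEAN TIME BETWEEN FAILURE (MTBF) and MEAN TIME TO REPAIR (MTTR) entries above each describe a simple ratio; as an illustrative sketch only (the symbolic form is added here and is not part of the guide), they can be written as:

MTBF = \frac{\text{total functional life of the population (hours, rounds, miles, or events)}}{\text{total number of failures within the population}}

MTTR = \frac{\text{total corrective maintenance time (clock hours)}}{\text{total number of corrective maintenance actions}}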


MULTI-SERVICE TEST AND EVALUATION — T&E conducted by two or more DoD Components for systems to be acquired by more than one DoD Component (Joint acquisition program), or for a DoD Component's systems that have interfaces with equipment of another DoD Component. May be developmental testing or operational testing (MOT&E).
NON-DEVELOPMENTAL ITEM (NDI) — An NDI is any previously developed item of supply used exclusively for government purposes by a federal agency, a state or local government, or a foreign government with which the United States has a mutual defense cooperation agreement; any item described above that requires only minor modifications or modifications of the type customarily available in the commercial marketplace in order to meet the requirements of the procuring department or agency.
NONMAJOR DEFENSE ACQUISITION PROGRAM — A program other than a Major Defense Acquisition Program (MDAP) (Acquisition Category (ACAT) I) or a highly sensitive classified program; i.e., ACAT II and ACAT III programs.
NUCLEAR HARDNESS — A quantitative description of the resistance of a system or component to malfunction (temporary and permanent) and/or degraded performance induced by a nuclear weapon environment. Measured by resistance to physical quantities such as overpressure, peak velocities, energy absorbed, and electrical stress. Hardness is achieved through adhering to appropriate design specifications and is verified by one or more test and analysis techniques.
OBJECTIVE — The performance value that is desired by the user and which the Program Manager (PM) is attempting to obtain. The objective value represents an operationally meaningful, time-critical, and cost-effective increment above the performance threshold for each program parameter.
OPEN SYSTEMS (Acquisition of Weapons Systems) — An integrated technical and business strategy that defines key interfaces for a system (or a piece of equipment under development) in accordance with those adopted by formal consensus bodies (recognized industry standards bodies) as specifications and standards, or commonly accepted (de facto) standards (both company-proprietary and non-proprietary) if they facilitate utilization of multiple suppliers.
OPERATIONAL ASSESSMENT (OA) — An evaluation of operational effectiveness and operational suitability made by an independent operational test activity, with user support as required, on other than production systems. The focus of an OA is on significant trends noted in development efforts, programmatic voids, areas of risk, adequacy of requirements, and the ability of the program to support adequate Operational Testing (OT). An OA may be made at any time using technology demonstrators, prototypes, mock-ups, Engineering Development Models (EDMs), or simulations, but will not substitute for the independent Operational Test and Evaluation (OT&E) necessary to support full production decisions.
OPERATIONAL AVAILABILITY (Ao) — The degree (expressed in terms of 1.0 or 100 percent as the highest) to which one can expect equipment or weapon systems to work properly when required. The equation is uptime over uptime plus downtime, expressed as Ao. It is the quantitative link between readiness objectives and supportability.
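The OPERATIONAL AVAILABILITY (Ao) entry above states the equation in words; as an illustrative sketch (the symbols and the sample numbers are added here, not taken from the guide):

A_o = \frac{\text{Uptime}}{\text{Uptime} + \text{Downtime}}

For example, a system that accumulated 900 hours of uptime and 100 hours of downtime over a reporting period would have A_o = 900 / (900 + 100) = 0.90, i.e., 90 percent.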


OPERATIONAL EFFECTIVENESS — The overall degree of mission accomplishment of a system when used by representative personnel in the environment planned or expected (e.g., natural, electronic, threat, etc.) for operational employment of the system, considering organization, doctrine, tactics, survivability, vulnerability, and threat (including countermeasures; initial nuclear weapons effects; and Nuclear, Biological, and Chemical Contamination (NBCC) threats).
OPERATIONAL EVALUATION — Addresses the effectiveness and suitability of the weapons, equipment, or munitions for use in combat by typical military users and the system operational issues and criteria; provides information to estimate organizational structure, personnel requirements, doctrine, training, and tactics; identifies any operational deficiencies and the need for any modifications; and assesses MANPRINT (safety, health hazards, human factors, manpower, and personnel) aspects of the system in a realistic operational environment.
OPERATIONAL REQUIREMENTS — User- or user representative-generated validated needs developed to address mission area deficiencies, evolving threats, emerging technologies, or weapon system cost improvements. Operational requirements form the foundation for weapon system-unique specifications and contract requirements.
OPERATIONAL REQUIREMENTS DOCUMENT (ORD) — Documents the user's objectives and minimum acceptable requirements for operational performance of a proposed concept or system. Format is contained in Chairman of the Joint Chiefs of Staff Instruction (CJCSI) 3170.01B, Requirements Generation System (RGS), April 15, 2001.
OPERATIONAL SUITABILITY — The degree to which a system can be placed satisfactorily in field use with consideration being given to availability, compatibility, transportability, interoperability, reliability, wartime usage rates, maintainability, safety, human factors, manpower supportability, logistics supportability, natural environmental effects and impacts, documentation, and training requirements.
OPERATIONAL TEST AND EVALUATION (OT&E) — The field test, under realistic conditions, of any item (or key component) of weapons, equipment, or munitions for the purpose of determining the effectiveness and suitability of the weapons, equipment, or munitions for use in combat by typical military users; and the evaluation of the results of such tests. (Title 10 U.S.C. 2399)
OPERATIONAL TEST CRITERIA — Expressions of the operational level of performance required of the military system to demonstrate operational effectiveness for given functions during each operational test. The expression consists of the function addressed, the basis for comparison, the performance required, and the confidence level.
OPERATIONAL TEST READINESS REVIEW (OTRR) — A review to identify problems that may impact the conduct of an Operational Test and Evaluation (OT&E). The OTRRs are conducted to determine changes required in planning, resources, or testing necessary to proceed with the OT&E. Participants include the operational tester (chair), evaluator, materiel developer, user representative, logisticians, Headquarters, Department of the Army (HQDA) staff elements, and others as necessary.


PARAMETER — A determining factor or characteristic. Usually related to performance in developing a system.
PERFORMANCE — Those operational and support characteristics of the system that allow it to effectively and efficiently perform its assigned mission over time. The support characteristics of the system include both supportability aspects of the design and the support elements necessary for system operation.
PILOT PRODUCTION — Production line normally established during the first production phase to test new manufacturing methods and procedures. Normally funded by Research, Development, Test, and Evaluation (RDT&E) until the line is proven.
POSTPRODUCTION TESTING — Testing conducted to assure that materiel that is reworked, repaired, renovated, rebuilt, or overhauled after initial issue and deployment conforms to specified quality, reliability, safety, and operational performance standards. Included in postproduction tests are surveillance tests, stockpile reliability tests, and reconditioning tests.
PREPLANNED PRODUCT IMPROVEMENT (P3I) — Planned future evolutionary improvement of developmental systems for which design considerations are effected during development to enhance future application of projected technology. Includes improvements planned for ongoing systems that go beyond the current performance envelope to achieve a needed operational capability.
PREPRODUCTION PROTOTYPE — An article in final form employing standard parts, representative of articles to be produced subsequently in a production line.
PREPRODUCTION QUALIFICATION TEST — The formal contractual tests that ensure design integrity over the specified operational and environmental range. These tests usually use prototype or pre-production hardware fabricated to the proposed production design specifications and drawings. Such tests include contractual Reliability and Maintainability (R&M) demonstration tests required prior to production release.
PROBABILITY OF KILL (Pk) — The lethality of a weapon system. Generally refers to armaments (e.g., missiles, ordnance, etc.). Usually the statistical probability that the weapon will detonate close enough to the target with enough effectiveness to disable the target.
PRODUCT IMPROVEMENT (PI) — Effort to incorporate a configuration change involving engineering and testing effort on end items and depot repairable components, or changes on other than developmental items, to increase system or combat effectiveness or extend useful military life. Usually results from feedback from the users.
PRODUCTION ACCEPTANCE TEST AND EVALUATION (PAT&E) — T&E of production items to demonstrate that items procured fulfill the requirements and specifications of the procuring contract or agreements.
PRODUCTION ARTICLE — The end item under initial or Full Rate Production (FRP).


PRODUCTION PROVE OUT — A technical test conducted prior to production testing with prototype hardware to determine the most appropriate design alternative. This testing may also provide data on safety, the achievability of critical system technical characteristics, refinement and ruggedization of hardware configurations, and determination of technical risks.
PRODUCTION QUALIFICATION TEST (PQT) — A technical test completed prior to the Full Rate Production (FRP) decision to ensure the effectiveness of the manufacturing process, equipment, and procedures. This testing also serves the purpose of providing data for the independent evaluation required for materiel release so that the evaluator can address the adequacy of the materiel with respect to the stated requirements. These tests are conducted on a number of samples taken at random from the first production lot, and are repeated if the process or design is changed significantly, and when a second or alternative source is brought online.
PROGRAM MANAGER (PM) — A military or civilian official who is responsible for managing, through Integrated Product Teams (IPTs), an acquisition program.
PROTOTYPE — An original or model on which a later system/item is formed or based. Early prototypes may be built during early design stages and tested prior to advancing to advanced engineering. Selected prototyping may evolve into an Engineering Development Model (EDM), as required to identify and resolve specific design and manufacturing risks in that phase or in support of Preplanned Product Improvement (P3I) or Evolutionary Acquisition (EA).
QUALIFICATION TESTING — Simulates defined operational environmental conditions with a predetermined safety factor, the results indicating whether a given design can perform its function within the simulated operational environment of a system.
QUALITY ASSURANCE (QA) — A planned and systematic pattern of all actions necessary to provide confidence that adequate technical requirements are established, that products and services conform to established technical requirements, and that satisfactory performance is achieved.
REALISTIC TEST ENVIRONMENT — The conditions under which the system is expected to be operated and maintained, including the natural weather and climatic conditions, terrain effects, battlefield disturbances, and enemy threat conditions.
RELIABILITY — The ability of a system and its parts to perform its mission without failure, degradation, or demand on the support system. (See Mean Time Between Failure (MTBF).)
REQUIRED OPERATIONAL CHARACTERISTICS — System parameters that are primary indicators of the system's capability to be employed to perform the required mission functions, and to be supported.
REQUIRED TECHNICAL CHARACTERISTICS — System parameters selected as primary indicators of achievement of engineering goals. These need not be direct measures of, but should always relate to, the system's capability to perform the required mission functions, and to be supported.


RESEARCH — 1. Systematic inquiry into a subject in order to discover or revise facts, theories, etc.; to investigate. 2. A means of developing new technology for potential use in defense systems.
RISK — A measure of the inability to achieve program objectives within defined cost and schedule constraints. Risk is associated with all aspects of the program, e.g., threat, technology, design processes, Work Breakdown Structure (WBS) elements, etc. It has two components:
1. The probability of failing to achieve a particular outcome; and
2. The consequences of failing to achieve that outcome.
RISK ASSESSMENT — The process of identifying program risks within risk areas and critical technical processes, analyzing them for their consequences and probabilities of occurrence, and prioritizing them for handling.
RISK MONITORING — A process that systematically tracks and evaluates the performance of risk items against established metrics throughout the acquisition process and develops further risk-reduction handling options as appropriate.
SAFETY — The application of engineering and management principles, criteria, and techniques to optimize safety within the constraints of operational effectiveness, time, and cost throughout all phases of the system life cycle.
SAFETY/HEALTH VERIFICATION — The development of data used to evaluate the safety and health features of a system to determine its acceptability. This is done primarily during Developmental Test (DT) and user or Operational Test (OT) and evaluation, and is supplemented by analysis and independent evaluations.
SAFETY RELEASE — A formal document issued to a user test organization before any hands-on use or maintenance by personnel. The safety release indicates the system is safe for use and maintenance by typical user personnel and describes the system safety analyses. Operational limits and precautions are included. The test agency uses the data to integrate safety into test controls and procedures and to determine if the test objectives can be met within these limits.
SELECTED ACQUISITION REPORT (SAR) — Standard, comprehensive, summary status reports on Major Defense Acquisition Programs (MDAPs) (Acquisition Category (ACAT) I) required for periodic submission to the Congress. They include key cost, schedule, and technical information.
SIMULATION — A simulation is a method for implementing a model. It is the process of conducting experiments with a model for the purpose of understanding the behavior of the system modeled under selected conditions or of evaluating various strategies for the operation of the system within the limits imposed by developmental or operational criteria. Simulation may include the use of analog or digital devices, laboratory models, or "test bed" sites. Simulations are usually programmed for solution on a computer; however, in the broadest sense, military exercises and war games are also simulations.


SIMULATOR — A generic term used to describe equipment used to represent weapon systems in development testing, operational testing, and training; e.g., a threat simulator has one or more characteristics which, when detected by human senses or man-made sensors, provide the appearance of an actual threat weapon system with a prescribed degree of fidelity.
SPECIFICATION — A document used in development and procurement that describes the technical requirements for items, materials, and services, including the procedures by which it will be determined that the requirements have been met. Specifications may be unique to a specific program (program-peculiar) or they may be common to several applications (general in nature).
SUBTEST — An element of a test program. A subtest is a test conducted for a specific purpose (e.g., rain, dust, transportability, missile firing, fording).
SURVIVABILITY — Survivability is the capability of a system and its crew to avoid or withstand a man-made hostile environment without suffering an abortive impairment of its ability to accomplish its designated mission.
SUSCEPTIBILITY — The degree to which a device, equipment, or weapon system is open to effective attack due to one or more inherent weaknesses. Susceptibility is a function of operational tactics, countermeasures, probability of the enemy fielding a threat, etc. Susceptibility is considered a subset of survivability.
SYSTEM — 1. The organization of hardware, software, material, facilities, personnel, data, and services needed to perform a designated function with specified results, such as the gathering of specified data, its processing, and delivery to users. 2. A combination of two or more interrelated equipment (sets) arranged in a functional package to perform an operational function or to satisfy a requirement.
SYSTEM ENGINEERING, DEFENSE — A comprehensive, iterative technical management process that includes translating operational requirements into configured systems, integrating the technical inputs of the entire design team, managing interfaces, characterizing and managing technical risk, transitioning technology from the technology base into program-specific efforts, and verifying that designs meet operational needs. It is a life cycle activity that demands a concurrent approach to both product and process development.
SYSTEMS ENGINEERING PROCESS (SEP) — A logical sequence of activities and decisions transforming an operational need into a description of system performance parameters and a preferred system configuration.
SYSTEM SPECIFICATION — States all necessary functional requirements of a system in terms of technical performance and mission requirements, including test provisions to assure that all requirements are achieved. Essential physical constraints are included. System specifications state the technical and mission requirements of the system as an entity.


SYSTEM THREAT ASSESSMENT (STA) — Describes the threat to be countered and the projected threat environment. The threat information must be validated by the Defense Intelligence Agency (DIA) for programs reviewed by the Defense Acquisition Board (DAB).
TECHNICAL EVALUATION (TECHEVAL) — The study, investigations, or Test and Evaluation (T&E) by a developing agency to determine the technical suitability of materiel, equipment, or a system for use in the military Services. (See Development Test and Evaluation (DT&E), Navy.)
TECHNICAL FEASIBILITY TEST — A developmental test typically conducted during Concept Refinement (CR) and Technology Development (TD) to provide data to assist in determining safety and health hazards, and establishing system performance specifications and feasibility.
TECHNICAL PERFORMANCE MEASUREMENT (TPM) — Describes all the activities undertaken by the government to obtain design status beyond those treating schedule and cost. TPM is defined as the product design assessment, which estimates, through tests, the values of essential performance parameters of the current design of Work Breakdown Structure (WBS) product elements. It forecasts the values to be achieved through the planned technical program effort, measures differences between achieved values and those allocated to the product element by the Systems Engineering Process (SEP), and determines the impact of those differences on system effectiveness.
TECHNICAL TESTER — The command or agency that plans, conducts, and reports the results of Army technical testing. Associated contractors may perform development testing on behalf of the command or agency.
TEST — Any program or procedure that is designed to obtain, verify, or provide data for the evaluation of Research and Development (R&D) (other than laboratory experiments); progress in accomplishing development objectives; or performance and operational capability of systems, subsystems, components, and equipment items.
TEST AND EVALUATION (T&E) — The process by which a system or components provide information regarding risk and risk mitigation, and empirical data to validate models and simulations. T&E permits an assessment of the attainment of technical performance, specifications, and system maturity to determine whether systems are operationally effective, suitable, and survivable for intended use. There are two general types of T&E: Developmental (DT&E) and Operational (OT&E). (See Operational Test and Evaluation (OT&E), Initial Operational Test and Evaluation (IOT&E), and Developmental Test and Evaluation (DT&E).)
TEST AND EVALUATION MASTER PLAN (TEMP) — Documents the overall structure and objectives of the Test and Evaluation (T&E) program. It provides a framework within which to generate detailed T&E plans, and it documents schedule and resource implications associated with the T&E program. The TEMP identifies the necessary Developmental Test and Evaluation (DT&E), Operational Test and Evaluation (OT&E), and Live Fire Test and Evaluation (LFT&E) activities. It relates program schedule, test management strategy and structure, and required resources to: Critical Operational Issues (COIs); critical technical parameters; objectives and thresholds documented in the Operational Requirements Document (ORD); evaluation criteria; and milestone decision points. For multi-Service or joint programs, a single integrated TEMP is required. Component-unique content requirements, particularly evaluation criteria associated with COIs, can be addressed in a component-prepared annex to the basic TEMP. (See Capstone TEMP.)
TEST BED — A system representation consisting of actual hardware and/or software and computer models, or prototype hardware and/or software.
TEST CRITERIA — Standards by which test results and outcomes are judged.
TEST DESIGN PLAN — A statement of the conditions under which the test is to be conducted, the data required from the test, and the data handling required to relate the data results to the test conditions.
TEST INSTRUMENTATION — Scientific, Automated Data Processing Equipment (ADPE), or technical equipment used to measure, sense, record, transmit, process, or display data during tests, evaluations, or examination of materiel, training concepts, or tactical doctrine. Audio-visual equipment is included as instrumentation when used to support Army testing.
TEST RESOURCES — A collective term that encompasses all elements necessary to plan, conduct, and collect/analyze data from a test event or program. Elements include test funding and support manpower (including Temporary Duty (TDY) costs), test assets (or units under test), test asset support equipment, technical data, simulation models, test beds, threat simulators, surrogates and replicas, special instrumentation peculiar to a given test asset or test event, targets, tracking and data acquisition, instrumentation, equipment for data reduction, communications, meteorology, utilities, photography, calibration, security, recovery, maintenance and repair, frequency management and control, and base/facility support services.
THREAT — The sum of the potential strengths, capabilities, and strategic objectives of any adversary that can limit or negate U.S. mission accomplishment or reduce force, system, or equipment effectiveness.
THRESHOLDS — The Minimum Acceptable Value (MAV) which, in the user's judgment, is necessary to satisfy the need. If threshold values are not achieved, program performance is seriously degraded, the program may be too costly, or the program may no longer be viable.
TRANSPORTABILITY — The capability of materiel to be moved by towing, self-propulsion, or carrier through any means, such as railways, highways, waterways, pipelines, oceans, and airways. (Full consideration of available and projected transportation assets, mobility plans, and schedules, and the impact of system equipment and support items on the strategic mobility of operating military forces, is required to achieve this capability.)
UNKNOWN-UNKNOWNS (UNK(s)) — A future situation that is impossible to plan for, predict, or even know what to look for.


USER — An operational command or agency that receives or will receive benefit from the acquired system. Combatant Commanders (COCOMs) and the Services are the users. There may be more than one user for a system. The Services are seen as users for systems required to organize, equip, and train forces for the COCOMs of the unified commands.
USER FRIENDLY — Primarily a term used in Automated Data Processing (ADP), it connotes a machine (hardware) or program (software) that is compatible with a person's ability to operate each successfully and easily.
USER REPRESENTATIVES — A command or agency that has been formally designated by proper authority to represent single or multiple users in the requirements and acquisition process. The Services and the Service components of the Combatant Commanders (COCOMs) are normally the user representatives. There should be only one user representative for a system.
VALIDATION — 1. The process by which the contractor (or as otherwise directed by the DoD component procuring activity) tests a publication/Technical Manual (TM) for technical accuracy and adequacy. 2. The procedure of comparing input and output against an edited file and evaluating the result of the comparison by means of a decision table established as a standard.
VARIANCE (Statistical) — A measure of the degree of spread among a set of values; a measure of the tendency of individual values to vary from the mean value. It is computed by subtracting the mean value from each value, squaring each of these differences, summing these results, and dividing this sum by the number of values in order to obtain the arithmetic mean of these squares.
VULNERABILITY — The characteristics of a system that cause it to suffer a definite degradation (loss or reduction of capability to perform the designated mission) as a result of having been subjected to a certain (defined) level of effects in an unnatural (man-made) hostile environment. Vulnerability is considered a subset of survivability.
WORK BREAKDOWN STRUCTURE (WBS) — An organized method to break down a project into logical subdivisions or subprojects at lower and lower levels of detail. It is very useful in organizing a project.
WORKING-LEVEL INTEGRATED PRODUCT TEAM (WIPT) — Team of representatives from all appropriate functional disciplines working together to build successful and balanced programs, identify and resolve issues, and make sound and timely decisions. WIPTs may include members from both government and industry, including program contractors and subcontractors. A committee, which includes non-government representatives to provide an industry view, would be an advisory committee covered by the Federal Advisory Committee Act (FACA) and must follow the procedures of that Act.
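The VARIANCE (Statistical) entry above describes the computation in words; as an illustrative sketch (the symbols are added here, not taken from the guide), for N values x_1, ..., x_N with arithmetic mean \bar{x}:

\sigma^2 = \frac{1}{N} \sum_{i=1}^{N} \left( x_i - \bar{x} \right)^2

That is, subtract the mean from each value, square each difference, sum the squares, and divide the sum by the number of values.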






APPENDIX C

TEST-RELATED DATA ITEM DESCRIPTIONS
extracted from DoD 5010.12-L, Acquisition Management Systems and Data Requirements Control List (AMSDL)

Acceptance Test Plan: DI-QCIC-80154, 80553
Airborne Sound Measurements Test Report: DI-HFAC-80272
Airframe Rigidity Test Report: DI-T-30734
Ammunition Test Expenditure Report: DI-MISC-80060
Armor Material Test Reports: DI-MISC-80073
Ballistic Acceptance Test Report: DI-MISC-80246
C.P. Propeller Test Agenda: UDI-T-23737
Coordinated Test Plan: DI-MGMT-80937
Corrosion Testing Reports: DI-MFFP-80108
Damage Tolerance Test Results Reports: DI-T-30725
Demonstration Test
    Plan: DI-QCIC-80775
    Report: DI-QCIC-80774
Directed Energy Survivability Test Plan: DI-R-1786
Durability Test Results Report: DI-T-30726
Electromagnetic Compatibility Test Plan: DI-T-3704B
Electromagnetic Interference Test
    Plan: DI-EMCS-80201
    Report: DI-EMCS-80200
Electrostatic Discharge Sensitivity Test Report: DI-RELI-80670
Emission Control (EMCON) Test Report: DI-R-2059
Endurance Test (EMCS) Failure Reports: DI-ATTS-80366
Engineer Design Test Plan: DI-MGMT-80688
Environmental Design Test Plan: DI-ENVR-80861
Environmental Test Report: DI-ENVR-80863
Equipment Test Plan (Nonsystem): DI-T-3709A
Factory Test
    Plan: DI-QCIC-80153
    EMCS Plan: DI-ATTS-80360
    EMCS Procedures: DI-ATTS-80361
    EMCS Reports: DI-ATTS-80362
First Article Qualification Test Plan: DI-T-5315A
Flight Flutter Test Report: DI-T-30733
Flutter Model Test Report: DI-T-30732
Hardware Diagnostic Test System Development Plan: DI-ATTS-80005
High-Impact Shock Test Procedures: DI-ENVR-80709
Hull Test Results (Boats) Report: UDI-T-23718


Human Engineering Test
    Plan: DI-HFAC-80743
    Report: DI-HFAC-80744
Inspection and Test Plan: DI-QCIC-81110
Installation Test
    Plan: DI-QCIC-80155
    Procedures: DI-QCIC-80511
    Report: DI-QCIC-80140, 80512
Integrated Circuit Test Documentation: DI-ATTS-80888
Maintainability/Testability Demonstration Test
    Plan: DI-MNTY-80831
    Report: DI-MNTY-80832
Maintenance Training Equipment Test Outlines: DI-H-6129A
Master Test Plan/Program Test Plan: DI-T-30714
NBC Contamination Survivability Test Plan: DI-R-1779
Nuclear Survivability Test
    Plan: DI-NUOR-80928
    Report: DI-NUOR-80929
Packaging Test
    Plan: DI-PACK-80456
    Report: DI-PACK-80457
Part, Component, or Subsystem Test Plan(s): DI-MISC-80759
Parts (Non-standard) Test Data Report: DI-MISC-81058
Parts Qualification Test Plan: DI-T-5477A
Performance Oriented Packaging Test Report: DI-PACK-81059
Production Test
    Plan: DI-MNTY-80173
    Report: DI-NDTI-80492
Quality Conformance Test Procedures: DI-RELI-80322
Radar Spectrum Management (RSM) Test Plan: DI-MISC-81113
Randomizer Test Report: DI-NDTI-80884
Reliability Test
    Plan: DI-RELI-80250
    Procedures: DI-RELI-80251
    Reports: DI-RELI-80252
Research and Development Test and Acceptance Plan: DI-T-30744
Rough Handling Test Report: DI-T-5144C
Ship Acceptance Test (SAT)
    Schedule: DI-T-23959B
    Report: DI-T-23190A
Shipboard Industrial Test Procedures: DI-QCIC-80206
Shock Test
    Extension Request: DI-ENVR-80706
    Report: DI-ENVR-80708
Software General Unit Test Plan: DI-MCCR-80307


Software Test
    Description: DI-MCCR-80015A
    Plan: DI-MCCR-80014A
    Procedures: DI-MCCR-80310
    Report: DI-MCCR-80017A, 80311
Software System
    Development Test and Evaluation Plan: DI-MCCR-80309
    Integration and Test Plan: DI-MCCR-80308
Sound Test Failure Notification and Recommendation: DI-HFAC-80271
Special Test Equipment Plan: DI-T-30702
Spectrum Signature Test Plan: DI-R-2068
Static Test
    Plan: DI-T-21463A
    Reports: DI-T-21464A
Structure-borne Vibration Acceleration Measurement Test: DI-HFAC-80274
Superimposed Load Test Report: DI-T-5463A
Tempest Test
    Request: DI-EMCS-80218
    Plan: DI-T-1912A
Test Change Proposal: DI-T-26391B
Test Elements List: DI-QCIC-80204
Test Facility Requirements Document (TFRD): DI-FACR-80810
Test Package: DI-ILSS-81085
Test
    Plan: DI-NDTI-80566
    Plans/Procedures: DI-NDTI-80808
    Procedure: DI-NDTI-80603
    Procedures: UDI-T-23732B
Test Plan Documentation for AIS: DI-IPSC-80697
Test Program
    Documentation (TPD): DI-ATTS-80284
    Integration Logbook: DI-ATTS-80281
TPS and OTPS Acceptance Test
    Procedures (ATPS): DI-ATTS-80282A
    Report (ATR): DI-ATTS-80283A
Test Reports: DI-NDTI-80809A, DI-MISC-80653
Test Requirements Document: DI-T-2181, DI-ATTS-80002, 80041
Test Scheduling Report: DI-MISC-80761
Testability
    Program Plan: DI-T-7198
    Analysis Report: DI-T-7199


Trainer Test Procedures and Results Report: DI-T-25594C
Vibration and Noise Test Reports: DI-T-30735
Vibration Testing
    Extension: UDI-T-23752
    Report: UDI-T-23762
Welding Procedure Qualification Test Report: DI-MISC-80876


APPENDIX D

BIBLIOGRAPHY

Acquisition Desk Book, http://akss.dau.mil/DAG.
AFLC Pamphlet 800-34, ACQ Logistics Management.
AFM [Air Force Manual] 55-43, Management of Operational Test and Evaluation.
AFM 99-113, Space Systems Test and Evaluation Process, May 1, 1996.
AFOTEC Regulation 23-1, Organization and Functions, Air Force Operational Test and Evaluation Center (AFOTEC).
AFOTEC, Electronic Warfare Operational Test and Evaluation.
AFOTEC, Nuclear Survivability Handbook for Air Force OT&E.
AFOTEC 99-103, Capabilities Test and Evaluation, October 22, 2001.
AFR [Air Force Regulation] 23-36, Air Force Operational Test and Evaluation Center (AFOTEC).
AFR 55-30, Operations Security.
AFR 55-43, Management of Operational Test and Evaluation, June 1979.
AFR 800-2, Acquisition Program Management, July 21, 1994.
AFR 800-8, Integrated Logistics Support (ILS) Program.
AFR 80-14, Test and Evaluation. (Cancelled)
AFR 80-38, Management of the Air Force Systems Survivability Program, May 27, 1997.
AFSCR [Air Force Systems Command Regulation] 550-13, Producibility.
AFSCR 550-15, Statistical Process Control.
AMC [Army Materiel Command] Plan 70-2, AMC-TRADOC Materiel Acquisition Handbook.
Anderson, J.E., "Towards the Goal of Improving MIL-STD Testing," Defense Systems Management Journal, Vol. 12, No. 2, pp. 30-34, April 1976.
AR [Army Regulation] 10-11, U.S. Army Materiel Development and Readiness Command.
AR 10-4, U.S. Army Operational Test and Evaluation Agency. (Cancelled)
AR 10-41, U.S. Army Training and Doctrine Command.
AR 10-42, U.S. Army Forces Command.
AR 700-127, Integrated Logistics Support, November 10, 1999.
AR 70-24, Special Procedures Pertaining to Nuclear Weapons System Development.
AR 70-60, Nuclear Survivability of Army Materiel.


AR 70-71, Nuclear, Biological, and Chemical Contamination Survivability of Army Materiel.
AR 71-1, Systems Acquisition Policy and Procedures (replaced by AR 70-1), December 31, 2003.
AR 71-3, Force Development User Testing.
AR 73-1, Test and Evaluation Policy, January 7, 2002.
ATEC PAM [Army Test and Evaluation Command Pamphlet] 310-4, Index of Test Operations Procedures and International Test Operations Procedures, December 31, 1988.
BDM Corporation, Application of Simulations to the OT&E Process, May 1974.
BDM Corporation, Functional Description of the Acquisition Test and Evaluation Process, July 8, 1983.
Bolino, John V., Software T&E in the Department of Defense.
CJCSI [Chairman of the Joint Chiefs of Staff Instruction] 3170.013, Joint Capabilities Integration and Development System (JCIDS), March 12, 2004.
CJCSM [Chairman of the Joint Chiefs of Staff Manual] 3170.01A, Operation of the Joint Capabilities Integration and Development System, March 12, 2004.
COMOPTEVFOR [Commander, Operational Test and Evaluation Force], AD Number AD124168, Operational Test Director Guide.
COMOPTEVFOR, Project Analysis Guide.
COMOPTEVFORINST 3960.1D, Operational Test Director's Guide.
DA PAM [Department of the Army Pamphlet] 700-50, Integrated Logistics Support: Developmental Supportability Test and Evaluation Guide. (Cancelled)
DA PAM 73-1, Test and Evaluation in Support of Systems Acquisition, May 30, 2003.
Data Link Vulnerability Analysis Joint Test Force, DVAL Methodology – Methodology Overview/Executive Summary, November 1984.
Defense Acquisition University Press, The Glossary of Defense Acquisition Acronyms and Terms, Vol. 11, September 2003.
Defense Science Board Task Force Report, Solving the Risk Equation in Transitioning from Development to Production, May 25, 1983 (later published as DoD Manual 4245.7).
Defense Science Board, Report of Task Force on Test and Evaluation, April 1, 1974.
Defense Science Board Report, Test and Evaluation Capabilities, December 2000.
Defense Systems Management College, Acquisition Logistics Guide, December 1997.
Defense Systems Management College, Joint Logistics Commanders Guidance for the Use of an Evolutionary Acquisition (EA) Strategy in Acquiring Command and Control (C2) Systems, March 1987.
Defense Systems Management College, "Risk Management as a Means of Direction and Control," Program Manager's Notebook, Fact Sheet Number 4.5, June 1992.
Defense Systems Management College, Defense Manufacturing Management Course.


Defense Systems Management College, Integrated Logistics Support Guide, First Edition, May 1986.
Defense Systems Management College, Joint Logistics Commanders Guide for the Management of Joint Service Programs, 1987.
Defense Systems Management College, Joint Logistics Commanders Guide for the Management of Multinational Programs, 1987.
Defense Systems Management College, Systems Engineering Fundamentals Guide, January 2001.
Defense Systems Management College, Systems Engineering Management Guide, Second Edition, 1990.
Director, Defense Test and Evaluation, Memorandum of Response of Admiral Isham Linder, Director, Test and Evaluation, to Supplemental Questions posed by the Senate Governmental Affairs Committee, August 2, 1983.
DoD 3200.11-D, Major Range and Test Facility Base Summary of Capabilities, June 1, 1983.
DoD 3235.1-H, Test and Evaluation of System Reliability, Availability, and Maintainability: A Primer, March 1982.
DoD 4245.7-M, Transition from Development to Production, September 1985.
DoD 5010.12-L, Acquisition Management Systems and Data Requirements Control List (AMSDL), April 2001.
DoD 5000.2-R, Mandatory Procedures for Major Defense Acquisition Programs (MDAPs) and Major Automated Information System (MAIS) Acquisition Programs, June 2001.
DoD 5000.3-M-1, Test and Evaluation Master Plan (TEMP) Guidelines, October 1986. (Cancelled)
DoD 5000.3-M-2, Foreign Weapons Evaluation Program and NATO Comparative Test Program Procedures Manual, January 1994.
DoD 5000.3-M-3, Software Test and Evaluation Manual. (Cancelled)
DoDD [Department of Defense Directive] 3200.11, Major Range and Test Facility Base (MRTFB), May 1, 2002.
DoDD 4245.6, Defense Production Management. (Cancelled)
DoDD 4245.7, Transition from Development to Production, with Change 1, September 1985. (Cancelled)
DoDD 5000.1, The Defense Acquisition System, May 12, 2003.
DoDD 5000.3, Test and Evaluation, December 26, 1979. (Cancelled)
DoDD 5000.37, Acquisition and Distribution of Commercial Products (ADCOP), September 29, 1978. (Cancelled)
DoDD 5000.59, DoD Modeling and Simulation Management, January 20, 1998.
DoDD 5000.38, Production Readiness Reviews. (Cancelled)
DoDD 5000.39, Acquisition and Management of Integrated Logistics Support for Systems and Equipment, November 17, 1983. (Cancelled)
DoDD 5010.8, Value Engineering Program, 1968.
DoDD 5010.41, Joint Test and Evaluation (JT&E) Program, February 23, 1998.


DoDD 5105.31, Defense Nuclear Agency, June 14, 1985. (Cancelled)
DoDD 5134.1, Under Secretary of Defense for Acquisition, Technology, and Logistics (USD(AT&L)), April 21, 2000.
DoDD 5141.2, Director of Operational Test and Evaluation (DOT&E), May 25, 2000.
DoDD 5160.5, Responsibilities for Research, Development, and Acquisition of Chemical Weapons and Chemical and Biological Defense, May 1, 1985.
DoDI [Department of Defense Instruction] 4245.4, Acquisition of Nuclear-Survivable Systems, July 25, 1988. (Cancelled)
DoDI 5000.2, Operation of the Defense Acquisition System, May 12, 2003.
DoDI 5000.61, DoD Modeling and Simulation (M&S) Verification, Validation and Accreditation (VV&A), May 13, 2003.
DoDI 5030.55, Joint AEC-DoD Nuclear Weapons Development Procedures, January 4, 1974.
DoDI 7000.10, Contractor Cost Performance, Funds Status, and Cost/Schedule Reports. (Cancelled)
DoD-STD [Department of Defense Standard]-2167, Defense System Software Development, January 1, 1987.
DoD-STD-2168, Defense System Software Quality Program, May 21, 1996.
DoD FMR 7000.14-R, Vol. 02B, Budget Formulation and Presentation, Chapter 5, "Research, Development, Test and Evaluation Appropriations," June 2004.
FY88 Report of the Secretary of Defense to the Congress, 1988.
GAO/NSIAD-00-199, Best Practices: A More Constructive Test Approach is Key to Better Weapon System Outcome, July 2000.
GAO/NSIAD-87-57, Operational Test and Evaluation Can Contribute More to Decision-Making, February 4, 1987.
GAO/PEMD-86-12BR, Bigeye Bomb: An Evaluation of DoD's Chemical and Developmental Tests, June 10, 1986.
GAO/PEMD-87-17, Live Fire Testing: Evaluating DoD's Programs, June 12, 1997.
GAO/PSAD-78-102, Operational Testing of Air Force Systems Requires Several Improvements.
GAO/PSAD-79-86, Review of Military Services Development Test and Evaluation of Six Weapon Systems Totaling an Estimated $12 Billion in Development and Procurement Costs, June 25, 1979.
Georgia Institute of Technology, The Software Test and Evaluation Project.
Hearing before the Senate Committee on Governmental Affairs, June 23, 1983.
Hoivik, Thomas H., The Navy Test and Evaluation Process in Major Systems Acquisition, DTIC Accession No. AD A035935, November 1976.
International Test and Evaluation Association (ITEA), Test and Evaluation of Command and Control Systems-Developed Evolutionary Acquisition, May 1987.
JCHEM Joint Test Force, Concept and Approach for a Joint Test and Evaluation of U.S. Chemical Warfare-Retaliatory Capabilities, AD #B088340, February 1984.


Johnson, Larry H., Contractor Testing and the Army Test and Evaluation Master Plan for Full Engineering Development, Defense Systems Management College, 1983.
Joint Test Director's Handbook (Draft), military handbook.
Kausal, Tony, Comparison of the Defense Acquisition Systems of France, Great Britain, Germany, and the United States, Defense Acquisition University: Ft. Belvoir, VA, September 1999.
Kausal, Tony, and Stefan Markowski, Comparison of the Defense Acquisition Systems of Australia, Japan, South Korea, Singapore, and the United States, Defense Acquisition University: Ft. Belvoir, VA, July 2000.
Mann, Greg A., LTC, The Role of Simulation in Operational Test and Evaluation, Report AU-ARI-83-10, Air University Press, Maxwell Air Force Base, AL: August 1983.
MCO [Marine Corps Order] 3960.2, Marine Corps Operational Test and Evaluation Activity (MCOTEA); establishment of, with changes, October 14, 1994.
MCO 5000.11B, Testing and Evaluation of Systems and Equipment for the Marine Corps.
Memorandum of Agreement on Multi-Service OT&E, with changes, August 2003.
Memorandum of Understanding Among the Governments of the French Republic, the Federal Republic of Germany, the United Kingdom of Great Britain and Northern Ireland, and the United States of America Relating to the Mutual Acceptance of Test and Evaluation for the Reciprocal Procurement of Defense Equipment.
MIL-HDBK [Military Handbook]-881, Work Breakdown Structure, January 2, 1998.
MIL-STD [Military Standard]-1388-1A, Logistics Support Analysis, April 11, 1983.
MIL-STD-1512A, Technical Reviews and Audits for Systems, Equipment, and Computer Programs, September 30, 1997.
MIL-STD-1528, Manufacturing Master Plan.
MIL-STD-480, Configuration Control – Engineering Changes, Deviations, and Waivers, July 15, 1988.
MIL-STD-483 (USAF), Configuration Management Practices for Systems, Equipment, Munitions and Computer Software.
MIL-STD-490, Specification Practices, June 4, 1985, modified March 1997.
MIL-STD-499A (USAF), Engineering Management Practices, May 1, 1974.
MIL-STD-961D, DoD Standard Practice for Defense Specifications, March 22, 1995.
MITRE Corporation, Software Reporting Metrics, November 1985.
NAVMATINST [Naval Material Command Instruction] 3960, Test and Evaluation.
NAVSO-P-2457, RDT&E/Acquisition Management Guide, 10th Edition.
O'Bryon, James F., Assistant Deputy Under Secretary of Defense (Live Fire Test), Statement before the Acquisition Policy Panel of the House Armed Services Committee, September 10, 1987.


Office of the Deputy Director, <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong>(Live Fire <strong>Test</strong>), Live Fire <strong>Test</strong> <strong>and</strong><strong>Evaluation</strong> Planning <strong>Guide</strong>, June 1989.Office of the Deputy Under Secretary for Research<strong>and</strong> Engineering (<strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong>),Joint <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong> Procedures Manual(Draft), October 1986.Office of the Under Secretary of Defense (Acquisition,Technology <strong>and</strong> Logistics), Director,Strategic & Tactical Systems, The ForeignComparative <strong>Test</strong>ing (FCT) H<strong>and</strong>book, March2001, http://www.acq.osd.mil/cto.Office of the Under Secretary of Defense forResearch <strong>and</strong> Engineering (Director, Defense<strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong>), <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong> inthe United States Department of Defense,August 1983.OPNAVINST [Office of the Chief of Naval OperationsInstruction] 3070.1A, Operations Security.OPNAVINST 3960.10C, <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong>(Draft). (Cancelled)OPNAVINST 5000.49, ILS in the AcquisitionProcess.OPNAVINST 5440.47F, Mission <strong>and</strong> Functionsof Operational <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong> Force(OPTEVFOR).Public Law (P.L.) 92-156, National Defense AuthorizationAct for Fiscal Year 1972.Public Law 98-94, National Defense AuthorizationAct for Fiscal Year 1987.Public Law 99-661, National Defense AuthorizationAct for Fiscal Year 1987.Range Comm<strong>and</strong>ers Council (RCC), UniversalDocumentation System H<strong>and</strong>book, Document501-84, Vol. 3, August 1989.Rhoads, D.I., Systems Engineering <strong>Guide</strong> for<strong>Management</strong> of Software Acquisition, DefenseSystems <strong>Management</strong> College, 1983, pp. IV-1 to IV-23.Rome Air Development Center, St<strong>and</strong>ard Proceduresfor Air Force Operational <strong>Test</strong> <strong>and</strong><strong>Evaluation</strong>.SECNAVINST [Secretary of the Navy Instruction]5000.1B, Systems Acquisition.SECNAVINST 5000.39, USN Acquisition <strong>and</strong><strong>Management</strong> of ILS Systems <strong>and</strong> Equipment.SECNAVINST 5000.42C, RDT&E AcquisitionProcedures.SECNAVINST 5430.67A, Assignment of Responsibilitiesfor Research, Development, <strong>Test</strong> <strong>and</strong><strong>Evaluation</strong>.Stevens, Roger T., Operational <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong>Process: A Systems Engineering Process,John Wiley & Sons, New York: 1979, pp. 49-56.Thomas, E.F., Reliability <strong>Test</strong>ing Pitfalls, Reliability<strong>and</strong> Maintainability Symposium, Vol.7, No. 2, 1974, pp. 78-85.USAOTEA Operational <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong><strong>Guide</strong>, Special Supplement, Continuous Comprehensive<strong>Evaluation</strong> Planning for NuclearHardness <strong>and</strong> Survivability <strong>Test</strong> <strong>and</strong> <strong>Evaluation</strong>(Draft). (Cancelled)Watt, Charles K., The Role of <strong>Test</strong> Beds in C 3 I,AFCEA 35th Annual Convention, June 15-17,1982.Willoughby, Willis J., “Reliability by Design, NotChance,” Defense Systems <strong>Management</strong>Journal, Vol. 12, No. 2, pp. 12-18.D-6


APPENDIX E

INDEX

Page
Acquisition Decision Memorandum ............ 5-4
Acquisition Process ............ 2-1, 2-2
Air Force Acquisition Executive ............ 3-9
Air Force OT&E Organization ............ 3-10
Air Force DT&E Organization ............ 3-9
Aircraft Systems T&E ............ 25-2
Army Acquisition Executive ............ 3-5
Army T&E Organization ............ 3-5
Baseline Documentation ............ 5-4
Chemical Weapons Testing ............ 24-1
Combined and Concurrent Testing ............ 9-1
Combining DT and OT ............ 9-1
Command and Control System T&E ............ 25-10
Commercial Items T&E ............ 23-1
Concept Exploration Phase ............ 2-2
Concurrent Testing ............ 9-3
Configuration Change Control ............ 8-7
Congressional Policy ............ 3-1
Contractor Test Management ............ 4-4, 4-6, 4-7, 4-8, 7-5
Contribution at Major Milestones ............ 1-3, 12-1
Cost Associated with T&E ............ 4-5, 15-2
Critical Operational Test Issues ............ 11-4, 13-4
Data Sources ............ 4-2, 12-4, 13-7
Design Evaluation/Verification Testing ............ 4-6, 7-7
Design for Test ............ 7-9, 7-10
Design Limit Testing ............ 7-6
Design Reviews ............ 2-10, 4-5, 8-1
Development T&E ............ 6-1, 9-1
Development Testing ............ 7-1
    Contractor ............ 7-5
    Government ............ 7-6
    Limited Procurement Programs ............ 7-9
    Responsibilities ............ 7-5
    Support of Technical Reviews/Milestone Decisions ............ 8-1
    Test Program Integration ............ 4-4, 7-6
Director, Operational Test and Evaluation ............ 3-4
DoD Test Process ............ 2-9
DT&E and the Review Process ............ 8-1
Early Operational Assessments ............ 1-5, 6-4, 11-2, 12-1, 12-3
EC/C4ISR Testing ............ 20-1
End of Test Phase Report ............ 5-11
Evaluation Agencies ............ 3-4, 4-6, 11-2, 11-4, 13-9
Evaluation ............ 13-1
    Criteria ............ 13-4
    Development/Operational Test and Evaluation ............ 13-7
    Differences between T&E ............ 13-1
    Issues ............ 13-4
    Plan ............ 5-7, 13-2
    Planning ............ 13-6
    Process ............ 13-1
    Operational ............ 13-9
    Technical ............ 13-9
Evolutionary Acquisition ............ 16-2, 20-6
First Article Test ............ 10-2
Follow-on Operational T&E ............ 1-7, 6-7, 11-3, 12-6
Foreign Comparative Testing Program ............ 3-2, 3-4, 22-1
Formal Reviews ............ 2-9, 2-10, 8-1
    Systems Requirements Review ............ 8-4
    Systems Functional Review ............ 8-4
    Software Specification Review ............ 8-4
    Preliminary Design Review ............ 8-4
    Critical Design Review ............ 8-6
    Test Readiness Review ............ 8-6
    Formal Qualification Review ............ 8-6
    Production Readiness Review ............ 8-7
Functional Configuration Audit ............ 8-6
Hazardous Weapons Testing ............ 24-1
Human Factors Evaluation ............ 11-2, 19-4
Independent Verification and Validation ............ 17-8
Initial OT&E ............ 1-7, 2-2, 6-4, 6-7, 11-3, 12-5
Integrated Test Plan ............ 7-6, 16-8, 20-1
Logistics Support Planning ............ 19-1
    Objectives ............ 19-1
Logistics Support T&E ............ 19-2
    Data Collection and Analyses ............ 13-6, 19-7
    Limitations ............ 19-7
    Use of Results ............ 19-7
    Support Package ............ 19-7
Integrated Test Approach ............ 20-1
Introduction to Test & Evaluation Process ............ 2-1
Issues, Critical ............ 11-4, 13-4
Joint T&E ............ 6-8
Laser Weapons, Testing ............ 24-3
Life Cycle T&E ............ 2-1, 2-4, 7-1, 12-1
Life Testing ............ 7-7
Live Fire Test Guidelines ............ 6-8, 6-9, 18-1, 18-3
Logistics T&E ............ 19-1
    Planning ............ 19-2
Beyond Low Rate Initial Production Report ............ 3-2, 3-5, 5-12, 10-3
Low Rate Initial Production ............ 10-3
Major Range and Test Facility Base ............ 3-4, 15-2
Marine Corps Acquisition Executive ............ 3-11
Marine Corps DT&E Organization ............ 3-11
Marine Corps Operational Test & Evaluation Agency ............ 3-11
Missile System T&E ............ 25-5
Major Milestone Reviews ............ 1-3, 5-3
Matrix Analysis Techniques ............ 13-7
NATO Comparative Testing Program ............ 22-2
Navy DT&E Organizations ............ 3-7
Navy Operational Test & Evaluation Force ............ 3-9
NDI Testing ............ 23-1
Nondevelopment Items Definition ............ 23-1
Nuclear Hardness T&E ............ 6-10
Nuclear, Biological & Chemical Weapons T&E ............ 6-9, 24-1
Objectives and Thresholds ............ 2-7, 5-3, 13-5
Operational T&E ............ 6-3, 7-3, 11-1
OPEVAL ............ 11-3
OT&E to Support Milestone Decisions ............ 12-1
OT&E Contractor Responsibilities ............ 4-7, 4-8
OSD Oversight ............ 3-2
Physical Configuration Audit ............ 8-6
Policy, The Congress ............ 3-1
    OSD ............ 3-2
    Army ............ 3-5
    Navy ............ 3-7
    Air Force ............ 3-9
    Marine Corps ............ 3-11
Planning Guidelines for Logistics T&E ............ 19-1
Post Production T&E ............ 1-7, 7-4, 10-3, 12-6
Production Qualification Tests ............ 6-1, 10-2
Product Baseline and T&E ............ 2-9, 5-5, 8-5
Production Acceptance Test & Evaluation ............ 7-4, 10-3
Production Readiness Reviews ............ 10-2, 10-5
Program Definition and Risk Reduction Phase ............ 1-5, 2-2, 7-1, 12-3
Program Office Responsibility for T&E ............ 4-1
Qualification Operational T&E ............ 11-3
Qualification Testing ............ 10-2
Quick-Look Report ............ 5-9, 11-6
Realism during Testing ............ 6-4, 1-5
Reliability, Availability & Maintainability Assessments ............ 7-8, 19-5
Reliability Development Testing ............ 7-8, 19-6
Reports ............ 3-1, 5-9, 5-11, 11-6, 13-9, 18-6
Requirements Documentation ............ 5-1
Resources ............ 4-11, 15-1
Risk Management ............ 1-1, 1-8, 2-7
Schedules Formulation ............ 4-3, 16-4
Security and T&E ............ 24-5
Service Resource ............ 15-1
    Planning ............ 15-4
Service Test Facilities ............ 6-1, 15-4
    Army ............ 15-4
    Navy ............ 15-5
    Air Force ............ 15-7
Ship System T&E ............ 25-13
Simulation, Test, and Evaluation Process (STEP) ............ 6-1, 14-7
Software Development Process ............ 17-4
Software Life Cycle T&E ............ 17-4
Space Systems Testing ............ 7-9, 24-3
Specifications ............ 4-6, 8-4
Surface Vehicle T&E ............ 25-16
System Life Cycle ............ 2-1
Systems Engineering Process ............ 2-6
Technical Management Planning ............ 2-7, 5-1
Technical Reviews and Audits ............ 4-5, 8-1
Test Concepts, IOT&E ............ 3-5, 11-5
Test and Evaluation Master Plan ............ 3-4, 5-7, 15-1, 16-1
Test Design Plan ............ 5-7
Test Funding ............ 3-5, 4-5, 15-8
Technical Performance Measurement ............ 2-7
Test Plan ............ 5-9
Test Program Integration ............ 7-6, 16-8, 20-1
Test Reports ............ 3-1, 5-9, 11-6, 13-9, 18-6
Testing with Limitations ............ 24-1
Thresholds ............ 2-7, 5-3, 13-5
Transition to Production ............ 10-3
    Transition Planning ............ 10-3
    Testing during Transition ............ 10-3
Test Resource Planning ............ 15-1
Testing for Nuclear Hardness and Survivability ............ 6-10, 18-6
TEMP Requirements ............ 16-1
Upgrades, Enhancements, Modification T&E ............ 1-7, 2-4, 4-10, 7-4, 12-6
Validity of Modeling and Simulation ............ 14-4
Warranties, Impact on T&E ............ 7-9


APPENDIX F

POINTS OF CONTACT

SERVICE TEST AND EVALUATION COURSES

ARMY

Commander, U.S. Army Developmental Test Command
ATTN: CSTE-DTC
Aberdeen Proving Ground, MD 21005-5055

Commander, U.S. Army Logistics Management Group
ATTN: AMXMC-ACM-MA
Fort Lee, VA 23801-6048

Commander, U.S. Army Test Command
ATTN: CSTE-TC
4501 Ford Avenue
Alexandria, VA 22302

NAVY

Commander, Space & Naval Warfare Systems Command
ATTN: Management Operations
National Center, Building 1
Washington, DC 20361

Commander, Operational Test & Evaluation Force
ATTN: Dest 02B
Norfolk, VA 23511-6388

AIR FORCE

Commander, Air Force Operational Test & Evaluation Center
ATTN: Asst. Director, Training
Building 20130
Kirtland Air Force Base, NM 87117-7001

Commander, Air Force Institute of Technology
ATTN: Student Operations
Wright-Patterson Air Force Base, OH 45433

OSD

Defense Test & Evaluation Professional Institute
Naval Air Warfare Center, Weapons Division
Code 02PI
Point Mugu, CA 93042


CUSTOMER FEEDBACK

Dear Valued Customer:

The Defense Acquisition University Press publishes this reference tool for students of the Defense Acquisition University and other members of the defense acquisition workforce. We continually seek customer inputs to maintain the integrity and usefulness of this publication.

We ask that you spend a few minutes evaluating this publication. Although we have identified several topic areas to evaluate, please feel free to add any additional thoughts as they apply to your particular situation. This information will assist us in improving future editions.

To assist you in responding, this pre-addressed form requires no postage. Simply tear out this page, fold, and tape closed as indicated, then place in the mail. Responses may also be faxed to the DAU Press at (703) 805-2917 or DSN 655-2917.

TEST AND EVALUATION MANAGEMENT GUIDE (January 2005)

1. Do you find this publication a useful reference tool?
   YES ____  NO ____  How can we improve the level of usefulness?
   ____________________________________________________________

2. Is the material presented in this publication current?
   YES ____  NO ____

3. Using the rating scale below, please indicate your satisfaction level for each topic area. Place your response on the line beside the topic area.
   1 = Very satisfied  2 = Satisfied  3 = No opinion  4 = Dissatisfied  5 = Very dissatisfied
   _____ Readability
   _____ Scope of coverage
   _____ Contribution to your knowledge of the subject
   _____ Contribution to your job effectiveness
   _____ Contribution to process improvement

4. Are there other subject areas you would like to have included in this publication?
   YES ____ (Please identify) _________________________________________
   NO ____

5. If you recommend additions, deletions, or changes to this publication, please identify them below.
   ____________________________________________________________

6. Additional comments
   ____________________________________________________________

(THIS SECTION IS OPTIONAL)
NAME/TITLE ____________________________________________________
ADDRESS ______________________________________________________
CITY/STATE/ZIP ________________________________________________
Commercial (____) ______________  DSN ______________  E-Mail ______________
