1998 - Draper Laboratory

high-precision delta-pseudorange velocity, and other custom navigation fixes. This might require a simulator that can generate signals for multiple RF antennas. In addition, depending on how the velocity fix is calculated, the fundamental precision of the simulator may need to be greater than that of a simulator that does just PVT tests.

Jamming: Although most jamming testing is for DoD applications, there is enough potential for unintentional interference to be a concern for commercial GPS applications. The rows recognize the differences inherent in modeling interference sources as seen by Fixed Reception Pattern Antennas (FRPA), Controlled Reception Pattern Antennas (CRPA), and antennas with Time-Varying Patterns (TVPA). FRPA simulation requires that the simulator be able to generate a specific Jamming-to-Signal ratio (J/S) versus time. CRPA simulation requires either the ability to use time-varying gain, phase, and group delay inputs from a CRPA model (“canned CRPA”) or the ability to generate RF signals for up to some number of antenna elements, with full delay and amplitude control to simulate the actual jamming signal wavefronts. TVPA simulation further allows the antenna element, lever arm, and masking parameters to vary in location and orientation with time, enabling the simulation of deployable antennas.

Spoofing: The matrix provides a means of identifying the simulator’s spoofing test capability.

Engineering: The matrix provides a means of identifying the simulator’s capability to present specialized nonphysical scenarios. These specialized tests would allow the user to create Bode plots of a receiver’s tracking loops or help run specialized simulator accuracy tests such as the ones presented in this paper.

Exchange Ratings: In addition to the SPS/PPS SV and jamming exchange ratings defined for Table 1, the Capabilities Matrix provides a constellation exchange factor that states how many units or chassis are required to make a 10-in-view SPS or PPS constellation. On units with more than one RF element per chassis, an RF routing flexibility factor is required to describe any restrictions that might exist in splitting channel assignments between the RF outputs. (On some units, channels may need to be assigned in even pairs per RF element.)
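To make the FRPA requirement above concrete, the short sketch below computes a J/S-versus-time profile for a simple jammer fly-by that a simulator could be asked to reproduce. It is a minimal illustration only: the 1 W jammer, the straight-line geometry, the isotropic antenna, and the nominal -130 dBm received L1 C/A signal level are assumptions made for the example, not values drawn from the paper or from any particular simulator.

```python
import math

C = 299_792_458.0          # speed of light, m/s
L1_HZ = 1_575.42e6         # GPS L1 carrier frequency, Hz
GPS_SIGNAL_DBM = -130.0    # nominal received L1 C/A power (assumed for this example)

def free_space_loss_db(distance_m: float, freq_hz: float = L1_HZ) -> float:
    """Free-space path loss in dB at the given range and frequency."""
    return 20.0 * math.log10(4.0 * math.pi * distance_m * freq_hz / C)

def j_over_s_db(jammer_erp_dbm: float, distance_m: float) -> float:
    """Jamming-to-signal ratio (dB) seen by an isotropic (FRPA-like) antenna."""
    jammer_rx_dbm = jammer_erp_dbm - free_space_loss_db(distance_m)
    return jammer_rx_dbm - GPS_SIGNAL_DBM

# Hypothetical fly-by: 1 W (30 dBm) jammer, 5 km closest approach,
# vehicle at 200 m/s; tabulate the J/S profile the simulator must match.
for t in range(0, 101, 10):
    along_track = 200.0 * (t - 50)          # metres from closest approach
    rng = math.hypot(5_000.0, along_track)  # slant range to jammer
    print(f"t={t:3d} s  range={rng/1000:6.2f} km  J/S={j_over_s_db(30.0, rng):5.1f} dB")
```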
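The Engineering row’s mention of tracking-loop Bode plots can likewise be illustrated. The sketch below tabulates the closed-loop magnitude response of a textbook second-order tracking loop, the kind of reference curve a nonphysical swept-frequency scenario would be compared against; the 10 Hz noise bandwidth and 0.707 damping ratio are generic textbook assumptions, not parameters specified in the paper.

```python
import math

def closed_loop_gain_db(f_hz: float, bn_hz: float = 10.0, zeta: float = 0.707) -> float:
    """Magnitude (dB) of a standard second-order tracking loop
    H(s) = (2*zeta*wn*s + wn^2) / (s^2 + 2*zeta*wn*s + wn^2),
    evaluated at s = j*2*pi*f.  bn_hz is the loop noise bandwidth in Hz."""
    # Natural frequency (rad/s) from noise bandwidth for a 2nd-order loop:
    # Bn = wn * (4*zeta^2 + 1) / (8*zeta)
    wn = bn_hz * 8.0 * zeta / (4.0 * zeta**2 + 1.0)
    w = 2.0 * math.pi * f_hz
    num = complex(wn**2, 2.0 * zeta * wn * w)
    den = complex(wn**2 - w**2, 2.0 * zeta * wn * w)
    return 20.0 * math.log10(abs(num / den))

# Sweep the stimulus frequency and print the expected closed-loop response
# a receiver's carrier loop should reproduce in a nonphysical test scenario.
for f in (0.1, 1.0, 3.0, 10.0, 30.0):
    print(f"{f:5.1f} Hz  {closed_loop_gain_db(f):6.2f} dB")
```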
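Finally, the constellation exchange factor reduces to simple arithmetic once a unit’s channel count and routing restrictions are known. The sketch below is one way that arithmetic might be captured; the channel counts and the even-pair routing restriction in the example are hypothetical, not characteristics of any particular product.

```python
import math

def chassis_needed(sv_channels_required: int,
                   channels_per_chassis: int,
                   rf_elements_per_chassis: int = 1,
                   pair_assignment: bool = False) -> int:
    """Estimate the constellation exchange factor: how many chassis are
    needed to present `sv_channels_required` satellites (e.g., 10-in-view).

    If `pair_assignment` is True, channels can only be routed to an RF
    element in even pairs, so any odd remainder per element is unusable.
    """
    per_element = channels_per_chassis / rf_elements_per_chassis
    if pair_assignment:
        per_element = (per_element // 2) * 2  # floor to an even channel count
    usable_per_chassis = per_element * rf_elements_per_chassis
    if usable_per_chassis == 0:
        raise ValueError("routing restriction leaves no usable channels")
    return math.ceil(sv_channels_required / usable_per_chassis)

# Hypothetical 8-channel, single-output chassis: two are needed for 10-in-view.
print(chassis_needed(10, 8))                                    # -> 2
# Hypothetical 12-channel chassis with 4 RF outputs and even-pair routing:
# only 8 of 12 channels are usable, so two chassis are still required.
print(chassis_needed(10, 12, rf_elements_per_chassis=4,
                     pair_assignment=True))                     # -> 2
```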
Conclusion

This paper defines a strawman simulator hardware specification sheet covering fundamental simulator performance parameters. It also describes a technique for verifying some of the key simulator accuracy parameters. In addition, it proposes a capability matrix to concisely present a simulator’s functional capability. Both tables are indexed by dynamics, recognizing that this concept affects accuracy on the one hand and relevance to mission test needs on the other.

Even though the paper outlines many of the key issues that might evolve into a complete simulator validation test suite, some issues were not covered. Briefly, they are: user interfaces, external interfaces and upgradability, plus the packaging of a suite of validation tests.

There are two extreme user interface cases: receiver/navigation system testers and receiver developers. Generally, tests that qualify systems need repeatable, scriptable operation, while development needs a mixture of scripting and on-the-fly alterations. A user interface optimized for one may be significantly different from one optimized for the other. Appendix 2 lists the most important features for each extreme.

All external interfaces that allow system upgrades, access to third-party hardware or software add-ins, and ways for users to insert their own software models into the simulator must be identified in a fully mature Specification Sheet or Capabilities Matrix.

An effectively packaged simulator test suite will support both vendors and users. To be easily usable, an effective GPS simulator validation test suite would be laid out logically so that any potential user can access particular items of interest. For example, navigators might prefer an organizational layout based on measuring typical GPS performance specifications. Receiver developers might want a test suite organized around ICD-GPS-200 and -203. Perhaps a simulator test suite could be written as a hypertext (HTML) file. This HTML format would let the user choose the order most suited to his needs. In HTML, the suite could exist on-line or as a file on a local disk. The next best option to HTML is a paper-based document with a table of contents organized around navigation applications and multiple cross-referenced indices at the end. Some of the most common organizational indices might include an ICD-GPS-200/203 cross-reference index, an alphabetical index, a navigator index, and an index by physical signal path (starting from ground control, to the SV, and then to the user). Another option is a design that vendors can also use for training new GPS simulator users. Users would then be able to understand precisely what their simulator can do and how accurately it works while learning how to run it. In particular, vendors could reduce training costs (and thus be more willing to support the test suite). Vendors could also include the test suite in a tutorial directory. In addition, if the test suite and tutorials are also designed to immediately generate useful answers, users will be more willing to wade through the various tests because there are immediate rewards for doing each test.
