A Technical History of the SEI


Ada/Real-Time Embedded Systems Testbed

The Challenge: Evaluating Runtime Performance on Embedded Processors

How real-time systems would be programmed in a high-level language like Ada was just one aspect of Ada adoption for the DoD. Two other important aspects concerned (1) the performance of the code generated by Ada compilers for the various embedded processors used by the DoD and (2) the efficiency of the services provided by the Ada runtime environment. The runtime environment provided services such as process management, storage management, and exception handling for supporting the execution of Ada programs. Prior to the adoption of Ada, such services had been provided either by the application programmer or by a small real-time executive.

There was concern both inside and outside the DoD about whether Ada could support these real-time needs efficiently. In particular, the semantics of the tasking model and the processing overhead associated with task interactions and context switching were viewed as impediments to the real-time performance demanded of mission-critical software. An SEI report describes the issues and summarizes some of the significant early investigative work of organizations such as the Ada Runtime Environment Working Group (ARTEWG)—a special interest group established by the ACM in 1985—and the Evaluation and Validation team of the DoD's Ada Joint Program Office [Weiderman 1987a].
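The task interaction at the heart of these performance concerns was the Ada rendezvous, in which a calling task blocks until a server task accepts its entry call. A minimal sketch (the task and entry names here are illustrative, not from the SEI benchmarks) of the kind of interaction whose context-switch overhead the early benchmarks measured:

```ada
--  Minimal sketch of an Ada rendezvous: the caller blocks on an entry
--  call until the server task reaches the matching accept statement,
--  forcing at least two context switches per interaction.
with Ada.Text_IO; use Ada.Text_IO;

procedure Rendezvous_Demo is
   task Server is
      entry Ping;
   end Server;

   task body Server is
   begin
      accept Ping do
         Put_Line ("rendezvous complete");
      end Ping;
   end Server;
begin
   Server.Ping;  --  caller blocks here until Server accepts
end Rendezvous_Demo;
```

Benchmarks of this era typically timed a tight loop of such entry calls and divided by the iteration count to estimate per-rendezvous overhead.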

A Solution: A Testbed for Real-Time Performance Evaluation

Any assessment of Ada for mission-critical computing on embedded processors would have to take into account the quality of both the generated code and the runtime execution environment. The SEI established the Ada Embedded Systems Testbed (AEST) in 1986 to investigate these questions. The objective was to generate and disseminate quantitative evaluations of a representative set of vendors' Ada implementations targeted to various embedded processors. The investigations used a test suite of Ada programs comprising existing Ada benchmarks, a simulated real-time application based on a Navy shipboard inertial navigation system (INS), and a new benchmark created specifically for the project.

Criteria for constructing the testbed included requirements for each tested compiler (for example, the smallest value of the predefined "Duration" type should be less than 100 microseconds) and its runtime system (for example, the overhead for a context switch should be less than 200 microseconds) [Weiderman 1987b]. An investigation of existing benchmarks led to the selection of the University of Michigan Ada benchmarks [Clapp 1986] and the ACM Special Interest Group on Ada (SIGAda) Performance Issues Working Group (PIWG) benchmarks as the initial test suites [Donohoe 1987].
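The compiler criterion on the "Duration" type is checkable directly in Ada, because Duration'Small is a static attribute of the implementation. A hedged sketch (the procedure name and threshold constant are illustrative; only the 100-microsecond figure comes from the report) of such a check:

```ada
--  Sketch of one testbed compiler criterion: the smallest representable
--  value of the predefined Duration type (Duration'Small) should be
--  less than 100 microseconds.
with Ada.Text_IO; use Ada.Text_IO;

procedure Check_Duration is
   Criterion : constant := 100.0E-6;  --  100 microseconds, in seconds
begin
   if Duration'Small < Criterion then
      Put_Line ("Duration'Small criterion met");
   else
      Put_Line ("Duration'Small criterion NOT met");
   end if;
end Check_Duration;
```

Because Duration'Small is static, a compiler could even reject the program at compile time via a similar check in a static expression; the runtime criteria, such as context-switch overhead, required measured benchmarks instead.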

The testbed itself was a host-target environment in which a cluster of Digital Equipment Corp. MicroVAX II host machines was connected to a set of target single-board computers with processors such as a 20 MHz Motorola MC68020, a 16 MHz Intel i80386, and a 15 MHz Fairchild 1750A (with a MIL-STD-1750A instruction set architecture). The Ada cross-compilers for the target boards came from DDC-I, Systems Designers, Tartan Laboratories, TeleSoft, and Verdix. The testbed also included a logic analyzer because one of its construction criteria was the requirement for hardware verification of software timing results. The effort soon evolved beyond the objective of evaluating Ada and was broadened into the Real-Time Embedded Systems Testbed (REST).

CMU/SEI-2016-SR-027 | SOFTWARE ENGINEERING INSTITUTE | CARNEGIE MELLON UNIVERSITY

Distribution Statement A: Approved for Public Release; Distribution is Unlimited
