Features

Current practice in certification of avionics systems (e.g. DO-178B1) is centered on testing. The testing process has been prescribed to incorporate a semi-formal notion of completeness to demonstrate that the software correctly implements all its requirements, and nothing further. But in the last analysis, our confidence in modern avionics is based primarily on the successful completion of a test suite that, although extensive, is incomplete.

We all know that from a formal point of view, testing can never give 100% confidence in software correctness since it is impossible for tests to cover all cases. In practice, the DO-178B approach has been highly successful, but we have had some close calls that remind us that the process is not infallible2. As avionics systems get more complex, and as the software is developed using advanced techniques such as object-oriented technology and modeling languages, it is unclear whether testing alone will be sufficient in the future.

But other potentially valuable approaches are available. Static analysis, in the form of sophisticated tools that analyze application programs to detect possible problems, can help, though only in a limited fashion if the analysis is not guaranteed to be sound. A tool that promises to detect 90% of buffer overruns merely leaves us worrying even more about the 10% it has not detected.

Another important approach is the application of formal methods, which allow us to demonstrate mathematically that certain properties of programs are satisfied. However, the notion of proving an entire program correct in such a mathematical sense is a mirage. Even if we prove that a program correctly implements some complete formal specification, we have no way of ensuring that the specification itself is correct. Nevertheless, formal methods can definitely play a significant role, as is shown by the SPARK language and toolset3. Importantly for safety- or security-critical systems, the SPARK tools are sound: if they do not detect a given kind of error (such as a buffer overrun), then the program contains no errors of that kind. SPARK is currently being used in building the iFACTS4 air traffic control system in England, applying formal methods to prove freedom from such defects as buffer overruns and arithmetic overflow.
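SPARK expresses such properties as contracts that its proof tools discharge statically, before the program ever runs. As a rough C analogue (an illustration, not SPARK code), here is an addition routine whose explicit guard rules out arithmetic overflow; what the run-time check enforces here is exactly the kind of condition a SPARK proof establishes once and for all:

```c
#include <limits.h>
#include <stdbool.h>

/* Adds two ints, reporting failure instead of overflowing.
   The guard is the "precondition": a + b is computed only when the
   mathematical sum is representable in an int, so the addition can
   never wrap or trap. */
bool checked_add(int a, int b, int *result)
{
    if ((b > 0 && a > INT_MAX - b) ||
        (b < 0 && a < INT_MIN - b))
        return false;          /* sum would overflow */
    *result = a + b;
    return true;
}
```

In SPARK, an equivalent precondition would be stated as an annotation on the subprogram and proved to hold at every call site, eliminating the need for the run-time branch.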

The Hi-Lite Project

Individually, the approaches embodied in testing, static analysis, and formal methods are useful, but none provides a complete solution. In combination, however, these techniques can complement each other and thus advance the state of the practice in developing modern avionics systems and other critical software. An effort to build a toolset supporting such a combined approach, in particular to ease the transition to and adoption of formal methods for software verification, is currently in progress. Named "Hi-Lite"5, the project is being conducted by a consortium consisting of AdaCore, Altran Praxis, CEA LIST, Astrium Space Transportation, INRIA ProVal, and Thales Communications. Hi-Lite fits in well with the upcoming DO-178C6 avionics safety standard, a revision to DO-178B that, among other things, accounts for technologies such as formal methods. Static verification through formal methods may reduce the need for later testing (e.g. robustness tests).