Summary

From VERASCO

Revision as of 14:18, 15 February 2012

In its quest for software perfection, the critical software industry (aircraft, railways, nuclear, medical, etc.), especially in France, has been progressively embracing the use of formal verification tools as a complement, or even as an alternative, to traditional software validation techniques based on testing and reviews. Verification tools based on static analysis, deductive program proof or model checking provide additional, independent assurance about a critical software system; they can also render some tests unnecessary, resulting in net gains in validation costs and time.

The usefulness of verification tools in the development and certification of critical software is, however, limited by the amount of trust one can have in their results. A first potential issue is ''unsoundness'' of a verification tool: if a verification tool fails (by mistake or by design) to account for all possible executions of the program under verification, it can conclude that the program is correct while it actually misbehaves when executed. A second, more insidious, issue is ''miscompilation'': verification tools generally operate at the level of source code or executable model; a bug in the compilers and code generators that produce the executable code that actually runs can lead to a wrong executable being generated from a correct program.

Both issues are known, accounted for in regulations such as DO-178 for avionics, and addressed through best practices that are, however, costly: code optimizations must be disabled, and additional validation tasks (more reviews, more testing) are required.

The VERASCO proposal advocates a different, radical, mathematically-grounded solution to these issues: the formal verification of compilers and verification tools themselves. We set out to develop a generic static analyzer based on abstract interpretation for the C language, along with a number of advanced abstract domains and domain combination operators, and prove the soundness of this analyzer using the Coq proof assistant. Likewise, we will continue our work on the CompCert C formally-verified compiler, the first realistic C compiler that has been mechanically proved to be free of miscompilation, and carry it to the point where it could be used in the critical software industry. [Leroy 2011] We will take advantage of the close coupling between compiler and analyzer made possible by the fact that both are proved against the same formal semantics, in particular to propagate and exploit during compilation the properties inferred by static analysis. Finally, we will investigate the tool qualification issues that must be addressed before formally-verified tools can be used in the aircraft industry.

Critical software deserves the highest-assurance development and verification tools that computer science can provide. By going all the way to a full formal verification of such tools, our work will generate unprecedented confidence in the results of source-level static analysis, therefore fully justifying its role in the development and certification of critical software.