*03/06/2021*

**A program for everything**

In just a few decades, computing has become an integral part of our daily lives. While some software programs can be considered non-essential (games, home automation, etc.), many have enabled our society to reach its current level of progress in transport, telecommunications and industry. This constant presence of programs, which help design, analyse and automate an increasingly large portion of our lives, has radically different consequences depending on the criticality of the applications involved. For example, if your local council’s mobile site has a temporary bug, the impact will be less serious than if there is an error in a program running a power plant. For the most critical systems, it is therefore essential to prove that the software involved does what it was designed to do.

**The importance of proof**

Demonstrating the validity of a program or electronic system requires mathematical logic and, more particularly, reasoning techniques called “formal methods”. These techniques, which originated in fundamental computer science research, have proven particularly well suited to program verification. They make it possible to produce a “formal proof”, i.e. a very strong assurance that the software analysed contains no bugs. Each proof is specially constructed, using an automated theorem prover or a proof assistant, to verify one particular program. Because they are custom-made, proofs are generally not readable by other assistants, nor even adapted to different versions of the same program. This is why there is currently no way to check whether a program functions as it is supposed to other than building a formal proof yourself. Being able to reuse and share these proofs between different proof assistants would therefore save time and improve trust in IT infrastructures. However, the large number of existing provers and proof formats is a hindrance to meeting this need.

**The pillars of sharing: standardising...**

Several initiatives have emerged to overcome these differences by proposing a form of standardisation. One of them, Dedukti, was developed at Inria. It consists of creating a universal framework flexible enough to cover the needs of very different kinds of proof, one that can be used by numerous proof assistants without requiring massive modifications to make them fit the new standard. ProofCert, the project for which Dale Miller received an ERC Advanced Grant for 2012-2016 when he was director of the Parsifal project team, takes a different approach. Here, the idea is that anyone can produce their own proof format, but must supply with it a formal description of the logic underlying its symbolic expressions. This definition of the semantics of formal proofs should help make them reusable.

**... and certifying**

However, the existence of standards is not enough to be able to depend on proofs produced by other systems or people. It is also necessary to be able to trust the person who released the formal proof verifying the program in question, and to be certain that it has not been modified since its creation. The idea is therefore to certify both the issuer and the content of the proof. These are challenges to which cryptography tools can provide an answer, as Dale Miller, now part of the Partout* project team, explains: *“If I check a proof, I will use a certain tool. I can then generate a document saying ‘I, Dale Miller, on this day, with this tool, state that this is proven’. I can then sign it cryptographically to identify myself as the author and add a cryptographic code to certify that it has not been corrupted.”*
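As a rough illustration of this certification idea, here is a minimal Python sketch of a proof certificate carrying author, date, tool and an integrity fingerprint. All names here are illustrative, not part of W3Proof; a real system would also attach a digital signature (e.g. Ed25519) to identify the author, which the standard library does not provide, so only the integrity check is shown.

```python
import hashlib
from datetime import date

def fingerprint(proof_text: str) -> str:
    """SHA-256 digest of the proof text, used to detect later modification."""
    return hashlib.sha256(proof_text.encode("utf-8")).hexdigest()

def make_certificate(proof_text: str, author: str, tool: str) -> dict:
    """A claim of the form 'I, <author>, on this day, with <tool>, state this is proven'.
    In a real system this document would additionally be signed by the author."""
    return {
        "author": author,
        "date": date.today().isoformat(),
        "tool": tool,
        "proof_sha256": fingerprint(proof_text),
    }

def check_certificate(cert: dict, proof_text: str) -> bool:
    """Verify the proof has not been corrupted since certification."""
    return cert["proof_sha256"] == fingerprint(proof_text)

proof = "forall n, n + 0 = n.  (* toy proof script *)"
cert = make_certificate(proof, author="Dale Miller", tool="SomeChecker")
assert check_certificate(cert, proof)            # intact proof passes
assert not check_certificate(cert, proof + " ")  # any change is detected
```

Even this simplified version captures the key property: anyone holding the certificate can recompute the fingerprint and detect the slightest alteration of the proof.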

If a researcher or an institution has produced a certain number of proofs which, once verified by other people, are always shown to be correct, confidence in their work increases. Although this system is similar to peer review, one of the foundations of the soundness of results published in scientific journals, it does not have the same hierarchical structure. In Dale Miller’s vision, there are no prestigious journals, established editors or reviewers. Each person should be able to propose proofs and verify those proposed by others, thus creating a very horizontal and distributed network of trust.

**Three years to explore a complete overhaul of the proof-sharing system**

This is the challenge facing the exploratory action for which he recently received support from Inria, in the form of funding for a three-year PhD. W3Proof, as he has called it, aims to create an infrastructure similar to the early days of the World Wide Web: a network of sharing and connections between formal proofs, governed by a few key standards (just as HTML and HTTP allowed the Web to develop) and usable by anyone. Dale Miller’s idea is for any proof assistant to be able to contribute to the production and reuse of formal proofs by using an export format certifiable by a few major trusted checkers. To ensure transparent and reliable networking, he intends to rely on distributed computing with peer-to-peer exchange protocols rather than centralised servers. Meeting the specific needs of formal proof sharing will “simply” require combining these protocols with storage methods able to guarantee the integrity and permanence of each proof, for example by using a unique identification key. Over the next three years, Dale Miller and his future PhD student will explore the feasibility of this idea by modifying an existing proof assistant to turn it into a prototype capable of interacting with others.
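The storage idea mentioned above, where a unique key both locates a proof and guarantees its integrity, is commonly known as content addressing. The following toy Python sketch (the class and its behaviour are illustrative assumptions, not W3Proof’s actual design) shows the principle: the key *is* the hash of the content, so any peer can re-check what it receives.

```python
import hashlib

class ProofStore:
    """Toy content-addressed store: a proof is filed under the hash of
    its own content, so the key also certifies the content's integrity."""

    def __init__(self):
        self._blobs: dict[str, str] = {}

    def put(self, proof_text: str) -> str:
        """Store a proof and return its content-derived key."""
        key = hashlib.sha256(proof_text.encode("utf-8")).hexdigest()
        self._blobs[key] = proof_text
        return key

    def get(self, key: str) -> str:
        """Retrieve a proof; re-check integrity before trusting it."""
        proof_text = self._blobs[key]
        assert hashlib.sha256(proof_text.encode("utf-8")).hexdigest() == key
        return proof_text

store = ProofStore()
k = store.put("lemma: 1 + 1 = 2")
assert store.get(k) == "lemma: 1 + 1 = 2"
```

In a peer-to-peer network, this property means a proof can be fetched from any untrusted peer: if the content hashes to the requested key, it is necessarily the right, unmodified proof.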

Far from being limited to the field of program verification, the new structure for sharing certified data that this exploratory action intends to propose could also benefit other sectors, such as journalism, in the fight against fake news, and scientific production, for which the reproducibility of results is a major challenge.

*\*Partout: Automation and representation: computing and deduction foundation*

**Learn more about proof assistants and cryptographic fingerprinting**

- A video example of the Coq proof assistant:

*“Coq: practical aspects of type theory”* (*« Coq : aspects pratiques de la théorie des types »*), Yves Bertot, InriaChannel, YouTube, 7 January 2019

- Creating a cryptographic fingerprint using hash functions. Wikipedia:

https://fr.wikipedia.org/wiki/Fonction_de_hachage_cryptographique