From physical models to digital models

Updated on 23/04/2020
How might it be possible to simulate highly complex physical phenomena such as turbulent flows, the risk of coastal flooding, wave propagation in subsoils or cardiac electrophysiology? Multiple teams at the Inria Bordeaux Research Centre have been tackling these questions. Computer models are obtained by translating equations from physics into numerical form. These are developed on a case-by-case basis, factoring in the challenges to be overcome and the required computing resources. Let’s take a closer look.
Airplane wing modelling on a computer screen
© Inria / Photo M. Magnin - Signatures


How might computer science be used to anticipate the potential for flooding following a tsunami on a global scale? Or to make wind turbines as aerodynamic as possible? Or to anticipate airflow processes likely to reduce the speed of a car? “It is possible to adapt equations from physics - those used to model fluid dynamics or the behaviour of materials, for example - so that they can be used by a computer”, explains Emmanuel Jeannot, deputy head of science at the Inria Bordeaux Research Centre, where a number of project teams specialising in digital technology and applied mathematics have been modelling physical phenomena. “This process requires transforming continuous equations (whether in time or in space) into discrete equivalents, divided up into small temporal or spatial units that are easier for computers to process.” 

The accuracy of models and calculation costs

The technology used “depends on the complexity of the problems and the accuracy of the information needed for a decision to be made”, explains Mario Ricchiuto, head of the CARDAMOM project team, which specialises in fluid mechanics.

Picture of Mario Ricchiuto

For real-life phenomena, the researchers often work with experts from the field in order to assemble different descriptions enabling them to look at a problem from the most appropriate angle: “very expensive but very precise high-fidelity models (requiring the solution of ever more numerous, and ever more complex, equations), low-fidelity models (very quick to use, but which provide only very rough interpretations) and intermediate approximations”, explains Mario Ricchiuto. These different approximations can be used for the same problem depending on how accurate the modelling has to be. This work borrows from multiple branches of continuous and discrete mathematics, in addition to computing.

Miniature models, increased automation

For applications in “coastal hydrodynamics” (quantifying coastal risks, particularly flooding), the researchers from CARDAMOM use “large scale” models, in collaboration with the CEA and the BRGM (the French Geological Survey). These are developed by analysing so-called asymptotic equations, which describe behaviour when certain physical parameters become extremely large (or extremely small). The advantage is that “these models have fewer equations and, although they are also relatively complex, they are less costly to process”, stresses Mario Ricchiuto. However, in order to simulate real flows, certain “small scale” effects must also be taken into account, with such simulations obtained by adapting the equations locally. To simulate even faster, other adaptation techniques are used, particularly for spatial and temporal accuracy (mesh adaptation).
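The principle of mesh adaptation can be shown with a deliberately simple sketch (my own toy example, not CARDAMOM's actual algorithms): a 1D mesh is refined only where the solution changes quickly, such as a steep wave front, so that computing effort is concentrated where accuracy is needed.

```python
# Toy illustration of mesh adaptation: refine a 1D mesh wherever the
# solution jumps sharply across a cell, leaving smooth regions coarse.
import math

def refine(mesh, f, tol):
    """Insert a midpoint in every cell where f varies by more than tol."""
    out = [mesh[0]]
    for a, b in zip(mesh, mesh[1:]):
        if abs(f(b) - f(a)) > tol:
            out.append((a + b) / 2)  # add a node inside the steep region
        out.append(b)
    return out

# A steep front near x = 0.5 (think of a wave approaching a coastline).
f = lambda x: math.tanh(40 * (x - 0.5))
mesh = [i / 10 for i in range(11)]  # coarse uniform starting mesh
for _ in range(4):                  # a few refinement passes
    mesh = refine(mesh, f, tol=0.2)
```

After refinement, the nodes cluster around the front at x = 0.5 while the flat regions keep their original coarse spacing.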


The MEMPHIS project team, meanwhile, which specialises in numerical modelling, places the emphasis on two innovations capable of reducing modelling time in the future. The first is automating the set-up of a calculation, “which is the phase requiring the most human resources”, according to Angelo Iollo, who heads the team: “Our goal is to develop a kind of drag and drop for simulation, where a non-specialist could take a shape with their mouse and drag it onto a virtual framework in order to launch a calculation.” The second relates to reusing previously simulated data in order to minimise the time spent developing new models, “in the interests of convergence between data and models”.

Picture of Angelo Iollo

In order to help manufacturers capture extremely small-scale phenomena, including “eddies in turbulent air or water flows, some of which are no bigger than 10 microns”, Rémi Manceau, the head of the CAGIRE project team, has his sights set on two potentially major breakthroughs: “Higher-order methods, which would enable us to obtain extremely precise simulations within timeframes manufacturers deem acceptable; and new ways of modelling the tiniest eddies.” Too small to mesh, the latter could be taken into account by “modifying the laws of physics” and by playing with other parameters (such as the viscosity of the air or the liquid) likely to affect the aerodynamics of a car or an aircraft.

A computer screen displaying a turbulence model
© Inria / Photo C. Morel

High performance computing

In cases involving very expensive simulations, it is often essential for the discretisation of the mathematical models (replacing continuous equations with a finite number of values computed on a mesh) to be compatible with the new computing architectures used in high-performance computing. This is what is being developed by the MAGIQUE-3D project team, which specialises in the simulation of seismic waves applied to the detection and monitoring of natural resources underground (in partnership with Total) and to the analysis of seismic risks (in collaboration with the SME RealTimeSeismic). “The Magicians” also collaborate with both field and laboratory geophysicists (members of the Laboratory of Complex Fluids and their Reservoirs, UPPA), their goal being to improve the mathematical models by drawing on experimental measurements as well as field data. All of this requires designing advanced inversion methods, which are developed generically before being applied to other problems, such as studying the composition of stars or the making of musical instruments. In the latter case, an inverse tool is used to find an instrument capable of producing a given note.
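The logic of an inverse tool can be sketched with a toy problem in the spirit of the musical-instrument example (the forward model and all values here are illustrative assumptions, not the team's software): an idealised string's fundamental frequency is f = c / (2L), and inversion searches for the length L that produces a target note.

```python
# Toy inverse problem: given a forward model (length -> frequency),
# search for the string length that sounds a target note.

WAVE_SPEED = 440.0  # wave speed along the string in m/s (assumed value)

def forward(length):
    """Fundamental frequency of an ideal string: f = c / (2 * L)."""
    return WAVE_SPEED / (2.0 * length)

def invert(target_hz, lo=0.1, hi=2.0, steps=60):
    """Bisection on the forward model: find L such that forward(L) = target."""
    for _ in range(steps):
        mid = (lo + hi) / 2
        # forward(L) decreases as L grows, so shrink the bracket accordingly
        if forward(mid) > target_hz:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

length = invert(440.0)  # which string length sounds an A4?
```

Real inversion methods work on far richer forward models (full wave-propagation simulations) and many unknowns at once, but the pattern is the same: repeatedly run the forward model and adjust the unknowns until the prediction matches the measurement.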

Wall projection of a model of time-dependent waves in the Earth's subsurface
© Inria / Photo C. Morel


In much the same vein, researchers from the MONC (modelling in oncology) and CARMEN (mathematical models for simulating the propagation of cardiac electrical signals) project teams use partial differential equations that call for high-performance computing. According to Yves Coudière, head of CARMEN, their goal is to “transition from a continuous heart model - represented as a material viewed from a distance - to a cell-by-cell representation, which would be impossible to achieve using current homogenised or averaged models. The cells are connected to each other, and the calculations will be genuinely massive.” A huge undertaking.
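To give a flavour of what a cell-by-cell representation means computationally, here is a deliberately minimal, hypothetical sketch (real cardiac models, such as monodomain or bidomain equations, are far richer): a chain of excitable “cells”, each coupled to its neighbours, through which an electrical activation wave spreads one cell at a time.

```python
# Minimal sketch of a cell-by-cell excitation model: each cell has a
# voltage, exchanges current with its neighbours, and "fires" (jumps to
# full depolarisation) once it crosses a threshold.

n, coupling, threshold = 30, 0.4, 0.3
v = [0.0] * n
fired = [False] * n
v[0], fired[0] = 1.0, True  # stimulate the first cell

for _ in range(50):  # time steps
    nxt = v[:]
    for i in range(n):
        if fired[i]:
            continue  # a fired cell stays depolarised in this sketch
        left = v[i - 1] if i > 0 else 0.0
        right = v[i + 1] if i < n - 1 else 0.0
        # current flowing in from the neighbouring cells
        nxt[i] = v[i] + coupling * (left + right - 2 * v[i])
        if nxt[i] > threshold:
            nxt[i] = 1.0       # the cell fires a full action potential
            fired[i] = True
    v = nxt
```

Even in this caricature, each cell carries its own state and its own update, which hints at why a genuine cell-by-cell heart model - billions of coupled cells rather than thirty - demands massive computing resources.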