N° 3464 supplement . April 14, 2016 . Not to be sold separately


S.O.S. Planet
Special simulation issue

CEA at the heart of innovation
for extreme computing
and Big Data
CEA and Bull are co-designing exascale technologies


Harnessing exascale computing and data processing
will open unexplored perspectives for the numerical
simulation of complex physical phenomena and
industrial objects, by 2020 and beyond.
In order to tackle this challenge, CEA, in partnership
with Atos, is co-designing technologies to:
Reduce energy consumption
Process and manage massive flows of data
Increase performance, efficiency and modularity of
supercomputer architectures
Design fault-tolerant architectures
1. At the scale of a billion billion operations per second (exaflops) and
memory bytes (exabytes).

TERA 1000, developed in partnership with Atos/Bull
according to CEA requirements and installed in 2016,
is foreshadowing exascale supercomputers.

CEA boosts industrial innovation
Located at CEA's Bruyères-le-Châtel site, the TGCC (CEA
Very Large Computing Centre) hosts the CCRT
(Computing Centre for Research and Technology),
a shared infrastructure optimized for HPC.
CCRT partners receive 1.4 Pflops of
computing power, as well as services and expertise
supported by the CEA HPC team's skills – an essential asset for
their numerical simulations.

Numerical simulation of
combustion in a helicopter
turbo-engine - TURBOMECA

Aero-acoustic numerical simulation
of a car interior air conditioning
system - VALEO

CCRT partners as of March 2016
Airbus D&S, Areva, Cerfacs, EDF, Herakles, Ineris,
L’Oréal, SafranTech, Snecma, Techspace Aero, Turbomeca,
Thales, Thales Alenia Space, Valeo, CEA as well as
FranceGénomique consortium (supported by French
government PIA).


To know more: www-ccrt.cea.fr . Contact: christine.menache@cea.fr

Simulation of surface
currents on an aircraft nose
radome - THALES





S.O.S. Planet
Special simulation issue

Simulation of the magnetic
field in the
Earth's core.

9 billion humans,
one planet


A year of simulation p. 4
Antoine Petit, CEO of Inria p. 6
We need to simulate the planet now p. 8
A French savoir faire p. 10
Keepers of the planet p. 16
Olivier Gupta, Météo France's deputy director general p. 18
SMEs, natural element experts p. 20
Cybeletech, numerical farming p. 26
The new agricultural revolution p. 28
Nature revealed p. 34
Affordable immersion for everyone p. 38
The quest for fully autonomous vehicles p. 42
Networks of connected objects, it's a challenge p. 44
Solar Impulse 100% calculated p. 46

Chairman and CEO: Christophe Czajka
Deputy managing director: Julien Elmaleh
Director of the industry division: Pierre-Dominique Lucas
Editorial director: Christine Kerdellant
Deputy editorial director: Anne Debray
Editorial coordinator: Aurélie Barbaux
Chief sub-editor: Guillaume Dessaix
Art direction: Eudes Bulard
Contributors to this issue: Claire Laborde and
Rebecca Lecauchois (copy editing);
Laurent Pennec (layout)
Supplement to "L'Usine Nouvelle" n° 3464
of April 14, 2016 (commission paritaire n° 0712T81903)
Not to be sold separately.
A publication of the Gisi group, Antony Parc II - 10 place du Général-de-Gaulle - BP 20156 - 92186 Antony Cedex.
Publication director: Christophe Czajka
Printing: Roto France Impression, 77185 Lognes
Cover photo: Julien Aubert/Finlay, Jackson, Olson, Gillet,
Université Paris Diderot/IPGP/CNRS

L'USINE NOUVELLE I N° 3464 SUPPLEMENT I APRIL 14, 2016

We are going to need more than the good intentions of governments if we are to limit global
warming to less than 2°C now and feed 9 billion
humans in the future. Public-sector and economic
players must do everything in their power to meet these
challenges. Although it is no miracle solution, simulation enables us to be good diagnosticians by predicting
changes, especially climate change. Simulation also
provides invaluable decision-making tools to improve
how we manage resources and deal with natural hazards.
By modeling living organisms, simulation will facilitate
human existence on the only planet we’ve got.
Scientists already know how to model crop and micro-algae growth. And models can be created in every
domain, as Antoine Petit, INRIA's CEO, explains. We
can use the same corpus of mathematics to simulate
physical and biological phenomena, such as biochemical reactions and cell migration. Fluid mechanics
techniques used to calculate planes' air penetration have
been reused to model blood flow around the body. Other
techniques adapted to the
pharmaceuticals industry will
soon help us simulate clinical
trials, thus reducing how long it takes for treatments
to come onto the market. Coupling patient data with
general models is leading the way for patient-specific
medicine and more reliable diagnoses.
Combined with artificial intelligence technology,
models will become better adapted and more able to be
rebuilt from huge amounts of data. This means scientists will have an even greater responsibility. Just as the
parameters chosen for climate models determine how
close they come to real conditions, so too the quality
of modeling living organisms is the key to renewing
medical science and even human existence. Despite
space exploration, 9 billion human beings will have to
live on a single planet with limited resources. ❚❚



Solar System


Planet Nine


The first numerical
simulation trophies
In 2015, TERATEC organized the first
numerical simulation trophies. The
six prizewinners were revealed on 23 June
at the TERATEC Forum, held at the École
Polytechnique in Palaiseau (Essonne).
There were five categories, showcasing
the simulation expertise of start-ups
and SMEs, as well as the ability of the French
public-sector research ecosystem
to collaborate with private companies.
Since the simulation expertise of big French
companies is now well-established, they were
not included in these awards. The start-up
trophy was awarded to CybeleTech, which is
simulating plant growth [see page 26].
The SME trophy, showcasing a company
that has used simulation to develop new products
or services, was
awarded to Principia, which is simulating
seabed-surface links in offshore systems.
The European Center for Research and
Advanced Training in Scientific Computation
(CERFACS) was the joint winner (with
Turbomeca) of the collaboration trophy,
which rewards groupings between a big
company and an SME. This award was given
for collaborative simulation work on a turbojet
combustion chamber and first-stage turbine.
The innovation trophy was awarded
to HydrOcean, a start-up that has
industrialized its SPH (Smooth Particle
Hydrodynamic) simulation method
for application to fast, complex fluid flows
[see page 24]. Finally, the first jury prize for
SME projects displaying exemplary innovation
and promoting the use of numerical
computation was awarded to Distene
for its mesh generation tool, MeshGems.
TERATEC also made special mention of
Danielson Engineering, an SME that designs
and manufactures prototype thermal engines
for the automobile, aeronautics, and defense
industries. The company is using numerical
simulation to develop digital models with very
similar performance to real engines. ❚❚
Jean-François Prevéraud


Using mathematical models, researchers at the California Institute of
Technology (Caltech) believe they
have discovered a ninth planet in
the solar system. They reached this
conclusion while trying to explain
the behavior of celestial objects in
the Kuiper Belt, beyond Neptune’s
orbit. Six of these objects have an
elliptical orbit converging to the
same point and on the same 30°
tilt, suggesting that they are “attracted” by an invisible mass. Since the
probability of this being coincidence
is only 0.007%, the data could be
explained by the influence of a ninth

Planet Nine could be ten times
heavier than Earth.

planet’s gravity. This hypothetical
planet could have an extraordinary
mass – ten times that of Earth –
and orbit twenty times farther from
the sun on average than Neptune.
Planet Nine could also take 10,000 to 20,000 years to orbit the
sun. ❚❚ Julien Bergounhoux
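The quoted distance and period are mutually consistent: with Kepler's third law (T² = a³, with T in years and a in astronomical units for orbits around the Sun) and Neptune's mean distance of roughly 30 AU, a back-of-envelope check lands inside the quoted range:

```python
# Back-of-envelope check of Planet Nine's quoted orbital period using
# Kepler's third law: T^2 = a^3, with T in years and a in AU (Sun-centred).
NEPTUNE_AU = 30.1              # Neptune's mean distance from the Sun

# The article: Planet Nine orbits ~20x farther from the Sun than Neptune...
a = 20 * NEPTUNE_AU            # semi-major axis estimate, in AU
period_years = a ** 1.5        # Kepler's third law

# ...which gives roughly 15,000 years, inside the quoted 10,000-20,000 range.
```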


A new 3D simulation cluster
The it3D event, organized in autumn by the Bordeaux-based company Immersion, boosted the local 3D simulation industry. It even generated an idea
for a cluster bringing together companies such as ESI Group, Lumiscaphe,
and Diodasoft. Used intensively in the aeronautics and automobile industries
for the past decade, 3D simulation now needs to be assimilated by their
sub-contractors over the next five years. The hardware has become accessible and "the 'Factory of the Future' plan launched by the French state is
a fantastic opportunity to introduce this technology," explains Christophe
Chartier, Immersion's CEO. ❚❚ Nicolas César


France's first river simulator
A consortium run by the Compagnie Nationale du Rhône took delivery of France’s first
river navigation simulator in mid-March.
This 3D simulator was designed by Alyotech
and is used for training and improving the
skills of boat pilots navigating the Rivers
Rhône and Saône, especially the trickiest sections. The device has a 10-metre cylindrical
screen with a 240° angle of vision, together
with a helm station that can be configured
into five types of boat. The pilot’s cabin
equipment, as well as the signaling system,
currents and water-column density, weather
conditions, river traffic, time of day, etc. are

An infinite number of scenarios can be tested
in complete safety.

also modular. The scenarios include various
types of breakdown and alerts to replicate as
many situations as possible. ❚❚ J. B.



TERA 1000 in service
at the CEA-DAM
Atos has just delivered the first phase of the
new TERA 1000 supercomputer to the French
Atomic Energy and Alternative Energies
Commission, Military Applications Division
(CEA-DAM). This supercomputer has double
the theoretical computing power of the previous generation TERA 100 supercomputer and is five times more energy efficient.

“The TERA 1000 is the third generation
of supercomputer resulting from the Bull/
CEA-DAM partnership, which started in
the early 2000s. The computing power has
been increased 5,000-fold,” says François
Geleznikoff, the CEA’s director of military
applications. The first phase of TERA 1000
comprises two computing systems: one based
on an Intel Xeon E5 v3 processor and another
on an Intel Xeon Phi processor. This installation is a forerunner of the future generation of
exascale (10^18 flops) supercomputers that
will come into service by 2020. ❚❚ J.-F. P.


240 teraflops at the Cerfacs
The European Center for Research and Advanced Training in Scientific Computation
(Cerfacs) took delivery of a new supercomputer. Built by the Chinese company Lenovo, the machine delivers 240 teraflops of computing power
(1 teraflop is equivalent to 1 trillion floating-point
operations per second) and will triple
the computing power at this Toulouse-based
research center. Some 1.2 million Euros have
been invested in this new computer, which
replaces a 20-teraflop HP supercomputer and
completes another 53-teraflop supercomputer
built by Bull. The Cerfacs is a fundamental
and applied research center specializing in
numerical modeling and simulation. It deals
with technical and scientific topics in various
fields of public-sector and industrial research:

Fluid Mechanics

Siemens acquires CD-adapco

Siemens has acquired CD-adapco
for 970 million dollars to
complete its portfolio of
numerical simulation tools for
products and industrial
processes. CD-adapco specializes
in fluid mechanics (Star-CD),
solid mechanics,
electrochemistry, and acoustics.
The company generates turnover
of approximately 200 million
dollars, an increase of 12% a year
over the past three years, and
employs more than 900 people.

HydrOcean at
Bureau Veritas

240 x 1 trillion operations per second.

aeronautics, the automobile industry, space,
climate studies, etc. Approximately 150 researchers, engineers, and administrative staff
work at the Cerfacs. ❚❚ Ridha Loukil

Bureau Veritas recently acquired
HydrOcean, a spin-off of the
École Centrale de Nantes
specializing in numerical
simulation of fluid dynamics
for the maritime sector.
HydrOcean was set up in 2007
by Erwan Jacquin and the École
Centrale de Nantes. It provides
design and optimization services
for ships, offshore structures,
racing yachts, and marine-energy
recovery systems.

Consequences of a 100-year Flood in Paris


If the River Seine bursts its banks, the floodwaters
will rise gradually and are not expected to cause
casualties. That is the prediction made by the organizers of the
EU Sequana exercise, which simulated a 100-year flood
between 7 and 18 March 2016. Nevertheless,
a flood like the one in 1910, when the floodwaters rose
8.62 meters, would have major consequences for life
and the economy in the Île-de-France region. According to
numerical simulations by the Paris Police Prefecture,
such a flood could result in 5 million disaster victims,
affect 400,000 direct jobs, and leave 1.5 million inhabitants
without electricity. A study by the Organisation for
Economic Co-operation and Development estimates
that property damage could reach 3 to 30 billion euros and
cost 0.1-3% of France's GDP over five years. ❚❚ A. B.

Optis swallows up Genesis

The French software vendor
Optis that specializes in light
simulation has acquired Genesis,
another French software vendor
with expertise in sound
simulation and perception.
Both companies share a common
simulation philosophy based on
using the physics of phenomena
they simulate. Tools simulating
human sight and hearing will help
both companies’ designers take
account of “user experiences”
to develop their products.



"Modeling is
a core activity
at INRIA"
Simulation, especially modeling, is a core activity
at the French National Institute for Research
in Computer Science and Automation (INRIA),
as its CEO, Antoine Petit, explains.


What type of simulation work does INRIA do?
Modeling is a long-standing core activity at INRIA since we
simulate models, which are needed in every field. But the big
difficulty lies in designing appropriate models. For example,
climate modeling is extremely difficult since there are several
models rather than just one. Climate change controversies
have all focused on whether or not these models are valid.
It’s hard to know whether or not a model is correct, especially
since no single model can take into account every parameter.
You have to omit some of the less important parameters, and
this causes disagreements.
You say that everything can be modeled, but INRIA can’t be
at the cutting edge of every field.
The French school of mathematics is regarded as one of
the best in the world. In some sense, we’re intrinsically
good at modeling. What’s good about modeling exercises
is that they require the cooperation of mathematicians and
computer scientists with people working in applied sectors.
Therefore, we are not better in any specific applied field than
in another. We’ve developed our programs in response to



interest in modeling from various disciplines; this happened in
healthcare twelve years ago. But we work with every sector
and, on the whole, the techniques are always the same.
What environmental research is INRIA carrying out and who
are you working with?
We’re working with institutions such as Météo France, Institut Pierre-Simon-Laplace, and the University of Versailles
on climate modeling – more specifically on air quality – and
in partnership with companies such as EDF, modeling wind
circulation to help position wind turbines. Environmental
research is not limited to climate issues.
Have you developed the results of this research yet?
Generally speaking, I don't like the term "development" since
it’s often put into the same category as financial spin-offs. This
isn’t necessarily the case and other types of value can take precedence over monetary value. For example, over the past thirty
years INRIA has initiated 120 start-ups, which have created
3,000 jobs. Several start-ups a year are launched in the healthcare sector. It’s a bit early for this in the environmental sector,
even though some topics lend themselves to development. In
collaboration with the cities of Paris and San Francisco, we’ve
designed a smartphone application called SoundCity, which
measures city noise pollution and enables users to collect data.
We also have a project looking at microalgae as a protein and
energy source, modeling how environmental conditions affect
growth according to microalgae type. This project is part of
the BIOCORE program in Montpellier.
Do you need new models to simulate living organisms
or can you adapt existing models?
There’s often a common basis. For example, fluid mechanics techniques used to simulate aeroplanes’ air penetration
are now being used to model blood flow around the body. We
can reuse an entire corpus of mathematics when we switch
from one application to another.
How many teams work on modeling?
INRIA employs 2,700 people, including 1,800 researchers
(600 of whom have permanent contracts). Modeling could
potentially interest any of our teams, especially life-science
research groups (25% of our total teams). Part of our work
involves running calculation codes on increasingly powerful
computers. To move from petaflops to exaflops, we have
dedicated simulation (rather than modeling) teams writing
software stacks to get the best use out of these new supercomputers. We’re working with manufacturers – especially
Bull and Hewlett Packard – as well as on virtual supercomputers to take into account processor-failure probabilities.
Since some computer codes use much more electricity
than others, we’re designing algorithms to minimize power
consumption and are also working on dedicated hardware.
Research is a long-term effort whereas digital technology moves
very quickly. How do you reconcile these different paces?
It’s a popular misconception that digital technology
moves very quickly, although it’s true that this technology


can come onto the market very quickly. In 2003, the French
National Science Fund, which was replaced by the National
Research Agency (ANR), launched three new areas of activity: computer security, bulk data, and new mathematical
interfaces. These are today’s research topics. But it would
be a mistake to think that some amazing new research topic
is going to emerge in six months’ time that will put all our
researchers out of a job. They know what the topics of the
future will be.
What are your main current areas of research?
Everything relating to modeling and simulation, of course.
We’re conducting research on the Internet of Things, data
and big data, developing new algorithms for data processing,
display, encoding, and transport. Another major research
area is cybersecurity, cryptography, and protocols. Robotics
is also a key topic and we’re very interested in autonomous
cars. We also run many multi-disciplinary research programs
at the interface of various disciplines.
What happened to the initiative giving SMEs access to your expertise?
We cooperate with companies in several ways. First, companies benefit from skills transfers whenever our researchers
move into the business world. We’ve also set up a dozen
joint laboratories with the main industrial groups: Microsoft,
Alcatel-Nokia, Total, EDF, Orange, Alstom, Airbus Group,
etc. What’s more, we’ve established many start-ups. Finally,
our INRIA innovation labs collaborate with SMEs – on much
shorter deadlines than with big groups – to change innovations into products. We’ve already set up a dozen such
innovation labs and another is currently being established
with Safety Line, to reduce airplanes’ fuel consumption. In
2015, we also launched INRIA Tech in Lille, with support
from EuraTechnologie Center of Excellence. Although its
name is inspired by CEA Tech, we’re not doing the same
thing. INRIA Tech has a dozen engineers – and not researchers – providing initial responses to industrial needs. It’s
got off to an encouraging start. ❚❚


We need to simulate
the planet now
Even if global warming slows down, it will continue to radically change our environment.
Numerical simulation is helping predict these changes.




The agreement signed by 195 countries at the COP21
conference in December 2015 is historic, imperfect, and very ambitious. Working towards global
warming of less than 2°C – with an aim of 1.5°C to meet
the needs of the most vulnerable states – was not easy
to negotiate. Furthermore, funding for energy transition
in developing countries is far from certain after 2020.
Nevertheless, things seem to be in motion. Apart from their
financial resources, Western countries have something else
to share and promote in the fight against global warming:
their technological expertise in numerical simulation.
Modeling natural systems helps understand and predict
climate change, of course, as well as the consequences of
climate disruption (flooding, rising sea levels, and catastrophes) and ways to optimize natural resources. In the

race to calculate natural resources under threat – such as
water – and resources that need developing – such as solar,
wind, tidal, and other renewable energy sources – France,
with its long tradition of applied mathematics teaching,
is in a strong position. French researchers, industrialists,
and a start-up incubator are working on new simulation
models. They are aiming to reproduce ever more advanced
phenomena on their computer displays and study new
approaches to numerical simulation, making it easier for
everyone to take up. ❚❚

Simulating the behaviour of industrial buildings before construction helps
assess and limit the dispersion of pollutants into the atmosphere.




A French savoir faire
French laboratories have become experts at
simulating our environment – whether air-conditioning systems, storms or major industrial
catastrophes – as demonstrated below.
By Alain Clapaud


Just when we are seeing the first consequences of
climate change – including in France – simulation is
playing an increasingly important role in protecting
our environment and combating global warming. Simulation
calculations are used on all fronts to protect the environment: optimizing human infrastructures and water/energy
resources, predicting coastal erosion in the face of rising
sea levels, forecasting catastrophes, modeling pollution
phenomena, etc. And French experts – at EDF, Suez, Veolia,
the CNRS, the National Institute for Research in Computer
Science and Automation (INRIA), the Scientific and Technical Center for Building (CSTB), and the National Research
and Safety Institute (INRS) – are often at the cutting edge of
innovation, providing original solutions.

Protection from the climate

The worldwide agreement signed at the COP21 conference in December 2015 set nations new targets to limit
CO2 discharged into the atmosphere. Many countries
are already facing climate change. France – especially its
coastline – has not been spared, as Antoine Rousseau, an
INRIA researcher in charge of the LEMON team (coastline,
environment, models, and numerical tools), underlines:
“Coastlines face three risks: storms and tsunamis, erosion
and flooding. We need to design mathematical models to
understand what's happening long term, as well as 'real-time' risk-management simulation tools to evacuate flood
zones in time,” he explains.
It’s not easy. Every process brought into play – whether
winds, currents or ground motion – is very complex. The
equations require vast amounts of calculation, not to
mention the difficulty of coupling these simulations when
each process changes at its own pace. Erosion is caused
by water/sediment interactions when seabeds influence
coastal currents and vice versa. “Currents change within
hours whereas modifications to sandy seabeds occur over
several months or years. From a mathematical perspective,

In Bordeaux, the Ramses supercomputer delivers real-time flood
prevention and control of wastewater treatment plants.

coupling phenomena is very difficult,” says Rousseau. But
it’s not impossible. A program funded by the French National Research Agency (ANR) enabled Rousseau’s research
team to limit silting up in the Sète (Hérault) area. The team
used their calculation codes to place geotextile “socks” on
the seabed; these “socks” modify currents and hence limit
coastal erosion.
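The multirate coupling difficulty Rousseau describes can be illustrated with a toy sketch. Everything below is an invented placeholder (this is not INRIA's model); only the coupling structure matters: the fast process (currents, changing within hours) is sub-stepped many times inside each step of the slow process (seabed evolution, changing over months).

```python
# Toy multirate coupling: a fast process (currents) is sub-stepped inside
# each step of a slow process (seabed evolution). The dynamics are invented
# placeholders; only the two-timescale coupling structure is the point.

def step_current(u, h, dt):
    # Fast process: the current relaxes toward a seabed-dependent value.
    return u + dt * (h - u)

def step_seabed(h, u, dt):
    # Slow process: the seabed is slowly eroded in proportion to the current.
    return h - dt * 0.01 * u

def couple(u, h, slow_dt, substeps, n_slow_steps):
    fast_dt = slow_dt / substeps
    for _ in range(n_slow_steps):
        for _ in range(substeps):       # many fast steps per slow step...
            u = step_current(u, h, fast_dt)
        h = step_seabed(h, u, slow_dt)  # ...then one slow update
    return u, h

current, seabed = couple(u=0.0, h=1.0, slow_dt=1.0, substeps=100, n_slow_steps=10)
```

Real coupled codes face the same structural choice: how often, and in which order, the fast and slow solvers exchange their states.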
While simulation is invaluable for long and medium-term
environmental forecasting, it is also essential for dealing with
natural catastrophes. The LEMON team is collaborating closely with the University of Montpellier on real-time modeling
of city flood scenarios. Their goal is to design tools for use
by the Prefecture and firefighters during evacuations. “The
issue here is working out whether we need a rough, very
quick forecast or should wait for a better quality forecast.
It's about finding a compromise. Mathematics has a role to
play in improving the model (at constant calculation time)
and hence forecasting quality,” says Rousseau.


“We’re working on more
eco-friendly materials”
Karim Aït-Mokhtar,
Director of the Environmental
Engineering Laboratory
What progress have you made
on numerical simulation?
We’re working on
materials, especially on
simulating transfer of
aggressive agents. This
will help work out how
long structures such
as the Millau Viaduct will
last in relation to their
environment. The goal is to
design more eco-friendly
coatings than those used
currently, which often
consist of chemical
compounds with low

biodegradability. Another
area of research is focusing
on buildings’ energy
consumption and quality
of life.
One of your teams is working
on theoretical aspects. Why
is that?
Our theorists are aiming to
make simulation cheaper
and faster by developing
new calculation methods.
Computational Fluid
Dynamics (CFD) codes use a
lot of computing resources,
especially for long-term
simulations. But if you use
more approximate models,
such as zonal or empirical
models, you lose the
subtlety of meshing. ❚❚


Better use of natural resources

Suez Environnement is also interested in this operational
simulation. “In the event of storms, we have to manage water
arriving suddenly in the network,” explains Bertrand Vanden
Bossche, who manages the R&D portfolio of the Smart Solutions business. In the Paris region, Vanden Bossche’s team
has started installing extremely complex calculation models
combining meteorological data and radar image analysis
from Météo France. “This enables us to calculate runoff, i.e.
the volume of water that is going to pour into our network.
Once we know the water levels in our storage basins and
networks, we can organize the correct pumping strategy to
ensure wastewater treatment plants are not sent any more
water than they can actually accept. At the same time, we try to
avoid discharging potentially polluted water into the natural
environment,” says Vanden Bossche. In Bordeaux, Suez has
gone even further using algorithms to control sluice-gate
height and launch pumping operations to absorb rainwater
as effectively as possible.
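The decision logic Vanden Bossche describes can be sketched with invented figures (this is not Suez's model): estimate the runoff volume entering the network from a rain forecast, then plan pumping so the treatment plant never receives more than it can accept, flagging any overflow into the natural environment.

```python
# Illustrative runoff-and-pumping sketch (all numbers invented, not Suez's
# model): convert forecast rainfall into runoff volume, store it in basins,
# and pump toward the plant without exceeding its treatment capacity.

RUNOFF_COEFF = 0.6         # fraction of rainfall reaching the network
CATCHMENT_KM2 = 12.0       # drained area
PLANT_CAPACITY = 50_000.0  # m3 the plant can treat per time step
BASIN_CAPACITY = 200_000.0 # m3 of storage before polluted discharge

def runoff_m3(rain_mm):
    # 1 mm of rain over 1 km2 is 1,000 m3 of water
    return rain_mm * CATCHMENT_KM2 * 1_000 * RUNOFF_COEFF

def plan_pumping(rain_forecast_mm):
    stored, overflow, plan = 0.0, 0.0, []
    for rain in rain_forecast_mm:
        stored += runoff_m3(rain)
        pumped = min(stored, PLANT_CAPACITY)  # never exceed plant intake
        stored -= pumped
        if stored > BASIN_CAPACITY:           # basins full: forced discharge
            overflow += stored - BASIN_CAPACITY
            stored = BASIN_CAPACITY
        plan.append(pumped)
    return plan, overflow

plan, overflow = plan_pumping([2, 15, 40, 5, 0, 0])  # mm per time step
```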

Simulation can also be used to act on the causes of global
warming, helping the main industrial countries reduce their
CO2 emissions. In particular, simulation helps increase the
level of renewable energy sources in each country’s energy
mix by making them more cost-effective. Photovoltaics, wind
power, tidal turbines and new energy sources require heavy
investment. The five offshore wind farms authorized for
construction off the French coast represent a 7-billion-euro
investment. It is therefore essential to have reliable calculations of each turbine’s potential wind power, i.e. estimate as
accurately as possible how much energy it will produce over
its life span. “Calculating wind turbines’ potential power over
their 20-25-year life span requires sophisticated statistical
and dynamics models to understand wind interactions
where turbines are located,” explains Pierre-Guy Thérond,
director of new technology at EDF Énergies Nouvelles. To
achieve this, EDF is increasingly abandoning simple simulation models based on laminar flow. Instead, the company
is turning to Navier-Stokes fluid mechanics equations (or
CFD equations for fluid dynamics calculations), which have
until now been used in aeronautics. Thérond justifies this
far more expensive choice in terms of computing power.
“CFD models give a better representation of turbulence
effects. In complex zones – i.e. very rugged terrain on which
wind turbines are surrounded by complex wind movements


Cyclone Xynthia – Simulation goes to court
In February 2010, Cyclone Xynthia –
combined with a high tidal coefficient
– caused breaches of several dikes
and flooding in the Vendée. Some 29
people were killed, despite the
warning issued by Météo France.
"The French Geological Survey
(BRGM) has been active in
explaining Cyclone Xynthia's physical
causes in reports submitted to the
authorities. These assessments were
based largely on numerical
simulation," says Fabrice Dupros, the
BRGM's head of high-performance
computing. The insurance
companies MAIF, AXA and MAAF
funded the Johanna research project,
which cross-checked simulation data
with insurance claims. A model has
been drawn up linking very-high-resolution
coastal flooding data
(water height, current speed) with
building damage. This model will
help choose construction materials
and urban development strategies to
prevent the consequences of such
catastrophes as far as possible. ❚❚

and vortexes – the laminar flow hypothesis is no longer
sufficient," he states. EDF's approach has optimized wind-turbine location in relation to relief and to one another.
This approach prevents the wake of the first wind turbines
facing the wind from compromising the efficiency of those
placed behind them.
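The statistical side of such a yield estimate can be sketched with invented figures (real studies layer CFD-resolved turbulence and wake effects on top): combine the site's wind-speed frequency distribution with the turbine's power curve to get an expected annual energy production.

```python
# Minimal sketch (all figures invented) of a statistical wind-yield
# estimate: expected annual energy = sum over wind speeds of
# (fraction of the year at that speed) x (hours/year) x (power at that speed).

RATED_KW = 2000.0  # hypothetical turbine rating

def power_kw(v):
    """Simplified power curve: cut-in 3 m/s, rated from 12 m/s, cut-out 25 m/s."""
    if v < 3 or v > 25:
        return 0.0
    if v >= 12:
        return RATED_KW
    return RATED_KW * ((v - 3) / (12 - 3)) ** 3  # cubic ramp-up

# Site wind climate as (wind speed in m/s, fraction of the year) pairs
site = [(2, 0.15), (5, 0.30), (8, 0.30), (11, 0.15), (14, 0.08), (26, 0.02)]

HOURS_PER_YEAR = 8760
annual_mwh = sum(frac * HOURS_PER_YEAR * power_kw(v) for v, frac in site) / 1000
```

Note how the speeds below cut-in and above cut-out contribute nothing, which is exactly why the shape of the site's wind distribution, not just its mean speed, drives the estimate.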
Water is another natural resource that must be optimized.
As a result of rising average temperatures and the world’s
population increasing to 8 billion people by 2025, the
amount of water available per inhabitant will drop by almost
a third worldwide. France will not escape this trend since farmers in southern France are already affected by watercourse
pumping restrictions every summer. Here too, simulation
has a role to play, as the Astuce & Tic consortium – which
studied water-table changes on the Crau Plain – has shown.
The triangle formed by Arles, Salon-de-Provence and Fos-sur-Mer (south of the Alpilles) covers a water table that is
very important for this region and is subject to both climate
constraints and heavy demographic pressure. Fabienne Trolard, director of research at INRA, has assessed the impact
of global warming and economic development on this area.
Her calculations show that a 14% reduction in grassland
combined with a 30% drop in irrigation water and a 30%
rise in drinking water extraction would cause a drop of 1.5 to
13 meters in the water table, depending on the location studied.
This key data must guide political decisions for urban development in this area.
Water supply network operators are also directly confronted
with the need to protect water resources, especially from
pollution threatening water quality. Suez Environnement is
simulating how pollutants spread through soil to assess the
impact of agricultural fertilizer on groundwater.


Simulation of wind-force
distribution on
a wind farm.




A flooding scenario in La Faute-sur-Mer (Vendée), based on detailed topography of one of its neighborhoods.

IMPROVING INFRASTRUCTURE DESIGN

While simulation promotes better use of renewable
energy, it can also be used to improve construction work
and make buildings more energy efficient. “We’ve developed
a simulation code called MATHIS (modeling buildings’
aeraulics, thermics and unsteady humidity),” explains
Maxime Roger, director of climatology, aerodynamics,
pollution, and water purification at the CSTB. “This is no
conventional fluid mechanics code, such as those used to
model wind, but rather a nodal code representing every
room within a building in the form of nodes. Each node is
interconnected to models representing air vents and other
accessories in systems. Buildings’ ventilation, hygrometry,
and other parameters can also be represented over a year,
taking into account many different use scenarios.” The CSTB
believes this simplified simulation will facilitate its uptake
by architects and design offices, especially for compliance
with RT 2012 thermal regulations.
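The nodal principle Roger describes can be sketched in a few lines: each room becomes a node carrying an averaged quantity, and links carry airflow between nodes and to the outside. This is only an illustration of the idea, not the actual MATHIS code; room volumes, airflow rates and CO2 levels are invented example values:

```python
# Toy nodal building model: rooms are nodes, links are airflows.
# (Illustrative only -- not the MATHIS code; all numbers are made up.)
rooms = {"living": {"volume": 60.0, "co2": 800.0},    # m3, ppm
         "bedroom": {"volume": 30.0, "co2": 1200.0}}
links = [("living", "bedroom", 50.0)]                 # exchange flow, m3/h
outdoor_co2, vent_rate = 420.0, 90.0                  # ppm, m3/h of fresh air

dt = 1.0 / 60.0                      # one-minute time step, in hours
for _ in range(600):                 # simulate ten hours
    flux = {name: 0.0 for name in rooms}
    for a, b, q in links:            # well-mixed exchange between nodes
        flux[a] += q * (rooms[b]["co2"] - rooms[a]["co2"])
        flux[b] += q * (rooms[a]["co2"] - rooms[b]["co2"])
    flux["living"] += vent_rate * (outdoor_co2 - rooms["living"]["co2"])
    for name, r in rooms.items():    # explicit time stepping, node by node
        r["co2"] += dt * flux[name] / r["volume"]
# After ten hours of ventilation both rooms sit near the outdoor level.
```

The same node-and-link bookkeeping extends naturally to humidity and heat, which is what makes such a simplified representation fast enough to run a whole year of use scenarios.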
Nevertheless, CFD calculations are still essential for simulating more complex phenomena, e.g. studying the impact of buildings and infrastructures on their environment. Cheaper access to computing capacity makes it possible to simulate how wind acts on a structure, taking into account the surrounding buildings within a radius of several hundred meters. More developed numerical models can
also simulate phenomena such as rain. “We know how to
model driving rain in stadiums or semi-open spaces, such as
shopping malls partly covered by a canopy,” explains Sylvain
Aguinaga, head of the CSTB’s numerical modeling division.
It is impossible to carry out small-scale simulation of water
droplet dispersion in wind tunnels; such phenomena can
only be predicted via numerical simulation. “Our clients –
design offices and architects – therefore use concomitant
rain/wind scenarios to predict whether or not terraces and
gangways will be in the rain. We then suggest modifications
to a building’s geometry before construction and hence limit
these phenomena,” continues Aguinaga.
Simulations of buildings’ microclimate are increasingly
sophisticated. The Center for Research in Architectural
Methods (CERMA) – a CNRS laboratory now merged into
the Nantes Center for Research in Urban Architecture – has
developed solar simulation software called Solene. This
software enables architects to view shade buildings will cast
L’Usine Nouvelle – n° 3464 supplement – April 14, 2016



Sabella’s D10 turbine runs on numerical simulation

on the ground, calculate how long sunshine will last at any
point in a 3D scene, and estimate direct natural light levels
outside buildings and in their rooms. Although these tools
are now standard in architects’ and design offices’ toolkits, Solene has evolved to integrate increasingly complex factors. The impact of grassy areas, ponds and trees planted
around buildings is now integrated into calculations. In
addition, for buildings’ thermo-aeraulic calculations, the
Solene software can be coupled with fluid dynamics simulation tools such as those supplied by the software vendor
Fluent. Simulating local microclimates means architects
can calculate as accurately as possible how comfortable
buildings will be (both inside and outside) for their users. This would help avoid buildings such as the recently constructed 20 Fenchurch Street skyscraper in London, whose solar glare dazzles passers-by and melts the plastic components of vehicles parked in front of it.
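Underneath such studies is standard solar geometry. The sketch below (not the actual Solene code) estimates the sun’s elevation and the length of the shadow a building casts; the latitude, date and building height are arbitrary example inputs:

```python
import math

def solar_elevation(lat_deg, day_of_year, hour):
    """Approximate solar elevation (degrees) from textbook solar-geometry
    formulas: Cooper's declination and the hour angle, in solar time."""
    decl = 23.45 * math.sin(math.radians(360 * (284 + day_of_year) / 365))
    hour_angle = 15.0 * (hour - 12.0)          # degrees away from solar noon
    lat, d, h = (math.radians(v) for v in (lat_deg, decl, hour_angle))
    sin_el = (math.sin(lat) * math.sin(d)
              + math.cos(lat) * math.cos(d) * math.cos(h))
    return math.degrees(math.asin(sin_el))

def shadow_length(height, elevation_deg):
    """Shadow cast on flat ground by a building of the given height (m)."""
    return height / math.tan(math.radians(elevation_deg))

# A 20 m building in Nantes (47.2 deg N) at solar noon on the winter
# solstice (day 355): a low sun, so a shadow nearly three times the height.
el = solar_elevation(47.2, 355, 12.0)
shadow = shadow_length(20.0, el)
```

Repeating this calculation over every point of a 3D scene, hour by hour, is what yields the sunshine-duration and natural-light maps the article describes.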

COMBATING POLLUTION


Modeling sea currents and sea beds helped decide the best location for the tidal turbine.

This tidal turbine was submerged in June 2015 and connected to Ushant Island’s (Finistère) power grid in November 2015. The D10 is the first tidal turbine designed by Sabella to be tested full-size. Sabella turned to the engineering company Artelia – which had modeled the seabed around Ushant Island – to choose a location for its tidal turbine. “Currents were modeled using bathymetric readings and sea-current measurements. The installation site was chosen on the basis of this data, the swell and its planarity,” says Erwann Nicolas, head of mechanical engineering design at Sabella. The tidal turbine’s structure and blades were also sized using simulation calculations. CFD simulation, which was required for detailed design of the nacelle and carbon fiber blades, was done by HydrOcean. Engineers studied the mechanical and vibratory strength of the shaft and blades, as well as the turbine’s fatigue. ❚❚

Although a lot of resources have been poured into simulating natural phenomena, considerable investment has also
been made in simulating the effects of human activity. There
is nothing unusual these days about consulting pollution
maps for major cities, in the same way as we check weather
forecasts. Vivien Mallet, an INRIA researcher, underlines
the progress made in forecasting since the 1980s: “Simulation and observations made using sensor networks now
correlate more closely, which has really helped improve
weather forecasts. This is especially true for air-quality
forecasts, which are often very uncertain,” explains Mallet.
For this reason, researchers improving these models have installed many new sensors. Data from weather and air-quality monitoring stations – e.g. AIRPARIF – is already supplemented by satellite images and, increasingly, by the Internet of Things. Smart pollution
sensors have been installed on Lyon’s tramways and are
expected to appear on public streetlights in more and more
French cities. All these data sources will feed into numerical models to produce more reliable pollution forecasts.
Even cell phones can serve as sensors. For example, the
SoundCity application uses cell phones to measure city
noise pollution.
Besides forecasting atmospheric pollution levels several
hours in advance, simulation also helps respond to catastrophes. For example, we can simulate how a dirty bomb
would explode in Place de l’Hôtel-de-Ville in Paris; depending on the weather, this will tell us the arrondissements
in which to raise the alarm. Numerical simulation is also used when constructing new petrochemical facilities to see the impact of toxic discharges into the atmosphere. During the Fukushima disaster, the French Institute for Radiological
Protection and Nuclear Safety (IRSN) made operational
forecasts to find out whether or not the population was
going to be exposed to radionuclide emissions. “We need
high-quality weather data for reliable forecasts,” says
Mallet. Radioactive smoke plumes move according to wind
direction and the risk to the population varies according to
whether the wind is blowing inland or out to sea. “Weather


SoundCity – listening to cities via smartphones

Noise pollution map in Paris, drawn up using data sent from smartphones.

The SoundCity application was developed as part of INRIA’s CityLab initiative on smart cities. The application uses smartphone microphones to measure noise all day long. “The application gives people an idea of their personal noise exposure, while the data also helps correct city noise maps,” explains Vivien Mallet, an INRIA researcher. The main problem is the poor quality of readings, which are measured using mobile applications that are, by definition, very unstable. The project also plans to make progress on combining numerical simulation with observations from poor-quality mobile sensors. Currently, although the application sends about 1 million measurements every 4 days, only about 2-3% of them can be used to correct noise maps. “We’re hoping that participatory data collection will enable us to correct and hone our simulations and forecasts.” ❚❚





forecasts can be very uncertain, especially in light winds.
During the Fukushima disaster, Japanese scientists used
operational simulation to predict smoke-plume movements
and decide when they could discharge gas into the atmosphere to stop the reactor chamber exploding. We now want
to calculate not only where smoke plumes will go, but also

how much confidence we can have in these forecasts. We
want to quantify models’ uncertainty in order to make the
best possible decisions,” explains Mallet. Numerical simulation is about helping humans make the best decisions to
protect their environment. Equations developed in French
laboratories help achieve this goal. ❚❚
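The physics behind such plume forecasts is often introduced through the textbook Gaussian plume formula, sketched below. This is an illustration, not IRSN’s operational codes; the emission rate, wind speed and dispersion widths are invented example values (in real models the widths grow with downwind distance and atmospheric stability):

```python
import math

def gaussian_plume(q, u, y, z, h, sigma_y, sigma_z):
    """Concentration (g/m3) of a continuous release of q g/s carried by a
    wind of u m/s, at crosswind offset y and height z, from a source at
    height h. sigma_y/sigma_z: plume widths at the downwind distance of
    interest. The second vertical term reflects the plume off the ground."""
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = (math.exp(-(z - h)**2 / (2 * sigma_z**2))
                + math.exp(-(z + h)**2 / (2 * sigma_z**2)))
    return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Breathing-height concentration on the plume axis, for widths typical of
# a point about 1 km downwind of a 50 m stack.
c = gaussian_plume(q=100.0, u=5.0, y=0.0, z=1.5, h=50.0,
                   sigma_y=80.0, sigma_z=40.0)
```

The formula makes the article’s point tangible: concentration falls off the plume axis and dilutes with wind speed, so everything hinges on knowing the wind – hence the demand for high-quality weather data.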

KEEPERS OF THE PLANET
Climate simulation models served as the basis for the conclusions of the IPCC report.
Teams of researchers around the world are striving to complete and improve their models.
By Thierry Lucas


None of them appeared on the podium at COP21 or made any decisions affecting the future of our planet. And yet without them nothing can be done. We’re
referring to the hundreds of researchers and engineers
throughout the world who are modeling climate change,
i.e. converting global climate phenomena into equations.
The goal, after lengthy calculations using the world’s most
powerful computers, is to predict likely temperature and
precipitation changes over the next twenty, fifty, and even
one hundred years.

PREDICTING CLIMATE CHANGE – COLLABORATIVE WORK

This enormous task, involving vast amounts of simulation, is largely collaborative. Global atmosphere, landmasses (with their relief and vegetation), oceans, pack ice, clouds, atmospheric dust, CO2 and other greenhouse gases have all been
simulated. Experts are now striving to validate and couple
their models. “The first atmosphere models appeared in the
1960s and were derived from those developed for weather
forecasting. Simplified ocean models have been around since the 1970s, but only became 3D from the 1990s onwards,”
says Michel Déqué, a researcher at the French National
Center for Meteorological Research (Météo France).

COMBINING 20-30 MODELS

Increasingly powerful computers mean that atmosphere
and ocean models can now be coupled. Demand from the

Intergovernmental Panel on Climate Change (IPCC) has
boosted development of this technology since it is impossible
to calculate CO2-linked global warming without a coupled
atmosphere/ocean model. This core climate simulation is
central to the world’s 20-30 climate models.
The French research community has focused on two global
models: one produced by Météo France and another by the
Institut Pierre-Simon-Laplace (IPSL), a federation of nine
specialist environmental laboratories. Other major research
groups have been formed: the National Center for Atmospheric Research (Colorado) and the Geophysical Fluid Dynamics
Laboratory (Princeton University, New Jersey) in the USA; the
Hadley Centre for Climate Prediction and Research in the UK;
and the Max-Planck Institute for Meteorology in Germany.

DISCREPANCIES BETWEEN RESEARCHERS’ MODELS

There are no fundamental discrepancies between research
teams’ core models, even though each team has its own
method of dividing up the atmosphere for calculations. For example, each “mesh” in one type of meshing is a parallelepiped, 100-200 km on each side, becoming thinner as it approaches the earth’s surface. The parameters
(temperature, wind speed, etc.) within each mesh are average
values, which are varied over time to simulate climate change.
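A toy version of such a mesh makes the averaging idea concrete: the sketch below builds a coarse latitude-longitude grid and computes an area-weighted global mean over a made-up temperature field (grid size and values are illustrative, far simpler than any real climate model):

```python
import math

n_lat, n_lon = 90, 180            # 2 x 2 degree cells (~200 km at the equator)
dlat = 180.0 / n_lat

def row_weight(i):
    """Relative area of cells in latitude row i: cells shrink toward
    the poles as the meridians converge."""
    lat_center = -90.0 + (i + 0.5) * dlat
    return math.cos(math.radians(lat_center))

# Toy field held in the mesh: one average temperature per cell,
# decreasing linearly from equator to poles.
temp = [[30.0 - 50.0 * abs(-90.0 + (i + 0.5) * dlat) / 90.0
         for _ in range(n_lon)] for i in range(n_lat)]

# A global mean must be area-weighted, or polar rows are over-counted.
weights = [row_weight(i) for i in range(n_lat)]
total_w = n_lon * sum(weights)
global_mean = sum(weights[i] * temp[i][j]
                  for i in range(n_lat) for j in range(n_lon)) / total_w
```

A real model holds several such fields per cell (temperature, wind, humidity, …) across many vertical layers, and steps them all forward in time.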
Laboratories are also developing many specific models for
glaciers, aerosol impact, changes in vegetation, etc. These
models have either not yet been integrated into climate models

A GLOBAL TEST BED
Climate-modeling researchers have been regularly comparing their results since 1995. Every five or six years, CMIPs (Coupled Model Intercomparison Projects) invite research teams throughout the world to run their models on shared scenarios. These projects aim to assess climate models’ performance, see where they diverge, and thus further progress in climate science.
Simulation carried out for CMIP5 served as the basis for the fifth Intergovernmental Panel on Climate Change (IPCC) report’s conclusions in 2013. Researchers are currently designing scenarios in preparation for CMIP6. This time, small amounts of simulation will help compare the various core models, while specialized models – for example on the effect of aerosols – will undergo a series of tests. The exercises are expected to start in 2016 and 2017. Several years’ research will then be needed to capitalize on the results.


or have been integrated only in simplified form. “It’s not necessarily worthwhile integrating everything since this isn’t the best use of computing resources or budgets,” says Déqué.

NASA / SVS

EMPIRICAL EQUATIONS FOR SUB-MESH PHENOMENA

The regularly observed discrepancies between climate
models – especially during international atmospheric model
intercomparison projects [see box opposite] – have already
given researchers a lot to do. Researchers have identified the cause as “sub-mesh” phenomena, which play out at scales smaller than the mesh itself: radiation exchanges, air turbulence in the atmospheric layer near the earth’s surface, the role of clouds and sulfate-containing aerosols, etc.
Basing calculations on theory alone is not enough. Researchers need to fine-tune their models using empirical equations, which are gradually being improved by the results of
measurements and observations. Clouds in particular, which
cover large areas above oceans, play a key role. “Although
clouds have a significant impact on global temperature
changes, their role remains poorly understood. Clouds are

responsible for many of the discrepancies between various
climate models’ results,” says Jean-Yves Grandpeix, a
researcher at the Dynamic Meteorology Laboratory (LMD),
a joint research unit on three sites: the École Polytechnique
in Palaiseau (Essonne), the École Normale Supérieure, and
Université Pierre-et-Marie-Curie in Paris.
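In miniature, tuning an empirical coefficient looks like the sketch below: a single made-up “cloud cooling” parameter is fitted by least squares to equally made-up observations. Real parameterization tuning involves far richer data and many coupled parameters:

```python
# Hypothetical (cloud_fraction, observed mean temperature in K) pairs.
observations = [(0.2, 290.5), (0.4, 288.0), (0.6, 285.6), (0.8, 283.1)]

def model_temp(cloud_fraction, alpha):
    """Toy parameterization: clouds cool the mesh average by alpha K
    per unit of cloud fraction."""
    return 293.0 - alpha * cloud_fraction

def misfit(alpha):
    """Sum of squared differences between model and observations."""
    return sum((model_temp(c, alpha) - t) ** 2 for c, t in observations)

# Brute-force 1-D least squares over the empirical coefficient.
best_alpha = min((a / 100.0 for a in range(500, 1800)), key=misfit)
```

This back-and-forth between expected values and actual measurements is exactly the fine-tuning loop the article describes, just compressed to one parameter.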

Rainfall rates for the typhoon which struck the Philippines in 2014.

IMPROVING DATA ACCURACY

Climate-model discrepancies led to projects using satellites
to measure clouds in 2007, the results of which are starting to
be taken into account. As with other sub-mesh phenomena,
each research team is fine-tuning its models by moving back
and forth between actual results and expected values.
Another challenge for climatologists is to fine-tune their calculations, i.e. reduce mesh size. This presupposes simulation
programs that can use supercomputer power to maximum
advantage by distributing calculations over many parallel
processors. For this reason, teams of computer scientists
(20-30 people at the Institut Pierre-Simon-Laplace) work
alongside physicists and now play a key role in transforming
climate models into effective simulation codes. ❚❚
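The first step in distributing such a calculation is partitioning the mesh across processors. A minimal sketch of that step alone (no message passing; the cell and processor counts are illustrative):

```python
def partition(n_cells, n_procs):
    """Split n_cells into n_procs contiguous, near-equal chunks, the basic
    move behind distributing a model grid over parallel processors."""
    base, extra = divmod(n_cells, n_procs)
    chunks, start = [], 0
    for rank in range(n_procs):
        size = base + (1 if rank < extra else 0)  # spread the remainder
        chunks.append(range(start, start + size))
        start += size
    return chunks

# Halving the horizontal mesh size quadruples the cell count, so balanced
# distribution over many processors matters more and more.
chunks = partition(n_cells=64800, n_procs=512)
sizes = [len(c) for c in chunks]   # every processor gets 126 or 127 cells
```

In production codes the chunks also exchange boundary values with their neighbors at every time step, which is where the computer scientists’ optimization work comes in.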



“WEATHER FORECASTING RELIES COMPLETELY ON SIMULATION”
The quality of weather forecasts depends on how much computing power we have,
as Météo France’s deputy director general, Olivier Gupta, explains.

What was Météo France’s role at the COP 21?
Let’s put our role back into its proper place. The COP 21
was primarily a series of political and strategic negotiations
whereas we were involved in its preparatory stages. Météo
France’s research teams contributed to the Intergovernmental
Panel on Climate Change (IPCC) report, which served as a
basis for political negotiations. We provided simulations for
the fifth IPCC report that were carried out using the ARPEGE-Climate model – the same model used for weather forecasting, but configured for climate simulation. Weather forecasting and
climate prediction are two aspects of the same science. We
also initiated various COP 21 side events. In particular, we
were behind the C3 Challenge hackathon that ran throughout
2015. There was also a scientific conference at UNESCO in
July and the Climate Train in collaboration with SNCF, which
was the brainchild of one of our engineers. In addition, we
were planning various events for school children but they
had to be cancelled for security reasons.
What role does simulation play in today’s weather forecasting?
A central and increasingly important role. Weather forecasting and climate prediction are both based on simulating
the atmosphere. People often think that weather forecasting
is based on statistics. They imagine us making comparisons
with the past and with various situations to deduce tomorrow’s
weather by analogy. This idea, while widespread, is wrong.
Weather forecasting has relied completely on simulation for
decades. The quality of forecasts therefore depends on how
much computing power we have. This power has a direct

“A significant amount of computation time is devoted to initializing models rather than to actual weather forecasting. We do this using data assimilation algorithms.”

impact on mesh refinement for a territory and on how sophisticated atmospheric calculations can be. The other essential
parameter is algorithms, which vary in their effectiveness at
solving equations and incorporating observations to initialize
models. A significant amount of computation time is devoted
to initializing models rather than to actual weather forecasting. New data has to be integrated intelligently since it does
not cover the whole grid. We mix old and new data and then
weight the two. The skill lies in choosing the right weighting
for the initial state to match reality as closely as possible.
Otherwise, the model will rapidly diverge from reality. Part of
our research involves fine-tuning these algorithms.
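In its simplest scalar form, the weighting Gupta describes is the analysis step familiar from optimal interpolation and Kalman filtering. A sketch, with invented numbers:

```python
def assimilate(background, observation, var_b, var_o):
    """Blend a model background value with a new observation, each
    weighted by its error variance: the scalar analysis step used when
    initializing a forecast model."""
    gain = var_b / (var_b + var_o)   # trust the obs more when var_o is small
    analysis = background + gain * (observation - background)
    var_a = (1 - gain) * var_b       # the analysis is more certain
    return analysis, var_a

# Previous forecast: 12.0 C with variance 4.0; a station reports 14.0 C
# with variance 1.0. The analysis lands much closer to the observation.
analysis, var_a = assimilate(12.0, 14.0, var_b=4.0, var_o=1.0)
```

Choosing the right weighting – here, the error variances – is precisely the skill Gupta points to: get it wrong and the model rapidly diverges from reality.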
Who works on these models?
The Météo France research center in Toulouse, which employs 300 people, or 10% of our workforce. For models, we
collaborate extensively with international centers, especially
the European Center for Medium-Range Weather Forecasts
(ECMWF), an inter-governmental organization. We’re co-developing an atmosphere model on a global scale; ARPEGE and the European IFS model are two versions of it.
As a result of this collaboration, we exert a major leverage
effect and obtain excellent results even with a modest-sized
team. We’re the leading organization in the ALADIN consortium – which brings together twelve European countries,
three North African countries and Turkey – on limited-area
models. This consortium is developing fine-mesh models
for 24-hour forecasts.
Do you tackle more specific fields?
Absolutely. Our team in Grenoble is responsible for snowpack simulation, to predict the risk of avalanches – which is another of our roles. The team focuses on the atmosphere, as well as the mechanics of snow-layer accumulation, how snow layers are configured by wind, the probability of creep, etc. They’ve developed a specific model for this research. We’re also committed to cutting-edge research on urban climate modeling. One of our research units is modeling town-specific features, which helps local decision-makers assess the need for green roofs, whitewashed roads, and dampened ground and road surfaces. Our model can simulate the effect of these actions on the temperature, especially during heat waves. This is an increasingly important territorial issue in the fight against global warming. Some of our researchers are working on air quality. Others are simulating land surfaces since transfers, humidity, vegetation, and soil sealing all need to be modeled. We’re also simulating the upper ocean (waves and swell), running back-trajectories to calculate flotsam and jetsam and pollutant (hydrocarbon) drift. These are all joint research units attached to both Météo France and the French National Center for Scientific Research (CNRS), which also ensures we have close ties with academia.

Luc Pérénom

What technical resources do you use?
The supercomputer we’ve been renting from Bull since 2014 is in Toulouse; its peak power is one petaflop. One of our tasks is to purchase computing power suited to our requirements, the models we use and their frequency, at the best possible price. This power allows us to produce immediate predictive models covering a few hours and focused on France. We also provide overall fine-mesh forecasting. We run the model several times, slightly changing its initial conditions to study various scenarios. We can now do this for the fine-mesh model rather than just the global model. Our contract runs until early 2019 and the supercomputer will be upgraded mid-term to approximately 5 petaflops. This computer is split into two units: the first is for the operational system whereas the other is for climate simulation research and for developing the weather forecasting operational system of the future. Models must be tested before they are rolled out. Since it’s essential that we provide a continuous service, the research half of the computer can switch automatically to production mode to ensure an operational service. Furthermore, the two halves of this configuration are physically separated (by several kilometers) to guarantee service in the event of an accident. This supercomputer is combined with a regularly upgraded storage system. We reached 40 petabytes of storage capacity this year, and will achieve a hundred or so petabytes by the end of 2017.

“We’re the leading organization in the ALADIN consortium – which brings together 16 countries – on limited-area models. This consortium is developing fine-mesh models for 24-hour forecasts.”
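Running the same model several times from slightly perturbed initial conditions – the ensemble approach described in this interview – can be illustrated with a toy chaotic model (a logistic map standing in for the atmosphere, not ARPEGE):

```python
import random

def toy_model(x, steps=30):
    """Stand-in 'atmosphere': the chaotic logistic map, where tiny initial
    differences grow into completely different outcomes."""
    for _ in range(steps):
        x = 3.9 * x * (1 - x)
    return x

random.seed(42)                       # reproducible perturbations
x0 = 0.51                             # the analyzed initial state
ensemble = [toy_model(x0 + random.gauss(0.0, 1e-4)) for _ in range(50)]

mean = sum(ensemble) / len(ensemble)
spread = max(ensemble) - min(ensemble)
# A wide spread flags a low-confidence forecast; a tight one, high confidence.
```

The spread of the ensemble is what lets forecasters attach a confidence level to each scenario rather than issuing a single deterministic prediction.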

Why do you rent instead of buying?
Since the equipment is replaced every 5-6 years, it doesn’t
make much difference. It’s a formality more than anything.
We switched to renting about ten years ago (as did equivalent
organizations) and only purchase computing power. The
supercomputer is on our premises and we provide its power supply and cooling. But we have it on a rental agreement.
What are the main areas of simulation development
at Météo France?
We have four areas of development for simulating atmospheric phenomena. First, we want to take more and more
observations into account. A lot of data now comes from
satellites, with new instruments. Our first, and very complex,
task is to invent new algorithms. We must then initialize
the model intelligently, reconciling observations with results
from the previous model. Data assimilation algorithms are
a big project and we must choose the most promising technique. Our third area of development is physics (equations
for models). Considering the full-scale experiments that are
now possible, we must improve our modeling. One area of
research involves probing what happens inside clouds. We’re
using instrumented planes to achieve better understanding
of phenomena such as Cévenol episodes, which is helping
us improve our equations. Our fourth area of development is
connected with resolution. The global model’s resolution is
7.5 km whereas the resolution of the fine-mesh model over
an extended metropolis (Western Europe and North Africa)
is 1.3 km. Once you go beyond a certain resolution, new
problems appear. For example, if slopes become too steep, the computational grid and numerical schemes must be modified. In addition, our code must be better optimized to hardware architecture. One of our main current research projects is on model scalability and adapting models to massively parallel architecture. The aim is a performance gain with no degradation of results. For this
research, we’re working with the ECMWF and the European
Center for Research and Advanced Training in Scientific
Computation (CERFACS), which is located on our Toulouse
site and in which we are shareholders. ❚❚


The Aircity project –
developed by Aria
Technologies – takes the
influence of buildings into
account when assessing
how city air pollution
is dispersed by wind.


Climate change and the deployment of new energy sources have given simulation a real boost. Offshore and even nuclear entrepreneurs and engineers have seized on environmental modelling. Some of the start-ups
and SMEs launched by these experts have made a real
name for themselves on this new market. Here are eleven
of these outstanding companies, which are highly rated by
the sector’s experts.


Aria Technologies was set up
in 1990. The company produces a numerical simulation
software suite and carries out
studies on air pollutant dispersion, climate analysis, and
renewable energy sources.
Its tools are intended for industrialists – in cities such as Rome, Rio de Janeiro, and Beijing – and for researchers. Some 65% of the studies carried out by Aria Technologies are exports. The
company’s climate change business is currently expanding
rapidly. This work covers storm frequency calculations, the
impact of rising temperatures on urban pollution, and city
dwellers’ energy consumption in fifty years’ time.

Aria Technologies
Specialization Atmospheric
Date set up 1990
Locations Boulogne-Billancourt (Hauts-de-Seine), Rio de Janeiro (Brazil), Milan (Italy)
Workforce 45 people
T/O (2014) 4 million Euros


SMEs, NATURAL ELEMENT EXPERTS
Several outstanding French companies are at the cutting edge of environmental simulation, whether of air and land phenomena, pollution, or renewable energy sources.

Key clients Total, CEA, Renault, Suez


Numtech is located on La Pardieu Technology Park (in Aubière, near Clermont-Ferrand) and specializes in weather forecasting and simulating pollutant dispersion into the atmosphere. The company’s operational systems feed data 24/7 into air monitoring stations run by associations such as AirPARIF and AirPACA. Numtech opened its first subsidiary in Morocco a year ago. Pierre Béal – the company’s founder – is also looking into providing pollution forecasts on a very fine scale. This service, called NOA, is at the preparation stage and could give two-hourly city pollution forecasts, accurate to within a few meters. Ideal for planning your run!
Specialization Atmospheric event modelling and simulation
Date set up April 2000
Location Aubière (Puy-de-Dôme)
Workforce 20 people
T/O (2014) 1.3 million Euros

Key clients EDF, Suez Environnement, Total, Areva, Cofiroute



ENERGY SOURCES

Meteodyn was set up by Didier Delaunay – a researcher at the CSTB (French Scientific and Technical Center for Buildings) – and has positioned itself on the market for wind and climatology numerical simulation using its own solvers. The company produces and markets a range of calculation software and carries out studies for wind-farm operators in Europe, China, and India. Meteodyn now employs around fifteen people abroad. It also carries out calculations for the construction business, simulating building ventilation and helping architects construct passive buildings. The company has diversified into solar energy and has carried out several solar-farm feasibility studies, especially in China.
Specialization Simulation of renewable energy sources
Date set up 2003
Location Nantes
Workforce 50 people
T/O (2015) 3.2 million Euros

Key clients EDF, Iberdrola, Enel, General Electric, Joyful Melody Wind Energy Company (China)

Innosea – a spin-off from the École Centrale de Nantes – is an engineering company specializing in renewable marine energy (RME). Its business covers all aspects of RME use. As such, Innosea has carried out calculations for EDF’s offshore wind-farm foundations in Courseulles-sur-Mer (Calvados) and Saint-Nazaire (Loire-Atlantique). Innosea’s engineers are working on the next generation of floating wind turbines, and on tidal energy and wave energy converters.
Specialization Simulation of renewable marine energy sources
Date set up 2012
Location Nantes
Workforce 21 people
T/O (2014) 732,000 Euros

Key clients EDF, EnBW

Geomod’s hydrology business involves assessing the impact of precipitation on city wastewater systems. Suez, Veolia, Lyonnaise des Eaux, and the Greater Paris Sanitation Authority (SIAAP) all use Geomod’s calculation codes. But this is not the only string to Philippe Bischoff’s bow. He has also positioned Geomod on more “land-based” calculations – for example, noise maps for urban agglomerations of more than 250,000 inhabitants, and electromagnetic wave propagation maps. Geomod is collaborating with the CSTB to develop its Mithra software suite.
Specialization Hydrology and electromagnetic wave pollution
Date set up October 1995
Locations Lyon (Rhône), Brest
Workforce 12 people
T/O (2014) 1.4 million Euros

Key clients Suez, Veolia, Lyonnaise des Eaux, SIAAP, Bouygues Travaux Publics


If ocean currents and valley
winds can be modelled, why
not human behavior too? This
principle has inspired Datapole, a start-up established
by Frédéric Gagnaire in 2010.
The company’s initial idea
was to provide local authorities with a tool for forecasting waste volumes generated
by their citizens, thus enabling them to optimize resources
allocated to waste collection. Datapole’s predictive models
are based on collection data, weather forecasts, and shop
retail data. Datapole has since developed its initial offer to
produce an electricity consumption tool (PrediWatt), and will
shortly be adding a building management tool.
Specialization Forward-looking resource management software
Date set up May 2010
Location Paris – La Plaine Saint-Denis (Seine-Saint-Denis)
Workforce 11 people
T/O (2015) 410,000 Euros

Key clients Engie, Plaine Commune agglomeration community, Chevreuse Valley public waste authority


Nextflow Software is a spin-off from HydrOcean, a specialist hydrodynamics firm bought out by Bureau Veritas in September 2015. Nextflow has taken over HydrOcean’s software vendor business. A dozen people have joined this start-up, including Erwan Jacquin, HydrOcean’s founder. Nextflow’s current business should ensure a 1-2 million-Euro turnover, and the company will continue working with the École Centrale de Nantes to co-develop simulation models. Nextflow has particular expertise in simulating wave effects, which is invaluable for designing ship hulls and offshore platforms, and for building offshore wind farms.
Specialization Fluid mechanics simulation software
Date set up June 2015
Location Nantes
Workforce 10 people
T/O Not applicable

Key clients Renault, Michelin, Airbus Helicopters



OptiFluides is located on La Doua science campus in Villeurbanne (Rhône) and has developed process simulation and modelling expertise. To achieve this, the start-up’s founder Nicolas Boisson used simulation codes produced by the software vendor Fluent. OptiFluides’ calculations simulate flows in industrial plants, thus enabling petrochemical companies to optimize their processes. OptiFluides also models air pollutant dispersion as part of risk assessments for industrial sites.

Specialization Simulation of liquid, gas, and multiphase flows
Date set up June 2011
Location Villeurbanne (Rhône)
Workforce 5 people
T/O (2015) 400,000 Euros

Key clients Total, EDF, Bluestar Silicones, Solvay


AmpliSIM assesses the impact of pollutants on air quality in urban and industrial environments.

Jérôme Cuny and Renaud
Laborbe set up Open Ocean
in 2011 to put their ocean
simulation knowledge online.
Although simulation codes
are very complex and generate large volumes of data,
Open Ocean gives access to
calculation results via a simple web interface. The company’s
Metocean Analytics platform is the fruit of three years’ work.
This platform is for industrialists developing projects in the
tidal energy, offshore wind turbine, and offshore oil sectors.
Open Ocean’s clients include Engie and Sabella. Open Ocean
raised 1.6 million Euros in 2015 and intends to increase
its workforce from 11 to 16 people by the end of this year.
specialization Online solutions
for ocean/weather analysis
Date set up 2011
Locations Paris and Brest
Workforce 11 people
T/O (2014) 96,500 Euros

“AS A SERVICE”

AmpliSIM was set up in Paris
in 2015 by Sylvie Perdriel
– a former Aria Technologies’ employee – and Olivier
Oldrini, the founder of the
consultancy firm Mokili. AmT/O Not applicable
pliSIM offers an air-quality
simulation service for design
and engineering offices. This cloud-based service assesses
the impact of industrial-plant and road-traffic pollutant emissions on surrounding neighborhoods. AmpliSIM’s founders
use open-source simulation codes recommended by the US
Environmental Protection Agency, and free software (OpenFOAM and EDF R&D department’s Code_Saturne software).
AmpliSIM is currently negotiating with Qarnot Computing
and Bull to increase its computing power.
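Screening-level dispersion estimates of the kind such services automate are classically built on the Gaussian plume model. The sketch below is an illustrative stand-in only – it is not AmpliSIM’s code, and real EPA-recommended models use stability-class dispersion tables rather than the crude linear coefficients assumed here:

```python
import math

def plume_concentration(q, u, x, y, z, h, a=0.08, b=0.06):
    """Ground-reflected Gaussian plume concentration (g/m^3).

    q : emission rate (g/s); u : wind speed (m/s) along x;
    (x, y, z) : receptor position relative to the stack (m);
    h : effective stack height (m).
    Dispersion widths grow linearly with distance (sigma_y = a*x,
    sigma_z = b*x) -- a crude stand-in for stability-class tables.
    """
    if x <= 0:
        return 0.0  # no upwind transport in this steady-state model
    sy, sz = a * x, b * x
    lateral = math.exp(-y**2 / (2 * sy**2))
    # image-source term models reflection of the plume off the ground
    vertical = (math.exp(-(z - h)**2 / (2 * sz**2)) +
                math.exp(-(z + h)**2 / (2 * sz**2)))
    return q / (2 * math.pi * u * sy * sz) * lateral * vertical

# Ground-level concentration on the plume axis, 500 m from a 20 m stack
c = plume_concentration(q=10.0, u=3.0, x=500.0, y=0.0, z=0.0, h=20.0)
```

Production codes add meteorology, terrain, and building effects on top of this same core formula; the point here is only the shape of the calculation.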
specialization Air-quality simulation
Date set up March 2015
Location Paris
Workforce 2 people
T/O Not applicable

Key Clients Engie, Sabella

Key Partners Bull, EDF

FORECASTS AND RENEWABLE ENERGY

EDF SF is a spin-off from EDF
launched by three researchers
at EDF’s R&D department.
They wanted to set up a company to market the fruit of
their research. Their Pegase
software (an EMS – energy
management system) uses
energy consumption and meteorological data for precise
management of battery charging/discharging processes
coupled to wind farms and photovoltaic panels. This solution is suitable for islands since batteries compensate for
intermittent energy resources. Following a pilot project in
La Réunion, EDF Énergies Nouvelles’s photovoltaic power
station in French Guiana was the first facility to benefit from
Pegase’s calculations.
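At its core, an EMS of this kind runs a scheduling loop: charge the battery when forecast production exceeds the power committed to the grid, discharge when it falls short, within capacity and power limits. The following is a deliberately naive greedy sketch – Pegase’s actual optimization is proprietary, and every number below is illustrative:

```python
def dispatch(production, commitment, capacity, p_max, soc=0.5):
    """Greedy battery schedule over hourly steps.

    production : forecast PV/wind output per step (kW)
    commitment : power promised to the grid per step (kW)
    capacity   : battery energy capacity (kWh, 1-hour steps assumed)
    p_max      : battery charge/discharge power limit (kW)
    Returns (delivered power per step, state-of-charge history).
    """
    energy = soc * capacity
    delivered, socs = [], []
    for prod, target in zip(production, commitment):
        surplus = prod - target
        if surplus >= 0:
            # charge with the surplus, bounded by power limit and headroom
            charge = min(surplus, p_max, capacity - energy)
            energy += charge
            out = prod - charge
        else:
            # discharge to cover the shortfall, bounded by stored energy
            discharge = min(-surplus, p_max, energy)
            energy -= discharge
            out = prod + discharge
        delivered.append(out)
        socs.append(energy / capacity)
    return delivered, socs

prod = [120, 80, 30, 0]    # e.g. PV output fading in the evening
commit = [90, 90, 60, 20]  # power promised to the island grid
out, socs = dispatch(prod, commit, capacity=100, p_max=50)
```

A real EMS would optimize over the whole forecast horizon and hedge against forecast error instead of deciding one step at a time.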
specialization Energy
management software
Date set up March 2014
Location Incubateur Agoranov
Workforce 10 people
T/O (2014) 474,000 Euros


Open Ocean’s decision-making tool
can model currents, wind, and waves.


Key Clients EDF Énergies Nouvelles, Électricité de Mayotte,





A REAL STAR

Agriculture needs to produce bigger, higher quality
yields while also reducing its pollution. CybeleTech
has taken up this challenge by incorporating
farmland data into plant-growth simulation.
by Jean-François Prevéraud


Models Based on Real-Time Measurements

Saguez met Philippe de Reffye – the agronomist in this story
(who is not one of CybeleTech’s cofounders) – in the early
2000s. Together they developed a model to simulate plant
growth and architecture, for use in agronomy and imaging.
In 2003, they set up the Digiplante joint research unit at
the École Centrale de Paris. Under Cournède’s leadership,
this research team developed a plant-growth model used
in an online tourist application produced by Île-de-France
regional council: 3D simulation of the gardens at the Palace
of Versailles.
This application appealed to Lambert, who was looking
for a plant-growth model to synthesize and exploit his own
data. “I thought, if these guys can produce this, I need to
meet them,” he recounts. Several projects were launched to
validate the feasibility of this approach and test its relevance.
Following successful results, Saguez, Cournède and Lambert
set up CybeleTech at the end of 2011. The company’s first
orders came from a satellite operator that wanted to develop

Modelling crop growth optimizes the use of fertilizers, water and
treatments to improve yield.

its image-based service offer, and a seed company that had
started using modelling for genotype selection.
CybeleTech’s expertise enabled Digiplante’s mechanistic
plant-growth models to be combined with satellite images
showing plant biomass potential at any point on a given piece
of land. According to the weather conditions forecast, farmers
can now calculate the correct fertilizer and phytosanitary
treatment needed to achieve set added-value targets while
also causing as little pollution as possible.

Towards Plant PLM

This solution means farmers can base their decisions on
real-time crop measurements rather than on extrapolations
from previous years’ experience. It is now possible to accurately predict disease onset according to a plant’s actual stage
of development. Consequently, farmers can give plants more
protection while also intervening less often. In the short term,
this approach will meet the challenges facing agriculture: to


What do you get if you take an agronomist, a numerical simulation researcher, a student engineer mad
about IT developments, an agri-food marketing
expert and bring them together under a name evoking the
Phrygian goddess of nature and abundance? CybeleTech, of
course – a French start-up established at the end of 2011
to model plant growth. “As is often the case, CybeleTech’s
story is first and foremost about human encounters rather
than technology,” explains Marie-Joseph Lambert, Novartis
Agro’s former director of business forecasting. Together
with the mathematician Christian Saguez, his PhD student Paul-Henry Cournède, and two other members of the
Digiplante team (Professor Véronique Letort – Cournède’s
deputy – and Benoit Bayol), Lambert is one of CybeleTech’s
five cofounders.


MARIE-JOSEPH LAMBERT,
Director – Cofounder

Trained in sales
and marketing
before pursuing
his career
in various
industrial groups. He became
Novartis Agro’s director of
commercial policy and
distribution marketing in 1997,
and its director of business
forecasting in 2000. After the
company merged with Zeneca,
he was key account manager
Europe from 2001 to 2005,
before joining CybeleTech as
its director.
CHRISTIAN SAGUEZ,
CEO – Cofounder

Graduated from
the École
Centrale de
Paris in 1972,
and was
director of international and
industrial relations at the French
National Institute for Research
in Computer Science and
Automation (INRIA). He
subsequently founded and was
CEO of Simulog, and was also

increase production 1.5-fold while also reducing inputs by
30-40% and improving the sanitary quality of products.
“The next stage of this technology will enable farmers to
select plant varieties for sowing and draw up a complete
protocol for growing them on a given piece of land, varying
this protocol according to the qualities expected in the end
L’usine nouveLLe I N° 3464 SUPPLEMENT I APRIL 14 2016

director of industrial relations
and subsidiaries at the French
National Center for Space
Studies (CNES). At the same
time, he was also a professor
at the École Centrale de Paris,
where he set up the
Mathematics Applied to
Systems (MAS) Laboratory. He
was president of the TER@TEC
Association (European
Competence Center for HighPerformance Numerical
Simulation) until 2009. He is a
member of the French Academy
of Technology’s Communication
and IT Technology Commission,
as well as president of the Scilab
PAUL-HENRY COURNÈDE,

scientific advisor – Cofounder

Graduated from the École
Centrale de
Paris (1997)
and Cambridge
University, and has a PhD in
applied mathematics (2001).
He is director of the
CentraleSupélec’s Digiplante
team, working on mathematical
modeling of plant growth in the
MAS laboratory.

product. It will be a sort of ‘plant PLM’ [ed. note: product
lifecycle management], connecting every step of the process
from start to finish,” explains Lambert. CybeleTech currently
carries out a dozen ad hoc studies a year for industrialists, but
its short-term plan is to provide online services for farmers.
This will provide farmers – via their cooperatives – with
the seed and inputs they need to maximize yield on each
piece of land. CybeleTech is currently working on arable
crops (cereals, oilseeds), market-garden produce (tomatoes,
strawberries), and perennial crops (vines). The company
plans to expand into forestry (developing the value of forest
resources, adapting to climate change), and tropical products
(coffee, cocoa beans, cotton, bananas). For Lambert, this
means CybeleTech could generate turnover of 5-10 million
Euros over the next five years. ❚❚



Jean-Hilaire Renaudat farms 1,200 hectares north of
Châteauroux (Indre) and belongs to a new generation
of farmers at the cutting edge of digital technology. In
a few days’ time, he will be applying nitrogen fertilizer to
his soft wheat, barley, and rapeseed crops. What’s special
about this is that the amount of fertilizer Renaudat applies is
entirely adjusted to data collected by sensors on his field crop
sprayer. The N-Sensor software developed by the Norwegian
group Yara measures crops’ light reflectance to calculate their
biomass and nitrogen requirements. The German agricultural
giant Bayer uses the same method to adjust fungicide levels,
as does a combine harvester-mounted software program
that calculates yields in real time as crops are harvested.
Renaudat sees many advantages in using these new tools:
“We made a return on our investment within four years.
We’ve cut down on nitrogen fertilizer, have less crop disease,
and produce higher, more protein-rich yields,” he says, clearly
delighted. Renaudat also uses his smartphone to calculate
the nitrogen requirements of his rapeseed.
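Reflectance-based dosing generally works in two steps: compute a vegetation index from two spectral bands, then map it to a fertilizer rate. The sketch below uses the standard NDVI formula; the dose mapping, its direction, and all thresholds are illustrative assumptions, not Yara’s proprietary N-Sensor algorithm:

```python
def ndvi(nir, red):
    """Normalized difference vegetation index from reflectances in [0, 1]."""
    return (nir - red) / (nir + red)

def nitrogen_dose(nir, red, dose_min=40.0, dose_max=120.0,
                  ndvi_lo=0.3, ndvi_hi=0.8):
    """Map a vegetation index to a nitrogen top-up (kg N/ha).

    Illustrative rule only: sparse canopy (low NDVI) gets the maximum
    dose, dense canopy the minimum, linear in between. Thresholds,
    rates, and even the direction of the mapping are crop- and
    strategy-dependent in practice.
    """
    v = ndvi(nir, red)
    if v <= ndvi_lo:
        return dose_max
    if v >= ndvi_hi:
        return dose_min
    frac = (v - ndvi_lo) / (ndvi_hi - ndvi_lo)
    return dose_max - frac * (dose_max - dose_min)
```

Applied per square meter across a field, a rule of this shape turns a reflectance map into a variable-rate application map for the sprayer.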
As in industry, digital tools mean farmers can now keep
track of running their business. Above all, these tools help
farmers anticipate, increase yields, and predict the quantity
and quality of harvests even before production is over. This
radical shake-up has been dubbed the “third agricultural
revolution” in some quarters, following the first revolution
that turned agriculture into a commercial business and the
second that mechanized it and introduced chemical fertilizers to increase yields. The past few years have witnessed
an abundance of decision-making and simulation solutions:
adjusted irrigation, crop-disease forecasting, yield simulation,
milking robots on dairy farms, calving prediction, automated
cowshed cleaning, etc. “One of the main challenges is to
improve eco-friendly crop management by limiting the use
of agricultural inputs,” says Jacques Mathieu, director of
Arvalis Plant Institute.


Challenges in agricultural modeling – simulating
wheat growth, optimizing inputs, predicting milk
production, etc. – have spurred on start-ups and
yielded a bumper crop of innovations.



Requirements Calculated to Within a Square Meter

Arvalis and Airbus Defence & Space developed Farmstar in
2003; it was one of the first systems to make crop-management recommendations based on satellite photos. Although
Farmstar has always been very popular with farmers, other
systems have gradually caught up. Drones are now used
for precision farming, and France seems far ahead of other
European countries on this technology. Airinov is a start-up
that was established in a barn in Poitou-Charentes by two
engineers and a farmer’s son back in 2010. Within a few
years, the company has become the French number one in
agridrone-based recommendations for nitrogen fertilization.
Airinov has developed quad-band sensors that can, for
example, work out rapeseed weight and calculate chlorophyll
levels in wheat using green and infra-red light wavelengths
emitted by the plants. Airinov then calculates how much
nitrogen is required for each square meter of a land parcel.
“Our technology is the most advanced in Europe. After
five years of trial and error, we found our business model
thanks to a partnership with the French drone manufacturer
Parrot, which invested capital with us,” explains Romain
Faroux, Airinov’s cofounder. Airinov’s technology offers


Airinov drones calculate the nitrogen requirements of wheat and rapeseed.


“Our start-ups must move into the industrial phase”
MARINE POUYAT,
Director of the Renaissance
Numérique think tank and head of
the legal department at the French
Federation of E-commerce and
Distance Selling (FEVAD).
Why did you send the government
an official report on agriculture and
digital technology last November?
Agriculture is facing many
challenges. It must feed the
planet, be eco-friendly, make its
profession more attractive, as
well as reconnect with
consumers and win back their
trust. We’ve made sixteen
proposals to rethink agricultural

production, distribution and
consumption in the digital era.
What are these proposals?
We want farmers to receive
training and support to
purchase precision digital tools,
especially tools reconciling yield
targets with sustainable
development. We also suggest
setting up regional open data
experimental programs in some
branches of agriculture. This
will bring production costs and
retail prices back into balance.
What’s more, we believe that
cooperatives – which have
access to agricultural and

market data – should be at the
forefront of agricultural big data
strategies to defend farmers’
interests.
Has France adequately got the
measure of this new agricultural
revolution?
France has a large pool of
start-ups, which must now
move their solutions into a
more industrial phase. French
public and private stakeholders
should seize on this official
report now or else it will be too
late. We’re facing international
competition from very powerful
groups. ❚❚



Arvalis models the resistance of plants to water shortage using sensor data and cameras (RGB images of photosynthesis).

many benefits. “Although satellite images are expensive,
they’re worthwhile if you can recover the cost on large land
parcels or areas. Drones can be used for more occasional
missions, even though they’re still expensive to access [Ed.
note: 25,000 Euros for Airinov’s solution],” says Véronique
Bellon-Maurel, director of the ecotechnology department
at the French National Research Institute of Science and
Technology for the Environment and Agriculture (IRSTEA).
Since 2014, several chambers of agriculture in the Grand Ouest region of France have been offering decision-making


“Our challenge is to promote
this data to third parties.
Some of them, like Thales,
could use it to simulate plant growth.”
François Houllier, president of the French National Institute for Agricultural Research (INRA)


services based on drone images. “Drones fly over areas
very quickly and precisely. Weather conditions sometimes
make it harder to use satellites, although drones can’t fly in
strong wind,” says Nassim Hamiti, an agricultural equipment
project leader at the Permanent Assembly of Chambers of
Agriculture (APCA). Another constraint is that, for the time
being, drone-based services are limited to nitrogen-input
recommendations for rapeseed and soft wheat, although
Airinov is extending this to other crops.
Many other French start-ups like Airinov have appeared in
the field of agricultural digital technology. After tough early
years, some of them are consolidating and have moved into
a more industrial phase. One example is Naïo, a Toulouse-based start-up established in 2010. Naïo developed the Oz
robot for mechanical weeding of market-garden crops on land
parcels covering less than ten hectares. “Oz features eleven
infra-red sensors on its front and sides, together with four
electric-motor driven wheels. These wheels are powered by
lead batteries giving Oz four hours of autonomy,” explains
Gaëtan Séverac, Naïo’s cofounder. Although Naïo has only
sold about thirty of its Oz robots – which sell for 21,000 Euros


A Standardized Computer Language
To benefit from digital
technology, we need a single
computer language to speed
things up. The Agro EDI Europe
association – set up twenty years
ago by agricultural organizations
to improve traceability and data
exchange between phytosanitary
suppliers and users – is
standardizing remote-sensing
data. “This new language should
be operational by the end of the
first half of 2016 and will apply in
particular to all drone operators.
Cooperatives and chambers of
agriculture will be able to change
supplier more easily,” says
Bruno Prépin, CEO of Agro
EDI Europe. The association is
also working with the Union
Française des Semenciers
(UFS, a French trade association
for seed companies and plant
breeders) on seed production,
and in the poultry sector
to standardize computer data
among operators. “This will
benefit French industrialists,
making them more productive
and distinctive,” says Prépin,
who hopes the French system
will be rolled out in Europe. ❚❚

each – the company has just developed Oz’s big brother: a
high-clearance robot called Anatis. This robot was designed
in collaboration with the French manufacturer Carré and is
for weeding land parcels bigger than ten hectares. But that’s
not all it does since Anatis features 3D cameras and many
sensors to collect data on humidity, soil temperature, weed
infestation, density, and plant development stage.
Digital technology is also radically changing agricultural
research, leading technology institutes to adapt their research
programs. Two of Arvalis Plant Institute’s experimental
farms are currently specialist “digital farms”. The first – in
Boigneville (Essonne) – specializes in arable crops, while the
second – in Saint-Hilaire-en-Woëvre (Meuse) – specializes
in mixed farming and cattle rearing (beef production). By
the end of 2016, both farms will have implemented digital
production management, combining existing techniques with
test tools and prototypes from external companies.

A Phenotyping Experimental Station

Fundamental research is also adopting digital tools to
improve its efficiency. Since 2013, the French National
Institute for Agricultural Research (INRA) has been working


“one of the main challenges is
to improve eco-friendly crop
management by limiting the use
of agricultural inputs.”
Jacques mathieu, director of arvalis plant institute


with Arvalis on a national high-throughput cereal phenotyping program called Phénome. One of the main platforms
for this 24-million Euro project jointly funded by the French
“Investment for the Future” initiative is in the commune of
Ouzouer-le-Marché, deep in Le Loir-et-Cher. This site has
eight large, mobile greenhouses in the middle of a field.
Sensors measure rainfall, soil humidity, wind, etc. At the
far end of the land parcel is an imposing boom on a rail-mounted gantry. This boom – fitted with many sensors and
cameras – takes photos to analyze plants’ phenotype (i.e. their
physical characteristics) and chlorophyll content. “The goal
is to create controlled water stress in plants and stop rain
falling on the plot,” explains Yann Flodrops, director of the
experimental station. This pioneering European program will
help scientists select varieties consuming fewer inputs (water,
nitrogen, and pesticides) and identify phenotype genes.

Orléans – Plant Digital Valley

The advent of thousands or even millions of pieces of agricultural data from fields and research programs presupposes
high-security data storage facilities. The INRA has built two
data centers – one in Toulouse (Haute-Garonne) in 2014 and
another in Bruyères-le-Châtel (Essonne) in October 2015
– which each cost four million Euros. “The challenge is to
promote this data to third parties such as Thales, which
use it to simulate plant growth,” says François Houllier,
president of INRA.
Although entering the digital era is a major change for
agriculture, digital projects and initiatives remain scattered.
Faced with global foreign giants at the cutting edge of agricultural digital technology – such as John Deere and Monsanto
– French projects urgently need to pool their efforts. Some
joint initiatives are starting to emerge: for example, the
three-year Smart Agriculture System project launched in
2014. This project brings together the Végépolys, Céréales
Vallée, DREAM Eau & Milieux competitiveness clusters and
private groups such as Limagrain. Its purpose is to design an
innovative wheat modeling, yield forecasting, and decision-making system.
“Seed companies will be able to market their seeds faster.
Farmers will be able to discover constraints more quickly,
predict water requirements, and increase production. This
will give a real competitive advantage,” says an enthusiastic
Christian Saguez, CEO of the start-up CybeleTech, one of
the project’s partners. Alongside this project, a plant digital
valley – Agreen Tech Valley – is to be established in Orléans
and its suburbs (Loiret) in 2017. This initiative is expected
to bring together several agricultural companies – such as
Axéréal, Kuhn and Sofiprotéol – as well as the University of
Orléans. “France has all the skills it needs to hold a strong
position in the international digital agriculture race. We’re
well positioned on IT technology and agriculture. It’s now up
to us to move into the industrial phase,” says Saguez, who
is also vice-president of Agreen Tech Valley. In the future,
French agriculture will be distinguished by its technology
and France’s ability to export this technology, as well as by
its production. This is the challenge facing French agriculture
if it wishes to retain its world ranking. ❚❚







Nature revealed
Forces, movement, flow, growth and turbulence:
when simulation unveils the environment.
By Jean-Louis saLque and Bernard VidaL

Tree growth and crown



Distribution of carbon dioxide emissions in Earth’s atmosphere.

Representation of ocean surface temperature in the Gulf of Mexico.



Wave dispersion
along the bow
of a boat.





Representation of the magnetic field in the Earth’s core.
Vortex lines of a tornado.

Propagation of seismic waves through the planet.




Trajectory of air particles around a sailboat.

A North Atlantic-type ocean.


The temperature of currents at a depth of 30 meters off the Cape of Good Hope.

3D visualization of fluid flow in porous rock.



Virtual-reality helmets help train staff to use machines without disrupting production.

IMMERSION FOR
The advent of mass virtual reality will benefit
companies – especially small ones – by making
immersive simulation more accessible.
BY Julien Bergounhoux



Computer-aided design (CAD) is the traditional bastion
of virtual reality. Until now, only big groups were able
to invest in Cave Automatic Virtual Environments
(CAVEs) – rooms equipped with projectors for virtual-environment work – e.g. to enter and interact in simulated cockpits.
These companies have devoted huge amounts of money to
virtual reality: Renault spent three million euros on a CAVE
in 2013. The cost of this technology has limited its take-up.
Nevertheless, CAVEs deliver huge returns for manufacturers. It
took the car manufacturer Jaguar just three weeks to make a
return on a CAVE costing £3 million. Big manufacturers such
as Dassault Aviation stopped making physical mock-ups a
long time ago and now design their products using virtual
reality, from the assembly line to maintenance.

From Fighter Planes to Electrical Appliances

The advent of HMDs (head-mounted displays) for immersive simulation is changing this model. By significantly
reducing costs – which have dropped from around a million
to approximately a thousand or even a hundred Euros
–HMDs have made virtual reality affordable for all compa-


VIRTUAL REALITY



“It won’t be easy for
CAVEs to survive”

FABIEN BARATI,
CEO and cofounder of Emissive
Have you noticed more
interest in virtual reality?
Yes. We’ve been working in
this field for the past ten
years. Over the past few
months, we’ve sensed that
demand is increasing.
It hasn’t yet rocketed,
but it’s really growing.
Will these helmets replace
existing equipment, even in
big companies?
I don’t see how CAVEs
[Ed. note: rooms with
projectors, for virtual-

environment work] will
survive since they’re very
expensive to install and
maintain. Even projector
light bulbs are almost as
expensive as investing in
virtual helmets. I think
it’s a decision that will be
made quickly, especially
since CAVEs don’t provide
any additional advantages.
Their multi-user dimension
is very limited.
What are you setting up using
these systems?
We’re working on “The
Enemy”, a documentary
pitting visitors against
enemy combatants in an
armed conflict. The
documentary combines
ten connected helmets
in the same virtual
environment and will
enable visitors to move
about in an area covering
200-300 m². ❚❚


nies, even the smallest design offices. This technology was
previously limited to planes, boats and cars, but is now used
– as at the German company Miele – for kitchen appliances.
Virtual-reality solutions are not limited to viewing images.
When connected to systems – either hand-held peripherals
or motion sensors – they enable users to interact naturally
with their virtual environment, and even move about in it
[see box on following page].
The advantage of virtual reality compared to screen-based
3D representation is that images are viewed life-size and in
spatial dimensions. When plans on CAD software are hard
for non-specialists to understand, all relevant departments
(marketing, maintenance, and even end clients) can be
involved at the very early stages of a product’s life cycle.
This collaboration can also be achieved remotely since the
relevant parties can come together in a virtual environment
from several physical sites.

Anticipating Uses

Virtual reality can also simulate use in real conditions,
especially for architecture. For example, it helps architects




decide whether people will have the sun in their eyes in a
particular location, or if a room will be cramped once it’s been
furnished. This applies to all work areas. For example, virtual
reality can integrate virtual humans to test the ergonomics
of assembly line workstations before they are built.
Neuromarketing is another example. Virtual reality could
replace conventional market studies, which use imitation
stores, display cases, and products. Companies will be able
to create their own virtual supermarkets, which are much
cheaper and can be configured however they like. These
supermarkets will enable every aspect of an experiment to
be simulated and recorded. In addition, their settings can
be changed according to where customers go in them and
what they look at.

Simplifying and Pooling Training

Apart from design, training is another area that will be
transformed by virtual reality. Here too, big companies such
as SNCF have been using virtual reality solutions for staff
training – e.g. train-checking procedures – for several years
now. Virtual environments do not disorientate people, even
non-technical staff. They also take up far less room, provide
more exhaustive training (every possible configuration can
be gone over as often as you like), and are much quicker to
set up. This technology also facilitates distance training of
both large groups and handfuls of students, even when they
are geographically scattered.
Staff can be trained to use new equipment before it is installed. Even after equipment is up and running, simulation
enables staff to train without holding up production. Lastly,
this technology is more accessible and provides a completely
safe setting in which to raise staff awareness of high-risk
environments – for example, fire-fighters or staff in specific
industrial settings (nuclear power stations, oil rigs, etc.).
There are now applications for virtual reality in medical
settings: it helps medical students view the full complexity
of the human body, learn about anatomy in detail, diagnose
virtual patients, and even practice operations. ❚❚

THREE SOLUTIONS TO KEEP AN EYE ON
OCULUS RIFT

Price 699 euros
Release Date March 2016
Manufacturer Oculus,
a Facebook subsidiary


HOLOLENS
Price 3,000 dollars
Release Date March 2016
Manufacturer Microsoft

HTC VIVE

Price 799 dollars
Release Date April 1st, 2016


Manufacturer HTC


Oculus Rift is behind the revival of virtual reality helmets.
The project was launched in 2012 and the start-up’s
growing success led to its buy-out by Facebook in
March 2014. Two years later, the definitive version of
this helmet is about to be released. Oculus Touch
– two controllers designed for completely natural
interaction in virtual environments – will also be released
for use with this solution. Rift helmets are the yardstick
by which other solutions should measure themselves.
Microsoft has taken another route with its HoloLens
helmet. Instead of preventing users from seeing the
world around them, this helmet superimposes holograms
on it to enhance users’ vision. The helmet is not just a
user interface; it is also a fully autonomous computer in
its own right. It runs on Windows 10 and is hence
compatible with many applications, enabling interaction
with virtual elements and motion sensors. Unlike Oculus
and HTC, Microsoft’s priority target is professional use.

This helmet is coproduced by HTC (a smartphone
manufacturer) and Valve (software vendor of the Steam
platform, which has revolutionized the sale of online
games). It rivals Oculus Rift helmets and features
distinctive Lighthouse base stations, which are placed
in the corner of a room so that users can move about in
a virtual environment. The HTC Vive helmet also has a
camera, enabling users to see transparent silhouettes of
the real world while in the virtual world by simply pressing
a button.


Images courtesy of EADS Innovation Works and Ford.


Build, test and experience your virtual prototype
Get your product right the first time!
Lessen the impact on our environment

w w w.esi-group.com / innovate | innovate@esi-group.com


Although most cars already have automated-driving features, the transition to full automation is not going to happen overnight.

The quest for
fully autonomous vehicles
Before autonomous vehicles take to our roads,
engineers and researchers are using simulation
to improve their performance in all situations.
by Frédéric Parisot



The 2015 ITS World Congress in Bordeaux included a
demonstration of half a dozen cars in autonomous
mode. Car manufacturers, auto parts companies, and
engineers all came to show off their wares. However,
the race towards autonomous vehicles is not over yet. There
remain many challenges – especially in simulation – since
autonomous vehicles are not ready for the average driver.
“Although automation technology exists, we need to make
it cheaper and ensure it works in every situation,” says Marc
Charlet, general manager of Mov’eo competitiveness cluster.
Most features of autonomous vehicles are now available as
advanced driver assistance systems. They are included in all
current executive models, from emergency braking and adaptive cruise control to traffic jam assistants. But according to
the SAE International standard J3016 on automated driving,
these are level 3 automation features. Although vehicles can
self-drive in some situations, drivers must be ready to take
control again at any moment. Level 4 automation, where the
vehicle can cope with any situation even if the driver does not
retake control of the steering wheel, has yet to be achieved.
Simulation engineers and researchers have a major role to
play in reaching this goal. “We’re working on three aspects to
produce decision-making autonomous vehicles: knowledge of
the surrounding environment, understanding situations, and
calculating safe trajectories conforming to various comfort and
fuel-economy criteria,” says Frédéric Mathis, director of the
Vehicles program at the VEDECOM Institute.
To find their way around, autonomous vehicles have
sensors and image-recognition algorithms that identify
road markings, traffic lights, and road signs. But that is not
enough; autonomous vehicles need to communicate with
other road users and with the road infrastructure to anticipate
as many situations as possible. Although communication
standards exist, they remain to be applied. “We’re developing
simulation platforms for inter-vehicle communication,”
confirms Bruno Grandjean, program director at Véhicule
du Futur competitiveness cluster. “The idea is to put real
vehicles into a virtual environment, which can be controlled
and hence replicated. We then test communication between
various makes of car model and simulate ramp-up, when
thousands of vehicles are communicating with one another.”
Once vehicles can find their way around, they must be
able to identify high-risk situations. “There are some real
challenges for perception systems since vehicles encounter
all sorts of objects on roads. To avoid overreacting, autonomous vehicles must distinguish between genuine obstacles
and those presenting no danger. We’re using sensor fusion
to tackle these issues,” says Vincent Abadie, head of innovation for advanced driver assistance systems at PSA Peugeot
Citroën. According to Sébastien Glaser, team leader of the
Autonomous Vehicle project at the VEDECOM Institute,
sensor fusion has become the key to success in recent years:
“Accidents are caused by a combination of various factors,
for example, when roads are wet and drivers have the sun in
their eyes. We need to develop databases of sensor signals
from all these situations and then use them in our simulation
models. This will enable us to check that at least one sensor
can detect obstacles in poor driving conditions.”
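Glaser’s database idea can be sketched in a few lines: a table of detection outcomes per driving condition, and a check that no condition is left without at least one working sensor. Everything here (sensor names, conditions, detection flags) is invented for illustration; it is not VEDECOM data.

```python
# Sketch of the sensor-signal database check described above: for each
# degraded driving condition, verify that at least one sensor still detects
# the obstacle. Sensor names, conditions and detection flags are invented
# for illustration; this is not VEDECOM data.

from typing import Dict, List

# detections[condition][sensor] is True if that sensor saw the obstacle
detections: Dict[str, Dict[str, bool]] = {
    "clear_day":        {"camera": True,  "radar": True,  "lidar": True},
    "wet_road_low_sun": {"camera": False, "radar": True,  "lidar": True},
    "heavy_fog":        {"camera": False, "radar": True,  "lidar": False},
}

def uncovered_conditions(db: Dict[str, Dict[str, bool]]) -> List[str]:
    """Return the conditions in which no sensor detected the obstacle."""
    return [cond for cond, sensors in db.items() if not any(sensors.values())]

print(uncovered_conditions(detections))  # [] -> every condition is covered
```

Any condition that shows up in the returned list is exactly the kind of blind spot the simulation models are meant to expose before a real vehicle meets it.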

Making the right decisions

Once vehicles know their surrounding environment and
have identified situations, they must be able to make the
right decisions. “To work out the rules for this, we’re developing decision tree-based expert systems,” explains Fawzi
Nashashibi, research director at the French National Institute
for Research in Computer Science and Automation (INRIA).
“We’ve integrated advanced mathematical techniques
such as fuzzy logic since cars have to ask themselves many
questions to make apparently simple decisions (whether to
accelerate, brake, overtake, etc.). For example, they need to
know whether they are complying with the Highway Code,
whether their fuel consumption is reasonable, and whether
it will be comfortable for passengers when they brake. All
this requires constant real-time optimization of criteria.”
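The constant optimization Nashashibi describes can be caricatured as weighted multi-criteria scoring of candidate maneuvers. The maneuvers, criteria and weights below are invented assumptions, not INRIA’s actual rules.

```python
# A toy version of the multi-criteria optimization described above: each
# candidate maneuver is scored against legality, fuel economy and passenger
# comfort, and the best-scoring one is chosen at each decision step. The
# maneuvers, criteria and weights are invented, not INRIA's actual rules.

CRITERIA_WEIGHTS = {"legal": 0.5, "fuel": 0.2, "comfort": 0.3}

# per-maneuver scores in [0, 1]; higher is better
MANEUVERS = {
    "brake_hard":   {"legal": 1.0, "fuel": 0.4, "comfort": 0.1},
    "brake_gently": {"legal": 1.0, "fuel": 0.7, "comfort": 0.8},
    "overtake":     {"legal": 0.6, "fuel": 0.5, "comfort": 0.6},
}

def best_maneuver(candidates, weights):
    """Pick the candidate with the highest weighted score."""
    def score(scores):
        return sum(weights[c] * scores[c] for c in weights)
    return max(candidates, key=lambda name: score(candidates[name]))

print(best_maneuver(MANEUVERS, CRITERIA_WEIGHTS))  # brake_gently
```

A real system would re-run this kind of scoring many times per second as the situation changes, which is where the heavy computing cost comes from.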
Technology for constant optimization is very IT-intensive.
“We’ve opted for staged simulation,” explains Glaser. “First,
we simulate features in an ideal environment. Then we
introduce error using another simulator. Finally, we use a

Star French companies
behind the wheel

Ever since setting up the National Institute for Transport and Safety Research (INRETS) in 1985, France has been a leading supplier of autonomous-vehicle technology. The French National Institute for Research in Computer Science and Automation (INRIA) in particular supplies many car manufacturers and auto parts companies. France has a network of recognized SMEs in this sector. French simulation expertise includes Voxelia – specializing in road traffic engineering – and CIVITEC, whose multicriteria simulation platform Pro-SiVIC is used to validate autonomous-vehicle features. France also has algorithm champions: Nexyad, a signal processing specialist supplying road- and obstacle-detection modules; Datasio, which has developed big data-based analytics tools; and Innov+, for driver vigilance monitoring devices. ❚❚

third simulator to compare the results with the trajectory
chosen by a human being in a similar situation.” For Dominique Gruyer, director of the Laboratory on Vehicle-Infrastructure-Driver
Interactions (LIVIC) at the French Institute of
Science and Technology for Transport, Development and
Networks (IFSTTAR), the main difficulty lies in the diversity
of phenomena to be represented: “Simulating each scenario
involves modeling road features, other road users, obstacles,
the surrounding environment, as well as the driver’s concentration level and direction of sight. A different simulation
engine is used for each scenario.” Car manufacturers still
have work to do if they want to bring out their first level 4
autonomous vehicles by 2018. ❚❚
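The staged approach Glaser outlines (ideal simulation, error injection, comparison with a human trajectory) can be sketched roughly as follows; the noise model, trajectories and threshold are all invented for illustration.

```python
# Rough sketch of the three simulation stages Glaser describes: an ideal
# trajectory, the same trajectory with injected sensor error, then a
# comparison against a reference human-driven trajectory. Noise model,
# trajectories and threshold are all invented for illustration.

import random

def ideal_trajectory(n: int = 10) -> list:
    """Stage 1: lateral position (m) of the planned path, ideal conditions."""
    return [0.0] * n  # perfectly centered in the lane

def inject_error(traj: list, sigma: float = 0.05, seed: int = 42) -> list:
    """Stage 2: perturb the path with Gaussian sensor noise."""
    rng = random.Random(seed)
    return [x + rng.gauss(0.0, sigma) for x in traj]

def max_deviation(traj: list, reference: list) -> float:
    """Stage 3: worst gap between the simulated and reference trajectories."""
    return max(abs(a - b) for a, b in zip(traj, reference))

human = [0.02] * 10  # a human driver hugging the lane center
noisy = inject_error(ideal_trajectory())
print(max_deviation(noisy, human) < 0.5)  # True: stays close to the human path
```

Fixing the random seed makes each noisy run reproducible, which is what lets engineers replay a problematic scenario exactly.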



Networks of
connected objects:
a real challenge
Vast emerging networks of connected objects and
sensors will force this sector to use simulation,
says Wind River’s Michel Genard.
by Sylvain Arnulf


For the time being, all is well on the Internet of Things.
Deployment of networks for sensors and connected
objects is still largely at the small-scale experimental
stage. But projects involving the Internet of Things (IoT) in
smart cities, business and industry 4.0 – which are integrating sensors into logistics, road and airport infrastructures,
etc. – are expected to pick up pace. As a result, there will be
hundreds of millions of connected objects and widespread
network interconnections, giving rise to increasingly complex
systems of systems. We will have to oversee the deployment
of connected objects themselves, as well as their software
configuration, links to gateways and hubs, and communication
with cloud-based exchanges. This is a real challenge. “The
hardest thing is not independently connecting millions or
even billions of objects, but rather creating millions of systems
that will themselves depend on millions of systems,” warns
Michel Genard, a French expat in California who is vice-president of the simulation tools & lifecycle solutions department
at Wind River, an American embedded software company (a
division of Intel). For the IoT to create value, even smarter,
and hence complex, networks will have to be devised. “We’re
talking about systems that can detect breakdowns, issue
alerts and reconfigure themselves completely autonomously
on a large scale involving thousands of objects and sensors,”
sums up Genard.


Very early simulation

Using simulation at the very early stages of projects would
be the best way to create these high-performance systems. Simulation would help design, configure, and size systems, as
well as simulate how they operate during use to ensure they
are ready as soon as they are deployed. Virtual simulation
of every component in physical systems is already possible.
Each object – with its software, memory, processor, battery,
radio chip, etc. – is reconstructed virtually. The network
environment is also recreated to take real operating conditions into account. It is thus possible to design a network’s
architecture and then carry out a battery of tests to see how
each component reacts. The more simulators anticipate, the

Simulation tools for networks of connected
objects are ready, but underutilized.
“We’re still spreading the word about
them,” says Wind River’s Michel Genard.
Nevertheless, some pioneers have started
using these tools. “We’re working with a
Japanese company that designed a smart
meter with a European manufacturer.
They simulated 1,000 meters on a system
to see early on how it was going to react.
Simulation was used to assess the impact
of radio-signal power, weather,
environment, building position, etc.
on the system’s configuration.”

more protection there is from breakdowns and bugs. But it
is impossible to anticipate everything. Simulation tools are
therefore mobilized throughout a network’s lifecycle. “IoT
systems are by definition never finished,” underlines Genard.
“Applications are regularly updated by cloud services.”

Replicating breakdowns to understand them

Simulation is also very useful in the event of breakdowns.
“Simulators can replicate breakdowns; they’re a bit like the
film ‘Avatar’ – the real and virtual worlds coexist in them. As
a result of simulation’s infinite computing power, we have
control over time; we can pause, rewind or fast forward,”
says Genard. “It takes just 1 second to do what would
normally take 100 or 1,000.” This works when pinpointing
incidents in real life or when predicting them. Simulation
is an essential investment, especially compared with the
cost of potential massive breakdowns. “Time control is
priceless. We need to reason in terms of the cost of system
failure and its impact on business rather than focusing on
development costs,” says Genard decisively. This factor is
even more important for sensor networks linked to critical
businesses such as water and energy, for which security
issues are paramount. ❚❚
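The “time control” Genard mentions rests on event-driven simulation: simulated time jumps straight from one event to the next instead of waiting in real time. A minimal sketch, with invented devices, event names and timings (this is not Wind River code):

```python
# Minimal discrete-event scheduler illustrating simulated-time control:
# the clock jumps straight to the next event, so hours of device behavior
# replay in a fraction of a second of wall-clock time. Devices, event names
# and timings are invented; this is not Wind River code.

import heapq

def run(events, horizon):
    """Process (time_s, label) events in simulated-time order up to horizon."""
    heapq.heapify(events)
    log = []
    while events and events[0][0] <= horizon:
        t, label = heapq.heappop(events)
        log.append((t, label))
        if label == "heartbeat":  # the device re-arms its next heartbeat
            heapq.heappush(events, (t + 600, "heartbeat"))
    return log

# one sensor heartbeating every 10 minutes, plus a breakdown injected at 1 h
log = run([(0, "heartbeat"), (3600, "radio_failure")], horizon=7200)
print(len(log))  # 14 events spanning two simulated hours
```

Because the log is just data, “rewinding” is simply replaying it from an earlier timestamp, which is the pause/rewind/fast-forward capability described above.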

Simulation-driven Innovation™

Simulation-driven Innovation is at the heart of HyperWorks 14.0.
Altair has just released HyperWorks 14.0, its latest simulation software suite.
With over 2,100 new features and convenient licensing that gives instant
access to leading technologies from Altair and the Altair Partner Alliance,
users can develop great products faster than ever.
Learn more at altair.com and altairhyperworks.com/hw14



Solar Impulse:
100% calculated

Consulting and engineering firm Altran used an
unconventional approach to design a solar-powered
airplane. Soon they will use the same method to
draw up flight plans.
by Jean-François Prevéraud


Solar Impulse is a pure product of simulation. Altran
had to go back to the drawing board to design a solar-powered airplane capable not only of circumnavigating
the globe but also of predicting every flight situation and establishing the best routes. “To pull off a project like this, you have
to constantly push the limits without going too far. Simulation
helped us to achieve this,” says Christian Le Liepvre, who is in
charge of the Solar Impulse project at the Altran Foundation
for Innovation. So how did engineers simulate a project like
this one, without any experience to draw on? “We were well
beyond the realm of conventional simulation, which adjusts
virtual reality to the real world. That’s why we used unconventional exploratory simulation,” explains Christophe Béesau,
an advanced simulation expert at Altran.

An airplane behavioral model

Conventional simulation represents physics as explained by
the general theory of systems. But innovation moves beyond
existing engineering knowledge. And the further away you
move, the more the experts disagree on how to achieve
desired outcomes. Before anything is settled, simulation is
therefore used to optimize design processes. This ensures
the best possible solutions to problems, using currently or
shortly available technology. “Since the end of the 1990s,
we’ve been developing mathematical methods and tools to
analyze complexity. Our goal is to establish the most direct
routes for problem-solving and find ways to build exploratory
simulation models,” continues Béesau. Exploratory simulation gives mathematical rather than physical (by definition


“Unconventional simulation
is a wonderful opportunity
to change how we think about problems.”
Christophe Béesau, an advanced simulation expert at Altran


Solar Impulse’s flight trajectory calculated using Altran’s simulation model.

unknown) representations of systems. It enables IT experts
to create unconventional simulation models containing
enough physical representation to guide engineers. Using
25,000 parameters, this approach helped make design decisions for the Solar Impulse in eighteen months rather than
the five years needed for conventional simulation backed up
by physical tests. “This method enabled us to recommend
a four- rather than two-engine approach,” recounts Béesau.
“We also built an airplane behavioral model to assess various
design options, some of which – such as slenderness – are
counterintuitive and completely unlike gliders’ slenderness.”

A tailor-made supercomputer

Before constructing the airplane, unconventional simulation
of virtual flights was used to validate its in-flight behavior and
simulate its mission, i.e. virtually fly the airplane behavioral
model on a given route in the most likely flight conditions.
These conditions were established using weather forecasts
from the American National Oceanic and Atmospheric
Administration and were updated every six hours. The
energy efficiency aspect of flight feasibility was also validated using unconventional simulation. “This year, we’ll
be taking into account fifty probabilistic weather scenarios
for each flight, thus increasing opportunities for takeoff,”
explains Le Liepvre. To this end, Altran has developed its
own supercomputer, with about ten teraflops of computing
power and optimum architecture for running very large-scale
unconventional simulation. “It’s a wonderful opportunity to
step up the pace by changing how we think about problems,”
concludes Béesau. ❚❚
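The fifty-scenario idea Le Liepvre mentions can be illustrated with a deliberately crude energy-balance check: count the weather scenarios under which the planned flight stays above a battery margin, and authorize takeoff only if enough of them pass. The energy model and every number below are invented stand-ins for Altran’s behavioral model.

```python
# Illustration of the fifty-scenario idea: check a (deliberately crude)
# energy balance of the planned flight under each probabilistic weather
# scenario, and authorize takeoff only if enough scenarios pass. The energy
# model and all numbers are invented stand-ins for Altran's behavioral model.

def flight_feasible(cloud_cover: float, battery_margin: float = 0.2) -> bool:
    """Crude daily energy balance: solar gain minus consumption."""
    solar_gain = 1.0 * (1.0 - cloud_cover)  # normalized energy collected
    consumption = 0.6                       # normalized energy spent
    return solar_gain - consumption >= battery_margin

def go_for_takeoff(cloud_scenarios, threshold: float = 0.8) -> bool:
    """Authorize takeoff if enough weather scenarios keep the flight feasible."""
    ok = sum(flight_feasible(c) for c in cloud_scenarios)
    return ok / len(cloud_scenarios) >= threshold

# five sample scenarios (cloud-cover fractions) instead of fifty
print(go_for_takeoff([0.05, 0.10, 0.15, 0.25, 0.10]))  # True: 4 of 5 feasible
```

Running many scenarios rather than one best guess is what widens the takeoff windows: a single pessimistic forecast no longer grounds the airplane on its own.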

European center of competence
in HPC, Big Data and Simulation

Teratec Campus

CEA Very Large
Computing Center
An infrastructure for high-end HPC, open to
industrial partners through the
CCRT (Centre de Calcul Recherche et Technologie):
1.4 Pflop/s of secured parallel computing power, made
available to large industrial companies, sharing
competencies, costs and innovations through
sustainable partnerships with CEA.

Industry. Large companies, SMEs and start-ups
develop their activities at the TERATEC campus, along the entire high-performance
IT value chain – from components and
systems to software and applications.


Research. Several research teams work at the
TERATEC Campus to investigate new uses of HPC
and to develop and deploy new intensive computing
and Big Data technologies.


Contacts & Information

Tel. +33 (0)9 70 65 02 10
Campus Teratec
2 rue de la Piquetterie
91680 Bruyères-le-Châtel - France

Contacts & Information


Tel. +33 (0)1 69 26 62 56
2 rue de la Piquetterie
91680 Bruyères-le-Châtel - France

Intel, the Intel logo, Intel Inside, the Intel Inside logo, Xeon, Xeon Inside are trademarks of Intel Corporation in the U.S. and/or other countries.


Bull sequana supercomputers:
live innovation to the fullest
Sequana features Intel® Xeon® processors
Intel Inside®. Extraordinary Performance Outside
Contact an Atos business technologist at bull.com/sequana