The Bull-Joseph Fourier Prize 2010 confirms the importance of computer simulation to society
Catching up with the first prize winner in 2010, Dimitri Komatitsch
Professor of Geophysics at the University of Pau, France, working with CNRS and INRIA
What does computer simulation represent for you, as a researcher? And what wider impact might it have for the general public?
Firstly, it’s important to say that computer simulation helps us enormously, because seismic waves are the only practical means we have of studying what lies below the Earth’s surface: they can travel through places we cannot reach ourselves. The fact that we can perform very high-resolution 3D calculations therefore lets us learn about the Earth’s internal structure.
When it comes to the impact on people, consider the earthquakes in L’Aquila in Italy or in Haiti. Unfortunately, we cannot calculate in advance when an earthquake will happen. But when a dramatic event like that occurs somewhere in the world, calculations run in particular on GPU graphics cards allow us to respond very quickly. In the hours that follow, we can try to predict the possible effects of aftershocks and produce ground-motion maps that let experts assess how the buildings where people live might be affected.
So at the same time there is an impact on theoretical geophysics, helping us understand how the Earth is formed, and an impact on seismic-hazard assessment, enabling regional studies that help local authorities better face the consequences of an earthquake.
How do you see the future?
In terms of how this discipline is changing: today, even when the biggest machines run at full capacity, they allow us to compute very high-resolution simulations, but only for a fixed, given model. So I hope that in the future we will be able to study the inverse problem. Because we do not know the structure of the sub-soil perfectly, we have to solve an iterative problem: we look at the gap between the calculated data and the data observed in a real earthquake, and iteratively improve the model, in the form of an inverse problem, to improve our knowledge of the sub-soil. Not by trial and error, but through a formal process with hundreds or even thousands of iterations. So I think we are going to demand even more computing power from massive machines in the future, because we have two major needs: fast processing, but also a great deal of memory, because we want to reach very high resolution.
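The iterative loop described here — compare synthetic data from a forward model against observed data, then update the model to shrink the misfit — can be illustrated with a deliberately simplified sketch. This is not the actual seismic workflow (real inversions use full wave-equation solvers and adjoint methods); it is a hypothetical 1-D least-squares example showing only the structure of the iteration.

```python
import numpy as np

def forward(model, x):
    # Hypothetical forward model: predicted travel time grows
    # linearly with distance x, with slope given by the unknown
    # "slowness" parameter we are trying to recover.
    return model * x

def invert(x, observed, model=1.0, lr=0.01, iterations=200):
    # Formal iterative inversion, not trial and error: each pass
    # measures the gap between calculated and observed data and
    # nudges the model down the misfit gradient.
    for _ in range(iterations):
        predicted = forward(model, x)
        residual = predicted - observed       # calculated minus observed
        gradient = np.mean(residual * x)      # d(misfit)/d(model) for 0.5*mean(residual**2)
        model -= lr * gradient                # iterative model update
    return model

x = np.linspace(1.0, 10.0, 50)
true_slowness = 0.25
observed = forward(true_slowness, x)          # synthetic "real earthquake" data
recovered = invert(x, observed)               # converges toward 0.25
```

In a real seismic inversion the single scalar `model` becomes a 3D Earth model with millions of parameters, and each `forward` evaluation is itself a large wave-propagation simulation, which is why hundreds or thousands of iterations demand so much computing power.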
GPU technology now lets us address the time constraint efficiently. The next step forward we want to make is much bigger GPU clusters, because right now their memory capacity is smaller than that of traditional CPU clusters, which is why I use both. The dream is to have very large-scale computers, as big as current CPU clusters, equipped with accelerators with lots of memory, meeting the time and resolution needs at the same time, so that we could solve an inverse problem efficiently. That is not yet the case, but I think this technology will be available within the next year or two. You only need to look at the Petaflops-scale machines currently being developed.
So how long do your simulations typically take these days?
For aftershocks, which we calculate at a regional level, the responsiveness of GPUs enables us to compute three, four or even five possible aftershock scenarios in a single night. That is important, because in these situations you have to react fast: we need this information within hours if possible. The usefulness of such calculations decreases after a week or so, once the risk has diminished.
Then there are the large-scale simulations at worldwide scale. For current simulations we use between 6,000 and 10,000 CPU cores, and each simulation typically runs for 24, 48 or even 72 hours. That amounts to hundreds of thousands of CPU-hours per simulation.
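The figures quoted here can be turned into a quick back-of-envelope cost: the total compute budget of a run is simply the number of cores multiplied by the wall-clock hours. A minimal sketch:

```python
def cpu_hours(cores, wall_clock_hours):
    # Total compute cost of one run: every core is busy for the
    # whole wall-clock duration of the simulation.
    return cores * wall_clock_hours

# The range quoted in the interview: 6,000-10,000 cores, 24-72 hours.
low = cpu_hours(6_000, 24)     # smallest run quoted
high = cpu_hours(10_000, 72)   # largest run quoted
print(low, high)               # 144000 720000
```

So a single global simulation in the quoted range consumes on the order of 10^5 CPU-hours, which is why an inversion requiring hundreds or thousands of such runs pushes toward the largest machines available.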
How has winning the Bull-Joseph Fourier Prize helped you?
The Bull-Joseph Fourier Prize has indeed already helped me. I’m delighted to have won for a second time, and I want to thank the organizers and Bull, as well as GENCI, La Tribune and Serviware, because I am especially happy about two things. The first is that very large-scale machines are being made available to the community through GENCI and the collaboration with Bull.
The second is that a whole community is being built around this, which is at least as important as the machines themselves. The computers have to be there, but so does the feeling that you are supported by people who get together regularly. And then a prize recognizes the people who have invested in that community, which I think is very good. It is important to say that I worked with a whole team. Without those people and without the technical support, without the people from Bull and GENCI, I could not have done any of this on my own, so I see the prize as recognition of the whole team.
Last year, when I finished in third place, I was awarded GPU processing time. Without that I would not have been able to meet this year’s challenge. I really benefited from that large time quota on a very high-performance machine; I had exclusive use of it, which is unusual. That is what enabled me to take on the second challenge and meet our objective.
The Bull-Joseph Fourier Prize
- The work of the three prize winners has resulted in major advances in predicting the effects of earthquakes, improving treatments for cancer, and reducing pollutant emissions from combustion
- The 2010 Bull-Joseph Fourier Prize was awarded at the Ter@tec 2010 conference, which brought together over 600 specialists in scientific computing and computer simulation at the Ecole Polytechnique in France
With a first prize of €15,000, the Bull-Joseph Fourier Prize is unique in that it also offers the two other prize winners ‘machine time’ on GENCI supercomputers, to help intensify their research efforts in the best possible conditions.
First prize: Dimitri Komatitsch was awarded the first prize for his work on the parallelization of codes to simulate global phenomena, as well as for the impact of his research, which enables the effects of earthquakes and their aftershocks to be predicted more effectively. His work has already been used by the Italian authorities, following the L’Aquila earthquake.
Second prize: Sébastien Jan, a researcher in the Life Sciences Department at the French Atomic Energy Authority (CEA), working in the Institute of Biomedical Imaging at the Frédéric Joliot Hospital Service. His work helps improve detection and therapy in cancer treatment: it enables better monitoring of patients, so as to evaluate their response to a particular treatment, and in radiotherapy it allows radiation doses to be targeted more precisely. Sébastien Jan has been awarded 100,000 hours of GPU processing time on a GENCI hybrid supercomputer.
Third prize: Vincent Moureau, a researcher at the French National Center for Scientific Research, the CNRS, in the Inter-professional Research Center for Aerothermochemistry (CORIA) in Rouen. The main aim of his work is to carry out highly accurate simulations of the combustion of fuel jets. The results of these simulations will help to reduce the energy consumption and pollution emissions from combustion, whether in vehicle engines, aeronautical gas turbines or industrial furnaces. Vincent Moureau has been awarded 300,000 hours of processing time on a GENCI supercomputer.