Bullx™ supercomputers will be used for weather forecasting and climate research.
Featuring a highly innovative liquid cooling system, the bullx supercomputers chosen by Météo-France will have a very limited carbon footprint.
Météo-France – the French weather forecasting agency – has chosen bullx B700 DLC supercomputers from Bull for its new-generation processing requirements. The supercomputers will be installed at Météo-France’s site in Toulouse from the first quarter of 2013 onwards, and then at the Espace Clément Ader in autumn 2013. Equipped with an innovative, high-performance cooling system and the latest generation of Intel® Xeon® processors, the bullx supercomputers will initially deliver processing power of around 1 Petaflops (in 2013-2014), with a total performance of over 5 Petaflops due to be available by 2016.
This boost to Météo-France’s processing power is reflected in a major technological evolution: the move from vector to scalar technology, based on industry standards and capable of delivering much higher levels of parallel processing power for minimal total cost of ownership.
Météo-France’s decision confirms Bull’s expertise in the parallelization of application codes used in meteorology and climate science. Greater parallelization is vital to ensure optimum utilization of these machines. It requires fundamental changes to application code – a major challenge in itself – which every meteorological institute worldwide will have to undertake.
“Bull is extremely proud that Météo-France has chosen our very latest, most powerful bullx supercomputers,” commented Philippe Vannier, Chairman and CEO of Bull. “This decision reaffirms Bull’s ability to help major organizations modernize their computing infrastructures to support weather forecasting and climate research; areas that are vital for society and strategic for nation States.”
Météo-France wanted to limit the electricity consumption and cooling requirements of its supercomputers, while increasing its available usable computing power several tens of times over. Data Center electricity consumption – running to several megawatts in the largest installations – has become one of the main factors limiting growth in processing power.
In the perfect Data Center, energy would be consumed only by the servers, corresponding to a PUE of 1. In practice, other equipment in the Data Center also uses energy – especially cooling and air-conditioning systems – which can push the PUE up to around 1.8. By choosing the bullx B700 DLC (Direct Liquid Cooling) supercomputer, Météo-France has opted for an Extreme Computing solution whose cooling system will eventually enable processing power to be increased by a factor of 50 compared with today, while keeping tighter control over the floor space and electricity consumption of the systems involved.
The direct liquid cooling technology developed by Bull – the subject of several patents – is revolutionary. Cooling takes place inside the blade itself, through direct contact between the heat-producing components (processors, memory…) and a cold plate with coolant circulating within it. This enables water at ambient temperature to be used for cooling, boosting energy performance by around 40% compared with traditional Data Centers, while remaining just as easy to maintain as standard, air-cooled servers.
Espace Clément Ader: a collective project bringing together the Institut Clément Ader (ICA), CRITT Mécanique & Composites, the materials micro-characterization platform, and the High-Performance Computing platform, which notably hosts the supercomputers for Météo-France and the University of Toulouse Research and Higher Education Center.
Petaflops: the unit of measurement for supercomputer power. One Petaflops denotes one million billion (10^15) floating-point operations per second.
Power Usage Effectiveness (PUE): a measure of Data Center energy efficiency introduced by The Green Grid, a global consortium of IT companies. PUE is defined as the ratio between the total energy consumption of a Data Center (all IT hardware plus infrastructure equipment) and the energy consumption of the IT hardware alone.
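The PUE ratio defined above can be illustrated with a short calculation; the kilowatt figures below are hypothetical examples, not measurements from Météo-France's installation:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total Data Center power divided by IT power."""
    return total_facility_kw / it_equipment_kw

# Ideal Data Center: every watt goes to the servers, so PUE = 1.
ideal = pue(1000.0, 1000.0)

# Typical air-cooled Data Center cited in the text: roughly 0.8 kW of
# overhead (cooling, air conditioning, power distribution) per kW of IT
# load, giving a PUE of around 1.8.
air_cooled = pue(1800.0, 1000.0)

print(ideal, air_cooled)
```

In other words, at a PUE of 1.8 nearly half of the facility's electricity is spent on something other than computation, which is why lowering the PUE matters at multi-megawatt scale.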