
Full-scale experiments with Big Data

28 July 2014 by Pierre Pleven, TeraLab founder and coordinator

In Big Data – an area that is still in its infancy – full-scale experimentation is essential to test, evaluate and validate new ideas. For these experiments to be meaningful, they need to be carried out at full scale, with the tools and the amounts and types of data that are specific to Big Data. Yet for most organizations, especially smaller ones, these conditions are difficult to meet.

The aim of the TeraLab platform is precisely to provide the hardware, software, data and expertise needed to experiment at true scale in this emerging ecosystem. This powerful and reliable environment is unique in Europe, and is open to anyone involved in Big Data projects, whether public or private.

Developed jointly by the Institut Mines-Télécom and GENES, the group of national schools of economics and statistics, in partnership with the Cap Digital competitiveness cluster, the TeraLab platform won the French government's 2012 'Big Data' call for projects under its Investments for the Future programme (PIA). With a budget of €5.6 million, TeraLab will accelerate research and innovation in Big Data.

Operating under the authority of the ministry in charge of industry and electronic communications, the Institut Mines-Télécom – which brings together thirteen higher education establishments – not only has an academic and scientific role, but also a mission to support innovation. For some years now, Big Data has been one of the future trends identified by the Institut.

The bullion server: spearheading TeraLab
For our ‘In-Memory-as-a-Service’ offering we have equipped TeraLab with a new-generation bullion server. Available on-site or remotely, bullion will realize one of the promises of Big Data: to provide near real-time results from the analysis of massive amounts of data, and to turn those results into innovative uses and applications. Projects based on our ‘In-Memory-as-a-Service’ solution are already planned in areas such as banking, social protection and the Internet. bullion will help TeraLab achieve its goal of being a facilitator and a catalyst for innovation in Big Data at a national and European level.
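To give a concrete feel for what in-memory analysis looks like from a user's point of view, here is a minimal Python sketch: the entire dataset is loaded into RAM once, and every subsequent query runs against memory rather than disk. The file name and columns are purely illustrative and are not part of TeraLab's actual tooling.

```python
# Minimal sketch of an in-memory analysis workflow (hypothetical data and columns;
# not TeraLab's actual API). The dataset is loaded into RAM once, then repeated
# queries run against the in-memory frame with no further disk access.
import pandas as pd

# Load the full dataset into memory - feasible when available RAM exceeds the data size.
transactions = pd.read_parquet("transactions.parquet")  # hypothetical file

# Near real-time style queries: repeated aggregations on the in-memory frame.
daily_totals = (
    transactions
    .groupby(pd.Grouper(key="timestamp", freq="D"))["amount"]
    .agg(["count", "sum", "mean"])
)

top_customers = (
    transactions.groupby("customer_id")["amount"]
    .sum()
    .nlargest(10)
)

print(daily_totals.tail())
print(top_customers)
```

Because nothing is re-read from disk between queries, exploratory analysis of this kind can be iterated in near real time, which is exactly the usage pattern ‘In-Memory-as-a-Service’ is meant to support.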

The technical platform includes a large collection of useful data, divided into several sections: free, industrial and hyper-secure (CASD technology). The software stack (infrastructure components, databases, compilers, analytical tools…) is built on state-of-the-art standards to offer the widest possible range of options, and comes pre-configured so that it is ready to run for users. The TeraLab bullion server has 240 processing cores and 4 TB of RAM, enabling users to test the possibilities of in-memory analysis. Combined with a 50 TB storage system, bullion allows complex algorithms to be applied to huge amounts of data. With its scalability, performance and reliability, bullion enables us to offer a simple and secure computing resource that is perfectly suited to the challenges of Big Data.
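As a rough illustration of how such a many-core, large-memory node can be exploited, the sketch below splits an in-memory array across all available CPU cores using only the Python standard library and NumPy. The workload and chunking are illustrative, not TeraLab-specific.

```python
# Minimal sketch of spreading an in-memory computation across many cores on a
# single large-memory node. The workload is a placeholder, not TeraLab-specific.
import os
from concurrent.futures import ProcessPoolExecutor

import numpy as np


def summarize(chunk: np.ndarray) -> float:
    """CPU-bound work on one in-memory chunk (placeholder computation)."""
    return float(np.sqrt(chunk).sum())


def main() -> None:
    data = np.random.rand(10_000_000)            # stands in for a much larger in-memory array
    chunks = np.array_split(data, os.cpu_count() or 1)

    # Fan the chunks out across all available cores (up to 240 on the bullion node).
    with ProcessPoolExecutor() as pool:
        partials = list(pool.map(summarize, chunks))

    print(f"cores used: {os.cpu_count()}, result: {sum(partials):.2f}")


if __name__ == "__main__":
    main()
```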
