European consortium develops new approaches for dealing with Big Data

Published 18 August 2015, by Thérèse Hameau

Big Data is a major factor driving knowledge discovery and innovation in our information society. However, large amounts of data can only be used efficiently if algorithms for understanding the data are available and if these algorithms can also be appropriately applied in highly scalable systems with thousands of hard drives. Big Data thus presents complex challenges for software developers, as the necessary algorithms can only be created with the aid of specialist skills in a wide range of different fields, such as statistics, machine learning, visualization, databases, and high-performance computing.

The new BigStorage project, funded by the European Union, will develop new approaches for handling Big Data over the next three years, ranging from theoretical basic research to the development of complex infrastructures and software packages. As an Innovative Training Network (ITN) of the European Union, it also plays an important role in training researchers and developers in an international context. The various tasks are being addressed by a European consortium of research teams and industrial partners. The work undertaken at the Data Center at Johannes Gutenberg University Mainz (JGU) will focus on the impact of new storage technologies and on the convergence of high-performance computing and Big Data.


The project website:

BigStorage is a European Training Network (ETN) whose main goal is to train future data scientists, enabling them to apply holistic and interdisciplinary approaches to take advantage of a data-overwhelmed world. This requires HPC and Cloud infrastructures, along with a redefinition of the storage architectures underpinning them, focused on meeting highly ambitious performance and energy-usage objectives...

BigStorage: Storage-based Convergence between HPC and Cloud to handle Big Data: