The Big Data Institute is an innovation hub based at UCL, founded with support from Elsevier. It will tackle the challenges researchers face as they seek to forecast trends, synthesise information from thousands of research papers, and demonstrate the potential societal impact of their research.
Big Data is a general term for the large volumes of data produced by the automated acquisition of information. Forms of Big Data have become ubiquitous in government, society and science, and Big Data methods are designed for the analysis of such large collections of heterogeneous data. The size of most modern data sets challenges state-of-the-art methods for data acquisition, computation, analysis, storage and retrieval. Much attention has been paid to applying Big Data methods to mine large data sets (often for information they were not designed to deliver); far less has been paid to the theoretical underpinnings of the field, which are the focus of our institute's activities. This lack of theoretical coherence will become a bottleneck to further developments in the near future and will impede the transfer of ideas between the fields in which data are collected. By abstracting and studying the data forms present in many practical applications, the institute aims to advance our understanding of how to make sense of the information contained in extremely large volumes of data. To keep this understanding practical, and to draw inspiration from real-world problems, the institute will also work directly on Big Data applications, adapting our theoretical advances to this purpose.