
Algorithms for Big Data

Full course description

The emergence of very large datasets poses new challenges for the algorithm designer. For example, the data may no longer fit into main memory, and caching from a hard drive becomes a new bottleneck that needs to be addressed. Similarly, algorithms with super-linear running times simply take too long on very large datasets. Moreover, simple sensory devices can observe large amounts of data over time but cannot store everything they observe, so an immediate decision about what to store and compute must be made. Classical algorithmic techniques do not address these challenges, and a new algorithmic toolkit needs to be developed.

In this course we will look at a number of algorithmic responses to these problems, such as: algorithms with (sub-)linear running times, algorithms where the data arrive as a stream, and computational models where memory is organized hierarchically (with larger storage units, such as hard drives, being slower to access than smaller, faster storage such as CPU cache memory). New programming paradigms and models such as MapReduce/Hadoop will be discussed. We will also look at a number of topics from classical algorithm design that are of undiminished relevance in the era of big data, such as approximation algorithms and multivariate algorithmic analysis.
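As a small taste of the streaming model mentioned above, the following is a minimal illustrative sketch (not part of the course materials) of reservoir sampling, a classic technique for keeping a uniform random sample of k items from a stream whose length is unknown in advance, using only O(k) memory:

```python
import random

def reservoir_sample(stream, k):
    """Maintain a uniform random sample of k items from a stream
    of unknown length, using O(k) memory (Vitter's Algorithm R)."""
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            # Fill the reservoir with the first k items.
            reservoir.append(item)
        else:
            # Keep item i with probability k/(i+1), replacing a
            # uniformly chosen slot; this keeps the sample uniform.
            j = random.randint(0, i)
            if j < k:
                reservoir[j] = item
    return reservoir

sample = reservoir_sample(range(1_000_000), 10)
```

Each item is inspected exactly once and either stored or discarded immediately, which is precisely the "decide now, cannot revisit" constraint that streaming algorithms must live with.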

Prerequisites

None

study/master_dsdm/year_1/block_4/algorithms_for_big_data.txt · Last modified: 2018/11/02 14:18 by emasliankova