The volume and throughput of data to be analysed in financial markets can be daunting. This is by no means specific to the financial world; it happens in many other data analysis fields too. What is fairly unique to this industry is that the data is highly structured, which is less common elsewhere. Financial market data consists of huge numbers of small, mostly unrelated messages: it is easy to end up with hundreds of millions of small messages to receive, store, decode, parse and correlate.

Analysing such a huge amount of data requires many iterations over it, so saving time on random data access becomes important.
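A minimal sketch of this idea, assuming a hypothetical tick-message layout (timestamp, instrument id, price, size): decode the messages once into an in-memory NumPy structured array, then run repeated analysis passes with cheap random access instead of re-reading and re-parsing storage on every iteration. The field names and synthetic data here are illustrative, not any specific feed format.

```python
import numpy as np

# Hypothetical tick messages. In a real pipeline these would be
# decoded from a market data feed; here we generate synthetic data
# just to illustrate the access pattern.
rng = np.random.default_rng(42)
n_messages = 1_000_000

ticks = np.zeros(n_messages, dtype=[
    ("ts_ns", "u8"),       # timestamp in nanoseconds
    ("instrument", "u4"),  # numeric instrument id
    ("price", "f8"),
    ("size", "u4"),
])
ticks["ts_ns"] = np.sort(rng.integers(0, 10**12, n_messages))
ticks["instrument"] = rng.integers(0, 500, n_messages)
ticks["price"] = rng.uniform(10, 100, n_messages)
ticks["size"] = rng.integers(1, 1000, n_messages)

# Random access on the in-memory array is O(1): pick arbitrary
# message indices without touching storage again.
sample = ticks[rng.integers(0, n_messages, 10_000)]
mean_price = sample["price"].mean()

# Repeated analysis passes (the "many iterations") reuse the same
# decoded array rather than re-reading the raw messages each time.
per_instrument_volume = np.bincount(ticks["instrument"],
                                    weights=ticks["size"])
```

The one-time decode is the expensive step; once the messages live in memory in a columnar layout, each further iteration is a fast array operation.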

#ai #machine-learning #data #data-science #fintech

Using in-memory access in Data Science