Guest post: Analysing Big Data

In Big Data, Data & Analytics, Events, Technology by Simon Crompton-Reid


It's absolutely staggering to look at the amount of digital information generated today, let alone the amount projected for the future. Can you visualise what 3,000 Exabytes (EB)* looks like? That's a conservative estimate of the size of the ‘digital universe' today.

EMC, a leading digital storage provider, gives a fascinating overview of this on its website.

However, just capturing data is not very useful in itself – it's the analysis and understanding that really counts.

As the ‘buckets' of data keep getting bigger, the businesses that can analyse their data more effectively, and then refine their strategies accordingly, are the ones bound to be more successful. And this era is only just beginning. IDC's Digital Universe Study (Dec 2012) demonstrates the Big Data Gap: the difference between the amount of data captured and the amount actually analysed, shown in the image above.

IDC estimates that less than 1% of current ‘big data' is actually analysed!

So there's plenty of room for improvement, and lots of opportunity for the people and organisations that are good at collecting and analysing data and then making sound strategic decisions based on that analysis.

In our Financial Modelling Masterclass, we teach some extremely powerful desktop methods that help business people perform practical data reduction and analysis. Whether it's looking for trends, making forecasts or finding optimal ways of managing a process, we use powerful techniques for getting the most value from the underlying data.
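To give a flavour of the trend-and-forecast idea mentioned above, here is a minimal sketch in Python of an ordinary least-squares trend line. The sales figures are invented for illustration, and the masterclass itself works with desktop (spreadsheet) techniques rather than this code.

```python
# Illustrative sketch: fit a straight-line trend to a series and
# extrapolate one period ahead. Pure Python, no libraries needed.

def fit_trend(ys):
    """Least-squares fit of y = a + b*x for x = 0, 1, 2, ...; returns (a, b)."""
    n = len(ys)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return intercept, slope

# Hypothetical quarterly sales figures (in, say, thousands)
sales = [100.0, 108.0, 118.0, 125.0]
a, b = fit_trend(sales)

# Forecast the next quarter by extending the fitted line
forecast_next = a + b * len(sales)
```

The same calculation is available in a spreadsheet via built-in functions such as SLOPE, INTERCEPT and TREND, which is closer to the desktop approach the post has in mind.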

* 1 EB = 1,000,000,000,000,000,000 bytes = 10^18 bytes = 1,000 petabytes = 1 billion gigabytes


This is a guest post by Harold Graycar, Business Advisor and trainer for the 5 Day MBA in Financial Modelling.