There are bad guys out there!
Going back to the gist of my last post, one of the pillars that underpinned de-regulation was the idea that companies would behave ‘correctly’ and regulate themselves. The truth is that this worked, and still works, very well for 95% of companies, but there are always bad pennies committing fraud or simply not being careful in their accounting practices. Thanks to a few well-known financial disasters, even before the global meltdown, the concept of re-regulation loomed large across many industries. Many sets of rules are now in place to bring governance to company business – some of the better known include Sarbanes-Oxley and Basel II and III, which have been around for a little while now. We might ask ourselves what they have in common, and the answer is that these and many more such initiatives demand that very accurate and accountable numbers be produced quickly from very complex underlying data – the need for Business Intelligence rears its head once again, and the term ‘Big Data’ can certainly be applied to some of these initiatives.
Re-regulation demands that some very complex numbers are delivered:
- Quickly
- Accurately
- Transparently
Throw into the pot that the data needed, as often as not, comes from tens or even hundreds of operational systems distributed across the world, and that some of these initiatives require very complex predictive modelling and detailed segmentation, and we see a new class of Big Data applications emerging.
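To make the ‘quickly, accurately, transparently’ demand a little more concrete, here is a minimal sketch – my own illustration, not tied to any particular regulation, product, or the systems mentioned above – of consolidating a figure from several hypothetical operational systems while keeping the lineage an auditor would need to trace the total back to its sources.

```python
# Hypothetical sketch: consolidate one account across several source systems
# while retaining an audit trail (all system names and figures are invented).
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Figure:
    source_system: str   # e.g. "ERP-EMEA", "Billing-APAC" (illustrative names)
    account: str
    amount: float
    extracted_at: datetime

def consolidate(figures: list[Figure], account: str) -> dict:
    """Sum one account across all source systems, retaining lineage."""
    relevant = [f for f in figures if f.account == account]
    return {
        "account": account,
        "total": sum(f.amount for f in relevant),
        # Transparency: record which system contributed what, and when.
        "lineage": [
            {"system": f.source_system,
             "amount": f.amount,
             "extracted_at": f.extracted_at.isoformat()}
            for f in relevant
        ],
    }

if __name__ == "__main__":
    now = datetime.now(timezone.utc)
    sample = [
        Figure("ERP-EMEA", "revenue", 1_250_000.0, now),
        Figure("ERP-AMER", "revenue", 2_400_000.0, now),
        Figure("Billing-APAC", "revenue", 830_000.0, now),
    ]
    print(consolidate(sample, "revenue"))
```

The point of the sketch is simply that the consolidated number and its provenance are produced together; in a real regulatory reporting pipeline the same idea has to hold across far more systems, currencies, and reporting rules.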