The decade of big data disruption has arrived

From London to Boston to Silicon Valley, the message is consistent – we are finally at the tipping point of realising the value of big data. Analytics coupled with machine learning will help both companies and individuals to derive valuable insights from ever-growing data sources, which will ultimately translate into better decision-making and greater value creation.

As costs of data infrastructure – transmission, storage and computing – drop dramatically and analytical tools proliferate, the innovations of big data will be game-changing.

We are already seeing remarkable developments. At a San Francisco event showcasing innovative uses of big data, I was amazed to learn how Orbital Insight, a Silicon Valley Bank (SVB) client founded by artificial intelligence expert Jimi Crawford, is using images collected by nano-satellites to help track the world’s oil supply. As oil tanks are drained, the shadows cast by their sinking lids change, allowing experts to estimate how much oil remains in tanks around the world. Other company projects include analysing photo imagery to measure construction in China, grain production in Russia and the number of cars parked at big-box (mega-store) retailers.
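Orbital Insight’s actual models are proprietary, but the underlying geometry of the oil-tank example is simple enough to sketch. The snippet below is a minimal illustration only, assuming a floating-roof tank of known height and a sun elevation angle taken from the image metadata; the function name and parameters are hypothetical, not Orbital Insight’s method or API.

```python
import math

def estimate_fill_fraction(shadow_len_m: float,
                           sun_elevation_deg: float,
                           tank_height_m: float) -> float:
    """Rough fill estimate for a floating-roof oil tank seen from above.

    The tank wall casts a crescent-shaped shadow onto the floating roof.
    The emptier the tank, the deeper the roof sits and the longer that
    interior shadow becomes, so basic trigonometry recovers the roof
    depth from the shadow length and the sun's elevation angle.
    """
    # Depth of the roof below the tank rim, from the shadow length.
    roof_depth_m = shadow_len_m * math.tan(math.radians(sun_elevation_deg))
    # Clamp to physically possible values.
    roof_depth_m = min(max(roof_depth_m, 0.0), tank_height_m)
    # Fraction of the tank still holding oil.
    return 1.0 - roof_depth_m / tank_height_m

# Example: a 12 m interior shadow at 40 degrees sun elevation
# on a 20 m tall tank suggests the tank is roughly half full.
print(f"{estimate_fill_fraction(12.0, 40.0, 20.0):.0%} full")
```

In practice the hard part is not this arithmetic but detecting the tanks and measuring the shadows reliably across millions of images, which is where the machine-learning side of the business comes in.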

In May, my colleagues at SVB Analytics, a non-bank affiliate of SVB, presented their vision of what is ahead at the SVB Next: Big Data event in Boston. Data is growing faster than ever, and the means for storing and processing these huge quantities of data are becoming less expensive. These two trends are driving new and more powerful uses for big data.

SVB Analytics head Steve Allan likens the evolution of big data to a 21st-century alchemy process. New types of sensors and connected devices are expanding the ability to capture data in every type of business. This data can be processed using new technologies that handle large quantities of both structured and unstructured data quickly and cost-effectively.

Finally, new analytics applications help to derive insight from the vast new quantities of data, driving real-world value for business users across many industries. These tools ultimately help companies to know their customers better, bolster security and improve operational efficiency.

Steve has created the SVB Big Data Maturation Index, which is designed to pinpoint which industries have adopted big data more quickly than others. Several factors impact the pace of adoption, including ease of data capture, level of regulatory oversight, and level of technology enablement. Interestingly, the report finds several of the industries with the largest market sizes – financial services, healthcare and energy – are relatively underdeveloped when it comes to big data, which indicates significant potential in these markets.

In the coming iteration of big data, Steve says the highest-value input will be the data itself. As data infrastructure, management and analytics tools become commoditised and more commonly adopted, building a proprietary data set will become the most important factor in maintaining a sustainable competitive advantage.

At the Boston big data event, we heard lots of sage advice to consider as we enter this brave new world of zettabytes.

Several presenters working at corporations or with corporate partners recommended starting with a clear long-term data strategy, including identifying a hypothesis, a problem to solve or customer demand to create. And then, they advised, do not be afraid to go against preconceived notions if the data disproves that hypothesis.

Cracking the code in this environment means identifying the key to how humans make decisions. Machines are growing sophisticated enough to augment, and even supplant, current data analysis techniques, and the “automated hypothesis” is now at our doorstep. The decade of big data disruption has arrived.