Why big data’s big promises are finally within reach


By Adam Spearing, VP Platform EMEA, Salesforce

Let’s face it: until very recently, big data has been a big letdown. Data warehouses and data analytics tools have historically proven difficult to design, build, and maintain. How much storage will be needed? How much data is there? What data management tools can the organisation afford and, just as important, what in-house expertise is available to build and run the data warehouse or data analytics platform?

InformationWeek recently outlined eight reasons why big data projects often fail. The article cited a survey from Gartner that found an astonishing 92% of organisations are stuck in neutral when it comes to their big data initiatives. Why? Because enterprises are spending a lot of money on big data technologies, or plan to, but don’t have the right skills or strategies in place to drive the initiatives forward.

That’s a shame because, when done right, good analytics can improve customer experience, drive sales, mitigate financial risks, and streamline business efficiencies.

For a moment, consider how the typical data warehouse is run in today’s enterprise. Information is usually delivered from the data warehouse in a two-step process. Those needing new business insight tools first send a request to their analysts, who then turn to the IT team to provision the necessary hardware and data platforms. The analysts or data scientists then perform all of the associated integration work and schema development. Remember, building even the most basic data warehouse can take eight months to a year.

If anyone in the organisation needed access to data or had questions they’d like answered, they’d always have to go to the gatekeepers — the data scientists.

Legacy data analytics tools have also fallen short of the promises made for so-called “big data solutions.” These tools have proven difficult to use. They are too reliant on on-premises technology, and they are not available where their insights are needed most: in the field, at customer locations, within the factory, before the big presentation, or wherever employees, executives, and teams happen to be when they need answers.

Fortunately, advances in enterprise technology in recent years are removing the barriers between the data that enterprises hold and the insight it can yield.

Currently, three technological trends are driving this change: data analytics tools are now cloud native, they are mobile, and they are within reach of everyone.

As modern data analytics technologies are built on cloud infrastructure, they don’t require dedicated hardware or sophisticated applications. They are always available as required and in the capacity they are needed.

Accessing and querying data is no longer something that happens only in the data center, behind the corporate firewall. Data and the associated insight can be reached securely from anywhere, which means comprehensive answers to the business’s most important and most unexpected questions are available through mobile devices on engineering sites, in the manufacturing plant, or at a customer location.

Because modern analytics tools are based on cloud technologies and available on almost any device, the insight they provide is accessible to anyone. This is a profound change. Now salespeople, business managers, designers, engineers, whoever needs answers, can query the data directly and get the actionable insight their specific jobs require without routing requests through an analyst.
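The self-service idea can be made concrete with a small sketch: instead of filing a request with an analyst, a business user (or a lightweight tool acting for them) runs an ad-hoc query directly against a dataset. Here is a minimal illustration in Python using the standard library’s sqlite3 module; the table, columns, and figures are invented for the example and stand in for a cloud-hosted dataset.

```python
import sqlite3

# Invented sample data standing in for a cloud-hosted dataset.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("EMEA", 1200.0), ("EMEA", 800.0), ("APAC", 950.0)],
)

def sales_by_region(conn):
    """An ad-hoc question a salesperson might ask directly,
    with no analyst in the loop: total sales per region."""
    cur = conn.execute(
        "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
    )
    return dict(cur.fetchall())

print(sales_by_region(conn))  # {'APAC': 950.0, 'EMEA': 2000.0}
```

The point is not the SQL itself but who runs it: the question goes straight from the person who has it to the data, and the answer comes straight back.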

Of course, these cloud-native, mobile, and insight-accessible data analytics technologies are incredibly powerful. Essentially, they provide the capacity to consume billions of rows of data at a scale that was never possible before. And this data can be surveyed from anywhere, delivered on almost any mobile device, and comprehended by anyone. This intelligent collection, analysis, and mobile distribution of data analytics enables enterprises not just to stay a step or two ahead of the competition, but to pull far away from it.

30 Mar 2015, 7:38 p.m.

Adam, cloud computing is driving a new wave of innovation in the area of big data. The open source solution from HPCC Systems provides a single platform that is easy to install, manage, and code for. Designed by data scientists, HPCC Systems is a data-intensive supercomputer that has evolved over more than a decade, with enterprise customers who need to process large volumes of data in a 24/7 environment. Its Thor Data Refinery Cluster, which ingests vast amounts of data and transforms, links, and indexes it, and its Roxie Data Delivery Cluster are now offered on the Amazon Web Services (AWS) platform. Running HPCC Systems in the cloud makes big data analytics easier for developers, and it can be launched and configured with the click of a button through their Instant Cloud solution. More at http://hpccsystems.com