With the digitisation of most processes, the emergence of social network platforms and blogs, the deployment of all kinds of sensors, the adoption of hand-held and wearable devices, and the explosion in internet usage, huge amounts of data are being generated continuously.

According to IBM, the world’s population collectively generates 2.5 quintillion bytes of data every day. Anything and everything “smart,” from smart homes to smart grids, means greater data volumes at exponentially accelerating speeds. In short, we are becoming more ‘datafied’ with every passing moment.

Data overload

There is a growing belief that gleaning insights from vast arrays of data will be a key business differentiator in the coming decades, thus promoting growth and popularity of business analytics, and demand for data scientists. Now, more than ever, managers are asked to know how to tease insight from data — to understand where they come from, make sense of the numbers, and use those findings to inform their toughest decisions.

The availability of more and more data at ever lower cost seems to have created a notion that, with enough data, one can do away with judgment. Evidence, however, shows that more data does not mean better performance, nor does confidence equate to competence. Investments in analytics can be useless, even harmful, unless employees can incorporate that data into complex decision-making.

Undoubtedly, the increased availability of data (the so-called ‘big data’) can serve as raw material for business intelligence. This has led to the current craze of going whole hog to capture and analyse every bit of data possible. There is a growing belief that superior analytical ability can overcome any and every data-related issue and lead to better decision-making. Consequently, a notion has taken hold that data is the answer, and organisations merely have to come up with the questions.

A few years ago, Chris Anderson, former editor-in-chief of Wired magazine, published a provocative and thought-provoking article: ‘The End of Theory: The Data Deluge Makes the Scientific Method Obsolete’. The point to note is that datafication is an information technology-driven sense-making process, whereas in the organisational literature, sense-making refers to processes of organising using the technology of language. This creates a gap, because technology-driven sense-making is not enough on its own, although those engaged in data analytics seem to believe strongly that it is.

Too many sources

The fact of the matter is that so much data comes from so many sources that ambiguity, inconsistency and contradictions abound. Let’s face it: the basic principles that make for good strategy often get obscured. Sometimes the explanation is a quest for the next new thing, natural in a field like strategy, which emerged through the steady accumulation of frameworks promising to unlock the secret of competitive advantage. This evolution of frameworks illustrates why it is a mistake to think of any model used mechanically to develop strategies as “free of human judgment”.

The algorithms that clean data at the point of capture and find patterns, trends and relationships in its volume, velocity and variety are closed in nature. This matters because they not only extract and derive meaning from the world, but have also started shaping it. Recent financial and business events show all too plainly what can happen when rich data and analytics collide with gaps in knowledge or lapses in judgment. Sport, too, is no exception.

Take the case of the Indian Premier League (IPL), which concluded in May. Each team had tons of data on batters: batting averages against every kind of bowling, on different surfaces, at different positions in the batting line-up, and the like. There was similar data on bowlers, including performance in the initial overs when field restrictions are on, and in the death overs.

Despite all that, the teams often struggled to figure out who would constitute the best playing eleven on a given day. Why? Because, in reality, torrents of data, reams of analysis and piles of documents can be more distracting than enlightening. No analytical tool can do more than augment or complement what is a cognitive, and sometimes social, process.

Also, numbers cannot nail every nuance of a decision. This is not to say data are not useful; it all depends on how the data are used. All said and done, generating insight is an inherently human trait, and strategy is a way of thinking, not a procedural exercise or a set of frameworks.

Therefore, leaders need to ensure that their processes and human capabilities keep pace with the computing firepower and information they import. To overcome the insight deficit, Big Data, no matter how comprehensive or well analysed, needs to be complemented by big judgment.

The writer is former Dean and Director-in-Charge, IIM Lucknow.
