That was bad, very bad.

What?

What we saw in the video that went viral. A passenger being dragged off a UA flight...

Yes. Some days ago, a man was dragged off an overbooked United Airlines flight in the US after the airline couldn’t find volunteers to leave the plane and picked four passengers itself. The picks weren’t random, though: an algorithm selected the ones to be de-boarded.

A computer? But humans run the computer, I’m sure.

Agreed. This is not to absolve UA of bad behaviour, but to point out, as mathematician and data scientist Cathy O’Neil did in her Bloomberg column, that algorithms built on inferences from big data (a large pool of assorted data gleaned from various sources over time and mined for correlations in business and consumer behaviour) can supersede human concerns such as dignity and fair play.

Could you elaborate, please?

See, in the UA episode, as O’Neil explains, the algorithm was merely looking for the lowest-value customer. All it wanted was to minimise the damage (to the airline) from de-boarding a passenger. So, it zeroed in on David Dao, who was a ‘coach’ passenger, not a business-class flyer.

Obviously, he had paid less for his ticket than the others, and he wasn’t a member of any rewards programme either. Thus, as far as our algorithmic friend could tell, dumping Dao would cost the airline the least, compared with chucking out, say, a family or someone who would need an overnight stay.
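To see how such a selection might work in principle, here is a minimal Python sketch of the kind of ‘lowest-value customer’ scoring the column describes. The field names, weights and scoring rules are entirely hypothetical illustrations, not United Airlines’ actual system.

```python
from dataclasses import dataclass

@dataclass
class Passenger:
    name: str
    fare_paid: float          # what the passenger paid for the ticket
    loyalty_tier: int         # 0 = no rewards programme; higher = more valued
    is_business_class: bool
    rebooking_cost: float     # estimated cost to re-accommodate (hotel, meals, etc.)

def customer_value(p: Passenger) -> float:
    """Hypothetical score: higher means costlier for the airline to bump."""
    score = p.fare_paid
    score += p.loyalty_tier * 100    # assumed weight for loyalty status
    if p.is_business_class:
        score += 500                 # assumed premium-cabin weight
    score += p.rebooking_cost        # bumping someone needing a hotel costs more
    return score

def pick_for_deboarding(passengers, n=4):
    """Return the n passengers the model deems cheapest to remove."""
    return sorted(passengers, key=customer_value)[:n]

if __name__ == "__main__":
    cabin = [
        Passenger("coach, no rewards", fare_paid=120, loyalty_tier=0,
                  is_business_class=False, rebooking_cost=50),
        Passenger("frequent flyer", fare_paid=140, loyalty_tier=3,
                  is_business_class=False, rebooking_cost=50),
        Passenger("business class", fare_paid=600, loyalty_tier=2,
                  is_business_class=True, rebooking_cost=50),
        Passenger("needs overnight stay", fare_paid=120, loyalty_tier=0,
                  is_business_class=False, rebooking_cost=300),
    ]
    for p in pick_for_deboarding(cabin, n=1):
        print("Selected:", p.name)   # the lowest-scoring passenger gets bumped
```

Notice that nothing in the score asks whether the outcome is fair; the model simply minimises the airline’s cost, which is exactly O’Neil’s point.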

But Dao had duly paid for his ticket. So, where are his rights?

Therein lies the rub. Algorithms, unlike a human selector, do not worry about fair play. They are designed to safeguard the commercial interests of the companies that deploy them. In the chaotic, complex world of big data, where algorithms dissect, correlate and analyse every action of the consumer to arrive at judgments that help companies enhance profits and, in some cases to be fair, offer better services, the customer’s value is whatever his data reveals. And in this interplay of data and profits, consumer rights often take a back seat.

But they always say data can’t be prejudiced!

Well, data is what you want it to be. As O’Neil’s illustrative work, Weapons of Math Destruction, has shown, big data analytics can abet inequality and threaten democracy. UA’s case is a classic example. Here, thanks to big data, the customer has become just a marketing category. He can be measured on myriad metrics, and each metric is designed according to the needs (read: whims and fancies) of the companies. Such classifications tag consumers differently based on their ability to generate revenue, and, as a result, their ‘rights’ get redefined too, as we saw with David Dao vs United Airlines.

Sigh! This is a dormant volcano.

You bet! The sad part is that such behaviour is slowly becoming the norm. Today, technology firms help businesses scan data from people’s gadgets to put a ‘price’ on them as customers and customise products and benefits accordingly. That’s plain preferential treatment. It also means that the next time you call customer care, your ‘value’ (as computed by big data tools) may determine how quickly you get connected. Such issues are now being contested widely in advanced countries.
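Here, similarly, is a minimal sketch of how such value-based call routing could work. The scores and the queueing rule are assumptions for illustration, not any specific firm’s system.

```python
import heapq

def route_calls(callers):
    """Serve callers in descending order of their (hypothetical) value score.

    callers: list of (value_score, caller_id) tuples.
    """
    # Negate scores because heapq is a min-heap; the highest value pops first.
    queue = [(-score, caller_id) for score, caller_id in callers]
    heapq.heapify(queue)
    while queue:
        neg_score, caller_id = heapq.heappop(queue)
        print(f"Connecting {caller_id} (value score {-neg_score})")

route_calls([(170, "coach, no rewards"),
             (1350, "business class"),
             (490, "frequent flyer")])
```

The business-class caller gets connected first and the low-score caller waits longest, purely as a function of revenue potential.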

So, what’s the fix?

This trend is just starting, so we are not up against heavy odds. For a start, policymaking should humanise big data by inserting clauses on fair play and the need for a level playing field in business.

A weekly column that helps you ask the right questions
