Non-technical glitches

C Gopinath | Updated on May 12, 2019

Managerial oversight in the quest for profits

The mining industry is an important one in Brazil, and Vale SA is among the giants operating in that sector. On January 25, one of its waste dams broke, and the resulting flood of contaminated mud killed about 270 people, many of them company employees. Questions were soon raised: was it an accident beyond anybody’s control, a technical fault, or the result of deep-rooted human failures that could have been avoided?

The dam in question is what is called an upstream dam, a common design for storing tailings, or mine waste. Vale reportedly first hired Tractebel, a Brazilian subsidiary of the French company Engie SA, to certify the dam’s safety, but the firm refused, citing concerns over stability. Vale then hired TÜV SÜD, a German certification group, which cleared it. Subsequent investigations revealed that the certification was issued even though the German company had reservations about the dam’s safety; it did not want to displease a major client and lose future contracts.

If an inspector is paid by the organisation he is certifying, the conflict of interest is bigger than any gap in the dam he might find. Reports also suggest that Vale’s senior management was aware of the dam’s problems, since company procedures require that it be informed of safety issues. So we had a faulty certification system combined with a managerial failure.

Boeing Co., the US aircraft manufacturer, suffered two crashes of its 737 MAX jet in quick succession: one in October 2018 in Indonesia and the other in March 2019 in Ethiopia, which together killed about 350 people.

The culprit was identified as a new automated flight-control system that responded erroneously to data it received from sensors and pushed the aircraft into a dive. Boeing initially denied that its system was at fault and pointed the finger at pilot error and maintenance deficiencies. Investigations subsequently revealed problems in the design of the system and deficiencies in how Boeing trained pilots, and showed that management, though aware of the software problems, did not share its concerns with its airline customers.

The whole episode has tarnished the reputation of the Federal Aviation Administration, a US agency held in high regard around the world. Foreign regulators began grounding the planes even while the FAA sang the same tune as Boeing. It now transpires that, over time, the FAA had been allowing Boeing to self-certify some aspects of design changes in its aircraft. So, in its rush to sell more planes, Boeing took shortcuts. Again, a system failure combined with a managerial failure.

In both these cases, legitimate technical concerns were overridden by managers who took risks in their desire to meet organisational goals. Organisations complain every day, rightfully, that the regulatory system stifles innovation and enterprise and adds to costs.

Yet the same managers, when their decisions potentially put the public at risk, failed to meet the standards expected of them. We have seen this scenario before: in 2008, US credit rating agencies, ignoring conflicts of interest, gave high ratings to dodgy securities for the banks that paid them; the banks in turn sold those securities to a trusting public and brought down the financial system. We may fix systems and try to make them fail-proof, but where do we fail: in training these highly paid managers, or in the values we inculcate in them?

The writer is a professor at Suffolk University, Boston

