
Book extract: Rebel Ideas by Matthew Syed

United Airlines Flight 173 took off from Denver on 28 December 1978, bound for Portland, Oregon. Everything went smoothly until the final approach. The captain pulled the lever to lower the landing gear, but instead of the smooth descent of the wheels there was a loud bang. The light that should have illuminated to confirm the landing gear was down and locked stayed dark. The crew couldn’t be sure that the wheels were down, so the captain put the plane into a holding pattern while they attempted to troubleshoot the problem.

They couldn’t see below the plane to check whether the wheels were down, so they conducted proxy checks. First the engineer went into the cabin: when the landing gear slides down into place, two bolts shoot up above the wingtips, and these bolts were indeed up. They then contacted the United Airlines Control Center in San Francisco to talk through what had happened, and were advised that the wheels were probably down.

But the captain still wasn’t certain. What had caused that loud bang? Why hadn’t the light on the dashboard illuminated? Landing without the wheels in place can generally be achieved without loss of life, but it carries risk. The captain, a decent man with long experience, didn’t want to place his passengers in unnecessary danger. He began to wonder whether the light had failed to illuminate because of faulty wiring. Or perhaps it was simply a faulty bulb.

However, as he deliberated and the plane continued in its holding pattern, a new danger had come into play. The plane was running out of fuel. The engineer knew that the fuel was critical: he could see it disappearing on the gauge before his eyes. He also had a powerful incentive to alert the pilot: his life, and the lives of everyone on the plane, were on the line.

But this was the 1970s. Aviation culture was characterised by a dominant hierarchy. The pilot was called “sir”. The other crew members were expected to defer to his judgements and act upon his commands. It’s what sociologists call a “steep authority gradient”. Had the engineer voiced his concerns about the fuel, it might have implied that the pilot wasn’t on top of all the key information (which he wasn’t). It might have been perceived as a threat to his status.

By 17:46 local time, the fuel had dropped to five on the dials. This was now an emergency. Almost 200 lives, including that of the engineer, were in severe danger. The pilot was still focused on the bulb, oblivious to the dwindling fuel; his perception had narrowed. You might suppose that the engineer would have said: “We have to land now! Fuel is critical!” But he didn’t. We know from the cockpit voice recorder that he merely hinted at the problem: “15 minutes is gonna really run us low on fuel here,” he said.

The engineer was so fearful of directly challenging the captain that he softened his language. The captain interpreted his remarks as meaning that while the fuel was going to get low as they circled again, it wasn’t going to run out. This was incorrect, and the engineer knew it. Even at 18:01, when it was probably too late, and with the captain now focused on the plane’s antiskid system, the engineer and first officer were still struggling to state the problem clearly.

It wasn’t until 18:06, with the engines flaming out, that they finally made the information explicit, but it was too late. They had gone past the point of no return, not because the team lacked the information, but because it wasn’t shared. The plane crashed minutes later, piling into a wooded suburb, ploughing through one house and coming to rest upon another. The lower left side of the fuselage was completely torn away. On a clear evening, with the airport visible from the moment they entered the holding pattern, ten people died, including the engineer.

An extract from Rebel Ideas: The power of diverse thinking by Matthew Syed, winner of the 2020 CMI Management Book of the Year Award
