In the historic Gimli flight, an Air Canada plane ran out of fuel mid-flight and made an emergency landing. Faulty software failed to report that the plane was running out of fuel. This is a simple case of negligently designed software putting lives at risk, right?
Perhaps there's more to the story. The second paragraph of Wade H. Nelson's article indicates that the software's malfunction occurred in conjunction with several mistakes by technicians and flight crew, including a poorly soldered sensor. The designer of the FQIS fuel-monitoring software was responsible for testing the software rigorously, but it is impossible to test every combination of circumstances that could happen in the field.
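To get a feel for how hopeless exhaustive testing is, consider a back-of-the-envelope count. This is only a sketch with hypothetical numbers (a real fuel-quantity system has far more state than a few on/off inputs), but even a modest system outruns any practical test budget:

```python
# Back-of-the-envelope illustration of why exhaustive testing is infeasible.
# All numbers are hypothetical; a real fuel-quantity system has far more state.
binary_conditions = 64        # imagined on/off inputs: sensor faults, switch states, ...
combinations = 2 ** binary_conditions          # about 1.8 * 10**19 distinct cases

tests_per_second = 1_000_000                   # an optimistic automated test rate
seconds_per_year = 60 * 60 * 24 * 365
years_needed = combinations / tests_per_second / seconds_per_year

print(f"{combinations:,} input combinations")
print(f"roughly {years_needed:,.0f} years to run them all at {tests_per_second:,} tests per second")
```

Even at a million tests per second, those 64 imaginary inputs alone would take hundreds of thousands of years to cover, and that ignores analog readings, timing, and hardware faults.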
But if software purports to fill a vital purpose, shouldn't its designer take financial responsibility if it fails to do so? Of course! Now, does responsibility for the Gimli crash rest fully upon the software failure? Let's investigate. Flatrock.org's article, paragraphs 1-3, explains that the ground crew and pilots knew that the plane's FQIS software didn't work, and that they deployed the plane anyway, relying on old fuel-gauging techniques. While using those techniques, they made an error in the conversion between pounds and kilograms. The pilots and ground crew chose to ignore the software and take the matter of measuring fuel into their own hands. It was because of their negligence that the plane ran out of fuel in midair.
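To see why that unit mix-up matters so much, here is a rough illustration. The figures below are hypothetical (the post does not give the actual readings), assuming only that jet fuel weighs roughly 0.80 kg per litre, which is about 1.77 lb per litre:

```python
# A minimal sketch of the pounds-versus-kilograms mix-up.
# Illustrative numbers only -- not the actual readings from the flight.
KG_PER_LITRE = 0.80   # typical density of jet fuel, in kilograms per litre
LB_PER_LITRE = 1.77   # the same density expressed in pounds per litre

litres_in_tanks = 8_000                        # hypothetical dripstick reading

actual_kg = litres_in_tanks * KG_PER_LITRE     # fuel really on board: 6,400 kg
apparent_kg = litres_in_tanks * LB_PER_LITRE   # figure recorded as "kg": 14,160 (really pounds)

# Multiplying litres by a pounds-per-litre factor and treating the result as
# kilograms makes the tanks look about 2.2 times fuller than they really are,
# so far too little fuel is added before departure.
print(f"actual fuel on board:    {actual_kg:,.0f} kg")
print(f"fuel the crew believed:  {apparent_kg:,.0f} kg")
print(f"overstatement factor:    {apparent_kg / actual_kg:.2f}")
```

An error of that size doesn't shave a margin off the reserves; it leaves the aircraft with less than half the fuel the crew thought they had.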
If the crew had reported the FQIS malfunctions when they first appeared, the software designer would have owed Air Canada only the cost of having one plane out of commission during the repair. Accountability for that cost would have rested squarely on the designer of FQIS, and no emergency landing at Gimli would have occurred. As such, the designer of FQIS should not pay for the entire cost of the Gimli crash, but only that hypothetical repair-time cost.
3 comments:
it is impossible to test every combination of circumstances that could happen in the field
- How many tests would it take to test every combination of circumstances that could happen in the field?
(hint: computers can't count that high, and there wouldn't be enough time to complete the tests before the technology being tested became obsolete)
The pilots and ground crew chose to ignore the software and take the matter of measuring fuel into their own hands.
It wasn't the pilots' choice how that situation was handled. The procedure would be specified in an approved document called a Minimum Equipment List, and the pilots and engineers have to follow it. The procedure included manually checking the fuel level in the tanks. The mistake occurred when the engineer used the wrong conversion factor. I don't see how that has anything to do with software.
Anonymous has brought up a point that I hadn't noticed. If the pilots really were required to fly the plane anyway, then their accountability is passed up the line. Air Canada, Boeing, the hardware company, the software company, the individuals within each...it's difficult to place blame when so many people have had a hand in the flight's complications.