Agnostic.com


Future Tense newsletter: A Darkly Comical Programming Error Shows the Human Assumptions Behind Code

Drop-down menus on online forms: They could be the death of us.

On Thursday, journalist and programmer Dan Nguyen pointed out a darkly comic incident report connected to a flight over the summer. During a pandemic hiatus, an unnamed airline operator upgraded the system that helps ensure an accurate load sheet—the document tabulating the estimated weight of the passengers, crew, cargo, etc. The problem: The new system included titles and assumed that anyone listed as “Miss” must be a child. The incident report for the July 21 flight from Birmingham, England, to Mallorca, Spain, says:

“The system allocated them a child’s standard weight of 35 kg as opposed to the correct female standard weight of 69 kg. Consequently, with 38 females checked in incorrectly and misidentified as children, the G-TAWG takeoff mass from the load sheet was 1,244 kg (2742.551 pounds) below the actual mass of the aircraft.”

The report details that the crew noticed a discrepancy between the load sheet and a flight plan document that listed a higher weight total. “The commander recalled thinking that the number was high but plausible,” it says. So they took off. In the end, no one was injured—but things could have gone differently. How could this happen? According to the report, “The system programming was not carried out in the UK, and in the country where it was performed the title Miss was used for a child, and Ms for an adult female, hence the error.”
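As a rough illustration of how such a bug can arise (this is a hypothetical sketch, not the airline's actual code; the "Mr" weight, the dictionary, and the function name are assumptions, while the 35 kg child and 69 kg adult-female standard weights and the Miss/Ms convention come from the report), the error amounts to a title-to-standard-weight lookup written under one locale's naming conventions:

```python
# Standard passenger weights in kg, as given in the incident report.
CHILD_STANDARD_KG = 35
ADULT_FEMALE_STANDARD_KG = 69

# Title-to-weight lookup written under the programmers' local convention,
# where "Miss" denotes a child and "Ms" an adult female.
TITLE_TO_STANDARD_KG = {
    "Mr": 88,                            # illustrative value, not from the report
    "Mrs": ADULT_FEMALE_STANDARD_KG,
    "Ms": ADULT_FEMALE_STANDARD_KG,
    "Miss": CHILD_STANDARD_KG,           # correct in that locale, wrong for UK bookings
}

def load_sheet_passenger_mass(titles):
    """Sum the standard weights (kg) for a list of passenger titles."""
    return sum(TITLE_TO_STANDARD_KG[title] for title in titles)

# 38 adult women checked in as "Miss" are each undercounted by 34 kg.
shortfall = 38 * (ADULT_FEMALE_STANDARD_KG - CHILD_STANDARD_KG)
print(shortfall)  # 1292 kg, in the same ballpark as the report's 1,244 kg figure
```

The insidious part is that a lookup like this behaves correctly in the locale it was written for, so it can pass testing there and only misfire once bookings from a country with different title conventions flow through it.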

It’s a remarkable lesson in how cultural assumptions can be translated into code. It’s also a lesson in how humans work: Before the Mallorca flight, the error had been noticed, and workers were checking the bookings to manually change each relevant “Miss” to “Ms.” A software fix was also implemented, but: “A combination of the teams not working over the weekend and the ‘online’ check-in being open early on Monday 20 July, 24 hours ahead of the flight, meant the incorrectly allocated passenger weights were not corrected.”

Perhaps it will be a long time before robots take all of our jobs.

HippieChick58 9 Apr 19


5 comments



It shows just how lax aviation programming and operations are. This software really should be considered “man-rated,” because mistakes such as this one can cause deaths. OTOH, from what I have observed, nearly everything within the scope of the FAA is generally not even this well designed. I don’t fly and never will fly again, certainly not in US airspace.


AI deals with numbers and codes. It has no actual sense of weight the way a human does, because a human who lifts something can make judgements about its weight. For AI to make that kind of judgement, it would have to be given that job as well.


weigh luggage and humans


Didn't something really expensive blow up because a part was measured in metric instead of whatever it's called that the United States uses for measurement?


we all define things differently, i guess
