Algorithm Problem

April 17th, 2017, in Articles, Ethics, Societal Impact

United Airlines has been having its problems since it recently ejected a passenger to make room for crew members who needed to reach their next flight. As the Wall Street Journal article points out, this is a result (in part) of employees following a fairly strict rule book, i.e., a decision algorithm. In many areas, from safety to passenger relations, United has rules to follow, and employee (i.e., human) discretion is reduced or eliminated. It is somewhat ironic that the employees who made the decisions that led up to this debacle could have been fired for not taking this course of action. But how does this relate to Technology and Society?

Two immediate technology considerations become apparent. The first is automated reporting systems. No doubt the disposition of every seat, passenger, and ticket is tracked, along with who made what decisions. This means that an employee who does not follow the algorithm leaves a record and may be detected and reported. In the good old days a supervisor could give a wink and a smile to an employee who broke the "rules" but did the right thing. Today the technology is watching, and increasingly the technology is comparing the data with history, rule books, and other data.
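To make the point concrete, here is a minimal sketch, in Python, of how such a reporting system might flag deviations. Everything in it, from the DecisionRecord structure to the rule-book entries, is invented for illustration and does not describe any airline's actual software.

from dataclasses import dataclass

@dataclass
class DecisionRecord:
    employee_id: str
    situation: str     # e.g., "oversold_flight"
    action_taken: str  # what the employee actually did

# The "rule book" as a lookup table: each situation has one sanctioned action.
# These entries are hypothetical.
RULE_BOOK = {
    "oversold_flight": "deny_boarding_lowest_fare",
    "late_crew_connection": "reassign_seats_for_crew",
}

def flag_deviations(log):
    """Return every logged decision that departs from the rule book."""
    return [
        rec for rec in log
        if RULE_BOOK.get(rec.situation) not in (None, rec.action_taken)
    ]

log = [
    DecisionRecord("e42", "oversold_flight", "offer_more_compensation"),
    DecisionRecord("e77", "oversold_flight", "deny_boarding_lowest_fare"),
]
for rec in flag_deviations(log):
    print(f"Employee {rec.employee_id} deviated: {rec.action_taken}")

In this toy version, employee e42's humane improvisation is exactly what gets flagged; the supervisor's wink and smile has no place in the data.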

The second aspect is "gate attendant 2.0." This is when we automate these humans out of their jobs, or into less responsible "face-keepers" (i.e., persons present only to provide a human face to the customer while all of the actual work and decisions are automated, a term coined by analogy with "place-keeper"). Obviously, if there is a "rule book," it will be asserted in the requirements for the system, and exact execution of the rules can be accomplished. It is possible that passengers will respond differently if a computerized voice or system informs them of their potential removal, realizing there is no "appeal." However, it is also possible that an AI system spanning all of an airline's operations, aware of all flight situations and of past debacles like this one, may respond in a better-informed way. The airline might go beyond the simple check-in, frequent-flyer, and TSA passenger profiles to Facebook activity, credit scores, and other data when deciding whom to "bump." One can envision bumping passengers with lower credit ratings, or whose Facebook psychological profiles indicate that they are mild-mannered reporters, or shall we say "meek."
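A toy sketch of the kind of scoring rule gate attendant 2.0 might apply follows. Every field, weight, and data source here (the credit score, the imagined "meekness" signal, and so on) is an assumption made purely for illustration; no airline is known to operate this way.

from dataclasses import dataclass

@dataclass
class PassengerProfile:
    name: str
    fare_paid: float          # dollars
    frequent_flyer_tier: int  # 0 = none, 3 = top tier
    credit_score: int         # 300-850
    meekness: float           # 0.0-1.0, hypothetical profiling signal

def bump_priority(p: PassengerProfile) -> float:
    """Higher score = more likely to be bumped (lower cost to the airline)."""
    return (
        -0.002 * p.fare_paid           # cheap tickets are cheaper to refund
        - 1.5 * p.frequent_flyer_tier  # protect loyal customers
        - 0.005 * p.credit_score       # the troubling credit-score idea
        + 3.0 * p.meekness             # least likely to make a scene
    )

passengers = [
    PassengerProfile("A", 650.0, 3, 810, 0.2),
    PassengerProfile("B", 180.0, 0, 580, 0.9),
]
# Bump the highest-priority passenger first.
to_bump = max(passengers, key=bump_priority)
print(f"Selected for bumping: {to_bump.name}")

Note that the low-fare, low-credit, "meek" passenger B wins the bump in this example, which is precisely the concern: the optimization is perfectly rational from the airline's side and plainly discriminatory from the passenger's.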

The ethics programmed into gate attendant 2.0 are fairly important. They will reflect the personality of the company, the prejudices of the developers, the wisdom of the deep-learning processes, and the cultural narratives of all of the above.
