The Technium

Looking For Ugly



Preventing errors in extremely complicated technological systems is an elusive goal. The more complex the system, the more complex the pattern of error. But a curious thing happens in systems that are kept relatively error free: as major errors are prevented, it becomes harder to forecast future major errors — because so few happen! In these kinds of mission-critical systems, the genesis profile of a major failure may be unknown simply because major failures are so rare.

We would not know very much about how to cure a disease if all we could study were healthy people with minor ailments. Some life-critical technological systems — like a modern passenger jet — have zero tolerance for major errors. And so we do a very good job of preventing them. But because there are so few crashes we don’t have a large body of knowledge about how they happen. A US Congressional panel investigating the FAA policy of not punishing disclosure of minor safety errors made a very keen observation about technological systems:

It said that in fields where there were few accidents, the only choice to improve safety was to gather data on “accident precursors,” minor events that could add up to catastrophe. Such events, it said, were often known only to a few airline employees.

How do you prevent major errors in a system already built to keep major errors to a minimum? You look for the ugly.

The safety of aircraft is so essential that it is regulated, in hopes that regulation can decrease errors. Error prevention enforced by legal penalties presents a problem, though: severe penalties discourage disclosure of problems early enough for them to be remedied. To counter that human tendency, the US FAA has generally allowed airlines to admit errors they find without punishing them. These smaller infractions are the “ugly.” By themselves they aren’t significant, but they can compound with other small “uglies.” Oftentimes they are so minimal — perhaps a worn valve, or a discolored pipe — that one can hardly call them errors. They are just precursors to something breaking down the road. Other times they are things that break without causing harm.


The general agreement in the industry is that a policy of unpunished infractions encourages quicker repairs and reduces the chances of major failures. Of course, not punishing companies for safety violations rubs some people the wrong way. A recent Times article reports on the Congressional panel, quoted above, which is investigating whether this policy of unpunished disclosure should continue. The Times says:

“We live in an era right now where we’re blessed with extremely safe systems,” said one panel member, William McCabe, a veteran of business aviation companies. “You can’t use forensics,” he said, because there are not enough accidents to analyze.

“You’re looking for ugly,” Mr. McCabe said. “You ask your people to look for ugly.” A successful safety system, he said, “acknowledges, recognizes and rewards people for coming forward and saying, ‘That might be one of your precursors.’ ”

Looking for ugly is a great way to describe a precursor-based error detection system. You are not really searching for failure so much as for signs that failure will begin. These are less like errors and more like deviations, off-center in an unhealthy way. For some very large systems — like airplanes, human health, ecosystems — detection of deviations is more art than science, more a matter of beauty or the lack of it.
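
To make the idea concrete, here is a minimal sketch of what precursor-based detection might look like, assuming nothing more than a log of how many minor findings turn up each week. The function name, the data, and the thresholds are all illustrative, not taken from any real monitoring system.

    # A toy deviation detector: flag periods whose count of minor "ugly"
    # findings drifts unusually far above their own recent baseline.
    from statistics import mean, stdev

    def find_ugly(counts, window=8, threshold=2.0):
        """Return (index, z-score) pairs where a count sits more than
        `threshold` standard deviations above the trailing baseline."""
        flagged = []
        for i in range(window, len(counts)):
            baseline = counts[i - window:i]
            mu, sigma = mean(baseline), stdev(baseline)
            if sigma == 0:
                continue  # a perfectly flat baseline offers no scale
            z = (counts[i] - mu) / sigma
            if z > threshold:
                flagged.append((i, z))
        return flagged

    # Weekly counts of minor findings (worn valves, discolored pipes).
    # None of these is a failure; week 9 is merely "ugly."
    weekly = [3, 4, 2, 3, 5, 3, 4, 3, 4, 11, 5, 4]
    for week, z in find_ugly(weekly):
        print(f"week {week}: {weekly[week]} findings, {z:.1f} sigma above baseline")

Note that the detector never judges any single finding; it only notices when the background rate of small uglies drifts away from its own recent history. The deviation, not the event, is the signal.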

Come to think of it, looking for ugly is how we assess our own health.  I suspect looking for ugly is how we will be assessing complex systems like robots, AIs and virtual realities.




Comments
  • NGA

    Also interesting is the fact that ugly things are composed of very small beautiful things used in the wrong place at the wrong time, which end up creating an ugly object…

  • http://www.garreau.com Joel Garreau

    FYI, there is a growing movement away from the modernist notion that beauty is socially constructed, and toward the idea that it is something innate, and to be discovered, like the laws of physics. Also like the laws of physics, beauty in this view is thought to be not so much about objects, but the relationships between objects. The significance of this is that in a world of ever-increasing change, beauty may be a good horse-back way of determining the fitness of a system.

    I took a preliminary whack at these topics here:

    The Call of Beauty, Coming In Loud & Clear

    http://www.washingtonpost.com/ac2/wp-dyn?pagename=article&node=&contentId=A30489-2002Feb18&notFound=true

  • tim

    you already see this in architecture, a complex designed system, in both design and maintenance.

    also, in ecosystems, how do you tell what is “ugly” and what is evolutionary?

  • http://www.freshbooks.com Corey

    The thing about trying to do this in software development is that so much of the ugly comes from the social aspects of the development. Watching for over-specification, for scope creep and for delayed releases is probably like looking for worn valves on airplanes. They aren’t bugs and they aren’t system failures, but they’re signs that bugs and system failures are in your future.

  • http://www.foodsci.rutgers.edu/schaffner Don Schaffner

    This is the same problem we face in assuring the microbiological safety of the food supply. We only learn about failures when the system goes really wrong.

  • jackson

    Back in 1999, the New Yorker ran a lengthy article about the crash of an AA flight out of Miami. The conventional wisdom was that it was carrying oxygen canisters against regulations and that caused the fire that caused the accident. But the gist of the article was that no single error caused the accident. Instead, the system was so complex that a series of minor accidents/oversight failures/unforeseen interactions created a situation in which an accident of this magnitude was almost if not completely inevitable. Complex systems generate more complex monitoring. The complexity itself may be a cause of catastrophic failure. The challenge is understanding what may be a ‘minor precursor’ when what appear to be innocent actions may be disastrous in combination.

  • BWL

    To follow up on the comment by tim, since the evolution of technology closely mirrors biological evolution, ‘ugly’ is also necessary for evolution to occur, is it not? I think his comment about ecosystems applies equally well to technological ecosystems.