Category Archives: Design

Why I Watch Airplane Disaster Documentaries

One of the few television shows I watch religiously is Air Disasters. In the US it currently airs on the Smithsonian Channel and many episodes are available on YouTube. Many if not most of the episodes were produced in Canada, where it goes by the name of Mayday.

I don’t watch it out of morbid curiosity. Indeed, my favorite episodes are those where everyone, or at least some individuals, survive the accident. I watch Air Disasters because its episodes are fantastic lessons in causality—almost every airplane crash is a concatenation of unusual circumstances. The show also offers remarkable insights into human performance under stress and human error.

Almost every accident contains an element of human error—not necessarily pilot error but nevertheless some type of human miscue caused by stress, exhaustion, unfortunate personal characteristics, or a poorly engineered process or system. The aviation industry to its great credit—spurred on by the usually stout oversight and regulation provided by government agencies responsible for aviation—has continuously revised its practices in light of the many lessons learned. As a result, airlines around the world have implemented strict rules governing the work and rest patterns of pilots and the interpersonal dynamics of cockpit crews.

The Intelligence Community could learn a lot about the performance of its workforce from the aviation industry. Indeed, watching the documentaries has led me to “appreciate” the considerable flaws of the IC’s work assumptions and practices. As the aviation industry has learned over the last 100 years, humans perform much better when they are positioned for success. So here are some lessons and concepts from the aviation industry that the IC should pay attention to. They are in fact relevant for anyone involved in difficult and/or risky work that must be reliably performed at a consistently high level.

  • The Startle Factor. Almost all flights are routine. But when something does go wrong, it often starts with a surprise—something happens to the plane or during the flight that the pilots had never previously experienced. The Startle Factor refers to how the human body and brain respond to such a surprise. Your heart races, your palms sweat, and your rational brain slows down. Instincts—for good or bad—may take over. Boeing assumed that an average flight crew would need only a few seconds to recognize that the 737 MAX’s new flight control software was malfunctioning and turn it off. But in the two crashes that grounded the 737 MAX, the crews were startled by the unexpected behavior of the plane, their responses were delayed or incorrect, and hundreds lost their lives.

    Intelligence officers can often find themselves in surprising predicaments. Does the IC take the startle factor into account when estimating the risk of certain operations? Even in the admittedly less dangerous work of the intelligence analyst, officers can be startled by new, unexpected information, leading them to misinterpret or ignore it.
  • The Importance of Sleep and Good Rest. Commercial airlines have strict rules about how many hours flight crews can work before they must rest. I imagine most of us have experienced a flight being cancelled because the crew has “timed out.” These rules reflect hard lessons learned about how poor rest and lack of sleep can degrade the cognitive performance and judgment of pilots. Every time I watch an episode where crew exhaustion was a factor, I think about how my old agency, the CIA, ran task forces during crises. 12-hour shifts were common. I remember during the first Iraq war having to work six 12-hour shifts per week. The aviation industry learned long ago that “people just have to tough it out” is not a useful strategy. IC agencies need to rethink the protocols associated with working during periods of crisis.
  • Hierarchy Can Kill You. Traditionally the captain and the first officer in commercial aviation were in a command-and-obey relationship. But captains are not infallible, and several fatal accidents could have been avoided if the first officer had been listened to. Often the captain never had the chance to “hear” the other view because the first officer never verbalized his concern. The respect for hierarchy was so paralyzing that first officers deferred to wrongheaded captains even when it led to certain death. These accidents became so concerning for the aviation industry that airlines instituted mandatory crew resource management procedures, which emphasize the importance of collaboration and teamwork in the cockpit.

    When I started at CIA, it seemed to me that many of the most legendary leaders celebrated in agency lore were known for their authoritarian styles. Ugh! Strong leaders did not second guess themselves, always knew exactly what to do, and never tolerated backtalk. Somehow, we managed to do good things despite a flawed leadership tradition, and I’m happy to report that the agency’s leadership approach evolved while I was there. But there is still much more that could be done to improve our “crew resource management.”
  • Assholes Can Kill You. One of the most compelling and tragic airplane disasters is the story of Northwest Airlink Flight 5719, which crashed in Minnesota in 1993, killing 18 people. The captain was known to have a temper, often lashing out at airline employees and belittling and intimidating his first officers. The investigators surmised that the first officer, who had been mocked throughout the flight, did not speak up to correct the captain about his too-steep descent. Toxic leaders are so harmful and intimidating that a person can choose death rather than confrontation.
  • Even the Smartest Person in the Room Can Screw Up. Korean Air Lines Flight 007 was shot down in 1983 after it strayed into Soviet airspace over the north Pacific Ocean. I was at CIA at the time, and I remember how incredulous we were and how scary the incident was during a period of heightened Cold War tensions. The actual cause was a mystery for more than ten years because the black boxes were not made available to investigators until 1992; the Soviets had recovered them and kept them locked away. When the flight data and voice recordings were analyzed, investigators concluded the veteran crew had failed to correctly set the plane’s navigation system, leading the 747 to drift north of its flight plan and into Soviet territory. Navigational and communication anomalies occurred during the flight that should have alerted the crew to their error, but they apparently didn’t pay attention. The captain was a respected and experienced veteran. And he made a fatal mistake.

    Expertise-driven organizations have to appreciate that expertise carries its own blinders and is not foolproof. Long and tedious routine—such as what occurs during a long flight—can also numb the intellect of even the smartest individual.
  • Checklists Are Useful. One way to guard against the various blind spots of expertise and the inevitability of mental errors is to incorporate mandatory checklists into flight procedures. Too many airplane accidents have been caused by a cockpit crew overlooking or forgetting an essential step for flight, such as setting the flaps. When something goes wrong with a plane, crews consult extensive checklists, although until recently these were printed on paper, leaving an increasingly frantic crew member paging through a binder trying to find the right section. (Luckily, checklists are automated on newer planes.)

    When I was still at CIA I would imagine what an analyst’s checklist would look like. Perhaps even a “TurboTax”-style application that would make sure analysts considered all the wrinkles when producing an analytic product. I thought we could come up with a workable model, although it did worry me that, as an unintended consequence, analysts might react by behaving more like automatons than thinking human beings. With the arrival of ChatGPT and other artificial intelligence engines, my idea has perhaps been overtaken by events.
  • Distraction. Even the most competent cockpit crews can make egregious mistakes when they are distracted. Humans just aren’t that good at handling multiple tasks. A classic example is Eastern Air Lines Flight 401, which crashed in the Florida Everglades in 1972 when the pilots, trying to determine whether their landing gear was properly down, failed to notice they had disengaged the autopilot and were rapidly losing altitude.

    Many organizations, not just the Intelligence Community, have the habit of piling additional responsibilities onto teams without taking any away. This piece of advice was popular when I was at CIA: if you want something done, ask a busy person to do it.

  • Human/Technology Interaction. Technological advances have made commercial aviation the safest way to travel. And yet, as the Boeing 737 MAX crashes show, technologies that make ill-informed assumptions about how humans will react in unusual circumstances can create more and deadlier accidents. As planes become more advanced, the possibility of misjudging human interaction with technology grows. Another dynamic is that growing cockpit automation can lead pilots to lose touch with their “analog” flying skills. Some airlines have lowered their flying-experience requirements to address the pilot shortage, reasoning in part that advanced cockpit automation now handles most piloting duties.

    These are dangerous trends. There’s no doubt in my mind that advanced technologies will continue to replace human labor in many scenarios, including some of the more difficult tasks that humans perform. But as this process unfolds, we have to be clear about how reliance on technology can dull human talent and senses to the point that we become incapable of dealing with the unexpected concatenation of circumstances on which the software was never trained.

  • Who’s Accountable? The final lesson I’ve learned is how to think about “accountability” in complex systems. As airline crash investigators know, many airplane accidents involve a chain of unlikely events, any one of which would rarely occur on its own. A supervisor decides to pitch in and help his overworked maintenance team by removing a set of screws. The maintenance team isn’t able to finish the job and doesn’t know to replace the screws. Nevertheless, the plane makes many safe takeoffs and landings until a pilot decides to make an unusually fast descent. The pilot and all the passengers die.

    Who exactly is accountable here? Is it the supervisor who tried to be helpful? Or the airline management that under-resourced its maintenance operations? Or the pilot? In many organizations, holding someone “accountable” is the signature move of “strong leaders.” But what often happens is that some unfortunate individual is blamed for what was a systemic failure of the organization—often driven by complacency, expediency, and/or greed.
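The “TurboTax”-style analyst checklist imagined above could be sketched very simply in code. This is purely my own illustration under assumed checklist items (none of these come from an actual IC standard); the point is just that a mandatory, explicit list of steps is a trivially small mechanism compared to the errors it guards against:

```python
# A hypothetical sketch of an analyst's pre-publication checklist.
# Every item below is an illustrative assumption, not an actual IC standard.

CHECKLIST = [
    "Key judgments separated from supporting evidence",
    "Alternative hypotheses considered and documented",
    "Source reliability noted for each key claim",
    "Confidence level stated for each judgment",
    "Assumptions that would change the conclusion flagged",
]

def review(completed):
    """Return the checklist items not yet marked complete."""
    return [item for item in CHECKLIST if not completed.get(item)]

# Example: an analyst who has done everything except the last item.
done = {item: True for item in CHECKLIST[:-1]}
print(review(done))  # -> ['Assumptions that would change the conclusion flagged']
```

The design choice mirrors the cockpit version: the checklist is dumb on purpose. It doesn’t judge the quality of the work, it only refuses to let a step be silently skipped.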

The aviation industry’s motivation to eliminate airplane crashes has created a strong safety and lessons-learned culture, but as the experience with the 737 MAX shows, positive outcomes depend upon persistent vigilance. The Intelligence Community has long claimed that what it does is unique and that lessons learned from other industries are not always applicable. But the human being remains the same: we don’t employ unicorns but rather just normal folk, who can make mistakes, who need sleep, and who perform best when they’re positioned for success.

The Glass Edge: Lots of Pictures

My First Week with Glass

Positive Impressions

  • Many serendipitous conversations.

This employee at Bed Bath &amp; Beyond turned out to have a very interesting background. I almost got it on my first guess.

The shoppers at the local HEB in Texas


Two charming young girls, and their cool Dad, at the local steakhouse in Texas


A collage of pictures I’ve taken. You can’t zoom with Google Glass yet, which means some people don’t even know they’re in the frame. I also like the non-posed quality of some of the shots.


Overall, there is a more honest quality to many of the pictures.

  • People are comfortable speaking to me when I’m wearing Google Glass.

Huge surprise here. People of all ages have been very relaxed. We talk for many minutes, and it’s clear they’ve forgotten about the glasses or at least processed their presence. Kids of course are no problem. Texans (I’ve been here this weekend) are quite enthusiastic. Even my 78-year-old mom was happy to try Google Glass and ended up reading a text on it wishing her a happy birthday yesterday.


Of course many people pretend not to notice them.

  • Even if you have imperfect vision, you can make them work.

I wasn’t sure how Google Glass was going to work for me, as I wear glasses and have a particularly weak right eye, and at least as of now the prism screen sits only over your right eye. But I’m happy to report I don’t have any problems using it or reading simple text (which is all you’ll ever see, really), and I’m actually hopeful the new exercise for my right eye will finally get it to pull its weight.

Downsides

  1. They get warm, perhaps even hot. You tend to forget that Google Glass is a small computer, and when you ramp it up to do harder things—like recording a video—it heats up. Just a bit uncomfortable, but I have thick curly hair so I’m padded.
  2. I really wish they bent in the middle like regular glasses. You can’t easily tuck them into the V of your shirt. As I said, I need to wear regular glasses in many situations, so I’m constantly swapping them with my Google Glass. Unless I’m going to put them away in their nifty carrying case (the size of a quality paperback), I’m left to put them on top of my head. Because the right arm of Glass is heavy, the slight pressure on my head tends to give me a little headache, the exact same kind I would get in my youth when I tried to wear headbands.
  3. Short and mysterious battery life. It’s not always clear what’s drawing the power or why the power level can drop precipitously at certain moments.
  4. Touchpad and voice controls are both uncertain. I’m much better with them than I was a week ago, but both are still quite buggy. Voice control can’t seem to distinguish nuances among words, so there are just some things you can’t make it understand.

Stay tuned for more reports from the Glass Edge.

Being Puerto Rican–A Network Analysis

I was born in Puerto Rico and I try to go back every year for at least a week. This is one of those weeks. If you haven’t been there, it is a diverse and beautiful island with some of the best examples of karst topography in the United States and hundreds of miles of diverse coastline, from pure white beaches to dramatic bluffs. Most people who just visit San Juan never get to experience this diversity and may come away with little appreciation for the Puerto Rican people and culture.

Speaking about Puerto Rican people and culture reminds me of how I learned that “being Puerto Rican” had (has?) a certain malodor, at least in some major cities of the East Coast. I came to live as an adult in DC as a transfer college student. Previously I had been in Texas for 8 years, which is why I consider myself a Puerto Rican by birth and a Texan by nationality. And I had no real concept of what being Puerto Rican meant on the East Coast. (To this day when I meet someone new, the chances are 50-50 that, when they find out I’m Puerto Rican, they will just assume I grew up in NYC. Conversely, I take great pains to rescue them from that idea, which reveals my own prejudice as well.) Anyway, when I got to Catholic University, I started working in the dining hall and I just innocently told my coworkers I was Puerto Rican. And this fellow student from Connecticut advised me: “If I were you I wouldn’t tell too many people that, cuz where I come from Puerto Ricans are lazy and dirty.” (To this day I have a prejudice against all things Connecticut, including always rooting against its excellent basketball teams. But it can’t be helped.)

Once you get away from San Juan it becomes clear how different the island is from the States. If Puerto Rico ever became a state, it would be, by a 50% factor, the poorest state in the union. The economy as currently structured does not generate enough jobs for the young men and women who live there. So everywhere you go you see large clumps of young people, mostly men, mostly doing nothing. Another indicator of the absence of youth engagement is the large pied-piper party trucks and vans that bounce along the roads, topped by six-foot speakers, blaring out rhythms enticing all to join them at the best place to party that night or that weekend. My mom, observing all this, said, “We Puerto Ricans are lazy.” I said, “I don’t think so. But there is certainly a messed-up rewards structure here.” When there is nothing to work for, we humans, being reasoning animals, often choose not to work. Sensible.

In terms of manmade structures, most of the island is just a mess. For example, there appear to have been no zoning laws for much of recent history, leading to some of the ugliest stretches of suburban blight I’ve ever seen. And, in my opinion, the fast food industry appears to have been allowed to run roughshod over the local food culture (which, truth be told, wasn’t too healthy either, but at least there was less of it usually served). What I find really appalling are the huge billboards all over the island, tempting you to “take a break” by eating about 2,000 calories of carbohydrates and fat. Diabetes is, of course, an epidemic.

So as I drove around the island, visiting relatives (I thought it would be impossible to drive 400 miles in one day in PR, but I am here to attest it can be done), I realized the entire island is a very good example of a very poorly designed network. The risk/rewards structure facilitates nonproductive behavior. Lack of design thinking at the very beginning leads to consequences one could have foretold. (Is there a design checklist out there? There must be. That actually reminds me of another lesson, #20, from my time as a CIA manager: the importance of checklists. Most serious professionals I’ve known are offended by the concept of checklists; we’re above that, they say. As a seriously flawed human being, however, who claims very little expertise, I love checklists. But I digress….)

The road “system” is the most obvious if not the best example of a bad network. Intersections between major highways are particularly clumsy, leading to creative workarounds by drivers. I wish I had a picture to show you, but of course I’m always driving at the time. The lesson is clear: people will adjust themselves to even the most badly designed network. But making the most of a bad situation allows for survival, not prosperity.

My relatives in Puerto Rico are real characters. A real highlight is Juana, my mother’s cousin, I think, although determining exact family connections is a challenge given the alarming rate at which people recouple, at least in my family, and produce half-brothers and half-sisters. Anyway, here’s Juana telling us one of her many wonderful stories. I particularly liked the one where she went to visit the grave of her beloved husband (I was going to write “deceased husband,” but thought better of it given that’s why he’s in the grave in the first place). Anyway, she’s at the gravesite when a sudden gust of wind blows her down (that wind was substantial!) and deposits her exactly alongside the burial site of her departed Gabriel, precisely where she will lie when she eventually joins him.

And here are some of her beautiful orchids.