Category Archives: Knowledge Work

In Search of Insight

When I was a manager of analysis at CIA, I would hear our customers, often senior policymakers, demand more INSIGHT in our analysis. And I would go back and tell the analysts they needed to produce more insight. Until one day an analyst asked me “Well, what is INSIGHT?” And I thought to myself, that’s a good question; a “good question” is ipso facto a question for which you do not have a ready answer.

I put on my tattered thinking cap and worked to come up with some type of answer—a “formula for INSIGHT” that was reproducible and generated a set of actions that analysts could actually perform. I asked many colleagues to describe how they thought. That turned out to be another good question: almost nobody could describe their own thinking process.

“I read and then I write.”

What happens in between?

“I shake my head until some ideas fall out,” one analyst offered.

Eventually I came up with a formula—the steps of analysis—that I thought enough of to share with others. Like everything I do, it’s imperfect but hopefully it offers a starting point.

  1. COMPREHENSION. When we’re thinking about a problem, first we try to comprehend it. We assemble relevant information and consume it however we prefer.

  2. CATEGORIZATION. Once we’ve achieved some comfort in our level of understanding, the next step is categorization. We sort what we know into various categories and patterns. (Actually, this starts to happen organically during the Comprehension stage. This is unavoidable and can be the place where cognitive biases take root. Some information you consume early on colors how you think about every subsequent report, and you fall victim to the anchoring bias. I’ve always wanted to run an experiment where the same 100 pieces of information were presented to analytic teams, but in different orders. Would their analysis differ? My bet is yes!)

    The categories can be as simple as Old Information, New Information, but they eventually evolve into a complex taxonomy that forms the backbone of your Analytic Lines. These Analytic Lines are powerful beings and resist change. This is usually very bad.

  3. PRIMARY INSIGHT. INSIGHT occurs when you see things you’ve never seen before or in ways that are new to you. When an individual takes an item of information and argues that it belongs in a different category, you have produced a moment of INSIGHT. Recategorization of information is a way of generating INSIGHT. Is President Xi’s third term in China an indicator of his strength or of China’s weakness? The conventional wisdom probably is to categorize the event as the former but making a credible argument that it is the latter generates INSIGHT. The INSIGHT argument doesn’t have to be convincing; just provoking others to take a second look is useful.

  4. PROFOUND INSIGHT. A harder but more powerful way to generate INSIGHT is to renovate and/or rebuild your categorization schema. For example, analysts may realize that a significant amount of information remains uncategorized—it doesn’t easily fit the current taxonomy. Do you ignore it, or do you begin to experiment with new categories that might better explain the information? And at some point you can experiment with rethinking your categorization scheme—your Analytic Line—from scratch. To return to the China example, how best should we think of it—as an emerging superpower, as a declining power, or as a country destined for the middle-income trap? Each of these options generates a significantly different categorization scheme. (When your Analytic Line is long in the tooth, lots of information will no longer easily fit your existing categories. This is a “sell signal” for how you currently think about your problem, but not enough analysts recognize it as such.)
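The ordering experiment I wished for under Categorization can be mocked up as a toy simulation. Everything below is invented for illustration, not a model of real analysts: I assume a hypothetical analyst who nudges a running belief toward each new report, with early reports weighted more heavily (the anchoring effect).

```python
# Toy anchoring model (illustrative only): the analyst updates a belief toward
# each incoming report, but the update weight shrinks over time, so the
# earliest reports dominate the final judgment.
def final_belief(reports, start=0.5, first_weight=0.5, decay=0.7):
    belief, weight = start, first_weight
    for r in reports:
        belief = (1 - weight) * belief + weight * r
        weight *= decay  # later reports move the belief less and less
    return belief

# The same "100 pieces of information" presented in two different orders:
reports = [0.9] * 50 + [0.1] * 50   # strong signals first, weak signals last
forward = final_belief(reports)
backward = final_belief(list(reversed(reports)))
print(round(forward, 3), round(backward, 3))  # same evidence, different conclusions
```

Under these assumed weights, the order of presentation alone pulls the two final beliefs apart, which is exactly the bet the experiment would test.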

Analytic teams need to be hawk-like in policing their categorization schemes because they often sneakily embed themselves in the back-office processes of the organization. Take, for example, the reading profiles of an analytic team—the algorithms that determine which information flows into their inboxes. Ask your analysts how often these reading profiles are updated. You will not be happy with their answers.

What inspired me to natter on about analysis and insight on this beautiful fall day? Reading Adrian Wolfberg’s excellent monograph In Pursuit of Insight: The Everyday Work of Intelligence Analysts Who Solve Real World Novel Problems. It’s not a quick read but luckily there’s a shorter version available here. Based on extensive interviews with analysts of varying experience, Wolfberg seeks to unpack how insight actually happens from a cognitive, neurological perspective. It tackles the all-important step that my all-too-neat process completely ignores: how does the idea for new categories enter your brain? What leads to its emergence?

Wolfberg writes that the insight process begins with a trigger phase, “an initiating event that, seemingly by chance, brings the analyst’s attention to a particular problem to address; alternatively, after an analyst has been working on a given problem, a random event contributes to focusing their attention more intently on the problem. Entering into and navigating through the trigger phase takes cognitive and emotional courage on the analyst’s part.”

After the trigger phase, Wolfberg identifies emergence as the next step. Two activities promote the emergence of insight: internalized tensions and priming. Quoting from the shorter paper:

Internalized Tensions: As analysts start working on a novel problem, they become aware of inconsistencies that can be cognition-based (i.e., inconsistencies between pairs of thoughts or ideas) or emotion-based (i.e., inconsistencies between an analyst’s action and interpretation of others’ reactions). Tensions induced by these inconsistencies can originate within the individual (i.e., self-initiated) or in the individual’s social environment (i.e., related to organizational structure and the behavior of others). An analyst who concludes that exploring a diversity of ways to represent a problem would lead to the most accurate assessment, while others judge that a standardized process would be best, is an example of cognition-based tension at the individual level. An analyst who presents a unique methodology in a detailed, transparent way to address skeptical concerns, while worried this could lead to being discredited in a production system that values standard product lines, is an example of emotion-based tension at the social level.

Priming: Analysts draw upon memories of past experiences unrelated to their present novel problem in order to make progress toward overcoming tensions and reaching insight. Priming sources also occur across the intersection of the emotion-cognition and individual-social dimensions. In an example of cognition-based priming at the individual level, an analyst who studied in graduate school how physical factors in the environment could trigger social or political outcomes applies that cause-and-effect knowledge to a national security novel problem. In an example of emotion-based priming at the social level, an analyst who had lived in a foreign country appreciates that even in countries where the same language is spoken, cultures can be very different.

What’s clear is that insight emerges from a rich casserole of experiences, emotions, and feelings in an analyst’s mind. Our intuition, what Daniel Kahneman calls our System 1, is the primary custodian of these insight-generating prompts. Wolfberg notes that “although these past experiences were unrelated to the problem at hand, an aspect of these past experiences brought forth a combination of emotional and cognitive meaning that informed how the analysts thought…” Every analyst interviewed by Wolfberg reflected on past experiences unrelated to the problem.

Clarity and INSIGHT are the most sought-after products from intelligence analysts. Clarity is getting the facts straight, sorting out complicated situations. Intelligence organizations usually do clarity well, but not always (think Iraq WMD). INSIGHT requires going beyond the facts to consider hidden causes and motivations, anticipate unexpected changes in trajectory, and appreciate the nonlinearity of events. The work processes of most intelligence teams are suited more to producing clarity than generating INSIGHT. Analysts often describe having to buck the established way of doing things to explore their emerging INSIGHT.

As Wolfberg notes, leaders of intelligence organizations need to appreciate the conditions necessary for the generation of INSIGHT and work to allow the time and space necessary for its emergence. Many of the work processes of the Intelligence Community emphasize order and consistency over thoughtfulness and contemplation. Working 8+ hours a day in a cubicle is also not ideal. As the science writer Annie Murphy Paul notes in her excellent book The Extended Mind, human brains evolved to think best when we’re physically active. My favorite “structured analytic technique” has always been to take a walk, preferably with a favorite thinking partner.


Wolfberg’s study has many other insights about INSIGHT. It’s a rewarding read for anyone wanting to make intelligence analysis better.

Thinking in the Time of Coronavirus–Part 2

The previous post discussed three important thinking dynamics relevant to our analysis of coronavirus:

  • Actions reveal intentions and motivations;
  • Ideology often colors how we think; and
  • Worst-case scenarios are always considered unlikely. (I’d amend that now to say almost always.)

…but there are many more.

Since the crisis began in January, I’ve come across many commentators—scientists, non-scientists, experts in other fields, jacks-of-all-trades—speculating about coronavirus and attempting to predict its course. Many statements were similar to this one by Dr. Fauci on January 26: “It’s a very, very low risk to the US.” I could not comprehend at the time the evidentiary or logical basis for such statements. Did the individuals making these statements believe the Chinese Government was engaged in some weird overreaction, or that the virus would prosper only in China? Did they assume that the hundreds of thousands of people who would come to the US in 2020 after visiting China (or, in a few weeks’ time, Italy) would all be free of the disease, or that we would somehow detect them as they entered the country? Were they just making a linear projection from the minuscule number of cases then in the US?

One cognitive pathology at work here is that INDIVIDUALS, EVEN TRAINED SCIENTISTS, ARE REALLY BAD AT DRAWING APPROPRIATE CONCLUSIONS FROM AVAILABLE EVIDENCE. Because I worked as an analyst at CIA for 32 years, I am familiar with this phenomenon. Policymakers are always demanding judgments from analysts, and we often feel obliged to provide them even when the evidentiary basis is insufficient. At any moment regarding any situation, how accurately does the evidence available to us reflect reality? Today as I write this, how much do we really know about coronavirus: 50% of reality, 30%, 10%? The answer at this point is unknowable. Therefore, predictions concerning its future course are tenuous.

Two other realities about thinking are worth mentioning here. First, OUR ABILITY TO KNOW IS A FUNCTION OF OUR TOOLS FOR KNOWING. We can only know what our tools reveal to us. Breakthroughs, revolutions in thinking in so many fields have been the result of inventions/discoveries of new knowledge tools. In cosmology, for example, our understanding of the universe expanded when we learned to build great observatories and combined cameras with telescopes. The deployment of orbital platforms such as the Hubble have further revolutionized our knowledge.

Our understanding of coronavirus has been diminished not just by its novelty but also because China may not have revealed all it has learned about the disease. Another tool problem is the lack of comprehensive testing of populations. Some of my Texas friends have claimed that Texas must be doing a great job containing coronavirus (or that there really isn’t a threat) because of the relatively low rates of infections and deaths. But Texas, as of April 15, has one of the three lowest rates of testing in the country. We don’t really know what’s going on there. And we won’t comprehend critical attributes of the virus, such as fatality and contagion rates, until we have tested a large and random sample of our population. This inherently incomplete nature of our knowledge should make us more humble about our predictions and expectations concerning the course of the disease. For many questions, we still do not have sufficient information to make a firm determination and thus need to err on the sides of caution and resilience.
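The testing problem can be put in numbers. The figures below are invented purely to illustrate the bias: suppose 100,000 true infections with a 1% true fatality rate, but tests are scarce enough that only the 10,000 sickest people get tested.

```python
# Hypothetical numbers, chosen only to show how undertesting distorts the picture.
true_infections = 100_000
true_ifr = 0.01                            # assumed true infection-fatality rate
deaths = int(true_infections * true_ifr)   # 1,000 deaths, all among the tested
confirmed_cases = 10_000                   # testing reaches only severe cases

apparent_cfr = deaths / confirmed_cases    # what the official statistics show
print(f"apparent fatality rate: {apparent_cfr:.0%}")   # 10%
print(f"true fatality rate:     {true_ifr:.0%}")       # 1%
```

With only severe cases tested, the apparent fatality rate is ten times the true one, and a state that tests little can look either safer or deadlier than it really is.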

But instead we have a tendency when confronted with limited information to succumb to THE STREETLIGHT EFFECT. The joke is that a policeman runs across an individual, usually described as inebriated, looking for car keys under a street lamp. When the policeman asks if this is where the keys were lost, the seeker answers “No, but this is the only place I can see.”

When we make confident predictions based on insufficient or flawed evidence, we are succumbing to the streetlight effect. One vivid example is how people jumped on the hydroxychloroquine bandwagon after just a couple of positive reports. At the start of the pandemic, many argued (and some still do) that covid-19 would be no worse than a bad seasonal flu. Those arguments were based on deaths up to that point (a few hundred or thousand) and I’m not exactly sure what else. There are so many flaws in that argument it’s hard to know where to begin. First, flu death counts are totals for an entire year, while the covid-19 deaths covered just a few weeks; we are assuming a lot about how the disease (and people…) will perform over the course of an entire year. Second, the statement assumed linear growth, which of course is not what happens during uncontrolled epidemics. Third, the argument implied that the Chinese stupidly and inexplicably closed down their economy because of the seasonal flu. (Actions reveal intentions and motivations.)

Another flaw in the argument that covid-19 is just another flu is captured by the aphorism: QUANTITY HAS A QUALITY ALL ITS OWN. Mistakenly attributed to Joseph Stalin, the observation appears to have become popularized instead by the US military-industrial complex. It attacks the logic behind linear projections—it’s just more of the same thing and therefore we can handle it. At some point, more of the same thing evolves into a different plant; we can pull out a few weeds by hand but not an entire yard-full. And quantity is not the only factor in play; pacing and tempo have significant impacts as well. One million cases of covid-19 during the course of a year may be manageable but half a million cases in 8 weeks not so much.

When I’m asked to recommend a book for aspiring intelligence analysts, I always mention Daniel Kahneman’s Thinking, Fast and Slow. One of his famous findings is that humans are bad at comprehending exponential numbers. (If you start with a penny and double it every day, at the end of the month you will have more than $5 million; actually, if the month has 31 days, you end up with more than $10 million.)
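The penny claim is easy to verify; two lines of arithmetic reproduce it:

```python
# Day 1 is one penny; every later day doubles the previous total.
def penny_total(days):
    return 0.01 * 2 ** (days - 1)

print(f"Day 30: ${penny_total(30):,.2f}")   # $5,368,709.12
print(f"Day 31: ${penny_total(31):,.2f}")   # $10,737,418.24
```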

I like to extend that idea by observing that HUMANS FIND IT HARD TO DEAL WITH EXPONENTIAL CAUSALITY. Exponential causality is one of the characteristics of complex systems. Any one event can have a cascade of consequences in unpredictable directions and time frames. Feedback can even travel backwards in time in the sense that a development today can reveal the unappreciated causal importance of some past event. Because exponential causality confounds humans, we like to pretend it doesn’t exist; a popular way to do that these days is by subscribing to conspiracy theories. So many factors contribute to today’s reality that there’s always a stray thread or two that can be pulled to create a conspiracy-based explanation. If you yearn for a simpler, linear world, then you’re happy to accept that Bill Gates and 5G technology have combined to cause the coronavirus. It’s a particularly dangerous cognitive trap.

One of my first bosses at CIA, John, impressed me with a story from his early days as an analyst. He was following a particular insurgent group in southeast Asia in the 1960s, and had calculated that because of supply disruptions the group would literally use up its ammunition by a date certain. John’s boss advised him to rethink his analysis because YOU NEVER RUN OUT OF BULLETS. In other words, linear predictions are always flawed because (1) our knowledge of any situation is incomplete; (2) we never know the exact dimensions of our ignorance; and (3) shit happens.

 

Which brings us to the topic of coronavirus models. I’m sure statisticians will beat me up for this, but I often think of models as compilations of hundreds of linear projections. The modeler tries to include every possible variable in her model and stipulates the tens of thousands of relationships among the variables—which is really hard. As the model runs, every possible combination of variables is instantiated. This can be helpful to policymakers by representing in a more digestible fashion a complex set of possibilities. But models always simplify the complex—they make more linear that which is random. In my experience, models are particularly bad at accounting for the variations and peculiarities of human psychology—one of the most important factors determining the course of covid-19. Indeed, the failings of models will luckily keep human intelligence analysts employed for years to come.
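To see how touchy these simplifications are, here is a bare-bones SIR-style difference equation (a textbook epidemiology sketch, not anything from the modelers I'm describing; every parameter is assumed and nothing is calibrated to covid-19). A modest change in one transmission parameter produces a much larger epidemic peak.

```python
# Minimal SIR sketch: daily steps, assumed parameters, illustrative only.
def sir_peak_infected(beta, gamma=0.1, days=300, n=1_000_000, i0=100):
    """Peak simultaneous infections for transmission rate beta."""
    s, i, r = n - i0, i0, 0
    peak = i
    for _ in range(days):
        new_inf = beta * s * i / n   # new infections this day
        new_rec = gamma * i          # new recoveries this day
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        peak = max(peak, i)
    return peak

low = sir_peak_infected(0.25)
high = sir_peak_infected(0.30)   # 20% higher beta, disproportionately bigger peak
print(round(low), round(high))
```

One knob, thousands of implied relationships downstream: this is why small errors in a model's inputs, especially the human-behavior ones, can swing its outputs so dramatically.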

Another useful aspect of models is that they bring into focus the most dangerous possible outcomes and identify the levers policymakers and individuals can pull to avoid them. Which brings us to the PARADOX OF WARNING. The world has moved smartly to limit the worst consequences, although the ones we’re left with are still pretty dire; it turns out the Chinese were not crazy to lock down entire cities to prevent further spread of the disease. But as we succeed in lowering the final number of deaths and infections, we start hearing from critics who claim the crisis was exaggerated from the start. Aaaargh! The only point of warning is to avoid the bad outcomes. No one should be rooting for maximum coronavirus. Effective warners always want to be wrong.

The coronavirus pandemic illustrates that good thinking is more than an academic exercise. It can be a matter of life and death. I’ve seen too many friends on social media using poor arguments to justify bad decisions. Please everyone, just put on your thinking caps.

 

 

Thinking in the Time of Coronavirus–Part 1

I’ve been wanting to comment on all the examples of bad thinking and cognitive traps I’ve seen regarding coronavirus for a while now, since early February for sure, but I’ve hesitated to put them down in writing because there is already too much content drawing spurious links to this horrible pandemic. But as we see signs that the infection curves are beginning to flatten in some countries (although certainly not all), it strikes me that good thinking will be just as critical as we work to recover our economies and manage the continuing threat of disease. So what follows is a compilation of some of the best and worst thinking practices revealed so far this year. (There are many, so expect at least two posts.)

I was convinced the reports of a new, SARS-like disease in China were significant by mid-January. On 16 January I spoke at a conference that had a sizable contingent of attendees from Seattle and I remember fretting that Seattle would likely be one of the first American cities to get hit by coronavirus given the Chinese population on the West Coast and the travel patterns associated with Lunar New Year. I started tweeting and posting on Facebook about the disease in the second half of January and by late February it dominated my posts. Friends have asked me why I was so sure the disease would pose such a threat and I answered with one of my favorite heuristics from my CIA years: ACTIONS REVEAL INTENTIONS AND MOTIVATIONS.

When you’re trying to figure out a government or actor’s intentions, it’s always best to start with their actions. Pay attention to what they are doing. Given China’s obsession with economic growth and how the Communist Party’s legitimacy rested on delivering prosperity, I could not imagine why China would have closed down one of its most important cities out of an “abundance of caution”—a good name for a new rock band. The coronavirus had scared the shit out of the Chinese Government and the most reasonable explanation was that it was contagious and dangerous.

When we began to see reports of massive disinfection campaigns and attacks on the Chinese doctors who issued the first warnings, I began to wonder what Beijing was trying to hide, if anything. Of course there was immediate speculation that coronavirus was some type of bioweapon; I’m no expert on this issue so I have to accept the judgment that the virus is not man-made. But the possibility that coronavirus leaked because of an industrial mishap or accidental discharge remains credible to me. Recent reports that the Chinese Government is controlling research into the origins of coronavirus just further pique my suspicions. Actions reveal intentions and motivations.

When I actually shared this view on social media a few weeks ago, several friends criticized me for going there. Why, I wondered. It wasn’t like the Chinese Government was known for its transparency and complete honesty. Why couldn’t these ideas be entertained? My answer in part is that IDEOLOGY OFTEN COLORS HOW WE THINK. There are so many examples of this dynamic spanning the ideological spectrum.

  • Advocates of globalization loath to admit that China might have deceived other countries.
  • Supporters of the international system reluctant to criticize the World Health Organization.
  • Proponents of American exceptionalism insisting, against a lot of evidence, that the US has had the best response to the coronavirus.
  • Backers of the President condemning any suggestion that the US could have acted more quickly to contain the disease.
  • Critics of the President attacking his decision to limit travel from China in late January, although it was clearly the right thing to do. The more valid criticism is that it didn’t go far enough and there were too many loopholes.

And countless other examples we could mention. Because this is such a terrifying disease, it’s natural for people to fall back upon their values and ideological beliefs to interpret events. It’s natural but not helpful. In fact, it’s dangerous. Our beliefs lead us to ignore facts that don’t fit our ideology and overamplify developments that do. Unfortunately this thinking weakness will haunt our recovery efforts, particularly in the US where our politics have become exceptionally poisonous.

One important caveat: our ideology and values will play an unavoidable role going forward as we think about levels of acceptable risk. To my knowledge there is no objective way to measure the value of a human life. In the months to come we will be trading hundreds if not thousands of lives for decimals of economic growth. Your values are what will determine how you solve that equation. Less-polarized societies will find it easier to agree on the solution. The math will be difficult for the US. (And let me add that the very idea that this can be thought of as a math problem is anathema to many.)

I spoke at a conference in D.C. on 6 February about cognitive traps and used the emerging disease for my examples. The one cognitive bias that was most evident then is that WORST-CASE SCENARIOS ARE ALWAYS CONSIDERED UNLIKELY. In early February few people were expecting the disease to ravage Western Europe and the US and painted any such thinking as worst-case scenarios. Indeed, the first deaths did not occur in Italy until the last week of February. And yet it was reasonable to assume, I thought, that the disease could easily flare up in any country with connections to China, which was basically any place on the planet.

If you’re an analyst responsible for warning, remember that when you paint the most dangerous scenarios as worst-case, you make it easier for the decision-maker to dismiss them. And that’s what appears to have happened in the US government. Impact and probability need to be thought of as independent variables. Some category of “worst-case” scenario happens every year; the only “unlikely” aspect of “worst-case” scenarios is the ability to predict their timing. We are unable to know with precision when a dangerous development will occur, but we are sure to experience several in our lifetimes.

Humans have been flourishing on this planet for tens of thousands of years, solving many problems (and, of course, creating others). We can assume that almost all the easy problems have been solved and many of the hard ones as well. Going forward, most of our problems will be difficult to handle and few, if any, will have clear-cut solutions. Only good thinking will help.

Thinking Ain’t What It Used To Be

I’ve been reading a great book the last couple of weeks, Thinking, Fast and Slow by Daniel Kahneman. (I read books like I watch TV: I dip in and out, taking in several things at one time.) I recommend Kahneman’s book to everyone; it is not as hard to read as you might think–in fact the prose style is very pleasant, although the illustrative mental puzzles do take a bit of effort sometimes. I think most people will react like I have, reflecting on the implications of Kahneman’s findings for how I lead my life and how we make decisions. His major message so far is that all of us need to be aware of the shortcuts (fast thinking) we use and the likelihood that these shortcuts will lead us astray.

Thinking about thinking has ended up being a huge part of my life’s work. That’s a lot of what I did for CIA. It is a humbling task to write or edit reports intended for policymakers, even for the President, and then pause to ask yourself whether the prose before you actually provides anything useful or insightful to the intended reader. Because the real truth is that facts (outside of science, and even there you gotta wonder!) rarely speak for themselves. (And it hardly matters at all if they are secret facts.) In those rare instances when they do speak for themselves, you don’t need an analyst or interpreter to make sense of them. Do you? Most facts require context, invite assumptions, have a back story, and probably also have a future. That’s what the analyst, the real thinker, needs to bring to the conveyance of the fact. But it is all too rarely accomplished.

I can’t help but think we have a humongous thinking deficit in today’s world. (We also have significant values confusion and volatility, but that would be another blog.) I don’t watch any of the political debates because I can’t stand to watch people embarrass themselves. But I sort of follow them on Twitter sometimes so I know the various candidates drop Fact Bombs; Speaker Gingrich engages in carpet bombing and Rick Perry often misses the target. It doesn’t matter whether I support the candidate or not: they all misuse facts and encourage the sloppy thinking habits that haunt us today.

Some of our thinking errors have been monumental. For example, you would think that when the US and other Western nations began investing in large social welfare programs in the 1950s and 1960s, they might have actually done some actuarial planning, plotting out demographic, social, and economic trends to develop projections for how long the economy could sustain such benefits. Now my bet is that in some governments someone actually did something like this, but that inconvenient scenarios were dismissed as worst-case scenarios and thus relegated to the low-probability trash can.

This is one of the most common thinking errors I encountered professionally: the association of worst-case scenarios with low probabilities. Think about it. I bet most of you do it all the time. You hear worst case, you think “unlikely to happen”. The two values–probability, severity–move independently of each other, of course. There is a variation to this thinking pathology: that’s when worst case is equated with high probability or even the only possibility. In the past ten years, this has manifested itself in obsessive hoarding of gold.

Another area where a little long-range thinking and contextual analysis might cast some interesting light is the immigration debate. Some of us know that immigrants are an increasingly important part of US economic growth and that economic growth is a good way to ameliorate deficits. But these points aren’t often made. If the point can’t be conveyed in a soundbite or factbomb, then it isn’t conveyed at all.

What follows are examples of sloppy–even meaningless–analysis I see all the time even in the most serious publications.

The negotiations will be difficult. Now I’m betting that if the situation were easy to resolve, you wouldn’t need negotiations in the first place. Negotiations are supposed to be difficult. Also lengthy, bitter, hard-fought. This is one of my faves, because you SEE IT ALL THE TIME. (Also see below for discussion on use of adjectives and adverbs.)

It is too soon to tell and its cousin Only time will tell. No elucidation necessary.

The transition will be difficult or The transition will be smooth. I distrust just about all adjectives and adverbs in analysis. These slippery little modifiers disguise many errors in thinking. What smooth means to the writer may not be what smooth means to me. I would rather have the elements of the transition discussed so that in addition to considering the analyst’s judgment, I can develop my own.

The elections are too close to call. Basically, I don’t understand why we spend so much analytic and journalistic energy trying to forecast elections in the first place. An election is actually a specific event in the near future. Its timing is known to all. When it ends, we will know the outcome. If the election is rigged, then our forecast doesn’t matter. I’m gobsmacked at how American journalists in particular have convinced so many people to hang on their every word about events concerning which they have little insight and over which they have little control.

X dynamic is not a problem in Society Y because it only has the support of 10% of the population. Another classic mistake that you see ALL THE TIME and which, at this point, is absolutely criminal. When you hear about a particular revolution not being anticipated, you can bet that this type of analytic statement was at the heart of the faulty thinking. The statement is flawed because it tries to capture extremely complicated societal dynamics in a criminally simple mathematical claim. Before making such a statement, the analyst needs to examine his theory of social change. Is social change simply an arithmetic progression? Or do movements often gain support suddenly and/or exponentially?
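The arithmetic-versus-exponential distinction is easy to make concrete. Assuming (purely for illustration) a movement whose support doubles each period, 10% is not a reassuring number: it is three doublings away from 80%.

```python
# From 10% support, doubling each period (assumed exponential growth):
support, periods = 0.10, 0
while support < 0.5:       # how long until the movement commands a majority?
    support *= 2
    periods += 1
print(periods, support)    # after 3 periods, support stands at 80%
```

Under arithmetic growth the same movement would need decades to get there, which is exactly why the 10% statement smuggles in a theory of social change without examining it.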

Anyway I think I’ll stop now. Really I’ve just been venting on many of the thinking errors that get under my skin. For a much better discussion, do read Thinking, Fast and Slow.

On CIA, typewriters, and sensemaking

Check out my guest blog post on IBM’s Smarter Planet blog.  http://asmarterplanet.com/blog/2011/09/10766.html

Revisiting Lessons from a CIA Heretic

Events in the Middle East have led me to reflect on the talk I gave in September of last year at the Business Innovation Factory. (You can read the prepared text here or see the video of my speech here.) I was noting how the world is changing and how that in turn requires a different sensemaking method. The key paragraphs:

If you think that the world is driven mostly by the secret deals and aspirations of powerful people—the Hitlers, the Communist Party of the Soviet Union, Mao Tse-tung, Idi Amin, Saddam Hussein, Osama bin Laden, I’m desperately trying to think of a likely woman here—then you will conclude that you need some kind of capability to figure out what these people are doing, to ferret out their secrets. To protect our nation from some very nasty ideas these individuals cook up. And you may also want an organization that can impede their plans, cross your fingers.

But if you think that most of the forces the US will need to navigate are not specifically man-made, or at least not specifically made by one man or a small group of them–then you need a different kind of organization. If what matters is that the US understand the trends in the world, like globalization or the emergence of new economies such as India and China and Brazil (which clearly no one is like trying to keep a big secret) then spending a lot of time digging out secrets seems not as important, and what you really want is to have your hand on the pulse of the world, to be out there sensing and in many ways just being part of the whole big ride.

(A little later in the talk.)  

Making sense of the world is so hard and so important that it demands collaboration with as broad a network as possible. It was around this time that this thought entered my mind: The CIA will end up being the last secret organization in the world. And being the last of anything is never a good thing.

And so back to the question. I actually think the answer to it is very complicated. But I do believe that more of what will be important to US prosperity in the future will lie in the second dynamic and our success will depend on how well we understand these large shift changes underway and are able to engage them. Here’s where the imbalance of the Intelligence Community really can hurt us. To deal with the first circumstance it’s important to be a closed network. But to understand and prosper in the second dynamic it’s best to be an open network.  What we have here is a real innovator’s dilemma.

That’s why one of my passions now that I’ve retired from the Agency is to do what little I can to help Americans think about connecting, about working in open networks, about transparency. I believe as a successful multicultural society the US is poised to be innovative in this new world, and this time perhaps all out of proportion to our size. I love all social networks and in particular Twitter because of its power to spread ideas faster than the speed of light. Just think of it. One thought can reach a thousand people much faster than a single beam of light could physically touch those same individuals.

Lessons from a CIA Heretic

Last week I told a story at the Business Innovation Factory Summit, a wonderful event that I was blessed to attend. The storytellers were awesome. (Let me also give a big shout-out to my friends and reverse mentors Tony and Jen Silbert of Innovation Partners, who were the kind folk who connected me to Saul Kaplan and all the wonderful people at the Business Innovation Factory.) I was talking to a friend last night about all the interesting people I met and I couldn’t talk fast enough to keep up with my memories.

Anyway, even as a retired CIA person, I still need to get public or published comments approved if they deal with subjects pertaining to my CIA employment. And so this forced me to actually write out a draft of my extemporaneous comments to submit to the publications review board. You can catch the differences (not that significant) between what I wrote and what I said here, where you can download the MP3 file of my remarks. So I thought I would post that text below. I think particularly toward the last half there are some ideas I rushed through or omitted that might be of some interest. I’m sorry it’s so long…

TALK BEGINS

My hope is that 15 minutes from now you will have developed your own answers to the following three questions or at least be provoked to think about them.

The questions are:

1.  Is the perception of the CIA in the popular media accurate, distorted, and/or useful to the organization and US national security?

2.  What is the motor that runs the world? Is it the secret agreements and machinations of men (and historically it’s been men) getting together in smoke-filled rooms generally up to no good; or is it the large dynamics and trends that emerge on the planet from God knows where and set in motion events that elude our attempts at prediction and manipulation?

And

3.  Are we the world?

So question 1. The perception of the CIA. Now first I have to tell you that I hate spy fiction and spy films and I even dislike nonfiction about the topic, so I’m not the best person to have an opinion as to whether the common perception out there is accurate. But I can tell you a little bit about my early days at the Agency. That’s a start.

Unlike many young people I’ve met over the years, I never dreamt of working for the CIA. As the first person in my extended family to graduate from college, I of course had no idea what I was supposed to do with the degree I was earning but because I was a college debater I’d always assumed I would be a lawyer. Until at Catholic University I started meeting law school students and went “OOOO….I don’t want to end up anything like them.” At that point I was at a loss. The only thing I was really interested in was the world, and so I thought well, I’ll go to Georgetown for graduate school. And so I did and the first semester there was a CIA recruiter on campus and I said sounds good. That’s the sum total of the story.

Now when I first joined the Agency, in 1978, it wasn’t what we would call a very diverse environment. (And even today Agency leaders are not satisfied with the level of diversity in the organization.) In later years I would tell people that I used to wander the halls searching for another Latino or Latina, because someone had told me there was another Puerto Rican working in the Agency and I was determined to find that person. Now that story is not specifically true, but it is generally accurate, if you get my drift. I used to get strange comments, like people in a conversation suddenly volunteering, in a culinary non sequitur, how much they liked Mexican food or assuming that I would only want to work on Latin America. But for the most part, the Agency environment was a meritocracy, specifically I can say that about the analytic directorate where I worked, and I can’t point to any particular issues. In fact, when I would speak on college campuses kids were always asking me to comment on how being a woman and Latina affected my career, and I always told them the truth, that neither had anywhere near the effect that being a different type of thinker did—but I’ll talk more about that later.

I soon learned that most of the work at the Agency was, well, like the work at any other knowledge organization, although of course we didn’t use that term then. (By the way, given the malodor in which managers and management are generally held, I just don’t understand why consultants banded together and decided they could make a lot of money pitching organizations on Knowledge Management, but I digress.) True, the CIA is by law responsible for carrying out covert actions, an activity that, for my taste, assumed a heck of a lot about the planning abilities and foresight of the average American, whatever, but for some time now great powers (another term I rather dislike) have assumed they needed the ability to do some things secretly to make their way around this big, blue planet and, rather endearingly, the US decided to give this activity a legal structure. But much, most of what most Agency employees do has very little to do with covert action. It has to do with trying to make sense of the world, and trying to gather information about the world that others would rather we not know, so it’s a bit like trying to figure out what Steve Jobs is going to do next at Apple. But for the press the CIA is like the Lindsay Lohan of government. No matter what we do, how insignificant or banal really, it makes headlines and it’s always bad. “CIA uses solar-powered lawn mowers!!” Ridiculous!! I guess stories about the CIA sell newspapers, if anyone bought them anymore. I had a colleague at the Agency, wonderful fellow, who started every morning reading multiple newspapers (and he also had his office decorated with Brooklyn Dodgers memorabilia, so you know the type—salt of the earth.) And for the last six or seven years, I would stick my head into his office and say, “You know, newspapers are dying.” It was really mean of me.

So this is a good point to start making the segue to the second question, which as you recall has two parts. So I’ll repeat them.

What do you think is the motor that runs the world?

Is it the secret agreements and machinations of men (and historically it’s been men) getting together in smoke-filled rooms generally up to no good or

Is it the large dynamics and trends that emerge on the planet from God knows where and set in motion events that elude our attempts at prediction and manipulation?

So right about now, I’m going to start connecting my comments to the topic of innovation, which will be very exciting, I think.

So my point in asking you to think about this question is that how you choose and/or what reality is tells you a lot about what kind of intelligence organization you’ll need. If you think that the world is driven mostly by the secret deals and aspirations of powerful people—the Hitlers, the Communist Party of the Soviet Union, Mao Tse-tung, Idi Amin, Saddam Hussein, Osama bin Laden, I’m desperately trying to think of a likely woman here—then you will conclude that you need some kind of capability to figure out what these people are doing, to ferret out their secrets. To protect our nation from some very nasty ideas these individuals cook up. And you may also want an organization that can impede their plans, cross your fingers.

But if you think that most of the forces the US will need to navigate are not specifically man-made, or at least not specifically made by one man or a small group of them–then you need a different kind of organization. If what matters is that the US understand the trends in the world, like globalization or the emergence of new economies such as India and China and Brazil (which clearly no one is like trying to keep a big secret) then spending a lot of time digging out secrets seems not as important, and what you really want is to have your hand on the pulse of the world, to be out there sensing and in many ways just being part of the whole big ride.

Now of course the question is a false dichotomy, because it is not either/or, and both dynamics can exist at the same time. But what is critical for understanding the CIA and why I spent my last 20 years there as a frustrated innovator, is that much of the Agency’s theology and modus operandi are built on the first assumption. This was the driving principle in the Cold War—countries hostile to us are planning to destroy us and do us harm and we’ve got to get out there and figure out what they’re up to. And of course it’s a Mad Magazine Spy vs. Spy world and the bad guys are trying to figure out what you know, so you have to be secret about everything, be very, very quiet, and trust no one.

It’s all very tiring but it was all very important up until about 1990 or so, which curiously, now that I reflect back on it, was when I published my first article in Studies in Intelligence arguing that we needed to do analysis in new and different ways. We needed to recognize that policymakers often knew as much about the open world as we did and that these newfangled operations like CNN were providing news faster than we could and well we needed to adjust. And then the internet came along and the Agency was really thrown for a loop. One has to understand that for intelligence organizations how one handles information is not a secondary or enabling activity. Handling information is the essence of our mission so that changes here are doctrinal and theological. Well, of course, we had a really hard time figuring out what to do, and I would argue we are still having a hard time.

This period, the 90s, ended up being the most difficult of my Agency career because it just became harder and harder for me to reconcile what I believed needed to be done with what the Agency was actually doing. There was a small group of us that I in any case referred to as the Rebel Alliance. We tried to raise the Agency’s awareness of how the world was changing around it, we would bring in guest speakers to talk about Change—how naïve it all seems in retrospect. During this time, and I’m afraid this is a danger all innovators run, I began to get the reputation of being cynical and negative…positive thinking has its limits, you know. During a reception up in NYC around then, I was approached by someone who had been watching me, I remember she worked for DuPont, who said, “I can see you are a heretic in your organization. And I just want to tell you that you need to learn to live with the feeling of discomfort all heretics get. In fact you need to learn to be comfortable with these feelings of discomfort. Not just comfortable, you need to learn to like, love them, because when you get those feelings then you can be sure you are being true to your convictions.” I never spoke to this person again and I’m convinced she was one of the two guardian angels I’ve encountered in my life. (If you want to know the other one, catch me later!!)

Despite all this doom and gloom, I spent the last ten years or so of my Agency career as a senior executive—and ended up in positions of increasing responsibility. I wish I could tell you exactly how I as a heretic innovator managed to succeed in the system anyway, but part of it was just sticking to it, many good friends and mentors—especially reverse mentors, and that extremely important variable in all plans—luck. By 2005 I was part of the executive team that led the analytic Directorate, the Directorate of Intelligence. Very soon after I assumed that position, a young man and his manager approached me about an idea they had at that time to use the MediaWiki software to create an Iraqipedia so that analysts throughout the Intelligence Community could collaborate and work together on the problem set. I thought what a great idea but did they know that the Agency was OK with using collaboration software as long as you only collaborated with people within the Agency. No, they didn’t, they said. And I said that was OK because I doubted anyone in the bureaucracy realized any longer this stricture existed so let’s proceed, full speed ahead. (It never ceases to amaze me how bureaucracies create rules at a rate no human can ever remember, not even bureaucrats.)

So that was my small role in getting Intellipedia started, which is still viewed by many as the most important adjustment the intelligence community has made to the Internet Age. Nothing came easily and I remember Sean and Don, the two heroes who ended up pushing the concept throughout the intelligence community and winning last year one of the Service to America awards given to outstanding civil servants, often asking in frustration if we couldn’t just MAKE everyone use Intellipedia. To which I said, wrongly or rightly, no, we can’t. I happen to believe organizational change is a lie—organizations don’t change, people do, and each person changes for particular reasons of their own. You can’t make people think differently. You can create an environment where they can have a Eureka moment. You can MANIPULATE them into thinking differently. But you can’t FORCE the issue.

Not only that, many in the intelligence community then, and perhaps now, didn’t think ideas such as Intellipedia were such good ideas in the first place. Virtues of Intellipedia such as transparency don’t sound too hot to intelligence professionals accustomed to clandestinity. The CIA and Intelligence Community also were hung up on the concept of authoritative views. National Security intelligence is just too important to be handled through collaborative processes, they would argue. During this period I came to the exact opposite view. Making sense of the world is so hard and so important that it demands collaboration with as broad a network as possible. It was around this time that this thought entered my mind: The CIA will end up being the last secret organization in the world. And being the last of anything is never a good thing.

And so back to the question. I actually think the answer to it is very complicated. But I do believe that more of what will be important to US prosperity in the future will lie in the second dynamic and our success will depend on how well we understand these large shift changes underway and are able to engage them. Here’s where the imbalance of the Intelligence Community really can hurt us. To deal with the first circumstance it’s important to be a closed network. But to understand and prosper in the second dynamic it’s best to be an open network.  What we have here is a real innovator’s dilemma.

Which brings me to the last question: Are we the world? In the immediate aftermath of WWII, the US was 50% of the world economy. We also make a big deal of how we led the world in innovation, but of course most of that was probably just a function of our size. So during the Cold War we dealt with the world as if we were the world. We called the shots. And that is the world our intelligence community learned to function in. Of course individuals were ready to share secrets with the US government because we were after all where the action was.

That world is ending very rapidly. The world to follow will be a good world too. A world in which the US will remain very influential and prosperous. But once the US represents, let’s say 10% of the world economy, which could happen in most of our lifetimes, the arithmetic of dealing with the world from a position of absolute strength sort of falls apart. Much of the American public, from what I can tell, doesn’t appear ready for this turn of events. We learn, as kids, that America owes its prosperity to its independence from the rest of the world. It is part of our founding myth. We also believe that the world and its problems scale to the capabilities of individuals or small groups of individuals, freely associating. So in a very real sense, Complexity is Un-American!

That’s why one of my passions now that I’ve retired from the Agency is to do what little I can to help Americans think about connecting, about working in open networks, about transparency. I believe as a successful multicultural society the US is poised to be innovative in this new world, and this time perhaps all out of proportion to our size. I love all social networks and in particular Twitter because of its power to spread ideas faster than the speed of light. Just think of it. One thought can reach a thousand people much faster than a single beam of light could physically touch those same individuals. I found myself a few weeks ago teaching a group of 20-somethings my Twitter secrets. This is nuts, I thought, but what a blast.

So there you have it. My last lesson: All organizations, no matter how reactionary or conservative, always have people in them thinking how we can be better.  All organizations need to find better ways to tap into what these individuals have to offer, because they often have an orientation to the outside environment that you may be lacking.

And for you frustrated innovators out there, form a Rebel Alliance. But remember that optimism is the greatest act of rebellion.

Thank you.

What is your Twitter Diversity Score?

How do we, particularly those of us who are knowledge workers, expose ourselves to different views and perspectives? I’m asking this as a very practical question. I’m currently reading Amartya Sen’s The Idea of Justice and one of the most important concepts concerning justice is that it should represent fairness, and that in fact justice and fairness are separate concepts. (Except that many languages, including French, do not have separate words for justice and fairness.) But it occurred to me as I was reading that it is impossible to be fair if you are not aware of all possible views on a subject. (Now, of course, knowing all views probably is itself impossible for most complicated subjects…there would be a long tail of views that defied comprehension, so then you immediately get involved in deciding which views are relevant, etc., which puts you in another miasma of subjectivity. And this is one of the main reasons why I’ve always been rather dubious about the wisdom of authority and institutions, but I digress…)

To return to the main topic, diversity of thought is key to attaining justice and fairness in societies. And how can we hope to achieve such diversity of thought? Well, of course, one way is through the use of TWITTER. Roger Schank, a leading thinker in education and artificial intelligence, just wrote a piece in eLearn magazine on how Twitter is capable of changing the very nature of what it means to learn from your peers.  Absolutely! The magic of Twitter is in how it has become a perpetual learning machine. But the question remains, is my Twitter stream diverse?

So I decided to do a manual check this morning of the people I follow. I only follow about 200 people so it wasn’t too hard to do it by hand, although admittedly my analysis is extremely primitive. (Is there a program that lets you analyze the diversity of your network? I just tried Google’s FollowFinder, which helps me find more people who are like the ones I already follow. I want the opposite. I want someone to analyze my network and provide me with links to people who are in the same intellectual domains but have different perspectives. Then I would like a tool that shows me other intellectual disciplines I should follow.) In any case here are the results of my Twitter Diversity Inventory. N=200 (Yes, I know the numbers don’t actually add up…I took several shortcuts which I can explain if you’d like…but the numbers are generally truthy.)

Companies/Groups 67
Male 68
Female 55
American 122
Non-American 25
Northern European 129
Not northern European 19
Internet/Social Media/IT Experts 64
Other Topics/Disciplines/Interests 77

Bottom line: I’m not very satisfied. Normalizing to a 100-point scale, more or less, my diversity score for American/non-American and northern European/non-northern European is below 20. That can’t be good. It must lead to personal blind spots in how I think about many subjects. The next step is to figure out how to fix this. More to come….
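For what it's worth, the inventory arithmetic can be sketched in a few lines of code. The scoring rule here, the smaller group's share of each two-way split expressed on a 100-point scale, is my guess at the shortcut being described, though it does reproduce the below-20 scores mentioned; the counts come from the inventory above.

```python
# Sketch of the "Twitter Diversity Inventory" arithmetic. The scoring
# rule (smaller group's share of each two-way split, on a 100-point
# scale) is an assumed reconstruction, not the author's exact method;
# the counts are taken from the inventory in the post.

def diversity_score(count_a, count_b):
    """Return 0 for a fully homogeneous split, 50 for a perfectly balanced one."""
    return 100.0 * min(count_a, count_b) / (count_a + count_b)

inventory = {
    "Male / Female": (68, 55),
    "American / Non-American": (122, 25),
    "Northern European / Not": (129, 19),
    "Internet-IT / Other topics": (64, 77),
}

for split, (a, b) in inventory.items():
    print(f"{split}: {diversity_score(a, b):.0f}")
```

Under this rule the American/non-American and northern European splits both land well under 20, while the gender and topic splits sit in the mid-40s, which matches the dissatisfaction expressed above.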

What is your Stupification Point?

Malcolm Gladwell has a piece in this week’s New Yorker on the nature of espionage and asks some very penetrating questions about the psychology of the business: essentially once you’re in the hall of mirrors is there anything or anyone you can really trust or accept at face value?  It’s very much worth reading and also an amusing read, because Gladwell makes his points while reviewing what looks like a really fun book on the WWII exploits of the British intelligence service, Operation Mincemeat.

But what I really thought was worth sharing were some more overarching points about the business of intelligence or sensemaking. (I really don’t like to use the term intelligence because I think it has too many negative or at least questionable connotations.) Gladwell notes the point made by political scientist Richard Betts that in intelligence analysis there tends to be an inverse relationship between accuracy and significance. Boy, does that ring true, although I would just generalize Betts’s point by applying it to just about all knowledge activities. We almost always can be most specific about that which is least significant. This actually relates to the phenomenon of attaching disproportionate importance to activities you can count. To wit: When trying to fix something, as managers we tend to concentrate our efforts on the parts of the process we understand well, even though those parts may not really be what are causing the problems. I’m sure you’ve suffered through this in your organization. Some large problems are identified but you and all your coworkers know intuitively that the solutions offered–often rolled out to great huffing and puffing–just don’t tackle root causes.

Gladwell also points to the work of Harold Wilensky, a professor emeritus at the University of California at Berkeley who has done some groundbreaking work over his career but whose book, Organizational Intelligence, which Gladwell quotes from, appears to be out of print.

As Harold Wilensky wrote in his classic work “Organizational Intelligence” (1967), “The more secrecy, the smaller the intelligent audience, the less systematic the distribution and indexing of research, the greater the anonymity of authorship, and the more intolerant the attitude toward deviant views.” Wilensky had the Bay of Pigs debacle in mind when he wrote that. But it could just as easily have applied to any number of instances since, including the private channels of “intelligence” used by members of the Bush Administration to convince themselves that Saddam Hussein had weapons of mass destruction.

I’ve been searching the internet all morning for more on the book Organizational Intelligence, because anyone who made the wonderful observation above has got to have more to offer. Sadly, I can’t find it, although, as is the way of the internet, I was next-linked to this very nice presentation by Richard Veryard. He asks a wonderful question: what stupifies your organization?

Each organization has its particular form of stupidity–It is up to the consultant (or the above average manager) to recognize the way that stupidity manifests itself and to find a way of doing something about it.

I would just note that organizational culture is probably the number one factor that stupifies organizations.

This presentation is chock-full of gems. “Stupidity is not making errors. Stupidity is repeating them.” I also love his discussion of the Algebra of Intelligence. Intelligence is not arithmetical: “lots of intelligent pieces doesn’t add up to an intelligent organization.”

So to summarize, what did I learn in the last 24 hours about intelligence (sensemaking), organizations, and networks?

  1. Closed networks have a hard time determining if what they know is really significant. (In part because determining significance invariably requires perspective and context, which can only be gained from a vantage point. Closed networks lack vantage points.)
  2. The smaller the network, the less room it will have for diversity. (So, a diversity solution that is self-contained is no diversity solution at all.)
  3. The smaller the network, the less it can tolerate differences of opinions.
  4. Every network has stupification points. You must constantly be hunting for and eliminating them or they will destroy you.

Twelve Stupid Things People Say about the Internet

I remember when I was a kid people would always say that to find life elsewhere in the universe, we had to look for carbon, because life was carbon-based. And I remember thinking, probably as a 10-year-old, well who says that all life has to be carbon-based? Can’t we imagine a different kind of life? And people would say, no, that’s wrong, except now it isn’t so wrong to think that way. And I thought the same thing about the chances of finding life in the deep, deep ocean. I bet we will find weird life down there, I thought to myself, because these arguments that life can’t survive the lack of light and the intense pressure, they’re based on our very limited experiential base. After all, in terms of how life works, our N = 1. It can’t be right to be so sure.

So when I read critiques of the new culture we may be creating using all this internet stuff and mobile devices and Twitter and all the other things certain people like to make fun of, my ear is always listening for these unproven and unjustified assumptions. For example, almost all these critics assume that the good is self-evident and that the internet is displacing a wonderful tradition of knowledge, wisdom, and contemplation, offering very little of substance in return. Hmmm…I’m just not so sure about that, and I offer the following list of shaky assumptions that we should question fiercely and for which we should demand either evidential or logical proof.

  1. Heavy internet users have short attention spans and lack mental discipline. This is just plain silly, if you ask me. When I get deep into researching a topic on the internet, I have a very long attention span. And if I am traveling across many different topics, how is that proof of anything other than a curious disposition?
  2. Digital life is shallow. Says who? By what standard? Compared to what? Going to the movies? Watching old I Love Lucy reruns? Reading a thick economic treatise? And in any case, digital life itself is neutral. It’s the person who is shallow or not, if indeed we want to use this rather elitist formulation.  Even in the old analog culture, I never bought the line that going to the symphony is somehow culturally more significant than catching Bonnie Raitt at Wolf Trap. (I have to say I’ve even been to a Donnie Osmond concert in my lifetime….or was it the Osmond Brothers…the synapses misfire.)
  3. Slow is better than fast. You often hear the digital culture beaten up for its quick answers or its provision of instant gratification. But independent of all other values, such as accuracy, fairness, completeness, etc., there is nothing inherently bad about fast. In fact, fast will always, all other things being equal, be more efficient.
  4. Always “on” is bad. Prove it. As our societies and economies have become more complex, there are significant costs to periods of non-sentience. We may want to go back to an era of slower pace and tempo, but we can’t wish our way there. My experience as a manager is that organizations work best when they sustain momentum; there’s a favorite saying among managers: if you want something to get done, assign it to the busiest person on your team. And I actually believe that for many crackberry addicts, being constantly aware of the status of projects or other activities is actually less stressful than not knowing what is going on.
  5. Work based on reflection is better than immediate reactions. This is actually the ultimate argument of individuals who criticize the internet culture for being too fast or too persistent. And I think you have to admit that reflection has many advantages. Let’s unpack them. Reflection usually contributes to completeness and, in most cases, to accuracy. But reflection is at best neutral in terms of creativity; many argue for example that the best way to be creative is to generate as many ideas as possible without stopping to be judgmental. And there are certainly opportunity costs associated with reflection. As a CIA manager, I was always aware that one never had a monopoly on good ideas.  The longer you wait to propose a new way of looking at a problem, the greater the chance that some other entity will beat you to it. (Whether knowledge work should be competitive in the first place–now that’s another question.)
  6. Formal work is better than casual work. By formal, people usually mean work that has gone through some recognized quality control or expert process. Writing in a hurry is just not as elegant and good, goes the argument, as a carefully constructed essay. The arguments used in discussing reflection apply here as well. There are certain situations where formal work is obviously appropriate, but they are not as numerous as the critics would have you believe. And informality has many advantages in addition to immediacy. For example authenticity, directness, and, often, honesty.
  7. Correct spelling and grammar are essential for communication and indicate careful expression. Now I have sympathy for this position, because I would tell people whose work I was editing that the worst thing that could happen was for me to gain the impression that I, as the editor, was paying more attention to their piece than they ever did. Finding obvious typos was one of the events that would create that impression. But that said, I also reviewed many pieces that were impeccable in terms of spelling and grammar but deplorable when it came to logic or original thinking. So sometimes correct spelling and grammar indicate nothing more than that. The argument that spelling and grammar are essential for communication cannot be disputed. But special communication methods, such as the telegram, have always developed spelling and grammatical shortcuts that quickly became well understood. Twitter is just following in that tradition.
  8. It is more serious to do things by yourself than to do them in collaboration with others. Oh, for heaven’s sake! This can only be accepted as gospel by individuals enamored of the great-person theory of knowledge work.
  9. The internet is destroying literature. You’ve heard this. Nobody reads serious fiction any longer. Although I do believe classic forms of literature are threatened, I don’t buy the theory that the internet is at fault. It is probably more the fault of movies, television, DVDs, and video games. And in my view the real issue is that, compared to other, newer media for storytelling, the advantages of the novel just aren’t that apparent any longer.
  10. People who are playing Farmville on Facebook would otherwise be writing the great American novel or reading Proust. Please… (The game I like to play is Typing Maniac.)
  11. Most people don’t have anything interesting to say. This point is made peevishly in reaction to the fact that anyone now can blog or tweet. Again, I’ll concede that good writers of 500-word essays are not that common; but my experience, and I bet the experience of many others, is that lots of people actually do have something worthwhile to offer in the short form. Twitter and Facebook (and let’s not forget YouTube) are great democratizers of the public space and, if anything, are giving many the confidence to share their views with others. I’m darned if I can figure out why that is bad for a democracy. Now in your average dictatorship…
  12. Our current culture, which has taken millennia to develop, is better than any culture we could develop over the next ten years. At face value, that sounds pretty reasonable, but given the explosion of information, connectivity, and transparency, I’m not so sure we should concede even this point. Knowledge is doubling in many fields at a faster rate than the education cycle for those disciplines. I don’t know about you, but I’m putting my money on the future.