Monthly Archives: May 2010

There is Nothing So Weak as an Idea whose Time Has Not Yet Come

Readers of this blog (there are some out there, right?) have probably figured out by now that I am a sucker for pithy sayings. Yes, I am shallow enough to find meaning in short statements that appear to capture something truthful–others might say short statements that capture something obvious. I’ve organized many of them in my Lessons from a CIA Manager page. But yesterday, while talking to someone preparing material for the Business Innovation Factory, where I am scheduled to tell my story at their Collaboration Innovation Summit in September, I was led to recall one of the most meaningful guiding principles/obvious statements I’ve encountered: There is nothing so weak as an idea whose time has not yet come. This guidance has such wide applicability that it can’t be restricted to the management lessons category.

Now when I decided to write on this topic I googled the phrase to see if I could track down its origins. The phrase “an idea whose time has come” is of course everywhere and appears to have been popularized by the French novelist Victor Hugo, who is credited variously for having written:

All the forces in the world are not so powerful as an idea whose time has come.
An invasion of armies can be resisted, but not an idea whose time has come.
Greater than the tread of mighty armies is an idea whose time has come.
Nothing else in the world… not all the armies… is so powerful as an idea whose time has come.
One can resist the invasion of an army but one cannot resist the invasion of ideas.
There is one thing stronger than all the armies in the world, and that is an idea whose time has come.

Martin Luther King, Jr. quoted Victor Hugo on this point when he accepted the Nobel Peace Prize. The phrase ‘an idea whose time has passed or gone’ is also popular. But I did not find, at least not on the first few pages of the Google search results, any reference to the inherent weakness of an idea whose time has not yet come. My memory, which I fear is becoming increasingly unreliable, tells me I first heard the phrase 20 years ago while listening to a show on the BBC’s Radio 4. (I lived in England in the early 90s and was quite devoted to all the talk shows and introspection celebrations that aired on Radio 4.) The phrase was used by the narrator in a show discussing the experience of British colonial administrators in Africa. As soon as he said it, my brain BOINGed, and I’ve been devoted to the principle ever since. (The entire show was fascinating for the light it shone on the British colonial experience in Africa and its implications. I remember particularly one fellow who, as a sergeant (as I remember), ruled a large swath of Ghana during the 1950s. I tried to understand the impact such an experience must have had on the British psyche: to be British meant that you had some right, perhaps even obligation, to guide the rest of the world, and really you needed no other qualification than to be British. The experience of the United Kingdom during the last century does show that a great power can make the transition, more or less elegantly, from superpower to member in good standing of the neighborhood watch, which should give some hope to Americans who are beginning to grapple with this possible future for the US in this century. But I digress….)

There is nothing so weak as an idea whose time has not yet come. If you have the good or bad fortune (you decide) of  being able to see, to imagine how things can be different, to make out the fuzzy outline of the new amidst the noise of the current state, then you have  experienced the frustration of  trying to popularize a concept for which no one else is ready. The interviewer from the Business Innovation Factory reminded me of this when she quoted from a rebuttal that a colleague made almost ten years ago to a piece I had written on the need for intelligence reform in the journal Studies in Intelligence. Let me quote a paragraph that makes his point and illustrates the weakness of premature innovation (the acronym DI refers to the analytic directorate of the CIA):

Claims of dramatic shifts in large systems, whether the environment, a national economy, or a US government agency, always need to be viewed with some skepticism. Systems do not change overnight, especially those affected by some of the more immutable traits of human nature. Medina’s claims about a new environment of information abundance radically altering policymaker needs are overstated. They echo much of the “new economy” thinking that, as good as it sounds, is increasingly unconvincing as it has been put into practice. Not too long ago, The Washington Post ran a series of articles on “The Rise and Fall of Michael Saylor,” the Microstrategy chief who became a multi-billionaire, then watched his wealth and his company collapse after bad accounting practices took the luster off his vision of how to handle the new environment of information abundance.4 The series reminds us that untested theories, especially when presented in glowing terms to excite the imagination of investors and managers, often promise more than can be delivered and more than, in practice, anyone wants.5 The DI, like many corporations, already has a good and useful product. When consultants and others come to us saying that everything has changed and so must we, the proper response before investing significant resources ought to be “prove it.”

Well I can’t really disagree with anything the author wrote many years ago. Yet I know in my heart (and my brain) that if organizations wait for the need for change to be obvious, it becomes White Rabbit time–too late.  It is a fact that many ideas for how to do the new are wrong. But it is also a fact that ALL existing ideas for how to do things, whatever the field, however small or complex, are eventually replaced by a better way of doing things. So I guess it all comes down to timing.

This then is one of the many dilemmas of innovators. Innovators have to balance the “earliness” of their ideas against their “effectiveness”. Too early and you might as well be speaking to yourself. Too late and you have sacrificed your effectiveness and failed your group, organization, and mission. Decisionmakers in organizations must have a process to harvest ideas so that they can incubate, be protected from the perils of premature delivery, and thus eventually reach their maturity–their time of acceptance. Innovators, who too often are blinded tactically by falling in love with their ideas, need to have a long-term approach to the marketing of their concepts. It is just immature to expect immediate, widespread acceptance of your new ideas, and just as immature to give up at the first opposition.

So let me turn again to Victor Hugo for an appropriate last word: “Each man should frame life so that at some future hour fact and his dreaming meet.”


Leadership and Disappointment

I noticed this morning that someone reached my blog by searching on the terms “leadership and disappointment.” No doubt they found my page on Lessons from a CIA Manager, where Lesson 12 quotes Ron Heifetz on his insight that leadership is disappointing your followers at a rate they can tolerate. But I think there is much more to say on the subject. (When I searched on leadership and disappointment, I ran across this blog by a pastor who is also writing about the Heifetz leadership principles.)

Heifetz of course is talking primarily about the disappointment caused when leaders take their followers in a direction they may never have thought of going and, even harder, to a place they do not want to be. But being a leader is also about constantly and personally dealing with the emotion of disappointment. Being a leader–and I’m talking specifically here about the role of the leader as the agent of change–means living through long periods of disappointment which, if you’re lucky, are punctuated by occasional moments of giddy success.

What are the different types of disappointment a leader is likely to experience?

The Kneejerk Negative: I know you’ve lived this innumerable times. You start explaining an idea you have about how to see a situation from a different perspective or change a process, and several people in the audience immediately start shaking their heads and telling you that’s not right and you’re wrong. Now you know, given the time you’ve devoted to this idea, thinking about it, considering the pros and cons, that it’s almost certain the naysayers are basing their comments on nothing but immediate and visceral reactions. Once those reactions occur, however, good luck in trying to return the discussion to a more measured approach.

The God, You’re Brilliant: The opposite of the Kneejerk Negative but really just as bad. Again you’re recommending a change or improvement agenda, and the sycophants immediately accept it just because of your authority position. Those you can handle, but the more problematic group are the naive enthusiasts who underestimate the implementation and acceptance hurdles, disrespect the thoughtful concerns of others (“I don’t understand how they can be so stupid”), and turn off fence-sitters with their excessive euphoria.

The I Was a Coward in that Meeting: Unless you bull rush your way through organizations, which is, I would contend, just about impossible given the physics of change, you will, as a leader interested in facilitating change, always be carefully trading off when to be aggressive against when to be conciliatory and/or indirect. You will mess up that calculation on a regular basis and walk away from many a meeting knowing you could have done more to advance your argument if you had been aggressive with your convictions.

The I Blew It: The existential disappointment: when you realize you’ve been wrong. You will be wrong; change is a risky endeavor. Even if your ideas are structurally correct–and they won’t always be–just the challenge of implementation will inject messiness and error. This is why the Kneejerk Negative reaction of so many of your colleagues is so damaging to the health of your group and its mission. A considered conversation on what to do next always gives you the best odds for improvement.

The I Never Thought You’d Disappoint Me: I had a colleague, technically someone who worked for me, say that to me once. Although disappointing your followers is tough, disappointing the individuals in your organization who are actually your allies, now that hurts. And it’s guaranteed that you will come to that point, particularly if you’re nailing down the last couple of compromises with the skeptics that will allow the change effort to go forward. The successful leader of change in an organization will never be radical enough in her implementation to satisfy the true believers.

The Someone Else Takes the Credit: This requires no additional explanation and is the cousin of…

The This Certainly Didn’t Help my Career: I hosted an intern at work one summer, a very intelligent fellow, who asked me why I was always suggesting ways of improving the work of the CIA, or at least things I thought would help. “Does it benefit your career?” he asked. Cue Hollow Laughter. This disappointment has the potential to turn into bitterness and cynicism. You must fight this tendency with all your mojo, because it will in fact kill your motivation and sour your intentions.

So what’s a change agent to do? I once got a piece of advice from someone I consider a guardian angel of sorts, a total stranger, who told me at a function that, as a reformer, I needed to understand I was always going to feel uncomfortable in an organization. For my own health and sanity, I needed to accept that feeling of discomfort. And in fact, the best scenario would be to come to actually like feeling uncomfortable, because that feeling indicated fidelity to my convictions.

The guardian angel was correct. There is no other way to survive.

What is your Twitter Diversity Score?

How do we, and particularly those of us who are knowledge workers, expose ourselves to different views and perspectives? I’m asking this as a very practical question. I’m currently reading Amartya Sen’s The Idea of Justice, and one of its most important concepts is that justice should represent fairness, and that in fact justice and fairness are separate concepts. (Except that many languages, including French, do not have separate words for justice and fairness.) But it occurred to me as I was reading that it is impossible to be fair if you are not aware of all possible views on a subject. (Now, of course, knowing all views probably is itself impossible for most complicated subjects…there would be a long tail of views that defied comprehension, so then you immediately get involved in deciding which views are relevant, etc., which puts you in another miasma of subjectivity. And this is one of the main reasons why I’ve always been rather dubious about the wisdom of authority and institutions, but I digress…)

To return to the main topic, diversity of thought is key to attaining justice and fairness in societies. And how can we hope to achieve such diversity of thought? Well, of course, one way is through the use of TWITTER. Roger Schank, a leading thinker in education and artificial intelligence, just wrote a piece in eLearn magazine on how Twitter  is capable of changing the very nature of what it means to learn from your peers.  Absolutely! The magic of Twitter is in how it has become a perpetual learning machine. But the question remains, is my Twitter stream diverse?

So I decided to do a manual check this morning of the people I follow. I only follow about 200 people so it wasn’t too hard to do it by hand, although admittedly my analysis is extremely primitive. (Is there a program that lets you analyze the diversity of your network? I just tried Google’s FollowFinder, which helps me find more people who are like the ones I already follow. I want the opposite. I want someone to analyze my network and provide me with links to people who are in the same intellectual domains but have different perspectives. Then I would like a tool that shows me other intellectual disciplines I should follow.) In any case here are the results of my Twitter Diversity Inventory. N=200 (Yes, I know the numbers don’t actually add up…I took several shortcuts which I can explain if you’d like…but the numbers are generally truthy.)

Companies/Groups: 67
Male: 68
Female: 55
American: 122
Non-American: 25
Northern European: 129
Not northern European: 19
Internet/Social Media/IT Experts: 64
Other Topics/Disciplines/Interests: 77

Bottom line: I’m not very satisfied. Normalizing to a 100-point scale, more or less, my diversity score for American/non-American and northern European/non-northern European is below 20. That can’t be good. It must lead to personal blind spots in how I think about many subjects. The next step is to figure out how to fix this. More to come….
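For anyone who wants to replicate the inventory, here is a minimal sketch of the arithmetic. The scoring formula is my reconstruction (the post doesn’t spell it out): take the smaller category’s share of each pair, normalized to a 100-point scale. It reproduces the below-20 geography scores.

```python
def diversity_score(count_a: int, count_b: int) -> float:
    """Share of the smaller category in the pair, on a 0-100 scale."""
    return 100 * min(count_a, count_b) / (count_a + count_b)

# Counts from the Twitter Diversity Inventory above
pairs = {
    "Male / Female": (68, 55),
    "American / Non-American": (122, 25),
    "Northern European / Not northern European": (129, 19),
}

for label, (a, b) in pairs.items():
    print(f"{label}: {diversity_score(a, b):.1f}")
```

Under this formula the gender score comes out around 45, while the two geography scores land at 17.0 and 12.8, both under 20, consistent with the bottom line.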

What is your Stupification Point?

Malcolm Gladwell has a piece in this week’s New Yorker on the nature of espionage and asks some very penetrating questions about the psychology of the business: essentially once you’re in the hall of mirrors is there anything or anyone you can really trust or accept at face value?  It’s very much worth reading and also an amusing read, because Gladwell makes his points while reviewing what looks like a really fun book on the WWII exploits of the British intelligence service, Operation Mincemeat.

But what I really thought was worth sharing were some more overarching points about the business of intelligence or sensemaking. (I really don’t like to use the term intelligence because I think it has too many negative or at least questionable connotations.) Gladwell notes the point made by political scientist Richard Betts that in intelligence analysis there tends to be an inverse relationship between accuracy and significance. Boy, does that ring true, although I would just generalize Betts’s point by applying it to just about all knowledge activities. We almost always can be most specific about that which is least significant. This actually relates to the phenomenon of attaching disproportionate importance to activities you can count. To wit: When trying to fix something, as managers we tend to concentrate our efforts on the parts of the process we understand well, even though those parts may not really be what are causing the problems. I’m sure you’ve  suffered through this in your organization. Some large problems are identified but you and all your coworkers know intuitively that the solutions offered–often rolled out to great huffing and puffing–just don’t tackle root causes.

Gladwell also points to the work of Harold Wilensky, a professor emeritus at the University of California at Berkeley who has done some groundbreaking work over his career but whose book, Organizational Intelligence, which Gladwell quotes from, appears to be out of print.

As Harold Wilensky wrote in his classic work “Organizational Intelligence” (1967), “The more secrecy, the smaller the intelligent audience, the less systematic the distribution and indexing of research, the greater the anonymity of authorship, and the more intolerant the attitude toward deviant views.” Wilensky had the Bay of Pigs debacle in mind when he wrote that. But it could just as easily have applied to any number of instances since, including the private channels of “intelligence” used by members of the Bush Administration to convince themselves that Saddam Hussein had weapons of mass destruction.

I’ve been searching the internet all morning for more on the book Organizational Intelligence, because anyone who made the wonderful observation above has got to have more to offer. Sadly, I can’t find it, although, as is the way of the internet, I was next-linked to this very nice presentation by Richard Veryard. He asks a wonderful question: what stupifies your organization?

Each organization has its particular form of stupidity–it is up to the consultant (or the above-average manager) to recognize the way that stupidity manifests itself and to find a way of doing something about it.

I would just note that organizational culture is probably the number one factor that stupifies organizations.

This presentation is chock-full of gems. “Stupidity is not making errors. Stupidity is repeating them.” I also love his discussion of the Algebra of Intelligence. Intelligence is not arithmetical: “lots of intelligent pieces doesn’t add up to an intelligent organization.”

So to summarize, what did I learn in the last 24 hours about intelligence (sensemaking), organizations, and networks?

  1. Closed networks have a hard time determining if what they know is really significant. (In part because determining significance invariably requires perspective and context, which can only be gained from a vantage point. Closed networks lack vantage points.)
  2. The smaller the network, the less room it will have for diversity. (So, a diversity solution that is self-contained is no diversity solution at all.)
  3. The smaller the network, the less it can tolerate differences of opinions.
  4. Every network has stupification points. You must constantly be hunting for and eliminating them or they will destroy you.
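Point 2 can be illustrated with a toy model (my own back-of-envelope construction, not from Wilensky or Veryard): if each member’s perspective is drawn independently from k equally common viewpoints, the expected number of distinct viewpoints present in a network of n members is k · (1 − (1 − 1/k)^n). Small networks simply cannot contain much diversity.

```python
# Toy model: members' perspectives drawn uniformly at random from
# k equally common viewpoints. Expected number of distinct viewpoints
# present in a network of n members: k * (1 - (1 - 1/k)**n).
def expected_distinct_views(n: int, k: int = 10) -> float:
    return k * (1 - (1 - 1 / k) ** n)

for n in (3, 10, 50, 200):
    print(f"n={n}: {expected_distinct_views(n):.1f} of 10 viewpoints")
```

With ten possible viewpoints, a three-person network is expected to hold fewer than three of them, and even a fifty-person network still misses some entirely; diversity of thought has to be imported deliberately, not assumed.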

About Airline Fees, the Washington Nationals, and Dan Snyder

Economics has never been my strong point. In fact, anything mathematical leaves me clueless. I was once asked to do a job at the CIA that essentially amounted to editing the work of economists, and it was the job I felt most insecure doing. (Although I soon learned that most economists, like most–but not all–deep or single-threaded experts, need help in putting things in perspective and in understanding the noneconomic consequences of their findings.)

But it has struck me this week that there is an unhealthy trend manifesting itself in American companies that does not bode well for the vibrancy of the US economy. It is the explosion in using fees to bolster profitability, rather than doing things to make money the old-fashioned way, by earning it through increased productivity, efficiencies, and innovations.

This hit me quite personally yesterday when I went online to buy tickets for a Nationals game this weekend. As I started to check out, I realized I was being charged almost $15 in fees for the transaction, including a ridiculous fee–I believe it was $1.75–to print the tickets using my own equipment at home. What is this, I thought?! I’m being charged almost 15% in fees for online transactions that, by the way, have minuscule marginal costs. I immediately cancelled the transaction, and, as I was subwaying downtown anyway for a meeting, just wandered by the stadium and got my tickets there at face value.

But if you’re paying attention, you know that it’s not just baseball teams that are garnering additional revenues through the imaginative use of fees. Credit card companies were really champion performers here, at least until recent legislation aimed at tempering them was passed. Airlines, of course, are on a fee rampage. They made $8 billion in revenues last year from fees. I couldn’t easily find an overall revenue number to compare that to, but for at least one airline the income from fees accounted for 20% of its total revenues. (By the way, we really need to get on the media for so often not providing context for the numbers they use. That is just sloppy journalism and analysis. We are often told that X activity is going to cost [insert here some really scary number], but are rarely told what percentage that number represents of total revenues or expenses. This is not an insignificant issue, because it is this very lack of context that obfuscates the real meaning of most developments. But I digress…)

And if you want to read about a really hideous example of using fees diabolically, read this story about the beloved owner of the Washington Redskins, Dan Snyder. It just sickens you.

And it also points out the real problem with American companies getting on the fees bandwagon. You’ll notice in the article that the company Dan Snyder was running, Six Flags, went bankrupt anyway despite its excessive use of fees. Companies seem to be using fees in lieu of other and much healthier ways of increasing revenues, such as through greater efficiency or innovation. As managers and leaders, we need to rise to the challenge of doing our missions better and not sink to employing clever machinations to compensate for substandard results.

You may not think, as the manager of a small group, for example, that you collect fees rather than fix and innovate. But you may very well collect fees; they are just not monetary. Every time you impose some new control step in your process to guard against a recurring error committed by a few, you are collecting a fee and avoiding dealing with the root causes of undesirable performance. Managing by rule-making is essentially managing through non-monetary fees. And just as reliance on fees threatens to stifle innovation in American companies, managing through rules does the same for any small group effort.

Another Commercial Break

A piece I’ve been working on concerning the future the government needs to start preparing for was just published on the Center for American Progress website. You can check it out here if you’re interested.

A Commercial Break

I’m going to be speaking at the Gov 2.0 Expo in D.C. at the end of May. My talk is the afternoon of 26 May. If any of you are interested in attending and haven’t registered yet, I’ve been given a code that purportedly provides you all a 40% discount on registration. Here it is: gxp10sbx. In addition, early registration closes 5 May, this Wednesday. The message I received from the coordinator of the speakers doesn’t make clear whether the code applies on top of the early registration discount, but I guess no harm in trying.

My talk takes the principles of social networks and applies them to the work of high reliability/high risk organizations–you know, the ones that usually say their work is too difficult or too important to entrust to collaborative work practices. Like the one I used to work for. Of course, the argument is completely the reverse. Such silliness.

Twelve Stupid Things People Say about the Internet

I remember when I was a kid people  would always say that to find life elsewhere in the universe, we had to look for carbon, because life was carbon-based. And I remember thinking, probably as a 10-year old, well who says that all life has to be carbon-based? Can’t we imagine a different kind of life? And people would say, no, that’s wrong, except now it isn’t so wrong to think that way. And I thought the same thing about the chances of finding life in the deep, deep ocean. I bet we will find weird life down there, I thought to myself, because these arguments that life can’t survive the lack of light and the intense pressure, they’re based on our very limited experiential base. After all, in terms of how life works, our N = 1. It can’t be right to be so sure.

So when I read critiques of the new culture we may be creating using all this internet stuff and mobile devices and Twitter and all the other things certain people like to make fun of, my ear is always listening for these unproven and unjustified assumptions. For example, almost all these critics assume that the good is self-evident and that the internet is displacing a wonderful tradition of knowledge, wisdom, and contemplation, offering very little of substance in return. Hmmm…I’m just not so sure about that, and I offer the following list of shaky assumptions that we should question fiercely and for which we should demand either evidential or logical proof:

  1. Heavy internet users have short attention spans and lack mental discipline. This is just plain silly, if you ask me. When I get deep into researching a topic on the internet, I have a very long attention span. And if I am traveling across many different topics, how is that proof of anything other than a curious disposition?
  2. Digital life is shallow. Says who? By what standard? Compared to what? Going to the movies? Watching old I Love Lucy reruns? Reading a thick economic treatise? And in any case, digital life itself is neutral. It’s the person who is shallow or not, if indeed we want to use this rather elitist formulation.  Even in the old analog culture, I never bought the line that going to the symphony is somehow culturally more significant than catching Bonnie Raitt at Wolf Trap. (I have to say I’ve even been to a Donnie Osmond concert in my lifetime….or was it the Osmond Brothers…the synapses misfire.)
  3. Slow is better than fast. You often hear the digital culture beaten up for its quick answers or its provision of instant gratification. But independent of all other values, such as accuracy, fairness, completeness, etc, there is nothing inherently bad about fast. In fact, fast will always, all other things being equal, be more efficient.
  4. Always “on” is bad. Prove it. As our societies and economies have become more complex, there are significant costs to periods of non-sentience. We may want to go back to an era of slower pace and tempo, but we can’t wish our way there. My experience as a manager is that organizations work best when they sustain momentum; there’s a favorite saying among managers: if you want something to get done, assign it to the busiest person on your team. And I actually believe that for many crackberry addicts, being constantly aware of the status of projects or other activities is actually less stressful than not knowing what is going on.
  5. Work based on reflection is better than immediate reactions. This is actually the ultimate argument of individuals who criticize the internet culture for being too fast or too persistent. And I think you have to admit that reflection has many advantages. Let’s unpack them. Reflection usually contributes to completeness and, in most cases, to accuracy. But reflection is at best neutral in terms of creativity; many argue for example that the best way to be creative is to generate as many ideas as possible without stopping to be judgmental. And there are certainly opportunity costs associated with reflection. As a CIA manager, I was always aware that one never had a monopoly on good ideas.  The longer you wait to propose a new way of looking at a problem, the greater the chance that some other entity will beat you to it. (Whether knowledge work should be competitive in the first place–now that’s another question.)
  6. Formal work is better than casual work. By formal, people usually mean work that has gone through some recognized quality control or expert process. Writing in a hurry is just not as elegant and good, goes the argument, as a carefully constructed essay. The arguments used in discussing reflection apply here as well. There are certain situations where formal work is obviously appropriate, but they are not as numerous as the critics would have you believe. And informality has many advantages in addition to immediacy. For example authenticity, directness, and, often, honesty.
  7. Correct spelling and grammar is essential for communication and is an indication of careful expression. Now I have sympathy for this position because I would tell people whose work I was editing that the worst thing that could happen was for me to gain the impression that I as the editor was paying more attention to their piece than they ever did. Finding obvious typos was one of the events that would create that impression. But that said, I also reviewed many pieces that were impeccable in terms of spelling and grammar but deplorable when it came to logic or original thinking. So sometimes correct spelling and grammar indicates nothing more than that.  The argument that spelling and grammar are essential for communication cannot be disputed. But special communication methods, such as the telegram for example, have always developed spelling and grammatical shortcuts that quickly became well understood. Twitter is just following in that tradition.
  8. It is more serious to do things by yourself than to do them in collaboration with others. Oh, for heaven’s sake!! This can only be accepted as gospel by individuals enamored of the great person theory of knowledge work.
  9. The internet is destroying literature. You’ve heard this. Nobody reads serious fiction any longer. Although I do believe classic forms of literature are threatened, I don’t buy the theory that it is the internet’s fault. Actually, it is probably more the fault of movies, television, DVDs, and video games. And in my view the real issue is that, compared to other, newer media for storytelling, the advantages of the novel just aren’t that apparent any longer.
  10. People who are playing Farmville on Facebook would otherwise be writing the great American novel or reading Proust. Please…(the game I like to play is Typing Maniac.)
  11. Most people don’t have anything interesting to say. This point is made peevishly in reaction to the fact that anyone now can blog or tweet. Again, I’ll concede that good writers of 500-word essays are not that common; but my experience, and I bet the experience of many others, is that lots of people actually do have something worthwhile to offer in the short form.  Twitter and Facebook –and let’s not forget YouTube–are great democratizers of the public space and, if anything, are giving many the confidence to share their views with others. I’m darned if I can figure out why that is bad for a democracy. Now in your average dictatorship…
  12. Our current culture, which has taken millennia to develop, is better than any culture we could develop over the next ten years. At face value, that sounds pretty reasonable, but given the explosion of information, connectivity, and transparency, I’m not so sure we should concede even this point. Knowledge is doubling in many fields at a faster rate than the education cycle for those disciplines. I don’t know about you, but I’m putting my money on the future.