Why Micah Zenko’s “Red Team” Lives in The Serendipity Economy

I was listening to a Here & Now interview with Micah Zenko, author of Red Team: How to Succeed By Thinking Like the Enemy. It struck me that many of the issues related to terrorism and cyber-attacks stem from how intelligence and security systems are managed: those who manage systems requiring security do so under industrial-age economic models rather than the more emergent model of The Serendipity Economy, a set of principles that those perpetrating the crimes tend to live under, even if they don't know it.

The Serendipity Economy was created to explain why it is impossible to determine the value of horizontal technology investments, in particular knowledge management and collaboration, ahead of their deployment. This is because much of the value of these systems is emergent. The threats we face in national security, personal security, and technology security also fit these principles. Here is how I see the principles of The Serendipity Economy aligning with the various threats we face.

  1. The process of creation is distinct from value realization. The creation of the tools of attack is significantly different from the deployment of those tools. There is nothing fundamentally wrong with conceiving a computer virus or a new improvised weapon. Those who combat these threats must be able to conceive of them in order to thwart them. The general social or economic factors that lead to or facilitate such an idea, or even the development of an idea, are not, in and of themselves, a threat. In The Serendipity Economy, conceiving an attack vector and deploying it are two different processes. The analog statement for threat development might be: The creation of a threat is distinct from realizing the threat.
  2. Value realization is displaced in time from the act that initiated the value. Rarely does the maker of a tool of terror or other attack vector create the item using industrial-like methods in which each item is launched at a target upon creation. What typically happens is that these items are created and then stored for later use. This parallels, for instance, the creation of a PowerPoint presentation that isn't given for weeks or months. Analog: The realization of a threat is displaced in time from the act of creating the threat.
  3. Value is not fixed and cannot be forecasted. This principle generally looks at Serendipitous Activity and states that at the onset of any serendipitous activity no one can determine what value will be derived from it. Say you suggest something in a social network. You have no idea who will read it, comment on it, perhaps be inspired by it, or inspired by you posting it, and what, if anything, will come of that activity. No standard expresses a serendipity unit—the outcome of serendipity ranges from zero to an arbitrarily large number. In terrorism and hacking, this principle applies to those developing the threats, in that they can't know until they are deployed what havoc they will wreak, if any. Analog: The impact of a threat is not fixed and cannot be forecasted. (Note: These first three principles offer a stark contrast between asymmetrical warfare and traditional industrial warfare. In World War II, for instance, the production of airplanes, rifles, or guns could be thought of, at least in the aggregate, as being pointed at a particular strategic objective. It would take X amount of hardware and person-power to overcome the enemy. In asymmetrical warfare, it is unknown what is required to overcome the enemy, regardless of which side defines the term enemy. While the ultimate goal may be the destruction of the other side, the approach is emergent, opportunistic, and compartmentalized. We have no idea who will take up making an improvised weapon, and if they make one, where it will be set off.)
  4. The measure of value requires external validation. This Serendipity Economy principle applies to both those creating threats and those guarding against them. As Zenko states in his interview, "you can't grade your own work." Those deploying countermeasures or evaluating threat assessments do so with assumptions and biases in place that move them toward not seeing what they need to see. The industrial economy mentality of streamlining and creating efficiencies is diametrically opposed to the emergent and chaotic goals of terrorists and hackers. Built-in measurements look for the things to be there that we believe should be there, and for very specific things that should not. In fact, it is the precision and mechanical, nearly repetitive nature of the safeguards that makes them so easy to overcome. In the end, those internal measures don't matter. What matters is the external validation that the system has not been compromised. For the attackers, the reaction that an attack evokes determines its effectiveness (going back to principle three, neither of those future states can be determined at the onset). Analog: The impact of a threat requires external validation.
  5. Looking at a network in the present cannot anticipate either its potential for value or any actual value it may produce. Looking at a collaborative network at any point in time will not reveal the value of future collaboration or the value of collaborations currently underway. This principle translates easily to populations and threats. Looking at any population will not yield any insight into the potential for a threat. There may be statistical probabilities that a threat will occur, which is more specific than can be provided for collaborative work activity. While it may be possible to say that for every 1M people X number of terrorist activities will occur, we cannot, for instance, say that for that same population, X number of innovative technology breakthroughs will occur. Applying probability to threats does not alter the principles because it offers no actual predictive, actionable insight. A probability, even if it proves true, saves no lives, nor does it protect assets. In the most recent attacks in Paris, there was no knowing that ISIS would target Paris next. There might be a high probability of a Western-facing attack, but the sheer number of elements in play makes it impossible to predict the next move. Analog: Looking at a network or population in the present cannot anticipate either its potential threat or any actual damage it may inflict.
  6. Serendipity may enter at any point in the value web, and it may change the configuration of the web at any time. This principle is particularly interesting, for instance, when examining radicalization. There is no way to anticipate when someone will become radicalized or radicalized enough to act on their changed mindset. Even further, once that person has become radicalized there is no way to anticipate how their new mindset will affect whatever communities (networks) to which they belong. In collaboration, we look to new entrants to find value in existing content that hasn’t been discovered before. A person serendipitously discovers an insight, previously overlooked, and because of unique skill or ability, transforms previously inert content into something of value. Radicalization transforms previously inert people into potential triggers or weapons, not for value, but for destruction. Analog: Serendipity may enter at any point in a network, and it may change the configuration of the network at any time.

These are just initial thoughts, but they provide a new perspective and, potentially, a language for thinking about emergent threats that may help avoid the biases, and the solidification of assumptions, that can lead to the following situation described in Zenko's book:

Though we would now refer to this derisively as “going native” or “clientism”—whereby people become incapable of perceiving a subject critically after years of continuous study—any honest employee or staffer should recognize this all-pervasive phenomenon that results in organizational biases. This is particularly prominent in jobs that require deep immersion in narrow fields of technical or classified knowledge, and those that are characterized by rigid hierarchical authority—the military is a clear example. Taken together, these common human and organizational pressures generally prevent institutions from hearing bad news, without which corrective steps will not be taken to address existing or emerging problems.

Rather than looking for efficiencies, our enemies are looking to create chaos, which can be a source of serendipity. Understanding serendipity then may be a key to better adapting to the asymmetrical warfare we face.

Daniel W. Rasmus

Daniel W. Rasmus, Founder and Principal Analyst of Serious Insights, is an internationally recognized speaker on the future of work and education. He is the author of several books, including Listening to the Future and Management by Design.
