In this guest cross-post, Geoffrey West, former President of the Santa Fe Institute, argues that just as the industrial age produced the laws of thermodynamics, we need universal laws of complexity to solve the intractable problems of the post-industrial era, and that ‘big data’ needs such a ‘big theory’. For more on this topic, see David Hales’ guest post from February this year, ‘Lies, Damned Lies and Big Data’.

As the world becomes increasingly complex and interconnected, some of our biggest challenges have begun to seem intractable. What should we do about uncertainty in the financial markets? How can we predict energy supply and demand? How will climate change play out? How do we cope with rapid urbanization? Our traditional approaches to these problems are often qualitative and disjointed and lead to unintended consequences. To bring scientific rigor to the challenges of our time, we need to develop a deeper understanding of complexity itself.

What does this mean? Complexity comes into play when there are many parts that can interact in many different ways so that the whole takes on a life of its own: it adapts and evolves in response to changing conditions. It can be prone to sudden and seemingly unpredictable changes—a market crash is the classic example. One or more trends can reinforce other trends in a “positive feedback loop” until things swiftly spiral out of control and cross a tipping point, beyond which behavior changes radically.
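
To make the feedback mechanism concrete, here is a minimal sketch in Python (not from West’s post; the starting value, gain, and threshold are all illustrative) of a reinforcing loop that grows quietly for many steps and then overwhelms a tipping point:

```python
# Toy model of a positive feedback loop crossing a tipping point.
# The starting value, gain, and threshold are illustrative, not
# calibrated to any real market.

def simulate(x0=0.01, gain=1.5, tipping_point=1.0, steps=20):
    """Amplify the deviation x by `gain` each step (trends
    reinforcing trends) until it crosses the tipping point."""
    x = x0
    for t in range(steps):
        x *= gain  # the trend feeds on itself
        if x >= tipping_point:
            return t, x  # regime shift: smooth growth ends here
    return None, x

step, value = simulate()
print(f"tipping point crossed at step {step} (x = {value:.3f})")
```

The point of the toy model is the qualitative shape: the deviation looks negligible for a long stretch, then crosses the threshold within a few steps, which is why such transitions feel sudden.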

What makes a “complex system” so vexing is that its collective characteristics cannot easily be predicted from underlying components: the whole is greater than, and often significantly different from, the sum of its parts. A city is much more than its buildings and people. Our bodies are more than the totality of our cells. This quality, called emergent behavior, is characteristic of economies, financial markets, urban communities, companies, organisms, the Internet, galaxies and the health care system.

The digital revolution is driving much of the increasing complexity and pace of life we are now seeing, but this technology also presents an opportunity. The ubiquity of cell phones and electronic transactions, the increasing use of personal medical probes, and the concept of the electronically wired “smart city” are already providing us with enormous amounts of data. With new computational tools and techniques to digest vast, interrelated databases, researchers and practitioners in science, technology, business and government have begun to bring large-scale simulations and models to bear on questions formerly out of reach of quantitative analysis, such as how cooperation emerges in society, what conditions promote innovation, and how conflicts spread and grow.

The trouble is, we don’t have a unified conceptual framework for addressing questions of complexity. We don’t know what kind of data we need, how much of it, or what critical questions we should be asking. “Big data” without a “big theory” to go with it loses much of its potency and usefulness, potentially generating new unintended consequences.

When the industrial age focused society’s attention on energy in its many manifestations—steam, chemical, mechanical, and so on—the universal laws of thermodynamics came as a response. We now need to ask whether our age can produce universal laws of complexity that integrate energy with information. What are the underlying principles that transcend the extraordinary diversity, historical contingency, and interconnectivity of financial markets, populations, ecosystems, war and conflict, pandemics and cancer? An overarching predictive, mathematical theory would, in principle, capture the dynamics and organization of any complex system in a quantitative, computable framework.
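
West’s own work on biological scaling gives a taste of what such a law looks like. As a hedged illustration (the prefactor below is a rough textbook figure, used only to show the form), Kleiber’s law says that basal metabolic rate grows as roughly the 3/4 power of body mass, a single relation spanning mice to elephants:

```python
# Kleiber's law: basal metabolic rate scales roughly as mass^(3/4).
# The ~70 kcal/day prefactor is a rough textbook figure; the point
# here is the universal power-law form, not the exact numbers.

def metabolic_rate(mass_kg, exponent=0.75, prefactor=70.0):
    """Approximate basal metabolic rate in kcal/day for a mammal."""
    return prefactor * mass_kg ** exponent

for name, mass in (("mouse", 0.02), ("human", 70), ("elephant", 4000)):
    print(f"{name:>8}: ~{metabolic_rate(mass):,.0f} kcal/day")
```

Power laws of the same form, with different exponents, also appear in data on cities and companies, which is what encourages the search for principles that transcend any one system.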

We will probably never make detailed predictions of complex systems, but coarse-grained descriptions that lead to quantitative predictions for essential features are within our grasp. We won’t predict when the next financial crash will occur, but we ought to be able to assign a probability of one occurring in the next few years. The field is in the midst of a broad synthesis of scientific disciplines, helping reverse the trend toward fragmentation and specialization, and is groping toward a more unified, holistic framework for tackling society’s big questions. The future of the human enterprise may well depend on it.
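
As a minimal sketch of what such a coarse-grained statement could look like (the Poisson model and the annual rate below are illustrative assumptions, not anything the post asserts):

```python
import math

# Coarse-grained sketch: treat crashes as a Poisson process with a
# constant annual rate. Both the model and the rate are illustrative
# assumptions, not estimates of real market risk.
ANNUAL_RATE = 0.1  # hypothetical: one crash per decade on average

def prob_at_least_one(years, rate=ANNUAL_RATE):
    """P(at least one event within `years`) = 1 - exp(-rate * years)."""
    return 1 - math.exp(-rate * years)

for horizon in (1, 5, 10):
    print(f"P(crash within {horizon:>2} yr) = {prob_at_least_one(horizon):.2f}")
```

Such a model says nothing about when the next crash will occur, only how likely one is over a given horizon – exactly the kind of essential-features prediction described above.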

Comments

  1. […] trouble is, as Ben Ramalingam puts it, that “we don’t have a unified, conceptual framework for addressing questions of […]

  2. Dear Ben,

    Do we really need Big Theory to go with Big Data…and if we do, would it come with education? Access to information doesn’t mean that a person actually understands the immediate need for action for a future potential benefit. A few examples come to mind, and I think the simplest is tobacco – the information on the pack is simple to understand; it does say…this product will KILL YOU, yet people continue smoking because the potential benefit of quitting is not immediate. When we worked with the tobacco companies about their presence in Africa and their use of kids to sell tobacco, their response was…well, the kids will most likely die of something else before they would die of the potential effects of cigarette smoking. The same goes for our environmental footprint – when asked if they understood the impact of their daily habits on the environment, the majority of people responded yes; when asked if they would change their habits based on this knowledge, again the majority responded – NO.

    What made the biggest impact in our anti-smoking campaign was taxing tobacco – people finally saw an immediate impact of smoking…a higher cost!

    Aid delivery and development might be complex, but definitely not complicated. I live in both worlds, and it’s quite obvious that the “great divide” is the financial incentive that will force change.

    Information is abundant, but people often look at the same data differently. Data is biased…and political – one side benefits while the other loses! The sugar cane industry tells us it is creating jobs, but locals see the impact on their access to water and the loss of their agricultural base. People in these communities read information the way they read a restaurant menu – one side describes what will be on the plate and its potential nutrition, the other side lists the price. People don’t look at the potential benefit of what they could eat – they look at the right side of the menu…what can I afford?!

    All of this is to tell you that it’s not so complex…data is biased and political… I doubt that the process associated with the collection or analysis of data can ever be fully rational…it’s all plus (+) and minus (−)!! Which side of the data will you be on?!

    Saludos from Colombia……Luc Lapointe

  3. Ben, great post. I do think that there is a use for big data in combination with an appropriate theory. I am, however, skeptical about the prediction part. The problem with human systems – or complex adaptive systems in general – is that they adapt. Humans are susceptible to predictions: every prediction you make directly influences the decisions of the actors, either directly and consciously, or through unconscious effects like anchoring. With this, the prediction invalidates itself (or becomes self-fulfilling). You would have to keep a prediction secret for it to be of any value. That was already recognized long before (or after?) complexity theory by Hari Seldon, who predicted the future of the Galactic Empire but kept the predictions to himself.

  4. Ben, this, to me, is one of the most important and challenging issues facing organizations as they try to evolve toward post-industrial models. Not only is there so much complexity in Big Data that it defies predictability, there is so much ambiguity in it that it invites competing interpretations. And then we are stuck in the same old divisive and wasteful battles for the dominant narrative. I believe the key to spiraling out of this vicious cycle is story. I’m not talking about linear storytelling, which is Taylorist, deterministic and channel-oriented. I’m talking about a model developed by David Boje at New Mexico State University: quantum storytelling. This model deals in probabilities, accounts for uncertainty, and is network-oriented. Networks are not only awash in data, they are “story fields” that hold infinite possibilities for shaping the future. Our material intra-actions (we act on the field and it acts on us) optimize the probabilities of favorable outcomes across different time and space frames. The humanist scholar Karen Barad (who was trained as a quantum physicist and whose work is foundational to quantum storytelling) calls these intra-actions agential realism. My own response to the “infobesity” that plagues organizations swamped by competing interpretations of Big Data (and it is a fact that they are swamped): don’t look for meaning in the data; look for data in the meaning. Without Big Story, Big Data is meaningless!


About Ben Ramalingam

I am a researcher and writer specialising in international development and humanitarian issues. I am currently working on a number of consulting and advisory assignments for international agencies. I am also writing a book on complexity sciences and international aid, to be published by Oxford University Press. I hold Senior Research Associate and Visiting Fellow positions at the Institute of Development Studies, the Overseas Development Institute, and the London School of Economics.

Categories

Conflict and peace building, Economics, Evolution, Financial crisis, Innovation, Networks, Public Policy, Research, Science, Technology