Limiting the Complexity of Complex Systems

It’s widely known that the financial system is very complex. Even just the credit system, on its own, is complex:


One question we might ask is “is the global financial system complex?” The answer is obvious: yes. But there’s a different question I’ve been wondering about for some time: “how complex is it?” That is, is there a way to quantify, in some meaningful way, the complexity of the financial system?

A couple of years ago some CS theorists started analyzing one specific class of complex financial products: collateralized debt obligations. (The latest update of the paper is here.) Their primary focus is the asymmetry of information between the buyer and the seller: the fact that the seller knows which of the underlying debts are bad and which are good, whereas the buyer may not. What they prove is that the seller can assemble the CDOs in such a way that it is computationally intractable for a buyer to tell whether the CDOs they’re buying are junk or not. Worse, they show that in many models such tampering may be intractable to detect even after the fact, making these products difficult or impossible to regulate; as they put it at the time:

Would a lemons law for derivatives (or an equivalent in terms of standard clauses in CDO contracts) remove the problems identified in this paper? The paper suggests a surprising answer: in many models, even the problem of detecting the tampering ex post may be intractable.

While their results are fascinating, the complexity issues they address are of the computational-complexity variety rather than the systemic variety. That is, they looked at the downsides of the complexity of a specific financial product, not the downsides of the complexity of the whole financial system.
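To make the information asymmetry concrete, here is a toy sketch in Python of the general setup (not the paper’s actual construction): an honest seller spreads the known lemons across pools at random, while a dishonest one quietly concentrates them in a few pools. All sizes and names are invented, and the lemon_share check below only works because the seller knows which assets are lemons; the paper’s point is that a buyer, who doesn’t, faces a detection problem believed to be computationally intractable.

```python
import random

# Toy sketch of the information asymmetry (NOT the paper's construction).
# A seller pools assets into CDO-like pools. An honest seller assigns the
# known lemons uniformly at random; a dishonest seller quietly concentrates
# them in a few target pools. All sizes and names here are illustrative.

def build_pools(n_assets=1000, n_lemons=100, n_pools=50, pool_size=60,
                planted_pools=5, dishonest=False, seed=0):
    rng = random.Random(seed)
    lemons = set(rng.sample(range(n_assets), n_lemons))
    good_assets = [a for a in range(n_assets) if a not in lemons]
    pools = []
    for p in range(n_pools):
        if dishonest and p < planted_pools:
            # Over-represent lemons in a handful of pools.
            picked = rng.sample(sorted(lemons), pool_size // 2)
            picked += rng.sample(good_assets, pool_size - len(picked))
        else:
            picked = rng.sample(range(n_assets), pool_size)
        pools.append(picked)
    return pools, lemons

def lemon_share(pool, lemons):
    # Only the seller, who knows `lemons`, can run this check directly;
    # the buyer sees pool membership but not which assets are lemons.
    return sum(1 for a in pool if a in lemons) / len(pool)

honest_pools, lemons_h = build_pools(dishonest=False)
rigged_pools, lemons_r = build_pools(dishonest=True)
print("worst pool, honest seller   :", max(lemon_share(p, lemons_h) for p in honest_pools))
print("worst pool, dishonest seller:", max(lemon_share(p, lemons_r) for p in rigged_pools))
```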

It’s this latter issue I’ve been trying to understand better. I’ve been on the lookout for ways to analyze and understand systemic complexity, but haven’t found much well-established work. (Not being an economist, it’s quite possible I’m simply unaware of well-known approaches here.) My hope was that something like the Ratnasamy complexity metric, developed to analyze the complexity of network protocols and distributed computer systems, could be applied to systems like the credit system above, and would let economists and regulators study precisely the complexity of the systems that succeed as well as those that fail. The metric yields an asymptotic complexity measure that attempts to capture how many pieces of information are manipulated, by how many parties, how many times. That measure happens to align with intuitive notions of complexity, and seems to map naturally onto systems like the credit system depicted above.

By analyzing the complexity of a financial (sub)system in this way, it might be possible to identify best practices (i.e. match the systems that work well against their measured complexity to see if and how the two are correlated), reduce complexity, and avoid future global economic meltdowns. That is, we could establish complexity limits beyond which regulatory bodies are allowed to step in and decrease system complexity.
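To make the flavor of such a tally concrete, here is a rough sketch in Python, assuming my reading of the metric is right: it simply counts how many pieces of information are handled, by how many parties, how many times, over a modeled credit flow. The real metric is asymptotic and far more careful, and the parties, items, and events below are invented purely for illustration.

```python
from collections import defaultdict

# Rough sketch: tally how many pieces of information are handled, by how
# many parties, how many times, over a modeled flow. The parties, items,
# and events below are invented for illustration only.

def complexity_summary(events):
    """events: iterable of (party, item) pairs, one per handling step."""
    parties, items = set(), set()
    touches = defaultdict(int)  # item -> number of times it is handled
    for party, item in events:
        parties.add(party)
        items.add(item)
        touches[item] += 1
    return {
        "parties": len(parties),
        "items": len(items),
        "handling_steps": sum(touches.values()),
        "max_touches_per_item": max(touches.values(), default=0),
    }

# A crude model of one mortgage moving through a securitization chain.
credit_flow = [
    ("borrower", "mortgage"), ("originator", "mortgage"),
    ("originator", "credit_score"), ("servicer", "mortgage"),
    ("arranger", "mortgage_pool"), ("rating_agency", "mortgage_pool"),
    ("SPV", "cdo_tranche"), ("investor", "cdo_tranche"),
]
print(complexity_summary(credit_flow))
```

A fuller model would presumably weight repeated handling and cross-party transfers more heavily than this flat count does, but even the flat count gives a first-order sense of how much work a given (sub)system demands.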

As a final step, it would be valuable to develop simpler, more intuitive thresholds that are functionally equivalent to the more complex thresholds that come out of the metric(s). Then we could both assure ourselves that the limiting mechanism in place would do the job and understand what the limit actually is. (We don’t want to rely upon arcane rules in an effort to make things simpler.) For example, suppose we were to institute two rules: “1. No financial institution may be responsible for more than 1% of national assets. 2. No financial institution may sell or transact more than two classes of products.” The first rule would help limit the cascading damage caused by any single failure, and the second would help preserve diversity and reduce widespread common-cause failures. Hopefully it would be possible to relate these rules, or ones like them, back to the system’s complexity analysis.
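Encoded naively, rules like these amount to simple threshold checks over data regulators largely already collect. The sketch below is purely hypothetical; the national-assets figure, field names, and example institution are all made up.

```python
# Purely hypothetical encoding of the two illustrative rules above; the
# national-assets figure, field names, and example institution are made up.

NATIONAL_ASSETS = 20_000_000_000_000  # assumed total national assets, in dollars

def check_institution(inst):
    """inst: dict with 'name', 'assets' (dollars), and 'product_classes' (a set)."""
    violations = []
    if inst["assets"] > 0.01 * NATIONAL_ASSETS:   # Rule 1: at most 1% of national assets
        violations.append("holds more than 1% of national assets")
    if len(inst["product_classes"]) > 2:          # Rule 2: at most two product classes
        violations.append("transacts more than two classes of products")
    return violations

bank = {"name": "ExampleBank", "assets": 2.5e11,
        "product_classes": {"deposits", "mortgages", "derivatives"}}
for problem in check_institution(bank):
    print(bank["name"], "-", problem)
```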

Stepping beyond the financial dimension, I’d be interested to learn whether any such studies have been conducted on ecosystems. Many if not most ecosystems are complex, involving an intricate dance of creatures and biogeochemical cycles, so it seems somewhat unlikely that complexity itself is the core problem in unstable ecosystems. Are the problem factors discoverable from a macro analysis of the ecosystem in question? Is diversity more primary in this context than complexity? Do the patterns generalize?

At this point fixing the financial system seems like wishful thinking, but I still believe understanding the problems we’re in today in greater depth is the first step to figuring out how to prevent them from happening again.


Responses to “Limiting the Complexity of Complex Systems”

  1. This is always educational for me. Higher energy flows create more spatial hierarchy and complexity (relationships), leading to more diversity (variety) (Odum, Ecological and General Systems, 1983, pp. 321-351). Diversity requires flows of energy and is a stored manifestation of previous energy flow. “Diversity can be used to measure the state of a system in the balance between energy flows that develop diversity and those negative actions that may decrease diversity. Studies of pollution, for example, show decreases in diversity indices correlated with negative actions. Often high diversity requires large flows of energy relative to negative actions, and often time is required to develop diversity starting with low diversity states” (Odum, p. 343). Energy and diversity are directly related, yes.

    The simplest measure of complexity is probably a richness index, where you count the diversity in a sample of 1,000 individuals in the population: the number of types of jobs in a city’s sample, the number of species of shells on a stretch of beach, etc. “A very convenient reference graph for data is the cumulative number of types found versus the number of individuals counted” (Odum, 1983, p. 336).

    Is diversity more primary than complexity? “One idea is that diversity of units is a low-priority use of energy that operates when other basic physiological functions have been met. Diversity of structure for physiology may have priority over diversity of species. Diversity, then, is an indicator of a good balance between energy sources and stresses. Increasing physiological requirements for adaptation require a decline in the variety of units and number of circuits that can be supported. A decline in the [energy] budget reduces the number of processes possible” (p. 344). “When conditions of adaptation are severe, diversity of species is usually observed to be low. Species that prevail have energy utilized in special adaptations such as special biochemical systems, special organs, and special seasonal adaptive programs. Presumably, energy is being utilized for adapting to special conditions with less energy remaining for the special functions required to prevent competitive exclusion and otherwise organize species for cooperative coexistence” (p. 345).

    If you compare a financial portfolio from 1912 with a portfolio from 2012, you would find about 3 or 4 investment options in the 1912 portfolio: stocks, bonds, a bank account, and a savings account. Compare that to the complexity of investment options revealed by a single prospectus today. One qualitative illustration of the complexity and imbalance that has developed over time, hierarchically, is Exter’s pyramid (there go the upside down pyramids again). How many types of credit default swaps and derivatives are out there in the world, representing something like $1.6 quadrillion worth of over-investment and requiring 20% of the American workforce to push the paper around, back and forth, as the underlying real assets wane? Complex, yes. Useful, no! The comforting part of all of this is that since a lot of the complexity really doesn’t represent real resources, when collapse of the complex part of the financial system comes, a lot of these complex financial apparitions can just disappear without harming anything but the big banks and their employees. Ontogeny recapitulates phylogeny . . . we’re headed back to a future of less complexity and diversity in our financial system along with everything else.

    http://4.bp.blogspot.com/-0J9PffitZKI/TzisQP8rU-I/AAAAAAAARNg/e84p_KxkeAg/s400/exter-inverse-pyramid.jpg

  2. Mary – great explanation. I’m going to have to pick up that book again some day and go through it carefully. It was among the first I went through when first looking into emergy, and I think I’d get a lot more out of it today than I did on my first pass through.

    I do wonder how a measure of complexity like richness could include some measure of, for lack of a better phrase, transfer of flow. Basically, the notion that’s captured in Ratnasamy’s complexity metric is that if a piece of information has to be acted upon and transferred between a number of different entities, then the overall system is more complex as a result. In a non-information system the thing being transferred might not be information; it might be energy or resources of some kind (or information embodied in those flows). (A quick sketch of the basic cumulative-types count appears after these responses.)

  3. You’re touching on a number of areas with this post. One (which may have been in the back of your mind) is the thesis of Tainter and others, that nations/empires/economies continually increase in complexity, and begin to collapse when all available resources are needed just to manage the complexity.

    Related are a couple of formal models: C.S. Holling’s “adaptive cycle” (http://www.resalliance.org/index.php/adaptive_cycle) and Howard Odum’s “pulsing paradigm” (http://prosperouswaydown.com/principles-of-self-organization/energy-hierarchy/pulsing-paradigm/). According to Odum and Holling, all autocatalytic systems go through these cycles. (The work Mary Logan referenced is, I think, from Odum’s earlier writing. A good overview of his work is available in “Environment, Power, and Society for the 21st Century”.)

    You say “At this point fixing the financial system seems like wishful thinking, but I still believe understanding the problems we’re in today in greater depth is the first step to figuring out how to prevent them from happening again.” Indeed. If we as a society understood concepts like this, we might be able to recognize when it’s time to simplify voluntarily, shifting relatively easily from the growth paradigm to the contraction paradigm, and thus avoid the worst effects of collapse.

  4. Don -

    I had been thinking about Tainter’s argument while I was writing this up, but one of the things that always irked me was that I (unfairly, I think) expected him to make a quantitative case to support his notion of diminishing returns on complexity. (Neither the diminishing returns nor the complexity itself is quantified; arguably this is for the very good reason that it’s hard or impossible to quantify such things.)

    I do wonder if there is a good root-cause explanation / analysis for why such systemic (civilizational) cycles seem to appear in so many different models.
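For concreteness, here is a minimal sketch of the cumulative “types found versus individuals counted” count described in the first response above; the community, its abundances, and the sample are invented for illustration.

```python
import random

# Illustrative sketch of the cumulative "types found vs. individuals counted"
# curve described for a richness index. The community and abundances below
# are invented purely for demonstration.

def cumulative_richness(individuals):
    """Running count of distinct types after each additional individual."""
    seen, curve = set(), []
    for ind in individuals:
        seen.add(ind)
        curve.append(len(seen))
    return curve

rng = random.Random(42)
pool = ["type_%d" % i for i in range(200)]
weights = [1.0 / (i + 1) for i in range(200)]        # a few common types, a long rare tail
sample = rng.choices(pool, weights=weights, k=1000)  # a sample of 1000 individuals

curve = cumulative_richness(sample)
print("types after  100 individuals:", curve[99])
print("types after 1000 individuals:", curve[999])
```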