Complexity modelling in economics: the state of the art



Abstract

The crisis that has unfolded across the world in recent years, revealing a web of interdependencies and interactions, highlighted the fundamental flaws of neoclassical economic theory: its unedifying focus on prediction and, above all, its inability to explain how the economy really works.

It is time to investigate economic phenomena not as derived from deterministic, predictable and mechanistic dynamics, but as history-dependent, organic and constantly evolving processes. Because this view implies new challenges and opportunities for policy, we will focus attention on the innovative components of Complexity Theory for the study of economics and the evaluation of public policies.

Posted for comments on 18 Feb 2016, 11:19 am.

Comments (3)

  • Ron Wallace says:

    Any science, no matter how mathematized, reflects to some extent its social and political context. Economics is no exception: Galbraith (1987) organized his history of the discipline around the concept of circularity between economic theory, policy formulation, economic and social change, and re-examination of theory. The 2008 Great Recession conforms to this design. The economic crisis, already widely chronicled, was foreseen by almost no one (Greenspan 2013). Neither the folk models of economics operating at the local level of subprime mortgage lenders, the more formalized understandings of the central banks and the Federal Reserve, the elaborate financial models motivating Wall Street investment firms, hedge funds, and rating agencies, nor – perhaps most significantly – the Securities and Exchange Commission anticipated the collapse (Lewis 2010). As a result, theoretical frameworks long operative in economics, including some of its most cherished, are coming under vigorous attack (Rodriguez et al. 2014; Kirman 2009). Foremost among these is General Equilibrium Theory (for a formal presentation, see Levin 2006). Originally formulated by Léon Walras in 1874, and further developed by Arrow, Debreu, and McKenzie from the 1950s through the 1970s, the model formally describes an economy as an idealized stable state—an “experience of balance” (Arrow 1973)—resulting from each consumer seeking to maximize utility, and each producer seeking to maximize profit, amid conditions of perfect competition. Contrasting dramatically with this viewpoint is “complexity economics” (Arthur 2013), a rapidly developing school of thought which, in Joan Robinson’s prescient phrase, recognizes that an economy “must be capable of getting out of equilibrium; indeed, it must normally not be in it” (Robinson 1962).

    “Complexity modeling in economics: The state of the art” by Bruna Bruno, Marisa Faggini, and Anna Parziale offers a succinct overview and defense of this approach, while also concisely mentioning its policy implications. Intended as a complexity-economics primer with brief programmatic discussions, the article is overall clearly written; in a few places, however, the writing should be brought into closer conformity with English usage. The primary shortcoming is the reluctance of the authors to relate the properties of complex systems (which they accurately describe) to well-established and powerful computational methods which, in turn, could be related to specific policy objectives. While a detailed consideration of methodologies and applications would clearly require a separate paper, it should nonetheless be possible to provide a few schematic examples. Otherwise, this and similar presentations will appear as estranged from reality as the neoclassical frameworks which they are intended to replace.

    Among the key features of complexity summarized by the authors are feedback loops. Here is their complete description: “The effect of any activity can feed back upon itself, sometimes directly, sometimes after intervening stages. A part of a system receives feedback when the way its neighbors interact with it at a later time depends on how it interacts with them at an earlier time. Feedback can be positive (enhancing, stimulating, reinforcing) or negative (detracting, inhibiting, counterbalancing). The interplay between the two feedbacks is just one of the few examples of a self-perpetuating process that complex systems possess.” Refreshingly absent from this description are jargon and formalism, two devices frequently deployed to impress rather than to enlighten. But, as noted, the concept should be examined in an economic context, and then linked by a proposed computational approach to a sample economic policy. Importantly, feedback—and other closely-related network properties such as hubs (nodes with high connectivity achieved through an elaborate nexus of feedback loops)—have been functionally explored in a wide variety of systems ranging from molecular signaling pathways to monopolistic economies (Albert et al. 2000). Their major systemic effect is stabilization (robustness), but at the risk of systemic collapse should one or more hubs be eliminated. In industrial economies, as Matutinovic (2010) has shown, “firms with large numbers of connections and high volume of sales represent ‘hubs’ in the network and amount to only a small fraction of firms in the economy”, a finding that has been confirmed for the US. The social implication—of potential significance for policy formulation—is “an unequal distribution of income and wealth among economic agents”. This, and related interpretations, could be explored computationally through a wide and expanding set of relevant complexity metrics, e.g., Lempel-Ziv complexity (Lempel and Ziv 1976), betweenness centrality (Freeman 1977), and entropy-maximization measures (Lezon et al. 2006).
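
    To make this suggestion concrete, here is a minimal sketch in Python (my own illustration, not drawn from the paper) of the first of these metrics. The binarized “price series,” the random returns, and the Kaspar-Schuster-style parsing variant are all assumptions made for the example:

        # A toy estimate of Lempel-Ziv (1976) complexity for a binarized
        # price series. Higher counts indicate a less compressible, more
        # "complex" sequence of market movements.
        import random

        def lempel_ziv_complexity(s: str) -> int:
            """Count distinct phrases under a Kaspar-Schuster-style LZ76 parsing."""
            i, count, n = 0, 0, len(s)
            while i < n:
                length = 1
                # extend the phrase while it still occurs in the preceding text
                while i + length <= n and s[i:i + length] in s[:i + length - 1]:
                    length += 1
                count += 1
                i += length
            return count

        random.seed(42)
        # hypothetical daily returns, binarized: 1 = price up, 0 = price down
        returns = [random.gauss(0, 1) for _ in range(200)]
        sequence = "".join("1" if r > 0 else "0" for r in returns)
        print("Lempel-Ziv complexity:", lempel_ziv_complexity(sequence))

    Betweenness centrality and entropy-maximization measures could be layered onto network data in the same exploratory spirit, and each could then be tied to a concrete policy question.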

    Similarly, the authors’ vigorous endorsement of an interdisciplinary approach in economic modeling—with which I strongly agree—should be supported by a brief discussion of computational strategies and representative cases. They point out that “both macro and micro events, from prediction of the general performance of the economy to more local issues such as climate change, sustainability, demographic change and migration, transnational governance, and security, among others, seem beyond our understanding and control. The issues involved in each of these areas transcend disciplinary boundaries and making progress will require a significant interdisciplinary effort and a paradigm change in scientific thinking.” Having spent more than forty years in the highly interdisciplinary science of anthropology, I welcome this assessment. But the paradigm change they are calling for is likely to be resisted, not least because of the mindset of tribal elders in economics. Painting, admittedly, with a broad brush, we may portray the modern discipline from Walras through Pareto to Samuelson as an increasingly physics-like field where rates of change in quantitative variables (e.g., GDP, inflation, median individual income) may be expressed as differential equations (Turk 2012). In recent years, this approach has been extended to include general topology, complex geometry, and optimization theory without, however, questioning the near-exclusive focus on readily-quantified data. The difficulty, of course, is that the world stubbornly presents us with cultural, ideological, and religious processes which do not presently yield to rigorous quantification, but yet have profound economic effects. Chua (2004) has powerfully documented the strong control over goods and services in a wide range of developing, and democratizing, countries exerted by ethnic or religious groups, or “market-dominant minorities”. Similarly, Ferguson (2006), in a dark and riveting account, has examined the turbulent interplay of racism, ethnicity, and economics from 1914 to 1945. Bruno et al., therefore, need to anticipate neoclassical criticism – and more strongly defend their interdisciplinary economics – by invoking such examples, and describing relevant computational methods. It is eminently possible to conserve the mathematical rigor of the neoclassical school while benefiting from the inclusion of economically-significant ethnic and religious data: a “best-of-both-worlds” approach. Hybrid models which combine discrete and continuous variables (Medio 1991), already widely used in systems biology, would be well-suited for such applications (e.g., Khan et al. 2014).
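
    To sketch what such a hybrid model might look like (my own schematic, not the authors’ proposal), the following fragment couples a continuous output variable to a discrete, qualitative “social tension” state, so that a hard-to-quantify cultural or political process enters the model as a switching regime rather than as another differential equation. All parameters, thresholds, and the shock are hypothetical:

        # A hybrid discrete/continuous sketch in the spirit of Medio (1991):
        # logistic output growth, modulated by a discrete unrest state.

        def simulate(steps=500, dt=0.05):
            y = 1.0        # continuous variable: aggregate output
            unrest = 0     # discrete variable: 0 = calm, 1 = social unrest
            history = []
            for t in range(steps):
                if t == 200:
                    y *= 0.35  # hypothetical exogenous shock (e.g., a financial crisis)
                # discrete logic: unrest switches on below an output threshold
                if y < 0.8:
                    unrest = 1
                elif y > 1.2:
                    unrest = 0
                # continuous dynamics: growth is dampened, and decay added, during unrest
                growth = 0.1 if unrest else 0.5
                decay = 0.3 if unrest else 0.0
                y += dt * (growth * y * (2.0 - y) - decay * y)
                history.append((round(t * dt, 2), round(y, 3), unrest))
            return history

        for row in simulate()[::100]:
            print(row)

    Note the path dependence: once the shock flips the discrete state, the system does not return to its former equilibrium, exactly the kind of behavior a purely continuous neoclassical model would miss.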

    Last, but most important, the authors’ recommendation for the application of complexity modeling to economic systems (broadly defined) by a decentralized set of intercommunicating agents is a valuable contribution to economic policy. Bruno et al. explicitly reject the ritualistic application of traditional theoretical frameworks to a highly dynamic economy: “Policy needs to be suitably tailored to specific problems” they write, “and has to take into account that a policy instrument launched today might not always work tomorrow because the economic system is evolving in unpredictable ways.” Recognizing that policy-making, if grounded in a more inclusive understanding of economics, could become an intractable problem, they defend a patchwork approach in which different “levels of governance” would iteratively deploy optimization strategies – they propose genetic algorithms – to identify “a single very powerful [if provisional] solution.” This is a bold recommendation, but critical questions remain unexplored. We need to know more about the somewhat mysterious “levels of governance”. Do these correspond to existing political and economic hierarchies in developed and emerging economies? Will the proposed computational strategy, therefore, be retrofitted to these traditional entities? Conversely, if a novel structure is to be responsible for governance – at least in the economic realm – from what source will its authority derive? Last, and most significant, is the issue of corruption. If the 2008 calamity taught us anything at all, it was that the lowest levels of U.S. economic power (e.g., subprime mortgage lenders) were as morally compromised as the highest (e.g., the “Big Three” credit rating agencies). Every level was complicit (Lewis 2010). In addition, the problem of dispersed malgovernance is by no means confined to technologically-advanced nations. Decentralized corruption, involving kin-based patronage, policing, and political repression, is widely encountered in the tribal hierarchies of Third World kleptocracies (Chua 2004; but see also Acemoglu et al. 2003). We are thus inexorably led to Juvenal’s ancient query: “Who will guard the guardians?”
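
    Since the authors themselves propose genetic algorithms, a stripped-down sketch may help to fix ideas about what one iteration of such a policy search could look like. The two “instruments” (a tax rate and a subsidy rate) and the welfare objective below are hypothetical placeholders of my own, not anything specified in the paper:

        # A minimal genetic algorithm searching over a two-instrument policy.
        # The fitness function is a stand-in for whatever welfare objective
        # a given level of governance might adopt.
        import random

        random.seed(1)

        def fitness(policy):
            tax, subsidy = policy
            # hypothetical objective with an interior optimum at (0.3, 0.1)
            return -(tax - 0.3) ** 2 - (subsidy - 0.1) ** 2

        def evolve(pop_size=50, generations=40, mutation_rate=0.1):
            population = [[random.random(), random.random()] for _ in range(pop_size)]
            for _ in range(generations):
                def pick():  # binary tournament selection
                    a, b = random.sample(population, 2)
                    return a if fitness(a) > fitness(b) else b
                children = []
                while len(children) < pop_size:
                    p1, p2 = pick(), pick()
                    child = [random.choice(genes) for genes in zip(p1, p2)]  # uniform crossover
                    if random.random() < mutation_rate:
                        i = random.randrange(len(child))
                        child[i] = min(1.0, max(0.0, child[i] + random.gauss(0, 0.05)))
                    children.append(child)
                population = children
            return max(population, key=fitness)

        best = evolve()
        print("best policy found: tax=%.3f, subsidy=%.3f" % tuple(best))

    Re-running such a search as the fitness landscape drifts is one natural reading of the authors’ warning that a policy instrument launched today might not always work tomorrow.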

    In 2016, the future of economics – like much else – is highly uncertain. Bruno and colleagues have formulated a possible strategy emphasizing flexibility and dispersed economic governance in dealing with the challenges of constant systemic change. While their views have historical precedent, the extensive, and potentially global, application of complexity modeling by intercommunicating agents using optimization methods has never been attempted. Should it be? In considering the question we should evaluate the risks of deploying this dynamic, but untested, approach in relation to the hazards – actual, historically-demonstrated – of perpetuating an approach dubiously based on classical physics. Meanwhile, all over the world, cultural conflicts increase (Cavanaugh 2014), inequality worsens (Stiglitz 2015), and Francis Fukuyama’s “end of history” is not on schedule (Fukuyama 2006). Together these portents suggest that future crises in the global economy may not only be more frequent, but perhaps more destructive as well.

    References

    Acemoglu, D., Robinson, J., Verdier, T. (2003). Kleptocracy and Divide-and-Rule: A Model of Personal Rule. National Bureau of Economic Research, Working Paper 10136.
    Albert, R., Jeong, H., Barabási, A.-L. (2000). Error and attack tolerance of complex networks. Nature 406(6794): 378-382.
    Arrow, K. (1973). General Economic Equilibrium: Purpose, Analytic Techniques, Collective Choice. American Economic Review 64(3): 253-272.
    Arthur, W. (2013). Complexity economics: A different framework for economic thought. Santa Fe Institute Working Paper 2013-04-012.
    Cavanaugh, J. (2014). Huntington’s Revenge: The Global Increase in Religious and Cultural Conflict. Mint Press News, March 31, 2014.
    Chua, A. (2004). World on Fire: How Exporting Free Market Democracy Breeds Ethnic Hatred and Global Instability. New York: Anchor Books.
    Ferguson, N. (2006). The War of the World: Twentieth-Century Conflict and the Descent of the West. New York: Penguin Press.
    Freeman, L. (1977). A Set of Measures of Centrality Based on Betweenness. Sociometry 40: 35-41.
    Fukuyama, F. (2006). The End of History and the Last Man. New York: Free Press.
    Galbraith, J. (1987). Economics in Perspective: A Critical History. Boston: Houghton Mifflin.
    Greenspan, A. (2013). Never saw it coming: Why the financial crisis took economists by surprise. Foreign Affairs 92 (November/December): 88-96.
    Khan, F., Schmitz, U., Nikolov, S., Engelmann, D., Pützer, B., Wolkenhauer, O., Vera, J. (2014). Hybrid modeling of the crosstalk between signaling and transcriptional networks using ordinary differential equations and multi-valued logic. Biochimica et Biophysica Acta 1844(1 Pt B): 289-298.
    Kirman, A. (2009). Economic Theory and the Crisis. VoxEU, Centre for Economic Policy Research, 14 November 2009.
    Lempel, A. and Ziv, J. (1976). On the complexity of finite sequences. IEEE Transactions on Information Theory IT-22(1): 75-81.
    Levin, J. (2006). General Equilibrium. Department of Economics, Stanford University.
    Lewis, M. (2010). The Big Short: Inside the Doomsday Machine. New York: W.W. Norton.
    Lezon, T., Banavar, J., Cieplak, M., Maritan, A., Fedoroff, N. (2006). Using the Principle of Entropy Maximization to Infer Genetic Interaction Networks from Gene Expression Patterns. Proceedings of the National Academy of Sciences 103(50): 19033-19038.
    Matutinovic, I. (2010). Economic Complexity and the Role of Markets. Journal of Economic Issues 44(1): 31-51.
    Medio, A. (1991). Discrete and continuous-time models of chaotic dynamics in economics. Structural Change and Economic Dynamics 2. Oxford: Oxford University Press.
    Robinson, J. (1962). Economic Philosophy. Hawthorne: Aldine.
    Rodriguez, Á., Arnal, J., Crespo, O. (2014). Financial Crisis and the Failure of Economic Theory. London: Pickering and Chatto.
    Stiglitz, J. (2015). The Great Divide: Unequal Societies and What We Can Do About Them. New York: W.W. Norton.
    Turk, M. (2012). The Mathematical Turn in Economics: Walras, the French Mathematicians, and the Road Not Taken. Journal of the History of Economic Thought 34(2): 149-167.

  • Brian O'Boyle says:

    “Complexity Modelling in Economics: The State of the Art” is, as Ron Wallace contends, basically a primer in economic theorising that highlights (1) complex/interdependent variables, (2) dynamic movements (frequently) away from equilibrium, and (3) path-dependent historical evolution. Viewed as an introductory piece, it is interesting and generally informative. The authors give a useful overview of the key problem with mainstream modelling: its inability to capture central features of economic reality. They also highlight the kinds of modelling that could potentially replace this neoclassical dogma.

    Beyond this, there are a number of issues with the paper – some of which may be easily rectifiable within the key assumptions underpinning the paper, others which flow from my own (Marxist) ideological perspective and so may be useful in generating discussion and debate.

    The first major issue is the lack of specificity on display. As currently constituted, the paper does little more than restate much of what has previously been said by Post-Keynesians such as Kaldor and Robinson, Critical Realists like Tony Lawson, Institutionalists like Geoffrey Hodgson, and more eclectic thinkers like Philip Mirowski. Each of these contributors has, in their own way, highlighted the problems referred to by Bruno et al., along with making similar statements about the need for social-relational analysis, historical sensitivity, open-systems approaches, and so on.

    The obvious way to move beyond these contributions is to illustrate the power of computational models in terms of their ability to cope with socio-economic complexity. However, without evidence from real-world phenomena, the paper struggles to add novelty to the existing literature. The way around this problem is to add substance to the general theory, highlighting important phenomena within the economy that can be modelled in terms of feedback loops, the heterogeneity of individuals, and so on. This can be done in the way that Ron Wallace insists above – namely, to examine complexity “in an economic context … linked by a proposed computational approach to a sample economic policy”.

    This leads on to my second concern, which involves questioning the use of complexity modelling as an alternative to political economy. My key point here is that we don’t merely live in a complex and interdependent system – we live in a capitalist economy, replete with a class structure, a universal logic of profit, and anarchic (complex and interdependent) decision-making processes via the market. None of this is visible in the current paper, as it operates at a level of generality and assumes that most of the problems we face in the world are technical and/or epistemic. When the authors state that issues like climate change, migration and security are “beyond our understanding and control”, for example, they implicitly assume that our key problems are technical and epistemological. All we need, on this view, is a better (complexity) theory, and policy-making can subsequently be improved. Unfortunately, this is not the case.

    Issues like migration and climate change are undoubtedly complex, but they are not, for all that, beyond our comprehension. Indeed, we already have a very good understanding of these issues, including the real-world forces that move them beyond our democratic control. Think of our technical ability to move beyond a fossil-fuel economy, or to offer migrants safe passage, versus the political constraints imposed by the fossil-fuel industry, or by a fortress Europe that encourages racism and xenophobia.

    This is not to suggest that rigorous modelling has no place in economics. Rather, it is to highlight how much ground complexity theory is ceding to mainstream theory by framing the problem overwhelmingly in terms of epistemology and technique (see our paper above for similar comments in relation to Tony Lawson). Complexity theory also cedes too much ground to the mainstream by emphasising the lack of rationality in decision-making, as opposed to the structured elements of the capitalist economy.

    Versions of complexity theory have been around for decades, but they are unlikely to have any real impact on the discipline (think of the Capital Controversy) until they start to appreciate that the main problem with the mainstream is not its lack of technical ability, but its overarching defense of the capitalist economy.

  • Pietro Terna says:

    A comment on the paper by Bruno, Faggini and Parziale (2016), “Complexity modelling in economics: the state of the art.”

    The paper represents a significant step in the direction of a synthesis of complexity in social science; for this reason, I suggest that the authors take into consideration both the deep roots of the arguments they are analyzing and the tools that we can use to cope with this new perspective in economics.

    Roots

    The roots of the complexity view in social sciences can be found in two classical papers: those of Anderson (1972) and of Rosenblueth and Wiener (1945).

    The complexity manifesto is mostly identified with Anderson’s (1972) paper “More is different”, where we read:

    “The reductionist hypothesis may still be a topic for controversy among philosophers, but among the great majority of active scientists I think it is accepted without question.
    (…) The main fallacy in this kind of thinking is that the reductionist hypothesis does not by any means imply a ‘constructionist’ one: The ability to reduce everything to simple fundamental laws does not imply the ability to start from those laws and reconstruct the universe.
    (…) The constructionist hypothesis breaks down when confronted with the twin difficulties of scale and complexity. The behavior of large and complex aggregates of elementary particles, it turns out, is not to be understood in terms of a simple extrapolation of the properties of a few particles. Instead, at each level of complexity entirely new properties appear, and the understanding of the new behaviors requires research which I think is as fundamental in its nature as any other.” (p.393)

    This is the key starting point: the economic world is made of interconnected layers populated by more and more complicated agents (people, families, firms, banks, central banks, international institutions, multinationals, …). People make economies, but each of us is as far from understanding and controlling the economic system as a humble ant is from understanding the anthill. Economics, as a science, has simply been ignoring that detail for about two hundred years. Complexity, in this Andersonian sense, is the big trap generating the current paradoxical situation in which the crisis (2007–2014, at least) has no room in perfect models, but nonetheless exists in the actual world.

    How to work with the lens of complexity? We need models, for us and for the ants. From the wonderful list of foundational papers about complexity that we can find at http://www.santafe.edu/library/foundational-papers-complexity-science, let us take a second basic reference, related to the model-building perspective.

    In Rosenblueth and Wiener (1945), written by the founders of cybernetics, we read:

    “A material model is the representation of a complex system by a system which is assumed simpler and which is also assumed to have some properties similar to those selected for study in the original complex system.
    (…) Material models are useful in the following cases. a) They may assist the scientist in replacing a phenomenon in an unfamiliar field by one in a field in which he is more at home. (…) b) A material model may enable the carrying out of experiments under more favorable conditions than would be available in the original system.” (p. 317)

    Since cybernetics is a root of all our contemporary work in complexity and agent-based simulation, it is important to underline the analogy between the material model described above and the theoretical one. The material model is now the artificial artifact that we can construct within a computational system in order to examine our problems more closely, while we also study them in a theoretical way. We thus confront the material model (the artifact of the system), which we must build taking into account randomness, heterogeneity, and continuous learning through repeated trial-and-error processes, with the theoretical model.

    Tools

    Following the considerations reported above, what kind of tools can we use?

    In the social sciences, models are simplified representations of reality, usually built in two ways: (i) verbal argumentation and (ii) mathematical equations, with statistics and econometrics.

    The first way (i) is highly flexible and adaptable, as in the case of a historical book reporting an analysis of past events; but having in our hands only descriptions and considerations, we cannot make computations, tests, or what-if verifications. The second way (ii) allows for computations and verifications, but is limited in flexibility and in capability when we look for fine-grained analysis and description: mainly, when we consider how agents are expected to operate within our models, taking into account, first of all, their heterogeneity and their interactions.

    There is a third way to build models: (iii) computer simulation, mainly if agent-based (a minimal sketch follows the list below).

    Computer simulation can combine the useful flexibility of computer code—where we can create agents who act, make choices, and react to the choices of other agents and to modifications of their environment—with its intrinsic computability. In this way, we can use together the descriptive capabilities of verbal argumentation and the ability to calculate the effects of different situations and hypotheses. From this perspective, a computer program is a form of mathematics. In addition, we can generate data (i.e., time series) from our models and analyze them employing statistics and econometrics. Summarizing with a scheme, we have:
    (i) verbal argumentation;
    (ii) mathematical equations, with statistics and econometrics;
    (iii) agent-based computer simulations.
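
    As promised above, here is a minimal agent-based sketch of the third way, written in Python; the adaptive traders, their learning rule, and all parameters are merely illustrative assumptions of mine:

        # One hundred heterogeneous traders, each adapting its price expectation
        # from market feedback; the market price emerges from their interaction.
        import random

        random.seed(0)

        class Trader:
            def __init__(self):
                self.expected_price = random.uniform(0.5, 1.5)  # heterogeneity

            def update(self, market_price, learning_rate=0.2):
                # adaptive learning: move the expectation toward the observed price
                self.expected_price += learning_rate * (market_price - self.expected_price)

        agents = [Trader() for _ in range(100)]
        for step in range(50):
            # the price emerges from the agents' average expectation, plus noise
            price = (sum(a.expected_price for a in agents) / len(agents)
                     + random.gauss(0, 0.02))
            for a in agents:
                a.update(price)
            if step % 10 == 0:
                spread = (max(a.expected_price for a in agents)
                          - min(a.expected_price for a in agents))
                print(f"step {step:2d}: price={price:.3f}, spread={spread:.3f}")

    A run like this generates exactly the kind of time series mentioned above, which can then be analyzed with the statistical and econometric tools of the second way.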

    We can also imagine building models based on multiple layers of agents, with the agents of each layer composing, in a collective sense, the more complicated agents of the upper level, helping us to understand reality.

    The considerations above work in a way similar to abduction, or inference to the best explanation, where one chooses the hypotheses that, if true, would give the best explanation of the actual evidence. Note that in the ABM perspective, the hypotheses are also related to the rules that determine the behavior of the agents.

    We can further explore the analysis of ABMs and their classification, with the three possible uses of agent-based computational constructions proposed in Axtell (2000):
    • the first use occurs “when models can be formulated and completely solved: agent models as classical simulation”, mainly to make what-if evaluations;
    • the “second use (is about) partially soluble models: artificial agents as complementary to mathematical theorizing”;
    • the third use relates to the cases of existing equilibria, which can be incomputable, not attainable by bounded rational agents, known only for simple network configurations, or less interesting than transition phases, fluctuations, and extreme events.

    The abductive interpretation of the agent-based paradigm as a source of knowledge corresponds to the third use, when we are coping with intractable models (my addendum to Axtell’s considerations), mainly when we believe that agents should be able to develop self-generated behavioral rules.

    In any case, ABMs can (or could) be at the basis of the search for the microfoundations of economic analysis, not with an extreme reductionist perspective in mind, but considering causality and intentionality in actors.

    These are my suggestions to improve the paper.

    References

    Anderson, P. W. (1972). More is different. Science 177(4047): 393-396.
    Axtell, R. (2000). Why Agents? On the Varied Motivations for Agent Computing in the Social Sciences. Center on Social and Economic Dynamics, Brookings Institution, Working Paper 17.
    Rosenblueth, A. and Wiener, N. (1945). The Role of Models in Science. Philosophy of Science 12(4): 316-321. http://www.jstor.org/stable/184253