Forecasting, prediction and precision: a commentary



Abstract

Forecasting involves an underlying conceptualization of probability. It is this that gives sense to the notion of precision in numbers and makes us think of economic forecasting as more than simply complicated guesswork. We think of it as well-founded statement, a science and not an art of numbers. However, this understanding is at odds with the nature of social reality and the attributes of the forecaster. We should think differently about how we both anticipate and make the future, and about what this means. Foresight is perhaps a more appropriate term.

Posted for comments on 28 Mar 2012, 9:19 am.
Published

Comments (9)

  • Alan Shipman says:

    This paper tackles two severely under-addressed questions – why are economic forecasts in such high demand despite their lack of success at anticipating important turning-points, and what exactly are forecasts saying about the future state of the world?

    I’d like to offer five observations on specific points, and a general reflection on what can be concluded.

    (1) On p2, there is a conflation of ‘likely’ and ‘more likely’ that may need to be more clearly disentangled. ‘A more likely than not-A’ and ‘A more likely than B’ only translate into ‘likelihood of A > 50%’ if there is no other possibility (such as C or ‘not sure’). So ascribing a 50% (or more) chance to A does add information in this case, but it is information that does not necessarily follow from the premises.

    It’s true that a lot of economic forecasts take the form of ‘return to recession is more likely than continued growth’ and, if growth or recession are mutually exclusive alternatives, this amounts to saying that the chance of renewed recession is over 50%. But many forecasts limit themselves to saying that one thing (A) is more likely than another, without any implied ascription of probability except in relation to not-A.
    Given this, I think the conclusion from the A and B propositions (2 paras up from foot of page) needs to be restated as

    ‘There is a greater than 50% likelihood of A *occurring rather* than B, but not of A in fact.’
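
    To make the gap concrete, a small worked example (the numbers are mine and purely illustrative, not taken from the paper):

    ```latex
    % Three mutually exclusive, exhaustive outcomes A, B and C (illustrative numbers)
    P(A) = 0.4, \qquad P(B) = 0.35, \qquad P(C) = 0.25.
    % 'A is more likely than B' holds:
    P(A) > P(B),
    % but 'A is likely' (probability above one half) does not:
    P(A) = 0.4 < 0.5.
    % Only if A and B were the sole possibilities would P(A) > P(B) entail P(A) > 1/2.
    ```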

    (2) It may not always be the case that “In a truistic sense forecasting is successful description of some future phenomena” (top of p4). This does not apply to such cases (among others) as:

    – IMF and Bank of International Settlements forecasts of the increased GDP share that will have to be assigned to state pensions (or the rise in public debt required to fund state pensions) if these are not reformed. These forecasts depict an impossible situation arising sometime in the future (eg an unsustainable public debt); they predict a situation that cannot arise, with the intention of persuading agents (in this case governments with pay-as-you-go state pension systems) to change course before it is ‘too late’.

    General implication: The distinction between forecasting and foresight may not always be as clear as is implied in the paper’s conclusion. One function of forecasts may be to present cautionary tales with impossible outcomes, to force behavioural change that leads to different outcomes. Extreme global warming forecasts may be another example. This relates to the forecast/foresight distinction in the Conclusion (p7-8) – some ‘forecasts’ are exercises in foresight because they depict events that cannot happen (eg a government with an impossibly large debt, a bank that continues to exist in a state of bankruptcy).

    – Forecasts of Belarus’s GDP. Belarus is taken as an example of a government which is likely to put a positive bias on its official data. Economic forecasts of Belarus’s GDP growth have to be forecasts of the official GDP figure (and its growth). They are not forecasts of national output, or even the GDP measure of national output, but of the government’s representation of the GDP measure of national output. If there is a bias, then forecasters who successfully forecast next year’s official growth figure are (by design) not describing the future underlying phenomenon (GDP), only the superficial phenomenon (announced GDP).

    General implication: Economic forecasts, which are usually forecasts of a statistical office’s or central bank’s quantitative description of future reality, may be further removed from reality than (for example) a psephologist’s forecast of the next US president.

    (3) While agreeing that “All forecasting involves an underlying conceptualization of probability” (p4, para above bulletpoint), the range of conceptualizations may be wider than is implied here. Some forecasts are presented as being an absolutely certain description of the future outcome if certain conditions prevail, with the caveat that the number may go higher or lower if conditions turn out different. This enables forecasters to assign one number and then specific ‘upside’ and ‘downside’ risks that may be symmetric or asymmetric. The procedure has affinities with George Shackle’s ‘potential surprise’ approach, which was advanced as an alternative to probabilistic forecasting – though today’s forecasters are unlikely to have heard of Shackle.
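
    To make that style of presentation concrete, here is a minimal sketch of my own (every number, and the 90% band, is a hypothetical assumption, not anything from the paper): a conditional central forecast reported with asymmetric downside and upside risk bands.

    ```python
    # Minimal sketch (hypothetical numbers throughout): a conditional central
    # forecast reported with asymmetric 'downside' and 'upside' risk bands.
    central_forecast = 0.5   # assumed central GDP growth forecast, % per year
    sigma_down = 1.2         # assumed dispersion of downside risks
    sigma_up = 0.6           # assumed dispersion of upside risks (asymmetric)
    z_90 = 1.645             # approx. 90% one-sided normal quantile

    lower_band = central_forecast - z_90 * sigma_down
    upper_band = central_forecast + z_90 * sigma_up
    print(f"central {central_forecast:.1f}%, band [{lower_band:.1f}%, {upper_band:.1f}%]")
    ```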

    (4) While agreeing that probabilities cannot be equated with patterns (p4-6), pattern-identification seems no less conducive than other forecasting methods (eg econometrics) to probabilistic forecasting. The key difference seems to be that pattern-identification just looks at the past and present state of one variable or set of variables, whereas other forecasting methods take account of a wider set of information (‘underlying’ variables, policy parameters etc).

    For example, ‘chartists’ try to predict a stock price movement using just the past pattern of price movement, whereas econometricians might build a model that includes interest rates, sector growth rates etc. The chartists’ pattern is certainly a ‘human construct about human constructs’ (as is said on p5), but so is a forecast of the same variable arrived at by any other means. In the house-sales example, a ‘pattern’ group might say that sales will spike upwards in month X because that’s an observable seasonal pattern, a ‘forecast’ group might say that sales will spike upwards in month X because the sales tax is due to rise in month X+1; the motivation for the predictions is different, but the nature of the predictions seems to be the same.
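
    A toy sketch of the two routes (entirely illustrative; the data, the seasonal rule and the interest-rate regression are all invented for the purpose) may show why the predictions end up being the same kind of statement:

    ```python
    # Toy contrast between a 'pattern' forecast and a 'model' forecast of the same
    # variable (all data and relationships here are invented for illustration).
    import numpy as np

    rng = np.random.default_rng(0)
    months = np.arange(36)
    sales = 100 + 10 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 2, 36)
    interest_rate = 5.0 - 0.05 * months        # hypothetical 'underlying' variable

    # Pattern route: project the observed seasonal pattern forward one month,
    # i.e. predict the value seen in the same calendar month a year earlier.
    pattern_forecast = sales[-12]

    # Model route: regress sales on the underlying variable and extrapolate it.
    slope, intercept = np.polyfit(interest_rate, sales, 1)
    next_rate = interest_rate[-1] - 0.05
    model_forecast = intercept + slope * next_rate

    print(f"pattern forecast: {pattern_forecast:.1f}, model forecast: {model_forecast:.1f}")
    ```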

    If so, the conditions listed on p6 would apply not only to probability statements based on patterns (for which they seem to be correct), but to other ways of arriving at probability statements. But again, they only apply to ‘cardinal’ statements that assign a numerical probability or likelihood, not ‘ordinal’ statements that just assign one outcome a higher probability or likelihood than another. This again highlights the significant difference between ‘likelihood is…’ and ‘is more likely than…’ statements. It’s only the assignment of precise probabilities that requires closed and complete conditions (as rightly specified on p6). Most forecasters stop short of being so precise, because they are aware of how demanding the conditions are – which supports the point the paper is making. Even a forecast that boldly asserts ‘the economy will grow at 0.5% next year’ tends actually to be saying, on closer inspection of the commentary, ‘0.5% growth is more likely than 1% growth or stagnation/recession next year, given our set of assumptions about exogenous variables and policy parameters.’ Forecasters turn out to be making statements of relative probability because they are aware that absolute probability statements require closedness/completeness conditions they cannot satisfy. This could be an additional point to which the paper’s analysis leads.

    (5) Point 1) on p7-8 implies that the EU/US bank ‘stress tests’ were deliberately set so that banks would pass them, to avoid a panic about banks’ condition or a challenge to existing minimum capital ratios. That may have been the aim of the first round of tests, but as that aim was observable, the tests could be seen to lack credibility. Later runs of the tests did lead to some banks failing – as by this time the regulators wanted to force a change in banks’ behaviour, and rally support for higher minimum ratios. Some forecasts may be neutrally generated, but all use of forecasts is to reinforce or change people’s opinions. The imaginary world – point 2) on p8 – is always one which the forecast-users want to steer us towards or away from.

    That helps to explain why demand for economic forecasts hasn’t been undermined by their failure to anticipate the 2008 crisis, and may even have risen during and after that crisis. People have always wanted an authoritative source of statements about the future that support decisions they have taken. The demand shapes the nature of the supply, and explains why people tend to want forecasts that make absolute probability statements even when most forecasters are actually being more cautious than that.

    General: The paper is currently structured as a discussion of probability statements on p1-4, a discussion of patterns (recognition and use) on p5-7, and a conclusion contrasting forecasts and foresight on p7-9. As suggested above, some of what is said about patterns could also apply to forecasts reached by other means, the main distinction being between predicting on the basis of surface variables (patterns) and appealing also to underlying variables (econometric forecasts). More could be made of the distinction between types of prediction, arrived at by either method – those that present relative likelihoods vs those that claim to assign an actual numerical probability, and how the second group is often relegated to the first when all the conditions and caveats are examined.

    • Jamie Morgan says:

      Constructive comments Alan, thanks for taking the time to read the paper.

      The argument presented is one I’ve tried to state in a minimalist fashion – similar to Gettier on epistemology – and so it is one that can easily involve many additional developments, points of nuance or conditional aspects. That said:

      Your 1) seems a better way to state the point.
      Your 2) makes me think I ought to add a statement emphasising the link to the constructive element of how forecasting is presented and used – since, if I read you correctly, these are examples of additional ways of shaping future behaviour; they are policy positioning as forms of argument for the negative. Perhaps the original statement should be tautological and truistic, as in ‘successful forecasting’, but this raises the issue of what success is. What do you think?
      Your 3) I’m not sure whether forecasters have or have not heard of Shackle – if they studied economics at a top university, will they have seen The Years of High Theory, if not his work on time etc.? The point about conditions turning out higher or lower is still one of probability, isn’t it, and what I wanted to stress overall was that there is a tendency to misconceive what probability/forecasting is providing… Do you think there is some additional point to be made here, or that I can usefully clarify the underlying argument?
      Your 4) distinguishes technical analysis from other approaches to investment (‘fundamentals’) – as you say, both are about anticipating price movements in different ways – but aren’t both about different concepts of patterns? I’ve recently been looking at hedge funds and how they generate returns based on different ways of approaching arbitrage opportunities in equity markets, and it seems to me that the main difference is the type of information and the degree of underlying specification from a model – creating different ways of constructing what are in the end probability-based investment systems (rules from patterns in pricing, patterns in variables related to patterns in pricing). What specific modification to the argument do you think would help here…
      Your 5) is true, but perhaps I didn’t make it clear enough that I was illustrating the potential for the intention to be of that kind rather than claiming that it is always of that kind… As I see it, it is the potential, sometimes followed, that indicates the problem of our understanding of forecasting re construction. It does not always have to be the case for the point to be relevant – are you suggesting an additional caveat?

      Thanks again, Alan

  • Douglas Porpora says:

    I like this paper a lot. It is provocative, and made me think.

    I think you are saying that we think of forecasting as a precise, mathematical algorithm, but in most cases, and especially in economic cases, it is not. Or actually, as you say in the beginning, you are, like Wittgenstein, more raising questions than providing answers about what it is we are doing when we are forecasting.

    I have spent some time arguing with Tony Lawson about probability, and your piece made me understand that argument differently. I think in some ways your argument is kin to the one that used to rage about computers and AI. So computers can now finally beat the best of humans, but is that because the options are both discrete and finite?

    I thought you were suggesting something similar with regard to probability. That we think of probability as involving patterns or frequencies. I would have said set theory. But we can only assign probabilities really when the events are discrete and finite (or closed). Otherwise, it is a fudge.
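
    For what it is worth, the textbook axiomatization does presuppose exactly that closure before any number can be assigned (standard material, added here only as a reminder, not something drawn from the paper):

    ```latex
    % Kolmogorov's setup: probabilities exist only relative to a fixed outcome
    % space \Omega and a field of events \mathcal{F} defined on it.
    P : \mathcal{F} \to [0,1], \qquad P(\Omega) = 1, \qquad
    P\Big(\bigcup_{i} A_i\Big) = \sum_{i} P(A_i) \quad \text{for disjoint } A_i \in \mathcal{F}.
    % If the set of possible outcomes is not closed in advance, \Omega is not
    % well defined and these assignments have nothing to attach to.
    ```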

    And then there is the epistemological matter. I liked your introduction of Sherlock Holmes and the way you tied the end back to the beginning in this respect. I have been watching the series by the way. I am told it is very popular especially among the young. I must be young at heart. I hadn’t thought before of all the stupendous qualities Sherlock must have, and your delineation made it clear how unrealistic a character he is. If the economic world is anything like the world of murder, then I agree it is preposterous for economists to presume they can succeed.

    I also liked, by the way, your discussion of precision. It reminded me of the trouble I had when learning statistics. Being more than a bit obsessive, if I don’t get a step in an argument, I just stop. So I remember refusing to go on when learning about confidence intervals. The reason was they did not really seem like 95% confidence intervals, because we were using an estimate for the variance rather than the real variance, and the error associated with that must somehow be added in. Why was everyone going on and talking about 95%? It was thoughts like that that made me less successful in math than I might have been – although that tendency helped a lot in philosophy. Anyway, it seems related somehow to your argument: the precision we speak of is always for all intents and purposes, which means it is contextual.
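
    To put a rough number on that worry (a minimal sketch of my own, using scipy, with arbitrary sample sizes): the multiplier that accounts for estimating the variance is a t quantile, which is wider than the normal’s 1.96, markedly so for small samples.

    ```python
    # With an *estimated* variance the 95% multiplier is a t quantile, not 1.96;
    # the gap is what the 'added error' intuition is pointing at. (Sample sizes
    # here are arbitrary.)
    from scipy import stats

    z_mult = stats.norm.ppf(0.975)              # multiplier if the variance were known
    for n in (5, 10, 30, 100):
        t_mult = stats.t.ppf(0.975, df=n - 1)   # multiplier when the variance is estimated
        print(f"n={n:3d}: t={t_mult:.2f} vs z={z_mult:.2f}")
    ```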

    • Jamie Morgan says:

      Thanks Doug,

      Your approach to the paper seems quite different from Alan’s. It made me think of something Heikki Patomaki has been talking about recently – not something new, but something we don’t talk about enough – precision is contextual, systems are open – how does this affect probabilistic statements? Heikki places emphasis on the fact that we are constantly making predictions about the future and that these are more or less successful despite open systems. He draws on Keynes here: probability is about the relation of evidence to outcome rather than frequency of occurrence as a grounding per se. If systems are relatively stable for some period, and result in Tony Lawson’s demi-regs, then perhaps there is a basis for imprecise probability statements – ordinal expressions of likelihood that don’t fetishise any given mathematical manipulation of the original data – but do you think the language of probability is the right one to be using? It is the case that x does or does not occur. We can express a sense of the likelihood of that occurrence; if it does occur, are we vindicated in having confidence in imprecise ordinal probability (in cases that deviate from casino-based fixed-rule games)?
      I wrote an essay on set theory in the Journal of Critical Realism, 10(2), 2011, if you are interested, Doug.

  • David Byrne says:

    This is an interesting paper and the general line of argument is sound. What follows is a reflection rather than a critique as such. That is to say, it made me think along the following lines. It would be useful to consider other domains in which forecasting is attempted using methods which are somewhat different from those deployed in Economics / Econometrics. There are two approaches here. One is the use of difference equation based models which originate in physics’ approaches to complex systems. These are operationalized in simulations which have been deployed in a range of areas including urban planning, and physicists have started to colonize economic forecasting and the processes of financial prediction. Paul Ormerod for example is a great enthusiast for this style. The other is the development of agent based models following rule specifications. Here Holland is the prime source of reference. Interestingly, the latter attaches no probabilistic estimate to outcomes at all. Fat tails seem to represent the probabilistic approach in relation to difference equations. Both of course are more or less explicitly attempts to cope with prediction for open complex systems. However, both remain within the domain of what Morin has called ‘restricted complexity’ – i.e. they attempt formal scientistic representation, very clearly in the case of the equation based simulations and in style for agent based modelling.

    Implicit or even sometimes explicit in Morgan’s paper is the issue of agency and the implication this has for any prediction in relation to complex open social systems. Morgan picks this up in relation to the reflexivity of forecasts, and previous comments do the same, e.g. in relation to the way in which demographic / financial forecasts of an actuarial form lead to revision of pension policies. So I think agency in relation to social causality should always be absolutely explicit.

    To tie this back to forecasting, what we have in complex systems is a limited number of future system states after a phase shift. As Morgan makes clear in his Point 5 on page 4 of the paper, traditional linear econometric forecasting can cope with ‘stability’ – in complexity language, with the oscillations, as we might say, of a system in a torus attractor form. What it cannot handle are phase shifts. However, past the point of phase shift there are not infinite futures but a limited set. The agency point is that it is agency which produces which future occurs. We can use a language of crisis to describe phase shift and, as O’Connor noted in his paper on ‘The meaning of crisis’, in a context of crisis it is not a matter of what will happen but what will be made to happen.
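
    A toy example in the difference-equation spirit (my own, purely illustrative, not anything from the paper or from Ormerod) shows the stability/phase-shift contrast: a simple nonlinear map behaves in a way linear extrapolation could track while its control parameter is low, then changes regime once the parameter crosses a threshold.

    ```python
    # Toy nonlinear difference equation (logistic map): x_{t+1} = r * x_t * (1 - x_t).
    # For low r the series settles to a stable value that linear extrapolation could
    # track; raising r past a threshold produces oscillation and then chaos, i.e. a
    # phase shift that extrapolating the old behaviour cannot anticipate.
    def simulate(r, x0=0.2, steps=60):
        xs = [x0]
        for _ in range(steps):
            xs.append(r * xs[-1] * (1 - xs[-1]))
        return xs

    for r in (2.8, 3.2, 3.9):   # stable, oscillating and chaotic regimes
        tail = [round(x, 3) for x in simulate(r)[-4:]]
        print(f"r={r}: last values {tail}")
    ```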

    • Jamie Morgan says:

      Thanks David, I’ll think about this. This paper was set up as a process of posing basic philosophical questions rather than as a survey of the full range of possible applications and proof areas.

      As you say though there are many more avenues here that one might consider.

      In terms of crisis, one might say it is true that there are a limited number of possible futures in a truistic sense. But if context can mean that change can be profound, then there are times when the nature of possible futures is not exhausted by the limited number of possible futures we can conceive, even though it remains ‘limited’. There is also the conceptual problem that one can have a position of radical uncertainty regarding the future but know that the results of radical uncertainty create reasonably definite behaviours – fear responses are explicable and anticipatable and are responses to an environment where one cannot know the outcome, so ordinary behaviours are refused – the result being: do nothing (with consequences) or do new somethings (as defences).

      These of course are not frequency based probability issues but they are definitely belief based likelihood issues in Keynes’ sense.

      I wrote a paper with Wendy on agency and rules which might be relevant here – did she give you a copy?

      Thanks again for the comments.

  • Heikki Patomaki says:

    This is a very good paper written in a Wittgensteinian fashion (though W used continuous numbering, which you might also consider, e.g. 1.1…, 2.1…, 3.1… etc.).

    As Alan, Douglas and David have already made many insightful comments, I would like to raise two issues and focus on them. First is the question – asked already by Jamie in one of his responses – whether the language of probability is the right one to be using. The second question is closely related: what are the criteria of validity or success of probabilistic forecasts or, more generally, anticipations?

    These questions are closely intertwined because, as Habermas argues, “we understand a speech act when we know the kinds of reasons that a speaker could provide in order to convince a hearer that he is entitled in the given circumstances to claim validity for his utterance—in short, when we know what makes it acceptable”. In historically evolving, complex dynamic contexts involving agency – where there can be no stable frequencies except for a limited period of time – what kinds of reasons may we provide for the validity of a probability-claim?

    The list on p.4 analyses this question, and the point on p.7 about forecasts as an “expectations shaping exercise” is also highly relevant. From a strategic action point of view, what matters, firstly, is whether a forecast contributes to bringing about the intended result, quite independently of whether it assumes the form of a self-fulfilling or self-denying prophecy or, say, constructs possible futures to minimize current damage (e.g. in the context of calculating bank undercapitalization etc). Secondly, however, these kinds of forecasts are always subject to external critique as well.

    The “credibility compromise” (p.7) means that attempts to exaggerate the promise or dangers of X can easily backfire, if ex post it is all too easy to see a major discrepancy between the shaped anticipation and actual outcome. Moreover, in the context of business and financial boom-and-bust cycles official bodies often face a dilemma. They do not want to trigger a downturn and thus they continue with the official optimism, qualified only by occasional acts and words of caution. This may actually serve to make the downturn or crisis more dramatic.

    Claims about the (most) likely short-term future are in essence rhetorical (political) attempts to convince the audience for the sake of ensuring a desired outcome, subject to credibility-constraints and various strategic dilemmas. The same holds true for various longer-term anticipations, say, concerning future inflation rates or aging and the rising costs of pensions. While they may be successful in convincing the audience of the need for a monetarist turn in economic policy and central bank independence, or pension reform, the question is whether that is sufficient for making these claims acceptable. What if monetarist economic policies also bring about unintended consequences such as a decline in per capita growth, rising income inequalities or political alienation? What if volatility and oscillations in financial markets consume the capital of private pension funds, so that there will be in fact less money available for paying pensions?

    Here I can see a good reason for Jamie’s (and Tony Lawson’s) skepticism about using the language of probability. There tends to be an element of deception in attempts to use complex econometric and other technical models based on the standard frequency interpretation of probability for purposes that are essentially rhetorical/political, as representations of mathematical-technical preciseness can lend credibility to forecasts and related probability-claims.

    This skepticism is well-taken even when we take into account that the validity of strategically motivated rhetorical claims can be evaluated from various (critical) viewpoints. It is not only the instant success of convincing the audience that matters; there are a number of other pertinent considerations as well.

    The concept of probability changes its meaning, however, when we adopt the actor-perspective ourselves and raise the ethico-political question: what is the right course of action in this context and under these circumstances? What should I/we do? To answer these questions necessarily involves (i) anticipations of the future and (ii) assessments of likely consequences of one’s own actions. Moreover, from this perspective, it is not possible to follow the logic of strategic actions only. The criteria of successful communicative action include, among other things, also the truth and truthfulness of claims.

    What is the basis for my/our reasonable ethical (and political) actions? Following G. E. Moore, this is the question Keynes posed in his Treatise on Probability. I do not have the space (or time) to start a discussion on Keynes’s answer and its limits. The point is just to stress the idea of probability as a practical concept.

    I think the concept of probability is vital for comparing different anticipations in terms of whether they can provide an ‘approvable basis for action’ – now, or in the near or not-so-distant future.

    • Jamie Morgan says:

      Hello Heikki, the practice of seeking to anticipate and shape the future is vital, but is the concept of probability? This comes down to: what is the concept of probability? You have a better idea of what the concept ought to be than I do. I’m still stuck on its problems.

      For example,

      Do your provisos above translate into the absence of certainty?
      Probably yes in any non-trivial case.
      But this is a trivial point because certainty is not the measure of even ordinary activity; it is confidence in outcomes

      There is a major grey area here, it seems, between ‘x will happen’ and ‘I should do x’ in terms of the cases that seem to interest you.

      Does it make sense to state ‘we should do x and x is probable’
      rather than ‘we should do x and x is desirable and realisable’?

      I’m not sure

      Both ‘will happen’ and ‘should do’ are, in the agent sense you are focusing on, not just markers for chains of reasoning and evidence as forms of justification that then express a probability; they seem to be also the final statement of a chain of reasons for acting – a springboard for the translation of reasons for acting into acting… The concept of probability becomes something quite different when one’s own chains of reasoning and one’s own actions affect the nature of outcomes in this intimate way – we did x, followed by y occurred (even if z was the intention). What do you think?

  • Heikki Patomaki says:

    Yes, Jamie, the concept of probability is about degrees of confidence or rational belief in particular outcomes, in a situation where neither total ignorance nor certainty prevails (its sister-concept plausibility is about degrees of confidence in premises, arguments and conclusions, coming very close to the meaning of probability as well). This kind of epistemological uncertainty was Keynes’s starting point in his Treatise on Probability, though he also adhered to a sort of Platonic metaphysics.

    In the absence of certainty about outcomes, what basis can there be for any action, policy or emancipatory transformation? If we are not certain about what outcome will follow from, say, me smoking cigarettes or taking particular medicine, should we then conclude that we have no knowledge at all about the outcome? Or if our aim is to reduce unemployment in Europe, but we have no certain knowledge about the outcomes of policy X, should we then say that we must choose our policy arbitrarily, perhaps simply by casting dice or consulting astrologers? Or if our aim is to empower people by democratizing social context C, but we have no knowledge whatever about the real effects of this transformation (which may thus be highly undesirable), should we then conclude that the best course of action is to refrain from even trying that sort of emancipation?

    Some skeptical empiricists, from Hume to Moore, have answered all these questions in the affirmative. Because we cannot know the (long-term) consequences of our actions, the best we can do is, as proposed by Hume, to follow conventional moral rules. The only thing we can do is to rely on certain conventional principles, in practice meaning that mostly it is best to leave things as they are – or to the invisible hand of markets. We may have no real certainty whether markets in fact work, but at least in decentralized markets no-one is claiming any better knowledge about what is good, or what the consequences of action are, or how the whole works. In contrast to the neoclassical habit of pretending to deductive and conclusive arguments and certain knowledge, this would amount to a skeptical line of defending the capitalist market economy and society (in fact, not so far from Hayek).

    The concept of probability is the key to making rational knowledge-claims concerning outcomes of considered actions, policies or transformations. I think it would be a mistake to discard the concept because it has been associated with the assumption of stable frequencies, particular forms of mathematics (system of numbers, arithmetic, algebra and probability calculus) and practices and institutions of capitalist market society. It is possible to develop the concept further, for instance by combining insights from Bayesian and Keynesian theories within the epistemological framework of informal, practical or dialectical argumentation and ontological framework of critical scientific realism. To repeat, the concept of probability is vital for comparing different anticipations in terms of whether they can provide an approvable basis for action or not.
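
    To indicate one direction such a development could take (a deliberately minimal sketch of my own; the hypotheses, likelihoods and evidence stream are entirely hypothetical, and no claim is made that this is how Keynes or the paper would formalize it), probability as a revisable degree of rational belief can be written down very simply:

    ```python
    # Probability as a revisable degree of rational belief (everything here is
    # hypothetical): belief in 'policy X works' is updated as evidence arrives.
    prior = {"works": 0.5, "does_not": 0.5}            # initial degrees of belief
    likelihood_fall = {"works": 0.7, "does_not": 0.3}  # assumed chance of observing a
                                                       # fall in unemployment under each

    def update(belief, observed_fall):
        like = likelihood_fall if observed_fall else {
            h: 1.0 - p for h, p in likelihood_fall.items()}
        unnorm = {h: belief[h] * like[h] for h in belief}
        total = sum(unnorm.values())
        return {h: v / total for h, v in unnorm.items()}

    belief = prior
    for observed_fall in (True, True, False, True):    # hypothetical evidence stream
        belief = update(belief, observed_fall)
    print(belief)   # a degree of confidence relative to this evidence,
                    # not a stable frequency of anything
    ```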

    Clearly, the point is not to reduce normative questions to questions about probability. Rather, whenever consequences matter, normative arguments must involve considerations about the likely or probable outcomes of whatever we may be recommending or advocating.