There Is No Intertemporal Budget Constraint

Last week Nick Rowe posted a link to a just-published article in a special issue of the Review of Keynesian Economics commemorating the 80th anniversary of the General Theory. Nick’s article discusses the confusion in the General Theory between saving and hoarding, and Nick invited readers to weigh in with comments about his article. The ROKE issue also features an article by Simon Wren-Lewis explaining the eclipse of Keynesian theory as a result of the New Classical Counter-Revolution, correctly identified by Wren-Lewis as a revolution inspired not by empirical success but by a methodological obsession with reductive micro-foundationalism. While deploring the New Classical methodological authoritarianism, Wren-Lewis takes solace from the ability of New Keynesians to survive under the New Classical methodological regime, salvaging a role for activist counter-cyclical policy by, in effect, negotiating a safe haven for the sticky-price assumption despite its shaky methodological credentials. The methodological fiction that sticky prices qualify as micro-founded allowed New Keynesianism, and with it the core Keynesian policy message, to survive the ascendancy of micro-foundationalist methodology.

I mention the Wren-Lewis article in this context because of an exchange between two of the commenters on Nick’s article: the presumably pseudonymous Avon Barksdale and blogger Jason Smith about microfoundations and Keynesian economics. Avon began by chastising Nick for wasting time discussing Keynes’s 80-year-old ideas, something Avon thinks would never happen in a discussion about a true science like physics, the 100-year-old ideas of Einstein being of no interest except insofar as they have been incorporated into the theoretical corpus of modern physics. Of course, this is simply vulgar scientism, as if the only legitimate way to do economics is to mimic how physicists do physics. This methodological scolding is typically charming New Classical arrogance. Sort of reminds one of how Friedrich Engels described Marxian theory as scientific socialism. I mean who, other than a religious fanatic, would be stupid enough to argue with the assertions of science?

Avon continues with a quotation from David Levine, a fine economist who has done a lot of good work, but who is also enthralled by the New Classical methodology. Avon’s scientism provoked the following comment from Jason Smith, a Ph.D. in physics with a deep interest in and understanding of economics.

You quote from Levine: “Keynesianism as argued by people such as Paul Krugman and Brad DeLong is a theory without people either rational or irrational”

This is false. The L in ISLM means liquidity preference and e.g. here …

http://krugman.blogs.nytimes.com/2013/11/18/the-new-keynesian-case-for-fiscal-policy-wonkish/

… Krugman mentions an Euler equation. The Euler equation essentially says that an agent must be indifferent between consuming one more unit today on the one hand and saving that unit and consuming in the future on the other if utility is maximized.

So there are agents in both formulations preferring one state of the world relative to others.
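For reference, the consumption Euler equation Jason alludes to is standardly written as follows (this is the generic textbook form, not necessarily the exact one in Krugman’s post):

```latex
u'(c_t) \;=\; \beta\,(1 + r_t)\,\mathbb{E}_t\!\left[\,u'(c_{t+1})\,\right]
```

where $u'$ is marginal utility, $\beta$ the discount factor, and $r_t$ the real interest rate: at an optimum the agent is indifferent between consuming a marginal unit today and saving it to consume tomorrow.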

Avon replied:

Jason,

“This is false. The L in ISLM means liquidity preference and e.g. here”

I know what ISLM is. It’s not recursive so it really doesn’t have people in it. The dynamics are not set by any micro-foundation. If you’d like to see models with people in them, try Ljungqvist and Sargent, Recursive Macroeconomic Theory.

To which Jason retorted:

Avon,

So the definition of “people” is restricted to agents making multi-period optimizations over time, solving a dynamic programming problem?

Well then any such theory is obviously wrong because people don’t behave that way. For example, humans don’t optimize the dictator game. How can you add up optimizing agents and get a result that is true for non-optimizing agents … coincident with the details of the optimizing agents mattering.

Your microfoundation requirement is like saying the ideal gas law doesn’t have any atoms in it. And it doesn’t! It is an aggregate property of individual “agents” that don’t have properties like temperature or pressure (or even volume in a meaningful sense). Atoms optimize entropy, but not out of any preferences.

So how do you know for a fact that macro properties like inflation or interest rates are directly related to agent optimizations? Maybe inflation is like temperature — it doesn’t exist for individuals and is only a property of economics in aggregate.

These questions are not answered definitively, and they’d have to be to enforce a requirement for microfoundations … or a particular way of solving the problem.

Are quarks important to nuclear physics? Not really — it’s all pions and nucleons. Emergent degrees of freedom. Sure, you can calculate pion scattering from QCD lattice calculations (quark and gluon DoF), but it doesn’t give an empirically better result than chiral perturbation theory (pion DoF) that ignores the microfoundations (QCD).

Assuming quarks are required to solve nuclear physics problems would have been a giant step backwards.
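Jason’s temperature analogy can be made concrete with a minimal sketch (natural units, all numbers invented for illustration): temperature is defined only as an ensemble average, so asking for the temperature of a single particle is a category error, just as, on his suggestion, inflation may not exist for an individual.

```python
import random

def temperature(velocities):
    """Temperature of a 1-D ensemble in natural units (m = k_B = 1):
    equipartition gives T = mean of v^2 over the ensemble."""
    return sum(v * v for v in velocities) / len(velocities)

rng = random.Random(42)
# At T = 1, 1-D Maxwell-Boltzmann velocities are Gaussian with variance 1.
gas = [rng.gauss(0.0, 1.0) for _ in range(100_000)]

print(temperature(gas))      # close to 1.0 for a large ensemble
print(temperature(gas[:1]))  # the "temperature" of one particle is just
                             # its v^2: a number, but not a thermodynamic
                             # property of anything
```

The aggregate quantity is perfectly well defined and measurable even though no individual constituent possesses it.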

To which Avon rejoined:

Jason

The microfoundation of nuclear physics and quarks is quantum mechanics and quantum field theory. How the degrees of freedom reorganize under the renormalization group flow, what effective field theory results is an empirical question. Keynesian economics is worse tha[n] useless. It’s wrong empirically, it has no theoretical foundation, it has no laws. It has no microfoundation. No serious grad school has taught Keynesian economics in nearly 40 years.

To which Jason answered:

Avon,

RG flow is irrelevant to chiral perturbation theory which is based on the approximate chiral symmetry of QCD. And chiral perturbation theory could exist without QCD as the “microfoundation”.

Quantum field theory is not a ‘microfoundation’, but rather a framework for building theories that may or may not have microfoundations. As Weinberg (1979) said:

“… quantum field theory itself has no content beyond analyticity, unitarity, cluster decomposition, and symmetry.”

If I put together an NJL model, there is no requirement that the scalar field condensate be composed of quark-antiquark pairs. In fact, the basic idea was used for Cooper pairs as a model of superconductivity. Same macro theory; different microfoundations. And that is a general problem with microfoundations — different microfoundations can lead to the same macro theory, so which one is right?

And the IS-LM model is actually pretty empirically accurate (for economics):

http://informationtransfereconomics.blogspot.com/2014/03/the-islm-model-again.html

To which Avon responded:

First, ISLM analysis does not hold empirically. It just doesn’t work. That’s why we ended up with the macro revolution of the 70s and 80s. Keynesian economics ignores intertemporal budget constraints, it violates Ricardian equivalence. It’s just not the way the world works. People might not solve dynamic programs to set their consumption path, but at least these models include a future which people plan over. These models work far better than Keynesian ISLM reasoning.

As for chiral perturbation theory and the approximate chiral symmetries of QCD, I am not making the case that NJL models require QCD. NJL is an effective field theory so it comes from something else. That something else happens to be QCD. It could have been something else; that’s an empirical question. The microfoundation I’m talking about with theories like NJL is QFT and the symmetries of the vacuum, not the short distance physics that might be responsible for it. The microfoundation here is about the basic laws, the principles.

ISLM and Keynesian economics has none of this. There is no principle. The microfoundation of modern macro is not about increasing the degrees of freedom to model every person in the economy on some short distance scale, it is about building the basic principles from consistent economic laws that we find in microeconomics.

Well, I totally agree that IS-LM is a flawed macroeconomic model, and, in its original form, it was borderline-incoherent, being a single-period model with an interest rate, a concept without meaning except as an intertemporal price relationship. These deficiencies of IS-LM became obvious in the 1970s, so the model was extended to include a future period, with an expected future price level, making it possible to speak meaningfully about real and nominal interest rates, inflation and an equilibrium rate of spending. So the failure of IS-LM to explain stagflation, cited by Avon as the justification for rejecting IS-LM in favor of New Classical macro, was not that hard to fix, at least enough to make it serviceable. And comparisons of the empirical success of augmented IS-LM and the New Classical models have shown that IS-LM models consistently outperform New Classical models.

What Avon fails to see is that the microfoundations that he considers essential for macroeconomics are themselves derived from the assumption that the economy is operating in macroeconomic equilibrium. Thus, insisting on microfoundations – at least in the formalist sense that Avon and New Classical macroeconomists understand the term – does not provide a foundation for macroeconomics; it is just question-begging, a.k.a. circular reasoning or petitio principii.

The circularity is obvious from even a cursory reading of Samuelson’s Foundations of Economic Analysis, Robert Lucas’s model for doing economics. What Samuelson called meaningful theorems – thereby betraying his misguided acceptance of the now discredited logical positivist dogma that only potentially empirically verifiable statements have meaning – are derived using the comparative-statics method, which involves finding the sign of the derivative of an endogenous economic variable with respect to a change in some parameter. But the comparative-statics method is premised on the assumption that before and after the parameter change the system is in full equilibrium or at an optimum, and that the equilibrium, if not unique, is at least locally stable and the parameter change is sufficiently small not to displace the system so far that it does not revert back to a new equilibrium close to the original one. So the microeconomic laws invoked by Avon are valid only in the neighborhood of a stable equilibrium, and the macroeconomics that Avon’s New Classical mentors have imposed on the economics profession is a macroeconomics that, by methodological fiat, is operative only in the neighborhood of a locally stable equilibrium.
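The comparative-statics logic can be stated compactly (a generic sketch, not a quotation from Samuelson). An equilibrium condition $F(x,\theta)=0$ implicitly defines the equilibrium value $x^*(\theta)$, and the "meaningful theorem" is the sign of

```latex
\frac{dx^*}{d\theta} \;=\; -\,\frac{\partial F/\partial \theta}{\partial F/\partial x}\,\Bigg|_{(x^*,\theta)}
```

This expression is defined only at an equilibrium where $\partial F/\partial x \neq 0$, and it describes an observed outcome only if the equilibrium is locally stable, so that the system actually settles at the new $x^*$ after the parameter change. Off the equilibrium path it says nothing.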

Avon dismisses Keynesian economics because it ignores intertemporal budget constraints. But the intertemporal budget constraint doesn’t exist in any objective sense. Certainly macroeconomics has to take into account intertemporal choice, but the idea of an intertemporal budget constraint analogous to the microeconomic budget constraint underlying the basic theory of consumer choice is totally misguided. In the static theory of consumer choice, the consumer has a given resource endowment and known prices at which consumers can transact at will, so the utility-maximizing vector of purchases and sales can be determined as the solution of a constrained-maximization problem.

In the intertemporal context, consumers have a given resource endowment, but prices are not known. So consumers have to make current transactions based on their expectations about future prices and a variety of other circumstances about which consumers can only guess. Their budget constraints are thus not real but totally conjectural based on their expectations of future prices. The optimizing Euler equations are therefore entirely conjectural as well, and subject to continual revision in response to changing expectations. The idea that the microeconomic theory of consumer choice is straightforwardly applicable to the intertemporal choice problem in a setting in which consumers don’t know what future prices will be and agents’ expectations of future prices are a) likely to be very different from each other and thus b) likely to be different from their ultimate realizations is a huge stretch. The intertemporal budget constraint has a completely different role in macroeconomics from the role it has in microeconomics.
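The contrast can be put in symbols (a stylized sketch). The static constraint is objective because the prices $p_i$ and the endowment $m$ are known, whereas the intertemporal analogue is built out of the agent’s own conjectures:

```latex
\underbrace{\textstyle\sum_i p_i x_i \,\le\, m}_{\text{static: known prices}}
\qquad \text{vs.} \qquad
\underbrace{\sum_{t=0}^{T} \frac{c_t}{(1+\hat r)^t} \;\le\; \sum_{t=0}^{T} \frac{\hat y_t}{(1+\hat r)^t}}_{\text{intertemporal: expected incomes and prices}}
```

Here $\hat y_t$ and $\hat r$ are the agent’s expectations of future income and interest rates, so the right-hand side, and every Euler equation derived from it, shifts whenever expectations are revised.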

If I expect that the demand for my services will be such that my disposable income next year would be $500k, my consumption choices would be very different from what they would have been if I were expecting a disposable income of $100k next year. If I expect a disposable income of $500k next year, and it turns out that next year’s income is only $100k, I may find myself in considerable difficulty, because my planned expenditure and the future payments I have obligated myself to make may exceed my disposable income or my capacity to borrow. So if there are a lot of people who overestimate their future incomes, the repercussions of their over-optimism may reverberate throughout the economy, leading to bankruptcies and unemployment and other bad stuff.

A large enough initial shock of mistaken expectations can become self-amplifying, at least for a time, possibly resembling the way a large initial displacement of water can generate a tsunami. A financial crisis, which is hard to model as an equilibrium phenomenon, may rather be an emergent phenomenon with microeconomic sources, but whose propagation can’t be described in microeconomic terms. New Classical macroeconomics simply excludes such possibilities on methodological grounds by imposing a rational-expectations general-equilibrium structure on all macroeconomic models.
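The propagation story above can be illustrated with a toy simulation. Every number and mechanism here is invented purely for illustration; this is a sketch of the self-amplification idea, not a model from the literature:

```python
def cascade(n=1000, shocked=300, shock=0.5, committed=99.0,
            buffer=2.0, spillover=0.2):
    """Toy illustration of self-amplifying expectational mistakes.
    Every agent expected income 100 and committed to spending
    `committed` against it; `shocked` agents instead realize
    (1 - shock) * 100.  An agent defaults when commitments exceed
    realized income plus a borrowing `buffer`; each wave of defaults
    removes the defaulters' spending from the economy, cutting every
    remaining agent's income and possibly triggering further waves.
    All parameter values are invented for illustration."""
    income = [100.0 * (1 - shock)] * shocked + [100.0] * (n - shocked)
    defaulted = set()
    waves = []
    while True:
        new = {i for i in range(n)
               if i not in defaulted and committed > income[i] + buffer}
        if not new:
            return waves
        waves.append(len(new))
        lost = committed * len(new)      # spending that vanishes
        hit = spillover * lost / n       # income hit spread over all agents
        income = [y - hit for y in income]
        defaulted |= new

print(cascade())            # -> [300, 700]: the over-optimists default
                            #    first, then drag everyone else under
print(cascade(shocked=50))  # -> [50]: a small shock stays contained
```

With these invented parameters a large initial shock wipes out the whole economy in two waves, while a small one stops after the first: the aggregate outcome depends on the propagation mechanism, not just on summing the initial mistakes.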

This is not to say that the rational expectations assumption does not have a useful analytical role in macroeconomics. But the most interesting and most important problems in macroeconomics arise when the rational expectations assumption does not hold, because it is when individual expectations are very different and very unstable – say, like now, for instance — that macroeconomies become vulnerable to really scary instability.

Simon Wren-Lewis makes a similar point in his paper in the Review of Keynesian Economics.

Much discussion of current divisions within macroeconomics focuses on the ‘saltwater/freshwater’ divide. This understates the importance of the New Classical Counter Revolution (hereafter NCCR). It may be more helpful to think about the NCCR as involving two strands. The one most commonly talked about involves Keynesian monetary and fiscal policy. That is of course very important, and plays a role in the policy reaction to the recent Great Recession. However I want to suggest that in some ways the second strand, which was methodological, is more important. The NCCR helped completely change the way academic macroeconomics is done.

Before the NCCR, macroeconomics was an intensely empirical discipline: something made possible by the developments in statistics and econometrics inspired by The General Theory. After the NCCR and its emphasis on microfoundations, it became much more deductive. As Hoover (2001, p. 72) writes, ‘[t]he conviction that macroeconomics must possess microfoundations has changed the face of the discipline in the last quarter century’. In terms of this second strand, the NCCR was triumphant and remains largely unchallenged within mainstream academic macroeconomics.

Perhaps I will have some more to say about Wren-Lewis’s article in a future post. And perhaps also about Nick Rowe’s article.

HT: Tom Brown

Update (02/11/16):

On his blog Jason Smith provides some further commentary on his exchange with Avon on Nick Rowe’s blog, explaining at greater length how irrelevant microfoundations are to doing real empirically relevant physics. He also expands on and puts into a broader meta-theoretical context my point about the extremely narrow range of applicability of the rational-expectations equilibrium assumptions of New Classical macroeconomics.

David Glasner found a back-and-forth between me and a commenter (with the pseudonym “Avon Barksdale” after [a] character on The Wire who [didn’t end] up taking an economics class [per Tom below]) on Nick Rowe’s blog who expressed the (widely held) view that the only scientific way to proceed in economics is with rigorous microfoundations. “Avon” held physics up as a purported shining example of this approach.
I couldn’t let it go: even physics isn’t that reductionist. I gave several examples of cases where the microfoundations were actually known, but not used to figure things out: thermodynamics, nuclear physics.

Even modern physics is supposedly built on string theory. However physicists do not require every pion scattering amplitude be calculated from QCD. Some people do do so-called lattice calculations. But many resort to the “effective” chiral perturbation theory.

In a sense, that was what my thesis was about — an effective theory that bridges the gap between lattice QCD and chiral perturbation theory. That effective theory even gave up on one of the basic principles of QCD — confinement. It would be like an economist giving up opportunity cost (a basic principle of the micro theory).

But no physicist ever said to me “your model is flawed because it doesn’t have true microfoundations”. That’s because the kind of hard core reductionism that surrounds the microfoundations paradigm doesn’t exist in physics — the most hard core reductionist natural science!
In his post, Glasner repeated something that he had before and — probably because it was in the context of a bunch of quotes about physics — I thought of another analogy.

Glasner says:

But the comparative-statics method is premised on the assumption that before and after the parameter change the system is in full equilibrium or at an optimum, and that the equilibrium, if not unique, is at least locally stable and the parameter change is sufficiently small not to displace the system so far that it does not revert back to a new equilibrium close to the original one. So the microeconomic laws invoked by Avon are valid only in the neighborhood of a stable equilibrium, and the macroeconomics that Avon’s New Classical mentors have imposed on the economics profession is a macroeconomics that, by methodological fiat, is operative only in the neighborhood of a locally stable equilibrium.

 

This hits on a basic principle of physics: any theory radically simplifies near an equilibrium.
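Jason’s point is, at bottom, linearization: expand any dynamics around a rest point and only the first-order term survives (a generic statement, not specific to any one model):

```latex
\dot x = F(x), \quad F(x^*) = 0
\;\Longrightarrow\;
\dot x \,\approx\, F'(x^*)\,(x - x^*) \quad \text{for } x \text{ near } x^*
```

Whatever the global nonlinearities, local behavior is governed by the single number $F'(x^*)$, which is why theories look so simple, and so similar, near equilibrium, and why that simplicity says nothing about behavior far from it.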

Go to Jason’s blog to read the rest of his important and insightful post.

24 Responses to “There Is No Intertemporal Budget Constraint”


  1. 1 Henry February 9, 2016 at 9:44 pm

    Thanks David – brilliant post.

  2. 2 Ramanan February 9, 2016 at 10:24 pm

    Cluster decomposition principle.

    Won’t even find it beyond Steven Weinberg’s book on Quantum Field Theory.

    Never thought the term would be used in an economics forum.

  3. 3 Hugo André February 10, 2016 at 9:23 am

    Now you’re even writing blogposts about comments made on other blogs? Where will we end up if this trend continues…

    I do have one question if David or anyone else has time to answer it: What’s the problem with Samuelson’s contention that “only potentially empirically verifiable statements have meaning” ?

    Excellent post, anyway!

  4. 4 Tom Brown February 10, 2016 at 4:52 pm

    David, a very interesting post. You may be interested to know that Jason contemplated doing a follow up on intertemporal budget constraints of his own recently (inspired by your comment on your last post). Should be interesting!

  5. 5 David Glasner February 10, 2016 at 5:03 pm

    Henry, Thanks, glad you liked it.

    Ramanan, I only quoted it, can’t say I understood it.

    Hugo, It’s all Tom Brown’s fault, he showed it to me. The problem with Samuelson is the idea that statements that can’t, in principle, be empirically verified have no meaning. According to that view, the previous sentence and this one are meaningless. That’s clearly false, and it is amazing that many brilliant people aside from Samuelson actually believed it.

    Tom, OK, I must admit that I have trouble following Jason, so I don’t visit his blog that often, so I’m relying on you to alert me.

  6. 6 Tom Brown February 10, 2016 at 5:10 pm

    David, no problem. Although I already alerted him about this post, so perhaps my work is done here. ;^D (but, sure, I will do so if required)

  7. 7 Tom Brown February 10, 2016 at 5:21 pm

    Also, it’s funny “intertemporal budget constraints” have come up: I spent last night trying to understand this post on Diamond-Dybvig. I get parts of it, but I think I need to understand the mainstream version first. Unfortunately graphical approaches seem to be scarce.

  8. 8 Hugo André February 10, 2016 at 6:17 pm

    What you say is patently true which is why I can’t imagine that Samuelson would have meant it that way. Did he actually write something like that? More importantly, does his writing make clear that he believed it in the literal sense?

    For example, couldn’t he have meant it in the sense that only empirically verifiable statements are scientifically useful so that a statement like “in reality the whole universe is situated on the toe of a giant named Ug”, though not technically impossible, is useless because we can never find out about its veracity.

    I’m sorry to be hammering on about this minor detail. Feel free to disregard my reply if you feel like it.

  9. 9 David Glasner February 10, 2016 at 6:43 pm

    Hugo, You are right that I am only surmising that that is what Samuelson meant, although it may be that I once read something of his that would have been more definitive along those lines, but if I did I no longer have a specific recollection of it. But as I am writing this, it occurs to me that Samuelson’s theory of revealed preference, which is actually a wonderful theoretical insight into the neoclassical theory of consumer choice and preferences, was similarly motivated by the desire to reformulate neoclassical theory without any reference to unobservable (and thus meaningless) concepts like marginal utility. I can’t quote you a definitive source to prove that that is what Samuelson was thinking, but you could check Samuelson’s original paper on revealed preference. If you read up on logical positivism, you will see that in the mid-twentieth century logical positivism was the dominant philosophical doctrine, so Samuelson was simply going with the flow. He would have simply been trying to describe what he was doing in the Foundations in a way that conformed to the standard philosophical views of his time.

  10. 10 Tom Brown February 10, 2016 at 9:11 pm

    David, you probably saw this on twitter, but to keep my promise above:
    http://informationtransfereconomics.blogspot.com/2016/02/one-more-physics-analogy.html

  11. 11 Tom Brown February 10, 2016 at 10:30 pm

    … actually there are some choice bits in that post I just linked to: perhaps worthy of an update?

  12. 12 Ben Johannson February 11, 2016 at 3:14 am

    Keynesian economics is worse tha[n] useless. It’s wrong empirically, it has no theoretical foundation, it has no laws. It has no microfoundation.

    Avon’s entire series of posts can be encapsulated by logical and knowledge failures in the above quote.

    “It’s wrong empirically. . .” Which part? What has been falsified? He doesn’t tell us and I suspect isn’t familiar enough to do so.

    “. . . it has no theoretical foundation. . .” Either he hasn’t read anything, doesn’t know what theory is or is arguing in less than good faith.

    “. . . it has no laws, it has no microfoundation.” This is flawed logic. The writer begins with the twin assumptions that a school of thought must begin with microfoundations and claim immutable laws. Keynesianism does not and therefore must be “worse than useless.” Having crafted an exceedingly narrow definition of useful which excludes the work he disapproved of, the writer then dismisses the work as failing to meet the definition he created to exclude it.

  13. 13 Greg Hill (@GregHill1000) February 11, 2016 at 8:15 am

    Hi David,

    Great post! On the issue of whether Keynesian models are composed of “people,” you summarize, “So there are agents in both formulations (Keynesian and New Classical) preferring one state of the world relative to others.”

    I take your point and agree with it as far as it goes. But let me take it a step further: before we can prefer “one state of the world relative to others,” we must have some conception of what these “states” are. If we’re really talking about “people,” then virtually every conception of a “state” that forms the basis for a choice will involve some kind of narrative description.

    Market participants aren’t presented with a “list of states” from which to choose (as in Arrow-Debreu), but must construct these states, or scenarios, using their imagination and other powers. Moreover, we have considerable freedom in imagining these states. I think it follows from this (and some other premises) that the expectations of these autonomous market participants will most likely be inconsistent with each other.

    At this point, I think my argument dovetails with another one of yours, “What Avon fails to see is that the microfoundations that he considers essential for macroeconomics are themselves derived from the assumption that the economy is operating in macroeconomic equilibrium.” If the postulate of rational expectations implicitly assumes an economy “in macroeconomic equilibrium,” and if this equilibrium, itself, depends on mutually consistent plans and expectations, then small deviations from either side are self-reinforcing, and the paradigm dissolves into inconsistent expectations (which need not be irrational) and disequilibrium.

    Some of these ideas (if I may dignify them as such) are explored further here:

    http://www.the-human-predicament.com/2015/09/rational-expectations-and.html

    Best regards, Greg

  14. 14 Benjamin Cole February 11, 2016 at 4:48 pm

    Demanding post.
    Sadly, I still believe macroeconomics is politics in drag. Practitioners of macroeconomics seem to start with a premise–be it rich people should be taxed more or rich people should be taxed less or the government should run deficits or not–and then work backwards from there creating fantastically complicated and brilliant arguments to support their case.

    An advocacy lawyer can be telling the truth. That reality makes dissecting macro economic arguments even more challenging.

  15. 15 Henry February 11, 2016 at 8:13 pm

    “Perhaps I will have some more to say about………..Nick Rowe’s article”

    David,

    Would look forward to seeing what you have to say.

  16. 16 Jason Smith February 11, 2016 at 9:28 pm

    Thanks, David! I’m glad you liked it!

  17. 17 Tom Brown February 13, 2016 at 12:29 am

    More on “macrofoundations”: with contributions from you, Nick Rowe, and Jason (being it’s his post of course), … and an ecologist too. Enjoy.

  18. 18 jpd February 13, 2016 at 9:09 am

    Reblogged this on DAMIJAN blog and commented:
    Huh, if you are even slightly interested in the “guts” of economics and the heart of the difference between the neoclassical reductionist school and the broader Keynesian school, this post by David Glasner is a MUST READ.
    In short, the post shows that the neoclassical school of economics, which “models” the behavior of people, is substantially more restrictive (reductionist) than the most hard-core physics, which tries to understand (model) the behavior of the fundamental particles of nature, whose behavior is in fact far more predictable than that of people.

  19. 19 David Glasner February 15, 2016 at 9:16 am

    Tom, Thanks for the link. You are right, it would have been worthy, but one update was enough for me.

    Ben, Agreed.

    Greg, Actually, the summary you quote was Jason’s not mine. At any rate, I think you make a very good point, which possibly relates to the Keynesian distinction between probability and uncertainty. The Arrow-Debreu framework depends on being able to reduce all knowledge imperfections to known probability distributions, which essentially assumes away the problem (i.e., begs the question). Our uncertainty about the future is not just having to work with a probability distribution rather than a point estimate; we just don’t have a clue about what’s going to happen. Who would have ever thought that Donald Trump would become a “serious” candidate for President of the United States?

    Thanks for the link, which makes an important point about the extreme nature of the rational expectations assumption that all agents share the same expectations. The assumption is a good way of checking on the internal consistency of the model, because the assumption of rational expectations should imply that the model is at a fixed point, but it is an absurd assumption to make if one is trying to model real world economic fluctuations.

    Benjamin, Well, I am trying to take the politics out of macro. I just love lost causes.

    Henry, Well, unfortunately I don’t see a link to the paper, so don’t hold your breath.

    Jason, You’re welcome, Jason. I did, indeed.

    Tom, Thanks, again.

  20. 20 Henry February 15, 2016 at 1:05 pm

    David,

    The link is there:

    http://www.elgaronline.com/abstract/journals/roke/4-1/roke.2016.01.05.xml

    I tried to get the paper from the local university, but unfortunately they don’t subscribe to the journal (well, we’re all New Classicals now, aren’t we?) and I ain’t gonna pay $35 for the privilege. If I paid $35 for every paper I’ve read I would be destitute. Nick Rowe does make various comments in several posts which flesh out his position but they are more like assertions than argument so it’s difficult to pin it down.

  21. 21 David Glasner February 15, 2016 at 1:43 pm

    Henry, I saw the abstract, but I don’t feel that I can write a post about the paper without reading the paper, not just the abstract.

  22. 22 Henry February 15, 2016 at 2:33 pm

    Totally understand David. The abstract and his various posts are not even clear about what he is asserting let alone his reasoning.

  23. 23 Mitch Golden March 10, 2016 at 8:15 pm

    I left a similar comment on Jason Smith’s blog, but I think you might be interested in it so I’ll put a version of it here:

    I am a (former) physicist, so I was quite amused by these discussions. In fact Avon Barksdale is even wronger than anyone has stated. In physics, both historically and as a matter of principle, one always starts with the macro theory and proceeds to the micro. Macro tests micro, not the other way around.

    For example: Modern science essentially started with Newton’s Law of Gravitation explaining the motions of the planets. Planets are pretty macro after all. What goes into that? Well, for one thing you need a rigid body approximation – the planets don’t collapse to points under their own weight.

    The composition of matter wasn’t understood for more than two centuries after, but one of the things that had to be explained was the existence of rigid bodies. That is brought about by quantum mechanics and the antisymmetry of fermion wave functions, ensuring that atomic orbitals take up space and can’t sit on top of each other.

    Had a proposed theory of the structure of matter failed to explain rigid bodies, it would not have been the rigid body approximation that would be invalidated, rather the proposed micro theory.

    Or to use another example: thermodynamics was a well-validated theory – one that was used to design steam engines for example – for long before the atomic theory of matter was accepted and statistical mechanics gave “micro-founded” meanings to the macroscopic variables such as “pressure” and “temperature”. And, as in the above case, it’s the macro theory that is the test of the micro theory, not the other way around.

    Philosophically, this is because the world we actually observe is that of the macro variables, not the variables of the micro theory we’re using to explain things.

    The analogy to economics is direct. When an economist draws a supply curve with a “price” on one axis, that is not a variable with micro meaning. We have individual transactions each with something being sold, and the price on the axis is a function of the aggregate of transactions. The micro models Avon Barksdale is referring to are just guesses as to the “physical laws” (i.e. the motivations) obeyed by those entering into the transactions. Nothing about these motivations is itself an observable quantity; only the behavior of the transactions can be measured. (And to top it off, the “laws” are obviously egregious simplifications that no one would mistake for a real explanation of the behavior of an actual human being.)

    If those making the claim that IS-LM should be rejected because it is not microfounded could actually prove some sort of “no-go” theorem, showing that it was not possible with *any* micro model to derive IS-LM, that would be interesting. In the absence of that, the observed truth of IS-LM can be used to rule out particular micro models, but that’s pretty much the end of the story.


  1. 1 Paul Romer on Modern Macroeconomics, Or, the “All Models Are False” Dodge | Uneasy Money Trackback on September 23, 2016 at 11:21 am
