Microfoundations (aka Macroeconomic Reductionism) Redux

In two recent blog posts (here and here), Simon Wren-Lewis wrote sensibly about microfoundations. Though triggered by Wren-Lewis’s posts, the following comments are not intended as criticisms of him, though I think he does give microfoundations (as they are now understood) too much credit. Rather, my criticism is aimed at the way microfoundations have come to be used to restrict the kind of macroeconomic explanations and models that are up for consideration among working macroeconomists. I have written about microfoundations before on this blog (here and here), and some, if not most, of what I am going to say may be repetitive, but obviously the misconceptions associated with what Wren-Lewis calls the “microfoundations project” are not going to be dispelled by a couple of blog posts, so a little repetitiveness may not be such a bad thing. Jim Buchanan liked to quote the following passage from Herbert Spencer’s The Data of Ethics:

Hence an amount of repetition which to some will probably appear tedious. I do not, however, much regret this almost unavoidable result; for only by varied iteration can alien conceptions be forced on reluctant minds.

When the idea of providing microfoundations for macroeconomics started to catch on in the late 1960s – and probably nowhere did they catch on sooner or with more enthusiasm than at UCLA – the idea resonated, because macroeconomics, which then mainly consisted of various versions of the Keynesian model, seemed to embody certain presumptions about how markets work that contradicted the presumptions of microeconomics. In microeconomics, the primary mechanism for achieving equilibrium is the price (actually the relative price) of whatever good is being analyzed. A full (or general) microeconomic equilibrium involves a set of prices such that each market (whether for final outputs or for inputs into the productive process) is in equilibrium, equilibrium meaning that every agent is able to purchase or sell as much of any output or input as desired at the equilibrium price. The set of equilibrium prices not only achieves equilibrium; under some conditions, the equilibrium also has optimal properties, because each agent, in choosing how much to buy or sell of each output or input, is presumed to be acting in a way that is optimal given the preferences of the agent and the social constraints under which the agent operates. Those optimal properties don’t always follow from microeconomic presumptions, optimality being dependent on the particular assumptions (about preferences, production and exchange technology, and property rights) adopted by the analyst in modeling an individual market or an entire system of markets.

The problem with Keynesian macroeconomics was that it seemed to overlook, or ignore, or dismiss, or deny, the possibility that a price mechanism is operating — or could operate — to achieve equilibrium in the markets for goods and for labor services. In other words, the Keynesian model seemed to be saying that a macroeconomic equilibrium is compatible with the absence of market clearing, notwithstanding that the absence of market clearing had always been viewed as the defining characteristic of disequilibrium. Thus, from the perspective of microeconomic theory, if there is an excess supply of workers offering labor services, i.e., there are unemployed workers who would be willing to be employed at the same wage that currently employed workers are receiving, there ought to be market forces that would reduce wages to a level such that all workers willing to work at that wage could gain employment. Keynes, of course, had attempted to explain why workers could only reduce their nominal wages, not their real wages, and argued that nominal wage cuts would simply induce equivalent price reductions, leaving real wages and employment unchanged. The microeconomic reasoning on which that argument was based hinged on Keynes’s assumption that nominal wage cuts would trigger proportionate price cuts, but that assumption was not exactly convincing, if only because the percentage price cut would seem to depend not just on the percentage reduction in the nominal wage, but also on the labor intensity of the product; Keynes habitually, and inconsistently, argued as if labor were the only factor of production while at the same time invoking the principle of diminishing marginal productivity.
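To make the point concrete, here is a back-of-the-envelope sketch (my own construction, not anything found in Keynes) of why the proportionality assumption is doubtful. If prices track marginal cost and labor accounts for only a share of that cost, a nominal wage cut reduces prices less than proportionately, so real wages do fall; the `labor_share` parameter below is purely hypothetical.

```python
# Illustrative sketch (not from Keynes): the price response to a nominal
# wage cut when price tracks marginal cost and labor is only one input.
# 'labor_share' is a hypothetical parameter: labor's share of marginal cost.

def real_wage_change(wage_cut, labor_share):
    """Proportional change in the real wage after a nominal wage cut,
    assuming prices fall by labor_share * wage_cut (other input costs fixed)."""
    price_cut = labor_share * wage_cut
    # real wage = nominal wage / price level
    return (1 - wage_cut) / (1 - price_cut) - 1

# If labor is the only factor (share = 1), a 10% wage cut leaves real wages unchanged:
print(round(real_wage_change(0.10, 1.0), 4))   # 0.0
# With a labor share of 0.3, the same 10% wage cut lowers the real wage by about 7.2%:
print(round(real_wage_change(0.10, 0.3), 4))   # -0.0722
```

Only in the limiting case where labor is the whole of marginal cost does the proportionality argument go through, which is just the point made in the paragraph above.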

At UCLA, the point of finding microfoundations was not to create a macroeconomics that would simply reflect the results and optimal properties of a full general equilibrium model. Indeed, what made the UCLA approach to microeconomics distinctive was that it aimed at deriving testable implications from relaxing the usual informational and institutional assumptions (full information, zero transactions costs, fully defined and enforceable property rights) underlying conventional microeconomic theory. If the way forward in microeconomics was to move away from the extreme assumptions underlying the perfectly competitive model, then it seemed plausible that relaxing those assumptions would be fruitful in macroeconomics as well. That led Armen Alchian and others at UCLA to think of unemployment as largely a search phenomenon. For a while that approach seemed promising, and to some extent the promise was fulfilled, but many implications of a purely search-theoretic approach to unemployment don’t seem to be that well supported empirically. For example, search models suggest that in recessions quits increase, and that workers become more likely to refuse offers of employment after the downturn than before. Neither of those implications seems to be true. A search model would suggest that workers are unemployed because they are refusing offers below their reservation wage, but in fact most workers become unemployed because they are laid off, and in recessions workers seem likely to accept offers of employment at the same wage that other workers are getting. Now it is possible to reinterpret workers’ behavior in recessions in a way that corresponds to the search-theoretic model, but the reinterpretation seems a bit of a stretch.

Even though he was an early exponent of the search theory of unemployment, Alchian greatly admired and frequently cited a 1974 paper by Donald Gordon, “A Neoclassical Theory of Keynesian Unemployment,” which proposed an implicit-contract theory of the employer-employee relationship. The idea was that workers make long-term commitments to their employers and, recognizing how vulnerable that commitment leaves them to exploitation by a unilateral wage cut imposed under threat of termination, expect some assurance from their employer that they will not be subjected to a unilateral demand to accept a wage cut. Such implicit understandings make it very difficult for employers, facing a reduction in demand, to force workers to accept a wage cut, because doing so would make it hard for the employer to retain its most highly valued workers and to attract new ones.

Gordon’s theory of implicit wage contracts has a certain similarity to Dennis Carlton’s explanation of why many suppliers don’t immediately raise prices to their steady customers. Like Gordon, Carlton posits the existence of implicit and sometimes explicit contracts in which customers commit to purchase minimum quantities or to purchase their “requirements” from a particular supplier. In return for the assurance of having a regular customer on whom the supplier can count, the supplier gives the customer assurance that he will receive his customary supply at the agreed upon price even if market conditions should change. Rather than raise the price in the event of a shortage, the supplier may feel that he is obligated to continue supplying his regular customers at the customary price, while raising the price to new or occasional customers to “market-clearing” levels. For certain kinds of supply relationships in which customer and supplier expect to continue transacting regularly over a long period of time, price is not the sole method by which allocation decisions are made.

Klein, Crawford, and Alchian discussed a similar idea in their 1978 article about vertical integration as a means of avoiding or mitigating the threat of holdup when a supplier and a customer must invest in some sunk asset, e.g., a pipeline connection, for the supply relationship to be possible. The sunk investment implies that either party, under the right circumstances, could hold up the other by threatening to withdraw from the relationship, leaving the other party stuck with a useless fixed asset. Vertical integration avoids the problem by aligning the incentives of the two parties, eliminating the potential for holdup. Price rigidity can thus be viewed as a milder form of vertical integration in cases where transactors have a relatively long-term relationship and want to assure each other that they will not be taken advantage of after making a commitment (i.e., foregoing other trading opportunities) to the other party.

The search model is fairly easy to incorporate into a standard framework because search can be treated as a form of self-employment that is an alternative to accepting employment. The shape and position of the individual’s supply curve reflect his expectations about future wage offers that he will receive if he chooses not to accept employment in the current period. The more optimistic the worker’s expectation of future wages, the higher the worker’s reservation wage in the current period. The more certain the worker feels about the expected future wage, the more elastic is his supply curve in the neighborhood of the expected wage. Thus, despite its empirical shortcomings, the search model could serve as a convenient heuristic device for modeling cyclical increases in unemployment because of the unwillingness of workers to accept nominal wage cuts. From a macroeconomic modeling perspective, the incorrect or incomplete representation of the reason for the unwillingness of workers to accept wage cuts may be less important than the overall implication of the model, which is that unanticipated aggregate demand shocks can have significant and persistent effects on real output and employment. For example, in his reformulation of macroeconomic theory, Earl Thompson, though he was certainly aware of Donald Gordon’s paper, relied exclusively on a search-theoretic rationale for Keynesian unemployment, and I don’t know (or can’t remember) if he had a specific objection to Gordon’s model or simply preferred to use the search-theoretic approach for pragmatic modeling reasons.
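For readers who want to see the mechanics, here is a minimal sketch of the textbook McCall-style search model (a standard construction, not Alchian’s or Thompson’s specific formulation; all parameter values are hypothetical). The reservation wage emerges from iterating on the value of continued search, and a better outside option raises it, just as more optimistic wage expectations would.

```python
# Minimal McCall-style search model (a textbook sketch, not a specific
# formulation from the literature). Each period an unemployed worker draws a
# wage offer from a known distribution and either accepts it (earning w every
# period thereafter, discounted by beta) or keeps searching while collecting
# search income c. All parameter values here are hypothetical.

import numpy as np

def reservation_wage(wages, probs, c=25.0, beta=0.99, tol=1e-8, max_iter=10_000):
    """Iterate on h, the value of rejecting today's offer and searching:
    h = c + beta * E[max(w/(1-beta), h)].
    The worker accepts any offer w with w/(1-beta) >= h, so the
    reservation wage is (1-beta)*h."""
    h = c / (1 - beta)  # initial guess
    for _ in range(max_iter):
        h_new = c + beta * np.sum(np.maximum(wages / (1 - beta), h) * probs)
        if abs(h_new - h) < tol:
            break
        h = h_new
    return (1 - beta) * h

wages = np.linspace(10, 60, 51)   # hypothetical grid of wage offers
probs = np.full(51, 1 / 51)       # uniform offer distribution
low = reservation_wage(wages, probs, c=25.0)
high = reservation_wage(wages, probs, c=35.0)  # more generous search income
print(low < high)  # a better outside option raises the reservation wage: True
```

The same comparative static works through expectations: shifting the offer distribution upward plays the role of the worker’s optimism in the paragraph above, raising the reservation wage in the current period.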

At any rate, these comments about the role of search models in modeling unemployment decisions are meant to illustrate why microfoundations could be useful for macroeconomics: by adding to the empirical content of macromodels, providing insight into the decisions or circumstances that lead workers to accept or reject employment in the aftermath of aggregate demand shocks, or into why employers impose layoffs on workers rather than offer employment at reduced wages. The spectrum of such microeconomic theories of the employer-employee relationship has provided us with a richer understanding of what the term “sticky wages” might actually refer to, beyond the existence of minimum wage laws or collective bargaining contracts specifying nominal wages over a period of time for all covered employees.

In this context microfoundations meant providing a more theoretically satisfying, more microeconomically grounded explanation for a phenomenon – “sticky wages” – that seemed somehow crucial for generating the results of the Keynesian model. I don’t think that anyone would question that microfoundations in this narrow sense has been an important and useful area of research. And it is not microfoundations in this sense that is controversial. The sense in which microfoundations is controversial is whether the aggregate quantities generated by a macroeconomic model must be shown to be consistent with the optimizing choices of all agents in the model. In other words, the equilibrium solution of a macroeconomic model must be such that all agents are optimizing intertemporally, subject to whatever informational imperfections are specified by the model. If the model is not derived from or consistent with the solution to such an intertemporal optimization problem, the macromodel is now considered inadequate and unworthy of consideration. Here’s how Michael Woodford, a superb economist, but very much part of the stifling microfoundations consensus that has overtaken macroeconomics, put it in his paper “The Convergence in Macroeconomics: Elements of the New Synthesis.”

But it is now accepted that one should know how to render one’s growth model and one’s business-cycle model consistent with one another in principle, on those occasions when it is necessary to make such connections. Similarly, microeconomic and macroeconomic analysis are no longer considered to involve fundamentally different principles, so that it should be possible to reconcile one’s views about household or firm behavior, or one’s view of the functioning of individual markets, with one’s model of the aggregate economy, when one needs to do so.

In this respect, the methodological stance of the New Classical school and the real business cycle theorists has become the mainstream. But this does not mean that the Keynesian goal of structural modeling of short-run aggregate dynamics has been abandoned. Instead, it is now understood how one can construct and analyze dynamic general-equilibrium models that incorporate a variety of types of adjustment frictions, that allow these models to provide fairly realistic representations of both shorter-run and longer-run responses to economic disturbances. In important respects, such models remain direct descendants of the Keynesian macroeconometric models of the early postwar period, though an important part of their DNA comes from neoclassical growth models as well.

Woodford argues that by incorporating various imperfections into their general equilibrium models, e.g., imperfectly competitive output and labor markets, lags in the adjustment of wages and prices to changes in market conditions, and search and matching frictions, it is possible to reconcile the existence of underutilized resources with intertemporal optimization by agents.

The insistence of monetarists, New Classicals, and early real business cycle theorists on the empirical relevance of models of perfect competitive equilibrium — a source of much controversy in past decades — is not what has now come to be generally accepted. Instead, what is important is having general-equilibrium models in the broad sense of requiring that all equations of the model be derived from mutually consistent foundations, and that the specified behavior of each economic unit make sense given the environment created by the behavior of the others. At one time, Walrasian competitive equilibrium models were the only kind of models with these features that were well understood; but this is no longer the case.

Woodford shows no recognition of the possibility of multiple equilibria, or that the evolution of an economic system and time-series data may be path-dependent, making the long-run neutrality propositions characterizing most DSGE models untenable. If the world – the data-generating mechanism – is not like the world assumed by modern macroeconomics, the estimates derived from econometric models reflecting the worldview of modern macroeconomics will be inferior to estimates derived from an econometric model reflecting another, more accurate, worldview. For example, if there are many possible equilibria depending on changes in expectational parameters or on accidental deviations from an equilibrium time path, the idea of intertemporal optimization may not even be meaningful. Rather than optimize, agents may simply follow certain simple rules of thumb. But, on methodological principle, modern macroeconomics treats the estimates generated by any alternative econometric model insufficiently grounded in the microeconomic principles of intertemporal optimization as illegitimate.

Even worse from the perspective of microfoundations are the implications of something called the Sonnenschein-Mantel-Debreu Theorem, which, as I imperfectly understand it, says something like the following. Even granting the usual assumptions of the standard general equilibrium model (continuous individual demand and supply functions, homogeneity of degree zero in prices, Walras’s Law, and suitable boundary conditions on demand and supply functions), there is no guarantee that there is a unique stable equilibrium for such an economy. Thus, even apart from the dependence of equilibrium on expectations, there is no rationally expected equilibrium because there is no unique equilibrium to serve as an attractor for expectations. Thus, as I have pointed out before, as much as macroeconomics may require microfoundations, microeconomics requires macrofoundations, perhaps even more so.
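The non-uniqueness at the heart of the theorem is easy to visualize. The following is a stylized illustration of my own construction, not an actual SMD economy: an aggregate excess demand function satisfying the usual qualitative restrictions can still cross zero several times, so naive price adjustment started from different initial prices settles at different equilibria, and expectations have no single attractor to coordinate on.

```python
# Stylized illustration of non-unique equilibrium (my own construction, not
# an actual SMD economy). SMD says aggregate excess demand is essentially
# unrestricted beyond continuity, homogeneity, and Walras's Law, so an
# excess demand function with several zeros, like this one, is permitted.

def excess_demand(p):
    # hypothetical excess demand with equilibria at p = 1, 2, and 3
    return -(p - 1.0) * (p - 2.0) * (p - 3.0)

def tatonnement(p0, step=0.01, n=5000):
    """Naive price adjustment: raise p when demand exceeds supply."""
    p = p0
    for _ in range(n):
        p += step * excess_demand(p)
    return p

# Different starting prices converge to different (stable) equilibria,
# while the middle equilibrium at p = 2 is unstable:
print(round(tatonnement(0.5), 3), round(tatonnement(2.5), 3))  # 1.0 3.0
```

With one-fifth of agents expecting each of several such equilibria, as in the discussion below in the comments, there is no obvious sense in which any one of them is the rational expectation.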

Now let us compare the methodological demand for microfoundations for macroeconomics, which I would describe as a kind of macroeconomic methodological reductionism, with the reductionism of Newtonian physics. Newtonian physics reduced the Keplerian laws of planetary motion to more fundamental principles of gravitation governing the motion of all bodies celestial and terrestrial. In so doing, Newtonian physics achieved an astounding increase in explanatory power and empirical scope. What has the methodological reductionism of modern macroeconomics achieved? Reductionism was not the source, but the result, of scientific progress. And as Carlaw and Lipsey demonstrated recently in an important paper, methodological reductionism in macroeconomics has resulted in a clear retrogression in empirical and explanatory power. Thus, methodological reductionism in macroeconomics is an antiscientific exercise in methodological authoritarianism.

14 Responses to “Microfoundations (aka Macroeconomic Reductionism) Redux”


  1. 1 Marcus Nunes October 25, 2013 at 3:23 pm

    David A demanding but great post. I don’t even think it qualifies as a “post”!

  2. 2 Kevin Donoghue (@Paddy_Solemn) October 26, 2013 at 2:47 am

    I broadly agree with this post and enjoyed reading it. But being a cranky git, I’ll home in on the bits that are wrong. (Maybe I should say “in my opinion” but the first one, at least, is surely not a matter of opinion.)

    “…Keynes, habitually and inconsistently, arguing as if labor were the only factor of production while at the same time invoking the principle of diminishing marginal productivity.”

    No, he didn’t. There’s a ton of stuff about the user cost of capital in the General Theory, which wouldn’t be there if labour was the only input. Keynes assumes that the capital stock cannot be altered appreciably in the time-period relevant for his analysis. Diminishing marginal productivity makes sense in that context.

    “Woodford shows no recognition of the possibility of multiple equilibria….”

    Maybe not in that paper, but I’d say he’s very well aware of the problem.

    “For example, if there are many possible equilibria depending on changes in expectational parameters or on the accidental deviations from an equilibrium time path, the idea of intertemporal optimization may not even be meaningful. Rather than optimize, agents may simply follow certain simple rules of thumb.”

    This I do agree with, but Keynes managed to reach that conclusion long before anyone was talking about microfoundations. There’s plenty along those lines not only in the GT but also in the Treatise on Probability.

    “Thus, even apart from the dependence of equilibrium on expectations, there is no rationally expected equilibrium because there is no unique equilibrium to serve as an attractor for expectations.”

    I’m pretty sure that defenders of RE would reject the notion that non-uniqueness implies non-existence. I agree with what I think you’re getting at, but if you don’t phrase it right you give the RE brigade an excuse to ignore you. (However Frank Hahn did phrase it right, decades ago, and they ignored him anyway.)

    These quibbles apart, thanks for an interesting post and links which I’ll check out when I’m in the library (or sooner if I find freebie versions online).

  3. 3 Unlearningecon October 26, 2013 at 9:08 am

    Great post, David, but a couple of things near the beginning caught my eye. One is not a criticism of you but just seemed odd to me:

    “because macroeconomics, which then mainly consisted of various versions of the Keynesian model, seemed to embody certain presumptions about how markets work that contradicted the presumptions of microeconomics about how markets work.”

    This just shows how distorted the thought process of economics can be compared to other “sciences”. The idea that results at a higher level of emergence are different, even in contradiction, with those at lower levels, comes as no surprise in many complex systems.

    Then there’s this, which I think you get partially wrong:

    “The microeconomic reasoning on which that argument was based hinged on Keynes’s assumption that nominal wage cuts would trigger proportionate price cuts, but that assumption was not exactly convincing”

    Keynes argued that there were a few possibilities. There was the above, but there was also the point that a decrease in real wages would simply reduce demand.

    Obviously, I agree about microfoundations. Actually, I’ve just been having a debate on twitter in which I realised that microfoundations aren’t just a problem in macroeconomics, but also in microeconomics! The apparent necessity of building up consumer and producer theory from axioms forces students through a tedious process with no relation to the real world, and actually says very little about the theoretical implications of e.g. utility.

  4. 4 sumnerbentley October 26, 2013 at 12:03 pm

    Excellent post. I’d add that the reasons for wage and price stickiness are probably exceedingly complex, and in most cases not worth trying to model at the micro level. Even so, the sticky wage assumption is definitely worth using. I’d say the same about ratex. It’s hard to create a model with a unique ratex solution, but not impossible. In any case, it probably makes sense to assume ratex (at least for policy purposes) even if the model isn’t able to rule out alternative equilibria. McCallum has some good stuff on this problem.

  5. 5 David Glasner October 26, 2013 at 8:12 pm

    Marcus, Thanks so much. I spent a week trying to write it. Sorry for the late appearance of your comment; the spam filter almost ate it, but somehow I found it and rescued it from oblivion.

    Kevin, My point was that he argued that if money wages were reduced by 10%, producers would simply reduce output prices by 10%, so that real wages would not change. That argument only works if labor is the only input. If there are inputs other than labor, then a 10% reduction in wages will not necessarily lead to a 10% reduction in output prices. Why would a firm with a highly capital-intensive production process reduce its output price by 10% just because its trivial labor cost went down by 10%? I agree that Keynes had a lot to say about capital – and much of what he said about it was absolutely splendid – but he sometimes resorted to simplifications that don’t hold up. But you were probably upset because I went a little overboard in accusing him of habitually and inconsistently arguing as if labor were the only factor of production. I probably went too far in that assertion, but he did so from time to time.

    I am sure that Woodford is fully aware of the possibility of multiple equilibria. I only meant to criticize his characterization of the consensus in macroeconomics.

    About Keynes and expectations and rules of thumb, I totally agree with you.

    About non-uniqueness of equilibrium, I agree that non-uniqueness does not imply non-existence, but if there are many possible equilibria, how can it be said that there is a tendency toward any single one of the potential equilibria? Suppose that there are 5 potential equilibria and one-fifth of the agents expect each of the 5; it is not clear to me which, if any, of the 5 equilibria would be realized.

    Unlearningecon, I am not so sure that the idea that results at a higher level of emergence are different from those at a lower does not come as a surprise in complex systems. I think scientists and philosophers of science have been struggling with how to deal with reductionist ideas for quite some time. I agree, however, that at this late date, economists are definitely way behind the curve.

    Sorry, I don’t remember where Keynes said that a decrease in real wages would simply reduce demand.

    Gustav Cassel (I think in his book The Theory of Social Economy) actually argued against utility theory and instead argued that economists should simply posit that demand curves were downward sloping.

    Scott, I think that you’ve directed me to McCallum in the past on this, but I have never gotten around to reading him. I agree with your assessment of him as one of the best economic theorists of our time. If you could provide a few specific references, I would really appreciate it.

  6. 7 David Glasner October 30, 2013 at 7:27 pm

    Scott, Yes, I hope so. Thanks so much.

  7. 8 Blue Aurora November 9, 2013 at 11:57 pm

    Although I’m obviously late to this discussion – does the term “microfoundations” in this context mean behaviour at the individual level, or the supply side?

    Although The General Theory isn’t the main subject of this discussion, the following articles might call into question whether Keynes’s magnum opus lacked a supply side or microeconomic foundations.

    http://www.hetsa.org.au/pdf-back/21-A-4.pdf

    http://www.hetsa.org.au/pdf-back/24-A-4.pdf

    http://www.hetsa.org.au/pdf-back/25-A-13.pdf

  8. 9 David Glasner November 16, 2013 at 5:59 pm

    Blue Aurora, Thanks for the links. Microfoundations can mean any number of things. I think at the simplest level, microfoundations refers to some kind of rationale for why markets fail to clear in the context of involuntary unemployment.


  1. 1 The Microfoundations Wars Continue | Uneasy Money Trackback on January 2, 2014 at 7:54 pm
  2. 2 Methodological Arrogance | Uneasy Money Trackback on February 26, 2014 at 8:35 pm
  3. 3 interfluidity » Inequality and market allocation Trackback on May 12, 2014 at 7:51 pm
  4. 4 interfluidity » Should markets clear? Trackback on May 14, 2014 at 12:53 am
  5. 5 Explaining the Hegemony of New Classical Economics | Uneasy Money Trackback on September 30, 2014 at 9:24 pm





About Me

David Glasner
Washington, DC

I am an economist at the Federal Trade Commission. Nothing that you read on this blog necessarily reflects the views of the FTC or the individual commissioners. Although I work at the FTC as an antitrust economist, most of my research and writing has been on monetary economics and policy and the history of monetary theory. In my book Free Banking and Monetary Reform, I argued for a non-Monetarist non-Keynesian approach to monetary policy, based on a theory of a competitive supply of money. Over the years, I have become increasingly impressed by the similarities between my approach and that of R. G. Hawtrey and hope to bring Hawtrey's unduly neglected contributions to the attention of a wider audience.
