Two Cheers (Well, Maybe Only One and a Half) for Falsificationism

Noah Smith recently wrote a defense (sort of) of falsificationism in response to Sean Carroll’s suggestion that the time has come for scientists to throw falsificationism overboard as a guide for scientific practice. While Noah isn’t ready to abandon falsification as a scientific ideal, he does acknowledge that not everything that scientists do is really falsifiable.

But, as Carroll himself seems to understand in arguing against falsificationism, even though a particular concept or entity may itself be unobservable (and thus unfalsifiable), the larger theory of which it is a part may still have implications that are falsifiable. This is the case in economics. A utility function or a preference ordering is not observable, but by imposing certain conditions on that utility function, one can derive some (weakly) testable implications. This is exactly what Karl Popper, who introduced and popularized the idea of falsificationism, meant when he said that the aim of science is to explain the known by the unknown. To posit an unobservable utility function or an unobservable string is not necessarily to engage in purely metaphysical speculation, but to do exactly what scientists have always done, to propose explanations that would somehow account for some problematic phenomenon that they had already observed. The explanations always (or at least frequently) involve positing something unobservable (e.g., gravitation) whose existence can only be indirectly perceived by comparing the implications (predictions) inferred from the existence of the unobservable entity with what we can actually observe. Here’s how Popper once put it:

Science is valued for its liberalizing influence as one of the greatest of the forces that make for human freedom.

According to the view of science which I am trying to defend here, this is due to the fact that scientists have dared (since Thales, Democritus, Plato’s Timaeus, and Aristarchus) to create myths, or conjectures, or theories, which are in striking contrast to the everyday world of common experience, yet able to explain some aspects of this world of common experience. Galileo pays homage to Aristarchus and Copernicus precisely because they dared to go beyond this known world of our senses: “I cannot,” he writes, “express strongly enough my unbounded admiration for the greatness of mind of these men who conceived [the heliocentric system] and held it to be true […], in violent opposition to the evidence of their own senses.” This is Galileo’s testimony to the liberalizing force of science. Such theories would be important even if they were no more than exercises for our imagination. But they are more than this, as can be seen from the fact that we submit them to severe tests by trying to deduce from them some of the regularities of the known world of common experience by trying to explain these regularities. And these attempts to explain the known by the unknown (as I have described them elsewhere) have immeasurably extended the realm of the known. They have added to the facts of our everyday world the invisible air, the antipodes, the circulation of the blood, the worlds of the telescope and the microscope, of electricity, and of tracer atoms showing us in detail the movements of matter within living bodies.  All these things are far from being mere instruments: they are witness to the intellectual conquest of our world by our minds.

So I think that Sean Carroll, rather than arguing against falsificationism, is really thinking of falsificationism in the broader terms that Popper himself laid out a long time ago. And I think that Noah’s shrug-ability suggestion is also, with appropriate adjustments for changes in expository style, entirely in the spirit of Popper’s view of falsificationism. But to make that point clear, one needs to understand what motivated Popper to propose falsifiability as a criterion for distinguishing between science and non-science. Popper’s aim was to overturn logical positivism, a philosophical doctrine associated with the group of eminent philosophers who made up what was known as the Vienna Circle in the 1920s and 1930s. Building on the British empiricist tradition in science and philosophy, the logical positivists argued that our knowledge of the external world is based on sensory experience, and that apart from the tautological truths of pure logic (of which mathematics is a part) there is no other knowledge. Furthermore, no meaning could be attached to any statement whose validity could not be checked, either by examining its logical validity as an inference from explicit premises or by verifying it against sensory experience. According to this criterion, much of human discourse about ethics, morals, aesthetics, religion and much of philosophy was simply meaningless, aka metaphysics.

Popper, who grew up in Vienna and was on the periphery of the Vienna Circle, rejected the idea that logical tautologies and statements potentially verifiable by observation are the only conveyors of meaning between human beings. Metaphysical statements can be meaningful even if they can’t be confirmed by observation. Metaphysical statements are meaningful if they are coherent and are not nonsensical. If there is a problem with metaphysical statements, the problem is not necessarily that they have no meaning. In making this argument, Popper suggested an alternative to the demarcation between meaning and non-meaning: a criterion of demarcation between science and metaphysics. Science is indeed different from metaphysics, but the difference is not that science is meaningful and metaphysics is not. The difference is that scientific statements can be refuted (or falsified) by observations while metaphysical statements cannot be refuted by observations. As a matter of logic, the only way to refute a proposition by an observation is for the proposition to assert that the observation was not possible. Unless you can say what observation would refute what you are saying, you are engaging in metaphysical, not scientific, talk. This led to Popper’s then very surprising result. If you positively assert the existence of something — an assertion potentially verifiable by observation, and hence for logical positivists the quintessential scientific statement — you are making a metaphysical, not a scientific, statement. The statement that something (e.g., God, a string, or a utility function) exists cannot be refuted by any observation. However, the unobservable phenomenon may be part of a theory with implications that could be refuted by some observation. But in that case it would be the theory, not the posited object, that was refuted.

In fact, Popper thought that metaphysical statements not only could be meaningful, but could even be extremely useful, coining the term “metaphysical research programs,” because a metaphysical, unfalsifiable idea or theory could be the impetus for further research, possibly becoming scientifically fruitful in the way that evolutionary biology eventually sprang from the possibly unfalsifiable idea of survival of the fittest. That sounds to me pretty much like Noah’s idea of shrug-ability.

Popper was largely successful in overthrowing logical positivism, though whether it was entirely his doing (as he liked to claim) and whether it was fully overthrown are not so clear. One reason to think that it was not all his doing is that there is still a lot of confusion about what the falsification criterion actually means. Reading Noah Smith and Sean Carroll, I almost get the impression that they think the falsification criterion distinguishes not just between science and non-science but between meaning and non-meaning. Otherwise, why would anyone think that there is any problem with introducing an unfalsifiable concept into scientific discussion? When Popper argued that science should aim at proposing and testing falsifiable theories, he meant that one should not design a theory so that it can’t be tested, or adopt stratagems — ad hoc hypotheses — that serve only to account for otherwise falsifying observations. But if someone comes up with a creative new idea, and the idea can’t be tested, at least given the current observational technology, that is not a reason to reject the theory, especially if the new theory accounts for otherwise unexplained observations.

Another manifestation of Popper’s imperfect success in overthrowing logical positivism is that Paul Samuelson, in his classic The Foundations of Economic Analysis, chose to call the falsifiable implications of economic theory “meaningful theorems.” By naming those implications “meaningful theorems,” Samuelson clearly was operating under the positivist presumption that only a proposition that could (at least in principle) be falsified by observation was meaningful. However, that formulation reflected an untenable compromise between Popper’s criterion for distinguishing science from metaphysics and the logical positivist criterion for distinguishing meaningful from meaningless statements. Instead of referring to meaningful theorems, Samuelson should have called them, more modestly, testable or scientific theorems.

So, at least as I read Popper, Noah Smith and Sean Carroll are only discovering what Popper already understood a long time ago.

At this point, some readers may be wondering why, having said all that, I seem to have trouble giving falsificationism (and Popper) even two cheers. So I am afraid that I will have to close this post on a somewhat critical note. The problem with Popper is that his rhetoric suggests that scientific methodology is a lot more important than it really is. Apart from some egregious examples like Marxism and Freudianism, which were deliberately formulated to exclude the possibility of refutation, there really aren’t that many theories entertained by scientists that can be ruled out of order on strictly methodological grounds. Popper can occasionally provide some methodological reminders to scientists to avoid relying on ad hoc theorizing — at least when a non-ad-hoc alternative is handy — but beyond that I don’t think methodology counts for very much in the day-to-day work of scientists. Many theories are difficult to falsify, but the difficulty is not necessarily the result of deliberate choices by the theorists; it is the result of the nature of the problem and the nature of the evidence that could potentially refute the theory. The evidence is what it is. It is nice to come up with a theory that predicts a novel fact that can be observed, but nature is not always so accommodating to our theories.

There is a kind of rationalistic (I am using “rationalistic” in the pejorative sense of Michael Oakeshott) faith that following the methodological rules that Popper worked so hard to formulate will guarantee scientific progress. Those rules tend to encourage an unrealistic focus on making theories testable (especially in economics) when by their nature the phenomena are too complex for theories to be formulated in ways that are susceptible to decisive testing. And although Popper recognized that empirical testing of a theory has very limited usefulness unless the theory is being compared to some alternative theory, too often discussions of theory testing are in the context of testing a single theory in isolation. Kuhn and others have pointed out that science is not routinely carried out in the way that Popper suggested it should be. To some extent, Popper acknowledged the truth of that observation, though he liked to cite examples from the history of science to illustrate his thesis; his response was that he was offering a normative, not a positive, theory of scientific discovery. But why should we assume that Popper had more insight into the process of discovery for particular sciences than the practitioners of those sciences actually doing the research? That is the nub of the criticism of Popper that I take away from Oakeshott’s work. Life, like any form of endeavor, involves the transmission of ways of doing things, traditions, that cannot be reduced to a set of rules, but require education, training, practice and experience. That’s what Kuhn called normal science. Normal science can go off the tracks too, but it is naïve to think that a list of methodological rules is what will keep science moving constantly in the right direction. Why should Popper’s rules necessarily trump the lessons that practitioners have absorbed from the scientific traditions in which they have been trained? I don’t believe that there is any surefire recipe for scientific progress.

Nevertheless, when I look at the way economics is now being practiced and taught, I can’t help but think that a dose of Popperianism might not be the worst thing that could be administered to modern economics. But that’s a discussion for another day.


16 Responses to “Two Cheers (Well, Maybe Only One and a Half) for Falsificationism”

  1. W. Peden January 26, 2014 at 11:29 pm

    I agree. A very good post, David.

    I also can give only qualified support to what Popper said. First, I have to note that most famous philosophers of science have a standard interpretation that is pretty reasonable and perhaps even a little banal (including philosophers as radical as Kuhn, Lakatos, Feyerabend and van Fraassen), which is how they penetrate into popular consciousness, if at all; their irrationalism doesn’t get noted. So, for example, people quite happily use Kuhn’s words regarding paradigms without embracing the doctrines of incommensurability and relativism that one finds when one actually reads Kuhn.

    Second, this is true of Popper as well as the above philosophers of science. Under Popper’s methodology (and he was quite explicit about this) it is NEVER rational to believe a scientific theory. Even more bizarrely, it is no LESS irrational to believe a scientific theory that has been confirmed by a billion successful experiments than it is to believe a theory that has never been tested. This is because Popper rejects the concept of positive evidence entirely, and his concept of “corroboration” is vague enough to mean many things, but “confirmation” isn’t one of them. In fact, Popper believed that the probability of ANY scientific law, given the evidence, was equal to the probability of a contradiction i.e. 0. This is why I must disagree with the description of Popper as providing a “theory of scientific discovery”. Though he called one of his books “The Logic of Scientific Discovery”, Popper’s methodology is a theory of scientific TESTING, and the reason for this is simple: no-one discovers anything in Popper’s methodology, except the inconsistency of some propositions with other propositions.

    You correctly note that scientists don’t follow Popper’s methodology. I actually think that it is quite reasonable for philosophers to put forward philosophies of science that aren’t descriptions of science, just as it is reasonable to put forward moral philosophies that aren’t descriptions of moral practice. If you look at incidents of grand controversy in the history of science (Darwinism, Copernicanism, interpretations of quantum phenomena etc.) scientists make methodological pronouncements that are philosophical in nature, and often very wrong. So, for example, Darwin was condemned by many scientists for not following a methodology of Baconian induction that doesn’t make sense at a logical level. It was the development (by philosophers) of the hypothetico-deductive model of confirmation and the discrediting (by philosophers) of the “logic of discovery” that provided a solid rational justification of what Darwin was doing.

    However, this line of defence doesn’t work well for Popper in particular, because he used appeals to practice as attempts to extricate himself from the contradictions of his theory. For example, Popper believed that (1) only theories that are logically inconsistent with some basic existential statements regarding observable things (which is the technical sense of ‘falsifiable’ in Popper’s work) are scientific; he also believed (correctly) that (2) scientists put forward scientific theories that are probabilistic in nature, e.g. “The probability that a newborn baby will be male is approximately 1/2”; and he was smart enough to know that (3) probabilistic theories are unfalsifiable, because even a birthrate of 1/5 UP UNTIL NOW is consistent with the preceding probabilistic example.

    (1), (2) and (3) form an inconsistent trilemma. One proposition has to be dropped in order to be logically consistent. Popper should have dropped (1), of course, but this was 1934 and it was his first book, and thus a little early to abandon his methodology, so he defended himself on the basis that scientists adopt a methodological rule to ignore very low probabilities. This is true (at least among good scientists) and though it is totally irrelevant to his problem (because the propositions are still inconsistent) it is a sign of how Popper was quite willing to appeal to the practice of scientists when it suited him.

    And practice REALLY doesn’t support falsificationism. Take Newtonian physics, surely the classic paradigmatic case of a scientific theory, and the most encouraging case for falsificationism. The falsificationists never did come up with a persuasive explanation of how Darwinian evolution is scientific under their methodology, but they’re confident on Newtonian physics. After all, Newtonian physics is false, so it must be falsifiable, right? Nope. As I said above, ‘falsifiable’ has a technical meaning in the philosophy of science. It doesn’t just mean “made less probable by the evidence”. That’s how ‘disconfirmed’ is generally used. ‘Falsifiable’ means inconsistent with statements like “There is a white raven here”. And there is nothing, at the observable level, that is inconsistent with Newtonian physics, and when Lakatos challenged Popper on this point, the latter was utterly unable to provide a single example of an observation that would falsify Newtonian physics. In fact, beyond the simplified cases that we philosophers of science like to use sometimes for convenience (like “All ravens are black”) it is very hard to find falsifiable theories in actual science.

    (The above points are largely drawn, with some elaboration, from “Popper and After”, which is a very good book by D. C. Stove.)

    It is true that the popular version of Popper has a lot going for him. Rigorously testing theories is an important part of good science, and a theory must have some possibility of being disconfirmed by evidence if it is to be scientific (though this is a necessary but not sufficient requirement, because “There is a mouse in the house” is also disconfirmable, but not scientific). However, you don’t need falsificationism to tell you this. Basic confirmation theory implies that if a theory is to be probable given the evidence, it must be less probable given some alternative evidence; a theory that predicts anything isn’t more probable given the evidence. So if scientists seek to develop theories that are probable in relation to the data, then they will also be led (as if by an invisible logical hand!) to develop theories that are disconfirmable, because theories that are the former are also the latter.

    To end on a positive note: on meaningfulness and science, I think that Popper got it spot on. It is true that a meaningless proposition cannot be scientific (just as a meaningless proposition cannot be a proposition of ethics) but that does not entail that only scientific propositions are meaningful, and the pseudo-scientific/scientific distinction is quite separate from the meaningful/meaningless distinction. And a few minutes with a neuroscientist should eliminate any thoughts that metaphysics isn’t a part of science: in general, I find that in less than 5 minutes a thoughtful neuroscientist will have both said that the mind IS the brain and that the brain CAUSES the mind.

    And I think that Popper’s legacy has been mostly positive, primarily because he was so misunderstood. A scientist who worries about whether or not she can craft experiments (natural or laboratory) that disconfirm her theory is a better scientist for this worry. Also, some of his criticisms of historicism are brilliant, and have great importance in the philosophy of social science.


  2. Herman January 27, 2014 at 2:02 am

    Nice post, despite, dare I say it, a certain prolixity.

    “Nevertheless, when I look at the way economics is now being practiced and taught, I can’t help but think that a dose of Popperianism might not be the worst thing that could be administered to modern economics”

    Couldn’t agree more. On this see Mark Blaug on economic methodology.

    Onto a few minor disagreements/differences in emphasis:

    “Apart from some egregious examples like Marxism and Freudianism, which were deliberately formulated to exclude the possibility of refutation, there really aren’t that many theories entertained by scientists that can be ruled out of order on strictly methodological grounds.”

    General Equilibrium Theory?
    The EMH (at least as interpreted by some)? Any change in asset prices can be decomposed into news affecting the stream of future returns and/or the stochastic discount factor (SDF), and nothing else. This, of course, turns the SDF into a meaningless catch-all that has nothing to do with risk, except in a purely formal, tautological sense. And the EMH becomes unfalsifiable. Of course, the EMH can be interpreted as a falsifiable hypothesis, but then surely it has been falsified.

    Kuhn and ‘normal science’:
    While I agree with your point that there is no surefire recipe for scientific progress (growth of knowledge), I would be wary of appeals to Kuhn.

    Kuhn is, of course, not about epistemology or the scientific method. Kuhn is about the history and sociology of the — mostly academic — scientific establishment. Too often one gets the impression that Kuhn — or at any rate, some of his interpreters — ends up mistaking social impediments to the growth of knowledge for defining characteristics of science. Putting it bluntly, and perhaps a little unfairly to Kuhn: Kuhn saw groupthink, and imagined paradigms.

    I like to quote this from Feyerabend:
    ” More than one social scientist has pointed out to me that now at last he has learned how to turn his field into a ‘science’ – by which of course he meant that he had learned how to improve it. The recipe, according to these people, is to restrict criticism, to reduce the number of comprehensive theories to one, and to create a normal science that has this one theory as its paradigm. Students must be prevented from speculating along different lines and the more restless colleagues must be made to conform and ‘to do serious work’. Is this what Kuhn wants to achieve?”

    [from Lakatos & Musgrave, Criticism and the Growth of Knowledge]

    Paul Feyerabend was probably not talking about economics, but the quote does eerily reflect the situation in many branches of academic economics.


  3. Pablo Paniagua Prieto January 27, 2014 at 2:21 am

    Great post, and it has profound implications for the applications (and limitations) of falsificationism in economic theory. It seems to me that your reading of Popper, with falsificationism as a normative path of scientific discovery rather than a positive guide for economics (given the nature of the problem and the degree of complexity and subjectivity in the economic order), is directly linked to Hayek’s reading of Popper, when he wrote about the degrees of falsification in his papers “Degrees of Explanation,” “The Theory of Complex Phenomena,” and his paper on the facts of social science. Hayek saw that falsificationism cannot be the only guide for scientific discovery in economics, because its degrees of complexity make falsification in many cases an impossibility, and that focusing on falsificationism as the only way to guarantee scientific progress and meaningful statements in economics is “rationalistic” in the pejorative sense of Oakeshott.


  4. David Glasner January 27, 2014 at 9:55 am

    Noah, Thanks.

    W. Peden, Thanks for the compliment and for the illuminating discussion of the details of Popper’s philosophy. I think I recall reading an essay in Encounter many years ago (when I was a lot more in sympathy with Popper than I am now) in which Stove bashed Popper pretty severely, but I have never seen his book. At any rate, I am not up to debating you on Popper. But I will point out that the notion that we have no rational basis for believing in a scientific theory was advanced long before Popper by David Hume, who nevertheless went on to write an essay in which he dismissed the idea of miracles as being inconsistent with observed evidence. Go figure.

    Herman, Thanks, and sorry if I taxed your patience. About Blaug, for whom I had the utmost respect, I will say that I thought that he went a bit overboard in his Popperianism.

    You raise a fair point about GE theory and EMH. But I think that they are obviously refuted in the strict sense. The question is whether they provide any reasonable insight into how markets work in the real world. But I think that they are different from Marxianism and Freudianism which were deliberately shielded from refutation. One might also think about GE and EMH as metaphysical theories or systems for organizing our thinking about certain problems without necessarily generating specific predictions.

    That’s a great quote from Feyerabend. Thanks.

    Pablo, Thanks. I agree that Hayek’s essays, especially “The Theory of Complex Phenomena,” are very important in themselves and as correctives to Popper.


  5. W. Peden January 27, 2014 at 10:09 am

    David Glasner,

    Oddly enough, I am currently working on just that thesis of Hume’s! Like a lot of famous arguments in philosophy, it tends to morph from solid rock into mercury the moment one tries to mount counterarguments, but nonetheless it’s a topic that I find utterly addictive.


  6. PeterE January 27, 2014 at 11:22 am

    Wittgenstein was once asked for his views on the verification principle. He told this story: Imagine a village policeman who asks each villager what his job is, and writes each answer down in his notebook. One of the villagers tells him that he has no job. The policeman writes that down too, because it is useful information.


  7. Blue Aurora January 28, 2014 at 3:20 am

    Interesting post, David Glasner. I believe that this paper would be relevant to your interests and Noah Smith’s interests…


  8. Mike Sax January 28, 2014 at 9:26 am

    Great post David. You really gave me a different perspective on Popper. I never realized that he actually debunked logical positivism, but it makes sense. Whether or not he did it entirely by himself, I can buy that he had a part to play in it.

    I think maybe Noam Chomsky, in a very different field (linguistics and the philosophy of language), had a hand in it, in the sense that in Chomsky’s theory of language acquisition the child is much more active than in crude Lockean empiricism, where a child learns to speak simply by hearing and aping what he hears. Chomsky argues that every time a child, or anyone else, speaks, they’re saying something brand new that couldn’t have been imagined beforehand.

    All of this kind of jibes with the Platonistic idea that in learning we don’t so much learn something new as recall what we already knew. For instance, you have the idea of the uneducated slave boy whom Socrates, through questioning, shows already knows geometry.

    Then, though, I read W. Peden. I’ve got to say, W., you’ve missed your calling: you should do a philosophy of science blog. You have some pretty negative readings of Popper that are new to me but make some sense.

    The idea that practice doesn’t support falsification is a very thought provoking idea.

    At the minimum I agree that not everything that is important in knowledge is falsifiable, far from it. It always seemed that the positivist school went way too far: if you followed them, not only was all metaphysics mere pseudo puffery, but even things like literature and most of philosophy would be thrown out the window.


  9. Tom Brown January 29, 2014 at 1:23 pm

    David, I’ve brought this up before I think, but let me make this brief (only tangentially related to your post here).

    For the sake of argument, assume we can accurately simulate humans interacting in a simulated environment, and that their behavior is in every way indistinguishable from that of real humans (i.e., we can build “The Matrix”).

    If you indulge my reductionist fantasy for the moment, and ethical considerations aside, could such a simulation be valuable in the study of economics, and macro in particular? Why or why not? If “yes,” how so? What would be the biggest difficulty in applying the results of experiments run in this simulation to the real world? (Again, I’m asking you to assume my reductionist fantasy is possible.) I’m not saying the simulation has to mimic the real world in all its details: it can just serve as a representative nation with a representative economy, one in which the experimenter can play God and change monetary systems, introduce economic shocks of various kinds, etc., re-running any cases of interest like any other simulation.


  10. W. Peden February 3, 2014 at 3:50 am

    Mike Sax,

    Actually, I’m doing a PhD in philosophy of science (and the philosophy of economics in particular). So I haven’t missed my calling; I’m in the rare and extremely lucky position in life where my calling is my profession. In general, I can’t imagine any life I’d find more fulfilling.


  11. David Glasner February 4, 2014 at 9:05 am

    W. Peden, I would love to see what you come up with.

    PeterE, Clever fellow, that Wittgenstein.

    Blue Aurora, Thanks for yet another interesting reference. Sorry that I just can’t keep up with you.

    Mike, Thanks. But doesn’t Chomsky’s view contradict the Platonistic view?

    Tom, I think every economic model is an attempt to do what you are suggesting, except that economic models involve radically simplified versions of human decision making. If we could create more realistic model human beings to populate our models, that would certainly help us create better economic models.


  12. Tom Brown February 4, 2014 at 4:37 pm

    David, thanks. I agree. But sometimes I think that even a perfect laboratory (playing God) would only be of limited usefulness.

    I put the same question to Sumner, and he had an interesting reply:

    “Tom, I honestly don’t know. It seems to me that the complexity of the human brain isn’t the real problem. It doesn’t explain why we disagree about the minimum wage, for instance. We disagree because we can’t decide whether labor markets are competitive or monopsonistic.”

    The first part doesn’t surprise me… but I’m having trouble understanding how his minimum-wage example applies. Do you get it?


  13. Mike sax February 4, 2014 at 8:15 pm

    David, Chomsky has suggested he agrees with the Platonic idea that learning is a remembering. That’s how he sees language acquisition. Again, as I said above, he’s explicitly referred to the episode in which Socrates was able to show that a slave boy with no education nevertheless understood geometry.

    Maybe this is hard to believe given Chomsky’s radical political views, but it’s nevertheless the case. As a matter of fact, Chomsky coined the phrase ‘Plato’s Problem’. Again, I think this may be surprising, as it might seem that Platonists are all conservatives.

    “Plato’s problem is the term given by Noam Chomsky to the gap between knowledge and experience. It presents the question of how we account for our knowledge when environmental conditions seem to be an insufficient source of information. It is used in linguistics to refer to the “argument from poverty of the stimulus” (APS). In a more general sense, Plato’s Problem refers to the problem of explaining a “lack of input”. Solving Plato’s Problem involves explaining the gap between what one knows and the apparent lack of substantive input from experience (the environment). Plato’s Problem is most clearly illustrated in the Meno dialogue, in which Socrates demonstrates that an uneducated boy nevertheless understands geometric principles.”


    So he’s as far as you’ll get from the vulgar Lockean position of pure empiricism. This doesn’t have much bearing on Chomsky’s political position; you don’t have to agree with him there (he claims to be a ‘libertarian socialist’) to follow him on linguistics or the philosophy of knowledge.


  14. Mike sax February 4, 2014 at 8:17 pm

    Of course, I say that, but it’s always interesting to see whether you can find a link between someone’s ‘scientific’ views and their ‘political’ views.


  15. Mike sax February 4, 2014 at 8:19 pm

    W. Peden, clearly you are the luckiest of men. While I’ve become fascinated by economics over the last two years or so, philosophy is my first love as well.



About Me

David Glasner
Washington, DC

I am an economist in the Washington DC area. My research and writing has been mostly on monetary economics and policy and the history of economics. In my book Free Banking and Monetary Reform, I argued for a non-Monetarist non-Keynesian approach to monetary policy, based on a theory of a competitive supply of money. Over the years, I have become increasingly impressed by the similarities between my approach and that of R. G. Hawtrey and hope to bring Hawtrey’s unduly neglected contributions to the attention of a wider audience.

My new book Studies in the History of Monetary Theory: Controversies and Clarifications has been published by Palgrave Macmillan.

Follow me on Twitter @david_glasner

