Michael Oakeshott Exposes Originalism’s Puerile Rationalistic Pretension to Jurisprudential Profundity

Last week, in my post about Popperian Falsificationism, I quoted at length from Michael Oakeshott’s essay “Rationalism in Politics.” Rereading the essay reminded me that Oakeshott’s work also casts an unflattering light on the faux-conservative jurisprudence of Originalism, of which right-wing pretend-populists masquerading as conservatives have become so enamored under the expert tutelage of their idol, Justice Scalia.

The faux-conservative nature of Originalism was nowhere made so obvious as in Scalia’s own Tanner Lectures at the University of Utah College of Law, “Common-Law Courts in a Civil-Law System,” in which Scalia made plain his utter contempt for the common-law jurisprudence upon which the American legal system is founded. Here is that contempt on display in his mocking description of how law is taught in American law schools.

It is difficult to convey to someone who has not attended law school the enormous impact of the first year of study. Many students remark upon the phenomenon: It is like a mental rebirth, the acquisition of what seems like a whole new mode of perceiving and thinking. Thereafter, even if one does not yet know much law, he – as the expression goes – “thinks like a lawyer.”

The overwhelming majority of the courses taught in that first year of law school, and surely the ones that have the most impact, are courses that teach the substance, and the methodology, of the common law – torts, for example; contracts; property; criminal law. We lawyers cut our teeth upon the common law. To understand what an effect that must have, you must appreciate that the common law is not really common law, except insofar as judges can be regarded as common. That is to say, it is not “customary law,” or a reflection of the people’s practices, but is rather law developed by the judges. Perhaps in the very infancy of the common law it could have been thought that the courts were mere expositors of generally accepted social practices; and certainly, even in the full maturity of the common law, a well established commercial or social practice could form the basis for a court’s decision. But from an early time – as early as the Year Books, which record English judicial decisions from the end of the thirteenth century to the beginning of the sixteenth – any equivalence between custom and common law had ceased to exist, except in the sense that the doctrine of stare decisis rendered prior judicial decisions “custom.” The issues coming before the courts involved, more and more, refined questions that customary practice gave no answer to.

Oliver Wendell Holmes’s influential book The Common Law – which is still suggested reading for entering law students – talks a little bit about Germanic and early English custom. . . . Holmes’s book is a paean to reason, and to the men who brought that faculty to bear in order to create Anglo-American law. This is the image of the law – the common law – to which an aspiring lawyer is first exposed, even if he hasn’t read Holmes over the previous summer as he was supposed to. (pp. 79-80)

What intellectual fun all of this is! I describe it to you, not – please believe me – to induce those of you in the audience who are not yet lawyers to go to law school. But rather, to explain why first-year law school is so exhilarating: because it consists of playing common-law judge. Which in turn consists of playing king – devising, out of the brilliance of one’s own mind, those laws that ought to govern mankind. What a thrill! And no wonder so many lawyers, having tasted this heady brew, aspire to be judges!

Besides learning how to think about, and devise, the “best” legal rule, there is another skill imparted in the first year of law school that is essential to the making of a good common-law judge. It is the technique of what is called “distinguishing” cases. It is a necessary skill, because an absolute prerequisite to common-law lawmaking is the doctrine of stare decisis – that is, the principle that a decision made in one case will be followed in the next. Quite obviously, without such a principle common-law courts would not be making any “law”; they would just be resolving the particular dispute before them. It is the requirement that future courts adhere to the principle underlying a judicial decision which causes that decision to be a legal rule. (There is no such requirement in the civil-law system, where it is the text of the law rather than any prior judicial interpretation of that text which is authoritative. Prior judicial opinions are consulted for their persuasive effect, much as academic commentary would be; but they are not binding.)

Within such a precedent-bound common-law system, it is obviously critical for the lawyer, or the judge, to establish whether the case at hand falls within a principle that has already been decided. Hence the technique – or the art, or the game – of “distinguishing” earlier cases. A whole series of lectures could be devoted to this subject, and I do not want to get into it too deeply here. Suffice to say that there is a good deal of wiggle-room as to what an earlier case “holds.” In the strictest sense, the holding of a decision cannot go beyond the facts that were before the court. . . .

As I have described, this system of making law by judicial opinion, and making law by distinguishing earlier cases, is what every American law student, what every newborn American lawyer, first sees when he opens his eyes. And the impression remains with him for life. His image of the great judge — the Holmes, the Cardozo — is the man (or woman) who has the intelligence to know what is the best rule of law to govern the case at hand, and then the skill to perform the broken-field running through earlier cases that leaves him free to impose that rule — distinguishing one prior case on his left, straight-arming another one on his right, high-stepping away from another precedent about to tackle him from the rear, until (bravo!) he reaches his goal: good law. That image of the great judge remains with the former law student when he himself becomes a judge, and thus the common-law tradition is passed on and on. (pp. 83-85)

In place of common law judging, Scalia argues that the judicial function should be confined to the parsing of statutory or Constitutional texts to find their meaning, contrasting that limited undertaking to the anything-goes practice of common-law judging.

[T]he subject of statutory interpretation deserves study and attention in its own right, as the principal business of lawyers and judges. It will not do to treat the enterprise as simply an inconvenient modern add-on to the judges’ primary role of common-law lawmaking. Indeed, attacking the enterprise with the Mr. Fix-it mentality of the common-law judge is a sure recipe for incompetence and usurpation.

The state of the science of statutory interpretation in American law is accurately described by Professors Henry Hart and Albert Sacks (or by Professors William Eskridge and Philip Frickey, editors of the famous often-taught-but-never-published Hart-Sacks materials on the legal process) as follows:

Do not expect anybody’s theory of statutory interpretation, whether it is your own or somebody else’s, to be an accurate statement of what courts actually do with statutes. The hard truth of the matter is that American courts have no intelligible, generally accepted, and consistently applied theory of statutory interpretation.

Surely this is a sad commentary: We American judges have no intelligible theory of what we do most. (pp. 89-90)

But the Great Divide with regard to constitutional interpretation is not that between Framers’ intent and objective meaning; but rather that between original meaning (whether derived from Framers’ intent or not) and current meaning. The ascendant school of constitutional interpretation affirms the existence of what is called the “living Constitution,” a body of law that (unlike normal statutes) grows and changes from age to age, in order to meet the needs of a changing society. And it is the judges who determine those needs and “find” that changing law. Seems familiar, doesn’t it? Yes, it is the common law returned, but infinitely more powerful than what the old common law ever pretended to be, for now it trumps even the statutes of democratic legislatures.

If you go into a constitutional law class, or study a constitutional-law casebook, or read a brief filed in a constitutional-law case, you will rarely find the discussion addressed to the text of the constitutional provision that is at issue, or to the question of what was the originally understood or even the originally intended meaning of that text. Judges simply ask themselves (as a good common-law judge would) what ought the result to be, and then proceed to the task of distinguishing (or, if necessary, overruling) any prior Supreme Court cases that stand in the way. Should there be (to take one of the less controversial examples) a constitutional right to die? If so, there is. Should there be a constitutional right to reclaim a biological child put out for adoption by the other parent? Again, if so, there is. If it is good, it is so. Never mind the text that we are supposedly construing; we will smuggle these in, if all else fails, under the Due Process Clause (which, as I have described, is textually incapable of containing them). Moreover, what the Constitution meant yesterday it does not necessarily mean today. As our opinions say in the context of our Eighth Amendment jurisprudence (the Cruel and Unusual Punishments Clause), its meaning changes to reflect “the evolving standards of decency that mark the progress of a maturing society.”

This is preeminently a common-law way of making law, and not the way of construing a democratically adopted text. . . . The Constitution, however, even though a democratically adopted text, we formally treat like the common law. What, it is fair to ask, is our justification for doing so? (pp. 112-14)

Aside from engaging in the most ridiculous caricature of how common-law judging is conducted by actual courts, Scalia, in describing statutory interpretation as a science, either deliberately misrepresents or simply betrays his own misunderstanding of what science is all about. Scientists seek out anomalies, contradictions, and gaps within a received body of conjectural knowledge, and then try to resolve those anomalies and contradictions and to frame new hypotheses to explain the gaps in knowledge. And they evaluate their work by criticizing the logic of their solutions and hypotheses and by testing those solutions and hypotheses against empirical evidence.

What Scalia calls a science of statutory interpretation seems to be nothing more than a set of exegetical or hermeneutic rules, passively and mechanically applied to arrive at a supposedly authoritative reading of a statute, without regard to the substantive meaning or practical implications of the statute so construed. In other words, the role of the judge is to read and interpret legal texts skillfully, not to render a just verdict or decision, unless, that is, justice is tautologically defined as the outcome of the Scalia-sanctioned exegetical/hermeneutic exercise. Scalia fraudulently attempts to endow this purely formal approach to textual exegesis with scientific authority, as if by so doing he could invoke the authority of science to override, or annihilate, the authority of judging.

Here is where I want to invite Michael Oakeshott into the conversation. I quote from his essay “Political Education” reprinted as chapter two of his Rationalism in Politics and Other Essays.

[A] tradition of behaviour is a tricky thing to get to know. Indeed, it may even appear to be essentially unintelligible. It is neither fixed nor finished; it has no changeless centre to which understanding can anchor itself; there is no sovereign purpose to be perceived or inevitable direction to be detected; there is no model to be copied, idea to be realized, or rule to be followed. Some parts of it may change more slowly than others, but none is immune from change. Everything is temporary. Nevertheless, though a tradition of behaviour is flimsy and elusive, it is not without identity, and what makes it a possible object of knowledge is the fact that all its parts do not change at the same time and that the changes it undergoes are potential within it. Its principle is a principle of continuity: authority is diffused between past, present, and future; between the old, the new, and what is to come. It is steady because, though it moves, it is never wholly in motion; and though it is tranquil, it is never wholly at rest. Nothing that ever belonged to it is completely lost; we are always swerving back to recover and make something topical out of even its remotest moments; and nothing for long remains unmodified. Everything is temporary, but nothing is arbitrary. Everything figures by comparison, not with what stands next to it, but with the whole. And since a tradition of behaviour is not susceptible of the distinction between essence and accident, knowledge of it is unavoidably knowledge of its detail: to know only the gist is to know nothing. What has to be learned is not an abstract idea, or a set of tricks, not even a ritual, but a concrete, coherent manner of living in all its intricateness. (pp. 61-62)

In a footnote to this passage, Oakeshott added the following comment.

The critic who found “some mystical qualities” in this passage leaves me puzzled: it seems to me an exceedingly matter-of-fact description of the characteristics of any tradition — the Common Law of England, for example, the so-called British Constitution, the Christian religion, modern physics, the game of cricket, shipbuilding.

I will close with another passage from Oakeshott, this time from his essay “Rationalism in Politics,” with certain of his terms placed in parentheses and my corresponding substitute terms placed in brackets.

The heart of the matter is the pre-occupation of the [Originalist] (Rationalist) with certainty. Technique and certainty are, for him, inseparably joined because certain knowledge is, for him, knowledge which does not require to look beyond itself for its certainty; knowledge, that is, which not only ends with certainty but begins with certainty and is certain throughout. And this is precisely what [textual exegesis] (technical knowledge) appears to be. It seems to be a self-complete sort of knowledge because it seems to range between an identifiable initial point (where it breaks in upon sheer ignorance) and an identifiable terminal point, where it is complete, as in learning the rules of a new game. It has the aspect of knowledge that can be contained wholly between the covers of a [written statutory code], whose application is, as nearly as possible, purely mechanical, and which does not assume knowledge not itself provided in the [exegetical] technique. For example, the superiority of an ideology over a tradition of thought lies in its appearance of being self-contained. It can be taught best to those whose minds are empty; and if it is to be taught to one who already believes something, the first step of the teacher must be to administer a purge, to make certain that all prejudices and preconceptions are removed, to lay his foundation upon the unshakeable rock of absolute ignorance. In short, [textual exegesis] (technical knowledge) appears to be the only kind of knowledge which satisfies the standard of certainty which the [Originalist] (Rationalist) has chosen. (p. 16)


16 Responses to “Michael Oakeshott Exposes Originalism’s Puerile Rationalistic Pretension to Jurisprudential Profundity”

  1. JMRJ May 27, 2019 at 7:10 am

    Well, this is an interesting theme you’re onto, David! Sort of as an aside, I thought you’d find this interesting: https://www.scotusblog.com/2019/05/academic-highlight-hyatt-is-latest-example-of-textualist-originalist-justices-willingness-to-overturn-precedent/

    But see here. This is epistemology you’re expounding here, not economics and not law. We went into this subject ourselves not that long ago:



    I’ll have to read some Oakeshott when I get some time.

    Meanwhile, let me just throw out a couple of things you might find ponder-able.

    In Catholicism, which was apparently also an interest of Oakeshott’s, there are a number of sins considered “unforgivable”, known as “sins against the Holy Ghost”. One of these is to presume one’s own salvation. Another is to despair of one’s own salvation.

    I should have thought that Oakeshott would have a lot to say, much of it critical, about Hegel, who, I think it’s safe to say, took the rationalist position basically into insanity, despite having the best of intentions. But then Kierkegaard had so effectively demolished the Hegelianism of his day that maybe there’s not much point rehashing all that.

    Also, I don’t know if you like movies, but I’ve always thought that one of the themes of Dr. Zhivago was the destructive effects of what you might call “political madness” – the whole story is in some ways about the struggle of the main character to “just live” as he puts it at one point in the film.


  2. Brian Winters May 27, 2019 at 7:33 am

    The problem indeed is “rationalistic pretension,” and in the last two sentences of David’s post, Oakeshott channels, in order to take issue with, Descartes. Those interested in how the Cartesian project has led us astray might enjoy Stephen Toulmin’s book “Cosmopolis.” From a review of that book by Richard Rorty: “By showing how differently the last three centuries would have been if Montaigne, rather than Descartes, had been taken as a starting point, Toulmin helps destroy the illusion that the Cartesian quest for certainty is intrinsic to the nature of science or philosophy.”

    Scalia’s project was very much Cartesian. He believed that unless the law can proceed valid syllogism by valid syllogism, always with some authoritative text as the somehow self-evident major premise, then anything goes and we are condemned to legal anarchy. Various scholars have shown why that view, which did not originate with Scalia, is hopelessly confused. For those interested in these issues, try Steven Burton’s book “An Introduction to Law and Legal Reasoning” or Melvin Eisenberg’s “The Nature of the Common Law.” There is also a nice chapter on legal reasoning in a textbook Toulmin himself wrote called “An Introduction to Reasoning.”

    (I am a retired lawyer spending my time reading and writing about the common law and legal reasoning so I happened to have these and other works at hand when David’s post popped up, though I had forgotten about Oakeshott.)


  3. Henry Rech May 27, 2019 at 4:16 pm


    ”Various scholars have shown why that view, which did not originate with Scalia, is hopelessly confused.”

    Could you briefly outline what sort of issues are involved?


  4. Benjamin Cole May 27, 2019 at 4:24 pm

    Great post. And I wish I were smart enough to understand it.

    Remember, we must always hide our venal self-interest and political views behind an escutcheon of marvelous principles!


  5. Brian Winters May 27, 2019 at 9:54 pm

    I will try, but first, for context, let’s apply the principle of charity to Scalia’s enterprise and assume he is operating in good faith. What problem is he worried about that his formalist approach to legal reasoning is supposed to solve? He doesn’t think judges should “make” law. The way to prevent them from “making” law is to restrict their function to reasoning deductively from authoritative texts, like statutes. If that is all they do, then they are just following instructions issued by a legislature (or some other democratic source of authority, like a constitutional convention, etc.). But is such adjudication by algorithm possible?

    It may seem possible in what lawyers and judges call “easy” cases. Suppose the city council enacts an ordinance that says “Anyone who discharges a firearm within the city limits shall pay a fine of $500.” Bob discharges a firearm within the city limits. Looks easy enough:

    MAJOR PREMISE (the legal rule): Anyone who discharges a firearm within the city limits shall pay a fine of $500.
    MINOR PREMISE (the facts of the case): Bob discharged a firearm within the city limits.
    CONCLUSION (decision by the judge): Bob shall pay a fine of $500.

    The conclusion follows necessarily from the premises. All the judge had to do was read the ordinance, hear enough evidence to conclude Bob had done the deed, and Bob owes $500. No room for the judge to “make” any law here. Adjudication by algorithm. Scalia smiles. (Why one needs judges at all is a question worth asking if this is the model.)

    What if the gun went off accidentally? It discharged, but did Bob discharge it? What if, while Bob was holding the gun, it went off due to a manufacturing defect? What if Bob fired a warning shot when he heard someone trying to break down his door? And so forth…. Just like the ordinance that prohibits motor vehicles in the park—how about an ambulance?

    Another example:

    MAJOR PREMISE (the legal rule): As incorporated by the 14th Amendment, the 1st Amendment to the Constitution prohibits states from making any law that abridges freedom of speech.
    MINOR PREMISE (the facts of the case): From a tree in his front yard Smith hangs the governor in effigy. His neighbors are outraged and call the police. Smith is charged with disorderly conduct and convicted. He appeals his conviction arguing that his 1st amendment right to free speech has been abridged.
    CONCLUSION (decision by the court of appeals): ???????

    And it’s not just in the constitutional stratosphere that conclusions fail to automatically tumble out given a “rule” and the “facts.” Burton draws examples from the law of sales as set forth in the Uniform Commercial Code, in which, the algorithm having failed to function, some person will have to make a judgment. And the two words I have put in scare quotes make the formalist/algorithmic model even less plausible because in many cases it is not obvious how to articulate the rule (the major premise) or which of the infinitude of facts (the minor premise) matter. Somebody has to decide.

    Every practicing lawyer who has ever written a brief understands this. We also understand that the decisions are almost never arbitrary. (Flipping a coin would be arbitrary.) Judges do actually judge, and often in the process “make” law, but they are subject to constraints in doing so, not by the formalist’s imaginary algorithms, but in the way that Oakeshott understood.
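    The “adjudication by algorithm” model in the comment above can be made concrete with a short sketch. This is purely an illustration (the rule function and fact dictionaries are invented for the example, not drawn from any actual statute or from the post): deduction works in the easy case, but because the rule has no vocabulary for facts its drafter never anticipated, the accidental-discharge twist is simply invisible to it.

```python
# A naive "adjudication by algorithm": the ordinance encoded as a rule
# function. It can only consult the facts its drafter anticipated.

def ordinance(facts: dict) -> str:
    """Major premise: anyone who discharges a firearm within the city
    limits shall pay a fine of $500."""
    if facts.get("discharged_firearm") and facts.get("within_city_limits"):
        return "fine $500"
    return "no violation"

# Easy case: the facts track the rule's language, so deduction works.
bob_easy = {"discharged_firearm": True, "within_city_limits": True}
print(ordinance(bob_easy))  # fine $500

# The twist: the gun went off accidentally. The rule has no term for
# intent, so the "algorithm" convicts anyway; a judge must fill the gap.
bob_twist = {"discharged_firearm": True, "within_city_limits": True,
             "accidental_discharge": True}
print(ordinance(bob_twist))  # fine $500, despite the twist
```

    Repairing the syllogism amounts to rewriting the rule to mention intent; the point of the comment is that someone has to decide to do that rewriting, and that someone is a judge.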


  6. David Glasner May 28, 2019 at 11:23 am

    JMRJ, Thanks for the link to the SCOTUS blog which made a point about the decidedly anti-conservative nature of textualist/originalist jurisprudence in action. I am not a fan of Nietzsche or Hegel, but I haven’t read enough of either to venture any opinion about them.

    Benjamin, Thanks.

    Brian, I’m very happy that my post led you to comment and share your insights as well as provide the interesting citations to the literature on law and philosophy.


  7. Henry Rech May 28, 2019 at 5:26 pm


    Thanks for your response.

    I am not a lawyer, so please excuse me if I say something silly below.

    Taking your first example, and putting aside the fact that it is offered as a way of demonstrating a point, in a real-world situation is how the accused is charged not a factor? The prosecutors mount a case brought by police who have charged someone under a particular statute. If there are other considerations involved, does the defence have the ability to raise them, and might other applicable statutes aid the defence of the accused? If so, the judge is presented with a more complex array of factors, and even if he subscribes to the notion of algorithmic adjudication he must consider those factors, otherwise justice cannot seem to be done. Your example is simplistic by design, but it might lead one to carry the argument in an unfair way and characterise judges who rule by algorithm too unkindly. It seems to me that judges might be able to adjudge in an algorithmic way as long as they can consider all applicable statutes.


  8. Brian Winters May 29, 2019 at 8:04 am


    As in so many areas of life the root of the problem was best articulated by Yogi Berra who said “It’s really hard to make predictions, especially about the future.”

    Just to be clear, what we are talking about here is, to use the jargon, “canonical legal rules.” Roughly, these consist of rules that are written down: municipal ordinances, statutes enacted by a state legislature or the Congress, rules promulgated by administrative bodies like the FTC or a state public service commission, etc. Interestingly, a private contract, if it is in writing, can usefully be viewed as a set of “canonical legal rules”: Such a contract is written down and it specifies the rights and obligations the parties have, ie, rules that govern the relationship they are establishing. When a court is called upon to enforce a private contract a lot of the same issues arise as when a court is called upon to apply a statute, etc.

    Let’s restate the Scalia theory (also known as legal formalism or as we have been calling it “algorithmic adjudication”): “Given a rule as the Major Premise of a syllogism, and given somebody’s conduct as the Minor Premise of the syllogism, all judges need to do is DEDUCE the correct legal conclusion.” The first example in my previous comment is supposed to show how this works: Given an ordinance that makes it punishable to discharge a firearm within the city limits, and given facts that make it clear Bob did indeed discharge his gun in the city, what the judge needs to do is as clear as night and day: Order Bob to pay the fine.

    What makes deduction work in such an “easy” case is that the language of the rule tracks clearly and perfectly the facts on the ground. In an easy case the judge can just articulate the syllogism’s necessary conclusion. She does not have to “make” law. She runs the algorithm.

    In my previous email, I then suggested some twists on the easy case that get in the way of the deduction, eg, what if Bob’s gun went off accidentally, or as the result of a manufacturing defect or he fired it in self-defense?

    Your reaction isn’t silly at all, but precisely the right question to ask. I will take the liberty of rephrasing your question a bit: “In the face of one of these “twists” that seems to screw up the deduction, can’t we reconstruct the syllogism somehow by going beyond the quoted language from the ordinance and bringing in material from elsewhere in the canonical legal rules? For example, what if Bob has evidence (like a witness) who convinces us that he fired the gun accidentally? That changes the minor premise and seems to screw up the initial syllogism, BUT WHAT IF nearby in the municipal ordinances the city council has thoughtfully included a DEFINITION of “to discharge a firearm” and the definition makes it crystal clear that this phrase means to do it DELIBERATELY. Now can’t we restate the major premise to reflect the definition and restate the minor premise to reflect the evidence that Bob’s gun went off accidentally? And if we do that HAVEN’T WE FIXED THE SYLLOGISM AND RESTORED THE ALGORITHM?”

    Yes, given the new facts about what Bob actually did, and given the city council’s definition of “discharge a firearm,” we have neutralized the force of the “twist” and made the case “easy” again; that is, we can reach the appropriate legal conclusion by DEDUCTION from canonical legal rules: Bob need not pay a fine.

    But note that an essential condition for an easy case, subject to processing by logical deduction, is that the language of the relevant canonical rules tracks perfectly and with crystal clarity the facts that arise in the case. It happens. But very often it does not. And then there is a problem; the syllogism doesn’t work, or to put it differently, there is a “gap.”

    But why do such gaps exist? Can’t the legislators just be more careful when they draft canonical rules and make sure they cover all the possible bases so that, whatever set of facts a judge is presented with, the canonical rules would be robust enough that she could get to the appropriate legal conclusion algorithmically, like Scalia believes she should?

    Have you ever met a legislator? Even the ones who are super-conscientious and completely honest and really smart and devoted only to the public interest are going to have the Yogi Berra problem. They are not omniscient and they are using language. Several hundred years of legal history (and my own personal experience) demonstrate, at least to my satisfaction, that there are always going to be gaps that screw up the syllogisms the formalists would love to have.

    And for hundreds of years, the gaps have been filled by judges, whose gap-filling is also known as “making law.”

    And it’s not just legislators, the folks we send to the statehouse. In his book on legal reasoning, Steven Burton illustrates how there can be gaps even in a set of canonical rules like the Uniform Commercial Code. These rules have been drafted VERY CAREFULLY, by law professors who are experts in commercial law and who understand the need to include definitions of terms used in the rules, etc. Guess what? These folks aren’t omniscient either, and cases will arise in which the language of the rules does not track perfectly and with crystal clarity the facts on the ground. Guess who has to fill in the gaps?

    I have actually drafted a few canonical rules. If you are feeling masochistic, have a look at section 196.027 of the Wisconsin Statutes. We took lots of time and we looked at examples of similar statutes elsewhere. We got litigators to “poke holes” in the drafts, to tell us what kind of twists might present themselves that would require going to court. And, if you look at the statute, you will see how detailed it is, with plenty of definitions of terms, etc. AND YET, I have had this conversation with a client: “We have been brainstorming and we would like to do X. Does that statute you wrote permit it?” My answer: “I am not sure. I can make a good argument that it is permitted, but that conclusion does not follow deductively from the rule (that I wrote) and the facts as they would be if you act on your brainstorm.”

    Same problem with “private” canonical rules, ie, contracts. These rules are spelled out in excruciating detail for dozens of pages with scores of definitions, etc. And yet…. gaps. A’s claim that B has breached the contract is unlikely to be provable by means of a syllogism. “See you in court.”

    One more observation. While some legislators channel Scalia and scream (at least from time to time) about how “courts should not make law,” there are very well known and very important instances where a legislature has understood in spades that it faces the Yogi Berra problem and has enacted canonical legal rules that, in effect, invite the courts to fill in the gaps as time goes by. A good example of this is the Sherman Anti-trust Act. Most states have “little” Sherman Acts. The one in Wisconsin reads: “Every contract, combination in the form of trust or otherwise, or conspiracy in restraint of trade is illegal.” Suppose a client comes to you with a new marketing plan. What he describes about his firm’s proposed conduct gives you the facts, the minor premise. NOW, using only the canonical legal text as your major premise, deduce whether the marketing plan runs afoul of the law.

    Same problem with the client who comes to you and says he plans to hang the governor in effigy and wants to know whether that would be protected by the First Amendment right to freedom of speech. Using only the canonical legal text….

    My apologies for abusing David’s virtual real estate by going on at such length. These issues are nicely suited for discussing over beer.



  9. David Glasner May 29, 2019 at 8:37 am

    Brian, You’re welcome on my real estate any time. Thanks for your very clear and persuasive articulation of why judging can’t be reduced to looking things up in law codes and dictionaries.


  10. Henry Rech May 29, 2019 at 4:26 pm

    Thanks again for your response.

    It sounds as if you’ve had an interesting career.

    It would seem that if it weren’t for the “gap” there would be no need for lawyers. 🙂


  11. Kurt Schuler June 7, 2019 at 5:00 am

    David, what you describe as Scalia’s “ridiculous caricature of how common-law judging is conducted by actual courts” seems to be how Richard Posner, the most widely cited legal scholar ever and the most influential lower-court judge of the last few decades, actually operated. See the quote highlighted here:


    These are pretty deep waters that we nonexperts wade into at our peril.


  12. David Glasner June 7, 2019 at 5:15 am

    Kurt, Thanks for your comment and the link to the review of Posner’s latest. As opposed to Scalia, I have a decided preference for Posner, with whom I personally interacted two or three times. But I’m not partial to legal realism either, and Posner did agree to some extent with Scalia’s view of common-law judging, which he characterized as disguised legislating, advocating that judges be more honest about their legislative function. I disagree with that view, though I don’t deny that there are similarities between judges legislating and judges judging. I agree it’s complicated. I think it was Dworkin who got it right.


  13. Brian Winters June 7, 2019 at 6:58 am

    A couple of thoughts from out here in the Seventh Circuit:

    1. Posner is indeed a phenomenon. Standard advice has been: if you want a thoughtful, beautifully written primer on an area of the law, find a Posner decision and study it.

    2. Another 7th Circuit judge, the late Terry Evans, once remarked (referring to Posner) “there are judges on this court who have written more books than I have read.”

    3. Posner has always been a polemicist of sorts. I would be careful to distinguish what he says about his judicial practice from his actual judicial practice. Read a Posner decision and you are unlikely to walk away thinking “this is a judge for whom anything goes.”

    4. Beyond his decisions themselves, a good place to look for Posner’s less polemical take on the common law, stare decisis, statutory interpretation, etc. might be his textbook “Economic Analysis of Law.” At the very least there are powerful efficiency arguments against judges just making stuff up as they go along. That doesn’t mean they don’t sometimes have to make law/fill gaps, but there is a discernible order to this process.


  14. David Glasner June 7, 2019 at 7:25 am

    Brian, Thanks for weighing in again on this topic. I think the really important point is that a judge, even when overturning a particular precedent or making up a new precedent to fill in a gap, still has to be guided by the objective of fitting that particular decision as seamlessly as possible into a larger legal structure of statute and precedent. Judges can’t just make stuff up (legislators can do so a lot more easily, but they too are constrained by the legal system in which they are operating); judges have to make their decisions seem to follow from “the law” taken as a whole and as a tradition of jurisprudence.


  15. Hemant Kumar June 10, 2019 at 11:58 pm

    “…the objective of fitting that particular decision as seamlessly as possible into a larger legal structure of statute and precedent.” I believe that this is what is called “the principle of harmonious construction”.

    I occasionally (actually, not so occasionally) visit Uneasy Money for the Macro/Monetary discussions, which are always informative and thought provoking. The sporadic deviations into Law, Philosophy and Politics are icing on the cake!


  1. Nightcap | Notes On Liberty Trackback on May 27, 2019 at 8:17 pm


About Me

David Glasner
Washington, DC

I am an economist in the Washington DC area. My research and writing have been mostly on monetary economics and policy and the history of economics. In my book Free Banking and Monetary Reform, I argued for a non-Monetarist, non-Keynesian approach to monetary policy, based on a theory of a competitive supply of money. Over the years, I have become increasingly impressed by the similarities between my approach and that of R. G. Hawtrey and hope to bring Hawtrey’s unduly neglected contributions to the attention of a wider audience.

My new book Studies in the History of Monetary Theory: Controversies and Clarifications has been published by Palgrave Macmillan.

Follow me on Twitter @david_glasner

