
On the “Tyranny of the Dead” Fallacy

“An imbecile habit has arisen in modern controversy of saying that such and such a creed can be held in one age but cannot be held in another.  Some dogma, we are told, was credible in the twelfth century, but is not credible in the twentieth.  You might as well say that a certain philosophy can be believed on Mondays, but cannot be believed on Tuesdays.  You might as well say of a view of the cosmos that it was suitable to half-past three, but not suitable to half-past four.  What a man can believe depends upon his philosophy, not upon the clock or the century.”
– G.K. Chesterton

__________________________________________

I was just twenty-five, and, given that I was attending a law school in the Washington D.C. area, I was in the youngest ten percent of my class.  The majority of students were legal or bureaucratic professionals who had decided to add a law degree to their resumes.  Those of us attending straight out of undergraduate school were in the minority.  This made for a unique social setting where virtually everyone had scheduled internships, clerkships, or at least part-time legislative assistant jobs that occupied them.  Time for thinking and meaningful conversation was slim.  Time for forming friendships was nil.  It had been easy to form close friendships in undergrad.  There, we enjoyed each other’s company.  We pursued common interests.  We were not always looking at each other as competitors or future adversaries.  By contrast, by the time of law school graduation, I could count the genuine friendships I had developed there on the fingers of one hand.

The resulting social dynamic was that (a) the older professionals looked down their noses at the college kids as young, inexperienced and naive, while (b) the college kids looked down their noses at the older professionals as losers who were making late starts on their careers.  The thing about law school is that it strongly attracts specific personality types.  I’m told that they are called “alpha personalities” – extroverted, assertive, aggressive, impatient, competitive, highly self-confident and all around … well, you know the word for them.  So whether you were in the younger minority or the older majority, you were supposed to look down your nose at someone.  Every one of your fellows, you were told, could conceivably either take the career you wanted or beat you in the courtroom.  I thought they all talked fast.  As one of the only Californians in my class, I was repeatedly told by everyone else “you talk slow.”  I was also frequently asked “you smoke weed, don’t you?”  Looking down my nose at them like I was supposed to, I would answer with a slow smile.

I might have had all the stress and high blood pressure that everyone else did if I had been the straight-A student in law school that I used to be in undergrad.  But, it occurs to one swiftly in the first semester of law school that, in order to have any hope of getting all A’s, one needs to: (a) be a genius-prodigy with a super-photographic memory; or (b) crawl into a little hole for the next three years and eat, breathe, and sleep nothing but court decisions, briefs and “IRAC” summaries.  However, if one enjoyed thinking, reading, and writing too much to commit that sort of social, spiritual and literary suicide, there was always option (c) – avoid stress and high blood pressure by focusing on the goal of graduating from law school without giving a damn about making junior partner at any large corporate firm or clerking for a United States Supreme Court Justice.  It is wonderful how stress-free law school suddenly is once one opts for (c).

The end result was that I, for the first time in my life in a school setting, began to gravitate towards the back row of the classroom.  It’s a little difficult to feel like a rebel in class while you’re attending something as privileged as graduate school, but I wasn’t trying to be one of the cool crowd.  I started sitting in the back row merely to distance myself from all the high-volume talkers.  Every class in law school has a few talkers that you wish would just shut up.  They absolutely have to give their opinion about everything (thereby prolonging the class).  A disadvantage to graduate school is that class times are more vague and undefined.  A professor can easily go over or under the allotted hour for class time (usually it’s over).  An advantage to graduate school is that class attendance is often more vague and undefined.  You can just get up and walk out of class any time you please (sitting in the back row assists in this strategy).  But, and this is something most people don’t understand, a “class talker” in law school is not the same as in other schools.  Every student in law school is normally considered an assertive talker by the rest of society.  So, if you put fifty of these people all in one little classroom, and then add a professor who asks them questions for the purpose of encouraging them to talk, the results are often unpleasant to the ear.

It is worth noting that Constitutional Law is perhaps the most fascinating of all classes in law school.  It’s certainly a more cheery subject than administrative law, corporate law, or, God forbid, tax law.  One does not often find oneself stimulated with interesting or happy thoughts while poring over the United States Internal Revenue Code.  Some thoughts do arise, but they are far from pleasant ones.  (We could, in fact, entertain the entirely reasonable possibility that all modern-day anarchists are created by originally innocent law students who are assigned to read even small portions of Title 26, U.S.C.)  The corporate law classroom always had the most disagreeable collection of students.  Most of them had perfected the art of a distinctively arrogant sneer that is unique to the District of Columbia.  The tax law classroom was dark and full of ludicrous amounts of spider webs.  A couple of skeletons of long deceased students sat in the back row, and the pasty-skinned professor only taught night classes.  Thus, by comparison, Constitutional Law was full of life.  It involved invigorating debates and discussions of philosophy and history.

It is often something of a vice.  But I do occasionally enjoy listening to, and even participating in, a good solid argument.  Constitutional Law class had just the sorts of arguments that were more enjoyable than most, so it was not with immediate plans of escape that I entered class one day.  It was early in the semester.  Early on, we had gone over the foundations of English Common Law, William Blackstone, Charles de Montesquieu, Thomas Hobbes, John Locke and “social contract” theory.  Now we were discussing our reading on the Federalist and Anti-Federalist Papers and the origins of the U.S. Constitution.  The discussion turned upon the nature of self-government.

“So … how did the founders who wrote the Constitution define self-government?” asked the professor.

“Government by the people instead of by a king with absolute power,” was the first tentative and formulaic answer.

“Democracy,” was the second boring answer.

“The founders defined self-government as democratic majority rule,” said someone else.

“Yeah, but majority rule to the founders only meant a majority of votes by rich upper-class white guys,” said the class feminist.

“Self-government is when the people delegate power to a governing authority by social contract,” read the student who was looking at his notes from last class.

The professor wasn’t quite getting what he was looking for.  He tried again.  “So what precisely does self-government consist of?”

“Elections.”

“Democracy,” again.

The professor decided to use a leading question, “Who makes the law in a self-government?”

“Politicians.”

“The people.”

“Their elected representatives.”

After further discussion, it was eventually and essentially agreed upon that self-government is when the people make the law or when the people choose the people who make the law.

One student, who I remember to this day always wore a black trench-coat, sneered.  “Whatever” (yes, students in law school use the word “whatever” in conversation), “we don’t even have self-government in the United States.”

“Do you mean that many of our laws and regulations arise out of unelected bureaucracy that continues to grow without any competent oversight by elected representatives?” the professor asked, reasonably assuming that the black trench-coat was a libertarian.

“Nah, it’s not that,” scoffed the trench-coat, “It’s that we didn’t make the law and we didn’t choose the people who made ninety-nine point ninety-nine percent of our laws.”

“Isn’t that what I just said?”  The professor seemed bored.

“That’s not the problem.  We don’t have self-government in America because most of the people who wrote the laws are now DEAD.”  The trench-coat emphasized the word “dead.”  “We didn’t write them.  We didn’t consent to them.  Therefore, we don’t have self-government.”

The young lady, who had earlier referred to upper-class white guys, was quite taken by this point.  “The same goes for the Constitution, even,” she said, agreeing with the trench-coat, “most of our country’s laws were written by white males who died over a hundred years ago.”

One of the most aggressive class-talkers who had, up until then, not joined in the discussion finally started in: “It’s really not fair when you think about it.  Why should we have to follow a system and a Constitution that was written hundreds of years ago by dead people?  We pretend we’re a democracy.  We have all this feel-good stuff about self-government and it’s all a sham.  I didn’t choose those old guys to write the Constitution for me.  I didn’t elect most of the Congressmen and Presidents who wrote and passed all our laws.  Why should I have to follow their ideas just because they were born before I was?”

This was, suffice to say, not the direction of discussion that the professor had planned.  He looked a little surprised, actually.  But then his eyes narrowed.

“Do you mean to assert,” he demanded of them, “that laws can lose their legitimacy as soon as the people who created them die?”

“Yes!” said the trench-coat.

“It’s only fair,” said the feminist.

“Logically, it’s necessary,” said the talker.  “Let’s stop being so hypocritical.  I didn’t consent to the Constitution.  You didn’t consent to the Constitution.  We didn’t write it and we didn’t choose the people who wrote it.  And the same applies to almost all the laws of the land.  Back then, they believed all sorts of things that we now know are, to put it frankly, just stupid.  These guys owned slaves.  Having to follow the laws that a bunch of dead guys wrote is not self-government.  It’s a tyranny of the dead, when you think about it.”

The trench-coat’s sneer doubled in size:  “It’s a form of despotism, really.  We are being bound by a bunch of guys who are rotting in their graves.  We do live under a tyranny of the dead.”

“A tyranny of the dead!”  The phrase was murmured and repeated by a large number of classmates.  They obviously liked it.  “Not self-government.”  “Despotism of the dead.”  “Who said we had to follow what they decided hundreds of years ago?”  “Yeah!”  “They are, like, dead, like literally dead.”  “Tyranny of the dead.”  “Who said the ‘founding fathers’ (finger quotation marks) could impose their ideas of government on us?”  “Why should we be bound to follow what they believed?”

“They’re DEAD,” pointed out one particularly bright student with a nod.

Finally, a lone hand was raised.  The professor motioned for him to go ahead and speak.  The student paused, waiting for all the self-congratulatory murmuring to die down a little.

“First,” he declared, “I fail to see why all my classmates are going into tens of thousands of dollars in debt to study and memorize the law when, as it has been so enthusiastically repeated, it was written by dead people.  Second, if we don’t have self-government merely because the law fails to be legitimate past the lives of a single generation, then I’m idiotically wasting my time in this class, aren’t I?  Third, the very idea of the law, if it ceases to be worth anything once a single generation dies, is absolutely and utterly meaningless.  Fourth, I hope that this is one of the most absurd discussions in class that these walls have ever had the misfortune to hear.  Fifth, since the very idea of law and government itself is so self-evidently oxymoronic if what all of you are saying is true, then why, for the love of God …”

He didn’t get to finish.  That was when the entire class erupted into the law school yuppie equivalent of an angry mob.

While not word-for-word identical, this is a close approximation of a real discussion that occurred in my class in law school.  Unfortunately, this example is symptomatic of a typically modern and popular sort of thinking.  The idea of being restrained and held back by the ideas of the past is not new, but it is increasingly prevalent today.  “The reformers of every age,” notes historian Jaroslav Pelikan, “whether political or religious or literary, have protested against the tyranny of the dead, and in doing so have called for innovation and insight in place of tradition.”  Charles Perrault, a supporter of Louis XIV (the king who became famous for eliminating older civil French traditions that placed limits upon the power of the monarchy), wrote an entire book arguing that the French literature of the late 1600s was superior to all other literature from the past.  “Learned Antiquity, through all its extent, was never enlightened to equal our times,” Perrault declared.  “Why,” asked Ralph Waldo Emerson, in 1836, “should not we have a poetry and philosophy of insight and not of tradition?”

In 1917, Reverend W. John Murray, spokesman for “the New Thought” movement, delivered an address that he titled, with zero subtlety, “The Tyranny of the Past.”  Murray announced to his audience:

“Traditions and customs, conceived in the minds of antediluvian ancestors, seem to have exercised a power over the race, despite all its progressive unfoldment … Thus it is that we stand on the threshold of a higher revelation, only to turn back to some tradition of the past, with which this larger vision does not coincide. The tyranny of the past is that it wears a ditch in the brain, into which all new thoughts tumble, and are carried away before we take time to assimilate or digest them.”

It would be amusing if this type of rhetoric weren’t still so popular.  “The past is an encumbrance ever, as the future is a myth,” Murray intoned.  There are, admittedly, different fantasies of the future created by those who are allergic to history.  But Murray isn’t interested in any sort of forward thinking.  He prefers living in the moment: “Therefore, we should be too busy with the well-filled hour to look backward or forward.  Sufficient for the day is the experience thereof …”  Of course, one’s experience on any given day can be quite different depending upon what one is willing to learn from the past.  Murray was the sort of motivational speaker who would get caught up in his own oratory: “As we cannot feel the weight of any misery one moment before it arrives, let us not drag the clanking chains of yesterday’s sins into today’s activities. They are excess baggage to him who would climb the Alps of self-conquest … The tyranny of the past is tyrannous only to him who, like Lot’s wife, looks back upon it.”

There are, of course, other views about the past.  But the evidence of our culture suggests that this is still the most popular one.  It very well may be that this sort of thing will cause us some problems.