Clifford, belief, and morality

Clifford's dictum is probably too strong, though of course it depends on how much and what kind of evidence is required to establish sufficiency. I would also distinguish, as I don't think Clifford did, between intellectual wrongs and moral wrongs. But it may be argued that it is always wrong in some sense to be indifferent to whether one’s beliefs are true. All manner of error or fallacious thinking may be excused. Deception can be excused in some circumstances, and even self-deception in rarer circumstances. But to affirm in any general way the moral irrelevance of congruity between belief and truth is to undermine a basis for morality itself. We may argue that in some circumstances some considerations must outweigh truth, but we may not claim that the truth never matters. If a person prefers feeling good to facing reality, he or she should say so.

The rightness or wrongness of any action depends, among other things, on what its consequences will be or could be. Moral judgment therefore entails ascertaining those consequences, and the ascertainment must be accurate if the moral judgment is to be correct. This cannot work if it may ever be presupposed that an action will have no morally relevant consequences. Therefore, indifference to truth can never be justified. One may be excused for failing to believe what is true, but not for not caring whether what one believes is true.

The philosophical pioneers

The Greeks were probably not the first philosophers, nor the only philosophers of their time. They were the only ones mentioned in the documents that have survived from their time. We cannot say there must have been others. Neither can we say there could not have been any others. Considering human nature, it is not likely that nobody anywhere had had such thoughts before.

Ancient documents and ancient thinking

Thoughts are immaterial. They do not get preserved in the historical record. Only their tangible expressions get preserved. Architecture, pottery, jewelry, sculpture, and other such cultural artifacts might sometimes convey hints of an ancient people's mindset, but only written documents can tell us anything specific about the thinking of a community that no longer exists.

There is a notion attractive to many people that all ancient documents are either true or fraudulent (and almost never fraudulent). These people suppose, in other words, that until modern times, all writers either believed everything they wrote — and believed it with good reason — or else were attempting to perpetrate some hoax or fraud. It seems to be supposed that nobody ever knowingly wrote fiction before it became possible, with the invention of printing and the advent of mass literacy, to make money writing fiction. The crucial distinction between fiction and fraud is that the author of fiction does not hope or expect to be believed. He might hope that his readers will be enlightened by his narrative. He will surely hope that they will be entertained. But he will not intend for them to think that the events of his narrative actually occurred.

Although creative writing is nothing new, disclaimers such as "This is a work of fiction" are very new. That is not because modern people have lost an ability, previously common to all people, to distinguish fact from fiction. It is rather because modern legal systems make it prudent for a storyteller to clearly state his intentions. I don’t mean to suggest that all ancient writings should be treated as so many Morte D’Arthurs. In general, narrative works that seem intended to be perceived as factual history probably were so intended. But if ancient readers were no more skilled than we are at distinguishing fact from fiction, then neither were ancient writers. With rare exceptions if any, even the real historians of antiquity had access to little if any primary information. They might in some cases have been able to examine a few documents that recorded eyewitness accounts of significant events. For events occurring within their lifetimes, they might sometimes have been able to interview the actual witnesses. For the most part, though, Herodotus and others like him were recording stories they had heard, or in some cases had read, and some of those stories had been circulating for many generations. The transmission of history by oral tradition is not a reliable process. Many ancient historians acknowledged this fact and were explicit about their skepticism toward certain of their sources. They would say things like "I don’t believe it myself, but according to _____, this is what happened." We should not infer, though, that wherever they failed to express such doubt, doubt is not now justified.

Historical research, legalities aside

The reconstruction of history is not a legal proceeding. No relevant evidence is barred from consideration on grounds of having been improperly obtained. History's witnesses cannot be cross-examined. Their testimony is fixed, and they get no opportunity to clarify or retract any of it. Furthermore, we rarely have the witnesses' actual testimony, even in writing. What we have are copies of copies of copies of documents whose originals, according to some people, were written by certain individuals. It can never be merely assumed that the extant copies accurately reproduce the originals or that the originals were in fact written by those credited with their authorship. The task of historians is, among other things, to reach scientifically informed judgments about such issues. The rest of us can either defer to the professionals' judgment or do enough research ourselves to make up our own minds.

And so what?

A mistaken belief about what happened in the past will not ordinarily have important consequences for the present. It is unlikely to make any practical difference in my life whether or not I believe there was a real King Arthur or William Tell. Of course, I will be thought foolish if I believe something contrary to a universal consensus, but nothing worse is going to happen, and if I care nothing about the opinions of other people, that will be the end of it.

Not much to go on

There is a tendency to suppose that particular ideas originate with whoever is first known to have written about them, absent the author’s explicitly crediting predecessors. In many cases, a contrary supposition does not necessarily violate parsimony. Ancient documents usually did not survive unless someone wanted them to. Their default fate was disappearance, which usually meant their vanishing from history. There is no way we can know about a document unless either we have a copy of it or else the document’s existence is mentioned by someone whose work we do have a copy of.

There was no mass production of any books. The cost of making copies was great enough that no copy was made without someone's being powerfully motivated. Such motivation rarely happened, and when it did, the person it happened to usually wanted only one copy. A book had to be considered extremely special -- like perhaps the word of God -- before people with lots of money thought it worthwhile to produce numerous copies of it. And naturally, almost nobody felt any motivation to copy books containing ideas they disapproved of. It did happen, although rarely, that someone would produce quotations from a book in order to argue against the author’s ideas. More often, though, a writer disputing another writer would only paraphrase the other. Considering what can be done with a paraphrase, we often have no good idea what was really written originally. But even a paraphrase, no matter how unfair, would at least let us know of the book's existence and provide some hint of its content.

A book that was considered not even worth mentioning would have vanished without a trace from the historical record. In many cases, it might almost as well never have been written. I say "almost" because there is no telling how many of the people who did influence history might have gotten some of their key ideas from books that they never mentioned to anyone whose own recollections got recorded for posterity.

Of course, the more popular a book was, the likelier it was to be copied by many people (as well as mentioned in other books). And the more copies that were made at any given time, the likelier it was that some of those copies would be copied, and that some of those copies would be copied in turn, and so on until the printing press was invented. Nowadays, the writer of an extremely popular book becomes not only famous but rich. In ancient times, you almost had to be rich to begin with in order to have enough leisure time to write a book, and there was no way for the book to make you any richer. The most you could hope to gain from the book was some measure of fame.

Truth, reality, and logic

Truth is supposed to have some connection with reality — another concept for which a universally accepted definition has been elusive. Most of us assume that there is such a thing as reality, though, and that truth is a property of any statement congruent with it. This in a way is a restatement of the axiom of identity: A true statement is a true statement. If that is so, then by definition a statement is false if it has no such congruence. Most of us believe it cannot be both congruent and incongruent, which is all we’re saying when we affirm the axiom of noncontradiction. If we also believe that it must be one or the other, then we affirm the axiom of the excluded middle.
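
In the usual schematic notation, with P standing for any statement, the three axioms are commonly rendered something like this:

```latex
% Identity, noncontradiction, and excluded middle in schematic form
\[
\text{Identity: } P \to P \qquad
\text{Noncontradiction: } \neg(P \land \neg P) \qquad
\text{Excluded middle: } P \lor \neg P
\]
```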

Everything else in logic follows from these axioms. All the technical terms, the Latin verbiage, the rules of inference, the validity of syllogisms: the whole works is nothing but commentary on and elaboration of the three axioms. If an argument is valid, it will be impossible for the conclusion to be false if all the premises are true. So if the premises happen to be true, we can be certain that the conclusion is true as well, because otherwise we would have a contradiction. An argument is invalid if it allows the possibility of a false conclusion notwithstanding all true premises. It fails to demonstrate, in other words, that a false conclusion would in fact contradict at least one of the premises. The formal study of logic is an investigation of what it takes to make an argument valid and of exactly what is wrong with arguments that lack validity.

A fallacy is any error that renders an argument invalid. Much work has gone into the taxonomy of fallacies. Their common characteristic is that they make it possible for the conclusion to be false notwithstanding that all the premises are true. The only thing being categorized is the way in which they do that. Likewise for the taxonomy of valid arguments: the several labels simply identify different ways in which it can be shown that there is no way for the conclusion to be false if all the premises are true.
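
One way to make the validity criterion concrete is to brute-force the truth table and look for a row in which every premise is true and the conclusion is false. Here is a rough sketch of that check in Python (purely illustrative, nothing more): modus ponens survives the search, while the fallacy of affirming the consequent does not.

```python
from itertools import product

def implies(a, b):
    # Material conditional: "if a then b" is false only when a is true and b is false.
    return (not a) or b

def counterexamples(premises, conclusion):
    # Return every truth-value assignment (p, q) under which all premises are true
    # but the conclusion is false. An empty list means the argument form is valid.
    return [vals for vals in product([True, False], repeat=2)
            if all(prem(*vals) for prem in premises) and not conclusion(*vals)]

# Modus ponens: from "p implies q" and "p", infer "q".
print(counterexamples([lambda p, q: implies(p, q), lambda p, q: p],
                      lambda p, q: q))   # prints [] -- no counterexample, so valid

# Affirming the consequent: from "p implies q" and "q", infer "p".
print(counterexamples([lambda p, q: implies(p, q), lambda p, q: q],
                      lambda p, q: p))   # prints [(False, True)] -- invalid
```

The empty list for modus ponens is exactly what its validity amounts to; the single row found for affirming the consequent is exactly what makes it a fallacy.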

Needed: Citizen philosophers

Until philosophers are kings, or the kings and princes of this world have the spirit and power of philosophy, and political greatness and wisdom meet in one, and those commoner natures who pursue either to the exclusion of the other are compelled to stand aside, cities will never have rest from their evils — nor the human race, as I believe — and then only will this our State have a possibility of life and behold the light of day. (Plato, Republic, Book V)

No one yearns for philosopher-kings anymore, because no one wants kings of any kind. The western world has made up its collective mind that no unelected government can be a legitimate government. The power to govern must be derived from the consent of the governed, and that consent must be conveyed by a vote of those who would be governed.

Plato scorned democracy. His objections to it still find plenty of sympathy among intellectuals, but there is a strong consensus that if there are any solutions to democracy’s disadvantages relative to the alternatives, they must be found within a democratic system, because no alternative is acceptable. One of the most famous comments to this effect is attributed to Winston Churchill: "Democracy is the worst form of government there is, except for all the others."

We have become committed to the ideal that government must be of the people, by the people, and for the people. Rarely throughout human history has it been disputed that government ought to be for the people. The most tyrannical despots have generally made at least a pretense of thinking that nothing was more important to them than their subjects' welfare and happiness.

Government of the people and by the people has been less often considered a good idea. The ancient Greeks tried it for a while and failed, and for almost 2,000 years thereafter most western thinkers supposed that the failure was clear evidence that democracy was not really such a good idea.

The first efforts to rehabilitate it were made in late medieval England, where a hybrid democracy-monarchy evolved over several centuries. The eventual result was that essentially all political power shifted from the hereditary ruler to the elected rulers. Before that transition was completed, England's American colonies revolted and set up their own government without even a vestige of monarchy. After a little over two centuries, that experiment seems to have been more successful than the Greek one was. Most of the world's nations are now democracies, at least superficially.

So, has Plato been proved wrong? Substituting "nation" for "city," we may ask whether we have achieved any rest from evils. I don’t know anyone who thinks so. Some think that most Americans are better off than most Athenians were. Others will dispute that. But nobody thinks that America or any other democratic nation is a utopia. Even if a few of the evils that existed in Athens have lately been exorcized from our own culture, there are too many remaining, tormenting too many people.

Most of us have had the occasional fantasy about what we would do to eradicate one or more of our society's pathologies if we were granted absolute dictatorial power. We are not moved to such fantasies by any craving for power. Well, maybe some are, but the rest of us are moved instead by the apparent inability of democratic institutions to take any effective action against those pathologies. The conventional wisdom is that if we must choose, the present pathologies are to be preferred over those that come with any dictatorship, however benevolent, and that in any case there is no reason to suppose that we must choose. There is nothing a dictatorship can do that a democracy cannot do if the people will just decide to do it. It is up to those who see a need for government action to persuade at least half of their fellow citizens to approve the action. Such is the democratic principle, anyway, however much more complicated the practice might be.

A commitment to democracy need not imply a fatalistic acceptance of democracy's shortcomings or a pretense that they either don't exist or are no cause for concern. It should instead imply a determination to minimize the consequences of those shortcomings. They cannot be eliminated. If they are the cost of our freedom, then let us gladly pay them, but let us not gladly pay more than we must.

Except in a few very small communities, democracy in real life is not self-rule, strictly speaking. It is rule by a few people who are chosen to be rulers, periodically subject to being denied their ruling authority. As Jefferson put it, they govern by the consent of the governed. The correct term for such a government is not democracy but republic. A republic may be considered a kind of democracy, but it is not a pure kind.

Republican government is often justified on grounds not of quality but expediency. The delegation of power is thought to be necessitated by the unfeasibility of having the citizens of a large state gather regularly to conduct government business. The citizens therefore delegate their governing authority to representatives whom they hold accountable by conducting regular elections. This is thought to be a compromise between the ideal of a pure democracy and the bane of any kind of unelected government.

Of course, there being no free lunch, minimizing one cost tends to entail other costs. If we won't be ruled by philosopher kings but are determined instead to rule ourselves, then perhaps it behooves us to become philosophers ourselves. It is guaranteed that we will always know what we want. It is not guaranteed that what we want will always be good for us, nor that we will always know how to tell the difference between what we want and what we ought to have.

Not everyone can be a philosopher, and not everyone who could be one needs to be one. But rulers uninformed by philosophy cannot be the best rulers, even of themselves. A little learning is indeed dangerous, but there is not any less danger in complete ignorance. Those who have learned a little should be trying to learn more, not to know less.

The epistemic community

What shall we take to be our epistemic community? Ideally, we should be as unparochial as we can manage. That is to say, we should not assume that epistemic communities coincide with geographical communities. Our neighbors or coworkers might not know what we suppose everybody knows.

This was not an issue through most of human history. Until very recently, nearly all communities were small and culturally homogeneous, and almost everyone spent their entire lives within walking distance of where they were born. Only a handful of the world's major cities were multicultural. Even in those places, the diversity was not so much celebrated as simply tolerated, and often enough not even tolerated, at least not peacefully.

An epistemic community is a set of intercommunicating people all of whom know approximately the same things. Here, "know" is being loosely defined. The shared knowledge of an epistemic community is that set of beliefs that are treated by the community as incontrovertible statements of fact. It is not relevant to this definition whether anyone outside the community thinks any of those beliefs constitute knowledge. A person who was raised in a certain religious environment and, through his formative years, had no significant interaction with people having different worldviews, is not to blame for considering his religion's worldview to be in some sense a self-evident truth.

As a generality with no instructive exceptions, the acquisition of knowledge is never a solitary activity. We get most of our knowledge from other people, and even our private thoughts, however productive of new knowledge they might be, depend for their intelligibility on concepts that we have learned from others. Newton was not the only discoverer standing on the shoulders of giants. We all have mentors, and we all ride their shoulders. Skeptics and other freethinkers are no exceptions. We all believe much because, and only because, we heard it from sources we think authoritative. We skeptics differ from the credulous mainly in whom we consider authoritative, and in our readiness to dispute our own authorities when we think we have good reason to.

The important questions

Leo Strauss is credited with the observation that conservatism is the doctrine that all the important questions have already been answered. Few liberals, in my experience, exhibit any suspicion that those questions remain unanswered. The difference between conservatism and liberalism is only about what the right answers are.

There was a time, within living memory of the oldest baby boomers, when liberalism also had the distinction of being the only political philosophy that any educated person could espouse. Then along came William F. Buckley Jr., and a few of us discovered that conservatism was not just for troglodytes. But the man who made conservatism intellectually respectable did not quite make it intellectually fashionable.

The prevailing wisdom has it, more than ever before, that the lessons of the past are all cautionary. The study of history is thought to be the study of human folly at best or, more typically, human wickedness. The consensus among most educated people seems to be that our ancestors, with the possible exception of a few painters, sculptors, musicians, and writers, achieved nothing laudable.

It is easy to get the impression that conservative territory is coterminous with religious territory, that conservatives are religious and liberals are secular. It might even be approximately so. Explicit disparagement of religion in any general way may be politically incorrect, but it is no coincidence that the Christian Coalition was a conservative movement, or that "Religious Right" is part of our political lexicon while "Religious Left" is not. It comes back to those important questions and their answers. Conservatives think the answers were revealed to men of rare virtue. Liberals think the same thing, but not about the same men. Both sides believe in revealed truth. They just differ about what truth was revealed and to whom it was first revealed.

They also differ in that conservatives usually admit to believing in revealed truth while liberals tend to deny it.

History and principled behavior

We see things that have happened in history and believe that they should not happen again. We should then ask: What principle, had it been followed by those who did the thing that should not happen again, would have prevented their doing it? We then ask: Have other groups, claiming to follow the same principle, done things that we think should not happen again? Must we not then reformulate the principle?

Proving arithmetic

It may be objected that the basic operations of arithmetic are in fact empirically verifiable – that, for example, whenever two objects are combined with three objects, the result is invariably five objects, thus confirming 2 + 3 = 5.

Let’s make it as simple as it gets and consider 1 + 1 = 2. It is claimed that we never observe a contrary. But what would constitute a contrary observation? Well, what constitutes the observation itself? To what empirical fact, exactly, does "one plus one" correspond?
We begin with an obvious case. I stand in front of a table. Beside the table is a barrel full of apples. I remove one apple from the barrel and put it on the table. Then I remove another apple and put it on the table beside the first apple. There are now two apples on the table. Let us now disregard all the other apples in the barrel. The two on the table were two apples while they were both in the barrel. They were two apples when one was on the table and the other was still in the barrel. And they will remain two apples if one of them is removed from the table and taken anywhere else – into the next room or across the world. The sum of one apple and one apple has nothing to do with their location or movement relative to each other. One apple and another apple are two apples by definition, regardless of their spatial relationship.
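
For readers who want the "by definition" claim made explicit, a proof assistant will oblige. What follows is a minimal sketch in Lean 4; the names Nat', add, one, and two are made up for the illustration. Once numbers and addition are defined in the usual successor style, "one plus one is two" is established by nothing more than unfolding the definitions; no apples, tables, or barrels are involved.

```lean
-- A minimal sketch (Lean 4). The names Nat', add, one, two are illustrative only.
inductive Nat' where
  | zero : Nat'
  | succ : Nat' → Nat'

-- Addition defined by recursion on the second argument.
def add : Nat' → Nat' → Nat'
  | n, Nat'.zero   => n
  | n, Nat'.succ m => Nat'.succ (add n m)

def one : Nat' := Nat'.succ Nat'.zero
def two : Nat' := Nat'.succ one

-- "One plus one is two" holds by definitional unfolding alone.
theorem one_plus_one : add one one = two := rfl
```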

Even their temporal relationship is irrelevant. Suppose I rarely eat apples. Many years ago, while living in Florida, I bought one at a grocery there and ate it. This year, now living in California, I buy one from a grocery here and eat it. Thus I have eaten two apples, but those apples never existed at the same time, let alone in the same place.

For further illustration, we can consider how we deal with an apparent falsification. Add one cup of sugar to one cup of water. The result will not be two cups of sugar water, but considerably less. Scientists do not treat this as an exception to 1 + 1 = 2. They treat it as evidence of how matter is constituted. When we observe a fact that seems to contradict mathematics, we adjust our understanding of the facts in order to accommodate the math, if we are scientifically rational.

Knowledge and certainty

In common usage, in contrast to academic philosophical usage, a claim of knowledge is essentially a claim of certainty. To say "I know P" is to say "I don't consider it possible for P to be false." This is reflected in the assertion sometimes made, "I don't believe it, I know it." Such a statement is nonsense, strictly speaking, but it is widely supposed that mere belief implies at least some uncertainty about the thing believed.

This happens not to be an intellectually useful distinction between belief and knowledge, though, which is probably why Plato first tried to establish a better one in the Theaetetus. Twenty-four hundred years later, a good consensus is still elusive, but there is much to be gained from a study of the reasons why the issue has been so challenging.

Human nature

We often hear that you can’t change human nature. Sometimes it is a complaint. Other times it is an excuse. For people who think certain social changes would be desirable, the intractability of human nature is a problem, because reforms would entail forcing people to act contrary to their natural inclinations. For those opposed to such changes, it means that reform efforts would be futile at best.

There are those, too, who deny the existence of anything like human nature. This is the "blank slate" model, according to which a society (or its ruling class) can manipulate people to act pretty much any way the rulers want them to. In the nature-nurture debate, these are the champions of nurture.
