Chapter 13. Introduction to Fallacies and Bias

13.1 Introduction to Fallacies

The introduction to this book gave an overview of the role of identifying fallacies in critical thinking, including defining the term “fallacy,” which we distinguished from a “falsity.” A proposition is either true or false, but arguments contain fallacies. The term “fallacy” is often used in everyday speech to mean simply “error” or “mistaken belief.” We will not use it that way. We are identifying fallacies as mistakes in reasoning or inference. We have already covered deductive fallacies (invalid argument patterns) such as denying the antecedent and affirming the consequent.
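As a brief reminder of those two invalid patterns (the symbolic schema below is added here for reference; P and Q stand for any propositions), each fails because its premises can all be true while its conclusion is false:

\[
P \rightarrow Q,\quad Q\ \therefore\ P \qquad \text{(affirming the consequent)}
\]
\[
P \rightarrow Q,\quad \neg P\ \therefore\ \neg Q \qquad \text{(denying the antecedent)}
\]

For example, “If it rained, the street is wet; the street is wet; therefore it rained” affirms the consequent: the street could be wet for some other reason.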

A fallacy, in the strict sense, is a form of argument that is invalid or else violates a relevance condition.

This part of the book covers informal reasoning, so we are now looking at more everyday forms of making arguments. Here we will look at the strengths and weaknesses of these patterns and examine the ways they can be misused. So what makes something an informal fallacy? Fallacies are specific features of unsuccessful arguments. Fallacies appear in arguments—that is, they appear in the transition from a set of premises to a conclusion. An argument can often go wrong in more than one way at once; real arguments are rarely as neatly separable as the examples we show here. We are offering exemplars as diagnostic tools so that you can become a more careful reasoner—identifying both what is wrong and how to fix it.

Douglas N. Walton, in his A Pragmatic Theory of Fallacies (Tuscaloosa: University of Alabama Press, 1995, p. 255), offers five conditions for identifying fallacies:

  1. an argument (or at least something that purports to be an argument) that
  2. falls short of some standard of correctness,
  3. is used in a context of a dialogue,
  4. has a semblance of correctness about it, and
  5. poses a serious problem to the realization of the goal of the dialogue.

So far we have talked about conditions 1 and 2—that arguments have fallacies and fallacies are a problem. But the rest deserve more attention. Condition 3 reminds us that arguments are about trying to convince, thus they are part of an exchange where a conclusion is considered (ideally!). Arguments are public exchanges, governed by rules, intended to establish a truth. But since fallacies are not successful or undermine cogency in some way, we must pay attention to condition 4. Many fallacious arguments will feel familiar, and by that very fact, they might seem convincing. Sometimes fallacious arguments will say true things that are irrelevant to the dialogue at hand, thus distracting the conversation. Either way, the problem with a fallacy isn’t always immediately detectable, so we should look beneath our first impressions. Condition 5 tells us that sometimes a fallacious argument, beyond simply being bad reasoning, is a roadblock that interrupts the very possibility of getting to the argument we want to make. To a certain extent, one or all of these conditions will apply to the fallacies we cover in part 3 of this book.

Remember that critical thinkers have a certain kind of attitude toward belief: they are both open-minded and sceptical. Fallacies usually have a deceptive appearance and pass for good arguments. In fact, we all likely use fallacious forms of argument many times every day. Thus we can all benefit from paying attention to our level of scepticism about what we hear and read. Often we perpetuate fallacies and it causes no damage because, if we were more careful, we could reformulate our arguments in better forms. However, about equally often, the very thinking behind our arguments is at fault (our inferences), and the fallaciousness of our arguments can only be removed by rethinking our opinions and correcting our tendencies toward sloppy and irrelevant thinking. So the study of fallacies is valuable because it provides us with tools for thinking more coherently and for increasing our ability to discover the truth.

You can find many indexes of fallacies online.1 There are often various names for the same fallacy—some in English and some in Latin!

While we will cover fallacies and identify them by specific names, the important thing is to recognize the kind of error in reasoning, what is wrong with it, and why. Biases also lead us astray from the truth and need to be identified in order to improve our thinking. Often when evaluating the ways in which arguments go wrong, both biases and fallacies are identified. Arguments often have both! However, it is important to understand how fallacies relate to biases.

13.2 Bias and Relativism

The fallacies involving bias might well be called fallacies of irrelevance. In each, a different kind of irrelevancy involving bias is introduced in an attempt to obscure the real issue by stirring up our emotions. It is very common for critical thinking texts to focus on the importance of avoiding bias and the evils of stereotyping, vested interests, prejudice, and conflicts of interest. The danger of this sort of emphasis is that these failures of reasoning can be so overstated and exaggerated that students come to believe that everyone’s opinion is equally valid because we are all woefully biased. Often students also jump to the conclusion that simply criticizing another’s position or argument is a kind of error (because “all opinions are valid”), or that our beliefs are all reducible to expressions of our self-interest (people “believe what they want to believe”), neither of which is remotely true. So some preventative medicine is called for.

First, let’s deal with the claim that “everyone’s opinion is valid.” We hear this claim a lot, but if we are to understand it, we need significant context. Does this mean that everyone has a political right to free thought? It does seem like we have a right to form our opinions without direct interference from the government. But does this mean that all the opinions formed are “valid” in the sense that they all lay claim to truth?

If we are talking about truth when saying, “All opinions are valid,” the position is called relativism,2 which the Stanford Encyclopedia of Philosophy author Maria Baghramian defines as follows: “Relativism about truth . . . is the claim that what is true for one individual or social group may not be true for another, and there is no context-independent vantage point to adjudicate the matter. What is true or false is always relative to a conceptual, cultural, or linguistic framework.”

If the truth is completely relative, then there is little point in a book on critical thinking that is trying to offer better methods of arriving at the truth. A lot depends on what we are investigating. If we mean complex linguistic and spiritual claims embedded in a total way of life, then such claims might need context for their truth to be understood. But if we mean most of our usual truth claims, such as “Smoking causes cancer,” “Climate change is accelerating,” or “The earth is round,” the truth of these claims does not vary by culture, language, or location. So this means that there are some truth claims that are not relative or context dependent. Critical thinking is about building a method for finding and justifying truth claims.

So if all truth isn’t relative, then we should work on trying to weed out inaccuracies. Let us start with concerns that opinions are biased. We can look directly at the word “bias.” It has a neutral origin but is now primarily used in a negative way. The word “bias” began simply as a reference to a diagonal line, as when cloth is cut “on the bias” (diagonally across the grid made by the threads), and it has come to mean a point of view, preference, or attitude toward something. The idea that bias is bad creeps into usage because our preferences or attitudes can lead us to deviate from, or outright conflict with, what reason requires. Let us look at an example:

Example exploring bias: Is bias always bad?

Are parents biased toward their children? They are typically interested in the welfare of their own children. Some parents treat their own children as exceptions, as though their children deserve special treatment, treatment they do not grant to other children, just because they are their children. Think of cutting in line at an amusement park. Parents might believe that their child shouldn’t have to wait their turn while others should, or excuse the bad behaviour of their own child but not that of their child’s friend. In such a case, they will both care about their children and give them unfair preferential treatment. Of course, parents should care about their children; after all, they love them, and their children are deeply dependent on them.

So it is possible both to be “biased” (since parents care) and to give unfair preferential treatment. It isn’t the bias per se that’s the issue; it is the unfair preferential treatment. To treat your own children in an unfairly preferential way is not an acceptable consequence of parental love. It is wrong not because you are biased, but because it is a failure to universalize a simple moral rule: that we should be willing and able to put ourselves in the shoes of others. In other words, you’ve acted against reason. Any rule that a parent could apply to grant goods to their children could be used by any other parent to grant similar goods to their children; rules, whether intellectual or moral, apply universally or not at all. This is a fundamental starting point of critical thinking: don’t distort reasoning by using selective procedures (i.e., applying rules only when they serve your interests rather than consistently).

Consider the cognitive bias illusory superiority,3 where one overestimates one’s good qualities. A species of this is the “Lake Wobegon Effect,”4 named for a fictional town where all the parents think their children are above average, which, of course, could not be true.

But this is all to point out that the very fact that people have interests, care about particular things or people, or have wants and hopes does not imply that people will always reason badly, treat others shabbily, or be “biased” in a bad sense. Having interests or preferences is not by itself bad; what is usually called “bias” in the negative sense is really an intellectual failure to deal properly with one’s interests. We all have motives, desires, and preferences, but that doesn’t mean every claim we make is woefully biased. To be good critical thinkers, we have to be open to scrutinizing the ways that biases can distort our thinking, and we need methods for correcting or accounting for bias. One way to do this is to learn about common biases and remind ourselves and others of them in relevant situations. This is very different from dismissing all opinions as biased.

The emotions and interests that human beings have provide us with motives for reasoning, and such motives are not by themselves sources of rationality or irrationality. And for the critical thinker, the incentives offered by emotion, interests, or hope will not be barriers to critical thinking but only guides for which problems to consider (we are motivated by truth, at least sometimes!). When we undertake a project of critical thinking, we have an aim. The aim is to pursue truth. For this, we need rules of clear thought and good cognitive practice. Selfishness and bigotry—like cheating, lying, and theft—are moral failures involving patterns of irrationality; they are not mere products of interest.

13.3 Stereotyping

As with biases, we are often told that to be good critical thinkers, we need to avoid stereotyping. However, when applied to reasoning, stereotyping is an important and powerful method of inference. The word “stereotype” also has a neutral origin and a meaning that has become primarily negative. The root of the word lies in a manufacturing process in which one makes a model of something by means of a mould, and the objects produced by the mould share the shape of the original. This is a passive transmission of shape from one thing to another.

Applied to reasoning, stereotyping is the kind of inference where we are led to expect that one thing will be like another because it is superficially like it; basically, it is the application of analogy.

Stereotyping provides hypotheses for future evaluation and testing. Of course, the dominant use of the word “stereotype” is negative, emphasizing the passive and superficial sides of the root meaning. After all, things that have the same shape need not otherwise be similar; chocolate coins, for example, are not genuine currency. Stereotyping has come to refer primarily to a settled and prejudicial belief and attitude. But notice again that the problem with stereotyping in this sense is explicitly cognitive—it is an error in thinking. The bigot who “stereotypes” others engages in (among other things) shoddy reasoning and holds on to dubious and implausible beliefs in the face of counter-evidence by avoiding or discounting available facts—they ignore relevant differences and are not sensitive to context. These are failures that good critical thinkers should avoid because they tend to produce false beliefs through flawed reasoning. Thus, we should be wary of our tendency to stereotype and be on the lookout for dissimilarities when we are undertaking analogical reasoning.

In the long run, you are less likely to get what you want when reasoning badly. Correcting for bias, stereotyping, and emotional interference in your thinking will benefit you and others. Having said this much, let us end on a note of caution. None of this is to say that emotions are irrelevant or that they always lead us away from the truth. To begin, emotions give us information. They aim us at goals and highlight relevant features of a situation. Of course, emotions can make those features seem more important than they actually are and might hold your attention for too long, making you miss other important features of a situation. Add to this that bias can obscure important and relevant features, and it might seem like the goals of critical thinking are out of reach. But we should not despair.

The appropriate critical response to these difficulties is care; one steps back, thinks methodically about the whole issue, and attempts a more objective consideration of the facts. A useful approach is to shift perspectives. If other parties are affected by the issue, we can ask how the situation would be viewed by each other person involved. Others are similar enough to us that we can learn from their experience. Indeed, their different interests will highlight different but equally relevant features of the situation in question—they will have a better view of some things, and you will have a better view of others.

If there is a purely rational case for intellectual cooperation, it rests in this: everyone’s view of the whole is likely to be partial, and real objectivity requires the contribution of many views.

Notice how intellectual cooperation is not relativism; this is working together toward a careful consideration of perspectives. The traditional moral vices of pride, greed, and selfishness are barriers to critical thinking because they distort reasoning, and precisely because (and to the extent that) each person is vulnerable to these vices, good practices of critical thinking require vigilance against their effects. In short, bias is not intrinsically negative, but it does pose dangers, both in the first person and in others, which the critical thinker must address in order to reason more clearly and well.

Key Takeaways

  • A fallacy, in the strict sense, is a form of argument that is invalid or else violates a relevance condition.
  • Fallacies appear in arguments—that is, they appear in the transition from a set of premises to a conclusion (it is this transition that can contain a fallacy).
  • Walton suggests five features of a fallacy: an argument that falls short of a standard of correctness, is used in the context of a dialogue, has a semblance of correctness about it, and poses a serious threat to the realization of the goal of the dialogue (truth-seeking).
  • The truth cannot be relative if critical thinking has a point. There is a difference between carefully considering other perspectives and declaring the truth to be relative.
  • To be biased is to have a point of view, preference, or attitude. Bad biases are those that embody a preference for unfairness, inaccuracy, or irrationality.
  • It is consistent to have emotions and interests and still be rational and a good critical thinker.
  • Stereotyping is the kind of inference where we are led to expect that one thing will be like another because it is superficially like it; basically, it is the application of analogy.
  • Good critical thinkers are sensitive to relevant differences, so they avoid stereotyping in the negative sense.
  • Good critical thinkers approach their thinking with clear values (such as the value of consistency), values that do not distort reasoning.

13.4 List of Fallacies Covered

Chapter 14. Fallacies of Ambiguity

Equivocation

Equivocation occurs when a key word is used in two or more senses in the same argument, and the apparent success of the argument depends on the shift in meaning. It can also occur when two different words that look or sound the same become confused and lead to a fallacious inference.

Amphiboly

The fallacy of amphiboly is when there is a structural ambiguity in the grammar of a sentence that the argument or claim depends on.

Accent

The fallacy of accent arises when there is an ambiguity of meaning because it is unclear where the stress should fall in a statement or what tone of voice is intended.

Composition

The fallacy of composition is when one argues invalidly from the properties of the parts of a whole to the properties of the whole itself, or when one reasons invalidly from properties of a member to properties of a class.

Division

The fallacy of division is when one argues invalidly from the properties of the whole itself to properties of a part, or when one reasons invalidly from properties of a class to properties of a member.

Hypostatization

The fallacy of hypostatization consists of regarding an abstract word or a metaphor as if it were a concrete one.

Chapter 15. Fallacies of Emotional Bias

Personal attack (ad hominem)

An ad hominem fallacy occurs when we reject someone’s claim or argument simply by attacking the person rather than the person’s claim or argument.

Abuse

The fallacy of abuse is when name-calling and abusive words are used to direct attention away from the issue at hand and toward those who are arguing.

Poisoning the well

The fallacy of poisoning the well occurs when we criticize a person’s motivation for offering a particular argument or claim rather than examining the worth of the argument or claim itself.

Tu quoque (“Look who’s talking”)

In the fallacy of tu quoque, a person is charged with acting in a manner that is incompatible with the position he or she is arguing for.

Mob appeal

Mob appeal or argumentum ad populum can be described as attempting to sway belief with an appeal to our emotions, using theatrical language, or appealing to group-based or special interests.

Appeal to pity (argumentum ad misericordiam)

The fallacy of appeal to pity occurs when an arguer attempts to evoke feelings of pity or compassion in order to cause their dialogue partner to assent to their claim.

Appeal to force or fear (argumentum ad baculum)

The appeal to force or fear consists of the use of threats of force or unfortunate consequences to cause acceptance of a conclusion.

Two wrongs make a right

In two wrongs make a right, the arguer attempts to justify their claim or behaviour by asserting that the person they are trying to convince would do the same thing.

Chapter 16. Fallacies of Expertise

Appeal to authority

The appeal to authority is a fallacy where we take something as fact just because an expert claims it to be true (without supporting considerations about their expertise and how that relates to their claim).

Snob appeal

The fallacy of snob appeal tries to motivate belief by saying that if the dialogue partner supports this claim, they will be a part of an exclusive and thus superior group.

Appeal to tradition

In the fallacy of the appeal to tradition, the fact that a social or cultural practice has been done a certain way in the past is taken to be a reason for it to be done that way in the future.

Appeal to nature

In the fallacy of the appeal to nature, one argues that if something occurs in nature it is good, and if it is unnatural it is bad.

Appeal to anonymous authority

In the appeal to anonymous authority, claims are asserted on the basis of being held by an authority that is not clarified or given.

Appeal to ignorance

In the appeal to ignorance, one takes the failure to disprove a claim as an adequate reason to take the claim seriously. It inappropriately argues that negative evidence can prove a positive claim.

Chapter 17. Fallacies of Distorting the Facts

False analogy

The fallacy of false analogy is the comparison of two things that are only superficially similar, or that, even if they are very similar, are not similar in the relevant respect.

False cause (family)

The fallacy of false cause is actually a family of related fallacies that occur when an arguer gives insufficient evidence for a claim that one thing is the cause of another.

Post hoc, ergo propter hoc

Post hoc, ergo propter hoc is Latin for “after this, therefore because of this.” This fallacy occurs when we assume, without adequate reason, that one event B was caused by another event A because B happened after A.

Mere correlation

With mere correlation, we assume that B was caused by A merely because of a positive correlation between A and B.

Reversing cause and effect

With reversing cause and effect, we conclude that A causes B when in fact B causes A, so there is a causal connection, but not the connection we believe.

Spurious correlation

In spurious correlation, we conclude that A is the cause of C, when in fact both A and C are the effects of some common cause B.

Slippery slope (wedge) argument

In the fallacy of slippery slope, a person asserts that some event or consequence must inevitably follow from another without any argument for the inevitability of the event in question.

Irrelevant thesis (ignoratio elenchi)

In the fallacy of irrelevant thesis, an arguer attempts to sidetrack his or her audience by raising an irrelevant issue and then claiming that the original issue has been effectively settled by the diversion. In short, the attempt is made to prove a thesis other than the one at issue.

Chapter 18. Fallacies of Presumption

Sweeping generalization (fallacy of accident)

The fallacy of sweeping generalization is committed when an argument that depends on the application of a generalization or rule to a particular case is improper because a special circumstance (accident) makes the rule inapplicable to that particular case.

Hasty generalization (converse accident)

The fallacy of hasty generalization is committed when an argument that develops a general rule does so in an improper way because it reasons from a special case (accident) to a general rule.

Bifurcation

The fallacy of bifurcation is when an arguer treats a distinction or classification as exclusive and exhaustive of the possibilities, when in fact other alternatives exist. In this fallacy, one confuses contraries with contradictories.

Chapter 19. Fallacies of Evading the Facts

Straw person

In the case of the straw person fallacy, an arguer constructs their dialogue partner’s view out of “straw” (to make it easy to knock down), which effectively creates a new person, the “straw person” who is refuted (rather than the original dialogue partner).

Begging the question

The fallacy of begging the question is assuming what you intend to prove or should be proving. It is a failure of the support relationship.

Question-begging epithets

The fallacy of question-begging epithets uses slanted language that is question begging because it implies what we wish to prove but have not yet proved.

Complex question

The fallacy of complex question is when the arguer asks a question that presupposes the truth of the very claim at issue.

Special pleading

Special pleading is when we use slanted or loaded language to describe others but neutral or positive language to describe ourselves.

1 https://iep.utm.edu/fallacy/

2 https://plato.stanford.edu/entries/relativism/

3 https://en.wikipedia.org/wiki/Illusory_superiority

4 https://en.wikipedia.org/wiki/Lake_Wobegon
