Also Known as Critical Thinking

The following is based on Carl Sagan's book "The Demon-Haunted World", in which he suggested tools for testing arguments and detecting fallacious or fraudulent ones:

    • Wherever possible there must be independent confirmation of the facts.
    • Encourage substantive debate on the evidence by knowledgeable proponents of all points of view.
    • Arguments from authority carry little weight (in science there are no "authorities").
    • Spin more than one hypothesis; don't simply run with the first idea that catches your fancy.
    • Try not to get overly attached to a hypothesis just because it's yours.
    • Quantify, wherever possible.
    • If there is a chain of argument, every link in the chain must work.
    • Occam's razor - if there are two hypotheses that explain the data equally well, choose the simpler.
    • Ask whether the hypothesis can, at least in principle, be falsified (shown to be false by some unambiguous test). In other words, is it testable?
    • Can others duplicate the experiment and get the same result?

Additional issues are:

    • Conduct controlled experiments - especially "double-blind" experiments, where the person taking measurements is unaware of the test objectives and of which subjects are the controls.
    • Check for confounding factors - separate the variables.
    • Look for common fallacies of logic and rhetoric, such as:
    • Ad hominem - attacking the arguer and not the argument.
    • Argument from adverse consequences (putting pressure on the decision maker by pointing out dire consequences of an "unfavorable" decision).
    • Appeal to ignorance (absence of evidence is not evidence of absence).
    • Begging the question (assuming an answer in the way the question is phrased).
    • Caricaturing (or stereotyping) a position to make it easier to attack.
    • Confusion of correlation and causation (see the first code sketch below).
    • Observational selection (counting the hits and forgetting the misses).
    • Excluded middle - considering only the two extremes in a range of possibilities (making the "other side" look worse than it really is).
    • Inconsistency (e.g. military expenditures based on worst-case scenarios, while scientific projections of environmental dangers are thriftily ignored because they are not "proved").
    • Misunderstanding the nature of statistics (President Eisenhower expressing astonishment and alarm on discovering that fully half of all Americans have below-average intelligence!).
    • Non sequitur - "it does not follow" - the logic falls down.
    • Post hoc, ergo propter hoc - "it happened after, so it was caused by" - confusion of cause and effect.
    • Meaningless question ("What happens when an irresistible force meets an immovable object?").
    • Short-term v. long-term - a subset of excluded middle ("why pursue fundamental science when we have so huge a budget deficit?").
    • Slippery slope - a subset of excluded middle - unwarranted extrapolation of the effects (give an inch and they will take a mile).
    • Special pleading (typically referring to God's will).
    • Statistics of small numbers (such as drawing conclusions from inadequate sample sizes; see the second code sketch below).
    • Suppressed evidence or half-truths.
    • Weasel words - for example, the use of euphemisms for war such as "police action" to get around limitations on Presidential powers. As Talleyrand put it, "An important art of politicians is to find new names for institutions which under old names have become odious to the public."

(Excerpted from material prepared by Michael Paine for The Planetary Society Australian Volunteer Coordinators.)
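
To make the confusion of correlation and causation concrete, here is a minimal Python sketch. The scenario (hot weather driving both ice-cream sales and swimming accidents) and all of the numbers are invented for illustration; they come from this page's editors, not from Sagan or Paine. A hidden confounding factor drives two otherwise unrelated quantities, producing a strong correlation between them:

    import random

    random.seed(1)

    def pearson(xs, ys):
        """Plain Pearson correlation coefficient, standard library only."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    # Confounder: daily temperature. It drives both quantities below;
    # neither quantity causes the other.
    temps = [random.gauss(20, 8) for _ in range(1000)]
    ice_cream = [3.0 * t + random.gauss(0, 10) for t in temps]  # sales rise with heat
    accidents = [0.5 * t + random.gauss(0, 3) for t in temps]   # more swimming in heat

    print(pearson(ice_cream, accidents))  # strongly positive, around 0.7

    # "Separate the variables": hold the confounder roughly fixed by looking
    # at a narrow temperature band, and the correlation largely vanishes.
    band = [(i, a) for t, i, a in zip(temps, ice_cream, accidents) if 19 < t < 21]
    print(pearson([i for i, _ in band], [a for _, a in band]))  # close to 0

Holding the confounder constant is exactly the "check for confounding factors - separate the variables" advice above: once temperature is controlled for, the apparent link between ice-cream sales and accidents evaporates.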
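
A second sketch, for the statistics of small numbers (again purely illustrative, with an invented 70% threshold): simulating a fair coin shows how often a small sample looks dramatically biased by chance alone, and how that illusion fades as the sample grows:

    import random

    random.seed(2)

    def fraction_extreme(sample_size, trials=10_000):
        """Fraction of repeated experiments in which a fair coin
        looks biased (70% or more heads) at the given sample size."""
        extreme = 0
        for _ in range(trials):
            heads = sum(random.random() < 0.5 for _ in range(sample_size))
            if heads >= 0.7 * sample_size:
                extreme += 1
        return extreme / trials

    for n in (10, 100, 1000):
        print(f"n={n:5d}: {fraction_extreme(n):.1%} of runs show >= 70% heads")

With 10 tosses, roughly one experiment in six shows 70% or more heads; with 1000 tosses it essentially never happens. Conclusions drawn from ten data points deserve corresponding suspicion.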

A profound book in this area is "Cognitive Illusions", edited by Rüdiger F. Pohl (Psychology Press). Pohl and his co-authors examine the psychology of how we take in information and go astray in interpreting it. As often as not there is an emotional bias, or a mind-set carried over from previous experience, that leads us to less-than-perfect reasoning.
