May 19, 2016

How to detect nonsense!


Guest post by Jamie Hale 

Nonsense, as the term is used here, refers to nonscientific information that is presented as scientific when in fact it is not. I developed the Nonsense Detection Kit to provide guidelines for separating sense from nonsense. There is no single criterion for distinguishing the two, but it is possible to identify indicators, or warning signs; the more warning signs a claim displays, the more likely it is nonsense. The Nonsense Detection Kit was inspired by the work of Carl Sagan, Scott Lilienfeld and Michael Shermer, and it is concerned specifically with scientific nonsense.

Below is a brief description of indicators that should be useful when separating sense from nonsense.

Nonsense indicator 1: personal beliefs and biases drive the conclusions 
Everyone with knowledge in a given area is biased to some degree. Nonsense claims are often heavily influenced by personal biases and beliefs. Scientists recognize their biases and personal beliefs and use scientific processes to minimize their effects. Scientists, much like non-scientists, often find themselves looking for confirmatory evidence. “[A]t some point usually during the peer-review system (either informally, when one finds colleagues to read a manuscript before publication submission, or formally when the manuscript is read and critiqued by colleagues, or publicly after publication), such biases and beliefs are rooted out, or the paper or book is rejected for publication” (Shermer, 2001, p. 22). Nonsense claimants often fail to recognize their biases (consciously or unconsciously) and thus make little effort to keep those biases from influencing their claims.

Nonsense indicator 2: excessive reliance on authorities 
A large portion of what we learn comes from authority figures (teachers, authors, parents, journalists, etc.). Often, an authority provides accurate information. The problem occurs when we rely too heavily on authority. Authority may provide a hint as to what is right, but authorities are fallible. Different authorities often assert conflicting beliefs; which one is right? Both may be so-called experts. Authorities, like non-experts, are susceptible to a range of conscious and unconscious biases, make mistakes, and often have vested interests.

Thoughts on Authority 
  • "It is generally best to take the advice of “authorities” with a grain of salt." (Morling, 2012, p.37) 
  • "Authorities must prove their contentions like everybody else. This independence of science, its occasional unwillingness to accept conventional wisdom, makes it dangerous to doctrines less self-critical, or with pretensions to certitude." (Sagan, 1996, p.28) 
  • "Authority may be a hint as to what the truth is, but is not the source of information. As long as it’s possible, we should disregard authority whenever the observations disagree with it." (Feynman, 1999, p.104) 
Perpetuators of nonsense sometimes prefer authority claims to evidence. I am sure you have heard it before: “my doctor says,” “my preacher says,” “my coach says,” and so on. The only real authority in science is the evidence.

Nonsense indicator 3: use of logical fallacies (common fallacies of logic and rhetoric)
Those making nonsense claims often present bad arguments. This is not to say that evidence-based information is never presented with fallacious arguments, though that is probably less likely when evidence exists for a claim. Some people are simply not very good at arguing; they may be asserting a true claim but lack the argumentative skills needed to present a logical case.

Common fallacies:
  • Ad hominem: attacking the arguer and not the argument 
  • Argument from authority (as mentioned earlier) 
  • Appeal to ignorance: asserting that whatever has not been proved false must be true, or that because we don’t know the answer, the claim of interest can be assumed correct 
  • Appeal to tradition: that is the way it has always been done so that must be the right way to do it 
  • Appeal to emotion: I feel very strongly about it, so it has to be right 
  • Strawman fallacy: pretending to refute an opponent's argument, while actually refuting an argument which the opponent did not advance
  • Non sequitur: from the Latin “it does not follow”; often a failure to recognize alternative explanations 
  • Post hoc, ergo propter hoc: from the Latin “it happened after, so it was caused by”; the illusion of cause 
  • Confusing correlation with causation: an association (correlation) between variables does not necessarily imply that one causes the other (a brief simulation below illustrates this)
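Because correlation without causation can feel abstract, here is a minimal, hypothetical sketch (in Python, not part of the original Nonsense Detection Kit) of how a hidden confounder can make two causally unrelated quantities move together. The variable names and the ice-cream/sunburn framing are illustrative assumptions, not data.

```python
# Illustrative sketch: a hidden confounder z produces a strong correlation
# between x and y even though neither causes the other.
import random

random.seed(0)
n = 10_000

# z is the confounder (e.g., "hot weather")
z = [random.gauss(0, 1) for _ in range(n)]
# x ("ice cream sales") and y ("sunburn cases") both depend on z, not on each other
x = [zi + random.gauss(0, 0.5) for zi in z]
y = [zi + random.gauss(0, 0.5) for zi in z]

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length lists."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
    var_a = sum((ai - ma) ** 2 for ai in a)
    var_b = sum((bi - mb) ** 2 for bi in b)
    return cov / (var_a ** 0.5 * var_b ** 0.5)

print(f"correlation(x, y) = {pearson(x, y):.2f}")  # roughly 0.8, with no causal link
```

In this toy setup both x and y are driven entirely by z; the measured correlation is high, yet intervening on x would do nothing to y. That is exactly why an observed association, on its own, cannot establish causation.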

References are available upon request. The complete Nonsense Detection Kit is featured in the book In Evidence We Trust: The Need for Science, Rationality and Statistics (http://maxcondition.com/store-2/).

Jamie Hale, M.S. is associated with Eastern Kentucky University's Psychophysiology Lab and Perception & Cognition Lab. His current research interests are scientific cognition, memory and eating behavior. He has written for numerous national and international publications. He is the author of In Evidence We Trust: The Need for Science, Rationality and Statistics. To learn more about Jamie visit his sites www.knowledgesummit.net and www.maxcondition.com. To hire him for lectures send him an e-mail (Jamie.hale1@gmail.com).

