Definition of contaminated mindware:
- untrue: it is not grounded in evidence
- potentially harmful: on the whole, it is not beneficial to the person who holds it
- sticky: once acquired, it can be hard to get rid of
- contagious: it can spread easily throughout a population.

Contaminated mindware can spread more easily:
- when it is packaged in an appealing narrative that promises some kind of benefit to the host,
- when it rides on the popularity of other, possibly more valid mindware by copying superficial characteristics from it (for example, scientific-looking graphs),
- when it contains self-replication instructions ('send this mail on to 10 different people', 'recruit two new participants for this training course'),
- when it has evaluation-disabling properties (for instance by claiming that evidence is irrelevant or impossible, by denying that evidence exists, by making belief unsupported by evidence into a virtue, by encouraging adherents to attack non-believers, etc.).
You might think that intelligence would offer good protection against contaminated mindware, but this turns out to be wrong. Complex narratives can even be extra attractive to highly intelligent people. Further, studies have demonstrated that intelligent people may be more capable of creating 'islands of false beliefs' or 'webs of falsity', using their considerable computational power to rationalize their beliefs and to ward off the arguments of skeptics.
How can you protect yourself against contaminated mindware?
This leads me to the question of how we can protect ourselves against contaminated mindware. Will a head-on attack on popular contaminated mindware lead to its demise, or does it run the risk of making it even more popular? A head-on attack might generate further publicity for the contaminated mindware, thus exposing more people to its attractions.
A direct challenge may lead to a defensive response and a strengthening of convictions
In general, people may become defensive when they feel that someone else is trying to convince them of something. This effect is described by reactance theory (developed by Jack Brehm): when we feel our freedom is threatened, we try to protect it. Recent research has demonstrated that people whose confidence in closely held beliefs is undermined may become even stronger advocates of those beliefs. This defensive response may be stronger still when the contaminated mindware contains the evaluation-disabling properties mentioned above ("When someone says such-and-such, don't listen! That is the Devil talking!"). Another reason why a head-on attack may be ineffective is that contaminated mindware may even contain an instruction to attack opponents.
Philosopher Daniel Dennett has said: "You seldom talk anybody out of a position by arguing directly with their premises and inferences. Sometimes it is more effective to nudge them sideways with images, examples, helpful formulations that stick to their habits of thought." Or might a different approach work better? For instance, teaching people to recognize contaminated mindware more easily and to protect themselves against it? Or might satire be a good way to help people see through contaminated mindware?
Questions: I am NOT asking you to answer these here on this website, but rather to reflect on them and try to answer them for yourself:
- How well would you be able to recognize contaminated mindware if you came across it?
- How well could you protect yourself against it?
- How would you be able to recognize the evaluation-disabling properties of the contaminated mindware?
- Have you ever freed yourself from contaminated mindware which you were already 'infected' with?
- How did you do that?