Myside bias: the bias that divides us

I interviewed Canadian psychologist Keith Stanovich twice (here and here). He has done a lot of groundbreaking work in the fields of rationality and the psychology of reading. 
In his new book The Bias That Divides Us: The Science and Politics of Myside Thinking, Stanovich argues that we do not live in a post-truth society. Rather, he argues, we live in a myside society. 

No post-truth society 

Stanovich argues in the book that our problem is not that we no longer attach importance to truth and facts, but that we cannot agree on generally accepted truth and facts. We believe that our side knows the truth. Post-truth? That describes the other side. 
The inevitable result is political polarization. Stanovich shows what science can tell us about myside bias: how common it is, how to avoid it, and what purposes it serves. 

What is myside bias? 

Myside bias is the human tendency to evaluate evidence, generate evidence, and test hypotheses in a way that is biased toward our own prior beliefs, opinions, and attitudes. Myside bias concerns what we want to be true. 

Blind spot 

Stanovich explains that while myside bias is ubiquitous, it is an exception among cognitive biases: it is unpredictable. Intelligence doesn't protect you from it, and neither does education, and myside bias in one domain is not a good predictor of myside bias in another. 
The author argues that because of its status as an exception among biases, myside bias creates a real blind spot among the cognitive elite: those with high intelligence, strong executive functioning, or other prized psychological dispositions. They may consider themselves unbiased and purely rational in their thinking, but in fact they are just as biased as everyone else. 


Stanovich examines how the blind spot with respect to myside bias contributes to our current ideologically polarized politics and connects it to another recent trend: declining trust in scientific research as an objective arbiter.