Keeping Faith in Science

Science requires some level of faith. Coming from a scientist, this claim often takes people by surprise, particularly those who have accepted science as unwavering truth. After some explaining, the intricacies of the position become clear: take the child's approach and ask "why?" often enough, and you will eventually run out of answers, no matter how certain you are that "that's just the way it is." In 1979, Niklas Luhmann¹ proposed that faith in the knowledge of third parties is necessary to apprehend the complexity of the world. As our combined scientific knowledge grows increasingly complex, faith in the knowledge of experts becomes even more crucial to accepting science.

Because faith is fluid and open to change (as happens in a life-changing event), an individual's adherence to science is fluid as well. When scientific results oppose one's ideological beliefs, faith in that area of science, or in the experts conveying the results, is shaken. When other "science" is offered (regardless of rigor or peer review), the individual whose beliefs have been shaken becomes receptive to an alternative that better supports their ideology. This process is unscientific: when a hypothesis (a proposal or belief) is shown to be invalid, the hypothesis should be dismissed or reworked. Here, instead, the data is dismissed in favor of whatever data supports the original, discredited hypothesis. Science requires some degree of skepticism for progress to occur, but that skepticism sometimes collapses into pure distrust.

Recent studies by Kahan et al. indicate that, although this distrust can appear in anyone, it is unevenly distributed across political ideologies. Everyone is at risk of a collapse in critical thinking when their ideology is threatened. Without delving into the psychology connecting ideology to distrust, I propose that the uneven distribution may also result from how scientific developments happen to align with certain ideologies. When scientific discoveries undercut one ideology more than another, followers of the debunked ideology have more reason to distrust science. They are suddenly in the position of dismissing data that does not fit their ideology in order to restore order to their view of the world. On a larger scale, a simple way to justify dismissing an entire field is to assume that the people generating the data are all wrong and not to be trusted. Once someone is convinced that the experts are all wrong, whatever data best supports their personal view suddenly becomes "correct." A personal favorite explanation for how a lack of knowledge produces a presumed level of expertise is the Dunning-Kruger effect, whereby those least competent in a domain most overestimate their own competence.

To differentiate scientific debate from political debate, consider some recent examples showing that this collapse can strike both ends of the ideological spectrum. The occurrence of climate change, the safety of vaccines, and the benefits of two-parent households (regardless of gender) are all scientifically established (hypotheses proposed, tested, evaluated, peer-reviewed, published, repeated, verified), yet the results are distrusted by different groups of people. Addressing climate change requires a collective effort from individuals, governments, and corporations, but imposing regulations (forced cooperation) is unacceptable to some political ideologies, so any alternative to that science suddenly becomes viable. For vaccines, an emotionally distressed parent needs answers to their concerns and questions. For both vaccines and two-parent households, a single discredited study exists (by Andrew Wakefield and Mark Regnerus, respectively) that is repeatedly touted as "proof" that vaccines cause autism and that same-sex parents are dangerous to children. These articles remain the sole support for many people's opinions, despite the wealth of evidence that neither conclusion is accurate, and despite the studies' repeatedly documented flaws: a lack of controls and violations of numerous standards for scientific integrity and ethics. If no scientifically accepted answers are available, the mind is receptive to something else filling the gap. Humans have always created answers for what science cannot explain. The only difference here is that scientific explanations are being dismissed in favor of explanations that better fit an individual's mindset.

How do we deal with a self-imposed moratorium on rational thought when our minds panic at the hint of legitimate evidence that we may be wrong? Can we be taught a level of self-control when it comes to panic-thinking? I don't know how much can be done, but there are some obvious starting points. Without critical thinking skills, a person has little ability to distinguish between a testable (or tested) hypothesis and an imaginative alternative, and a basic understanding of science and the scientific process is also necessary to make that distinction. Equipping everyone with these basic tools won't prevent skepticism and questioning of science, but it just might end the paralysis that stifles progress when those tools are lacking.

¹ Luhmann, N. Trust and Power: Two Works by Niklas Luhmann (1979).
