Value neutrality among researchers is a myth that hurts public trust in science
As the U.S. recoils from the divisions of recent years and the scientific community tries to rebuild trust in science, scientists may be tempted to reaffirm their neutrality. If people are to trust us again, as I have frequently heard colleagues argue, we must be scrupulous about not allowing our values to intrude into our science. This presupposes both that value neutrality is necessary for public trust and that it is possible. The available evidence suggests that neither presumption is correct.
Our beliefs, prejudices and personal histories can influence our research in significant ways. Disclosing our biases, rather than denying them, may be essential to safeguarding public trust; the exercise can also alert us to limitations in our own work. It is a topic worth including in professional development and training resources for researchers at every career stage.
It is well known that people are more likely to accept evidence that accords with what they already believe. Psychologists call this "motivated reasoning," and although the term is relatively recent, the insight is not. Four hundred years ago Francis Bacon put it this way: "Human understanding is not composed of dry light, but is subject to influence from the will and the emotions … man prefers to believe what he wants to be true."