[This is part of a set: Thinking]
Confirmation bias is such a tricky one that it requires persistent vigilance.
The November issue of Scientific American carries a story based on Marc Hauser’s problems, the nature of which hasn’t yet been made clear. Some suspect fraud, but the more generous view is confirmation bias.
Two factors make combating confirmation bias an uphill battle. For one, data show that eminent scientists tend to be more arrogant and confident than other scientists. As a consequence, they may be especially vulnerable to confirmation bias and to wrong-headed conclusions, unless they are perpetually vigilant. Second, the mounting pressure on scholars to conduct single-hypothesis-driven research programs supported by huge federal grants is a recipe for trouble. Many scientists are highly motivated to disregard or selectively reinterpret negative results that could doom their careers. Yet when members of the scientific community see themselves as invulnerable to error, they impede progress and damage the reputation of science in the public eye.
The very edifice of science hinges on the willingness of investigators to entertain the possibility that they might be wrong.
The best antidote to fooling ourselves is adhering closely to scientific methods. Indeed, history teaches us that science is not a monolithic truth-gathering method but rather a motley assortment of tools designed to safeguard us against bias.
As astronomer Carl Sagan and his wife and co-author Ann Druyan noted, science is like a little voice in our heads that says, “You might be mistaken. You’ve been wrong before.” Good scientists are not immune to confirmation bias. They are aware of it, and they avail themselves of procedural safeguards against its pernicious effects.
At least it’s reassuring that scientists keep an eye on each other, given the difficulty of keeping an eye on oneself. That’s about the best we can expect. And that this is the case only underlines the paucity of any ‘other way of knowing’.
More on Marc Hauser here.