Necessary Fallibility

How do we know what we know? Mostly, by looking. By seeing the world around us and applying our critical thinking abilities to understanding it. Those critical thinking abilities rely on our own previous experience and on what we learn from others. But what are the limits to this experience? In “Final Cut: Medical arrogance and the decline of the autopsy,” I found this bit of ruminating on the possible limits of knowing:

In a 1976 essay, the philosophers Samuel Gorovitz and Alasdair MacIntyre explored the nature of fallibility. Why would a meteorologist, say, fail to correctly predict where a hurricane was going to make landfall? They saw three possible reasons. One was ignorance: perhaps science affords only a limited understanding of how hurricanes behave. A second reason was ineptitude: the knowledge is available, but the weatherman fails to apply it correctly. Both of these are surmountable sources of error. We believe that science will overcome ignorance, and that training and technology will overcome ineptitude. The third possible cause of error the philosophers posited, however, was an insurmountable kind, one they termed “necessary fallibility.”

There may be some kinds of knowledge that science and technology will never deliver, Gorovitz and MacIntyre argued. When we ask science to move beyond explaining how things (say, hurricanes) generally behave to predicting exactly how a particular thing (say, Thursday’s storm off the South Carolina coast) will behave, we may be asking it to do more than it can. No hurricane is quite like any other hurricane. Although all hurricanes follow predictable laws of behavior, each one is continuously shaped by myriad uncontrollable, accidental factors in the environment. To say precisely how one specific hurricane will behave would require a complete understanding of the world in all its particulars—in other words, omniscience.

Our critical thinking skills must be accompanied by humility, especially in the face of complexity. Otherwise, we become arrogant — and the real evil of arrogance is that it clouds our thinking. Knowing that we can be wrong leads us to think harder, better, and deeper about the problems that confront us.

The article linked above explores that arrogance in the context of autopsies, which are so rarely performed these days that they aren’t even counted at the national level. The whole article is worth reading.

Posted on December 20th, 2011 by Katxena