I want to introduce a new paradox. It’s not a strict paradox, but it is of practical concern:
1. The majority opinion in almost every field is more likely to be correct than your own, if your opinion deviates from it. This is true even if the group has no more raw data than you, because the aggregate reasoning of the group is likely to be better than yours.
2. If everyone reasoned, on the basis of (1), that they should simply adopt the majority opinion, the quality of the majority opinion would fall.
I take it as obvious that the above statements are usually true. (1) may not always hold: for example, you might be massively better informed and better at reasoning about a topic than everyone else with an opinion on it (though people will think this is true far more often than it is). It may feel like your reasons for believing P instead of ~P are overwhelmingly good, but no doubt the reasons for believing ~P feel overwhelmingly good to the other side.
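The statistical core of both claims can be illustrated with a small Condorcet-style simulation. This is a toy model of my own, not from the text: it assumes independent reasoners who are each right with probability 0.6, and the specific numbers are illustrative only.

```python
import random

random.seed(0)

def majority_accuracy(n_voters, p_correct, trials=10_000, herded=False):
    """Fraction of trials in which the majority vote is correct.

    If herded=True, everyone simply copies voter 0's opinion, so the
    'majority' carries no more information than one individual --
    this is what happens if everyone acts on (1).
    """
    correct = 0
    for _ in range(trials):
        if herded:
            votes = [random.random() < p_correct] * n_voters
        else:
            votes = [random.random() < p_correct for _ in range(n_voters)]
        if sum(votes) > n_voters / 2:
            correct += 1
    return correct / trials

print(majority_accuracy(101, 0.6))               # well above 0.6
print(majority_accuracy(101, 0.6, herded=True))  # roughly 0.6
```

Independent majorities dramatically outperform any single reasoner, which is claim (1); but when everyone defers, the majority is only as good as one reasoner, which is the mechanism behind claim (2).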
As several commentators have pointed out, this is all very context dependent. If you’re the only biologist, and everyone else has no scientific training and is a creationist, you probably shouldn’t be worried about the above paradox, because you’ll simply reject (1). The case we have in mind where this is a problem is that of an inquirer in a community of relative epistemic equals.
To dramatise the paradox: Galileo said that “In questions of science, the authority of a thousand is not worth the humble reasoning of a single individual.” This is wrong: the humble reasoning of a single individual is usually not that good. What is true is that without the humble reasoning of individuals and small groups, the authority of a thousand would not advance.
One very simple way forward here (I will not claim it is the best) is to create two sets of propositions: the set of propositions we ‘believe’ in the sense that we conduct our investigations on the basis of them and use them as the basis of our arguments within a community of inquiry, and a second set of propositions we ‘believe’ in the sense that, if asked what is all things considered most likely, we would assent to them. We populate the first category with propositions that seem to us to be true on the basis of all available evidence except the evidence of the judgements of others. We populate the second category with propositions that seem true on the basis of all available evidence, including the judgements of others.
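As a data-structure sketch, the partition might look like this. The field names and the Galileo example are my own labels for illustration, not the author's:

```python
from dataclasses import dataclass, field

@dataclass
class Inquirer:
    """Two belief sets, per the partitioning proposal.

    working: propositions that seem true on all evidence EXCEPT the
        judgements of others; these guide research and internal argument.
    all_things_considered: propositions that seem true on all evidence
        INCLUDING the judgements of others; these are what one would
        assent to if asked what is most likely.
    """
    working: set = field(default_factory=set)
    all_things_considered: set = field(default_factory=set)

galileo = Inquirer()
galileo.working.add("heliocentrism")              # his own reasoning favours it
galileo.all_things_considered.add("geocentrism")  # deferring to the thousand
```

The point of keeping two sets rather than one is that the same proposition can rationally appear in one and not the other, with no contradiction.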
There are some subtleties in the above model: you need to take the time to understand the judgements of others even when guiding your own research, because while the judgement itself may not count as evidence for the purposes of the first set of propositions, there might be evidence you are currently unaware of in the arguments given by others. We don’t, after all, want first-year chemistry students running off and setting up labs because they think they’ve found a way to turn lead into gold before they’ve had time to hear the dominant paradigm out.
Of course belief partitioning may not be psychologically viable for individuals. An alternative would be to give individuals tacit permission to engage in self-deception about the likelihood that they’ve grasped what the majority hasn’t.
And then there are hybrid models, where we put some weight on the reasoning of others, but not as much as it probably deserves. This would prevent too much ink and lucre from being spent on fringe ideas, while still alleviating the paradox of crowd judgement. As I said, I don’t claim to have an exactly correct solution here.
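One way to picture a hybrid rule is each inquirer mixing a private signal with the crowd's average, with a tunable deference weight. The sketch below is my own toy parameterisation (Gaussian private signals, a simple linear mix); it shows how the spread of working opinions, and so the range of leads the group pursues, shrinks as deference to the crowd rises.

```python
import random

random.seed(1)

def opinion_spread(n_agents, crowd_weight, trials=2_000):
    """Average spread (max - min) of working credences when each agent
    mixes a noisy private signal with the crowd mean.

    crowd_weight = 0 is pure individual reasoning (full diversity);
    crowd_weight = 1 is full deference (no diversity of leads at all).
    """
    total_spread = 0.0
    for _ in range(trials):
        private = [random.gauss(0.5, 0.15) for _ in range(n_agents)]
        crowd_mean = sum(private) / n_agents
        working = [crowd_weight * crowd_mean + (1 - crowd_weight) * s
                   for s in private]
        total_spread += max(working) - min(working)
    return total_spread / trials

print(opinion_spread(50, 0.0))  # full diversity of leads
print(opinion_spread(50, 0.8))  # most leads cluster near the crowd
```

A hybrid model, on this picture, is a choice of intermediate weight: enough deference to damp the fringes, enough residual independence to keep the group exploring.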
Part of orienting ourselves in this landscape is reflecting on our goals. Are we trying to be right, or are we trying to make the group we are a part of right? Traditional epistemology has assumed that the goal of the agent is, or should be, to have correct beliefs about the world, and to this end agents seek to form justified beliefs. What if instead we view our goal as trying to expand the knowledge of the group as a whole? This can be quite a liberating way of seeing things. Got some eccentric hobby-horse ideas? Excellent! Someone needs to follow those up. Act like you don’t know how much of a stab in the dark it is, or even fool yourself into believing they’re likely true if it helps: this will all be to the good as the cognitive reach of the group expands. We now enter the domain of Normative Social Epistemology: the study of reasoning for, and as a part of, a group of enquirers, with the aim of supporting that group in its collective search for truth.