People don’t think as much as they should about the possibilities of a machine that could tell whether someone was lying. Partly this is because many people erroneously believe that such machines already exist. Partly it is because fictional depictions of universes in which lying is impossible are unpopular, lying being of key importance to many tropes in fiction. We should think about it, though, because there’s every chance that rapidly improving machine learning technology and brain scanning capabilities will meet in the middle sometime in the next four decades or so to give us a genuine lie detector.
If we had a machine that could determine whether people were lying or telling the truth right now, I can only assume it would be a disaster. The Supreme Court would probably rule that employers could use it on their employees, or some such awful thing. There would be no hope of stopping the three-letter agencies from using it. It could be what I call a critical social-technological point, locking us into a hierarchical society with little possibility of resistance. Awful to contemplate.
However, I can’t help but think that a different kind of society, a kinder, wiser society (not infinitely kind or wise, but kinder and wiser than ours), would gain enormously from the existence of a working lie detector.
Asymmetric information: markets & bureaucrats
As Stiglitz has shown, asymmetric information is one of the most common problems across a variety of markets, affecting sectors as diverse as health, education, real estate and second-hand car sales. How should a consumer choose between hospitals without being a doctor themselves? How many second-hand cars are either grossly over- or undervalued? A way of guaranteeing honesty could fix this.
On the other hand, if our hypothetical kinder and wiser society uses non-market-based solutions, whether for a particular sector or in general, the assurance of truthfulness would still be invaluable. Honesty in high-level decision-making and in interactions between bureaucrats on topics like quotas, targets and safety standards would transform governance.
Crime and the possibility of redemption
One of the reasons publicly acknowledged redemption is so rare in our society is that the recognition of redemption requires trust, and criminals are generally seen as untrustworthy. A mechanism that allowed us to verify the truthfulness of a criminal’s claim to remorse would be invaluable. Unfortunately, being genuinely remorseful is not always enough to guarantee you won’t offend again, but it’s a start. This might also shake society’s obsession with retribution.
If you’re like me and you suspect the false conviction rate is a lot higher than that comfortable phrase “beyond reasonable doubt” would imply, the reduction in false convictions will also be welcome. The true conviction rate would rise as well, since the only way to question the credibility of a witness who had undergone a screening would be to claim they were mistaken, which simply wouldn’t be plausible in many cases. Many types of crime that are intrinsically difficult to prove, e.g. abuse by the police, would be brought further into the open. One downside is that certain kinds of perpetrators might decide their only option was to kill victims and potential witnesses.
The possibility of an honest democracy
We would get politicians who truly, really believe what they say. I can see that there is a cynic’s case to be made that this is a bad thing, but I have enough faith in the voters to think not. Deliberately cynical manoeuvres like gerrymandering would become impossible, or would carry enormous political costs.
A lot of people will respond to the above claim about gerrymandering and other cynical manoeuvres with “oh, but we already know they do these things and it has no effect, so a lie detector wouldn’t change anything”. I disagree. There is a fundamental difference between knowledge founded on inference and knowledge founded on admission, a leak, or something like that. It may not be rational, but humans treat the two utterly differently. This is why there is a huge difference between saying something and dog-whistling it.
Interpersonal and social honesty as a building block for interpersonal and social empathy
This may sound bizarre or even paranoid, but there’s something quite frightening about the privacy of experience. Part of what forms that heavy barrier is the reality that someone could always be lying about their experiences. Even if the possibility seems remote, it’s there, and unlike other kinds of statements, there’s no way to check whether someone is honestly conveying what they feel. On a personal level, knowing that you were hearing the whole story from someone would allow a new depth of emotional intimacy.
At the social level, however, is where this effect would be strongest. People often talk about what it’s like to be in such and such social position, and such talk can literally change the world. One of the mental barriers people put up against this is telling themselves that these people are lying or not telling the full truth. A working lie detector could do away with this layer of protective armour against empathy, though of course others would remain. Much richer social conversations might be possible when the facts of experience weren’t under dispute.