We like to think that people improve their judgment by putting their minds together, and sometimes they do. The studio audience at “Who Wants to Be a Millionaire” usually votes for the right answer. But suppose, instead of the audience members voting silently in unison, they voted out loud one after another. And suppose the first person gets it wrong.
If the second person isn’t sure of the answer, he’s liable to go along with the first person’s guess. By then, even if the third person suspects another answer is right, she’s more liable to go along just because she assumes the first two together know more than she does. Thus begins an “informational cascade” as one person after another assumes that the rest can’t all be wrong.
Because of this effect, groups are surprisingly prone to reach mistaken conclusions even when most of the people started out knowing better, according to the economists Sushil Bikhchandani, David Hirshleifer and Ivo Welch. (NYT)
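The mechanism described above is easy to see in a toy simulation. The sketch below is my own illustration, not anything from the NYT piece or the Bikhchandani–Hirshleifer–Welch paper: each "audience member" gets a private signal that is right 70% of the time, announces a guess out loud, and defers to the crowd once the running tally leans two or more votes one way (the names and parameters are all mine).

```python
import random

def run_cascade(n_agents=20, signal_accuracy=0.7, seed=None):
    """Toy simulation of a sequential-voting informational cascade.

    The true answer is 1. Each agent receives a private signal that is
    correct with probability `signal_accuracy`, then announces a guess:
    if previous announcements lean two or more votes one way, the agent
    follows the crowd and ignores the signal (the cascade); otherwise
    the agent follows the private signal.
    """
    rng = random.Random(seed)
    announcements = []
    for _ in range(n_agents):
        signal = 1 if rng.random() < signal_accuracy else 0
        # Net lead of 1-votes over 0-votes among earlier announcements.
        lead = sum(1 if a == 1 else -1 for a in announcements)
        if lead >= 2:
            guess = 1      # crowd outweighs any single private signal
        elif lead <= -2:
            guess = 0
        else:
            guess = signal  # tally too close to call; trust the signal
        announcements.append(guess)
    return announcements
```

Run it a few thousand times and a noticeable fraction of runs lock into the wrong answer forever, even though every individual's signal is better than a coin flip: two early wrong guesses are enough to make everyone after them fall in line.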
“Informational cascades”, as they’re called, are usually invoked in behavioural economics and analyses of group behaviour, but I think the idea might also apply to academe: to trends in academic thought and methodology, for instance, or even to the peer review process itself, where it might take some herculean and brave soul to point out that, so to speak, the Emperor has no clothes on. (I am thinking here of the Sokal incident, in which there were indeed no herculean and brave souls to point out that Emperor Sokal was parading around in the annals of academe, brazenly butt-naked.)
I’m also thinking of an incident, a little while back, in which I read a book about things I knew little of, written by someone I thought knew more than I did, well reviewed by people I thought knew more than I did, and which I consequently read with a kind of credulity that later turned out to have been horribly misplaced. It’s not quite an analogous situation, but there’s still a distinct ‘Chinese Whispers’ effect in reading an error-filled book and then disseminating error-filled information, and when the disseminator is someone dangerously influential, the resultant ‘cascade’ can be vast indeed. (See Bjørn Lomborg and his many errors.)
And I’m also thinking of Jared Diamond’s (highly contested) claim that agriculture was “The Worst Mistake in the History of the Human Race”, in which, according to him, that trailblazing First Person in the ‘Who Wants To Be A Millionaire’ of human prehistory, the one who decided agriculture was a good idea and told other people about it, did indeed turn out to be wrong. And people hopped on the bandwagon behind him, leading to a great cascade down to modern-day resource depletion, overconsumption and climate change. The difference here, which Diamond does discuss, is that the initially wrong idea was irresistibly wrong: once agriculture was introduced, it made less sense not to go along with it, because it yielded so many benefits. The point still remains, though, that groups are “surprisingly prone to reach mistaken conclusions”, which I guess is related to the “tragedy of the commons”.
I suppose there’s a happy medium to be struck between groupthink (conversation) and solipsism. (Historians, for example, have to engage with the literature before they strike out on their own.) The trick, which I have struggled with, is not to take the views of others too slavishly. And expecting everyone in the world to manage that, even in a non-academic capacity, is unrealistic, which, I guess, is what leads to diet fads, pedophile paranoia, creationism, and LEGGINGS.