Friday, July 22, 2011

The Wisdom (???) of Crowds

The notion that markets aggregate the opinions of many and thereby make superior estimations of value has a very long history. It's certainly at the root of the infamous Efficient Markets Hypothesis, which claims that markets gather and process information so efficiently that price movements have no predictable patterns and the prices of financial instruments always reflect something very close to the true fundamental value of the assets in question. More recently, The Wisdom of Crowds has been the driving force behind prediction markets. In one way or another, this notion lurks behind the slippery and insidious idea that "markets know best" and that pretty much everything from water distribution to higher education should be organized as a market.

But in his bestselling book on the topic, James Surowiecki was somewhat careful at the outset to acknowledge that the idea only works in some rather special situations (not that readers paid much attention). A crowd estimating the number of marbles in a jar or the correct price of a stock will only get superior results -- superior in accuracy to the guess of any one individual, and even of experts -- if the people are on average unbiased in their estimates; it won't work if they tend systematically to estimate too high or low. Moreover, the people have to make their estimates independently of one another. Any kind of social influence, one person copying or even being slightly swayed by the actions of another, also spoils the result. Wise crowds very quickly become dumb herds.
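To see why those two conditions matter, here is a minimal simulation sketch (my own illustration, not from any study; the jar size, noise level, and bias are made-up values) comparing the crowd's average guess with a typical individual guess, first with independent, unbiased errors and then with a bias shared by everyone:

```python
import numpy as np

rng = np.random.default_rng(0)
truth = 1000        # "marbles in the jar" -- an illustrative value
n_people = 100
n_trials = 2000

def crowd_vs_individual(bias=0.0, noise=300.0):
    """Mean absolute error of the crowd average vs. a typical individual guess."""
    # Each guess = truth + a bias shared by everyone + independent personal noise.
    guesses = truth + bias + rng.normal(0.0, noise, size=(n_trials, n_people))
    crowd_error = np.abs(guesses.mean(axis=1) - truth).mean()
    individual_error = np.abs(guesses - truth).mean()
    return crowd_error, individual_error

print("unbiased, independent guesses:", crowd_vs_individual(bias=0.0))
print("same crowd with a shared bias:", crowd_vs_individual(bias=200.0))
```

With unbiased, independent noise the crowd average beats the typical individual by roughly a factor of the square root of the crowd size; add a shared bias and the crowd's error never falls below that bias, no matter how many people you poll.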

For an idea of such broad influence, it's surprising how few experiments have probed the boundary where wise crowds become unwise -- how it happens and which effects matter most. That gap has now been addressed by an impressive set of experiments carried out by Jan Lorenz and colleagues at ETH Zurich, published recently in PNAS. Their idea was to take a crowd of 144 student volunteers and have them perform estimation tasks under a range of conditions. They gave the participants monetary incentives to estimate accurately, and chose questions (on things like geography and crime statistics) for which the true answers are known. In some trials, participants made their estimates on their own, with no idea of what others had estimated; in other trials, they were either shown the others' estimates in full detail or given only aggregate information, such as the average of the others' estimates. The point was to compare how well the crowd estimated in the absence and presence of social influence.

What the results show is that social influence totally undermines the wisdom-of-crowds effect, and does so in three specific ways. It's worth considering these in some detail to see just how the whole "wise crowd" illusion falls apart in the face of a little social influence (a toy simulation after the list illustrates the first two effects):

1. In what the researchers call the “social influence effect,” the mere act of hearing the judgements of others led to a marked decrease in the diversity of the participants' estimates. That is, the estimates of the various people became more like one another -- people adjusted their views to fit more closely with others -- but this did very little to improve the collective accuracy of the crowd. In effect, people think they are sharing information, but little information actually gets shared. The figure below illustrates what happens: over successive trials, a measure of the group's opinion diversity drops dramatically if people hear either full or averaged information about the others' estimates, while the collective error decreases only marginally.


2. A second and even more interesting effect is what the researchers call the “range reduction effect.” Imagine that a government tries to use the wisdom of crowds, assembling a group and surveying their opinions, hoping to get a range of views and some sense of how much consensus there is on some topic. You would hope that, if the crowd's estimate were NOT accurate, this inaccuracy would show up as a wide range of individual estimates -- the wide range would signal a lack of unanimity and confidence. A truly bad outcome would be a crowd that at once gives a very inaccurate estimate and does so with a narrow spread of opinions, signalling strong apparent certainty in the result. But this is precisely what the research found: in the social influence conditions, the individuals' estimates didn't "bracket" the true answer, with some being higher and others lower. Rather, the group narrowed the range of its views so strongly that the truth tended to lie outside the group's range -- the crowd was both inaccurate and apparently confident at the same time.

3. Finally, and worst of all, is the “confidence effect.” The researchers asked the participants in the different conditions how confident they were in the accuracy of the group's final consensus estimate. Social influence, while it didn't make the crowd's estimate any more accurate, did fill the participants with strong confidence in its improved accuracy. Think 2005: the housing bubble, mortgages issued with no income and no assets, and so on. Hard as it is now to imagine, most people believed the market could not fail to keep going up. And they believed it in large part because they saw others apparently believing the same thing.
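To make the first two effects concrete, here is a toy simulation (my own sketch, not the authors' model or data): each round, every participant shifts a fraction alpha of the way toward the current group mean, a crude stand-in for social influence. The starting spread, the shared bias, and the value of alpha are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
truth = 100.0                            # the quantity being estimated (illustrative)
n_people, n_rounds, alpha = 50, 8, 0.4   # alpha: pull toward the group mean each round

# Start from noisy, somewhat biased individual estimates. The shared bias is the
# part social influence cannot repair, since everyone drifts toward the same consensus.
estimates = truth * rng.lognormal(mean=0.3, sigma=0.6, size=n_people)

for r in range(n_rounds + 1):
    diversity = estimates.std()
    collective_error = abs(estimates.mean() - truth)
    bracketed = estimates.min() <= truth <= estimates.max()
    print(f"round {r}: diversity={diversity:6.1f}  "
          f"collective error={collective_error:5.1f}  truth bracketed: {bracketed}")
    # "Social influence": everyone shifts part of the way toward the group mean.
    estimates = (1 - alpha) * estimates + alpha * estimates.mean()
```

Because this update rule leaves the group mean untouched, the collective error never improves, while the spread of estimates shrinks by a factor of (1 - alpha) every round; within a couple of rounds the true value typically falls outside the group's range. That is the same combination of confident-looking consensus and inaccuracy described in the first two effects above, produced by nothing more than everyone edging toward the average.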

Altogether, this careful study points more toward the idiocy of crowds than their wisdom. Social influence is hard to eradicate. Even in markets, supposedly driven by anonymous individuals making their own estimates, lots of people are reading the newspapers and news feeds and listening to analysts, and, even when they're not, they're watching price movements and using them to infer whether someone else knows something they don't. In these experiments, social influence makes everyone think and do much the same thing, makes it likely that the consensus view lands well wide of the actual truth, and, perversely, makes everyone involved increasingly confident that the group knows what it's doing. Some kind of Wisdom.
