Learning from Shared News: When Abundant Information Leads to Belief Polarization
Abstract: We study social learning in networks where agents share information selectively. Contrary to standard convergence results, we show that if agents hold even minor misperceptions about this selective sharing, then allowing the quantity of external information to grow indefinitely can lead to divergence of beliefs. Specifically, if the quality of external information is sufficiently low, agents' friends in the network are able to bias their beliefs towards any preferred state. Consequently, to avoid polarization, it is optimal to aggregate external signals into less frequent but more precise communications.