Bayesian experts who share a common prior but are exposed to different evidence often make contradictory probabilistic forecasts. An aggregator who receives the forecasts must combine them into a single forecast as well as possible. This is challenging whenever the aggregator is unfamiliar with the prior or with the model and evidence available to the experts. We propose a model in which experts provide forecasts over a binary state space. We adapt the notion of regret as a means of evaluating schemes that aggregate their forecasts into a single forecast. Our results show that arbitrary correlation between the experts entails high regret, whereas if there are two experts who are Blackwell-ordered (i.e., one of the experts is more informed) or who have conditionally independent evidence, then the regret is surprisingly low. For these latter cases we construct (nearly) optimal aggregation schemes.
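To make the conditionally independent case concrete, here is a minimal sketch of the classical Bayesian pooling rule for two forecasts when the common prior is known. The helper `aggregate_ci` is hypothetical and is not the paper's aggregation scheme: the point of the paper is precisely that the aggregator does not know the prior, so the optimal pooling below serves only as the benchmark against which regret would be measured.

```python
def aggregate_ci(x: float, y: float, prior: float) -> float:
    """Pool two forecasts x, y in (0, 1) from experts whose evidence is
    conditionally independent given the binary state, assuming the common
    prior is known (a benchmark, not the paper's prior-free scheme)."""
    # Work in odds space: each expert's posterior odds equal the prior
    # odds times that expert's likelihood ratio, so under conditional
    # independence the combined odds are (odds_x * odds_y) / prior_odds.
    prior_odds = prior / (1 - prior)
    odds = (x / (1 - x)) * (y / (1 - y)) / prior_odds
    return odds / (1 + odds)
```

Note that an uninformative second forecast (`y == prior`) leaves the first forecast unchanged, e.g. `aggregate_ci(0.8, 0.5, 0.5)` returns `0.8`, and two moderately confident forecasts reinforce each other: `aggregate_ci(0.8, 0.7, 0.5)` exceeds both inputs.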