Asking one question and answering another: when decisions and statistical analysis are not aligned.
In this paper, we present findings from an experiment testing how a mismatch between the question decision-makers ask and the question a statistical analysis answers affects the choices decision-makers make and their comfort with those choices. We asked U.S. adults recruited from Amazon Mechanical Turk to decide whether to adopt a fictional education technology based on results from a study of synthetic data. Participants were randomly assigned to see findings from one of four analytic approaches: a traditional frequentist analysis using confidence intervals; a frequentist analysis with two one-sided tests; a Bayesian approach presenting the posterior probabilities of a positive or a negative impact; and a Bayesian approach presenting the posterior probabilities of a negative, a non-meaningful, and a positive impact. Each approach presented findings based on the same synthetic data, in which the true effect was statistically significant but not meaningful. Our experiment shows that the decision to adopt the technology differs depending on which findings decision-makers see, as does their level of comfort with that decision. The choices made under the first three approaches differ from those made under the fourth, the only approach that directly answers the question decision-makers are asking.
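To make the contrast between the four approaches concrete, here is a minimal sketch, not taken from the paper, of how each would summarize the same estimate. All numbers are assumptions chosen for illustration: an estimated effect of 0.05 standard deviations, a standard error of 0.02, and a meaningful-effect threshold of 0.10. The Bayesian summaries use a flat prior with a normal posterior.

```python
# Hypothetical illustration (not the paper's actual analysis or data):
# four ways to summarize the same estimate when the true effect is
# statistically significant but smaller than a meaningful threshold.
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Assumed numbers for illustration only.
est, se, delta = 0.05, 0.02, 0.10  # estimate, std. error, meaningful threshold

# 1. Frequentist 95% confidence interval: excludes zero, so "significant",
#    even though the whole interval lies below the meaningful threshold.
ci = (est - 1.96 * se, est + 1.96 * se)

# 2. Two one-sided tests (TOST) for equivalence within (-delta, delta):
#    both one-sided p-values must be small to conclude equivalence.
p_lower = 1.0 - norm_cdf((est + delta) / se)  # H0: effect <= -delta
p_upper = norm_cdf((est - delta) / se)        # H0: effect >= +delta
p_tost = max(p_lower, p_upper)

# 3. Bayesian, two regions: posterior probability the impact is positive.
p_positive = 1.0 - norm_cdf((0.0 - est) / se)

# 4. Bayesian, three regions: negative, non-meaningful, and meaningful
#    positive impact -- the summary that matches the decision being made.
p_negative = norm_cdf((-delta - est) / se)
p_nonmeaningful = norm_cdf((delta - est) / se) - p_negative
p_meaningful = 1.0 - norm_cdf((delta - est) / se)
```

Under these assumed numbers, the first three summaries all look favorable (a significant interval, equivalence rejected in neither direction, a high probability of a positive impact), while the fourth makes plain that the impact is almost certainly too small to matter, which is the question the decision-maker is actually asking.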
Ignacio Martinez (Ph.D., Economics, University of Virginia) is an Economist on the Chief Economist's Team at Google. Before joining Google, he was a researcher at Mathematica. His focus is helping decision-makers use data to inform the choices they make. In his work, he uses Bayesian methods, including hierarchical models and Bayesian Additive Regression Trees (BART). In this presentation he will be speaking on his own behalf and not on behalf of Google or Alphabet.