In health, stated preference data from discrete choice experiments (DCEs) are commonly used to estimate discrete choice models, which are then used to forecast behavioral change, often with the goal of informing policy decisions. Data from DCEs are potentially subject to hypothetical bias. In turn, forecasts may be biased, yielding substandard evidence for policymakers. Bias can enter through both the elasticities and the model constants. Simple correction approaches using revealed preference data exist but are seemingly not widely used in health economics. We use DCE data from an experiment on smokers in the US. Real-world data are used to calibrate the scale of utility (in two ways) and the alternative-specific constants (ASCs); several innovations for calibration are proposed. We find that embedding revealed preference data in the model makes a substantial difference to the forecasts, and that how models are calibrated also makes a substantial difference.
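As an illustration of ASC calibration against revealed preference data, the sketch below applies the standard iterative adjustment for multinomial logit models (each constant is shifted by the log ratio of observed to predicted market shares until the shares match). This is a minimal sketch of the general technique, not the paper's specific procedure; the single representative-agent utilities, function names, and example numbers are all assumptions for illustration.

```python
import numpy as np

def logit_shares(asc, v):
    """Predicted MNL choice shares for systematic utilities v plus ASCs (toy, representative-agent version)."""
    expu = np.exp(asc + v)
    return expu / expu.sum()

def calibrate_ascs(asc, v, observed_shares, tol=1e-10, max_iter=1000):
    """Iteratively adjust ASCs so predicted shares match observed (revealed preference) shares:
       asc_k <- asc_k + ln(S_k / S_hat_k)."""
    asc = asc.astype(float).copy()
    for _ in range(max_iter):
        pred = logit_shares(asc, v)
        if np.max(np.abs(pred - observed_shares)) < tol:
            break
        asc += np.log(observed_shares / pred)
    return asc

# Hypothetical example: three tobacco products with stated-preference
# utilities v, recalibrated to observed market shares of 60/30/10%.
v = np.array([0.5, 0.0, -0.3])
observed = np.array([0.6, 0.3, 0.1])
asc = calibrate_ascs(np.zeros(3), v, observed)
```

In practice the predicted shares would be averaged over a heterogeneous sample rather than a single representative agent, but the update rule is the same.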
Buckell, J. & Hess, S. (2019), Stubbing out hypothetical bias: improving tobacco market predictions by combining stated and revealed preference data. Journal of Health Economics, forthcoming.