Although the empirical and analytical study of terrorism has grown dramatically over the past decade and a half to incorporate more sophisticated statistical and econometric methods, data validity remains an open, first-order question. In particular, methods for treating missing data often rely on strong, untestable, and frequently implicit assumptions about the nature of the missing values. The current study drew on Manski's concept of no-assumption bounds to demonstrate how sensitive empirical results are to different strategies for treating missing cases. The article concludes by advocating that researchers, when building analytical models, be transparent about the assumptions they make about the nature of the data and about the implications of those assumptions for the analysis and its interpretation. (Publisher abstract modified)
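To illustrate the idea behind Manski's no-assumption (worst-case) bounds referenced above, the following is a minimal sketch, not code from the study itself: for a bounded outcome with missing values, the bounds on the mean are obtained by imputing the minimum possible value for every missing case (lower bound) and the maximum possible value (upper bound). The function name, variable names, and example data are hypothetical.

```python
import numpy as np

def manski_bounds(y, y_min=0.0, y_max=1.0):
    """No-assumption bounds on the mean of a bounded outcome when some
    values are missing (np.nan). The lower bound imputes y_min for every
    missing case; the upper bound imputes y_max."""
    y = np.asarray(y, dtype=float)
    observed = y[~np.isnan(y)]
    n = y.size
    n_missing = n - observed.size
    lower = (observed.sum() + n_missing * y_min) / n
    upper = (observed.sum() + n_missing * y_max) / n
    return lower, upper

# Hypothetical example: a binary indicator with 20% of cases missing.
# The width of the bounds equals the share of missing cases, which is
# what makes the sensitivity to missing-data treatment visible.
y = np.array([1, 0, 1, 1, np.nan, 0, 1, np.nan, 0, 1])
print(manski_bounds(y))  # (0.5, 0.7)
```

The point of the sketch is that no distributional or missing-at-random assumption is imposed; any single-number estimate produced by a particular missing-data treatment must fall inside these bounds, and the bounds widen as the proportion of missing cases grows.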