Note:
This awardee has received supplemental funding. This award detail page includes information about both the original award and supplemental awards.
Award Information
Awardee
Award #
2015-R2-CX-0028
Funding Category
Continuation
Status
Closed
Funding First Awarded
2015
Total funding (to date)
$74,560
Description of original award (Fiscal Year 2015, $33,513)
As submitted by the proposer:
The development of quantifiable measures of uncertainty in forensic conclusions has resulted in the emergence of several ad hoc methods for approximating the weight of the evidence (WoE) [1]. In particular, following developments in the field of biometry in the 1990s, forensic researchers have attempted to use similarity measures, or scores, to simplify the approximation of the weight of high-dimensional and complex evidential data. Score-based methods have been proposed for numerous evidence types, such as fingerprints, handwriting, inks, controlled substances, firearms, and voice analysis. Researchers have designed different score-based statistics to approximate the weight of evidence. In general, score-based methods consider the score as a projection onto the real line and focus on different sampling distributions of the score. We can show that score-based methods do not converge to, or share the same basic properties as, the true WoE. Overall, there is currently some controversy over whether these score-based methods are statistically sound and whether they might overestimate the weight of evidence, thus resulting in prejudicial forensic conclusions. Given the convenience and power of score-based methods for reducing the dimensionality of complex evidential forms and the resulting widespread interest from the forensic community, there is a significant risk that these methods will have a long-lasting negative impact on the United States criminal justice system.

Another type of statistic takes advantage of the data-reduction abilities of similarity measures but exhibits properties similar to those of the WoE. Kernel-based methods consider the transformation of the entire feature space using the score as a kernel function. Gantz and Saunders [2] and Lock and Morris [3] have proposed some initial developments for these methods. During this 2-year research project, we propose to:
1. Systematically study the (lack of) theoretical and empirical convergence of score-based methods to the true WoE in examples that can be derived analytically;
2. Review the appropriateness of different metrics as kernel functions (with respect to forensic problems);
3. Explore the theoretical foundations of kernel-based methods as an alternative for quantifying the WoE using similarity measures, and propose a set of requirements ensuring their convergence.
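As a toy illustration of the issue targeted by objective 1, the sketch below compares, in a simple univariate normal model, the exact log likelihood ratio (Good's log-odds weight of evidence) with a score-based version that first reduces two measurements to a single distance score. All model parameters, distributions, and function names are illustrative assumptions and are not taken from the project.

import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
tau, sigma = 2.0, 1.0  # assumed between-source and within-source standard deviations

def true_log_woe(x, y):
    # Under the same-source hypothesis, (x, y) are jointly normal with
    # variance tau^2 + sigma^2 each and covariance tau^2; under the
    # different-source hypothesis they are independent.
    var = tau ** 2 + sigma ** 2
    cov_same = np.array([[var, tau ** 2], [tau ** 2, var]])
    cov_diff = np.array([[var, 0.0], [0.0, var]])
    z = np.array([x, y])

    def log_density(z, cov):
        det = np.linalg.det(cov)
        quad = z @ np.linalg.solve(cov, z)
        return -0.5 * (np.log((2.0 * np.pi) ** 2 * det) + quad)

    return log_density(z, cov_same) - log_density(z, cov_diff)

def simulate_scores(n, same_source):
    # Draw pairs of measurements and reduce each pair to the score |x - y|.
    mu1 = rng.normal(0.0, tau, n)
    mu2 = mu1 if same_source else rng.normal(0.0, tau, n)
    return np.abs(rng.normal(mu1, sigma) - rng.normal(mu2, sigma))

# Score-based approximation: estimate the score densities under each
# hypothesis and form a likelihood ratio of the score alone.
kde_same = gaussian_kde(simulate_scores(20000, True))
kde_diff = gaussian_kde(simulate_scores(20000, False))

def score_log_woe(x, y):
    s = abs(x - y)
    return float(np.log(kde_same(s)[0] / kde_diff(s)[0]))

x_obs, y_obs = 3.0, 2.6  # arbitrary illustrative measurements
print("true log WoE:       ", round(float(true_log_woe(x_obs, y_obs)), 3))
print("score-based log WoE:", round(score_log_woe(x_obs, y_obs), 3))

In this toy model the score discards the absolute location of the two measurements, which is precisely the information the true WoE exploits, so the two quantities generally disagree; this is the kind of discrepancy the project proposes to study systematically.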
The aim of this project is to deliver two main products with applications to forensic science:
1. A Ph.D. thesis (and relevant computer code and data) developing and justifying a set of requirements ensuring the convergence of kernel-based methods to the WoE;
2. A series of peer-reviewed publications and presentations based on the Ph.D. research program.
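Objective 2 above concerns whether a given similarity metric can legitimately serve as a kernel function. One standard requirement, sketched below under purely illustrative assumptions (the scores, data, and function names are not taken from the project), is that the matrix of pairwise scores be symmetric and positive semi-definite.

import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(10, 3))  # ten toy feature vectors in three dimensions

def rbf_score(a, b, gamma=0.5):
    # A similarity score known to be a valid (Mercer) kernel.
    return np.exp(-gamma * np.sum((a - b) ** 2))

def neg_distance_score(a, b):
    # A similarity score that is generally NOT positive semi-definite:
    # its Gram matrix has a zero diagonal and negative off-diagonal entries.
    return -np.linalg.norm(a - b)

def gram_matrix(score, X):
    n = len(X)
    return np.array([[score(X[i], X[j]) for j in range(n)] for i in range(n)])

for name, score in [("RBF score", rbf_score), ("negative distance", neg_distance_score)]:
    eigvals = np.linalg.eigvalsh(gram_matrix(score, X))  # matrix is symmetric
    verdict = "usable as a kernel" if eigvals.min() >= -1e-10 else "not positive semi-definite"
    print(f"{name}: smallest eigenvalue {eigvals.min():.4f} -> {verdict}")

Positive semi-definiteness is only one of the conditions such a review would need to examine; it is shown here simply to make the notion of an appropriate metric as a kernel function concrete.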
[1] Good, I. J. (1950). Probability and the Weighing of Evidence, 1st ed. London: Charles Griffin.
[2] Gantz, D., & Saunders, C. P. (2014). Quantifying the Effects of Database Size and Sample Quality on Measures of Individualization Validity and Accuracy in Forensics. National Institute of Justice, final grant report for award 2009-DN-BX-K234.
[3] Lock, A. B., & Morris, M. D. (2013). Significance of Angle in the Statistical Comparison of Forensic Tool Marks. Technometrics, 55(4), 548-561.
This project contains a research and/or development component, as defined in applicable law.
Date Created: September 15, 2015