This awardee has received supplemental funding. This award detail page includes information about both the original award and supplemental awards.
Description of original award (Fiscal Year 2015, $33,513)
As submitted by the proposer: The development of quantifiable measures of uncertainty in forensic conclusions has resulted in the emergence of several ad hoc methods for approximating the weight of evidence (WoE). In particular, following developments in the field of biometrics in the 1990s, forensic researchers have attempted to use similarity measures, or scores, to simplify the approximation of the weight of high-dimensional and complex evidential data.
Score-based methods have been proposed for numerous evidence types, such as fingerprints, handwriting, inks, controlled substances, firearms, and voice. Researchers have designed different score-based statistics to approximate the weight of evidence. In general, score-based methods treat the score as a projection of the evidence onto the real line and focus on different sampling distributions of the score.
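The projection idea above can be illustrated with a minimal sketch. This is not any specific published method: the Gaussian feature model, the Euclidean-distance score, and the sample sizes are all illustrative assumptions. The sketch estimates the sampling distribution of the score under the same-source and different-source hypotheses and forms a score-based likelihood ratio at the observed score.

```python
# Hypothetical sketch of a score-based likelihood ratio, one common ad hoc
# approximation of the WoE. Assumptions: Gaussian features, a Euclidean
# distance score, and kernel density estimates of the two score distributions.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

def score(x, y):
    """Similarity score: Euclidean distance between two feature vectors."""
    return np.linalg.norm(x - y)

# Simulate high-dimensional "evidence" features for many sources.
n_sources, dim = 200, 50
sources = rng.normal(size=(n_sources, dim))   # mean feature vector per source

def sample(src):
    """A noisy measurement of a source's features."""
    return src + 0.3 * rng.normal(size=dim)

# Sampling distribution of the score under H_same (same source)...
same = np.array([score(sample(s), sample(s)) for s in sources])
# ...and under H_diff (different sources).
idx = rng.permutation(n_sources)
diff = np.array([score(sample(sources[i]), sample(sources[j]))
                 for i, j in zip(range(n_sources), idx) if i != j])

# The score-based statistic projects the (50-dimensional) evidence onto the
# real line via the score, then compares the two estimated score densities
# at the observed value.
f_same, f_diff = gaussian_kde(same), gaussian_kde(diff)
observed = score(sample(sources[0]), sample(sources[0]))  # a same-source pair
slr = f_same(observed)[0] / f_diff(observed)[0]
print(f"score-based LR at observed score {observed:.2f}: {slr:.2f}")
```

Note that the entire comparison is carried by the one-dimensional score distributions; the original feature space plays no further role, which is precisely the dimensionality reduction the abstract describes.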
During the first year of this project, we showed that score-based methods do not satisfy a set of desirable properties analogous to those of the true WoE. We found that these methods are not statistically sound and will unreliably overestimate or underestimate the weight of evidence, potentially resulting in prejudicial forensic conclusions. Given the convenience and power of score-based methods for reducing the dimensionality of complex evidential forms, and the resulting widespread interest from the forensic community, there is a significant risk that these methods will have a long-lasting negative impact on the United States criminal justice system.
Another type of statistic takes advantage of the data-reduction abilities of similarity measures while exhibiting properties similar to those of the WoE. Kernel-based methods consider the transformation of the entire feature space using the score as a kernel function. Gantz and Saunders and Lock and Morris have proposed some initial developments for these methods.
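To contrast with the score-based projection, the following sketch shows one generic way a score can act as a kernel. This is an illustration, not the cited authors' method: the Gaussian kernel built from the score, the bandwidth h, the squared maximum mean discrepancy (MMD) statistic, and the toy data are all assumptions made for this example.

```python
# Illustrative sketch: treating a similarity score as a kernel,
# k(x, y) = exp(-score(x, y)^2 / (2 h^2)), implicitly maps the whole feature
# space into a reproducing-kernel Hilbert space rather than projecting it
# onto the real line. Here two controls are compared to a questioned sample
# via the distance between kernel mean embeddings (squared MMD).
import numpy as np

def score(x, y):
    return np.linalg.norm(x - y)  # similarity score (a distance)

def kernel(x, y, h=1.0):
    return np.exp(-score(x, y) ** 2 / (2 * h ** 2))

def mmd2(X, Y, h=1.0):
    """Squared maximum mean discrepancy between two samples of feature
    vectors; small values are consistent with a common source."""
    kxx = np.mean([kernel(a, b, h) for a in X for b in X])
    kyy = np.mean([kernel(a, b, h) for a in Y for b in Y])
    kxy = np.mean([kernel(a, b, h) for a in X for b in Y])
    return kxx + kyy - 2 * kxy

rng = np.random.default_rng(1)
trace = rng.normal(0, 1, size=(20, 5))      # questioned sample
same_src = rng.normal(0, 1, size=(20, 5))   # control from the same population
other_src = rng.normal(2, 1, size=(20, 5))  # control from a shifted population
print(mmd2(trace, same_src) < mmd2(trace, other_src))  # expect True
```

The design point is that the statistic depends on all pairwise scores between and within the samples, so information about the feature-space geometry is retained instead of being collapsed into a single projected value.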
During the remainder of this research project, we propose to:
1. Review the appropriateness of different metrics as kernel functions (with respect to forensic problems);
2. Explore the theoretical foundations of kernel-based methods as an alternative for quantifying the WoE using similarity measures, and propose a set of requirements ensuring their convergence.
The aim of this project is to deliver two main products with applications to forensic science:
1. A Ph.D. thesis (and relevant computer code and data) developing and justifying a set of requirements ensuring the convergence of kernel-based methods with the WoE;
2. A series of peer-reviewed publications and presentations based on the Ph.D. research program.
Good, I. J. (1950), Probability and the Weighing of Evidence, 1st ed., London: Charles Griffin.
Gantz, D., Saunders, C. P. (2014), Quantifying the Effects of Database Size and Sample Quality on Measures of Individualization Validity and Accuracy in Forensics, National Institute of Justice, Final Grant Report for award 2009-DN-BX-K234.
Lock, A. B., Morris, M. D. (2013), Significance of Angle in the Statistical Comparison of Forensic Tool Marks, Technometrics, 55(4), 548-561.
This project contains a research and/or development component, as defined in applicable law.