As submitted by the proposer:
The development of quantifiable measures of uncertainty in the conclusions of forensic analyses has led to the emergence of several ad hoc methods for approximating the weight of evidence (in particular for evidence types that are qualitative in nature, such as pattern evidence). The statistical rigor of these methods, as well as their reliability and accuracy, has not been studied.
In fact, some data and research papers suggest that:
1. Some of these methods do not provide an accurate quantification of the weight of forensic evidence;
2. Their statistical and computational complexity is not well understood;
3. These issues are magnified when handling high-dimensional evidence forms such as pattern evidence.
These issues hinder the adoption of these methods in casework and may foster mistrust, within the forensic and legal communities, of statistical inference about the sources of trace samples.
During this 3-year basic research project, we propose to study the validity, accuracy and computational complexity of methods designed to quantify the weight of complex evidence forms, such as pattern evidence and trace evidence. This project will be divided into four phases:
1. The validity, accuracy and computational complexity of two methods for quantifying uncertainty in forensic conclusions will be studied using a toy glass example, where the ground-truth weight of evidence is known;
2. Two frameworks for using similarity measures (as a means of reducing the dimensionality of the problem) in the quantification of the weight of evidence will be developed, and their reliability and accuracy will be studied using our toy example;
3. The two frameworks will be applied to a variety of complex forms of evidence (such as fingerprint, handwriting, toolmark and fiber evidence);
4. The reliability of the frameworks, when applied to these evidence types, will be studied as a function of sample and database size.
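The ground-truth comparison in phase 1 and the similarity-measure idea in phase 2 can be illustrated with a minimal sketch. Assuming a simple normal-normal model for glass refractive-index measurements (all parameters and function names below are illustrative assumptions, not the proposers' actual model or code), the true likelihood ratio has a closed form, and a score-based approximation built on the absolute difference between two measurements can be checked against it:

```python
# Hypothetical toy glass example (illustrative only, not the proposers' model):
# a normal-normal model gives a closed-form "ground truth" weight of evidence,
# against which a similarity-score-based approximation can be compared.
import math
import random

random.seed(1)

# Assumed (illustrative) parameters:
MU_POP, SD_POP = 1.518, 4e-3  # between-source distribution of refractive index
SD_WITHIN = 4e-5              # within-source measurement error

def _log_norm(x, mu, var):
    """Log density of N(mu, var) at x."""
    return -0.5 * math.log(2 * math.pi * var) - 0.5 * (x - mu) ** 2 / var

def true_log10_lr(x_control, x_recovered):
    """Ground-truth log10 likelihood ratio under the normal-normal model.

    Numerator: both measurements share one source mean drawn from the
    population, so they are jointly bivariate normal with common variance
    SD_POP^2 + SD_WITHIN^2 and covariance SD_POP^2.
    Denominator: independent sources (product of the marginals).
    """
    s2 = SD_POP ** 2 + SD_WITHIN ** 2
    c = SD_POP ** 2
    det = s2 * s2 - c * c
    a, b = x_control - MU_POP, x_recovered - MU_POP
    q = (s2 * a * a - 2 * c * a * b + s2 * b * b) / det
    log_num = -math.log(2 * math.pi) - 0.5 * math.log(det) - 0.5 * q
    log_den = _log_norm(x_control, MU_POP, s2) + _log_norm(x_recovered, MU_POP, s2)
    return (log_num - log_den) / math.log(10)

def simulate_scores(n, same_source):
    """Similarity scores (absolute differences) for simulated pairs."""
    scores = []
    for _ in range(n):
        theta1 = random.gauss(MU_POP, SD_POP)
        theta2 = theta1 if same_source else random.gauss(MU_POP, SD_POP)
        x1 = random.gauss(theta1, SD_WITHIN)
        x2 = random.gauss(theta2, SD_WITHIN)
        scores.append(abs(x1 - x2))
    return scores

def _kde(scores, x):
    """Gaussian kernel density estimate, Silverman's rule-of-thumb bandwidth."""
    n = len(scores)
    mean = sum(scores) / n
    sd = math.sqrt(sum((s - mean) ** 2 for s in scores) / (n - 1))
    bw = 1.06 * sd * n ** (-1 / 5)
    return sum(math.exp(-0.5 * ((x - s) / bw) ** 2) / (bw * math.sqrt(2 * math.pi))
               for s in scores) / n

# Reference databases of same-source and different-source scores.
SAME = simulate_scores(2000, True)
DIFF = simulate_scores(2000, False)

def score_lr(x_control, x_recovered):
    """Score-based (dimension-reducing) approximation of the likelihood ratio."""
    delta = abs(x_control - x_recovered)
    return _kde(SAME, delta) / _kde(DIFF, delta)
```

Here `score_lr` collapses each comparison to a single similarity score before estimating a likelihood ratio; how faithfully such an approximation tracks the true weight of evidence, and how its accuracy depends on the size of the simulated score databases, is precisely the question phases 2 and 4 propose to study.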
This project will deliver several products with broad applications to forensic sciences:
1. A series of recommendations on the computational techniques and dataset sizes required to ensure the reliability and accuracy of methods for approximating the weight of complex forms of evidence;
2. Two frameworks for quantifying the weight of forensic evidence that can be generalized to any forensic field in which the level of similarity between pairs of samples can be quantified;
3. A free statistical library (with examples, data and documentation) for the R software environment, supporting the interpretation of forensic evidence.