
Using Computed Tomography (CT) Data to Build 3D Resources for Forensic Craniofacial Identification

NCJ Number
303195
Date Published
May 2021
Length
22 pages
Annotation

This chapter outlines the 3D resources that can be built from CT data for forensic craniofacial identification methods, including how to view 3D craniofacial CT data and modify surface models for 3D printing. 

Abstract

Forensic craniofacial identification encompasses the practices of forensic facial approximation (also known as facial reconstruction) and craniofacial superimposition within the field of forensic art in the United States. Training in forensic facial approximation methods has historically relied on plaster copies, high-cost commercially molded skulls, and photographs. Despite the increased accessibility of computed tomography (CT) and the numerous studies utilizing CT data to better inform facial approximation methods, 3D CT data have not yet been widely used to produce interactive resources or reference catalogs aimed at forensic art practitioner use or method standardization. Many free, open-source 3D software packages enable immersive study of the relationships between the craniofacial skeleton and facial features and facilitate collaboration between researchers and practitioners. 3D CT software, in particular, allows bone and soft tissue to be visualized simultaneously with tools such as transparency, clipping, and volume rendering of underlying tissues, allowing for more accurate analyses of bone-to-soft-tissue relationships. Analysis and visualization of 3D CT data can not only facilitate basic research into facial variation and anatomical relationships relevant for reconstructions but can also lead to improved facial reconstruction guidelines. Further, skull and face surface models exported in digital 3D formats allow for 3D printing of custom reference models and novel training materials and modalities for practitioners. (publisher abstract modified)
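The simultaneous visualization of bone and soft tissue described above rests on the fact that CT voxel intensities (Hounsfield units, HU) fall into characteristic ranges for air, soft tissue, and bone, so each tissue can be isolated by thresholding before surface extraction or volume rendering. The following minimal sketch, not taken from the chapter, illustrates the idea on a tiny made-up volume using NumPy; the HU cutoffs (+300 HU for bone, -100 to +100 HU for soft tissue) are illustrative values, and real workflows in 3D CT software would apply such thresholds before generating printable surface models.

```python
import numpy as np

# Hypothetical 2x2x2 CT patch in Hounsfield units (HU).
# Typical clinical ranges: air ~ -1000, soft tissue ~ -100..+100, bone >= +300.
ct = np.array(
    [[[-1000, 40], [500, 1200]],
     [[-1000, 30], [350, 20]]]
)

BONE_HU = 300                 # illustrative lower threshold for bone
SOFT_LO, SOFT_HI = -100, 100  # illustrative soft-tissue window

# Boolean masks selecting each tissue class; in practice these masks
# feed a surface-extraction step (e.g., marching cubes) whose mesh
# can be exported for 3D printing.
bone_mask = ct >= BONE_HU
soft_mask = (ct >= SOFT_LO) & (ct <= SOFT_HI)

print(int(bone_mask.sum()), int(soft_mask.sum()))  # voxel counts per class
```

Because the two masks are computed independently from the same volume, a viewer can render the soft-tissue surface semi-transparently over the bone surface, which is the basis of the transparency and clipping tools mentioned in the abstract.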
