This study used quantitative PCR to estimate variation in mitochondrial DNA (mtDNA) preservation across sections of 19 northern fur seal (Callorhinus ursinus) ribs dating to ∼3000 years ago. The researchers also developed a measure called the "density index" to gauge the relative densities of the rib sections studied and to determine whether density is an appropriate predictor of preservation.
Although recent forensic research has focused on determining which skeletal elements best preserve DNA over the long term, little attention has been paid to measuring intra-element variation. Moreover, there is a general belief that dense (cortical) bone contains better-preserved DNA than spongy (cancellous) bone. The study found that average preservation differed significantly among samples (ANOVA, p = 1.9 × 10−9), with only 15 percent of the total variance observed within samples; nevertheless, 12 of the 19 specimens (∼63.2 percent) exhibited at least an order-of-magnitude difference in mtDNA preservation across the element. Regression of the amount of mtDNA extracted per gram of bone material against the density index of the bone from which it was extracted demonstrated no relationship between these variables (R2 = 0.03, p = 0.28). (publisher abstract modified)