James Doyle | Lynn Garcia | Scott Shappell | Jenifer Smith | Peter Stout | Anthony Tessarolo | Linzi Wilson-Wilde | Craig Beyler | Dawn A. Boswell | Mehzeb Chowdhury | Sarah Chu | Sabrina Cillessen | Lynn Garcia (Panel 1) | Lynn Garcia (Panel 2) | Melissa Gische | Heike Hoffman | Vici Inlow | Amy Jeanguenat 1 | Amy Jeanguenat 2 | Jim Jones | Karen Kafadar | Roman Karas | Paulo Kunii (FHE Methodology) | Elizabeth Laposata | Ranjan Maitra | Marcel Matley | Ashraf Mozayani 1 | Ashraf Mozayani 2 | Robert Nelms | Jennifer Newman | Anthony Onorato | Imani Palmer | Sandra Rodriguez-Cruz | Sandra Sachs | Ivana Sesardic | Carolyn R. Steffen | Harish Swaminathan | Nicholas Tiscione | Richard Torres
Take a Cue From NASA: The Real Value of 3D and 360 Imaging in Crime Scene Investigation and the Investigative Process
Courts have traditionally relied on forensic science units to produce visual evidence in court as an alternative to crime scene visits. Crime scene investigators (CSIs) gather and use evidence to reconstruct the precise sequence of events that occurred during the course of a crime. Photography and sketching are part of this reconstruction process, with the latter still largely done by hand. Photos give a limited picture of the crime scene, restricted by the photographer's field of view and subject to their interpretation of the scene and the importance they place on different pieces of evidence. Video can capture more of the scene but is still limited in its field of view. Sketches lay out the scene in a way that neither photographs nor videos can: they provide a general overview of the scene and the precise and relative locations of evidence. But they also give an inherently less realistic representation of the crime scene, shaped even more by the artist's interpretation. Similarly, photos and videos can be turned into 3D computer animations, but these too are subjective and can even be tailored to support the case of whichever side is presenting them. Are there any alternatives? In this paper, I explore the latest innovations in 3D and 360° imaging at crime scenes, as well as the scope and feasibility of adapting commercial technology to criminal justice work. Could artificial intelligence, robotics, and virtual and augmented reality help us become better CSIs? This presentation will show how, using the latest of these innovations, crime scene managers could review an entire investigation as if they were present at the crime scene, evaluating the performance of individual crime scene examiners and providing feedback accordingly. As a means of overall quality control within crime scene settings, it could prove invaluable.
360° video could also provide an extra layer of protection for crime scene examiners if the credibility and reliability of their professional judgment and acumen is called into question during a trial. If the recording device is placed at a vantage point from which the entire crime scene can be seen, it can be argued that the crime scene team has nothing to hide, lending transparency to the process of evidence identification, collection, and storage. In error-management systems, 360° imaging could allow early detection of flawed techniques and serve as a useful tool for promoting and showcasing good practice. These issues and others will be investigated by closely tying broad scientific innovations to their possible utility in crime scene work. These technologies could not only allow investigators to revisit crime scenes as they were at the time of the initial forensic examination, but also add a further dynamic to the scene documentation process that may be invaluable in contextualizing both physical and behavioral evidence.
Error Reduction in Death Investigation: A Defined Role for the Forensic Pathologist in Scientific Analysis and Reasoning Beyond the Autopsy
Errors can occur at many levels during investigation of death. These include crime scene processing and documentation (bloodstain pattern recognition, photography, and evidence collection); laboratory work (DNA, toxicology, trace evidence analysis); and postmortem examination (documentation of all that is possible and necessary regarding the body). However, on a larger scale, final integration of all these elements into a cohesive picture of the incident under investigation that ensures a fair and morally right outcome also requires methods that are subject to error. The forensic pathologist, trained in medicine and scientific analysis and accustomed to developing differential diagnoses, is in a unique position to ensure competent death investigation by fully participating with colleagues in the integration of all scientific findings and abandoning a mindset that limits his or her role to examination of the body. To identify specific subject areas and develop a working model to integrate the forensic pathologist more closely with the death investigation process, 500 cases referred to the independent consulting firm, Forensic Pathology & Legal Medicine, Inc., Providence, RI, were examined. Presentation of cases examined will illustrate particular areas where integration of the forensic pathologist into the death investigation is of substantial benefit and can prevent errors. These cases will focus on the analysis of infant deaths, sharp force and firearm deaths, and deaths in custody. Sequencing of events, identifying substantial intervening causes and events, integrating trace evidence analysis and correlating witness statements with events in the death investigation are areas of particular importance for scrutiny by the forensic pathologist. Further, involvement of the forensic pathologist early in the investigation will serve to develop lines of inquiry that result in the best and most economic utilization of testing resources. 
A triage menu has been developed to identify cases that will benefit most from such analysis by the forensic pathologist. Finally, knowledge of the types of errors in thinking that can subtly infiltrate, contaminate, and limit the comprehensive analysis of death by the forensic pathologist is also required. These include understanding the differences between inductive and deductive reasoning, recognizing the tunnel vision of groupthink, being aware of the possible existence of silos of sequestered information, and acknowledging so-called "wicked problems" that may be impossible to solve because of various factors. As a medical doctor, the forensic pathologist is also familiar with the cognitive biases associated with patient-oriented medical decision-making and can apply these skills to death investigation.
When an item of evidence is submitted to a seized-drug laboratory, one of the first tests performed is the determination of net weight; that is, the weight of the actual material, not including wrappers or containers. These measurements are critical, as they can significantly influence or outright determine sentencing outcomes. As such, laboratories should have procedures in place to evaluate the factors contributing to the variability of net weight determinations so they can accurately assess, document, and report the uncertainty associated with any net weight measurement. This presentation will summarize and discuss the policy and procedures implemented throughout the DEA laboratory system. Because the system comprises eight separate laboratories and more than 250 analysts, the DEA developed its methodology for estimating net weight uncertainties as a "quasi-budget" approach using laboratory system-wide balance calibration and performance verification data. This allows for the assessment of the total variability expected across all laboratories, including different weighing equipment, users, environments, reference weight standards, and weighing procedures, among other factors. As expected, the uncertainty associated with a particular net weight measurement also depends significantly on how the measurement was performed and how the total net weight was calculated. Was it obtained by weighing all items directly and subtracting the weight of the containers/wrappings? Or was it obtained by weighing a small sample of items and calculating the total net weight via some type of extrapolation? Development of the DEA uncertainty policy also included revision and standardization of net weight determination procedures to ensure consistency in their application while minimizing the effects of high-uncertainty influences. Successful implementation of the DEA net weight uncertainty policy also involved the training of analysts and laboratory managers.
Training sessions included background information on statistics and metrology, the rationale behind DEA policy revisions, and the importance of communicating uncertainty information to laboratory customers and triers of fact. To facilitate implementation of the policy and standardization of net weight measurement procedures and uncertainty calculations, a DEA Uncertainty Calculator was developed and validated for required documentation and inclusion into analysis casefiles. This calculator was designed to accommodate various weighing procedures and scenarios applicable to solids, liquids, tablets/capsules, and bio-hazardous exhibits. The calculator also provides documentation of the type of equipment used, uncertainty factors considered, minimum weight requirements, acceptance criteria for measurements, and weighing operations needed to complete a net weight determination. The incorporation of the DEA Uncertainty Calculator into the laboratory information management system (LIMS) will also be discussed. To conclude, this presentation will also review numerous insights gained through implementation of the net weight uncertainty policy within the DEA laboratory system. Emphasis will be on the importance of appropriate balance calibration procedures, proper balance usage, traceability to reference weight standards, and robust performance verification protocols.
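The two measurement routes contrasted above (weighing every item directly versus weighing a sample and extrapolating) can be sketched in code. This is an illustrative simplification, not the DEA Uncertainty Calculator itself; the balance uncertainty value and the function names are assumptions made for the example.

```python
import math

# Assumed per-weighing standard uncertainty, in grams. A real budget
# would derive this from system-wide balance calibration and
# performance-verification data, as described above.
U_BALANCE = 0.002

def direct_net_weight(gross_weights, tare_weights):
    """Net weight by weighing every item and subtracting its
    container/wrapping; each weighing contributes an independent
    uncertainty term, combined in quadrature."""
    net = sum(gross_weights) - sum(tare_weights)
    n_weighings = len(gross_weights) + len(tare_weights)
    u = U_BALANCE * math.sqrt(n_weighings)
    return net, u

def extrapolated_net_weight(sample_net_weights, total_units):
    """Net weight by weighing a sample of units and extrapolating to
    the whole exhibit; unit-to-unit variability usually dominates, so
    the standard error of the sample mean is scaled to all units."""
    n = len(sample_net_weights)
    mean = sum(sample_net_weights) / n
    var = sum((w - mean) ** 2 for w in sample_net_weights) / (n - 1)
    sem = math.sqrt(var / n)  # standard error of the sample mean
    return total_units * mean, total_units * sem
```

As the abstract notes, the extrapolated route typically carries a much larger uncertainty than direct weighing, because the sampling term is multiplied by the total unit count.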
Drug Chemists Are Getting the Right Answers: Assessing Drug Analysis Error Rates in Municipal, County, and Federal Laboratories (co-presented with Sandra Sachs)
What can decades' worth of proficiency test (PT) data, quality assurance (QA) measures, re-analysis results, an Excel spreadsheet, and a pot of coffee lead to? The conclusion that drug chemists are doing a very good job of identifying materials submitted to forensic laboratories for analysis. Do errors occur? What is the error rate? How confident should a laboratory be in its results? These and many other questions can be answered using data that may already be available to laboratories, allowing assessment of error rates and, in some cases, the use of Bayesian analysis to formulate posterior probabilities characterizing the confidence and uncertainty associated with drug analysis procedures and the test results they produce. This presentation will answer these questions with data obtained from one municipal, one county, and eight federal laboratories and address the differing analytical approaches in these laboratories. The forensic community has increasingly been asked to provide data supporting the accuracy and validity of its results (1, 2). One way to address this in the drug analysis discipline is to use PT data to assess how analysts perform while employing various analytical schemes. Large laboratory systems, like the Drug Enforcement Administration (DEA), produce extensive data sets that provide statistically meaningful results. This presentation will describe data obtained from DEA PTs completed during 2005-2016, comprising over 4700 outcomes. Similar studies, however, are not feasible for small laboratories. For example, the Oakland Police Department (OPD) Criminalistics Laboratory has taken 87 PTs over the last 20 years without a single failure. Such small data sets are insufficient for the assessment of errors and could lead to "zero-error-rate" conclusions that would be wrong or misleading at best.
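Why a small, spotless record cannot establish a zero error rate can be made concrete with a standard one-sided binomial bound; only the PT count is taken from the text above, and the function name is an illustrative choice.

```python
def zero_failure_upper_bound(n_tests, confidence=0.95):
    """One-sided upper confidence bound on the error rate when all
    n_tests independent tests pass: solve (1 - p)**n = 1 - confidence
    for p."""
    return 1.0 - (1.0 - confidence) ** (1.0 / n_tests)

# 87 passed PTs with no failures still leaves an upper bound of about
# 3.4% on the true error rate at 95% confidence, close to the
# "rule of three" approximation 3/87.
bound = zero_failure_upper_bound(87)
```

In other words, a clean record on a few dozen tests is consistent with error rates well above the sub-1% figures this study estimates from larger data sets, which is why the QC and re-analysis data described next are needed.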
However, the OPD has a robust 20-year-old QA program through which it has collected and analyzed casework-derived quality control (QC) samples from over 3300 exhibits. Also included are results from the re-analysis of over 1350 cases originally analyzed during 2002-2007 by the Kern Regional Crime Laboratory (KRCL) in Bakersfield, CA. Although the analytical schemes employed by these federal, county, and municipal laboratories differ, the error rates estimated using the different assessment approaches are all below 1%. Additional information can be obtained by combining error-rate results with Bayesian analysis to estimate the posterior probabilities associated with positive identification results. Making reasonable assumptions about prior probabilities relevant to the population yields high-confidence (> 99%) and low-uncertainty (< 1%) estimates. Lastly, error rates estimated from the analysis of OPD and KRCL data also confirm the reliability of analytical schemes employing microcrystalline testing, as these tests were used in the majority of the original case analyses. Error rates below 0.5% (6/3552 to date) are obtained, in agreement with the estimated error rates for the other laboratories in this study. These results support the use of analytical schemes employing two Category-B techniques and one Category-C technique, the minimum recommended by SWGDRUG (3) since 1999. In summary, this study indicates that metrics and QA tools such as PT data and re-analysis support the validity and reliability of the individual methods and analytical schemes used by drug analysts. (1) "Strengthening Forensic Science in the United States: A Path Forward," National Academy of Sciences, 2009, Recommendation #3. (2) "Report to the President, Forensic Science in Criminal Courts: Ensuring Scientific Validity of Feature-Comparison Methods," Executive Office of the President, President's Council of Advisors on Science and Technology (PCAST), September 2016. Feature comparisons can apply to data interpretation methods used in drug analysis. (3) Scientific Working Group for the Analysis of Seized Drugs; recommendations available via http://swgdrug.org/approved.htm.
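The Bayesian step described in the abstract is a direct application of Bayes' rule. The prior and error-rate figures below are illustrative assumptions in the spirit of the abstract's "reasonable assumptions," not the study's actual values.

```python
def posterior_identification(prior, sensitivity, false_positive_rate):
    """P(substance truly present | analysis reports it present),
    via Bayes' rule over the two ways a positive result can arise."""
    true_pos = prior * sensitivity                    # present and detected
    false_pos = (1.0 - prior) * false_positive_rate   # absent but reported
    return true_pos / (true_pos + false_pos)

# Assumed figures: 90% of submitted exhibits actually contain the
# suspected drug, and the analytical scheme errs well under 1% in
# either direction. The posterior confidence then exceeds 99%.
p = posterior_identification(prior=0.90, sensitivity=0.995,
                             false_positive_rate=0.005)
```

Even with a substantially weaker prior, sub-1% error rates keep the posterior high, which is the sense in which the study's error-rate estimates translate into confidence in individual positive identifications.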