Forensics@NIST 2014 Poster Abstracts

Download abstracts as a PDF document.

1. 3D GROUND-TRUTH ANNOTATION FOR VIDEO ANALYTICS FROM LARGE-SCALE CAMERA NETWORKS
John Garofolo, Haiying Guan, P. Jonathon Phillips, Brian Antonishek, Nader Moayeri, John Roberts, Martial Michel, Afzal Godil

Reconstructing three-dimensional (3D) crime scenes is an important aspect of forensic investigation. With installations of large-scale camera networks, multiple videos of a crime may be collected. Video analytics technologies are helpful for data reduction, for identifying which video sources contain the most useful forensic data for a crime, and for generating an understanding of the scene by piecing together video and imagery from many different cameras. It is especially crucial to accurately localize and track the positions of people and objects in the physical world across all collected videos. We present accurate camera calibration techniques and propose a 3D ground-truth annotation system. Given single-view or multi-view videos with camera calibration parameters, the system projects the locations of objects from the cameras into the 3D physical world. Multiple experiments with our methods show that the prototype system provides localization and tracking results with high accuracy. A 3D ground-truth annotation system can be used for 3D crime scene reconstruction, cross-camera tracking, and related tasks. Forensic scientists can reconstruct the crime scene; analyze lines of sight and bullet trajectories (with other techniques); provide a 3D virtual tour to verify witness testimony or evaluate hypotheses; and provide 2D and 3D evidence for courtroom presentation.
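The core geometric step here, mapping a detection in a calibrated camera's image to a position in the 3D physical world, can be illustrated with a minimal sketch. The intrinsic and extrinsic values, the pixel coordinates, and the ground-plane (z = 0) assumption below are hypothetical placeholders, not the project's actual calibration.

    # Minimal sketch: back-project a 2D detection onto the ground plane z = 0
    # using pinhole camera calibration. All parameter values are hypothetical.
    import numpy as np

    K = np.array([[800.0, 0.0, 640.0],      # intrinsics: focal lengths, principal point
                  [0.0, 800.0, 360.0],
                  [0.0, 0.0, 1.0]])
    R = np.array([[1.0, 0.0, 0.0],          # extrinsics: world-to-camera rotation
                  [0.0, -1.0, 0.0],         # (toy camera looking straight down)
                  [0.0, 0.0, -1.0]])
    C = np.array([0.0, 0.0, 3.0])           # camera center 3 m above the ground plane
    t = -R @ C                              # translation such that x_cam = R @ x_world + t

    def pixel_to_ground(u, v):
        """Intersect the viewing ray of pixel (u, v) with the plane z = 0."""
        ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])
        ray_world = R.T @ ray_cam           # ray direction in world coordinates
        s = -C[2] / ray_world[2]            # scale at which the ray reaches z = 0
        return C + s * ray_world

    print(pixel_to_ground(800.0, 500.0))    # e.g., world position of a person's feet

With per-camera calibration, the same plane-intersection step places detections from every camera in a common world frame, which is what makes cross-camera tracking and 3D annotation possible.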
Presenters: John Garofolo/Information Access Division and Haiying Guan/Multimodal Information Group
Information Technology Laboratory

2. A Novel Standard for the Mass Spectrometric Imaging and Quantification of Illicit Drugs in Fingerprints
Shin Muramoto, Thomas P. Forbes, Arian C. van Asten, and Greg Gillen

A novel standard for the forensic analysis of fingerprints was introduced and characterized using time-of-flight secondary ion mass spectrometry (ToF-SIMS) and desorption electrospray ionization mass spectrometry (DESI-MS), aimed at the trace-level detection and quantitative analysis of drug molecules embedded in a latent fingerprint. A calibration array containing roughly (1, 5, 10, 25, and 50) ng of cocaine, methamphetamine, and heroin was deposited onto fingerprints made of artificial sebum using a precision drop-on-demand inkjet printer. Chemical analysis yielded peak intensities for each drug deposit in the array, which were then used to formulate a regression equation for quantifying the surface concentration of drugs pre-mixed in an artificial sebum. For a pre-mixed sebum containing 250 µg of cocaine, for example, ToF-SIMS determined that a fingerprint deposited onto a silicon substrate contained (137.8 ± 15.8) ng/mm2 of cocaine, corresponding to roughly 7% of the total cocaine molecules being transferred in a single deposition event. When the substrate was changed to cotton paper, only (19.3 ± 7.4) ng/mm2 of cocaine was detected, corresponding to just 0.9% transfer. The attenuation of signal was quite severe for paper, as the cocaine molecules are thought to have been absorbed into the substrate. This was consistent with the results for DESI, where the surface concentration of cocaine on paper was (270 ± 121) pg/mm2, or roughly 1.7% transfer.
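As a rough illustration of the calibration step described above, the sketch below fits a line to hypothetical peak intensities from an inkjet-printed array and inverts it to estimate an unknown loading. The intensity values and the assumption of a simple linear least-squares fit are placeholders; the study's actual regression may differ.

    # Sketch: quantify an unknown against an inkjet-printed calibration array.
    # Peak intensities are hypothetical; a linear calibration fit is assumed.
    import numpy as np

    loading_ng = np.array([1.0, 5.0, 10.0, 25.0, 50.0])                 # deposited amount per spot
    peak_intensity = np.array([120.0, 580.0, 1150.0, 2900.0, 5750.0])   # measured counts (made up)

    slope, intercept = np.polyfit(loading_ng, peak_intensity, 1)        # calibration line

    def quantify(intensity):
        """Convert a measured peak intensity into an estimated amount (ng)."""
        return (intensity - intercept) / slope

    unknown_intensity = 1600.0        # intensity from a questioned fingerprint region
    print(f"estimated cocaine: {quantify(unknown_intensity):.1f} ng")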
Presenter: Shin Muramoto
Materials Measurement Science Division
Surface and Trace Chemical Analysis Group

3. Ambient Ionization Mass Spectrometry for Chemical Imaging and Detection of Inorganic and Organic Explosives, Narcotics, and other Forensically Relevant Analytes
Thomas P. Forbes and Edward Sisco

Ambient ionization mass spectrometry (AI-MS) has provided a powerful tool for the forensic analysis, trace detection, and chemical imaging of explosives, illicit narcotics, radionuclides, and gunshot residue (GSR). A wide range of atmospheric pressure and ambient ionization methods have been developed for the generation of gas-phase ions from liquids and solid surfaces for MS analysis. Here, a range of forensic analyses was conducted using several such ion sources, specifically desorption electro-flow focusing ionization (DEFFI), ultrasonic nebulization (USN), and laser desorption/ionization (LDI), all coupled to an ABSciex 4000 QTrap® triple-quadrupole system for chemical imaging and trace detection. DEFFI employs the "electro-flow focusing" technique, in which a recessed solvent capillary is focused through a small orifice by a concentric laminar gas stream. The physics of flow-focusing jet formation and the applied electric field configuration enabled the chemical imaging of endogenous (fatty acids) and exogenous (explosives, narcotics, lotions, GSR) compounds in deposited and lifted fingerprints. USN and acoustic pressure-wave focusing within an array of exponential horn structures enabled efficient atomization of discrete liquid samples ranging from 3 μL to 10 μL. USN coupled in an extractive electrospray ionization configuration demonstrated detection of inorganic compounds from complex matrices, including synthetic fingerprint material and waterway sediment, without detriment to device operation. Finally, in conjunction with in-source collision induced dissociation (CID), DEFFI, USN, and LDI demonstrated enhanced detection and/or chemical imaging of inorganic components of explosives (potassium chlorate), radiological dispersion devices (radionuclides: cesium, strontium, cobalt), and GSR (lead, barium, antimony).
Presenter: Thomas P. Forbes
Materials Measurement Science Division
Material Measurement Laboratory

4. Assessing the Impact of NIST's Forensics Publications and Collaborations
Stacy Bruss, Susan Makar, Amanda Malanowski, and Katie Rapp

NIST has been publishing research studies and results related to forensics since 1978. Forensics at NIST crosses many disciplines and includes computer forensics, firearms analysis, trace analysis, and forensic biometrics, including fingerprints and DNA analysis. This research and the resulting publications by NIST scientists have a demonstrated impact on the forensics research community. The Information Services Office (ISO) at NIST analyzed the impact of NIST's peer-reviewed journal literature through citation analysis and network visualizations. ISO's study identified the most highly cited forensics-related journal articles by NIST authors, as well as the journals, institutions, and countries that cite NIST forensics articles most frequently. The analyses also identified the journals in which NIST forensics research is most frequently published and examined NIST's extramural publication collaborations in forensics. ISO's study shows the forensics research community where NIST's forensic research has had some of its greatest impact.
Presenters: Stacy Bruss, Susan Makar, and Amanda Malanowski
Information Services Office
Management Resources

5. Cathodoluminescence Microscopy for Characterization of Explosive and Narcotic Particle Residues on Surfaces
Greg Gillen and Scott Wight

Characterization of the particle properties (size, morphology, chemical composition, and areal coverage) of trace contraband residues on surfaces is critical for developing optimized sampling strategies for current and emerging forensic detection technologies. Furthermore, a detailed understanding of the nature of these residues is important for the development of appropriate and realistic test materials. In this work we present the first (to our knowledge) evaluation of Cathodoluminescence Microscopy (CLM) for the chemical identification and imaging of a variety of particle residues from both explosives and illicit narcotics. The samples were examined in an FEI Quanta 200F environmental SEM fitted with a Gatan MonoCL4 Cathodoluminescence system. Representative CL spectra and images, as well as secondary electron images, were obtained from each of the materials using several different primary electron beam energies. CL images of micrometer-sized particles typically show more uniform illumination and less sensitivity to sample topography and sample charging than conventional SEM imaging. This can greatly simplify automated particle recognition, sizing, and counting of contraband particles in the SEM. Distinct CL spectra were obtained for most compounds, although the emission bands are typically quite broad. Overall, CLM imaging appears to provide a useful new tool for further understanding the characteristics of trace contraband residue particles.
Presenter: Greg Gillen
Material Measurement Laboratory
Materials Measurement Science Division

6. Characterization of UV-Photodegradation of Poly(ethylene terephthalate)-Based Window Film
John Harwell, Paul Scott, Nicholas Carbone, Christopher White, Aaron Forster

Poly(ethylene terephthalate), or PET, is a popular commercial polymer commonly used in protective window films to increase both glass strength and shatter resistance. These films are highly exposed to UV radiation and the elements, so characterizing the rate of degradation, as well as the resulting changes in mechanical properties, is critical to product safety. Commercially produced multi-layered PET samples were placed under accelerated weathering conditions in the NIST SPHERE at 50 °C and 0% relative humidity. A carbonyl index (CI) ratio was calculated from the IR absorption peak at 3520 cm-1 and the 721 cm-1 / 871 cm-1 peaks to identify degradation spectroscopically. The degradation showed a good logarithmic association between photon dosage and CI. Tensile tests revealed orientational differences in PET's mechanical properties. In addition, after a photon dosage of 14990 J/cm2-min, the material was observed to have lost approximately 63.5% of its original tensile capacity. This information, in combination with ongoing natural weathering tests, will be of critical importance in determining the service life of multi-layered PET window films.
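The two calculations named above, a peak-ratio carbonyl index and a logarithmic fit of CI against photon dosage, can be sketched as follows. All numerical values below are invented for illustration; only the form of the calculation follows the abstract.

    # Sketch: carbonyl index (CI) as an IR band ratio, then a logarithmic fit of
    # CI versus cumulative photon dosage. All values are hypothetical.
    import numpy as np

    dose = np.array([500.0, 1500.0, 4000.0, 9000.0, 15000.0])    # cumulative photon dosage
    A_carbonyl = np.array([0.021, 0.034, 0.052, 0.071, 0.084])   # absorbance of the 3520 cm-1 band
    A_reference = np.array([0.100, 0.101, 0.099, 0.100, 0.100])  # absorbance of a reference band

    ci = A_carbonyl / A_reference                # carbonyl index as a band ratio

    a, b = np.polyfit(np.log(dose), ci, 1)       # fit CI = a * ln(dose) + b
    print(f"CI = {a:.3f} * ln(dose) + {b:.3f}")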
Presenter: Christopher White
Materials and Structural Systems Division
Engineering Laboratory

7. CLOUD COMPUTING FORENSIC SCIENCE CHALLENGES
Martin Herman, Michaela Iorga

We present many of the challenges faced by forensic investigators attempting to identify, collect, analyze, and interpret digital evidence residing in cloud computing environments. These challenges are either unique to or exacerbated in the cloud ecosystem. The challenges were researched, aggregated, and categorized by the NIST Cloud Computing Forensic Science Working Group. The working group has prioritized the challenges, which will be further studied to determine the gaps in technology and standards needed to mitigate them. The challenges are categorized into the following groups:

  • Architecture - e.g., diversity, complexity, provenance, multi-tenancy, data segregation, variability in cloud architectures between providers, tenant data compartmentalization and isolation during resource provisioning, lack of transparency
  • Data Collection - e.g., data integrity, data recovery, data location, imaging, locating and collecting volatile data, data collection from virtual machines, data integrity in multi-tenant environment, recovery of deleted data in a shared and distributed virtual environment, root of trust
  • Analysis - e.g., correlation, reconstruction, time synchronization, logs, metadata, timelines
  • Anti-forensics - e.g., obfuscation, data hiding, malware
  • Incident first responders - e.g., trustworthiness of cloud providers, response time, reconstruction
  • Role management - e.g., data owners, identity management, users, access control
  • Legal - e.g., jurisdictions, laws, service level agreements, contracts, subpoenas, international cooperation, privacy, ethics
  • Standards - e.g., standard operating procedures, interoperability, testing, validation
  • Training - e.g., misuse of digital forensic training materials that are not applicable to cloud forensics, lack of cloud forensic training and expertise for both investigators and instructors

Presenter: Martin Herman
NIST Information Technology Laboratory

8. COCAINE SMOKE RESIDUES AS RECOVERABLE TRACE EVIDENCE
Julie Bitter, Bob Fletcher, Ed Sisco, and Greg Gillen

Smoke residues, composed of aerosolized drug particles and vapor-phase drug molecules, are the remnants of secondhand smoke that are deposited onto users' clothing as well as onto surfaces in car and home interiors. The ability to recover smoke residues from these surfaces and identify an illicit substance through chemical analysis would provide useful forensic evidence that does not require the drug itself to be present at a scene. In this study, we generated cocaine aerosols by vaporizing cocaine base and collecting the aerosols onto substrates of various materials. These substrates were left under ambient conditions for one week and then underwent microscopic and/or chemical analysis. Investigation with mass spectrometry methods showed decreases in the overall efficiency of cocaine recovery as a function of time, as well as decomposition of the parent ion into four major products. Scanning electron microscopy revealed some aerosols and aggregates of small particles, along with the slow formation of crystals less than 500 nm in diameter over the given time frame. Particle size analysis with an aerosol impactor was also performed in an attempt to determine the fraction of drug in the aerosol and vapor phases, an important piece of information for trace detection by thermal desorption. Results indicate that the majority of aerosols formed in this way possess average diameters between 1.0 µm and 2.5 µm. Future work includes examining cocaine recovery under a variety of environmental conditions and examining the vapor-phase cocaine content.
Presenter: Julie Bitter
Materials Measurement Science Division
Material Measurement Laboratory

9. EFFECT OF REUSING SWIPE MATERIALS FOR PARTICLE COLLECTION EFFICIENCY
Jessica Staymates and Matthew Staymates

When trace evidence is collected via wipe sampling, the collection material is rarely reused after it is analyzed. However, it is common practice in the trace contraband detection community to reuse sampling media several times, discarding a wipe only once it is visibly dirty or after 25 uses. The number of times a wipe can be used before it gets too dirty, or before its particle collection efficiency (PCE) becomes too poor for ion mobility spectrometry (IMS)-based trace detection, is currently unknown. The primary purpose of this study is to investigate the number of times a collection wipe can be used before the PCE decreases or the IMS response is negatively affected. Trace collection is an important part of forensic work, and knowing the effects of heat and age on the collection media can help us understand optimal methods for trace collection. To date we have aged two types of wipes by swiping them in an automated manner over various surfaces, including canvas and dirty cardboard, which represent materials that are likely to be swiped in a screening environment. They were swiped between 10 and 1000 times, and their PCEs will be determined using previously described methods [2]. The IMS response of aged wipes was also investigated, and SEM images were captured to observe possible surface degradation. This experiment should help answer one piece of the much larger puzzle of how wipe age affects wipe performance.
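As a rough illustration of the quantity being tracked, PCE can be expressed as the fraction of deposited test material that a wipe recovers, evaluated after each reuse. The masses below are invented; the study itself relies on its previously published measurement method.

    # Rough sketch: particle collection efficiency (PCE) tracked across wipe reuses.
    # Masses are hypothetical placeholders, not measured values.
    deposited_ng = 100.0                                     # test material on the surface
    recovered_ng_by_use = [92.0, 88.0, 81.0, 70.0, 55.0]     # recovered on successive reuses

    for use, recovered in enumerate(recovered_ng_by_use, start=1):
        pce = 100.0 * recovered / deposited_ng
        print(f"use {use}: PCE = {pce:.0f} %")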
Presenter: Jessica Staymates
Materials Measurement Science Division

10. Effects of Precursors on the Mass Spectrometric Detection of Homemade Explosives
Edward Sisco and Thomas P. Forbes

Techniques such as gas chromatography mass spectrometry (GC/MS), infrared spectroscopy, and colorimetric tests have all been shown to readily detect a wide range of explosives both in pure form and in complex matrices. However, the capabilities of one of the emerging techniques in forensic science, ambient ionization mass spectrometry (AI-MS), have been significantly less researched due to the recent advent of the platform. While most explosives research focuses on military-grade explosives, it is equally important to investigate common homemade and improvised explosives such as sugar alcohol based nitrate ester explosives and peroxide-based explosives. Since these compounds are typically homemade, it is crucial to understand their performance characteristics and potential detection issues both in pure form and in mixtures with their precursors. This work focuses on identifying potential advantages and issues that can arise in the detection of homemade explosives found in the presence of their common precursor chemicals using AI-MS platforms such as direct analysis in real time mass spectrometry (DART-MS). A number of explosives derived from sugar alcohols were analyzed by DART-MS neat and in the presence of common precursors. Analysis of mixtures found that competitive ionization does occur when these precursors are concurrently sampled. Affinity for free nitrate ions and differences in vapor pressures of the explosives and sugar alcohols can lead to diminished sensitivity, depending on the magnitude of the difference in affinity and vapor pressure. Furthermore, extracted ion chromatograms of the mixtures highlight how different ionization pathways are preferred in the presence of different combinations of precursors, with adduct formation with free nitrates being the dominant pathway for both components.
Presenter: Edward Sisco
Surface and Trace Chemical Analysis Group
Material Measurement Laboratory

11. Evaluation of Benchtop DNA Sequencers for Forensic Applications
Katherine Gettings, Kevin Kiesler, Peter Vallone

Current forensic DNA testing methods for human identification generate DNA profiles using short tandem repeat (STR) markers. After evidence is collected from a crime scene and DNA is extracted, sample analysis is performed by multiplex PCR of selected STR loci, combined with fluorescent detection on capillary electrophoresis (CE) platforms. Present technology, while robust and reliable, is limited to typing approximately 20 STRs in one reaction. Other types of markers (single nucleotide polymorphisms [SNPs] or mitochondrial DNA [mtDNA] sequence) require different analytical processes. Multiple assays can be employed to type different markers from the same sample, but this requires considerable time and resources, and several nanograms of starting template, which limits the information readily obtained from a forensic DNA sample. Next generation sequencing (NGS) can analyze many thousands of genomic regions in a single reaction. Sequencing costs have plunged in recent years, and medical applications of NGS, such as real-time screening of cancerous tumor cells (which are limited in quantity, degraded, and mixtures of cancer/normal cells), concomitantly make NGS technology more amenable to forensic samples. This work explores the potential implementation of NGS technology in forensic science, specifically focusing on benchtop sequencing platforms (Ion Torrent PGM and Illumina MiSeq) that are better suited to the lower throughput needs and budget constraints of forensic laboratories. One conceivable benefit of NGS for forensics is a multi-marker multiplex, wherein STRs, mtDNA, and various types of SNPs could be analyzed concurrently to maximize the information obtained from one sample in one assay.
Presenter: Kevin Kiesler
Applied Genetics Group, Biomolecular Measurement Division
Material Measurement Laboratory

12. FORENSIC ANALYSIS METHODOLOGY AND DATABASE FOR HOMEMADE EXPLOSIVES
Ashot Nazarian and Cary Presser

Forensic identification of homemade explosive (HME) materials is critical for determining the origin of explosive mixtures and precursors, and formulation procedures. The forensics community traditionally uses a multi-evidence investigation strategy. In support of that approach, development of a new methodology was proposed in our Forensics@NIST 2012 poster, 'Forensic Analysis Methodology and Database of Statistically Combined HME Thermal, Mass, and Spectral Signatures' [Nazarian et al., 2012], to use an existing theoretical framework that combines forensic information (e.g., data, signatures) from independent methodologies. Thermal, mass, and infrared spectral signatures, as well as isotopic composition and ratios, were to be correlated to pre-identified HME precursors and used to establish a database with specified confidence levels. As an initial phase of this program (supported by the NIST Special Programs Office, program manager J.P. Jones), thermal signatures from a variety of HME samples and recipes were obtained and analyzed in collaboration with the FBI/ATF. This initial thermal-signature effort was composed of the following phases: methodology development, demonstration, actual sample measurements, database concept development and integration, and population of the database with oxide/fuel mixed and synthesized HMEs. A novel measurement technique, referred to as the laser-driven thermal reactor (LDTR), was used to obtain thermal/chemical signatures of HME materials of interest. Ammonium nitrate (AN) and nitromethane (NM) were studied under a variety of operating conditions and protocols. It was demonstrated that the LDTR can serve as a diagnostic tool for characterizing the thermal and chemical behavior of trace amounts of HMEs [Nazarian and Presser, Thermochimica Acta, 576 (2014) 60-70]. Mixtures of ANNM (ammonium nitrate/nitromethane) and ANFO (ammonium nitrate/diesel fuel oil), as well as an additional ten HME mixtures suggested by the FBI/ATF (CTTSO Int. Counter HME Workshop, 2014), were investigated using the LDTR. The thermograms were found to be different from those of the individual components and from each other, indicating the sensitivity of the LDTR technique for uniquely identifying different HME compositions for forensic analysis. Bio-diesel-based HMEs were also found to be potentially more reactive and dangerous than conventional petroleum-based diesel fuel oil. Currently, measurements are underway to include HMEs synthesized by the FBI, and nanoparticle metal/metal oxide thermite powders. Also, development of an HME forensic signature database (including directions to input data or query information) is proceeding, to make evaluated data available to the forensics community, provide access to HME signature data from other federal agencies (e.g., USGS digital spectral library, NASA/JPL ASTER spectral library, JHU spectral library), and enable other agencies to query the database.
Presenters: Ashot Nazarian and Cary Presser
Chemical Sciences Division
Material Measurement Laboratory

13. FORENSIC LATENT FINGERPRINT PREPROCESSING
Mary Theofanos, Andrew Dienstfrey, Brian Stanton, and Haiying Guan

Forensic latent fingerprint preprocessing covers all the image transformation activities from the latent fingerprint image collected directly from the crime scene (the 'before image') to the image used for identity analysis (the 'after image'). Preprocessing directly affects the performance of fingerprint recognition. However, this phase has received less attention, and there are therefore fewer guidelines, rule sets, or standards. Our project focuses on three major tasks: first, to understand the process and procedure and study how to guarantee image integrity; second, to provide guidance on recording the procedure and increase its reproducibility; third, to measure the effectiveness of the process and compare the quality of the images. Preprocessing is a complex procedure. We present a workflow of the major steps and activities used in preprocessing. The diverse activities include color filtration, contrast adjustment, edge enhancement, background suppression, and noise removal. We propose guidelines to guarantee image integrity and several rules for reproducibility. We also propose a method to measure the effectiveness of the preprocessing. We revised the SIVV (Spectral Image Validation and Verification) feature, previously used for flat or rolled fingerprint images (low noise), and extended it to the latent fingerprint image domain (high noise). The experimental results show that the revisions greatly improve the fingerprint detection rate for latent images. By comparing the extended SIVV of the before and after images, we can measure the effectiveness of the preprocessing.
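A simplified stand-in for the kind of spectral signature SIVV is built on is sketched below: a 1D profile obtained by integrating an image's 2D Fourier power spectrum over angle, in which fingerprint ridge structure shows up as a peak at the ridge frequency. This is not the NIST SIVV implementation; the binning and the toy before/after comparison metric are assumptions.

    # Simplified stand-in for a SIVV-style spectral signature: a radially
    # integrated power spectrum of a grayscale image. Not the actual SIVV code.
    import numpy as np

    def spectral_signature(img, n_bins=64):
        """Return a normalized 1D power-vs-spatial-frequency profile of a 2D image."""
        f = np.fft.fftshift(np.fft.fft2(img - img.mean()))
        power = np.abs(f) ** 2
        h, w = img.shape
        yy, xx = np.mgrid[0:h, 0:w]
        r = np.hypot(yy - h / 2, xx - w / 2)
        r = r / r.max()                                     # normalized radial frequency
        idx = np.digitize(r.ravel(), np.linspace(0, 1, n_bins + 1)) - 1
        sig = np.bincount(idx, weights=power.ravel(), minlength=n_bins)[:n_bins]
        return sig / sig.sum()

    def preprocessing_change(before, after):
        """Toy effectiveness measure: distance between the two 1D signatures."""
        return np.linalg.norm(spectral_signature(after) - spectral_signature(before))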
Presenters: Mary Frances Theofanos
Office of Data and Informatics, Material Measurement Laboratory
and
Haiying Guan
Multimodal Information Group, Information Access Division, Information Technology Laboratory

14. MEASURING ELECTROSTATIC CHARGE IN WIPE SAMPLING: SURFACES AND PARTICLES
Robert Fletcher,* Greg Gillen and James Kushmerick

Wipe sampling, a key method for trace detection of explosives and drugs, entails rubbing a swipe material across a surface to collect residue particles for subsequent analysis. Triboelectric charging arises on the wipe from the contact of dissimilar material surfaces, resulting in charge and material transfer. These wipe surfaces are frequently composed of polymer materials, which can serve as quasi-electrets and hold charge, under dry conditions, for a considerable length of time. The question we ask is: does electrostatic charging affect particle collection? Currently we are developing the capability to quantify the levels of charge that exist on the particles, the swipe material, and the substrate where the particles may reside. We report on measurements using a NIST-constructed electrostatic microprobe with a lateral spatial resolution of approximately 1000 µm, used to map the charge domains on surfaces. In some cases, charged particles can be detected on these surfaces with this microprobe. Charge on particles was measured using scanning Kelvin probe microscopy and an aerosol electrometer (AE). The AFM-based Kelvin probe demonstrated that charge polarities and isolated domains on deposited RDX particles can be quantified in a relative manner. The AE measures the current produced by a flowing stream of charged airborne particles. A commercial slip/peel tester with an electrically insulated sled was used to test the collection efficiency of charged wipes for polymer microspheres and explosive residue particles. Our results show that charged wipes collect more polymer microspheres than neutral wipes.
Presenter: Robert Fletcher
Materials Measurement Science Division, Material Measurement Laboratory

15. Measuring Perceptual Performance of Facial Forensic Examiners
David White, P. Jonathon Phillips, Carina A. Hahn, Matthew Hill, Alice J. O'Toole

In forensic and security occupations, unfamiliar individuals are often identified by comparing facial images. Experiments consistently show that human viewers are poor at this task, raising questions about the reliability of important identity verification processes. Scientific knowledge of identification accuracy, however, is based primarily on people who are not formally trained at this task. Here we administered three challenging unfamiliar face matching tests to a group of highly trained forensic image examiners with many years' experience comparing face images for law enforcement and government agencies. Two tests were constructed using image pairs that produced poor performance in humans and in computer-based face recognition systems (the Expertise in Facial Comparison and Person Identification Challenge tests). The third was a standard psychometric test of face matching performance (the Glasgow Face Matching Test). Forensic examiners were more accurate than the control groups on all face comparison tasks; on the Glasgow Face Matching Test, examiners exceeded the normative performance of the general population. Moreover, computationally fusing the responses of multiple experts produced near-perfect human performance on tests where machine performance is unreliable.
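One simple way to "computationally fuse" several examiners' judgments, consistent with the description above though not necessarily the exact procedure used in the study, is to average each examiner's rating for an image pair and threshold the mean:

    # Sketch: fuse examiners' ratings of face-image pairs by averaging.
    # The ratings and the decision threshold are hypothetical.
    import numpy as np

    # Rows: examiners; columns: image pairs.
    # Scale: -3 (different people) to +3 (same person).
    ratings = np.array([
        [ 2,  3, -1, -3,  1],
        [ 3,  2, -2, -3,  2],
        [ 1,  3, -1, -2,  2],
    ])

    fused = ratings.mean(axis=0)      # average rating per image pair
    decision = fused > 0              # call "same person" when the fused score is positive
    print(fused, decision)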
Presenter: P. Jonathon Phillips
Information Access Division, Information Technology Laboratory

16. NUCLEAR FORENSIC STANDARDS: THE VERIFICATION OF A LIGHTLY-ENRICHED URANIUM RADIOCHRONOMETRIC CERTIFIED REFERENCE MATERIAL, CRM 125-A
Mark Tyra, Jacqueline Mann, Jerome LaRosa, and Svetlana Nour

New Brunswick Laboratory (NBL), with support from the DHS National Technical Nuclear Forensic Center, recently certified the first radiochronometric 234U-230Th standard, a lightly enriched uranium oxide material, CRM 125-A. NIST is performing confirmatory measurements on this material. An experimental design that adds chemical purification steps and longer counting times to improve the counting statistics, together with an improved spectrum deconvolution approach, has been used to ensure accurate results. Particular care has been taken in deconvoluting each alpha spectrum using a modification of the DAMM software (part of ORNL's UPAK software suite) to better handle uranium alpha-particle spectra. Thorium has proved problematic, however, as the 229Th peak shape confounds the modified code due to the many decay pathways of 229Th. Coincident conversion electrons and low-energy gamma rays create peak shapes very different from that of 230Th. We are therefore deconvoluting the Th spectra manually and comparing the results to those from the modified DAMM code. These results will be compared to the original NBL certificate.
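For context, a 234U-230Th model (production) age rests on the standard parent-daughter ingrowth relation, under the usual assumption that thorium was completely removed when the material was last purified; this is background, not a statement of the certification procedure:

    % Ingrowth of 230Th from 234U, assuming no initial 230Th at purification (t = 0):
    \[
      N_{230}(t) = N_{234}(0)\,\frac{\lambda_{234}}{\lambda_{230}-\lambda_{234}}
                   \left( e^{-\lambda_{234} t} - e^{-\lambda_{230} t} \right),
    \]
    % so the measured atom ratio fixes the elapsed time t (the model age):
    \[
      \frac{N_{230}(t)}{N_{234}(t)} = \frac{\lambda_{234}}{\lambda_{230}-\lambda_{234}}
                   \left( 1 - e^{-(\lambda_{230}-\lambda_{234})\,t} \right).
    \]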
Presenter: Mark Tyra
Radiation Physics Division, Physical Measurement Laboratory

17. ON THE ORIGIN OF FILES AND REGISTRY CELLS: MEASURING NAMESPACE EVOLUTION FOR SOFTWARE IDENTIFICATION
Alex J. Nelson, Mary T. Laamanen, Darrell D. E. Long

Hierarchical storage system namespaces are notorious for their immense size, which is a steady source of woe for any computer inspection. File systems for computers start with tens of thousands of files, and the Registries of Windows computers start with hundreds of thousands of cells. An analysis of a storage system, whether for digital forensics or for locating old data, depends on being able to reduce the namespace down to the features of interest. Typically, having such large volumes to analyze is seen as a challenge to identifying relevant content. However, if the origins of files can be identified (particularly dividing between software and human origins), large counts of files become a boon to profiling how a computer has been used. It becomes possible to identify software that has influenced the computer's state, which gives an important overview of storage system contents not available to date. In this work, we apply document search to observed changes in a class of forensic artifact, cells of the Windows Registry, to identify the effects of software on storage systems. Using the search model, a system's Registry becomes a query for matching software signatures. To derive signatures, we extend file system differential analysis from between two storage system states to many sequences of states. The workflow that creates these signatures is an example of analytics on data lineage, from branching data histories. The signatures independently indicate past presence or usage of software, based on the consistent creation of measurable, distinct artifacts. We present challenges in refining software signatures.
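A toy illustration of the "Registry as a query" idea: score each software signature by how much of it appears among the cell paths observed on a system. The signature names, paths, and the simple overlap score below are invented; the actual work derives signatures from diffed sequences of Registry states and uses a document-search model.

    # Toy sketch: match a system's observed Registry cell paths against software
    # signatures. All names and paths are hypothetical.
    signatures = {
        "ExampleBrowser 1.0": {
            r"HKLM\Software\ExampleBrowser\Version",
            r"HKCU\Software\ExampleBrowser\Profiles",
        },
        "ExampleEditor 2.3": {
            r"HKLM\Software\ExampleEditor\InstallDir",
            r"HKCU\Software\ExampleEditor\RecentFiles",
        },
    }

    def score_signatures(observed_cells):
        """Fraction of each signature's cells present in the observed Registry."""
        observed = set(observed_cells)
        return {name: len(cells & observed) / len(cells)
                for name, cells in signatures.items()}

    observed = [
        r"HKLM\Software\ExampleBrowser\Version",
        r"HKCU\Software\ExampleBrowser\Profiles",
        r"HKCU\Software\SomethingUnrelated\Key",
    ]
    print(score_signatures(observed))   # {'ExampleBrowser 1.0': 1.0, 'ExampleEditor 2.3': 0.0}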
Presenter: Alex Nelson
Software and Systems Division, Information Technology Laboratory

18. QUANTIFYING THE DEGRADATION OF TNT AND RDX IN A SALINE ENVIRONMENT WITH AND WITHOUT UV-EXPOSURE
Marcela Najarro, Edward Sisco, Candice Bridge, Roman Aranda IV

Terrorist attacks in a maritime setting, such as the bombing of the USS Cole in 2000, and the detection of underwater mines require the development of proper protocols to collect and analyze explosive material from a marine environment. In addition to proper analysis of the explosive material, protocols must also consider the exposure of the material to potentially deleterious elements, such as UV light and salinity, the time spent in the environment, and the time between storage and analysis. To understand how traditional explosives would be affected by such conditions, saline solutions of explosives were exposed to real and artificial sunlight. Degradation of the explosives over time was quantified using negative chemical ionization gas chromatography mass spectrometry (GC/NCI-MS). Two explosives, trinitrotoluene (TNT) and cyclotrimethylenetrinitramine (RDX), were exposed to different aqueous environments and light exposures, with salinities ranging from freshwater to twice the salinity of ocean water. Solutions were then aged for up to 6 months to simulate the different conditions from which the explosives might be recovered. Salinity was found to have a negligible impact on the degradation of both RDX and TNT: RDX was stable in solutions of all salinities, while TNT solutions degraded regardless of salinity. Solutions of varying salinities were also exposed to UV light, where accelerated degradation was seen for both explosives. Potential degradation products of TNT were identified using electrospray ionization mass spectrometry (ESI-MS) and correspond to degradation products proposed in previously published work.
Presenter: Marcela Najarro, NIST Materials Measurement Science Division

19. Rapid DNA Testing Approaches for Reference Samples
Erica L.R. Butts, Peter M. Vallone

Current methods of forensic DNA typing take approximately 8 to 10 hours. The process includes DNA extraction, quantitation, multiplex PCR amplification, and fragment length detection. Advances in extraction, multiplex polymerase chain reaction (PCR), and fragment separation have helped reduce the time required to generate a complete short tandem repeat (STR) profile. Advances in extraction include automated extraction and liquid extraction technologies. Multiple advances have been made in recent years to PCR multiplex kits, including new buffers, improved polymerases, and faster thermal cycler technologies; each of these has dramatically reduced the time required for PCR amplification. Technology such as the 3500 Genetic Analyzer has also reduced the time required to separate and detect an STR profile to about 38 minutes per 8 samples. Techniques will be examined for single-source samples throughout the entire DNA typing process to reduce the time required to generate a profile from the moment a sample has been collected. Several methods, including automated extraction, liquid-based extraction protocols, direct PCR for buccal swabs and blood cards, and rapid PCR protocols, will be examined. The differences between current DNA testing methods and the rapid techniques will be evaluated. Results will illustrate the typing of a full STR profile in less than 2 hours with instrumentation commonly available in forensic laboratories.
Presenter: Erica L.R. Butts
Material Measurement Laboratory, Biomolecular Measurement Division, Applied Genetics Group

20. RE-USABLE AND SEMANTIC FRIENDLY METADATA FOR FORENSICS
T. N. Bhat, J. T. Elliott and E. Subrahmanian

Re-usable and semantic-friendly metadata play a major role in the integration of information obtained from distributed resources. Text-mining techniques such as natural language processing rely on re-used, semantic-friendly metadata to create data-graphs from documents. Re-used terminology at some of the nodes of the data-graphs can allow automated alignment of different data-graphs and facilitate the search for new knowledge. The lack of an infrastructure for creating such metadata at the time documents are created has been well established by recent initiatives such as the Research Data Alliance (RDA), headed by NSF and NIST. Forensic science research is invariably an inter-disciplinary, federated effort. However, to our knowledge, there is at this time no established infrastructure to create re-used, customizable, use-case-friendly metadata for use in such a federated environment. During the last several years, we have been working on a scalable, rule-based, and machine-friendly infrastructure for building semantic, on-demand terminologies. We used these concepts first to create metadata for a chemical structural database and then for a cell image database. More recently, we have been implementing these concepts to add value to the metadata used by the NIST data resources for the Materials Genome Initiative. These results, together with examples of their possible application to forensic science research, will be presented.
Presenter: T.N. Bhat
Biosystems and Biomaterials Division, Materials Measurement Laboratory

21. REFERENCE MATERIALS FOR NUCLEAR FORENSICS
Jacqueline Mann, Mark Tyra, Robert D. Vocke, Richard Essex, Jeffrey Morrison, Jeffrey Leggitt, and Simon Jerome

The development of reference materials (RMs) to improve the traceability, accuracy, and precision of nuclear forensic measurements is currently being pursued by NIST as well as by the USDOE New Brunswick Laboratory (NBL), with support from the DHS National Technical Nuclear Forensic Center. Demonstrating the validity of nuclear forensic data is critical, as these data could be an important component of the attribution process, resulting in prosecution in a court of law or the development of actionable intelligence. Thus, using a proper RM to constrain measurement accuracy and uncertainty bolsters the veracity of measurement results and provides a strong basis for withstanding legal scrutiny. Current RM development efforts, in collaboration with partner organizations (DHS DNDO NTNFC, FBI, DOD, DOE National Laboratories, and international partners), focus on producing certified and/or working RMs for nuclear forensic analysis, including RMs for mass spectrometric calibration, isotopic tracer RMs for isotope dilution mass spectrometry (IDMS), and radiochronometry RMs. This poster will highlight some of the RM projects in which NIST is actively involved in the certification process, including a Surrogate Post-Detonation Urban Debris Material, a Th-229 IDMS tracer, and a Ba-134 IDMS tracer.
Presenter: Jacqueline Mann
Radiation Physics Division, Physical Measurement Laboratory

22. RESTORATION OF FIREARM SERIAL NUMBERS USING ELECTRON BACKSCATTER DIFFRACTION
Ryan M. White, Robert R. Keller

Firearm serial numbers are a critical identifying mark, and restoration of destroyed serial numbers is often crucial for prosecution of a criminal case. A method is presented utilizing electron backscatter diffraction (EBSD) in the scanning electron microscope (SEM) that allows clear visualization of die-stamped imprints that have been completely polished away. A known shape (the letter "X") was die-stamped into 316 stainless steel and then polished away such that no visual evidence of the imprint remained. Residual deformation is observed to a depth of approximately 760 µm below the newly polished surface. Using EBSD, the original shape of the stamped imprint is identified without ambiguity. With further development, the described method is capable of reconstructing an 8-character serial number in approximately one hour.
Presenter: Ryan M. White
Applied Chemicals and Materials, Material Measurement Laboratory

23. RFID Technology in Forensic Evidence Management: An Assessment of Barriers, Benefits, and Costs
Shannan Williams

Forensic science laboratories and law enforcement agencies have increasingly used automated identification technologies (AIT) such as barcoding and radio frequency identification (RFID) to track and manage assets such as forensic evidence, firearms, and personnel. RFID uses radio waves to perform automatic data acquisition. Several methods of identifying objects using RFID exist today, but the most common is to store a serial number that identifies an item, and perhaps other information, on a microchip attached to an antenna, which enables the chip to transmit the stored information to a reader. Conceptually, RFID and barcodes are similar; both are intended to provide rapid and reliable item identification and tracking capabilities. However, the value of RFID is evident in the differences between them. By taking advantage of the inherent features of RFID, more powerful applications can be created that can take action at a distance. The RF nature of RFID enables users to establish zones where activity can be monitored automatically. RFID-enabled applications can be engineered in such a way that the movement, number, and specific type of items, as well as the timing and frequency of events, can all be monitored at a distance. For example, taking the inventory of a property and evidence room can be done with a single RFID read and without handling each item individually. The report, RFID Technology in Forensic Evidence Management: An Assessment of Barriers, Benefits, and Costs, is available for download at: http://www.nist.gov/forensics/evidence-management.cfm
Presenter: Shannan Williams
NIST Special Programs Office

24. Shape Analysis for Forensics Applications
Afzal Godil

In this poster, I will review some of the shape analysis methods that play an important role in algorithms for identification, matching, and statistical analysis in different forensics applications. I will also discuss some recent advances in 3D imaging and scanning technologies that have made accurate capture of 3D objects possible and could improve matching and identification accuracy in different forensics applications. Finally, I will present some initial results on graffiti image matching and retrieval.
Presenter: Afzal Godil
Information Technology Laboratory, National Institute of Standards and Technology

25. SINGLE SPORE MASS SPECTROMETRIC ANALYSIS FOR MICROBIAL FORENSICS
Christopher Szakal, Sandra da Silva, and Nathan D. Olson

Microbial attribution generally requires fundamental knowledge of the natural variation or mutation rate across organisms in varied environments, which is often confounded by the lack of a comprehensive repository of genetic data or reference microbial strains. While this information is being improved, it is also important to simultaneously develop non-genomic measurements for attribution purposes. The disadvantage of spore population measurements is that they are unable to distinguish rare pathogenic spores (either nefariously produced or naturally occurring) present within a population of hundreds to thousands of normal spores, with potentially dangerous consequences. Therefore, the ability to differentiate single spores of different origin within a population is crucial for attribution confidence. Such differentiation requires a single-spore measurement to have 1) enough counts for relevant and quantifiable signatures, 2) enough precision to correlate signatures with different environmental origins, and 3) enough individual spore analyses to place the natural variability of a single spore in context for establishing attribution confidence. Here, large geometry secondary ion mass spectrometry (LG-SIMS) was used to obtain single Bacillus thuringiensis spore profiles of the common elemental signatures Mg, Ca, and Fe over a dynamic signal range of more than 4 orders of magnitude. Additionally, removal of Ca from the sporulation conditions resulted in an approximately two-order-of-magnitude reduction in the Ca signal with respect to Mg. Efforts are underway to establish the variability of the above data to provide context as to what measured elemental concentrations within unknown spores mean for growth-condition attribution.
Presenter: Christopher Szakal
Materials Measurement Science Division, Material Measurement Laboratory

26. STANDARDS FOR FORENSIC X RAYS USED TO DETECT BULK EXPLOSIVES AND OTHER CONTRABAND
Larry Hudson, Paul Bergstrom, Fred Bateman, Ronnie Minniti, and Ron Tosh

During 2011-13 there were over 60,000 deaths and injuries from improvised explosive devices, spread across 66 countries, with civilians accounting for 81 percent of these casualties. The revenues of the global x-ray security screening industry are forecast to grow from $1.6 billion in 2013 to $2.6 billion by 2020. In response to these trends, the NIST Radiation Physics Division is funded by DHS to fill well-documented gaps in commercial and transportation security that have been highlighted in Executive and Legislative requirements for unprecedented screening in the following security-sensitive venues: checkpoint, checked luggage, cargo, vehicle, and whole-body imaging. This program seeks to facilitate the development of national and international standards and measurement tools to gauge both the technical performance and the safety of screening systems that employ ionizing radiation as a forensic tool for the detection of bulk explosives and other contraband.
Presenter: Paul Bergstrom, Radiation Physics Division, Physical Measurement Laboratory

27. STATISTICAL FRICTION RIDGE ANALYSIS
Soweon Yoon, Hariharan Iyer, Elham Tabassi, and Gregory Fiumara

Although friction ridge patterns are critical evidence in crime scene investigations and highly admissible in courts of law, the analysis methodology has been criticized for lacking a scientific basis for making identification decisions. This is mainly because the analysis and evaluation of latent fingerprint comparisons are based on an examiner's judgment drawn from his or her own expertise, rather than on objective decision criteria developed from scientific methodology and large-scale fingerprint data. In this project, we aim to establish the evidential value of latent fingerprint comparisons, which refers to a statistical measure of the uncertainty of the decisions made on the friction ridge evidence. For a pair consisting of a latent print and an exemplar print, the evidential value of the comparison comprises three components: (i) the probability of observing the given evidence under the hypothesis that the latent print and the potential mate came from the same finger, (ii) the probability of observing the evidence under the hypothesis that the two prints came from two distinct fingers, and (iii) the strength/reliability of the observed evidence. In order to establish the three components of the evidential value of fingerprint comparisons, we will conduct a population study to understand: (i) intra-class variability due to skin distortion, and (ii) inter-class variability, particularly close non-mated fingerprints, in which two impostor fingers share a sufficiently large partial ridge structure. We will then introduce the quality of the latent print to gauge the reliability of the observation and develop objective criteria for making a decision on a latent fingerprint pair.
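The first two components above combine into the familiar likelihood-ratio form of evidential weight; the standard formulation (not specific to this project) is:

    % Likelihood ratio for friction ridge evidence E under the same-source (H_s)
    % and different-source (H_d) hypotheses; the third component, the
    % strength/reliability of the observed features, is reported alongside it.
    \[
      \mathrm{LR} = \frac{\Pr(E \mid H_s)}{\Pr(E \mid H_d)}
    \]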
Presenter: Soweon Yoon
Information Access Division, Information Technology Laboratory

28. STR Sequence Diversity in Population Samples and Nomenclature Guidance for the "Next Generation"
Katherine B. Gettings, Seth A. Faith, Brian Young, Esley Heizer Jr., Kevin M. Kiesler, Francisco Martinez, Elizabeth Montano, Richard Guerrieri and Peter M. Vallone

As STR loci were being identified in the 1990s, various nomenclature systems were developed for different loci, with the primary variation being whether or not to "count" non-repeat bases interspersed in the repeat motif. In 1997, the ISFG issued guidelines on STR nomenclature in an attempt to provide a common currency for information exchange. Historical precedent already existed for some loci and was maintained to avoid confusion, resulting in several commonly used forensic loci having complicated and contradictory nomenclature systems. This has not been an issue within the forensic community, as capillary electrophoresis (CE)-based length analyses are kit-based, with corresponding computer programs that automatically count repeats in a standardized manner. Now, as the costs associated with next-generation sequencing (NGS) methods decline, forensic research laboratories are beginning to explore the increase in information that sequencing STR loci may provide. As a new generation of scientists begins interrogating these loci on a deeper level, an understanding of the historical nomenclature is needed to achieve bioinformatic concordance with existing CE data. Herein, NGS results from 183 population samples exemplify the sequence variation that exists in forensic STR loci. These experimental sequence data give an indication of the level of diversity expected in the larger population and will be used to provide examples of how sub-alleles can improve discrimination and mixture deconvolution in forensic casework. Finally, the different purposes of nomenclature are reviewed, along with examples of possible NGS-compatible nomenclature systems that may meet the needs of the forensic community.
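A toy example of why nomenclature matters: the same sequence can yield different repeat counts depending on whether interrupted stretches and interspersed non-repeat bases are counted. The motif and sequence below are hypothetical and far simpler than real forensic STR loci.

    # Toy illustration of STR repeat counting under different conventions.
    # The motif and sequence are hypothetical, simpler than real loci.
    import re

    motif = "TCTA"
    sequence = "TCTATCTATCTATGTCTATCTA"   # compound repeat with a 2-base interruption

    runs = [len(m.group(0)) // len(motif)
            for m in re.finditer(f"(?:{motif})+", sequence)]
    longest_run = max(runs)                     # count only the longest uninterrupted stretch
    total_repeats = sum(runs)                   # count every full motif copy
    length_based = len(sequence) // len(motif)  # CE-style: infer repeats from total length

    print(longest_run, total_repeats, length_based)   # 3 5 5 -> conventions can disagree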
Presenter: Katherine Gettings
Material Measurement Laboratory

29. TATTOO RECOGNITION TECHNOLOGY – CHALLENGE
Mei Ngan, Patrick Grother, and Michael Garris

Tattoos can provide valuable information on an individual's affiliations or beliefs and have been used for many years to assist law enforcement in the identification of criminals and victims and for investigative purposes. Historically, law enforcement agencies have followed the ANSI/NIST-ITL 1-2011 standard to collect and assign keyword labels to tattoos. This keyword labeling approach comes with drawbacks, which include the limitation of ANSI/NIST class labels in describing the increasing variety of new tattoo designs, the need for multiple keywords to sufficiently describe some tattoos, and subjectivity in human annotation, as the same tattoo can be labeled differently by different examiners. As such, the shortcomings of keyword-based tattoo image retrieval have driven the need for automated image-based tattoo recognition capabilities. The Tattoo Recognition Technology - Challenge (Tatt-C) is being conducted to challenge academic and commercial developers to advance automated image-based tattoo matching technology. The activity will drive and assess the capability of image-based tattoo recognition methods to detect and retrieve tattoos, with the goals of determining which are most effective and whether they are viable for the following operational use cases:

  • Tattoo Similarity - matching visually similar or related tattoos from different subjects
  • Tattoo Identification - matching different instances of the same tattoo image from the same subject over time
  • Region of Interest - matching a small region of interest that is contained in a larger image
  • Mixed Media - matching visually similar or related tattoos using different types of images (e.g., sketches, scanned print, computer graphics, etc.)
  • Tattoo Detection - detecting whether an image contains a tattoo or not

Presenter: Mei Ngan
Information Access Division, Information Technology Laboratory

30. Using an Artificial Dog Nose to learn how canines detect explosives and narcotics
Matthew Staymates, William MacCrehan, Jessica Staymates, Brent Craven and Greg Gillen

An investigation of the external aerodynamics of canine olfaction is presented. Extending the previous work of Settles (2002) and Craven (2010), we have developed an anatomically correct artificial dog nose. The nose is modeled from detailed MRI imaging of a female Labrador Retriever, fabricated using a 3D printer, and sniffs with realistic flow rates and frequencies. Flow visualization experiments using schlieren imaging enable real-time examination of the dog's remarkable ability to attract and sample vapors from extended distances. During exhalation, a turbulent air jet emanates from each nostril and entrains fluid from ahead of the nose, sometimes at a distance of many tens of centimeters. This vapor is then readily available for inhalation, during which the nose acts as a potential-flow inlet. During active sniffing, this exhale/inhale cycle is repeated at a frequency of around 5 Hz. We have learned that the dog is an active aerodynamic sampling system, utilizing fluid dynamics to increase its aerodynamic reach and sample vapors at increasingly large distances. We are in the process of measuring the differences in performance characteristics between a dog that sniffs cyclically and one that only inhales. These measurements require the development of unique vapor-collection and LC-MS chemical detection techniques to evaluate the collection of trace vapors associated with the detection of TNT via aerodynamic sampling. As a form of biomimicry, we are now taking bio-inspired design principles from the dog and applying them to optimize current- and next-generation vapor sampling technology.

G.S. Settles et al., "The External Aerodynamics of Canine Olfaction," in Sensors and Sensing in Biology and Engineering, ed. F.G. Barth, J.A.C. Humphrey, and T.W. Secomb, Springer, Vienna & New York, 2002.

B.A. Craven et al., "The fluid dynamics of canine olfaction: unique nasal airflow patterns as an explanation of macrosmia," J. R. Soc. Interface, 7, 2010.