
Research Programs

The laboratories and offices hosting research opportunities for the 2023 SURF Gaithersburg program are listed below. Please read carefully, as some opportunities are in person while others are virtual, and some offer a choice of format.

Due to the multi-disciplinary nature of NIST's research, students should look through the different opportunities to discover projects and then select first- and second-choice host laboratories. For example, a computer science student may find project opportunities in CTL, EL, and PML in addition to the logical choice of ITL. Similar opportunities exist for other disciplines. Applicants indicate their top two host laboratories in the online questions section of the application.

This list is not necessarily final. Additional projects might be added to the bottom of each laboratory's list during the open application period. Applicants should check back periodically before the application closes, and adjust first and second-choice host laboratories if needed.

2019 SURF Abstract Book

| Discipline             | CTL | EL | ITL | MML/NCNR | PML | Special Projects |
|------------------------|-----|----|-----|----------|-----|------------------|
| Biology & Biochemistry |     |    |     | x        | x   | x                |
| Chemistry              |     | x  |     | x        | x   | x                |
| Computer Science       | x   | x  | x   | x        | x   | x                |
| Engineering            | x   | x  | x   | x        | x   | x                |
| Materials Science      |     | x  |     | x        | x   | x                |
| Mathematics            | x   | x  | x   | x        | x   | x                |
| Physical Sciences      | x   | x  | x   | x        | x   | x                |
| Physics                |     | x  | x   | x        | x   | x                |
| Statistics             | x   | x  | x   |          |     | x                |


Communications Technology Laboratory (CTL)

The CTL serves as an independent, unbiased arbiter of trusted measurements and standards to government and industry and focuses on developing precision instrumentation and creating test protocols, models, and simulation tools to enable a range of emerging wireless technologies. The CTL is also home to the National Advanced Spectrum and Communications Test Network (NASCTN), which provides a neutral forum for addressing spectrum-sharing challenges. Learn more about CTL.


Lotfi Benmohamed, (301) 975-3650, lotfi.benmohamed [at]
Wesley Garey, (301) 975-5190, wesley.garey [at]
David W. Griffith, (301) 975-3512, david.griffith [at]

Research Opportunities

Safety Assessment of Automated Vehicles Using Simulation
Thomas Roth, thomas.roth [at]
An automated vehicle is expected to be competent in planning, monitoring, and performing behaviors that today are executed by human drivers. These behaviors include maintaining a lane under various adverse operating conditions and navigating an intersection with pedestrian traffic, among others. The automated execution of these behaviors must be both safe and predictable to meet the expectations of human drivers and other humans in the driving environment. Simulation software such as MATLAB, CARLA, and SUMO can be used to test vehicle behaviors in a virtual environment prior to field deployment, at low cost and with little risk to humans. This project is focused on the simulation of maintaining a lane under various conditions. The student will be trained in the use of NIST's simulation environment for automated vehicles, design simulation scenarios in MATLAB, and collect data for analysis of whether the scenarios meet human expectations for safe and predictable driving. This work will be performed collaboratively with a team of 3 NIST researchers. The format of this opportunity is flexible; it could be in-person or virtual.
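As a flavor of the kind of post-processing such scenarios involve, the sketch below checks a logged lane-keeping trace against a deviation threshold. The log format, threshold, and function names are illustrative assumptions, not NIST's actual pipeline.

```python
# Hypothetical analysis of a lane-keeping scenario log: given sampled
# lateral offsets from the lane center (meters), flag any excursion
# beyond a chosen comfort threshold.

def max_lateral_deviation(offsets_m):
    """Largest absolute lateral offset (meters) in the trace."""
    return max(abs(x) for x in offsets_m)

def lane_keeping_ok(offsets_m, threshold_m=0.5):
    """True if the vehicle never drifts more than threshold_m from center."""
    return max_lateral_deviation(offsets_m) <= threshold_m

# Example trace sampled at 10 Hz (illustrative values).
trace = [0.02, 0.05, 0.11, 0.31, 0.18, 0.07]
print(lane_keeping_ok(trace))        # True: stays within 0.5 m
print(lane_keeping_ok(trace, 0.25))  # False: the 0.31 m excursion fails
```

A real safety assessment would use richer metrics (time-to-lane-crossing, curvature-dependent bounds), but the same compare-against-expectation structure applies.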

Qt-based GUI for ns-3
Evan Black, [at]
The Wireless Networks Division is developing a simulation platform to model next-generation wireless networks for public safety communication. To complement our platform, we have been developing Graphical User Interface (GUI) tools to visualize the network topology and the data collected during our simulations. This summer, you will develop new capabilities for the GUI based on Qt, an open-source widget toolkit for creating graphical user interfaces. Your work will include the following: development of the Qt interface, OpenGL rendering components, development of statistical models, and/or the creation of 3D models.

Industrial Artificial Intelligence Management and Metrology: Robotic Work Cell Monitoring
Michael Sharp, [at]
This work will present a student with an opportunity to assist in the testing and development of methods and tools designed to evaluate risks and opportunities for AI tools in an industrial production process. 

This project will focus on NIST’s Collaborative Robotic System experimental workcell. The student will work closely under the direction and supervision of robotics experts and NIST AI experts to create and test a process monitoring procedure with both human and AI feedback systems. Experiments and tasks may include, but are not limited to:
-    Observing and recording behavior and anomalies of a robotic work cell
-    Implementing an AI-driven COTS monitoring system on a closed-loop production process
-    Using ‘natural language’ as training, input, and feedback to an AI observer
-    Developing best-practice methods for the human-AI feedback mechanisms
-    Determining measures and methods for assessing trust and trustworthiness of AI monitoring systems
-    Creating measures and metrics for capturing the risks and returns of AI-driven technologies on an industrial manufacturing process

The goal of this project is to help lower barriers and provide intuitive recommendations for U.S. manufacturers looking to enhance their productivity through AI monitoring, controls, and design technologies. [In-person opportunity]

Industrial Artificial Intelligence Management and Metrology: Process Simulation Testing and Development
Michael Sharp, [at]
This work will present a student with an opportunity to assist in the testing and development of novel software tools designed to evaluate risks and opportunities for AI tools in an industrial production process. 

This project will focus on Sim-PROCESD, a production process event simulation tool. The student will work closely under the direction and supervision of software development experts and NIST AI experts to create and test a process simulation for a major government manufacturing center. Experiments and tasks may include, but are not limited to:
-    Implementing an AI-driven COTS monitoring system
-    Developing best-practice methods for the design and creation of high-level digital simulators
-    Determining mechanisms and methods for rapidly executing process redesigns and ‘what if’ scenarios
-    Creating measures and metrics for capturing the risks and returns of AI-driven technologies on an industrial manufacturing process

The goal of this project is to help lower barriers and provide intuitive recommendations for U.S. manufacturers looking to enhance their productivity through AI monitoring, controls, and design technologies. [Flexible format: in-person or virtual opportunity]

Additional SURF 2023 CTL projects are under development. Please see examples of past projects below and check back later for updates.

Interoperability Assessment Analytics of Smart Sensors
Smart sensors can provide real-time data on the status of electrical power grids for real-time monitoring, protection, and control of grid operations, improving the reliability and resilience of smart grids. Smart sensor data exchange and interoperability are major challenges for smart grids, and interoperability measurement and assessment methods are key to achieving and assuring the interoperability of smart sensors. This project will study an interoperability assessment methodology based on standard communication protocols and the process model of interoperability developed in our current smart sensor project, and will design and develop an open-source software tool to assess the interoperability of phasor measurement unit (PMU)-based smart sensors. See page 24 of the 2022 student abstract book for results of this project.

Classification and Inference of Operating and Market Conditions Contributing to Resilience and Failures on the Electric Power System
Researchers with the NIST Smart Grid Program are developing open-source tools using public data resources to model and evaluate the benefits to system performance from enhancements to interoperability and the new operating and control strategies made possible by such investments. Efforts to quantify the value propositions and performance outcomes of interoperability improvements are complicated by the complexity of interactions between devices, systems, and stakeholders; the opacity of competitively sensitive business operations; the diversity of market structures and operating conditions; and the limited opportunity to conduct experiments on live infrastructure systems. The conditions confronting the electric power system are as varied as the communities served by this critical infrastructure. Researchers seeking to value returns on infrastructure investment need flexible tools for evaluating the efficacy of operating strategies and the propensity for operating and market conditions to contribute to system resilience or failure. As a result, quantitative research has sought to build tools for the construction of plausible counterfactual scenarios through which economic analysis of alternative strategies may be conducted. One such tool currently under development at the NIST Smart Grid Program is the Generator Fleet Characteristics Model (GFCM), which consists of a set of MATLAB functions for the wrangling and analysis of public data on the balancing authorities that comprise the electric power system. In the interest of improving the counterfactual analyses to be produced using the GFCM, the Smart Grid Program has an opportunity for a SURF researcher to develop a MATLAB module for classification and inference with respect to GFCM model outputs, which detail the operating and market conditions present on the electric power system. 
A successful work effort will be incorporated into ongoing GFCM development and applications, and may therefore contribute to future modeling efforts and publications on the economics of emerging technology and operating strategies. The classification and inference module to be produced may involve methods and tools from computer science, engineering, economics, or the social sciences, including machine learning and/or automated algorithms for detection and classification of patterns within multivariate data structures. Consequently, students with strong computer skills, interest in these subject areas, and experience working with MATLAB are encouraged to apply. See page 17 of the 2022 student abstract book for results of this project.

Data Tracing Web Platform
The recent and significant digitalization of industries has been driven by numerous potential benefits, from better physical goods to faster services. In this digital world, data becomes the key driver of many important decisions, processes, and flows within and across organizations. Unfortunately, this move has exposed organizations to numerous new cyber threats that need to be addressed before they are exploited. One major threat, known as data tampering, discreetly modifies data to corrupt the processes and decisions that rely on it, and can quickly propagate within and across organizations. Because this tampering cannot always be prevented, understanding an organization's exposure to such a threat is key to properly responding to it. This project focuses on improving our next-gen cybersecurity tool to model, trace, and analyze data flows in order to understand, control, and reduce exposure to data tampering in complex environments. This project will sharpen your programming skills and expand your cybersecurity knowledge. See page 21 of the 2022 student abstract book for results of this project.
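The core idea of tracing exposure can be sketched very simply: model data flows as a directed graph and compute everything reachable from a tampered dataset. The node names and graph below are made up for illustration; they are not from the NIST tool.

```python
from collections import deque

# Illustrative data-flow graph: each key feeds the artifacts in its list.
flows = {
    "sensor_feed":     ["quality_db"],
    "quality_db":      ["dashboard", "supplier_report"],
    "supplier_report": ["partner_erp"],
    "dashboard":       [],
    "partner_erp":     [],
}

def exposure(graph, tampered):
    """Return all nodes reachable from the tampered node (breadth-first)."""
    seen, queue = set(), deque([tampered])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

print(sorted(exposure(flows, "quality_db")))
# → ['dashboard', 'partner_erp', 'supplier_report']
```

Real data-tampering analysis layers provenance, trust levels, and propagation rules on top of this reachability skeleton.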

Integrated Standards Publication Environment
The STEP standard (ISO 10303: Standard for the Exchange of Product model data) is developed and published in what was a modern, cutting-edge XSL/XSLT document publication system … 20 years ago. In that time, STEP has remained the most widely used format for data exchange and interoperability between Computer-Aided Design (CAD), Manufacturing (CAM), Analysis (CAE), and Inspection (CMM) software. The standard is regularly updated with new features, but the brittle publication tool chain is increasingly causing delays. The ISO Subcommittee responsible for this standard is embarking on an ambitious redesign of the publication environment that will automate the complex workflow and build in extensive error checking. We are seeking a student to participate on that team and contribute to the development and implementation of a modern, state-of-the-art, open-source publication system based on software engineering best practices that is easier to install, use, and maintain. This project is an opportunity to gain experience working on a sophisticated software engineering project alongside professional software developers, to build a turnkey solution for standards developers.

Depending on project status and your skills and interests, this project offers a variety of tasks, such as:
-    CVS-to-Git migration support
-    Management of Git repo contents and tagging
-    Migrating image maps to SVGs
-    Migrating legacy documents to Annotated EXPRESS format (similar to JavaDocs)
-    Extracting prior editions of standards from CVS to enable auto-generation of a document Change History
-    Building tools to generate the STEP Resource Library and STEP Application Protocols
-    Iterating on and refining the publication workflow

See page 20 of the 2022 student abstract book for results of this project.


Engineering Laboratory (EL)

The EL promotes the development and dissemination of advanced manufacturing and construction technologies, guidelines, and services to the U.S. manufacturing and construction industries through various activities in areas such as fire prevention and control; national earthquake hazards reduction; national windstorm impact reduction; national construction safety teams; and building materials and structures. Learn more about EL.


Cartier P. Murrill, (301) 975-5738, cartier.murrill [at]

Research Opportunities

Online Documentation for FDS - Fire Research Division
Randall McDermott, randall.mcdermott [at]
Begin the process of developing online documentation for the NIST Fire Dynamics Simulator (FDS). All documents are currently posted in PDF format, but most modern software packages have online documentation generated by a tool such as Sphinx. This and alternatives will be explored. In particular, the ability to transfer the LaTeX math documentation will be evaluated.
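One small piece of that LaTeX-transfer question can be sketched directly: Sphinx's reStructuredText sources render display math via a ``.. math::`` directive, so equations lifted from the existing LaTeX manuals need only light re-wrapping. The helper below is an illustration of that idea, not part of the FDS tool chain.

```python
# Wrap a LaTeX display equation in a reStructuredText ``.. math::``
# directive so Sphinx can render it in online documentation.

def latex_to_rst_math(latex, label=None):
    lines = [".. math::"]
    if label:
        lines.append(f"   :label: {label}")
    lines.append("")                    # blank line before the directive body
    lines.append(f"   {latex.strip()}")
    return "\n".join(lines)

# Example: the continuity equation, typical of the math in the FDS manuals.
print(latex_to_rst_math(
    r"\frac{\partial \rho}{\partial t} + \nabla \cdot (\rho \mathbf{u}) = 0",
    label="continuity"))
```

The harder parts of a real migration are custom macros, cross-references, and figure handling, which is exactly what the project would explore.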

Digital Twin Development for a Robot Arm - Systems Integration Division
Guodong Shao, guodong.shao [at]
A laboratory scale robot workcell for digital twin research is being established. Among other equipment, the workcell comprises two UR5e robots. A digital twin will be developed for the robot arms using data collected using the MTConnect standard. The digital twin will help monitor, analyze, and optimize the manufacturing process and also help test methods and tools that support robot workcell automation. The SURF student will work with the NIST researchers to analyze the collected data, build and validate a digital twin, and integrate the digital twin with other systems.
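MTConnect data arrives as XML documents streamed from the equipment. The snippet below parses a simplified stand-in for such a response (real MTConnect documents are namespaced and far richer; the element names and values here are illustrative) into the kind of joint-angle record a digital twin of the UR5e could consume.

```python
import xml.etree.ElementTree as ET

# Simplified stand-in for an MTConnect sample response.
xml_doc = """
<MTConnectStreams>
  <DeviceStream name="UR5e-1">
    <Angle dataItemId="j1" timestamp="2023-06-01T12:00:00Z">12.5</Angle>
    <Angle dataItemId="j2" timestamp="2023-06-01T12:00:00Z">-48.0</Angle>
  </DeviceStream>
</MTConnectStreams>
"""

def joint_angles(doc):
    """Extract joint angles (degrees) keyed by data item id."""
    root = ET.fromstring(doc)
    return {a.get("dataItemId"): float(a.text) for a in root.iter("Angle")}

print(joint_angles(xml_doc))   # {'j1': 12.5, 'j2': -48.0}
```

In practice the twin would poll the MTConnect agent continuously and feed these samples into its state model for monitoring and analysis.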

Hierarchical Data Structure Development to Support Digital Thread and Digital Twin Applications - Systems Integration Division
Guodong Shao, guodong.shao [at]
Digital twins can digitally represent a physical manufacturing element, e.g., a part. However, to represent a part being manufactured, data from multiple stages of the product lifecycle may be needed. In each of these stages, specific data standards and formats may be used, and it is challenging to relate the different information to a part. A digital thread could help link all the data to support digital twin development. Currently, no framework or structure can easily and efficiently support the storage, representation, and exchange of such data. This summer, the SURF student will work with the NIST researchers on novel data storage solutions, such as Hierarchical Data Format (HDF5), to represent product lifecycle data in support of digital twin applications in manufacturing; survey existing work in digital threads and digital twins; develop a use case of the selected structure to represent data from product lifecycle stages; and demonstrate digital twin applications using the data structure.
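The appeal of a hierarchical format is that one tree can hold every lifecycle stage under a single part. The sketch below uses nested dicts to show the layout (the same tree maps directly onto HDF5 groups and datasets via a library such as h5py); the group names and values are invented for illustration.

```python
# One part, with a group per lifecycle stage (illustrative layout).
part_record = {
    "part_001": {
        "design":      {"cad_format": "STEP", "revision": "B"},
        "fabrication": {"machine": "UR5e-cell", "spindle_rpm": [1200, 1250]},
        "inspection":  {"cmm_points": [[0.0, 0.0, 0.1], [1.0, 0.0, 0.1]]},
    }
}

def lookup(tree, path):
    """Resolve an HDF5-style path like 'part_001/design/revision'."""
    node = tree
    for key in path.strip("/").split("/"):
        node = node[key]
    return node

print(lookup(part_record, "/part_001/design/revision"))   # B
```

A digital thread then becomes a set of such paths linking a part's design, fabrication, and inspection data, which is what makes cross-stage queries cheap.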

Study of Degradation Mechanism and Failure Mode of Polymeric Components Used in Photovoltaics - Materials Structural Systems Division
Xiaohong Gu, [at]
Understanding the degradation modes of polymeric components used in solar cells during service is critical to the development and assurance of photovoltaic technology. In this study, the degradation of polymeric backsheets aged under accelerated laboratory conditions and in fielded modules under different climates will be analyzed using spectroscopic and mechanical techniques such as attenuated total reflection Fourier-transform infrared spectroscopy (ATR-FTIR) and tensile testing. The mechanisms of chemical and mechanical degradation will be studied. The results will be used to understand the root causes of backsheet failure and provide a scientific basis for material selection and product development.

Analyzing and Applying Data from Irradiance Database for Indoor Energy Harvesting - Building Energy and Environment Division
Andrew Shore, andrew.shore [at]
Internet-of-Things devices are becoming increasingly common in the home. Harvesting the available ambient indoor light energy using photovoltaic (PV) mini-modules can help reduce electrical needs and power these devices. To better assess the year-round feasibility of this energy harvesting approach, the PV Characterization Lab is measuring the spectrum and intensity of the available light at three different locations in the Net-Zero Energy Residential Test Facility (NZERTF). The student intern will help analyze this data to determine seasonal effects and the typical available spectra and intensities at each location. The student will have a further opportunity to synthesize and test a model spectrum on various PV mini-modules in our indoor light test setup to study their effectiveness and ability to power a wireless device over time.
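The seasonal-effects analysis boils down to summarizing measured intensities per season and finding the limiting case for harvesting. The numbers below are made up purely to show the shape of that calculation; they are not NZERTF measurements.

```python
# Fabricated irradiance readings (arbitrary units) at one indoor location.
readings = {
    "winter": [18.0, 22.5, 20.1],
    "spring": [35.2, 41.0, 38.4],
    "summer": [55.3, 60.1, 58.0],
    "fall":   [30.0, 27.8, 29.5],
}

def seasonal_means(data):
    """Average reading per season."""
    return {season: sum(v) / len(v) for season, v in data.items()}

means = seasonal_means(readings)
# The season with the least available light bounds year-round feasibility.
print(min(means, key=means.get))   # winter
```

A design that powers its device through the worst season works year-round, which is why the minimum, not the average, is the interesting summary.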

Measurements of Thermal Properties of Organic Phase Change Materials Modified by Thermally Conductive Fillers - Building Energy and Environment Division
Jae Hyun Kim, [at]
Phase Change Materials (PCMs) are used as latent heat storage sources of thermal energy through a phase change. For PCMs, magnitudes of latent heat of fusion and thermal conductivity are important governing factors for storing and releasing of thermal energy efficiently. This project will investigate thermal properties of a selected organic PCM with different ratios of fillers. Changes in the thermal behaviors and thermal conductivities of the modified PCM will be characterized. The experimental results will help us understand the relationship between heat storage behaviors and thermal conductivities of the modified composite organic PCMs.

Study of Pyrrhotite Reactions in Concrete - Materials Structural Systems Division
Stephanie Watson, stephanie.watson [at]
Damage to concrete structures in building construction in Connecticut was attributed to the iron sulfide mineral pyrrhotite, which results in decomposition and structural cracking. Some states passed building codes to prevent this issue, but there are no standardized methods or concentration limits to assess pyrrhotite abundance. NIST developed reference materials (RMs) to provide an accurate, consistent pyrrhotite analysis in concrete. This project focuses on optimizing an X-ray fluorescence method to quantify sulfide and sulfate species in RMs compared to foundation specimens. An experimental design to create a model aggregate to better understand pyrrhotite reaction mechanisms and rates will also be investigated.

3D Additive Manufacturing of Cement Based Materials - Materials Structural Systems Division
Nicos Martys, nicos.martys [at]
3D printing of concrete is a relatively new approach to the placement of concrete and other cement-based materials. This project will evaluate the stability of different printed structures given the viscoelastic properties of the printing fluid, whose properties may change with time. Such information may help provide guidelines for the suitability of certain structures for 3D printing as well as the printing process.

Understanding Time-dependent Behavior of Alternative Cementitious Mixtures for Additive Manufacturing - Materials Structural Systems Division
Rachel E. Cook, rachel.cook [at]
Concrete, a ceramic composite, is the most-utilized man-made material worldwide. With cement production responsible for 9 % to 10 % of global anthropogenic CO2 emissions, research focused on improving the sustainability and resiliency of U.S. infrastructure is imperative. In this project, a student will have the opportunity to study the effect of recycled plastic materials on the time-dependent behavior of sustainable cement blends for the purpose of additive manufacturing (AM). Experimental work will include isothermal calorimetry and small amplitude oscillatory shear (SAOS) measurements. The results of this work will help to improve understanding of sustainable mixtures generally and for the purposes of AM.

Study of Fire-Affected Concrete - Materials Structural Systems Division
Cody M. Strack, cody.strack [at]
Climate change is increasing the frequency of fire-related events, triggering more instances of concrete being exposed to conditions that can reduce its expected service life. Many studies use furnaces and other artificial means to study fire-affected concrete, whereas actual fire events can lead to irregular distribution of heat compounded by the inherent heterogeneity of concrete. This study will utilize NIST’s National Fire Research Laboratory to simulate real fire conditions within concrete mixes of various compositions. Samples will be analyzed via microscopy, image analysis, and mechanical tests to evaluate the extent of damage and link fire conditions to expected structural performance.

Technical Language Processing for Improved Document Annotation and Community Resilience - Materials Structural Systems Division
Juan F. Fung, juan.fung [at]
Vast amounts of technical data exist in published documents, including those used to conduct community resilience and climate adaptation planning. The current process of reading and summarizing such documents by hand requires expert judgment and is incredibly time-consuming, which limits our ability to create large-scale datasets. The goal of this project is to harness text mining and natural language processing techniques to create a semi-automated, human-in-the-loop tool to assist domain experts with annotating technical documents. This tool will be used to summarize current climate adaptation and community resilience approaches nationwide and assist other researchers with similar data challenges.
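One elementary building block of such a tool is surfacing candidate annotation terms automatically for an expert to confirm. The sketch below does this with simple term frequency over a toy sentence; it is a deliberately minimal stand-in for the project's actual NLP methods, and the stopword list is ad hoc.

```python
import re
from collections import Counter

# Tiny, ad hoc stopword list for the demonstration.
STOPWORDS = {"the", "and", "of", "to", "a", "in", "for", "is", "are", "with"}

def keywords(text, k=3):
    """Top-k non-stopword terms by frequency, as annotation candidates."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    return [w for w, _ in counts.most_common(k)]

doc = ("Flood mitigation and flood insurance are central to resilience "
       "planning; resilience goals guide flood recovery investments.")
print(keywords(doc))   # 'flood' and 'resilience' rank first
```

A human-in-the-loop tool would present such candidates alongside the source passage and learn from the expert's accept/reject decisions.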

Estimating the Coarse Aggregate Sieve Distribution in Concrete from Observations Made on a Cut Surface - Materials Structural Systems Division
Kenneth A. Snyder, kenneth.snyder [at]
Forensic situations can arise in which a researcher would like to determine the ASTM C 33 sieve distribution that was used for a particular concrete mixture. Alternatively, one might want to know whether two different concrete mixtures were made using the same coarse aggregate sieve distribution. For this project, plane-cut surfaces of hardened concrete will be studied to characterize the coarse aggregate size distribution for comparison to other distributions. The student will perform statistical tests to determine the sample size required to distinguish two similar distributions, and will study the qualitative “distance” two distributions need to be apart before they can be distinguished.
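One standard way to put a number on the "distance" between two size distributions is the two-sample Kolmogorov-Smirnov statistic, the largest gap between their empirical CDFs. The implementation and the diameter values below are illustrative; they are not the project's prescribed method or data.

```python
def ks_statistic(a, b):
    """Max vertical gap between the empirical CDFs of two samples."""
    a, b = sorted(a), sorted(b)
    points = sorted(set(a) | set(b))
    cdf = lambda xs, t: sum(x <= t for x in xs) / len(xs)
    return max(abs(cdf(a, t) - cdf(b, t)) for t in points)

# Illustrative measured aggregate diameters (mm) from two cut surfaces.
mix1 = [9.5, 12.5, 12.5, 19.0, 25.0]
mix2 = [9.5, 12.5, 19.0, 19.0, 25.0]
print(ks_statistic(mix1, mix2))   # ~0.2
```

The sample-size question in the project description is then: how many measured particles are needed before a gap of this size is statistically distinguishable from sampling noise?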

Generating Weathered Microplastic Particles Using the NIST SPHERE - Materials Structural Systems Division 
Li Piin Sung, li-piin.sung [at]
This project will focus on generating weathered plastic particles with the NIST SPHERE, where macro-samples or films of plastics are UV-weathered while immersed in water (or simulated ocean water), under high humidity, or under dry conditions. ATR-FTIR and laser scanning confocal microscopy will be used to characterize the chemical properties of the UV-degraded surface and the morphology (size and distribution) of nano-/micro-plastic particles as a function of UV exposure time. The outcome of this project would provide a spectral (FTIR) database of weathered plastics, particle sizes of the microparticles at various temperatures, and a supply of more relevant, weathered microplastic particles.

Using Resonant Frequency Testing Methods to Characterize the Extent of Damage in Concrete Cores Taken From an Existing Building - Materials Structural Systems Division
Kenneth A. Snyder, kenneth.snyder [at]
A forensic study of an in-service concrete structure typically involves collecting concrete cores for mechanical testing (e.g., compressive strength). Although the values are used to characterize the concrete properties, there can arise situations where the observed properties are much lower than expected. There can be a number of reasons for this to occur: 1) the concrete has an inherently lower strength due to actions/steps taken during construction; 2) the concrete has been damaged due to chemical attack (e.g., corrosion, sulfate attack) or due to unanticipated loads (e.g., earthquakes, hurricanes). How to incorporate the measured value into the overall distribution of measurements requires knowing which category the sample is in.

ASTM C 215 resonant testing of cores can be used to estimate the Young's and shear moduli of a sample. This project will extend this test by determining whether the nature of the resonance is correlated to damage within the core; specifically, how well isolated the resonant frequency is with respect to the other frequencies present.
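As a back-of-envelope illustration of how resonance yields a modulus: for the fundamental longitudinal mode of a free-free slender rod, f = v/(2L) with v = sqrt(E/rho), so E = 4 rho L^2 f^2. This simplified rod formula omits the geometry correction factors the full ASTM C 215 procedure applies, and the numbers below are invented.

```python
def dynamic_modulus(rho_kg_m3, length_m, freq_hz):
    """Dynamic Young's modulus (Pa) from the fundamental longitudinal
    resonance of a free-free slender rod: E = 4 * rho * L^2 * f^2."""
    return 4.0 * rho_kg_m3 * length_m**2 * freq_hz**2

# A 0.3 m concrete core, density 2400 kg/m^3, resonating near 6.7 kHz:
E = dynamic_modulus(2400, 0.3, 6700)
print(f"{E / 1e9:.1f} GPa")   # a plausible dynamic modulus for sound concrete
```

Damage tends to lower this modulus and, as the project hypothesizes, may also broaden or split the resonance peak, which is what the isolation analysis would quantify.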

The Effect of Wavelength and Intensity in the UV Region on Polymer Degradation - Materials Structural Systems Division
Deborah Jacobs, debbie.jacobs [at]
The new 0.5-m NIST SPHERE is now operational. The new device must be tested thoroughly before the technology can be transferred to industry. Here, the effect of lower UV wavelengths on the degradation pathway and their impact on the reciprocity law will be investigated. The student will run experiments on a polymer under different exposure conditions by varying filters to determine if the same degradation mechanism is followed independent of which filter is used. Analysis methods will include Fourier Transform infrared (FTIR) and dynamic mechanical analysis (DMA) to monitor the changes in chemical and mechanical properties.

Fire Modeling Software Verification and Validation - Fire Research Division
Kevin McGrattan, kevin.mcgrattan [at]
NIST develops and maintains two computer fire models. One, the Fire Dynamics Simulator (FDS), is a computational fluid dynamics model. Its documentation includes separate verification and validation manuals that describe the results of hundreds of calculations and comparisons to experimental test data or analytical solutions. This database of V&V cases is expanding, and there is a need to incorporate new data and new cases into the repository. The project shall involve working with experimental fire test data, running numerical simulations, and comparing the results of both. A particular emphasis is on quantifying the uncertainty of the model.
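A minimal version of one V&V comparison looks like this: compute the relative error of a predicted peak against the measured peak, and ask whether it falls within the experimental uncertainty band. The metric choice, values, and 10 % band below are illustrative assumptions, not the FDS V&V suite's actual criteria.

```python
def relative_error(predicted, measured):
    """Signed relative error of a model prediction against experiment."""
    return (predicted - measured) / measured

def within_uncertainty(predicted, measured, rel_unc=0.1):
    """True if the prediction lies inside the experimental uncertainty band."""
    return abs(relative_error(predicted, measured)) <= rel_unc

# Illustrative peak upper-layer temperatures (deg C), model vs. experiment.
peak_temp_model, peak_temp_exp = 612.0, 580.0
print(f"{relative_error(peak_temp_model, peak_temp_exp):+.1%}")   # +5.5%
print(within_uncertainty(peak_temp_model, peak_temp_exp, rel_unc=0.10))
```

Repeating this over hundreds of cases, and tracking the distribution of errors rather than single values, is what turns such comparisons into a statement of model uncertainty.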

Additive Manufacturing Part Inspection Data Registration Software Development - Systems Integration Division
Shaw Feng, shaw.feng [at]
The number and types of sensors used for in-process monitoring of laser powder bed fusion processes for metal Additive Manufacturing (AM) are increasing. Each sensor is independent of the others. The datasets from different sensors have different reference frames for reporting the collected data. Furthermore, post-process inspection data adds another layer of complexity that needs to be addressed. AM data registration is needed to help resolve the issue of monitoring the powder fusion processes and predicting the material properties in the part. This summer research work involves developing fundamental algorithms and a software tool for processing and registering data, using available X-ray Computed Tomography data as an example. Functions of the tool include image segmentation, feature extraction, and defect identification. Some programming and image processing skills are required, for example, Python, C++, JavaScript, ImageJ, or MATLAB. The applicant must be interested in additive manufacturing or 3D printing.
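To give a feel for the segmentation and defect-identification steps, the toy example below thresholds a tiny grayscale "slice" and counts connected low-intensity regions as pores. Real work would operate on full XCT volumes with tools like ImageJ or OpenCV; the array values here are invented.

```python
def segment(image, threshold):
    """Binary mask: True where intensity falls below threshold (a pore)."""
    return [[px < threshold for px in row] for row in image]

def count_regions(mask):
    """Count 4-connected True regions via iterative flood fill."""
    rows, cols, seen, regions = len(mask), len(mask[0]), set(), 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and (r, c) not in seen:
                regions += 1
                stack = [(r, c)]
                while stack:
                    y, x = stack.pop()
                    if ((y, x) in seen
                            or not (0 <= y < rows and 0 <= x < cols)
                            or not mask[y][x]):
                        continue
                    seen.add((y, x))
                    stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return regions

# Illustrative 3x4 grayscale slice: dense material ~200, pores < 100.
slice_ = [[200, 210,  40, 205],
          [198,  35,  30, 207],
          [201, 202, 199,  50]]
print(count_regions(segment(slice_, 100)))   # 2 pore regions
```

Registration then means mapping such per-slice defects into the common coordinate frame shared with the in-process sensor data.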

Performance of Schools, Shelters, and Hospitals in Hurricane Maria - Materials Structural Systems Division
Camila Young, camila.young [at]
In February 2018, the NIST director established a National Construction Safety Team (NCST) to conduct a technical investigation of the effects of Hurricane Maria on Puerto Rico. The goal of this summer research project is to support the NCST investigation by collecting and analyzing information on the performance of critical buildings in Hurricane Maria, to evaluate the adequacy of existing design standards and codes for these facilities. The SURF student will work with the NIST mentor to mine school, shelter, and hospital information from various data sources (e.g., satellite and aerial imagery, NOAA datasets, damage reports, news media reports, etc.) and contribute to a geodatabase. This damage data will then be analyzed, along with information on the wind hazard, to explore how the hazard levels encountered at the facility site impacted damage and loss of function of the facility.

Assessment of Polymer Composite Degradation During Long-term Use in Outdoor Infrastructure Applications - Materials Structural Systems Division
David Goodwin, david.goodwin [at]
Data and test methods are currently lacking to assess the health and performance of polymer composite materials used as retrofits and protective coatings on buildings and infrastructure in outdoor environments. Accelerated laboratory tests and outdoor exposure of polymer composites help to assess timelines for loss of functionality during service life and inform replacement schedules. Measurement methods, including chemical and microscopic methods, will be used to track degradation in both accelerated laboratory tests and outdoor tests. Fiber-reinforced polymer (FRP) composite samples weathered outdoors as well as FRP and polymer nanocomposites degraded under accelerated ultraviolet exposure and freeze/thaw cycling will be assessed.

Understanding Strength Development in 3D Printed Cementitious Materials - Materials Structural Systems Division
Aron Newman, aron.newman [at]
Interest in replacing conventional form work and concrete placement with 3D printing of cementitious materials has increased in the last several years. The stability of these structures needs to be better understood, particularly at early age, where there is a risk of creep that can compromise structural integrity. This research project will measure the frequency response of hardened cement pastes through dynamic mechanical analysis. The measured storage modulus and creep from this method will be compared to the indentation response using a microhardness tester that records a load-displacement curve to evaluate modulus and creep. Measurement values from these two methods can potentially provide guidance on optimizing cement mix designs for building robust 3D printed structures.

Laser Powder Bed Fusion Additive Manufacturing Data Analysis - Systems Integration Division
Yan Lu, [at]
This project will investigate the correlation between in-process monitoring data and ex-situ measurements for laser powder bed fusion (PBF) additive manufacturing (AM). The results will enable the fusion of multiple-modality in-situ data for part quality prediction. The student will analyze several data sets published by NIST, including high-speed melt pool images and high-resolution layerwise images from the “Overhang Part X4” build using the NIST Additive Manufacturing Metrology Testbed, as well as the X-ray CT (XCT) data of the as-built parts. Both classic machine learning and deep learning methods should be investigated to establish the relationship between in-process measurements and XCT data.
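The correlation step can be illustrated with a minimal sketch: fit a least-squares line relating a single in-situ melt-pool feature to measured porosity. All feature values and porosity numbers below are invented for illustration; the actual project works with image data and far richer models.

```python
# Toy sketch: correlate an in-situ melt-pool feature with ex-situ XCT porosity.
# The feature values and porosity numbers below are made up for illustration.

def linear_fit(xs, ys):
    """Ordinary least-squares fit y = a*x + b, returning (a, b)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

# Hypothetical per-layer data: mean melt-pool intensity vs. measured porosity (%).
melt_pool_intensity = [0.82, 0.91, 1.05, 1.10, 1.24]
xct_porosity = [0.40, 0.35, 0.22, 0.20, 0.09]

slope, intercept = linear_fit(melt_pool_intensity, xct_porosity)
predicted = slope * 1.00 + intercept  # porosity predicted for a new layer
```

A deep learning approach would replace the scalar feature with learned representations of the melt-pool and layerwise images, but the evaluation question is the same: how well do in-process signals predict the XCT ground truth.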

Semantic Models for Embedded Intelligence of Building Operation - Building Energy and Environment Division
Parastoo Delgoshaei, parastoo.delgoshaei [at]
Semantic Web technologies promise new opportunities for the efficient management of information and knowledge in the built environment. Semantic models of buildings lower the cost of analytics and enhance intelligent control across buildings. This project aims to use a set of software tools for creating RDF models of Building Automation Systems (BASs) for heating, ventilation, and air conditioning (HVAC), lighting, and shading devices according to the evolving ASHRAE 223 Semantic Data Model for analytics and automation applications in buildings.
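As a rough illustration of the semantic-model idea, the sketch below stores building relationships as subject-predicate-object triples and queries them. The class and property names are hypothetical placeholders, not actual ASHRAE 223 vocabulary, and a real implementation would use an RDF library rather than plain tuples.

```python
# Minimal sketch of an RDF-style triple store for building equipment.
# The class and property names below are illustrative placeholders,
# not actual ASHRAE 223 vocabulary.

triples = [
    ("vav-1",    "a",        "TerminalUnit"),
    ("vav-1",    "feeds",    "zone-101"),
    ("sensor-7", "a",        "TemperatureSensor"),
    ("sensor-7", "observes", "zone-101"),
]

def query(s=None, p=None, o=None):
    """Return all triples matching the given pattern (None = wildcard)."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# Which devices relate to zone-101?
devices = {s for s, _, _ in query(o="zone-101")}
```

An analytics application can answer such questions against any building whose BAS is modeled this way, which is what lowers the cost of porting analytics across buildings.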

Development of Non-Destructive Polymer Degradation Measurements in Photovoltaic Modules - Materials Structural Systems Division
Ashlee R. Aiello, ashlee.aiello [at]
Prevention and understanding of early failure mechanisms in photovoltaic modules is needed to economize solar energy. The polymeric components in photovoltaic modules degrade during outdoor exposure, which can result in multiple failure mechanisms including cracking, delamination, and discoloration. While many characterization methods are well suited for polymer degradation studies, they require disassembly of the module and are limited to post-mortem analysis. This project will focus on the development of new nondestructive measurements to study polymer degradation in either fully assembled modules or under in-situ conditions (e.g. during exposure to temperature, humidity, or mechanical strain).

Real-time Pose Measurement to Support Robot Inspection - Intelligent Systems Division
Helen Qiao, guixiu.qiao [at]
The use of robots in high-precision applications, such as real-time robot inspection, has been increasing. The capture, analysis, and real-time feedback of inspection results help users make timely, well-informed decisions. In robot inspection, the robot serves as a carrier for the inspection sensor. The robot's accuracy needs to be assessed, and its dynamic motions need to be measured, to satisfy the requirements of registering inspection data. The robot arm's position and orientation information are used to register the sensor data for full 3-D analysis. The National Institute of Standards and Technology (NIST) has developed a novel, patented smart target to support the precise measurement of a robot's position and orientation. The smart target is mounted on the object (e.g., the end effector or tool of a robot arm) whose accuracy is to be ensured, in order to measure and track the object's six-dimensional (6-D) position and orientation. The smart target consists of fixed-wavelength light pipes and two high-precision rotary gimbals. The light pipe structure defines a coordinate frame that contains 6-D information. A single measurement of the smart target can output the pose of the object (a 6-D measurement: x, y, and z position and roll, pitch, and yaw orientation).
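To make the 6-D pose concrete, the sketch below composes a rotation from roll, pitch, and yaw and applies the full pose to a sensor-frame point, which is the essence of registering sensor data into a common frame. This is a simplified illustration of pose mathematics, not the smart target's measurement procedure.

```python
import math

def rpy_to_matrix(roll, pitch, yaw):
    """3x3 rotation matrix from roll (x), pitch (y), yaw (z), composed as Rz*Ry*Rx."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def apply_pose(pose, point):
    """Transform a sensor-frame point into the world frame using a 6-D pose."""
    x, y, z, roll, pitch, yaw = pose
    R = rpy_to_matrix(roll, pitch, yaw)
    return tuple(R[i][0] * point[0] + R[i][1] * point[1] + R[i][2] * point[2] + t
                 for i, t in enumerate((x, y, z)))

# A 90-degree yaw about z maps the sensor's +x axis onto world +y.
p = apply_pose((0, 0, 0, 0, 0, math.pi / 2), (1, 0, 0))
```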

Risk and Uncertainty in Community Resilience Planning - Applied Economics Office
Christina Gore, christina.gore [at]
A flexible methodology is needed to value the socioeconomic impacts and avoided costs, and to evaluate the expected return on investment, of community resource allocation decisions aimed at reducing future economic damages from disasters, while accounting for uncertainty and behavioral influences on decision making, such as risk profiles and learning. To develop this framework, complete documentation of uncertainty and risk tolerance must first be compiled. This documentation is likely to include heuristics and other behavioral ways that individuals and communities make decisions, especially about community resilience in the built and natural environments when faced with a diverse set of resource allocation alternatives.

Joint Cognitive Work to Formulate Business Transactions - Systems Integration Division
Peter Denno, peter.denno [at]
We are interested in enabling non-programmers at small manufacturers to formulate for themselves the information technology needed to transact with their large corporate customers. The general idea is to do this task as joint (human/AI) cognitive work (JCW). We developed a "mapping language," RADmapper, that facilitates JCW by analyzing samples expressed in its own abstract syntax trees. We seek someone with strong math or CS skills to use RADmapper to generate and characterize language samples. Example characterization might include identifying paths from target data back to source data, or mathematical structures such as natural transformations.


Information Technology Laboratory (ITL)

The ITL focuses on information technology (IT) measurements, testing, and standards and is a globally recognized and trusted source of high-quality, independent, and unbiased research and data. As a world-class measurement and testing laboratory encompassing a wide range of areas of computer science, mathematics, statistics, and systems engineering, ITL’s strategy is to maximize the benefits of IT to society through a balanced IT measurement science and standards portfolio of three main activities: fundamental research in mathematics, statistics, and IT; applied IT research and development; and standards development and technology transfer. Learn more about ITL.


Yolanda Bursie, (301) 975-6738, yolanda.bursie [at]

Research Opportunities

SURF 2023 ITL projects are under development. For results of past projects, please see excerpts from the 2022 student abstract book below. Check back later for updates.

Multimodal Image Registration for Fluorescence Guided Surgery
Abstract: Fluorescence guided surgery is an important tool for surgeons to accurately identify and remove cancerous tumors. This work is motivated by the spatial misalignment of real-time streaming brightfield images and fluorescent images with tumor indications acquired by a fluorescence guided hand-held imaging system during head and neck surgery. The spatial misalignment of brightfield and fluorescent images poses challenges for a surgeon who is deciding where to remove tumor tissue, with significant consequences for the patient.

The problem of spatially aligning (registering) two multimodal images involves designing an automated method for estimating registration transformation parameters. The challenges include (a) achieving high spatial accuracy for tumor tissue removal, (b) overcoming limitations of existing algorithms that are optimized for monomodal images and assume many spatial features in both modalities, and (c) decoupling the intermodality transformation and registration tasks in multimodal algorithms.

We approached the multimodal registration problem by (a) creating ground truth data by manually registering paired images, (b) evaluating the effectiveness of traditional image registration algorithms such as SIFT, and (c) training and testing an unsupervised generative adversarial network (GAN) called NeMAR. The NeMAR method consists of a spatial transformation (registration) network, an intermodality translation network, and a discriminator network.
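The simplest form of registration parameter estimation, fitting a rigid 2-D transform (rotation plus translation) to manually matched point pairs, can be sketched in closed form as below. This is a toy example of the estimation step; the actual multimodal pipeline is far more involved.

```python
import math

def fit_rigid_2d(src, dst):
    """Least-squares rotation angle and translation mapping src points onto dst."""
    n = len(src)
    csx = sum(p[0] for p in src) / n
    csy = sum(p[1] for p in src) / n
    cdx = sum(p[0] for p in dst) / n
    cdy = sum(p[1] for p in dst) / n
    sdot = scross = 0.0
    for (sx, sy), (dx, dy) in zip(src, dst):
        ax, ay = sx - csx, sy - csy   # centered source point
        bx, by = dx - cdx, dy - cdy   # centered destination point
        sdot += ax * bx + ay * by
        scross += ax * by - ay * bx
    theta = math.atan2(scross, sdot)
    tx = cdx - (math.cos(theta) * csx - math.sin(theta) * csy)
    ty = cdy - (math.sin(theta) * csx + math.cos(theta) * csy)
    return theta, tx, ty

# Synthetic check: rotate known points by 30 degrees and shift by (5, -2).
ang = math.radians(30)
src = [(0, 0), (1, 0), (0, 1), (2, 2)]
dst = [(math.cos(ang) * x - math.sin(ang) * y + 5,
        math.sin(ang) * x + math.cos(ang) * y - 2) for x, y in src]
theta, tx, ty = fit_rigid_2d(src, dst)
```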

By testing different configurations of NeMAR with artificially generated training images, training iterations, and network types, we found the optimal NeMAR configuration with respect to our ground truth registered images. The method's accuracy increases with the number of input image pairs but plateaus beyond roughly 200 epochs of training.

In summary, while machine learning methods show promise in multimodal registration tasks, a robust GAN-based method would require a large training dataset sampling from a variety of surgical environments. Future research will compare a supervised machine learning approach to our unsupervised GAN-based approach with more training data.

Enhanced Viewing of 3D Objects Scanned Using Photogrammetry
Abstract: Most of a museum's collection is held in storage due to a lack of space for public displays. One solution for displaying these stored artifacts is to create 3D models of them. This can be done using photogrammetry, a technique for creating 3D virtual models of objects by taking many pictures of an object from different angles and using software that inputs camera images in order to reconstruct a virtual model mesh. These models are then saved as glTF files. glTF (GL Transmission Format) is a file format used to store 3D models and scenes, and is becoming an ISO (International Organization for Standardization) standard. 

This study focuses on features that can be implemented to improve the user experience of viewing imported glTF models. Implemented features are presented to the user as a series of tools that can be interacted with through an on-screen HUD (heads-up display). Some of these tools include a light that follows the mouse cursor to brighten a model and annotations for describing individual parts of a model. Annotations are presented to the user through a separate HUD window that appears when clicking on an object. All tools were developed in A-Frame, a web framework that uses HTML and JavaScript to create 3D scenes viewable through a web browser and virtual reality devices.

Visualizing Cybersecurity Vulnerabilities and their Role in Recent Cyber Attacks
Abstract: The influx of recent large-scale cyber attacks has created the need to understand how known cybersecurity vulnerabilities impact the integrity, availability, and confidentiality of network infrastructures across all business and government sectors. To aid cybersecurity awareness efforts, NIST and its team of researchers are working to furnish the cybersecurity community with well-informed datasets and metrics. The goal of this project is to capture the process of enhancing cybersecurity-related data through a wide range of visualization software and resources. Utilizing NIST's National Vulnerability Database (NVD), a breakdown of verified cyber vulnerabilities with each vulnerability's criticality score and influencing factors, we can identify some of the most common types of network/system vulnerabilities. Together with additional open-source resources, such as CISA, the U.S. Department of Health and Human Services Office for Civil Rights, and others, we can establish multipoint connections and create approximate reference relationships between the NVD's vulnerability data and a significant number of reported cybersecurity incidents. From these connections we can then create visualizations that depict the correlations and possible influencing factors between many of the published vulnerabilities and recent incidents/breaches. By utilizing multiple data visualization services, such as PowerBI, Splunk, and ElasticSearch, we can create unique visualizations and compare similar findings across the different services to validate the results. Once this process is complete, we can apply a method for tagging the data that will assist information security personnel and developers in determining how to prioritize implementing patches for these vulnerable systems.

Geometric Augmentations to File Identifiers in File System Forensics
Abstract: Digital forensics is a process that is key to examining and interpreting data in cyber-related investigations. File system forensics makes up a significant portion of digital forensics, as it involves logically sorting through hard drive storage to determine creations, deletions, and other data essential to event reconstruction.

Important to the functionality of file systems is the principle of namespace uniqueness, which uses file paths and names as identifiers that can distinguish file objects. In the world of digital forensics, several libraries are used, one being The Sleuth Kit (TSK). Within it is a command, 'fiwalk', which converts raw disk images' metadata into Extensible Markup Language (XML) but, because it also reports unallocated ("deleted") files, does not guarantee namespace uniqueness in its output. This reporting of unallocated files means that file names cannot be relied on as identifiers. Given the need to review unallocated files, code changes that incorporate new identifiers for file objects become necessary. This work evaluates a practice that identifies the start of an index node (a file's attributes), the start of the directory entry, and the start of the file's content, producing a three-dimensional address for each file object. Subsequently, reported results from a 2012 paper that contains a measurement discrepancy will be corrected, and this research will additionally enable better cross-tool comparison.
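The three-dimensional address idea can be sketched in a few lines: augmenting each file record with three byte offsets keeps two same-named files distinguishable. The field names and offsets here are hypothetical illustrations, not fiwalk's actual schema.

```python
from collections import namedtuple

# Illustrative sketch (field names are hypothetical): augment a file's path
# with three byte offsets so that unallocated files sharing a name remain
# distinguishable.
FileAddress = namedtuple("FileAddress",
                         ["inode_offset", "dirent_offset", "content_offset"])

records = [
    {"path": "/docs/report.txt",
     "addr": FileAddress(4096, 8192, 120000), "allocated": True},
    # An unallocated ("deleted") file with the same path, also reported:
    {"path": "/docs/report.txt",
     "addr": FileAddress(6144, 9216, 480000), "allocated": False},
]

paths = [r["path"] for r in records]
addresses = [r["addr"] for r in records]
# The paths collide, but the three-dimensional addresses do not.
```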

Addressing the Causes and Consequences of AI Failures
Abstract: Artificial Intelligence (AI) has become increasingly prevalent in nearly all areas of life. It now controls our thermostats, drives our cars, produces our electronics, recognizes our faces, evaluates our resumes, predicts our purchases, and so much more. But what happens when AI systems fail, and how can we learn from these failures to prevent such incidents in the future? Our project proposes a framework for characterizing these AI failure incidents, and provides a structured way of documenting them in an online repository.

This repository is designed to address some key concerns. First, it will provide all known, verifiable information about the AI failure incidents in a convenient, searchable manner to allow users to discover and learn about these incidents with some level of technical depth. Second, it will allow users to report incidents as new ones are discovered, ensuring that the data remains up to date. Lastly, it will address the issue that machine-learning-based AI systems tend to be black boxes. While such systems often succeed in achieving remarkable levels of accuracy, they rarely provide much understanding of their decision-making process or the factors that influence their decisions. Our proposed characterization of AI failures will help us gather information that could shed light on this aspect.

Regular software vulnerability documentation efforts, such as NIST's National Vulnerability Database (NVD), require some knowledge about the inner workings of the code to be analyzed effectively. Such details are a lot harder to garner when it comes to AI/ML failures. Our proposed framework - Failures of Artificial Intelligence Learning Systems (FAILS) - captures the causes for the incident, the sources of weaknesses involved, and a measure of impact that the failure caused without needing to know the details of the code. In short, all the information needed to ensure it doesn’t happen again.

High-Dimensional Consensus Mass Spectra Comparison
Abstract: Mass spectrometry (MS) is an analytical chemistry technique for analyzing compounds. It provides a signature, called a mass spectrum, that can be used for compound discrimination. That signature is a scatterplot of charged fragments of the substance. One popular application is forensic chemistry, where drug chemists are trying to determine whether seized evidence is an illicit drug. The traditional method for discriminating mass spectra in forensic chemistry is to bin the scatterplot into a vector (essentially a histogram) and take the cosine similarity between the vectors. While generally effective, this method can occasionally lead to misidentifications. We recently developed two novel methods for incorporating measurement variability when comparing mass spectra to limit the likelihood of misclassifications. The first method works by binning the mass spectra, identical to the traditional approach, but then uses the means and standard deviations of the bins across replicate measurements to form a summary-statistic vector. The second method works by taking the n highest-intensity points in the mass spectra, finding their means and standard deviations across replicate measurements, and using those statistics to represent the compound. We use these summary statistics as a more informative way to compare compounds. We have implemented these methods in C and performed preliminary evaluation using experimental data collected with two different types of mass spectrometers. We have found good performance in the discrimination of current drugs of interest (methamphetamine vs. phentermine, nicotinamide vs. isonicotinamide) and are currently evaluating the performance of these new methods across a larger test set of mass spectra that are difficult to discriminate by the traditional method, including applications outside of seized drugs.
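A minimal sketch of the traditional binned-cosine comparison and the first summary-statistic method follows. The project's implementation is in C; the spectra, bin width, and vector length below are invented for illustration.

```python
import math

def bin_spectrum(peaks, bin_width=1.0, n_bins=200):
    """Turn (m/z, intensity) pairs into a fixed-length histogram vector."""
    vec = [0.0] * n_bins
    for mz, intensity in peaks:
        i = int(mz // bin_width)
        if i < n_bins:
            vec[i] += intensity
    return vec

def cosine_similarity(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) *
                  math.sqrt(sum(b * b for b in v)))

def summary_vector(replicates, bin_width=1.0, n_bins=200):
    """Mean and standard deviation of each bin across replicate spectra."""
    binned = [bin_spectrum(r, bin_width, n_bins) for r in replicates]
    means = [sum(col) / len(col) for col in zip(*binned)]
    stds = [math.sqrt(sum((x - m) ** 2 for x in col) / len(col))
            for col, m in zip(zip(*binned), means)]
    return means, stds

# Two hypothetical replicate spectra of the same compound.
rep1 = [(58.1, 1000.0), (91.0, 420.0), (134.2, 80.0)]
rep2 = [(58.0, 980.0), (91.1, 450.0), (134.1, 75.0)]
sim = cosine_similarity(bin_spectrum(rep1), bin_spectrum(rep2))
```

The summary-statistic vector carries both the typical peak heights and their replicate-to-replicate variability, which is the extra information the new comparison methods exploit.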

Exploring Graph Analytics on Nisaba GPU Cluster with cuGraph
Abstract: Graph algorithms and graph analytics are designed for manipulating and analyzing graph-structured data to determine the relationships between graph objects and the structural characteristics of a graph as a whole. They have been adopted and used heavily in fields such as social networking, route optimization, and fraud detection. A major issue with modern graph analytics is that it is usually challenging to perform the algorithms quickly, or with high computational efficiency, at a large scale. Graphics Processing Units (GPUs) can be utilized to accelerate graph data analysis and machine learning. Recently, NVIDIA produced the open-source graph analytics library cuGraph, which operates directly on GPU DataFrames and provides a collection of GPU-accelerated graph algorithms with a NetworkX-like API, making it an efficient graph analytics solution for Python users. The purpose of this project is to evaluate benchmark graph analysis algorithms on NIST's Nisaba GPU cluster and compare the quantitative performance of cuGraph with other CPU-based graph analysis tools, such as NetworkX and NetworKit. Both synthetic and real-world datasets are employed to benchmark six common network analysis algorithms spanning five categories: Katz for centrality analysis, Louvain for community detection, Breadth-First Search (BFS) and Single-Source Shortest Path (SSSP) for graph traversal, Weakly Connected Components for component detection, and PageRank for link analysis. Furthermore, cuGraph supports multi-node, multi-GPU (MNMG) operations in conjunction with Dask (Dask cuGraph); Dask cuGraph was also benchmarked alongside cuGraph and NetworkX. Through our reproducible experimentation, we identified large performance increases when using the cuGraph library compared to both NetworkX and NetworKit, as well as unprecedented scalability of graph analytics using multiple GPUs.
A GitLab repository was created to allow future users to test the cuGraph benchmarks with their own specifications and provide implementation examples for using the library.
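As a CPU-only illustration of one of the benchmarked algorithms, the sketch below implements PageRank by power iteration and wraps it in the timing pattern used for such comparisons. The toy graph is invented; cuGraph and Dask are not shown here.

```python
import time

def pagerank(adj, damping=0.85, iters=50):
    """Power-iteration PageRank on an adjacency list {node: [out-neighbors]}."""
    nodes = list(adj)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        new = {v: (1.0 - damping) / n for v in nodes}
        for v, outs in adj.items():
            if outs:
                share = damping * rank[v] / len(outs)
                for w in outs:
                    new[w] += share
            else:  # dangling node: spread its rank uniformly
                for w in nodes:
                    new[w] += damping * rank[v] / n
        rank = new
    return rank

graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
t0 = time.perf_counter()
ranks = pagerank(graph)
elapsed = time.perf_counter() - t0  # the quantity compared across libraries
```

A benchmark harness runs the same workload through NetworkX, NetworKit, and cuGraph and compares the elapsed times; the GPU advantage appears as graphs grow large.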

Dynamic Access Review and Control Implementation and Enforcement (DARCIE)
Abstract: As cloud services become more widely adopted, the amount of data available to members of an organization is vastly increasing, along with the risk of data breaches. This project develops an access control mechanism that dynamically reviews, implements, and enforces access control policies in real time. The mechanism ensures granularity of control through privileged access management, allowing the system to control user access to resources. The access control mechanism enforces zero-trust policies so that users are continuously authenticated and are granted or denied access to sensitive information based on their geolocation, the organization's network availability, and their historic pattern of accessed resources.

Our proof-of-concept system uses two devices to simulate a user accessing local or cloud data. A virtual machine acts as the end user's device and another device acts as a router that simulates different geolocations from where data in the cloud is accessed. The system demonstrates a dynamically changing policy generated by the state of a sensor and enforced by the kernel using Security-Enhanced Linux (SELinux). In this case, the demo system limits a user’s access to some system resources based on the device’s connection to an access point, simulating a dynamically generated access policy based on geographic location. Future work will focus on access control policy review and control implementation and enforcement rules also derived from the user's historic pattern of accessing the resources of interest.
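A highly simplified sketch of the kind of dynamic policy decision described above is given here, with hypothetical attribute names. The real system enforces its policy in the kernel via SELinux; this toy function only conveys the decision logic of combining geolocation, network state, and access history.

```python
# Illustrative policy sketch (attribute names are hypothetical): grant access
# only when geolocation, network state, and access history all look normal.

def decide(request, history):
    if request["geolocation"] not in request["allowed_regions"]:
        return "deny"
    if not request["on_org_network"]:
        return "deny"
    # Flag resources the user has never touched before for re-authentication.
    if request["resource"] not in history:
        return "step-up-auth"
    return "allow"

history = {"hr-reports", "payroll-db"}
req = {"geolocation": "US-MD", "allowed_regions": {"US-MD", "US-VA"},
       "on_org_network": True, "resource": "payroll-db"}
decision = decide(req, history)
```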

Benchmarking Queries from Zeno against FCPW
Abstract: NIST has developed a software package called Zeno, which estimates material properties from a geometric model of a particle of said material. One of the main computational tasks Zeno performs is computing the closest point on the model to a query point. In addition, a newly proposed algorithm in Zeno would need to determine whether a part of a geometric model is contained within another. Currently, Zeno uses an internally developed library to compute its closest point queries, but another open-source library may prove more efficient. In this project, we benchmark the closest point and contains queries performed by the current Zeno library against those performed by the “Fastest Closest Points in the West” (FCPW) library. The results of these benchmarks will help us decide whether the internally developed Zeno library should be replaced with the FCPW library when implementing the new Zeno algorithm.

To obtain the benchmarks, we created C++ programs for each library. Users can specify a .obj file for the program to construct its geometric model, a query type (either the closest point or contains query), and a number of random query trials to run. The programs time how long it takes to construct the geometric model (preprocessing time) and how long it takes to compute all the query trials. We then used Python scripts to calculate benchmarking statistics for different .obj files, query types, and trial runs. These statistics were plotted using double bar graphs to help visualize patterns and directly compare each library's preprocessing and query times. Early tests suggest that the FCPW library is more efficient for a larger number of trial runs and is less error-prone than its existing counterpart. However, through more in-depth testing and analysis, we will be able to determine whether the FCPW library will be optimal for Zeno's next implementation.
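The benchmarking pattern can be illustrated with a brute-force nearest-point query over random 3-D points and a timer. This is a toy stand-in: both Zeno's internal library and FCPW use accelerated spatial data structures over full meshes, and the real benchmarks are C++ programs.

```python
import random
import time

def closest_point(points, q):
    """Brute-force nearest point to q among a list of 3-D points."""
    return min(points, key=lambda p: sum((a - b) ** 2 for a, b in zip(p, q)))

random.seed(0)
cloud = [(random.random(), random.random(), random.random()) for _ in range(2000)]
queries = [(random.random(), random.random(), random.random()) for _ in range(200)]

# Time the batch of query trials, as the benchmark programs do.
t0 = time.perf_counter()
results = [closest_point(cloud, q) for q in queries]
query_time = time.perf_counter() - t0
```

Separately timing model construction (preprocessing) and the query batch, as described above, is what lets the study compare where each library spends its time.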

Creating an Algorithm for Searching RNGs to Link with Test Results
Abstract: Random number generators (RNGs) are used in many aspects of everyday life, from simulation and decision making to video games and other recreational activities. For a category of objects used so often, there must be a reliable method to test the quality of individual objects in that category. One of the most popular methods to test RNGs today is the software library TestU01. Unfortunately, despite being effective at testing the quality of RNGs, TestU01 is expensive to run: its biggest test battery, BigCrush, consistently takes multiple CPU hours to test one RNG, and even more wall-clock hours. The original task was to research how to store RNGs and their TestU01 test results in a database such that they would be searchable, but devising a working algorithm to make those RNGs easily searchable proved substantial enough to become a project of its own.

Initially, considerable time was spent reading about RNGs and experimenting with the TestU01 software library to gain an understanding of TestU01 and the relevant RNGs. While becoming acquainted with them, we also considered how RNGs could be classified so as to be searchable. Many candidate algorithms were considered, and the one eventually proposed combines features from several of them. It ended up being complicated to explain, but it should be relatively easy to use. This algorithm will likely be used in the database that the original project was meant to create, but it will also be usable in other contexts involving TestU01.

Interactive Online Histogram-Based Visualization of AI Model Fingerprints
Abstract: Previously, NIST has generated hundreds of thousands of artificial intelligence (AI) models for the TrojAI Challenge focused on detecting poisoned (trojaned) AI models. The main motivation for this project is to support discoveries/analyses of relationships between various clean and poisoned AI models by measuring their model utilization and relating it to Trojan characteristics.
In order to draw connections between AI models, the problem lies in creating interactive and traceable histograms that allow researchers to group AI models according to their characteristics, select pairs of AI models to perform qualitative/quantitative comparisons, share and discuss AI model comparisons remotely. Challenges include: interactivity over thousands of data points, traceability of histogram contributing points (AI utilization fingerprints) to their training images, and reusability of existing libraries and of the visualization prototype.

Our approach is based on the D3 JavaScript Library and Papa Parse CSV parser followed by the design of interactive, traceable, and reusable histograms. Histograms are dynamically created based on AI model attributes, including architecture name, predicted classes, Trojan triggers, and measurement probes. By selecting two contributing data points to a histogram bin, a side-by-side comparison of two AI model utilization fingerprints is enabled to quantify AI model similarities.
The resulting visualization presents a histogram of AI model utilization fingerprints with drop-down menus to allow users to select attributes for binning. Interactive images in histogram bins can be selected, new comparisons of utilization values are rendered, and buttons can trigger computations of distribution statistics.

Implementing Real Time Constraints in Hedgehog API
Abstract: An operating system (OS) is system software. Among its various capabilities, the OS can manage multiple threads and rapidly switch between their executions. A real-time operating system (RTOS) provides finer-grained control over multithreaded behavior, allowing for deterministic responses and guaranteed execution times. For example, in an RTOS, threads with higher priority values are guaranteed to run before threads with lower priorities.
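The priority guarantee can be illustrated by simulating a strict-priority ready queue, where the dispatcher always runs the highest-priority runnable task. This is a sketch of the scheduling policy only, not of Hedgehog or any particular RTOS; the task names and priority values are invented.

```python
import heapq

# Simulation of strict priority scheduling: the ready queue always dispatches
# the highest-priority runnable task, as an RTOS would.

def run(tasks):
    """tasks: list of (priority, name); higher priority runs first."""
    # Negate priorities because heapq is a min-heap; the index breaks ties.
    queue = [(-prio, i, name) for i, (prio, name) in enumerate(tasks)]
    heapq.heapify(queue)
    order = []
    while queue:
        _, _, name = heapq.heappop(queue)
        order.append(name)
    return order

order = run([(1, "logger"), (99, "camera-isr"), (50, "control-loop")])
```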

The ability of an RTOS to provide guaranteed real-time responses is especially significant for jobs needing consistent responses within a time constraint, such as monitoring a metal additive manufacturing process in real time by keeping up with data collected from a high-speed camera.

Over the past several years, NIST has been developing a C++ library called Hedgehog, which creates task graphs for algorithms to obtain performance across CPUs and multiple co-processors. The library relies on the OS to schedule its threads and provides no real time guarantees.

The focus of this research is to extend Hedgehog to provide access to real-time priorities and scheduling algorithms, so that applications utilizing Hedgehog can be more deterministic when launched on an RTOS. In this presentation, we will present the implementation efforts to add real-time capabilities to Hedgehog, along with the associated performance costs. To evaluate the performance, we have implemented two algorithms: (1) the Hadamard product and (2) matrix multiplication. We will explore the performance behaviors of these algorithms with and without real-time constraints by varying priorities and thread configurations within the algorithms.

Translating Mathematica Source Code to a Presentable LaTeX Format
Abstract: Mathematica is a powerful programming language that is often used to handle and process mathematical data and equations. Mathematica is powered by the Wolfram Language, enabling it to define, display, and calculate essentially any level of mathematics, namely hypergeometric series in this use case. While Mathematica is well suited to manipulating, defining, and calculating these series, longer equations are often very difficult to read and present. Using the programming language Perl, string analysis, regular expressions, and the Wolfram Engine, the provided Mathematica source code is translated into the markup language LaTeX. The result is a much more user-friendly and discernible view of the hypergeometric series and other expressions contained within, along with the ability to export these results easily. Translating Mathematica source into LaTeX combines the intense computational power of Mathematica with the compatibility and readability of LaTeX to display the results.
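A toy version of rule-based translation, written in Python rather than Perl and using only a few invented rewrite rules, conveys the flavor of the approach; the real translator handles far richer Wolfram Language syntax.

```python
import re

# Toy illustration (not the actual Perl translator): rewrite a few simple
# Mathematica constructs into LaTeX using regular expressions.

RULES = [
    (re.compile(r"Sqrt\[([^\]]+)\]"), r"\\sqrt{\1}"),          # Sqrt[x] -> \sqrt{x}
    (re.compile(r"([A-Za-z0-9]+)\^([A-Za-z0-9]+)"), r"\1^{\2}"),  # y^2 -> y^{2}
    (re.compile(r"([A-Za-z0-9]+)/([A-Za-z0-9]+)"), r"\\frac{\1}{\2}"),  # a/b
]

def to_latex(src):
    for pattern, replacement in RULES:
        src = pattern.sub(replacement, src)
    return src

out = to_latex("Sqrt[x] + y^2 + a/b")
```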

Scientific Reproducibility of AI Trojan Detector Results
Abstract: AI Trojans are malicious and intentional attacks that change the behavior of an AI by inserting hidden classes. To motivate research into Trojan detectors, NIST administered the TrojAI competition, where teams submit algorithms that detect Trojan AI models. The detector algorithms are known to output slightly different results across systems. These differences are problematic for scientific study of the algorithms because they mean that results aren't reproducible. This problem was the motivation for my NIST SURF project, in which my mentor, Derek Juba, and I researched how algorithms submitted to the TrojAI competition behave when run in different environments. Submitted algorithms are containerized using Singularity, which allows them to be easily run on a broad range of machines. We tried to test the algorithms on as many combinations of software and hardware as possible (CPU core count, GPU drivers, etc.) in order to deduce potential causes of differing results.

We theorized that one of the main reasons for differences in the results across systems was changes in the orders in which floating-point arithmetic operations were performed. With this in mind, we attempted to quantify the uncertainty resulting from the choice of system without running the container on different systems. We simulated different orders of operations by tweaking the weights and biases of an AI model by a small amount. We used multiple random samples of such tweaks to find the variance we can expect in results if someone were to run an algorithm on a given model across different machines. Early analysis of the data suggests that results produced on other machines agreed with the variance we predicted with our tweaks, and that the statistical distributions of tweaked models are largely reproducible across machines. Additionally, we propose that the variance of the tweaked distributions can be used to score the confidence of detector algorithms.
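The perturbation idea can be sketched on a trivial model: tweak its weights by small random amounts and measure the spread of the outputs. The model, inputs, and tweak magnitude below are made up for illustration; the project applies the same idea to full AI models and detector scores.

```python
import random
import statistics

# Sketch of the perturbation idea: tweak a simple model's weights by small
# random amounts and measure the spread of its output.

def model_output(weights, x):
    """A stand-in 'model': a plain weighted sum."""
    return sum(w * v for w, v in zip(weights, x))

random.seed(42)
weights = [0.5, -1.2, 0.8]
x = [1.0, 0.5, 2.0]
baseline = model_output(weights, x)

samples = []
for _ in range(1000):
    tweaked = [w + random.gauss(0.0, 1e-3) for w in weights]
    samples.append(model_output(tweaked, x))

spread = statistics.stdev(samples)  # expected run-to-run variation
```

If results from other machines fall within this predicted spread, system-to-system differences are consistent with reordered floating-point arithmetic rather than a deeper defect.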

Multimodal Fusion with Modality-Specific Factors for IEMOCAP Dataset
Abstract: In the scope of human-computer interaction, technology that can quickly analyze and identify emotion from varying data sources is a coveted development. Potential applications of emotion recognition span from healthcare to gaming, only increasing demand for methods offering efficient analysis and identification. Humans convey emotion through various mediums, the most common of which are speech, facial expression, and body language. Emotion recognition technology frameworks are built upon foundational fusion methods, which synthesize various data modalities into features utilized by prediction algorithms. This work mainly focuses on processing speech, text, and video data and extracting features from multiple modalities to develop a fusion model for emotion recognition tasks. We consider the IEMOCAP benchmark dataset, processing spliced data from its modalities, including features from audio data and video data and embeddings from text data. These three modalities were processed into multimodal representations to recognize human emotions.

Making TRECVID Results More Accessible and Coherent
Abstract: NIST has run the TREC Video Retrieval Evaluation (TRECVID) program since 2001, allowing institutions to evaluate how successful their systems are at retrieving video content from textual queries. Because the results of these evaluations were simply sent back to the submitting institution(s), discussed at the annual TRECVID workshop, and only reported in published papers, there was no other means for teams or the public to examine the results. The website developed in this project also enables displaying data in more organized and visually appealing ways, such as playing the video results corresponding to the tested queries under different result conditions across participating teams.

The data from these evaluations was also stored locally with minimal organization, making many statistical analyses difficult to perform. Building a comprehensive web interface backed by a suitable relational database to house the TRECVID result information was the clear solution. An easy-to-use website not only makes the information more accessible to participating institutions, but also lets them compare their tools against similar systems and over time. The website was developed with a focus on simplicity and maintainability, while also striving to remain lightweight. All data is displayed in simple tables, with a user interface that allows for easy navigation and quick location of important data points alongside visualized results.

Term and Relation Extraction in Mathematical Texts
Abstract: A variety of Natural Language Processing tools exist for term and relation extraction. Examples include Parmenides, a framework that applies structured and normalized terms to represent natural language, and DyGIE++, a deep learning system for entity and relation extraction. However, while these tools may be effective at extracting terms from scientific texts, their performance on mathematical texts is considerably weaker.

The two tools have previously been tested on their ability to extract terms from a collection of abstracts in the Theory and Application of Categories (TAC) journal. Parmenides extracted many valid mathematical terms; however, it also extracted several times as many non-term phrases. We now hypothesize that term candidates that take part in relations, that is, subject-verb-object patterns, are more likely to be genuine terms. Thus, a filter that removes words that cannot be found in relations reduces the false positives generated by the Parmenides term extractor.
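A minimal sketch of the proposed relation filter, assuming the term and relation extractor outputs are already available as Python collections (the example terms and triples below are invented for illustration, not actual Parmenides output):

```python
# Hypothetical candidate terms and subject-verb-object relations;
# a real pipeline would take these from the Parmenides extractors.
candidates = {"abelian category", "natural transformation", "paper", "functor"}
relations = [
    ("functor", "preserves", "limits"),
    ("natural transformation", "connects", "functor"),
    ("abelian category", "admits", "kernels"),
]

def filter_by_relations(terms, rels):
    """Keep only candidates that occur as the subject or object
    of at least one extracted subject-verb-object relation."""
    in_relation = {s for s, _, o in rels} | {o for _, _, o in rels}
    return {t for t in terms if t in in_relation}

kept = filter_by_relations(candidates, relations)
# "paper" never appears in a relation, so it is filtered out.
```

The filter discards only candidates absent from every relation, so it can reduce false positives without touching terms the relation extractor supports.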

In the case of DyGIE++, the model was retrained on TAC abstracts using author-provided keywords as training data. Since the model was trained on more domain-specific text, it outperformed the default model.

These measures increased the precision and recall of both tools by a noticeable margin. In future research, we will utilize this term extraction for the creation of comprehensive knowledge graphs for mathematical domains. Further, the relations extracted by Parmenides and DyGIE++ can be employed for the evaluation of these knowledge graphs.

Artificial Intelligence-Based Texture Analysis
Abstract: Texture analysis is ubiquitous, finding application in both biomedical and nanomaterial research, so the ability to perform it in an automated fashion is greatly beneficial. In most cases, however, visual analysis and custom-tailored approaches are employed. Convolutional neural networks (CNNs) represent a viable approach to characterizing image texture accurately, in particular properties that humans can readily detect: directionality and granularity.

NIST researchers have been working on AI-driven texture analysis for years; however, they have trained the artificial intelligence only on synthetic data, not real-life data. To further advance the CNNs, and our AI as a whole, the testing data must be changed to real-life images. The main barrier is that no efficient software exists that lets users annotate real-life images for subsequent testing.
Another contribution of the GUI I created relates to a step forward my NIST mentors envision for this project: the software will enable the creation of a public database of annotated texture images that will be globally available to other scientists. Images annotated using our software will be uploaded to a public database where others can view, source, and use them. To the best of our knowledge, no global database currently contains this information.

This would not only help researchers around the world train AIs but also help advance machine-learning texture analysis as a whole.

Evaluating the Implementation of NIST SP 800-181 in Cybersecurity-Related Job Descriptions
Abstract: As technology and data science become ever more prominent in society, it is increasingly imperative that companies and organizations protect themselves from malicious cybersecurity threats. However, in the United States alone, there are over 700,000 unfilled cybersecurity positions. The National Initiative for Cybersecurity Education (NICE) created the NICE Framework (NIST SP 800-181) to provide a set of building blocks for describing the tasks, knowledge, and skills that are needed to perform cybersecurity work. Through these building blocks, the NICE Framework enables organizations to develop their workforces and helps learners engage in appropriate learning activities to develop their knowledge and skills.

The purpose of this research is to evaluate whether employers are using this framework by examining job descriptions found on online hiring platforms and measuring the extent of their alignment with the Framework. The results of this research will provide insight into whether actions need to be taken to increase industry awareness of the Framework or to modify the Framework to better apply to employer needs.

Two methodologies will be explored to complete this project. In the first, job descriptions from multiple hiring platforms such as LinkedIn and USAJobs will be graded using a rubric to determine how well they align with the Framework; a job description that matches a larger number of keywords found in the knowledge, skills, and tasks of a work role scores higher on the rubric. In the second, matching keywords and qualifications will first be identified across job descriptions. After compiling a list of the most common keywords and qualifications, this list will be compared to the Framework work role to determine how well the Framework covers what employers desire.
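The first methodology's keyword rubric might be sketched as follows; the keyword set and the job posting are invented placeholders, not actual NICE Framework work-role keyword lists, and real grading would be more nuanced than substring matching:

```python
# Hypothetical keywords drawn from one work role's knowledge,
# skills, and tasks; real scoring would use the published lists.
work_role_keywords = {
    "network", "vulnerability", "incident response",
    "risk assessment", "encryption", "firewall",
}

def rubric_score(job_description, keywords):
    """Score a job description by the fraction of work-role
    keywords it mentions (case-insensitive substring match)."""
    text = job_description.lower()
    hits = {k for k in keywords if k in text}
    return len(hits) / len(keywords), hits

posting = (
    "Seeking an analyst experienced in incident response, "
    "vulnerability scanning, and firewall configuration."
)
score, matched = rubric_score(posting, work_role_keywords)
```

A higher score indicates closer alignment between the posting and the chosen work role; comparing scores across platforms would support the analysis described above.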

Optimizing Data Communication for Low Latency Quantum Network Metrology
Abstract: Quantum networks currently require various in-situ measurements from their components to ensure good network fidelity. Communications between quantum network nodes are carried out with single photons through the use of single-photon sources and single-photon detectors. One undesirable characteristic of these photon transmissions is the substantial timing jitter associated with the single-photon detection process. To monitor this issue, each photon’s emission time and absorption time are recorded with picosecond accuracy and sent to the quantum network’s management system for analysis. This time-data transfer can become a considerable bottleneck in the network due to bandwidth limitations in classical data communication. Thus, we seek to reduce network overhead and optimally compress this data. In our investigation, we tested several lossless compression methods, such as delta encoding, different types of variable-length quantity encoding, and a hybrid approach, on a sample of such data. We found that the hybrid approach produced the best results, compressing the data by 83.11% (a 5.92 compression ratio). Implementing this compression technique into quantum network metrology toolsets could significantly speed up quantum network analysis and allow more data to be analyzed.
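As a rough illustration of two of the building blocks named above (delta encoding followed by a variable-length quantity), the sketch below packs sorted timestamps; it is not the hybrid scheme that achieved the reported 83.11% compression, and the sample timestamps are invented:

```python
def varint_encode(n):
    """Encode a non-negative integer as a little-endian base-128
    variable-length quantity (7 data bits per byte)."""
    out = bytearray()
    while True:
        byte = n & 0x7F
        n >>= 7
        if n:
            out.append(byte | 0x80)
        else:
            out.append(byte)
            return bytes(out)

def compress_timestamps(timestamps):
    """Delta-encode sorted timestamps, then varint-encode each
    (much smaller) difference."""
    out = bytearray(varint_encode(timestamps[0]))
    for prev, cur in zip(timestamps, timestamps[1:]):
        out += varint_encode(cur - prev)
    return bytes(out)

# Hypothetical photon arrival times in picoseconds.
times = [1_000_000_000, 1_000_000_250, 1_000_000_900, 1_000_002_100]
packed = compress_timestamps(times)
raw_size = len(times) * 8  # assuming 64-bit words per raw timestamp
ratio = raw_size / len(packed)
```

Because successive differences are far smaller than the absolute times, each delta fits in a couple of bytes, which is where the savings come from.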

Understanding Neural Search Algorithms
Abstract: Search engines have models for predicting whether a document is relevant to a query, and deep learning methods for predicting relevance are an emerging area of research. To determine whether a document is relevant to a query, search engines may use three different kinds of models. The first is manual (non-automatic), where human intervention determines whether a document is relevant. The other two are considered automatic, in that the query is created from the textual description of the user's information need. The first automatic model is traditional: it looks at how often terms appear in documents and uses formulas to calculate relevance. The second automatic model is neural: neural networks are used to determine the document's relevance. The question then becomes: how do these three models compare with one another?

To answer this question, we use a query-by-query analysis approach, examining traditional, neural, and manual outputs on many search queries and trying to identify patterns of success and failure for each model. I then conducted a qualitative analysis of the traditional, neural, and manual ranking methods to understand their differences.

[Back to top of page]

Material Measurement Laboratory (MML) and the NIST Center for Neutron Research (NCNR)

The MML serves as the national reference laboratory for measurement research, standards, and data in the chemical, biological and material sciences, and conducts research in analytical chemistry, biochemical science, ceramics, chemical and biochemical reference data, materials reliability, metallurgy, polymers, surface and microanalysis science, and thermophysical properties of materials. MML research supports areas of national importance including but not limited to advanced materials, electronics, energy, the environment, food safety and nutrition, and health care. Learn more about MML.

The NCNR is a major national user facility and resource for industry, universities, and government agencies with merit-based access made available to the entire U.S. technological community. Neutron-based research covers a broad spectrum of disciplines, including engineering, biology, materials science, chemistry, physics, and computer science. Current experimental and theoretical research is focused on materials such as polymers, metals, ceramics, magnetic materials, porous media, fluids and gels, and biological molecules. Learn more about the NCNR.

The MML/NCNR program is specifically designed to provide hands-on research experience in three topic areas: Chemical/Biochemical Sciences, Materials Science, and Computational Materials Science. Applicants interested in participating in research opportunities at the NCNR should apply to the MML/NCNR Materials Science topic area. All SURF projects at the NCNR will be conducted in person.


Katherine Gettings, (301) 975-6401, katherine.gettings [at]
Nathan A. Mahynski, (301) 975-6836, nathan.mahynski [at]

Julie A. Borchers, (301) 975-6597, julie.borchers [at]
Leland Harriger, (301) 975-8360, leland.harriger [at]

Susana Marujo Teixeira, (301) 975-4404, susana.marujoteixeira [at]

Research Opportunities

SURF 2023 MML/NCNR projects are under development. Please see examples of past projects below and check back later for updates. 

Chemical/Biochemical Sciences: This concentration addresses the nation's needs for measurements, standards, technology development, and reference data in the areas broadly encompassed by chemistry, biotechnology, and chemical engineering.

Developing Oral Sensors Based on Glucose and pH Levels
In this project, we tested electrochemical sensors to be used in saliva or dental pulp tissues. Our goal was to measure the concentration of relevant biomarkers in order to provide more accurate and timely diagnosis of disease. The student worked with their mentor to prepare electrode sensors, test them in physiologically relevant media, and then analyze the data, comparing results with calibration curves to determine parameters such as pH and concentration of protein biomarkers. The objective was to learn how to prepare and chemically modify electrodes as well as collect and analyze electrochemical data. See page 63 of the 2019 student abstract book for results of this project.

Measuring Cell Viability in Collagen Scaffolds
The student contributed to this project by assessing cell viability in scaffolds used for tissue engineering applications. They fabricated scaffolds, cultured cells in scaffolds, and conducted biochemical assays to assess cell viability in the scaffolds. The student worked with NIST scientists to design experiments and developed strategies for validating viability measurements. They prepared hydrogels, used microscopes, used plate readers, ran biochemical assays, analyzed data, generated plots, conducted statistical tests, and summarized findings. See page 67 of the 2019 student abstract book for results of this project.

Combining LC-MS and ELISA for Quantification of Allergenic Milk Protein in Food
To support an emerging program in food allergen measurements at NIST, the SURF student developed methods to quantify protein allergens in existing NIST food reference materials using antibody-based assays (ELISAs) and measurements using mass spectrometry (MS). The method development focused on using commercial antibody-based assays for milk proteins. Following immunoassay, mass spectrometry will be used to analyze the milk proteins, allowing for a direct comparison of measurement methods. The methods developed for milk protein allergens were then applied to measurements on existing NIST SRMs such as whole milk powder, infant formula, protein drink mix, and a food composite. See page 75 of the 2019 student abstract book for results of this project.

Characterizing the Cooperative Motion in Condensed Fluids using Machine Learning
This project focused on developing and generalizing the metrology for characterizing the cooperative rearrangements that emerge universally in many condensed fluids, including glass-forming polymers of various architectures (linear, branched, star, and ring polymers), interfacial dynamics of crystals, internal dynamics of proteins, lipid membranes, superionic materials, driven granular fluids, and colloidal hard-sphere fluids. The objective was to learn the fundamentals of molecular dynamics simulations, glass-forming polymers, and machine learning, and to be involved in developing Python codes to extract spatially correlated motion from molecular dynamics simulations of polymers in particular. See page 60 of the 2019 student abstract book for results of this project.

Materials Science: This concentration focuses on synthesis, measurements, and theory of innovative materials and devices. Note: This concentration includes projects from the NCNR. Additionally, a limited number of projects are available at the NCNR for students with interest in nuclear engineering and/or reactor operations.

Determining Optimal 3-D Configuration of Porous CO2 Reduction Catalysts
The student’s responsibilities included the preparation and evaluation of electrochemically deposited Cu catalysts for in-situ Surface-Enhanced Raman Spectroscopy/CO2 electroreduction experiments. This entailed the preparation of electrolyte solutions, assembly of electrochemical cells, and application of constant potential or current through gas-diffusion layers, which serve as the plating substrate. See page 68 of the 2019 student abstract book for results of this project.

Assessment of Elemental Homogeneity in Modern Glass by µXRF for Forensics
The elemental characterization of glass evidence is an important tool in forensic science that allows practitioners to associate or discriminate a known and questioned glass sample. The micro-heterogeneity of glass is important when considering sampling and analysis strategies, since the analysis of small areas of a glass specimen should be representative of the bulk composition. This project focused on the elemental analysis of glass specimens to evaluate the micro-heterogeneity of the samples and ultimately assess the risk of false exclusions. See page 70 of the 2019 student abstract book for results of this project.

Chemical Weathering and Additives in Plastic Marine Debris in the Hawaiian Islands
Marine plastic pollution is a growing issue, and researchers need the best methods for quantifying and characterizing plastic in complex environmental samples. The student analyzed Fourier transform infrared (FT-IR) spectra of plastic marine debris collected from Hawaiian beaches. The FT-IR spectra of over 3000 items contain a wealth of information that can help researchers understand how old the debris is (when it was littered) and how toxic it might be to animals that eat it (additives). The existing spectra were examined for a carbonyl index, to validate our visual weathering codes, and for brominated flame-retardant additives that are known to be toxic to the thyroid system. In addition to this project, the student also assisted four ongoing plastic marine debris projects. See page 69 of the 2019 student abstract book for results of this project.

Neutron Tomography and Simulation of Compton Imaging
The Neutron and X-ray Tomography (NeXT) system provides the ability to produce simultaneous, dual-modality, tomography datasets. The Neutron Imaging Team has created a MATLAB based tool that allows users to draw polygons around regions of interest in the 2D bivariate histogram to isolate phases (materials) of interest. The tool will process an entire volume and produce individual binary volumes for each polygon drawn on the histogram. While this tool provides a convenient way to begin the segmentation process, it lacks the ability to distinguish individual peaks that would indicate a region of interest. For this project the student expanded on this segmentation tool through the exploration of methods to improve the identification of peaks and points of inflection. See page 77 of the 2019 student abstract book for results of this project.

Computational Materials Science: This concentration includes the application of modeling, simulation, and computational methods to enhance our understanding of innovative materials and devices. This concentration includes projects within the Materials Genome Initiative.

Interatomic Potentials for Calculating Diffusion Behavior
The student learned to use the LAMMPS molecular dynamics simulation software and help develop Python scripts for testing interatomic models. The developed calculation methods then served as standard tests for evaluating and comparing the different models. See page 66 of the 2019 student abstract book for results of this project.

[Back to top of page]

Physical Measurement Laboratory (PML)

The PML sets the definitive U.S. standards for nearly every kind of measurement in modern life, sometimes across more than 20 orders of magnitude. PML is a world leader in the science of physical measurement, devising procedures and tools that make continual progress possible. Learn more about PML.


Electrical & Electronics Engineering
Joseph J. Kopanski, (301) 975-2089, joseph.kopanski [at] nist.gov
Richard L. Steiner, (301) 975-4226, richard.steiner [at]

Maritoni Litorja, (301) 975-8095, maritoni.litorja [at]
Uwe Arp, (301) 975-3233, uwe.arp [at]

Research Opportunities

Rotational Cooling of a Molecular Beam
Eric Norrgard, eric.norrgard [at]
Polar molecules can be used as precision quantum sensors of their environment (radiance temperature, pressure, electromagnetic fields, and more). Often this requires a large number of molecules to be in a given quantum state. The student will develop and use a combination of microwave and laser frequencies to “cool” the rotational distribution of a molecular beam. The student will also assist in data collection to demonstrate the effectiveness of the cooling technique, as determined by the increased population in the target quantum state. [In-person opportunity]

Disorder potentials for Bose-Einstein Condensates
Ian Spielman, ian.spielman [at] 
The successful student will use digital micromirror devices (DMDs), commonly used in digital projectors, to create a disordered optical pattern. This pattern will illuminate an atomic Bose-Einstein condensate (BEC), creating a random energy landscape for the atoms to move about in. [In-person opportunity]

Soliton Formation in a Laser Cavity Containing an Rb Vapor Cell: Simulations
Zachary H. Levine, zlevine [at]
Highly stable lasers can interact with the hyperfine-split levels of certain atoms such as Rb. The dielectric response of such systems is highly dispersive and nonlinear. The active media are in an optical cavity formed by optical fibers, leading to well-defined modes in the environment. The project seeks to calculate the response of such systems to both pulses and periodic driving fields. We hope to find stable periodic patterns which correspond to soliton solutions. The methods will be primarily numerical with some analytic work on model systems. [Virtual opportunity]

Creating a Fully Digital Proportional-Integrator-Differentiator Servo Loop
Stephen Eckel, stephen.eckel [at]
For the cold atom vacuum standard, we use several analog proportional-integrator-differentiator (PID) circuits to control laser frequencies, laser intensities, magnetic fields, and temperatures. Not only is our current generation of PIDs based on unreliable analog electronics, but they are also missing features, such as sampling and holding the integrator, that would be extremely useful for our experiment. For this project, we would like to build a new, fully digital PID based on a field-programmable gate array (FPGA) or microcontroller with all the complex features we need for our specific experiment. We will most likely build our PID around a Red Pitaya. Once developed, the student will help deploy the PID on the apparatus to help control our complex laser-cooling setup. The student will learn programming and engineering skills and help develop an exciting new tool that will be used in the lab for years to come. [In-person opportunity]
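The integrator sample-and-hold feature mentioned above can be illustrated with a minimal discrete PID sketch; the class, gains, and timestep are invented for illustration and bear no relation to the eventual FPGA or Red Pitaya implementation:

```python
class DigitalPID:
    """Minimal discrete PID with an integrator-hold flag."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement, hold_integrator=False):
        """One control step; when hold_integrator is True the
        integral term is frozen (sample-and-hold)."""
        error = setpoint - measurement
        if not hold_integrator:
            self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)

pid = DigitalPID(kp=1.0, ki=0.5, kd=0.0, dt=0.01)
out1 = pid.update(setpoint=1.0, measurement=0.0)
# With the hold asserted, the integral stays frozen at its last value.
out2 = pid.update(setpoint=1.0, measurement=0.0, hold_integrator=True)
```

Freezing the integrator this way is useful, for example, while a laser lock is intentionally interrupted, so integral wind-up does not kick the output when control resumes.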

Electronic Instrumentation for Magnetic Particle Imaging (MPI)
Thinh Q. Bui, thinh.bui [at]
In the current Thermal MagIC IMS program, we are developing a novel 3D thermal imaging technology based on magnetic nanoparticles as nanoscale temperature sensors. To achieve fast, high-throughput 3D thermal imaging, we rely on state-of-the-art electronics to generate magnetic fields that excite the nanoparticles and to detect their time-dependent response. This requires custom, tailor-made analog and digital electronics to operate the homebuilt electromagnets and magnetic sensors. The electromagnets for the excitation field are driven by high-power resonant circuits, and the detection of the magnetic nanoparticle response uses low-noise amplifiers and filter circuits to maximize the signal-to-noise ratio. Experience in circuit design (SPICE), electronic measurements, electronics labwork, and programming for control and data acquisition with LabVIEW, MATLAB, or Python is desired. [In-person opportunity]

Power and Energy Instrument Calibrations for the U.S. Electric Power Grid
Richard Steiner, richard.steiner [at] 
As car chargers and solar panel arrays proliferate, a new DC Power and Energy calibration service is needed to accurately calibrate the instruments used to measure (and charge money for) that power and energy. Accurately determining all the variables in this project means studying and learning the programming details and physics of how electric signal sources and meters work in real conditions. There are several mini-projects involved:

  1. Accurately characterize a current transducer (I-to-V converter) to measure up to 200 A of current;
  2. Program several instruments to generate up to 200 A and 1000 V;
  3. Program two digital multimeters to measure the current and voltage, while keeping accurate time of the integration periods;
  4. Learn to use and program power meters or power calibration sources, if available;
  5. Calculate the correction factors and uncertainties in the measurements;
  6. Create and document a procedure, including an analysis of the hazards involved. 

[In-person opportunity]
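Step 5's correction-factor and uncertainty work could, for uncorrelated inputs, start from standard propagation of relative uncertainties for P = V·I; the readings and ppm-level uncertainties below are illustrative assumptions, not calibration data:

```python
import math

def dc_power_with_uncertainty(v, i, u_v_rel, u_i_rel):
    """Propagate relative standard uncertainties of voltage and
    current into DC power P = V * I, assuming the two measurements
    are uncorrelated (root-sum-of-squares combination)."""
    p = v * i
    u_p_rel = math.sqrt(u_v_rel**2 + u_i_rel**2)
    return p, u_p_rel

# Hypothetical readings: 1000 V at 200 A, with 20 ppm and 50 ppm
# relative standard uncertainties on voltage and current.
p, u_rel = dc_power_with_uncertainty(1000.0, 200.0, 20e-6, 50e-6)
u_abs = p * u_rel  # absolute standard uncertainty in watts
```

In a real calibration the budget would also include timing of the integration periods, transducer correction factors, and correlated terms, but the combination rule above is the usual starting point.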

Automated Data Acquisition and Analysis for Optomechanical Thermometry
Daniel Barker, daniel.barker [at]
In the applied optomechanics lab, we are developing optomechanical primary thermometers, which use laser light to measure the thermal motion of nanoscale mechanical systems to deduce temperature. The current thermometry data acquisition setup uses manufacturer-provided software to control lab instrumentation and requires us to be physically present in the lab to change experimental settings. In this project, we want to develop Python programs to control the apparatus, automate data acquisition, and send SMS alerts when errors occur. The student will also build any electronics necessary for automation and help to create data analysis scripts that interface with the new apparatus control software. During the course of the project, the student will gain scientific programming, optical alignment, and data analysis skills as well as contribute to an aspect of the apparatus that will be used for the foreseeable future. [In-person opportunity]
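The automation-with-alerts idea might look roughly like the following; `with_alert`, the retry count, and the notification callback are hypothetical names for illustration (a real deployment would call an actual SMS service rather than append to a list):

```python
def with_alert(task, notify, retries=3):
    """Run task(); if it keeps failing, call notify(message) so the
    operator learns about the error without being in the lab."""
    last = None
    for _ in range(retries):
        try:
            return task()
        except Exception as exc:
            last = exc
    notify(f"acquisition failed after {retries} attempts: {last}")
    return None

alerts = []

def flaky():
    # Stand-in for an acquisition step that raises on hardware error.
    raise RuntimeError("laser unlock detected")

result = with_alert(flaky, alerts.append, retries=2)
```

Wrapping each acquisition step this way keeps the control loop running unattended while still surfacing failures promptly.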

High-Resolution Bioelectronics Metrology
Arvind Balijepalli, arvind.balijepalli [at]
Charge sensitive electronics provide a label-free approach to measure multiple biomarker types such as proteins, nucleic acids and small molecules. This project involves developing instrumentation and data acquisition tools for multiplexed measurements of DNA. The student will develop LabView and Python scripts to interface measurement instrumentation such as lock-in amplifiers and PID controllers with switch matrices to realize multi-channel measurements. [In-person opportunity]

Spectroscopic Analysis of Low-Dimensional Interfaces for Optoelectronic and Magnetoelectronic Applications
Christina Hacker, christina.hacker [at]
In this project, the student will leverage the spectroscopic techniques at NIST to study interfaces of low dimensional materials like nanoparticles and monolayers with polymer films. These interfaces are expected to demonstrate interesting optoelectronic and magnetoelectronic behavior that could be leveraged for next generation devices. Duties will include learning and utilizing advanced spectroscopic techniques, building optical systems for sample characterization, and fabricating samples for testing via spin-coating and vapor deposition. [In-person opportunity]

Advanced Characterization of Few- and Single-Defect Transistors
Jason Ryan, jason.ryan [at]
Atomic-scale defects at the oxide/semiconductor interface of metal-oxide-semiconductor field-effect transistors (MOSFETs) have become increasingly difficult to study as transistors have been scaled down to extremely small dimensions. In some cases, modern MOSFETs can contain very few defects, or even just a single one. Despite their small numbers, these defects remain important to MOSFET function and reliability, and they may have other applications outside the typical operation of a MOSFET. The goals of this project are to:

  1. Experimentally explore the sensitivity limits of the characterization techniques used to study atomic-scale defects in MOSFETs. These techniques include current vs. voltage measurements, charge pumping, gated diode measurements, capacitance vs. voltage measurements, and others. The extension of these techniques to electrically detected magnetic resonance may also be explored. Comparing the relative sensitivities of these methods on standard, well-characterized test structures is not only beneficial to the field of metrology, but also will allow for better communication between NIST’s Magnetic Resonance Spectroscopy project staff and potential collaborators.
  2. Experimentally evaluate the utility of the characterized few- and single-defect MOSFETs for alternate purposes such as sensing or probing electron/nuclear spin interactions.

Pushing the limits of the techniques above could ultimately lead to the observation of quantized charge transfer and quantized spin transitions.

The duties of the intern will include:

  1. Reviewing literature on MOSFET interface defect characterization;
  2. Characterizing MOSFETs on a wafer probing station with the techniques listed above;
  3. Organizing, plotting, and reporting data to the PI and research group.

[In-person opportunity]

Experiments with Highly Charged Ions
Joseph Tan, joseph.tan [at]
Highly ionized atoms have certain long-lived states that are potentially interesting candidates for optical atomic clocks and for determination of fundamental constants. Experiments can utilize a compact electron beam ion trap (mini-EBIT) or the NIST superconductive EBIT to facilitate the generation of such exotic charge states. [In-person opportunity]

Laboratory Metrology Proficiency Testing and Training Resource Development
Isabel Chavez-Baucom, isabel.chavez.baucom [at]
The SURF candidate will work directly with their mentor and collaborate with subject matter experts to develop Office of Weights and Measures (OWM) Proficiency Testing (PT) Program quality and statistical analysis tools; program and implement a PT tracking and process management scheme with companion user instructions; and improve the PT artifact inventory management process (Access database) used by OWM’s national program coordinators and participants to support calibration laboratory Recognition and Accreditation requirements. The candidate will also collaborate to develop measurement science training resources, including video scripts and storyboards. Videos may cover: PT Quality Manual (NISTIR 7214) and PT Test Policy and Plan (NISTIR 7082) topics, such as statistical equations and pass/fail criteria; laboratory operating procedures and techniques, such as care and handling of state standards (GLP 3), understanding factors affecting weight operations (GMP 10), and standard operating procedures (SOP 8); and/or general “Welcome to Training” topics that support OWM’s IACET-accredited Training Program. The candidate will serve on an OWM working group exploring new modes of training service delivery, such as learning management systems and virtual delivery platforms. This project will be hosted as an in-person (Gaithersburg campus) workplace experience. The researcher will work with a diverse team to research, design, develop, edit, and produce project deliverables. Project development will include independent and collaborative research, as well as defining the project's purpose, scope, goals, and timeline.
Typical tasks may include reviewing current publications and scientific literature, applying international and national PT statistical methods and techniques, spreadsheet design and validation, handling and organizing PT artifacts, collaborating with state laboratory stakeholders, training-video script writing, storyboarding and editing support, reviewing training presentations, identifying appropriate photographic assets, and reviewing draft publications for 508 compliance and content accuracy to meet NIST publication requirements. [In-person opportunity]

Compact Blackbody Radiation Atomic Sensors
David La Mantia, david.lamantia [at]
Electromagnetic radiation sensing is at the core of modern physics. While coherent radiation sensing techniques are quite mature, incoherent radiation sensing has not seen significant recent advancement. A ubiquitous source of incoherent radiation is blackbody radiation (BBR), i.e., the radiation emitted by thermal bodies. Characterizing BBR is a practical technique for accurately assessing the temperature of a distant object. Rydberg atoms are a cutting-edge tool with enhanced physical properties, so it is natural to use them in electrometry as radiation sensors. The undergraduate research fellow will participate in an experiment dedicated to using Rydberg atoms as calibration-free, SI-traceable sensors of thermal radiation, thereby characterizing reference blackbodies and greatly reducing the calibration uncertainty for classical thermal radiation sensors. [In-person opportunity]

Development of Flexible Broadband Nonlinear Metasurfaces with Multiresonant Plasmonic Enhancement
Amit Agrawal, amit.agrawal [at]
Simultaneous nano-localized enhancement of excitation and emission transitions in nonlinear processes remains a challenge in nanophotonics research but can offer many applications in coherent light conversion, imaging, sensing, quantum optics, and spectroscopy. This project aims to develop flexible nanolaminate plasmonic metasurfaces with broadband multiresonant enhancement of excitation and emission transitions in the second harmonic generation (SHG) and third harmonic generation (THG) processes. In the first stage, the student will employ physical vapor deposition to create flexible nanolaminate plasmonic metasurfaces on nanoimprinted polymer nanopillar arrays based on different plasmonic metal materials, including Ag, Au, Cu, and Al. In the second stage, the student will assist in conducting optical measurements to characterize both linear and nonlinear optical characteristics of the fabricated nanolaminate plasmonic metasurfaces. We envision that Al-based plasmonic metasurfaces can push the nonlinear SHG/THG emission into the UV region, and Ag-based plasmonic metasurfaces can achieve the broadest nonlinear SHG/THG emissions over the entire visible to near-infrared range. [In-person opportunity]

Electric Field Modelling for Medical Physics Application
Csilla Szabo-Foster, csilla.szabo-foster [at]
Radiation therapy with small, encapsulated implantable sources (brachytherapy) is a powerful tool in the treatment of various kinds of cancer. The National Institute of Standards and Technology (NIST) maintains the U.S. primary standard for radioactive source strength (air-kerma strength) for low-dose-rate photon-emitting brachytherapy sources. The device that provides these measurements is the Wide-Angle Free-Air Chamber (WAFAC). The WAFAC has a front membrane electrode with a cylindrical side electrode raised to high voltage potential to guide electrons to the grounded back electrode when ionizing radiation enters the chamber. The scope of this project is to create a model of the WAFAC in COMSOL Multiphysics software and simulate the electric field of the ion chamber under various conditions. This research will help us understand the influence of systematic effects on brachytherapy source strength measurements and thereby potentially improve the understanding of sources of measurement uncertainty. [In-person or virtual opportunity]

Plasmonic Nanopores through DNA Nanotechnology
Joseph Robertson, joseph.robertson [at]
The student will develop a generalized nanopore sensor platform that will allow for rapid control over the local temperature gradients at various length scales at a nanopore resistive pulse biosensor. Plasmonic nanostructures will be fabricated through DNA nanotechnology (origami), creating super-assemblies of pore-forming proteins and plasmonic nanoparticles that will be used as optically modulated single-molecule sensors. The student will design LSPR-DNA origami structures to organize and place particles with precision on the 1 nm to 100 nm length range and test the feasibility of creating structures with mixed particles (i.e., combining Au clusters with diamond nanoparticles) to create sensors that provide both optical and ionic measurement of local temperature. These structures and measurement protocols will also advance fundamental biophysical measurements, such as the transport of peptides and proteins across membranes. [In-person opportunity]

Directional Reflectance Measurements from the UV to IR
Aaron Goldfain, aaron.goldfain [at]
NIST is creating an open access database of the optical directional reflectance of materials. How materials reflect light in different directions is important to a variety of applications, such as the disinfection of public spaces using UV light and interpreting satellite-based remote sensing images of the Earth. We will use a new commercial instrument to measure the directional reflectance, specifically the bidirectional reflectance distribution function (BRDF), of a wide range of materials across the wavelength range of 200 nm to 2400 nm. We will thoroughly characterize the new commercial instrument by validating it against other NIST instruments and will improve its performance as necessary. A key part of this validation will be formulating an uncertainty budget. One of the main challenges anticipated with this project will be ensuring the results are accurate when measuring materials with varied and unusual optical properties. There will also be opportunities to explore new methods to visualize and share BRDF data, as our goal is to eventually build an interactive, online data viewer where users can explore and download BRDF data. [In-person opportunity]

Stability of Photodiodes in Space
Joe Rice, joe.rice [at]
The student will mine existing data from three satellite missions from NIST collaborators at the University of Colorado Laboratory for Atmospheric and Space Physics (LASP). These three satellites used a variety of photodiodes and thermal detectors to measure solar flux across the near-ultraviolet, visible, and near-infrared as their primary missions. As a secondary by-product, these missions produced data that has not yet been analyzed for spectral response stability. The student will work with collaborators at LASP to obtain the data, understand the data formats, and perform analysis to extract information about how much silicon and InGaAs photodiodes degrade while exposed to the harsh radiation environment of space. This will inform future satellite missions trying to select photodiodes for accurate measurements from space. [In-person opportunity]

Time-Resolved Spectroscopy of Excitons and Exciton-Polaritons in Organic Crystals
Jared Wahlstrand, jared.wahlstrand [at]
Organic semiconductors offer a number of potential advantages over conventional semiconductors, including low-cost processing and the ability to tailor materials by modifying molecular structure. For opto-electronic applications, they have particularly interesting and rich properties, including tightly bound excitons (bound electron-hole pairs) that interact strongly with light and remain bound at room temperature, with spin dynamics that depend sensitively on material properties and can be modified by applying a magnetic field. Recently, it was shown using time-resolved fluorescence measurements that organic crystals placed on 2D plasmonic nanoparticle arrays display modified exciton dynamics due to the formation of exciton-polaritons, hybrid modes that result from strong light-matter coupling. This phenomenon may enable new applications of these materials, but before it can be applied in a commercial setting, more fundamental research is needed. The student will perform experiments using a broadband subpicosecond pump-probe setup and a streak camera to investigate exciton and exciton-polariton dynamics in organic crystals. The project will involve a combination of hands-on experimental work and data analysis. [In-person opportunity]

Microfluidic Cytometry
Greg Cooksey, Gregory.cooksey [at]
This project supports the development of a microfluidic cytometer that repeats measurements of single objects in flow. These measurements enable estimation of per-object uncertainty, something conventional cytometers cannot do, which facilitates optimization of device performance and leads to better classification of sample composition. The student will learn how to make microfluidic devices with integrated optical waveguides, to interface the chips with flow systems, light sources and detectors, and to collect and analyze data. The student will explore measurement of object size and shape and work on metrics to improve cell counting and classification. The student will receive appropriate safety training in addition to hands-on training regarding the other specialized equipment in the NIST laboratory. [In-person opportunity]

Towards Rapid Optimization of Solution-Processed Organic Thin-Film Transistors
Emily Bittle, emily.bittle [at] and Katelyn Goetz, Katelyn.goetz [at]
Organic semiconductors are processable from solution, making them suitable for low-temperature fabrication processes on flexible substrates. This is beneficial because it enables previously impossible electronic devices. Unfortunately, solution processing is complex and depends on numerous interactions between the semiconductor solute, solvent, and substrate, making the window of optimal processing difficult and time-consuming to understand. Only a select few high-performance semiconductors undergo optimization, and only at a years-long, cross-laboratory pace. This both reduces commercialization potential and complicates scientific metrology studies where a lower-performance semiconductor may be interesting. For this project, the SURF student will develop strategies to reduce the time taken to optimize the solution-processing parameter space. They will fabricate organic field-effect transistors by the blade-coating technique, image them and measure their electrical characteristics, and finally, develop a MATLAB- or Python-driven toolkit to rapidly correlate structure-function properties for a large set of devices. This work will assist scientists in the fabrication of “ideal” devices for both commercial and scientific purposes. Tasks:

  1. Fabricate, measure, and image organic field-effect transistors;
  2. Write code to rapidly identify and catalogue features of interest;
  3. Correlate optimal electrical characteristics with image features and fabrication procedure;
  4. Use results to refine fabrication process.

[In-person opportunity]
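The correlation step of such a toolkit can be sketched minimally in Python. This is an illustrative sketch only, not the project's actual toolkit: the feature choice (grain coverage fraction), the device records, and the mobility values are all hypothetical.

```python
# Illustrative sketch (hypothetical data): correlate an image-derived feature
# (e.g., crystalline grain coverage fraction) with the measured field-effect
# mobility across a set of blade-coated devices.
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-device records: (grain coverage fraction, mobility in cm^2/Vs)
devices = [(0.55, 0.8), (0.70, 1.3), (0.62, 1.0), (0.85, 2.1), (0.78, 1.7)]
coverage = [d[0] for d in devices]
mobility = [d[1] for d in devices]
r = pearson_r(coverage, mobility)  # strong positive correlation for this toy set
```

In practice the same loop would run over many image features (grain size, film coverage, defect density) and electrical metrics (mobility, threshold voltage, on/off ratio) to flag which processing conditions matter.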

Ultrasonic CT for Radiation Dosimetry
Ronald E. Tosh, ronald.tosh [at]
Cancer therapy with external-beam radiation seeks to destroy malignancies while minimizing damage to surrounding healthy tissue. This often entails elaborate mechanisms for shaping radiation beams spatially and temporally to optimize conformality of dose delivery to irregular tumor volumes. Absolute dosimetry of such highly nonuniform dose distributions is not possible with current technology, but we are working on dose-imaging techniques that would make this achievable. One such technique pioneered at NIST uses an array of transducers for imaging temperature changes within an irradiated phantom. Computational modeling of the thermal response of the array, together with acquisition and processing of temperature profiles, is necessary for interpreting experimental data and guiding design improvements. This project would seek to advance preliminary work we have completed in finite-element modeling of heat transfer in irradiated phantoms and simulation of image-data acquisition and tomographic reconstruction of temperature distributions, ultimately to involve a hybrid computational platform combining CPUs and a GPU. [In-person or virtual opportunity]

Source Measurements and Simulation of Dose Calibrator
Brittany Broder, brittany.broder [at] and Denis Bergeron, denis.bergeron [at]
A new generation of radiotherapeutic agents is being developed in the medical community to target disease more directly. The efficacy of these treatments is often determined using quantitative imaging to assess how much of the radiopharmaceutical accumulates in a tissue of interest compared to the initial administered activity. An accurate, reliable measurement of the activity is necessary for precision imaging and dosimetry for therapy treatments. Such measurements are usually made with radionuclide dose calibrators, a common utility in nuclear medicine used to determine the initial activity administered to the patient. NIST is responsible for making sure dose calibrator measurements can be made with traceability to a common national standard, making dose calibrators an important tool in disseminating activity standards. Modelling dose calibrators allows us to predict the response for new nuclides and validate experimental data. In this project, the student will use a model of a commercial dose calibrator to simulate the response of various nuclides using TOPAS, a Geant4-based Monte Carlo simulation program. The student will validate their simulations by comparing the results to published response curves and measurements they obtain in the lab. The student may also use this model to investigate measurement errors that can arise when sources are measured under common clinical conditions. [In-person opportunity]

NIST Stars: Absolute Flux Calibration of Stars for Astronomy
Susana Deustua, susana.deustua [at]
The objective is to provide visible and near-infrared stellar spectral energy distributions of standard stars with sub-percent uncertainty that are SI-traceable for use by the astronomical research community. Some of the research areas supported by these improved flux standards are dark energy experiments, the growing area of exoplanet research, and stellar astrophysics. In collaboration with the European Southern Observatory in Chile, NIST is building a research station on Cerro Paranal whose instrumentation includes small telescopes and instruments for measuring atmospheric properties. We are also interested in research using LIDAR, spectroradiometry, and other methods to improve the determination of the atmospheric corrections. In preparation for deploying our telescope system in Chile, we shall be characterizing the instruments, automating telescope operation, and testing these in the lab and on the sky. [In-person opportunity]

Training Embedded Inference Engines to be Robust Against Device Variations
Mark Stiles, mark.stiles [at] and Nitin Prasad, nitin.prasad [at]
Efficient computation on edge-computing platforms such as autonomous vehicles requires the use of compressed neural network models with low-precision parameters. Binary neural networks, in which network parameters take one of only two values, are extreme versions of such compressed models. These networks can be realized in mixed-signal computing platforms containing arrays of devices with bistable conductance values. However, implementations of such bistable devices are prone to device-to-device variations, which degrade performance. The goal of the project is to create offline training algorithms that account and compensate for such device-to-device variations, so that the resulting offline-trained solutions exhibit optimized accuracy when implemented on physical networks with actual device variations. [Virtual opportunity]
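One common way to make offline training variation-aware is to inject simulated device-to-device conductance deviations into the binarized weights during the forward pass, so that training settles on solutions that remain accurate under perturbation. The NumPy sketch below illustrates that idea only; the toy linear model, the noise level `sigma`, and the straight-through update are illustrative assumptions, not the project's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def binarize(w):
    """Project real-valued shadow weights to the two device states {-1, +1}."""
    return np.sign(w) + (w == 0)  # map exact zeros to +1

def forward(x, w_shadow, sigma=0.1):
    """Forward pass with simulated device-to-device conductance variation."""
    w_bin = binarize(w_shadow)
    # Each device's conductance deviates randomly from its nominal binary value.
    w_dev = w_bin * (1.0 + sigma * rng.standard_normal(w_bin.shape))
    return x @ w_dev

def train_step(x, y, w_shadow, lr=0.01, sigma=0.1):
    """One noisy-forward training step on an MSE loss."""
    y_hat = forward(x, w_shadow, sigma)
    grad = x.T @ (y_hat - y) / len(x)  # straight-through: ignore sign() in backprop
    return w_shadow - lr * grad

# Toy regression problem with binary ground-truth weights.
x = rng.standard_normal((32, 8))
w_true = np.sign(rng.standard_normal((8, 1)))
y = x @ w_true
w = 0.1 * rng.standard_normal((8, 1))
for _ in range(200):
    w = train_step(x, y, w)
```

Because the noise is resampled every forward pass, gradient descent is pushed toward shadow weights whose binarized values tolerate conductance spread, which is the essence of variation-aware offline training.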

DC Power and Energy Calibrations to Verify Electric Vehicle Supply Equipment Meters
Maritoni Litorja, litorja [at]
The power and energy dispensed by charging equipment to an electric vehicle have to be verified using secondary field meters. These meters, in turn, must be calibrated to be traceable to the SI. This project has several parts:

  1. Work with the Office of Weights and Measures to determine the calibration needs of the EVSE community
  2. Work with another student (advisee of Dr. Richard Steiner) to develop the basic laboratory measurements necessary for a NIST calibration service to disseminate the SI to meters used by regulators.
  3. Depending on the time available, test commercial charging equipment using a commercial field meter. 

[In-person opportunity]
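The core quantity a field meter must verify can be sketched with simple arithmetic: integrate sampled DC power over the session, E = Σ V·I·Δt, and compare the meter's reading to an SI-traceable reference. The sketch below is illustrative only (the sampling scheme and numbers are hypothetical, not a NIST procedure).

```python
# Minimal sketch (illustrative, not a NIST procedure): DC energy dispensed
# during a charging session from synchronously sampled voltage and current.
def dispensed_energy_kwh(volts, amps, dt_s):
    """Integrate sampled power (W) over time; return energy in kWh."""
    joules = sum(v * i * dt_s for v, i in zip(volts, amps))
    return joules / 3.6e6  # 1 kWh = 3.6e6 J

def relative_error(meter_kwh, reference_kwh):
    """Meter error as a fraction of the reference (traceable) value."""
    return (meter_kwh - reference_kwh) / reference_kwh

# Example: constant 400 V at 100 A for one hour, sampled once per second.
volts = [400.0] * 3600
amps = [100.0] * 3600
energy = dispensed_energy_kwh(volts, amps, 1.0)  # 40.0 kWh
```

A legal-metrology verification would then check whether `relative_error` of the charger's displayed energy against the reference meter falls within the applicable tolerance.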

Analytical Simulations of Electromagnetic Test Structures to Investigate and Partition Microwave Signal Losses in Emerging Barrier Materials for Cu Interconnects
Yaw Obeng, yaw.obeng [at]
Higher-speed signal transmission is increasingly required to handle massive data in electronic systems, so the signal transmission loss of copper wiring interconnects is critical. The total signal loss can be divided into dielectric loss and conductor loss based on electromagnetic theory. In particular, the scattering loss due to skin effects and the copper-barrier interface will be quantitatively examined in detail. The relative usefulness of the emerging copper barrier materials, and their interfaces to copper, will be examined through analytical simulations. Specifically, efforts will be made to understand signal loss partitioning, viz.:

  1. Losses in the metal (Cu): Skin Effect, metal surface roughness;
  2. Losses in the dielectric around metal line;
  3. Losses in the metal-dielectric / corrosion film interface: voids, corrosions;
  4. Losses into the Si substrate;
  5. Where else?

Starting from existing base COMSOL codes, the SURF student will evaluate the impact of some newly identified Cu-barrier candidate materials on high-speed signal loss. The existing codes may be optimized based on new physical/chemical insights, as well as to improve accuracy and computational efficiency. [In-person opportunity]

Quantum Waveform Metrology
Jason Underwood, jason.underwood [at]
Over the last several decades, quantum voltage standards based on the Josephson effect have revolutionized dc electrical metrology. More recently, the development of the Josephson Arbitrary Waveform Synthesizer (JAWS) promises to improve ac electrical metrology and may eventually supersede artifact standards. Students will contribute to research efforts on quantum-based arbitrary waveform synthesis and control. Concepts covered include instrument control, digital signal processing, superconducting electronics, and formal uncertainty evaluation. [In-person opportunity]

Past PML SURF Projects

[Back to top of page] 

Special Projects

Diversity, Equity, and Inclusivity Office (DEIO)

The Diversity, Equity, and Inclusivity Office (DEIO) helps to cultivate a culture of belonging in the work environment by providing NIST-wide consultation on the integration of diversity, equity, inclusivity, and accessibility (DEIA) across all aspects of NIST mission delivery. The office serves as the focal point for all NIST-wide DEIA-related efforts, including internally facing actions focused on employee experience and engagement, as well as externally facing efforts to ensure that NIST is supporting an inclusive economy and providing equitable service delivery. Part of the mission of our office is to foster an equitably accessible and inclusive workplace for all. Our office celebrates the diverse experiences, skills, and perspectives individuals add to the collective whole, and we recognize that a standardized application process is not one size fits all. NIST provides reasonable accommodations to applicants with disabilities. We encourage interested applicants requesting accommodations to visit the Job Accommodation Network (JAN) and/or to contact cara.omalley [at] (Cara O’Malley) to assist us in tailoring this opportunity to each person’s accessibility needs.


Jo Wu, jo.wu [at]
Juan Fung, juan.fung [at]

Research Opportunities

Analyzing Job Announcements for Biased Language
Jo Wu, jo.wu [at] and Juan Fung, juan.fung [at]
The goal of this project is to investigate occupational segregation resulting from conscious and unconscious biases in the expectations of hiring managers. In particular, the student will use text mining methods to analyze key terms used in job opening announcements for NIST.

This will entail first creating a data set of historical job advertisements using the USAJOBS application programming interface (API) to compile role and desired-qualification descriptions. The data will then be analyzed for indicators of employer or recruiter socio-economic bias toward specific jobs or industries where occupational composition or demographic concentration is incongruent with equitable access to occupational opportunity or economic security.

The student should have experience with or be comfortable using a programming language like Python or R for tasks such as making API requests, collecting and cleaning data, text mining and text analysis, as well as version control systems such as git. Familiarity with social justice topics (through lived, work, or educational experience) is also desired but not required. The project will be approached collaboratively, and the student will have access to formal and informal mentorship in our office.
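The data-collection and text-analysis steps described above can be sketched as follows. The request helper follows the public USAJOBS search API conventions (host, `User-Agent`, and `Authorization-Key` headers), but the API key and email are placeholders, and the agentic/communal word lists are a small illustrative sample, not a validated bias lexicon.

```python
import json
import re
import urllib.request
from collections import Counter

def fetch_announcements(api_key, user_email, keyword="NIST"):
    """Fetch job-announcement summaries from the USAJOBS search API.

    api_key and user_email are placeholders the caller must supply.
    """
    url = f"https://data.usajobs.gov/api/search?Keyword={keyword}&ResultsPerPage=50"
    req = urllib.request.Request(url, headers={
        "Host": "data.usajobs.gov",
        "User-Agent": user_email,
        "Authorization-Key": api_key,
    })
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return [item["MatchedObjectDescriptor"]["UserArea"]["Details"]["JobSummary"]
            for item in data["SearchResult"]["SearchResultItems"]]

# Illustrative (not validated) word lists in the spirit of the gendered-language
# lexicons used in the hiring-bias literature.
AGENTIC = {"competitive", "dominant", "independent", "assertive", "ambitious"}
COMMUNAL = {"collaborative", "supportive", "interpersonal", "committed", "responsive"}

def code_terms(text):
    """Count agentic- vs. communal-coded terms in one announcement."""
    counts = Counter(re.findall(r"[a-z]+", text.lower()))
    return (sum(counts[w] for w in AGENTIC),
            sum(counts[w] for w in COMMUNAL))
```

Aggregating `code_terms` over announcements grouped by occupation or grade would give a first, crude indicator of where coded language concentrates; a real analysis would use larger validated lexicons and statistical tests.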

The project can be completed on-site or in a virtual format with flexible hours and virtual networking opportunities for both options.

If this project resonates with your skillsets and interests, please indicate preference in applying to DEIO in the special skills section of the application. It is also helpful if you email jo.wu [at] or juan.fung [at] after you have submitted the application. [Flexible format: In-person or virtual opportunity]

Technology Partnerships Office (TPO)


Bethany H. Loftin, bethany.loftin [at]

Research Opportunities

Drafting Marketing Sheets for NIST Inventions and Conducting Relevant Market Research
Robin Bunch, robin.bunch [at]
The Technology Partnerships Office (TPO) is responsible for the Technology Transfer (T2) or commercialization of all NIST intellectual property. The TPO Communications and Technology Promotion team has devised a marketing sheet that succinctly describes and illustrates these inventions. The project involves drafting marketing sheets for NIST’s inventions and assisting with researching appropriate market data and potential venues and partners. The successful candidate should understand physics and chemistry to draft content for patents and inventions in plain English and assist with the illustrations. Knowledge of technology transfer is a plus, but TPO will train the candidate in what it is and why it is necessary. 

If this project resonates with your skillsets and interests, please indicate preference in applying to TPO in the special skills section of the application. [Virtual Opportunity]

Standards Coordination Office (SCO)

The Standards Coordination Office provides guidance and coordination to the federal government and to private entities on documentary standards and conformity assessment. Documentary standards affect our daily lives by guiding the production of nearly everything we purchase and use. Examples range from agricultural products such as milk, to consumer products such as electronics, clothing, and toys, to innumerable products still to be created. Our goal is to equip U.S. industry and the federal government with the standards-related tools and information necessary to effectively compete in the global marketplace. SURF participants engage in research opportunities that advance the development or assessment of performance standards and test methods of benefit to manufacturers, end users, and federal agencies, and explore the economic analysis of the impacts of standards and conformity assessment. Learn more about SCO.

How to apply - In the special skills section of the application, please indicate your interest in applying to SCO. It is also helpful if you email nathalie.rioux [at] after you have submitted the application. SCO welcomes uniqueness and diversity! Background training or an interest in economics, international relations, and/or political science as well as in other science and engineering fields is desirable but not required.


Nathalie Rioux, (301)975-2649, nathalie.rioux [at]

Research Opportunities

SCO will work with a SURF intern to develop a research plan tailored to enhancing their knowledge and capabilities in documentary (written) standardization activities. This work will be appealing to those interested in public policy, international standards, and/or the relationship between documentary standards and the development of technology.

    Some policy project ideas to explore are in the following areas:

  • NIST’s role in coordinating the U.S. government’s standards development activities
  • U.S.-EU Trade and Technology Council (areas of interest include Technology Standards Working Group (WG1) on Digital Identity, Additive Manufacturing, Plastics Recycling, Heavy Duty Charging, etc.)
  • Quad (U.S., Australia, Japan, India) Critical and Emerging Technologies Working Group, Standards Sub-group participation.
  • NIST’s participation in the International Telecommunications Union - Telecommunication Standardization Sector

    Some examples of past projects:
  • China’s Changing Standards Infrastructure: A New Approach to the Global Stage
  • Harmonizing Standards: Reviewing Military and Law Enforcement PPE Performance Standards
  • Development of Standards within ISO/TC 276-Biotechnology
  • What is the Meaning of Life? Terminology and Measurement Assurance for Biotechnology Standards

[Back to top of page]

Created June 3, 2010, Updated February 1, 2023