For industry and government labs to ensure their pressure-measurement instruments are working correctly, they need a reliable source of pressure. Often, that source is a piston gauge – a metal piston that falls at a predictable rate through a close-fitting hollow cylinder, or “sleeve.” Staff at NIST’s Physical Measurement Laboratory (PML) perform precise calibrations of piston gauges for customers including the Navy, the Army, airlines, and power utility companies.
For decades, those calibrations were done painstakingly by hand. But staff have recently developed and launched a new, automated system that dramatically reduces the time required for each test.
Pressure is a measurement of the amount of force applied to a unit area. For a piston gauge, that force comes from a mass that is placed on top of the piston, pushing it down. The area in this case is the cross-sectional area of the piston, corrected for distortion and defects and referred to as the “effective area.” So for a piston gauge, the pressure depends on the piston’s effective area and on how much mass is placed on it.
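The relationship described above can be sketched as a short worked example. The mass, effective area, and resulting pressure below are invented for illustration; they are not NIST values.

```python
# Illustrative sketch of piston-gauge pressure: P = m * g / A_eff.
# All numeric values here are made up for demonstration purposes.

G_STANDARD = 9.80665  # standard gravitational acceleration, m/s^2

def piston_pressure(mass_kg: float, effective_area_m2: float) -> float:
    """Pressure (in Pa) generated by a mass resting on a piston
    with the given effective area."""
    return mass_kg * G_STANDARD / effective_area_m2

# Example: 10 kg on a piston with an effective area of 1 cm^2 (1e-4 m^2)
p = piston_pressure(10.0, 1.0e-4)
print(f"{p:.0f} Pa")  # roughly 0.98 MPa
```

In practice the force term also carries corrections (local gravity, air buoyancy), but the proportionality between mass, effective area, and pressure is the heart of the measurement.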
To calibrate a customer’s gauge, researchers need to balance the pressures generated by two different gauges. On one side of this balance is the customer’s piston gauge. On the other side is one of NIST’s standard piston gauges, whose dimensions have been measured precisely so that its effective area is known extremely well. The same nominal mass – also measured at NIST with high precision – is placed on each gauge.
Researchers know the effective area of the standard piston gauge, and they know the value of the masses being placed on the gauges. What they don’t know, but need to find out to complete the calibration, is the effective area of the customer’s gauge.
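When the two pressures balance, the unknown effective area follows directly from the known quantities, since the gravitational acceleration cancels: m_nist/A_nist = m_cust/A_cust. The sketch below illustrates that algebra with invented numbers; it is not NIST’s calculation code.

```python
# Hedged sketch of the balance condition. At equal pressures,
#   m_nist * g / A_nist = m_cust * g / A_cust
# so the customer's effective area is
#   A_cust = A_nist * (m_cust / m_nist).
# All numbers below are invented for illustration.

def customer_effective_area(a_nist_m2: float,
                            m_nist_kg: float,
                            m_cust_kg: float) -> float:
    """Effective area of the customer's piston when the two
    gauge pressures are balanced."""
    return a_nist_m2 * (m_cust_kg / m_nist_kg)

# Suppose 10.000 kg sits on the NIST standard (A_eff = 1.00000e-4 m^2)
# and balance is reached with 10.002 kg on the customer's gauge:
a_cust = customer_effective_area(1.00000e-4, 10.000, 10.002)
print(f"{a_cust:.6e} m^2")  # slightly larger than the standard's area
```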
In the old, manual method, the gauges were connected with a differential pressure cell, which has a dial that indicates how different one pressure is from another. To achieve balance, a researcher would add or remove masses of all different sizes, some as tiny as 5 milligrams (mg), to and from the tops of the piston gauges. Once the pressures were balanced, researchers would note how much mass had been used for the customer’s gauge and use that to calculate the effective area. This would then be repeated several times – usually a total of 10 – at different pressures within the piston gauge’s range. In the end, NIST researchers would be able to furnish the customer with a report that gave corrections that they could apply to any measurements they made with their piston gauge.
This might sound straightforward. But in practice, it was extremely time-consuming, because researchers essentially had to create a near-perfect balance between the two gauges. “It was very tedious,” says Greg Driver, the NIST PML staff member who performed these calibrations for more than forty years until his retirement early this year.
“In the old way, you would see the researcher with a pair of tweezers, grabbing these tiny little masses out of the box and popping them on this thing over and over and over again,” says NIST PML’s Julia Scherschligt. On top of that, she continues, the technician would have to properly set up for a balance each time the masses were changed, covering and uncovering the piston gauges to keep air currents from affecting their performance, adding puffs of nitrogen gas to the gauges to keep them from falling to the bottom of their sleeves, and hand-spinning the gauges to prevent any slight imperfections in their construction from skewing the results. (See video for a demonstration of the old manual method.)
About a year ago, the team began investigating a way to automate this comparison to avoid having to create a perfect balance. In the new method, the pressures are compared using a transducer,* which first samples the pressure created by one gauge and then, within a few seconds, samples the pressure created by the other. No more adding and removing small masses: the researchers simply place the same nominal mass on top of each gauge and then add a little bit of nitrogen gas to each side when needed. Soon, they hope, they won’t even have to manually switch the nitrogen on – the computer will take care of the entire calibration automatically.
The beauty of the new system, Scherschligt says, is that the transducer itself does not need to be calibrated. “It can be thought of as a device that near-instantaneously transfers the calibration of the NIST piston gauge to the customer gauge,” she says. “The accuracy of the transducer doesn’t matter – it could be off by orders of magnitude – so long as it responds linearly to changes in pressure.”
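Scherschligt’s point can be demonstrated with a small numerical sketch. For any linear transducer, reading = k × pressure + b, both the gain k and the offset b cancel out of a linear interpolation between two reference readings, so the device’s absolute accuracy never enters. This is an illustration of that argument, not NIST’s actual procedure, and all values are arbitrary.

```python
# Why transducer accuracy cancels: with a linear response r = k*P + b,
# the unknown gain k and offset b drop out of a linear interpolation
# between two reference readings. Gain/offset values are arbitrary.

K_GAIN = 37.5       # deliberately wrong gain -- "off by orders of magnitude"
B_OFFSET = -1200.0  # arbitrary zero offset

def reading(pressure_pa: float) -> float:
    """Linear but uncalibrated transducer response."""
    return K_GAIN * pressure_pa + B_OFFSET

def infer_pressure(r_cust, p1, r1, p2, r2):
    """Map a customer-side reading back to pressure using two
    reference (pressure, reading) pairs; k and b cancel algebraically."""
    return p1 + (r_cust - r1) * (p2 - p1) / (r2 - r1)

p1, p2 = 5.000e6, 5.001e6  # two reference pressures (Pa)
p_true = 5.0004e6          # the pressure actually on the customer side
p_est = infer_pressure(reading(p_true), p1, reading(p1), p2, reading(p2))
print(p_est)  # recovers ~5.0004e6 Pa despite the bogus gain and offset
```

The only requirement, as the article notes, is that the response stay linear over the small pressure range being compared.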
The time savings are substantial. “We can do a ten-point calibration in about an hour,” Driver says. “Before, it would take me almost a whole day.” Furthermore, the team has achieved precision on the order of a few parts per million, comparable to the manual method.
Creating the automated system was a team effort. Yuanchao Yang, who was a NIST associate at the time, designed and built the system with input from Driver and Scherschligt as well as NIST PML’s Dawn Cross, who has been performing the piston gauge calibrations since Driver’s retirement. From the NIST Office of Information Systems Management, Katie Schlatter was the process engineer and John Quintavalle designed and wrote the software.
Right now, the pressure ranges covered by the automated system only go up to 7 MPa (megapascals, about 1000 psi), meaning that any piston gauge with a pressure range higher than 7 MPa must still be calibrated by hand. The NIST team plans to upgrade this partially automated system to 14 MPa (about 2000 psi), and would ultimately like to make another system with full automation.
-- Reported and written by Jennifer Lauren Lee
*A transducer is a device that converts one type of signal into another. In this case, the transducer turns a pressure signal into an electrical signal.