Two Charpy machines were used to test NIST verification specimens at three energy levels: low energy (≈ 15 J at -40°C), high energy (≈ 100 J at -40°C), and super-high energy (≈ 240 J at room temperature). The study evaluates the bias observed between the two impact machines and the variation in test data for instrumented versus non-instrumented impact tests. The machines were of very similar design, and all tests were performed with the same instrumented striker (switched between machines). After testing, the raw force/time data were used for the analyses, without applying any normalization or adjustment based on the absorbed energy measured by the machine encoder (KV). The characteristic forces at general yield (Fgy) and the maximum forces (Fm) were determined in accordance with ASTM E2298-09 from the instrumented impact record, which was also used to calculate the total impact energy (Wt). The findings show: (1) one machine consistently produced higher absorbed energy values than the other (in terms of both KV and Wt); (2) the variation in Wt is significantly lower than the variation in KV for a given machine and energy level; (3) the relative differences between KV and Wt increased with increasing absorbed energy level; (4) variations in maximum force are lower than variations in absorbed energy; (5) the instrumented data indicate that curve-to-curve variation is very small up to maximum force, and that differences in absorbed energy arise mainly during fracture propagation (post-maximum-force data); (6) the two independent measures of absorbed energy indicate that scatter is due primarily to material variability; and (7) the bias between the two machines is significantly reduced when the same striker is used for testing.
Journal of Testing and Evaluation
Keywords: absorbed energy, bias, Charpy impact test, instrumented impact tests, instrumented striker, verification specimens
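The total impact energy Wt mentioned above is obtained from the raw force/time record by reducing the striker velocity according to the delivered impulse, integrating velocity to get displacement, and then integrating force over displacement. A minimal sketch of that computation is shown below; the function name, the impact velocity `v0`, and the effective pendulum mass `m` are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def impact_energy(t, F, v0=5.47, m=27.0):
    """Estimate absorbed energy Wt (J) from a force/time record.

    t  : time samples (s)
    F  : force samples (N)
    v0 : striker velocity at impact (m/s) -- assumed value
    m  : effective pendulum mass (kg)     -- assumed value

    The striker velocity is reduced by the impulse delivered to the
    specimen, v(t) = v0 - (1/m) * integral of F dt; displacement is the
    time integral of velocity; Wt is the integral of F over displacement.
    All integrals use the trapezoidal rule.
    """
    dt = np.diff(t)
    # cumulative impulse, J(t) = integral of F dt
    impulse = np.concatenate(([0.0], np.cumsum((F[1:] + F[:-1]) / 2 * dt)))
    v = v0 - impulse / m
    # striker displacement, s(t) = integral of v dt
    s = np.concatenate(([0.0], np.cumsum((v[1:] + v[:-1]) / 2 * dt)))
    # absorbed energy, Wt = integral of F ds
    return float(np.sum((F[1:] + F[:-1]) / 2 * np.diff(s)))
```

For a 1 ms record with a constant 1000 N force, the striker travels roughly v0 times the record duration, so Wt comes out near 5.45 J with the assumed defaults; the result is necessarily bounded by the initial kinetic energy of the pendulum.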