In the wake of a well-publicized 2003 field failure of poly(p-phenylene-2,6-benzobisoxazole), or PBO, body armor, questions have been raised regarding the expected service life of ballistic-resistant body armor. For the last decade, the body-armor community has conducted a major research effort to understand the effects of field and laboratory ageing on the performance of body armor. The first part of this effort culminated in a revised standard for ballistic-resistant body armor that relies upon an environmental-conditioning protocol to examine the capability of a given armor design to withstand heat, moisture, and mechanical wear, which may translate to increased confidence in the armor's long-term performance.

Ideally, the long-term field performance of armor would be assessed by issuing a single body armor design to many end users, each working in a different type of position (e.g., patrol officer vs. detective) and in a different climate (e.g., southern US vs. northern Canada), and then removing the armor from service for analysis at specified intervals. Unfortunately, this ideal study has proven difficult to execute in practice, because a) law enforcement officers in many countries use a variety of armor designs, and b) the actual use conditions of an individual piece of armor are difficult to assess precisely: officers may be reassigned or promoted over a few years of service, some officers do not wear their armor regularly, and officers do not follow common standards of care for their armor (e.g., some may hang it when it is not being worn, while others may throw it in the bottom of a locker or the trunk of a car until it is needed again). A study on field-aged armor was conducted in Canada in 2008.
Interpretation of those data was complicated by a) the wide variety of armor designs sampled and b) the difficulty of assessing the performance of armor at the end of its life cycle without a good benchmark of its performance when new. The benchmark was weak because historical versions of the NIJ body-armor standard required only a limited number of shots to assess new armor. These issues led the authors to conclude that no correlation between armor age and performance could be drawn from the study. While a large-scale study of fielded-armor performance remains challenging, we are investigating ways to combine available data from fielded and laboratory-aged armor to better understand armor's long-term performance.
Results: The tensile strengths of yarns extracted from artificially aged armor are compared with those of yarns extracted from fielded armor; similarity between the two would indicate that the ageing protocol used is relevant to field-worn armor. The mechanical properties of the extracted yarns are also used to estimate a theoretical change in the V50 ballistic limit, and these theoretical estimates are compared with the experimentally measured V50. This work raises the possibility of a tool for fielded-armor performance surveillance programs that relies upon testing of armor coupon samples, and is the subject of a forthcoming publication.
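One way yarn tensile data can be mapped to a theoretical change in ballistic limit is through a Cunniff-style scaling argument, in which the ballistic limit scales with the characteristic fiber velocity U* = ((σε/2ρ)·√(E/ρ))^(1/3). The sketch below is illustrative only: the material properties and the assumed 20 % strength loss are hypothetical placeholders, not measured values from this study, and it is not claimed to be the estimation method used here.

```python
import math

def cunniff_velocity(sigma, eps, E, rho):
    """Cunniff characteristic fiber velocity U* (m/s):
    U* = ((sigma * eps / (2 * rho)) * sqrt(E / rho)) ** (1/3)
    sigma: tensile strength (Pa), eps: failure strain (-),
    E: tensile modulus (Pa), rho: fiber density (kg/m^3)."""
    return ((sigma * eps / (2.0 * rho)) * math.sqrt(E / rho)) ** (1.0 / 3.0)

# Hypothetical PBO-like properties (illustrative, not measured values)
rho = 1560.0        # kg/m^3
E = 270e9           # Pa
sigma_new = 5.8e9   # Pa, as-received yarn strength
eps_new = 0.025     # failure strain, as-received

# Suppose aged yarns lost 20 % of their tensile strength at roughly
# constant modulus; in a linear-elastic approximation the failure
# strain drops proportionally.
sigma_aged = 0.8 * sigma_new
eps_aged = 0.8 * eps_new

u_new = cunniff_velocity(sigma_new, eps_new, E, rho)
u_aged = cunniff_velocity(sigma_aged, eps_aged, E, rho)

# To first order, V50 scales with U*, so the fractional change in the
# ballistic limit can be estimated from yarn properties alone.
print(f"Estimated V50 retention: {u_aged / u_new:.3f}")
```

Note that the ratio depends only on the relative property change, so a 20 % loss in both strength and strain gives a V50 retention of 0.64^(1/3) ≈ 0.86 regardless of the absolute values assumed above.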