Adobe offers an analytics product as part of its Marketing Cloud software that lets customers track many details about users across various digital platforms. For the most part, customers define the amount and type of data to track, and they can specify many feature combinations when reporting on that data. These features create high dimensionality that makes validation challenging for some of the most critical components of the Adobe Analytics product. One such component is the reporting engine, which is validated by a framework known internally as Replay. Within the engineering organization, Replay is often qualitatively regarded as highly effective, but its effectiveness has never been quantitatively measured. Following recent applications of combinatorial testing, the Analytics Tools team decided to use combinatorial coverage measurement (CCM) to evaluate the effectiveness of the Replay validation framework. In this paper, we therefore report the practical application of combinatorial coverage measurement to evaluate the effectiveness of the validation framework for the Adobe Analytics reporting engine. The results of this evaluation show that combinatorial coverage measurement is an effective way to supplement existing validation for several purposes. In addition, we report details of the approach used to parse moderately nested data for use with the combinatorial coverage measurement tools.
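The abstract mentions parsing moderately nested data into a form usable by combinatorial coverage measurement tools. As a hedged illustration of the general idea (not the authors' actual implementation), nested event records can be flattened into flat factor/value rows, which is the tabular form CCM tools typically consume. All field names below are hypothetical:

```python
# Illustrative sketch only: flatten moderately nested records into
# flat factor/value rows for a CCM tool. Field names are hypothetical,
# not Adobe's actual schema.

def flatten(record, prefix=""):
    """Recursively flatten nested dicts into dotted-key/value pairs."""
    flat = {}
    for key, value in record.items():
        name = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            # Descend into nested objects, extending the dotted path.
            flat.update(flatten(value, name))
        else:
            # Leaf value becomes one factor assignment.
            flat[name] = value
    return flat

# Example nested analytics-style event (hypothetical fields).
event = {
    "page": {"name": "home", "section": "news"},
    "device": {"type": "mobile", "os": "iOS"},
    "locale": "en-US",
}

row = flatten(event)
# row maps dotted factor names to values, e.g. "page.name" -> "home",
# "device.os" -> "iOS", "locale" -> "en-US"
```

Each flattened row then contributes one value combination per t-way tuple of factors, which is what a coverage measurement counts against the space of possible combinations.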
Kuhn, D. and Smith, R., "Measuring Combinatorial Coverage at Adobe," Proceedings of the IEEE International Conference on Software Testing, Verification and Validation (ICST 2019 Workshops), Xi'an, China, April 22-27, 2019. [online] https://doi.org/10.1109/ICSTW.2019.00052 (Accessed November 30, 2023)