Towards Generative Adversarial Network on Industrial Internet of Things
Cheng Qian, Wei Yu, Chao Lu, David W. Griffith, Nada T. Golmie
Machine learning, as a viable way of conducting data analytics, has been successfully applied to a number of areas. Nonetheless, the lack of sufficient data is one critical issue for applying machine learning in Industrial Internet of Things (IIoT) systems. Insufficient data raises the potential for under-fitting in the machine learning model training process, negatively affecting the accuracy of learning models. To tackle this issue, we design a framework to systematically investigate the impacts of insufficient data on machine learning model training. This framework employs a Generative Adversarial Network (GAN) and continuous learning to generate new data and incorporate it into learning model training, enabling us to study the security risks of introducing new data into the model training process and to develop countermeasures that mitigate those risks. To validate the efficacy of our framework, we consider a representative IIoT scenario, in which a variety of industrial components must be recognized by Convolutional Neural Networks (CNNs), and design three evaluation scenarios. We implement these scenarios using a real-world IIoT dataset. Our experimental results confirm that insufficient data can have a significant impact on learning model accuracy, and that new data generated by the GAN and continuous learning can greatly improve it. Our experimental results also show that the data poisoning threat posed by the GAN can significantly reduce learning model accuracy. Moreover, our proposed defensive mechanism is capable of securing the model learning process. We further discuss some emerging issues that need to be addressed in future work.
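The core augmentation idea in the abstract can be illustrated with a toy sketch. This is not the authors' implementation: instead of a CNN on industrial images, it trains a one-weight generator and discriminator on 1-D Gaussian data with hand-derived gradients, then mixes generated samples into a scarce real dataset. The data distribution N(4, 0.5), learning rate, and step count are illustrative assumptions.

```python
# Toy 1-D GAN used for data augmentation (illustrative sketch only;
# the paper itself works with CNNs on real IIoT component images).
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

# Generator G(z) = w*z + b; discriminator D(x) = sigmoid(v*x + c).
w, b = 1.0, 0.0   # generator parameters
v, c = 0.1, 0.0   # discriminator parameters
lr = 0.05         # assumed learning rate

for step in range(2000):
    x = rng.normal(4.0, 0.5)   # one "real" sample (assumed distribution)
    z = rng.normal(0.0, 1.0)   # latent noise
    g = w * z + b              # generated (fake) sample

    # Discriminator gradient ascent on log D(x) + log(1 - D(g))
    d_real, d_fake = sigmoid(v * x + c), sigmoid(v * g + c)
    v += lr * ((1.0 - d_real) * x - d_fake * g)
    c += lr * ((1.0 - d_real) - d_fake)

    # Generator gradient ascent on the non-saturating objective log D(g)
    d_fake = sigmoid(v * g + c)
    w += lr * (1.0 - d_fake) * v * z
    b += lr * (1.0 - d_fake) * v

# Augment a small real dataset with GAN-generated samples, as the
# framework does before retraining the learning model.
real = rng.normal(4.0, 0.5, size=20)
fake = w * rng.normal(0.0, 1.0, size=20) + b
augmented = np.concatenate([real, fake])
print(augmented.shape)  # 20 real + 20 generated samples
```

In the paper's setting the same loop would run over image batches, and the augmented set would feed a CNN classifier; the poisoning threat the abstract mentions arises when the generated samples are adversarially biased rather than faithful to the real distribution.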
Qian, C., Yu, W., Lu, C., Griffith, D. and Golmie, N. (2022) Towards Generative Adversarial Network on Industrial Internet of Things, IEEE Internet of Things Journal, [online], https://doi.org/10.1109/JIOT.2022.3163894, https://tsapps.nist.gov/publication/get_pdf.cfm?pub_id=933204 (Accessed December 9, 2023)