Analysis of Neural Network Detectors for Network Attacks

Published

2024
Author(s)

Qingtian Zou, Lan Zhang, Anoop Singhal, Xiaoyan Sun, Peng Liu

Abstract

While network attacks play a critical role in many advanced persistent threat (APT) campaigns, an arms race exists between network defenders and the adversary: to keep APT campaigns stealthy, the adversary is strongly motivated to evade the detection system. However, recent studies have shown that neural networks are likely a game-changer in this arms race: they can be applied to achieve accurate, signature-free, low-false-alarm-rate detection. In this work, we investigate whether the adversary could fight back during the next phase of the arms race. In particular, noticing that none of the existing adversarial example generation methods can generate malicious packets (and sessions) that simultaneously compromise the target machine and evade the neural network detection model, we propose a novel attack method to achieve this goal. We have designed and implemented the new attack, and we use Address Resolution Protocol (ARP) Poisoning and Domain Name System (DNS) Cache Poisoning as case studies to demonstrate its effectiveness.
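For context, the classic gradient-based evasion idea the abstract alludes to can be sketched as follows. This is a generic fast gradient sign method (FGSM) example against a hypothetical, untrained toy detector over numeric packet features; it is not the paper's method. The paper's point is precisely that such unconstrained perturbations break packet and session semantics, so the perturbed traffic would no longer actually compromise the target. The detector architecture, feature vector, and epsilon below are illustrative assumptions.

# Hypothetical sketch: FGSM evasion of a toy neural network detector
# over numeric packet features. NOT the paper's method; naive feature
# perturbations like this generally do not preserve attack semantics.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy detector: 10 numeric features (e.g., normalized header fields)
# -> 2 classes (0 = benign, 1 = malicious). Untrained, for illustration.
detector = nn.Sequential(nn.Linear(10, 16), nn.ReLU(), nn.Linear(16, 2))

x = torch.rand(1, 10, requires_grad=True)  # hypothetical malicious-packet features
y_malicious = torch.tensor([1])            # ground-truth label: malicious

# Gradient of the detector's loss with respect to the input features.
loss = nn.functional.cross_entropy(detector(x), y_malicious)
loss.backward()

# FGSM step: move each feature in the direction that increases the loss
# on the "malicious" label, pushing the sample toward a "benign" verdict.
eps = 0.1
x_adv = (x + eps * x.grad.sign()).clamp(0.0, 1.0).detach()

print("original prediction:", detector(x).argmax(dim=1).item())
print("perturbed prediction:", detector(x_adv).argmax(dim=1).item())

Because x_adv is an arbitrary point in feature space, the corresponding packet may be malformed or functionally inert; generating adversarial traffic that both evades detection and still compromises the target is the gap the paper addresses.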
Keywords

Network Attack, Neural Network, Adversarial Example

Citation

Zou, Q., Zhang, L., Singhal, A., Sun, X. and Liu, P. (2024), Analysis of Neural Network Detectors for Network Attacks, Journal of Computer Security, 32(3), [online], https://doi.org/10.3233/JCS-230031, https://tsapps.nist.gov/publication/get_pdf.cfm?pub_id=936523 (Accessed October 12, 2024)

Issues

If you have any questions about this publication or are having problems accessing it, please contact reflib@nist.gov.

Created June 17, 2024, Updated August 22, 2024