
Hierarchical-Attention-Model-for-Intrusion-Detection

The idea of Hierarchical Attention Model for Intrusion Detection comes from the application of Attention in NLP.

In this paper, we adopt two kinds of attention mechanism. The overall view of the system is as follows:

(figure: overall system architecture)

Here, we apply location-based attention to the input features, i.e. feature-based attention. This also makes the visualization in a later step easier.

(figure: feature-based attention)
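A minimal NumPy sketch of the feature-level, location-based attention described above (not the authors' implementation; the weight matrix `W` is a random stand-in for a learned parameter): scores are computed from the input alone, softmax-normalized over the feature dimension, and used to re-weight each feature.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def feature_attention(x, W):
    """Location-based attention: weights depend only on the input itself."""
    scores = x @ W                    # (timesteps, n_features)
    alpha = softmax(scores, axis=-1)  # one weight per feature, per timestep
    return alpha * x, alpha           # re-weight each feature elementwise

rng = np.random.default_rng(0)
x = rng.normal(size=(10, 4))  # 10 timesteps, 4 features
W = rng.normal(size=(4, 4))   # hypothetical learned projection
out, alpha = feature_attention(x, W)
```

Because the weights live on the feature axis, `alpha` can be read directly as a per-feature importance map, which is what enables the visualization step.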

Then, we apply dot-product attention across the timesteps, aiming to improve the performance of the model.

(figure: timestep-level dot-product attention)
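The timestep-level attention can be sketched the same way (again a hedged illustration, not the paper's code): dot-product scores between each hidden state and a query, here assumed to be the last hidden state, give a weight per timestep, and the context vector is the weighted sum.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def timestep_attention(H, q):
    """Dot-product (Luong-style) attention over timesteps."""
    scores = H @ q           # (timesteps,) similarity of each state to the query
    alpha = softmax(scores)  # one weight per timestep
    context = alpha @ H      # weighted sum of hidden states
    return context, alpha

rng = np.random.default_rng(1)
H = rng.normal(size=(10, 8))  # 10 timesteps of 8-dim hidden states
q = H[-1]                     # assumed query: the final hidden state
context, alpha = timestep_attention(H, q)
```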

Both attention mechanisms used in this paper are forms of global attention, as shown below.

(figure: global attention)
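Putting the two stages together, the hierarchical model first re-weights features, encodes the sequence, and then attends globally over all timesteps. The sketch below is a simplified stand-in: the paper's recurrent encoder is replaced by a single `tanh` projection, and all weight matrices are random placeholders for learned parameters.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def hierarchical_attention(x, W_feat, W_enc):
    # Stage 1: feature-level (location-based) attention.
    alpha_f = softmax(x @ W_feat, axis=-1)
    x_att = alpha_f * x
    # Stage 2: stand-in encoder (the paper uses an RNN; a linear map here).
    H = np.tanh(x_att @ W_enc)
    # Stage 3: global dot-product attention over ALL timesteps.
    q = H[-1]                     # assumed query: last encoded state
    alpha_t = softmax(H @ q)
    context = alpha_t @ H
    return context, alpha_f, alpha_t

rng = np.random.default_rng(2)
x = rng.normal(size=(10, 4))  # 10 timesteps, 4 features
context, alpha_f, alpha_t = hierarchical_attention(
    x, rng.normal(size=(4, 4)), rng.normal(size=(4, 16)))
```

Because stage 3 scores every timestep (rather than a local window), this is the global-attention setting referred to above; a classifier head would consume `context`.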

With a timestep of 10, the accuracy exceeds 98.7%.

We visualize the attention maps below.

(figure: attention map visualizations)

Please cite as follows if this repository is useful to you.

Liu C, Liu Y, Yan Y, et al. An Intrusion Detection Model With Hierarchical Attention Mechanism[J]. IEEE Access, 2020, 8: 67542-67554.

By the way, you can contact me at: [email protected]

Thanks a lot!
