
Towards Few-Shot Time Series Anomaly Detection with Temporal Attention and Dynamic Thresholding

International Conference on Machine Learning and Applications (ICMLA), pages 1444--1450. IEEE, 2023.

Abstract

Anomaly detection plays a pivotal role in diverse real-world applications such as cybersecurity, fault detection, network monitoring, predictive maintenance, and highly automated driving. However, obtaining labeled anomalous data can be a formidable challenge, especially when anomalies exhibit temporal evolution. This paper introduces LATAM (Long short-term memory Autoencoder with Temporal Attention Mechanism) for few-shot anomaly detection, with the aim of enhancing detection performance in scenarios with limited labeled anomaly data. LATAM effectively captures temporal dependencies and emphasizes significant patterns in multivariate time series data. In our investigation, we comprehensively evaluate LATAM against other anomaly detection models, particularly assessing its capability in few-shot learning scenarios where we have minimal examples from the normal class and none from the anomalous class in the training data. Our experimental results, derived from real-world photovoltaic inverter data, highlight LATAM's superiority, showing a substantial 27% mean F1 score improvement even when trained on a mere two-week dataset. Furthermore, LATAM demonstrates remarkable results on the open-source SWaT dataset, achieving a 12% boost in accuracy with only two days of training data. Moreover, we introduce a simple yet effective dynamic thresholding mechanism, further enhancing the anomaly detection capabilities of LATAM. This underscores LATAM's efficacy in addressing the challenges posed by limited labeled anomalies in practical scenarios, and it proves valuable for downstream tasks involving temporal representation and time series prediction, extending its utility beyond anomaly detection applications.
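The abstract does not spell out the architecture or the thresholding rule, but the general idea it describes (an LSTM autoencoder whose encoder states are weighted by a temporal attention layer, with anomalies flagged when the reconstruction error exceeds a threshold that adapts over time) can be sketched roughly as follows. All class names, layer sizes, and the rolling mean-plus-k-sigma threshold below are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch, assuming a PyTorch-style LSTM autoencoder with temporal attention
# and a rolling dynamic threshold on the reconstruction error. Hypothetical design.
import torch
import torch.nn as nn


class AttentionLSTMAutoencoder(nn.Module):
    def __init__(self, n_features: int, hidden_size: int = 64):
        super().__init__()
        self.encoder = nn.LSTM(n_features, hidden_size, batch_first=True)
        # Temporal attention: one score per encoder time step, softmaxed over time.
        self.attn = nn.Linear(hidden_size, 1)
        self.decoder = nn.LSTM(hidden_size, hidden_size, batch_first=True)
        self.out = nn.Linear(hidden_size, n_features)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, features)
        enc_out, _ = self.encoder(x)                          # (batch, time, hidden)
        weights = torch.softmax(self.attn(enc_out), dim=1)    # attention over time steps
        context = (weights * enc_out).sum(dim=1, keepdim=True)  # (batch, 1, hidden)
        # Repeat the attended context for each time step and decode back to input space.
        dec_out, _ = self.decoder(context.repeat(1, x.size(1), 1))
        return self.out(dec_out)                              # reconstruction of x


def dynamic_threshold(errors: torch.Tensor, window: int = 144, k: float = 3.0) -> torch.Tensor:
    """Rolling mean + k * std over the reconstruction-error sequence (1-D tensor)."""
    thresholds = torch.empty_like(errors)
    for t in range(len(errors)):
        hist = errors[max(0, t - window): t + 1]
        thresholds[t] = hist.mean() + k * hist.std(unbiased=False)
    return thresholds
```

In such a setup, the model would be trained only on normal-class windows to minimize reconstruction error; at test time, per-window errors (e.g., MSE between input and reconstruction) are compared against the rolling threshold, and points above it are flagged as anomalies.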

Links and Resources

BibTeX key:
nivarthi2023towards
