Transformers are popular models for a variety of tasks, including anomaly detection. Many methods have been developed that use attention for anomaly detection, including the Anomaly Transformer.
As part of our endeavor to provide popular models in Orion, we seek to add an Anomaly Transformer pipeline. Paper reference: https://arxiv.org/pdf/2110.02642.pdf
Design
Initial plan for primitives to use
- time segments aggregate primitive to make the samples equi-distant
- impute missing values
- scale the data using a standard scaler
- apply the transformer model
- find the threshold using attention values
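The preprocessing steps above (aggregation into equi-distant segments, imputation, scaling) can be sketched roughly as follows. This is an illustrative sketch, not Orion's actual primitives: the `time_segments_aggregate` helper here is a simplified stand-in for the real primitive, and the imputer/scaler come from scikit-learn.

```python
import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler

def time_segments_aggregate(X, index, interval):
    """Simplified stand-in: average values into equi-distant time segments.

    Segments with no observations get NaN, to be filled by imputation.
    """
    start, end = index[0], index[-1]
    values, timestamps = [], []
    while start <= end:
        mask = (index >= start) & (index < start + interval)
        if mask.any():
            values.append(X[mask].mean(axis=0))
        else:
            values.append(np.full(X.shape[1], np.nan))
        timestamps.append(start)
        start += interval
    return np.asarray(values), np.asarray(timestamps)

# toy irregular signal with a gap between t=2 and t=7
index = np.array([0, 1, 2, 7, 8])
X = np.array([[1.0], [2.0], [3.0], [8.0], [9.0]])

values, timestamps = time_segments_aggregate(X, index, interval=2)
values = SimpleImputer(strategy="mean").fit_transform(values)  # fill the empty segment
values = StandardScaler().fit_transform(values)                # zero mean, unit variance
```

The scaled, equi-distant sequence would then be windowed and fed to the transformer model.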
Evaluation Criteria
One clear difference between the Orion benchmark's evaluation and the paper's is that in Orion we only factor in the positive class (i.e. anomalies). In other words, we do not get credit for predicting normal sequences as normal. We mainly care about (1) detecting all the true anomalies and (2) not raising false alarms.
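To make the anomaly-only evaluation concrete, here is a minimal sketch of overlap-based precision and recall computed over anomalous intervals only, so correctly labeled normal regions contribute nothing. This is an illustration of the idea, not Orion's actual scoring code; the function and interval format are hypothetical.

```python
def overlap(a, b):
    """True if two (start, end) intervals intersect."""
    return a[0] <= b[1] and b[0] <= a[1]

def anomaly_precision_recall(expected, observed):
    """Precision/recall over anomalous intervals; true negatives are ignored.

    expected: ground-truth anomaly intervals as (start, end) pairs
    observed: detected anomaly intervals as (start, end) pairs
    """
    tp = sum(any(overlap(e, o) for o in observed) for e in expected)
    fn = len(expected) - tp
    fp = sum(not any(overlap(o, e) for e in expected) for o in observed)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return precision, recall

expected = [(100, 200), (500, 600)]
observed = [(150, 180), (700, 750)]  # one correct detection, one false alarm
precision, recall = anomaly_precision_recall(expected, observed)
# precision = 0.5 (one of two detections is real)
# recall = 0.5 (one of two true anomalies is found)
```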