
Add anomaly transformer method #452

Closed
sarahmish opened this issue Aug 28, 2023 · 0 comments · Fixed by #454
Labels
enhancement Improvements on the current features
Milestone
0.6.1

Comments

@sarahmish
Collaborator

Transformers are popular models for a variety of tasks, including anomaly detection. Many methods that use attention for anomaly detection have been developed, including Anomaly Transformer.

As part of our endeavor to provide popular models in Orion, we seek to provide an anomaly transformer pipeline. Paper reference: https://arxiv.org/pdf/2110.02642.pdf

Design
Initial plan for primitives to use:

  • time segments aggregate primitive to make the samples equidistant
  • impute missing values
  • scale the data using a standard scaler
  • transformer model
  • find the threshold using attention values

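The preprocessing steps above (aggregate, impute, scale) can be sketched in plain Python. This is an illustrative sketch, not the Orion primitives themselves; a real Orion pipeline wires these steps together as MLPrimitives rather than calling them inline, and the column names and 60-second interval here are assumptions.

```python
import numpy as np
import pandas as pd
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler

# Toy signal with a missing value and a missing timestamp (t=180).
df = pd.DataFrame({
    "timestamp": [0, 60, 120, 240, 300],
    "value": [1.0, 2.0, np.nan, 4.0, 5.0],
})

# 1. Time segments aggregate: resample onto an equidistant 60s grid,
#    which also materializes the missing t=180 segment as NaN.
df["timestamp"] = pd.to_datetime(df["timestamp"], unit="s")
agg = df.set_index("timestamp").resample("60s").mean()

# 2. Impute missing values (mean imputation as a placeholder choice).
values = SimpleImputer(strategy="mean").fit_transform(agg[["value"]])

# 3. Scale the data with a standard scaler (zero mean, unit variance).
scaled = StandardScaler().fit_transform(values)

print(scaled.shape)  # (6, 1): one row per 60-second segment
```

The scaled, equidistant sequence would then be windowed and fed to the transformer model.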
Evaluation Criteria
One clear difference between the Orion benchmark's evaluation calculation and the paper's is that in Orion we only factor in the positive class (i.e. anomalies). In other words, we do not care whether we predict normal sequences as normal. We mainly care about (1) getting all the correct anomalies and (2) not raising false alarms.
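A minimal sketch of this positive-class-only scoring, assuming anomalies are reported as (start, end) intervals and counting a prediction as correct if it overlaps any ground-truth interval. The overlap rule and function names are illustrative, not Orion's exact benchmark implementation.

```python
def overlaps(a, b):
    """True if half-open intervals a=(start, end) and b intersect."""
    return a[0] <= b[1] and b[0] <= a[1]

def score(truth, predicted):
    """Positive-class-only precision/recall over anomaly intervals.

    Correctly labeling normal regions as normal earns nothing here;
    only detected anomalies and false alarms affect the score.
    """
    # (1) getting all the correct anomalies -> recall
    caught = sum(any(overlaps(t, p) for p in predicted) for t in truth)
    # (2) not raising false alarms -> precision
    valid = sum(any(overlaps(p, t) for t in truth) for p in predicted)
    recall = caught / len(truth) if truth else 0.0
    precision = valid / len(predicted) if predicted else 0.0
    return precision, recall

truth = [(100, 200), (500, 600)]
predicted = [(150, 180), (700, 750)]
print(score(truth, predicted))  # (0.5, 0.5)
```

Here one true anomaly is caught and one prediction is a false alarm, so both precision and recall are 0.5; stretches of correctly predicted normal data never enter the calculation.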

@sarahmish sarahmish added the enhancement Improvements on the current features label Aug 28, 2023
@sarahmish sarahmish added this to the 0.6.1 milestone Oct 4, 2024