• PP: Robust Anomaly Detection for Multivariate Time Series through Stochastic Recurrent Neural Network


    PROBLEM: OmniAnomaly

    multivariate time series anomaly detection + unsupervised

    Main idea: input: multivariate time series to RNN ------> capture the normal patterns -----> reconstruct the input data from the learned representations ------> use the reconstruction probabilities to determine anomalies.

    INTRODUCTION: 

    1. The first challenge is how to learn robust latent representations, considering both the temporal dependence and stochasticity of multivariate time series. 

    -------stochastic RNN + explicit temporal dependence among stochastic variables. 

    Stochastic variables are latent representations of input data and their quality is the key to model performance.

    Their approach glues GRU and VAE with two key techniques:

    • stochastic variable connection technique: explicitly model temporal dependence among stochastic variables in the latent space. 
    • Planar Normalizing Flows (planar NF), which use a series of invertible mappings to learn non-Gaussian posterior distributions in the latent stochastic space (a minimal sketch follows this list).
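    Below is a minimal sketch of one planar flow layer in PyTorch; the class name `PlanarFlow` and the initialization details are my own choices, not from the paper. Each layer applies f(z) = z + u · tanh(wᵀz + b); stacking several layers turns a Gaussian z into a more flexible, non-Gaussian latent variable, with the summed log-determinants entering the ELBO.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PlanarFlow(nn.Module):
    """One planar flow layer: f(z) = z + u * tanh(w^T z + b)."""
    def __init__(self, latent_dim: int):
        super().__init__()
        self.w = nn.Parameter(torch.randn(latent_dim) * 0.01)
        self.u = nn.Parameter(torch.randn(latent_dim) * 0.01)
        self.b = nn.Parameter(torch.zeros(1))

    def forward(self, z):                        # z: (batch, latent_dim)
        # Reparameterize u so the mapping stays invertible.
        wu = torch.dot(self.w, self.u)
        u_hat = self.u + (F.softplus(wu) - 1.0 - wu) * self.w / (self.w.norm() ** 2 + 1e-8)
        lin = z @ self.w + self.b                # (batch,)
        f_z = z + u_hat * torch.tanh(lin).unsqueeze(-1)
        # log|det df/dz| = log|1 + u_hat^T psi|, with psi = (1 - tanh^2(lin)) * w
        psi = (1.0 - torch.tanh(lin) ** 2).unsqueeze(-1) * self.w
        log_det = torch.log(torch.abs(1.0 + psi @ u_hat) + 1e-8)
        return f_z, log_det                      # stack K layers and sum the log_dets
```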

    2. The second challenge is how to provide interpretation for the detected entity-level anomalies, given that the approach is a stochastic deep learning model.

    Challenges: 1. capture long-term temporal dependence; 2. capture the probability distributions of multivariate time series; 3. interpret the results (unsupervised learning).

    EVIDENCE: reference [5] shows that explicitly modeling the temporal dependence among stochastic variables gives better results.

    RELATED WORK: 

    PRELIMINARIES:

     Problem statement: the number of time series defines the dimensionality: there are M time series of length N, x ∈ R^{M×N}, and x_t is an M-dimensional column vector (the observation at time t).
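     A toy illustration of this notation (the numbers are made up, only the shapes matter):

```python
import numpy as np

# M = 3 metrics observed over N = 5 time steps.
M, N = 3, 5
x = np.arange(M * N, dtype=float).reshape(M, N)   # x ∈ R^{M×N}
x_t = x[:, 2]                                     # observation at t = 2: an M-dim vector
print(x.shape, x_t.shape)                         # (3, 5) (3,)
```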

     GRU, VAE, and stochastic gradient variational Bayes (SGVB)

    DESIGN

    OmniAnomaly structure: returns an anomaly score for x_t. 

    • online detection
    • offline training
      • data preprocessing: data standardization and sequence segmentation with sliding windows of length T+1 (see the sketch after this list);
      • input: multivariate time series inside a window -------> model training -------> output: an anomaly score for each observation -------> automatic threshold selection;
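    A minimal sketch of the preprocessing step, assuming NumPy; the function name, the default window length, and the stride of 1 are my own choices for illustration.

```python
import numpy as np

def preprocess(x: np.ndarray, T: int = 99) -> np.ndarray:
    """x: (M, N) multivariate time series -> sliding windows of shape (N - T, T + 1, M)."""
    # Data standardization: zero mean, unit variance per metric.
    mean = x.mean(axis=1, keepdims=True)
    std = x.std(axis=1, keepdims=True) + 1e-8
    x = ((x - mean) / std).T                          # time-major: (N, M)
    # Sequence segmentation: sliding windows of length T + 1, stride 1.
    return np.stack([x[i:i + T + 1] for i in range(len(x) - T)])
```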

    Detection: detect anomalies based on the reconstruction probability of x_t

    Loss function: ELBO; 
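    A minimal sketch of the ELBO estimated with a single reparameterized sample (SGVB), assuming a diagonal-Gaussian posterior and a standard-normal prior; OmniAnomaly's actual prior and posterior are richer (temporally connected latent variables transformed by planar flows), so this only illustrates the form of the objective.

```python
import math
import torch

def neg_elbo(x, x_mu, x_logvar, z_mu, z_logvar):
    """Negative ELBO for one reparameterized sample (minimize this)."""
    # Reconstruction term E_q[log p(x|z)], diagonal-Gaussian decoder.
    log_px = -0.5 * (x_logvar + (x - x_mu) ** 2 / x_logvar.exp()
                     + math.log(2 * math.pi)).sum(dim=-1)
    # KL(q(z|x) || N(0, I)), closed form for diagonal Gaussians.
    kl = 0.5 * (z_mu ** 2 + z_logvar.exp() - z_logvar - 1.0).sum(dim=-1)
    return (kl - log_px).mean()
```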

    Variational inference algorithms: SGVB;

    Output: a univariate time series of anomaly scores

    Automatic threshold selection: extreme value theory (EVT) + peaks-over-threshold (POT);
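    A rough sketch of POT-based threshold selection on a 1-D anomaly-score series, assuming higher score means more anomalous (OmniAnomaly actually thresholds low reconstruction probabilities, so signs would be flipped). The initial quantile and risk level q are illustrative values, not the paper's.

```python
import numpy as np
from scipy.stats import genpareto

def pot_threshold(scores: np.ndarray, init_quantile: float = 0.98, q: float = 1e-3):
    t = np.quantile(scores, init_quantile)         # high initial threshold
    excesses = scores[scores > t] - t              # peaks over the threshold
    # Fit a Generalized Pareto Distribution to the excesses (location fixed at 0).
    gamma, _, sigma = genpareto.fit(excesses, floc=0.0)
    n, n_t = len(scores), len(excesses)
    if abs(gamma) < 1e-6:                          # GPD degenerates to an exponential
        return t + sigma * np.log(n_t / (q * n))
    return t + (sigma / gamma) * ((q * n / n_t) ** (-gamma) - 1.0)
```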


    1. use GRU to capture complex temporal dependence in x-space. 

    2. apply VAE to map observations to stochastic variables.

    3. to explicitly model temporal dependence in the latent space, they propose the stochastic variable connection technique.

    4. adopt planar NF to learn non-Gaussian posterior distributions (a simplified end-to-end sketch follows).
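    A simplified, illustrative PyTorch sketch of steps 1-4: a GRU in x-space, a Gaussian latent variable whose parameters also depend on z_{t-1} (the stochastic variable connection), and a GRU in z-space for reconstruction. The planar flows and the paper's exact prior are omitted here; all names and layer sizes are assumptions, not the paper's implementation.

```python
import torch
import torch.nn as nn

class OmniAnomalySketch(nn.Module):
    def __init__(self, x_dim: int, h_dim: int = 64, z_dim: int = 3):
        super().__init__()
        self.enc_gru = nn.GRUCell(x_dim, h_dim)           # step 1: GRU in x-space
        self.q_net = nn.Linear(h_dim + z_dim, 2 * z_dim)  # steps 2-3: q(z_t | e_t, z_{t-1})
        self.dec_gru = nn.GRUCell(z_dim, h_dim)           # generative GRU in z-space
        self.p_net = nn.Linear(h_dim, 2 * x_dim)          # parameters of p(x_t | z)
        self.h_dim, self.z_dim = h_dim, z_dim

    def forward(self, x):                                 # x: (batch, T, x_dim)
        B, T, _ = x.shape
        e = x.new_zeros(B, self.h_dim)                    # deterministic encoder state
        d = x.new_zeros(B, self.h_dim)                    # deterministic decoder state
        z_prev = x.new_zeros(B, self.z_dim)               # z_{t-1}: the stochastic connection
        outs = {k: [] for k in ("x_mu", "x_logvar", "z_mu", "z_logvar")}
        for t in range(T):
            e = self.enc_gru(x[:, t], e)
            z_mu, z_logvar = self.q_net(torch.cat([e, z_prev], dim=-1)).chunk(2, dim=-1)
            z = z_mu + torch.randn_like(z_mu) * (0.5 * z_logvar).exp()   # reparameterization
            # step 4: planar normalizing flows would transform z here (omitted in this sketch)
            d = self.dec_gru(z, d)
            x_mu, x_logvar = self.p_net(d).chunk(2, dim=-1)
            for k, v in zip(outs, (x_mu, x_logvar, z_mu, z_logvar)):
                outs[k].append(v)
            z_prev = z
        return {k: torch.stack(v, dim=1) for k, v in outs.items()}       # each: (batch, T, ·)
```

    Training would minimize a negative ELBO such as the one sketched earlier; at detection time, the reconstruction probability of x_t under p(x_t | z) serves as the anomaly score, with a low probability indicating an anomaly.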

    Evaluation:

    Precision, Recall, and F1-Score (denoted as F1) are used to evaluate the performance of OmniAnomaly.
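    For reference, a point-wise precision/recall/F1 computation over binary anomaly labels (the paper additionally adopts a segment-level adjustment: if any point inside a true anomaly segment is detected, the whole segment counts as detected).

```python
import numpy as np

def prf1(pred: np.ndarray, truth: np.ndarray):
    """pred, truth: binary arrays (1 = anomaly) of equal length."""
    tp = np.sum((pred == 1) & (truth == 1))
    fp = np.sum((pred == 1) & (truth == 0))
    fn = np.sum((pred == 0) & (truth == 1))
    precision = tp / (tp + fp + 1e-12)
    recall = tp / (tp + fn + 1e-12)
    f1 = 2 * precision * recall / (precision + recall + 1e-12)
    return precision, recall, f1
```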

    Baseline: 

    1. LSTM with nonparametric dynamic thresholding
    2. EncDec-AD
    3. DAGMM
    4. LSTM-VAE
    5. Donut; adapted with a different strategy so that it applies to multivariate time series.

    Supplementary knowledge:

    1. VAE: 

    inference net qnet + generative net pnet.

    2. GRU: gated recurrent unit
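    To make the gating explicit, here is a minimal from-scratch GRU cell; it mirrors the behavior of torch.nn.GRUCell, and the dimensions and names are only illustrative.

```python
import torch
import torch.nn as nn

class MiniGRUCell(nn.Module):
    def __init__(self, x_dim: int, h_dim: int):
        super().__init__()
        self.lin_z = nn.Linear(x_dim + h_dim, h_dim)   # update gate
        self.lin_r = nn.Linear(x_dim + h_dim, h_dim)   # reset gate
        self.lin_n = nn.Linear(x_dim + h_dim, h_dim)   # candidate state

    def forward(self, x, h):
        z = torch.sigmoid(self.lin_z(torch.cat([x, h], -1)))   # how much to keep of the past
        r = torch.sigmoid(self.lin_r(torch.cat([x, h], -1)))   # how much past to expose
        n = torch.tanh(self.lin_n(torch.cat([x, r * h], -1)))  # candidate hidden state
        return (1 - z) * n + z * h                              # new hidden state
```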

    Reference

    1. "A GRU anyone can understand" (Chinese blog post on GRU)
    2. "Variational Autoencoder (VAE): what it actually is | with open-source code" (Chinese blog post on VAE)