• Paper: A Dual-Stage Attention-Based Recurrent Neural Network for Time Series Prediction


    Problem: time series prediction

    The nonlinear autoregressive exogenous (NARX) model predicts the current value of a time series from its previous values together with the current and past values of multiple driving (exogenous) series.
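In symbols, with target series y and driving series x, the standard NARX prediction can be written as:

```latex
\hat{y}_T = F\left(y_1, \ldots, y_{T-1},\; \mathbf{x}_1, \ldots, \mathbf{x}_T\right)
```

where F is the nonlinear mapping to be learned.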

    However, few existing NARX models can appropriately capture long-term temporal dependencies and select the relevant driving series when making a prediction.

    Two issues:

    1. capturing long-term temporal dependencies

    2. selecting the relevant driving series to make a prediction

    We propose a dual-stage attention-based RNN (DA-RNN) to address these two issues.

    1. First stage: an input attention mechanism extracts the relevant driving series at each time step.

    2. Second stage: a temporal attention mechanism selects relevant encoder hidden states across all time steps.
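As a sketch of how the first-stage input attention could score the driving series, here is a minimal NumPy illustration. The weight shapes and symbol names (W_e, U_e, v_e, the encoder hidden/cell states h and s) follow the paper's formulation, but the function itself is a hypothetical simplification, not the paper's implementation:

```python
import numpy as np

def softmax(z):
    # numerically stable softmax over a 1-D score vector
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def input_attention(X, h, s, W_e, U_e, v_e):
    """Stage-1 input attention (sketch).

    X   : (n, T) array, n driving series over T time steps
    h, s: encoder hidden and cell state, each shape (m,)
    W_e : (T, 2m), U_e: (T, T), v_e: (T,) -- assumed parameter shapes
    Returns alpha, shape (n,): relevance weight of each driving series.
    """
    n, _ = X.shape
    hs = np.concatenate([h, s])  # (2m,)
    scores = np.array(
        [v_e @ np.tanh(W_e @ hs + U_e @ X[k]) for k in range(n)]
    )
    return softmax(scores)
```

The weights alpha would then rescale the raw inputs at step t, e.g. `x_tilde = alpha * X[:, t]`, so the encoder attends more to the relevant driving series.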

    Background: attention-based encoder-decoder networks for time series prediction, built on recurrent units such as LSTM or GRU.

    One problem with encoder-decoder networks is that their performance deteriorates rapidly as the length of the input sequence increases.

    Contribution: the two-stage attention mechanism, with input attention over the driving series and temporal attention over all time steps.

    The input attention selects the relevant driving series.

    The temporal attention captures long-range temporal dependencies.
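The second-stage temporal attention can be sketched the same way. It scores every encoder hidden state against the decoder's state and returns a context vector; again, the shapes and names (W_d, U_d, v_d, decoder states d and s') follow the paper's notation, but the function is an assumed simplification:

```python
import numpy as np

def softmax(z):
    # numerically stable softmax over a 1-D score vector
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def temporal_attention(H, d, s_dec, W_d, U_d, v_d):
    """Stage-2 temporal attention (sketch).

    H       : (T, m) encoder hidden states across all T time steps
    d, s_dec: decoder hidden and cell state, each shape (p,)
    W_d     : (m, 2p), U_d: (m, m), v_d: (m,) -- assumed parameter shapes
    Returns beta (T,) attention over time steps and the context vector (m,).
    """
    T, _ = H.shape
    ds = np.concatenate([d, s_dec])  # (2p,)
    scores = np.array(
        [v_d @ np.tanh(W_d @ ds + U_d @ H[t]) for t in range(T)]
    )
    beta = softmax(scores)
    context = beta @ H  # weighted sum of encoder hidden states
    return beta, context
```

Because the context is a weighted sum over all encoder states rather than only the last one, the decoder is less sensitive to input length, which is the failure mode of plain encoder-decoder networks noted above.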

    Supplementary knowledge:

    1. What is a driving series? A driving series is an exogenous input series whose current and past values are used, together with the target's own history, to predict the target series.

• Original post: https://www.cnblogs.com/dulun/p/12267003.html