Thank you very much for your answer. I now know how to run your code. But I have another question: how do I get the input attention weights as the authors did in the original paper (A Dual-Stage Attention-Based Recurrent Neural Network for Time Series Prediction)? I tried but failed. Looking forward to your help.
In order to obtain the input attention weights (and analogously the temporal attention weights), you would have to save the attention node alpha (beta for temporal attention) as an attribute of the model, i.e. refactor that line of code to `self.alpha`, and then when calling `session.run` pass that node to be evaluated in addition to the rest.
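For concreteness, here is a minimal sketch of that pattern using the TF1 session API mentioned above (`ToyModel` and its tensor names are illustrative stand-ins, not the repo's actual code):

```python
import tensorflow as tf

class ToyModel:
    def __init__(self, n_series=5):
        # stand-in for the attention scores `e` computed inside the input-attention encoder
        self.scores = tf.placeholder(tf.float32, [None, n_series])
        # save the softmax node as an attribute (self.alpha); do the same with
        # self.beta for the temporal attention in the decoder
        self.alpha = tf.nn.softmax(self.scores, axis=1)
        # stand-in for the model's prediction op
        self.prediction = tf.reduce_sum(self.scores * self.alpha, axis=1)

model = ToyModel()
with tf.Session() as sess:
    preds, alpha_vals = sess.run(
        [model.prediction, model.alpha],  # add the saved alpha node to the fetches
        feed_dict={model.scores: [[0.1, 0.4, 0.2, 0.8, 0.5]]})
    print(alpha_vals)  # per-driving-series input attention weights for this batch
```

`alpha_vals` can then be averaged or plotted over the driving series, as in the attention figures of the paper.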
The last question I have got it, thanks! But I have another question. I want to set the horizon (or stride, as it is called in time series papers) to 2, 3, ... 7 rather than 1. I only changed the code in model.py to:
for t in range(self.config.T):
    if t < self.config.T:
        if t < self.config.T - 2:   # also tried -3, ..., -7
but the result is lagged. Maybe there is something else wrong, or DA-RNN is not suitable for this kind of prediction. I am really confused, because using the current values to predict the current target value is not meaningful in many cases.
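For reference, a hypothetical sketch (the `make_windows` helper below is not part of this repo) of an alternative approach: leave the loop over `t` in model.py untouched and instead shift the label by the horizon when building the training windows, so the network learns to map x_1..x_T and y_1..y_{T-1} to the value `horizon` steps after y_{T-1}:

```python
import numpy as np

def make_windows(series, target, T, horizon=1):
    """series: (N, n_drivers), target: (N,); horizon=1 reproduces the one-step setup."""
    X, y_hist, y_label = [], [], []
    for start in range(len(target) - T - horizon + 2):
        X.append(series[start:start + T])                # driving series over the window
        y_hist.append(target[start:start + T - 1])       # target history fed to the decoder
        y_label.append(target[start + T - 2 + horizon])  # label `horizon` steps after y_{T-1}
    return np.array(X), np.array(y_hist), np.array(y_label)
```

This keeps the model definition unchanged and moves the multi-step choice entirely into data preparation.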