TimeDistributed wrapper for dense layer (Keras)

Dear all,

I’m trying to build a workflow for multi-step sequence prediction using a simple LSTM network.

As far as I’m aware, this would require the following:

Is there any option to set up that kind of wrapper, ideally without resorting to Python snippet nodes?

Best Regards, and thank you for your help!


Jorge

Hello @menuetto,

To directly answer your question: no, there is currently no way to set up a TimeDistributed wrapper other than by resorting to the DL Python Network Editor node.
However, I don’t think you actually need this wrapper for what you are describing, because Keras’ Dense layer is applied time-distributed when called on a sequence.
See for example this Keras blog post, which shows how to do neural machine translation (a kind of multi-step sequence prediction), or our example workflow on the same topic:
https://hub.knime.com/knime/workflows/Examples/04_Analytics/14_Deep_Learning/02_Keras/12_Machine_Translation/NMT_Training*WI2o2pt4hStUwRrZ
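To illustrate the point about Dense: here is a minimal sketch (the layer sizes and the 10-timestep/3-feature input shape are made up for the example) showing that Dense is applied to each timestep independently when its input is a sequence, with no TimeDistributed wrapper needed:

```python
from tensorflow import keras
from tensorflow.keras import layers

# Hypothetical shapes: 10 timesteps, 3 features per step.
inputs = keras.Input(shape=(10, 3))
# return_sequences=True keeps the time axis: (batch, 10, 16).
x = layers.LSTM(16, return_sequences=True)(inputs)
# Dense on a 3-D input acts on the last axis, i.e. per timestep,
# so the output is (batch, 10, 1) without any wrapper.
outputs = layers.Dense(1)(x)
model = keras.Model(inputs, outputs)

print(model.output_shape)  # (None, 10, 1)
```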

Kind regards,

Adrian


Hi @nemad, thank you for your reply!

If I’m understanding correctly, I can indeed use the concept behind the workflow you are referencing for my task (predicting several timesteps into the future for a given time series) by feeding back the prediction for timestep t to obtain a prediction for timestep t+1, and so on, one timestep at a time.

My concern here is the accumulation of prediction errors (see https://datascience.stackexchange.com/questions/15524/is-time-series-multi-step-ahead-forecasting-a-sequence-to-sequence-problem), compared to a “many to many” approach as described in:

https://machinelearningmastery.com/timedistributed-layer-for-long-short-term-memory-networks-in-python/

…in which all timesteps are predicted at once.
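For reference, the recursive (feed-back) approach I mean can be sketched like this; `predict_one_step` is a toy stand-in for a trained model’s predict call, and all names here are hypothetical:

```python
import numpy as np

def recursive_forecast(predict_one_step, history, n_steps):
    """Predict n_steps ahead by feeding each prediction back in.

    predict_one_step: callable mapping a 1-D window to the next value
        (stands in for model.predict on a trained network).
    history: the most recent observations, oldest first.
    """
    window = list(history)
    preds = []
    for _ in range(n_steps):
        next_val = predict_one_step(np.asarray(window))
        preds.append(next_val)
        # Slide the window forward: drop the oldest observation and
        # append the prediction -- this is where errors accumulate.
        window = window[1:] + [next_val]
    return preds

# Toy stand-in model: "predict" the mean of the current window.
forecast = recursive_forecast(lambda w: float(w.mean()),
                              [1.0, 2.0, 3.0], n_steps=2)
print(forecast)
```

Each later step is conditioned on earlier predictions rather than observations, which is exactly the error-accumulation concern above.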

Anyway, I will try to fiddle with the DL Python Network Editor and compare the two approaches because, to be honest, for now I’m just hypothesizing about the model that would best fit my use case.

Just one more thing: one of the alternatives I am also exploring is a 1D CNN with dilated causal convolutions, but I’m unable to find a way to set the “Padding” option to “Causal”… is this configuration supported in any way?

Best Regards, and thank you for your time and support,


Jorge

I understand your concern, but I don’t think the TimeDistributed wrapper will help you with this issue, as it applies its argument layer to slices of a sequence (which the Dense layer already supports without the wrapper).
Regarding your other question: I have to admit this is the first time I’ve heard of causal padding, and I have created a ticket for adding it in the future. For now, you will have to rely on the DL Python Network Editor node to add Conv1D layers with causal padding.
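For example, inside the DL Python Network Editor the layer stack could look roughly like this (the filter counts, sequence length, and dilation rates are just placeholders, not a recommendation):

```python
from tensorflow import keras
from tensorflow.keras import layers

# Sketch of a dilated causal Conv1D stack; shapes are hypothetical.
inputs = keras.Input(shape=(32, 1))
x = inputs
for rate in (1, 2, 4):  # exponentially growing dilation, WaveNet-style
    x = layers.Conv1D(8, kernel_size=2,
                      padding="causal",      # no "leakage" from the future
                      dilation_rate=rate,
                      activation="relu")(x)
outputs = layers.Dense(1)(x)
model = keras.Model(inputs, outputs)

# Causal padding pads on the left only, so the time axis is preserved.
print(model.output_shape)
```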

Best,

Adrian


Excellent, thank you @nemad!

@nemad, quick question: any sense of when to expect the TimeDistributed wrapper node? It would be super helpful to have. Thanks!

Hello @whois_rb,

I’ll bring it up again, but at this point there are no plans for a TimeDistributed wrapper node.
If you need this functionality, you’ll have to resort to the DL Python Network Editor node, which allows you to modify your network using the Keras Python API.
Please also note that we added support for TensorFlow 2 in the latest release, which gives you even more freedom in terms of layers and functionality.

Best,
Adrian
