What is the role of TimeDistributed layer in Keras?

Buomsoo Kim

I am trying to grasp what the TimeDistributed wrapper does in Keras.

I get that TimeDistributed "applies a layer to every temporal slice of an input."

But I ran some experiments and got results that I cannot understand.

In short, when connected to an LSTM layer, TimeDistributed(Dense) and a plain Dense layer produce the same results.

from keras.models import Sequential
from keras.layers import LSTM, Dense, TimeDistributed

# Model 1: Dense wrapped in TimeDistributed
model = Sequential()
model.add(LSTM(5, input_shape=(10, 20), return_sequences=True))
model.add(TimeDistributed(Dense(1)))
print(model.output_shape)

# Model 2: plain Dense
model = Sequential()
model.add(LSTM(5, input_shape=(10, 20), return_sequences=True))
model.add(Dense(1))
print(model.output_shape)

For both models, I got an output shape of (None, 10, 1).

Can anyone explain the difference between TimeDistributed and Dense layer after an RNN layer?

Marcin Możejko

In Keras, when building a sequential model, the second dimension (the one after the sample dimension) is usually the time dimension. This means that if, for example, your data is 5-D with shape (sample, time, width, length, channel), you can apply a convolutional layer (which expects 4-D input of shape (sample, width, length, channel)) along the time dimension by wrapping it in TimeDistributed: the same layer is applied to each time slice, producing a 5-D output.
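As a minimal sketch of that idea, assuming a standalone Keras 2.x setup (the input shape (8, 32, 32, 3) and the Conv2D settings are purely illustrative, not taken from the question):

from keras.models import Sequential
from keras.layers import TimeDistributed, Conv2D

# Per-sample input shape: (time, width, length, channel) = (8, 32, 32, 3).
# TimeDistributed applies the same Conv2D (which expects 4-D input of
# shape (sample, width, length, channel)) to each of the 8 time slices.
model = Sequential()
model.add(TimeDistributed(Conv2D(16, (3, 3), padding='same'),
                          input_shape=(8, 32, 32, 3)))
print(model.output_shape)  # (None, 8, 32, 32, 16)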

The case with Dense is that since Keras 2.0, Dense is by default applied only to the last dimension (e.g. if you apply Dense(10) to an input with shape (n, m, o, p), you get an output with shape (n, m, o, 10)), so in your case Dense and TimeDistributed(Dense) are equivalent.
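A minimal sketch of that behaviour, assuming Keras >= 2.0 (the shape (3, 4, 5) is an arbitrary example, not from the question):

from keras.models import Sequential
from keras.layers import Dense

# Dense(10) on an input of shape (n, m, o, p): only the last axis changes.
model = Sequential()
model.add(Dense(10, input_shape=(3, 4, 5)))  # batch dimension n is implicit
print(model.output_shape)  # (None, 3, 4, 10)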
