Confusion about Keras RNN Input shape requirement

Kristofer

I have read plenty of posts on this point. They are inconsistent with each other, and every answer seems to offer a different explanation, so I thought I would ask based on my analysis of all of them.

As the Keras RNN documentation states, the input shape always has the form (batch_size, timesteps, input_dim). I am a bit confused about this, but my guess (I'm not sure) is that input_dim is always 1, while timesteps depends on your problem (it could be the data dimension as well). Is that roughly correct?

The reason I ask is that I always get an error whenever I try to set input_dim to the dimensionality of my dataset (which is what input_dim sounds like it should be!). So I assumed instead that input_dim represents the shape of the input vector fed to the LSTM at a single time step. Am I wrong again?

import numpy as np
from sklearn.model_selection import train_test_split
from keras.models import Sequential
from keras.layers import LSTM, Dense

# C: input samples, r: binary labels.
# Add a trailing "features" axis so each sample becomes (timesteps, 1).
C = C.reshape((C.shape[0], C.shape[1], 1))
tr_C, ts_C, tr_r, ts_r = train_test_split(C, r, train_size=.8)
batch_size = 1000

print('Build model...')
model = Sequential()

# stateful=True requires a fixed batch_input_shape = (batch_size, timesteps, features)
model.add(LSTM(8, batch_input_shape=(batch_size, C.shape[1], 1), stateful=True, activation='relu'))
model.add(Dense(1, activation='relu'))

print('Training...')
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])

model.fit(tr_C, tr_r,
          batch_size=batch_size, epochs=1,
          shuffle=True, validation_data=(ts_C, ts_r))

Thanks!

Daniel Möller

Indeed, input_dim is the shape of the input vector at a single time step. In other words, input_dim is the number of input features.

It's not necessarily 1, though. If you're working with more than one variable, it can be any number.

Suppose you have 10 sequences, each with 200 time steps, and you're measuring just a temperature. Then you have one feature:

  • input_shape = (200,1) -- notice that the batch size (number of sequences) is ignored here
  • batch_input_shape = (10,200,1) -- only in specific cases, such as stateful=True, do you need a batch input shape.
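As a quick numpy sketch of that single-feature layout (the array name here is illustrative, not from the original post):

```python
import numpy as np

# 10 sequences, 200 time steps each, 1 feature (temperature)
data = np.random.rand(10, 200, 1)

# input_shape omits the batch dimension:
input_shape = data.shape[1:]    # (200, 1)
# batch_input_shape includes it (needed e.g. for stateful=True):
batch_input_shape = data.shape  # (10, 200, 1)

print(input_shape, batch_input_shape)
```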

Now suppose you're measuring not only temperature but also pressure and volume. Now you've got three input features:

  • input_shape = (200,3)
  • batch_input_shape = (10,200,3)

In other words, the first dimension is the number of different sequences, the second is the length of each sequence (how many measurements along time), and the last is how many variables you measure at each time step.
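A sketch of how that three-feature array comes about: stacking one (sequences, timesteps) array per measured variable along a new last axis yields the (10, 200, 3) shape described above (variable names are illustrative):

```python
import numpy as np

# one (10, 200) array per measured variable
temperature = np.random.rand(10, 200)
pressure    = np.random.rand(10, 200)
volume      = np.random.rand(10, 200)

# stack along a new last axis: features become the third dimension
data = np.stack([temperature, pressure, volume], axis=-1)
print(data.shape)  # (10, 200, 3)

# so the layer would take input_shape=(200, 3),
# or batch_input_shape=(10, 200, 3) when stateful=True
```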
