LSTM autoencoder with Keras: data shape issue

TheTechGuy

I am trying to build an LSTM autoencoder model using Keras. Here is what I have tried:

data = df.values
timesteps = 10
dim = data.shape[1]
samples = data.shape[0]
data.shape = (int(samples/timesteps),timesteps,dim)

and then

model = Sequential()
model.add(LSTM(50,input_shape=(timesteps,dim),return_sequences=True))
model.add(LSTM(50,input_shape=(timesteps,dim),return_sequences=True))
model.add(LSTM(50,input_shape=(timesteps,dim),return_sequences=True))
model.add(LSTM(50,input_shape=(timesteps,dim),return_sequences=True))
model.add(LSTM(50,input_shape=(timesteps,dim),return_sequences=True))
model.add(LSTM(50,input_shape=(timesteps,dim),return_sequences=True))
model.add(LSTM(50,input_shape=(timesteps,dim),return_sequences=True))
model.add(LSTM(50,input_shape=(timesteps,dim),return_sequences=True))
model.add(LSTM(50,input_shape=(timesteps,dim),return_sequences=True))
model.add(LSTM(50,input_shape=(timesteps,dim),return_sequences=True))
model.add(LSTM(50,input_shape=(timesteps,dim),return_sequences=True))
model.compile(loss='mae', optimizer='adam')

This is my model fit call:

model.fit(data, data, epochs=50, batch_size=72, validation_data=(data, data), verbose=0, shuffle=False)

This is the error message I am getting:

ValueError: Error when checking target: expected lstm_33 to have shape (None, 10, 50) but got array with shape (711, 10, 1)

How can I fix this?

I have only one data set.

Update

The input data shape I have is (7110, 1).

This is univariate time series data.

Kurtis Streutker

The error is caused by specifying input_shape=(timesteps, dim) for every layer. You only need to do this for the first layer; the remaining layers infer their input shape from the layer before them. By repeating it on every layer you are overriding the inferred input shape, which is what causes the error.
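
For reference, here is a minimal sketch of the answer's advice in code (not the answerer's exact model): input_shape is passed only to the first LSTM, and every later layer infers its shape from the previous one. The TimeDistributed(Dense(dim)) output head and the random stand-in data are assumptions added here so the model's output shape (None, timesteps, dim) matches the reconstruction target.

import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense, TimeDistributed

timesteps, dim = 10, 1                      # 10-step windows of a univariate series
data = np.random.rand(711, timesteps, dim)  # stand-in for the reshaped data

model = Sequential()
model.add(LSTM(50, input_shape=(timesteps, dim), return_sequences=True))  # only the first layer gets input_shape
model.add(LSTM(50, return_sequences=True))  # input shape inferred from the previous layer
model.add(TimeDistributed(Dense(dim)))      # assumption: project each timestep back to dim features so the output matches the target
model.compile(loss='mae', optimizer='adam')
model.fit(data, data, epochs=2, batch_size=72, verbose=0, shuffle=False)

With the Dense head at the end, model.summary() should report an output shape of (None, 10, 1), which matches the target array of shape (711, 10, 1).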
