Merge recurrent layers with a dense layer in Keras

I want to build a neural network where the first two layers are simple feedforward (Dense) layers and the last one is recurrent. Here is my code:

from keras.models import Sequential
from keras.layers import Dense, SimpleRNN
from keras import optimizers as OP

model = Sequential()
model.add(Dense(150, input_dim=23, init='normal', activation='relu'))
model.add(Dense(80, activation='relu', init='normal'))
model.add(SimpleRNN(2, init='normal'))
adam = OP.Adam(lr=0.001, beta_1=0.9, beta_2=0.999, epsilon=1e-08)
model.compile(loss="mean_squared_error", optimizer="rmsprop")

and I get this error:

Exception: Input 0 is incompatible with layer simplernn_11: expected  ndim=3, found ndim=2.
2 answers

It is true that in Keras a recurrent (RNN) layer expects input of shape (nb_samples, time_steps, input_dim). However, you can still add an RNN layer after a Dense layer if you reshape the input that goes into the RNN layer. Reshape can be used both as the first layer and as an intermediate layer in a Sequential model. Examples are given below:

Reshape as the first layer in a Sequential model:

from keras.models import Sequential
from keras.layers import Reshape

# as the first layer in a Sequential model
model = Sequential()
model.add(Reshape((3, 4), input_shape=(12,)))
# now: model.output_shape == (None, 3, 4)
# note: `None` is the batch dimension

# as an intermediate layer in a Sequential model
model.add(Reshape((6, 2)))
# now: model.output_shape == (None, 6, 2)

In your case, the last Dense layer produces a 2-D output, so you need to insert a Reshape layer between that Dense layer and the SimpleRNN layer to add the missing time-step dimension. The modified code looks like this:

from keras.models import Sequential
from keras.layers import Dense, SimpleRNN, Reshape
from keras.optimizers import Adam

model = Sequential()
model.add(Dense(150, input_dim=23, init='normal', activation='relu'))
model.add(Dense(80, activation='relu', init='normal'))
model.add(Reshape((1, 80)))  # turn the 2-D Dense output into (batch, 1 time step, 80 features)
model.add(SimpleRNN(2, init='normal'))
adam = Adam(lr=0.001, beta_1=0.9, beta_2=0.999, epsilon=1e-08)
model.compile(loss="mean_squared_error", optimizer=adam)  # use the Adam optimizer defined above
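
To sanity-check the reshaped model, here is a quick sketch that fits it on randomly generated dummy data (the array sizes below are illustrative, not from the question). It assumes Keras 1.x, which matches the init=/lr= arguments used above; in Keras 2 the fit argument is epochs rather than nb_epoch:

import numpy as np

X = np.random.rand(100, 23)   # 100 dummy samples with 23 input features
y = np.random.rand(100, 2)    # 2 targets per sample, matching SimpleRNN(2)

print(model.output_shape)     # (None, 2)
model.fit(X, y, nb_epoch=2, batch_size=16, verbose=1)

If the Reshape layer is in place, this trains without the ndim error.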

In Keras you cannot stack a Recurrent layer directly on top of a Dense layer: Dense produces a 2-D output of shape (nb_samples, output_dim), whereas Recurrent layers expect a 3-D input of shape (nb_samples, time_steps, input_dim). The 2-D output of the Dense layer is therefore incompatible with the Recurrent layer, i.e. the Dense output has to be reshaped (for example with a Reshape layer) before it can be fed into the recurrent layer.
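
To see the shape mismatch concretely, here is a minimal sketch (reusing the layer sizes from the question) that prints model.output_shape after each layer is added; until the Reshape is inserted, the output stays 2-D:

from keras.models import Sequential
from keras.layers import Dense, Reshape

model = Sequential()
model.add(Dense(150, input_dim=23))
print(model.output_shape)    # (None, 150)   -> 2-D: (nb_samples, output_dim)
model.add(Dense(80))
print(model.output_shape)    # (None, 80)    -> still 2-D, not usable by SimpleRNN
model.add(Reshape((1, 80)))
print(model.output_shape)    # (None, 1, 80) -> 3-D: (nb_samples, time_steps, input_dim)

Only after the Reshape does the output have the three dimensions that SimpleRNN expects.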
