When I run the model built by the code below, I get the following error:

    You must feed a value for placeholder tensor 'bidirectional_1/keras_learning_phase' with dtype bool

If I add a model.add(Dropout(dropout)) layer (uncommenting the line in the code), it works. Does anyone know why? Backend: TensorFlow, Keras 2.0.1.
    from keras.models import Sequential
    # Merge is a legacy layer in Keras 2; if this import fails, try keras.legacy.layers
    from keras.layers import Dense, Dropout, LSTM, Bidirectional, Merge

    def prep_model1(embedding_layer1, embedding_layer2, dropout=0.5):
        # First branch: embedding followed by a bidirectional LSTM
        model0 = Sequential()
        model0.add(embedding_layer1)
        model0.add(Bidirectional(LSTM(128, return_sequences=False, dropout=dropout)))

        # Second branch: same structure on the second embedding
        model1 = Sequential()
        model1.add(embedding_layer2)
        model1.add(Bidirectional(LSTM(128, return_sequences=False, dropout=dropout)))

        # Concatenate the two branches and classify
        model = Sequential()
        model.add(Merge([model0, model1], mode='concat', concat_axis=1))
        #model.add(Dropout(dropout))  # adding this layer makes the error go away
        model.add(Dense(1, activation='sigmoid'))
        return model
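Not part of the original question, but a commonly suggested workaround for this class of error (a sketch under the assumption that the learning-phase placeholder is simply not being fed) is to pin the Keras backend learning phase explicitly before building or evaluating the model. Dropout layers consult this boolean placeholder to decide between train and test behavior; setting it to a concrete value means TensorFlow no longer needs it fed at run time:

```python
from keras import backend as K

# 0 = test/inference mode, 1 = training mode.
# Call this BEFORE constructing the model so all layers pick up
# the fixed learning phase instead of the boolean placeholder.
K.set_learning_phase(0)

# ... then build the model with prep_model1(...) and call
# predict/evaluate as usual ...
```

Whether this is the right fix here depends on how the model is invoked; it is offered as a diagnostic sketch, not a confirmed solution.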
deep-learning tensorflow keras
wolfshow