I have a large dataset, so I'm trying to train with train_on_batch (or, equivalently, fit with a single epoch):
model = Sequential()
model.add(LSTM(size, input_shape=input_shape, return_sequences=False))
model.add(Dense(output_dim))
model.add(Activation('softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=["accuracy"])

for e in range(nb_epoch):
    for batch_X, batch_y in batches:
        model.train_on_batch(batch_X, batch_y)
        # or
        # model.fit(batch_X, batch_y, batch_size=batch_size, nb_epoch=1, verbose=1, shuffle=True)
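(As an aside: one common cause of a loss collapsing to -0.0000e+00 under categorical_crossentropy is feeding integer class labels where one-hot vectors are expected. A minimal sketch of one-hot encoding batch targets with NumPy, with hypothetical array names and class count, in case that is the problem here:)

```python
import numpy as np

def to_one_hot(labels, num_classes):
    """Convert integer class labels to one-hot rows, the target
    format that categorical_crossentropy expects."""
    one_hot = np.zeros((len(labels), num_classes), dtype=np.float32)
    one_hot[np.arange(len(labels)), labels] = 1.0
    return one_hot

# Hypothetical integer targets for one batch of 4 samples, 3 classes:
batch_y_int = np.array([0, 2, 1, 2])
batch_y = to_one_hot(batch_y_int, num_classes=3)
# Each row has a single 1.0 at the true class index and sums to 1.
```

(If the targets are already one-hot, this isn't the issue; another thing worth checking is that output_dim matches the number of classes.)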
But when training begins, this happens:
(0, 128)
Epoch 1/1
128/128 [...]
(129, 257)
Epoch 1/1
128/128 [...]
No matter how many epochs I wait, this does not change. Even if I change the batch size, the same thing happens: the first batch has good values, and then the output just becomes "loss: -0.0000e+00 - acc: 0.0000e+00".
Can someone help me understand what is happening here?
Thanks in advance.