Keras: What if the data size is not divisible by batch_size?

I am new to Keras and just started working through some examples. I am dealing with the following problem: I have 4032 samples, and I want to use about 650 of them for fitting and training the model, then use the rest for testing it. The problem is that I keep getting the following error:

Exception: In a stateful network, you should only pass inputs with a number of samples that can be divided by the batch size.

I understand why I get this error, my question is: what if the size of my data is not divisible by batch_size? I used to work with Deeplearning4j LSTM and did not deal with this problem. Is there any way around this?
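For illustration, the simplest workaround (an assumption on my part, not something the question or answer mentions) is to truncate the data to the largest multiple of `batch_size` before fitting:

```python
# Not from the original post: a stateful Keras model requires
# num_samples % batch_size == 0, so drop the remainder.
n_samples = 650
batch_size = 32

usable = (n_samples // batch_size) * batch_size  # largest multiple of batch_size
print(usable)                # 640
print(usable % batch_size)   # 0 -- now divisible by batch_size
```

This discards up to `batch_size - 1` samples per split, which is usually acceptable; the answer below avoids even that loss.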

Thanks.

1 answer

Use `fit_generator`. Write your own data loader that always returns full batches: a `get_next_batch_data` method that, when it reaches the end of the data, wraps around to the beginning of the (optionally reshuffled) index list to pad the last batch up to `batchsize`. For example:

import random

class BatchedLoader():
    def __init__(self, data, batchsize):
        self.data = data
        self.batchsize = batchsize
        self.possible_indices = list(range(len(data)))  # (say N = 33)
        self.cur_it = 0
        self.cur_epoch = 0

    def get_batch_indices(self):
        batch_indices = self.possible_indices[self.cur_it : self.cur_it + self.batchsize]
        self.cur_it += self.batchsize
        # If len(batch_indices) < batchsize, you've reached the end.
        # In that case, reset cur_it, increase cur_epoch, shuffle
        # possible_indices if wanted, and pad with the remaining
        # K = batchsize - len(batch_indices) indices from the start.
        if len(batch_indices) < self.batchsize:
            k = self.batchsize - len(batch_indices)
            self.cur_it = k
            self.cur_epoch += 1
            random.shuffle(self.possible_indices)
            batch_indices += self.possible_indices[:k]
        return batch_indices

    def get_next_batch_data(self):
        batch_indices = self.get_batch_indices()
        # The data points corresponding to those indices are your next batch
        return [self.data[i] for i in batch_indices]
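Building on the same idea, here is a minimal self-contained sketch of a generator that yields full, wrapped-around batches forever. The function name `batch_generator` and the `fit_generator` call in the usage comment are my own illustration (Keras 1.x-era API), not code from the answer:

```python
import numpy as np

def batch_generator(X, y, batch_size, shuffle=True):
    """Yield (X_batch, y_batch) pairs forever; the short batch at the
    end of each epoch is padded with samples from the next epoch, so
    every batch has exactly batch_size samples."""
    n = len(X)
    indices = np.arange(n)
    if shuffle:
        np.random.shuffle(indices)
    cur = 0
    while True:
        batch = indices[cur:cur + batch_size]
        cur += batch_size
        if len(batch) < batch_size:          # reached the end of an epoch
            if shuffle:
                np.random.shuffle(indices)
            k = batch_size - len(batch)      # pad with the first k indices
            batch = np.concatenate([batch, indices[:k]])
            cur = k
        yield X[batch], y[batch]
```

Hypothetical usage with a stateful model (argument names follow the old Keras 1 `fit_generator` signature):

```python
# gen = batch_generator(X_train, y_train, batch_size=32)
# model.fit_generator(gen, samples_per_epoch=len(X_train), nb_epoch=10)
```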
