I am trying to run a CNN similar to the "VGG-like convnet" in the Keras documentation, but for a custom set of images and binary classification instead of the 10-class output.
When I try to fit the CNN, I get this long error, which I believe is telling me that my input images are not the right shape for the CNN's input.
    ValueError: GpuDnnConv images and kernel must have the same stack size
    Apply node that caused the error: GpuDnnConv{algo='small', inplace=True}(GpuContiguous.0, GpuContiguous.0, GpuAllocEmpty.0, GpuDnnConvDesc{border_mode='valid', subsample=(1, 1), conv_mode='conv', precision='float32'}.0, Constant{1.0}, Constant{0.0})
    Toposort index: 130
    Inputs types: [CudaNdarrayType(float32, 4D), CudaNdarrayType(float32, 4D), CudaNdarrayType(float32, 4D), <theano.gof.type.CDataType object at 0x7f0eefc8d790>, Scalar(float32), Scalar(float32)]
    Inputs shapes: [(32, 232, 300, 3), (300, 1, 3, 3), (32, 300, 298, 1), 'No shapes', (), ()]
    Inputs strides: [(208800, 900, 3, 1), (9, 0, 3, 1), (89400, 298, 1, 0), 'No strides', (), ()]
    Inputs values: ['not shown', 'not shown', 'not shown', <PyCObject object at 0x7f0efaba8e68>, 1.0, 0.0]
    Inputs name: ('image', 'kernel', 'output', 'descriptor', 'alpha', 'beta')
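My reading of the shapes in that message (which may well be wrong) is that the batches reach the convolution still in channels-last order, while the first layer was declared with a channels-first input shape. This is just how I compare the two shapes, for reference:

    import numpy as np

    # Batches seem to reach the GPU op in channels-last order:
    print(im_list.shape)   # (4000, 232, 300, 3) -> batches of (32, 232, 300, 3)
    # ...while the first conv layer is declared channels-first:
    print((3, 232, 300))   # the input_shape I pass to buildCNN below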
The thing is, I already resized all my images to fit. My input is a stack of 4000 RGB images at 232x300 pixels, and the output is an array of 4000 Boolean values.
Input:

    im_list.shape
    Out[49]: (4000, 232, 300, 3)

Output:

    np.asarray(cls).shape
    Out[50]: (4000,)
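If the mismatch really is the axis order, I assume something like the following would rearrange the data into the (channels, height, width) layout I declare in input_shape, and turn the Boolean labels into the two-column one-hot form that categorical_crossentropy with a 2-unit softmax (used in the model below) seems to expect. This is only a sketch of what I think might be needed, not a confirmed fix; the names im_list_cf and cls_cat are just mine:

    import numpy as np
    from keras.utils import np_utils

    # Move the channel axis to position 1:
    # (4000, 232, 300, 3) -> (4000, 3, 232, 300), matching input_shape=(3, 232, 300)
    im_list_cf = np.transpose(im_list, (0, 3, 1, 2))

    # Boolean labels (4000,) -> one-hot (4000, 2) for the 2-way softmax output
    cls_cat = np_utils.to_categorical(np.asarray(cls).astype('int32'), 2)

    print(im_list_cf.shape)  # (4000, 3, 232, 300)
    print(cls_cat.shape)     # (4000, 2)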
This is the function I use to build the CNN, and how I call it:
    from keras.models import Sequential
    from keras.layers import Convolution2D, MaxPooling2D, Activation, Dropout, Flatten, Dense
    from keras.optimizers import SGD


    def buildCNN(depth, width, height, outputShape):
        CNN = Sequential()
        # input: 232x300 images with 3 channels -> (3, 232, 300) tensors.
        # this applies 32 convolution filters of size 3x3 each.
        CNN.add(Convolution2D(32, 3, 3, border_mode='valid',
                              input_shape=(depth, width, height)))
        CNN.add(Activation('relu'))
        CNN.add(Convolution2D(32, 3, 3))
        CNN.add(Activation('relu'))
        CNN.add(MaxPooling2D(pool_size=(2, 2)))
        CNN.add(Dropout(0.25))

        CNN.add(Convolution2D(64, 3, 3, border_mode='valid'))
        CNN.add(Activation('relu'))
        CNN.add(Convolution2D(64, 3, 3))
        CNN.add(Activation('relu'))
        CNN.add(MaxPooling2D(pool_size=(2, 2)))
        CNN.add(Dropout(0.25))

        CNN.add(Flatten())
        # Note: Keras does automatic shape inference.
        CNN.add(Dense(256))
        CNN.add(Activation('relu'))
        CNN.add(Dropout(0.5))

        CNN.add(Dense(outputShape))
        CNN.add(Activation('softmax'))

        sgd = SGD(lr=0.1, decay=1e-6, momentum=0.9, nesterov=True)
        CNN.compile(loss='categorical_crossentropy', optimizer=sgd)

        return CNN


    CNN = buildCNN(3, 232, 300, 2)
    CNN.fit(im_list, cls, batch_size=32, nb_epoch=1)
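One thing I am not sure about is which layout the backend actually expects. As far as I understand, with Keras on Theano the image_dim_ordering setting decides between channels-first ('th', i.e. (channels, height, width)) and channels-last ('tf', i.e. (height, width, channels)). I assume a check like this would tell me which one my install uses, though I may be misreading how that setting interacts with input_shape:

    from keras import backend as K

    # 'th' -> model expects (channels, height, width); 'tf' -> (height, width, channels)
    print(K.image_dim_ordering())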
I've been hitting my head against the wall over this, and I thought maybe someone else has run into the same problem. Any thoughts? Thanks in advance.