Neuroph vs Encog

I decided to use a feed-forward neural network trained with backpropagation for my handwriting OCR application. The input layer will consist of 32 * 32 (1024) neurons, and the output layer of at least 8-12 neurons.

From the articles I have read, Neuroph seems easy to use, while Encog is reported to be several times faster. Given the parameters of my scenario, which API is more appropriate? I would also appreciate a comment on the number of input nodes I chose — is that value too high? (Although this is somewhat off topic.)
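For concreteness, here is a minimal sketch (not Encog or Neuroph API; the class and method names are illustrative) of how a 32x32 glyph bitmap maps onto the 1024 input values described above, with a one-hot target vector per character class. The class count of 10 is a placeholder standing in for the 8-12 outputs mentioned in the question.

```java
public class OcrEncoding {
    // Flatten a 32x32 binary glyph into a 1024-element input vector.
    static double[] flatten(boolean[][] bitmap) {
        double[] input = new double[32 * 32];
        for (int row = 0; row < 32; row++)
            for (int col = 0; col < 32; col++)
                input[row * 32 + col] = bitmap[row][col] ? 1.0 : 0.0;
        return input;
    }

    // One output neuron per character class; the target is 1.0 for the
    // correct class and 0.0 elsewhere (one-hot encoding).
    static double[] oneHot(int classIndex, int numClasses) {
        double[] target = new double[numClasses];
        target[classIndex] = 1.0;
        return target;
    }

    public static void main(String[] args) {
        boolean[][] glyph = new boolean[32][32];
        glyph[0][1] = true; // a single "ink" pixel for demonstration
        double[] input = flatten(glyph);
        double[] target = oneHot(3, 10); // hypothetical class index 3 of 10
        System.out.println(input.length + " inputs, " + target.length + " outputs");
    }
}
```

With this encoding, 1024 input nodes is simply the full pixel resolution; both libraries accept input vectors of this size directly.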

1 answer

First, my disclaimer: I am one of the main developers of the Encog project. This means I am more familiar with Encog than with Neuroph, and I am perhaps biased toward it. In my opinion, the relative strengths of each are as follows. Encog supports many interchangeable machine learning methods and training algorithms. Neuroph is VERY focused on neural networks, and it lets you define connections between just about anything. So if you are going to create very unconventional (research) neural networks, with topologies different from the typical Elman/Jordan, NEAT, HyperNEAT, and feedforward networks, then Neuroph will suit you well.
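To make concrete what either library automates for a feedforward network trained with backpropagation, here is a rough, self-contained sketch of the underlying algorithm. This is NOT Encog's or Neuroph's API — every name here is invented for illustration — just the core forward pass and gradient update, shown on a tiny sigmoid network.

```java
import java.util.Random;

// Illustrative one-hidden-layer sigmoid network with plain online
// backpropagation. In practice you would let Encog or Neuroph do this.
public class TinyBackprop {
    final int nIn, nHid;
    final double[][] w1; final double[] b1; // input -> hidden weights/biases
    final double[] w2; double b2;           // hidden -> output weights/bias
    final Random rnd = new Random(42);      // fixed seed for reproducibility

    TinyBackprop(int nIn, int nHid) {
        this.nIn = nIn; this.nHid = nHid;
        w1 = new double[nHid][nIn]; b1 = new double[nHid]; w2 = new double[nHid];
        for (int j = 0; j < nHid; j++) {
            w2[j] = rnd.nextDouble() - 0.5;
            for (int i = 0; i < nIn; i++) w1[j][i] = rnd.nextDouble() - 0.5;
        }
    }

    static double sigmoid(double x) { return 1.0 / (1.0 + Math.exp(-x)); }

    double[] hidden(double[] x) {
        double[] h = new double[nHid];
        for (int j = 0; j < nHid; j++) {
            double s = b1[j];
            for (int i = 0; i < nIn; i++) s += w1[j][i] * x[i];
            h[j] = sigmoid(s);
        }
        return h;
    }

    double predict(double[] x) {
        double[] h = hidden(x);
        double s = b2;
        for (int j = 0; j < nHid; j++) s += w2[j] * h[j];
        return sigmoid(s);
    }

    // One stochastic gradient step; returns the squared error before the update.
    double train(double[] x, double y, double lr) {
        double[] h = hidden(x);
        double s = b2;
        for (int j = 0; j < nHid; j++) s += w2[j] * h[j];
        double out = sigmoid(s);
        double err = y - out;
        double dOut = err * out * (1 - out);                 // output delta
        for (int j = 0; j < nHid; j++) {
            double dHid = dOut * w2[j] * h[j] * (1 - h[j]);  // hidden delta
            w2[j] += lr * dOut * h[j];                       // update after use
            b1[j] += lr * dHid;
            for (int i = 0; i < nIn; i++) w1[j][i] += lr * dHid * x[i];
        }
        b2 += lr * dOut;
        return err * err;
    }
}
```

For the OCR case in the question, the same structure would simply use 1024 inputs and 8-12 outputs; both libraries also offer faster trainers (e.g. resilient propagation in Encog) on top of this basic scheme.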

