I am trying to create a simple neural network in TensorFlow that learns a simple relationship between inputs and outputs (e.g. y = -x), where the inputs and outputs are floating-point values (which means softmax is not used on the output).
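To make the goal concrete, this is roughly the kind of minimal setup I have in mind, using the low-level graph/session API: a single linear unit (no softmax) trained with mean squared error. The data, layer size, learning rate, and step count here are just placeholders for illustration, not my actual code.

    import numpy as np
    import tensorflow as tf

    # Toy training data for y = -x, as columns of floats.
    x_train = np.linspace(-1.0, 1.0, 101).reshape(-1, 1).astype(np.float32)
    y_train = -x_train

    # One input, one output; the output is a raw float, not a class probability.
    x = tf.placeholder(tf.float32, [None, 1])
    y_true = tf.placeholder(tf.float32, [None, 1])

    W = tf.Variable(tf.random_normal([1, 1], stddev=0.1))
    b = tf.Variable(tf.zeros([1]))
    y_pred = tf.matmul(x, W) + b  # linear output, no softmax

    # Mean squared error instead of cross-entropy on softmax probabilities.
    loss = tf.reduce_mean(tf.square(y_pred - y_true))
    train_step = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for _ in range(500):
            sess.run(train_step, feed_dict={x: x_train, y_true: y_train})
        # For y = -x, the weight should approach -1 and the bias 0.
        print(sess.run([W, b]))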
I feel like this should be pretty easy to do, but I must be messing something up somewhere. Are there any tutorials or examples that do something like this? I looked through the existing TensorFlow tutorials and didn't see anything like this, and I looked at several other sources of TensorFlow examples I found while searching Google, but didn't see what I was looking for.
Here is a stripped-down version of what I tried. In this particular version, I noticed that my weights and biases always seem to end up at zero. Perhaps this is due to having only a single input and a single output?
I have had good luck modifying the MNIST example for various other purposes, but everything I have gotten to work successfully uses softmax on the output for classification. If I can figure out how to generate floating-point output from my neural network, there are some interesting projects I would like to do with it.
Does anyone see what I'm missing? Thanks in advance! - J.
tensorflow
jrjbertram