How do I implement this neural network cost function in MATLAB?

Here is what the symbols represent:
% m is the number of training examples. [a scalar number]
% K is the number of output nodes. [a scalar number]
% Y is the matrix of training outputs. [an m by K matrix]
% y^{(i)}_{k} is the ith training output (target) for the kth output node. [a scalar number]
% x^{(i)} is the ith training input. [a column vector for all of the input nodes]
% h_{\theta}(x^{(i)})_{k} is the value of the hypothesis at output k, with weights theta, for training input i. [a scalar number]
% note: h_{\theta}(x^{(i)}) will be a column vector with K rows.
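For reference, the cost function I am trying to implement (transcribing the equation image as best I can, without a regularization term) is:

```latex
J(\theta) = -\frac{1}{m} \sum_{i=1}^{m} \sum_{k=1}^{K}
\left[ y^{(i)}_{k} \log\!\left( h_\theta(x^{(i)})_{k} \right)
     + \left( 1 - y^{(i)}_{k} \right) \log\!\left( 1 - h_\theta(x^{(i)})_{k} \right) \right]
```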
I am having trouble with the nested sums, the offset (bias) nodes, and the overall complexity of this equation. I also struggle because there are two weight matrices: one connecting the inputs to the hidden layer, and another connecting the hidden layer to the outputs. Here is my attempt so far.
Define Variables
m = 100 %number of training examples
K = 2   %number of output nodes
E = 2   %number of input nodes
A = 2   %number of nodes in each hidden layer
L = 1   %number of hidden layers
Y = [2.2, 3.5 %targets for y1 and y2 (see picture at bottom of page)
     1.7, 2.1
     1.9, 3.6
     . .       %this is filled out in the actual code, but to save space I have used ellipsis. there will be m rows.
     . .
     . .
     2.8, 1.6]
X = [1.1, 1.8 %training inputs. there will be m rows
     8.5, 1.0
     9.5, 1.8
     . .
     . .
     . .
     1.4, 0.8]
W1 = [1.3, . . 0.4 %this is just an E by A matrix of random numbers. this is the matrix of initial weights.
      . . . -2
      . . . 3.1
      . . . -1
      2.1, -8, 1.2, 2.1]
W2 = [1.3, . . 0.4 %this is an A by K matrix of random numbers. this is the matrix of initial weights.
      . . . -2
      . . . 3.1
      . . . -1
      2.1, -8, 1.2, 2.1]
The hypothesis using these weights is ...
Htheta = sigmf( dot(W2 , sigmf(dot(W1 , X))) ) %This will be a column vector with K rows.
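I suspect the forward pass needs the offset units appended and ordinary matrix multiplication rather than `dot` (and `sigmf` is from the Fuzzy Logic Toolbox and needs a parameter vector anyway). Here is a vectorized sketch of what I mean; the names `Theta1`/`Theta2` and their `A`-by-`(E+1)` / `K`-by-`(A+1)` shapes are my assumption, not something from the code above:

```matlab
% Sketch of a vectorized forward pass over all m examples at once.
% X is m-by-E with one training example per row. Theta1 (A-by-(E+1))
% and Theta2 (K-by-(A+1)) include an extra column for the offset units.
sigmoid = @(z) 1 ./ (1 + exp(-z));  % plain logistic, no toolbox needed

A1 = [ones(m, 1), X];               % prepend the offset unit to the inputs
Z2 = A1 * Theta1';                  % m-by-A pre-activations of the hidden layer
A2 = [ones(m, 1), sigmoid(Z2)];     % prepend the offset unit to the hidden layer
Htheta = sigmoid(A2 * Theta2');     % m-by-K: row i holds h_theta(x^(i))
```

This way row i of `Htheta` is the K-element hypothesis for training example i, rather than a single column vector.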
The cost function using these weights is ... (this is the part I am stuck on)
sum1 = 0
for i = 1:K
    sum1 = sum1 + Y(k,i)*log(Htheta(k)) + (1 - Y(k,i))*log(1 - Htheta(k))
end
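Alternatively, I wonder whether both sums could be done at once element-wise, assuming `Htheta` is an m-by-K matrix with one row of hypotheses per training example (a sketch, not tested):

```matlab
% Y .* log(Htheta) pairs Y(i,k) with Htheta(i,k) element-wise, and
% sum(sum(...)) collapses both the sum over examples i and over outputs k.
J = -(1/m) * sum(sum( Y .* log(Htheta) + (1 - Y) .* log(1 - Htheta) ));
```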
I just keep writing things like this, and then I realize that it is all wrong. I cannot for the life of me figure out how to handle the nested sums, or incorporate the input matrix, or do any of this. It is all very complicated.
How can I implement this equation in MATLAB?
Thank you very much!
Two-layer neural network with 2 inputs, 2 outputs, 2 hidden nodes and 2 offset units http://imagizer.imageshack.us/v2/320x240q90/40/92bn.jpg
Note: the code has weird colors since Stack Overflow doesn't know that I'm programming in MATLAB. I also wrote the code directly into Stack Overflow, so it may have syntax errors. I'm more interested in the general idea of how I should do this than in code to copy and paste, which is also why I haven't bothered with semicolons and so on.