Backpropagation in multi-input/multi-output neural networks


That is a very good question. As you know, in a multi-input network such as a Siamese network, the two branches usually share the same structure and the same weights, and the output is defined by the Euclidean distance between the two embeddings. That distance basically measures how similar the two inputs are: if it is close to 0 there is more similarity, and if it is higher (around 1 or more) the inputs are dissimilar.
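To make the distance idea concrete, here is a minimal sketch in NumPy. The embedding values are made up for illustration; the point is just that identical embeddings give distance 0 and different ones give a larger value:

```python
import numpy as np

def euclidean_distance(a, b):
    """Euclidean distance between two embedding vectors."""
    return np.sqrt(np.sum((a - b) ** 2))

# Hypothetical embeddings produced by the two twin branches
emb_anchor  = np.array([0.1, 0.2, 0.3])
emb_same    = np.array([0.1, 0.2, 0.3])   # identical -> distance 0 (similar)
emb_far     = np.array([0.9, -0.5, 1.2])  # different -> larger distance

print(euclidean_distance(emb_anchor, emb_same))  # 0.0 -> very similar
print(euclidean_distance(emb_anchor, emb_far))   # larger -> dissimilar
```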

Even in the Siamese network, if you have watched this week's course video, you will notice that first a base network is built, and then a model is trained with two inputs and one output, which is passed through a custom loss function.
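Here is a rough sketch of that "one base network, two inputs, one output" structure, using plain NumPy instead of a framework. The single-dense-layer base network is an assumption for illustration; the key point is that both inputs go through the SAME weights:

```python
import numpy as np

rng = np.random.default_rng(0)

# One shared weight matrix -- both inputs pass through these same weights
W = rng.standard_normal((4, 3))   # 4 input features -> 3-dim embedding
b = np.zeros(3)

def base_network(x):
    """Shared base network: a single dense layer with ReLU (illustrative)."""
    return np.maximum(0, x @ W + b)

def siamese_forward(x1, x2):
    """Two inputs, one output: the Euclidean distance between embeddings."""
    e1, e2 = base_network(x1), base_network(x2)
    return np.sqrt(np.sum((e1 - e2) ** 2))

x = rng.standard_normal(4)
print(siamese_forward(x, x))  # identical inputs -> distance 0.0
```

During training, this single distance output is what the custom loss (e.g. contrastive loss) is applied to.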

So in the case of multiple inputs or outputs, when there are multiple weights, a weight matrix is used. The weights connect the input nodes to the hidden nodes, i.e. they sit between the input and hidden layers, and the weight matrix is built from those weight values. The next step is to multiply this matrix by the input vector, which produces the output of the layer.
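That matrix-times-vector step can be sketched like this (the weight values are made up for illustration):

```python
import numpy as np

# 3 input nodes fully connected to 2 hidden nodes:
# W[i, j] is the weight from input node j to hidden node i.
W = np.array([[0.2, -0.5, 0.1],
              [0.7,  0.3, -0.4]])
x = np.array([1.0, 2.0, 3.0])   # input vector

hidden = W @ x                   # matrix-vector product -> hidden activations
print(hidden)                    # one value per hidden node
```

Each row of `W` holds the weights feeding one hidden node, so one matrix multiplication computes all the hidden-layer values at once.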

I am attaching some pictures which are self-explanatory; if anything is still unclear, feel free to ask.
