Backpropagation in multi-input/multi-output neural networks

Hi everyone
I have a question about how exactly the weights are updated in a neural network that has multiple inputs or multiple outputs.
Taking the multi-input Siamese network from the lectures, how do the weights get updated in the two branches?
Taking the multi-output wine quality and type network from the assignment, does backprop occur twice: once for the quality and a second time for the classification?

Hello @ainewbie, if you search this part of the forum (or the entire forum), you will find this question has been asked more than once.

Thanks for the reply. I’ll look into it



That is a very good question. As you know, in a multi-input network such as a Siamese network, the two branches usually share the same structure and the same weights, and the output is defined by the Euclidean distance between the two branch outputs. That distance basically measures the amount of similarity between the two inputs: if it is closer to 0, there is more similarity, and if it is larger (closer to 1 or more), there is dissimilarity.

In the Siamese network, if you have seen this week's videos, you will notice that first a base network is built, and then a model is trained with two inputs and one output, which is passed through a custom loss function.
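To make the weight-sharing concrete, here is a minimal NumPy sketch (not from the course, and with made-up shapes) of what backprop does with a shared branch: because both inputs pass through the *same* weight matrix, there is only one gradient for it, formed by adding the contribution that flows back through each branch.

```python
import numpy as np

# Hypothetical tiny "Siamese" setup: two linear branches sharing ONE weight
# matrix W, with a squared-Euclidean-distance loss between their outputs.
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 2))        # shared weights, used by BOTH branches
xa = rng.normal(size=2)            # input to branch A
xb = rng.normal(size=2)            # input to branch B

# Forward pass: each branch applies the same W
ha, hb = W @ xa, W @ xb
diff = ha - hb
loss = 0.5 * np.sum(diff ** 2)     # squared Euclidean distance

# Backward pass: W is used twice, so its gradient is the SUM of the
# contributions coming back through each branch -- not two separate updates.
grad_a = np.outer(diff, xa)        # contribution from branch A
grad_b = np.outer(-diff, xb)       # contribution from branch B
grad_W = grad_a + grad_b           # single accumulated gradient for W

# Numerical check on one entry to confirm the accumulated gradient is right
eps = 1e-6
W2 = W.copy(); W2[0, 0] += eps
loss2 = 0.5 * np.sum((W2 @ xa - W2 @ xb) ** 2)
assert abs((loss2 - loss) / eps - grad_W[0, 0]) < 1e-4
```

Frameworks like Keras do exactly this accumulation automatically when a layer object is reused on two inputs, which is why the two branches never drift apart during training.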

So in the case of multiple inputs or outputs, when there are multiple weights, they form a weight matrix. The weights connect the input and the hidden nodes, i.e. they sit between the input and hidden layers, and the matrix holds the values of those weights. In the forward pass, this matrix is multiplied by the input vector to produce the output; in the backward pass, the gradient of the combined loss is computed for every entry of the matrix in a single pass, so all the weights are updated together.
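For the multi-output case, here is a minimal NumPy sketch (again with made-up sizes, loosely modelled on the wine quality/type assignment). The key point for the original question: backprop does not run twice. The two losses are combined into one total loss, one backward pass runs, and the shared weights receive the sum of the gradients coming from both heads.

```python
import numpy as np

# Hypothetical two-headed network: a shared hidden layer, a regression head
# ("quality") and a binary classification head ("type").
rng = np.random.default_rng(1)
x = rng.normal(size=4)                       # one input example
W_h = rng.normal(size=(3, 4))                # SHARED hidden-layer weights
w_q = rng.normal(size=3)                     # regression head weights
w_t = rng.normal(size=3)                     # classification head weights

z = W_h @ x
h = np.maximum(z, 0.0)                       # hidden activations (ReLU)
q_pred = w_q @ h                             # quality prediction
t_prob = 1.0 / (1.0 + np.exp(-(w_t @ h)))   # type probability (sigmoid)

q_true, t_true = 5.0, 1.0
# ONE combined loss: MSE for quality + cross-entropy for type
loss = 0.5 * (q_pred - q_true) ** 2 \
       - (t_true * np.log(t_prob) + (1 - t_true) * np.log(1 - t_prob))

# ONE backward pass: each head sends its gradient into the shared layer,
# and the contributions simply add up.
dq = q_pred - q_true                         # dL/d(q_pred)
dt = t_prob - t_true                         # dL/d(logit) for sigmoid + CE
dh = dq * w_q + dt * w_t                     # gradients from BOTH heads summed
dW_h = np.outer(dh * (z > 0), x)             # single update for shared weights
```

This mirrors what Keras does when you compile a functional model with two losses: it minimizes their (optionally weighted) sum with a single backward pass per batch.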

I am attaching some pictures which are self-explanatory; if anything is still unclear, you can ask.



Thank you for the reply :grinning: