How do model weights get updated when there are 2 output layers and 2 loss functions?

I am taking the course Custom Models, Layers, and Loss Functions with TensorFlow. In Week 1, we are creating a multi-output NN (2 output layers) with 2 loss functions: the first hidden layers share weights, then the network splits into 2 branches, each with its own output layer. How does this architecture update the weights when the gradients backpropagate from the different output layers into the same shared hidden layer?
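For reference, here is a minimal sketch of that kind of architecture using the Keras functional API. The layer sizes and names (`shared_hidden`, `out_a`, `out_b`) are made up for illustration, not taken from the course:

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

# A shared trunk that splits into two branches, each with its own
# output layer and its own loss (hypothetical shapes/names).
inputs = tf.keras.Input(shape=(8,))
shared = layers.Dense(16, activation="relu", name="shared_hidden")(inputs)

branch_a = layers.Dense(8, activation="relu")(shared)
out_a = layers.Dense(1, name="out_a")(branch_a)

branch_b = layers.Dense(8, activation="relu")(shared)
out_b = layers.Dense(1, name="out_b")(branch_b)

model = Model(inputs=inputs, outputs=[out_a, out_b])
# One loss per output; Keras optimizes their (optionally weighted) sum.
model.compile(optimizer="adam", loss={"out_a": "mse", "out_b": "mae"})
```

When you call `fit`, Keras combines the per-output losses into a single scalar (you can weight them with `loss_weights`) and backpropagates that total through the whole graph.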


Interesting perspective. Mathematically, up to the split point there is one set of shared weights; after that, each branch has its own weights relevant to what it outputs. During backpropagation, each branch's loss contributes a gradient to the shared weights, and those gradients are summed, so the shared layer settles on a compromise that satisfies both tasks as well as possible.
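You can verify the "gradients are summed" part directly with `tf.GradientTape`: the gradient of the combined loss with respect to the shared kernel equals the sum of the two per-branch gradients. A small sketch (layer sizes and names are arbitrary):

```python
import numpy as np
import tensorflow as tf

tf.random.set_seed(0)

# Tiny shared-trunk model with two heads (illustrative sizes).
inp = tf.keras.Input(shape=(4,))
shared = tf.keras.layers.Dense(3, name="shared")(inp)
out_a = tf.keras.layers.Dense(1, name="head_a")(shared)
out_b = tf.keras.layers.Dense(1, name="head_b")(shared)
model = tf.keras.Model(inp, [out_a, out_b])

x = tf.constant(np.random.rand(2, 4), dtype=tf.float32)
ya = tf.zeros((2, 1))
yb = tf.ones((2, 1))
w_shared = model.get_layer("shared").kernel

with tf.GradientTape(persistent=True) as tape:
    pa, pb = model(x)
    loss_a = tf.reduce_mean((pa - ya) ** 2)
    loss_b = tf.reduce_mean((pb - yb) ** 2)
    total = loss_a + loss_b

# Gradient of the total loss on the shared kernel...
g_total = tape.gradient(total, w_shared)
# ...equals the sum of the gradients from each branch's loss.
g_sum = tape.gradient(loss_a, w_shared) + tape.gradient(loss_b, w_shared)

print(np.allclose(g_total.numpy(), g_sum.numpy()))
```

This is just linearity of differentiation: d(L_a + L_b)/dW = dL_a/dW + dL_b/dW, which is exactly what the optimizer applies to the shared weights.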


Very interesting question. I had the same doubt too.


Not super-detailed, but there are some breadcrumbs midway through this Keras developer guide: see the section titled "Models with multiple inputs and outputs".
