C2W3 transfer learning

What do w5 and b5 refer to? Since it's an output layer, why add a w and b?

Yes, it wouldn't make sense for the output itself, but this model might have more layers than just 4, hence the w5 and b5…


No, I just understood it now! There must be a w and b because they are applied to the output of layer 4; otherwise adding this layer would have no meaning.

Right. The output itself is normally just an activation, a single value; the output doesn't have weights and biases of its own — those belong to the layer that produces it…
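To make the distinction concrete, here is a minimal numpy sketch (all shapes are hypothetical, and the names `W5`, `b5`, `a4` just follow the notation in this thread): the output *layer* owns `W5` and `b5`, which it applies to layer 4's activations, while the network's *output* is only the resulting activation value.

```python
import numpy as np

np.random.seed(0)
a4 = np.random.randn(6, 1)   # activations coming out of layer 4 (assumed 6 units)
W5 = np.random.randn(1, 6)   # weights of the output layer (layer 5)
b5 = np.zeros((1, 1))        # bias of the output layer

z5 = W5 @ a4 + b5            # the output layer DOES use weights and a bias
a5 = 1 / (1 + np.exp(-z5))   # sigmoid activation: this value is the network's output

print(a5.shape)              # a single activation value, shape (1, 1)
```

So "the output has no weights" means `a5` is just a number; the parameters `W5` and `b5` sit in the layer that computes it.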
