How did we expect layer 2 to have 4 parameters (3 W parameters and 1 b)? I would expect it to have only 2 W parameters, since the input has shape (2,).
Thank you for answering; however, I am still confused about why layer 2's parameter count is 3x1 + 1. I thought it would be the same as layer 1, 2x1 + 1, because I assumed layer 1 outputs a matrix of shape (2,1)?
See, layer 1 has 3 units, so it outputs a matrix of size (3,1). Layer 2 has 1 unit, so to match the dimension of its input it needs 3 weight parameters and 1 bias term. Got it?
Ohh, I understand, I forgot that layer 1 outputs a matrix of size (3,1). Does that mean that if, for example, layer 2 had 2 units, it would be 3x2 + 2?
yes,yes!!
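The arithmetic in this thread can be checked with a few lines of Python. Note the helper name `dense_params` is just for illustration here, not part of any framework:

```python
# A dense layer with n_in inputs and `units` outputs has a weight
# matrix of shape (n_in, units) plus one bias per unit.
def dense_params(n_in: int, units: int) -> int:
    return n_in * units + units  # weights + biases

# Input shape (2,), layer 1 with 3 units, layer 2 with 1 unit:
print(dense_params(2, 3))  # layer 1: 2*3 + 3 = 9
print(dense_params(3, 1))  # layer 2: 3*1 + 1 = 4

# The hypothetical layer 2 with 2 units instead:
print(dense_params(3, 2))  # 3*2 + 2 = 8
```

The same numbers appear in the "Param #" column of `model.summary()` if you build the corresponding model in Keras.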
If your doubt is resolved, feel free to mark my answer as the Solution to close the thread. Let me know if there is anything else I can help you with.
My sincere gratitude and thanks for helping me learn ML.
@tarunsaxena1000, note that marking a reply as a Solution does not close the thread.