My representation graph of the NN implementation, and a question about a variable

Hello everyone, I have made a representation graph of the NN implementation, based on Andrew's "Forward and Backward Functions" drawing in Week 4's "Building Blocks of Deep Neural Networks" video.

This graph uses different colors to show how the variables flow within and between layers. I believe it closely matches the structure of Week 4's assignment, including the "activation cache" and "linear cache" used in our code.

It would be great if someone could help me verify whether there are any mistakes in my representation. Also, while doing the assignment I noticed that the "b" variable does not actually seem to be used when computing "dW" and "db" in the "linear_backward" function, so I did not include "b" in the "L-cache" in my graph. Can someone confirm this for me, please?
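To make the question concrete, here is a minimal numpy sketch of what `linear_backward` computes, with variable names following the assignment's convention (treat it as an illustration, not the graded code). As you can see, `b` is unpacked from the cache but never appears on the right-hand side of any formula:

```python
import numpy as np

def linear_backward(dZ, cache):
    # cache = (A_prev, W, b) as stored in the linear cache;
    # note that b is unpacked here but never used below.
    A_prev, W, b = cache
    m = A_prev.shape[1]  # number of examples

    dW = (1 / m) * dZ @ A_prev.T                       # uses A_prev, not b
    db = (1 / m) * np.sum(dZ, axis=1, keepdims=True)   # uses only dZ
    dA_prev = W.T @ dZ                                 # uses W, not b
    return dA_prev, dW, db
```

So replacing `b` in the cache with anything else would leave all three gradients unchanged, which is why dropping it from the graph's "L-cache" seems harmless.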

If there are no mistakes in my graph, I hope it can be helpful to you all, especially for doing the assignment and building intuition for the code. Since our assignments mainly involve filling in missing functions rather than developing the whole structure from scratch, I often found myself lacking a clear structural view even after finishing the assignment, but drawing a graph like this definitely helped clear my mind!


Very nice graph. I don’t see any errors.


Yes, as John says, very nice! A work of art, you could even say. It is a good point that the b value in the linear cache is not really used in back propagation, so it could be eliminated with no harm. My guess is that they wrote the code that way just so they didn't have to explain why they left b out.

The only very minor error I see is that in the Update^{[1]} block lower left, the superscripts on W and b in the update formulas look more like L than 1 to me, but maybe it’s more a problem with my glasses than your presentation. :nerd_face:
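For anyone cross-checking those superscripts: in the update step, each layer l gets its own rule, W^[l] := W^[l] - alpha * dW^[l], so the lower-left block should indeed be layer 1, not layer L. A minimal sketch of that loop, using the assignment's parameter-naming convention ("W1", "b1", ..., as an assumption about the dictionary keys, not the exact graded code):

```python
import numpy as np

def update_parameters(parameters, grads, learning_rate):
    # parameters holds W1, b1, ..., WL, bL; the superscript [l] in the
    # graph is the layer index, so every layer is updated independently.
    L = len(parameters) // 2  # number of layers
    for l in range(1, L + 1):
        parameters["W" + str(l)] -= learning_rate * grads["dW" + str(l)]
        parameters["b" + str(l)] -= learning_rate * grads["db" + str(l)]
    return parameters
```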


Thanks! Good to know the representation is correct, and thanks for spotting that minor mistake with such sharp eyes! :flushed: