Feedforward Neural Networks in Depth

Summation means summing all the terms. You highlighted the addition sign (+) in the second figure. All terms are added, starting from i = 1 up to i = m.
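In symbols, writing t_i for whatever the i-th highlighted term is (the concrete terms depend on the equation in that figure), the summation simply expands to:

```latex
\[
\sum_{i=1}^{m} t_i = t_1 + t_2 + \cdots + t_m
\]
```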

Dear Mr Saif,

May I know why we should add all the terms together in this case?

Thank you.

First, give us the background. What is the relationship between u_{i} and g_{i}(x_{1},...,x_{j},...,x_{n})? Addition? And between y_{k} and f_{k}(u_{1},...,u_{i},...,u_{m})?

Thank you for sharing. Your notes were very organized and useful. As an MSc student I understood most of it. :slight_smile:
For the backward propagation equations, however, I prefer writing the chain rule out in full, all the way from the cost function to the target. For example, instead of writing 4 chain rules of length 1, I find it simpler to write a single chain rule of length 4. It's basically the same thing, but I find the second formulation easier to understand.
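As a rough sketch of what I mean, using generic notation (J for the cost, A^{[L]}, Z^{[L]}, W^{[L]} for the last layer's quantities; the exact factors in the articles' derivation may differ):

```latex
% Composed style: short chain rules, combined afterwards
\[
\frac{\partial J}{\partial Z^{[L]}}
  = \frac{\partial J}{\partial A^{[L]}}
    \frac{\partial A^{[L]}}{\partial Z^{[L]}},
\qquad
\frac{\partial J}{\partial W^{[L]}}
  = \frac{\partial J}{\partial Z^{[L]}}
    \frac{\partial Z^{[L]}}{\partial W^{[L]}}
\]

% Single-line style: one chain rule written out from the cost to the weights
\[
\frac{\partial J}{\partial W^{[L]}}
  = \frac{\partial J}{\partial A^{[L]}}
    \frac{\partial A^{[L]}}{\partial Z^{[L]}}
    \frac{\partial Z^{[L]}}{\partial W^{[L]}}
\]
```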
Just trying to throw a suggestion out there. Thank you again for the amazing notes.

You must calculate dZ/dW from eq. (3), Z^[l] = W^[l]A^[l-1] + B^[l], where dZ/dW = A^[l-1].
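A minimal NumPy sketch of how that fact is typically used in the weight-gradient step, assuming activations are stored column-wise with m training examples and averaging over them (the variable names here are illustrative, not taken from the articles):

```python
import numpy as np

# Illustrative shapes: layer l has 4 units, layer l-1 has 3 units, m = 5 examples.
m = 5
A_prev = np.random.randn(3, m)   # A^[l-1], activations of the previous layer
dZ = np.random.randn(4, m)       # dJ/dZ^[l], gradient w.r.t. the pre-activations

# Since Z^[l] = W^[l] A^[l-1] + B^[l], the gradient of Z with respect to W is A^[l-1],
# so the cost gradient for the weights is dZ times A^[l-1] transposed,
# averaged over the m examples.
dW = (1.0 / m) * (dZ @ A_prev.T)                 # shape (4, 3), same as W^[l]
dB = (1.0 / m) * dZ.sum(axis=1, keepdims=True)   # shape (4, 1), same as B^[l]

print(dW.shape, dB.shape)
```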

I am facing a problem with the articles.
The mathematical equations appear like this:

Hi, could you clarify the notation for f in (9) in part 1? I was expecting f to be f: R^{n^L x m} x R^{n^L x m} → R, but it is f: R^{2n^L} → R.

Any idea why the MathML embedded in the neural net SVGs here would fail to render? I get the same issue in Chrome, Firefox, and MS Edge.

I was confused by this when I first saw it as well. I found this article that might be helpful:

(see the section titled "The Generalized Chain Rule")
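If the summation asked about earlier in the thread is this generalized chain rule over the intermediate variables u_i, then with the notation used above (y_k = f_k(u_1, ..., u_m) and u_i = g_i(x_1, ..., x_n)) it reads:

```latex
\[
\frac{\partial y_k}{\partial x_j}
  = \sum_{i=1}^{m}
    \frac{\partial y_k}{\partial u_i}\,
    \frac{\partial u_i}{\partial x_j}
\]
```

The derivative with respect to one input adds up the contribution of every path through the intermediate variables, which is why the terms from i = 1 to i = m are summed.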

Thanks for the wonderful work.

Thank you for sharing, @peppy :smiley:

Thanks for the derivation. I initially thought, after reading the part 1 article, that equation 17 was overkill, given that for any layer l, any unit j in that layer, and any training example i, the activation a_j^[l](i) would be a function of only z_j^[l](i) and not of any other a_p^[l](i). However, after reading the part 2 article, and in particular the softmax function, I understood your motivation behind equation 17. Thanks much!
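As a small NumPy sketch of that point (the sigmoid example and variable names are my own, not from the articles): for an elementwise activation the Jacobian of a^[l] with respect to z^[l] is diagonal, while for softmax every a_j depends on every z_p, so the Jacobian is dense and the more general formulation is needed.

```python
import numpy as np

z = np.array([1.0, -0.5, 2.0])   # pre-activations z^[l] of one layer, one example

# Elementwise activation (sigmoid): a_j depends only on z_j,
# so the Jacobian da/dz is a diagonal matrix.
a_sig = 1.0 / (1.0 + np.exp(-z))
jac_sig = np.diag(a_sig * (1.0 - a_sig))

# Softmax: every a_j depends on every z_p,
# so the Jacobian has nonzero off-diagonal entries.
e = np.exp(z - z.max())          # shift for numerical stability
a_soft = e / e.sum()
jac_soft = np.diag(a_soft) - np.outer(a_soft, a_soft)

print(jac_sig)    # off-diagonal entries are all zero
print(jac_soft)   # off-diagonal entries are nonzero
```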

Coolest thing I have seen today

Hi,

I asked myself the same question and even sent a personal e-mail to @jonaslalin about it before I read this conversation.
Thanks for the explanation, @rmwkwok.

Apologies for cluttering his mailbox.

Francis

No, but you can save it as HTML. There are tools to convert HTML to PDF, but they are touchy.

Hope it helps:
