Error in general implementation of forward propagation


In the layer represented, x is the input and the layer has 3 units. So in the sequential function, a4 should be computed differently, as below, instead of what is in the image. (I have highlighted the parts I had doubts about in red, and also annotated them in red text.)

def sequential(x):
  a1 = dense(x, W1, b1)
  a2 = dense(x, W2, b2)
  a3 = dense(x, W3, b3)
  a4 = dense([a1,a2,a3], W4, b4)  # instead of just a3, since this should be the final layer
  f_x = a4
  return f_x

Hi @hsuyab, welcome to our community!

The graph in the top-left corner shows the graphical representation of one layer.
The sequential code on the right shows a code implementation of four layers.

The dense code in the middle is for one layer, so you may compare the top-left graph with the middle code.
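
For concreteness, here is a minimal sketch of what such a dense function might look like (this assumes NumPy and a sigmoid activation g; the exact code in the course material may differ):

import numpy as np

def g(z):
    # Sigmoid activation; assumed here for illustration, the course's g may differ.
    return 1 / (1 + np.exp(-z))

def dense(a_in, W, b):
    # One layer: column j of W holds the weights of neuron j,
    # so the layer has W.shape[1] units and a_out has one value per unit.
    units = W.shape[1]
    a_out = np.zeros(units)
    for j in range(units):
        z = np.dot(W[:, j], a_in) + b[j]
        a_out[j] = g(z)
    return a_out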

In the sequential code on the right, for each sample x, the first layer produces a1, which has as many values as there are neurons in that layer. The second layer takes in a1 and produces a2, again with as many values as there are neurons in that layer. The last layer takes in a3 and produces a4, where a4 also has as many values as there are neurons in the last layer.
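
To make the shapes concrete, here is a small check that reuses the dense sketch above (the layer sizes 3 and 5 are made up for illustration):

import numpy as np

x = np.array([1.0, 2.0, 3.0])   # one sample with 3 input features
W1 = np.random.randn(3, 5)      # 3 inputs in, 5 neurons in layer 1
b1 = np.random.randn(5)

a1 = dense(x, W1, b1)
print(a1.shape)                 # (5,) -- one value per neuron in layer 1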

Raymond

Hello @hsuyab

Looking at your comment, I can see where you are getting confused:

When we call:

a1 = dense(x, W1, b1) → If x is an input vector of 3 elements representing 3 input features, we don't need to call dense 3 times with x as the input, as you have done, to account for the 3 input features.

a1, a2, a3, and a4 stand for the outputs of the 4 layers we are creating to go from input to output. As @rmwkwok explained, the input x gets transformed to a1 in layer 1, a1 gets transformed to a2 in layer 2, and so on until we reach the output layer. The general process is that the output of one layer serves as the input of the next layer. This sequential process is represented as:

a1 = dense(x, W1, b1)
a2 = dense(a1, W2, b2)
a3 = dense(a2, W3, b3)
a4 = dense(a3, W4, b4)
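
As a quick sanity check, the whole chain can be run end to end. This reuses the dense sketch above; the weight shapes below are made up for illustration, and any sizes work as long as each layer's input size matches the previous layer's output size:

import numpy as np

# Illustrative shapes: 3 inputs -> 5 -> 4 -> 3 -> 1 output
W1, b1 = np.random.randn(3, 5), np.random.randn(5)
W2, b2 = np.random.randn(5, 4), np.random.randn(4)
W3, b3 = np.random.randn(4, 3), np.random.randn(3)
W4, b4 = np.random.randn(3, 1), np.random.randn(1)

def sequential(x):
    a1 = dense(x, W1, b1)   # layer 1: 3 values in, 5 values out
    a2 = dense(a1, W2, b2)  # layer 2: 5 in, 4 out
    a3 = dense(a2, W3, b3)  # layer 3: 4 in, 3 out
    a4 = dense(a3, W4, b4)  # layer 4: 3 in, 1 out
    return a4

f_x = sequential(np.array([1.0, 2.0, 3.0]))
print(f_x.shape)  # (1,)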

Thanks for the explanation! I got confused between layers and units. :smiley:

You are very welcome @hsuyab!
