About z and x: z = w.T * x + b


I was wondering if z and x have any other form of relationship apart from this equation?


That’s pretty much how you get z.
Then z becomes the layer output ‘a’ after applying an activation function.

Hello @mobaobao,

Besides Tom’s comment, it also depends on how you define w and x. In DLS’s neural network setting, we have z = Wx + b, whereas in the Machine Learning Specialization we have z = xW + b. The difference comes from how samples are arranged in X (stacked as columns or as rows) and how neurons are arranged in W (as rows or as columns).

Your z = w.T * x + b is likely from DLS’s logistic regression lectures? That form, too, comes from its own arrangement: there w is a single column vector of weights, so it has to be transposed before multiplying x.

Besides the three relations above, we could come up with even more by choosing different arrangements… Therefore, we need to read carefully how w and x are defined in each course.
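To see that these are really the same relation under different layouts, here is a minimal NumPy sketch. The dimension names and the random data are my own illustration, not from any course assignment; only the three formulas (Wx + b, xW + b, and w.T x + b) come from the discussion above.

```python
import numpy as np

rng = np.random.default_rng(0)
n_features, n_samples, n_neurons = 3, 4, 2

# DLS layer convention: samples are columns of X (n_features x n_samples),
# and each row of W is one neuron (n_neurons x n_features).
X_cols = rng.standard_normal((n_features, n_samples))
W = rng.standard_normal((n_neurons, n_features))
b = rng.standard_normal((n_neurons, 1))
Z_dls = W @ X_cols + b                 # shape (n_neurons, n_samples)

# MLS convention: samples are rows of X (n_samples x n_features),
# and each column of W.T is one neuron (n_features x n_neurons).
X_rows = X_cols.T
Z_mls = X_rows @ W.T + b.T             # shape (n_samples, n_neurons)

# Same numbers, just a transposed layout.
assert np.allclose(Z_dls, Z_mls.T)

# DLS logistic regression: a single neuron, with w as a column vector,
# which is why the transpose appears: z = w.T x + b.
w = W[0:1, :].T                        # shape (n_features, 1)
z_lr = w.T @ X_cols + b[0, 0]          # shape (1, n_samples)
assert np.allclose(z_lr, Z_dls[0:1, :])
```

So the transpose in w.T x + b is not a different relationship between z and x; it is the same linear map written for one particular choice of how the vectors and matrices are laid out.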