Why is w a vector in the activation layer function?

Why is w a vector here? Wouldn't the value of wj be the same for all elements of the vector a[l-1]?

No, it wouldn't.
Each input to an activation unit has its own weight, so w has one component wj for each element of a[l-1].
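A minimal NumPy sketch of a single unit may make this concrete (the specific numbers are made up for illustration): the unit takes the dot product of its weight vector with the previous layer's activations, so each input gets multiplied by a different weight.

```python
import numpy as np

# Illustrative values: one unit in layer l receiving a
# 3-element activation vector a[l-1] from the previous layer.
a_prev = np.array([0.5, -1.0, 2.0])   # activations from layer l-1
w = np.array([0.1, 0.4, -0.3])        # one weight PER input, not one shared value
b = 0.2                               # scalar bias for this unit

z = np.dot(w, a_prev) + b             # z = w . a[l-1] + b
a = 1 / (1 + np.exp(-z))              # sigmoid activation of this unit

print(z)                              # -> -0.75
```

If wj really were a single shared scalar, the dot product would collapse to `w * a_prev.sum() + b`, and the unit could not weight its inputs differently.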
