ReLU Optional Lab Course 2 Week 2

I agree with Tom, and therefore I think a2_0 is still of the form you wrote down, except that the weights are fixed to 1 and the bias to 0. Certainly, the use of a linear activation function makes a2_0 look as though there is no activation function at all.

Furthermore, with the weights fixed to ones, the graph on the left will simply be the sum of the graphs on the right. That makes it easier for us learners to relate these graphs using only simple maths.
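Here is a minimal sketch of that sum idea, using the two unit lines y = x - 1 and y = 2x - 4 mentioned later in this thread. Applying ReLU to the hidden units is an assumption based on the lab's topic; the lab's actual code may differ.

```python
import numpy as np

def relu(z):
    return np.maximum(0, z)

def a1(x):
    # Hidden-layer unit outputs; the two lines come from the follow-up post.
    # Applying ReLU here is an assumption based on the lab's topic.
    return np.array([relu(x - 1), relu(2 * x - 4)])

def a2_0(x):
    # Output unit: weights fixed to 1, bias 0, linear activation,
    # so a2_0 is just the sum of the hidden-unit outputs.
    w = np.array([1.0, 1.0])
    b = 0.0
    return np.dot(w, a1(x)) + b

for x in [0.0, 1.0, 2.0, 3.0, 4.0]:
    print(f"x = {x}: hidden = {a1(x)}, a2_0 = {a2_0(x)}")
```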

Cheers,
Raymond

Thanks! On that topic: I also finally realized, when looking at this again, that the outputs of the units cumulatively sum to the value shown on the left. Say x = 3: that gives the point (3, 2) for y = x - 1 and the point (3, 2) for y = 2x - 4, and the outputs sum to 4, which is what the graph on the left shows. It hadn’t occurred to me until looking at this again yesterday that the ‘stitched’ graph on the left isn’t necessarily just attaching the graphs of the neurons on the right; it’s based on the equation defined in a2_0.
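To spell out that check in plain Python (same two unit lines as in the thread; no framework needed):

```python
x = 3
unit_0 = x - 1       # y = x - 1  -> 2, i.e. the point (3, 2)
unit_1 = 2 * x - 4   # y = 2x - 4 -> 2, i.e. the point (3, 2)
total = unit_0 + unit_1
print(total)         # 4, matching the stitched graph on the left at x = 3
```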

Yes! That is a key to understanding this lab! Onwards!

Cheers,
Raymond
