Why is ReLU jagged?

This question is about the Multiclass Lab.
[image: model definition from the lab]
The model is created as shown in the image above. After training, the weights and biases are extracted from the first layer to plot the boundary created by each unit. The boundaries come out as follows:

[image: per-unit boundary plots]
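Roughly, the extraction and plotting look like this (a sketch, not the lab's exact code; the stand-in model below is untrained and its layer sizes are guesses):

```python
import numpy as np
import matplotlib.pyplot as plt
from tensorflow import keras

# Untrained stand-in for the lab's model (layer sizes are guesses);
# in the lab you would use the trained model instead.
model = keras.Sequential([
    keras.Input(shape=(2,)),
    keras.layers.Dense(3, activation="relu"),
    keras.layers.Dense(4, activation="linear"),
])

# For a Dense layer, get_weights() returns (kernel, bias);
# the kernel has shape (n_inputs, n_units).
W, b = model.layers[0].get_weights()

x1 = np.linspace(-2, 2, 100)
for unit in range(W.shape[1]):
    w1, w2 = W[:, unit]
    # Each unit's boundary is where its pre-activation is zero:
    # w1*x1 + w2*x2 + b = 0  =>  x2 = -(w1*x1 + b) / w2
    x2 = -(w1 * x1 + b[unit]) / w2
    plt.plot(x1, x2, label=f"unit {unit}")
plt.legend()
plt.show()
```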
Now my question is: why is the boundary (the ReLU line) jagged? I thought it would be a straight line.

By the way, is the equation of the ReLU function as follows:
F(x) = max(0.0, z(x)), where z(x) is the equation of a line?
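For concreteness, that definition in NumPy (a minimal sketch; the values in z are made up):

```python
import numpy as np

z = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])  # made-up pre-activation values
print(np.maximum(0.0, z))                   # ReLU, elementwise max(0, z) -> [0. 0. 0. 1.5 3.]
```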

It’s only jagged because of the spacing of the values used to plot the curve.
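With coarse spacing, the plotted segments can miss the kink at z(x) = 0, so the corner shows up in the wrong place; with fine spacing, the two straight pieces are resolved. A minimal sketch, with made-up weight and bias values:

```python
import numpy as np
import matplotlib.pyplot as plt

def relu_line(x, w=2.0, b=-1.0):
    # ReLU applied to a linear function: max(0, w*x + b)
    return np.maximum(0.0, w * x + b)

coarse = np.linspace(-2, 2, 7)    # few samples: the segments miss the kink at x = 0.5
fine = np.linspace(-2, 2, 400)    # many samples: the kink is resolved

plt.plot(coarse, relu_line(coarse), "o-", label="coarse spacing")
plt.plot(fine, relu_line(fine), label="fine spacing")
plt.legend()
plt.show()
```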

I just want to answer this part of the question as well. I think the ReLU function I mentioned is correct, but z(x) is a plane rather than a line here, since there are two inputs. I was confused by the shape of the weights matrix, but after playing a little in Wolfram Alpha and plotting random things, I understood it.
Maybe that's also why the line looks jagged? Or is it the spacing of the values?
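To double-check, a quick sketch with made-up weights, showing that the pre-activation really is a plane and that its zero crease is an exact straight line in the input plane:

```python
import numpy as np

# Made-up weights for one unit: with two inputs, the pre-activation
# z(x1, x2) = w1*x1 + w2*x2 + b is a plane, not a line.
w1, w2, b = 1.0, -2.0, 0.5

def unit_output(x1, x2):
    # ReLU folds the plane flat wherever it is negative
    return np.maximum(0.0, w1 * x1 + w2 * x2 + b)

# The crease is where the plane crosses zero, which solves to a straight
# line in the input plane: x2 = -(w1*x1 + b) / w2
x1 = np.linspace(-2, 2, 5)
x2 = -(w1 * x1 + b) / w2
print(np.allclose(unit_output(x1, x2), 0.0))  # True: every point sits on the crease
```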

Yes.

That's the best way: try it and see for yourself. Do the same for that jagged line and adjust the spacing to convince yourself.
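For example (made-up weights again), shade the region where one unit is active on a coarse grid and on a fine grid; the jagged staircase on the left smooths out into the straight boundary on the right:

```python
import numpy as np
import matplotlib.pyplot as plt

w1, w2, b = 1.0, -2.0, 0.5           # made-up weights for one unit

fig, axes = plt.subplots(1, 2, figsize=(8, 4))
for ax, n in zip(axes, [12, 300]):   # coarse grid vs fine grid
    g1, g2 = np.meshgrid(np.linspace(-2, 2, n), np.linspace(-2, 2, n))
    active = (np.maximum(0.0, w1 * g1 + w2 * g2 + b) > 0).astype(float)
    # Shade the region where the unit fires; its edge is the boundary line
    ax.contourf(g1, g2, active, levels=[0.5, 1.5])
    ax.set_title(f"{n} x {n} grid")
plt.show()
```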

Cheers,
Raymond
