@aloklal99,
This looks like a question for the Deep learning specialization (DLS), not the machine learning specialization (MLS), so I’m moving it to the appropriate category.

I don’t have access to the DLS assignments, but from your description, it sounds like you are right. If you are calculating the derivative of the ReLU function with respect to z, the ReLU function is g(z) = max(0, z), so the derivative should be 0 if z < 0 and 1 if z > 0. (At z = 0 the derivative is technically undefined, and implementations usually just pick 0 or 1 there.)
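For what it’s worth, here’s a quick NumPy sketch of that derivative. The function name is just for illustration, not the name used in the assignment, and it follows the common convention of returning 0 at z = 0:

```python
import numpy as np

def relu_derivative(z):
    # Derivative of g(z) = max(0, z): 0 where z < 0, 1 where z > 0.
    # At z == 0 the derivative is undefined; this returns 0 there,
    # which is the convention most implementations use.
    return (z > 0).astype(float)

z = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu_derivative(z))  # -> [0. 0. 0. 1.]
```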

I’ll leave it for a DLS mentor to confirm and to put in a request for the DLS staff to update the assignment.