You’re right: that is the derivative of ReLU. Think about it for a second and it should become clear: the derivative is 1 if z > 0 and 0 otherwise, and since a = relu(z) is positive exactly when z is, testing a > 0 gives the same answer.
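In numpy terms, a minimal sketch of that derivative looks like this (the function name and variable are just illustrative, not taken from the notebook):

```python
import numpy as np

def relu_derivative(a):
    # 1 where the activation is positive, 0 otherwise
    # (the derivative at exactly 0 is undefined; taking it as 0 is the usual convention)
    return (a > 0).astype(float)
```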
Note that the code here looks quite a bit different from the fully general code we wrote in Week 4 of Course 1 for the L-layer case. Here they just “hard-code” everything to keep things simple: instead of calling a backward function that is parameterized by which activation is being used, they literally write the derivative out inline, as sketched below.
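To make the contrast concrete, here’s a minimal sketch of the two styles; the function and variable names are mine for illustration, not copied verbatim from either notebook.

```python
import numpy as np

# General style (in the spirit of the Course 1, Week 4 helpers): one backward
# function per activation, selected by a string parameter.
def relu_backward(dA, Z):
    dZ = np.array(dA, copy=True)   # gradient passes dA through unchanged...
    dZ[Z <= 0] = 0                 # ...except where the pre-activation was <= 0
    return dZ

def linear_activation_backward(dA, Z, activation):
    if activation == "relu":
        return relu_backward(dA, Z)
    elif activation == "sigmoid":
        s = 1 / (1 + np.exp(-Z))
        return dA * s * (1 - s)    # sigmoid derivative, written inline for brevity
    raise ValueError("unknown activation")

# Hard-coded style (what this notebook does): no helper, no activation
# parameter, the ReLU derivative is written straight into the backprop
# line, e.g. something along the lines of:
#   dZ1 = np.multiply(dA1, np.int64(A1 > 0))
```

Both end up computing the same dZ; the hard-coded version just trades generality for a shorter, easier-to-read example.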