W4 Assignment 1, Exercise 7: correction made to assignment code

I had to add a line m = b.shape[1] to the existing code in Exercise #7 to avoid the Python error: UnboundLocalError: local variable 'm' referenced before assignment. Once the line was added, my code executed and passed the tests. Am I missing something?

    A_prev, W, b = cache
    m = b.shape[1] # line added by student to define m prior to use
    m = 1/m * A_prev.shape[1]
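For what it's worth, here is a minimal sketch (a made-up function, not the assignment code) of why Python raises that error: once a name is assigned anywhere inside a function, Python treats it as local to that function, so reading it on the right-hand side before it has a value fails.

def scale():
    m = 1 / m * 10   # m is read here before it has been assigned
    return m

scale()   # UnboundLocalError: local variable 'm' referenced before assignment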

If you mean linear_backward, here is the template code as it was given to you:

# GRADED FUNCTION: linear_backward

def linear_backward(dZ, cache):
    """
    Implement the linear portion of backward propagation for a single layer (layer l)

    Arguments:
    dZ -- Gradient of the cost with respect to the linear output (of current layer l)
    cache -- tuple of values (A_prev, W, b) coming from the forward propagation in the current layer

    Returns:
    dA_prev -- Gradient of the cost with respect to the activation (of the previous layer l-1), same shape as A_prev
    dW -- Gradient of the cost with respect to W (current layer l), same shape as W
    db -- Gradient of the cost with respect to b (current layer l), same shape as b
    """
    A_prev, W, b = cache
    m = A_prev.shape[1]

    ### START CODE HERE ### (≈ 3 lines of code)

So m is already defined for you in the template. Where did that factor \frac{1}{m} come from in your version? It looks like you made some ill-advised, or perhaps accidental, changes to the template code.
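Just to spell out where that \frac{1}{m} does belong: it scales the gradients themselves, not the definition of m. In the docstring's notation, the standard linear backward formulas (the same ones shown in the notebook text above the graded cell, if I recall correctly) are:

dW = \frac{1}{m} \, dZ \cdot A_{prev}^T

db = \frac{1}{m} \sum_{i=1}^{m} dZ^{(i)} (i.e. sum dZ across the examples axis, keeping it as a column)

dA_{prev} = W^T \cdot dZ

where m = A_prev.shape[1] is the number of examples, exactly as the template sets it.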

If you don’t believe me, here are the instructions for getting a clean version of the notebook.

Of course also note that you are far from the first person to attempt this assignment. If that really was the way it was written, everyone would have had this problem and it would have been fixed by now.

Also note that your fix is not really going to work. The shape of b is independent of the number of samples, right?
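To spell that out: b for a layer has shape (n_l, 1), one bias per unit that gets broadcast across the examples, so b.shape[1] is always 1 no matter how many samples are in the batch. A quick sketch with made-up sizes:

import numpy as np

n_l, m = 4, 7                    # hypothetical current-layer size and number of examples
A_prev = np.random.randn(3, m)   # activations from the previous layer (3 units here)
b = np.zeros((n_l, 1))           # one bias per unit in the current layer

print(b.shape[1])        # 1, always, regardless of m
print(A_prev.shape[1])   # 7, which is the m the template defines

In your version it only comes out right by accident: b.shape[1] is 1, so the 1/m in your next line divides by 1 and you end up with A_prev.shape[1] anyway, which is presumably why the tests still passed.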

It’s quite possible I accidentally changed the code. I certainly thought it was unlikely that the template code itself was incorrect, but correct code does sometimes regress unintentionally to an incorrect state. That’s why I posted the question.