Hi!

So I am getting an error in exercise 8 model where it says tuple index out of range.

What could be the issue?


You are indexing something incorrectly. Sorry, we can't say much more than that without more information. We're not supposed to publicly share source code, but it's fine to "copy/paste" the complete exception trace that you are getting. That should at least give us more to work with in terms of debugging this.

Well, thank you for your reply!

I was able to solve this. Yes, it was an indexing issue.

Also, is there any way I could extend my assignment deadlines? I could not work on them earlier because I had Covid symptoms and was not feeling well over the past couple of weeks. I have not fully recovered, but I am trying to do the work now.

Thanks!

Congrats! It's great to hear that you were able to debug the problem under your own power. Thanks for confirming.

About the deadlines, just ignore them. They are all "fake" in the sense that there is no penalty for missing them. They will just reset, or you can reset them. You may well ask why they bother with deadlines then, but it turns out that they've been studying us like lab rats and they have statistics that show that hassling people in this way increases their chances of success. Who knew?


Thank you for the information!

Actually, my course ends on September 6th, and the final deadline, as listed, is 2:46 am on Sept 6th. That is why I was asking.

So will I still be able to submit assignments after Sep 6th and complete the course?

Thanks!

That is my understanding. Of course if you are taking the course through a sponsoring institution, they may have some way to enforce deadlines. But Coursera in general does not enforce them: they are happy to have you continue paying the subscription fee for longer.

Okay, thanks!

So I am having an issue in week 3 assignment.

For exercise 6 back propagation, it says ValueError. I am pasting my code temporarily so you can check it:

```
dZ2 = A2 - Y
dW2 = (1/m) * dZ2 * (A1.T)
db2 = (1/m) * np.sum(dZ2, axis=1, keepdims=True)
dZ1 = (W2.T * dZ2) * (1 - np.power(A1, 2))
dW1 = (1/m)
db1 = (1/m) * np.sum(dZ1, axis=1, keepdims=True)
```

The error that it is giving is this:

```
ValueError                                Traceback (most recent call last)
in
      1 parameters, cache, t_X, t_Y = backward_propagation_test_case()
      2
----> 3 grads = backward_propagation(parameters, cache, t_X, t_Y)
      4 print ("dW1 = "+ str(grads['dW1']))
      5 print ("db1 = "+ str(grads['db1']))

in backward_propagation(parameters, cache, X, Y)
     46     # YOUR CODE STARTS HERE
     47     dZ2=A2-Y
---> 48     dW2=(1/m)*dZ2*(A1.T)
     49     db2=(1/m)*np.sum(dZ2, axis=1, keepdims=True)
     50     dZ1=(W2.T*dZ2)*(1 - np.power(A1, 2))

ValueError: operands could not be broadcast together with shapes (1,3) (3,4)
```

You've made the same mistake in several places: you are using elementwise multiply * where you should be using dot products. You need to be aware of the notational convention that Prof Ng uses:

When he means elementwise multiply, he *always* writes the explicit * as the operator.

When he just writes the two operands next to each other without an explicit operator, he means dot product style multiply. So in this formula:

dW^{[2]} = \displaystyle \frac {1}{m} dZ^{[2]} A^{[1]T}

the operation between dZ^{[2]} and A^{[1]T} is the dot product. I would write it like this:

dW^{[2]} = \displaystyle \frac {1}{m} dZ^{[2]} \cdot A^{[1]T}

But Prof Ng is the boss, so you just have to understand how he does things. You've made a similar mistake in the code for dZ^{[1]}. Take another look at that formula with what I said above in mind. In the case of dZ^{[1]}, there is one dot product and one elementwise multiply.
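To make the distinction concrete, here is a runnable sketch of those gradient lines using `np.dot` where the operands are written side by side and * only for true elementwise products. The shapes (2 input features, 4 hidden units, 3 examples) are assumptions chosen to match the `(1,3) (3,4)` shapes in your trace, not the assignment's actual data:

```python
import numpy as np

rng = np.random.default_rng(0)
m = 3                               # number of examples (assumed)
X  = rng.standard_normal((2, m))    # inputs: 2 features (assumed)
A1 = rng.standard_normal((4, m))    # hidden-layer activations, 4 units (assumed)
A2 = rng.random((1, m))             # output-layer activations (sigmoid range)
Y  = rng.integers(0, 2, (1, m))     # labels
W2 = rng.standard_normal((1, 4))

dZ2 = A2 - Y                                        # (1, m)
dW2 = (1 / m) * np.dot(dZ2, A1.T)                   # dot product -> (1, 4)
db2 = (1 / m) * np.sum(dZ2, axis=1, keepdims=True)  # (1, 1)
dZ1 = np.dot(W2.T, dZ2) * (1 - np.power(A1, 2))     # dot, then elementwise -> (4, m)
dW1 = (1 / m) * np.dot(dZ1, X.T)                    # dot product -> (4, 2)
db1 = (1 / m) * np.sum(dZ1, axis=1, keepdims=True)  # (4, 1)
```

Note that `dZ2 * A1.T` fails exactly as in your trace, because elementwise * requires broadcast-compatible shapes, while `np.dot` contracts the inner dimension.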

Hello, I'm getting an assertion error saying that the size of the d['w'] matrix is not equal to the expected result. What might be the issue?

All my previous exercises have been correct, and I don't understand why this is happening.

There must be something the matter with your code in the `model` function. Note that a perfectly correct function can still throw errors if you pass it bad or mismatched arguments. So if your previous functions pass the tests, then the bug must be in how you are calling them from `model`.

In terms of how to debug a shape mismatch, the first question is "what shape is it?" Add this code at a couple of relevant places in your model function:

`print(f"w.shape = {w.shape}")`

E.g. right after the initialize call and right before the call to whichever subroutine it is that is throwing the error in this case. If that's not enough to get you to a solution, then the next step would be to show us the full exception trace that you are getting.
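As a purely hypothetical illustration of the pattern (made-up function bodies and shapes, not the assignment's actual code), the debug print sits right after the initialization call, and the dimension comes from the function's own arguments:

```python
import numpy as np

def initialize_with_zeros(dim):
    # Returns a zero weight vector of shape (dim, 1) and a zero bias.
    w = np.zeros((dim, 1))
    b = 0.0
    return w, b

def model(X_train, Y_train):
    # Derive the dimension from the argument that was actually passed in.
    w, b = initialize_with_zeros(X_train.shape[0])
    print(f"w.shape = {w.shape}")  # debug print right after initialization
    # ... forward/backward passes and optimization would follow here ...
    return w, b

# Made-up shapes: 4 features, 7 training examples.
X_train = np.zeros((4, 7))
Y_train = np.zeros((1, 7))
w, b = model(X_train, Y_train)  # prints w.shape = (4, 1)
```

If the print shows a shape that does not match the data you passed in, the argument to the initialize call is the first place to look.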

I put the `print(f"w.shape = {w.shape}")` after calling the `initialize_with_zeros(dim)` function in the model, but the shape was (2,1). I also tried putting it in exercise 4, where the `initialize_with_zeros(dim)` function was defined, but I still got the shape as (2,1). I don't seem to find a way through.

Here's the full exception trace

Ok, that means you are passing the wrong argument when you call the initialize routine. You are probably using a global variable like `dim`. That is not defined in the `model` function, right?

Yes, `dim` is not defined in the model function

And you did not reference it there, right?

I already did, but it's returning a different error:

`ValueError: shapes (1,4) and (2,3) not aligned: 4 (dim 1) != 2 (dim 0)`

Ok, you fixed the first global variable error. Now you've got another one. You are probably referencing X, which is also not defined in `model`, right?

Not really, I only passed in X as an argument to the `optimize` and `predict` functions respectively when calling them in the model function.

I think I have to use the defined parameters in the model while calling the functions, right?

Exactly. There is no X in the local scope of that function, right? That was my point. If you reference X, you are picking up values that are global and have nothing to do with the parameters that were actually passed. This is a pretty key point to understand.
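To make the scoping point concrete, here is a tiny made-up example (the names and shapes are hypothetical, not the assignment's code) showing how a global silently leaks into a function that forgets to use its parameter:

```python
import numpy as np

# A global X, as in a notebook where test cells define variables
# at the top level.
X = np.zeros((2, 3))

def bad_model(X_train):
    # BUG: references the global X instead of the X_train argument,
    # so it always sees shape (2, 3), no matter what is passed in.
    return X.shape

def good_model(X_train):
    # Correct: uses only the parameter that was actually passed.
    return X_train.shape

X_train = np.zeros((5, 10))
print(bad_model(X_train))   # (2, 3) -- the global leaks in
print(good_model(X_train))  # (5, 10)
```

This is why `bad_model` can pass a test whose global data happens to match, and then blow up with a shape mismatch when called with different data.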
