Course 2 Week: 6.1 Mini-Batch Gradient Descent

This code is provided, but when I try to run it I get an error. I also get the same error with Adam and Momentum, even though I passed all the exercises.
@bahadir

Hi @Ngo_Nam_Khanh , welcome to the community! This is your first post :slight_smile:

Here is what I would do to begin with:

Although this code is provided, if you analyze the error you'll see it comes from calling the “model” function (which is also provided). The model in turn calls backward_propagation, which receives the parameters minibatch_X, minibatch_Y, and cache.

The error points at the line da2 = np.dot(W3.T, dz3) which, looking at the utils Python file, involves the ‘cache’ parameter passed to backward_propagation.

This ‘cache’ in turn comes from forward_propagation, which received the parameters (minibatch_X, parameters).
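To see that chain in one place, here is a rough sketch of how those calls fit together (simplified; the names follow the notebook, but check the actual source for the exact code):

```python
# Rough sketch of the call chain (simplified, not the exact provided code):
minibatches = random_mini_batches(X, Y, mini_batch_size, seed)      # your implementation
for (minibatch_X, minibatch_Y) in minibatches:
    a3, caches = forward_propagation(minibatch_X, parameters)       # provided
    grads = backward_propagation(minibatch_X, minibatch_Y, caches)  # error is raised in here
```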

The minibatch_X parameter passed to forward_propagation was calculated using a function you implemented:

minibatches = random_mini_batches(X, Y, mini_batch_size, seed)

So, how about starting by checking whether this function is returning mini-batches with an invalid shape? You can add print statements at several points to inspect the outputs and understand the details.
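For example, here is a minimal check you could paste into the notebook (assuming X has shape (n_x, m) and Y has shape (1, m), as in the assignment):

```python
# Minimal shape check, run in the notebook where X, Y, and
# random_mini_batches are already defined:
minibatches = random_mini_batches(X, Y, mini_batch_size=64, seed=0)
for i, (mb_X, mb_Y) in enumerate(minibatches):
    print(f"minibatch {i}: X {mb_X.shape}, Y {mb_Y.shape}")
# Every X should be (n_x, 64) and every Y should be (1, 64),
# except possibly the smaller last mini-batch.
```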

Let me know how it goes!

Juan


Yes, I think Juan has narrowed this down to the place to look for the problem. There is a lot of code here that you didn’t write, but somehow the code you did write is not passing correct values. One “gotcha” to watch out for is that the tests for random_mini_batches only check the minibatches for X and totally ignore the minibatches for Y. And if you look at that backprop code where the error is being thrown, it uses Y before the point of the exception. I’ll bet your Y minibatches are the wrong shape.
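To make that concrete, here is a self-contained illustration of how a mis-shaped Y mini-batch can blow up exactly at that line (the layer sizes here are made up, not the assignment's actual ones):

```python
import numpy as np

m, n2, n3 = 64, 5, 1               # hypothetical sizes: batch, hidden units, outputs
a3 = np.random.rand(n3, m)         # forward-prop output, shape (1, 64)
W3 = np.random.rand(n3, n2)        # shape (1, 5)

# Correctly sliced labels: shape (1, 64)
Y = np.random.rand(n3, m)
dz3 = 1. / m * (a3 - Y)            # shape (1, 64)
da2 = np.dot(W3.T, dz3)            # shape (5, 64) -- works

# Mis-sliced labels, e.g. transposed: shape (64, 1)
Y_bad = np.random.rand(m, n3)
dz3 = 1. / m * (a3 - Y_bad)        # silently broadcasts to shape (64, 64)!
da2 = np.dot(W3.T, dz3)            # ValueError: shapes (5,1) and (64,64) not aligned
```

Note that the bad shape doesn't fail at the subtraction: NumPy broadcasting hides it, and the error only surfaces at the dot product, which matches where the exception was reported.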


Thanks, I figured out the problem and fixed it.