Guidance on L_layer_model function

Hello, I am having problems with the function L_layer_model. It is passing the datatype_check and the shape_check but is failing on both of the equation_output_checks. I am utilizing the following functions from the previous assignment: initialize_parameters_deep(layers_dims), L_model_forward(X, parameters), compute_cost(AL, Y), L_model_backward(AL, Y, caches), and update_parameters(parameters, grads, learning_rate). I am using the input parameters that are given. Any guidance would be appreciated.
Amy
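
For readers following along, the overall structure being described is roughly the following. This is a minimal sketch, not the graded solution: the helper names and signatures are the ones listed in the post above, but the loop body and the hyperparameter defaults are assumptions on my part.

```python
# Minimal sketch of the call sequence (assumes the helper functions listed above
# are already defined in the notebook; hyperparameter defaults are illustrative).
def L_layer_model_sketch(X, Y, layers_dims, learning_rate=0.0075, num_iterations=3000):
    parameters = initialize_parameters_deep(layers_dims)   # use the version provided in this notebook
    for i in range(num_iterations):
        AL, caches = L_model_forward(X, parameters)          # forward pass through all L layers
        cost = compute_cost(AL, Y)                           # cross-entropy cost on the final activation
        grads = L_model_backward(AL, Y, caches)              # backward pass
        parameters = update_parameters(parameters, grads, learning_rate)  # gradient descent step
    return parameters
```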

Hi, Amy.

Thanks for being careful about not showing your actual source code. Everything you describe sounds correct. Are you sure that you did not “hand import” any of your functions from the “Step by Step” assignment? Note that this was not part of the instructions: you can simply call those functions, and correct implementations are already provided for you. However, the provided code uses a more sophisticated version of the initialization function for the “deep” case. If you use your own version from “Step by Step”, the results will have the right shape, but with different values.
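
To make that last point concrete, here is an illustrative sketch. The exact scaling used by the provided “deep” initializer is an assumption on my part; the point is only that two initializers with different random scalings produce the same shapes but different values, which is exactly how the shape_check can pass while the equation_output_checks fail.

```python
import numpy as np

# Illustration only (not the assignment code): two initializers that differ
# only in how the random weights are scaled.
def init_simple(layers_dims, seed=3):
    np.random.seed(seed)
    params = {}
    for l in range(1, len(layers_dims)):
        # fixed small scaling
        params['W' + str(l)] = np.random.randn(layers_dims[l], layers_dims[l - 1]) * 0.01
        params['b' + str(l)] = np.zeros((layers_dims[l], 1))
    return params

def init_scaled(layers_dims, seed=3):
    np.random.seed(seed)
    params = {}
    for l in range(1, len(layers_dims)):
        # scale by 1/sqrt(n_prev) instead of a fixed constant
        params['W' + str(l)] = np.random.randn(layers_dims[l], layers_dims[l - 1]) / np.sqrt(layers_dims[l - 1])
        params['b' + str(l)] = np.zeros((layers_dims[l], 1))
    return params

a = init_simple([5, 4, 3])
b = init_scaled([5, 4, 3])
print(a['W1'].shape == b['W1'].shape)   # True  -> shape_check still passes
print(np.allclose(a['W1'], b['W1']))    # False -> cost values won't match the expected output
```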

If that’s not the issue, then the next step would be to show us the full output you get from the test cell that fails.


Hello, I am getting a matrix multiplication error in the L_layer_model function:

[error traceback omitted]

How do I ensure I have the right dimensions for the activations and the weight matrices?

This thread is about the second assignment in Week 4, but you are asking about the “Step by Step” assignment, which is the first one in Week 4.

In your case, execution has fallen out of the “for” loop and is doing the processing for the output layer, which uses the sigmoid activation. What is happening is that the A_prev value at that point is not what you need it to be. To understand why, the best first step is to do the “dimensional analysis”: that will give you a clear picture of what should be happening at each layer, and with that information the nature of your mistake will be clearer. Here’s a thread which walks you through that for this “2hidden” test case.
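
If you want to do that dimensional analysis in code as well, a sketch like the one below can help. The helper name and the plain linear step with np.dot are my own illustration, not part of the assignment; it only tracks shapes, but it will stop at exactly the layer where the matrix multiplication cannot work.

```python
import numpy as np

# Hypothetical helper: print the shape of every W, b, and activation, layer by layer.
def print_shapes(X, parameters):
    L = len(parameters) // 2            # number of W/b pairs
    A = X
    print("A0 (input):", A.shape)
    for l in range(1, L + 1):
        W = parameters['W' + str(l)]
        b = parameters['b' + str(l)]
        print(f"W{l}: {W.shape}  b{l}: {b.shape}")
        # np.dot(W, A) requires W.shape[1] == A.shape[0]
        assert W.shape[1] == A.shape[0], f"Mismatch at layer {l}: {W.shape} vs A_prev {A.shape}"
        A = np.dot(W, A) + b            # linear step only, just to follow the shapes
        print(f"A{l}: {A.shape}")

# Example with made-up layer sizes (4 inputs, two hidden layers, 1 output, 7 examples):
dims = [4, 3, 2, 1]
params = {}
for l in range(1, len(dims)):
    params['W' + str(l)] = np.random.randn(dims[l], dims[l - 1])
    params['b' + str(l)] = np.zeros((dims[l], 1))
print_shapes(np.random.randn(4, 7), params)
```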

Please read the thread I linked and see what it shows you. What shape should the A_prev value be at that point? Why did your value end up being wrong?

My bad for posting in the wrong thread. But thank you for your help; I was able to resolve my issue!


Great to hear that you were able to find the solution based on those suggestions. Nice work!