I’m not sure why I’m getting this error. When I try to call the linear_activation_forward function, it will not accept the output of the parameters dictionary as an argument.
You are using the wrong syntax for indexing a dictionary. If you write:

```python
myDict(myKey)
```

in Python, that means myDict is a function and you are invoking it with the argument myKey. If myDict is a dictionary and myKey is the key, the correct syntax is:

```python
myDict[myKey]
```

See the difference?
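Here’s a tiny self-contained example of the difference; the dictionary and key names are just placeholders, not the ones from your notebook:

```python
myDict = {"W1": [[0.01, 0.02]], "b1": [[0.0]]}

print(myDict["W1"])  # square brackets: dictionary lookup by key

try:
    myDict("W1")     # parentheses: Python tries to *call* the dict
except TypeError as err:
    print(err)       # prints: 'dict' object is not callable
```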
Note that this is not the only mistake in your invocation of linear_activation_forward. What type is the activation parameter supposed to be? You are passing a reference to a function object. I predict that will not end happily.
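Here is a sketch of what I mean, with a simplified stand-in for the assignment’s function (the real one also returns a cache; check the docstring in your notebook for the exact contract):

```python
import numpy as np

def linear_activation_forward(A_prev, W, b, activation):
    # Simplified stand-in for the notebook function, just to show the interface.
    Z = W @ A_prev + b
    if activation == "relu":
        return np.maximum(0, Z)
    if activation == "sigmoid":
        return 1 / (1 + np.exp(-Z))
    raise ValueError("activation must be the *string* 'relu' or 'sigmoid'")

A_prev = np.random.randn(3, 4)
W = np.random.randn(1, 3)
b = np.zeros((1, 1))

A = linear_activation_forward(A_prev, W, b, activation="sigmoid")  # string: works
# linear_activation_forward(A_prev, W, b, np.tanh)  # function object: raises ValueError
```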
Hi Paul, thanks for the prompt reply.
I changed the activation parameter as required, but now I’m getting the error shown below. I’m also now indexing the parameters dictionary as you described above, but I can’t see what mistake I’m making here.
Regards,
Harsh

One step at a time, right? So now you’ve solved the problems that were throwing exceptions. Now you just have incorrect outputs. So you still have more debugging to do.
Look at the shape of your AL value: it is 3 x 4, right? That is wrong. There is only 1 neuron in the output layer, so AL should be 1 x m, where m is the number of samples. In this test case m is 4, so AL should be 1 x 4. So now you need to figure out how that happened.
The best way to get a handle on this type of problem is to start with the “dimensional analysis”. Here is a thread that shows how to do that for the “2hidden” test case. Please have a look at that thread and then start by putting print statements in your code to show the dimensions of A^{[l]} for all layers. That should give you someplace to start in figuring out what is wrong here.
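To make the idea concrete, here is a self-contained toy version with made-up layer sizes (5 → 4 → 3 → 1, m = 4), not your actual code, showing the kind of shape printout to aim for:

```python
import numpy as np

np.random.seed(1)
layer_dims = [5, 4, 3, 1]   # toy sizes: input -> two hidden layers -> output
m = 4                       # number of samples, as in the test case

parameters = {}
for l in range(1, len(layer_dims)):
    parameters["W" + str(l)] = np.random.randn(layer_dims[l], layer_dims[l - 1]) * 0.01
    parameters["b" + str(l)] = np.zeros((layer_dims[l], 1))

A = np.random.randn(layer_dims[0], m)   # stands in for the input X
L = len(layer_dims) - 1

for l in range(1, L):
    Z = parameters["W" + str(l)] @ A + parameters["b" + str(l)]
    A = np.maximum(0, Z)                 # relu for the hidden layers
    print(f"A{l} shape: {A.shape}")      # expect (layer_dims[l], m)

ZL = parameters["W" + str(L)] @ A + parameters["b" + str(L)]
AL = 1 / (1 + np.exp(-ZL))               # sigmoid for the output layer
print(f"AL shape: {AL.shape}")           # expect (1, 4) here
```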
Thank you so much, Paul!!
I figured out where I was wrong. In the loop that calls the function with the ‘relu’ activation, I was on each iteration also overwriting the activation value that needs to be used as the input to linear_activation_forward with the sigmoid activation for the last layer when computing AL. I corrected the mistake after doing the dimensional analysis.
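For anyone who hits the same thing, the corrected control flow looks roughly like this (a schematic only; linear_activation_forward and parameters are from the notebook, the rest is paraphrased from memory, not the exact graded code):

```python
A = X
L = len(parameters) // 2   # each layer contributes a W and a b entry

# Hidden layers: relu, each iteration feeding the previous layer's output in
for l in range(1, L):
    A_prev = A
    A, cache = linear_activation_forward(
        A_prev, parameters["W" + str(l)], parameters["b" + str(l)],
        activation="relu")

# Output layer: sigmoid, whose input must be the *last* hidden activation,
# not a value left over from earlier in the loop
AL, cache = linear_activation_forward(
    A, parameters["W" + str(L)], parameters["b" + str(L)],
    activation="sigmoid")
```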
Thanks & Regards,
Harsh
Congrats on finding the solution! This is the most complicated assignment we’ve seen yet, so it takes some serious thinking to get it right.