Hi Bernard, I think you may be missing something in your code. Are the sigmoid and relu activations specified in your code? I can't assume they are. If they are, then re-run all the cells again with Shift+Enter, as Paul sir mentioned above. Hope you get it working.
I think that means you used the "deep" version of the initialization routine for the two-layer case. That is not incorrect from a philosophical point of view, but it will cause you to get different results: they use a more sophisticated version in the "deep" case. There are two separate init routines, and you have to use the appropriate one.
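To illustrate why the results differ (a minimal sketch with illustrative names, not the assignment's actual code): the simple two-layer routine typically scales the random weights by a small constant, while the deep routine scales each layer by the square root of the size of the layer feeding into it. With identical shapes and the same seed, the two still produce different weights, so the expected test values for the two-layer model will not match.

```python
import numpy as np

def initialize_parameters(n_x, n_h, n_y):
    # Two-layer version: weights scaled by a small constant (0.01).
    np.random.seed(1)
    return {
        "W1": np.random.randn(n_h, n_x) * 0.01,
        "b1": np.zeros((n_h, 1)),
        "W2": np.random.randn(n_y, n_h) * 0.01,
        "b2": np.zeros((n_y, 1)),
    }

def initialize_parameters_deep(layer_dims):
    # Deep version: weights scaled by 1/sqrt(previous layer size),
    # a Xavier/He-style factor that keeps activations well-scaled
    # as the network gets deeper.
    np.random.seed(1)
    params = {}
    for l in range(1, len(layer_dims)):
        params["W" + str(l)] = (
            np.random.randn(layer_dims[l], layer_dims[l - 1])
            / np.sqrt(layer_dims[l - 1])
        )
        params["b" + str(l)] = np.zeros((layer_dims[l], 1))
    return params

# Same seed and same shapes, but different scaling, so every weight differs:
two_layer = initialize_parameters(3, 2, 1)
deep = initialize_parameters_deep([3, 2, 1])
print(two_layer["W1"][0, 0])  # scaled by 0.01
print(deep["W1"][0, 0])       # scaled by 1/sqrt(3)
```

So if the grader expects the two-layer model's specific numbers, calling the deep routine will produce values that are off by exactly that scale factor, even though the code "works".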
Yes Rashmi, the activation functions are indeed present in the code, as you can see in the two lines I shared above. I really appreciate Paul's assistance; many thanks for being around, sir.
Hello Paul, it was indeed the 'init' function issue. As you rightly pointed out, I used the deep version to initialize the parameters. I changed it and got the code working just fine. Many thanks for your assistance; much appreciated!
Okay, I see. Yes, Paul sir and the other mentors always help us get our problems solved.
I suggest that you completely rewrite the code for this cell. Erase your present code, restart the kernel with all outputs cleared (Kernel → Restart & Clear Output), save your work, and then try writing it again.