W3_A1_Compute cost_cross entropy loss

{moderator edit - solution code removed}

I cannot complete this function. Where is my mistake?

Can you explain what the function is supposed to do?

Hi @muhmoud_shehata,

Please read through the instructions for this exercise and try again.

Best,
Mubsi


You have made the very common mistake of assuming that the sample code they gave you is the complete solution. It’s not. Read the math formula for the cost given in the instructions and compare that to the code and ask yourself two questions:

  1. What happened to the factor of \frac{1}{m}?
  2. Why is there only one term? What happened to the Y = 0 term?

In fact they even say in the instructions that it’s not the full answer, but apparently they were a little too subtle about how they phrased it.
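For reference, the full binary cross-entropy cost (the \frac{1}{m} factor and both the Y = 1 and Y = 0 terms) can be sketched in NumPy roughly like this. The function and argument names (`compute_cost`, `A2`, `Y`) follow the usual course conventions but are assumptions here, not the graded solution:

```python
import numpy as np

def compute_cost(A2, Y):
    """Binary cross-entropy cost.

    A2: sigmoid outputs of the network, shape (1, m)
    Y:  true labels in {0, 1}, shape (1, m)
    """
    m = Y.shape[1]  # number of examples -- this supplies the 1/m factor
    # Both terms: the Y = 1 term and the Y = 0 term
    cost = -(1.0 / m) * np.sum(
        Y * np.log(A2) + (1 - Y) * np.log(1 - A2)
    )
    return float(np.squeeze(cost))  # ensure a scalar, not a 1x1 array
```

Note that the template code in the notebook typically shows only one of the two log terms, which is exactly the trap described above.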


Are you familiar with the assignments in DLS Course 1? This is from the Week 3 assignment. In all the networks we build in Course 1, we are doing binary classifications, so the output activation is sigmoid and the corresponding loss function is “cross entropy” loss, which you see mentioned in the “docstring” of the function. If you are not familiar with that, you can learn about it by taking DLS Course 1! :nerd_face:

Am I familiar with the assignments in DLS Course 1? Not yet. However, I am familiar with the concepts. To elaborate, I meant to ask: "What is the end goal of the assignment? What is the motivation?"

I got it :smiling_face_with_tear:
Thanks a lot.


To build our first actual Neural Network with one hidden layer and demonstrate how to use it to solve a particular binary classification problem. All of this is explained in the lectures and assignments of DLS Course 1 Week 3, with the previous weeks as background, where we see how to solve a different binary classification problem using Logistic Regression.
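The forward pass of such a one-hidden-layer binary classifier can be sketched as follows. This is a minimal illustration, not the assignment's graded code; the shapes and the tanh hidden activation match the usual Course 1 Week 3 setup, but all names here are assumptions:

```python
import numpy as np

def sigmoid(z):
    # Output activation for binary classification
    return 1.0 / (1.0 + np.exp(-z))

def forward(X, W1, b1, W2, b2):
    """One hidden layer, tanh activation, sigmoid output.

    X:  inputs, shape (n_x, m)
    W1: (n_h, n_x), b1: (n_h, 1)
    W2: (1, n_h),  b2: (1, 1)
    Returns A2, the predicted probabilities, shape (1, m).
    """
    Z1 = W1 @ X + b1
    A1 = np.tanh(Z1)      # hidden layer
    Z2 = W2 @ A1 + b2
    A2 = sigmoid(Z2)      # output layer: probability of class 1
    return A2
```

Training then consists of minimizing the cross-entropy cost of `A2` against the labels via gradient descent, exactly the loop the Week 3 assignment walks through.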