# W3_A1_Compute cost_cross entropy loss

{moderator edit - solution code removed}

I cannot complete this function. Where is my mistake?

Can you explain what the function is supposed to do?

Best,
Mubsi


You have made the very common mistake of assuming that the sample code they gave you is the complete solution. It's not. Read the math formula for the cost given in the instructions, compare it to the code, and ask yourself two questions:

1. What happened to the factor of $\frac{1}{m}$?
2. Why is there only one term? What happened to the Y = 0 term?

In fact, they even say in the instructions that it's not the full answer, but apparently they were a little too subtle about how they phrased it.
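For reference (not the assignment's exact solution code, which the moderators removed), here is a minimal sketch of the full binary cross-entropy cost the two questions above point at, with both the $Y = 1$ and $Y = 0$ terms and the $\frac{1}{m}$ averaging factor. The function name and argument names are my own, not the notebook's:

```python
import numpy as np

def cross_entropy_cost(A2, Y):
    """Binary cross-entropy cost, averaged over the m examples.

    A2: sigmoid activations of the output layer, shape (1, m)
    Y:  true labels in {0, 1}, shape (1, m)
    """
    m = Y.shape[1]
    # Both terms of the loss are present: Y * log(A2) handles the Y = 1
    # examples, (1 - Y) * log(1 - A2) handles the Y = 0 examples, and the
    # 1/m factor averages the summed losses over the batch.
    cost = -(1 / m) * np.sum(Y * np.log(A2) + (1 - Y) * np.log(1 - A2))
    return float(np.squeeze(cost))
```

Compare this to the one-term, unscaled expression in the sample code and the two missing pieces should be obvious.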


Are you familiar with the assignments in DLS Course 1? This is from the Week 3 assignment. In all the networks we build in Course 1, we are doing binary classifications, so the output activation is sigmoid and the corresponding loss function is "cross entropy" loss, which you see mentioned in the "docstring" of the function. If you are not familiar with that, you can learn about it by taking DLS Course 1!

Am I familiar with the assignments in DLS Course 1? Not yet… However, I am very familiar with the concepts. To elaborate, I meant to ask: "What is the end goal of the assignment? What is the motive?"

I got it, thanks a lot.


To build our first actual Neural Network with one hidden layer and demonstrate how to use it to solve a particular binary classification problem. All this is explained in the lectures and assignments of DLS Course 1 Week 3, with the previous weeks as background, of course, where we see how to solve a different binary classification problem using Logistic Regression.