Course 1 W3A1 Exercise 5: Cross-Entropy Loss

I get the error below. I believe I have implemented the code correctly, but I keep getting the same assertion failure.
Here is my code, which follows the notebook's guidelines for calculating the cross-entropy loss:
logprobs = np.multiply(np.log(A2), Y)
cost = - np.sum(logprobs)

AssertionError: Wrong value. Expected: 0.5447066599017815 got: 1.0493608309109266

Hi @mridul1979

Double-check the implementation of the cross-entropy loss, and also verify that A2 contains the predicted probabilities and Y the true labels. Note as well that np.log(A2) produces invalid values (-inf) if any element of A2 is exactly zero, so the log should be guarded against that case.
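One common way to guard the log is to clip the probabilities away from 0 and 1 before taking np.log. This is a minimal sketch; the epsilon value here is an illustrative choice, not something specified in the notebook:

```python
import numpy as np

# Hypothetical guard: clip predicted probabilities into (eps, 1 - eps)
# so np.log never receives an exact 0, which would return -inf.
eps = 1e-12
A2 = np.array([[1.0, 0.0, 0.5]])          # example probabilities, incl. the edge cases
A2_safe = np.clip(A2, eps, 1 - eps)       # shift 0 -> eps and 1 -> 1 - eps

# Both log terms of the cross-entropy are now finite.
print(np.all(np.isfinite(np.log(A2_safe))))        # log of the clipped values
print(np.all(np.isfinite(np.log(1 - A2_safe))))    # log of the complements
```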

Hope it helps! Feel free to ask if you need further assistance.

I think you may have misread the instructions.

The code you quoted is only part of the solution.

The full equation for the cost appears just above the instructions in the notebook.
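For context, the code in the original post computes only the Y * log(A2) term. The binary cross-entropy cost also includes the (1 - Y) * log(1 - A2) term and an average over the m examples. A minimal sketch of the full computation (the function name and shapes are assumptions based on this exercise's conventions, not a copy of the notebook's solution):

```python
import numpy as np

def compute_cost(A2, Y):
    """Binary cross-entropy cost, averaged over the batch.

    A2: predicted probabilities, shape (1, m)
    Y:  true labels (0 or 1), shape (1, m)
    """
    m = Y.shape[1]
    # Both terms of the cross-entropy, not just Y * log(A2)
    logprobs = np.multiply(np.log(A2), Y) + np.multiply(np.log(1 - A2), 1 - Y)
    cost = -np.sum(logprobs) / m          # average over the m examples
    return float(np.squeeze(cost))        # reduce to a plain scalar
```

Dropping the second term (or the division by m) shifts the result, which is consistent with the cost coming out too large in the reported assertion error.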

@TMosh Thanks for pointing that out. This is resolved. :slight_smile: