# Course 5 Week 2 Asn 2 Exercise 2 --> Initialize cost inside of the loop over epochs

Hello Mentors –

I believe that in Course 5 Week 2 Assignment 2, the cost should be initialized to 0 inside the for loop over epochs. Otherwise, the cost accumulates across epochs (notice in the tests that accuracy gets higher, which is good, but the cost also gets higher, which means something is wrong).

– A112

If your code works correctly, the cost should not increase.

There is also a question about how the published notebook defines the cost, but that is a separate topic for the course staff to resolve.

I think you are right if you want to watch how the cost decreases with every epoch (or every 100 epochs). That should be a general requirement. (On the other hand, someone may want to see how the total cost saturates…)

In any case, it may be better not to touch this…

```python
# Initialize cost. It is needed during grading.
cost = 0
```

I think resetting “cost” at every iteration should not be harmful. I just confirmed that the grader still works even if I insert code to reset “cost”.

There are two possible implementations for the cost calculation.

One is to calculate the cost for a single training example, so the cost is overwritten every time. That is Tom’s first reply, I think.
The other is to accumulate the cost over all training examples within one epoch. This fits the print statement below, which shows the epoch number and cost. But if the cost value is accumulated, there is no place to reset it.
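To make the two options concrete, here is a minimal sketch. The loss function and the toy labels/predictions below are my own stand-ins (assumptions), not the assignment's actual data; only the loop structure is the point.

```python
import numpy as np

def cross_entropy(y_hat, y):
    """Hypothetical per-example loss: negative log-likelihood of the true class."""
    return float(-np.sum(y * np.log(y_hat)))

# Toy one-hot labels and predictions standing in for the assignment's data (assumption).
Y = np.eye(3)
Y_hat = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.8, 0.1],
                  [0.2, 0.2, 0.6]])
m = Y.shape[0]

# Option 1: "cost" holds only the current example's loss; it is overwritten each time,
# so after the loop it reflects just the last example.
for i in range(m):
    cost = cross_entropy(Y_hat[i], Y[i])
last_example_cost = cost

# Option 2: "cost" accumulates over all examples of the epoch,
# so it must be reset to 0 before the inner loop.
cost = 0
for i in range(m):
    cost += cross_entropy(Y_hat[i], Y[i])
epoch_cost = cost
```

With Option 2, printing `cost` after the inner loop gives a per-epoch total, which matches the “epoch number and cost” print statement; Option 1 only ever shows the loss of the final training example.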

OK, here is the summary.

1. In this assignment, the usage of the cost is not well described. Tom raised an issue about this.
2. In this assignment, we are asked to implement the cost calculation. It should be an accumulation of the costs over all training examples in one epoch. But there is no way to reset it within the ## START CODE HERE ## and ## END CODE HERE ## tags. We need to insert one line like the one below.
```python
# Optimization loop
for t in range(num_iterations):     # Loop over the number of iterations (epochs)
    cost = 0                        # Reset the accumulated cost at the start of each epoch
    for i in range(m):              # Loop over the training examples
```

On balance, we think @A112 is right.

Here are my results, with progress updates printed every 10 epochs, the cost reset to 0 at the start of each epoch, and the cost accumulated via `cost += …`.
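For anyone who wants to reproduce the behavior end to end, here is a self-contained sketch of the pattern. A toy linear-softmax classifier on synthetic data stands in for the assignment's actual model (all names, shapes, and hyperparameters here are my own assumptions); the only point is that with the per-epoch reset in place, the accumulated cost decreases across epochs instead of growing without bound.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - np.max(z))
    return e / e.sum()

# Synthetic, linearly separable-ish data (assumption): 4 features, 3 classes.
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 4))
W_true = rng.normal(size=(3, 4))
labels = np.argmax(X @ W_true.T, axis=1)
Y = np.eye(3)[labels]                  # one-hot labels
m = X.shape[0]

W = np.zeros((3, 4))                   # model parameters (toy stand-ins)
b = np.zeros(3)
lr = 0.1
costs = []

for t in range(50):                    # loop over epochs
    cost = 0                           # reset the accumulator at the start of each epoch
    for i in range(m):                 # loop over training examples
        a = softmax(W @ X[i] + b)
        cost += -np.sum(Y[i] * np.log(a))   # accumulate the per-example loss
        dz = a - Y[i]                       # gradient of the loss w.r.t. the logits
        W -= lr * np.outer(dz, X[i])        # SGD update
        b -= lr * dz
    if t % 10 == 0:
        print(f"Epoch: {t} --- cost = {cost:.4f}")
    costs.append(cost)
```

Without the `cost = 0` line inside the epoch loop, `costs` would be monotonically increasing even as the model improves, which is exactly the symptom @A112 described.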