Is there any change to C1_W_Assignment?

I wrote my code in Exercise 5 and it worked.
However, my cost after each iteration differs slightly from the expected output.


Has there been any change?

Hi @rlaskan95 ,

Normally, if there is a change to the assignment, an update notice giving details of the change will be displayed when the lab is opened.

The output values presented here are very different from the expected output, and the value of b is much too far off.
Try restarting the kernel and clearing all output, then rerun the code from the start. Hopefully that will sort out the problem.

Thank you for your answer.
First of all, I have tried running this cell many times, but my bias did not change. The score also did not change: 90 / 100 (Exercise 5: 10 / 20).


I don't know what is happening. I have confirmed that there is no parameter initialization in w3_tools.py. As in the expected setup, my bias is initialized in:

import numpy as np

# GRADED FUNCTION: initialize_parameters

def initialize_parameters(n_x, n_y):
    """
    Returns:
    params -- python dictionary containing your parameters:
                    W -- weight matrix of shape (n_y, n_x)
                    b -- bias value set as a vector of shape (n_y, 1)
    """
    
    ### START CODE HERE ### (~ 2 lines of code)
    W = np.random.randn(n_y, n_x) * 0.01  # small random weights
    b = np.zeros((n_y, 1))                # zero bias vector
    ### END CODE HERE ###
    
    assert (W.shape == (n_y, n_x))
    assert (b.shape == (n_y, 1))
    
    parameters = {"W": W,
                  "b": b}
    
    return parameters
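
For what it's worth, calling it with small hypothetical dimensions (the sizes below are just for illustration) returns the expected shapes, and b starts at zero:

params = initialize_parameters(n_x=2, n_y=1)
print(params["W"].shape)  # (1, 2)
print(params["b"].shape)  # (1, 1)
print(params["b"])        # [[0.]] -- bias starts at zero, as expected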

How can I solve this problem?

Hi @rlaskan95

Apart from initialize_parameters(), did your forward_propagation() and compute_cost() pass their unit tests?


Sure

Hi @rlaskan95,

Please download your lab in ipynb format and attach it to a DM to me, I will have a look for you.


Hi @rlaskan95 ,

In forward_propagation(), Z was calculated without adding the bias term. Please check the formula.
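
For reference, a minimal sketch of what the forward pass should compute, assuming the single-layer linear model used in this lab (the function body below is illustrative, not the graded solution):

def forward_propagation(X, parameters):
    W = parameters["W"]
    b = parameters["b"]
    Z = np.matmul(W, X) + b  # the "+ b" term was missing
    Y_hat = Z                # linear activation
    return Y_hat

Without "+ b", the bias never affects the cost, so its gradient stays zero and b never moves away from its initial value.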

Thank you so much @Kic. I had missed that basic formula.
The issue is now solved.