Course 3, Week 1, Anomaly Detection Programming Assignment

Hi,

I am getting a weird error when running my code with a NumPy array to calculate the mean (mu) and the variance (var) in the assignment. I am not sure how to debug this, since my code appears to calculate the values correctly. I have no clue what the stack trace is reporting, in particular this line:

AssertionError: Wrong value for mu. Expected: [1, 2, 3] got: [1. 2. 0.]

Here are the unit test results:

mu[0]: 14.1122257839456
mu[1]: 14.99771050813621
sum_var0/m  562.6178439428168 307 1.8326314134945172
sum_var1/m  524.891816564435 307 1.7097453308287784
var[0]: 1.8326314134945172
var[1]: 1.7097453308287784
END 
Mean of each feature: [14.11222578 14.99771051  0.        ]
Variance of each feature: [1.83263141 1.70974533 0.        ]
mu[0]: 1.0
mu[1]: 2.0
sum_var0/m  0.0 3 0.0
sum_var1/m  0.0 3 0.0
var[0]: 0.0
var[1]: 0.0
END 
---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
<ipython-input-14-485cf4fdd5ec> in <module>
      7 # UNIT TEST
      8 from public_tests import *
----> 9 estimate_gaussian_test(estimate_gaussian)

~/work/public_tests.py in estimate_gaussian_test(target)
     44     assert type(var) == np.ndarray, f"Wrong shape for var. Expected: {(X.shape[1],)} got: {var.shape}"
     45 
---> 46     assert np.allclose(mu, [1., 2., 3.]), f"Wrong value for mu. Expected: {[1, 2, 3]} got: {mu}"
     47     assert np.allclose(var, [0., 0., 0.]), f"Wrong value for var. Expected: {[0, 0, 0]} got: {var}"
     48 

AssertionError: Wrong value for mu. Expected: [1, 2, 3] got: [1. 2. 0.]

Here’s how I defined my numpy arrays at the top of the program:

mu = np.array([0, 0, 0], dtype=float)
var = np.array([0, 0, 0], dtype=float)

Feel free to reply if you have any ideas.

Thanks

Defining your own test data isn’t effective, because the notebook code doesn’t know in advance what the correct results should be.

The failing test is included in the public_tests.py file; you can read it via the File → Open menu. It's in the estimate_gaussian_test() function.

Then you can see what data your code isn’t handling correctly.

Hello, @Francois_Poirier, did you initialize mu and var with zeros and then fill in the computed values one by one? If so, it seems to me that you were only filling in the first two values, at least for mu, which is why you got:

AssertionError: Wrong value for mu. Expected: [1, 2, 3] got: [1. 2. 0.]

which is saying that the test expects [1, 2, 3] for mu but your function returned [1, 2, 0]. See? The first two values are correct but not the third.
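To illustrate the kind of bug this suggests, here is a hypothetical sketch (the dataset and the hard-coded loop bound are made up for illustration, not taken from the assignment): a loop that iterates over a fixed number of features fills only those entries, leaving the rest at their initial zeros.

```python
import numpy as np

X = np.array([
    [1.0, 2.0, 3.0],
    [1.0, 2.0, 3.0],
])
m, n = X.shape          # m = 2 examples, n = 3 features

mu = np.zeros(n)
for j in range(2):      # BUG: hard-coded 2 instead of n
    mu[j] = np.sum(X[:, j]) / m

print(mu)               # [1. 2. 0.] -- the third entry was never filled

# Looping over range(n), or using np.mean(X, axis=0), fills every entry
mu_correct = np.mean(X, axis=0)
print(mu_correct)       # [1. 2. 3.]
```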

As for why the test expects these three values, you can follow Tom's reply and inspect the test's actual input; then you will see the reason.



I recommend opening the “hints” right below that exercise, where two approaches are suggested. In particular, if you try the second one, you will not need to initialize mu or var at all; instead, you will need to know how to use np.sum.

np.sum is a handy function that anyone working in machine learning will need to know. If you have not used it before, this is a good time to give it a try.

For example, you may run a code cell like below:

import numpy as np

X = np.array([
    [1, 1, 1], 
    [3, 5, 2], 
    [7, 6, 3],
])

print(...)  # replace the ... with a call to np.sum

and see if you can use np.sum to calculate the row-wise sum and the column-wise sum of X correctly.
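In case it helps, here is one possible way to do it with the axis argument of np.sum (this is just a sketch of how np.sum works on the toy array above, not the assignment solution):

```python
import numpy as np

X = np.array([
    [1, 1, 1],
    [3, 5, 2],
    [7, 6, 3],
])

# axis=1 sums across the columns, giving one value per row
row_sums = np.sum(X, axis=1)
print(row_sums)     # [ 3 10 16]

# axis=0 sums down the rows, giving one value per column
col_sums = np.sum(X, axis=0)
print(col_sums)     # [11 12  6]
```

Notice that the axis you pass is the one that gets summed away, so the result has one entry per remaining axis.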

Good luck to you, @Francois_Poirier :wink:

Cheers,
Raymond

I have edited your post for proper code formatting. You may edit your post to see what changes I have made.