Bug in week 1 practice lab unit tests?

I am working on this assignment
First practice lab for week 1

I think there is a bug in this unit test

# UNIT TESTS

test_c2(my_dense)

I have been trying to debug the code it is testing, and have tried the following print statements (shown without revealing my solution):

        print(W[:,unit])
        print(a_in)
        print(np.dot([0.1,0.4],[0.1,0.2]))
        print(np.dot(np.array([0.1,0.4]),np.array([0.1,0.2])))
        print(0.1*0.1+0.4*0.2)
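For reference, here is what that column slice looks like on the lab's quick-check weights (a sketch using the provided W_tst values, not my graded code):

```python
import numpy as np

# W_tst from the lab's quick check: 2 input features x 3 units
W = 0.1 * np.arange(1, 7, 1).reshape(2, 3)
print(W)
# [[0.1 0.2 0.3]
#  [0.4 0.5 0.6]]

# W[:, unit] picks out the column of weights feeding one unit
print(W[:, 0])  # [0.1 0.4]
```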

The following cell runs this

# Quick Check
x_tst = 0.1*np.arange(1,3,1).reshape(2,)  # (1 examples, 2 features)
W_tst = 0.1*np.arange(1,7,1).reshape(2,3) # (2 input features, 3 output features)
b_tst = 0.1*np.arange(1,4,1).reshape(3,)  # (3 features)
A_tst = my_dense(x_tst, W_tst, b_tst, sigmoid)
print(A_tst)
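For anyone following along, the quick check's expected output can be reproduced directly (a sketch assuming the standard logistic sigmoid, not necessarily the course's implementation):

```python
import numpy as np

def sigmoid(z):
    # standard logistic function (assumed; the lab provides its own)
    return 1 / (1 + np.exp(-z))

x_tst = 0.1 * np.arange(1, 3, 1).reshape(2,)
W_tst = 0.1 * np.arange(1, 7, 1).reshape(2, 3)
b_tst = 0.1 * np.arange(1, 4, 1).reshape(3,)

# each output j is sigmoid(a_in . W[:, j] + b[j])
expected = sigmoid(x_tst @ W_tst + b_tst)
print(expected)  # [0.54735762 0.57932425 0.61063923]
```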

When I run the following cell, the first few lines of my output are

[0.1 0.4]
[0.1 0.2]
0.09000000000000001
0.09000000000000001
0.09000000000000001

These are all correct answers for this dot product, and clearly the dot product runs successfully on non-square shapes.

But when I run the unit test, I get this error

AssertionError                            Traceback (most recent call last)
<ipython-input-33-b4cce73b9594> in <module>
      1 # UNIT TESTS
      2 
----> 3 test_c2(my_dense)

~/work/public_tests.py in test_c2(target)
     41     assert A_tst.shape[0] == len(b_tst)
     42     assert np.allclose(A_tst, [10., 20.]), \
---> 43         "Wrong output. Check the dot product"
     44 
     45     b_tst = np.array([3., 5.])  # (2 features)

AssertionError: Wrong output. Check the dot product

I’ve tried renaming my variables in case that was the issue, but I can’t pin down exactly what is triggering the assertion error. I have since restored my original variable names.

This is definitely the correct dot product, especially after checking the math against the printed output. The trailing 1 at the end of the decimal is roundoff error, which is simply a fact of how computers do arithmetic and shouldn’t make a difference to a well-constructed unit test.
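This is exactly why well-constructed tests use tolerance-based comparisons rather than exact equality; a quick sketch:

```python
import numpy as np

# 0.1, 0.2, and 0.4 have no exact binary representation, so the
# computed sum lands a hair away from the decimal answer 0.09.
val = 0.1 * 0.1 + 0.4 * 0.2
print(val)                    # 0.09000000000000001
print(val == 0.09)            # False: exact comparison is too strict
print(np.isclose(val, 0.09))  # True: tolerance-based comparison passes
```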

I do believe this is a bug in the unit test. Can anyone help?

Edit: I just double-checked, and it is definitely two features. That is clear from the data itself, from the comments in the provided quick-check code, and from the unit test error.

Thanks,
Steven

The error is not necessarily with the dot product. The test case just offers that because it’s a common error.

It’s certainly not the only possible error though.

Now I’m trying this debugging statement

print(a_in.shape[0], len(b))

Recall that the definition of the function I am supposed to write includes those variables:

def my_dense(a_in, W, b, g):

So I did not define or modify them.

Running this code, also provided in the lab

# Quick Check
x_tst = 0.1*np.arange(1,3,1).reshape(2,)  # (1 examples, 2 features)
W_tst = 0.1*np.arange(1,7,1).reshape(2,3) # (2 input features, 3 output features)
b_tst = 0.1*np.arange(1,4,1).reshape(3,)  # (3 features)
A_tst = my_dense(x_tst, W_tst, b_tst, sigmoid)
print(A_tst)

I get this

2 3
2 3
2 3
[0.54735762 0.57932425 0.61063923]

It is apparent that the two dimensions are not equal.

However, the error I am seeing is

AssertionError                            Traceback (most recent call last)
<ipython-input-25-b4cce73b9594> in <module>
      1 # UNIT TESTS
      2 
----> 3 test_c2(my_dense)

~/work/public_tests.py in test_c2(target)
     41     assert A_tst.shape[0] == len(b_tst)
     42     assert np.allclose(A_tst, [10., 20.]), \
---> 43         "Wrong output. Check the dot product"
     44 
     45     b_tst = np.array([3., 5.])  # (2 features)

AssertionError: Wrong output. Check the dot product

Any idea why it seems to be failing on this line, or why this line is required at all?

assert A_tst.shape[0] == len(b_tst)

Based on the example that runs in the previous cell, it is clearly not true that the 0th dimension of a_in is supposed to match the length of b (they are 2 and 3 there). So why does this unit test assert that they should be equal?
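To see what the assert checks in isolation, here it is replayed on the quick-check values (a sketch assuming the standard logistic sigmoid rather than the course's). Note that A_tst in the test is the returned output of the layer, not the input a_in, and a dense layer returns one activation per bias:

```python
import numpy as np

def sigmoid(z):
    # assumed logistic sigmoid; the lab provides its own
    return 1 / (1 + np.exp(-z))

x_tst = 0.1 * np.arange(1, 3, 1).reshape(2,)   # a_in: n = 2 features
W_tst = 0.1 * np.arange(1, 7, 1).reshape(2, 3)
b_tst = 0.1 * np.arange(1, 4, 1).reshape(3,)   # j = 3 units

A_tst = sigmoid(x_tst @ W_tst + b_tst)         # the layer's output

print(x_tst.shape[0], len(b_tst))              # 2 3: input n vs. unit count j
print(A_tst.shape[0] == len(b_tst))            # True: one output per unit
```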

Edit

I see the line it is actually failing on is

assert np.allclose(A_tst, [10., 20.]),

numpy allclose

This tests whether all the numbers are “close enough” within a tolerance.

But if the dot product were expected to be taken along the wrong dimension (which would presumably require a square array), wouldn’t the numbers fail to match anyway? The first of the two asserts seems internally inconsistent, and also wrong given the definitions provided in the docstring of the routine we are supposed to write:

    """
    Computes dense layer
    Args:
      a_in (ndarray (n, )) : Data, 1 example 
      W    (ndarray (n,j)) : Weight matrix, n features per unit, j units
      b    (ndarray (j, )) : bias vector, j units  
      g    activation function (e.g. sigmoid, relu..)
    Returns
      a_out (ndarray (j,))  : j units
    """

Steven

a_in.shape[0] = n and len(b) = j, so the only way they can match is if n = j (in which case W is a square matrix, although, as requested, I am not using matrix methods; I am using the dot product).

If you multiply a square matrix W by a, W@a is in general not equal to transpose(W)@a. So even if n = j, it doesn’t work to just swap the direction of the dimensions. As a straightforward example,

(sorry, LaTeX’s amsmath isn’t formatting here, so I’m not sure how to do the markup for this)

[[1, 1], [0, 0]] @ [[1, 0], [2, 0]]

is equal to

[[3, 0], [0, 0]]

while if I take the transpose of the first matrix and multiply it by the second, I get

[[1, 0], [1, 0]] @ [[1, 0], [2, 0]]

which is

[[1, 0], [1, 0]]

So in general, even though square matrices and their transposes share the same dimensions, the numerical results are not the same when one is used in place of the other.

So if n must equal j, that is the special case of a square matrix, and even then it is important not to take the transpose of W.
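The worked example above in runnable form (hypothetical matrices, just to illustrate the point):

```python
import numpy as np

W = np.array([[1, 1], [0, 0]])
A = np.array([[1, 0], [2, 0]])

# W @ A and W.T @ A differ even though W is square
print(W @ A)
# [[3 0]
#  [0 0]]
print(W.T @ A)
# [[1 0]
#  [1 0]]
```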

Okay, I see the issue. The course provides its own libraries, and I guess in forgetting about that I misread the question.

Thank you!

Hi @s-dorsher,

Please feel free to DM me your code if you still need help with debugging.

Thanks, but I solved week 1 and am on week 2. I’m busy with Christmas preparations and celebrations though! Happy holidays if you celebrate! I am sure I’ll be back to it soon.
