Hello,
I completed my assignment and successfully passed all the Python tests with the message 'All tests passed'. However, upon grading, I received 87%, and noticed that there are several matrices for which the function did not work as expected. It's a bit confusing because I didn't receive any warning messages to indicate where my mistakes might be, leaving me unsure about where I went wrong. I wanted to copy and paste the matrices from the error message into my notebook and run the function to check the results, but not all the numbers are displayed. Can someone help me?
Hi @sabinaL,
Try to debug your code by ensuring it works on all matrices. Here are a few tips that might help:
- Check Edge Cases
- Avoid Hard-Coded Values
- Print Intermediate Values
- Follow the instructions in the notebook
If you follow these guidelines, you should be able to pass the Coursera grader without any difficulties.
Hope this helps! Feel free to ask if you need further assistance, and if you feel stuck, don’t hesitate to share your notebook via private message for more specific help!
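To make the "edge cases" and "print intermediate values" tips concrete, here is a minimal sketch. The function and matrices are purely hypothetical stand-ins for an assignment function, just to illustrate the kind of scratch-cell probing that often exposes a bug the notebook tests miss:

```python
import numpy as np

def row_scale(M, row, factor):
    # Toy stand-in for an assignment function (hypothetical, for illustration only)
    out = M.copy().astype(float)
    out[row] = out[row] * factor
    return out

# Edge cases worth probing: a 1x1 system, an all-zero matrix, tiny values near zero
for M in [np.array([[2.0]]), np.zeros((3, 3)), np.full((2, 2), 1e-9)]:
    result = row_scale(M, 0, 0.5)
    # Print intermediate values to see exactly what the function did
    print(M.shape, "->")
    print(result)
```

Running each candidate input through the function and printing the result makes it much easier to spot which class of matrix breaks your logic.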
Yes, I would start by investigating your `back_substitution` code and fixing that first. `gaussian_elimination` calls `back_substitution`, so a bug in the latter will cause the former to fail the grader as well.
Your code passes a lot of test cases, but evidently the one it fails is different from the others. I'm not sure I'm reading your image correctly, but it looks like the input is 5 x 7, which wouldn't make any sense. Try looking at the grader output in as much detail as you can to confirm what the shapes are.
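One quick way to check whether a suspicious input even makes sense is a shape sanity check in a scratch cell. This is a hypothetical example using the 5 x 7 shape mentioned above, just to show the idea:

```python
import numpy as np

# Hypothetical input with the suspicious shape seen in the grader output
M = np.zeros((5, 7))

# An augmented matrix for an n-equation, n-unknown system should be n x (n + 1)
n_rows, n_cols = M.shape
if n_cols != n_rows + 1:
    print(f"unexpected shape {M.shape}: expected ({n_rows}, {n_rows + 1})")
```

A check like this at the top of your function (or just as a print) makes shape mismatches from the grader immediately visible.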
Hi @sabinaL ,
This is very interesting, as I have almost exactly the same problem (also in `back_substitution`): I passed the unit tests in the notebook but failed the submission assessment. Did you get anywhere? I'm having trouble reading the output from the assessment to build some unit tests of my own to try and narrow down the problem.

… and my last test failed completely

[EDIT]
So this is weird: with no changes whatsoever, I resubmitted and got 87%, with 24/30 for the final test this time, even with some failed test cases. I'm going to take this as a win for now.
Maybe it would be worth looking at your code to get more information. We normally don't do that in a public thread, though. I will send you and Sabina a DM about that.
Thank you very much @paulinpaloalto and @Alireza_Saei for your replies. @Stevehh250 I sent a DM with my notebook to @Alireza_Saei and @paulinpaloalto, and I am waiting for feedback, because I really don't understand where the error is.
@sabinaL The error is in `back_substitution`. You get points for knowing more about the numpy APIs than I do for finding the function `np.flatnonzero`; I had never heard of it before. But apparently the way you use it does not work. Reading the documentation, the only potential problem I can see with your use is that you fail to take into account the fact that we are dealing with an augmented matrix here. But when I enhance your logic using `flatnonzero` to take that into account, it still fails the grader.
The point is that we already have a function earlier in this notebook that does precisely what we need: `get_index_first_non_zero_value_from_row`. If you use that instead, everything works and passes the grader.
Now there are several unresolved questions here that I want to figure out:
- Why do the notebook test cases pass with the implementation that fails the grader?
- Why doesn't `flatnonzero` work if I truncate the augmented element from the row?
I will continue to look at this to try to understand those issues, but at least there is a clear way for you to pass this assignment.
I am getting this ^ as well.
I thought I was going crazy.
I believe there is an unsolvable test case for back substitution in the grader, i.e.:
```python
#!/usr/bin/env python3
import numpy as np

if __name__ == "__main__":
    # I believe this is a test case in the grader for
    # [Linear Algebra for Machine Learning and Data Science]:
    # Week 2 Gaussian Elimination, Exercise 2, Back Substitution
    A = np.array(
        [
            [1.0, 0.5, 8.25, 2.5, 12.0],
            [0.0, 1.0, 12.83333333, 5.22222222, 29.44444444],
            [0.0, 0.0, 1.0, -0.0404463, 0.76638773],
            [-0.0, -0.0, -0.0, 1.0, 3.88107807],
        ]
    )
    B = np.array(
        [
            [6.5],
            [16.44444444],
            [0.15132497],
            [2.62950719],
        ]
    )
    # How do I solve this when dealing with a non-square matrix?
    # Why is this included in the grader as a test?
    solution = np.linalg.solve(A, B)
```
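For what it's worth, `np.linalg.solve` rejects that call outright, because `A` is 4 x 5 and `solve` requires a square coefficient matrix. `np.linalg.lstsq` is the standard numpy route for non-square systems; it returns a least-squares answer, which is not what the assignment asks for, but it confirms why `solve` fails here:

```python
import numpy as np

A = np.array(
    [
        [1.0, 0.5, 8.25, 2.5, 12.0],
        [0.0, 1.0, 12.83333333, 5.22222222, 29.44444444],
        [0.0, 0.0, 1.0, -0.0404463, 0.76638773],
        [-0.0, -0.0, -0.0, 1.0, 3.88107807],
    ]
)
B = np.array([[6.5], [16.44444444], [0.15132497], [2.62950719]])

# solve requires a square coefficient matrix, so the 4 x 5 A raises LinAlgError
try:
    np.linalg.solve(A, B)
except np.linalg.LinAlgError as err:
    print("LinAlgError:", err)

# lstsq accepts non-square systems and returns a least-squares solution
x, residuals, rank, singular_values = np.linalg.lstsq(A, B, rcond=None)
print("x.shape:", x.shape)  # one value per column of A, i.e. (5, 1)
```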
I don’t have access to the source code for the graders in this course. Where did you get this information?
The grader spit out the test cases, the same as what Stevehh250 posted above. I looked at the output, grep’ed and awk’d this particular test case, and noticed it was not solvable (I believe).
We have recently seen two cases of students getting 87/100 on this assignment, and in both cases their code used an alternative to calling the "helper" function that we are given in this notebook at the appropriate point in `back_substitution`: `get_index_first_non_zero_value_from_row`.
Well, I'm not sure what their real point is with this test case, but if I just run it with my version of `back_substitution` (which passes the grader), here is what happens:
```python
A = np.array(
    [
        [1.0, 0.5, 8.25, 2.5, 12.0],
        [0.0, 1.0, 12.83333333, 5.22222222, 29.44444444],
        [0.0, 0.0, 1.0, -0.0404463, 0.76638773],
        [-0.0, -0.0, -0.0, 1.0, 3.88107807],
    ]
)
B = np.array(
    [
        [6.5],
        [16.44444444],
        [0.15132497],
        [2.62950719],
    ]
)
M = augmented_matrix(A, B)
print(f"M.shape {M.shape}")
backout = back_substitution(M)
print(f"backout\n{backout}")
```
Running that produces this output:

```
M.shape (4, 6)
backout
[-1.9024659 -0.59430445 0.25767881 2.62950719]
```
What do you see when you run that test? You’re right that this doesn’t make sense in terms of the mathematics here, but the test runs and produces a result.
Since we can’t see what the grader is actually doing, maybe we need to be a bit more creative. Note that A as it stands is 4 x 5, which could be a legitimate input.
If I do this set of tests:

```python
M = augmented_matrix(A, B)
print(f"M.shape {M.shape}")
backoutM = back_substitution(M)
print(f"backoutM\n{backoutM}")
backoutA = back_substitution(A)
print(f"backoutA\n{backoutA}")
backoutB = back_substitution(B)
print(f"backoutB\n{backoutB}")
```
I get this with code that passes the grader:

```
M.shape (4, 6)
backoutM
[-1.9024659 -0.59430445 0.25767881 2.62950719]
backoutA
[-3.98382346 -2.67323258 0.92336298 3.88107807]
backoutB
[-4.54254775e+02 -3.34039195e+01 -2.46585127e-01 2.62950719e+00]
```
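One way to make sense of the `backoutA` numbers: if `back_substitution` treats the last column of whatever it's given as the constants, then calling it on `A` alone amounts to solving the square 4 x 4 upper-triangular system in the first four columns. That's easy to verify with plain numpy, independent of the notebook code:

```python
import numpy as np

A = np.array(
    [
        [1.0, 0.5, 8.25, 2.5, 12.0],
        [0.0, 1.0, 12.83333333, 5.22222222, 29.44444444],
        [0.0, 0.0, 1.0, -0.0404463, 0.76638773],
        [-0.0, -0.0, -0.0, 1.0, 3.88107807],
    ]
)

# Treat A's last column as the constants: solve the square triangular system
x = np.linalg.solve(A[:, :4], A[:, 4])
print(x)  # reproduces the backoutA values above
```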
Thank you so much! It worked. However, I have the same questions as you. Please let me know if you find the answers. Thanks again for your help!
Ok, I spent a bit more time on this, and there are three ways in which your implementation based on `np.flatnonzero` is different from the function they built for us:
1. Your version does not handle the fact that the input matrix is augmented.
2. Your version does not handle returning -1 if the full row (not counting the augmented element) is all zeros.
3. The version they gave us considers any number with an absolute value < 10^{-5} to be equivalent to zero.
It’s easy to remedy issues 1) and 2). I built a version of the function that does that:
```python
# implementation based on code by sabina
def get_index_non_zero(M, row, augmented=False):
    # Create a copy to avoid modifying the original matrix
    M = M.copy()
    # If it is an augmented matrix, then ignore the constant values
    if augmented:
        # Isolate the coefficient matrix (remove the constants column)
        M = M[:, :-1]
    # Get the desired row
    row_array = M[row]
    index_array = np.flatnonzero(row_array)
    if len(index_array) == 0:
        return -1
    index = index_array[0]
    return index
```
But it turns out that this still fails the grader if you use it in place of the helper function they gave us, because the grader has test cases that depend on the behavior w.r.t. values < 10^{-5}. So that's the main reason your code does not work. You can see how they deal with that using `np.isclose` by studying the given code for the function `get_index_first_non_zero_value_from_row`.
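To see the difference concretely, here is a standalone sketch (not the notebook's actual helper): `np.flatnonzero` treats any value that is not exactly 0.0 as nonzero, while an `np.isclose`-style check with `atol=1e-5` also skips tiny values:

```python
import numpy as np

# First entry is "numerically zero" (below 1e-5) but not exactly 0.0
row = np.array([1e-7, 0.0, 3.0])

# flatnonzero reports index 0, because 1e-7 != 0 exactly
print(np.flatnonzero(row)[0])   # 0

# an isclose-based search skips values within atol=1e-5 of zero
mask = ~np.isclose(row, 0.0, atol=1e-5)
print(np.flatnonzero(mask)[0])  # 2
```

That single discrepancy is enough to make the two implementations disagree on which pivot column to use whenever a grader matrix contains near-zero entries.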
Got my assignment to submit with 100, as well.
Literally did a back flip.
Thank you, Paul.