C1_W3_Logistic_Regression_UNQ_C3

Hello Everyone,

I am currently working on UNQ_C3 and my code produces the expected output. However, the unit test compute_gradient_test(compute_gradient) is failing.

Here is my screenshot showing the AssertionError.

Your code does not work correctly for the test cases that are in the “compute_gradient_test()” function.

You can inspect these tests by opening the File menu and finding the .py file that was imported in the first cell of the notebook.

Hello @Skkiles

Yea… Your compute_gradient could not compute the correct dj_db value for the test case, and Tom’s approach is the best one here, so I checked out the test script too. If you do the same, you will find that the X in the test case is a set of random numbers.

Normally I would suggest that learners add a print after every line of code, so that each print shows the result of the previous line. That way we can compare each printed result with what we calculate by hand and find out which line first goes wrong.

But with X being random numbers, hand calculation obviously becomes a bit harder, though not impossible, so maybe you can try that. Alternatively, you may temporarily add a line that rounds X off to, say, 3 decimal places so that the arithmetic is easier, as in the snippet below.
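A minimal sketch of what I mean (X_test here is just a hypothetical stand-in for whatever array the test builds; the only line that matters is the np.round call, and it should be removed once the bug is found):

import numpy as np

# Hypothetical stand-in for the test case's random inputs
np.random.seed(1)
X_test = np.random.randn(7, 3)

# Temporary debugging aid: 3 decimal places make hand arithmetic manageable
X_test = np.round(X_test, 3)
print(X_test)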

If you do that, I am sure you can spot the problem in 5 or 10 minutes.

Cheers,
Raymond

Hello TMosh,
Thanks for responding to my discussion post. I have downloaded the .py file and opened it in Notepad for reference, which is very convenient for future use. Additionally, I like how the expected-output sections are written in HTML formatting using open and closed tags and attributes. However, this does not tell me how to resolve the AssertionError.

I recommend you go through the calculation by hand, since you have the data that the test is using from the .py file.

Then you can compare your results vs those that your code is generating. This will allow you to find the error.
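For the hand check, the quantities to reproduce are the standard logistic regression gradients from the lectures (restated here from memory; they should match the formulas in the notebook’s markdown):

f_{w,b}(x^{(i)}) = \frac{1}{1 + e^{-(w \cdot x^{(i)} + b)}}

\frac{\partial J}{\partial b} = \frac{1}{m} \sum_{i=0}^{m-1} \left( f_{w,b}(x^{(i)}) - y^{(i)} \right)

\frac{\partial J}{\partial w_j} = \frac{1}{m} \sum_{i=0}^{m-1} \left( f_{w,b}(x^{(i)}) - y^{(i)} \right) x^{(i)}_j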

Hello rmwkwok,
Thanks for getting back to me regarding this matter. I like the idea of placing a print statement between lines of code; I use the same approach in other programming languages, such as C# or C++ with MessageBox.Show(), or the print function in Java. Tracking the origin of the error in UNQ_C3 this way is worth a shot. Unfortunately, I have already tried a similar approach by stepping through every line of code against the recommended code hints, and nothing turned up, as I am still getting the expected output. Moreover, I have also reviewed other methods of resolving the assertion error via YouTube, but those would force me to alter some of the cells inside the notebook.


Check your personal messages for instructions.

Hello @Skkiles,

No problem. I added the prints like below, based on the code skeleton provided by the lab:
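Here is a sketch of the instrumented function (I am paraphrasing the lab’s code skeleton from memory, so the exact signature and variable names in your notebook may differ slightly; the print lines are the only additions and should be deleted once you are done):

import numpy as np

def sigmoid(z):
    # The lab provides an equivalent sigmoid helper.
    return 1 / (1 + np.exp(-z))

def compute_gradient(X, y, w, b):
    m, n = X.shape
    dj_dw = np.zeros(n)
    dj_db = 0.0

    print("input X =", X)
    print("input y =", y)
    print("input w =", w)
    print("input b =", b)

    for i in range(m):
        # Accumulate w . x^(i) one feature at a time, then add b
        z_wb = 0
        print(f"Sample {i} z_wb = {z_wb}")
        for j in range(n):
            z_wb += w[j] * X[i, j]
            print(f"Sample {i} feature {j} z_wb = {z_wb}")
        z_wb += b
        print(f"Sample {i} z_wb = {z_wb}")

        # Prediction and the per-sample error term
        f_wb = sigmoid(z_wb)
        print(f"Sample {i} f_wb = {f_wb}")
        dj_db_i = f_wb - y[i]
        print(f"Sample {i} dj_db_i = {dj_db_i}")

        dj_db += dj_db_i
        print(f"Sample {i} dj_db = {dj_db}")
        for j in range(n):
            dj_dw[j] += dj_db_i * X[i, j]
            print(f"Sample {i} feature {j} dj_dw = {dj_dw}")

    # Average over the m samples
    dj_dw = dj_dw / m
    dj_db = dj_db / m
    print("dj_dw =", dj_dw)
    print("dj_db =", dj_db)
    return dj_db, dj_dw

# Inputs that reproduce the trace below; the printed X happens to match
# np.random.seed(1) followed by np.random.randn(7, 3).
np.random.seed(1)
X_t = np.random.randn(7, 3)
y_t = np.array([1, 0, 1, 0, 1, 1, 0])
w_t = np.array([1.0, 0.5, -0.35])
b_t = 1.7
compute_gradient(X_t, y_t, w_t, b_t)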

And below are my results for your cross-check. Luckily we don’t have many lines here, so we can follow them line by line; otherwise, jumping straight to the suspect lines would be an alternative.

===========Begin=============
input X = [[ 1.62434536 -0.61175641 -0.52817175]
 [-1.07296862  0.86540763 -2.3015387 ]
 [ 1.74481176 -0.7612069   0.3190391 ]
 [-0.24937038  1.46210794 -2.06014071]
 [-0.3224172  -0.38405435  1.13376944]
 [-1.09989127 -0.17242821 -0.87785842]
 [ 0.04221375  0.58281521 -1.10061918]]
input y = [1 0 1 0 1 1 0]
input w = [ 1.    0.5  -0.35]
input b = 1.7
Sample 0 z_wb = 0
Sample 0 feature 0 z_wb = 1.6243453636632417
Sample 0 feature 1 z_wb = 1.318467156838204
Sample 0 feature 2 z_wb = 1.5033272701304135
Sample 0 z_wb = 3.2033272701304134
Sample 0 f_wb = 0.9609592964541614
Sample 0 dj_db_i = -0.03904070354583855
Sample 0 dj_db = -0.03904070354583855
Sample 0 feature 0 dj_dw = [-0.06341559  0.          0.        ]
Sample 0 feature 1 dj_dw = [-0.06341559  0.0238834   0.        ]
Sample 0 feature 2 dj_dw = [-0.06341559  0.0238834   0.0206202 ]
Sample 1 z_wb = 0
Sample 1 feature 0 z_wb = -1.0729686221561705
Sample 1 feature 1 z_wb = -0.6402648074938313
Sample 1 feature 2 z_wb = 0.16527373641426757
Sample 1 z_wb = 1.8652737364142675
Sample 1 f_wb = 0.865910461610504
Sample 1 dj_db_i = 0.865910461610504
Sample 1 dj_db = 0.8268697580646654
Sample 1 feature 0 dj_dw = [-0.99251034  0.0238834   0.0206202 ]
Sample 1 feature 1 dj_dw = [-0.99251034  0.77324892  0.0206202 ]
Sample 1 feature 2 dj_dw = [-0.99251034  0.77324892 -1.97230624]
Sample 2 z_wb = 0
Sample 2 feature 0 z_wb = 1.74481176421648
Sample 2 feature 1 z_wb = 1.3642083137689287
Sample 2 feature 2 z_wb = 1.2525446301489442
Sample 2 z_wb = 2.952544630148944
Sample 2 f_wb = 0.9503836170616118
Sample 2 dj_db_i = -0.049616382938388215
Sample 2 dj_db = 0.7772533751262772
Sample 2 feature 0 dj_dw = [-1.07908159  0.77324892 -1.97230624]
Sample 2 feature 1 dj_dw = [-1.07908159  0.81101725 -1.97230624]
Sample 2 feature 2 dj_dw = [-1.07908159  0.81101725 -1.9881358 ]
Sample 3 z_wb = 0
Sample 3 feature 0 z_wb = -0.2493703754774101
Sample 3 feature 1 z_wb = 0.48168359304507696
Sample 3 feature 2 z_wb = 1.2027328413692557
Sample 3 z_wb = 2.9027328413692555
Sample 3 f_wb = 0.947981365801279
Sample 3 dj_db_i = 0.947981365801279
Sample 3 dj_db = 1.7252347409275561
Sample 3 feature 0 dj_dw = [-1.31548006  0.81101725 -1.9881358 ]
Sample 3 feature 1 dj_dw = [-1.31548006  2.19706833 -1.9881358 ]
Sample 3 feature 2 dj_dw = [-1.31548006  2.19706833 -3.94111081]
Sample 4 z_wb = 0
Sample 4 feature 0 z_wb = -0.3224172040135075
Sample 4 feature 1 z_wb = -0.5144443813477153
Sample 4 feature 2 z_wb = -0.9112636861651184
Sample 4 z_wb = 0.7887363138348815
Sample 4 f_wb = 0.6875599282748379
Sample 4 dj_db_i = -0.3124400717251621
Sample 4 dj_db = 1.412794669202394
Sample 4 feature 0 dj_dw = [-1.214744    2.19706833 -3.94111081]
Sample 4 feature 1 dj_dw = [-1.214744    2.3170623  -3.94111081]
Sample 4 feature 2 dj_dw = [-1.214744    2.3170623  -4.29534581]
Sample 5 z_wb = 0
Sample 5 feature 0 z_wb = -1.0998912673140309
Sample 5 feature 1 z_wb = -1.1861053710892486
Sample 5 feature 2 z_wb = -0.8788549248167685
Sample 5 z_wb = 0.8211450751832314
Sample 5 f_wb = 0.6944793537104562
Sample 5 dj_db_i = -0.3055206462895438
Sample 5 dj_db = 1.1072740229128502
Sample 5 feature 0 dj_dw = [-0.87870451  2.3170623  -4.29534581]
Sample 5 feature 1 dj_dw = [-0.87870451  2.36974268 -4.29534581]
Sample 5 feature 2 dj_dw = [-0.87870451  2.36974268 -4.02714194]
Sample 6 z_wb = 0
Sample 6 feature 0 z_wb = 0.04221374671559283
Sample 6 feature 1 z_wb = 0.333621353573504
Sample 6 feature 2 z_wb = 0.7188380655980264
Sample 6 z_wb = 2.4188380655980266
Sample 6 f_wb = 0.9182525665526068
Sample 6 dj_db_i = 0.9182525665526068
Sample 6 dj_db = 2.025526589465457
Sample 6 feature 0 dj_dw = [-0.83994163  2.36974268 -4.02714194]
Sample 6 feature 1 dj_dw = [-0.83994163  2.90491425 -4.02714194]
Sample 6 feature 2 dj_dw = [-0.83994163  2.90491425 -5.03778833]
dj_dw = [-0.11999166  0.41498775 -0.71968405]
dj_db = 0.28936094135220813

Cheers,
Raymond

Oh… Tom has sent you a personal message, but these prints should also help anyone who finds this thread.