Supervised Machine Learning: Regression and Classification / Module 3

Good evening everyone. On Coursera I am taking the course “Supervised Machine Learning: Regression and Classification”.

In the programming assignment “C1_W3_Logistic_Regression”, this error appears: “Cell #3. Can’t compile the student’s code. Error: OSError(‘data/ex2data1.txt not found.’)”, even though the file does exist.

Has this happened to anyone? How can I solve it?

I am attaching the programming file for completeness.

Thanks in advance

Gianluca

{mentor edit: file removed}

Please do not share your notebook. That’s not allowed by the Code of Conduct.
I’ll edit your message to remove the file.

If a mentor needs to see your notebook, we’ll contact you privately with instructions.

= = = =

What programming environment are you using? The Coursera Labs environment, or something else?

It appears that you added a new cell to the notebook.

Do not add any cells or insert any functions. It is not necessary.

The load_data() function is part of the “utils.py” module. It is imported in Cell 1 of the notebook.

So you did not need to insert that function separately.
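For reference, here is a minimal sketch of how the notebook already wires this up. The exact import line and file path are assumptions based on the standard lab layout, so check your own Cell 1 rather than copying this:

import numpy as np

from utils import *  # utils.py ships with the lab and defines load_data()

# load_data() reads the training data bundled with the assignment
X_train, y_train = load_data("data/ex2data1.txt")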

Tips:

  • Every time you open a notebook, you must run all of the cells starting from the top. That’s where the assets are imported, and the workspace is configured.
  • Only add your code to the places that are marked “### START CODE HERE ###”. Do not make any other changes.
  • Do not add, delete, or move any cells within the notebook.
  • Do not rename the notebook. The grader only uses the notebook with the original file name.

Thanks for the advice! Now I have another problem, in the compute_gradient exercise: the compute_gradient_test unit test fails.

The test always produces this error:

dj_db at test w and b: -0.5999999999991071
dj_dw at test w and b: [-44.831353617873795, -44.37384124953978]


---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
<ipython-input-...> in <module>
      8
      9 # UNIT TESTS
---> 10 compute_gradient_test(compute_gradient)

in compute_gradient_test(compute_gradient_function)
     18
     19 # Test assert
---> 20 assert np.isclose(dj_db, expected_dj_db, atol=1e-3), f"dj_db computed {dj_db}, expected {expected_dj_db}"
     21 assert np.allclose(dj_dw, expected_dj_dw, atol=1e-3), f"dj_dw computed {dj_dw}, expected {expected_dj_dw}"
     22

AssertionError: dj_db computed 0.47194683434663404, expected -0.6666666666666667

Why do you think this happens?

I think it’s the sigmoid function, but I don’t see any errors in it.

There isn’t much to go wrong in sigmoid(): it’s only one line of code, and you only need to be sure you’re using the function argument in the math, not a constant value.
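For comparison, here is a minimal sketch of that one-liner. The function and argument names are assumptions, not necessarily the assignment’s exact template:

import numpy as np

def sigmoid(z):
    # g(z) = 1 / (1 + e^(-z)); note the argument z is used, not a constant
    return 1 / (1 + np.exp(-z))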

The most common error in the compute_gradient() function concerns how your code uses indentation. That’s how Python defines blocks of code.
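To illustrate the indentation pitfall, here is a sketch of the intended loop structure. The signature, helper names, and return order are assumptions based on a generic logistic-regression gradient, not the assignment’s exact template:

import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def compute_gradient(X, y, w, b):
    # dj_db = (1/m) * sum(f_wb - y);  dj_dw[j] = (1/m) * sum((f_wb - y) * X[:, j])
    m, n = X.shape
    dj_db = 0.0
    dj_dw = np.zeros(n)
    for i in range(m):
        f_wb_i = sigmoid(np.dot(X[i], w) + b)  # prediction for example i
        err_i = f_wb_i - y[i]
        dj_db += err_i                    # must be indented inside the i-loop
        for j in range(n):
            dj_dw[j] += err_i * X[i, j]   # must be indented inside the j-loop
    # the division by m happens once, after both loops have finished
    return dj_db / m, dj_dw / m

If, for example, the division by m sits inside the loop, or one of the accumulation lines sits at the wrong depth, you get wrong but plausible-looking values like the ones in your traceback.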

You were right! Luckily, the bug is now fixed!

Thanks a lot for the interesting insights. Sometimes the simplest advice is the most powerful 🙂


Nice work!