Decision Tree Practice Lab Module 4

Hello Mentors,

I’ve been working hard to complete this Decision Tree lab and wanted to share the steps I took in case you can help me figure out why Cell #7 keeps failing after submission.

  1. I followed the Coursera notebook step-by-step and used the provided hints to write code for each of the 4 graded exercises (compute_entropy, split_dataset, compute_information_gain, and get_best_split).
  2. I ran the test cells after each exercise individually, and all the tests passed (including the expected output from Cell #8 for entropy: 1.0).
  3. After completing all exercises, I clicked Restart and Run All to cleanly re-run the entire notebook in top-to-bottom order.
  4. I then submitted the notebook, but I’m still getting an error in Cell #7:
    “AssertionError: Entropy must be 1 with same amount of ones and zeros”
  5. I also saved the notebook to a folder and can upload it here if needed.

Would it help if I added print() statements inside the graded cells? I ran each one independently before the final restart and all the results looked correct.

Thank you for taking a look — I’m just trying to understand what’s causing this error even though everything works when I run it manually. I also made sure I was working from the latest version of the notebook in the lab.

Best,
Mark

Passing the tests in the notebook does not prove your code is perfect. The notebook’s tests do not cover every possible error.

The grader uses entirely different tests.

Your compute_entropy() function has a flaw.
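For reference, here is the general shape of a binary-entropy function, computed directly from the standard definition. This is only a sketch (the variable names are my own, not the lab's reference code), but the key point is that it needs no hard-coded special cases, so it gives the right answer on any mix of labels the grader might feed it:

```python
import numpy as np

def compute_entropy(y):
    """Entropy (in bits) of a binary label array y containing 0s and 1s.

    Sketch of the general approach: compute the fraction of ones
    and apply H(p) = -p*log2(p) - (1-p)*log2(1-p).
    """
    y = np.asarray(y).flatten()
    if len(y) == 0:
        return 0.0
    p1 = np.mean(y)            # fraction of ones
    if p1 == 0 or p1 == 1:     # log2(0) is undefined; a pure node has entropy 0
        return 0.0
    return float(-p1 * np.log2(p1) - (1 - p1) * np.log2(1 - p1))
```

Any balanced array (same count of ones and zeros) yields exactly 1.0, not just `[0, 1]`.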

Hi TMosh,

Thanks again for your help earlier. I’m still stuck on Cell 7 in the Decision Tree lab. When I run this manually:

print(compute_entropy(np.array([0, 1]))) # returns 1.0

…it prints 1.0 as expected.

But the grader still fails with:

AssertionError: Entropy must be 1 with same amount of ones and zeros

Based on your earlier suggestion and what I’ve seen in the forum, I tried:

  • Flattening y at the start,
  • Returning 0.0 for empty input,
  • Returning 1.0 if y contains [0, 1] or [1, 0],
  • Making sure only one version of compute_entropy() exists,
  • Following the indentation tip in the hint section.
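To see whether that third bullet might itself be the problem, I compared a hard-coded shortcut like mine against the plain entropy formula on a slightly larger balanced array (the function names here are just mine, for this experiment):

```python
import numpy as np

def entropy_formula(y):
    # Textbook binary entropy, used here only as a sanity reference.
    y = np.asarray(y).flatten()
    if y.size == 0:
        return 0.0
    p1 = y.mean()
    if p1 in (0.0, 1.0):
        return 0.0
    return float(-p1 * np.log2(p1) - (1 - p1) * np.log2(1 - p1))

def entropy_hardcoded(y):
    # A special-cased version that only recognizes [0, 1] / [1, 0].
    y = list(np.asarray(y).flatten())
    if y in ([0, 1], [1, 0]):
        return 1.0
    return 0.0  # every other input falls through

balanced = np.array([0, 0, 1, 1])   # still the same amount of ones and zeros
print(entropy_formula(balanced))     # prints 1.0
print(entropy_hardcoded(balanced))   # prints 0.0, which would trip the assertion
```

So if the grader checks a balanced array other than `[0, 1]`, the special case silently returns the wrong value while the notebook’s own test still passes.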

Really appreciate your time and guidance!

Best,
Mark

Please check your personal messages for instructions.