Decision Tree programming assignment

For the first exercise, my function is returning entropy = 1, but I'm getting the following error:

AssertionError: Entropy must be 0 with array of ones

Hi @Tayler-Frances_Chapm
Don't always return 1. Instead, return 0 if either of the conditions below holds:

• For implementation purposes, `0 log2(0) = 0`. That is, if `p_1 = 0` or `p_1 = 1`, set the entropy to `0`
• Make sure to check that the data at a node is not empty (i.e. `len(y) != 0`). Return `0` if it is
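The two rules above can be sketched roughly like this — a minimal, hypothetical version of `compute_entropy()`, not the assignment's official solution, assuming `y` is a NumPy array of 0/1 labels:

```python
import numpy as np

def compute_entropy(y):
    """Binary entropy of a label array y (0s and 1s) -- illustrative sketch."""
    # Rule 2: an empty node has entropy 0
    if len(y) == 0:
        return 0.0

    p1 = np.mean(y)  # fraction of positive (1) labels

    # Rule 1: 0 * log2(0) is defined as 0, so p1 of 0 or 1 gives entropy 0
    if p1 == 0 or p1 == 1:
        return 0.0

    return -p1 * np.log2(p1) - (1 - p1) * np.log2(1 - p1)
```

With the edge-case checks in place, `compute_entropy(np.array([1] * 10))` returns `0.0`, which is what the unit test expects.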

Ok. I thought I did that correctly. I'm not really sure what I'm doing wrong.

Can I send you the errors?

You can post your error (not code) here.

@Tayler-Frances_Chapm, did you figure this out?

The error you’re getting is from the `compute_entropy_test` unit test, which you can look at by going to the File menu and opening public_test.py, but I’ll paste the specific test here for convenience:

```python
def compute_entropy_test(target):
    y = np.array([1] * 10)
    result = target(y)

    assert result == 0, "Entropy must be 0 with array of ones"
    ...
```

As @AbdElRhaman_Fakhry mentioned, the instructions say:

```
For implementation purposes, 0 log2(0) = 0. That is, if p_1 = 0 or p_1 = 1, set the entropy to 0
...
```

So, when an array of all 1’s is passed in, your `compute_entropy()` function should be returning 0. The unit test is finding that it’s not returning 0.
To help you figure out why, you can look at the hints, or temporarily add some print statements in your function to see why it’s not returning 0 in this case.
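To see concretely why the edge-case check matters, here is a deliberately incomplete stand-in (the name `compute_entropy_buggy` and its prints are illustrative, not your actual code) that applies the entropy formula without handling `p_1 = 1`:

```python
import numpy as np

# A buggy version that skips the edge-case check, to show why the
# unit test fails: np.log2(0) is -inf, and 0 * -inf evaluates to nan.
def compute_entropy_buggy(y):
    p1 = np.mean(y)
    print("p1 =", p1)  # temporary debugging print
    return -p1 * np.log2(p1) - (1 - p1) * np.log2(1 - p1)

y = np.array([1] * 10)
print(compute_entropy_buggy(y))  # nan (plus a RuntimeWarning), not 0
```

Since `nan == 0` is `False`, the assertion in `compute_entropy_test` fails. Print statements like the one above can show you which branch of your own function is (or isn't) being taken.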

Also, I noticed you have this recorded under Week 3. I’m going to move it to Week 4, since that’s where the Decision Tree assignment is. This will make it easier for future students to find it if they have a similar question.

@Wendy, this question has been answered via offline discussion.