Issue with information gain exercise 3

Hi,

I have an issue with the programming assignment C2W4 on decision trees (for the Advanced Learning Algorithms course).
I received a grade of 0%, although I have successfully done exercises 1, 2, 4, and 5 (it says “All tests passed”).
For exercise 3 (on information gain) it gives me an error message, although I compute the correct information gain for the three features.
I don’t understand it; I cannot find the mistake. Do you have any clue? Has anyone run into the same issue?

This is what I compute:
Information Gain from splitting the root on brown cap: 0.034851554559677034
Information Gain from splitting the root on tapering stalk shape: 0.12451124978365313
Information Gain from splitting the root on solitary: 0.2780719051126377

This is the error message:

AssertionError                            Traceback (most recent call last)
<ipython-input-...> in <module>
      9
     10 # UNIT TESTS
---> 11 compute_information_gain_test(compute_information_gain)

~/work/public_tests.py in compute_information_gain_test(target)
    101     node_indexes = list(range(4))
    102     result = target(X, y, node_indexes, 0)
--> 103     assert np.isclose(result, 0.311278, atol=1e-6), f"Wrong information gain. Expected {0.311278} got: {result}"
    104
    105     result = target(X, y, node_indexes, 1)

AssertionError: Wrong information gain. Expected 0.311278 got: 0.2822287189138014

This is the Expected Output:
Information Gain from splitting the root on brown cap: 0.034851554559677034
Information Gain from splitting the root on tapering stalk shape: 0.12451124978365313
Information Gain from splitting the root on solitary: 0.2780719051126377

Thank you
Sebastien

Hi Sebastien @Sebastien_Gac, welcome to our community!

Let’s focus on just exercise 3. According to what you have shared, you passed one check (your three information gain values match the expected output), but not all of the unit tests. Please check out this post for how you may debug your code.
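In particular, notice that the failing assertion calls your function with node_indexes = list(range(4)), i.e. only a subset of the samples, whereas the printed example happens to use every sample as the root node. One frequent cause of this exact symptom is computing the entropies over the full X and y instead of only over the rows listed in node_indices. As a rough illustration (a generic sketch, not the assignment’s starter code, and assuming the convention that the left branch takes the samples where the feature equals 1), the computation looks roughly like this:

```python
import numpy as np

def entropy(p):
    # Binary entropy of a class proportion p (0 by convention at p = 0 or 1)
    if p == 0 or p == 1:
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def information_gain(X, y, node_indices, feature):
    # Work only with the samples that belong to this node --
    # the unit test passes a subset, not the whole dataset
    X_node, y_node = X[node_indices], y[node_indices]

    # Split the node's samples on the chosen feature
    # (left branch: feature value 1, right branch: feature value 0)
    left = y_node[X_node[:, feature] == 1]
    right = y_node[X_node[:, feature] == 0]

    # Weighted entropy of the two branches
    w_left = len(left) / len(y_node)
    w_right = len(right) / len(y_node)
    h_left = entropy(left.mean()) if len(left) else 0.0
    h_right = entropy(right.mean()) if len(right) else 0.0

    # Gain = entropy at the node minus weighted entropy after the split
    return entropy(y_node.mean()) - (w_left * h_left + w_right * h_right)
```

If your three root-node values match the expected output but the subset test fails, checking where X and y get indexed by node_indices is a good place to start.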

Raymond

Thank you for your post.
I have solved the issue.

Sebastien
