# Is it possible that entropy could increase from parent node to child nodes in the decision tree algorithm?

Hi,

I have just finished the “decision tree learning” session about the purity-based entropy function and the weighted entropy-reduction process. I wonder whether it is possible for entropy to increase from a parent node to its child nodes.

Specifically, for the cats-and-dogs problem, let’s define the variables as follows:

- Parent node (root node): C cats, D dogs, total S = C + D, entropy Hf.
- Left child node: C1 cats, D1 dogs, total S1 = C1 + D1, entropy Hl.
- Right child node: C2 (= C − C1) cats, D2 (= D − D1) dogs, total S − S1, entropy Hr.
- Weighted entropy of the child nodes: Hc.

I am curious whether there is a combination (C, D, C1, D1) such that Hf < Hc (or whether there is some theorem in mathematics that rules this out). Many thanks.
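For concreteness, the quantities above can be computed for one candidate split like this (a sketch in Python; base-2 entropy is assumed, and the counts C = 5, D = 5, C1 = 4, D1 = 1 are made up purely for illustration):

```python
import math

def entropy(c, d):
    """Binary entropy (in bits) of a node holding c cats and d dogs."""
    n = c + d
    if c == 0 or d == 0:
        return 0.0  # a pure (or empty) node has zero entropy
    p = c / n
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Hypothetical example: parent with 5 cats and 5 dogs,
# split so the left child gets 4 cats and 1 dog.
C, D = 5, 5
C1, D1 = 4, 1            # left child counts
C2, D2 = C - C1, D - D1  # right child counts
S, S1 = C + D, C1 + D1

Hf = entropy(C, D)  # parent entropy
Hc = (S1 / S) * entropy(C1, D1) + ((S - S1) / S) * entropy(C2, D2)
print(Hf, Hc)  # Hf = 1.0 bit; Hc is about 0.722 bits, so Hc < Hf here
```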

Best,
Steve

Hi Steve,

This course uses entropy as the measure of impurity for the splitting decision: we split a node only if entropy decreases. To know whether entropy will decrease, we calculate the entropy before splitting and the weighted sum of the child entropies after applying a candidate split. Because the decision rule only accepts splits that reduce entropy, for any accepted split the weighted entropy of the child nodes is smaller than the entropy of the parent node.

Let the entropy of the parent node be H_p, and the entropies of the left and right child nodes be H_l and H_r respectively. The weighted entropy of the child nodes is then H_c = w_l H_l + w_r H_r, where w_l and w_r are the fractions of the parent’s examples that go to each child.

H_c < H_p is the condition for a split.
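On the math side of your question: because the binary entropy function is concave and the parent’s class distribution is a weighted mixture of the children’s, Jensen’s inequality guarantees H_c ≤ H_p for every possible split (with equality when the split does not separate the classes at all). A brute-force check over small node sizes (a sketch; base-2 entropy assumed) is consistent with this:

```python
import math

def entropy(c, d):
    """Binary entropy (in bits) of a node holding c cats and d dogs."""
    if c == 0 or d == 0:
        return 0.0
    p = c / (c + d)
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Search all combos (C, D, C1, D1) with C, D up to 12 for a case
# where the weighted child entropy exceeds the parent entropy.
violations = 0
for C in range(13):
    for D in range(13):
        S = C + D
        if S == 0:
            continue
        Hf = entropy(C, D)
        for C1 in range(C + 1):
            for D1 in range(D + 1):
                S1 = C1 + D1
                if S1 == 0 or S1 == S:
                    continue  # skip splits with an empty child
                Hc = (S1 / S) * entropy(C1, D1) \
                     + ((S - S1) / S) * entropy(C - C1, D - D1)
                if Hc > Hf + 1e-12:  # small tolerance for float error
                    violations += 1
print(violations)  # → 0: no combo makes Hc exceed Hf
```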

Cheers,
Raymond

Hi Raymond,