Discrimination and bias video question

In the video Discrimination / Bias, Andrew explains the findings of the Microsoft researchers about the problem that arises when an AI learns word meanings from text on the internet. The example with the analogies Man : King as Woman : Queen, and then Man : Programmer as Woman : Homemaker, shows the problem. Andrew then uses coordinates to explain what causes it: the word man is mapped to the coordinates (1,1) and the word woman to (2,3). To solve an analogy, the system applies that same offset, one step to the right and two steps up, to the starting word. This leads from king to queen, but also from programmer to homemaker. Here comes my question: why does this not happen with Man : Father and Woman : Mother, and also not with Man : King and Woman : Queen? (A small sketch of the arithmetic is below.)
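To make the arithmetic concrete, here is a minimal Python sketch. The (1,1) and (2,3) coordinates for man and woman are the ones from the video; the 2-D vectors for the other words are made up for illustration (real embeddings have hundreds of dimensions):

```python
import numpy as np

# Toy 2-D "embedding". Only man and woman use the coordinates from the video;
# the rest are hypothetical values chosen so the analogy arithmetic works out.
embedding = {
    "man":        np.array([1.0, 1.0]),
    "woman":      np.array([2.0, 3.0]),
    "king":       np.array([3.0, 2.0]),   # hypothetical
    "queen":      np.array([4.0, 4.0]),   # hypothetical
    "programmer": np.array([5.0, 1.0]),   # hypothetical
    "homemaker":  np.array([6.0, 3.0]),   # hypothetical
}

def analogy(a, b, c):
    """Return the word whose vector is closest to c + (b - a)."""
    target = embedding[c] + (embedding[b] - embedding[a])
    # Exclude the query words themselves, as is conventional.
    candidates = {w: v for w, v in embedding.items() if w not in (a, b, c)}
    return min(candidates, key=lambda w: np.linalg.norm(candidates[w] - target))

print(analogy("man", "woman", "king"))        # -> queen
print(analogy("man", "woman", "programmer"))  # -> homemaker
```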

Hello, @MarkusEicher. Thank you for using Discourse. As for your query: the analogies Man : Father, Woman : Mother and Man : King, Woman : Queen do not exhibit the same gender-bias issue as Man : Programmer, Woman : Homemaker, because the words "father" and "mother" are associated with roles that are perceived as more neutral, so they generate less bias. In other words, while there are cultural expectations about what a "good" father or mother should be, these roles are generally seen as shared more evenly between the genders than the roles of "programmer" or "homemaker." Those latter words are much more strongly associated with one gender in the text the system learns from, and that association is what produces the bias. As a result, the algorithm is influenced less by the gender of the paired word itself and more by the terms and roles that are commonly associated with it in the training data.
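To illustrate the idea of "how strongly a word is associated with gender", here is a toy sketch in the spirit of the Bolukbasi et al. approach: project each word's vector onto a gender direction estimated from a definitionally gendered pair. All of the 2-D vectors except man and woman are made up for illustration:

```python
import numpy as np

embedding = {
    "man":        np.array([1.0, 1.0]),
    "woman":      np.array([2.0, 3.0]),
    "father":     np.array([1.5, 0.5]),   # hypothetical
    "mother":     np.array([2.5, 2.5]),   # hypothetical
    "programmer": np.array([5.0, 1.0]),   # hypothetical
    "homemaker":  np.array([6.0, 3.0]),   # hypothetical
}

# Estimate a "gender direction" from a pair that is gendered by definition.
gender_dir = embedding["woman"] - embedding["man"]
gender_dir = gender_dir / np.linalg.norm(gender_dir)

for word, vec in embedding.items():
    # Projection onto the gender direction: a large magnitude means a strong
    # gender association. For "father"/"mother" that association is part of the
    # word's definition; for occupations like "programmer"/"homemaker" it
    # reflects bias learned from the training text.
    print(f"{word:>10s}: {vec @ gender_dir:+.2f}")
```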
I hope this answers your question.