**Week 1 - Monty Hall Lab**

Hello!

I am currently going through the Week 1 course and trying to analytically solve the generalized Monty Hall problem, where Monty opens k doors out of n. I would like to solve it using Bayes' theorem instead of the more direct solution provided in the lab.

I am using the same notation as the lab:

E_i = \text{the event that the car is behind door } i, \text{ for } i = 1, \ldots, n.

And I add the following:

O_{j,..,k+j} = \text{the set of } k \text{ doors Monty opens, all hiding goats}.

We are looking for P(E_1 | O_{j,..,k+j}), which by Bayes' theorem can be rewritten as:

P(E_1 | O_{j,..,k+j}) = \frac{P(O_{j,..,k+j} | E_1)\cdot P(E_1)}{P(O_{j,..,k+j} | E_1)\cdot P(E_1) + P(O_{j,..,k+j} | E_1^c)\cdot P(E_1^c)}
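For concreteness, the two-hypothesis form of Bayes' theorem above can be written as a small helper (the function and argument names are my own):

```python
def bayes_posterior(p_o_given_e, p_e, p_o_given_ec):
    """P(E | O) for two exhaustive hypotheses E and E^c."""
    numerator = p_o_given_e * p_e
    # Law of total probability: P(O) = P(O|E)P(E) + P(O|E^c)P(E^c)
    denominator = numerator + p_o_given_ec * (1.0 - p_e)
    return numerator / denominator
```

With a fully uninformative observation (both likelihoods equal), the posterior just returns the prior, as expected.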

Next, I find the following expression for each term:

P(E_1) = 1/n, since there are n doors and 1 car.

P(E_1^c) = \frac{n-1}{n}, which follows from the above.

P(O_{j,..,k+j} | E_1) is the probability that Monty opens a given set of k doors when the car is behind the first door. In that case, Monty can pick any set of k doors out of the remaining (n-1) doors, so by the combination formula:

P(O_{j,..,k+j} | E_1) = \frac{1}{{n-1 \choose k}}

Likewise, P(O_{j,..,k+j} | E_1^c) is the probability that Monty opens a given set of k doors when the car is **not** behind the first door. Since Monty knows what is behind each door, he just picks k doors out of the n-2 doors that he knows hide goats:

P(O_{j,..,k+j} | E_1^c) = \frac{1}{{n-2 \choose k}}
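To make my problem concrete, here is what I get when I plug my three expressions into Bayes' theorem for an example case (n = 5, k = 2 are arbitrary values I picked):

```python
from math import comb

n, k = 5, 2  # arbitrary example values

p_e1 = 1 / n                        # prior: car behind door 1
p_e1c = (n - 1) / n                 # prior: car elsewhere
p_o_given_e1 = 1 / comb(n - 1, k)   # my expression above
p_o_given_e1c = 1 / comb(n - 2, k)  # my expression above

# Bayes' theorem, as in the formula earlier in the post
posterior = (p_o_given_e1 * p_e1) / (
    p_o_given_e1 * p_e1 + p_o_given_e1c * p_e1c
)
print(posterior, 1 / n)  # these do not match, which is my problem
```

For n = 5, k = 2 this gives a posterior of 1/9 rather than the lab's 1/5, so the discrepancy is not just an algebra slip on my part.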

Now, when I substitute these expressions into the formula above and simplify, I do not get the solution provided in the lab, namely:

P(E_1 | O_{j,..,k+j}) = \frac{1}{n}
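To double-check that the lab's answer is the one to aim for, I also ran a quick Monte Carlo sketch of the generalized game (the function name and parameter choices are mine):

```python
import random

def monty_frequencies(n, k, trials, seed=0):
    """Estimate win frequencies for 'stay' and 'switch': the player
    picks door 0, Monty opens k goat doors among doors 1..n-1, then
    the player either stays or switches uniformly at random to one
    of the other still-closed doors. Requires k <= n - 2."""
    rng = random.Random(seed)
    stay_wins = switch_wins = 0
    for _ in range(trials):
        car = rng.randrange(n)
        goats = [d for d in range(1, n) if d != car]
        opened = set(rng.sample(goats, k))
        closed_others = [d for d in range(1, n) if d not in opened]
        stay_wins += (car == 0)
        switch_wins += (rng.choice(closed_others) == car)
    return stay_wins / trials, switch_wins / trials

print(monty_frequencies(n=5, k=2, trials=100_000))
```

For n = 5, k = 2 the "stay" frequency does come out close to 1/5 = 1/n, so the lab's answer looks right and the flaw must be in my derivation.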

Could anyone point out the flaw in my logic, please? Thanks!