Greedy Search Clarification

Hi Mentor,

@GordonRobinson
@Kic
@edwardyu
@laacdm

In the lecture video "Picking the Most Likely Sentence", at 6:33, I did not understand the following statement. Could you please help explain it?

"So, it's quite possible that if you just pick the third word based on whatever maximizes the probability of just the first three words, you end up choosing option number two. But, this ultimately ends up resulting in a less optimal sentence, in a less good sentence as measured by this model for P(y|x)."

Also, at 7:49, could the statement below be explained a bit more? I could not understand it:

"So, this is just a huge space of possible sentences, and it's impossible to enumerate them all." What does "enumerate them all" mean here?

Here, Andrew is basically saying that Greedy Search does not produce the best sentence translation. (So, Beam Search is a better option, as it considers multiple candidate words at each step.)
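
To make this concrete, here is a minimal sketch (the toy probabilities and code are mine, for illustration only, not from the lecture or the assignments). After "Jane is", the word "going" looks best locally, so greedy decoding picks it, but beam search with width 2 keeps "visiting" alive and ends up with a sentence that has a higher overall P(y|x):

```python
def greedy_decode(model, max_len=6):
    """At every step keep only the single most likely next word."""
    seq, prob = (), 1.0
    while len(seq) < max_len:
        word, p = max(model[seq].items(), key=lambda kv: kv[1])
        seq, prob = seq + (word,), prob * p
        if word == "</s>":
            break
    return seq, prob

def beam_decode(model, beam_width=2, max_len=6):
    """At every step keep the beam_width most likely partial sentences."""
    beam = [((), 1.0)]            # (partial sentence, probability so far)
    finished = []
    for _ in range(max_len):
        candidates = []
        for seq, prob in beam:
            for word, p in model[seq].items():
                cand = (seq + (word,), prob * p)
                (finished if word == "</s>" else candidates).append(cand)
        if not candidates:
            break
        beam = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
    finished.extend(beam)
    return max(finished, key=lambda c: c[1])

# Toy model: probs[prefix][next_word] = P(next_word | x, prefix).
# "</s>" ends the sentence; all numbers are made up for illustration.
probs = {
    (): {"Jane": 1.0},
    ("Jane",): {"is": 1.0},
    ("Jane", "is"): {"going": 0.55, "visiting": 0.45},   # "going" wins locally
    ("Jane", "is", "going"): {"</s>": 0.70, "to": 0.30},
    ("Jane", "is", "going", "to"): {"</s>": 1.0},
    ("Jane", "is", "visiting"): {"Africa": 0.95, "</s>": 0.05},
    ("Jane", "is", "visiting", "Africa"): {"</s>": 1.0},
}

print("greedy:", greedy_decode(probs))  # ('Jane', 'is', 'going', '</s>'), 0.385
print("beam  :", beam_decode(probs))    # ('Jane', 'is', 'visiting', 'Africa', '</s>'), 0.4275
```

So greedy commits to "going" because it maximizes the probability of the first three words, yet the full sentence through "visiting" is more probable under the model. This is exactly the "option number two" situation Andrew describes.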
At 7:49, what it means is that the number of possible English sentences is astronomically large. If the vocabulary has V words and the output sentence has T words, there are V^T candidate sentences, so "enumerate them all" (list every one and score it with the model) is simply not feasible.
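
For example (the vocabulary size and sentence length below are assumed values, just to show the scale):

```python
# How many candidate sentences would exhaustive enumeration have to score?
vocab_size = 10_000       # assumed vocabulary size, for illustration only
sentence_length = 10      # assumed output length, for illustration only

candidates = vocab_size ** sentence_length
print(f"{candidates:.3e} possible sentences")  # 1.000e+40
```

Scoring 10^40 sentences is impossible, which is why Beam Search only keeps a small number of the most promising partial sentences at each step instead of trying them all.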