Comparing Perspectives on Dimensionality and Local Optima in Machine Learning

Hello Learners,
I have a question that I would like to hear your perspective on:
In a video from the Deep Learning Specialization titled “The Problem of Local Optima (C2W3L10),” Andrew said:
“If you are in, say, a 20000 dimensional space then for it to be a local optima all 20,000 directions need to look like this, and so the chance of that happening is maybe very small, you know maybe 2 to the minus 20000.”

However, in this course, while discussing dimensionality, Robert says:
“Let’s look at an example of how dimensionality reduction can help our models perform better, apart from distances and volumes. Increasing the number of dimensions can create other problems. Processor and memory requirements often scale non-linearly with an increase in the number of dimensions, due to an exponential rise in feasible solutions. Many optimization methods cannot reach a global optima and get stuck in a local optima.”

I lean more towards what Andrew said in this case, but I would like to make sure I am not missing something. What are your thoughts on these contrasting perspectives regarding dimensionality and local optima in machine learning?

Andrew’s explanation in DLS is an intuitive one.
The reasoning in the MLOps lecture is more analytical.
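
If it helps to make Andrew’s heuristic concrete, here is a minimal Monte Carlo sketch. It assumes, purely for illustration, that the curvature along each dimension at a critical point is independently positive or negative with probability 1/2 (real loss surfaces don’t satisfy this independence, which is why 2^-n is an intuition rather than a theorem):

```python
import numpy as np

rng = np.random.default_rng(0)

def prob_local_min(n_dims, n_trials=100_000):
    """Estimate the chance that a critical point is a local minimum when each
    direction independently curves up or down with probability 1/2."""
    # Random curvature sign per direction per trial (True = curves upward).
    curves_up = rng.random((n_trials, n_dims)) < 0.5
    # A local minimum requires every direction to curve upward.
    return curves_up.all(axis=1).mean()

for n in (1, 2, 5, 10, 20):
    print(f"{n:>2} dims: simulated {prob_local_min(n):.6f}   2^-{n} = {2.0 ** -n:.6f}")
```

As the dimension grows, the simulated probability tracks 2^-n and vanishes, which is the video’s point: critical points in very high-dimensional spaces are far more likely to be saddle points than local minima.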

Thank you for your reply, TMosh.
I apologize for the late reply; I didn’t receive a notification that someone had replied.
I see, but there is a contradiction, right?
I think my question can be better stated as follows:
Are we more likely or less likely to get stuck in a local optimum the more dimensions we have? And if it’s more likely, what do we do? Do we just apply dimensionality reduction, which has other benefits, and hope for the best?

I think these are not related.

Local minima are caused by non-convex cost functions, not by the number of dimensions. (A cost can be non-linear but still convex, like a quadratic, and have no spurious local minima.)

The fix for local minima is to train multiple times with different initial values, and then choose the solution with the lowest cost.
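
As an illustration of that restart strategy, here is a minimal sketch. The one-dimensional cost function, learning rate, and number of restarts are all made up for the example; the point is only the pattern of training from several random initial values and keeping the cheapest result:

```python
import numpy as np

rng = np.random.default_rng(42)

def cost(w):
    # A non-convex cost with two minima: a local one near w ≈ 1.13
    # and the global one near w ≈ -1.30.
    return w**4 - 3 * w**2 + w

def grad(w):
    # Derivative of the cost above.
    return 4 * w**3 - 6 * w + 1

def gradient_descent(w0, lr=0.01, steps=500):
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)
    return w

# Train several times from different random initial values,
# then choose the solution with the lowest cost.
candidates = [gradient_descent(w0) for w0 in rng.uniform(-2, 2, size=10)]
best = min(candidates, key=cost)
print(f"best w = {best:.3f}, cost = {cost(best):.3f}")
```

Starts that land in the right-hand basin converge to the shallower local minimum, while the restart loop still recovers the global minimum near w ≈ -1.30 by keeping the cheapest of the ten runs.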

I see. Thank you very much again for taking the time to reply to me.
Much appreciated!
Have a great day!
Wissam