Week 1 Quiz question - re: smaller training sets

Hello,
I’m confused by the explanation for why I got this question incorrect:
"The question (number 10 for me) that asks: “The sparsity of connections and weight sharing are mechanisms that allow us to use fewer parameters in a convolutional layer making it possible to train a network with smaller training sets. True/False?”

I feel I got the answer correct, and the explanation has no reference to "smaller training sets", so I'm wondering what I'm missing (or if there's a "bug" in the question).

(I don’t want to post the answer, or any hint, so sorry for sounding vague here…)

Hi GregX999,

I took the quiz twice myself and did not get that question, so I cannot evaluate the provided answer and explanation. In general, though, the required size of the training set is usually taken to depend on the number of parameters that have to be fitted, although there are exceptions. For a discussion, see this post.

This question tripped me up too. Fewer trainable parameters do indeed lead to lower training-set requirements.
It is mentioned in the lecture "Why convolutions?" around 5:35.
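To make the parameter-count contrast concrete, here is a small sketch of the kind of comparison that lecture walks through. The specific numbers (a 32×32×3 input and six 5×5 filters) are my own assumption for illustration, not a quote from the quiz:

```python
# Rough parameter-count comparison: conv layer vs. an equivalent
# fully connected layer (illustrative numbers, assumed here).

in_h, in_w, in_c = 32, 32, 3   # input volume
f, n_filters = 5, 6            # filter size and number of filters

# Convolutional layer: each filter has f*f*in_c weights plus one bias,
# and those weights are shared across every position (weight sharing).
conv_params = (f * f * in_c + 1) * n_filters
print("conv layer parameters:", conv_params)  # 456

# Fully connected layer mapping the flattened input to the flattened
# conv output (28x28x6 with no padding, stride 1): one weight per
# input-output pair, plus one bias per output unit.
out_h, out_w = in_h - f + 1, in_w - f + 1
n_in = in_h * in_w * in_c
n_out = out_h * out_w * n_filters
fc_params = n_in * n_out + n_out
print("fully connected parameters:", fc_params)  # ~14.5 million
```

With orders of magnitude fewer parameters to estimate, the conv layer can plausibly be trained on a much smaller dataset, which is the point the quiz question is making.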