Can few-shot learning be used for larger datasets?

I realize few-shot learning is used in situations where there is very little training data. One article even states that it is best used when the number of training samples is fewer than 10. However, is there a case where it can be used with, say, a dataset of 1,000 or more samples?

Greetings,

According to the usual definitions, the goal in FSL is for the model to learn the similarities and differences between two objects, unlike standard supervised learning, where the model recognizes images in the training set and generalizes to the test set. Based on that, more samples can always be helpful, provided they have good variance and fidelity, unless the model converges to its optimal state early and training on more samples no longer affects the accuracy.
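To make the "learning similarities and differences of two objects" idea concrete, here is a minimal sketch of a Siamese-style setup with a contrastive loss, which is one common way this is done. The network, layer sizes, and random data are my own illustrative assumptions, not anything from this thread:

```python
# Minimal sketch (illustrative, not from the thread): an embedding network
# trained so that the distance between two embeddings reflects whether the
# two inputs belong to the same class.
import torch
import torch.nn as nn
import torch.nn.functional as F

class EmbeddingNet(nn.Module):
    """Tiny CNN mapping a 1x28x28 image to a 64-d embedding (placeholder sizes)."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(64, 64)

    def forward(self, x):
        return self.fc(self.conv(x).flatten(1))

def contrastive_loss(z1, z2, same_class, margin=1.0):
    """Pull same-class pairs together, push different-class pairs at least `margin` apart."""
    dist = F.pairwise_distance(z1, z2)
    return torch.mean(same_class * dist.pow(2) +
                      (1 - same_class) * F.relu(margin - dist).pow(2))

# Toy forward/backward pass on random "image" pairs.
net = EmbeddingNet()
x1, x2 = torch.randn(8, 1, 28, 28), torch.randn(8, 1, 28, 28)
labels = torch.randint(0, 2, (8,)).float()   # 1 = same class, 0 = different
loss = contrastive_loss(net(x1), net(x2), labels)
loss.backward()
```

Because the model learns a similarity function rather than a fixed set of class boundaries, extra well-varied pairs generally just give it more comparisons to learn from.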

Yours Sincerely.


Are you saying it does not matter whether we have more or less data as long as the model attains optimal convergence?

It heavily depends on the goal we are implementing FSL for. For instance, if the model's goal is to distinguish a tiger from other types of big cats, 10 pictures can result in an optimally converged model, but if you want your model to distinguish between a Sumatran and a Bengal tiger, you will probably need more than 10 pictures.
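In episodic terms, "more than 10 pictures" just means a larger K in an N-way K-shot setup. Here is a rough prototypical-network-style sketch under the assumption that image embeddings are already computed; the class count, shot count, and dimensions are placeholders for the tiger example:

```python
# Hypothetical 2-way K-shot episode: nearest-prototype classification,
# with a larger K for the harder fine-grained task (Sumatran vs Bengal tiger).
import torch

def prototypes(support, labels, n_classes):
    """Mean embedding per class from a (N*K, D) support set."""
    return torch.stack([support[labels == c].mean(0) for c in range(n_classes)])

def classify(query, protos):
    """Assign each query embedding to the nearest class prototype."""
    dists = torch.cdist(query, protos)          # (Q, N) Euclidean distances
    return dists.argmin(dim=1)

n_classes, dim = 2, 64                          # e.g. Sumatran vs Bengal tiger
k_shot = 50                                     # fine-grained task -> larger K
support = torch.randn(n_classes * k_shot, dim)  # embeddings of support images
labels = torch.arange(n_classes).repeat_interleave(k_shot)
query = torch.randn(16, dim)                    # embeddings of query images

preds = classify(query, prototypes(support, labels, n_classes))
print(preds)
```

The framework stays the same either way; only the number of support examples per class grows with how fine-grained the distinction is.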


Alright, that clarifies things for me. Thank you.
