I came across this definition of a hyperparameter:
“Choices about the algorithm we set rather than learn”
As an example, in a K-nearest-neighbor classifier, the value of K and the distance metric would be hyperparameters.
Then a question came to mind: for a CNN there could be many different kinds of hyperparameters, right? Architecturally, it could be the number of Conv2d layers, the number of ReLU activation layers, the number of pooling layers, or the number of filters. Then during training there are other hyperparameters, like the choice of optimizer, the learning rate, the momentum, and the regularization strength.
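To make the distinction concrete, here is a minimal sketch (plain Python, with made-up example values) of how those two groups of choices might be collected before training, separating architectural hyperparameters from training hyperparameters:

```python
from dataclasses import dataclass

@dataclass
class CNNHyperparams:
    # Architectural hyperparameters: chosen before training, define the model.
    num_conv_layers: int = 3
    filters_per_layer: tuple = (32, 64, 128)   # one entry per conv layer
    kernel_size: int = 3
    pool_every: int = 1                        # pooling after every N conv blocks
    # Training hyperparameters: control the optimization process, not the model.
    optimizer: str = "sgd"
    learning_rate: float = 1e-3
    momentum: float = 0.9
    weight_decay: float = 1e-4                 # L2 regularization strength
    batch_size: int = 64

hp = CNNHyperparams()
print(hp.num_conv_layers, hp.learning_rate)
```

None of these values are learned from data; they are all set (or searched over) by the practitioner, which is exactly what makes them hyperparameters.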
So is it okay to assume that identifying hyperparameters is a subject in itself and varies from model to model, rather than there being a generic list of things we always identify as hyperparameters?