Hi there,
I think it makes total sense to view optimizers as a supporting tool: particular optimizers pair well with particular model classes, and the model is in turn the tool you use to solve the actual problem (a small sketch contrasting the first two cases follows this list):
- e.g. a linear regression model can be fitted analytically (closed-form least squares, no iterative optimizer needed)
- e.g. gradient-based optimizers are popular in deep learning: Optimizers in Deep Learning. What is an optimizer? | by Musstafa | MLearning.ai | Medium
- e.g. gradient-free / Bayesian optimization is popular for hyperparameter optimization, among other use cases
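To make the contrast between the first two bullets concrete, here is a minimal sketch in plain NumPy (toy data and all parameter choices are made up for illustration): the same linear model is fitted once analytically via the normal equation and once iteratively with vanilla gradient descent, and both should land on roughly the same weights.

```python
import numpy as np

# Toy data: y = 2x + 1 plus a little noise (hypothetical example data)
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 1))
y = 2.0 * X[:, 0] + 1.0 + 0.1 * rng.normal(size=100)

# Design matrix with an extra bias column of ones
A = np.hstack([X, np.ones((X.shape[0], 1))])

# 1) Analytic fit: ordinary least squares via the normal equation
#    w* = (A^T A)^{-1} A^T y  -- no iterative optimizer involved
w_analytic = np.linalg.solve(A.T @ A, A.T @ y)

# 2) Iterative fit: plain gradient descent on the mean squared error
w = np.zeros(2)
lr = 0.1
for _ in range(2000):
    grad = 2.0 / len(y) * A.T @ (A @ w - y)  # gradient of the MSE w.r.t. w
    w -= lr * grad

print("analytic fit:       ", w_analytic)  # both close to [2.0, 1.0]
print("gradient descent fit:", w)
```

In deep learning the closed-form route is not available, which is why iterative, gradient-based optimizers (SGD, Adam, ...) take over that role.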
I believe these threads could be interesting for you, since advanced optimization methods are also touched upon:
- Why not always use Adam optimizer - #4 by Christian_Simonis
- Adam Optimization Question - #3 by Christian_Simonis
Hope that helps. Have a great day, @FrederikH!
Best regards
Christian