Some personal feedback: of course it depends on the underlying function, but I have often had quite good experience with regula falsi, in particular its Pegasus variant. In my experience it is more robust than Newton's method in many cases!
A big plus is also that the Pegasus method comes with convergence guarantees, since it always keeps the root bracketed!
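To make this concrete, here is a minimal sketch of the Pegasus variant of regula falsi (not the method from any particular library, just an illustration): the bracket [a, b] always contains the root, and whenever the same endpoint is kept twice in a row its function value is scaled down, which avoids the slow one-sided convergence of classic regula falsi.

```python
import math

def pegasus(f, a, b, tol=1e-12, max_iter=100):
    """Find a root of f in the bracket [a, b] using the Pegasus method."""
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    c = a
    for _ in range(max_iter):
        # secant step between the current bracket endpoints
        c = b - fb * (b - a) / (fb - fa)
        fc = f(c)
        if abs(fc) < tol:
            return c
        if fb * fc < 0:
            # sign change between b and c: shift the bracket
            a, fa = b, fb
        else:
            # same endpoint retained: scale fa (the Pegasus modification)
            fa = fa * fb / (fb + fc)
        b, fb = c, fc
    return c

# example: root of cos(x) - x in [0, 1]
root = pegasus(lambda x: math.cos(x) - x, 0.0, 1.0)
```

Note that, unlike Newton's method, this needs no derivative, only a sign-changing bracket.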
I think trying out several initial points, or even optimising them with a gradient-free Bayesian approach, can help; see also this thread: Why there are so many optimizer algorithms? - #2 by Christian_Simonis
More concretely, you could also treat the initial value in Newton's method as a hyperparameter which you could optimise in a Bayesian way. Truth is, however: often this would be too complicated and a simple grid search does the trick!
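As a small sketch of that grid-search idea (the function and the grid of starting points here are made up for illustration): run plain Newton iterations from each candidate start and keep whichever runs converge.

```python
import math

def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Plain Newton iteration; returns None if it fails to converge."""
    x = x0
    for _ in range(max_iter):
        d = df(x)
        if d == 0:
            return None  # flat spot: Newton step undefined
        step = f(x) / d
        x -= step
        if abs(step) < tol:
            return x
    return None

# treat the initial value as a hyperparameter and grid-search it
f = lambda x: math.cos(x) - x
df = lambda x: -math.sin(x) - 1.0
starts = [-2.0, -1.0, 0.0, 1.0, 2.0]
roots = [r for r in (newton(f, df, x0) for x0 in starts) if r is not None]
```

A Bayesian optimiser would simply replace the fixed grid `starts` with adaptively chosen candidates.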
In the end I believe there is no silver bullet for numerical optimization, but it definitely helps to have a nice toolkit of methods.
PS: Especially for large linear systems, the conjugate gradient method (CGM) is also powerful, even though I have to admit that since I left university I have not worked with it much.
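For reference, a minimal textbook version of CG (assuming a symmetric positive-definite matrix, which the method requires) looks like this; in practice you would reach for a library routine such as `scipy.sparse.linalg.cg` instead:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Solve A x = b for symmetric positive-definite A."""
    n = len(b)
    max_iter = max_iter or n
    x = np.zeros(n)
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)   # optimal step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        # update direction, keeping it A-conjugate to the previous ones
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

# small SPD example system
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

In exact arithmetic CG terminates in at most n steps, which is what makes it attractive for large sparse systems.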
Best regards
Christian