Rule of thumb for Hyperparameters

Is there any rule of thumb for hyperparameters like the number of layers, the number of neurons, and the number of neurons in each layer? If there isn't one, how should I choose these hyperparameters?

You can look at similar models that deal with a similar application, but there is no rule of thumb. There are search techniques for finding well-performing models, but in general it is a trial-and-error process.
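
To make "search techniques" concrete, here is a minimal sketch of a random hyperparameter search over the number of layers and neurons, using scikit-learn's RandomizedSearchCV on a toy dataset. The layer sizes and parameter ranges below are just placeholders, not recommended values:

```python
# Minimal sketch: random search over network width/depth on a toy dataset.
# The candidate architectures and ranges are arbitrary placeholders.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import RandomizedSearchCV
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

param_distributions = {
    # Each tuple is one candidate architecture: (neurons per hidden layer, ...)
    "hidden_layer_sizes": [(32,), (64,), (32, 32), (64, 32), (64, 64, 32)],
    "alpha": [1e-4, 1e-3, 1e-2],          # L2 regularization strength
    "learning_rate_init": [1e-3, 1e-2],
}

search = RandomizedSearchCV(
    MLPClassifier(max_iter=500, random_state=0),
    param_distributions,
    n_iter=8,          # try 8 random combinations
    cv=3,
    random_state=0,
)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```

In practice you would run this (or a grid/Bayesian search, or Keras Tuner for Keras models) with your own data and ranges, and keep the best configuration found on a validation set.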

Hi gent.spah, I was asking because of this. In that video, the professor gave the following formulas for finding these hyperparameters:

number of layers → 2·log2(N)
number of perceptrons in the network → 3(N − 1)
number of perceptrons in a single hidden layer → 2^(N − 1)
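
For example, here is a quick sketch of what those formulas would give, assuming N is the number of input features (N = 8 is my own example, not from the video):

```python
import math

N = 8  # assumption: N is the number of input features

num_layers = 2 * math.log2(N)                 # 2 * log2(8) = 6
num_perceptrons_total = 3 * (N - 1)           # 3 * 7 = 21
num_perceptrons_per_hidden_layer = 2 ** (N - 1)  # 2^7 = 128

print(num_layers, num_perceptrons_total, num_perceptrons_per_hidden_layer)
```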

Can these formulas always be used?

In my experience so far I haven't come across these formulas, but I can't say for sure whether they are right or not.