Automatically measure how Gaussian a distribution is

In the video “Choosing what features to use,” Andrew mentions the following:

If you read the machine learning literature, there are some ways to automatically measure how close these distributions are to Gaussian.

Could you point me to some of that literature?
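Not a literature pointer, but for context: the standard statistical tools for this are normality tests such as the D’Agostino–Pearson test (`scipy.stats.normaltest`) or the Shapiro–Wilk test (`scipy.stats.shapiro`), which return a p-value for the hypothesis that the sample came from a Gaussian. A minimal sketch with SciPy (the synthetic data here is just for illustration):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
gaussian = rng.normal(size=1000)       # should look Gaussian
skewed = rng.exponential(size=1000)    # clearly non-Gaussian

# D'Agostino-Pearson test combines skewness and kurtosis into one
# statistic; a small p-value means "reject the Gaussian hypothesis"
for name, x in [("gaussian", gaussian), ("skewed", skewed)]:
    stat, p = stats.normaltest(x)
    print(f"{name}: statistic={stat:.2f}, p-value={p:.4g}")
```

In practice a quick visual check (histogram or a Q–Q plot via `scipy.stats.probplot`) is often just as informative as the formal test.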

Also, is there a library that, given a feature X, automatically determines which transformation to apply to make it more Gaussian? e.g. log(x + 1), x**2, and so on.
It would be very cool!


Hi @popaqy,

If you don’t mind a non-Andrew answer, you may start from here. Once you have read it and tried something, we can discuss further.

I’ve never seen such a library myself — let’s see if others have input!