# Can a neural network approximate non-linear and periodic functions?

I recently found a few assertions in forums (not this one) saying that neural networks cannot approximate non-linear functions and, in particular, periodic functions.

I wondered if I could answer this on my own using what we’ve seen so far in the course. So I extended the “Hello world” example, obtaining these results:

*[Chart] Sinusoidal approximation within the training domain*

*[Chart] Sinusoidal approximation when predicting far from the training points*

I think that the results speak for themselves:

1. neural networks can easily learn non-linear functions,
2. they can be quite accurate, as long as the points they predict for lie close to the points they were trained on.

The sources used for the charts can be found here.
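For readers who want to reproduce the experiment, it can be sketched roughly like this (the layer sizes, epoch count, and sampling range are my own illustrative choices, not necessarily the original notebook's):

```python
import numpy as np
import tensorflow as tf

# Sketch of the extended "Hello world" experiment: fit y = sin(x)
# on [0, 2*pi] with a small dense network.
x = np.linspace(0, 2 * np.pi, 1000).reshape(-1, 1).astype("float32")
y = np.sin(x)

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(1,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
history = model.fit(x, y, epochs=100, verbose=0)
```

Predicting outside `[0, 2*pi]` with this model shows the second effect: the ReLU network extrapolates linearly, so the fit degrades away from the training domain.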

That claim is false. Non-linear functions are a particular specialty of neural networks, precisely because of the non-linear activation in the hidden layers.
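The role of the activation can be seen with plain matrix algebra: without it, stacked layers compose into a single linear map (a sketch with assumed random weights):

```python
import numpy as np

# Two stacked linear layers (no activation) collapse into one:
# W2 @ (W1 @ x) equals (W2 @ W1) @ x for every input x, so a
# purely linear network could never trace out a sinusoid.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 1))   # "hidden layer" weights
W2 = rng.normal(size=(1, 4))   # "output layer" weights
x = np.linspace(-3.0, 3.0, 7).reshape(1, -1)

same = np.allclose(W2 @ (W1 @ x), (W2 @ W1) @ x)
print(same)  # True: the composition is itself linear
```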


Thanks TMosh! You are absolutely right. I’m happy that I had already developed the tools to figure it out on my own. What you point out is what I imagined in the first place, but I still don’t have enough theoretical knowledge to answer with confidence.

By the way, as a (former) physics student, I wanted to see if a neural network could recover the explicit parameters (phase and angular velocity) of a sinusoidal signal, and it seems quite easy with a custom layer:

```python
class Cosine(tf.keras.layers.Layer):
    def __init__(self, units=32, input_dim=32):
        super().__init__()
        # Trainable angular-velocity weights and phase offsets
        self.w = self.add_weight(shape=(input_dim, units),
                                 initializer="random_normal", trainable=True)
        self.b = self.add_weight(shape=(units,),
                                 initializer="zeros", trainable=True)

    def call(self, inputs):
        return tf.cos(tf.matmul(inputs, self.w) + self.b)
```
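A usage sketch under assumed values (a synthetic signal `y = cos(2t + 0.5)`; the single-unit setup, learning rate, and epoch count are my own choices, and the layer is repeated so the snippet runs standalone):

```python
import numpy as np
import tensorflow as tf

class Cosine(tf.keras.layers.Layer):
    def __init__(self, units=1, input_dim=1):
        super().__init__()
        self.w = self.add_weight(shape=(input_dim, units),
                                 initializer="random_normal", trainable=True)
        self.b = self.add_weight(shape=(units,),
                                 initializer="zeros", trainable=True)

    def call(self, inputs):
        return tf.cos(tf.matmul(inputs, self.w) + self.b)

# Synthetic signal with angular velocity 2.0 and phase 0.5 (assumed values)
t = np.linspace(0, 10, 2000).reshape(-1, 1).astype("float32")
y = np.cos(2.0 * t + 0.5)

layer = Cosine(units=1, input_dim=1)
model = tf.keras.Sequential([layer])
model.compile(optimizer=tf.keras.optimizers.Adam(0.05), loss="mse")
history = model.fit(t, y, epochs=200, verbose=0)
# layer.w and layer.b then estimate the angular velocity and phase,
# though frequency fitting is non-convex, so a poor random start can
# settle in a local minimum.
```

Note that the recovered parameters are only defined up to sign and period ambiguities of the cosine.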