Memory allocated for neural network

Hi ML mentors,

I tried to run the optional lab with the neural network examples outside the Coursera platform, using the provided Jupyter notebook.
I found that the neural network keeps a lot of memory allocated even after the modelling calculation completes, as long as the Jupyter notebook kernel is not killed.
Are there any tricks to release some memory after model.fit completes, without killing the kernel?

Many thanks!
Liyu

Hi @liyu,

Try this. You will need to identify the object that takes up the space, del all variables that reference that object, and then call gc.collect().

Note that if you have more than one variable referencing the same object, you will need to find and del all of those variables.

import gc

a = object()   # a, b, and c all reference the same object
b = a
c = a
del a, b, c    # remove every reference so the object becomes unreachable
gc.collect()   # ask the garbage collector to reclaim unreachable objects
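If you are not sure which objects are the big ones, a rough sketch like the following can help surface the largest objects the garbage collector is tracking. Note that sys.getsizeof reports only shallow sizes (a container or a NumPy array wrapper may look smaller than its true footprint), so treat this as a hint, not an exact accounting:

```python
import gc
import sys

def safe_size(obj):
    # Some objects do not support sys.getsizeof; treat them as size 0.
    try:
        return sys.getsizeof(obj)
    except TypeError:
        return 0

# Rank all GC-tracked objects by shallow size and show the top 10.
largest = sorted(gc.get_objects(), key=safe_size, reverse=True)[:10]
for obj in largest:
    print(type(obj), safe_size(obj))
```

Once you know the types of the largest objects, you can search your notebook for the variables that reference them and del those.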

Raymond

1 Like

Hi @liyu, in addition to @rmwkwok's answer, I would suggest using notebooks from Kaggle and/or Google Colab, which have GPUs you can use to speed up your training process, since running deep learning locally can be tricky if you don't have a machine with good resources.

In my experience using online notebooks can be a great way to practice deep learning without running things locally.

I hope this helps!

1 Like

In addition to Raymond’s more sophisticated strategy of explicitly releasing memory objects, another approach is to add logic after training completes to save the model to disk. Then add logic so that if you close and reopen the notebook and a model is already saved on disk, it simply loads that model and is ready to use, instead of retraining. That way you can use the model in inference mode without all the excess memory usage created by training.

In other words, the simple way to release the memory is to close and reopen the notebook, or click “Kernel → Restart and Clear Output” and then rerun. But unless you save the model to disk, you would also lose the trained coefficients when you use the “close and reopen” strategy.

Here’s a link that I found by googling … wait for it … “tensorflow save and load models”.
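The check-then-load pattern could look something like this minimal sketch, assuming TensorFlow/Keras and a hypothetical model architecture and file name (your actual model and path will differ):

```python
import os
import tensorflow as tf

MODEL_PATH = "my_model.keras"  # hypothetical file name

if os.path.exists(MODEL_PATH):
    # A saved model exists: load it and skip training entirely.
    model = tf.keras.models.load_model(MODEL_PATH)
else:
    # No saved model yet: build, compile, train, then save to disk.
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(4,)),
        tf.keras.layers.Dense(8, activation="relu"),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")
    # model.fit(X_train, y_train, epochs=10)  # train only on the first run
    model.save(MODEL_PATH)
```

After restarting the kernel, rerunning this cell loads the trained model from disk instead of retraining, so the memory held by the training process is gone but the learned weights survive.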

1 Like

Thank you, Raymond, for the quick help! OK, I will check which objects take up so much memory. Liyu

Oh great, this might be a quick solution! Thank you very much for your help! Liyu

Oh yes, this will help. I have not tried either online notebook. Will do. Thank you very much! Liyu