How to deal with big data?

I have a massive dataset with billions of entries. I cannot load the data all at once because of its massive size. Is there a way for me to:
i) Load 10% of the data
ii) Run a neural network model
iii) Load the next 10% of the data
iv) Update the neural model
and so on…

Yes, TensorFlow provides a class called “Dataset” (tf.data.Dataset) that is specifically designed to deal with this type of issue. We’ll get introduced to it in Course 4, or you can just google “TensorFlow Dataset” to find the documentation.
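
Not from the course materials, just a minimal sketch of the kind of input pipeline tf.data supports. The shard file names, the .npy format, the feature width, and the batch size are all assumptions, stand-ins for however your data is actually stored:

```python
import numpy as np
import tensorflow as tf

# Hypothetical layout: the data is saved as 10 shard files on disk,
# e.g. "shards/x_0.npy" ... "shards/x_9.npy" with matching "y_*.npy" labels.
NUM_SHARDS = 10
NUM_FEATURES = 20   # assumed feature width

def shard_generator():
    """Yield one example at a time, keeping only one shard (~10% of the data) in memory."""
    for i in range(NUM_SHARDS):
        x = np.load(f"shards/x_{i}.npy").astype(np.float32)   # shape (n_i, NUM_FEATURES)
        y = np.load(f"shards/y_{i}.npy").astype(np.float32)   # shape (n_i,)
        for features, label in zip(x, y):
            yield features, label

dataset = (
    tf.data.Dataset.from_generator(
        shard_generator,
        output_signature=(
            tf.TensorSpec(shape=(NUM_FEATURES,), dtype=tf.float32),
            tf.TensorSpec(shape=(), dtype=tf.float32),
        ),
    )
    .shuffle(10_000)                 # shuffle within a buffer, not the whole dataset
    .batch(1024)                     # minibatches for gradient descent
    .prefetch(tf.data.AUTOTUNE)      # prepare the next batch while the model trains
)
```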

The idea of “minibatch” gradient descent covered here in Course 2 is also relevant, of course, but that too is managed for you by TF once you switch to using that framework.
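
Continuing the sketch above (the layer sizes and epoch count are placeholders), Keras handles that minibatch loop for you: model.fit pulls one batch at a time from the dataset and runs a gradient update per batch, which is exactly the “load some data, update the model, load some more” cycle in the question.

```python
# A small Keras model; .fit streams minibatches from the tf.data pipeline
# built above, so the full dataset never has to sit in memory.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

model.fit(dataset, epochs=3)   # streams through the shards once per epoch
```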
