- What is the purpose behind using `tf.random.set_seed(1234)`?
- Also, in the optional lab on the gradient descent algorithm, we use:

```python
if i < 10000:  # for preventing resource exhaustion
    if i % 1000 == 0:
        print(i)
```
May I know what is meant by resource exhaustion?
The random seed is set for reproducibility: it ensures that the same result is obtained each time the notebook is run.
As for the resource exhaustion aspect: training neural networks requires a lot of computing power, so a cap is set to ensure that we stay within the compute available in the Coursera environment. In the snippet you quoted, `i < 10000` limits how many iterations do the extra bookkeeping, and `i % 1000 == 0` keeps the printed output to a manageable size; see the sketch below.
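To make that concrete, here is a minimal sketch of the pattern (the helper names `compute_cost` and `compute_gradient` are placeholders for illustration, not the exact functions from the lab):

```python
def gradient_descent(x, y, w, b, alpha, num_iters, compute_cost, compute_gradient):
    """Plain gradient descent that caps its bookkeeping to avoid resource exhaustion."""
    cost_history = []
    for i in range(num_iters):
        dj_dw, dj_db = compute_gradient(x, y, w, b)  # gradients of the cost
        w = w - alpha * dj_dw                        # parameter updates
        b = b - alpha * dj_db

        if i < 10000:  # cap how much history is stored in memory
            cost_history.append(compute_cost(x, y, w, b))
            if i % 1000 == 0:  # print progress only occasionally
                print(f"Iteration {i:5d}: cost = {cost_history[-1]:0.4f}")
    return w, b, cost_history
```

Without a cap like this, storing the cost (and printing a line) at every one of, say, a million iterations would grow the history list and the notebook output without bound, which is what "resource exhaustion" refers to here.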
Thank you so much…
Could you explain in more detail how setting the seed helps reproducibility, as you mentioned in your first statement?
When we use random number generators in machine learning, e.g., to initialize model weights or shuffle data, the output is usually different each time you run the code. This can make it hard to debug or reproduce results.
By calling tf.random.set_seed(1234), we’re telling TensorFlow to start the sequence of random numbers in a predictable way. This means that every time the code runs, the same “random” numbers will be generated, and you’ll get the exact same results, which is especially important for reproducibility in experiments, debugging, or sharing results with others.
So even though the process looks random, setting a seed makes it deterministic. This is why we say it helps with reproducibility: it allows you (and others) to repeat the experiment under the same conditions and get consistent results.
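A quick way to see this for yourself (a minimal sketch; `tf.random.uniform` stands in here for any random op the notebook uses):

```python
import tensorflow as tf

tf.random.set_seed(1234)           # fix TensorFlow's global seed
first = tf.random.uniform((3,))    # draw three "random" numbers

tf.random.set_seed(1234)           # reset to the same seed...
second = tf.random.uniform((3,))   # ...and the same sequence comes out again

print(first.numpy())
print(second.numpy())
print(bool(tf.reduce_all(first == second)))  # True: the draws are identical
```

The same idea applies to weight initialization: with the seed fixed, your layers start from the same initial weights on every run, so (barring other sources of nondeterminism, such as certain GPU ops) the training results match.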