Results not matching for linear_function in DLS Course 2 week 3

I had to initialise random tensors of specific shapes for X, W, and b and compute Y. I did not change the given order of initialisation (X, W, b, Y). I initialised X, W, and b as constants and computed Y by adding b to the product of W and X. The results do not match. Please help.
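For context, a minimal sketch of the kind of function being described, assuming the shapes commonly used in this exercise (the shapes and names here are illustrative, not necessarily the exact ones from the notebook):

```python
import numpy as np
import tensorflow as tf

def linear_function():
    # Illustrative shapes; the assignment specifies its own.
    np.random.seed(1)  # fixed seed so results are reproducible
    X = tf.constant(np.random.randn(3, 1), name="X")
    W = tf.constant(np.random.randn(4, 3), name="W")
    b = tf.constant(np.random.randn(4, 1), name="b")
    # Y = WX + b
    Y = tf.add(tf.matmul(W, X), b)
    return Y

print(linear_function())
```

With these shapes, Y comes out as a (4, 1) tensor.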

Hi, @RohanD.

Be careful, rand and randn sample from different distributions!
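A quick way to see the difference (using NumPy, whose rand/randn mirror the distinction in question): rand draws uniformly from [0, 1), while randn draws from a standard normal with mean 0 and standard deviation 1, so it also produces negative values.

```python
import numpy as np

np.random.seed(0)
u = np.random.rand(100_000)   # uniform on [0, 1): all values non-negative
n = np.random.randn(100_000)  # standard normal: mean ~0, std ~1, can be negative

print(u.min() >= 0 and u.max() < 1)  # rand stays inside [0, 1)
print(n.min() < 0)                   # randn produces negatives
print(abs(u.mean() - 0.5) < 0.01)    # uniform mean is near 0.5
print(abs(n.mean()) < 0.02)          # normal mean is near 0.0
```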


Also, @RohanD, note the difference between tf.constant and tf.Variable.
As the hint in the assignment points out:

Note that the difference between tf.constant and tf.Variable is that you can modify the state of a tf.Variable but cannot change the state of a tf.constant.

Think about which of X, W, b, and Y change, and which ones do not, throughout the training process.
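The distinction is easy to verify in eager TensorFlow: a tf.Variable exposes assign to update its state in place, while the tensor returned by tf.constant has no such method (this is a generic illustration, not code from the assignment):

```python
import tensorflow as tf

v = tf.Variable(1.0)   # mutable: state can be updated during training
c = tf.constant(1.0)   # immutable: value is fixed at creation

v.assign(2.0)          # fine: Variables support in-place updates
print(v.numpy())       # 2.0

# Constants have no assign method; trying c.assign(...) would raise an error.
print(hasattr(c, "assign"))  # False
```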


Oh my god!!! Thank you so much. I do have a question now, though: why don’t we use rand for initialising our W matrix? Since it samples from 0 to 1, it should be valid, right? I ask because in previous assignments we multiplied each element of the W matrix by 0.01 to keep it small.

Yes, got it. I changed my ‘constant’ initialization of W and b after I corrected the mistake in my code. Thank you so much for the quick help!

Glad it worked!

Assuming the effect of weight initialization in neural networks is clear, the only reason rand fails here is that it generates values different from those expected by the grader (the assignment specifically asks you to use randn). :sweat_smile:

Good luck with the rest of the assignments!

Thank you. Mixing up randn and rand is a recurring source of errors. Not the first time it has caught me out.