Hi, I have finished the assignment already, but this code is still not clear to me.
# Testing your Dense layer (Dense is the class implemented in the assignment)
import numpy as np
import trax

dense_layer = Dense(n_units=10)  # set the number of units in the dense layer
random_key = trax.fastmath.random.get_prng(seed=0)  # create a PRNG key from seed 0
z = np.array([[2.0, 7.0, 25.0]])  # input array
dense_layer.init(z, random_key)
Why is it calling dense_layer.init if the method we implemented is named “init_weights_and_state”?
I thought maybe .init was just shorthand for “init_weights_and_state”, but that can’t be it, since the parameters are different too.
So my main question is: why don’t we get an error when calling dense_layer.init, given that we never defined such a method?
Hi @Lucas_Queiroz
As you can see, Dense has a parent class (in other words, it is a child of Layer), and Layer defines the init method that Dense inherits.
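A toy version of that structure (illustrative names only, not the actual trax source):

class Layer:
  def init(self, input_signature, rng=None):
    # The generic setup lives here, in the parent class.
    if rng is not None:
      self.rng = rng
    self.init_weights_and_state(input_signature)

  def init_weights_and_state(self, input_signature):
    del input_signature  # default: nothing to initialize

class Dense(Layer):
  pass  # the assignment's version defines its own init_weights_and_state, forward, etc.

Dense().init(input_signature=None)  # resolves to Layer.init, so no AttributeError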
Hi @arvyzukai
thanks a lot for pointing this out. So our implemented method “init_weights_and_state” is actually overriding a parent method of the same name, which is the one called inside “init”, is that right?
The original method in Layer would only execute this:
del input_signature
Don’t we need an @override annotation in this case?
Btw, do you know why trax hasn’t gotten any updates since 2021?
Thank you once again!
Not quite. As you can see from this code (when init is called):
if rng is not None:
  self.rng = rng
self.init_weights_and_state(input_signature)
It sets the passed random generator on our Dense layer and then calls the Dense layer’s init_weights_and_state method, which describes how to properly create the weights.
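Filling in the Dense half of the toy sketch above (still just a sketch; the exact signature, initializer, scaling and attribute names in the assignment may differ):

import trax

class Dense(Layer):  # Layer as in the toy sketch, standing in for trax's base class
  def __init__(self, n_units):
    self._n_units = n_units

  def init_weights_and_state(self, input_signature):
    # input_signature.shape[-1] is the number of input features.
    input_shape = input_signature.shape
    # Draw the weight matrix using the rng that init() stored on self.
    self.weights = trax.fastmath.random.normal(
        key=self.rng, shape=(input_shape[-1], self._n_units))

So calling dense_layer.init(...) ends up here, with self.rng already set.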
No. When you define a method in the child class, it “overrides” the parent’s method of the same name by default (in this case init_weights_and_state, forward and __init__), while the methods you don’t redefine (in this case init, __call__ and others) remain intact.
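A tiny standalone example of that default behavior (made-up names, purely for illustration):

class Parent:
  def greet(self):
    return "parent greet"

  def shared(self):
    return "parent shared"

class Child(Parent):
  # Same name as Parent.greet, so it overrides it automatically;
  # no decorator or annotation is required.
  def greet(self):
    return "child greet"

c = Child()
print(c.greet())   # "child greet"  (overridden)
print(c.shared())  # "parent shared" (inherited intact)

(For what it’s worth, Python 3.12 added typing.override, but that is only a hint for static checkers; the language itself has never required an annotation to override.)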
I’m not sure. AFAIK the lead developer Lukasz Kaiser left Google Brain for OpenAI, and the library hasn’t been actively developed since then (only minor changes).
Cheers
P.S. But it’s a good library in any case, especially for learning, since the code is much cleaner than that of the big frameworks.