DL5, W4 Assignment. ValueError: The first argument to `Layer.call` must always be passed


I am not able to understand why I am getting this error.
Could anyone help in resolving it or throw some light on it?

You have not provided the correct arguments to the self.mha() call.
You have to provide data for the query, value, and key. Fortunately, since this is self-attention, you use `x` for all three.
And you do not need the training argument.
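
For reference, here is a minimal, runnable sketch of that self-attention pattern with Keras's MultiHeadAttention (the layer sizes and tensor below are purely illustrative, not the assignment's exact values or solution):

    import tensorflow as tf

    # Illustrative self-attention sketch: query, value, and key all receive
    # the same tensor x, and no training argument is needed for this call.
    mha = tf.keras.layers.MultiHeadAttention(num_heads=2, key_dim=4)
    x = tf.random.uniform((1, 3, 4))   # (batch_size, seq_len, embedding_dim)
    attn_output = mha(x, x, x)         # self-attention: query=x, value=x, key=x
    print(attn_output.shape)           # (1, 3, 4)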


I did the same thing, but I am still getting the same error.

Please share your full error.


ValueError                                Traceback (most recent call last)
<ipython-input-22-00617004b1af> in <module>
      1 # UNIT TEST
----> 2 EncoderLayer_test(EncoderLayer)

~/work/W4A1/public_tests.py in EncoderLayer_test(target)
     84     encoder_layer1 = target(4, 2, 8)
     85     tf.random.set_seed(10)
---> 86     encoded = encoder_layer1(q, True, np.array([[1, 0, 1]]))
     87 
     88     assert tf.is_tensor(encoded), "Wrong type. Output must be a tensor"

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py in __call__(self, *args, **kwargs)
   1010         with autocast_variable.enable_auto_cast_variables(
   1011             self._compute_dtype_object):
-> 1012           outputs = call_fn(inputs, *args, **kwargs)
   1013 
   1014         if self._activity_regularizer:

<ipython-input-21-b5dd6fb08ad4> in call(self, x, training, mask)
     52         # apply dropout layer to ffn output during training (~1 line)
     53         # use `training=training`
---> 54         ffn_output = self.dropout_ffn(training=training)
     55 
     56         # apply layer normalization on sum of the output from multi-head attention (skip connection) and ffn output to get the

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py in __call__(self, *args, **kwargs)
    940     #   not to any other argument.
    941     # - setting the SavedModel saving spec.
--> 942     inputs, args, kwargs = self._split_out_first_arg(args, kwargs)
    943     input_list = nest.flatten(inputs)
    944 

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py in _split_out_first_arg(self, args, kwargs)
   3046     else:
   3047       raise ValueError(
-> 3048           'The first argument to `Layer.call` must always be passed.')
   3049     return inputs, args, kwargs
   3050 

ValueError: The first argument to `Layer.call` must always be passed.

You also need to specify the input to the dropout layer. For dropout, the input is the output of `self.ffn`.
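
To see why the error message complains about the first argument: a call like `dropout(training=training)` never passes the tensor the layer should act on. A generic sketch of the correct pattern (the names below are illustrative, not the graded solution):

    import tensorflow as tf

    # A Dropout layer must be given its input tensor as the first argument;
    # `training` only toggles whether dropout is actually applied.
    dropout = tf.keras.layers.Dropout(0.1)
    y = tf.random.uniform((1, 3, 4))
    y_dropped = dropout(y, training=True)   # first argument is the input tensor
    print(y_dropped.shape)                  # (1, 3, 4)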

Thank you, it worked.

Hello Deepak, by any chance have you changed the code written in the notebook?

Because without your code filled in, the cell looks like this:

    # START CODE HERE

    # calculate self-attention using mha(~1 line).
    # Dropout is added by Keras automatically if the dropout parameter is non-zero during training
    self_mha_output = # Self attention (batch_size, input_seq_len, fully_connected_dim)
    
    # skip connection
    # apply layer normalization on sum of the input and the attention output to get the  
    # output of the multi-head attention layer (~1 line)
    skip_x_attention =   # (batch_size, input_seq_len, fully_connected_dim)

    # pass the output of the multi-head attention layer through a ffn (~1 line)
    ffn_output =  # (batch_size, input_seq_len, fully_connected_dim)
    
    # apply dropout layer to ffn output during training (~1 line)
    # use `training=training` 
    ffn_output = 
    
    # apply layer normalization on sum of the output from multi-head attention (skip connection) and ffn output to get the
    # output of the encoder layer (~1 line)
    encoder_layer_out =  # (batch_size, input_seq_len, embedding_dim)
    # END CODE HERE

Also, your code is incorrect for attn_output, which in the upgraded lab is named self_mha_output. (For the right code, refer to the upgraded lab: you will find the answer there.) Also, TMosh gave you the hint for the first line of code.

I sincerely suggest you upgrade the lab if you have not made any changes to the given cell or the notebook.

Hope it helps

Regards
DP