C5W4 exercise 5 question

I do not understand this instruction: Add the position encoding: self.pos_encoding[:, :seq_len, :] to your embedding. What is implied by [:, :seq_len, :]? I replaced seq_len with x.shape[1], but the extra colon seemed extraneous and I got a syntax error. What am I really passing to this method?

They essentially wrote the code out for you in the instructions. Note that the template code defines seq_len for you as well.

The indexing syntax there says that we’ve got a 3D tensor and for the first and third dimensions, we take all possible values. For the second dimension, the expression is:

:seq_len

The general Python syntax for a slice of indices is:

first:last:step

So in this case, first is not specified, which means start at the beginning, and step is not specified, so it defaults to 1. That means the expression they gave you is equivalent to:

0:seq_len:1

or

0:seq_len
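A quick way to see this concretely is a standalone NumPy sketch (the array name and sizes here are made up for illustration; they are not the notebook's values):

```python
import numpy as np

# Dummy positional-encoding array shaped (batch, max_len, d_model),
# like the (1, max_len, d_model) tensor the notebook builds.
max_len, d_model = 10, 4
pos_encoding = np.arange(max_len * d_model).reshape(1, max_len, d_model)

seq_len = 3  # in the notebook this comes from x.shape[1]

# All values in dims 0 and 2; positions 0..seq_len-1 in dim 1.
sliced = pos_encoding[:, :seq_len, :]
print(sliced.shape)  # (1, 3, 4)

# The three spellings are equivalent:
assert (sliced == pos_encoding[:, 0:seq_len:1, :]).all()
assert (sliced == pos_encoding[:, 0:seq_len, :]).all()
```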

Thanks! I think I actually tried that earlier. I seem to be seeing strange kernel behavior: I do "Restart and Clear Output" and then "Run All Above", but when I run the encoder cell it gives me the same syntax error from an earlier run, even though I have changed the code.

NVM, I found the problem.


I am getting an error in the encoder layer call. I may have problems earlier, so I am including my printout of the dimensions of x at various points. Thank you, as always, for your help.

pos =  [[0]
 [1]
 [2]
 [3]
 [4]]
k =  [0 1 2 3]
i =  [0 0 1 1]
pos =  [[0]
 [1]
 [2]
 [3]
 [4]]
angle =  [[0.   0.   0.   0.  ]
 [1.   1.   0.01 0.01]
 [2.   2.   0.02 0.02]
 [3.   3.   0.03 0.03]
 [4.   4.   0.04 0.04]]
x embed dim =  (2, 3, 4)
x after scaling =  tf.Tensor(
[[[ 0.01721156  0.01480629 -0.0819052   0.01511107]
  [ 0.02541938  0.09872026 -0.09528103 -0.09266322]
  [-0.04945436 -0.07790992  0.02245057 -0.0741863 ]]

 [[ 0.02541938  0.09872026 -0.09528103 -0.09266322]
  [ 0.01721156  0.01480629 -0.0819052   0.01511107]
  [ 0.02883019  0.06164945  0.07953096  0.02737803]]], shape=(2, 3, 4), dtype=float32)
x after pos dim =  (2, 3, 4)
x after dropout dim =  tf.Tensor(
[[[ 0.01912395  1.1275625  -0.09100578  1.1279013 ]
  [ 0.9632115   0.7100251  -0.09475689  1.0080965 ]
  [ 0.9553812  -0.548952    0.04716581  1.0284597 ]]

 [[ 0.02824375  1.2208004  -0.10586781  1.008152  ]
  [ 0.9540917   0.6167873  -0.07989486  1.1278456 ]
  [ 1.042364   -0.39388603  0.11058848  1.141309  ]]], shape=(2, 3, 4), dtype=float32)

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-25-68aa9b8f5815> in <module>
      1 # UNIT TEST
----> 2 Encoder_test(Encoder)

~/work/W4A1/public_tests.py in Encoder_test(target)
    114     x = np.array([[2, 1, 3], [1, 2, 0]])
    115 
--> 116     encoderq_output = encoderq(x, True, None)
    117 
    118     assert tf.is_tensor(encoderq_output), "Wrong type. Output must be a tensor"

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py in __call__(self, *args, **kwargs)
   1010         with autocast_variable.enable_auto_cast_variables(
   1011             self._compute_dtype_object):
-> 1012           outputs = call_fn(inputs, *args, **kwargs)
   1013 
   1014         if self._activity_regularizer:

<ipython-input-24-7af95bffe933> in call(self, x, training, mask)
     60        # Pass the output through the stack of encoding layers
     61        for i in range(self.num_layers):
---> 62            x = self.enc_layers(x)
     63        # END CODE HERE
     64 

TypeError: 'ListWrapper' object is not callable

The problem is that self.enc_layers is defined as a Python list of encoder layers (Keras wraps that list in a ListWrapper so it can track the sub-layers, which is the type named in the error). You can’t just reference self.enc_layers and call it: you need to index it to select the current layer you want and then call that layer. Notice that the template gives you the loop there with index i. :nerd_face:
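Here is a minimal standalone sketch of the same trap, using a toy layer rather than the assignment's EncoderLayer (all names here are invented for illustration):

```python
import tensorflow as tf

# Toy layer holding a list of sub-layers, analogous to self.enc_layers.
# Keras wraps the plain Python list in a ListWrapper so it can track the
# sub-layers' weights -- and a ListWrapper is not callable.
class StackOfLayers(tf.keras.layers.Layer):
    def __init__(self, num_layers=2, units=4):
        super().__init__()
        self.num_layers = num_layers
        self.sub_layers = [tf.keras.layers.Dense(units)
                           for _ in range(num_layers)]

    def call(self, x):
        # x = self.sub_layers(x) would raise:
        #   TypeError: 'ListWrapper' object is not callable
        for i in range(self.num_layers):
            x = self.sub_layers[i](x)  # index first, then call that layer
        return x

out = StackOfLayers()(tf.ones((2, 3, 4)))
print(out.shape)  # (2, 3, 4)
```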

Sorry, it took me a couple of tries to spot that. I guess that first cup of coffee hadn’t really kicked in yet :hot_beverage: :laughing: …

I did that and now it runs but gives me this error. (BTW, I just called self.enc_layers indexed to [i].)

AssertionError                            Traceback (most recent call last)
<ipython-input-42-68aa9b8f5815> in <module>
      1 # UNIT TEST
----> 2 Encoder_test(Encoder)

~/work/W4A1/public_tests.py in Encoder_test(target)
    116     encoderq_output = encoderq(x, True, None)
    117 
--> 118     assert tf.is_tensor(encoderq_output), "Wrong type. Output must be a tensor"
    119     assert tuple(tf.shape(encoderq_output).numpy()) == (x.shape[0], x.shape[1], embedding_dim), f"Wrong shape. We expected ({x.shape[0]}, {x.shape[1]}, {embedding_dim})"
    120     assert np.allclose(encoderq_output.numpy(), 

AssertionError: Wrong type. Output must be a tensor

When you get a type error, the first question is “Ok, what type is it?” Then the second question is “How did it get that way?” :nerd_face:
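As a standalone sketch of that first debugging step (the buggy function below is invented for illustration, not the assignment's code):

```python
import tensorflow as tf

# A function that accidentally returns a tuple instead of the tensor
# itself -- exactly the kind of mistake that print(type(...)) exposes
# when an assertion like "Output must be a tensor" fires.
def broken_call(x):
    return (x,)  # bug: should be `return x`

out = broken_call(tf.ones((2, 3)))
print(type(out))          # <class 'tuple'>
print(tf.is_tensor(out))  # False
```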

OK, finally got it. Of course it needed the input and then eventually I discovered it needed the mask as well. Painful!

Yes, I was just going to note that there are more arguments than just x to the encoder layers. Glad you figured that out under your own power. So we’re halfway through this assignment. Now we have to decode what we just encoded.
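As a sketch of that calling convention, with a toy stand-in for the assignment's EncoderLayer (names invented; not the literal assignment solution):

```python
import tensorflow as tf

# The toy layer's call signature takes (x, training, mask), so the
# Encoder's loop must forward all three arguments, not just x.
class ToyEncoderLayer(tf.keras.layers.Layer):
    def __init__(self, d_model=4, rate=0.1):
        super().__init__()
        self.dense = tf.keras.layers.Dense(d_model)
        self.dropout = tf.keras.layers.Dropout(rate)

    def call(self, x, training, mask):
        # This toy ignores mask; the real layer feeds it to attention.
        return self.dropout(self.dense(x), training=training)

enc_layers = [ToyEncoderLayer() for _ in range(2)]
x = tf.ones((2, 3, 4))
for i in range(len(enc_layers)):
    x = enc_layers[i](x, True, None)  # pass training and mask every time
print(x.shape)  # (2, 3, 4)
```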

5 done out of 8, so a little bit more than half! Not to put too fine a point on it. . . .


And just finished 6, so 75% complete!


Finished! I know you will be sad to see me go . . . :slightly_smiling_face:


Congratulations! It’s a serious accomplishment to get through all 5 courses of DLS successfully.

I wish you the best for whatever you plan to do with ML/DL in the future. Are you planning to take more courses to focus on any specific application area?

Note that you can stay connected on this forum at no extra charge, even if you are done with the course material from DLAI.

Thanks! I will probably take some courses in applying AI to data science and marketing. I want to thank you and Tom for your patience and incredibly quick and helpful responses to my queries. You were a big factor in my success.

cheers!
