I don’t understand why my model isn’t working, would appreciate any guidance
Hey Milosh!
First things first: Welcome to the Discourse!
Secondly: I’d check out this link, and look at how the parameters are ordered. When you’re passing `maxlen`, does `tf.keras.layers.Embedding()` know you’re passing the `input_length` argument? If not, how would you make sure it knew that `max_len` was supposed to be the `input_length` argument?
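To make that hint concrete, here is a minimal pure-Python stand-in that mirrors the parameter order of the TF 2.x `Embedding` layer (`input_dim`, `output_dim`, `embeddings_initializer=...`, ..., `input_length=...`). The `embedding` function below is just an illustration for this post, not the real layer; it only shows why a positional `max_len` lands in the wrong slot.

```python
# Toy stand-in mirroring the TF 2.x Embedding signature -- an illustration,
# NOT the real tf.keras layer.
def embedding(input_dim, output_dim, embeddings_initializer="uniform",
              input_length=None):
    # The real layer fails similarly: an int is not a valid initializer.
    if not isinstance(embeddings_initializer, str):
        raise TypeError(
            f"Could not interpret initializer: {embeddings_initializer!r}"
        )
    return {"input_dim": input_dim, "output_dim": output_dim,
            "input_length": input_length}

max_len = 120

# Wrong: the third positional slot is embeddings_initializer, not
# input_length, so a positional max_len triggers an error.
try:
    embedding(1000, 16, max_len)
except TypeError as err:
    print("error:", err)

# Right: name the argument so the layer knows max_len is the input_length.
cfg = embedding(1000, 16, input_length=max_len)
print(cfg["input_length"])
```

With the real layer the same fix applies: pass `input_length=max_len` by keyword rather than positionally.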
Would you mind clicking on my name and sending me the full error screenshot or the Notebook so I can better troubleshoot for you, Milosh?
Hey Milosh!
I would point you back to the docs for the `tf.keras.layers.Embedding()` layer again!
Pay close attention to the breakdown of the parameters, specifically these two: in what order is the `Embedding()` layer expecting its arguments, and in what order are you providing them in your code?
This is where I am stumped; I’m not sure what I’m doing wrong. I did several more iterations of the inputs for the embedding layer and they keep returning an error, even though that’s how they ran in the first 2 labs.
Am I missing an argument? The instructions say to use only these 3 arguments, and if it’s only these 3, then they should be in the correct order according to the documentation:
`input_dim` → `output_dim` → `input_length`
I feel like I’m going crazy since I’m sure the answer is something simple
Hello Alex, have you had a chance to take a look at my last response? I feel like I’ve exhausted all options with this code. I read over the documentation and re-watched the lessons, but I don’t know what I’m missing; I’ve tried several more iterations of the inputs and nothing works.
Ah! Milosh!
I took a closer look at your notebook, and I noticed something tricky that you probably just missed by accident.
For the `GRADED FUNCTION: fit_tokenizer`, you declare the `Tokenizer()` with the `oov_token` parameter, but there’s no `num_words` parameter!
I would try adding that, then use the second option from the pictures above, and see how that goes!
Sorry for the late response, I hope this is helpful!
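To see why `num_words` matters alongside `oov_token`, here is a rough pure-Python sketch of what the two parameters do. This is my own toy stand-in, not the Keras `Tokenizer` implementation: `oov_token` reserves index 1 for out-of-vocabulary words, and `num_words` caps how many indices survive when texts are converted to sequences.

```python
from collections import Counter

def fit_tokenizer(sentences, num_words, oov_token="<OOV>"):
    """Toy stand-in for Tokenizer(num_words=..., oov_token=...)."""
    counts = Counter(w for s in sentences for w in s.lower().split())
    word_index = {oov_token: 1}  # like Keras, the OOV token gets index 1
    for i, (word, _) in enumerate(counts.most_common(), start=2):
        word_index[word] = i
    return word_index

def texts_to_sequences(sentences, word_index, num_words, oov_token="<OOV>"):
    # num_words is enforced here: any index >= num_words becomes the OOV index
    oov = word_index[oov_token]
    seqs = []
    for s in sentences:
        idxs = [word_index.get(w, oov) for w in s.lower().split()]
        seqs.append([i if i < num_words else oov for i in idxs])
    return seqs

sentences = ["I love my dog", "I love my cat"]
word_index = fit_tokenizer(sentences, num_words=4)
seqs = texts_to_sequences(sentences, word_index, num_words=4)
print(word_index)
print(seqs)
```

With the real Keras class, the fix is simply `Tokenizer(num_words=vocab_size, oov_token=oov_tok)` (those variable names are placeholders, not necessarily the ones in your assignment).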
This fixed it!! Thank you so much!