Course 4 week 1 assignment 2 exercise 2

I’m getting this error

Apparently there is a problem with how your convolutional_model() function uses the Dense() layer.

Did you include the “(F)” argument for the Dense() layer?

I just did. Then I got another error

Remember that this is the “Functional API” now, so you need to instantiate the layer function (specify the parameters you want) and then invoke the resulting function with an actual input tensor (the “(F)” that Tom was referring to in the Dense case). There should be two separate sets of parentheses on each line involving a layer. Did you actually read the thread I linked in our earlier discussion about the Sequential and Functional APIs? If you are trying to save time by “winging it” here and not actually reading the information, that may not end up being a net savings of time. For any TF function, you can also find the documentation by searching, e.g., “tensorflow flatten” …
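Just to make the “two sets of parentheses” point concrete, here is a minimal sketch of the pattern (the input shape and the Dense layer’s units are made up for illustration, not the assignment’s actual values):

```python
import tensorflow as tf
import tensorflow.keras.layers as tfl

input_img = tf.keras.Input(shape=(64, 64, 3))

# First parentheses: instantiate the layer (its hyperparameters).
# Second parentheses: invoke the resulting function on a tensor.
F = tfl.Flatten()(input_img)
outputs = tfl.Dense(units=6, activation='softmax')(F)

model = tf.keras.Model(inputs=input_img, outputs=outputs)
```

Note that `(F)` is the invocation: `Dense(...)` builds the layer, and `(F)` applies it to the flattened tensor.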

I read the Sequential API aspect of the thread. Also, I’ve read the documentation on each of the layers. The documentation on “Flatten” says
tf.keras.layers.Flatten(data_format=None) and that’s what I have in my code

Well, this is not the Sequential API anymore, right? So maybe it’s worth reading the other half of that thread.
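To make the contrast concrete, here is a sketch of the same tiny model written both ways (the layer sizes here are made up, not the assignment’s):

```python
import tensorflow as tf
import tensorflow.keras.layers as tfl

# Sequential API: you just list the layers and Keras wires them together.
seq_model = tf.keras.Sequential([
    tf.keras.Input(shape=(64, 64, 3)),
    tfl.Flatten(),
    tfl.Dense(6, activation='softmax'),
])

# Functional API: you wire each layer to the previous tensor yourself,
# which is where the second set of parentheses comes in.
inputs = tf.keras.Input(shape=(64, 64, 3))
x = tfl.Flatten()(inputs)
outputs = tfl.Dense(6, activation='softmax')(x)
fn_model = tf.keras.Model(inputs=inputs, outputs=outputs)
```

So `tfl.Flatten(data_format=None)` is only half of a Functional API line: it builds the layer, but you still have to call it on a tensor.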

Read the thread. Applied it. I had to multiply each layer by the output of the preceding layer. I still got an error

That operation there is not multiplication: it is function invocation. I suggest you take a few deep cleansing breaths and then read that thread again. That point was covered in pretty specific detail.

Being in a hurry and not taking the trouble to really understand what you’re reading ends up just wasting more time. “Go slow to go fast …” is the old saying, right?

It looks like you must have forgotten the input tensor in one of the Conv2D layers.
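For reference, here is what that mistake typically looks like, sketched with made-up filter counts rather than the assignment’s actual parameters:

```python
import tensorflow as tf
import tensorflow.keras.layers as tfl

input_img = tf.keras.Input(shape=(64, 64, 3))

# Wrong: the layer is instantiated but never invoked on a tensor,
# so Z1 would be a layer object rather than an output tensor:
#   Z1 = tfl.Conv2D(8, 4, padding='same')

# Right: pass the preceding tensor in a second set of parentheses.
Z1 = tfl.Conv2D(8, 4, padding='same')(input_img)
A1 = tfl.ReLU()(Z1)
```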

Yes. I missed a couple of them. All is good now. Thank you

Ok, I hope there is a “meta” lesson learned there. Repeat after me: “Go slow to go fast”. :nerd_face:


GO SLOW TO GO FAST!!! :+1:t5: :+1:t5: :+1:t5: :grin: