C4 W1_A2 Exc1

Can someone help me with the proper syntax here?

Check the previous line, especially for mismatched parentheses.

Syntax errors are often reported on the line after the one where the problem actually lies, because that is where the parser finally detects the problem and gives up.
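As a small sketch (toy code, not the assignment code): a missing comma between list entries only trips the parser once it has read further, so the reported line number can sit after the real mistake (the exact line varies by Python version).

```python
# Sketch: a missing comma between two list entries. The parser keeps
# reading and only raises a SyntaxError once it can no longer make sense
# of the input, so the reported line may be after the actual mistake.
src = """layers = [
    zero_padding
    conv
]
"""
try:
    compile(src, "<example>", "exec")
except SyntaxError as err:
    print("SyntaxError reported at line", err.lineno)
```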

{moderator edit - solution code removed}

I don’t see any issues there either

Is this in the Sequential API section or the Functional API section? In the Sequential case, you are composing a list of layers, so you need commas between the entries. Notice that the entries in your list are still layer functions, not tensors, since you are not calling them on any inputs. So my guess is that this is the Sequential section and your error is omitting the commas.
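As a rough sketch of the distinction (toy stand-ins here, not real Keras layers): in the Sequential style you hand over a comma-separated list of layer objects, while in the Functional style each layer is called on a tensor and returns a new tensor.

```python
# Toy stand-in for a Keras layer (assumption: real code uses tf.keras.layers).
class Layer:
    def __init__(self, name):
        self.name = name

    def __call__(self, x):
        # Calling a layer on an input produces an "output tensor".
        return f"{self.name}({x})"

# Sequential style: a comma-separated LIST of layer objects, not called.
sequential_layers = [Layer("ZeroPadding2D"), Layer("Conv2D"), Layer("ReLU")]

# Functional style: each layer is CALLED on a tensor, chaining outputs.
x = "input"
for layer in sequential_layers:
    x = layer(x)
print(x)  # ReLU(Conv2D(ZeroPadding2D(input)))
```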


I now have a problem with defining the output/input shape. This attribute error is shown:
“The layer has never been called and thus has no defined output shape”

I’m still stuck

Which part of the assignment are you talking about? In either case, it would be worth spending some time studying this very helpful thread about how to use the “Layer” APIs.

I have read through the thread and still don’t understand how to specify the input shape in advance (as mentioned in point 3.1 of the assignment).

You simply use the argument input_shape on the first layer of the model. Note that you still haven’t told us whether this is the Sequential case or the Functional case. Of course you only need to do this once at the beginning, since the rest of the shapes are computable from the input shape and the parameters of the various layers.

Sorry, it’s the sequential case
I have tried that before:
Conv2D(input_shape = (64, 64, 3)…),

But in the Sequential case, the Conv2D layer is not the first layer, right? You pass input_shape to the first layer, which is the ZeroPadding2D layer.

You did read the comments, right? Here’s what they say:

            ## ZeroPadding2D with padding 3, input shape of 64 x 64 x 3
            ## Conv2D with 32 7x7 filters and stride of 1
            ## BatchNormalization for axis 3
            ## ReLU
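Putting the first two comments into code (a sketch assuming the standard tf.keras layers; the point is only where input_shape goes, not the full model):

```python
from tensorflow.keras import Sequential
from tensorflow.keras.layers import ZeroPadding2D, Conv2D

# input_shape belongs on the FIRST layer of the Sequential model,
# which here is the ZeroPadding2D layer, not the Conv2D layer.
model = Sequential([
    ZeroPadding2D(padding=3, input_shape=(64, 64, 3)),
    Conv2D(32, (7, 7), strides=(1, 1)),
    # ... remaining layers per the comments above
])
print(model.output_shape)  # defined up front, since the input shape is known
```

Because the input shape is given on the first layer, the model is built immediately and output_shape is defined without ever calling the model, which also resolves the earlier “has no defined output shape” error.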

I see.
Turns out I had set it in the zero padding layer, but the positional argument was not in the right place, so I put it in the second layer. Thank you though!