C4 W1 assignment 2

Hello everyone,

I am working on the second assignment of C4-W4, where we need to program a basic CNN using the TF Keras Functional API.

I am getting the following error:

Input 0 of layer max_pooling2d_8 is incompatible with the layer: expected ndim=4, found ndim=5. Full shape received: [1, None, 64, 64, 8]

It seems that tf.keras.layers.ReLU(Z1) is creating a new axis, and I do not really know why, or how to fix it without modifying the code (you know, squeezing this axis or something like that).

If someone could help me, I would really appreciate it.
Thanks in advance.

In which assignment are you getting the error? You said “C4-W4”, but your title says “C4 W1”. Also, in which exercise are you getting the error? Please share the full error. The more accurate and detailed the information you provide, the better I can assist you.

Sorry, my mistake. It is in C4-W1, in exercise 2, and the error message is the one I added to my first message.

This is not the correct way to use the Functional API. The general form is: tf.keras.layers.XYZ(...)(input)
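
For example, here is a minimal sketch of that pattern (the layer choices are just illustrative, not the assignment's solution):

import tensorflow as tf
import tensorflow.keras.layers as tfl

input_img = tf.keras.Input(shape=(64, 64, 3))
# Instantiate the layer with its configuration, then call it on a tensor
Z1 = tfl.Conv2D(filters=8, kernel_size=4, padding='same')(input_img)
# Same pattern: ReLU() creates the layer, (Z1) applies it to the tensor
A1 = tfl.ReLU()(Z1)
P1 = tfl.MaxPooling2D(pool_size=(8, 8), padding='same')(A1)

Note that tf.keras.layers.ReLU(Z1) passes Z1 to the layer's constructor instead of applying the layer to Z1, which is not what you want.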

Yes, okay, this is my code:

input_img = tf.keras.Input(shape=input_shape)

YOUR CODE STARTS HERE

Moderator Edit: Solution Code Removed.

YOUR CODE ENDS HERE

And the full error message is:

ValueError                                Traceback (most recent call last)
&lt;ipython-input&gt; in &lt;module&gt;
----> 1 conv_model = convolutional_model((64, 64, 3))
      2 conv_model.compile(optimizer='adam',
      3                    loss='categorical_crossentropy',
      4                    metrics=['accuracy'])
      5 conv_model.summary()

&lt;ipython-input&gt; in convolutional_model(input_shape)
     38     Z1 = tfl.Conv2D(filters=8, kernel_size=4, strides=(1,1), padding='same')(input_img),
     39     A1 = tfl.ReLU()(Z1),
---> 40     P1 = tfl.MaxPooling2D(pool_size=(8,8), strides=(8,8), padding='same')(A1),
     41     Z2 = tfl.Conv2D(filters=16, kernel_size=2, strides=(1,1), padding='same')(P1),
     42     A2 = tfl.ReLU()(Z2),

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py in __call__(self, *args, **kwargs)
    924     if _in_functional_construction_mode(self, inputs, args, kwargs, input_list):
    925       return self._functional_construction_call(inputs, args, kwargs,
--> 926                                                 input_list)
    927
    928     # Maintains info about the `Layer.call` stack.

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py in _functional_construction_call(self, inputs, args, kwargs, input_list)
   1090       # TODO(reedwm): We should assert input compatibility after the inputs
   1091       # are casted, not before.
-> 1092       input_spec.assert_input_compatibility(self.input_spec, inputs, self.name)
   1093       graph = backend.get_graph()
   1094       # Use self._name_scope() to avoid auto-incrementing the name.

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/input_spec.py in assert_input_compatibility(input_spec, inputs, layer_name)
    178                          'expected ndim=' + str(spec.ndim) + ', found ndim=' +
    179                          str(ndim) + '. Full shape received: ' +
    180                          str(x.shape.as_list()))
    181     if spec.max_ndim is not None:
    182       ndim = x.shape.ndims

ValueError: Input 0 of layer max_pooling2d_8 is incompatible with the layer: expected ndim=4, found ndim=5. Full shape received: [1, None, 64, 64, 8]

Your code seems correct to me. Let’s wait for other mentors to respond.

If this is the Functional API, then what are these commas for?
[Screenshot of the posted code showing the trailing commas]

And, in your Dense layer, I don’t see the (F) argument being used.

Yes, the commas that Tom points out are a serious problem. That has the effect of making the RHS a “tuple”, which is why it adds the 5th dimension.

You need the commas in the Sequential API because you are constructing a list of instantiated layer functions, but they are a disaster in the Functional API because you are just invoking instantiated layer functions there. It might help to spend a bit of time reading this thread, which gives you a much nicer explanation of the Sequential and Functional APIs than we get in either the lectures or the assignment itself.
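
To make that concrete, here is a minimal sketch of the difference (the layer parameters are just illustrative, not the assignment's solution):

import tensorflow as tf
import tensorflow.keras.layers as tfl

input_img = tf.keras.Input(shape=(64, 64, 3))

# Functional API: no trailing commas. Each assignment binds a tensor.
Z1 = tfl.Conv2D(filters=8, kernel_size=4, padding='same')(input_img)

# With a trailing comma, the RHS becomes a 1-element tuple. When Keras
# converts that tuple for the next layer, it gains a leading dimension of
# size 1 -- exactly the ndim=5 shape [1, None, 64, 64, 8] in the traceback.
Z1_bad = tfl.Conv2D(filters=8, kernel_size=4, padding='same')(input_img),
print(type(Z1_bad))  # <class 'tuple'>

# Sequential API: here the trailing commas are required, because you are
# building a Python list of instantiated layers, not calling them on tensors.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(64, 64, 3)),
    tfl.Conv2D(filters=8, kernel_size=4, padding='same'),
    tfl.ReLU(),
])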

New learning for me: don't use commas in the Functional API. Thanks, Tom and Paul.

Oops, I didn't even notice that I had added the commas, haha. Thank you all very much for your help!!!