C4 W2 A1: 3.1 Identity block - DLS Course 4

Hi,

I am stuck and can't figure out why my result is not matching the expected value.
There is also an AssertionError: "Check the padding and strides."

Kindly help me, please.

### START CODE HERE ###

# Second component of main path (≈3 lines)
X = Conv2D(filters = F2, kernel_size = (f, f), strides = (1,1), padding = 'same', kernel_initializer = glorot_uniform(seed=0))(X)
X = BatchNormalization(axis = 3)(X)
X = Activation('relu')(X, training = training)

# Third component of main path (≈2 lines)
X = Conv2D(filters = F3, kernel_size = (1, 1), strides = (1,1), padding = 'valid', kernel_initializer = glorot_uniform(seed=0))(X)
X = BatchNormalization(axis = 3)(X, training = training)

# Final step: Add shortcut value to main path, and pass it through a RELU activation (≈2 lines)
X = Add()([X_shortcut, X]) 
X = Activation('relu')(X)

### END CODE HERE ###

My results versus expected results

With training=False

[[[0. 0. 0. 0. ]
[0. 0. 0. 0. ]]

[[1.16404 1.16404 1.16404 1.81691]
[1.23441 1.23441 1.23441 1.3923 ]]

[[3.4921 3.4921 3.4921 5.45072]
[3.70323 3.70323 3.70323 4.17691]]]
1.2344106

With training=True

[[[0. 0. 0. 0. ]
[0. 0. 0. 0. ]]

[[0.81198 0.81198 0.81198 0.81198]
[0.81198 0.81198 0.81198 0.81198]]

[[3.28676 3.28676 3.28676 3.48738]
[3.41158 3.41158 3.41158 4.11966]]]

Expected values

With training=False

[[[  0.        0.        0.        0.     ]
  [  0.        0.        0.        0.     ]]

 [[192.71234 192.71234 192.71234  96.85617]
  [ 96.85617  96.85617  96.85617  48.92808]]

 [[578.1371  578.1371  578.1371  290.5685 ]
  [290.5685  290.5685  290.5685  146.78426]]]
96.85617

With training=True

[[[0.      0.      0.      0.     ]
  [0.      0.      0.      0.     ]]

 [[0.40739 0.40739 0.40739 0.40739]
  [0.40739 0.40739 0.40739 0.40739]]

 [[4.99991 4.99991 4.99991 3.25948]
  [3.25948 3.25948 3.25948 2.40739]]]

Which week number?
Which assignment number?

Hi,

It's Week 2, Assignment 1: Residual Networks.

Assignment section: Identity Block

Your first BatchNormalization() is missing the “training = training” argument.
You have it on Activation(), but it doesn’t work there.
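
For reference: the training flag matters to BatchNormalization (it switches between the current batch statistics and the learned moving averages), while a plain ReLU Activation has no training-dependent behavior, so passing the flag there has no effect. A minimal sketch of the pattern, not the assignment solution (conv_bn_relu is just a made-up helper name):

import tensorflow as tf
from tensorflow.keras.layers import Conv2D, BatchNormalization, Activation
from tensorflow.keras.initializers import glorot_uniform

def conv_bn_relu(X, filters, kernel_size, training):
    # Conv -> BatchNorm -> ReLU; the training flag goes to BatchNormalization only.
    X = Conv2D(filters=filters, kernel_size=kernel_size, strides=(1, 1),
               padding='same', kernel_initializer=glorot_uniform(seed=0))(X)
    X = BatchNormalization(axis=3)(X, training=training)  # training belongs here
    X = Activation('relu')(X)                              # no training argument
    return X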


Hi TMosh,

It came back with more errors after I added (X = training = training) in the second and third components, after BatchNormalization(axis = 3).


AttributeError                            Traceback (most recent call last)
<ipython-input-...> in <module>
      8 A3 = identity_block(X, f=2, filters=[4, 4, 3],
      9                     initializer=lambda seed=0:constant(value=1),
---> 10                     training=False)
     11 print('\033[1mWith training=False\033[0m\n')
     12 A3np = A3.numpy()

<ipython-input-...> in identity_block(X, f, filters, training, initializer)
     37
     38     # Third component of main path (≈2 lines)
---> 39     X = Conv2D(filters = F3, kernel_size = (1, 1), strides = (1,1), padding = 'valid', kernel_initializer = glorot_uniform(seed=0))(X)
     40     X = BatchNormalization(axis = 3)(X, training = training)
     41

/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/engine/base_layer.py in __call__(self, *args, **kwargs)
    980       with ops.name_scope_v2(name_scope):
    981         if not self.built:
--> 982           self._maybe_build(inputs)
    983
    984         with ops.enable_auto_cast_variables(self._compute_dtype_object):

/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/engine/base_layer.py in _maybe_build(self, inputs)
   2616     if not self.built:
   2617       input_spec.assert_input_compatibility(
-> 2618           self.input_spec, inputs, self.name)
   2619       input_list = nest.flatten(inputs)
   2620       if input_list and self._dtype_policy.compute_dtype is None:

/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/engine/input_spec.py in assert_input_compatibility(input_spec, inputs, layer_name)
    164         spec.min_ndim is not None or
    165         spec.max_ndim is not None):
--> 166       if x.shape.ndims is None:
    167         raise ValueError('Input ' + str(input_index) + ' of layer ' +
    168                          layer_name + ' is incompatible with the layer: '

AttributeError: 'Activation' object has no attribute 'shape'

Sorry, but that makes no sense. X is a tensor, and training is a boolean. You can’t equate them.
Maybe you meant (X, training = training).
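
To illustrate the syntax difference with a toy snippet (not assignment code): the tensor is the positional argument of the layer call, and training is a keyword argument.

import tensorflow as tf
from tensorflow.keras.layers import BatchNormalization

X = tf.random.normal((1, 4, 4, 3))   # toy 4-D tensor, just for illustration
training = False                      # boolean flag

# "(X = training = training)" would be a chained assignment, not a layer call.
# The layer is called on the tensor X, with training passed by keyword:
X = BatchNormalization(axis=3)(X, training=training)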