W2 A2 Transfer Learning with MobileNet: Transfer_learning_with_MobileNet_v1

Hi All,
when I run my code I get the following error:

"
ValueError: Negative dimension size caused by subtracting 160 from 5 for '{{node average_pooling2d_1/AvgPool}} = AvgPool[T=DT_FLOAT, data_format="NHWC", ksize=[1, 160, 160, 1], padding="VALID", strides=[1, 160, 160, 1]]' with input shapes: [?,5,5,1280].
"
It seems like I’ve done something wrong with AveragePooling2D layer but I cannot see what exactly.

Any suggestions/advice, please?

Many thanks!
Daniel.

Please post a screen capture image that shows the entire assert or error message.

Unless something is badly wrong in your alpaca_model(), the only other place this could go askew is in your code for the data_augmenter() function.

Note also that in alpaca_model(), you should use data_augmentation(…), not data_augmenter()

Thanks Tom! I am using data_augmentation(…). Here is the stack trace you asked for:


InvalidArgumentError                      Traceback (most recent call last)
/opt/conda/lib/python3.7/site-packages/tensorflow/python/framework/ops.py in _create_c_op(graph, node_def, inputs, control_inputs, op_def)
     1811   try:
-->  1812     c_op = pywrap_tf_session.TF_FinishOperation(op_desc)
     1813   except errors.InvalidArgumentError as e:

InvalidArgumentError: Negative dimension size caused by subtracting 160 from 5 for '{{node average_pooling2d_4/AvgPool}} = AvgPool[T=DT_FLOAT, data_format="NHWC", ksize=[1, 160, 160, 1], padding="VALID", strides=[1, 160, 160, 1]]' with input shapes: [?,5,5,1280].

During handling of the above exception, another exception occurred:

ValueError                                Traceback (most recent call last)
in
----> 1 model2 = alpaca_model(IMG_SIZE, data_augmentation)

in alpaca_model(image_shape, data_augmentation)
     36   # add the new Binary classification layers
     37   # use global avg pooling to summarize the info in each channel
---> 38   x = tfl.AveragePooling2D(image_shape)(x)
     39   # include dropout with probability of 0.2 to avoid overfitting
     40   x = tfl.Dropout(rate=0.2)(x)

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py in __call__(self, *args, **kwargs)
      924   if _in_functional_construction_mode(self, inputs, args, kwargs, input_list):
      925     return self._functional_construction_call(inputs, args, kwargs,
-->   926                                               input_list)
      927
      928   # Maintains info about the Layer.call stack.

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py in _functional_construction_call(self, inputs, args, kwargs, input_list)
     1115   try:
     1116     with ops.enable_auto_cast_variables(self._compute_dtype_object):
-->  1117       outputs = call_fn(cast_inputs, *args, **kwargs)
     1118
     1119   except errors.OperatorNotAllowedInGraphError as e:

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/layers/pooling.py in call(self, inputs)
      294       strides=strides,
      295       padding=self.padding.upper(),
-->   296       data_format=conv_utils.convert_data_format(self.data_format, 4))
      297   return outputs
      298

/opt/conda/lib/python3.7/site-packages/tensorflow/python/util/dispatch.py in wrapper(*args, **kwargs)
      199   """Call target, and fall back on dispatchers if there is a TypeError."""
      200   try:
-->   201     return target(*args, **kwargs)
      202   except (TypeError, ValueError):
      203     # Note: convert_to_eager_tensor currently raises a ValueError, not a

/opt/conda/lib/python3.7/site-packages/tensorflow/python/ops/nn_ops.py in avg_pool(value, ksize, strides, padding, data_format, name, input)
     4282       padding=padding,
     4283       data_format=data_format,
-->  4284       name=name)
     4285
     4286

/opt/conda/lib/python3.7/site-packages/tensorflow/python/ops/gen_nn_ops.py in avg_pool(value, ksize, strides, padding, data_format, name)
       83   _, _, _op, _outputs = _op_def_library._apply_op_helper(
       84       "AvgPool", value=value, ksize=ksize, strides=strides, padding=padding,
-->    85       data_format=data_format, name=name)
       86   _result = _outputs[:]
       87   if _execute.must_record_gradient():

/opt/conda/lib/python3.7/site-packages/tensorflow/python/framework/op_def_library.py in _apply_op_helper(op_type_name, name, **keywords)
      742   op = g._create_op_internal(op_type_name, inputs, dtypes=None,
      743                              name=scope, input_types=input_types,
-->   744                              attrs=attr_protos, op_def=op_def)
      745
      746   # outputs is returned as a separate return value so that the output

/opt/conda/lib/python3.7/site-packages/tensorflow/python/framework/func_graph.py in _create_op_internal(self, op_type, inputs, dtypes, input_types, name, attrs, op_def, compute_device)
      591   return super(FuncGraph, self)._create_op_internal(  # pylint: disable=protected-access
      592       op_type, inputs, dtypes, input_types, name, attrs, op_def,
-->   593       compute_device)
      594
      595   def capture(self, tensor, name=None, shape=None):

/opt/conda/lib/python3.7/site-packages/tensorflow/python/framework/ops.py in _create_op_internal(self, op_type, inputs, dtypes, input_types, name, attrs, op_def, compute_device)
     3483       input_types=input_types,
     3484       original_op=self._default_original_op,
-->  3485       op_def=op_def)
     3486   self._create_op_helper(ret, compute_device=compute_device)
     3487   return ret

/opt/conda/lib/python3.7/site-packages/tensorflow/python/framework/ops.py in __init__(self, node_def, g, inputs, output_types, control_inputs, input_types, original_op, op_def)
     1973   op_def = self._graph._get_op_def(node_def.op)
     1974   self._c_op = _create_c_op(self._graph, node_def, inputs,
-->  1975                             control_input_ops, op_def)
     1976   name = compat.as_str(node_def.name)
     1977   # pylint: enable=protected-access

/opt/conda/lib/python3.7/site-packages/tensorflow/python/framework/ops.py in _create_c_op(graph, node_def, inputs, control_inputs, op_def)
     1813   except errors.InvalidArgumentError as e:
     1814     # Convert to ValueError for backwards compatibility.
-->  1815     raise ValueError(str(e))
     1816
     1817   return c_op

ValueError: Negative dimension size caused by subtracting 160 from 5 for '{{node average_pooling2d_4/AvgPool}} = AvgPool[T=DT_FLOAT, data_format="NHWC", ksize=[1, 160, 160, 1], padding="VALID", strides=[1, 160, 160, 1]]' with input shapes: [?,5,5,1280].

Here’s the problem.
[screenshot of the notebook highlighting the line `x = tfl.AveragePooling2D(image_shape)(x)`]

Just remove the image_shape argument from that call.

The image shape should have already been defined when you created the input layer.
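For what it's worth, the shape arithmetic makes the error self-explanatory. With `padding="VALID"`, each spatial output dimension of a pooling layer is `floor((input_size - pool_size) / stride) + 1`, and TensorFlow rejects a negative `input_size - pool_size`. MobileNetV2 reduces the 160×160 input to a 5×5×1280 feature map, so passing `image_shape` (160) as the pool size asks it to subtract 160 from 5, which is exactly the message above; with the argument removed, Keras falls back to the default `pool_size=(2, 2)`, which fits. A minimal plain-Python sketch (no TensorFlow needed; the function name `valid_pool_output` is just for illustration):

```python
def valid_pool_output(input_size, pool_size, stride=None):
    """Output length of one spatial dimension under VALID padding."""
    stride = stride or pool_size  # Keras defaults strides to pool_size
    diff = input_size - pool_size
    if diff < 0:
        # Mirrors TensorFlow's check that produced the error above
        raise ValueError(
            f"Negative dimension size caused by subtracting {pool_size} from {input_size}"
        )
    return diff // stride + 1

# Pooling the 5x5 MobileNetV2 feature map:
print(valid_pool_output(5, 2))    # default pool_size works: prints 2
try:
    valid_pool_output(5, 160)     # image_shape as pool_size: fails
except ValueError as e:
    print(e)
```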

Thanks Tom, that solved it! I just couldn't spot it myself. Many thanks!